

App: Zephyr Fan

Demo ↗

An app for an autonomously controlled air fan. It uses object detection to face people and hand gesture detection to control fan speed and mode of operation.

Built on React Native, TypeScript, Expo, ThreeJS and WebSockets.

Project Share Website

Zephyr Fan App in action

Prerequisites

We were prompted with: "Creating a robotic system that helps humans."

Thinking about that prompt from the University of Edinburgh's extremely hot rooms in the AT_5 building, a smart fan was the obvious idea. We decided it would need:

Hand-drawn sketch of a fan with arrows showing vertical and horizontal movement, a built-in camera and a microcontroller

Day 0: Sketch of the capabilities of the fan. Facing direction, power and mode of operation to be controlled by hand gestures and companion app

Additionally, as the system would be demoed in a public space, it needed to be easy for anyone to try hands-on.

Looking back, one of our smartest moves was to focus on the demo experience from Day 0. It allowed judges to experience the project's capabilities firsthand, directly from their own devices.

Technical layout of the project

Overview of the project's technical layout
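For context on how the pieces talk to each other: the app drives the fan over WebSockets. The snippet below is a minimal, purely illustrative sketch in TypeScript; the ws://fan.local:8080 address and the JSON command shape are my assumptions for this write-up, not the project's actual protocol.

// Hypothetical command channel between the app and the fan's controller.
const socket = new WebSocket('ws://fan.local:8080');

type FanCommand =
  | { type: 'setSpeed'; value: number }              // normalized 0..1
  | { type: 'setMode'; value: 'auto' | 'manual' };

const send = (command: FanCommand) =>
  socket.send(JSON.stringify(command));

// Wait for the connection before issuing the first command.
socket.onopen = () => send({ type: 'setSpeed', value: 0.5 });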

I focused on the companion app.

As a little bonus, I dipped my toes into 3D modeling and rendering!

As both the designer and developer of the app, I had the freedom to choose the technology stack and shape the app's user experience.

Next, I'll cover some of the most interesting parts of the design and development process.

Design

As a personal challenge, and aiming for a premium-feeling experience, I decided to include a 3D model of the fan in the app. The idea followed a similar train of thought to what you would see in other premium IoT apps.

Screenshots of IoT apps on the left with a screenshot of Zephyr Fan App on the right

Design inspiration of the app

Part of committing to a real 3D render, not just an image of a render, was that I could adjust the representation dynamically. I envisioned tweaking the 3D model into a 1:1 render of what the fan is currently doing. For example, once the blade cage is removed it becomes obvious that the fan's speed is represented by the speed of the blades! (There's a small code sketch of this after the figure below.)

Project Share Website

Zephyr Fan App with stripped down 3D model
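To make the blade-speed mapping concrete, here is a minimal sketch of how it could be wired up, assuming a react-three-fiber setup on top of ThreeJS. The FanBlades component and its speed prop are illustrative assumptions, not the app's actual code.

import { useRef } from 'react';
import { useFrame } from '@react-three/fiber';
import type { Group } from 'three';

// Hypothetical component: `speed` is the fan's current speed (0..1),
// as reported by the fan over the WebSocket connection.
export function FanBlades({ speed }: { speed: number }) {
  const blades = useRef<Group>(null);

  // Advance the blade rotation every frame, proportionally to the
  // reported speed, so the render stays a live mirror of the real fan.
  useFrame((_, delta) => {
    if (blades.current) {
      blades.current.rotation.z += speed * delta * Math.PI * 4;
    }
  });

  return <group ref={blades}>{/* blade meshes */}</group>;
}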

An exciting part of working on a real mobile app was that I could access the device's Taptic Engine. Having focused almost exclusively on web stacks, I found this an incredible opportunity. I was mainly inspired by apps like Amie, Lapse and ID by Amo, where haptics are part of the personality of the app.

The main place I included custom haptic feedback was the power slider. As the user swipes along the slider, the haptic engine triggers a different pattern depending on the speed of the swipe. When the user reaches the top or bottom of the slider, it triggers a stronger, more definitive pattern. I was mesmerized by the interaction.
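For the curious, here is roughly what that can look like with Expo's haptics module. The onSliderChange helper and the velocity thresholds are made up for illustration; only the Haptics calls are real expo-haptics APIs.

import * as Haptics from 'expo-haptics';

// Hypothetical helper, called as the power slider moves.
// `value` is the slider position (0..1), `velocity` the swipe speed.
export function onSliderChange(value: number, velocity: number) {
  if (value <= 0 || value >= 1) {
    // Top or bottom of the slider: a stronger, more definitive pattern.
    Haptics.notificationAsync(Haptics.NotificationFeedbackType.Success);
  } else if (Math.abs(velocity) > 1.5) {
    Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Heavy);
  } else if (Math.abs(velocity) > 0.5) {
    Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Medium);
  } else {
    Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);
  }
}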

Unfortunately, the demo I linked above does not include the haptic feedback (as websites do not have access to the haptic engine).

Development

While I opted for Expo and React Native (known for their cross-platform capabilities), not all of the code worked on iOS, Android and Web out of the box. Thumbs down!

Some of the ThreeJS functions used for rendering instanced meshes were not fully supported on iOS. This became an issue for displaying the blade cage and blades, as they are packed as instanced meshes.

As a workaround, I opted for the typical if (Platform.OS === 'ios')... pattern, picking different models for the blades and the blade cage depending on the platform.
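In practice that can look something like the sketch below. The asset file names are hypothetical; the point is simply swapping the instanced models for pre-merged ones on iOS.

import { Platform } from 'react-native';

// On iOS, fall back to pre-merged (non-instanced) models, since the
// instanced-mesh rendering path was not fully supported there.
const bladeCageModel = Platform.OS === 'ios'
  ? require('./assets/blade-cage-merged.glb')
  : require('./assets/blade-cage-instanced.glb');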

Big hug,
Tomas