Group Project: Rachel Lin, Felix Fu, Nicholas Berry
Category: HCI
Location: London, United Kingdom
What if we grew a tail that could express emotions?
We set out to develop a wearable, animatronic monster tail as an extension of the human body, designed to convey emotional responses based on visual input. The tail serves as a novel form of nonverbal expression.
Structurally, the tail consists of 3D-printed PLA ribs mounted along a central silicone tube that functions as a spine. Three fishing lines threaded through the ribs connect to stepper motors via a pulley system: two lines control lateral motion and one controls vertical motion. Additionally, a DC motor at the tip spins a rib-shaped propeller, while an integrated LED strip lights up when the tail is threatened.
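The cable-and-pulley drive above can be sketched as a simple conversion from a desired bend angle to stepper steps. This is a minimal illustration only: the pulley radius, steps per revolution, and line pull per degree below are assumed placeholder values, not measurements from the actual build.

```python
import math

# Illustrative assumptions, not measured from the real tail:
PULLEY_RADIUS_MM = 10.0    # assumed radius of the motor pulley
STEPS_PER_REV = 200        # typical 1.8-degree stepper motor
PULL_MM_PER_DEG = 0.6      # assumed mm of fishing line taken up per degree of bend

def steps_for_bend(angle_deg: float) -> int:
    """Return the stepper steps needed to pull enough line for angle_deg of bend."""
    pull_mm = PULL_MM_PER_DEG * angle_deg            # line length to take up
    pulley_circumference = 2 * math.pi * PULLEY_RADIUS_MM
    revolutions = pull_mm / pulley_circumference     # pulley turns needed
    return round(revolutions * STEPS_PER_REV)

print(steps_for_bend(45.0))  # steps for a 45-degree lateral sweep
```

The same arithmetic would run on the Arduino; the Python form is just easier to read alongside the description.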
The Cyber-Tail interprets emotions through visual input: a Python script uses machine learning to classify hand gestures as friendly, aggressive, or a reboot signal. Processing then relays the appropriate response to the Arduino controlling the tail.
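The relay step can be sketched as a small mapping from classifier labels to single-byte commands for the Arduino. The label names, command bytes, and serial port path below are illustrative assumptions; the project's actual classifier and wire protocol are its own.

```python
# Hedged sketch of the gesture -> tail-command relay.
# Labels and command bytes are assumptions for illustration.
GESTURE_COMMANDS = {
    "friendly": b"F",    # wag: gentle lateral sweep
    "aggressive": b"A",  # threat pose: raise tail, light LEDs, spin propeller
    "reboot": b"R",      # return to neutral and reset state
}

def gesture_to_command(label: str) -> bytes:
    """Map a classifier output label to a one-byte Arduino command."""
    try:
        return GESTURE_COMMANDS[label]
    except KeyError:
        raise ValueError(f"unknown gesture label: {label!r}")

if __name__ == "__main__":
    # With hardware attached, the byte could be sent over USB serial
    # using pySerial (port path is a hypothetical example):
    #   import serial
    #   with serial.Serial("/dev/ttyACM0", 9600) as port:
    #       port.write(gesture_to_command("aggressive"))
    print(gesture_to_command("friendly"))
```

A one-byte command keeps the Arduino's parsing trivial: each loop iteration reads at most one byte and switches on it.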