Nest

How can we make the experience of creating live visuals engaging and intuitive for everyday users?

In this case study, I present my graduate project, highlighting the challenge of addressing multiple pain points that mattered to me and consolidating them into a single working product.






About The Project

A conceptual project that evolved into a patented interaction product, capturing the essence of experimentation in creating live, audio-reactive visuals. It responds in real time to the user's music, with intuitive control through a separate wireless interface. The system is designed with a thoughtful learning curve that encourages exploration without overwhelming the user.
Developed with TouchDesigner and Base44
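
To make the real-time behavior concrete, here is a minimal sketch of audio-reactive parameter mapping in TouchDesigner-style Python. The operator names ('audio_level', 'visual_params'), the channel name, and the 0..1 loudness range are assumptions for illustration, not the project's actual network.

    # Minimal sketch for an Execute DAT's onFrameStart callback in TouchDesigner.
    # Hypothetical operators:
    #   'audio_level'   - a CHOP carrying the current loudness in channel 'chan1'
    #   'visual_params' - a Constant CHOP whose values drive the visuals via exports

    def onFrameStart(frame):
        # Current loudness of the playing track (assumed normalized to 0..1).
        level = op('audio_level')['chan1'].eval()

        params = op('visual_params')
        # Scale the visuals so they "breathe" with the music.
        params.par.value0 = 1.0 + level * 0.5
        # Let brightness follow the same signal, slightly damped.
        params.par.value1 = 0.2 + level * 0.8
        return

In the product described here, gesture input from the wireless interface shapes the same visuals alongside the audio signal.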



Challenge

The field of live visual performance still relies on outdated, analog-style interfaces. Facing a gap in modern tools, creators struggle to balance technical complexity with creative accessibility.






Default DJ Interface




Goals

Ensure the real-time experience feels engaging, not overwhelming




Simplify creation through an engaging, intuitive interface




Reinforce user ownership by letting users make meaningful decisions mid-experience










As a designer passionate about music and self-expression, I wanted to create a way for anyone to experience the joy of generating live visuals themselves.






The Interface



The interface supports gestures such as the following (a simplified gesture-to-visual mapping is sketched below the list):



  1. Selecting colors for the visuals.
  2. Adding layers to create depth.
  3. Zooming in and out to scale the visuals up or down.
  4. Choosing any music from Spotify for a stronger emotional connection.
  5. Shaping the visuals in real time with finger gestures, letting the creator compose intuitively.
  6. Choosing themes from the interface.
  7. Deleting layers.
  8. Dragging a color onto the interface to change the background color.


The visuals react in real time to any sound played through the app.
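
The sketch below models how these gestures could map onto a shared visual state. The class, method names, and value ranges are simplified assumptions for illustration, not the product's actual implementation; music selection and real-time finger shaping are omitted for brevity.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    Color = Tuple[float, float, float]  # RGB components in the 0..1 range

    @dataclass
    class VisualState:
        """State that the wireless interface's gestures manipulate."""
        color: Color = (1.0, 1.0, 1.0)
        background: Color = (0.0, 0.0, 0.0)
        layers: List[str] = field(default_factory=list)
        zoom: float = 1.0
        theme: str = "default"

        def select_color(self, color: Color) -> None:          # gesture 1
            self.color = color

        def add_layer(self, name: str) -> None:                 # gesture 2
            self.layers.append(name)

        def zoom_by(self, factor: float) -> None:               # gesture 3
            # Clamp so pinch gestures cannot shrink or enlarge the visuals too far.
            self.zoom = min(max(self.zoom * factor, 0.25), 4.0)

        def choose_theme(self, theme: str) -> None:             # gesture 6
            self.theme = theme

        def delete_layer(self, name: str) -> None:              # gesture 7
            if name in self.layers:
                self.layers.remove(name)

        def drag_color_to_background(self, color: Color) -> None:  # gesture 8
            self.background = color

    # Example gesture sequence from a short session.
    state = VisualState()
    state.select_color((0.9, 0.2, 0.4))
    state.add_layer("particles")
    state.zoom_by(1.2)
    state.drag_color_to_background((0.05, 0.05, 0.1))
    print(state)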


Output Screenshots
 
The Product In Real Time