I designed the sound for this immersive experience exploring the pressure of a shifting social identity, created as part of my undergraduate capstone.
I am from Hsinchu, a city in Taiwan known for its strong winds. As someone with allergies, my symptoms often worsen during seasonal transitions or when air pollution intensifies. In Hsinchu, the wind is an ever-present force: the discomfort of air currents and the invisible presence of PM2.5 particles both have a significant impact on my quality of life. Yet neither can be directly seen or heard, and this invisibility compels us to rely on technological means to reveal them.
In response, I set out to create a new way of representing these invisible weather phenomena. By transforming wind speed, PM2.5, and other environmental data into an interactive, multisensory experience, I hope viewers can intuitively feel changes in the air and weather. This is not meant as a cold, data-driven presentation; instead, the piece aims for emotional resonance with these invisible forces. Beyond simply displaying air pollution and wind speed, my work seeks to encourage understanding, awareness, and engagement with these unseen elements, reminding us how we might interact more sensitively with natural forces in our daily lives.
In creating this wind chime project, I integrated the OpenWeather API so that the chime's sound responds to local weather and air quality. Using the p5.js sound library, I adjusted effects such as distortion and reverb according to current environmental conditions, giving the piece an added layer of interactivity and relevance.
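Below is a minimal sketch of how this kind of mapping could work, assuming the OpenWeather current-weather and air-pollution endpoints and the p5.sound reverb and distortion effects. The API key, coordinates, asset path, and mapping ranges are placeholders for illustration, not the exact values used in the piece.

```javascript
// Minimal sketch: map live wind speed and PM2.5 to reverb and distortion in p5.sound.
// 'WEATHER_KEY', the coordinates, the asset path, and the mapping ranges are illustrative.
const LAT = 24.8;   // approximate Hsinchu latitude (placeholder)
const LON = 120.97; // approximate Hsinchu longitude (placeholder)

let chimeSound, reverb, distortion;

function preload() {
  chimeSound = loadSound('assets/chime.mp3'); // hypothetical chime sample
}

function setup() {
  createCanvas(600, 400);
  reverb = new p5.Reverb();
  distortion = new p5.Distortion(0.05);
  chimeSound.disconnect();          // only hear the processed signal
  reverb.process(chimeSound, 3, 2); // 3 s tail, decay rate 2
  distortion.process(chimeSound);
  fetchConditions();
}

async function fetchConditions() {
  const key = 'WEATHER_KEY'; // placeholder OpenWeather API key
  const base = 'https://api.openweathermap.org/data/2.5';
  const weather = await fetch(`${base}/weather?lat=${LAT}&lon=${LON}&appid=${key}`)
    .then(r => r.json());
  const air = await fetch(`${base}/air_pollution?lat=${LAT}&lon=${LON}&appid=${key}`)
    .then(r => r.json());

  const windSpeed = weather.wind.speed;      // m/s
  const pm25 = air.list[0].components.pm2_5; // µg/m³

  // Stronger wind -> longer reverb tail; worse air -> heavier distortion.
  reverb.set(map(windSpeed, 0, 15, 1, 6), 2);
  distortion.set(constrain(map(pm25, 0, 75, 0, 0.6), 0, 0.6));
}

function mousePressed() {
  chimeSound.play(); // stand-in trigger; the installation uses gestures instead
}
```

Keeping the mapping in one place (here, `fetchConditions`) makes it easy to retune how strongly each environmental variable colors the sound.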
To enhance the overall experience, I combined physical simulation with gesture control, using p5.js for visuals and ml5.js for gesture recognition so the audience can trigger the chime's movement and sound. To make the motion feel realistic, I used Matter.js to simulate gravity, elasticity, and friction, so the swaying appears natural. After refining the gesture recognition and tuning the physics parameters, the interaction became more stable and fluid.
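A condensed sketch of how the gesture and physics layers can fit together is shown below. It assumes the ml5 v1 handPose model, a webcam capture, and a single chime tube hung from a Matter.js constraint; the keypoint index, force scaling, and constraint values are illustrative rather than the exact parameters used in the piece.

```javascript
// Condensed sketch: an ml5.js-tracked hand pushes a single Matter.js chime tube.
// Assumes the ml5 v1 handPose API; force and constraint values are illustrative.
let video, handPose, hands = [];
let engine, chime, link;

function preload() {
  handPose = ml5.handPose(); // load the hand-tracking model
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, results => { hands = results; });

  // One chime tube hung from a fixed anchor; gravity, air friction, and
  // restitution make the swing settle naturally.
  const { Engine, Bodies, Constraint, Composite } = Matter;
  engine = Engine.create();
  chime = Bodies.rectangle(width / 2, 220, 18, 140, {
    frictionAir: 0.02, // air drag damps the swing
    restitution: 0.6
  });
  link = Constraint.create({
    pointA: { x: width / 2, y: 60 }, // fixed ceiling anchor
    bodyB: chime,
    pointB: { x: 0, y: -70 },        // top of the tube
    stiffness: 0.9,
    length: 90
  });
  Composite.add(engine.world, [chime, link]);
}

function draw() {
  background(20);
  Matter.Engine.update(engine);

  // If the index fingertip comes near the tube, nudge it sideways
  // (the full piece also triggers the chime sound here).
  if (hands.length > 0) {
    const tip = hands[0].keypoints[8]; // index fingertip in MediaPipe ordering
    if (dist(tip.x, tip.y, chime.position.x, chime.position.y) < 60) {
      Matter.Body.applyForce(chime, chime.position,
        { x: (chime.position.x - tip.x) * 0.0004, y: 0 });
    }
  }

  // Render the string and tube from the physics body's position and angle.
  stroke(200);
  line(link.pointA.x, link.pointA.y, chime.position.x, chime.position.y);
  push();
  translate(chime.position.x, chime.position.y);
  rotate(chime.angle);
  rectMode(CENTER);
  fill(180, 200, 255);
  rect(0, 0, 18, 140);
  pop();
}
```

Driving the swing through forces on the physics body, rather than animating positions directly, is what lets gravity, elasticity, and friction shape the motion after each gesture.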