RainyDay is an interactive sensory art installation that explores the question: How might we harness the magical experience of being outside in the rain to understand and interact with rain in a novel way? RainyDay is funded by the Jacobs Innovation Catalysts Spark Grant awarded to Janaki Vivrekar in Fall 2019.
There’s a certain charm in experiencing the magic of a spectacular downpour. RainyDay encourages individuals to interact with the rain in an imaginative and whimsical way. RainyDay merges the natural rainy landscape with a digital rain-inspired soundscape by producing ambient music in response to the user's movements.
The audience's journey with the RainyDay art installation begins with a pair of gloves. Embedded with a raindrop detection sensor, the gloves allow the user to “collect” rain. An Adafruit Feather M0 channels data about the amount of rain to MaxMSP, which uses it to produce a dynamic, synthetic rain sound. The more it rains, the louder the soundscape gets.
An accelerometer embedded in the glove tracks the acceleration of the user’s movements, and the Adafruit Feather M0 channels the data to MaxMSP, which uses the movement in the x- and y-directions to perturb and distort rain-themed songs infused into the soundscape.
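A minimal sketch of the Feather M0 side of this pipeline might look like the following. The pin assignments, the analog raindrop module, and the analog accelerometer are assumptions for illustration; the general idea is that the board streams comma-separated sensor values over USB serial, which MaxMSP can then parse to drive the soundscape.

```cpp
// Hypothetical Feather M0 sketch (pin choices and sensor types are assumptions).
// Reads an analog raindrop sensor and a two-axis accelerometer signal,
// then streams comma-separated values over USB serial for MaxMSP to parse.

const int RAIN_PIN = A0;     // analog raindrop detection module (assumed)
const int ACCEL_X_PIN = A1;  // analog accelerometer x-axis (assumed)
const int ACCEL_Y_PIN = A2;  // analog accelerometer y-axis (assumed)

void setup() {
  Serial.begin(9600);  // MaxMSP reads this serial stream on the host machine
}

void loop() {
  int rain = analogRead(RAIN_PIN);   // raw rain reading from the glove sensor
  int ax = analogRead(ACCEL_X_PIN);  // movement in the x-direction
  int ay = analogRead(ACCEL_Y_PIN);  // movement in the y-direction

  // Send one line per sample: rain, x, y
  Serial.print(rain);
  Serial.print(",");
  Serial.print(ax);
  Serial.print(",");
  Serial.println(ay);

  delay(50);  // roughly 20 samples per second
}
```

In MaxMSP, the rain value could scale the overall gain of the synthetic rain sound (the more it rains, the louder the soundscape), while the x and y values modulate the effects that perturb and distort the rain-themed songs.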
The QuNeo digital controller is used to prototype sounds such as a gust of wind, a light background drizzle of rain, a car driving through a wet puddle, bouts of thunder, rain hitting a metal roof, the angelic calm after a storm, and static-infused fragments of popular rain-themed songs.
See the musical RainyDay soundscape in action below.