rebecca's WIP

[pcomp final] Updated system diagram

Over the past few days, Nick and I continued experimenting with sensors to finalize which ones we will and won't be using for our final project. We made some contact mics out of piezo discs, which was very simple, but we then had a hard time figuring out how to amplify them. Nick tirelessly worked on a few different amplification circuits, all of them complicated and time-consuming and ultimately unsuccessful. Finally, we plugged the contact mics into an analog mixer, which worked perfectly. We played around with using the contact mics to dig through the soil and got some really interesting sounds.


With the knowledge from our sensor experiments, we took a step back to refine our system diagram and think about how all these parts fit together into a cohesive experience for a user. We want to have enough to keep a user interested, but not so many options that our project becomes confusing and muddled. We want the sound to be mapped to gestures and actions in a way that provides clear feedback for a user. We want to provide enough visual cues to instruct our users, but also retain a sense of magic and wonder.

Some questions and issues arose right off the bat. First of all: controlling water. We met with Ben Light to discuss the design of our planter, and he raised the issue of waterproofing it so that we don't ruin our electronics. More importantly, there's the issue of controlling soil moisture over the course of a multiple-hour interactive installation (without over-watering the plant).

At some point the soil reaches its saturation point, and no matter how much more you water it, the values the sensor reads won't change (not to mention, you'll drown the plant). That means this interaction maxes out after a short period of time and presents a huge logistical hurdle. All of this has made us rethink the idea of using water as one of our primary interactions for generating sound from the plant.

Keeping all of this in mind, we've refined our system diagram and rethought the basic user flow of our project. We'll be using three kinds of sensors: capacitive touch sensors, soil moisture sensors, and photocells. (For now, we're scrapping the contact mics in the name of simplicity, but we can always add them back in.)

updated system diagram

The soil moisture sensors will read the water level in the soil and provide the base layer of our sound, but users won't be watering the soil themselves. Still, the foundation of the sound they hear will correlate directly with the moisture level of the soil.
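The actual mapping will live in our Max/MSP patch, but as a rough sketch of the idea, the moisture-to-sound relationship could look something like the Python below. The sensor range endpoints and drone frequencies are made-up placeholders, not values we've measured:

```python
# Hypothetical sketch: map a raw soil-moisture reading (0-1023, as from an
# Arduino analog pin) to a base drone frequency for the soundscape.
# DRY_READING / WET_READING and the Hz range are assumed, not measured.

DRY_READING = 200    # assumed reading for dry soil
WET_READING = 800    # assumed reading near saturation

def moisture_to_base_freq(raw, low_hz=55.0, high_hz=110.0):
    """Linearly map a clamped moisture reading to a drone frequency in Hz."""
    clamped = max(DRY_READING, min(WET_READING, raw))
    t = (clamped - DRY_READING) / (WET_READING - DRY_READING)
    return low_hz + t * (high_hz - low_hz)

print(moisture_to_base_freq(200))  # dry soil -> 55.0 Hz
print(moisture_to_base_freq(800))  # saturated soil -> 110.0 Hz
```

Clamping the reading also sidesteps the saturation problem above: once the soil is soaked, the output simply pins at the top of the range instead of misbehaving.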

Photocells embedded in the soil will read the light level from a grow light positioned above the plant. When the user approaches the garden and blocks the light, the plants will make more sound and the light level will be mapped to different notes. The photocells will also function as a signal to the user that the garden is interactive and that the sound levels correspond to the user’s actions. To ensure this works, we’ll need to very carefully engineer the setup of our lighting and the positioning of our photocells.
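To make that concrete, here's an illustrative Python sketch of how the light-to-note mapping could work (the real version will be a Max/MSP patch; the pentatonic scale, MIDI numbers, and reading thresholds here are all placeholder assumptions):

```python
# Hypothetical sketch: quantize a photocell reading (0-1023) to notes in a
# scale, so that blocking more of the grow light selects different notes.

PENTATONIC = [60, 62, 64, 67, 69, 72]  # assumed scale: C major pentatonic, MIDI

def light_to_note(raw, bright=900, dark=100):
    """Map a photocell reading to a note; darker readings pick higher notes."""
    clamped = max(dark, min(bright, raw))
    shade = (bright - clamped) / (bright - dark)  # 0 = full light, 1 = blocked
    idx = min(int(shade * len(PENTATONIC)), len(PENTATONIC) - 1)
    return PENTATONIC[idx]
```

With these placeholder thresholds, an unblocked sensor plays the bottom of the scale and a fully shadowed one plays the top, which is the kind of clear gesture-to-sound feedback we're after.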

We’ll also be using capacitive touch sensors that make the plants into a touch surface. When a user brushes their hand over the leaves of the plant, they’ll manipulate the echo / reverb of the existing sound and add texture to the soundscape.
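As a sketch of that behavior (in Python for illustration; the touch threshold and smoothing factor are guesses, and the real effect will be handled in Max/MSP), the reverb amount could ease toward the touch state rather than jumping abruptly:

```python
# Hypothetical sketch: turn a raw capacitive-touch reading into a smoothed
# reverb "wet" amount (0.0-1.0), so brushing the leaves adds texture
# gradually. TOUCH_THRESHOLD and SMOOTHING are assumed values.

TOUCH_THRESHOLD = 300   # assumed raw reading that counts as a touch
SMOOTHING = 0.2         # exponential smoothing factor per update

def update_reverb_wet(current_wet, raw_touch):
    """Ease the reverb wet level toward 1.0 on touch, back to 0.0 otherwise."""
    target = 1.0 if raw_touch > TOUCH_THRESHOLD else 0.0
    return current_wet + SMOOTHING * (target - current_wet)

wet = 0.0
for reading in [500, 500, 500, 100, 100]:  # touched three times, then released
    wet = update_reverb_wet(wet, reading)
```

The exponential smoothing means the reverb swells and fades instead of clicking on and off, which should make the leaves feel more like an instrument than a switch.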

Plus, we’ll have three pairs of headphones attached to our planter as a visual cue that the garden produces sound. This solves our main question from the very first week, which is: how do you signal to a user that they’re supposed to touch or interact with a plant in unconventional ways?

Our goals for the weekend are to build the planter box and set up our full system on one plant for our final user testing on Monday. Friday, we'll be designing our planter, buying wood at Prince Lumber, and finalizing what kind of liner we'll use to waterproof the inside. Saturday, we'll be testing different lighting scenarios to see which makes our photocells most responsive. Sunday, we'll be setting up the whole system on our plant and mapping it to sound in Max/MSP.