rebecca's WIP

[pcomp final] Root Note Final Presentation

Root Note is a botanical audio interface and generative soundscape that brings to life the dynamic relationship between plants and their surroundings. Hidden from the observer, capacitive touch sensors, photocells, and moisture sensors embedded in the soil register changes in the plants’ environment, such as the presence of human touch, fluctuations in light, and overall soil moisture. Using an Arduino microcontroller, this data from the plants is routed into the audio synthesis environment Max, where it is used to control aspects of the sound. Through their observation and interaction, the audience takes part in sculpting a real-time aural representation of the plants’ environment. The changing soundscape is a reflection on symbiosis and interspecies entanglement.

Our final project, Root Note, has come a long way since last week. In some ways, it’s evolved a lot since the very beginning; in others, we landed right where we started. There have been a lot of twists and turns along the way, but we ultimately returned to our first idea of using touch as the primary gesture.

We’ve developed the project considerably since our first user testing last week. We scaled up from two sensors to fifteen and went from a single plant to a full planter box. Perhaps most significantly, Nick worked his magic to build out the soundscape and adapt it to a system with many more sensors. Shifting scale has been our biggest challenge (and something we want to refine before the show). More sensors mean not only more possibilities for interaction, but also much more data to process and output (sonically) to the user in an intelligible way.




The capacitive sensors continued to be the most challenging part of building out our circuit. Below are wiring diagrams for each part of the circuit. The three yellow wires are connected to three plants, one in each bay of our planter, via alligator clips (which we soldered directly to our wires). We used a 0.01 µF capacitor across power and ground, as well as smaller-value capacitors from the send/signal pin of the sensor to ground, to stabilize the circuit.


The capacitors seem to have made a difference, but we still experience occasional noise buildup, and the values get a bit inconsistent. Keeping the computer plugged into the wall to ground it, as well as resetting the Arduino every ten minutes or so, helps stabilize the circuit. Work is also being done in the Max patch to continuously recalibrate the incoming data. For the final show, we’re looking into more advanced touch sensing, both for stability and for more precise values. Over the next couple of days, I’ll be building out a custom touch shield for our Arduino. If it works, we’ll find a way to integrate it with our current system.

Scaling up the photocells from 1 to 12 was fairly straightforward. At one point I was using too much resistance and wasn’t able to power more than 4 of them at a time, but I quickly corrected it with lower-value resistors. (I even resurrected Ohm’s law for the first time since the second week of school.) Since our wires are long and a bit unruly, I labeled each photocell with its corresponding pin number at the base and at the top so that we can easily debug if necessary.


Embedding the photocells in the soil was a fun process, and we’ll need to think through whether or not we want to hide them or keep them somewhat visible. After all, everyone in our p-comp class knows what they’re looking for, but the average person has no idea what a photocell looks like and may not know to interact with it.


On the fabrication front, we finalized the design of our planter and filled it out with the fruits of our plant shopping spree. We got a number of different kinds of ferns, a pencil cactus, a jade plant and some other leafier varieties. We also developed the configuration for our lighting and built a simple T-stand mounted to one side of the box to clip our grow light to. The planter now looks wonderfully lush and green, and we’re happy with the current setup.


After finalizing our circuit, I switched all of our sensors over to stranded wire, which is difficult to solder but incredibly flexible. We drilled a hole in the back of the box and fed the wires through it to the Arduino and breadboard, which are mounted on the back of the box. For the show, we’ll be moving everything over to strip board and soldering all our connections to make sure they’re secure. And though we like having our circuit visible, we’re also thinking about creating some kind of clear enclosure or cover to protect it.


What’s been interesting — and will continue to be as we evolve our circuit — is seeing which plants work better with the sensors, especially the capacitive touch. I’ve already noticed that plants with certain structural qualities are more touch-sensitive (stems, for example, seem to work well), while others are less responsive. Soil moisture is also a huge factor, and we’ve noticed that the cap sensors work better when the plants have recently been watered (perhaps because they’re more conductive).


Nick has been working in Max/MSP to create a beautiful sound composition. The sound development is constantly evolving and we’ve been working in an iterative way: user testing with people, getting feedback, and going back to refine. Though I feel fairly comfortable with the serial communication patch we’re using, my goal by the show is to have a fuller understanding of what’s happening sonically in the main Max patch.


In the current patch, the twelve photocells are each tied to a note in the key of F minor. The light intensity at any given photocell is mapped to the cutoff frequency of a low pass filter on the corresponding note. The moisture level of the soil is mapped to a couple different reverb parameters to give a sense of depth and wetness to the sound. The average value of the capacitive touch sensors is mapped to both the tempo of the sequence and the overall volume of the piece.


The majority of the work we have left is refining the sound and making sure the piece feels responsive and intuitive to users. Some common feedback: speeding up the tempo in response to human presence feels unintuitive; touching the dirt and touching the plants shouldn’t do the same thing; and, since experiencing the project with another person is intimate, we should take steps to guide two listeners through more of a collective experience.

In our final presentation, many people agreed that when two people are interacting with the plants, it’s not always clear whose actions are having what effect. We’re thinking of remedying this in one of two ways: either through the responsiveness of the sound overall, or by making this an experience to be used by one person at a time. I’m more inclined toward having it be a one-person piece, since I think that gives the user a nice kind of intimacy with the plants and makes the whole thing more immersive.


Nick and I are really excited to be in the Winter Show and we’re planning to make the following improvements to the project over the next few days:


  • Try out an alternative capacitive sensing board
  • Play with more ways to stabilize our existing capacitive sensor circuit (if we decide to use this)
  • Solder circuit into strip board
  • Resolve problem with capacitive and soil moisture sensors (error where we get -2, see documentation)
  • Reinforce soldering and all waterproofing (heat shrink etc.)


  • Buy some more plants
  • Sand and finish the outside of the box
  • Affix hooks for headphones to outside of box
  • Make housing for electronics and mount to side of box
  • Get pedestal for computer and figure out configuration with laptop etc.


  • Refine sound, address responsiveness
  • User testing and incorporating feedback


  • Make video documentation before show
  • Make one page site