We’d like to share a project we’ve been working on: Sounding Canvas, a series of interactive paintings that react to touch and proximity with evolving soundscapes. At the heart of the project, an Arduino connects the physical artwork to the auditory experience.
The Concept
The idea is simple yet powerful: imagine a painting that listens to your gestures and responds with sound. Our canvases are not just static images; they are interactive paintings, inviting the viewer to explore, touch, and co-create the experience.
Each canvas is embedded with touch-sensitive areas made from aluminum (or copper) foil, which detect when someone interacts with the surface. These sensors are connected to an Arduino, which communicates with a Raspberry Pi running our audio engine.
Why Arduino?
Arduino is ideal for this project because it:
- Provides precise, low-latency readings from capacitive sensors.
- Can interface with a wide variety of sensors and hardware.
- Communicates easily with higher-level systems like the Raspberry Pi via USB.
For us, the Arduino acts as the sensor manager, reliably converting touches into signals that our software can interpret and turn into sound.
Technical Implementation
Hardware
- Arduino Uno Rev3 reads the capacitive sensors.
- Capacitive sensors: Aluminum (or copper) foil pads connected via 1–1.4 MΩ resistors.
- Raspberry Pi 4 (with HiFiBerry Amp2) handles audio playback and sound synthesis.
The Arduino is connected to the Raspberry Pi via USB. It continuously reads the sensor values and sends touch events over serial; the Raspberry Pi maps these events to audio triggers, producing sound textures that vary with the location and intensity of the interaction.
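The exact Pi-side code depends on the audio engine, but as a minimal sketch, here is one way the serial lines could be parsed into sensor values (the function name is ours, and the line format matches the example snippet later in this post):

```cpp
#include <cstdio>
#include <string>

// Hypothetical Pi-side parser for one serial line from the Arduino,
// in the format "Sensor 1: <v1>\tSensor 2: <v2>".
// Returns true and fills v1/v2 on success, false on a malformed line.
bool parseSensorLine(const std::string& line, long& v1, long& v2) {
    return std::sscanf(line.c_str(), "Sensor 1: %ld\tSensor 2: %ld",
                       &v1, &v2) == 2;
}
```

Keeping the wire format human-readable like this makes it easy to debug with the Arduino Serial Monitor before wiring up the audio engine.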
Capacitive Sensing
We used the excellent CapacitiveSensor library by Paul Stoffregen. Each sensor uses a shared "send" pin and an individual "receive" pin. The library measures the RC time constant, which changes when a human approaches the sensor.
Example code snippet:
#include <CapacitiveSensor.h>

// Shared send pin 2; one receive pin per foil pad.
CapacitiveSensor cs_sensor1 = CapacitiveSensor(2, 4); // send pin 2, receive pin 4
CapacitiveSensor cs_sensor2 = CapacitiveSensor(2, 6); // send pin 2, receive pin 6

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Sample each sensor 30 times; higher values mean a closer or firmer touch.
  long sensor1Value = cs_sensor1.capacitiveSensor(30);
  long sensor2Value = cs_sensor2.capacitiveSensor(30);

  Serial.print("Sensor 1: "); Serial.print(sensor1Value);
  Serial.print("\tSensor 2: "); Serial.println(sensor2Value);

  delay(50);
}
This setup provides stable, responsive readings while remaining easy to maintain and extend.
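The raw readings still have to be turned into discrete touch events. A minimal sketch of threshold detection with hysteresis (the threshold values are illustrative, not our calibrated ones):

```cpp
// Touch detector with hysteresis: the reading must rise above onThreshold
// to register a touch and fall below offThreshold to release it, which
// prevents flicker when the signal hovers near a single cutoff.
// Threshold values are illustrative; real ones depend on resistor value
// and foil-pad size.
struct TouchDetector {
    long onThreshold  = 500;
    long offThreshold = 200;
    bool touched      = false;

    // Returns true exactly once, at the moment the state flips to "touched".
    bool update(long raw) {
        if (!touched && raw > onThreshold) {
            touched = true;
            return true;
        }
        if (touched && raw < offThreshold) {
            touched = false;
        }
        return false;
    }
};
```

Using two thresholds instead of one means a noisy reading oscillating around a single cutoff cannot fire a burst of spurious touch events.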
Event Management
Once the Arduino detects a touch, it sends the event to the Raspberry Pi, which handles:
- Debouncing: Avoiding repeated triggers from the same gesture.
- Sound selection: Mapping each sensor activation to a library of pre-recorded or synthesized sounds, with AI algorithms choosing a sound that "responds" to the user's touch.
- Networked interaction: Optionally sending touch events to other peer canvases over the internet for distributed interactive experiences.
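The debouncing step above can be sketched as a simple time window: a second event from the same sensor arriving too soon after the last accepted one is ignored. The 150 ms window and the struct name are assumptions for illustration, not our tuned values:

```cpp
#include <map>

// Time-based debouncer, as the Pi-side event handler might implement it.
// Events from the same sensor within debounceMs of the last accepted
// event are dropped; different sensors are tracked independently.
struct Debouncer {
    long debounceMs = 150;             // illustrative window, not a tuned value
    std::map<int, long> lastEvent;     // sensor id -> last accepted time (ms)

    bool accept(int sensorId, long nowMs) {
        auto it = lastEvent.find(sensorId);
        if (it != lastEvent.end() && nowMs - it->second < debounceMs) {
            return false;  // too soon after the previous event: ignore
        }
        lastEvent[sensorId] = nowMs;
        return true;
    }
};
```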
Lessons Learned
- Calibration is key: resistor values and sensor area must be tuned for optimal sensitivity.
- Separation of concerns: Arduino handles low-level sensing; Raspberry Pi handles sound, AI and higher-level logic.
- Environmental awareness: Aluminum foil sensors are sensitive to electrical interference; mounting, shielding and grounding are critical.
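One common mitigation for slow environmental drift (humidity, nearby electronics) is to track a running baseline and measure touches relative to it, rather than against a fixed absolute threshold. A sketch of the idea, where the smoothing factor and margin are illustrative assumptions:

```cpp
// Track a slowly-adapting baseline with an exponential moving average,
// so that gradual environmental drift does not look like a touch.
// A touch is a reading well above the current baseline.
struct BaselineTracker {
    double baseline = 0.0;
    double alpha    = 0.01;   // small alpha: baseline adapts slowly
    double margin   = 300.0;  // how far above baseline counts as a touch

    bool update(long raw) {
        bool touch = raw > baseline + margin;
        if (!touch) {
            // Only adapt while idle, so touches don't inflate the baseline.
            baseline += alpha * (raw - baseline);
        }
        return touch;
    }
};
```

Freezing the baseline during a touch is the important detail: otherwise a long press would gradually raise the baseline until the touch "disappears".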
If you’re curious, you can see some demos and videos of the canvases in action on our project page.
We hope this inspires others to explore Arduino as the bridge between physical interaction and digital expression.
We’d love to hear your thoughts and any suggestions for improving our sensor setup or Arduino integration!
Dora & Luciano, the Perceptrum²