Just wanted to share a project we developed last year for a dance platform event up here in Liverpool.
The piece featured a trapeze performer wearing a light-up costume built from fibre-optic fabric with a laser-cut black fabric masking. Using the Pulse Sensor (http://pulsesensor.myshopify.com/), we measured the performer's heart rate and used the data (sent via WiFi) to control audio-visual parameters for the piece. The RGB values of the suit were driven by this data, alongside the soundscape, which was designed in SuperCollider.

We routed the heart rate data from the performer to the computer, processed it through a Processing app, and sent it back to the costume with pretty much zero latency. By placing the computer in the middle of this equation we could also use the data to control audio parameters (e.g. the panning of the soundscape was controlled by the pulse) and to create a synthesized heartbeat: rather than using a microphone, we converted the data into OSC values and used them to drive a synthesizer.

We ended up using two Arduino Unos with WiFi shields: one connected to the sensor, the other to the lights. Although this could have been done with a single Arduino, we were worried about the volume of data and went for the safer option to reduce latency.
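To give a sense of the mapping stage, here is a minimal sketch of how a heart-rate reading might be turned into suit RGB values and a stereo pan position. This is not the original Processing app; the class name, the assumed 60–180 BPM range, and the blue-to-red colour mapping are all illustrative assumptions.

```java
// Hypothetical mapping from heart rate (BPM) to costume RGB and stereo pan.
// All names and ranges are assumptions for illustration only.
public class PulseMapper {

    // Clamp a value into the 0..1 range.
    static double clamp01(double v) {
        return Math.max(0.0, Math.min(1.0, v));
    }

    // Normalise a BPM reading into 0..1 over an assumed rest-to-peak range.
    static double normalise(int bpm, int restBpm, int peakBpm) {
        return clamp01((bpm - restBpm) / (double) (peakBpm - restBpm));
    }

    // Map the normalised pulse to an RGB triple: cool blue at rest, hot red at peak.
    static int[] toRgb(double t) {
        int r = (int) Math.round(255 * t);
        int b = (int) Math.round(255 * (1.0 - t));
        return new int[] { r, 0, b };
    }

    // Map the normalised pulse to a pan position in -1 (left) .. 1 (right),
    // the kind of value one might send on to the synth as an OSC message.
    static double toPan(double t) {
        return 2.0 * t - 1.0;
    }

    public static void main(String[] args) {
        int bpm = 120;                       // example sensor reading
        double t = normalise(bpm, 60, 180);  // assumed range: 60 rest, 180 peak
        int[] rgb = toRgb(t);
        System.out.println("t=" + t + " rgb=" + rgb[0] + "," + rgb[1] + "," + rgb[2]
                + " pan=" + toPan(t));
    }
}
```

In the real setup the RGB triple would go back out over WiFi to the lights Arduino, and the pan value on to SuperCollider as OSC, but those transport details are omitted here.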
Below are some images and a video of the final results: