Since July I have been working on a machine that uses an Arduino paired with a SparkFun ambient light sensor to collect light data. The data is routed into MaxMSP via the [serial] object, which in turn controls two stepper motors and a single servo motor, also connected to the Arduino. The data is parsed within MaxMSP to drive two functions at once: drawing light patterns with the servo motor based on the movement of the stepper motors, and sonifying the same data as it streams in.
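For anyone curious about the plumbing, the parsing step boils down to something like this. Here is a rough Python sketch of what the [serial] object's output gets reduced to in my patch, assuming the Arduino streams newline-delimited ASCII readings (the framing and the function name are just illustrative, not pulled from the actual patch):

```python
def parse_serial_buffer(buffer: str) -> list[int]:
    """Split a stream of newline-delimited ASCII readings into
    10-bit sensor values (0-1023), dropping malformed lines."""
    readings = []
    for line in buffer.splitlines():
        line = line.strip()
        if line.isdigit():
            value = int(line)
            if 0 <= value <= 1023:  # Arduino's analogRead range
                readings.append(value)
    return readings

print(parse_serial_buffer("512\n1023\ngarbage\n0\n"))  # → [512, 1023, 0]
```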
For example, if the current light source is intense, the stepper motors pull the servo motor to the top of the page, and vice versa for low light intensity. The light-intensity range can be mapped in four different ways: into groups of 4, 7, or 11, or randomly. The parameters that determine these group boundaries are drawn from segments of the Lucas number sequence, as a way to influence the system algorithmically. Say the user chooses the 4-group mapping: the full sensor range (0 - 1023) is broken into four sections of 256 values each. When the light intensity falls within a group, say group 1, a random number/position is generated within that group's sub-range; so in group 1, a random number from 0 - 255 will be plotted by the drawing machine. At the same time, the fluctuation of light creates sound based on the streaming parameters. I built seven instruments for this project, all of which can be compressed, equalized, and mixed in real time.
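The group-then-randomize step above is easy to sketch outside of Max. Here is a rough Python version (the real logic lives in the Max patch; the function names are mine, and note that 4, 7, and 11 happen to be consecutive Lucas numbers, which is where the group counts come from):

```python
import random

def group_bounds(n_groups: int, span: int = 1024) -> list[tuple[int, int]]:
    """Split the 0-1023 sensor range into n equal groups, e.g.
    n=4 gives [(0, 255), (256, 511), (512, 767), (768, 1023)]."""
    size = span // n_groups
    return [(i * size, (i + 1) * size - 1) for i in range(n_groups)]

def plot_position(intensity: int, n_groups: int = 4) -> int:
    """Find which group the reading falls in, then return a
    random plotting position within that group's sub-range."""
    for low, high in group_bounds(n_groups):
        if low <= intensity <= high:
            return random.randint(low, high)
    return 1023  # clamp readings past the last even boundary (e.g. n=7)

pos = plot_position(100, n_groups=4)  # reading in group 1 → position in 0-255
```

With the 7-group mapping, 1024 doesn't divide evenly (146 values per group), so the last couple of readings get clamped; how the actual patch handles that edge is a detail I'm glossing over here.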
I hope you enjoyed reading this, and if you want to check it out, here's the link below. Thanks for reading and enjoy! http://opkach.com/portfolio/weathertextures-2/weather-textures-optical-sonance