Fotocrono - a LilyPad-based gesture interface

Hello all.

This is a collaborative project I'm working on with artist Vitor Silva: a wireless gesture interface built with a LilyPad Arduino, XBee, flex/bend sensors, an accelerometer, and Max/MSP. The interface lets anyone use movements of the right hand, arms, and legs to control a Max/MSP patch. We added a nice 3-axis accelerometer to enable gestures based on body movement.
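For anyone curious about the basic data flow, here is a rough sketch of the kind of LilyPad code involved. The pin assignments, sensor types, and baud rate below are just placeholder assumptions for illustration, not our exact Fotocrono wiring:

// Minimal sketch (untested), assuming a flex sensor voltage divider on A0,
// an analog 3-axis accelerometer on A1-A3, and the XBee in transparent mode
// on the LilyPad's hardware serial at 9600 baud.

const int FLEX_PIN = A0;   // flex/bend sensor
const int ACC_X    = A1;   // accelerometer X axis
const int ACC_Y    = A2;   // accelerometer Y axis
const int ACC_Z    = A3;   // accelerometer Z axis

void setup() {
  Serial.begin(9600);      // same rate as the XBee link
}

void loop() {
  // Send the four readings as one space-separated line; on the computer side
  // a Max/MSP patch can reassemble the line (e.g. with [serial], [itoa],
  // [fromsymbol] and [unpack]) and route each value to a parameter.
  Serial.print(analogRead(FLEX_PIN));
  Serial.print(' ');
  Serial.print(analogRead(ACC_X));
  Serial.print(' ');
  Serial.print(analogRead(ACC_Y));
  Serial.print(' ');
  Serial.println(analogRead(ACC_Z));
  delay(20);               // roughly 50 readings per second
}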

Let us know what you think of it :)

http://fotocrono-fatosensivelwireless-vls.blogspot.com/

All the best,
Filipe Valpereiro

Looks, um, snug :)

Hard to really know what to think; you've only posted pix of wires and chips! What sort of APPLICATIONS are you aiming at triggering or controlling with Max/MSP? Dance, music, gaming, financial analysis?

Please don't give him any ideas: we've already had way too many problems caused by people who incorporated arm-waving into financial analysis....

It's a neat concept, but the video doesn't give a sense that there's any sort of system translating movements or positions into what the computer reports as its input (well, except when he touched his chest and it said "mammaries"): very similar-looking gestures could produce "3", "742", or "ghost". Probably just because we've been given a peek at something still in the very early stages.

It'll be interesting to see what it does as it evolves.

Ran