Interactive exterior light art installation -- PIR controlled WS2801 voxels

Hey all!

I'm just getting started with this Arduino stuff, and I'm doing so by getting myself involved in a somewhat complex project. To pull that through, I think I'll need a bit of sound advice.

Just a little bit of background on this project, to give you some context: as a student of Architectural Lighting Design (BA) at the Buskerud University College in Drammen, Norway, I'm lucky enough to get the opportunity to work on some pretty interesting lighting projects. The most recent one is for a newly started light festival in Slemmestad, Norway, where a fellow student and I are to create an interactive light installation which is to be placed in a public space.

The site of the installation is in a wooded, sloped area where there is a flight of stairs -- during the festival, there will be a fair amount of traffic there, as it connects different parts of the festival area. I'll try and get up some pictures to illustrate what kind of place this is.

Anyways, for this area we have developed a (still somewhat loosely defined) concept that is about underlining the contrast between the dense vegetation in our site and the surrounding industrial architecture. To do that, we'll be creating one static installation that will use fiber optics to emphasise the geometric shapes of the industrial stuff, and one dynamic installation in the foliage that should convey a more living, breathing and organic impression.

And now I'm starting to get to the point here: it is this latter part of the installation that I figured could be Arduino-powered. The idea is that there will be one or more volumes of voxels (volumetric pixels), in the form of RGB LEDs in small (ø=~5cm) diffuser domes placed on ground-mounted stems, and that these volumes of voxels should react by changing colour and luminance when people pass by them, so as to be somewhat reminiscent of a swarm of fireflies moving through the vegetation (or something like that ...).

The exact size of the volume and density of the voxels is something we'll have to get back to; I don't know enough about the actual dimensions of the area yet, but there is a site inspection scheduled for the 1st of July -- I'll get back with details when that's done. But until then, there are quite a few things I'm wondering about. It should be noted that I'm quite new to Arduino -- I got an Uno R3 SMD and some other stuff just a few days ago. I have a little bit of previous experience with electronics, but just very simple stuff. I haven't done much programming either, and certainly not in languages that are very similar to Arduino/Wiring/Processing -- just some HTML and CSS, plus a tiny bit of PHP that I just barely understood. So, in short, I might need some teaspoon explanations of certain things here.

What I have done so far is to hook up a strand of 20 WS2801 controlled RGB LEDs to the Arduino and play around with FastSPI (and Glediator, which was loads of fun, but I'll leave it out of this). I have made a sketch that consists of two green blobs of light that can move along a dim, blue background, controlled by two pots on a breadboard. Basically, what the finished installation will be is just that, but controlled by motion sensors instead of pots, and with 200+ instead of 20 LEDs, in 3 rather than 1 dimension.

To give an idea of where this is going, the sketch is in the next post. Couldn't put it in this one due to post length limitations. As I'm a bit of a noob, some of this might look a bit sketchy (hehe), so feel free to point out what could be improved. It's based on "FastSPI LED Effects" from Funkboxing. (FastSPI LED Effects [UPDATED TO WORK WITH FastSPI_2] » funkboxing)

[code in next post]

So, to get this to be controlled by people strolling in the stairs, I figured the following:
As people are constrained to one-dimensional movement, either up or down the stairs, the motion tracking mechanism that will make the green blobs follow them should in theory be relatively simple. What I was thinking was to use an array of PIR sensors with a relatively narrow field of view, placed along the stairs -- say, for instance, two per metre. They'll report whether or not they see people to the Arduino, where a nifty sketch will map the locations of people to positions along a strand of WS2801 controlled voxels.

  1. Is that feasible at all? If not, what's the better way to do it?
  2. I'd be very interested in suggestions for suitable, narrow-FOV digital PIR sensors, preferably with IP>65 enclosures. (DIY-ing very possible if necessary; budget not extremely tight, but not infinite either)
  3. There'd be very many sensors; is it a good idea to use shift registers to get enough inputs? (I've seen the 74HC595N mentioned, but that one is serial-in/parallel-out, i.e. for outputs -- its parallel-in counterpart, the 74HC165, looks like the right part for reading sensors.)
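To illustrate what I mean by mapping sensor states to LED positions: whatever the input scheme ends up being (direct pins or a parallel-in shift register like the 74HC165), the decoding step might look roughly like this. This is just a PC-testable sketch of the logic -- on the Arduino the byte would come from shiftIn() after pulsing the register's latch pin, and all the names and constants here are made up:

```cpp
#include <cstdint>

const int NUM_SENSORS = 8;   // one 74HC165 gives 8 inputs; chain more for more
const int NUM_LEDS = 20;

// The ePIR modules pull their output LOW when triggered, so a 0 bit
// means "motion seen". Unpack one byte from the register into flags.
void unpackSensors(uint8_t bits, bool triggered[NUM_SENSORS]) {
  for (int i = 0; i < NUM_SENSORS; i++) {
    triggered[i] = ((bits >> i) & 1) == 0;   // active low
  }
}

// Map a sensor index (0..NUM_SENSORS-1) to an LED index (0..NUM_LEDS-1),
// the same job Arduino's map() does.
int sensorToLed(int sensorIndex) {
  return sensorIndex * (NUM_LEDS - 1) / (NUM_SENSORS - 1);
}
```

With this shape, the main loop only ever sees an array of booleans, regardless of how the bits physically arrive.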

Assuming that this or some other solution works, the next issue is programming the motion. What I have in the sketch above does not look quite organic enough to me; the movement of the light blobs is very stiff: if I turn a pot from 0 to 1023, the blob jumps straight from LED 0 to 19. How could one go about slowing its movement down, so that it'd take a second or two to get all the way from one end to the other? Also, perhaps there is a way to make it accelerate, then decelerate before stopping, perhaps even with a bit of elasticity when it stops? Anything to make it look more like an organism and less like a machine.

The next thing is this thing about voxels. The idea is that the volume should have a depth of, say, 5 rows of voxels, and a similar height (length TBD). I take it that each WS2801 controlled LED requires a bit of RAM from the Arduino, and that about 200 is max for an Uno. Either I could use a Mega instead, and figure out how to arrange them in a 3D matrix -- or, far more conveniently, I could send the exact same commands to the 5*5 = 25 strands, hoping that physically offsetting them will make it look natural (I believe that's very feasible). The voxels won't be placed in a strict grid, but scattered a bit more randomly, so in theory, having all the strands get the same instructions should look good enough. I could acquire some more LEDs to test that at a small scale. Assuming that it works, what should I do to get several strands to take the same input?
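What I'm imagining is something like the following: one buffer and one controller in software, with the data and clock lines simply fanned out electrically to all strands. The WS2801 inputs are listen-only, so every strand should latch the same frame -- though I'm assuming one AVR pin can drive that many inputs over long cables, which may well require a buffer/line-driver chip. A sketch of the idea (pins as in my current setup):

```cpp
#include <FastSPI_LED2.h>
#define NUM_LEDS 20

struct CRGB leds[NUM_LEDS];

void setup() {
  // One controller, one buffer -- but the data line (pin 11) and clock
  // line (pin 13) are wired in parallel to the inputs of ALL strands.
  // Each strand still needs its own 5V power injection.
  LEDS.addLeds<WS2801, 11, 13, RGB, DATA_RATE_MHZ(1)>(leds, NUM_LEDS);
}

void loop() {
  // Whatever gets written into leds[] now appears on every strand at once.
  LEDS.show();
}
```

The nice thing is that the sketch itself stays completely unaware of how many strands are hanging off the bus.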

Before I stop, there's one more thing I'm wondering about: cabling for the WS2801s. I take it that they use some kind of SPI-like clock/data signalling, which I'm not familiar with at all. I read a highly inconclusive discussion in an A/V forum about cabling for this kind of signal, and it seemed that twisted pairs (with SDA/GND and SCL/GND) would be the way to go. Could I just use CAT5 for this? Could I do VCC/VCC on one pair and assume that there won't be too much voltage drop? (I presume I'd have to feed in more power every 20 LEDs or so on each strand anyway.)

--

Phew. That was a lot of questions. I'm not demanding full answers to everything just yet, but some pointers in the right direction and some comments on whether this is even remotely close to making any sense whatsoever would be very appreciated. There are more specifics I'll get back to later, but that'll have to wait.

Thanks in advance for any clarifications!

... and here's the aforementioned sketch:

#include <FastSPI_LED2.h>
#define NUM_LEDS 20

struct CRGB leds[NUM_LEDS];
int ledMode = 11;
int idex = 0;              //-LED INDEX (0 to NUM_LEDS-1)
//int ibright = 0;         //-BRIGHTNESS (0-255)

//-POTS AND SWITCHES
const int analogInPin = A0;   // yellow wire
int sensorValue = 0;
int outputValue = 0;
const int analogInPin1 = A1;  // white wire
int sensorValue1 = 0;
int outputValue1 = 0;
const int switchPin = 2;     // orange wire
int switchState = 0;

//------------------------------------- UTILITY FUNCTIONS --------------------------------------

//-SET THE COLOR OF A SINGLE RGB LED
void set_color_led(int adex, int cred, int cgrn, int cblu) {  
  int bdex;
  bdex = adex;
  leds[bdex].r = cred;
  leds[bdex].g = cgrn;
  leds[bdex].b = cblu;  
  }

//-FIND ADJACENT INDEX CLOCKWISE
int adjacent_cw(int i) {
  int r;
  if (i < NUM_LEDS - 1) {r = i + 1;}
  //else {r = 0;}                <----------- this line is replaced to prevent warping, which is only desirable in a circle formation
  else {r = NUM_LEDS;}    //     <----------- because NUM_LEDS is outside the visible range, setting r to this makes it disappear
  return r;
}

//-FIND ADJACENT INDEX COUNTER-CLOCKWISE
int adjacent_ccw(int i) {
  int r;
  if (i > 0) {r = i - 1;}
  //else {r = NUM_LEDS - 1;}    <----------- this line is replaced to prevent warping
  else {r = -1;}           //    <----------- -1 is outside the visible range, so the blob edge disappears (without this, r would be left uninitialised when i == 0)
  return r;
}

//------------------------LED EFFECT FUNCTIONS------------------------

///////////////////////////////
//   SET ALL TO ONE COLOUR   //
///////////////////////////////
void one_color_all(int cred, int cgrn, int cblu) {
    for(int i = 0 ; i < NUM_LEDS; i++ ) {
      set_color_led(i, cred, cgrn, cblu);
      LEDS.show();       
      delay(1);
    }  
}

///////////////////////////////
//   TRACKING BLOB (MULTI)   //
///////////////////////////////
void trackerTestMulti() {
  sensorValue = analogRead(analogInPin);
  outputValue = map(sensorValue, 0, 1023, 0, NUM_LEDS - 1);    // map pot to a value between 0 and total number of LEDs-1
  
  sensorValue1 = analogRead(analogInPin1);
  outputValue1 = map(sensorValue1, 0, 1023, 0, NUM_LEDS - 1);    // map pot to a value between 0 and total number of LEDs-1
  
  int idexA = outputValue;    // set current position according to output from pot
  int idexB = outputValue1;

  int iL1A = adjacent_cw(idexA);
  int iL2A = adjacent_cw(iL1A);
  int iL3A = adjacent_cw(iL2A);  
  int iR1A = adjacent_ccw(idexA);
  int iR2A = adjacent_ccw(iR1A);
  int iR3A = adjacent_ccw(iR2A);

  int iL1B = adjacent_cw(idexB);
  int iL2B = adjacent_cw(iL1B);
  int iL3B = adjacent_cw(iL2B);  
  int iR1B = adjacent_ccw(idexB);
  int iR2B = adjacent_ccw(iR1B);
  int iR3B = adjacent_ccw(iR2B); 

  for(int i = 0; i < NUM_LEDS; i++ ) {
    if (i == idexA || i == idexB) {set_color_led(i, 60, 255, 20);}                              // current position A (pot 0) and B (pot 1)
    else if (i == iL1A || i == iR1A || i == iL1B || i == iR1B) {set_color_led(i, 60, 170, 40);} // 1 to the side
    else if (i == iL2A || i == iR2A || i == iL2B || i == iR2B) {set_color_led(i, 30, 95, 80);}  // 2 to the side
    else if (i == iL3A || i == iR3A || i == iL3B || i == iR3B) {set_color_led(i, 20, 40, 90);}  // 3 to the side
    else {set_color_led(i, 20, 0, 90);}                                                         // background
  }
  LEDS.show();
  delay(10);    // update interval (ms)
}

//------------------SETUP------------------
void setup()  
{
  pinMode(switchPin , INPUT);
  LEDS.addLeds<WS2801, 11, 13, RGB, DATA_RATE_MHZ(1)>(leds, NUM_LEDS);    // WS2801 on pin 11 (data/green) and 13 (clock/blue) with an ordering of RGB and 1MHz max data rate
  one_color_all(0,0,0);    // blank strip
  LEDS.show();
}

//------------------MAIN LOOP------------------
void loop() {
  switchState = digitalRead(switchPin);
  if (switchState == HIGH) {LEDS.setBrightness(255);}    // set brightness according to switch
  else {LEDS.setBrightness(80);}

  if (ledMode == 0) {one_color_all(0,0,0);}
  if (ledMode == 1) {one_color_all(255,255,255);}
  if (ledMode == 2) {one_color_all(255,0,0);}
  if (ledMode == 11) {trackerTestMulti();}
  }

Okay, here's a combined bump and status update. I have figured out the simple things, and am now starting on the difficult stuff.

It turned out to be very simple to create the delayed motion effect, simply by getting the "idex" to move in the direction of a "target", using a delay to slow down the motion. In practice, the delay will have to be adjusted depending on the pixel pitch.

if (idex < target) {idex = idex + 1;}
else if (idex > target) {idex = idex - 1;}
delay(80);
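To also get the acceleration/deceleration (and a touch of elasticity) I was asking about earlier, the integer step could be replaced by a little spring-damper on a float position. Here's a rough sketch of that idea -- untested on the actual hardware, and the ACCEL/DAMPING values are just guesses to tune; the delay would then set the frame rate rather than the step rate:

```cpp
#include <cmath>

// Position and velocity are kept as floats; idex (the lit LED) is just
// the rounded position. Each update pulls the velocity toward the target
// and applies damping, which gives speed-up, slow-down, and a small
// elastic overshoot when DAMPING is lowered. Call once per frame.
float pos = 0.0f, vel = 0.0f;

const float ACCEL = 0.05f;    // pull toward target per frame -- tune to taste
const float DAMPING = 0.80f;  // < 1.0; lower values give more "elastic" wobble

int updatePosition(int target) {
  vel += (target - pos) * ACCEL;   // spring force toward the target
  vel *= DAMPING;                  // friction
  pos += vel;
  return (int)std::lround(pos);    // this becomes the new idex
}
```

With these values the blob overshoots its target slightly and settles, which should read as much more organic than a constant-speed crawl.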

The cabling issues I asked about do not seem to be problematic. I got some RJ45 breakout boards and connected them at the Arduino end and at the LED end with the pairs SDA/GND, SCL/GND, GND/GND and 5V/5V. It seems to work just fine, even with a 10 metre long unshielded cable -- which is longer than what will be needed in the final installation.


With that out of the way; the motion tracking is the big challenge. I acquired ten Zilog ePIR modules (really cheap but very useful) to test the principle I tried to describe in the first post. It appears that it will work as intended, but there will be quite a bit of coding to get it right.

I have hooked up 8 of the sensors and placed them in a row with about 10-15 centimetres between each. (The last 2 aren't connected due to some minor hardware issues that I'll fix when I get around to it.) Using small snoots made of rolled-up blackwrap, I have narrowed their fields of view, so that typically 2-3 at a time get triggered when someone is standing in front of them. Due to the short distance between each, the target must be close (otherwise they all get triggered); this is simply due to limited space in my lab. In the final installation, the distance should be approximately half a metre.

Each is connected to a separate digital input. I have made a very crude sketch that reads the state of each sensor, and then does as follows (small excerpt):

if (sensorState2 == LOW){triggeredSensors[1] = 2; numAct++;}  // sensor goes low when triggered
else {triggeredSensors[1] = 0;}
  • if the sensor is triggered, its ID (1-8) is added to the corresponding place in an 8 long array. Also, numAct is incremented.
  • if the sensor is not triggered, the value in the array is 0

Therefore, if sensors 4, 5 and 6 are triggered, the array will be 0,0,0,4,5,6,0,0. The sum of these values (15) will then be divided by numAct, which is the number of active sensors (3), resulting in the average, which is of course 5.

The average value, which more or less is the actual position of the object, is then mapped to the target value between 0 and 19. Thus, the light will follow the object.

This code is extremely ugly; for instance, it would result in an attempted division by zero when no sensors are triggered -- but that should be an easy fix; I guess I could use an if/else statement to make something other than the averaging happen when no sensors are triggered.
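(As far as I can tell, an integer division by zero on the AVR doesn't crash the sketch -- it just returns garbage, which may well explain the nonsensical serial values below.) In sketch form, the guarded averaging might look like this -- a PC-testable version of the logic, with constants as in my test setup:

```cpp
const int NUM_SENSORS = 8;
const int NUM_LEDS = 20;

// Returns the LED index (0..NUM_LEDS-1) for the averaged position of the
// triggered sensors, or -1 when nothing is triggered, so the caller can
// hold the previous target instead of dividing by zero.
int averageTarget(const bool triggered[NUM_SENSORS]) {
  int sum = 0, numAct = 0;
  for (int i = 0; i < NUM_SENSORS; i++) {
    if (triggered[i]) { sum += i + 1; numAct++; }  // sensor IDs are 1..8
  }
  if (numAct == 0) return -1;   // nobody there -- keep the old target
  int avg = sum / numAct;       // average sensor ID, 1..8
  return (avg - 1) * (NUM_LEDS - 1) / (NUM_SENSORS - 1);  // map to 0..19
}
```

In loop() the -1 would simply mean "don't update the target this frame".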

But nevertheless, the thing actually works! I can stroll around in front of the sensors and have the green light thing follow me around, which is pretty cool. Sometimes it seems to move to the centre or the start of the LED strand; I guess this has to do with the averaging function. I tried to have it transmit the average value over serial, and while it was mostly correct, it seemed to occasionally give some nonsensical results (such as -2).

Obviously, this works only when there is one object that is to be tracked. If there is one object at each end of the sensor array, the target would end up in the middle.

What I'd like to try to do now is to identify a group of triggered sensors as an object, and somehow distinguish between different objects so that two targets could follow two different objects. I'm very uncertain about how to accomplish this -- I'd guess that it'd make sense to treat active sensors adjacent to each other as detecting the same person or group, but my idea about how this would be programmed is a bit too abstract.
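To make the idea a bit less abstract to myself, I imagine the rough shape would be something like this: scan the sensor array left to right, and treat every run of consecutive triggered sensors as one object with its centre at the average index of the run. Just a sketch of that grouping idea (names and the MAX_OBJECTS limit are made up), testable off-hardware:

```cpp
const int NUM_SENSORS = 8;
const int MAX_OBJECTS = 4;

// Every run of consecutive triggered sensors becomes one "object".
// Writes each object's centre (as a sensor index, 0..NUM_SENSORS-1)
// into centres[] and returns the number of objects found.
int findObjects(const bool triggered[NUM_SENSORS], float centres[MAX_OBJECTS]) {
  int count = 0;
  int i = 0;
  while (i < NUM_SENSORS && count < MAX_OBJECTS) {
    if (!triggered[i]) { i++; continue; }
    int start = i;
    while (i < NUM_SENSORS && triggered[i]) i++;   // consume the whole run
    centres[count++] = (start + (i - 1)) / 2.0;    // centre of the run
  }
  return count;
}
```

To keep two blobs attached to the right people between frames, I suppose each new centre could then be matched to the nearest centre from the previous frame.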

Any hints on where to go from here?

I don't know how to help, but reading this thread and your progress and thoughts is fascinating. Your idea of grouping adjacent sensors seems useful, and I'm sure you'll figure it out with a bit of time.

Maybe one piece of general advice would be that useful hints might come from outside the arduino sphere - your idea about grouping sensors reminds me of a collision-detection optimization tutorial I read on a game programming website...

Good luck, and post Vids!