Hello everyone,
When we upload this piece of code:
#include "pins_arduino.h"
int sig = 11; // signal
int channel [512]; // DMX frame buffer
int wait = 1000L; // How long should the light be on? In milliseconds (1ms = 0.001s)
void setup()
{
pinMode(sig, OUTPUT);
}
void loop()
{
lights();
}
void lights()
{
// 0700 hrs
setChannel(1,150); // Set light intensity: setChannel(A,B)
setChannel(2,255); // A = channel [1=Nikki 2=Roger 3=Michiel]
setChannel(3,100); // B = light intensity [100=depressed, 130=moody, 150=normal, 180=happy, 255=superhappy]
sendDMXframe(); // Send the DMX frame to the dimmer pack
delay(wait); // Keep the lights at these levels for 'wait' ms
// 0800 hrs
setChannel(1,200); // Set light intensity: setChannel(A,B)
setChannel(2,200); // A = channel [1=Nikki 2=Roger 3=Michiel]
setChannel(3,150); // B = light intensity [100=depressed, 130=moody, 150=normal, 180=happy, 255=superhappy]
sendDMXframe(); // Send the DMX frame to the dimmer pack
delay(wait); // Keep the lights at these levels for 'wait' ms
// 0900 hrs
setChannel(1,255); // Set light intensity: setChannel(A,B)
setChannel(2,150); // A = channel [1=Nikki 2=Roger 3=Michiel]
setChannel(3,100); // B = light intensity [100=depressed, 130=moody, 150=normal, 180=happy, 255=superhappy]
sendDMXframe(); // Send the DMX frame to the dimmer pack
delay(wait); // Keep the lights at these levels for 'wait' ms
}
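(setChannel() and sendDMXframe() are our DMX helper functions; we left them out here to keep the post short.)
Just for readability, the three blocks only differ in the channel values, so they could also be written with a small helper. Rough sketch, assuming setChannel(channel, value) and sendDMXframe() behave as above (setScene is just a name made up for this example):
void setScene(int nikki, int roger, int michiel)
{
setChannel(1, nikki); // Nikki
setChannel(2, roger); // Roger
setChannel(3, michiel); // Michiel
sendDMXframe(); // send the DMX frame
delay(wait); // keep these levels for 'wait' ms
}
void lights()
{
setScene(150, 255, 100); // 0700 hrs
setScene(200, 200, 150); // 0800 hrs
setScene(255, 150, 100); // 0900 hrs
}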
With this, our three lights change intensity every second, as expected.
But when we raise the delay to 10000 ms:
...
int wait = 10000L; // How long should the light be on? In milliseconds
...
The lights go out after approximately 3 seconds.
The delay itself does work, though: after 10 seconds the sketch switches to the second set of intensity values.
Does anyone know why this happens and how we can fix this?
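One guess we have (not verified): maybe the dimmer pack needs to keep receiving DMX frames and shuts its outputs off when nothing is sent during the long delay(). If that is the case, something like this sketch could replace the single delay(wait) call, resending the frame at short intervals (holdScene() is just a name made up for this example):
void holdScene(unsigned long ms)
{
unsigned long start = millis();
while (millis() - start < ms)
{
sendDMXframe(); // resend the current channel values
delay(250); // short pause before the next refresh
}
}
Each block would then end with holdScene(wait); instead of delay(wait);. Is that the right direction, or is something else going on?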
We are working with an Arduino Duemilanove and a 4-channel power DMX dimmer pack, and we are programming on OS X.
If you need more info regarding this issue, let me know!