Problem fading ShiftBrites between colors

Hello folks - I’m a new Arduino user, and I really like all the community support I’m seeing out there. Very comforting!

I’m working on a project that uses a string of ShiftBrites to fade up through a simulated sunrise, s-l-o-w-l-y cycling colors to match the morning sky (off → deep purple → deep blue → pink → orange → yellow, etc.) ending at full brightness, where it will stay until the end of the day, then cycling through the sunset colors on its way back to off.

I’m using a ShiftBrite shield on my Arduino Uno. I’ve soldered a ChronoDot clock into the shield’s prototyping area, and the clock routines are working correctly.

I’ve got a string of ten ShiftBrites, connected using the 3.5" cables, powering everything with a 6V 800ma power supply attached to the shield’s screw terminals (and the power jumper in place to power the Arduino & clock). The combination has worked well during simple tests, so I think the physical setup is okay.

My problem is getting the ShiftBrites to fade between colors. I found this example, http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1253503406 but I’d like to have very slow fades - 1000 steps instead of 32. Is that possible?

Here’s the relevant part of code, with the original 32 (or 33) replaced with 1000 (or 1001):

{
  // begin LED write routine

  int FromRed = 0;    // starting value for red
  int FromGreen = 0;  // starting value for green
  int FromBlue = 0;   // starting value for blue
  int ToRed = 100;    // ending value for red
  int ToGreen = 0;    // ending value for green
  int ToBlue = 300;   // ending value for blue

  for (int FadeCount = 0; FadeCount < 1001; FadeCount++)  // fade in 1000 steps
  {
    for (int ChanCount = 0; ChanCount < 10; ChanCount++)  // step thru all ShiftBrites
    {
      LEDChannels[ChanCount][0] = (FromRed * (1000 - FadeCount) + ToRed * FadeCount) / 1000;
      LEDChannels[ChanCount][1] = (FromGreen * (1000 - FadeCount) + ToGreen * FadeCount) / 1000;
      LEDChannels[ChanCount][2] = (FromBlue * (1000 - FadeCount) + ToBlue * FadeCount) / 1000;
    }
    WriteLEDArray();
    delay(100);
  }
}

But when I run this, it starts promisingly enough (fading up from black), but about ten seconds in, all the ShiftBrites go nuts - flashing through random colors and brightnesses with no rhyme or reason, settling down now and then to a solid color, then going back to wild randomness, and so on to the end of the cycle.

Any thoughts? Am I over-reaching with my 1000-step fades? Should I leave the number of steps at 32 (as in the original code example) and just increase my delay times at the end of the loop?

Thanks for any suggestions!

Look up the difference between floating point and integers. Also, the number of steps of real control you have is quite small compared to what you are trying to do. Yes, you are better off adding a delay to make things slower.

You're running into integer overflow. Change "FadeCount" from int to long and it should work.