LED increment & delay to time duration conversion

I have an LED ring animation that increments the brightness by a set amount, in this case 20, with a delay period of 10 milliseconds between steps. I want to find a calculation that converts a desired total time into the corresponding delay and increment, so the animation stays smooth.

I am using the FastLED library and have 24 LEDs in my ring. The code turns the brightness up in increments of 20 until the LED is at the set brightness level (255), then moves to the next LED. This creates a smooth effect of sweeping around the LED ring.

The variables I have so far are:

Number of LEDs: 24
Brightness: 255
Brightness Increment: 20
Loop Delay: 10 ms

From this I can do the following:

Brightness / Brightness Increment = 13 loops of code for an illuminated LED (255 / 20, rounded up)

Loop Delay x 13 loops = 130 milliseconds per LED illumination

130 milliseconds x 24 LEDs = 3120 milliseconds to scroll around the ring.
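
For reference, that forward calculation as a minimal sketch (a hypothetical helper using the same numbers as above; the 13 comes from rounding 255 / 20 up):

// Forward calculation: increment + per-loop delay -> total sweep time.
unsigned long sweepTimeMs(int brightness, int fadeIncrement, int loopDelayMs, int numLeds) {
  int loopsPerLed = (brightness + fadeIncrement - 1) / fadeIncrement; // 255 / 20 -> 13, rounded up
  return (unsigned long)loopsPerLed * loopDelayMs * numLeds;          // 13 * 10 * 24 = 3120 ms
}
// sweepTimeMs(255, 20, 10, 24) returns 3120.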

I reversed this calculation so I can enter the desired time and have the delay in the loop adjust, but this creates issues with times around the 10 second mark, as the delay becomes so large that the fading appears blotchy.

If I set the increment to 1, it takes far too long to perform a single sweep, even with no delay.

Anyone able to wrap their head around this dilemma?

You forgot to post your code.

I'm confused about what you want. Do you want to know how to calculate it? That's what you started out asking about, but you seem to already have that figured out. So what is the actual problem? The only other thing you talk about is that when it takes too long it looks bad.

Sorry, find my code below; it was in a library so I had to transplant it into a standard sketch. The calculation I showed doesn't work for every time value because I was only scaling the timing in the loop and not the amount incremented.

#include <FastLED.h>
#define NUM_LEDS 24

//Display Variables
unsigned long Brightness = 255; //The brightness to set to.
int FadeIncrement = 20; //The increment value for the fade

//Led Animation.
unsigned long _Display_Millis;

bool _Display_Direction;    //| all 3 of these can be set through
int _Display_Position;      //|-external functions
int _Display_Fade_Position; //|
int _Display_Colour = 90;     //The default colour to display
int _Display_Current = 160;    //The current colour to display
int _Display_Saturation = 255; //The default saturation to display
int _Display_Current_Saturation = 255; //The current saturation to display
int _Display_Delay = 10; //Delay to fill the display
int _Decay_Delay = 50; //The decay to fill the display

//Timer code
unsigned long _Index[20];
unsigned long _Time;
unsigned long _currentMillis;

CRGB leds[NUM_LEDS];

void setup() {
  Serial.begin(9600);
  FastLED.addLeds<NEOPIXEL, 7>(leds, NUM_LEDS);
  pinMode(19, INPUT);
}


void loop() {
  int pos;
  if (digitalRead(19)) {
    pos = Animate(1);
  } else {
    pos = Animate(0);
  }
  Serial.println(pos);
}


uint8_t Animate(int dir) {
  if (dir == 1 && Ignore(_Display_Delay, 1, 0)) { //Forwards Animation
    leds[_Display_Position].setHSV(_Display_Current, _Display_Current_Saturation, _Display_Fade_Position);
    _Display_Fade_Position = constrain(_Display_Fade_Position + FadeIncrement, 0, Brightness);

    if (_Display_Fade_Position >= Brightness && _Display_Position <= NUM_LEDS-2) { //If we have brought the led to full brightness
      leds[_Display_Position].setHSV(40, 255, 255);
      _Display_Fade_Position = 0;
      _Display_Position++;
      Serial.println(_Display_Position);
    }

    FastLED.show();
  } else if (dir == 0 && Ignore(_Decay_Delay, 1, 0)) {
    leds[_Display_Position].setHSV(_Display_Current, _Display_Current_Saturation, _Display_Fade_Position);
    _Display_Fade_Position = constrain(_Display_Fade_Position - FadeIncrement, 0, Brightness); //Dim the led
    if (_Display_Position <= NUM_LEDS - 2) {
      leds[_Display_Position + 1].setHSV(_Display_Colour, _Display_Saturation, Brightness);
    }
    
    if (_Display_Fade_Position <= 0) {
      _Display_Fade_Position = Brightness;
      _Display_Position--;
    }

    if (_Display_Position <= 0) {
      _Display_Fade_Position = 0;
      _Display_Position = 0;
      leds[_Display_Position].setHSV(_Display_Colour, _Display_Saturation, Brightness);
    }
    FastLED.show();
  }
  return _Display_Position;
}

bool Ignore(uint32_t Time, uint8_t Identifier, bool reset){
  if(reset){
    //_Index[Identifier] = _currentMillis; //Reset the time started.
    _Index[Identifier] = millis(); //To prevent single true output on initial call.
  }
  _currentMillis = millis();
  if(_currentMillis - _Index[Identifier] >= Time){
    _Index[Identifier] = _currentMillis;
    return 1;
  }else{
    return 0;
  }
}

Here is the code I was using to calculate the speed.

void Control_Point::Ring_Speed(uint32_t Time){
  uint32_t ledTime = Time / NUM_LEDS; //time it takes to light a single led.
  uint8_t loops = Brightness / FadeIncrement; //how many loops it will take us.
  _Display_Delay = ledTime / loops; //delay time for each loop.
  Serial.print("led Time: ");
  Serial.print(ledTime);
  Serial.print("Display Delay: ");
  Serial.println(_Display_Delay);
}
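
For what it's worth, the integer division in that function truncates: 255 / 20 comes out as 12 rather than the 13 from the hand calculation above, and _Display_Delay is truncated too. Worked through for two example inputs:

Ring_Speed(3120): ledTime = 3120 / 24 = 130, loops = 255 / 20 = 12, _Display_Delay = 130 / 12 = 10
Ring_Speed(10000): ledTime = 10000 / 24 = 416, loops = 12, _Display_Delay = 416 / 12 = 34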

The issue is with large time values such as 10 seconds: this scales to a delay of, well, 0 milliseconds, falling into the microsecond range, while also making the LED fades look jumpy. Therefore I have to adjust the amount I increment the brightness, which at the moment is 20; to make it look smooth I need a smaller amount, but I don't want to have to adjust this by hand each time I change the time taken. Instead I want to calculate it from the total time it takes the LED ring to sweep around a full revolution.
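
One possible way to derive both numbers from the total time is to fix a target per-step delay that still looks smooth and work backwards from it. A minimal sketch, assuming the globals from the posted code; the function name and the 15 ms target are made up for illustration:

const uint32_t TARGET_STEP_MS = 15; // per-step delay that still looks smooth (tune to taste)

void Ring_Speed_Scaled(uint32_t Time) {
  uint32_t ledTime = Time / NUM_LEDS;                 // ms available per LED
  uint32_t steps   = ledTime / TARGET_STEP_MS;        // how many steps fit in that window
  if (steps < 1)   steps = 1;                         // very short times: one big jump
  if (steps > 255) steps = 255;                       // very long times: cap at 1-unit steps
  FadeIncrement  = (Brightness + steps - 1) / steps;  // round up so 255 is always reached
  _Display_Delay = ledTime / steps;                   // per-step delay for that increment
}
// e.g. 3120 ms -> increment 32, delay 16 ms; 10000 ms -> increment 10, delay 15 ms;
// 600000 ms (10 min) -> increment 1, delay 98 ms.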

Hope this makes more sense.

Shandy:
The issue is with large time values such as 10 seconds: this scales to a delay of, well, 0 milliseconds, falling into the microsecond range

Use micros() instead of millis() and delayMicroseconds() instead of delay(), and keep track of everything in microseconds instead of milliseconds then.
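
As a sketch of that, only the time source in the posted Ignore() timer needs to change; something like this, with the interval then passed in microseconds (untested):

// Same non-blocking timer as Ignore(), but kept in microseconds.
unsigned long _IndexMicros[20];

bool IgnoreMicros(uint32_t TimeUs, uint8_t Identifier, bool reset){
  if(reset){
    _IndexMicros[Identifier] = micros(); // restart the interval
  }
  unsigned long nowUs = micros();
  if(nowUs - _IndexMicros[Identifier] >= TimeUs){
    _IndexMicros[Identifier] = nowUs;    // interval elapsed, re-arm
    return true;
  }else{
    return false;
  }
}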

What you're saying would make sense if the timing in the loop scaled with the LED brightness increments.

Having the increment at 20 and using the delay works up to about 6 seconds; beyond that you have to adjust the increment value to keep the animation smooth. It's more to do with the relationship between the two numbers.

For example, you can have a delay of 1 millisecond and increment the brightness by 1 each time. Simple math would say...

1 millisecond x 255 steps to full brightness = 255 milliseconds per LED

255 milliseconds x 24 LEDs = 6120 milliseconds for a full sweep.

Unfortunately this actually takes 14056 milliseconds when timed using the code, with a stopwatch as a cross-check.

Something is clearly wrong with my time calculations, yet the timer seems to return the correct intervals between brightness increments, so I am hoping the difference is due to other program functions.
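
Part of that gap is likely per-iteration overhead the 1 ms figure doesn't account for: FastLED.show() has to clock out data for all 24 LEDs on every call (roughly 0.7 ms for a WS2812-style ring, which is what NEOPIXEL maps to), and the Serial.println() on every pass through loop() will block once its buffer fills at 9600 baud. A small hypothetical helper to time a full sweep in code rather than with a stopwatch (lastPos and sweepStartMs are made-up names; call it from loop() with the value returned by Animate()):

// Times one full forward sweep by watching _Display_Position go from 0 to NUM_LEDS-1.
unsigned long sweepStartMs;
int lastPos = -1;

void reportSweepTime(int pos) {
  if (pos == 0 && lastPos != 0) {
    sweepStartMs = millis();                   // sweep has just (re)started
  }
  if (pos == NUM_LEDS - 1 && lastPos != NUM_LEDS - 1) {
    Serial.print("Sweep took (ms): ");
    Serial.println(millis() - sweepStartMs);   // compare against the expected 6120
  }
  lastPos = pos;
}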


The problem I am having is how to take a time input and scale both the increment value and the timer in the loop to create a smooth animation over a range of times from 14 milliseconds to something more like 500,000 milliseconds.

It's a horrendous math challenge trying to factor in code execution times and 3-4 other factors and return a meaningful result. All I'm trying to do is sweep across an LED strip in a set time with a smooth animation.

Make the fade increment always 1. Then calculate things in microseconds. There's nothing you're going to do that will be any smoother than that.
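
For reference, with the increment fixed at 1 the per-step delay is just the total sweep time divided by the number of one-unit steps. A minimal sketch of that calculation (the name is made up; timeMs is the full sweep time in milliseconds):

// Per-step delay in microseconds for a fade increment of 1.
uint32_t stepDelayUs(uint32_t timeMs) {
  uint32_t totalSteps = (uint32_t)NUM_LEDS * 255; // 24 * 255 = 6120 one-unit steps
  return (timeMs * 1000UL) / totalSteps;          // e.g. 6120 ms -> 1000 us, 10000 ms -> 1633 us
}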

It may well be the case in the end. The only trouble with doing that is that at large time values you can hardly see any progress at all, but I am now trying to implement different time calculations and corrections for different time periods in the hope that might get me out of trouble.

I appreciate you trying to answer my questions on such a bizarre application.

Shandy:
It may well be the case in the end. The only trouble with doing that is that at large time values you can hardly see any progress at all

You'll see just as much progress taking ten single steps of 1, 100 microseconds apart, as you will taking one step of 10, 1 millisecond apart. I'm not sure what you mean here.

Okay, I will add some context about my application. This is a progress bar for a game in which you must control the area you are within, and the user can set the time it takes to progress around the ring, which could be 10 seconds or 10 minutes.

In that situation you want to ensure the user can see they are making progress. When the time period gets large it's almost impossible to tell anything is happening unless you change the scaling between increment and delay so that you can see the LED changing.

Shandy:
In that situation you want to ensure the user can see they are making progress. When the time period gets large it's almost impossible to tell anything is happening unless you change the scaling between increment and delay so that you can see the LED changing.

If it still takes the same total length of time for the LED to get from partial to full brightness then what you say makes no sense.

If you want it to have larger steps then they will take more time, and when you choose long times then the steps will be long. There's nothing you can do about that except make the steps smaller, but then, as you say, you won't see them.

If you have a setup that works for large time values then great. If you make the total time smaller then the steps will be shorter. Changing the step size will again not change the total time it takes to fade in or out if you need to complete the whole string in a given amount of time.

I still don't understand why you think that you can take the same amount of time to turn on the whole ring yet somehow think you can speed up any one individual light. You've got the same time per light no matter what your step size.

It sounds like there is no solution to your problem.