Getting time between frames as a fraction of a second

Hi all!

Firstly, my aim is to be able to input a fade time in seconds such as

float fadeTime = 1.f;

I come from a game programming background, and if I were trying to make a light go from full dark to full bright I would do

brightness += (1 / fadeTime) * deltaTime;

with deltaTime being the time since the last loop as a fraction of a second, and 1 / fadeTime giving me the correct amount to multiply deltaTime by so that after 1 second brightness will have reached its full value.
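To illustrate, the pattern looks roughly like this (simplified, and the names are just for illustration; the engine would call updateLight once per frame):

float fadeTime = 1.f;        // seconds for a full fade
float brightness = 0.f;      // 0.0 = full dark, 1.0 = full bright

void updateLight(float deltaTime) {    // deltaTime = seconds since the last frame
    brightness += (1.f / fadeTime) * deltaTime;
    if (brightness > 1.f) brightness = 1.f;    // clamp once the fade has finished
}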

I'm absolutely useless at maths, and I'm trying to figure out how I would get that deltaTime value. In a game engine it would usually end up being ridiculously small, like 0.00014, but I just don't know how to convert from an unsigned long to a float in this way.

Thanks in advance!

Represent the time in msec or usec.

And once you have this value, what do you want to do with it?

Use it as I've shown above: to give a time in seconds over which I want the LED to fade up, and calculate the amount I need to add each loop so that it increases at a steady rate. I realise I've been a bit stupid though, since I can just do

deltaTime = (millis() - lastMillis) / 1000.f;

but for some reason

brightness += (1.f / fadeTimeSecondsBrightness) * deltaTime;

is making brightness become 1.34 immediately

#include "Arduino.h"

uint32_t oldTime;
float deltaTime;

void setup() {
    // Using micros instead of millis for a better resolution
    // When doing so, the time between loops cannot be greater than about 70 mins
    // In the rare case that a single pass through loop() could take that long, use millis instead
    oldTime = micros();
}

void loop() {
    // Read the clock once so no time is lost between calculating the delta and updating oldTime
    uint32_t now = micros();

    // Get the time difference and convert to seconds
    // If you use millis instead, divide by 1000.0 to get seconds
    deltaTime = (now - oldTime) / 1000000.0;
    oldTime = now;

    // Now deltaTime is the time since the last update, in seconds
}
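
To tie it back to the fade, you can plug that deltaTime straight into your formula. A rough sketch (assuming the LED is on a PWM-capable pin; the pin number and fade time are just placeholders):

#include "Arduino.h"

const int ledPin = 9;              // assumed PWM-capable pin
float fadeTimeSeconds = 1.f;       // seconds for a full fade
float brightness = 0.f;            // 0.0 = off, 1.0 = full
uint32_t oldTime;

void setup() {
    pinMode(ledPin, OUTPUT);
    oldTime = micros();            // initialise here so the first deltaTime is tiny
}

void loop() {
    uint32_t now = micros();
    float deltaTime = (now - oldTime) / 1000000.0;
    oldTime = now;

    brightness += (1.f / fadeTimeSeconds) * deltaTime;
    if (brightness > 1.f) brightness = 1.f;    // clamp at full brightness

    analogWrite(ledPin, (int)(brightness * 255));
}

Note that oldTime (or lastMillis) has to be initialised just before the fade starts; if it starts at 0, the first deltaTime is the whole time since boot, which would explain brightness jumping straight to something like 1.34.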

This is exactly what I was after! Thanks a lot!