Using millis() for timing always returns 1 second

I am very new to Arduino but have been a developer for years. Using millis() for timing is quite different from how I would handle it on the web.

Here is what I have:

// Board Setup
int tiltPin   = 1;


// Global Setup
int tiltState  = 0; 
int totalTime  = 0;  
unsigned long startTimer   = 0;
unsigned long endTimer    = 0;       




void setup() {
  
  //Initialize serial and wait for port to open:
  Serial.begin(9600);
  while (!Serial) {
    ; // wait for serial port to connect. Needed for native USB port only
  }

  // check for the WiFi module:
  if (WiFi.status() == WL_NO_MODULE) {
    Serial.println("Communication with WiFi module failed!");
    // don't continue
    while (true);
  }

  String fv = WiFi.firmwareVersion();
  if (fv != "1.0.0") {
    Serial.println("Please upgrade the firmware");
  }


  
}


void loop() {


  tiltState = digitalRead(tiltPin);

   //If we are tilted
  if (tiltState == HIGH) {

    
    Serial.println("Tilting");
    
    startTimer = millis();

    
  } else {

    
    Serial.println("Done Tilting");

    if( startTimer > 0 ){
      endTimer =  millis();

      totalTime = (endTimer - startTimer)  / 1000.0;

      delay(1000);
    }
    
    if( totalTime > 0 ){ 

         testTime();
        
    }

    
  }

  delay(100);

}


/*
 * Helper Functions
 */

void testTime() {
    Serial.println("\nCalculating time ...");

    Serial.println("\nStart time");
    Serial.println(startTimer);

    Serial.println("\nEnd time");
    Serial.println(endTimer);
      
    Serial.println("\nTotal time");
    Serial.println(totalTime);

     totalTime   = 0;
     startTimer  = 0;
     endTimer   = 0;
    

}

To me this should work, since I grab the time in one place, then again when the tilt is done. But every time I run it, even if it reads "Tilting" for 10 seconds, the calculation returns 1 second. So startTimer and endTimer don't seem to be recording the actual time between the HIGH and LOW switch. I think it might be something with delay, but again, I am very new.

Some of the returned data:

Calculating time ...

Start time
61550

End time
62750

Total time
1



Calculating time ...

Start time
52250

End time
53450

Total time
1


Calculating time ...

Start time
12350

End time
13550

Total time
1

All those printed values are correct. What would you expect to see?

You have to be careful with your types. Consider the following.

void setup() {
  Serial.begin(9600);

  unsigned long startTime = 61550UL;
  unsigned long endTime = 62750UL;

  Serial.print("time 1: ");
  Serial.println((endTime - startTime) / 1000.0);

  int totalTime = (endTime - startTime) / 1000.0;
  Serial.print("time 2: ");
  Serial.println(totalTime);
}

void loop() {

}

Serial monitor:

time 1: 1.20
time 2: 1

@DKWatson one of those outputs was after 10 seconds of the tilt being held, so I would expect startTimer and endTimer to be about 10000 millis apart. I get that the math is right; it's just that the start and end times are not reflecting the true amount of time between when the switch goes from HIGH to LOW.

The results you are getting are correct.

Let's look at the first:

62750 - 61550 is 1200 milliseconds.

This is then divided by 1000.0, which gives 1.2 seconds. That is then stored in an int variable, so it is truncated to 1.

The other examples are the same 1200 ms being divided and truncated in the same way.

If you want to see 1.2 seconds, then totalTime needs to be a type that can hold fractional values, not an int.
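
For example, here is a minimal sketch of the same arithmetic (untested), with totalTime declared as a float instead of an int:

void setup() {
  Serial.begin(9600);

  unsigned long startTimer = 61550UL;
  unsigned long endTimer   = 62750UL;

  float totalTime = (endTimer - startTimer) / 1000.0;  // float keeps the fractional part
  Serial.println(totalTime);                           // prints 1.20 instead of 1
}

void loop() {
}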

Your code indicates that it should also print "Tilting" and "Done Tilting" when it records those times. But I don't see that in the output. Are you sure you're not getting more than one trigger to start in that ten seconds?

if (tiltState == HIGH) {

That will be true on each pass of loop() while tilting. So it starts tilting and you get a start time. Then you delay for 1 second. Then loop() repeats, the digitalRead() still gives HIGH, so you get a new start time, even though you really started a second before. On each pass of loop() you replace your start time, until you finally stop tilting. So you're not timing from the moment it started tilting; you're timing from the last time the loop function ran.

Study the state change example. You want to get a start time only when the sensor first reads HIGH, that is, when it reads HIGH this time but didn't last time. The State Change Detection example in the IDE shows an easy way to handle that by keeping track of the last value you read.
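
Roughly something like this (untested, assuming the same tiltPin wiring as your sketch, and with totalTime as a float per the point above): record startTimer only on the LOW to HIGH transition, and compute the elapsed time only on the HIGH to LOW transition.

const int tiltPin = 1;

int lastTiltState = LOW;            // what the pin read on the previous pass of loop()
unsigned long startTimer = 0;
float totalTime = 0;                // float so fractional seconds survive

void setup() {
  Serial.begin(9600);
  pinMode(tiltPin, INPUT);
}

void loop() {
  int tiltState = digitalRead(tiltPin);

  if (tiltState == HIGH && lastTiltState == LOW) {
    // just started tilting
    Serial.println("Tilting");
    startTimer = millis();
  }
  else if (tiltState == LOW && lastTiltState == HIGH) {
    // just stopped tilting
    totalTime = (millis() - startTimer) / 1000.0;
    Serial.print("Done Tilting after ");
    Serial.print(totalTime);
    Serial.println(" seconds");
  }

  lastTiltState = tiltState;        // remember this reading for the next pass
  delay(50);                        // short delay, also acts as a crude debounce
}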

Delta_G:
Your code indicates that it should also print “Tilting” and “Done Tilting” when it records those times. But I don’t see that in the output. Are you sure you’re not getting more than one trigger to start in that ten seconds?

Ahhh, so it reads out "Tilting" over and over while it's tilting. I guess that means it's reassigning startTimer over and over again, so the output is right since it's reassigned all the time. How do I get it to set startTimer only once when it tilts, and not over and over while it's tilted?

How do I get it to set startTimer only once when it tilts, and not over and over while it's tilted?

See the State Change Detection example.

Delta_G:

if (tiltState == HIGH) {

That will be true on each pass of loop() while tilting. So it starts tilting and you get a start time. Then you delay for 1 second. Then loop() repeats, the digitalRead() still gives HIGH, so you get a new start time, even though you really started a second before. On each pass of loop() you replace your start time, until you finally stop tilting. So you're not timing from the moment it started tilting; you're timing from the last time the loop function ran.

Study the state change example. You want to get a start time only when the sensor first reads HIGH, that is, when it reads HIGH this time but didn't last time. The State Change Detection example in the IDE shows an easy way to handle that by keeping track of the last value you read.

Yep, I just realized that and posted while you posted this. I will take a look at the example.