Significant difference between millis() and micros()

When I run the code below and switch between millis() and micros(), there is quite a variance.
With millis() the time shown varies between 2 and 3 milliseconds.

However, if I change to micros(), the time shown is about 8700 microseconds, i.e. 8.7 milliseconds.

Can anyone explain the big difference?

void loop()
{
	//start timer
	gettime_Millis = millis();    //micros();

	DrawScreen1();     //draw on tft screen

	//Calculate time taken for a loop
	newtimeMillis = millis();    //micros();
	timelapseMillis = newtimeMillis - gettime_Millis;

	//need a bit of time to see the result
	delay(1000);
}

With such short times, I can imagine that this will happen.
millis() is not accurate over an interval of just a few milliseconds (read: very inaccurate); it is updated in an interrupt, so its resolution is roughly a millisecond.
We also don't have your whole sketch, so we don't know if you used the right variable types.

Perhaps you can try calling DrawScreen1() in a loop, 20 or 100 times, and time the whole thing. Then millis() and micros() should measure the same elapsed time.

unsigned long is used for the variables.

Okay, will try timing in a loop and see.

These calls should definitely be better than 5 milliseconds out!

Expect millis() to have about 1 ms of jitter and micros() to be accurate to 10 µs
or so, if no unexpected interrupts are flying around.

Do you have other interrupts flying about the system? Are you missing timer0
interrupts perhaps?

[ PS you're not printing the value of timelapseMillis in DrawScreen1() by any chance?
That's a circular dependency... ]

Here is all the code.
A bit messy, as I have been playing around a bit with TFT screen stuff.

BTW, with all the code in, the complete loop time is about 1/2 second, i.e. drawing the sine, cosine and tangent on the screen.

// UTFT_Demo_320x240 (C)2014 Henning Karlsen
// web: http://www.henningkarlsen.com/electronics
//
// This program is a demo of how to use most of the functions
// of the library with supported display modules.
//
// This demo was made for modules with a screen resolution 
// of 320x240 pixels.
//
// This program requires the UTFT library.
//

#include <UTFT.h>
//#include <UTouch.h>


// 16 bit interface pin numbers
//ITDB02   ( D0, D1, D2, D3, D4, D5, D6, D7, D8, D9, D10, D11, D12, D13, D14, D15, RS,  WR, CS, RST)
//ITDB02 lcd(37, 36, 35, 34, 33, 32, 31, 30, 22, 23, 24,  25,  26,  27,  28,  29,  38,  39, 40, 41);

//Heart Beat LED
int led = 13;  
int ledState = 0; 


//Global Variables
unsigned long  gettime_Millis = 0;
unsigned long  newtimeMillis = 0;
unsigned long  timelapseMillis = 0;
unsigned long  numbertoshow = 0;


// Declare which fonts we will be using
extern uint8_t SmallFont[];
//extern uint8_t BigFont[];

// Set the pins to the correct ones for your development shield
// ------------------------------------------------------------
// Arduino Uno / 2009:
// -------------------
// Standard Arduino Uno/2009 shield            : <display model>,A5,A4,A3,A2
// DisplayModule Arduino Uno TFT shield        : <display model>,A5,A4,A3,A2
//
// Arduino Mega:
// -------------------
// Standard Arduino Mega/Due shield            : <display model>,38,39,40,41
// CTE TFT LCD/SD Shield for Arduino Mega      : <display model>,38,39,40,41
//
// Remember to change the model parameter to suit your display module!
UTFT myGLCD(ITDB32S,38,39,40,41);  // RS,WR,CS,RST



// Initialize touchscreen
// ----------------------
// Set the pins to the correct ones for your development board
// -----------------------------------------------------------
// Standard Arduino Uno/2009 Shield            : 15,10,14, 9, 8
// Standard Arduino Mega/Due shield            :  6, 5, 4, 3, 2
// CTE TFT LCD/SD Shield for Arduino Due       :  6, 5, 4, 3, 2
// Teensy 3.x TFT Test Board                   : 26,31,27,28,29
// ElecHouse TFT LCD/SD Shield for Arduino Due : 25,26,27,29,30
//

/*
UTouch  myTouch(  6,       //Pin for Touch Clock (D_CLK)
5,       //Pin for Touch Chip Select (D_CS)
4,       //Pin for Touch Data input (D_DIN)
3,       //Pin for Touch Data output (D_OUT)
2);      //Pin for Touch IRQ (DPenirq)

*/


void setup()
{
	pinMode(led, OUTPUT);
	//myTouch.InitTouch();
	//myTouch.setPrecision(PREC_MEDIUM);

	SetupLCD();
	DrawBaseScreen();
	DrawScreen();

}

//000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
//000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
//000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
void loop()
{
	//start timer
	gettime_Millis = millis();//micros();


	DrawScreen1();
	HeartBeat();    //speed test. Frequency should tell how fast we do a round trip


	//Calculate time taken to a loop
	newtimeMillis = millis();//micros();
	timelapseMillis = newtimeMillis-gettime_Millis;

	//need a bit of time to see the result
	delay(1000);

}  
//000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
//000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
//000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000



//-------------------------------------------------------
void  HeartBeat(void)
{
	ledState = !ledState;
	digitalWrite(led, ledState);
}






//-------------------------------------------------------------------------
void DrawBaseScreen(void)
{
	// Clear the screen and draw the frame
	myGLCD.clrScr();

	myGLCD.setColor(255, 0, 0);
	myGLCD.fillRect(0, 0, 319, 13);
	myGLCD.setColor(64, 64, 64);
	myGLCD.fillRect(0, 226, 319, 239);
	myGLCD.setColor(255, 255, 255);
	myGLCD.setBackColor(255, 0, 0);
	myGLCD.print("* Universal Color TFT Display Library *", CENTER, 1);
	myGLCD.setBackColor(64, 64, 64);
	myGLCD.setColor(255,255,0);
	myGLCD.print("* V5 *", CENTER, 227);

	DrawMinorBackground();
	DrawCrossHair();

}  


//----------------------------------------------------------------
void DrawMinorBackground(void)
{
	myGLCD.setColor(0, 0, 255);
	myGLCD.drawRect(0, 14, 319, 225);
	myGLCD.fillRect(0, 14, 319, 225);
}

//----------------------------------------------------------------
void   DrawCrossHair(void)
{
	// Draw crosshairs
	myGLCD.setColor(0, 255, 255);
	myGLCD.setBackColor(0, 0, 0);
	myGLCD.drawLine(159, 15, 159, 224);
	myGLCD.drawLine(1, 119, 318, 119);
	for (int i=9; i<310; i+=10)
		myGLCD.drawLine(i, 117, i, 121);
	for (int i=19; i<220; i+=10)
		myGLCD.drawLine(157, i, 161, i);
}

//--------------------------------------------------------------
void    DrawScreen(void)
{
	//DrawMinorBackground();
	//DrawCrossHair();

	myGLCD.setBackColor(VGA_BLUE); // Set the background color to blue behind the text following below


/*
	// Draw sin-, cos- and tan-lines  
	myGLCD.setColor(0,255,255);
	myGLCD.print("Sin", 5, 15);
	/*for (int i=1; i<318; i++)
	{
		myGLCD.drawPixel(i,119+(sin(((i*1.13)*3.14)/180)*95));
		// delay(3);
	}
        

	myGLCD.setColor(255,0,0);
	myGLCD.print("Cos", 5, 27);
	/*for (int i=1; i<318; i++)
	{
		myGLCD.drawPixel(i,119+(cos(((i*1.13)*3.14)/180)*95));
		// delay(3);
	}
        

	myGLCD.setColor(255,255,0);
	myGLCD.print("Tan", 5, 39);
	/*r (int i=1; i<318; i++)
	{
		myGLCD.drawPixel(i,119+(tan(((i*1.13)*3.14)/180)));
		// delay(3);
	}
        */

	myGLCD.setColor(255,255,0);
	myGLCD.print("Time mS", 5, 55);
	numbertoshow = timelapseMillis;
	myGLCD.printNumI(numbertoshow,80,55);


}

void DrawScreen1 (void)
{
	numbertoshow = timelapseMillis;
	myGLCD.printNumI(numbertoshow,80,55);
}





//---------------------------------------
void  SetupLCD(void)
{
	// Sets up the LCD
	myGLCD.InitLCD();
	myGLCD.setFont(SmallFont);
}

Does the UTFT use interrupts?
The delay function "stops" interrupts.
Also, based on empirical testing, I believe the millis function seems to impact interrupts, whereas delaying in microseconds and micros() don't seem to affect them.
Maybe that is the explanation.

darkroomsource:
The delay function "stops" interrupts.

I don't think that's the case.

"Stops" in quotes: according to the documentation, interrupts are still active; however, in my (albeit limited) experience, they don't seem to happen quite right.
I think the delay function must use an interrupt of some kind, because I have seen the interrupts in my sketch slowed or delayed. If I'm counting something like a wheel encoder at 3 or 4 pulses per millisecond, then when delay() is used it will sometimes not get the right result (the pin has changed state by the time the interrupt is actually invoked), so the delay can be off by 1/8th of a millisecond.
delayMicroseconds() doesn't seem to have the same effect.

delay() does not use an interrupt, nor does it stop other interrupts, whether they are in quotes or not.

Edit: delay() does depend on the Timer0 interrupt running, so if something is interfering with that, like a very long running ISR or long periods of interrupt-inhibited code, then yes, it can get sketchy.

void delay(unsigned long ms)
{
	uint16_t start = (uint16_t)micros();

	while (ms > 0) {
		if (((uint16_t)micros() - start) >= 1000) {
			ms--;
			start += 1000;
		}
	}
}

but why the difference in time elapsed?

BulldogLowell:
but why the difference in time elapsed?

See expanded explanation in reply #8.

So, it doesn't use an interrupt, but uses a timer that uses an interrupt.
I stand corrected.
I will continue to use delayMicroseconds instead of delay if it's all right with you though.

Pun intended?

void DrawScreen1 (void)
{
	numbertoshow = timelapseMillis;
	myGLCD.printNumI(numbertoshow,80,55);
}

There it is: you are printing the number of millis (or micros), and
timing the time it takes to do so, so of course the micros version will take
longer, as you are printing more digits! printNumI has to break the
number up digit by digit, look up each digit in a font table, and send the
font data to the LCD, all of which takes time dependent on the number
of digits.

Suggest you try:

void loop()
{
	//start timer
	gettime_Millis = millis();    //micros();

	DrawScreen1();     //draw on tft screen

	//Calculate time taken for a loop, scaled so the millis
	//version also prints a ~4 digit number
	newtimeMillis = millis();    //micros();
	timelapseMillis = (newtimeMillis - gettime_Millis) * 1000;

	//need a bit of time to see the result
	delay(1000);
}

Then at least you expect 4 digit numbers in both cases.

BulldogLowell:

Jack Christensen:
...it can get sketchy.

Pun intended?

Ugh, no, had I realized I'd have avoided such a bad one! :smiley:

darkroomsource:
So, it doesn't use an interrupt, but uses a timer that uses an interrupt.
I stand corrected.

Of course you are correct; I understood the original statement to question whether delay() sets up its own interrupt for each call and of course it does not. Perhaps it's six of one and half a dozen of the other but there is a distinction in my mind.

I will continue to use delayMicroseconds instead of delay if it's all right with you though.

Whatever floats your boat.

Okay, I have been through all the source code in the utft files from
http://www.henningkarlsen.com/electronics/library.php?id=51

and apart from the init code I can't see any delays anywhere, but then again I'm not fluent in C/C++.
All the loops seem okay with no obvious wasted time cycles.

Kim

Okay, I have been doing some more tests and condensed the code a bit.
In the following code, when I run it as-is, BOTH millis() and micros() show about the same result: 43 ms and 42,569 µs.

However, if I comment out this part, I get a result of 0 ms and 42,499 µs:

//timelapseMillis = endtimeMillis - starttimeMillis;
timelapseMicros = endtimeMicros - starttimeMicros;

However, if I comment out this part, I get a result of 34 ms and 0 µs:

timelapseMillis = endtimeMillis - starttimeMillis;
//timelapseMicros = endtimeMicros - starttimeMicros;

This difference has got me beat at the moment.

#include <UTFT.h>

//Global Variables
volatile unsigned long  starttimeMillis = 0;
volatile unsigned long  endtimeMillis = 0;

volatile unsigned long  endtimeMicros= 0;
volatile unsigned long  starttimeMicros=0;

unsigned long  timelapseMillis = 0;
unsigned long  timelapseMicros = 0;


extern uint8_t SmallFont[];
UTFT myGLCD(ITDB32S,38,39,40,41);  // RS,WR,CS,RST


void setup()
{
  myGLCD.InitLCD();
  myGLCD.setFont(SmallFont);
  //DrawBaseScreen();
  myGLCD.clrScr();
  myGLCD.setColor(255, 0, 0);
  myGLCD.fillRect(0, 0, 319, 13);
  myGLCD.setColor(64, 64, 64);
  myGLCD.fillRect(0, 226, 319, 239);
  myGLCD.setColor(255, 255, 255);
  myGLCD.setBackColor(255, 0, 0);
  myGLCD.print("* Universal Color TFT Display Library *", CENTER, 1);
  myGLCD.setBackColor(64, 64, 64);
  myGLCD.setColor(255,255,0);
  myGLCD.print("* Version 7 *", CENTER, 227);
  myGLCD.setColor(0, 0, 255);
  myGLCD.drawRect(0, 14, 319, 225);
  myGLCD.fillRect(0, 14, 319, 225);
  // Draw crosshairs
  myGLCD.setColor(0, 255, 255);
  myGLCD.setBackColor(0, 0, 0);
  myGLCD.drawLine(159, 15, 159, 224);
  myGLCD.drawLine(1, 119, 318, 119);
  for (int i=9; i<310; i+=10)
    myGLCD.drawLine(i, 117, i, 121);
  for (int i=19; i<220; i+=10)
    myGLCD.drawLine(157, i, 161, i);  
}


void loop()
{
  starttimeMillis = millis();
  starttimeMicros = micros();

      DrawScreen1();

  endtimeMillis = millis();
  endtimeMicros = micros();

  timelapseMillis = endtimeMillis - starttimeMillis;
  timelapseMicros = endtimeMicros - starttimeMicros;

  //need a bit of time to see the result
  delay(1000);
}  




void DrawScreen1(void)
{
  myGLCD.setColor(255,255,0);
  myGLCD.print("Time mS", 5, 55); 
  myGLCD.print("Time uS", 5, 75);
  myGLCD.printNumI(timelapseMillis,80,55);
  myGLCD.printNumI(timelapseMicros,80,75);
}

Converted the output to a string, as the printNumI in "myGLCD.printNumI(timelapseMillis,80,55)" takes an int (16-bit) while a long is 32-bit.
However, I still get up to 10 ms variance in the output between the two values.

Hmm

Do you concede that the issue is how long the code takes to convert
strings to bit-patterns on the LCD, not any difference in behaviour of
micros() and millis(), which both work as expected?

You must compare apples to apples, not apples to oranges, to compare
the two functions. Conflating the code being timed with the code printing
out the results of the timing has been the confusion, and you are still doing it.

MarkT
Yes, I know you are right, but I just don't get it.

And I don't like it when I don't understand it.
I know there would be some variance; however, around 6 ms seems to me to be at the very extreme end of the scale.

All I'm doing is:

Capture timer
do some work - show some variables on the display
recapture timer
then calculate the timer difference and show the timer variance on the next round trip in loop()

I presume that both millis() and micros() just capture the hardware timer and that they don't interfere with each other?
Maybe the TFT display functions do something somewhere that causes this.

Board is a Mega running at 16 MHz.

Thanks for commenting