Arduino Mega 1280 delay() turbocharged

Hi,

I’m using an Arduino Mega 1280 on 64-bit Linux with the Arduino 0021 software release. I can compile and upload the “Blink” sketch without problems, but once it is on the board, the delay(1000) call, which is supposed to add a one-second delay, runs much quicker than that.

Namely, delay(1000) results in a delay of perhaps 100 milliseconds (the LED just keeps blinking really quickly). Changing the call to, for instance, delay(5000) results in the LED blinking a little slower, but still well above once per second.

Bottom line: the delay() function is running MUCH faster than it should. The sketch “Blink Without Delay”, which uses the millis() function, seems to work fine. Also, compiling and uploading the “Blink” sketch from Windows works fine.

Does anyone have a clue what the issue might be here? Maybe something with the Linux version of the Arduino libraries?

Thanks in advance,
-Igor

I guess you have checked the board setting in the Tools menu?

Are you sure your upload of the Blink sketch was successful? I think the bootloader blinks the pin 13 LED at a fast rate if there is no valid sketch in flash memory.

Lefty

Thanks for the suggestions, however still no go:

dafid: The board is configured correctly as Mega (1280) under the tools menu

retrolefty: The Arduino software confirms that the upload completed. Also, if I change the delay() argument to a higher number, e.g. delay(7000), I can tell the difference in the blink rate. The same happens if I output the signal on another pin (I tried pin 12).

I'm still thinking this might be a compiler/library problem on 64-bit Linux; any takers?

Thanks, -igor

I'm still thinking this might be a compiler/library problem on 64-bit Linux; any takers?

Must be something like that.

You would think we would have a good well debugged blink sketch by now. :D

Lefty

Haha true that!

I'll try the 32-bit Linux version as soon as I fix my other laptop.

-Igor

I guess the simplest question is always 'what does your code look like?'

One possibility is that 01000 is an octal (base 8) constant in C++, i.e. 1x8x8x8 = 512 (decimal), not 1000 = 1x10x10x10.

So leading '0' zeros on numbers make them smaller than they look...

Dave

Hi dafid, 

Thanks for the tip, but that is not the problem. The code is just the standard Blink sketch, i.e.:
[code]
/*
  Blink
  Turns on an LED on for one second, then off for one second, repeatedly.
 
  This example code is in the public domain.
 */

void setup() {                
  // initialize the digital pin as an output.
  // Pin 13 has an LED connected on most Arduino boards:
  pinMode(13, OUTPUT);    
}

void loop() {
  digitalWrite(13, HIGH);   // set the LED on
  delay(1000);              // wait for a second
  digitalWrite(13, LOW);    // set the LED off
  delay(1000);              // wait for a second
}

[/code]

and it runs just as expected on the board when it is compiled and uploaded from Windows. It is only “turbocharged” when compiled and uploaded from Linux. Also the millis() function seems to work fine. I dug up the implementation of delay() and it uses delayMicroseconds(), so I’ll do some testing on delayMicroseconds() to see whether that is the problem or whether it is some kind of compilation issue.
-Igor

I just received an Arduino UNO!

The first thing I tried is uploading the blink sketch! Changed the board type to UNO in the IDE and uploaded the sketch.

Behold! The LED is always ON, with no fast blinking like on the Arduino Mega 1280, and changing the delay() argument to, for instance, delay(5000), which would result in a slower-blinking LED on the Mega, appears to do nothing; the LED stays constantly on. Uploading the example "Blink Without Delay" works fine on both the Uno and Mega boards.

This suggests that something is seriously wrong with the 64-bit Linux libraries, since a call to the same function (delay(1000)) results in different behaviour on different boards. I fear that many other functions may give unexpected results and I can't trust these libraries... so if anyone has more clues about these issues I would love to hear some suggestions before installing a copy of window$...

Also, niadh, in the topic "Arduino not responding correctly", is having strange issues on 64-bit Linux; maybe this is related?

Is there a place where I can file a bug with the devs about this issue? Open-source hardware should have a good IDE implementation for open-source software, no?

Thanks in advance, -Igor

Igor, you are right, there is nothing to go wrong in your code :). And since it works from Windows and not from Linux, that suggests either the libraries are the problem, or the compiler is.

It would be good to see an avr-objdump disassembly of the file that gets uploaded in each case.

As the WinAVR distribution said (don't recall exactly where; I will try to track it down), the libraries are identical for all host platforms (32-bit, 64-bit, x86, ...) since they vary with the target being compiled for, not the platform doing the compiling...

But since this is a case where the behaviour is broken... it could be that the libraries are different, just not deliberately.

An aside, regarding your last message: I saw that delay() (in V21 and V22, from what I understand), which lives in wiring.c, appears to be

void delay(unsigned long ms)
{
    uint16_t start = (uint16_t)micros();

    while (ms > 0) {
        if (((uint16_t)micros() - start) >= 1000) {
            ms--;
            start += 1000;
        }
    }
}

(Since I am looking at source that I did not use to build the libraries, I am not 100% confident of what I am reading.)

So it does not use delayMicroseconds(), but rather loops until it has seen that many 1000-microsecond chunks of micros() go by...

And micros() is updated by the timer0 overflow interrupt... and init() in wiring.c has conditional code to set up the timer0 control registers.

It is all a bit hard in the abstract, as I have 32-bit Ubuntu and 64-bit Windows... and not enough time to reverse-engineer this.

I'm running 64-bit Linux and Arduino 0021 with both a Mega and an Uno and have had no issues at all with delay(); it runs at the correct speed here. I use the Mega for CNC, so I think I would have noticed by now if there were a problem with delay()!

I'd suggest backing up your sketches, then completely removing your Arduino directories, then downloading and reinstalling. It sounds like an installation problem.

Which distribution are you using, and what is the output of running avr-gcc -v?

Thanks for the help guys!

dafid: In fact you're right. I looked at wiring.c a couple of days ago and for some reason remembered it using delayMicroseconds(), but I just checked again and the implementation of delay() in my wiring.c is identical to what you posted above. It does indeed use micros(); my mistake!

stimmer: I'm running 64-bit Arch Linux and Arduino 0021. The IDE installation on Arch is a bit of a mess, since it bundles its own patched avrdude and librxtx instead of using the system-wide ones. Also, the installation requires removing the 64-bit gcc and gcc-libs packages and installing 32/64-bit compatible versions of them (gcc-multilib), which is pretty hackish IMHO. In addition, the package is not available from the official repositories but is user-contributed, so an installation problem is very plausible. I have removed everything and reinstalled, but the results are the same.

Compiler and other program versions are:

[igor@X201 /usr/share/arduino/hardware/arduino/cores/arduino]$ avr-gcc -v
Using built-in specs.
COLLECT_GCC=avr-gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/avr/4.5.2/lto-wrapper
Target: avr
Configured with: ../configure --disable-libssp --disable-nls --enable-languages=c,c++ --infodir=/usr/share/info --libdir=/usr/lib --libexecdir=/usr/lib --mandir=/usr/share/man --prefix=/usr --target=avr --with-gnu-as --with-gnu-ld --with-as=/usr/bin/avr-as --with-ld=/usr/bin/avr-ld
Thread model: single
gcc version 4.5.2 (GCC)
[igor@X201 /usr/share/arduino/hardware/arduino]$ avrdude -v

avrdude: Version 5.10, compiled on Nov 29 2010 at 11:24:36

Can you please let me know your avr-gcc compiler version?

Thanks! -Igor

I am on Ubuntu 10.10, avr-gcc 4.3.5

$ avr-gcc -v
Using built-in specs.
Target: avr
Configured with: ../src/configure -v --enable-languages=c,c++ --prefix=/usr/lib --infodir=/usr/share/info --mandir=/usr/share/man --bindir=/usr/bin --libexecdir=/usr/lib --libdir=/usr/lib --enable-shared --with-system-zlib --enable-long-long --enable-nls --without-included-gettext --disable-checking --disable-libssp --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=avr
Thread model: single
gcc version 4.3.5 (GCC)

I don't believe it would be caused by changing avrdude or librxtx. I wonder if it is a problem with GCC 4.5.2?

Thanks for that. I'll try to downgrade my avr-gcc, but it's not a trivial matter on Arch Linux (rolling-release model). -Igor

Hello all,

I would like to report that I have the same problem with the delay() function on 32-bit Arch and avr-gcc 4.5.2.

I have the problem on three different boards: an Arduino Nano, and Arduino Duemilanove boards with both the ATmega168 and the ATmega328.

On the 168 the LED blinks very fast, and on the 328 and the Nano the LED is just on all the time.

I've reproduced the problem on three different computers running Arch. On one of them I downgraded avr-gcc to 4.3.5 and the problem was still there.

delay() works fine when I compile my code and upload it using Ubuntu 10.04 and avr-gcc 4.3.5.

Regards, Aleks

I'm baffled by this one. I tried to compile GCC 4.5.2 on Ubuntu but just got too many errors. I also tried to install Arch Linux, but VirtualBox was not cooperating, so I gave up on that too.

Are either of you able to disassemble the object file and see what the compiler is doing to the delay function?

I wonder if the compiler is misconfigured, or is picking up a delay function from some other header somewhere... Does Arch set any environment variables that might interfere with GCC, like CFLAGS, CXXFLAGS, C_INCLUDE_PATH, CPLUS_INCLUDE_PATH or LIBRARY_PATH?

Hi stimmer,

None of those environment variables seem to be set:

[igor@X201 ~]$ echo $PATH
/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/opt/java/bin:/opt/java/db/bin:/opt/java/jre/bin:/usr/bin/core_perl:/usr/local/bin:/usr/local/bin
[igor@X201 ~]$ echo $CFLAGS

[igor@X201 ~]$ echo $C_INCLUDE_PATH

[igor@X201 ~]$ echo $C_PLUS_INCLUDE_PATH

[igor@X201 ~]$ echo $LIBRARY_PATH

[igor@X201 ~]$

I'll dive into the issue in more detail as soon as I get some free time. I just updated Arch Linux and saw that it pulled in a new gcc, so I will recompile the sketches and see if anything changes. If I can't figure anything out, I'll get in touch with the Arch Linux Arduino package maintainer and let him know about this issue, since it seems to be Arch-specific.

Cheers, -Igor

Could you try doing a disassembly of the delay function?

Load up the Blink sketch, then hold down shift as you click Verify, then look in the box at the bottom for the temporary directory it uses.

In a terminal, type cd /tmp/buildnnnnnnnnnnn.tmp (where buildnnnnnnnnnnn.tmp is the temporary directory from above)

Then type avr-objdump -S Blink.cpp.elf

Then find the delay symbol, cut and paste from there up to the next symbol, and post it here; it should look like this:

000004c4 <delay>:

void delay(unsigned long ms)
{
 4c4:	ef 92       	push	r14
 4c6:	ff 92       	push	r15
 4c8:	0f 93       	push	r16
 4ca:	1f 93       	push	r17
 4cc:	cf 93       	push	r28
 4ce:	df 93       	push	r29
 4d0:	7b 01       	movw	r14, r22
 4d2:	8c 01       	movw	r16, r24
	uint16_t start = (uint16_t)micros();
 4d4:	0e 94 3c 02 	call	0x478	; 0x478 <micros>
 4d8:	eb 01       	movw	r28, r22
 4da:	0e c0       	rjmp	.+28     	; 0x4f8 <delay+0x34>

	while (ms > 0) {
		if (((uint16_t)micros() - start) >= 1000) {
 4dc:	0e 94 3c 02 	call	0x478	; 0x478 <micros>
 4e0:	6c 1b       	sub	r22, r28
 4e2:	7d 0b       	sbc	r23, r29
 4e4:	68 5e       	subi	r22, 0xE8	; 232
 4e6:	73 40       	sbci	r23, 0x03	; 3
 4e8:	c8 f3       	brcs	.-14     	; 0x4dc <delay+0x18>
			ms--;
 4ea:	08 94       	sec
 4ec:	e1 08       	sbc	r14, r1
 4ee:	f1 08       	sbc	r15, r1
 4f0:	01 09       	sbc	r16, r1
 4f2:	11 09       	sbc	r17, r1
			start += 1000;
 4f4:	c8 51       	subi	r28, 0x18	; 24
 4f6:	dc 4f       	sbci	r29, 0xFC	; 252

void delay(unsigned long ms)
{
	uint16_t start = (uint16_t)micros();

	while (ms > 0) {
 4f8:	e1 14       	cp	r14, r1
 4fa:	f1 04       	cpc	r15, r1
 4fc:	01 05       	cpc	r16, r1
 4fe:	11 05       	cpc	r17, r1
 500:	69 f7       	brne	.-38     	; 0x4dc <delay+0x18>
		if (((uint16_t)micros() - start) >= 1000) {
			ms--;
			start += 1000;
		}
	}
}
 502:	df 91       	pop	r29
 504:	cf 91       	pop	r28
 506:	1f 91       	pop	r17
 508:	0f 91       	pop	r16
 50a:	ff 90       	pop	r15
 50c:	ef 90       	pop	r14
 50e:	08 95       	ret

00000510 <init>:

igor86: None of those environment variables seem to be set:

None of those variables should normally be set; the makefile sets them, and they are basically only set while make is running (i.e. during the build process).

Hi, I’m having the same problem Igor86 describes: Arduino Uno, Arch Linux 64-bit, avr-gcc version 4.5.2.

Here’s my disassembly of the standard example Blink sketch, as Stimmer requested:

00000210 <delay>:

void delay(unsigned long ms)
{
 210:	ef 92       	push	r14
 212:	ff 92       	push	r15
 214:	0f 93       	push	r16
 216:	1f 93       	push	r17
 218:	cf 93       	push	r28
 21a:	df 93       	push	r29
 21c:	7b 01       	movw	r14, r22
 21e:	8c 01       	movw	r16, r24
	uint16_t start = (uint16_t)micros();
 220:	0e 94 e2 00 	call	0x1c4	; 0x1c4 <micros>
 224:	eb 01       	movw	r28, r22

	while (ms > 0) {
 226:	0f c0       	rjmp	.+30     	; 0x246 <delay+0x36>
		if (((uint16_t)micros() - start) >= 1000) {
 228:	0e 94 e2 00 	call	0x1c4	; 0x1c4 <micros>
 22c:	6c 1b       	sub	r22, r28
 22e:	7d 0b       	sbc	r23, r29
 230:	83 e0       	ldi	r24, 0x03	; 3
 232:	68 3e       	cpi	r22, 0xE8	; 232
 234:	78 07       	cpc	r23, r24
 236:	38 f0       	brcs	.+14     	; 0x246 <delay+0x36>
			ms--;
 238:	08 94       	sec
 23a:	e1 08       	sbc	r14, r1
 23c:	f1 08       	sbc	r15, r1
 23e:	01 09       	sbc	r16, r1
 240:	11 09       	sbc	r17, r1
			start += 1000;
 242:	c8 51       	subi	r28, 0x18	; 24
 244:	dc 4f       	sbci	r29, 0xFC	; 252

void delay(unsigned long ms)
{
	uint16_t start = (uint16_t)micros();

	while (ms > 0) {
 246:	e1 14       	cp	r14, r1
 248:	f1 04       	cpc	r15, r1
 24a:	01 05       	cpc	r16, r1
 24c:	11 05       	cpc	r17, r1
 24e:	61 f7       	brne	.-40     	; 0x228 <delay+0x18>
		if (((uint16_t)micros() - start) >= 1000) {
			ms--;
			start += 1000;
		}
	}
}
 250:	df 91       	pop	r29
 252:	cf 91       	pop	r28
 254:	1f 91       	pop	r17
 256:	0f 91       	pop	r16
 258:	ff 90       	pop	r15
 25a:	ef 90       	pop	r14
 25c:	08 95       	ret

0000025e <init>:

I notice that at offset 226 above the rjmp looks like it’s inside the while (ms > 0) { } loop,

and at offsets 230-234 things are handled slightly differently, but my reading of the assembly language isn’t good enough to tell me what that means :slight_smile:

Hope this is helpful as I’d really like to be programming my new Arduino Uno from my Arch system too!

Thanks to Stimmer for looking into this problem.