High precision timer problem

I've been writing a library to send data between two Arduino boards using Manchester II encoding.

The boards are currently back to back but will have other equipment between them at some point (optical stuff / radio).

As I want to do something other than just wait for data, I'm using an external interrupt to detect when the signal changes and measuring the time between changes, using code from http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1193623343.

This currently works some of the time but has occasional issues that corrupt the data.

After a couple of days I finally decided to write some code to check how much time was passing between the changes, without any of my other decoding code running: basically just use the interrupt to measure the time difference, log it for a while, then dump it out of the serial port (after disabling the interrupt).
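For reference, the test harness boils down to something like this (a simplified sketch, not the exact code: the buffer size, the pin, and the use of micros() as the time source are just illustrative):

volatile unsigned long times[64];        // raw timestamps captured in the ISR
volatile unsigned int count = 0;

void onChange() {
  if (count < 64) {
    times[count++] = micros();           // record when the edge arrived
  }
}

void setup() {
  Serial.begin(9600);
  attachInterrupt(0, onChange, CHANGE);  // external interrupt 0 = digital pin 2
}

void loop() {
  if (count >= 64) {
    detachInterrupt(0);                  // stop logging before printing
    for (unsigned int i = 0; i < count; i++) {
      Serial.println(times[i]);          // dump the raw timestamps
    }
    for (;;) {}                          // done
  }
}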

My routine for calculating the differences handles the usual overflow condition by assuming that if a timestamp is earlier than the previous one, the counter has wrapped and it's actually a lot later, but I'm getting strange things happening with the microsecond-resolution timer.
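For anyone reading along, the standard wrap-safe way to take differences of an unsigned 32-bit timestamp is plain unsigned subtraction, along these lines (names are illustrative, this isn't my exact routine):

// Elapsed time between two unsigned 32-bit timestamps.
// If 'now' has wrapped past zero, modulo-2^32 subtraction still gives
// the correct small difference, so no special-case branch is needed.
unsigned long elapsedMicros(unsigned long earlier, unsigned long now) {
  return now - earlier;
}

Differences up near 2^32 like the ones below just mean the raw timestamp stepped backwards by a few microseconds.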

Left-hand side is the raw microsecond time, right-hand side is the calculated time difference:

1307664 - 1004
1307648 - 4294967279
1309680 - 2032

1654804 - 1008
1654784 - 4294967275
1656816 - 2032

Reporting time going backwards is what I would class as Not A Good Thing(tm) for the timer to be doing . . . any ideas on what I can do to fix this?

This thread discusses a somewhat similar requirement and has some code that may help you do what you want: http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1201890734/12

I tried it with a second timer and that's a much cleaner way of doing it than my old code.
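For anyone finding this later, the general idea of the second-timer approach is roughly the following (a simplified sketch assuming a 16 MHz ATmega, not the exact code from the linked thread):

#include <avr/io.h>

// Run Timer1 free-running in normal mode with a /8 prescaler, so each
// tick is 0.5 us at 16 MHz. The 16-bit counter wraps about every 32.8 ms,
// which is plenty for measuring the gap between consecutive signal edges.
void setupTimer1() {
  TCCR1A = 0;              // normal mode, no output compare / PWM
  TCCR1B = (1 << CS11);    // clock / 8 -> 2 MHz
  TCNT1  = 0;
}

// Wrap-safe difference between two Timer1 readings, in 0.5 us ticks.
unsigned int ticksBetween(unsigned int earlier, unsigned int now) {
  return now - earlier;    // 16-bit unsigned wrap handles the overflow
}

Reading TCNT1 inside the edge interrupt and subtracting the previous reading gives the gap directly, without any of the millis() bookkeeping getting in the way.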

Thanks for the suggestion.

Now I just need to try and work out why the receiver just stops running code after a while :-/