Has anybody successfully decoded the SENT protocol using an Arduino Uno and/or Mega?
I promise I've read the discussions I can find in this forum, but they all seem to close without any resolution. I'm not sure if that is because it's so easy that everyone figured it out for themselves, so complicated that everyone gave up, or it's simply not possible.
My problem isn't the actual decoding, that's pretty easy. Using my Saleae Analyzer I can see and decode the data stream and I've written some code for the Uno that decodes the data just fine if I manually input the falling edge to falling edge durations.
My problem is accurately measuring the falling-edge-to-falling-edge durations in the first place. As the tick time in SENT is defined as 3 µs < tick < 90 µs, I had hoped that the particular sensor I need to decode would use a tick time close to the 90 µs end of the spectrum, but alas it's actually 3.15 µs, so interrupt triggering just isn't accurate enough.
SENT is a fairly simple protocol incorporating a CRC, and it's an extremely fast data stream, so I'm fairly confident that even if I could only measure the edges to 1 µs, that would be accurate enough for my needs, as I could simply ignore any frame that fails the CRC. I'm not too bothered about the transmission speed, more the accuracy of the data.
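Since the plan is to discard any frame that fails the CRC, here is a minimal off-target sketch of the SENT CRC-4 as recommended by SAE J2716 (polynomial x^4 + x^3 + x^2 + 1, seed 0b0101, computed over the data nibbles). The function names are mine, not from any existing library; check your sensor's variant, as some legacy devices include the status nibble in the calculation.

```cpp
#include <cstdint>

// Run 4 zero bits through the 4-bit CRC register (polynomial x^4+x^3+x^2+1).
static uint8_t crc4_step(uint8_t crc) {
  for (int i = 0; i < 4; ++i) {
    uint8_t msb = crc & 0x8;
    crc = (crc << 1) & 0xF;
    if (msb) crc ^= 0xD;   // low 4 bits of the 0x1D polynomial
  }
  return crc;
}

// CRC over the data nibbles, seed 5, with final zero-augmentation,
// matching the J2716 recommended implementation.
uint8_t sent_crc4(const uint8_t *nibbles, uint8_t n) {
  uint8_t crc = 5;
  for (uint8_t i = 0; i < n; ++i)
    crc = crc4_step(crc) ^ (nibbles[i] & 0xF);
  return crc4_step(crc);
}
```

On the AVR this is usually done with the equivalent 16-entry lookup table instead of `crc4_step`; compare the result against the received CRC nibble and drop the frame on mismatch.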
Any advice / guidance / words of encouragement would be very gratefully received !
If the micro doesn't have anything else to do, polling is much faster and has less latency than interrupts.
To time a transition on a pin you can use something like this (used to time codewheel transitions):
uint16_t then, now, period; //16-bit to match TCNT1

//first pass synchronizes edges
while( (PINC&(1<<PC5)) ==0); //seeing white, wait till black
while( (PINC&(1<<PC5)) !=0); //seeing black, wait till white
// TCNT1 is a 16 bit timer counter, running at 1 usec per tick
then=TCNT1; //now time one full revolution
while( (PINC&(1<<PC5)) ==0); //seeing white, wait till black
while( (PINC&(1<<PC5)) !=0); //seeing black, wait till white
now=TCNT1;
period=now-then; //always works for unsigned ints
On AVR processors, those while loops each compile to two machine instructions: bit test and loop back.
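If the micro does have other work to do, the Uno's Timer1 input-capture unit is worth a look: the hardware latches the timestamp into ICR1 at the moment of the edge, so interrupt latency no longer affects the measurement. A sketch of the setup, assuming a 16 MHz Uno (register names are from the ATmega328P datasheet; the variable names are mine):

```cpp
// Timestamp falling edges on ICP1 (digital pin 8 on the Uno) with
// Timer1 input capture. Prescaler clk/8 gives 0.5 us per timer count.

volatile uint16_t last_capture = 0;
volatile uint16_t edge_period  = 0;     // falling edge to falling edge, in 0.5 us counts
volatile bool     edge_ready   = false;

void setup() {
  pinMode(8, INPUT);                    // ICP1 is fixed to pin 8 on the Uno
  TCCR1A = 0;                           // normal counting mode
  TCCR1B = _BV(ICNC1) | _BV(CS11);      // noise canceler on, ICES1=0 (falling edge), clk/8
  TIFR1  = _BV(ICF1);                   // clear any stale capture flag
  TIMSK1 = _BV(ICIE1);                  // enable the input-capture interrupt
}

ISR(TIMER1_CAPT_vect) {
  uint16_t now = ICR1;                  // hardware-latched timestamp
  edge_period  = now - last_capture;    // unsigned subtraction handles wraparound
  last_capture = now;
  edge_ready   = true;
}
```

The ISR still has latency, but only for reading out the result; the captured value itself is exact to one timer count.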
// added these definitions to the top
#define INVERTED_SIGNAL 1 // added since the signal from my device was inverted for some odd reason
#define NOISE_CANCELER_ENABLE 1 // likely not necessary, but can't hurt
...
// increased the serial speed, likely not necessary for slow SENT speeds, but required for 3 µs ticks
constexpr auto serial_baud = 250000; // used to be 115200, increased to outpace the SENT bus, otherwise the buffer overflows quickly
...
// implemented the new #defines at the top
TCCR1B = 1 | INVERTED_SIGNAL<<6 | NOISE_CANCELER_ENABLE<<7; // used to be just '1'; bit 6 is ICES1 (input capture edge select), bit 7 is ICNC1 (input capture noise canceler)
...
// replaced the following line
dx = (dx - cycl_offset) / LOOKUP_DIV;
// with
dx = (dx - cycl_offset - 12 * cycl_tick) / LOOKUP_DIV;
// this removes the constant 12 ticks from each pulse, since the pulses are 12-27 ticks long but the lookup table is configured for looking up 0-15 ticks
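For reference, the mapping that the 12-tick correction implements can be written as a stand-alone helper: a SENT data pulse is 12 + value ticks long (value 0..15), so after rounding the measured duration to whole ticks, subtracting 12 yields the nibble. The names and rounding scheme below are mine, assuming the pulse duration and the measured tick are in the same timer units:

```cpp
#include <cstdint>

// Convert one falling-edge-to-falling-edge pulse into a SENT nibble,
// or -1 if the duration is outside the legal 12-27 tick range.
int sent_nibble(uint32_t pulse_counts, uint32_t tick_counts) {
  uint32_t ticks = (pulse_counts + tick_counts / 2) / tick_counts; // round to nearest tick
  if (ticks < 12 || ticks > 27) return -1;
  return (int)(ticks - 12);                                        // 0..15
}
```

Rounding to the nearest tick tolerates up to half a tick of jitter either way, which is the same idea as the 11.5-tick suggestion below.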
115200 is a little on the slow side, leaving no time for printing anything other than the decoded nibbles. Maybe I also got lucky with my sensor, which also sends padding.
I would not multiply by 12 but rather by 11.5 to allow for jitter of the sensor.
Also I wonder why cycl_offset does not do the job here.