SENT (Single Edge Nibble Transmission) decoder with Arduino

Hey there.

I'm looking for a way to decode the SENT (SAE J2716 protocol) signal using an Arduino UNO/Nano. A small tutorial on SENT is attached as a PDF document (Figure 2 of the document is the important one).

The SENT signal comes from an MLX 90367 Hall sensor attached to an actuator.

Here are the things I'm looking for:
1. Synchronization pulse width (in microseconds [µs]) --> to calculate the clock tick
2. Data 1 signal (3 nibbles)
Status and CRC are not required for now.
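
For reference, this is roughly the arithmetic I'm planning to use, based on Figure 2 of the tutorial (the sync pulse is 56 ticks, and a nibble pulse is 12 + value ticks). Completely untested, and the names are just placeholders:

// Rough decoding arithmetic I have in mind (untested, names are placeholders):
// the sync pulse is 56 clock ticks wide, so its measured width gives the tick time,
// and each nibble pulse is (12 + value) ticks wide.
uint8_t decodeNibble(unsigned long syncWidthUs, unsigned long nibbleWidthUs) {
  float tickUs = syncWidthUs / 56.0;                    // one clock tick in µs
  int value = (int)(nibbleWidthUs / tickUs + 0.5) - 12; // round to the nearest tick count
  return constrain(value, 0, 15);                       // nibble value 0..15
}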

Is it possible to use Arduino's pulseIn() function, like this:

duration1 = pulseIn(<pin>, HIGH);

Will consecutive pulseIn() readings be trustworthy, given that each clock tick in my case is 3 µs (+/-25%)?

If not, is there a better solution to achieve the above-mentioned points?

Also, if I use an array to store the pulse widths (1 synchronization pulse and 3 nibble data pulses), will it add latency to the pulse measurements?

Thanks in advance for any replies.

IDT_Tutorial-Digital-SENT-Interface-ZSSC416x-7x_WHP_20160410 (1).pdf (258 KB)

born-2learn:
1. Synchronization pulse width (in microseconds [µs]) --> to calculate the clock tick

I read that the purpose of the synchronization pulse is to begin your clock ticking, not to "calculate" the clock tick. If your clock pulse edges are not very close to the device's clock pulse edges, you will not decode the message properly.

Paul

This is what I think after checking that tutorial.

The pulseIn() function doesn't seem suitable, as it measures the time between a rising and a falling edge (HIGH pulse) or between a falling and a rising edge (LOW pulse), while you have to measure the time between two falling edges (at the start and end of the synchronisation pulse). That time is 56 ticks, so 3 × 56 = 168 µs in your case.

Nibbles then take 12 to 27 ticks, or 36 to 81 µs. These are all timings that are no problem for an Arduino running at 8 or 16 MHz.

The same goes for the communication itself: you have to measure the time between two falling edges. One way to do this is to use a FALLING interrupt, and measure the time using one of the timers with its prescaler set to clk/8 (0.5 µs per count at 16 MHz). Each falling edge marks the end of one nibble and the beginning of the next.
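
Very roughly, and completely untested, that edge-to-edge measurement could look something like this (pin 2, the Serial output and the variable names are only there for illustration):

// Untested sketch: measure the time between two falling edges on pin 2, using
// Timer1 as a free-running counter at clk/8 (0.5 µs per count on a 16 MHz board).
volatile uint16_t lastEdge = 0;
volatile uint16_t pulseCounts = 0;     // falling-edge to falling-edge time, in 0.5 µs counts
volatile bool newPulse = false;

void onFallingEdge() {
  uint16_t now = TCNT1;                // read the free-running 16-bit counter
  pulseCounts = now - lastEdge;        // unsigned subtraction handles counter wrap-around
  lastEdge = now;
  newPulse = true;
}

void setup() {
  Serial.begin(115200);
  pinMode(2, INPUT);
  TCCR1A = 0;                          // Timer1 in normal counting mode
  TCCR1B = _BV(CS11);                  // prescaler 8: one count every 0.5 µs at 16 MHz
  attachInterrupt(digitalPinToInterrupt(2), onFallingEdge, FALLING);
}

void loop() {
  if (newPulse) {
    noInterrupts();                    // copy the shared variables atomically
    uint16_t t = pulseCounts;
    newPulse = false;
    interrupts();
    Serial.println(t * 0.5);           // edge-to-edge time in microseconds
  }
}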

Another option may be to use the Input Capture function, as is also done by AltSoftSerial, to capture the duration of the pulses (I've never used this function myself, so I'm not very familiar with it).

Can you please direct me to some code snippets, or tell me how to use attachInterrupt() on the Arduino UNO/Nano? I'm new to it.

Is it something like

attachInterrupt(digitalPinToInterrupt(2),function1,FALLING);

I want to measure the time between two FALLING edges, which is in the range of 30 µs to 120 µs.

What would the function1 I mentioned in attachInterrupt() look like? Do I use micros() and subtract timestamps? Will that be efficient enough to measure such short periods without much latency?
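
Something like this is what I had in mind for function1 (just my untested guess):

// My untested guess at function1, using micros() and subtracting timestamps:
volatile unsigned long lastEdgeUs = 0;
volatile unsigned long periodUs = 0;   // time between the last two falling edges

void function1() {
  unsigned long now = micros();
  periodUs = now - lastEdgeUs;
  lastEdgeUs = now;
}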

Thanking you in advance.

Ugh... Good luck with that, it's absolutely not beginner level code.

Start here to learn more about interrupts.

micros() has a resolution of 4 µs, which is probably not good enough, meaning you have to start working with the timers directly to keep track of the time passed. On an ATmega328P based Arduino (Uno, Pro Mini, Nano) that'd have to be Timer1, as it's the only 16-bit timer; it's going to be hard to implement this with an 8-bit timer.

Next, you have to keep track of when the previous interrupt happened, to be able to calculate how long this nibble (or start pulse) lasted.

You'll have to do bit shifts and other bitwise operations to store the values, and you'll have to keep track of where in the communication you are as well (which nibble, the start/stop pulse, or idle).
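
To give an idea of what I mean, a very rough (untested) outline of that bookkeeping could be something like the following. The state names and the sync detection window are made up; the widths are assumed to be in 0.5 µs Timer1 counts with a tick of roughly 3 µs:

// Very rough, untested outline of the frame bookkeeping. Widths are in 0.5 µs
// Timer1 counts; the sync window assumes a tick of roughly 3 µs (sync = 56 ticks).
enum SentState { WAIT_SYNC, STATUS_NIBBLE, DATA_NIBBLES };

volatile SentState state = WAIT_SYNC;
volatile uint16_t tickCounts = 0;   // one SENT tick in timer counts, learned from the sync pulse
volatile uint16_t data12 = 0;       // the three data nibbles packed into 12 bits
volatile uint8_t  nibbleIndex = 0;
volatile bool     frameReady = false;

void handleEdge(uint16_t widthCounts) {          // call this from the falling-edge ISR
  switch (state) {
    case WAIT_SYNC:
      // the sync pulse is 56 ticks; with a ~3 µs tick that is around 336 counts
      if (widthCounts > 250 && widthCounts < 450) {
        tickCounts = widthCounts / 56;           // learn the actual tick length
        state = STATUS_NIBBLE;
      }
      break;
    case STATUS_NIBBLE:                          // status nibble, ignored for now
      nibbleIndex = 0;
      data12 = 0;
      state = DATA_NIBBLES;
      break;
    case DATA_NIBBLES: {
      uint8_t value = (widthCounts + tickCounts / 2) / tickCounts - 12;  // 0..15
      data12 = (data12 << 4) | (value & 0x0F);   // shift earlier nibbles up, append this one
      if (++nibbleIndex == 3) {                  // the three nibbles of Data 1 are in
        frameReady = true;
        state = WAIT_SYNC;                       // skip the rest, wait for the next sync
      }
      break;
    }
  }
}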

To my surprise there is no Arduino library for this protocol yet, even though there are several older threads in this forum that talk about the same protocol. It doesn't look that hard to write a library for, so it must be a pretty rare requirement.

Interrupt latency would be a real problem for decoding messages that differ by only a few usec pulse width. To make it work with interrupts, you will probably have to program in assembler.

Before you seriously consider going down that route, spend some time researching the topic. The AVR Freaks forum is a good resource, for example: https://www.avrfreaks.net/forum/maximum-interrupt-latency

jremington:
Interrupt latency would be a real problem for decoding messages that differ by only a few usec pulse width. To make it work with interrupts, you will probably have to program in assembler.

I've never experienced more than 1-2 clock ticks of difference in getting into the ISR - be it pin change or external. This was while timing the discharge of a capacitor using a digital pin interrupt and a fixed resistor for the discharge: I get a series of a hundred measurements within that margin. Accurate enough that I can clearly tell the difference between a ceramic and a film cap; I can even see the stability difference between a polypropylene and a polyester cap.

An option for even more precise timing is of course the input capture function. Then the timing part is done in hardware, and ISR latency is not an issue any more.
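
I haven't tried it, but from the ATmega328P datasheet a bare-bones input capture setup would look roughly like this (ICP1 is digital pin 8 on the Uno/Nano; completely untested):

// Untested sketch: Timer1 input capture on ICP1 (digital pin 8 of an Uno/Nano).
// The capture hardware latches TCNT1 at the falling edge, so ISR latency no
// longer affects the measurement. Counts are 0.5 µs each at 16 MHz with clk/8.
volatile uint16_t lastCapture = 0;
volatile uint16_t periodCounts = 0;
volatile bool newPeriod = false;

ISR(TIMER1_CAPT_vect) {
  uint16_t now = ICR1;                 // timestamp latched by hardware at the edge
  periodCounts = now - lastCapture;    // unsigned subtraction handles wrap-around
  lastCapture = now;
  newPeriod = true;
}

void setup() {
  Serial.begin(115200);
  pinMode(8, INPUT);                   // ICP1 input
  TCCR1A = 0;                          // normal counting mode
  TCCR1B = _BV(CS11);                  // clk/8, capture on falling edge (ICES1 = 0)
  TIFR1  = _BV(ICF1);                  // clear any stale capture flag
  TIMSK1 = _BV(ICIE1);                 // enable the input capture interrupt
}

void loop() {
  if (newPeriod) {
    noInterrupts();
    uint16_t t = periodCounts;
    newPeriod = false;
    interrupts();
    Serial.println(t * 0.5);           // period between falling edges, in µs
  }
}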

I've never experienced more than 1-2 clock ticks of difference in getting into the ISR - be it pin change or external.

You've been lucky.

To gain more insight into the range of possibilities, take a look at the link I posted, and at other topical entries on the AVR Freaks forum. Consider also that multiple interrupts (e.g. the millis() tick) can be pending, with the critical one being last in the queue.

Yes... maybe lucky. And it may explain the occasional off reading that I get :slight_smile:

Also, of course, I should have said "variation in latency", as I don't care about the actual latency as long as it's a constant time. I know it takes a number of clock ticks between the interrupt occurring and my ISR running; as long as that's always the same number, there is no problem for me. That's simply calibrated out.

And indeed I should look into the input capture, as that stores the actual moment the interrupt happened.