Reliable serial comms when interrupts are occasionally disabled, or clarification on the serial receive buffer

When using the FastLED library, interrupts are disabled while the data is being sent to the LED strip, so incoming serial characters can get dropped. This guide walks through some possible solutions, the most successful of which has been using an additional pin so that Arduino X can indicate to Arduino Y (which is running FastLED and is prone to dropping characters) that it has a message, so Y can hold off calling FastLED.show() until it has received the entire message.
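For reference, here is a minimal sketch of the receiver side of that approach, assuming a hypothetical MSG_PENDING_PIN wired from an output on X to an input on Y; the pin numbers, LED count, and handleByte() parser are all illustrative, not part of any library:

#include <FastLED.h>

const uint8_t MSG_PENDING_PIN = 7; // hypothetical pin, driven HIGH by X while it has a message
CRGB leds[30];

void handleByte(char b) {
  // parse the incoming message here
}

void setup() {
  Serial.begin(115200);
  pinMode(MSG_PENDING_PIN, INPUT);
  FastLED.addLeds<WS2812B, 6, GRB>(leds, 30);
}

void loop() {
  // Drain serial while X is signaling, and only call show() (which
  // disables interrupts) once the line is LOW and the buffer is empty.
  while (digitalRead(MSG_PENDING_PIN) == HIGH || Serial.available()) {
    if (Serial.available()) handleByte(Serial.read());
  }
  FastLED.show();
}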

I'd like to forgo that additional pin, if possible. If a byte is written to the serial buffer while interrupts are disabled, is it lost? Or does it only get lost if it is overwritten by the next incoming byte? I'm trying to think up a scheme where X can send a specific byte that indicates to Y that it has a message, but I just don't quite understand how the serial buffer works or how dependent it is on interrupts.

Why not write a small sketch to investigate?

Turn off interrupts, write a byte to the Serial buffer, turn on interrupts, and read the byte. Is the data that comes back correct?

Turn off interrupts, write a byte to the Serial buffer, turn on interrupts, then write a second byte to the Serial buffer. How many bytes are available, and are the values correct?
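Something along these lines would do it as a minimal sketch, assuming an AVR board whose USB-serial port doubles as the sender (type the test bytes from the Serial Monitor during the window; the busy-wait iteration count is a rough guess):

void setup() {
  Serial.begin(115200);
  Serial.println("Send a few bytes now...");
  Serial.flush(); // finish transmitting the prompt before interrupts go off

  noInterrupts();
  // Busy-wait a couple of seconds; delay()/millis() are dead while
  // interrupts are off, so count iterations instead.
  for (volatile unsigned long i = 0; i < 2000000UL; i++) { }
  interrupts();
  delayMicroseconds(100); // let the RX interrupt drain the hardware buffer

  Serial.print("bytes available: ");
  Serial.println(Serial.available());
  while (Serial.available()) {
    Serial.print((char)Serial.read());
  }
  Serial.println();
}

void loop() { }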

I was going to once I got in to work, but I was feeling hopeful that someone might be able to tell me ahead of time if it's a dead end :slight_smile:

Well, here are my findings so far, which are somewhat weird. No matter what characters I send or what baud rate I use, while interrupts are off it only holds onto the first, second, and last characters sent. Nothing in between survives. This was with a Mega, so I'll do some testing to see whether it's the same on an Uno or Nano.

Here is my test code if anyone is interested or wants to offer some insight as to what's going on:

void setup() {
  Serial.begin(115200);
}

bool interruptsOff = false;
unsigned long fakeTimer = 0; // software counter, since millis() stops while interrupts are off

void loop() {
  fakeTimer++;
  if (fakeTimer > 1000000) {
    fakeTimer = 0;
    interruptsOff = !interruptsOff;
    if (interruptsOff) {
      Serial.println("off");
      Serial.flush(); // wait until "off" has fully transmitted before disabling interrupts
      noInterrupts();
    }
    else {
      interrupts();
      Serial.println("on");
    }
  }
  // Echo back anything in the software receive buffer. While interrupts
  // are off, the RX interrupt never runs, so nothing new shows up here.
  if (Serial.available()) {
    char input = Serial.read();
    Serial.print(input);
  }
}

PJRC has a library that blasts WS2812 (aka NeoPixel) data out of a UART port on Teensy devices via DMA. Interrupts are never disabled. It's included in the Teensyduino package.

Alternatively, you could switch to APA102-type LEDs (aka DotStar). They are timed by an SPI-type clock signal, so interrupts don't have to be disabled.
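For example, a minimal FastLED sketch for APA102s, with hypothetical data/clock pins and LED count; since the clock line carries the timing, serial reception keeps working during show():

#include <FastLED.h>

#define NUM_LEDS  30
#define DATA_PIN  11
#define CLOCK_PIN 13

CRGB leds[NUM_LEDS];

void setup() {
  Serial.begin(115200);
  FastLED.addLeds<APA102, DATA_PIN, CLOCK_PIN, BGR>(leds, NUM_LEDS);
}

void loop() {
  if (Serial.available()) {
    leds[0] = CRGB(Serial.read(), 0, 0); // serial RX stays alive during show()
  }
  FastLED.show();
}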

Yes, that is expected behavior. In addition to the software FIFO buffer (which is effectively inaccessible while interrupts are turned off, because the RX interrupt that fills it never fires), the UART has a two-byte hardware receive buffer plus the receive shift register. The first two bytes go into the hardware receive buffer, and the next byte sits in the receive shift register until it is overwritten by subsequent bytes. That accounts for the first, second, and last characters you saw.
This is the case for every AVR with a UART or USART peripheral, AFAIK.
See section 20.7.4 of the ATmega328P datasheet:

The Data OverRun (DORn) Flag indicates data loss due to a receiver buffer full condition. A Data OverRun occurs when the receive buffer is full (two characters), it is a new character waiting in the Receive Shift Register, and a new start bit is detected. If the DORn Flag is set there was one or more serial frame lost between the frame last read from UDRn, and the next frame read from UDRn. For compatibility with future devices, always write this bit to zero when writing to UCSRnA. The DORn Flag is cleared when the frame received was successfully moved from the Shift Register to the receive buffer.
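If you ever need to grab that indicator byte while interrupts are still off, you can poll the hardware directly and bypass the software buffer entirely. A minimal sketch of that, assuming an ATmega328P (Uno/Nano); the register names differ on the Mega (UCSR1A/UDR1 for Serial1, and so on):

#include <avr/io.h>

// Returns true and stores a byte if one is waiting in the two-byte
// hardware FIFO. Works with interrupts off because it reads UDR0
// directly instead of going through the software buffer.
bool pollHardwareByte(uint8_t &b) {
  if (UCSR0A & _BV(RXC0)) { // Receive Complete: unread data in the FIFO
    b = UDR0;               // reading UDR0 pops one byte off the FIFO
    return true;
  }
  return false;
}

Note that any byte you pop this way never reaches Serial.read(), so don't mix the two approaches on the same data.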

Okay cool, thank you. As long as those three characters will reliably come through, I think I can make something work.

The 'right' way to do this is to implement a protocol.

The sender sends one byte and waits for an echo. When the receiver receives the character, it echoes it back; that indicates to the sender that it can send the next byte. Rinse and repeat till all data is transmitted.
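A minimal sketch of the sender side, assuming the two boards are linked on Serial1 at the same baud rate; the timeout value and retry count are illustrative:

bool sendWithEcho(uint8_t b) {
  const unsigned long ECHO_TIMEOUT_MS = 50; // illustrative; tune to your baud rate

  Serial1.write(b);
  unsigned long start = millis();
  while (millis() - start < ECHO_TIMEOUT_MS) {
    if (Serial1.available()) {
      return Serial1.read() == b; // a mismatch means the frame was corrupted
    }
  }
  return false; // timed out: receiver busy (e.g. inside show()) or byte lost
}

// Retry each byte a few times before giving up on the whole message.
bool sendMessage(const uint8_t *msg, size_t len) {
  for (size_t i = 0; i < len; i++) {
    bool ok = false;
    for (int attempt = 0; attempt < 3 && !ok; attempt++) {
      ok = sendWithEcho(msg[i]);
    }
    if (!ok) return false;
  }
  return true;
}

On the receiving end, the echo is just Serial1.write(Serial1.read()) for each byte, done while interrupts are enabled (i.e. outside of FastLED.show()).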

Wouldn't that be kind of slow, especially with a fairly constant data stream? It does seem like it would be very reliable and consistent, but do you have any guidance or links for error handling? What kind of timeout before a character is resent, what to do when the wrong character is echoed back, and so on?
