Speeding Up Serial Communication

Hey guys,

A bit of an Arduino noob here. I'm trying to get one Arduino to read an analog signal and then send a signal through Bluetooth to a second Arduino that will then send a TTL signal out from one of its pins.

For my experiment, it is crucial to have a constant, and preferably small, delay. However, the delay of the signal sent out from the pin seems to be inconsistent (fluctuating anywhere between 10 ms and 100 ms).

Judging from other posts I've consulted, this seems to be more of a software issue, as the serial library is not optimized. Is there a way to make this delay more consistent, and even smaller (preferably under 10 microseconds)? I've included a copy of the code on both the master and slave devices just in case.

Master

#define FASTADC 1
// defines for setting and clearing register bits
#ifndef cbi
#define cbi(sfr, bit) (_SFR_BYTE(sfr) &= ~_BV(bit))
#endif
#ifndef sbi
#define sbi(sfr, bit) (_SFR_BYTE(sfr) |= _BV(bit))
#endif





void setup() {
  Serial.begin(1382400);
  pinMode(5, OUTPUT);
  digitalWrite(5, LOW);
  #if FASTADC
  // set ADC prescaler to 8 (ADPS2:0 = 011) for a faster analogRead()
  cbi(ADCSRA, ADPS2);
  sbi(ADCSRA, ADPS1);
  sbi(ADCSRA, ADPS0);
  #endif
}

void loop() {
  if (analogRead(A1) > 200) {
    Serial.write(1);
//  digitalWrite(5, HIGH);
//  delay(1);
//  Serial.print("trig");
//  digitalWrite(5, LOW);
    delay(1000);
  }

}

Slave

#define signalPin 5

void setup() {
  
  pinMode(signalPin, OUTPUT);
  digitalWrite(signalPin, LOW);
  Serial.begin(1382400);
}

void loop() {

  while (!Serial.available()) {
  }
  digitalWrite(signalPin, HIGH);
  delay(10000);
  digitalWrite(signalPin, LOW);
}

Thank you.

fusionisnotcoming: A bit of an Arduino noob here.

The code you posted is certainly not typical of a noob. And I am not going to spend 30 minutes or more trying to figure it out. Why don't you write an explanation?

Leaving that to one side, I cannot see any obvious relationship between your code and your question. For example

For my experiment, it is very crucial to have a constant delay, and preferably, a small delay. However, the delay of the signal sent out from the pin seems to be inconsistent (fluctuating anywhere between 10 ms to 100 ms).

seems strangely incompatible with

delay(1000);

If what you are trying to say is that the interval is not always an exact 1000 millisecs then my guess is that you need to use millis() to manage timing without blocking as illustrated in Several Things at a Time
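In case a concrete example helps, the basic non-blocking pattern looks something like this (a minimal sketch; the pin number and interval are just placeholders, not taken from your code):

```cpp
// Minimal millis()-based timing sketch: toggle a pin every interval
// without ever calling delay(), so loop() stays responsive.
const byte outPin = 5;                // placeholder pin
const unsigned long interval = 1000;  // milliseconds (placeholder)
unsigned long previousMillis = 0;

void setup() {
  pinMode(outPin, OUTPUT);
}

void loop() {
  // unsigned subtraction handles millis() rollover correctly
  if (millis() - previousMillis >= interval) {
    previousMillis += interval;
    digitalWrite(outPin, !digitalRead(outPin));  // toggle the pin
  }
  // other work (reading sensors, serial, etc.) runs here unblocked
}
```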

...R

Hey,

Thanks for the reply.

The master code does not actually need the delay; it was left over from a previous version of the code, but it does not change our result. Our setup is the following:

  1. The master reads the analog signal until it passes the threshold of 200 (the signal comes from an external function generator).
  2. The master writes a 1 over serial.
  3. The slave monitors for data from the master with Serial.available(), so the 1 triggers it.
  4. When it receives the 1, it outputs a signal from its pin.

The code at the top and in the setup() of the master sketch is there to increase the analogRead() speed. Our issue is that the signal we use to trigger the master (from our function generator) is offset from the signal output by the slave. If this offset were consistent it would be fine; however, it can vary between 10 and 100 ms, and we need precision within 20 us.

Cheers

Then I suggest that you abandon the serial libraries and bit bang the pin yourself.

fusionisnotcoming:
The code at the top and in the setup of the master code is there to increase the analog read speed.
So our issue is that the signal that we use to trigger the master (from our function generator) is offset with the signal output from the slave. Now if this signal offset was consistent it would be fine, however the offset can vary between 10 and 100ms, and we need precision within 20 us.

I recognize all the individual words but I have no idea what it all means.

Where does a function generator come into it?
What do you mean by “offset with the signal output from the slave”?

Can you post a pair of simple programs (without any register modification) that illustrate the problem - something I could try myself.

…R

@ KeithRB:

Thanks a lot for the advice. Sounds like you may be onto something. I’m relatively new to the whole microcontroller thing. Can you expand a bit more on “bit banging the pin myself”? Could you possibly link me to something to read up on?

@Robin2:

The entire set up of our experiment is this:

  1. The master Arduino is connected to an oscilloscope with a function generator.
  2. When triggered by the sensors connected to it, the oscilloscope outputs a 2V half square pulse for 0.1 millisecond from its function generator.
  3. This pulse is picked up by the master Arduino and read using analogRead().
  4. When the value rises over 200, the master Arduino sends a 1 through the Bluetooth connection to the slave Arduino.
  5. With Serial.available(), the slave Arduino looks for any data transferred over Bluetooth.
  6. When it receives something (any kind of data) through the Bluetooth connection, the slave Arduino turns its pin on to create a 5V signal that will eventually be used to trigger a high speed camera.

My problem lies with steps 5 and 6 of what I described above. The speed at which the slave Arduino receives the 1 through Bluetooth, turns on the pin, and creates the 5V signal is inconsistent. There is necessarily some delay in the Bluetooth communication, which causes the 5V signal to lag a couple of ms behind the 2V square pulse created by the function generator (see attached image: the blue signal is the 5V from the slave and the red signal at the very left is the 2V square pulse). In the case of that image, the delay is around 14 ms, as seen in the bottom right corner of the oscilloscope. However, this delay (the time difference between the 5V pulse and the 2V pulse) can range anywhere between 10 ms and 100 ms. So I’m asking if there is a way to make this delay constant, i.e. make it a 10 ms delay every time it triggers, for example.

Cheers.

Here's a sample of the code without register modifications for you to visualize the problem:

Master

void setup() {

  Serial.begin(1382400);

}

void loop() {
  if (analogRead(A1) > 200) {

    Serial.write(1);

  }

}

Slave

#define signalPin 5

void setup() {
  
  pinMode(signalPin, OUTPUT);
  digitalWrite(signalPin, LOW);
  Serial.begin(1382400);

}

void loop() {

  while (!Serial.available()) {
  }
  digitalWrite(signalPin, HIGH);
  delay(10000);
  digitalWrite(signalPin, LOW);
}

fusionisnotcoming: The entire set up of our experiment is this:

That makes things much clearer.

Have you tried (temporarily) making a wired connection between the two Arduinos to see if that eliminates the uncertainty in the timing?

If it does then the problem lies with the Bluetooth devices.

I can't imagine that the high baud rate you are using is necessary.

It seems strange to be using analogRead() when you seem to need speed. Can't you organize things so that a digital input can detect the pulse?

Or, if it must be analog why not use the analog comparator to trigger an interrupt?
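For what it's worth, a rough sketch of the comparator idea, assuming an ATmega328P board (Uno/Nano), where AIN0 is digital pin 6 and AIN1 is digital pin 7; register and vector names are from the ATmega328P datasheet:

```cpp
// Analog comparator sketch (ATmega328P assumed): fire an interrupt when
// the voltage on AIN0 (D6) rises above the reference on AIN1 (D7),
// instead of polling with analogRead().
volatile bool triggered = false;

ISR(ANALOG_COMP_vect) {
  triggered = true;               // keep the ISR minimal; act in loop()
}

void setup() {
  Serial.begin(115200);
  // ACIE: enable comparator interrupt; ACIS1:0 = 11: rising-edge trigger
  ACSR = _BV(ACIE) | _BV(ACIS1) | _BV(ACIS0);
}

void loop() {
  if (triggered) {
    triggered = false;
    Serial.write(1);              // notify the slave immediately
  }
}
```

The threshold then comes from whatever reference voltage you put on AIN1 (e.g. a voltage divider), rather than from the value 200 in software.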

I use nRF24L01+ wireless modules for radio control and AFAIK they send data with very little latency. IIRC someone else here said recently that a Tx and acknowledgement take about 500 microsecs - I have not tried to pin down the time that precisely myself but I know it is less than a few millisecs. The nRF24 modules are cheap and reliable.

...R Simple nRF24L01+ Tutorial

Just dive down into Serial.write() and look at the code. (Admittedly, a real IDE would make it easier.)

You can remove all the error checking, buffering and other stuff and just pull out the part that talks to the hardware.
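As a rough illustration of that idea (assuming an ATmega328P, e.g. an Uno; register names from its datasheet), the part that actually talks to the hardware boils down to something like:

```cpp
// Direct UART write (ATmega328P assumed): skip HardwareSerial's TX
// buffer and hand the byte straight to the USART data register.
// Serial.begin() is still called once so the baud rate is configured.
inline void uartWriteDirect(uint8_t b) {
  while (!(UCSR0A & _BV(UDRE0))) {}  // busy-wait until data register empty
  UDR0 = b;                          // hardware shifts the byte out
}
```

Note that with a single byte and an empty buffer, Serial.write() is already within a few microseconds of this, so if the jitter survives a wired test, the Bluetooth link itself is the more likely culprit.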