Topic: Standalone XBee ADC to Arduino+XBee data transfer speed problem

Hi everyone,

After spending the last couple of days trying to figure out what is going wrong with my code, I have finally decided to ask the community! I can usually find a post that already covers my Arduino-related problems, but this time I can't find any suitable answers.

So here's the setup:
- [emitter] A standalone XBee Series 1 is connected to a Grove XBee Carrier (http://seeedstudio.com/wiki/Grove_-_XBee_Carrier). A sensor providing an analog voltage output is continuously sampled on ADC0. The sample rate is set to 1 ms (ATIR01), and data is sent every 20 samples (ATIT14); the AT configuration is sketched just after this list.

- [receiver] Another XBee Series 1 is connected to an Arduino UNO via a Wireless SD Shield. The UNO is connected to a MacBook Pro, and debugging is done with the serial monitor in Arduino 1.0.1.
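
For reference, here is roughly the command-mode session that produces the emitter configuration. This is a sketch, not a verbatim log: only the IR, IT and BD values are the actual settings quoted above, and the ATD02 line (enabling the ADC on pin D0) is assumed.
Code:
+++        (enter AT command mode, wait for "OK")
ATD02      (pin D0 as ADC input; assumption, not quoted above)
ATIR01     (sample period: 0x01 = 1 ms)
ATIT14     (buffer 0x14 = 20 samples per RF packet)
ATBD7      (UART baud rate: 7 = 115200)
ATWR       (write settings to non-volatile memory)
ATCN       (exit command mode)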

Here is the code in its simplest version:
Code:
uint8_t  dataBuf[100+1];    // holds one API frame, minus the start delimiter
int      sampleNumber;      // number of ADC samples in the current frame

uint16_t adcValue;          // one decoded ADC reading
uint32_t sumAdcValue;       // per-frame accumulator (wide enough not to overflow)
uint16_t meanAdcValue;      // mean of the samples in one frame

void setup(){
  Serial.begin(115200);
}

void loop(){
  // Xbee packet description
  // 0 = start delimiter
  // 1,2 = length bytes
  // 3 = API identifier
  // 4,5 = Source address bytes
  // 6 = RSSI value byte
  // 7 = option byte
  // 8 = number of samples (e.g. 0x14 = 20 samples)
  // 9,10 = channel indicator bytes
  // 11... = ADC sample data, 2 bytes per sample (MSB first)
  // last byte = checksum

  // Total packet size : 12 "utility bytes" + 2*nbSamples. For 20 samples, total packet size is 52 bytes.
  // In this case, the emitting Xbee is set as follows :
  //   ATIT14 : 20 samples before sending.
  //  ATIR01 : 1 sample per ms.
  //
  // Both Xbees are configured with 115200 baud ATBD.

  if(Serial.available()){
    uint32_t tStart = millis();
    Serial.print("Start : \t");
    Serial.println(tStart);
   
    if(Serial.read() == 0x7E){      // Xbee API start delimiter     
      Serial.readBytes((char *)dataBuf, 51);  // ATIT14 : 20 samples, so total packet size is 12 + 2*20 = 52 - (Start Delimiter) = 51
     
      sampleNumber = dataBuf[7];    // byte 8 of the packet (start delimiter already consumed), 20 in this case
     
      sumAdcValue = 0;    // reset the accumulator for each new frame
      for(int i = 0; i < sampleNumber; i++){
        adcValue = (((uint16_t) dataBuf[i*2 + 10] << 8) | dataBuf[i*2 + 11]);
        sumAdcValue += adcValue;
      }
      meanAdcValue = sumAdcValue / sampleNumber;
      // Write data to SD card
      // ...
           
      uint32_t tStop = millis();
      Serial.println(tStop-tStart);
    }
  }
}
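
The checksum check is left out above for brevity. Here is a minimal sketch of how it can be verified, assuming the standard XBee API rule (the low byte of the sum of all frame-data bytes plus the checksum byte must equal 0xFF) and the buffer layout above, where dataBuf[0..1] hold the length field because the start delimiter was already consumed. checksumOk is just an illustrative helper name.
Code:
// Returns true if the frame in buf passes the XBee API checksum.
// frameLength is the value of the two length bytes (buf[0..1]);
// frame data starts at buf[2], and the checksum byte follows it.
bool checksumOk(uint8_t *buf, uint16_t frameLength){
  uint8_t sum = 0;
  for(uint16_t i = 0; i < frameLength + 1; i++){  // +1 includes the checksum byte
    sum += buf[2 + i];
  }
  return sum == 0xFF;
}
It would be called right after readBytes(), as checksumOk(dataBuf, ((uint16_t)dataBuf[0] << 8) | dataBuf[1]).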

My problem is (and maybe it's more of an understanding problem than a technical one) that, given the [emitter] XBee setup, I should read a new packet containing 20 ADC0 values every 20*1ms = 20ms. But using millis(), it appears that I only get new data every 27 to 29ms.
The checksum is OK and the sensor values are correct. The problem is the same when using 44 samples before sending (ATIT2C), and there's always a small difference between the estimated time between packets and the effective time (checked for 2, 3 and 5ms sample rates).
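
For what it's worth, raw UART throughput doesn't seem to be the bottleneck. A quick back-of-the-envelope check, assuming 8N1 framing (10 bits on the wire per byte):
Code:
// One 52-byte API frame at 115200 baud, 8N1 (10 bits per byte):
//   52 * 10 / 115200 s ≈ 4.5 ms
// so the serial link alone shouldn't stretch a 20 ms interval to 27-29 ms.
float uartFrameTimeMs = 52.0 * 10.0 / 115200.0 * 1000.0;  // ≈ 4.51 ms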

So the question is: has anyone on the forum experienced this kind of setup, that is, continuous sampling of an XBee ADC channel at the maximum sample rate? Did anybody encounter the same problem, or have an explanation? It may be something very stupid, but after a couple of days on this problem it's hard to get my ideas clear enough...

Anyway, thanks for reading this far. Cheers!
