SD card logs then stops? Some kind of congestion or overflow?

Hi,

I am testing an Arduino Due with an Adafruit SD breakout (SPI); my aim is to datalog at 1000 Hz.
I have a test sketch working, but the logging stops after a period of time.

  • If I slow down the logging rate (increase the milliseconds between samples), the logging runs for longer and actually gives more entries in the file, even though the logging rate is lower.
  • If I increase the number of characters written for each entry, the logging stops sooner, with fewer entries in the file.

Is there some overflow happening that I am not seeing, or is there something happening with SPI?

My test code:

#include <SD.h>
#include <Wire.h>

//Datalogging
unsigned int loggingRate = 1;  // Milliseconds between samples
char buffer[600];
char buffer2[600];
unsigned long currentMillis = 0;
unsigned long nextMillis = 0;
// File object to represent file
File txtFile;
char fileName[13];


void setup()
{
  Serial.begin(9600);
  delay(2000);

  // Initialise the SD card
  if (!SD.begin()) {
    Serial.println("Card failed, or not present");
    // don't do anything more:
    while (1);
  }
  Serial.println("SD card initialised");
  delay(500);
  newLogFile();

  delay(3000);
}

void loop(){
  currentMillis = millis();
  if (currentMillis >= nextMillis) {
      nextMillis = currentMillis + loggingRate;
      readData();
      logData();
  }
}

void readData(){
  char temp1[10];
  char temp2[10];
  strcpy(temp1, "123456789");
  strcpy(temp2, "ABCDEFGHI");
  sprintf(buffer + strlen(buffer), "%lu %s %s\r\n", millis(), temp1, temp2);
}

void logData(){   
  unsigned int chunkSize = txtFile.availableForWrite();
  if (chunkSize && strlen(buffer) >= chunkSize) {
    // write to file
    txtFile.write(buffer, chunkSize);
    // Copy unwritten data to temporary array, clear main buffer, then copy unwritten data back to main array 
    memcpy(buffer2, buffer+(chunkSize), sizeof(buffer));
    memset(buffer, 0, sizeof(buffer));
    memcpy(buffer, buffer2, strlen(buffer2));
    memset(buffer2, 0, sizeof(buffer2));
  }
}

void newLogFile(){
  strcpy(fileName, "test.txt");
  txtFile = SD.open(fileName, FILE_WRITE);
  if (!txtFile) {
    Serial.print("error opening ");
    Serial.println(fileName);
    while (1);
  }
}

This produces the following entries, as expected, in test.txt:
5619 123456789 ABCDEFGHI
5620 123456789 ABCDEFGHI
5621 123456789 ABCDEFGHI
and so on...

At a 10 ms logging interval it ran for about an hour before stopping. At a 1 ms logging interval it runs for a few seconds, sometimes a bit longer, sometimes shorter.

I am so close to getting this to work, I just need to iron out this last issue.
Can anybody please help?

Thank you!

Can you show us in the code where you close the log file? I could not find it.

Hi Paul,
The file is not closed. Since the data is written in 512-byte chunks, it is written to the card without the file needing to be closed.

Thanks for the quick reply!

If the file is not closed, then the file pointers are not properly updated, and it will appear to be empty, regardless of how much data you have written.

Issue a flush() on the file (txtFile.flush()) every once in a while to update the file pointers.

Why are you trying to buffer the data yourself? The SD library is written to handle all that using the internal 512 byte buffer. All you are doing is taking up precious RAM, and introducing bugs.
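
Something like this is usually all that is needed (an untested sketch based on your existing loop; lastFlush, flushInterval and the one-minute figure are just placeholders):

unsigned long lastFlush = 0;
const unsigned long flushInterval = 60000UL;  // placeholder: flush roughly once a minute

void loop() {
  currentMillis = millis();
  if (currentMillis >= nextMillis) {
    nextMillis = currentMillis + loggingRate;
    // Print each sample straight to the File object and let the library's
    // internal 512-byte buffer handle the chunking.
    txtFile.print(millis());
    txtFile.print(' ');
    txtFile.print("123456789");
    txtFile.print(' ');
    txtFile.println("ABCDEFGHI");
  }
  if (currentMillis - lastFlush >= flushInterval) {
    txtFile.flush();   // updates the directory entry / file size; costs a few milliseconds
    lastFlush = currentMillis;
  }
}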

The data is saved in the file, no problem there. I just followed the methodology from the SD library example NonBlockingWrite...
Based on all the searching and research I have done so far, I got the impression that was the most streamlined method?
I'll try the flush method and report back, thank you!

Can you post information about how everything is powered, particularly the SD breakout? Also, a link to that breakout board.

Here is the breakout: https://learn.adafruit.com/adafruit-micro-sd-breakout-board-card-tutorial
I have tested with several cards, including a SanDisk Extreme Plus; the card doesn't seem to be a factor in this instance.

This is one of the discussions that sent me down the buffer path: https://forum.arduino.cc/t/explanation-of-sd-write-and-flush-please/369320

Of course one of the contributors is the authority on the subject, and seems to give a different answer from the others...

Thank you all for such a great response, it is really appreciated!

Sorry, power is just USB from the laptop to the Due, 3.3V from the Due.
I have tried powering the Due from a 1A wall adapter, no change.

This feels wrong, although it is (probably) not the problem: if chunkSize is not zero, the memcpy reads past the end of buffer and copies data that is not part of buffer into buffer2. You should actually copy at most sizeof(buffer) - chunkSize bytes.
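
One way to do that without the second array at all (an untested sketch, not your exact code) is to shift the unwritten tail down with memmove and only move the bytes that are actually left over:

void logData() {
  unsigned int chunkSize = txtFile.availableForWrite();
  unsigned int len = strlen(buffer);
  if (chunkSize && len >= chunkSize) {
    txtFile.write(buffer, chunkSize);                        // write what fits right now
    unsigned int leftover = len - chunkSize;                 // bytes not yet written
    memmove(buffer, buffer + chunkSize, leftover);           // shift the tail to the front
    memset(buffer + leftover, 0, sizeof(buffer) - leftover); // keep the string terminated
  }
}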

Use the ExFatLogger example in the SdFat library by Bill Greiman.

It just seems from your description of the problem that it could be a power issue. Specifically, the 3.3V regulator on the Due may be having trouble providing enough current to power the SD card, particularly when erasing or writing, which is when the card draws the most current. If the regulator is heating up under the load, that would explain why it runs longer when you slow down the logging frequency. Well, that's just a guess, but it seems to me that if the code were bad it probably wouldn't run at all, and if you were running out of RAM, it would crash at the same number of logs no matter how fast you saved them.

You can test this theory by powering the SD module at its 5V pin, fed from the 5V pin of the Due (i.e. directly from USB). That bypasses the Due's regulator and moves the 3.3V regulation to the module. If it still crashes, then I'm wrong and it must be something else.

You can settle the code question by just running one of the example data logging sketches in the library. If they don't crash, then the problem is probably your code.
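
For reference, the stock File > Examples > SD > Datalogger sketch boils down to roughly this (paraphrased from memory, so check it against the copy in the IDE; chipSelect = 10 is only an assumption about the wiring):

#include <SD.h>

const int chipSelect = 10;   // assumed CS pin; use whatever the breakout is wired to

void setup() {
  Serial.begin(9600);
  if (!SD.begin(chipSelect)) {
    Serial.println("Card failed, or not present");
    while (1);
  }
}

void loop() {
  // Open, append one reading, close: the example does this on every pass
  File dataFile = SD.open("datalog.txt", FILE_WRITE);
  if (dataFile) {
    dataFile.println(analogRead(A0));
    dataFile.close();
  } else {
    Serial.println("error opening datalog.txt");
  }
}

If that runs indefinitely at your logging rate, the hardware and power are fine and the problem is in your sketch; if it also stops, look at power or the card.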

Can we get a report on how this came out?

Sorry, I was so busy implementing I forgot to follow up!
I ended up using flush() about every minute or so, but actually retained a buffer of 400 bytes. This achieved 1000 readings per second easily, so I just pushed ahead with that as I needed to move on. I didn't really have a solid reason for keeping that buffer other than it was a minimal change to the code and stayed away from the automatic 512-byte write to the SD card that was causing the problems.
I want to do some logging at much higher rates so I will revisit at a later stage, but only if the flush method doesn't suffice!
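
For anyone finding this later, the logging ended up roughly along these lines. This is only a reconstruction from the description above, not the exact code, and the 400-byte limit and one-minute flush interval are approximate:

const size_t BUF_LIMIT = 400;                 // stay below the library's 512-byte boundary
const unsigned long flushInterval = 60000UL;  // flush roughly once a minute
unsigned long lastFlush = 0;

void logData() {
  // Write the accumulated samples out before the buffer reaches 512 bytes,
  // then empty it for the next batch.
  if (strlen(buffer) >= BUF_LIMIT) {
    txtFile.write(buffer, strlen(buffer));
    memset(buffer, 0, sizeof(buffer));
  }
  // Periodic flush so the directory entry and file size stay up to date.
  if (millis() - lastFlush >= flushInterval) {
    txtFile.flush();
    lastFlush = millis();
  }
}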

Interesting how someone inexperienced such as myself can get sent down what is perhaps an incorrect path; I did so much background research that I really thought I was on the right track and implementing best practice. Luckily, for this application it is OK for my datalogging to miss a few readings every now and then while the flush is occurring.

Thank you to all for your input!
