Writing IMU data from I2C to SD Card over SPI faster than 300 Hz reliably

Hey all,

Having trouble getting my code up to speed. By my estimation (printing millis() timestamps) it's currently running somewhere between 50-100 Hz. This isn't fast enough; I'm looking for 300 Hz, preferably 500 for safety. I am confused why I can get this over serial but not to an SD card on SPI. The point of my IMU data logger is to capture IMU data between camera frames to better estimate camera motion for my image stabilizing/smoothing algorithm. My hardware is simply

Transcend Micro SD Card (4 GB)

Sparkfun SD Shield

Arduino Uno

MPU-6050

My code is attached (it's basically just me bastardizing i2cdevlib's code for this IMU). The main issues I see are the file.open() statement, which takes 3 milliseconds, and the file.flush() or file.close() statements, which (regardless of which I use) take about 9-15 milliseconds.

Any help would be greatly appreciated. I would really like to not have to datalog over serial using GoBetwino or something of the like...

P.S. - I have commented out my loop pin-checking flags and interrupts for the purpose of uploading a more usable file. Sorry if this hurts the readability.

water_interrupt.ino (8.88 KB)

You will never get near 300 Hz to an SD unless you buffer data. SD cards have very little RAM buffer to reduce cost. The assumption is that the host will buffer data. All SD cards have occasional long write latencies of 50-200 ms while they erase flash or move data and remap flash for wear-leveling.

You will either need an RTOS to run multiple threads, or clever buffering between data capture in an interrupt routine and writing to the SD in the background.

// open the file. note that only one file can be open at a time,
// so you have to close this one before opening another.
// this opens the file and appends to the end of file
// if the file does not exist, this will create a new file.

There was a one-file-open limit when SD.h was first released years ago, but that is no longer true. You can't open and close the file on every write and achieve any speed: open does a linear search through the directory, since FAT directories are not ordered.

I suggest you look at the nilSdLogger example from NilRTOS (on the Google Code Archive). This example was designed for logging the type of data you have.

Here is an example that proves 300 Hz is easily achievable: "Try this super fast analog pin logger" in the Storage section of the Arduino Forum.

Try the sdfat.h library instead.
Author fat16lib has gotten it running very fast.
Speed also depends on the SD card; some are decidedly faster than others.

Thanks guys. I'll give sdfat.h a shot.

In regards to NilRTOS, having read around this forum a bit before I posted, I was kinda worried you were going to say that :confused: I am going to implement this in serial, get it working and logging, and then move toward the RTOS route. I don't have much experience using an RTOS, and I feel like it will add an element of complexity to a task that, realistically, should have been done last week :frowning:

But if I get the opportunity: just looking around at NilRTOS, the scheduling aspect looks pretty n00b-friendly. How do you anticipate it playing with more extensive libraries like those of i2cdev? I want to move to his implementation of the built-in filter for the MPU6050, so I won't just be reading data in as a simple ADC signal.

Also, a more trivial programmatic question, the naming convention for the log file seems to choke when I implement it as

file.open("DATA_%d.CSV", n,...)

where n is some variable by which I can increment the filename, so I can better differentiate when I am logging. Any insights as to this filename-incrementing problem?

Also, fat16lib, I was looking in the discussion thread you referred to, but didn't see any explicit example of this buffering (I might be able to do a dump every 30 data points or so...); where should I be looking? I don't know how much buffering will actually help my code, because if I understand correctly, the buffer is 512 bytes, and my data is currently 6 values at 16 bits apiece, and the fastest I would have an opportunity to dump is every 10 runs (for a total of 960 bits).

You still don't seem to understand the problem. You must run more than one thread to log data fast. A high-priority thread captures data and puts it in a FIFO buffer. A low-priority thread removes data from the FIFO buffer and writes it to the SD.

(I might be able to do a dump every 30 data points or so...)

This won't help, since a single thread that both reads data and writes to the SD will miss data points while it is busy writing.

In the NilRTOS example each data point is a small record.

struct Record_t {
  uint16_t adc[NADC];
};

The FIFO has many of these records

// FIFO buffer size. Adjust so unused idle thread stack is about 100 bytes.
const size_t FIFO_SIZE_BYTES = 950;

So if your data points have six 16-bit numbers, each record would be 12 bytes. A FIFO with 60 records could buffer 200 ms of data at 300 Hz and make overruns unlikely.

How do you anticipate it playing with more extensive libraries like those of i2cdev

I2C and other libraries work with NilRTOS. You could use Wire, but there is a better library for NilRTOS, NilTwi. NilTwi is thread-friendly.

file.open("DATA_%d.CSV", n,...)

You need to create the file name string with something like sprintf then open the file using that string.

  SdFile file;
  uint16_t n = 123;
  char name[20];
  snprintf(name, sizeof(name), "DATA_%d.CSV", n);
  if (!file.open(name, O_WRITE | O_CREAT)) {
    Serial.println(F("open error"));
    while(1);
  }
  Serial.println(name);
  file.close();
  Serial.println("Done");

This prints:

DATA_123.CSV
Done

Wow. Amazingly helpful. Thanks so much fat16lib, you rock.
