ShermanP:
I don't know how you would dedicate ram to these buffers, or address them, or read data from them for conversion to CSV, in C++, but I assume there is a way to do all that if you are a C++ jock. And actually, I would love to see an example of that code if anybody knows of one.
If you can save the data in an interrupt, then one relatively easy way would be:
Declare a struct with all the data for each log entry:
struct LogEntry
{
float _myFloats[7];
long _myLong;
bool _myBool;
};
Create two arrays, each with enough entries so that you will have enough time to write out the data to the SD card. Since they'll be the same size, you can use a two dimensional array.
constexpr size_t MAX_LOG_MEM = 1024; // Or however much RAM you want to dedicate
constexpr size_t NUM_ENTRIES = MAX_LOG_MEM / 2 / sizeof(LogEntry); // Half the RAM budget per buffer
LogEntry myLogEntries[2][NUM_ENTRIES]; // Maximum entries that fit under the total size
Or just set NUM_ENTRIES directly if you have a good idea of what it should be.
Use a couple of global variables to track which buffer you are filling, the offset within it, and which buffers are full. When one is full, start filling the other, and in the main loop start writing the full one out to the SD card. Mark it as empty when you are done. You could even have the interrupt throw away entries if it tries to switch and the other buffer isn't empty yet.
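That bookkeeping might look something like the sketch below. The names (logEntry, serviceBuffers, writeBufferToSd) are all made up, the SD write is left as a commented-out placeholder, and on a real board you would also protect the shared flags from the ISR (e.g. with noInterrupts()/interrupts()):

```cpp
#include <cstddef>

struct LogEntry { float _myFloats[7]; long _myLong; bool _myBool; };

constexpr size_t NUM_BUFFERS = 2;
constexpr size_t NUM_ENTRIES = 64;          // pick to suit your RAM budget

LogEntry myLogEntries[NUM_BUFFERS][NUM_ENTRIES];
volatile size_t fillBuf = 0;                // buffer the interrupt is filling
volatile size_t fillPos = 0;                // next free slot in that buffer
volatile bool bufFull[NUM_BUFFERS] = {};    // true = waiting to be written out
volatile unsigned long dropped = 0;         // entries thrown away

// Called from the interrupt: store one entry, switching buffers when full.
void logEntry(const LogEntry& e)
{
    if (fillPos >= NUM_ENTRIES) {
        size_t next = (fillBuf + 1) % NUM_BUFFERS;
        if (bufFull[next]) {                // writer hasn't caught up yet
            dropped = dropped + 1;
            return;
        }
        bufFull[fillBuf] = true;            // hand this buffer to the writer
        fillBuf = next;
        fillPos = 0;
    }
    myLogEntries[fillBuf][fillPos] = e;
    fillPos = fillPos + 1;
}

// Called from the main loop: write out any full buffer, then mark it empty.
void serviceBuffers()
{
    for (size_t b = 0; b < NUM_BUFFERS; ++b) {
        if (bufFull[b]) {
            // writeBufferToSd(myLogEntries[b], NUM_ENTRIES);  // hypothetical
            bufFull[b] = false;
        }
    }
}
```

Raising NUM_BUFFERS to 3 gives the extra cushion described below without changing any of the logic.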
If you have problems with the buffers sometimes not emptying fast enough, you could use more buffers with fewer entries each (for example, change the '2' above to '3' and you'd have three sets to work with; 'most' of the time the writer keeps up, but when it hits a delay there's an 'extra' buffer for the logger to use).
This all assumes you can actually get your data in the interrupt handler - you may not be able to if you have to talk to another device, for example. I don't know enough about the 'yield()' functionality to know how to incorporate that.
BUT, before doing that, you should test whether it would be fast enough overall. In 15 seconds you write out 1500 entries. So write a quick test program that does ONLY that (use dummy data), and see how long it takes. If that test takes more than 15 seconds, then your sampling rate is too fast for the SD card to keep up with.
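The shape of that test program might be something like this. It's written as a desktop sketch using fopen/fprintf and std::chrono so it compiles anywhere; on the board you'd open the file with the SD library and bracket the loop with millis() instead:

```cpp
#include <chrono>
#include <cstdio>

struct LogEntry { float _myFloats[7]; long _myLong; bool _myBool; };

// Write 'count' dummy entries as CSV lines to 'path' and return the
// elapsed time in milliseconds (or -1 if the file can't be opened).
long long runTimingTest(const char* path, int count)
{
    LogEntry e = {{1, 2, 3, 4, 5, 6, 7}, 12345L, true};
    std::FILE* f = std::fopen(path, "w");
    if (!f) return -1;

    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < count; ++i) {
        for (int j = 0; j < 7; ++j)
            std::fprintf(f, "%.4f,", e._myFloats[j]);
        std::fprintf(f, "%ld,%d\n", e._myLong, (int)e._myBool);
    }
    std::fclose(f);
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();
}
```

Call it as runTimingTest("test.csv", 1500) and compare the result against your 15-second budget. On the board the file open/close and flush overhead matters too, so include those inside the timed region, as above.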