Hello, please bear with a rather long post from someone who doesn't really know C++.
I have a simple project: reading some sensors (3 ADC readings) and logging the results. The readings don't need to be frequent; once every ten seconds or so is fine.
For the logging part, I have tried to use an SD card and the SDuFAT library.
The problem is, the lib has many limitations. One of the most obvious is that you can't create files, so you have to start with a preexisting one. This is not too big a problem; the card I'm using is about 1.9GB, so I just create a 1.8GB file called "data.txt" on it.
A few minor bugs are easy to fix, for example:
1- the lib always appends to the hardcoded file "hola.txt", whatever filename you pass in. See SDuFAT.cpp, function "int SDuFAT::append(const char* filename)", line 454:
return print("hola.txt", data);
This is fixable by changing it to:
return print(filename, data);
2- The lib knows where the file ends when it hits EOF, in this case 0x03. The problem is it gets overenthusiastic when deleting as it tries to write an EOF at the start of EVERY block. Try that with a file of a couple of GB and you could be waiting a century or two.
Fortunately, this is also easy to fix, in SDuFAT.cpp, function "int SDuFAT::del(const char* filename)", line 173, I have changed line:
for(long m = 0; m <= sectors; m++) {
to:
for(long m = 0; m <= 1; m++) { // or change 1 to whatever block you want to stop at
which makes it write to the first couple of blocks. The example sketch seems to be running fine so far and function "int SDuFAT::cat(const char* filename)" seems not to have trouble identifying the end of file without needing to fill up the start of a few trillion blocks with EOFs.
This, however, is all the low hanging fruit I could find. The real trouble begins now and its name is "int SDuFAT::print(..." family of functions.
I found out by trying to write lines of about this size:
dd/mm/yyyy,hh:mm:ss,AAAA,BBBB,CCCC
roughly 35 chars including the newline. The first few lines take just over a second to write. However, this time increases and eventually cripples the application (in my case, where logging occurs once every ten seconds, the moment a write takes longer than that I'm out of the game).
To find out what's happening I had a look inside "SDuFAT.cpp", function "int SDuFAT::print(const char* filename, char* data)", line 353, which is what all the other "SDuFAT.print..." redirect to.
A rather major problem is that SDuFAT does not keep an internal variable as a file pointer in order to know where we are in the file. Thus, every time a "print" is used, the lib has to go over the whole file to see how big it is (statement "long file1Length = usedBytes(filename);", line 360).
This is OK for short files but as the file gets longer and longer (and given that the SD card is a damn slow beast) it becomes increasingly slow until the whole thing becomes unusable.
To make matters worse, function "int SDuFAT::println(const char* filename, char* data)", which is what most logging applications would use if the calling function does not bother adding the newline to the passed string, calls "int SDuFAT::print(const char* filename, char* data)" twice, once with the string and once with the newline. This means we read the whole damn file twice to find where we are supposed to be writing.
I tried a quick and dirty fix, passing a string with a newline appended to "int SDuFAT::print(const char* filename, char* data)" from inside "int SDuFAT::println(const char* filename, char* data)". Here is my version:
int SDuFAT::println(const char* filename, char* data)
{
// Hackdition 11/9/2010
// "data" is presumably terminated by 0 so find the last char, dump a \n in its place and shift it one down
int datalength;
for (datalength = 0; data[datalength] != 0; datalength++); // find the position of the 0 in the passed array
char dataln[datalength + 2]; // make a new array one larger than the passed one
for (int i = 0; i < datalength; i++) dataln[i] = data[i]; // copy the original array into the new one
dataln[datalength] = '\n'; // put a '\n' at the penultimate position of the new array
dataln[datalength + 1] = 0; // put a 0 at the end of the new array
int aux = print(filename, dataln); // send the new array to the SD card
// End of hackdition 11/9/2010
return aux;
}
This should work and indeed does, until you try passing a big string to it, at which point hideous things happen: the board hangs or keeps resetting. Initially I thought it was happening when the string was longer than 512 bytes, but I soon found out that it happens for shorter strings too. I eventually realised that duplicating the passed string "data" by copying it into the locally created "dataln" was making the board run out of memory, and all hell was breaking loose. However, I didn't want to throw in the "String" lib to concatenate "data" with a newline, as that would make the whole thing heavier still. An inelegant solution is to delete the "println" functions from SDuFAT.cpp and simply pass a newline-terminated string from the calling function to begin with. Any other ideas would be welcome, but don't worry too much about it; it's a minor problem.
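For what it's worth, here is a rough sketch of that inelegant caller-side solution: build the whole line, newline included, with snprintf into one fixed buffer, so print() only needs to be called once and the string never gets duplicated. The helper name and field names here are made up for the example:

```cpp
#include <cstdio>
#include <cstring>

// Hypothetical caller-side helper: format the whole log line, newline
// included, into one fixed buffer. Then SDuFAT's print() can be called
// once with the result, and no second copy of the string is ever made.
int formatLogLine(char *buf, size_t bufSize,
                  int day, int month, int year,
                  int hour, int minute, int second,
                  int adcA, int adcB, int adcC)
{
    // snprintf never overruns buf and always NUL-terminates, so an
    // oversized line gets truncated instead of trashing memory.
    return snprintf(buf, bufSize,
                    "%02d/%02d/%04d,%02d:%02d:%02d,%04d,%04d,%04d\n",
                    day, month, year, hour, minute, second,
                    adcA, adcB, adcC);
}
```

The buffer lives in the sketch (or is a static), so its size is known at compile time and there's no surprise allocation at log time.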
Now, to the real gritty stuff, the guts of "int SDuFAT::print(const char* filename, char* data)", the workhorse of the "print" family. As far as I can understand, it does weird things, some of which I've already talked about, but let's have another go.
First, it figures out where to write in the most hideous way possible, by reading the file (line 360).
It takes the passed string "data" and copies a chunk of it (up to BYTESPERSECTOR) into the class variable "buffer" (line 378). It then fills up the rest of the buffer with EOF (line 385). Then it goes and writes the buffer ten times (WTF!) to the card (line 390) at the position it calculated initially.
If the passed string overflows BYTESPERSECTOR, it goes and copies the rest of it into "buffer" (line 404), dumps an EOF after it (line 411), fills up with blank spaces (line 416, why not EOFs like before?) and again writes it ten times to the card (line 421).
So, the obvious questions from a relative noob like myself are: why are things written ten times? Robustness against failings of the hardware? If so, is it OK to add something like:
#define NUMBER_OF_WRITES 10 // or whatever >=1 you feel lucky with, punk
to "SDuFAT.h" and allow people to try their luck with their hardware if they are willing to risk it for greater speed?
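If the tenfold write really is just a belt-and-braces retry, the knob could look something like this. This is a toy model, not the real library code: both the macro name and mockWriteSector() are my inventions standing in for the library's actual card write:

```cpp
// Proposed knob, assuming the repeated writes at lines 390 and 421
// are a deliberate repeat-for-robustness loop.
#define NUMBER_OF_WRITES 3  // or whatever >= 1 you feel lucky with, punk

static int sectorWriteCount = 0;

// Stand-in for the real card write; the library's own call would go here.
void mockWriteSector(long sector, const char *buffer) {
    (void)sector;
    (void)buffer;
    sectorWriteCount++;
}

// The loops at lines 390 and 421 would then count to the macro
// instead of a hardcoded 10.
void writeBufferToCard(long sector, const char *buffer) {
    for (int i = 0; i < NUMBER_OF_WRITES; i++)
        mockWriteSector(sector, buffer);
}
```

Dropping NUMBER_OF_WRITES to 1 would cut the write time roughly tenfold, at whatever reliability cost the original author was guarding against.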
And the most important question, should I try to add a class variable, say "long position_in_file" which gets incremented by, say, "count" after line 390 and "count2" after line 421? Then, instead of calling "usedBytes()" and reading through the whole file every time "int SDuFAT::print(const char* filename, char* data)" is called, it could simply look it up and save a hell of a lot of time. Am I missing something obvious?
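To make the caching idea concrete, here is a toy sketch (not real SDuFAT code) of how a cached position could work: usedBytes() is only called once, on the first print after reset, and after that the class just bumps its own counter instead of rescanning the file:

```cpp
#include <cstring>

// Toy model of the proposed cache; class and member names are made up.
class LoggerSketch {
public:
    LoggerSketch() : positionInFile(-1) {}

    // Returns the offset the data was (notionally) written at.
    long print(const char *data) {
        if (positionInFile < 0)
            positionInFile = usedBytes();   // one full scan, first call only
        long writeAt = positionInFile;
        // ... copy data into the sector buffer and write it out here ...
        positionInFile += (long)strlen(data); // bump the cache, no rescan
        return writeAt;
    }

private:
    long positionInFile; // -1 means "not known yet"

    long usedBytes() { return 0; } // stand-in for the slow EOF hunt
};
```

With this, each log line costs one sector write instead of a read of the whole file plus a write, so the time per line stays flat no matter how big the log grows.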