
Topic: Reading Log Files From SD Card (Read 216 times)

robsworld78

Jan 11, 2019, 09:05 am Last Edit: Jan 11, 2019, 09:06 am by robsworld78
Hi, I'm reading 200 CSV files from an SD card using the SdFat library. It's working well, but it takes a long time.

I've determined that if the contents of the 200 files are combined into one file it's really fast, so I did some timing and discovered that every call to "sd.open("text.csv")" takes about 90 ms; times 200 files, that's 18 seconds.

Is there any way around this?
Thanks

Nick_Pyner

It seems that you answered the question yourself, so: better file management, perhaps?

Lucario448

I've determined if the contents of the 200 files are located in one file it's really fast so I started doing some timing and discovered every time "sd.open("text.csv")" is called it takes about 90ms so times 200 files that's 18 seconds.

Is there any way around this?
As Nick_Pyner said: you already answered it yourself.

As you can see, opening a file is easy to do on your side; but under the hood, the process isn't exactly straightforward.
The library has to scan through the root directory entries (an array of 32-byte records holding a brief but very important description of each contained file and subfolder; among other things, the name) until it finds a match with the given name or reaches the end of the list without success. If it finds a match, it copies the entry itself to RAM and then queries the File Allocation Table (FAT) to find where the first chunk (cluster) of that file's data physically resides. Finally, it caches the first 512 bytes of that data and creates the anticipated File object.

To make matters worse, if the card is formatted as FAT32, the root directory's entry list might be physically fragmented (making this scan even slower); although that's unlikely unless you've placed a lot of files/folders in that location.




Maybe not the most comprehensive explanation ever, but at least you should have an idea of why it is so slow and why it is better to do it once (by packing everything into a single large file).

robsworld78

Thanks for the replies. I wish I had answered my own question. I can see where it's slowing down, but I was hoping there was something I could do to speed it up while keeping the multiple-file approach. Thanks for explaining why it's slow. I use FAT32 because of the file count limit on FAT16, and I tried multiple formatters to get the best layout I could. I guess I'll have to look into merging the files as time goes by.
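For the merging idea, something along these lines could run once on the Arduino itself. This is an untested sketch: the chip-select pin, the "/logs" folder, and the "combined.csv" output name are assumptions to adapt to your setup, and it uses SdFat's openNext() to walk the directory.

```cpp
#include <SPI.h>
#include <SdFat.h>

SdFat sd;
const uint8_t CS_PIN = 10;   // assumption: change to your wiring

// Append the contents of every file in /logs into one combined file.
void mergeLogs() {
  File dir = sd.open("/logs");
  File out = sd.open("combined.csv", FILE_WRITE);
  File in;
  uint8_t buf[64];
  while (in.openNext(&dir, O_RDONLY)) {
    int n;
    while ((n = in.read(buf, sizeof(buf))) > 0) {
      out.write(buf, n);       // raw byte copy; CSV lines pass through as-is
    }
    in.close();                // close before opening the next entry
  }
  out.close();
  dir.close();
}

void setup() {
  Serial.begin(9600);
  if (!sd.begin(CS_PIN)) {
    Serial.println("SD init failed");
    return;
  }
  mergeLogs();
  Serial.println("done");
}

void loop() {}
```

After that, your fast single-file reads would only need one open.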

Lucario448

I can see where it's slowing down but was hoping there's something I could do to speed it up
Well, another approach would be opening the files by iteration rather than by name.

What this means is that you first open the directory (folder) containing those files; then you open all of them, one at a time, with openNextFile().
Keep in mind that if the location also contains irrelevant files or folders, those will eventually be opened too during the iteration.
Furthermore, a directory's entry list is not sorted at all (or at least that isn't guaranteed), not even by name (file explorer apps sort the contents at load time because they have plenty of RAM, while an Arduino doesn't); so don't be surprised if you find that the files aren't opened in the order you expected.
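A minimal sketch of the iteration approach, assuming SdFat (whose equivalent of the SD library's openNextFile() is openNext()). The "/logs" folder name is an assumption; replace the Serial printing with your actual CSV handling.

```cpp
#include <SPI.h>
#include <SdFat.h>

SdFat sd;

// Read every file in a directory by iteration instead of by name.
void readAllByIteration() {
  File dir = sd.open("/logs");
  File f;
  while (f.openNext(&dir, O_RDONLY)) {
    if (!f.isDir()) {            // skip any subfolders mixed into the listing
      char name[32];
      f.getName(name, sizeof(name));
      Serial.println(name);      // note: order is whatever the entry list has
      while (f.available()) {
        Serial.write(f.read());  // placeholder for your CSV parsing
      }
    }
    f.close();                   // must close before opening the next one
  }
  dir.close();
}
```

Because the iterator just walks the entry list in place, each open skips the per-name scan the earlier post described.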



With this approach, the entry-scan and name-comparison stages of the file-opening process are skipped; the FAT query and the rest still occur. So this suggestion should save some time, but maybe not a lot.
