Slow directory iteration with many files using SD lib

There is an earlier topic about this, but it was closed without a resolution (Very slow enum files with SD Library).
So, the suggested way to list a directory is to use the SD library and its File::openNextFile() method.

The symptom is that with large directories (100+ files) this method gets progressively slower. With a directory containing 260 files, openNextFile() returns the first File in under 10 ms, but the last one takes about 250 ms, and each file's retrieval time increases gradually from the first entry to the last.

After checking the source one can see that openNextFile() traverses the directory entries correctly and opens a new file at each entry. But SdFile::open() receives only the filename and scans the directory again from the beginning for that particular name. If open() could somehow reuse the already-retrieved dir_t structure, there would be no slow-down during the iteration.
In fact, this is the relevant code part (from File::openNextFile() in SD.cpp):

      if (f.open(_file, name, mode)) {
        return File(f, name);
      } else {
        return File();
      }

File's constructor just copies a few bytes.

Where can this issue be reported so that it gets fixed?

Can you think of an easy workaround for this?
(I plan to reimplement the directory read myself: reading dir_t structures directly and applying the same conditional checks that openNextFile() does, etc.)
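One way to sketch that workaround, assuming the SdFat classes bundled with the SD library (utility/SdFat.h): the old SdFat's SdFile has an open-by-index overload, so you can read each dir_t once with readDir() and open the file by its directory index instead of by name, avoiding the per-name rescan. Untested outline, not a drop-in:

    #include <SD.h>
    #include <utility/SdFat.h>

    // List a directory in roughly linear time. readDir() in the old
    // SdFat already skips deleted entries and "."/"..".
    void listFast(SdFile& dir) {
      dir_t entry;
      dir.rewind();
      while (dir.readDir(&entry) > 0) {
        if (!DIR_IS_FILE_OR_SUBDIR(&entry)) continue;
        // index of the entry just read: dir_t entries are 32 bytes
        uint16_t index = dir.curPosition() / 32 - 1;
        SdFile f;
        if (f.open(&dir, index, O_READ)) {  // open by index: no name scan
          char name[13];
          SdFile::dirName(entry, name);     // 8.3 name from the dir entry
          Serial.println(name);
          f.close();
        }
      }
    }

If you only need the names and sizes, you can skip the open() entirely and read them straight out of the dir_t (name and fileSize fields), which is cheaper still.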

ps: I'm using a Nano board, but I don't think that matters here.
