The SdFile object acts like a file handle in other systems. It contains information from the directory entry and cluster information for the current position. A number of blocks must be read from file structures to open a file and seek to a position.
If I wanted to optimize reads from a large number of files I would use raw SD reads.
When you copy files to a freshly formatted SD card, the files are contiguous. SdFat has a function to determine whether a file is contiguous and where its blocks are located:
bool SdBaseFile::contiguousRange(uint32_t* bgnBlock, uint32_t* endBlock);

Check for a contiguous file and return its raw block range.
[out] bgnBlock the first block address for the file.
[out] endBlock the last block address for the file.
The value true is returned for success and false for failure. Reasons for failure include: the file is not contiguous, the file has zero length, or an I/O error occurred.
I would open each file and find its location with the function above.
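A minimal sketch of that step, assuming the SdFat v1 API and a hypothetical file name DATA.BIN (the chip-select pin and file name are placeholders for your own setup):

```cpp
#include <SdFat.h>

SdFat sd;

// Open a file and fetch its raw block range.
// Returns false if the file is missing or not contiguous.
bool getBlockRange(const char* name, uint32_t* bgn, uint32_t* end) {
  SdBaseFile file;
  if (!file.open(name, O_READ)) {
    return false;  // file not found
  }
  bool ok = file.contiguousRange(bgn, end);
  file.close();
  return ok;
}

void setup() {
  Serial.begin(9600);
  if (!sd.begin(SS)) return;  // SS is the default chip-select pin

  uint32_t bgn, end;
  if (getBlockRange("DATA.BIN", &bgn, &end)) {
    Serial.print("blocks ");
    Serial.print(bgn);
    Serial.print(" to ");
    Serial.println(end);
  }
}

void loop() {}
```

Once the block range is known, the file can be closed and all further access done with raw card reads, skipping the file-system overhead entirely.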
I would then use either the Sd2Card single block read function:
bool Sd2Card::readBlock (uint32_t block, uint8_t *dst);
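A single-block read might look like this; the block number 1000 is a stand-in for a value obtained from contiguousRange(), and sd.card() is the Sd2Card accessor in SdFat v1:

```cpp
#include <SdFat.h>

SdFat sd;
uint8_t buf[512];  // SD blocks are 512 bytes

void setup() {
  Serial.begin(9600);
  if (!sd.begin(SS)) return;

  uint32_t block = 1000;  // hypothetical block from contiguousRange()
  if (sd.card()->readBlock(block, buf)) {
    // buf now holds 512 bytes of raw file data
    Serial.println("read OK");
  }
}

void loop() {}
```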
Or the Sd2Card multi-block sequence:
bool Sd2Card::readStart (uint32_t blockNumber); // set start block for a multiple block read sequence.
bool Sd2Card::readData (uint8_t * dst); // Read one data block in a multiple block read sequence.
bool Sd2Card::readStop (); // End a multiple block read sequence.
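The three calls above combine into a loop like the following sketch; readRange() and the commented-out process() are hypothetical names for illustration:

```cpp
#include <SdFat.h>

SdFat sd;
uint8_t buf[512];

// Read every block in [bgn, end] with one multi-block sequence.
// Returns false and ends the sequence on any error.
bool readRange(uint32_t bgn, uint32_t end) {
  if (!sd.card()->readStart(bgn)) return false;
  for (uint32_t b = bgn; b <= end; b++) {
    if (!sd.card()->readData(buf)) {
      sd.card()->readStop();
      return false;
    }
    // process(buf);  // handle each 512-byte block here
  }
  return sd.card()->readStop();
}
```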
SD cards perform read-ahead during multiple block reads, so this mode is very efficient.