SD card write speed on existing file

Hi,

If I write 240 chunks of 640 bytes to a new file on the SD card, that takes 12 seconds.
Writing the same 240 chunks of 640 bytes into the same (existing) file costs only 1 second.

Is there a trick to create a 151 KB file quickly (the content can be anything)?
This would immensely speed up the first (real) write of the file.

Hermann.

HermannSW:
Hi,

If I write 240 chunks of 640 bytes to a new file on the SD card, that takes 12 seconds.
Writing the same 240 chunks of 640 bytes into the same (existing) file costs only 1 second.

Is there a trick to create a 151 KB file quickly (the content can be anything)?
This would immensely speed up the first (real) write of the file.

Hermann.

Sorry, but this was not easy to follow.

It looks as though this is your issue:

  1. You write a 151 KB set of data to an SD card.
  2. You then instruct the MCU to write the SAME 151 KB to the SD card, appending it to the file created in step 1.

You are finding step 1 takes 12 seconds but step 2 only takes 1.

You want to speed up step 1.

Correct?

You provided little information, so it is difficult to know what your problem is.

Common reasons for slow file creation are improperly formatted SD cards and fragmented cards with lots of files.

Try formatting the card with the SD Association's SD Formatter. Then run your test on the freshly formatted SD card.

If you can't use the SD association formatter, use the SdFat SdFormatter example.

Sorry if I was not clear enough; the code piece below is from screenshotToFat().

What I tried to say is:

  • after deleting the file "scrshot.565", t1-t0 reports 12 sec
  • if a previous "scrshot.565" is already present, it gets overwritten in 1.8 sec (0.8 sec of that is for reading the LCD)

My question is whether an initial "scrshot.565" file can be created much more quickly than in 12 sec.
The file size is constant: 240*320*2 bytes plus a 54 byte header, or 151 KB.
The measurements (1sec vs. 12sec) are easily reproducible.

...
        t0 = millis();

        // lcdbuffer = (uint16_t*) malloc(_width*2);    

        if (lcdbuffer==NULL) {
            Serial.println(F("Cannot allocate lcdbuffer"));
            return;
        }

        // no O_TRUNC: an existing file keeps its clusters, so overwriting it is fast
        if (!bmpFile.open("scrshot.565", O_CREAT | O_WRITE)) {
            Serial.println(F("File open failed."));
            return;
        }

        // write the 54 byte BMP header or the raw .565 header
        bmpFile.write(bmp ? buf2 : buf, bmp ? sizeof(buf2) : sizeof(buf));

        // read the LCD contents row by row (BMP rows are stored bottom-up)
        for (uint8_t i = 0; i < (uint8_t)_height; i++)
        {
            beginTransaction();
            setAddr_cont(0, bmp ? _height-1-i : i, _width, 1);
            writecommand_cont(ILI9341_RAMRD); // read from RAM
            readdata8_cont(); // dummy read, also sets DC high

            for (uint16_t j = 0; j < (uint16_t)_width; j++)
            {
#if SPI_MODE_DMA
                read_cont(color, 3);
#elif SPI_MODE_NORMAL | SPI_MODE_EXTENDED
                color[0] = read8_cont();
                color[1] = read8_cont();
                color[2] = read8_cont();
#endif
                lcdbuffer[j] = color565(color[0], color[1], color[2]);  // 24-bit RGB -> 16-bit 565
            }

            disableCS();
            endTransaction();

            // one row of pixels: _width*2 = 640 bytes per write call
            bmpFile.write(lcdbuffer, _width*2);
        }

        bmpFile.close();

        free(lcdbuffer);

        t1 = millis();
...

Hermann.

I am on Linux, so SdFormatter does not work.

I tried the Arduino sketch, but it does not compile :(

Build options changed, rebuilding all
SdFormatter.ino: In function 'void setup()':
SdFormatter:487: error: 'class Sd2Card' has no member named 'begin'
'class Sd2Card' has no member named 'begin'

I will now format the SD card with a Linux system utility and see whether that makes a difference.

Hermann.

It is a very old SD card, 256 MB in total.
I removed all partitions, created one big new partition, and formatted it.

On an empty SD card without any files:
time [ms]: 13760

Each subsequent screenshot (overwriting the previous one):
time [ms]: 1724

Here is the generated file:

Hermann.

You didn't install the SdFat library correctly. It does work on Linux.

Linux is a poor choice for formatting SD cards for use on Arduino. The format utility doesn't come close to creating the correct layout. It uses the wrong alignment of file structures relative to the flash chips and the wrong cluster size for the SD card's size.

If you can't produce the standard format with SD Formatter, try formatting the card as FAT16 with a 16 KB or 32 KB cluster size.

This won't produce an optimal format but should improve performance.

SD Formatter adds hidden blocks to align flash erase boundaries with file structures.

OK, thanks for insisting on freshly formatting the SD card.

I went to the family PC, which has Win7, and installed the SDFormatter you pointed to.
I formatted with the overwrite option.
The SD card is only 256 MB in size, and it may well be that it was last formatted 8 years ago ;-)

I ran the ILI9341_due library demo sdFatTftBitmap and in the middle uploaded my screenshot sketch.
As you can see, it shows half of the giraffe and half of the stopP picture from the demo.

In fact, formatting with SDFormatter did make a difference.
I started without a SCRSHOT.BMP file on the SD card and took a 1st, 2nd and 3rd screenshot.
Then I manually deleted SCRSHOT.BMP and took 3 screenshots again:

time [ms]: 4448
time [ms]: 1783
time [ms]: 1778

time [ms]: 4615
time [ms]: 1769
time [ms]: 1770

4.4s is definitely much better than 13.7s (for initial screenshot file).

But my questions remain -- now for an optimally formatted SD card:

How can a new(!) 151 KB file (the content does not matter) be written in, say, less than 2 seconds?
This would further reduce the time for the initial screenshot from 4.4 s to below 2+1.77 s = 3.77 s.

  • Would writing big memory areas do the trick?
  • Where could I find them (the Uno has only 2 KB of RAM)?
  • Is it possible to e.g. write (part of) progmem, just to fill the file initially?
  • My DS3132+AT24C32 has 32 KB of EEPROM; can that be written to FAT?
  • Maybe copying a 151 KB template file from the same SD card? How?
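
For that last idea, something like this (untested) is what I have in mind. TEMPLT.BMP would be a hypothetical 151 KB template file I put on the card once; whether this actually avoids the slow cluster allocation for the new file I do not know yet:

    // untested: copy a pre-existing template file into the screenshot
    // file in 512 byte chunks, just to create SCRSHOT.BMP quickly
    SdFile src, dst;
    uint8_t buf512[512];
    int n;

    if (!src.open("TEMPLT.BMP", O_READ)) return;
    if (!dst.open("SCRSHOT.BMP", O_CREAT | O_WRITE)) return;

    while ((n = src.read(buf512, sizeof(buf512))) > 0) {
        dst.write(buf512, n);
    }
    src.close();
    dst.close();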

Hermann.


Your old SD card may be slow. Try running the SdFat bench example with a 512 byte buffer. Post the results.

Hi

find SDFat "bench" output with BUF_SIZE 512 here:

Type any character to start
Free RAM: 604
Type is FAT16
File size 5MB
Buffer size 512 bytes
Starting write test.  Please wait up to a minute
Write 277.93 KB/sec
Maximum latency: 452332 usec, Minimum Latency: 1548 usec, Avg Latency: 1813 usec

Starting read test.  Please wait up to a minute
Read 533.30 KB/sec
Maximum latency: 1908 usec, Minimum Latency: 940 usec, Avg Latency: 953 usec

Done

Type any character to start

BUF_SIZE 640 (as in the screenshot program) is slower:

Type any character to start
Free RAM: 476
Type is FAT16
File size 5MB
Buffer size 640 bytes
Starting write test.  Please wait up to a minute
Write 48.24 KB/sec
Maximum latency: 311692 usec, Minimum Latency: 1624 usec, Avg Latency: 13238 usec

Starting read test.  Please wait up to a minute
Read 442.88 KB/sec
Maximum latency: 2920 usec, Minimum Latency: 1024 usec, Avg Latency: 1438 usec

Done

Type any character to start

Hermann.

Your card has problems with 640 byte writes.

When a file is created, random writes are required to allocate clusters and write data. Your card can't handle that pattern in some cases.

Your card can handle the sequential write after the file is created.

Here are two SanDisk Extreme cards:

SanDisk Extreme Micro SD 16 GB

File size 5 MB
Buffer size 640 bytes
Starting write test, please wait.

write speed and latency
speed,max,min,avg
KB/Sec,usec,usec,usec
275.95,23824,1604,2313

Starting read test, please wait.

read speed and latency
speed,max,min,avg
KB/Sec,usec,usec,usec
427.54,3224,1072,1491


SanDisk Extreme SD 32 GB

File size 5 MB
Buffer size 640 bytes
Starting write test, please wait.

write speed and latency
speed,max,min,avg
KB/Sec,usec,usec,usec
284.65,12572,1500,2242

Starting read test, please wait.

read speed and latency
speed,max,min,avg
KB/Sec,usec,usec,usec
424.20,3252,1080,1503

Here is a third card:

SanDisk 32 GB Extreme MicroSD

File size 5 MB
Buffer size 640 bytes
Starting write test, please wait.

write speed and latency
speed,max,min,avg
KB/Sec,usec,usec,usec
301.75,12372,1616,2115

Starting read test, please wait.

read speed and latency
speed,max,min,avg
KB/Sec,usec,usec,usec
427.76,3232,1072,1490

Good thing you tried the 640 byte case. In the 512 byte case the write is done directly from the buffer to the SD card without using the cache. The cache holds the FAT block, so no random writes are done.

In the 640 byte case the cache is used for both the data and the FAT, so random writes are required.

SdFat has an option to use two cache blocks when there is enough memory. One block is used for file structures and the second for data.
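
If I remember the config right, that option is the USE_SEPARATE_FAT_CACHE define in SdFatConfig.h (check your SdFat version). On AVR it is off by default since it costs another 512 bytes of RAM:

    // in SdFatConfig.h: set non-zero to use a second 512 byte cache for
    // FAT table entries; helps with writes that are not a multiple of 512 bytes
    #define USE_SEPARATE_FAT_CACHE 1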

Your card has write latencies of 300 - 400 ms with random writes. This is really bad.

Thanks for your analysis, and yes, the card is (very) old and has latency issues.

Nevertheless, the difference between 512 and 640 byte writes that you explained did give another boost!

The trick I used is to:

  • first write the whole file with 512 byte blocks (of arbitrary data),
  • then bmpFile.rewind(),
  • and finally write the real data.

This is the relevant code piece (intentionally writing some bytes too much, see below):

        // pave the way for a quick "real" screenshot data write:
        // pre-fill the whole file with 512 byte blocks, then rewind
        hsize = bmp ? sizeof(buf2) : sizeof(buf);
        tsize = (uint32_t)_width * _height * 2 + hsize;  // 2 bytes per pixel plus header
        nbuff = tsize >> 9;                              // number of full 512 byte blocks

        for (uint16_t i = 0; i <= nbuff; ++i) {          // one block extra: round up
            bmpFile.write(lcdbuffer, 512);
        }
//      bmpFile.write(lcdbuffer, tsize % 512);
        bmpFile.rewind();

These are the new numbers for the 1st, 2nd and 3rd screenshot with the new method (repeated after deleting SCRSHOT.BMP):

time [ms]: 2993
time [ms]: 2432
time [ms]: 2425

time [ms]: 2967
time [ms]: 2405
time [ms]: 2415

So this trick brings the time down from 4.4 s to below 3 s(!), and this will be my standard use case, because the screenshot filename will not remain fixed but will contain date+time and be unique for every screenshot.

The "price" to be paid for this is that 2nd screenshot time goes up from 1.8s to 2.4s, but that will not be a usecase with date+time filenames.

Making the file size a multiple of 512 bytes does not hurt, and it wins 0.2 s compared to a correctly sized file.

And this is not the end, even for this card.
I am sure further improvement is possible by writing 512 byte blocks containing the actual screen data. That will be a bit more complicated than the simple row-wise 640 byte write, but five 512 byte writes handle 4 complete rows.
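
Roughly what I have in mind (untested sketch; readLcdRow() is just a placeholder for the LCD reading code above, and the extra 512 byte staging buffer of course has to fit into the remaining RAM):

    uint8_t  block[512];     // staging buffer, always written as a full block
    uint16_t fill = 0;       // bytes currently waiting in the staging buffer

    for (uint16_t row = 0; row < _height; row++) {
        readLcdRow(row, lcdbuffer);                  // fills _width*2 = 640 bytes
        const uint8_t* src = (const uint8_t*)lcdbuffer;
        uint16_t left = _width * 2;
        while (left > 0) {
            uint16_t n = 512 - fill;                 // space left in the block
            if (n > left) n = left;
            memcpy(block + fill, src, n);
            fill += n;
            src  += n;
            left -= n;
            if (fill == 512) {                       // block full: one aligned write
                bmpFile.write(block, 512);
                fill = 0;
            }
        }
    }
    if (fill > 0) bmpFile.write(block, 512);         // pad the last partial block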

Hermann.

Edit: I tried this and it is not as fast as creating the file on the fly for good SD cards with a 640 byte buffer. The problem is that the file must be read back, since it is a rewrite of partial blocks. It is slightly faster for 512 byte writes. It might be faster for your old card.

There is one more trick you could use. SdFat has a call to create an empty contiguous file. The content will be undefined since the function only allocates space.

/** Create and open a new contiguous file of a specified size.
   *
   * See open() for more information.
   *
   * \param[in] dirFile The directory where the file will be created.
   * \param[in] path A path with a valid DOS 8.3 file name.
   * \param[in] size The desired file size.
   *
   * \return The value true is returned for success and
   * the value false, is returned for failure.
   */
  bool createContiguous(FatFile* dirFile, const char* path, uint32_t size);

Hi,

I buy "createContiguous()" you proposed, this is the complete pave the way code now:

        // pave the way for a quick "real" screenshot data write
        hsize = bmp ? sizeof(buf2) : sizeof(buf);
        tsize = (uint32_t)_width * _height * 2 + hsize;  // 2 bytes per pixel plus header

        sd.remove(snbuf);  // createContiguous() fails if the file already exists

        if (!bmpFile.createContiguous(sd.vwd(), snbuf, tsize)) {
          Serial.println(F("createContiguous failed"));
        }

This code has interesting effects:

  • if the file exists, it now takes 2.6 s, compared to 2.4 s before
  • if the file does not exist, it now takes only 2.2 s, down from 2.9 s

The reason the "file exists" case now takes longer is the time needed by "sd.remove()" (required to make "createContiguous()" work). Since the second case will be the only one occurring with date+time based unique filenames, the code above effectively reduces yesterday's 4.4 s to only 2.2 s for storing a screenshot on the SD card!!

Hermann.

I tried a 4 GB SD card from an old cell phone -- the 256 MB card, although much older, is not that bad.

As with the 256 MB card, I formatted the 4 GB card with SDFormatter.

These are the write performance numbers from bench.ino:

While these numbers do not look that bad for the 4 GB card, a screenshot on that card takes 8.8 s, a factor of 4 slower than on the (good old) 256 MB card.

Hermann.

So far only old SD cards; I wanted to test a new one and bought a no-name 2 GB card from the local drugstore.

First I formatted the card with SDFormatter, then I did the 512 byte buffer bench.ino run:

Free RAM: 604
Type is FAT16
File size 5MB
Buffer size 512 bytes
Starting write test.  Please wait up to a minute
Write 161.61 KB/sec
Maximum latency: 76500 usec, Minimum Latency: 2360 usec, Avg Latency: 3152 usec

Starting read test.  Please wait up to a minute
Read 395.36 KB/sec
Maximum latency: 2468 usec, Minimum Latency: 1148 usec, Avg Latency: 1289 usec

Done

and the 640 byte buffer run:

Free RAM: 476
Type is FAT16
File size 5MB
Buffer size 640 bytes
Starting write test.  Please wait up to a minute
Write 149.29 KB/sec
Maximum latency: 145312 usec, Minimum Latency: 2496 usec, Avg Latency: 4281 usec

Starting read test.  Please wait up to a minute
Read 345.50 KB/sec
Maximum latency: 4020 usec, Minimum Latency: 1236 usec, Avg Latency: 1846 usec

Done

Not comparable with the quick cards fat16lib showed, and with respect to read speed a bit slower than the 256 MB card. Even for writing 512 byte blocks it is, at 161 KB/sec, quite a bit slower than the 256 MB one (277 KB/sec). But for 640 byte blocks it is quicker at 149 KB/sec than the 256 MB card with its 48 KB/sec.

This seems to be the reason that writing a screenshot is even slightly quicker now:

2064
2067
2038

2063
2081
2098

In particular, there is no difference anymore between the case where no previous screenshot file existed (runs 1 and 4) and the case where the file existed and had to be removed first (runs 2, 3, 5, 6). So down from 2.6/2.2 sec to now below 2.1 sec.

Hermann.

The design of SD cards has changed a lot over the years. Most cards are now designed for devices with lots of RAM. To get high speed you must use large transfers with multi-block transfer commands.

The Arduino Uno has little RAM but you can use multi-block transfers for some apps. That's how I get very high rates with the AnalogBinLogger example.
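
The skeleton looks roughly like this (see the AnalogBinLogger source for the real code; FILE_BLOCK_COUNT, DATA.BIN and fillBlock() are only placeholders here):

    const uint32_t FILE_BLOCK_COUNT = 300;           // example: 300 * 512 bytes
    uint32_t bgnBlock, endBlock;
    uint8_t  block[512];
    SdFile   binFile;

    // pre-allocate one contiguous run of clusters and look up its block range
    if (!binFile.createContiguous(sd.vwd(), "DATA.BIN", 512UL * FILE_BLOCK_COUNT)) return;
    if (!binFile.contiguousRange(&bgnBlock, &endBlock)) return;

    // stream the data with the raw multi-block write commands,
    // bypassing the file system on every block
    if (!sd.card()->writeStart(bgnBlock, FILE_BLOCK_COUNT)) return;
    for (uint32_t i = 0; i < FILE_BLOCK_COUNT; i++) {
        fillBlock(block);                            // produce the next 512 bytes
        if (!sd.card()->writeData(block)) break;
    }
    sd.card()->writeStop();
    binFile.close();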

Your 4 GB card was probably optimized for multi-block transfers, and single block transfers were neglected.

Here is the bench example for 32KB transfers on an Arduino Due. I use multi-block transfers. The card is the latest SanDisk Ultra.

This is fast for an SPI transfer.

Type is FAT32
Card size: 31.91 GB (GB = 1E9 bytes)

Manufacturer ID: 0X3
OEM ID: SD
Product: SE32G
Version: 8.0
Serial number: 0X838DE929
Manufacturing date: 9/2015

File size 10 MB
Buffer size 32768 bytes
Starting write test, please wait.

write speed and latency
speed,max,min,avg
KB/Sec,usec,usec,usec
3640.89,22857,8081,8994
3713.95,16889,8063,8817

Starting read test, please wait.

read speed and latency
speed,max,min,avg
KB/Sec,usec,usec,usec
4447.81,7682,7331,7365
4449.80,7653,7280,7363