I have my ESP8266 connected to my home wifi network, acting as a web server. I have an SDcard reader connected with 6 wires, containing an SDcard formatted with FAT32, and I'm writing code that will reliably read very big files from the SDcard and serve them to a remote client over HTTP.
My record so far is a 28-megabyte file. Right now I'm trying to serve a 37 MB file.
Eventually I want my code to be able to reliably send a 4 gig file. What's the biggest file anyone's ever served from an SDcard to a remote client over HTTP?
When serving very big files from the ESP8266, one of the things you need to handle is that 'myFile.available()' will fail before all 4 gigs of the file have been read. When this happens, you need to close the file, re-open it and then seek to where you left off.
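A minimal sketch of that recovery pattern, written here as desktop C++ with `std::ifstream` standing in for SD.h's `File` (on the ESP8266 the equivalent calls are `myFile.position()`, `myFile.close()`, `SD.open(...)` and `myFile.seek(pos)`); `readChunked` and the chunk size are illustrative names, not anything from the SD library:

```cpp
#include <cassert>
#include <fstream>
#include <string>
#include <vector>

// Read an entire file in fixed-size chunks, closing and re-opening the
// stream between chunks and seeking back to where we left off -- the
// same close / re-open / seek recovery used when available() gives up
// partway through a big file.
std::vector<char> readChunked(const std::string& path, std::size_t chunkSize) {
    std::vector<char> out;
    std::size_t pos = 0;                           // where we left off
    for (;;) {
        std::ifstream f(path, std::ios::binary);   // re-open
        f.seekg(static_cast<std::streamoff>(pos)); // resume at saved offset
        std::vector<char> buf(chunkSize);
        f.read(buf.data(), static_cast<std::streamsize>(buf.size()));
        std::streamsize got = f.gcount();
        if (got <= 0) break;                       // end of file reached
        out.insert(out.end(), buf.data(), buf.data() + got);
        pos += static_cast<std::size_t>(got);      // remember position;
    }                                              // stream closes here
    return out;
}
```

On the SD library you would save `pos = myFile.position()` just before `myFile.close()`, then `seek(pos)` after re-opening.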
If this 37 MB file sends properly then I'll try 150 MB next. I'm aiming to get it up to 4 gigs.
If I can get it up to 4 gigs, then the next thing I'll try to do is serve two 4-gig files together as if they're one 8-gig file.
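One way to present two 4-gig files as a single 8-gig download is to translate each byte offset in the virtual file to a (part, offset-within-part) pair. A sketch of that mapping (`locate` is a made-up helper, not part of any library):

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Map an offset into a "virtual" concatenation of several files to
// (file index, offset within that file). `sizes` holds the byte length
// of each part, e.g. two 4 GiB halves served as one 8 GiB download.
std::pair<std::size_t, std::uint64_t>
locate(const std::vector<std::uint64_t>& sizes, std::uint64_t globalPos) {
    for (std::size_t i = 0; i < sizes.size(); ++i) {
        if (globalPos < sizes[i]) return {i, globalPos};
        globalPos -= sizes[i];
    }
    return {sizes.size(), 0};  // past the end of the last part
}
```

The serving loop would then open part 0, stream it to the client, and switch to part 1 when the global position crosses the boundary.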
Your topic has been moved to a more suitable location on the forum. Please read the sticky topics in Uncategorized - Arduino Forum on why you should not post in "Uncategorized".
Don't worry too much. There is a storage category (for e.g. the SD card) and there is a generic microcontrollers category (for e.g. the ESPs). As this was a mix, I thought that Networking was suitable.
You should also consider the hardware connections, if you haven't already. While SPI is busy with the SD card, the WIZnet module (or whatever network module you use) is disabled. So they should be on different buses.
Okay I have two new tactics:
(1) Every half a megabyte, close the file, then re-open it and seek to the previous position.
(2) Every couple of megabytes, pause the TCP transmission for 1 second.
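Both tactics come down to noticing when the running byte count crosses a boundary inside the send loop. A minimal sketch of that decision (`housekeeping` is a made-up helper; the 0.5 MB and 2 MB thresholds are the ones from the post, and the right values are whatever proves reliable in testing):

```cpp
#include <cstdint>

// What to do after having sent `sent` total bytes, the last `chunk` of
// which were just written: re-open the file every 512 kB (tactic 1) and
// pause the TCP stream for a second every 2 MB (tactic 2).
enum class Action { None, Reopen, Pause };

Action housekeeping(std::uint64_t sent, std::uint64_t chunk) {
    const std::uint64_t kReopenEvery = 512ULL * 1024;       // 0.5 MB
    const std::uint64_t kPauseEvery  = 2ULL * 1024 * 1024;  // 2 MB
    // A boundary was crossed if it lies inside the chunk just sent.
    // (Assumes sent >= chunk.) Pause wins where both boundaries align.
    if ((sent / kPauseEvery)  != ((sent - chunk) / kPauseEvery))  return Action::Pause;
    if ((sent / kReopenEvery) != ((sent - chunk) / kReopenEvery)) return Action::Reopen;
    return Action::None;
}
```

In the actual sketch, `Pause` would map to a one-second `delay()` and `Reopen` to the close / re-open / seek sequence described earlier.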
This seems to be much more reliable for sending big files. Just now I successfully downloaded a 64 MB file from the web server, and the SHA256 hash matches so it's not corrupt. I used 'curl' at the Linux command line as follows:
curl http://192.168.109.71/meg64.bin -o meg64.bin
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 64.0M 100 64.0M 0 0 144k 0 0:07:33 0:07:33 --:--:-- 138k
So I downloaded 64 MB in 7 min 33 sec, giving an average speed of 144 kB/s. At this speed it would take 2 hr 1 min to download a gigabyte.
I'm going to tweak the code a little before I try to download a 512 MB file.
Just now I downloaded a 128 MB file in 13 min 25 sec at an average of 162 kB/s.
$ curl http://192.168.167.71/meg128.bin | sha256sum
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 128M 100 128M 0 0 162k 0 0:13:25 0:13:25 --:--:-- 186k
The hash matches so no corruption.
I'm going to try to get it a little faster before I try 256 MB.
And just now I did 256 megabytes in 28 min 6 sec (average = 155 kB/s).
$ curl http://192.168.167.71/meg256.bin | sha256sum
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 256M 100 256M 0 0 155k 0 0:28:06 0:28:06 --:--:-- 170k
08d79c1ce01752446c43ab55b56215546945d53023c58b6c87b8fb8d8ff596ee -