Hi @Bodmer, and all users,
I've tried to implement the TJpg decoder inside my ESPFastSSD1331 library for ESP8266 and ESP32, but I can't compile it: there are some static functions from which the compiler can't see the variables declared in the header file. Can you please help me figure out how to compile it?
I've tried to create a pointer to the class and use it as static, but without success; I'm not an expert on this and on callbacks.
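To show what I mean, this is roughly the pattern I've been trying, reduced to a simplified sketch with placeholder names (not my real library code):

```cpp
// Sketch of the usual workaround: TJpg_Decoder only accepts a plain (static)
// callback, so the class keeps a static pointer to the active instance and a
// static "trampoline" forwards the call to a normal member function.
// Class and member names here are placeholders, not the real ESPFastSSD1331 API.
#include <TJpg_Decoder.h>                    // Bodmer's TJpg_Decoder library

class MyDisplay {
 public:
  void begin() {
    instance = this;                         // remember the active object
    TJpgDec.setCallback(outputTrampoline);   // static function, so it compiles
  }

  // Normal member function: here it can see every member variable.
  bool drawBlock(int16_t x, int16_t y, uint16_t w, uint16_t h, uint16_t *bmp) {
    // ... write the block to the SSD1331 (set address window + push colours) ...
    return true;                             // true = keep decoding
  }

 private:
  // Static trampoline with the exact signature TJpg_Decoder expects.
  static bool outputTrampoline(int16_t x, int16_t y, uint16_t w, uint16_t h,
                               uint16_t *bmp) {
    return instance ? instance->drawBlock(x, y, w, h, bmp) : false;
  }

  static MyDisplay *instance;                // pointer to the active instance
};

MyDisplay *MyDisplay::instance = nullptr;    // static member definition (.cpp)
```

In my real code the problem shows up exactly where the static function tries to touch the member variables declared in the .h file, which is why I tried to forward everything to a non-static member through the static pointer.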
My library is quite long: the .cpp file is around 5000 lines and the .h file around 1000 lines, so I've reproduced the problem in a new sketch with just my class and nothing from my library apart from setup and initialization in the main sketch (which can be replaced with the Adafruit library). My library is big because I've added a lot of functionality while keeping the high speed and hardware optimizations; it even contains a bitmap renderer, a class called RawVideoBitmap and another class called RawVideoJpeg. With these two classes I can play (on an ESP8266 @ 160 MHz) videos based on bitmap file sequences at a very high frame rate: 96x64, 16-bit colour, about 100 fps, so I need to insert a delay to slow the video down to its original frame rate of 24 fps.
The RawVideoJpeg class can do the same with JPEG images. Currently I've used your JPEGDecoder embedded inside my library; with this class I can play videos based on JPEG images at about 38-40 fps.
Now I think an explanation is mandatory... At the start of this project (my library) I managed to read a sequence of bitmap (.bmp) images from SPIFFS and SD card to form a video. Using SPIFFS I had a frame rate of about 30 fps, but using the SD card a low frame rate of 12 fps, because the ESP8266 needs to open, read and close a lot of files. So I decided to create an encoder that writes a single file with a header containing all the info needed to play the video, then, starting from byte 100, all the images inserted sequentially and already converted to the right format, e.g. RGB565, RGB666, RGB888, or black and white (for monochrome OLEDs like the SSD1306).
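To give an idea, the header is roughly something like this (a simplified sketch; the field names, order and sizes here are only an example, not the exact .rvb/.rvj layout):

```cpp
// Simplified sketch of a 100-byte video header; the fields are illustrative,
// the real .rvb/.rvj layout may differ.
#include <stdint.h>

#pragma pack(push, 1)            // no padding, so the struct maps the file bytes 1:1
struct RawVideoHeader {
  char     magic[4];             // e.g. "RVB\0" or "RVJ\0"
  uint16_t width;                // frame width in pixels
  uint16_t height;               // frame height in pixels
  uint8_t  pixelFormat;          // 0 = RGB565, 1 = RGB666, 2 = RGB888, 3 = mono
  uint16_t frameRate;            // original frame rate (e.g. 24)
  uint32_t frameCount;           // number of frames in the file
  uint32_t frameSize;            // bytes per frame (fixed for .rvb)
  uint8_t  reserved[100 - 19];   // pad the header up to byte 100
};
#pragma pack(pop)
```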
On the ESP side (with RawVideoBitmap) there is no need to convert anything: just read a full frame into an array in RAM, set the address window of the display, and then send the full frame with just one line of code, the pushColors(intPixel, intPixelLen) command. This increased the frame rate a lot, from 12 to 100+ fps.
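In simplified Arduino code the .rvb playback loop is basically this (a sketch: setAddressWindow() and pushColors() stand in for my library calls, and the 96x64 RGB565 buffer is just the example for my display):

```cpp
// Simplified playback loop for a .rvb file (96x64, RGB565). videoFile is an
// open File positioned just after the 100-byte header.
#include <Arduino.h>
#include <FS.h>                                   // fs::File (SPIFFS or SD)

extern void setAddressWindow(uint8_t x0, uint8_t y0, uint8_t x1, uint8_t y1);
extern void pushColors(uint16_t *pixels, uint32_t len);

static uint16_t frameBuf[96 * 64];                // one full frame in RAM (12 KB)

void playRawVideoBitmap(fs::File &videoFile, uint32_t frameCount) {
  for (uint32_t f = 0; f < frameCount; f++) {
    videoFile.read((uint8_t *)frameBuf, sizeof(frameBuf)); // read one frame
    setAddressWindow(0, 0, 95, 63);               // full-screen window
    pushColors(frameBuf, 96 * 64);                // send the frame in one call
    delay(31);                                    // plus ~10 ms read+draw ≈ 24 fps
  }
}
```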
RawVideoBitmap is capable of a very high frame rate because it reads one frame at a time and writes it to the display; RawVideoJpeg is slower because it needs to read and write the MCUs separately, and this slows things down even though there is less data to read/write.
To encode RawVideoBitmap and RawVideoJpeg files I created two interactive sketches with serial input, where the user sets the name of the frames and the name of the output encoded video file, which is saved on the SD card itself. These files have the .rvb extension for RawVideoBitmap and .rvj for RawVideoJpeg.
After a lot of problems with ESP8266 crashes due to WDT resets, I finally got both of them working. They can encode even long videos (the maximum I've tried is 20000 frames), but because the ESP needs to open, read and close a lot of files it is pretty slow: videos of 500-1000 frames are no problem, just a few minutes, but 20000 frames required 4 hours, which is a very long time.
So I decided to develop both encoders on my PC using B4J, and after a lot of tests both work well and are very fast, even though my PC is quite old, just a dual core with 2 GB of RAM. Both encoders show a real-time preview while encoding, and as I said they are very fast: encoding 20000 frames at 96x54 takes just 5-6 minutes.
RawVideoEncoder works only with regular JFIF files; Exif is not supported, but I think even the Arduino encoder does not support it, just like 'progressive' files. Right?
Because I've also developed an audio library for ESP8266 and ESP32, I've tried to play video with audio out of the box by connecting my Pimoroni pHAT DAC (for the Raspberry Pi Zero W) to the ESP8266. Playing a RawVideoBitmap video without audio I get 104 fps; with 22050 Hz 16-bit audio it works well, and I just need to add some small delays after each frame. Note that I read the .wav file extracted from the video from the SD card, so synchronizing audio and video requires fine tuning the delay (microseconds). With 44100 Hz 16-bit audio it works but with some audio glitches: the video frame rate drops below the original, to about 20-22 fps, so I need to speed things up a bit to get audio CD quality without glitches.
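The fine tuning is basically a per-frame delay in microseconds, roughly like this (only a sketch; the 24 fps target and the read/draw calls are indicative):

```cpp
// Sketch of the per-frame pacing used for A/V sync on the ESP8266.
#include <Arduino.h>

const uint32_t FRAME_US = 1000000UL / 24;        // ~41666 us per frame at 24 fps

void showFramePaced() {
  uint32_t t0 = micros();
  // ... read the next frame from SD and push it to the display ...
  uint32_t spent = micros() - t0;                // time used by read + draw
  if (spent < FRAME_US) {
    uint32_t pad = FRAME_US - spent;             // remaining time in the frame slot
    delay(pad / 1000);                           // whole milliseconds
    delayMicroseconds(pad % 1000);               // microsecond fine tuning
  }
  yield();                                       // let WiFi/audio tasks run
}
```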
The RawVideoBitmap files are very large; the RawVideoJpeg files are small, depending on the JPEG image quality.
The same video requires 82 MB using bitmaps and 11 MB using JPEGs, depending on the compression ratio; the figures I wrote use a quality of 2 on a 0-32 scale (0 means best quality).
That I"ve missed is that both encoders need a serie of BITMAP or JPEG files numbered in sequene, eg. Frame1.bmp, Frame2.bmp etc.... The encoder will use these numbers to know a frame order. All frames must be the same size in pixels, for bitmaps 24bit colour not compressed, for jpegs only JFIF (no Exif).
To do this, my steps are:
- download a video from YouTube or other sources; because Exif is not supported, camera JPEGs can't be used without processing them first
- on Linux I use avconv, ffmpeg or mencoder to extract (and resize) all frames from the downloaded .mp4 video; just one command in the terminal extracts all frames in .bmp or .jpg format, and for the latter the quality can be chosen. The same goes for extracting the audio, which can then be resampled to a lower sample rate.
- put all extracted images on the SD card, possibly formatting it first with SD Card Formatter.
- put the SD card in the PC, launch the right encoder, and select the first and last frame in the encoder, e.g. MyVideo-1.jpg and MyVideo-1000.jpg to encode 1000 frames. It is also possible to encode just part of a video, e.g. from frame 280 to 400, or even to encode a full video or a section in reverse, e.g. from frame 800 to frame 50, in which case the frame order will be reversed. The encoder also has a stop button to stop the encoding process; in this case the resulting video is not corrupt, it just contains the frames already encoded and the header is refreshed.
When reading back on the Arduino or ESP side, the first thing to do is open the file and read the header; using the header structure this takes just one line of code, after which I have the file size, width, height, frame count and other useful data to play the video file.
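That "one line" is simply reading the whole struct at once (assuming the illustrative RawVideoHeader sketched earlier and an already open videoFile):

```cpp
// Read the whole 100-byte header in one call (RawVideoHeader is the
// illustrative struct sketched above, not necessarily the real layout).
RawVideoHeader header;
videoFile.read((uint8_t *)&header, sizeof(header));   // one line, whole header
// header.width, header.height, header.frameCount ... are now ready to use
```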
With my classes, after I open the file the class verifies it and keeps the file open. The class has a play() command in which I set the first and last frame to play, and there is also a setFrame(framenumber) to show an exact frame. The frames can be rendered in any order, from beginning to end and vice versa, or just a section of the video, and I can even select frames randomly. Using RawVideoBitmap, drawing a frame on my small SSD1331 0.96 inch 96x64 16-bit colour OLED takes just 9.8 ms.
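Usage is roughly like this (a rough sketch; the exact open() signature of my class may differ a bit, only play() and setFrame() behave as described above):

```cpp
// Rough usage sketch of the RawVideoBitmap class (signatures are indicative).
RawVideoBitmap video;

void demoPlayback() {
  if (!video.open("/MyVideo.rvb")) return;   // open + verify, the file stays open
  video.play(1, 1000);                       // frames 1..1000 in order
  video.play(800, 50);                       // a section played in reverse
  video.setFrame(280);                       // jump to and draw one exact frame
}
```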
I will release both encoders on GitHub in the next months if I solve some problems. The B4J code is not open source for now, maybe in the future; I will just release the .jar files, which work on Windows, Linux and Mac, but I will also release the ESP8266 RawVideoJpegEncoder sketch as open source. As for the RawVideoBitmapEncoder, something went wrong that I still need to understand: after several days developing it, the last time I opened it in the Arduino IDE it was deleted, zero bytes, so I don't have it anymore. I tried some utilities to recover deleted files but with no luck at all.
Please help me get the TJpg decoder working inside my library, so I can increase the frame rate a bit and play both video and audio. Many thanks!
Sorry for my bad English, it is not my native language; I'm Italian.