ESP32 Websockets and images, Oh my!

Hello Arduino Forums,

I've been banging my head against an issue for a couple of days trying to find a way to do this, but to no avail, hence this post.

As a quick overview of the project: I have an ESP32 with a touch screen (this device, to be exact) acting as a WebSocket server. I connect to it from a computer, receive the touch data, and process it in a webpage. In my testing it's a simple remote-control UI. I then want to take that UI (which is drawn in a canvas element) and send it back to the SPI screen so you can see what you're touching.
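For context, the ESP32 side looks something like this (a trimmed-down sketch rather than my exact code; it assumes the Links2004 arduinoWebSockets library and an XPT2046 touch controller, with placeholder pins and WiFi credentials, so adjust for your board):

```cpp
// Trimmed-down sketch of the server side: the ESP32 runs a WebSocket
// server and streams touch points to the browser as JSON text frames.
// Assumptions: Links2004/arduinoWebSockets library and an XPT2046 touch
// controller on its own SPI bus; pins and credentials are placeholders.
#include <WiFi.h>
#include <WebSocketsServer.h>        // Links2004/arduinoWebSockets
#include <XPT2046_Touchscreen.h>
#include <SPI.h>

const char* WIFI_SSID = "your-ssid";      // placeholder
const char* WIFI_PASS = "your-password";  // placeholder

#define TOUCH_CS  33                 // placeholder pins
#define TOUCH_IRQ 36
SPIClass touchSPI(VSPI);
XPT2046_Touchscreen ts(TOUCH_CS, TOUCH_IRQ);

WebSocketsServer webSocket(81);      // WebSocket server on port 81

// Incoming frames from the browser land here (text or binary).
void onWsEvent(uint8_t num, WStype_t type, uint8_t* payload, size_t length) {
  if (type == WStype_CONNECTED) {
    Serial.printf("Client %u connected\n", num);
  }
}

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);
  Serial.println(WiFi.localIP());

  touchSPI.begin(25, 39, 32, TOUCH_CS);   // SCK, MISO, MOSI, CS
  ts.begin(touchSPI);

  webSocket.begin();
  webSocket.onEvent(onWsEvent);
}

void loop() {
  webSocket.loop();
  if (ts.touched()) {
    TS_Point p = ts.getPoint();
    char msg[48];
    snprintf(msg, sizeof(msg), "{\"x\":%d,\"y\":%d}", p.x, p.y);
    webSocket.broadcastTXT(msg);     // push raw touch coords to every client
  }
}
```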

So far I have tried a few different things, but nothing has worked. My latest attempts have been based around exporting the canvas element as a Base64-encoded JPEG, sending that over the WebSocket, and using the TJpg_Decoder library to draw it on the display, but I can't get the Base64 decode to work on the ESP32.
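For reference, this is roughly the shape of what I have been trying on the ESP32 side: decode the incoming Base64 text with the mbedTLS helper that ships with the ESP32 core, then hand the raw JPEG bytes to TJpg_Decoder and push the decoded blocks out with TFT_eSPI. The fixed buffer size and the assumption that a whole frame fits in RAM are arbitrary choices on my part:

```cpp
// Receive a Base64-encoded JPEG, decode it in RAM with mbedTLS, and render
// it with TJpg_Decoder onto a TFT_eSPI display. Buffer size is a guess;
// use PSRAM or a bigger buffer for larger frames.
#include <TFT_eSPI.h>
#include <TJpg_Decoder.h>
#include "mbedtls/base64.h"

TFT_eSPI tft = TFT_eSPI();

// TJpg_Decoder calls this for each decoded block; push it to the screen.
bool tftOutput(int16_t x, int16_t y, uint16_t w, uint16_t h, uint16_t* bitmap) {
  if (y >= tft.height()) return false;   // stop if we run off the bottom
  tft.pushImage(x, y, w, h, bitmap);
  return true;                           // continue decoding
}

// payload points at the Base64 text (e.g. the body of a WebSocket text frame).
// If the browser sends a data URL, strip the "data:image/jpeg;base64," prefix first.
void drawBase64Jpeg(const uint8_t* payload, size_t length) {
  static uint8_t jpegBuf[40 * 1024];     // assumption: frame fits in 40 kB
  size_t jpegLen = 0;
  int err = mbedtls_base64_decode(jpegBuf, sizeof(jpegBuf), &jpegLen,
                                  payload, length);
  if (err != 0) {
    Serial.printf("Base64 decode failed: %d\n", err);
    return;
  }
  TJpgDec.drawJpg(0, 0, jpegBuf, jpegLen);  // decode and draw at 0,0
}

void setup() {
  Serial.begin(115200);
  tft.begin();
  tft.setRotation(1);
  tft.setSwapBytes(true);                // match TJpg_Decoder's byte order
  TJpgDec.setJpgScale(1);
  TJpgDec.setCallback(tftOutput);
}

void loop() {
  // drawBase64Jpeg() would be called from the WebSocket event handler.
}
```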

I would appreciate any pointers; to be honest, even an HTTP endpoint I could POST the image data to would be great. Ideally I want to keep latency as low as possible, so I am avoiding writing to storage etc.
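To illustrate what I mean by an HTTP endpoint, something like the following is what I had in mind (assuming the stock ESP32 WebServer library, with the canvas image POSTed as plain Base64 text so the body survives as a String, and reusing drawBase64Jpeg() from the sketch above):

```cpp
// Rough alternative: accept the Base64 JPEG as the body of an HTTP POST
// and decode/draw it straight from RAM. Assumes the stock ESP32 WebServer
// library and the drawBase64Jpeg() helper defined earlier.
#include <WiFi.h>
#include <WebServer.h>

void drawBase64Jpeg(const uint8_t* payload, size_t length);  // from the sketch above

WebServer server(80);

void handleFrame() {
  // For non-form POST bodies the WebServer library exposes the raw body
  // under the "plain" argument.
  String body = server.arg("plain");
  drawBase64Jpeg((const uint8_t*)body.c_str(), body.length());
  server.send(200, "text/plain", "ok");
}

void setup() {
  // ... WiFi + display setup as before ...
  server.on("/frame", HTTP_POST, handleFrame);
  server.begin();
}

void loop() {
  server.handleClient();   // decode + draw happens inside the handler
}
```

Either way, the point is that everything stays in RAM so nothing has to touch flash or an SD card.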
