Runtime image compression on Zero

I'm wondering whether it's possible to compress images during runtime on the Arduino.

Let's say the MCU is an ATSAMD21E18 (Arduino Zero equivalent), images are being gathered by a camera module (OV2640 or OV5640) and saved to external memory (a flash IC). I don't have this precise setup built, but from my experience with these components I'm fairly certain that part should be no problem.

When sending these images, they should be as small as possible. I would like to take such an image (typically PNG, JPG or BMP) and compress it at runtime, so it can be easily transmitted via radio.

I'm not sure whether the CPU can handle the compression algorithm. I'm not sure whether the flash (256 kB) can fit the compression code. I am sure that many of the images won't fit into the RAM (32 kB); is that a problem for the algorithm?

So, what do you think, is what I'm proposing feasible on this hardware? Are there any libraries for this? Thanks.

I may be talking garbage here, but I would have thought that the computations required to convert from a raw image (I assume this is what is saved on your flash chip) into a more recognisable BMP, PNG or JPG would be within the capabilities of a SAMD21.

You would need to find out more about your chosen algorithm to decide if it needed the whole image to be held in memory (RAM) or whether it can be read in smaller chunks.

I wouldn't like to guess how quickly the image could be converted from raw to JPG (for example).

Would you expect to acquire several images quickly and then post-process them into JPG/BMP/PNG, or would you acquire a single image, process it, transmit it and then wait to acquire the next image?

As far as I know, the camera can output these formats directly. You can check this example sketch which uses FIFO RAM as well, though that shouldn't be necessary.

Sure. From what I understand, if you remove a few bytes from a JPEG you can break it completely, while if you remove a few bytes from a PNG you just lose a few pixels. Because of this I think PNG could work better, but maybe I'm wrong.

The whole reason for the compression is that the camera is on a flying probe, which has limited radio capabilities. I would expect to capture and store images in the flash at higher resolutions periodically. I would then expect the probe to send just a tiny preview (100px wide or even smaller) of these images. I could then request only the good ones to be downloaded in full resolution. I'm now thinking that maybe 2 pictures at different resolutions could be taken right after each other, but I don't know how well that would work in practice, since the probe can be moving and rotating at very high speeds.

Easy enough to try, surely?

Get a JPG and PNG image and change some bytes at random in the image with a hex file editor and see what happens.

What 'flying probe'?

You didn't say which compression algorithm you plan to use. Are you fairly well versed in such things?

Good call: when I do this, neither works. I've still seen systems that transmit images and suffer packet loss use PNG, so I will have to read more about the encoding.

A small high altitude balloon.

I'm not, that's why I'm asking; maybe there is some C++ implementation that just needs to be ported.

Maybe I misunderstand. Do you want to compress e.g. a JPG image?

I don't know about BMP, but the other two are already compressed and you will not have any gain.

So, there are plenty of online tools that can take a JPG image and reduce its size by applying lossy compression, reducing the number of colors, etc. This can cut the file size by 80-90%. I'm just trying to find a C++ implementation and to work out whether it could run on a CPU like this.

Ah, IC.

Before you plan too far, there are problems with transferring images, in particular duty-cycle issues in the UHF ISM bands, whereas there are usually no duty-cycle restrictions at 2.4 GHz.

Some of the issues are described here;

And whilst high res images sound like a great idea, first work out how long they would take to transfer over typical high altitude balloon distances.

In order to do that, the image must first be expanded to its normal size, and then compressed again. How much memory will that take?
Paul

A 1024x768 pixel image at 1 byte / pixel would unpack to 786,432 bytes.

It's been a while since I did any image manipulation, but if the BMP image is stored uncompressed in your flash memory, then I think you may be able to work your way through it sequentially, picking out every nth pixel to create a crude low-res image. I may be wrong here (it's been a while), but I think a compressed BMP can also be processed sequentially - just with a bit of uncompressing. Wikipedia has a fair amount of detail on the BMP image formats, which would be a good starting point.

Porting image compression code would normally be a task for an experienced programmer, so my question stands... JPEG code, at least, is all over the internet, so you must have seen it by now... if not, start Googling.

In the original post it is not clear how you are obtaining the raw images. If they are streamed to the Arduino, you have a chance at sequential encoding. But this has a huge do-it-yourself component. I hope you're ready for that. You aren't going to find any ready-made solutions.

Compression algorithms aren't usually lengthy, although they are often complex. So you won't run out of flash for the compression code.

If you want code whose internal workings you can easily understand, use Huffman coding.
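To illustrate what the core of it looks like, here is a bare-bones, desktop-style C++ sketch that builds the Huffman tree from byte frequencies and reads off a code length per symbol. It is not from any particular library, it assumes the frequency counts already exist, and it skips the actual bit packing and tree cleanup you would need for real use:

```cpp
#include <cstdint>
#include <queue>
#include <vector>

struct Node {
    uint32_t freq;
    int16_t  symbol;          // 0..255 for leaves, -1 for internal nodes
    Node*    left;
    Node*    right;
    Node(uint32_t f, int16_t s, Node* l = nullptr, Node* r = nullptr)
        : freq(f), symbol(s), left(l), right(r) {}
};

struct ByFreq {               // min-heap ordering for the priority queue
    bool operator()(const Node* a, const Node* b) const { return a->freq > b->freq; }
};

// Walk the finished tree and record the bit length assigned to each symbol.
static void collectLengths(const Node* n, uint8_t depth, uint8_t lengths[256]) {
    if (!n) return;
    if (n->symbol >= 0) { lengths[n->symbol] = depth ? depth : 1; return; }
    collectLengths(n->left,  depth + 1, lengths);
    collectLengths(n->right, depth + 1, lengths);
}

// freqs[i] = how often byte value i occurs in the data to be compressed.
void buildCodeLengths(const uint32_t freqs[256], uint8_t lengths[256]) {
    std::priority_queue<Node*, std::vector<Node*>, ByFreq> pq;
    for (int i = 0; i < 256; ++i)
        if (freqs[i]) pq.push(new Node(freqs[i], (int16_t)i));

    // Repeatedly merge the two least-frequent nodes until one tree remains.
    while (pq.size() > 1) {
        Node* a = pq.top(); pq.pop();
        Node* b = pq.top(); pq.pop();
        pq.push(new Node(a->freq + b->freq, -1, a, b));
    }
    if (!pq.empty()) collectLengths(pq.top(), 0, lengths);
    // Nodes are deliberately leaked here to keep the sketch short; a real port
    // would free the tree (and avoid this much heap churn on a 32 kB MCU).
}
```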

Thanks for these resources; I read all of them and they will definitely help me design my own transmission system.

I calculated the airtime to transmit a 10 kB picture (640x480 px) to be 22 seconds at SF9BW250, which has good range in my experience (when using 17-20 dBm). Larger resolutions or slower data rates will make this longer, but I don't mind having one picture every 1-4 hours. Also, pictures will only be taken during the day, but they can theoretically be transmitted during the night as well.
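For reference, a rough estimate like that can be reproduced from the SX127x datasheet airtime formula. The sketch below assumes the 10 kB image is split into 200-byte payloads with an 8-symbol preamble, explicit header and CRC on (all assumptions of mine, not measured settings), and lands in the same ballpark:

```cpp
// Rough LoRa airtime estimate using the Semtech SX127x datasheet formula,
// assuming the image is split into fixed-size payloads with explicit header.
#include <math.h>
#include <stdint.h>
#include <stdio.h>

double packetAirtime(uint8_t sf, uint32_t bw, uint16_t payloadBytes,
                     uint8_t cr /*1 = 4/5*/, bool crcOn, bool lowDataRateOpt) {
    const double tSym = pow(2, sf) / bw;                  // symbol duration [s]
    const double tPreamble = (8 + 4.25) * tSym;           // 8-symbol preamble
    double tmp = ceil((8.0 * payloadBytes - 4.0 * sf + 28 + 16 * (crcOn ? 1 : 0))
                      / (4.0 * (sf - 2 * (lowDataRateOpt ? 1 : 0)))) * (cr + 4);
    const double nPayload = 8 + (tmp > 0 ? tmp : 0);
    return tPreamble + nPayload * tSym;
}

int main() {
    const uint32_t imageBytes = 10 * 1024;   // ~10 kB picture
    const uint16_t chunk = 200;              // bytes of image per packet (assumed)
    const uint32_t packets = (imageBytes + chunk - 1) / chunk;
    double total = packets * packetAirtime(9, 250000, chunk, 1, true, false);
    printf("%lu packets, ~%.1f s total airtime at SF9 BW250\n",
           (unsigned long)packets, total);   // roughly 26 s with these assumptions
    return 0;
}
```

The per-packet framing overhead is why the result comes out a few seconds above the bare payload-over-bitrate figure.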

The duty cycle is a limit I can fit within. The thing is, I'm playing with the idea of sending the pictures through LoRaWAN, not just point-to-point LoRa. TTN has community airtime limitations, which would make this hard. The new Helium network, which now has over 120k gateways, doesn't have this limit (since you pay for the packets), so it could work (?).

Huh, I didn't think of that; maybe I could try mounting external SRAM for this?

Oh, that sounds pretty smart, for previews at least; I will have to find out how much time and RAM it takes though.

I see. I have some C++ programmer friends whom I can ask; I was mainly asking whether this is possible on an Arduino.

Right, the solution @markd833 suggested sounds good so I'll try that.

Okay, I will try this as well.

Don't even attempt to do this!

Even at the fastest data rate, SF7 BW125000, your free access limit is circa 12 kbytes of useful data a day.

The load such a transmission from a high altitude balloon would put on the TTN system could be huge, with the signals being processed by many gateways over a wide area.

Also remember that LoRaWAN/TTN is really designed as a one-way system; there is no general acknowledgement of packet reception, which you would need for an image transmission to be successful. A single corrupt bit can mean no image at all.

The UKHAS guys do transmit pictures from HABs, and for very good reasons they use SSDV, whereby the image is split into blocks, so the loss of a block does not result in total image loss. They also use a LoRa bandwidth of 20,800 Hz to avoid legal duty-cycle issues.

In most places in the world, you're limited in the licence-exempt bands to 10 dBm @ 434 MHz and 14 dBm @ 868 MHz (the common TTN band).

Is this high altitude balloon an up-to-30 km, burst, and back-down-to-Earth job, or a round-the-world floater?

I understand, though I didn't know about the 12 kB limit before. That's why I mentioned Helium: they shouldn't care, since their network has 6x the gateways and they charge you for the packets. They also say that their network supports OTA updates through LoRa, and that's a lot of data too.

It supports occasional downlinks though, right?

It's a floater. If the LoRaWAN approach doesn't work at all, I will think about constructing private links at various locations.

10 downlinks a day.

If memory serves, with a BMP format image, the file stores each pixel from left to right, one row at a time from top to bottom (but I seem to recall that BMP actually went from bottom to top - or maybe that was GIF!). You don't need to hold the whole image in RAM if you simply want to generate a crude thumbnail image. You could, for example, take every 4th pixel along a row and use that as the new pixel in your thumbnail. Do the same for every 4th row and you have a crude thumbnail.

You can read in a small subset of pixels at a time - maybe one row - as you have a huge 32 kB ( :grinning:) of SRAM to play with, and subsample that down to your desired thumbnail size.
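To make that concrete, here is a rough, untested sketch of the idea, assuming an uncompressed 24-bit VGA BMP and hypothetical readFlash()/writeFlash() helpers standing in for whatever your flash driver actually provides:

```cpp
#include <stdint.h>

// Hypothetical flash helpers - stand-ins for your SPI flash library's calls.
extern void readFlash(uint32_t addr, uint8_t* buf, uint32_t len);
extern void writeFlash(uint32_t addr, const uint8_t* buf, uint32_t len);

static uint32_t le32(const uint8_t* p) {   // little-endian 32-bit field from the header
    return (uint32_t)p[0] | ((uint32_t)p[1] << 8) | ((uint32_t)p[2] << 16) | ((uint32_t)p[3] << 24);
}

// Keep every 'step'-th pixel of every 'step'-th row of an uncompressed 24-bit BMP.
void makeThumbnail(uint32_t srcAddr, uint32_t dstAddr, uint32_t step) {
    uint8_t header[54];
    readFlash(srcAddr, header, sizeof(header));

    uint32_t dataOffset = le32(header + 10);          // where the pixel array starts
    uint32_t width      = le32(header + 18);
    uint32_t height     = le32(header + 22);

    uint32_t srcRowSize = ((width * 3 + 3) / 4) * 4;  // BMP rows are padded to 4 bytes
    static uint8_t rowBuf[640 * 3];                   // one VGA row, ~1.9 kB of the 32 kB SRAM
    if (width * 3 > sizeof(rowBuf)) return;           // larger images need a bigger buffer

    uint32_t outPos = 0;
    for (uint32_t y = 0; y < height; y += step) {     // every Nth row
        readFlash(srcAddr + dataOffset + y * srcRowSize, rowBuf, srcRowSize);
        for (uint32_t x = 0; x < width; x += step) {  // every Nth pixel in that row
            writeFlash(dstAddr + outPos, &rowBuf[x * 3], 3);  // copy one BGR pixel
            outPos += 3;
        }
    }
    // A real version would also write a small BMP header for the thumbnail and
    // account for BMP's bottom-up row order; omitted here to keep the idea clear.
}
```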

On the other hand, I think the JPG (or JPEG) format splits the image up into small blocks rather than rows and columns of pixels, which makes it harder to work with than BMP.

I don't know much about PNG images but I think they may be encoded linearly like BMP rather than blocks like JPG, which would make that format easier to work with as well - although more sophisticated than BMP files.

EDIT: The Winbond W25Q128FV SPI flash memory chips are really cheap and give you loads of storage space. You will need to check that you can read from one file (the source image) and write to another file (the destination thumbnail) while keeping both files open at the same time, assuming that you use a filesystem; you don't have to, as you can just write to the flash directly.

JPG is already lossy :smiley: But if you want to throw away more information, that will work. I do not know of any ready-made solutions.
