PROGMEM for program created numerical data

Hi, this is my first post here and I've done some searching beforehand without much luck;

While I have learned the various procedures for storing information in flash, all the examples I've seen refer to storing 'user-defined' strings in arrays.

I have two sets of around 1000 single bytes each which have been generated by the program in SRAM. I can easily put them in EEPROM but would prefer to use the flash memory.

Thank you in advance.

Flash memory is read-only. You cannot store data in flash memory at run time.

If the values are generated, why do they need to be stored?

nickthesafe:
Hi, this is my first post here and I've done some searching beforehand without much luck;

While I have learned the various procedures for storing information in flash, all the examples I've seen refer to storing 'user-defined' strings in arrays.

I have two sets of around 1000 single bytes each which have been generated by the program in SRAM. I can easily put them in EEPROM but would prefer to use the flash memory.

Thank you in advance.

Can you generate the numbers at compile time, then, if you do not want to hard-code all 1000 bytes? This is of course only useful if the numbers do not depend on anything that happens at run time.

I keep seeing contradictory explanations - 'you can't put anything in flash' is one, and another is 'use avr/pgmspace', followed by lines such as:

const dataType variableName PROGMEM = {data0, data1, data2, ...};

I run a setup routine which determines (on power up) whether or not the setup has been completed previously; if it hasn't, it goes ahead and creates the required data sets. This data is referred to later in the program and slight changes are made as necessary, but on the whole, once the information is created it won't change unless a 'full reset' is needed.

It's because of the odd alterations that I don't really want to burn up EEPROM writes :wink:

I keep seeing contradictory explanations - 'you can't put anything in flash' is one, and another is 'use avr/pgmspace', followed by lines such as:

const dataType variableName PROGMEM = {data0, data1, data2, ...};

Flash is read-only AT RUN TIME. When you upload, all the data is stored in Flash. What the PROGMEM directive does is stop the data from being copied to SRAM.
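To make the distinction concrete, here is a minimal sketch of a compile-time table kept in flash with PROGMEM. The table name and values are my own illustration, not from the thread; the non-AVR fallback macros are only there so the example also compiles on a PC.

```cpp
// A compile-time lookup table kept in flash with PROGMEM.
// On a host PC (no AVR toolchain), stub out the macros so the
// example still compiles for illustration.
#ifdef __AVR__
#include <avr/pgmspace.h>
#else
#define PROGMEM
#define pgm_read_byte(addr) (*(const unsigned char *)(addr))
#endif

// The initializer must be known when the sketch is compiled --
// this is exactly why runtime-generated data cannot go here.
const unsigned char sineQuarter[8] PROGMEM = {0, 49, 97, 141, 180, 212, 236, 251};

// On AVR, reads must go through pgm_read_byte(); a plain array
// access would fetch from the same address in SRAM instead.
unsigned char readTable(unsigned int i) {
    return pgm_read_byte(&sineQuarter[i]);
}
```

The key point is that PROGMEM only changes where the (already compiled-in) data lives and how it is read back; it gives you no way to write to it from the running program.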

You do NOT (seem to) have data at upload time, so your ONLY choice is to define the data in arrays that are in SRAM.

Sorry, understood - the const dataType... line IS evaluated at compile time. I'll get my coat and leave quietly :wink:

I'll have to use EEPROM and avoid changes.
Unless any other ideas

Thanks

nickthesafe:
Sorry, understood - the const dataType... line IS evaluated at compile time. I'll get my coat and leave quietly :wink:

I'll have to use EEPROM and avoid changes.
Unless any other ideas

Thanks

EEPROM has 10 times more endurance than the flash memory (100,000 writes vs. 10,000).

With 100,000 writes, you can write a byte 10 times per day every day of the year for 27 years.

What are you worried about? Do you just hear that "EEPROM has a limited number of writes" and don't check how limited it is? Do you not check how many writes your alternative (flash memory) gets, or did you assume it was more/infinite?

Those numbers I quoted are on the front page of the datasheet, 3rd bullet point down. They aren't buried in fine print.

There are a variety of solutions to this kind of problem. One of them is to use a large chunk of storage - an SD card - and to write out "deltas". As the data changes, you write out just the bits that have changed. This means that to read the data, the program needs to read the original data and then "replay" all the deltas. Once you have "enough" deltas to make it worthwhile, you calculate the full dataset as it is now and write the whole thing out, and start the cycle again.
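The replay step can be sketched in a few lines. This is a minimal illustration, not the thread's actual scheme: the Delta record layout (index, new value) is an assumption of mine, and in a real sketch the deltas would be read back from the SD card rather than from an in-memory array.

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical delta record: which byte changed, and its new value.
struct Delta {
    uint16_t index;
    uint8_t  value;
};

// Rebuild the current dataset by applying each logged delta, in order,
// on top of the original data (which would live on the SD card or in flash).
void replayDeltas(uint8_t *data, size_t len,
                  const Delta *deltas, size_t count) {
    for (size_t i = 0; i < count; ++i) {
        if (deltas[i].index < len) {
            data[deltas[i].index] = deltas[i].value;
        }
    }
}
```

Because deltas are applied in order, a byte that changed several times simply ends up with its most recent value, which is what makes the periodic "write out the full dataset and restart" step safe.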

A way to address this would be to put the dataset in progmem by generating C++ source code, and store in EEPROM just enough data to get from the progmem dataset to the "current" dataset.

For instance: divide your data into 64 150-byte blocks, write out 8 bytes (one bit per block) indicating which blocks have changed, and then for each of those blocks write out the 150 or so bytes. Follow the blocks with 8 bytes of zero to indicate the end of the chain.

To add a new batch of changes, you'd overwrite the zero bytes with a new header and follow it with what has changed since the last update. This way, you are only writing to any given EEPROM byte twice at most. You do this write at intervals, or only when the device is about to be powered down. When the EEPROM looks like it's filling up, grab all the stuff out of the EEPROM (perhaps by uploading a sketch to echo it out over serial), apply it to the original dataset, generate new source code and compile it into a new sketch, which you upload.
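Building the 8-byte "which blocks changed" header could look something like the sketch below. The function and constant names are mine, not from the post, and the block count/size simply follow the 64 x 150 figures above; a real sketch would then write the bitmap and the flagged blocks to EEPROM rather than leave them in RAM.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>

// Hypothetical layout following the scheme above: the dataset is
// divided into 64 blocks, and an 8-byte bitmap (one bit per block)
// marks which blocks differ from the base dataset.
const size_t kBlocks = 64;
const size_t kBlockSize = 150;

// Compare the current dataset against the base dataset block by
// block, setting one bit in the 8-byte header for each changed block.
void buildChangeBitmap(const uint8_t *base, const uint8_t *current,
                       uint8_t bitmap[8]) {
    memset(bitmap, 0, 8);
    for (size_t b = 0; b < kBlocks; ++b) {
        if (memcmp(base + b * kBlockSize,
                   current + b * kBlockSize, kBlockSize) != 0) {
            bitmap[b / 8] |= (uint8_t)(1u << (b % 8));
        }
    }
}
```

An all-zero bitmap then doubles as the end-of-chain marker, which is why appending a new batch only means overwriting those 8 zero bytes with a real header.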

For convenience, stick the dataset into its own C++ file - an array named baseData or whatever.

Another question is: how much entropy is in your data? If it's low entropy - eg: mostly zeros with a few places having meaningful data - it may pack down well with a standard compression algorithm.
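For low-entropy data, even the simplest scheme - run-length encoding - can shrink long runs of identical bytes dramatically. A minimal sketch, with a (count, value) pair format of my own choosing rather than any standard library's:

```cpp
#include <cstddef>
#include <cstdint>

// Minimal run-length encoding: each output pair is (count, value).
// Only worthwhile for low-entropy data, e.g. long runs of zeros;
// high-entropy data can actually grow to twice its size.
// Returns the number of bytes written to `out` (the caller must size
// it for the worst case, 2 * len).
size_t rleEncode(const uint8_t *in, size_t len, uint8_t *out) {
    size_t written = 0;
    size_t i = 0;
    while (i < len) {
        uint8_t value = in[i];
        size_t run = 1;
        // Count how far the run of identical bytes extends (cap at
        // 255 so the count fits in one byte).
        while (i + run < len && in[i + run] == value && run < 255) {
            ++run;
        }
        out[written++] = (uint8_t)run;
        out[written++] = value;
        i += run;
    }
    return written;
}
```

For example, a 1000-byte array that is mostly zeros with a handful of set values would pack down to a few dozen pairs, comfortably inside the Uno's 1 KB of EEPROM.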

Jiggy-Ninja:
EEPROM has 10 times more endurance than the flash memory (100,000 writes vs. 10,000).

With 100,000 writes, you can write a byte 10 times per day every day of the year for 27 years.

What are you worried about? Do you just hear that "EEPROM has a limited number of writes" and don't check how limited it is? Do you not check how many writes your alternative (flash memory) gets, or did you assume it was more/infinite?

Those numbers I quoted are on the front page of the datasheet, 3rd bullet point down. They aren't buried in fine print.

Errr...No, it's the amount of data. I can work with the EEPROM just have to keep it slim :slight_smile:

I've only been working with C++ and Arduino for a fortnight; before that it was 1980s BASIC. I've been tasked with building a digital prototype of an analogue device and have jumped headlong in. The initial question has been answered - thanks again - I've 'thrown the switches' to use the EEPROM and it's working just fine. :slight_smile:

PaulMurrayCbr:
For convenience, stick the dataset into its own C++ file - an array named baseData or whatever.

Another question is: how much entropy is in your data? If it's low entropy - eg: mostly zeros with a few places having meaningful data - it may pack down well with a standard compression algorithm.

That's a good point, Paul! Something for me to consider if I have to increase the 'resolution' of the analysis. :wink:
Cheers.