What is the current status of 128k+ PROGMEM support on Arduino mega2560?

Posting in desperation on this one.

I have a large project that has been hovering around 117k memory use all up. I rewrote a library, tested it thoroughly on Linux, then enabled my #defines for the Arduino build.

It now fails to run (I get a few newlines and then nothing).

The build has suddenly blown out:

section                      size      addr
.data                         368   8389120
.text                      121216         0
.bss                         1843   8389488
.comment                       17         0
.note.gnu.avr.deviceinfo       64         0
.debug_aranges                576         0
.debug_info                  6082         0
.debug_abbrev                3202         0
.debug_line                  2924         0
.debug_str                    998         0
Total                      137290

Spent two days deep-diving and experimenting with every online resource I can find. I've looked at / done:

  • Early development (2003) of pgm_get_far_address(), now included in AVR-libc.
  • Reverted all the changes I made (that I can remember)
  • Played endlessly with other memory sections eg:
#define HIGHPROGMEM8 __attribute__((section(".fini8")))
#define HIGHPROGMEM7 __attribute__((section(".fini7")))
  • Got high memory access working
  • Put a heap of string arrays up into high memory areas. Accessed them.
  • Played with padding to avoid 128k crossing - added dummy arrays to avoid useful data spanning the 128K barrier (used fini(n) etc to order them)
  • Ripped out heaps of PROGMEM sections to reduce the .text section overall.
  • Tried defining my own memory sections and allocating them via linker scripts
  • Tried a number of new / example sketches to test out my learning (some of the older ones break current C/C++ rules)
  • etc etc.
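For reference, the high-memory trick described in the list above can be sketched roughly like this. This is a minimal illustration based on avr-libc's pgmspace API; the choice of section name (.fini7) and the read loop are my assumptions, not code from the project:

```cpp
#include <avr/pgmspace.h>

// Force data into a late-linked section so it lands above the normal
// .text/PROGMEM area (.fini sections are linked after .text by avr-gcc).
#define HIGHPROGMEM __attribute__((section(".fini7")))

const char high_msg[] HIGHPROGMEM = "Stored high in flash";

void printHigh()
{
  // pgm_get_far_address() yields a 32-bit flash address, which the
  // *_far accessors can use where 16-bit near pointers cannot reach.
  uint_farptr_t p = pgm_get_far_address(high_msg);
  for (char c; (c = (char)pgm_read_byte_far(p)) != '\0'; ++p)
    Serial.write(c);
  Serial.println();
}
```

The key point is that every access to such data has to go through the `_far` accessors; the normal `pgm_read_byte()` and friends only take 16-bit addresses.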

I'm building on Ubuntu 18.04 with the currently distributed AVR libraries. I use Eclipse with the Sloeber plugin as my IDE.


  • I just can't get the build back under 128k (it used to be 117k)
  • With careful crafting of high memory sections and _PF accessors I've managed to get it to run again; however, it now fails on any use of SD.begin()!

My head is spinning with so much research and reading, and I can't tell which issues are old and which are current.

So my question in short:

Can someone knowledgeable update us on the current status and limitations:

  • Current core libraries wrt +128K on mega
  • Current PROGMEM limitations and _PF function limitations
  • Anything that can restore my sanity

Many thanks in advance,

I suggest starting with the Arduino IDE 1.8.9, because I don't understand your build environment at the moment.

Remove all the "arduino"-related packages from the repositories; you may even remove all "Java". Remove the "librxtx-java" package.
Perhaps you can make a fresh start and remove the hidden ~/.arduino15 folder as well.
Download the newest Arduino IDE, which is version 1.8.9.

Unpack it in a folder and make a shortcut to the "arduino" file. I never use the install scripts.

I have no problem with large code. If you do, please give a test sketch.

When the PROGMEM section is larger than 64k, you can run into problems.
I tested that a while ago: Progmem_far.ino.
The far pointers will work, but the other PROGMEM features of Arduino may no longer be usable: Serial.println() with text in PROGMEM, the F() macro, and the memcpy_P() functions may all stop working.

If you still encounter a problem with PROGMEM, then again: please give a test sketch that shows the problem.
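To illustrate the distinction between the near and far APIs, here is a minimal sketch (my own illustration, assuming avr-libc's `_PF` far-string functions; the exact flash boundary where near access breaks depends on your linker layout):

```cpp
#include <avr/pgmspace.h>

// Data forced high in flash, out of reach of 16-bit near pointers.
const char far_text[] __attribute__((section(".fini7"))) = "far data";

void demo()
{
  // Near accessors (F(), memcpy_P, Serial.print of PROGMEM strings)
  // take 16-bit addresses and cannot reach data placed this high.
  // The _PF / _far variants take 32-bit addresses and keep working:
  char buf[16];
  strcpy_PF(buf, pgm_get_far_address(far_text));
  Serial.println(buf);
}
```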

I suggest starting with the Arduino IDE 1.8.9, because I don't understand your build environment at the moment.

Have removed and re-installed the IDE and libraries as you suggested. I then compared the library versions with the eclipse managed versions and they are the same.

Unfortunately the project is too large and complex (10,000 lines, dozens of source and header files) to post any meaningful code. If I could narrow down the problem to something (other than exceeding 128k) I would!

BTW, you can still use the F() macro and _P functions if you force the linker to place data in one of the .fini memory sections; these get ordered after the normal PROGMEM section. Provided the PROGMEM section remains under 128K bytes (64K words), the normal PROGMEM macros and functions work fine, with the far addressing handled as you suggest.

I changed your program a little. In the declarations:

#define HIGH_PROGMEM __attribute__((section(".fini7")))

const uint16_t block1[15000] HIGH_PROGMEM = { R1000(100) };
const uint16_t block2[15000] HIGH_PROGMEM = { R1000(200) };
const uint16_t block3[15000] HIGH_PROGMEM = { R1000(300) };
const uint16_t block4[15000] HIGH_PROGMEM = { R1000(400) };
const uint16_t block5[15000] HIGH_PROGMEM = { R1000(500) };

static const char progmem_string[100] PROGMEM = "This string is in lower PROGMEM using the PROGMEM attribute";

And then at the end of the program:

  // The function "useFarData" shows how to use the data in far PROGMEM.
  useFarData( far_block1);
  useFarData( far_block2);
  useFarData( far_block3);
  useFarData( far_block4);
  useFarData( far_block5);

  Serial.println(F("This string uses the F Macro and Serial.print(__FlashStringHelper*) overload"));
  char ram_string[100];
  strcpy_P(ram_string, progmem_string);
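For context, the far_blockN values and useFarData() referenced above come from the linked Progmem_far.ino sketch; a rough reconstruction of how they might look (my own sketch, not the original code) is:

```cpp
#include <avr/pgmspace.h>

// 32-bit flash address of a high-memory block, taken once at startup,
// since pgm_get_far_address() must be applied to the symbol directly.
uint_farptr_t far_block1;

void setupFarPointers()
{
  far_block1 = pgm_get_far_address(block1);
}

// Read a far-PROGMEM uint16_t array element by element.
void useFarData(uint_farptr_t addr)
{
  uint16_t first = pgm_read_word_far(addr);               // element 0
  uint16_t last  = pgm_read_word_far(addr + 2UL * 14999); // element 14999
  Serial.print(first);
  Serial.print(" ... ");
  Serial.println(last);
}
```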

Thanks, I have added a link to this thread in my sketch at Github.

Can you explain the problem?
I assume that a small sketch with (multiple segments of) PROGMEM is working ?
Is a large sketch with a single small PROGMEM working ?
Is a large sketch with a large PROGMEM section working ?

This year I used the "large making macros" in a different way.
I changed the macro to 100,000 but that was too much, so now it is a few times 10,000 to fill up the program storage to 94%.

// The macro X10000 executes something 10000 times.
#define X10(a)      a;a;a;a;a;a;a;a;a;a
#define X10000(a)   X10(X10(X10(X10(a))))

void setup()
{
  Serial.begin( 9600);
}

void loop()
{
  unsigned long t1, t2, result;

  t1 = millis();
  t2 = millis();
  result = t2 - t1;

  Serial.println( result);
  delay( 1000);
}

Not tested.

Update: caught by coincidence! (I should have suspected it; this is just so typical of software development with hardware!)

In short, it's a hardware issue, now fixed.

In long:

  • Cut heaps of code, especially everything to do with SD from the codebase.
  • Got back under 128k
  • Seemed to work ok
  • Added back in some SD code that reads the card, keeping a big chunk that writes still excluded.
  • Still just under 128k. Flashed it and it ran fine, but it still failed opening the SD.
  • In desperation, I uploaded and flashed it to my production board (over a sat link and a 900 MHz radio link); that's a 2.5-hour drive to fix if I brick something.
  • It worked fine!
  • Replaced some suspect-looking wires on the test rig; all code is now included, well over 128k, and everything is working fine.

Just one of those crazy situations where a seemingly fundamental code change coincided exactly, down to the very build, with a hardware fault on a rig that had otherwise been rock solid for about two years!