Today I hit a "virtual memory exhausted" error from gcc for the first time. I am compiling a pretty small sketch and have about 3GB of virtual memory. The root cause: I am doing a time-memory trade-off and use the preprocessor like a functional programming language. As a result, gcc -E produces a really large temporary file.
This is what is at the edge of what still compiles. The output of gcc -E is test.txt. The compiled code is about 4k.
$ ls -l
-rw-r--r-- 1 udo udo 4833 2011-05-10 21:58 test.c
-rw-r--r-- 1 udo udo 14824454 2011-05-10 21:58 test.txt
Of course it is clear that I am pushing gcc to its limits: the expanded input is too large. But what exactly makes the compiler blow up? The deeply nested () levels, or simply the sheer length of the preprocessed source? In other words: should I try to shrink the size of the preprocessed output, or should I try to reduce the () nesting depth?
Any comments on this one?