
Topic: gcc virtual memory exhausted


May 10, 2011, 10:07 pm Last Edit: May 11, 2011, 08:00 pm by Udo Klein Reason: 1
Today I hit a "virtual memory exhausted" error from gcc for the first time. I am compiling a pretty small sketch and have about 3 GB of virtual memory. The root cause: I am doing a time/memory trade-off and use the preprocessor like a functional programming language, so gcc -E creates a really large temporary file.

This is at the edge of what still compiles. The output of gcc -E is test.txt; the compiled code is about 4 kB.
Code:

$ ls -l
-rw-r--r-- 1 udo udo     4833 2011-05-10 21:58 test.c
-rw-r--r-- 1 udo udo 14824454 2011-05-10 21:58 test.txt

Of course it is clear that I am pushing gcc to its limits; the expanded input is too large. But what exactly makes the compiler give up: the deeply nested () levels, or the mere length of the source? That is, should I try to decrease the size of the preprocessed source, or should I try to reduce the () nesting?

Any comments on this one?
Check out my experiments http://blog.blinkenlight.net


Oops, I should have asked this in Programming Questions. Anyway, I have figured this one out by now: the deep nesting seems to cause the trouble. So I am breaking it up by introducing suitable enum constants as temporary "constants". This in turn also reduces the size of the expanded code to under 2 MB, which means compiling in seconds rather than minutes.

Now I have a compile time and space performance optimized runtime performance optimization :)
