
Topic: Optimization/Size Coding Challenges


A forum section similar to this: http://www.unitedti.org/forum/index.php?s=c2434180fad22da31670712eef177af0&showforum=63
where people try to come up with the fastest (or, in the case of the TI forum, smallest) program that can do the specified challenge. This could help people learn optimization techniques, as well as make people think about how they would optimize their code. The newbies (of which I am one) could learn from the long-time users, and it may sometimes end up being the other way around.

Coding Badly

Is another section necessary?  Other Software Development seems like an appropriate section and only gets about five posts a day.


Agree with Coding Badly

But optimizing is only done if the application is not fast enough. If it ain't broke, why fix it?

I think there should be more focus on getting the code right instead of fast.

Rob Tillaart

Nederlandse sectie - http://arduino.cc/forum/index.php/board,77.0.html -
(Please do not PM for private consultancy)


I suppose it is not so much about getting it fast as about getting it small. In a large program, saving space on small tasks could let you add more functionality to what you're doing.


Very true, small is beautiful :)

Some rules of thumb:

Use short texts for your display and minimize the length of debug messages. Use uint8_t if you only need a value between 0..255, also as a for-loop counter.
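A minimal sketch of the uint8_t tip, in plain C++ rather than Arduino code. On an 8-bit AVR an int occupies two bytes while uint8_t occupies one, which is what this tip relies on; the function name `sumFirstN` is a hypothetical example, not from the thread.

```cpp
#include <cstdint>

// Illustration only: a byte-sized loop counter is enough when the
// range fits in 0..255, and on 8-bit MCUs it can save RAM and flash.
int sumFirstN(uint8_t n) {
    int total = 0;
    for (uint8_t i = 0; i < n; ++i) {  // uint8_t suffices for 0..255
        total += i;
    }
    return total;
}
```

Note the usual caveat: a uint8_t counter silently wraps at 255, so this only applies when the bound is known to fit.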

Also learn to use bit manipulations like & | ^ ~
See the tutorial section for a short intro, and http://graphics.stanford.edu/~seander/bithacks.html - for advanced bit hacks.
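To make the four operators concrete, here is a small sketch of the classic set/clear/toggle/test idioms on a byte. The helper names are just for illustration; on a real AVR the same expressions are applied directly to port registers.

```cpp
#include <cstdint>

// & | ^ ~ in their most common embedded use: single-bit manipulation.
uint8_t setBit(uint8_t v, uint8_t bit)    { return v | (1u << bit); }   // OR sets
uint8_t clearBit(uint8_t v, uint8_t bit)  { return v & ~(1u << bit); }  // AND with inverted mask clears
uint8_t toggleBit(uint8_t v, uint8_t bit) { return v ^ (1u << bit); }   // XOR flips
bool    testBit(uint8_t v, uint8_t bit)   { return (v >> bit) & 1u; }   // shift and mask reads
```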

And don't use Strings but char array[], as Strings can be very RAM-expensive.
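The Arduino String class allocates on the heap; a fixed char buffer has a RAM cost known at compile time. A desktop C++ sketch of the char-array style (the buffer size of 16 and the `setMessage` helper are arbitrary examples, not from the thread):

```cpp
#include <cstring>

// 16 bytes of RAM, fixed at compile time -- no heap fragmentation.
char message[16];

// Copy text into the buffer, truncating if needed, always NUL-terminated.
void setMessage(const char* text) {
    strncpy(message, text, sizeof(message) - 1);
    message[sizeof(message) - 1] = '\0';
}
```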

Use functions with local vars instead of global vars. Functions keep your code clean and lead to reuse of code.
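As a small sketch of that style: the running sum below lives only inside the function instead of being a global, so the function can be reused on any array. `averageOf` is a hypothetical name chosen for illustration.

```cpp
// Reusable helper: all working state is local to the call.
int averageOf(const int* values, int count) {
    long sum = 0;  // local, freed when the function returns
    for (int i = 0; i < count; ++i) {
        sum += values[i];
    }
    return count ? (int)(sum / count) : 0;  // guard against division by zero
}
```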

Rob Tillaart


Udo Klein

IMHO the newbies would not learn from the experts. The reason is that many newbies choose to ignore the fact that there is no real benefit in optimization for its own sake. Also, the examples you quote are "tactical" in nature. They are brain teasers. Proper optimization has to analyze and optimize the structure first. Unfortunately this is nothing that you can learn from these small-scale examples.

Just to give you some ideas:

1) If you are creating small numbers of devices, software development cost will dominate everything --> go for faster chips if you need to improve by just some factor
2) If you are creating very large numbers of devices, hardware cost will start to dominate --> you may have to go for really cheap chips (e.g. 4-bit processors or even cheaper)
3) If you need to scale to large datasets, you need to apply suitable algorithms. Chances are that the literature offers far better algorithms than you can dream of, so do not roll your own but go and look it up.
4) Rolling your own is almost always more fun, but no matter what you do and how experienced you are, consider (3)

Check out my experiments http://blog.blinkenlight.net


You are quite right Udo:
"Proper optimization has to analyze and optimize the structure first"

Very true, sometimes optimization is not optimization but a completely different algorithm/strategy, as the optimization depends very much on the context.

e.g. sorting is a well-known problem, but depending on your dataset and its usage, different sorting algorithms perform better - QuickSort, for example.
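This point about looking up algorithms rather than rolling your own applies directly here: the C++ standard library already ships a well-tuned sort (typically introsort, a quicksort variant with guaranteed worst-case behavior). A minimal sketch; the `sorted` wrapper is just for illustration.

```cpp
#include <algorithm>
#include <vector>

// Prefer the library's tuned sort over a hand-rolled quicksort.
std::vector<int> sorted(std::vector<int> data) {  // by value: original untouched
    std::sort(data.begin(), data.end());
    return data;
}
```

(On an actual Arduino the AVR libc equivalent would be `qsort` from <stdlib.h>.)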
Rob Tillaart


Udo Klein

Quicksort is an excellent example:

1) It performs far better (most of the time) than anything you can come up with easily.
2) Depending on preconditions (e.g. a presorted dataset) there may still be a better algorithm.
3) For really large datasets (requiring distributed storage) it performs poorly, but the literature offers enough better algorithms.

Udo Klein

Mar 24, 2012, 07:13 pm Last Edit: Mar 24, 2012, 07:15 pm by Udo Klein Reason: 1
With regard to tactical vs. structural optimizations here I have one example where I blundered:

Have a look at my removing-flicker experiments http://blog.blinkenlight.net/experiments/removing-flicker/ . I used the macro processor for loop unrolling, a somewhat aggressive optimization technique. It gives ~20kHz PWM frequency with 32 different levels. Of course it looks as if there is no more room for optimization left. It consumes significant memory and utilizes the CPU at 100%.

However there ARE better algorithms. I did not even think that there might be better approaches, but there are. Right now I have a PWM on the Arduino that outputs 256 levels at all 20 pins with ~20kHz while keeping CPU load below 70% and consuming less memory. However, the optimization technique to pull this one off is at least as aggressive as the macro approach ;)

Figure this one out if you want to learn about optimization approaches.

