Speed of Arduino compared to PC...

So I'm taking the free online Stanford class in Cryptography, and the first assignment has one of those "with a little thought you can solve this in a loop with less than 2^28 iterations" problems. For kicks, I thought I'd run it on an Arduino as well as on my (somewhat aged) 2.8GHz Xeon desktop. The straightforward implementation takes about 7500 seconds on a 16MHz Arduino. A bit of obvious optimization (eliminating the 32-bit multiplications and modulus operations and replacing them with repeated additions and conditional bounds checking) reduces that to only about 1500 seconds.
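Roughly the kind of change I mean, as a sketch (k, m, and the loop bound here are stand-ins, not the actual values from the assignment):

```cpp
// Illustration only, not the assignment code: strength-reducing (i * k) % m
// inside a loop. Assumes k < m and m < 2^31 so acc + k never overflows.
void scanResidues(uint32_t k, uint32_t m) {
  uint32_t acc = 0;                     // invariant: acc == (i * k) % m
  for (uint32_t i = 0; i < (1UL << 28); i++) {
    // ... test acc against whatever the problem is looking for ...
    acc += k;                           // one 32-bit add instead of a multiply
    if (acc >= m) acc -= m;             // conditional subtraction instead of '%'
  }
}
```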

It takes about 0.5 s on the desktop (either way).

So there you have it. An Arduino is about 15000 times slower than a Mac. :slight_smile:

Hi,
Interesting. Unless my maths is wrong, taking clock speed alone you would expect it to be around 175 times slower. Is my maths wrong, or is there a lot of floating point or multi-byte maths in your algorithm?
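(Spelling out the clock-ratio arithmetic I'm assuming, raw clock speed only, ignoring any architectural differences:)

$$\frac{2.8\ \text{GHz}}{16\ \text{MHz}} = \frac{2.8 \times 10^{9}\ \text{Hz}}{16 \times 10^{6}\ \text{Hz}} = 175$$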

Duane B

rcarduino.blogspot.com

It's not just clock speed; it's also data path width (the AVR works on 8 bits at a time) and the fact that some instructions on the desktop CPU do more work in one clock cycle.
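If you want to see the data-path effect on its own, a quick sketch along these lines (untested, the operand values are just placeholders) should show it by timing a batch of 8-bit multiplies against 32-bit multiplies with micros():

```cpp
// Rough benchmark sketch (illustration, not measured): compares how long the
// 8-bit AVR takes for a batch of 8-bit multiplies versus 32-bit multiplies,
// to show the cost of synthesizing wide operations on a narrow core.
volatile uint8_t  a8  = 123,         b8  = 45,       sink8;
volatile uint32_t a32 = 123456789UL, b32 = 987654UL, sink32;

void setup() {
  Serial.begin(9600);

  unsigned long t0 = micros();
  for (uint16_t i = 0; i < 10000; i++) {
    sink8 = a8 * b8;            // native 8-bit multiply
  }
  unsigned long t8 = micros() - t0;

  t0 = micros();
  for (uint16_t i = 0; i < 10000; i++) {
    sink32 = a32 * b32;         // built up from several 8-bit operations
  }
  unsigned long t32 = micros() - t0;

  Serial.print(F("8-bit multiplies (us):  ")); Serial.println(t8);
  Serial.print(F("32-bit multiplies (us): ")); Serial.println(t32);
}

void loop() {}
```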

A lot depends on the algorithm, but I believe a lot of cryptographic algorithms are designed to run well on low-power, low-speed devices like smart cards. A "reference" implementation may not be optimized for those situations, though.