> When the multimeter is set at 20mA I'm getting a reading of "2.8", but when I set it at 10A I get ".6".

Back when I was in school, I remember my teacher telling me to "do part of the math in your head, to *check your work*" while using a calculator. Just because the calculator gives an answer doesn't mean it's correct. (Of course, I was always the kid who said the answer is "ERROR" or "5318008.")

Here's a good example of checking your work.

As explained, the range selections on your multimeter represent the maximum that range can measure. You already know approximately what your LEDs draw (~100mA each) and approximately what the Arduino draws (~25mA). So how meaningful is a reading of 2.8mA on the 20mA range, when you know your circuit draws at least 25mA?
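A quick sanity check using the ballpark figures above (the LED count is just an assumption for illustration):

```python
# Rough sanity check: expected circuit draw vs. the meter's range.
arduino_ma = 25    # approximate Arduino draw (from above)
led_ma = 100       # approximate draw per LED (from above)
num_leds = 2       # assumed LED count, purely for illustration

expected_min_ma = arduino_ma                     # Arduino alone, LEDs off
expected_max_ma = arduino_ma + num_leds * led_ma # everything on
range_max_ma = 20                                # the 20mA range tops out here

print(f"Expected draw: {expected_min_ma}-{expected_max_ma} mA")
print(f"Fits on the 20mA range: {expected_max_ma <= range_max_ma}")
```

Even the minimum expected draw already exceeds what the 20mA range can measure, so a "2.8" on that range can't be right.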

Maybe my multimeter is broken...

Again, check your work. Measure the current running through just a resistor. A 1kOhm resistor connected to 5V should draw how much? A 100 ohm resistor? Put a few in parallel to increase the current flow so you can test the 10A range.
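Ohm's law (I = V / R) answers those questions; the resistor values are the ones above, and the parallel count is just an example:

```python
# Ohm's law check: I = V / R
v = 5.0  # supply voltage

for r_ohms in (1000, 100):
    i_ma = v / r_ohms * 1000
    print(f"{r_ohms} ohm across {v} V draws {i_ma:.0f} mA")

# Identical resistors in parallel multiply the current:
# e.g. ten 100-ohm resistors across 5V.
n = 10
i_parallel_a = n * (v / 100)
print(f"{n} x 100 ohm in parallel draws {i_parallel_a:.1f} A")
```

So 1kOhm gives you a known 5mA to verify the low ranges, and a bank of parallel 100 ohm resistors gets you into territory the 10A range can actually resolve.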

> Each alkaline AA should be 2700mAh, or 21.6Ah for the 8 of them.

How are your batteries configured? I'm guessing in series to get up to 12V. Batteries in series have the same current capacity as a single cell. (Batteries in parallel give you more current capacity.)

Assuming your batteries are in series, the simple formula gives 2700mAh / 600mA = 4.5 hours.

However, the 2700mAh rating is estimated by the battery manufacturer under low-current-draw conditions. 600mA probably isn't a low-current draw, so 4.5 hours is the absolute best you could get, with real-life runtime being much, much less.
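The back-of-envelope runtime works out like this. The derating factor below is only an illustrative guess; the real number comes from the manufacturer's discharge curves:

```python
capacity_mah = 2700  # single AA rating; a series pack keeps this, it doesn't add
draw_ma = 600        # estimated circuit draw from above

best_case_h = capacity_mah / draw_ma
print(f"Best case: {best_case_h:.1f} hours")

# At high draw, effective capacity drops sharply. 50% is only an
# illustrative guess; check the datasheet's discharge curves.
derating = 0.5
print(f"With {derating:.0%} derating: {best_case_h * derating:.2f} hours")
```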