I have written a program that controls a process.

I am using a bare ATmega328P at 1 MHz and AS6 (Atmel Studio 6).

My code uses a timed interrupt every second to update a count, so I have one second between interrupts, which is a nice time frame to work in, as so much can be done.

Anyway, I have a couple of sensors that I read, and what I want to do is take as many readings as possible in that one-second window and then average them, so my result is updated every second.

I could use a fancy algorithm (which I haven't ruled out), but I don't think I need to bother. I was thinking of a simple array:

ARRAY[0] += ADC_DATA; ARRAY[1] += 1;

That adds all the samples together, with the total number of samples held in the other entry.

AVERAGE = ARRAY[0] / ARRAY[1];

The average is simply the one divided by the other, but what I am worried about is overflow, as at the moment I am not sure how many samples there will be.
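As a rough sketch of that sum-and-count idea (names like add_sample/take_average are my own, and I'm assuming a 10-bit ADC): a reading is at most 1023, so a uint32_t sum only overflows after about 4.2 million samples, far more than a 1 MHz AVR can collect in one second.

```c
#include <stdint.h>

/* Running accumulator, reset once per second.
   Max 10-bit reading is 1023, so uint32_t is safe for
   4294967295 / 1023 ~= 4.2 million samples per window. */
static uint32_t adc_sum = 0;
static uint16_t adc_count = 0;

/* Call after each ADC conversion in the main loop. */
static void add_sample(uint16_t adc_data)
{
    adc_sum += adc_data;
    adc_count++;
}

/* Call once per second (e.g. when the timer ISR sets a flag):
   returns the integer average and clears the accumulator. */
static uint16_t take_average(void)
{
    uint16_t avg = (adc_count > 0) ? (uint16_t)(adc_sum / adc_count) : 0;
    adc_sum = 0;
    adc_count = 0;
    return avg;
}
```

If the ISR itself touches the accumulator, the reads/resets in the main loop would need to be guarded (interrupts briefly disabled) since they are not atomic on an 8-bit AVR.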

Now I could just go with a 64-bit long long, and that's a big number! A 32-bit long would do it, but I want to understand the best way to approach this. The large numbers are unwieldy (not that it's a problem), but I am also considering using a float array with the ADC data divided by 1000.

Can I ask:

A) What are the drawbacks (aside from speed issues) of dividing by 1000 and saving the ADC data as a float?

B) What is the maximum count a float can go to without overflowing?

C) Which way would you use to calculate an average of many samples?

Thanks in advance