I am creating a project where I need to send data every 8-10 milliseconds (more details below). At the moment I'm testing on both an Arduino MKR WiFi 1010 and an Arduino Uno. However, I'm getting very inconsistent readings, and I don't think the boards are able to output that quickly in a reliable way: e.g., I set up a serial print and get data every 10 ms, then every 20 ms, then 35 ms, and so on. The boards aren't able to keep up.
I'm assuming the board with the most reliable clock would be the Arduino Giga R1, but I'm not sure and would appreciate any suggestions.
My project: gather readings from a force-sensing resistor every 8-10 milliseconds for 10 seconds.
Store all the data points on the board, package them, then send them wirelessly to my computer and export them as a CSV. Having a board that can take reliable readings every 8-10 milliseconds is really important.
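For reference, here is a minimal sketch of what that spec implies, under stated assumptions: the FSR divider feeds A0 (hypothetical wiring), the period is 10 ms, and the whole 10 s capture is buffered in RAM before anything is printed, so transmission never disturbs the sampling. The wireless send is left out; the sketch dumps the CSV over USB serial as a stand-in.

```cpp
// A sketch of the capture-then-dump approach (assumptions: FSR divider
// on A0, 10 ms period, 10 s window). All samples are buffered in RAM
// first, so printing never disturbs the sampling loop.
const uint16_t SAMPLE_INTERVAL_MS = 10;
const uint16_t NUM_SAMPLES = 1000;        // 10 s / 10 ms
uint16_t samples[NUM_SAMPLES];            // 2 KB: fine on the MKR, too big for an Uno

void setup() {
  Serial.begin(115200);
  while (!Serial) {}                      // MKR: wait for the USB port to open

  uint32_t next = millis();
  for (uint16_t i = 0; i < NUM_SAMPLES; i++) {
    while ((int32_t)(millis() - next) < 0) {}  // busy-wait for this sample's slot
    next += SAMPLE_INTERVAL_MS;                // deadline advances by exactly 10 ms
    samples[i] = analogRead(A0);
  }

  Serial.println("time_ms,reading");      // CSV header
  for (uint16_t i = 0; i < NUM_SAMPLES; i++) {
    Serial.print((uint32_t)i * SAMPLE_INTERVAL_MS);
    Serial.print(',');
    Serial.println(samples[i]);
  }
}

void loop() {}
```

The 2 KB buffer fits comfortably in the MKR's 32 KB of SRAM but would not leave room on an Uno's 2 KB, which alone may decide the board choice.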
It takes roughly 1 millisecond to transmit a single character at that glacial baud rate.
Bump that up to 115200 baud, which is the maximum the Arduino serial monitor can handle. With other devices, much higher baud rates are usable.
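As a rough sketch of that suggestion (assuming the sensor on A0): at 115200 baud each character takes about 87 µs, so a ~15-character line costs around 1.3 ms of transmit time, comfortably inside a 10 ms budget.

```cpp
void setup() {
  Serial.begin(115200);             // up from the usual 9600
}

void loop() {
  Serial.print(millis());           // timestamp in ms
  Serial.print(',');
  Serial.println(analogRead(A0));   // raw FSR reading (assumed wiring)
  delay(10);                        // crude pacing; a deadline-based version appears below
}
```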
A bit more information on the project might help; there may be other solutions to what you are trying to do.
For example, how fast does the system you are measuring respond, i.e. is 10 ms correct? Do you need 10 s of data? What is the response time of your force sensor?
Any Arduino with a crystal instead of a resonator will be the most accurate (note that the classic Uno's main MCU actually runs from a ceramic resonator; its crystal clocks the USB chip), even one you make yourself from a bare AVR chip. Also note that AVRs are rated to run at up to 20 MHz, and ARM Arduinos/compatibles clock as high as 600 MHz that I know of!
Yes, you can make your own crystal-clocked AVR on a breadboard or perfboard, quick and cheap.
My project is about gathering pressure information. I need to see how much pressure is being applied to a ball upon its release. It's imperative I get data as quickly as possible as the release happens quickly and I need to see how much pressure is being applied milliseconds before release. Then I need to store that information and do it for multiple throws.
Thank you for the suggestion. As you can see in the code, I changed the baud rate to 115200 and am getting more accurate data. The only issue is that when I make my delay 10 ms I get data every 11 ms, and when I make the delay 9 ms I get data every 10 ms. Is that just an issue with the delay() function itself?
I am pretty sure this is because Arduinos don't count exactly 1 millisecond; it's more like 1.024 milliseconds, according to How accurate is millis()? - #9 by nickgammon.
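The measured 11 ms likely also includes the loop's own work: delay(10) pauses for 10 ms *after* the sampling and printing, so the true period is 10 ms plus the work time. A common fix (a sketch, not the only approach; the sensor is again assumed on A0) is to schedule against a running deadline instead:

```cpp
// Deadline-based pacing: the deadline advances by exactly 10 ms each
// cycle, so the work time is absorbed instead of added to the period.
uint32_t nextSample;

void setup() {
  Serial.begin(115200);
  nextSample = millis();
}

void loop() {
  if ((int32_t)(millis() - nextSample) >= 0) {  // our slot has arrived
    nextSample += 10;                           // next deadline, drift-free
    Serial.print(millis());
    Serial.print(',');
    Serial.println(analogRead(A0));             // assumed sensor wiring
  }
}
```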
I have a similar application here: I am trying to design an e-bike computer that monitors the battery voltage, current, and charge, plus the bike's speed, odometer, etc. Most important is determining the charge; there's a process known as coulomb counting where we basically integrate current with respect to time. I'll explain it in case you are not familiar with integral calculus.
Since we are doing this in real time, it's impossible to integrate exactly, so dt has to be non-zero. In my case I made it 20 milliseconds; in other words, the sampling rate is 50 hertz. The ammeter reads the current every 20 milliseconds.
So why am I blabbering all this? Well, here is the point. Similar to your issue, recall that Arduinos count 1 ms as 1.024 ms. This eventually leads to errors: 0.024/1 works out to 2.4%, which is quite significant, not to mention we are literally doing real-time Riemann sums, which are not as accurate as integrating with dt ≈ 0. Basically it's like using a digitally converted version of an analog signal: analog signals are infinitely smooth, while digital signals are blocky; more bits make them smoother, but they will never be infinitely smooth.
This issue will also apply to the speedometer and odometer, since those calculations depend on time. With the ammeter and voltmeter, the timing inaccuracy could be terrible and they would still give useful readings, albeit probably laggy.
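For what it's worth, one way to sidestep the tick-length question entirely is to measure dt rather than assume it. Below is a sketch of the coulomb-counting loop described above, with a hypothetical readCurrentAmps() standing in for the real ammeter: whatever interval actually elapsed is what gets integrated, so a 1.024 ms tick no longer matters.

```cpp
// Coulomb counting as a running Riemann sum, with dt measured by
// micros() rather than assumed to be a fixed 20 ms.
float coulombs = 0.0;          // accumulated charge in coulombs (A*s)
uint32_t lastMicros;

float readCurrentAmps() {
  // Placeholder: substitute the real current-sensor read here.
  return analogRead(A0) * (5.0 / 1023.0);   // hypothetical scaling
}

void setup() {
  Serial.begin(115200);
  lastMicros = micros();
}

void loop() {
  uint32_t now = micros();
  if (now - lastMicros >= 20000UL) {        // ~50 Hz sample slot
    float dt = (now - lastMicros) * 1e-6;   // measured dt in seconds
    lastMicros = now;
    coulombs += readCurrentAmps() * dt;     // one Riemann-sum step
  }
}
```

The unsigned subtraction also keeps the math correct across the ~71.6-minute micros() rollover.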
So far, it seems that the AVR Arduinos count a millisecond as 1.024 ms. Is this the case for the Giga? I am planning to use the Giga for the project. Any advice would be appreciated from anyone reading this, and I hope my answer helped.
The major factors in clock 'drift' are temperature and the type of on-board clock source. A crystal is good, a resonator not so much. For external clock sources: a TCXO is better, NTP is better, atomic is best.
millis() updates every 1024 micros. The value in the low 8 bits therefore falls a tiny fraction further behind on each update, and 6 times out of 256, millis() skips a number to catch up the lateness. You will never see the low 8 bits == 0xFF.
Upshot: bits 8 to 31 count ~0.25 s (256 ms) intervals as closely as the clock source allows.
When doing unsigned subtraction to determine elapsed time, millis() is accurate to +/- 1 ms.
If that's not close enough, use micros().
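A quick sketch of that unsigned-subtraction idiom, for anyone landing here later (delay(250) is just a stand-in for real work):

```cpp
// uint32_t subtraction stays correct across the millis() rollover at
// ~49.7 days; micros() rolls over at ~71.6 minutes and works the same way.
void setup() {
  Serial.begin(115200);

  uint32_t startMs = millis();
  uint32_t startUs = micros();
  delay(250);                            // stand-in for real work

  Serial.print("elapsed ms: ");
  Serial.println(millis() - startMs);    // +/- 1 ms, per the post above
  Serial.print("elapsed us: ");
  Serial.println(micros() - startUs);    // 4 us resolution on a 16 MHz AVR
}

void loop() {}
```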