Hi there, it's me again. I'm building a project in which I could really use some precise timing. (For measuring time periods longer than 1 hour, +-1% is not OK, I'm afraid.) I have two I/O pins free (that's IC pins 1 and 2, counted the standard way, clockwise starting from top right) and am now unsure how to proceed.

One way would be to buy an external RTC chip. The problem with those is that they need I2C, which seems somewhat complicated to me, and furthermore they cost about twice the price of an ATtiny.

Another solution would be an external crystal. However, correct me if I'm wrong, but that would mean: 1) changing fuses (which to me sounds like complete sorcery); 2) my program would run slower, so I'm not sure I wouldn't run into timing issues with speed-critical functions such as tone(); 3) I'm using the arduino-tiny core, and there isn't any option that says "ATtiny85 with external crystal at 32.768 kHz" or the like. So does that mean it wouldn't work? I could also get a 1 MHz crystal (for which there is an option in the menu), but then I'd lose the lower power consumption I'd have with a watch crystal.

The last option I could think of was attaching some sort of timer and letting it interrupt the processor every [known period of time]. But the 555 timer chip operates at higher voltages than my other components do, and its power consumption also makes it unsuitable for my project.

So... which way do you think is best? I believe I just need a small hint to get me going. Thanks in advance.
You can use an 8 MHz or 16 MHz crystal on pins 2 and 3. That should be as accurate as the crystal you get. That would require some fuse changes but that shouldn't be too hard.
If the voltage and temperature are stable, you can apply a software correction to the built-in RC clock to correct for its errors. You could measure the clock over a long period (100 minutes?) and determine how many milliseconds per minute it's drifting. Then you can correct the number of milliseconds you count as a "minute". That should get you very close.
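A rough sketch of what that correction could look like (the drift figure and names below are invented for illustration, not measured on real hardware): once you know how many milliseconds per real minute the chip gains, you count that corrected number of local milliseconds as one "minute".

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical calibration result: against a reference clock, the internal
// RC oscillator was observed to gain 180 ms per real minute (0.3% fast).
// So instead of counting 60000 local ms per minute, count 60180.
const uint32_t MS_PER_MINUTE_NOMINAL = 60000UL;
const int32_t  DRIFT_MS_PER_MINUTE   = 180;  // measured over ~100 minutes

// Length of one real minute, expressed in local (uncalibrated) milliseconds.
uint32_t correctedMsPerMinute() {
  return MS_PER_MINUTE_NOMINAL + DRIFT_MS_PER_MINUTE;
}
```

Assuming voltage and temperature stay close to the calibration conditions, this removes the bulk of a 0.3% error; the residual is whatever the drift measurement itself was off by.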
- It's not that bad.
- Your program would run more quickly. I'm assuming you're only running at 8 MHz now, but a crystal will allow 20 MHz.
- If you're using the MIT "High Low Tech" files for the ATTiny you'll want to be aware that they updated them for Arduino 1.0, and with that update they included configurations for the ATTiny with external crystals.
Btw, you only get +/- 1% accuracy with the ATTiny internal oscillator if you calibrate it. It's factory calibrated to +/- 10%.
For measuring time periods longer than 1 hour +-1% is not ok
What rate error is OK?
OK. So first of all, thanks for all your suggestions! But I believe in some cases we might be talking past each other; I probably haven't expressed some things clearly enough, so let me clarify.

I'm using the awesome arduino-tiny core (Coding Badly: If I get it right, you are the one behind it, so thank you very much, it's awesome). I'm running my chip at 1 MHz because I simply don't need more, so using a 20 MHz crystal/oscillator/whatever would be overkill, furthermore resulting in decreased battery life. I need clock-like accuracy; a few minutes a week is OK, a quarter of an hour a day (with the calibrated internal oscillator) is not.

So... Yeah, software calibration seems like the way to go. I actually used it before, but thought of it as a sort of temporary solution to avoid having to set OSCCAL. So thank you very much again! I'm going to implement some sort of calibrating routine: if I run it at :00 or :30 real time, it will round the internal time to the nearest half hour, calculate the drift and set the offset value. So it'll basically get more precise with every calibration. Then I'll somehow have to solve the voltage and temperature dependence issues. But that's it so far. Thanks once more.
Coding Badly: If I get it right, you are the one behind it, so thank you very much, it's awesome
Thank you!
so using 20 mhz crystal/oscillator/whatever would be an overkill, furthermore resulting in decreased battery life
That may not be true. The argument goes something like this: Presumably, there is a certain amount of "overhead" power; power consumed in between processor clock pulses. If the clock is fast, the processor can finish its work more quickly and get back to sleep sooner which makes the overhead power much less significant. So, if the processor spends most of the time asleep and has a fairly fixed amount of work to do when awake, a faster clock may actually result in a longer battery life.
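The argument above can be put into back-of-envelope numbers. Everything below is invented but plausible (no datasheet values): model the awake current as a clock-proportional part plus a fixed part (regulator, brown-out detector, etc.) that flows whenever the chip is awake regardless of clock speed. A faster clock shrinks the awake time, so the fixed part costs less charge per job.

```cpp
#include <cassert>

// Charge consumed per wake-up "job", in microcoulombs (uA * s).
// perMHz_uA: current that scales with clock speed (hypothetical value)
// fixed_uA:  current drawn whenever awake, independent of clock
// cycles:    amount of work per job, in CPU cycles
double chargePerJob(double perMHz_uA, double clock_MHz,
                    double fixed_uA, double cycles) {
  double activeTime_s   = cycles / (clock_MHz * 1e6);
  double awakeCurrent_uA = perMHz_uA * clock_MHz + fixed_uA;
  return awakeCurrent_uA * activeTime_s;
}
```

With, say, 300 uA/MHz, 200 uA of fixed overhead and 8000 cycles of work, the 1 MHz job costs 4.0 uC while the 8 MHz job costs 2.6 uC: the faster clock wins, exactly because it "races to sleep".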
Yeah, software calibration seems like the way to go.
I've gotten very good results with that. Using fractional math (unsigned/unsigned) makes software calibration fairly easy.
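One way that unsigned/unsigned fractional correction might look (the ratio below is illustrative, not a real calibration result): store the correction as a numerator/denominator pair and multiply before dividing so no floating point is needed.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical calibration: over one interval the oscillator counted
// 60180 local ms while a reference clock counted 60000 real ms.
// Corrected time = raw * NUM / DEN, done entirely in integers.
const uint64_t NUM = 60000;  // real ms per calibration interval
const uint64_t DEN = 60180;  // local ms measured over the same interval

// Rounded integer scaling; 64-bit math avoids overflow of raw * NUM.
uint64_t correctedMillis(uint64_t rawMillis) {
  return (rawMillis * NUM + DEN / 2) / DEN;
}
```

Multiplying first keeps the full precision of the ratio; dividing first would throw most of it away.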
OK, so after numerous days of debugging, I found out what I could have known beforehand: an integer isn't really the same as a float, and after you convert it to float, you solve the mysterious problem that has been bugging you for hours. But I also exceed progmem by 330 bytes with everything optimized to its finest. DAMNIT!

Btw, thanks for that arduino-tiny core you released just hours ago, Coding Badly. Just thinking... Aren't you by any chance aware of any functions of your core that could be made shorter but aren't for sake of code readability or lack of time? Cause that would be a shot in the arm. (I'm also using the attachPcInterruptSimple library, so... yeah.)

Now when I think about it, I might be able to convert to integer math only, but that would mean drastically changing the code, I believe.
Edit: My belief was mistaken. Nevermind that.
thegoodhen:
Aren't you by any chance aware of any functions of your core that could be made shorter but aren't for sake of code readability or lack of time?
It is possible to make every function in the core smaller. The tradeoff for the user is that constant pin numbers have to be used (which is very typical, so in most cases the tradeoff is insignificant). The tradeoff for the developer is that there have to be two versions of every function.
Lack of time has prevented me from getting version 2 finished.
Cause that would be a shot in the arm.
It is. Using what I have finished for the version 2 core, I was able to fit an application onto an ATtiny13 that otherwise needed an ATtiny45.
(I'm also using the attachPcInterruptSimple library, so... yeah.)
Excellent.
Now when I think about it, I might be able to convert to integer math only, but that would mean drastically changing the code, I believe.
In my experience, converting to integer math only (or fixed-point math) requires just a bit more care and a different way of thinking but isn't too drastic.
Thanks very much. I was able to change the code shortly after I posted, so now it works like a charm: it's off by less than a minute, and it's been running for two days now. My original code was:
diff = millis() - lastmillis;
realdiff = round((float)diff / hh) * hh;  // round to the closest half hour; hh = constant, half an hour in milliseconds
drift = realdiff - diff;
drift = ((float)2 / (realdiff / hh)) * drift;
lastmillis = millis();
int hrz = hour();
int minz;
if (minute() > 15 && minute() < 45)
  minz = 30;
else
{
  if (minute() > 45)
    hrz += 1;
  minz = 0;
}
setTime(hrz, minz, 0, 1, 1, 1);
Now, I managed to avoid fractional math...
diff = millis() - lastmillis;
unsigned long compval = hh / 2;
int i;
for (i = 1; i <= 10000; i++)  // round to the closest half hour using a different (yet working) method: count how many half hours fit into "diff" (the time difference between the last sync and this one)
{
  compval += hh;
  if (diff <= compval)
    break;
}
realdiff = i * hh;
drift = realdiff - diff;
drift = (2 * drift) / (realdiff / hh);
lastmillis = millis();
int hrz = hour();
int minz;
if (minute() > 15 && minute() < 45)
  minz = 30;
else
{
  if (minute() > 45)
    hrz += 1;
  minz = 0;
}
setTime(hrz, minz, 0, 1, 1, 1);
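As a side note on the rounding loop above: for diff of at least half an hour, the same round-to-nearest-half-hour result can be had with a single rounded integer division (no loop, no iteration cap). This is a sketch using the same hh convention as the post; behaviour may differ from the loop only on exact boundary ties such as diff == 1.5 * hh.

```cpp
#include <cassert>

// diff: elapsed ms since the last sync; hh: half an hour in ms.
// Adding hh/2 before dividing turns truncation into round-to-nearest.
unsigned long roundToHalfHour(unsigned long diff, unsigned long hh) {
  return ((diff + hh / 2) / hh) * hh;
}
```

Besides being shorter, this removes the arbitrary 10000-iteration limit and runs in constant time.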