EXACTLY one second

First of all, let me give you some background on my project. I built an entire digital clock out of 74161 binary counters that get decoded into decimal with 7447s. My 1 Hz oscillator just broke, and until I get another, I need to simulate a 1 Hz signal with the Arduino. I have tried using digitalWrite() followed by delay(), and this is surprisingly inaccurate. Is there another way to set a pin high, at maybe a 20-50% duty cycle, EXACTLY every second? This is for a clock project. The counters detect a rising edge, so it doesn't matter how long the pin stays high. Thanks in advance!

Bit of a hint:

time = millis();
if (time >= count) {
  // do your pin manipulating here
  count = count + 1000;
}
// do not use delay()

Yeap, I’d do something like this:

const int pin = 13;              // example pin driving the counter
unsigned long time = millis();

void setup() {
  pinMode(pin, OUTPUT);
}

void loop() {
  while (millis() <= time + 500) {
    digitalWrite(pin, HIGH);
  }
  while (millis() <= time + 1000) {
    digitalWrite(pin, LOW);
  }
  time = time + 1000;
}

I think you want long time = millis(), and not int.

Plyggy and TomServo, your code breaks at the millis() overflow; in TomServo's case, even if you were using unsigned long instead of int.

I would use code like this to get 1 Hz and to avoid accumulating errors. This way, in the long run, you have the precision of the millis() timer.

const int outputPin = 13;

void setup() {
  pinMode(outputPin, OUTPUT);
}

void loop() {
  static unsigned long lastmillis = 0;
  if (millis() - lastmillis >= 1000) {
    // Start the new period without accumulating errors
    lastmillis += 1000;

    // Generate the output pulse (about 20% duty cycle)
    digitalWrite(outputPin, HIGH);
    delay(200);
    digitalWrite(outputPin, LOW);
  }
}

Korman

I think you want long time = millis(), and not int.

Not long, but unsigned long, unsigned short or unsigned char.

unsigned char - intervals from 0 to 127 ms
unsigned short - intervals from 0 to 32767 ms
unsigned long - intervals from 0 to 2147483647 ms

void loop() {
  static unsigned char time1 = 0;
  if ((unsigned char)(millis() - time1) >= 100) {
    time1 += 100;
    // code to run every 100 ms goes here
  }
}

A question

How is it possible to create an accurate clock cycle with a device running a "hand written" programme, since surely at the end of each cycle the programme must run again to generate the next cycle? Surely the instructions take time in their own right.

Wouldn't it be simpler to use an RTC that is designed to generate accurate clock pulses?

Or am I missing something? (My wife says I am, but that's another matter!)

jack

The easiest would be to get a random clock crystal and a counter. But since millis() is tied quite closely to the CPU clock, you inherit its precision. All the code in between might shift the signal a little, but the error won't accumulate, so it doesn't matter in the long run.

Korman

You could also use microseconds, which may be more precise. The Arduino executes an instruction roughly every 62.5 nanoseconds (62.5 x 10^-9 s, one cycle at 16 MHz). If you are executing code that increments a 32-bit counter and checks whether 16,000,000 ticks have occurred, and then you do something within a few more clock cycles, you won't be too far off from 1 second.
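
For example, a sketch along these lines would pulse the pin once a second off micros(); the pin number and pulse width are just placeholders, and the rollover-safe subtraction is the same pattern as the millis() version above:

const int outputPin = 13;            // placeholder pin driving the counters

void setup() {
  pinMode(outputPin, OUTPUT);
}

void loop() {
  static unsigned long lastmicros = 0;
  // Same non-accumulating comparison as before, just with microsecond resolution
  if (micros() - lastmicros >= 1000000UL) {
    lastmicros += 1000000UL;         // schedule the next period from the previous one
    digitalWrite(outputPin, HIGH);
    delay(200);                      // ~20% duty cycle; the counters only need the rising edge
    digitalWrite(outputPin, LOW);
  }
}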

Yes, unsigned long. My mistake! Yes, it does break after overflow; I just figured it didn't need to run that long (since it is only a temporary fix). I thought that in 50 days he might have a new 1 Hz oscillator :) But yes, your code is better, as mine would accumulate error over time, whereas yours will not. I've modified mine now to prevent error accumulation :)

My code is long-term tested in various long-running projects. It coughs briefly at the rollover before continuing, since the count also rolls over. I took it as self-evident that all the time-related variables should be the same type as millis(), i.e. unsigned long. The accuracy is as good as the clock crystal on the board. Two of my three Duemilanoves keep time to within a second a day (it varies slightly with ambient temperature; in October one kept to within 10 seconds over 49 days). A Uno with a ceramic resonator is off by around a minute a week. All of them run fast rather than slow.

How is it possible to create an accurate clock cycle with a device running a "hand written" programme, since surely at the end of each cycle the programme must run again to generate the next cycle? Surely the instructions take time in their own right.

The millis() count is based on a timer that runs continuously and completely independently of the program. You can miss recognizing a tick for a few microseconds, but it "catches up" shortly thereafter, and should always have an accurate time to within +/- 1 ms or so (barring BAD program behavior like disabling interrupts for multiple milliseconds at a time...). Now, if you call delay(ONESEC) repeatedly, each call can be off by about +/- 1 ms, plus your program's execution time even if it is reasonably well behaved, and you can accumulate pretty significant errors relatively quickly.
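
For contrast, this is roughly what the drifting delay()-only approach looks like (pin number and pulse width are placeholders). Every pass takes the two delays plus the digitalWrite and loop overhead, so each "second" comes out slightly long and that excess adds up tick after tick:

const int outputPin = 13;   // placeholder pin

void setup() {
  pinMode(outputPin, OUTPUT);
}

void loop() {
  digitalWrite(outputPin, HIGH);
  delay(200);               // pulse the counters
  digitalWrite(outputPin, LOW);
  delay(800);               // nominally completes the second, but the overhead is never subtracted
}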

My code is long-term tested in various long-running projects. It coughs briefly at the rollover before continuing, since the count also rolls over.

Well, a known bug is better than random behaviour, but a system without the bug is even better. The “coughing” at rollover might be acceptable for your applications, but why not do it right instead and stop worrying about rollover completely? Also, I don't think it's good practice to give beginners samples with bugs.

Korman

In his first post he said it was a get-me-by until he gets a new 1 Hz oscillator, and 7 weeks before it rolls over should give him plenty of time. It firing twice instead of once every 7 weeks (if it even runs that long) doesn't justify the extra coding, in my opinion.

So why not do it right instead?

Korman