23-minute "ticks" in sensor input???

I have an Arduino Uno hooked up to a DS1307 real-time clock from Sparkfun, plus two light sensors: a light-to-frequency converter (TSL235R, also from Sparkfun) and a conventional photoresistor wired to an analog input pin in the obvious voltage-divider setup. Once per minute, my program counts pulses from the light-to-frequency converter for one second, reads the voltage on the analog pin, and then prints the time, voltage reading, and frequency count. The two sensors sit just outside the door, pointed at a blank patch of sky.

I would expect to see their outputs drift up and down with the motion of the sun (or interference by clouds), but instead I get something very strange: roughly every 23 minutes, both sensors show a spike in output. The photoresistor pins at the top of the scale every time.

Sometimes it’s not every 23 minutes; instead, it’s 16 minutes, then 7 minutes – adding up to 23 minutes. And sometimes the light-to-frequency sensor simply goes nuts.

I've attached a graph showing the output of the two sensors over a four-hour interval today.

So what could be the source of this craziness? First off, it's almost certainly not some kind of electromagnetic interference: I live in the country, the nearest house is more than a quarter-mile away, there's no thunderstorm activity, and I don't have any big motors turning on and off at regular intervals. The power to the Arduino comes through a standard surge protector. The power to both sensors comes from my own 5V regulator, not the Arduino's. The grounds are solid. The Arduino is connected to the USB port on my Mac, but I can't imagine how that could be the source of the problem.

I've gone over my code carefully, and there's nothing that could produce any kind of cycle, be it 16 minutes, 7 minutes, or 23 minutes. The photoresistor seems to give good results except for its hiccups. The light-to-frequency sensor looks like junk.

The problem appears to be with the Arduino. After all, photoresistors are about as simple a sensor as you can get, way too stupid to hiccup every 23 minutes. Can anybody suggest what might possibly be going wrong here, or perhaps some experiment I could do to zero in closer on the cause?

Do the sensors hiccup if you keep them in the dark?


Excellent question! I just wrapped them in a dark cloth; their output is much reduced. I'll have to wait for another 10 minutes for the next hiccup.

It would also help to see your code, while you're waiting for the glitch…


OK, I've got some results back: the photoresistor still shows the hiccup; the light-to-frequency sensor did not, though it's still meandering around senselessly.

Here’s the code. I’ve stripped out a bunch of stuff that was commented out because I haven’t turned on portions of the program yet.

#include <SoftwareSerial.h>
#include "Wire.h"

#define DS1307_I2C_ADDRESS 0x68
#define Light_Input 12

// RTC connections: SCL to A5, SDA to A4

byte previousMinute=0;

// Convert binary-coded decimal to plain decimal
byte bcdToDec(byte val)
{
  return ((val >> 4) * 10) + (val & 0x0f);
}

// Gets the date and time from the ds1307
void getDateDs1307(byte *second,
byte *minute,
byte *hour,
byte *dayOfWeek,
byte *dayOfMonth,
byte *month,
byte *year)
{
  // Reset the register pointer
  Wire.beginTransmission(DS1307_I2C_ADDRESS);
  Wire.write(0);
  Wire.endTransmission();

  Wire.requestFrom(DS1307_I2C_ADDRESS, 7);

  // A few of these need masks because certain bits are control bits
  *second     = bcdToDec(Wire.read() & 0x7f);
  *minute     = bcdToDec(Wire.read());
  *hour       = bcdToDec(Wire.read() & 0x3f);  // Need to change this if 12 hour am/pm
  *dayOfWeek  = bcdToDec(Wire.read());
  *dayOfMonth = bcdToDec(Wire.read());
  *month      = bcdToDec(Wire.read());
  *year       = bcdToDec(Wire.read());
}

void setup()
{
  byte second, minute, hour, dayOfWeek, dayOfMonth, month, year;
  Wire.begin();
  Serial.begin(9600);
  pinMode(Light_Input, INPUT);
}

void loop()
{
  byte second, minute, hour, dayOfWeek, dayOfMonth, month, year;
  boolean previousBit=true;

  getDateDs1307(&second, &minute, &hour, &dayOfWeek, &dayOfMonth, &month, &year);
  if (minute>previousMinute || (previousMinute==59 && minute==0)) {
    previousMinute=minute;

    // count transitions (two per pulse) on the light pin for one second;
    // comparing against millis()-start stays safe across the rollover
    unsigned long counter=0;
    unsigned long start=millis();
    while (millis()-start < 1000) {
      boolean currentBit=digitalRead(Light_Input);
      if (previousBit!=currentBit) {
        counter++;
        previousBit=currentBit;
      }
    }

    Serial.print(hour, DEC);
    Serial.print(minute, DEC);
    Serial.print(second, DEC);
    Serial.print("  ");

    unsigned int photocell=analogRead(A0);  // photoresistor divider (A0 assumed)
    Serial.print(photocell, DEC);
    Serial.print("  ");
    Serial.println(counter, DEC);
  }
}
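For what it's worth, the transition-counting part of the loop can be checked off-hardware. This is a plain-C sketch of the same idea; the array of sampled pin levels is my stand-in for repeated digitalRead(Light_Input) calls, purely so the counting logic is testable:

```c
#include <stddef.h>

/* Count logic-level transitions in a stream of sampled pin levels.
   On the Arduino the samples come from digitalRead(Light_Input)
   during a one-second window; each output pulse produces two
   transitions (rising + falling), so pulses = transitions / 2. */
unsigned long countTransitions(const int *levels, size_t n) {
    if (n == 0) return 0;
    unsigned long count = 0;
    int previous = levels[0];
    for (size_t i = 1; i < n; i++) {
        if (levels[i] != previous) {
            count++;                 /* level changed: one transition */
            previous = levels[i];
        }
    }
    return count;
}
```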

It would also be useful if you posted the output of that program when it has been running for, say, ten minutes.


There's a problem with variable allocation on the stack.

Does it still glitch if you move this line out of setup():

  byte second, minute, hour, dayOfWeek, dayOfMonth, month, year;

And also delete the similar line from loop(). In other words, make all those byte variables global and see if it still glitches.


Edit: On reflection I don't think there's a bug there. Though it couldn't hurt to make them global, it probably won't help.

I thought that perhaps the glitch might be a simple one-cycle hiccup, so I added this code:

    unsigned int photocell = analogRead(A0);  // A0 assumed for the photoresistor
    if (photocell == 1023) {  // pinned at full scale: wait a second, then retry
      delay(1000);
      photocell = analogRead(A0);
    }

I thought that, if I got a pinned reading, I would wait a second and try again. But it didn't change anything. My next experiment: carry out the loop at two-minute intervals instead of one-minute intervals. That way we'll see if it's a function of time or of loop counts.

Back in, well, not a flash, but in half an hour fer sure.

It took a bit more than half an hour to be sure, but now I can sing:

Ding, dong, the glitch is dead!
The mean old glitch!
The wicked glitch!
Ding, dong, the wicked glitch is dead!

I made two changes: first, I changed from one-minute intervals to two-minute intervals. More important, I shifted from using the RTC to using the delay() command. That might well be the real reason for the death of the glitch, so now I'm going to set up a two-minute delay based on the RTC. Back in a very, very long flash.

The plot thickens! This time I ran a version of the program that uses the real time clock (RTC), but runs the measurement only every 2 minutes. Once again, I have the saturated peaks at regular intervals, but now they're at intervals of 10 minutes and 2 minutes, for a total cycle of 12 minutes. I also ran an independent check: when I knew that a saturated result was about to hit, I monitored the voltage with a voltmeter. It stayed rock solid at 33 millivolts without so much as a jiggle. Meanwhile, the software recorded a peak value of 1023.

So it's now clear that the problem somehow involves the Sparkfun DS1307 RTC module. When I used it to decide when to take a reading, I sometimes got a saturation event; when I didn't use it, there were no saturation events. It is somehow central to the problem.

At this point, I have an easy solution: just use the damn thing as is, and if I get a saturation event, throw away the data and use the average of the data points on either side of it. But this is such an odd situation that I can't help but wonder what is going on. The DS1307 uses the analog input pins A4 and A5. Could there be some crosstalk? I doubt it. I'm going to try some more experiments.
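The throw-away-and-interpolate fallback described above is easy to write down. A minimal sketch in plain C, assuming the 10-bit full-scale reading of 1023 marks a bad sample:

```c
#include <stddef.h>

/* Replace any full-scale (1023) sample with the average of its
   neighbors; an endpoint just copies its single adjacent sample. */
void repairSaturated(unsigned int *data, size_t n) {
    for (size_t i = 0; i < n; i++) {
        if (data[i] != 1023) continue;
        if (n < 2) return;                 /* nothing to interpolate from */
        if (i == 0)          data[i] = data[1];
        else if (i == n - 1) data[i] = data[n - 2];
        else                 data[i] = (data[i - 1] + data[i + 1]) / 2;
    }
}
```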

Only taking one reading is a good way to end up with perplexing results. I suggest reading the ADC multiple times, averaging the results, and seeing whether the spikes shrink into the average. You could be reading it while there's AC hum from something running intermittently, or spikes from something else in your power supply.

I really doubt the 1307 is the culprit, except maybe in an indirect way. Are you doing the ADC reading while overlapping I2C or other communications — say, starting a transfer and then starting an ADC conversion right away, while the transfer could still be going on? That could add noise. Timer overflows? Is it possible that the device reset itself? Also, the first conversion after selecting a channel can be way off.
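The averaging suggestion boils down to this. On the Arduino the samples would come from back-to-back analogRead() calls; here they're passed in as an array (my choice, purely so the arithmetic is testable):

```c
/* Average a burst of readings so a single transient spike is damped.
   Sum in a wider type so several 10-bit readings can't overflow. */
unsigned int averagedReading(const unsigned int *samples, int n) {
    unsigned long sum = 0;
    for (int i = 0; i < n; i++)
        sum += samples[i];
    return (unsigned int)(sum / n);
}
```

Note that one pinned sample among four still pulls the mean well above baseline (1023 among three 100s averages to 330), so averaging shrinks spikes into the mean rather than hiding saturation entirely.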

I'm pretty sure that the DS1307 is involved in the glitch, because when I ran the program without using it for timing checks, the glitch disappeared. To defeat the possibility of overlapping activity, I inserted a 100 msec delay before reading the analog pin. That was insufficient. However, I just set up a new run with a 1000 msec delay before reading the analog pin. Let's see if that helps.

I think I've found a solution: inserting a 2 second delay just before reading the analog pin seems to have eliminated the problem. This seems to confirm the suggestion that some of the communications activity (either the I2C communication with the RTC or the 9600 baud communication with the serial port) was the cause of the problem. I hadn't expected that either process would operate asynchronously. But there you have it.

Cool. :slight_smile: 2 seconds sounds like a long time, though. The serial hardware has a built-in FIFO that will usually hold up to 16 bytes (some hold more). The software hands the UART a byte to be sent, and the UART sticks it into the FIFO and returns. At 9600 baud, it can take a smidge over 16 ms to send 16 bytes that the software thinks are already gone but are really still sitting in the FIFO. I wouldn't worry much about the I2C, especially if you are using the library.
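That "smidge over 16 ms" figure checks out: with 8N1 framing, each byte occupies 10 bit times on the wire (start + 8 data + stop). A quick sanity check in C:

```c
/* Milliseconds needed to shift n queued bytes out of a UART at the
   given baud rate, assuming 8N1 framing (10 bits per byte). */
double uartDrainMs(int nbytes, long baud) {
    return nbytes * 10.0 * 1000.0 / (double)baud;
}
```

At 9600 baud, 16 bytes come to 160 bits, or about 16.7 ms.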