I made a project where I want to detect rotations of a disk.
The disk has a black marking, and to detect it I use a line detector, a QRB1114 (basically an IR LED and a phototransistor in one housing).
Because I want to integrate this with a routine that uses delay(), the sensor is used to trigger an interrupt; that way I always catch the black mark, even while noop'ing away.
However, connecting an "analog" sensor to a digital pin is somewhat challenging when you don't want to use additional circuitry.
When the mark comes flying by, the interrupt is triggered 40+ times because the analog value jumps up and down between the logical high and low levels before settling.
To debug what the sensor was seeing, I decided to connect it to an analog input as well and read the value out on an LCD. Strangely, doing this seems to stop the jitter completely.
About the circuit:
The QRB collector is pulled high with 5.6 kΩ and connected to both pin 2 and A3.
The analog value is ~40 (~0.2 V) when no mark is present and ~600 (~2.9 V) when the mark is present (on a 0–1023 scale).
This is far from optimal, as the high level should be close(r) to 5 V, but it seems to work now that pin 2 and A3 are directly connected.
Could anybody tell me why this is the case, i.e. why the jitter stops when pin 2 and A3 are connected together?
And how can I (easily) obtain the same result without using A3?
You are adding extra capacitance by connecting another pin, and that capacitance further filters the oscillations of the signal (beyond the capacitance of the first pin). That would be my best guess.
You can test it yourself by adding a small capacitance (<1nF or so) from pin 2 to GND, then disconnect pin A3.
Thanks!
I was thinking in that direction, but I guess the caps I was playing with yesterday were too big to make it work properly. Adding a 470 pF cap now seems to do the trick (I still need to run it for a while to be 100% sure, though).
But as far as I have read, every port on the ATmega8/168 (328?), i.e. the Arduino, already has a Schmitt trigger, so wouldn't this be overkill?
Also, I am aiming to keep the peripheral components to an absolute minimum, and if I can't manage without them, they should preferably be passive.
(I know I'm not making it easy).
I want to detect when the disk has completed one rotation and determine the rotation time (every time). It is a (very) old type of electricity meter with a rotating disk that carries a black mark.
The trick is that the disk sits behind a sealed (luckily transparent) encapsulation, so the minimum distance to the disk is fixed, which limits how the sensor can be placed.
Never use delay()
You are so right, but it is (was) nevertheless the easiest solution, and as such not the problem. The process is not (very) time sensitive; I just want to upload samples once a minute (+/- a few seconds).
But that is the problem: you can't depend on what will happen during delay().
AFAIK, when the sensor is attached to an interrupt, delay() is briefly suspended to process the (new) user interrupt, so that should not be a problem. Or do user-defined interrupts not work during delay()? (That would be a serious limitation, but I might be wrong.)
The main problem (I believe) is that when the black mark on the disk passes the sensor, the analog voltage from the phototransistor jumps up and down between logical high and low (I assume due to the uneven black paint mark) and thus triggers the interrupt several times. The same thing likely occurs when the mark first enters view. I tried to solve this as RuggedCircuits suggested, with a small cap on the sensor output, but so far that does not seem to completely solve the issue.
I might have to look at the debounce trick, but then again I want to do as little as possible inside the interrupt so as not to skew the delay() timing too much.
It sounds like you are getting multiple triggers from your sensor. The best way to remove these is to use your sensor to fire a monostable, e.g. an NE555, 74LS121, or 74LS123. This produces a fixed-length output pulse on the first edge it sees. You can easily make sure this pulse is not longer than the fastest expected pulse, and it is only the time of the pulse's leading edge you are interested in anyway.
I think the debounce approach helped to solve the problem.
I have now changed the interrupt to trigger on logic level CHANGE rather than FALLING, and at the same time added a "dead zone" of 50 ms after each detected interrupt during which further interrupts are ignored. This also requires keeping track of the reason for the interrupt, as it occurs both when the mark enters and when it leaves the sensor's range.
It turned out to be a bit challenging to account for millis() overflowing in the timing calculations, but I think I finally got that right too.
I have the same kind of meter and I monitor my electricity usage with an Arduino. I have it hooked to an analog pin (the disk isn't perfectly true on its edge, and the signal often hovers around the halfway mark at some points in the rotation, which completely screws up trying to use interrupts) and do the timing for everything else in code without delays. I have come to regard the use of delay() in sketches as a botch-up...
I don't make any allowance for millis() overflowing; the computer monitoring the Arduino is set to reboot once a week, which takes the Arduino down with it. I get a predictable glitch at 5:07 AM on Monday mornings...
I will definitely look into ditching the delay() in the main loop, but I believe the interrupt now works reliably enough (for this purpose).
The Arduino runs standalone with an Ethernet shield and is not hooked up to a PC, so I don't have the privilege of doing timing and other corrections outside the Arduino.