I am working on a device that measures the width of filament as it is extruded. A laser shines through a pinhole at a phototransistor that sits behind a pinhole of its own, the idea being to keep both the light source and the sensor smaller than the 1.75 mm filament being measured. The laser and sensor are swept past the filament together, and the width is determined from how long, in microseconds, the phototransistor is shaded.
I have the emitter of the phototransistor connected to both D2 and D3 of a Nano, the two pins with hardware interrupts. One watches for FALLING and records the start time; the other watches for RISING and records the end time. The duration is only calculated for sweeps in one direction, since the two directions tend to give different results. Here is a graph of the phototransistor's analog output, with lines showing where the pin changes happen:
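Stripped to its essentials, the interrupt version looks roughly like this (simplified; pin constants and variable names here are illustrative, and the one-direction filter is only sketched as a comment):

```cpp
// Simplified sketch of the dual-interrupt approach described above.
const byte FALL_PIN = 2;  // goes low when the shadow arrives
const byte RISE_PIN = 3;  // goes high when the shadow passes

volatile unsigned long startUs = 0;
volatile unsigned long widthUs = 0;
volatile bool newReading = false;

void shadowStart() { startUs = micros(); }

void shadowEnd() {
  widthUs = micros() - startUs;  // unsigned math survives micros() rollover
  newReading = true;
}

void setup() {
  Serial.begin(115200);
  pinMode(FALL_PIN, INPUT);
  pinMode(RISE_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(FALL_PIN), shadowStart, FALLING);
  attachInterrupt(digitalPinToInterrupt(RISE_PIN), shadowEnd, RISING);
}

void loop() {
  if (newReading) {
    noInterrupts();  // copy the shared value atomically
    unsigned long w = widthUs;
    newReading = false;
    interrupts();
    // (only readings from the chosen sweep direction are kept)
    Serial.println(w);
  }
}
```

Both interrupts watch the same phototransistor signal, just on opposite edges.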
I would like those interrupts to fire closer to the drop from 1023, since those corners are sharper and more consistent. They usually trigger around 400 counts going down and 500 going up, which is about what I'd expect, since those readings sit in the 2.0-2.5 V range where the pins' digital threshold lies. That trigger point is somewhat variable, however, depending on the slope from 1023 down to the bottom.
I can do this based on the analog reading, but I've found that the measurement resolution is then limited by the loop execution time. The analog version involves more code, since it needs an analogRead, some smoothing, and extra logic to decide whether the transition is light->dark or dark->light. The durations come out as multiples of 1140 us, because the start and end times are only set when the loop gets around to them rather than at the instant the edge actually happens.
Is there a way to get the best of both: the consistency of the sharp corner and the timing accuracy of the hardware interrupt? I think the illuminated phase probably needs to stay pinned at 1023; otherwise the falloff becomes more gradual.