Hello, I'm looking for an answer to a hopefully simple question.
I'm hoping to implement a timer interrupt that fires at a multiple of the rate of a digital gate signal coming into the device. I can use micros() to measure the time between incoming pulses, but I've been having trouble finding the right library to turn a fraction of that interval into a timer interrupt with good accuracy.
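For reference, a minimal sketch of the measurement side I'm describing looks roughly like this (I'm assuming the gate comes in on pin 2; any interrupt-capable pin on the Nano Every would do):

```cpp
// Measure the time between rising edges on the incoming gate.
const byte gatePin = 2;                  // assumed input pin

volatile unsigned long lastEdgeUs = 0;   // micros() at the previous rising edge
volatile unsigned long periodUs   = 0;   // measured time between rising edges

void onRisingEdge() {
  unsigned long now = micros();
  periodUs = now - lastEdgeUs;           // interval between this edge and the last
  lastEdgeUs = now;
}

void setup() {
  Serial.begin(115200);
  pinMode(gatePin, INPUT);
  attachInterrupt(digitalPinToInterrupt(gatePin), onRisingEdge, RISING);
}

void loop() {
  noInterrupts();
  unsigned long p = periodUs;            // copy the shared value atomically
  interrupts();
  Serial.println(p);                     // latest measured period in microseconds
  delay(250);
}
```

That part works fine; it is the "multiply this interval with a timer interrupt" part that I am stuck on.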
I tried the "arduino-timer" library first, but from searching around it looks like it does not let you change a timer's interval after the timer has been started. I tried setting the timer to 1 microsecond and incrementing a counter variable to fake what I'm after, but the timing results were fairly inaccurate.
I then tried the "AsyncTimer" library, but its interval argument only takes milliseconds, which is also not fine-grained enough for my purposes.
I did some Google searching but did not come up with an obvious solution, so I figured this is the right place to ask. I'm using an Arduino Nano Every and the Arduino IDE on a Windows 10 laptop.
Thank you for taking the time to read this and respond.
The Arduino Uno is well known, and there is a lot of example code for it.
The Nano Every also has a microcontroller from the AVR family, but it is not the same one.
Could you draw on a piece of paper what you want?
Is there an incoming pulse that is high for a short time, and you want to lengthen the pulse while keeping the frequency the same? Or do you want an output frequency that is, for example, ten times lower?
OK, I have a diagram here that should explain what I am trying to do. In this case the black line is the input being analyzed. Whenever there is a rising edge at the input, we calculate the time since the last rising edge and divide it by four; this value is then set as the timer interrupt interval. Now we have an interrupt that fires at 4x the rate of the input gate stream and adjusts to changes in the timing of the stream. A rough sketch of what I mean is below.
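Something like the following, untested sketch is what I have in mind for the Nano Every (ATmega4809), driving one of the chip's TCB timers directly instead of going through a library. Assumptions to check: the gate is on pin 2, TCB1 is not already claimed by the core (other TCB instances are used for millis()/PWM, so another instance may be needed), and the peripheral clock is 16 MHz, so with the /2 prescaler one timer count is 0.125 µs.

```cpp
const byte gatePin = 2;                    // assumed gate input pin
const unsigned int multiplier = 4;         // timer ticks per incoming pulse (4 in this example)

volatile unsigned long lastEdgeUs = 0;     // micros() at the previous rising edge

void onRisingEdge() {
  unsigned long now = micros();
  unsigned long periodUs = now - lastEdgeUs;   // time between incoming pulses
  lastEdgeUs = now;

  // Timer counts at 8 MHz (16 MHz / 2) -> 8 counts per microsecond.
  unsigned long counts = (periodUs * 8UL) / multiplier;
  if (counts >= 2 && counts <= 65536UL) {      // must fit the 16-bit compare register
    TCB1.CCMP = (unsigned int)(counts - 1);    // new interval takes effect right away
    TCB1.CNT  = 0;                             // resynchronize to the incoming edge
  }
}

ISR(TCB1_INT_vect) {
  TCB1.INTFLAGS = TCB_CAPT_bm;                 // clear the interrupt flag
  // This runs 'multiplier' times per incoming pulse -- per-tick work goes here.
}

void setup() {
  pinMode(gatePin, INPUT);
  attachInterrupt(digitalPinToInterrupt(gatePin), onRisingEdge, RISING);

  TCB1.CTRLB   = TCB_CNTMODE_INT_gc;                     // periodic interrupt mode
  TCB1.CCMP    = 0xFFFF;                                 // placeholder until the first edge
  TCB1.INTCTRL = TCB_CAPT_bm;                            // enable the compare interrupt
  TCB1.CTRLA   = TCB_CLKSEL_CLKDIV2_gc | TCB_ENABLE_bm;  // CLK_PER/2, start the timer
}

void loop() {
}
```

One limitation of this approach: the 16-bit compare register caps the tick period at about 8 ms, so a 4x multiple of a slow musical clock would not fit, but at the higher multipliers I actually need (see below) it does.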
In reality I would like the multiplier to be more like 128 to 512 times the rate of the input clock. I am using this for music: for a steady clock of 16th notes at 120 BPM, I believe the interval between incoming pulses would be about 125 ms, so dividing that by 128 should give a timer interval of roughly 977 µs (for those not familiar, 'µs' means microseconds). That is less than 1 ms, so you can see why I need accuracy down to the microsecond; even if there is a little jitter in the incoming clock, having this many slices per clock pulse should still give me more accurate results than just using micros() to time between events.
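Just to spell the arithmetic out (120 BPM and the 128x multiplier are only example values):

```cpp
// Worked numbers for a steady 16th-note clock at 120 BPM:
unsigned long quarterUs   = 60000000UL / 120;  // 500,000 us per quarter note
unsigned long sixteenthUs = quarterUs / 4;     // 125,000 us between incoming pulses
unsigned long tickUs      = sixteenthUs / 128; // ~976 us per timer tick (977 us rounded up)
```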