Hi all. I have an odd problem I'm trying to resolve and can't figure out. My code is simple: it reads two pots, one sets a timing delay and the other adjusts a variable pulse that drives a MOSFET, and then it loops through the code pulsing. Here's the code; the problem is described below it.
int thump = 13;          // MOSFET gate (also the onboard LED pin)
int delayTiming = A1;    // pot: delay between pulses
int strength = A3;       // pot: pulse length
int delayReadout;
int delayPeriod;
int spike;
int spikePeriod;

void setup() {
  Serial.begin(115200);
  pinMode(thump, OUTPUT);
  digitalWrite(thump, LOW);
  pinMode(delayTiming, INPUT);
  pinMode(strength, INPUT);
}

void loop() {
  delayPeriod = analogRead(delayTiming);
  delayReadout = map(delayPeriod, 1023, 0, 500, 3000);   // delay between pulses, ms
  spike = analogRead(strength);
  spikePeriod = map(spike, 1023, 0, 1000, 18000);        // pulse length, us
  Serial.println(spikePeriod);
  digitalWrite(thump, HIGH);
  delayMicroseconds(spikePeriod);
  digitalWrite(thump, LOW);
  delay(delayReadout);
}
The problem is this line: spikePeriod = map(spike, 1023, 0, 1000, 18000); It sets the pulse length for the MOSFET, but whenever the top end is adjusted above about 16200 the pulse falls on its face, even though the Serial.println line still prints the correct value. Anything above ~16200 makes delayMicroseconds(spikePeriod); fall down: the MOSFET doesn't get the full pulse, and the LED on pin 13 acts the same way. From 1000 up to ~16200 the LED gets brighter, then it suddenly dims and behaves as if the value were set back down around 1000. This happens both on full power and when the board is powered from the computer.
If I change the code to use delay() instead of delayMicroseconds(), it works like it should, but I need delayMicroseconds() for higher-resolution adjustment. I can't understand why it falls down above ~16200. What am I missing or doing wrong? Thanks.
My first suspicion is integer overflow. First, try simply changing all your ints to long, or to int32_t; of course, if that works, you then need to check the map() call.
Second thought: what about the timer code inside delayMicroseconds() itself - is there a *2 or a *4 in there that's taking your value beyond 16 bits? The Arduino reference for delayMicroseconds() says as much (a small demonstration of the wraparound follows the quote):
Currently, the largest value that will produce an accurate delay is 16383; larger values can produce an extremely short delay. This could change in future Arduino releases.
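Here's a minimal sketch of the wraparound I suspect. The x4 factor is my reading of the AVR core source for 16 MHz boards, so treat the exact mechanism as an assumption: delayMicroseconds() converts the request into roughly four busy-wait iterations per microsecond using 16-bit unsigned math, so anything above 16383 wraps around and produces a much shorter delay.
void setup() {
  Serial.begin(115200);
  uint16_t requested  = 16500;             // what the sketch asks for
  uint16_t iterations = requested * 4UL;   // 66000 wraps mod 65536 -> 464
  Serial.print(requested);
  Serial.print(" us requested -> roughly ");
  Serial.print(iterations / 4);            // ~116 us of actual delay
  Serial.println(" us actually delayed");
}

void loop() {}
That would explain why the pulse suddenly behaves as if it were set near the bottom of the range instead of the top.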
Thanks for your help. It's been a couple of years since I've done any coding, and I had a vague memory that there was a limit with delayMicroseconds() but couldn't remember exactly what it was.
I'll see if there's a workaround for my problem; maybe I can use a combination of delay() and delayMicroseconds() together to get what I need.
So after a lot of experimenting I found an easy fix: call delay() and delayMicroseconds() back to back. delay() handles the whole milliseconds and delayMicroseconds() handles the microseconds left over. I know it isn't perfect, since there's a very slight gap when switching from delay() to delayMicroseconds(), but for what I need it'll work fine.
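For reference, here's a minimal sketch of that split-delay idea; the helper name pulseMicros() is mine, not from the original code:
// Pulse a pin HIGH for totalMicros microseconds, using delay() for the whole
// milliseconds and delayMicroseconds() for the remainder (always < 1000,
// so well under the 16383 limit).
void pulseMicros(uint8_t pin, uint32_t totalMicros) {
  digitalWrite(pin, HIGH);
  delay(totalMicros / 1000);               // whole milliseconds
  delayMicroseconds(totalMicros % 1000);   // leftover microseconds
  digitalWrite(pin, LOW);
}
In loop() this would replace the digitalWrite/delayMicroseconds block, e.g. pulseMicros(thump, spikePeriod);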
I did try delay(100) plus delayMicroseconds(100) to make a 100.1 millisecond pulse, and every pulse measured either 100.139 or 100.0736 ms.
I also tried two back-to-back delayMicroseconds(15000) calls, and the pulses ranged from 30.212 to 30.277 ms. So there is some overhead, and some fundamental reason you can't quite get away with this: it's neither accurate nor repeatable.
I used a logic analyzer on a pin set HIGH before the delay and LOW after. An oscilloscope would be a better tool for observing the jitter in the ending of the desired pulse.
If you want accuracy you might want to perform similar experiments. You may have to get more sophisticated about producing these pulses.
If you ever require more precision / less jitter, then you can create the signal using hardware. All boards in the Arduino ecosystem have this capability using some form of hardware timer / counter.
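As one illustration (my own sketch, not from the thread, and specific to ATmega328P boards such as the Uno/Nano): Timer1 in Fast PWM mode 14 with ICR1 as TOP can generate the pulse on pin 9 (OC1A) entirely in hardware, so the pulse width has essentially no software jitter. With a /8 prescaler at 16 MHz each timer tick is 0.5 us, so widths up to about 32.7 ms can be set with half-microsecond resolution.
const uint32_t PULSE_US = 16500;   // desired pulse width in microseconds

void setup() {
  pinMode(9, OUTPUT);                              // OC1A output pin
  TCCR1A = _BV(COM1A1) | _BV(WGM11);               // non-inverting output on OC1A, mode 14
  TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS11);    // Fast PWM, TOP = ICR1, prescaler /8
  ICR1   = 65535;                                  // period ~32.8 ms (the limit of this setup)
  OCR1A  = PULSE_US * 2;                           // pulse width in 0.5 us ticks
}

void loop() {
  // The pulse width is produced entirely by the timer hardware.
  // The long gap between pulses (hundreds of ms) would still need to be
  // handled separately, e.g. by stopping and restarting the timer here.
}
This is just one approach; a library such as TimerOne wraps the same registers if you'd rather not touch them directly.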
For the code I was working with, in the end it turned out to be a waste of time, other than learning something new. It turns out I didn't need more than about 16000 microseconds of pulse anyway; beyond that the circuit became saturated and I actually lost power.
Back in the day I was experimenting with the natural delays in Arduino and seem to remember it taking around 40 microseconds to move between lines of code. I do remember that if I skipped the Arduino language and used C instead, the shortest delay possible between lines of code was 4 microseconds. But I never really understood that language very well, since it was just letters and 0s and 1s.