Hi,
I want to write a custom delay function that takes its input in microseconds and delays the code for that long. For this I am using Timer 1 of the Arduino Mega to count time in microseconds, plus a function I have written, delay_us_1(), which busy-waits for the requested delay.
But when I use this function to generate a waveform with a 200 us period, it does not give the expected result: the generated waveform has a period of about 930 us. It also gives incorrect delays when I change the argument to other values.
Here is my code:
volatile uint32_t microseconds = 0;   // advanced by the Timer 1 ISR below
ISR(TIMER1_COMPA_vect)
{
  PORTB ^= ( 1 << 7 );   // toggle PB7 on every compare match
  microseconds += 4;     // one interrupt every 4 us (see setupTimer1)
}
void setupTimer1()
{
  // WGM1[3:0] = 15 for fast PWM with TOP = OCR1A
  TCCR1A = ( 1 << WGM10 ) | ( 1 << WGM11 );
  // CS11 -> prescaler 8, so the timer ticks at 16 MHz / 8 = 2 MHz (0.5 us per tick);
  // ICES1 selects rising-edge capture (input capture is not used here)
  TCCR1B = ( 1 << WGM12 ) | ( 1 << WGM13 ) | ( 1 << ICES1 ) | ( 1 << CS11 );
  // TOP = 7 (0b111), so the timer wraps every 8 ticks = 4 us
  OCR1A = 0x7;
  // enable the compare match A interrupt
  TIMSK1 = ( 1 << OCIE1A );
}
void delay_us_1( uint64_t delay )
{
  uint64_t start = microseconds;
  // busy-wait until the ISR has advanced the counter by at least 'delay' us
  while ( ( microseconds - start ) < delay );
}
void setup() {
  DDRB |= ( 1 << 7 ) | ( 1 << 6 );   // PB7 and PB6 as outputs
  PORTB |= ( 1 << 7 ) | ( 1 << 6 );  // start with both pins high
  setupTimer1();
  sei();
  Serial.begin(115200);
  Serial.println("Serial started");
}
void loop() {
  delay_us_1( 100 );   // 100 us high + 100 us low should give a 200 us period
  PORTB ^= ( 1 << 6 ); // toggle PB6
}
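To sanity-check the numbers, here is the timing arithmetic I am assuming for this setup (16 MHz Mega, prescaler 8); these constants are only for illustration and are not part of the sketch:

// Timing arithmetic for the configuration above, assuming a 16 MHz Arduino Mega 2560
const uint32_t TIMER_HZ      = 16000000UL / 8;  // CS11 -> prescaler 8 -> 2 MHz, 0.5 us/tick
const uint32_t TICKS_PER_IRQ = 7 + 1;           // TOP = OCR1A = 7 -> compare match every 8 ticks
// 8 ticks * 0.5 us = 4 us per interrupt, which is why the ISR does microseconds += 4
// interrupt rate = 2 MHz / 8 = 250 kHz, i.e. one interrupt every 64 CPU cycles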
And here is the result on the oscilloscope:
