# Programming a buck converter

I am designing a buck converter; the circuit is shown below.
Initially I am applying 19 V, which needs to be stepped down to 12 V.
The driver circuit works up to 25 kHz at most, so I am using 20 kHz, i.e. a 50 µs period,
and I checked the output on a CRO.
Here is the code:

```cpp
int freq = 9;  // pin 9 carries the 20 kHz pulses

void setup()
{
  // put your setup code here, to run once:
  pinMode(freq, OUTPUT);  // make freq an output pin
}

void loop()
{
  // put your main code here, to run repeatedly:
  digitalWrite(freq, HIGH);
  delayMicroseconds(25);   // 25 us HIGH
  digitalWrite(freq, LOW);
  delayMicroseconds(25);   // 25 us LOW -> 50 us period = 20 kHz nominal
}
```

What I observed on the CRO was not 20 kHz but 18.89 kHz.
I then had to shorten the period to suit my requirement, and the value that finally worked came out to T = 36 µs. This 36 µs gives me 20 kHz on the scope, even though it should actually correspond to 27.77 kHz. Why is this happening?
Should I write my own delay routine rather than depending on these built-in functions?
Another weird thing I observed:
to get around 12 V out from 19 V, the duty cycle should be PWM = (12/19) × 100 ≈ 63%, so the program becomes:

```cpp
void loop()
{
  // put your main code here, to run repeatedly:
  digitalWrite(freq, HIGH);
  delayMicroseconds(31);   // ~63% of the 50 us period
  digitalWrite(freq, LOW);
  delayMicroseconds(19);
}
```

But the result I got was that, rather than the voltage being stepped down, the output was comparable to the supply voltage, with only a 1-2 V difference.
I am unable to figure out what is going wrong.
The figure below shows the MOSFET driver (a 555 timer plus an optocoupler) and the MOSFET + inductor + capacitor that form the buck converter.

> What I observed on the CRO was not 20 kHz but 18.89 kHz.

digitalWrite() takes . . . how long to execute?
What is the call overhead for loop()?

What you should do is use a hardware timer in PWM mode to drive your output, and use a PID controller to adjust the PWM pulse width. Doing the PWM timing in software is not going to work very well, and using hardware PWM will let you run at a much higher PWM frequency.
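For reference, a minimal sketch of the hardware-timer approach (my own example, not tested on your circuit; it assumes an ATmega328-based board such as an Uno at 16 MHz, where pin 9 is driven by Timer1's OC1A output):

```cpp
const byte pwmPin = 9;  // OC1A on an ATmega328 board

void setup() {
  pinMode(pwmPin, OUTPUT);

  // Timer1, Fast PWM mode 14 (TOP = ICR1), non-inverting output on OC1A
  TCCR1A = _BV(COM1A1) | _BV(WGM11);
  TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS10);  // no prescaler

  ICR1  = 799;  // 16 MHz / (799 + 1) = 20 kHz
  OCR1A = 505;  // 12/19 of 800 counts ~ 63% duty
}

void loop() {
  // The timer hardware generates the PWM with no software jitter.
  // A PID loop reading the output voltage would adjust OCR1A here.
}
```

Because the waveform comes straight from the timer, digitalWrite() and loop() overhead no longer affect the frequency or duty cycle at all.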

Regards,
Ray L.