Need help using a microseconds delay outside of loop()

Hi!

I need help converting my delay from milliseconds to microseconds. I have tried googling this problem, but I am not much of a programmer and haven't found anything that could help me.

My problem is that I want to use a delay outside of loop() in microseconds instead of milliseconds. This is my code:

#include <util/delay.h>
//-----------------------------------------------
int const valve_in = 8;
int const valve_ut = 9;
int const matgaffel = 3;

volatile byte matskiva_plats=1;
volatile int pressure=55;
volatile unsigned int time = 30;
byte desiredmax = 70;
byte desiredmin = 55;

int start=0;

unsigned long timeold = 0;

//-----------------------------------------------
void setup()
{
Serial.begin(9600);

pinMode(valve_in,OUTPUT);
pinMode(valve_ut,OUTPUT);
pinMode(matgaffel,INPUT);
attachInterrupt(1, valve_ut_fun, RISING);
pinMode(A0,INPUT);
}
//-----------------------------------------------
void loop()
{
if(matskiva_plats==1)
attachInterrupt(1, valve_in_fun, RISING);
else
attachInterrupt(1, valve_ut_fun, FALLING);

}
//-----------------------------------------------
void valve_in_fun()
{
Serial.println(pressure, DEC);

if(start>20)
{
if(pressure<desiredmin)
time++;
else if(pressure>desiredmax)
time--;
}
else
start++;

digitalWrite(valve_ut, LOW);
digitalWrite(valve_in, HIGH);

delay_ms(time);

digitalWrite(valve_in, LOW);

attachInterrupt(1, fix, CHANGE);
matskiva_plats=0;
}
//-----------------------------------------------
void valve_ut_fun()
{
pressure=analogRead(A0);
digitalWrite(valve_ut, HIGH);

attachInterrupt(1, fix, CHANGE);
matskiva_plats=1;
}
//-----------------------------------------------
void delay_ms(unsigned int time)
{
while (time--)
_delay_ms(1);
}
//-----------------------------------------------
void fix()
{
delay_ms(1);
}

//-----------------------------------------------

What my program does is control two inlets/outlets for an air engine depending on the pressure when the piston reaches its lowest point. But when putting this to practical use, the engine/car jerks forward, because changing the delay by 1 ms makes too much difference in the air intake.

Please help, thanks

Have you tried the function delayMicroseconds()? The value you pass is an "unsigned long".

What I have found is that I can't use a delay function inside the interrupt handler; that's why I have delay_ms. I wonder if there is something similar to delay_ms but for microseconds. Like, can I just replace ms with us and it will work?

The delayMicroseconds() function will work in an ISR.

Oops, the input value is an unsigned int, not unsigned long. The longest usable delay is about 16000 microseconds.
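If you ever need a wait longer than that, one common workaround is to split it into chunks that delayMicroseconds() can time accurately. A minimal sketch of the idea, with delayMicroseconds() mocked as a counter here so the example is self-contained (on the Arduino it busy-waits; the helper name delay_us is made up):

```cpp
// Mock of Arduino's delayMicroseconds(): records the requested time
// instead of busy-waiting, so this example can run anywhere.
static unsigned long elapsed_us = 0;
void delayMicroseconds(unsigned int us) { elapsed_us += us; }

// Hypothetical helper: split a long delay into chunks small enough
// for delayMicroseconds() to handle accurately (it is only reliable
// up to roughly 16383 us on a 16 MHz board).
void delay_us(unsigned long us) {
    const unsigned int chunk = 10000;   // safely under the limit
    while (us > chunk) {
        delayMicroseconds(chunk);
        us -= chunk;
    }
    delayMicroseconds((unsigned int)us);
}
```

On real hardware this still burns CPU time for the whole wait, so the earlier warnings about long delays inside an ISR apply just the same.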

So you are saying that I could rewrite my code as follows, as long as I don't exceed time = 16000?

#include <util/delay.h>
//-----------------------------------------------
int const valve_in = 8;
int const valve_ut = 9;
int const matgaffel = 3;

volatile byte matskiva_plats=1;
volatile int pressure=55;
volatile unsigned int time = 30;
byte desiredmax = 70;
byte desiredmin = 55;

int start=0;

unsigned long timeold = 0;

//-----------------------------------------------
void setup()
{
Serial.begin(9600);

pinMode(valve_in,OUTPUT);
pinMode(valve_ut,OUTPUT);
pinMode(matgaffel,INPUT);
attachInterrupt(1, valve_ut_fun, RISING);
pinMode(A0,INPUT);
}
//-----------------------------------------------
void loop()
{
if(matskiva_plats==1)
attachInterrupt(1, valve_in_fun, RISING);
else
attachInterrupt(1, valve_ut_fun, FALLING);

}
//-----------------------------------------------
void valve_in_fun()
{
Serial.println(pressure, DEC);

if(start>20)
{
if(pressure<desiredmin)
time++;
else if(pressure>desiredmax)
time--;
}
else
start++;

digitalWrite(valve_ut, LOW);
digitalWrite(valve_in, HIGH);

delayMicroseconds(time);

digitalWrite(valve_in, LOW);

attachInterrupt(1, fix, CHANGE); //
matskiva_plats=0;
}
//-----------------------------------------------
void valve_ut_fun()
{
pressure=analogRead(A0);
digitalWrite(valve_ut, HIGH);

attachInterrupt(1, fix, CHANGE); //
matskiva_plats=1;
}
//-----------------------------------------------
void fix()
{
delayMicroseconds(1);
}

//-----------------------------------------------

Kind regards

If you put a delay much more than 900 microseconds in an ISR you will start to lose timer interrupts and the millis() function will lose time.

I’m not sure I follow the logic. I don’t see how the delay in the ‘fix’ ISR does anything useful.

You don’t need “#include <util/delay.h>” anymore.

You shouldn't use .print() or .println() in an ISR. If the transmit buffer is full, your code will lock up: .print() waits for the buffer to drain, but since interrupts are disabled inside an ISR, the buffer can never empty.
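The usual pattern is to have the ISR only record the value and raise a flag, and let loop() do the printing. A sketch of the idea, with Serial replaced by a plain variable so the example stays self-contained (loop_once stands in for one pass of loop(); names are made up):

```cpp
// ISR records data and raises a flag; loop() does the slow work.
volatile int pressure = 0;
volatile bool pressure_ready = false;
int last_printed = -1;   // stand-in for what Serial.println would show

void valve_ut_fun() {        // the ISR: no Serial calls in here
    pressure = 55;           // stand-in for analogRead(A0)
    pressure_ready = true;   // tell loop() a new reading exists
}

void loop_once() {           // one pass of loop()
    if (pressure_ready) {
        pressure_ready = false;
        last_printed = pressure;  // here you would Serial.println(pressure, DEC)
    }
}
```

This keeps the ISR short and moves all blocking work out to loop(), where interrupts are enabled and the Serial buffer can drain normally.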

It seems to me that switching interrupt handlers repeatedly as you're doing is not a good strategy, and imo it would be clearer and simpler just to write an interrupt handler that implemented the logic you need, using state variables if necessary. Additionally, I can't see the purpose of fix(). Conventional wisdom is that interrupts should do as little as possible and especially not do any delays. In this case a short delay is all it does - how is that useful?
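For example, a single handler attached once in setup() could use a state variable (much like matskiva_plats) to decide which branch to run, instead of re-attaching handlers from loop(). A rough, untested sketch of the structure only: the event counters stand in for the valve/digitalWrite logic from the original code, and the ISR name is made up.

```cpp
// One ISR, attached once in setup(); a state variable replaces the
// repeated attachInterrupt() calls from loop().
enum State { VENTING, FILLING };
volatile State state = VENTING;
int fill_events = 0;   // counters standing in for the valve logic
int vent_events = 0;

void matgaffel_isr() {
    if (state == VENTING) {
        ++fill_events;     // would open valve_in, close valve_ut here
        state = FILLING;
    } else {
        ++vent_events;     // would open valve_ut here
        state = VENTING;
    }
}
```

Each edge on the input pin then alternates between the two branches, with no handler swapping and nothing time-consuming inside the ISR.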