I am trying to measure the duration of the coincidence of two events, that is, the time during which event2 is low while event1 is also low. This time is short, in the range 1 to 12 milliseconds, but it should be captured by the difference dt in the sketch below:
int inUS1 = 1;
int inUS2 = 2;
int val1, val2;
unsigned long t1, t2, t3, t4, dt;

void setup()
{
  Serial.begin(9600);
  pinMode(inUS1, INPUT);
  pinMode(inUS2, INPUT);
}

void loop()
{
  val1 = analogRead(inUS1);
  val2 = analogRead(inUS2);
  t1 = millis();
  while (val1 < 500) {
    val1 = analogRead(inUS1);
    t2 = millis();
    while (val2 < 500) {
      val2 = analogRead(inUS2);
      t3 = millis() - t2;
    }
    t4 = millis() - t1;
  }
  dt = t4 - t3;
  Serial.println(t1);
  Serial.println(t2);
  Serial.println(t3);
  Serial.println(t4);
  Serial.println(dt);
  delay(100);
}
But this program does not work; I always get dt = 0.
I tried using digital pins instead of analog ones, which didn't help. I also tried a single while with the combined condition, while(val1 < 500 && val2 < 500), to no avail.
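The single-while version was something like this (same pins, variables and 500 threshold as in the sketch above, not the exact code):

t1 = millis();
// one loop with the combined condition: both inputs below the threshold
while (val1 < 500 && val2 < 500) {
  val1 = analogRead(inUS1);
  val2 = analogRead(inUS2);
}
dt = millis() - t1;
Serial.println(dt);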
Do you think there is something wrong with the code, or is it just my setup?
Did you mean:
while (val1 < 500) {
  val1 = analogRead(inUS1);
  t2 = millis();
}
while (val2 < 500) {
  val2 = analogRead(inUS2);
  t3 = millis() - t2;
}
t4 = millis() - t1;
?
No, the while for val2 is inside the while for val1.
I want to measure the time during which both val1 and val2 are < 500.
This time is short, in the range 1 to 12 milliseconds ...
12 milliseconds is a long time to an Arduino. One of the reasons I love microcontrollers, and Arduino in particular, is that they give us access to timescales far finer than our bodies can perceive. It's like a microscope for time.
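To put a number on it: on a classic AVR board like the Uno, a single analogRead() takes on the order of 100 microseconds, so a quick sketch like this one (untested, and assuming the same analog pin as inUS1 above), which just counts how many reads fit into a 12 ms window, should report something on the order of a hundred samples:

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  // count how many analogRead() calls fit into a 12 ms window
  unsigned long start = millis();
  unsigned long count = 0;
  while (millis() - start < 12) {
    analogRead(1);   // same analog pin number as inUS1 in the sketch above
    count++;
  }
  Serial.println(count);
  delay(1000);
}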
It isn't clear whether you want the time from when they both go low until one of them stops being low, but if so, that would be:
int inUS1 = 1;
int inUS2 = 2;
int val1, val2;
unsigned long t1, t2, t3, t4, dt;

void setup()
{
  Serial.begin(9600);
  pinMode(inUS1, INPUT);
  pinMode(inUS2, INPUT);
}

void loop()
{
  val1 = analogRead(inUS1);
  val2 = analogRead(inUS2);

  // wait until they are both low
  while (val1 >= 500 || val2 >= 500) {
    val1 = analogRead(inUS1);
    val2 = analogRead(inUS2);
  }
  t1 = millis();

  // wait until at least one of them stops being low
  while (val1 < 500 && val2 < 500) {
    val1 = analogRead(inUS1);
    val2 = analogRead(inUS2);
  }
  t2 = millis();

  dt = t2 - t1;
  Serial.println(t1);
  Serial.println(t2);
  Serial.println(dt);
  delay(100);
}
[edit]
I want to measure the time during which both val1 and val2 are < 500.
Okay, our posts crossed; I think the sketch above is what you've described.[/edit]
HTH
GB
WARNING - I have only Verify'ed this, and not tested it. It is likely to be wrong!
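For completeness, a non-blocking variant of the same idea: instead of sitting inside while loops, it samples both inputs once per pass through loop() and prints the overlap duration when the overlap ends. Same pins and 500 threshold assumed, and it is just as untested as the sketch above:

int inUS1 = 1;
int inUS2 = 2;

bool bothLow = false;         // are we currently inside an overlap?
unsigned long overlapStart;   // millis() when the overlap began

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  bool lowNow = (analogRead(inUS1) < 500) && (analogRead(inUS2) < 500);

  if (lowNow && !bothLow) {
    // overlap just started: remember when
    bothLow = true;
    overlapStart = millis();
  }
  else if (!lowNow && bothLow) {
    // overlap just ended: print its duration in milliseconds
    bothLow = false;
    Serial.println(millis() - overlapStart);
  }
}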