Good evening everyone,
I am using a pair of piezoelectric sensors connected to an amplification circuit based on the one from David Houlding's personal blog. The amplified signal is fed into an Arduino Uno. I would like to measure the speed of sound propagating through a wooden table, using the two piezo sensors as checkpoints. To increase the sampling rate, I have set the ADC prescaler to a division factor of 4 instead of the default 128.
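(For reference on the timing: with a prescaler of 4 the ADC clock is 16 MHz / 4 = 4 MHz, and a normal conversion takes 13 ADC clock cycles, so each conversion should take roughly 13 / 4 MHz ≈ 3.25 µs, versus about 104 µs at the default prescaler of 128. This ignores the overhead of analogRead itself.)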
My plan is to record the timestamp at which the signal on each analog pin first exceeds a given threshold, and then output the time difference between the two timestamps in microseconds.
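Once I have a reliable time difference, converting it to a speed should just be v = d / Δt. Here is a minimal sketch of the conversion I have in mind (SENSOR_SPACING_M is a placeholder for the distance between my two piezo discs, which I still need to measure precisely):

const float SENSOR_SPACING_M = 0.50; // placeholder: distance between the piezo discs, in metres

float speedFromDeltaT(unsigned long dtMicros) {
  // microseconds -> seconds, then v = d / t
  return SENSOR_SPACING_M / (dtMicros * 1e-6);
}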
I am currently using the following code:
unsigned long time0 = 0; // timestamp of the threshold crossing on A0
unsigned long time1 = 0; // timestamp of the threshold crossing on A1
int threshold = 350;     // ADC reading that counts as a hit

void setup() {
  Serial.begin(2000000);

  // Speed up the ADC: clear the three prescaler bits, then set ADPS1
  // alone, selecting a division factor of 4 (the default is 128).
  ADCSRA &= ~(bit(ADPS0) | bit(ADPS1) | bit(ADPS2));
  ADCSRA |= bit(ADPS1);

  delay(2500); // allow time for the serial monitor to open
}

void loop() {
  // Case 1: A0 crosses the threshold first; busy-wait for A1
  if (analogRead(A0) > threshold && time0 == 0) {
    time0 = micros();
    while (1) {
      if (analogRead(A1) > threshold && time1 == 0) {
        time1 = micros();
        Serial.println(time1 - time0);
        time0 = 0;
        time1 = 0;
        delay(1000); // settle before the next measurement
        break;
      }
    }
  }

  // Case 2: A1 crosses the threshold first; busy-wait for A0
  if (analogRead(A1) > threshold && time1 == 0) {
    time1 = micros();
    while (1) {
      if (analogRead(A0) > threshold && time0 == 0) {
        time0 = micros();
        Serial.println(time1 - time0);
        time0 = 0;
        time1 = 0;
        delay(1000); // settle before the next measurement
        break;
      }
    }
  }
}
The problem is the output I get in microseconds. I don't understand why, most of the time, it is extremely large, for example:
4294967284
4294967284
4294967288
4294967284
4292064684
4284993872
12
8
8
4294967288
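One thing I did notice: 4294967284 is exactly 2^32 - 12, which is what an unsigned long subtraction yields when the "true" result would be -12. A quick demonstration of that wraparound, independent of my circuit:

void setup() {
  Serial.begin(9600);
  unsigned long a = 100;
  unsigned long b = 112;
  // Mathematically a - b is -12, but unsigned arithmetic wraps around:
  Serial.println(a - b); // prints 4294967284, i.e. 2^32 - 12
}

void loop() {}

Could this kind of wraparound explain my large values, and if so, where in my code does the subtraction go negative?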