First off, massive respect to those who spent their time helping on the forums.
Personally I have learnt so much from troubleshooting via the forum search bar. A big thanks.
My problem. (I'm assuming it will be an easy fix)
I have a loadcell +amp chip, achieving good signals.
My system currently calibrates itself on start up. (I physically cycle the load cell)
[
// assumes sensorMin0 starts at 1023 and sensorMax0 at 0,
// so the first reading pulls each one the right way
while (millis() < 5000) {
  sensorValue0 = analogRead(sensorPin0);
  // record the maximum sensor value
  if (sensorValue0 > sensorMax0) {
    sensorMax0 = sensorValue0;
  }
  // record the minimum sensor value
  if (sensorValue0 < sensorMin0) {
    sensorMin0 = sensorValue0;
  }
}
]
I then map the values for the processing I require.
I do not have the experience to write a calibration sequence that runs for a set time period (say 5 seconds) after a button press.
I have done all the relevant tutorials but I cannot find something that works.
My button command will have to be an analog signal, just to make things a little more awkward. Eg:
[
buttonState = analogRead(A3);
if (buttonState > 1000) {
  // do something
}
]
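A side note on that analog button: since loop() repeats thousands of times a second, `buttonState > 1000` stays true for the whole physical press, so a calibration triggered that way would restart over and over. Detecting the rising edge fixes that. A minimal sketch of the idea follows; `newPress()` is a name I made up, and `analogRead()` is stubbed with a canned press/release pattern purely so the logic can be checked off-board (on the board you would use the real function):

```cpp
// Stub standing in for the Arduino analogRead(), feeding a canned
// press/release pattern purely so this sketch is testable off-board.
static int fakeButton[] = {0, 1020, 1020, 1020, 0, 1015};
static int btnIdx = 0;
int analogRead(int /*pin*/) { return fakeButton[btnIdx++ % 6]; }

static bool wasPressed = false;   // state of the button on the last check

// true only on the transition from released to pressed, so the caller
// reacts once per physical press instead of once per loop() pass
bool newPress(int pin) {
  bool pressed = analogRead(pin) > 1000;  // threshold from the post
  bool rising = pressed && !wasPressed;
  wasPressed = pressed;
  return rising;
}
```

In loop() that becomes `if (newPress(A3)) { /* start calibration */ }`. A noisy analog line may also want debouncing, but edge detection alone already stops the retriggering.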
To summarise: how do I calibrate a sensor for a given time period after a command?
A flow chart along the lines of:
[Button pressed] -> [Start calibration 'scan'] -> [5 sec delay] -> [Stop cal scan] -> [Generate min and max values]
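That flow chart is nearly the program already, with one caution: the "5 sec delay" step shouldn't be a literal delay(5000), because the sketch has to keep sampling during the window; a while-loop does the waiting and the reading at the same time. Below is a minimal sketch of the idea. `calibrateSensor()` is my own name, and `analogRead()`/`millis()` are stubbed (alternating low/high samples, a clock that jumps 1 second per call) so the logic can be verified off-board; on the Arduino you would delete the stubs and use the real calls:

```cpp
// --- stubs replacing the Arduino core, for off-board testing only ---
static int fakeReadings[] = {150, 900};   // the load cell being cycled
static int readIdx = 0;
static unsigned long fakeClock = 0;
int analogRead(int /*pin*/) { return fakeReadings[readIdx++ % 2]; }
unsigned long millis() { return fakeClock += 1000; }  // +1 "second" per call

// --- the calibration routine the flow chart describes ---
const int sensorPin0 = 0;   // A0 in a real sketch
int sensorMin0 = 1023;      // start high so the first reading lowers it
int sensorMax0 = 0;         // start low so the first reading raises it

void calibrateSensor(unsigned long durationMs) {
  sensorMin0 = 1023;        // reset so an earlier run doesn't leak in
  sensorMax0 = 0;
  unsigned long start = millis();
  while (millis() - start < durationMs) {   // overflow-safe elapsed check
    int v = analogRead(sensorPin0);
    if (v > sensorMax0) sensorMax0 = v;     // record the maximum
    if (v < sensorMin0) sensorMin0 = v;     // record the minimum
  }
}
```

Wired to the button it becomes `if (analogRead(A3) > 1000) calibrateSensor(5000);` inside loop(), after which the same map() call from the startup version works with the fresh sensorMin0/sensorMax0 limits.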
Kind regards,
Calibration-Noob Will.