I've combined the code you suggested as best I could.
What is happening is this: the LED starts off; when the sensor reaches the threshold, the LED starts blinking and doesn't stop. The same thing happens when I do it without the delay, only every 50 ms. Also, my serial data has jumped from hovering around 450-500 to the 650 area.
Here is my code without the delays:
int ledPin = 13;    // LED connected to digital pin 13
int sensorPin = 0;  // analog pin 0
int sensorValue;
int ledState = LOW;      // ledState used to set the LED
long previousMillis = 0; // will store last time LED was updated
int val = 0;
// the following variable is a long because the time, measured in milliseconds,
// will quickly become a bigger number than can be stored in an int.
long interval = 1000; // interval at which to blink (milliseconds)

void setup() {
    // declare pin 13 to be an output:
    pinMode(13, OUTPUT);
    pinMode(0, INPUT);
    Serial.begin(9600);
}

void loop() {
    sensorValue = analogRead(sensorPin);
    Serial.print("Sensor Value: ");
    Serial.println(sensorValue);
    val = analogRead(ledPin);     // read the input pin
    analogWrite(ledPin, val / 4); // analogRead values go from 0 to 1023, analogWrite values from 0 to 255
    // here is where you'd put code that needs to be running all the time.
    // check to see if it's time to blink the LED; that is, if the
    // difference between the current time and last time you blinked
    // the LED is bigger than the interval at which you want to blink the LED.
    unsigned long currentMillis = millis();
    if (currentMillis - previousMillis > interval) {
        // save the last time you blinked the LED
        previousMillis = currentMillis;
    }
    if (sensorValue > 600) {
        digitalWrite(ledPin, HIGH);
    }
    else {
        digitalWrite(ledPin, LOW);
    }
    // set the LED with the ledState of the variable:
    digitalWrite(ledPin, ledState);
}
The blink without delay was meant as a template of a technique to eliminate the call to 'delay', not an actual solution, since you did say you wanted to fade and not blink.
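For illustration, the core of that technique is to check millis() on every pass through loop() and advance the fade by one step whenever the interval has elapsed, instead of blocking in delay(). Here is a minimal sketch of just that timing logic, with the hardware calls left out so it compiles anywhere; the names (Fader, updateFade) and the 39 ms step interval are mine for illustration, not from the tutorial:

```cpp
// State for a non-blocking fade: advance one brightness step each
// time 'interval' ms have elapsed, instead of calling delay().
struct Fader {
    unsigned long previousMillis = 0; // last time we stepped
    unsigned long interval = 39;      // ms between steps (~2 s / 51 steps)
    int value = 0;                    // current PWM value, 0..255
    int step = 5;                     // change applied per update
};

// Call this every pass through loop() with millis(); returns true
// when the brightness actually changed (i.e. a new analogWrite is needed).
bool updateFade(Fader &f, unsigned long currentMillis) {
    if (currentMillis - f.previousMillis < f.interval) return false;
    f.previousMillis = currentMillis;
    f.value += f.step;
    if (f.value >= 255) { f.value = 255; f.step = -f.step; } // turn around at the top
    else if (f.value <= 0) { f.value = 0; f.step = -f.step; } // and at the bottom
    return true;
}
```

In loop() you would call updateFade(f, millis()) and, whenever it returns true, analogWrite(ledPin, f.value) — on a PWM-capable pin.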
I can't seem to figure out how to make this work. Unfortunately I don't have enough experience with Arduino, or programming in general, to add the slow fade in and out using the "Blink Without Delay" tutorial. Could anyone help me, or explain how I adapt the two codes together?
The goal is this: the mic registers the sound; when the threshold is crossed, the LED fades in over 2 seconds, pauses for 2 seconds, then fades out over 2 seconds.
I've been looking at the fading tutorial and I think it is what I need. It actually doesn't matter if there is a delay, because once the threshold is crossed it needs to run the fade in, pause, and fade out every time, so a delay would work, and it doesn't matter if it misses some data.
So what I need to know is: how do I increase the fade time?
I believe what I need to do is create some kind of delay after every fadeValue += 5. Is that correct? How do I do that?
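For what it's worth: a 0-to-255 fade in steps of 5 is about 51 updates, so the delay after each fadeValue += 5 should be roughly 2000 / 51 ≈ 39 ms to make the whole fade take 2 seconds. A throwaway helper for that arithmetic (the name stepDelayMs is just mine, for illustration):

```cpp
// Delay (ms) to insert after each step so that a full 0..255
// sweep takes about fadeMs milliseconds in total.
int stepDelayMs(int fadeMs, int stepSize) {
    int steps = 255 / stepSize; // ~51 updates when stepSize is 5
    return fadeMs / steps;      // e.g. 2000 / 51 = 39
}
```

So delay(39) inside the fade loop gives roughly a 2-second fade; delay(5000), by contrast, stretches the whole fade out to over four minutes.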
int ledPin = 13;    // LED connected to digital pin 13
int sensorPin = 0;  // analog pin 0
int sensorValue;
int ledState = LOW; // ledState used to set the LED
// will store last time LED was updated
int val = 0;

void setup() {
    // declare pin 13 to be an output:
    pinMode(13, OUTPUT);
    pinMode(0, INPUT);
    Serial.begin(9600);
}

void loop() {
    sensorValue = analogRead(sensorPin);
    Serial.print("Sensor Value: ");
    Serial.println(sensorValue);
    if (sensorValue > 900) {
        for (int fadeValue = 0; fadeValue <= 255; fadeValue += 5) {
            // sets the value (range from 0 to 255):
            analogWrite(ledPin, fadeValue);
            // wait 5 seconds after each step
            delay(5000);
        }
    }
    else {
        digitalWrite(ledPin, LOW);
    }
    // set the LED with the ledState of the variable:
    digitalWrite(ledPin, ledState);
}
I feel like it should work, but the LED doesn't come on.
int ledPin = 13;    // LED connected to digital pin 13
int sensorPin = 0;  // analog pin 0
int sensorValue;
int ledState = LOW; // ledState used to set the LED
int val = 0;

void setup() {
    // declare pin 13 to be an output:
    pinMode(13, OUTPUT);
    pinMode(0, INPUT);
    Serial.begin(9600);
}

void loop() {
    sensorValue = analogRead(sensorPin);
    Serial.print("Sensor Value: ");
    Serial.println(sensorValue);
    delay(50);
    if (sensorValue < 600) {
        digitalWrite(ledPin, LOW);
    }
}
void fadeValue1() {
    for (int fadeValue1 = 255; fadeValue1 >= 0; fadeValue1 -= 5) {
        // sets the value (range from 255 to 0):}
        if (sensorValue > 600) {
            analogWrite(ledPin, fadeValue1);
            // wait for 30 milliseconds to see the dimming effect
            delay(1000);
        }
    }
}
void fadeValue1()
{
    for (int fadeValue1 = 255; fadeValue1 >= 0; fadeValue1 -= 5)
    {
        if (sensorValue > 600)
        {
            analogWrite(ledPin, fadeValue1);
            // wait for 30 milliseconds to see the dimming effect
            delay(1000);
        }
    }
}
Aside from the fact that you never call this code, I have some questions (after I put the curly braces where they belong and indented the code properly). Why would you call this function if sensorValue is less than 600? If you wouldn't, then there is no reason to test sensorValue in this function.
Does your sense of time really tell you that 30 milliseconds == 1000 milliseconds? If the comment doesn't agree with the code, the code is right and the comment is useless. Useless comments are worse than no comments.
255/5 = 51 steps. That function will take nearly a minute to fade the LED off. During that time, the Arduino will do nothing else.
Finally, ledPin is set to 13. Pin 13 is not a PWM pin, so the LED will simply stay bright for 29.5 seconds, then turn off. Not exactly what I consider a fade effect.
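Putting those fixes together — a PWM pin, a per-step delay that matches the stated 2-second goal, and a fade that actually gets called — one possible shape is below. The pin number, the timings, and the function name are my assumptions from the stated goal, and the stubbed-out analogWrite/delay at the top exist only so the logic can be compiled and checked off-board; on the Arduino you would delete them and call fadePulse() from loop() when sensorValue crosses the threshold:

```cpp
// --- PC stand-ins for two Arduino calls, for off-board checking only ---
static int lastWrite = -1;        // last value "written" to the LED
static unsigned long elapsed = 0; // total simulated time in ms
void analogWrite(int pin, int value) { (void)pin; lastWrite = value; }
void delay(unsigned long ms) { elapsed += ms; }

const int ledPin = 9; // a PWM pin on an Uno; pin 13 is not PWM-capable

// Fade in over ~2 s, hold for 2 s, fade out over ~2 s.
void fadePulse() {
    for (int v = 0; v <= 255; v += 5) { // 52 steps x 39 ms ~= 2 s up
        analogWrite(ledPin, v);
        delay(39);
    }
    delay(2000);                        // hold at full brightness
    for (int v = 255; v >= 0; v -= 5) { // and ~2 s back down
        analogWrite(ledPin, v);
        delay(39);
    }
}
```

Because the whole pulse is blocking, any sound during those six seconds is simply missed — which, per the earlier post, is acceptable here.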