How to measure RF signal accurately [SOLVED]

I am trying to measure how long an RF signal stays HIGH. My sender and receiver code are at the end of this message.

I have two Arduinos. Just to be sure, I've tried VirtualWire send/receive on them and it works fine, so the equipment and connections are OK. I've also connected pin 11 to pin 12 directly with a plain wire, and my code runs fine that way too, so the code itself should be OK.

But when I send/receive over RF I get the wrong duration. For example, I've set the sender to hold HIGH for 500 milliseconds, but my receiver measures 105-150 milliseconds. Furthermore, if I change the HIGH duration to 250 milliseconds I still get the same 105-150 milliseconds. I don't think noise is a factor: the noise pulses are much shorter than 100 milliseconds.
So what is going on? What am I missing? Are OFF noise signals somehow blanking out a HIGH from my sender?

Here is my sender code:

#define SENDERPIN 12

void setup()
{
  Serial.begin(9600);
  Serial.println("setup rfout");
  pinMode(SENDERPIN, OUTPUT);
  digitalWrite(SENDERPIN, LOW);
}

void loop()
{
  int i, i2;

  i2 = 100;
  for (i = 0; i < i2; i++)
     {
     Serial.print("Round: "); Serial.print(i + 1); Serial.print("/"); Serial.println(i2);
     digitalWrite(SENDERPIN, 1);   // hold the sender pin HIGH
     delay(500);                   // for 500 milliseconds
     digitalWrite(SENDERPIN, 0);   // back to LOW
     delay(1);
     delay(5000);                  // wait 5 seconds before the next round
     }
}

Here is my receiver code:

#define RECEIVEPIN 11

void setup(){
  Serial.begin(9600);
  Serial.println("setup receive");
  pinMode(RECEIVEPIN, INPUT);
}

void loop()
{
  long time1, time2;
  while (1)   // wait for a HIGH
    {
    if (digitalRead(RECEIVEPIN) == 1)
       { // got HIGH, exit and start timing its duration
       break;
       }
    }
  time1 = millis();  // start time of the HIGH
  while (1)
    {
    if (digitalRead(RECEIVEPIN) == 0)
       { // HIGH has ended, now we have LOW; measure how long it stayed HIGH
       time2 = millis() - time1;   // duration of the HIGH
       if (time2 > 100 && time2 < 800)
          {  // cut some noise and try to focus on the sender's 500 ms
          Serial.println(time2);
          break;
          }
       else
          { // got a short HIGH
          while (1)   // wait until the short HIGH is back to LOW
            {
            if (digitalRead(RECEIVEPIN) == 0)
              break;  // got LOW; break and start over at the initial while loop
            }
          break;
          }
       }
    }
}

This is because the AGC on the receiver is adjusting and dropping the gain until the signal looks like noise.

Since RF signals off-air vary in power by factors of many billions, a receiver has to adapt its
sensitivity dynamically. 0.5 s is a very, very long time, many times longer than the time constant
of the AGC response.

Any suggestion on how to get a signal with a measurable duration on the receiver?

Perhaps your "sender" is expecting "data" to be sent, not just "high" or "low".

Paul

Data is made up of highs and lows that form binary bytes, which eventually make up, say, ASCII. For example, the uppercase letter A is 01000001, which is a series of highs and lows, similar to Morse code.
To make those highs and lows you need an accurately set pulse length, so that you can tell a single low apart from a run of several lows, and so on. There are a lot of techniques and protocols to help with that.
The Arduino is well capable of that. VirtualWire seems to use interrupts, but I would prefer using just the loop() function. Anyone?
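
The sort of thing I have in mind, as a rough loop()-only sketch: pulseIn() times each HIGH, and a threshold separates long pulses from short ones. The 50 ms threshold and the "short = 0, long = 1" encoding are just placeholders, not anything settled here.

#define RECEIVEPIN 11

void setup()
{
  Serial.begin(9600);
  pinMode(RECEIVEPIN, INPUT);
}

void loop()
{
  // pulseIn() blocks until the pin goes HIGH, times the HIGH phase,
  // and returns its length in microseconds (0 if no pulse within the timeout).
  unsigned long highUs = pulseIn(RECEIVEPIN, HIGH, 1000000UL);
  if (highUs == 0) return;             // nothing seen within 1 second, try again

  unsigned long highMs = highUs / 1000;
  if (highMs < 10) return;             // very short HIGHs treated as noise

  // Placeholder encoding: pulses over 50 ms count as 1, shorter ones as 0.
  int bit = (highMs > 50) ? 1 : 0;
  Serial.print("pulse ms: "); Serial.print(highMs);
  Serial.print("  bit: ");    Serial.println(bit);
}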

I think I see a Serial.print inside the loop where you are timing the bits. That cannot possibly work, because the print takes a LONG time to complete.

Paul

The Serial.println does not affect the timing in this case, as it runs only after the timing is finished, and it is skipped when the signal is not between 100 and 800 milliseconds.
The problem is that no matter what the sender sends, the receiver sees roughly 100-160 milliseconds, not 500 milliseconds. If I connect the sender and receiver directly I get accurate timing, +/- a few microseconds (not milliseconds).

OK. I got it solved by having the sender first send

int i3, i4;

i4 = 50;
for (i3 = 0; i3 < i4; i3++)
   {
   digitalWrite(SENDERPIN, 1);   // short 1 ms HIGH...
   delay(1);
   digitalWrite(SENDERPIN, 0);   // ...then 1 ms LOW, repeated 50 times
   delay(1);
   }

before the main pulse, to "wake up" the receiver's AGC.
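
Put together, the sender's loop() ends up roughly like this (same pin and timings as before, with the Round prints left out; only the preamble is new):

#define SENDERPIN 12

void setup()
{
  Serial.begin(9600);
  pinMode(SENDERPIN, OUTPUT);
  digitalWrite(SENDERPIN, LOW);
}

void loop()
{
  // Preamble: 50 short 1 ms on/off pulses so the receiver's AGC settles
  // on the transmitter before the pulse we actually want to measure.
  for (int i = 0; i < 50; i++)
    {
    digitalWrite(SENDERPIN, HIGH);
    delay(1);
    digitalWrite(SENDERPIN, LOW);
    delay(1);
    }
  // The pulse whose duration the receiver measures.
  digitalWrite(SENDERPIN, HIGH);
  delay(500);
  digitalWrite(SENDERPIN, LOW);
  delay(5000);   // pause before the next round
}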

I'd like to add some more info, should someone else run into the same problem.
Adding the preamble appeared to solve the problem of not getting 500 ms of HIGH on the receiver, and everything ran fine. This morning everything gradually seemed to go south. After a lot of testing it appears that after the preamble pulses I can get a maximum consistent HIGH pulse of about 150 ms, and then the signal becomes choppy and deteriorates. Probably the sender is not able to hold the HIGH, because it cannot supply enough current to keep it up. LOW signals can of course go on longer. Keeping everything under 100 ms I get a workable pulse. The Serial Plotter confirms just what I expect.
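
For anyone who wants to see the same thing on the Serial Plotter, a minimal sampling sketch along these lines works; the pin matches the receiver above, and the baud rate and sample interval are just my own choices.

#define RECEIVEPIN 11

void setup()
{
  Serial.begin(115200);              // fast baud so the prints keep up
  pinMode(RECEIVEPIN, INPUT);
}

void loop()
{
  // Print the raw pin state a few hundred times per second; the Arduino
  // IDE's Serial Plotter then draws the pulse train, which makes dropouts
  // in a long HIGH easy to spot.
  Serial.println(digitalRead(RECEIVEPIN));
  delay(2);
}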