How does delay() affect my output?

#include <SoftwareSerial.h>

char Reading;          // last character received from the sensor
int Fin = 1;           // last valid reading
int i = 0;             // index into Integer[]
char Integer[8];       // buffer for the incoming digits


SoftwareSerial sensor(9, 8); // RX, TX
void setup()
{
  Serial.begin(9600);
  Serial.println("initializing");
  sensor.begin(9600);
  Serial.println("Success");
}

void loop() {
  if (sensor.available()) {
    Reading = sensor.read();
    if (Reading != '\r' && i < 5) {
      Integer[i++] = Reading;        // collect digits until CR or buffer limit
    } else {
      Integer[i] = '\0';             // terminate the string
      i = 0;
      int Num = atoi(Integer);       // convert the collected digits
      if (Num != 0)
        Fin = Num;                   // keep the last valid reading
    }
  }
  Serial.print("G");
  Serial.println(Fin);
  delay(50);
}

Hi all Arduino masters,

I have some code here. In general, it gets readings from a sensor device.

But in my experiments, I can only get readings when the delay is set to 50 ms.

If I change the delay to 100 or more, the serial monitor stops streaming after about 3 values.

So, is there any way to change the delay to 100 while still getting correct readings continuously?

Thank you!!!

Using delay() freezes the micro, so SoftwareSerial will not work during that time. Use the BWD (millis()) technique for your delays. See Blink Without Delay.
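A minimal, untested sketch of that idea applied to your code (the 100 ms report interval and the variable names are just an example):

#include <SoftwareSerial.h>

SoftwareSerial sensor(9, 8);               // RX, TX

char buf[8];                               // buffer for incoming digits
byte idx = 0;                              // index into buf[]
int fin = 1;                               // last valid reading
unsigned long lastReport = 0;              // time of the last print
const unsigned long reportInterval = 100;  // ms between prints (example value)

void setup() {
  Serial.begin(9600);
  sensor.begin(9600);
}

void loop() {
  // Read the sensor on every pass through loop(); never block here.
  while (sensor.available()) {
    char c = sensor.read();
    if (c != '\r' && idx < sizeof(buf) - 1) {
      buf[idx++] = c;                      // collect digits until CR or buffer limit
    } else {
      buf[idx] = '\0';                     // terminate the string
      idx = 0;
      int num = atoi(buf);                 // convert the collected digits
      if (num != 0) fin = num;             // keep the last valid reading
    }
  }

  // Print only when the interval has elapsed, instead of calling delay(100).
  if (millis() - lastReport >= reportInterval) {
    lastReport = millis();
    Serial.print("G");
    Serial.println(fin);
  }
}

This way the incoming serial data is read on every pass, and only the printing is throttled to the interval you want.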

LarryD: Using delay() freezes the micro, so SoftwareSerial will not work during that time. Use the BWD (millis()) technique for your delays. See Blink Without Delay.

Problem solved!!!!!!!

Thank you a million times!!!!!!!