Compiler error? A delay is needed for it to work

Hello folks, I can't understand why the following program doesn't work without the delay(2) statement.
Apparently everything is fine. Does anyone know why the do...while loop exits for no reason without the delay?
Does anyone know another way to do this without the delay? Regards.

/* This program generates an echo on the serial port: it resends the same characters it receives.
I want to read all the characters sent to the serial port inside the function port_read(char *recive). */

#include <stdlib.h>
#include <string.h>
char read_buffer, recived_from_pc[50], send_to_pc[50];
//***********************  functions **************************

int port_read(char *recive);

//************************** Setup ****************************
void setup() {
  Serial.begin(9600);  
}

//*********************** Main Program *************************

void loop() {
 port_read(recived_from_pc);

}

//****************************  Functions *****************************

int port_read(char *recive){
   bool empty_port;
   int i = 0;
   do {
       if (Serial.available() > 0) {
          recive[i] = Serial.read();
          i = i + 1;
          recive[i] = '\0';
          empty_port = true;
          //delay(2);  //try commenting and uncommenting this line  <-------- HERE!!!
       }
       else {
          empty_port = false;
       }
   } while (empty_port == true);
   if (i != 0){
        Serial.print("_____the number of elements is: ");
        Serial.print(i);
        Serial.print("_________the string of char is: ");
        Serial.println(recive);
        Serial.println("Processed____________________________________________");
   }
   return i;
}

Moderator edit: [code] ... [/code] tags added. (Nick Gammon)

Please modify your post, and put [code] ... [/code] tags around the code. Select the text and click the '#' button.

I voted for the second option, but I'm not entirely certain.

int port_read(char *recive){
   bool empty_port;
   int i;  
   i=0;
   do     
       if (Serial.available()>0) { 
          recive=Serial.read();

Here you overwrite the pointer to the character array instead of writing into the array ...

          i=i+1;           
          recive='\0';

... and now you set the pointer to NULL. This is very likely to mess things up completely. The delay() might change some behaviour, but that is only a symptom of the bug.

So use *recive or recive[i] to write into the array (see the C/C++ reference). Keep in mind that there is a receive buffer and a function serialEvent() (see http://arduino.cc/en/Reference/SerialEvent) which might be handy. And most important: you loop through your code very fast compared to the speed at which characters are received (~1 ms per character at 9600 baud).

And please do not start senseless polls - just start a new topic.

   do     
       if (Serial.available()>0) { 
          recive[i]=Serial.read();          
          i=i+1;           
          recive[i]='\0'; 
          empty_port=true;
          //delay(2);  //try coment and uncoment this line  <-------------------------------------------------------------------- HERE!!!
       }
       else { 
         empty_port=false;          
       }
   while (empty_port==true);

The reason you need the delay() is that you are reading until you get ahead of the sender, not until you read a character that indicates the end of input. Without the delay you are reading faster than the sender is sending and get ahead of the sender before the transmission is complete.

This will hopefully help you understand how to read serial data without needing delays:

In future please do "New Topic" not "Post new poll". Polls are for things like "what is your favourite colour?".

The delay(2) is meant to slow the Arduino down so that on the next iteration it can decide whether the next character has come in or not. If not, the Arduino assumes the transmission has ended, in this heuristic way. I highly recommend you stay clear of this method and read Nick's reference.

Here is my way to pick this method apart:

In 1960, when all they had were modems at roughly 96 baud and people were 3+ decades away from first hearing about USB, this method may have worked pretty well: a serial port sends data immediately after it receives a full byte. Back in the 2012 of today, every Arduino (more or less) is equipped with a USB-TTL adapter that simulates serial communication with USB packets. These packets are about 63 bytes long and won't take off until one of two things happens: the 63-byte buffer fills with data, or a timeout occurs. Say your sender has 4 bytes of data and sends them to you before a long sleep. His first 3 bytes ride down the USB to the Arduino together with 60 other bytes, fine, but the last byte he sends sits alone in the PC's USB buffer waiting for the timeout. That timeout is, by default, longer than 2 ms. Your Arduino gets a garbled message and sets your house on fire. Not so good, is it? You will be left without a house and with no idea why it happened XD

This problem is especially bad when the sender keeps sending data with no timestamps on the data and the receiver assumes delivery is immediate. The data ripples and you end up with a useless chart.