Serial.available glitches.

@bask185: You have mail.

bask185:
that there is no clock line present does not mean you cant time. you said "in other words, timing should not be the way to solve an asynchronous problem."

The definition of 'Asynchronous' I posted was copied straight from a dictionary. It is the same definition I was taught when I started working for a digital telecoms company, 20 years ago. Arguing that 'Asynchronous' might not mean what it means seems to me an exercise in hitting one's head against a brick wall. So, sure, go ahead, knock yourself out.

...well there is the start bit, the byte and the stop bit, between the start and stop bit there is only timing.

Is that 1 stop bit, 1.5, or 2? Is that byte 5, 6, 7 or 8 data bits? What about a parity bit: odd, even or none? You are talking about transmission-layer stuff, layers 1 to 2. Are you programming a UART directly? No, you are not. A discussion of the intricacies of the transmission layer is hardly relevant here.
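For the record, those frame parameters are fixed when the port is opened and the UART handles them for you. A minimal sketch, assuming a board whose core supports the two-argument form of Serial.begin (the standard AVR boards do):

void setup() {
  // 9600 baud, 8 data bits, no parity, 1 stop bit - the usual default
  Serial.begin(9600, SERIAL_8N1);
  // SERIAL_7E2, for example, would give 7 data bits, even parity, 2 stop bits
}

void loop() {
  // nothing to do; the point is only where the frame format gets configured
}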

How do you think arduino can distinct two '0' from one '0'??

Great, the UART knows how long it should take for one frame to arrive. How long will one character, two characters, or 1000 characters take? That depends on the length of the frame, on how fast the data is being placed into the transmit buffer by the host processor at the far end, and on how fast that data is framed onto the wire by the UART. Even for a constant stream of data being written to the transmit buffer, the inter-character spacing can and will vary. 9600, or whatever the rate is, is the maximum transmission throughput; it is not the throughput. Hence 'Asynchronous'.
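If you want that demonstrated rather than argued, here is a small sketch of my own (an illustration, nothing more) which prints the gap between consecutive received bytes. At 9600 baud an 8N1 frame is 10 bits, about 1.04 ms on the wire; run this against any sender and see how much of each gap is dead time the sender chose, not the baud rate:

unsigned long lastByteAt = 0;

void setup() {
  Serial.begin(9600);                  // 8N1: one frame is 10 bits, ~1.04 ms on the wire
}

void loop() {
  if (Serial.available() > 0) {
    Serial.read();                     // discard the byte; we only care when it arrived
    unsigned long now = micros();
    if (lastByteAt != 0) {
      Serial.print("gap (us): ");      // ~1042 us of this is the frame itself, the rest is idle time
      Serial.println(now - lastByteAt);
    }
    lastByteAt = now;
  }
}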

what do you think will happen if you let 2 arduinos communicate with eachother when one has Serial.begin(9600) and the other has Serial.begin(115200) in the setup.

They fail at the transmission layer, which has very little to do with the data being transmitted. It's the digital equivalent of a fault on a telephone line: both parties can still speak perfectly well, but their voices, the data, become corrupted during transmission. OSI 101.

...it wont work because they are not timing right.

Define 'work'. You might make it work by a coincidence of timing. Personally, I don't call that working. Have you found one example of an established async protocol which depends on timing? Apparently not.

and this "You need to make your protocol robust, which it isn't." WHY THE F*CK YOU THINK I POST THIS ON A FORUM.

I do not know. To vent your temper at strangers who don't agree with you, as far as I can see. I am pretty sure the forum is not your own personal entertainment channel, though.

If you dont have anything usefull to post, then don't post at all.

Please, do not try to tell me what to do.

You only bitch how wrong I am and you haven't given a single "solution" or answer

Just LOL.

The solution is that you put an end marker in your data and frame your data accordingly. As I said, and as has been said many times, by many people, on this thread and others already. I wrote a series of long posts, complete with code examples, which solved a real problem yesterday. Robin2 took even more time and care writing some FAQs, using very short words, to summarise the content of some long books on the subject. We are not the ones asking for help here. But here you are, claiming to know better, posting a beginner's mistake and swearing at the people who point it out. The swearing, the deflection and the bluster: it's all a bit too impotent.
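And since you ask for a solution rather than a sermon, here is the shape of it once more: a bare-bones receiver along the lines of Robin2's examples, assuming '\n' as the end marker and messages no longer than 31 characters. Adapt the marker and buffer size to suit:

const byte bufSize = 32;
char buf[bufSize];
byte idx = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  while (Serial.available() > 0) {
    char c = Serial.read();
    if (c == '\n') {                   // the end marker: the sender says "message complete"
      buf[idx] = '\0';                 // terminate the string
      handleMessage(buf);              // a whole message, however long it took to arrive
      idx = 0;
    } else if (idx < bufSize - 1) {    // guard against overrun if the sender misbehaves
      buf[idx++] = c;
    }
  }
}

void handleMessage(const char *msg) {
  Serial.print("got: ");               // placeholder: do whatever your application needs
  Serial.println(msg);
}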

Now that I have endured your ranting, may I ask you (again) to put your money where your mouth is. Provide a link to just one example of an established async protocol which marks the end of the data by inferring it from the passing of time. That is all you need to do. I would reciprocate with a link to an async protocol which uses an end marker, but that is basically all of them.

I feel sorry for you because you are to stupid to understand that you are an idiot and that you are not nearly half as smart as you think you are.

I'm not too stupid to know when to use 'to', 'too' and 'two'. Please don't waste your time feeling sorry for me. I am content to let the audience decide who might be the fool here. Stick to the point. Clue: it's about digital communication, not me.

so again, if you have nothing useful to add to my topic, will you please get the f*ck out of here.

Continue to swear at me and I may continue to answer back, and I am sure you would keep going, until the moderators ask us both to be quiet.