logic level on UART

Hi guys, I have a question about the interpretation of the logic levels on a UART. In every piece of documentation I have read, 0 V is a logic 0 and 5 V or 3.3 V corresponds to a logic 1. I have checked the transmission of a character ('M'), but I get a strange result.

As you can see in the screenshot I got the following signal sequence:

0 - 1 0 1 1 0 0 1 0 - 1

The first zero should be the start bit, the last 1 the stop bit... the remaining eight bits are the data bits... which, read left to right as MSB first, results in decimal 178. But that's not the ASCII code for an "M". But if I use inverted logic (0 V = 1, 5 V = 0), then I get the signal sequence for the data bits as:

0 1 0 0 1 1 0 1

This is decimal 77, and that's exactly the ASCII code for "M".

But in this case the start bit would be a logic 1 and the stop bit a logic 0, and that contradicts all the documentation, even the UART description in the Atmel datasheets.

Where is my error in reasoning? Do you have any idea?

No screenshot. See this Simple Image Guide

It would also be a good idea if you post your program.

...R

Hi Robin,

thanks for your comment. Sorry, I forgot to upload the image. I hope it is now visible.

Regarding the code: it is as simple as it could be:

void setup(){
    Serial.begin(9600);      // 9600 baud, 8N1 by default
}

void loop(){
    Serial.println("M");     // sends 'M' followed by CR and LF
    delay(1000);
}

I guess it is independent of the code and of the MCU. Any target I connect to the Arduino interprets this as the character "M", so it works. My question is only about the interpretation of the signal levels.

Osci_Screenshot.jpg

cyrano1960:
I hope it is now visible.

Not yet.

...R

Use another character, e.g. '0'.

Least significant bit goes first.

But I can see it in the attachments :o

But I have solved the problem and it is very strange :slight_smile:

My mistake was: after the start bit comes the LSB, not the MSB... if you mirror the bit pattern for 178:

1 0 1 1 0 0 1 0

you will get:

0 1 0 0 1 1 0 1

and that is 77 ("M").

This coincidence happens just for characters like the "M": if you mirror it, LSB to MSB, you get the same value as if you invert every bit. Sometimes a coincidence makes our lives very hard.

But perhaps it is helpful for one of you!

cyrano1960:
But I can see it in the attachments :o

I know you have it attached. But you have not made it visible. See the link I gave you earlier.

A while back I wrote a program to interpret serial data and, now that you mention it, I think the LSB does come first. I don't recall finding that strange when I was writing my code. If you think about it, the natural way to send a byte is LSB first - keep shifting to the right.

...R

Hello Sterretje,

thanks for your answer. That's it!! I was writing my answer as you posted yours.

Hello Robin,

sorry, I was unclear. I didn't find it strange that the LSB comes directly after the start bit. I found it strange that I happened to use an "M" to test this, and this character has a symmetrical pattern, so the mistake was hard to find :confused:

cyrano1960:
I found it strange that I happened to use an "M" to test this, and this character has a symmetrical pattern, so the mistake was hard to find

Yeah, that sort of thing can be a real PITA :slight_smile:

...R

cyrano1960:
Hello Robin,

sorry, I was unclear. I didn't find it strange that the LSB comes directly after the start bit. I found it strange that I happened to use an "M" to test this, and this character has a symmetrical pattern, so the mistake was hard to find :confused:

There is no law or rule that says an ASCII character must be sent LSB first. But historically, everyone does it that way to be compatible. Back when EBCDIC was also being used, it was always sent MSB first.

Paul

Back when EBCDIC was also being used, it was always sent MSB first.

The IBM2741 didn't use EBCDIC. AFAIK, there weren't any other IBM-specific "asynchronous serial" or rs232 terminals. (I think they decided that if their timesharing terminals had to use some weird character encoding anyway, they might as well use ASCII and be able to talk to those "non-IBM" terminals as well.)
The 3270 and similar wasn't async-based.

westfw:
The IBM2741 didn't use EBCDIC. AFAIK, there weren't any other IBM-specific "asynchronous serial" or rs232 terminals. (I think they decided that if their timesharing terminals had to use some weird character encoding anyway, they might as well use ASCII and be able to talk to those "non-IBM" terminals as well.)
The 3270 and similar wasn't async-based.

Correct. The normal IBM communication was Bisync using EBCDIC characters. Once a customer assured me his system used ASCII characters with Bisync. This was an option with Bisync, so I gathered up all the necessary documentation, and then discovered the customer had no clue and was just throwing out buzzwords!

The main difference between EBCDIC and ASCII on Bisync was which bit was sent first.

Paul