How well can you define the UART protocol?

UART transmits LSb first.

These are concerns for lower-level (hardware) protocols. RS232, RS485, "TTL" (3.3V/5V/whatever), etc. define voltage levels and duplex/simplex (duplex simply means two uni-directional communication channels anyway).

Yes, UART is LSB first.
It only matters if you capture the waveform with a scope. When using a UART for Tx and Rx, the shift register inside the device (peripheral) does the job: a byte is sent, a byte is received (never mind whether it is LSB or MSB first, as long as both sides do the same).
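To make the LSB-first framing concrete, here is a small sketch (in Python, just for illustration) of the line states for one 8N1-style frame; the function name and the fixed 8-data-bit/1-stop-bit layout are my assumptions for the example:

```python
def uart_frame_bits(byte, stop_bits=1):
    """Return the line states for one 8N1-style UART frame.

    Frame layout: start bit (low), then the 8 data bits LSB first,
    then the stop bit(s) (high). The idle line level is high.
    """
    bits = [0]                                   # start bit: line pulled low
    bits += [(byte >> i) & 1 for i in range(8)]  # data bits, LSB first
    bits += [1] * stop_bits                      # stop bit(s): line high again
    return bits

# 'A' = 0x41 = 0b01000001 -> data bits on the wire: 1,0,0,0,0,0,1,0
print(uart_frame_bits(0x41))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```

This is exactly what you would see on a scope: the waveform reads "backwards" compared to the binary value written MSB-first on paper.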

"Idle" is high (on UART pins). A start bit and stop bit is low.
What "idle" means on your physical interface?
In terms of V.24 or RS232 - it is the negative voltage (I think, e.g. -12V or -5V).

Duplex vs. simplex depends simply on whether you have separate Tx and Rx (two uni-directional "channels"). They go via separate wires, so both sides can transmit in parallel: that is duplex. And UART is duplex (always).
Simplex would be just one wire, with the direction changing on that single wire, so only one direction at a time is possible.

Not really. Some devices can only RX or TX, so duplex is meaningless.

Sure, if you enable just Tx. But that is not the point:
a UART always provides separate Tx and Rx, so it is duplex (per se).
Not using duplex does not change the "nature" of a duplex UART: it is capable of duplex even if you do not operate in duplex mode.

Does the direction on Tx change? No, so it remains duplex-capable.

There is a "Single Wire Protocol" with just one wire for transmission: this would be simplex.
Even, when you use an SD Card in SDIO mode: the data lines change the direction.
You cannot write to an SD Card in SDIO mode when a read transaction goes on. This is simplex.

Simplex and duplex are not about how you use the link (e.g. just uni-directionally); they are about whether you can do both directions at the same time.

Think about radio communication with airplanes, or on a walkie-talkie: while you transmit, you cannot hear. The same frequency is used for both directions, so there is no way to communicate in both directions at the same time. That is impossible when Tx and Rx share the same "wire", obviously. But with two separate wires (as a UART has), duplex is possible (whether or not you use it).

My point is that you can't just say "UART is a communication protocol through which a device can send data to another device or receive data from other devices."
There are all of these "details" that have to be in "agreement" between both sides of the connection.

YES, and that is why it is called a "protocol". In terms of the ISO OSI seven-layer model, it involves all layers, including the physical parameters (e.g. voltage levels).

The details are "defined" by a protocol. UART is not a protocol, it is just a "bit shift register" device. UART just means: send bits of a word (byte) as a serial bit stream - nothing else.
A "protocol" is an agreement (contract), right: but much more as just to say "bits are travelling a path in a serial way".

Voltage levels on "connector" as well as the meaning of the bytes, the "ping-pong" between Rx and Tx, the flow control ... are also part of your protocol. One layer missing (e.g. how to do 'flow control' or how to interpret the data) results in a "protocol violation" and misunderstanding "when talking".


ONLY if they are transmitting ASCII. IF transmitting EBCDIC, they transmit high-order bit first.

For some reason, the UART port always sends the LSbit first; there is no option to change the bit order, as there is in SPI.
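If you ever do need MSB-first data on a UART whose hardware is fixed to LSB-first, the usual workaround is to reverse each byte in software before handing it to the transmitter. A minimal sketch (the function name is my own for the example):

```python
def reverse_bits8(byte):
    """Reverse the bit order of one byte in software.

    Since the UART hardware always shifts out the LSB first, sending a
    pre-reversed byte makes the original MSB appear first on the wire.
    """
    out = 0
    for _ in range(8):
        out = (out << 1) | (byte & 1)  # shift the lowest input bit in at the bottom
        byte >>= 1
    return out

print(reverse_bits8(0x01))  # 128  (0b00000001 -> 0b10000000)
```

The receiver must of course apply the same reversal, which is just another case of "both sides must agree".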

No! The STOP bit is always HIGH (Fig-1).
Figure-1: [UART frame waveform showing the STOP bit high]


Really? Do you have a citation for that? The only somewhat relevant thing I found online was in a comment by you a few years ago about the bit order of EBCDIC vs ASCII in the unrelated Bisync protocol (logic level on UART - #13 by Paul_KD7HB).

UART defines the bit order as LSb for any data sent through it. It doesn't care if it's ASCII or even character data at all (it can be, and often is, binary data).

Right there in my IBM manuals. I programmed a PC with a USART for both Bisync - EBCDIC- and for ASCII, async. Same chip in the same PC. Worked marvelously.

Ah, so you're actually talking about something that's not the UART protocol (even if it's a serial protocol that is supported by some hardware serial devices).

A USART (the hardware) may support other serial protocols besides UART, such as STR, BSC (aka Bisync), SDLC, and HDLC. Those are synchronous protocols (the "S" in USART), and they have their own rules separate from UART (the protocol).

UART (the protocol) is always asynchronous and defines data transmission as LSb.


No, the UART devices I was programming all had a bit to tell the UART which direction to send the bits. They would send ASYNC ASCII high order bit first if you told it to do that.

I finally trekked through the snow to the garage and into the storage loft to see what I could find. I found documentation on the communications FEP, Front end processor, I designed and programmed to allow a Data General mini to talk to other devices, including bisync terminals.

In the doc, I found the manual I made has schematics for the plug-in circuit boards and the UART and USART used. Then Google found old documents.

I am not correct about the direction of the bits being transmitted or received. As I also did the interface to a national ATM network, which was bisync and used the identical board as for other ASCII sync data, all the data bits go the same direction.

Each UART/USART document states that if the data must be sent HO bit first, the data lines to the chip must be reversed. I know I did not do that.

I think what got me going was the IBM 3270 series devices manual. They allowed ASCII data on their bisync network. One customer was insistent his 3270 network used ASCII, so I spent a lot of time researching that. Customer was wrong. Imagine that. Data was EBCDIC.

Historically, all data transmission began with mechanical Teletype machines. The first UARTs could be programmed to support 5-level Baudot communications, so all technology advances had to support the old stuff, and that is still true today.

The bit direction today is historical, but there is no reason it could not be the other way around for security reasons.

Those were the days. In graduate school, we had an IBM1620 that we could use as our own “personal computer.” After all, it only filled a very small room. Some of the grad students were working on projects that required different coding on the punched cards and you could change up how the keypunch coded the cards by changing out a wire wrapped module inside it. Not EBCDIC, just alien.:grinning:

I punched my own cards because if you submitted your source to the data center it took longer to correct the typos than to just do it yourself. I learned the hard way to check if one of those modules had been left in a keypunch. :rofl:

Never saw or touched a 1620, but did convert their Fortran to Fortran IV on IBM 360 mod 40. Had to run in BG only and sucked the entire processor while printing a line on the printer.

Job was to load logs into a ship so the ship could remain level.

Actual logs, like wood logs, and not log files? I suppose log files could keep a ship level if they're printed out on reams and reams of paper (which are just processed logs anyway). :smiley:

The customer was a stevedore company that loaded real tree parts into the holds of ships at the port of Portland, Oregon. Logs are really heavy. I don't know if they weighed the logs or not. Once I got the program tested to their satisfaction, I did not get involved afterward.

Just for fun…

A UART or USART is simply a pre-packaged shift register with some additional buffer-state flags.

There is no protocol inside the UART, simply the ‘protocol’ you choose to push data in or out of that ‘chip’ or software emulation.

That’s why/how a UART in hardware or software can support so many different protocols !
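Since a UART is "just a shift register", it can even be emulated entirely in software ("bit-banging"). Here is a hedged sketch of a software transmitter for one 8N1 byte; `write_pin` is a hypothetical GPIO callable (the real API depends on your platform), and `time.sleep` is far too imprecise for real hardware, so this only illustrates the sequence of steps:

```python
import time

BIT_TIME = 1.0 / 9600  # seconds per bit at 9600 baud

def bitbang_tx_byte(write_pin, byte):
    """Transmit one byte, 8N1, by toggling a line in software.

    write_pin is a hypothetical callable taking 0 or 1; on a real MCU
    you would use a timer interrupt instead of time.sleep for timing.
    """
    write_pin(0)                    # start bit: pull the idle-high line low
    time.sleep(BIT_TIME)
    for i in range(8):              # data bits, LSB first
        write_pin((byte >> i) & 1)
        time.sleep(BIT_TIME)
    write_pin(1)                    # stop bit: line back to idle high
    time.sleep(BIT_TIME)
```

Feeding `write_pin` with a list's `append` method records the exact level sequence, which is a handy way to sanity-check the framing without hardware.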

Oops, true! The STOP bit is high (a guaranteed period of 1 or 2 bit times at high level). Sorry.