Yet another Software Serial

I don't know how to contact a moderator. If you explain how, I can do it; or if you want, you can do it yourself. :wink:

Ok, I think I know a bit about serial transmissions. Here is the minimal knowledge I worked out by myself and turned into a "working program" for the ATmega328 (correct me if I'm wrong, please; a small timing sketch follows the list):

  1. When a byte is about to be transmitted, the TX pin goes from HIGH to LOW (the start bit).
  2. Every bit transmission takes 1000000/baud_rate microseconds.
  3. What happens after a byte has been transferred? How much time has to pass before another byte can be transmitted, maybe 1000000/baud_rate?
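
To make sure we're talking about the same thing, here is my understanding of the frame timing in code form (just my own constants, assuming 8N1 framing):

```cpp
// Frame as I understand it (8N1):
//   idle HIGH | start bit LOW | bit0 .. bit7 (LSB first) | stop bit HIGH | idle
const unsigned long BAUD_RATE    = 9600;
const unsigned long BIT_DELAY_US = 1000000UL / BAUD_RATE;   // ~104 us per bit at 9600 baud
```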

So, what did I do in my first implementation?

For transmitting a byte at 9600 bps (a rough code sketch follows these steps):

  1. set TX pin from HIGH to LOW.
  2. wait 105 microseconds (1000000/9600 ≈ 104).
  3. send first bit (bit 0)
  4. delayMicroseconds(105).
  5. send second bit (bit 1).
  6. ... repeat steps 4 and 5 until bit 7 has been sent.
  7. wait 105 microseconds before transmitting another byte.
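
In code, my first TX attempt looks more or less like this (only a sketch; softTxPin, bitDelayUs and the test loop are made-up names/values, not my exact code):

```cpp
const byte softTxPin = 4;                 // any free digital pin
const unsigned int bitDelayUs = 104;      // 1000000 / 9600, rounded down

void softTxByte(byte b) {
  noInterrupts();                         // keep other interrupts from stretching the bit timing
  digitalWrite(softTxPin, LOW);           // start bit: HIGH -> LOW
  delayMicroseconds(bitDelayUs);
  for (byte i = 0; i < 8; i++) {          // 8 data bits, LSB first
    digitalWrite(softTxPin, (b >> i) & 1 ? HIGH : LOW);
    delayMicroseconds(bitDelayUs);
  }
  digitalWrite(softTxPin, HIGH);          // stop bit / back to idle
  delayMicroseconds(bitDelayUs);
  interrupts();
}

void setup() {
  pinMode(softTxPin, OUTPUT);
  digitalWrite(softTxPin, HIGH);          // idle level is HIGH
}

void loop() {
  softTxByte('A');                        // test: send one byte per second
  delay(1000);
}
```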

So, RX is the same as TX but in reverse. And the best way of detecting the start of a transmission is to set up an INT on the FALLING edge of a pin, in my case pin 3.

But what is wrong in my code? If I use a timer, knowing its accuracy is better than a delay routine, I can take the first sample of a byte 1.5 * 104.1 µs after the falling edge, landing in the middle of the first bit, can't I?
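
Just to be concrete, this is roughly how I understand the RX side (again only a sketch; pin 3, 9600 baud and the variable names are my assumptions, and sampling inside the interrupt routine like this may itself be part of the problem):

```cpp
const byte softRxPin = 3;                  // INT1 on the ATmega328
const unsigned int rxBitDelayUs = 104;     // 1000000 / 9600, rounded down

volatile byte rxByte = 0;
volatile bool rxReady = false;

void rxStartIsr() {
  // Falling edge of the start bit got us here; wait 1.5 bit times so the
  // first sample lands in the middle of bit 0, then 1 bit time per bit.
  delayMicroseconds(rxBitDelayUs + rxBitDelayUs / 2);
  byte b = 0;
  for (byte i = 0; i < 8; i++) {
    if (digitalRead(softRxPin)) b |= (1 << i);   // LSB first
    delayMicroseconds(rxBitDelayUs);
  }
  rxByte = b;
  rxReady = true;
}

void setup() {
  pinMode(softRxPin, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(3), rxStartIsr, FALLING);
  Serial.begin(115200);                    // hardware serial just to echo what was received
}

void loop() {
  if (rxReady) {
    rxReady = false;
    Serial.println(rxByte, HEX);
  }
}
```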

Thanks Robin2,

cheers.