I don't know how to contact a moderator. If you explain how, I can do it, or you can do it yourself if you prefer.
OK, I think I know a bit about serial transmission. Here is the minimal knowledge I worked out by myself and turned into a "working program" for the ATmega328 (please correct me if I'm wrong):
- When a byte is about to be transmitted, the TX pin goes from HIGH to LOW (the start bit).
- Each bit is held for 1000000/baud_rate microseconds.
- What happens once a byte has been transferred? How much time has to pass before the next byte is transmitted; is it maybe 1000000/baud_rate as well?
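For example, at 9600 baud each bit lasts 1000000/9600 ≈ 104.17 microseconds, so a full 8N1 frame (1 start bit + 8 data bits + 1 stop bit) takes 10 × 104.17 ≈ 1042 microseconds, i.e. at most roughly 960 bytes per second.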
So, what did I do in my first implementation?
To transfer a byte at 9600 bps:
- Set the TX pin from HIGH to LOW (the start bit).
- Wait 104 microseconds (1000000/9600 ≈ 104.17).
- Send the first data bit (bit 0).
- delayMicroseconds(104).
- Send the second data bit (bit 1).
- ... repeat the last two steps until the 8th data bit (bit 7) has been sent.
- Delay 104 microseconds before transferring another byte.
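In code, roughly this (pin 4 and the txByte name are arbitrary choices of mine; note that I set TX back to HIGH for a stop bit, which the list above leaves implicit):

const uint8_t TX_PIN = 4;            // any free digital pin
const unsigned int BIT_US = 104;     // 1000000 / 9600 ≈ 104.17 us per bit

void txByte(uint8_t b) {
  digitalWrite(TX_PIN, LOW);         // start bit: HIGH -> LOW
  delayMicroseconds(BIT_US);
  for (uint8_t i = 0; i < 8; i++) {  // 8 data bits, bit 0 (LSB) first
    digitalWrite(TX_PIN, (b >> i) & 1);
    delayMicroseconds(BIT_US);
  }
  digitalWrite(TX_PIN, HIGH);        // stop bit: line back to idle...
  delayMicroseconds(BIT_US);         // ...held for at least one bit time
}

void setup() {
  pinMode(TX_PIN, OUTPUT);
  digitalWrite(TX_PIN, HIGH);        // idle level is HIGH
}

void loop() {
  txByte('U');                       // 0x55: alternating bits, easy to check on a scope
  delay(1000);
}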
RX is the same as TX but in reverse. And the best way of detecting the start of a transmission is to set up an interrupt on the FALLING edge of a pin; in my case, pin 3.
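A rough RX sketch of that idea (the flag/ISR structure is just one way to do it, and it ignores the small latency between the edge and loop() noticing the flag):

const uint8_t RX_PIN = 3;            // must be an external-interrupt pin (2 or 3 on an Uno)
const unsigned int BIT_US = 104;     // 1000000 / 9600 ≈ 104.17 us per bit

volatile bool frameStarted = false;

void onStartBit() {                  // runs on the FALLING edge of the start bit
  frameStarted = true;
}

void setup() {
  pinMode(RX_PIN, INPUT_PULLUP);
  Serial.begin(115200);
  attachInterrupt(digitalPinToInterrupt(RX_PIN), onStartBit, FALLING);
}

void loop() {
  if (frameStarted) {
    delayMicroseconds(BIT_US + BIT_US / 2);    // 1.5 bit times: centre of data bit 0
    uint8_t b = 0;
    for (uint8_t i = 0; i < 8; i++) {
      if (digitalRead(RX_PIN)) b |= (1 << i);  // sample bit i, LSB first
      delayMicroseconds(BIT_US);               // step to the centre of the next bit
    }
    frameStarted = false;                      // we are now inside the stop bit (HIGH), so
                                               // edges seen during the data bits are discarded
    Serial.println(b, HEX);
  }
}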
But what is wrong in my code? If I use a timer, which I know is more accurate than a delay routine, I can sample the first data bit 1.5 × 104.17 µs after the falling edge, landing in the middle of the bit, can't I?
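For example, instead of chaining delayMicroseconds() calls, each sample can be scheduled against the timestamp of the falling edge, so timing errors don't accumulate (rxByteScheduled is a made-up name; RX_PIN and BIT_US as in the sketch above):

uint8_t rxByteScheduled(unsigned long tEdge) {
  // tEdge = micros() captured as close as possible to the start-bit falling edge
  uint8_t b = 0;
  for (uint8_t i = 0; i < 8; i++) {
    unsigned long target = tEdge + BIT_US + BIT_US / 2 + (unsigned long)i * BIT_US;
    while ((long)(micros() - target) < 0) { }  // wait for the centre of bit i; rollover-safe
    if (digitalRead(RX_PIN)) b |= (1 << i);
  }
  return b;
}

Capturing micros() inside the ISR and passing it in keeps the reference as close to the real edge as possible.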
Thanks Robin2,
cheers.