I need to use digital pins 0 and 1 for BOTH serial communication and digital I/O, and I'll need to toggle between the two uses programmatically.
I noticed in the reference that once I call Serial.begin(), I'm told not to use these pins for digital I/O. That's obviously not a limitation of the ATmega itself. So, since there is a Serial.begin(), is there effectively a Serial.end()? Can I just call pinMode(0, INPUT) and pinMode(1, INPUT) once I'm done using these pins for serial comm? I poked around in the source a bit, but beginSerial() in wiring_serial.c is using assembly macros and pinMode() isn't, and it's not clean enough for me to be sure what's happening, so I thought I'd ask.
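In case it helps frame the question, here's roughly what I'm hoping to be able to do. This is only a sketch, not tested: the register names (UCSR0A, UCSR0B, UDRE0, RXEN0, TXEN0) assume an ATmega168/328 (the ATmega8 drops the 0 suffix), and waiting on UDRE0 plus a short pause is just a crude way to let the last byte finish.

// Disable the USART and hand pins 0/1 back to normal digital I/O.
void releaseSerialPins() {
  // Crude wait for the transmit buffer to drain; the last byte may
  // still be shifting out, so a stricter version would poll TXC0.
  while (!(UCSR0A & _BV(UDRE0)))
    ;
  delayMicroseconds(1200);  // roughly one byte time at 9600 baud
  // Turning off the receiver and transmitter releases the pins,
  // so pinMode()/digitalRead()/digitalWrite() apply again.
  UCSR0B &= ~(_BV(RXEN0) | _BV(TXEN0));
  pinMode(0, INPUT);
  pinMode(1, INPUT);
}

Is that basically what a Serial.end() would have to do, or is there more to it?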
Well, my thought is that unless there is some implicit Serial.end() (i.e., unless simply calling pinMode() has the same effect), then we should have an explicit Serial.end(). But even if I code it, we would need good reasons for it to go into the core.
I have a special case where I'm using the UART to implement a custom 1-wire bus, and I need the pin as a digital input to sense whether another node is already communicating on the bus. But what about other, more "typical" cases?
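Concretely, the toggling I have in mind looks something like this. It's only a sketch: busIsIdle(), the timings, and the 9600 baud rate are invented for illustration, and the register names assume an ATmega168/328.

// Toggle pin 0 between plain digital sensing and UART use on a shared
// single-wire bus. Timings and the busIsIdle() test are placeholders.

bool busIsIdle() {
  // With the USART receiver disabled, pin 0 is an ordinary input, so
  // we can watch the line directly. "Idle" here just means the line
  // stayed high for a short window (values picked arbitrarily).
  for (int i = 0; i < 50; i++) {
    if (digitalRead(0) == LOW) return false;  // another node is driving the bus
    delayMicroseconds(10);
  }
  return true;
}

void sendOnBus(const char *msg) {
  if (!busIsIdle()) return;     // back off, the bus is busy
  Serial.begin(9600);           // hand pins 0/1 over to the UART
  Serial.print(msg);
  // Crude wait for the bytes to shift out; a stricter version would
  // poll the transmit-complete flag (TXC0) instead.
  while (!(UCSR0A & _BV(UDRE0)))
    ;
  delayMicroseconds(1200);      // about one byte time at 9600 baud
  UCSR0B &= ~(_BV(RXEN0) | _BV(TXEN0));  // release the UART pins again
  pinMode(0, INPUT);            // back to sensing the bus as a digital input
}

void setup() {
  pinMode(0, INPUT);            // start out sensing the bus
}

void loop() {
  sendOnBus("hello\r");
  delay(1000);
}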
I'll start the list:
A datalogger that automatically switches between logging and data-dump modes. At restart it would call Serial.begin(), send some byte(s), and then wait for a specific response. If the correct response is not received from the data-dump PC application, the device calls Serial.end(), switches pins 0 and 1 to inputs, and goes into logging mode. Otherwise pins 0 and 1 stay in use for serial comm. (A rough sketch of this idea follows after the list.)
?
?
Any more ideas that would support including a Serial.end()?
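To make the datalogger idea above a bit more concrete, here is a rough sketch of the start-up handshake. The 'D'/'K' bytes, the two-second timeout, and the register-level UART shutdown (standing in for the Serial.end() we're discussing) are all invented details, and the register names assume an ATmega168/328.

bool dumpMode = false;

void setup() {
  Serial.begin(9600);
  Serial.print('D');                  // announce ourselves to a possible dump application

  unsigned long start = millis();
  while (millis() - start < 2000) {   // give the PC two seconds to answer
    if (Serial.available() && Serial.read() == 'K') {
      dumpMode = true;                // PC answered: stay in data-dump mode
      break;
    }
  }

  if (!dumpMode) {
    // No PC attached: shut the UART down and reclaim pins 0/1 as inputs.
    UCSR0B &= ~(_BV(RXEN0) | _BV(TXEN0));
    pinMode(0, INPUT);
    pinMode(1, INPUT);
  }
}

void loop() {
  if (dumpMode) {
    // ... stream the stored samples out over Serial ...
  } else {
    // ... read pins 0 and 1 (among others) and log the values ...
  }
}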
It can be very handy to have a software interrupt that triggers when a specific byte has been received, for example a carriage return (13). That lets you process the receive buffer only after a full line has arrived, instead of having to watch and process every received byte in real time. Even better would be watching for several characters, or for a specific pause in the transmission (handy for MODBUS RTU).
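As far as I know the core doesn't expose a received-byte interrupt to sketches, but the line-buffering part of that idea works fine polled from loop(). This is only a sketch; the 64-byte buffer and the handleLine() name are arbitrary.

char line[64];
byte lineLen = 0;

void handleLine(const char *s) {
  // ... parse the completed command here; just echo it for now ...
  Serial.print("got: ");
  Serial.println(s);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Drain whatever has arrived, but only act once a full line is in.
  while (Serial.available()) {
    char c = Serial.read();
    if (c == 13) {                        // carriage return: line complete
      line[lineLen] = '\0';
      handleLine(line);
      lineLen = 0;
    } else if (lineLen < sizeof(line) - 1) {
      line[lineLen++] = c;                // accumulate until CR arrives
    }
  }
  // ... other work continues here without touching every byte in real time ...
}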