Although I've been dabbling with my Nano and Mega for a while, this is my first post to the Arduino forum. I'd like to thank everyone for their great input to the community.
The project: I have a Nano ('168) that is being fed data by a PC through the hardware serial at 115200. It updates an LCD to match what's going on on the PC display. The PC is running a Flash animation and the data is relayed by serialproxy. It works, and looks pretty cool as well. Don't ask me why: it is for the fun of it!
The challenge: I've added a mechanical rotary encoder to the Nano. The idea is that feedback from the encoder affects the animation. I'm reading it with an interrupt on dpin 2; the ISR only sets a flag, and I then check the rotation direction etc. in the main loop. It works, but I get glitches in the animation when turning the encoder a lot: the buffer for the screen data goes out of sync. This seems to be due to characters lost during the encoder interrupt. It's a minor problem, visible only for a few tenths of a second until the next frame buffer is ready, but it still looks ugly.
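I don't have the exact sketch at hand, but the pattern is roughly this, written here as plain C so the decode logic can be checked on its own. The table and names are illustrative, not my actual source; on the Nano the flag would be set from the pin-2 ISR and the decode would run in loop():

```c
#include <stdint.h>

/* Set by the encoder ISR on dpin 2; cleared by the main loop after decoding.
   (In the real sketch this lives next to attachInterrupt().) */
static volatile uint8_t encoder_moved = 0;

/* Classic quadrature decode table. An encoder state is (A << 1) | B;
   the index is (previous_state << 2) | current_state. Entries are the
   step for that transition: one direction is -1, the other +1, and
   illegal transitions (both channels changing at once) decode to 0,
   which also filters some contact bounce. Which sign is "clockwise"
   depends on how A and B are wired. */
static const int8_t qdec_table[16] = {
     0, -1, +1,  0,
    +1,  0,  0, -1,
    -1,  0,  0, +1,
     0, +1, -1,  0
};

/* Returns -1, 0, or +1 for the transition prev -> curr. */
int8_t qdec_step(uint8_t prev, uint8_t curr)
{
    return qdec_table[((prev & 3) << 2) | (curr & 3)];
}
```

In loop() the idea is: if `encoder_moved` is set, read the pins, call `qdec_step()` with the previous and current states, accumulate the result, and clear the flag.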
The questions: 1) Is it really the case that using interrupts while streaming serial always means lost characters in the HW serial pipeline? I.e., what actually happens in the USART while the processor is busy in the ISR: is it still doing anything (filling a buffer or something), or is the serial signal simply dropped? 2) Does anyone have practical experience using a soft serial at 57600 or above? 3) Does anyone know whether the processor really jumps back to exactly where it was in the main loop after an ISR? I'm getting suspicious... ;)
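My current understanding on question 1, which I'd like confirmed: the '168 datasheet says the USART receiver is double-buffered (a two-level receive FIFO behind the shift register), so the hardware can hold about two characters while interrupts are blocked. At 115200 8N1 one frame is ~87 us, so if my encoder ISR (which runs with interrupts disabled) keeps the RX interrupt waiting for more than roughly two character times, the next byte is lost and the Data OverRun flag (DOR0 in UCSR0A) is set. A toy model of that, pure C, no real registers:

```c
#include <stdint.h>
#include <stdbool.h>

/* Toy model of the '168 receiver: a 2-byte FIFO fed by the shift register.
   While the CPU is stuck in a long ISR nothing drains the FIFO, so a third
   arriving byte is dropped and the overrun flag sets. Simulation only. */
#define UART_FIFO_DEPTH 2

typedef struct {
    uint8_t fifo[UART_FIFO_DEPTH];
    int     count;
    bool    overrun;   /* models the DOR0 bit in UCSR0A */
} uart_model;

/* A byte finishes shifting in from the wire. */
void uart_byte_arrives(uart_model *u, uint8_t b)
{
    if (u->count < UART_FIFO_DEPTH)
        u->fifo[u->count++] = b;
    else
        u->overrun = true;  /* byte silently lost; DOR0 would be set */
}

/* The CPU (Arduino's RX interrupt reading UDR0) drains one byte.
   Returns -1 if the FIFO is empty. */
int uart_cpu_reads(uart_model *u)
{
    if (u->count == 0)
        return -1;
    uint8_t b = u->fifo[0];
    u->fifo[0] = u->fifo[1];
    u->count--;
    return b;
}
```

If that model is right, the signal isn't "dropped" outright; the first two characters survive, and only the ones after that are lost while the ISR runs long.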
I don't have the code to post at the moment (it's on another laptop at home). I'd just like to hear opinions on possible plans of attack for the design problem at hand, which is combining the need for interrupts with fast serial updates. And I know I'm not out of options: I could throw in a second Nano to handle the encoder if I can't get it working with one Arduino.