I was wondering how SoftwareSerial handles its input: the specs say that the Arduino Pro Mini has only pins 2 and 3 usable for interrupts. However, SoftwareSerial successfully reads input from all the other pins as well, which puzzles me big time.
Studying the source code, I see that the SoftwareSerial::begin method uses the digitalPinToPCICR(_receivePin) macro to test "if there is a valid PCINT for this pin", and apparently the implementation for the Pro Mini returns a valid value for every pin between 0 and 21:
#define digitalPinToPCICR(p) (((p) >= 0 && (p) <= 21) ? (&PCICR) : ((uint8_t *)0))
Long story short - it just works! Could someone explain to me by what means it works, when interrupts are supposedly declared only for pins 2 and 3?
The ATmega chips have External Interrupts on specific pins (pins 2 and 3 on an Uno) and Pin Change Interrupts on most pins (on the ATmega328P, all of them). Have a look at the Atmel datasheet for the fine details.
The Pin Change interrupts are less convenient because the Interrupt Service Routine (ISR) needs code to determine which pin caused the interrupt (if more than one is enabled for a port) and whether it changed from HIGH to LOW or from LOW to HIGH.
Hey, that's a great explanation - it makes sense now! Thanks!!!
It is strange that the main reference page about interrupts (interrupts() - Arduino Reference) does not say anything about Pin Change Interrupts.
Some ATmega chips don't have PCINT, e.g. the ATmega32 or ATmega16. I started my project with the ATmega32 and later moved to the ATmega328, as I needed software serial and the 328 supports it.
So before considering software serial, check the datasheet of your chip.