So, I know (or at least think I know) that the baud rate synchronizes the rate at which bits'n'bytes are sent across a serial connection, an RF24 link, or whatever flavour you choose.
But here I'm wondering more about the clock of each machine in the process and how often it initiates a 'send'.
E.g.
I have an app on my PC which responds to mouse movements.
The PC then sends the relevant values via USB serial to my Uno
Uno then sends the values to my MEGA via RF24
MEGA uses the values to drive servos on my bot.
What I am thinking is that my PC's processor runs at 2.4GHz
My Uno controller runs at 16MHz
So am I right in thinking that, unless managed, my PC will be sending faaaar more data across the serial link than the Uno could ever hope to process? (Sorry - 'control'.)
I certainly know that in the Uno's receiving code, I have to empty out the buffer or things get messy.
The above 'takes care' of the surplus data, but to my mind it's rather messy and inefficient to send so much more data than is ever needed.
Looking to the future, I might introduce a Due into the process, which runs at 84MHz.
This would mean I have a 2.4GHz, a 16MHz and an 84MHz machine, all spewing and eating packets at different rates.
Hence I was looking for a better way to manage this.
Being a little inexperienced, I am thinking...
Throw some delays into the Host's code to slow down the send rate (not too much, or there will be delays!).
Get the Host to wait for an acknowledgement of one send before sending the next (will have to test whether this slows things down).
I guess I'm here now just to ask whether there is a standard way of addressing this issue, or whether we shouldn't worry and simply 'bin' the surplus... am I reinventing the wheel?
Any thoughts, suggestions or pointers gratefully received!
Many Thanks
Simon