I am trying to implement CRC checking for data sent over serial and have been reading some papers on proper polynomial selection. I have read P. Koopman's paper "Cyclic Redundancy Code (CRC) Polynomial Selection For Embedded Networks", which describes the "good" polynomials. One thing about this paper that is causing me problems is that the maximum tested data length was 2048 bits.
I need to implement a good CRC for messages ranging from 4 bytes to 600 bytes. I have been testing the CRC-16 (Modbus) algorithm with the reflected polynomial 0xA001.
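For reference, this is a bit-at-a-time sketch of the CRC-16/Modbus variant I have been testing (0xA001 is the bit-reversed form of polynomial 0x8005; initial value 0xFFFF, no final XOR; the function name is just mine):

```c
#include <stdint.h>
#include <stddef.h>

/* CRC-16/MODBUS: reflected polynomial 0xA001 (bit-reversed 0x8005),
   initial value 0xFFFF, no final XOR, bit-at-a-time. */
uint16_t crc16_modbus(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];                      /* mix in next byte */
        for (int b = 0; b < 8; b++) {
            if (crc & 1)
                crc = (crc >> 1) ^ 0xA001;   /* shift out a 1: apply polynomial */
            else
                crc >>= 1;                   /* shift out a 0 */
        }
    }
    return crc;
}
```

The standard check value for this variant is 0x4B37 for the ASCII string "123456789", which is a quick way to confirm an implementation is wired up correctly.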
I have been searching for advice on how to select the proper polynomial and CRC width for my data sizes, but have been unsuccessful so far. Should I use CRC-8 for shorter data and CRC-16 for longer data?
Any suggestions, please? I would also be grateful if someone could cite a paper or similar source regarding my question.
I need to implement a good CRC for an interval of 4 bytes to 600 bytes.
The outgoing serial buffer is 64 bytes (512 bits). It seems to me that it would make sense to packetize your data into 64-byte packets, CRC included, and compute the CRC per packet rather than over the whole collection of data.
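That packetizing suggestion could be sketched as follows. The fixed 64-byte frame and the helper names `build_frame`/`frame_ok` are my assumptions, not part of the answer; the CRC is appended low byte first, as Modbus RTU does on the wire, which gives the convenient property that the CRC computed over an intact frame, CRC field included, is zero:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

#define PACKET_SIZE  64                   /* assumed frame size, matching the 64-byte buffer */
#define PAYLOAD_SIZE (PACKET_SIZE - 2)    /* last 2 bytes carry the CRC */

/* CRC-16/MODBUS as in the question (reflected polynomial 0xA001). */
static uint16_t crc16_modbus(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++)
            crc = (crc & 1) ? (crc >> 1) ^ 0xA001 : crc >> 1;
    }
    return crc;
}

/* Hypothetical framing helper: zero-pad the payload into a fixed
   64-byte frame and append the CRC low byte first (Modbus RTU order). */
void build_frame(const uint8_t *payload, size_t len, uint8_t frame[PACKET_SIZE])
{
    memset(frame, 0, PACKET_SIZE);
    if (len > PAYLOAD_SIZE)
        len = PAYLOAD_SIZE;
    memcpy(frame, payload, len);
    uint16_t crc = crc16_modbus(frame, PAYLOAD_SIZE);
    frame[PAYLOAD_SIZE]     = (uint8_t)(crc & 0xFF);  /* low byte first */
    frame[PAYLOAD_SIZE + 1] = (uint8_t)(crc >> 8);
}

/* Receiver check: with the CRC appended low byte first, the CRC over
   the whole frame is 0x0000 when the frame arrived intact. */
int frame_ok(const uint8_t frame[PACKET_SIZE])
{
    return crc16_modbus(frame, PACKET_SIZE) == 0;
}
```

One design note: with a fixed frame size the receiver always knows how many bytes to read before checking, which sidesteps resynchronization if a byte is dropped mid-stream.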
But I am a little confused right now... I thought that the receiving end determines the size of the buffer? For example, if I establish a serial connection between an Arduino (transmitter) and a PC (receiver) and read the data, isn't the buffer size limited by the receiver's buffer?
I am asking because I have been sending ~130 bytes of data and had no problem reading them. My loop executed every 15 ms and I was reading the serial port on the PC as fast as I could. As far as I could tell from analyzing the data, there was no loss. Probably I was reading the data fast enough?
I would like to keep the same method of communication.