I am making a laser tag system with my Arduino, and I've moved on to focusing the light from my IR LED through a lens to get more range and prevent the huge tag radius that would otherwise form.
Some background on my protocol: every time the gun is fired, I send a unique ID identifying the shooter (so the player who is hit knows who shot him) using SoftwareSerial through a 555 timer chip. The 555 is set up to generate a 56 kHz carrier so the IR light can be picked up by my 56 kHz IR receiver. To guard against bit errors, I simply send the UID, currently just a number between 1 and 14, twice in the same byte; e.g., for UID 3 the bits sent are 0011 0011. On the receiving end, if the two nibbles don't match, the byte is ignored and the receiver waits for the next byte.
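For reference, here's a minimal sketch of that encode/check logic. The pin numbers, baud rate, and names like `irSerial` are placeholders rather than my exact wiring/code:

```cpp
#include <SoftwareSerial.h>

SoftwareSerial irSerial(10, 11);  // RX, TX (TX drives the 555/IR stage) -- placeholder pins

// Pack the UID (1-14) into both nibbles of one byte, e.g. UID 3 -> 0011 0011.
byte encodeUid(byte uid) {
  return (uid << 4) | (uid & 0x0F);
}

// Return the UID if both nibbles match, or 0 if the byte looks corrupted.
byte decodeUid(byte b) {
  byte hi = (b >> 4) & 0x0F;
  byte lo = b & 0x0F;
  return (hi == lo) ? lo : 0;
}

void setup() {
  irSerial.begin(2400);  // placeholder baud rate
}

void loop() {
  // Transmit side (on a shot):
  // irSerial.write(encodeUid(3));

  // Receive side:
  if (irSerial.available()) {
    byte uid = decodeUid(irSerial.read());
    if (uid != 0) {
      // Valid tag: uid identifies the shooter.
    }
    // uid == 0 -> nibbles disagreed, ignore and wait for the next byte.
  }
}
```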
Anyway, the system works 100% of the time up close and without a lens: the UID sent by the laser gun is always decoded correctly and there are never any problems. When I introduce the lens, though, the data transfer only works up to about a foot from the lens, with an error rate of roughly 1 wrong identification in 10; any farther and the whole transfer goes to hell. At one point, about 4 feet from the lens, the data coming through the beam was 0% correct; it was literally never decoded correctly (for the example UID 3, it detected values 5, 10, 11, and 13, but rarely detected the same incorrect value more than a few times in a row).
And yes, these error rates already include my automatic rejection of bytes whose nibbles don't match, meaning that at 4 feet the data somehow changed from 0011 0011 to 0101 0101 (for 5), 1010 1010 (for 10), and so on.
My main question is this: if my system works 100% of the time without a lens, why does adding a lens cause such terrible failure rates? Is there something else I should be doing when using a lens?