I would stick to something that is standardized.
IrDA has been around a long time and provides everything you need, including two-way
data communications, but it isn't really used that much anymore.
There are also non-standard things like this: http://hackaday.com/2012/02/06/play-hide-and-go-seek-with-infrared-leds/
Unfortunately you may not be able to do that kind of stuff using just the camera in a smartphone as the sensor.
(Older devices actually had IrDA built in.)
If you are going to use an LED as output, I'd avoid using brightness levels to indicate
different data. I'd stick with on/off, which is much easier to detect.
I'd do a simple serial protocol. You could even do ASCII
(something like space = LED off, mark = LED on).
Drop the baud rate way, way down and it shouldn't be that difficult.
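As a rough sketch of what one such frame could look like, assuming standard async serial framing (one start bit, eight data bits LSB-first, one stop bit) mapped onto the LED off/on convention above:

```python
def frame_byte(byte):
    """Map one ASCII byte onto LED states for slow bit-banged serial.

    Convention from above: space = LED off (False), mark = LED on (True).
    The line idles at mark, so the start bit is a space and the
    stop bit returns the line to mark.
    """
    states = [False]                                     # start bit (space, LED off)
    states += [bool((byte >> i) & 1) for i in range(8)]  # data bits, LSB first
    states.append(True)                                  # stop bit (mark, LED on)
    return states

# At, say, 10 baud you would hold each state for 100 ms, e.g.:
# for state in frame_byte(ord('A')):
#     set_led(state)   # set_led() is a placeholder for your hardware call
#     sleep(0.1)
```

The receiver just samples the camera at the same slow rate, waits for the off (start) transition, then reads the next nine samples.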
If you want to crush down the number of bits per byte, you can even look way back
at the 5-bit Baudot codes (from the late 1870s).
There was a lot of this low-speed bit-compaction kind of stuff going on back
in the 1970s and earlier, on teletypes and even over early modems.
You might even be able to support a standard like 45.5 baud with Baudot codes,
or possibly just regular ASCII.
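For illustration, a minimal 5-bit encoder along these lines. The values are my assumption based on the commonly published ITA2 letters-case chart (only a few letters shown), and bit/transmit order varies between references, so check a real ITA2 table before relying on them:

```python
# Partial ITA2 (Baudot) letters-case table -- assumed values from the
# commonly published RTTY chart; verify against a real ITA2 reference.
ITA2_LETTERS = {
    'E': 0b00001, 'A': 0b00011, 'S': 0b00101,
    'T': 0b10000, 'O': 0b11000, ' ': 0b00100,
}

def baudot_encode(text):
    """Encode text as a list of 5-bit codes (letters case only)."""
    return [ITA2_LETTERS[c] for c in text.upper()]
```

Each character now costs 5 bits instead of 7 or 8, which at 45.5 baud was a real saving (shift codes handle figures and punctuation, omitted here).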
Google around for things like 45.5 baud, TDD, Baudot, 5-bit telex codes, etc.
You will find lots of interesting information that predates much of the
computer revolution that started in the 1980s.
There is often still support for these kinds of things even in today's operating systems.