Would like your thoughts on an alternative to LCD displays using an LED and Morse code

I have been kicking around this idea for some time, but I would like to see if anyone else has any interest. As well, I am wondering if someone is already doing this, or something like it. I need not reinvent the wheel.

The problem: Devices need to display user feedback more complex than simple (human-read) LED blink codes can supply. Some devices are too small, or need to be made too inexpensively or simply, to incorporate an LCD or similar display.

The idea: Develop an open standard communication protocol using light (via LED, or similar) and Morse code (or similar encoding) that can be easily read via a smartphone app. Basically, the user points their phone's camera at a blinking LED on the device and obtains diagnostic/feedback/other info from it. If this were a standard of some kind, what I am envisioning is that just about any device could incorporate this protocol as a simple and effective means of user feedback. I would think you could supply a great deal of information very easily using a method like this. Future versions of the protocol could even allow for two-way communication (perhaps by flashing the phone's camera flash).

A practical example might be something like this: You have a device that outputs a variable amount of current to an LED driver based on temperature. In a simple device like this, you could easily have more money invested in the LCD display than in all the other components combined. You press a button and aim your smartphone, equipped with the Morse-reader app, at the LED; the LED starts flashing (a sync message, then the output). A message is displayed on your screen: "Current temp is 98 degrees, output is 85%, press button labeled 'button 2' to change parameters." At this point, any number of subroutines could occur.

Anyway, please let me hear your thoughts. It seems like having an open platform for providing feedback from inexpensive devices would be very helpful to many of the projects we work on with Arduino. I really think that if this is to work, it would need to be an open, collaborative project instead of something proprietary.

Thanks, -shane

I don't know if such a standard exists, but the idea certainly has merit. For example, when the serviceman works on our dishwasher he connects up a reader to one (or two) of the LEDs on the front (that we assumed simply indicated "working") and can send commands into the machine and get a response back. However this may be a proprietary protocol.

Probably some sort of protocol like Manchester encoding, which relies on transitions rather than static states, coupled with some sort of checksum to tell the difference between mere noise and data.
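A minimal sketch of that combination (Python, purely for illustration; the function names and the trivial modulo-256 checksum are my own choices, not part of any standard being proposed here):

```python
def manchester_encode(bits):
    """Manchester-encode a bit sequence: each data bit becomes a
    transition (IEEE 802.3 convention: 0 -> high-to-low, 1 -> low-to-high),
    so the receiver can sync on edges instead of absolute LED levels."""
    symbols = []
    for b in bits:
        symbols.extend((0, 1) if b else (1, 0))
    return symbols

def checksum(payload):
    """Trivial modulo-256 sum over the payload bytes, sent at the end
    of a frame so the receiver can reject random flicker."""
    return sum(payload) % 256

payload = [ord(c) for c in "98"]          # e.g. a two-digit reading
frame_bits = manchester_encode([1, 0, 1]) # each data bit -> two symbols
```

Since every bit produces an edge, the receiver never sees a long static level it could confuse with ambient light, which is the point of choosing Manchester here.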

When I think of morse code, I think of a couple dozen words per minute or so. Sending out the sample message you quoted (93 characters) would take a while this way. If the transmitter and receiver are both computerized as you suggest, then it can be way faster. Maybe the entire message in a second.

Some more thoughts: -Maybe use an IR LED for the transmitter, not unlike the "beam" function of Palm Pilots of yesteryear.

-You can't really do lower case letters with Morse code.

-I think pointing your phone at a device, waiting a couple seconds for the message, pressing a button, waiting a couple more seconds, pressing another button, etc. would get pretty annoying. Suppose you had an outdoor thermometer; I would want to be able to glance at it to see the temperature immediately, rather than go find my phone, load the app, go stand near the thermometer, press "send", wait, then read the temperature...

-As a project it definitely has a cool factor, and will present some interesting challenges, but with the price of LCDs so cheap these days, I can't see it catching on.

Thanks both for your replies. I think you're right, John: Morse code itself would not allow for enough different characters. I think, ideally, the protocol should be able to output (and map to) Unicode. That would open up a lot of opportunities with respect to interfacing through software (on either the Arduino/AVR side or the smartphone/reader side). I guess I was fixating on Morse code because of how universal it is. However, now that I think of it, Unicode is far more universal in a modern context. In any event, I guess the smartest thing would be to have something that can quickly/efficiently output a standard character set via the output device (here, a blinking LED).

-shane

Sounds like "POST codes". It's the beeps. http://en.wikipedia.org/wiki/Power_On_Self_Test

LCDs are cheap (but still not 20c like an LED), but they're large and need bezels to look nice. You can't get much smaller than a single LED, and anyone can drill a neat hole with a chamfered edge, so I think the idea has merit.

As for the code, what's wrong with plain old ASCII?


Rob

The idea: Develop an open standard communication protocol using light (via LED, or similar) and Morse code (or similar encoding) that can be easily read via a smartphone app.

Basic question: Do you already have data on how fast a smartphone can track an LED, i.e. how many bits/bytes per second are possible?

Thinking out loud: video at 30 frames/second will give one or two bytes per second per LED when using just ON/OFF (Nyquist). Using PWM to control the intensity of the LED, one could add a factor of 255 in theory, but maybe a factor of 10 (?) in practice. So roughly I expect on the order of 100-150 bits/second (10-15 chars/sec) should be possible. As this is per LED, one could combine 9 LEDs in parallel (including a parity LED), which would give approximately 100 bytes per second as an upper limit.
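The back-of-envelope can be run as a quick calculation (Python; the frame rate, level count, and LED count are all just the assumptions above). One wrinkle worth flagging: n distinguishable brightness levels add log2(n) bits per symbol, not a factor of n, so ten levels roughly triples the on/off rate rather than multiplying it by ten.

```python
import math

def led_throughput_bps(fps=30, levels=2, data_leds=1):
    """Rough upper bound on LED-to-camera throughput.

    Nyquist: a camera sampling at `fps` frames/second can recover at
    most fps/2 symbols per second.  Each symbol carries log2(levels)
    bits (levels=2 is plain on/off), and parallel LEDs multiply that.
    """
    return (fps / 2) * math.log2(levels) * data_leds

print(led_throughput_bps())                        # 15.0 bits/s: one LED, on/off
print(led_throughput_bps(levels=10))               # ~49.8 bits/s with 10 brightness levels
print(led_throughput_bps(levels=10, data_leds=8))  # ~399 bits/s, i.e. ~50 bytes/s
```

So even under optimistic assumptions, eight data LEDs with ten-level PWM land closer to 50 bytes per second than 100.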

Well, at this point, I believe the figure of 30 fps to be about right, based on the research I've done. I'm actually leaning back in the direction of Morse code again because of its limited character set and relative ease of interpretation from the phone/camera's point of view. Originally I had wanted to expand the number of displayable characters (e.g. Unicode or ASCII), but it doesn't seem feasible, at least not in version 1.0.

In reality, it's unlikely that a system using one LED to display Morse code, talking to a camera capturing data at ~30 fps, will be able to display a great deal of data, at least not very quickly. However, the overall idea here is to have a universal communication medium that can be viewed by any smartphone. No need for a device manual or spec sheets (to interpret blink codes, etc.). You walk up to a device you know (almost) nothing about, point your phone at it, and it 'talks' to you. That is the essence of what I'm trying to accomplish. For that sort of thing, I'm not sure much more than the A-Z/0-9 character set is necessary.

Thanks again for all the feedback. Happy to hear any additional input/ideas/critiques.

-shane

The only advantage I can see to Morse code is that there may be fewer bits with the common characters so it should be faster. OTOH you will need a longish space between characters because there's no way of detecting the end of each one, so that may put you back to the same throughput as if you used ASCII.

Also if you have a lot of numbers or less-common characters it will be a lot slower.


Rob

I wouldn't use Morse code personally.

http://en.wikipedia.org/wiki/Morse_code

For a start, a dash is three dot lengths, and a zero is 5 dashes. So sending a single zero (something you are quite likely to do) is going to be 15 dot lengths for the dashes, plus 4 more for the gaps between them, so a zero is 19 dot lengths. That's a lot of "bits" for a single number. By comparison, an 8-bit number can be sent as 0s and 1s in only 8 bits.

Yes, that's what I was getting at. Imagine sending the value 100, and let's say we can get away with 2 bit times for a dash (because it's a computer and not a human reading it) and 5 bits to delimit characters. That's

. - - - -   - - - - -   - - - - -

or about 13 + 5 + 14 + 5 + 14 + 5 = 56 bit times or thereabouts.

This value is sent in 8 bit times using binary, and 21 using 7-bit ASCII (27 if you count start and stop bits).
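That arithmetic can be checked in a few lines of Python (the timing assumptions are the ones from this post: dot = 1 bit time, dash = 2, a 1-bit gap between elements, and a 5-bit delimiter after each character; only the digits needed for "100" are in the table):

```python
# Partial Morse table, just the characters used in this example.
MORSE = {'1': '.----', '0': '-----'}

def morse_bit_times(text, dot=1, dash=2, gap=1, char_gap=5):
    """Total bit times to send `text` in Morse under the thread's
    assumptions: dash = 2 bit times, 1-bit gaps between elements,
    and a 5-bit delimiter after each character."""
    total = 0
    for ch in text:
        elements = MORSE[ch]
        total += sum(dash if e == '-' else dot for e in elements)
        total += (len(elements) - 1) * gap  # gaps inside the character
        total += char_gap                   # delimiter after the character
    return total

print(morse_bit_times("100"))  # 56 bit times, vs. 8 for a plain binary byte
```

Seven times the cost of a raw byte, which is the crux of the argument against Morse here.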

I do like the idea and agree that you only need a limited character set, but just can't see morse as being appropriate.


Rob

Nick, Rob, good points on the limitations of Morse code as an encoding scheme. I guess it's back to the drawing board in terms of the most efficient use of the ~30 fps we have to work with. A couple of additional ideas:

1.) A 'custom' encoding scheme: not Morse, not ASCII, but perhaps around 60 characters (A-Z, 0-9, and a collection of other characters such as punctuation) that map to a 6-bit encoding scheme. Do some research to determine which of those characters are most likely (statistically) to be used, and give them dominance in the lower-order values (e.g. 'a' is 1, '0' is 2, 'e' is 3, 'q' is 35, something like that). This would be done to enhance character acquisition speed.

2.) A variation on 1, where there are a number of encoding schemes, all similar, but mapping different characters to different encoded values. The idea here would be to optimize for a given application. This would, of course, be more work and more code.

3.) This one is a bit removed from the original concept, but could actually be quite powerful. You know those QR code readers? How about a variation on that theme? The original blink codes could correspond to a URL of some kind (this would be the most complex part of the encoding). After the phone has 'synced up' with the appropriate server, simpler blink codes could simply 'replay' an already-downloaded list of predefined messages. This would allow for a much larger array of data to be displayed (video, images, sound, whatever).
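Idea 1 could be sketched roughly like this (Python; the alphabet and its frequency ordering are purely illustrative, not the result of any actual frequency study):

```python
# Hypothetical fixed-width 6-bit character set, most frequent first
# ("etaoin shrdlu" ordering for the letters is a common rough guess).
ALPHABET = " etaoinshrdlucmfwypvbgkjqxz0123456789.,:;!?'()-/=+*%#&"
assert len(ALPHABET) <= 64  # must fit in 6 bits

ENCODE = {ch: i for i, ch in enumerate(ALPHABET)}

def to_sixbit(text):
    """Map text to a list of 6-bit code values.  Fixed width, so there
    is no end-of-character ambiguity as with variable-length Morse."""
    return [ENCODE[ch] for ch in text.lower()]

def from_sixbit(codes):
    return ''.join(ALPHABET[c] for c in codes)

print(to_sixbit("temp is 98"))  # every character costs exactly 6 bits
```

One caveat, since the codes here are fixed width: the frequency ordering doesn't actually shorten messages; it would only pay off in a variable-length scheme.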

Anyway, just brainstorming (and 'storm' may be a bit of a stretch)...

-shane

If you aren’t worried about lower case then yes, you can shave it down to 6 bits and have plenty left over:

26 A-Z
10 0-9
28 punctuation etc.

I can’t see much (any?) benefit in determining the most-used characters: with a fixed-width code the ordering doesn’t shorten anything, and exploiting it means variable-length bit fields, which is one of the Morse code problems, i.e. how do you detect the end of a character? The answer is probably a delay, and that defeats the whole point. I really don’t think you have to shave off every bit; this is for very short diagnostic messages, and if it takes 1 second or 2 seconds I don’t think it matters.

I would essentially stick with plain text (no other option if you go to 6 bits anyway). The ASCII values may have to be shifted, though: normally the printable chars start at 0x20, so maybe shift the lot to start at 0 and use the spare upper values for something clever (I don’t know what; maybe an ESC character for lower case, or use lower case normally and escape upper case).
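That shift could look something like this (Python sketch under one possible reading: keep only 0x20, space, through 0x5A, 'Z', which covers upper case, digits, and common punctuation in 59 codes, leaving 59-63 spare for escapes; the ESC_LOWER value is a made-up example):

```python
ESC_LOWER = 63  # hypothetical escape code: "next character is lower case"

def encode_char(ch):
    """Shift printable ASCII down to fit in 6 bits: ' ' (0x20) becomes
    0 and 'Z' (0x5A) becomes 58, leaving codes 59-63 spare."""
    code = ord(ch.upper()) - 0x20
    if not 0 <= code <= 58:
        raise ValueError(f"{ch!r} has no 6-bit code in this scheme")
    return code

def decode_char(code):
    return chr(code + 0x20)

print(encode_char('A'))   # 33
print(decode_char(0))     # ' '
```

The receiver just adds 0x20 back, so no lookup table is needed on either end.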

This idea I like. The device sends a URL and a diagnostic code, and you can then get the full monty: error descriptions, user manual pages, technical manual pages, you name it. Heck, the phone could even ring tech support for you :)

This will not be complex if you stick to plain(ish) text and don’t get too clever; after all, how would you map codes to URLs anyway?


Rob

how would you map codes to URLs anyway?

I figured it out.

You take the 26 alpha characters plus a few punctuation characters. You combine them in a certain sequence to generate what is essentially a hash code, then you use part of that code to index into a huge lookup table on a server, which gives you the address of another server to which you send the rest of the code. That server sends back a web page with the fault information.

You could call it Device Diagnostic Network System, or DDNS for short.

Man, I don’t know why I never made my fortune :)


Rob

Just thought I'd throw this in: LEDs work quite well at sensing too.
http://www.merl.com/papers/docs/TR2003-35.pdf

I would stick to something that is standardized. IrDA has been around a long time and provides everything you need, including two-way data communication, but isn't really used that much anymore. There are also non-standard things like this: http://hackaday.com/2012/02/06/play-hide-and-go-seek-with-infrared-leds/ Unfortunately, you may not be able to do that kind of stuff using just the camera in a smartphone as the sensor. (Older devices actually had IrDA built in.)

If you are going to use an LED as output I'd avoid using brightness levels to indicate different data. I'd stick with on/off which is much easier to detect.

I'd do a simple serial protocol. You could even do ASCII (something like space = LED off, mark = LED on). Drop the baud rate way, way down and it shouldn't be that difficult. If you want to crush down the number of bits per byte, you can even look back at the 5-bit Baudot codes (from the late 1870s). There was a lot of this low-speed bit-compaction stuff going on back in the '70s and earlier, on teletypes and even over early modems.
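As a sketch of that serial approach (Python; the framing follows the usual async-serial convention implied above: idle/mark = LED on, one start bit at space, data bits LSB first, one stop bit at mark):

```python
def frame_byte(value, data_bits=7):
    """Frame one character the way an async serial line would:
    start bit (0 = LED off), `data_bits` data bits LSB first,
    then a stop bit (1 = LED back on, the idle state)."""
    bits = [0]                                        # start bit
    bits += [(value >> i) & 1 for i in range(data_bits)]
    bits.append(1)                                    # stop bit
    return bits

# 'A' is 0x41 = 1000001 in binary, sent LSB first between framing bits:
print(frame_byte(ord('A')))  # [0, 1, 0, 0, 0, 0, 0, 1, 1]
```

At 45.5 baud each 9-bit frame takes about 200 ms, roughly 5 characters per second; a 30 fps camera would need the rate dropped further still, to around 15 baud, to sample each bit reliably.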

You might even be able to support a standard like 45.5 baud with Baudot codes, or possibly just regular ASCII.

Google around for things like 45.5 baud, TDD, Baudot, 5-bit telex codes, etc. You will find lots of interesting information that predates much of the computer revolution that started in the '80s.

There is often still support for these kinds of things, even in today's operating systems.

--- bill

In regard to reinventing the wheel:

I am positive such devices already exist. I believe I have seen a paper in which two circuits communicated bidirectionally using one LED per circuit (I think it was by Toshiba or Motorola or some company with a Japanese name).

Then I think some MIT people developed microcontrollers that are reprogrammable by strobe-like light from the monitor.

I don't have time to google, but I am sure there are more things out there.

Still, it's a really cool idea, and a standard protocol would be cool…

Re reply #16, nice one having just spaces for a name :)


Rob

Mister is now being investigated by the moderators; I suspect some sort of forum bug. Meanwhile, if you () would like to edit your profile and fix your name up, that might help.

If you look at the members list there are a few no-names.


Rob