Artificial skin

Ok, so I am not going for the "full sensor package" of actual skin, but more a way to make a flexible-ish material that can sense touch in different locations. Does anyone have any ideas on how to do this?

I was thinking of making a simple grid of thin non-insulated copper wire, with some sort of spacing in between the vertical and the horizontal lines, somewhat like this:

    A B C D
1 _|_|_|_|
2 _|_|_|_|
3 _|_|_|_|
4 _|_|_|_|

A-D would be connected to outputs and 1-4 to inputs. The program would switch on A-D in turn, and read 1-4 so the Arduino would get the status of the grid like this:

A:1,2,3,4
B:1,2,3,4
C:1,2,3,4
D:1,2,3,4
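
In code I'm imagining something like this minimal scan sketch (pin numbers are just placeholders, and I'm assuming the rows can use the Arduino's internal pull-ups so a touch on the active column reads LOW):

```cpp
// Minimal 4x4 grid scan (pin numbers are placeholders).
// The active column is driven LOW; idle columns are left as inputs (high impedance)
// so two touched columns can't fight each other through a shared row wire.
// Rows use the internal pull-ups, so a touched crossing reads LOW.

const byte colPins[4] = {2, 3, 4, 5};   // columns A-D
const byte rowPins[4] = {6, 7, 8, 9};   // rows 1-4

void setup() {
  Serial.begin(9600);
  for (byte c = 0; c < 4; c++) pinMode(colPins[c], INPUT);        // idle: high impedance
  for (byte r = 0; r < 4; r++) pinMode(rowPins[r], INPUT_PULLUP); // rows pulled high
}

void loop() {
  for (byte c = 0; c < 4; c++) {
    pinMode(colPins[c], OUTPUT);
    digitalWrite(colPins[c], LOW);              // activate this column
    for (byte r = 0; r < 4; r++) {
      if (digitalRead(rowPins[r]) == LOW) {     // something bridges column and row
        Serial.print("Touch at ");
        Serial.print((char)('A' + c));
        Serial.println(r + 1);
      }
    }
    pinMode(colPins[c], INPUT);                 // release the column again
  }
  delay(50);
}
```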

Thing is, I would like the grid to be a bit bigger so I could have some outputs available for, say, tactile feedback (through skin/tongue). How would I go about doing this? I am guessing a serial-to-parallel shift register could save me some ports. Am I right in thinking that an S2P shift register takes serial input along with a clock pulse and outputs to, for example, 16 parallel pins?
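
If I've understood it right, an 8-output part like the 74HC595 (daisy-chainable to 16 or more outputs) would let me drive all the columns from just three pins. A rough, untested sketch using the built-in shiftOut(), with placeholder pin numbers; for more than one simultaneous touch I'd probably also want a diode or series resistor per column so two outputs can't short through a row:

```cpp
// Rough sketch: selecting one of 8 grid columns through a 74HC595 (pins are placeholders).
const byte dataPin  = 11;  // DS   (serial data)
const byte clockPin = 12;  // SHCP (shift clock)
const byte latchPin = 13;  // STCP (latch clock)

void setup() {
  pinMode(dataPin, OUTPUT);
  pinMode(clockPin, OUTPUT);
  pinMode(latchPin, OUTPUT);
}

// Pull exactly one of the 8 column lines LOW, all others HIGH.
void selectColumn(byte col) {
  digitalWrite(latchPin, LOW);
  shiftOut(dataPin, clockPin, MSBFIRST, ~(1 << col));  // active-low column select
  digitalWrite(latchPin, HIGH);                        // outputs change on the latch edge
}

void loop() {
  for (byte col = 0; col < 8; col++) {
    selectColumn(col);
    // ...read the row inputs here...
    delay(1);
  }
}
```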

For the inputs I'm guessing (damn, I do a lot of guessing in this field ::)) that I could use a multiplexer or something, but this is not critical if I can get the number of needed outputs down to 2. However, it would be nice if there were a simple component that would let me read parallel data serially. You know... give the chip a clock pulse and read bits representing the chip's inputs into the Arduino serially.
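
That part seems to exist too: a parallel-in/serial-out shift register like the 74HC165 latches its 8 inputs and then clocks them out one bit at a time. Read roughly like this (untested, pin numbers are placeholders):

```cpp
// Rough sketch: reading 8 row lines through a 74HC165 (untested, pins are placeholders).
const byte loadPin = 8;   // PL  (parallel load, active low)
const byte clkPin  = 9;   // CP  (shift clock)
const byte qPin    = 10;  // Q7  (serial output of the '165)

void setup() {
  Serial.begin(9600);
  pinMode(loadPin, OUTPUT);
  pinMode(clkPin, OUTPUT);
  pinMode(qPin, INPUT);
  digitalWrite(loadPin, HIGH);
  digitalWrite(clkPin, LOW);
}

byte readRows() {
  // Snapshot the 8 parallel inputs into the register...
  digitalWrite(loadPin, LOW);
  delayMicroseconds(5);
  digitalWrite(loadPin, HIGH);

  // ...then read them out MSB first: sample Q7, then clock to shift the next bit up.
  byte value = 0;
  for (byte i = 0; i < 8; i++) {
    value = (value << 1) | digitalRead(qPin);
    digitalWrite(clkPin, HIGH);
    digitalWrite(clkPin, LOW);
  }
  return value;
}

void loop() {
  Serial.println(readRows(), BIN);  // one bit per row line
  delay(100);
}
```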

I would greatly appreciate it if anyone could list some part numbers or give me some other creative feedback on how I can achieve reading a grid like this as simply as possible, using a minimum of I/O-lines.

d(^-^)

Edit: corrected my terrible English. ::slight_smile:

Rip the controller chip out of a computer keyboard. They are set up to do almost exactly what you describe, only with keys instead of skin sensors. Most keyboard controllers have data sheets available. If you use an older model, you can interface directly to the Arduino with the PS/2 protocol. Most keyboards will give you roughly a 6 or 7 by 22 matrix.
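
If you go that route, the PS2Keyboard library is one way to read an old PS/2 keyboard from the Arduino. A minimal sketch follows; the pin choices are just an example (the clock line needs an interrupt-capable pin, e.g. 2 or 3 on an Uno), and note that it gives you decoded keypresses, not raw matrix positions:

```cpp
#include <PS2Keyboard.h>

const int DataPin = 8;  // PS/2 data line
const int IRQpin  = 3;  // PS/2 clock line (must be an interrupt pin)

PS2Keyboard keyboard;

void setup() {
  keyboard.begin(DataPin, IRQpin);
  Serial.begin(9600);
}

void loop() {
  if (keyboard.available()) {
    char c = keyboard.read();  // decoded character for the key that was pressed
    Serial.print(c);
  }
}
```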

A keyboard matrix can only read one key at a time though, right?


Most touch matrices are one touch at a time: touchpads, touch screens, etc.

Touch two different points at the same time and the cursor will move in between the two contacts.

Making it sense multiple touches is very difficult.

I thought PlastBox was talking about a grid of simple switches, not a continuously variable device like a touchpad. For example, musical keyboards can be polyphonic, with all keys able to be read simultaneously. It seems like this would be more desirable for a skinlike device meant to measure touch. I hesitate to ask what you're actually building with this though :o


Well, I'm not supposed to reveal this but... Well, Mr. Skywalker contacted me and told me he was getting a bit frustrated with having to use the Force to get feeling in his prosthetic hand when masturbating. A pretty dire situation if you ask me; prosthetics are quite strong and could easily rip... erm, stuff. =P

On the more serious side, check out "sensory substitution" and brain plasticity.
http://www.utc.fr/gsp/publi/Lenay03-SensorySubstitution.pdf is one fairly good PDF giving some brief details on the subject.
http://youtube.com/watch?v=OKd56D2mvN0 is a YouTube video of a TVSS.

The brain can take a new perceptual modality and integrate it through an existing one. One example is the TVSS (tactile-visual substitution system), where the image from a camera is represented by a 20x20 "pixel" grid of electrodes on the tongue. When the user is able to freely manipulate the camera him-/herself, the brain fairly quickly understands that the information represented on the tongue is actually visual information. It has even been shown (using fMRI/PET scans) that after very rudimentary training with the TVSS, even adults born blind will register neural activity in the brain's visual center when using it.

This is in my opinion f***ing amazing!

I'm not aiming to replicate that, nor am I going to say that 20x20, low-contrast, black-and-white vision is any sort of substitute for real vision (even though it is enough for users, after a while, to even recognize faces). However, I wonder why sensory substitution isn't used in, for example, hand/arm/leg prosthetics. Imagine using a mechanical hand prosthesis to pick up an egg. You have NO feedback, except visual, to tell you what force is exerted on the egg by the prosthetic device. In fact, this lack of feedback is by far the biggest complaint most amputees have, and yet an artificial hand that costs (here in Norway) about $6500 does not include any form of feedback! :-?

It seems to me like a simple and extremely low-cost thing to fit, say, a rather sensitive force sensor to each of the 3 main fingertips of a hand prosthesis, coupled to vibrators or piezo elements on the subject's skin (where doesn't matter; plasticity, remember? :wink: ). Perhaps even connect a flex sensor to the fingers, also feeding back to a vibrator, to give the subject a sense of the position the hand is in without constantly looking at it.
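
Just to show how little is involved, here is a rough sketch of that idea: an FSR (force-sensing resistor) in a voltage divider on an analog pin, mapped onto the PWM duty cycle of a small vibration motor (pin numbers, wiring and scaling are all assumptions):

```cpp
// Sketch of the fingertip-feedback idea (assumed wiring: FSR voltage divider on A0,
// vibration motor driven through a transistor on PWM pin 9).
const int fsrPin   = A0;
const int motorPin = 9;

void setup() {
  pinMode(motorPin, OUTPUT);
}

void loop() {
  int force    = analogRead(fsrPin);           // 0-1023, rises with pressure on the FSR
  int strength = map(force, 0, 1023, 0, 255);  // scale to the PWM range
  analogWrite(motorPin, strength);             // harder touch = stronger vibration
  delay(20);
}
```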

Humans do in fact have 6 senses. Scientists agree that in addition to the usual 5 we talk about, there is also proprioception: the sum of touch, feedback from the force exerted on muscles and tendons, your sense of acceleration, balance and so on, which lets you always know the position of your body. Proprioception - Wikipedia

If one lost a finger, like my stepdad, and was fitted with an artificial finger that had a grid of switches for sensing where something is touching the fingertip, as well as an underlying force sensor to read the overall force applied to the finger, sensory-substitution research has shown that through active sensing the brain will, after a while, interpret the feedback (driven by the sensors in the fake finger) delivered through skin elsewhere on the body as coming from the finger.

You poke something with your prosthetic finger and feel a vibration in your lower arm proportional to the force of your poke. You see that the sensation should come from your finger, not your arm, your proprioception tells you the same, and after a while of this exploration (active sensing) the brain understands the link, and what happens in the prosthetic device is felt as actually coming from the lost finger.

Sorry for the long post! It just rubs me the wrong way that a prosthetic device with a price tag of about $6500 doesn't include any sort of feedback! Not even a simple analog force sensor coupled with a battery and a vibrator, costing less than 1/1000 of the prosthetic device itself. :-?

By the way, could the 4017 used here: http://metku.net/index.html?path=mods/vilkkuvalot1/index_eng be used to cycle between output lines in the grid? I know the Arduino would not be able to address individual lines or check which line is "live", but that doesn't seem important. A simple software counter going from 0 to 9 and then resetting, while sending pulses to the 4017, would keep track of which line is "live", unless something stupid like a loose wire or a short circuit messed things up, right?
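
Something like this is what I have in mind: clock the 4017 from one pin, reset it from another, and keep a software counter in step with it (pin numbers, a 4-row read and external pull-downs on the rows are all assumptions):

```cpp
// Sketch of cycling grid columns with a 4017 decade counter (untested, pins are assumptions).
// One pin clocks the 4017, another resets it; liveLine is the software copy of
// which of its outputs is currently high. Rows are assumed to have external pull-downs.

const byte clockPin = 2;
const byte resetPin = 3;
const byte rowPins[4] = {4, 5, 6, 7};
byte liveLine = 0;

void pulse(byte pin) {
  digitalWrite(pin, HIGH);
  digitalWrite(pin, LOW);
}

void setup() {
  Serial.begin(9600);
  pinMode(clockPin, OUTPUT);
  pinMode(resetPin, OUTPUT);
  for (byte r = 0; r < 4; r++) pinMode(rowPins[r], INPUT);
  pulse(resetPin);                        // start the 4017 on output 0
}

void loop() {
  for (byte r = 0; r < 4; r++) {
    if (digitalRead(rowPins[r]) == HIGH) {  // live column reaches this row through a touch
      Serial.print("Column ");
      Serial.print(liveLine);
      Serial.print(", row ");
      Serial.println(r + 1);
    }
  }
  pulse(clockPin);                        // advance the 4017 to the next output
  liveLine++;
  if (liveLine == 10) {                   // wrap after all ten outputs, in step with the chip
    pulse(resetPin);
    liveLine = 0;
  }
  delay(10);
}
```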