I am working on a typical projector installation. This means variable light. I want to use an Arduino to detect the position of multiple people in a square area that is approximately 6 x 6 meters (19 x 16 feet).
I think that IR sensors are out of the question, since light from the projector will interfere. Upon investigating, I concluded that I probably need ultrasonic sensors; however, they can only sense the first obstacle, and I'm not sure about their limited radius of reach, especially at closer distances (so I would need four, one for each corner).
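For what it's worth, reading an HC-SR04-style ultrasonic module (a common, cheap choice for this) boils down to converting the echo pulse width into a distance. A minimal sketch of the conversion, assuming the usual 4-pin module and a speed of sound of roughly 343 m/s:

```cpp
// Convert an HC-SR04 echo pulse width (in microseconds) to distance in cm.
// Sound travels ~0.0343 cm per microsecond; the echo covers the round trip
// to the obstacle and back, so we halve it.
float echoToCm(unsigned long echoMicros) {
    return echoMicros * 0.0343f / 2.0f;
}
```

On an Arduino you would feed this from pulseIn() on the module's echo pin; a 5831 µs echo works out to about 100 cm.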
The only thing I need is people's locations, and a relatively cheap way to get them. I already tried pressure-sensitive mats, but there are just too many to conceal.
Though I'm not looking for military-grade precision, it would be ideal to know a person's placement within each square meter. An exact head count is preferred, but I'd settle for missing some from time to time.
A grid of ultrasonic sensors is out of the question, because I'm only a student and can't afford more than 120 dollars.
Beanies are not compatible with the installation, BTW.
Could you please explain more about IR sensor interference? I keep reading that these sensors only work with stable lighting in the room after calibration. If the lighting from the projector changes drastically, I would assume it interferes. Or perhaps a projector's emitted wavelength range doesn't extend into the IR, so it wouldn't affect IR sensors even though it's always shedding light?
Of course, this is all up to experimentation, but I'd rather be as informed as possible beforehand.
Could you throw more computing power at the problem? Mount a webcam on the ceiling looking down, feeding (wirelessly or otherwise) a PC running some kind of image-processing and blob-tracking algorithm?
The idea behind this project is to use as few resources as possible, as I'm working with a very limited computer (1 GB RAM, no dedicated graphics processor, and a lot of graphics and sound happening in real time).
If I mount a camera on the ceiling, I would need a very high ceiling with no obstacles (which there are, in the room I'm working in) in order to properly capture the 19 x 16 foot square, which would in turn decrease blob-tracking efficiency.
Of course, if there is no other option I'll just have to handle it with computer vision, but I need to be sure there is no other workaround with electronics.
I designed the first version of this people-counter and ported the software to it:
It uses a 64x64 IR sensor array with 256 level grey-scale to detect people by their thermal signature. It uses a Kalman filter (lots of fast number-crunching), and can work in any lighting conditions, including complete darkness.
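For anyone wondering what that "fast number-crunching" looks like, the core of a Kalman filter is just a predict/update loop per tracked quantity. This is my own illustrative one-dimensional sketch (not Leon's actual code), with assumed noise constants:

```cpp
// Minimal 1-D Kalman filter: smooths a noisy scalar measurement
// (e.g. a person's estimated x position between frames).
// q = process noise, r = measurement noise -- both are tuning values.
struct Kalman1D {
    float x = 0.0f;  // state estimate
    float p = 1.0f;  // estimate variance
    float q, r;
    Kalman1D(float q_, float r_) : q(q_), r(r_) {}
    float update(float z) {
        p += q;                 // predict: uncertainty grows over time
        float k = p / (p + r);  // Kalman gain: trust in the new measurement
        x += k * (z - x);       // correct the estimate toward measurement z
        p *= (1.0f - k);        // uncertainty shrinks after the correction
        return x;
    }
};
```

A real tracker runs one of these (or a 2-D/4-D version) per target, which is where the number-crunching adds up.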
If you only need vague information, you could look at using a couple of IR or ultrasonic rangefinder modules. Mount them on servos and "scan" the room from two different directions; from there it is just a matter of simple triangulation.
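The triangulation step is straightforward: with two scanning stations a known distance apart on the same wall, each reporting the bearing angle at which it sees the target, you intersect the two rays. A sketch, assuming station A sits at the origin and station B at (baseline, 0), with angles in radians measured from the line between them:

```cpp
#include <cmath>

struct Point { float x, y; };

// Intersect the bearing rays from two rangefinder/servo stations.
// Ray from A: y = x * tan(angleA); ray from B: y = (x - baseline) * tan(angleB).
// Note: near-parallel rays (tA ~ tB) make the result blow up, so a real
// system should reject targets seen at nearly the same angle by both.
Point triangulate(float baseline, float angleA, float angleB) {
    float tA = tanf(angleA);
    float tB = tanf(angleB);
    float x = baseline * tB / (tB - tA);
    return { x, x * tA };
}
```

For example, with a 6 m baseline and the target seen at 45° from A and 135° from B, this puts the person at (3, 3), i.e. centered 3 m into the room.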
@Leon Heller, could you give us more details of the system it employs? Is this "IR sensor array" just a matrix of IR photodiodes?
The only thing I need is people's location, and a relatively cheap way to achieve so.
Tracking people has been the dream of the home-automation crowd for years. If there were a cheap and easy way to do it, you would see it in use in a lot of places. The easiest approach would probably be to keep a static image of the empty room and have a computer program compare pixel blocks in the current image against it. That should be within reason on a PC, similar to motion-detection webcam applications.
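The empty-room comparison can be prototyped with nothing fancier than per-pixel differencing against a stored reference frame. A minimal sketch over raw grayscale buffers (a real system would then group the changed pixels into blobs, one per person, rather than just count them):

```cpp
#include <cstdlib>
#include <vector>

// Count pixels that differ from the empty-room reference frame by more
// than a threshold -- the crude core of background subtraction.
// Both buffers hold 8-bit grayscale pixels in the same layout.
int changedPixels(const std::vector<unsigned char>& reference,
                  const std::vector<unsigned char>& current,
                  int threshold) {
    int count = 0;
    for (size_t i = 0; i < reference.size(); ++i)
        if (std::abs(int(current[i]) - int(reference[i])) > threshold)
            ++count;
    return count;
}
```

This is cheap enough that even a 1 GB machine should keep up at modest resolutions; the expensive part is the blob grouping and tracking on top of it.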