Making a pair of motion-tracking "eyes"

I'm building an animatronic head, and I'd like to build an eye mech with eyes that aim at the closest moving object (the servos and eye mechanism are one can of worms; I'd like to concentrate on the tracking). I have a ping rangefinder, but that seems to just measure distance, not locate something in space. Does anyone know the best way to loosely track an object in space for a project like this?

It's a pretty complicated requirement, both in hardware and software.

Option 1: one or more cameras with very fast image analysis. That's not likely to be feasible on an Arduino; there are dedicated processor boards for this kind of video work, and even those don't cover driving the neck/eye actuators. Point-and-shoot cameras even have face detection built in now. I really wish they exposed that data over their USB ports.

Option 2: three or four simpler sensors in a triangle or cross formation. You need to pick a sensor that can distinguish a nearby object from the faraway background without interfering with the other sensors. When one sensor picks up a nearby object, the software should steer the eyes so that the object shifts "inwards" toward the center of the ring of sensors. Four sensors make the software easier to write, but three may work for some kinds of objects, and two are enough if you only want to pan left/right.
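The steering rule for the four-sensor cross can be sketched as a pair of opposing-sensor differences. Everything here is an assumption for illustration: the struct names, the 0-1023 reading range (typical of analogRead), and the threshold/gain values are all made up, and higher-reading-means-nearer holds for many IR reflectance sensors but not all.

```cpp
#include <cassert>

// Hypothetical layout: four proximity readings arranged in a cross.
// Assumes a higher reading means a nearer object.
struct CrossReadings {
    int up, down, left, right;   // raw values, e.g. 0-1023 from analogRead()
};

struct Correction {
    int pan;   // positive = turn right, negative = turn left (arbitrary units)
    int tilt;  // positive = look up, negative = look down
};

// Steer so that whichever sensor sees the nearest object gets pulled toward
// the center of the cross: the error is the difference between opposing
// sensors, scaled down by a gain divisor.
Correction centerOnTarget(const CrossReadings& r, int threshold, int gain) {
    Correction c{0, 0};
    int h = 0, v = 0;
    // Ignore readings below the "nearby" threshold so the distant
    // background doesn't produce jitter.
    if (r.right > threshold || r.left > threshold) h = r.right - r.left;
    if (r.up > threshold || r.down > threshold)    v = r.up - r.down;
    c.pan  = h / gain;
    c.tilt = v / gain;
    return c;
}
```

In a real sketch you would call this each time through loop() with fresh sensor readings and add the corrections to the current servo positions, clamped to the servo range.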

Here are some pretty nifty IR eyes; you can make your own (they offer a schematic) or buy them fairly cheap.

http://letsmakerobots.com/node/11293

http://letsmakerobots.com/node/10822

I think they only have code in BASIC, but maybe somebody could either "arduino-ify" that code or write some simple Arduino code for using them.
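The Arduino side doesn't need much: for a two-sensor left/right eye, one small step function is enough. This is a sketch of the idea only, not a port of the BASIC code from those pages; the sensor wiring, the deadband, and the one-degree step size are all assumptions.

```cpp
#include <cassert>

// One pan step for a two-sensor (left/right) eye: nudge the servo angle
// toward whichever IR sensor reads stronger. Readings within the deadband
// are treated as "centered" so the eye doesn't twitch on noise.
int panStep(int angle, int leftIR, int rightIR, int deadband = 20) {
    int error = rightIR - leftIR;       // positive: target is to the right
    if (error >  deadband) angle += 1;  // nudge right
    if (error < -deadband) angle -= 1;  // nudge left
    if (angle < 0)   angle = 0;         // clamp to a typical 0-180 servo range
    if (angle > 180) angle = 180;
    return angle;
}

// On the Arduino itself (hypothetical pins A0/A1 and a Servo object), the
// loop would be roughly:
//   angle = panStep(angle, analogRead(A0), analogRead(A1));
//   eyeServo.write(angle);
//   delay(20);
```

One step per loop pass keeps the motion smooth; scaling the step by the error instead would track faster at the cost of overshoot.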