Tracking position of an autonomous robot

How do people usually go about determining the position of an autonomous ground robot (relative to a charging station)? The robot in question will roam around my house building a map of the walls and furniture, and maybe do something useful once it has all that information. I imagine it's possible with an accelerometer, but is there a sensor better suited to this task?

It's not practical with an accelerometer alone. If you try to calculate position by double-integrating the acceleration readings over time, you'll find yourself off course within just a few seconds: any small bias or noise in the readings gets integrated twice, so the position error grows quadratically. Hobby-grade accelerometers are just not precise enough for that.
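
To see why, here's a back-of-the-envelope illustration (plain C++; the 0.05 m/s² uncorrected bias and the 100 Hz sample rate are assumptions picked just for the example) of what happens when a small bias gets integrated twice:

// Back-of-the-envelope: what a small, uncorrected accelerometer bias does
// to a position estimate when it is integrated twice. The 0.05 m/s^2 bias
// and 100 Hz rate are assumptions for the example, not measured values.
#include <cstdio>

int main() {
    const double bias = 0.05;   // m/s^2, uncorrected sensor bias (assumed)
    const double dt   = 0.01;   // s, 100 Hz sample rate (assumed)
    double velocity = 0.0;      // m/s, first integral of acceleration
    double position = 0.0;      // m, second integral

    for (int step = 1; step <= 1000; ++step) {   // 10 seconds of samples
        velocity += bias * dt;                   // integrate acceleration -> velocity
        position += velocity * dt;               // integrate velocity -> position
        if (step % 100 == 0)                     // report once per second
            printf("t = %2d s  position error = %.2f m\n", step / 100, position);
    }
    return 0;
}

After only ten seconds the position estimate is already about 2.5 m off, and that's while sitting perfectly still, before any actual motion or vibration is added.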

Indoors your best bet is infrared beacons. If you just want to find the charging station, the rover can bump around until it sees the beacon and then home in on it. If you set up multiple beacons and have each one emit a distinct (remote-control style) code, the rover can do some triangulation.
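
For the multi-beacon case, the rover needs a bearing to at least two beacons at known positions; intersecting the two bearing lines gives its own position. A minimal sketch of that geometry (the beacon coordinates and bearings below are made up for illustration, and it assumes the bearings are measured in a common world frame, e.g. with a compass):

// Sketch: estimate the rover's position from bearings to two IR beacons at
// known positions. Bearings are world-frame angles (radians, counter-clockwise
// from +x) measured from the rover toward each beacon. All numbers below are
// made up for the example.
#include <cstdio>
#include <cmath>

struct Point { double x, y; };

Point locate(Point beaconA, double bearingA, Point beaconB, double bearingB) {
    // Unit vectors pointing from the rover toward each beacon.
    double dAx = cos(bearingA), dAy = sin(bearingA);
    double dBx = cos(bearingB), dBy = sin(bearingB);

    // The rover P satisfies beaconA = P + rA*dA and beaconB = P + rB*dB.
    // Eliminate P and solve the 2x2 system for rA (Cramer's rule).
    double rhsX = beaconA.x - beaconB.x;
    double rhsY = beaconA.y - beaconB.y;
    double det  = -dAx * dBy + dBx * dAy;     // near zero: beacons and rover in line
    double rA   = (-rhsX * dBy + dBx * rhsY) / det;

    return Point{ beaconA.x - rA * dAx, beaconA.y - rA * dAy };
}

int main() {
    // Beacons in two corners of a room; the rover is actually at (1, 1).
    Point p = locate(Point{0.0, 3.0}, atan2(2.0, -1.0),
                     Point{4.0, 1.0}, 0.0);
    printf("estimated rover position: (%.2f, %.2f)\n", p.x, p.y);   // (1.00, 1.00)
    return 0;
}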

An Arduino really isn't the right board for mapping; it doesn't have enough memory. If you use something like a Raspberry Pi, you can point a camera at the ceiling and use fiducial markers to triangulate your position quite accurately (google "reacTIVision").

The robot in question will roam around my house building a map of the walls and furniture

Just have it map the charging station before it starts to roam, then just have it return to that part of the map it made.

The goal is to have the robot completely autonomous, so no beacons or markers on the ceiling, although I like the idea of the charging station having a beacon to help locate it. The station might also have a wireless link to the robot (not sure if it's useful enough to include). Memory shouldn't be a problem as the robot will have an SD card. Also, the control board is going to be custom, so I'm not limited to regular Arduino hardware. I think a Raspberry Pi is overkill for this, and it'll probably be so easy to implement with one that it wouldn't be any fun to build!

zoomkat:
Just have it map the charging station before it starts to roam, then just have it return to that part of the map it made.

But how will it map the area and follow the map if it doesn't know how far it's moving? Or am I misunderstanding something?

The robot will have a pivoting ultrasonic sensor to determine how far away things are. If it could accurately determine its rotation, it could travel in a zig-zag pattern in hallways (and zig-zag to and from a wall in an empty room) so that there is always something moving in view of the sensor. Then, with some math I could figure out the total distance traveled down the hallway/along the wall. A compass IC would be best suited for getting rotation, right?
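If I'm picturing the geometry right, the math I had in mind for one zig-zag leg would be something like this (a rough sketch only, assuming the sensor stays aimed square at the side wall and the compass gives the angle between my heading and the hallway axis; the function name and numbers are placeholders):

// Sketch of one zig-zag leg: the robot drives at a known angle to the hallway
// wall while the ultrasonic sensor watches that wall. The change in range to
// the wall plus the heading angle give the leg length and the forward progress
// down the hall. Function name and numbers are placeholders.
#include <cstdio>
#include <cmath>

const double PI = 3.14159265358979;

// headingToWallDeg: angle between the robot's heading and the hallway axis
//                   (from the compass, assuming the hallway direction is known)
// wallRangeChange:  change in perpendicular range to the side wall over the leg
double progressAlongHall(double headingToWallDeg, double wallRangeChange) {
    double theta     = headingToWallDeg * PI / 180.0;
    double legLength = fabs(wallRangeChange) / sin(theta);  // hypotenuse of the leg
    return legLength * cos(theta);                          // component along the hall
}

int main() {
    // Example: a 30-degree leg during which the wall got 0.5 m closer.
    printf("forward progress: %.2f m\n", progressAlongHall(30.0, -0.5));   // ~0.87 m
    return 0;
}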

A relatively popular option is to use wheel encoders. If you search for Dead Reckoning you will find a number of tutorials. Here is one decent description: A Tutorial and Elementary Trajectory Model for The Differential Steering System
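
The basic update from that kind of tutorial boils down to a few lines: every time you read the encoders, convert the tick counts to wheel travel and update the pose. Here's a rough sketch of the differential-drive dead-reckoning math (the ticks-to-meters factor and wheelbase are placeholders, not values for any particular robot):

// Sketch: differential-drive dead reckoning from wheel encoder ticks.
// METERS_PER_TICK and WHEELBASE are placeholders; measure them on your robot.
#include <cstdio>
#include <cmath>

const double METERS_PER_TICK = 0.0005;   // wheel travel per encoder tick (assumed)
const double WHEELBASE       = 0.20;     // distance between the wheels, m (assumed)

struct Pose { double x, y, theta; };     // position in meters, heading in radians

// Call every time the encoders are sampled, with the ticks counted since the
// previous call. Works best when the intervals are short.
void updatePose(Pose &pose, long leftTicks, long rightTicks) {
    double dLeft  = leftTicks  * METERS_PER_TICK;
    double dRight = rightTicks * METERS_PER_TICK;

    double dCenter = (dLeft + dRight) / 2.0;         // distance the robot's center moved
    double dTheta  = (dRight - dLeft) / WHEELBASE;   // change in heading

    // Integrate along the average heading over the interval.
    pose.x     += dCenter * cos(pose.theta + dTheta / 2.0);
    pose.y     += dCenter * sin(pose.theta + dTheta / 2.0);
    pose.theta += dTheta;
}

int main() {
    Pose pose{0.0, 0.0, 0.0};
    // Right wheel turned a little more than the left: forward and curving left.
    updatePose(pose, 950, 1050);
    printf("x = %.3f m  y = %.3f m  heading = %.3f rad\n", pose.x, pose.y, pose.theta);
    return 0;
}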

David P. Anderson's work on robot localization and behavior with JBot is classic, and well worth studying: jBot: The Journey Robot - David P. Anderson

In particular, check out this video where the robot goes out (outdoors) for 100 feet and returns to the same spot. http://www.geology.smu.edu/~dpa-www/robo/jbot/jbot_100x.mpg

An SD card is not likely to help much with memory. You'll be able to store lots of data, but you won't be able to access it quickly, especially not in real time, which you will probably need. Beacons are the easiest way, but since you don't want to use them, you'll probably need to load the robot with enough sensors to detect all obstacles (including walls) within a certain range, as well as the distance to them. You will also need some way to calculate distance traveled; encoders on the wheels can work to some extent, but there is always some concern about their accuracy without another form of feedback (or without being sure your encoders are reliable). The task is not trivial. Good luck!

I reckon encoders on the wheels will lead to a lot of error over time because of slippage. I guess there's no single sensor that can precisely give position. The more I read about autonomous robots, the more it seems that people fuse the readings of several different types of sensors to get a position estimate. So far the sensors I'm thinking of using are an accelerometer, a digital compass, and an ultrasonic sensor. Will a digital compass be enough to reliably determine the rotation of the robot? Will the motors interfere with it? Or should I also include a gyroscope and combine its readings with the compass?
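
Rather than a plain average, what I've seen suggested is a complementary filter: integrate the gyro for smooth short-term heading and nudge the result toward the compass so it can't drift forever. Something like this, I think (a sketch only; the blend factor, sample rate and the way the readings arrive are assumptions, and the actual sensor driver code is omitted):

// Sketch: complementary filter blending a gyro rate with a compass heading.
// ALPHA, the sample period and the way readings arrive are all assumptions;
// the real sensor driver code depends on which ICs get used.
#include <cstdio>

const double ALPHA = 0.98;   // 98% gyro (smooth but drifts) / 2% compass (noisy but absolute)
const double DT    = 0.01;   // seconds between updates (100 Hz, assumed)

double heading = 0.0;        // filtered heading estimate, degrees

// gyroRateDps:    rotation rate from the gyro, degrees per second
// compassHeading: absolute heading from the magnetometer, degrees
// NOTE: a real version must also handle the 0/360 degree wrap-around.
void updateHeading(double gyroRateDps, double compassHeading) {
    double gyroHeading = heading + gyroRateDps * DT;                 // short-term, smooth
    heading = ALPHA * gyroHeading + (1.0 - ALPHA) * compassHeading;  // long-term correction
}

int main() {
    // Fake data: turning at a steady 10 deg/s for two seconds.
    for (int i = 0; i < 200; ++i)
        updateHeading(10.0, i * 10.0 * DT);
    printf("filtered heading after 2 s: %.1f deg\n", heading);   // ~20 deg
    return 0;
}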

mirith, the map will only be 2 dimensional, so there isn't a whole lot of data to be read/written when following/creating the map.

I was also throwing around the idea of aiming the guts of an optical mouse at the ground and getting distance readings from that. Focusing on the ground while going over bumps might be an issue and I don't know if a mouse will read correctly at high speed. What do you guys think?

aiming the guts of an optical mouse at the ground

How well does your optical mouse work if you lift it 1/4 inch off the table?

I was also throwing around the idea of aiming the guts of an optical mouse at the ground and getting distance readings from that. Focusing on the ground while going over bumps might be an issue and I don't know if a mouse will read correctly at high speed. What do you guys think?

A mouse works well when it is always pointed in a fixed direction relative to its movement. Try moving an optical mouse around as if it were a bot, rotating as it goes, and see how well the cursor tracks the mouse's position.

I've only recently begun developing my own autonomous indoor robot, and it is surprisingly challenging. So far I just have encoder-based odometry for dead reckoning, but I plan to incorporate support for gyroscopes and compasses next.

I think the odometry approach is very useful for localized maneuvers, like moving around obstacles, where you are using short-range sensor information and traveling small distances between readings. Over longer distances it quickly becomes much less useful.

Good luck with the endeavor. It sounds fun. Let us know if you blog about it as I would be interested in seeing how you get along.

You might be interested in taking the free online course "Artificial Intelligence for Robotics (Programming a Robotic Car)" offered by Georgia Tech through Udacity. It is taught by the designer of the Google self driving car.

charliesixpack:
You might be interested in taking the free online course "Artificial Intelligence for Robotics (Programming a Robotic Car)" offered by Georgia Tech through Udacity. It is taught by the designer of the Google self driving car.

I thought it was an MIT course, but yes, this one is pretty good and explains in a fairly straightforward manner how you make an AI for this type of thing.

bobthebanana:
I think a Raspberry Pi is overkill for this, and it'll probably be so easy to implement with one that it wouldn't be any fun to build!

  1. A Raspberry Pi is definitely NOT overkill for any autonomous robot.
  2. A fully autonomous robot is going to be a nontrivial challenge to implement even with all the best hardware money can buy (short of just buying an already fully autonomous robot anyway).

Do some research on SLAM (Simultaneous Localization And Mapping). If you don't want to rely on a pre-generated environment for navigation, you're going to want something with substantially more memory and processing power; otherwise you'll end up with a robot that moves at the pace of a snail, if not slower.
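
To get a feel for why the memory matters, even the mapping half of SLAM, updating an occupancy grid while assuming you somehow already know the pose (the part SLAM actually has to estimate), eats RAM quickly. A toy sketch, with a grid size picked purely for illustration:

// Toy sketch: marking an occupancy grid from one ultrasonic reading, assuming
// the robot's pose is already known (the part SLAM actually has to estimate).
// Grid size and resolution are picked only to show the memory cost.
#include <cmath>
#include <cstdint>

const int    GRID_SIZE = 200;    // 200 x 200 cells covering a 10 m x 10 m area
const double CELL_SIZE = 0.05;   // 5 cm per cell

uint8_t grid[GRID_SIZE][GRID_SIZE];   // 0 = unknown, 1 = free, 2 = occupied (40 KB)

// Mark the cells along the sonar beam as free and the cell at the echo as occupied.
// x, y in meters, heading in radians, range in meters (pose comes from odometry).
void markReading(double x, double y, double heading, double range) {
    for (double r = 0.0; r <= range; r += CELL_SIZE) {
        int cx = (int)((x + r * cos(heading)) / CELL_SIZE);
        int cy = (int)((y + r * sin(heading)) / CELL_SIZE);
        if (cx < 0 || cy < 0 || cx >= GRID_SIZE || cy >= GRID_SIZE) return;
        grid[cx][cy] = (r + CELL_SIZE > range) ? 2 : 1;   // last cell holds the obstacle
    }
}

int main() {
    // One fake reading: robot at (5, 5) m facing along +x, echo at 2.3 m.
    markReading(5.0, 5.0, 0.0, 2.3);
    return 0;
}

At one byte per cell that grid is 40 KB, and even packed down to one bit per cell it's still 5 KB, which is already past an Uno's 2 KB of SRAM before any of the localization code has been written.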

I can't second the suggestion of checking out David P Anderson's work enough. Here's his web page:

http://www.geology.smu.edu/~dpa-www/myrobots.html

In particular, the YouTube video at the bottom of his page is 2.5 hours of autonomous-robot insight that is well worth watching. Also, his bots were built back around 2002 using a pro version of the MIT HandyBoard, so they could definitely be done with Arduino technology.

This is a pretty hard project. All the dead reckoning methods are useful but only partial solutions.

IR beacons are a common solution, but even they are fairly challenging for beginners. They were used in the MIT Media Lab, and you can buy them from Pololu.

The Google car uses LIDAR, which is essentially a 360-degree range finder. In the hobby range, you might try an ultrasonic range finder or a webcam. The range finder can be mounted on a servo or motor so you can take readings across an arc without turning your bot. That should be plenty challenging for a noob. After that I'd switch to the webcam (which can't be processed on an Arduino), but there are lots of options. Webcams are cheap and provide images with lots of information; the challenge is in processing them.
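
For the swept range finder idea, here's roughly what it looks like on an Arduino with an HC-SR04-style sensor on a hobby servo (the pin numbers, the 10-degree step and the settle delay are placeholders; adjust them for whatever hardware you actually use):

// Sketch: sweep an ultrasonic range finder on a hobby servo to take readings
// across an arc without turning the robot. Pin numbers, the 10-degree step and
// the settle delay are placeholders; adjust them for the actual hardware.
#include <Servo.h>

const int SERVO_PIN = 9;
const int TRIG_PIN  = 7;
const int ECHO_PIN  = 8;

Servo sweeper;

// Trigger an HC-SR04-style sensor and convert the echo time to centimeters.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long echoMicros = pulseIn(ECHO_PIN, HIGH, 30000UL);   // time out at ~5 m
  return (long)(echoMicros / 58);                                // ~58 us per cm, round trip
}

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  sweeper.attach(SERVO_PIN);
}

void loop() {
  // One sweep: a range reading every 10 degrees from 0 to 180.
  for (int angle = 0; angle <= 180; angle += 10) {
    sweeper.write(angle);
    delay(150);                        // let the servo settle
    Serial.print(angle);
    Serial.print(" deg: ");
    Serial.print(readDistanceCm());
    Serial.println(" cm");
  }
}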