Grid-based navigation using an ultrasonic sensor

hi,
I want my robot to navigate on a grid of one-foot squares, using an ultrasonic sensor on a servo to look around. Any ideas how to achieve this, or any place to get started?
I want it to work out its position by measuring the distances to its right and left.
At least at first, if I give it its present location and then give it the final position, it should be able to get there :wink:
Afterwards I'll keep improving it by adding more sensors
:slight_smile:

Could anyone help me, please? :cry:

You bump after an hour?
Get real. >:(

You have told us nothing about the "robot" or the environment, so how can we answer?

I guess you could use continuous-rotation servos and figure out how much time it takes to move forward one grid square. The grid would have x and y values. Every time you move forward a square you could do y++. Then if you turn left or right, you would switch to counting the horizontal (x) squares instead. You would have to keep track of the four compass directions as you move around the grid.

if (direction == NORTH) {
  if (turnLeft) {
    direction = WEST;       // note: = assigns, == only compares
  } else if (turnRight) {
    direction = EAST;
  }
}
Here is a different example. With this one the robot would just move north until it has gone
the right amount, then it would turn right and move east the right amount.

int targetX = 5;
int targetY = 4;

int currentX = 0;
int currentY = 0;

bool yIsDone = false;
bool startX = false;

void loop() {

  if (currentY < targetY) {
    // move forward a square
    currentY++;
  } else if (!startX) {
    yIsDone = true;         // only trigger the turn once
  }

  if (yIsDone) {
    // turn right
    yIsDone = false;
    startX = true;
  }

  if (startX) {
    if (currentX < targetX) {
      // move forward a square
      currentX++;
    }
  }
}

@awol sorry,
here are the details :- :wink:
It is meant to roam about my house and carry stuff around, that's all.

Good example buddy,
but what if I'm in another room and need to make many turns and all?
I guess I'll need to learn more about grid-based navigation.
Could you please provide me with a URL for a blog, a website, or a PDF, something I can keep referring to? I'll learn it step by step, right from the basics, so that I won't come up with stupid questions
and make someone angry " >:("
:stuck_out_tongue:

How will your robot recognize when it has crossed a grid boundary? If you are just relying on "knowing" how far you have traveled, in each direction, you will find that the lack of feedback will quickly result in massive errors.

Well, it mostly relies on the distance travelled, but in some places I decided to put down a patch or something like that, and I'm also measuring the distances to the right and left :wink:

any help ?
please ....... :cry:

What does the robot's environment look like?
How big is it?
What is your robot's configuration?
Is Google broken where you are?

any help ?

As in "Will some one program my robot for me?"? Or, do you have specific questions?

The environment is a house.
Coming to the configuration, I'm running it on one ultrasonic sensor, two DC motors (differential drive), one servo to pan the ultrasonic sensor, and an Arduino Duemilanove,
with a PVC pipe as the chassis :stuck_out_tongue:
I "don't want somebody to write me a program";
I will do it myself. For questions on where to get started, Google might yield sufficient results, but I just wanted someone I could ask so that I can clarify my doubts as I go :wink:
And please point me to some place where I can find quality information.

  1. You should implement encoders on the wheels/gearboxes/motors - so you can accurately track the motion of your robot (see the sketch after this list).

  2. Grid-based navigation isn't going to cut it in the cluttered environment of a house (unless everything in that house is arranged on a perfect grid and never moves).

  3. You are going to want to look into SLAM (Simultaneous Localization and Mapping) techniques for robots, if you want this to seriously work.

  4. Unfortunately, any form of real SLAM likely won't be implementable on an Arduino (not enough memory).

  5. Start looking into small form-factor PCs (Mini-ITX and smaller).
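To illustrate point 1, here's a minimal sketch of counting a single-channel wheel encoder with an interrupt; the pin, ticks-per-revolution, and wheel circumference are assumptions you'd replace with your own measurements:

// Hypothetical single-channel encoder wired to digital pin 2 (INT0 on a Duemilanove).
const byte ENCODER_PIN = 2;
const int TICKS_PER_REV = 20;              // slots on the encoder disc - measure yours
const float WHEEL_CIRC_CM = 21.0;          // wheel circumference - measure yours

volatile unsigned long encoderTicks = 0;

void countTick() {
  encoderTicks++;                          // one tick per rising edge from the encoder
}

void setup() {
  pinMode(ENCODER_PIN, INPUT);
  attachInterrupt(0, countTick, RISING);   // interrupt 0 = digital pin 2
  Serial.begin(9600);
}

void loop() {
  noInterrupts();                          // copy the multi-byte counter safely
  unsigned long ticks = encoderTicks;
  interrupts();

  float cm = (ticks / (float)TICKS_PER_REV) * WHEEL_CIRC_CM;
  Serial.println(cm);                      // distance travelled so far, in cm
  delay(500);
}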

The closest you might be able to get on an Arduino (and you would likely need a Mega, at least) would be to map the house as-you-go using an internal array of line segments. The biggest issue will be converting the information from the robot's interaction with the environment (distances travelled, ultrasonic sensors, IR sensors, touch sensors, etc.) into those line segments - not an easy task by any means.

You would also need a method to know when one line segment from one reading is actually an extension of another line segment from a previous reading (so you can combine the line segments to reduce memory consumption). Finally, you would need code to locate the robot "avatar", as represented in the program, within the internal map of line segments. Then there's path planning, dealing with moving objects (and knowing when those aren't a part of the map), line segment removal and update (so when a chair or other object changes position, it can be intelligently routed around)...
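Just to make the line-segment idea concrete, a map entry could be as simple as the struct below; the names, coordinate units, and segment limit are my own assumptions, and the hard parts (building segments from sensor data and merging collinear ones) are left out:

// Hypothetical map entry: a wall section, endpoints in cm relative to the start pose.
struct Segment {
  int x1, y1;                     // one endpoint
  int x2, y2;                     // other endpoint
};

const byte MAX_SEGMENTS = 40;     // keep small - a Duemilanove has only 2 KB of RAM
Segment segMap[MAX_SEGMENTS];
byte segCount = 0;

// Store a new segment if there is room; returns false when the map is full.
bool addSegment(int x1, int y1, int x2, int y2) {
  if (segCount >= MAX_SEGMENTS) return false;
  segMap[segCount].x1 = x1;
  segMap[segCount].y1 = y1;
  segMap[segCount].x2 = x2;
  segMap[segCount].y2 = y2;
  segCount++;
  return true;
}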

Indeed, a robot such as you're planning to build is going to be anything but simple; in fact, the hardware, electronics, etc. will be the simplest piece - it's the software that will be the challenging part. The software, though, will need as much information as it can extract from the environment, using as many sensors as you can mount - so it should be planned carefully.

I am not trying to dissuade you from the project. Instead, I am merely positing ideas and information which you seem not to be aware of, or maybe haven't had time to think about yet. Plenty of information on all of this exists on the internet, though. Be sure to look up information applicable to game programming as well - robotics is one of those topics that encompasses nearly every level of computer and engineering science that exists, and a great way to solve problems in robotics is to look at solutions from other industries and technologies that deal with the same areas.

Good luck, and be sure to let us know what you come up with!

:slight_smile:


Thanks crosh,
but I realized I was trying to do too much at once, so I decided to first design a map of my house for my bot ("I guess even this is not going to be that easy" - but I'll do it :slight_smile: ),
experiment with it, get it to at least 90% accuracy, and then look into things like Simultaneous Localization and Mapping.
I think I'm doing the right thing now,
and I will be sure to let you know if I come up with something :slight_smile:

What you want to do is fairly complex and time consuming. You might use the ultrasonics to detect objects in the travel path and possibly use beacons (do a forum search for beacon) for bot location. Any wheel slippage or calculation issues when turning make encoders unreliable.

I just built an obstacle-avoiding robot, so I wanted to build on that and go for grid-based navigation :slight_smile:

I've been thinking about doing this too.

Object detection with wheel encoders seems like the place to start. At least you could map a room with good floors (no slippage).

Ultrasound is only good for object detection or avoidance. Touch sensors are probably needed too. Like what if the bot gets stuck behind a 1" high wall? The ultrasound may not be needed, but your mechanical/sensor design needs robust object detection. It could be just big bumpers front and back with touch sensors. Stall detection would be a good idea too.
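A rough sketch of the bumper idea, assuming two microswitches (front and rear) on made-up pins, wired to ground and using the internal pull-ups:

// Hypothetical bumper switches: pressed = pin pulled LOW.
const byte FRONT_BUMPER_PIN = 7;
const byte REAR_BUMPER_PIN = 8;

void setup() {
  pinMode(FRONT_BUMPER_PIN, INPUT);
  digitalWrite(FRONT_BUMPER_PIN, HIGH);   // enable internal pull-up
  pinMode(REAR_BUMPER_PIN, INPUT);
  digitalWrite(REAR_BUMPER_PIN, HIGH);
}

void loop() {
  if (digitalRead(FRONT_BUMPER_PIN) == LOW) {
    // front bumper hit: stop, back up a little, pick a new heading
  }
  if (digitalRead(REAR_BUMPER_PIN) == LOW) {
    // rear bumper hit while reversing: stop and move forward instead
  }
}

Stall detection could then be as simple as noticing that the encoder count hasn't changed while the motors are being driven.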

For distance measurements, you could use dead reckoning. For example, full power on all motors for 2 seconds equals 2 feet. But the first improvement here is wheel encoders. They will give you accurate distance regardless of hill climbing or battery power. They do not detect slippage, though.
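A bare-bones version of that dead-reckoning idea might look like this; driveForward()/stopMotors() are placeholders for your own motor code, and the milliseconds-per-foot number is just the calibration guess from above:

void driveForward() { /* set both motors to full speed forward */ }
void stopMotors()   { /* stop both motors */ }

const unsigned long MS_PER_FOOT = 1000;   // guess: 2000 ms covers about 2 feet

void moveFeet(int feet) {
  driveForward();
  delay((unsigned long)feet * MS_PER_FOOT);
  stopMotors();
}

void setup() { }

void loop() {
  moveFeet(2);        // try to travel two feet, then stop for good
  while (true) { }
}

Swapping the delay() for "drive until the encoder count reaches N ticks" is the wheel-encoder improvement mentioned above.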

The next step would be an infrared homing beacon, because it is simple and cheap. This will help correct for slippage.
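One crude way to use such a beacon, sketched under some big assumptions (an IR phototransistor aimed forward on analog pin 0, placeholder turn/stop functions, and roughly four seconds per full rotation): spin once to find the strongest reading, then turn until you see it again.

// Hypothetical: forward-facing IR phototransistor on analog pin 0; the beacon at
// "home" is a bright IR LED, so a bigger reading means we point closer to it.
const byte IR_SENSE_PIN = 0;

void turnInPlace() { /* your differential-drive turn code */ }
void stopMotors()  { /* your motor stop code */ }

void setup() {
  // Phase 1: spin roughly one full turn and remember the strongest reading.
  int best = 0;
  unsigned long start = millis();
  turnInPlace();
  while (millis() - start < 4000) {        // ~4 s per revolution - a guess
    int r = analogRead(IR_SENSE_PIN);
    if (r > best) best = r;
  }

  // Phase 2: keep turning until the reading is close to that maximum again.
  while (analogRead(IR_SENSE_PIN) < best - 20) {
    // still turning
  }
  stopMotors();                            // now roughly facing the beacon
}

void loop() { }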

Now your bot can move around, bang into things, and find its way home.

For software, build the map as you move. I was actually thinking of doing it on my PC in Processing via Bluetooth because it would be easier to figure out what was happening. You could start with a black screen with the word "home" in the center. As the bot moves, it marks its path in white and obstacles in red.
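On the Arduino side all the sketch has to do is stream its current position and an obstacle flag; Processing (listening on the Bluetooth serial link) does the drawing. A minimal version, with the position variables assumed to be updated elsewhere by the odometry code:

// Hypothetical globals kept up to date by the drive/odometry code.
int botX = 0;            // position in grid squares, home = (0, 0)
int botY = 0;
bool hitObstacle = false;

void setup() {
  Serial.begin(9600);    // a Bluetooth module on the hardware serial pins looks the same
}

void loop() {
  // One CSV line per update: x,y,obstacleFlag - easy to parse in Processing.
  Serial.print(botX);
  Serial.print(',');
  Serial.print(botY);
  Serial.print(',');
  Serial.println(hitObstacle ? 1 : 0);
  delay(200);
}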

Hmm, good idea, I'll try it. I actually gave up on the ultrasonic idea for now; I'll start with encoders first.

Please post if you make any progress. I'm going to try too, as it sounds so interesting. It will probably take me a couple of weeks to get Bluetooth and a beacon working.

Wireless, a beacon, and a touch sensor seem like a good way to start. The wireless lets you monitor progress remotely, graph it with Processing, and do calculations on your PC. The homing beacon allows the bot to home every few minutes to a known position. Homing provides feedback for correcting the bot's dead reckoning and improving the maps, i.e. it allows its navigation to be self-correcting.

For mapping it is better to use an IR sensor, or better yet a LIDAR sensor, but those are hugely expensive. A servo with a Sharp long-range IR sensor can become the Poor Man's Laser (PML). Another good idea is to use a compass to correct the heading. Although it is better to use a PC to do the mapping and path planning, I think it is doable on the microcontroller. A Mega has enough RAM to do it, and you can always break the map into sectors and store the unused ones on a microSD card. To sum up, you need:

  • encoders
  • compass
  • PML sensor or LIDAR
  • a microcontroller for the mapping/path planning functions
  • separate sensors for obstacle avoidance

The micro reads the PML scanning sensor, compass, and encoders, plots the map, and decides the best path to follow to get to the desired destination. Sounds easy, but there is a lot of code to write to do this. It will take me a long time before I get results, but I intend to do it. It would be nice if someone got there faster, though...
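For what it's worth, here is a bare-bones sketch of the PML idea: sweep a pan servo and print angle/distance pairs. The servo pin, analog pin, and the distance formula are assumptions; calibrate the conversion against your own Sharp sensor rather than trusting these numbers.

#include <Servo.h>

// Hypothetical wiring: pan servo on pin 9, Sharp long-range IR on analog pin 1.
Servo pan;
const byte IR_PIN = 1;

// Rough inverse-distance conversion - a placeholder fit, not a datasheet constant.
float readDistanceCm() {
  int raw = analogRead(IR_PIN);
  if (raw < 30) raw = 30;                 // clamp to avoid divide-by-near-zero
  return 9462.0 / (raw - 16.92);
}

void setup() {
  pan.attach(9);
  Serial.begin(9600);
}

void loop() {
  // Sweep 0..180 degrees and print angle,distance pairs for the mapping code.
  for (int angle = 0; angle <= 180; angle += 5) {
    pan.write(angle);
    delay(50);                            // give the servo time to settle
    Serial.print(angle);
    Serial.print(',');
    Serial.println(readDistanceCm());
  }
}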