
### Topic: Indoor navigation and mapping, using sonar

#### Rupert909

##### May 27, 2016, 11:29 pm (Last Edit: May 27, 2016, 11:41 pm by Rupert909)
I've been doing some testing using a cheap sonar on top of a stepper, taking measurements of its surroundings. The sonar is of pretty low quality, but mostly detects an echo. I have a concept for an algorithm, and wanted to hear comments before building more than my current test harness.

The idea is to be able to build a rudimentary map of a room within ~1-2 KB of RAM, using overlapping circles identifying open area.

Doing a stationary round-the-circle scan of 30+ measurement angles, the code keeps track of the closest echo distance, and also counts the number of readings that fall into predefined intervals: under 50 cm, between 50 cm and 100 cm, and so on.

The minimum echo distance is stored along with the estimated coordinates of the robot. The robot then moves to some point within that circle, and performs a new circle scan. If the radius of the new "free area" is so small that it fits inside the original circle, we know there is an obstacle that the previous scan failed to see, and we may choose to dispose of that reading.

I plan to use dual 12V steppers implementing a simple differential drive. For each movement the robot makes, an inaccuracy factor is updated (increasing) ... unless we can identify having been there before. That's what those counters are for: creating simple numbers representing "scenery".

The elegance of this is that the algorithm does not bother with how sonar signals can bounce off surfaces at an angle, as long as we hopefully get an indication of the smallest unobstructed circular area. As we all know, the closest point on any plane is always at a 90-degree angle relative to that plane.

Has anybody tried something similar?

So far I've tried manually moving a test platform around to different locations and letting it scan the area, but to really get somewhere I need to construct a somewhat accurate moving platform, which is a big undertaking, so any tips would be appreciated.

Oh, and the robot will have bumper sensors too, possibly even an electronic magnetic compass. Only in revision 2 do I plan to fit it with deadly weapons and give it a mind of its own.
The internet is a great thing. Use it.
:s/internet/mind/g
Real Programmers use ++C

#### PaulS

#1
##### May 28, 2016, 03:45 am
Quote
along with the estimated coordinates of the robot.
How are you estimating that?

Quote
The robot then moves to some point within that circle, and performs a new circle scan.
How will you move a known distance in a known direction over unknown surfaces?

#### Rupert909

#2
##### May 28, 2016, 12:09 pm (Last Edit: May 28, 2016, 12:44 pm by Rupert909)
Quote
How are you estimating that?

Quote
How will you move a known distance in a known direction over unknown surfaces?
If we assume letting the robot start at some predefined position, for example (0,0), and a predefined heading, for example pointing along X-axis, then for each movement operation, be it rotation or moving forward, an internal value is updated according to heuristics determined experimentally. This value indicates accumulated margin of error.

Thus, when stopping to make a scan of the environment, the information is stored in memory along with the margin of error at that point (and "scenery data").

Further movement increases this margin of error value. Rotation around its own axis will be particularly error-prone, and must be minimized.

Anyway, this is where the "scenery data" comes in, represented by the counters for the different echo-distance intervals recorded during the sweep. Using the estimated position and the accumulated margin of error, together with "scenery data" from previous measurements made in the same general area, the hope is to be able to make the otherwise ever-increasing margin of error go down somewhat.

I have experimented with the "scenery data" collection and correlation algorithm for calculating a deviance factor between what the robot sees of its surroundings and previously collected information, and have been able to calculate a number which, if smaller than 1.0, indicates a pretty good match, scenery-wise.

Memory consumption per stored measurement:

x-position: two bytes
y-position: two bytes
minimum echo distance (free-circle radius): two bytes
accumulated-margin-of-error: two bytes
scenery-data: four counters x 4 bits = two bytes

Total: 10 bytes

Leaving room for other state, based on the 2 KB of SRAM available on an Uno, we can easily store 100 such measurements in RAM.