Creating a single coordinate system from multiple relative coordinate systems? (Arduino robots detecting each other via image recognition)

Let's say I have multiple robots moving around in a 2d space. The robots can detect each other via image recognition. If robot A detects robot B, A knows B's distance and heading/angle relative to itself. This creates a relative coordinate set where A knows that B is X inches away, at an angle of Y degrees.

This then happens with multiple robots detecting each other at once, so I have multiple sets of these relative coordinates. How can I stitch them together to create one single coordinate set that has every robot's position?

That is not a particularly simple problem, because of measurement errors. None of the distances and angles are exact and the various measurements will conflict with each other.

Fortunately this problem can be approximately solved using least squares methods, with one robot being chosen as the origin.

It is the same problem as locating a single robot from a set of beacon distances, and the simplest approach is described in this paper.

For other methods, look up "find point coordinates from distance matrix". First hit: geometry - Finding the coordinates of points from distance matrix - Mathematics Stack Exchange
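As a concrete illustration of that distance-matrix method, here is a minimal Python/NumPy sketch of classical multidimensional scaling, which recovers 2D coordinates from a full pairwise distance matrix (only up to rotation, reflection, and translation). The four robot positions are made-up numbers for the example:

```python
import numpy as np

def positions_from_distances(D):
    """Classical MDS: recover 2D coordinates from a full N x N pairwise
    distance matrix D, up to rotation/reflection/translation."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:2]      # keep the two largest
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# Hypothetical ground-truth positions for four robots (inches)
true = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 8.0], [2.0, 5.0]])
D = np.linalg.norm(true[:, None, :] - true[None, :, :], axis=-1)

est = positions_from_distances(D)
# The recovered layout should reproduce the measured distances
D_est = np.linalg.norm(est[:, None, :] - est[None, :, :], axis=-1)
print(np.allclose(D, D_est))  # → True
```

Note this needs distances between *all* pairs; with noisy or missing measurements you would move to an iterative least-squares fit instead.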

If you want a single coordinate system, then you need to pick an origin point... maybe that's robot A, maybe a 'base'. The math to convert polar coordinates (what you have) into rectangular coordinates is easy: the angle and distance can be solved as X and Y using a right-triangle solution in which the measured distance is the hypotenuse.
Here is one (of many) sample links https://socratic.org/precalculus/polar-coordinates/converting-equations-from-polar-to-rectangular.
Once you know where each robot is relative to the others, and plot them in relation to your origin point, you can readily calculate the bearing/distance between any pair of them.
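That polar-to-rectangular step might look like this in Python (the angle convention here is an assumption: 0 degrees along the +X axis, increasing counter-clockwise):

```python
import math

def polar_to_rect(distance, angle_deg):
    """Convert a (distance, angle) sighting into (x, y) offsets.
    Assumes 0 degrees points along +X, angles grow counter-clockwise."""
    a = math.radians(angle_deg)
    return distance * math.cos(a), distance * math.sin(a)

# Robot B seen 12 inches away at 30 degrees from the origin robot
x, y = polar_to_rect(12.0, 30.0)
print(round(x, 3), round(y, 3))  # → 10.392 6.0
```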

  • Wes

If I pick robot A as an origin point, would that mean that A needs to detect every other robot and get their distance/angle from A?

With my application, robots might be sufficiently spread out that no single robot can view every single other robot. However each robot is seen by at least one other robot.

In 2D, each robot has position (X,Y), so there are 2N independent parameters for N robots. You need at least 2N equations or independent measurements to determine all 2N parameters.

For two robots, you can arbitrarily assign robot 0 to be at (0,0), then the distance and angle to robot 1 gives (x1, y1) and so on.

To use the distance matrix method linked above, you need distances between all pairs of points.

If you convert those polar coordinates to rectangular ones and pick one robot to be the 'origin', then a case where A sees B, B sees C, but A cannot see C is no problem: you now know where both A and B are, so you can calculate C's position from whichever robot sees it. The chain can be extended from there to D, E, F, ...

This problem could be solved using nothing but polar coordinates, but the angle additions and subtractions would require trig solutions, no longer simple right-triangle ones.
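A sketch of that chaining idea, under the important assumption (questioned later in this thread) that each observing robot's own heading in the shared frame is known. All positions, headings, and sighting values are made up for illustration:

```python
import math

def sighting_to_xy(observer_xy, observer_heading_deg, rel_angle_deg, distance):
    """Place a sighted robot in the shared frame.
    Assumes the observer's position AND heading in that frame are known;
    the sighting angle is measured relative to the observer's heading."""
    a = math.radians(observer_heading_deg + rel_angle_deg)
    ox, oy = observer_xy
    return ox + distance * math.cos(a), oy + distance * math.sin(a)

# A at the origin, facing +X, sees B at 45 degrees, 10 inches away.
b = sighting_to_xy((0.0, 0.0), 0.0, 45.0, 10.0)
# B (suppose it faces 90 degrees in the shared frame) sees C at -45 degrees, 5 inches.
c = sighting_to_xy(b, 90.0, -45.0, 5.0)
print(tuple(round(v, 3) for v in c))  # → (10.607, 10.607)
```

Without the observer-heading assumption, each new robot in the chain adds an unknown rotation, which is exactly the objection raised below.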

  • Wes

No, you can't, because you don't know the orientation of B with respect to A. The angle measurement assumes an orientation.

To calculate the positions of other robots with respect to A, using the angle/orientation method, requires that A see all the robots.

Is "A knows B's distance and heading/angle relative to itself" sufficient? Presumably "heading" means an angle relative to its own axis, but when A is detected by another robot (call it C), does C know the direction A is facing in, and thus the angle between the lines connecting C to A, and A to B?

Another way it could work would be if each robot has its own magnetic compass, and those compasses are reliable and not affected by metal parts or electric currents. Then you'd be OK, and the robots could report neighbors at particular compass bearings and distances.
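If reliable compasses were available, each sighting becomes an absolute displacement vector, and positions can be chained through the sighting graph with a simple breadth-first walk. A minimal sketch under that assumption (robot IDs, bearings, and distances are invented; bearings are measured clockwise from north, i.e. the +Y axis):

```python
import math
from collections import deque

def solve_positions(sightings, anchor=0):
    """Chain compass-bearing sightings into one coordinate frame.
    sightings: list of (observer, target, bearing_deg, distance), with
    bearing measured clockwise from north (+Y). The anchor robot is
    pinned at (0, 0). Assumes the sighting graph is connected."""
    adj = {}
    for obs, tgt, brg, d in sightings:
        adj.setdefault(obs, []).append((tgt, brg, d))
        # Reverse a sighting by flipping the bearing 180 degrees
        adj.setdefault(tgt, []).append((obs, (brg + 180.0) % 360.0, d))
    pos = {anchor: (0.0, 0.0)}
    queue = deque([anchor])
    while queue:
        cur = queue.popleft()
        x, y = pos[cur]
        for nbr, brg, d in adj.get(cur, []):
            if nbr not in pos:
                b = math.radians(brg)
                pos[nbr] = (x + d * math.sin(b), y + d * math.cos(b))
                queue.append(nbr)
    return pos

# Robot 0 sees robot 1 due east at 10 in; robot 1 sees robot 2 due north at 5 in.
pos = solve_positions([(0, 1, 90.0, 10.0), (1, 2, 0.0, 5.0)])
print({k: (round(px, 3), round(py, 3)) for k, (px, py) in pos.items()})
# → {0: (0.0, 0.0), 1: (10.0, 0.0), 2: (10.0, 5.0)}
```

With noisy real measurements you would not just take the first path found, but feed all sightings into a least-squares fit as suggested earlier in the thread.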

According to the OP

Assuming each robot has a camera, "detection" doesn't give distance without a lot of very sophisticated computation, and it doesn't give an absolute angle either.

So this discussion seems pretty theoretical at the moment.

Yeah it is all theoretical. However, as I explained, when BOTH A and B are converted to rectangular coordinates, you can now calculate from either point. After determining where C is located, you can calculate the heading/distance of A from it. You may not be able to "see" it, but you can calculate where you would have to look.
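That "calculate where you would have to look" step is just the reverse conversion, rectangular back to polar. A small sketch (same assumed convention as before: angles counter-clockwise from +X, and the example positions are invented):

```python
import math

def bearing_and_distance(from_xy, to_xy):
    """Heading (degrees, counter-clockwise from +X) and distance
    from one known rectangular position to another."""
    dx = to_xy[0] - from_xy[0]
    dy = to_xy[1] - from_xy[1]
    return math.degrees(math.atan2(dy, dx)), math.hypot(dx, dy)

# Where would robot C at (10, 5) have to look to find A at (0, 0)?
angle, dist = bearing_and_distance((10.0, 5.0), (0.0, 0.0))
print(round(angle, 2), round(dist, 3))  # → -153.43 11.18
```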

  • Wes

Yes, assuming that you know the orientation of robot B with respect to the orientation of robot A.

I probably made the assumption that these sightings are commutative, that is, if A can see B then B can see A. Without that, or some other external orientation mechanism (a compass, or a common point seen by all), you would NOT be able to solve the problem.

  • Wes

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.