Thanks to all your help, I've gotten my ultrasonic rangefinders working.
Now I'm trying to derive the math behind using two sensors in parallel to accurately map an object. A single sensor gives me only a distance, with no idea of angle. With two of them, a set distance apart and facing the same direction, I should be able to work out the exact distance and angle of an object relative to a reference point.
How do I go about doing this? I tried using right triangles and the Pythagorean theorem, but it only led to a trivial solution.
When I try to Google triangulation methods, most of the equations assume the angles (relative to each sensor) are known beforehand. How can I do this with just two distances?
See section 4. Delta is known (the separation between the two sensors), and ar and br are the range readings from the sensors. Each reading defines a circle, and you can solve for s and t. With s, t, ar, and br, you can build a triangle and solve for the angles, allowing for triangulation.
Am I wrong in thinking this? However, it seems like this would only work if the object was in between both sensors.
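For what it's worth, the two-circle setup from the pdf can be solved in closed form. A minimal sketch in Python, assuming s and t split the baseline at the foot of the perpendicular dropped from the object (so s + t = delta, and both right triangles share the same height):

```python
import math

def locate(delta, ar, br):
    """Solve the two-circle system: delta is the sensor separation,
    ar and br are the range readings from the left and right sensors.

    Returns (s, t, h, angle_a) where s and t split the baseline at the
    foot of the perpendicular from the object, h is the perpendicular
    height, and angle_a is the angle at the left sensor in degrees.
    Returns None if the readings are inconsistent (circles don't meet).
    """
    # From ar**2 - s**2 == br**2 - t**2 and s + t == delta:
    s = (delta**2 + ar**2 - br**2) / (2 * delta)
    t = delta - s
    h_sq = ar**2 - s**2
    if h_sq < 0:
        return None  # no intersection: bad or noisy readings
    h = math.sqrt(h_sq)
    # Angle at the left sensor, measured from the baseline.
    angle_a = math.degrees(math.acos(s / ar))
    return s, t, h, angle_a
```

For example, with the sensors 10 units apart and an object 3 units along the baseline and 4 units out, the left sensor reads 5 and the right reads sqrt(65); `locate(10, 5, math.sqrt(65))` recovers s = 3, t = 7, h = 4. Note that s comes out negative when the object is outside the baseline on the left, so nothing here actually requires the object to be between the sensors.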
The picture will be fuzzy at best. No points, just plate-size dots if you're lucky.
Maybe someone else wants to pore through the code, but you're working with water bombs, not squirt guns, so I fail to see how you expect to use linear equations to solve anything.
I can't imagine that no one has figured out a semi-reliable way of finding direction with two sensors, but I'm having a hard time finding an example of how to do it. I don't need pin-point precision; I'd just like something like what car bumpers can do. At least on the car I've driven, with 4 sensors facing forward they give you the general direction and proximity of an object. I understand that with two sensors the resolution is much lower, but I only need rough accuracy.
I looked at car bumper ultrasonic sensors. Along the way I found that ultrasound doesn't reflect off hair much at all, so don't expect one to 'see' a dog, or a person in soft clothes.
Other than that, you have separate sensors giving data that could be described as fuzzy shapes at a distance. How about using Processing to plot those fuzzy shapes on your PC screen and see what the data suggests? Just saying, as a tool to let you see which ways you can step.
You're saying you get two different distance readings and you want to calculate the position?
I think that's a geometry problem. Draw a circle around each sensor with a radius equal to that sensor's distance reading. There are at most two points where the circles intersect, and the object should be at one of them.
Edit:
I see you already found the solution in the pdf you linked.
Am I wrong in thinking this? However, it seems like this would only work if the object was in between both sensors.
There is no requirement for the object to be in between the sensors; the equations work even if it's outside. And you don't really need to know the angles at all. The equations give you x and y coordinates. If you arrange your point of view so that point A (your left sensor) is the origin and point B (the right sensor) lies along the X axis, the Y coordinate tells you how far in front the object is, and the X coordinate tells you how far to the left or right it is.
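The coordinate-frame version above can be sketched in a few lines of Python. This assumes the left sensor sits at (0, 0), the right at (d, 0), and the object is in front of both (y > 0); the function name is mine:

```python
import math

def object_xy(d, a, b):
    """Position of the echo source: left sensor at (0, 0), right at (d, 0).

    d: sensor separation; a, b: range readings from left/right sensor.
    Returns (x, y) with y >= 0 (object assumed in front of the sensors),
    or None if the readings are inconsistent (circles don't intersect).
    """
    # x-coordinate of the intersection of circles of radius a and b:
    x = (d**2 + a**2 - b**2) / (2 * d)
    y_sq = a**2 - x**2
    if y_sq < 0:
        return None  # noisy or contradictory readings
    return x, math.sqrt(y_sq)  # keep the y > 0 root: in front of the array
```

As a check that the object need not be between the sensors: with d = 10 and an object at (-2, 5), the left sensor reads sqrt(29) and the right reads 13, and `object_xy(10, math.sqrt(29), 13)` returns (-2.0, 5.0), a point outside the baseline.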