
Topic: Tracking position in 3D space

asuryan

Sep 11, 2012, 11:09 pm Last Edit: Sep 11, 2012, 11:42 pm by asuryan Reason: 1
Hi!
A new project came to mind.
I would like to accurately track an object in 3D space and transfer
that data to software on the PC. In practice, I would like to build
a virtual camera rig and drive a camera in a 3D software package
on my desktop with this information.

1. What kind of sensors are out there to detect position
and orientation?
2. What kind of accuracy can be expected from the various techniques/sensors
at a hobby level?
3. Is it actually possible at a hobby level?

Something like that:
http://www.blendernation.com/2012/08/27/using-an-android-phone-as-a-virtual

Or more professional equipment (Spielberg with a virtual camera on the set of "Tin Tin"):


Thanks in advance!


PeterH

The Augmented Reality guys have made it easy for you to detect the position and orientation of a marker in 3D space. Positioning a camera viewport at the equivalent coordinates in a 3D model and rendering the resulting view is even easier.

If you aren't willing to use markers, I've seen demos of markerless object detection in 3D but it looks a lot harder to do. In any case this is an image processing problem that you will solve on a PC, and not a problem for the Arduino.
I only provide help via the forum - please do not contact me for private consultancy.

asuryan

#2
Sep 24, 2012, 01:09 pm Last Edit: Sep 24, 2012, 01:14 pm by asuryan Reason: 1
Thanks for your reply!

You're right that this is more a matter of optical tracking on the PC via software, but
I was thinking of something like a gyroscope sensor. I don't want to track an object
that is "seen"; I want to track a "virtual" camera, which is more or less just one point
with XYZ and rotation values.

Assume that you can pick up this "virtual camera" and you have a button to say "you're now
at your origin (0,0,0) position". Wouldn't it then be possible to detect the movement in
space? E.g. the gyro detects a movement in +X and adds that movement to a global variable
X, etc. The same for the rotation values, of course.

Do you think there would be too much noise (value jitter) for such an approach?

PeterH


Quote: "Assume that you can take a 'virtual camera'"


You seem to be looking for a way to capture a real-world position and orientation that can be used to position/orient a virtual camera within your virtual world. Is that the idea? It seems reasonably possible. Usually this would be done in software, using sliders and so on within an application, but I'm sure it would be possible to create a six-degree-of-freedom input device if you really wanted to. I don't imagine it would be easy to get it to work well enough to be useful in practice, but it would certainly be possible.

I also recently started thinking about such a solution, specifically to track the movements of a GLIDECAM Camcrane 200. Attached is a short sketch of the "luxury wireless solution", which would offer the greatest flexibility. Plan B could be a wired solution, where the movements of the crane/jib are mechanically transferred to normal fader/rotary input devices.

Has anyone here experimented with analyzing the strength, or perhaps the time delays, of radio signals? Since we are talking about small distances (the whole device moves within a few meters) and the data needs to be quite precise (to make the movements smooth, millimetres should be measured exactly; ideally we should even be able to measure sub-millimetre), the question is: can we really be that precise using Arduino hardware/software?

Any feedback or real-world experience with analyzing/comparing radio signals would be highly appreciated!

After looking through various other projects, and instead of going completely wireless, I will try the UM6-LT Orientation Sensor first and keep you updated about the results here...
