Get position of objects using IR

Hi all, I originally wanted to make a shoddy VR system with two IMUs in two separate right and left controllers, whose values could be sent over WiFi to the Unity game engine so I could plot their positions and incorporate them into half-assed VR games. But I quickly learned that although some problems with IMU tracking can be solved (gimbal lock with quaternions, yaw drift with magnetometers), the position drift of dead reckoning is just inherent in the process. I have since figured out it is much better to use sensor fusion.

This is my first idea for sensor fusion, so it could be wrong, but since I would like to make the setup as portable as possible, I would like to have just one IR camera in my headset and have the two controllers act as trackers, whose position/depth in 3D space would be handled by the camera, and orientation by the IMU. And when the trackers went out of bounds, I could use the IMU's accelerometer or simply freeze their position until they came back into view, since it would not be third person, so it would not matter what the controllers do when you cannot see them. And in my scenario, I have the advantage of only needing the controllers tracked, because the VR 'headset' in this circumstance will be your phone.
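Something like this is the switching logic I have in mind (every name here is a made-up placeholder, not any real library):

```cpp
// Fusion sketch: the camera supplies position while the tracker is visible;
// when it leaves the camera's view, position freezes at the last fix and
// only the IMU orientation keeps updating.
struct Pose {
  float x, y, z;         // position from the IR camera
  float qw, qx, qy, qz;  // orientation quaternion from the IMU
};

Pose controller;

void updatePose(bool trackerVisible,
                float camX, float camY, float camZ,
                float qw, float qx, float qy, float qz) {
  if (trackerVisible) {
    // In view: trust the camera for position.
    controller.x = camX;
    controller.y = camY;
    controller.z = camZ;
  }
  // Out of view: position simply stays frozen at the last camera fix.

  // Orientation always comes from the IMU, visible or not.
  controller.qw = qw;
  controller.qx = qx;
  controller.qy = qy;
  controller.qz = qz;
}
```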

IR tracking was just my first idea, so if there is a better or more cost-effective way to perform sensor fusion, do let me know. Since I want it to be as portable as possible, if IR tracking could not be done with just one sensor but another method could, thereby making it more portable, do let me know. My preferences for this project are to be as portable, cost-effective, and user-friendly as possible. I have tried searching on Google for how to get started but have found no good results. So maybe the good people of Arduino can help me! All input is appreciated. I can provide more details if necessary, and as always, thank you! =)


In which area will the players move (extent of room, hall)?

I'd track position like with a mouse: the user moves his mouse until the cursor on screen is where it should be. The same can be done in 3D VR, where the user only issues turns (gyro) and moves (accel) without actually moving in reality. But if a user still moves towards a physical wall, or two users towards each other, an ultrasonic (US) obstacle-avoidance sensor could be used to put a virtual wall in front of the player, forcing him to move somewhere else. This way it does not matter where the user actually is; it only matters where he thinks he is.
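For the US sensor part, the usual HC-SR04 ranging sketch would do; the pins and the 50 cm threshold below are just assumptions:

```cpp
// HC-SR04 ultrasonic ranging: if something physical is closer than the
// threshold, raise a flag the game can use to draw a virtual wall.
const int TRIG_PIN = 9;                 // assumed wiring
const int ECHO_PIN = 10;                // assumed wiring
const float WALL_THRESHOLD_CM = 50.0;   // assumed safety distance

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(115200);
}

void loop() {
  // Send a 10 us trigger pulse, then time the echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  long duration = pulseIn(ECHO_PIN, HIGH, 30000); // timeout ~5 m range
  float distanceCm = duration * 0.0343 / 2.0;     // speed of sound, out and back

  bool virtualWall = (duration > 0 && distanceCm < WALL_THRESHOLD_CM);
  Serial.println(virtualWall ? "WALL" : "CLEAR");
  delay(50);
}
```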

That's what I thought I could do too, but apparently IMUs are not really suitable for tracking movement, even if the movement doesn't need to be directly mapped onto the person in the physical world.

Use the IMU like the steering stick inside a plane. Move it in front of you (front edge down) to get acceleration in the x direction, indicating a forward move, and back to the neutral position (flat) or reverse (front up) to stop or slow down movement. Same in the y direction for turning. Then the plane can move anywhere while the pilot sits still in his seat. At least this is how I'd attack VR moves.
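A minimal version of that idea, with readAccelG() as a stand-in for whatever IMU library is actually used:

```cpp
// Tilt-as-joystick: pitch forward -> move forward, tilt sideways -> turn.
const float DEADZONE_G  = 0.1;  // ignore small tilts (assumed value)
const float SPEED_SCALE = 2.0;  // game speed per g of tilt (assumed value)

// Placeholder: replace with the real IMU library read, in g units.
void readAccelG(float &ax, float &ay, float &az) {
  ax = 0.0; ay = 0.0; az = 1.0;  // stub: board lying flat
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  float ax, ay, az;
  readAccelG(ax, ay, az);

  // Held flat, gravity is all on z; tilting shifts a gravity component
  // onto x (forward/back) and y (left/right), which we read as a stick.
  float forward = (fabs(ax) > DEADZONE_G) ? ax * SPEED_SCALE : 0.0;
  float turn    = (fabs(ay) > DEADZONE_G) ? ay * SPEED_SCALE : 0.0;

  // Send the move command to the game (WiFi/serial transport not shown).
  Serial.print("move ");
  Serial.print(forward);
  Serial.print(" turn ");
  Serial.println(turn);
  delay(20);
}
```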

Well, that certainly is an interesting and clever approach. How would you use this setup and orient the remote?

Detect and remember the user-chosen (neutral) orientation in setup().
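For example, with readQuaternion() standing in for the real IMU library call:

```cpp
// Capture the orientation held at startup as "neutral", then report every
// later reading relative to it.
float nW, nX, nY, nZ;  // conjugate of the neutral quaternion

// Placeholder: replace with the real IMU library's quaternion read.
void readQuaternion(float &w, float &x, float &y, float &z) {
  w = 1.0; x = 0.0; y = 0.0; z = 0.0;  // stub: identity orientation
}

void setup() {
  Serial.begin(115200);
  delay(500);  // give the user a moment to settle into a comfortable grip
  float w, x, y, z;
  readQuaternion(w, x, y, z);
  // Store the conjugate so later readings can be expressed in the neutral
  // frame: q_rel = q_neutral^-1 * q_now.
  nW = w; nX = -x; nY = -y; nZ = -z;
}

void loop() {
  float w, x, y, z;
  readQuaternion(w, x, y, z);
  // Hamilton product: relative = neutral_conjugate * current.
  float rw = nW*w - nX*x - nY*y - nZ*z;
  float rx = nW*x + nX*w + nY*z - nZ*y;
  float ry = nW*y - nX*z + nY*w + nZ*x;
  float rz = nW*z + nX*y - nY*x + nZ*w;
  Serial.print(rw); Serial.print(' ');
  Serial.print(rx); Serial.print(' ');
  Serial.print(ry); Serial.print(' ');
  Serial.println(rz);
  delay(20);
}
```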