Tracking the position of an object in a closed environment

Hi there!

I was wondering if it’s possible to track the position of an object in a closed environment.
Think of it as a squash court: I would like to know where the ball is in space, so I can detect which wall it hit and at what height.

Would it be feasible to create this type of device? And are cameras an absolute must or can it be done with electronic components in the ball?

Note: there are people in the closed environment, but the space has fixed, known dimensions.

Kind regards!

Electronic components in the ball are unlikely to survive the environment, even if they could get a useful signal out to the world. I think you're going to need something a 'little' bigger than an Arduino...

123Splat:
Electronic components in the ball are unlikely to survive the environment, even if they could get a useful signal out to the world. I think you're going to need something a 'little' bigger than an Arduino...

I agree it wouldn't survive the impact during an actual squash session, but that was more of an example.

Anything I can think of that will give a near-accurate position in 3D space is going to be WAY beyond the capabilities of a microcontroller. Aside from a matrix of sensors/emitters of some sort, it is going to take a lot of memory and a good deal of processing power.

devm: Would it be feasible to create this type of device?

Yes. It's called motion tracking, and has been well studied and implemented for decades.

devm: And are cameras an absolute must or can it be done with electronic components in the ball?

For wireless applications tracking small moving objects, cameras tend to be the lowest-cost option overall. That said, the number of cameras, the resolution (both spatial and temporal) of the individual cameras, the lighting, the item(s) being tracked, and a whole host of other factors will determine not only the accuracy of the tracking but also the overall cost of the system (for some systems, this cost can easily exceed 50K USD).
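To make that conventional multi-camera approach concrete, here is a rough, untested sketch of the geometric core in C++: once two calibrated cameras each turn their pixel detection of the ball into a 3D viewing ray (camera position plus direction), the ball position can be estimated as the point where the two rays pass closest to each other. The camera placements and directions in main() are invented example values, not a real court setup.

```cpp
// Rough sketch only: triangulate a ball position from two viewing rays.
#include <cstdio>

struct Vec3 {
  double x, y, z;
};

Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Closest point between ray C1 + t*d1 and ray C2 + s*d2 (midpoint method).
Vec3 triangulate(Vec3 c1, Vec3 d1, Vec3 c2, Vec3 d2) {
  Vec3 w0 = sub(c1, c2);
  double a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
  double d = dot(d1, w0), e = dot(d2, w0);
  double denom = a * c - b * b;          // near zero means the rays are parallel
  double t = (b * e - c * d) / denom;
  double s = (a * e - b * d) / denom;
  Vec3 p1 = add(c1, scale(d1, t));
  Vec3 p2 = add(c2, scale(d2, s));
  return scale(add(p1, p2), 0.5);        // midpoint of the closest approach
}

int main() {
  // Two cameras in the front corners of the court, both looking at a ball
  // that is "really" at roughly (3, 2, 4) metres (example numbers only).
  Vec3 cam1 = {0, 0, 0},  dir1 = {3, 2, 4};
  Vec3 cam2 = {6, 0, 0},  dir2 = {-3, 2, 4};
  Vec3 ball = triangulate(cam1, dir1, cam2, dir2);
  std::printf("estimated ball position: %.2f %.2f %.2f\n", ball.x, ball.y, ball.z);
  return 0;
}
```

The hard parts a real system adds on top of this are camera calibration, reliably detecting the ball in each frame at high speed, and handling occlusion by the players.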

I can imagine a system built with components in the ball (maybe not the computational processing - and discounting what it would take to make such a system withstand the forces of playing squash, if that is really the application); it wouldn't be perfectly accurate, but it might be accurate enough.

First, you would likely need multiple cameras inside the ball itself; these cameras would be wireless, high-speed, global-shutter (not rolling-shutter), and probably high-resolution (without IR filters, perhaps black and white as well). How many would be needed, I am not sure, but likely enough to provide an overlapping, outward-facing spherical field of view. They would transmit their images wirelessly to a computer system external to the ball (likely some kind of parallel-processing computational cluster or similar).

The ball would also contain some form of orientation sensing (an IMU: accelerometers, gyros, etc.), which would also transmit its data wirelessly back to the cluster.
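As a very rough illustration of that in-ball sensor package (ignoring the power, antenna, and shock-survival problems entirely), an Arduino-style sketch along these lines could stream raw IMU samples to the external computer. The MPU-6050 and nRF24L01 parts, the pin wiring, and the pipe address are all assumptions for illustration, not a tested design.

```cpp
// Hypothetical sketch: read raw accel/gyro from an MPU-6050 over I2C and
// stream it out over an nRF24L01 radio (RF24 library).
#include <Wire.h>
#include <SPI.h>
#include <RF24.h>

const uint8_t MPU_ADDR = 0x68;          // MPU-6050 default I2C address
RF24 radio(9, 10);                      // CE, CSN pins (assumed wiring)
const byte pipeAddress[6] = "BALL1";    // arbitrary 5-byte pipe address

struct ImuPacket {                      // 16 bytes, fits in one 32-byte RF24 payload
  uint32_t millisStamp;
  int16_t ax, ay, az;
  int16_t gx, gy, gz;
};

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                     // PWR_MGMT_1 register
  Wire.write(0);                        // wake the sensor up
  Wire.endTransmission(true);

  radio.begin();
  radio.openWritingPipe(pipeAddress);
  radio.stopListening();                // this node only transmits
}

void loop() {
  ImuPacket pkt;
  pkt.millisStamp = millis();

  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                     // ACCEL_XOUT_H: start of 14 data bytes
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)14);
  pkt.ax = (Wire.read() << 8) | Wire.read();
  pkt.ay = (Wire.read() << 8) | Wire.read();
  pkt.az = (Wire.read() << 8) | Wire.read();
  Wire.read(); Wire.read();             // skip the two temperature bytes
  pkt.gx = (Wire.read() << 8) | Wire.read();
  pkt.gy = (Wire.read() << 8) | Wire.read();
  pkt.gz = (Wire.read() << 8) | Wire.read();

  radio.write(&pkt, sizeof(pkt));       // best-effort; no retries for simplicity
  delay(5);                             // roughly 200 Hz sample rate
}
```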

The walls, floor, and ceiling of the arena would (ideally) have some kind of randomized, non-repeating pattern painted on them (you can imagine it being visible, but it would more likely be visible only in the infrared spectrum). This particular piece may or may not be needed; testing might show that the system works OK with a standard layout, paint scheme, and lighting.

The most difficult piece of this system would be the software - not to sound trite, but because I really don't know what specific form the software would take. What would need to be done is to integrate all of the data received from the ball and feed it into some complex SLAM (simultaneous localization and mapping) algorithm to determine - to a certain degree of probability - where the ball is, based on prior knowledge of the environment gained while in play.
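I can't sketch a full SLAM pipeline, but the probabilistic core it builds on is the kind of predict/update filtering shown below. This toy 1D constant-velocity Kalman filter (with made-up noise numbers and a simplified process-noise term) only illustrates how a motion model and noisy position fixes get fused into an estimate "to a certain degree of probability"; a real system would estimate a full 3D pose and the map as well.

```cpp
// Toy illustration only: 1-D constant-velocity Kalman filter.
#include <cstdio>

struct Kalman1D {
  // State: position p and velocity v, with a 2x2 covariance matrix P.
  double p = 0.0, v = 0.0;
  double P[2][2] = {{1.0, 0.0}, {0.0, 1.0}};

  void predict(double dt, double accelNoise) {
    p += v * dt;                          // constant-velocity motion model
    double q = accelNoise * accelNoise;   // simplified process noise (not the exact discretization)
    // P = F P F^T + Q, with F = [[1, dt], [0, 1]]
    double P00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q * dt * dt;
    double P01 = P[0][1] + dt * P[1][1];
    double P10 = P[1][0] + dt * P[1][1];
    double P11 = P[1][1] + q;
    P[0][0] = P00; P[0][1] = P01; P[1][0] = P10; P[1][1] = P11;
  }

  void update(double measuredP, double measNoise) {
    // Fuse a noisy position measurement (H = [1, 0]).
    double r = measNoise * measNoise;
    double s = P[0][0] + r;                      // innovation covariance
    double k0 = P[0][0] / s, k1 = P[1][0] / s;   // Kalman gain
    double innovation = measuredP - p;
    p += k0 * innovation;
    v += k1 * innovation;
    // P = (I - K H) P
    double P00 = (1 - k0) * P[0][0];
    double P01 = (1 - k0) * P[0][1];
    double P10 = P[1][0] - k1 * P[0][0];
    double P11 = P[1][1] - k1 * P[0][1];
    P[0][0] = P00; P[0][1] = P01; P[1][0] = P10; P[1][1] = P11;
  }
};

int main() {
  Kalman1D kf;
  // Pretend the ball moves at 10 m/s and we get a noisy position fix every 5 ms.
  double truth = 0.0;
  for (int i = 0; i < 20; ++i) {
    truth += 10.0 * 0.005;
    double noisyFix = truth + ((i % 2) ? 0.03 : -0.03);  // fake +/- 3 cm noise
    kf.predict(0.005, 2.0);
    kf.update(noisyFix, 0.03);
    std::printf("step %2d  truth %.3f  estimate %.3f\n", i, truth, kf.p);
  }
  return 0;
}
```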

In the beginning, as the ball travelled through the environment, its position estimate would be very inaccurate; over time, though, the accuracy should improve, and it should be best close to the walls (where the cameras can see the pattern more clearly) - but this is a guess on my part. With such a system, you might be able to get sub-10 cm accuracy (perhaps even 1 cm accuracy with enough trials). It would only work properly in that one squash court, though; if you took it to another one that wasn't set up the same way, it would probably be inaccurate until enough trials had again established the map of the space.

That is all a guess at how the system would work, obviously. The closest kind of system I have seen to this is the various trials of using SLAM (inbound and outbound) to track one or more quadcopters in flight. Also note that none of this involves the use of an Arduino; I sincerely can't think of where or how you could incorporate such a microcontroller into the mix (maybe the orientation sensor package?)...

cr0sh:

Thank you very much! I'll need some time to process this information and do some research, but it does sound promising. I was thinking about maybe doing the computational processing of the measurements with a dedicated smartphone application.