Getting 2D coordinates of a fast object that passed through a box.

Hello, this is my first post, so please feel free to give constructive criticism on how I ask questions around these parts.

I am very new to the world of electronics in the sense of the physical hardware and the types of sensors that are available. I do, however, have a fairly strong working knowledge of .NET development and programming in general.

What I am trying to accomplish, in simple terms, is high-speed object position tracking. I will try to explain it in more detail, as I see it in my head, as simply as possible.

I have a perfectly square frame that is 5'x5'. The box is an open frame: nothing in front of it, nothing behind it. The environment is a controlled indoor space and can be manipulated if needed. The lighting in the room can also be controlled if necessary.

What I will be doing with this box is passing an object roughly 1/4 inch in diameter through it at anywhere from 100 to 400 feet per second.

What I am trying to accomplish is getting the coordinates of said object in 2D space. For example, if the top-left corner of the box is 0,0 and the object passes through the box 3 inches from the left and 3 inches down from the top, I want to get a reading of 3,3. I would also like to use float values here that are accurate to within 1/4 inch.

So the question I have is: what type of sensors do I need to accomplish this feat? Can someone please provide a pseudo idea of how you would go about setting something like this up? Any insight or help on the topic would be greatly appreciated.

Thanks in advance for looking and your help.

1/4 inch resolution over a 60x60 inch frame implies a 240x240 sensor array that is also VERY fast (response time faster than 1 millisecond required).
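For anyone who wants to sanity-check those figures, here is a quick back-of-the-envelope calculation in plain C++ (not Arduino code); the only inputs are the numbers from the original post (5'x5' frame, 1/4 inch object, 400 ft/s top speed):

// Back-of-envelope check of the resolution and timing requirements.
#include <cstdio>

int main() {
    const double frame_in  = 60.0;          // frame size in inches (5 ft)
    const double res_in    = 0.25;          // desired resolution in inches
    const double v_max_ips = 400.0 * 12.0;  // 400 ft/s in inches per second

    printf("Sensor cells per axis: %.0f\n", frame_in / res_in);             // 240
    printf("Time to cross 1/4 inch: %.0f us\n", res_in / v_max_ips * 1e6);  // ~52 us
    return 0;
}

At 400 ft/s the 1/4 inch shaft covers its own diameter in roughly 52 microseconds, which suggests where the sub-millisecond response-time figure comes from.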

You could do that with 240x240 photogates (laser tripwires) but it would be extremely expensive and difficult to interface.

This is not an Arduino project.

Sounds more like a job for high speed cameras + computer analysis.

Is this object some sort of bullet?

Are we looking for a single co-ordinate when the object (which sounds like it is a bullet) passes this square frame... so irrespective of time?

I.e. so you could tell from a distance that a shot was "5 inches up and 4 left", for example, when it struck a target/passed through the square frame.

Maybe LIDAR has a use here?

Put a fresh piece of paper in the frame and photograph it, after the bullet has passed through. Then analyze the image frame to locate the hole.

Grumpy_Mike:
Is this object some sort of bullet?

No, the object isn't a bullet; it's an arrow.

Johnny010:
Are we looking for a single co-ordinate when the object (which sounds like it is a bullet) passes this square frame... so irrespective of time?

I.e. so you could tell from a distance that a shot was "5 inches up and 4 left", for example, when it struck a target/passed through the square frame.

Maybe LIDAR has a use here?

Time isn't something that I care about, so long as the position is captured at the moment the object (arrow) passes through the frame.

jremington:
Put a fresh piece of paper in the frame and photograph it, after the bullet has passed through. Then analyze the image frame to locate the hole.

This would defeat the purpose of the concept altogether.

How big is the flight?

AWOL:
How big is the flight?

The flight would start approx. 60 feet from the frame. It would be nice to have this capable of starting from shorter distances as well.

The flight - the stabilising fins at the back of the arrow.
How big is the flight?

AWOL:
The flight - the stabilising fins at the back of the arrow.
How big is the flight?

I'm not 100% sure what you are asking, but I will try to give more detail. The arrow can range from 20 inches to 30 inches in length. The arrow has 3 fins on the back of it that, at their largest point, stick out from the arrow about 1/4 inch. The arrow is approx. 1/4 inch in diameter. The arrow speed will range from 100 to 400 feet per second. The arrow will be fired from 60 feet from the frame.

So the flight or fletching makes the object about 1/2 to 3/4 inch across.

AWOL:
So the flight or fletching makes the object about 1/2 to 3/4 inch across.

Yes, in a sense it does. However, I would want to account for the fact that those fletchings are spinning and bending in the air, so the exact dimensions or width of the arrow during flight at that high a speed could differ, I would assume.

Impossible project for a beginner.

As an aside, it is physically impossible for the arrow to fly in a perfectly horizontal path, so your expectations of the required sensor resolution are quite unrealistic. 60x60 sensors would do the job giving 1 inch resolution. Still very difficult.

jremington:
Impossible project for a beginner.

As an aside, it is physically impossible for the arrow to fly in a perfectly horizontal path, so your expectations of the required sensor resolution are quite unrealistic. 60x60 sensors would do the job giving 1 inch resolution. Still very difficult.

Impossible... that, I know, isn't the case. There are multiple systems out there that do this same thing with computer vision and IR light. That is what I am trying to figure out: just what is out there and how it could be put together. As far as being a beginner goes, I'm sure that with enough study I can figure out how to wire something up properly. With the coding I'm far from a beginner and can understand that easily.

Do let us know how you progress!

Can anyone provide me some sort of a starting point with computer vision and the Arduino? How does this all come together, and possibly what type of camera would I need to achieve this concept? I don't know anything about computer vision or how to work with it, so a beginner's tutorial series would be awesome.

Can anyone provide me some sort of a starting point with computer vision and the Arduino?

You're unlikely to find a solution involving an Arduino - they're generally not fast enough, and don't have enough memory.
Consider some back-of-a-beermat calculations: the active part of an analogue SD video line lasts about 52us, regardless of standard. Let's say you digitise a low-res raster of 100 pixels along a line. So each pixel is 0.52us wide. At 16MHz, that's less than 9 instruction cycles available per pixel during the active video.
If you want to go down the video route, get a faster processor with wider data paths.
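Restating that beermat arithmetic as code, in case anyone wants to play with the assumptions (the 52 us active line and 100-pixel raster are the figures from the post above; the 16 MHz clock is a standard Uno):

// Cycles available per pixel when digitising SD video on a 16 MHz AVR.
#include <cstdio>

int main() {
    const double active_line_us  = 52.0;   // active part of one analogue SD video line
    const double pixels_per_line = 100.0;  // assumed low-res raster
    const double cpu_mhz         = 16.0;   // ATmega328P clock

    const double us_per_pixel     = active_line_us / pixels_per_line;  // 0.52 us
    const double cycles_per_pixel = us_per_pixel * cpu_mhz;            // ~8.3 cycles
    printf("%.2f us per pixel -> %.1f cycles per pixel\n", us_per_pixel, cycles_per_pixel);
    return 0;
}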

How about measuring the difference in time that the compression wave spilling from the tip of the arrow takes to reach 3 (or more) sensors in the plane of the target? If the time of arrival is the same for all sensors, then by definition the arrow must be equidistant from all three.
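To make that concrete, here is a minimal sketch of the time-difference-of-arrival idea: stamp the moment the sound reaches each of three sensors mounted in the plane of the frame. The pin numbers, the sensor type (piezo elements with comparators giving clean digital edges), and the use of a Mega 2560 (three external-interrupt pins) are my assumptions, not part of the suggestion above; turning the differences into an (x, y) position is a hyperbolic-intersection solve best done afterwards on a PC.

const byte SENSOR_PINS[3] = {2, 3, 18};   // external-interrupt pins on a Mega 2560
volatile unsigned long tArrival[3];
volatile bool fired[3] = {false, false, false};

void isr0() { if (!fired[0]) { tArrival[0] = micros(); fired[0] = true; } }
void isr1() { if (!fired[1]) { tArrival[1] = micros(); fired[1] = true; } }
void isr2() { if (!fired[2]) { tArrival[2] = micros(); fired[2] = true; } }

void setup() {
  Serial.begin(115200);
  for (byte i = 0; i < 3; i++) pinMode(SENSOR_PINS[i], INPUT);
  attachInterrupt(digitalPinToInterrupt(SENSOR_PINS[0]), isr0, RISING);
  attachInterrupt(digitalPinToInterrupt(SENSOR_PINS[1]), isr1, RISING);
  attachInterrupt(digitalPinToInterrupt(SENSOR_PINS[2]), isr2, RISING);
}

void loop() {
  if (fired[0] && fired[1] && fired[2]) {
    // Report arrival-time differences in microseconds; equal differences mean
    // the arrow passed equidistant from the sensors.
    Serial.print(F("dt01=")); Serial.print((long)(tArrival[1] - tArrival[0]));
    Serial.print(F(" dt02=")); Serial.println((long)(tArrival[2] - tArrival[0]));
    noInterrupts();
    fired[0] = fired[1] = fired[2] = false;   // re-arm for the next shot
    interrupts();
  }
}

Note that micros() has 4 us resolution on a 16 MHz board, which corresponds to well under 1/10 inch of sound travel, so the timing side is not the hard part; getting a clean, repeatable trigger edge out of the arrow's noise is.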

My only thought is a long line (well... 2 long lines) of laser diodes, with photodiodes on the other side.

As soon as any photodiode goes LOW, this causes parallel-to-serial converter ICs (like an 8-bit parallel-in/serial-out shift register) to latch the states of the other photodiodes, using the "SH/LD" pin. The Arduino could then poll the parallel-to-serial ICs to read the states of all the photodiodes.
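A rough sketch of the Arduino side of that, assuming a chain of 74HC165 parallel-in/serial-out registers behind the photodiodes; the pin assignments, the chip count, and having the Arduino pulse SH/LD itself after a trigger interrupt (rather than the pure hardware latch described above) are illustrative assumptions:

const byte PIN_TRIGGER = 2;   // goes LOW when any beam is broken (e.g. wired-AND of the photodiodes)
const byte PIN_LOAD    = 8;   // SH/LD (parallel load, active LOW) of the 74HC165 chain
const byte PIN_CLK     = 9;   // CP, shift clock
const byte PIN_DATA    = 10;  // Q7, serial data out of the last chip
const byte NUM_CHIPS   = 8;   // 8 beams per chip; 240 beams per axis would need 30

volatile bool triggered = false;
void onBreak() { triggered = true; }

void setup() {
  Serial.begin(115200);
  pinMode(PIN_TRIGGER, INPUT_PULLUP);
  pinMode(PIN_LOAD, OUTPUT);  digitalWrite(PIN_LOAD, HIGH);
  pinMode(PIN_CLK,  OUTPUT);  digitalWrite(PIN_CLK, LOW);
  pinMode(PIN_DATA, INPUT);
  attachInterrupt(digitalPinToInterrupt(PIN_TRIGGER), onBreak, FALLING);
}

void loop() {
  if (!triggered) return;
  triggered = false;

  // Latch the current beam states into the registers...
  digitalWrite(PIN_LOAD, LOW);
  delayMicroseconds(5);
  digitalWrite(PIN_LOAD, HIGH);

  // ...then clock them out, one chip (8 beams) at a time.
  for (byte chip = 0; chip < NUM_CHIPS; chip++) {
    byte states = 0;
    for (byte b = 0; b < 8; b++) {
      states = (states << 1) | digitalRead(PIN_DATA);  // Q7 is valid before the first clock
      digitalWrite(PIN_CLK, HIGH);                     // shift the next bit to Q7
      digitalWrite(PIN_CLK, LOW);
    }
    Serial.print(states, BIN);   // a 0 bit marks a broken beam
    Serial.print(' ');
  }
  Serial.println();
}

One thing worth checking against the timing numbers earlier in the thread: the arrow body keeps the beams broken for several milliseconds while it passes through the plane (20+ inches of shaft at up to 400 ft/s), so a software-triggered latch a few microseconds after the interrupt is probably fast enough, but the hardware latch described above removes even that small risk.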