Hey everyone, I am a 21 year old Computer Scientist (with a career in Software Engineering) who just adopted a kitten to keep me company while my fiancée is finishing up school out-of-town.
(for the people that only want to skim, important points throughout the entire thread will be in bold, and I PROMISE there will be a lot of pictures/videos!)
I have always loved the idea of computer-controlled physical devices, but I have never made anything myself. This project aims to change that.
The idea behind this project has been in my head for YEARS, but until finishing up college, I never had the time/resources to really turn it into a reality.
In my apartment, my living room is 12' by 16' with a 12' ceiling. The center of the ceiling is the perfect vantage point for a laser turret that will autonomously move around to keep the laser dot away from (but still near) my cat.

Equipment list:
1x Arduino Uno R3
2x 28BYJ-48 motors
2x ULN2003 motor controllers
1x ATLASNOVA 650nm <50mW red laser pen (Class IIIa, safe for the kitty!)
1x Playstation Eye (75 degree FOV, 60fps @ 640x480, 120fps @ 320x240)
1x Home-built computer
Things I will have to construct:
2-axis mount to hold the laser pointer
Algorithms to determine where to position the laser based on the cat's movement

Project Outline
Assuming the laser is projected from a point 18" down from the ceiling (mounted to the bottom center of the ceiling fan), it sits about 10.5' above the floor. The farthest floor point from the spot directly beneath it is a corner, half the room's diagonal away (10'), so the turret needs a range of motion of about 87.2 degrees (2 × arctan(10/10.5)) to be able to position the laser on any point on the entire floor.
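A few lines of Python re-derive that range-of-motion figure (assuming, as above, the laser sits 10.5' above the floor):

```python
import math

# Room dimensions and laser height above the floor (feet).
ROOM_W, ROOM_L = 16.0, 12.0
LASER_HEIGHT = 12.0 - 1.5  # 12' ceiling, mounted 18" down

# The farthest floor point from directly under the laser is a corner,
# half the room's diagonal away.
half_diagonal = math.hypot(ROOM_W / 2, ROOM_L / 2)  # 10.0 ft

# Full cone angle the turret must sweep to reach every corner.
cone_angle = 2 * math.degrees(math.atan(half_diagonal / LASER_HEIGHT))
print(round(cone_angle, 1))  # prints 87.2
```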
<<graphic to be inserted when I get home from work>>
My targeting system will be radial: the position of the dot is expressed as a radius from the center of the room plus an angle measured from a fixed 0° reference.
<<graphic to be inserted when I get home from work>>
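To make the radial scheme concrete, here's a minimal sketch of the pixel-to-polar conversion, assuming the camera's image center coincides with the room center and a uniform inches-per-pixel scale (the function and constant names are mine, not from the build):

```python
import math

# Camera frame and floor coverage; assumes the image center sits over
# the room center and the camera points straight down.
FRAME_W, FRAME_H = 640, 480
FLOOR_W_IN, FLOOR_H_IN = 16 * 12, 12 * 12   # floor size in inches
IN_PER_PX = FLOOR_W_IN / FRAME_W            # 0.3 in/pixel (same on both axes)

def pixel_to_polar(px, py):
    """Map a pixel coordinate to (radius_inches, angle_degrees) from room center."""
    dx = (px - FRAME_W / 2) * IN_PER_PX
    dy = (FRAME_H / 2 - py) * IN_PER_PX     # flip y so "up" in the image is +y
    radius = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return radius, angle

print(pixel_to_polar(320, 240))  # center of frame -> (0.0, 0.0)
```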
The Playstation Eye will be mounted pointing straight down, as close to the turret as I can get it without blocking the turret's movement and without the turret intruding into the video feed. Conveniently, the room's 16' x 12' footprint perfectly matches the camera's 4:3 aspect ratio, AND the 75 degree FOV from a position 18" below the ceiling theoretically gives the camera a viewing area of about 16.1' x 12.1' - almost exactly the size of the room. What are the chances!? (The turret's range-of-motion angle is larger than the camera's FOV because the turret has to reach the room's corners along the diagonal, while the 75 degrees is measured horizontally.)
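The footprint claim is easy to check with a couple of lines of trigonometry (assuming the camera axis is perfectly vertical):

```python
import math

CAM_HEIGHT = 10.5   # feet above the floor (18" below a 12' ceiling)
H_FOV = 75.0        # Playstation Eye horizontal field of view, degrees

# Width covered on the floor, then height from the 4:3 aspect ratio.
width = 2 * CAM_HEIGHT * math.tan(math.radians(H_FOV / 2))
height = width * 3 / 4
print(round(width, 2), round(height, 2))  # prints 16.11 12.09
```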
Using OpenCV running on a powerful computer across the room, I will be able to do very fast and accurate motion tracking on my cat as it walks around, calculate the radius and angle from the center, and relay those numbers from C++/Python back to the Arduino, which will then adjust the stepper motor positions to match. The Arduino will be left with the task of finding the quickest path from its current position to the new one; the C++/Python side will not keep track of any previous points.
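Here's one possible (hypothetical) wire format plus the "quickest path" computation described above, sketched in Python. `encode_target` and `shortest_rotation` are names I made up, and on the real build the shortest-rotation logic would live in the Arduino sketch rather than on the PC:

```python
import struct

def encode_target(radius_in, angle_deg):
    """Pack a (radius, angle) target as two little-endian int16s (4 bytes total).
    Radius in tenths of an inch, angle in tenths of a degree, so 16 bits
    comfortably covers the room while beating the stepper's precision."""
    return struct.pack('<hh', int(radius_in * 10), int(angle_deg * 10))

def shortest_rotation(current_deg, target_deg):
    """Signed shortest rotation for the pan axis -- the 'quickest path'
    the Arduino computes, wrapping through 0/360 when that is shorter."""
    delta = (target_deg - current_deg) % 360
    return delta - 360 if delta > 180 else delta

# e.g. going from 350 deg to 10 deg should be +20, not -340
print(shortest_rotation(350, 10))  # prints 20
```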
Other calculations(mostly for my reference):
The room is 27648 square inches and the camera has 307200 pixels, so each pixel covers about 0.09 square inches - a linear resolution of roughly 0.3" per pixel (192"/640) - assuming each pixel on the camera maps to a unique spot on the floor.
Stepper motor (after reduction) has a step angle of 0.088 degrees, so (if I did the math right) it should have an average precision of roughly 1/4". Since the ULN2003 only switches the coils fully on or off, half-stepping is possible but true microstepping isn't (microstepping needs fine current control), so that's about as precise as this setup gets.

Problems I expect to encounter:
*When the dot is near the center of the room, the turret points nearly straight down and I hit a gimbal-lock situation: tiny dot movements can require big swings of the pan axis, so movement speed will suffer the closer the dot gets to the center. Reading some posts by sbright33, though, it looks like this motor can be over-driven to turn at fairly fast speeds.
*I am unsure what the delay will be through the path of "motion happens > motion tracked > points determined > points sent > points interpreted > motors moved", but I am confident the bottleneck will be the motors. Using the over-driving method sbright33 talked about, I hope to shrink that delay until it is undetectable.
*I am unsure how much data the Arduino can handle. I plan on sending a pair of integers (4 bytes each) 60 times per second, which is 480 bytes/sec. 9600 baud - the usual default - actually carries about 960 bytes/sec (a serial byte is 10 bits with start/stop framing), and the Uno's serial port can run at 115200 baud or faster, so there is headroom. Switching to 16-bit ints would halve the payload to 240 bytes/sec and still maintain more accuracy than the stepper needs.

Future ideas for expansion
*Self-calibration, e.g. motion tracking of the laser to calibrate what points are where so this can be set up anywhere at angles other than straight down.
*Avoidance calibration, e.g. there is a couch here, chair here, tv stand here - don't shine the laser on them, go around them. Keep the dot on the carpet only.
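A minimal sketch of how the avoidance idea could work - hypothetical axis-aligned keep-out rectangles in room coordinates (the furniture numbers are made up):

```python
# Hypothetical "avoidance calibration" data: keep-out rectangles on the
# floor, in inches, as (x_min, y_min, x_max, y_max) -- made-up numbers.
KEEP_OUT = [
    (40, 0, 90, 36),       # e.g. a couch against one wall
    (120, 100, 160, 144),  # e.g. a tv stand in the corner
]

def blocked(x, y):
    """True if a proposed dot position lands on furniture."""
    return any(x0 <= x <= x1 and y0 <= y <= y1
               for x0, y0, x1, y1 in KEEP_OUT)

print(blocked(50, 10), blocked(5, 5))  # prints True False
```

A fuller version would also route the dot *around* a blocked rectangle instead of just vetoing the move.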
The parts all started showing up yesterday, and now I have everything but the motors.
I was able to get communication to and from the Arduino working through Python even more easily than I expected, and I messed around a bit with OpenCV for the motion tracking on the camera.
I will keep track of my build in this thread. Constructive criticism is encouraged, but be warned: if you tell me "It can't be done" I will only take that as a challenge to prove you wrong (no hard feelings, right?)