I have an idea to produce a small automated drone (probably just for indoors, for now at least) that will be able to be programmed to fly between rooms in my house simply by telling it to go to "living room" for example.
My first issue is the sensors. GPS would probably be too pricey and also not accurate enough at such a small scale.
The best bet that I can think of would be sonar, which would theoretically allow me to map the walls and other objects near the drone. Could this also be used to determine height in the room?
Another issue that I have is recognizing which room it is in, as well as which direction it is facing. I know there are some compass modules, but again I want this to be very small. Would there be a way for me to "map" out my house and use the sonar sensors to verify its location? Should I try to use RFID in the rooms?
Another major issue that I have is physical control and data transfer. Obviously, it would need two microcontrollers, one sending a signal based on the input and one receiving the signal and controlling the drone. Both would have to do both, however, since there is feedback. The most comfortable way for me to do this is IR, but that most likely will not work from room to room. Would Bluetooth or Wi-Fi be a viable option? Or do I need to go straight to RF? Should the automation be on the drone, or in the remote? Would there be too much of a delay for it to be in the controller?
Finally, how could I build the project? I do not have a 3D printer, mill, laser cutter, etc. How many sensors should I use?
I know I am asking a lot of questions, but I'm pretty sure I am in over my head here. And, given the way my brain works, it's going to be difficult to not complete this.
tyler_newcomb:
which would theoretically allow me to map the walls and other objects near the drone.
That is a VERY big task. If you manage to do that, contact me
tyler_newcomb:
Could this also be used to determine height in the room?
Sure: one sonar facing downwards, one facing upwards; both distances plus the distance between the sensors added together gives the height of the room, as long as there is nothing other than floor and ceiling below/above the drone.
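A minimal sketch of that idea, assuming two HC-SR04s (the pin numbers and the 10 cm gap between the sensors are just placeholders):

// Two HC-SR04 sonars: one pointing up, one pointing down.
const int TRIG_UP = 2, ECHO_UP = 3;      // assumed pins
const int TRIG_DOWN = 4, ECHO_DOWN = 5;  // assumed pins
const float SENSOR_GAP_CM = 10.0;        // assumed vertical distance between the two sensors

float readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000);  // echo pulse width, 30 ms timeout (~5 m)
  return duration / 58.0;                         // ~58 us per centimetre
}

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_UP, OUTPUT);   pinMode(ECHO_UP, INPUT);
  pinMode(TRIG_DOWN, OUTPUT); pinMode(ECHO_DOWN, INPUT);
}

void loop() {
  float up = readDistanceCm(TRIG_UP, ECHO_UP);        // distance to the ceiling
  float down = readDistanceCm(TRIG_DOWN, ECHO_DOWN);  // distance to the floor (= altitude)
  Serial.print("Altitude: ");
  Serial.print(down);
  Serial.print(" cm, room height: ");
  Serial.println(up + down + SENSOR_GAP_CM);
  delay(200);
}

The downward reading on its own is also the altitude estimate you would feed to the flight controller.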
tyler_newcomb:
Another major issue that I have is physical control and data transfer. Obviously, it would need two microcontrollers, one sending a signal based on the input and one receiving the signal and controlling the drone. Both would have to do both, however, since there is feedback.
To be honest, you've lost me there. Can you explain a bit more?
tyler_newcomb:
The most comfortable way for me to do this is IR, but that most likely will not work from room to room. Would Bluetooth or Wi-Fi be a viable option? Or do I need to go straight to RF?
If you have a good Wi-Fi coverage in your home, I would use that. Another option would be nRF24L01+ modules. Cheap, good range, easy to use.
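With the widely used RF24 library, the remote side can be as small as this rough sketch (the CE/CSN pins, the pipe address and sending the room name as plain text are all just assumptions):

#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                 // assumed CE, CSN pins
const byte address[6] = "DRONE";   // assumed pipe address

void setup() {
  radio.begin();
  radio.openWritingPipe(address);
  radio.setPALevel(RF24_PA_LOW);   // short indoor range, saves power
  radio.stopListening();           // this node only transmits
}

void loop() {
  char command[] = "living room";  // the target room, as plain text
  radio.write(&command, sizeof(command));
  delay(1000);
}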
tyler_newcomb:
Should the automation be on the drone, or in the remote? Would there be too much of a delay for it to be in the controller?
On the drone. You don't want it to crash if you lose the signal.
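Roughly what I mean on the drone side, again only a sketch: the matching receiver with a timeout. The 2-second timeout is an assumption, and the commented-out land()/navigateTo() calls just stand in for your own flight code:

#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                            // assumed CE, CSN pins
const byte address[6] = "DRONE";              // must match the remote
const unsigned long SIGNAL_TIMEOUT_MS = 2000; // assumed: 2 s without a packet
unsigned long lastPacketMs = 0;
char target[16] = "hold";

void setup() {
  radio.begin();
  radio.openReadingPipe(1, address);
  radio.startListening();
}

void loop() {
  if (radio.available()) {
    radio.read(&target, sizeof(target));      // e.g. "living room"
    lastPacketMs = millis();
  }
  if (millis() - lastPacketMs > SIGNAL_TIMEOUT_MS) {
    // Failsafe: no commands for a while, so land instead of flying blind.
    // land();
  } else {
    // navigateTo(target);  // the autonomy itself also lives here, on the drone
  }
}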
tyler_newcomb:
Finally, How could I build the project? I do not have a 3D printer, mill, laser cutter, etc.
Building a quadcopter is simple and you don't need any expensive tools. There are a lot of tutorials on the internet.
Regards, couka
Sorry, I hit enter accidentally. The transfer and receiving of data is no longer an issue, since I decided, with your advice, to keep the automation on the drone. How would I go about mapping the room? Is there a laser scanner for it, with code that I could try out?
Your problem is not the mapping itself; it's finding out where the drone is when you only have some sonar measurements and the map.
A very well made and trained neural network might be a possible tool to achieve that. Yep, seriously, a neural network.
I failed doing that in 2 dimensions (i.e. on the floor); if you manage to do it in 3 dimensions, you have my respect.
Even if you do that, you have to train it very well. In this case that means putting the drone in a known location and entering that location by hand. Thousands and thousands of times.
I think I once made a Scratch "program" using another empirical method for 2D indoor navigation when I was bored in IT class... Let me see if I can find it...
I have an idea to create an autonomous indoor drone.
The largest setback for me would be creating a map of the environment and sensing where the drone is within it. I was thinking 3D mapping, but the problem is that the robot will never be able to accurately know where it is in an environment without GPS, or am I wrong?
3D mapping takes up TONS of memory, but I would want to do this whole thing off of an Uno and probably a Micro or some other small controller just to send which room it should go to.
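To put some rough numbers on that (a made-up 10 m x 10 m x 3 m house at 25 cm resolution, so this is only an illustration):

// Back-of-envelope check of grid memory against the Uno's 2 KB of SRAM.
const long CELLS_X = 40;   // 10 m / 0.25 m
const long CELLS_Y = 40;   // 10 m / 0.25 m
const long CELLS_Z = 12;   //  3 m / 0.25 m
const long TOTAL_CELLS = CELLS_X * CELLS_Y * CELLS_Z;  // 19,200 cells
const long BYTES_BITPACKED = TOTAL_CELLS / 8;          // 2,400 bytes at 1 bit per cell

void setup() {
  Serial.begin(9600);
  Serial.print("Cells: ");
  Serial.println(TOTAL_CELLS);
  Serial.print("Bytes at 1 bit per cell: ");
  Serial.println(BYTES_BITPACKED);
  // 2,400 bytes is already more than the Uno's 2,048 bytes of SRAM,
  // before any flight code, radio buffers or sonar handling.
}

void loop() {}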
I am NOT looking for full code, but maybe examples or a detailed explanation.
These are some of the reasons that the 'clever' indoor drone nav technology uses multiple room-mounted cameras and reflectors mounted on the drone(s).
The cameras send spatial scene info to complex 3D processing algorithms on land-based systems, then positioning info is transmitted up to the drone - not the other way around.
This also permits multiple drones to operate in the same space without colliding, or the added burden of heavy, complex flying technology.
So, what exactly is a neural network? And how difficult would it be to learn and integrate?
The goal would be to have the drone navigate a new environment by mapping it as it goes. For example, it starts in a bedroom, mapping that out so that it can navigate there; then, when prompted, it moves into the hallway and maps that as well. Is this feasible?
P.S. I wasn't sure whether the programming questions should be asked in this forum or in the programming thread. Sorry for any troubles.
tyler_newcomb:
Well, is it possible? Could it be done on another Arduino? What makes it not well suited to an Arduino? Is it the memory, the language?
Simple neural networks, with few input and output nodes and layers, could be implemented on an Arduino, but the one you need... no way. Both memory and processing power are too low by orders of magnitude. A Raspberry Pi might do it, I'm not sure.
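For scale, a "few nodes" network like the sketch below (4 sonar inputs, one hidden layer of 5, 2 outputs, placeholder weights, nothing trained) fits on an Arduino without trouble; a network big enough to localise a drone from sonar in 3D would not:

#include <math.h>

const int N_IN = 4, N_HID = 5, N_OUT = 2;
float wIH[N_HID][N_IN];    // input->hidden weights (placeholders, all zero here)
float wHO[N_OUT][N_HID];   // hidden->output weights (placeholders, all zero here)

float sigmoidf(float x) { return 1.0 / (1.0 + exp(-x)); }

void forward(const float in[N_IN], float out[N_OUT]) {
  float hid[N_HID];
  for (int h = 0; h < N_HID; h++) {
    float sum = 0;
    for (int i = 0; i < N_IN; i++) sum += wIH[h][i] * in[i];
    hid[h] = sigmoidf(sum);
  }
  for (int o = 0; o < N_OUT; o++) {
    float sum = 0;
    for (int h = 0; h < N_HID; h++) sum += wHO[o][h] * hid[h];
    out[o] = sigmoidf(sum);
  }
}

void setup() {
  Serial.begin(9600);
  float sonar[N_IN] = {1.2, 0.8, 2.5, 0.3};   // example sonar distances in metres
  float pos[N_OUT];                           // would be an (x, y) estimate after training
  forward(sonar, pos);
  Serial.print("x: ");
  Serial.print(pos[0]);
  Serial.print("  y: ");
  Serial.println(pos[1]);
}

void loop() {}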
Anyways, this is theory, don't waste your time on it.
So if a neural network on an Arduino is not feasible, what would the remote 3D model entail? With the cameras in each room? What system could process the information and communicate with the Arduino onboard the drone?
^^^ That is an open question...
How many drones operating in what volume of space, what distances and speeds.
How close is too close... etc
I've seen demos of these systems with four or more cameras, running three UAVs, with hundreds of MIPS of processing - plus dedicated GPUs for inertial/predictive algorithms.
I'd budget anything over 20K for the ground station (up to say 100K), and allow 2-5K per drone. Range is limited to something less than line of sight.
Software and labour/space rental extra.