Fire sensor array

I am planning to build a fire sensor array for a school project, but I need some help. The plan is to have a pair of arrays. Each array is simply a couple of sensors set up to cover a wide area; the idea is that with two such arrays, the overlapping sensor coverage can be used to work out where the fire is between them, using triangulation of some sort. Which types of UV and/or IR sensors would work best for that? I have an Arduino UNO. Also, where could I find code to implement the triangulation?
Thanks in advance

Here is some background information:

Home and office fire detectors usually work on heat and smoke, not IR/UV; there is probably a reason for that.

Wikipedia lists applications of UV/IR sensors as:

Fire systems detect fires in zones. Trying to get one to triangulate a fire sounds tricky. Using CCTV to determine the exact location after the alarm is raised may be the best bet.

Thanks for the help. This is supposed to be a prototype of a sensor setup for veldt fires; that's why smoke detection wouldn't work, and why we speculated about IR or UV.

The Panasonic Grid-EYE sensor could probably do everything you want. Until recently it was only accessible to professionals, but Adafruit now has a breakout board for it.

Because you mentioned school, I put two and two together and came up with five, thinking it was an indoor problem.

The wiki link still applies. Outdoors, sunlight is going to be a big issue. You also need to look at sensor costs; they might be expensive. Range might also be important: I can see the visible light from a fire from a long distance away, but I can only feel the heat on my skin when I am quite close, so I am not sure from how far away IR can be 'seen'.

The ideas in the wiki about comparing the ratios of different wavelengths might be worth exploring. It also might be worth thinking about taking still images and subtracting one from the other to see what has changed; fire and smoke might cause big changes (though the Arduino is not good at image processing).

Here is a trial of a bush fire detection system

They seem to be looking for smoke during the day and fire glow at night. Interestingly, they are using black and white images.

Here is a report on three systems, one of which uses off-the-shelf cameras.

It is a good idea to research how other people solved the problem.

This topic seems like quite a challenging one for a school project. You might want to very carefully define your objectives so that you have a problem you can solve.

I heard a few decades ago that human fire watchers look specifically for smoke during the day (as the saying goes: where there's fire, there's smoke), and I suppose for fire glow at night. Both are easy for human observers to pick out.

For automated fire detection I'd think of IR, especially at night when the sun doesn't interfere: fire is much hotter than the surroundings and will produce a column of hot air, both of which should be very clear on an IR image.

During the day it's harder. Smoke detection could work based on the overall brightness of an image: smoke will be much less bright than the surrounding clear sky, or even a cloudy one. Clouds emit light (because the sun shines on them from the top); smoke, not so much.