Check the GitHub page for the complete description and all files related to the project: https://github.com/gdelazzari/SpaceApps2017
The description from GitHub is reproduced below.
Every year wildfires break out all over the world, causing enormous damage in terms of both deaths and economic loss. Take a look here for an example.
We developed a system that takes into consideration both user alerts (made through a mobile app) and signals from an "insight system" (which uses data like fire risk maps from NASA and other sources, weather data, custom sensor data, historical data, etc.) to manage a drone swarm capable of autonomously flying over the affected area and capturing real-time imagery of the fire. That imagery is then processed by a computer vision algorithm to generate a live "fire/smoke size & spreading map".
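To make the idea concrete, here is a minimal sketch (in Python, not the actual project code) of how the insight system might fuse these inputs into a single score that decides whether to dispatch a patrol. All field names, weights, and thresholds here are illustrative assumptions:

```python
# Hypothetical sketch of the insight system's input fusion.
# Weights and thresholds are made-up values, not tuned project parameters.

def alert_score(user_reports: int,
                fire_risk_index: float,   # e.g. 0..1 derived from NASA fire risk maps
                temperature_c: float,     # from sensors deployed in the area
                humidity_pct: float) -> float:
    """Return a 0..1 score; above a threshold, a drone patrol is dispatched."""
    score = 0.0
    score += min(user_reports, 3) * 0.25   # user alerts weigh heavily
    score += fire_risk_index * 0.3         # risk map contribution
    if temperature_c > 35:                 # hot conditions
        score += 0.15
    if humidity_pct < 25:                  # dry conditions
        score += 0.1
    return min(score, 1.0)

if alert_score(user_reports=1, fire_risk_index=0.7,
               temperature_c=38, humidity_pct=18) > 0.5:
    print("dispatch drone patrol")
```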
Once you have this real-time representation of the phenomenon, you can alert the users in the area and show them (through the mobile app) the forecasted fire spread in real time, so they can figure out the best escape routes. Most importantly, the live data can be used by the firefighters to arrive prepared, choosing the best vehicles and tools to extinguish the fire and therefore optimizing their resources (which also brings an economic benefit).
Another bonus feature: by combining wind data from weather stations or custom wind sensors deployed in the surrounding area (and maybe also on the drone stations themselves), the direction of the smoke and ash can be predicted, and nearby cities can be alerted about the danger in time.
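As a rough illustration, a first approximation of the plume displacement could be computed from a single wind reading. This is a hypothetical sketch assuming constant wind and a flat-earth approximation, not the project's actual model:

```python
import math

def plume_offset_km(wind_speed_ms: float, wind_dir_deg: float, hours: float):
    """Displacement (east_km, north_km) of the smoke plume after `hours`.
    wind_dir_deg is the direction the wind blows TOWARDS, in degrees from north."""
    distance_km = wind_speed_ms * 3600 * hours / 1000
    rad = math.radians(wind_dir_deg)
    return distance_km * math.sin(rad), distance_km * math.cos(rad)

east, north = plume_offset_km(wind_speed_ms=8, wind_dir_deg=45, hours=2)
print(f"plume centre ~{east:.1f} km E, {north:.1f} km N of the fire")
```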
Moreover, a system like this allows a fire to be detected within minutes instead of hours (satellite live fire data has a latency of up to 3 hours - source). This is feasible thanks to the insight system, which, even without a user report, can guess that something is happening from temperature/thermal sensors deployed in the area. Even without this kind of data, the system can plan timed drone patrols based on the calculated risk for that particular time of the year/month/day, all while taking into consideration the battery life of the drones and leaving enough charge for user reports that require an instant patrol flight.
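A minimal sketch of that patrol-planning logic, with made-up battery figures and risk thresholds, could look like this:

```python
# Illustrative only: schedule a routine patrol only when the risk justifies it
# AND enough battery remains for an emergency flight triggered by a user report.

EMERGENCY_RESERVE_PCT = 40   # always keep charge for an instant patrol flight
PATROL_COST_PCT = 25         # rough battery cost of one routine patrol

def can_schedule_patrol(battery_pct: float, risk_index: float) -> bool:
    """Return True if a routine patrol can fly without eating into the reserve."""
    if battery_pct - PATROL_COST_PCT < EMERGENCY_RESERVE_PCT:
        return False
    return risk_index > 0.6   # patrol only in high-risk windows

print(can_schedule_patrol(battery_pct=90, risk_index=0.8))  # True
print(can_schedule_patrol(battery_pct=55, risk_index=0.8))  # False: reserve would be broken
```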
The system is also expandable and could be used, for example, to report the live status of a volcanic eruption or an avalanche/flood, or possibly to find people lost in the area. The input data for the insight system is also flexible, meaning you can, for example, add more sources like Twitter or other social networks, triggering the drones when someone tweets or posts about a fire that has started. Furthermore, you can also use custom sensors tailored to the scenario: for instance, if a forest lies at the foot of a high hill, you can place a thermal camera on top of the hill to instantly detect any possible flame.
The drones will be equipped with an auto-pilot system, a GPS unit, a video camera (or optionally a more expensive thermal camera, but it's possible to rely only on standard cameras) and a wireless transmitter to communicate with the rest of the system, sending the video stream and other kinds of data (such as their position and current battery life). Whether the drones will send data directly to the server (and therefore need their own internet connection) or route it through their respective ground stations first is a design choice that has not yet been addressed. However, the future arrival of 5G networks will certainly ease the design and implementation of the communication part.
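For illustration only, a telemetry packet of the kind described above might look like the following; the field names are our assumptions, not a defined protocol:

```python
import json
import time

# Hypothetical sketch of a drone telemetry packet sent to the server,
# either directly or via the drone's ground station.

def make_telemetry(drone_id: str, lat: float, lon: float,
                   battery_pct: float) -> str:
    return json.dumps({
        "drone_id": drone_id,
        "timestamp": time.time(),
        "position": {"lat": lat, "lon": lon},
        "battery_pct": battery_pct,
        # the video stream itself would travel over a separate channel
    })

print(make_telemetry("drone-01", 46.07, 11.12, battery_pct=78.5))
```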
With that said, the drones can be relatively cheap: they recharge while docked in their ground station, which gets power from the sun. The ground station also provides infrared lights and other kinds of reference points so the drone can land precisely.
Automatic drone landing on a recharging base station is entirely possible, as various videos on the web show: example 1, example 2.
The live video stream captured by a drone during a fire can quite easily be processed with computer vision or machine learning algorithms such as CNNs (Convolutional Neural Networks). For our demo, we trained a really simple feed-forward neural network that classified 8×8-pixel cells of the video frames into one of a few categories.
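Below is a self-contained sketch of the idea (not the actual code in the NeuralNetwork directory): cut each frame into 8×8-pixel cells and run a tiny feed-forward network over them. The three-class layout (background/smoke/fire) and the untrained random weights are placeholder assumptions:

```python
import numpy as np

def frame_to_cells(frame: np.ndarray, cell: int = 8) -> np.ndarray:
    """Cut an (H, W, 3) RGB frame into flattened 8x8x3 cells, scaled to 0..1."""
    h, w, _ = frame.shape
    cells = (frame[:h - h % cell, :w - w % cell]
             .reshape(h // cell, cell, w // cell, cell, 3)
             .swapaxes(1, 2)
             .reshape(-1, cell * cell * 3))
    return cells / 255.0

class TinyMLP:
    """One hidden layer plus softmax output: enough for a demo classifier.
    NOTE: weights here are random placeholders; the real demo trained them
    on labeled cells."""
    def __init__(self, n_in=8 * 8 * 3, n_hidden=32, n_out=3, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.w2 = rng.normal(0, 0.1, (n_hidden, n_out))

    def predict(self, x: np.ndarray) -> np.ndarray:
        h = np.maximum(x @ self.w1, 0)   # ReLU hidden layer
        logits = h @ self.w2
        e = np.exp(logits - logits.max(axis=1, keepdims=True))  # stable softmax
        return (e / e.sum(axis=1, keepdims=True)).argmax(axis=1)

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)  # fake frame
labels = TinyMLP().predict(frame_to_cells(frame))  # one label per 8x8 cell
print(labels.shape)  # (4800,) = (480/8) * (640/8) cells
```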
Here is a YouTube video showing a quick demo of our neural network working on a 10-second video.
You can find the code used for the demo in the NeuralNetwork directory. For our demo a NN was probably overkill, since we could just as well have used a simpler classifier (maybe based on the average color of each 8×8 cell)... but neural networks are cool, so why not.
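For comparison, the simpler average-color baseline mentioned above could be sketched like this (the thresholds are illustrative guesses, not tuned values):

```python
import numpy as np

def classify_cell_by_color(cell: np.ndarray) -> str:
    """cell: (8, 8, 3) RGB array. Returns 'fire', 'smoke' or 'background'."""
    r, g, b = cell.reshape(-1, 3).mean(axis=0)
    if r > 180 and g < 140 and b < 100:                  # bright, red-dominant -> flames
        return "fire"
    if abs(r - g) < 20 and abs(g - b) < 20 and r > 120:  # light grey -> smoke
        return "smoke"
    return "background"

print(classify_cell_by_color(np.full((8, 8, 3), [220, 90, 40])))  # fire
```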
Anyway, that was just to prove that live image analysis to obtain a "fire map" is entirely possible.
The app, which anyone will be able to download, is really simple to use: it shows a map that lets you select the location where you spotted the fire and send an alert. You can also view the fire risk map, so you know to be careful in a certain area (a "Warning: you are in a fire risk area" alert is also possible). On the right you can see a mockup we made using MIT App Inventor.
In the event of a nearby fire, as soon as the drones start to report live data on the phenomenon, all users located near the danger are alerted with a notification/sound alarm, and the analyzed representation of the flames and smoke is shown in real time. This allows everyone to find the best way out and/or prepare in case smoke and ash are coming in their direction.
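A hedged sketch of that "alert nearby users" step, assuming the backend knows each user's last reported position, might be a simple radius check:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a))

def users_to_alert(users, fire_lat, fire_lon, radius_km=10):
    """users: list of (user_id, lat, lon) tuples; names here are assumptions."""
    return [uid for uid, lat, lon in users
            if haversine_km(lat, lon, fire_lat, fire_lon) <= radius_km]

users = [("alice", 46.05, 11.10), ("bob", 46.50, 11.60)]
print(users_to_alert(users, fire_lat=46.07, fire_lon=11.12))  # ['alice']
```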
SpaceApps is a NASA incubator innovation program.