Project Description:
As part of understanding human impact on the environment, scientists are very interested in tracking and analyzing a variety of species, including birds.
Until recently, scientists had to manually tag birds to track their migrations. With the communication boom that came with the internet and smartphones, scientists can now leverage an already existing community of dedicated birdwatchers to track birds with far greater efficiency. For example, the Cornell Lab of Ornithology allows users to submit bird sightings as a means of increasing the data available for scientists [1]. Our project expands on these ideas with a simple Android application that facilitates data collection.
Birdwatchers love to capture pictures of birds in their natural habitats, trekking through sunshine or rain. With this application, birdwatchers can take pictures of birds they spot and upload them directly to a server along with the GPS location. Our server uses machine learning to identify the species of bird and creates a record of that sighting in the database. This greatly enhances the user experience, because users do not need to recognize the bird themselves to expand the database. Users can then view, on Google Maps, pictures of birds taken by other people. Ultimately, as the dataset grows, the app would allow scientists to analyze migration patterns.
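To illustrate this upload flow, the following is a minimal sketch of how such a Flask endpoint could look. The route, request field names, the Sighting model, and the identify_species helper are hypothetical placeholders, not the exact names used in our repository.

```python
# Minimal sketch of the photo-upload endpoint, using Flask and the
# Flask-SQLAlchemy integration. Route, field, model, and helper names
# are hypothetical placeholders, not the exact ones in our repository.
import os

from flask import Flask, request, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///birds.db'
db = SQLAlchemy(app)


class Sighting(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    species = db.Column(db.String(120))
    latitude = db.Column(db.Float)
    longitude = db.Column(db.Float)
    image_path = db.Column(db.String(255))


def identify_species(image_path):
    # Placeholder: in the real application this calls the TensorFlow
    # classifier (see the sketch in the next paragraph) and returns the
    # best-matching species name.
    return 'unknown'


@app.route('/sightings', methods=['POST'])
def add_sighting():
    # The Android client uploads the photo together with the GPS fix
    # recorded when the picture was taken.
    photo = request.files['photo']
    latitude = float(request.form['latitude'])
    longitude = float(request.form['longitude'])

    # Save the image and classify it (the uploads/ directory is assumed
    # to exist).
    image_path = os.path.join('uploads', photo.filename)
    photo.save(image_path)
    species = identify_species(image_path)

    # Record the sighting so it can later be shown on the map.
    sighting = Sighting(species=species, latitude=latitude,
                        longitude=longitude, image_path=image_path)
    db.session.add(sighting)
    db.session.commit()
    return jsonify({'id': sighting.id, 'species': species}), 201


if __name__ == '__main__':
    with app.app_context():
        db.create_all()
    app.run()
```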
Our application uses the open-source TensorFlow software to identify birds. This software uses machine learning to analyze patterns in images and train itself to identify birds. Starting from the default dataset that comes with TensorFlow, we were able to train the algorithm on a subset of the Caltech-UCSD Birds 200 image dataset [2] to improve the accuracy of bird identification. With further training, the system should be able to identify a wider variety of birds with greater accuracy. Currently, the server returns the bird with the highest percent match to the user, along with the next two closest matches.
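The sketch below shows how a retrained TensorFlow 1.x graph can be queried for the three best matches, in the spirit of the standard image-retraining example. The graph file, label file, and tensor names are assumptions taken from that example, not necessarily the exact values in our repository.

```python
# Sketch of top-3 classification with a retrained TensorFlow 1.x graph.
# The graph/label file names and the tensor names ('final_result:0',
# 'DecodeJpeg/contents:0') follow the stock image-retraining example
# and are assumptions, not necessarily our exact configuration.
import tensorflow as tf


def classify_top3(image_path, graph_path='retrained_graph.pb',
                  labels_path='retrained_labels.txt'):
    # Species labels produced during retraining, one per line, in the
    # same order as the classifier's output scores.
    with open(labels_path) as f:
        labels = [line.strip() for line in f]

    # Load the frozen, retrained graph into a fresh Graph object.
    graph = tf.Graph()
    with graph.as_default():
        graph_def = tf.GraphDef()
        with tf.gfile.GFile(graph_path, 'rb') as f:
            graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name='')

    # Read the raw JPEG bytes; the graph decodes them itself.
    with tf.gfile.GFile(image_path, 'rb') as f:
        image_data = f.read()

    with tf.Session(graph=graph) as sess:
        softmax = sess.graph.get_tensor_by_name('final_result:0')
        scores = sess.run(softmax, {'DecodeJpeg/contents:0': image_data})[0]

    # Return the three best matches with their confidence scores.
    ranked = sorted(zip(labels, scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:3]
```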
Our current implementation uses a rudimentary search algorithm to let users search the database. For example, searching with the letter “a” returns all birds whose species name begins with “a”. Over time, the search parameters could be expanded to cover specific areas and time periods, which would enhance scientists' ability to analyze the data.
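For reference, a prefix search of this kind can be expressed as a single SQLAlchemy filter. The sketch below is illustrative only; the Bird model and column names are hypothetical.

```python
# Sketch of the prefix search over species names. The Bird model and
# column names are hypothetical, not the exact schema in our repository.
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()


class Bird(Base):
    __tablename__ = 'birds'
    id = Column(Integer, primary_key=True)
    species = Column(String)


engine = create_engine('sqlite:///birds.db')
Session = sessionmaker(bind=engine)


def search_species(prefix):
    # Case-insensitive "starts with" match: search_species('a') returns
    # every recorded bird whose species name begins with 'a' or 'A'.
    session = Session()
    try:
        return session.query(Bird).filter(Bird.species.ilike(prefix + '%')).all()
    finally:
        session.close()
```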
Team Members:
Abdulwahaab Ahmed
Abdulwahaab is a second-year computer science student at the University of Ottawa. He is currently working as a security DevOps intern at Shopify. He started programming when he was 8 years old and has been passionate about it ever since. Later on, he discovered a passion for cybersecurity. In his spare time he likes to work on side projects or play in cybersecurity CTFs.
Vikram Bombhi
Vikram is a second-year software engineering student at Carleton University. He has always been fascinated by space, joining the Carleton Planetary Robotics Club as a freshman. He loves everything new in tech, from machine learning to the latest microprocessors. His fascination with space is due in part to the ingenious ways engineers and scientists solve problems, and he dreams of being part of that process one day.
William Wang
William is a grade 10 student at Colonel By Secondary School. In his spare time he likes doing cross country running and track and field. His favorite subject is mathematics.
Michael Dysart
Michael is a second year software engineering student at Carleton University. He currently works as a software development intern at KX Labs in Ottawa. He is also a member of the Carleton Planetary Robotics Team, where he helps build rovers for international competitions. In his spare time he enjoys running and spending time with friends.
Resources Used:
Development:
-Android Studio
Frontend:
-Google Maps API
-Android
Backend:
-Flask
-SQLAlchemy
-TensorFlow
Video Editing:
-Adobe Premiere
-Adobe Photoshop
A mock dataset was created for the purposes of the demonstration using fictional GPS coordinates. Pictures for this dataset were found at [2].
The photo of the Indigo Bunting was found at [3].
The video used the “Action Hero” music from the YouTube Audio Library [7].
Video Icon Credits:
Android Icon:
Designed by Pixel perfect from www.flaticon.com
Oculus Rift:
Designed by Freepik from www.flaticon.com
Map Icon:
Designed by Vectors Market from www.flaticon.com
Github Links:
Frontend: https://github.com/MichaelDysart/SpaceAppsApplicat...
Backend: https://github.com/VikramBombhi/SpaceApps/
Citations:
[1] “What We Do Citizen Science”, Birds.cornell.edu, 2017. [Online]. Available: http://www.birds.cornell.edu/page.aspx?pid=1664 [Accessed: 03-May-2017].
[2] Welinder P., Branson S., Mita T., Wah C., Schroff F., Belongie S., Perona, P. “Caltech-UCSD Birds 200”. California Institute of Technology. CNS-TR-2010-001. 2010.
[3] D. Scranton, “Indigo Bunting.”, Flickr, 2012. [Online] Available: https://www.flickr.com/photos/68782989@N03/6900656764/ [Accessed: 03-May-2017].
[4] Pixel Perfect, “Android”, Flaticon, [Online] Available: http://www.flaticon.com/free-icon/android_183315 [Accessed: 04-May-2017].
[5] Freepik, “Oculus Rift”, Flaticon, [Online] Available: http://www.flaticon.com/free-icon/oculus-rift_314579 [Accessed: 04-May-2017].
[6] Vectors Market, “Map Icon”, Flaticon, [Online] Available: http://www.flaticon.com/free-icon/map_235861 [Accessed: 04-May-2017].
[7] “Action Hero”, YouTube Audio Library. [Online] Available: https://www.youtube.com/audiolibrary/music [Accessed: 04-May-2017].