Background
The app has two goals: visualizing the satellites orbiting Earth (and letting the user fly one), and finding "co-location" points where two instruments observe the same spot.
I've tried to solve both of these goals with a single iPad app. The Earth and the satellites orbiting it are rendered with SceneKit. When the user selects a satellite, its onboard camera goes live, and the user sees the other satellites and space debris around it. The satellite can be controlled just like a real spacecraft, using Pitch, Yaw and Roll sliders.
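Here is a minimal sketch of how such a scene and slider hookup could look in SceneKit; the class and texture names are assumptions, not the app's actual code:

```swift
import SceneKit
import UIKit

// Minimal sketch, assuming an "earth" texture image in the app bundle.
final class OrbitViewController: UIViewController {
    let sceneView = SCNView()
    let satelliteNode = SCNNode(geometry: SCNSphere(radius: 0.05))

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        let scene = SCNScene()
        sceneView.scene = scene

        // Textured Earth sphere at the origin.
        let earth = SCNNode(geometry: SCNSphere(radius: 1.0))
        earth.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "earth")
        scene.rootNode.addChildNode(earth)

        // Place the satellite on a simple circular orbit for illustration;
        // the real app would position it from orbital data.
        satelliteNode.position = SCNVector3(1.5, 0, 0)
        scene.rootNode.addChildNode(satelliteNode)
    }

    // Called whenever one of the Pitch / Yaw / Roll UISliders changes;
    // each slider maps directly to one Euler angle (in radians).
    func attitudeChanged(pitch: Float, yaw: Float, roll: Float) {
        satelliteNode.eulerAngles = SCNVector3(pitch, yaw, roll)
    }
}
```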
To solve the "co-location" problem, I recorded each instrument's field of view, projected it onto the map (virtually), and compared it pair-by-pair with every other instrument's. The pairs whose projected locations match are the co-location points we're looking for! To present the concept I went with a simple design, showing the two satellites' names, the instruments on them, and short descriptions of the experiments we'll perform.
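A sketch of that pair-by-pair comparison, assuming a simplified model where each instrument's projected field of view is a circle around the subsatellite point (the type names and circular-footprint simplification are mine, not the app's):

```swift
import Foundation

// One instrument's projected field of view on the ground.
struct Footprint {
    let satellite: String
    let instrument: String
    let lat: Double      // degrees
    let lon: Double      // degrees
    let radiusKm: Double // half-width of the projected field of view
}

// Great-circle distance between two ground points (haversine), in km.
func groundDistanceKm(_ a: Footprint, _ b: Footprint) -> Double {
    let r = 6371.0
    let dLat = (b.lat - a.lat) * .pi / 180
    let dLon = (b.lon - a.lon) * .pi / 180
    let h = sin(dLat / 2) * sin(dLat / 2)
          + cos(a.lat * .pi / 180) * cos(b.lat * .pi / 180)
          * sin(dLon / 2) * sin(dLon / 2)
    return 2 * r * asin(sqrt(h))
}

// Two instruments on different satellites are "co-located"
// when their projected footprints overlap.
func coLocatedPairs(_ footprints: [Footprint]) -> [(Footprint, Footprint)] {
    var pairs: [(Footprint, Footprint)] = []
    for i in 0..<footprints.count {
        for j in (i + 1)..<footprints.count
            where footprints[i].satellite != footprints[j].satellite {
            let a = footprints[i], b = footprints[j]
            if groundDistanceKm(a, b) <= a.radiusKm + b.radiusKm {
                pairs.append((a, b))
            }
        }
    }
    return pairs
}
```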
I also added a view of the latest image from NASA's DSCOVR spacecraft (the EPIC camera), as well as the ISS position with its predicted ground track.
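Fetching the latest EPIC frame could look roughly like this; the endpoint and JSON fields follow NASA's public EPIC API, but treat the exact URL layout as an assumption and check it against the current docs:

```swift
import Foundation

// One entry from the EPIC image list (fields per the public EPIC API).
struct EPICImage: Decodable {
    let image: String   // file name of the frame
    let date: String    // "yyyy-MM-dd HH:mm:ss"
}

func fetchLatestEPICImageURL(completion: @escaping (URL?) -> Void) {
    let listURL = URL(string: "https://epic.gsfc.nasa.gov/api/natural")!
    URLSession.shared.dataTask(with: listURL) { data, _, _ in
        guard let data = data,
              let frames = try? JSONDecoder().decode([EPICImage].self, from: data),
              let latest = frames.last else { return completion(nil) }
        // The archive path embeds the capture date as yyyy/MM/dd.
        let day = latest.date.prefix(10).replacingOccurrences(of: "-", with: "/")
        completion(URL(string:
            "https://epic.gsfc.nasa.gov/archive/natural/\(day)/png/\(latest.image).png"))
    }.resume()
}
```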
Resources
The main resource I used in code is the SGP4 library, which propagates satellite positions from their TLE (two-line element) sets, which I got from the Space-Track website. I've also used the NASA API (particularly the EPIC camera) and NASA resources (like the Earth texture and the NASA logo). For the ISS ground track I used the Google Maps iOS API.
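Drawing the predicted ground track with the Google Maps iOS SDK could look like the sketch below; the coordinates are assumed to already hold the SGP4-predicted subsatellite points, since the propagation step itself depends on the specific SGP4 wrapper used:

```swift
import GoogleMaps
import CoreLocation

// Draws the predicted ISS ground track as a polyline and marks
// the current position at the head of the track.
func drawGroundTrack(on mapView: GMSMapView, track: [CLLocationCoordinate2D]) {
    let path = GMSMutablePath()
    for point in track {
        path.add(point)
    }
    let polyline = GMSPolyline(path: path)
    polyline.strokeWidth = 2
    polyline.map = mapView

    if let current = track.first {
        let marker = GMSMarker(position: current)
        marker.title = "ISS"
        marker.map = mapView
    }
}
```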