This project solves the SpaceBot Stereo Vision challenge.

Description
We aim to make it easier for scientists to review the SPHERES experiment videos, specifically the quality and quantity of feature matches between the left and right views of the goggles. The primary aim is to expose any given point in the video timeline, allow efficient browsing and visualisation of interesting sections of that timeline, and let those sections be documented and saved for later.
- Application of SURF computer vision techniques to find and annotate feature matches for each keyframe of the video
- Feature annotation of the video to highlight these for later use
- Production of an API to expose the feature match data to an app
- An iPad app that displays the annotated video feed and links the currently viewed keyframe to visualisations using the API
- A facility to bookmark and make notes on significant sections of the video timeline
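The project itself uses SURF (via standard computer vision tooling) to detect and match features. As an illustrative sketch only, the core matching step can be shown library-free: given descriptor vectors for the left and right frames, keep each nearest-neighbour pairing that passes Lowe's ratio test. The function name and array shapes here are assumptions, not the project's actual code.

```python
import numpy as np

def ratio_test_matches(desc_left, desc_right, ratio=0.75):
    """Match each left descriptor to its nearest right descriptor,
    keeping only pairings that pass Lowe's ratio test.

    desc_left: (N, D) float array; desc_right: (M, D) float array, M >= 2.
    Returns a list of (left_index, right_index) pairs.
    """
    matches = []
    for i, d in enumerate(desc_left):
        # Euclidean distance from this descriptor to every right descriptor.
        dists = np.linalg.norm(desc_right - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Accept only if the best match is clearly better than the runner-up,
        # which filters out ambiguous matches between similar-looking features.
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

The same ratio-test idea applies unchanged to real SURF descriptors, which are simply 64- or 128-dimensional float vectors.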
Nice to have (if time):
- Creation of a point cloud from the feature matches to produce a 3D reconstruction of the SPHERES satellite
- Exposing this 3D model in the app and allowing touch exploration
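The point-cloud step above was not completed, but the idea can be sketched: with an idealised rectified stereo rig, each matched pixel pair yields a depth via the pinhole disparity relation Z = f·B/d. The function, the camera parameters, and the match format below are illustrative assumptions, not values from the SPHERES goggles.

```python
def triangulate(matches, focal_px, baseline_m, cx, cy):
    """Back-project matched pixel pairs from a rectified stereo pair
    into 3D points using Z = focal * baseline / disparity.

    matches: list of ((xl, yl), (xr, yr)) pixel coordinates.
    Returns a list of (X, Y, Z) points in metres; pairs with
    zero or negative disparity are skipped.
    """
    cloud = []
    for (xl, yl), (xr, _yr) in matches:
        d = xl - xr                      # disparity in pixels
        if d <= 0:
            continue                     # at or beyond infinity: unusable
        z = focal_px * baseline_m / d    # depth along the optical axis
        x = (xl - cx) * z / focal_px     # back-project through the left camera
        y = (yl - cy) * z / focal_px
        cloud.append((x, y, z))
    return cloud
```

A real pipeline would first rectify the goggle images and calibrate focal length and baseline; this sketch assumes that has already been done.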
We achieved our main goal of providing a tool that lets scientists investigate and visualise the quality of stereo image matches processed from YouTube source videos.
We provided features to bookmark interesting areas of the timeline and browse adjacent time slices interactively. The key visualisation was a scatter chart mapping match error rate on the Y axis against time slice (in seconds) on the X axis.
Live project demo:
Video of iPad app (running in browser) in action
License: MIT
Source Code/Project URL: https://github.com/MattDooner/spacebot-stereo-vision
Video Feeds - https://www.youtube.com/user/MITSpaceSystemsLab/videos