Space Sloth - Wearable Navigation Technology. We created a sensing helmet that detects obstacles using computer vision together with short-range IR and long-range LIDAR sensors. Sensor feedback was visualized on a publicly accessible server, and tactile feedback was delivered through the vibration of an LG smartwatch.
This project is solving the Sensor Yourself challenge.
Description
Space Sloth Wearable Navigation Technology
Using an Intel Edison, our team created a device and application that allow visually or otherwise impaired people to navigate around obstacles, guided by tactile feedback from a smartwatch. The technology could be applied in numerous space settings, from directing autonomous robots to assisting astronauts on the space station. We began by building an omnidirectional near-field sensor from five infrared proximity detectors, then added a far-field front sensor combining a LIDAR and a camera to classify objects as valid obstacles in the user's path. A Moto X and an LG smartwatch provided vibrating tactile feedback that told the user how far away the nearest obstacle was.
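As a rough illustration of the feedback loop described above, the sketch below fuses five near-field IR distances with the far-field LIDAR reading and maps the closest obstacle to a vibration strength for the watch. The function names, the 400 cm range, and the 0-255 intensity scale are illustrative assumptions, not values from the team's implementation.

```python
def nearest_obstacle_cm(ir_readings_cm, lidar_cm):
    """Distance to the closest detected obstacle, in centimetres,
    across the five IR proximity sensors and the forward LIDAR."""
    return min(list(ir_readings_cm) + [lidar_cm])


def vibration_intensity(distance_cm, max_range_cm=400):
    """Map distance to a 0-255 vibration strength: closer means stronger.
    Beyond max_range_cm the watch stays silent."""
    if distance_cm >= max_range_cm:
        return 0  # nothing in range: no vibration
    clamped = max(distance_cm, 0)
    return round(255 * (1 - clamped / max_range_cm))


if __name__ == "__main__":
    ir = [120, 300, 95, 210, 400]  # five IR proximity sensors (cm)
    lidar = 350                    # forward LIDAR reading (cm)
    d = nearest_obstacle_cm(ir, lidar)
    print(d, vibration_intensity(d))
```

In a real deployment this intensity would be sent from the Edison to the phone and relayed to the smartwatch's vibration motor.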
In the end we created Space Sloth: adding another dimension to sensing objects in new ways.
Our name was officially approved by astronaut Cady Coleman.
Resources used:
- Logitech camera (from Intel)
- Intel Edison
- IBM Bluemix
Android / Wearables code here: https://github.com/somya/SensorNavigator
License: MIT License
Source Code/Project URL: https://github.com/spaceappsnyc/SpaceSloth
Space Sloth - http://i1.kym-cdn.com/photos/images/newsfeed/000/437/645/a9d.jpg