airos

An augmented reality platform/operating system designed around gesture, voice, and maximum awareness of an astronaut's (or anyone's) surroundings. AirOS feeds live sensor data into the user's HUD and provides instant video communication with other astronauts, remote teleconferencing, voice control, and AI assistance. It monitors the user's vitals and surroundings for hazards such as asteroids (and the odd space tiger), and extends the user's senses through external sensors, akin to Jarvis in Iron Man.

This project is solving the Space Wearables: Designing for Today’s Launch & Research Stars challenge.

Description

For astronauts to achieve their maximum potential while working in extreme environments, information needs to be quickly and easily accessible without getting in the way. AirOS is a next-generation OS that manages plugins and modules (Intel Edison + sensors), presents them on an easy-to-understand HUD (heads-up display), and handles voice interaction using Watson.

Our project pipes live video straight into the Oculus Rift, creating a pseudo-augmented-reality environment. We then overlay a GUI on the video that live-updates temperature and detects whether a flame is present using sensors on the Intel Edison. We also capture gesture controls using the Leap Motion API, with the controller mounted on the front of the Oculus.
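
The sensor half of that pipeline could look roughly like the sketch below: a minimal Python loop on the Edison that reads a Grove analog temperature sensor and an analog flame sensor through the mraa bindings and prints one JSON line per second for the HUD overlay to consume. The pin assignments, thermistor constants, flame threshold, and JSON-over-stdout hand-off are assumptions for illustration, not the project's actual wiring or transport.

```python
# Hypothetical Edison-side sensor loop (assumed wiring: Grove temperature
# sensor on analog pin A0, analog flame sensor on A1).
import json
import math
import time

import mraa  # Intel's low-level I/O library, preinstalled on the Edison image

temp_in = mraa.Aio(0)    # assumed pin for the temperature sensor
flame_in = mraa.Aio(1)   # assumed pin for the flame sensor

B = 4275.0       # thermistor constant from the Grove datasheet
R0 = 100000.0    # nominal resistance at 25 degrees C

def read_temperature_c():
    """Convert the raw 10-bit ADC reading to degrees Celsius."""
    raw = temp_in.read()
    resistance = (1023.0 / raw - 1.0) * R0
    return 1.0 / (math.log(resistance / R0) / B + 1.0 / 298.15) - 273.15

def flame_detected(threshold=600):
    """Treat a reading above the (assumed) threshold as an open flame."""
    return flame_in.read() > threshold

while True:
    # One JSON line per second; the HUD overlay process can tail this stream.
    print(json.dumps({
        "temperature_c": round(read_temperature_c(), 1),
        "flame": flame_detected(),
        "timestamp": time.time(),
    }))
    time.sleep(1)
```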

This is a forward-looking project. Augmented reality is quickly becoming the de facto personal computing interface, as is easy to tell from the direction many big-name companies are moving. While AirOS definitely has a place in space and aeronautics, it is also a technology readily suited to day-to-day life.

  • Safety and vitals (Edison w/ sensors): improve safety and visibility of dangerous objects, and give important vital information (temperature, O2 levels, CO2 levels, flame detection, and other mission-critical data) in a digestible format

  • Communications (AR/VR + webcam): share video of your own perspective directly with others, see video from someone else's perspective, and get help as if the other person were there (see the streaming sketch after this list)

  • Instant answers (Watson): uses Watson for question answering, with speech-to-text for communications (see the Speech to Text sketch after this list)

  • Gesture controls (Leap Motion): uses hand gestures to control the UI (see the Leap Motion sketch after this list)
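
One way to prototype the "share my perspective" piece (a hypothetical illustration, not the project's actual Oculus video path): serve the helmet camera as an MJPEG stream over HTTP so a teammate's browser can watch it live. The camera index, port, and use of OpenCV here are assumptions.

```python
# Hypothetical perspective-sharing sketch: serve the head-mounted camera
# as an MJPEG stream that any browser on the network can open.
from http.server import BaseHTTPRequestHandler, HTTPServer

import cv2  # OpenCV for camera capture and JPEG encoding

camera = cv2.VideoCapture(0)  # assumed index for the helmet webcam

class StreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # multipart/x-mixed-replace is the classic MJPEG streaming trick.
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        while True:
            ok, frame = camera.read()
            if not ok:
                break
            ok, jpeg = cv2.imencode(".jpg", frame)
            if not ok:
                continue
            self.wfile.write(b"--frame\r\n")
            self.wfile.write(b"Content-Type: image/jpeg\r\n\r\n")
            self.wfile.write(jpeg.tobytes())
            self.wfile.write(b"\r\n")

# Teammates open http://<wearer-ip>:8080/ to see the wearer's point of view.
HTTPServer(("0.0.0.0", 8080), StreamHandler).serve_forever()
```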

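For the Watson piece, the era-appropriate approach was a plain REST call to the Bluemix Speech to Text service: post a WAV recording with the service credentials and read back the transcript, which can then be fed to a question-answering backend. The credentials and file name below are placeholders, and IBM has since changed its hostnames and auth scheme, so treat this as a sketch of the shape of the call rather than the project's exact integration.

```python
# Hypothetical Watson Speech to Text call (Bluemix-era REST API).
import requests

STT_URL = "https://stream.watsonplatform.net/speech-to-text/api/v1/recognize"
USERNAME = "your-service-username"   # placeholder service credentials
PASSWORD = "your-service-password"

def transcribe(wav_path):
    """Send a WAV recording to Watson and return the transcript."""
    with open(wav_path, "rb") as f:
        audio = f.read()
    resp = requests.post(
        STT_URL,
        auth=(USERNAME, PASSWORD),
        headers={"Content-Type": "audio/wav"},
        data=audio,
    )
    resp.raise_for_status()
    results = resp.json()["results"]
    return " ".join(r["alternatives"][0]["transcript"] for r in results)

if __name__ == "__main__":
    # e.g. the astronaut asks "what is my oxygen level" into the headset mic
    print(transcribe("question.wav"))
```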

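For gestures, the Leap Motion SDK exposes a frame/listener callback API. The sketch below uses the v2 SDK's Python bindings (the HUD itself more likely drives the JavaScript API) and maps horizontal swipes to paging between HUD panels; the panel-switching function is a placeholder.

```python
# Hypothetical Leap Motion gesture handler (Leap SDK v2 Python bindings).
import sys

import Leap

def switch_hud_panel(direction):
    """Placeholder: tell the HUD overlay to page left or right."""
    print("switch panel:", direction)

class HUDListener(Leap.Listener):
    def on_connect(self, controller):
        # Ask the Leap service to track swipe gestures for us.
        controller.enable_gesture(Leap.Gesture.TYPE_SWIPE)

    def on_frame(self, controller):
        frame = controller.frame()
        for gesture in frame.gestures():
            if gesture.type == Leap.Gesture.TYPE_SWIPE:
                swipe = Leap.SwipeGesture(gesture)
                # Horizontal swipe direction decides which way to page.
                switch_hud_panel("right" if swipe.direction.x > 0 else "left")

listener = HUDListener()
controller = Leap.Controller()
controller.add_listener(listener)

print("Press Enter to quit...")
sys.stdin.readline()
controller.remove_listener(listener)
```
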
Project Information


License: MIT license (MIT)


Source Code/Project URL: https://github.com/badave/airos.git


Resources


HUD 1 - https://www.dropbox.com/s/1rc237k1zmhcelf/ship.png?dl=0
HUD 2 - https://www.dropbox.com/s/0mtp24jid1p0kw9/space_apps.png?dl=0
Live Demo - https://www.dropbox.com/s/6wd93xfrpalv60g/Screenshot%202015-04-19%2016.33.56.png?dl=0

Team

  • Patrick Chamelo
  • Maria Roosiaas
  • Karl Aleksander Kongas
  • Scott Mobley
  • Marc Seitz
  • David Badley

