An augmented reality platform/operating system designed for gesture and voice control and maximum awareness of an astronaut's (or anyone's) surroundings. AirOS feeds live sensor data into the user's HUD, allows instant video communication with other astronauts, remote teleconferencing, voice control, and AI assistance. It monitors the user's vitals and surroundings for hazards such as asteroids (and the occasional space tiger), and extends the user's senses through external sensors, akin to Jarvis in Iron Man.

This project is solving the Space Wearables: Designing for Today’s Launch & Research Stars challenge.


For astronauts to achieve their maximum potential while working in extreme environments, information needs to be quickly and easily accessible without getting in the way. AirOS is a next-generation OS that manages plugins and modules (Intel Edison + sensors), presented on an easy-to-understand HUD (heads-up display), with voice interaction powered by Watson.

Our project pipes live video straight into the Oculus Rift, creating a pseudo-augmented-reality environment. We then overlay a GUI on the video, which live-updates temperature and detects whether a flame is present using Intel Edison sensors. We also detect gestures using the Leap Motion API, with the controller mounted on the front of the Oculus.
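The temperature and flame overlay described above boils down to a small conversion step between the Edison's raw analog readings and the values drawn on the HUD. The sketch below is a minimal, hardware-free illustration: it assumes a Grove-style thermistor on an analog pin (the B-constant formula is the standard Grove temperature sensor example) and a simple threshold for an analog flame sensor; the function and field names are our own, not part of any AirOS API.

```python
import math

# Assumed Grove thermistor constants (from the standard Grove example)
B = 4275.0       # thermistor B-value
R0 = 100000.0    # resistance at 25 degrees C

def raw_to_celsius(raw, adc_max=1023):
    """Convert a raw ADC reading (1..adc_max) to degrees Celsius."""
    r = R0 * (adc_max / float(raw) - 1.0)
    return 1.0 / (math.log(r / R0) / B + 1.0 / 298.15) - 273.15

def hud_frame(temp_raw, flame_raw, flame_threshold=600):
    """Build the data dict the GUI overlays on the Rift video feed."""
    return {
        "temperature_c": round(raw_to_celsius(temp_raw), 1),
        "flame": flame_raw > flame_threshold,  # hypothetical analog flame sensor
    }

# A mid-scale reading (512) corresponds to roughly 25 degrees C
print(hud_frame(512, 700))
```

In the real pipeline these values would be polled from the Edison each frame and rendered as text and warning icons on top of the camera feed.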

This is a forward-looking project. Augmented reality is quickly becoming the de facto personal computing interface, as is easy to tell from the direction many big-name companies are moving. While AirOS definitely has a place in space and aeronautics, it's also a technology readily suited to day-to-day life.

  • Improve safety and visibility of dangerous objects; present important vital information in a digestible format (Edison w/ sensors):
      • Temperature
      • O2 levels
      • CO2 levels
      • Mission-critical data
      • Flame detection

  • Communications (AR/VR + webcam):
      • Share video of your perspective directly with others
      • See video from someone else's perspective
      • Get help as if the other person were there

  • Instant answers (Watson):
      • Uses Watson for questions and answers
      • Speech-to-text for communications

  • Gesture controls (Leap Motion): uses gestures to control the UI
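The gesture controls above amount to mapping recognized Leap Motion gesture types onto UI actions. The dispatcher below is a minimal sketch of that idea: the gesture names ("swipe", "circle", "keyTap") are the types the Leap API reports, but the action names and the `GestureDispatcher` class are hypothetical, not part of AirOS.

```python
class GestureDispatcher:
    """Map Leap Motion gesture types to HUD/UI actions (illustrative only)."""

    def __init__(self):
        # Real Leap gesture type strings; hypothetical action names.
        self.bindings = {
            "swipe": "next_panel",    # swipe a hand to cycle HUD panels
            "circle": "toggle_menu",  # draw a circle to open/close the menu
            "keyTap": "select_item",  # tap a finger downward to select
        }

    def dispatch(self, gesture_type):
        """Return the UI action bound to a gesture, or None if unbound."""
        return self.bindings.get(gesture_type)

dispatcher = GestureDispatcher()
print(dispatcher.dispatch("circle"))  # toggle_menu
```

In practice, the Leap frame loop would call `dispatch` for each recognized gesture and forward the resulting action to the HUD renderer.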

Project Information

License: MIT License (MIT)

Source Code/Project URL:


HUD 1 -
HUD 2 -
Live Demo -


  • Patrick Chamelo
  • Maria Roosiaas
  • Karl Aleksander Kongas
  • Scott Mobley
  • Marc Seitz
  • David Badley