A system to remotely navigate a spacecraft or astronaut suit over the internet using an off-the-shelf smartphone. The platform supports spacecraft control, voice command input, and heads-up display menu navigation.

This project is solving the SpaceGloVe: Spacecraft Gesture and Voice Commanding challenge.


Inputs from the smartphone: gyroscope, accelerometer, altimeter, and touchscreen display.

The navigation system uses two sets of controls: one determines the direction the spacecraft will move in x, y, and z space, and the other determines the spacecraft's rotation (yaw, pitch, roll). Each control derives its share of the six degrees of freedom from values calculated from the gyroscope and accelerometer, which establish the exact orientation of the smartphone, and uses a "gas" button to engage that control. Commands are transmitted over an IP-based network to a web server that provides the front-end interface for the space environment being navigated.
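As a rough illustration of the orientation math involved, the sketch below derives pitch and roll angles from raw accelerometer readings using the standard gravity-vector formulas. The class and method names are hypothetical, and the exact axis conventions depend on the device; this is not the project's actual implementation.

```java
// Hypothetical sketch: deriving pitch and roll from accelerometer axes.
// Axis conventions vary by device; values are in radians.
public class OrientationSketch {

    /** Pitch from accelerometer axes (phone lying flat gives ~0). */
    public static double pitch(double ax, double ay, double az) {
        return Math.atan2(-ax, Math.sqrt(ay * ay + az * az));
    }

    /** Roll from accelerometer axes (phone lying flat gives ~0). */
    public static double roll(double ay, double az) {
        return Math.atan2(ay, az);
    }

    public static void main(String[] args) {
        // Phone flat on a table: gravity falls entirely on the z axis.
        System.out.println(pitch(0, 0, 9.81)); // 0.0
        System.out.println(roll(0, 9.81));     // 0.0
    }
}
```

Yaw cannot be recovered from the accelerometer alone, which is why the gyroscope (and, on Android, the magnetometer) is combined with it to pin down the full orientation.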

The voice command system translates speech to text and can be used to issue control commands without selecting options on the menu.
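One simple way such a voice layer can work is to normalize the recognized text and look it up in a phrase-to-command table. The sketch below assumes hypothetical phrases and command names; the project's real vocabulary and dispatch logic may differ.

```java
import java.util.Locale;
import java.util.Map;

// Hypothetical mapping from recognized speech to navigation commands.
// Phrases and command identifiers here are illustrative only.
public class VoiceCommandSketch {

    private static final Map<String, String> COMMANDS = Map.of(
            "pitch up", "ROTATE_PITCH_POS",
            "pitch down", "ROTATE_PITCH_NEG",
            "roll left", "ROTATE_ROLL_NEG",
            "roll right", "ROTATE_ROLL_POS",
            "full stop", "STOP");

    /** Returns the command for a recognized phrase, or null if unknown. */
    public static String parse(String speech) {
        return COMMANDS.get(speech.trim().toLowerCase(Locale.ROOT));
    }

    public static void main(String[] args) {
        System.out.println(parse("Full Stop")); // STOP
    }
}
```

Normalizing case and whitespace before the lookup keeps the table small while tolerating variation in how the speech-to-text layer formats its results.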

Technology:

  • FireBase - data transport from the app over the internet to the web server
  • ThreeJS - 3D interface
  • Google Now - voice-to-text translation
  • Java - Android application and navigation logic
  • CSS, HTML, Bootstrap - web front-end interface

Project Information

License: MIT License (MIT)

Source Code/Project URL:


Deployed site: -
Web Interface -
Functional Requirements Document -


  • Oleg Pakhnyuk
  • Joey Smith
  • Zachary Verbeck