SpaceGloVe: Spacecraft Gesture and Voice Commanding

2015 Challenge

  • Hashtags:

    #humans, #spaceglove, #intermediate


    [email protected]


    Platform, Hardware


    The command interface to our computers has moved away from the keyboard and mouse to touchscreens, gesture control, and voice activation. These advances are now part of commercial consumer technology and could be applied to a manned space mission to improve the human-machine interface.

    Existing smartphones and other wearable devices contain accelerometers and other positioning sensors that, combined with gesture recognition, could serve as a command interface. Wearing such a device would also give the user access to its built-in functions and applications. For example, applications could be split or shared between the main computer and the wearable device, allowing them to remain mobile as the user moves around the spacecraft.


    This challenge seeks to change the way crew members and operators interact with computer systems using gestures and voice control. The user wears a number of devices on their person, e.g. one on each wrist, which allow them to control an application running on a linked computer (e.g. connected via wireless technology).

    We’d like you to produce a new technique for controlling a computer application via one or more wearable devices (e.g. smartphone) that wirelessly connects to the main computer and responds to gestures and voice commands. Create unique ways for data and applications to be swiped/transferred between the wearable devices and the main computer. Arm and hand gestures can be used as a control interface to the main computer replacing traditional mouse or touch interaction. In addition, voice commands can be used as an input. The two should be able to work in concert with each other, or independently, should one input mode fail.


    • Consider using a smartphone as the baseline wearable technology, strapped to the user’s wrist(s).
    • The team can develop an application that obtains gesture-based data from the device’s accelerometer.
    • You can pair the device to the main computer via wireless technology (e.g. Wi-Fi) and transfer the gesture data to the computer.
    • The main computer will need a dedicated application, written by the team, that takes this data and controls the command interface (e.g. the mouse focus or touch screen).
    • A secondary command source can be voice control: speech is registered by the smartphone, processed, and the resulting instructions are sent to the main computer.
    • Identify early in your project which command gestures and voice instructions you want to support.
    • Applications can be shared between the wearable device and the main computer.
    • You can develop a demonstrator application to run on the host computer, controlled by the gesture- and voice-activated wearable devices, in order to demonstrate the feasibility of this new control interface. For example, the demonstrator could be an Earth model that lets the user pan, zoom, rotate, and search for locations, with additional support for internet browsing controlled by gesture and voice.
    • The gestures that trigger command responses can include arm movements left/right, up/down, forward/backward, and rotation. If a device is worn on both wrists, combined gestures could be used for additional commands (e.g. moving the devices together could create a ‘zoom’ response, and moving them apart an ‘unzoom’).
    • The voice statements that trigger command responses can include “select”, “de-select” (or similar words), “zoom”, “unzoom”, “forward”, and “back”.
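    As a minimal illustration of the gesture-detection step described above, the sketch below classifies a short window of accelerometer samples into one of the linear gestures by finding the dominant axis of average acceleration. The threshold value, the axis-to-gesture mapping, and the function name are illustrative assumptions, not part of any particular device API.

```python
# Sketch: window-based gesture classification from wearable accelerometer
# samples. THRESHOLD and the axis-to-gesture mapping are assumptions chosen
# for illustration; a real wearable app would tune them experimentally.
from statistics import mean

THRESHOLD = 3.0  # assumed minimum average acceleration (m/s^2) for a deliberate gesture

GESTURES = {
    ("x", +1): "right",   ("x", -1): "left",
    ("y", +1): "up",      ("y", -1): "down",
    ("z", +1): "forward", ("z", -1): "back",
}

def classify(samples):
    """Classify a window of (ax, ay, az) accelerometer tuples into a gesture name."""
    # Average each axis over the window to suppress sensor jitter.
    avg = [mean(s[i] for s in samples) for i in range(3)]
    # The axis with the largest magnitude decides the gesture direction.
    axis = max(range(3), key=lambda i: abs(avg[i]))
    if abs(avg[axis]) < THRESHOLD:
        return None  # movement too small: no deliberate gesture
    return GESTURES[("xyz"[axis], 1 if avg[axis] > 0 else -1)]
```

    For example, `classify([(5.0, 0.1, 0.0)] * 10)` yields `"right"`. On the computer side, a paired application would receive these gesture names over the wireless link and translate them into interface actions (e.g. panning the Earth model).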

    Sample Resources (Participants do not have to use these resources, and NASA in no way endorses any particular entity listed).

  • The following projects are solving this challenge:

    • i5

      Control the keyboard, mouse, and power functionality of your computer, including typing with one hand. Visit Project

    • SpaceGes

      This project will use gestures from a mobile phone or wearable to control a web application. Visit Project

    • MKV Hand's

      Space Glove: Spacecraft Gesture and Voice Commanding Project Review Interacting with computer systems while wearing spacesuit uniforms is the issue that our team chose to address. Through our innovative ideas we will ease astronauts’ access methods and their way of controlling applicatio... Visit Project

    • El guantaco

      Going for the space glove challenge. Visit Project

    • DSpace - Draw in Space

    • SpaceGlove

      SpaceGlove is for astronauts, making it easy for them to control the spacecraft. It has voice commands and gesture commands. Visit Project

    • Like A Glove

      Using a Bluetooth connection, we managed to connect an Android phone to a personal computer and transmit real-time sensor data. Specifically, we used accelerometer data to derive gestures made by the user and determine what effect they would have on our computer system. Our main mode of oper... Visit Project

    • VoidGlove

      We aim to gain as much experience as possible by developing this smart glove in just 2 days. We believe that this is a chance to impact the whole world in a positive way, by participating in shaping the future of space exploration. We will develop the glove, and work in R&D for some time to make i... Visit Project

    • SMIXY

      A glove that helps astronauts quickly analyze a planet's surface. Visit Project

    • Christopher

      New generation space glove. Visit Project


      We will develop a project to help and improve the way crew members and operators interact with computer systems using gestures and voice control. We will allow users to control a running application via wireless and Bluetooth technology using Java code; we also provide domotic (home-automation) options. Visit Project

    • Jupiter Glove

      This project focuses on creating an efficient voice command interface but may also include basic motion commands. To achieve an efficient voice command interface we want to divide the interface into two parts: 1. The voice capture module, which can be on the user’s arm and can use an option... Visit Project



    • Hermes

      The project is based on software that allows control of external systems, such as a capsule or space station, by voice and gestures in the air, thanks to a client-server system. Orders can be given in totally natural speech. In turn, the device will constantly listen for possible orders, and the ... Visit Project

    • Armstrong Glove

      For this challenge we have developed and prototyped a glove that integrates different sensors to recognize gestures and control a generic application. A 10-DOF IMU module measures all the movements of the hand in space, with a 3-axis accelerometer, a 3-axis gyroscope, a 3-axis magnetometer... Visit Project

    • Spacesuitrobot

      It’s a kind of robotic skeleton that will be worn by the astronaut under the spacesuit, built with a 3D printer. It will be linked to the astronaut’s brain so that it can easily detect brain waves and describe the astronaut’s situation, for example temperature, tension… if it detects a weird ... Visit Project

    • Human Gesture Interface Commands

      Capture movements and hand gestures and convert them into actions that the computer recognizes as commands, in the following steps: - Data capture. - Data processing. - Data interpretation. # Data capture Create a circuit that captures changes in the position of the hand and finger movement. # Da... Visit Project

    • Move

      It’s a webpage that can be accessed from a mobile phone; on this page you will find a ball that you can move, which acts like the mouse on your computer. You can also send voice commands in Spanish: “clic” (click), “doble clic” (double click), “clic derecho” (right click), “clic arriba” (click up), “clic abajo” (click down), “escribir ….” (type ….), “tecla ... Visit Project

    • SAGA : Speech And Gesture Assistant

      SAGA is a new innovation for using speech recognition and gesture commands in space with the help of a smartphone that contains the necessary hardware, such as an accelerometer, and runs on any Android operating system. SAGA features two modes: 1. Gesture Mode 2. Speech Mode An astronaut on h... Visit Project

    • Team Bodge! (Wireless Gloves)

      We believe the astronaut needs a clear and direct tool or assistant to help them, and a glove could be a natural choice. The design is a glove with 5 flex sensors, two gyroscopes (Gyro), two accelerometers (Acc), a heart-beat sensor, a humidity sensor and an LCD interface. The hardware controller... Visit Project

    • Stella Motus

      The command interface to our computers has moved away from keyboards and mice to touchscreens and hand gestures. These advances are now accessible and could be used commercially, promoting an advancement in consumer technology. The main objective of our project is si... Visit Project

    • SML Glove

      SML Glove ========= * * * Overview The SML Glove is designed to allow astronauts to control remote robotic devices or virtual actions. The device contains two pieces: the actual glove, including sensors, a microcontroller, and the glove fabric; and an Android app which communicates wit... Visit Project

    • Space-Craft

      We want to develop an interface that can be connected to and controlled by hand movements and voice. The signals will be captured through a Kinect and then sent to a computer, which will then process the movement actions. The development is based in... Visit Project

    • The Future

      This project aims to shed light onto the future of user interface. The Future renders the use of the mouse and keyboard useless. The user has complete control through hand motions and voice recognition. This project was inspired mainly by all the futuristic Hollywood movies which often portray... Visit Project

    • rrc

      # Description Telemanipulators on the ISS are very important and versatile robots. They are used in repair operations and reduce the risk to humans from space walks. One of the telemanipulators is Dextre - the special purpose dexterous manipulator (SPDM). It is currently controlled by joy stic... Visit Project

    • NumaSpace

      The Project aims to show how Voice Activation, Natural Language and Gestures can be used to build an interactive app. We have targeted the following features for the demonstration *Use of Gesture to show Right/Left/Up/Down/Zoom In/Zoom Out movements *Use voice command to activate/change Vital... Visit Project

    • Spacecraft Automation using gesture and voice recognition

      In the space environment, where communicating with your computer is not as easy as in an earthly environment, astronauts find it difficult to control various things through a computer. There is a need for a new technique that will help astronauts communicate with the spacecraft’s computer. This important thing ... Visit Project

    • Spock

      **Current challenges and limitations:** * The spacesuit and spaceglove are relatively very rigid and make normal human-machine interaction difficult. * Analog controls and touchscreens are the most... Visit Project

    • Midas

      #It's time for a revolution Gesture-based computing is not a new technology. Current systems such as the Microsoft Kinect and Leap Motion technology have issues. F... Visit Project

    • iGLOVEu

      The aim of this project is to support NASA’s goals of human exploration of Mars and asteroid mining. We aim to progress this goal through the development of a user-friendly GUI to remotely control a robotic exploratory unit and wirelessly communicate with a range of measurement in... Visit Project

    • SpaceyConnect

      We saw this as a great opportunity to create an open standard for the interconnection of devices, which places emphasis on extensibility, and portability. However, we underestimated the amount of work necessary to make this happen and failed to achieve our objectives in the required time. We plan... Visit Project

    • spacehack

      Project Review Since the introduction of the keyboard and the mouse there has been little progress in developing new ways to enable human-computer interaction and the control of generic software applications. Our main project goal was to develop a new way of interacting with any softwar... Visit Project

    • Space Glove

      This project would be relatively easy to bring to life, though it has its difficulties. The difficulties lie more in the aspect of time than in any specific problem, which will be discussed in more detail at the end of the documentation. The realization will be easy. Space would not be pro... Visit Project

    • EnterAdata

      For years, thanks to advances in technology, a lot of software has been created to help humanity, but these creations have problems; people need technology that is fast and easy to use every day. Our app does not only help space travel: who hasn’t had trouble looking around the PC for a special app... Visit Project



    • Space Glov

      We created a software for controlling a computer application via one or more wearable devices (e.g. smartphone) that wirelessly connects to the main computer and responds to gestures and voice commands. Arm and hand gestures can be used as a control interface to the main computer replacing tradit... Visit Project


      - “Genesis” Project presents an incredible experience in space! Be free, realistically! This is a 3D game of flying between planets. Players ride on the solar sail “IKAROS” (by JAXA: Japan Aerospace eXploration Agency) and travel from the Earth on the solar wind. This is no... Visit Project

    • S.H.M.I Out of hands

      SHMI demonstrates that we can create new ways to interact with the laboratory and computer equipment on the ISS. This new HMI has to be reliable and intuitive. Nowadays the astronauts rely on direct contact with the controls, and it is sometimes difficult to interact with them, especially if both of ... Visit Project

    • Hendrix

      This project aims to provide a simple interaction platform for users to send input to an application running on a desktop or any other device over a wireless connection. Visit Project

    • Interact

      The system has 2 main parts: - The mobile side: an Android application that detects hand and voice gestures. For gestures, we examined the behavior of the accelerometer and designed some basic rules to detect linear gestures (right, left, up, down, up-right, up-left, down-right, down-left). ... Visit Project

    • HI! - Hand Interface

      We want to make interactions easier, whether for work in space or on Earth, or for entertainment. For an easier solution we need an intuitive way to communicate with a computer. Our hands, used for most of our common activities, can be our tool to work, communicate and do everything that we wan... Visit Project

    • Glove Above

      The very first feature we lay emphasis on is recreating the experience of a virtual musical instrument. It makes use of piezoelectric crystals as input sensors to create the experience of playing an instrument and produce distinct tunes with every beat played by the user. This is pres... Visit Project

    • Give Me A Hand

      Give me a hand aims to be not just a glove, but a complete body system that will interact with the computer of the suit, the computer on the space station, a land rover, or even an exoskeleton. With the glove as our first step and prototype we use an Arduino UNO R3, an accelerometer and some c... Visit Project

    • SpaceBumGlove

      Inputs from smartphone: Gyroscope, Accelerometer, Altimeter, and Touchscreen display. The navigation system uses two sets of controls: determine direction of where spacecraft will move in x, y, and z space, and the rotation of the spacecraft (yaw, pitch, roll). Each of the controls determin... Visit Project

    • Energy Gauntlet

      We have used our hands as the primary method of interacting with the world since the dawn of man. Why then do we work through middlemen such as keyboards, mice, and controllers to interact with our devices? Energy Gauntlet is a language agnostic platform for interacting with devices using only ... Visit Project

    • FirstHand

      Youtube Video: One of the many challenges that NASA astronauts face out in the vacuum of space is the drastic atmosphere change between what’s out there and what we need to survive. Thankfully we’ve done well with what we have to protect our astronauts, but ther... Visit Project

    • Valkyrie

      Video link - <> Visit Project

    • M GloVe

      M GloVe M may be master, may be magnificent, may be motion, or may be Mudassar, which is my name ;-) M Glove is solving the Space Glove challenge. It has two parts: an Android app, which uses gyro sensor readings for remote mouse control on a PC and proximity sensor readings for click events on ... Visit Project

  • Welcome to the collaborative hackpad! You can use this open document to collaborate with others, self organize, or share important data. Please keep in mind that this document is community created and any views, opinions, or links do not reflect an official position of the Space Apps Challenge, NASA, or any of our partners.

    Building a team or looking for one to join? Feel free to create a Matchmaking section at the bottom of the document to help in gathering great minds together!


    SpaceGloVe: Spacecraft Gesture and Voice Commanding Hackpad: