The main goal of ExoBot is to assist with the use of NASA's Robonaut and other robotic assistants, whether already deployed or still in development. Through sleeves with embedded sensors, voice commands, and a virtual reality display, astronauts and technicians alike will be able to manipulate robots to safely correct situations that could otherwise result in loss of crew, loss of mission, or loss of vehicle.

This project is solving the Space Wearables: Designing for Today’s Launch & Research Stars challenge.


The ExoBot project aims to make space and ground operations safer through remote control of NASA robots like Robonaut. Robots excel at repetitive tasks and precise movement, but they are neither critical thinkers nor problem solvers; in that area, humans may never be surpassed. Remote control allows the robot to benefit from the problem-solving skills and complex-task reasoning of a human operator while still performing tasks with the precision only robots are capable of.

To achieve precise movement and control, a reliable method of tracking human motion is needed. To keep the scope manageable for Space Apps, this project focuses on simulating a single joint; once one joint works, it provides a solid foundation for the rest of the control layout. A smartphone's accelerometer and gyroscope are used to track motion and simulate the expected input from the suit's sensors. An Android app was written that reads the accelerometer and gyroscope data and displays it on screen. This provides the groundwork for future interpretation into robot-executable commands by making all of the data available in one place.
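As an illustration of how raw accelerometer and gyroscope readings might eventually be fused into a single joint angle, here is a minimal complementary-filter sketch in Python. The function name, axis conventions, and filter constant are illustrative assumptions, not part of the project's Android code:

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into one joint-angle estimate.

    angle_prev : previous angle estimate (degrees)
    gyro_rate  : angular velocity about the joint axis (degrees/second)
    accel_x/z  : accelerometer components in the joint's rotation plane
                 (any consistent unit; only their ratio matters)
    dt         : time since the last sample (seconds)
    alpha      : weight given to the gyro integral vs. the accel reference
    """
    # Integrate the gyro for smooth short-term tracking ...
    gyro_angle = angle_prev + gyro_rate * dt
    # ... and use gravity's direction as a drift-free long-term reference.
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

The gyroscope alone drifts over time and the accelerometer alone is noisy; blending the two is a common way to get a stable angle estimate from exactly the sensors the app already reads.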

A gyroscope and accelerometer located at each arm joint will capture the user's motion data, which will then be translated to drive the corresponding joint on the robot. Additionally, flex sensors in the fingers of the glove will give the user a full range of finger motion that maps onto Robonaut's fingers. Five proximity sensors on the user will calibrate the suit: one in the center of the chest to measure arm span and match the user's range of motion to Robonaut's, one on each shoulder to estimate the reach of a fully extended arm, and one at each palm or wrist. Finally, to keep costs down, the model will use a lightweight material such as spandex with small sensors and wires embedded in the fabric. This will allow the suit to be implemented without adding significant weight or consuming excessive usable space.
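The calibration described above amounts to remapping the user's measured range of motion onto the robot's joint limits. A minimal sketch of that linear remapping, with hypothetical function and parameter names (the actual calibration procedure and joint limits are assumptions):

```python
def map_to_robot(user_angle, user_min, user_max, robot_min, robot_max):
    """Map a calibrated human joint angle onto the robot's joint range.

    user_min/user_max come from the proximity-sensor calibration step;
    robot_min/robot_max are the robot joint's mechanical limits (degrees).
    """
    # Clamp to the calibrated human range so a sensor glitch cannot
    # command the robot past its limits.
    user_angle = max(user_min, min(user_max, user_angle))
    # Linearly rescale the human range onto the robot range.
    fraction = (user_angle - user_min) / (user_max - user_min)
    return robot_min + fraction * (robot_max - robot_min)
```

For example, if calibration measures a user elbow range of 0–90 degrees and the robot's elbow sweeps 0–180 degrees, a user angle of 45 degrees would map to 90 degrees on the robot.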

In the future, the ExoBot project will let the operator see what the robot sees through a virtual reality headset, which provides realistic depth perception. Lack of depth perception is one of the largest challenges operators face today: operations requiring extreme precision become nearly impossible when the ability to judge distance is removed. In addition to the standard view, the user will be able to switch to other cameras on the robot for different, more local views, allowing further precision through zoomed perspectives.

Although the main control interface will be gloves and sleeves, robots with physical capabilities beyond a human's, such as the ability to rotate a joint through 360 degrees, can be fully controlled with voice commands. This allows many types of robots to be controlled without losing functions that a motion-controlled interface alone cannot express.
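One simple way such voice commands could be turned into joint commands is pattern matching on the recognized transcript. A hypothetical sketch (the command grammar, function name, and joint names are all assumptions, not a specified interface):

```python
import re

def parse_voice_command(transcript):
    """Parse an utterance like 'rotate wrist 270 degrees' into a
    (joint, degrees) tuple, or return None if it does not match."""
    match = re.match(r"rotate (\w+) (-?\d+) degrees?", transcript.strip().lower())
    if match is None:
        return None
    return match.group(1), int(match.group(2))
```

In practice a speech-recognition engine would supply the transcript, and unmatched or ambiguous commands would be rejected rather than sent to the robot.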

If implemented, the ExoBot project can make working in space, around launch vehicles, and in other hazardous areas safer and more efficient. By joining the best characteristics of humans and robots, it could raise standards of safety and efficiency in fields from medicine to aerospace. Virtual reality, motion tracking, and voice commands will all be combined to produce an accurate, efficient device useful anywhere a human's safety could be compromised. ExoBot will help make everyone safer and more productive; it doesn't have to be a trade-off.

Project Information

License: MIT license (MIT)

Source Code/Project URL:



  • Alex Bassett
  • Dylan Gramza
  • Jonathan Rach
  • Benjamin Burdick
  • Caeley Looney