shmi-out-of-hands

The Human-Machine Interface (HMI) still relies entirely on physical contact, even with modern touch screens. In a constrained environment where you work alone and your hands are busy, a little help is always welcome; but how? It is time to complement the current HMI with Internet of Things technology, extending it from touch to voice and gesture recognition, helping astronauts do their jobs and generating data for researchers on Earth.

This project is solving the SpaceGloVe: Spacecraft Gesture and Voice Commanding challenge.

Description

SHMI demonstrates that we can create new ways to interact with the laboratory and computer equipment on the ISS. This new HMI has to be reliable and intuitive. Today astronauts rely on direct contact with the controls, and interacting with them can be difficult, especially when both hands are busy. We want to apply RealSense (TM) technology, together with powerful microcomputers and MCUs, to offer another way of interacting with equipment in general. In our 30-hour prototype we developed a solution that controls a robotic arm with voice commands and, literally, no hands: head gestures are captured and transmitted over a wireless link, helping to close the gap between human and machine communication. A minimal sketch of this control idea appears below.
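
The following Python sketch illustrates the general idea of the control path described above: recognized voice commands or head-gesture events are mapped to simple robotic-arm servo commands and sent over a wireless link exposed as a serial port. It is only an illustrative sketch; the command map, the "servo:angle" packet format, the port name, and the pyserial transport are assumptions for this example and are not taken from the project's repository.

# Minimal sketch: map recognized voice commands and head-gesture events
# to servo positions on a robotic arm reached over a wireless serial link
# (e.g., a Bluetooth or XBee module exposed as a serial port).
# All names, ports, and the packet format here are illustrative assumptions.

import serial  # pyserial

# Illustrative mapping from recognized inputs to (servo_id, angle) pairs.
COMMAND_MAP = {
    "open gripper":  (4, 90),
    "close gripper": (4, 0),
    "head left":     (0, 45),   # base servo follows head yaw
    "head right":    (0, 135),
    "head up":       (1, 120),  # shoulder servo follows head pitch
    "head down":     (1, 60),
}


def send_arm_command(link: serial.Serial, command: str) -> bool:
    """Translate a recognized command into a simple 'servo:angle' packet."""
    target = COMMAND_MAP.get(command)
    if target is None:
        return False
    servo_id, angle = target
    packet = f"{servo_id}:{angle}\n".encode("ascii")
    link.write(packet)
    return True


if __name__ == "__main__":
    # The wireless module appears as a serial device on the microcomputer;
    # an MCU on the arm would parse 'servo:angle' lines and drive the servos.
    with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1) as link:
        # In a real prototype these strings would come from the voice and
        # gesture recognizers; here they are hard-coded for illustration.
        for event in ("head left", "open gripper", "head down", "close gripper"):
            if not send_arm_command(link, event):
                print(f"unknown command: {event}")

In this arrangement the microcomputer does the heavy recognition work and the MCU only executes simple motion packets, which keeps the wireless protocol small and easy to debug.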


Project Information


License: MIT license (MIT)


Source Code/Project URL: https://github.com/jgtres/SHMI_out_of_hands


Resources


YouTube video (1 min) - https://youtu.be/FQDoO5Yk_Yg

Team

  • Jeannette Induni
  • Jeiner Alvarado Fonseca
  • Fernando Herrera
  • Jonathan Guevara
