Interact

A smartphone application that controls a 3D model viewer and an internet browser with hand gestures and voice commands over a wireless connection.

This project is solving the SpaceGloVe: Spacecraft Gesture and Voice Commanding challenge.

Description

The system has two main parts:

- The mobile side: an Android application that detects hand gestures and voice commands. For gestures, we examined the behavior of the accelerometer and designed some basic rules to detect eight linear gestures (right, left, up, down, up-right, up-left, down-right, down-left); a sketch of this rule-based detection follows the list. For voice, we used Google ASR (see the second sketch below).
- The PC side: we applied our model to internet browsing and 3D model viewing. For internet browsing, we built a C#.NET application around an Internet Explorer plugin. It can search Google and Google Maps by voice and scroll up and down by gesture. For 3D model viewing, we built a WebGL application in JavaScript that can load any .obj or .stl model. The user interacts with the model using the linear gestures mentioned above and zooms in and out by voice.
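
The gesture rules aren't spelled out beyond the eight directions, so the following is a minimal sketch of the idea, assuming a simple threshold test on the axes of Android's linear-acceleration sensor. The THRESHOLD value, the LinearGestureDetector and Listener names, and the axis-to-direction mapping are illustrative assumptions, not the exact rules in the repo.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Threshold-based detector for the eight linear gestures
// (right, left, up, down, and the four diagonals).
public class LinearGestureDetector implements SensorEventListener {

    private static final float THRESHOLD = 4.0f; // m/s^2; illustrative, tune per device

    public interface Listener {
        void onGesture(String gesture);
    }

    private final Listener listener;

    public LinearGestureDetector(Listener listener) {
        this.listener = listener;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;

        // Android sensor coordinates: x points right, y points toward
        // the top of the screen. Signs may need flipping per orientation.
        float x = event.values[0];
        float y = event.values[1];

        String horizontal = x > THRESHOLD ? "right" : (x < -THRESHOLD ? "left" : "");
        String vertical   = y > THRESHOLD ? "up"    : (y < -THRESHOLD ? "down" : "");

        if (!horizontal.isEmpty() && !vertical.isEmpty()) {
            listener.onGesture(vertical + "-" + horizontal); // e.g. "up-right"
        } else if (!horizontal.isEmpty()) {
            listener.onGesture(horizontal);
        } else if (!vertical.isEmpty()) {
            listener.onGesture(vertical);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }
}
```

To feed it events, register the detector with SensorManager.registerListener(detector, sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION), SensorManager.SENSOR_DELAY_GAME); in practice a detected gesture would also need debouncing so a single flick does not fire repeatedly.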
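
On the voice side, Google ASR is most commonly reached on Android through the platform's stock RecognizerIntent speech recognizer. Below is a minimal sketch of that route; the VoiceCommandHelper name and request code are hypothetical, and the repo may invoke recognition differently.

```java
import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

// Launches Google's speech recognizer and reads back the best transcription.
public class VoiceCommandHelper {
    private static final int REQUEST_SPEECH = 1001; // arbitrary request code

    // Fire the system recognizer UI; results arrive in onActivityResult.
    public static void startListening(Activity activity) {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Say a command");
        activity.startActivityForResult(intent, REQUEST_SPEECH);
    }

    // Call from Activity.onActivityResult to extract the top recognition result.
    public static String extractCommand(int requestCode, int resultCode, Intent data) {
        if (requestCode != REQUEST_SPEECH || resultCode != Activity.RESULT_OK
                || data == null) {
            return null;
        }
        ArrayList<String> results =
                data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
        return (results == null || results.isEmpty()) ? null : results.get(0);
    }
}
```

The recognized string would then be sent over the wireless link to the PC side to drive searches and zoom commands.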


Project Information


License: Educational Community License, Version 2.0 (ECL-2.0)


Source Code/Project URL: https://github.com/qinglong-nasa/Interact


Resources


Slideshow Presentation - https://github.com/qinglong-nasa/Interact/blob/master/QingLong.pptx

Team

  • Abdelrahman Hassan Mohamed
  • Eslam Hashem
  • Ahmad Mustafa
  • Karim Amer
  • Andrew Khalel

