This project is solving the SpaceGloVe: Spacecraft Gesture and Voice Commanding challenge.

Description
Since the introduction of the keyboard and the mouse, there has been little progress in developing new ways for humans to interact with computers and control generic software applications. Our main project goal was to develop a new way of interacting with any software application, controlling it without the need for a keyboard or mouse.
Speech is the most natural form of human communication, and as such it is the cornerstone of our project: voice commands are captured on a smartphone or any other Android-based mobile device and sent to a remote computer over a Bluetooth communication channel to execute a wide range of actions, from simple orders such as launching an application to complex tasks such as performing a web search.
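The receiving side of this pipeline boils down to mapping recognized speech text to an action on the computer. A minimal sketch of that dispatch step is below; the command phrases and actions are hypothetical examples, not the project's actual vocabulary:

```java
import java.util.HashMap;
import java.util.Map;

// Maps recognized speech phrases (e.g. text produced by Android's speech
// recognition and received over Bluetooth) to actions on the computer.
public class VoiceCommandDispatcher {
    private final Map<String, Runnable> commands = new HashMap<>();

    // Register a spoken phrase and the action it should trigger.
    public void register(String phrase, Runnable action) {
        commands.put(phrase.toLowerCase().trim(), action);
    }

    // Returns true if the recognized text matched a registered command.
    public boolean dispatch(String recognizedText) {
        Runnable action = commands.get(recognizedText.toLowerCase().trim());
        if (action == null) return false;
        action.run();
        return true;
    }
}
```

Normalizing case and whitespace keeps the matching tolerant of small variations in the recognizer's output; fuzzier matching could be layered on top of the same interface.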
The base mode of operation also uses data from the mobile device's sensors (accelerometer, gyroscope, and magnetometer) to control the computer's mouse pointer, using a technique known as sensor fusion. Another interesting feature of our project is that it "hacks" the original functionality of the audio jack on the mobile device to trigger "click" and "double-click" actions on the remote computer using the "media key" of a standard headset.
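The core idea of sensor fusion here is to combine the gyroscope, which is responsive but drifts over time, with the accelerometer/magnetometer, which is noisy but drift-free. A common way to do this is a complementary filter; the single-axis sketch below illustrates the principle under assumed parameters (the weight 0.98 is illustrative), while the full 3D version is in the FusedGyroscopeSensor source linked at the end:

```java
// Complementary filter: fuse a gyroscope rate (fast but drifting) with an
// absolute angle estimate from the accelerometer/magnetometer (noisy but
// drift-free). Single axis only, for illustration.
public class ComplementaryFilter {
    private final float alpha; // gyro weight, close to 1 (assumed value)
    private float angle;       // fused angle estimate, radians

    public ComplementaryFilter(float alpha) {
        this.alpha = alpha;
    }

    // gyroRate:   angular velocity from the gyroscope (rad/s)
    // accelAngle: absolute angle from accelerometer + magnetometer (rad)
    // dt:         time since the previous sample (s)
    public float update(float gyroRate, float accelAngle, float dt) {
        // Integrate the gyro for short-term accuracy, then pull the estimate
        // toward the drift-free angle for long-term stability.
        angle = alpha * (angle + gyroRate * dt) + (1f - alpha) * accelAngle;
        return angle;
    }
}
```

With the gyro weighted at 0.98, fast rotations come through almost instantly from the gyroscope, while the small 0.02 correction continuously cancels gyro drift.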
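The media-key trick reduces to timing between headset button presses: one press within a quiet period maps to a click, two presses close together to a double-click. A sketch of that timing logic follows; the 300 ms window is an assumption, not the project's actual value:

```java
// Turns headset media-key presses into click / double-click events based on
// the time between consecutive presses.
public class MediaKeyClicks {
    public enum Click { SINGLE, DOUBLE }

    private static final long DOUBLE_PRESS_WINDOW_MS = 300; // assumed threshold
    private long lastPressMs = -1;

    // Called once per media-key press with its timestamp in milliseconds.
    public Click onPress(long nowMs) {
        boolean isDouble =
                lastPressMs >= 0 && nowMs - lastPressMs <= DOUBLE_PRESS_WINDOW_MS;
        lastPressMs = nowMs;
        return isDouble ? Click.DOUBLE : Click.SINGLE;
    }
}
```

Note this simple version reports the single click immediately and then upgrades to a double; a production implementation would typically defer the single-click decision until the window expires.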
With our project we also hope to enable people with physical disabilities to interact with computers more easily, as the voice commands and gestures supported by the system provide the flexibility required to adapt it to each user's needs.
License: MIT license (MIT)
Source Code/Project URL: https://github.com/ortega-io/SpaceHack
Project Video - https://www.youtube.com/watch?v=22HrGtqyuwQ
About Sensor Fusion - http://en.wikipedia.org/wiki/Sensor_fusion
Sensor Fusion in Android - https://github.com/KEOpenSource/GyroscopeExplorer/blob/master/GyroscopeExplorer/src/com/kircherelectronics/gyroscopeexplorer/sensor/FusedGyroscopeSensor.java