NumaSpace
This project is solving the SpaceGloVe: Spacecraft Gesture and Voice Commanding challenge.
Description
The project aims to show how voice activation, natural language and gestures can be used to build an interactive app. We targeted the following features for the demonstration:
- Use of gestures for Right/Left/Up/Down/Zoom In/Zoom Out movements
- Use of voice commands to activate/change vital stats in a spaceship
- Use of an Android mobile app to control a web application using WebSockets
Do check out the detailed demo on YouTube if you like what we have done here.
NumaSpace Submission Detailed Demo on YouTube
Description of the Various Parts of the Portal
The primary portal of the project is the web portal we built using Drupal. It has controls which are activated over a WebSocket-based connection to the same server. The components are:
- Viewport area where the Map, Spaceship and Weather details appear
- Controls - Move Left, Move Right, Zoom In, Zoom Out, Move Up, Move Down
- Graphs - Temperature, Pressure, Oxygen Levels, Speed
- Time Clock
- Real-time chat window with the backend, which does natural language processing
- A YouTube video
Here is how it looks
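The controls above are driven over the portal's WebSocket connection. A minimal sketch of that idea follows; the message format, field names and `buildControlMessage`/`sendControl` helpers are hypothetical illustrations, not the project's actual protocol:

```javascript
// Sketch of sending a control action (e.g. triggered by a phone gesture)
// to the portal over a WebSocket. The JSON shape here is an assumption.
function buildControlMessage(action) {
  const allowed = ["moveLeft", "moveRight", "moveUp", "moveDown", "zoomIn", "zoomOut"];
  if (!allowed.includes(action)) {
    throw new Error("Unknown control action: " + action);
  }
  // Serialize the action with a timestamp so the server can order events.
  return JSON.stringify({ type: "control", action: action, ts: Date.now() });
}

// Sending side (e.g. the mobile app): push the message down an open socket.
function sendControl(socket, action) {
  socket.send(buildControlMessage(action));
}
```

On the portal side, a message handler would parse the same JSON and dispatch to the matching viewport control.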
When you choose the Map option, the WebGL map comes in. We have added features so you can control the globe using gestures. A natural language search has also been added - you can say "Show me the Statue of Liberty" and it shows New York.
Here is how it looks
The next part, which we loved working on, was managing the spacecraft and its controls. We picked up a wireframe of a spacecraft and divided it into Room, Wing, etc. We also provided controls for Temperature, Pressure, Oxygen Levels and Speed. Some of the commands that work here are:
- Switch on the lights for room 1B
- Switch off all lights
- What is the current Oxygen level
- At what temperature is the aircraft in
- set the pressure 80
- Make Oxygen levels 200
- Start Ignition (or just say FIRE UP). You should try this: the aircraft starts moving at a speed of 100, and you can say "set speed 700" to make it move faster
- Stop Ignition
- Danger
- Clear
Here is how it looks
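The actual natural language processing happens on the backend, but the commands above can be sketched with a simple keyword/regex parser. This is only an illustration of the idea; the function name, regexes and result shape are all assumptions, not the project's implementation:

```javascript
// Minimal sketch: map voice-command text to a structured action.
function parseCommand(text) {
  const t = text.toLowerCase().trim();
  let m;
  // "set the pressure 80", "Make Oxygen levels 200", "set speed 700"
  if ((m = t.match(/(?:set|make)\s+(?:the\s+)?(speed|pressure|oxygen)\s*(?:levels?\s*)?(\d+)/))) {
    return { action: "set", target: m[1], value: Number(m[2]) };
  }
  // "Switch on the lights for room 1B", "Switch off all lights"
  if ((m = t.match(/switch\s+(on|off)\s+(?:all\s+lights|the\s+lights\s+for\s+room\s+(\w+))/))) {
    return { action: "lights", state: m[1], room: m[2] || "all" };
  }
  if (t.includes("start ignition") || t.includes("fire up")) {
    return { action: "ignition", state: "on" };
  }
  if (t.includes("stop ignition")) {
    return { action: "ignition", state: "off" };
  }
  return { action: "unknown", text: text };
}
```

A real NLP backend handles far more phrasings; this sketch only covers the literal commands listed above.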
This part has an integration with a Weather API. You can say "Show me the weather for Chicago".
Description of the Mobile App
The mobile app has all the controls available on the portal, i.e. Move left/right/up/down/zoom in/zoom out. It was built using Apache Cordova. Additionally, we added the ability to accept VOICE COMMANDS using the Google voice recognition available on Android phones.
For gesture management, we used the accelerometer to fetch the coordinates on the X, Y and Z axes. The most difficult part of the problem was getting rid of the noise and reducing cross-interference. We used several filters, state machines and a latching mechanism to achieve this. See the video link below for a detailed view of how the app worked.
Here is how the phone app looked
While the gesture recognition did work, it was very jerky. The noise and interference, especially from electromagnetic devices around us, made it not as pleasant to use as it appears. We tied the phone to a leather glove (see below).
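The filter-plus-latch approach described above can be sketched as follows, assuming the accelerometer delivers raw `{x, y, z}` samples (as Cordova's accelerometer plugin does). The thresholds, the single exponential low-pass filter and the helper names are illustrative assumptions; the actual app combined several filters and state machines:

```javascript
const THRESHOLD = 4;      // minimum filtered magnitude to count as a gesture
const LATCH_SAMPLES = 5;  // samples to swallow after firing, to avoid repeats

function createGestureDetector(alpha = 0.3) {
  const filtered = { x: 0, y: 0 };  // exponential low-pass filter state
  let latch = 0;                    // countdown while latched

  // Feed one raw accelerometer sample; returns a gesture name or null.
  return function onSample(sample) {
    // Low-pass filter: suppress high-frequency noise and jitter.
    filtered.x = alpha * sample.x + (1 - alpha) * filtered.x;
    filtered.y = alpha * sample.y + (1 - alpha) * filtered.y;

    if (latch > 0) {  // latched: ignore samples right after a gesture fired
      latch -= 1;
      return null;
    }
    if (Math.abs(filtered.x) > THRESHOLD) {
      latch = LATCH_SAMPLES;
      return filtered.x > 0 ? "moveRight" : "moveLeft";
    }
    if (Math.abs(filtered.y) > THRESHOLD) {
      latch = LATCH_SAMPLES;
      return filtered.y > 0 ? "moveUp" : "moveDown";
    }
    return null;
  };
}
```

The latch is what stops one physical flick of the wrist from registering as several movements in a row.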
Working on this project was a great experience. We explored several new areas and loved working together. A big thanks to the entire team.
Project Information
License: MIT license (MIT)
Source Code/Project URL: https://github.com/abhi1304/NumaSpace
Resources
The NumaSpace Live Portal - http://numaspace.constonline.in/
The NumaSpace Mobile App (Android) - http://www.numahub.com/numaspace/numaspace.apk
Detailed Demo - YouTube - https://www.youtube.com/watch?v=jJus058UwBI