Monday, 3 November 2014

Sensor Dashboard

- a developer tool for visually inspecting Android Wear sensor data

Gestural interfaces and activity detection are going to become extremely interesting topics as we start to see the next generation of Google's Android Wear devices and the apps that innovative developers write for them.

Last weekend, after the awesome Droidcon UK conference, there was an Android Wear hackathon. There I got together with two friends, and we put our heads together to build a tool that helps developers understand what kind of data is available to them from Android Wear devices.

Here is what we built:

What does it do?

First, I'd like to remind you that the app was built in a hackathon. It's not perfect, and it is potentially buggy.

The app reads all the sensor data available from a connected Android Wear device and graphs it on a connected handheld app.
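On Android Wear, data typically travels from the watch to the phone as small byte payloads (for example via the Wearable MessageApi). As a rough sketch of that step, and not necessarily how the app itself does it, here is a hypothetical codec that packs a single sensor reading into bytes and unpacks it on the other side; the class and method names are my own:

```java
import java.nio.ByteBuffer;

// Hypothetical sketch: pack one sensor reading (type, timestamp, values)
// into a byte payload of the kind the watch could send to the handheld,
// and decode the values back out on the receiving side.
public class SensorReadingCodec {

    // Encode sensor type, timestamp and values into a byte array.
    public static byte[] encode(int sensorType, long timestampNanos, float[] values) {
        ByteBuffer buffer = ByteBuffer.allocate(4 + 8 + 4 + values.length * 4);
        buffer.putInt(sensorType);
        buffer.putLong(timestampNanos);
        buffer.putInt(values.length);
        for (float v : values) {
            buffer.putFloat(v);
        }
        return buffer.array();
    }

    // Decode just the values; type and timestamp are read the same way.
    public static float[] decodeValues(byte[] payload) {
        ByteBuffer buffer = ByteBuffer.wrap(payload);
        buffer.getInt();   // skip sensor type
        buffer.getLong();  // skip timestamp
        int count = buffer.getInt();
        float[] values = new float[count];
        for (int i = 0; i < count; i++) {
            values[i] = buffer.getFloat();
        }
        return values;
    }

    public static void main(String[] args) {
        byte[] payload = encode(1, 123456789L, new float[] {0.1f, 9.8f, 0.3f});
        float[] values = decodeValues(payload);
        System.out.println(values.length + " " + values[1]); // prints 3 9.8
    }
}
```

Keeping the payload this small matters on a watch, where every byte sent over Bluetooth costs battery.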

The idea is to help developers figure out whether the app idea they have is feasible. Let's say you're planning to build an app that detects push-ups. Without writing any code, you simply install Sensor Dashboard, do push-ups, and observe the effect on the various sensors. If you see an identifiable peak in the data from some of the sensors, you can probably detect it programmatically as well, and you have a good starting point for designing your algorithms.
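To make the idea concrete, here is a minimal, hypothetical sketch of what "an identifiable peak" could become in code: counting upward threshold crossings in a stream of accelerometer magnitudes, with a simple rising-edge check so one push-up isn't counted twice. The threshold and data are made up for illustration:

```java
// Hypothetical sketch: count repetitions (e.g. push-ups) as upward
// threshold crossings in a stream of accelerometer magnitudes.
public class PeakCounter {

    // Count rising edges above the threshold.
    public static int countPeaks(double[] magnitudes, double threshold) {
        int peaks = 0;
        boolean above = false;
        for (double m : magnitudes) {
            if (!above && m > threshold) {
                peaks++;        // rising edge: entered the peak region
                above = true;
            } else if (m <= threshold) {
                above = false;  // fell back below; ready for the next peak
            }
        }
        return peaks;
    }

    public static void main(String[] args) {
        // Fake magnitude trace around gravity (~9.8 m/s^2) with three bursts.
        double[] trace = {9.8, 9.9, 13.5, 14.2, 9.7, 9.8, 13.9, 9.6, 9.8, 14.8, 13.1, 9.7};
        System.out.println(countPeaks(trace, 12.0)); // prints 3
    }
}
```

Inspecting a real trace in Sensor Dashboard first is exactly what tells you whether a fixed threshold like this is plausible, or whether you need something smarter.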


This is what an airplane takeoff looks like in sensor data:

Future development

This app was built as part of a hackathon in just under 2 days. It's far from perfect. The UI doesn't really work very well. In fact, it has never been tested on anything other than a Nexus 5 running the Android 5.0 preview.

There are many features that we would like to see added to the app. Firstly, it would be great to be able to run the sensor data gathering on a handheld device as well. This would not be too large a change, as Android Wear runs Android, so the same service could easily be executed on a handheld too.

I'd also love to implement a feature that allows the user to pin multiple sensors and overlay them. I believe that many use cases require multiple sensors to make gesture or activity detection easier.
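Overlaying sensors with very different ranges (accelerometer in m/s², gyroscope in rad/s) would need each series scaled onto a shared axis first. One plausible approach, sketched here with made-up numbers, is min-max normalising every series to [0, 1] before plotting:

```java
// Hypothetical sketch: min-max normalise a sensor series to [0, 1] so
// several sensors with different units can be overlaid on one axis.
public class SeriesNormalizer {

    public static double[] normalize(double[] series) {
        double min = Double.POSITIVE_INFINITY;
        double max = Double.NEGATIVE_INFINITY;
        for (double v : series) {
            min = Math.min(min, v);
            max = Math.max(max, v);
        }
        double range = max - min;
        double[] out = new double[series.length];
        for (int i = 0; i < series.length; i++) {
            // A flat series maps to 0 to avoid dividing by zero.
            out[i] = range == 0 ? 0 : (series[i] - min) / range;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] accel = normalize(new double[] {9.6, 9.8, 14.8});
        double[] gyro  = normalize(new double[] {-0.2, 0.0, 1.4});
        System.out.println(accel[2] + " " + gyro[0]); // prints 1.0 0.0
    }
}
```

With both traces in [0, 1], a correlated spike across sensors stands out visually even when the raw magnitudes differ by orders of magnitude.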

Get the source

The project is open source, and you can download the source code from GitHub:

Please contribute back if you make improvements. Pull requests are very welcome!

Get the app

The app is also available for free from the Google Play store:

Please give us feedback!

I believe this app can be a pretty useful tool for developers. It would be great to hear what you think about it and what could be better. Feel free to ping me on Google+ with any questions: 

The team behind this project:
