
Making Sense: The Road to Mobile Awareness

• Jared Sheehan
• Twitter: @jayroo5245
• meetup.com/DCAndroid
• slideshare.net/Jayroo5245

• What is Contextual Awareness?

• Use Cases
• Sensor Fusion
• Hard way
• Medium hard way
• Easy way
• Questions

Agenda

“Context is any information that can be used to characterize the situation of an entity. An entity is a person, place, or object that is considered relevant to the interaction between a user and an application, including the user and application themselves.”

• Anind Dey
• Director of Human-Computer Interaction at Carnegie Mellon University

• Mobile sensing of a user’s context
• Sensor-based algorithms
• Some sensor types on the Android platform:
  • Accelerometer
  • Gyroscope/Orientation/Rotation Vector
  • Barometric Pressure
  • Magnetic Field
  • Gravity
  • Relative Humidity
  • Ambient Room Temperature
  • Device Temperature

Mobile Contextual Awareness

Sensor Types in Android

• Detecting when a user:
  • Changes the orientation of their device
  • Is walking, running, or biking
  • Is driving a vehicle
  • Is handling their device
  • Is driving AND handling their device
    • Disclaimer – Don’t do it

• Drives by a restaurant or coffee shop when it is open
  • Driving detection
  • Google Places
  • Time

Use Cases

Sensor fusion is the combining of sensory data, or data derived from disparate sources, such that the resulting information has less uncertainty than would be possible when these sources were used individually.

Sensor Fusion – What is it?

If a developer takes individual sensor output and combines it with additional output from other sensors (or other hints), they get a better understanding of what is going on with the mobile device.

Sensor Fusion – What is it?

Determine the attitude of a mobile device.

Attitude – orientation of a device relative to Earth’s horizon

Sensor Fusion – Example

The common way to get the attitude of an Android device is to use the SensorManager.getOrientation() method to get the three orientation angles. These three angles are based on the accelerometer and magnetometer output.
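A minimal sketch of that approach using android.hardware.SensorManager (accelValues and magnetValues are illustrative names for copies of the most recent SensorEvent.values from each sensor, not names from the demo project):

// Derive orientation angles from the latest accelerometer and
// magnetometer readings.
float[] rotationMatrix = new float[9];
float[] orientationAngles = new float[3];

if (SensorManager.getRotationMatrix(rotationMatrix, null, accelValues, magnetValues)) {
    // Fills orientationAngles with azimuth, pitch and roll, in radians.
    SensorManager.getOrientation(rotationMatrix, orientationAngles);
}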

Sensor Fusion – Example

In simple terms, the accelerometer provides the gravity vector (the vector pointing towards the center of the earth) and the magnetometer works as a compass. The information from both sensors suffices to calculate the device’s orientation.

Sensor Fusion – Example

Isn’t that enough?

Sensor Fusion – Example

No

The problem is that both sensor outputs are inaccurate, especially the output from the magnetic field sensor, which includes a lot of noise.

How do we fix it?

Gyro drift and noisy orientation are common issues with this approach. To solve them, the gyroscope output is applied only for orientation changes in short time intervals, while the magnetometer/accelerometer data is used as support information over long periods of time.

Sensor Fusion – Example

This is equivalent to low-pass filtering of the accelerometer and magnetic field sensor signals and high-pass filtering of the gyroscope signals. The overall sensor fusion and filtering combines the two orientation estimates with a complementary filter.
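A rough sketch of that combination (the 0.98 coefficient is an arbitrary illustrative value; the array names follow the accMagOrientation / gyroOrientation naming used on the following slides):

// Complementary filter: trust the gyroscope short-term (high-pass)
// and the accelerometer/magnetometer long-term (low-pass).
private static final float FILTER_COEFFICIENT = 0.98f; // illustrative value

for (int i = 0; i < 3; i++) {
    fusedOrientation[i] = FILTER_COEFFICIENT * gyroOrientation[i]
            + (1.0f - FILTER_COEFFICIENT) * accMagOrientation[i];
}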

Sensor Fusion – Example


So what exactly do high-pass and low-pass filtering of the sensor data mean? The sensors provide their data at (more or less) regular time intervals. Their values can be shown as signals in a graph with time as the x-axis, similar to an audio signal.

Sensor Fusion – Example

The low-pass filtering of the noisy accelerometer/magnetometer signal (the accMagOrientation angles) amounts to averaging the orientation angles over time within a constant time window.

Sensor Fusion – Example

Initialize sensor containers:
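A minimal sketch of the kind of state such an implementation keeps (field names are illustrative, not taken from the demo project):

private SensorManager sensorManager;

// Latest raw readings from each sensor.
private float[] accel = new float[3];
private float[] magnet = new float[3];
private float[] gyro = new float[3];

// Orientation angles from accel + magnet, from gyro integration,
// and the fused result of the two.
private float[] accMagOrientation = new float[3];
private float[] gyroOrientation = new float[3];
private float[] fusedOrientation = new float[3];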

Sensor Fusion – Example

Register your listeners:
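For example, inside an Activity that implements SensorEventListener (SENSOR_DELAY_FASTEST simply asks for data as often as the platform will deliver it):

sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);

// getDefaultSensor() returns null when a device lacks the sensor,
// which is one of the fragmentation problems discussed later.
sensorManager.registerListener(this,
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_FASTEST);
sensorManager.registerListener(this,
        sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_FASTEST);
sensorManager.registerListener(this,
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
        SensorManager.SENSOR_DELAY_FASTEST);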

Sensor Fusion – Example

Store sensor events:
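Continuing the same sketch, one way to store them is to copy each event’s values into the matching container in onSensorChanged():

@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            // Copy rather than keep a reference: the platform reuses event.values.
            System.arraycopy(event.values, 0, accel, 0, 3);
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            System.arraycopy(event.values, 0, magnet, 0, 3);
            break;
        case Sensor.TYPE_GYROSCOPE:
            System.arraycopy(event.values, 0, gyro, 0, 3);
            break;
    }
}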

Sensor Fusion – Example

At some time interval you process the sensor arrays, and events can then be inferred from a single pass or from multiple passes.
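One common pattern, sketched here rather than taken from the demo project, is a periodic task that recomputes accMagOrientation and feeds the complementary filter shown earlier (the delay and period values are arbitrary, and the gyroscope integration that updates gyroOrientation is omitted for brevity):

private final Timer fusionTimer = new Timer();

private void startFusion() {
    fusionTimer.scheduleAtFixedRate(new TimerTask() {
        @Override
        public void run() {
            float[] rotationMatrix = new float[9];
            if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
                SensorManager.getOrientation(rotationMatrix, accMagOrientation);
            }
            // Fuse the slow-but-stable accel/magnet estimate with the
            // fast-but-drifting gyro estimate.
            for (int i = 0; i < 3; i++) {
                fusedOrientation[i] = FILTER_COEFFICIENT * gyroOrientation[i]
                        + (1.0f - FILTER_COEFFICIENT) * accMagOrientation[i];
            }
        }
    }, 1000, 30);
}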

Sensor Fusion – Example

Example of Rotation Vector processing: https://developer.android.com/reference/android/hardware/SensorEvent.html#values
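The linked page documents how the rotation vector components map to SensorEvent.values. As a small sketch, a rotation vector event (itself a fused virtual sensor on most devices) can be turned into orientation angles with two standard SensorManager calls; event here is assumed to be the SensorEvent passed to onSensorChanged():

if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];

    // Convert the rotation vector into a rotation matrix, then into
    // azimuth/pitch/roll angles.
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    SensorManager.getOrientation(rotationMatrix, orientation);
}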

Sensor Fusion – Example

• https://github.com/Jayroo5245/makingsense

• https://github.com/Jayroo5245

Demo time!

• This is a simple-ish formula to obtain one feature
• Very large task
• Lots of math, calculations, and sensor state maintenance
• Not something a standard Android developer is used to working with

Sensor Fusion – Challenges

• How do you support 100% of devices?

• Very difficult
• Android fragmentation
• Not all sensors return values at the same frequency

Sensor Fusion – Challenges

• Process prioritization issues
• OEMs build devices to their specs, not ours
• Missing sensors on some devices
• Android/Java platform limitations
• Go native – NDK

Sensor Fusion – Challenges

Example platform limitation: the Android platform was not designed to process sensor data as fast as it is generated. Using an Executor gave the best results, but you will not get a consistent 16, 32, or 64 Hz.
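For illustration, a sketch of the kind of Executor-driven processing loop that statement refers to (processSensorArrays() is a hypothetical placeholder, and the 16 ms period is only a target that will not be delivered consistently):

// Process the stored sensor arrays off the main thread at a fixed rate.
private final ScheduledExecutorService sensorExecutor =
        Executors.newSingleThreadScheduledExecutor();

private void startProcessing() {
    sensorExecutor.scheduleAtFixedRate(new Runnable() {
        @Override
        public void run() {
            processSensorArrays(); // placeholder for the app's fusion/inference pass
        }
    }, 0, 16, TimeUnit.MILLISECONDS); // roughly 60 Hz target; actual rate will jitter
}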

Sensor Fusion – Challenges

• External Libraries
  • Lost – drop-in replacement for Google’s Fused Location API
  • www.zendrive.com
  • www.driversiti.com
  • www.pathsense.com
  • www.locationkit.io

Alternatives to the hard way:

• External Libraries
  • Licensing – i.e. cost
  • Probably don’t do exactly what you want
  • Inference change/deprecation
  • Lack of support
  • Battery drain

Drawbacks to External Libs:

Battery Issues:

Let Google Do It for you – Awareness API

• Current Local Time

Context #1– Time

• Latitude
• Longitude

Context #2 – Location

• Place, including place Type

Context #3 – Place

• Activity Recognition
• Detected user activity (walking, running, biking)

Context #4 – Activity

• Nearby beacons (including namespace, type, and content)

Context #5 – Beacons

• Are the headphones plugged in?

Context #6 – Headphones

• Current Weather Conditions

Context #7 – Weather

• Apps can combine these context signals to make inferences about the user's current situation, and use this information to provide customized experiences.

• Exp: Suggest a playlist while jogging in the rain.

What is it?

• Easy implementation
  • One API
• Signals are processed for the app
  • No need to build complicated algorithms
• Optimized battery

Awareness Benefits

• Fence API
  • System notifications
• Snapshot API
  • Real-time request

Great, now what?

• Push mechanism – react to specific situations

• Provides notifications when a specific combination of conditions occurs

• Exp: Tell me when a user is biking, it’s lunchtime, and they are near a bike-friendly restaurant (sketched below)
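A rough sketch of such a fence using the GoogleApiClient-based Fence API available at the time (DetectedActivityFence, TimeFence, and LocationFence are real classes; the coordinates, radius, time window, fence key, googleApiClient, and fencePendingIntent are illustrative placeholders):

AwarenessFence biking = DetectedActivityFence.during(DetectedActivityFence.ON_BICYCLE);
AwarenessFence lunchtime = TimeFence.inDailyInterval(
        TimeZone.getDefault(),
        11L * 60 * 60 * 1000,   // 11:00
        14L * 60 * 60 * 1000);  // 14:00
AwarenessFence nearRestaurant = LocationFence.in(38.9072, -77.0369, 100.0, 0L);

FenceUpdateRequest request = new FenceUpdateRequest.Builder()
        .addFence("bikeLunch",
                AwarenessFence.and(biking, lunchtime, nearRestaurant),
                fencePendingIntent) // delivered to a BroadcastReceiver that reads FenceState
        .build();

Awareness.FenceApi.updateFences(googleApiClient, request);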

Fence API

• Pull mechanism

• Requests the same context signals on demand instead of being notified when they change

• Exp: Ask, right now, whether the user is biking, it’s lunchtime, and a bike-friendly restaurant is nearby (sketched below)
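A rough sketch of the same kind of question asked through the GoogleApiClient-based Snapshot API of that era (googleApiClient is assumed to already be connected with Awareness.API; only the activity part of the example is shown):

// Pull the user's current detected activity on demand.
Awareness.SnapshotApi.getDetectedActivity(googleApiClient)
        .setResultCallback(new ResultCallback<DetectedActivityResult>() {
            @Override
            public void onResult(DetectedActivityResult result) {
                if (!result.getStatus().isSuccess()) {
                    return;
                }
                DetectedActivity probable =
                        result.getActivityRecognitionResult().getMostProbableActivity();
                Log.d("Snapshot", "Current activity: " + probable);
            }
        });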

Snapshot API

• Hard way
  • Build your own
• Easier way
  • External lib
• Easiest way
  • Awareness API

Three options

Thank you for coming! The Road to Mobile Awareness

• Jared Sheehan
• Twitter: @jayroo5245
• meetup.com/DCAndroid
• slideshare.net/Jayroo5245

Sources

• https://en.wikipedia.org/wiki/Sensor_fusion
• https://developer.android.com/guide/topics/sensors/sensors_overview.html
• http://plaw.info/2012/03/android-sensor-fusion-tutorial/comment-page-1/
• http://www.androidpolice.com/2016/05/19/the-new-awareness-api-will-let-apps-better-understand-your-environment/
• https://developers.google.com/awareness/overview
• https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/context-aware-computing-context-awareness-context-aware-user-interfaces-and-implicit-interaction
