Gestures

From Openmoko

Revision as of 10:07, 7 June 2008

Read more on how the project is going [http://www.borza.ro/index.php/category/accelerometer-based-gestures/ here]. I will release a video on YouTube demonstrating the gesture recognition framework at the end of June. I'm currently working on continuous recognition; I've already done isolated recognition. Continuous recognition is almost like isolated recognition, except that I'm using a two-class Gaussian classifier as the end-point detector.
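The end-point detection step described above can be sketched as follows. This is an illustrative assumption, not the project's actual code: the feature (deviation of the acceleration magnitude from 1 g) and the class statistics are made-up values standing in for trained parameters.

```python
import math

class Gaussian:
    """One-dimensional Gaussian class model for the two-class classifier."""
    def __init__(self, mean, var):
        self.mean, self.var = mean, var

    def log_likelihood(self, x):
        return -0.5 * (math.log(2 * math.pi * self.var)
                       + (x - self.mean) ** 2 / self.var)

# Illustrative class statistics (not trained values):
REST = Gaussian(mean=0.02, var=0.0004)   # magnitude stays near 1 g at rest
MOTION = Gaussian(mean=0.40, var=0.04)   # large deviations during a gesture

def is_motion(sample_g):
    """Classify one accelerometer sample (ax, ay, az), in units of g."""
    ax, ay, az = sample_g
    feature = abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)
    return MOTION.log_likelihood(feature) > REST.log_likelihood(feature)

def segment(stream):
    """Yield (start, end) index pairs of contiguous motion runs.

    Each run is a candidate gesture segment handed to the isolated
    recognizer, which is how continuous recognition reduces to the
    isolated case.
    """
    start = None
    for i, s in enumerate(stream):
        moving = is_motion(s)
        if moving and start is None:
            start = i
        elif not moving and start is not None:
            yield (start, i)
            start = None
    if start is not None:
        yield (start, len(stream))
```

With end-points found this way, the samples between each (start, end) pair can be fed unchanged to the existing isolated-gesture recognizer.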

Applications (needs some cleaning)

  • Mute audio or suspend when the screen is facing down;
  • Go to the main menu when shaken;
  • Volume up/down during a call when tilting left/right (still unclear);
  • Turning the phone face to the user (not the same as taking it to the ear) to turn on the backlight;
  • Automatic portrait/landscape switching for the UI;
  • Turning the phone screen down to mute sound (and probably turn off the backlight) or hold a call;
  • Swinging in an O-shape in the air to redial;
  • Moving the phone in a firm gesture from one ear to the other to switch between active and held calls;
  • Scrolling with firm tilts (suggested several times; should see if it's usable);
  • Dropping (suggested several times, though it's unclear how to react to it);
  • Shaking to get audio feedback (could e.g. imitate balls rolling inside according to the number of unread messages, or liquid splashing to indicate the battery level);
  • Starting driving in a car (if that's detectable -- it probably has different patterns than walking etc.) to switch to some “car mode”;
  • Stopping e.g. at a traffic light to choose a better time to notify about new messages than while driving;
  • Taking off in a plane (should be detectable, but hard to train) to shut down all RF systems;
  • Similarly, landing to re-enable RF systems.

Vigorous shaking (side to side) while receiving a call could reject it.

A sideways swing (90 degrees) out of the wrist could mean a general Cancel/Esc/Back. A long swing could close an app (more an arm swing than a wrist swing; the same G-forces, but over a longer time). These swing moves could be used on two axes, each in two directions, for different purposes. A firm wrist tilt, backside down, could mean a global OK.

Maybe some basic moves like these should have an absolute global meaning.

(like left, right, enter, esc)

Mute the phone by hitting it on something hard three times with one side.

Face down, lying still - lock the screen.
Face up, lying still - never lock the screen.
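A minimal sketch of this face-down/face-up stillness check, assuming accelerometer samples in units of g with the z axis perpendicular to the screen; the thresholds are illustrative guesses, not measured values.

```python
def lock_policy(samples, still_var=0.001, z_thresh=0.8):
    """Decide a lock action for a window of (ax, ay, az) samples in g.

    Returns "lock" (face down, lying still), "never_lock" (face up,
    lying still), or None when the device is moving or tilted.
    """
    zs = [az for (_, _, az) in samples]
    mean_z = sum(zs) / len(zs)
    var_z = sum((z - mean_z) ** 2 for z in zs) / len(zs)
    if var_z > still_var:       # device is moving: make no decision
        return None
    if mean_z < -z_thresh:      # screen facing down, lying still
        return "lock"
    if mean_z > z_thresh:       # screen facing up, lying still
        return "never_lock"
    return None                 # on edge or tilted: make no decision
```

The same stillness-plus-orientation test would also cover the "mute audio when the screen faces down" idea from the list above, just with a different action attached.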

  • Holding the moko out & angling the front of it up repeatedly turns up the volume;
  • Angling the front down repeatedly turns down the volume;
  • A set of 5 or 10 standard, easily distinguishable gestures that the user can map to favorite programs.
