UI Improvements

From Openmoko


DRAFT (taken from emails), will be reorganized shortly


Introduction

> Obviously the tools are in the wild to build interfaces that could rival
> (or better IMO) anything Apple comes up with. We just need to organize
> this stuff. This would need hardware that can support dynamic
> interfaces. I can help here, too.

sean@openmoko.com

Finding inspiration ...

Ergonomics - Human/Machine papers

Seen Live

  • Multi-Touchscreen experiments video @youtube: http://www.youtube.com/watch?v=89sz8ExZndc
  • iPhone UI features demo @youtube: http://www.youtube.com/watch?v=nPqqfVLQ_qY

Our weapons

The touchscreen

Question:

 What exactly does the touchscreen see when you touch the screen with 2 fingers at the same time,
 when you move them, when you move only one of the two, etc.? I'm also interested in knowing how precise
 the touchscreen is (e.g. refresh rate, possible pressure indication, ...).

Answer:

 The output is the center of the bounding box of the touched area.
 Pressure has a small but nonzero effect: almost no effect on a single
 touch; on a double touch, the relative pressures will have a slight
 skewing effect towards the harder touch (in theory).
 The touch point skips instantly on double touch.
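
As a toy illustration of that answer (this is not driver code; the Touch struct, the skew factor and the sample values are all made-up assumptions), the single reported point for two simultaneous touches could be modelled like this in C:

 /* Toy model of the behaviour described above, NOT driver code: with two
  * simultaneous touches the controller reports roughly the center of the
  * bounding box of the touched area, slightly skewed towards the harder
  * press.  The Touch struct, skew factor and sample values are made up. */
 #include <stdio.h>

 typedef struct { double x, y, pressure; } Touch;

 static void reported_point(const Touch *a, const Touch *b, double *rx, double *ry)
 {
     /* center of the bounding box of the two touches (= their midpoint) */
     double cx = (a->x + b->x) / 2.0;
     double cy = (a->y + b->y) / 2.0;
     /* slight skew towards the harder touch */
     double total = a->pressure + b->pressure;
     double skew  = total > 0.0 ? (a->pressure - b->pressure) / total : 0.0;
     *rx = cx + 0.1 * skew * (a->x - b->x) / 2.0;   /* 0.1 = arbitrary skew factor */
     *ry = cy + 0.1 * skew * (a->y - b->y) / 2.0;
 }

 int main(void)
 {
     Touch a = { 100, 100, 1.0 }, b = { 300, 400, 0.5 };
     double x, y;
     reported_point(&a, &b, &x, &y);
     printf("single reported point: (%.1f, %.1f)\n", x, y);
     return 0;
 }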

Areas of improvement

  • OpenGL for fluid zooming interfaces (2D: the infinite sphere model, 1D: the infinite wheel of fortune/ribbon model, Exposé)
  • Hand gestures
  • Physics-model based improvements: inertia and friction
  • Multi-touch screen for natural hand gestures

Physics-inspired animation

If we want to add eye candy & usability to the UI (such as smooth, realistic list scrolling, as seen in Apple's iPhone demo on contact lists), we'll need a physics engine, so that moves & animations aren't all linear.

The most commonly used technique for calculating trajectories and systems of related geometrical objects seems to be Verlet integration (http://en.wikipedia.org/wiki/Verlet_integration); it is an alternative to Euler's integration method that relies on a fast approximation.
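
A minimal sketch of what a 1D position-Verlet step with a simple damping/friction term looks like in C (the constants and names are illustrative assumptions, not code taken from akamaru or e17):

 /* Minimal 1D position-Verlet step with a damping ("friction") factor.
  * All constants here are illustrative assumptions. */
 #include <stdio.h>

 typedef struct {
     double x;       /* current position      */
     double prev_x;  /* position one step ago */
 } Particle;

 static void verlet_step(Particle *p, double accel, double dt, double friction)
 {
     /* x(t+dt) = x(t) + (x(t) - x(t-dt)) * (1 - friction) + a * dt^2 */
     double new_x = p->x + (p->x - p->prev_x) * (1.0 - friction) + accel * dt * dt;
     p->prev_x = p->x;
     p->x      = new_x;
 }

 int main(void)
 {
     Particle p = { 0.0, -2.0 };   /* implicit initial velocity: 2 units per step */
     int i;
     for (i = 0; i < 10; i++) {
         verlet_step(&p, 0.0, 1.0, 0.05);   /* no external force, 5% friction */
         printf("step %d: x = %.2f\n", i, p.x);
     }
     return 0;
 }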

We may have no need for such a mathematical method at first, but perhaps there are other use cases. For instance, it may be useful for gesture recognition (I'm not aware whether existing gesture recognition engines measure speed, acceleration, ...).

Libakamaru

The akamaru library (http://people.freedesktop.org/~krh/akamaru.git/) is the code behind kiba-dock's fun and dynamic behaviour (http://www.youtube.com/watch?v=VekgyKQoTeM). Its dependencies are light (it needs just GLib). It takes elasticity, friction and gravity into account.

If you want to take a quick look at the code:

 svn co http://svn.kiba-dock.org/akamaru/ akamaru

The only (AFAIK) application using this library is kiba-dock, a *fun* app launcher, but we may find other uses for it in the future.

As suggested on the mailing list, it is mostly overkill for the uses we have in mind, but the library may already be optimized, and its API may save us some time for other usages too. Furthermore, "qui peut le plus peut le moins": what can handle the harder case can handle the simpler one.

Verlet integration implementation from e17

There's an ongoing Verlet integration implementation in the e17 project (by rephorm), see http://rephorm.com/news/tag/physics , so we may see some UI physics integration into e17 someday.

Improvement ideas

I think it's a great idea to have some rate-aiding on scrolling.

1D Scrolling: n-sided uniform prism

Description

Take an item list (e.g. an address book), print it on a ribbon of paper, and glue it onto a wheel (on the tire). You're looking at the front of it, so when you want to go from A to Z, you touch the wheel and drag it up. When you let the wheel go, it keeps turning, carried by its inertia. Stop the wheel when you reach your contact. Got the idea? That's why we may speak of an "infinite wheel", whose surface is effectively flat. For our case here, we always want to display flat, square content, so the n-sided uniform prism (http://en.wikipedia.org/wiki/Uniform_prism) analogy is mathematically more exact.

Why this wheel model? Because the modelling is coherent:

  • weight: the heavier it is, the faster it goes = the bigger the item list, the faster it scrolls; that way, you don't have to wait too long for big lists, and you don't miss your item on shorter lists
  • friction: there is friction where the wheel is mounted, so that the wheel doesn't turn forever
  • the initial speed and acceleration you give it determine its further rotation
  • it's "round"/cyclic, so you can browse the list in both directions

We can add "parallel wheels", symbolizing different sorting methods: slide a long way to the left/right to look at a different wheel, i.e. a different item organization.
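
A tiny sketch of the wheel/prism mapping, assuming a hypothetical wheel_index() helper: a continuous rotation angle is folded onto one of the n faces/items, wrapping around so the list stays cyclic (compile with -lm):

 /* Mapping a continuous rotation angle onto one of the n faces/items of
  * the prism; the list is cyclic, so the index wraps around.
  * wheel_index() is a hypothetical helper, not an existing API. */
 #include <math.h>
 #include <stdio.h>

 static int wheel_index(double angle, int n_items)
 {
     double face = 2.0 * M_PI / n_items;   /* angle covered by one face/item */
     int idx = (int)floor(angle / face);
     idx %= n_items;
     if (idx < 0)
         idx += n_items;                   /* wrap for negative rotation */
     return idx;
 }

 int main(void)
 {
     int n = 26;                           /* e.g. one face per letter A..Z */
     printf("angle  0.0 rad -> item %d\n", wheel_index(0.0, n));
     printf("angle  3.5 rad -> item %d\n", wheel_index(3.5, n));
     printf("angle -1.0 rad -> item %d\n", wheel_index(-1.0, n));
     return 0;
 }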

Controls

  • Sliding up/down = Single click + maintained for a minimal distance

Effect: scroll in an inverted/negated fashion (slide down = scroll up, slide up = scroll down)

When the finger is released (i.e. the touchscreen no longer detects any press):

 if (last_speed_seen > threshold) then keep this speed and acceleration,
 with friction (so that it slows down)
 else stop scrolling

(A small C sketch of this rule follows the control list below.)

Scrolling here is treated as one-dimensional, but the same idea can apply to two-dimensional situations (e.g. a zoomed image) too.

  • Action = quick double tap
  • Details/select = short single tap
  • Right click = long tap
  • Sliding left/right: switch sorting method
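
The release rule above, written out as a small self-contained C sketch; the threshold and friction constants are arbitrary placeholders, and a real implementation would spread the loop over animation frames instead of running it all at once:

 /* The release rule from above: keep scrolling after a flick, let
  * friction bleed the speed off, stop immediately after a slow release.
  * SPEED_THRESHOLD and FRICTION are placeholder values. */
 #include <math.h>
 #include <stdio.h>

 #define SPEED_THRESHOLD 5.0    /* pixels per frame */
 #define FRICTION        0.95   /* fraction of the speed kept each frame */

 static double scroll_after_release(double last_speed_seen)
 {
     double offset = 0.0;
     double speed  = fabs(last_speed_seen) > SPEED_THRESHOLD ? last_speed_seen : 0.0;

     while (fabs(speed) > 0.5) {   /* keep scrolling until the wheel "stops" */
         offset += speed;          /* one frame of free scrolling            */
         speed  *= FRICTION;       /* friction slows it down                 */
     }
     return offset;
 }

 int main(void)
 {
     printf("fast flick (40 px/frame): %.1f extra pixels\n", scroll_after_release(40.0));
     printf("slow release (2 px/frame): %.1f extra pixels\n", scroll_after_release(2.0));
     return 0;
 }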

Parts to "hack"

Having a scroll that isn't a 1:1 map to the user's action isn't hard. It's just an extra calculation in the scroll code.

<---- Where is the scroll code? :)

  • libmokoui
  • gtk

I'm wondering what layer of Openmoko has to be hacked, i.e. whether working at the Openmoko layer allows enough possibilities for this; if I'm not mistaken, this is part of libmokoui, but I'm pretty afraid that patching GTK itself would be needed. Working at the lower level would apply the changes to every application, not only Openmoko's.

TODO:

  • remove the scrolling slider in finger mode
  • make the entire list a "scrolling zone", i.e. a transparent overlay scrolling slider
  • define controls
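
Purely as a sketch of where such a hook could live in plain GTK+ code (this is not libmokoui's actual API; the KineticScroll struct, the constants and the wiring are assumptions), kinetic scrolling could be driven from a GtkAdjustment and a timeout after the finger is released:

 /* Sketch only: kinetic scrolling driven from plain GTK+ code.  This is
  * NOT libmokoui's API; the KineticScroll struct and all constants are
  * illustrative assumptions. */
 #include <gtk/gtk.h>

 typedef struct {
     GtkAdjustment *vadj;    /* vertical adjustment of the scrolled list     */
     gdouble        speed;   /* last measured drag speed, in pixels per tick */
 } KineticScroll;

 /* Runs roughly every 30 ms after the finger is released. */
 static gboolean kinetic_tick(gpointer data)
 {
     KineticScroll *ks = data;
     gdouble value = gtk_adjustment_get_value(ks->vadj);

     gtk_adjustment_set_value(ks->vadj, value + ks->speed);
     ks->speed *= 0.95;                   /* friction                          */

     return ABS(ks->speed) > 0.5;         /* returning FALSE stops the timeout */
 }

 /* Connected to "button-release-event" on the list widget. */
 static gboolean on_release(GtkWidget *widget, GdkEventButton *event, gpointer data)
 {
     KineticScroll *ks = data;

     if (ABS(ks->speed) > 5.0)            /* only a flick keeps scrolling */
         g_timeout_add(30, kinetic_tick, ks);
     return FALSE;                        /* let other handlers run too   */
 }

A matching "motion-notify-event" handler (not shown) would keep ks->speed up to date while the finger drags, and the release callback would be hooked up with g_signal_connect(list, "button-release-event", G_CALLBACK(on_release), ks).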


1D Scrolling: inertia/friction integration into Openmoko's finger wheel

The same idea, but for the finger wheel. It could be very quick to do: the mapping is no longer 1:1; instead, for example, 1/4 wheel turn = 1 item. The wheel is geared down, but has inertia.
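
A sketch of the geared-down wheel, assuming a hypothetical wheel_steps() helper: the finger's rotation is accumulated and one item step is emitted per quarter turn (the 1/4 ratio is the example from the text; everything else is made up):

 /* Geared-down finger wheel: accumulate the rotation and emit one item
  * step per quarter turn.  wheel_steps() is a hypothetical helper; only
  * the 1/4-turn-per-item ratio comes from the text above. */
 #include <stdio.h>

 #define TURNS_PER_ITEM 0.25   /* 1/4 wheel turn = 1 item */

 static int wheel_steps(double *accumulated_turns, double delta_turns)
 {
     int steps;
     *accumulated_turns += delta_turns;
     steps = (int)(*accumulated_turns / TURNS_PER_ITEM);   /* whole items crossed */
     *accumulated_turns -= steps * TURNS_PER_ITEM;         /* keep the remainder  */
     return steps;
 }

 int main(void)
 {
     double acc = 0.0;
     printf("%d item(s)\n", wheel_steps(&acc, 0.30));   /* 1 item, 0.05 turn left over */
     printf("%d item(s)\n", wheel_steps(&acc, 0.10));   /* 0 items (0.15 turn so far)  */
     printf("%d item(s)\n", wheel_steps(&acc, 0.60));   /* 3 items                     */
     return 0;
 }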

2D Scrolling: the infinite sphere model / non-infinite polyhedron

The same model as the infinite wheel can apply to 2D navigation, except that your wheel becomes an infinite "floating sphere" for image/webpage navigation.

Usages are:

  • zoomed image
  • zoomed internet webpage
  • browsing maps

n-D navigation: the polyhedra inspiration

When we want to navigate files, MP3s in a music player, etc., every control the application needs is currently a button. What about looking at polyhedra?

http://en.wikipedia.org/wiki/Polyhedra
http://en.wikipedia.org/wiki/List_of_uniform_polyhedra

Advanced/Natural hand gesture recognition

TODO

Gestures can be interesting, especially for "jumps" (when the cursor jumps from the upper-left corner to the lower-right). A jump is different from sliding, and appears only with touchpads and touchscreens; it can be detected as distinct from a button press if it is done fast enough (so you don't have to "aim" precisely). A small detection sketch follows the list below.

The interesting jumps are:

  • left <-> right
  • middle up <-> middle down
  • top left <-> down right
  • down left <-> top right
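
A small sketch of telling a jump apart from a slide, with placeholder thresholds (a jump covers a large distance between two consecutive touch reports, fast enough that it cannot be an ordinary drag); compile with -lm:

 /* Telling a "jump" apart from a slide: a jump covers a large distance
  * between two consecutive touch reports, fast enough that it cannot be
  * an ordinary drag.  Both thresholds are placeholder assumptions. */
 #include <math.h>
 #include <stdio.h>

 #define JUMP_MIN_DISTANCE 200.0   /* pixels between two consecutive reports */
 #define JUMP_MAX_DT       0.05    /* seconds: must happen "fast enough"     */

 static int is_jump(double x0, double y0, double x1, double y1, double dt)
 {
     double dist = hypot(x1 - x0, y1 - y0);
     return dist >= JUMP_MIN_DISTANCE && dt <= JUMP_MAX_DT;
 }

 int main(void)
 {
     /* finger leaves the upper-left corner, reappears lower-right, 20 ms later */
     printf("corner to corner in 20 ms:  %s\n",
            is_jump(10, 10, 470, 630, 0.02) ? "jump" : "slide/tap");
     /* the same distance covered by a slow drag */
     printf("corner to corner in 800 ms: %s\n",
            is_jump(10, 10, 470, 630, 0.80) ? "jump" : "slide/tap");
     return 0;
 }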

OpenGL compositing

Compositing finally seems to make zooming interfaces a reality.

Well, considering recent changes in desktop applications, OpenGL has a definite future. For instance, Exposé (be it Apple's or Beryl's) is a very interesting and usable feature. Using compositing allows the physics metaphor: the human brain doesn't like "gaps"/jumps (for instance while scrolling text), it needs continuity. When you look at Apple's iPhone prototype, it's not just eye candy; it's maybe the most natural/human way of navigating, because it's sufficiently realistic for the brain to forget the non-physical nature of what's inside.

So, OpenGL-capable hardware will be needed in a more or less distant hardware revision, for 100% fluid operation.

And, if we really want deep changes, a multi-touch screen is essential too :( (example: zooming with two fingers)...


Open questions

  • Will the Neo/Openmoko graphics system be powerful enough for such uses? I suspect Apple does OpenGL acceleration on their device, which is waaaaay impossible for us for now.
  • How does the touchscreen behave? We need a detailed touchscreen wiki information page, with visual traces. How hardware-specific is it?
