Touchscreen Filters

From Openmoko

Work in progress.

TODO: Complete the document. TODO: Improve videos (with no music, please).

In this document we describe the algorithms we use to make the touchscreen behave well. By "well" we mean that we can:

  • Get reliable clicks with the stylus and with the finger
  • Avoid further filtering in user space
  • Avoid calibration in user-space. We believe X (and other programs) should be able to gather reliable data from /dev/input/eventX.

Openmoko developers and contributors have tried different approaches that have helped improve the touchscreen performance. As a result, we now have a touchscreen filtering framework that we are using in Linux 2.6. We describe the current working solution as of December 2008, as it is in the andy-tracking branch of the Openmoko GIT repository.

If you think we can improve something please send us feedback.

We include videos showing how things improve as we add filters. In the videos we use a program that plots the points reported by the driver.

Our hardware

The FreeRunner (GTA02) uses the S3C2442 touchscreen controller, while the Neo1973 (GTA01) uses the S3C2410 one. For both devices we use the same driver.

The driver can be used for other devices as well.

We don't use pressure information

The only information we are using from the hardware is the reported X and Y coordinates. We are not using pressure information. We could obtain the area of the contact with the touchscreen and use it to distinguish between stylus, fingernail, and thumb. The area information is reliable, but we did not manage to normalize the results across the screen: they vary widely, and non-linearly, depending on which corner of the screen is pressed.

Why are we doing filtering in kernel space?

This is not new. An early driver for this device does some filtering (corrections, averaging, etc.).

In the latest driver we decided to add a generic filtering framework that could be used by more touchscreen drivers.

We are aware of tslib, and we use it with the current stable kernel (2.6.24), but we also think that doing the filtering in the kernel is a good idea. In the latest kernel (2.6.28, the next stable release) we do exactly that, and as a side effect we no longer need tslib.

Let's say we would like to deliver a touchscreen event to user space every 10 milliseconds. On the GTA02 with the current configuration, the analog-to-digital conversion time for one sample is 0.4697 milliseconds, so we can gather 18 samples for each event we send to user space; gathering the 18 samples for one event takes about 8.45 milliseconds. Sometimes we even decide that an event should not be sent to user space at all (because the hardware did not provide reliable data), and our tests show that this is the right thing to do. In previous versions of the driver, light taps would confuse the driver, which then reported bad clicks; this is no longer an issue.

Briefly stated: we think it is better if the driver sends only good data to the applications. If there are multiple distributions for a device (there are quite a few for the GTA02), they can share a tuned in-kernel configuration. We also save a lot of kernel-mode/user-mode transitions and data transmission, allowing us to consider more samples for each reported event.

Raw output

This is the raw output of the s3c2410 driver we are using (source).

In this video:

  • We draw a few numbers
  • We draw points while trying to keep the stylus still, and we get a smudge because of the variance of the raw output data. We also apply some pressure; we are not testing light pressure here.
  • We draw two lines, two curves
  • We draw points with the finger
  • We use the stylus again to draw a face

Note that we paint the points after the stylus/finger has been lifted, but keep in mind that the kernel sends events while the screen is pressed.

Video: http://www.youtube.com/watch?v=-ouBKOETiG8