Friday, February 25, 2011

Paper Reading #12, "Detecting and Leveraging Finger Orientation for Interaction with Direct-Touch Surfaces"

http://chi2010-cskach.blogspot.com/2011/02/paper-reading-12-disappearing-mobile.html
http://dlandinichi.blogspot.com/2011/02/paper-reading-12-teslatouch.html

Detecting and Leveraging Finger Orientation for Interaction with Direct-Touch Surfaces
Feng Wang and Xiangshi Ren, Kochi University of Technology, Japan
Xiang Cao, Microsoft Research Cambridge
Pourang Irani, University of Manitoba
This is a paper, not a presentation, and as such has no specific venue.

Summary
In this paper, the authors present an algorithm to detect the orientation of fingers, not just their point of contact, for use in touch screen interfaces. The algorithm is based on the shape of the contact region alone. Foundational prior work includes mouse interactions that can be adapted, as well as earlier research on finger interactions beyond the simple point of contact (not necessarily limited to orientation).

The math for distinguishing vertical from oblique touches and for determining direction is included in the paper, although the code is not. The most difficult part, distinguishing one direction from its opposite, is done by tracking the order in which pixels come into contact. Orientation is continuously tracked afterward.
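The paper itself gives the math but no code, so here is my own rough sketch of the idea, not the authors' implementation: fit a principal axis to the contact pixels (I am assuming a standard moment-based fit here), then resolve the 180-degree ambiguity by assuming the fingertip lands first, so the contact centroid drifts from the tip toward the finger base as more pixels come into contact. All function names are mine.

```python
import math

def principal_axis(pixels):
    """Angle (radians) of the major axis of a set of (x, y) contact
    pixels, from second-order central moments. Ambiguous by 180 degrees."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    mu20 = sum((x - cx) ** 2 for x, _ in pixels) / n
    mu02 = sum((y - cy) ** 2 for _, y in pixels) / n
    mu11 = sum((x - cx) * (y - cy) for x, y in pixels) / n
    return 0.5 * math.atan2(2 * mu11, mu20 - mu02)

def oriented_angle(early_pixels, full_pixels):
    """Resolve the 180-degree ambiguity using the order of contact:
    assume the fingertip touches first, so the centroid of the earliest
    pixels sits at the tip end of the final contact region."""
    angle = principal_axis(full_pixels)
    ex = sum(x for x, _ in early_pixels) / len(early_pixels)
    ey = sum(y for _, y in early_pixels) / len(early_pixels)
    fx = sum(x for x, _ in full_pixels) / len(full_pixels)
    fy = sum(y for _, y in full_pixels) / len(full_pixels)
    # Drift vector from early centroid to full centroid: tip -> base.
    drift = math.atan2(fy - ey, fx - ex)
    # Flip the axis if it points opposite the drift direction.
    if math.cos(angle - drift) < 0:
        angle += math.pi
    return angle
```

For example, an elongated blob whose first-contacted pixels sit at its left end would be reported as pointing right (toward the finger base), while the same blob contacted from the right end first would be reported as pointing left.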

The algorithm was tested with eight adults who had no experience with touch screen controls. The five trials (disambiguation success, orientation stability, orientation precision, dynamic orientation precision, involuntary position rotation) all showed results between "good" and "excellent".

The paper follows the algorithm description with sections on what tasks are possible with this interface and what can be inferred from finger orientation.

Limitations of the algorithm include vertical presses, oblique touches that deviate from the usual approach, and fingers other than the index. These were of varying severity; the last was the least serious. The authors believe porting the algorithm to other hardware should require minimal tweaking.

Future work described in the paper is mostly aimed at building upon and expanding the concepts here (more precision, better geometry, multiple fingers from the same hand in use at once), rather than a completely new direction. In fact, this paper is clearly described as "proof of concept".

Discussion
This paper represents a large step forward in detail from the previous papers I have been reading, something I appreciate. This research isn't particularly exciting to me, being a refinement of something (touch screens) I have little knowledge about rather than a big leap or something in my areas of interest. It does, however, strike me as a good step toward significant improvement in its arena.

The best part of the paper was the attention to detail, while the strongest part of the algorithm, I feel, is its ability to dynamically track changing finger orientation; that seems like the most technically difficult piece of work.

Something I would have liked to see is the proficiency of test users with this system contrasted with the proficiency of test users on conventional touch screens.

The direction they are taking with the research seems about right. I am particularly fascinated by the idea of being able to use all ten fingers to increase proficiency and speed with such displays.

Picture is related because that's what time it was when I finished this entry. (12:18 was the first autosave)
Apologies for any (probably related) drop in quality from my previous writings.

1 comment:

  1. This is a great idea. I still have some problems with my iPod Touch every now and again when I try to zoom in or select text. Maybe these issues could be fixed if the orientation of the fingers could be recognized.
