Gaze Interaction for Handheld Multi-touch Devices - An Explorative Study
This thesis explores the potential of gaze interaction to replace touch interaction on handheld multi-touch devices. A set of three modules has been developed that use gaze as the interaction method. These modules were run through a usability test consisting of two iterations, in which they were compared against similar applications that use touch, keyboard, and mouse interaction. The research process begins by analyzing previous research on gaze and touch interaction techniques. By studying the iOS Human Interface Guidelines, a collection of basic touch gestures is identified, and the strengths and weaknesses of gaze interaction are examined in previous literature. These findings form the research foundation and have informed a selection of functionalities for gaze interaction. To cover these gaze gestures, three prototypes have been developed that serve as the basis for the usability test. The first iteration of the usability test includes 9 test cases divided into 3 groups, one for each interaction method: gaze, touch, and keyboard and mouse. Each group consists of a self-developed prototype together with two other applications using touch or keyboard-and-mouse interaction. In total, 11 individuals participated in this part of the usability test. Efficiency was used as the usability measure for the first iteration, with time spent as an essential factor. The results differed between the prototypes, and the differences appear to be caused by limitations in user experience, personal factors, hardware, and software. Effectiveness proved to be similar across the prototypes, and in a few cases the results indicate that gaze interaction is the superior interaction method. Gaze interaction proved intuitive and easy to learn, although user satisfaction was low in all test cases. With the first iteration completed, a second iteration adds a fourth module that combines the three previous prototypes.
Feedback from the first iteration was used to make improvements. A group of 4 participants tested this final module, which was also compared with the previously developed prototypes. The responses were mostly positive. To conclude, gaze interaction has great potential, but due to many factors it was difficult to justify it as superior to the existing interaction methods based on the usability test. In its current state, gaze interaction is unlikely to replace touch, keyboard, and mouse interaction. However, many of these issues can be addressed by improved hardware running dedicated software.