Monday, February 11, 2013

Reading Assignment: Visual Similarity of Pen Gestures

Reference Information
Title: Visual Similarity of Pen Gestures
Authors: A. Chris Long, Jr., James A. Landay, Lawrence A. Rowe, and Joseph Michiels
Citation: A. Chris Long, Jr., James A. Landay, Lawrence A. Rowe, and Joseph Michiels, "Visual Similarity of Pen Gestures," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 360-367, 2000.

Summary
This paper discusses a set of experiments conducted to create a model for predicting the perceived similarity of gestures. The results of the experiments were used to build quill, the gesture design tool discussed in the previous reading assignment. The motivation behind this research and the tool is that gestures are often difficult for users to remember and recognize. The authors therefore developed an algorithm to compute the similarity between gestures, with the goal of helping gesture designers create gestures that are easier for both humans and machines to recognize.

Two experiments with participants were conducted, each designed to determine which properties of a gesture lead a user to find it similar to other gestures. The experiments were designed with prior work in mind, including work on gesture features (such as Rubine's feature set), multidimensional scaling (MDS), and psychological research on perceived similarity. The first experiment showed participants sets of animated gestures and asked each participant to select the gesture in the set least similar to the others. From the resulting data, a set of features intended to measure similarity was derived, and a set of prediction equations was developed. In addition, it was determined that similarity judgments were participant-dependent. The second experiment was similar to the first, but it allowed the prediction equations from the first experiment to be tested. The predictions were found to work reasonably well, and perceived similarity could be reasonably related to the features computed for each gesture.
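To make the feature-based approach concrete, here is a minimal sketch in Python of how geometric features might be computed for a gesture and compared. The three features and the uniform weights are invented for illustration; they are Rubine-style measures, not the actual feature set or regression weights derived in the paper.

```python
import math

def features(points):
    """Compute a few illustrative geometric features of a stroke.

    `points` is a list of (x, y) tuples sampled along the gesture.
    These are hypothetical Rubine-style features, not the ones
    from the paper.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    # Total path length of the stroke.
    length = sum(math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1))
    # Angle of the bounding-box diagonal.
    bbox_angle = math.atan2(max(ys) - min(ys), max(xs) - min(xs))
    # Total absolute turning angle along the stroke.
    turning = 0.0
    for i in range(1, len(points) - 1):
        a1 = math.atan2(points[i][1] - points[i - 1][1],
                        points[i][0] - points[i - 1][0])
        a2 = math.atan2(points[i + 1][1] - points[i][1],
                        points[i + 1][0] - points[i][0])
        turning += abs(a2 - a1)
    return [length, bbox_angle, turning]

def dissimilarity(g1, g2, weights=(1.0, 1.0, 1.0)):
    """Weighted Euclidean distance between two feature vectors."""
    f1, f2 = features(g1), features(g2)
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(weights, f1, f2)))

line = [(0, 0), (1, 1), (2, 2), (3, 3)]    # straight diagonal
zigzag = [(0, 0), (1, 1), (2, 0), (3, 1)]  # back-and-forth stroke
print(dissimilarity(line, line))    # 0.0 for identical gestures
print(dissimilarity(line, zigzag))  # larger for dissimilar shapes
```

The paper's actual model fit weights to the experimental similarity judgments; a sketch like this just shows where such weights would plug in.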

Thoughts
It was very helpful to read the details of the experiments that were briefly mentioned in the previous reading assignment. It made the previous paper much easier to understand, and the amount of detail provided about these experiments was very welcome compared to the lack of detail in the previous paper. I found it very interesting that not only was prior work in gesture recognition used to design the experiments, but psychological research was considered as well. Developing features from experimental data seemed like a great idea, as did testing the results of the first experiment in a second one. It seemed like a very thorough development process.

Some of the experimental details were debatable, however. For instance, using only a student population for participants, while convenient, may not be the best representation of users of a gesture design tool. In addition, the fact that participants viewed animations rather than drawing the gestures themselves may have skewed the results. It would be interesting to conduct similar experiments that account for these factors to see whether the results change.
