CS349W Project3 jGestures

CS349W project 3 writeup

Introduction

We have developed jGestures, a jQuery plugin that enables mouse gestures in web applications. Mouse gestures have long been available in traditional single-user applications, and even in some browsers (e.g. Firefox), though only for browser commands (e.g. next/previous page, home). jGestures lets web developers integrate mouse gestures into their web applications easily and efficiently. Its main features are:

  • Runs completely on the client side, using JavaScript.
  • Highly configurable: Developers specify their own gestures, can customize the way gestures are drawn, and can tweak the recognition algorithm.
  • Fast, small code footprint (3.6KB minified).
  • Easy to integrate as a jQuery plug-in.

With its default configuration, the user draws a gesture with the mouse by pressing and holding the left button. The gesture is displayed on the screen as the user draws it. To finish a gesture, the user releases the mouse button. jGestures then tries to recognize the gesture, and initiates its associated action if the gesture is successfully recognized. To provide visual feedback to the user, the gesture changes its color to green if it is recognized, or to red if it is not, and disappears 200ms after the user finishes it.

Developer Interface

Specifying gestures

[Figure: Gesture sensitivity options]

jGestures allows the web developer to specify a series of custom gestures to recognize in a flexible way, using the addGesture method:

$(this).addGesture(points, directionSensitivity, proportionSensitivity, startSensitivity, name, handler);

The points argument specifies the sequence of points that form the gesture as an array of x-y coordinates. It can be simple, like a left-to-right stroke ([[0,0],[1,0]]), or more complex like a rectangle or circle. The coordinates can be in any system (e.g. [[0.5,0.5],[1000.5,0.5]] would also specify a left-right stroke). The next three options specify the sensitivity of the gesture to different aspects:

  • directionSensitivity: Whether the direction in which the gesture is drawn matters
  • proportionSensitivity: Whether the proportions of the gesture matter
  • startSensitivity: Whether the starting point of the gesture matters (useful for closed gestures, e.g. boxes or circles)

The name argument specifies a name for the gesture. Finally, handler is the event handler that will be called if the gesture is recognized. The handler function should take two arguments: the gesture detected, and the sequence of points captured.
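A minimal handler might look like the following sketch. The function name and message format are illustrative; the only assumption taken from the text is that the handler receives the recognized gesture (carrying the name given to addGesture) and the captured point sequence.

```javascript
// Hypothetical handler: first argument is the recognized gesture
// (exposing the name registered via addGesture), second is the
// sequence of points the user actually drew. Returning the message
// (rather than calling alert) keeps the sketch self-contained.
function onGesture(gesture, points) {
  return "Detected: " + gesture.name + " (" + points.length + " points)";
}

// It would be passed as the last argument of addGesture:
// $(this).addGesture([[0,0],[1,0]], true, true, true, "l_to_r", onGesture);
```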

Integrating jGestures

Integrating jGestures with your web application is very simple:

  1. Include the jQuery and jGestures JavaScript files.
    <script type="text/javascript" src="jquery.js"></script>
    <script type="text/javascript" src="jquery.gestures.js"></script>
  2. Optionally, modify the default configuration.
    $(this).gesturesConfig().strokeWidth = 5;
  3. Add your own gestures.
    var gHandler = function (gesture, points) {
      alert("Detected: " + gesture.name);
    };
    $(this).addGesture(
      [[0,0], [0,1], [1,1], [1,0], [0,0]],
      false, false, false, "Box", gHandler
    );
    $(this).addGesture(
      [[0,0], [1,0]],
      true, true, true, "L-R stroke", gHandler
    );
  4. Initialize the jGestures plug-in
    $(this).initGestures();

See the full API Reference for more information on using jGestures.

jGestures Use Cases

[Figure: jGestures in the Galleria photo gallery]
[Figure: jGestures controlling a Google Map]

To demonstrate some possible applications of mouse gestures and to evaluate the plugin's ease of use, we have integrated jGestures with two applications:

Galleria

The jQuery galleria plugin implements a browsable photo library. Mouse gestures can be used to traverse the gallery. Gestures are drawn by pressing the left button and holding it while drawing the gesture. Multiple gestures can be used:

  • Left-right stroke to go to the next image, right-left to go to the previous one
  • Up-down stroke to go to the first image, down-up to go to the last one
  • Up-down + left-right strokes (an "L") to jump forward 5 images, and up-down + right-left to jump back 5.

This integration required minimal effort (10 lines of code). The Galleria demo described can be seen here.
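The glue code for such an integration can be sketched as follows. The gesture names and the index arithmetic here are illustrative, not the actual demo code; the resulting index would then be handed to Galleria's navigation API.

```javascript
// Hypothetical mapping from gesture names to jumps in the image index.
var gestureSteps = {
  "l_to_r": +1,   // next image
  "r_to_l": -1,   // previous image
  "L":      +5,   // jump forward 5
  "rev_L":  -5    // jump back 5
};

// Compute the new image index for a recognized gesture,
// clamping to the valid range instead of wrapping around.
function nextIndex(current, gestureName, total) {
  var step = gestureSteps[gestureName] || 0;
  return Math.max(0, Math.min(total - 1, current + step));
}
```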

Google Maps

We used the Google Maps API to create a map whose zoom level can be controlled using mouse gestures. The recognized gesture, a box around a specific region of the map, causes the map to zoom in and focus on that region. To draw the box, left-click on the map, trace a rectangular shape around the desired region, then click again to end the gesture. The map will zoom in to the outlined area. Integrating gestures with Google Maps required somewhat more effort, because of conflicts with the event handlers provided by the Google Maps API and the more elaborate code required to zoom in. The Google Maps demo can be seen here.
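One building block of the zoom action is computing the bounding box of the captured gesture points. The sketch below works in page coordinates; converting the box to map coordinates and invoking the Maps API zoom calls is omitted.

```javascript
// Compute the axis-aligned bounding box of a drawn gesture.
// The result could then be translated into map coordinates and
// used to set the map's viewport.
function boundingBox(points) {
  var minX = Infinity, minY = Infinity, maxX = -Infinity, maxY = -Infinity;
  for (var i = 0; i < points.length; i++) {
    minX = Math.min(minX, points[i][0]);
    maxX = Math.max(maxX, points[i][0]);
    minY = Math.min(minY, points[i][1]);
    maxY = Math.max(maxY, points[i][1]);
  }
  return { minX: minX, minY: minY, maxX: maxX, maxY: maxY };
}
```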

Implementation

[Figure: Recognition process in jGestures]

We now discuss how jGestures captures and recognizes gestures. This process happens in four phases:

  1. Capture: By default, when initialized, jGestures installs handlers for the mousedown, mousemove and mouseup events to capture gestures (if these handlers disrupt normal operation, the methods gestureStart, gestureUpdate and gestureEnd can be used instead). A gesture is captured as the sequence of points the user draws. Comparing gestures in this raw form is hard, because the coordinates depend on the location and size of the drawn gesture, and the number of points is arbitrary. To solve this, the gesture is normalized.
  2. Normalization is a two-step process. First, the gesture is scaled and translated so that every point is in the range [0,400]x[0,400]. This eliminates the differences in position and size between gestures. Then, the gesture is re-sampled to have a fixed number of evenly spaced points (40 by default).
  3. Comparison: Once normalized, the gesture is compared against all the gestures specified, and a distance to each of them is computed.
  4. Detection: If the closest gesture is at a distance below a configurable threshold, it is considered a match, and its corresponding event handler is called.
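The normalization phase can be sketched in plain JavaScript. The 400-unit square and the 40-point default come from the text above; the linear-interpolation resampling details are an assumption about the implementation, not the plugin's actual code.

```javascript
// Euclidean distance between two [x, y] points.
function euclid(p, q) {
  return Math.sqrt(Math.pow(p[0] - q[0], 2) + Math.pow(p[1] - q[1], 2));
}

// Sketch of normalization: scale/translate the captured points into
// [0,size] x [0,size], then resample to numPoints evenly spaced points.
function normalize(points, size, numPoints) {
  size = size || 400;
  numPoints = numPoints || 40;
  // Step 1: scale and translate to remove position/size differences.
  var xs = points.map(function (p) { return p[0]; });
  var ys = points.map(function (p) { return p[1]; });
  var minX = Math.min.apply(null, xs), maxX = Math.max.apply(null, xs);
  var minY = Math.min.apply(null, ys), maxY = Math.max.apply(null, ys);
  var w = (maxX - minX) || 1, h = (maxY - minY) || 1;
  var pts = points.map(function (p) {
    return [(p[0] - minX) * size / w, (p[1] - minY) * size / h];
  });
  // Step 2: resample to numPoints points spaced evenly along the path.
  var total = 0;
  for (var i = 1; i < pts.length; i++) total += euclid(pts[i - 1], pts[i]);
  var interval = total / (numPoints - 1), acc = 0;
  var out = [pts[0]];
  for (var j = 1; j < pts.length; j++) {
    var d = euclid(pts[j - 1], pts[j]);
    if (acc + d >= interval && d > 0 && out.length < numPoints) {
      var t = (interval - acc) / d;
      var q = [pts[j - 1][0] + t * (pts[j][0] - pts[j - 1][0]),
               pts[j - 1][1] + t * (pts[j][1] - pts[j - 1][1])];
      out.push(q);
      pts.splice(j, 0, q); // continue walking from the new point
      acc = 0;
    } else {
      acc += d;
    }
  }
  while (out.length < numPoints) out.push(pts[pts.length - 1]);
  return out;
}
```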

Comparing gestures

The distance D(a,b) between two gestures a and b is computed as the sum of square distances between the individual points, i.e.:

D(a,b) = sum_{i=0}^{n-1} dist(a[i], b[i])
dist(p,q) = (p.x - q.x)^2 + (p.y - q.y)^2

where a[i], b[i], i = 0, ..., n-1 are the sequences of points of a and b.

To allow for gestures that are not sensitive to the starting point, we first find the point in b closest to a[0], call it b[c], rotate b so that b'[i] = b[(i+c) mod n], i = 0, ..., n-1, and compute D(a,b') instead. For gestures that are insensitive to direction, if dist(a[1], b[1]) > dist(a[1], b[n-1]), we reverse b so that b'[i] = b[(-i) mod n], i = 0, ..., n-1, and compute D(a,b').
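These index remappings can be sketched as follows. This is a sketch of the technique described (aligning a closed gesture's closest point to the front, and reversing traversal order), not the plugin's actual code.

```javascript
// Rotate a closed gesture so that points[c] becomes the first point,
// for starting-point-insensitive matching.
function rotate(points, c) {
  return points.map(function (_, i) {
    return points[(i + c) % points.length];
  });
}

// Reverse the traversal order (b'[i] = b[(-i) mod n]): the first
// point stays first, for direction-insensitive matching.
function reverse(points) {
  var n = points.length;
  return points.map(function (_, i) {
    return points[(n - i) % n];
  });
}
```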

Given the high overhead of executing interpreted JavaScript code, calculating the distances on each recognition, if done naively, could require a significant amount of computation even on modern machines, which would hinder interactivity. To aid performance, our algorithm has the following features:

  • It uses the sum of square distances, avoiding the use of square roots in the algorithm.
  • Points are compared in sequence instead of trying to find the closest pairs of points, so the algorithm is O(n).
  • Distance computation stops if the partial sum of point distances exceeds the threshold. This way, gestures that differ significantly from the drawn one are discarded early.
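The comparison with these optimizations can be sketched as below; the threshold parameter corresponds to the configurable maxThreshold described in the API reference.

```javascript
// Sum of squared point distances between two equal-length gestures,
// abandoning the comparison as soon as the running sum exceeds the
// threshold (early exit), and never taking a square root.
function squaredDistance(a, b, threshold) {
  var sum = 0;
  for (var i = 0; i < a.length; i++) {
    var dx = a[i][0] - b[i][0];
    var dy = a[i][1] - b[i][1];
    sum += dx * dx + dy * dy;
    if (sum > threshold) return Infinity; // discard this candidate early
  }
  return sum;
}
```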

In the end, these optimizations yield a high-performance implementation: with six registered gestures and 40 points per gesture, the capture and comparison process takes only 2.5 ms on average (measured on an Intel Core 2 at 2.4 GHz running Linux/Firefox 3.0). Subjectively, we found the algorithm to be accurate as well: when a correct gesture is drawn, its distance is usually 10x-100x smaller than that of the next-closest gesture.

Drawing gestures

We found it important to display the gesture as the user draws it. However, current browsers lack support for drawing on arbitrary parts of a page. Thus, we resort to using small, absolutely positioned div elements to draw the gesture, reusing code from the wz_jsgraphics library. This code uses Bresenham's line algorithm to minimize the number of divs used, which improves performance. Although this method works well for the gestures we have tried, more complex gestures may require hundreds of div elements, degrading browser performance. This could be improved with adequate browser support for drawing line overlays on a page.
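The rasterization step can be sketched with Bresenham's line algorithm. Note that this minimal version emits one grid cell per pixel; wz_jsgraphics additionally merges runs of pixels into single wider divs to keep the element count down.

```javascript
// Bresenham's line algorithm: returns the integer grid cells a
// stroke segment passes through. Each cell would become one
// absolutely positioned div when drawing the gesture.
function bresenham(x0, y0, x1, y1) {
  var cells = [];
  var dx = Math.abs(x1 - x0), dy = Math.abs(y1 - y0);
  var sx = x0 < x1 ? 1 : -1, sy = y0 < y1 ? 1 : -1;
  var err = dx - dy;
  while (true) {
    cells.push([x0, y0]);
    if (x0 === x1 && y0 === y1) break;
    var e2 = 2 * err;
    if (e2 > -dy) { err -= dy; x0 += sx; }
    if (e2 < dx)  { err += dx; y0 += sy; }
  }
  return cells;
}
```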

Discussion

The jGestures plug-in is a natural and interactive way to control certain web applications, as shown in the two demos above. The recognition algorithm works quite well, and we have seen good performance in terms of the time required per recognition (around 2.5 ms). The entire plug-in is small (about 3.6 KB minified), and being a jQuery extension allows developers to use it easily.

One potential point of debate is whether mouse gestures are a useful way to interact with a web application. For example, one might argue that in the Galleria application, it would be just as easy to have buttons/links to traverse the pictures instead of gestures. However, our implementation allows detection of closed polygons such as squares, which gives a developer the flexibility to add new interactions such as the zoom feature demonstrated in the Google Maps example. In addition, we feel that with the increasing use of embedded devices with touchscreens, such as the iPhone, mouse gestures will become more relevant.

API Reference

Function calls

One can be up and running with a simple "left to right" gesture by adding the following code:

$(this).addGesture([ [0,0] , [1,0] ], true, true, true, "l_to_r", handler_func);
$(this).initGestures();

Function | Arguments | Description
---------|-----------|------------
addGesture | coordinates, directionSensitive, proportionSensitive, startSensitive, name, handler | Add a custom gesture (see specifying gestures)
initGestures | None | Initialize the jGestures plug-in (handler registration, graphics set-up)
g_mouseDown | None | Default mousedown event handler. Begins recording coordinates and drawing the gesture.
g_mouseMove | None | Default mousemove event handler. Records a coordinate and updates the gesture drawing.
g_mouseUp | None | Default mouseup event handler. Runs the matching algorithm on the points, determines the result, and clears the drawing.
gestureStart | x, y | Starts gesture capture. Use when the default mouse handlers cause conflicts.
gestureUpdate | x, y | Adds a point to the current gesture.
gestureEnd | x, y | Finishes gesture capture.
gestureOn | None | Returns true if currently capturing a gesture, false otherwise.


Configuration

Configuration of the jGestures plugin is available via modification of the config object. This object can be accessed by calling gesturesConfig(). For example, changing the number of points to 30 would simply require the following code:

$(this).gesturesConfig().numPoints = 30;

Config Option | Default Value | Description
--------------|---------------|------------
numPoints | 40 | Number of coordinates used in the internal representation of each gesture
maxThreshold | 5000*numPoints | Allowable difference in the sum of squared deltas; any greater error results in no match
xLen, yLen | 400, 400 | Dimensions of the canvas surface to which all gestures are normalized
registerDefaultHandlers | true | Determines whether to register the default jGestures handlers
mouseButton | 0 | Mouse button to use for gestures (Firefox only): 0 = left, 1 = middle, 2 = right
drawGesture | true | Specifies whether to draw gestures
strokeColor | yellow | Valid if drawGesture is true. Color used to draw the gesture
strokeWidth | 3 | Integer width of the lines used to draw gestures
detectedColor | green | The drawn gesture changes to this color if a gesture is matched
notDetectedColor | red | The drawn gesture changes to this color if no gesture matches
afterDetectionDisplay | 200 (ms) | Milliseconds to show the gesture in its recognized/non-recognized color after capture finishes
