Multi-touch interaction

Pointer events extend DOM input events to support various pointing input devices, such as pen/stylus and touch screens, as well as mouse. The pointer is a hardware-agnostic device that can target a specific set of screen coordinates. Having a single event model for pointers can simplify creating Web sites and applications, and provide a good user experience regardless of the user's hardware.
Guide Pointer Events touch
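
A minimal sketch of that single event model (the element id is an assumption for illustration): one pointerdown handler covers mouse, pen, and touch, and pointerType reports which kind of device produced the event.

```js
// Minimal sketch: one handler receives mouse, pen, and touch input alike.
// The "target" element id is assumed for illustration.
const target = document.getElementById("target");

target.addEventListener("pointerdown", (event) => {
  // pointerType is "mouse", "pen", or "touch"
  console.log(`${event.pointerType} down at (${event.clientX}, ${event.clientY})`);
});
```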

Pinch zoom gestures

Adding gestures to an application can significantly improve the user experience. There are many types of gestures, from the simple single-touch swipe gesture to the more complex multi-touch twist gesture, where the touch points (aka pointers) move in different directions.
Guide PointerEvent touch
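
A rough sketch of a two-pointer pinch detector along the lines described above, assuming a page element with id "zoom-target": the distance between the two active pointers is compared on each move, and a growing distance is treated as a pinch-out (zoom in).

```js
// Sketch: track active pointers by pointerId and compare the distance
// between the two contacts on each move. Element id is an assumption.
const zoomTarget = document.getElementById("zoom-target");
const activePointers = new Map();
let previousDistance = -1;

zoomTarget.addEventListener("pointerdown", (e) => activePointers.set(e.pointerId, e));

zoomTarget.addEventListener("pointermove", (e) => {
  if (!activePointers.has(e.pointerId)) return;
  activePointers.set(e.pointerId, e);
  if (activePointers.size !== 2) return;

  const [a, b] = [...activePointers.values()];
  const distance = Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
  if (previousDistance > 0) {
    console.log(distance > previousDistance ? "pinch out (zoom in)" : "pinch in (zoom out)");
  }
  previousDistance = distance;
});

const endPointer = (e) => {
  activePointers.delete(e.pointerId);
  previousDistance = -1;
};
zoomTarget.addEventListener("pointerup", endPointer);
zoomTarget.addEventListener("pointercancel", endPointer);
```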

Using Pointer Events

This document demonstrates how to use pointer events and the <canvas> element to build a multi-touch enabled drawing application. The example is essentially the same as the application described in the Touch events Overview, except that it uses the pointer events input event model instead of touch events. Another difference is that, because pointer events are pointer-device agnostic, the application accepts not only touch input but also pen and mouse input, the latter two essentially for free.
Guide PointerEvent touch
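
A heavily reduced sketch of that idea, assuming a <canvas id="canvas"> element: every pointermove with a contact or button down paints a dot, and the same handler serves touch, pen, and mouse.

```js
// Sketch only: paint a dot wherever a pressed pointer moves over the canvas.
// Works for touch, pen, and mouse because Pointer Events are device agnostic.
const canvas = document.getElementById("canvas"); // assumed element
const ctx = canvas.getContext("2d");
canvas.style.touchAction = "none"; // keep the browser from panning/zooming instead

canvas.addEventListener("pointermove", (event) => {
  if (event.buttons === 0) return; // draw only while a button/contact is down
  const rect = canvas.getBoundingClientRect();
  ctx.beginPath();
  ctx.arc(event.clientX - rect.left, event.clientY - rect.top, 3, 0, 2 * Math.PI);
  ctx.fill();
});
```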

Multi-touch interaction

The touch event interfaces support application-specific single and multi-touch interactions. However, the interfaces can be a bit tricky for programmers to use because touch events are very different from other DOM input events, such as mouse events. The application described in this guide shows how to use touch events for simple single and multi-touch interactions, the basics needed to build application-specific gestures.
Guide touch TouchEvent
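
A minimal sketch of single and multi-touch tracking with Touch Events, assuming an element with id "surface": each changed touch point is tracked by its identifier so several fingers can be followed independently.

```js
// Sketch: track each touch point by its identifier. Element id is assumed.
const surface = document.getElementById("surface");
const ongoingTouches = new Map();

surface.addEventListener("touchstart", (event) => {
  event.preventDefault();
  for (let i = 0; i < event.changedTouches.length; i++) {
    const t = event.changedTouches[i];
    ongoingTouches.set(t.identifier, { x: t.clientX, y: t.clientY });
  }
});

surface.addEventListener("touchend", (event) => {
  for (let i = 0; i < event.changedTouches.length; i++) {
    ongoingTouches.delete(event.changedTouches[i].identifier);
  }
});
```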

Supporting both TouchEvent and MouseEvent

The touch interfaces enable applications to create enhanced user experiences on touch-enabled devices. However, the reality is that the vast majority of today's web content is designed to work only with mouse input. Consequently, even if a browser supports touch, it must still emulate mouse events so that content which assumes mouse-only input works as-is, without direct modification.
Guide touch TouchEvent
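
A small sketch of how that interaction is usually handled (the element id and doAction are placeholder names): calling preventDefault() in the touchstart handler stops the browser from also firing the emulated mouse events for the same contact, so the action does not run twice on touch devices.

```js
// Sketch: handle touch and mouse without double-firing. "action" and
// doAction() are hypothetical names used for illustration.
const button = document.getElementById("action");

button.addEventListener(
  "touchstart",
  (event) => {
    event.preventDefault(); // suppress the emulated mousedown/mouseup/click
    doAction();
  },
  { passive: false } // preventDefault() is ignored in passive listeners
);

button.addEventListener("mousedown", () => {
  doAction(); // still runs for real mouse input
});

function doAction() {
  console.log("activated");
}
```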

Using Touch Events

Today, most Web content is designed for keyboard and mouse input. However, devices with touch screens (especially portable devices) are mainstream, and Web applications can either process touch-based input directly by using Touch Events, or rely on the browser's interpreted mouse events. A disadvantage of using mouse events is that they do not support concurrent user input, whereas touch events support multiple simultaneous inputs (possibly at different locations on the touch surface), thus enhancing user experiences.
Guide touch TouchEvent
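
A minimal sketch of the concurrent-input point above, with an assumed element id: a single touchmove event exposes every active contact through its touches list, something a mouse event cannot do.

```js
// Sketch: read all simultaneous contact points from one event.
const area = document.getElementById("touch-area"); // assumed element

area.addEventListener("touchmove", (event) => {
  for (let i = 0; i < event.touches.length; i++) {
    const t = event.touches[i];
    console.log(`touch ${t.identifier} at (${t.clientX}, ${t.clientY})`);
  }
});
```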

Touch.screenX

Returns the X coordinate of the touch point relative to the screen, not including any scroll offset.
API DOM Mobile Property touch

Touch.screenY

Returns the Y coordinate of the touch point relative to the screen, not including any scroll offset.
API DOM Mobile Property touch
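
A short sketch covering both Touch.screenX and Touch.screenY above (the element id is an assumption):

```js
// Sketch: log the screen-relative position of the first changed touch point.
document.getElementById("pad").addEventListener("touchstart", (event) => {
  const touch = event.changedTouches[0];
  console.log(`screen position: (${touch.screenX}, ${touch.screenY})`);
});
```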

Touch.clientX

The Touch.clientX read-only property returns the X coordinate of the touch point relative to the viewport, not including any scroll offset.
API Property Read-only Reference touch

Touch.clientY

The Touch.clientY read-only property returns the Y coordinate of the touch point relative to the browser's viewport, not including any scroll offset.
API Property Read-only Reference touch
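
A sketch for Touch.clientX and Touch.clientY together, assuming a position: fixed marker element: because the values are viewport-relative, they map directly onto fixed positioning.

```js
// Sketch: move an assumed position: fixed marker under the first touch point.
const marker = document.getElementById("marker");

document.addEventListener("touchmove", (event) => {
  const touch = event.touches[0];
  marker.style.left = `${touch.clientX}px`;
  marker.style.top = `${touch.clientY}px`;
});
```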

Touch.pageX

The Touch.pageX read-only property returns the X coordinate of the touch point relative to the document, that is, the viewport coordinate plus any horizontal scroll offset.
API Property Read-only Reference touch

Touch.pageY

The Touch.pageY read-only property returns the Y coordinate of the touch point relative to the document, that is, the viewport coordinate plus any vertical scroll offset.
API Property Read-only Reference touch
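
A sketch contrasting Touch.pageX/Touch.pageY with the client coordinates above: subtracting the current scroll offset from the page coordinate recovers the viewport coordinate.

```js
// Sketch: pageX/pageY locate the touch within the whole document.
document.addEventListener("touchstart", (event) => {
  const touch = event.changedTouches[0];
  console.log(`document position: (${touch.pageX}, ${touch.pageY})`);
  console.log(`viewport position: (${touch.pageX - window.scrollX}, ${touch.pageY - window.scrollY})`);
});
```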

Touch.radiusX

Returns the X radius of the ellipse that most closely circumscribes the area of contact with the touch surface. The value is in CSS pixels of the same scale as Touch.screenX.
API DOM Experimental Mobile Property touch

Touch.radiusY

Returns the Y radius of the ellipse that most closely circumscribes the area of contact with the touch surface. The value is in CSS pixels of the same scale as Touch.screenX.
API DOM Experimental Mobile Property touch

Touch.rotationAngle

Returns the rotation angle, in degrees, of the contact area ellipse defined by Touch.radiusX and Touch.radiusY. The value may be between 0 and 90. Together, these three values describe an ellipse that approximates the size and shape of the area of contact between the user and the screen. This may be a relatively large ellipse representing the contact between a fingertip and the screen or a small area representing the tip of a stylus, for example.
API DOM Experimental Mobile Property touch
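
A sketch using the three experimental contact-geometry properties above (Touch.radiusX, Touch.radiusY, Touch.rotationAngle) to outline the approximate contact area on an assumed <canvas id="canvas">, with fallback values where they are unsupported:

```js
// Sketch: draw the approximate contact ellipse for the first changed touch.
const canvas = document.getElementById("canvas"); // assumed element
const ctx = canvas.getContext("2d");

canvas.addEventListener("touchstart", (event) => {
  const touch = event.changedTouches[0];
  const rect = canvas.getBoundingClientRect();
  ctx.beginPath();
  ctx.ellipse(
    touch.clientX - rect.left,
    touch.clientY - rect.top,
    touch.radiusX || 10, // fall back to a nominal radius if unsupported
    touch.radiusY || 10,
    ((touch.rotationAngle || 0) * Math.PI) / 180, // ellipse() takes radians
    0,
    2 * Math.PI
  );
  ctx.stroke();
});
```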

Touch.target

Returns the Element (EventTarget) on which the touch contact started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of that element or even been removed from the document. Note that if the target element is removed from the document, events will still be targeted at it, and hence won't necessarily bubble up to the window or document anymore. If there is any risk of an element being removed while it is being touched, the best practice is to attach the touch listeners directly to the target.
API DOM EventTarget Mobile Property touch
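
A sketch of the best practice described above, with an assumed element id: the touch listeners are attached directly to the element the touch starts on, so later events in the gesture are still handled even if that element is removed from the document mid-touch.

```js
// Sketch: listen on the element itself rather than on document, so events
// for a touch that began on it keep reaching these handlers. "item" is an
// assumed element id.
const item = document.getElementById("item");

item.addEventListener("touchstart", (event) => {
  console.log("touch started on", event.target);
});

item.addEventListener("touchend", (event) => {
  // Still fires for a touch that started on "item", even if "item" was
  // removed from the document while the finger was down.
  console.log("touch ended; target is still", event.target);
});
```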