feat(Touch): added touch events to tools (#247)
* feat: added touch to view tools and annotation tools

Native touch is now enabled for view tools (pan/scroll/zoom/magnify/etc.)
and annotation tools (length, rect, ellipse, freehandroi, etc.).

Touch events consist of:
 - TOUCH_START
 - TOUCH_START_ACTIVATE
 - TOUCH_PRESS
 - TOUCH_DRAG
 - TOUCH_END
 - TOUCH_TAP
 - TOUCH_SWIPE

All touch events support multi-touch. For easy compatibility with mouse
tools, multi-touch points are "reduced" by taking the mean of the touch
points. Other reduction strategies are possible, such as taking the first
point, and future work may make these reduction strategies configurable.

Touch drag calculates pinch from the average change in distance among the
touch points. Rotation is possible by translating points by their mean,
matching touch identifiers (which remain consistent as a user adds/removes
fingers), and calculating each point's theta relative to the mean point. The
per-point thetas can then be averaged to get a final rotation.
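
The rotation step above can be sketched as follows. This is a hedged illustration only, not the library code: the point shape `{identifier, x, y}` and the helper names are hypothetical.

```js
// Sketch: match points across frames by touch identifier, compute each
// point's angle about its frame's mean point, and average the wrapped
// per-point angle changes to get the rotation delta (radians).
function meanOf(points) {
  const n = points.length;
  return points
    .reduce((a, p) => [a[0] + p.x, a[1] + p.y], [0, 0])
    .map((v) => v / n);
}

function rotationDelta(lastPoints, currentPoints) {
  const [lmx, lmy] = meanOf(lastPoints);
  const [cmx, cmy] = meanOf(currentPoints);
  const byId = new Map(lastPoints.map((p) => [p.identifier, p]));
  let sum = 0;
  let n = 0;
  for (const p of currentPoints) {
    const prev = byId.get(p.identifier);
    if (!prev) continue; // identifiers stay stable as fingers come and go
    const thetaPrev = Math.atan2(prev.y - lmy, prev.x - lmx);
    const thetaCurr = Math.atan2(p.y - cmy, p.x - cmx);
    let d = thetaCurr - thetaPrev;
    d = Math.atan2(Math.sin(d), Math.cos(d)); // wrap to (-pi, pi]
    sum += d;
    n += 1;
  }
  return n ? sum / n : 0;
}
```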

The touch event dispatcher checks for the activeTool when looking for
handles/overlay annotations on a touch start, because there is no "touch move"
event to highlight handles. Proximity is also increased for touch
interactions.

Touch events also provide force/radius parameters. Future work can expose
these as binding options.

Touch based segmentation is not yet implemented.

* fix(touch): make the touch events fire correctly for primary

Touch events weren't listening to key modifiers and the primary button.

* fix(htj2k):Support htj2k in the streaming volume loader

* fix import

* Update for PR

* Event Types refactored. NormalizedMouseEventDetail and NormalizedTouchEventDetail merged into NormalizedInteractionEventDetail

* Solving Rebase

* Gradienthealth/change types (#3)

* Event Types refactored. NormalizedMouseEventDetail and NormalizedTouchEventDetail merged into NormalizedInteractionEventDetail

* Types changed

* Types updated

* Refactored

* InteractionEventType included

* Event types refactored

* deleted as with main

* Solving conflicts

* Fix setToolActive logic

* Normalize InteractionEventType for tools

* Update changes

* Docs Updated

* Docs Updated

* Build conflicts solved

* API Updated

* Tools changes

* Listeners changed

* Event changed

* Listeners Reorganized

* postTouchDownCallback function fixed

* postTouchStartCallback function fixed

* added documentation and minor nit

* docs: added todo and example

* removed import

---------
Co-authored-by: Diego Hennrich <[email protected]>
Ouwen authored Jan 27, 2023
1 parent 23cdffb commit e35f963
Showing 69 changed files with 3,025 additions and 731 deletions.
464 changes: 308 additions & 156 deletions common/reviews/api/tools.api.md


218 changes: 218 additions & 0 deletions packages/docs/docs/concepts/cornerstone-tools/touch.md
---
id: touchEvents
title: TouchEvents
---

# Touch Events

Touch events are fired when the user touches the device with one or more touch points, such as a finger or stylus. The flow of touch events is the following:

1. `TOUCH_START`
2. `TOUCH_START_ACTIVATE`
3. optional: `TOUCH_PRESS`
4. optional: `TOUCH_DRAG`
5. optional: `TOUCH_SWIPE`
6. `TOUCH_END`

Every time a user places a finger down and lifts it up, the touch events will follow the order above. Touch events are mutually exclusive with click events.

The other touch event that can occur is `TOUCH_TAP`. A `TOUCH_TAP` does not trigger the `TOUCH_START` event flow. If the user taps successively, only one `TOUCH_TAP` event fires, carrying the count of how many times the user tapped.
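
As an illustration only (the real dispatcher's internals may differ; `TAP_WINDOW_MS` and `coalesceTaps` are hypothetical names), successive taps can be coalesced into one count like this:

```js
const TAP_WINDOW_MS = 300; // hypothetical window between successive taps

// Count how many consecutive taps land within TAP_WINDOW_MS of the
// previous one; the result would be dispatched as a single TOUCH_TAP
// carrying this count.
function coalesceTaps(tapTimestamps) {
  let count = 0;
  let last = -Infinity;
  for (const t of tapTimestamps) {
    count = t - last <= TAP_WINDOW_MS ? count + 1 : 1;
    last = t;
  }
  return count;
}
```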

| EVENT                  | Description                                                                                                                                                          |
| ---------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `TOUCH_START`          | Triggers if the user places a touch point down for > 50ms.                                                                                                             |
| `TOUCH_START_ACTIVATE` | Triggers only if no tool stopped propagation of the `TOUCH_START` event. It is useful for differentiating between touching an existing annotation and needing to create a new annotation. |
| `TOUCH_PRESS`          | Triggers if the user places a touch point down and does not move it for > 700ms.                                                                                       |
| `TOUCH_DRAG`           | Triggers any time the user moves a touch point; it may occur before `TOUCH_PRESS`, since the `TOUCH_PRESS` event tolerates some motion.                                |
| `TOUCH_SWIPE`          | Triggers alongside `TOUCH_DRAG` if the user moves more than `100px` within a single drag cycle.                                                                        |
| `TOUCH_END`            | Triggers when the user lifts one or more of their touch points.                                                                                                        |
| `TOUCH_TAP`            | Triggers when the user makes contact with the screen for less than 50ms − 10ms (a buffer from the `TOUCH_START` threshold).                                            |

## Multitouch

Touch events natively support multitouch, which is provided as a list of [`ITouchPoints[]`](api/tools/namespace/Types#ITouchPoints).
For touch events to be compatible with mouse events, these `ITouchPoints[]` need to be reduced to a single
`ITouchPoints`. The current reduction strategy takes the mean of the coordinate values. Other strategies, such as
taking the first or median point, can be implemented in the
[`touch` utilities codebase](https://github.com/cornerstonejs/cornerstone3D-beta/main/packages/tools/src/utilities/touch/index.ts).
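
A minimal sketch of the default mean-reduction idea (only the `canvas` field of `ITouchPoints` is shown for brevity; `meanReduce` is a hypothetical name, not the utility's actual API):

```js
// Collapse an array of touch points into one point by averaging each
// coordinate; other strategies (first point, median point) would swap
// out only this function.
function meanReduce(touchPoints) {
  const n = touchPoints.length;
  const sum = touchPoints.reduce(
    (acc, p) => [acc[0] + p.canvas[0], acc[1] + p.canvas[1]],
    [0, 0]
  );
  return { canvas: [sum[0] / n, sum[1] / n] };
}
```

For example, `meanReduce([{ canvas: [0, 0] }, { canvas: [10, 20] }])` yields a point at `[5, 10]`.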

The structure of `ITouchPoints` is the following:

```js
type ITouchPoints = {
  /** page coordinates of the point */
  page: Types.Point2,
  /** client coordinates of the point */
  client: Types.Point2,
  /** canvas coordinates of the point */
  canvas: Types.Point2,
  /** world coordinates of the point */
  world: Types.Point3,

  /** Native Touch object properties which are JSON serializable */
  touch: {
    identifier: string,
    radiusX: number,
    radiusY: number,
    force: number,
    rotationAngle: number,
  },
};
```

## Multitouch Drag Calculations

`TOUCH_DRAG` events have the following structure:

```js
type TouchDragEventDetail = NormalizedTouchEventDetail & {
  /** The starting points of the touch event. */
  startPoints: ITouchPoints,
  /** The last points of the touch. */
  lastPoints: ITouchPoints,
  /** The current touch position. */
  currentPoints: ITouchPoints,

  /** The starting points list of the touch event. */
  startPointsList: ITouchPoints[],
  /** The last points list of the touch. */
  lastPointsList: ITouchPoints[],
  /** The current touch position list. */
  currentPointsList: ITouchPoints[],

  /** The difference between the current and last points. */
  deltaPoints: IPoints,
  /** The difference between the average distances of the current and last points. */
  deltaDistance: IDistance,
};
```

`deltaPoints` is the difference between the mean coordinate point of `lastPointsList` and that of `currentPointsList`.
`deltaDistance` is the difference between the average distance among the points in `lastPointsList` and in `currentPointsList`.
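
The `deltaDistance` idea can be sketched like this (an illustration of the description above, not the library implementation; the function names are hypothetical and points are plain `[x, y]` arrays):

```js
// Mean (centroid) of a list of [x, y] points.
function meanPoint(points) {
  const n = points.length;
  return points
    .reduce((a, p) => [a[0] + p[0], a[1] + p[1]], [0, 0])
    .map((v) => v / n);
}

// Average distance of each point from the centroid — a pinch "spread".
function avgDistanceFromMean(points) {
  const m = meanPoint(points);
  const total = points.reduce(
    (sum, p) => sum + Math.hypot(p[0] - m[0], p[1] - m[1]),
    0
  );
  return total / points.length;
}

// Positive when the fingers spread apart (zoom in), negative when they
// pinch together (zoom out).
function deltaDistance(lastPoints, currentPoints) {
  return avgDistanceFromMean(currentPoints) - avgDistanceFromMean(lastPoints);
}
```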

## Usage

You can add an event listener to the viewport element for the event you are interested in.

```js
import Events from '@cornerstonejs/tools/enums/Events';

// element is the cornerstone viewport element
element.addEventListener(Events.TOUCH_DRAG, (evt) => {
  // my function on drag
  console.log(evt);
});

element.addEventListener(Events.TOUCH_SWIPE, (evt) => {
  // my function on swipe
  console.log(evt);
});

// within the Chrome console in a deployed OHIF application
cornerstone
  .getEnabledElements()[0]
  .viewport.element.addEventListener(Events.TOUCH_SWIPE, (evt) => {
    // my function on swipe
    console.log('SWIPE', evt);
  });
```

A full example can be found by running
`yarn run example stackManipulationToolsTouch` whose source is [here](https://github.com/gradienthealth/cornerstone3D-beta/blob/gradienthealth/added_touch_events/packages/tools/examples/stackManipulationToolsTouch/index.ts)

## Binding

Touch tools are bound depending on the number of touch points placed down.
In the future, bindings may also be filtered by force, as well as by radius (stylus detection).
`numTouchPoints` can be as large as the hardware supports.

```js
// Add tools to Cornerstone3D
cornerstoneTools.addTool(PanTool);
cornerstoneTools.addTool(WindowLevelTool);
cornerstoneTools.addTool(StackScrollTool);
cornerstoneTools.addTool(ZoomTool);

// Define a tool group, which defines how mouse/touch events map to tool
// commands for any viewport using the group
const toolGroup = ToolGroupManager.createToolGroup(toolGroupId);

// Add tools to the tool group
toolGroup.addTool(WindowLevelTool.toolName);
toolGroup.addTool(PanTool.toolName);
toolGroup.addTool(ZoomTool.toolName);
toolGroup.addTool(StackScrollTool.toolName);

// Set the initial state of the tools; here all tools are active and bound to
// different touch inputs. Any number of touch points is supported, though it
// is generally limited by hardware.
toolGroup.setToolActive(ZoomTool.toolName, {
  bindings: [{ numTouchPoints: 2 }],
});
toolGroup.setToolActive(StackScrollTool.toolName, {
  bindings: [{ numTouchPoints: 3 }],
});
toolGroup.setToolActive(WindowLevelTool.toolName, {
  bindings: [
    {
      mouseButton: MouseBindings.Primary, // special condition for one finger touch
    },
  ],
});
```

`MouseBindings.Primary` is a special binding type that will
automatically bind single-finger touch.
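
A hypothetical sketch of that matching rule (not the actual ToolGroup resolver; the names are made up): explicit `numTouchPoints` bindings match their pointer count, while a `MouseBindings.Primary` binding also matches single-finger touch.

```js
const PRIMARY = 1; // stand-in for MouseBindings.Primary

// Returns true when any binding in the list matches a touch interaction
// with the given number of touch points.
function matchesTouch(bindings, numTouchPoints) {
  return bindings.some(
    (b) =>
      b.numTouchPoints === numTouchPoints ||
      (b.mouseButton === PRIMARY && numTouchPoints === 1)
  );
}
```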

## Touch and Mouse Event Analogs

Touch and mouse events share a lot of overlapping inheritance, and most touch
events have a mouse event analog:

| TOUCH EVENT | MOUSE_EVENT |
| ---------------------- | --------------------- |
| `TOUCH_START` | `MOUSE_DOWN` |
| `TOUCH_START_ACTIVATE` | `MOUSE_DOWN_ACTIVATE` |
| `TOUCH_PRESS` | N/A |
| `TOUCH_DRAG` | `MOUSE_DRAG` |
| `TOUCH_SWIPE` | N/A |
| `TOUCH_END` | `MOUSE_UP` |
| `TOUCH_TAP` | `MOUSE_CLICK` |

The main difference between touch events and mouse events is that touch events
can have multiple pointers (multi-touch). Touch events automatically reduce
multiple pointers to a single point value; by default, the points are reduced
by taking their mean. The reduced point can be used as an `IPoints` or an
`ITouchPoints`, depending on whether touch information is needed.

When multiple touch points are needed, they are accessible in list form.

```js
type MousePointsDetail = {
  /** The starting points of the mouse event. */
  startPoints: IPoints,
  /** The last points of the mouse. */
  lastPoints: IPoints,
  /** The current mouse position. */
  currentPoints: IPoints,
  /** The difference between the current and last points. */
  deltaPoints: IPoints,
};

type TouchPointsDetail = {
  /** The starting points of the touch event. */
  startPoints: ITouchPoints,
  /** The last points of the touch. */
  lastPoints: ITouchPoints,
  /** The current touch position. */
  currentPoints: ITouchPoints,

  /** The starting points list of the touch event. */
  startPointsList: ITouchPoints[],
  /** The last points list of the touch. */
  lastPointsList: ITouchPoints[],
  /** The current touch position list. */
  currentPointsList: ITouchPoints[],

  /** The difference between the current and last points. */
  deltaPoints: IPoints,
  /** The difference between the average distances of the current and last points. */
  deltaDistance: IDistance,
};
```
143 changes: 143 additions & 0 deletions packages/tools/examples/stackManipulationToolsTouch/index.ts
import { RenderingEngine, Types, Enums } from '@cornerstonejs/core';
import {
  initDemo,
  createImageIdsAndCacheMetaData,
  setTitleAndDescription,
} from '../../../../utils/demo/helpers';
import * as cornerstoneTools from '@cornerstonejs/tools';

// This is for debugging purposes
console.warn(
  'Click on index.ts to open source code for this example --------->'
);

const {
  PanTool,
  WindowLevelTool,
  StackScrollTool,
  ZoomTool,
  ToolGroupManager,
  Enums: csToolsEnums,
} = cornerstoneTools;

const { ViewportType } = Enums;
const { MouseBindings } = csToolsEnums;

// ======== Set up page ======== //
setTitleAndDescription(
  'Basic Stack Manipulation',
  'Manipulation tools for a stack viewport'
);

document.body.style.overflow = 'hidden';
document.body.style.touchAction = 'none';

const content = document.getElementById('content');
const element = document.createElement('div');

// Disable right click context menu so we can have right click tools
element.oncontextmenu = (e) => e.preventDefault();

element.id = 'cornerstone-element';
element.style.width = '100vw';
element.style.height = '300px';
document.body.style.margin = '0';

content.appendChild(element);

const instructions = document.createElement('p');
instructions.innerText = `
One finger will change WW/WL,
Two fingers will allow for pan/zoom,
Three fingers will allow for stack scrolling
`;

content.append(instructions);
// ============================= //

/**
 * Runs the demo
 */
async function run() {
  // Init Cornerstone and related libraries
  await initDemo();

  const toolGroupId = 'STACK_TOOL_GROUP_ID';

  // Add tools to Cornerstone3D
  cornerstoneTools.addTool(PanTool);
  cornerstoneTools.addTool(WindowLevelTool);
  cornerstoneTools.addTool(StackScrollTool);
  cornerstoneTools.addTool(ZoomTool);

  // Define a tool group, which defines how mouse/touch events map to tool
  // commands for any viewport using the group
  const toolGroup = ToolGroupManager.createToolGroup(toolGroupId);

  // Add tools to the tool group
  toolGroup.addTool(WindowLevelTool.toolName);
  toolGroup.addTool(PanTool.toolName);
  toolGroup.addTool(ZoomTool.toolName);
  toolGroup.addTool(StackScrollTool.toolName);

  // Set the initial state of the tools; here all tools are active and bound
  // to different touch inputs
  toolGroup.setToolActive(ZoomTool.toolName, {
    bindings: [{ numTouchPoints: 2 }],
  });
  toolGroup.setToolActive(StackScrollTool.toolName, {
    bindings: [{ numTouchPoints: 3 }],
  });
  toolGroup.setToolActive(WindowLevelTool.toolName, {
    bindings: [
      {
        mouseButton: MouseBindings.Primary, // single-finger touch / left click
      },
    ],
  });

  // Get Cornerstone imageIds and fetch metadata into RAM
  const imageIds = await createImageIdsAndCacheMetaData({
    StudyInstanceUID:
      '1.3.6.1.4.1.14519.5.2.1.7009.2403.334240657131972136850343327463',
    SeriesInstanceUID:
      '1.3.6.1.4.1.14519.5.2.1.7009.2403.226151125820845824875394858561',
    wadoRsRoot: 'https://d3t6nz73ql33tx.cloudfront.net/dicomweb',
  });

  // Instantiate a rendering engine
  const renderingEngineId = 'myRenderingEngine';
  const renderingEngine = new RenderingEngine(renderingEngineId);

  // Create a stack viewport
  const viewportId = 'CT_STACK';
  const viewportInput = {
    viewportId,
    type: ViewportType.STACK,
    element,
    defaultOptions: {
      background: <Types.Point3>[0.2, 0, 0.2],
    },
  };

  renderingEngine.enableElement(viewportInput);

  // Set the tool group on the viewport
  toolGroup.addViewport(viewportId, renderingEngineId);

  // Get the stack viewport that was created
  const viewport = <Types.IStackViewport>(
    renderingEngine.getViewport(viewportId)
  );

  // Define a stack containing three images
  const stack = [imageIds[0], imageIds[1], imageIds[2]];

  // Set the stack on the viewport
  viewport.setStack(stack);

  // Render the image
  viewport.render();
}

run();