
Interaction Transformation Nodes

Table of Contents

  1. Touch Translation
  2. Touch Rotation
  3. Touch Orbit
  4. Touch Transformation
  5. Touch Transformation 3D
  6. Resetting
  7. Customized Motion
  8. Physics Simulation

Touch Translation Single touch gesture to translate an axis.
Touch Rotation Single touch gesture to rotate around an axis.
  • Touch Orbit Single touch gesture to rotate around multiple axes at the same time.
Touch Transformation Two-touch gesture to translate, rotate and scale at the same time.
Touch Transformation 3D Two-touch gesture to translate, rotate and scale at the same time - especially built for use in VR.

These five interaction nodes form the set of transformation gestures. They operate in an object-based fashion: to activate one of them, a touch has to hit a mesh/font/volume that is part of the subtree spanned by the interaction node. At first glance it might seem an unnecessary duplication to have dedicated translation and rotation nodes when the Touch Transformation can handle all of these transformations. However, each is custom-tailored to particular use cases and achieves a different effect.

The transformation nodes are derived from Touch Button and as such inherit the same properties and events to detect whether a touch is hovering inside the active area of the node or actually pressing it. These can, for example, be used to add custom visual feedback when the user activates the transformation node.

All of the transformation nodes have in common that the values they generate are computed by intersecting a viewing ray (starting at the camera and going through the touch position) with the X/Y-plane of the coordinate system active at the location of the node in the hierarchy and then mapping this value. The underlying meshes/fonts/volumes are only used to decide which interaction node is activated and will not affect the value generation.
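The ray/plane mapping described above can be sketched as follows. This is an illustrative reconstruction, not Ventuz's actual implementation; it assumes the active coordinate system's X/Y-plane is z = 0 in local space:

```python
def intersect_ray_xy_plane(origin, direction):
    """Intersect a viewing ray with the X/Y plane (z = 0).

    origin, direction: 3-tuples (x, y, z); direction need not be normalized.
    Returns the (x, y) hit point, or None if the ray is parallel to the
    plane or the plane lies behind the ray origin.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dz) < 1e-9:          # ray parallel to the plane
        return None
    t = -oz / dz                # solve origin.z + t * direction.z == 0
    if t < 0:                   # intersection behind the camera
        return None
    return (ox + t * dx, oy + t * dy)

# Camera at z = 10 looking straight down the -Z axis through a touch offset:
hit = intersect_ray_xy_plane((1, 2, 10), (0.5, 0, -1))
```

Only this intersection point feeds the value generation; the mesh under the touch merely decides which node is activated.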

In addition, all of these nodes share an inherent behavior and a simplified physics simulation. There is no collision detection, but objects can have a certain inertia which keeps the gesture moving after the user has released the touch. Movement in particular directions can be artificially restricted by using the Limit input properties.

Touch Translation

Use for: Sliders, 2D positioning...

A single touch is used to translate the object in the x/y direction. This is usually used with the X- and Y-axis aligned to the screen, but rotated versions are also possible in order to move objects back and forth in 3D space.

Be careful to keep the vanishing points of both axes outside the area visible to the camera, as the mapping calculation close to those points is error-prone.

A simple slider can be built by using the Limit input properties to restrict movement in one direction.

Touch Rotation

Use for: Slot machine style wheels, volume knob style elements, steering wheel, ...

The most important properties of Touch Rotation are the Rotation Axis and the Mapping mode. The on-axis mapping mode assumes that the user looks roughly along the rotation axis onto the top of the object (e.g. a steering wheel or the dial plate of an antique telephone). In this mode, the node will try to follow the touch as closely as possible in order to keep the same point on the surface under the touch position at all times. This makes very precise rotations possible, even up to a complete revolution. However, this mode can feel odd if the touch changes its distance from the rotation axis.

The off-axis mapping mode assumes that the rotation axis is at roughly a 90 degree angle to the viewing direction (e.g. a slot machine). When rotating, the object will only stick to the touch position up to a certain point. As the touch gets closer to the boundary of the object, a different mapping is used so that moving the touch further along keeps the object rotating at a constant speed. While one loses the feeling of the object sticking exactly to the finger, this mode feels much nicer for making an object spin.
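The on-axis behavior of keeping the same surface point under the touch amounts to tracking the change in polar angle of the touch around the rotation axis. A minimal sketch under that assumption (illustrative only; the function and parameter names are not Ventuz API):

```python
import math

def on_axis_rotation_delta(prev_touch, cur_touch, center):
    """Angle (in degrees) the object must rotate so that the surface
    point under the touch stays under it.  Touches and center are
    (x, y) positions in the plane perpendicular to the rotation axis."""
    a0 = math.atan2(prev_touch[1] - center[1], prev_touch[0] - center[0])
    a1 = math.atan2(cur_touch[1] - center[1], cur_touch[0] - center[0])
    delta = math.degrees(a1 - a0)
    # wrap into (-180, 180] so a small motion never reads as a full turn
    return (delta + 180.0) % 360.0 - 180.0
```

A touch moving a quarter circle around the axis yields a 90 degree rotation, regardless of its distance from the center; this distance-independence is exactly why the mode can feel odd when the touch drifts toward or away from the axis.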

The two modes work completely differently; which one is more suitable depends on the specific use case.

Touch Orbit

Use for: Product presentation, earth globes, ...

The Touch Orbit node enables the user to rotate around multiple axes at the same time. A vertical movement of the touch tilts the object, while a horizontal movement rotates the object around its Y-axis. One particularity of this node is that the tilt is always performed around the X-axis of the gesture, not of object space. This has the effect that while rotating the object, the tilt stays constant with respect to the viewer. It is thus ideal for a product presentation or any other situation where the user should be able to tilt the object but nonetheless be unable to look underneath it. In addition, the object can be scaled with a two-finger pinch/stretch.

The rotation is measured in degrees of Azimuth and degrees of Inclination, the former being the rotation around the Y-axis of the object and the latter the amount of tilt. Both of these values, as well as the amount of scaling, can be restricted by using the respective Limit properties.

LimitsActive activates all limits at the same time. If only a subset of the properties should be restricted, simply set the remaining limit properties to very large values such as plus and minus ten to the power of thirty.
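Numerically, the "very large values" trick works because clamping against a limit pair of ±1e30 is effectively no clamp at all. A sketch (illustrative helper names, not actual node properties):

```python
NO_LIMIT = 1e30  # effectively unbounded, as suggested above

def apply_limits(value, lo, hi, limits_active=True):
    """Clamp a transformation output to [lo, hi] when limits are active."""
    if not limits_active:
        return value
    return max(lo, min(hi, value))

# Restrict only the inclination; leave the azimuth unbounded.
azimuth     = apply_limits(500.0, -NO_LIMIT, NO_LIMIT)   # passes through
inclination = apply_limits(120.0, -90.0, 90.0)           # clamped to 90
```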

For numerical reasons, the Inclination is restricted to the range from -90 to +90 degrees even when the LimitsActive property is false. This also prevents the object from ever being upside-down.

Touch Transformation

Use for: Moving picture metaphor, scaling objects, ...

The Touch Transformation node combines the most common one- and two-finger gestures to interact with an object. A single touch can be used to translate an object. Two touches can be used to rotate the object (by doing a circular motion), scale it (pinch/stretch), and translate it (by moving both touches in the same direction).

Due to the many degrees of freedom of this node, there are only a few practical use cases for it. Typically, a user will use the Limit properties to restrict movement in one direction or another. In most cases, however, one should avoid the Transformation node in favor of the dedicated Translation or Rotation nodes.

When the ApplyTransformationCenter property is enabled, rotation and scaling are applied around the center of the two touches. If it is disabled, the transformation is applied around the center of the object.
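The difference between the two pivot modes is the classic "transform about a point" construction: move the pivot to the origin, apply the rotation and scale, then move back. A 2D sketch (illustrative, not the Ventuz implementation):

```python
import math

def transform_about(point, pivot, angle_deg, scale):
    """Rotate by angle_deg and scale `point` around `pivot` (2D tuples)."""
    px, py = point[0] - pivot[0], point[1] - pivot[1]   # pivot to origin
    a = math.radians(angle_deg)
    rx = (px * math.cos(a) - py * math.sin(a)) * scale  # rotate + scale
    ry = (px * math.sin(a) + py * math.cos(a)) * scale
    return (rx + pivot[0], ry + pivot[1])               # pivot back

# With ApplyTransformationCenter enabled, the pivot is the center of the
# two touches; with it disabled, the pivot is the center of the object.
```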

Touch Transformation 3D

Use for: Full transformation of objects in 3D space with VR Controllers; free rotation in 2D screen mode.

This node is an extension of the Transformation and Orbit nodes that provides full transformation in 3D space (e.g. Virtual Reality). With an appropriate controller it is possible to translate objects on all three axes. Additionally, free rotation is implemented, as well as scaling with a second touch provider (a second controller, or two fingers in screen mode). The rotation is based on quaternions and allows free and intuitive rotation of objects. However, this technique also implies that the rotation cannot be limited on the individual X/Y/Z axes; it can only be disallowed completely via the DisableRotation property.
In 2D screen mode this node behaves slightly differently from the Transformation and Orbit nodes:

  • A single touch rotates the object, similar to the Orbit node but without limitations.
  • A dual touch translates the object in X and Y (like a single touch on the Transformation node) and is used for scaling and for rotation around the Z-axis (like a dual touch on the Transformation node).

Resetting

All of these nodes can be reset to specific values, meaning that their transformation outputs can be overridden from the corresponding input properties. All nodes have the following Reset parameters in common: ResetOnLoad specifies whether the transformations should be reset when the scene is loaded. The ResetAnimation property determines how the reset is performed; all standard easing functions known from the Ease node are available. The duration of this transition is controlled by the ResetDuration property; set it to zero seconds to let the reset jump directly to the specified values. The ResetPolicy flags control whether a reset can be performed while a touch is active and whether an object can be touched during a reset animation.
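The reset transition boils down to interpolating from the current output to the reset value over the reset duration with an easing curve. A minimal sketch under that reading (the easing function and names are illustrative, not the actual engine):

```python
def ease_in_out(t):
    """A standard smoothstep-style easing curve for t in [0, 1]."""
    return t * t * (3.0 - 2.0 * t)

def reset_value(current, target, elapsed, duration, ease=ease_in_out):
    """Value of a transformation output `elapsed` seconds into a reset.
    A duration of zero jumps straight to the target, as described above."""
    if duration <= 0.0 or elapsed >= duration:
        return target
    t = ease(elapsed / duration)
    return current + (target - current) * t
```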

Customized Motion

Each of the transformation nodes automatically performs the transformation implied by the gesture (i.e. a Translation node moves its subtree without an explicit transformation node in front of it). To customize the way objects behave, this implicit motion can be disabled by unchecking BehaviorMotion. When BehaviorMotion is disabled, the node still performs all of its calculations and provides the results as output properties. These can be manually bound to an Axis node, and by using Math Effect nodes like Decay or Dampener, different interaction results can be achieved.
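With the implicit motion disabled, the raw gesture outputs can be smoothed before an Axis node applies them. The effect of routing a value through a Dampener-style node can be sketched as a simple exponential smoother (an illustrative analogy, not the actual Math Effect implementation):

```python
class Dampener:
    """Exponentially smooths an input value toward its target, similar in
    spirit to binding a gesture output to an Axis node via a Dampener."""

    def __init__(self, smoothing=0.5):
        self.smoothing = smoothing   # 0 = follow instantly, near 1 = lazy
        self.value = 0.0

    def update(self, target):
        # Each update closes a fraction of the remaining gap.
        self.value += (target - self.value) * (1.0 - self.smoothing)
        return self.value
```

Feeding the same target repeatedly makes the output converge smoothly instead of snapping, which is the kind of customized interaction result the text describes.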

Physics Simulation

All transformation nodes contain a simplified physics model to simulate effects like inertia or other post-motion. Subtle effects like these are crucial to convey realism or "weight" in a MultiTouch scene.

As stated before, the physics model inside Ventuz is a simplified one custom tailored for MultiTouch purposes. It is not designed to handle collisions, gravity or other factors expected in a full-fledged physics simulation.

The physics simulation is enabled by default for all Touch Transformation nodes. To disable it, uncheck the BehaviorApply property. This is independent of whether or not the object will move (see BehaviorMotion) and merely prevents the physics simulation from influencing the values generated by the mapping calculation.

To modify the physics properties of a node, click the Edit Behavior button at the bottom of the Property Editor.

There are three forces that can be applied to the object moved by any of the transformation nodes:

  • Inertia: Kicks in when the user's touch leaves an object while the object travels at a velocity higher than the Velocity Threshold. The inertia simulation will try to continue the user's movement and slowly decrease the object's velocity to simulate the effect of friction. The amount of friction can be adjusted by changing the Deceleration factor. The Transformation 3D node has one further parameter: VR Rotation Coefficient. This parameter is only applied when the node is used with VR controllers. It increases the deceleration of the rotation to make the motion of a thrown object look more natural; with low values, objects thrown with a VR controller rotate too fast and for too long.
  • Limit Elasticity: The "bounciness" of an object when it hits one of the limits specified in the node's properties. When a limit is hit, the velocity vector of the object is multiplied by this weight and then reflected off the limit.
  • Tick Attraction: Defines a uniform spacing of "snap points" called ticks. If the spacing is non-zero, a virtual tick is placed at every multiple of the spacing in the node's mapping system. When the object's velocity falls below the Attraction Threshold, the ticks apply an attraction force to the object, slowly pulling it until it stops exactly on one of the tick positions. The rate of attraction can be adjusted using Attraction Speed. For the Touch Transformation node, the TicksAffect property can be used to specify which types of transformation (translation, rotation, scaling) are affected by the ticks.
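
The three forces above can be sketched as one step of a 1D simulation. This is a hedged reconstruction from the descriptions only; the parameter names mirror the properties mentioned above but the formulas are illustrative, not the actual engine code:

```python
def physics_step(pos, vel, dt, *,
                 deceleration=2.0,        # friction (Deceleration factor)
                 limits=(-1e30, 1e30),    # movement limits
                 elasticity=0.5,          # Limit Elasticity
                 tick_spacing=0.0,        # Tick Attraction spacing
                 attraction_threshold=0.1,
                 attraction_speed=1.0):
    """One step of a simplified 1D inertia/elasticity/tick model.
    Works in the node's mapping system, not real-world units."""
    # Inertia with friction: velocity decays over time.
    vel -= vel * deceleration * dt
    pos += vel * dt

    # Limit elasticity: bounce off a limit with the velocity scaled
    # by the elasticity weight and reflected.
    lo, hi = limits
    if pos < lo:
        pos, vel = lo, -vel * elasticity
    elif pos > hi:
        pos, vel = hi, -vel * elasticity

    # Tick attraction: once slow enough, pull toward the nearest tick.
    if tick_spacing > 0.0 and abs(vel) < attraction_threshold:
        nearest = round(pos / tick_spacing) * tick_spacing
        pos += (nearest - pos) * min(1.0, attraction_speed * dt)
    return pos, vel
```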

All of these parameters are based on the node's own mapping system instead of real-world measures like screen distance in cm. While this is often unintuitive, it is the only plausible way to be both screen-size independent and capable of handling motion that is not aligned with the screen plane.

The Machine Configuration has two options to adapt the physics behavior parameters to changing screen sizes. As long as Scale Thresholds with Window Size is active, Ventuz will do its best to internally scale all parameters such that the user experience stays the same. For more explicit control, there is an additional scaling factor that is applied to all parameters except the tick spacing.

See also:
  • MultiTouch
  • Project Properties
  • Touch Button
  • Touch Excluder
  • Touch Paint
  • Touch Pattern
  • Touch Proxy
  • Touch Ripples

Copyright 2022 Ventuz Technology