How To Use Tracking in Ventuz

Table of Contents

  1. Preliminary Requirements
  2. Tracking System Vendors
    1. Trackmen-Egripment
    2. NCam
    3. Stype
    4. Mo-Sys StarTracker
    5. FreeD Protocol
  3. Configuring Tracking Devices
    1. Trackmen/NCam/Stype/Mo-Sys/FreeD - Ethernet interfaces
    2. Mo-Sys - Serial interface
    3. Common Tracking device Parameters
    4. Vendor specific Parameters
      1. Trackmen
      2. NCam
      3. FreeD
  4. Configuring Deltacast Delta HD-E-Key 22
    1. Setting Up Timecode Mode – Ventuz Internal Key
    2. Setting Up No Timecode Mode – Ventuz Internal Key
    3. Setting Up Timecode Mode – Ventuz External Key
  5. Using external Camera Tracking Data

Preliminary Requirements

In order to use Camera Tracking data, an additional Tracking license option must be purchased and added to your Ventuz Designer or Runtime licenses.

Among the multiple Video I/O boards supported by Ventuz there are some limitations, so we have prepared a list of supported cards. Please have a look at the Supported Hardware Vendors page.

Tracking System Vendors

Currently Ventuz supports these Camera Tracking system vendors:

Trackmen-Egripment

German vendor Trackmen develops camera tracking systems based on a variety of technologies. In all cases Trackmen uses the same consistent protocol for tracking data transfer, so all Trackmen systems can be connected to Ventuz regardless of the tracking technology used. Currently, Trackmen offers these tracking products, all supported by Ventuz:

  • VioTrack: Uses the standard camera video feed and, depending on the version, an additional sensor camera for increased accuracy. More info here
  • ManuTrack: Uses the standard camera video signal to track the presenter's hands, so that virtual objects can be linked to them or events triggered by gestures. More info here
  • TorqTrack: Uses encoders on camera pedestals/support joints to calculate the camera position in 3D space. This can be used as a kit to sensorize any existing camera support, crane, tripod, pedestal, etc. Currently Trackmen collaborates with crane manufacturer Egripment to offer sensorized cranes and pedestals - since these systems use Trackmen technology, they are also supported by Ventuz. More info here

NCam

British manufacturer NCam develops optical camera tracking solutions that use a lightweight sensor bar attached to the camera to track natural features in the environment, allowing the camera to move freely anywhere - that makes this system especially well suited for shoulder-held or steady-cam shots, or for augmented reality projects outside of the studio. More info here

Stype

Croatian manufacturer Stype offers a sensorizing kit with auto-aim functionality, called Stype Kit, for existing cranes and jibs. The system does not require any additional external sensors or infra-red cameras and there is no need for any physical modifications of the camera crane. More info here

Mo-Sys StarTracker

British manufacturer Mo-Sys is a long-established vendor of solutions for remote heads & motion control, broadcast robotics, mechanical and optical camera tracking for AR and VR, and on-set visualization. One of the latest additions to their portfolio is an optical camera tracking system called StarTracker, which features a small sensor camera tracking a cloud of markers placed on the ceiling - that makes it best suited for permanent in-studio setups. More info here

FreeD Protocol

Introducing the FreeD protocol as a vendor-independent tracking input makes it possible to receive tracking data from a variety of devices. The input implements the D1 data packet, which carries Camera ID, Pan, Tilt and Roll angle, X-, Y- and Z-position (height), Zoom and Focus. To correct lens distortion, a lens calibration file can be loaded into the FreeD input. Contrary to the vendor-specific tracking inputs, the FreeD tracking data input does not require an additional license option to receive just the D1 data; for the lens correction, however, a corresponding license option is required.
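
Because the D1 packet layout is publicly documented, it can be useful to inspect what a FreeD source is sending before configuring it in Ventuz. The following Python sketch is only a minimal, standalone decoder for that purpose - it is not part of Ventuz, the port number is a placeholder, and the scaling factors (angles in 1/32768 degree, positions in 1/64 mm) follow the commonly published FreeD specification and should be verified against your tracking vendor's documentation.

    import socket

    FREED_PORT = 40000  # placeholder: the value configured as "Tracking IP Port"

    def s24(raw):
        """24-bit signed big-endian integer."""
        return int.from_bytes(raw, "big", signed=True)

    def decode_d1(data):
        """Return the fields of a 29-byte FreeD D1 packet, or None."""
        if len(data) < 29 or data[0] != 0xD1:
            return None
        return {
            "camera_id": data[1],
            "pan_deg":  s24(data[2:5])   / 32768.0,
            "tilt_deg": s24(data[5:8])   / 32768.0,
            "roll_deg": s24(data[8:11])  / 32768.0,
            "x_mm":     s24(data[11:14]) / 64.0,
            "y_mm":     s24(data[14:17]) / 64.0,
            "z_mm":     s24(data[17:20]) / 64.0,
            "zoom":  int.from_bytes(data[20:23], "big"),
            "focus": int.from_bytes(data[23:26], "big"),
        }

    if __name__ == "__main__":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", FREED_PORT))
        while True:
            packet, sender = sock.recvfrom(64)
            fields = decode_d1(packet)
            if fields:
                print(sender[0], fields)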



Configuring Tracking Devices

If you have the Tracking option enabled in your license, all the supported tracking devices will be displayed in the AV Configuration menu. Please bear in mind that Ventuz does not check whether tracking systems are actually connected; this list only shows the supported tracking systems.

In order to configure the video device and tracking source, just drag and drop the icons to the input and output panels as needed. As you can see in the figure above, for tracking you will normally need one Tracking and one Video source, which must be placed in the Inputs pane, and normally one Video Output device, which must be placed in the Outputs pane. Please check the AV Configuration section for more info.

Depending on the Camera Tracking system of your choice, tracking data will be transmitted via Ethernet (Trackmen/NCam/Stype/FreeD) or via a serial connection (Mo-Sys). The sections below describe the options available when you add the tracking systems in your Configuration Editor and how to set up tracking data transmission to the Ventuz system.

Trackmen/NCam/Stype/Mo-Sys/FreeD - Ethernet interfaces

To set up the tracking data communication, press the little gear icon next to the Tracking Vendor Logo (A in figure above).
After pressing the gear icon, a Device Options contextual menu will appear and, depending on the communication interface used by the tracking system, different parameters will be available.
For tracking systems using an Ethernet infrastructure, two parameters are available; currently the supported tracking systems that use IP communication for the tracking data are Trackmen, NCam and Stype:

  • Tracking IP Port: Use this text field (B in figure above) to set the local port used to receive the UDP tracking data stream.
  • Tracking IP: Use this text field (C in figure above) to set the IP address of the machine's network adapter used to receive the UDP tracking data stream. Follow the standard IP address formatting as shown in the example. Attention: For Trackmen, Stype and FreeD this is the IP of the local network adapter of your machine used to receive the tracking data, as this is UDP. For NCam the IP of the machine that sends the tracking data is needed, as this is TCP (see the sketch after this list).
  • Since Mo-Sys supports both Serial and Ethernet interfaces for its tracking protocol, you can select here which one to use.
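
To make the UDP/TCP distinction above concrete, here is a minimal Python sketch (not part of Ventuz; all addresses and ports are placeholders): for the UDP-based systems the configured address is the local adapter that receives the stream, whereas for NCam it is the remote machine that Ventuz connects to.

    import socket

    # UDP sources (Trackmen / Stype / FreeD): bind the LOCAL network adapter
    # and port that are entered as "Tracking IP" / "Tracking IP Port".
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.bind(("192.168.1.20", 40000))      # placeholder: local adapter of the Ventuz machine
    data, sender = udp.recvfrom(2048)
    print(f"UDP: {len(data)} bytes from {sender}")

    # TCP source (NCam): connect TO the machine that sends the tracking data,
    # so "Tracking IP" holds the REMOTE address of the NCam server.
    tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp.connect(("192.168.1.50", 38860))   # placeholder: address/port of the NCam server
    data = tcp.recv(2048)
    print(f"TCP: {len(data)} bytes from NCam")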



Mo-Sys - Serial interface

To set up the tracking data communication, press the little gear icon next to the Tracking Vendor Logo.
After clicking the gear icon, a Device Options contextual menu will appear and, depending on the communication interface used by the tracking system, different parameters will be available.
For tracking systems using a serial communications infrastructure, two parameters are available; currently the only supported tracking system that uses serial communication for the tracking data is Mo-Sys StarTracker (a quick diagnostic sketch follows the list below):

  • Comm Port: Use this text field to set the local COM port used to receive the tracking data stream. Please bear in mind that you must follow the same formatting that appears in the example, i.e. "COM1".
  • Baud rate: Use this drop-down menu to set the baud rate of the serial port used to receive the tracking data stream. Currently Mo-Sys only supports the two baud rate options in the list, 57600 and 38400.
  • Since Mo-Sys supports both Serial and Ethernet interfaces for its tracking protocol, you can select here which one to use.
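
If you are unsure whether the StarTracker data reaches the machine at all, a quick check outside of Ventuz can help. This is a hypothetical diagnostic only, assuming the third-party pyserial package is installed; the port name and baud rate must match the values entered in the Device Options.

    import serial  # third-party package: pip install pyserial

    # Open the configured COM port and dump a few raw tracking bytes.
    with serial.Serial("COM1", 38400, timeout=1.0) as port:
        chunk = port.read(64)
        print(f"received {len(chunk)} bytes: {chunk.hex(' ')}")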



Common Tracking device Parameters

  • Format: A drop-down menu. It currently has no real use and is reserved for future development; the default and only value is Auto Detect.
  • Mipmaps: A checkbox. It currently has no real use and is reserved for future development; the default value is ON and can be left as is.
  • Milliseconds Delay: A textbox used to set the tracking data delay in milliseconds, normally used to ensure that the tracking data is applied at the very same moment as the matching video frame - in practice you will use it to get rid of noticeable jittering of the tracked objects. It can be used independently or combined with Field Delay below; both values are added (see the example after this list). Default value is 0.
  • Field Delay: A textbox used to set the tracking data delay measured in fields. This is usually used to compensate for delay in the tracking system; most tracking systems deliver better results when filtering across multiple frames, which creates a delay. Default value is 0.
  • Scaling: A slider to adjust the tracking position scaling. Default value is 1.
  • Lens Distortion: A checkbox used to select whether automatic lens distortion is applied. Default value is ON; for most cases you can leave it like that.
  • Advanced Settings: A foldable menu that features some additional settings:
    • RGB Format: This has no effect for tracking inputs.
    • Synchronized: A checkbox used when input and output are genlocked; in that case this should be ON. Default value is OFF.
    • Low Latency: This has no effect for tracking inputs. A checkbox used to reduce Ventuz system latency by reducing the number of buffered frames, which results in slightly better performance; in general it is better to leave it at the default OFF value.
    • Extra Buffers: This has no effect for tracking inputs. A textbox used to add extra frame buffers to avoid possible frame drops, at the cost of higher system delay. It works in both Normal and Low Latency modes, and the default value is 0.
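
As a simple illustration of how the two delay parameters combine (assuming only that one field lasts 1000 / field-rate milliseconds; this is not the exact internal Ventuz formula):

    # Hypothetical helper: combined tracking delay from the two parameters above.
    def total_tracking_delay_ms(milliseconds_delay, field_delay, field_rate_hz):
        return milliseconds_delay + field_delay * (1000.0 / field_rate_hz)

    # 1080i50 carries 50 fields per second, so one field lasts 20 ms:
    print(total_tracking_delay_ms(5.0, 3, 50.0))  # 5 ms + 3 fields -> 65.0 ms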

Vendor specific Parameters

Apart from the settings above, which are common to most tracking systems, some systems feature specific parameters due to unique capabilities.

Trackmen

  • Tracking Camera ID: A textbox used to set the ID of the tracking camera - the Trackmen system allows more than one camera to be used. Default value is 1.
  • Tracking Studio ID: A textbox used to set the ID of the tracking studio - Trackmen also supports multiple studios. Default value is 1.

NCam

  • Use SDI Timestamps: A checkbox used to receive ancillary data on the SDI input - a unique NCam feature; Ventuz is currently the only system on the market that supports it. Default value is OFF.

FreeD

  • Lens Calibration File: The FreeD tracking input implements only the D1 data packet, which carries Camera ID, Pan, Tilt and Roll angle, X-, Y- and Z-position (height), Zoom and Focus. To correct lens distortion, a lens calibration file can be loaded into this property. For a description of the lens calibration file see OpenCV Lens Calibration File for FreeD.
  • FOV X: Sets the horizontal field of view in degrees. The FOV value is not included in the FreeD D1 packet and thus has to be set manually if no lens calibration file is used. If a lens calibration file is provided, the field of view from the corresponding calibration point is applied and overrides this FovX setting.
  • CCD Width: Sets the width of the camera's sensor in millimeters. As described above, this only needs to be set manually if no lens calibration file is provided (see the note after this list).
  • CCD Height: Sets the height of the camera's sensor in millimeters. As described above, this only needs to be set manually if no lens calibration file is provided.
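
For reference, sensor width, focal length and horizontal field of view are related by the standard pinhole camera model. The small sketch below shows that general relation only; it is not necessarily how Ventuz uses these parameters internally, so treat the numbers purely as an example.

    import math

    # General pinhole relation (not Ventuz-specific): horizontal FOV from
    # sensor width and focal length, both in millimeters.
    def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
        return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

    # Example: a nominal 8.8 mm wide 2/3-inch sensor at 10 mm focal length
    print(round(horizontal_fov_deg(8.8, 10.0), 1))  # -> 47.5 degrees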

Configuring Deltacast Delta HD-E-Key 22

Ventuz can be used in two basic modes (not only for tracking):

  • Internal Key: With a camera feed to key internally, producing a composited output. In this mode the tracking data to video timing is adjusted to ensure that the video output of Ventuz is accurate. Video and tracking data delays will be different. These delays are also dependent on cable lengths and intervening equipment in the signal pathways, so each installation will be different.
    The Ventuz implementation allows two methods of locking the tracking data to the correct video frame. The particular method to use depends on whether the camera stream has accurate timecode embedded in it.
    • Timecode: The timecode has to be generated either by the main camera or by a timecode generator inline between the camera and the Tracking / Ventuz systems. The tracking system extracts the timecode from the incoming video frame and assigns it to the tracking data sample for that frame. That timecoded tracking data is then supplied to Ventuz. Ventuz also decodes the timecode on the incoming video frames and assigns the correct tracking data packets to the specific video frame to ensure that the virtual graphics are always correctly in sync. This is only supported by NCam.
    • No Timecode: There is no way to uniquely match tracking data packets to specific video frames, so the alignment requires some manual adjustment of the video and tracking data delays. Tracking data packets are streamed by the tracking system; Ventuz receives video frames and tracking data packets and internally timestamps them based on the arrival time at the Ventuz computer. The tracking data is then applied to the video frame with the closest matching system arrival time (a small illustration of this matching follows this list). Ventuz maintains input video buffers and tracking data buffers that can be used to adjust the "lock" and ensure the correct tracking data is used for the specific video frame.
  • External Key: With a video feed locked to house reference to produce key and fill channels for external keying and compositing in a downstream keyer/mixer. In this mode the tracking data and video timing is adjusted so that the AR graphics are locked to the video at the output of the keyer/mixer.
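
The matching described for the No Timecode case can be pictured as a nearest-timestamp lookup. The Python fragment below is purely illustrative - it is not the actual Ventuz implementation, and in Timecode mode the same lookup would key on the embedded timecode instead of the arrival time.

    # Illustration only: pair each video frame with the tracking sample whose
    # arrival time is closest to the frame's arrival time.
    def nearest_tracking_sample(frame_time, tracking_buffer):
        """tracking_buffer: list of (arrival_time_seconds, tracking_data) tuples."""
        return min(tracking_buffer, key=lambda sample: abs(sample[0] - frame_time))

    tracking_buffer = [(0.000, "pan=10.0"), (0.020, "pan=10.4"), (0.040, "pan=10.9")]
    print(nearest_tracking_sample(0.038, tracking_buffer))  # -> (0.04, 'pan=10.9')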

In most cases Video and tracking data delays won't be equal. These delays are also dependent on cable lengths and intervening equipment in the signal pathways, so each installation will feature different delays.

Setting Up Timecode Mode – Ventuz Internal Key


Genlock Source settings for Deltacast board


Deltacast Video input for Timecode operation with extra video input buffers:

  • Check the Deltacast settings in Ventuz from the attached screenshots.
  • Set the Enable Ancillary Input to enable the reading of timecode from the video stream (see above).
  • Set Tracking Source. This will cause the tracking data to be aligned to this input.
  • Set Synchronized.
  • Create a cube in Ventuz Designer and position a corner at the origin.
  • Pan the real camera left and right to check the timing.
  • Check the accuracy of the timing on the output of Ventuz – normally the graphics will not be locked to the video but will lag behind it. This is because, by default, there is no video buffering in Ventuz and the tracking data packets arrive after the video. In this case, Ventuz just assigns the tracking data to the nearest video frame, so the graphics appear to move after the video.
  • Add video input delay in the Ventuz AV Configuration Editor. Increment the Extra Buffers parameter to a value of 2 or 4 to create video ring buffers – these buffers can be used to delay the incoming video. Then in Ventuz Designer Live Options, increment the Input Delay value to use these buffers. By using the Ventuz Live Options, there is no need to restart the software when adjusting values; only if you need to increase the value of Extra Buffers in the Ventuz AV Configuration Editor will a restart be required. This delay works in the opposite direction to the Field Delay in the tracking system.
  • Once Ventuz has the appropriate video frame stored in its input buffer, it will correctly assign the right tracking data to the appropriate frame and the virtual graphics will snap into sync with the video.



Setting Up No Timecode Mode – Ventuz Internal Key

  • Create a cube in Ventuz Designer and position a corner at the origin.
  • Set Tracking Source. This will cause the tracking data to be aligned to this input.
  • Set Synchronized.
  • Pan the real camera left and right to check the timing.
  • Check the accuracy of the timing in Ventuz. If the video moves before the cube, you need to delay the video stream by adding a video input delay in the Ventuz AV Configuration Editor. Increment Input delay (frames) in Ventuz Designer Live Options. The maximum delay you can add is limited by the number of buffers specified in the Extra Buffers parameter.
  • If the video moves after the cube, you need to delay the tracking data using the Milliseconds Delay and/or Field Delay controls in the tracking system parameters of the Ventuz AV Configuration Editor.
  • Once the timing is accurate, it should remain locked.



Setting Up Timecode Mode – Ventuz External Key


Even when using external keying, we need an input SDI signal synchronized to the house clock. The image content of this signal does not matter; it is only used to get timing information to correctly align the tracking data with the house clock.

  • Check the Deltacast input board settings in Ventuz AV Configuration Editor from the above screenshot to set External Keying.
  • Create a cube in Ventuz Designer and position a corner at the origin.
  • Remove any video input and/or tracking data delay values in the Ventuz AV Configuration Editor (see screenshot).
  • Check the output of the keyer/switcher and pan the real camera left and right to check the timing.
  • Add external video delays until the AR graphics and video are synchronized.
  • As you can only add complete fields or frames in the external video delay box, you may have to delay the video so that it just starts after the graphics, and then adjust the Tracking Data Milliseconds Delay to compensate.

You will need to adjust the frame delay in the external keyer / video mixer.

Using external Camera Tracking Data



Once you have set up the tracking data sources and the correct synchronization (see above), you are ready to use the external camera tracking and lens distortion data to drive the virtual camera in your Ventuz scene.

In order to do so, just create a 3D Layer in the Ventuz Layer Editor. As you can see in the figure above, clicking the 3D Scene Root (A in figure) opens the contextual properties in the Property Editor to the right.

If you expand the 3D Camera properties, you will see two fields. The first is a Camera drop-down menu (B in figure above) that displays the camera being used - the (default) camera is the initial option, and Ventuz always assumes that it will be controlled by external tracking device data, so if you leave it like that the 3D camera will be tracked. If you don't want your camera to be tracked, just add another 3D Camera to your Hierarchy Editor.

The other control available, Lens Distortion (C in figure above), is a checkbox that controls whether lens distortion data is applied.
