
Screen Render Options

Forces the engine to render a Canvas multiple times with different resolutions and/or cameras.

The Screen Render Options node can be used on a Screen or Composition Projector node to force the Ventuz Engine to render a Canvas multiple times with different resolutions and/or cameras. The Engine can render the content at the highest resolution and filter the result for lower-resolution outputs, which saves system resources.

The CameraOverride can make use of the following settings:

  • TrackedCamera : For Head Mounted Displays or Camera Tracking Systems.
  • MatrixCamera : Allows the use of a custom view matrix. The zero and identity matrix presets can be used or modified manually, or the Matrix property can be connected to a matrix coming from, for example, a Matrix node or a Script (see the sketch after this list).
  • SetExtension : Enables a POV (Point of View) or Eyepoint to be visualized. This can be used to give the impression of an extended view, as if looking through a window, when the POV or camera moves from one place to another.
  • ZNear : Sets where the Z depth view starts. This will clip objects close to the camera.
  • ZFar : Sets where the Z depth view stops. This will clip objects far from the camera.
  • X/Y/Z : Sets the viewpoint of the camera in 3D space.
  • RenderFilter : Can be set to exclude parts of the hierarchy from rendering. The filter enumeration can be selected on the Output property in the Render Setup Editor.
  • OverrideWidth/Height : Changes the output resolution.
  • PrevisWidth/Height : Changes the previsualization resolution for the Previs rendering inside the Ventuz Designer as well as for the Previs Screen output in the Render Setup Editor. Keep this setting as low as possible to prevent performance issues when working with high resolution outputs.
  • Spherical projection : Rendering will be warped to produce the correct result for the spherical display, and touch coordinates are warped the other way around to match the screen.
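
For the MatrixCamera setting, the view matrix itself has to come from somewhere, for example a Matrix node or a Script. The following is a minimal sketch of the underlying look-at math only; it is plain Python for illustration, not Ventuz script code, and the function name and the right-handed, column-vector convention are assumptions:

```python
import numpy as np

def look_at(eye, target, up):
    """Build a 4x4 view matrix that looks from 'eye' towards 'target'.
    Illustrative math only -- not a Ventuz API call."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    forward = target - eye
    forward /= np.linalg.norm(forward)      # viewing direction
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)          # camera x axis
    true_up = np.cross(right, forward)      # camera y axis

    view = np.identity(4)
    view[0, :3] = right
    view[1, :3] = true_up
    view[2, :3] = -forward                  # right-handed convention
    view[:3, 3] = -view[:3, :3] @ eye       # move the world into camera space
    return view

# Example: a camera 5 units back on the Z axis, looking at the origin.
print(look_at(eye=[0, 0, 5], target=[0, 0, 0], up=[0, 1, 0]))
```

A matrix computed like this could then be transferred value by value into the Matrix property of the MatrixCamera override.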

Spherical Projection

  • CubemapSize : Size in pixels of one cubemap face. There are 6 faces! (A rough cost sketch follows below.)
  • Antialiasing : Multisample antialiasing to be used while rendering the cubemap.
  • FromOutside : Inverts the rendering so that the camera inside the sphere looks at the geometry from the outside. This affects Z-clipping and culling.
  • RenderMethod : FullSphere renders 90° cubemap faces, OneFace renders once with a varying field of view, and OneFaceParallel uses an orthogonal projection.
  • RenderOffsetY : For OneFace, specifies how far the virtual camera is moved backwards along the optical axis, in percent of the radius. Set to 0% if objects placed above the surface of the sphere must not be distorted; set to 100% for a better distribution of the rendered pixels (higher image quality).
  • RenderRadius : For OneFace with RenderOffsetY and for OneFaceParallel, this is the sphere radius at which objects are not distorted.
  • DebugRendering : Normally the screen shows the content distorted for projection. With this enabled, one can see what is actually rendered before the distortion is applied. This is for debugging and for understanding what is going on; the previs will show garbage while this is on.
  • ProjectionForRendering : Projection used for rendering.
  • ProjectionForTouch : Projection used for touch.

Projection for touch defaults to "same as rendering". Some hardware uses a different projection for touch than for rendering.

The Pufferfish property group automatically sets up the different projections required by the hardware; it only needs to be placed in the rendering slot to do that.

Often you need a square aspect for the projection on a 16:9 display, but touch data comes in the 0..1 TUIO range relative to the 16:9 display, so it must be stretched. The StretchedTouchSquareRender mode does the right thing without having to set up the projection twice.
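
To get a feel for what the CubemapSize and Antialiasing properties above cost with the FullSphere method, the sketch below estimates rendered pixels and render-target memory for six faces. The 4 bytes per pixel (RGBA8) and the linear scaling with MSAA samples are simplifying assumptions, not exact Ventuz internals:

```python
def fullsphere_cost(cubemap_size, msaa_samples=1, bytes_per_pixel=4):
    """Rough cost estimate for a cubemap render: 6 faces of cubemap_size^2 pixels.
    Assumes an RGBA8 target and memory scaling linearly with the MSAA sample count."""
    pixels = 6 * cubemap_size ** 2
    megabytes = pixels * bytes_per_pixel * msaa_samples / (1024 ** 2)
    return pixels, megabytes

for size in (1024, 2048, 4096):
    px, mb = fullsphere_cost(size, msaa_samples=4)
    print(f"{size} px faces: {px / 1e6:5.1f} MPixels, ~{mb:7.1f} MB with 4x MSAA")
```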

Camera Override

It is possible to apply the sphere mapping on a per-layer basis as a camera override.

To do this, set the Projection property of the Camera node to spherical projection. This gives you most of the options found in the Screen Render Options of a previs scene.

Unfortunately, touch is not supported in this mode, as the magic that makes spherical touch work happens while processing the render setup, not on a per-layer basis. The RenderOffsetAuto feature also does not work (see below, reprojection).

Projection for Rendering

All property groups provide SphereAngleOut and LensAngleOut as output properties.

For many projection models the values are the same as the input property LensAngle. The purpose is to keep the output properties correct while switching the projection model, so that bindings are not broken.

  • SphereAngleOut : The angle of the physical screen, 360° for a full sphere.
  • LensAngleOut : The minimal angle of a fisheye lens needed to fit the whole screen, which is different from the SphereAngle if the projector is not at the center of the screen. In reality the lens angle will be a bit larger than the minimum.

Equirectangular: Long-Lat Maps

This is the traditional world-map projection that plots degrees of longitude and latitude at equal distances, putting a full world map into a 2:1 aspect ratio. It should not be confused with the Mercator projection used by map services like Google Maps, which gives disproportionate space to the polar regions.

  • SphereAngle : Angle of the sphere, 360° for full sphere
  • LongitudeBias : Rotation in East-West direction
  • LongitudeFlip : Flip East-West direction
  • Aspect : see below

LensAngleOut has no separate meaning for this projection and is simply set to the same value as SphereAngle.
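
For illustration, here is a minimal sketch of the mapping under the usual equirectangular convention (full 360° sphere, ignoring SphereAngle, LongitudeBias, LongitudeFlip and Aspect; the axis layout is an assumption, not necessarily the one Ventuz uses):

```python
import math

def equirectangular_uv(direction):
    """Map a 3D direction to (u, v) in [0, 1] on a 2:1 long-lat image.
    Standard-convention sketch, not necessarily Ventuz's exact axis layout."""
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, z)                 # -pi .. pi, east-west
    lat = math.asin(y / length)            # -pi/2 .. pi/2, south-north
    u = lon / (2 * math.pi) + 0.5          # longitude spread linearly over the width
    v = 0.5 - lat / math.pi                # latitude spread linearly over the height
    return u, v

print(equirectangular_uv((0, 0, 1)))       # straight ahead -> image center (0.5, 0.5)
print(equirectangular_uv((0, 1, 1e-6)))    # near the north pole -> top edge (v ~ 0)
```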

Azimuthal Equidistant: Flat Earth

This projection has the north pole at the center and the south pole stretched all around the edge of a circular image.

  • SphereAngle : Angle of the sphere, 360° for full sphere
  • LongitudeBias : Rotation in East-West direction
  • LongitudeFlip : Flip East-West direction

LensAngleOut is set to the same value as SphereAngle.
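
Again purely as an illustration (full 360° sphere, north pole on the view axis; the conventions are assumptions, not the exact Ventuz layout):

```python
import math

def azimuthal_equidistant_uv(direction, sphere_angle_deg=360.0):
    """Map a 3D direction into a circular image: the radius in the image is
    proportional to the angle measured from the north pole (equidistant),
    so the south pole ends up on the rim. Convention sketch only."""
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    polar = math.acos(y / length)                        # 0 at the north pole .. pi at the south pole
    radius = polar / math.radians(sphere_angle_deg / 2)  # 0 at the center .. 1 at the rim
    azimuth = math.atan2(x, z)
    return 0.5 + 0.5 * radius * math.sin(azimuth), 0.5 - 0.5 * radius * math.cos(azimuth)

print(azimuthal_equidistant_uv((0, 1, 0)))      # north pole -> center (0.5, 0.5)
print(azimuthal_equidistant_uv((0, -1, 1e-6)))  # near the south pole -> on the rim
```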

Fisheye:

Fisheye is a variant of the equirectangular projection based on the fact that many spherical displays are implemented by a video projector projecting through a fisheye lens onto a screen. In this configuration, the projector is usually purposefully located below or above the center of the sphere, along the optical axis, for mechanical reasons or to achieve a better pixel distribution for the projected image.

While there is no reason to deliberately move the projector / lens off the optical axis, that of course happens in practice and can be compensated for. With these factors alone it is often possible to calibrate a display quite satisfactorily, although there are usually further lens distortions not captured by this model.

  • SphereAngle : Angle of the sphere, 360° for full sphere
  • LensAngle : Angle of the fish-eye lens.
  • LongitudeBias : Rotation in East-West direction
  • LongitudeFlip : Flip East-West direction
  • Aspect : see below
  • LensCenter : Offset of the lens center within the sphere, in percent of the radius.
  • LensShift : Misalignment between the projector and the lens
  • RenderOffset, RenderParallel, RenderRadius : see the rendering section below

LensAngleOut is calculated from SphereAngle and LensCenter and reflects the minimum lens angle required to fully cover the screen. The input property LensAngle must be set to the actual lens angle, which will usually be slightly larger.
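
The relation between SphereAngle, LensCenter and that minimum can be sketched with basic trigonometry. This assumes a simplified geometry (lens sitting on the optical axis, displaced from the sphere center by LensCenter percent of the radius, away from the screen); the actual Ventuz calculation may differ in detail:

```python
import math

def min_lens_angle(sphere_angle_deg, lens_center_pct):
    """Minimum full lens angle needed to reach the rim of a spherical screen of
    'sphere_angle_deg', for a lens displaced from the sphere center along the
    optical axis by lens_center_pct percent of the radius.
    Simplified geometry sketch -- not the exact Ventuz formula."""
    half = math.radians(sphere_angle_deg) / 2
    d = lens_center_pct / 100.0
    # angle of the ray from the lens to the rim of the screen, measured from the axis
    beta = math.atan2(math.sin(half), math.cos(half) + d)
    return 2 * math.degrees(beta)

print(min_lens_angle(180, 0))    # lens at the center: 180.0, equal to SphereAngle
print(min_lens_angle(180, 100))  # lens at the bottom of the sphere: 90.0
```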

Pufferfish / Mediascreen:

Specialized property groups that simplify setting up directly supported products. The parameters can be copied from the device datasheet.

About Aspect

Some projections have the Aspect property.

These projections create an image that is circular in some way. For the projection you want this image to be centered on the screen, which may be 16:9 and not 1:1. But the touch device will usually deliver touch input in a TUIO 0..1 coordinate system that is then spread to screen pixels in the non-1:1 aspect ratio, so for touch you might prefer the input to be stretched.

The Aspect property controls this:

  • Square : The circle is centered on the screen, for projection
  • Stretched : The circle is stretched into the edges of the screen, for touch
  • StretchedTouchSquareRender : A mixed mode that allows touch and rendering to be set up without having to specify all parameters twice (see the sketch after this list)
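
The following sketch illustrates the coordinate relationship between the two conventions for a 16:9 output: the square, centered circular image used for rendering versus TUIO touch coordinates that span the full screen. The function names and the exact convention are illustrative assumptions:

```python
SCREEN_ASPECT = 16 / 9

def square_to_screen(u, v, aspect=SCREEN_ASPECT):
    """'Square' placement: the square image is centered on the wider screen,
    so its horizontal coordinate is compressed by the aspect ratio."""
    return 0.5 + (u - 0.5) / aspect, v

def touch_to_square(u, v, aspect=SCREEN_ASPECT):
    """'Stretched' touch handling: TUIO 0..1 coordinates spanning the full
    16:9 screen are mapped back into the centered square of the rendering."""
    return 0.5 + (u - 0.5) * aspect, v

# A touch exactly on the left edge of the centered square:
left_edge_on_screen = square_to_screen(0.0, 0.5)[0]   # ~0.219 in screen space
print(touch_to_square(left_edge_on_screen, 0.5))       # -> (0.0, 0.5) in square space
```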

Rendering Techniques

The default mode for rendering is FullSphere, rendering the 3D scene in 6 directions from the center at a 90° field of view. This is optimal for displays near the full 360°, as it minimizes distortions, but it requires 6 render steps, which is costly on the CPU and GPU.

For displays with a smaller angle it is enough to render the scene once with a modified FOV angle, the OneFace option. One can easily imagine how things at the edge get distorted a lot. As we approach 180° for a virtual camera in the center of the sphere, distortions become infinite. If we double the resolution, we can go from 90° to 127° while maintaining the same quality for the pixels at the pole. This is a big performance win because we render only into one target of twice the resolution instead of 6 targets, so there are fewer render calls and fewer total pixels. This is worth doing up to about 150°, before distortions become too bad; it breaks down completely near 180°.
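
The 90° to 127° figure can be checked with a little trigonometry: keeping the pixel density at the image center constant while doubling the resolution means doubling tan(FOV/2). A quick check, together with the pixel-count comparison from the paragraph above:

```python
import math

# Doubling the resolution keeps the quality at the center if tan(fov/2) doubles too:
# tan(45 deg) = 1  ->  tan(fov/2) = 2
new_fov = 2 * math.degrees(math.atan(2.0))
print(f"{new_fov:.1f} deg")            # ~126.9 deg, the "127 deg" from the text

# One target at twice the resolution vs. six cubemap faces (example face size 1024):
n = 1024
print((2 * n) ** 2, "vs", 6 * n ** 2)  # 4*n^2 pixels is less than 6*n^2
```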

But that is with the camera at the center of the sphere. If we put the camera at the bottom of the sphere, we can render close to 360° without extreme latitudinal distortions: the only real waste comes from the regions near the south pole being stretched across the edge of the screen, which makes this impractical beyond 180°.

The RenderOffset property allows controlling the position of the virtual camera independently of the position of the physical projector, so one can use the optimal virtual camera position for minimizing distortion while rendering and a different position for the physical projector to match the device with its practical limitations. A value of 100% moves the camera to the bottom of the sphere, which seems to be the optimal position for improving pixel utilization.

The OneFaceParallel option changes the camera to an orthogonal projection onto the sphere from outside, which may be another option to choose. In this case RenderOffsetY is not used.

Both off-center options need to know how large the sphere is, so the RenderRadius must be specified. For a camera at the center of the sphere the distance does not matter: you can scale the sphere and the objects on top of the sphere as large as you like, as everything will always project correctly towards the center of the sphere. But with an off-center virtual camera, the projection is only correct for objects located on the sphere at the specified radius. As long as objects are reasonably near that radius this poses no problem. It may even look better than what happens if you correctly project something like a rectangle onto a sphere. Check out the Reprojection feature to learn more about dealing with that.

Moving the virtual camera only changes how the rendered image looks. The (intended) distortion from the off-center virtual camera is corrected while the mapping is calculated. As long as all geometry is on the sphere, the distortion is not noticeable in the final image, except for a hopefully better distribution of the rendered pixels. The "DebugRendering" property may be used to see how the rendered image looks without the mapping applied; this can give some insight into the pixel distribution and into distortions coming from objects not being on the sphere.

See also:
  • Previs Nodes
  • Previs Screen
  • Render Setup Editor
  • Composition Layer
  • Composition Projector
  • Composition List
