Documentation / Unity Package

Getting Started

After recording in the Capture Suite, you are ready to create an experience in Unity. Soar supports Unity 2020.2 and newer. If you are building on Windows, ensure your system is 64-bit; Soar does not support 32-bit Windows. Depending on the output resolution you chose for your compressed capture in the Capture Suite, you may be able to run multiple instances in your scene. At the default 2048 x 1024 resolution, you can typically run three to five instances.

Compatibility

Unity provides the option of selecting from two render pipelines — HDRP and URP — as well as the standard renderer. On Apple devices, we support all three of these. On Windows, we support URP and standard (not HDRP).

There are multiple graphics APIs to choose from in Unity. At the moment, we support OpenGL Core exclusively — not Direct3D or Vulkan. This precludes us from being able to support HDRP on Windows for the time being.

We require a modern NVIDIA GPU on Windows that supports H.264 hardware decoding. Check if your GPU is compatible by clicking here. Please make sure your drivers are up-to-date, as older drivers have been reported to cause playback issues.

The compatible build devices for the Apple Devices Unity Package are currently iPhone X and newer running iOS 13+, all iPad Pro models running iPadOS 13+, and both 1st and 2nd Generation Apple TV 4K running tvOS 14+.

For deploying a scene with three simultaneously-playing captures to devices, we recommend the iPhone 12 Pro, iPhone 12 Pro Max, iPhone 13 Pro, iPhone 13 Pro Max, and all iPad Pro models due to memory requirements. All other supported iPhone models, as well as supported Apple TV models, support up to two simultaneously-playing captures. Windows machines and Macs can support up to three.

Finally, to avoid any script compilation errors, the Mathematics and High Definition RP packages MUST be included in your project.

For M1 Devices

Unity’s native Apple silicon build of the Editor is still in active development and currently contains a number of bugs that can cause crashes, so we recommend using a Rosetta build of the Unity Editor.

Some of our users have encountered an issue loading SoarUnityPluginMac.bundle on Apple Silicon (M1) devices. To fix this, open Terminal and run the following command:

xattr -d com.apple.quarantine <path to bundle>/SoarUnityPluginMac.bundle

Importing the Package (Apple)

To get started, import the Soar Unity Package into a new Unity Project:

  • Click Assets.

Click Assets

  • Scroll down to Import Package, then click Custom Package.

Import Package

  • Navigate to the Soar Unity Package and click Open.

Open Package

  • By default, all assets will be imported from the Unity package. Click Import.

Import Package

  • Now you should see all the imported assets in your project's Assets folder.

Imported Package

After importing the Soar Unity Package, we need to allow unsafe code to get full use of the Soar plugin. To allow unsafe code in our Unity project, follow these steps:

  • Click Edit.

Click Edit

  • Scroll down and click Project Settings.

Project Settings

  • Within project settings, click the Player tab.

Player tab

  • In the player tab, expand the Other Settings dropdown.

Other Settings

  • Within the Script Compilation section, check the box titled Allow unsafe code.

Unsafe Code

Importing the Package (Windows)

To get started, import the Soar Unity Package into a new Unity Project:

  • Click Assets.

Assets Tab Windows

  • Scroll down to Import Package, then click Custom Package.

Custom Package Windows

  • Navigate to the Soar Unity Package and click Open.

Open Package Windows

  • By default, all assets will be imported from the Unity package. Click Import.

Import Assets Windows

  • Now you should see all the imported assets in your project's Assets folder.

Assets Folder Windows

After importing the Soar Unity Package, we need to allow unsafe code and make some small changes to our project settings to get full use of the Soar plugin. To make these changes in our Unity project, follow these steps:

  • Click Edit.

Edit Tab Windows

  • From there, open Project Settings.

Project Settings Windows

  • Open up the Player tab in the Project Settings window.

Player Tab Windows

  • In the Other Settings section, you should see a checkbox titled Auto Graphics API for Windows. We need to uncheck this box and set a specific Graphics API to use.

Auto Graphics API

  • After you uncheck the box, Direct3D11 will populate the field automatically. However, we want to change this from Direct3D11 to OpenGLCore.

D3D11 Replace

  • You should only see OpenGLCore in the Graphics APIs for Windows field.

OpenGLCore Set

  • Lastly, within the Script Compilation section of Project Settings, check the box titled Allow unsafe code.

Unsafe Code

Now that we have our settings fully set, we can start using our Volumetric Captures!

Rendering Pipelines

NOTE: We provide materials for each render pipeline provided by Unity. Depending on what pipeline you're using, make sure you use the corresponding material.

Pipeline Materials

Setting up the Universal Render Pipeline (Apple)

Before we start, your Unity project needs to have the Universal RP package included in your project. To include that package, click Window > Package Manager.

From the Package Manager navigate to Universal RP.

URP Package Manager

Our package already includes a configured render pipeline asset to use with the Unity Universal Render Pipeline. To include our pipeline assets with URP, in the top navigation bar click Edit > Project Settings and navigate to the Graphics tab. In the Scriptable Render Pipeline Settings field, add the included Render Pipeline Asset.

URP Graphics Tab

Setting up the High Definition Render Pipeline (Apple)

Before we start, your Unity project needs to have the High Definition RP package included in your project. To include that package, click Window > Package Manager.

From the Package Manager navigate to High Definition RP.

HDRP Package Manager

After including the High Definition RP package, the Render Pipeline Wizard will pop up asking you to fix configuration settings. When this comes up, click Fix All.

Fix All Render Pipeline Wizard

While fixing your configuration settings, the Render Pipeline Wizard will ask you to load or create an HDRenderPipelineAsset. The package already includes a configured asset, so click Load One and select the included HDRenderPipelineAsset.

When your project finishes updating, you should see nothing but green check marks.

HDRP Green Checks

Setting up Universal Render Pipeline (Windows)

Before we start, your Unity project needs to have the Universal RP package included in your project. To include that package, click Window > Package Manager.

From the Package Manager navigate to Universal RP.

URP Package Manager

Our package already includes a configured render pipeline asset to use with the Unity Universal Render Pipeline. To include our pipeline assets with URP, in the top navigation bar click Edit > Project Settings and navigate to the Graphics tab. In the Scriptable Render Pipeline Settings field, add the included Render Pipeline Asset.

URP Graphics Tab

Using the SDK

Offline Content

Now that the Soar Unity Package is imported, create a Streaming Assets folder.

Streaming Assets

Inside this folder, you must also create a VOD folder.

VOD

Inside the VOD folder, drag and drop all 5 files that you exported from the Capture Suite. When creating our volumetric content, we are most concerned with the .m3u8 playlist file.

BIN File

NOTE: If your capture includes audio, make sure to also include your .m4a audio file and the audio manifest file. Captures can use Spatial/3D Audio, which is applied automatically when available; however, this MUST be enabled through the Capture Suite.

Audio Files

After importing the files, you are ready to add the content to your Unity scene. Within the DemoAssets folder there are three prefabs we are going to use for our offline volumetric content. In any scene using volumetric content we will need to add a Volumetric Controller. The Volumetric Controller is used to keep track of each instance of volumetric content. The Volumetric Model and Volumetric Model Apple TV prefabs are very similar, but we will only use the Volumetric Model Apple TV prefab when building to Apple TV devices. Since we are going to be building to iOS in this scenario, we are going to add the Volumetric Model to our scene.

Volumetric Model

Inside the Prefab is a script titled Volumetric Render. There is a text box where you can paste the name of the imported .m3u8 file, excluding the .m3u8 extension.

Canned Script

The Material is automatically set to the volumetric model and is called Model.

Material

Within the Volumetric Render script, there are functions created to facilitate full playback functionality, including Play, Pause and Stop. There is also functionality to enable Auto Play/Auto Loop.

Autoplay/Autoloop

Playback Functions

To access and invoke these playback functions, we first need to get the instance of the Volumetric Render script attached to the gameobject.

public VolumetricRender playbackComponent;

void Start()
{
  playbackComponent = gameObject.GetComponent<VolumetricRender>();
}

Now that we have a reference to the Volumetric Render script, we need the index of the volumetric model we want to control. We have a dynamic list of instances in the Volumetric Render component that can be accessed through the variable instanceRef. Our script determines which volumetric instance to play/pause/stop based on the index passed into the function. We have only one volumetric instance in the scene, so we will pass 0 into the playback function arguments.

  • Pause
    public void PlaybackPause()
    {
      playbackComponent.PauseModel(0);
    }
  • Stop
    public void PlaybackStop()
    {
      playbackComponent.StopModel(0);
    }
  • Play
    public void PlaybackStart()
    {
      playbackComponent.StartPlayback(0);
    }

For an example of how to select individual volumetric instances and control them individually, check out the PlaybackControls.cs script included in the package. It shows how to get the index of an individual volumetric instance when the user clicks on the Volumetric Model prefab.

Scrubbing

Our volumetric playback also supports scrubbing, giving the user more control over their volumetric content. To set up playback scrubbing we first need two things: the length of the volumetric content playing, and the current time of the volumetric content playing. Luckily, two functions already do just that.

  • Get Duration of Content
    playbackComponent.GetFullDuration(0);
  • Get Current Timestamp
    playbackComponent.GetCurrentPosition(0);

REMINDER: We are passing an index of 0 into these functions because we only have ONE instance in our scene.

In our example SoarSampleScene we create a basic scrubbing UI using just the UI Slider object built into Unity.

Scrubbing UI GIF

We use the GetFullDuration function call to set the maximum value of our scrubbing bar and the GetCurrentPosition function to track the time value of the scrubber position.
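As a sketch, that wiring can look like the following. This assumes a `scrubbingSlider` field of type UnityEngine.UI.Slider and that the two functions above return values in units the slider can use directly; check PlaybackControls.cs in the package for the authoritative version.

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ScrubbingUI : MonoBehaviour
{
    public VolumetricRender playbackComponent; // Soar component on the Volumetric Model
    public Slider scrubbingSlider;             // Unity UI Slider used as the scrub bar

    void Start()
    {
        // Size the bar to the clip length (index 0: the only instance in the scene).
        scrubbingSlider.maxValue = playbackComponent.GetFullDuration(0);
    }

    void Update()
    {
        // Keep the handle in sync with playback position.
        scrubbingSlider.value = playbackComponent.GetCurrentPosition(0);
    }
}
```

Attach the script to any scene object and assign both references in the Inspector.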

Next, on the Slider object, we need to add an Event Trigger component so we can trigger functions when we grab and release the scrubber. These triggers will be Begin Drag and End Drag.

Event Trigger Component
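If you prefer to wire those Begin Drag / End Drag triggers from code instead of the Inspector, a minimal sketch using Unity's EventTrigger API follows. The PlaybackControls function names come from the package's script; this setup script itself is illustrative, not part of the package.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

public class ScrubberTriggerSetup : MonoBehaviour
{
    public PlaybackControls playbackControls; // script from the Soar package

    void Start()
    {
        var trigger = gameObject.AddComponent<EventTrigger>();

        // Begin Drag: stop playback while the user holds the handle.
        var beginDrag = new EventTrigger.Entry { eventID = EventTriggerType.BeginDrag };
        beginDrag.callback.AddListener(_ => playbackControls.GetSliderHandle());
        trigger.triggers.Add(beginDrag);

        // End Drag: seek to the slider value and resume playback.
        var endDrag = new EventTrigger.Entry { eventID = EventTriggerType.EndDrag };
        endDrag.callback.AddListener(_ => playbackControls.SeekToTimestamp());
        trigger.triggers.Add(endDrag);
    }
}
```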

In the PlaybackControls.cs provided in the package, we have the functions written that handle the scrubbing functionality.

  • Seek To Timestamp
    public void SeekToTimestamp()
    {
      playbackComponent.SeekToCursor(0, (int)scrubbingSlider.value);
      PlaybackStart();
    }
  • Get Slider Handle
    public void GetSliderHandle()
    {
      PlaybackStop();
    }

With these functions set we now have the ability to scrub to a certain time in our volumetric content.

Scrubbing UI In Use

Manipulation Controls are included in the package so you can rotate, translate and scale the volumetric content.

The volumetric model is configured to appear directly in front of the phone in your AR scene.

ARKit will need to be integrated to view the content in Augmented Reality.

Streaming

For streaming, the workflow is extremely similar. We will be working with the same Volumetric Model prefab, but with one small difference.

Volumetric Model

Inside the Volumetric Render script we will replace our master file name with the URL that the volumetric model is being streamed from.

NOTE: All volumetric content streamed from a URL plays automatically.

Receiver Script

The Material is automatically set to the volumetric model and is called Model.

Material

Loading Captures at Runtime

Along with full playback of captured files, we now offer the ability to load new capture files at runtime with the click of a button. We kept the calls as simple playback functions to keep things streamlined.

  • Load New Clip
    public void LoadNewClip()
    {
      playbackComponent.LoadNewClip(newClipFileName, volumetricIndex);
    }

The LoadNewClip function takes two arguments, the file name or URL of the capture clip you want to load, and the instance index you want to load the capture clip to.
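As an illustrative usage sketch, the call can be wired to a UI Button and input field. The UI field names here are hypothetical, not from the package; only LoadNewClip comes from the Volumetric Render script.

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ClipLoaderUI : MonoBehaviour
{
    public VolumetricRender playbackComponent;
    public InputField clipNameInput; // user types a file name or URL here
    public Button loadButton;

    void Start()
    {
        // Load whatever is typed into instance 0 when the button is clicked.
        loadButton.onClick.AddListener(() =>
            playbackComponent.LoadNewClip(clipNameInput.text, 0));
    }
}
```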

We updated our PlaybackControls.cs script to include a string box where you can place the name of the clip you want to load.

New Clip File Name

Now that our new clip is set, we can play the scene, click the Load Clip button, and you’ll see your volumetric video change immediately!

Load New Clip Sample

Chroma Key

We have added a chroma key feature to our volumetric render component, allowing you to key out specific colors on your volumetric capture.

Looking at the updated Volumetric Render component, we can see the chroma key feature's 4 main settings: Chroma Color, Gain, Bias, and the Override Chroma checkbox.

Chroma Key Settings

Chroma Color is the color that is selected to be keyed out.

Gain controls how strongly pixels are de-weighted based on their distance from the chroma color; it also controls de-saturation.

Bias is the base level cut-off; how far is this color away from the selected chroma key.

Chroma key is disabled by default. If you would like chroma keying applied to your volumetric capture, you must check the Override Chroma checkbox.
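If you want to drive these settings from a script at runtime, a sketch follows. The field names (chromaColor, gain, bias, overrideChroma) are assumptions inferred from the Inspector labels; verify them against the VolumetricRender source shipped in the package.

```csharp
using UnityEngine;

public class ChromaToggle : MonoBehaviour
{
    public VolumetricRender playbackComponent;

    // Field names below are assumed from the Inspector labels; check the
    // actual VolumetricRender fields before relying on this.
    public void EnableGreenScreenKey()
    {
        playbackComponent.chromaColor = Color.green; // color to key out
        playbackComponent.gain = 1.0f;               // de-weighting severity
        playbackComponent.bias = 0.1f;               // base-level cut-off
        playbackComponent.overrideChroma = true;     // chroma key is off by default
    }
}
```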

Included in the package is a capture that has chroma keying applied. Here we can see the model without chroma keying applied.

Capture No Chroma Applied

After clicking the Chroma Override checkbox, we can now see the desired colors applied to our model!

Capture Chroma Applied

Volumetric Capture Preview

To help with faster iteration and development, we added a feature within our SDK to create an animated preview of your volumetric capture. All that’s needed is an extra script on your gameobject.

On the same gameobject that has the Volumetric Render component, click Add Component and search for Create Preview.

Create Preview Script

Looking at the Create Preview component we can see that it is very simple.

Create Preview Component

The Create Preview component will create a fully animated preview of whatever capture is set in your Volumetric Render script. All you have to do is set the Create Preview bool to true, and your capture will begin animating while the Unity Editor is in Edit Mode!

Create Preview Sample

The Create Preview component also gives you the ability to tie your preview to the Unity Timeline by including a Playable Director, to help speed up iteration. To attach a Playable Director, set the "Attach To Director" boolean to true. This will cause an object field to appear within the component, allowing you to set a Playable Director in the Create Preview component.

Attatch Director to Preview

Dynamic Lighting

We provide both unlit shaders and shaders that allow for dynamic lighting for each render pipeline supplied by Unity.

Soar Shader Files

NOTE: On Apple products we support the Core, URP and HDRP pipelines. On Windows we currently support only Core and URP.

The lit SoarShader assets include dynamic lighting, allowing the volumetric model to match the lighting set in the scene. Our shader also lets you adjust ambient lighting color/intensity as well as the color/intensity of the textures themselves, allowing for more customization of the look of your volumetric capture. These values are accessed through the Material applied to the ModelMesh gameobject.

Dynamic Lighting:

Dynamic Lighting

Ambient Color/Intensity:

Ambient Color Intensity

Texture Color/Intensity:

Texture Color Intensity

These three values can be used together to get some very unique looks for your model!

Colored Capture

Timeline Playback

Our volumetric captures can be integrated with the Unity Timeline, allowing for frame-specific sequencing to make volumetric filmmaking even easier!

Included in the Unity Package is a sample scene titled SoarTimelineSample demonstrating our Timeline integration.

In our SoarTimelineSample scene all of our Timeline components and scripts are set up on the Director gameobject.

Timeline Scene

In our sample we have two Volumetric Models in our scene; the Timeline contains two Volumetric Render Tracks (one for each model), and each track has a Volumetric Render Clip for its Volumetric Model.

Each Volumetric Render Clip has a dropdown where you specify the capture file you intend the clip to use. This allows you to create multiple clips on one Volumetric Render Track and load different capture clips on the same track!

Clip Attributes

NOTE: When set in the Timeline, Volumetric Models need the Auto Play and Auto Loop boxes unchecked to allow the Timeline to control model playback.

Unchecked Boxes

In our sample we set the Playable Director component to Play on Awake, so when you press Play in the editor, you will see the cursor move through the Unity Timeline and play each Volumetric Model when it reaches that model's Volumetric Render Clip.

Timeline Playback
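If you would rather not use Play On Awake, the Playable Director can be started from a script instead. This is a minimal sketch using Unity's standard PlayableDirector API, not anything specific to the Soar package.

```csharp
using UnityEngine;
using UnityEngine.Playables;

public class TimelineStarter : MonoBehaviour
{
    public PlayableDirector director; // the Director gameobject's component

    // Call from a UI button (or any other event) to start the Timeline
    // manually instead of relying on Play On Awake.
    public void StartTimeline()
    {
        director.Play();
    }

    public void PauseTimeline()
    {
        director.Pause();
    }
}
```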

We allow scrubbing in the Unity Timeline, letting you move between clips at runtime and step through the Volumetric Models frame by frame.

Timeline Scrubbing

To add your own Volumetric Render Tracks and Volumetric Render Clips go through the following steps.

  • Right click the Track area in the Timeline and add a new Volumetric Render Track.

Add New Track

  • With the Volumetric Render Track highlighted, right click the Clip area within the Timeline and click Add Volumetric Render Clip.

Add New Clip

  • Next, we need to add a third Volumetric Model and connect the Volumetric Render component with the third Volumetric Render Track.

Connect to Track

  • Set the clip to the end and Play the editor; when the timeline cursor hits it, your new clip will play!

Play New Clip

Soar VFX Graph

NOTE: Unity's VFX Graph is currently only supported by the Universal Render Pipeline and High Definition Render Pipeline; VFX Graph will NOT work with Unity's Standard Render Pipeline.

Before using the VFX assets included in the package, Unity's Visual Effect Graph package and Unity's Mathematics package must be included in your project.

We can download both of these right from the Unity package manager.

VFX Graph Package Manager

After downloading the Visual Effect Graph package, we need to set some preferences to get our shader graph working. To set these preferences, go to Unity -> Preferences -> Visual Effects.

VFX Preferences

The package contains a scene titled SoarSampleVFXScene; to see the VFX graph in action, open up that scene.

The VFX graph included in the package is responsible for re-creating the volumetric mesh using a particle system.

VFX Graph Demo

Let us take a closer look at the SoarVFXGraph.vfx asset to get a better understanding of how it is working.

There are 2 major components of the SoarVFXGraph. First we will look at the Spawn and Initialize nodes.

Spawn Initialize Node

In the VFX Spawn node we are passing our volumetric mesh to a vertex count operator, which we then pass to our Spawn node to tell our VFX graph to only spawn one particle for every vertex on our mesh.

In the VFX Initialize Particle node we then place each particle at the same position of the vertex to completely recreate our mesh using the VFX Graph and its particle system.

Now let us take a look at the VFX Graph output node.

Output Node

This complicated node is responsible for taking the data from our SDK and using a custom VFX Shader Graph shader to set the color of each particle to match the color of the rendered mesh. You do not want to change the connected values in this node.

Now that we understand how the SoarVFXGraph works, we can start playing around with the update node.

Update Node

The Update node within the VFX Graph is where we can really start to experiment with the artistic look of our model. We have a Gravity node set in our Update node, but it is disabled. To see the effect gravity will have on your particles, enable the node and see the effect in action.

VFX Graph Gravity

Now our particles fall as soon as they spawn! However, because the lifetime of each particle is set to a random value between 0.25 and 1 second, the particles respawn very quickly and maintain the shape of our model.

We can also add different force operators to our Update node to change how our particles move. Try replacing the Gravity operator with a Turbulence operator and see how that affects our particles.

Update Node Turbulence

Quite a different effect than the constant falling of the Gravity operator.

VFX Graph Turbulence

These built-in force operators can be heavily customized to suit your creative needs!

Build Process

After you are satisfied with your application and want to test it on your iPhone/iPad, you can start the Build process to Xcode.

First we need to add the necessary shaders to our Graphics settings so they will always be included within the project when we build it. To do this click Edit, then Project Settings.

Project Settings

After opening the project settings, go to the Graphics tab.

Graphics Setting

Scroll down to Always Included Shaders. In here, we will add the SoarShader and the ColorCameraDepth shader.

Include Shaders

Now that we have the shaders included in the project, we can officially start the build process.

First, click File, then Build Settings.

Build Settings

Ensure that the correct scene is selected in Scenes in Build; if not, click Add Open Scenes.

Add Open Scenes

Make sure your platform is iOS. If it is not, click iOS, then click Switch Platform.

Switch Platform

Then, click either Build to just build the application or Build and Run to build the application and launch Xcode.

Build

After Unity is finished building the application, Xcode will launch.

Xcode Build

Ensure you have added an Apple ID which can be used to build the application. If not, view the Apple documentation to set up a Developer Account.

Add Developer Account

Attach a compatible device and click the Play button to build the application to the device.

Start Build

NOTE: When building for Apple, there is a chance you will come across this error:

VideoToolbox Error

This error is caused by Unity not linking the VideoToolbox framework to the UnityFramework target generated by Xcode. To solve this, follow these steps:

  • Go to the Build Phases tab in the UnityFramework target.

Unity Framework Build Phases

  • Open the tab titled Link Binary With Libraries.

Link With Libraries

  • Double check whether VideoToolbox.framework is missing. If it is, press the plus button.

Add To Library

  • Add VideoToolbox.framework to your project and build again.

Add VideoToolbox
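These manual Xcode steps can also be automated with a Unity post-process build script. This is a sketch using Unity's UnityEditor.iOS.Xcode API (available when the iOS Build Support module is installed); it is not shipped with the Soar package.

```csharp
#if UNITY_IOS
using System.IO;
using UnityEditor;
using UnityEditor.Callbacks;
using UnityEditor.iOS.Xcode;

public static class LinkVideoToolbox
{
    [PostProcessBuild]
    public static void OnPostProcessBuild(BuildTarget target, string buildPath)
    {
        if (target != BuildTarget.iOS) return;

        string projPath = Path.Combine(buildPath, "Unity-iPhone.xcodeproj/project.pbxproj");
        var proj = new PBXProject();
        proj.ReadFromFile(projPath);

        // Link VideoToolbox against the UnityFramework target (Unity 2019.3+).
        string frameworkTarget = proj.GetUnityFrameworkTargetGuid();
        proj.AddFrameworkToProject(frameworkTarget, "VideoToolbox.framework", false);

        proj.WriteToFile(projPath);
    }
}
#endif
```

Place the script in an Editor folder; it runs automatically after each iOS build.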