Documentation / Capture Suite

Table of Contents

Getting Started

Installation Requirements

Capture Suite requires the Microsoft Visual Studio C++ Redistributable, which is not bundled with its installer. Download it here.

We also require the Azure Kinect SDK, available here.

Ensure your Azure Kinects are up-to-date with the latest firmware. Instructions on how to do this can be found here.

Starting the Capture Suite


Capture Suite requires a license key. We support both offline and online activation. For online activation, paste the key into the license entry field when prompted. For offline activation, launch Capture Suite, enter the license key you were given, then select "Offline Activation Request". You will be prompted to save a .txt file. Send the .txt file to Soar, and we will send back a .dat file. Return to Capture Suite and click "Activate Offline License". You will be prompted to load the .dat file, and the license will activate.

Main Screen

When launching the Capture Suite, all of your connected cameras should show up on the camera tab bar at the very top of the window. The camera tabs are color coded.

  • Green: Calibrated
  • Yellow: Calibrating
  • Red: Not Calibrated

These tabs also show notifications if a color or depth camera is disabled or if the camera's temperature is outside the acceptable range. The tabs can be reordered, but the order does not persist across Capture Suite launches. If a connected camera is missing, click the "Refresh" button on the capture tab. If the camera is plugged in but still does not appear, it is not communicating properly with your computer; you may have a USB bandwidth issue or a faulty USB extension cable. Check the tips and tricks section for further troubleshooting.

Clicking "Log", found on the capture tab, will open a window flagging potential issues as you use the Capture Suite.


Before you can record content, you need to calibrate your cameras. Ideally, this process is done each time you use the Capture Suite. Even if cameras do not move, you might find calibration can drift a little after people repeatedly walk in the area.

Calibration in this context refers to the process of computing the camera extrinsics, or where they are in space relative to the calibration cube — including how they're oriented.

Things to Check Before Calibrating

  • Are lens flares visible on the color camera feed? If so, adjust the cameras and lights so that they are not.
  • Are the cameras more or less vertical? Straighten them if not.
  • Are you in a room with a lot of fluorescent lighting? If so, turn the lights off and use LEDs. If that's not possible, turn off as many as you can; calibrating in dim lighting is fine provided you have enabled adaptive thresholding.

Performing the Calibration

Enabling Hardware Sync and Setting Delay

We recommend using hardware sync, which dramatically lowers depth noise and results in a more accurate calibration.

To use hardware sync, first ensure that your sync cables are wired correctly. Your primary camera should have a 3.5mm sync cable inserted into only the sync out port. Your last subordinate camera should have a 3.5mm sync cable inserted into only the sync in port. Every other camera in the sync chain should have a 3.5mm sync cable inserted into both the sync in and sync out ports.

If you're using hardware sync, you must also set the "delay from primary" setting for each camera which offsets the emissions and depth exposure to avoid cameras interfering with one another. "Delay from primary" is found in the color controls section (within the color section) inside each camera tab.

Delay From Primary

Soar recommends an offset of 160 microseconds. Your primary camera should be set to 0, the next camera in your sync chain to 160, the following camera to 320, and so on. The above picture shows the seventh camera in a chain of 8, which is set to 960. After setting these values, save your settings in the profile section on the capture tab.
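The per-camera delays can be sketched as simple arithmetic. The function below is illustrative (it is not part of Capture Suite); the 160-microsecond step and the mode-dependent maximums (1450 microseconds for most modes, 2390 for 1024 x 1024) come from this documentation.

```python
STEP_US = 160  # Soar's recommended delay step, in microseconds

def delay_from_primary(camera_index: int, depth_mode: str = "default") -> int:
    """Return the "delay from primary" value (microseconds) for the camera
    at this position in the sync chain (0 = primary)."""
    max_us = 2390 if depth_mode == "1024x1024" else 1450
    delay = camera_index * STEP_US
    if delay > max_us:
        raise ValueError(f"delay {delay} us exceeds the {max_us} us maximum for this mode")
    return delay

# Eight cameras: the primary is 0, the seventh camera in the chain is 960.
delays = [delay_from_primary(i) for i in range(8)]
print(delays)  # [0, 160, 320, 480, 640, 800, 960, 1120]
```

The check against the mode maximum matters with long chains: at 160 microseconds per camera, a chain of ten or more cameras would exceed the 1450-microsecond limit for most modes.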

Hardware Sync

Use the "Enable hardware sync" checkbox on the capture tab to turn it on. Confirm hardware sync is running correctly by inspecting the infrared view for each camera tab. You should not see large pulses of light from any camera; such pulses indicate interference (see image below). If you notice your rendering speeds decrease with hardware sync enabled, you may have to adjust your RAM timings in the BIOS. To diagnose potential sync issues, check the tips and tricks section at the bottom of this documentation.


Now you are ready to start calibrating. Head to the calibration section on the capture tab. You must accurately measure your calibration cube and input the marker width, preferably in millimeters. If using a Soar calibration cube, the default values will work, but you will want to measure the cube to confirm. Adaptive thresholding will be useful if you are calibrating in a brightly lit environment or a darker environment. Keeping the adaptive threshold value at 0 is ideal for most use cases. In brighter environments, you may want to use a negative value if your calibration is not yielding quality results. In darker environments, you may want to use a positive value.


The default cube width of 216.1mm should be correct, but confirm by measuring the width of your calibration cube. The top marker offset is 148.3mm; confirm this by measuring from the center of a side marker face straight up to the center of the top marker face. Click "calibrate all cameras" to start calibrating. This process takes about one minute if your "required samples" setting is 20; increasing this value adds time to calibration but may improve results. A yellow camera tab means that camera is currently calibrating. During calibration, you can click into a camera tab and check the color feed: you will see a marker outline, and it is important that this outline remains steady and does not bounce around. A lower corner camera can see two markers; the marker it uses will have a colored dot. Once all tabs turn green, calibration is complete.

After calibrating, click the preview button to see your calibrated content.

Accessing the Settings and Calibration Data

Settings for Capture Suite are stored in %appdata%/Soar as JSON. AppData is a hidden folder, so type that path (%appdata%/Soar) in Windows File Explorer to easily access it. Here, you will find Capture Suite settings, calibration extrinsics, calibration pinhole intrinsics, world view projections, the color info structure, and the depth info structure.
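Because the settings are plain JSON, they can also be inspected programmatically. The sketch below is illustrative (the helper function and file layout are assumptions; only the %appdata%/Soar location is from this documentation) and loads every JSON file it finds in the folder:

```python
import json
import os

def list_settings(settings_dir: str) -> dict:
    """Load every JSON file in the given settings folder into a dict
    keyed by filename."""
    loaded = {}
    for name in sorted(os.listdir(settings_dir)):
        if name.endswith(".json"):
            with open(os.path.join(settings_dir, name), encoding="utf-8") as f:
                loaded[name] = json.load(f)
    return loaded

# On the capture PC, the folder resolves from the %APPDATA% variable:
# settings = list_settings(os.path.join(os.path.expandvars("%APPDATA%"), "Soar"))
```

This is handy for backing up a known-good calibration or diffing settings between sessions.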

Camera Configuration

Top Of Camera Screen

Within a camera tab, you will be able to access and modify a variety of controls. To start off, you can give a camera a different name other than its serial number. The serial number is always visible next to the camera name.

You can switch between the color, depth, and infrared views. You can also choose to calibrate a specific camera individually rather than calibrating all cameras simultaneously. During calibration, a button appears next to the calibrate camera button when the camera is calibrating so that you can cancel calibration at any time.



Within the color section you can enable and disable the camera and change its resolution. While each camera can have a unique resolution, if one camera uses 3072p (which is capped at 15 FPS), all cameras will be capped to 15 FPS. Prior to making a selection, you can check "apply to all" to apply the resolution setting to all connected cameras; the same applies to enabling/disabling multiple cameras. Ensure Hardware JPEG Decoding is always enabled.

Color Controls

Inside the color section live the color controls and the color post-process controls. The post-process controls affect your content in a slightly different manner than the Azure Kinect color controls, and they can also be applied to raw captures in post. The regular color control settings can be applied to all connected cameras provided you select apply to all prior to modifying settings. Soar recommends using manual exposure and manual white balance to help match colors between cameras. Unless otherwise noted, the default settings will be fine for most environments. For details on the "Delay from Primary" setting, see the section "Enabling Hardware Sync and Setting Delay" above.

  • Exposure (with auto option): sets exposure time and gain based on lighting conditions of environment. When auto is enabled, the Exposure time and Gain values will be set for you. The unit of measurement is microseconds. Soar recommends this to be set to manual (prior to hardware sync and calibration) and potentially fine-tuned based on recording requirements.
  • Gain: in combination with exposure, gain will increase the exposure of the sensor color video. This setting is hidden when auto exposure is enabled.
  • White Balance (with auto option): sets recommended color temperature based on lighting conditions of environment. There is an Auto option. Soar recommends this to be set to manual (prior to hardware sync and calibration) and fine-tuned for your environment.
  • Brightness: adjusts overall brightness of sensor color video.
  • Contrast: adjusts overall contrast of sensor color video.
  • Saturation: adjusts overall saturation of sensor color video.
  • Sharpness: accentuates fine color detail represented in sensor color video.
  • Backlight Compensation: can be enabled if you are shooting in a low or inconsistently lit environment.
  • Powerline Frequency: setting to prevent flickering or banding seen in video that is not compatible with the AC frequencies of the capture space; 60Hz is most common in North America, but most other countries have an AC frequency of 50Hz.
  • Delay from Primary: sets the emission and depth exposure delay for each subordinate camera; unit of measurement is microseconds. This is used to prevent interference between the cameras. Proper configuration is required to use hardware sync. Soar recommends a delay multiple of 160 for your sync chain (primary camera is 0, second camera is 160, third camera is 320, etc.). The maximum delay from primary setting for most modes is 1450 microseconds, while the maximum setting for 1024 x 1024 is 2390 microseconds.
  • Color Post-Process Controls: a separate pass that comes after the color is set on the Azure Kinects, helpful in grading captures after they've been recorded.



The depth section, much like the color section, allows you to enable/disable a camera and modify its resolution. While each camera can have a unique resolution, if one camera uses the 1024 x 1024 mode (which is capped at 15 FPS), all cameras will be capped to 15 FPS. These settings can be applied to all connected cameras, provided you select apply to all prior to making your change.

The Capture Suite has a hole-filling feature which allows you to improve your capture quality. Basic is the default option. This setting can be applied to all cameras if you select apply to all prior to selecting the option.

"Direction" allows you to fine-tune your hole filling. Selecting near camera will have the foreground spread out, whereas selecting far camera will have the background spread out; far camera is the default setting.

Camera Geometry

Inside the camera geometry section you will be able to modify the near plane and far plane, which are measured in meters. You should not have to modify these settings.

The near plane is the minimum distance at which the camera will capture; anything closer than this value is discarded. The default value is 0.050.

The far plane is the maximum distance at which the camera will capture; anything past this value is discarded. The default value is 10.000.
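Conceptually, the two planes define a clipping range on each depth sample. The sketch below is not Capture Suite's actual implementation; it just illustrates the described behavior with the default values:

```python
from typing import Optional

NEAR_PLANE = 0.050   # default minimum capture distance, meters
FAR_PLANE = 10.000   # default maximum capture distance, meters

def clip_depth(depth_m: float) -> Optional[float]:
    """Return the depth sample unchanged, or None if it falls outside
    the [near, far] range."""
    if NEAR_PLANE <= depth_m <= FAR_PLANE:
        return depth_m
    return None

samples = [0.01, 0.5, 2.3, 12.0]
kept = [d for d in samples if clip_depth(d) is not None]
print(kept)  # [0.5, 2.3]
```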

Device Statistics

The device statistics section has pertinent information about your connected camera. The most important bit of information in this section is regarding hardware sync. Your sync input/output, sync status, and sync state for the camera will be shown here.




In order to capture audio, you must connect a microphone to the PC via a USB audio interface; the microphone can be wireless or wired. Within the audio section, select the capture device that corresponds to the USB audio interface, then select enable audio capture. Once you have headphones connected, ensure they are selected in the playback device drop-down and select start monitoring; you should be able to hear the audio. If there is a sync issue, you most likely have a hardware issue; consult the tips and tricks section. The input gain slider modifies the audio level for the recording. The preview gain slider acts only as a volume control when previewing your audio; it does not affect the recorded audio.

If recording raw, confirm you have selected raw audio in the output section. You can select a raw capture format within the audio section: IEEE Float or PCM (16 or 32 bit). When importing the raw file after recording, import your SRD file normally and the corresponding WAV file will be brought in. You can also import custom WAV files recorded externally.

Checking "enable 3D audio playback" sets a flag to default to spatialized audio features when playing back with our Unity SDK.

In order to route audio from other mixing programs into the Capture Suite, you must install Jack Audio. The Jack Audio workflow is as follows:

  • Launch Jack Audio Connection Kit.
  • Start Jack (should say started).
  • Open settings and ensure sample rate is 44100.
  • Ensure interface is default.
  • Confirm Jack Audio is still running.
  • Setup Jack Audio in your mixing program then launch Capture Suite.
  • Within the Capture Suite, enable audio capture, select Jack Audio as capture device, and start monitoring.
  • Return to Jack Audio.
  • Open patch bay.
  • Add input.
  • Select the Capture Suite as Client.
  • Add plug.
  • Add output (your mixing program).
  • Add plug.
  • Save.
  • Open up graph.
  • Ensure your mixing program has its output linked to Capture Suite Jack Audio input.
  • Return to Capture Suite.
  • Play your mixing app or get ready to record - you should hear audio in your headphones if you are monitoring.
  • Select preview and then you are ready to record.

Usage with Green Screen

Chroma Key

Soar does not require or recommend a green screen in order to capture volumetric content. In fact, it could potentially hinder your output quality due to green spill. If you are using a green screen for other reasons, Capture Suite has a chroma key setting found within the volumization rendering section to aid in removing any green spill on the mesh. This setting will allow you to choose a color that best represents your green screen.

  • Gain: how strongly pixels are de-weighted based on their chroma distance from the key color; also controls de-saturation.
  • Bias: the base-level cut-off; how far a color must be from the selected chroma key before it is affected.

These settings can be set both prior to recording and on playback using our Unity SDK. If recording raw, this setting can be set after capture.
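One way to picture the two controls is as a weighting function over each pixel's distance from the key color. The sketch below is purely hypothetical (Capture Suite's actual math is not published): bias sets where the cut-off starts, and gain controls how steeply the weight ramps up away from it.

```python
def chroma_weight(distance: float, gain: float, bias: float) -> float:
    """Illustrative weighting: 0.0 removes a pixel, 1.0 keeps it untouched.
    'distance' is the pixel's chroma distance from the selected key color."""
    w = gain * (distance - bias)   # bias shifts the cut-off, gain sets the slope
    return max(0.0, min(1.0, w))   # clamp to [0, 1]

# A pixel very close to the key color is removed; a distant one is kept.
print(chroma_weight(0.05, gain=4.0, bias=0.1))  # 0.0
print(chroma_weight(0.60, gain=4.0, bias=0.1))  # 1.0
```

Pixels between the two extremes are partially de-weighted, which is why gain also influences how aggressively green spill is de-saturated near the key color.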

Saving a Compressed Capture

Compressed Capture

If you are happy with your preview and you want to record a compressed capture, head to the output section. Type in a capture name, select a capture path, then check only the "compressed capture" checkbox. The default settings here should suffice in most cases. Click "preview" then "record". When you are ready to finish recording, click "stop". Your compressed capture will be saved in the specified directory, ready for playback in Capture Suite or our Unity SDK.

Saving a Raw Capture

Raw Capture

If you would like to record a raw capture, head to the output section and select only the "raw capture" checkbox. Note that this raw capture will have a very large file size but provide flexibility to re-process after the content has been shot. You can also export to compressed from raw, as well as a variety of mesh sequences. Click "preview" then "record". When you are ready to finish recording, click "stop".

Saving a Textured Mesh

In order to generate a mesh export (OBJ & PLY), you must first record a raw capture, then select "mesh output" before loading the file back in; ensure only one output format is selected.

Playing Back a Compressed Capture

Compressed Capture Files

Head to the folder that has your recording. You will note a variety of files, including an mp4, sgv, m4a (if you are recording audio), and a handful of m3u8 files. If you want to load a recording and play it back, select "load recording" inside the Capture Suite. Select the desired master m3u8 file, then click "play".

Capture Suite will append seconds since epoch 1/1/2020 midnight GMT to the filename. This is done so that captures will never be overwritten.
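The suffix is easy to reproduce if you need to predict a filename. In the sketch below, only the epoch (midnight 1/1/2020 GMT, whole seconds) is from this documentation; the function name is illustrative.

```python
from datetime import datetime, timezone

# Capture Suite's filename epoch, per the documentation above.
SOAR_EPOCH = datetime(2020, 1, 1, tzinfo=timezone.utc)

def capture_suffix(now: datetime) -> int:
    """Whole seconds elapsed since midnight 1/1/2020 GMT."""
    return int((now - SOAR_EPOCH).total_seconds())

# e.g. a capture started exactly one day into 2020:
print(capture_suffix(datetime(2020, 1, 2, tzinfo=timezone.utc)))  # 86400
```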


Your captures are ready to be imported into Unity for playback on a variety of devices. Check out the Unity Package documentation to find out more.

Playing Back a Raw Capture

Raw Capture File

Head to the folder that has your recording. You will note one srd file.

Import Raw Capture

In the Capture Suite, click into the "import raw capture" section. You can load the srd file here. Prior to loading your capture, ensure you have storage space for your export, and note the settings in this area. Select "display only" if you just want to watch your raw content. "Process with display" will process the capture per your export settings upon load; if using it, be sure to select the desired file output within the output section - either compressed capture, mesh (OBJ/PLY sequence), MVE, or point cloud PLY. Check "start paused" if you would like to adjust your capture prior to export. Ensure you resubmit the frame (or press R) after every change to view your changes.

Raw playback mode allows you to adjust settings similarly to when you originally made the capture, including setting the capture's bounding box and choosing start and end points. Ensure you resubmit your frame whenever you make a change to see the difference visually.

The processing for both meshes (OBJ/PLY sequences) and MVE may take a little bit of time. Capture Suite may look like it is frozen, but don’t worry - it's churning.

Streaming Locally

Local Stream

The firewall on your capture PC should be disabled to allow connections for a live stream on your local network.

In order to stream on your local network to a device, you must select both local server and compressed capture within the output section after you set a capture name and capture path. The default port of 8080 should suffice. Then select "preview and record". On the device that you want to stream to, enter the streaming URL: it starts with http://, followed by the local IP address of the capture PC (found by going to network connections on the PC, clicking into adapter settings, and viewing the IPv4 address), then :port/capture-name_seconds-since-epoch_master.m3u8.

The full capture name, appended with seconds since epoch 1/1/2020 midnight GMT, can be found within File Explorer at the capture path selected in the output section as soon as you are recording/streaming.
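Putting the pieces together, the streaming URL can be assembled like this. The IP address, port, capture name, and epoch-seconds values below are placeholders; they must match your capture PC and the files it writes to the capture path.

```python
def stream_url(ip: str, port: int, capture_name: str, epoch_seconds: int) -> str:
    """Build the local-stream URL in the form described above:
    http://<ip>:<port>/<capture-name>_<seconds-since-epoch>_master.m3u8"""
    return f"http://{ip}:{port}/{capture_name}_{epoch_seconds}_master.m3u8"

print(stream_url("192.168.1.50", 8080, "demo", 123456789))
# http://192.168.1.50:8080/demo_123456789_master.m3u8
```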

Tips and Tricks


  • If you notice artifacts such as the ceiling within the bounding box after adjusting your bounding box settings, you may need to increase your minimum coverage setting.


  • Audio Failed to Open WMF Audio Device - the issue is most likely other software taking exclusive control of your connected audio device. Ensure no other program is using it, or disable exclusive mode. To disable exclusive mode, right-click the Speaker icon on the Windows taskbar and select Open Sound Settings. Click Device Properties underneath Choose Your Output Device, then click Additional Device Properties under related settings. In the properties window, click the Advanced tab and uncheck "Allow applications to take exclusive control of this device". Click Apply, then OK.

  • Be sure to test and verify your audio path outside of the Capture Suite.

  • If audio is not synced properly, you have a hardware issue. Consult the hardware section below.


  • Environment

    • There should be no IR reflections.

      • Check infrared views of all cameras and ensure you do not see pulses of light coming off of areas such as glass, television screens, and picture frames.
    • If you have multiple calibration cubes, ensure that only one can be seen in the camera views. Another cube in the scene, even if it is 20+ feet away, can potentially be picked up and will yield an incorrect calibration.

    • Be wary of cameras pointing at room corners and at shiny objects as this can lead to multi-path interference.

    • Watch out for lights in the direct background of the cube, either in the ceiling or on the ground.

    • Keep an eye out for invalid pixels in your environment - black pixels in the depth view of a camera. Pixels become invalid when there is not enough dynamic range, when there is multi-path interference, or due to fast motion. If your floor suffers from invalid pixels, consider putting down a carpet that is light and matte; a carpet that is too dark in IR will crush the dynamic range and add noise. Invalid pixels also cause bad depth and depth noise.

  • Lowering the stand on which the calibration cube is placed may yield better results.

  • Software

    • If your content is super skinny, one of the cameras may be mis-calibrated. Enable the point cloud view to find the camera that is the culprit.
    • Ensure the marker outlines (viewable in the color camera feed for each camera during calibration) do not jump around or extend past the marker.

Capture Room

  • Temperature - we recommend a temperature-controlled room with quality airflow, set between 10 and 25 degrees Celsius. Temperatures towards the upper bound, and above it, have been shown to reduce calibration and capture quality.

  • If you see a lot of invalid pixels in your depth view in the background or on the floor, this may cause worse calibrations and captures. Enshrouding your capture area with curtains, and laying down a rug, should mitigate this issue.

Corrupt JPEG

  • Reboot the Camera - if this fixes your corruption issue, then the camera was in a bad state.

  • Reboot the PC - if this fixes your corruption issue, then it was caused by a random Windows issue.

  • Isolate the camera having issues: connect only one camera to the PC and verify whether you still receive the error. If this fixes the corruption issue, you may have a USB bandwidth issue, which is solved by using an external USB PCIe card as referenced in the hardware requirements.

  • Remove the USB extension cable and test with a direct connection - if this fixes your corruption issue, then it was caused by a faulty USB extension cable.

  • Ensure your external USB PCIe cards are powered by SATA cables.

  • Confirm your PC has an appropriately rated power supply and you are not under-powering the computer.

  • Confirm your hardware specs are included in the hardware requirements list from Soar.

  • Ensure your RAM timings are set correctly.

  • Make sure that your configuration has the maximum number of supported memory channels populated.


  • If exporting to a mesh is very slow, ensure there is no floor in your scene. Also, a higher volumization resolution will result in slower exports.

  • Confirm your output selection and storage space prior to exporting from a raw capture.


  • If you encounter rendering speed issues, you may have an issue with your RAM clock speeds. Use CPU-Z to look at memory settings; if you're running in single channel with the lowest possible memory clock, you're going to encounter issues. In order to modify this, you must do so within the BIOS.

  • If there is a flicker when viewing content in preview or a flicker in the color camera feed, open the log. If there is a corrupt JPEG error, this might mean the USB extension cable for that camera needs to be replaced. This can also mean your powerline frequency setting is not correct; 60Hz for North America, 50Hz for most other countries. To view other corrupt JPEG causes, head to the Corrupt JPEG section of this guide.

  • Memory Channels/RAM

    • Make sure that your configuration has the maximum number of supported memory channels populated. Failure to do so can cause performance problems.

    • If rendering is slow, ensure your RAM timings are set correctly within the BIOS. Also, confirm there are no corrupt JPEG errors in the log as this means frames are not arriving on time and can cause rendering issues.

  • StarTech Card

    • Can shift in its socket, sometimes due to temperature changes - may need to be reseated if experiencing issues such as corrupt JPEGs or cameras not appearing.

    • BIOS - ensure the BIOS is set up correctly for the cards. If the cards are forced to run in PCIe 1.0 mode, it can explain performance issues. Also, if using our required motherboards, confirm the PCIe_3 Lane Configuration (non-GPU configuration) is set to x4 + 4. Failure to do this can cause performance issues or cameras not appearing.

  • External Hard Drive

    • Soar does not recommend capturing audio or video to an external hard drive.
  • USB Extension Cables

    • Lack of strain relief, carpeted room, or an exposed cable connector could destroy a cable and yield corrupt JPEG errors in the log.
    • Be mindful of electrostatic discharge, especially if you are repeatedly touching metal microphone stands while pulling out, or putting your hand near, the plugs of cables.
  • If not all of your connected cameras appear in the Capture Suite, and this persists after clicking the refresh button, you most likely have a USB bandwidth issue. Consult the recommended hardware and ensure you are using the recommended StarTech USB 3.0 PCIe card.

Hardware Sync

  • Check infrared views to ensure there are no pulses of light coming from any cameras.

  • Ensure that delay from primary is set for each camera in accordance with documentation.

  • If camera feeds stop when enabling hardware sync, it could be one of a few things. First, ensure hardware sync is wired correctly for each camera and that the cameras are all updated to the latest Azure Kinect firmware. There could be a bad USB extender, which is typically noted by corrupt JPEGs in the log. It can also mean that there is a USB bandwidth issue. Finally, try unplugging your cameras and re-plugging them back into the computer one by one. Sometimes a PC shutdown will help as well.

  • If hardware sync is enabled and you still notice pulses of light from cameras in the infrared view, increase your delay from primary setting until the pulses of light cease.


  • Don’t use fluorescent lighting - use LED lighting instead.

  • Confirm there are no lens flares on any of the cameras.

  • Check the camera stands in infrared mode and ensure there are no bright spots on them which can lead to multi-path interference.

  • If calibrating in a very bright space or a very dark space, consider using adaptive thresholding.

    • Just enabling this setting, and keeping it at 0, may do the trick.

      • In dark areas, you might want to use a positive value to make the light areas come through.
      • In light areas, you might want to use a negative value to make the darker areas come through.


  • If your content does not appear, and you are sure it's not a hardware issue, check whether your volumization resolution is set to 256 and the maximum vertices to 262144. If your volumization resolution is higher, you will need to increase your maximum vertices until your content reappears.

  • If you notice lots of noise (ripples) on the calibration cube, this is due to camera placement: one or more of your high cameras can see another marker at a glancing angle, which may be very slight. Ensuring each high camera looks directly at its marker should eliminate this issue.