The NVIDIA Holoscan Sensor Bridge is a platform for low-latency camera capture transmitted over 10 Gbit/s Ethernet. It supports stereo capture, where each camera stream is transmitted over a separate Ethernet connection to increase the available bandwidth.
It is based on two FPGAs from Lattice: the CrossLink-NX and the CertusPro-NX. The CrossLink-NX is equipped with high-speed MIPI interfaces, useful for camera connections, while the CertusPro-NX implements the IP cores for Ethernet packet preparation and interconnection. For more information about this platform and its possible applications, please visit our previous blog Leveraging Low Latency to the Next Level with the Holoscan Sensor Bridge.
In this blog, we will walk through the process of getting started with the NVIDIA Holoscan Sensor Bridge on an NVIDIA Jetson AGX Orin, connected over a single 10 Gbit/s Ethernet link for single-camera capture.
Getting the Hardware Components
To get started, please make sure to have the following hardware components:
An NVIDIA Jetson AGX Orin developer kit
A USB-C power supply for the NVIDIA Jetson AGX Orin
A USB-C power supply for the Holoscan Sensor Bridge (2A works)
An Ethernet cable: Category 6 or better
A micro-USB data cable
A DisplayPort cable (optionally with an HDMI adapter for HDMI monitors)
A monitor or display
A keyboard and mouse with USB connection
Preparing the Software Environment
The first step is to prepare your Jetson for the Hololink framework. Hololink can be considered an extension of the Holoscan framework, providing functionality for video capture along with custom modules specialised for image signal processing, such as gamma correction, auto-white balancing, and black level correction.
First, flash your NVIDIA Jetson AGX Orin with JetPack 6.0 using the NVIDIA SDK Manager. Connect a micro-USB cable to the Jetson's debugging port and follow the steps in the NVIDIA SDK Manager. Make sure the following ports are connected:
The debugging port to the PC (with NVIDIA SDK Manager).
The ethernet connection to a network with Internet access.
The power connection to the power supply.
Avoid connecting other ports, since they might interfere with the flashing process.
It is highly recommended to select software packages such as CUDA and NVIDIA Container Runtime since they are needed for the Holoscan and Hololink frameworks to work, as presented in the following picture:
For more information, please visit our Jetpack installation guide.
Preparing the Hardware Environment
For the hardware environment, please make sure to have the following setup:
The NVIDIA Jetson AGX Orin devkit connected to the power supply.
The ethernet cable connected between the Holoscan Sensor Bridge and the Jetson.
A micro-USB cable connected between a PC and the Jetson.
A display, mouse and keyboard connected to the Jetson.
The Holoscan Sensor Bridge connected to the power supply.
The following illustration shows the setup described above:
Bringing Up the Holoscan and Hololink Frameworks
Once the software environment is complete and the hardware setup is ready, a few preliminary steps are needed to configure the environment before launching the containers.
Cloning the Repository and Preparing the Docker Image
You may need to install git-lfs to clone and initialise the repository. Install it with:
sudo apt-get update
sudo apt-get install -y git-lfs
git lfs install
Then, clone the repository:
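The commands below are a sketch of this step; the URL assumes the public GitHub repository for the sensor bridge software, so double-check it against NVIDIA's documentation:

```shell
# Assumed repository URL -- verify against NVIDIA's official documentation
git clone https://github.com/nvidia-holoscan/holoscan-sensor-bridge.git
cd holoscan-sensor-bridge
# Fetch the large binary assets tracked by git-lfs
git lfs pull
```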
Next, log in to the NVCR Docker registry (nvcr.io). An API key is needed in order to log in; please follow this link to get one. Then, log in:
docker login nvcr.io
At the username prompt, enter $oauthtoken; at the password prompt, paste the API key generated previously.
Finally, create the image:
bash docker/build.sh --igpu
It will take some time to complete.
Note: If there are issues with the network, disconnect the Ethernet connection between the Jetson and the Holoscan Sensor Bridge, then connect the Jetson to a wireless network with Internet access.
For further information and possible issues, visit our developer wiki.
Configure the Network
The network needs some preliminary configuration. Make sure the Jetson is connected to the Holoscan Sensor Bridge through the Ethernet cable before proceeding.
First, configure the sockets to support large buffers:
echo 'net.core.rmem_max = 31326208' | sudo tee /etc/sysctl.d/52-hololink-rmem_max.conf
sudo sysctl -p /etc/sysctl.d/52-hololink-rmem_max.conf
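After applying the setting, you can confirm that the kernel picked up the new receive-buffer limit:

```shell
# Should report the value configured above (31326208)
sysctl net.core.rmem_max
```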
Configure a static IP Address:
sudo nmcli con add con-name hololink-eth0 ifname eth0 type ethernet ip4 192.168.0.101/24
sudo nmcli connection up hololink-eth0
Note: Connecting through a network router or switch is discouraged; use a direct cable between the Jetson and the sensor bridge.
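With the static address in place, you can sanity-check the link before launching any container. The address below assumes the sensor bridge's factory-default IP on this subnet; consult the board's documentation if it has been changed:

```shell
# 192.168.0.2 is assumed to be the sensor bridge's default address
ping -c 3 192.168.0.2
```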
Launching the Sample Capture Application
The generated image compiles the Holoscan and Hololink frameworks assuming that they run on a Jetson. This means the image is built without DPDK support, given that the Jetson does not have a DPDK-compatible network interface card. Moreover, the NVIDIA Jetson AGX Orin has only one Ethernet port, so it supports only single-stream (single-camera) capture. The following example runs under these limitations. For more information, please visit our developer wiki.
First, in the desktop environment of the Jetson (interacting with the GUI), open a terminal and configure the X redirection:
xhost +
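Note that xhost + disables X access control for every host, which is convenient but very permissive. If you prefer a narrower rule, granting access only to local (non-network) clients is enough for the container:

```shell
# Allow only local clients to connect to the X server
xhost +local:
```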
Then, launch a Docker container with the Holoscan image built previously. To do so, change directory to the root of the repository and run:
bash docker/demo.sh
Finally, inside the container environment, run the demo:
python3 examples/linux_imx274_player.py
The demo looks like:
For more information, please visit our developer wiki.
Important Remarks
For stereo capture, the processing platform (i.e. the NVIDIA Jetson) must have two separate network interfaces, given that each camera stream is delivered through a separate Ethernet connection.
The NVIDIA Jetson AGX Orin developer kit does not include a DPDK-compatible NIC, so the framework falls back to the Linux socket stack, which increases the glass-to-glass latency. To use DPDK, connect a DPDK-compatible card or use a custom carrier board with a compatible NIC.
Expect More Information from Us
The next blog post will measure the glass-to-glass latency, a key metric for critical applications, and show how to lower it by using our CUDA ISP.
If you want to know more about how to leverage this technology in your project: Contact Us