Are you looking for a 360 video stitching solution? RidgeRun has spent years launching and continuously improving a stitcher capable of running with low latency on systems with CUDA support. Yes, we support every NVIDIA Jetson platform as well as x86 platforms with discrete GPUs!
Our latest feature? 360 video support!
We have developed a stitching ecosystem that empowers our customers to create their own stitching solutions, for their specific systems, with the following key features:
Panoramas from rectilinear cameras.
360 views from a set of fisheye cameras!
A GUI-based calibration tool.
A public video dataset and 3D models for testing.
Stitching in any direction (vertical, horizontal, or diagonal).
Increase your application view and awareness with CudaStitching!
360 Sample
Take a look at the demo video on the left. It uses VLC to play back an MP4 recording of a 360 video generated by the stitcher. The recording passed to VLC was created with our stitching solution and the following GStreamer pipeline.
gst-launch-1.0 -e -v \
cudastitcher name=stitcher homography-list=$HOMOGRAPHIES \
filesrc location=360-s0.mp4 ! qtdemux ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! rreqrprojector center_x=$S0_C_X center_y=$S0_C_Y radius=$S0_rad rot-y=$S0_R_Y rot-z=$S0_R_Z lens=$S0_LENS name=proj0 ! queue ! stitcher.sink_0 \
filesrc location=360-s1.mp4 ! qtdemux ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! rreqrprojector center_x=$S1_C_X center_y=$S1_C_Y radius=$S1_rad rot-x=$S1_R_X rot-y=$S1_R_Y rot-z=$S1_R_Z lens=$S1_LENS name=proj1 ! queue ! stitcher.sink_1 \
stitcher. ! queue ! nvvidconv ! nvv4l2h264enc bitrate=30000000 ! h264parse ! queue ! qtmux ! filesink location=360_stitched_video.mp4
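The pipeline above references a handful of shell variables: the homography list produced by the calibration tool and the per-camera projection parameters for each rreqrprojector instance. The sketch below shows one way they might be defined; the file name homographies.json, the numeric values, and the lens setting are placeholders, and the real values come from calibrating your own rig.

# Illustrative values only: replace them with the output of the calibration tool.
HOMOGRAPHIES="$(cat homographies.json | tr -d ' \n')"   # homography list exported as JSON

# Projection parameters for sensor 0 (fisheye circle center, radius, and rotations)
S0_C_X=960; S0_C_Y=540; S0_rad=540
S0_R_Y=0; S0_R_Z=0
S0_LENS=0   # placeholder; check the rreqrprojector property documentation for valid lens values

# Projection parameters for sensor 1
S1_C_X=960; S1_C_Y=540; S1_rad=540
S1_R_X=0; S1_R_Y=0; S1_R_Z=180
S1_LENS=0

Once the variables are defined in the same shell session, the gst-launch-1.0 command above can be run as-is.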
GUI-based Calibration Tool
CudaStitching comes with a user-friendly, GUI-based calibration tool that helps you compute the complex stitching parameters that would otherwise require significant effort and many hours of tuning. Calibrating a stitcher for your use case is now easier than ever!
It supports both panoramic and 360 stitching. Our goal is to empower our users with our tools, so you will have access to extensive wikis and video tutorials explaining how to use the calibration tool, giving you enough information to bring your ideas to life and experiment with different setups until you are happy with the results.
Mosaic Stitching
Real-time stitching has many challenges:
Creating fast, powerful algorithms.
Preserving good image quality.
Supporting any stitching direction.
Both the stitcher and the calibration tool can stitch images in any direction. It doesn't matter whether your images overlap horizontally, vertically, diagonally, or in a mix of those; our stitcher handles every overlap case. It can also handle regions where more than two images overlap, since it uses custom CUDA kernels to perform the operations with fine-grained control and high speed.
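To make the mosaic case concrete, here is a minimal sketch of a three-input pipeline built around cudastitcher. It uses videotestsrc elements as stand-ins for real cameras and assumes a homographies.json file produced by the calibration tool; the caps, pad count, and memory handling may need adjustment for your platform and camera setup.

# Hypothetical mosaic sketch: three synthetic inputs stitched according to homographies.json
gst-launch-1.0 -e \
cudastitcher name=stitcher homography-list="$(cat homographies.json | tr -d ' \n')" \
videotestsrc is-live=true ! video/x-raw,width=1280,height=720 ! nvvidconv ! queue ! stitcher.sink_0 \
videotestsrc is-live=true pattern=ball ! video/x-raw,width=1280,height=720 ! nvvidconv ! queue ! stitcher.sink_1 \
videotestsrc is-live=true pattern=snow ! video/x-raw,width=1280,height=720 ! nvvidconv ! queue ! stitcher.sink_2 \
stitcher. ! queue ! nvvidconv ! nvv4l2h264enc bitrate=30000000 ! h264parse ! qtmux ! filesink location=mosaic_stitched.mp4

Swap the test sources for your own capture elements (for example, one per camera in a 2x3 mosaic) and connect one stitcher sink pad per input; the stitcher places each stream according to the homographies computed during calibration.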
"I still don't have a setup"
This is not a problem. We provide a public dataset covering several stitching use cases:
Mosaic stitching: 6 rectilinear cameras in a 2x3 configuration.
Triple 180° fisheye cameras.
Dual fisheye (195° angle each) to cover the whole 360 view with only two cameras.
This dataset will let you test the power of the stitcher.
Our 3D models are public as well; you can print them and start prototyping your solution quickly.
Contact us to evaluate CudaStitching. We offer trial versions, so you can first validate the functionality and make sure it meets your use-case requirements.
What’s Next?
Stay tuned and check out our developer wiki for more information.