Nvidia and AMD ease 360-degree video production with new APIs

— 10:46 AM on July 26, 2016

360-degree video streaming is shaping up as a major application for VR headsets, and both Nvidia and AMD are working on ways to help content producers accelerate the creation of those streams. At SIGGRAPH this week, both companies revealed software packages that let producers use GPUs to stitch and output 360-degree video in VR-ready formats.

Nvidia's VRWorks 360 Video SDK lets content providers take input from up to 32 cameras and stitch 360-degree video in both offline and real-time workflows. Creators can account for custom camera positions and lens types in their 360-degree rigs, and the SDK accelerates the "decode, equalization, calibration, stitching, and encode" portions of the 360-degree video pipeline. The 360 Video SDK is also compatible with Nvidia's GPUDirect API for low-latency processing directly from video I/O cards.
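Nvidia hasn't published the SDK's API details yet, so the C++ sketch below only illustrates the math at the heart of the stitching step: every pixel of the equirectangular output names a ray on the sphere, and the stitcher asks which calibrated camera sees that ray and where. All of the types, function names, and calibration values are hypothetical, not part of VRWorks.

```cpp
// Backward mapping, the core of an equirectangular stitcher. All names
// and values here are hypothetical; a real SDK runs this loop as a GPU
// kernel over every output pixel.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

struct Vec3 { double x, y, z; };

// Rotate a world-space direction into a camera's local frame with a
// row-major 3x3 rotation matrix (one per rig camera, from calibration).
Vec3 rotate(const double R[9], Vec3 d) {
    return { R[0]*d.x + R[1]*d.y + R[2]*d.z,
             R[3]*d.x + R[4]*d.y + R[5]*d.z,
             R[6]*d.x + R[7]*d.y + R[8]*d.z };
}

// Map output pixel (u, v) of a W x H equirectangular image to a unit
// direction on the sphere: u spans longitude, v spans latitude.
Vec3 pixelToRay(int u, int v, int W, int H) {
    double lon = (u + 0.5) / W * 2.0 * kPi - kPi;   // [-pi, pi]
    double lat = kPi / 2.0 - (v + 0.5) / H * kPi;   // [pi/2, -pi/2]
    return { std::cos(lat) * std::sin(lon),
             std::sin(lat),
             std::cos(lat) * std::cos(lon) };
}

// Ideal pinhole projection: returns true if the ray lands inside this
// camera's image, writing the source pixel to sample. A real stitcher
// would also undo lens distortion here, per the rig's lens types.
bool projectToCamera(const double R[9], double fx, double fy,
                     double cx, double cy, int camW, int camH,
                     Vec3 ray, double* sx, double* sy) {
    Vec3 c = rotate(R, ray);
    if (c.z <= 0.0) return false;  // ray points behind this camera
    *sx = fx * (c.x / c.z) + cx;
    *sy = fy * (c.y / c.z) + cy;
    return *sx >= 0 && *sx < camW && *sy >= 0 && *sy < camH;
}

int main() {
    // One forward-facing 1080p camera with a plausible focal length;
    // a 32-camera rig would supply a rotation and intrinsics per camera.
    const double R[9] = {1,0,0, 0,1,0, 0,0,1};
    double sx, sy;
    Vec3 ray = pixelToRay(1920, 960, 3840, 1920);  // center of a 4K sphere
    if (projectToCamera(R, 1000.0, 1000.0, 960.0, 540.0, 1920, 1080,
                        ray, &sx, &sy))
        std::printf("sample source pixel (%.1f, %.1f)\n", sx, sy);
    return 0;
}
```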

AMD's 360-degree video solution (can we just call them video spheres?) is called Project Loom. The company unveiled the tech at its Capsaicin event at SIGGRAPH last night. Not to be confused with Google's Project Loon, the software accepts anywhere from four to 24 1080p input streams and stitches them into a 4K sphere at 30 FPS for both desktop and mobile HMDs. Similar to Nvidia's solution, Loom lets Radeon Pro graphics cards work directly with video capture hardware using AMD's DirectGMA technology. Loom will be made available as an open-source project through GPUOpen later this summer, according to AMD's presentation slides.
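AMD's slides don't detail how Loom blends its streams, but once overlapping cameras have been mapped onto the sphere, stitchers typically feather the seams between them so neighboring views fade into each other rather than cutting hard. Here's a minimal sketch of that weighting; the names and values are illustrative, not taken from Loom.

```cpp
// Feathered seam blending: where adjacent camera frusta overlap, weight
// each camera's contribution by how far the sampled pixel sits from that
// camera's image border. Illustrative only; not Loom's actual code.
#include <algorithm>
#include <cstdio>

// Weight in [0, 1]: 0 at the image border, rising to 1 once the sample
// is at least `feather` pixels inside the frame.
double featherWeight(double sx, double sy, int w, int h, double feather) {
    double dx = std::min(sx, w - 1.0 - sx);  // distance to nearest side edge
    double dy = std::min(sy, h - 1.0 - sy);  // distance to top/bottom edge
    return std::clamp(std::min(dx, dy) / feather, 0.0, 1.0);
}

int main() {
    // Two 1080p cameras see the same ray: A near its right edge, B well
    // inside its frame. The normalized weights favor the centered view.
    double wA = featherWeight(1890.0, 540.0, 1920, 1080, 64.0);
    double wB = featherWeight( 600.0, 540.0, 1920, 1080, 64.0);
    double sum = wA + wB;
    std::printf("camera A: %.2f, camera B: %.2f\n", wA / sum, wB / sum);
    return 0;
}
```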
