Rapper Bryson Tiller transports audiences to virtual worlds during livestream


Xite Labs used the disguise Extended Reality (xR) workflow, powered by a gx 2c media server and the rx real-time rendering platform, to bring myriad virtual worlds to life in Unreal Engine for rapper Bryson Tiller's live-streamed "Trapsoul World Series" concert.

Filmed on the xR stage at Xite Labs in Los Angeles, the concert heightened the artist’s unique musical vibe for fans around the world watching at home. The immersive livestream featured Tiller in a series of six different worlds linked by narrative running through the songs. Xite Labs was responsible for the xR content of 14 different songs performed in four virtual “worlds” with distinct looks and themes.

For this unique immersive experience, Xite turned to its internal workflow, with a disguise gx 2c media server as the main xR environment controller, while a dedicated disguise rx machine ran the Unreal Engine scenes through disguise's RenderStream. Faceplate elements were created in Notch to further connect Tiller to each of the unique worlds.

These included a virtual living room that collapsed to reveal a world of galaxies, nebulae and spaceships; a time theme with a desert mountain landscape and a flight through a moonlit sky; a gorilla transforming into a neon jungle; and austere hallways with bold, flat lighting, color-shifting walls and silhouettes. Throughout, Tiller appeared to perform on a moving platform, which served as an anchor transporting him from one otherworldly environment to another.

“I haven’t seen anything done in xR that’s as diverse and complex as this. And the fact that it was shot on our smallest volume on such short notice still amazes me,” said Greg Russell, Creative Director at Xite Labs.

One unintended benefit the disguise xR workflow brought to the production team was the ability to film Tiller wearing a shiny black reflective jacket for the intermezzo performance.

“It would have been incredibly difficult on a green screen and technically time-consuming for lighting and light-field separation. But xR made it possible, and it looked amazing,” added Vello Virkhaus, Creative Director at Xite Labs.

The ambitious production took over 400 hours of development by nine Unreal Engine artists over ten weeks, plus over 200 hours from two Notch artists, to bring the virtual environments to life.

“Top to bottom, the live concert went extremely well and received great feedback,” Russell said. “Bryson understood the technology and intuitively knew where to be on stage and how to be in and out of the lighting.”

Creative studio 92 Group joined Xite Labs on the project, providing overall show direction and art direction, while production was handled by HPLA.

“Production-wise, this was the first time director Mike Carson, DP Russ Fraser and producer Amish Dani did xR,” Russell said. “With film and music video professionals coming into our world, it was very difficult work from a production perspective, because we were basically teaching them the xR workflow on the job. But once they started putting the pieces together, they realized its value.”