Esports is the next sport to hit the big time. Its young, passionate fanbase already surpasses that of American football globally. So far, fans have largely been served by online coverage or dedicated pay-TV channels, but established networks like Turner's TBS have now started to cover the sport and are bringing in higher production values.

The main challenge for content producers like us, and for channels, is to bring a digital sport to life in the real world and draw the audience closer to the action. At the same time, esports opens up new possibilities to create exciting new forms of coverage.

Coinciding with the rise of esports, virtual reality, augmented reality and mixed reality are also growing more popular, and numbers from IDC suggest revenues will grow from only $11.4 billion in 2017 to $215 billion in 2020. So far, high prices for headsets and hardware have prevented VR from significantly penetrating the mass market. Augmented reality, on the other hand, can easily be accessed on mobile devices or even a TV.

AR has a history with sports

Augmented reality has long been used in sports coverage to project scores and data over the real-world video feed; in fact, the yellow "first down line" on football broadcasts may have been many people's first exposure. For esports, augmented reality can go further, blending digital characters or objects with the real world in a mixed reality environment. For example, we at The Future Group have created Frontier, a new virtual studio graphics rendering platform based on Epic Games' Unreal gaming engine. This studio technology can insert AR figures or virtual studio sets into any broadcast in real time.

Turner's TBS channel has been at the forefront of showing and developing esports coverage. In May of last year, during their ELEAGUE Street Fighter V Invitational event, we placed 3D characters from the game live on stage. The characters were created by Capcom, while the work of integrating them into our system was led by our VP of Virtual, Øystein Larsen, who has previously worked on The Matrix Trilogy, Catwoman and the Norwegian cult hit Trollhunter.

This was so successful that we did it again for ELEAGUE's Injustice 2, with superheroes from the DC Universe created by NetherRealm Studios, on three consecutive Fridays last fall. But this time, the 3D characters had more advanced moves and set interactions, so when Superman flew up into the ceiling to exit the stage, virtual debris fell down onto the studio floor. And everyone, including the studio audience, could enjoy the visual effects on a TV screen without the need for additional hardware.

While it takes time for us to create the graphics and program the characters' movements, once they are completed we can reuse them, and new effects and moves can be added easily. And since the field is so new, we are still finding ways to shorten production cycles and speed up the delivery process.

We've seen other examples of AR in esports as well, like Yu-Gi-Oh! characters that come to life when the cards are placed on the table, or players' game statistics shown on screen in both esports and physical sports. As more viewers flock to this entertainment form and more money enters the field, we're likely to see a huge increase in the AR produced for these shows.

How AR can work

Characters or virtual sets can be inserted with the click of a button, which gives the industry new creative freedom. The results are instantaneous, so we can see exactly what the live broadcast will look like as it happens, which was simply not possible when I started working with visual effects some 20 years ago. Because everything happens in real time, it should eventually even be possible to develop the characters enough to bring the fights from computer screens out onto the real-world stage.

Changing and creating new forms of esports content will drive monetization and innovation. Brands will be able to reach a very valuable young audience, viewers will discover new ways to interact with their favorite games and communities, and publishers will expand beyond their games into new realms. This will result in new marketplaces and revenue streams for content owners. For instance, why shouldn't a game character get an energy drink sponsorship like a real athlete? We are already seeing the integration of branding into esports evolve based on our visuals. By freeing the assets and creating new experiences with them, we engage viewers more deeply, driving branded content and increasing advertising revenue.

In the interactive mixed reality TV show Lost in Time, co-created by The Future Group and FremantleMedia, viewers can participate in the same competitions as the contestants through their mobile devices. This could be replicated for esports coverage, too, using the Frontier platform. Adding a layer of interactivity would be one way of letting the fans, who often are players themselves, directly engage before, during and after the event.

Activating the viewers' second screen will not only engage them more actively, but also makes it possible to deliver targeted advertisements to a broad audience, so, for instance, a millennial woman in Seoul could see a different product than a teenage boy in Wisconsin.

Esports is challenging traditional sports for fans and viewers, so investments in broadcast coverage of esports will follow. Since the action is digital, the creative possibilities are also greater. New advancements in VR, AR and MR technology give us opportunities to visually enhance esports coverage in ways that simply were not possible only a few years ago. This will push viewer engagement past what we see in traditional sports, and sports broadcasters will look to virtual-infused esports coverage for inspiration.

Lawrence Jones has over 25 years of experience in the interactive and mixed reality fields, including a decade specializing in real-time VFX advertising as a leader at ESPN. He is The Future Group's VP North America, Business and Content Development.