Join gaming leaders, alongside GamesBeat and Facebook Gaming, for their 2nd Annual GamesBeat & Facebook Gaming Summit | GamesBeat: Into the Metaverse 2 this upcoming January 25-27, 2022. Learn more about the event. 


Primal Space Systems has raised $8 million for its subsidiary Instant Interactive, which will use the money to develop a technology dubbed GPEG, a kind of cousin of the MPEG format used to play videos, but for graphics.

But GPEG, a content streaming protocol, is a different way of visualizing data, and its creators hope it could be a huge boost for broadening the appeal of games as well as making people feel like they can be part of an animated television show. Instant Interactive wants to use its GPEG technology to more efficiently stream games on the one hand, and on the other, it wants to turn passive video entertainment into something more interactive and engaging.

This may be where most people stop reading this story. But I think this technology has a lot of legs. The idea for the Geometry Pump Engine Group (GPEG) originated with Instant Interactive cofounders Barry Jenkins (a medical doctor who became a graphics expert), John Scott (chief technology officer, formerly of Epic Games), and Solomon Luo (a medical vision expert and chairman), who thought about this challenge for years before creating the startup Primal Space Systems and its game-focused division Instant Interactive.

The investors include a variety of seed and angel funders, including ImmixGroup cofounder Steve Charles. The capital will support the development and initial launch of GPEG, which can be used with game engines such as Epic’s Unreal. As a company, Instant Interactive has been working on the technology since 2015. It has just seven employees.


Jenkins has put more than a decade of work into the technology, and the company has 11 patents covering better streaming of games and ways to turn TV shows into interactive entertainment. While MPEG (short for the Moving Picture Experts Group) gave us technologies for compressing video so it could be easily viewed across networks, GPEG can make some very expensive game and entertainment technologies much more practical, said Bill Freeman, president and chief operating officer of both Primal Space Systems and Instant Interactive, in an interview with GamesBeat.

Above: Bill Freeman is president and COO of Instant Interactive.

Image Credit: Instant Interactive

“This kind of technology can enable interactive content on over-the-top programs” such as interactive Netflix shows, Freeman said.

For cloud gaming and interactive television, GPEG replaces video-based streaming through the use of pre-encoded content packets, which can be more efficiently streamed using GPEG middleware technology. The packets are prefetched to eliminate lag, while also lowering overall streaming costs. GPEG’s middleware solution is designed to be used with all existing content delivery networks and to be integrated into any game engine, including Epic’s Unreal Engine 4, providing for the efficient delivery of true in-the-moment personalized content.
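The prefetching idea can be sketched in a few lines. This is a minimal illustration of the concept described above, not Instant Interactive's actual protocol: content is pre-encoded into packets keyed by region, and the client fetches packets for regions the viewer is predicted to enter next, before they are needed, hiding network latency. All names here are hypothetical.

```python
# Sketch of prefetch-based content streaming (illustrative only).

class PrefetchingClient:
    def __init__(self, cdn, lookahead=3):
        self.cdn = cdn        # stand-in for a CDN: region id -> pre-encoded packet
        self.cache = {}       # packets already delivered to the client
        self.lookahead = lookahead

    def predict_cells(self, current_cell):
        # Trivial stand-in for motion prediction: assume the viewer keeps
        # moving through consecutively numbered regions.
        return [current_cell + i for i in range(1, self.lookahead + 1)]

    def update(self, current_cell):
        # Fetch the current region plus predicted regions not yet cached.
        for cell in [current_cell] + self.predict_cells(current_cell):
            if cell not in self.cache and cell in self.cdn:
                self.cache[cell] = self.cdn[cell]
        # Rendering only ever reads the cache, so a correct prediction
        # means zero waiting when the viewer actually arrives.
        return self.cache.get(current_cell)

cdn = {i: f"packet-{i}" for i in range(10)}
client = PrefetchingClient(cdn)
client.update(0)  # caches packets for regions 0 through 3 ahead of need
```

The key design point is that the server does no per-user rendering; it serves static, pre-encoded packets, which is exactly the workload existing CDNs are built for.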

“We think it’s possible to bring interactivity to the entertainment industry, breaking interactivity out of the gamer silos and creating content that everyone can consume,” Jenkins said in an interview with GamesBeat.

The possibilities for interactive entertainment

Above: Netflix’s Black Mirror: Bandersnatch show let you choose the outcomes.

Image Credit: Netflix

If you’re wondering what this is all about, you may have heard the story that Epic Games’ Unreal Engine, used primarily for making games, was used to create cinematic special effects for the Disney+ television show, The Mandalorian. GPEG can be used on top of Unreal, and it could enable a television show where you can participate in the action, taking the story in a direction you want it to go.

With GPEG, the interactive entertainment industry, as exemplified by Netflix’s Black Mirror: Bandersnatch where you could choose your own story, could get a big technological boost. Jenkins envisioned GPEG as a more effective way to create transmedia, or content that is used across multiple media such as comics, movies, and games.

Today’s games often have visually rich in-game cinematic sequences, or “cutscenes,” that are rendered in real time. Advanced real-time rendering effects can now give games a sophisticated cinematic look that was not possible before. At the same time, the art of interactive storytelling has evolved considerably since professor Henry Jenkins (then at MIT, now at USC) talked about transmedia.

Narrative-driven games such as Detroit: Become Human and Life Is Strange 2 focus on letting players actively discover and participate in the conflict-and-resolution cycles that are fundamental to storytelling.

And if you look at streaming, today’s edge-centric content-delivery network (CDN) infrastructure delivers pre-encoded video streams at a very low cost per user and at global scale. However, this infrastructure is insufficient to support video-based cloud gaming systems such as Stadia or GeForce Now, which depend on expensive game server hardware housed in specialized data centers, Freeman said.

While most game content today is delivered by CDNs, this delivery is in the form of slow game downloads, which often require users to wait minutes or hours to begin gameplay. New methods of streaming game engine and VR content to game consoles, gaming PCs, and mobile devices could eliminate these download delays and provide virtually instant access to interactive content.

Using the existing CDN infrastructure, this new type of stream could change the way games are delivered by broadband and wireless and could also enable new types of instantly interactive content for cable and over-the-top (OTT, think Netflix) audiences.

Jenkins said you should imagine animated programs or special effects sequences streamed to your game console or gaming PC using an application like “HBO Max Interactive” or “Netflix Interactive.” Such programs would convey the story-driven, “lean back” entertainment experience expected from video but would also allow users to optionally pick up a game controller and customize the characters, explore a different narrative arc, take up a brief challenge, or otherwise “lean into” the experience in ways that are more deeply engaging than Bandersnatch’s simple branching video.

Such programming would naturally appeal to gamers and could also attract non-gamers and larger mainstream audiences to the unique and compelling entertainment value of interactivity. This type of programming could augment the traditional transmedia approach by enabling convergent media experiences that combine the impact of cinematic storytelling with the kind of immersive engagement made possible by the modern game engine.

Instant Interactive wants to create a graphics revolution

Above: This shows the output of the GPEG encoder for a single viewcell. Completely visible triangles (blue) are calculated using a conservative method of from-region visibility precomputation, which is much faster and more accurate than ray tracing.

Image Credit: Instant Interactive

Instant Interactive is pioneering the development of a game engine middleware protocol, GPEG, for streaming interactive content to game consoles, PCs, mobile devices, and next-gen set-top boxes.

Primal Space Systems itself used the technology to enable drones to stream data more efficiently when they’re flying over a place and capturing imagery and location data. The U.S. Army is using that technology. But Instant Interactive uses the technology to more efficiently stream games and turn passive video entertainment into something more interactive and engaging.

“We’ve all grown up with MPEG (a video format),” said Freeman. “This is not MPEG. It is a new way of encoding and streaming 3D data. Games have been our core, but we see GPEG going beyond gaming, bringing interactivity to what has historically been passive content and making it more lean-forward.”

Cloud gaming and interactive entertainment need the help. Putting games in the cloud so they can be processed with heavy-duty servers is theoretically a great way to process games. You can then stream a video of a game scene to a user’s computer or console. When the user interacts with a controller, the input is sent back to the data center, where the impact is computed, and then a new scene is sent in video form back to the user’s computer. This allows the heavy lifting to be done in the cloud, so a high-end game can run on a low-end laptop.
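As a rough worked example, the round trip described above stacks several delays between pressing a button and seeing the result. The numbers below are illustrative guesses, not measured figures from any real service:

```python
# Each stage of the video-based cloud gaming loop adds delay.
# All numbers here are illustrative assumptions, not benchmarks.

def cloud_frame_latency_ms(uplink_ms, simulate_render_ms, encode_ms,
                           downlink_ms, decode_ms):
    # input -> data center -> simulate and render -> encode video
    # -> back to the client -> decode -> display
    return uplink_ms + simulate_render_ms + encode_ms + downlink_ms + decode_ms

total = cloud_frame_latency_ms(uplink_ms=20, simulate_render_ms=16,
                               encode_ms=8, downlink_ms=20, decode_ms=5)
print(total)  # 69 (milliseconds of added delay, before display scan-out)
```

Note that the two network hops alone contribute more than a 60-frames-per-second frame time, which is why cloud gaming is so sensitive to network conditions.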

The problem is that this streaming of data consumes a lot of bandwidth, and cable modem systems have only recently been able to transfer data at speeds high enough to enable cloud gaming services such as Google Stadia and Nvidia GeForce Now. But with GPEG, Jenkins claims that the data can be dramatically reduced and transferred using a fraction of the bandwidth needed today.

“We know what OnLive tried to do years ago,” Freeman said. “But what if we could stream fully interactive content with no lag, no loss, no frame drops, and no downloads, and provide an instant interactive experience at the highest quality, and actually at a much lower cost using today’s infrastructure? What we’re most excited about is the interactivity. So VR, MR, and AR content is really possible with our technology. And since we’re never downloading all the content, we take piracy out of the equation. This is not traditional cloud gaming as others have approached it. It’s an entirely new way of streaming data.”

The mad doctor

Above: Barry Jenkins is cofounder of Instant Interactive.

Image Credit: Instant Interactive

Jenkins got his medical degree from Harvard Medical School. But he became deeply interested in computational models of human vision, computer vision, and real-time graphics. He wrote software for massive 3D data set processing that was used by defense contractor Northrop Grumman for rendering imagery.

“We are focused on the 3D data itself, not video data, which is really what Stadia and GeForce Now do,” Jenkins said. “They run the entire game, of course, in a data center, and then they compress the video frames, and then it gets sent to the user. And as you know, this was possible many years ago, but not necessarily practical. It has a very high, almost unprecedented, cost per user in the data center. We estimate that Stadia dedicates $1,500 worth of equipment to one user at one time.”

The compression of the game content into video form isn’t that efficient, so it takes up a lot of bandwidth. There is also a lot of latency, or interaction delay, in a cloud game: the player’s input travels from the laptop to the data center, gets computed there, and then gets sent back down to the computer in the form of video. At 720p resolution, this lag is OK. But not with today’s 4K TVs.

How to fix game streaming

Above: In this top-down view of a game level, the sections in green are the parts of a scene that a user might see.

Image Credit: Instant Interactive

Instant Interactive’s software can fix this. It is integrated into the Unreal Engine and will enable game publishers, developers, and distributors to give their users a better experience, Jenkins said.

“We’re actually streaming the game engine content itself in a very agile way to the game console, PC, or mobile device,” Jenkins said. “We take the game itself, all the levels that make up a game, and we process them offline through our GPEG encoder, and we turn it into data, which is GPEG. That packet data can sit on any server. It doesn’t have to have a GPU, basically any CDN server.”

He added, “It streams that data into our GPEG client software, which is a plug-in for Unreal Engine. So we have integrated this into Unreal Engine. It’s written to be integrated into other engines as available. And basically, instead of downloading the entire game, we initially very quickly stream just the data that the game needs in the moment. It’s all prefetched, so the user doesn’t experience any latency.”

The software takes a given scene and breaks it up into cells. Then the software precomputes exactly which surfaces and textured triangles are visible from each cell, and encodes the change between cells. All this data sits on a server, and the server predicts where a user is inside a game and then sends only the parts of a scene that the user can see. It also predicts what the user might see in the next few moments and precomputes that ahead of time. The game engine doesn’t have to render the rest of the scene.

“So instead of downloading your Call of Duty games as a 140GB download, we could really get this started with a few 10s of megabytes of data,” Jenkins said. “There’s a real opportunity here to improve the performance of the game by managing very precisely what the game engine actually has to render at any one point in time.”

Reaching into the past

Above: Quake from 1996

Image Credit: id Software

Computer scientists have studied this concept for decades. John Carmack, the graphics guru who worked at id Software on games like Doom and Quake, was the first to actually use this kind of technology in a game, in the Quake engine from the 1990s.

“He actually accelerated the game, like a factor of three or four in the frame rate, by using this precomputed visibility,” Jenkins said.

While others left this technology behind, Jenkins did not. He brought it into the modern world.

Instant Interactive wants to license what it has created — middleware — to other companies.

“We’d like to see it used by many different distributors and producers of games, from Valve to Epic to Activision and EA,” Freeman said. “The bandwidth requirement is actually much more reasonable than video-based game streaming because we’re not using any compression. We’re actually exploiting the structure of the game data itself.”

Jenkins said GPEG offers better interactivity, instant access, efficient and lossless transmission, and fast delivery of 4K and VR content.

“This is middleware that is really designed to help the game engine run better,” Jenkins said. “And say you are watching Netflix and decide to jump off the cinematographer’s camera rail to follow a different narrative or pick up a brief challenge, and become more immersed and engaged in the story. Video-based streaming doesn’t really allow you to do this. Within a matter of seconds, you’re actually playing other content. A 22-minute video episode could become an hour or more of engagement every week.”
