
Game shows have been losing younger audiences for some time. The Future Group wants to change that by making game shows more interactive with mixed reality technology.

The Oslo, Norway-based startup — which raised $20 million in venture funding — teamed up with FremantleMedia (maker of the Idol TV shows) to create Lost in Time, a new game show that uses The Future Group’s Interactive Mixed Reality (IMR) on both TV and mobile devices.

Lost in Time, which ran a pilot test season in Norway, featured real-time special effects combined with real-life contestants who competed for prizes. The audience at home could also compete virtually in the same challenges on iOS and Android, blending primetime TV and mobile entertainment.

The Future Group also created another technology, dubbed Frontier, that enables esports broadcast producers to insert animated characters into a broadcast in real time. In this case, it inserted characters from Street Fighter V into a broadcast of the Street Fighter V International tournament.



I spoke with Ellen Lyse Einarsen, vice president of interactive at The Future Group, at the Gamelab gaming event in Barcelona last week. Here’s an edited transcript of our interview.

Above: Ellen Lyse Einarsen develops games that work with The Future Group’s TV game shows.

Image Credit: Gamelab

GamesBeat: Tell us about your background. 

Ellen Lyse Einarsen: I’m from Oslo in Norway. I’ve been in the games industry for more than 10 years. My background is in screenwriting for film and TV. I came in a bit sideways, working on Age of Conan, an MMO that was launched about nine years ago by Funcom. I started out as a voice-over director for that game. I realized I loved making games, so I stayed and became an associate producer. I realized I loved producing and designing games as well, so I also worked on The Secret World.

I spent a few years in Germany, working on Facebook games and a couple of mobile games. One of those was Angry Birds Epic, published by Rovio. The other was Sacred Legends, from Deep Silver Fishlabs. Then, a year and a half ago, I was contacted by a new startup in Norway, the Future Group. They were going to blend television and e-commerce and games into one ecosystem. I thought it was a chance I couldn’t pass on, so I packed my bags and moved back home.

GamesBeat: The company is creating a game show, Lost in Time. The contestants are filmed doing challenges against a green screen in a TV studio, and Future Group’s technology renders a virtual world around them live. The viewer at home sees the contestants immersed in a castle or an abandoned mine, a virtual environment, and the contestants’ interaction with physical props triggers a reaction in the virtual world. The audience can play along as well, doing challenges on a mobile app in parallel with the contestants. Did I get that right?

Einarsen: More or less? When I started at Future Group, I was told that it normally takes people about three months from when they start at the company to when they actually realize what we do. It’s always fun to try to wrap it up in one paragraph. But more or less. We render real-time graphics with the Unreal Engine, using our technology on top, so that the players are in a virtual environment. They interact with it using physical props and partake in challenges. Then the players at home can follow along with the app, doing the same things.

GamesBeat: Can you explain what was happening there and what results from Lost in Time?

Einarsen: The video shows a different working day for game developers. You saw the green screen studio there. The entire team was in there — carrying props, testing, sitting on the motion platform. There were assistants, game designers, backend developers. We were all in there developing games in a different way than we were used to. We’re used to thinking about what’s fun for a user who sits with a device. Designing this, we also have to think about what’s fun for a contestant in a game show. It also has to be fun for people sitting at home and not participating, but watching.

It was a different way of thinking. My closest partner was Matt Claxton, who was also a game producer, but coming from the TV industry that means something else. He’s been in game shows in England for 15 years. We would sit together and try to figure out what mechanics would work both on mobile and on TV. It was fun.

For our dry run games, we had a motion platform we could control, and the props could be rigged. We were the guinea pigs to figure out the fine line between fun and dangerous. We wanted to trigger real emotions in the contestants. Most of the time I would come home bruised and beaten, but we found that right point where, yes, now it gives you that theme park tingle, but you’re never really scared. That was a lot of fun to do.

Above: Lost in Time

Image Credit: Post Perspective

GamesBeat: And you did the pilot in Norway?

Einarsen: Right. The first season of Lost in Time just aired in Norway. We used Norway as a soft launch market to see how the technology would work. Are people participating? Are they enjoying it? It’s been fun to see, because we were estimating that maybe five or 10 percent of viewers would play along. We ended up seeing 45 percent interactivity for the last show. It started at about 20 percent, and with each show more people would participate. They found that the experience gave them something extra.

The games are designed within a traditional free-to-play model, but we don’t have any in-app purchases. It’s free to play, but with meta-features for sharing and retention, so players can play throughout the week and earn virtual currency that they then sink into the live broadcast for a chance to win real prizes.

We have skill-based tournaments where the best player wins, but also, with each show, the players at home choose a team. We would separate Norway into two halves – women and men, or under 25 and over 25, or attached and single – and calculate the average score of players throughout the show. Toward the end we’d take a random person from the winning side, and they would win the same amount of money as the winning contestant on the show. It was always in everyone’s interest that the winner on the show got as much prize money as possible.

GamesBeat: We’ve seen historical examples of interactive video experiments, but it seems like there’s a lot more engagement in this. What are the origins of the technology? I guess the idea came from Hollywood in the first place?

Einarsen: Our founder, Bård Anders Kasin, worked at Warner Bros. He was a technical director for the Matrix movies. He was in special effects. That’s when they started using game engines, to pre-render special effects. He developed the technology that allows this real-time rendering. What’s cool about it is it’s live-capable, but you could also use post-production as you traditionally would. There are options for how to solve problems.

GamesBeat: There’s a variety of uses for this technology. It’s also been applied to a Street Fighter tournament, an esports broadcast. Can you tell us more about what went into that?

Einarsen: This was a collaboration we did with Turner about a month ago. It was aired on TBS in the States. It showcases the second use of our technology, our third-party product, which we call Frontier. It’s a graphics platform that renders live, in real time. That lets the cameraman see the AR characters when he’s filming, so he reacts directly to their movements. He’ll see in his monitors the way to capture the best shots possible.

GamesBeat: That’s happening in real time, then, whereas most people would be used to seeing computer-generated characters inserted in post-production. Why is it more important to be able to do it live?

Einarsen: It allows for more interaction with the characters. It’s the same thing you see in Lost in Time. If I touch a physical prop live, it’ll blow something up in the virtual world. If we were filmed now, using Frontier, you could interact with a character right here and everyone would be able to see it on the screen. It allows for a more immersive AR experience.

GamesBeat: It must save you on post-production, too.

Einarsen: I wouldn’t say that to the CG guys back in the office, but definitely. It depends on the outcome you want to have. The more you prepare in advance, you can have that ready, and then you can also do post-production in terms of inserting shots. But you record all the data live. It’s all live-capable.

It’s a new experience for the viewing audience, especially for the type of show that we made, something that’s a showcase for our technology. We’re selling Frontier as a third party for production companies and broadcasters who want to make these kinds of productions themselves, but we also develop content like Lost in Time, to show what our technology can do and sell those formats on.

GamesBeat: It’s more than just you guys working on this kind of thing in the industry. At the Game Developers Conference in March, I saw Epic Games show the Unreal Engine being used to render a race car live. They’d film a car with a blank QR code on top of it as its paint job, and then use the game engine to inject the shell of some other virtual car on top of it, so the audience could see something different on the fly as the director chose.

Einarsen: We’re partners with Epic. It’s really cool what they’re doing. It’s a game engine, but they also want to take it to other outlets and accomplish other goals. That’s fantastic. I think more and more people will get on that bandwagon, because it’s so powerful. It lets you do things that you can’t do in any other way.

That’s also where we’re going, with everything being customizable for the user and the viewer. I imagine that it’s not too long until not only the director can change things. You could be watching the car and say, “I don’t like the color on this one. I’d like to see another one.” With these technologies you’ll be able to do just that.

Above: Street Fighter characters are superimposed on a TV stage.

Image Credit: Future Group

GamesBeat: This is what we’re calling mixed reality now. Where do you think this technology is going? What uses do you see?

Einarsen: We call it interactive mixed reality, because we also let users who are not in the studio participate. The fact that you can be in the same challenge at the same time adds the interactive element to it. But I think we’re only at the beginning. I’m very curious to see where it goes, based on everyone who makes games and makes content. It’s a huge toolbox of new things we can do.

It allows for a lot of possibilities in movies and television to do things that used to be very expensive, or almost impossible. If you can do them virtually, then you might get a completely different product than if you had to scale back to do something you could build in a studio.

GamesBeat: The audience for traditional TV game shows is aging. I take it you see this as a way to bring new life into the genre with a younger audience?

Einarsen: Absolutely. And not just game shows. For linear television in general, the average age of the audience is going up. The TV channel that broadcast our show in Norway, that was their wish, to get more of a younger audience back to watch TV. We saw that our main audience was between 25 and 35. You got that audience back to watching TV, which is pretty cool. It also goes the other way. People who traditionally only watch TV will start playing a game. We saw that in the demographics. I was also quite happy to see that the gender split of our users was about 50-50. It spread across the whole family. We even had a small chunk of people over 65 who would play along.

GamesBeat: As this goes wide around the world, what are some things you might expect as far as how it gets used? In the United States I can see something like the American Ninja shows benefiting from this.

Einarsen: And also more traditional shows. We’re in talks with several broadcasters from all over the world who want to know what we can do for their series. Some of these are shows that have been on TV for many years. They want to see what’s possible in terms of both the virtual studio production, and also what players at home can do. The numbers are clear, right? Something like 80 percent of viewers are using a second screen while they watch television or film. With interactive mixed reality, you have this reunification of concentration. You have the same product on both screens. What that does to the experience is quite powerful for people like us who create content.

GamesBeat: You’re bridging the physical and digital in yet another way.

Einarsen: It’s funny to see, because you didn’t see so much of the contestants in our trailer, but they’re doing everything in a green-screen room. The biggest challenge we have is, will we get proper emotions and reactions from them? They’re not actors. They’re real people. They see exactly what they have to see. But it was obvious within a few minutes of testing that you’ll definitely laugh and scream and get excited, because you’re drawn into it. Whenever people come to our studios to test the games, they say, “Okay, I get it. It’s fun. It’s exhausting.” If you’re pulling a rope, we want it to be heavy. People would come off that challenge sweating. That’s what comes across even with the virtual background. It looks like it fits.

Above: Ellen Lyse Einarsen (left) of The Future Group and Dean Takahashi of GamesBeat at Gamelab in Spain.

Image Credit: Gamelab

GamesBeat: You could use this for something like the Eurovision song contest.

Einarsen: Just as one example. You could also use Frontier for huge live events. It’s expensive to win the Eurovision contest, and then you have to host it. If you can use something like Frontier, you could do that on a more reasonable budget and still have something that looks incredible.

GamesBeat: You’ve had connections to Hollywood because of this technology. Do you feel like you’ve been able to communicate with the larger TV industry and get a reaction?

Einarsen: From the very beginning, during funding rounds, people were very interested. We had a meeting at Disney where they said they’d been trying to do something like this for years. Every time they tried it turned into a headache and they gave up. I think a lot of players have been on the fence waiting to see if anyone could actually do it. Now that Lost in Time is out, people are coming back interested in trying it out. It’s complicated, but it’s very cool when you get a chance to work with it and see how it plays out.

Above: The Mill’s Blackbird is a car that can be reskinned with game animations.

Image Credit: Epic Games

GamesBeat: Are there any points of frustration still, things you can’t quite do yet?

Einarsen: If there’s any frustrating element, it’s that we’re one company in several different industries working together. We’re coming with all our different expertise, languages, processes, ways of doing things, and trying to work with each other. Just putting the TV industry and games industry together was a very interesting exercise. We came out of it learning a lot, but that was where we had the most friction.

Game developers will sit in a room with their headsets on, quietly working behind their keyboards. If they talk with each other, it’s on Slack. When there’s a meeting, we’ll have everything planned beforehand. That gets presented and it’s done. In the television business, it’s much more extroverted. Everyone talks to each other. When you go into a meeting, nothing’s decided. That’s where you brainstorm. It’s a whole different way of doing things, but it’s also cool. That’s an area where we’ve been innovative, I think, establishing a way to do this kind of collaboration – and with the special effects industry as well.

GamesBeat: It seems like there are some natural challenges when you combine industries. Film and TV do so much preproduction and then render it once. Pixar doesn’t render things 30 times if they can help it. A game developer prototypes things all the time and continuously changes a game. It seems like it might be hard to get along.

Einarsen: But you learn about each other. If you don’t learn, you don’t know what can be frustrating for someone else. Our CG artists could be working and working, and for them it’s more like a painting. They can keep touching up something and nothing breaks. But they have to learn that if they change one thing or move it around, something else might break later on in the pipeline. That was one thing we had to learn by doing.

Above: The Future Group is creating interactive mixed reality.

Image Credit: The Future Group

GamesBeat: What does your roadmap look like? Where do you plan to take this?

Einarsen: We have these two ways of going at the moment. We have Lost in Time, which we’re selling as a format. We’ve developed that in collaboration with FremantleMedia, the developers of Idol and X Factor and big shows that get franchised across the world. We’re talking now with several interested territories about making it in those countries. Then we also have Frontier, the rendering platform, which we sell as a third party to allow people to do what we did with Street Fighter, and to use the virtual studio. We’re in some very interesting talks with different people, but I can’t say anything yet.

GamesBeat: What are some of your early learnings about mixed reality? Would you have any advice for people doing other mixed reality products?

Einarsen: Try it, because it’s fun. That’s why most of us at Future Group are doing it. We all came together as experienced professionals from different fields to do something that nobody had ever done before. I’m very glad to have been a part of that ride. It broadens your mind, broadens your toolbox for storytelling and creating immersive experiences. Also it lets you sometimes ride on a motion platform.

GamesBeat: I feel like this could lead us toward a return of physicality to games. In esports today the best player is the guy who spends the most time behind a computer playing League of Legends. He’s a star athlete. But if you’re filming things with a green screen and capturing motion, you could have people competing in a sport and translate that for the audience into a virtual world or virtual performance. They’d be succeeding because they’re going back to being the best physical performer. You could have a Madden football match with people translating their movement into control of the game. Is that something you would think about doing?

Einarsen: We’re already thinking about it. That’s one of the verticals we want to get into. In terms of the esports element, it’ll enhance the experience not just in the video, but also — you mentioned League of Legends. You could put a reporter on the battlefield as a real live war reporter, walking inside the battle scene. That’s why it’s cool to not only use augmented reality, but also virtual. You could also use this with VR goggles. But when it’s televised and broadcast like this, you want to see people’s eyes, see their reactions. We don’t want to take that away by watching through goggles. But it’s also possible to create VR experiences.

Above: Ellen Lyse Einarsen believes that Hollywood live shows can become more interesting with mixed reality.

Image Credit: Gamelab

Question: With the viewer being able to interact with the series, does that have the potential to derail the contestant’s performance?

Einarsen: The contestant doesn’t directly feel what the consumer at home is doing. We open a window where you can play the challenge the same night the contestant is doing it. But we didn’t want to restrict it to be at the exact same time. We wanted people to be able to watch it first and then play it. It opens at the same time, and then there’s a window of a few minutes where you can play it. But your score, either way, will be compared to that of the contestant. If you get a higher score than the contestant, you’ll get bonus points.

Question: Are kids able to participate as contestants?

Einarsen: Not for this show, but that’s obviously a possibility.

Question: We’ve seen reports from people in esports talking about working with TV and running into problems, because the TV presentation breaks up the flow of the game. Do you run into a similar problem?

Einarsen: Not necessarily. In the Street Fighter example, that’s already being broadcast. We’re just enhancing that broadcast experience. Our technology is broadcast ready. It feels quite natural. It doesn’t go against the flow. After all, it doesn’t have to be on television. It can as easily be on Twitch or YouTube or other live streams. It’s not confined to a classic linear TV experience.

GamesBeat: What sort of resources do you need to fully execute this now? Do you need more employees? What do you expect as far as what it takes to pull off a global presentation with this technology?

Einarsen: The studio is ready to execute another season of Lost in Time. But it depends on how many territories sign up, what features they’d like to add to the app, and so on. The app is geo-locked, so every territory will be able to make their own modifications. Some territories will want in-app purchases and some won’t. We’re definitely looking for more developers and game designers.

GamesBeat: Would you open up a studio in new territories?

Einarsen: For the first season we’d have them fly to Norway with the contestants and hosts. We’re what we call “IMR ready.” The studio is set up with all the cameras. But then we could have hubs. We could have a North American hub, a Middle East hub, or an Asian hub, so more territories could gather in one place around a single green-screen setup with the right technology.

Above: About 45 percent of viewers in Norway also played the mobile game that accompanied Lost in Time.

Image Credit: Gamelab

Question: At this point you have contestants playing the game in the studio and the audience interacting through the app. Do you ever plan to expand that interaction to a point where viewers using the app can work their way toward becoming contestants?

Einarsen: Definitely. We didn’t do any of that for the first iteration, but it’s a plan. We also want, for the tie-in game, something like what you see in Mario Kart, where you see the TV contestant as a ghost on your screen. You can have that intermediate connection.

We also want to do, for the live shows, something more than just voting for a contestant like you do on game shows today. If you can truly be interactive, maybe you can try and hinder a contestant you don’t want to win. Or if the contestant you want to win has to cross a river, maybe you can try to build a bridge so they can get across. There are all kinds of elements we could introduce when it’s live. Those are the things I’m most excited about, where you can truly help or hinder the contestants in the studio. Hopefully, in a couple of years, I’ll be here talking about that.

Disclosure: The organizers of Gamelab paid my way to Barcelona. Our coverage remains objective.
