
Game shows have been losing younger audiences for some time. The Future Group wants to change that by making game shows more interactive with mixed reality technology.
The Oslo, Norway-based startup — which raised $20 million in venture funding — teamed up with FremantleMedia (maker of the Idol TV shows) to create Lost in Time, a new game show that uses The Future Group’s Interactive Mixed Reality (IMR) on both TV and mobile devices.
Lost in Time, which ran a pilot test season in Norway, featured real-time special effects combined with real-life contestants who competed for prizes. The audience at home could compete in the same challenges virtually on iOS and Android, a blend of primetime TV and mobile entertainment.
The Future Group also created another technology, dubbed Frontier, that enables esports broadcast producers to insert animated characters into a broadcast in real time. In this case, it inserted characters from Street Fighter V into a broadcast of the Street Fighter V International tournament.
I spoke with Ellen Lyse Einarsen, vice president of interactive at The Future Group, at the Gamelab gaming event in Barcelona last week. Here’s an edited transcript of our interview.

Image Credit: Gamelab
GamesBeat: Tell us about your background.
Ellen Lyse Einarsen: I’m from Oslo in Norway. I’ve been in the games industry for more than 10 years. My background is in screenwriting for film and TV. I came in a bit sideways, working on Age of Conan, an MMO that was launched about nine years ago by Funcom. I started out as a voice-over director for that game. I realized I loved making games, so I stayed and became an associate producer. I realized I loved producing and designing games as well, so I also worked on The Secret World.
I spent a few years in Germany, working on Facebook games and a couple of mobile games. One of those was Angry Birds Epic, published by Rovio. The other was Sacred Legends, from Deep Silver Fishlabs. Then, a year and a half ago, I was contacted by a new startup in Norway, The Future Group. They were going to blend television, e-commerce, and games into one ecosystem. I thought it was a chance I couldn't pass up, so I packed my bags and moved back home.
GamesBeat: The company is creating a game show, Lost in Time. The contestants are filmed doing challenges against a green screen in a TV studio, and Future Group’s technology renders a virtual world around them live. The viewer at home sees the contestants immersed in a castle or an abandoned mine, a virtual environment, and the contestants’ interaction with physical props triggers a reaction in the virtual world. The audience can play along as well, doing challenges on a mobile app in parallel with the contestants. Did I get that right?
Einarsen: More or less? When I started at Future Group, I was told that it normally takes people about three months from when they start at the company to when they actually realize what we do. It's always fun to try to wrap it up in one paragraph. But more or less. We render real-time graphics with the Unreal engine, using our technology on top, so that the players are in a virtual environment. They interact with it using physical props and partake in challenges. Then the players at home can follow along with the app, doing the same things.
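To make that concrete: the core loop Einarsen describes is a mapping from physical prop triggers to events in the rendered scene. The sketch below is a minimal, hypothetical illustration of that idea; none of the names reflect The Future Group's actual Unreal-based pipeline, which isn't public.

```python
# Hypothetical sketch: routing physical prop triggers to virtual-world
# effects. These names are invented, not The Future Group's real pipeline.

PROP_EFFECTS = {
    "lever_pulled": "open_castle_gate",
    "torch_lifted": "ignite_mine_shaft",
}

def fire_scene_event(effect: str) -> None:
    """Stand-in for pushing an event into the real-time renderer."""
    print(f"scene event: {effect}")

def on_prop_trigger(prop_event: str) -> None:
    """Called when a sensor on a physical prop fires during the live show."""
    effect = PROP_EFFECTS.get(prop_event)
    if effect is None:
        return  # unmapped props do nothing in the virtual scene
    fire_scene_event(effect)

on_prop_trigger("lever_pulled")  # -> scene event: open_castle_gate
```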
GamesBeat: You showed a video of the production earlier. Can you explain what was happening there, and what results came out of Lost in Time?
Einarsen: The video shows a different working day for game developers. You saw the green screen studio there. The entire team was in there — carrying props, testing, sitting on the motion platform. There were assistants, game designers, backend developers. We were all in there developing games in a different way than we were used to. We’re used to thinking about what’s fun for a user who sits with a device. Designing this, we also have to think about what’s fun for a contestant in a game show. It also has to be fun for people sitting at home and not participating, but watching.
It was a different way of thinking. My closest partner was Matt Claxton, who was also a game producer, though coming from the TV industry, that title means something different. He'd worked on game shows in England for 15 years. We would sit together and try to figure out which mechanics would work both on mobile and on TV. It was fun.
For our dry run games, we had a motion platform we could control, and the props could be rigged. We were the guinea pigs to figure out the fine line between fun and dangerous. We wanted to trigger real emotions in the contestants. Most of the time I would come home bruised and beaten, but we found that right point where, yes, now it gives you that theme park tingle, but you’re never really scared. That was a lot of fun to do.

Above: Lost in Time
Image Credit: Post Perspective
GamesBeat: And you did the pilot in Norway?
Einarsen: Right. The first season of Lost in Time just aired in Norway. We used Norway as a soft launch market to see how the technology would work. Are people participating? Are they enjoying it? It's been fun to see, because we were estimating that maybe five or 10 percent of viewers would play along. We ended up seeing 45 percent interactivity for the last show. It started at about 20 percent, and with each show more people would participate. They found that the experience gave them something extra.
The games are designed within a traditional free-to-play model, but we don’t have any in-app purchases. It’s free to play, but with meta-features for sharing and retention, so players can play throughout the week and earn virtual currency that they then sink into the live broadcast for a chance to win real prizes.
We have skill-based tournaments where the best player wins, but also, with each show, the players at home choose a team. We would separate Norway into two halves – women and men, or under 25 and over 25, or attached and single – and calculate the average score of players throughout the show. Toward the end we’d take a random person from the winning side, and they would win the same amount of money as the winning contestant on the show. It was always in everyone’s interest that the winner on the show got as much prize money as possible.
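As a rough sketch of that team mechanic (the data and names below are invented for illustration, not taken from the show's actual backend): average each side's scores, then draw a random player from the winning half.

```python
import random
from statistics import mean

# Hypothetical player data; the real show split viewers by gender, age,
# or relationship status and tracked scores through the broadcast.
scores = {"anna": 820, "bjorn": 640, "kari": 910, "lars": 700}
team_of = {"anna": "under 25", "bjorn": "over 25",
           "kari": "under 25", "lars": "over 25"}

def pick_audience_winner(scores, team_of):
    """Average each team's scores, then draw a random member of the winner."""
    teams = set(team_of.values())
    averages = {t: mean(s for p, s in scores.items() if team_of[p] == t)
                for t in teams}
    winning_team = max(averages, key=averages.get)
    return random.choice([p for p in scores if team_of[p] == winning_team])

print(pick_audience_winner(scores, team_of))  # "anna" or "kari" here
```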
GamesBeat: There have been historical examples of interactive video experiments, but it seems like there’s a lot more engagement in this. What are the origins of the technology? I guess the idea came from Hollywood in the first place?
Einarsen: Our founder, Bård Anders Kasin, worked at Warner Bros. He was a technical director in special effects on the Matrix movies. That's when they started using game engines to pre-render special effects. He developed the technology that allows this real-time rendering. What's cool about it is that it's live-capable, but you could also use it in post-production as you traditionally would. There are options for how to solve problems.
GamesBeat: There’s a variety of uses for this technology. It’s also been applied to a Street Fighter tournament, an esports broadcast. Can you tell us more about what went into that?
Einarsen: This was a collaboration we did with Turner about a month ago. It aired on TBS in the States. It showcases the second use of our technology, our third-party product, which we call Frontier. It's a graphics platform that renders broadcast graphics in real time. That lets the cameraman see the AR characters while he's filming, so he can react directly to their movements. He'll see in his monitors how to capture the best shots possible.
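Conceptually, that means running a per-frame loop in which the CG characters are rendered from the tracked camera's point of view and composited into the live feed before it reaches both air and the operator's monitor. The sketch below illustrates the idea with invented names and stub objects; it is not Frontier's actual API.

```python
# Hypothetical sketch of a real-time AR compositing loop; these names are
# illustrative only, not Frontier's actual API (which builds on Unreal).

class StubCamera:
    def capture_frame(self):
        return "video_frame"              # live studio feed
    def current_pose(self):
        return (0.0, 1.7, -3.0)           # tracked camera position

class StubRenderer:
    def render_characters(self, pose):
        return f"cg_layer@{pose}"         # CG characters from that viewpoint

def broadcast_loop(camera, renderer, frames=3):
    for _ in range(frames):
        video = camera.capture_frame()
        pose = camera.current_pose()
        cg = renderer.render_characters(pose)
        composite = f"{video}+{cg}"       # merge live and virtual layers
        # The same composited frame goes to air AND the operator's monitor,
        # which is why the cameraman can react to AR characters live.
        print(composite)

broadcast_loop(StubCamera(), StubRenderer())
```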
GamesBeat: That’s happening in real time, then, whereas most people would be used to seeing computer-generated characters inserted in post-production. Why is it more important to be able to do it live?
Einarsen: It allows for more interaction with the characters. It’s the same thing you see in Lost in Time. If I touch a physical prop live, it’ll blow something up in the virtual world. If we were filmed now, using Frontier, you could interact with a character right here and everyone would be able to see it on the screen. It allows for a more immersive AR experience.
GamesBeat: It must save you on post-production, too.
Einarsen: I wouldn't say that to the CG guys back in the office, but definitely. It depends on the outcome you want. The more you prepare in advance, the more you can have ready, and then you can still do post-production in terms of inserting shots. But you record all the data live. It's all live-capable.
It's a new experience for the viewing audience, especially for the type of show that we made, something that's a showcase for our technology. We're selling Frontier as a third-party product to production companies and broadcasters who want to make these kinds of productions themselves, but we also develop content like Lost in Time, to show what our technology can do and to sell those formats on.
GamesBeat: It’s more than just you guys working on this kind of thing in the industry. At the Game Developers Conference in March, I saw Epic Games show the Unreal engine being used to render a race car live. They’d film a car with a blank QR code on top…