Connect with top gaming leaders in Los Angeles at GamesBeat Summit 2023 this May 22-23. Register here.
Accenture is getting serious about helping its clients move into immersive media. The company, whose workforce of 425,000 consultants spans the globe, last week acquired Mackevision, a 3D visualization and production house in Germany that is known for its work on the visual effects for Game of Thrones.
If that sounds like a strange move, the big company believes it is serving its enterprise customers by helping them move into extended reality (XR) — a combination of emerging technologies such as virtual reality, augmented reality, and mixed reality — where millennial customers will spend much of their time in the future. To see a company like Accenture embracing XR is encouraging for the fledgling technology companies that are trying to succeed amid a slowdown in VR’s expansion.
I talked about the new acquisition with Jason Welsh, managing director of Accenture Extended Reality, and Raffaella Camera, head of go-to-market at Accenture Extended Reality, at CES 2018, the big tech trade show in Las Vegas last week. They said the deal is more proof that the company is putting in place the resources (500 new employees, in the case of the Mackevision deal), processes, and technology to help businesses take advantage of VR and AR. It’s kind of like a parallel move to Intel’s announcement last week of Intel Studios, which will help Hollywood create entertainment for new media.
Here’s an edited transcript of our interview.
Jason Welsh: We announced, early this morning, the acquisition of Mackevision, based out of Germany. They’re a CGI shop. They’ve historically been in the automotive industry predominantly — a lot of asset creation for TV commercials and digital marketing channels. They’ve started, in the last several years, to do the same kind of asset production in support of AR and VR experiences. That’s also been predominantly in automotive, but they’re also venturing out into CPG [consumer packaged goods], retail, some work in the entertainment industry. A lot of the headlines have mentioned Game of Thrones, because they did some effects work for the show. That’s not the main business they’re in, though. They’re mainly in 3D asset production.
Last time we talked about how a lot of the conversations we’re having with CIOs and CTOs revolve around, “How am I going to do this at scale?” It’s great to be able to do a one-off experience or a proof of concept, whether it’s for an assembly line AR solution or some consumer solution, but how am I going to do this at scale, across many different types of experiences, against many types of product lines? One of the big areas that the conversation is typically going into is asset creation, in different formats and different levels of fidelity. How do I pull from my source systems? How do I capture physical objects and make that all streamlined so it’s not costing me a fortune?
This acquisition, which we’ve been working on for a while now, is one step toward rounding out some of that capability. We can do more in the XR managed services space. Initially we’ll continue to focus a lot on their core competency, around automotive, but we’ll quickly look at moving into lateral industries like aerospace and heavy equipment manufacturing, things with complex physical products where they’re using the assets, mostly in B-to-B and B-to-C channels, but also up the value chain, in design and engineering use cases, as well as manufacturing and post-sales service use cases.
VentureBeat: For you guys, is this another service to offer?
Welsh: When you think about the vision of what we’re doing around extended reality, it sits within that context. It’s part of creating experiences. It’s also part of doing XR managed services. One element, one service within that, is asset production.
The reality is that some of the assets they’re producing, historically, won’t only feed AR and VR types of experiences, or XR types of experiences. Those same assets may also be used in batch-rendered digital marketing campaigns, where it’s just a flat screen. But I don’t want to have to go shoot the commercial. I just want to use digital assets to re-create the scene. In automotive they’ve done that extensively. We may use that capability or service to support a pure digital marketing campaign, but it’s also going to be heavily focused on supporting AR and VR experiences that we’re designing and creating as well.
Raffaella Camera: We have a lot of practice in this area already. We produce a lot of assets for anything that’s web-related, for mobile, things like that. This is really the extension of that same concept in the XR world. It showcases how serious we are as far as enlarging and building the entire practice, the strategy we have.
VB: I’ve heard of something similar from Epic Games. They were talking about people using the Unreal engine to create TV commercials with race cars, and they could just stick different cars into the scenes that were already shot.
Welsh: You look at automotive, most of the cars you see on TV now aren’t real. They’re CGI. It’s just a lot less expensive to shoot a commercial using an engine like Unreal. The advantage you have there too is you can batch render. I can have very high polygon-count assets to create that pristine experience.
VB: The guys in Germany, would they be using something like Unreal, or–?
Welsh: No, they use Unreal. That’s one of the engines they’re good at. They also work a lot with the upstream tools like Maya and all those other traditional 3D modeling tools. Unreal is more for the rendering. They also work closely with systems from Dassault, the core CAD system.
This is what we’re focusing on a lot. We don’t always want to have to re-create these assets, or even pull stuff out of CAD and then manually re-create it. How much automation can we build into that pipeline, so we can pull from true source systems, and not only create all the 3D assets, but also associate all the product data along with that asset, so that same bundle, if you will, can be used in a consumer experience, but also could be used by an engineer designing a car? Or designing a component of an engine for an airplane.
VB: I did an interview with Neill Blomkamp, the director of District 9. He was talking about how they went and shot a VR scene out in the desert, and then they just captured the whole environment. It was expensive to do that, but now he doesn’t have to take a crew out there again. Now he can place things there with a game engine, people or characters or whatever, and they can interact in this scene any way he wants them to.
Welsh: We’re not seeing this in our business so much, but in the entertainment industry, definitely, they’re starting to use the engines, Unity and Unreal in particular, to do a lot of pre-visualization for shoots. They can re-create the scene and test variations of the scene, even if it’s not the final shoot. Look at it in three dimensions and say, “OK, that works, we’ll do that.”
VB: Do you guys feel like you have a role to play there?
Welsh: On the entertainment side? Right now, more and more, the conversations we’re having with the studios are more around working with them on technology infrastructure. Looking at what tools they may need to look at differently. What’s that pipeline look like, from preproduction through production, actually shooting a film, through capturing the assets for the VFX process? How do we store those assets and make them available for downstream?
By downstream I mean someone creating a VR experience, but — what’s happening today is that when Warner or Disney or Paramount goes and creates a VR experience, they’re usually making all the assets. They’re not using the things that got produced up here, partially because they’re just not thinking about that all the way back in preproduction. Working with clients on that is an area we’re focused on, not necessarily getting embedded in the actual production process of the studios so much.
Camera: What we do with 3D asset production will be required by any industry, more and more. We’re getting ready to support that. Some industries are ahead of others. Automotive is certainly ahead. This specific acquisition gives us a lot of people doing that work, with a lot of experience, particularly in automotive, in very high-end asset creation. That can be applied in automotive, retail, entertainment if needed — there will be a bottleneck in how much 3D content can be created.
VB: Did you catch anything from Intel’s keynote?
Welsh: I did see the announcement of the new studio they opened up in Manhattan Beach for immersive entertainment.
VB: Yeah, their big capture stage. That’ll be interesting. They had their Ready Player One demo, and the Intel booth re-created in Sansar.
Welsh: I’ve been walking around here and seeing all the different location-based VR entertainment setups. I do think, as far as the entertainment industry — we’re not necessarily going to try to become a VFX shop, get into that realm of the industry. Even though Mackevision has done stuff like that, it’s not necessarily what we’re trying to build to. It’s more just generally different tiers of 3D asset creation that can be used in a lot of different ways.
Mackevision represents premium-quality asset creation, as well as a lot of sophistication in how they can take the data from source systems and turn it into these usable assets. We’ll build around that tiering of different qualities of experience. A CPG product may not need something with quite the density and fidelity of the asset I’m going to render in certain environments, whereas the entertainment industry wants very high-fidelity assets. Automotive is closer to entertainment than CPG. But the entertainment industry is helping. They’re driving demand around the technology, enabling stuff to get a little more streamlined.
VB: Intel was saying their role was more enabling. They’re not going to make movies or VR experiences, but they’re going to show people how they can create new kinds of media.
Welsh: I won’t speak for Unity, but I think if you look at that, it’s starting to be used in a way in entertainment that’s not necessarily what they originally thought of, in terms of pre-visualization. They’re moving more upstream, into the actual creation of the product, and not just products that are created after the core movie.
VB: So is it more that you’re saying to your clients, “This stuff is important and you need to know about it”?
Welsh: The conversation we’re having, even with CPG and retail — automotive has been in this for a while, because they’ve known that they can use assets like this. They needed this capability to create assets that they can then use in a lot of different ways. We have CPG companies — retailers, just to take that example: They’ve had the same need, but their need has been more around store visualization and planograms. They needed a constant pipeline of assets, whether from the CAD system or from scanning and capturing the physical product, and now they’re moving to a VR environment. “Let’s see what this planogram looks like.”
That may be Nestle coming to retail and saying, “This is what we want it to look like.” Or it may be the retailer looking at a new store design, and they need people all around the world to review that store design. They need all those individual assets in the environment, and if it doesn’t feel real, it doesn’t replace a physical store walkthrough. They’re already seeing the need, and they’re asking the questions.
Nobody, from our perspective, has started to capture this market of 3D asset production for use in industry. Mackevision was probably one of the biggest players who could say that was the crux of their business model. You had agencies doing it, some of the consulting firms doing it, but more as a one-off versus a dedicated practice.
Camera: Our clients know their need for 3D assets. They’ve all played in the space for a while. This allows us to augment our ability to provide expertise at scale, in the production of 3D assets at the high end, which is increasingly what our clients are looking for. How do we make this more efficient? How do we create assets in a certain way? That’s what the industry is looking for.
VB: Will you continue to operate them the same way?
Welsh: Initially. We already have a very large digital content outsourcing practice for two-dimensional content. We have a lot of expertise around what that looks like, how you build that on a global scale. We have people in Costa Rica and Mumbai and eastern Europe. We have a very global network model. A lot of the process behind that, we do that very well. These guys have done something a bit different very well. We’re going to want to get inside there, start to work with them, and not break what’s working well. We’ll take some of their expertise and combine it with how we’ve operated our more traditional practice, and then start to integrate the two over time.