Better file formats and standards for representing 3D structures, like USD and glTF, have played a crucial role in advancing the metaverse and digital twins. However, there has been far less agreement on how to represent materials. Vendors, artists and enterprises work around this today by staying within a single ecosystem of design tools and rendering engines, or by generating multiple versions of each asset.
Now, the 3D industry and industrial designers are exploring ways to promote material interoperability across tools. This could allow creators or businesses to build a virtual representation of new cloth, upholstery, styles, shoes or paint and have it rendered accurately across tools and 3D worlds everywhere.
There are actually two complementary material interoperability challenges. First, each rendering engine has a different approach for capturing and representing the physical appearance of materials under various lighting conditions. Second, there are multiple ways of representing the physical properties of materials, such as how they fold, drape, feel, blow in the wind or resist fire.
It could take a while for the industry to converge on any one format. Various file formats have emerged to help exchange materials across tools and virtual worlds, including U3M, AXF, MDB, MTL, KMP and SBS, each with its own strengths and weaknesses. It may be that industry-specific formats dominate within their respective domains, while others are used across domains.
A realistic look
Enterprises creating 3D assets for games and entertainment are exploring how better materials processing techniques like physically based rendering (PBR) can improve the look of virtual worlds. “People think of a material as a fabric, commonly, but the 3D industry talks about materials as a visual thing,” Elliott Round, co-founder and CTO of M-XR, a 3D production platform, told VentureBeat.
Most people are familiar with the way primary paints like red, yellow and blue are combined to create a variety of colors. Materials take this a step further with additional texture maps representing other properties like albedo, metalness, roughness, opacity and diffuse. This is where it gets complicated. “Different render engines have different amounts of material properties,” Round explained. “Some will have five parameters, while others can have ten, so they can all work slightly differently. That’s something we are hoping to solve with other companies to unify 3D a bit better.”
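Round's point about mismatched parameter sets can be illustrated with a small sketch. The conversion below, from a metal/roughness material to a specular/glossiness one, is a commonly used approximation in PBR pipelines; the channel names are conventional rather than tied to any specific engine, and single scalar or RGB values stand in for full texture maps.

```python
# A material in a metal/roughness workflow. Each key names a texture-map
# channel, here reduced to one value per channel for simplicity.
source_material = {
    "albedo": (0.8, 0.1, 0.1),  # base color (RGB)
    "metalness": 0.0,           # 0 = dielectric, 1 = metal
    "roughness": 0.4,           # microsurface roughness
    "opacity": 1.0,
}

def to_specular_workflow(mat):
    """Approximate conversion to a specular/glossiness representation.

    Uses the standard PBR convention that dielectrics reflect roughly
    4% of incoming light, while metals tint specular by their albedo.
    """
    albedo = mat["albedo"]
    metal = mat["metalness"]
    specular = tuple(0.04 * (1 - metal) + c * metal for c in albedo)
    diffuse = tuple(c * (1 - metal) for c in albedo)
    return {
        "diffuse": diffuse,
        "specular": specular,
        "glossiness": 1.0 - mat["roughness"],  # glossiness is inverted roughness
        "opacity": mat.get("opacity", 1.0),
    }

converted = to_specular_workflow(source_material)
```

A real exchange format has to do this for every channel of every target engine, including channels the source simply does not have, which is why a shared superset of properties is so appealing.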
The industry has traditionally faced computational and memory constraints for accurately rendering materials. But now, these constraints are starting to fall away with better computers and algorithms. “I’m hoping we get to a position where we no longer have to sort of cut corners and hack materials because there are fewer constraints,” Round said. “It could become unified like in the real world.”
His company has been developing tools and techniques for quickly capturing the visual properties of real-world objects into virtual worlds. They started out using tools like photogrammetry and structured light scanning to capture 3D objects. “All of these approaches give you really good 3D geometry, but none of them will give you material information. And that is arguably what is key to photorealism,” Round explained. This includes aspects such as how light reflects off an object and whether it is scattered, absorbed or transmitted.
His team also explored various kinds of swatch scanners often used in the textile industry. These types of scanners from companies like Vizoo and X-Rite can capture visual material properties by scanning fabric swatches or pieces of paper, which artists and enterprise apps can later apply to 3D objects. Round said these scans are really good but don’t work particularly well for capturing a whole object, prompting research into better whole-object capture techniques. Epic recently invested in M-XR to help scale these tools for 3D creators.
A realistic feel
Companies making physical materials, such as textiles, upholstery and clothing, face additional material challenges. They also need to capture the physical feel of things using various tools and approaches. For example, Bru Textiles, a Belgian textile giant, spent four years developing a workflow for capturing visually and physically accurate textile digital twins for its new Twinbru service. Twinbru partnership development manager Jo De Ridder told VentureBeat, “[The digital twin] is a 100% replica of the physical fabric both physically and specification wise.”
This helps design firms create realistic prototypes, such as a new hotel lobby, and quickly explore variations for clients. In the past, they would have to approximate the look through a swatch book and create a mockup that did not always look the same as the finished product. “Having digital twins shortens the supply chain, reduces complexity and increases accuracy,” De Ridder said.
However, it is a complex process. It took the Twinbru team years to develop and streamline the workflow to capture the visual and physical properties and render these into digital twins. They used a combination of X-Rite and Vizoo scanners to capture AXF and U3M files representing visual aspects of the fabrics. In addition, they worked with Labotex to capture the physical properties of the textile into an SAP database that is converted into the appropriate physics engine format. They have created digital twins of the fabric available for Nvidia Omniverse, Chaos Cosmos, ArchiUp and Swatchbook.
Creating a more material metaverse
Improved industry collaboration could help streamline similar workflows for other companies that make and work with textiles, paints, cloth and other materials. A 2020 Digital Fabric Physics Interoperability survey by the 3D Retail Coalition concluded that it is now possible to measure five fabric physics attributes once and accurately translate these into the equivalent physics values for multiple 3D apparel software solutions. These include bend, stretch/elongation, shear/diagonal stretch, weight and thickness.
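The survey's "measure once, translate everywhere" idea can be sketched as a lookup-and-scale step. Only the five attributes (bend, stretch/elongation, shear/diagonal stretch, weight, thickness) come from the survey; the tool names, target parameter names, units and scale factors below are illustrative assumptions, not published conversion tables.

```python
# One set of lab measurements for a fabric (units are illustrative).
MEASURED = {
    "bend": 45.0,        # bending stiffness
    "stretch": 12.0,     # elongation
    "shear": 8.0,        # diagonal stretch
    "weight": 220.0,     # g/m^2
    "thickness": 0.45,   # mm
}

# Per-tool profiles mapping each measured attribute to that tool's
# native parameter name and a linear scale factor (hypothetical tools).
TOOL_PROFILES = {
    "apparel_tool_a": {
        "bend": ("bend_stiffness", 0.1),
        "stretch": ("elongation", 1.0),
        "shear": ("diagonal_stretch", 1.0),
        "weight": ("weight_gsm", 1.0),
        "thickness": ("thickness_mm", 1.0),
    },
    "apparel_tool_b": {
        "bend": ("bending_resistance", 2.0),
        "stretch": ("stretch_pct", 0.5),
        "shear": ("shear_pct", 0.5),
        "weight": ("weight_oz_yd2", 0.0295),   # g/m^2 -> oz/yd^2
        "thickness": ("thickness_mils", 39.37),  # mm -> mils
    },
}

def translate(measured, tool):
    """Translate one set of measurements into a tool's native parameters."""
    profile = TOOL_PROFILES[tool]
    return {param: measured[attr] * scale
            for attr, (param, scale) in profile.items()}

params_b = translate(MEASURED, "apparel_tool_b")
```

Real translations are unlikely to be purely linear, but the structure is the point: one measurement pass feeding many per-tool profiles, instead of re-measuring the fabric for each application.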
Industry leaders are also starting to collaborate on open standards. For example, Browzwear, which makes 3D fashion design software, has been collaborating with Vizoo to drive the adoption of the Unified 3D Material (U3M) standard in the fashion industry. One big plus compared to other formats is that it can capture both the fabric’s visual information and physical properties.
“I truly believe that evolving the metaverse to the point of mass adoption requires materials and textures to be accurately represented,” Avihay Feld, CEO of Browzwear, told VentureBeat. “Synthetic visions involving digital twins as frozen snapshots of the physical world are a good start. Digital twins as an evolving picture of reality that is synchronized with reality are even better.”
He argues that it is not clear where the metaverse is going, but it is easy to imagine two possibilities. One is a metaverse that is a departure from reality, where virtual worlds defy the laws of physics. The other is a metaverse that imitates reality so users would have experiences that are analogous to those possible in the real world.
He believes that a true-to-life representation of both the visual and physical properties will be essential in this second case. Realistic objects inside the virtual world will make it more immersive and compelling, and they will also let the metaverse support a wider variety of use cases. A major one is commerce: not of strictly digital items, but of real-life objects, which requires true digital twins that both visualize textures and simulate an object's physics accurately.
“It is possible that these two possibilities will coexist, but without the true-to-life experiences, it’s likely the metaverse will remain a fantasy world for the tech-savvy instead of being the transformative new universe it has the potential to be,” Feld said.