With both ARKit and ARCore available to the public, augmented reality is now enabled on more than 500 million devices. There are more than 2,000 AR apps in the iOS App Store and another 200-plus on Google Play. With few breakout hits so far, many are wondering what the killer use cases for AR will be. We can examine the growth of the mobile app ecosystem to better understand how mobile AR will evolve.
The growth of the mobile ecosystem was driven, in part, by three use cases: Creative, Contextual and Connected apps. These same use cases are now pushing the AR ecosystem past the initial novelty stage into creating true value for users.
Apple launched its App Store in 2008 with roughly 500 apps; by the end of 2017, it offered more than 2 million. The first mobile apps were novelties: simple, single-purpose tools such as flashlights. But the ecosystem quickly matured by extending existing user behavior onto mobile, and gimmicky apps gave way to a wide spectrum of apps that are now integral to our everyday lives.
Creative apps give users the ability to create new and better content. Instagram makes everyone a professional photographer, and Musical.ly makes everyone a rock star. Creative apps enhance photos and videos with filters, stickers, music and more; by letting users share more than raw camera footage, they unleash greater creative potential. Similarly, AR is enabling new creation tools for sharing experiences beyond the camera. We see this with Snapchat's lenses, but it's just the tip of the AR iceberg.
Filmr is a mobile AR app that makes film editing fast and easy. It offers a variety of animated 3D characters and objects that can be placed into a scene, and the combination of its editing tools and AR effects lets anyone become a special-effects wizard.
While these effects feel new and magical now, they will only improve over time, and we're seeing a surge of 3D artists creating assets for AR and sharing them through libraries such as Sketchfab and Google Poly. Companies like Adobe, Torch3D and Simile are building tools to make 3D content creation accessible to everyone. Finally, improvements in rendering techniques like occlusion and physically based rendering will make digital objects in a scene increasingly hard to distinguish from physical ones.
Mobile apps became more useful when developers added context to experiences. Yelp combined an existing dataset of local business information with location awareness to enable discovery on the go. Maps became more useful on mobile devices with turn-by-turn directions. User input also helped companies like Foursquare and Waze build up their datasets, compete with incumbents and create defensible data moats. Done right, the combination of tools and data can create powerful apps.
Similarly, AR becomes more useful the better we understand the world. Plane detection, point clouds, marker/image tracking and machine learning are all tools that enable a deeper comprehension of the real world. With this understanding, AR developers can overlay content and data that enhance your knowledge of your surroundings. Google Lens is one of the best examples of contextual AR, showing how combining different tools and data can produce high-value experiences.
Entur is the official public transit app of Norway. It uses AR to overlay the locations of public transit stops relative to the user's position. Rather than making users translate a digital map to the real world, Entur visually shows where stops are located and provides directions.
AR tools are quickly evolving to help developers better understand the world and add context to it. ARKit 1.5 and ARCore 1.2 added support for vertical planes and image/marker tracking. Mapbox provides a location-based AR platform, making it easy to bridge geolocation with AR. Apple, Google, Amazon and others offer machine learning tools that recognize images and objects in the real world. As these tools continue to improve in accuracy and coverage, we will see more contextual AR experiences that enhance our real-world surroundings.
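To make these tools concrete, here is a minimal sketch of how a developer might enable the features mentioned above using ARCore 1.2's Android (Java) API: vertical plane detection plus a marker image registered for tracking. The method name, class name and the "transit-stop-marker" image name are illustrative assumptions; session creation, permissions and error handling are omitted.

```java
import android.graphics.Bitmap;

import com.google.ar.core.AugmentedImageDatabase;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

public class ContextualArSetup {
    // Sketch: configure an existing ARCore Session to detect vertical
    // planes (added in ARCore 1.2) and track a reference image.
    // `markerBitmap` is assumed to be loaded elsewhere in the app.
    static void configure(Session session, Bitmap markerBitmap) {
        Config config = new Config(session);

        // Detect both horizontal and vertical planes.
        config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL);

        // Register a marker image; ARCore reports it as an AugmentedImage
        // trackable once it appears in the camera feed.
        AugmentedImageDatabase imageDb = new AugmentedImageDatabase(session);
        imageDb.addImage("transit-stop-marker", markerBitmap);
        config.setAugmentedImageDatabase(imageDb);

        session.configure(config);
    }
}
```

A transit app like Entur could anchor its stop markers to detected vertical surfaces or recognized signage using exactly this kind of configuration.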
In 2011, we saw an explosion of on-demand apps such as Uber, Postmates, Lyft, Instacart and many others. These connected experiences enabled goods and services to arrive with a tap on our phones. Mobile apps were no longer single-player experiences but connected a wide variety of customers to service providers. Better connectivity, the ubiquity of smartphones and location data enabled these services to solve customer issues at scale. Even today, new breakout hits, like HQ Trivia, are connected apps that bring people together in a shared experience.
AR is shifting from individual use cases to connected experiences. Streem is an AR platform that connects customers with service providers: with the tap of a button, users can reach service professionals to solve their problems. The professionals can identify issues visually and provide support and directions by overlaying information onto the user's world.
Streem is a great example of how connected AR benefits users today, and many more multiplayer AR experiences are emerging. This trend will only accelerate with the rise of the AR Cloud. At I/O 2018, Google released Cloud Anchors to enable multiplayer AR experiences in the same location, a first step toward connected experiences. Once the real world is accurately mapped at the point-cloud level, offering persistence and scale, we will see another explosion of connected AR that rivals the on-demand revolution.
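The Cloud Anchors flow can be sketched as follows, again against ARCore 1.2's Android (Java) API: one device hosts a local anchor with Google's service, receives an ID, and a second device resolves that ID into an anchor at the same real-world spot. The class and method names here are illustrative, hosting is asynchronous in practice (an app would poll the anchor's state each frame rather than check it once), and transporting the ID between devices is left to the app.

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

public class CloudAnchorSketch {
    // Device A: host a locally created anchor so another device can find it.
    static String shareAnchor(Session session, Anchor localAnchor) {
        // Cloud anchors must be enabled on the session's configuration.
        Config config = new Config(session);
        config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
        session.configure(config);

        // Begin hosting: ARCore uploads visual feature data around the
        // anchor to Google's service; the returned anchor's state then
        // updates asynchronously over the next few frames.
        Anchor hosted = session.hostCloudAnchor(localAnchor);

        // Simplified: a real app polls getCloudAnchorState() per frame
        // until it reaches SUCCESS before reading the ID.
        if (hosted.getCloudAnchorState() == Anchor.CloudAnchorState.SUCCESS) {
            return hosted.getCloudAnchorId();
        }
        return null; // still hosting, or hosting failed
    }

    // Device B: resolve the shared ID into a local anchor positioned at
    // the same physical location, enabling a shared scene.
    static Anchor joinSharedAnchor(Session session, String cloudAnchorId) {
        return session.resolveCloudAnchor(cloudAnchorId);
    }
}
```

Note that in this first release the hosted map data is transient; the persistence and scale described above would require anchors that survive across sessions and large mapped areas.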
2018: The year of mobile AR
While some are disappointed by the current pace of AR adoption, mobile AR is evolving much as similar platforms did. New computing paradigms are rarely enough on their own to produce killer use cases; a combination of tools, data and scale is necessary to make the jump to the next level of consumer experiences.
With the launch of ARCore earlier this year, mobile AR already has meaningful scale across both platforms. That scale (more than 500 million AR-enabled smartphones) is drawing nearly every company to experiment with spatial computing and proprietary data, combining that data with APIs to create new consumer AR experiences.
Perhaps most importantly, the tools and frameworks for building robust AR experiences are rapidly evolving. Both Apple and Google have quickly added key spatial features to their respective SDKs. (And expect Apple to jump into the AR Cloud game after Google's recent launch of multiplayer AR experiences.)
Every week a startup launches new tools that improve some part of the AR pipeline, from asset creation to real-world interaction. As we saw with the mobile app ecosystem, now that these pieces are in place, expect mobile AR to reach the next level in 2018 with new use cases and experiences.
Danny Moon is founder & CEO of Viro Media, a platform for developers to build AR and VR applications.