Announced at WWDC this week but lost in the shuffle of several dozen bigger pieces of news, Apple’s new Measure app for iOS 12 is a curiosity — an extremely rare addition to iOS’s built-in collection of apps, and the first with augmented reality (AR) functionality. If you’re using an iOS device that supports ARKit, Measure will automatically appear on your Home screen when you first install iOS 12, and if you’re like me, you’ll at least want to give it a try before deleting it.
Right now, Measure doesn’t entirely make sense to me. As the name suggests, the app’s key purpose is to measure things. It uses AI and the device’s camera to analyze the real world; you manually choose any two points in a scene, and it tells you the distance between them. If it identifies a shape within the scene, it can automatically assess the length, width, and area of that shape.
The non-trivial problem is that Measure doesn’t actually get the measurements right. I tested it with a number of common reference objects: a nearby Nest Thermostat (measured at 3.0 inches), a slightly farther box (measured 20.5 inches), a more distant printer (measured 14.5 inches), a distant painting (measured 52 inches), and a distant sofa (measured 91 inches). It also automatically assessed the size of the Nest’s mounting plate. Every measurement was off by a different amount — the nearby Nest Thermostat was off by 0.25 inches, its wall plate by 0.5 inches on each axis, the box by around 1 inch, the printer by 1.5 inches, the sofa by 3 inches, and the painting by 8 inches.
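To put those figures in perspective, here is a quick sketch that turns the article’s measured values and “off by” amounts into percentages. One assumption is mine: it treats each error as an undermeasurement (the article doesn’t say which direction each reading erred), so the actual sizes here are illustrative, not verified.

```python
# Percent accuracy implied by the article's figures, assuming each
# "off by" amount is an undermeasurement (direction is an assumption).
readings = {
    "Nest Thermostat": (3.0, 0.25),   # (measured inches, error inches)
    "box": (20.5, 1.0),
    "printer": (14.5, 1.5),
    "painting": (52.0, 8.0),
    "sofa": (91.0, 3.0),
}

for name, (measured, error) in readings.items():
    assumed_actual = measured + error
    pct_of_actual = 100 * measured / assumed_actual
    print(f"{name}: {pct_of_actual:.1f}% of assumed actual size")
```

Under that assumption, the painting reading comes out to roughly 87 percent of the real size, while the sofa lands near 97 percent — the error swings widely from object to object.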
An AI or AR engineer’s response to this is predictable: Given how hard it is for a device to scan a scene and make guesses as to the relative sizes of objects, the fact that it can even come close should be impressive. And to some extent, it is. If Measure had just been a WWDC tech demo, a ruler that could guesstimate a wall-mounted object to between 87 and 92 percent of its actual size would impress some people. That’s a lot better than Siri’s accuracy these days.
But do you really want to rely upon a ruler that offers best guesses? Granted, Measure says that the lengths are “approximate” when you tap on them, but if you’re measuring an item to ship via FedEx, a sofa to place in your living room, or a painting to fill your wall, having to leave a 10 percent margin of error can make a difference. It could mean that the sofa doesn’t really fit into the space, or that the item you’re about to ship can’t actually squeeze into the box you bought.
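A quick worked example shows why that margin matters. The sofa measurement and the 10 percent figure come from the article; the 94-inch alcove is a hypothetical space I’ve chosen for illustration.

```python
# Hypothetical fit check: a sofa Measure reports as 91 inches,
# against a 94-inch alcove, allowing the 10 percent margin of error.
measured = 91.0       # Measure's reading, from the article
margin = 0.10         # 10 percent margin of error
alcove = 94.0         # hypothetical space (my assumption)

worst_case = measured * (1 + margin)  # the sofa could really be this long
print(f"worst case: {worst_case:.1f} in.")  # ~100.1 inches
print("fits safely:", worst_case <= alcove)  # False
```

On paper the sofa clears the alcove by 3 inches, but once you budget for the app’s possible error, you can no longer trust that it fits.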
One of Measure’s semi-cool tricks is that it uses AR to preserve the locations of points you’ve added to a room, so you can continue to move yourself and the camera around and see your measurements. However, you’ll note from the screenshot below that the precisely pinpointed diagonal on the box shifted a little when the camera moved. That’s normal when dealing with AR objects, but it doesn’t inspire confidence in the accuracy of the measurements.
Now that AI and AR are firmly entrenched as industry buzzwords, my concern is that we may be entering an age of “good enough” AI-AR tools that purport to be accurate but on closer inspection are measurably off. What use is a facial recognition tool that is reliably wrong in estimating its subjects’ ages? Until and unless it gets good, it’s just a parlor trick, and in my view, no more worthy of bundling with an operating system than a calculator that only approximates correct answers.
Since it’s still in beta, literally everything in Measure is subject to change before release — it could look different, work better or worse, or get pulled from the final iOS release. Having used many iOS betas over the years, I’ve seen all of those things happen, but most of the time, early features improve at least a little, and what shows up in beta 1 makes it into the final release.
For that reason, I expect Measure to improve before it hits mass availability in September. If it doesn’t, it will be easy enough to press the delete button if you have no use for a rough estimating tool.