Do the specs matter on a new smartphone? Google wants you to think they don’t matter as much anymore, and I can see some logic in that view.
To the casual user, what matters is whether the phone responds to voice commands, processes the information, and even just sits on the table and looks cool while you do other things. At the same time, we all know the specs do matter — and Google took great pains to cover the specifications on the new Google Pixel 2 phone.
But what’s really going on here? Google is not afraid to call out Apple in subtle ways, mentioning at the reveal event yesterday that the Pixel 2 and Pixel 2 XL share the same optics, a swipe at the iPhone 8 and iPhone 8 Plus, which use different cameras.
“Moore’s law and Dennard scaling are ideas of the past,” said Rick Osterloh, the senior vice president of hardware at Google, asserting that the steady march of processing power improvements is not as important as it once was. “The playing field for hardware components in smartphones is leveling off. Core features are table stakes now. We’re taking a very different approach at Google. Smartphones are reaching parity on their specs.”
He’s right about one thing: The user doesn’t care that much. They walk into an AT&T store or shop online, and they see the fancy new features. The bot can give you directions and text your wife all in one step! The image recognition features can identify where you’re standing based on a photo! Of course these features depend greatly on the chipset, RAM, and operating system to work — but the part where Google has a point is the long-term strategy. The user is not as into the speeds and feeds as they once were because they now assume all phones will be fast and reliable. They don’t pay as much attention to Intel and AMD and the chipset innovations that seem to be inching forward with each passing year. They don’t want to keep track of the exact processor used on a new smartphone, since they figure it’s a little better and a little faster than the one from last year or even a few months ago.
It’s a sign of overproliferation. Someone seems to have some kind of launch event every month: faster, better, longer-lasting. But what about smarter? That’s what the user base really wants: please relieve us from the tedium of tapping on apps all day. Most of the apps we use run fast enough now, there’s no issue with storage for most of us (unless you shoot video in 4K), and there’s plenty of RAM to keep things running smoothly for the most part.
Few of us buy a new phone and show it to friends and family and say “Look at how fast this runs for apps!” Most of us set the phone down these days and tell Google to turn the flashlight on or solve a tough math problem. That’s the cool part. That’s the part that sells phones (and tablets, and eventually laptops like the Google Pixelbook).
All that said, the AI on your phone will require a lot of processing power. It’s why Apple is using a neural engine on the new iPhone X to handle tasks like scanning your face. The great irony is that it feels like Google is saying “it’s not about specs” while likely refocusing entire teams on how to speed up AI processing on modern chipsets. The need for speed has not changed at all. What has changed is market perception: the conversation is now about what all of that speed lets us do. Importantly, it’s now about cloud processing and using the vast storehouse of data from other users to make AI work effectively.
And, by the way, that’s all incredibly dependent on chipsets, RAM, and storage.