
Apple’s investing heavily in artificial intelligence (AI). That much was clear from today’s iPhone and Apple Watch unveiling in Cupertino, California.

The new iPhone Xs and iPhone Xs Max boast the A12 Bionic, a 7-nanometer chip that Apple characterized as its “most powerful ever.” It packs a six-core CPU (two performance cores and four efficiency cores), a four-core GPU, and a neural engine (an eight-core dedicated machine learning processor, up from two cores in the A11) that can perform five trillion operations per second, compared with 600 billion for the last-gen neural engine. Also on board is a smart compute system that automatically determines whether to run algorithms on the CPU, GPU, neural engine, or a combination of the three.

Apps created with Core ML 2, Apple’s machine learning framework, can crunch numbers up to nine times faster on the A12 Bionic silicon with one-tenth of the power. Those apps launch up to 30 percent faster, too, thanks to algorithms that learn your usage habits over time.

Real-time machine learning features enabled by the new hardware include Siri Shortcuts, which lets users create and run app macros via custom Siri phrases; Memoji, a customizable version of Apple’s Animoji that can be made to look like you; Face ID; and Apple’s augmented reality toolkit, ARKit 2.0.


The news builds on Apple’s Core ML 2 announcement this summer.

Core ML 2 is 30 percent faster, Apple said at its Worldwide Developers Conference in June, thanks to a technique called batch prediction. Furthermore, Apple said the toolkit would let developers shrink the size of trained machine learning models by up to 75 percent through quantization.
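The 75 percent figure follows from the arithmetic of weight quantization: storing each weight in 8 bits rather than 32 cuts the weight payload to a quarter of its size, at the cost of a small rounding error per weight. Here is a minimal, illustrative sketch of linear 8-bit quantization in pure Python — a toy version of the idea, not Apple’s actual implementation (which ships in the coremltools package):

```python
import struct

def quantize_weights(weights, nbits=8):
    """Linearly map float weights onto 2**nbits integer levels.

    Returns the quantized byte payload plus the (scale, minimum)
    pair needed to approximately recover the originals. This is a
    toy sketch of the concept, not Core ML's implementation.
    """
    levels = (1 << nbits) - 1
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / levels or 1.0  # avoid divide-by-zero for constant weights
    payload = bytes(round((w - lo) / scale) for w in weights)
    return payload, scale, lo

def dequantize(payload, scale, lo):
    """Recover approximate float weights from the quantized payload."""
    return [q * scale + lo for q in payload]

# A toy "model": 1,000 weights stored as 32-bit floats (4 bytes each).
weights = [i / 1000.0 for i in range(1000)]
full_size = len(struct.pack(f"{len(weights)}f", *weights))  # 4,000 bytes

payload, scale, lo = quantize_weights(weights)
# One byte per weight instead of four: a 75 percent reduction.
print(full_size, len(payload), 1 - len(payload) / full_size)
```

Each recovered weight differs from its original by at most half the quantization step, which is why aggressive quantization trades a sliver of accuracy for a dramatically smaller model file.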

Apple introduced Core ML in June 2017 alongside iOS 11. It lets developers run trained machine learning models directly on an iPhone or iPad, and convert models from frameworks like XGBoost, Keras, LibSVM, scikit-learn, and Facebook’s Caffe and Caffe2. Core ML optimizes models for power efficiency and works without an internet connection, so apps get the benefits of machine learning even offline.

News of Core ML’s update came hot on the heels of ML Kit, a machine learning software development kit for Android and iOS that Google announced at its I/O 2018 developer conference in May. In December 2017, Google released a tool that converts AI models produced using TensorFlow Lite, its machine learning framework, into a file type compatible with Apple’s Core ML.

Core ML is expected to play a key role in Apple’s future hardware products.

In a hint at the company’s ambitions, Apple hired John Giannandrea, the former Google executive who oversaw AI-powered features in Gmail, Google Search, and the Google Assistant, to head up its machine learning and AI strategy. And it is looking to hire more than 150 people to staff its Siri team.
