Siri now has 500 million monthly active users and processes 10 billion requests a month. With the launch of iOS 12 today, you can expect a number of major changes to Apple’s assistant.

Here’s a quick rundown of some of the biggest changes for one of the best-known AI assistants around.

Shortcuts

Apple first introduced proactive features for Siri in iOS 8, when the assistant began helping users automatically add people to their contacts or events to their calendar based on recent activity.

Over the years, Siri began to do things like recommend apps and news stories. Shortcuts is the next step in that evolution, and there are a few ways to get them.

Siri Suggestions will serve up recommended actions based on your regular activity on an iPhone lock screen, in the search area, and on an Apple Watch face.

Above: An Add to Siri suggestion in the Photos app

Apps can also recommend Shortcuts, which can be put to use after tapping the “Add to Siri” button.

So you can say “Hey Siri, start my day” to hear about your first appointment, start a specific playlist, and order coffee with a single command.

App developers must enable Shortcuts APIs in order to allow users to create Shortcuts functions. At launch, Shortcuts-enabled apps include Instacart for grocery orders, Walgreens for prescription orders, Sky Guide for finding stars, and Jet.com for ecommerce, as well as travel apps from United Airlines, American Airlines, Kayak, and TripAdvisor.
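To give a sense of what adopting those APIs looks like, here’s a minimal Swift sketch of how an app might expose an action to Siri by donating an NSUserActivity marked eligible for prediction, a capability new in iOS 12. The activity type, class name, and phrasing are illustrative, not taken from any of the apps above.

```swift
import Intents
import UIKit

// Hypothetical "order coffee" screen in a coffee-shop app.
// Donating an NSUserActivity each time the user performs the action
// lets iOS 12 surface it as a Siri Suggestion and as an
// "Add to Siri" candidate.
class OrderCoffeeViewController: UIViewController {

    func donateOrderActivity() {
        // The activity type string must also be listed under
        // NSUserActivityTypes in the app's Info.plist.
        let activity = NSUserActivity(activityType: "com.example.coffee.order")
        activity.title = "Order my usual latte"
        activity.isEligibleForSearch = true
        activity.isEligibleForPrediction = true   // new in iOS 12
        activity.suggestedInvocationPhrase = "Order my coffee"

        // Assigning the activity to the view controller and making it
        // current performs the donation.
        self.userActivity = activity
        activity.becomeCurrent()
    }
}
```

Each donation teaches the system about the user’s habits, which is what feeds the suggestions on the lock screen and watch face.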

The other way to get Shortcuts is with the standalone Shortcuts app. With the app you can create your own custom commands connected with any app that has adopted Shortcuts APIs. That means that for the first time, any app can now work with Siri. So if you order coffee every morning on the way to work, Siri can pop up on your home screen and recommend you preorder your latte.

Beyond custom-made voice commands, the Gallery in the Shortcuts app includes examples of how to do things like play Apple Music, log a meal with a nutrition app, send a loved one your ETA in a text, or automate photo taking or sharing.

Shortcuts isn’t flawless, and some flows in the app can be a little cumbersome. If you make a “Hey Siri, record audio” command, for example, you must choose both a “record audio” action and a “save file” action.

And if you’re a Wunderlist devotee like me and want to add a to-do using Shortcuts, Siri can quickly get you to the right place, but you still have to pick up your phone to enter the task.

To be clear: Shortcuts isn’t designed to make everything hands-free, but voice is an interface perfect for busy people who want to get things done without the need to tap their phone screen.

Any shortcut added to Siri can be invoked from HomePod, iOS, or watchOS devices, unless it requires an app extension, as many of the shortcuts you can create in the Shortcuts app do.

If you’re familiar with how Siri can work with developers, you know that SiriKit has been available for some time now, and it also allows third-party app developers to connect with the assistant. That’s how you can simply say “Siri, call me an Uber please” or “Read my WhatsApp messages” or even do more complicated things, like edit a to-do list.

SiriKit will continue to power deep integrations for specific categories of apps, such as ride-hailing or messaging, while Shortcuts was made to cover everything Siri doesn’t otherwise know how to do.
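For contrast, a SiriKit integration means implementing one of the framework’s fixed intent-handling protocols rather than donating arbitrary actions. Here’s a hedged Swift sketch of a messaging handler; the validation logic is a placeholder, not Apple’s or WhatsApp’s actual implementation.

```swift
import Intents

// Sketch of a SiriKit handler for the messaging domain. SiriKit
// covers a fixed set of domains (messaging, ride booking, payments,
// and so on), each with its own intent-handling protocol.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // A real app would hand the message off to its own service here.
        guard intent.recipients?.isEmpty == false else {
            completion(INSendMessageIntentResponse(code: .failure,
                                                   userActivity: nil))
            return
        }
        completion(INSendMessageIntentResponse(code: .success,
                                               userActivity: nil))
    }
}
```

Because the domains are fixed, Siri understands the semantics of a message or a ride request deeply, but an app outside those domains had no way in, which is the gap Shortcuts fills.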

Shortcuts gets a lot more interesting when you think about the knowledge locked inside apps that Siri can’t yet tap. For example, Siri doesn’t have the ability to serve up surf reports, but Surfline might be the best app in the world for that. So the avid surfer who regularly opens the app can now get what they need with a very simple voice command like “Surf report.”

As simple as that seems, it’s the chipping away at automating frequent tasks that AI assistants can be incredibly helpful with.

Overall, Shortcuts isn’t perfect, but at first glance it seems more flexible for customization than the Routines offered by Alexa or Google Assistant, and it actively works to help you recognize your own habits.

That’s the real beauty of Shortcuts, compared to solutions from Apple’s competitors: You don’t have to do anything but continue using your phone as usual to reap the benefits. Siri will begin to recognize your routines and suggest commands based on what you do with your iPhone or iPad.

Shortcuts can generate suggestions based on 100 different factors like your location, time of day, what’s in your calendar, or even which Wi-Fi network you’re on, then use a predictive machine learning model to recommend an action.

Every feature can be ignored or glossed over by users, but Shortcuts appears to be the first time Apple is extending Siri to a range broad enough to include every iOS app. That’s a benefit to both developers and users, and that’s exciting.

Survey after survey has shown that the lion’s share of actions people take with voice assistants are still things like playing music, setting reminders, or checking the weather. Some people might be alarmed to find that Apple can track their activity this way, but the predictive models behind Siri Suggestions run entirely on device, and data synced between devices is protected with end-to-end encryption.

Tapping into people’s daily habits will get personal for some iOS users, however, and will require trust.

Smarter about nutrition and celebrities

Siri uses a number of knowledge graphs to serve up answers to questions from repositories like Wikipedia, and now draws on a USDA nutrition database so you can ask things like “Hey Siri, how many calories are there in red wine?”

This is an area where Siri may have some catching up to do. In a Q&A test of assistants this summer, Siri was considered better than Alexa and Cortana at answering questions but not as good as Google Assistant.

Siri is also getting smarter about famous people. In addition to its existing ability to give a rundown on a famous person, Siri can now answer questions like “Who is Harrison Ford married to?” or “Who are Beyoncé’s children?” or “Where was Steven Tyler born?”

Translation for more than 50 language pairs

Siri began to do on-the-spot translations in iOS 11. In iOS 12, Siri’s translations leave beta and gain support for 50 new language pairs.

Improved photo search

You could use Siri and your voice to search the Photos app before, but now it’s much smarter, with the addition of Memories and improved object detection AI.

Whereas before you might say “Show me photos from June 2016,” you can now get more elaborate, with requests like “Show me photos from a nightclub” or “Show me photos from a sports arena.” You can also ask for specific environments, like the beach or snow, or simply photos of animals.

More live sports results

Apple began to hire people in 2016 to improve Siri’s sports smarts, and today the assistant can share results from more than 100 professional sports leagues.

With iOS 12, Siri will add F1, Indy 500, MotoGP, and NASCAR to deliver live results like standings, driver statistics, and upcoming schedules. Should this prove to be anything like Siri’s other sports smarts, the assistant should also be able to tell you how drivers measure up to one another in the standings or compare specific stats.

Ask Siri for your password

Should you forget your Netflix or Amazon Prime Video password — or any other, for that matter — Siri can bring it up for you from Safari’s password manager.

More expressive voices in Mandarin and Irish English

More expressive voices for Siri were introduced last fall, and new expressive voices are available in iOS 12, including for Siri speaking Danish, Norwegian, Irish English, South African English, and Cantonese and Mandarin for Taiwan.

Find my devices

HomePod users can now say “Hey Siri, ping my iPhone” or “Hey Siri, find my AirPods” to locate their devices with simple voice commands.

Flashlight support

Not much to explain here. Just say “Hey Siri, turn on my flashlight” to turn on your flashlight.

Talk to Siri on Apple Watch without the need to say ‘Hey Siri’

Again, not much to explain here. With Apple Watch, just raise your wrist to your mouth and the watch’s motion sensors will recognize the gesture associated with speaking to Siri and wake the assistant, with no need to explicitly say “Hey Siri.”

However small a feat this may seem, each step taken to remove friction makes it more likely that a person will actually use their assistant to get stuff done.

In other upgrades on the way for Siri, Apple announced last week that a HomePod software update will allow Siri to make and receive phone calls on the smart speaker and will add support for Spanish speakers. HomePods go on sale in Mexico October 26.