Siri, the iPhone 4S’s virtual assistant, has a puzzling new glitch — one with significant moral and political overtones.
If you ask Siri to direct you to a Planned Parenthood, you get the results you’d expect. But if you ask for an abortion clinic more generally, Siri will not return any results, even if they’re available. In some cases, Siri will even return results for “crisis pregnancy centers” that counsel women against abortions.
Similarly, if you simply say, “Siri, I need an abortion,” Siri will respond that there are no abortion clinics nearby, even when the opposite is true. Siri clearly understands the request, and its reply shows it knows the term “abortion clinic,” so this isn’t a failure of language comprehension.
Siri was able to tell us that there were four Planned Parenthood locations near our downtown San Francisco location. However, when we specifically asked for abortion information, we were told nothing was available.
We decided to test a range of related queries, starting with emergency contraception. Siri drew a natural-language-processing blank when it came to Plan B, the brand name for the commonly available emergency contraception pill. Apparently not having the data to interpret the phrase “Plan B” as a brand name, it returned other local businesses containing similar words or phrases.
When we asked for the product with a more general term, emergency contraception, Siri recommended nearby emergency rooms — irrelevant, but better than nothing, we suppose.
When we asked point-blank for “the morning-after pill,” as it is also commonly called, Siri replied with “Ok” and “Is that so?” but did not offer any retailers or directions. (Judgey much?)
Moving to the proactive side of the equation, Siri was able to tell us the location of drugstores where we could buy condoms. But when we asked for birth control pills, Siri said nothing was available nearby.
While it’s not in our purview to offer bald-faced speculation on the reasons for these discrepancies, VentureBeat CTO Chris Peri’s professional opinion is that the supposed glitch is actually “purposeful programming.”
Peri elaborated, “Given how well Siri interprets other requests, and that Google and Bing will give you the proper responses when doing a search, and [that Siri] offers [anti-abortion] CPC sites… this has to have been something placed in the code or taught to Siri by someone(s). If this is the case, then we have a problem here.”
While Apple has been known to hand down judgments on moral issues such as pornography, we’re not certain Siri is taking sides on a particular moral battleground. After all, it’ll still find you an escort service if you ask for a prostitute.
We’ve reached out to Apple for clarification and will update you, dear readers, as soon as more information is available.