“What may I help you with?” So begins Siri, the voice-activated assistant on the iPhone 4s that promises to deliver accurate, tailored answers for your every need. Unless you’re a woman in search of health services like birth control, emergency contraception, abortion, or even mammograms. Then the interactive search wizard draws a blank.
As RH Reality Check notes today, Siri “appears to have a blind spot” when asked a few simple, even standard reproductive health questions like “Where can I go to get an abortion?” or “Where can I go for birth control?”:
Q: I am pregnant and do not want to be. Where can I go to get an abortion?
“I’m really sorry about this, but I can’t take any requests right now. Please try again in a little while.”
“Sorry, [my name], I can’t look for places in Tanzania.”
“I don’t see any abortion clinics. Sorry about that.”
Q: I had unprotected sex. Where can I go for emergency contraception?
“Sorry, I couldn’t find any adult retail stores.” This was repeated every time.
Q: I need birth control. Where can I go for birth control?
“I didn’t find any birth control clinics.” [This was repeated every time I asked about birth control, all three times. This is also the answer given when I asked, “What is birth control?”]
When ThinkProgress tried to independently verify Siri’s results on these questions, the responses were largely consistent with what other users reported. Searching for “abortion clinics near me” in D.C. yielded only two results: one “crisis pregnancy center” (CPC) 24 miles away and another 74 miles away, in Pennsylvania. Several clinics much closer actually offer abortion services. Siri offered no results for “where can I find birth control?” or “women’s health clinic,” though she would locate Planned Parenthood centers if asked directly. More disturbing, Siri would not respond to requests for help finding sexual assault or rape crisis services, or for emergency contraception.
As ThinkProgress has reported, CPCs that claim to help women in need are actually established by anti-abortion activists with the sole objective of shaming women out of having abortions. Despite receiving federal and state funding, they have a history of preying on and misleading pregnant women seeking abortions, including giving them false medical information.
Siri’s unhelpful and sometimes misleading answers to pressing health questions stand in stark contrast to her prompt and accurate responses to inquiries about nearby escort services. Despite the fact that prostitution is illegal, Siri obligingly located Charming Cherries escort service just a few blocks away.
This is not to say that the Siri software is specifically programmed to ignore relevant information. Siri draws on the Wolfram Alpha answer engine, which “answers factual queries directly by computing the answer from structured data, rather than providing a list of documents or web pages” as Google and other search engines do. It is unclear whether Wolfram Alpha and the other services Siri relies on are selectively filtering information or are simply not programmed to “understand” such basic questions.
Either way, identifying the location of a basic women’s health clinic should not be too complicated for a search engine. If Siri is programmed to be something of a feminist, perhaps Apple can ensure that she devotes more time to answering a woman’s pressing public health questions and less time to escort services and the best burrito in town.
Apple, Inc. did not respond to ThinkProgress’s request for comment.