I think it’s pretty clear that there was no intentionality behind the fact that Siri, the AI assistant on the iPhone 4s, turns out to be pretty good at directing users to anti-choice crisis pregnancy centers, but not to abortion clinics (though it seems to find Planned Parenthood very easily when searched for by name). Some of it may simply be that Apple relies heavily on external databases like Yelp to source answers to queries. And pursuant to that, I think Jill Filipovic nails it:
That data is often messy, and savvier companies will pay for the data about them to be accurate and to include the full range of their services. Abortion clinics and other women’s health facilities, obviously, are not dedicating tons of time to figure out how to optimize their search results. So the data is crappy to begin with. To fix that, programmers go in and add tens of thousands of little tweaks to a program like Siri to make it as accurate as possible, and also to include some jokes (like where to hide a dead body). But when programmers are mostly dudes, the lady-stuff just gets… ignored. So Siri knows 15 different ways to say “oral sex performed on a man” and can find a place to get it, but anything involving female sexuality at all leaves her clueless. Which doesn’t make it excusable. It’s pretty appalling that programmers thought far ahead enough to know where to send users who needed to remove rodents from their buttholes, but didn’t consider a medical procedure that 1 in 3 American women will have. I mean, they appear to have thought far ahead enough to have Siri respond to the boyfriend of the woman who is pregnant, but not to the woman herself.
On the first point, and sort of pursuant to the point I made earlier this fall about tech infrastructure for the feminist blogosphere, it would be very smart strategic giving for someone to set up a fund to optimize the hell out of progressive service providers’ sites. I’d be pretty concerned about attempts to politicize algorithms themselves, because I think any step in that direction can have profound and dangerous consequences, but it’s important to make sure that progressive organizations have all the resources they need to game those algorithms as effectively as possible.
Second, making technology for women isn’t really a matter of color, or angles, or whether it fits in your purse. It’s about whether the snazzy, solves-all-your-problems technology (which is unquestionably the way Apple is marketing Siri, rather than as a beta) actually serves that purpose for all of your customers. If your ability to think about the varied needs of your consumers only extends to thinking about the varied needs of men, you’re not actually as expansive a thinker as you believe yourself to be. Tech companies should be particularly attentive to female feedback on products like this not because our tiny girl brains will give them marketing ideas, but because artificial intelligence is about perspective, not just information.