By Sady Doyle (Staff Writer, In These Times)
Last week, tech and feminist blogs erupted with a startling story: Siri, the iPhone 4S app that responds to voice queries with pre-programmed or search-engine-based replies, refused to direct its users to abortion clinics.

Not only that: Apple’s Siri seems programmed to respond to sexual or sex-related questions almost invariably as if the user were a certain kind of cisgender man.

If you tell Siri you’ve been raped, she won’t tell you to go to a hospital or to the police; if you tell Siri you want some Viagra, she knows where you can get it. If you tell Siri you want “a blow job,” she looks up escorts for you; if you tell her that you “want your pussy eaten,” she directs you to pet stores. (Also, she knows the word “dick,” but not the word “clitoris.”)

Many people don’t believe this is sexist. After all, many of Siri’s answers are generated by searching public databases like Yelp or Google; there’s often little intentionality in her answers. But looking up “blow job” on Yelp, according to blogger and journalist Amanda Marcotte, directs you to hair salons; Google “abortion clinic” in New York, and the page fills up with, well, abortion clinics. And the gap is especially strange given how many women are doing important work in social media and technology.

“It’s not that a group of straight privileged (probably white) dudes sat around and said, ‘I know! Let’s make an app that will make it hard for women to find ABORTIONS,’” social media and tech activist Deanna Zandt said in an e-mail. “What happens is that a group of homogenous developers goes into a product situation with serious blinders on.”

» Read the full article at In These Times.