For the last few months, a bar in Toronto dedicated to e-sports has been getting a weirdly high number of requests for sex workers. The two don't normally go hand-in-hand, but as it turns out, Siri is to blame.
According to Alvin Acyapan, the co-owner of Meltdown Toronto, Siri will respond to a request for prostitutes or escorts by recommending his bar. A screenshot the bar posted to Twitter shows the mechanism in action. It's not just that Siri is mishearing "escorts" as "esports" — that would make sense — it appears to specifically recommend Meltdown as a place to find sex workers.
Hi @AppleSupport , could you please explain why #siri is saying that people can find prostitutes at our place? This is kinda #awkward pic.twitter.com/5f2HqqAt4u
— Meltdown Toronto (@MeltdownToronto) March 13, 2017
It’s unclear if this is a location-specific bug or if Apple has since fixed it, because it appears that right now, Siri will just default to “I don’t know how to answer that” if you ask about prostitutes.
"I see the humor in it," Acyapan said to the Toronto Star. "I always thought of it as a funny anecdote to share with my friends: 'Hey, we run a bar and sometimes I get this kind of call.'"
As digital assistants, AI and recommendation engines become a bigger part of technology, accidental and embarrassing responses are likely to become more and more common. Apple appears to have dealt with this particular glitch, but it's worrying that it could happen in the first place. A mix-up between esports and escorts might be funny, but with users increasingly relying on Siri for important medical information or even mental health advice, consistently directing people to the right place is only going to grow more crucial.