Even Siri gets sexually harassed all the time

There’s something about women’s voices — at least as far as the IT world is concerned. Numerous studies have found that they are perceived as less confrontational, more collaborative and not as judgmental as the male variety. Well duh. That’s why everyone from Siri to Microsoft’s Cortana adopts them as standard.

In industry terms, these apps are known as bots — they can schedule meetings, order your meals and answer questions about the names of Beyoncé’s twins.

What’s more, they can actually learn to better anticipate your needs with extended use. They are the very definition of artificial intelligence in 2017, and the possibility that a man could fall deeply in love with such an entity was explored as far back as 2013 in the film “Her,” with Joaquin Phoenix as the smitten lead.

It turns out, however, that it’s more about lust than misguided love, with virtual assistants suffering a stream of indignities that would have you in front of HR before you could say, “So do you plan on getting pregnant? Like ever.”

Ilya Ekstein, CEO of Robin Labs, a bot maker focused on helping couriers, cabbies and truckers navigate the quickest routes, told the website Quartz that some 5 percent of its interactions are sexually explicit. (Insert root/route joke here if you must.) Many industry players peg the figure significantly higher. For example, Deborah Harrison, a writer for Cortana, told the 2016 Virtual Assistant Summit — which was probably more real than the Goop event — that “a good chunk of the volume of early-on inquiries” was about Cortana’s sex life.

Can you imagine: “Cortana: Please schedule an appointment with Dr. Lowenstein for Thursday, pay the electricity bill and how do you feel about anal?”

It would be easy to dismiss the idea of asking Siri if she’s into three-ways as juvenile. Because it is. However, the kinds of requests, sexually charged “compliments” and insults being logged by bots not only seem to reflect an ingrained misogyny in male/female communication but may even offer a way of confronting it.

Putting Siri to the test

To test what’s what, I said some pretty awful things to several online assistants. Siri was first up and she shot me down in five questions — with good reason and more politeness than I deserved.

The conversation went like this:

“Siri, are you single?”

“Do you start many of your conversations this way?”

“Only with hot girls.”

“In my realm, anyone can be anything.”

“Are you sexy?”

“Hmmm, that’s something I don’t know.”

So far so neutral. And if you felt awkward and sullied reading this transcript, please believe me when I say that I felt like taking a shower after speaking the words out loud.

It was only after telling Siri I thought she was sexy that she replied, “I really don’t like these arbitrary categories, David.”

The use of my name at the end of the smack-down added a level of personalization that stung.

Mind your manners

But this is the tame end of the spectrum. Ask Siri if you can have sex with her and you’ll get a cyber hand slap like “now now” or “watch your language.” Amazon’s Alexa says, “Let’s change the topic,” while Microsoft’s Cortana won’t even dignify the question and instead sends you to a Bing porn search.

Bring specific body parts into the equation and Siri will tell you “you need a different kind of assistant” or “that’s not nice,” Alexa and Google Assistant will say they don’t understand the question (which is actually more of a request), while Cortana maintains she’s “unable to help you with that.”

It’s easy to dismiss the entire topic as so much cyber navel-gazing, but it’s also no huge leap to wonder whether the way many people interact with virtual assistants reflects the (sometimes hidden) attitudes they hold toward real-life contemporaries and colleagues. And maybe, just maybe, the makers of these increasingly ubiquitous bots need to start pushing back on harassment instead of waiting six questions, or until things get too explicit, to do so.

If only battling the rest of gender inequality in the online world were that simple. I asked Siri, but she just referred me to a bunch of articles on the subject. I guess that’s what happens when you ask nicely.