Here’s how Cortana responds to sexual harassment

Laurent Giret


Digital assistants may be one of the most interesting technology breakthroughs in recent computing history: not only are they easy to use (you just have to speak to them), but asking them to do something for you is often faster and more efficient than doing it yourself. And last but not least, they can also be entertaining: did you know that Cortana can tell you some pretty good jokes?

While it’s perfectly fine to play with your digital companion and test the limits of this impressive artificial intelligence technology, some users are apparently not afraid to ask inappropriate questions. According to Microsoft, soon after the company introduced Cortana on Windows Phone in 2014, the digital assistant with a female voice received a lot of questions about her sex life.

That should be expected given that Cortana is, like Siri and other AIs, humanized with a female name and a somewhat friendly personality. But while such personas can certainly help to build trust with users, they can also encourage some people to be less than kind.

As CNN Money reports, Cortana is dealing with this problem directly, as Deborah Harrison, one of the eight US writers on Microsoft’s Cortana team, explained at the Re-Work Virtual Assistant Summit in San Francisco last week:

“If you say things that are particularly a**holeish to Cortana, she will get mad. That’s not the kind of interaction we want to encourage. We wanted to be very careful that she didn’t feel subservient in any way … or that we would set up a dynamic we didn’t want to perpetuate socially.”

So, be warned, dear readers: the writers at Microsoft have prepared Cortana to respond to sexual harassment and vulgarity appropriately. In fact, Harrison told CNN that the Cortana US team learned a good deal about how to handle harassment by talking to real personal assistants — which makes sense, given that these are the people who actually deal with this problem in real life.

It certainly took considerable work to create a believable personality for Cortana and to anticipate the many kinds of inquiries the digital assistant may face in different languages. But we’re happy to learn that Microsoft is working to design the best digital assistant while also taking issues like sexual harassment seriously.