Cortana combats harassment

editorial cartoon by Jacob Bogdanoff and Des Delgadillo
People should not stand for sexual harassment. Now neither will Microsoft’s personal digital assistant Cortana.

A team of writers has given “her” the ability to respond to and shut down inappropriate or suggestive comments from users.

Deborah Harrison, one of the eight writers in the U.S. Cortana division at Microsoft, told CNN that when Cortana was first launched in 2014, a majority of the early queries to the feminine-sounding system were about her sex life.

Unlike other virtual assistants such as Siri or Google Now, Microsoft’s Cortana was given a more human personality because the program’s voice and persona were based on a popular female video game character of the same name.

Cortana from the Halo series is portrayed as an attractive female hologram and has often been named one of the “top sexiest female video game characters” by leading gaming websites, including IGN, MSN and GameDaily.

By having Cortana respond angrily to such inappropriate comments, Microsoft wanted to make sure its program accurately reflected real human assistants.

According to Harrison, the team spoke with people who hold jobs as assistants and regularly face problems with sexual harassment firsthand. Some, however, believe that Cortana’s ability to react negatively to harassing comments makes the virtual assistant too “real” and that she should react however the user requests.

We don’t believe there is anything wrong with making a virtual assistant more human by adding such realistic human responses, which are appropriate responses to comments that should never be made in the first place.

Many of these virtual assistants are programmed to have a woman’s voice.

Having the voice default to a woman’s is a problem in itself, as it perpetuates the stereotype that women are submissive servants, meant to bend to the user’s requests.

However, by creating a way for its own virtual assistant to combat sexual harassment, Microsoft is taking a step in making sure this logic is not normalized.

“If you say things that are particularly a-holeish to Cortana, she will get mad,” Harrison told CNN. “That’s not the kind of interaction we want to encourage.”

If operating systems discourage these types of comments, users will hopefully get the hint and refrain from this type of behavior toward people in the real world as well.

Hopefully other companies will follow suit and implement similar features in their virtual assistant programs.

Kudos to Cortana and Microsoft for taking this step to combat the normalization of sexual harassment by letting users know that it is never OK to speak to women – even virtual women – that way.
