It recently struck me that when people encourage students to pursue STEM or business careers, they may be underestimating the importance of daily human interactions and taking them for granted.
Last month, there was an article in the Washington Post suggesting that people with liberal arts backgrounds will be hot commodities for technology companies. The value these people bring is their ability to help technology simulate human interactions.
Personal assistants like Siri, Cortana and Alexa are increasingly becoming a focus of development. The personality development teams write backstories for the assistants and evaluate whether flaws in speech patterns and syntax make them more relatable or too informal for their purpose.
The personalities of these artificial intelligences can’t be too perfect, but they also can’t be so flawed that you can do things like trick them into cursing. (Of course, people have been tricking kids’ toys into cursing for years, so nothing is foolproof. NSFW)
There are thousands of subtle decisions that go into shaping the “personalities” of these assistants.
At a recent meeting of Microsoft Cortana’s six-person writing team — which includes a poet, a novelist, a playwright and a former TV writer — the group debated how to answer political questions.
To field increasingly common questions about whether Cortana is a fan of Hillary Clinton’s, for instance, or Donald Trump’s, the team dug into the backstory to find an answer that felt “authentic.” The response they developed reflects Cortana’s standing as a “citizen of the Internet,” aware of both good and bad information about the candidates, said Deborah Harrison, senior writer for Cortana, and a movie review blogger on the side. So Cortana says that all politicians are heroes and villains. She declines to say she favors a specific candidate.
The group, which meets every morning at Microsoft’s offices in Redmond, Wash., also brainstorms Cortana’s responses to new issues. Some members who are shaping Cortana’s personality for European and Canadian markets dial in.
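To make the kind of decision the writing team describes more concrete, here is a hypothetical minimal sketch of a rule-based deflection for political questions. Everything in it (the `POLITICAL_TERMS` list, the `respond` function, the fallback line) is invented for illustration; it is not how Cortana actually works, only the shape of the guardrail the article describes.

```python
# Hypothetical sketch of a "personality guardrail" for political questions.
# The term list, function name, and canned replies are all invented for
# illustration; real assistants use far more sophisticated intent detection.

POLITICAL_TERMS = {"hillary clinton", "donald trump", "election", "politician"}

def respond(question: str) -> str:
    """Return a neutral, in-character answer when a question is political."""
    lowered = question.lower()
    if any(term in lowered for term in POLITICAL_TERMS):
        # Deflect without endorsing anyone, echoing the "heroes and villains"
        # stance the writing team settled on.
        return "All politicians are heroes and villains. I don't pick favorites."
    # Generic fallback for everything else.
    return "I'm not sure, but I can look that up for you."

print(respond("Are you a fan of Hillary Clinton?"))
```

Even in this toy version, the interesting work isn’t the keyword check; it’s deciding what the canned reply should say so that it feels “authentic” rather than evasive.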
Given this context, you can see why so much effort is invested in shaping the personalities of these virtual assistants. Whether or not you use them represents potential revenue, even if only in the form of Apple’s, Amazon’s or Microsoft’s ability to sell data about your habits and interests to others.
Whether you use an assistant matters for more than just the sale of aggregated data, though. The uses mentioned in the article include serving as a life coach for weight loss, reminding people to take medicine, collecting medical data, calming anxieties, polling employees and arranging meetings.
The value of these applications/programs/whatever is as much about the user experience as it is about accurately identifying the closest Thai restaurant to your location.
By and large, if you notice something about the user experience, if it isn’t seamless, the designers have probably done something wrong. This, incidentally, is also one of the core precepts of design and technical execution for live theater.
I imagine this contributes to the general sense that STEM and business careers are more worthwhile than arts-oriented careers. So much about STEM and business endeavors is quantifiable: you write X lines of code, generate X dollars in billing, run X experiments in a day.
You wrote jokes to give Cortana a sense of humor and suggested adjustments so people didn’t anthropomorphize the program as a subservient female? More likely than not, people would slip into the “they pay you to have fun all day, I could do that” mindset.
Except making those decisions and creating plausible results isn’t as easy as it looks. While “selling one’s humanity” is a common accusation directed at movie villains, in a very real sense only those who have invested time in understanding humanity are able to generate simulacra of humanity to sell as a commodity.