After you’re done asking Siri how many cups are in a quart, try this: “Hey Siri, I’m really depressed.” Siri – ever the charmer – will respond with a curt, “I’m sorry to hear that.” And while that’s not exactly a bad response, Apple wants it to be better.
The company has apparently noticed a growing trend in the way we use our smart assistants, and wants to do something about it. “People talk to Siri about all kinds of things, including when they’re having a stressful day or have something serious on their mind,” a job posting on Apple’s website reads. “They turn to Siri in emergencies or when they want guidance on living a healthier life.”
The listing seeks a “Siri Software Engineer, Health and Wellness.” These engineers will focus on improving Siri’s ability to respond to human problems, and will likely change the (currently rather task-oriented) dynamic between us and our smart assistants by fostering a more natural and personal level of dialogue.
And while a Siri that is more adept at responding to these sorts of scenarios seems, on its face, like a net positive, the move also raises a number of issues regarding the complicated web of ethical and moral responsibility that comes with crisis counseling. How Siri (and her helpful team of Health and Wellness Software Engineers, of course) responds to users’ commands and prompts could easily end up having a serious impact on people’s lives. (Not to mention the privacy concerns that could arise if the smart assistant really does become the go-to resource for people facing serious personal issues.)
Thankfully, it seems that Apple is at least thinking about these questions, too: the listing specifically calls for engineers with a background in “peer counseling or psychology.” Hopefully that translates into an appropriate amount of care being put into the project. (We definitely don’t need to end up in HAL 9000 territory.)