INT. The HR office of a large enterprise.
An HR manager greets Tom, a company executive, and both sit down.
HR manager: Hi Tom. We had our HR bot ask you to come in today because, quite frankly, we’re a little concerned about you.
Tom: What do you mean?
HR manager: Well, the latest sentiment analysis of your team’s collaboration platform conversations indicates that you’ve been feeling excessive stress lately. Plus your most recent mobile app usage patterns reflect an increased lack of focus. And data from your company smartphone shows you’ve been tapping the screen 34 percent harder on average over the past two weeks, a clear sign of frustration or even anger. Is everything OK?
Thanks to personal devices, data analytics, and the emergence of a field called computational psychology, this scene may become reality in the near future. Over at Harvard Business Review, author Michael Schrage explores how our devices and digital interactions produce data that could be used by enterprises as early-warning signals that an employee is struggling with depression or behavioural issues.
“For example, looking for correlations between managerial moods and sentiment analysis of, say, Slack chats could prove extraordinarily helpful and healthful,” Schrage writes. “What kinds of chats evoke disproportionate anxiety and stress? Conversely, what managerial mood states might signal or predict unhealthy online interaction?”
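The kind of sentiment analysis Schrage describes can be as simple as counting emotionally loaded words in chat messages. Here is a minimal, purely illustrative sketch: the word lists, messages, and thresholds are invented for this example and bear no resemblance to any real monitoring product.

```python
# Toy lexicon-based sentiment scoring of chat messages.
# Word lists and sample chats are illustrative assumptions, not real data.
NEGATIVE = {"urgent", "asap", "problem", "blocked", "frustrated", "late"}
POSITIVE = {"thanks", "great", "done", "good", "appreciate", "nice"}

def sentiment_score(message: str) -> int:
    """Return (positive word hits - negative word hits) for one message."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

def team_sentiment(messages: list[str]) -> float:
    """Average per-message score; persistently negative values would be
    the kind of 'early-warning signal' the article describes."""
    if not messages:
        return 0.0
    return sum(sentiment_score(m) for m in messages) / len(messages)

chats = [
    "Thanks, great work on the release!",
    "This is urgent, we are blocked and the build is late.",
]
print(team_sentiment(chats))  # → -0.5
```

Real systems would use trained models rather than fixed word lists, but even this sketch shows how easily routine chat text becomes a numeric "mood" signal once an employer decides to compute one.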
Dr. John Torous, co-director of the Digital Psychiatry Program at BIDMC/Harvard Medical School, tells Schrage that “the potential to use this data to promote workplace wellness is unparalleled.”
“Such data may even help predict future mood states of teams and individuals — for example, proactively signalling when an employee may benefit from a day off to care for their mental health,” Torous says. “Of course, such apps and wearables need to earn users’ trust and protect sensitive information — without trust there is no health or wellness.”
And that’s a big “of course.” There’s no doubt that people are much more willing to share information than ever before. But the idea of employers using data from personal devices, apps, and digital team conversations to draw inferences about a person’s mental or emotional health sure feels like it’s crossing a line, even if it’s intended for good reasons.
Acknowledging that “privacy concerns are unavoidable,” Schrage adds:
Organisations will have to become transparent about which behavioural data they won’t monitor and manage as well as what they will aggregate and analyse. Employees must provide informed consent on how mental health analytics will be used to evaluate fitness, performance, and promotion.
That last part raises an interesting question for employees: Will providing informed consent work for or against them? Conversely, will denying informed consent put an employee at a disadvantage?
Here’s another thought that might cross an employee’s mind: “If the company can collect and analyse behavioural data from me, why can’t I see the same data from the CEO or my department head? That way I can know if they have any mental health problems that could affect me. It only seems fair.”
Schrage, a research fellow at MIT Sloan School’s Center for Digital Business, concludes that “there is no avoiding the data-driven reality that, as cognitive, emotional, and affective variables increasingly determine workplace performance and outcomes, expectations around privacy will shift.”
He’s probably right, given how much privacy expectations already have changed. Still, there’s a difference between posting drunk selfies on Instagram and having your employer constantly analysing your mental state. We’ll see how things play out.
Are you comfortable with the idea of employers using digital and computational psychology to monitor and evaluate employees? Let us know in the comments section below.