Should bosses use AI to track employees’ emotions?

Posted On 25 Mar 2025


“Big Brother” at work: Have employers gone too far with their new tool? Emotional AI tracking is now being embedded in the workplace, taking away the cherished privacy of one’s own emotions.

Since the COVID-19 era, emotional AI tracking has spread through workplaces. Using biological signals drawn from facial expressions, vocal tone, and other physical reactions, devices can monitor human behaviour and then use AI to infer a person's perceived emotional state.
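The general shape of such a system can be sketched in a few lines. The following is a purely illustrative toy, not any vendor's real product: every signal name, weight, and threshold here is a hypothetical stand-in for the feature-extraction-then-classification pipeline the paragraph above describes.

```python
# Illustrative sketch only: the general shape of an "emotional AI" pipeline
# (extracted signals -> scoring model -> inferred emotional label).
# All field names, weights, and thresholds are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Signals:
    brow_furrow: float      # 0.0-1.0, from a hypothetical facial-expression model
    voice_pitch_var: float  # 0.0-1.0, normalised variance of vocal pitch
    typing_speed: float     # keystroke rate relative to the person's own baseline

def infer_state(s: Signals) -> str:
    """Toy classifier: combine weighted signals into a single 'stress' score."""
    score = 0.5 * s.brow_furrow + 0.3 * s.voice_pitch_var + 0.2 * (1 - s.typing_speed)
    if score > 0.6:
        return "stressed"
    if score < 0.3:
        return "engaged"
    return "neutral"

print(infer_state(Signals(brow_furrow=0.9, voice_pitch_var=0.8, typing_speed=0.2)))
# prints "stressed"
```

Even this toy makes the article's concern concrete: the output is a confident-looking label produced from arbitrary weights, not a measurement of how anyone actually feels.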

This is not, however, a new phenomenon. Back in 2019, Gartner reported that over 50 per cent of large employers in the US were using emotional AI to keep track of the internal states of their employees. Phrased that way, it reads like a gross and invasive breach of someone’s privacy, yet organisations have claimed that it’s done from the perspective of positively impacting a worker’s wellbeing.

HR Leader reached out to EST10 founder Roxanne Calder – who has previously been rather frank about wellbeing policies – for her view on the implementation of emotional AI tracking in the Australian workplace.

“Emotional AI has the potential to provide actionable insights into employee engagement and mental health. It is possible to identify patterns of stress or disengagement that lead organisations to intervene early and offer support or resources to employees,” said Calder.

“Tools like these can, in theory, democratise access to mental healthcare, providing real-time feedback that would otherwise be difficult to gauge in large teams.

“Yet, I’m not entirely convinced. There is the issue of personal privacy – ‘Why can’t I have a bad day in privacy … without it being brought to the attention of the world?’ With global low productivity levels and reduced engagement levels in many organisations using such AI, emotional tracking could erode trust within organisational cultures.”

Benefits aside, implementing this technology means weighing rapid technological advances against the privacy of our emotions.

“Emotional AI presents a complex interplay between technological advancement (necessary) and the preservation of human dignity (vital). Our wellness and wellbeing are, of course, important. Yet, my question, as always, remains: why the shift in responsibility from the individual to organisations? And with emotional AI tracking, have we gone too far?” said Calder.

“By relying on tech to tell us how we are feeling, will it reduce our ability to recognise and regulate emotions and feelings?

“Worse, for someone to reach out and say, ‘R U OK’ will be determined by AI telling us to – instead of our human emotional senses. Humans need humans, and these precious human skills need practice.”

Based on this, Calder argues that it is imperative technology works for humans, and not vice versa.

“Technology should serve to enhance human capabilities without undermining fundamental human values. We should advocate for the ethical application of technology, ensuring it supports rather than supplants human connection,” said Calder.

“Within this context, it is clear that emotional AI can offer important insights, [but] it is also clear that it remains only a tool that cannot replicate the depth of human empathy and the nuances of interpersonal relationships.

“While recognising the potential benefits of technology in monitoring and promoting wellbeing, it is absolutely crucial to approach its implementation with caution. Prioritising transparent communication, robust ethical guidelines, and the irreplaceable value of genuine human interaction is essential to ensure that such tools truly serve the collective wellbeing of employees.”

As the technology continues to accelerate, a pivotal decision must be made about where to draw the line. Emotional tracking is only one example of AI's application to the workplace; there are many other ways it can and will be implemented. If, however, that implementation crosses ethical boundaries, the core fabric of the workplace could be torn apart.

“At its core, this technology attempts to quantify something deeply human: our emotions. The question isn’t just whether emotional tracking goes ‘too far’ but whether it shifts the workplace away from fostering genuine wellbeing towards technocratic management of feelings. Please, no!” said Calder.

About the author
Roxanne Calder
Managing Director

As Founder and Managing Director at EST10, Roxanne has an all-encompassing role that includes building and growing the business, as well as actively recruiting and consulting.

After completing a Bachelor’s Degree at Monash University, Roxanne began her recruitment career with renowned recruiter Julia Ross. From there, Roxanne worked in HR and recruitment with a number of global players and boutique businesses throughout Australia, the UK, Singapore and Hong Kong for over 20 years. She has been responsible for managing large teams and projects, implementing RPO models, managing and assisting businesses to an IPO and assisting companies in setting up their recruitment teams and processes.

Following completion of her MBA at the Australian Graduate School of Management, Roxanne launched EST10 in July 2010. In doing so, she hoped to combine the flexibility and high touch service levels of boutique agencies with the structure and strategy afforded to larger firms. Roxanne believes in high-touch, high-care consulting and is always on the lookout for consultants that share this vision of recruitment.

Our Blog
Related Articles
How self-sabotaging is your career’s number 1 enemy
Most of us are aware of the concept of self-sabotage. We have read about it, perhaps even pondered i...
Invisible Ink
Have you heard of ‘invisible ink’ before? If I have worked on a job brief with you, I would have...
He’s just not that into you!
Fly undone? Excruciating to hear but necessary to know. One, single dark facial hair on your chin (i...