The era of brain surveillance has begun. Advances in neuroscience and artificial intelligence are converging to give us an affordable and soon-to-be widely available generation of consumer neurotech devices—a catchall term for gadgets that, with the help of dry electrodes, connect human brains to computers and the ever-more-sophisticated algorithms that analyze the brain-wave data. Neuroscientists wrote off earlier iterations of consumer neurotech devices as little better than toys. But as both the hardware and the software have improved, neurotechnology has become more accurate and tougher to dismiss. Today, the global market for neurotech is growing at a compound annual rate of 12% and is expected to reach $21 billion by 2026. This is not a fad. It’s a new way of living and thinking about ourselves and our well-being—personally and professionally.
Brain sensors have a rapidly expanding range of personal applications. Using a simple, wearable device that measures electrical activity in the brain or at muscle junctions throughout the body, you can now get graphical, real-time displays of your brain activity and bioelectric changes in your muscles. You can use those displays to “see” your emotions, your arousal, and your alertness. You can learn if you’re wired to be conservative or liberal; whether your insomnia is as bad as you think; whether you’re in love or just in lust. You can track changes in neurological function over time, such as the slowing down of activity in certain brain regions associated with the onset of conditions such as Alzheimer’s disease, schizophrenia, and dementia. If you have epilepsy, you can get advance warning of a seizure so that you can prepare yourself for it. If you’re a football player, you’ll soon be able to wear a smart helmet that can diagnose concussions immediately after they occur.
Neurotech devices also have a rapidly expanding range of commercial and managerial applications. Companies across the globe have started to integrate neural interfaces into watches, headphones, earbuds, hard hats, caps, and VR headsets for use in the workplace to monitor fatigue, track attention, boost productivity, enhance safety, decrease stress, and create a more responsive working environment. This is new and uncharted territory, full of promise and peril for employers and employees alike. Neurotech devices offer employers ways to improve the well-being and productivity of their employees and thus create healthier, more successful organizations. But they also give employers access to incidental information that can be used to discriminate against employees—for example, information about early cognitive decline. And if employers fail to be transparent about what data they’re collecting and why, the devices can undermine employee trust and morale.
Advances in neurotechnology certainly raise significant privacy concerns for employees. Will they know what brain data is being collected or how their employer will use it? Whatever is gained in workplace safety or productivity could be offset by the loss of employee trust, an essential ingredient of corporate success. Employees in high-trust organizations are more productive, have more energy, collaborate better, and are more loyal; employees in low-trust companies feel disempowered and become disengaged. And disengagement matters: It’s recently been estimated that corporations in the United States lose $450 billion to $550 billion each year because of it.
The dangers are real. But in some situations—for example, ensuring that the driver of a 40-ton truck is not falling asleep at the wheel—brain monitoring at work seems like a very good idea. It’s hard to argue that a driver’s right to mental privacy trumps public safety.
To navigate this territory successfully, business leaders need guidance. I’ve been studying this subject for years. I’m a professor of law and philosophy at Duke University, where I specialize in the legal and ethical issues of emerging technologies, with a particular focus on neurotechnology. I’ve also served as president of the International Neuroethics Society and cochair of the Neuroethics Working Group for the NIH Brain Initiative, and I currently serve as a neuroethicist for the National Academies of Sciences, Engineering, and Medicine. In this article, I’ll provide an overview of the neurotechnology landscape and offer some thoughts on how to balance the risks and benefits of using neurotech devices in the workplace.
It’s early days yet, but tens of thousands of workers are already using early-stage devices, and Big Tech is investing heavily to replace peripherals such as the computer mouse and the keyboard with neural interfaces integrated into headsets, earbuds, and wrist-worn devices. So now is the time to start thinking in practical terms about how best to engage with the world that’s opening before us, in ways that thoughtfully consider the interests of employees, employers, and society.
THE LAY OF THE LAND
Let’s start by taking stock of three ways in which neurotechnology is already being used in the workplace: to track fatigue, to monitor attention and focus, and to adapt the work environment to workers’ brains.
Tracking fatigue
In 2019, Tim Ekert, the CEO of SmartCap, made a bold proclamation. He announced that his company’s flagship tool—the LifeBand, a fatigue-tracking headband with embedded EEG sensors that can be worn alone or integrated into a hard hat or cap—would “transform the American trucking industry.”
The LifeBand gathers brain-wave data and processes it through SmartCap’s LifeApp, which uses proprietary algorithms to assess wearers’ fatigue level on a scale from 1 (hyperalert) to 5 (involuntary sleep). When the system detects that a worker is becoming dangerously drowsy, it sends an early warning to both the employee and the manager.
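SmartCap’s scoring algorithms are proprietary, but the alerting logic described above can be sketched in a few lines: per-epoch fatigue scores on the 1 (hyperalert) to 5 (involuntary sleep) scale are smoothed, and a warning fires once the smoothed score crosses a cutoff. Everything here—the function names, the smoothing window, and the threshold—is a hypothetical illustration, not SmartCap’s implementation.

```python
from collections import deque

ALERT_THRESHOLD = 4  # hypothetical cutoff on the 1 (hyperalert) to 5 (involuntary sleep) scale


def smoothed_fatigue(scores, window=5):
    """Yield a running average over the last `window` per-epoch fatigue scores."""
    recent = deque(maxlen=window)
    for s in scores:
        recent.append(s)
        yield sum(recent) / len(recent)


def check_alerts(scores, threshold=ALERT_THRESHOLD):
    """Return the epoch indices at which an early warning would be sent."""
    return [i for i, f in enumerate(smoothed_fatigue(scores)) if f >= threshold]


# A drowsiness trend: the alert fires once the smoothed score crosses the cutoff.
readings = [2, 2, 3, 4, 5, 5, 5]
print(check_alerts(readings))  # → [6]
```

Smoothing matters here: a single noisy epoch should not trigger a warning, but a sustained drift toward drowsiness should, which is why the alert lands only after several high readings.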
More than 5,000 companies worldwide, in industries such as mining, construction, trucking, and aviation, already use SmartCap to ensure that their employees are wide awake. SmartCap and similar EEG systems can be used in all sorts of employment settings where fatigue negatively affects safety—factory floors, air-traffic-control towers, operating rooms, laboratories, and so on. And safety isn’t the only concern: Fatigue also reduces motivation, concentration, and coordination. It slows reaction times, undermines judgment, and impairs workers’ ability to carry out even the simplest of mental and physical tasks. It causes some $136 billion in productivity losses a year.
Fatigue also levies catastrophic costs on society. In Chicago, a transit authority train jumped its tracks entering a station at O’Hare International Airport after its driver fell asleep. The train careened onto an escalator, injuring 32 people. In New York, a sleep-deprived engineer operating a commuter train from Poughkeepsie to Grand Central Terminal in Manhattan took a 30-mph curve at 82 mph and derailed, killing four people, injuring 70, and causing millions of dollars in damage. In another incident, a freight train carrying phosphate crashed head-on into a coal train; 32 cars left the tracks, spilling 1,346 tons of coal, 1,150 tons of phosphate, 7,400 gallons of diesel fuel, and 77 gallons of battery acid. Aviation accidents are much less common; however, during the past few decades at least 16 major plane crashes have been blamed on pilot fatigue.
As neurotechnology and the algorithms for decoding brain activity continue to improve, neural interfaces will become the gold standard in monitoring fatigue in the workplace. Not just employers but society as a whole may soon decide that the gains in safety and productivity are well worth the costs in employee privacy. But how much we ultimately gain from workplace brain wearables depends largely on how employers use the technology. For example, will employees receive real-time feedback from the devices so that they can act on it themselves, or will managers directly monitor employee fatigue? If the latter, will they use that information to improve workplace conditions or to justify disciplinary actions, pay cuts, and terminations? The answers to those questions will shape the future of brain-wave monitoring.
Given the lack of societal norms and laws regarding tracking brain activity in general, for now companies are simply creating their own rules about fatigue monitoring. Some use SmartCap and similar technologies to optimize employees’ working conditions; others are likely to use the technologies punitively, because that’s typically how employers approach workplace surveillance. A recent study of companies that track how their employees use their computers found that 26% of employers had fired workers for misusing the internet, and 25% had fired them for misusing email. It’s not hard to imagine what might happen when firms are able to regularly monitor not just employees’ computers but also their brains.
Monitoring attention and focus
Many of us lack the ability to focus for long stretches at a time. But Olivier Oullier, a former president of the bioinformatics company Emotiv, believes that neurotechnology can help.
A few years ago, at the Fortune Global Tech Forum, Oullier unveiled the MN8, Emotiv’s enterprise solution for attention management. The MN8 looks like a set of standard earbuds (and can in fact be used to listen to music or participate in conference calls). But with just two electrodes, one in each ear, the device allows employers to monitor employees’ stress and attention levels in real time.
Emotiv teamed up with the German software company SAP to create Focus UX, a system that monitors employees’ brain states and in real time shares personalized feedback with them and their managers. SAP predicts that this will create a more responsive workplace environment in which employees focus on what they are best “able to handle at that moment.”
To illustrate how the system works, Oullier described a hypothetical situation. A data scientist wearing the MN8 has spent several hours videoconferencing with her team and is now reviewing code. The system has used her alpha brain-wave activity to index the attentive state in her brain. The proprietary algorithm sees that her attention is flagging, so it sends a message to her laptop: “Christina, it’s time for a break. Do you want to take a short walk or do a five-minute guided meditation to reset your focus?”
Focus UX data can be used to evaluate employees’ cognitive loads, compare individuals across the workforce, and make decisions about how to optimize the workforce for productivity. It can also help inform decisions about promotion, retention, and firing. Other companies offer similar technology. For example, Lockheed Martin’s CogC2 (short for Cognitive Command and Control) provides firms with real-time neurophysiological assessments of employees’ workloads so that they can “optimize their workforce for increased productivity and improved employee satisfaction.” It’s now even possible to use EEG to classify the type of activity an individual is engaged in, according to research funded by the Bavarian State Ministry of Education. As pattern classification of brain-wave data becomes more sophisticated, employers will be able to tell not just whether you are alert or your mind is wandering but also whether you are surfing social media or writing code.
Employers might soon even be able to nudge employees back to work when their minds start to wander. The MIT Media Lab has developed a system called AttentivU, which measures a person’s engagement via EEG sensors embedded in a pair of glasses and a wearable scarf. The device provides haptic feedback (usually a form of vibration) whenever the wearer’s engagement declines. Researchers found that people who received haptic feedback logged higher alertness scores than those who didn’t. While the Media Lab is excited about the results, it acknowledges the risk of misuse, saying it hopes “no one will be forced to use this system, whether in work or school settings.”
Some employees may volunteer to use such systems, which have the potential to improve their productivity while giving them control over their brain-activity data. This could allow them to reap the benefits of better time management without any sacrifice of autonomy. As with other neurofeedback approaches, self-monitoring for productivity could also help employees establish better work habits as they learn when and why they get distracted.
The problem is that some organizations may be tempted to impose brain-productivity technology on workers and make attention the currency of productivity measurement. A recent Brookings Institution report found that some companies are now using webcams to track eye movements, body position, and facial expressions as measures of attentiveness to tasks, and are reprimanding employees for inattentiveness on the basis of that data. While that kind of monitoring has become increasingly common, especially with the shift to remote work, using attentiveness as a yardstick for employee success may seriously backfire for employers. As Albert Einstein and Isaac Newton both acknowledged, creative ideas depend as much on minds wandering as on staying on task. And research across 900 Boston Consulting Group teams in 30 countries has shown that mental downtime increases alertness, improves creativity, and leads to greater output quality. When workers know their attention is being monitored, they may try to minimize mental downtime—forcing their attention and focus back to the task at hand—out of fear of appearing unproductive.
It’s not just employees’ productivity that can suffer. It’s their health too. When employees lack mental downtime, they often experience serious job strain, which has been strongly linked to a variety of health problems: depression and anxiety, ulcers, cardiovascular trouble, and even suicidal thoughts.
Workplace brain surveillance to monitor levels of attention, stress, and other cognitive and emotional functions has stark and significant downsides. It has potential benefits, too, including enhanced employee productivity. But at this point these benefits are purely speculative, so for now I’d recommend that employers steer clear of engaging in this type of brain surveillance.
Creating more-adaptive work environments
As neurotechnology, AI, and robotics continue to advance, we can expect a future in which neural-interface devices that read brain activity are used to make the workplace more adaptive. Penn State researchers, for example, are experimenting with EEG headsets for employees that provide input to robots, which then calibrate their pace of work to the employees’ state of mind. In one experiment, participants wore EEG headsets that monitored their cognitive loads and detected signs of stress. Their robotic coworkers reacted to the data by slowing down, speeding up, or keeping a steady pace, giving the workers just the right amount of room to maximize their productivity without stressing them out.
Other researchers have found that EEG sensors could help monitor and address the greater cognitive load that assembly workers bear as automation becomes the norm in industrial settings and they are tasked with increasingly complex assembly procedures. In one recent study, researchers in Belgium had participants perform assembly tasks in a simulated factory setting while being subjected to varying levels of cognitive load (low, high, and overload). The researchers found that by tracking EEG activity and eye movements, they could differentiate between a high cognitive load and cognitive overload, which can produce errors, safety hazards, and detrimental health effects on workers. Smart manufacturing systems of the future could automatically adapt production levels to allow for higher cognitive loads while avoiding overload, ushering in a new era of “cognitive ergonomics.”
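The Belgian study’s models are not public, but the general idea of combining EEG and eye-movement features to separate high load from overload can be illustrated with a toy rule-based classifier. The two features (a frontal theta/alpha power ratio and blink rate) are commonly reported workload markers, but every cutoff below is an invented assumption, not the researchers’ method.

```python
# Hypothetical thresholds; the study's actual models are not public.
def classify_load(theta_alpha_ratio, blink_rate_hz):
    """Label cognitive load from two illustrative features.

    A rising frontal theta/alpha EEG power ratio and a suppressed blink
    rate are commonly reported markers of growing mental workload; the
    cutoffs below are invented for this sketch.
    """
    if theta_alpha_ratio < 1.0:
        return "low"
    if theta_alpha_ratio < 1.8 and blink_rate_hz > 0.15:
        return "high"
    return "overload"


print(classify_load(1.5, 0.2))   # moderate ratio, normal blinking → "high"
print(classify_load(2.2, 0.05))  # high ratio, suppressed blinking → "overload"
```

The point of the second feature is the study’s key finding: EEG alone may not separate “working hard” from “overloaded,” but adding eye-movement data (here, suppressed blinking) helps draw that line.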
Some companies are already implementing changes to the workplace in accordance with feedback from employees’ brains. Microsoft’s Human Factors team, for example, has helped the company adapt its office environments and products to be more responsive to workers’ brain health and functioning. Company researchers asked 13 teams of two employees to complete similar tasks together in person and remotely and found that remote collaboration led to greater stress levels in the brain. A second study of employees’ brains in back-to-back video meetings versus in-person meetings found that the former were more cognitively stressful.
In response, Microsoft introduced Together mode, a feature in Teams that gives meeting participants a shared background to simulate a shared physical space while collaborating. Initial results are promising: Brain activity of participants in Together mode reflected a lower cognitive burden compared with that of participants using the traditional grid view of online meetings.
The Human Factors team also discovered a simple yet powerful way to address meeting fatigue. By monitoring the brain-wave activity of employees who volunteered to participate, the team learned that people who took short breaks between meetings had lower levels of stress compared with people attending back-to-back meetings. Providing guided brain-wave-based meditation during the breaks also improved well-being and the ability to focus in subsequent meetings.
Cognitive ergonomics—making the workplace safer, more responsive, and more adaptive to employees’ well-being—represents one of the most promising new applications of neurotechnology. In my view, companies should embrace opportunities to experiment with it.
USING BRAIN WEARABLES RESPONSIBLY
To reap the maximum benefits from brain wearables in the workplace while minimizing the risks, firms must adopt policies and practices that specify how and when the devices are used. To start, this will require action in five key areas.
1. Employee rights
Employees have a right to mental privacy. Governments should codify that as part of the international human right to privacy. A right to mental privacy would place the burden on corporations to identify a specific use for brain wearables that is limited in scope to legitimate purposes, such as monitoring fatigue in commercial drivers or tracking attention in air traffic controllers. It would also prohibit unauthorized access to other brain-wave data that may be collected incidentally during legal monitoring. Even then, companies should be prohibited from using data for any purpose other than the one it was originally gathered for.
2. Privacy laws and regulation
Employers should stay abreast of biometrics privacy laws and implement policies consistent with their requirements. The collection of brain-wave data is or soon will be subject to stringent privacy laws and regulatory requirements in some U.S. jurisdictions. The failure to obtain prior written consent and provide adequate disclosure to employees can have costly financial and reputational implications for employers. In October 2022, for example, employees in a class-action lawsuit that included nearly 45,000 people were awarded $228 million in a jury verdict against BNSF Railway, one of the largest freight railroad networks in North America, because it had collected and stored fingerprint data in violation of the Illinois Biometric Information Privacy Act. Given the unique liability risks associated with the collection of biometric brain data, companies planning to introduce neurotechnology in the workplace should carefully consider laws enacted in various U.S. states and in other countries, including the General Data Protection Regulation in Europe.
3. Voluntary self-monitoring
When appropriate, employers should offer employees the opportunity to use brain wearables at work to monitor their own levels of stress, waning attention, or increasing cognitive load. Companies may also choose to offer guided meditation and other neurofeedback tools to employees who would like to improve their well-being. If employees elect to use those tools, firms should not access or mine the neural data collected, unless employees explicitly consent to its use for a specified purpose. Employees must have a right to obtain a copy of any neural data collected about them, along with any interpretations drawn from it. To use these tools without consent constitutes a breach of trust, undermining the value they would otherwise create. Giving employees the right to audit their own brain data can help build trust and ensure that only relevant and legitimate brain data is collected. It also provides a check on the quality of the data being collected and an opportunity for employees to challenge invalid interpretations.
4. Transparency
Regardless of biometric-data-collection laws or other regulations, employers should be transparent with employees about what data they’re collecting from brain-wearable devices and how they intend to use that information. They should specify the purpose for which brain data is being collected and what actions they will take in response to insights drawn from it. They should also collect data from brain wearables only when the employee is working. If, for example, an employer issues headphones embedded with EEG sensors, and the employee is permitted to use those headphones not just for work but also for leisure activities, the employer should not collect neural data during “off” hours.
5. Storing brain data
Employers should adopt best practices for data minimization and, whenever possible, store brain data on employees’ own devices rather than on the servers of device manufacturers, software companies, or employers. This is critical. People associate their sense of self most closely with the information in their own minds, which makes neural data particularly sensitive. As machine-learning algorithms improve, so will the ability to mine and interpret neural data, enabling firms to learn far more about what employees are feeling or thinking and about cognitive or affective changes to their brains over time. Employers should adopt security safeguards against the risk of unauthorized access, destruction, disclosure, or use of neural data. For example, companies should make sure that brain data is “overwritten” once its limited purpose has been served.
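As a concrete illustration of the minimization and overwriting practices described above, here is a minimal sketch of purpose-limited, device-local storage in which records past their retention window are zeroed out before being discarded. The class name, time-to-live policy, and API are all invented for illustration, not drawn from any real product.

```python
import time


class NeuralDataStore:
    """Sketch of purpose-limited local storage for neural data.

    Records are kept only for a declared purpose and a bounded
    time-to-live; expired records are overwritten with zeros
    before being dropped.
    """

    def __init__(self, purpose, ttl_seconds):
        self.purpose = purpose
        self.ttl = ttl_seconds
        self._records = []  # list of (timestamp, mutable buffer) pairs

    def add(self, sample: bytes):
        """Store one raw EEG sample with its collection time."""
        self._records.append((time.time(), bytearray(sample)))

    def purge_expired(self, now=None):
        """Zero out and drop records older than the TTL; return the count kept."""
        now = time.time() if now is None else now
        kept = []
        for ts, buf in self._records:
            if now - ts >= self.ttl:
                for i in range(len(buf)):  # overwrite before discarding
                    buf[i] = 0
            else:
                kept.append((ts, buf))
        self._records = kept
        return len(self._records)
```

Tying the store to a single declared purpose and a short TTL operationalizes two of the article’s recommendations at once: data cannot silently outlive the reason it was collected, and what expires is destroyed rather than merely dereferenced.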
NEURAL INTERFACES WILL increasingly compete with existing peripheral devices to become one of the primary ways people interact with technology, offering firms powerful new insights into employees and their well-being, and revealing ways to make workplaces safer and more productive. To realize those benefits, employers must understand the unique risks this technology poses to mental privacy and adopt clear workplace policies that empower employees and earn the trust of the future workforce.
HBR Reprint S23022.
NITA A. FARAHANY is the Robinson O. Everett Distinguished Professor of Law and Philosophy at Duke University, and a scholar on the ethical, legal, and social implications of emerging technologies. She is the author of The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology (St. Martin’s Press, 2023), from which this article is adapted.