Get ready for a clash between health measures and privacy rights when employees begin returning to a workplace where their every single move could be tracked.
Temperature scans, immunity badges, a “bio passport” and even smartphone apps that record every step you take are just some of the tools technology companies are gearing up to offer.
With the stated goal of protecting public health and limiting further coronavirus outbreaks, software developers and employers are looking at dramatically ramping up workplace surveillance.
That will raise a whole host of issues, legislators and experts said: How accurate is the data? Could inaccurate data result in discrimination? Could the data be sold? Given to law enforcement? Could you be blackballed from work? Or blackmailed by hackers who have stolen your data from your employer?
Will the public adjust to surveillance and dismiss privacy concerns, much in the way it accepted new intrusions for national security after the Sept. 11, 2001, terrorist attacks?
The issue is emerging so quickly that there are no comprehensive federal standards to oversee it. But New York State lawmakers say they’re preparing legislation to begin to tackle it.
“I’ve been thinking about it since the day we shut down,” Sen. Kevin Thomas (D-Levittown) said. “I started seeing a lot of companies foaming at the mouth, thinking how they’re going to make the next buck with the next app that says ‘I’m negative’ or ‘I have the antibodies.’ Even before COVID-19 hit, people have been giving up their privacy and now we have a real health emergency … So we need to move expeditiously on this before it gets out of control.”
Thomas is finalizing a bill that makes a start. He said it would mandate that all health data a company collects on a worker be deleted after 14 days. That would reduce the possibility of it being sold and/or hacked, he said.
“This data shouldn’t be held on a server for long periods of time,” he said.
Some analysts said it’s all but inevitable that surveillance will be taken to a level never seen before.
“In my opinion, we are about to begin rewriting the rules of social behavior and privacy,” said Michael Balboni, a former state senator and former state director of homeland security. He now runs the consulting group RedLand Strategies, which is working with companies on an array of health and tracing issues.
“I think tolerance for invasion of privacy is different now than even in January when we first heard about coronavirus,” said Alyson Mathews, a Melville attorney who chairs the Labor and Employment Law section of the New York State Bar Association. “If you had tried in December or even early March” to raise monitoring and tracing proposals, “there would have been more complaints than now.”
Some of the new workplace measures being discussed won’t touch on privacy issues — such as eliminating or restricting cafeterias, installing touchless vending machines and signage in elevators, staggering work shifts, cleaning offices thoroughly every night and promoting options for “work anywhere, anytime” as long as goals are met.
But along with smartphone tracking, other proposals do raise privacy issues, such as segregating workers on different floors or sections of a building based on health and testing status.
“I think there is no doubt that workplace behavior will be changing permanently in a lot of ways,” said Janet Lenaghan, a dean and business management professor at Hofstra University. She expects a “more hybrid working arrangement.”
The pressure to deploy monitoring devices could depend on the nature of the business and how flexible employers can be about work arrangements.
“The more density you have, the more employees you bring back, the more you are going to force those measures,” Lenaghan said about surveillance devices.
“The technology already is here,” but there are “accuracy issues,” said Richard Chan, a business professor at Stony Brook University, about monitoring employee movements. Because of that, he said, “it still may be too early for privacy to be a major issue.”
For example, some Bluetooth-based devices can’t pinpoint location more precisely than about 5 feet, while social distancing practices require 6 feet of distance. So if the signal beacons aren’t set up properly, devices could produce a lot of inaccurate readings, Chan said.
But privacy is a broader issue, said Ifeoma Ajunwa, a Cornell University professor who focuses on organizational behavior, labor law and the adoption of new technologies, among other topics.
Monitoring could clash with the federal health privacy law known as HIPAA and with the Americans with Disabilities Act.
Further, “anything that’s electronically based or app-based … is going to leave a trail of digital data that can be easily hacked unless you take measures that are very extensive and very expensive,” Ajunwa said.
“Shadowy data brokers could hack your information. They can try to sell it to insurance companies, mortgage companies, life insurance companies,” she said. “Then, there’s also the issue of blackmailing somebody.”
Sen. Brad Hoylman (D-Manhattan) has been a leading voice in the State Legislature on privacy and has sponsored measures to keep law enforcement from using certain types of facial recognition technology. Similarly, he now is promoting a measure to bar contact tracing and health information collected by an employer from being shared with police, so the data cannot be used for noncoronavirus purposes.
“I’m sensitive to the need for testing, especially for individuals,” Hoylman said. “I’m just not sure the employer needs to know it.”
Months from now, or a year from now, it’s possible the changes will simply be accepted and privacy concerns washed away, much as 9/11 ushered in screening and security changes and broader police powers to deal with terrorism, powers that are now used to investigate a wider variety of crimes. Even when a coronavirus vaccine is developed, some think the changes implemented now might stay.
“My read is that 9/11 changes quite a bit for people. Post 9/11, people just became very comfortable with giving up personal data in exchange for security measures,” Cornell’s Ajunwa said. “The danger of a terrorist attack or, in this case, a health crisis, is a lot more immediate than the danger of your data being used against you, or the idea that this data could end up costing you a lot of money in the future or preventing you from getting a job.”