
There’s a Spider-Man story where Peter Parker signs up to work for one Tony Stark, CEO of the global tech giant Stark Industries. As a new company man, he gets a cool new employee costume (er, uniform).

But suddenly Stark knows things — personal things about Spidey and his abilities — things Parker’s never told anyone, let alone his new CEO. The new company gear has come with a hidden cost: It’s constantly extracting his biometric data and sending the detailed results back to his boss.

Being snooped on by the boss is a scenario that’s increasingly familiar to modern-day employees, particularly those working in tech, finance and banking — or almost anyone not currently working in the office.

The rise in working from home, and its seeming permanence for some industries, has coincided with a greater demand for employee monitoring software — leading to a host of new concerns about data security, employee privacy, and exactly where an employer’s rights end.

“The general rule is if you’re on company software, they can monitor that, at any time and for any reason … email, internet, all of it,” said Greg Givens of the Law Offices of Gregory E. Givens P.C., a Springs-based firm specializing in labor and employment law.

“Employers have a responsibility to ensure employees are doing what they say they are doing. … The classic example is someone working in a call center. You know for sure they will be on their phone and computer, and especially for hourly employees, there is a reasonable employer expectation of having them at that desk.”

That said, Givens also added that “if word gets out an employer is really snooping around, or that everyone is being looked at for eight to 10 hours a day, that’s sure not going to engender good morale.”

Greg Williams, director of IT operations at UCCS, said employer concerns are twofold. “There are companies that want to make sure employees are being productive, doing what they say they are doing — but they also want to protect their data and intellectual property, since their employees are working from home. UCCS is more concerned with the latter; … at UCCS we very much protect the privacy of others.”

But protecting data doesn’t necessarily require digital snooping.

“I think surveillance of employees is a bad practice,” said Williams. “Nobody wants to be spied upon. And there are many tech controls you can put on data, rather than using surveillance of employees.”

Williams also said that he’s seen a dramatic rise in productivity of his own staff since many began working remotely, adding that “I don’t need to worry about if they’re working or not, because I can see their work outcomes.”

As the university’s former information security officer, Williams also has a lot to say about ensuring remote workers are operating safely while not in the office. 

“One tip is to always use your company’s VPN. Here [at UCCS] while you’re on the VPN, you are also connected to the campus firewall, and that should protect from some threats.”

Williams is also wary of facial recognition software and other forms of surveillance that have shown a propensity for bias and misclassification of Black faces.

“The algorithms are getting better, but it’s still not up where it should be,” he said. “Facial or voice recognition, things that automatically detect certain things … it’s been proven to be biased, and it’s going to take some time to work out the [artificial] intelligence and machine learning algorithms that identify that. … Anything that is biased, in my belief, should not be used.”

It’s a sentiment echoed by Jay Stanley, a senior policy analyst at the American Civil Liberties Union.

“You see a lot of snake oil in the tech sector, with companies selling products that sound amazing but really are too good to be true,” said Stanley, noting that there is even emotion-monitoring software on the market, technology he considers largely bogus and not yet ready for prime time.

“Companies will always sell [software] before it is ready. … Facial recognition can fall into that category. It’s gotten better over the years, but it remains problematic,” he said. “Facial recognition has different error rates for different ethnic groups — that’s not acceptable, and the software is not ready to be used for public purposes.”

NEXT STEPS

So what steps can employers and employees take now, while the bugs are still being worked out and privacy expectations aren’t always clear?

Williams said employers need to make certain their security tech is robust enough that employees have what they need to be safe at home — which includes making sure they don’t have to do work on personal systems, far more susceptible to threats the company can’t see. It’s also crucial to establish a training program for remote employees so they’re aware of the latest phishing and malware risks.

Williams said employees should make sure to use company tech for company activities — no using your work laptop to watch YouTube videos or visit personal websites — and be certain security controls are in use, such as a VPN and anti-malware/antivirus software.

Stanley feels that any advice for workers would have to be calibrated to their bargaining power. 

“Many may feel they have no choice — they’re not in a position to make demands or say no, so they need to accept [monitoring],” said Stanley. But like any workplace issue, he believes that “if you’re in a position where you do have bargaining power, you should push back and agitate for a better solution.

“Employers have a legitimate interest in monitoring work performance and productivity,” he said, “but going back to Henry Ford and earlier, sometimes monitoring goes well beyond what’s legitimate.

“Workplace monitoring should be very narrowly tailored, and employees should be informed of all monitoring that takes place. It should never take place in a way that creates an atmosphere of intimidation.”

Another privacy concern Stanley sees is the use of artificial intelligence in employee surveillance. “AI logic can be opaque, biased and unfair. It’s a whole new level of intrusion.”

But for Stanley, the issue of an employee’s privacy extends beyond the legal realm and into the psychological welfare of the worker.

“Study after study has shown people do need privacy,” said Stanley. “You can go for periods of time without it ... but it exacts a psychological toll, so employers should not think they can do this and everything will be fine. It does create negative feelings in people to be monitored closely, and that can make for unhappy workers.” 

Stanley cites “The Transparency Paradox,” a study by Ethan Bernstein at Harvard Business School which found that surveillance actually reduces productivity, and that scaling surveillance back causes productivity to go up. “Excess surveillance can slow down workers ... they are less efficient when every movement is being watched, so even from a pure profit perspective, these are things employers should think about before they start installing pervasive monitoring software.”

Givens believes most problems arise when the employer has not fostered a work environment where employees feel they can go and talk to someone. And in his experience, it’s mostly large international corporations that end up taking aggressive measures, not so much smaller LLCs based in Colorado Springs.

“But if you do want to work for a defense contractor based in Virginia, you’ve got to abide by their rules, like it or not,” he said.

“The lesson to be learned here is that communication is key, and if the lines stay open most problems can get resolved. Employers need to adapt, just like employees. Employers sometimes tell me, ‘If I do this I’ll have a mutiny’ … OK, well maybe just don’t do it?”

“I don’t think it’s an intractable problem,” Givens said. “It has a resolution, and communications can solve 75 to 80 percent of this in the workplace … but people can’t always come together and sing kumbaya and play guitar, which is why I can make a living.”