The Rationalizations for Workplace Surveillance


Despite the amount of workplace surveillance that takes place in this country, peering into the lives of their employees is not something that businesses undertake lightly: It's expensive, time-consuming, and inherently destructive of employee morale. It's reasonable to assume that businesses would not spend so much on surveillance technologies—Internet monitoring, closed-circuit television, drug testing, background checks, GPS systems, etc.—if they did not feel compelled to do so. A variety of interwoven factors, however, are driving U.S. companies to spend enormous amounts of time and money on employee surveillance. The bet is that the investment in increased surveillance will pay off by reducing employee theft and sabotage, increasing productivity, preventing lawsuits, avoiding violent incidents in the workplace, and preventing terrorist attacks.

Minimizing Theft and Sabotage

To understand the rationale for employee surveillance, you need go no further than this figure: In 2001, employees stole an estimated $15.243 billion in inventory from their employers. According to the 2001 National Retail Security Survey (NRSS) Final Report, prepared by the Department of Sociology and the Center for Studies in Criminology and Law at the University of Florida in Gainesville, "There is no other form of larceny that annually costs the American public more money than employee theft." Based on the responses from retailers, the NRSS estimates that 45.9 percent of all inventory shrinkage is the result of employee theft, compared to the estimated 30.8 percent that results from shoplifting. The fact that retailers lost more than $10 billion from customer theft is also relevant, however, since a lot of employee surveillance is an accidental by-product of trying to stop customer theft or misbehavior.

The theft of consumer goods, of course, is only one of the theft risks that employers face. According to a study by the Santa Monica, California, think tank Rand Corporation, high-tech equipment has proven to be a very popular target as well, accounting for as much as $4 billion in losses during late 1998 and early 1999.

If the computers themselves aren't being stolen, then they're often being used to steal. The tremendous increase in the number of personal computers in the workplace and the enthusiastic adoption of the Internet by the business community has opened up enormous new security concerns. It's one thing to try to prevent the theft of things that must be physically removed from the workplace; it's another thing entirely to prevent the theft of information that can be downloaded to a floppy or Zip disk, e-mailed to oneself or a confederate, or even posted to a Web page for the entire world to see.

For the past seven years, the Computer Security Institute (CSI) in San Francisco, in conjunction with the Federal Bureau of Investigation's (FBI) San Francisco Computer Intrusion Squad, has conducted a survey of computer crime and security in the United States. Among the important findings of the CSI/FBI 2002 Report:

  • Nine out of ten respondents were the victims of computer security breaches;

  • Eight out of ten respondents suffered financial losses as a result of those security breaches;

  • Forty-four percent of the respondents had tallied the cost of the security breaches: They reported a total of $455,848,000 in financial losses;

  • The two most expensive categories of theft were intellectual property ($170,827,000) and financial fraud ($115,753,000); and

  • Three-quarters of all respondents said that their Internet connections were a frequent point of attack, while only a third listed their internal network systems.

It would be tempting but inaccurate to blame nonemployees for the computer and Internet security breaches that plague companies. In fact, experts estimate that 70 percent to 80 percent of all computer crime is committed by employees against their employers. In the CSI/FBI report, for instance, eight out of ten respondents reported that their employees had abused their workplace Internet connections in some manner, ranging from inappropriate use of the e-mail system to downloading pornography. [4]

Theft is only one of the perils faced by employers in today's high-tech world. Before computers, causing significant damage to an employer's property required time, energy, and typically the use of an accelerant like gasoline. Today, however, a few lines of software code can wipe out millions of dollars of intellectual property. For example, in 1996, Timothy Lloyd, a thirty-year-old network program designer at the Bridgeport, New Jersey, facility of Omega Engineering, Inc., was told that he was about to be fired. Outraged at the company's treatment of him, Lloyd wrote and planted a "logic bomb" that detonated three weeks after his departure from the company. The "bomb" destroyed Omega's main database, resulting in an estimated $10 to $12 million in lost sales and repair costs, and causing eighty other Omega employees to be laid off.

Companies are often reluctant to report insider attacks for fear of revealing vulnerabilities that might be exploited by others, or they're unwilling to go through the time and expense of actually prosecuting someone. There's little doubt, however, that employees pose the greatest threat to corporate computers.

The Productivity Paradox

The next most-frequently cited justification for employee surveillance is the need to maintain or improve productivity. "An honest day's work for an honest day's pay"—that's the core agreement between worker and employer, whether payment is in bushels of wheat or stock options. How we determine "productivity" has changed somewhat over the years ("Kill the mastodon or you don't eat" eventually became "Get that report in on time or you're fired") but the basic issue remains the same. In a capitalist system, businesses that do not produce an adequate supply of goods or services, or whose costs routinely exceed their revenues, effectively starve to death.

The challenge that companies have faced over the last century is that many of the same technological innovations aimed at improving individual workplace productivity have carried with them the risk of wasted time and resources. Of course, a chatty coworker is more than sufficient to waste time at work, and doodling has been a reliable workplace diversion since the pencil was invented. But one of the great things about technology is that it makes the wasting of time so much easier.

Employers got their first taste of this phenomenon with the introduction of the telephone. It took less than fifty years for the telephone to make the transition from Alexander Graham Bell's crowded workshop to virtually every office desktop in the country. As it did so, it became the communication equivalent of the railroad, helping to accelerate the Industrial Revolution by dramatically speeding up (if not entirely inventing) the art of the deal.

From an employer's perspective, the chief benefit of the telephone was that it made it possible for employees to do more work in a shorter period of time. But this new technological marvel added something not typically found in more formal office communication—chitchat. To a limited degree, of course, businesses promote chitchat: They want their employees to develop personal relationships with the people with whom they regularly deal.

The challenge, of course, is balancing telephone chitchat with productivity. This was less of a problem in the early part of the twentieth century, when the number of residential telephones lagged behind business phones. By the late 1950s, however, more than 75 percent of the homes in America had their own telephone, and the office gradually stopped being the secluded domain of the man in the gray-flannel suit. Leave It to Beaver's June Cleaver would never have called her husband Ward, asking him to stop and pick up a gallon of milk on the way home, but thirtysomething's Hope Murdoch Steadman saw nothing inappropriate in calling Michael to discuss everything from dinner plans to child crises. More than any other single technology, the telephone helped blur the lines between work and home.

Most employers recognize that as long as the number or length of personal calls is not excessive, attempts to ban them are bad for employee morale. Besides, the telephone offers only middling possibilities for wasting time. In most offices, it's fairly difficult to keep fellow employees or bosses from realizing that you're spending hours on the phone chatting with family or friends.

The same thing, of course, can hardly be said about the personal computer. To be fair, when they were first introduced to the office, personal computers posed little threat to employee productivity. The early PCs were fairly simple machines: stand-alone devices with limited storage space (the first IBM desktops didn't even have a hard drive), even more limited memory, and monochrome screens, none of which were particularly conducive to game-playing or other time-wasting applications. More importantly, their expense made most companies very conscious of how they were being used.

Games were not completely unavailable, of course—my college buddies and I spent a now-embarrassing number of hours playing Oubliette, a Dungeons-and-Dragons-type game that was very clever in its ability to create images on a monochrome screen. In addition, text-based games like Adventure, Zork, and Leather Goddesses of Phobos also enjoyed a brief but intense period of popularity, but overall, the businesses that made early purchases of desktop computers could safely assume that they would in fact be used for business purposes.

By the early 1990s, however, simplicity had become a lost art. With the release of Microsoft's Windows 3.0 (1990) and 3.1 (1992), businesses rapidly discovered that a perfectly functioning PC offered an almost limitless potential for frittering. Users could and still do spend hours tweaking color schemes and wallpaper, changing various settings, and installing utilities.

From a productivity point of view, the real footsteps of Doom could be heard in 1990, when Microsoft included a version of Solitaire in its release of Windows 3.0. [5] Solitaire has remained a part of Windows system releases ever since and may well hold the title as the world's most frequently played computer game. And Solitaire is merely the most obvious example of a huge universe of available computer games. Expert estimates vary widely on how much game playing takes place in offices around the country, but Apreo (formerly DVD Software), which produces a game-blocking program called AntiGame, recently claimed that workplace game playing now costs employers over $50 billion per year. On the Apreo website, Steve Watkins, an assistant vice president and director of information technology at Summit National Bank in Atlanta, Georgia, offers a typical comment:

AntiGame is a great asset to our organization. ... With its customizable database, AntiGame allows us to unobtrusively monitor users' workstations for unapproved software, games or otherwise, and clean the offending stations if deemed necessary. With about one hundred users spread over six locations from Georgia to California, AntiGame is a tremendous time-saving improvement over our former monitoring technique—visiting each workstation individually.
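The kind of audit Watkins describes can be sketched in a few lines. The following is a minimal, hypothetical illustration of the general technique—comparing the programs found on a workstation against an approved-software list and flagging everything else—not AntiGame's actual implementation; all names and paths here are invented for the example.

```python
# Sketch of an unapproved-software audit: walk a workstation's program
# directory and report any executable not on the approved list.
# The approved list and directory layout are assumptions for illustration.

from pathlib import Path

APPROVED = {"winword.exe", "excel.exe", "outlook.exe"}


def audit_workstation(install_dir: str) -> list[str]:
    """Return the executables under install_dir that are not approved."""
    found = [p.name.lower() for p in Path(install_dir).rglob("*.exe")]
    return sorted(name for name in found if name not in APPROVED)


if __name__ == "__main__":
    import tempfile

    # Simulate a workstation with one approved and one unapproved program.
    with tempfile.TemporaryDirectory() as ws:
        (Path(ws) / "winword.exe").touch()
        (Path(ws) / "sol.exe").touch()  # Solitaire: not on the approved list
        print(audit_workstation(ws))   # ['sol.exe']
```

Run centrally against a hundred machines, a script like this replaces the "visiting each workstation individually" routine Watkins mentions—which is precisely why such tools double as general-purpose monitoring.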

As the Internet becomes an increasingly important part of the workplace, however, game playing is losing its position as the chief time-wasting tool for employees. From an employer's perspective, having unmonitored Internet access on each desk is roughly the equivalent of installing a gazillion-channel television set for each employee. In part because of its sheer convenience, and in part because businesses tend to have faster Web access, employees are finding it difficult to resist the temptation to shop for presents, plan vacations, check out sports scores, trade stocks, buy and sell items on eBay, correspond with friends and family, read reviews, buy movie tickets, and so on. There were certainly noncomputer ways to do all of these things before the Internet; it's just that the Internet makes it so much easier and less immediately obvious to the employer.

Moreover, the very nature of the World Wide Web exacerbates the productivity problem. As any researcher knows (including yours truly), the lure of hypertext links is so seductive that it's easy to begin researching the finer points of wireless networks and wind up reading a Web page about the right kind of chiles for salsa. This type of research drift is problematic enough in a library; on a device where the next interesting thing is simply a click away, it can be a huge challenge to maintain focus.

Minimizing Workplace Litigation

In truth, most companies recognize that an occasional personal e-mail or a little online Christmas shopping is not an enormous threat to employee productivity and, in fact, may help to improve workplace morale. Companies run into real trouble, though, when employees use the Internet to share racist or sexist jokes, or to access sexually explicit websites. Companies that fail to take steps to prevent that from happening face the threat of harassment suits based on the existence of a hostile work environment.

start sidebar
Examples of Internet-Related Harassment Actions
  • In 2001, nine female workers sued John Deere & Co. for discrimination and sexual harassment. They alleged, among other things, that coworkers printed out Internet pornography on company equipment during work hours.

  • In 1996, financial giant Smith Barney, Inc. was sued by twenty-five female employees for sexual harassment and discrimination, based in part on messages and materials distributed across the company's computers. In 1998, the company fired two high-ranking stock analysts for violating the company's rules on the use of computers to distribute pornography.

  • In 1995, just as the World Wide Web was growing in popularity, Chevron was sued by a number of female workers under the hostile work environment theory. Among the evidence that the plaintiffs introduced was an e-mail message that was circulated on the company system entitled "Twenty-Five Reasons Why Beer Is Better than Women." That e-mail cost Chevron just under $100,000 per reason; the company settled the lawsuit for $2.2 million.

end sidebar

Lawsuits that allege the existence of a hostile work environment are an outgrowth of the landmark Civil Rights Act of 1964, a piece of legislation that resulted in the longest debate in the history of the U.S. Congress. Under Title VII of the Civil Rights Act, it is "an unlawful employment practice for an employer ... to discriminate against any individual with respect to his compensation, terms, conditions, or privileges of employment, because of such individual's race, color, religion, sex, or national origin." [6] Not surprisingly, nearly every word in that sentence has been the subject of repeated litigation, as employers and employees try to sort out the boundaries of permissible conduct.

Not long after the Civil Rights Act went into effect, courts concluded that discrimination did not have to be "economic" or "tangible" in order to be a violation of Title VII. Ruling for a woman, Teresa Harris, who had been subjected to gender- and sexually-based comments and other offensive behavior from her company's president, Justice Sandra Day O'Connor wrote: "When the workplace is permeated with discriminatory intimidation, ridicule, and insult that is sufficiently severe or pervasive to alter the conditions of the victim's employment and create an abusive working environment, Title VII is violated." [7] O'Connor went on to add that the Court was attempting to draw a line between conduct that causes a "tangible psychological injury" and conduct that is "merely offensive."

Not surprisingly, employees (and their lawyers) were quite pleased with the Harris decision. After the case was decided in November 1993, the Equal Employment Opportunity Commission (EEOC) saw an immediate rise in the number of sexual harassment claims, from 11,908 in fiscal 1993 to 14,420 in fiscal 1994, or an increase of more than 20 percent. Currently, approximately 15,500 sexual harassment claims are filed with the EEOC each year.

The cost of sexual harassment claims has also gone up significantly since Harris was decided. According to the financial services magazine Treasury & Risk Management, corporations paid about $1 billion on sexual harassment settlements and awards between 1992 and 1997. In 1998 alone, the average award for an employment claim was $550,000. And those figures, of course, do not include other related costs of sexual harassment litigation, such as lawyers' fees and litigation preparation.

Technology—especially the Internet—has dramatically increased the risk of hostile work environment claims. Fifteen years ago, an insensitive lout might tell a sexist joke around the water cooler or in the lunch room; if the conduct was pervasive and sufficiently long-lasting, it could serve as the basis of a hostile work environment claim, but the scope of the damages was likely to be relatively limited. Today, that same lout not only has access to thousands of offensive jokes and images, he (or she, to be fair) can distribute them around the entire company with a click of a button.

Preventing Workplace Tragedies

Just after 11:00 A.M. on December 26, 2000, Michael McDermott stood up from his desk at Edgewater Technologies, a Wakefield, Massachusetts, Internet-consulting firm where he worked as a software tester. Having just been told that his car was being repossessed and facing garnishment of his wages by Edgewater for federal back taxes, McDermott pulled a rifle, shotgun, and pistol from a gun bag, and went on a shooting spree that resulted in the deaths of seven coworkers. [8] Just six weeks later, on February 5, 2001, a sixty-six-year-old former employee at Navistar International Corp. in suburban Chicago pushed past security, opened his golf bag full of weapons, and shot four coworkers. After wounding four other employees, William Baker then killed himself.

The two incidents were merely the latest in a string of high-profile workplace shootings. Overall, the Occupational Safety & Health Administration (OSHA) reported that in 2000, 674 workplace homicides occurred, and in 1999, violent (but nonfatal) assaults occurred 16,664 times. Put another way, during an average five-day workweek, nearly thirteen people are killed and roughly 320 are attacked at work.

Needless to say, employers have an interest in doing everything they can to minimize workplace violence. Apart from the pain and suffering that workplace violence causes, companies face significant liability issues arising from on-the-job attacks. Preferring to go after defendants with so-called "deep pockets," or at least deep-pocketed insurance policies, plaintiffs' attorneys have worked hard to develop new theories of potential corporate liability, including negligent hiring and negligent retention of dangerous employees. Increasingly, corporations are also being challenged on whether they had in place adequate security measures to prevent workplace violence.

Because of its high profile and the scope of the perceived threat, workplace violence may be the biggest single contributor to reduced employee privacy. As most employers correctly realize, the best solution to workplace violence is to prevent it from occurring in the first place. Unfortunately, the type of information that might signal that an employee is a potential threat is often among the most private—a history of physical or sexual abuse, serious domestic issues, profound financial pressures, psychological illness, etc.

Despite the potential invasions of privacy, workplace security consultants strongly advise employers to undertake the types of investigations that might root out such information. For instance, Michael McIntyre, a professor of psychology at the University of Tennessee in Knoxville and the developer of a test to identify the potential for workplace violence, says that he "recommends that employers do as much due diligence on the preemployment side as possible." While conceding that his test can't predict that someone will become a workplace shooter ("There aren't enough incidents to use as a basis for scientific study," he said), Professor McIntyre is confident that the twenty-five-question test developed by him and his colleague, Larry James, can successfully identify people with a propensity for "desk rage," the cubicle equivalent of "road rage."

The steps companies take to ferret out potential violence include putting increased resources into background checks and increasing their efforts to obtain medical and psychological records. Since health records are often difficult (but unfortunately, not impossible) to obtain, employers are administering more psychological and personality tests to applicants and existing employees in an attempt to locate potential time bombs. In addition, employers are also paying closer attention to how employees behave on the job, instituting tighter access controls, and even expanding their monitoring of employee off-hour activities.

Some of these measures have a relatively limited impact on employee privacy—even tracking where employees are in a particular building is not an enormous infringement on privacy. The problem with some inquiries—psychological and personality tests, for instance, or monitoring off-hour activities—is the seemingly tenuous relationship between the information gathered and a propensity for violence. The potential for abuse and misguided profiling is enormous: Should we watch more closely the National Rifle Association member with the cabinet full of guns? How about the bulky, bearded Hells Angels member? The stereotypically angry, militant feminist? The Vietnam vet? The short, mild-mannered bookish type just waiting to avenge a lifetime of wisecracks and Randy Newman songs? The list, obviously, is endless.

Preventing Electronic and Physical Terrorism

Before September 11, 2001, concern over terrorism in this country, particularly in the business community, was largely limited to the threat of cyber-terrorism. Admittedly, there was some increased concern as a result of the first bombing of the World Trade Center in 1993 and the destruction of the Alfred P. Murrah Building in Oklahoma City in 1995, but to a large extent, those were viewed as aberrations. The perception of the country as Fortress America remained strong.

By contrast, concern over the growing threat of cyberterrorism—outside attacks on computer systems—had been growing steadily for some time. Even before the World Wide Web began skyrocketing in popularity in 1995 and 1996, businesses and government agencies had awakened to the unpleasant fact that the new international communications network could be used by hackers, competitors, foreign governments, grumpy employees, and precocious twelve-year-olds to do unpleasant things to their computer systems and critical data. The ongoing efforts to protect corporate computer systems diminished employee privacy in a number of ways, from the increased scrutiny given to workers when they are hired to the greater surveillance of all computer-related activity.
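One concrete form that "greater surveillance of all computer-related activity" takes is automated log review. The sketch below is a hypothetical illustration of the idea—flagging accounts with a burst of failed logins, a classic signal of an attempted break-in—with the log format and threshold invented for the example.

```python
# Minimal sketch of automated log scrutiny: count failed logins per
# account and flag any account at or above a threshold. The log line
# format and the threshold of three are assumptions for illustration.

from collections import Counter


def flag_suspicious(log_lines, threshold=3):
    """Return accounts whose failed-login count meets the threshold."""
    failures = Counter(
        line.split()[-1]              # last field: the account identifier
        for line in log_lines
        if "FAILED LOGIN" in line
    )
    return sorted(acct for acct, n in failures.items() if n >= threshold)


if __name__ == "__main__":
    log = [
        "09:01 FAILED LOGIN user=mallory",
        "09:01 FAILED LOGIN user=mallory",
        "09:02 FAILED LOGIN user=mallory",
        "09:05 LOGIN OK user=alice",
    ]
    print(flag_suspicious(log))  # ['user=mallory']
```

The privacy trade-off is built in: a filter like this necessarily inspects every employee's activity in order to find the rare hostile one.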

The phenomenon of hacking first hit the mainstream when Hollywood released WarGames in 1983. In the film, the character played by Matthew Broderick shows off for a friend by using his computer to dial into the computer system at the North American Aerospace Defense Command (NORAD) and play computer games with the "WOPR," the system's mainframe. In the process, he nearly triggers World War III. A commercial success (taking in nearly $80 million), the film helped stir public interest in the vulnerabilities of computer systems.

But to really grab the public's imagination, you need a hero or a villain. Hacking got its villain on July 4, 1994, when John Markoff reported on the federal government's efforts to capture hacker Kevin Mitnick. The front-page article in The New York Times breathlessly recounted Mitnick's exploits as a "phreaker" and hacker and said that Broderick's character in WarGames was based on Mitnick, who had allegedly broken into the NORAD computer systems as a teenager in the early 1980s. [9] Both NORAD and Mitnick, not surprisingly, strenuously denied that the break-in ever occurred, but the article helped to establish Mitnick as the most infamous hacker in the country.

start sidebar

The online debate over Mitnick's guilt or innocence is passionate. At the time that Markoff's article was written, Mitnick was allegedly a fugitive from justice and carrying out an electronic crime wave resulting in tens of millions of dollars in damage to a variety of major corporations, including Motorola, Nokia, and Sun. However, during pretrial proceedings in 1998 and 1999, government prosecutors conceded that Mitnick had largely complied with the terms of his earlier probation, although they stood by their claim that the issuance of a secret arrest warrant did make the hacker a fugitive.

While it was apparently true that Mitnick was breaking into corporate computer systems, a similar disagreement arose over the amount of damage that he caused when he did so. The government asserted that Mitnick's activities caused corporations $80 million in damages, and as part of his plea agreement, Mitnick admitted to $10 million in damage. Mitnick's defenders, however, allege that he was the classic hacker—someone who broke into systems merely for the intellectual challenge of doing so, and not for economic gain. True hackers distinguish themselves from a "cracker," defined as "[o]ne who breaks security on a system, a term coined ca. 1985 by hackers in defense against journalistic misuse of hacker." ["Cracker," q.v., The New Hacker's Dictionary (Eric S. Raymond, compiler), Cambridge, Mass.: MIT Press, 1994.]

While acknowledging that Mitnick did not seek to profit from the material he copied from invaded systems, prosecutors nonetheless argued that Mitnick's activities diminished the value of the material he copied and made it easier for others to steal and sell the confidential material.

For his exploits, real or imagined, Mitnick earned the dubious honor of being the first hacker on the FBI's most-wanted list and became the subject of a nationwide manhunt. He was arrested on February 15, 1995, after a widely publicized investigation spearheaded by Tsutomu Shimomura, a computer expert at the San Diego Supercomputer Center. Following his arrest, Mitnick was denied bail (in fact, in an exceedingly rare move, he was even denied a bail hearing) and spent the next four years in jail awaiting trial. Just prior to the scheduled start of his trial in April 1999, Mitnick signed a plea agreement in which he pled guilty to five of twenty-five counts of fraud. He was sentenced to forty-six months in federal prison, and with credit for time served, was released in 2000.

end sidebar

Mitnick may or may not deserve the scarlet H that's been pinned to his chest, but there's no question that media reports about his alleged career and the exploits of hacker gangs like the Legion of Doom helped heighten public awareness about the potential for cyberterrorism. A 1998 report by the Rand Corporation warned of the potential for a "digital Pearl Harbor" and concluded that, because of the growth of networked systems, "the U.S. homeland may no longer provide a sanctuary from outside attack." Congress also leapt on the cyberthreat bandwagon, with Senator Fred Thompson of Tennessee declaring that an attack on the United States would start with attempts "to screw up our computers." (Unfortunately, 9/11 proved Senator Thompson wrong.)

The increased concern over electronic attacks has also led directly to sizeable increases in spending on cybersecurity measures. In 1999, the Aberdeen Group, a Boston Internet analyst firm, estimated that corporations spent $7.1 billion to guard against cyberattacks, and predicted that outlays would rise to $17 billion per year by 2003.

In addition to protecting themselves against cyberattacks, companies are also investing much more heavily on physical security. Even before 9/11, corporations were spending nearly $12 billion per year on security systems; in a survey conducted not long after the attacks, the corporate security firm Kroll Inc. found that the percentage of businesses rating physical security as a priority had jumped from 40 percent to 90 percent.

In general, businesses that spend money on physical security are trying to accomplish two separate objectives: to protect their employees and property from attack, and to make sure that their resources and systems are not infiltrated and used to attack other targets.

A seemingly inevitable by-product of the increased corporate security is a shrinking of employee privacy. In order to effectively protect their resources, businesses need to be aggressive about keeping track of who is on their property, what they're doing, and where they are.

As we'll see in the following chapters, determining who may come onto a business's property is currently the most intrusive part of employment. The largest increase in the amount of money that businesses are spending on security is for background checks of employees, with the result that more employees are being examined, more information is being gathered, and stricter standards are being applied. In many cases, relatively minor and seemingly irrelevant past infractions are causing a loss of employment or a failure to be hired in the first place.

[4]Press release, "Cyber crime bleeds U.S. corporations, survey shows; financial losses from attacks climb for third year in a row," Computer Security Institute (April 7, 2002). Available online at www.gocsi.com/press/20020407.html.

[5]Ironically, the author of Solitaire, Wes Cherry, did not receive a single cent for creating his Windows version of the game. "Some program manager in the Windows group saw it and decided to release it with Windows 3.0," Cherry said in an interview with journalist Charles Slocum. "[T]he condition was I did it for free (I was uncompensated other than the use of a computer they provided to work on it during my senior year of college). I probably could have negotiated some kind of one-time payment, but Microsoft really doesn't like to give per-copy royalties unless they absolutely need the technology." Charles B. Slocum, "Solitary Confinement," Written By (February 2001).

[6]42 U.S.C. 2000e-2(a)(1).

[7]Harris v. Forklift Systems, Inc., 510 U.S. 17 (1993) (internal citations and quotations omitted).

[8]"Workplace shooting suspect accused of killing seven victims in seven minutes," Court TV website, [n.d.]. Available online at www.courttv.com/trials/mcdermott/background.html.

[9]A "phreaker" is someone who breaks into telephone systems, primarily by using equipment—or even whistling—to mimic the control tones used by the telephone network.




The Naked Employee. How Technology Is Compromising Workplace Privacy
ISBN: 0814471498
Year: 2003
Pages: 93
