9.2. Quantifying Security

Cybersecurity threats and risks are notoriously hard to quantify and estimate. Some vulnerabilities, such as buffer overflows, are well understood, and we can scrutinize our systems to find and fix them. But other vulnerabilities are less understood or not yet apparent. For example, how do you predict the likelihood that a hacker will attack a network, and how do you know the precise value of the assets the hacker will compromise? Even for events that have happened (such as widespread virus attacks), estimates of the damage vary widely, so how can we be expected to estimate the costs of events that have not happened?

Sidebar 9-1: A Business Case for Web Applications Security

Cafésoft [CAF06] presents a business case for web applications security on its corporate web site. The business case explains the return on investment for an organization that secures its web applications. The ROI argument has four thrusts.

  • Revenue: Increases in revenue can occur because the security increases trust in the web site or the company.

  • Costs: The cost argument is broader than simply the installation, operation, and maintenance of the security application. It includes cost savings (for example, from fewer security breaches), cost avoidance (for example, from fewer calls to the help desk), efficiency (for example, from the ability to handle more customer requests), and effectiveness (for example, from the ability to provide more services).

  • Compliance: Security practices can derive from the organization, a standards body, a regulatory body, best practice, or simply agreement with other organizations. Failure to implement regulatory security practices can lead to fines, imprisonment, or bad publicity that can affect current and future revenues. Failure to comply with agreed-upon standards with other organizations or with customers can lead to lost business or lost competitive advantage.

  • Risk: There are consequences to not implementing the proposed security measures. They can involve loss of market share, legal exposure, or loss of productivity.

To build the argument, Cafésoft recommends establishing a baseline set of costs for current operations of a web application and then using a set of measurements to determine how security might change the baseline. For example, the number of help-desk requests could be measured currently. Then, the proposer could estimate the reduction in help-desk requests as a result of eliminating user self-registration and password management. These guidelines can act as a more general framework for calculating return on investment for any security technology. Revenue, cost, compliance, and risk are the four elements that characterize the costs and benefits to any organization.
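The baseline-and-delta calculation Cafésoft recommends can be sketched in a few lines. All figures below are hypothetical, invented for illustration; the business case itself supplies no numbers.

```python
# Hypothetical baseline-and-delta ROI sketch; none of these figures come
# from the Cafésoft business case itself.
baseline_calls_per_month = 400     # measured help-desk requests today
cost_per_call = 25.0               # assumed fully loaded cost per request
estimated_reduction = 0.30         # proposer's estimated fraction of calls
                                   # avoided after the proposed change

annual_savings = baseline_calls_per_month * 12 * cost_per_call * estimated_reduction
security_investment = 20_000.0     # hypothetical install + operation cost

roi = (annual_savings - security_investment) / security_investment
print(f"annual savings ${annual_savings:,.0f}, ROI {roi:.0%}")
# prints: annual savings $36,000, ROI 80%
```

The same skeleton works for any of the four elements: measure a baseline, estimate the change security produces, and compare the difference with the cost of the security investment.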


Unfortunately, quantification and estimation are exactly what security officers must do to justify spending on security. Every security officer can describe a worst-case scenario under which losses are horrific. But such arguments tend to have a diminishing impact: After management has spent money to counter one possible serious threat that did not occur, it is reluctant to spend again to cover another possible serious threat.

Gordon and Loeb [GOR02a] argue that for a given potential loss, a firm should not necessarily match its amount of investment to the potential impact on any resource. Because extremely vulnerable information may also be extremely costly to protect, a firm may be better off concentrating its protection resources on information with lower vulnerabilities.

The model that Gordon and Loeb present suggests that to maximize the expected benefit from investment to protect information, a firm should spend only a small fraction of the expected loss due to a security breach. Spending $1 million to protect against a loss of $1 million but with a low expected likelihood is less appropriate than spending $10,000 to protect against a highly likely $100,000 breach.
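The comparison reduces to a simple expected-loss calculation. The likelihoods below are invented for illustration, since the text gives only the dollar amounts:

```python
def expected_net_benefit(loss, likelihood, spend, risk_reduction=1.0):
    """Expected loss avoided by a security control, minus the control's cost.

    risk_reduction is the fraction of the expected loss the control removes
    (1.0 = the control prevents the breach entirely -- an optimistic bound).
    """
    return loss * likelihood * risk_reduction - spend

# $1 million spent against a $1 million loss with low likelihood ...
a = expected_net_benefit(loss=1_000_000, likelihood=0.01, spend=1_000_000)
# ... versus $10,000 against a highly likely $100,000 breach.
b = expected_net_benefit(loss=100_000, likelihood=0.9, spend=10_000)

print(a, b)  # -990000.0 80000.0
```

Even with the optimistic assumption that the control removes the risk entirely, the first investment is a large net loss, while the second yields a substantial net benefit, which is the heart of the Gordon-Loeb argument.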

The Economic Impact of Cybersecurity

Understanding the economic impact of cybersecurity issues (prevention, detection, mitigation, and recovery) requires models of economic relationships that support good decision-making. However, realistic models must be based on data derived both from the realities of investment in cybersecurity and from the consequences of actual attacks. In this section, we describe the nature of the data needed, the actual data available for use by modelers and decision makers, and the gap between ideal and real.

For any organization, understanding the nature of the cybersecurity threat requires knowing at least the following elements:

  • number and types of assets needing protection

  • number and types of vulnerabilities that exist in a system

  • number and types of likely threats to a system

Similarly, understanding the realities of cyber attack also requires knowing the number and types of attacks that can and do occur, and the costs associated with restoring the system to its pre-attack state and then taking action to prevent future attacks.

Both the types of possible attacks and the vulnerabilities of systems to potential cyber attacks are fairly well understood. However, the larger direct and indirect consequences of such attacks are still largely unknown. We may know that a system has been slowed or stopped for a given number of days, but often we have no good sense of the repercussions as other systems can no longer rely on it for information or processing. For instance, an attack on a bank can have short- and long-term effects on the travel and credit industries, which in turn can affect the food supply. In a world of interconnected computers, this lack of understanding has serious consequences.

Data to Justify Security Action

Interest in society's reliance on information technology has spawned a related interest in cybersecurity's ability to protect our information assets. However, we lack high-quality descriptive data.

Data are needed to support cybersecurity decision-making at several levels.

  • National and global data address national and international concerns by helping users assess how industry sectors interact within their country's economy and how cybersecurity affects the overall economy. These data can help us understand how impairments to the information infrastructure can generate ripple effects[1] on other aspects of national and global economies.

    [1] A ripple effect is a cascading series of events that happen when one event triggers several others, which in turn initiate others.

  • Enterprise data enable us to examine how firms and enterprises apply security technologies to prevent attacks and to deal with the effects of security breaches.

    In particular, the data capture information about how enterprises balance their security costs with other economic demands.

  • Technology data describe threats against core infrastructure technologies, enabling modelers to develop a set of least-cost responses.

If we were looking at cost of labor, raw materials, or finished goods, we would have excellent data from which to work. Those concepts are easier to quantify and measure, governments assist in collecting the data, and economists know where to turn to obtain them. What makes these statistics so valuable to economists is that they are comparable. Two economists can investigate the same situation and either come to similar conclusions or, if they differ, investigate the data models underlying their arguments to determine what one has considered differently from the other.

Data to support economic decision-making must have the following characteristics:

  • Accuracy. Data are accurate when reported values are equal or acceptably close to actual values. For example, if a company reports that it has experienced 100 attempted intrusions per month, then the actual number of attempted intrusions should equal or be very close to 100.

  • Consistency. Consistent reporting requires that the same counting rules be used by all reporting organizations and that the data be gathered under the same conditions. For example, the counting rules should specify what is meant by an "intrusion" and whether multiple intrusion attempts by a single malicious actor should be reported once per actor or each time an attempt is made. Similarly, if a system consists of 50 computers and an intrusion is attempted simultaneously by the same actor in the same way, the counting rules should indicate whether the intrusion is counted once or 50 times.

  • Timeliness. Reported data should be current enough to reflect an existing situation. Some surveys indicate that the nature of attacks has been changing over time. For instance, Symantec's periodic threat reports [SYM06] indicate that attack behavior at the companies it surveys has changed from mischievous hacking to serious criminal behavior. Reliance on old data might lead security personnel to be solving yesterday's problem.

  • Reliability. Reliable data come from credible sources with a common understanding of terminology. Good data sources define terms consistently, so data collected in one year are comparable with data collected in other years.
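The consistency requirement is the easiest to make concrete. A sketch of two plausible counting rules applied to the same hypothetical event log shows how far apart the reported numbers can be (the log format and actor names here are invented):

```python
# Hypothetical event log: one actor probing 50 machines the same way,
# plus one unrelated attack. Format: (actor, target_host, method).
events = [("mallory", f"host{i:02d}", "ssh-brute-force") for i in range(50)]
events.append(("trudy", "host01", "sql-injection"))

# Rule A: count every attempt against every machine.
per_attempt = len(events)

# Rule B: count once per (actor, method) pair, however many machines are hit.
per_actor = len({(actor, method) for actor, _, method in events})

print(per_attempt, per_actor)  # 51 2
```

Two organizations experiencing identical activity could honestly report 51 intrusion attempts or 2, which is why surveys that leave the counting rules unstated produce data that cannot be compared.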

Sidebar 9-2 describes some of the data available to support cybersecurity decision-making.

Notice that some of the results in Sidebar 9-2 present trend data (37 percent in 2003 versus 65 percent in 2005 use security standards) and others report on events or activities (organizations have hardened their systems). However, few of the results contain data that could be used directly in a security investment business case.

Security Practices

The Information Security Breaches Survey (ISBS) is a particularly rich source of information about cybersecurity incidents and practices and provides a good model for capturing information about cybersecurity. A collaborative effort between the U.K. Department of Trade and Industry and PricewaterhouseCoopers, this survey is administered every two years to U.K. businesses large and small. Participants are randomly sampled and asked to take part in a structured telephone interview. In late 2005 and early 2006, over a thousand companies agreed to participate in the study. Additionally, PricewaterhouseCoopers conducted in-depth interviews with a few participants to verify results of the general interviews.

Sidebar 9-2: A Summary of Recent Security Surveys

We are not at a loss for surveys on computer crime and security incidents. Several surveys have been conducted for a number of years, so there is a significant body of collected data. Some surveys are more statistically accurate than others. And because of survey design, the data from one year's survey are not necessarily comparable to other years of that same survey, let alone to other surveys. Here are some of the notable surveys in this area.

The CSI/FBI Computer Crime and Security Survey is administered in the United States by the Computer Security Institute; it is endorsed by California units of the Federal Bureau of Investigation. Voluntary and anonymous, the participants are solicited from CSI members and attendees at CSI conferences and workshops. Five thousand information security practitioners were given the survey in 2005, and 699 responded.

Key findings:

  • Viruses are the largest source of financial loss. Unauthorized access showed dramatic gains, replacing denial of service as the second greatest source of loss.

  • The total dollar amount of financial loss from cyber crime is decreasing.

  • The reporting of intrusions continues to decrease, for fear of negative publicity.

  • Eighty-seven percent of respondents conduct security audits, up from 82 percent in the previous survey.

The 2005 Australian Computer Crime and Security Survey is the fourth annual survey conducted by AusCERT, the Australian National Computer Emergency Response Team. Modeled on the CSI/FBI Computer Crime and Security Survey, the Australian survey examines Australia's private and public industry cybersecurity threats, records the number of cyber incidents, and attempts to raise awareness of security issues and effective methods of attack prevention. The survey questionnaire was sent to the chief information security officers of 540 public and private sector organizations in Australia. Participation in the survey was voluntary and anonymous, and AusCERT received 188 responses.

Key findings:

  • Only 35 percent of respondents experienced attacks that affected the confidentiality, availability, or integrity of their networks or data systems in 2005, compared with 49 percent in 2004 and 42 percent in 2003.

  • The level of insider attacks has remained constant over three years, at 37 percent.

  • Viruses were the most prevalent type of attack. Denial of service created the most financial loss.

  • Only 37 percent of respondents used security standards in 2003, but 65 percent use them now.

In its third year, the Deloitte Touche Tohmatsu Global Security Survey in 2005 continued to focus on security practices of major global financial institutions. The respondents were voluntary and anonymous, and the data were gathered from extensive interviews with chief information security officers and chief security officers of financial institutions. Additionally, Deloitte allows a preselected group of institutions to participate in the survey using an online questionnaire instead of the interviews. The survey gathers data on seven areas: governance, investment, value, risk, use of security technologies, quality of operations, and privacy. The main issues it addresses are the state of information security practices in the financial services industry, the perceived levels of risk, the types of risks, and the resources and technologies applied to these risks.

Key findings:

  • Organizations have hardened their systems, making them less attractive to security breaches from hackers.

  • The weakest link is humans, not technology; attackers exploit this weakness particularly through phishing and pharming attacks.

  • Only 17 percent of respondents deem government security-driven regulations "very effective" in improving their organization's security position or in reducing data protection risks; 50 percent deem them "effective."

  • There is a trend toward having the chief information security officer report to the highest levels of the organization.

The 2004 Ernst and Young Global Information Security Survey found that although company executives are aware of computer security threats, their security practices are lacking. The survey, which included input from 1,233 companies worldwide, also concluded that internal threats are underemphasized and that many organizations rely on luck rather than security measures for protection. Ernst and Young has been conducting this kind of annual survey since 1993, using two methods for data collection. Companies are first asked to participate in face-to-face interviews; if that is not possible, they are sent electronic questionnaires. The survey is anonymous, and participation is voluntary.

Key findings:

  • Only one in five respondents strongly agreed that their organizations perceive information security as a priority at the highest corporate levels.

  • Lack of security awareness by users was the top obstacle to effective information security. However, only 28 percent of respondents listed "raising employee information security training or awareness" as a top initiative in 2004.

  • The top concern among respondents was viruses, Trojan horses, and Internet worms. A distant second was employee misconduct, regardless of geographic region, industry, or organizational size.

  • Fewer than half of the respondents provided employees with ongoing training in security and controls.

  • Only one in four respondents thought their information security departments were successful in meeting organizational security needs.

  • One in ten respondents consider government security-driven regulations to be effective in improving security or reducing risk.

The Internet Crime Complaint Center (IC3) is a collaborative U.S. effort involving the Federal Bureau of Investigation and the National White Collar Crime Center. It provides information to national, state, and local law enforcement agencies that are battling Internet crime. The IC3 collected its fifth annual compilation of complaints in 2005.

Key findings:

  • During 2005, the IC3 received over 231,000 submissions, an increase of 11.6 percent over the previous year. Of these, almost 100,000 complaints were referred to law enforcement organizations for further consideration. The majority of the referred cases involved fraud. The total dollar loss was over $182 million, with median dollar loss of $424 per complainant.

  • Internet auction fraud was the most frequent complaint, involved in 62.7 percent of the cases. Almost 16 percent of the cases involved nondelivered merchandise or nonpayment. Credit or debit card fraud was involved in almost 7 percent of the cases. The remaining top categories involved check fraud, investment fraud, computer fraud, and confidence fraud.

  • More than three of four perpetrators were male, and half resided in one of the following states: California, New York, Florida, Texas, Illinois, Pennsylvania, or Ohio. Although most of the reported perpetrators were from the United States, a significant number were located in Nigeria, the United Kingdom, Canada, Italy, or China.

  • Sixty-four percent of complainants were male, nearly half were between the ages of 30 and 50, and one-third resided in one of the four most populated states: California, Florida, Texas, or New York. For every dollar lost by a female, $1.86 was lost by a male.

  • High activity scams included Super Bowl ticket scams, phishing attempts associated with spoofed sites, reshipping, eBay account takeovers, natural disaster fraud, and international lottery scams.

The Imation Data Protection Survey sponsored by Imation Corporation attempts to understand how small and mid-size U.S. companies conduct data backup, protection, and recovery. In 2004, the online survey gathered information from 204 tape storage managers and information technology directors, who were selected by the Technology Advisory Board, a worldwide panel of more than 25,000 engineers, scientists, and IT professionals.

Key findings:

  • Most companies have no formal data backup and storage procedures in place. They rely instead on the initiative of individual employees.

  • E-mail viruses are the primary reason companies review and change their data protection procedures.

  • Regular testing of disaster recovery procedures is not yet a common practice.

In 2002, Information Security Magazine (ISM) gathered data from 2,196 security practitioners regarding organizational behavior and practices. By separating the data by organization size, the ISM survey detailed the differences in security responses and budget allocations. The survey found

  • Security spending per user and per machine decreases as organization size increases.

  • Allocating money for security does not reduce the probability of being attacked but does help an organization detect losses.

  • Most organizations do not have a security culture or an incident response plan.

Survey Sources

CSI/FBI Survey:
http://www.gocsi.com/forms/fbi/csi_fbi_survey.jhtml

Australian Computer Crime and Security Survey:
www.auscert.org.au/render.html?it=2001

Deloitte Global Information Security Survey:
www.deloitte.com/dtt/research/0,1015,sid=1013&cid=85452,00.html

Ernst and Young Global Security Survey:
www.ey.com/global/download.nsf/International/2004_Global_Information_Security_Survey/$file/2004_Global_Information_Security_Survey_2004.pdf

2005 IC3 Internet Crime Report:
www.ic3.gov/media/annualreport/2005_ic3report.pdf

DTI Information Security Breaches Survey 2006:
www.pwc.com/extweb/pwcpublications.nsf/docid/7FA80D2B30A116D7802570B9005C3D16

ICSA Tenth Annual Computer Virus Prevalence Survey:
www.icsalabs.com/icsa/docs/html/library/whitepapers/VPS2004.pdf


The survey results are reported in four major categories: dependence on information technology, the priority given to cybersecurity, trends in security incidents, and expenditures on and awareness of cybersecurity. In general, information technology is essential to U.K. businesses, so computer security is becoming more and more important. Of businesses surveyed, 97 percent have an Internet connection, 88 percent of which are broadband. More than four in five businesses have a web site, most of which are hosted externally. Small business is particularly dependent on information technology: Five of six said that they could not run their companies without it. Many of the respondents rate security above efficiency when specifying corporate priorities.

Nearly three times as many companies have a security policy now as in 2000. Almost every responding organization does regular backups, and three-quarters store the backups offsite. These organizations are proactive about fighting viruses and spam; 98 percent of businesses have antivirus software, 80 percent update their antivirus signatures within a day of notification of a new virus, and 88 percent install critical operating system patches within a week. Moreover, 86 percent of companies filter their incoming e-mail for spam. They view these controls as sufficient and effective; three-quarters of U.K. businesses are confident or very confident that they identified all significant security breaches in the previous year.

Economic Impact

But what is the economic impact of the policies and controls? Although the large increase in security incidents during the 1990s has stabilized (62 percent of U.K. companies had a security incident in 2005, compared with 74 percent in 2003), the reported costs remain substantial. The average U.K. company spends 4 to 5 percent of its information technology budget on information security, but two out of five companies spend less than 1 percent on security.

The average cost of a company's worst security incident was about £12,000, up from £10,000 in 2003. Moreover, large businesses are more likely than small businesses to have incidents and to have more of them (the median is 19 per year). A large business's worst breach costs on average £90,000. Overall, the cost of security breaches to U.K. companies has increased by about half since 2003; it is approximately £10 billion per annum. Fewer than half the companies surveyed conduct a security risk assessment, and those that do tend to spend more on security. Table 9-3 summarizes the changes in incidents and cost reflected by the ISBS survey.

Table 9-3. Overall Changes in Cost of U.K. Security Incidents (adapted from ISBS 2006)

                                                      Overall Change   Change for Large Businesses
  Number of companies affected                             20%                   10%
  Median number of cybersecurity incidents
    at affected companies                                  50%                   30%
  Average cost of each incident                            20%                   10%
  Total change in cost of cybersecurity incidents          50%                   50%


Are the Data Representative?

How representative are these data? Pfleeger et al. [PFL06c] have evaluated the available data, which collectively paint a mixed picture of the security landscape.

Classification of Attack Types

Understandably, the surveys measure different things. One would hope to be able to extract similar data items from several surveys, but unfortunately that is not often the case.

For example, the Australian Computer Crime and Security Survey reported a decrease in attacks of all types, but 43 percent of CSI member organizations reported increases from 2003 to 2004. The Deloitte survey found the rate of breaches to have been the same for several years. The variation may derive from the differences in the populations surveyed: different countries, sectors, and degrees of sophistication about security matters.

Types of Respondents

Most of these surveys are convenience surveys, meaning that the respondents are self-selected and do not form a representative sample of a larger population. For convenience surveys, it is usually difficult or impossible to determine which population the results represent, making it difficult to generalize the findings. For example, how can we tell if the CSI/FBI survey respondents represent the more general population of security practitioners or users? Similarly, if, in a given survey, 500 respondents reported having experienced attacks, what does that tell us? If the 500 respondents represent 73 percent of all those who completed the survey, does the result mean that 73 percent of companies can expect to be attacked in the future? Or, since completing the questionnaire is voluntary, can we conclude only that respondents in the attacked 500 sites were more likely to respond than the thousands of others who might not have been attacked? When done properly, good surveys sample from the population so that not only can results be generalized to the larger group but also the results can be compared from year to year (because the sample represents the same population).
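The self-selection problem can be illustrated with a toy simulation. All rates below are invented for illustration; the point is only the direction of the bias: if attacked organizations are more motivated to respond, the survey's attack rate overstates the population's.

```python
import random

random.seed(1)
POPULATION = 100_000
TRUE_ATTACK_RATE = 0.30          # assumed, for illustration

responses = []
for _ in range(POPULATION):
    attacked = random.random() < TRUE_ATTACK_RATE
    # Attacked organizations are assumed three times as likely to respond.
    respond_prob = 0.03 if attacked else 0.01
    if random.random() < respond_prob:
        responses.append(attacked)

observed_rate = sum(responses) / len(responses)
# The estimate lands near 56%, far above the true 30%, because responders
# are dominated by attacked sites:  .30*.03 / (.30*.03 + .70*.01) ~= 0.56
print(f"true rate {TRUE_ATTACK_RATE:.0%}, survey estimate {observed_rate:.0%}")
```

A properly sampled survey removes this distortion by fixing the response mechanism independently of whether a site was attacked, which is exactly what convenience surveys cannot do.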

Comparability of Categories

There are no standards in defining, tracking, and reporting security incidents and attacks. For example, information is solicited about

  • "electronic attacks" (Australian Computer Crime and Security Survey)

  • "total number of electronic crimes or network, system, or data intrusions" and "unauthorized use of computer systems" (CSI/FBI)

  • "security incidents," "accidental security incidents," "malicious security incidents," and "serious security incidents" (Information Security Breaches Survey)

  • "any form of security breach" (Deloitte Global Security Survey)

  • "incidents that resulted in an unexpected or unscheduled outage of critical business systems" (Ernst and Young Global Information Security Survey)

Indeed, it is difficult to find two surveys whose results are strictly comparable. Not only are the data characterized differently, but the answers to many questions are based on opinion, interpretation, or perception, not on consistent capture and analysis of solid data.

Sources of Attack

Even the sources of attack are problematic. The Australian survey notes that the rate of insider attacks has remained constant, but the Deloitte survey suggests that the rate is rising within its population of financial institutions. There is some convergence of findings, however. Viruses, Trojan horses, worms, and malicious code pose consistent and serious threats, and most business sectors fear insider attacks and abuse of access. Most studies indicate that phishing is a new and growing threat.

Financial Impact

Many of the surveys capture information about effect as well as cause. A 2004 survey by ICSA Labs reports a 12 percent increase in "virus disasters" over 2003, and the time to recover lost or damaged data increased 25 percent. The cost of recovery exceeded $130,000 on average. By contrast, the Australian, Ernst and Young, and CSI/FBI surveys found a decrease in total damage from attacks. The nature of the losses varies, too; CSI/FBI reports that 25 percent of respondents experienced financial loss, and 56 percent experienced operational losses.

These differences may derive from the difficulty of detecting and measuring the direct and indirect effects of security breaches. There is no accepted definition of loss, and there are no standard methods for measuring it. Indeed, the ICSA 2004 study notes that "respondents in our survey historically underestimate costs by a factor of 7 to 10."

There is some consensus on the nature of the problems. Many surveys indicate that formal security policies and incident response plans are important. Lack of education and training appears to be a major obstacle to improvement. In general, a poor "security culture" (in terms of awareness and understanding of security issues and policies) is reported to be a problem. However, little quantitative evidence supports these views. Thus, in many ways, the surveys tell us more about what we do not know than about what we do know. Many organizations do not know how much they have invested in security protection, prevention, and mitigation. They do not have a clear strategy for making security investment decisions or evaluating the effectiveness of those decisions. The inputs required for good decision making, such as rates and severity of attacks, cost of damage and recovery, and cost of security measures of all types, are not known with any accuracy.

Conclusion

We can conclude only that these surveys are useful for anecdotal evidence. A security officer can point to a survey and observe that 62 percent of U.K. respondents reported a security incident at an average loss of £12,000. But management will rightly ask whether those figures are valid for other countries, what constitutes an incident, and whether its organization is vulnerable to those kinds of harm.

The convenience surveys are a good start, but for serious, useful analysis, we need statistically valid surveys administered to the same population over a period of time. In that way we can derive meaningful measures and trends. The surveys need to use common terminology and common ways to measure effect so that we can draw conclusions about past and likely losses. And ideally, comparable surveys will be administered in different countries to enable us to document geographical differences. Without these reliable data, economic modeling of cybersecurity is difficult.




Security in Computing, 4th Edition
ISBN: 0132390779
Year: 2006