Appendix D The Information System Security Engineering Professional (ISSEP) Certification

Overview

The ISSEP Certification is defined by (ISC)2 as the CISSP concentration area that is designed to denote competence and expertise in information security engineering.

To qualify for and obtain the ISSEP certification, the candidate must possess the CISSP credential, sit for and pass the ISSEP examination, and maintain the ISSEP credential in good standing.

The ISSEP examination is similar in format to that of the CISSP examination. The questions are multiple choice, with the examinee being asked to select the best answer of four possible answers. The examination comprises 150 questions, 25 of which are experimental questions that are not counted. The candidate is allotted three hours to complete the examination.

The ISSEP certification and examination cover the following four primary areas:

  • Systems security engineering - Focuses on applying the processes of the Information Assurance Technical Framework (IATF) to determine the information assurance needs of an organization, and then to design the corresponding systems in a manner consistent with those needs. Systems security engineering also includes understanding the system life cycle and the related information assurance requirements, defense in depth, and risk assessment methods.
  • Technical management - Concerned with system development models and associated security mechanisms.
  • Certification and accreditation - Details the Certification and Accreditation (C&A) processes.
  • United States Government information assurance (IA) regulations - Discusses the practices provided by the United States Government information assurance regulations.

The key concepts that ISSEP candidates need to understand in the four domains are summarized and reviewed in this appendix.

Systems Security Engineering

The Systems Security Engineering domain of the ISSEP concentration is designed to enable the candidate to apply the processes defined in the IATF document, Release 3.1, for the protection of information systems. It also includes the fundamentals of the Systems Engineering and Systems Security Engineering processes.

As stated in the ISSEP Study Guide, the ISSEP candidate is expected to be able to do the following after completing this Appendix:

  1. Describe the Information Systems Security Engineering (ISSE) process as documented in the IATF document 3.1
  2. Describe systems engineering processes in general and infer how security engineering integrates with these processes
  3. Explain the applicability of evaluated products and the various types of evaluation and evaluation processes
  4. Construct network architectures according to the principles of Defense in Depth
  5. Construct proper documentation for each phase of the ISSE process

In addition, the candidate should understand the fundamental concepts of risk assessment and the system life cycle or, as it is sometimes called, the system development life cycle (SDLC). These areas are covered in NIST Special Publication 800-30, “Risk Management Guide for Information Technology Systems”; NIST Special Publication 800-14, “Generally Accepted Principles and Practices for Securing Information Technology Systems”; NIST Special Publication 800-27, Rev. A, “Engineering Principles for Information Technology Security (A Baseline for Achieving Security)”; and NIST Draft SP 800-26, Rev. 1, “Guide for Information Security Program Assessments and System Reporting Form,” with its references and associated security control mappings to NIST SP 800-53. NIST SP 800-26 provides an information system security assessment checklist.

The Information Assurance Technical Framework

The IATF document is a product of the Information Assurance Technical Framework Forum (IATFF). The IATFF, sponsored by the National Security Agency (NSA), encourages and supports technical interchanges on the topic of information assurance among U.S. industry, U.S. academic institutions, and U.S. government agencies. Information on the IATFF can be found at its Web site, www.iatf.net.

The IATF document 3.1 provides a technical process for developing systems with inherent information assurance, emphasizing the criticality of the people involved, the operations required, and the technology needed to meet the organization’s mission. It also defines the information security requirements of the system hardware and software. Applying the process of document 3.1 results in a layered protection scheme known as Defense in Depth for critical information system components.

Principles of Defense in Depth

The strategy of Defense in Depth is aimed at protecting U.S. federal and defense information systems and networks from the various types and classes of attacks. The technology focus areas of the Defense in Depth strategy are:

  • Defending the network and infrastructure
  • Defending the enclave boundary
  • Defending the computing environment
  • Defending the supporting infrastructures

The second item in this list refers to an enclave. In DoD Directive 8500.1, “Information Assurance (IA),” October 24, 2002, an enclave is defined as a “collection of computing environments connected by one or more internal networks under the control of a single authority and security policy, including personnel and physical security. Enclaves always assume the highest mission assurance category and security classification of the Automated Information System (AIS) applications or outsourced IT-based processes they support, and derive their security needs from those systems. They provide standard IA capabilities such as boundary defense, incident detection and response, and key management, and also deliver common applications such as office automation and electronic mail. Enclaves are analogous to general support systems as defined in OMB A-130. Enclaves may be specific to an organization or a mission, and the computing environments may be organized by physical proximity or by function independent of location. Examples of enclaves include local area networks and the applications they host, backbone networks, and data processing centers.”

The Defense in Depth strategy promotes application of the following information assurance principles:

  • Defense in multiple places - Deployment of information protection mechanisms at multiple locations to protect against internal and external threats
  • Layered defenses - Deployment of multiple information protection and detection mechanisms so that an adversary or threat will have to negotiate multiple barriers to gain access to critical information
  • Security robustness - Based on the value of the information system component to be protected and the anticipated threats, estimation of the robustness of each information assurance component. Robustness is measured in terms of assurance and strength of the information assurance component.
  • Deploy KMI/PKI - Deployment of robust key management infrastructures (KMI) and public-key infrastructures (PKI)
  • Deploy intrusion detection systems - Deployment of intrusion detection mechanisms to detect intrusions, evaluate information, examine results, and, if necessary, to take action
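
To make the layered-defenses principle more concrete, the following sketch (purely illustrative Python) treats the four technology focus areas as layers of protection mechanisms and checks that a critical asset sits behind barriers in more than one place. The layer contents, the asset name, and the two-barrier threshold are assumptions made for the example; they are not taken from the IATF.

```python
# Illustrative sketch only: models the "layered defenses" principle by requiring
# that a critical asset sit behind barriers in more than one focus area.
# Layer contents, the asset, and the threshold are hypothetical examples.

LAYERS = {
    "network_and_infrastructure": ["backbone encryption", "router ACLs"],
    "enclave_boundary": ["firewall", "VPN gateway", "IDS sensor"],
    "computing_environment": ["host hardening", "antivirus", "file permissions"],
    "supporting_infrastructure": ["PKI", "key management", "audit services"],
}

def barriers_protecting(asset_layers):
    """Count the protection mechanisms an adversary must defeat to reach the asset."""
    return sum(len(LAYERS[layer]) for layer in asset_layers)

def meets_layered_defense(asset_layers, minimum_barriers=2):
    """True if the asset is defended in multiple places by multiple mechanisms."""
    return len(asset_layers) >= 2 and barriers_protecting(asset_layers) >= minimum_barriers

if __name__ == "__main__":
    payroll_database = ["enclave_boundary", "computing_environment"]
    print(meets_layered_defense(payroll_database))  # True: two layers, several barriers
```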

Types and Classes of Attack

IATF document 3.1 lists the following types of attacks:

  • Passive
  • Active
  • Close-in
  • Insider
  • Distribution

These attacks and their characteristics, taken from IATF document 3.1, September 2002, are given in Table D-1.

Table D-1: Classes of Attack

  • Passive - Passive attacks include traffic analysis, monitoring of unprotected communications, decrypting weakly encrypted traffic, and capture of authentication information (such as passwords). Passive intercept of network operations can give adversaries indications and warnings of impending actions. Passive attacks can result in disclosure of information or data files to an attacker without the consent or knowledge of the user. Examples include the disclosure of personal information such as credit card numbers and medical files.
  • Active - Active attacks include attempts to circumvent or break protection features, introduce malicious code, or steal or modify information. These attacks may be mounted against a network backbone, exploit information in transit, electronically penetrate an enclave, or attack an authorized remote user during an attempt to connect to an enclave. Active attacks can result in the disclosure or dissemination of data files, denial of service, or modification of data.
  • Close-In - Close-in attack consists of individuals attaining close physical proximity to networks, systems, or facilities for the purpose of modifying, gathering, or denying access to information. Close physical proximity is achieved through surreptitious entry, open access, or both.
  • Insider - Insider attacks can be malicious or nonmalicious. Malicious insiders intentionally eavesdrop, steal, or damage information; use information in a fraudulent manner; or deny access to other authorized users. Nonmalicious attacks typically result from carelessness, lack of knowledge, or intentional circumvention of security for such reasons as “getting the job done.”
  • Distribution - Distribution attacks focus on the malicious modification of hardware or software at the factory or during distribution. These attacks can introduce malicious code into a product, such as a back door to gain unauthorized access to information or a system function at a later date.

The enclaves in the U.S. federal and defense computing environments can be categorized as:

  • Classified
  • Private
  • Public

The attacks categorized in Table D-1 are the types that can be perpetrated on the computing environment enclaves. The relationships of the classes of attacks to computing environment enclaves are depicted in Figure D-1.

Figure D-1: Relationships of the classes of attacks to computing environment enclaves (from IATF document, Release 3.1, September 2002).

The Defense in Depth Strategy

The Defense in Depth strategy is built upon three critical elements: people, technology, and operations.

People

To implement effective information assurance in an organization, there must be a high-level commitment from management to the process. This commitment is manifested through the following items and activities:

  • Development of information assurance policies and procedures
  • Assignment of roles and responsibilities
  • Training of critical personnel
  • Enforcement of personal accountability
  • Commitment of resources
  • Establishment of physical security controls
  • Establishment of personnel security controls
  • Penalties associated with unauthorized behavior

Technology

An organization has to ensure that the proper technologies are acquired and deployed to implement the required information protection services. These objectives are accomplished through processes and policies for the acquisition of technology. The processes and policies should include the following items:

  • A security policy
  • System-level information assurance architectures
  • System-level information assurance standards
  • Information assurance principles
  • Specification criteria for the required information assurance products
  • Acquisition of reliable, third-party, validated products
  • Configuration recommendations
  • Risk assessment processes for the integrated systems

Operations

Operations emphasize the activities and items that are necessary to maintain an organization’s effective security posture on a day-to-day basis. These activities and items include:

  • A visible and up-to-date security policy
  • Enforcement of the information security policy
  • Certification and accreditation
  • Information security posture management
  • Key management services
  • Readiness assessments
  • Protection of the infrastructure
  • Performing systems security assessments
  • Monitoring and reacting to threats
  • Attack sensing, warning, and response (ASW&R)
  • Recovery and reconstitution

The Defense in Depth approach has become widely accepted and has been incorporated into a number of federal and DoD policies and guidelines. One example is the DoD Global Information Grid (GIG) Information Assurance Policy and Implementation Guidance (www.c3i.osd.mil/org/cio/doc/gigia061600.pdf). Figure D-2 illustrates the embodiment of the Defense in Depth strategy as shown in the GIG Policy and Implementation Guidance.

Figure D-2: Defense in Depth as applied in the GIG Information Assurance Policy and Implementation Guidance (from IATF document, Release 3.1, September 2002).

Sample U.S. Government User Environments

The target systems of a Defense in Depth strategy can be put in perspective by examining two U.S. government computing environments - the Department of Energy (DoE) and Department of Defense information systems.

The DoE interconnects its laboratories and other facilities through wide area networks (WANs), including the Energy Science Network (ESN). ESN supports classified and unclassified DoE networking for research and mission-critical applications. The DoE computing environment is shown in Figure D-3.

Figure D-3: The DoE computing environment (from IATF document, Release 3.1, September 2002).

The DoD Defense Information Infrastructure (DII) provides networking and information services to more than 2 million primary users and 2 million extension users. The DII enclaves typically comprise more than 20,000 local networks and 300,000 secure telephone users. The DII also includes worldwide networks such as the Joint Worldwide Intelligence Communications System (JWICS), the Secret Internet Protocol Router Network (SIPRNet), and the Non-secure Internet Protocol Router Network (NIPRNet). An example DII site is shown in Figure D-4.

Figure D-4: A typical DII site (from IATF document, Release 3.1, September 2002).

Systems Engineering and Systems Security Engineering Processes

Information system security engineering should be conducted in parallel with and according to the proven principles of systems engineering (SE). Even though the terms ISSE and SE are commonly used, it is important to formally define these two concepts so that there is no misunderstanding of their meaning. In Chapter 3 of IATF document 3.1, ISSE is defined as:

The art and science of discovering users’ information protection needs and then designing and making information systems, with economy and elegance, so that they can safely resist the forces to which they may be subjected.

Systems engineering has been defined in numerous ways. Four such definitions are:

  • The branch of engineering concerned with the development of large and complex systems, where a system is understood to be an assembly or combination of interrelated elements or parts working together toward a common objective. (This is a commonly accepted definition of systems engineering.)
  • The branch of engineering, based on systems analysis and information theory, concerned with the design of integrated systems. (From the Collins English Dictionary [Harper Collins, 2000].)
  • The selective application of scientific and engineering efforts to:

    • Transform an operational need into a description of the system configuration that best satisfies the operational need according to the measures of effectiveness
    • Integrate related technical parameters and ensure compatibility of all physical, functional, and technical program interfaces in a manner that optimizes the total system definition and design
    • Integrate the efforts of all engineering disciplines and specialties into the total engineering effort
    (This definition is adapted from the “Systems Engineering Capability Model” [SE-CMM-95-01] document, version 1.1, from the Carnegie Mellon Software Engineering Institute [SEI]. The definition in the SE-CMM was taken from the U.S. Army Field Manual 770-78.)
  • The discipline that integrates all the disciplines and specialty groups into a team effort forming a structured development process that proceeds from concept to production to operation. Systems Engineering considers both the business and the technical needs of all customers with the goal of providing a quality product that meets the user needs. (This definition is from The International Council on Systems Engineering [INCOSE], www.incose.org.)

The Systems Engineering Process

The following description of the Systems Engineering process is taken from Chapter 3 of the IATF document 3.1. This generic SE process will be used as the basis for integration with the ISSE process and comprises the following components:

  • Discover needs
  • Define system requirements
  • Design system architecture
  • Develop detailed design
  • Implement system
  • Assess effectiveness

An important characteristic of this process is that it emphasizes the application of SE over the entire development life cycle. Figure D-5 illustrates the IATF generic SE process; the arrows show the information flow among activities in the process. The notation of USERS/USERS’ REPRESENTATIVES in the figure is included to emphasize the interaction among the users and the systems engineer throughout the SE process.

Figure D-5: The generic systems engineering process (from IATF document, Release 3.1, September 2002).

A good systems or information systems security engineer will always keep the problem to be solved in perspective. Some rules of thumb to remember are that the purpose of a project is to meet the customer’s needs in his or her environment, the problem and solution spaces should be kept separate, and the solution space must be determined by the problem space. This approach relates to the simple but elegant definitions of verification and validation stated by Barry Boehm. He offered that “verification is doing the job right and validation is doing the right job.”

The Information Systems Security Engineering Process

The ISSE process mirrors the generic SE process of IATF document 3.1. The ISSE process elements and the associated SE process elements, respectively, are:

  • Discover Information Protection Needs - Discover Needs
  • Define System Security Requirements - Define System Requirements
  • Design System Security Architecture - Design System Architecture
  • Develop Detailed Security Design - Develop Detailed Design
  • Implement System Security - Implement System
  • Assess Information Protection Effectiveness - Assess Effectiveness

Each of the six ISSE process activities will be reviewed in the following sections.

Discover Information Protection Needs

The information systems security engineer can obtain a portion of the information required for this activity from the SE process. The objectives of this activity are to understand and document the customer’s needs and to develop solutions that will meet these needs. This approach is illustrated in Figure D-6.

Figure D-6: Discover Information Protection Needs activity (from IATF document, Release 3.1, September 2002).

The information systems security engineer should use any reliable sources of information to learn about the customer’s mission and business operations, including areas such as human resources, finance, command and control, engineering, logistics, and research and development. This knowledge can be used to generate a concept of operations (CONOPS) document or a mission needs statement (MNS). Then, with this information in hand, an information management model (IMM) should be developed that ultimately defines a number of information domains. Information management is defined as:

  • Creating information
  • Acquiring information
  • Processing information
  • Storing and retrieving information
  • Transferring information
  • Deleting information

The principle of least privilege should be used in developing the model by permitting users to access only the information required for them to accomplish their assigned tasks.

The IMM is illustrated in Figure D-7.

Figure D-7: Graphic of the information management model (from IATF document, Release 3.1, Appendix H, September 2002).
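
The least-privilege rule described above can be made concrete with a short sketch. The following Python fragment is purely illustrative: the information domain names, user roles, and access rule are assumptions for the example and are not part of the IATF information management model.

```python
# Illustrative sketch of least privilege over information domains.
# Domain and user names are hypothetical; the IMM itself is a modeling
# activity, not a piece of software.

DOMAIN_MEMBERS = {
    "personnel_records": {"hr_clerk", "hr_manager"},
    "logistics_plans": {"logistics_officer"},
    "finance_ledgers": {"comptroller", "auditor"},
}

def may_access(user: str, domain: str) -> bool:
    """Grant access only if the user is assigned to the information domain."""
    return user in DOMAIN_MEMBERS.get(domain, set())

if __name__ == "__main__":
    print(may_access("hr_clerk", "personnel_records"))  # True
    print(may_access("hr_clerk", "finance_ledgers"))    # False: not needed for the role
```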

In the Discover Information Protection Needs activity of the ISSE process, the information systems security engineer must document all elements of the activity. These elements include:

  • Roles
  • Responsibilities
  • Threats
  • Strengths
  • Security services
  • Priorities
  • Design constraints

These items form the basis of an Information Protection Policy (IPP), which in turn becomes a component of the customer’s Information Management Policy (IMP), as shown earlier in Figure D-6.

The information systems security engineer must also support the certification and accreditation (C&A) of the system. For example, the security engineer can identify the Designated Approving Authority (DAA) and the Certification Authority (CA). A detailed discussion of C&A is given in Chapter 11.

Define System Security Requirements

In this ISSE activity, the information systems security engineer identifies one or more solution sets that can satisfy the information protection needs of the IPP. This subprocess is illustrated in Figure D-8.

Figure D-8: Mapping of solution sets to information protection needs.

In selecting a solution set, the information systems security engineer must also consider the needs of external systems such as Public Key Infrastructure (PKI) or other cryptographic-related systems, as shown in Figure D-9.

Figure D-9: Mapping of needs to solution set components.

A solution set consists of a preliminary security CONOPS, the system context, and the system security requirements. In close cooperation with the customer and based on the IPP, the information systems security engineer selects the best solution among the solution sets. The preliminary security CONOPS delineates the information protection functions and the information management functions and identifies the dependencies between the organization’s mission and the services provided by other entities. In developing the system context, the information systems security engineer uses systems engineering techniques to identify the boundaries of the system to be protected and allocates security functions to this system as well as to external systems. The information systems security engineer accomplishes this allocation by analyzing the flow of data between the system to be protected and the external systems and by using the information compiled in the IPP and IMM.

The third component of the solution set - the system security requirements - is generated by the information systems security engineer in collaboration with the systems engineers. Requirements should be unambiguous, comprehensive, and concise, and they should be obtained through the process of requirements analysis. The functional requirements and constraints on the design of the information security components include regulations, the operating environment, internal as well as external threats, and customer needs.

At the end of this process, the information systems security engineer reviews the security CONOPS, the security context, and the system security requirements with the customer to ensure that they meet the needs of the customer and are accepted by the customer. As with all activities in the ISSE process, documentation is very important and should be generated in accordance with the C&A requirements.

Design System Security Architecture

The requirements generated in the Define System Security Requirements activity of the ISSE process are necessarily stated in functional terms - indicating what is needed but not how to accomplish what is needed. In Design System Security Architecture, the information systems security engineer performs a functional decomposition of the requirements that can be used to select the components required to implement the designated functions. Some aids that are used to implement the functional decomposition are timeline analyses, flow block diagrams, and a requirements allocation sheet. The result of the functional decomposition is the functional architecture of the information security systems, shown schematically in Figure D-10.

Figure D-10: Design system security architecture.

In the decomposition process, the performance requirements at the higher level are mapped onto the lower-level functions to ensure that the resulting system performs as required. Also as part of this activity, the information systems security engineer determines, at a functional level, the security services that should be assigned to the system to be protected as well as to external systems. Such services include encryption, key management, and digital signatures. Because implementations are not specified in this activity, a complete risk analysis is not possible. General risk analysis, however, can be done by estimating the vulnerabilities in the classes of components that are likely to be used.
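
One way to keep track of this allocation is a simple requirements allocation sheet. The sketch below is a hypothetical Python illustration; the security services and the architecture elements named are assumptions for the example, not items taken from the IATF.

```python
# Hypothetical requirements allocation sheet: each high-level security service
# is decomposed and allocated to lower-level architecture elements.

ALLOCATION = {
    "confidentiality of data in transit": {
        "service": "encryption",
        "allocated_to": ["VPN gateway", "link encryptor"],
    },
    "non-repudiation of orders": {
        "service": "digital signature",
        "allocated_to": ["messaging application", "PKI certificate service"],
    },
    "cryptographic key handling": {
        "service": "key management",
        "allocated_to": ["key management infrastructure (external system)"],
    },
}

def unallocated(requirements):
    """Return requirements that have not yet been allocated to any element."""
    return [name for name, entry in requirements.items() if not entry["allocated_to"]]

if __name__ == "__main__":
    print(unallocated(ALLOCATION))  # [] -- every functional requirement is allocated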

As always, documentation in accordance with requirements of the C&A process should be performed.

Develop Detailed Security Design

The information protection design is achieved through continuous assessments of risks and the comparison of these risks with the information system security requirements by the ISSE personnel. The design activity is iterative, and it involves both the SE and ISSE professionals. The design documentation should meet the requirements of the C&A process. It should be noted that this activity specifies the system and components but does not specify products or vendors.

The tasks performed by the information systems security engineer include:

  • Mapping security mechanisms to system security design elements
  • Cataloging candidate commercial off-the-shelf (COTS) products
  • Cataloging candidate government off-the-shelf (GOTS) products
  • Cataloging custom security products
  • Qualifying external and internal element and system interfaces
  • Developing specifications such as Common Criteria protection profiles

Implement System Security

This activity moves the system from the design phase to the operational phase. The steps in this process are shown in Figure D-11.

Figure D-11: The path from design to operations in the Implement System Security activity.

The Implement System Security activity concludes with a system effectiveness assessment that produces evidence that the system meets the requirements and needs of the mission. Security accreditation usually follows this assessment.

The assessment is accomplished through the following actions of the information systems security engineer:

  • Verifying that the implemented system does address and protect against the threats itemized in the original threat assessment
  • Providing inputs to the C&A process
  • Applying information protection assurance mechanisms related to system implementation and testing
  • Providing inputs to and reviewing the evolving system life cycle support plans
  • Providing inputs to and reviewing the operational procedures
  • Providing inputs to and reviewing the maintenance training materials
  • Taking part in multidisciplinary examinations of all system issues and concerns

An important part of the Implement System Security activity is the determination of the specific components of the information system security solution. Some of the factors that have to be considered in selecting the components include:

  • Availability now and in the future
  • Cost
  • Form factor
  • Reliability
  • Risk to system caused by substandard performance
  • Conformance to design specifications
  • Compatibility with existing components
  • Meeting or exceeding evaluation criteria (typical evaluation criteria include the Commercial COMSEC Evaluation Program [CCEP], National Information Assurance Partnership [NIAP], Federal Information Processing Standards [FIPS], NSA criteria, and NIST criteria)
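
One common way to weigh these factors against candidate components is a weighted scoring sheet, as in the hedged sketch below. The factor weights, the 0-5 scores, and the candidate products are invented for illustration only.

```python
# Illustrative weighted comparison of candidate components against selection factors.
# Weights and 0-5 scores are invented for the example.

WEIGHTS = {
    "availability": 0.15,
    "cost": 0.20,
    "reliability": 0.25,
    "conformance_to_design": 0.25,
    "evaluation_criteria": 0.15,   # e.g., validation status against NIAP/FIPS criteria
}

CANDIDATES = {
    "COTS firewall A": {"availability": 5, "cost": 3, "reliability": 4,
                        "conformance_to_design": 4, "evaluation_criteria": 5},
    "GOTS guard B":    {"availability": 3, "cost": 4, "reliability": 5,
                        "conformance_to_design": 5, "evaluation_criteria": 4},
}

def weighted_score(scores):
    """Combine factor scores into a single weighted value."""
    return sum(WEIGHTS[factor] * value for factor, value in scores.items())

if __name__ == "__main__":
    for name, scores in CANDIDATES.items():
        print(f"{name}: {weighted_score(scores):.2f}")
```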

Assess Information Protection Effectiveness

To assess the effectiveness of the information protection mechanisms and services, this activity must be conducted as part of all the activities of the complete ISSE and SE process. Table D-2, with information taken from the IATF document, Release 3.1, September 2002, lists the tasks of the Assess Information Protection Effectiveness activity that correspond to the other activities of the ISSE process.

Table D-2: Assess Information Protection Effectiveness Tasks and Corresponding ISSE Activities

Discover Information Protection Needs
  • Present the process overview.
  • Summarize the information model.
  • Describe threats to the mission or business through information attacks.
  • Establish security services to counter those threats and identify their relative importance to the customer.
  • Obtain customer agreement on the conclusions of this activity as a basis for determining the system security effectiveness.

Define System Security Requirements
  • Ensure that the selected solution set meets the mission or business security needs.
  • Coordinate the system boundaries.
  • Present security context, security CONOPS, and system security requirements to the customer and gain customer concurrence.
  • Ensure that the projected security risks are acceptable to the customer.

Design System Security Architecture
  • Begin the formal risk analysis process to ensure that the selected security mechanisms provide the required security services and explain to the customer how the security architecture meets the security requirements.

Develop Detailed Security Design
  • Review how well the selected security services and mechanisms counter the threats by performing an interdependency analysis to compare desired to effective security service strengths.
  • Once completed, the risk assessment results, particularly any mitigation needs and residual risk, will be documented and shared with the customer to obtain the customer’s concurrence.

Implement System Security
  • The risk analysis will be conducted/updated.
  • Strategies will be developed for the mitigation of identified risks.
  • Identify possible mission impacts and advise the customer and the customer’s Certifiers and Accreditors.

Summary Showing the Correspondence of the SE and ISSE Activities

As discussed in the descriptions of the SE and ISSE processes, there is a one-to-one correspondence between activities in the ISSE process and those in the SE process. Table D-3, taken from the IATF document, Release 3.1, September 2002, summarizes the activities in the ISSE process that correspond to activities in the SE process.

Table D-3: Corresponding SE and ISSE Activities

Discover Needs (SE) - Discover Information Protection Needs (ISSE)
  • SE: The systems engineer helps the customer understand and document the information management needs that support the business or mission. Statements about information needs may be captured in an information management model (IMM).
  • ISSE: The information systems security engineer helps the customer understand the information protection needs that support the mission or business. Statements about information protection needs may be captured in an Information Protection Policy (IPP).

Define System Requirements (SE) - Define System Security Requirements (ISSE)
  • SE: The systems engineer allocates identified needs to systems. A system context is developed to identify the system environment and to show the allocation of system functions to that environment. A preliminary system concept of operations (CONOPS) is written to describe operational aspects of the candidate system (or systems). Baseline requirements are established.
  • ISSE: The information systems security engineer allocates information protection needs to systems. A system security context, a preliminary system security CONOPS, and baseline security requirements are developed.

Design System Architecture (SE) - Design System Security Architecture (ISSE)
  • SE: The systems engineer performs functional analysis and allocation by analyzing candidate architectures, allocating requirements, and selecting mechanisms. The systems engineer identifies components, or elements, allocates functions to those elements, and describes the relationships between the elements.
  • ISSE: The information systems security engineer works with the systems engineer in the areas of functional analysis and allocation by analyzing candidate architectures, allocating security services, and selecting security mechanisms. The information systems security engineer identifies components, or elements, allocates security functions to those elements, and describes the relationships between the elements.

Develop Detailed Design (SE) - Develop Detailed Security Design (ISSE)
  • SE: The systems engineer analyzes design constraints, analyzes trade-offs, does detailed system design, and considers life cycle support. The systems engineer traces all the system requirements to the elements until all are addressed. The final detailed design results in component and interface specifications that provide sufficient information for acquisition when the system is implemented.
  • ISSE: The information systems security engineer analyzes design constraints, analyzes trade-offs, does detailed system and security design, and considers life cycle support. The information systems security engineer traces all the system security requirements to the elements until all are addressed. The final detailed security design results in component and interface specifications that provide sufficient information for acquisition when the system is implemented.

Implement System (SE) - Implement System Security (ISSE)
  • SE: The systems engineer moves the system from specifications to the tangible. The main activities are acquisition, integration, configuration, testing, documentation, and training. Components are tested and evaluated to ensure that they meet the specifications. After successful testing, the individual components - hardware, software, and firmware - are integrated, properly configured, and tested as a system.
  • ISSE: The information systems security engineer participates in a multidisciplinary examination of all system issues and provides inputs to C&A process activities, such as verification that the system as implemented protects against the threats identified in the original threat assessment; tracking of information protection assurance mechanisms related to system implementation and testing practices; and providing inputs to system life cycle support plans, operational procedures, and maintenance training materials.

Assess Effectiveness (SE) - Assess Information Protection Effectiveness (ISSE)
  • SE: The results of each activity are evaluated to ensure that the system will meet the users’ needs by performing the required functions to the required quality standard in the intended environment. The systems engineer examines how well the system meets the needs of the mission.
  • ISSE: The information systems security engineer focuses on the effectiveness of the information protection - whether the system can provide the confidentiality, integrity, availability, authentication, and nonrepudiation for the information it is processing that is required for mission success.

ISSE and Its Relationship to C&A Processes

The ISSE process provides input to the C&A process in the form of evidence and documentation. Thus, the information systems security engineer has to consider the requirements of the accrediting authority. The Certification and Accreditation Process certifies that the information system meets the defined system security requirements and the system assurance requirements. It is not a design process. Details of Certification and Accreditation are presented in Chapter 11 of this text. The SE/ISSE process also benefits by receiving information back from the C&A process that may result in modifications to the SE/ISSE process activities. Figure D-12 illustrates the relationship of the SE/ISSE process to the C&A process.

Figure D-12: Relationship of the SE/ISSE process to the C&A process (from IATF document, Release 3.1, September 2002).

In summary, the outputs of the SE/ISSE process are the implementation of the system and the corresponding system documentation. The outputs of the C&A process are Certification documentation, Certification recommendations, and an Accreditation decision.

Another means of specifying information system assurance requirements is through Common Criteria protection profiles. Protection profiles, which are independent of implementation, comprise:

  • Security-related functional requirements
  • Security objectives
  • Information assurance requirements
  • Assumptions
  • Rationale
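
For illustration only, the protection profile contents listed above can be pictured as a structured record. The sketch below is a minimal, hypothetical representation in Python; it is not a Common Criteria format, and the sample requirement identifiers are just examples.

```python
# Minimal, hypothetical representation of a Common Criteria protection profile.
from dataclasses import dataclass, field

@dataclass
class ProtectionProfile:
    name: str
    security_objectives: list = field(default_factory=list)
    functional_requirements: list = field(default_factory=list)   # e.g., SFR identifiers
    assurance_requirements: list = field(default_factory=list)    # e.g., an EAL package
    assumptions: list = field(default_factory=list)
    rationale: str = ""

if __name__ == "__main__":
    pp = ProtectionProfile(
        name="Example boundary firewall PP",
        security_objectives=["Mediate all traffic crossing the enclave boundary"],
        functional_requirements=["FDP_IFC.1", "FAU_GEN.1"],
        assurance_requirements=["EAL2 assurance package"],
        assumptions=["Administrators are trusted and trained"],
        rationale="Objectives trace to the identified threats.",
    )
    print(pp.name, "-", len(pp.functional_requirements), "functional requirements")
```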

Many protection profiles are available on the IATF Web site at www.iatf.net/protection_profiles/. Protection profiles that are provided include:

  • Firewalls
  • Switches and routers
  • Mobile code
  • Biometrics
  • Certificate management

Implementing Information Assurance in the System Life Cycle

The documents providing the basis for material in this section are NIST SP 800-14, “Generally Accepted Principles and Practices for Securing Information Technology Systems,” September 1996; NIST SP 800-27, “Engineering Principles for Information Technology Security (A Baseline for Achieving Security),” June 2001; and NIST SP 800-64, “Security Considerations in the Information System Development Life Cycle,” September-October 2003. In some publications, the System Life Cycle is also referred to as the System Development Life Cycle (SDLC).

The document SP 800-14 defines eight system security principles and 14 practices. SP 800-27 develops another set of 33 engineering principles for information technology security (EP-ITS) that provide a system-level perspective of information system security. These 33 principles incorporate the concepts developed in the eight principles and 14 practices detailed in SP 800-14. With this foundation, the five system life cycle phases are then defined, and each of the 33 EP-ITS principles is mapped onto the life cycle phases as applicable. These principles and practices are also presented in NIST SP 800-14 in a checklist form that can be used by federal agencies for self-evaluation. NIST SP 800-64 details a framework for incorporating information systems security into all the phases of the SDLC activity, using cost-effective control measures.

The System Life Cycle Phases

NIST SP 800-14 defines the system life cycle phases as follows:

  • Initiation - The need for the system and its purpose are documented. A sensitivity assessment is conducted as part of this phase. A sensitivity assessment evaluates the sensitivity of the IT system and the information to be processed.
  • Development/Acquisition - This phase comprises the system acquisition and development cycles. In this phase, the system is designed, developed, programmed, and acquired.
  • Implementation - Installation, testing, security testing, and accreditation are conducted.
  • Operation/Maintenance - The system performs its designed functions. This phase includes security operations, modification/addition of hardware and/or software, administration, operational assurance, monitoring, and audits.
  • Disposal - This phase involves disposition of system components and products, such as hardware, software, and information; disk sanitization; archiving files; and moving equipment.

The EP-ITS principles can be applied during each phase of the system life cycle. Some principles are critical to certain phases, whereas others can be considered optional or not necessary. SP 800-64 complements SP 800-14 and 800-27 and expands on the SDLC concepts presented in these two publications.

The following list summarizes the information system security steps to be applied to the SDLC as described in SP 800-64.

  • An organization will use the general SDLC described in this document or will have developed a tailored SDLC that meets its specific needs. In either case, NIST recommends that organizations incorporate the associated IT security steps of this general SDLC into their development process:
  • Initiation Phase:

    • Security Categorization - Defines three levels (low, moderate, or high) of potential impact on organizations or individuals should there be a breach of security (a loss of confidentiality, integrity, or availability). Security categorization standards assist organizations in making the appropriate selection of security controls for their information systems (see the illustrative sketch following this list).
    • Preliminary Risk Assessment - Results in an initial description of the basic security needs of the system. A preliminary risk assessment should define the threat environment in which the system will operate.
  • Acquisition/Development Phase:

    • Risk Assessment - Analysis that identifies the protection requirements for the system through a formal risk assessment process. This analysis builds on the initial risk assessment performed during the Initiation phase but will be more in-depth and specific.
    • Security Functional Requirements Analysis - Analysis of requirements that may include the following components: (1) system security environment (that is, enterprise information security policy and enterprise security architecture) and (2) security functional requirements.
    • Security Assurance Requirements Analysis - Analysis of requirements that address the developmental activities required and assurance evidence needed to produce the desired level of confidence that the information security will work correctly and effectively. The analysis, based on legal and functional security requirements, will be used as the basis for determining how much and what kinds of assurance are required.
    • Cost Considerations and Reporting - Determines how much of the development cost can be attributed to information security over the life cycle of the system. These costs include hardware, software, personnel, and training.
    • Security Planning - Ensures that agreed-upon security controls, planned or in place, are fully documented. The security plan also provides a complete characterization or description of the information system as well as attachments or references to key documents supporting the agency’s information security program (e.g., configuration management plan, contingency plan, incident response plan, security awareness and training plan, rules of behavior, risk assessment, security test and evaluation results, system interconnection agreements, security authorizations/accreditations, and plan of action and milestones).
    • Security Control Development - Ensures that security controls described in the respective security plans are designed, developed, and implemented. For information systems currently in operation, the security plans for those systems may call for the development of additional security controls to supplement the controls already in place or the modification of selected controls that are deemed to be less than effective.
    • Developmental Security Test and Evaluation - Ensures that security controls developed for a new information system are working properly and are effective. Some types of security controls (primarily those controls of a nontechnical nature) cannot be tested and evaluated until the information system is deployed - these controls are typically management and operational controls.
    • Other Planning Components - Ensures that all necessary components of the development process are considered when incorporating security into the life cycle. These components include selection of the appropriate contract type, participation by all necessary functional groups within an organization, participation by the certifier and accreditor, and development and execution of necessary contracting plans and processes.
  • Implementation Phase:

    • Inspection and Acceptance - Ensures that the organization validates and verifies that the functionality described in the specification is included in the deliverables.
    • Security Control Integration - Ensures that security controls are integrated at the operational site where the information system is to be deployed for operation. Security control settings and switches are enabled in accordance with vendor instructions and available security implementation guidance.
    • Security Certification - Ensures that the controls are effectively implemented through established verification techniques and procedures and gives organization officials confidence that the appropriate safeguards and countermeasures are in place to protect the organization’s information system. Security certification also uncovers and describes the known vulnerabilities in the information system.
    • Security Accreditation - Provides the necessary security authorization of an information system to process, store, or transmit information that is required. This authorization is granted by a senior organization official and is based on the verified effectiveness of security controls to some agreed-upon level of assurance and an identified residual risk to agency assets or operations.
  • Operations/Maintenance Phase:

    • Configuration Management and Control - Ensures adequate consideration of the potential security impacts resulting from specific changes to an information system or its surrounding environment. Configuration management and configuration control procedures are critical to establishing an initial baseline of hardware, software, and firmware components for the information system and subsequently controlling and maintaining an accurate inventory of any changes to the system.
    • Continuous Monitoring - Ensures that controls continue to be effective in their application through periodic testing and evaluation. Security control monitoring (that is, verifying the continued effectiveness of those controls over time) and reporting the security status of the information system to appropriate agency officials is an essential activity of a comprehensive information security program.
  • Disposition Phase:

    • Information Preservation - Ensures that information is retained, as necessary, to conform to current legal requirements and to accommodate future technology changes that may render the retrieval method obsolete.
    • Media Sanitization - Ensures that data is deleted, erased, and written over as necessary.
    • Hardware and Software Disposal - Ensures that hardware and software is disposed of as directed by the information system security officer.
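
As an illustration of the Security Categorization step in the Initiation phase above, the sketch below applies a simple high-water-mark rule in the spirit of FIPS 199: the overall impact level is the highest of the impact levels assessed for confidentiality, integrity, and availability. The example system and its ratings are assumptions made for the sketch.

```python
# Illustrative security categorization sketch in the spirit of FIPS 199:
# take the highest ("high-water mark") potential impact across
# confidentiality, integrity, and availability. Ratings are examples only.

LEVELS = {"low": 1, "moderate": 2, "high": 3}

def categorize(confidentiality: str, integrity: str, availability: str) -> str:
    """Return the overall impact level as the maximum of the three impact ratings."""
    ratings = (confidentiality, integrity, availability)
    return max(ratings, key=lambda level: LEVELS[level])

if __name__ == "__main__":
    # Hypothetical system: moderate confidentiality, low integrity, low availability
    print(categorize("moderate", "low", "low"))  # "moderate"
```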

After discussing these phases and the information security steps in detail, the guide provides specifications, tasks, and clauses that can be used in an RFP to acquire information security features, procedures, and assurances.

The ISSEP candidate should also understand the relationship between the SDLC phases and the acquisition process for the corresponding information system. This relationship is illustrated in Table D-4, also taken from NIST SP 800-64.

Table D-4: Relationship between Information Systems Acquisition Cycle Phases and the SDLC Phases

ACQUISITION CYCLE PHASE - CORRESPONDING SDLC PHASE
  • Mission and Business Planning - Initiation
  • Acquisition Planning - Acquisition/Development
  • Acquisition - Implementation
  • Contract Performance - Operation/Maintenance
  • Disposal and Contract Close-Out - Disposition

NIST SP 800-64 also defines the following acquisition-related terms:

  • Acquisition includes all stages of the process of acquiring property or services, beginning with the process for determining the need for the property or services and ending with contract completion and closeout.
  • The acquisition initiator is the key person who represents the program office in formulating information technology requirements and managing presolicitation activities.
  • The acquisition technical evaluation is a component of the selection process and is defined as the examination of proposals to determine technical acceptability and merit.

An additional valuable tool in the acquisition process is the spiral model of the acquisition management process. This approach is known as an evolutionary acquisition strategy. This model depicts the acquisition management process as a set of phases and decision points in a circular representation. The model illustrates the concept that a mission need is defined and translated into a solution that undergoes a continuous circle of improvement and evolution until it is no longer required.

NIST SP 800-64 also lists the key personnel associated with system acquisition and development as follows:

  • Chief information officer (CIO) - The CIO is responsible for the organization’s information system planning, budgeting, investment, performance and acquisition. As such, the CIO provides advice and assistance to senior organization personnel in acquiring the most efficient and effective information system to fit the organization’s enterprise architecture.
  • Contracting officer - The contracting officer is the person who has the authority to enter into, administer, or terminate contracts and make related determinations and findings.
  • Contracting officer’s technical representative (COTR) - The COTR is a qualified employee appointed by the contracting officer to act as his or her technical representative in managing the technical aspects of a particular contract.
  • Information Technology Investment Board (or equivalent) - The Information Technology (IT) Investment Board, or its equivalent, is responsible for managing the capital planning and investment control process defined by the Clinger-Cohen Act of 1996 (Section 5).
  • Information security program manager - The information security program manager is responsible for developing enterprise standards for information security. This individual plays a leading role in introducing an appropriate, structured methodology to help identify, evaluate, and minimize information security risks to the organization. Information security program managers coordinate and perform system risk analyses, analyze risk mitigation alternatives, and build the business case for the acquisition of appropriate security solutions that help ensure mission accomplishment in the face of real-world threats. They also support senior management in ensuring that security management activities are conducted as required to meet the organization’s needs.
  • Information system security officer - The information system security officer is responsible for ensuring the security of an information system throughout its life cycle.
  • Program manager (owner of data)/acquisition initiator/program official - This person represents programmatic interests during the acquisition process. The program manager, who has been involved in strategic planning initiatives of the acquisition, plays an essential role in security and is, ideally, intimately aware of functional system requirements.
  • Privacy officer - The privacy officer is responsible for ensuring that the services or system being procured meet existing privacy policies regarding protection, dissemination (information sharing and exchange), and information disclosure.
  • Legal advisor/contract attorney - This individual is responsible for advising the team on legal issues during the acquisition process.

ISSEP candidates who are interested in additional information contained in NIST SP 800-64 can obtain the document from the NIST Web site: http://csrc.nist.gov/publications/nistpubs/.

Risk Management and the System Development Life Cycle

The risk management process minimizes the impact of realized threats and provides a foundation for effective management decision making. Thus, it is very important that risk management be a part of the system development life cycle. As defined in NIST SP 800-30, risk management comprises three processes:

  • Risk assessment
  • Risk mitigation
  • Evaluation and assessment

These processes should be performed during each of the five phases of the SDLC. Table D-5, taken from NIST SP 800-30, details the risk management activities that should be performed for each SDLC phase.

Table D-5: Risk Management in the SDLC

Phase 1 - Initiation
  • Phase characteristics: The need for an IT system is expressed and the purpose and scope of the IT system are documented.
  • Risk management activities: Identified risks are used to support the development of the system requirements, including security requirements, and a security concept of operations (strategy).

Phase 2 - Development or Acquisition
  • Phase characteristics: The IT system is designed, purchased, programmed, developed, or otherwise constructed.
  • Risk management activities: The risks identified during this phase can be used to support the security analyses of the IT system, which may lead to architecture and design trade-offs during system development.

Phase 3 - Implementation
  • Phase characteristics: The system security features should be configured, enabled, tested, and verified.
  • Risk management activities: The risk management process supports the assessment of the system implementation against its requirements and within its modeled operational environment. Decisions regarding risks identified must be made prior to system operation.

Phase 4 - Operation or Maintenance
  • Phase characteristics: The system performs its functions. Typically the system is being modified on an ongoing basis through the addition of hardware and software and by changes to organizational processes, policies, and procedures.
  • Risk management activities: Risk management activities are performed for periodic system reauthorization (or reaccreditation) or whenever major changes are made to an IT system in its operational, production environment (e.g., new system interfaces).

Phase 5 - Disposal
  • Phase characteristics: This phase may involve the disposition of information, hardware, and software. Activities may include moving, archiving, discarding, or destroying information and sanitizing the hardware and software.
  • Risk management activities: Risk management activities are performed for system components that will be disposed of or replaced to ensure that the hardware and software are properly disposed of, that residual data is appropriately handled, and that system migration is conducted in a secure and systematic manner.

Roles of Key Personnel in the Risk Management Process

To be effective, risk management must be supported by management and information system security practitioners. Some of the key personnel that should actively participate in the risk management activities are:

  • Senior management - Provides the required resources and meets responsibilities under the principle of due care
  • Chief information officer (CIO) - Considers risk management in IT planning, budgeting, and meeting system performance requirements
  • System and information owners - Ensure that controls and services are implemented to address information system confidentiality, integrity, and availability
  • Business and functional managers - Make trade-off decisions regarding business operations and IT procurement that affect information security
  • Information system security officer (ISSO) - Participates in applying methodologies to identify, evaluate, and reduce risks to the mission-critical IT systems
  • IT security practitioners - Ensure the correct implementation of IT system information system security requirements
  • Security awareness trainers - Incorporate risk assessment in training programs for the organization’s personnel

The Risk Assessment Process

As defined in NIST SP 800-30, “Risk is a function of the likelihood of a given threat-source’s exercising a particular potential vulnerability, and the resulting impact of that adverse event on the organization.” Risk assessment comprises the following steps:

  1. System characterization
  2. Threat identification
  3. Vulnerability identification
  4. Control analysis
  5. Likelihood determination
  6. Impact analysis
  7. Risk determination
  8. Control recommendations
  9. Results documentation

Each of these steps will be summarized in the following sections.
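
The later steps (likelihood determination, impact analysis, and risk determination) are often combined in the risk-level matrix that NIST SP 800-30 gives as an example, in which a likelihood weight is multiplied by an impact value and the product is banded into High, Medium, or Low risk. The sketch below follows that example scheme; treat the specific weights and thresholds as illustrative.

```python
# Risk-level matrix sketch following the example scales in NIST SP 800-30:
# likelihood weights (1.0 / 0.5 / 0.1) times impact values (100 / 50 / 10),
# banded into High, Medium, or Low risk. Treat the numbers as illustrative.

LIKELIHOOD = {"high": 1.0, "medium": 0.5, "low": 0.1}
IMPACT = {"high": 100, "medium": 50, "low": 10}

def risk_level(likelihood: str, impact: str) -> str:
    """Band the likelihood-times-impact product into a qualitative risk level."""
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    if score > 50:
        return "High"
    if score > 10:
        return "Medium"
    return "Low"

if __name__ == "__main__":
    print(risk_level("medium", "high"))  # 0.5 * 100 = 50  -> "Medium"
    print(risk_level("high", "high"))    # 1.0 * 100 = 100 -> "High"
```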

System Characterization

This step characterizes the IT system and defines the scope of the risk assessment effort. During this step, information about the system has to be gathered. This information includes:

  • Software
  • Hardware
  • Data
  • Information
  • System interfaces
  • IT system users
  • IT system support personnel
  • System mission
  • Criticality of the system and data
  • System and data sensitivity
  • Functional system requirements
  • System security policies
  • System security architecture
  • Network topology
  • Information storage protection
  • System information flow
  • Technical security controls
  • Physical security environment
  • Environmental security

This information can be gathered using questionnaires, on-site interviews, review of documents, and automated scanning tools. The outputs from this step are:

  • Characterization of the assessed IT system
  • Comprehension of the IT system environment
  • Delineation of the system boundary

Threat Identification

This step identifies potential threat-sources and compiles a statement of the threat-sources that relate to the IT system under evaluation. A threat is defined in NIST SP 800-30 as “the potential for a threat-source to exercise (accidentally trigger or intentionally exploit) a specific vulnerability.” A threat-source is defined in the same document as “either (1) intent and method targeted at the intentional exploitation of a vulnerability or (2) a situation and method that may accidentally trigger a vulnerability.” Common threat-sources include natural threats such as storms and floods, human threats such as malicious attacks and unintentional acts, and environmental threats such as power failure and liquid leakage. A vulnerability is defined as “a flaw or weakness in system security procedures, design, implementation, or internal controls that could be exercised (accidentally triggered or intentionally exploited) and result in a security breach or a violation of the system’s security policy.”

Sources of threat information include the Federal Computer Incident Response Center (FedCIRC), intelligence agencies, mass media, and Web-based resources. The output from this step is a statement that provides a list of threat-sources that could exploit the system’s vulnerabilities.

Vulnerability Identification

This activity results in a list of system vulnerabilities that might be exploited by potential threat-sources. Vulnerabilities can be identified through vulnerability analyses, including information from previous information assessments; audit reports; the NIST vulnerability database (http://icat.nist.gov/icat.cfm); FedCIRC and DOE security bulletins; vendor data; commercial computer incident response teams; and system software security analyses. Testing of the IT system will also yield important results. This testing can be accomplished using penetration testing techniques, automated vulnerability scanning tools, and security test and evaluation (ST&E) procedures.

This step also involves determining whether the security requirements identified during system characterization are being met. Usually, the security requirements are listed in a table with a corresponding statement about how each requirement is or is not being met. The result of this effort is a security requirements checklist that addresses the management, operational, and technical information system security areas. Some useful references for this activity are the Computer Security Act of 1987, the Privacy Act of 1974, the organization’s security policies, industry best practices, and NIST SP 800-26, “Security Self-Assessment Guide for Information Technology Systems.”

The output from this step is a list of system vulnerabilities/observations that could be exploited by the potential threat-sources.
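For illustration only, the following Python sketch shows one way such a security requirements checklist might be represented and how unmet requirements feed the list of vulnerabilities/observations. The security areas, requirement statements, and remarks are hypothetical examples, not items taken from NIST SP 800-26.

    # A minimal sketch of a security requirements checklist of the kind described
    # above. The requirements, areas, and remarks are hypothetical examples.
    checklist = [
        # (security area, requirement, met?, remark)
        ("Management",  "A risk assessment has been performed within the last three years", True,
         "Assessment report on file"),
        ("Operational", "Media are sanitized before disposal or reuse", False,
         "No documented sanitization procedure"),
        ("Technical",   "Audit trails are enabled on all servers", True,
         "Centralized logging in place"),
    ]

    # Requirements that are not met become candidate vulnerabilities/observations.
    observations = [req for area, req, met, remark in checklist if not met]
    for item in observations:
        print("Observation:", item)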

Control Analysis

The control analysis step analyzes the controls that are in place or in the planning stage to minimize or eliminate the probability that a threat will exploit a vulnerability in the system.

Controls can be implemented through technical means such as computer hardware or software, encryption, intrusion detection mechanisms, and identification and authentication subsystems. Other controls, such as security policies, administrative actions, and physical and environmental mechanisms, are considered nontechnical controls. Both technical and nontechnical controls can further be classified as preventive or detective controls. As the names imply, preventive controls attempt to anticipate and stop attacks. Examples of preventive technical controls are encryption and authentication devices. Detective controls are used to discover attacks or events through such means as audit trails and intrusion detection systems.

Changes in the control mechanisms should be reflected in the security requirement checklist.

The output of this step is a list of current and planned control mechanisms for the IT system to reduce the likelihood that a vulnerability will be exercised and to reduce the impact of an attack or event.
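As a simple illustration of the classification just described, the following Python sketch records a hypothetical inventory of current and planned controls along the two axes (technical versus nontechnical, preventive versus detective). The control names and statuses are assumptions made for the example.

    from enum import Enum

    class ControlKind(Enum):
        TECHNICAL = "technical"
        NONTECHNICAL = "nontechnical"

    class ControlFunction(Enum):
        PREVENTIVE = "preventive"
        DETECTIVE = "detective"

    # Hypothetical inventory of current and planned controls, classified along
    # the two axes described above.
    controls = [
        ("Encryption of data in transit",  ControlKind.TECHNICAL,    ControlFunction.PREVENTIVE, "in place"),
        ("Intrusion detection system",     ControlKind.TECHNICAL,    ControlFunction.DETECTIVE,  "planned"),
        ("Security policy and procedures", ControlKind.NONTECHNICAL, ControlFunction.PREVENTIVE, "in place"),
        ("Periodic audit-trail review",    ControlKind.NONTECHNICAL, ControlFunction.DETECTIVE,  "in place"),
    ]

    for name, kind, function, status in controls:
        print(f"{name}: {kind.value}, {function.value} control ({status})")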

Likelihood Determination

This activity develops a rating that provides an indication of the probability that a potential vulnerability might be exploited based on the defined threat environment. This rating takes into account the type of vulnerability, the capability and motivation of the threat-source, and the existence and effectiveness of information system security controls. The likelihood levels are given as high, medium, and low, as illustrated in Table D-6.

Table D-6: Definitions of Likelihood

  • High - A highly motivated and capable threat-source and ineffective controls to prevent exploitation of the associated vulnerability
  • Medium - A highly motivated and capable threat-source and controls that might impede exploitation of the associated vulnerability
  • Low - Lack of motivation or capability in the threat-source, or controls in place to prevent or significantly impede exploitation of the associated vulnerability

The output of this step is a likelihood rating of high, medium, or low.

Impact Analysis

If a threat does exploit a vulnerability in an IT system, it is critical to know the negative impact that would result to the system. Three important factors should be considered in calculating the negative impact:

  • The mission of the system, including the processes implemented by the system
  • The criticality of the system, determined by its value and the value of the data to the organization
  • The sensitivity of the system and its data

The information necessary to conduct an impact analysis can be obtained from existing organizational documentation, including a business impact analysis (BIA), or mission impact analysis report as it is sometimes called. This document uses either quantitative or qualitative means to determine the impacts caused by compromise or harm to the organization’s information assets. An attack or adverse event can result in compromise or loss of information system confidentiality, integrity, and availability. As with the likelihood determination, the impact on the system can be qualitatively assessed as high, medium, or low, as shown in Table D-7.

Table D-7: Definitions of Impact Magnitude

  • High - May cause costly loss of major tangible assets or resources; might cause significant harm or impedance to the mission of an organization; might cause significant harm to an organization’s reputation or interest; might result in human death or injury
  • Medium - May cause costly loss of tangible assets or resources; might cause harm or impedance to the mission of an organization; might cause harm to an organization’s reputation or interest; might result in human injury
  • Low - May cause loss of some tangible assets or resources; might noticeably affect an organization’s mission; might noticeably affect an organization’s reputation or interest

Qualitative analysis is more easily accomplished and provides identifiable areas for immediate improvement. However, it does not provide specific quantitative magnitudes, which makes a cost-benefit analysis difficult. Quantitative analysis does provide magnitudes of measurements but may take more time. It is sometimes very difficult or impossible to place quantitative values on abstract items such as reputation.

Other items that should be included in the impact analysis are the estimated frequency of the threat-source’s exploitation of a vulnerability on an annual basis, the approximate cost of each of these occurrences, and a weight factor based on the relative impact of a specific threat exploiting a specific vulnerability.

The output of this step is the magnitude of impact: high, medium, or low.
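The following Python sketch illustrates one way the quantitative factors mentioned above (annual frequency, cost per occurrence, and a weight factor) might be combined into a weighted annualized impact figure. The numbers and the multiplication scheme are assumptions for illustration; NIST SP 800-30 does not prescribe a specific formula.

    # Illustrative combination of the quantitative impact factors mentioned above.
    # The figures and the weighting scheme are assumptions for illustration only.
    annual_frequency = 0.5          # estimated exploitations per year
    cost_per_occurrence = 40_000    # approximate cost of each occurrence, in dollars
    weight = 0.8                    # relative impact weight for this threat/vulnerability pair

    weighted_annualized_impact = annual_frequency * cost_per_occurrence * weight
    print(f"Weighted annualized impact: ${weighted_annualized_impact:,.0f}")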

Risk Determination

This step, the seventh step in the risk assessment process, determines the level of risk to the IT system. The risk is assigned for a threat/vulnerability pair and is a function of the following characteristics:

  • The likelihood that a particular threat-source will exploit an existing IT system vulnerability
  • The magnitude of the resulting impact of a threat-source successfully exploiting the IT system vulnerability
  • The adequacy of the existing or planned information system security controls for eliminating or reducing the risk

Mission risk is calculated by multiplying the threat likelihood rating (the probability that a threat will occur) by the rating of the impact if the threat is realized. A useful tool for estimating risk in this manner is the risk-level matrix. An example risk-level matrix is shown in Table D-8. In the table, a high likelihood that the threat will occur is given a value of 1.0; a medium likelihood is assigned a value of 0.5; and a low likelihood of occurrence is given a rating of 0.1. Similarly, a high impact level is assigned a value of 100, a medium impact level 50, and a low impact level 10.

Table D-8: A Risk-Level Matrix Example

  LIKELIHOOD OF THREAT   LOW IMPACT (10)        MEDIUM IMPACT (50)       HIGH IMPACT (100)
  High (1.0)             Low (10 × 1.0 = 10)    Medium (50 × 1.0 = 50)   High (100 × 1.0 = 100)
  Medium (0.5)           Low (10 × 0.5 = 5)     Medium (50 × 0.5 = 25)   High (100 × 0.5 = 50)
  Low (0.1)              Low (10 × 0.1 = 1)     Medium (50 × 0.1 = 5)    High (100 × 0.1 = 10)

Using the risk level as a basis, the next step is to determine the actions that senior management and other responsible individuals must take to mitigate estimated risk. General guidelines for each level of risk are:

  • High risk level - At this level, there is a high level of concern and a strong need for a plan for corrective measures to be developed as soon as possible.
  • Medium risk level - For medium risk, there is concern and a need for a plan for corrective measures to be developed within a reasonable period of time.
  • Low risk level - For low risk, the DAA of the system must decide whether to accept the risk or implement corrective actions.

The output of the risk determination step is a risk level of high, medium, or low.
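The arithmetic behind Table D-8 can be summarized in a short Python sketch. The mapping of the resulting score back to a high, medium, or low risk level follows the example risk scale described in NIST SP 800-30 (greater than 50 up to 100 is high, greater than 10 up to 50 is medium, and 1 to 10 is low); organizations may define their own scale.

    # Sketch of the Table D-8 arithmetic: mission risk is the likelihood rating
    # multiplied by the impact rating. The score-to-level mapping follows the
    # example risk scale in NIST SP 800-30.
    LIKELIHOOD = {"High": 1.0, "Medium": 0.5, "Low": 0.1}
    IMPACT = {"High": 100, "Medium": 50, "Low": 10}

    def risk_score(likelihood: str, impact: str) -> float:
        """Return the product of the likelihood and impact ratings."""
        return LIKELIHOOD[likelihood] * IMPACT[impact]

    def risk_level(score: float) -> str:
        """Map a numeric risk score to a qualitative risk level."""
        if score > 50:
            return "High"
        if score > 10:
            return "Medium"
        return "Low"

    for likelihood in ("High", "Medium", "Low"):
        for impact in ("Low", "Medium", "High"):
            score = risk_score(likelihood, impact)
            print(f"{likelihood:6} likelihood x {impact:6} impact -> {score:5.1f} ({risk_level(score)})")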

Control Recommendations

With the risks identified and general guidelines provided for risk mitigation in the previous step, this step specifies the controls to be applied for risk mitigation. In order to specify appropriate controls, issues such as cost/benefit, operational impact, and feasibility have to be considered. In addition, other factors, including applicable legislative regulations, organizational policy, safety, reliability, and the overall effectiveness of the recommended controls should be taken into account.

The output of this step is a recommendation of controls and any alternative solutions to mitigate risk.

Results Documentation

The last step in the risk assessment process is the development of a risk assessment report.

The risk assessment report describes threats and vulnerabilities, risk measurements, and recommendations for implementation of controls. This report is directed at management and should contain information to support appropriate decisions on budget, policies, procedures, management, and operational issues.

Risk Mitigation

Risk mitigation prioritizes, evaluates, and implements the controls that are an output of the risk assessment process. Risk mitigation is the second component of the risk management process.

Because risk can never be completely eliminated and control implementation must make sense under a cost-benefit analysis, a least-cost approach with minimal adverse impact on the IT system is usually taken.

Risk Mitigation Options

Risk mitigation can be classified into the following options:

  • Risk assumption - Accept the risk and keep operating
  • Risk avoidance - Forgo some functions
  • Risk limitation - Implement controls to minimize the adverse impact of threats realized
  • Risk planning - Develop a risk mitigation plan to prioritize, implement, and maintain controls
  • Research and development - Researching control types and options
  • Risk transference - Transfer risk to other sources, such as purchasing insurance

SP 800-30 emphasizes the following guidance on implementing controls:

Address the greatest risks and strive for sufficient risk mitigation at the lowest cost, with minimal impact on other mission capabilities.

The control implementation approach from the risk mitigation methodology recommended by SP 800-30 is given in Figure D-13.

Figure D-13: A control implementation approach (from NIST SP 800-30).

Categories of Controls

Controls to mitigate risks can be broken into the following categories:

  • Technical
  • Management
  • Operational
  • A combination of the above

Each of the categories of controls can be further decomposed into additional subcategories. Technical controls can be subdivided into:

  • Supporting controls - These controls implement identification, cryptographic key management, security administration, and system protections.
  • Preventive controls - Preventive technical controls include authentication, authorization, access control enforcement, nonrepudiation, protected communications, and transaction privacy.
  • Detection and recovery controls - These technical controls include audit, intrusion detection and containment, proof of wholeness (system integrity), restoration to a secure state, and virus detection and eradication.

Management controls comprise:

  • Preventive controls - Preventive management controls include assigning responsibility for security, developing and maintaining security plans, personnel security controls, and security awareness and technical training.
  • Detection controls - Detection controls involve background checks, personnel clearance, periodic review of security controls, periodic system audits, risk management, and authorization of IT systems to address and accept residual risk.
  • Recovery controls - These controls provide continuity of support to develop, test, and maintain the continuity of the operations plan and establish an incident response capability.

Operational security controls are divided into preventive and detection types. Their functions are listed as follows:

  • Preventive controls - These operational controls comprise control of media access and disposal, limiting external data distribution, control of software viruses, securing wiring closets, providing backup capability, protecting laptops and personal computers, protecting IT assets from fire damage, providing an emergency power source, and control of humidity and temperature.
  • Detection controls - Detection operation controls include providing physical security through the use of items such as cameras and motion detectors and ensuring environmental security by using smoke detectors, sensors, and alarms.

Determination of Residual Risk

The risk that remains after the implementation of controls is called the residual risk. All systems will have residual risk, because it is virtually impossible to completely eliminate risk to an IT system. An organization’s senior management or the DAA is responsible for authorizing/accrediting the IT system to begin or continue to operate. The authorization/accreditation must take place every three years in federal agencies or whenever major changes are made to the system. The DAA signs a statement accepting the residual risk when accrediting the IT system for operation. If the DAA determines that the residual risk is at an unacceptable level, the risk management cycle must be redone with the objective of lowering the residual risk to an acceptable level.

Figure D-14 shows the relationship between residual risk and the implementation of controls.

Figure D-14: The relationship between residual risk and implementation of controls (from NIST SP 800-30).

Risk Management Summary

As stated in SP 800-30, “A successful risk management program will rely on (1) senior management’s commitment; (2) the full support and participation of the IT team; (3) the competence of the risk assessment team, which must have the expertise to apply the risk assessment methodology to a specific site and system, identify mission risks, and provide cost-effective safeguards that meet the needs of the organization; (4) the awareness and cooperation of members of the user community, who must follow procedures and comply with the implemented controls to safeguard the mission of their organization; and (5) an ongoing evaluation and assessment of the IT-related mission risks.”

Technical Management

To meet the challenges of successful security project management, a combination of technical and management skills is required. Understanding the technical process alone is not enough; the proper program management environment must also be created. Many mature methods and tools are used to ensure successful project management, and each should be implemented as early in the SDLC as possible. This section examines the processes and tools the program manager uses, as well as the responsibilities of the program manager in satisfying the needs of the project.

Capability Maturity Models (CMMs)

The Software CMM is covered in Chapter 7 of this text, and the System Security Engineering CMM (SSE-CMM) is covered in Chapter 5.

Program Manager Responsibilities

The program manager is the lead for all activities involving cost, schedule, and performance responsibilities. For example, the program manager’s function in the Certification and Accreditation process is to ensure that security requirements are integrated into the IT architecture in a way that will result in an acceptable level of risk to the operational infrastructure. The program manager (PM) works directly with the development, integration, maintenance, configuration management, quality assurance, and test, verification, and validation organizations.

Program Management Plan (PMP)

Usually there is one overall planning document for every program or project, which covers all requirements at a high level and leads to a variety of lower-level plans that address specific areas of activity. Although the specific nomenclature may vary from one program to the next, the title Program Management Plan (PMP) is most often selected to represent this high-level plan. Two major components of the PMP are the Systems Engineering Management Plan (SEMP) and the Work Breakdown Structure (WBS).

Systems Engineering Management Plan (SEMP)

All of the key participants in the system development process must know not only their own responsibilities but also how to interface with one another. This interaction of responsibilities and authority within the project must be defined and controlled, and it is accomplished through the preparation and dissemination of a Systems Engineering Management Plan (SEMP). An important function of the SEMP is to ensure that all of the participants know their responsibilities to one another.

The SEMP also serves as a reference for the procedures that are to be followed in carrying out the numerous systems security engineering tasks. Often the contractor is required to prepare a SEMP as part of the concept definition effort. The place of the SEMP in the program management plan is shown in Figure D-15.

Figure D-15: Placement of the SEMP in the program management plan. Source: Systems Engineering: Principles and Practice, A. Kossiakoff and W. N. Sweet (Wiley Publishing, Inc., 2003). Used by permission.

The SEMP is intended to be a dynamic document. It starts as an outline and is updated as the security system development process goes on. The SEMP covers all management functions associated with the performance of security systems engineering activities for a given program. The responsibility for the SEMP must be clearly defined and supported by the program manager.

SEMP Elements

The SEMP contains detailed statements of how the systems security engineering functions are to be carried out during development. Two major elements of the SEMP are:

  • Development program planning and control
  • Security systems engineering process

Development Program Planning and Control

The development program planning and control section describes the tasks that must be implemented to manage the development phase of the security program, including:

  • Statement of Work (SOW)
  • Organizational Structure
  • Scheduling and Cost Estimation
  • Technical Performance Measurement (TPM)

Security Systems Engineering Process

The security systems engineering process section describes the security systems engineering process as it applies to the development of the system, including:

  • Operational Requirements
  • Functional Analysis
  • System Analysis and Trade-Off Strategy

Statement of Work (SOW)

The Statement of Work (SOW) is a narrative description of the work required for a given project. It is commonly described in the PMP and should include the following:

  • Summary statement of the tasks to be accomplished
  • Identification of the input requirements from other tasks, including tasks accomplished by the customer and supplier
  • References to applicable specifications, standards, procedures, and related documentation
  • Description of specific results to be achieved and a proposed schedule of delivery

Work Breakdown Structure (WBS)

After the generation of the SOW and the identification of the organizational structure, one of the initial steps in program planning is the development of the Work Breakdown Structure (WBS). The WBS is a tree that leads to the identification of the activities, functions, tasks, and subtasks that must be completed.

The WBS is an important technique to ensure that all essential tasks are properly defined, assigned, scheduled, and controlled. It contains a hierarchical structure of the tasks to be accomplished during the project. The WBS may be a contractual requirement in competitive bid system developments.

The WBS structure generally includes three levels of activity:

  • Level 1 - Identifies the entire program scope of work to be produced and delivered. Level 1 may be used as the basis for the authorization of the program work.
  • Level 2 - Identifies the various projects, or categories of activity, that must be completed in response to program requirements. Program budgets are usually prepared at this level.
  • Level 3 - Identifies the activities, functions, major tasks, and/or components of the system that are directly subordinate to the Level 2 items. Program schedules are generally prepared at this level.

The WBS provides many benefits, such as:

  • It provides for the reporting of system technical performance measures (TPMs).
  • The entire security system can be easily defined by the breakdown of its elements into discrete work packages.
  • It aids in linking objectives and activities with available resources.
  • It facilitates budgeting and cost reporting.
  • Responsibility assignments can be readily identified through the assignment of tasks.
  • It provides a greater probability that every activity will be accounted for.

WBS Components

The use of the WBS as a project-organizing framework generally begins in the concept exploration phase. Later, in the concept definition phase, the WBS is defined in detail as the basis for organizing, costing, and scheduling. The WBS format follows a hierarchical structure designed to ensure a slot for every significant task and activity.

In the following example, the entire security system project is at Level 1 in the hierarchy, and the five components represent the Level 2 categories:

  • 1.1 Security System Product - The effort required to develop, produce, and integrate the security system
  • 1.2 Security System Support - The equipment, facilities, and services necessary for the development and operation of the system product
  • 1.3 Security System Testing - Testing that begins after the design of the individual components has been validated via component tests; a very significant fraction of the total test effort is usually allocated to system-level testing
  • 1.4 Project Management - All activities associated with project planning and control, including all management of the WBS, costing, scheduling, performance measurement, project reviews, reports, and associated activities
  • 1.5 Security Systems Engineering - The actions of the security systems engineering staff in guiding the engineering of the system through all its conceptual and engineering phases

Each of the Level 2 categories will have deeper, associated Level 3, Level 4, and possibly Level 5 categories as each component is further broken down. These lower level categories represent the breakdown of each component into definable products of development, the lowest level defining each step of the component’s design, development, and testing. This is vital for establishing cost allocation and controls. The WBS should be structured so that every task is identified at the appropriate place within the WBS hierarchy.
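Because the WBS is hierarchical, it maps naturally onto a tree data structure. The following Python sketch represents the Level 1 and Level 2 items from the example above as a tree and prints them indented by level; the numbering of the root element and the Level 3 entries shown under element 1.5 are hypothetical placeholders.

    # Minimal sketch of a WBS represented as a tree. Level 2 entries follow the
    # example above; the root numbering and Level 3 entries are placeholders.
    wbs = {
        "1 Security System Project": {
            "1.1 Security System Product": {},
            "1.2 Security System Support": {},
            "1.3 Security System Testing": {},
            "1.4 Project Management": {},
            "1.5 Security Systems Engineering": {
                "1.5.1 Requirements analysis": {},
                "1.5.2 Functional analysis and allocation": {},
            },
        }
    }

    def print_wbs(node: dict, indent: int = 0) -> None:
        """Walk the WBS tree, printing each element indented by its level."""
        for name, children in node.items():
            print("  " * indent + name)
            print_wbs(children, indent + 1)

    print_wbs(wbs)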

Cost Control and Estimating

Cost control starts with the initial development of cost estimates for the program and continues with the functions of cost monitoring, the collection of cost data, the analysis of the data, and the immediate initiation of corrective action. Cost control requires good overall cost management, including:

  • Cost estimating
  • Cost accounting
  • Cost monitoring
  • Cost analysis and reporting
  • Control functions

The cost control process is typically performed in this order:

  1. Define the elements of work, as extracted from the SOW.
  2. Integrate the tasks defined in the WBS.
  3. Develop the estimated costs for each task.
  4. Develop a functional cost data collection and reporting capability.
  5. Develop a procedure for evaluation and quick corrective action.
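A brief Python sketch of steps 3 through 5 follows: estimated costs are developed per WBS task, actual cost data are collected, and a simple variance report flags tasks that may need corrective action. The task names, dollar figures, and the five percent overrun threshold are hypothetical.

    # Sketch of steps 3-5 above: estimated cost per WBS task, collected actuals,
    # and a variance report used to trigger corrective action. Figures are hypothetical.
    tasks = {
        "1.3 Security System Testing":      {"estimate": 120_000, "actual": 135_000},
        "1.4 Project Management":           {"estimate":  80_000, "actual":  78_000},
        "1.5 Security Systems Engineering": {"estimate": 200_000, "actual": 190_000},
    }

    OVERRUN_THRESHOLD = 0.05  # flag tasks more than 5% over their estimate

    for name, cost in tasks.items():
        variance = cost["actual"] - cost["estimate"]
        flag = "  <-- corrective action" if variance > OVERRUN_THRESHOLD * cost["estimate"] else ""
        print(f"{name}: estimate ${cost['estimate']:,}, actual ${cost['actual']:,}, "
              f"variance ${variance:+,}{flag}")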

CRITICAL PATH METHOD (CPM)

Critical path analysis is an essential project management tool that traces each major element of the system back through the engineering of its constituent parts. Estimates are made not only of the size but also of the duration of the effort required for each step. The particular path that is estimated to require the longest time to complete is called the critical path. The differences between this time and the times required for other paths are called “slack” for those paths.
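The following Python sketch applies the idea to a small, hypothetical task network: every path through the network is timed, the longest path is reported as the critical path, and the slack of each other path is the difference between its duration and the critical-path duration. The task names and durations are invented for the example.

    # Sketch of critical path analysis on a small, hypothetical task network.
    durations = {"design": 10, "build": 20, "test": 8, "docs": 6, "deliver": 2}
    successors = {
        "design": ["build", "docs"],
        "build": ["test"],
        "test": ["deliver"],
        "docs": ["deliver"],
        "deliver": [],
    }

    def all_paths(task, path=()):
        """Yield every path from the given task to a terminal task."""
        path = path + (task,)
        if not successors[task]:
            yield path
        for nxt in successors[task]:
            yield from all_paths(nxt, path)

    timed = {p: sum(durations[t] for t in p) for p in all_paths("design")}
    critical_path = max(timed, key=timed.get)
    critical_time = timed[critical_path]

    for p, t in timed.items():
        label = "CRITICAL" if p == critical_path else f"slack = {critical_time - t}"
        print(" -> ".join(p), f"({t} days, {label})")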

Outsourcing

The term outsourcing refers to the identification of, selection of, and contracting with one or more outside suppliers for the procurement and acquisition of materials and services for a given system. The term suppliers is defined here as a broad class of external organizations that provide products, components, materials, and/or services to a producer or prime contractor.

The prime activities of the outsourcing process are:

  1. Identification of potential suppliers
  2. Development of a request for proposal (RFP)
  3. Review and evaluation of supplier proposals
  4. Selection of suppliers and contract negotiation
  5. Supplier monitoring and control

System Design Testing

An important step in the security systems development process is the development of a well-designed test plan for determining whether the security system design is stable. A well-planned test program often requires the following five steps:

  1. Planning - The test approach must be planned properly to uncover potential design deficiencies and acquire sufficient test data to identify areas needing correction. This includes the activities:

    • Development of a test plan
    • Development of test procedures
    • Development of a test analysis plan
  2. Development or acquisition of test equipment and facilities - The process of creating test equipment and test facilities includes:

    • Creating the Test Environment - The design and construction of the test environment and the acquisition of equipment for the realistic generation of all of the input functions and the measurement of the resulting outputs
    • Test Software - The acquisition of the software to be used for testing, tailored to the system at hand
    • Test Equipment Validation - The test equipment itself must be validated to ensure that it is sufficiently accurate and reliable
  3. Demonstration and validation testing - The actual conduct of the test to demonstrate and validate the security system design is often the most critical period in the development of a new system.
  4. Analysis and evaluation of test results - The outputs from the component under examination and the results of the test must then be analyzed to disclose all significant discrepancies, in order to identify their source and assess whether correction is required.
  5. Correction of design deficiencies - The final step is a prioritized effort to quickly correct identified design deficiencies.

Test and Evaluation Master Plan (TEMP)

The methods and techniques to be used for measuring and evaluating the system to ensure compliance with security system design requirements must be described early in the SDLC. Individual tests to be performed at each level of the WBS are defined in a series of separate test plans and procedures.

TEST ANALYSIS PLANNING

The planning of how the test results are to be analyzed is just as important as planning how the tests are to be conducted. The following steps should be taken:

  • Determine what data must be collected.
  • Consider the methods by which these data can be obtained; examples include special laboratory tests, simulations, subsystem tests, and full-scale system tests.
  • Define how all data will be processed, analyzed, and presented.

An overall description of test objectives and content and a listing of the individual tests to be performed should also be set forth in an integrated test planning and management document, the Test and Evaluation Master Plan (TEMP). The TEMP is developed during the later stages of system design. In DoD parlance, this is parallel to the Security Test and Evaluation (ST&E) plan.

Initial test planning is included in the TEMP, which commonly consists of:

  • Requirements for testing and evaluation
  • Categories of tests
  • Procedures for accomplishing testing
  • Resources required
  • Associated planning information, such as tasks, schedules, organizational responsibilities, and costs

Other methods used to determine compliance with the initial specification of security system design requirements may entail using simulations and related analytical methods, using an engineering model for test and evaluation purposes, testing a production model, evaluating an operational configuration in the consumer’s environment, or some combination of these methods.

In the Defense sector, a TEMP is required for most large programs and includes the planning and implementation of procedures for the Development Test and Evaluation (DT&E) and the Operational Test and Evaluation (OT&E). The DT&E basically equates to the Analytical, Type 1, and Type 2 testing (see the next section, “Testing and Evaluation Categories”), and the OT&E is equivalent to Type 3 and Type 4 testing.

Testing and Evaluation Categories

Testing and evaluation processes often involve several stages of testing categories or phases, such as:

  1. Analytical - Design evaluations conducted early in the system life cycle using computerized techniques such as CAD, CAM, CALS, simulation, rapid prototyping, and other related approaches
  2. Type 1 testing - The evaluation of system components in the laboratory using bench test models and service test models, designed to verify performance and physical characteristics
  3. Type 2 testing - Testing performed during the latter stages of the detail design and development phase, when preproduction prototype equipment and software are available
  4. Type 3 testing - Tests conducted after initial system qualification and prior to the completion of the production or construction phase; the first time that all elements of the system are operated and evaluated on an integrated basis
  5. Type 4 testing - Testing conducted during the system operational use and life cycle support phase, intended to provide further knowledge of the system in the user environment

Figure D-16 shows a common security system test and evaluation corrective-action loop.

Figure D-16: Security system test and evaluation corrective-action loop. Source: Systems Engineering Management, Third Edition, B. Blanchard (Wiley Publishing, Inc., 2004). Used by permission.

Technical Performance Measurement (TPM)

As the security system development effort progresses, periodic reviews need to be conducted. The system specification should identify and prioritize the Technical Performance Measurements (TPMs). Checklists may be used to aid in the evaluation process, identifying the characteristics that have been incorporated into the design and that directly support the TPM objectives. Design parameters and the applicable TPMs are measured and tracked.
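As a simple illustration, the following Python sketch checks a set of measured design parameters against their TPM objectives at a periodic review. The parameters, priorities, objective values, and measured values are hypothetical.

    # Sketch of tracking design parameters against TPM objectives at a review.
    tpms = [
        # (parameter, priority, objective, measured, higher_is_better)
        ("Authentication latency (ms)",        1, 200, 180, False),
        ("Intrusion-detection coverage (%)",   1,  95,  91, True),
        ("Mean time to restore service (hrs)", 2,   4, 3.5, False),
    ]

    for name, priority, objective, measured, higher_is_better in tpms:
        met = measured >= objective if higher_is_better else measured <= objective
        status = "on track" if met else "objective NOT met - review required"
        print(f"P{priority} {name}: objective {objective}, measured {measured} -> {status}")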

TESTING RESOURCE TRADE-OFFS

Although the ideal testing configuration would be a replica of the entire system and its environment, such a configuration would be too costly in terms of resources. A more practical solution would be to incorporate the elements to be tested into a prototype subsystem, simulating the rest of the system and utilizing the relevant part of the operating environment. The choice of a specific test configuration requires a complex balancing of risks, costs, and contingency plans, requiring a high level of judgment.

Certification and Accreditation

Certification and Accreditation is covered in detail in Chapters 11 through 15 of this text, which address the CAP credential. See the assessment questions at the end of those chapters.

United States Government Information Assurance (IA) Regulations

The U.S. Government Information Assurance Regulations domain of the ISSEP concentration is designed to enable the candidate to identify, understand, and apply the practices as defined by the U.S. Government IA regulations and policies.

Common U.S. Government Information Assurance Terminology

A large amount of U.S. government assurance terminology has, necessarily, been defined and used in the material preceding this chapter. Therefore, it is not necessary to repeat those definitions in this section. However, the definitions of a number of important terms as they are used in the context of U.S. government information assurance will be presented in this section to ensure that the candidate is familiar with them. Also, National Security Telecommunications and Information Systems Security Instruction (NSTISSI) Publication No. 4009, “National Information Systems Security (INFOSEC) Glossary,” September 2000, Appendix F, provides a comprehensive list of U.S. government IA terms.

Important Government IA Definitions

The following definitions, taken from NIST Special Publication 800-12, “An Introduction to Computer Security: The NIST Handbook,” October 1995, are fundamental to the understanding of U.S. government IA material.

  1. Management controls - Techniques and concerns that are normally addressed by management in the organization’s computer security program
  2. Operational controls - Security controls that are usually implemented by people instead of systems
  3. Technical controls - Security controls that the computer system executes
  4. Computer security - The protection afforded to an automated information system in order to attain the applicable objectives of preserving the integrity, availability and confidentiality of information system resources (including hardware, software, firmware, information/data, and telecommunications)
  5. Integrity - In lay usage, information has integrity when it is timely, accurate, complete, and consistent. However, computers are unable to provide or protect all of these qualities. Therefore, in the computer security field, integrity is often discussed more narrowly as having two facets: data integrity and system integrity. As defined in National Research Council, Computers at Risk, National Academy Press, Washington, DC, 1991, p. 54: “Data integrity is a requirement that information and programs are changed only in a specified and authorized manner.” System integrity is defined in National Computer Security Center, Publication NCSC-TG-004-88 as a requirement that a system “performs its intended function in an unimpaired manner, free from deliberate or inadvertent unauthorized manipulation of the system.”
  6. Availability - Computers at Risk, p. 54, defines availability as a “requirement intended to assure that systems work promptly and service is not denied to authorized users.”
  7. Confidentiality - A requirement that private or confidential information not be disclosed to unauthorized individuals.

The additional definitions that follow are selectively taken from the (NSTISSI) Publication No. 4009, Glossary. They are listed to provide the candidate with knowledge of terminology that is used in government IA publications. This list gives the definitions of fundamental concepts that are important to the ISSEP certification:

  1. Assurance - Measure of confidence that the security features, practices, procedures, and architecture of an IS accurately mediates and enforces the security policy
  2. Authentication - Security measure designed to establish the validity of a transmission, message, or originator, or a means of verifying an individual’s authorization to receive specific categories of information
  3. Binding - Process of associating a specific communications terminal with a specific cryptographic key or associating two related elements of information.
  4. BLACK - Designation applied to information systems (and to associated areas, circuits, components, and equipment) in which national security information is encrypted or is not processed.
  5. CCI Assembly - Device embodying a cryptographic logic or other COMSEC design that NSA has approved as a Controlled Cryptographic Item (CCI). It performs the entire COMSEC function but depends upon the host equipment to operate.
  6. CCI Component - Part of a Controlled Cryptographic Item (CCI) that does not perform the entire COMSEC function but depends upon the host equipment, or assembly, to complete and operate the COMSEC function.
  7. Certification Authority Workstation (CAW) - Commercial-off-the-shelf (COTS) workstation with a trusted operating system and special purpose application software that is used to issue certificates.
  8. Certification Package - Product of the certification effort documenting the detailed results of the certification activities.
  9. Certification Test and Evaluation (CT&E) - Software and hardware security tests conducted during development of an IS.
  10. Certified TEMPEST Technical Authority (CTTA) - An experienced, technically qualified U.S. Government employee who has met established certification requirements in accordance with CNSS (NSTISSC)-approved criteria and has been appointed by a U.S. Government Department or Agency to fulfill CTTA responsibilities.
  11. Ciphony - Process of enciphering audio information, resulting in encrypted speech.
  12. Classified information - Information that has been determined (pursuant to Executive Order 12958 or any predecessor Order, or by the Atomic Energy Act of 1954, as amended) to require protection against unauthorized disclosure and that is marked to indicate its classified status.
  13. Clearance - Formal security determination by an authorized adjudicative office that an individual is authorized access, on a need to know basis, to a specific level of collateral classified information (TOP SECRET, SECRET, or CONFIDENTIAL).
  14. Commercial COMSEC Evaluation Program (CCEP) - Relationship between NSA and industry in which NSA provides the COMSEC expertise (i.e., standards, algorithms, evaluations, and guidance) and industry provides design, development, and production capabilities to produce a Type 1 or Type 2 product. Products developed under the CCEP may include modules, subsystems, equipment, systems, and ancillary devices.
  15. Compartmentalization - A nonhierarchical grouping of sensitive information used to control access to data more finely than with hierarchical security classification alone.
  16. Compartmented mode - Mode of operation wherein each user with direct or indirect access to a system, its peripherals, remote terminals, or remote hosts has all of the following: (a) valid security clearance for the most restricted information processed in the system; (b) formal access approval and signed nondisclosure agreements for that information to which a user is to have access; and (c) valid need-to-know for information to which a user is to have access.
  17. COMSEC boundary - Definable perimeter encompassing all hardware, firmware, and software components performing critical COMSEC functions, such as key generation and key handling and storage.
  18. Concept of Operations (CONOPS) - Document detailing the method, act, process, or effect of using an IS.
  19. Controlled Cryptographic Item (CCI) - Secure telecommunications or information-handling equipment, or associated cryptographic component, that is unclassified but governed by a special set of control requirements. Such items are marked “CONTROLLED CRYPTOGRAPHIC ITEM” or, where space is limited, “CCI.”
  20. Crypto-ignition key (CIK) - Device or electronic key used to unlock the secure mode of crypto-equipment.
  21. Dangling threat - Set of properties about the external environment for which there is no corresponding vulnerability and therefore no implied risk.
  22. Dangling vulnerability - Set of properties about the internal environment for which there is no corresponding threat and, therefore, no implied risk.
  23. Enclave - Collection of computing environments connected by one or more internal networks under the control of a single authority and security policy, including personnel and physical security.
  24. Enclave boundary - Point at which an enclave’s internal network service layer connects to an external network’s service layer (i.e., to another enclave or to a Wide Area Network [WAN]).
  25. Endorsed for Unclassified Cryptographic Item (EUCI) - Unclassified cryptographic equipment that embodies a U.S. Government classified cryptographic logic and is endorsed by NSA for the protection of national security information. See Type 2 Product.
  26. Evaluated Products List (EPL) - Equipment, hardware, software, and/or firmware evaluated by the National Computer Security Center (NCSC) in accordance with DoD TCSEC and found to be technically compliant at a particular level of trust. The EPL is included in the NSA Information Systems Security Products and Services Catalogue.
  27. Evaluation Assurance Level (EAL) - Set of assurance requirements that represents a point on the Common Criteria predefined assurance scale.
  28. Global Information Infrastructure (GII) - Worldwide interconnections of the information systems of all countries, international and multinational organizations, and international commercial communications.
  29. High Assurance Guard (HAG) - Device (comprising both hardware and software) that is designed to enforce security rules during the transmission of X.400 message and X.500 directory traffic between enclaves of different classification levels (e.g., UNCLASSIFIED and SECRET).
  30. IA architecture - Framework that assigns and portrays IA roles and behavior among all IT assets and prescribes rules for interaction and interconnection.
  31. Information assurance (IA) - Measures that protect and defend information and information systems by ensuring their availability, integrity, authentication, confidentiality, and non-repudiation. These measures include providing for restoration of information systems by incorporating protection, detection, and reaction capabilities.
  32. Information systems security (INFOSEC) - Protection of information systems against unauthorized access to or modification of information, whether in storage, processing, or transit, and against the denial of service to authorized users, including those measures necessary to detect, document, and counter such threats.
  33. Information Systems Security Engineering (ISSE) - Process that captures and refines information protection requirements and ensures their integration into IT acquisition processes through purposeful security design or configuration.
  34. Key-auto-key (KAK) - Cryptographic logic using previous key to produce a key
  35. Multilevel mode - INFOSEC mode of operation wherein all the following statements are satisfied concerning the users who have direct or indirect access to the system, its peripherals, remote terminals, or remote hosts: (a) Some users do not have a valid security clearance for all the information processed in the IS; (b) all users have the proper security clearance and appropriate formal access approval for that information to which they have access; and (c) all users have a valid need-to-know only for information to which they have access.
  36. Multilevel security (MLS) - Concept of processing information with different classifications and categories that simultaneously permits access by users with different security clearances and denies access to users who lack authorization.
  37. National Information Assurance Partnership (NIAP) - Joint initiative between NSA and NIST responsible for security testing needs of both IT consumers and producers and promoting the development of technically sound security requirements for IT products and systems and appropriate measures for evaluating those products and systems.
  38. National Information Infrastructure (NII) - Nationwide interconnection of communications networks, computers, databases, and consumer electronics that make a vast amount of information available to users. It includes both public and private networks, the Internet, the public switched network, and cable, wireless, and satellite communications.
  39. National security information (NSI) - Information that has been determined, pursuant to Executive Order 12958 or any predecessor order, to require protection against unauthorized disclosure.
  40. No-lone zone - Area, room, or space that, when staffed, must be occupied by two or more appropriately cleared individuals who remain within sight of each other.
  41. Operations security (OPSEC) - Systematic and proven process by which potential adversaries can be denied information about capabilities and intentions by identifying, controlling, and protecting generally unclassified evidence of the planning and execution of sensitive activities. The process involves five steps: identification of critical information, analysis of threats, analysis of vulnerabilities, assessment of risks, and application of appropriate countermeasures.
  42. Partitioned security mode - IS security mode of operation wherein all personnel have the clearance, but not necessarily formal access approval and need-to-know, for all information handled by an IS.
  43. Policy Approving Authority (PAA) - First level of the PKI Certification Management Authority, which approves the security policy of each PCA.
  44. Policy Certification Authority (PCA) - Second level of the PKI Certification Management Authority, which formulates the security policy under which it and its subordinate CAs will issue public key certificates.
  45. QUADRANT - Short name referring to technology that provides tamper-resistant protection to crypto-equipment.
  46. RED - Designation applied to an IS and associated areas, circuits, components, and equipment in which unencrypted national security information is being processed.
  47. RED/BLACK concept - Separation of electrical and electronic circuits, components, equipment, and systems that handle national security information (RED) in electrical form from those that handle non-national security information (BLACK) in the same form.
  48. Red team - Independent and focused threat-based effort by an interdisciplinary simulated adversary to expose and exploit vulnerabilities as a means to improve the security posture of ISs.
  49. RED signal - Any electronic emission (e.g., plaintext, key, key stream, subkey stream, initial fill, or control signal) that would divulge national security information if recovered.
  50. Risk management - Process of identifying and applying countermeasures commensurate with the value of the assets protected, based on a risk assessment.
  51. Security fault analysis (SFA) - Assessment, usually performed on IS hardware, to determine the security properties of a device when a hardware fault is encountered.
  52. Security test and evaluation (ST&E) - Examination and analysis of the safeguards required to protect an IS, as they have been applied in an operational environment, to determine the security posture of that system.
  53. Sensitive Compartmented Information (SCI) - Classified information concerning or derived from intelligence sources, methods, or analytical processes, which is required to be handled within formal access control systems established by the Director of Central Intelligence.
  54. Sensitive Compartmented Information Facility (SCIF) - An accredited area, room, or group of rooms, buildings, or installation where SCI may be stored, used, discussed, and/or processed.
  55. Special Access Program (SAP) - Program established for a specific class of classified information that imposes safeguarding and access requirements that exceed those normally required for information at the same classified level.
  56. Superencryption - Process of encrypting encrypted information, as when a message encrypted off-line is transmitted over a secured, online circuit, or when information encrypted by the originator is multiplexed onto a communications trunk that is then bulk encrypted.
  57. System high - Highest security level supported by an IS.
  58. System high mode - IS security mode of operation wherein each user, with direct or indirect access to the IS, its peripherals, remote terminals, or remote hosts, has all of the following: (a) valid security clearance for all information within an IS; (b) formal access approval and signed nondisclosure agreements for all the information stored and/or processed (including all compartments, subcompartments, and/or special access programs); and (c) a valid need-to-know for some of the information contained within the IS.
  59. TEMPEST - Transient ElectroMagnetic Pulse Emanations Standard, the U.S. Government standard for control of spurious compromising emanations emitted by electrical equipment; also used to refer to investigation, study, and control of compromising emanations from IS equipment.
  60. TEMPEST zone - Designated area within a facility where equipment with appropriate TEMPEST characteristics (TEMPEST zone assignment) may be operated.
  61. Tranquility - Property whereby the security level of an object cannot change while the object is being processed by an IS.
  62. Type 1 product - Classified or controlled cryptographic item endorsed by the NSA for securing classified and sensitive U.S. Government information, when appropriately keyed. The term refers only to products and not to information, keys, services, or controls. Type 1 products contain approved NSA algorithms. They are available to U.S. Government users, their contractors, and federally sponsored non-U.S. Government activities subject to export restrictions in accordance with International Traffic in Arms Regulation.
  63. Type 2 product - Unclassified cryptographic equipment, assembly, or component, endorsed by the NSA, for use in national security systems as defined in Title 40 U.S.C. § 1452.

U.S. National Policy

In the U.S., the Committee on National Security Systems (CNSS) was assigned the responsibility to set national policy for national security systems. CNSS is the result of Executive Order (E.O.) 13231, “Critical Infrastructure Protection in the Information Age,” issued on October 16, 2001. E.O. 13231 renamed the National Security Telecommunications and Information Systems Security Committee (NSTISSC) as CNSS. CNSS is a standing committee of the President’s Critical Infrastructure Protection Board and is chaired by the U.S. DoD.

E.O. 13231 directed the following actions:

  • Protection of information systems for critical infrastructure
  • Protection of emergency preparedness communications
  • Protection of supporting physical assets

The E.O. also assigned the following responsibilities to the U.S. Secretary of Defense and the Director of Central Intelligence regarding the security of systems with national security information:

  • Developing government-wide policies
  • Overseeing the implementation of government-wide policies, procedures, standards, and guidelines

National security systems are categorized as systems with one or more of the following characteristics:

  • Contain classified information
  • Are involved with the command and control of military forces
  • Employ cryptographic activities related to national security
  • Support intelligence activities
  • Are associated with equipment that is an integral part of a weapon or weapons system
  • Are critical to the direct fulfillment of military or intelligence missions, not including routine administrative and business applications

The responsibilities of the CNSS for national security systems outlined in E.O. 13231 include:

  • Providing a forum for the discussion of policy issues
  • Setting national policy
  • Through the CNSS Issuance System, providing operational procedures, direction, and guidance

An index of CNSS Issuances can be found at www.nstissc.gov/Assets/pdf/index.pdf.

Additional Agency Policy Guidance

Additional valuable guidance on policies for federal agencies is provided in OMB Circular A-130, “Management of Federal Information Resources, Transmittal 4,” November 30, 2000. This circular addresses information management policy and management of information systems and information technology policy. These policies are summarized in the following two sections.

Information Management Policy

For government agencies, an information management policy should address the following areas:

  • Conducting information management planning
  • Establishing guidelines for information collection
  • Establishing guidelines for electronic information collection
  • Implementing records management
  • Providing information to the public
  • Implementing an information dissemination management system
  • Avoiding improperly restrictive practices
  • Disseminating electronic information
  • Implementing safeguards

Management of Information Systems and Information Technology Policy

A policy for the management of information systems should include the following items:

  • Use of a process for capital planning and investment control
  • Documentation and submission of the initial enterprise architecture (EA) to OMB and submission of updates when significant changes to the EA occur. The OMB Circular defines EA as “the explicit description and documentation of the current and desired relationships among business and management processes and information technology.”
  • Ensuring security in information systems
  • Acquisition of information technology

In performing the oversight function, Circular A-130 states:

The Director of OMB will use information technology planning reviews, fiscal budget reviews, information collection budget reviews, management reviews, and such other measures as the Director deems necessary to evaluate the adequacy and efficiency of each agency’s information resources management and compliance with this Circular.

Department of Defense Policies

The policies and guidance for information assurance in U.S. defense organizations are given in DoD Directive 8500.1, “Information Assurance (IA),” October 4, 2002. Additional support and implementation guidance is also provided by DoD Directive 8500.2, “Information Assurance (IA) Implementation,” February 6, 2003; DoD 5025.1-M, “DoD Directives System Procedures,” current edition; and DoD Directive 8000.1, “Management of DoD Information Resources and Information Technology,” February 27, 2002. The principal components of U.S. DoD IA policy as embodied in DoD Directive 8500.1 are summarized in the following section.

DoD Directive 8500.1

DoD Directive 8500.1:

Establishes policy and assigns responsibilities to achieve Department of Defense (DoD) information assurance (IA) through a defense-in-depth approach that integrates the capabilities of personnel, operations, and technology, and supports the evolution to network centric warfare.

There are 26 policy items listed in Directive 8500.1. The main elements of these policy statements, taken from the Directive, are as follows:

  1. Information assurance requirements shall be identified and included in the design, acquisition, installation, operation, upgrade, or replacement of all DoD information systems in accordance with 10 U.S.C. § 2224, Office of Management and Budget Circular A-130, DoD Directive 5000.1, this Directive, and other IA-related DoD guidance, as issued.
  2. All DoD information systems shall maintain an appropriate level of confidentiality, integrity, authentication, nonrepudiation, and availability that reflect a balance among the importance and sensitivity of the information and information assets; documented threats and vulnerabilities; the trustworthiness of users and interconnecting systems; the impact of impairment or destruction to the DoD information system; and cost effectiveness.
  3. Information assurance shall be a visible element of all investment portfolios incorporating DoD-owned or -controlled information systems, to include outsourced business processes supported by private sector information systems and outsourced information technologies.
  4. Interoperability and integration of IA solutions within or supporting the Department of Defense shall be achieved through adherence to an architecture that will enable the evolution to network-centric warfare by remaining consistent with the Command, Control, Communications, Computers, Intelligence, Surveillance, Reconnaissance Architecture Framework, and a defense-in-depth approach.
  5. The Department of Defense shall organize, plan, assess, train for, and conduct the defense of DoD computer networks as integrated computer network defense (CND) operations that are coordinated across multiple disciplines in accordance with DoD Directive O-8530.1.
  6. Information assurance readiness shall be monitored, reported, and evaluated as a distinguishable element of mission readiness throughout all the DoD Components, and validated by the DoD CIO.
  7. All DoD information systems shall be assigned a mission assurance category that is directly associated with the importance of the information they contain relative to the achievement of DoD goals and objectives, particularly the war fighters’ combat mission.
  8. Access to all DoD information systems shall be based on a demonstrated need-to-know and granted in accordance with applicable laws and DoD 5200.2-R.
  9. In addition to the requirements in item 8, foreign exchange personnel and representatives of foreign nations, coalitions, or international organizations may be authorized access to DoD information systems containing classified or sensitive information only if all of the following conditions are met:

    • Access is authorized only by the DoD Component Head in accordance with the Department of Defense, the Department of State (DoS), and DCI disclosure and interconnection policies, as applicable.
    • Mechanisms are in place to strictly limit access to information that has been cleared for release to the represented foreign nation, coalition, or international organization (e.g., North Atlantic Treaty Organization), in accordance with DoD directives.
  10. Authorized users who are contractors, DoD direct- or indirect-hire foreign national employees, or foreign representatives as described in item 9 shall always have their affiliation displayed as part of their e-mail addresses.
  11. Access to DoD-owned, -operated, or -outsourced Web sites shall be strictly controlled by the Web site owner using technical, operational, and procedural measures appropriate to the Web site audience and information classification or sensitivity.
  12. DoD information systems shall regulate remote access and access to the Internet by employing positive technical controls such as proxy services and screened subnets, also called demilitarized zones (DMZ), or through systems that are isolated from all other DoD information systems through physical means. This includes remote access for telework.
  13. All DoD information systems shall be certified and accredited in accordance with DoD Instruction 5200.40.
  14. All interconnections of DoD information systems shall be managed to continuously minimize community risk by ensuring that the assurance of one system is not undermined by vulnerabilities of interconnected systems.
  15. All DoD information systems shall comply with DoD ports and protocols guidance and management processes, as established.
  16. The conduct of all DoD communications security activities, including the acquisition of COMSEC products, shall be in accordance with DoD Directive C-5200.5.
  17. All IA or IA-enabled IT hardware, firmware, and software components for products incorporated into DoD information systems must comply with the evaluation and validation requirements of National Security Telecommunications and Information Systems Security Policy Number 11.
  18. All IA and IA-enabled IT products incorporated into DoD information systems shall be configured in accordance with DoD-approved security configuration guidelines.
  19. Public domain software products, and other software products with limited or no warranty, such as those commonly known as freeware or shareware, shall be used in DoD information systems only to meet compelling operational requirements. Such products shall be thoroughly assessed for risk and accepted for use by the responsible DAA.
  20. DoD information systems shall be monitored based on the assigned mission assurance category and assessed risk in order to detect, isolate, and react to intrusions, disruption of services, or other incidents that threaten the IA of DoD operations or IT resources, including internal misuse. DoD information systems also shall be subject to active penetrations and other forms of testing used to complement monitoring activities in accordance with DoD and Component policy and restrictions.
  21. Identified DoD information system vulnerabilities shall be evaluated for DoD impact and tracked and mitigated in accordance with DoD-directed solutions, e.g., Information Assurance Vulnerability Alerts (IAVAs).
  22. All personnel authorized access to DoD information systems shall be adequately trained in accordance with DoD and Component policies and requirements and certified as required in order to perform the tasks associated with their IA responsibilities.
  23. Individuals shall be notified of their privacy rights and security responsibilities in accordance with DoD Component General Counsel–approved processes when attempting access to DoD information systems.
  24. Mobile code technologies shall be categorized and controlled to reduce their threat to DoD information systems in accordance with DoD and Component policy and guidance.
  25. A DAA shall be appointed for each DoD information system operating within or on behalf of the Department of Defense, to include outsourced business processes supported by private sector information systems and outsourced information technologies. The DAA shall be a U.S. citizen, a DoD employee, and have a level of authority commensurate with accepting, in writing, the risk of operating DoD information systems under his or her purview.
  26. All military voice radio systems, to include cellular and commercial services, shall be protected consistent with the classification or sensitivity of the information transmitted on the system.

Assessment Questions

You can find the answers to the following questions in Appendix A.

Systems Security Engineering

1. 

Which one of the following is not one of the five system life cycle planning phases as defined in NIST SP 800-14?

  1. Initiation phase
  2. Requirements phase
  3. Implementation phase
  4. Disposal phase


2. 

Which one of the following sets of activities best describes a subset of the Acquisition Cycle phases as given in NIST SP 800-64, “Security Considerations in the Information System Development Life Cycle”?

  1. Mission and business planning, acquisition planning, contract performance, disposal and contract closeout
  2. Initiation, mission and business planning, acquisition planning, contract performance
  3. Initiation, acquisition/development, contract performance, disposal and contract closeout
  4. Mission and business planning, acquisition/development, contract performance, disposal and contract closeout


3. 

The IATF document 3.1 stresses that information assurance relies on three critical components. Which one of the following answers correctly lists these components?

  1. People, documentation, technology
  2. People, Defense in Depth, technology
  3. People, evaluation, certification
  4. People, operations, technology


4. 

In the 14 Common IT Security Practices listed in NIST SP 800-14, one of the practices addresses having three types of policies in place. Which one of the following items is not one of these types of policies?

  1. A program policy
  2. An issue-specific policy
  3. A system-specific policy
  4. An enclave-specific policy


5. 

Risk management, as defined in NIST SP 800-30, comprises which three processes?

  1. Risk assessment, risk mitigation, and evaluation and assessment
  2. Risk identification, risk mitigation, and evaluation and assessment
  3. Risk assessment, risk impacts, and risk mitigation
  4. Risk assessment, risk mitigation, and risk identification


6. 

In the system development life cycle (SDLC), or system life cycle as it is sometimes called, in which one of the five phases are the system security features configured, enabled, tested, and verified?

  1. Operation/maintenance
  2. Development/acquisition
  3. Implementation
  4. Initiation


7. 

Which one of the following activities is performed in the Development/Acquisition phase of the SDLC?

  1. The scope of the IT system is documented.
  2. The IT system is developed, programmed, or otherwise constructed.
  3. The system performs its function.
  4. Information, hardware, or software is disposed of.


8. 

In NIST SP 800-30, risk is defined as a function of which set of the following items?

  1. Threat likelihood, vulnerabilities, and impact
  2. Threat likelihood, mission, and impact
  3. Vulnerabilities, mission and impact
  4. Threat likelihood, sensitivity, and impact


9. 

The risk assessment methodology described in NIST SP 800-30 comprises nine primary steps. Which one of the following is not one of these steps?

  1. System characterization
  2. Control analysis
  3. Impact analysis
  4. Accreditation boundaries


10. 

The Engineering Principles for Information Technology Security (EPITS), described in NIST SP 800-27, are which one of the following?

  1. A list of 33 system-level security principles to be considered in the design, development, and operation of an information system
  2. A list of eight principles and 14 practices derived from OECD guidelines
  3. Part of the Common Criteria (CC)
  4. Component of the Defense in Depth strategy


11. 

Which one of the following items is not one of the activities of the generic systems engineering (SE) process?

  1. Discover needs
  2. Define system requirements
  3. Obtain accreditation
  4. Assess effectiveness


12. 

The elements Discover information protection needs, Develop detailed security design, and Assess information protection effectiveness are part of what process?

  1. The systems engineering (SE) process
  2. The information systems security engineering process (ISSE)
  3. The system development life cycle (SDLC)
  4. The risk management process


13. 

In the ISSE process, information domains are defined under the Discover Information Protection Needs process. Which one of the following tasks is not associated with the information domain?

  1. Identify the members of the domain.
  2. List the information entities that are under control in the domain.
  3. Identify the applicable privileges, roles, rules, and responsibilities of the users in the domain.
  4. Map security mechanisms to security design elements in the domain.


14. 

In the Discover Information Protection Needs activity of the ISSE process, the information systems security engineer must document the elements of this activity, including roles, responsibilities, threats, strengths, security services, and priorities. These items form the basis of which one of the following?

  1. Threat matrix
  2. Functional analysis
  3. Synthesis
  4. Information protection policy (IPP)


15. 

As part of the Define System Security Requirements activity of the ISSE process, the information systems security engineer identifies and selects a solution set that can satisfy the requirements of the IPP. Which one of the following elements is not a component of the solution set?

  1. Functional decomposition
  2. Preliminary security concept of operations (CONOPS)
  3. System context
  4. System requirements


16. 

The information systems security engineer’s tasks of cataloging candidate commercial off-the-shelf (COTS) products, government off-the-shelf (GOTS) products, and custom security products are performed in which one of the following ISSE process activities?

  1. Define System Security Requirements
  2. Develop Detailed Security Design
  3. Implement System Security
  4. Design System Security Architecture


17. 

Which ISSE activity includes conducting unit testing of components, integration testing, and developing installation and operational procedures?

  1. Assess Information Protection Effectiveness
  2. Develop Detailed Security Design
  3. Implement System Security
  4. Design System Security Architecture


18. 

Security certification is performed in which phase of the SDLC?

  1. Implementation phase
  2. Validation phase
  3. Development/Acquisition phase
  4. Operations/Maintenance phase


19. 

The certification and accreditation process receives inputs from the ISSE process. These inputs are which one of the following items?

  1. Certification documentation
  2. Certification recommendations
  3. Accreditation decision
  4. Evidence and documentation


20. 

Which one of the following items is not part of an implementation-independent protection profile (PP) of the Common Criteria (CC)?

  1. Security objectives
  2. Information assurance requirements
  3. Security-related functional requirements
  4. Defense of the enclave boundary


21. 

Which one of the following is not one of the technology focus areas of the Defense in Depth strategy?

  1. Defend the certificate management
  2. Defend the network and infrastructure
  3. Defend the computing environment
  4. Defend the supporting infrastructure


22. 

Security categorization is part of which phase of the SDLC?

  1. Initiation
  2. Acquisition/Development
  3. Implementation
  4. Requirements


23. 

The Defense in Depth strategy identifies five types of attacks on information systems as listed in IATF document 3.1. Which one of the following types of attacks is not one of these five types?

  1. Passive
  2. Active
  3. Close-in
  4. Outsider


24. 

Which one of the following items is not an activity under the Acquisition/Development phase of the SDLC?

  1. Preliminary risk assessment
  2. Security functional requirements analysis
  3. Cost considerations and reporting
  4. Developmental security evaluation


25. 

Which one of the following types of enclaves is not one of those categorized in the U.S. federal and defense computing environments?

  1. Private
  2. Public
  3. Classified
  4. Secure


26. 

According to NIST SP 800-64, which phase of the SDLC includes the activities of functional statement of need, market research, cost-benefit analysis, and a cost analysis?

  1. Initiation
  2. Acquisition/Development
  3. Implementation
  4. Operations/Maintenance


27. 

Which one of the following models is an evolutionary model used to represent the acquisition management process?

  1. The acquisition process model
  2. The spiral model
  3. The waterfall model
  4. The acquisition/development model


28. 

In NIST SP 800-30, a threat is defined as which one of the following items?

  1. Intent and method targeted at the intentional exploit of a vulnerability
  2. The likelihood that a given threat source will exercise a particular potential vulnerability, and the resulting impact of that adverse event on the organization
  3. The potential for a threat source to exercise a specific vulnerability
  4. A flaw or weakness in system security procedures, design, implementation, or internal controls that could be exercised and result in a security breach or a violation of the system’s security policy


29. 

Questionnaires, on-site interviews, review of documents, and automated scanning tools are primarily used to gather information for which one of the following steps of the risk assessment process?

  1. System characterization
  2. Risk determination
  3. Vulnerability identification
  4. Control analysis


30. 

In performing an impact analysis as part of the risk assessment process, three important factors should be considered in calculating the negative impact. Which one of the following items is not one of these factors?

  1. The sensitivity of the system and its data
  2. The management of the system
  3. The mission of the system
  4. The criticality of the system, determined by its value and the value of the data to the organization

Technical Management

Some material in these questions is also covered in Chapter 5.


31. 

Which statement about the SSE-CMM is incorrect?

  1. The SSE-CMM defines two dimensions that are used to measure the capability of an organization to perform specific activities.
  2. The domain dimension consists of all the practices that collectively define security engineering.
  3. The domain dimension represents practices that indicate process management and institutionalization capability.
  4. The capability dimension represents practices that indicate process management and institutionalization capability.


32. 

Which description of the SSE-CMM Level 5 Generic Practice is correct?

  1. Planned and Tracked
  2. Continuously Improving
  3. Quantitatively Controlled
  4. Performed Informally


33. 

Which statement about testing and evaluation is not true?

  1. A TEMP is required for most large programs.
  2. A DT&E is equivalent to Analytical, Type 1, and Type 2 testing.
  3. An OT&E is equivalent to Type 5 and Type 6 testing.
  4. An OT&E is equivalent to Type 3 and Type 4 testing.


34. 

Which attribute about the Level 1 SSE-CMM Generic Practice is correct?

  1. Performed Informally
  2. Planned and Tracked
  3. Well Defined
  4. Continuously Improving


35. 

Which of the following is not a true statement about good cost control?

  1. Cost control starts with the initiation of corrective action.
  2. Cost control requires good overall cost management.
  3. Cost control requires immediate initiation of corrective action.
  4. Cost control starts with the initial development of cost estimates for the program.


36. 

Which statement about the SE-CMM is not correct?

  1. The SE-CMM describes the essential elements of an organization’s systems engineering process that must exist in order to ensure good systems engineering.
  2. The SE-CMM provides a reference to compare existing systems engineering practices against the essential systems engineering elements described in the model.
  3. The SE-CMM goal is to improve the system or product engineering process.
  4. The SE-CMM was created to define, improve, and assess security engineering capability.


37. 

Which statement about system security testing and evaluation (ST&E) categories is correct?

  1. Type 1 testing is performed during the latter stages of the detail design and development phase.
  2. Type 2 testing is design evaluation conducted early in the system life cycle.
  3. Type 3 testing is performed during the latter stages of the detail design and development phase.
  4. Type 4 testing is conducted during the system operational use and life cycle support phase.


38. 

Which choice is not an activity in the cost control process?

  1. Identifying potential suppliers
  2. Developing a functional cost data collection capability
  3. Developing the costs as estimated for each task
  4. Creating a procedure for cost evaluation


39. 

Which choice does not describe a common outsourcing activity?

  1. Review of proposals
  2. Develop a functional cost reporting capability
  3. Contract negotiation
  4. Development of an RFP


40. 

Which choice is not an accurate description of an activity level of the WBS?

  1. Level 1 may be used as the basis for the authorization of the program work.
  2. Program budgets are usually prepared at Level 1.
  3. Level 2 identifies the various projects that must be completed.
  4. Program schedules are generally prepared at Level 3.


41. 

Which of the following is not a phase in the IDEAL model?

  1. Authorizing
  2. Learning
  3. Diagnosing
  4. Establishing


42. 

Which choice best describes systems engineering, as defined in the SSE-CMM?

  1. An integrated composite of people, products, and processes that provides a capability to satisfy a need or objective
  2. The selective application of scientific and engineering efforts to integrate the efforts of all engineering disciplines and specialties into the total engineering effort
  3. A narrative description of the work required for a given project
  4. The contracting with one or more outside suppliers for the procurement and acquisition of materials and services


43. 

Which of the following choices is not a benefit of the WBS?

  1. The WBS facilitates the initial allocation of budgets.
  2. The WBS facilitates the collection and reporting of costs.
  3. The system can easily be described through the logical breakout of its elements into work packages.
  4. The WBS integrates the efforts of all engineering disciplines and specialties into the total engineering effort.


44. 

Which choice is not an element of the Statement of Work (SOW)?

  1. An identification of the input requirements from other tasks
  2. A description of specific results to be achieved
  3. Management of security awareness, training, and education programs
  4. A proposed schedule for delivery of the product


45. 

Which of the following statements best describes the difference between a Type 1 testing and evaluation category and a Type 2 category?

  1. Type 1 testing is the evaluation of system components in the laboratory, designed to verify performance and physical characteristics.
  2. Type 2 testing is the evaluation of system components in the laboratory, designed to verify performance and physical characteristics.
  3. Type 1 testing establishes design evaluations conducted early in the system life cycle.
  4. Type 2 testing is conducted after initial system qualification and prior to the completion of the production or construction phase.


46. 

Which choice has the outsourcing activities listed in their proper order?

  1. Review and evaluation of supplier proposals, supplier monitoring and control, development of a Request for Proposal (RFP), and selection of suppliers
  2. Development of a Request for Proposal (RFP), review and evaluation of supplier proposals, supplier monitoring and control, and selection of suppliers
  3. Development of a Request for Proposal (RFP), review and evaluation of supplier proposals, selection of suppliers, and supplier monitoring and control
  4. Review and evaluation of supplier proposals, selection of suppliers, development of a Request for Proposal (RFP), and supplier monitoring and control


47. 

Which answer best describes a Statement of Work (SOW)?

  1. A narrative description of the work required for a given project
  2. An integrated composite of people, products, and processes that provides a capability to satisfy a need or objective
  3. The contracting with one or more outside suppliers for the procurement and acquisition of materials and services
  4. The development of a functional cost reporting capability


48. 

Which statement about SSE-CMM Base Practices is correct?

  1. BPs are mandatory characteristics that must exist within an implemented security engineering process before an organization can claim satisfaction in a given PA.
  2. BPs are ordered in degrees of maturity and are grouped to form and distinguish among five levels of security engineering maturity.
  3. BPs are ordered in degrees of maturity and are grouped to form and distinguish among 22 levels of security engineering maturity.
  4. BPs are optional characteristics that must exist within an implemented security engineering process before an organization can claim satisfaction in a given PA.


49. 

As per the SE-CMM, which definition of a system is incorrect?

  1. An interacting combination of elements that are viewed in relation to function
  2. A continuous cycle of evaluating the current status of an organization, making improvements, and repeating the cycle
  3. An assembly of things or parts forming a complex or unitary whole
  4. An integrated composite of people, products, and processes that provides a capability to satisfy a need or objective


50. 

Which of the following choices best describes the purpose of the Learning phase of the IDEAL model?

  1. The Learning phase is the implementation phase and requires the greatest level of effort of all the phases, in terms of both resources and time.
  2. The Learning phase is both the final stage of the initial process improvement cycle and the initial phase of the next process improvement effort.
  3. In the Learning phase, it is imperative that an understanding of the organization’s current and desired future state of process maturity be established.
  4. In the Learning phase, a detailed plan of action based on the goals of the effort and the recommendations developed during the Diagnosing phase is developed.


51. 

Which statement about the System Engineering Management Plan (SEMP) is not true?

  1. Development program planning and control is a SEMP element.
  2. The goal of SEMP is to establish a continuous cycle of evaluating the current status of the organization.
  3. The SEMP contains detailed statements of how the systems security engineering functions are to be carried out during development.
  4. The security systems engineering process is a SEMP element.


52. 

Which choice has the correct order of activities in the IDEAL model?

  1. Learning, Initiating, Diagnosing, Establishing, and Acting
  2. Initiating, Learning, Diagnosing, Establishing, and Acting
  3. Learning, Diagnosing, Initiating, Establishing, and Acting
  4. Initiating, Diagnosing, Establishing, Acting, and Learning


53. 

Which choice is an incorrect statement regarding the Systems Engineering Management Plan (SEMP)?

  1. The SEMP covers all management functions associated with the performance of security systems engineering activities for a given program.
  2. It starts as an outline and is updated as the security system development process goes on.
  3. It contains detailed statements of how the systems security engineering functions are to be carried out during development.
  4. The SEMP is a static document, intended to remain unchanged.


54. 

Which choice best describes an outsourced supplier?

  1. A broad class of external organizations that provide products, components, materials, and/or services to a producer or prime contractor
  2. An interacting combination of elements that are viewed in relation to function
  3. An integrated composite of people, products, and processes that provides a capability to satisfy a need or objective
  4. Practices that indicate process management and institutionalization capability


55. 

Which of the following statements best describes the main premise of process improvement?

  1. Major changes must be sponsored by senior management.
  2. The quality of services produced is a direct function of the quality of the associated development and maintenance processes.
  3. Focus on fixing the process, not assigning blame.
  4. All suppliers must be security vetted prior to contracting.


56. 

What is the main purpose of the Work Breakdown Structure (WBS)?

  1. It creates a hierarchical tree of work packages.
  2. It may be a contractual requirement in competitive bid system developments.
  3. It ensures the authorization for the program work.
  4. It ensures that all essential tasks are properly defined, assigned, scheduled, and controlled.


57. 

Which choice is not an activity in the Development Program Planning and Control element of the SEMP?

  1. System Test and Evaluation Strategy
  2. Scheduling and Cost Estimation
  3. Technical Performance Measurement
  4. Statement of Work


58. 

At what point in the project is the Work Breakdown Structure (WBS) usually created?

  1. After the generation of the SOW and the identification of the organizational structure
  2. After the development of a functional cost data collection and reporting capability
  3. After the costs for each task are estimated
  4. After the development of an RFP but before the identification of the organizational structure


59. 

Which choice accurately lists the five levels of security engineering maturity as defined by the SSE-CMM?

  1. Planned and Tracked, Well Defined, Performed Informally, Quantitatively Controlled, and Continuously Improving
  2. Planned and Tracked, Performed Informally, Well Defined, Quantitatively Controlled, and Continuously Improving
  3. Performed Informally, Planned and Tracked, Well Defined, Quantitatively Controlled, and Continuously Improving
  4. Performed Informally, Planned and Tracked, Quantitatively Controlled, Well Defined, and Continuously Improving


60. 

Which choice has the correct order of activities in the security system design testing process?

  1. Acquisition, Testing, Analysis, Planning, and Correction
  2. Acquisition, Planning, Testing, Analysis, and Correction
  3. Planning, Analysis, Testing, Acquisition, and Correction
  4. Planning, Acquisition, Testing, Analysis, and Correction

Certification and Accreditation

See Chapter 11 Assessment Questions.

U.S. Government Information Assurance Regulations

Some material in these questions is also covered in Chapter 12.


61. 

Techniques and concerns that are normally addressed by management in the organization’s computer security program are defined in NIST SP 800-12 as:

  1. Administrative controls
  2. Management controls
  3. Operational controls
  4. Technical controls


62. 

The National Research Council publication Computers at Risk defines an element of computer security as a “requirement intended to assure that systems work properly and service is not denied to authorized users.” Which one of the following elements best fits this definition?

  1. Availability
  2. Assurance
  3. Integrity
  4. Authentication


63. 

NSTISSI Publication No. 4009, “National Information Systems Security (INFOSEC) Glossary,” defines the term assurance as:

  1. Requirement that information and programs are changed only in a specified and authorized manner
  2. Measure designed to establish the validity of a transmission, message, or originator, or a means of verifying an individual’s authorization to receive specific categories of information
  3. Measure of confidence that the security features, practices, procedures, and architecture of an IS accurately mediate and enforce the security policy
  4. Requirement that private or confidential information not be disclosed to unauthorized individuals


64. 

The “National Information Systems Security (INFOSEC) Glossary” defines an information system security term as a “formal determination by an authorized adjudicative office that an individual is authorized access, on a need to know basis, to a specific level of collateral classified information.” This definition refers to which one of the following terms?

  1. Sensitivity of information
  2. Classification of information
  3. Clearance
  4. Compartmentalization


65. 

In NSTISSI Publication No. 4009, what term is defined as a “document detailing the method, act, process, or effect of using an information system (IS)”?

  1. QUADRANT
  2. Concept of Operations (CONOPS)
  3. Evaluation Assurance Level (EAL)
  4. Information Assurance (IA) architecture


66. 

Which one of the following definitions best describes the National Information Assurance Partnership (NIAP) according to NSTISSI Publication No. 4009?

  1. Nationwide interconnection of communications networks, computers, databases, and consumer electronics that makes vast amounts of information available to users
  2. Worldwide interconnections of the information systems of all countries, international and multinational organizations, and international commercial communications
  3. Joint initiative between NSA and NIST responsible for security testing needs of both IT consumers and producers, promoting the development of technically sound security requirements for IT products
  4. First level of the PKI Certification Management Authority that approves the security policy of each Policy Certification Authority (PCA)


67. 

TEMPEST refers to which one of the following definitions?

  1. Property whereby the security level of an object cannot change while the object is being processed by an IS
  2. Investigation, study, and control of compromising emanations from IS equipment
  3. Program established for a specific class of classified information that imposes safeguarding and access requirements that exceed those normally required for information at the same classified level
  4. Unclassified cryptographic equipment


68. 

Executive Order (E.O.) 13231, issued on October 16, 2001, renamed the National Security Telecommunications and Information Systems Security Committee (NSTISSC) as which one of the following committees?

  1. Committee for Information Systems Security (CISS)
  2. Committee on National Security Systems (CNSS)
  3. Committee on National Infrastructure Protection (CNIP)
  4. Committee for the Protection of National Information Systems (CPNIS)

Answer: b. The other answers are distracters.

69. 

In addressing the security of systems with national security information, E.O. 13231 assigned the responsibilities of developing government-wide policies and overseeing the implementation of government-wide policies, procedures, standards, and guidelines to the:

  1. U.S. Secretary of Defense and the Director of the FBI
  2. FBI and the Director of Central Intelligence
  3. NIST and the U.S. Secretary of Defense
  4. U.S. Secretary of Defense and the Director of Central Intelligence


70. 

Which one of the following characteristics is not associated with the definition of a national security system?

  1. Contains classified information
  2. Involved in industrial commerce
  3. Supports intelligence activities
  4. Involved with the command and control of military forces


71. 

In 2002, the U.S. Congress enacted the E-Government Act (Public Law 107-347). Title III of the E-Government Act was written to provide for a number of protections of Federal information systems, including to “provide a comprehensive framework for ensuring the effectiveness of information security controls over information resources that support Federal operations and assets.” Title III of the E-Government Act is also known as the:

  1. Computer Security Act (CSA)
  2. Computer Fraud and Abuse Act (CFAA)
  3. Federal Information Security Management Act (FISMA)
  4. Cyber Security Enhancement Act


72. 

FISMA assigned which one of the following entities the responsibility of overseeing the security policies and practices of U.S. government agencies?

  1. The FBI
  2. The U.S. Secretary of Defense
  3. The Director of the Office of Management and Budget (OMB)
  4. The Director of Central Intelligence


73. 

Which information system security–related Act requires government agencies to perform periodic assessments of risk, develop policies and procedures that are based on risk assessments, conduct security awareness training, perform periodic testing and evaluation of the effectiveness of information security policies, and implement procedures for detecting, reporting, and responding to security incidents?

  1. Computer Security Act (CSA)
  2. Federal Information Security Management Act (FISMA)
  3. Computer Fraud and Abuse Act (CFAA)
  4. Cyber Security Enhancement Act


74. 

FISMA charged which one of the following entities to develop information system security standards and guidelines for federal agencies?

  1. FBI
  2. DoD
  3. NSA
  4. NIST


75. 

The general formula for categorization of an information type developed in FIPS Publication 199, “Standards for Security Categorization of Federal Information and Information Systems,” is which one of the following?

  1. SC information type = {(confidentiality, risk), (integrity, risk), (availability, risk)}
  2. SC information type = {(confidentiality, impact), (integrity, impact), (availability, impact)}
  3. SC information type = {(assurance, impact), (integrity, impact), (authentication, impact)}
  4. SC information type = {(confidentiality, controls), (integrity, controls), (availability, controls)}

Answer: b. The other answers are distracters.
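
A short worked sketch may make the categorization formula in answer b more concrete. The Python fragment below is only an illustration; the function names, the sample impact values, and the use of the high-water-mark convention for the overall level are assumptions for this example, not text from FIPS 199:

    # Illustrative sketch of FIPS 199-style security categorization.
    # Names and sample values are invented for this example.
    IMPACT_ORDER = {"low": 1, "moderate": 2, "high": 3}

    def categorize(confidentiality, integrity, availability):
        """Return the security category (SC) triple for an information type."""
        return {"confidentiality": confidentiality,
                "integrity": integrity,
                "availability": availability}

    def high_water_mark(security_category):
        """Overall impact level, using the common high-water-mark convention."""
        return max(security_category.values(), key=lambda level: IMPACT_ORDER[level])

    # A hypothetical information type with a moderate integrity impact:
    sc = categorize(confidentiality="low", integrity="moderate", availability="low")
    print(sc)                   # {'confidentiality': 'low', 'integrity': 'moderate', 'availability': 'low'}
    print(high_water_mark(sc))  # moderate

The same low, moderate, and high impact levels are the ones referenced in questions 88 and 89.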

76. 

Circular A-130 directs that an oversight function should be performed consisting of the use of information technology planning reviews, fiscal budget reviews, information collection budget reviews, management reviews, and such other measures as deemed necessary to evaluate the adequacy and efficiency of each agency’s information resources management and compliance with the circular. Which one of the following individuals does the circular designate as being responsible for this oversight function?

  1. The Secretary of Commerce
  2. The Director of the Office of Management and Budget
  3. The U.S. Secretary of Defense
  4. The Director of NSA


77. 

The National Computer Security Center Publication NCSC-TG-004-88 includes a definition that refers to the characteristic of a system that “performs its intended function in an unimpaired manner, free from deliberate, inadvertent, or unauthorized manipulation of the system.” This characteristic defines which one of the following terms?

  1. Data integrity
  2. System integrity
  3. Enterprise integrity
  4. Risk integrity


78. 

Which one of the following terms best describes a secure telecommunications or associated cryptographic component that is unclassified but governed by a special set of control requirements, as defined in NSTISSI Publication 4009?

  1. Controlled cryptographic item (CCI) assembly
  2. Controlled cryptographic item (CCI) component
  3. Controlled cryptographic item (CCI)
  4. Crypto-ignition key (CIK)


79. 

What is a definable perimeter encompassing all hardware, firmware, and software components performing critical COMSEC functions, such as key generation and key handling and storage?

  1. COMSEC area
  2. COMSEC compartment
  3. COMSEC partition
  4. COMSEC boundary

Answer: d. Answers a, b, and c are distracters.

80. 

What process involves the five steps of identification of critical information, analysis of threats, analysis of vulnerabilities, assessment of risks, and application of appropriate countermeasures?

  1. Operations security
  2. Application security
  3. Administrative security
  4. Management security

Answer: a. The other answers are distracters.

81. 

Information that has been determined pursuant to Executive Order 12958 or any predecessor order to require protection against unauthorized disclosures is known as:

  1. Protected information (PI)
  2. National security information (NSI)
  3. Personally identifiable information (PII)
  4. Secure information (SI)


82. 

An area that, when staffed, must be occupied by two or more appropriately cleared individuals who remain within sight of each other is referred to as which one of the following terms?

  1. No-lone zone
  2. Restricted area
  3. Protected occupancy zone
  4. Cleared area

Answer: a. Answers b, c, and d are distracters.

83. 

According to NSTISSI Publication 4009, the process of identifying and applying countermeasures commensurate with the value of the assets protected based on a risk assessment is called a:

  1. Vulnerability assessment
  2. Continuity planning
  3. Risk management
  4. Risk control


84. 

In the context of information systems security, the abbreviation ST&E stands for which one of the following terms?

  1. Security training and evaluation
  2. Security test and evaluation
  3. Security test and engineering
  4. Sensitivity test and evaluation

Answer: b. Answers a, c, and d are distracters.

85. 

Which one of the following designations refers to a product that is a classified or controlled cryptographic item endorsed by the NSA for securing classified and sensitive U.S. government information when appropriately keyed?

  1. Cleared product
  2. Type 3 product
  3. Type 1 product
  4. Type 2 product


86. 

Which one of the following items is not one of the responsibilities of the Committee on National Security Systems (CNSS) for the security of national security systems?

  1. Providing a forum for the discussion of policy issues
  2. Setting national policy
  3. Providing operational procedures, direction, and guidance.
  4. Requiring agencies to identify and provide information security protections commensurate with the risk and magnitude of the harm to information or information systems of government agencies

Answer: d. This responsibility is assigned to the OMB.

87. 

FISMA, Title III of the E-Government Act of 2002, reserves the responsibility for standards associated with the national defense establishment to which of the following entities?

  1. DoD and NSA
  2. DoD and CIA
  3. CIA and NSA
  4. CIA and NIST


88. 

FIPS Publication 199, “Standards for Security Categorization of Federal Information and Information Systems,” NIST Pre-Publication Final Draft, December 2003, characterizes three levels of potential impact on organizations or individuals based on the objectives of confidentiality, integrity, and availability. What is the level of impact specified in Publication 199 for the following description of integrity: “The unauthorized modification or destruction of information could be expected to have a serious adverse effect on organizational operations, organizational assets, or individuals”?

  1. High
  2. Moderate
  3. Low
  4. Severe


89. 

Referring to question 88, the following impact description refers to which one of the three security objectives and which corresponding level of impact: “The disruption of access to or use of information or an information system could be expected to have a limited adverse effect on organizational operations, organizational assets, or individuals”?

  1. Confidentiality - Low
  2. Availability - Moderate
  3. Availability - Low
  4. Availability - High


90. 

DoD Directive 8500.1, “Information Assurance (IA),” October 4, 2002, specifies a defense-in-depth approach that integrates the capabilities of which set of the following entities?

  1. Personnel, operations, and technology
  2. Personnel, research and development, and technology
  3. Operations, resources, and technology
  4. Personnel, operations, and resources

Answer: a. Answers b, c, and d are distracters.

Answers

1. 

Answer: b

The requirements phase is not one of the five system life cycle planning phases. The other two phases of the system life cycle are the Development/Acquisition phase and the Operation/Maintenance phase.

2. 

Answer: a

Answers b, c, and d are distracters comprising components of the SDLC and the Acquisition Cycle.

3. 

Answer: d

Answers a, b, and c are distracters.

4. 

Answer: d

A program policy is used to create and define a computer security program, an issue-specific policy addresses specific areas and issues, and a system-specific policy focuses on decisions made by management.

5. 

Answer: a

Answers b, c, and d are distracters.

6. 

Answer: c

7. 

Answer: b

Answer a refers to the Initiation phase, answer c refers to the Operation/Maintenance phase, and answer d refers to the Disposal phase.

8. 

Answer: a

Answers b, c, and d are distracters.

9. 

Answer: d

Delineating accreditation boundaries is a subset of system characterization (answer a).

10. 

Answer: a

Answer b describes the principles and practices found in NIST SP 800-14. Answers c and d are distracters.

11. 

Answer: c

Obtaining accreditation is not one of the SE process activities. The other SE process activities are to design system architecture, develop detailed design, and implement system.

12. 

Answer: b

13. 

Answer: d

This task is performed under the Develop Detailed Security Design activity.

14. 

Answer: d

Answers a through c are distracters.

15. 

Answer: a

Functional decomposition is part of the Design System Security Architecture activity of the ISSE process.

16. 

Answer: b

17. 

Answer: c

18. 

Answer: a

Security certification is performed in the Implementation phase. Validation (answer b) is not a phase of the SDLC. Answers c and d are additional phases of the SDLC. This activity has tasks that should be performed throughout the ISSE process.

19. 

Answer: d

Answers A, B, and C are outputs of the Certification and Accreditation process.

20. 

Answer: d

Defense of the enclave boundary is addressed in the Defense-In-Depth strategy.

21. 

Answer: a

22. 

Answer: a

Security categorization, performed in the Initiation phase, defines low, moderate, or high levels of potential impact on organizations in the event of a security breach. Answers b and c are other phases of the SDLC. Answer d is not a phase of the SDLC.

23. 

Answer: d

Answer d is a distracter. The other two types of attacks, in addition to passive attacks (answer a), active attacks (answer b), and close-in attacks (answer c), are insider and distribution attacks.

24. 

Answer: a

Preliminary risk assessment is performed in the Initiation phase of the SDLC. Additional activities under the Acquisition/Development phase of the SDLC are risk assessment, security assurance requirements analysis, security planning, and security control development.

25. 

Answer: d

26. 

Answer: b

Additional activities under the Acquisition/Development phase include requirements analysis, alternatives analysis, and a software conversion study.

27. 

Answer: b

The spiral model depicts the acquisition management process as a set of phases and decision points in a circular representation. The other answers are distracters.

28. 

Answer: c

Answer a is a threat source, answer b defines risk, and answer d is the definition of vulnerability.
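
Because answer b of this question describes risk as a function of threat likelihood and impact, a small numeric sketch may help. The values and thresholds below approximate the qualitative risk-level matrix in NIST SP 800-30; the function name and the specific weights shown are assumptions for this illustration rather than normative requirements:

    # Illustrative sketch of a qualitative risk-level determination in the
    # spirit of NIST SP 800-30; weights and thresholds are example assumptions.
    LIKELIHOOD = {"low": 0.1, "moderate": 0.5, "high": 1.0}
    IMPACT = {"low": 10, "moderate": 50, "high": 100}

    def risk_level(likelihood, impact):
        """Combine a likelihood rating and an impact rating into a risk level."""
        score = LIKELIHOOD[likelihood] * IMPACT[impact]
        if score > 50:
            return "high"
        if score > 10:
            return "moderate"
        return "low"

    # A high-likelihood threat exercising a vulnerability with moderate impact:
    print(risk_level("high", "moderate"))  # moderate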

29. 

Answer: a

30. 

Answer: b

Technical Management

31. 

Answer: c

The SSE-CMM defines two dimensions that are used to measure the capability of an organization to perform specific activities, the domain dimension and the capability dimension. The domain dimension consists of all the practices that collectively define security engineering. The capability dimension represents practices that indicate process management and institutionalization capability.

32. 

Answer: b

Level 5, “Continuously Improving,” is the highest level. A statement characterizing this level would be: “A culture of continuous improvement requires a foundation of sound management practice, defined processes, and measurable goals.”

33. 

Answer: c

In the Defense sector, a TEMP is required for most large programs and includes the planning and implementation of procedures for the Development Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E). DT&E basically equates to the Analytical, Type 1, and Type 2 testing, and OT&E is equivalent to Type 3 and Type 4 testing.

34. 

Answer: a

The lowest level, Level 1, “Performed Informally,” focuses on whether an organization or project performs a process that incorporates the BPs. The attribute of this level simply requires that the BPs are performed.

35. 

Answer: a

Cost control starts with the initial development of cost estimates for the program and continues with the functions of cost monitoring, the collection of cost data, the analysis of the data, and the immediate initiation of corrective action. Cost control requires good overall cost management, including:

  • Cost estimating
  • Cost accounting
  • Cost monitoring
  • Cost analysis and reporting
  • Control functions

36. 

Answer: d

The SSE-CMM, not the SE-CMM, goal is to define, improve, and assess security engineering capability. The SE-CMM goal is to improve the system or product engineering process. The SE-CMM describes the essential elements of an organization’s systems engineering process that must exist in order to ensure good systems engineering. It also provides a reference to compare existing systems engineering practices against the essential systems engineering elements described in the model.

37. 

Answer: d

Testing and evaluation processes often involve several stages of testing categories or phases, such as:

  1. Analytical - Design evaluations conducted early in the system life cycle using computerized techniques such as CAD, CAM, CALS, simulation, rapid prototyping, and other related approaches
  2. Type 1 testing - The evaluation of system components in the laboratory using bench test models and service test models, designed to verify performance and physical characteristics
  3. Type 2 testing - Testing performed during the latter stages of the detail design and development phase, when preproduction prototype equipment and software are available
  4. Type 3 testing - Tests conducted after initial system qualification and prior to the completion of the production or construction phase, the first time that all elements of the system are operated and evaluated on an integrated basis
  5. Type 4 testing - Testing conducted during the system operational use and life-cycle support phase, intended to gain further knowledge of the system in the user environment

38. 

Answer: a

Answer a is an activity of outsourcing. The cost control process includes:

  1. Define the elements of work, as extracted from the SOW
  2. Integrate the tasks defined in the WBS
  3. Develop the costs as estimated for each task
  4. Develop a functional cost data collection and reporting capability
  5. Develop a procedure for evaluation and quick corrective action
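
To make the data-collection and quick-corrective-action steps above more tangible, the following fragment is a hypothetical sketch; the task names, dollar figures, and tolerance are invented for illustration. It compares the actual cost collected for each task against its estimate and flags overruns for corrective action:

    # Hypothetical cost-control sketch: flag tasks whose actual cost
    # exceeds the estimate by more than an allowed tolerance.
    tasks = {
        "1.1 Requirements analysis": {"estimate": 40_000, "actual": 38_500},
        "1.2 Security design":       {"estimate": 65_000, "actual": 71_200},
        "1.3 Integration testing":   {"estimate": 30_000, "actual": 30_900},
    }

    def overruns(task_costs, tolerance=0.05):
        """Return tasks whose cost variance exceeds the tolerance."""
        flagged = {}
        for name, cost in task_costs.items():
            variance = (cost["actual"] - cost["estimate"]) / cost["estimate"]
            if variance > tolerance:
                flagged[name] = round(variance, 3)
        return flagged

    print(overruns(tasks))  # {'1.2 Security design': 0.095}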

39. 

Answer: b

Developing a functional cost reporting capability is a function of Cost Control. The activities of the outsourcing process, in order, are:

  1. Identification of Potential Suppliers
  2. Development of a Request for Proposal (RFP)
  3. Review and Evaluation of Supplier Proposals
  4. Selection of Suppliers and Contract Negotiation
  5. Supplier Monitoring and Control

40. 

Answer: b

The WBS structure generally includes three levels of activity:

  • Level 1 identifies the entire program scope of work to be produced and delivered. Level 1 may be used as the basis for the authorization for the program work.
  • Level 2 identifies the various projects, or categories of activity, that must be completed in response to program requirements. Program budgets are usually prepared at this level.
  • Level 3 identifies the activities, functions, major tasks, and/or components of the system that are directly subordinate to the Level 2 items. Program schedules are generally prepared at this level.
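
As a hypothetical illustration of how these three levels nest (the element names and numbering below are invented, not taken from any referenced standard), a WBS can be represented as a simple tree:

    # Hypothetical three-level Work Breakdown Structure as nested structures:
    # Level 1 = entire program, Level 2 = projects, Level 3 = tasks/work packages.
    wbs = {
        "1.0 Security System Program": {
            "1.1 Requirements Project": [
                "1.1.1 Define information protection needs",
                "1.1.2 Define system security requirements",
            ],
            "1.2 Design Project": [
                "1.2.1 Design system security architecture",
                "1.2.2 Develop detailed security design",
            ],
        }
    }

    def level3_tasks(structure):
        """Flatten the WBS into its Level 3 work packages."""
        return [task
                for projects in structure.values()
                for tasks in projects.values()
                for task in tasks]

    print(level3_tasks(wbs))  # lists the four Level 3 work packages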

41. 

Answer: a

The five phases of the IDEAL model are:

  • Initiating - Laying the groundwork for a successful improvement effort
  • Diagnosing - Determining where you are relative to where you want to be
  • Establishing - Planning the specifics of how you will reach your destination
  • Acting - Doing the work according to the plan
  • Learning - Learning from the experience and improving your ability

42. 

Answer: b

The definition of systems engineering on which the SE-CMM is based is defined as the selective application of scientific and engineering efforts to:

  • Transform an operational need into a description of the system configuration that best satisfies the operational need according to the measures of effectiveness
  • Integrate related technical parameters and ensure compatibility of all physical, functional, and technical program interfaces in a manner that optimizes the total system definition and design
  • Integrate the efforts of all engineering disciplines and specialties into the total engineering effort

Answer a describes a system, answer c describes the SOW, and answer d describes outsourcing.

43. 

Answer: d

The WBS provides many benefits, such as:

  • It provides for the reporting of system technical performance measures (TPMs).
  • The entire security system can be easily defined by the breakout of its elements into discrete work packages.
  • The WBS aids in linking objectives and activities with available resources.
  • The WBS facilitates budgeting and cost reporting.
  • Responsibility assignments can be readily identified through the assignment of tasks.
  • The WBS provides a greater probability that every activity will be accounted for.

Answer d describes a benefit of systems engineering.

44. 

Answer: c

The Statement of Work (SOW) is a narrative description of the work required for a given project. It includes:

  • Summary statement of the tasks to be accomplished
  • Identification of the input requirements from other tasks, including tasks accomplished by the customer and supplier
  • References to applicable specifications, standards, procedures, and related documentation
  • Description of specific results to be achieved and proposed schedule of delivery

Answer c is an example of an SSE-CMM Base Practice.

45. 

Answer: a

Testing and evaluation processes often involve several stages of testing categories or phases, such as:

  1. Analytical - Design evaluations conducted early in the system life cycle using computerized techniques such as CAD, CAM, CALS, simulation, rapid prototyping, and other related approaches
  2. Type 1 testing - The evaluation of system components in the laboratory using bench test models and service test models, designed to verify performance and physical characteristics
  3. Type 2 testing - Testing performed during the latter stages of the detail design and development phase, when preproduction prototype equipment and software are available
  4. Type 3 testing - Tests conducted after initial system qualification and prior to the completion of the production or construction phase, the first time that all elements of the system are operated and evaluated on an integrated basis
  5. Type 4 testing - Testing conducted during the system operational use and life-cycle support phase, intended to gain further knowledge of the system in the user environment

46. 

Answer: c

47. 

Answer: a

The Statement of Work is a narrative description of the work required for a given project. Answer b describes a “system” as defined by the SE-CMM, answer c describes outsourcing, and answer d describes a function of Cost Control.

48. 

Answer: a

BPs are mandatory characteristics that must exist within an implemented security engineering process before an organization can claim satisfaction in a given PA. The GPs are ordered in degrees of maturity and are grouped to form and distinguish among five levels of security engineering maturity. The other answers are distracters.

49. 

Answer: b

In the SE-CMM, a system is defined as:

  • An integrated composite of people, products, and processes that provide a capability to satisfy a need or objective
  • An assembly of things or parts forming a complex or unitary whole; a collection of components organized to accomplish a specific function or set of functions
  • An interacting combination of elements that are viewed in relation to function

Answer b describes process improvement.

50. 

Answer: b

The Learning phase is both the final stage of the initial process improvement cycle and the initial phase of the next process improvement effort. Based on the analysis of the improvement effort itself, the lessons learned are translated into recommendations for improving subsequent improvement efforts. Answer a describes the Acting phase, answer c describes the Diagnosing phase, and answer d describes the Establishing phase.

51. 

Answer: b

The SEMP contains detailed statements of how the systems security engineering functions are to be carried out during development. Two elements of the SEMP are:

  • Development program planning and control
  • Security systems engineering process

Answer b describes a goal of process improvement.

52. 

Answer: d

The order of activities in the IDEAL model is Initiating, Diagnosing, Establishing, Acting, and Learning.

53. 

Answer: d

The SEMP is intended to be a dynamic document. It starts as an outline, is updated as the security system development effort proceeds, and contains detailed statements of how the systems security engineering functions are to be carried out during development. The SEMP covers all management functions associated with the performance of security systems engineering activities for a given program.

54. 

Answer: a

The term suppliers is defined here as a broad class of external organizations that provide products, components, materials, and/or services to a producer or prime contractor. Answers b and c describe a system, and answer d is a distracter.

55. 

Answer: b

The basic premise of process improvement is that the quality of services produced is a direct function of the quality of the associated development and maintenance processes. Answers a and c describe knowledge or assumptions required to implement a successful security engineering process improvement activity, but not the main premise. Answer d is a distracter.

56. 

Answer: d

The Work Breakdown Structure (WBS) is an important technique to ensure that all essential tasks are properly defined, assigned, scheduled, and controlled. It contains a hierarchical structure of the tasks to be accomplished during the project. The WBS may be a contractual requirement in competitive bid system developments. Answers a, b, and c are attributes of the WBS, not its main purpose.

57. 

Answer: a

Development Program Planning and Control describes the systems security engineering tasks that must be implemented to manage the development phase of the security program, including:

  • Statement of Work
  • Organizational Structure
  • Scheduling and Cost Estimation
  • Technical Performance Measurement

Answer a is an activity of the Security Systems Engineering Process element of the SEMP.

58. 

Answer: a

After the generation of the SOW and the identification of the organizational structure, one of the initial steps in program planning is the development of the Work Breakdown Structure (WBS). The other answers are distracters.

59. 

Answer: c

The five levels are: Level 1, Performed Informally; Level 2, Planned and Tracked; Level 3, Well Defined; Level 4, Quantitatively Controlled; and Level 5, Continuously Improving.
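
Because the levels form an ordered maturity scale, they can be modeled directly as an ordered enumeration. A minimal Python sketch, illustrative only and not part of the SSE-CMM itself:

from enum import IntEnum

# The five SSE-CMM capability levels, ordered by increasing maturity.
class CapabilityLevel(IntEnum):
    PERFORMED_INFORMALLY = 1
    PLANNED_AND_TRACKED = 2
    WELL_DEFINED = 3
    QUANTITATIVELY_CONTROLLED = 4
    CONTINUOUSLY_IMPROVING = 5

# IntEnum preserves numeric ordering, so maturity comparisons are direct.
assert CapabilityLevel.WELL_DEFINED > CapabilityLevel.PLANNED_AND_TRACKED

for level in CapabilityLevel:
    print(f"Level {level.value}: {level.name.replace('_', ' ').title()}")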

60. 

Answer: d

The correct order of activities in the security system design testing process is Planning, Acquisition, Testing, Analysis, and Correction.

Certification and Accreditation

See Chapter 11 Assessment Questions.

U.S. Government Information Assurance Regulations

61. 

Answer: b

Answer a is a distracter. Operational controls (answer c) are security controls that are usually implemented by people instead of systems, and technical controls (answer d) are security controls that the computer system executes.

62. 

Answer: a

63. 

Answer: c

Answer a is a definition of data integrity, answer b defines authentication, and answer d describes confidentiality.

64. 

Answer: c

Answers a and b are distracters. Answer d refers to a “nonhierarchical grouping of sensitive information used to control access to data more finely than with hierarchical security classification alone,” as defined in NSTISSI Publication No. 4009.

65. 

Answer: b, Concept of Operations

Answer a, QUADRANT, refers to technology that provides tamper-proof protection to cryptographic equipment. Answer c defines “a set of assurance requirements that represent a point on the Common Criteria predefined assurance scale,” and answer d is a “framework that assigns and portrays IA roles and behavior among all IT assets, and prescribes rules for interaction and connection.”

66. 

Answer: c

Answer a refers to the National Information Infrastructure (NII), answer b defines the Global Information Infrastructure (GII), and answer d defines a Policy Approving Authority (PAA).

67. 

Answer: b

Answer a refers to the concept of Tranquility, answer c refers to a Special Access Program (SAP), and answer d is a distracter.

68. 

Answer: b

The other answers are distracters.

69. 

Answer: d

70. 

Answer: b

Additional characteristics of a national information system include employing cryptographic activities related to national security, incorporating equipment that is an integral part of a weapon or weapons system, and being critical to the direct fulfillment of military or intelligence missions.

71. 

Answer: c

72. 

Answer: c

The Director of the Office of Management and Budget (OMB) has the responsibility of overseeing government agency security policies and practices. Standards associated with national defense are still the responsibility of the DoD and NSA.

73. 

Answer: b

74. 

Answer: d

75. 

Answer: b

The other answers are distracters.

76. 

Answer: b

77. 

Answer: b

78. 

Answer: c

Answer a refers to a device embodying a communications security (COMSEC) design that NSA has approved as a CCI. Answer b is part of a CCI that does not perform the entire COMSEC function but depends upon the host equipment, or assembly, to complete and operate the COMSEC function. Answer d is a device or electronic key used to unlock the secure mode of crypto-equipment.

79. 

Answer: d

Answers a, b, and c are distracters.

80. 

Answer: a

The other answers are distracters.

81. 

Answer: b

Answers a and d are distracters. PII (answer c) is usually associated with privacy. An example of PII is a person’s health care information.

82. 

Answer: a

Answers b, c, and d are distracters.

83. 

Answer: c

84. 

Answer: b

Answers a, c, and d are distracters.

85. 

Answer: c

Answers a and b are distracters. Answer d, a Type 2 product, defines unclassified cryptographic equipment, assembly, or component endorsed by the NSA for use in national security systems as defined in Title 40 U.S.C. § 1452.

86. 

Answer: d

This responsibility is assigned to the OMB.

87. 

Answer: a

88. 

Answer: b

89. 

Answer: c

90. 

Answer: a

Answers b, c, and d are distracters.


