As noted in Chapter 10, the system overview is not intended to divulge much about the system's architecture but rather to ground a new reader in the system's background. If such an overview exists in the overall project documentation, as it well may, the architect's obligation for this information is satisfied by referring the reader to that overview.
Mission to Planet Earth (MTPE) is a long-term, multi- and interdisciplinary NASA research mission to study the processes leading to global climate change and to develop a predictive capability for the Earth system on time scales of decades to centuries. To accomplish these objectives, researchers require a readily accessible collection of diverse observations of Earth over an extended period of time, with the capability to create and add new data products to this collection based on improved understanding. MTPE aims to study not only the disciplinary sciences of the atmosphere, oceans, cryosphere, biosphere, and solid Earth but also the interdisciplinary interactions among these often disparate realms of study. This is necessary for developing a predictive capability for modeling the Earth system as the scientific basis for global environmental policy.
MTPE consists of three major components:
The EOSDIS will provide a broader community of users with a unique resource for enhancing their understanding of global change issues and for acquiring data for use in other applications.
The EOSDIS Core System (ECS) is the major component of the EOSDIS and the subject of this document. The ECS will control the EOS spacecraft and instruments, process data from the EOS instruments, and manage and distribute EOS data products and other selected data sets and updated NASA/SPSO Product List Tables. Interoperating with other data systems maintained by government agencies and the research community, the ECS will provide comprehensive services for accessing Earth science data.
The ECS is part of a much larger data collection and research enterprise. The ECS development process provides flexibility to accommodate changes in user needs and to incorporate new data system technologies while also satisfying cost and schedule constraints.
2.2 EOS Mission Science Data Flow
To understand the ECS, it helps to understand how data moves throughout the entire system of which ECS is a part. In general, NASA satellites will transmit their data through the Tracking and Data Relay Satellites (TDRS), which will forward the data to the receiving station at White Sands, New Mexico. From there, the data will be transmitted via dedicated circuits to the new Fairmont Complex in West Virginia, where the data will be processed to recover the raw instrument data. International Partner satellites downlink directly to the International Partner Ground Systems (IPGS) via their ground receiving stations. Data from NASA instruments on the International Partner platforms will be transmitted to Fairmont via commercial networks. Landsat 7 downlinks data directly to the EDC and to the IPGS.
The data from each instrument will be sent from Fairmont (part of EDOS) to a Distributed Active Archive Center (DAAC). These seven data centers will house the ECS computing facilities and operational staff needed to produce EOS Standard Products and to manage and store EOSDIS data as well as the associated metadata and browse data required for effective use of the data holdings. The DAACs will exchange data via dedicated EOSDIS networks to support processing at one DAAC that requires data from another DAAC. DAACs will provide the facilities, management, and operations support for the production, archiving, and distribution of EOS Standard Products. At the DAACs, users can expect a level of service that would be difficult to maintain in a single data center attempting to serve the extraordinarily wide range of disciplines encompassed by the EOS program. It is important that a user interacting with any given DAAC be able to access data from all the DAACs. The DAACs also house systems for processing and/or storing non-EOS Earth science data.
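The federated access described above, in which a product archived and described at one DAAC is discoverable from any other, can be sketched in a few lines. The sketch is purely illustrative: the class names, the metadata fields, and the search interface are assumptions for exposition, not the actual ECS design.

```python
from dataclasses import dataclass


@dataclass
class ProductRecord:
    """An archived data product plus the metadata and browse data
    needed to find and preview it (field names are illustrative)."""
    product_id: str
    instrument: str
    metadata: dict      # searchable attributes (discipline, time, region, ...)
    browse_ref: str     # pointer to a reduced-resolution preview image
    home_daac: str      # the DAAC holding the full product


class DAAC:
    """A data center that archives products and shares its metadata
    catalog so users at any DAAC can search all holdings."""

    def __init__(self, name, federation):
        self.name = name
        self.holdings = []
        self.federation = federation
        federation.append(self)     # join the shared federation

    def archive(self, record):
        record.home_daac = self.name
        self.holdings.append(record)

    def search(self, **criteria):
        """Search every DAAC in the federation, not just this one."""
        hits = []
        for daac in self.federation:
            for rec in daac.holdings:
                if all(rec.metadata.get(k) == v for k, v in criteria.items()):
                    hits.append(rec)
        return hits


# Usage: a product archived at one DAAC is visible from another.
federation = []
goddard = DAAC("GSFC", federation)
eros = DAAC("EDC", federation)
goddard.archive(ProductRecord("MOD09", "MODIS",
                              {"discipline": "land"}, "browse/MOD09.png", ""))
print([r.product_id for r in eros.search(discipline="land")])  # ['MOD09']
```

The point of the sketch is the last line: the query is issued at one data center but resolves against the shared metadata of all of them, which is what lets a single DAAC present the entire EOS holdings to its users.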
Flight operations, for both spacecraft and instruments, will be conducted from the EOS Operations Center (EOC). Non-U.S. instruments on U.S. platforms will be operated and monitored through International Partner (IP) Instrument Control Centers (ICCs).
Most science users will access EOS data products via shared networks. Open access to the data by all members of the science community distinguishes the EOS from previous research satellite projects, on which selected investigators have had proprietary data rights for a number of years after data acquisition.
Scientific Computing Facilities (SCFs), located at EOS investigators' home institutions, are used to develop and to maintain algorithms for both Standard and Special Products, calibrate the EOS instruments, validate data and algorithms, generate Special Products, provide data and services to other investigators, and analyze EOS and other data in pursuit of the MTPE science objectives. The SCFs may range from single workstations to large supercomputer data centers. Whereas the SCFs will be developed and acquired directly by the EOS investigators, the ECS will provide software toolkits to the SCFs and other users to facilitate data access, transformation and visualization, and algorithm development. Some SCFs will play an operational role in quality control of the EOS Standard Products; these SCFs will be linked to the DAACs via guaranteed service quality communications lines to support reliable exchange of large volumes of data.
Comprehensive understanding of Earth system processes requires data from a diverse range of sensors. Field campaign and other in situ data will be contributed by NOAA and by the scientific users. Remote-sensing data from the EOS instruments will be supplemented by measurements from operational sensors, most notably on satellites operated by NOAA. Some data centers will interoperate with the EOSDIS, allowing the DAACs and their users to search data inventories, much as if the data resided at one of the DAACs. Other data centers will not necessarily interoperate with the EOSDIS but may provide data for the EOS science investigations.
2.3 Broad Requirements for ECS
ECS must satisfy the following broad requirements:
About 1.5 terabytes of raw data per day flow into the system: data about vegetation, lightning, surface images, atmospheric soundings, trace gas quantities, atmospheric dynamics and chemistry, ice topography, sea surface winds, ocean color, volcanic activity, evaporation amounts, and altimetry. The data is processed, analyzed, and refined using complex algorithms provided by the science community. The system produces and distributes about 2 terabytes of data daily, which is warehoused and made available to the scientific community in several formats. The overall mission is to understand Earth processes and potential global climate changes and to provide information for planning and adaptation.
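The stated daily volumes imply substantial archive growth over the system's life. A back-of-the-envelope calculation (assuming, purely for illustration, a constant rate over a 15-year operational period; actual rates grew as new spacecraft were launched):

```python
# Archive growth implied by the daily volumes stated in the text:
# ~1.5 TB/day of raw input, ~2 TB/day of produced data products.
RAW_TB_PER_DAY = 1.5
PRODUCT_TB_PER_DAY = 2.0
DAYS_PER_YEAR = 365
LIFETIME_YEARS = 15          # illustrative assumption, not a spec value

yearly_product_tb = PRODUCT_TB_PER_DAY * DAYS_PER_YEAR
print(f"Products per year: {yearly_product_tb:.0f} TB "
      f"(~{yearly_product_tb / 1024:.2f} PB)")   # 730 TB (~0.71 PB)

lifetime_pb = yearly_product_tb * LIFETIME_YEARS / 1024
print(f"{LIFETIME_YEARS}-year product archive: ~{lifetime_pb:.1f} PB")
```

Even at late-1990s rates, the arithmetic lands in the tens-of-petabytes range, which is why archive management and distribution dominate the broad requirements.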
In addition to meeting day-to-day objectives, a goal of the ECS program is to provide a highly adaptable system that is responsive to the evolving needs of the Earth science community. Over the system lifetime, at least two decades beyond the launch of the first EOS spacecraft, evolution will come from at least three separate sources:
Thus, the ECS will support the "vision" of an evolving and comprehensive information system to promote effective use of data for research in support of the MTPE goals. To support these goals, the ECS will
ECS development is being accomplished in cooperation with the user community, with a shared commitment to the vision of an information system that promotes effective use of data across the entire Earth science community.
Before EOSDIS, satellite data was stored and formatted in ways specific to each satellite; accessing this data, let alone using it for analysis, was almost impossible for scientists not directly affiliated with that satellite's science project. An important feature of EOSDIS is to provide a common way to store and hence process data and a public mechanism to introduce new data formats and processing algorithms, thus making the information widely available to the scientific community at large.
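The common-format-plus-public-extension idea can be sketched as a simple registry: readers for data formats and processing algorithms are registered by name, so a tool built by one science team can handle products contributed by another. All names here are illustrative assumptions, not the actual EOSDIS interfaces.

```python
# A minimal sketch of a public extension mechanism (illustrative only):
# format readers and processing algorithms register themselves by name,
# decoupling the tools that use data from the teams that contribute it.
format_readers = {}
algorithms = {}


def register_format(name):
    """Decorator: publish a reader for a named data format."""
    def wrap(fn):
        format_readers[name] = fn
        return fn
    return wrap


def register_algorithm(name):
    """Decorator: publish a processing algorithm under a public name."""
    def wrap(fn):
        algorithms[name] = fn
        return fn
    return wrap


@register_format("csv-soundings")
def read_csv_soundings(text):
    return [float(v) for v in text.split(",")]


@register_algorithm("mean")
def mean(values):
    return sum(values) / len(values)


def process(fmt, algo, raw):
    """Read raw data in any registered format, then apply any
    registered algorithm: neither side needs to know the other."""
    return algorithms[algo](format_readers[fmt](raw))


print(process("csv-soundings", "mean", "1.0,2.0,3.0"))  # 2.0
```

A new instrument team would add support for its data by registering one reader and any number of algorithms; every existing tool built against `process` would pick them up without modification, which is the essence of the openness claim in the text.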
2.4 Deploying the System
The same version of the software runs on each of four distributed sites in Maryland, Virginia, Colorado, and South Dakota. It is deployed on about 20 UNIX servers from multiple vendors at each site.
Authority for various aspects of the system is decentralized. Data centers maintain control of their local facilities and interface with data users. Scientific teams maintain control over their science software and data products. The ECS project management team is responsible for maintaining and planning the evolution of the system.
The first operational version of ECS was delivered in early 1999. The system has been fielded incrementally, its growing capabilities coordinated with the launch of new spacecraft that will "feed" it. The intended operational life of the system is roughly through 2013.