Chapter 13: Analysis of Operating System Components




This chapter is organized as follows. Section 13.1 introduces the specific performance evaluation conducted, its basic concepts, the types of workloads used, the experimental design for the performance analysis, and the simulation toolkit used for the evaluation. Section 13.2 describes the architectures of the four operating systems under study; we have kept these descriptions specific to the experiments carried out. Section 13.3 focuses on statistics, analysis of the experimental results, sensitivity analysis, cost/performance issues, and presentation of the results as graphs and charts. Section 13.4 discusses experimental design and simulation. Section 13.5 presents conclusions about the performance analysis. [1]

13.1 Introduction

Computer system users, administrators, and designers are all interested in performance evaluation, since their common goal is to obtain or provide the highest performance at the lowest cost. Performance evaluation is essential at every step in the life cycle of a computer system, including design, manufacturing, use, and upgrade. We perform this evaluation in order to predict the adequacy of the system. To do so, we must define the system correctly: identify its components, state the environment in which the system resides, and define the parameters that we measure and on which the system is built. Computer systems are the backbone of an organization, which may have clients scattered around the globe. If a system does not perform the way it is intended to, the result is a loss of infrastructure, efficiency, and credibility for the organization, so a sound evaluation of the computer system is of prime importance. This encompasses not only hardware/software performance but also cost versus performance. For any computer system, performance measures such as responsiveness, missionability, dependability, and productivity are of immense importance. Performance evaluation techniques fall into two major classes: one is designing an experiment (hardware/software/stimulus), and the other is modeling, which may be analytical (queuing, Petri nets) or by simulation (discrete, continuous, combined). This study uses both techniques to perform a comparison among the four operating systems.
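To make the two modeling approaches concrete, the sketch below (in Python; the study itself used the AweSim toolkit, and this code is only an illustrative stand-in) builds a minimal discrete-event simulation of a single-server M/M/1 queue and checks it against the analytical queuing result, where the mean response time is 1/(mu - lambda) for arrival rate lambda and service rate mu.

```python
import random

def mm1_mean_response(arrival_rate, service_rate, num_jobs, seed=42):
    """Simulate an M/M/1 FIFO queue and return the mean response time.

    Inter-arrival and service times are drawn from exponential
    distributions; a single server processes jobs in arrival order.
    """
    rng = random.Random(seed)
    clock = 0.0            # arrival time of the current job
    server_free_at = 0.0   # time at which the server clears its backlog
    total_response = 0.0
    for _ in range(num_jobs):
        clock += rng.expovariate(arrival_rate)   # next arrival
        start = max(clock, server_free_at)       # wait if the server is busy
        finish = start + rng.expovariate(service_rate)
        server_free_at = finish
        total_response += finish - clock         # total time in system
    return total_response / num_jobs

# Analytical check: for an M/M/1 queue, mean response time = 1 / (mu - lambda).
sim = mm1_mean_response(arrival_rate=0.5, service_rate=1.0, num_jobs=200_000)
theory = 1.0 / (1.0 - 0.5)   # = 2.0
print(f"simulated: {sim:.3f}, analytical: {theory:.3f}")
```

With a long enough run, the simulated mean converges to the analytical value, which is exactly the kind of cross-validation between the two modeling classes that a performance study can use to build confidence in its simulation models.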

This chapter evaluates the performance of four operating systems: Microsoft's Windows XP, Windows ME, and Windows NT, and Linux 7.2. These assessments were carried out by a graduate computer systems performance evaluation class at UMass Dartmouth during the spring semester of 2002. The evaluation was performed on an x86 architecture, and the operating systems were examined using three specific types of workloads. It is based on the major releases of these operating systems available at the time, used "as is" without performance tuning. Each team was asked to design high-level models and convert them into simulations using the AweSim simulation toolkit. The teams agreed on a common experimental design and performed specific performance tests to measure the response of the four operating systems to specific factors. Each team then performed a comparative analysis.
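A comparative analysis of this kind typically reduces to comparing measured metrics, such as response times, across systems with some statistical rigor. The sketch below (Python, with made-up sample data; not the class's actual measurements or method) shows one common approach: compute an approximate 95% confidence interval for each system's mean, and treat non-overlapping intervals as evidence of a real difference.

```python
import math
import statistics

def mean_ci(samples, z=1.96):
    """Return (mean, half-width) of an approximate 95% confidence interval."""
    m = statistics.mean(samples)
    s = statistics.stdev(samples)
    return m, z * s / math.sqrt(len(samples))

# Hypothetical response-time measurements (seconds) for two systems.
sys_a = [1.8, 2.1, 1.9, 2.3, 2.0, 1.7, 2.2, 1.9, 2.0, 2.1]
sys_b = [2.6, 2.4, 2.8, 2.5, 2.7, 2.9, 2.3, 2.6, 2.5, 2.7]

mean_a, half_a = mean_ci(sys_a)
mean_b, half_b = mean_ci(sys_b)

# If the intervals do not overlap, the difference is statistically visible.
intervals_overlap = (mean_a + half_a) >= (mean_b - half_b)
print(f"A: {mean_a:.2f} ± {half_a:.2f}, B: {mean_b:.2f} ± {half_b:.2f}, "
      f"overlap: {intervals_overlap}")
```

This interval-overlap test is a simple, conservative screen; a full study would also consider paired observations, larger sample sizes, and the sensitivity analysis discussed in Section 13.3.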

[1]Contributed by: P. Abdelmalek, S. Bapat, K. Challapalli, I. Chen, A. Chennamraju, P. Furey, J. Joseph, R. Madiraju, S. Chowdary, A. Pisharody, V. Rajan, W. Rosa, B. Sarangarajan, S. Sharma, P. Singhal, X. Tao, K. Vangapally, T. Zhou, and Q. Yu. University of Massachusetts Dartmouth, Department of Electrical and Computer Engineering.






Computer Systems Performance Evaluation and Prediction
ISBN: 1555582605
Year: 2002
Pages: 136
