13.1. What Is a Thread?


When an assembly (.exe file) begins execution, a primary thread is created that serves as the entry point to the application; in C#, this is the application's Main() method. The thread is the unit or agent responsible for executing code.

.NET does not physically create threads; that is the responsibility of the operating system. Instead, it provides a Thread class that serves as a managed version of the unmanaged physical thread. The Thread class, located in the System.Threading namespace, exposes properties and methods that allow a program to perform thread-related operations. These class members allow an application to create a thread, set its priority, suspend, activate, or kill it, and have it run in the background or foreground.
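
As a minimal sketch of these operations, the following example creates and starts a thread with the Thread class and then waits for it to finish. The worker method name (PrintNumbers) is purely illustrative.

using System;
using System.Threading;

class ThreadDemo
{
    static void Main()
    {
        // Wrap the worker method in a ThreadStart delegate and create the thread.
        Thread worker = new Thread(new ThreadStart(PrintNumbers));
        worker.Start();            // The thread moves from Unstarted to Running.

        Console.WriteLine("Main thread continues while the worker runs.");
        worker.Join();             // Block until the worker has finished.
    }

    static void PrintNumbers()
    {
        for (int i = 0; i < 5; i++)
            Console.WriteLine("Worker: " + i);
    }
}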

Figure 13-1 is a simplified representation of the relationship between a process, applications, and threads. Physically, a thread consists of CPU registers, a call stack (memory used for maintaining parameter data and method calls), and a container known as Thread Local Storage (TLS) that holds the state information for a thread.

Figure 13-1. Threads contained in a process


Multithreading

In a single CPU system, only one thread can execute at a time. The order in which threads run is based on their priority. When a thread reaches the top of the priority queue, its code stream is executed for a fixed amount of time known as a time slice. If the thread does not complete execution within its time slice, its state information must be stored so that the thread can later resume execution at the point at which it was interrupted. The state information includes registers, stack pointers, and a program counter that tells the thread which instruction is to be executed next. All of this information is stored in the area of memory allocated to Thread Local Storage.

Core Note

.NET provides support for multiple processor systems by permitting a process to be assigned to a processor. This is set using the ProcessorAffinity property of the System.Diagnostics.Process class.
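
For instance, a process might be bound to the first CPU as shown in this sketch; the bitmask value assumes CPU 0 is the desired processor, and setting affinity requires operating system support.

using System;
using System.Diagnostics;

class AffinityDemo
{
    static void Main()
    {
        Process current = Process.GetCurrentProcess();

        // The affinity value is a bitmask: bit 0 = CPU 0, bit 1 = CPU 1, and so on.
        // Here the process is restricted to the first processor.
        current.ProcessorAffinity = (IntPtr)1;

        Console.WriteLine("Affinity mask: " + current.ProcessorAffinity);
    }
}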


Thread Priority

As mentioned, the order in which a thread runs is based strictly on its priority. If a thread is running and a thread with a higher priority becomes available to run, the running thread is preempted to allow the higher priority thread to run. If more than one thread has the same priority, the operating system executes them in a round-robin fashion.

In .NET, a thread's Priority property is used to get or set its priority level. It may have one of five values based on the ThreadPriority enum: Lowest, BelowNormal, Normal, AboveNormal, and Highest. The default is ThreadPriority.Normal.
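
A brief sketch of lowering a thread's priority before starting it follows; the DoWork method is illustrative.

using System;
using System.Threading;

class PriorityDemo
{
    static void Main()
    {
        Thread worker = new Thread(new ThreadStart(DoWork));

        // Lower the worker's priority so it yields to more urgent threads.
        worker.Priority = ThreadPriority.BelowNormal;
        Console.WriteLine("Worker priority: " + worker.Priority);

        worker.Start();
        worker.Join();
    }

    static void DoWork()
    {
        // Placeholder for a noncritical, low-priority task.
        Thread.Sleep(100);
    }
}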

You should override thread priorities only in situations where a task has a clearly defined need to execute with a low or high priority. Using thread priorities to fine-tune an algorithm can be self-defeating for several reasons:

  • Even threads with the highest priority are subject to blocking by other threads.

  • Raising the priority of a thread can place it into competition with the operating system's threads, which can affect overall system performance.

  • An operating system keeps track of when each thread runs. If a thread has not run for a while, the operating system may temporarily boost its priority so that it eventually gets a chance to execute.

Foreground and Background Threads

.NET classifies each thread as either a background or foreground thread. The difference between the two types is simple: an application ends when all of its foreground threads have stopped, and any background threads still running are stopped as part of the shutdown process.

By default, a new thread runs as a foreground thread. It can be changed to a background thread by setting its IsBackground property to true. You should set this only for noncritical tasks that can logically and safely end when the program does. Note that even though .NET stops all background threads when the program shuts down, it is good practice to explicitly manage thread termination.
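
The following sketch marks a noncritical logging thread as background; the WriteLog loop is illustrative and is terminated automatically when Main ends.

using System;
using System.Threading;

class BackgroundDemo
{
    static void Main()
    {
        Thread logger = new Thread(new ThreadStart(WriteLog));

        // Mark the thread as background: it will not keep the process alive
        // once all foreground threads (including Main) have finished.
        logger.IsBackground = true;
        logger.Start();

        Console.WriteLine("Main exits; the background thread is stopped with it.");
    }

    static void WriteLog()
    {
        while (true)
        {
            Console.WriteLine("logging...");
            Thread.Sleep(500);
        }
    }
}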

Thread State

During its lifetime, a thread may exist in several states: It begins life in the Unstarted state; after it is started and the CPU begins executing it, it is in the Running state; when its slice of execution time ends, the operating system may suspend it; and when it has completed running, it moves into the Stopped state. Running, Stopped, and Suspended are somewhat deterministic states that occur naturally as the operating system manages thread execution. Another state, known as WaitSleepJoin, occurs when a thread must wait for resources or for another thread to complete its execution. After this blocking ends, the thread is again eligible to move into the Running state.

Figure 13-2 illustrates the states that a thread may assume and the methods that invoke these states. It is not a complete state diagram, because it does not depict the events that can lead to a thread being placed in an inconsistent state. For example, you cannot start a running thread, nor can you abort a suspended thread. Such attempts cause an exception to be thrown.
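
As a small sketch of one such invalid transition, calling Start on a thread that has already been started raises a ThreadStateException; the Pause method is illustrative.

using System;
using System.Threading;

class RestartDemo
{
    static void Main()
    {
        Thread worker = new Thread(new ThreadStart(Pause));
        worker.Start();

        try
        {
            worker.Start();   // Invalid: the thread has already been started.
        }
        catch (ThreadStateException ex)
        {
            Console.WriteLine("Cannot restart the thread: " + ex.Message);
        }

        worker.Join();
    }

    static void Pause()
    {
        Thread.Sleep(500);
    }
}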

Figure 13-2. Thread states


A thread's current state is available through a read-only property named ThreadState. This property's value is based on the ThreadState enum that defines 10 states:

 Aborted        = 256    StopRequested    = 1
 AbortRequested = 128    Suspended        = 64
 Background     = 4      SuspendRequested = 2
 Running        = 0      Unstarted        = 8
 Stopped        = 16     WaitSleepJoin    = 32

If a program is not interested in a specific state, but only needs to know whether a thread has terminated, the Boolean Thread.IsAlive property should be used.
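
A short sketch of inspecting both properties follows; the Work method is illustrative.

using System;
using System.Threading;

class AliveDemo
{
    static void Main()
    {
        Thread worker = new Thread(new ThreadStart(Work));

        Console.WriteLine(worker.ThreadState);   // Unstarted
        Console.WriteLine(worker.IsAlive);       // False: not yet started

        worker.Start();
        Console.WriteLine(worker.IsAlive);       // True while running or blocked

        worker.Join();
        Console.WriteLine(worker.ThreadState);   // Stopped
        Console.WriteLine(worker.IsAlive);       // False: the thread has terminated
    }

    static void Work()
    {
        Thread.Sleep(200);
    }
}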


