22.5 Using the ThreadPool to Perform Background Tasks



You want to have some task executed in the background on a thread supplied by the CLR.


Conceptually, the procedure for having a task executed on a background thread is similar to the one we discussed earlier for explicitly creating a thread to perform some task, except that a different delegate is involved and there is no need to set up the thread explicitly:

  1. You must code an entry point method for the task. It can be either a static or an instance method of a class, but it must have the signature void SomeMethod(object state). Note that, unlike the situation for a thread you create explicitly, this entry method accepts a parameter.

  2. You instantiate a System.Threading.WaitCallback delegate to encapsulate the chosen method.

  3. You pass this delegate to the static method System.Threading.ThreadPool.QueueUserWorkItem().

Let's assume this is the method that you want executed on a background thread:

 class WorkerClass
 {
     public void DoSomeWork(object state)
     {
         // code for the task to be executed here
     }
 }

You get it executed like this:

 WorkerClass worker = new WorkerClass();
 WaitCallback task = new WaitCallback(worker.DoSomeWork);
 ThreadPool.QueueUserWorkItem(task);

That's literally all you have to do. Just as for explicitly creating a thread, you can supply data to the DoSomeWork() method by initializing fields in its containing class before you start the task running. However, when queuing a work item, there is an alternative technique: You can supply any additional data by using a two-parameter overload of QueueUserWorkItem():

 WorkerClass worker = new WorkerClass();
 WaitCallback task = new WaitCallback(worker.DoSomeWork);
 string taskData = "This is extra data for the task";
 ThreadPool.QueueUserWorkItem(task, taskData);

Although we used a string here, QueueUserWorkItem() expects an object reference for its second parameter, so you can pass in an instance of any class you want. The reference you pass here is available via the state parameter of the method executed; indeed, that is the purpose of this parameter:

 public void DoSomeWork(object state)
 {
     if (state == null)
         Console.WriteLine("No initialization data was passed in");
     else
         Console.WriteLine("Data passed in was " + state.ToString());
 }
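Because the state parameter is typed as object, you are not limited to strings. As a minimal sketch (the TaskData and TaskWorker names here are hypothetical, invented for this illustration), you might bundle several values into a small class and pass an instance of that class as the second argument:

```csharp
using System;
using System.Threading;

// Hypothetical container bundling several pieces of task data.
class TaskData
{
    public string Name;
    public int Iterations;

    public TaskData(string name, int iterations)
    {
        Name = name;
        Iterations = iterations;
    }
}

class TaskWorker
{
    public void DoSomeWork(object state)
    {
        // Cast the state reference back to the type we know was passed in.
        TaskData data = (TaskData)state;
        Console.WriteLine("Task " + data.Name + " will run " +
                          data.Iterations + " iterations");
    }
}

class Demo
{
    static void Main()
    {
        TaskWorker worker = new TaskWorker();
        ThreadPool.QueueUserWorkItem(
            new WaitCallback(worker.DoSomeWork),
            new TaskData("resize-images", 10));

        // Crude pause so the background task has a chance to run
        // before the (foreground) main thread exits.
        Thread.Sleep(500);
    }
}
```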


The techniques shown before this recipe in this chapter involve explicitly creating and manipulating dedicated threads for certain tasks. By queuing a user work item, you are still arranging for the task to be performed on a new thread: the crucial difference is that instead of creating the thread in your code, you are asking the CLR to supply a suitable thread and to take control of managing that thread on your behalf. The benefit of doing so is higher performance if you are using multithreading extensively and are likely to have a large number of tasks to execute on different threads.

The CLR selects a thread from a thread pool that it maintains. The thread pool consists of a group of threads that are normally asleep: When an item is queued to the pool, the CLR selects a thread from the pool, wakes it, and passes it your task to execute. Once the task finishes, the thread goes back to sleep. Waking a thread involves far less work than creating a brand-new thread from scratch, hence the performance improvement. Using the thread pool also guarantees that you don't overload the CPU by creating too many threads, because the number of threads in the pool is determined in advance by the CLR, based on what is appropriate to the software and hardware you are running. (Typically, it sets a maximum of 25 worker threads on a single-processor machine.) If all the threads are busy when a new request comes in, the CLR simply holds the request until a thread becomes free.
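If you want to see the limits the CLR has chosen for the pool on your machine, the ThreadPool class exposes them directly. A minimal sketch (the numbers printed depend on your CLR version and hardware):

```csharp
using System;
using System.Threading;

class PoolInfo
{
    static void Main()
    {
        int workerThreads, completionPortThreads;

        // The maximum number of threads the pool will use.
        ThreadPool.GetMaxThreads(out workerThreads, out completionPortThreads);
        Console.WriteLine("Max worker threads:       " + workerThreads);

        // How many of those threads are currently idle.
        ThreadPool.GetAvailableThreads(out workerThreads, out completionPortThreads);
        Console.WriteLine("Available worker threads: " + workerThreads);
    }
}
```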

The disadvantage of using the thread pool is that you don't get a great deal of explicit control over the thread. Notice that in the code we presented, at no point does the calling code receive a Thread reference that describes the thread on which the task will be performed. Without this reference, the calling thread cannot do tasks such as change the priority of or abort the worker thread. Your code also has no say in which of the thread-pool threads is used for a task; that is entirely at the discretion of the CLR, which in particular means there is no guarantee about whether successive queued tasks will execute on the same thread. There is also a startup cost: The thread pool is created dynamically the first time your code invokes it, and creating a thread pool is obviously more work than merely creating one thread explicitly. If you will be using the thread pool extensively in an application, then this startup cost is far outweighed by the accumulated performance benefits.

It's worth pointing out that when delegates are invoked asynchronously, the actual mechanism used under the hood by the CLR involves the thread pool. Hence, the code we presented here has virtually the same effect as invoking a delegate asynchronously, as discussed in Chapter 5, "Delegates and Events," and to some extent, it's a matter of personal preference which technique you adopt. Asynchronous delegates generally involve more code to set up the asynchronous operation but have the advantage of built-in support for allowing the calling thread to monitor the progress of the operation, something that you need to code yourself if you are simply queuing items to the thread pool and you require that feature.
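For comparison, here is a minimal sketch of the asynchronous-delegate alternative: the compiler-generated BeginInvoke()/EndInvoke() pair runs the target method on a thread-pool thread, and EndInvoke() lets the calling thread block until the operation completes and retrieve its result (the AddDelegate type and Add method are invented for this illustration):

```csharp
using System;
using System.Threading;

// Delegate whose signature matches the method we want to run asynchronously.
delegate int AddDelegate(int x, int y);

class AsyncDemo
{
    public static int Add(int x, int y)
    {
        return x + y;
    }

    static void Main()
    {
        AddDelegate del = new AddDelegate(Add);

        // Start the call on a thread-pool thread; returns immediately.
        IAsyncResult ar = del.BeginInvoke(3, 4, null, null);

        // The calling thread is free to do other work here ...

        // ... then block until the call completes and collect the result.
        int result = del.EndInvoke(ar);
        Console.WriteLine("Result: " + result);   // prints "Result: 7"
    }
}
```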

One important point about the thread pool is that thread-pool threads are background threads. They have no power to keep a process alive; a thread that does have that power is known as a foreground thread. Background threads are a concept introduced by the CLR: A process remains running as long as it contains active foreground threads. The CLR terminates a process as soon as all foreground threads finish.

This point means that you should not write code like this:

 // program entry point
 static void Main()
 {
     WorkerClass worker = new WorkerClass();
     WaitCallback task = new WaitCallback(worker.DoSomeWork);
     string taskData = "This is extra data for the task";
     ThreadPool.QueueUserWorkItem(task, taskData);
 }

The problem here is that the main thread queues a task to the thread pool and promptly exits. Because in this code the main thread is the only foreground thread, the CLR tears down the entire process when that thread returns from Main(), and the queued work item never gets a chance to execute. If you are in a situation where the main thread is ready to exit the application, but there might be thread-pool threads still running, then it's important for the main thread to check whether the background threads have finished their tasks. We discuss one technique for doing so using events later in this chapter.
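As a minimal sketch of that idea (anticipating the event-based technique covered later in the chapter, and adding a hypothetical ManualResetEvent field to the WorkerClass from earlier), the main thread can block on an event that the worker signals when it finishes:

```csharp
using System;
using System.Threading;

class WorkerClass
{
    private ManualResetEvent done;

    public WorkerClass(ManualResetEvent done)
    {
        this.done = done;
    }

    public void DoSomeWork(object state)
    {
        Console.WriteLine("Data passed in was " + state);
        done.Set();   // signal the main thread that the task is complete
    }
}

class Program
{
    static void Main()
    {
        ManualResetEvent done = new ManualResetEvent(false);
        WorkerClass worker = new WorkerClass(done);

        ThreadPool.QueueUserWorkItem(
            new WaitCallback(worker.DoSomeWork),
            "This is extra data for the task");

        // Block the (foreground) main thread until the work item
        // signals completion, so the process is not torn down early.
        done.WaitOne();
    }
}
```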

This problem does not occur if you are starting a thread explicitly using new Thread(), because new non-thread-pool threads are by default foreground threads. You can manually change the status of a thread that you have explicitly created through its IsBackground property:

 workerThread.IsBackground = true; 

This technique is not recommended for thread-pool threads because you are in effect interfering with the CLR's ability to manage those threads. (And if, due to a bug in your program, you inadvertently leave a thread-pool thread in a foreground state, it might become impossible for your application to exit.)


Microsoft Visual C# .NET 2003
ISBN: 7508427505
Year: 2003
Pages: 440
