Thread Pooling

In a client/server system, a single server process frequently has to handle requests from multiple clients. Ideally, the server will handle these requests in a concurrent manner, using threads. (Single-threaded servers tend to form bottlenecks and do not survive long in a commercial environment.)

But how many threads should a server create? This is a little like asking how long is a piece of string. One solution would be to create a new thread for every client request, process the request, and then destroy the thread when processing has completed. However, this approach has at least two drawbacks. The first is that thread creation and destruction is a moderately expensive operation. The second drawback concerns the number of threads: If 5,000 clients submit requests at the same time, how practical is it to create 5,000 concurrent threads?

A more scalable solution is to create a pool of threads (called worker threads) in the server. When a request arrives, an idle worker thread from the pool can be dispatched to handle it. When it's finished, the thread returns to the pool. If no worker threads are available when a request arrives, the server can create a new worker thread or queue the request until a worker thread becomes free. The strategy can be dynamic, taking into account resources such as the number of threads already running, the number of CPUs, and the memory available. This sounds like a lot of work to put together, but the development team at Microsoft has already done it for you in the System.Threading.ThreadPool class. This act was not as philanthropic as it sounds because the team had to build this functionality for other parts of the .NET Framework Class Library anyway; timers, sockets, and asynchronous I/O all use it.

The ThreadPool Class

If you examine the documentation for the ThreadPool class, you might be intrigued to discover that it comprises a few static methods but does not have a constructor. This is because a ThreadPool object is created automatically on demand the first time you need it. A process can contain at most one thread pool.

The threads that a thread pool contains have default properties, such as the priority, which you should not attempt to change. Pooled threads are designated as background threads. You must also leave thread management to the thread pool; do not cancel or abort threads that are running under the auspices of the thread pool. (There is no simple way to cancel a thread request once it has been queued in the thread pool.) Threads are created as required by the thread pool, and you have little control over how many threads there are because the thread pool uses its own heuristics to determine the optimum size based on the machine load, the work being performed, and the number of processors. (Actually, the thread pool has a default limit of 25 threads for each available processor, but this can be changed, although I am not going to show you how!)

A thread pool contains a queue of work items. As clients submit requests, work items are added to the queue. An internal thread inside the thread pool takes items off this queue and assigns them to worker threads, which it then invokes. The simplest way to create a work item is by using the QueueUserWorkItem method. This method expects a WaitCallback delegate (which you saw in the section on timers) that refers to the method to be executed by a worker thread, together with an optional parameter (an Object) supplying any additional user-defined information for that method. The delegated method must have a void return type and take a single Object parameter:

    private void doWork(Object state)
    {
    }

    int data = ...; // example user-defined data
    ThreadPool.QueueUserWorkItem(new WaitCallback(doWork), new Integer(data));

If the thread pool does not exist, it will be created at this juncture. At some point, when a worker thread becomes available, the doWork method will run.
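
To see the state parameter in action, here is a slightly fuller sketch of my own (an illustration, not code from the PoolThreads project): a fleshed-out version of doWork recovers the Integer supplied when the work item was queued, and a loop submits several work items to the pool:

    private void doWork(Object state)
    {
        // Recover the user-defined data supplied to QueueUserWorkItem
        int data = ((Integer)state).intValue();
        Console.WriteLine("Worker thread processing item " + data);
    }

    // Queue several work items; each runs on a pooled worker thread
    // when one becomes available
    for (int i = 0; i < 5; i++)
    {
        ThreadPool.QueueUserWorkItem(new WaitCallback(doWork), new Integer(i));
    }

Remember that pooled threads are background threads, so a program that queues work items and then immediately terminates will not wait for them to run; in practice you would block on an event or some other synchronization object until the work has completed.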

You can also queue requests for methods to be executed once a resource becomes available. In the threaded world, you wait for resources using a WaitHandle object (a ManualResetEvent, AutoResetEvent, or Mutex). With a thread pool, you can use the RegisterWaitForSingleObject method to specify a WaitHandle, a WaitOrTimerCallback delegate that refers to a method (with an optional parameter), and a timeout interval. When the WaitHandle object is signaled, a worker thread will execute the designated method. If a timeout occurs, the target method will also be run. The target method must take two parameters: the inevitable Object and a boolean that will be set to true if the wait has timed out or false otherwise:

    private AutoResetEvent evt = new AutoResetEvent(false);

    private void waitAndDoWork(Object state, boolean timedOut)
    {
    }

    ThreadPool.RegisterWaitForSingleObject(evt,
        new WaitOrTimerCallback(waitAndDoWork), null,
        new TimeSpan(0, 0, 10), false);

In the preceding example, a worker thread will run the waitAndDoWork method when the AutoResetEvent evt is signaled or when the 10-second timeout expires. (As with the other synchronization methods, you can specify a value of 0 if you do not want to wait, which is not wise with a thread pool, or Timeout.Infinite if you're willing to wait forever.) The null value in the middle is the argument passed to waitAndDoWork. The final Boolean parameter is the executeOnlyOnce flag. Because it is false in this case, the thread pool will wait once more and run the waitAndDoWork method again the next time the event is signaled, and then wait again, and so on; passing true instead would run the method just once and then release the registration. Be careful: If you register a repeating wait on a ManualResetEvent, be sure to reset it in the delegated method before it finishes; otherwise, you might trigger a fast-running infinite loop in the worker thread!
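
To round the example off, here is a small sketch of my own: a fleshed-out version of waitAndDoWork that reports why it was invoked, together with code that signals the event. Each call to Set on the AutoResetEvent wakes a pooled worker thread, and because the executeOnlyOnce flag was false, the registration stays active and waits again after every callback:

    private void waitAndDoWork(Object state, boolean timedOut)
    {
        if (timedOut)
        {
            Console.WriteLine("Timed out waiting for the event");
        }
        else
        {
            Console.WriteLine("The event was signaled");
        }
    }

    // Signal the event twice; waitAndDoWork runs on a pooled thread each time
    evt.Set();
    // Give the worker thread time to consume the first signal
    System.Threading.Thread.Sleep(1000);
    evt.Set();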

Asynchronous I/O

As mentioned earlier, other areas of .NET use the thread pooling mechanisms. The technique used to perform asynchronous I/O is worth a brief mention because it employs patterns that are used in many aspects of .NET that rely on asynchronous operations and that you can use in your own classes.

If you examine the System.IO.Stream class, you'll see that it contains methods for sending and receiving streams of bytes to or from a source. The Stream class is used as a basis for more specialized streams, such as the System.IO.FileStream class and System.Net.Sockets.NetworkStream . Two methods in the Stream class, BeginRead and BeginWrite , are of particular interest. These methods initiate an I/O operation asynchronously, employing worker threads from the thread pool. The signature of both methods is the same. You must provide a buffer indicating the source or destination for the data, an offset into the buffer at which to begin reading or writing, the number of bytes to read or write, and an AsyncCallback delegate that points to a method that will be executed when the read or write operation has completed. (There is also a final Object parameter which represents a user-defined state. See Chapter 9 for some examples that use it.)

The following example uses a FileStream object to write approximately 10 MB of data to a file asynchronously, calling the writingDone method when it has finished. This example is available in the PoolThreads project:

    private void writingDone(IAsyncResult ar)
    {
    }

    FileStream fs = new FileStream("C:\\temp\\MyFile", FileMode.OpenOrCreate);
    ubyte[] fileData = new ubyte[10000000];
    IAsyncResult result = fs.BeginWrite(fileData, 0, 10000000,
        new AsyncCallback(writingDone), null);

The Begin methods return an IAsyncResult object that contains status information about the operation, including a reference to the final Object parameter submitted earlier, which is accessible through the AsyncState (get_AsyncState) property. Another useful property is IsCompleted, which you access by calling the get_IsCompleted method from J#. This method returns a Boolean value indicating whether the operation has finished. Incidentally, the method targeted by the AsyncCallback delegate in the Begin methods must also take an IAsyncResult parameter, which will be populated with a reference to this same object.
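
For instance, the thread that started the operation could poll the IsCompleted property to discover when the write has finished. The fragment below is a sketch of my own (polling in a loop like this is rarely the best approach, but it shows the J# property accessors at work):

    // result is the IAsyncResult returned by BeginWrite above
    while (!result.get_IsCompleted())
    {
        // Do other useful work here; sleeping just simulates it
        System.Threading.Thread.Sleep(100);
    }
    Console.WriteLine("The asynchronous write has finished");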

If you need to wait for the asynchronous operation to complete before continuing (assuming you've performed some other tasks in the meantime), you can use the EndRead or EndWrite methods, which are also exposed by the Stream class. These methods expect the IAsyncResult value returned from the corresponding Begin method call as a parameter, and they block until the worker thread has completed its task:

 fs.EndWrite(result); 

If the operation has already completed, the End method will finish immediately.
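
A common alternative to blocking is to call EndWrite from inside the completion callback itself. The fragment below is a sketch of my own; it assumes that the FileStream was passed as the final state parameter to BeginWrite (instead of the null used earlier) so that the callback can recover it through AsyncState:

    private void writingDone(IAsyncResult ar)
    {
        // Recover the FileStream supplied as the state object to BeginWrite
        FileStream fs = (FileStream)ar.get_AsyncState();

        // EndWrite returns immediately because the operation has completed,
        // and it surfaces any exception raised during the write
        fs.EndWrite(ar);
        fs.Close();
        Console.WriteLine("File written");
    }

Whichever approach you take, call the matching End method exactly once for each Begin call so that the operation can release its resources and report any error.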
