An XML Web Service Singleton

You can easily adapt the example design pattern to a .NET Remoting scenario by using a component configured as a singleton. In an XML Web service, however, this option isn't available. All XML Web services act intrinsically like single-call objects, and a separate instance is created for each client. With this hard-wired design, is it possible to create an XML Web service that acts like a singleton?

The short answer is yes. You accomplish this trick by creating the XML Web service as an extra layer. The XML Web service then communicates with a singleton component. There are two possible ways for this communication to take place:

  • The XML Web service can communicate with a singleton component over .NET Remoting. This isn't necessarily a bad approach, but it does introduce the overhead of an extra network call, which is not trivial.

  • The XML Web service can use a component contained in ASP.NET's application state collection. These objects are global to all Web pages and XML Web services in the Web application.
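For example, a minimal sketch of the second approach might look like the following. (The TaskSingleton class, its SubmitTask method, and the BeginTask Web method are hypothetical placeholders for your own singleton component and service.) The component is created once when the Web application starts and is retrieved from application state inside each Web method:

  ' In Global.asax: create the shared component once.
  Sub Application_Start(ByVal sender As Object, ByVal e As EventArgs)
      Application("TaskSingleton") = New TaskSingleton()
  End Sub

  ' In the XML Web service: every request uses the same instance.
  <WebMethod()> _
  Public Function BeginTask() As String
      Dim Singleton As TaskSingleton = _
        CType(Application("TaskSingleton"), TaskSingleton)
      Return Singleton.SubmitTask()
  End Function

Because every Web method call reaches the same in-memory object, that object must synchronize access to its own state, just as any singleton would.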

Using an XML Web service with a singleton enables you to create a task-based service that completes work asynchronously. A full description is outside the scope of this chapter, but you can find an example of this technique at http://www.prosetech.com. In addition, you can refer to Chapter 18 for an example of how to use the asynchronous task-based model without a singleton pattern by storing the task result in a durable data store.

Thread Pools

One problem with the pattern presented so far is that it creates threads indiscriminately. If 10 tasks are underway simultaneously, 10 threads are created. If 100 tasks are requested, the component creates 100 threads, even though this will result in poor performance because the operating system will waste a significant portion of its CPU power just tracking and scheduling all the threads.

Keep in mind that .NET Remoting manages the connection process using its own thread pooling. Therefore, if it receives 100 simultaneous requests, some of these requests will be serialized. In our asynchronous task example, however, even though only a small set of requests might be initiated simultaneously (say, 25 at a time), the tasks continue running after the connection ends. This means that the number of threads can build up without any limit.

You can avoid this problem in several ways. One option is to create your own thread-tracking and thread-processing system. The basic concept is that you don't create a new thread immediately for each client. Instead, you add the information about the client's request to a collection. On a separate thread, another method periodically scans this collection and the current set of tasks, ensuring that new tasks are assigned to existing threads as they become free. You can even store information about task priority, allowing this monitoring code to make intelligent decisions about which tasks should be processed first. However, this approach requires a major programming project. A better solution is to rely on a special class that .NET provides for this task: the ThreadPool class in the System.Threading namespace.
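If you do decide to build this plumbing yourself, its core might look something like the following sketch. It assumes a hypothetical TaskRequest class with a parameterless ProcessRequest method, and it uses an arbitrary limit of 10 worker threads:

  Imports System.Collections
  Imports System.Threading

  Public Class TaskDispatcher

      Private PendingRequests As New Queue()
      Private Workers As New ArrayList()
      Private Const MaxThreads As Integer = 10

      ' Called for each client request instead of starting a thread directly.
      Public Sub Submit(ByVal Request As TaskRequest)
          SyncLock PendingRequests.SyncRoot
              PendingRequests.Enqueue(Request)
          End SyncLock
      End Sub

      ' Runs on a dedicated monitoring thread. It discards finished worker
      ' threads and assigns queued requests while slots are free.
      ' (This sketch loops forever; a real version would add a stop flag.)
      Public Sub MonitorQueue()
          Do
              ' Remove threads that have finished their work.
              Dim i As Integer
              For i = Workers.Count - 1 To 0 Step -1
                  If Not CType(Workers(i), Thread).IsAlive Then
                      Workers.RemoveAt(i)
                  End If
              Next

              ' Start new work while there is spare capacity and pending work.
              SyncLock PendingRequests.SyncRoot
                  Do While Workers.Count < MaxThreads And PendingRequests.Count > 0
                      Dim Request As TaskRequest = _
                        CType(PendingRequests.Dequeue(), TaskRequest)
                      Dim Worker As New Thread(AddressOf Request.ProcessRequest)
                      Workers.Add(Worker)
                      Worker.Start()
                  Loop
              End SyncLock

              Thread.Sleep(500)
          Loop
      End Sub

  End Class

A dedicated thread would run MonitorQueue, while the front end simply calls Submit. Even this simplified version ignores priorities, error handling, and shutdown, which is one more reason the built-in ThreadPool class is usually the better choice.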

The ThreadPool class uses a fixed pool of threads to queue tasks. You create work items for the ThreadPool class as needed; the number of task items processed simultaneously is automatically limited to 25 per processor, however, because this is the number of threads used. The ThreadPool class provides a number of benefits, including a dedicated monitoring thread, reuse of worker threads, and automatic scheduling. However, the ThreadPool also introduces a few limitations. First of all, there can only be one ThreadPool per application domain. That means that if you use the ThreadPool in more than one location in your application to perform more than one type of task, both sets of work items compete for the same limited set of threads. In addition, the ThreadPool does not provide any way to prioritize work items or cancel them after they have been submitted.

To find the number of threads allocated to the pool, you can call the shared ThreadPool.GetMaxThreads method. To find the number of threads currently free, you can call the ThreadPool.GetAvailableThreads method. To actually queue a work item, you just call the ThreadPool.QueueUserWorkItem method with a delegate that points to the method you want to run.
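Here's a quick sketch of calling these two methods (the PoolInfo module name is arbitrary). Each method returns its results through ByRef parameters, one count for worker threads and one for I/O completion threads:

  Imports System
  Imports System.Threading

  Module PoolInfo
      Sub Main()
          Dim MaxWorker, MaxIO As Integer
          Dim FreeWorker, FreeIO As Integer

          ' Retrieve the pool size and the number of idle threads.
          ThreadPool.GetMaxThreads(MaxWorker, MaxIO)
          ThreadPool.GetAvailableThreads(FreeWorker, FreeIO)

          Console.WriteLine("Worker threads in the pool: " & MaxWorker.ToString())
          Console.WriteLine("Worker threads currently free: " & FreeWorker.ToString())
      End Sub
  End Module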

Listing 7-8 shows how you would rewrite the StartSomething method to use a thread pool.

Listing 7-8 Using a thread pool
  Public Function StartSomething() As String
      ' Generate new ticket.
      Dim Ticket As String = Guid.NewGuid().ToString()

      ' Create task and add it to the collection.
      Dim Task As New ClassB()
      SyncLock ClientTasks
          ClientTasks(Ticket) = Task
      End SyncLock

      ' Queue the task process.
      Task.InProgress = True
      Task.Client = Ticket
      ThreadPool.QueueUserWorkItem( _
        New WaitCallback(AddressOf Task.DoSomething))

      Return Ticket
  End Function

Note

There's another way to solve this problem. Instead of starting a new thread or queuing a work item when a request is received, you can send a message to a queue. You then have the luxury of retrieving the messages by priority in small batches and deciding how to process them. This approach is described in the next chapter.
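As a rough preview, submitting a prioritized message might look like this sketch. It assumes Message Queuing is installed, a private queue named .\Private$\Tasks already exists, and a serializable TaskRequest class describes the work:

  Imports System.Messaging

  Public Sub SubmitToQueue(ByVal Request As TaskRequest)
      Dim Queue As New MessageQueue(".\Private$\Tasks")
      Dim Msg As New Message(Request)

      ' Higher-priority messages are placed ahead of normal-priority ones.
      Msg.Priority = MessagePriority.High
      Queue.Send(Msg)
  End Sub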



