We'll conclude this chapter with an interoperability performance test that follows the same format as the previous ones in the book. This test can be found in the C:\Interoperability\Samples\Point\WebServices\PerformanceTest directory and measures the response time between a .NET client and a Java Web service, using the samples provided throughout this chapter. To run the Web service, enter start ant run from the Server subdirectory. To build and run the client, enter nant run from the Client subdirectory. The test assumes that the client and server are on the same machine, but it can easily be distributed by modifying the URL in the FinancialServices.cs proxy file.
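The measurement itself follows the pattern used by the earlier tests: invoke the operation repeatedly and average the elapsed time. The sketch below illustrates that pattern in Java; the `ResponseTimer` class and the `Runnable` placeholder are illustrative inventions here, standing in for the generated proxy call that the actual sample code would time.

```java
// Minimal sketch of a response-time measurement loop.
// ResponseTimer and the Runnable placeholder are hypothetical; in the real
// test the measured operation would be a call through the generated
// FinancialServices proxy.
public class ResponseTimer {

    // Runs the operation the given number of times and returns the
    // mean elapsed time per call, in milliseconds.
    static double averageMillis(Runnable operation, int iterations) {
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            operation.run();
        }
        long elapsedNanos = System.nanoTime() - start;
        return (elapsedNanos / 1_000_000.0) / iterations;
    }

    public static void main(String[] args) {
        // Stand-in for a Web service call: sleep briefly to simulate latency.
        Runnable fakeCall = () -> {
            try {
                Thread.sleep(5);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        };
        double avg = averageMillis(fakeCall, 20);
        System.out.println("Average response time: " + avg + " ms");
    }
}
```

Averaging over many iterations smooths out one-off costs such as JIT compilation and connection setup, which would otherwise dominate a single-call measurement.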
If you're interested in the performance differences between .NET Remoting and Web services for a solution that requires interoperability, I recommend taking this performance test's sample code and comparing it with its counterpart in Chapter 4. Another way to extend these tests is to use XSD, as shown in Chapter 3, to rerun them with data types that more closely resemble those in your own environment. The results can help you judge the performance impact of passing custom data types over Web services.
For these particular tests on my development machine, I've seen an average of 12.9 ms to download the five stock recommendations and 14.1 ms to invoke and commit the sale command. This compares with 4.6 ms and 8.1 ms, respectively, when running the .NET Remoting samples on the same setup.
Again, taking this code and adapting it to your own controlled environment remains the best way to gauge the relative performance of the two solutions.