Business Tier Performance

The business tier is the core WebSphere tier, and it's typically associated with EJB containers and the like. The business tier consists of the following Java components:

  • One or many EJB containers and EJBs

  • Non-Web container Java classes

  • Operational Java components

  • Design pattern considerations

Let's now take a look at the business tier development performance considerations.

Reuse EJB Homes Where Possible

As you've seen in previous chapters, EJB homes are the points at which other Java client components hook into EJBs running within the EJB container.

Each InitialContext lookup to obtain an EJB home is a JVM-intensive operation. For that reason, EJB homes can be both cached and reused across multiple Java components, thereby greatly reducing the cost of requesting the context of an EJB home for each request.

Other implementation considerations include developing helper or locator classes, called by Java clients, that maintain a cache of EJB homes. This approach works well for high-volume applications in which EJB homes are looked up frequently.
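The helper-class approach can be sketched as follows. This is a minimal, hypothetical locator written in modern Java for brevity; the injected lookup function stands in for the expensive `new InitialContext().lookup(jndiName)` call, which is what a real EJB client would pass in.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical home locator: caches EJB homes by JNDI name so the
// costly InitialContext lookup runs only once per name.
class EJBHomeLocator {
    private final Map<String, Object> homeCache = new HashMap<String, Object>();
    private final Function<String, Object> lookup;

    EJBHomeLocator(Function<String, Object> lookup) {
        this.lookup = lookup;  // wraps new InitialContext().lookup(...) in practice
    }

    // Returns the cached home for a JNDI name, performing the
    // expensive lookup only on the first request for that name.
    synchronized Object getHome(String jndiName) {
        Object home = homeCache.get(jndiName);
        if (home == null) {
            home = lookup.apply(jndiName);
            homeCache.put(jndiName, home);
        }
        return home;
    }
}
```

Clients then call `locator.getHome("ejb/Account")` instead of performing their own JNDI lookups, so repeated requests reuse the same cached home.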

Performance and scalability benefit:

2

Implementation complexity:

4

Consider Pass-by-Reference Rather Than Pass-by-Value

Prior to EJB 2.0 (i.e., in WebSphere version 4), there was no standard way to avoid network communication via RMI between an EJB client and an EJB, even when both were deployed to the same server. Some vendors, including IBM, provide a feature known as Pass-by-Reference that facilitates method calls via local interfaces between EJB clients and EJBs. This effectively makes the EJBs appear local and avoids the overhead of going out via the network and RMI stack for communications. This recommendation is only valid for WebSphere version 4, as WebSphere version 5 supports EJB 2.0, which includes a feature to set EJBs to listen locally rather than remotely (or both).

Implementing this change is as simple as modifying the IBM extension deployment descriptor after launch via the ORB properties configuration dialog box (within the Application Server properties menu in the WebSphere console) or via the Application Assembly Tool (AAT) prior to deploying the application EJB components.

Performance and scalability benefit:

2

Implementation complexity:

1

Use a Multithreaded Logger Instead of System.out.println()

One of the painful aspects of operational management is trying to balance the trade-off between good tracing and logging information, and too much information (which usually drives down performance).

The java.io classes are solid in functionality, but they do suffer from performance issues pertaining to their fairly heavy use of object synchronization. System.out.println() can also generate heavy disk I/O due to the nature of its underlying I/O mechanism. For this reason, consider employing a third-party logging application. Probably the most popular Java-based logger available is the Jakarta log4j application. It's a multithreaded, highly configurable logging system for J2EE and Java application environments.

Use System.out.println() only for specific messages relating to the containers in your applications. For application debug and informational messages, use log4j, which you can configure to log only certain types of data. You can also activate and deactivate log4j logging during runtime. And best of all, it's fast and free!
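The guarded, level-based logging style can be sketched as follows. This illustration uses the JDK's built-in java.util.logging rather than log4j itself (so the snippet is self-contained); log4j's Logger/level API is analogous, with `Logger.getLogger(...)` and `logger.debug(...)`/`logger.info(...)` against a configurable threshold. The `AccountService` class is hypothetical.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical application class; java.util.logging stands in for
// log4j, whose API follows the same level-guarded pattern.
class AccountService {
    private static final Logger log = Logger.getLogger("AccountService");

    void debit(String account, double amount) {
        // Guard the call so the message string is only built when
        // debug-level output is actually enabled.
        if (log.isLoggable(Level.FINE)) {
            log.fine("debit " + amount + " from " + account);
        }
        // ... business logic ...
    }
}

// Levels can be raised or lowered at runtime without a restart:
//   Logger.getLogger("AccountService").setLevel(Level.FINE);
```

The guard is the key performance point: when debug logging is switched off, the string concatenation inside the branch is never executed.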

Performance and scalability benefit:

2

Implementation complexity:

3

Use Container Threading Services and Avoid Manually Spawning Threads

In pre-J2EE days, the onus of threading within Java application environments was on the Java developer. One of the powerful features of J2EE is the container, which manages and handles all thread allocation and management within its scope, rather than leaving developers to build this form of plumbing manually.

It's still possible to develop applications that manually create and spawn threads unmanaged by the container, but this can place a great deal of overhead on your application server. Each thread consumes JVM processing cycles and memory, and the JVM, through the J2EE threading framework, constantly monitors and manages thread pool allocation. If threads are spawned outside the thread manager's scope, performance problems will arise.

In summary, unless you have very specific reasons for doing so, don't spawn your own threads. Use the containers to manage thread allocation for your applications.

Performance and scalability benefit:

2

Implementation complexity:

3

Use Lazy Initialization of Objects

Lazy initialization means designing your application code so that objects aren't created until they're required. For example, don't initialize an object before an if statement. Instead, initialize the object within the specific if branch that requires it.

Lazy initialization also refers to a manner of activating entity EJBs. Typically, entity EJBs are activated with all their data references loaded. This may not be the desired result if you only want to load the entity EJB to obtain a specific piece of data.

By using the lazy initialization approach, you're able to activate the EJB to get the specific data reference from the entity EJB when you require it, rather than having all the data retrieved by default, at activation.
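The in-code form of the idea can be sketched as a lazily built field. The report example and the `buildCount` field are hypothetical, added purely to make the deferral visible; the expensive object is not constructed until the first call that needs it, and later calls reuse it.

```java
// Minimal sketch of lazy initialization: expensive work is deferred
// until first use, then the result is reused.
class ReportHolder {
    private StringBuffer report;  // stays null until first use
    int buildCount = 0;           // for illustration only

    StringBuffer getReport() {
        if (report == null) {         // lazy initialization point
            report = buildReport();   // deferred, expensive work
        }
        return report;
    }

    private StringBuffer buildReport() {
        buildCount++;
        return new StringBuffer("...expensive report data...");
    }
}
```

Simply constructing a `ReportHolder` costs nothing; the heavy `buildReport()` call happens only if and when `getReport()` is actually invoked.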

Performance and scalability benefit:

2

Implementation complexity:

3

Avoid Overusing Java Niceties

The Java language has a raft of smart features and nice coding syntaxes. Many of these features and syntaxes come with performance trade-offs, however. If your application environment is small or your volumes just aren't high, then your mileage with these considerations may not warrant a look.

Here are some of the more common Java language and coding considerations to keep in mind:

  • Consider using the conditional syntax of "? :" instead of if-then structures.

  • Use StringBuffer instead of concatenating the String type. Concatenating strings causes the JVM to have to copy and overlay objects in memory.

  • Avoid creating objects early. A classic example is creating a variable before an if-then block when only one of the branches, if any, ever uses it. This consumes memory and should be avoided. Place these variable declarations and other object-creation directives in the actual if-then clause that requires them.

  • Attempt to presize objects such as Hashtables and Vectors. This helps the JVM preallocate object memory. Heap allocation for these types of objects works by doubling their capacity each time more memory is required.

  • Be conscious of unused objects floating around that consume heap space. Objects lie around for a period of time until garbage collection occurs. For example, the String methods toUpperCase() and toLowerCase() create new objects in memory when used, and those objects remain in memory until garbage collection, after the method call completes.
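Two of the list items above can be sketched concretely. The class and method names are illustrative only: StringBuffer accumulates into one buffer instead of creating a new String per concatenation, and presizing a Vector sets its capacity up front instead of letting it double repeatedly.

```java
import java.util.Vector;

// Illustrative sketches of StringBuffer use and collection presizing.
class NicetyExamples {
    // Builds "0 1 2 ..." in one buffer instead of one new String per loop pass.
    static String joinNumbers(int n) {
        StringBuffer sb = new StringBuffer(n * 4);  // rough presizing
        for (int i = 0; i < n; i++) {
            if (i > 0) sb.append(' ');
            sb.append(i);
        }
        return sb.toString();
    }

    // Presized Vector: capacity allocated once, no incremental doubling.
    static Vector<Integer> fillVector(int n) {
        Vector<Integer> v = new Vector<Integer>(n);  // initial capacity n
        for (int i = 0; i < n; i++) {
            v.add(Integer.valueOf(i));
        }
        return v;
    }
}
```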

Performance and scalability benefit:

3

Implementation complexity:

1

Avoid Stateful EJBs Where Possible

Stateful EJBs are good in theory, on their own, but they're a problematic component type to work with. From a WebSphere perspective, stateful EJBs don't support workload management failover. This is a big problem for scalable and high-performing WebSphere environments.

For the most part, the functionality of a stateful EJB can be supported through the use of session objects or by persisting information to a database or data store. The very nature and limited EJB caching capability of stateful EJBs mean that there's a fair degree of overhead on the EJB container when it attempts to passivate dormant or idle stateful EJBs and later activate them.

In summary, where possible avoid stateful EJBs and consider using other stateful constructs such as session objects and databases. Use stateless EJBs to perform data population. In Chapter 9, you explored the process of sizing and tuning the EJB cache. This will help you gain some performance advantages if your application design mandates that you use stateful EJBs.

Performance and scalability benefit:

3

Implementation complexity:

3

Use Constants

As in many programming languages, variables and constants are commonplace in Java code. In Java, a variable is a named holder for data of a specific data type. A constant is a variable whose value doesn't change for the duration of its lifetime.

Always use constants for variables that don't change within the life cycle of the application. The application's performance can benefit greatly, thanks to the JVM's capability to optimize compilation of constants over that of standard variables. For example, here's a standard variable:

 String myString = new String("WebSphere"); 

and here's a constant:

 static final String myString = "WebSphere"; 

Heavy use of constants over variables can, in some Java programming usages, result in as much as a tenfold increase in performance.

Performance and scalability benefit:

2

Implementation complexity:

1

Assume Remote Objects, but Write for Local

It's important to develop applications in EJB (or any other distributed J2EE technology) form with the mind-set that although the component you're developing can be used remotely from other locations (e.g., from another WebSphere server via RMI), there's always a great deal of overhead associated with remote calls. A transaction here and a transaction there won't bring your application to its knees; however, many transactions will show performance degradation if remote calls are overused.

Local method calls within a stand-alone Java application will see response times of between 1 and 10 milliseconds, whereas a remote call to a local EJB method will see response times of between 200 and 1,000 milliseconds. A remote call to another host will see response times of anywhere between 500 and 5,000 milliseconds. Therefore, it's important that you develop your remote invocation components in a way that's highly efficient and assumes remote calls will be made. Following this approach will also help to ensure that local access is optimized.
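One common way to "write for remote" is to make the interface coarse-grained. The sketch below is hypothetical (the names and the commented-out remote interface are illustrative, not from WebSphere): one call returning a serializable value object replaces several fine-grained remote getters, so the RMI round-trip cost is paid once per request instead of once per field.

```java
import java.io.Serializable;

// Hypothetical value object: carries all the data a client needs in
// one serializable unit, fetched with a single remote call.
class CustomerData implements Serializable {
    final String name;
    final String address;
    final double balance;

    CustomerData(String name, String address, double balance) {
        this.name = name;
        this.address = address;
        this.balance = balance;
    }
}

// Chatty style to avoid over a remote interface (three round-trips):
//   remote.getName(); remote.getAddress(); remote.getBalance();
// Coarse-grained style to prefer (one round-trip):
//   CustomerData c = remote.getCustomerData();
```

Because the value object travels by value, the client reads its fields locally with no further network cost.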

Note  

WebSphere version 5 supports the new EJB 2.0 local interface specification, which allows application components local to the EJB container to access methods inside EJBs without the overhead of calls being serialized and deserialized via the RMI protocol.

Performance and scalability benefit:

2

Implementation complexity:

3

Use Abstraction

Abstraction is a fairly loosely used term in application design, but when it's used and implemented correctly, it provides many benefits. Abstraction, in the context of developing J2EE applications, directs application design in such a way that all remotely callable components should be accessed via a form of abstraction such as a façade (e.g., the Façade pattern).

Abstraction provides a level of robustness, and ultimately scalability, in your application's design. You can scale up, change, and alter your back-end components but, if abstraction is used and implemented correctly, your Java clients will still be able to access the modified back-end components (e.g., EJBs) via a façade, which helps with platform robustness and scalability.

A slight runtime performance trade-off is involved, depending on your abstraction implementation approach. Every additional level of abstraction, while introducing an additional level of robustness and component compartmentalization, does incur a slight performance hit. Without abstraction, however, your application platform will become brittle and will be affected by changes made to application components at almost every level.
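The façade idea can be sketched in a few lines. All names here are hypothetical, and a plain class stands in for the EJBs or data stores a real back end would use: clients code only against the `OrderFacade` interface, so the back-end implementation can change without touching client code.

```java
// Clients depend only on this interface, never on the back end.
interface OrderFacade {
    double orderTotal(String orderId);
}

// Stands in for one or more back-end components (e.g., EJBs).
class OrderBackend {
    double lookupTotal(String orderId) {
        return 42.0;  // placeholder value for illustration
    }
}

class OrderFacadeImpl implements OrderFacade {
    private final OrderBackend backend = new OrderBackend();

    // The façade hides how the total is assembled; swapping the
    // backend leaves the client-visible contract intact.
    public double orderTotal(String orderId) {
        return backend.lookupTotal(orderId);
    }
}
```

This is also where the trade-off mentioned above lives: each call now passes through one extra layer, a small cost paid for the freedom to rework the back end independently.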

Performance and scalability benefit:

2

Implementation complexity:

2




Maximizing Performance and Scalability with IBM WebSphere
ISBN: 1590591305
Year: 2003
Pages: 111
Authors: Adam G. Neat