14.2. Implementing Queue

14.2.1. PriorityQueue

PriorityQueue is one of the two nonlegacy Queue implementations not designed primarily for concurrent use (the other one is ArrayDeque). It is not thread-safe, nor does it provide blocking behavior. It gives up its elements for processing according to an ordering like that used by NavigableSet: either the natural order of its elements if they implement Comparable, or the ordering imposed by a Comparator supplied when the PriorityQueue is constructed. So PriorityQueue would be an alternative design choice (obviously, given its name) for the priority-based to-do manager that we outlined in Section 13.2 using NavigableSet. Your application will dictate which alternative to choose: if it needs to examine and manipulate the set of waiting tasks, use NavigableSet. If its main requirement is efficient access to the next task to be performed, use PriorityQueue.

Choosing PriorityQueue allows us to reconsider the ordering: since it accommodates duplicates, it does not share the requirement of NavigableSet for an ordering consistent with equals. To emphasize the point, we will define a new ordering for our to-do manager that depends only on priorities. Contrary to what you might expect, PriorityQueue gives no guarantee of how it presents multiple elements with the same value. So if, in our example, several tasks are tied for the highest priority in the queue, it will choose one of them arbitrarily as the head element.
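Both points can be seen in a minimal sketch (not part of the to-do example), using Integer elements under their natural ordering and assuming the usual java.util imports:

 Queue<Integer> ints = new PriorityQueue<Integer>();
 ints.add(2);
 ints.add(1);
 ints.add(2);                       // the duplicate 2 is accepted
 System.out.println(ints.size());   // prints 3
 System.out.println(ints.poll());   // prints 1, the head element
 // the two remaining 2s will be returned next, in an arbitrary order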

The constructors for PriorityQueue are:

 PriorityQueue()
                       // natural ordering, default initial capacity (11)
 PriorityQueue(Collection<? extends E> c)
                       // natural ordering of elements taken from c, unless
                       // c is a PriorityQueue or SortedSet, in which case
                       // copy c's ordering
 PriorityQueue(int initialCapacity)
                       // natural ordering, specified initial capacity
 PriorityQueue(int initialCapacity, Comparator<? super E> comparator)
                       // Comparator ordering, specified initial capacity
 PriorityQueue(PriorityQueue<? extends E> c)
                       // ordering and elements copied from c
 PriorityQueue(SortedSet<? extends E> c)
                       // ordering and elements copied from c
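As a brief illustration of the ordering-copying behavior noted in the comments above, constructing a PriorityQueue from a SortedSet carries the set's comparator over to the queue. This is only a sketch, with made-up String elements and the usual java.util imports assumed:

 SortedSet<String> byLength = new TreeSet<String>(new Comparator<String>() {
   public int compare(String s1, String s2) {
     return s1.length() - s2.length();
   }
 });
 byLength.add("collections");
 byLength.add("generics");
 byLength.add("java");
 Queue<String> shortestFirst = new PriorityQueue<String>(byLength);
 System.out.println(shortestFirst.poll());   // prints "java", the shortest string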

Notice how the second of these constructors avoids the problem of the overloaded TreeSet constructor that we discussed in Section 13.2.1. We can use PriorityQueue for a simple implementation of our to-do manager with the PriorityTask class defined in Section 13.2, and a new Comparator depending only on the task's priority:

 final int INITIAL_CAPACITY = 10;
 Comparator<PriorityTask> priorityComp = new Comparator<PriorityTask>() {
   public int compare(PriorityTask o1, PriorityTask o2) {
     return o1.getPriority().compareTo(o2.getPriority());
   }
 };
 Queue<PriorityTask> priorityQueue =
   new PriorityQueue<PriorityTask>(INITIAL_CAPACITY, priorityComp);
 priorityQueue.add(new PriorityTask(mikePhone, Priority.MEDIUM));
 priorityQueue.add(new PriorityTask(paulPhone, Priority.HIGH));
 ...
 PriorityTask nextTask = priorityQueue.poll();
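To work through the outstanding tasks in priority order, the queue can simply be polled until it is empty. This is a sketch rather than code from the example above; the task is handled wherever the comment indicates:

 PriorityTask task;
 while ((task = priorityQueue.poll()) != null) {   // null signals an empty queue
   System.out.println(task);                       // handle the next task here
 }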

Priority queues are usually efficiently implemented by priority heaps. A priority heap is a binary tree somewhat like those we saw implementing TreeSet in Section 13.2.1, but with two differences: first, the only ordering constraint is that each node in the tree should be larger than either of its children, and second, that the tree should be complete at every level except possibly the lowest; if the lowest level is incomplete, the nodes it contains must be grouped together at the left. Figure 14.3(a) shows a small priority heap, with each node shown only by the field containing its priority. To add a new element to a priority heap, it is first attached at the leftmost vacant position, as shown by the circled node in Figure 14.3(b). Then it is repeatedly exchanged with its parent until it reaches a parent that has higher priority. In the figure, this required only a single exchange of the new element with its parent, giving Figure 14.3(c). (Nodes shown circled in Figures 14.3 and 14.4 have just changed position.)
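The exchange step just described is often called "sifting up". The following is a simplified sketch of it (not the java.util.PriorityQueue source), assuming the heap is stored in an array in the usual way, with the children of the node at index i at indices 2i+1 and 2i+2, so that the parent of index i is at (i-1)/2; "higher priority" here means "ordered first by the comparator":

 // Sketch: requires import java.util.Comparator;
 static <E> void siftUp(E[] heap, int i, Comparator<? super E> comp) {
   while (i > 0) {
     int parent = (i - 1) / 2;
     if (comp.compare(heap[i], heap[parent]) >= 0) break;       // parent already has
                                                                // higher (or equal) priority
     E tmp = heap[i]; heap[i] = heap[parent]; heap[parent] = tmp;  // exchange
     i = parent;                                                // continue from parent's slot
   }
 }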

Figure 14-3. Adding an element to a PriorityQueue


Figure 14-4. Removing the head of a PriorityQueue


Getting the highest-priority element from a priority heap is trivial: it is the root of the tree. But, when that has been removed, the two separate trees that result must be reorganized into a priority heap again. This is done by first placing the rightmost element from the bottom row into the root position. Then, in the reverse of the procedure for adding an element, it is repeatedly exchanged with the larger of its children until it has a higher priority than either. Figure 14.4 shows the process, again requiring only a single exchange, starting from the heap in Figure 14.3(c) after the head has been removed.
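A matching sketch of this reorganization step, again purely illustrative and using the same array layout and comparator convention as the siftUp sketch above:

 // Sketch: requires import java.util.Comparator;
 static <E> void siftDown(E[] heap, int size, Comparator<? super E> comp) {
   int i = 0;                                           // start at the root
   while (2 * i + 1 < size) {
     int child = 2 * i + 1;                             // left child
     if (child + 1 < size
         && comp.compare(heap[child + 1], heap[child]) < 0) {
       child++;                                         // right child has higher priority
     }
     if (comp.compare(heap[i], heap[child]) <= 0) break;  // heap condition restored
     E tmp = heap[i]; heap[i] = heap[child]; heap[child] = tmp;  // exchange
     i = child;
   }
 }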

Apart from constant overheads, both addition and removal of elements require a number of operations proportional to the height of the tree. So PriorityQueue provides O(log n) time for offer, poll, remove(), and add. The methods remove(Object) and contains may require the entire tree to be traversed, so they require O(n) time. The methods peek and element, which just retrieve the root of the tree without removing it, take constant time, as does size, which uses an object field that is continually updated.

PriorityQueue is not suitable for concurrent use. Its iterators are fail-fast, and it doesn't offer support for client-side locking. A thread-safe version, PriorityBlockingQueue (see Section 14.3.2), is provided instead.
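For example, the to-do queue shown earlier could be shared between threads by substituting the concurrent implementation. This sketch assumes the INITIAL_CAPACITY and priorityComp declarations from above and an import of java.util.concurrent.PriorityBlockingQueue; no other change to the client code is needed:

 Queue<PriorityTask> sharedQueue =
   new PriorityBlockingQueue<PriorityTask>(INITIAL_CAPACITY, priorityComp);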

14.2.2. ConcurrentLinkedQueue

The other nonblocking Queue implementation is ConcurrentLinkedQueue, an unbounded, thread-safe, FIFO-ordered queue. It uses a linked structure, similar to those we saw in Section 13.2.2 as the basis for skip lists, and in Section 13.1.1 for hash table overflow chaining. We noticed there that one of the main attractions of linked structures is that the insertion and removal operations implemented by pointer rearrangements perform in constant time. This makes them especially useful as queue implementations, where these operations are always required on cells at the ends of the structure, that is, cells that do not need to be located using the slow sequential search of linked structures.

ConcurrentLinkedQueue uses a CAS-based wait-free algorithm, that is, one that guarantees that any thread can always complete its current operation, regardless of the state of other threads accessing the queue. It executes queue insertion and removal operations in constant time, but requires linear time to execute size. This is because the algorithm, which relies on co-operation between threads for insertion and removal, does not keep track of the queue size and has to iterate over the queue to calculate it when it is required.
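As a consequence, code that only needs to know whether the queue has elements should prefer isEmpty, or simply poll and test for null, over comparing size() with zero. A small sketch (not from the book's examples), assuming java.util and java.util.concurrent are imported:

 Queue<String> events = new ConcurrentLinkedQueue<String>();
 events.offer("event 1");                  // offer always succeeds: the queue is unbounded
 events.offer("event 2");
 String next;
 while ((next = events.poll()) != null) {  // drain without calling the linear-time size()
   System.out.println(next);
 }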

ConcurrentLinkedQueue has the two standard constructors discussed in Section 12.3. Its iterators are weakly consistent.
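Weak consistency means that an iterator over a ConcurrentLinkedQueue never throws ConcurrentModificationException; it traverses the elements that existed when it was created, and may, but is not guaranteed to, reflect later modifications. A single-threaded sketch of the same property, again assuming the usual java.util and java.util.concurrent imports:

 Queue<String> messages = new ConcurrentLinkedQueue<String>(
     Arrays.asList("first", "second"));
 Iterator<String> iter = messages.iterator();
 messages.offer("third");           // modification during iteration is permitted:
                                    // no ConcurrentModificationException is thrown
 while (iter.hasNext()) {
   System.out.println(iter.next()); // prints "first" and "second";
                                    // "third" may or may not appear
 }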



