Suppose you have an image and you would like to improve its appearance. You then need to access the individual pixels of the image and replace them with other pixels. Or perhaps you want to compute the pixels of an image from scratch, for example, to show the result of physical measurements or a mathematical computation. The BufferedImage class gives you control over the pixels in an image, and classes that implement the BufferedImageOp interface let you transform images. This is a major change from the image support in JDK 1.0. At that time, the image classes were optimized to support incremental rendering. The original purpose of the classes was to render GIF and JPEG images that are downloaded from the Web, a scan line at a time, as soon as partial image data is available. In fact, scan lines can be interlaced, with all even scan lines coming first, followed by the odd scan lines. That mechanism lets a browser quickly display an approximation of the image while fetching the remainder of the image data. The ImageProducer, ImageFilter, and ImageConsumer interfaces in JDK 1.0 expose all the complexities of incremental rendering. Writing an image manipulation that fit well into that framework was quite complex. Fortunately, the need for using these classes has completely gone away. JDK 1.2 replaced the "push model" of JDK 1.0 with a "direct" model that lets you access pixels directly and conveniently. We cover only the direct model in this chapter. The only disadvantage of the direct model is that it requires all image pixels to be in memory. (In practice, the "push model" had the same restriction. It would have required fiendish cunning to write image manipulation algorithms that processed pixels as they became available. Most users of the old model simply buffered the entire image before processing it.) 
Future versions of the Java platform may support a "pull" model with which a processing pipeline can reduce memory consumption and increase speed by fetching and processing pixels only when they are actually needed.

Accessing Image Data

Most of the images that you manipulate are simply read in from an image file: they were either produced by a device such as a digital camera or scanner, or constructed by a drawing program. In this section, we show you a different technique for constructing an image, namely, to build up an image a pixel at a time. To create an image, construct a BufferedImage object in the usual way.

image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);

Now, call the getRaster method to obtain an object of type WritableRaster. You use this object to access and modify the pixels of the image.

WritableRaster raster = image.getRaster();

The setPixel method lets you set an individual pixel. The complexity here is that you can't simply set the pixel to a Color value. You must know how the buffered image specifies color values. That depends on the type of the image. If your image has a type of TYPE_INT_ARGB, then each pixel is described by four values, for red, green, blue, and alpha, each of which is between 0 and 255. You supply them in an array of four integers.

int[] black = { 0, 0, 0, 255 };
raster.setPixel(i, j, black);

In the lingo of the Java 2D API, these values are called the sample values of the pixel.
You can supply batches of pixels with the setPixels method. Specify the starting pixel position and the width and height of the rectangle that you want to set. Then, supply an array that contains the sample values for all pixels. For example, if your buffered image has a type of TYPE_INT_ARGB, then you supply the red, green, blue, and alpha values of the first pixel, then the red, green, blue, and alpha values of the second pixel, and so on.

int[] pixels = new int[4 * width * height];
pixels[0] = . . . // red value for first pixel
pixels[1] = . . . // green value for first pixel
pixels[2] = . . . // blue value for first pixel
pixels[3] = . . . // alpha value for first pixel
. . .
raster.setPixels(x, y, width, height, pixels);

Conversely, to read a pixel, you use the getPixel method. Supply an array of four integers to hold the sample values.

int[] sample = new int[4];
raster.getPixel(x, y, sample);
Color c = new Color(sample[0], sample[1], sample[2], sample[3]);

You can read multiple pixels with the getPixels method.

raster.getPixels(x, y, width, height, samples);

If you use an image type other than TYPE_INT_ARGB and you know how that type represents pixel values, then you can still use the getPixel/setPixel methods. However, you have to know the encoding of the sample values in the particular image type. If you need to manipulate an image with an arbitrary, unknown image type, then you have to work a bit harder. Every image type has a color model that can translate between sample value arrays and the standard RGB color model.
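Before moving on to color models, the setPixel and getPixel calls above can be exercised in a minimal standalone sketch (the class name RasterDemo is ours, not part of the example programs):

```java
import java.awt.image.BufferedImage;
import java.awt.image.WritableRaster;

public class RasterDemo
{
   public static void main(String[] args)
   {
      BufferedImage image = new BufferedImage(100, 100, BufferedImage.TYPE_INT_ARGB);
      WritableRaster raster = image.getRaster();

      // for TYPE_INT_ARGB, the samples of a pixel are red, green, blue, alpha
      int[] red = { 255, 0, 0, 255 };
      raster.setPixel(10, 20, red);

      // read the samples back into a four-element array
      int[] sample = new int[4];
      raster.getPixel(10, 20, sample);
      System.out.println(sample[0] + "," + sample[1] + "," + sample[2] + "," + sample[3]);
      // prints 255,0,0,255
   }
}
```

Because TYPE_INT_ARGB stores each sample in a full 8 bits, the values read back are exactly the values written.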
The getColorModel method returns the color model:

ColorModel model = image.getColorModel();

To find the color value of a pixel, you call the getDataElements method of the Raster class. That call returns an Object that contains a color-model-specific description of the color value.

Object data = raster.getDataElements(x, y, null);
The color model can translate the object to standard ARGB values. The getRGB method returns an int value that has the alpha, red, green, and blue values packed in four blocks of 8 bits each. You can construct a Color value out of that integer with the Color(int argb, boolean hasAlpha) constructor.

int argb = model.getRGB(data);
Color color = new Color(argb, true);

To set a pixel to a particular color, you reverse these steps. The getRGB method of the Color class yields an int value with the alpha, red, green, and blue values. Supply that value to the getDataElements method of the ColorModel class. The return value is an Object that contains the color-model-specific description of the color value. Pass the object to the setDataElements method of the WritableRaster class.

int argb = color.getRGB();
Object data = model.getDataElements(argb, null);
raster.setDataElements(x, y, data);

To illustrate how to use these methods to build an image from individual pixels, we bow to tradition and draw a Mandelbrot set, as shown in Figure 7-30.

Figure 7-30. A Mandelbrot set

The idea of the Mandelbrot set is that you associate with each point in the plane a sequence of numbers. If that sequence stays bounded, you color the point. If it "escapes to infinity," you leave it transparent. The formulas for the number sequences come ultimately from the mathematics of complex numbers. We just take them for granted. For more on the mathematics of fractals, choose from the hundreds of books out there; one that is quite thick and comprehensive is Chaos and Fractals: New Frontiers of Science by Heinz-Otto Peitgen, Dietmar Saupe, and Hartmut Jürgens [Springer Verlag 1992].

Here is how you can construct the simplest Mandelbrot set. For each point (a, b), you look at sequences that start with (x, y) = (0, 0) and iterate:

x_new = x^2 - y^2 + a
y_new = 2 x y + b

Check whether the sequence stays bounded or "escapes to infinity," that is, whether x and y keep getting larger.
It turns out that if x or y ever gets larger than 2, then the sequence escapes to infinity. Only the pixels that correspond to points (a, b) leading to a bounded sequence are colored. Example 7-10 shows the code. In this program, we demonstrate how to use the ColorModel class for translating Color values into pixel data. That process is independent of the image type. Just for fun, change the type of the buffered image to TYPE_BYTE_GRAY. You don't need to change any other code; the color model of the image automatically takes care of the conversion from colors to sample values.

Example 7-10. MandelbrotTest.java

import java.awt.*;
import java.awt.event.*;
import java.awt.image.*;
import javax.swing.*;

/**
   This program demonstrates how to build up an image from individual pixels.
*/
public class MandelbrotTest
{
   public static void main(String[] args)
   {
      JFrame frame = new MandelbrotFrame();
      frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
      frame.setVisible(true);
   }
}

/**
   This frame shows an image with a Mandelbrot set.
*/
class MandelbrotFrame extends JFrame
{
   public MandelbrotFrame()
   {
      setTitle("MandelbrotTest");
      setSize(DEFAULT_WIDTH, DEFAULT_HEIGHT);
      BufferedImage image = makeMandelbrot(DEFAULT_WIDTH, DEFAULT_HEIGHT);
      add(new JLabel(new ImageIcon(image)), BorderLayout.CENTER);
   }

   /**
      Makes the Mandelbrot image.
      @param width the width
      @param height the height
      @return the image
   */
   public BufferedImage makeMandelbrot(int width, int height)
   {
      BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
      WritableRaster raster = image.getRaster();
      ColorModel model = image.getColorModel();

      Color fractalColor = Color.red;
      int argb = fractalColor.getRGB();
      Object colorData = model.getDataElements(argb, null);

      for (int i = 0; i < width; i++)
         for (int j = 0; j < height; j++)
         {
            double a = XMIN + i * (XMAX - XMIN) / width;
            double b = YMIN + j * (YMAX - YMIN) / height;
            if (!escapesToInfinity(a, b))
               raster.setDataElements(i, j, colorData);
         }
      return image;
   }

   private boolean escapesToInfinity(double a, double b)
   {
      double x = 0.0;
      double y = 0.0;
      int iterations = 0;
      do
      {
         double xnew = x * x - y * y + a;
         double ynew = 2 * x * y + b;
         x = xnew;
         y = ynew;
         iterations++;
         if (iterations == MAX_ITERATIONS) return false;
      }
      while (x <= 2 && y <= 2);
      return true;
   }

   private static final double XMIN = -2;
   private static final double XMAX = 2;
   private static final double YMIN = -2;
   private static final double YMAX = 2;
   private static final int MAX_ITERATIONS = 16;
   private static final int DEFAULT_WIDTH = 400;
   private static final int DEFAULT_HEIGHT = 400;
}

java.awt.image.BufferedImage 1.2
java.awt.image.Raster 1.2
java.awt.image.WritableRaster 1.2
java.awt.image.ColorModel 1.2
java.awt.Color 1.0
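To see that the color-model round trip described above really is independent of the image type, here is a minimal sketch (the class name ColorModelDemo is ours) that sets a white pixel in a TYPE_BYTE_GRAY image without ever touching gray sample values directly:

```java
import java.awt.Color;
import java.awt.image.BufferedImage;
import java.awt.image.ColorModel;
import java.awt.image.WritableRaster;

public class ColorModelDemo
{
   public static void main(String[] args)
   {
      // any image type works here; try TYPE_INT_ARGB instead
      BufferedImage image = new BufferedImage(10, 10, BufferedImage.TYPE_BYTE_GRAY);
      WritableRaster raster = image.getRaster();
      ColorModel model = image.getColorModel();

      // Color -> packed ARGB -> model-specific pixel data -> raster
      Object data = model.getDataElements(Color.white.getRGB(), null);
      raster.setDataElements(5, 5, data);

      // raster -> model-specific pixel data -> packed ARGB -> Color
      Object back = raster.getDataElements(5, 5, null);
      Color c = new Color(model.getRGB(back), true);
      System.out.println(c.getRed());
      // prints 255: white survives the round trip through the gray color model
   }
}
```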
Filtering Images

In the preceding section, you saw how to build up an image from scratch. However, often you want to access image data for a different reason: You already have an image and you want to improve it in some way. Of course, you can use the getPixel/getDataElements methods that you saw in the preceding section to read the image data, manipulate them, and then write them back. But fortunately, the Java 2D API already supplies a number of filters that carry out common image processing operations for you.

The image manipulations all implement the BufferedImageOp interface. After you construct the operation, you simply call the filter method to transform an image into another.

BufferedImageOp op = . . .;
BufferedImage filteredImage
   = new BufferedImage(image.getWidth(), image.getHeight(), image.getType());
op.filter(image, filteredImage);

Some operations can transform an image in place (op.filter(image, image)), but most can't.

Five classes implement the BufferedImageOp interface:

AffineTransformOp
RescaleOp
LookupOp
ColorConvertOp
ConvolveOp

The AffineTransformOp carries out an affine transformation on the pixels. For example, here is how you can rotate an image about its center.

AffineTransform transform = AffineTransform.getRotateInstance(Math.toRadians(angle),
   image.getWidth() / 2, image.getHeight() / 2);
AffineTransformOp op = new AffineTransformOp(transform, interpolation);
op.filter(image, filteredImage);

The AffineTransformOp constructor requires an affine transform and an interpolation strategy. Interpolation is necessary to determine pixels in the target image if the source pixels are transformed somewhere between target pixels. For example, if you rotate source pixels, then they will generally not fall exactly onto target pixels. There are two interpolation strategies: AffineTransformOp.TYPE_BILINEAR and AffineTransformOp.TYPE_NEAREST_NEIGHBOR. Bilinear interpolation takes a bit longer but looks better.
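Here is a minimal standalone sketch of AffineTransformOp in action (the class name TransformDemo is ours). A rotation rarely maps pixels exactly, so for an easily checked result this sketch uses a pure translation instead, which nearest-neighbor interpolation maps exactly:

```java
import java.awt.geom.AffineTransform;
import java.awt.image.AffineTransformOp;
import java.awt.image.BufferedImage;

public class TransformDemo
{
   public static void main(String[] args)
   {
      BufferedImage image = new BufferedImage(100, 100, BufferedImage.TYPE_INT_ARGB);
      image.setRGB(10, 10, 0xFFFF0000); // a single opaque red pixel

      // shift the image 20 pixels to the right
      AffineTransform transform = AffineTransform.getTranslateInstance(20, 0);
      AffineTransformOp op = new AffineTransformOp(
         transform, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);

      BufferedImage filteredImage
         = new BufferedImage(100, 100, BufferedImage.TYPE_INT_ARGB);
      op.filter(image, filteredImage);

      // the red pixel has moved from (10, 10) to (30, 10)
      System.out.println(Integer.toHexString(filteredImage.getRGB(30, 10)));
      // prints ffff0000
   }
}
```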
The program in Example 7-11 lets you rotate an image by 5 degrees (see Figure 7-31).

Figure 7-31. A rotated image

The RescaleOp carries out a rescaling operation

x_new = a * x + b

for all sample values x in the image. Sample values that are too large or small after the rescaling are set to the largest or smallest legal value. If the image is in ARGB format, the scaling is carried out separately for the red, green, and blue values, but not for the alpha values. The effect of rescaling with a > 1 is to brighten the image. You construct the RescaleOp by specifying the scaling parameters and optional rendering hints. In Example 7-11, we use:

float a = 1.5f;
float b = -20.0f;
RescaleOp op = new RescaleOp(a, b, null);

The LookupOp operation lets you specify an arbitrary mapping of sample values. You supply a table that specifies how each value should be mapped. In the example program, we compute the negative of all colors, changing the color c to 255 - c. The LookupOp constructor requires an object of type LookupTable and a map of optional hints. The LookupTable class is abstract, with two concrete subclasses: ByteLookupTable and ShortLookupTable. Because RGB color values are bytes, we use the ByteLookupTable. You construct such a table from an array of bytes and an integer offset into that array. Here is how we construct the LookupOp for the example program:

byte negative[] = new byte[256];
for (int i = 0; i < 256; i++) negative[i] = (byte) (255 - i);
ByteLookupTable table = new ByteLookupTable(0, negative);
LookupOp op = new LookupOp(table, null);

The lookup is applied to each color value separately, but not to the alpha value.
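As a small sketch of the rescaling formula (the class name RescaleDemo is ours): with a = 1.5 and b = -20, a sample value of 100 should become 1.5 * 100 - 20 = 130.

```java
import java.awt.Color;
import java.awt.image.BufferedImage;
import java.awt.image.RescaleOp;

public class RescaleDemo
{
   public static void main(String[] args)
   {
      BufferedImage image = new BufferedImage(10, 10, BufferedImage.TYPE_INT_RGB);
      image.setRGB(0, 0, new Color(100, 100, 100).getRGB());

      RescaleOp op = new RescaleOp(1.5f, -20.0f, null);
      BufferedImage brightened = op.filter(image, null);

      // each color sample 100 becomes 1.5 * 100 - 20 = 130
      Color c = new Color(brightened.getRGB(0, 0));
      System.out.println(c.getRed());
      // prints 130
   }
}
```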
The ColorConvertOp is useful for color space conversions. We do not discuss it here. The most powerful of the transformations is the ConvolveOp, which carries out a mathematical convolution. We do not want to get too deeply into the mathematical details of convolution, but the basic idea is simple. Consider, for example, the blur filter (see Figure 7-32).

Figure 7-32. Blurring an image

The blurring is achieved by replacement of each pixel with the average value from the pixel and its eight neighbors. Intuitively, it makes sense why this operation would blur out the picture. Mathematically, the averaging can be expressed as a convolution operation with the following kernel:

[ 1/9 1/9 1/9 ]
[ 1/9 1/9 1/9 ]
[ 1/9 1/9 1/9 ]

The kernel of a convolution is a matrix that tells what weights should be applied to the neighboring values. The kernel above leads to a blurred image. A different kernel carries out edge detection, locating areas of color changes:

[  0 -1  0 ]
[ -1  4 -1 ]
[  0 -1  0 ]

Edge detection is an important technique for analyzing photographic images (see Figure 7-33).

Figure 7-33. Edge detection

To construct a convolution operation, you first set up an array of the values for the kernel and construct a Kernel object. Then, construct a ConvolveOp object from the kernel and use it for filtering.

float[] elements =
{
   0.0f, -1.0f, 0.0f,
   -1.0f, 4.0f, -1.0f,
   0.0f, -1.0f, 0.0f
};
Kernel kernel = new Kernel(3, 3, elements);
ConvolveOp op = new ConvolveOp(kernel);
op.filter(image, filteredImage);

The program in Example 7-11 allows a user to load in a GIF or JPEG image and carry out the image manipulations that we discussed. Thanks to the power of the image operations that the Java 2D API provides, the program is very simple.

Example 7-11. ImageProcessingTest.java

import java.awt.*;
import java.awt.event.*;
import java.awt.geom.*;
import java.awt.image.*;
import java.io.*;
import javax.imageio.*;
import javax.swing.*;

/**
   This program demonstrates various image processing operations.
*/
public class ImageProcessingTest
{
   public static void main(String[] args)
   {
      JFrame frame = new ImageProcessingFrame();
      frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
      frame.setVisible(true);
   }
}

/**
   This frame has a menu to load an image and to specify
   various transformations, and a panel to show the resulting
   image.
*/
class ImageProcessingFrame extends JFrame
{
   public ImageProcessingFrame()
   {
      setTitle("ImageProcessingTest");
      setSize(DEFAULT_WIDTH, DEFAULT_HEIGHT);

      JPanel panel = new
         JPanel()
         {
            public void paintComponent(Graphics g)
            {
               super.paintComponent(g);
               if (image != null) g.drawImage(image, 0, 0, null);
            }
         };

      add(panel, BorderLayout.CENTER);

      JMenu fileMenu = new JMenu("File");
      JMenuItem openItem = new JMenuItem("Open");
      openItem.addActionListener(new
         ActionListener()
         {
            public void actionPerformed(ActionEvent event)
            {
               openFile();
            }
         });
      fileMenu.add(openItem);

      JMenuItem exitItem = new JMenuItem("Exit");
      exitItem.addActionListener(new
         ActionListener()
         {
            public void actionPerformed(ActionEvent event)
            {
               System.exit(0);
            }
         });
      fileMenu.add(exitItem);

      JMenu editMenu = new JMenu("Edit");
      JMenuItem blurItem = new JMenuItem("Blur");
      blurItem.addActionListener(new
         ActionListener()
         {
            public void actionPerformed(ActionEvent event)
            {
               float weight = 1.0f / 9.0f;
               float[] elements = new float[9];
               for (int i = 0; i < 9; i++) elements[i] = weight;
               convolve(elements);
            }
         });
      editMenu.add(blurItem);

      JMenuItem sharpenItem = new JMenuItem("Sharpen");
      sharpenItem.addActionListener(new
         ActionListener()
         {
            public void actionPerformed(ActionEvent event)
            {
               float[] elements =
               {
                  0.0f, -1.0f, 0.0f,
                  -1.0f, 5.0f, -1.0f,
                  0.0f, -1.0f, 0.0f
               };
               convolve(elements);
            }
         });
      editMenu.add(sharpenItem);

      JMenuItem brightenItem = new JMenuItem("Brighten");
      brightenItem.addActionListener(new
         ActionListener()
         {
            public void actionPerformed(ActionEvent event)
            {
               float a = 1.1f;
               float b = -20.0f;
               RescaleOp op = new RescaleOp(a, b, null);
               filter(op);
            }
         });
      editMenu.add(brightenItem);

      JMenuItem edgeDetectItem = new JMenuItem("Edge detect");
      edgeDetectItem.addActionListener(new
         ActionListener()
         {
            public void actionPerformed(ActionEvent event)
            {
               float[] elements =
               {
                  0.0f, -1.0f, 0.0f,
                  -1.0f, 4.0f, -1.0f,
                  0.0f, -1.0f, 0.0f
               };
               convolve(elements);
            }
         });
      editMenu.add(edgeDetectItem);

      JMenuItem negativeItem = new JMenuItem("Negative");
      negativeItem.addActionListener(new
         ActionListener()
         {
            public void actionPerformed(ActionEvent event)
            {
               byte negative[] = new byte[256];
               for (int i = 0; i < 256; i++) negative[i] = (byte) (255 - i);
               ByteLookupTable table = new ByteLookupTable(0, negative);
               LookupOp op = new LookupOp(table, null);
               filter(op);
            }
         });
      editMenu.add(negativeItem);

      JMenuItem rotateItem = new JMenuItem("Rotate");
      rotateItem.addActionListener(new
         ActionListener()
         {
            public void actionPerformed(ActionEvent event)
            {
               if (image == null) return;
               AffineTransform transform = AffineTransform.getRotateInstance(
                  Math.toRadians(5), image.getWidth() / 2, image.getHeight() / 2);
               AffineTransformOp op = new AffineTransformOp(transform,
                  AffineTransformOp.TYPE_BILINEAR);
               filter(op);
            }
         });
      editMenu.add(rotateItem);

      JMenuBar menuBar = new JMenuBar();
      menuBar.add(fileMenu);
      menuBar.add(editMenu);
      setJMenuBar(menuBar);
   }

   /**
      Open a file and load the image.
   */
   public void openFile()
   {
      JFileChooser chooser = new JFileChooser();
      chooser.setCurrentDirectory(new File("."));

      chooser.setFileFilter(new
         javax.swing.filechooser.FileFilter()
         {
            public boolean accept(File f)
            {
               String name = f.getName().toLowerCase();
               return name.endsWith(".gif") || name.endsWith(".jpg")
                  || name.endsWith(".jpeg") || f.isDirectory();
            }
            public String getDescription() { return "Image files"; }
         });

      int r = chooser.showOpenDialog(this);
      if (r != JFileChooser.APPROVE_OPTION) return;

      try
      {
         image = ImageIO.read(chooser.getSelectedFile());
      }
      catch (IOException e)
      {
         JOptionPane.showMessageDialog(this, e);
      }
      repaint();
   }

   /**
      Apply a filter and repaint.
      @param op the image operation to apply
   */
   private void filter(BufferedImageOp op)
   {
      if (image == null) return;
      BufferedImage filteredImage
         = new BufferedImage(image.getWidth(), image.getHeight(), image.getType());
      op.filter(image, filteredImage);
      image = filteredImage;
      repaint();
   }

   /**
      Apply a convolution and repaint.
      @param elements the convolution kernel (an array of
      9 matrix elements)
   */
   private void convolve(float[] elements)
   {
      Kernel kernel = new Kernel(3, 3, elements);
      ConvolveOp op = new ConvolveOp(kernel);
      filter(op);
   }

   private BufferedImage image;
   private static final int DEFAULT_WIDTH = 400;
   private static final int DEFAULT_HEIGHT = 400;
}

java.awt.image.BufferedImageOp 1.2
java.awt.image.AffineTransformOp 1.2
java.awt.image.RescaleOp 1.2
java.awt.image.LookupOp 1.2
java.awt.image.ByteLookupTable 1.2
java.awt.image.ConvolveOp 1.2
java.awt.image.Kernel 1.2
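The averaging kernel from the discussion above can be checked with a small standalone sketch (the class name BlurDemo is ours): convolving a uniformly gray image leaves interior pixels essentially unchanged, because the average of nine equal values is that value, while border pixels darken since ConvolveOp by default treats pixels outside the image as zero (EDGE_ZERO_FILL):

```java
import java.awt.Color;
import java.awt.image.BufferedImage;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;

public class BlurDemo
{
   public static void main(String[] args)
   {
      // a uniformly gray image
      BufferedImage image = new BufferedImage(10, 10, BufferedImage.TYPE_INT_RGB);
      int gray = new Color(90, 90, 90).getRGB();
      for (int x = 0; x < 10; x++)
         for (int y = 0; y < 10; y++)
            image.setRGB(x, y, gray);

      // the 3 x 3 averaging kernel from the text
      float weight = 1.0f / 9.0f;
      float[] elements = new float[9];
      for (int i = 0; i < 9; i++) elements[i] = weight;
      Kernel kernel = new Kernel(3, 3, elements);
      ConvolveOp op = new ConvolveOp(kernel);
      BufferedImage blurred = op.filter(image, null);

      // an interior pixel averages nine equal values, so it stays at (about) 90;
      // a corner pixel has only four nonzero contributions under EDGE_ZERO_FILL
      System.out.println(new Color(blurred.getRGB(5, 5)).getRed());
      System.out.println(new Color(blurred.getRGB(0, 0)).getRed());
   }
}
```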