Using HLSL to Write a Pixel Shader

So far you've manipulated the vertices and written vertex programs, but that's only half the fun when dealing with the programmable pipeline. What about the color of each individual pixel? For this example, we will take the MeshFile example you wrote back in Chapter 5, "Rendering with Meshes," and update it to use the programmable pipeline instead. This example was chosen because it not only renders vertices, but also colors them from a texture, which is exactly the kind of work a pixel shader can take over.

After loading this project, you will need to make a few changes to the source to begin using the programmable pipeline. First, you will need variables to store the transformation matrices and the effect object, so add those:

private Effect effect = null;

// Matrices
private Matrix worldMatrix;
private Matrix viewMatrix;
private Matrix projMatrix;

You will also need to update the initialization method to handle picking the reference device if shader support isn't available. Replace the method with the one found in Listing 11.2.

Listing 11.2 Initializing Graphics with Fallback
public bool InitializeGraphics()
{
    // Set our presentation parameters
    PresentParameters presentParams = new PresentParameters();

    presentParams.Windowed = true;
    presentParams.SwapEffect = SwapEffect.Discard;
    presentParams.AutoDepthStencilFormat = DepthFormat.D16;
    presentParams.EnableAutoDepthStencil = true;

    bool canDoShaders = true;
    // Does a hardware device support shaders?
    Caps hardware = Manager.GetDeviceCaps(0, DeviceType.Hardware);
    if ((hardware.VertexShaderVersion >= new Version(1, 1)) &&
        (hardware.PixelShaderVersion >= new Version(1, 1)))
    {
        // Default to software processing
        CreateFlags flags = CreateFlags.SoftwareVertexProcessing;

        // Use hardware if it's available
        if (hardware.DeviceCaps.SupportsHardwareTransformAndLight)
            flags = CreateFlags.HardwareVertexProcessing;

        // Use pure if it's available
        if (hardware.DeviceCaps.SupportsPureDevice)
            flags |= CreateFlags.PureDevice;

        // Yes, create our device
        device = new Device(0, DeviceType.Hardware, this, flags, presentParams);
    }
    else
    {
        // No shader support
        canDoShaders = false;

        // Create a reference device
        device = new Device(0, DeviceType.Reference, this,
            CreateFlags.SoftwareVertexProcessing, presentParams);
    }

    // Create our effect
    effect = Effect.FromFile(device, @"..\..\simple.fx", null, ShaderFlags.None, null);
    effect.Technique = "TransformTexture";

    // Store our projection and view matrices
    projMatrix = Matrix.PerspectiveFovLH((float)Math.PI / 4,
        (float)this.Width / (float)this.Height, 1.0f, 10000.0f);

    viewMatrix = Matrix.LookAtLH(new Vector3(0, 0, 580.0f), new Vector3(),
        new Vector3(0, 1, 0));

    // Load our mesh
    LoadMesh(@"..\..\tiny.x");

    return canDoShaders;
}

This is similar to the initialization method used earlier in this chapter, although it checks not only for vertex shader support, but also for pixel shader support. Don't try to run this example yet; you haven't created the HLSL file. Since the initialization now returns a Boolean, the main method will need updating as well. Simply use the same method you used earlier in this chapter.
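If that earlier example isn't handy, the update is simply a matter of checking the return value before entering the message loop. The following is a minimal sketch of what that main method might look like; the form class name (Form1) and the warning text are assumptions, so adjust them to match your project:

static void Main()
{
    using (Form1 frm = new Form1())
    {
        // Show the form first so the device is created against a valid window
        frm.Show();

        if (!frm.InitializeGraphics())
        {
            // No shader support; let the user know the reference device is in use
            MessageBox.Show("Your card does not support shaders. " +
                "This sample will run using the reference device instead.");
        }

        Application.Run(frm);
    }
}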

Since the SetupCamera method uses nothing but the fixed-function pipeline for transformations and lighting, you can eliminate it completely from this source. Naturally, you'll also need to remove the call to it in your OnPaint override. The last code change you need to make is to replace the DrawMesh method with one that uses HLSL to render. Replace the method with the one found in Listing 11.3.

Listing 11.3 Drawing a Mesh with the HLSL
private void DrawMesh(float yaw, float pitch, float roll, float x, float y, float z)
{
    angle += 0.01f;

    worldMatrix = Matrix.RotationYawPitchRoll(yaw, pitch, roll)
        * Matrix.Translation(x, y, z);

    Matrix worldViewProj = worldMatrix * viewMatrix * projMatrix;
    effect.SetValue("WorldViewProj", worldViewProj);

    int numPasses = effect.Begin(0);
    for (int iPass = 0; iPass < numPasses; iPass++)
    {
        effect.Pass(iPass);
        for (int i = 0; i < meshMaterials.Length; i++)
        {
            device.SetTexture(0, meshTextures[i]);
            mesh.DrawSubset(i);
        }
    }
    effect.End();
}

Since this method is called every frame, this is where you combine the transformation matrices and pass the result to the HLSL code through the WorldViewProj variable. You then render each subset of the mesh for every pass in the technique. However, you haven't written the HLSL for this application yet, so go ahead and add a new blank file called "simple.fx" to hold it, and add the code found in Listing 11.4.

Listing 11.4 HLSL Code for Rendering a Textured Mesh
// The world view and projection matrices
float4x4 WorldViewProj : WORLDVIEWPROJECTION;

sampler TextureSampler;

// Transform our coordinates into world space
void Transform(
    in float4 inputPosition : POSITION,
    in float2 inputTexCoord : TEXCOORD0,
    out float4 outputPosition : POSITION,
    out float2 outputTexCoord : TEXCOORD0
    )
{
    // Transform our position
    outputPosition = mul(inputPosition, WorldViewProj);

    // Set our texture coordinates
    outputTexCoord = inputTexCoord;
}

void TextureColor(
    in float2 textureCoords : TEXCOORD0,
    out float4 diffuseColor : COLOR0)
{
    // Get the texture color
    diffuseColor = tex2D(TextureSampler, textureCoords);
};

technique TransformTexture
{
    pass P0
    {
        // shaders
        VertexShader = compile vs_1_1 Transform();
        PixelShader  = compile ps_1_1 TextureColor();
    }
}

You'll notice pretty quickly that things were done differently for these shader programs. Rather than store the return values in a structure, you've simply added out parameters to the method declaration itself. For the vertex program, you only really care about the position (which is transformed) and the texture coordinates (which will simply be passed on).
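If you prefer the structure-based style used earlier in the chapter, the same vertex program could be written with an output structure instead. The following is a sketch only; the structure and function names here are arbitrary:

// Equivalent vertex program using an output structure instead of out parameters
struct TransformOutput
{
    float4 Position : POSITION;
    float2 TexCoord : TEXCOORD0;
};

TransformOutput TransformStruct(
    in float4 inputPosition : POSITION,
    in float2 inputTexCoord : TEXCOORD0)
{
    TransformOutput output;

    // Transform our position
    output.Position = mul(inputPosition, WorldViewProj);
    // Pass the texture coordinates through unchanged
    output.TexCoord = inputTexCoord;

    return output;
}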

The second method is the pixel shader. It accepts the texture coordinates for this pixel and returns the color the pixel should be rendered with. In this case, you simply want the texture's color at that coordinate, so use the tex2D intrinsic, which samples the texture at a given set of coordinates and returns the color at that point. This is about as simple as a pixel shader can get, and it renders the mesh exactly as its textures dictate, much like you see in Figure 11.3.

Figure 11.3. Mesh rendered using HLSL.


This pixel shader doesn't do anything you haven't already seen with the fixed-function pipeline, though, so you should do something a little more exciting. While you're at it, make more than one technique for this HLSL program. Add the following to your HLSL code:

void InverseTextureColor(
    in float2 textureCoords : TEXCOORD0,
    out float4 diffuseColor : COLOR0)
{
    // Get the inverse texture color
    diffuseColor = 1.0f - tex2D(TextureSampler, textureCoords);
};

technique TransformInverseTexture
{
    pass P0
    {
        // shaders
        VertexShader = compile vs_1_1 Transform();
        PixelShader  = compile ps_1_1 InverseTextureColor();
    }
}

This code is similar to the first pixel shader; the only difference is that you subtract the sampled color from 1.0f. Since 1.0f is considered "fully on" for a color channel, subtracting from it produces the inverse color for the pixel. You've also created a new technique whose only difference is the pixel shader method it calls. Now, update the main C# code to allow switching between these two techniques by overriding the key press event and using the number keys:

protected override void OnKeyPress(KeyPressEventArgs e)
{
    switch (e.KeyChar)
    {
        case '1':
            effect.Technique = "TransformTexture";
            break;
        case '2':
            effect.Technique = "TransformInverseTexture";
            break;
    }
    base.OnKeyPress(e);
}

Run the application now and press the 2 key. Notice how the mesh is now rendered so that it looks like a negative? See Figure 11.4.

Figure 11.4. Mesh rendered with inverse colors using HLSL.


How do you think you could add new techniques to mask out the blue color from the rendered mesh? What about only rendering the blue color of the mesh? See the code on the included CD for examples of how to accomplish these tasks.
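As a hint of the approach those examples take, here is a sketch of two pixel shaders that could answer those questions; the function and technique names are made up for illustration, and each would be paired with the existing Transform vertex shader in its own technique, just like the ones above:

// Sample the texture, then zero out the blue channel
void MaskBlueColor(
    in float2 textureCoords : TEXCOORD0,
    out float4 diffuseColor : COLOR0)
{
    diffuseColor = tex2D(TextureSampler, textureCoords);
    diffuseColor.b = 0.0f;
}

// Sample the texture, then keep only the blue channel
void OnlyBlueColor(
    in float2 textureCoords : TEXCOORD0,
    out float4 diffuseColor : COLOR0)
{
    float4 color = tex2D(TextureSampler, textureCoords);
    diffuseColor = float4(0.0f, 0.0f, color.b, color.a);
}

technique TransformMaskBlue
{
    pass P0
    {
        VertexShader = compile vs_1_1 Transform();
        PixelShader  = compile ps_1_1 MaskBlueColor();
    }
}

// A TransformOnlyBlue technique would be identical except for
// compiling OnlyBlueColor() as its pixel shader.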


