This section of the site explores the purpose and use of vertex and pixel shaders (sometimes known as fragment shaders), with notes and examples. It will grow over time as I add more examples and more advanced techniques.
If you are new to shaders I recommend you read these notes in order, i.e. this page first, then vertex shaders, then pixel shaders. It would also be a good idea to read the page on Effect Files before proceeding.
When rendering a scene, the geometry and texturing information is sent to the graphics card for processing. In the past the graphics card had a few hard-wired algorithms that could be used to process this data. This was known as the fixed-function pipeline (FFP). The programmer could select a fixed function and set some card states, but that was the limit of control. The problem with this was that it was hard to create unique-looking games, because the programmer could not control the processing. In an attempt to allow programmers more freedom, and influenced by the success of Pixar's RenderMan technology, the first shader-capable hardware appeared in 2001.
Shaders come in two flavours: vertex shaders, which allow the manipulation of vertex data, and pixel shaders, which allow the manipulation of pixel data. The shader code is loaded into the graphics card's memory and plugged directly into the graphics pipeline. Natively, shader code is written in assembly; nowadays, however, there are a number of higher-level 'C'-style languages that can be compiled down to that assembly, making shaders much easier to program. Microsoft have HLSL (High-Level Shading Language) for use with DirectX, and OpenGL has GLSL (the OpenGL Shading Language). Hardware vendors also provide high-level languages: NVidia have Cg and ATI have ASHLI (Advanced Shading Language Interface). It is to be hoped that one day there will be just one common language, although note that HLSL and Cg are actually the same language (different names for branding purposes).
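To give a taste of what HLSL looks like, here is a minimal sketch of a vertex/pixel shader pair. Names such as WorldViewProj and VertexMain are my own illustrative choices, not anything fixed by the language:

```hlsl
// Illustrative HLSL only -- the identifiers are the programmer's choice.
float4x4 WorldViewProj;   // combined transform supplied by the application

struct VS_OUTPUT
{
    float4 Pos   : POSITION;
    float4 Color : COLOR0;
};

// Vertex shader: transform each vertex into clip space and pass the
// vertex colour straight through.
VS_OUTPUT VertexMain(float4 pos : POSITION, float4 color : COLOR0)
{
    VS_OUTPUT output;
    output.Pos   = mul(pos, WorldViewProj);
    output.Color = color;
    return output;
}

// Pixel shader: output the interpolated colour unchanged.
float4 PixelMain(VS_OUTPUT input) : COLOR
{
    return input.Color;
}
```

The same logic written in shader assembly would be a series of individual instructions (dp4, mov etc.); the high-level compiler generates those for you.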
This site concentrates on DirectX and so from now on I will describe shaders in terms of using them with Direct3D and coding in HLSL.
Shader Model 4 is supported by DirectX 10 running on Windows Vista; among other things it adds a third shader type, the geometry shader.
In order to program shaders we need to know where they fit in the 3D pipeline so that we can see which parts can and cannot be programmed. Wolfgang Engel has written a number of very good articles and books on shaders. I recommend you look at his article on GameDev to see how shaders fit into the pipeline: Shader Programming. Also if you want to explore shading further I would recommend his book Programming Vertex and Pixel Shaders (more information about this book is on the Resources page).
A cut down version of the pipeline is shown below:
The Application stage is your game; it is where you do any scene management (to cut down the number of polygons being rendered), any tessellation of meshes, etc. You then feed the graphics card with vertices and other data. The graphics card processes the vertices by transforming and lighting them using the matrices supplied by the application. The card then culls any invisible polygons and clips to the viewport. This transformed data is then rasterized and passes through the pixel operations before finally being displayed.
We can insert our shaders into the pipeline as shown below:
From this we can see that if we write a vertex shader we can control the transformation and lighting stage of the pipeline. Note: we cannot leave just part of the processing to the card; if we write a vertex shader we must program the transformations ourselves. The pixel shader can manipulate each pixel from the triangle and control texturing etc. Note that this occurs before the alpha test, so a pixel we process may be discarded immediately after passing through our shader.
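To make the pixel-shader side concrete, here is a sketch of a tiny pixel shader that handles texturing. The texture and sampler names are illustrative, and the sampler_state syntax shown is the effect-file form:

```hlsl
// Hypothetical texturing pixel shader -- names are illustrative.
texture DiffuseTexture;

sampler DiffuseSampler = sampler_state
{
    Texture   = <DiffuseTexture>;
    MinFilter = Linear;
    MagFilter = Linear;
};

// Sample the texture at the interpolated coordinates and tint the
// result by the interpolated vertex colour.
float4 TexturedPixel(float2 uv : TEXCOORD0, float4 color : COLOR0) : COLOR
{
    return tex2D(DiffuseSampler, uv) * color;
}
```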
We cannot control everything in the pipeline. Over time other types of shaders are appearing to give us more control, e.g. the geometry shaders introduced with DirectX 10.
Direct3D comes with a facility called effects, defined in .fx files. These are very useful as they allow a number of shaders to be written and one chosen at run time based on the user's graphics card capabilities. Different graphics cards support different capabilities, so it is difficult to write complex shaders that work on every platform. With effect files you can write several versions of a shader and the best one is chosen at run time. More details can be found on the Effect Files page. I will use effect files for all my examples. The SDK comes with a utility called EffectEdit that allows you to load and manipulate an effect file. This is a superb way of writing shaders as you can see the changes as you make them. When you get more advanced, though, you might want a more fully featured tool like RenderMonkey from ATI. It provides a fantastic shader IDE, can export .fx files, and is free.
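To show how an effect file offers this run-time choice, here is a sketch of the technique section of an .fx file with a fallback. The shader entry points (VertexMain, PixelMain) are assumed to be defined earlier in the file, and the technique names are my own:

```hlsl
// Sketch only -- assumes VertexMain/PixelMain are defined above.
technique BestQuality        // used if the card supports vs_2_0/ps_2_0
{
    pass P0
    {
        VertexShader = compile vs_2_0 VertexMain();
        PixelShader  = compile ps_2_0 PixelMain();
    }
}

technique Fallback           // simpler technique for older cards
{
    pass P0
    {
        VertexShader = compile vs_1_1 VertexMain();
        PixelShader  = NULL;    // fall back to fixed-function pixel processing
    }
}
```

At run time the application can step through the techniques with ID3DXEffect methods such as ValidateTechnique or FindNextValidTechnique to pick the best one the card supports.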
As well as vertex and pixel data, there are a number of constant parameters that need to be passed from the game to the shader. For example, you will want to provide the world, view and projection matrices so that your vertex shader can carry out transformations. Effect files make this easy.
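For illustration, such constants are simply declared as global ("uniform") parameters at the top of the .fx file; the variable names below are my own:

```hlsl
// Constants declared in the .fx file; the application sets them by name.
float4x4 World;           // e.g. via ID3DXEffect::SetMatrix("World", ...)
float4x4 View;
float4x4 Projection;
float4   LightDirection;  // e.g. via ID3DXEffect::SetVector
```

The application updates these each frame (typically just before rendering) using the named-parameter methods on the effect interface, and the shader reads them as ordinary global variables.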
You should now have some understanding of the purpose, use and limitations of shaders. The next part of this section will go into more detail about each type of shader with examples and guidelines on programming them. Please follow the links below:
Wolfgang Engel is one of the leading authors on shaders with DirectX and his book Programming Vertex and Pixel Shaders is highly recommended. More information about this book can be found on the books page.