Creating the Shaders

How to initialise vertex and fragment shaders

Shaders are one of the key ways to modify the final output of the graphics pipeline.

What are shaders?

Shaders are programs which are designed to run on a GPU, typically as part of a graphics pipeline. Shaders are intended to operate on many data elements at once, making use of the GPU's highly parallel architecture.

The two types of shader shown in this guide are vertex shaders and fragment shaders:

  • Vertex shaders operate on individual vertices, transforming their positions in space. They are often used to transform the vertex positions from 3D object space to 2D screen space (normalised device co-ordinates).
  • Fragment shaders operate on individual fragments. They determine the pixel colour value that will be rendered into the swapchain image buffer.

There are other types of shader, such as geometry shaders, which are found at other points in the pipeline, but they will not be covered here.

In Vulkan, all shaders must be supplied in a bytecode format called SPIR-V. Shaders can be compiled from GLSL into SPIR-V using a tool called glslangValidator, which is provided with the LunarG Vulkan SDK. Because glslangValidator can be run offline, this approach avoids the need to compile GLSL at runtime.

The SPIR-V bytecode of each shader is wrapped in an object called a shader module. Shader modules are associated with shader stages, which tell Vulkan at what point in the pipeline the shader module should be used. In this case it will be either the vertex shader stage or the fragment shader stage. More information on pipeline stages will be given in Initialising a Graphics Pipeline.

Example: initShaders()

In the code example, both the vertex and fragment shaders are very simple.

The vertex shader transforms the vertex position co-ordinates from local object space into normalised device co-ordinates. The transformation matrix used in this shader is produced by combining an orthographic projection matrix with a rotation matrix: the projection matrix handles the conversion to screen space, and the rotation matrix handles the rotation of the vertices once they are in screen space. This shader also reads in texture co-ordinates from the vertex and immediately outputs them again so they can be passed on to the fragment shader.
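A GLSL vertex shader along these lines might look like the following sketch. The uniform block name, binding numbers and attribute locations are illustrative assumptions, not taken from the example code:

```glsl
#version 450

// Combined orthographic-projection * rotation matrix, supplied via a
// uniform buffer (block and member names are hypothetical).
layout(set = 0, binding = 0) uniform Transform {
    mat4 projRotation;
} ubo;

layout(location = 0) in vec3 inPosition;   // vertex position in object space
layout(location = 1) in vec2 inTexCoord;   // texture co-ordinates from the vertex

layout(location = 0) out vec2 outTexCoord; // passed on to the fragment shader

void main() {
    // Transform from object space to normalised device co-ordinates.
    gl_Position = ubo.projRotation * vec4(inPosition, 1.0);
    // Output the texture co-ordinates unchanged.
    outTexCoord = inTexCoord;
}
```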

The fragment shader samples a 2D texture image using the interpolated vertex texture co-ordinates, then writes the sampled colour to the corresponding pixel of the image buffer.
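The matching GLSL fragment shader could be sketched as follows; again, the sampler name, binding number and locations are illustrative assumptions:

```glsl
#version 450

// Combined image/sampler for the 2D texture (binding is hypothetical).
layout(set = 0, binding = 1) uniform sampler2D texSampler;

layout(location = 0) in vec2 inTexCoord;  // interpolated from the vertex shader

layout(location = 0) out vec4 outColour;  // colour written to the image buffer

void main() {
    // Sample the texture at the interpolated co-ordinates.
    outColour = texture(texSampler, inTexCoord);
}
```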

initShaders() uses a custom helper function which loads the compiled SPIR-V bytecode of each shader into a shader module and sets its shader stage.