Introduction to Physically-Based Rendering with Image-Based Lighting for PowerVR

Implementing Physically-Based Rendering with Image-Based Lighting on PowerVR hardware

This document explains the Physically-Based Rendering (PBR) with Image-Based Lighting (IBL) demo (ImageBasedLighting), which is included in the PowerVR SDK. There are many quirks and challenges associated with implementing PBR with IBL, and this document will go through all of them in detail.

Please refer to the demo itself to get a clearer overview of the various elements which will be referred to throughout this document. It is available either by downloading the PowerVR SDK or from the PowerVR SDK GitHub repository.

There are two versions of the demo available, one using OpenGL® ES and the other using Vulkan®; the two are functionally identical in all respects.

What are Physically-Based Rendering and Image-Based-Lighting?

Physically-Based Rendering (PBR) is a family of techniques that approximate the interaction of light with the objects in a scene. Compared to "old style" lighting models that simply assign diffuse and specular colours to objects, these techniques provide much more realistic lighting. In general, PBR bases the lighting calculations on real material properties. There are a variety of principles involved in PBR which will not be explained in detail here, as there is already a lot of excellent content on the principles of PBR freely available online.

The implementation used in this demo is largely based on the Epic Games publication by Brian Karis, which can be found here: Real Shading in Unreal Engine 4.
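For reference, the specular term at the heart of that approach is the standard Cook-Torrance microfacet BRDF (the diffuse term is a simple Lambertian lobe):

    f(l, v) = \frac{D(h) \, F(v, h) \, G(l, v, h)}{4 \, (n \cdot l)(n \cdot v)}

where D is the normal distribution function (GGX in the paper), F is the Fresnel term, G is the geometric shadowing term, and h is the half-vector between the light direction l and the view direction v.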

Image-Based Lighting (IBL) covers a broad family of techniques primarily focused on approximating global illumination. Global illumination (GI) is the name for a set of algorithms that attempt to model the light arriving from all objects and directions, not just from the expected light sources such as the Sun. In IBL, this is done using a set of images which capture the incoming light from all directions in a scene. This provides excellent results for light arriving from the distant environment, but performance considerations normally make it static: all of the lighting is encoded in the environment map and does not depend on any other objects in the scene.
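Put another way, IBL approximates the standard reflectance integral, with the environment map providing the incoming radiance L_i:

    L_o(v) = \int_{\Omega} L_i(l) \, f(l, v) \, (n \cdot l) \, dl

Evaluating this integral per pixel is far too expensive for real time, which is why approaches such as the one in the publication referenced above pre-integrate it offline, for example into a pre-filtered environment map plus a small 2D BRDF look-up table, and simply sample the pre-computed results at run time.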

Combining these techniques is an effective way of producing very realistic object lighting that can easily be rendered in real time, while giving artists more freedom to define plausible, physical materials.

Nowadays, PBR and IBL are widely used in video games, and major engines, such as Unreal or Unity, support PBR rendering pipelines by default.

So, what's the good stuff?

The basic requirements of this demo, such as setting up a rendering pipeline, are already well covered by other examples in the PowerVR SDK, as well as by other online resources, so they will not be detailed here. If a simple renderer is required, one has already been created in the SDK example called IntroducingPVRUtils.

This document will primarily focus on:

  • Explaining the concepts and the basic steps involved in creating a PBR/IBL asset pipeline to go with the renderer. An asset pipeline starts from an artist-created resource and outputs assets ready to be consumed by the renderer.
  • Demonstrating the tricks PVRTexTool has up its sleeve which can be used to simplify asset processing.
  • Selecting and optimising the texture formats for the assets and the render targets.
  • Dealing with the quirks that have been purposely built into the example. For instance, the implementation uses environment maps containing the values of real light sources, so it does not need to incorporate any lights within the scene itself. One of the challenges with this approach is managing the extreme values, such as the Sun, which can be encoded in these images; handling them improperly can cause problems further down the line.
  • Optimising and managing value ranges. This is necessary because, while very high dynamic range environment maps may provide interesting effects and realistic scenes, they may potentially overflow the value ranges of their storage format, so care and pre-processing are required.
  • Optimising shaders by modifying the shader code to use as much half-precision floating point maths (FP16) as possible, which gives important performance benefits. However, FP16 maths can become quite tricky when using high dynamic range maps and performing operations that square values, such as tonemapping, as it is very easy to overflow the maths and break the scene. A minimal sketch of this kind of value-range pre-processing is shown after this list.
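
As a concrete illustration of the last two points, below is a minimal sketch of one way an HDR environment map texel might be pre-processed before being stored in a half-precision (FP16) format. The function name and the clamping threshold are purely illustrative and are not taken from the SDK demo; the only hard constraint is the FP16 range itself.

    #include <algorithm>

    // Hypothetical pre-processing step applied to each channel of an HDR
    // environment map texel before it is written out in a half-precision (FP16)
    // format. The largest finite FP16 value is 65504, so the illustrative limit
    // below is kept close to sqrt(65504) (~255) to leave headroom for shader
    // maths that squares intermediate values, such as some tonemapping operators.
    float clampHdrChannel(float value)
    {
        constexpr float safeRadianceLimit = 255.0f;
        // Extreme emitters such as the Sun can be encoded with enormous values;
        // clamping them trades a little energy for numerical safety.
        return std::min(value, safeRadianceLimit);
    }

The trade-off is that clamping discards some of the energy of very bright emitters; the sections that follow discuss how the demo actually manages these value ranges.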