Reflections have always been an important topic in video game graphics, given how much they add to the realism of a scene and how computationally expensive they tend to be when aiming for quality. Mobile virtual reality pushes both aspects to their limits: realism is key for VR, so high quality reflections are fundamental, as is carefully balancing the available runtime resources.
Precomputed reflections allow you to reduce the load at runtime, while their quality can be controlled by adjusting the resolution. Our Graphics and Multimedia team has previously demonstrated the performance and quality advantages of using local cubemaps to implement highly efficient rendering techniques for mobile, and for reflections in particular. If you are interested in implementing these in Unity, you can have a look at this video on IceCave 10 VFX techniques for Unity, this blog post on stereo reflections in VR, and this post on reflections based on local cubemaps.
More recently, when developing the Circuit VR demo with Unreal Engine, we considered Unreal’s native implementation of precomputed reflections (see Reflection Capture in Reflection Environment), but it did not offer the flexibility and quality we needed for the Samsung Gear VR. It is still worth trying this out-of-the-box solution to establish whether it works for you and meets your artistic needs.
If the native solution does not work for you, or you want more control over reflections, keep reading as we walk you through the steps required to implement your own high quality, precomputed reflections.
The approach we will follow is based on static local cubemaps: we will go through the basics of the algorithm and then move to implementing it in Unreal. You can find more details on the specifics of the algorithm in this post.
If you do not know what cubemaps are, you can think of them as 360-degree snapshots taken from a certain spot, with one face captured along each of the six axis-aligned directions. Intuitively, if we generate a cubemap looking from a reflective object, we can get the reflected color just by sampling the cubemap along the appropriate direction.
Let us try to directly sample the cubemap along the direction of the reflection:
Sampling a cubemap without any correction
R = reflect(D, N);         // D: view direction, N: surface normal
color = sampleCubemap(R);
The pseudocode is nice and simple, but this approach is flawed: looking at the picture above, we can see that the R vector on the reflective surface and the corresponding R vector in the cubemap are pointing to different locations. This approach only works if the reflection point corresponds to the origin of the cubemap, but we can’t capture a different cubemap for each point on our reflective surface!
Let us consider a scene and add a horizontal mirror to it, using this approach.
Starting scene, without any mirror
Wrong reflections, due to the absence of a local correction
That looks wrong! Luckily, reflections can be fixed with a small modification to the algorithm:
Sampling a cubemap applying a local correction
R = reflect(D, N);           // D: view direction, N: surface normal
P = intersection of the reflected ray with the bounding volume;
R' = P - C;                  // C: origin of the cubemap
color = sampleCubemap(R');
What is new here? We have introduced a bounding volume, which should roughly correspond to the shape of the surroundings (e.g. a box, for a box-shaped room). If we intersect the reflection vector with this bounding volume, we can sample the cubemap using the new direction R’, based on the intersection point we found.
Correct reflections, thanks to the local correction
Side-by-side comparison of the two approaches
The results are much better with the second approach, so what is the catch? Finding the intersection point can range from cheap to extremely expensive, depending on the shape of the bounding volume. While in principle it would be possible to model the room exactly, the amount of computation required to find the intersection point would skyrocket, and that means a higher runtime load on the GPU.
Let us think of the simplest shape: a box. We can compute a ray-box intersection by solving linear equations; can we beat that? We tried moving to a sphere or a cylinder (yes, that room is cylinder-shaped!): what we got was a 20% increase in shader instructions when moving from a box to a sphere, and a 67% increase when moving to a cylinder. Furthermore, the extra complexity in solving 2nd order equations made the algorithm less numerically stable.
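To see where that extra cost comes from, here is a hedged HLSL sketch of the ray-sphere case (names such as WorldPos, ReflDir, SphereCenter, and Radius are illustrative, and the reflection vector is assumed to be normalized):

// Ray-sphere intersection: solve |O + t*D - C|^2 = r^2, a quadratic in t.
// With D normalized the quadratic coefficient is 1, so t = -b +/- sqrt(b^2 - c).
float3 oc = WorldPos - SphereCenter;      // O - C
float b = dot(oc, ReflDir);               // half of the linear coefficient
float c = dot(oc, oc) - Radius * Radius;
float disc = b * b - c;                   // quarter of the discriminant
float t = -b + sqrt(max(disc, 0.0));      // far root: the exit point, since we start inside the sphere

The extra dot products and especially the sqrt are where the additional instructions come from, and taking the square root of a small discriminant is also a source of the numerical instability mentioned above.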
The takeaway? Just stick with a box shape and tweak its parameters: reflections may show some distortion, but you will still get high quality at a lower performance cost.
Let us go through the steps needed to implement reflections, starting from the basics: generating the cubemap that we will later use in our reflective material.
First, look for a Scene Capture Cube in the Modes panel and place it into the scene. This actor will be the origin of our cubemap.
Drag the actor to the desired point of your scene; it should look like a camera. Make sure you do not rotate it, so that the cubemap is generated with the default orientation and sampling it is easier.
Now it is time to connect our Scene Capture Cube to a Cube Render Target: to do so, look for the Texture Target parameter in the Details panel of the Scene Capture Cube, click on it and then select Cube Render Target under Create New Asset.
Once you do that, the scene capture should happen immediately; if that is not the case, try playing the game in the editor. Double-click the Cube Render Target you just created and review its parameters; in particular, Size X controls the resolution of the cubemap, which trades quality against memory.
Right now we have an actor which renders a cubemap each frame, while what we want is a static cubemap, so right-click on the Cube Render Target in the Content Browser and select Create Static Texture.
In our material we will need the world space location of the Scene Capture Cube. You can get it from the Transform section in the Details panel for the actor: click on Location and select World from the dropdown menu, so that the location vector is in world space. Apart from these coordinates, you will not need the Scene Capture Cube and the Cube Render Target anymore.
We are now ready to move to the algorithm itself. We can use Unreal’s Material Functions to nicely wrap our code in a reusable block; therefore, let us create a Material Function called Local Correction.
We will need 3 input nodes: BBoxOrigin, BBoxMax, and BBoxMin; all of them are of type Vector3, and they represent the position of the origin and the extremes of the bounding box. The extremes are in the form (minX, minY, minZ) and (maxX, maxY, maxZ). We will also need the Absolute World Position and the Reflection Vector, which are already available as nodes in the Material Editor.
The first part of the algorithm is the ray-box intersection. Considering a ray in its parametric form, that is P(t) = O + t·D, where O is the ray origin (the world position on the reflective surface) and D is its direction (the reflection vector), we are looking for the value of the t parameter corresponding to the intersection point.
Here is how to implement ray-box intersection in Unreal:
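As a reference alongside the node graph, here is a hedged HLSL sketch of the same slab-based math, taken from inside the box (the names WorldPos, ReflDir, BBoxMin, and BBoxMax mirror the inputs described above and are illustrative):

// Distances along the ray to the min and max planes of each axis.
float3 firstPlaneIntersect = (BBoxMax - WorldPos) / ReflDir;
float3 secondPlaneIntersect = (BBoxMin - WorldPos) / ReflDir;
// Per axis, keep the plane that the ray crosses on its way out of the box.
float3 furthestPlane = max(firstPlaneIntersect, secondPlaneIntersect);
// The exit distance t is the closest of the three per-axis exit distances.
float t = min(min(furthestPlane.x, furthestPlane.y), furthestPlane.z);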
We now have the value of t we were looking for, so we can get the intersection point with the bounding box by implementing P = O + t·D. Finally, we can get the locally corrected vector by subtracting the origin of the cubemap from the intersection point (R' = P - C). We will use this vector in our material to sample the cubemap.
Here is the implementation for this second part of the algorithm:
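As before, a hedged HLSL sketch of what these nodes compute, reusing the t from the intersection above; CubemapPos stands for the Scene Capture Cube location we noted down earlier:

// Intersection point with the bounding box: P = O + t * D.
float3 intersectPos = WorldPos + ReflDir * t;
// Locally corrected sampling direction: R' = P - C.
float3 correctedRefl = intersectPos - CubemapPos;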
As a final touch, click on the background of your Material Editor to bring up the Material Function properties and tick the Expose to Library checkbox so that you can easily add the function to any material as a standard Material Editor node.
With our Material Function out of the way, we only need to set up a Material which uses it.
You will typically have a pre-existing material to which you want to add reflections; this can be achieved by attaching the sampled cubemap color to the material's Emissive Color input.
Specifically, you will need to place the Local Correction node we just created in your Material Editor and provide its inputs (the origin and the extremes of the bounding box, in world space coordinates). The output of the Local Correction node should be attached to the UVs input of a Texture Sample Parameter Cube node, whose Texture should be set to the cubemap we generated previously.
Once that is done, your reflections should be in place! You can then do any sort of processing with them, depending on your specific needs. A very basic control you can implement is a Linear Interpolation (Lerp) node, interpolating between (0, 0, 0) and the sampled texture color. This allows you to tone down the reflection intensity, based on the Alpha input of the Lerp.
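In HLSL terms, the sampling and intensity control amount to something like the sketch below; Cubemap, CubemapSampler, and ReflectionIntensity are illustrative names for the parameters you would expose in the material:

// Sample the static cubemap along the locally corrected direction.
float3 reflColor = Cubemap.Sample(CubemapSampler, correctedRefl).rgb;
// Interpolate between black and the reflection color to tone the effect down.
float3 emissive = lerp(float3(0.0, 0.0, 0.0), reflColor, ReflectionIntensity);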
The opportunity to process the cubemap offline is extremely important in mobile VR, where runtime post-processing is not really an option as it is too demanding. We can achieve the desired effects by exporting the cubemap, editing it in an external program, and reimporting it into Unreal. As an example, blurring the cubemap is a great tool for softer reflections.
When implementing localized reflections, we ran into a serious issue with flickering. Reflections appeared extremely unstable, as if a different point in the cubemap were sampled at each frame. A number of approaches helped reduce the issue: changing the Mip Gen Settings to get more blurred reflections, playing with the Filtering options in the Texture Editor, and simplifying the normal map of the object (as the issue was more prominent near the edges). Nevertheless, what actually solved the issue was ticking Use Full Precision under the Mobile section in the Material Editor. This is not ideal, as it increases the execution time of the shaders, but you may want to consider this tradeoff between quality and performance.
If you want to reuse the same reflective material with several different cubemaps, you can create Material Instances based on that material; they are more efficient because they share most of the compiled code. Material Instances are allowed to change the parameters of the original material: we have already specified the cubemap as a parameter, and you will also need to provide the inputs of the Local Correction function as parameters.
Reflections based on static local cubemaps are a powerful tool to give a better sense of realism to your game at an affordable performance cost. Their main advantage is that they remove runtime workload by moving it to the offline rendering process.
Going for our manual approach is more complex than using the native Reflection Environment feature, but it also gives you more control over the generation and processing of the cubemap. Furthermore, on certain platforms the native feature is supported only at a low resolution; in that case, a manual implementation is the only way to achieve high quality static reflections.
An intrinsic limitation to precomputed reflections is that they are only able to reproduce correct reflections in a static environment, as moving objects require their reflections to be computed at runtime. However, considering the fluidity requirements for a good mobile virtual reality experience and the performance budget we are working with, high quality static reflections are the right choice to convey the sense of immersion which is so essential to VR.
Hi, great tutorial! I was using this method in Unity and recently switched to UE4, so I was happy to find this. I have a question about the BBoxMin param from the last image: how are you getting the negative values (-250, -250, 0, 0) in that node? I enter those and it changes to 0. I tried using the Constant4Vector and going into the details but that did not seem to work either. Sorry, I'm very new to Unreal so I'm not really sure which node I should be using for those parameters. Thanks!
Also, I was wondering if you could do a tutorial on converting the Dynamic Soft Shadows using cubemaps (https://community.arm.com/developer/tools-software/graphics/b/blog/posts/dynamic-soft-shadows-based-on-local-cubemap)? I was using that in my Unity projects and wondered if it would work in UE4. Thanks!

Hi Jaye,
That node is a VectorParameter. I'm surprised that the negative values are not working for you; I tried setting up a simple one and it just worked. It worked with Constant4Vector as well. So your choice of nodes is correct; you're hitting an issue I didn't know of. Could you share more about how you are setting these values? A screenshot would help too.
Regarding dynamic soft shadows using local cubemaps, we don't have any immediate plans to do such a tutorial, but we could do one in the future. :)
In the meantime, I believe that you can get the same approach to work in Unreal. It will require setting up another Cube Render Target but rendering visibility information, then using that information in your main shader, similarly to how it's done in Unity. Hope this is enough to get you started; let us know how it goes!

Hi,
Thanks for the info. But how do you create materials specifically for Android ES3.1 using Unreal? For example, how do you create an emissive material?

Hi rambuli, the default material model in Unreal supports features such as Emissive Color and should map them to the necessary ES3.1 features. Make sure you enable ES3.1 Android support in your project settings.