Stereoscopic 3D inside Stereoscopic 3D [Unreal]

The longish (and maybe confusing) premise

Let’s say you are navigating a VR experience using a Head Mounted Display (HMD), or enjoying a projected VR environment on a power wall or CAVE. Now, for some reason, you want to look at a stereoscopic image within that VR world. Generally speaking, in order to properly experience stereoscopic images you need a separate image for each eye.
But…wait! You are already getting a separate image for each eye, since you are experiencing VR through that HMD/screen/CAVE!
How could we provide a “depth-within-depth” experience to the user?
This is where a simple “trick” comes in: you can create a material/shader which shows a different image depending on which eye is currently being rendered. In other words, the material knows whether its pixels are being rendered for the left or the right eye, and based on that it shows the correct image. The result? It feels like a 3DTV, but instead of looking at it while sitting on the sofa, you are doing it inside VR.

Why and how?

The need for a Stereo3D inside Stereo3D experience came up as a requirement for a setup where we wanted to show a stereo 3D video inside a VR environment.

At the moment, we are working with Unreal Engine, so the implementation below is exclusively for Unreal, but I suppose the same concept can be applied to other setups.

Required Custom Node

We are going to create some Unreal Materials. These materials will be using a custom node. I called it CurrentStereoIndex, but you can of course choose the name you prefer.

This newly created node will output 0 if the material is being rendered for the left eye, and 1 if it is being rendered for the right eye.

To create this custom node, while working on your material, add a custom node (right click -> Custom). Then, in the Material Expression Custom part of the node’s Details section, do the following:
– set the Output Type to CMOT Float 1.
– fill in the Description and Desc fields as you prefer.
– we don’t need Inputs for this specific node, so leave the array at 0 Elements.
– finally, and most importantly, paste the following in the Code field:

// 0 when rendering for the left eye, 1 when rendering for the right eye
return ResolvedView.StereoPassIndex;

Version 1 – Separate Left and Right images
Create the material shown in the following picture. In this case, its shading model is set to “Unlit”, but this is not strictly required. The material exposes two Texture parameters, TextureLeft and TextureRight. Note that the material uses our custom node to choose which texture to sample!
Once the material is ready, you can create a material instance from it, and then assign your left and right images to its TextureLeft and TextureRight parameters. You can do this once if you need to show a still image, or update the parameters every frame if you want to display a video.
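
If it helps to read the node graph as code, the selection step boils down to something like the custom node sketched below. This is only a sketch, not the exact graph from the picture: LeftColor and RightColor are assumed names for two custom node inputs, fed by the RGB outputs of the Texture Sample nodes reading TextureLeft and TextureRight, with the Output Type set to CMOT Float 3 and plugged into Emissive Color.

// Hypothetical selection node (a sketch, not the exact graph from the picture)
// LeftColor, RightColor: RGB outputs of the Texture Sample nodes for TextureLeft and TextureRight
// StereoPassIndex is 0 for the left eye and 1 for the right eye, so lerp picks the matching color
return lerp(LeftColor, RightColor, (float) ResolvedView.StereoPassIndex);

In the actual graph you can of course obtain the same result with a Lerp (or If) node driven by the CurrentStereoIndex output.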

Texture parameters can be set from a Blueprint or from C++: there are dedicated Blueprint nodes and C++ functions to do that on your material instance. I am not giving examples here, since the way you provide your images/frames might be very specific to your project.

[Figure: Version 1 material graph – Separate Left and Right images]

Version 2 – Stitched Left and Right images (side by side stereo pair)
In this case, we will be providing a single Texture to our material. Many stereoscopic cameras, images and videos provide their stereo pair in this format, with both views stitched into a single frame. So, the Texture parameter passed to the material will contain both the left and the right image: we need to split it so that each eye only sees its own half.

In the case of a side by side stereo pair, we need to access the left horizontal half of the texture to retrieve the left eye image, and the right half to get the right eye image.
Textures are accessed through UV texture coordinates. We can think of them as horizontal (U) and vertical (V) coordinates addressing the texture pixels.
If we access pixels addressed by U coordinate values between 0 and 0.5, we get the left eye image; U coordinates between 0.5 and 1 give us the right eye image. We are not touching the V coordinate values, which go from 0 to 1 for both images.
Note: for over/under stereo pairs, we would do the opposite, splitting along the V coordinate instead.

Again, we are using our custom CurrentStereoIndex node. In this case, the node does not choose between two Textures, but between two sets of UVs, and therefore between the two halves of the stitched texture.
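
As a sketch only, here is what that UV selection could look like if it were condensed into a single custom node instead of regular material nodes. UV is an assumed float2 input fed by a TexCoord node, and the Output Type would be CMOT Float 2, plugged into the UVs pin of the Texture Sample node reading the stitched texture.

// Hypothetical UV remapping node for a side by side stereo pair (a sketch, not the exact graph from the picture)
// Input: UV (float2) from a TexCoord node. Output Type: CMOT Float 2
// Left eye (StereoPassIndex = 0): U ends up in the 0 - 0.5 range
// Right eye (StereoPassIndex = 1): U ends up in the 0.5 - 1 range
return float2(UV.x * 0.5 + 0.5 * (float) ResolvedView.StereoPassIndex, UV.y);

For an over/under stereo pair, you would remap V in the same way and leave U untouched.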

[Figure: Version 2 material graph – Stitched Left and Right images (side by side stereo pair)]

What now?

You can apply instances of these materials wherever you prefer. Personally, I am using them as materials for simple planes. When seen in VR, they really look like 3D TVs 🙂

dario mazzanti, 2021
