What is foveated rendering on PSVR2?


Wondering what foveated rendering is and how it works? Don’t worry. Here’s a detailed explainer on this VR technology.

PlayStation VR2's arrival has brought foveated rendering back into the spotlight. Mind you, this isn't some revolutionary tech that's just landed on our virtual doorsteps; VR headsets have been banking on it to amp up their performance for a while now.

But powerhouses like the PS VR2 and PS5 leverage it to smooth out potential performance issues, which has turned some heads.


Other virtual reality headsets, like the Meta Quest Pro, have used foveated rendering before. The concept is devilishly simple, yet incredibly clever.

What is foveated rendering?

Foveated rendering reduces the resolution of the areas you're not looking at. The idea is that you don't need to render the whole scene at full quality all at once: whatever you're focusing on gets rendered at full fidelity, while the parts you're not looking at are rendered at reduced detail.

An exaggerated rendition of foveated rendering, with the edges blurred and a symbol showing where you'd be focusing with your eyes.

PS VR2 and Meta Quest Pro use eye tracking to follow where you're actually looking, rather than relying on the tilt of your head. Because your eyes only focus on one area at a time, the game or software can make sure it isn't rendering more than it needs to.
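To make the idea concrete, here's a minimal sketch of how an eye-tracked renderer might decide how much resolution each part of the screen deserves. The function name, radii, and falloff values are purely illustrative, not PS VR2's actual parameters.

```python
def shading_rate(pixel_angle_deg, gaze_angle_deg, full_res_radius=10.0, min_rate=0.25):
    """Return a fraction of full resolution for a screen region, based on how
    far (in degrees) it sits from the tracked gaze direction.

    Hypothetical values: real headsets tune these per lens and per game."""
    eccentricity = abs(pixel_angle_deg - gaze_angle_deg)
    if eccentricity <= full_res_radius:
        return 1.0  # inside the foveal region: render at full resolution
    # outside the fovea, fall off linearly toward a quality floor
    falloff = 1.0 - (eccentricity - full_res_radius) / 40.0
    return max(min_rate, falloff)
```

Wherever you look, the gaze angle moves with you, so the full-resolution region follows your eyes while the periphery quietly drops to a fraction of the work.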


The major difference between the Meta Quest 2 and PS VR2 is that Meta's cheaper headset only supports 'fixed' foveated rendering. This means it always renders the outer edges of the display at reduced quality, rather than tracking your eyes to decide where to cut back.
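The distinction boils down to where the high-resolution region is anchored. A toy sketch, with made-up names and coordinates:

```python
def foveal_center(gaze_uv=None, lens_center_uv=(0.5, 0.5)):
    """Where the full-resolution region sits on screen (UV coordinates).

    Eye-tracked foveation (PS VR2, Quest Pro) follows the latest gaze sample;
    fixed foveation (Quest 2) always uses the lens center. Illustrative only."""
    return gaze_uv if gaze_uv is not None else lens_center_uv
```

With fixed foveation, look toward the edge of the lens and you'll see the lower-quality region, because it never moves; with eye tracking, the sharp spot travels with your gaze.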

Foveated rendering in video

This has been used in other virtual reality applications as well. Foveated rendering was used in Insta360's video players: as footage from the Pro cameras is a whopping 8K, the software would reduce the bitrate of the parts of the video the user wasn't looking directly at.
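For 360° video this is often done per tile: only tiles inside the viewer's current field of view get the full-quality stream. A rough sketch of that selection logic, with invented bitrate numbers (not Insta360's actual implementation):

```python
def tile_bitrate(tile_center_yaw, view_yaw, full_mbps=50.0, low_mbps=5.0, fov=90.0):
    """Pick a stream bitrate for one tile of a 360-degree video based on whether
    it falls inside the viewer's current field of view. Values are illustrative."""
    # wrap-around angular distance between the tile and the view direction
    delta = abs((tile_center_yaw - view_yaw + 180.0) % 360.0 - 180.0)
    return full_mbps if delta <= fov / 2.0 else low_mbps
```

As the viewer turns their head, tiles swapping in and out of view switch between the high- and low-bitrate streams, so 8K footage only costs 8K bandwidth where it's actually being watched.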


Reduced rendering in games

Games have done something similar in different ways for years, with Half-Life 2 and other Source engine games being obvious examples.

The further away an object is in Half-Life 2, the less detail it's rendered with. A zombie far off in the distance doesn't need its full animation or graphics applied. As it gets closer, the enemy begins to render fully, as the extra detail becomes visible.


This was a major issue in the very original version of Final Fantasy XIV, where barrels and other objects were consistently rendered at full detail rather than being reduced in busy areas, which frequently caused the game to perform poorly. This kind of level-of-detail system is also why things in games often look worse when you begin to observe them closely, before the full-detail version kicks in.