AMD announces FidelityFX Super Resolution 2.0

Prasad, 17 March 2022

Hot on the heels of its new CPU announcements, AMD today announced the successor to the FidelityFX Super Resolution technology it introduced last year. FSR 2.0 swaps spatial upscaling for temporal reconstruction for improved image quality at all resolutions.


To understand FSR 2.0, we must first revisit FSR 1.0. The original technique was a simple spatial upscaling technology. It worked exclusively on the current frame to upscale it to a higher resolution image. In essence, it worked much the same way as your favorite image editor upscaling an image to a higher resolution while also applying a bit of sharpening.
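To make the comparison concrete, here is a minimal sketch of that kind of spatial upscaling, not AMD's actual FSR 1.0 code: a nearest-neighbor resize of a tiny grayscale "image" followed by a simple unsharp-mask pass. All function names and values here are illustrative assumptions.

```python
def upscale_nearest(img, factor):
    """Nearest-neighbor upscale of a 2D grayscale image (list of lists)."""
    return [[img[y // factor][x // factor]
             for x in range(len(img[0]) * factor)]
            for y in range(len(img) * factor)]

def sharpen(img, amount=0.5):
    """Simple unsharp mask: push each interior pixel away from the
    mean of its 4 neighbors, clamped to the 0-255 range."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            mean = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
            out[y][x] = max(0.0, min(255.0, img[y][x] + amount * (img[y][x] - mean)))
    return out

low_res = [[0, 0], [0, 255]]                      # 2x2 source frame
high_res = sharpen(upscale_nearest(low_res, 2))   # 4x4 upscaled + sharpened
```

Note that nothing here looks beyond the single input frame, which is exactly the limitation the article describes next.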

This technique had many downsides. It required a robust anti-aliasing solution already implemented in the engine as it did not include any anti-aliasing of its own. Any aliasing or shimmering artifacts in the game would thus also be present in the FSR image (unlike DLSS). It also only worked well at higher resolutions as low resolutions did not have enough data to upscale, which was another area where DLSS' AI reconstruction was superior.

Native vs FSR 2.0 vs FSR 1.0 at Quality and Performance presets

This is where FSR 2.0 comes in. FSR 2.0 switches to temporal image reconstruction. This technique uses frame color, depth, and motion vectors in the rendering pipeline and information from previous frames to reconstruct a higher resolution image. FSR 2.0 includes its own anti-aliasing solution, which replaces the one built into the engine.
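The core idea of temporal accumulation can be sketched in a few lines. This is an illustrative toy, not AMD's actual FSR 2.0 algorithm: a history buffer is reprojected using per-pixel motion vectors, then blended with the current frame's new samples. The function name, the 1D pixel lists, and the blend factor are all assumptions for illustration.

```python
def temporal_accumulate(history, current, motion, alpha=0.1):
    """1D toy example: history/current are pixel lists; motion[i] gives
    the offset to pixel i's position in the previous frame."""
    out = []
    for i, cur in enumerate(current):
        src = i - motion[i]              # reproject: where was this pixel?
        if 0 <= src < len(history):
            prev = history[src]
            out.append((1 - alpha) * prev + alpha * cur)  # exponential blend
        else:
            out.append(cur)              # disocclusion: no valid history
    return out

history = [100.0, 100.0, 100.0, 100.0]   # accumulated previous result
current = [0.0, 0.0, 0.0, 0.0]           # new (noisy, low-res) samples
frame1 = temporal_accumulate(history, current, motion=[0, 0, 0, 0])
```

Because each output pixel mixes many frames' worth of samples, the result can resolve detail that no single low-resolution frame contains, which is what makes reconstruction more powerful than pure spatial upscaling.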

The advantage of this technique is that it can recreate more detail than simple upscaling. And since it is image reconstruction, it can also retain more detail in the final image than may even be possible with native rendering.

FSR 2.0 in principle is similar to the way Nvidia's DLSS 2.0 and Intel's XeSS work. However, both of those techniques use AI and machine learning assistance in reconstructing the image, which requires specialized hardware components, such as the Tensor cores for DLSS. FSR 2.0 does not include an AI or ML component, which means it can run on a wider variety of hardware, including GPUs from Nvidia and Intel. On the other hand, it may not reconstruct as much detail from a low resolution image as DLSS. We will have to wait and see.

At present, we only have one example of FSR 2.0 in action: the limited samples AMD released from Deathloop, which will be one of the first titles to include the feature. In these images, FSR 2.0 has a clear advantage over FSR 1.0 at the higher Quality preset and especially at the lower Performance preset. In some areas, FSR 2.0 also resolves more detail than the native 4K rendering.

FSR 2.0 will work on all hardware that FSR 1.0 currently works on. As with FSR 1.0, FSR 2.0 can be adopted on PC, consoles, and also mobile, with adoption up to the respective game developer. The feature will be rolled out in Q2 2022.

AMD also announced Radeon Super Resolution technology, which is FSR 1.0 implemented at the driver level for AMD graphics cards. This allows the user to enable the feature in any game that runs in exclusive fullscreen mode. The downside is that it is a bit of a hassle to enable, and that unlike FSR integrated within a game, which upscales only the 3D image and not the HUD, RSR will upscale everything on the screen, as the entire game needs to be manually set to run at a lower resolution.

RSR is available now through the latest Adrenalin Edition 2022 release. The feature is only available on RX 5000 series graphics cards and newer.


Reader comments

  • Adul Al Salami Kebab
  • 21 Mar 2022

Seems to work great on the RX 6600 XT! :D
  1. enable Radeon Super Resolution in AMD/Radeon Software
  2. set in-game resolution lower than 1680x1050
  3. profit from fps boost and the game looking smooth AF

  • Anonymous
  • 18 Mar 2022

Lol, sure, Apple *pays* benchmarks. Do you have any proof for your bold (bald?) claims?

  • AnonF-1042576
  • 18 Mar 2022

Nah, Ovendragon can't make a chipset, and with TSMC they're double losers. Samsung nodes are so much better; Nvidia and Intel use Samsung without overheating. Who uses T*MC? Apple, who pays benchmarks? Lol, crapdragon
