Technology

AMD Radeon VII to Support DLSS Equivalent

Adam Kozak, Product Marketing Manager at AMD, recently let slip some intriguing details to the Japanese tech website 4gamer.net. The most interesting of these was a hint at a direct, multi-vendor competitor to Nvidia’s DLSS anti-aliasing.

That competitor is DirectML, developed by Microsoft, and the Radeon VII is a verified card for it. DirectML is essentially an add-in machine-learning extension for DirectX 12, similar in spirit to DirectX Raytracing (DXR), that exposes intelligent machine learning through the graphics API, thus the name. Although it is possible to ray trace through the library, without dedicated hardware to accelerate the feature it’s still not a cost-effective way of producing the same output Nvidia achieves with its ray-tracing cores, at least not without a significant performance loss on the traditional shader side of things.

However, one thing DirectML does enable is a form of anti-aliasing with an effect, and a performance hit, similar to DLSS, yet one that’s compatible with AMD’s latest hardware rather than exclusively Nvidia’s Tensor cores. That’s a big deal, and if the Radeon VII supports it as standard at launch, it could make the card far more appealing than we initially gave it credit for. After all, multi-vendor standards like this are typically more likely to be adopted by developers than their proprietary counterparts. Despite Nvidia’s colossal share of the dedicated graphics card market, AMD’s console dominance (with the next-gen consoles set to feature the likes of its upcoming Navi GPU) makes it increasingly likely that more AAA titles will turn to DX12, and in turn DirectML, rather than DLSS and DXR.

Of course, it’s that last part that’ll be the clincher for AMD: DX12 still isn’t extensively supported on the PC platform, and although adoption is on the uptick, we’re not quite there yet.

It certainly seems the race for ever more shader cores is coming to an end. As both manufacturers begin to lean more on dedicated hardware and machine learning, could this be the beginning of a new GPU arms race? Let’s hope so.