AMD has published a blog post that takes aim at rival Nvidia over its lack of usable VRAM. As textures in modern video games grow larger, insufficient VRAM can lead to reduced performance in some titles.
AMD has been relatively quiet on the GPU front since the launch of the Radeon RX 7900 XT and XTX. Now, however, the company has hit back at Nvidia, criticizing the manufacturer for the lack of VRAM on its current and older graphics cards.
Recent examples of VRAM limitations causing problems can be found in modern releases such as Hogwarts Legacy, Forspoken, and The Last of Us Part I. Now that developers are building games around modern graphics hardware, VRAM usage spikes noticeably when playing at higher resolutions.
AMD has traditionally packed more VRAM into its GPUs than Nvidia. However, Nvidia's higher-end GPUs use a slightly faster memory standard, GDDR6X, versus AMD's GDDR6. Nvidia has also traditionally used the slower standard on some GPUs at the lower end of its stack.
AMD also highlights that its graphics cards start at 16GB of VRAM once you hit the $500 mark. Nvidia, meanwhile, currently sells the RTX 4070 Ti with 12GB of GDDR6X VRAM at an MSRP of $799.
The brewing VRAM war between AMD and Nvidia
AMD showcases benchmarks across several graphics cards, with prices based on the lowest market pricing on Newegg. While some of the benchmarks do not line up with our own internal testing, the fine folks at the YouTube channel HardwareUnboxed have also tested AMD and Nvidia's previous-generation offerings, which share this VRAM disparity, with unsurprising results: AMD's graphics cards come out ahead at higher resolutions in many cases.
However, while VRAM capacity is an issue, Nvidia has one edge over AMD this generation: DLSS 3's frame generation. These AI tools and VRAM capacity could be the deciding factors in exactly which GPU you choose to pick up next.