Is a GPU with over 16GB of VRAM worth buying?

If you're torn between the RTX 5070, 5080, or 5090, let me be blunt: don't let VRAM size influence your decision. For most needs, especially gaming, you almost never need more than 16GB of VRAM.


VRAM is only really important when you're gaming at 4K or higher or handling heavy AI tasks. Most users today still play at 1080p and 1440p, settings where high VRAM capacity is almost useless.

1080p and 1440p resolutions don't fully utilize the large amount of VRAM.

My first graphics card was a GTX 970, bought in 2016, with only 4GB of VRAM (and famously only 3.5GB of it at full speed), but it was still considered a decent mid-range card at the time. Now, some consumer cards reach 32GB of VRAM — more than the total RAM of many mainstream PCs.


However, the majority of gamers still play at 1080p or 1440p resolution. According to Steam statistics, 74% of users are at these two levels, with 1080p accounting for more than half. And when playing at resolutions lower than 4K, games simply cannot take full advantage of the large amount of VRAM.
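One way to see why resolution drives VRAM demand is to estimate the memory consumed by a game's full-resolution render targets. The sketch below uses assumed, illustrative values (an HDR color format and a handful of buffers); real games allocate far more for textures and geometry, but the scaling with pixel count is the point:

```python
# Rough estimate of GPU memory used by full-resolution render targets.
# BYTES_PER_PIXEL and the buffer count are illustrative assumptions;
# what matters is how the total scales with resolution.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

BYTES_PER_PIXEL = 8  # e.g. an RGBA16F HDR color buffer


def render_target_mib(width: int, height: int, buffers: int = 4) -> float:
    """Memory for `buffers` full-resolution render targets, in MiB."""
    return width * height * BYTES_PER_PIXEL * buffers / 2**20


for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {render_target_mib(w, h):.0f} MiB of render targets")
```

Since 4K has exactly four times the pixels of 1080p, every per-pixel buffer costs four times as much memory — which is why high VRAM capacity only starts to matter at 4K and above.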

Currently, I'm using an RTX 3080 12GB, and previously an RTX 3060 12GB. With a 1440p monitor, even at max settings, the VRAM is never pushed to its maximum. In demanding games like Ghost Recon Wildlands or Breakpoint, the highest I've ever seen used was only around 8GB.
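If you want to check these numbers on your own system, a minimal sketch (assuming an NVIDIA card with `nvidia-smi` on the PATH) is to query the driver and parse its CSV output:

```python
# Poll current VRAM usage via nvidia-smi's standard CSV query flags.
import subprocess

QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]


def parse_vram(csv_line: str) -> tuple[int, int]:
    """Parse one 'used, total' CSV line (values in MiB) into ints."""
    used, total = (int(v.strip()) for v in csv_line.split(","))
    return used, total


def current_vram() -> tuple[int, int]:
    """Return (used, total) MiB for the first GPU."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return parse_vram(out.stdout.splitlines()[0])


# Example of the line nvidia-smi prints for a 12GB card with ~8GB in use:
print(parse_vram("8142, 12288"))  # (8142, 12288)
```

Running a query like this while playing is the simplest way to find out whether your games ever come close to your card's limit.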

So if you're playing at 1080p or 1440p, buying a GPU with more than 16GB of VRAM is just a waste of money. A mid-range card is sufficient; only consider upgrading when you switch to a 4K monitor.



Upscaling technology makes mid-range cards as powerful as high-end cards.

I'm not a big fan of upscaling or frame generation technologies, but I have to admit they're really effective. DLSS, DLAA, FSR, and other AI-powered features let mid-range GPUs punch well above their native-rendering performance, without needing extra VRAM.

Because the game renders internally at a lower resolution and the upscaler reconstructs the higher-resolution output frame, VRAM usage stays close to that of the lower render resolution. That means you can play games at 4K output without having to invest in a GPU with lots of VRAM.
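The arithmetic behind this is straightforward. The sketch below uses per-axis render-scale factors commonly cited for upscaler quality presets (an assumption on my part, not vendor documentation) to show how few pixels are actually rendered for a 4K output:

```python
# Internal render resolution under common upscaler quality modes.
# The per-axis scale factors below are the commonly cited presets
# (e.g. "Quality" ~ 2/3, "Performance" = 1/2) — treat as illustrative.

OUTPUT = (3840, 2160)  # 4K output resolution

QUALITY_MODES = {
    "Native":      1.0,
    "Quality":     2 / 3,   # renders at 2560x1440 for a 4K output
    "Balanced":    0.58,
    "Performance": 0.5,     # renders at 1920x1080 for a 4K output
}


def internal_resolution(output: tuple[int, int], scale: float) -> tuple[int, int]:
    """Per-axis scaled render resolution, rounded to whole pixels."""
    w, h = output
    return round(w * scale), round(h * scale)


for mode, scale in QUALITY_MODES.items():
    w, h = internal_resolution(OUTPUT, scale)
    ratio = (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{mode:12s} renders {w}x{h} ({ratio:.0%} of output pixels)")
```

In Performance mode the GPU is shading only a quarter of the output pixels, which is why per-frame memory pressure looks much more like 1080p than 4K.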

Even when playing 4K games, it's difficult to fully utilize more than 16GB of VRAM.

Even with an RTX 3080 12GB, I very rarely hit the VRAM limit when playing at 4K. A few games came close, but actually filling the full 12GB almost never happened.

That means with 16GB of VRAM, most users will find it very difficult to use it all — even when playing 4K.



If you want to invest beyond your needs just to be safe, that's fine. But realistically, more than 16GB of VRAM won't deliver performance proportional to its price. Tuning your settings, keeping your software in shape, and optimizing the rest of your system will do far more for your gaming experience than an overly expensive GPU.

The money you'd spend on a high-end GPU could probably be invested in more useful things: more RAM, an SSD upgrade, or a better CPU — although the CPU doesn't really need to be as 'powerful' as many people think.

More than 16GB of VRAM is only truly worthwhile for AI.

The only area where a large amount of VRAM is always beneficial is AI. If you're running large language models (LLMs) or other heavy AI workloads on your PC, the more VRAM the better. Even 32GB is sometimes insufficient, and a multi-GPU system becomes necessary for smooth performance.
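A quick back-of-envelope calculation shows why AI is the exception. Holding a model's weights alone takes roughly parameters times bytes per parameter (activations and the KV cache add more on top); the model sizes and precisions below are illustrative:

```python
# Back-of-envelope VRAM estimate for loading an LLM's weights alone.
# Activations and KV cache add more on top of this floor.

def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    """GiB needed just to hold the model weights."""
    return params_billion * 1e9 * bytes_per_param / 2**30


for params in (7, 13, 70):
    for precision, nbytes in [("fp16", 2), ("int4", 0.5)]:
        print(f"{params}B @ {precision}: "
              f"{weights_gib(params, nbytes):.1f} GiB of weights")
```

A 7B model in fp16 already brushes up against a 12GB card, and a 70B model in fp16 overwhelms even 32GB — while 4-bit quantization is what makes smaller models fit on modest cards at all.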

But even with a 12GB card, I could still run AI models, just more slowly. If your needs are only occasional, basic AI runs, 12–16GB is still sufficient.

In short: you can buy a GPU with a large amount of VRAM if you really like it, but choosing an RTX 5090 32GB just to play 1440p games is… a waste. A large amount of VRAM doesn't automatically make the gaming experience 'better' if you don't use it.

Marvin Fry
Update 24 January 2026