AMD could have a cunning plan to cut the cost of its future graphics cards
Or at least ensure that they don’t become even more expensive...
AMD’s graphics cards could be very different in the future, switching to use a multi-chip module (MCM) design, at least according to a freshly spotted patent.
Notebookcheck.net highlighted the discovery of the patent by hardware leaker @davideneco25320 on Twitter, and it makes for an interesting read, offering a potential glimpse of how AMD could shift its GPU design to keep a lid on spiralling graphics card prices, and better compete with Nvidia (and, for that matter, Intel).
The broad idea, in simple terms, is to use MCM or multiple chips (‘chiplets’) on one board – as AMD already does with its Ryzen processors – as opposed to the current monolithic (single chip) design.
The move to MCM could confer a number of benefits as graphics cards become more and more powerful and their designs become harder to engineer and manufacture, chiefly by ensuring better yields and thereby helping to keep costs down. As we’ve seen in recent times, GPUs – certainly the more powerful ones – have already become eye-wateringly expensive.
However, there are serious issues with making the change to an MCM model given the way graphics cards work, and the AMD patent outlines how to tackle these thorny problems.
New way forward
The main stumbling block with an MCM design is that games are programmed to work with a single GPU, so this new way of doing things – effectively using multiple GPUs on a single board – is problematic in that respect. It’s also tricky to implement parallel workloads across multiple chiplets while keeping memory contents in sync between them.
AMD’s solution in the patent is to hook up these GPU chiplets via a high-bandwidth passive crosslink, with one chiplet acting as the primary GPU and connecting directly to the CPU – meaning the processor (and OS) would see the graphics card as a single (monolithic) entity when it comes to coding software or games for it.
Furthermore, to try and tackle the aforementioned memory content issues, each GPU chiplet would have its own last-level cache, and these would be connected in a way to ensure coherency across all the chiplets.
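To make the arrangement described above a little more concrete, here’s a toy sketch of that topology in Python. It is purely illustrative – the class names, the round-robin work split, and the write-broadcast coherency scheme are all assumptions made here for clarity, not details from AMD’s patent.

```python
# Illustrative-only model of the patent's described layout: several GPU
# chiplets behind a crosslink, one "primary" chiplet facing the CPU, and
# a per-chiplet last-level cache kept coherent across all of them.

class Chiplet:
    """One GPU chiplet with its own last-level cache (a hypothetical model)."""
    def __init__(self, cid):
        self.cid = cid
        self.last_level_cache = {}  # address -> value

class ChipletGPU:
    """Chiplets joined by a passive crosslink; chiplet 0 is the primary
    and is the only one the CPU/OS talks to directly, so the whole card
    appears as a single GPU to software."""
    def __init__(self, n_chiplets):
        self.chiplets = [Chiplet(i) for i in range(n_chiplets)]
        self.primary = self.chiplets[0]

    def submit(self, workload):
        # All work enters via the primary chiplet, which fans it out
        # across the crosslink (round-robin split, chosen here for
        # simplicity - the patent doesn't specify a scheduling policy).
        n = len(self.chiplets)
        chunks = [workload[i::n] for i in range(n)]
        return chunks

    def write(self, address, value):
        # Coherency sketch: a write becomes visible in every chiplet's
        # last-level cache, so all chiplets see the same memory contents.
        for c in self.chiplets:
            c.last_level_cache[address] = value

gpu = ChipletGPU(4)
chunks = gpu.submit(list(range(8)))   # -> [[0, 4], [1, 5], [2, 6], [3, 7]]
gpu.write(0x1000, 42)                 # now coherent across all four caches
```

The point of the sketch is simply that the fan-out and the cache coherency live inside the device: the caller only ever interacts with one object, just as the CPU and OS would only ever see one GPU.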
When might this new design actually happen? It’s feasible that AMD could be looking to MCM technology for its next-gen RDNA 3 graphics cards, but that may be optimistic; something further down the line – perhaps RDNA 4 – is a more likely prospect.
This is all guesswork at this point, of course, and we can’t read too much into a single patent anyway. These kinds of design concepts are often exploratory or experimental in nature, after all.
But it does show the direction AMD intends to travel in, or is at least seriously considering, while casting a light on potential solutions to the major drawbacks of traditional monolithic designs. As we head further into the future, these kinds of graphics cards could become increasingly difficult to manufacture while keeping yields at a palatable level (in other words, keeping costs down).
AMD isn’t the only firm thinking this way, as you might expect: Nvidia is rumored to be exploring MCM for its Hopper graphics cards, as is Intel with Xe HP Arctic Sound. Remember that Intel is expected to compete with Nvidia and AMD in the heavyweight gaming arena this year with the launch of its Xe-HPG card.
Darren is a freelancer writing news and features for TechRadar (and occasionally T3) across a broad range of computing topics including CPUs, GPUs, various other hardware, VPNs, antivirus and more. He has written about tech for the best part of three decades, and writes books in his spare time (his debut novel - 'I Know What You Did Last Supper' - was published by Hachette UK in 2013).