30 December 2024

Looking back at 2024 and predictions for 2025...


It's that time of year again, when we take stock of the past and look towards the future. Last year was fairly busy for me: travelling the world and a huge increase in workload resulted in less time to spend on this blog. Unfortunately, 2025 is looking just as busy in some ways, given I am now tasked with increased responsibilities at work. So, that will drain more of my focus from the first of January onwards... But let's make hay whilst we can!


2024 Recap...


Last year, I achieved around a 40-50% accuracy rate. I was a bit disappointed with that, so I spent a lot more time thinking about the industry and where things were going - and I think that (aside from being lucky) that was time well-spent.

Let's see how I did! First off, let's take a trip to GPU land:

  • The client (desktop) RTX 50 series from Nvidia will not release in 2024.
  • Intel will not launch Battlemage desktop GPUs this year. 
  • No new Radeon cards will launch. 
I had a version of this post partially written in the early days of November and, up until that point, these were 100% correct on all fronts. I was worried for a bit there with the RX 7600 GRE leaks, but we almost managed to scrape through to the end of the year. Then Intel decided to finally deliver on something from their GPU division, so that middle prediction was WRONG...


In summary: two correct, one wrong.


  • If Zen 5 desktop launches this year, it will launch with an X3D part in the line-up.

Wrong! I really thought this was how the launch was going to play out - and we all saw how much the Ryzen 9000 launch needed the X3D parts to actually be good (or at least better)... However, after the lacklustre launch of the non-X3D parts, AMD launched the 9800X3D with a much smaller gap between releases than in any generation thus far! So, this prediction will probably be good for the next generation of AMD processors.


  • If Zen 5 launches, no new motherboard generation will launch. Prices will not drop on current line-ups.

Correct! The 800 series motherboards are almost entirely rebadged 700 series - the same chipsets, just with USB4 tacked on (in some cases).


  • PC ports/releases will continue to get better from the current low in terms of quality. 2023 was an outlier.

I would say that, although you could argue both ways, I was incorrect on this one. Too many AAA games released with major issues (Silent Hill, Dragon's Dogma 2, [arguably] Stalker 2... the list goes on!).


  • Microsoft will charge a nominal fee for Windows 10 security updates for client (consumer) systems; maybe ~$25 per year.
I was right (and closer than I ever thought I would be!): $30 for the first year for consumers... Years two and three? Currently not offered, but we'll get to that in a bit.


Overall score: 57% correct! But if Intel hadn't come along and ruined the party, it would have been 71%... Booo! One more reason to dislike Intel.



Predictions for 2025...


I haven't had as much time to ponder the possibilities for next year, but I'll try my best. What's hurting my percentages is the small number of predictions - with fewer than ten of them, a single right or wrong answer shifts the final score by more than 10 percentage points. I should try and increase the number of predictions to get a better idea of my true prognostication powers, but let's see how imaginative I can be...


  • The RTX 5060 will be the first Nvidia GPU to feature 24 Gb (3 GB) modules.
Look, I'm pretty sure Nvidia and AMD have gotten the message that 8 GB GPUs are not going to cut it at the low end of the range (I'm not speaking about entry-level). However, we have to be realistic - what's been holding them back is the price of components and the fact that memory makers had not, until recently, started commercial production of the necessary next-generation GDDR.

That's finally happening, with SK-Hynix beginning mass production of both 16 and 24 Gb modules as of this last quarter. Samsung is a bit behind but is also bringing those 24 Gb modules to the fray, while Micron appears to be operating on the same timeframe with similar capacities, though targeting low-power (e.g. laptop) applications with their design*.
*Or, at least, that's what I can tell from the press releases.
With what I'm seeing from the various news reports and press releases, it looks like we'll have those 24 Gb modules somewhere around Q2 2025, albeit at slower speeds - which would line up well with a potential RTX 5060 release, and leads nicely into the third prediction, below.

However, there is another reason why I believe that the RTX 5060 will be the first - and perhaps the only - product in the Nvidia stack to feature these 24 Gb modules: the typical memory bus "widths" used in the various GPU performance/price tiers.

The memory bus "width" determines both the die size and capacity of VRAM... via Videocardz.



If we look at the rumoured specifications (helpfully compiled by Videocardz.com), we see that the memory bus size (width) reduces at each tier of GPU in the stack - as it also has historically. The reason for this is two-fold: the cost of the GPU silicon, and the cost of the memory modules and of applying those modules to the circuit board that makes up the base of the graphics card.

Memory controllers are relatively large structures on the GPU die, so the ideal situation is that the fewer of them that need to be included, the better. This is because the size of the die is proportional to the cost of each individual die (i.e. the more dies that can be produced from a single silicon wafer, the cheaper each individual die is, as the wafer cost remains constant!). Additionally, smaller dies have an increased probability of being either defect-free or free of a defect which renders them unsalvageable in some way (hence the process of binning).
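To make that die-cost point concrete, here's a rough back-of-the-envelope sketch in Python. It assumes a simple Poisson yield model and entirely illustrative numbers for wafer cost and defect density (the real figures are closely-guarded secrets):

import math

# Illustrative assumptions - not real foundry figures!
WAFER_COST = 12000.0   # cost per 300 mm wafer, in dollars
WAFER_DIAMETER_MM = 300
DEFECTS_PER_CM2 = 0.1  # assumed defect density

def dies_per_wafer(die_area_mm2: float) -> float:
    """Approximate gross dies per wafer, with a correction for
    partial dies lost around the wafer edge."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return wafer_area / die_area_mm2 - edge_loss

def yield_rate(die_area_mm2: float) -> float:
    """Poisson yield model: the chance a die has zero defects."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)

for die_area in (150, 300, 600):  # small, mid-range and large dies (mm^2)
    good_dies = dies_per_wafer(die_area) * yield_rate(die_area)
    print(f"{die_area} mm^2: ~{good_dies:.0f} good dies, "
          f"~${WAFER_COST / good_dies:.0f} per good die")

The takeaway: a die with four times the area costs considerably more than four times as much per good die, because the yield hit compounds the per-die cost - which is why shaving off memory controllers is so attractive.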

On the other hand, routing the traces for the memory modules and layering the pads to attach them add expense to the design and production of the graphics card PCB. The cost of each individual memory module, and even the assembly-line process of applying the modules, adds yet more to the final BOM and COGS of a graphics card.

Therefore, the ideal GPU is one which is very small but very performant and has very little memory attached to it!

Unfortunately for the product managers at all three GPU manufacturers (and Jensen's next leather jacket purchase), games require data to be available for the GPU to operate on, which means that they need VRAM. They also need a lot of parallel processing power in order to calculate all the required operations very quickly, which requires lots of transistors. So, these things mean a bigger GPU is "better" and a certain amount of VRAM is required for consistent and high performance.

That quantity varies based on the games/applications that are running on the GPU, but the long and short of it is that modern, graphically-demanding gaming titles require around 10-12 GB of VRAM even at a resolution of 1080p. And, as I pointed out last time, 1440p is quickly becoming the standard resolution for modern gamers, so this "rule" definitely applies there, too...

The big point here is that if you don't have enough VRAM, it's bad. If you do? It doesn't matter how much more you have.

This means that for all of the cards above the RTX 5060 - the card which will most likely have a 128 bit memory bus - the GPU manufacturers will have enough VRAM for the majority of modern games at 1080p and 1440p released over the coming 3-4 years when using 16 Gb (2 GB) modules. Since each module occupies 32 bits of the bus, anything above a 128 bit bus gives you a minimum of five modules - 10 GB of VRAM - to work with, and that's enough... Bringing us to the second prediction:

  • The RTX 5060 will be the only RTX 50 series GPU that utilises the 24 Gb modules.

After all, the vast majority of games are unlikely to require more than 12 GB VRAM at recommended settings in the coming years...

Moving on to the third prediction:

  • The desktop RTX 5060 will either release with 12 GB VRAM or have a variant with the 12 GB configuration.
Those 24 Gb modules I mentioned above will enable a graphics card with a 128 bit bus to host a 12 GB framebuffer instead of the historic 8 GB we've been lamenting for around the last four years. This is MUCH needed at the low end of each vendor's product stack, and especially on the cheaper mid-range models that Nvidia likes to put out (e.g. the RTX 4060 Ti) which, when fitted with a larger framebuffer (aka VRAM), have been proven to perform better and output smoother, more consistent frametimes.
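As a quick sanity check on that arithmetic, here's a tiny sketch mapping bus widths to capacities for both module sizes (the one assumption baked in is the standard 32 bit interface per GDDR module):

# Each GDDR module occupies a 32 bit slice of the memory bus,
# so capacity = (bus width / 32) * module size.
MODULE_SIZES_GB = {"16 Gb": 2, "24 Gb": 3}

for bus_width in (128, 160, 192, 256, 384, 512):
    modules = bus_width // 32
    options = ", ".join(f"{modules * size} GB ({name})"
                        for name, size in MODULE_SIZES_GB.items())
    print(f"{bus_width:3d} bit bus / {modules} modules: {options}")

A 128 bit card jumps from 8 GB to 12 GB with the new modules, while everything on a 160 bit bus or wider already clears the 10 GB floor with the cheaper 16 Gb parts - which is exactly why I expect the 24 Gb modules to appear at only this one tier.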


  • The RTX 50 series and RX 9000 series will be unimpressive in terms of price-to-performance compared with the current generation. We will not get large performance gains at each price point - with the exception of the RTX 5090.

This is less of a prediction and more of a continuation of the trend for the last two generations - we're not getting better performance at the same price point anywhere below the top-end cards. Nvidia have the opportunity to make the RTX 50 series something special... but I'm 80% sure that they won't. They won't because they don't have to. Crypto- sorry, I meant AI is still selling GPUs and all of their most important wafer allocation is heading towards that endeavour. AMD and Intel are not competing at the high-end, so Nvidia can do what they please... and what they please is to not have to focus on the consumer gaming market any more than they need to. 

They will save that amazing performance uplift for when they really need it (i.e. if either AMD or Intel are able to compete, Nvidia has a whole 20-30% of extra performance uplift per non-top card tier to pull out of the hat).


The RX 9070 XT (rumoured top RDNA4 card) has the above leaked results. (FPS data from TechPowerUp)

  • RDNA4 will still not be an impressive uplift in terms of ray tracing ability and will not be as performant as the RTX 4080.

Here's the thing: the recent performance leaks for the RX 9070 XT are not really making sense to me, but people are going crazy for them. Let's get this out of the way: ray tracing performance is a function of the raster performance of any given GPU... If your GPU is able to perform RT with 85% efficiency, you will get approximately 85% of the fps of the rasterisation performance of the GPU.
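To put that claim in code form (with a made-up raster figure - these are not the leaked scores), the mental model I'm using looks something like this:

def estimated_rt_fps(raster_fps: float, rt_efficiency: float) -> float:
    """Project ray tracing fps as a fixed fraction of raster fps.
    rt_efficiency is the architecture's RT-to-raster ratio, e.g. the
    Port Royal score divided by the Timespy score for the same card."""
    return raster_fps * rt_efficiency

# A hypothetical card rendering 120 fps in pure raster:
for arch, efficiency in (("RDNA 3-like", 0.52), ("Ada-like", 0.62)):
    print(f"{arch}: ~{estimated_rt_fps(120, efficiency):.0f} fps with RT enabled")

The 0.52 and 0.62 ratios are the synthetic benchmark ratios I discuss below; the point is that RT throughput rides on top of raster throughput, so an architecture's RT "efficiency" shows up as a roughly constant scaling factor.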

Looking at the table above, the leaked performance has a ratio of 0.64 between the Port Royal score and the Timespy score. That's higher than the ratio of the Ada Lovelace-based RTX 40 series cards, which hover around 0.62, with the exception of the RTX 4070, which is (in my opinion) power-starved. That would be an impressive uplift given the RX 7000 series hovers around a ratio of 0.52... Maybe not out of the realm of possibility, though.

However, if we then look at actual real-world fps averages at 1080p and 1440p, we're looking at a ratio of 0.70 for the RTX 40 series and 0.56 for the RX 7000 series - Nvidia gets a bigger performance uplift in many of these game titles in the real world.

What's most concerning to me is the comparison between the RX 7900 XT and RTX 4070 Ti in the leaks. If an RX 7800 XT is able to almost match the raster performance of an RX 6900 XT with 20 fewer CUs, match its performance in the synthetic Port Royal test, and generally win out in real-world testing, how would a 64 CU part (the rumoured spec of the RX 9070 XT) win over the RX 7900 XT so handily?

Then we can move on to the RTX 4070 Super: a card which manages the same score as the RX 6900 XT in the Timespy synthetic raster test, but which handily beats the RX 7900 GRE in real-world gaming at 1080p and matches it at 1440p, despite the GRE having a surplus of 2000 points over it in that synthetic!

The point I'm trying to make here is that Timespy and Port Royal scores do not map onto the same real-world GPU performance hierarchy... So, these leaks don't mean anything when it comes to how the RX 9070 XT will perform on the track, so to speak.


  • RDNA4 will not be impressive from a price to performance perspective.
This is the same prediction as above for Nvidia, but aimed at AMD's line-up. The problem we have is that AMD seemingly can't think for themselves and have, effectively, no agency in this market. They want to do only what is necessary to continue developing their technology - keeping themselves afloat and in the race - but not to actually improve and compete. It's a fool's errand to expect them to offer something compelling, and a card like the RX 7800 XT almost seems like an accident.

There is also another reason, which will be discussed in the next prediction:

  • RDNA4 will initially only release as a mid-range product - no low-end GPU will be present for the first half of 2025. The replacement for the RX 7600/XT may not even see a release until 2026, but this is dependent on whether or not Nvidia launches the RTX 5060 during the year.

There's a simple reason for this prediction: there's just too damn much inventory on the market! Seriously, while the RTX 4070 is basically out of stock (despite the non-GDDR6X variants releasing), everything else below the RTX 4070 Super is still on the market, hogging that space. The increased prices on the RTX 4070 Ti and Ti Super seem to indicate that these are also end-of-life (rather than in high demand), but comparing this situation with AMD's shows that all of their cards are going for below MSRP and are readily available at all performance tiers.

What is AMD going to do? Launch products which will destabilise the majority of their current line-up and thus devalue the inventory they have on the market? I don't believe even AMD are that stupid...

Despite generally high prices, most GPUs have actually decreased from their release prices...


So, that leaves us with the obvious - AMD need to release something to show they are keeping up with Nvidia, while Nvidia is likely to only release the top cards at first. AMD's top-end RDNA4 card will likely only have RX 7900 XT performance; therefore, devaluing only one or two cards makes more sense than devaluing the whole stack.


  • The RDNA 4 top card will release at around €649/$600.
As I mentioned above, AMD can't undermine their swathes of inventory on the market and wouldn't want to. The RX 7900 XT and GRE are "salvaged" products - they're the dregs of the RX 7900 XTX - and destroying their market won't hurt either the lower-end parts or the aforementioned RX 7900 XTX's position in the stack.

This means that AMD can't price the part too low, which rules out a $500 price point. Plus, Nvidia (and thus AMD) did not deliver improvements in price-to-performance at the launch of the RTX 40 series and are unlikely to do so for the RTX 50 series. Therefore, it's likely that AMD will follow the same strategy.

This will allow both companies to continue to sell their (in reality) poor-value SKUs at the lower end of the stack. Meanwhile, AMD can cut the price on the RX 7900 XTX a little to help with sales, but not cannibalise the profit too much...

No one wins.


  • SteamOS will make a return for DIY desktop gaming PCs... (Linux through the back door).

All the leaks surrounding Steam-certified devices recently have me pining for the potential of SteamOS from back in the early 2010s. I'd love for that to become a reality in 2025, so I'm putting it on my wishlist, despite there being no real evidence for it at this juncture!



Aaaaand, that's it! I could have come up with more, but time just was not on my side! I've been writing this article on and off for over a month now but, with work, a promotion, family, and holidays, I just don't have a lot of spare time to sit, think, and produce #content.

I wish you all a very happy Christmas and pleasant New Year!
