4 January 2023

Looking back at 2022 and predictions for 2023...


I'm not going to lie, 2022 kicked my ass in terms of work. I just didn't have the energy or time to properly dedicate to this blog, even though I had strong opinions on many events that occurred during the year - I just wasn't able to put my thoughts down onto paper (so to speak). Additionally, I didn't play too many games, instead I dedicated a lot of my free time to doing some hardware testing, in order to increase my understanding of that hardware and the ways it can affect gaming experiences in the mid-range.

Unfortunately, that testing is way more time consuming than just doing analysis or quick opinion pieces, but I do feel that I have improved the way I am able to analyse data outputs from game testing - and this is something that I can apply going forward, now that I have worked out the methodology to a greater extent.

In addition to this, the majority of the big hardware releases have happened this past year and there really isn't that much for me to be excited about for 2023, so my predictions may be a little weak for this coming year... 

But, the show must go on, so...


2022 Recap...


Last year I made the following predictions. Let's see how they turned out!

  • Intel's Arc will be underwhelming in price, though not in performance. Will not appreciably impact availability of discrete graphics cards for consumers.

Arc was expensive, across the board. The cards did not appreciably impact the availability of dGPUs for consumers at all. However, Arc has also been underwhelming in performance. Yes, sure, it does seem like they are improving the situation with regards to driver optimisation, but that doesn't mean the initial launch was representative of this...

Sure, in the end Arc might end up being okay in terms of performance... but I can't claim that as being an accurate prediction.

Verdict: 2/3 Correct.


  • The i5-12400K/F CPUs will be more expensive than prior X400K/F level CPUs. The bargain prices of the 10400K/F and 11400K/F really are too good to be true.

This prediction is a little complicated because, yes - at release - those parts were only marginally more expensive than the prior generations. However, over time, with Intel's revenue woes and the stuff that's going on with inflation, these parts are now in the ballpark of where I was originally envisaging them. 

These bulk-sale products really affect Intel's bottom line - I believe more so than their i7 and i9 products... and, in fact, you can see that the i7 and i9 parts are more competitively priced relative to their AMD counterparts.

Verdict: Wrong... but eventually right.


  • AMD's v-cache will not make an appearance on the 6-core CPU.

There's very little to say about this other than many people were predicting 12- or 16-core variants, and a few were predicting 6-cores, too. I just didn't see this as a realistic outcome.

Verdict: Correct.


  • I don't expect a clock speed bump for Zen 3D SKUs.

I was correct on this (in fact, there was a slight clockspeed regression), though the reason for it is not clear to this day...

The official word from AMD is that it is a voltage limit, not a heat limit (as I had suspected). However, increasing clocks at a given voltage increases heat output, and increasing voltage at the same clocks results in even more heat generated. So, to my mind, the two reasons are still intertwined and not separate, regardless of what AMD says.

Their "95 C as standard" operating temperature for their Ryzen 7000 series CPUs does not address my original issue with this logic - stacking silicon on top of active silicon will necessarily worsen thermals on any chip, and I can see this being an issue going forward with any Ryzen 7000 product...

Verdict: Correct.


  • I predict that DirectStorage will be much ado about nothing in 2022.

There's not much to say about this, other than I was right. No game released with this API/feature. In fact, the game that was touted to be the first to market with it enabled is delayed until the 24th January, so we will see how things pan out at the end of this month. Also, I'm not the only one who is critical of bringing this technology to the PC space...

Verdict: Correct.


  • The Radeon RX 7950 XT (or whatever the full-die SKU is called [RX7950 XT?]) will not be 3x or even 2x the RX 6900 XT.

Typos in my prediction notwithstanding, I was correct in this one. Despite analysing the "leaks" from famous leakers which tipped my own expectation towards 2x the performance of the RX 6900 XT, the end reality is more like 1.4x - 1.5x. The RX 7000 cards are disappointing to me for a few reasons: 
  • The reference designs have some issues with heat, with large differentials between the average GPU temperature and the hotspots on the package.
  • All the talk of double-pumping the compute units in the new design with twice the number of shaders has not resulted in much of a performance uplift at all. The increased number of CUs gives a 1.2x theoretical performance uplift and the increase in game clock should give a 1.13x performance increase, for a total of a 1.35x increase over the RX 6900 XT... which is what is observed on average at a resolution of 1440p. Sure, at 4K, the uplift is closer to 1.5x... but that's a far cry from at least double... and that means that doubling the shader cores yields only around 15% improvement... absolutely atrocious!
  • Power efficiency is absolutely terrible and power scaling is also terrible - and handled terribly.
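The uplift arithmetic above compounds multiplicatively. A quick sketch of the calculation, using only the figures quoted in this post (the 4K number is the observed average mentioned above):

```python
# Expected RX 7900 XTX uplift over the RX 6900 XT, from the figures above.
cu_uplift = 1.2      # theoretical uplift from the increased CU count
clock_uplift = 1.13  # uplift from the higher game clock

expected = cu_uplift * clock_uplift
print(f"Expected combined uplift: {expected:.2f}x")  # -> 1.36x

# Observed average uplift at 4K, per the post; whatever sits beyond
# 'expected' is what the doubled shaders actually contributed.
observed_4k = 1.5
extra = observed_4k - expected
print(f"Left over for the doubled shaders: {extra:.2f}x")  # -> 0.14x, i.e. ~15 percentage points
```

Read as percentage points of the total uplift, that leftover is the roughly 15% improvement attributed to the doubled shader cores above.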

Verdict: Correct, though I wish it weren't!

  • Radeon 7000 and Geforce 40 series will both be announced at the tail-end of the year. Nvidia will announce first. However, only Nvidia will have a proper product launch in 2022 for this series. AMD's will be a paper launch, with real availability in Jan 2023.

This was a REALLY close one. AMD almost didn't make it and, given the delay to the release of their partner cards, if AMD hadn't had decent supply of their reference designs, this would have been a good call. However, they did (just!), so I was wrong.

Verdict: Wrong - but SO close!


  • GPU availability won't appreciably improve in 2022.

This was just plain wrong. GPU availability improved a lot during the middle of the year before the RTX 30 and RX 6000 series cards dried up in late October/November, and availability of the new cards has been quite good... the problem being that people aren't really buying them due to their super-high prices and, thus, they are remaining in stock... with the exception of the RTX 4090 and RX 7900 XTX.

Verdict: Our survey says: "Nuh-UH!"





Our Summary says: a 6:4 ratio of right to wrong...

That's better than 50:50!!

I BELIEVE that's an improvement over previous years...

Predictions for 2023...


I've been trying to follow along with every crazy thing that has been happening but it's been SO crazy that it's difficult. But let's put these out there:


  • This graphics card generation is a lost generation. There will be ZERO cards that consumers or reviewers consider actually good value... (Look, even the 4090 is not good value!)
Look, the RX 7900 XTX, XT, RTX 4080, RTX 4070 Ti and - really, if we're honest - the RTX 4090 are all poor value for their performance at MSRP - and we KNOW we are not getting them at their MSRP.

As a result, I doubt that any cards further down the stack will be considered good value for money either. It just doesn't seem possible... the top-tier cards have set such a poor precedent, and both AMD and Nvidia seem so keen to distance themselves from the prior generation cards, that I just cannot see any well-priced cards this generation.

I should have been prepared for this because I was pretty much predicting this entire situation:

"All of a sudden, we're no longer getting the performance of the last generation's top-tier card for half the price, we're getting it for around 60% or two-thirds of the price... and that logic continues to scale down. A 7500 XT goes from 6500 XT's €200 to €350 for approximately an RX 5700 XT's performance... maintaining the performance per unit currency for that tier - which we also observed in the RX 6000 series prices too (though with less VRAM)."
And, really, let's not kid ourselves - the 3090 Ti was priced ridiculously, because they could... we should not be taking its "MSRP" into account when comparing the following generation of cards. The RTX 3080 or 3080 Ti are more credible options when thinking about the pricing structure... and is the RTX 4070 Ti half the price of the 3080 Ti for equivalent performance?

No, it's around 60%. Never mind the fact that we moved "equivalent performance" up by a whole half tier!

Good job, Duoae... *sighs*
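That half-price test can be sketched with launch MSRPs. A minimal check, assuming US launch prices of $1,199 for the RTX 3080 Ti and $799 for the RTX 4070 Ti (regional and street prices will differ, which is why the in-text figure lands nearer 60%):

```python
# Generational price check: is the new card half the old card's price
# for the same performance tier?
old_msrp = 1199.0  # RTX 3080 Ti US launch MSRP (assumed)
new_msrp = 799.0   # RTX 4070 Ti US launch MSRP (assumed)

ratio = new_msrp / old_msrp
print(f"New card costs {ratio:.0%} of the old MSRP")  # -> 67%

# The historical "same performance for half the price" bar:
print("Half-price generation?", ratio <= 0.5)  # -> False
```

Around two-thirds of the old price, rather than the half we used to get - exactly the pattern the quoted prediction describes.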


  • Nvidia Video Super Resolution will be a BIG thing...
I seriously do not know why this hasn't been done before. Maybe it was held back for a time when Nvidia would need it? This is tech that should have been available on day 1 of DLSS availability and, if not then, on day 1 of FSR 2.0 availability. In fact, AMD should have promoted it first, using FSR 2.0.

At any rate - this has the potential to be a huge deal in more ways than just streaming video... "free" upscaling of video played through the Chrome browser has the potential to give the masses upscaling technology for their old video files... I am really looking forward to this.


  • There will be no "pro" consoles for either Playstation or Xbox this year... Xbox Series S will continue to be a thorn in developers' sides...
Seriously, whoever thought of the Series S should be commended... and simultaneously committed because it was a terribly brilliantly terrible idea. Sure, for consumers, it's good (in terms of the initial purchase price) but it's bad for both consumers and developers in virtually every other aspect.

At any rate, from my perspective, Microsoft have abandoned their phone model and are sticking to console generations, like brother Sony intoned at the beginning of the generation...


  • DirectStorage will be a flop... again.

I'm sorry - I just can't stop. I have NOT seen a single demonstration or fact that shows that DirectStorage will improve gaming or streaming of assets. There are other tools on the PC table that are not being touched, for some unknown reason.

Can developers require $700+ graphics cards? Sure! $200 PCIe gen 4/5 NVMe drives?! YES! But, oh no! You can't ask for $120 worth of RAM!

What sort of world do we live in?!

Seriously, the benefits of DS have not been proven - demos have shown benefits on SATA devices (which shouldn't be possible), on HDDs (which really shouldn't be possible!) and, on NVMe devices, only insubstantial performance benefits (i.e. sub-two-second improvements). What's the big deal?!

Oh! But they can drain our heavily taxed GPU resources! (Because everyone has an RTX 4090 idling at 40% utilisation during gameplay!)

Yeah, no... I'm sorry - I just don't get it. Yes, reviewers will RAVE about it. They will coo from the treetops about this feature, all the while ignoring the fact that it has minimal benefits (like they've already been doing).


  • 32 GB of RAM will become standard for the recommended specifications of new AAA PC games...
This one is a long shot. I literally have no idea why developers are refusing to use this resource or request it from gamers, instead of more expensive and finite hardware items like GPUs and advanced NVMe drives which require state-of-the-art motherboards and CPUs to work properly...

However, I have to believe that some AAA developers will begin waking up and finally requiring 32 GB of system RAM instead of 16 GB... it makes so much sense on SO many levels!


Conclusion...


And that, as they say, is a wrap - I just don't have many predictions this year. I look forward to the year ahead in a gaming sense because so many anticipated titles were shifted into 2023. However, in reality, most of the interesting hardware is already out and there are not very many items expected to be delivered beyond the first weeks of January.

Thanks for reading and hope you had a nice Christmas and New Year!
