12 November 2021

The Full Nerd 2021 hardware awards...

This may be the only way to actually have a PC in a few years' time...

Yesterday, I watched the latest episode of the excellent Full Nerd Show, presented by PC World staff. They covered their favourite tech of the year in the categories of CPU, GPU, PC case, "Accessory", Laptop, PC game, Best innovation, and Worst Trend. I highly recommend watching this episode in particular as there's a lively debate over the CPU and GPU picks, but I also recommend subscribing to their channel in general as The Full Nerd is always of high quality.

However, I did have some thoughts about those two hotly debated categories and on what my own picks would be for the CPU and GPU of the year...

I think it goes without saying that the last two years have been pretty amazing in terms of products released... but pretty terrible in terms of availability. That was mentioned time and time again throughout the show, and I've written about it plenty of times. However, I feel there's only so much weight that global situation should carry in choosing each category's winner, and the standards applied to the CPU and GPU picks didn't seem consistent to me... but that's a personal thing. I'm not saying their picks are wrong, just that they're not what I would have gone for.

Personally, I fall more towards Gordon Ung's point of view - an award should go to "the best of something". It could be the best innovation within the category, it could be the best price/performance, it could just be the best performance, it could be for how industry-changing the product is... but it should be the best at something.

For the category of best CPU in 2021, there was a general consensus on the i5-12600K and, despite having previously said that its pricing was not that competitive or "aggressive", I cannot disagree with them: the 12600K is a great CPU and, for me - in terms of performance, number of cores at a given price point, and relatively low gaming power consumption* - it wins out over everything else released this year. Plus, it comes with an integrated GPU!
*From Gordon's numbers, I count the power consumption at between 85 and 90% of the Ryzen 5 5600X's for very intensive all-core tasks, on a per-task basis. This means that, despite pulling a higher peak wattage, the 12600K finishes the same work in less time, thus consuming less energy overall. Which is pretty impressive.
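To make that per-task framing concrete, here's a minimal sketch of the arithmetic in Python - the wattages and runtimes below are made up purely for illustration and are not Gordon's measured figures:

def energy_per_task(avg_watts: float, seconds: float) -> float:
    """Energy used for one run of a fixed workload, in joules (watts x seconds)."""
    return avg_watts * seconds

# Hypothetical all-core workload: the chip that draws more power but
# finishes sooner can still use less energy per task.
e_12600k = energy_per_task(avg_watts=125.0, seconds=600.0)  # 75,000 J
e_5600x = energy_per_task(avg_watts=88.0, seconds=950.0)    # 83,600 J

print(f"12600K energy per task: {e_12600k / e_5600x:.0%} of the 5600X's")  # ~90%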
I would definitely pick that part over the 12900K and 12700K - both of which are overkill in every aspect for gaming.


The winners in each category. I only really have opinions on three of them...

However, regarding the show hosts' GPU pick, I could not disagree more.

First of all, let's take a look at the product against the criteria that I, personally, hold the winner of each category to. Is this product:
  • Innovative or providing innovation in the market?
  • The best price/performance?
  • The best performance?
  • Industry-changing?
For me, the Ryzen 5 5600G fails all of these tests. 


Innovative?

It's not an innovative part - it uses the same Zen 3 cores at a higher CPU base clock but, overall, the package boosts lower because the 65 W TDP also has to cover the iGPU. It also has half the L3 cache - 16 MB vs 32 MB - which we know can be important for gaming if and when a dedicated GPU is installed as an upgrade.

The iGPU also uses the same (well, slightly better optimised) Vega compute units that were present in the 2200G/2400G and 3200G/3400G parts. This is not a new technology!

Performance?

Yes, you can get the iGPU boosting to 1900* MHz compared to the 1400* MHz of the 3400G, but what does that get you?
*Corrected values - I originally confused these with actual performance numbers from a review.
A whole lot of nothing... iGPU gaming performance is essentially identical to the 3400G. We're talking around an extra 5-10 fps - though I will admit that gains like that at sub-100 fps are actually useful, compared to +20 fps at >200 fps!
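One way to see why gains at low framerates matter more is to compare frame times rather than raw fps. A quick sketch in Python, with example framerates loosely matching the ranges above:

def frame_time_ms(fps: float) -> float:
    """Average time to render one frame, in milliseconds."""
    return 1000.0 / fps

# A +10 fps bump at the low end vs a +20 fps bump at the high end:
low_gain = frame_time_ms(90) - frame_time_ms(100)    # ~1.11 ms saved per frame
high_gain = frame_time_ms(200) - frame_time_ms(220)  # ~0.45 ms saved per frame

print(f"90 -> 100 fps: {low_gain:.2f} ms faster per frame")
print(f"200 -> 220 fps: {high_gain:.2f} ms faster per frame")

The smaller fps bump at the low end actually buys more than twice the frame-time improvement.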

What we can see in some games (e.g. CS:GO and Fortnite) is the effect of the increase in L3 cache over the 3400G, which had only 4 MB. If you play those games, then the 5600G is a much better option on an iGPU than prior APUs from AMD.

Price/Performance?

The 5600G fails on this aspect as well. If we take the extreme situation of saying the 5600G performs up to ~10 fps better than the 3400G, we see a performance uplift of somewhere between 3 and 23%, depending on the game. However, the 3400G launched at $149 in July 2019... the 5600G launched at $259 in August 2021. That's a maximum of 23% improvement in gaming performance for 74% extra cost...
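As a sanity check on that, here's the arithmetic in Python (launch MSRPs as above; the 23% uplift is the generous best case):

launch_price_3400g = 149  # USD, July 2019
launch_price_5600g = 259  # USD, August 2021

price_increase = launch_price_5600g / launch_price_3400g - 1
print(f"Extra cost: {price_increase:.0%}")  # ~74%

best_case_uplift = 0.23  # best-case gaming uplift from the iGPU numbers above
perf_per_dollar = (1 + best_case_uplift) / (1 + price_increase) - 1
print(f"Change in performance per dollar: {perf_per_dollar:.0%}")  # ~-29%

Even granting the best-case uplift, performance per dollar drops by almost 30%.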

That's still pretty terrible value.

Industry-changing?

AMD APUs that outperform Intel's integrated GPU solutions have been around since 2018. There's nothing new about this in really any aspect - nothing new was brought to market, options weren't opened up for consumers, etc. etc.


Hey, I paid for this underlying stock image, I'm going to beat its use "to death"... just like AMD with Vega integrated graphics... *ba dum pish*


I think that, based on these criteria, I can safely say that I would never have picked the R5 5600G for the best GPU release this year. So what would I have picked?

I think that the best release this year was AMD's CDNA 2-based MI250/MI250X.

Yes, okay, maybe it's a bit unfair because this announcement was only made on Nov 8th and The Full Nerd aired on Nov 10th. Yes, this is not a PC gaming part... but that was not specified as a requirement in the rules the team communicated.

The reason I'm picking these parts is that they are very forward-looking. They're the first multi-die GPUs*, their construction incorporates new packaging technologies, and they are managed in a new way so that the OS only sees a single GPU to interact with. This release heralds a new age of GPUs and I'm quite excited for the possibilities these bring to the table.
*As opposed to older dual-GPU cards where two dies were placed on the same board and linked in an SLI configuration - the OS sees those as two separate GPUs, whereas this multi-die GPU presents itself as a single GPU, transparently to the OS.
Yes, I disagree with Gordon's choice of the RTX 3080 Ti - it's power-hungry, it's inefficient, it's a small evolution of a design from last year... and it's overpriced. An RTX 3080 is (in theory) $700; for 10-15% more performance, the RTX 3090 is $1500... the 3090 was always a stupid buy and, by extension, the $1200 3080 Ti is also stupid. The value is not there, and GDDR6X is a massive failure in terms of gain per cost (aside from the skyrocketing cost of GDDR6 because of its ubiquity and because nobody bought up a massive stockpile of it last year).

And that's my thoughts on that.

If you have any thoughts on what your picks for CPU or GPU would be, put them below in the comments and we can discuss them.
