8 December 2020

Next Gen game PC hardware requirements... (Part 3 - the cost of gaming)


So, I've led my merry dance, trying to analyse the past and extrapolate into the future. Maybe I've helped you and provided information that's useful, maybe I haven't. However, there is one last thing I want to explore: the cost of gaming over the last 10 years. I think that, in this time of huge demand and low/no supply, that's an interesting topic...

Time marches on...


I've had the feeling for a long time that PC gaming is becoming more expensive but never had the data to back it up until this year, when I went on a data-driven analysis of what it took to game on the recommended hardware for a large number of games over the last ten years. Contrary to this point of view, it seems like many people believe it's getting cheaper to game on PC. So... having done all the legwork to get the data on average required performance per year for selections of AAA games over the last ten-year period, I figured that the next step is a pretty small one: look at what equipment (CPU/GPU) would be necessary to achieve that yearly average performance level and what price it launched at, to see how things trend.

While it's difficult to discover historical motherboard prices at release, RAM prices have stayed pretty consistent between generations for capacities that are "normal" for their time. E.g. $48.99* for DDR4 2x8 GB in 2016, $85.95 for DDR3 2x2 GB in 2010, $83.99 for DDR2 1x1 GB in 2006, $89.99 for DDR 1x256 MB in 2001. In 2020 dollars those are $53.15*, $103, $108 and $132 respectively.

Therefore, I think it's safe to say that the main differences in cost for building a PC (especially budget-aimed systems like those we're going to be talking about here) are the processor and graphics card. Case, drives, and accessories, on the other hand, are all personal choices and can't be factored into this assessment.
*Although the price is cheap, I don't know how reliable that compiled list is because actual RAM prices I've observed over the last year are nowhere near as nice as they are in this list - maybe NewEgg is just very cheap? I can't currently browse it since they stopped EU IP addresses from accessing the site, even in just a window shopping mode, and I can't be bothered to sign up for a VPN. Actual prices I see locally, in other EU countries, and on Amazon put the price of 2x 8 GB DDR4 3200 MHz at around $93 - 106 (£70 - 80)...
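As an aside, here's a minimal Python sketch of the inflation adjustment I'm applying to those RAM prices (and to the launch prices later on); the cumulative multipliers to 2020 are rough assumptions chosen for illustration, not official CPI figures:

```python
# Minimal sketch of the inflation adjustment used for the RAM prices above.
# The multipliers are approximate cumulative US inflation factors to 2020,
# assumed for illustration rather than taken from official CPI tables.

INFLATION_TO_2020 = {
    2001: 1.47,
    2006: 1.29,
    2010: 1.20,
    2016: 1.085,
}

ram_prices = {
    2001: ("DDR 1x256 MB", 89.99),
    2006: ("DDR2 1x1 GB", 83.99),
    2010: ("DDR3 2x2 GB", 85.95),
    2016: ("DDR4 2x8 GB", 48.99),
}

for year, (kit, price) in sorted(ram_prices.items()):
    adjusted = price * INFLATION_TO_2020[year]
    print(f"{year}: {kit} at ${price:.2f} -> ~${adjusted:.0f} in 2020 dollars")
```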

Originally, I had planned to do a very simplistic look at launch prices and a singular piece of hardware for each category (CPU/GPU) but then I decided to split out the two main vendors and also compensate for inflation across the whole period.

The performance targets for each year - you can actually see the effect of the release of the PS4 Pro and One X as well as the release of the new consoles here in 2020...

Before we begin, let's remind ourselves that these "targets" are based on recommended specs for games running at 1080p and (more than likely) 30 fps. These are widely considered "baseline" specs for running a game. I'm not saying that I enjoy running games with these settings, but these are what the developers themselves say is an acceptable experience for their audience.

Following on from this, I'm basing each piece of hardware from each vendor for each year on the cheapest solution for buying that performance level at that point in time. i.e. For 2013, I looked at the prices of the CPUs released that year and picked the solution which was cheapest to purchase at that point in time whilst also bearing in mind the mode number of cores/threads that were required each year (see the graph below). This means that while I will pick a 2-core CPU that meets the above targets when the majority of titles were requiring 4 physical cores, I will highlight it as being undesirable because that choice might not have led to a good user experience*.
*Poor frame times, stuttering, etc. I'll write about this in an entry on this blog soon...

I can't account for discounts or sales. I mean, anyone can find a second-hand i7-4770K three years after release that beats a Ryzen 3 3200G for a similar price, but there's no way to quantify that... so this list purely looks at prices at launch combined with a standardised performance level (i.e. a synthetic benchmark).
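To make that selection rule a bit more concrete, here's a minimal sketch of the yearly filter in Python; the class, field names and example figures are hypothetical placeholders rather than my actual dataset:

```python
# Sketch of the per-year pick: the cheapest launch-price CPU that meets the
# year's benchmark target, flagged if it falls short of the mode core count.
# The example entries and scores are illustrative placeholders only.

from dataclasses import dataclass

@dataclass
class CPU:
    name: str
    vendor: str
    launch_price: float  # USD at launch
    benchmark: int       # standardised synthetic score
    cores: int

def pick_cheapest(cpus, target_score, mode_cores):
    """Return the cheapest CPU meeting the score target and whether it also
    meets the mode number of cores required by that year's games."""
    qualifying = [c for c in cpus if c.benchmark >= target_score]
    if not qualifying:
        return None, False
    choice = min(qualifying, key=lambda c: c.launch_price)
    return choice, choice.cores >= mode_cores

# Hypothetical 2013-style example:
cpus_2013 = [
    CPU("Example i3", "Intel", 130, 4200, 2),
    CPU("Example i5", "Intel", 200, 6500, 4),
]
choice, meets_core_mode = pick_cheapest(cpus_2013, target_score=4000, mode_cores=4)
print(choice.name, "(meets core mode)" if meets_core_mode else "(flagged: below core mode)")
```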


These are the most commonly required number of cores and threads per given year until 2020... and then extrapolated beyond for the next five years.


So, with that short explanation out of the way, let's begin.


Cores! Cores all the way down...


Compiling this list was relatively quick, on the surface of things. However, looking at the results, there were a couple of difficult years where multiple CPUs could have been purchased that would save you money over the long term. The one thing to note here is that, as we've progressed through the years, CPU performance has outpaced the "average" required by games, meaning that, in general, low-end CPUs are able to meet our yearly performance targets.

Intel had some years where their CPUs met the performance targets, by and large, but not the mode number of cores (marked in red)... but I guess that's another discussion for another post.

There are two exceptions to this rule, in 2016-2017, when the mid-gen refresh consoles were launched and there were either no i3-class CPUs released or those that were released missed the performance requirements by a wide margin. Notably, this is correlated with the shift to increasing core counts per performance level instigated by the release of AMD's Ryzen line of processors.

Where Intel had a rough couple of years between 2012 and 2015, AMD's Bulldozer mistake caused lasting damage to the performance of their lineup, which propagated through into the first-gen Ryzen series. What's incredibly shocking* is that a Phenom II X4 980, launched at $185, was still almost meeting the recommended performance requirements up until 2017, and the A10-7890K was effectively the top-end consumer-oriented Kaveri SKU ever released, meaning that AMD basically made almost no performance improvement in their CPUs over the course of four years!
*Though not to me because I was still using a Phenom II X4 955 BE in 2020 for gaming...


AMD really struggled to meet the performance targets until second generation Ryzen...

It's my opinion that AMD's winning of the console contracts for both Microsoft and SONY saved them in more ways than just monetarily. AMD effectively managed to freeze the performance required to play a game at a level where their CPUs on the desktop could keep up. In fact, there was only one year on the list where AMD couldn't meet the single-core performance target, effectively allowing their users to game throughout the last ten-year period. The deal to supply the Xbox One and PlayStation 4 also created the environment where Intel sat on its hands and raked in money without innovating and increasing performance. This gave AMD a chance not only to tread water all those years but also to be in a position to catch up.

In fact, if there's one takeaway message from this compilation of information, it's that buying a CPU at the wrong time, or with the wrong performance, will end up costing you longevity and money. Going back to the example of the 980: if you had purchased it in 2011, it would have lasted you until 2016, whereas the previously listed FX chips wouldn't really have worked for gaming from 2016 onwards.


Pixel counting...


While the CPU targets weren't that taxing over the last ten years, increasing by 358% and 945% respectively for single- and multi-core performance scores, GPU requirements have gone through the roof! At an astounding 2971%, it is clear that the consoles' relatively weak graphics hardware has not been holding developers back at all. The GPU makers, on the other hand...
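To put those cumulative percentages into yearly terms, here's a quick sketch that converts them into rough compound annual growth rates, assuming a ten-year span (the totals are just the figures quoted above):

```python
# Convert the cumulative requirement growth quoted above into a rough
# compound annual growth rate, assuming a ten-year span (2010 -> 2020).

def annual_growth(total_increase_pct, years=10):
    factor = 1 + total_increase_pct / 100
    return (factor ** (1 / years) - 1) * 100

for label, pct in [("CPU single-core", 358), ("CPU multi-core", 945), ("GPU", 2971)]:
    print(f"{label}: +{pct}% total -> ~{annual_growth(pct):.0f}% per year")
```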

Nvidia had no issue meeting the yearly targets... but didn't release any GPUs for three years!

One thing that really stood out to me during this review was that the launch cadence of GPUs has gone from a big performance increase roughly every six months to multiple years between releases, with performance increases no longer guaranteed.

It seems apparent that the huge competition for performance in the 1990s, 2000s and early 2010s exerted an extremely strong evolutionary pressure on the designers of GPUs. ATI and Nvidia effectively managed to emerge victorious from this environment, with many extinct architectures lining the road to this point in time. However, they have effectively reached the zenith of the current respective "DNA" of their platforms and both companies are trying to find ways to make the next evolutionary step.

Surprisingly, despite AMD not really having a large portion of the market, they've done a remarkably good job of keeping up with Nvidia and on a much lower budget...

Nvidia made the first step on their journey with the release of Turing (RTX 20 series) and AMD did it with Navi (RX 5000 series). Either way, neither company is able to churn out new improvements with the ease they managed in the first half of the last decade, and both are still early on in their experiments towards their new forms. So, if you had bought a high-end Nvidia graphics card in 2016 (a GTX 1080 for $599, $649 in 2020 money), you would have made a good investment because it's likely to last you another year or two in terms of raster performance.

In comparison, buying a mid-range card over the last 5 years would have resulted in you either spending more money or turning down graphics settings. I'm beginning to suspect that anything less than the high-end cards is a poor investment (as long as you can afford them), and this has certainly shaped my decision-making process now that I've upgraded this year. If you do go for the lower-priced cards (without buying used), it seems safer to go really low. That way, the multiple incremental upgrades you need to do in order to keep up actually make sense.

The $250-400 mid-range cards just don't have the longevity and feature set of the higher-end cards or the disposable performance of the lower-end ones. Of course, that does bring us to the question of what price each "range" of cards sits at... it's clearly shifted over the years and what was, in 2016, a $200-250 mid-range segment is now a $300-400 segment.


I'm a trender, I trend my life away...


Plotting those inflation-adjusted prices against each other, it's as clear as day that graphics card prices have been steadily increasing year on year. Neither AMD nor Nvidia is competing on cost - it seems PC consumers (and specifically gamers) do not care about "value", only performance numbers. The cost of the recommended graphics performance per year has increased by 2.5x - 2.9x over the last ten years, with AMD actually decreasing their performance/price ratio much more quickly than Nvidia.

I knew that graphics card prices were skyrocketing but had thought it was a relatively recent thing since the RTX 20 series; now it seems apparent that it's always been a general trend. What also appears to be the case is that AMD, not Nvidia, seems to be the company driving prices higher. That looked contradictory to the way I was previously thinking about the market but it becomes clearer when looking at the data in aggregate.

I knew GPUs were getting more expensive, I just didn't realise the driving force behind the trend was AMD...

AMD are the ones that are pushing for higher price points at the affordable end of the spectrum, and this is likely driving higher costs at the high end because, otherwise, the whole lineup of cards from both vendors becomes squashed together, price-wise. Now, I don't know whether this is because AMD were not releasing competitive cards in the high-end (and so were not making back the invested R&D money from those lower-end, lower-priced cards alone) or whether it's because AMD have been looking to change their market perception from being a "budget" company to being a premium brand like Intel and Nvidia. Either way, the effect is the same - it's clear that AMD have a steeper gradient of increase.

I never thought I'd say this, but it looks like Intel is keeping AMD honest...

The same trend can also be seen for CPUs: once again, despite the years where AMD were cheaper (because their components were not competitive), the chips that met the recommended performance requirements for gaming each year became more expensive, on average, than Intel's. However, we're not necessarily seeing the same trend across the CPU market because, unlike Nvidia, Intel didn't increase the performance of their products in parallel, meaning that they instead chose to drop prices in order to retain market share and compete with AMD.

Actually, what I think we can see from the graph and from the CPU list I compiled above is that, while many industry commentators and analysts have been saying that Intel didn't take AMD seriously and hasn't been responding, Intel began addressing the AMD threat in 2018. We can see that in both lowered prices and improved performance/no. of cores per CPU tier across their lineup, and it's only since Intel made that move that AMD have followed suit.


Conclusions...


Summing this all up, let's look at each combination of CPU/GPU for each year to see how much a "computer" would cost to make in 2020 dollars:

And finally, the whole point of this post... (Highlighted cells are the cheapest option)

Discounting the period when AMD was a bargain-bin supplier of CPUs, it has actually become quite expensive to build a PC with only AMD parts and it has not been the cheapest option since 2013 when targeting these performance levels. It seems you can get the best "bang for your buck" by mixing vendors, and that mostly comes down to the problem I mentioned last time - AMD aren't currently able to segment their graphics cards very well.
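For reference, this is roughly how the "cheapest option" in the table is picked each year: take that year's qualifying CPU and GPU from each vendor (in 2020 dollars) and allow the vendors to be mixed. The prices in the example below are placeholders, not values from my table:

```python
# Sketch of the per-year "cheapest system": sum every CPU/GPU vendor pairing
# (inflation-adjusted launch prices) and take the minimum. Example prices
# below are placeholders, not the actual figures from the table.

def cheapest_system(cpu_prices, gpu_prices):
    """cpu_prices/gpu_prices map vendor -> 2020-dollar launch price."""
    combos = {
        (cpu_vendor, gpu_vendor): cpu_cost + gpu_cost
        for cpu_vendor, cpu_cost in cpu_prices.items()
        for gpu_vendor, gpu_cost in gpu_prices.items()
    }
    best = min(combos, key=combos.get)
    return best, combos[best]

# Hypothetical year:
cpus = {"Intel": 182, "AMD": 200}
gpus = {"Nvidia": 280, "AMD": 310}
(cpu_vendor, gpu_vendor), total = cheapest_system(cpus, gpus)
print(f"Cheapest mix: {cpu_vendor} CPU + {gpu_vendor} GPU = ${total}")
```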

Looking at the data, it's really clear that the main driver of the increased cost of gaming is the GPU - the fierce competition between AMD and Intel has kept relevant CPU prices below $200 for the last ten years, keeping the increase to around 11-25%. In comparison, the graphics cards I've assembled here have increased in price by 151-194% in order to meet the yearly performance required to play the latest games.


Final Thoughts...


There are many commentators, tech tubers and analysts that promote a practice of upgrading only when you need to. Normally, I'd agree with them. However, I think that for a truly price-conscious individual, that is not the optimal strategy - upgrading at the correct time/year is something that should be considered, especially when taking PC gaming into account.

The best choices over the last ten years were to buy a Phenom II processor in 2010/2011 or an i5-2500K in 2011. They may have been a little more pricey but their longevity has been amazing. Similarly, I'm of the mind that the Ryzen 3700X and i5-10600K will be a similar story for 2019/2020. We're already at the point where the Ryzen 2000 series chips are almost out of date, and the high-end CPU baseline established by the consoles means that level of performance should be what anyone buying a CPU for gaming in the here and now is looking at, if not a 5600X...

For the GPU side of the equation, buying at the high end seems to be the best option for saving money on new products in the long term. This is especially true now that we expect to wait two to three years between graphics card generations, while game requirements keep increasing and we're left without a generational increase in hardware performance in response.

Also, with the advent of resolution scaling, image reconstruction techniques and other technologies designed to intelligently overcome the sheer number of pixels required to fill high-resolution, high-refresh-rate displays, current high-end cards will potentially have much better longevity than their lower-end and mid-range brethren. In fact, those supporting technologies tend to work better on higher-end hardware anyway and, if a game doesn't support any of them, you can fall back onto the pure performance of the card itself.
