Via Twitter...
The release of the required PC hardware specifications for Alan Wake 2, along with the revelation that RX 5000 series and GTX 10 series cards would not be supported*, caused quite a stir in various online fora. I, myself, have not been overly happy with them, but perhaps not for the typical reasons that proponents of advancing technology(!!) would like to paint. At the same time, despite the hyperbole on both sides of the equation, I think there is room for reasoned discourse on the topic, and platforms like Twitter and Reddit don't tend to promote or facilitate that.
So, here goes...
*They can run the game, just not within their normal relative performance envelope compared to other cards, because they do not support DX12 Ultimate mesh shaders.
The times, they are a'changing...
Believe it or not, I'm generally quite a proponent of people upgrading their PC hardware. I want new games to push the technical boundaries and adopt new technologies (where I feel it makes sense!). In fact, I have a spiritual sibling article for this piece that I never published (see below).
I do, however, believe in balance, and I feel that the proponents of "games pushing the technological limits" are missing or ignoring some very valid points - despite some of them actually stating those points very plainly in their arguments while apparently missing their logical conclusion... I wish to address those and other points in this blog post.
A post that I may finally get around to publishing soon...
The Past...
Proponents of the higher technical specifications of games like Alan Wake 2 like to spout that 'in the past we had to walk both ways uphill to school and back, in the snow - with no shoes!'
Yes, there is certainly an element of truth to this - in the 90s we had to deal with a lot of tinkering just to get games working and, very often, games might not even work on your specific combination of hardware (either properly, or at all!). In addition to this, we had to upgrade our hardware - practically yearly - to keep up with the frenzied output of game developers who were pushing the envelope on the (back then) very small PC gaming landscape. That is, of course, if we wanted to play the new games. Very often, we did without.
In the 2000s, we had to upgrade our hardware less often but, as I can certainly attest, it was still fairly frequent. You can see my personal cards from the year 2000 onward in the table below. Although many commentators state that the slowdown in game spec requirements happened in the 2010s, I actually found that the X1950 Pro (bought in 2006) lasted me through to 2010 - at which point it was beginning to become unusable on new releases. So, for me at least, the slowdown occurred even before 2010...
So, I have lived and gamed through all of that period. Is this an argument that we should move back to those expectations? I don't think so, but then I don't know what sort of point is being made because what happened in the past is irrelevant to what is happening today for multiple reasons...
The list of shame... or pride? Oh, okay - the latter cards don't count because I'm not buying them to use like I did in the past...
First up is that market conditions are completely different. The X1950 Pro was circa $200 - $250 on release, and that price point or below applies to almost every one of those cards up to the GTX 1060, with the notable exception of the Radeon 9800 Pro. It's also important to note that the release cadence for graphics cards was insanely fast! The venerable Anand was complaining about the 6 month cadence back in the early 2000s.
Of course, we know that the release cadence slowed to once every year over the next decade, then to approximately every 2 years by the late 2010s and, now, it's inching towards every 2.5 years for the lower-end cards in the stack.
On top of this, the expense of cards has just gone up and up - a $100 card is now $300, a $200 card is now $350 - and their performance relative to the top-end card in the stack has also decreased, so the performance increase per generation is shrinking (as I showed here).
Anand and other reviewers were critical of the GPU manufacturers, taking them to task if they didn't provide enough performance throughout the stack each generation... (AnandTech)
In this context, time doesn't mean the same thing as it did back then. Sure, some people say that in the past tech got outdated super fast and was unable to play new games... Well, guess what? Your new hardware delivered 200% more performance than your old one because of the rate of improvement between GPU generations. Nowadays, at the low end, we're not getting those improvements! (And the price is still slowly increasing.)
So, between the fact that the gap between graphics card generations is getting longer and the fact that the performance increase at the low to mid-range is getting smaller, the only way to improve performance is to move up the GPU stack - to much more expensive cards.
In this light, I don't buy the "suck it up" and "you are the problem" mentality of some of these commentators - 7 years in the 2000s is very different to 7 years in the 2020s. We're talking two, yes, TWO, GPU generations versus around five to six (at a minimum).
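To put some rough numbers on that claim, here's a back-of-envelope sketch only - the cadence figures are my own approximations drawn from the release gaps discussed above, not exact dates:

```cpp
#include <cstdio>

// How many full GPU generations fit inside a given ownership window
// at a given release cadence? Illustrative approximations only.
static int generationsInWindow(double windowYears, double cadenceYears)
{
    return static_cast<int>(windowYears / cadenceYears);
}

int main()
{
    const double window = 7.0; // years of card ownership

    // ~12-18 month cadence of the 2000s vs the ~2.5 year cadence of today
    std::printf("2000s (~1.2 yr cadence): %d generations\n",
                generationsInWindow(window, 1.2)); // -> 5
    std::printf("2020s (~2.5 yr cadence): %d generations\n",
                generationsInWindow(window, 2.5)); // -> 2
    return 0;
}
```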
I bought an x800xt in 2004 and in 2007 it could not play any UE3 game/almost all ps360 titles on PC (dx 9.0b). Just 3 years later. As Doc has said, the long xbox360 and - to a degree - the ps4 gen broke some pc gamer's minds about PC part longevity due to the consoles low perf. https://t.co/os7PL5tb0j
— Alexander Battaglia (@Dachsjaeger) October 24, 2023
Perspective on resolve...
There's also another aspect* to this difference in time period: resolution.
*This was unintentional...
Mainstream gaming monitors did not really advance at all during the period of 1995 - 2008/2010. 99.99% of gamers were playing on 1024x768 CRTs and, later, 768p and 900p LCD screens. HDTVs were not appropriate for playing PC games on (for various reasons). CRT monitors are less sensitive to variable frame presentation and to odd frame rates (i.e. you can play at whatever output your card can handle and you don't need to implement a frame limit), and the LCD monitors of the latter part of that period topped out at 60 Hz, meaning that graphics cards didn't have to push all that hard to achieve comfortably playable experiences.
Just looking back at the reviewsphere, GPU reviews didn't reliably start testing 1080p as a resolution until around 2010 - 2012, and Steve Walton (of Hardware Unboxed fame) was testing in weird 16:10 resolutions (though I couldn't tell if this was on a CRT or an LCD).
Similarly, during the 2010s, 1080p60Hz was the stagnant resolution to target and gamers have benefitted from that stagnation just as they did in the 2000s.
Nowadays, in 2023, 1080p60Hz is a low-end display that, ideally, we should be moving away from as a target. In fact, with prices beginning to fall, many gamers buying new monitors are getting 1440p and 4K monitors and TVs at 60/120 Hz - often with variable refresh rate technology. Graphics cards should be targeting these resolutions going forward, with 1440p as a minimum and 4K moving into the crosshairs relatively soon.
These are not trivial requirements, and most low- and mid-range graphics cards in both manufacturers' current lineups are not up to the task, especially with the recent demands coming from game developers!
Yes, this is a GPU manufacturer problem - in the sense that they are not getting the advancements in process node that they used to get, or in architectural low-hanging fruit, but also in the sense that they are trying to cut costs and provide the minimum viable product to consumers, which means smaller dies and less VRAM and narrower memory buses for more money.
That's not a good combination...
So, again, I'm coming back to the point that time is not the same - saying that a 7-year-old GPU is no longer a viable product is not as meaningful as it was in the past, now that older high-end products can still keep up with or outperform the lower end of the current generation's graphics cards... and this is, of course, ignoring the fact that the RX 5000 series is only 4 years old.
Pushing the boundaries...
People are also pointing to prior Remedy titles which moved the bar higher by implementing cutting-edge technical solutions, such as DLSS and ray tracing in Control, but this comparison rings hollow when we look at the actual context:
This isn't a case of moving the quality bar up by implementing graphical or technical options at the high end; this is lifting the floor by implementing a feature without any fall-back option for consumers with graphics cards which should be able to run the game at low settings.
In contrast, moving back to that example of Control, both DLSS and ray tracing had those fall-back rendering options that consumers could (and did) avail themselves of in order to experience the game.
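For a sense of what such a fall-back looks like in practice, here's a minimal sketch (assuming a D3D12 renderer; this is not Remedy's code, and the pipeline-creation functions are hypothetical) of how an engine can query mesh shader support at runtime and route older GPUs down a traditional vertex-shader path:

```cpp
#include <windows.h>
#include <d3d12.h>

// Illustrative sketch: query DX12 Ultimate mesh shader support and pick a
// geometry pipeline. Cards without Tier 1 support (e.g. RX 5000 / GTX 10
// series) would take the traditional path.
static bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
    {
        return false; // older runtime/driver: treat as unsupported
    }
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}

void BuildGeometryPipelines(ID3D12Device* device)
{
    if (SupportsMeshShaders(device))
    {
        // CreateMeshShaderPipelines(device);   // hypothetical: meshlet path
    }
    else
    {
        // CreateVertexShaderPipelines(device); // hypothetical: fallback path
    }
}
```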
Even to this day, only a handful of titles are exclusively ray-traced. Almost all games have a traditional raster pipeline that they fall back on. Now, I understand that development resources are limited and that time and money are not infinite - this isn't a case of me trying to rake Remedy over the coals, here.
What I am trying to counter is the, quite frankly, ridiculous and dismissive notion that gamers are the problem because they can't play the games they would like to, or that older hardware shouldn't be supported - especially given that hardware is advancing at a much slower pace now.
Concluding remarks...
Given the huge costs of high-end hardware, what's going to happen in 5-7 years? Are we saying that a $1500 card should be dropped because some new feature came out, despite it outperforming the new low-end cards in most other titles? What about a $500 - $700 card bought every 3-4 years?
Is time really a good metric to be using to justify support (or not) of a product? Or is it just a convenient excuse to dismiss legitimate concerns?
Personally, although I can understand why Remedy has chosen not to optimise for cards that don't support the mesh shader tech (and, let's be honest, this issue really doesn't affect me), I find the fact that they are able to target the Xbox Series S, with its paltry GPU and memory combination, but not properly support an RX 5700 XT or a GTX 1080 Ti, a little hard to swallow...
I firmly believe that we should be supporting hardware as much as possible for the reasons above as well as many more.