I don't have much time for technical analysis over this coming period, though I have a few ideas to explore in the coming months. So, I thought I'd do a bit of a round-up of the thoughts that, more often than not, end up in YouTube video comments and Twitter discussions - especially when I don't see these points made anywhere else...
GPU news
The RX 7600, RTX 4060 and RTX 4060 Ti have been released and there are a few takeaways from these releases.
- Enthusiast consumers are still really angry about issues related to prices and availability over the last couple of years.
- RDNA 3 really isn't much of an improvement over RDNA 2 per resource unit.
- RTX 4060 and RTX 4060 Ti are actually impressive cards just at the wrong price point.
- If RDNA 3 had been competitive, this graphics card generation could have been an amazing one... probably the best in history (no exaggeration!).
The Gnashing of Teeth...
Jayz2cents' pulled review of the RTX 4060 Ti* and the general semi-faux** jumping on the hate bandwagon for clicks/views that I've seen on YouTube have highlighted to me how much the last couple of years have damaged the gaming and PC technology scene on the consumer side - let alone the industry side.
The way I see it, there is residual aggression and frustration over being unable to get parts, combined with the same emotions over being unable to afford them... both during the crypto craze and now with the current generation of cards being hyper-inflated in price (for whatever constellation of reasons).
This effect is exacerbated by the increasing duration between graphics card generations, which is only getting longer as time progresses, resulting in an unending cycle of unfulfilled consumer desire and a sharp slow-down in the generational performance increases available at the low end of the market.
The point here is that virtually no one had their nose out of joint when Nvidia released the RTX 20 series, because consumers had bought into the GTX 900 and 10 series of cards wholesale (as well as AMD's very serviceable RX 400 and 500 series cards). However, by 2020 - and now, worse, three years later - we are in a situation where consumers need, or feel the need, to upgrade to improve performance in modern games and find themselves unable to do so for both of the reasons outlined above.
*Because it wasn't negative and Phil dared to have a positive opinion about the product...
**Oh, I'm sure they think the prices are too high, but the hyperbolic language of a good number of techtubers is really playing to the current leanings and expectations of the audience - it both avoids confrontation with that audience and drives engagement from people wanting to validate their outrage...
Plus, I feel like the push-back against new technologies like ray tracing is tied heavily into this inability to buy capable hardware. Although I see some people claiming that they turn off all those features on their RTX 4090s because they only get 100 fps or somesuch... I just can't see the majority of gamers taking that stance!
I feel that all those polls of people saying that they don't care about RT in games are purely a reflection of the fact that they don't, for the most part, have hardware in their gaming rigs that can do it. If they did, I would bet a lot of money that the poll percentages would be strongly inverted. That's also a reason why, over the last year, you've seen outlets such as HUB move in favour of RT when they were previously pretty dead set against it - based purely on what was available in the then-current games (which, IMO, as a tech enthusiast is the wrong way to look at things, but I digress...).
The RX 7600 is actually decent value for money, despite not offering much of a performance improvement over last generation's offerings - especially with its continually (and officially) falling price!
The Double Helix...
RDNA has been a very successful architecture for AMD - more so, I would say, than GCN was: it's flexible, very scalable and relatively power efficient. Plus, the improvements they've made to the front end, and even to the registers and the cache hierarchy, have shown similar scalability, too.
On the other hand, despite architectural changes ranging from power- and cost-saving features* to a widening of DCU throughput**, RDNA has not really delivered on a lot of its promises. Most improvements appear to have come through process node optimisation - allowing higher frequencies and lowering the voltages required to achieve those frequencies.
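To illustrate why node-driven frequency and voltage changes dominate, here's a minimal sketch using the classic CMOS dynamic power approximation (P ≈ C·V²·f). The operating points are purely illustrative assumptions of my own, not measured values for any particular GPU:

```python
# Rough CMOS dynamic power model: P is proportional to C * V^2 * f.
# All values below are made-up, illustrative numbers - not real GPU data.

def dynamic_power(capacitance: float, voltage: float, frequency_ghz: float) -> float:
    """Relative dynamic power for a given switched capacitance, voltage and clock."""
    return capacitance * voltage ** 2 * frequency_ghz

# Hypothetical "old node" vs "new node" operating points:
old_node = dynamic_power(1.0, 1.05, 2.40)  # baseline clock at a higher voltage
new_node = dynamic_power(1.0, 1.00, 2.65)  # ~10% higher clock at a slightly lower voltage

print(f"Clock increase:  {2.65 / 2.40 - 1:.0%}")
print(f"Relative power:  {new_node / old_node:.2f}x")
```

In other words, a modest voltage reduction from a better process can pay for a meaningful clock bump at roughly the same power - which is broadly the kind of gain being described here.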
Last time, I made the point that the monolithic RX 7600 isn't that different from the monolithic RX 6650 XT and, given that we know RDNA 2 has no improvements over RDNA 1 beyond data management and core frequency, this basically highlights that there's been no per-unit performance increase across three generations of RDNA architectures.
Let's take a look at what they have in common:
- 2048 shader units
- 128 TMUs
- 64 ROPs
- L0 = 32 KB per WGP
- L1 = 128 KB per array
- L2 = 2 MB
- L3 = 32 MB
- PCIe 4.0 x8 lanes
- 128 bit bus / 8 GB VRAM
- 1x PCIe power 8-pin required
And let's see what's different:
- RX 7600 is 86% of the die size of the RX 6650 XT
- RX 7600 has 18 Gbps GDDR6 and RX 6650 XT has 17.5 Gbps GDDR6
- RX 7600 has 13,300 million transistors and RX 6650 XT has 11,060 million
- RX 7600 uses TSMC 6 nm node and RX 6650 XT uses 7 nm
- RX 7600 has a boost clock of 2655 MHz and RX 6650 XT has 2635 MHz
- RX 7600 has a TDP of 165 W and RX 6650 XT has 176 W
The problem here is that this is essentially the same product. None of the improvements that differentiate RDNA 3 from RDNA 2 have had any benefit to the performance of the card - the difference between the two comes down to process node improvements and memory speed. In fact, given the overclocking potential of RDNA 2, it's very easy to match the RX 7600's ~7% performance improvement through a small core and memory frequency increase. Yes, you may not save that ~20 - 35 W difference during gaming, but the consumer could have had this performance, for a similar price, for over half a year now. This is not an exciting product and it shows the lack of any real improvement between generations on AMD's part.
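As a rough back-of-the-envelope check, here's a minimal sketch of my own using only the specs listed above (the ~7% figure comes from aggregate benchmarks, not from this arithmetic):

```python
# Theoretical deltas between the RX 7600 and RX 6650 XT, using the listed specs.
# This is not a benchmark - real performance depends on far more than clocks.

def mem_bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: (data rate per pin * bus width) / 8 bits per byte."""
    return gbps_per_pin * bus_width_bits / 8

rx7600_bw = mem_bandwidth_gb_s(18.0, 128)      # ~288 GB/s
rx6650xt_bw = mem_bandwidth_gb_s(17.5, 128)    # ~280 GB/s

clock_delta = 2655 / 2635 - 1                  # boost clock difference
bandwidth_delta = rx7600_bw / rx6650xt_bw - 1  # memory bandwidth difference

print(f"RX 7600 bandwidth:    {rx7600_bw:.0f} GB/s")
print(f"RX 6650 XT bandwidth: {rx6650xt_bw:.0f} GB/s")
print(f"Boost clock delta:    {clock_delta:.1%}")
print(f"Bandwidth delta:      {bandwidth_delta:.1%}")
```

With on-paper deltas of under 1% on clock and under 3% on bandwidth, it's easy to see why a lightly overclocked RX 6650 XT can close most of that ~7% gap.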
*The increase in all caches, but mostly the L3 cache, allowed a reduction in the number of memory controllers required to feed the compute units. The implementation of chiplets also allows other power savings which are not fully realised on RDNA 3 because of an apparent bug in data transfer between the chiplets, VRAM and the GPU die (at least from my reading of the situation surrounding the idle power draw)... Both of these changes also concurrently result in manufacturing cost savings for AMD.
**Which, in theory, should allow developers to implement certain visual effects with greater speed (thus reducing frametimes) but which also requires either intense driver effort or specific effort from the developers of each individual game to implement... This (if my understanding is correct) should have a stronger effect at higher resolutions.
This picture of the 4060 Ti is small because that's how small the die is! *Ba-dum-pish!*...
The Racers...
Nvidia have stumbled in the eyes of enthusiast gamers this generation only because their actions during the last generation were mostly obscured from public view by the surrounding market conditions. Now their strategies are on view for all to see, without any filters - despite what they may want you to believe.
As a result, combined with the general anger in enthusiast circles that I noted above, we are looking at consistently strong criticism at levels I've never before seen in this industry. It's not just gamers; it's the tech press, game devs, AIB partners (most notably with the departure of EVGA from the GPU market!), and even some industry analysts...
Nvidia is lucky that AI has taken off substantially this year, otherwise there wouldn't be much positive press for them to focus on.
Saying all that, the hatred that their products lower in the stack have received is unfortunate in a technical sense because, price aside, they are decent - especially when viewed from an efficiency perspective (which I feel is an increasingly important aspect of PC hardware). Yes, we have some regressions from the last generation of products but, let's be fair, the same situation exists on AMD's side as well!
Unfortunately, Nvidia's choice to use lower-than-usual-tier silicon in their products has resulted in a situation where the consumer is losing out and, thus, their products can generally not be recommended. Even relatively placid commentators like NX Gamer cannot get behind their shenanigans! However, Craft Computing has stirred the waters by pointing out that many reviews and reviewers only assess absolute performance on an academic level, rather than as a practical measure.
This isn't a new argument, and outlets such as Digital Foundry, NX Gamer and myself* have been pivoting to cover the "user experience", which is, IMO, becoming more important than ever in this post-pixel age**.
No particular review method is incorrect (all review types are fine as long as they are internally consistent!) but there is a growing need for "real world" experiential testing, showing the reasons for (and potentially against) buying the new "thing". Academic reviews do not address this need - or at least they are addressing it less and less... though I, myself, value them greatly as I'm a bit of a nerd like that.
*If I can be considered an outlet after more than 15 years!
**The pixel is dead, upscaling is here to stay, hardware advances are slowing and tech cost is inflating beyond what the majority can afford...
Coming back to Nvidia, the RTX 40 series really is the RDNA 2 of their lineup: as I will show in my next article, their improvements this generation really only relate to process node and operating frequency, the same as RDNA 2's were over RDNA 1's. Though, to be fair, RDNA 3 also appears to be in the same boat... so you could argue there's more performance stagnation on AMD's side of the fence...
Starfield is in contention for the most controversial game of all time! However, I believe that Mass Effect 3 still wins out...
Software Gains...
Much has been made about Xbox's "lack" of software titles - in reality, that's just shorthand for "big games". As a result, Starfield has become the idol upon which many have rested their hopes. For a studio that is RENOWNED for its technical glitches and shortcomings, this seems like one of those cosmic jokes one reads about from time to time in science fiction... appropriate.
At any rate, the "30 fps on the Series consoles" issue isn't going to go away. It's a shock, to be sure. Unfortunately, because of the prolonged segue into the current console generation, a lot of games were easily able to offer 60 fps modes and higher. Now, as we move into a period of games releasing that focus only on the current generation of consoles, we will probably find that some developers have bitten off more than they can chew - or more than they expected to chew.
Let's face it: it's not like 60 fps has been an expectation for the last ten years of gaming on console. 30 fps was considered fine from both sides: developer and consumer. So, everyone and their dog has likely been making games with a "flexible" frametime budget in mind over the last 3-5 years for their debut on the new consoles. Unfortunately, in that timeframe, consumers have realised that higher fps is generally a better experience, giving lower input latency and a smoother visual presentation.
Developers cannot spin on a dime - so every game that did not have this consideration in its development, or which cannot be optimised for this aspect of presentation in the time available before release, will be unable to meet this new consumer expectation.
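To put that budget shift in concrete terms, here's a trivial frame-time sketch (my own illustration, not taken from any engine or developer documentation):

```python
# Frame-time budgets for common fps targets. Moving a game built around a
# 30 fps budget to 60 fps halves the time available for *everything* each frame.

def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at a given fps target."""
    return 1000.0 / target_fps

for fps in (30, 40, 60, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame")

saving_needed = frame_budget_ms(30) - frame_budget_ms(60)
print(f"Budget lost going from 30 to 60 fps: {saving_needed:.1f} ms per frame")
```

Finding ~16.7 ms of savings in every single frame late in development is often a redesign rather than an optimisation pass - which is why games planned around that "flexible" budget struggle to add 60 fps modes after the fact.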
However, in comparison with other games, I am not really worried about this release on the Xbox. Starfield is a 10-year game. The performance at release doesn't matter - it will be patched, re-released on new hardware, it will be played, modded and analytically dissected over on the PC platform.
Does the poor performance of Skyrim's release on PS3 or Xbox 360 mean that the game wasn't successful or long-lasting? In a word: no.
Yes, it's disappointing in the here and now but this isn't a flash in the pan (assuming the same trajectory of prior Bethesda games) - it is a marathon upon which player experiences will be built...
Conclusion...
And that pretty much wraps up where things are at, IMO. As I hinted at earlier in the blog post - next time I'll be looking at the performance uplift of the Ada Lovelace architecture over the Ampere lineup.
2 comments:
Hi, thank you for the interesting thoughts!
I just want to react to one of them. It's about RT in games and the reason to turn it off.
I have a 3090, so I can afford to turn RT on in all games - even Overdrive in CP2077, with heavier DLSS helping.
But I always end up with RT off. Why? Because if you don't pause the game and try to find the differences, they are so small!
In games like Quake RTX, there is a big difference. That's because the baked lighting wasn't so good back then. But in modern games it's really hard to tell.
There was also a blind test on several games, where you had to guess which picture was with RT and which was without. I wasn't able to pick the right picture in half of the cases. I simply couldn't tell which was the "better" RT one.
I don't know how many people have a similar opinion. It's highly subjective, of course.
Yeah that's fair, fybyfyby!
Personally, I actually notice and enjoy RT and usually I can see the difference in motion - though not so much in still images. I think there is a difference.
Anyway, I don't mind what everyone chooses to do with their own hardware and games :D. There is no wrong answer!