29 December 2023

Looking back at 2023 and predictions for 2024...


Hap-pey Burf-yay!
 
 
The introduction to last year's post could almost be copy/pasted into this one. Work was even more intense than last year and I know for a fact that 2024 will be tougher still, so my hopes of spending time writing for this blog look set to be dashed... But that doesn't mean I won't try to address things when I feel I can dedicate the time, or perform more comparisons and benchmarks for everyone to digest.
 
2023 was a big year for hardware releases - with most of the current generation of CPUs and GPUs being released at some point. Sure, we're getting some minor refreshes from Nvidia next year but, overall, 2024 looks set to be quite boring.
 
With that in mind, my predictions for this coming year were quite hard to pin down - what is left to predict? 

So let's start where we always do: the review of last year's predictions.

12 November 2023

The Performance Uplift of RDNA 3 over RDNA 2...


Yes, this is an RDNA 2 card; I wasn't able to create a new one with RDNA 3 just yet...

Much was made about the performance uplift (or lack thereof) of the initial RDNA 3 cards released by AMD. Navi 31 (the RX 7900 XTX and XT) failed to meet expectations - seemingly both internally at AMD and externally (for various reasons). That lack of performance also extended to the RX 7900 GRE which, despite lower core clocks, still underperformed relative to where calculations suggested it should land...

Disappointingly, Navi 33 (the RX 7600) performed exactly the same as its equivalent RDNA 2 counterpart, the RX 6650 XT, showing zero performance uplift gen-on-gen in that lower-tier part...

In the meantime, rumours swirled that Navi 32 was going to be 'fixed'. So, what is the truth of the matter? I intend to investigate a little and get to the bottom of the situation like I did with my Ampere vs Ada Lovelace performance uplift analysis...

2 November 2023

Alan Wake 2 Performance Analysis...


Alan Wake 2 is the new hotness in the games industry. Love it or hate it, you cannot deny the impact that it has had on the general conversation with regards to hardware, software, and game development. While I'm not as on-board with the near universal, unfettered praise for the title as most reviewers appear to be, I do find the discussions surrounding the hardware requirements and performance of that hardware near and dear to my heart.

So, without further ado, let's take a look at how current mid-range hardware performs in this game and whether that actually lines up with the hardware requirements that the developers put out just before the game's release...

27 October 2023

In Defence Of: Older Hardware...

Via Twitter...

The release of the required PC hardware specifications for Alan Wake 2, along with the revelation that RX 5000 series and GTX 10 series cards would not be supported*, caused quite a stir in various online fora. I, myself, have not been overly happy with them, but perhaps not for the typical reasons that proponents of advancing technology(!!) would like to paint. At the same time, despite the hyperbole on both sides of the equation, I think there is room for reasoned discourse on the topic, and platforms like Twitter and Reddit don't tend to promote or facilitate that.

So, here goes...
*They can run the game, just not to their normal relative performance envelope compared to other cards due to the fact that they do not support DX12 Ultimate mesh shaders.

14 October 2023

RTX 40 series - aka "Did Nvidia jump the shark?"...

Yes, I *splashed out*...


Now that Nvidia have essentially completed their consumer RTX 40 series cards, it's time to look back at the releases and take stock of what's happened. 

We've seen the now usual range of responses: from cynical and jaded gamers and games media, to acceptance from those who are coming to terms with the price to performance ratio Nvidia is now asking. 

Bundled up in all of this has been, for me, the question of whether Nvidia pushed too far, too fast. Let's take a look...

17 September 2023

The problem with 'dumb' metrics (An argument against using GPU busy)...


With the recent release of Intel's updated PresentMon beta, which includes their new metric, GPU Busy, many people have become excited about the potential for it to have some sort of positive impact on the game review and user diagnostic testing landscapes.

However, if you follow me on Twitter, you may have noticed that I've not been impressed by the execution surrounding this metric - something which I think many people might have missed in the hubbub over something new and meaningful in system performance assessments.

This is the story of how dangerous undefined metrics can be in the wrong hands...

10 September 2023

Starfield Performance Analysis...


Starfield has been a much anticipated title for a number of years now but the hype, counter-hype and social media battles have been raging pretty strongly around this one since it was made an Xbox and PC exclusive. 
Normally, those concerns melt away shortly after launch, when players can actually get their hands on the game and just play. In Starfield's case, this hasn't quite been the experience - many players are struggling to run the game because it can be quite demanding of PC hardware, and debates rage as to whether the problem lies with the developers, the engine, or the hardware manufacturers...

Since I had my savegame destroyed in Baldur's Gate 3 and didn't feel like repeating thirty hours of game time, I instead decided to switch over to Starfield for a change of pace. As a happy coincidence, I have a lot of familiarity with prior Bethesda Game Studios titles and a penchant for testing on various hardware configurations... and it just so happens that I have a new testing rig (mostly) up and running, so Starfield has turned out to be a prime target for the shakedown of this new testing capability...

2 August 2023

What went wrong with RDNA 3...?


Much has been made about the performance, or lack thereof, of the RDNA 3 architecture. Before launch, rumours persisted of 3+ GHz core clocks, and I had tried to make sense of them at the time - to no avail - leading to predictions of around 2x the performance of the RX 6900 XT.

Additionally, people (aka leakers) really misunderstood the doubling of FP32 units in each workgroup processor and so counted more real throughput than actually existed. I think it's safe to say that this decision from AMD was a disaster from a consumer standpoint.

But is that really the reason RDNA 3 has been less performant than expected? Let's take a look!

22 July 2023

Let's Talk about System Requirements...



Everyone and their mother, cousin, friend, associate, partner, and long-lost relation has had strong opinions on System Requirements... Hell, I'm right there with them! We all want detailed listings of settings combined with PC hardware... However, as a group, I think we are a bit too happy when developers give us anything more than nothing to work with.

So, I want to set the record straight on what are good system requirements for games... and I hope that some developers see this and take the advice on board.

21 July 2023

The Performance Uplift of Ada Lovelace over Ampere...

 

One thing I feel like I may have become known for is being one of the first people to point out that there was no "IPC" uplift for AMD's RDNA 2 over the RDNA architecture. Well, I never had an RX 5000 series card to check with, but Hardware Unboxed confirmed this in practice. So, it was nice to feel validated.

Now, I am aware that RDNA 3 is nothing more than a frequency-adjusted RDNA 2 (because the extra FP32 configurations do not appear to be easily used in existing programmes), but the question still burns within me: have Nvidia been able to increase the performance of their architecture from Ampere to Lovelace?

Let's find out...

8 July 2023

June Roundup... Software highs and hardware lows...



I don't have much time for technical analysis over this coming period, though I have a few ideas to explore in the coming months. So, I thought I'd do a bit of a round-up of the thoughts that, more often than not, end up in YouTube video comments and Twitter discussions - especially when I don't see these points replicated anywhere else...

So, let's summarize!

21 June 2023

Analyse This: What would a mid-gen console refresh look like...?


There have been a lot of rumours and meanderings surrounding a potential mid-gen "pro" refresh for both Sony and Microsoft's current consoles. However, I've not really seen much analysis of what form such a device would take or why it might exist.

For this post, I'm going to delve back into my hardware speculation territory to see if we can't imagine some devices for both companies that might make some sense in the market.

1 June 2023

Mid-Range Hardware: RTX 4070 review (Part 2)


Last time, I looked at the relative performance between the RTX 4070 and an RTX 3070 on an Intel-based system. This time, I've chucked these two cards into my AMD system and compared them with my RX 6800 to see the performance scaling on a mid-range system.

17 May 2023

The Power Scaling of the RTX 4070


Literally, again, exactly what happens...

As is my wont, I have decided to poke and prod at any and all hardware within my nefarious reach. Since I've picked up the RTX 4070, why should this particular product be spared? Because it's small and cute? 

"NO!", says the wise man. "They should be subject to the woes and wiles of mortals as much as any other product!"

And so the story goes, again and again... Join me within these pages* where I will outline the limitations of the new sacrificial lamb.
*Technically, no pages are present given the format of this blog...

6 May 2023

Mid-Range Hardware: RTX 4070 review



The RTX 4070 is an interesting card. On paper it's $100 more expensive than the 3070, but it's cheaper than what most people were probably able to pay for their 30 series card during the period from late 2020 through to just before the 40 series arrived.

Certainly, I paid slightly less than MSRP for the base model, and that was around €50 less than I paid for my 3070 when, in desperation, I just bought the first card I could get my hands on because I didn't have a graphics card at all.

However, the 40 series cards are all but languishing on store shelves due to their high prices and lacklustre performance increases over their 30 series counterparts, with the notable exception of the RTX 4090.

With that in mind, I want to focus on that step forward in performance and look at how the RTX 4070 improves on the 3070 on a mid-range system.

1 April 2023

Analyse This: The Technical Performance of Last of Us Part 1...


Where we're going, we don't need roads...


Although I'm fairly sure we do... The Last of Us Part 1 (from here on in: TLOU) has had a rocky start on its PC release. Problems have been reported far and wide, and players have complained, and complained, about performance issues in this port of a PS3-era game.

But is this really a fair comparison? Does TLOU really have the performance problems that people ascribe it? Let's find out!

There will be no spoilers here, today, because I have barely played ten minutes of this release. However, I think I know enough to have a very short post on the demands of this game.

So, without further ado:

19 March 2023

We Need to Talk About FPS Metrics Reporting...


Recently, I've been on a bit of a tear, running around trying to get people to listen to me about what I believe is a better way to analyse the performance of individual games, and of the hardware used to run them, than the approach currently employed by the vast majority of hardware reviewers in the tech space...

As with all new ideas, things are still developing and I'm still choosing what to keep and what to drop - what works and what doesn't. Today, I'm going to summarise the conclusions I've come to thus far, and also introduce a new concept that I feel everyone in the tech space should be using. However, some are getting it wrong: wrong to SUCH an extent that it literally took me a while to double-check the concept, because I can't be the only person to have realised this over the last ten years...

5 March 2023

Analyse This: The Technical Performance of Hogwarts Legacy...


I've never really been a fan of the Harry Potter series - I've never read the books, though I have seen most of the films and thought they were okay. However, I was immediately interested in Hogwarts Legacy when I saw the game for the first time, due to the graphical effects used (i.e. ray tracing), and also once I saw the recommended system requirements. I wanted to test it to see why the requirements were so relatively high and how the game actually runs.

This isn't a review of the game - I might do one of those at a later point. This is a look at how the game scales with certain system resources...

9 February 2023

The power curve of the RX 6800... and improving system energy use.

Again, literally what happens...


It may have escaped your notice that I picked up an RX 6800 last November. It's not been entirely smooth sailing, however: two inexplicably dead DisplayPorts later*, I think I've managed to get a handle on how to scale the RDNA 2 architecture.

*Completely unrelated to any tinkering on my part - since they happened when bringing the computer back from sleep, I don't keep my PC in sleep mode any longer. Never had a problem with it on my RTX 3070!

Last year, I looked at the power scaling of the RTX 3070 because I was conscious about getting the most out of my technology, with a focus on efficiency (both power and performance). So, I'm applying that same focus to the new piece of kit in my arsenal. Come along for the ride...

4 February 2023

Analyse This: Forspoken Demo analysis...


I've been eagerly anticipating the release of the first DirectStorage title, mostly to see whether my predictions and/or understanding of the tech were correct. However, it seems like this particular DirectStorage implementation, much like Forspoken itself, is a bit of a disappointment...

In the end, I did not splash out on the main game, given the very mixed reception and absolutely jam-packed release schedule for the first quarter of the year - I chose to spend my limited money on other titles that I actually might want to play/test, instead.

Luckily for me, Square Enix released a PC demo and, while I am not entirely sure that everything is exactly the same between it and the main release, it is what I am able to test in this scenario. Perhaps the conclusions I draw will be limited because of that, but I do somewhat question whether there will be large codebase or engine optimisations in the main release that are not part of the demo... But, let's see.

22 January 2023

Next Gen PC gaming requirements (2022 update)


It's time for the yearly round-up of trends in games' recommended system specifications!

First up, I want to pay tribute to the person who helped inspire me to begin this yearly endeavour. It was his cataloguing of the games released over a ten-year period on Steam that jump-started this whole thing. He will be missed...

However, games keep being released and keep requiring more demanding hardware... so my Sisyphean task remains.

Let's jump in!

15 January 2023

Yearly DirectStorage rant (Part 3)...


Yeah, I know this is getting bothersome and tiring but, as we finally approach the release date of Forspoken - the first game to include DirectStorage as a way of managing data from the storage device on your PC - I've noticed a trend of people posting about the topic in a very uncritical manner.

So, let's take a quick look at that and let me tell you my doubts...

8 January 2023

Analyse This: Does RAM speed and latency make a difference for gaming...? (Part 4)

 
Uber RAM...

I've looked at the impact of RAM speed on performance over the last few entries and come to a few conclusions:
  • People are wont to interpret data incorrectly - or to draw conclusions from too little of it.
  • On mid-range systems (or below): Pushing RAM to get the lowest possible latency (and system latency) in synthetic tests really does not correlate well with actual game performance...
  • On mid-range systems (or below): Pushing RAM to get the highest possible bandwidth (and system bandwidth) in synthetic tests really does not correlate well with actual game performance...
  • Intel and AMD architectures handle memory access in quite different ways - this may be an indication as to why Intel has historically had better gaming performance than AMD.
  • On mid-range systems (or below): RAM speed past DDR4 3200 really doesn't matter too much in gaming applications.
    • What DOES matter is the quality of the memory IC!
    • Samsung B-die is well known for its overclocking and latency-reducing ability... but even at the same stock settings as another chip, it shows a marked improvement on both AMD and Intel systems for higher-framerate gaming. No overclocking or tightening of timings required!
  • You cannot just look at static metrics like minimum, 1% low, average, and maximum framerates to determine game performance - they don't show you the whole picture.
    • Nowadays, we should be looking at the smoothness of the per-frame presentation. You can do this by adding simple numbers like the standard deviation of the frame-to-frame variance... or you can plot nice graphs of the per-frametime distribution during the benchmark run, treated with the natural log (in order to normalise the results from the extremes) - see the sketch after this list.
  • The differences are pretty small... when taking everything into account. Optimising RAM timings and speed is the sort of thing people who are obsessed with an activity will do. I did enjoy seeing synthetic benchmark numbers go up, until I realised, after looking at all the data, that it was all pointless anyway. You get more out of your time by buying the best memory IC at a decent speed (DDR4 3600 or 3800) and spending more money on your CPU and GPU and overclocking them than you do from optimising lower-quality RAM.
    • Of course, you probably wouldn't have known what was low or high quality RAM when you bought it! I didn't.
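To make those smoothness metrics a bit more concrete, here is a minimal sketch (in Python, purely for illustration) of how you could compute them from a list of per-frame frametimes in milliseconds, such as those exported by a capture tool. The function names and the sample data are my own, hypothetical ones - not taken from any particular tool.

```python
# Sketch of the two smoothness measures described above, assuming you already
# have a list of per-frame frametimes (in milliseconds) from a benchmark run.
import math
import statistics

def frame_to_frame_variance_stdev(frametimes_ms):
    """Standard deviation of the consecutive frame-to-frame deltas."""
    deltas = [b - a for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return statistics.stdev(deltas)

def log_frametime_distribution(frametimes_ms, bins=20):
    """Histogram of natural-log-transformed frametimes; the log compresses
    the long tail of spikes so the bulk of the distribution stays comparable."""
    logged = [math.log(ft) for ft in frametimes_ms]
    lo, hi = min(logged), max(logged)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for value in logged:
        index = min(int((value - lo) / width), bins - 1)
        counts[index] += 1
    return counts

# Made-up example: mostly ~16.7 ms frames (60 fps) with a couple of stutters.
sample = [16.6, 16.8, 16.7, 33.1, 16.5, 16.9, 16.7, 41.0, 16.6, 16.8]
print(frame_to_frame_variance_stdev(sample))
print(log_frametime_distribution(sample, bins=5))
```

The idea is simply that a run with the same average framerate but more stutter will show a larger delta standard deviation and a longer tail in the log-frametime histogram.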
So, with that summary of conclusions out of the way, let's head into the final entry in this series - raytracing.

4 January 2023

Looking back at 2022 and predictions for 2023...

... etc.

I'm not going to lie, 2022 kicked my ass in terms of work. I just didn't have the energy or time to properly dedicate to this blog, even though I had strong opinions on many of the events that occurred during the year - I just wasn't able to put my thoughts down on paper (so to speak). Additionally, I didn't play too many games; instead, I dedicated a lot of my free time to hardware testing, in order to increase my understanding of that hardware and the ways it can affect gaming experiences in the mid-range.

Unfortunately, that testing is far more time-consuming than just doing analysis or quick opinion pieces, but I do feel that I have improved the way I analyse data outputs from game testing - and that is something I can apply going forward, now that I have worked out the methodology to a greater extent.

In addition to this, the majority of the big hardware releases have happened this past year and there really isn't that much for me to be excited about for 2023, so my predictions may be a little weak for this coming year... 

But, the show must go on, so...