29 March 2024

Analyse This: Simulating the PS5 Pro...



Last time, I took a look at the PS5 Pro leaks and concluded, based on the claimed performance uplift, that the PS5 Pro will likely be heavily CPU-constrained (considering the upgrade in totality) and that many titles will probably not take advantage of any 'RDNA3' architectural improvements in the APU.

It still baffles me why Sony would even bother releasing this thing, as I concluded:
"Honestly, a part of me is wondering why AMD/Sony didn't go with the same 36 CU configuration, but using RDNA 3 instead. They'd get the RT bonus performance and they could have clocked the GPU frequency higher to achieve a similar level of raster performance though at a cost to power use. The die would also be cheaper - and this is doubly important if there is some sort of CPU bottleneck in play - you've got a lot of wasted die area spent without capitalising on the potential performance."
But let's not dwell on hypotheticals; let's do some testing!

19 March 2024

Analyse This: Let's look at the PS5 Pro leaks...


Last year, I looked at the possibility of a mid-generation refresh for Sony's or Microsoft's current console line-ups and didn't really see any point in one. I posited some easy improvements, along with reasons why other possible improvements didn't really make any sense.

Now, after a series of rumours from various sources, Moore's Law Is Dead has presented a pretty concrete leak, said to originate from Sony's developer portal, confirming the console's existence along with some key performance metrics.

So let's take a look at what we have and whether such a console can change my thoughts from last time...

1 February 2024

We Need to Talk About FPS Metrics Reporting... (Part 2)



There's a well-known adage: "There are lies, damned lies, and statistics...". The implication is that the "statistics" in question are another, worse form of lie, one that is somehow obfuscated from the receiver of the information.

We also have multiple well-known sayings that revolve around the idea that "you can make the statistics/data say anything you want". It seems readily apparent that people, in general, do not like or trust "the statistics".

I experience this, in my own way, in my day-to-day work. Scientists are currently not the most trusted of individuals - for whatever reason - and one of those reasons, in both cases, is a lack of understanding of the results of data analysis on the part of the consumer, both within and outside of scientific circles.

In the same way people say "science is hard", people say "statistics is hard"... and this is for good reason - though perhaps not the specific reason that immediately springs to mind!

Statistics is not that difficult once you know what you are doing (at least in my opinion). The difficult part is knowing which statistical test to apply, when and where. Yes, the difficulty, as with designing scientific experiments, lies in understanding the context, limitations and biases of what you wish to test and how you wish to test it.

This is why there are many statistical tests where the number of data points needs to be below or above a certain limit; why it is important to know the relationship between the individual data points and the set as a whole; and how the interpretation of the result of the analysis might be changed based on myriad factors.

Hence, we come to today's topic for discussion: hardware performance testing in games!
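To preview the sort of pitfall in question, here is a small Python sketch using invented frame-time numbers (purely hypothetical, for illustration): naively averaging per-frame FPS values overstates performance compared with the true average (total frames divided by total time), while a frame-time percentile exposes the stutter that either average hides.

```python
# Hypothetical frame-time samples (ms): a mostly-smooth run
# with a few long stutter frames mixed in.
frame_times_ms = [16.7] * 95 + [100.0] * 5

# True average FPS: total frames divided by total elapsed time.
total_time_s = sum(frame_times_ms) / 1000.0
true_avg_fps = len(frame_times_ms) / total_time_s

# Naive average FPS: arithmetic mean of per-frame FPS values,
# which weights the fast frames far too heavily.
naive_avg_fps = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

# A high percentile of the frame-time distribution reveals the stutter.
p99_ms = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms)) - 1]

print(f"true average FPS:  {true_avg_fps:.1f}")   # ~47.9
print(f"naive average FPS: {naive_avg_fps:.1f}")  # ~57.4 - flattering!
print(f"99th-percentile frame time: {p99_ms:.1f} ms")
```

The same data, summarised three ways, tells three different stories - which is exactly why the choice of statistic matters as much as the measurement itself.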

Last time, I attempted to communicate the shortcomings of, and the incorrect analysis being performed in, the industry at large. Admittedly, I was unsuccessful in many ways and was roundly dismissed by most parties...

Today, I will try a different tack.