1 February 2024

We Need to Talk About FPS Metrics Reporting... (Part 2)



There's a well-known saying: "There are lies, damned lies, and statistics...". The implication is that the "statistics" in question are another, worse form of lie, one that is somehow obfuscated from the receiver of the information.

We also have multiple well-known sayings revolving around the idea that "you can make the statistics/data say anything you want". It seems readily apparent that people, in general, neither like nor trust "the statistics".

I experience this, in my own way, in my day-to-day work. Scientists are currently not the most trusted of individuals - for whatever reason - and one of those reasons, in both cases, is a lack of understanding on the part of the consumer of the results of data analysis, both within and outside scientific circles.

In the same way people say "science is hard", people say "statistics is hard"... and for good reason - though perhaps not the specific reason that immediately springs to mind!

Statistics is not that difficult once you know what you are doing (at least in my opinion). The difficult part is knowing which statistical test to apply, when and where. Yes, the difficulty, as when designing scientific experiments, lies in understanding the context, limitations and biases of what you wish to test and how you wish to test it.

This is why many statistical tests require the number of data points to be below or above a certain limit; why it is important to know the relationship between the individual data points and the set as a whole; and why the interpretation of the result of the analysis might change based on myriad factors.
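As a hypothetical illustration of that last point - how the choice of summary statistic changes the interpretation - consider a made-up run of frame times (not real benchmark data). The numbers and variable names below are my own assumptions for the sketch:

```python
# Hypothetical frame-time data (milliseconds): mostly smooth frames,
# with a handful of severe stutters mixed in.
frame_times_ms = [16.7] * 95 + [100.0] * 5

# "Average FPS" derived from the mean frame time.
mean_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / mean_ms

# "1% low FPS": the FPS implied by the worst 1% of frames.
n_worst = max(1, len(frame_times_ms) // 100)
worst = sorted(frame_times_ms)[-n_worst:]
one_pct_low_fps = 1000.0 / (sum(worst) / len(worst))

print(f"average FPS: {avg_fps:.1f}")      # → average FPS: 47.9
print(f"1% low FPS:  {one_pct_low_fps:.1f}")  # → 1% low FPS:  10.0
```

The same data set reads as "playable, nearly 50 FPS" by one summary and "badly stuttering" by another - neither number is a lie, but each tells a very different story on its own.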

Hence, we come to today's topic for discussion: hardware performance testing in games!

Last time, I attempted to communicate the shortfalls and the incorrect analyses being performed across the industry at large. Admittedly, I was unsuccessful in many ways and was roundly dismissed by most parties...

Today, I will try a different tack.