I remember, if you will permit the brief reminiscing, when PC gaming was all about the hardware and, to a lesser extent, the software. Back in those heady days of the late 80s and early-to-mid 90s, it seemed like each new game I purchased required some new PC upgrade in order to function properly. CD-ROMs, dedicated graphics cards, new RAM, monitors (CGA/VGA etc.), new, infinitely more powerful CPUs... SOUNDCARDS!
I used to lie there on my bed, a magazine in my hands or draped across my chest, in a haze of imagining. It was a drug I was happy to partake in; fuel for dreams of the future. The sad thing is that this wasn't really a happy time. There was too much vendor lock-in, too many things that didn't play well together and that ended up being abandoned after low adoption by players and/or the industry.
Luckily, we sort of got over that phase. Things calmed down in the early 2000s, tech became cheaper and, whilst upgrades were still required, a decent video card, sound card and CPU would basically cover everything for a couple of years (and then a GPU upgrade!). Monitors were very stable too. It got even better in the mid-2000s, with integrated sound chips on the motherboard and overall improved hardware and OSes that required less tweaking and "expert" knowledge.
I think that, in some respects, we've been in a bit of a golden age from about 2006 to 2013. Tech slowed down enough that a single mid/high-range PC might last you 4-6 years, as long as you weren't obsessed with ever-increasing your desktop resolution or achieving perfect anti-aliasing. I myself was still using a 17" CRT until 2009 (and I still use it when I visit home!). Even now I'm on a 1600x900 LCD, and the PC I have is into its third year with no sign of needing an upgrade to play games on medium or high settings. Sure, the GPU fan is loud - I can't do anything about that, and I'm not convinced that switching to a much more expensive GPU at the same resolution would really fix the issue - but otherwise the box itself is perfect for my needs.
Things are changing, though. The traditional PC sector is having a bit of trouble driving sales, partly due to this hardware stability and partly due to the explosion of "good enough" tablet devices being sold to the majority of people who were previously sold cheap Dells and HPs to browse the web and send emails.
Out of this carnage and the mild associated industry panic, we are seeing an increase in commoditised Hardware. I'm putting a capital 'H' here because this new trend isn't about incremental improvements to your PC but actual sealed-unit (or complete-kit) improvements. I'm talking about the Oculus Rift, G-Sync monitors, the Nvidia Shield and the Steam Controller.
Now, I'm not saying that there haven't been peripherals before... Of course there have! What I am saying is that there is a movement within the industry to innovate through new hardware experiences. These aren't particularly cheap, either, and it's funny to see the parallel with that (for me) early period in the 90s: it feels like in a few years we may again find ourselves unable to get the best out of a game because we don't have the right hardware to experience it as it was conceived.
I'm not saying that this is a bad thing, either. I think things like the Oculus and the Steam Controller have the potential to be great additions to the landscape of PC gaming - even if I'm still sceptical of their eventual impact. Other innovations, such as G-Sync monitors, have more obvious benefits - at least to my mind - but are unfortunately vendor-specific... which isn't good.
As lots of people have stated time and time again: history repeats itself. Let's hope these companies are taking into account the mistakes of the past...