There's been a lot of reporting on how the current console generation's mid-cycle upgrade marks 'the end of console generations', but this can only be a bad thing for consumers and developers alike (as I've outlined before).
On the flip side, are 'generations' really at an end? They're still going strong in the mobile market - there are definite generational separations there. Many games and apps won't work on old hardware, and some apps that could work have their support for older hardware configurations dropped due to cost and actual user numbers.
So what is this, exactly?
In my opinion, the console manufacturers want to make the most money possible. That's fair, as long as the consumer is getting a good deal out of it too! To do that, someone (well, it looks like multiple someones) realised that consumers are most easily persuaded to pay more when the increase hides behind hardware increments.
You see, paying £30 for a PC game five years ago was pretty standard; now it's more like £40-60. The problem is that the reach of the PC market has grown over that period, so the scale of sales is potentially much larger. Sure, it costs more to produce certain types of games, but that is also a reflection of how large the perceived market is. One of the biggest problems of the maturing industry has been predicting sales and thus deciding how much money to pour into a given project. Unfortunately, if you look at a few 'underperforming' or 'missed sales target' headlines stacked next to their actual sales numbers, you see that the expectations were always too high.
So, of course, more money must be charged to support games that have too large a budget compared to their actual customer base.
The console side of the industry has had an easier time of things because it can fold these price increases into each technological update, in the same way that DVDs are cheaper than Blu-rays, which are cheaper than 4K content. (Yes, there is investment to take into account there, but I'm leaving it out of the equation for the moment: how can you compare the money spent on developing DVD with the development costs of Blu-ray? It's very difficult, and a moot point once the technology reaches widespread market penetration...)
So, instead of $40 for a PS2 title, you spent $50 on a PS3 title and $60 on a PS4 title.
That's great in terms of justifying the increased cost to the consumer, because they see a definite improvement in the performance and output of their entertainment. (Incidentally, this is why 3DTVs and 4K screens are a difficult sell to the average consumer.) The problem is, or was, that the last console generation lasted for seven years for Sony and eight years for Microsoft. Clearly, that's great for the consumer, because they get long-term support and the relative value of their equipment purchase increases over time. However, it seems like publishers and the platform holders were both champing at the bit to a) get new, more powerful technology out of the gate to support better features that could make them more money and b) charge more money for the games they were releasing.
In fact, you did see a few $60-tier games towards the end of the PS3/Xbox 360 generation, but that became the de facto price for top-tier games in the PS4/Xbox One generation. The problem publishers and developers face is that, with each generational change, there is a loss of market size for these more expensive (in both production cost and sale price) games.
So, now we come to the current situation: the phone model gets around this problem very nicely. The OS manufacturers allow backwards compatibility (and in some cases force it) for apps and games as a condition of access to their walled-garden marketplaces, which reach a very large market. Suddenly, you can see the point at which the light bulbs went on above people at MS and Sony. They too have walled gardens.
|Trakin's 2012 worldwide mobile marketshare|
The problem here is that the console market does not reflect the mobile market in a few key ways:
- People buy phones primarily as a means of contact (which includes the availability of 'contact apps'), not for games.
- The majority of phones are subsidised by the carrier through two-to-three-year contracts, not bought outright by consumers themselves.
- App prices are very low: either 'free' or sub-$10.
- Nobody uses apps/games from 5 years ago.
- Each generation of phone has multiple price tiers. You don't need a Samsung Universe X5 at $599 to enjoy the latest apps (at pretty much the same quality) when you can get an 'alternative phone' at $199.
These are big issues going forward for the console industry.
There is also quite a large performance gap between the Xbox One S and Scorpio, and a significant but smaller difference between the PS4 and Neo. Not only will supporting the multiple performance envelopes, bottlenecks and platform certifications increase development costs, but given the time frame of the development process these days (3-5 years on average), which platforms will developers target?
I mean, if we're talking about incremental upgrades every three years, a game will be in development and have to adapt to the new architecture mid-way through. Or will developers, as happens on PC, target an imaginary 10-20% improvement over the current hardware and then optimise once they get the official technical specs?
In the PS3 and Xbox 360 generation, we saw a lot of optimisation, and familiarity with the platforms paid off towards the middle and end of the generation. It's unlikely we'll see that if consoles head down this particular path. What's the point in optimising or wringing the absolute most out of hardware when the next iteration is literally a game away? It's not cost- or time-effective. Just down-res the content, output and framerate for the current spec... that's the easier way.
This adds a lot of cost and uncertainty onto the development process.
This also means that, for the most part, new features and peripherals cannot happen. You can't have a multi-generational pool of consumers to target and then introduce a peripheral like VR, because then you've broken the model. To be fair, this is partly why peripherals have historically never performed well in terms of sales, but it severely limits the agility of the console environment to include new experiences and input/output methods.
Looking back at the history of the PC, we've seen a pretty much static design that has not evolved except for incremental power increases each year. Even controller support is pretty limited on the platform - what advancements have there been outside of those imported from the console space?
Imagine a world without the Wii. Imagine a world without improved gamepads/wheels with rumble and force feedback. Imagine a world without the iPhone...
We need hardware manufacturers to ask questions and innovate - even if those innovations do not take off immediately. (Nintendo started off with consumer VR in the 1990s, and it's only just now coming to market in a usable* version.)
AR (augmented reality) has been around for a while now, but it's only with technology derived from the military, the VR push towards the consumer and powerful smartphones that it's really beginning to get its feet underneath it.
Even looking at the stagnation of the smartphone market after its explosive evolution, I'm seeing a lack of improvement. What's next on the horizon? There's little incentive, outside of chest-beating and susceptibility to marketing, to upgrade to the latest and greatest phones. I upgrade whenever I am forced to (either through the phone becoming unusable, being lost, or breaking) - and I doubt I'm the only one.
I think that the next evolution in phone technology will be the marriage between the 'base unit', AR devices and the bluetooth headset**. Even that integration into portable communications technology is at least 5-10 years out.
Similarly, games have an amazing backlog of technological progress (or lack thereof) in terms of AI, procedural storytelling/content production and reacting to the player's actions. One of the real advancements in gaming would be a story and/or world that dynamically reacts to what the player does.
Oh, and it would be great to have some local multiplayer again.*** Gaming has become a very solitary pastime compared to the options it once held and, in that sense, has been moving towards the PC style of entertainment for a long time.
|Heading for the world of Watch_Dogs? Hopefully our AR experiences will be less clichéd...|
Getting back to the topic at hand, what can we expect from the new style console generations?
- Increased prices for games that work on the newer hardware
- Poorer optimisation and performance on the older supported hardware
- Pressure to upgrade to the latest console variant through withholding software support
- Games not making full use of the technology available to them
- Focus on graphics instead of innovative gameplay features
To be honest, I think the future looks very uncertain in the console space and, if I were an investor, I would be very careful about where I put my money. The reception of PlayStation VR, Neo and Xbox Scorpio is an important point to watch, but beyond that we should really be watching the output on these platforms - and consumers' reception of that output - even more carefully.
It's not unusual to see what looks like a success in initial sales, but once consumers become jaded with a sub-optimal strategy (through poor product quality and/or support) that success will quickly turn to failure. We don't even have to go that far back: look at Microsoft's pushing of Kinect with the Xbox One without a killer feature or set of features. Looking even further back, there are plenty of examples in both the console and other markets.
*Well, as usable as it can be, I suppose. I still view it as an interactive 'dead end'.
**In terms of being a usability peripheral rather than the data transfer technology itself.
***I'm looking at you, Dirt Rally, with your lack of 'hot seat' action.