I’m not particularly thrilled about the PlayStation 4 or Xbox One. I know I should be, considering what “next gen” usually means for gaming: fewer technical barriers, better-realized ideas, and the potential for new creative risks. And I was definitely excited about the launch of this current generation. In fact, I’m likely one of the few people on Earth who still has a functioning day-one launch version of the Xbox 360 (seen here).
But this generation doesn’t feel like it’s leaving behind a decent legacy. The obvious positives were some new intellectual properties (some of which are now iconic) and the flourishing of the indie scene, but it still didn’t leave as strong an impression on me as past generations did. Honestly, years from now, I can’t imagine myself looking back on my 360 or PS3 with fond, definitive memories … at least not in the way I do with those that came before.
All of that could be an easy answer to my lack of excitement for these upcoming consoles, but I won’t be so lazy. The reason goes deeper. A lot of why I’m so indifferent has to do with the systems themselves.
Above: Ah, yes, polygons. … I remember those. More of them is good, right? Could have fooled me.
Next-gen graphics, you say?
Interestingly enough, the “next gen” of visuals never truly kicks off until the Big Three launch their new hardware — no matter what that may be. Obviously, this is due to the tradition of new consoles “leaping ahead” in graphical capabilities, at least when compared to the persistent standard that is the PC. Unfortunately, as we’ve been told in the past, this upcoming generation is long overdue. Thanks to a recession, Microsoft, Sony, and Nintendo held off on new hardware, launching later than usual. So while consoles were stuck with tech that never changed, GPU manufacturers continued on with regular upgrades despite suffering from the same economic lag.
First, I need to make one very important point: Despite what we’ve been told by Sony and Microsoft at their wonderful conferences, next-gen graphics no longer depend solely on the prowess of hardware. Graphics engines are an element of software, and though they do rely on the capabilities of the systems running them, advances in programming techniques have ensured that games can look prettier than ever while facilitating a more practical use of resources.
Therefore, next-gen graphics aren’t entirely a matter of raw power. That high-poly old-man face was plain misdirection; games have displacement mapping now. They’ve had it for years, in fact. That is my point. Gamers have been experiencing cutting-edge graphics for quite a while … on the PC. DirectX 11 is not a new thing, nor are all of the bells and whistles that come with it.
Hell, even brilliant modders like Boris Vorontsov have managed to inject advanced graphical features into PC titles that were otherwise dated or stunted by their console counterparts. I know Xbox, PlayStation, and Nintendo fans hate hearing this sort of stuff, but it’s fact. Sorry.
The truth is that the PlayStation 4 and Xbox One are not “leaping ahead” this holiday season; they are merely playing catch-up. That basically cancels out a huge reason we’ve looked forward to these things.
This is The Elder Scrolls V: Skyrim (PC) with graphical mods and arguably what it would look like in “next-gen” form.
Xbox calls with the family — yay
The consoles of this generation aimed to mimic our common multimedia devices by, well, becoming them, and these upcoming systems are marching to the same beat. Providing a variety of features is all well and good, but such bonuses become an unnecessary burden when they don’t augment the gaming experience.
Social networking is cool, I suppose. I’ll give credit where it’s due: The ability to share data has been a decent gaming addition. I actually find myself browsing the image galleries of my Steam friends from time to time. But social features in console games have always lacked something PC games have long had: the ability to easily form strong, active, vocal communities.
Obviously, social networking is a means to connect with others, so those features in games should always make it easier to reach out to like-minded gamers. Be honest — have you ever truly wanted to know your fellow Call of Duty players better? Probably not. I bet you’ve muted one or two, though. That’s usually what happens when developers restrict our interactions to simple in-game chat.
PC games don’t really face this issue. Due to the ease of connectivity that the platform provides, communities form organically for virtually every game. It’s why massively multiplayer online games are so prevalent on PCs and so absent on consoles; it’s a genre that relies almost entirely on active communities. Hell, people are even playing old, seemingly dead multiplayer-only PC titles (like Starsiege: Tribes) to this day because of long-lasting camaraderie.
So when it comes to consoles, all of these tacked-on features mean nothing if they don’t somehow enhance how we play. Skype chat is old news … and is completely useless to us.
Community and communication are integral to large-scale online games. I chose this PlanetSide 2 video because it’s the perfect example.
“Only on [insert platform]”
When you really think about it, the whole console-exclusive process is rather pointless, and we’re not benefiting much from it. We’re paying hundreds of dollars for hardware to then spend hundreds more on software that could easily be on any other platform. Even then, the selection is relatively small, considering how it’s more financially sound to release a game on every system available (hence why most publishers do it).
In an ideal world, individual consoles wouldn’t need to exist; they don’t really push the medium forward. Specialized hardware manufacturers and game companies would simply compete amongst themselves. Both the PlayStation 4 and Xbox One share similar specs (AMD saw to that), so neither has any particular edge over the other.
As I see it, we consumers would benefit more if the game industry adopted the home entertainment model of the film industry. Sony Pictures Entertainment owns the exclusive rights to several million-dollar franchises, all of which can be viewed on any Blu-ray/DVD player, not just one made by Sony.
The reason for this is simple: It would be incredibly stupid to limit Sony films to players only Sony makes. Blu-rays and DVDs are just glorified software. They are cheap to produce, so the bulk of the profit lies in their sales alone.
Compare this to the video game industry, and it’s easy to see how console exclusivity — despite each company getting a small cut of software sales — is an incredibly nonsensical approach. Sony and Microsoft generally lose money on every system sold (even more so with all these useless features and services they keep tacking onto them), so simply publishing games seems like a sounder way to turn a profit from software.
I don’t see much point in it all. I may be missing something integral here, so please share with me any particular defense you have for console-specific games. It makes sense with Nintendo because it overcharges for its hardware to the point of rectal prolapse, but that’s not the case with Sony or Microsoft.
Above: Compatibility aside, all he has to worry about is which season of Intervention to buy.
Wake me when we get there
Part of me wonders if I’m a bit dead inside. I felt a twinge of elation when Sony announced the PlayStation 4 — as it took me back to similar moments in my past — but that went away once I considered all of the aforementioned issues. It’s just not the same for me.
Besides the usual (convenience and first-party nonsense), I can’t think of any logical reason to own a next-gen console, so I’m the furthest thing from excited when it comes to this holiday season. I may buy only one of the three, and I’ll probably feel like I’m unnecessarily spending money.
As far as I can see, consoles offer nothing that I haven’t already been enjoying, so any enthusiasm toward them seems incredibly misplaced. But perhaps that’s just me.