This week on Kotaku, there was an article that looked at some of the reasons why the Japanese weren’t buying the PS4 in droves. One of the random remarks from the Japanese corner of the Internet was, “Loads of people are still satisfied with the PS3’s graphics.” That’s not a sentiment we’ve really heard before, and, going forward, it’s going to be a challenge for the industry. On the graphical side of games, the law of diminishing returns, especially for untrained eyes, is kicking in.
The “leap” in graphics from the PS3/360 era to the PS4/Xbox One generation is not a great one. Certain developers, like Quantic Dream, pushed graphics so far that something like Beyond: Two Souls still looks impressive even today, gameplay criticisms notwithstanding. Meanwhile, people play something like Watch_Dogs on the new consoles and many say, “Well, it looks better, but not that much better.” And to some degree they’re right; with the last generation of hardware, studios began to approach that holy grail of gaming, photorealistic graphics. In Beyond: Two Souls, Ellen Page looks like Ellen Page and Willem Dafoe looks like his craggy ol’ self. The graphical performances of those two actors are not blown away by the early peeks we’re getting of Kevin Spacey in the latest Call of Duty game. In fact, they look pretty much the same. With new generations of hardware, we’re not going to get the kind of gigantic, quantum leap in graphical fidelity that players experienced 14 years ago in the jump from the PlayStation to the PlayStation 2.
One of the big problems here is that programmers have gotten very good at cheating. In the PS3/360 era, even when real-time lighting or simulated particle effects were beyond the hardware’s capabilities, graphics artists were adept at faking them. They faked it well enough that most gamers, except for the most discerning graphics hounds, were satisfied with the results. It’s much like how only the strictest audiophiles spit on the MP3 audio format; for everyone else, the opinion is, “Hey, it works well enough for me.”
Now, with more powerful hardware, developers no longer have to cheat. They can render real light, real smoke, and other effects in real time, and the average player can’t tell the difference between the cheat and the real-time effect. What’s more, quite often they don’t even care. With games like The Last of Us and the upcoming Uncharted titles, graphics have hit a “good enough” level for the average player. Certainly, they’re not indistinguishable from real life, but given the priority that graphics get with each new hardware generation, it’s not unreasonable to think that true photorealism is likely less than 20 years away.
And when that day comes, when all PCs and consoles are so powerful that games look photorealistic, what happens then? How can graphics still matter when they all look the same? It’s the same problem cinema has faced ever since CG effects entered the industry. We’ve come a long way from the light cycles in Tron; movies now routinely use photorealistic CG effects, and while audiences appreciate the spectacle, no one cites the realism as a selling point of the effects. They’re just considered tools to tell a story. In the same way, when the waves in a videogame ocean start looking like real oceans, when there’s no discernible difference between a Gran Turismo Lamborghini and a real one, and when gamers have had 10 years to adjust to all this, what will the selling point of games be?
Ultimately, it’s the interactivity, the gameplay, that will always be the greatest strength of games, and we’re starting to see it gain more and more importance. That’s the way it should be, but it took us many years to reach the point where people care less about how a game looks and more about how it plays. Graphics were always important, but ironically, as they get better and better, their importance lessens.