131 comments
  • There are a number of theories why gamers have turned their backs on realism. One hypothesis is that players got tired of seeing the same artistic style in major releases.

    Whoosh.

    We learned all the way back in the Team Fortress 2 and Psychonauts days that hyper-realistic graphics always age poorly, whereas stylized art ages well. (Psychonauts aged so well that its sequel, sixteen years later, kept and refined the style, which evolved from a workaround for hardware limitations into something straight-up muppet-like.)

    There's a reason Overwatch followed the stylized path TF2 had already trodden: the art style will age well as technology progresses.

    Anyway, I thought this phenomenon was well known. Working within the limitations of the technology you have available can push you toward brilliant design. It's like when Twitter first appeared: I had comedy-writing friends who treated the 140-character limit as a tool, forcing themselves to write tighter jokes.

    Working within your limitations can actually make your art better, which complements the fact that stylized art lasts longer before it looks dated.

    Others speculate that cinematic graphics require so much time and money to develop that gameplay suffers, leaving customers with a hollow experience.

    Also, as others have pointed out, it's capitalism and the desire for endless shareholder value increase year after year.

    Cyberpunk 2077 is a perfect example: a stunningly beautiful technical achievement from which tons of planned content (like wall-running) had to be cut because the team simply couldn't get it working before investors demanded the game ship. As people saw with Phantom Liberty, given enough time Cyberpunk 2077 could have been a masterpiece at release, but the investors simply didn't give CD Projekt Red enough time before they cut the purse strings and said "we want our money back... now." Releasing too early is a choice.

    ...but on the other hand it's also a choice to release too late after languishing in development hell a la Duke Nukem Forever.

  • This author has no fucking clue that the indie gaming industry exists.

    Like Balatro... you know, the fucking Indie Game of the Year, that was also nominated for Best Game of the Year at the Game Awards.

    Localthunk was able to build this in Lua... WITH A BOX OF SCRAPS!

    • This article wasn't about indie games.

      • Ignoring indie games here is ignoring the answer to the entire premise. It's part of the equation.

        It would be like complaining that there's no place to see big cats, while not mentioning the zoo at all.

    • I think the worst part is the author even points to freaking Minecraft and Roblox, both of which were indie titles when they first launched, and also compares triple-A titles to a live-service game and Epic's tech-demo-turned-Roblox-clone.

      Honestly it reads more like they set out to write an article supporting a given narrative and carefully tuned their evidence to fit that narrative.

      How about some studios that aren't hurting and don't fit that narrative? SCS Software, which makes Euro Truck Simulator 2 and American Truck Simulator, hasn't released a new game since ATS launched in 2016, because its business model is to keep selling DLC to the same customers and invest that money in continuing to refine the existing games. Urban Games has openly stated it exists solely to build the best modern Transport Tycoon game it can, releasing a new iteration every few years with significant game engine improvements each time. N3V Games was literally bought out by a community member of one of its earlier titles when it was facing bankruptcy, and simply exists to refine the Trainz railroad simulator. And there's the famous example of Bay 12 Games, which released Dwarf Fortress (an entirely text-mode game) as freeware with the "agreement" that development would continue as long as donations kept rolling in.

      The answer isn't a move to live-service games as the author suggests, nor is it to stop developing high-fidelity games; it's simply to make good games. Gaming is one of those rare "if you build it, they will come" markets where there's a practically infinite number of niches to fill, and even a new game in an existing niche can be extremely successful, whether through technical differences, design differences, or just differences in gameplay. RimWorld, Dwarf Fortress, and Banished all share very similar basic gameplay elements, but all can exist without eating each other's market share because they're incredibly different games. Banished focuses more on city building, RimWorld focuses on story and on your colonists ultimately escaping the godforsaken planet they've crashed on, and Dwarf Fortress is about building the best dwarf civilization you can before something ultimately causes its collapse (because losing is fun!).

    • I'm sorry sir, but I'm not an indie dev. I need to show the investors that my game will earn $100 million otherwise it's a failure.

  • It is hard for me to take seriously a hand-wringing industry that makes more money than most entertainment industries. Capitalism is the primary cause of articles like this. Investors simply demand moar each year, otherwise it is somehow a sign of stagnation or poor performance.

    AAA studios could be different, but they choose to play the same game as every other sector. Small studios and independents suffer much more because of the downstream effects of the greedy AAAs establishing market norms.

    We need unionization, folks. Broad unionization across sectors to fight against ownership/investor greed. It won't solve everything but it will certainly stem the worst of it.

  • Overall a good article with some inaccuracies, but to me the answer to the article's question is an easy no. The industry as a whole won't recover because it's an industry: it follows the rules of capitalism, and it's a constant race to the bottom. Good games by good people happen on the side, but they happen in spite of the system. Everything else is working as expected and will continue until you pay per minute to stream games you rent, with intermittent forced ads and paid level unlocks.

  • A lot of comments in this thread are really talking about visual design rather than graphics, strictly speaking, although the two are related.

    Visual design is what gives a game a visual identity. The level of graphical fidelity and realism that's achievable plays into what the design may be, although it's not a direct correlation.

    I do think there is a trend for higher and higher visual fidelity to result in games with blander visual design. That's probably because realism comes with artistic restrictions, and development time gets sucked away from creative art to support realism.

    My subjective opinion is that for first person games, we long ago hit the point of diminishing returns with something like the Source engine. Sure there was plenty to improve on from there (even games on Source like HL2 have gotten updates so they don't look like they did back in the day), but the engine was realistic enough. Faces moved like faces and communicated emotion. Objects looked like objects.

    Things have improved since then, and should have, but graphical improvements should really have been the sideshow to gameplay and good visual design.

    I don't need a game where I can see the individual follicles on a character's face. I don't need subsurface light diffusion on skin. I won't notice any of that in the heat of gameplay, but only in cutscenes. With such high fidelity game developers are more and more forcing me to watch cutscenes or "play" sections that may as well be cutscenes.

    I don't want all that. I want good visual design. I want creatively made worlds in games. I want interesting looking characters. I want gameplay where I can read at a glance what is happening. None of that requires high fidelity.

  • I have a computer from 2017. It's also a Mac. I can't play recent games, and I've grown more and more turned off by the whole emphasis on better graphics. The need to spend ridiculous amounts of money on either a console or a really good graphics card for a PC has turned me off of mainstream gaming completely.

    Mostly I just play games I played when I was a kid these days. 1980s graphics, and yet I have yet to get tired of many of them...

  • I had a lot of fun playing Romancing Saga 2 and Ara Fell recently. Sometimes games can be more immersive by not having high fidelity graphics.

    • I've seen a lot of cool indie games pop up out of heavily modified classic idTech engines like the DOOM and Quake engines. They're definitely not high fidelity, but a lot of them scratch an itch that slower paced modern games can't seem to scratch.

  • GSC, in my opinion, ruined Stalker 2 in the chase for "next gen" graphics. And modern graphics are now so dependent on upscaling and frame generation. Sad to see, but trailers sell.

  • Eh. I want hyper-realistic graphics, but I also want a solid story and good gameplay mechanics. If hyper-realistic graphics took a backseat to story and mechanics, I'd be just as annoyed as I am by a focus on hyper-realistic graphics over story and mechanics.

    Edit: Generally speaking, of course. There are quite a few modern games with non-realistic graphics I enjoy, but I'm always waiting for that next hyper-realistic game to push the boundaries.
