Absolutely bizarre that a 1st party title doesn’t seem optimized for the console they’re developing for. This makes me skeptical the PC version will be optimized too.

  • ShinkanTrain@lemmy.ml · 3 months ago

    30, 60, or whatever fps is (or at least should be) a development decision made very early on. It’s only a case of poor optimization if the game doesn’t reach the target they’ve set.

    I don’t like it either, but an Unreal 5 game running at 30 fps (if that lol) on current gen is the norm.

      • Ghoelian@lemmy.dbzer0.com · 3 months ago

        The people who keep saying that should really just try a 144+Hz monitor for a while. Surely they’d notice the difference as well.

        • Thrashy@lemmy.world · 3 months ago

          Might just be my middle-aged eyes, but I recently went from a 75Hz monitor to a 160Hz one and I’ll be damned if I can see the difference in motion. Granted, I don’t play much in the way of twitch-style shooters anymore, but for me the threshold of visual smoothness is closer to 60Hz than whatever bonkers 240Hz+ refresh rates current OLEDs are pushing.

          I’ll agree that 30fps is pretty marginal for any sort of action gameplay, though historically console players have been more forgiving of mediocre performance in exchange for more eye candy.

          • ParetoOptimalDev@lemmy.today · 3 months ago

            Games feel almost disgusting at 60Hz now, but they felt fine before I tried 144Hz.

            Maybe if I was stuck at 60Hz for a long time I’d get used to it.

            Now, though, if I switch for 30 minutes I can’t ignore the difference.