The problems mostly aren't about GPU power, it seems, but awful CPU performance.
Makes me feel a bit better, the game hits my 1080 Ti pretty hard also. I also don't play many (really, any) modern Unreal games, so I didn't know what to expect.
There’s an engine.ini file on Nexus that greatly improves performance. I’ve been using it for a few days on my gaming laptop with settings pretty much maxed and find it more than acceptable. The game has crashed 3 or 4 times though … Not sure if related.
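For anyone curious, those Nexus files are mostly just Unreal console variables dropped into Engine.ini under a [SystemSettings] section. The exact contents vary by mod; the snippet below is only a rough illustration of the kind of tweaks they contain (these specific cvars and values are my own guesses, not the actual Nexus file):

[SystemSettings]
; turn off volumetric fog, one of the heavier effects
r.VolumetricFog=0
; pull draw distance in a bit
r.ViewDistanceScale=0.8
; give texture streaming more headroom if you have the VRAM to spare
r.Streaming.PoolSize=3072

Back up your original Engine.ini before pasting anything in, obviously.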
That’s just Bethesda
I get the impression it’s a poorly optimized quick’n dirty cashgrab.
They’ve remastered the feeling of not having enough horsepower in your PC, it seems. That’s what it felt like back in the day, in 2006, as I remember. My PC could not handle the open world very well at the time; it was a stuttery slideshow mess. I preferred playing it on Xbox 360 because it ran more smoothly. They should probably optimize it better, though.
This was my first thought too lmao. “How considerate of them to recreate the experience of not being able to play it smoothly until half a decade after it comes out”
I remember tweaking the ini for goddamn hours getting the settings right like it was another school of magic to master
The thing is, it’s not all about horsepower. You gotta think about horse amour too.
horse amour
Bojack remake looking pretty lit.
Stop!
I can only get so hard!
oh god here we go again. https://www.pcgamer.com/games/rpg/its-happening-again-oblivion-remastered-is-selling-new-horse-armor-like-its-2006/ I think I’m happy leaving gaming for music and beer at the moment. Yeah, you’ll buy the same album you already own, just “remastered” with less dynamic range and louder volume, but at least it’s cheaper.
Hardware manufacturers continuing down the path of least work and highest harvest through deceptive marketing, and software manufacturers doing the exact same - or, in short, capitalism - was 100% on my bingo list for any year.
I’ve got a 7800XT, which is 1.5x more powerful than a 5090, and it struggles too.
which is 1.5x more powerful than a 5090
…are you sure? I think you’re mistaken.
Yeah, buddy. Pretty sure. 🙄
checks out
Sorry I downvoted you, took me forever to get the joke.
That’s part of the fun.
Kind of amazing how Bethesda manages to always be at least 5 years behind every other major developer on optimization. I recently picked up Fallout 4, my first attempt at the series, and I just completely lost interest, one of the main reasons being the absurd load times on a 7900 XT.
I’d be shocked if it was your GPU causing long load times, that’s not usually the culprit for that kind of thing unless you don’t have enough VRAM. It’s probably another part of your system the game arbitrarily dislikes, or it’s just generally being shit.
Running it off an HDD, prob
While I don’t have an issue taking a dump on Bethesda, this time the issue is either with Nvidia or Epic. The 50 series has been a shit show, and I wouldn’t be surprised if the flagship model somehow performs worse than last-gen cards. And UE5 is also a shit show. Games made with UE5 look like blurry shit unless you buy a $1000+ top-of-the-line GPU, because it’s also a massive performance hog. If you’re playing a game that looks like shit and runs like shit, it’s probably using UE5.
I can’t find the article for the life of me, but I read an interview with a dev who basically said that UE5 is fine unless you try to crank all of the visual bells and whistles at the same time. Now imagine being a dev team trying to convince marketing not to use all of the features they paid for. Can we blame Epic and Nvidia?
I think I get an average of ~100 fps on my 3080 Ti. Obviously it varies greatly: outdoors during weather or near Oblivion gates it’s closer to 60, but indoors, in dungeons, etc. I get over 100.
Yeah, I have concerns over the quality of this article. I’m using a standard RTX 3070 and I don’t seem to have any of the frame issues to the degree that the article talks about. I’d lean more into it being a 5090 issue, not an Oblivion / UE5 issue.
My partner’s RX 6600 holds 80 FPS on mid settings without framegen. Maybe the gap between medium and high is larger than I realize, or there’s an ultra setting I hadn’t noticed, but the author’s experiences still seem out of whack.
I feel like this has been a trend lately. A new, high-fidelity game releases, and the wave of “UNOPTIMIZED GARBAGE!” “dev, fix ur game!” starts rolling, only for myself, and the majority of people I speak to personally, to have no real issues. Feels rude to play this card, but I’m starting to lean towards most people having no idea how to care for their machine, and in rarer cases hitting some weirdly specific drive/card compatibility stuff.
That’s extremely interesting.
I have an RX 6600 and I’m barely getting 60 fps indoors on low. Hm, maybe my CPU is the issue, but I’m well above the minimum requirements.
The difference is probably largely raytracing, though. I’d expect most of the cards that are struggling with the game to absolutely demolish it just by turning RT down.
I’m not home right now, but I’ll follow up with more formal testing either tomorrow or the next day. I’m very interested in why such similar machines get completely different performance.
The machine in question is sporting a Ryzen 3600, for discussion’s sake.
Oh, that’s interesting. I am also using an R5 3600 lol.
Indoors, my performance is fine on medium it seems. It doesn’t change much from the 60fps I get on low. But outdoors it is rough. Even on low.
Edit: hah, either I did it and forgot, or the game defaulted, to a 60 fps frame cap, which explains part of it. I actually get more like 70-80 fps with occasional stutters when indoors. Also, for the sake of discussion, I’m on the Xbox Game Pass for PC version which, from prior experience, might be different in entirely random ways.
Good job team