EA is digging its own grave :) well done EA boys :D :D :D Lazy and scummy? :) Nice, nice, as expected, usual style: blame those who put food on your table :D Such a "SMART" move in these days :D Not at all short-sighted :D They'll fix it anyhow, strange statement though.
Reminds me of Apple claiming the failure of their butterfly keyboard was down to "user error" or "not spraying compressed air into the mechanism properly".
AAA studios switched to supporting only the newest consoles, which have about 12GB of VRAM, and consoles right now have record sales. No one will care that a company like Nvidia is giving people GPUs with such a limited amount of VRAM in 2023. This is planned obsolescence. An $800 4070 Ti with 12GB is a joke, but the $400+ 4060 Ti (about as fast as the last-gen 3070) with just 8GB of VRAM is just an insult in 2023. Sure, we can blame devs for poor optimization, but we really should blame Nvidia that cards have worse specs than consoles while the prices went through the roof.
@@tomtomkowski7653 Have you seen any recent Steam survey? People don't have the latest graphics cards, not even last gen. People mostly have GTX 1060s, 1650s, etc. Nvidia's lack of VRAM is bad, they're a greedy company, no doubt. But devs making most players' hardware obsolete is another issue. Not even mentioning the CPU requirements lately, when an R5 3600 paired with Nvidia's driver overhead can feel obsolete.
Aside from optimisation and menu difficulties (which I suspect will eventually get fixed), the DRM is a killer for me. I have a PC and a laptop, and not being able to play the game on both of these interchangeably makes it a non-starter. At least until I get bored with other releases and this goes on sale.
He said he would have to wait 24 hours after 5 hardware changes. So you would have to switch 5 times between your devices in one day to finally be "forced" to take a break and go to sleep or something.
I blame it on the people. My wife has the full Sims 2, Sims 3 and Sims 4 collections (over $1500) and still complains about how bad they are after every pack she buys.
@Menti Capti If it didn't play well on an all-AMD system that is wildly better specced than the all-AMD consoles, someone would have had to go out of their way to F things up.
EA being garbage as usual.... Blaming players, DRM locking, etc... I might look into playing this game after 10 years or something... It looked so good, but seriously, I don't know if I want to support EA in any way with how they conduct their business. Edit: Also, nearly forgot to mention: thanks heaps for another analysis in such a short time, and for doing some weird workaround to provide viewers/gamers with the information they need. Fine work indeed.
As an artist, I can't stand how piracy ruined the ability to make a living in my industry. That being said, sail the high seas for EA or any other shit company. They need to feel it in their wallets or they won't change.
Just wait until Empress or someone takes Denuvo out. The original Jedi game runs significantly better without it, so much so that many paying players use the pirated version.
Am I the only one who thinks it's insane that we need in-depth optimization guides to run games in HD on GPUs that cost $800+ AUD in my country? Why can't devs just wait and release games as polished as they are on next-gen consoles, for 99% of games?
They are killing their own sales by doing this. They are training their customers to never pay full price but to instead wait 2 years and buy at an extreme discount when the bugs are all fixed.
They used to do this until about 10 years ago. Now we have these stupid crap releases over and over and over again. It's not getting better either while pre-orders are a thing. People throw their money at a company because "the game looks great" in pre-rendered footage. The companies win every single time.
In-depth optimization guide: "select the high preset". Wow, that was complicated, I think I need to lie down... When will people finally understand that they are allowed to play on not-completely-maxed settings? And that's not new either. When Crysis came out there wasn't a single card that could play the game at max settings and 1920x1200. Result: the game still looked amazing 10 years later, when powerful GPUs could actually run it maxed out and at high resolutions. Do you want 2015 graphics at "ultra", or do you actually want devs to push the envelope and take advantage of new tech? There were 8GB 290Xs way back in 2014, we finally don't have to drag along the 2014 consoles anymore, and all everyone does is bitch about not being able to max out the game on 3-year-old midrange hardware... hopeless.
Wouldn't those "cutting-edge, multi-threaded chipsets" more likely refer to non-uniform setups like the 7950X3D, or Intel's CPUs with both P- and E-cores - situations where latest scheduler optimizations could be a factor? Just a thought. (I was half expecting a CPU swap to the dual-CCD Ryzen X3D for further OS comparison.)
I'd test the new Intel CPUs with E-cores on both Windows 10 and 11 to see if that is what EA was referring to. It is EA's fault for not specifying exactly which setups they mean, but new-gen Intel would be my guess, as Windows 10 was not made with those in mind.
I wonder if the DRM affects performance. Remember how Denuvo greatly hurt(s) gaming performance in many titles? To the point where pirated versions ran like 25% better. Also, when developers finally removed Denuvo, performance went way up.
Someone in the other video did mention something about it. As I understand it (I don't know if it's really exactly this), Denuvo waits for decryption to happen while everything else also waits for the "security" check to finish before being processed further, stalling EVERYTHING in the game!
Interestingly, your GPU usage was 99% most of the time, which was not the case in most of the other tests I've seen, but then again almost none of them were using the 7000-series X3D chips, so maybe that helps to unlock the GPU usage. Just a theory.
I constantly have my GPU usage pegged at 100%. I have a 5800X3D. I did think it was odd that the DF review of this game omitted these CPUs and went only with a 3600X and last-gen Intel 12900K.
I wonder if it's a storage bottleneck more than a CPU bottleneck. I could see being on a fast NVMe 4.0 drive being a difference-maker, especially since that is what the PS5 has.
Fantastic video Steve - really hit all the crucial points. I was frankly surprised to see how well 8GB cards were doing. Hopefully they fix CPU performance soon (my 5900X should be sufficient for my 3070) as the game does look quite good, generally.
This is how 8GB cards should play, at least for 2 more years. I don't expect 4K with RT from them, just 1080p high. While Respawn fucked up the CPU optimization, at least the GPU optimization looks good, unlike other games released this year, except Atomic Heart, Returnal and Dead Island 2.
@@hassosigbjoernson5738 He even says in the video that 8GB did well without any issues or missing textures. Games should utilize the GPU's VRAM efficiently, just like when you add more RAM to a PC and see that Windows uses more of it than if you had less. It's possible that's what this game does with VRAM amounts. None of us are the developers, so we don't know what's going on. All we know is that it's completely playable compared to 16GB of VRAM, when the haters said 8GB of VRAM would never work🤷
@@danielpindell1267 It will work by not loading something, or by loading some of the textures at a lower resolution. Windows does the same: it likes to have at least 16GB of RAM in the device, but it can work with 8 as well, just not every function of the OS will work perfectly.
It's not as linear as people think. On top of not supplying enough memory to the 4070, the stingy thing Nvidia did was cap the memory bus on the 4070 at 192-bit vs the 3070's 256-bit bus. I have a 3070 which I'm replacing with an AMD card, and it has performed very well up until this point. Any game I've thrown at it, ray tracing or not, is very playable with the right combination of settings, even at 4K. An undervolt ekes out more performance. Almost all titles have run at 60 FPS+ and look nice. When you have to lower settings, it usually doesn't break the fidelity of the game. It's last-gen tech, but for the gen, it's still not a bad option.
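(For anyone who wants to sanity-check the bus-width point: peak memory bandwidth is just bus width times per-pin data rate, and the 4070's faster GDDR6X plus its big L2 cache partly offset the narrower bus. A quick back-of-envelope sketch in Python; the spec numbers are the commonly published ones, treat them as approximate.)

```python
# Peak VRAM bandwidth: (bus width in bits / 8 bits per byte) * per-pin rate in Gbps = GB/s.
def bandwidth_gbs(bus_bits: int, rate_gbps: float) -> float:
    return bus_bits / 8 * rate_gbps

print("RTX 3070 (256-bit GDDR6  @ 14 Gbps):", bandwidth_gbs(256, 14), "GB/s")  # 448.0 GB/s
print("RTX 4070 (192-bit GDDR6X @ 21 Gbps):", bandwidth_gbs(192, 21), "GB/s")  # 504.0 GB/s
```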
Regarding Windows 10 vs. 11: I think the 7800X3D is a very unproblematic CPU in terms of scheduling. If there is a problem with Windows 10, I expect it to appear on either the 7950X3D or ADL/RPL; scheduling gets a bit more complicated with two CCDs or E-cores.
I'd be more interested in CPU benchmarks honestly. Not everyone can afford a 7800X3D. I feel anyone with an AMD 5000-series or 12th-gen Intel should have no issues.
I wonder how the 5800X3D compares. While it is still $300, it has zero additional cost for basically any AM4 system; with its low TDP, all AM4 boards can handle the chip. $300 is much cheaper than any other upgrade involving a new CPU and new motherboard, not to mention the times that also triggers new RAM.
@@Hardwareunboxed I'm also waiting to see if they fix the CPU usage before buying it. I'm afraid my i7 12700 will be a limiting factor here despite having an RX 7900 XTX. I just don't have much hope of improvement; they didn't really fix much in the last game on PC.
You're not gonna talk about 3:55?? AMD using 2GB more VRAM while the 3070 isn't even maxed out.. Also the 6800 uses less RAM, which likely means you're running the 6800 at a lower resolution or lower settings...
The system requirements of the game say the recommended CPUs are the Intel Core i5 11600K and Ryzen 5 5600X (the CPU I have). Could you test the game with these CPUs to see what kind of performance you get?
I think EA's Windows 10 comments were aimed at Intel's P-core/E-core designs, so that would have been the place to check the OS difference. I'm NOT defending EA, but it would have been nice to see this put to rest. Testing on a system with uniform cores won't put it to rest.
I agree. EA pointing to it being a Win 11 vs. Win 10 problem, plus CPU utilization being a problem, plus the cutter with an Intel CPU having terrible frame stutters: all of this hints at Intel CPUs with E-cores having problems. It would have been interesting to see a comparison with Intel CPUs.
Agreed, I got the same feeling when I saw Steve only testing the 7800X3D on Windows 10 and not the 13600K/13900K as well. I hope to see this in a future video, but it may be too late by then, since patching may resolve the CPU issues.
A member on the Guru3D forums found that having a page file on the same drive the game is installed on reduced the game's stutters by a big margin. It could be that people with several SSDs, with the game installed on a different drive from the one with the pagefile enabled, are having performance issues as data is shifted from one SSD to another. So the solution is to also have a pagefile on the drive where the game is installed.
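(If you want to check whether the game is actually leaning on the pagefile before moving things around, here's a minimal monitoring sketch, assuming Python with the psutil package installed; it doesn't fix anything, it just shows pagefile pressure while you play.)

```python
# Print RAM and pagefile usage every 5 seconds; run it in the background while gaming.
import time

import psutil  # pip install psutil

while True:
    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()  # on Windows this reports the pagefile
    print(f"RAM used: {vm.percent:4.1f}% | "
          f"pagefile used: {sw.used / 2**30:5.2f} GiB ({sw.percent:4.1f}%)")
    time.sleep(5)
```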
A lot of newer PC ports seem to have issues when the paging file isn't enabled. H:ZD would crash randomly on my 7900 XT until I enabled the paging file on my D: drive. Awesome advice!
Regarding the comment at 0:50, the game also runs OK on my system and I'm having a blast playing it. I'm getting between 60-70 FPS playing in 3440x1440 with details on Epic, Raytracing disabled and FSR set to quality. System in my case is a Ryzen 5700x, 32 Gigs of DDR4 3600 CL14 and a Radeon RX 6750 XT.
RT only applies if you enable it in-game; if you enabled RT in the menu and then loaded the game, RT will not take effect. The 80+ FPS results with RT don't look too reliable. As Daniel showed, such FPS is typical for a 7800X3D without RT; with RT it is in the range of 50-60 frames. Perhaps the difference in the test scene matters. But it's still interesting to find out whether RT was actually active in the test: it could appear to be turned on without actually working.
Isn't the statement just calling out that Intel's big.LITTLE architecture "requires" the Windows 11 scheduler? You said it yourself: the 13900K had problems, and all the "big cores only" CPUs did not. Why not test a 12th/13th-gen CPU on both W10 and W11 to accentuate the problem EA's statement was addressing? Of course, this doesn't excuse the DRM, menus, and poor CPU usage.
You have to wait a year after launch for EA to fix games. Look at all the Battlefield games: they were ALL terrible at launch, and one year later they're pretty decent games.
More of these titles are to come in 2023, and 8GB of VRAM is becoming obsolete. Crazy to think the RTX 3070 is a two-year-old upper-mid-range card and is already struggling to deliver smooth gameplay.
@TokyoBlues Cyberpunk 2077 was path tracing, but the environments aren't as detailed as in games like TLOU and Hogwarts Legacy, and detail is what eats VRAM the most. It's not a surprise though: if a game was made for consoles with 13.5GB of available VRAM, then it's not surprising that you can't match some settings.
The RTX 3070 was never in the same performance tier as the RX 6800; the RX 6800 should be compared with the RTX 3070 Ti. When I check eBay in my country, those two sell for the same price, while the 3070 sells for significantly less than the RX 6800.
With my 6900 XT and a 5800X3D, I have generally had above 13GB of VRAM usage, with a not-rare maximum of nearly 15GB. I also get a consistent 100% GPU usage, and consistently above 20GB of regular RAM usage. My settings are mostly 4K epic with some settings dialed back, plus ray tracing. I generally stay above 60fps in more closed areas, though my fps drops dramatically in the more open areas of Koboh and I have to turn off ray tracing.
I'm curious about Intel 12th and 13th gen CPUs on Windows 10. The only reason I wonder is because when I was on Windows 10 and upgraded to my 12700K, it completely broke Assassin's Creed Origins. The problem was solved by upgrading to Windows 11, and it was buttery smooth. Some games have known issues between P-core/E-core scheduling and Windows 10 optimizations. And Alex from Digital Foundry used a 12900K. Hmm.
The patch they released today actually improves the performance significantly. I did a side by side comparison of the beginning of the game comparing the day one version and with patch 3.5. GPU usage is night and day difference. Still not perfect but a step in the right direction for sure.
Why have you again compared GPUs from different tiers? The 6800's direct competition was the 3080, not the 3070. The 3070 should be compared against the 6700.
That Nvidia driver CPU overhead was crazy, with the RX 6800 going like 30% higher! Both GPUs delivered the same performance at 1080p and 1440p, so it's CPU-limited, but the 3070 did so much worse, wow. I wonder what causes this very significant driver overhead? It's interesting to see AMD's drivers being that much more CPU-efficient. And they say AMD has bad software.. Hmm. The game has also been running well on my 5800X3D and 6800 XT on High with no RT. A few stutters, but honestly I can't complain.
They say AMD drivers are bad because they crash all the time. Nvidia drivers are more stable, but, well, just look at the Nvidia Control Panel: it has looked the same since Windows XP. And you can tune AMD GPUs without Afterburner.
@@gamtax On that point, the tuning for AMD GPUs is apparently too limited compared to Afterburner, is what's being said about that. I honestly don't know, but on the NVCP's look you seem to be off by a couple of years in its favour; I wouldn't give it any more than a Windows-95 kind of design 😂
@@gamtax The control panel + Afterburner is a better combo than Adrenalin tbh. AMD GPUs need tuning out of the box, and it's always the same "let me see what works without bricking my card"... not everybody is into that.
Try testing on a CPU with E-cores and P-cores. I suspect the engine is not allocating workloads to the appropriate cores, which would explain why your experience was much smoother than those reported by owners of very modern Intel CPUs.
@@speggy_merball EA claims that the bad performance is due to using these CPUs on windows 10, which can't control the E and P cores as efficiently. I assumed that's what you were talking about.
I never mentioned windows versions. Read my comment again, slowly. I said exactly what I meant; no more, no less. Assuming people mean more than what they say is a surefire recipe for unnecessary problems.
Windows 10 would be a potential problem if you're using an Intel 12th or 13th gen processor, because it will assign threads to the E-cores. You can probably get better performance by setting affinity to force the game onto the performance cores only.
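(Task Manager's "Set affinity" does this by hand; as a rough illustration, here's a hedged Python sketch using psutil. The process name and the assumption that logical CPUs 0-15 are the P-cores are mine; check your own topology first.)

```python
# Pin the game to P-cores only (assumes psutil; run from an elevated prompt).
import psutil  # pip install psutil

P_CORE_THREADS = list(range(16))   # assumption: 8 P-cores with HT = logical CPUs 0-15
GAME_EXE = "JediSurvivor.exe"      # assumed process name, verify in Task Manager

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(P_CORE_THREADS)  # exclude the E-cores
        print(f"Pinned PID {proc.pid} to logical CPUs 0-15")
```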
I've really noticed that these Unreal Engine games (at least on the new Intel socket) don't like DDR5 with latency over CL36, with my 13900KS. I had to tighten the timings on my CL38 kit (6400 MT/s), and all my stutters in a wide range of games died down to a micro-stutter here and there.
Runs like garbage on Windows 11 too, and I'm running a 7950X, DDR5-6000 and a 4090. There is absolutely ZERO reason for a game made with Unreal Engine 4 to run this poorly except poor optimization and a garbage DRM that tends to break any game it's tied to (talking about you, Denuvo). Take Hogwarts Legacy, for example: it ran like crap too, but there were reports of people who pirated the game getting way better performance with Denuvo disabled than the retail version.
People are panicking because they "only" have 8GB of VRAM in 2023. And they are panicking without actually trying to game on 8GB cards without the Ultra preset, and doing the unthinkable of lowering the textures to high.
@@nimrodery I too have 8GB of VRAM. I have yet to lower anything other than textures from ultra to high to play TW Warhammer III, Cyberpunk 2077 or Witcher 3 EE. I haven't played this Jedi game yet, so I can't talk about it.
I have a 5600X and an RTX 2060 with 32GB of RAM, and I'm getting 45-55 fps on medium settings. CPU usage is limited to 45%. I get 90fps in Cyberpunk 2077 at high settings, and 60 fps with RT on.
When a game runs better on Intel+Nvidia hardware, but is poorly optimized for AMD, it's never looked at as being a problem. The second it runs bad on Intel or Nvidia hardware then it's the end of the world. Also, the issues with the 4090 could be partially Nvidia driver related too. I feel there's a double standard when it comes to games performing better on AMD or lacking Nvidia's crutches like DLSS. I personally have a 7700X and a 4090 so it would be nice to see this game work well on my hardware combination but I feel that we're missing something in this conversation.
Nvidia GPUs are VASTLY more popular. Even with how decent current AMD GPUs are, Nvidia has a much larger share of the market. So when a game runs poorly on the less popular hardware, it's terrible, but somewhat understandable. But when a game runs terribly on the more popular hardware, that's another level of not caring.
In regard to Windows 10 vs 11 at 10:00, I think you'd potentially find a different result with a big.LITTLE architecture like 12th/13th-gen Intel (an i5 probably being the worst), or maybe even the dual-CCD 3D CPUs.
He probably just wanted to show that this was a lame blanket excuse as most people having such cutting-edge systems probably know that Windows 11 does schedule a bit better for them generally & a lot in rare edge-cases. Without further context from EA this sounds like a "you're holding it wrong" (iykyk).
That's how I feel about the 4070. I had a 7900 XTX and it was a hunk of junk: driver timeouts, weird, inconsistent frame rates depending on what game you're playing, the Adrenalin software not wanting to load while you're in a game, just very odd, annoying things. I sold that piece of junk, bought a 4070, and it runs so nice.
The first game, Fallen Order, was developed with Nvidia as one of the partners. On the other hand, AMD worked with Respawn on Survivor. Could explain at least a piece of the puzzle. (yes, the first game was also not great on release, but not as bad as this one)
I have "same" result as 3070 but better 1% with 6700XT:) And i havent any issue, crashed, glitches, bugs etc. It work very well for me. But yeah, GPU sometimes is only 85% and CPU is sleeping (5700X):)
@Blue thanks bud, I know that. But I'm not even talking about PC here. But surely you saw that because it's impossible to miss for anyone who is able to read.
For me, on a 7900 XT and a 5800X3D, it ran fairly poorly, with loads of stuttering (on Koboh, not Coruscant). Also, I was getting extremely blurry visuals with FSR disabled but sharp textures with FSR on, which is the opposite of what one would expect (plus the ghosting issues of FSR, of course).
For some people, good is around 60fps at 1440p; for some, it's nothing less than 4K with ultra settings. But I think if you've got a 4090, it's fair to expect 4K ultra 60fps or something very close to it. I've had such situations myself, where a lot of people were complaining about game-breaking bugs and bad performance and I got a pretty good experience.
And a lot of console players aren't too bothered to play this game in the 30 fps quality mode while it would be considered unplayable and unacceptable on PC.
@Kathleen Delcourt A different case of expectations, I suppose, albeit the Xbox is getting shafted in this instance. You pay more for a PC on average compared to the more cost-effective console (PC parts get rather pricey next to the cost-cutting consoles can do by selling at a loss to get you into their ecosystem, where continuous transactions keep rolling in).
@@youravghuman5231 I meant to respond to those who say that the game runs better on consoles. It doesn't, it's just that standards and perceptions are different.
Perhaps "cutting-edge multithreaded chipsets designed for windows 11" refers to Intel chips with P/E cores (12th gen, 13th gen)... because there is a struggle with some applications utilizing the right cores for the right jobs at the right times on windows 10 (heck even on 11)... maybe they didn't want to call out Intel directly?
Regardless of the specifics of this portion, the overall intent of this statement was to point out that from EA's perspective, an insignificant portion were making a lot of noise about the PC port. It's incredibly disingenuous
Weirdly the game runs completely fine for me as well. 5800x, 6800xt, 32GB 3600 CL16 RAM. Obviously other people are having issues but not me for some reason. Getting 60+ 90% of the time, it rarely dips to around 40 but not often.
They're probably going to be fixing this game for the next 3 months minimum, if not more. I remember EA's first big mishap was Battlefield 4, and ever since then they've stuck with this terrible business practice of not optimizing anything. At this point there's no excuse left, really: the hardware on consoles and PCs doesn't differ as much as it did in the PS3/360 era, it's all a lot more standardized now, and yet terribly performing games keep coming out. I've also heard rumors of an 8GB 4060; that'll be really interesting...
Trespasser, Crysis. They weren't very playable at launch because the requirements were higher than what you could buy at the time; about a year or two later they were. It isn't a new issue. I remember it was fun reinstalling those games again and again with new upgrades to see the difference. Finally getting to 1024x768 or 768p at 60fps was amazing, but certainly never possible at launch.
It's like watching a phenomenal movie on my OC'd 7900 XTX with everything maxed out at native 4K, ~60fps average. It's really 50-70fps, but VRR on the OLED makes this look absolutely stunning. That said, it is using over 23GB of VRAM and 31GB+ of system RAM, so a very fast NVMe drive and a high-powered CPU are also recommended. "Other people's problems" system requirements aside, this is my GOTY. Play it any way you can!!
@@ScepticGinger89 Maybe that logic only works in Win 11. Maybe in Win 10, Windows uses all cores for gaming, which hurts fps since some cores are slower.
@@ScepticGinger89 As @Mad Tech says, the issue was with Windows 10 not managing the cores effectively. Iirc, Steve, GN and others went over all this, back at the time of Intel launching their first mixed core CPU. Which is why I was surprised that Steve didn't cover that issue, with respect to performance in Survivor. Anyway, it was probably a workload induced oversight or maybe it's not even an issue. Regardless, hopefully Steve will address it, sometime in the future.
That Win10 claim (and driver claim) is valid, since depending on your OS version number (not OS name), Ryzen and new Intel processors can stutter in Unreal Engine 4 and will not deliver their full performance, compared to Unreal Engine 5. Win10 1607 makes Unreal Engine 4 games stutter on new processors (some games will not even boot and spit out a CPU error). Win10 1903 can boot most Unreal Engine 4 games with decent speeds on the same hardware, but still has stutters... stutters that vanish when you move to Win10 19044 (21H2). And those stutters only happen on new hardware, ONLY IN UNREAL ENGINE 4; on Unity or other engines they don't. Many are baffled as to why. Some point to new CPU microcode support in newer Win10 versions, some to better SATA/SSD drivers; Win10 had day-one problems with slowing down drives, even slowly losing speed over time, HDDs/SSDs that work perfectly on the same hardware in Win7.
I'm surprised by how high you were able to get your fps. I've seen captures with a 7800X3D and 4090 showing 55fps even at 1080p, with the camera still in an early city area. You actually saw changes in fps with changes in graphics settings.
Could be something wrong in Nvidia's drivers for the 4090!! I've seen a lot of videos now with cards below the 4090, and the game is a hog but it runs. But they have to fix the jerkiness of the game..
Yea, I get like a 3-4 fps increase going from 1440p epic to 1080p low... I'm running a 9900K and a 2080 Ti though, so it could be that my older CPU is the problem...
@@madtech5153 I have been restarting after changing settings and it does absolutely nothing for me...I get about a 4 fps difference between 1440 epic and 1080p low...
A 5600X and RTX 3060 12GB here. FPS between 45-55 with occasional spikes to 60 @ 1440p, default settings of High and FSR set to Quality. Seems playable to me, but WTH do I know. 🤷♂
Great work Steve, but y'all really should have tested it with a less powerful CPU. It's great that the 7800X3D can handle it, but how about a person running Ryzen 3700X?
It could be due to MSFT's horrible communication when it comes to Intel's big.LITTLE architecture. They say only W11 is optimised for it, but tests show the W10 kernel got most of the optimisations too. Only in AI workloads is there a noticeable advantage on W11, due to better scheduling.
The game "runs" fine on my Intel Arc + 7800X3D system.. but there are extremely obvious memory leaks. It just happens that my 16GB VRAM card is ideal and my CPU is overkill. Once they fix the game (if they do), it will be superb to have a budget 16GB card. As it stands, avoid the game.
For me it runs fine at 4K epic with FSR quality. Yesterday I had a little hitch at the planet-selection screen in the starship (20 fps), and in one situation, a boss fight in the Koboh cave, it went down to 30 fps. Most of the time it's 56-70 fps, in some situations 80+ fps. In cutscenes it's always 46 fps, but smooth. My setup:
-Asus TUF 3080 12GB OC
-R7 5800X3D
-32GB Kingston Fury DDR4 3600MHz
-WD Black M.2
-Windows 10
The reported problems aside, it is a beautiful game and it's fun.
I enabled ReBAR last night and it made a significant difference for me, but I need to do more testing. I played for about 30 minutes, a huge boss fight, 60 frames locked the whole time. I need to test on Koboh in the town; that's where I've had the most performance issues. Curious if other players have tested with it on vs off.
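(For anyone on an NVIDIA card who wants to confirm ReBAR is actually active after flipping it in the BIOS: with ReBAR on, the BAR1 aperture that nvidia-smi reports is roughly the full VRAM size instead of the legacy 256 MiB. A small sketch, assuming nvidia-smi is on your PATH; AMD cards report this differently.)

```python
# Scrape the BAR1 aperture size out of `nvidia-smi -q` as a quick ReBAR sanity check.
import subprocess

out = subprocess.run(["nvidia-smi", "-q"], capture_output=True, text=True).stdout
in_bar1 = False
for line in out.splitlines():
    if "BAR1 Memory Usage" in line:
        in_bar1 = True                      # entered the BAR1 section of the report
    elif in_bar1 and "Total" in line:
        print("BAR1 aperture:", line.split(":", 1)[1].strip())
        break
```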
It's pretty funny: even if AMD performed 6 times better than Nvidia, the uninformed fanboy would still buy Nvidia just because it's green and looks like a fancy Mercedes-Benz. I don't care about the brand, but when a company releases better GPUs and never really moves large numbers, who is truly to blame?
Mine runs okay for the most part. It gets annoying because it'll start stuttering if I try to turn the camera too much. Running a 5700X, RX 6800, Win 11.
Thanks Steve for the review. I will wait for patches to be released and see if performance improves before purchasing. Shader compilation is the primary cause of stutter in all UE 4.x and, to some degree, UE 5.0 titles; it's discussed extensively at Digital Foundry, where they perform frame analysis including CPU utilization. One fix is for the game developer to compile shaders prior to the start of the game to help reduce stutters. Yes, stutter can also be caused by the GPU, but in this case even 8GB is enough to play this title.
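(For context on the "compile shaders up front" point: stock UE4 ships a PSO caching system that can replay recorded pipeline states during startup/loading instead of compiling them mid-gameplay. A hedged illustration of the stock cvars below; whether Respawn's build actually ships a recorded PSO cache is unknown to me.)

```ini
; DefaultEngine.ini - stock UE4 PSO cache knobs (illustrative, not Respawn's actual config)
[SystemSettings]
r.ShaderPipelineCache.Enabled=1   ; master switch for UE4's bundled PSO cache
r.ShaderPipelineCache.LogPSO=1    ; record encountered PSOs so a cache can be built
```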
It's sad 'cause the first game was optimized fairly well, the only major issues being traversal stutter due to overly aggressive loading of level segments, and the old UE4 shader compilation.. This sequel is on DX12, which should run so much better on the CPU side, yet it's so much worse. It makes no sense how they downgraded things so much, including the UI, while also not improving the traversal stuttering. And it's $70 this time. EA, c'mon..
EA never disappoints with their releases.. a $70 AAA game paired with an unoptimized port, who would have thought? Lol. Go ahead, support EA by paying $70 to be a beta tester, while others pick this up later for $25 after some patches. People complain 24/7 while pre-ordering at full price.. stop pre-ordering, especially from a bad company.
"it played ok for me" - ah thank good, i thought i was the only one... yes lowering the graphics settings doesnt change the fps, but only lowers gpu load in most cases, but im getting 40-50fps on Crouscant with nothing more than a rx 6700 xt and a 3700x with max settings 1440p and FSR Quality. 60-70fps on the planet that comes after
A few reasons not to buy from EA:
- Shitty DRM that makes the gaming experience as a pirate superior to, and more malware-free than, the official release
- Need for an EA account
- Poor optimization
- Expensive
- Is a graveyard of development studios and IPs
- Takes away games you already bought from them
Reason to buy from EA:
- They have the rights to popular IPs.
The Steam hardware requirements say 4 cores, 16 threads. There's still life in the i7-4790, but I feel the end is near. I only upgraded from a 3rd gen to a 4th for Windows 11 and DX12. Next GPU: RX 6750 XT or RTX 3070.
I would never come to the defence of EA, but my game runs surprisingly well too. Not perfect, obviously, but over 90fps consistently. There is some stutter when transitioning to/from cutscenes and in certain loading areas, but combat zones are fine. I've capped it to 80 fps for a bit more consistency. Edit: Also, I got the game free, so any issues sting less than for the fools who dropped $70.
I don't get how it is possible to push out such a mess of a game, didn't they learn from the previous Star Wars game? Anyways let's hope that with RDNA 4, RTX 5000 and Battlemage/Celestial we get a proper amount of VRAM from all the companies.
I don't know why the 3070 is always compared to the RX 6800. The 6800 has always been more expensive and was almost impossible to find for 2 years in my country. And even when it came out, it was more powerful. The performance found here makes sense and is not surprising; it's not always a VRAM story.
In the first game clips (1440p, no RT, 6800 vs 3070), the 6800 seems to stutter. Is that post-recording, or did it have low 0.1% lows? Certainly weird with such high 1% lows, but the 3070 footage did seem smoother.
EA: Recommended System Requirements, OS - Windows 10 Also EA: dOnT uSe WiNdOwS 10 🤦♂️ EA still showing why it's one of the most hated software publishers in the world.
10:47 Aren't they talking about Intel's latest with low and high state cores or whatever it's called? I thought it must be what they're referring to, seeing how that's the only thing that works different on W11 compared to W10. They're probably wrong, but I just thought that was the only thing they could be referring to.
It's funny. The game is based on the same engine as Fallen Order, with the same graphics, but "you are using the wrong Windows". Ha-ha-ha. Optimization? No, no, no! Wrong Windows! )
5800X3D/3060 12GB - plays fine for me too since day 1 on the recommended settings: high/FidelityFX at quality, 1440p. The only issue is that if I alt-tab a lot, it will develop a memory leak.
Another gem from EA and our best pal Denuvo. Sorry for the trouble Steve has to go through to test these games. Personally i would not mind one bit if he ignores them since i ain't putting that crap on my computer anyway, but i guess it's a popular title and he can't do that :(