The tests are conducted absolutely incorrectly. Now, I'm not saying that the 5900HX would win otherwise (because it wouldn't), but you have 2 MAJOR flaws. 1) The RAM. What does "RAM 2x8GB" even mean? What's the frequency? What are the timings? What is the module rank? What RAM chips are used? 2) The laptops. The tested models are literally among the worst on the market, and this is very noticeable from the GPU clocks and the fact that they're literally POWER THROTTLING. Testing Games, I love your tests and I do hope you keep making these videos, but this one is a huge L from me. Please improve.
Not the first time... I'm seeing more and more people blindly following his benchmarks too now, and it's not even about Intel vs AMD; on both sides I saw numbers and setups that didn't make sense. The GPU testing is better, but the CPU testing, ouch.
@Gamers Lab Absolutely not. This issue is prevalent enough that techtubers like Linus have pointed out that you can easily lose 25% performance due to a bad or suboptimal memory config. I'd predict that if you swapped the RAM between the laptops, the 5900HX would win about as often as the 11800H, because they should be on par in performance once the memory is made the same. And one last point: trying to save money on the wrong thing can force you to upgrade sooner rather than later, like the quote "a cheapskate pays twice" (paraphrased).
@Gamers Lab Proof that you don't know what you are talking about: RAM speed, CAS latency, and which module it is can really change performance. Even more so with laptop modules, since sticks with a less dense PCB (fewer actual memory chips on the stick) are proven to hurt certain laptop models while others perform better.
I'm surprised the 11800H is faster in Cyberpunk even though the 5900HX's GPU is getting 10 more watts. Also, the frametime graph is much more consistent on the Intel system in RDR2.
This could also be a problem with the RAM modules. Most AMD devices come with 1Rx16 RAM modules, which are noticeably slower than 1Rx8 modules. Just saying tho.
It's about dual channel. AMD laptops come with single RAM sticks, which hurts performance. If you put them up against each other with the same dual-rank RAM sticks, performance will be the same, bro.
@@arslanmenliyev3554 that's true. Even when both are in dual channel at the same frequency, the memory banks in the RAM sticks will give a noticeable boost in fps. 1Rx16 RAM sticks have 2 memory banks, which is slow and bottlenecks the CPU, but 1Rx8 RAM sticks have 4 memory banks, so those modules can get much better performance out of the CPU while gaming. I have tested this and got around a 10% fps boost on my Ryzen 7 4800H + RX 5600M gaming laptop.
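The bank effect described above can be pictured with a toy simulation (this is not real DDR4 timing; the 2-vs-4 bank counts and the 4-cycle busy time are just illustrative assumptions): accesses interleave round-robin across banks, each access ties its bank up for a few cycles, so fewer banks means more stalls.

```python
def simulate(num_banks, busy_cycles=4, accesses=64):
    """Toy model: issue one access per cycle, round-robin across banks;
    stall while the target bank is still busy from its previous access."""
    bank_free_at = [0] * num_banks       # cycle when each bank becomes free
    cycle = 0
    for i in range(accesses):
        bank = i % num_banks
        cycle = max(cycle, bank_free_at[bank])  # wait if the bank is busy
        bank_free_at[bank] = cycle + busy_cycles
        cycle += 1                              # one issue slot per cycle
    return cycle

# fewer banks -> more stalls for the same access stream
print(simulate(2))   # a "2 banks" stick in this toy model -> 126 cycles
print(simulate(4))   # a "4 banks" stick -> 64 cycles, no stalls at all
```

In this toy, with 4 banks each bank is exactly free again by the time it is reused, so the stream runs stall-free, while 2 banks roughly double the total time; real DRAM is far more complicated, but the intuition about bank-level parallelism is the same.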
@@Btech94 Only at low settings will the CPU show a difference; at ultra it will be the same. The only fact is that Intel CPUs are way better in games, as you can see. Even the i5-11400H is better than the 5900HX.
@@schweiserschweiser3397 that's kinda true. But in Red Dead Redemption 2 at the Hardware Unboxed ultra lighting settings, the built-in benchmark averaged around 56 fps on 1Rx16 memory. When I upgraded my RAM modules to 1Rx8, the same ultra lighting settings boosted it to a 67 fps average. The GPU was at 100% both times. Need for Speed Heat gave the same result too.
@@naoton201 I do see it; however, FPS is bottlenecked mainly by the wattage of the GPUs and by Optimus. So for a fair comparison, using FPS as the primary indicator should account for those two factors above all else. Once those external factors are resolved, the CPU comparison will show the true picture.
Without the timings and RAM type, the comparison is meaningless. It would have to be the same type of RAM for the benchmark to be comparable. Similarly, the GPU power consumption may be different.
You put the wrong laptop CPU reference in the description. Also, I think there is something off with your benchmarks. I have the same laptop, an MSI GP66 with an 11800H and a 3080, and my GPU power consumption and utilization are higher and my CPU power consumption and utilization are lower in every other game. It is weird that in many of the games in your video the GPU is not fully utilized and consumes 100-120 W, while the 3080 is rated at 125-140 W in MSI's spec and never dips below 125 W while I am gaming.
Do you know if either of those has a mux switch? And could you let us know what wattage those RTX 3080s are rated for? In the laptop segment they can go from 100-160 W, which makes a huge difference.
@@malware5674 Would you mind explaining EXACTLY why you think we are fanboys? Point out your reasoning PRECISELY instead of going around accusing people and staying ignorant and arrogant on this topic.
I don't think every laptop necessarily has the type of RAM where, if you swap the sticks, you get better performance on Ryzen. But yes, I do agree that the Intel one has a mux switch, so this isn't really a fair test. Sorry if I am wrong about the RAM part.
Look at the frequency of the CPUs. The R9 5900HX should run at 4.3 GHz on all cores, while in this test it runs at 3.7 GHz most of the time. So it's no surprise Intel won here LOL
Not a valid comparison. The R9's CPU clocks are lowered. Most likely the author of the video was either paid off or deliberately didn't want to compare the two laptops on equal specs.
As a fan of Ryzen, I'm not surprised. The 11800H is a magic CPU, and the laptops using it are probably the best damn choice. It can even survive STELLARIS in the endgame... if you understand what I'm talking about...
@@melonhuskantilgbt The R7 5800X is better than the 10900K and the 11900K, and not just in gaming but in productivity as well, for $100 less. Go watch LTT's reviews of the Zen 3 architecture and you'll see what I mean.
The MSI GP66 has a mux switch, so not a fair comparison, I guess 🙄 The ROG Strix Scar with the Ryzen 9 5900HX doesn't have a mux, so did you use an external monitor to bypass Optimus? Edit: I played RDR2 with exactly the same graphics settings on my Lenovo Legion 5 Pro with an RTX 3070 and a Ryzen 7 5800H (which has a mux) and got 90 fps average (the ROG Strix Scar with a 3080 got 86 🤷), which clearly shows the advantage of a mux switch. Also, I have 16 GB of 1Rx16 RAM, not the good 1Rx8, though my RAM has good timings compared to Jarrod's.
@@abhishekrajchauhan3967 As far as I know, a mux switch switches between the integrated graphics and the dGPU to save power, and it changes according to the application: for Chrome it will use the integrated graphics, but for games it will use the dGPU.
@@Droon_Jadhav You're right that the dGPU will be used for games, but the thing is, when running a game the frames may be generated on the dGPU, but they're first sent through the iGPU before reaching the screen. This means that in many scenarios the iGPU acts as a literal bottleneck, and this is why being able to disable Optimus results in a performance improvement in games.
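A rough way to picture the Optimus overhead described above (a toy model, not measured data; the 0.5 ms copy cost is just an assumed number): every frame pays an extra copy through the iGPU before it reaches the screen, which shaves a few fps off an otherwise fast render path.

```python
def fps_with_optimus(render_fps, copy_ms):
    """Toy model: each frame pays a fixed extra copy cost
    through the iGPU on its way to the display."""
    frame_ms = 1000.0 / render_fps       # time to render one frame
    return 1000.0 / (frame_ms + copy_ms)

# e.g. a 90 fps render path with an assumed 0.5 ms iGPU copy per frame
print(round(fps_with_optimus(90, 0.5), 1))  # ~86.1 fps
```

In this toy model even a sub-millisecond per-frame copy turns 90 fps into roughly 86, the same ballpark as the 90-vs-86 Legion/Scar numbers mentioned earlier in the thread, though real Optimus overhead varies a lot by game and hardware.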
@@abhishekrajchauhan3967 Oh, thanks for sharing this knowledge, I didn't know that. I thought it would normally render from the CPU first and then the dGPU. Thanks!
@@gorabari5754 Tiger Lake also supports more PCIe lanes, and it's Gen 4 vs Gen 3 on Zen 3. There's no point buying Zen 3 unless it's an APU like the Ryzen 7 5800U.
In this video the Ryzen 9 5900HX works at 3750 MHz or less when it can reach 4600 MHz; the one in the video has had its frequencies lowered on purpose, while the i7 sits at a constant 4200 MHz. If the Ryzen were set to 4200 or more, the i7 would score the same or less.
@@daemonx867 The laptop with the AMD processor doesn't have a mux switch, but the one with the Intel chip does, and I don't think an external monitor was used in this comparison.
@@daemonx867 It's not about the CPU this time, but about the RAM. Switch the RAM between the 2 laptops and the results will flip too. This is a documented case. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-R7CO9v9rpOk.html
Bro, how do you get that fps in CS:GO with the 5900HX? I have a G15 Advantage with a 5900HX and RX 6800M, an external monitor via USB-C DisplayPort, and all the settings on max power, and I get 310 fps on the CS:GO benchmark map at 1080p low, with all drivers up to date from the ASUS website. Please help, I have a bottleneck. My Time Spy score is good, 11900 combined and 12225 graphics, and I upgraded the RAM. I have no idea what else to do.
It's so weird though: in some games there is barely any difference, in others it's like 20-30 fps, and in CS:GO, where the fps are highest, so the difference should be biggest, there's a 5 fps difference. How does this work? Optimizations? Temps? I don't get it.
Bro, if you are using the ASUS ROG Strix Scar 15 or 17, then use an external monitor and then do the benchmarking, since the MSI has the option to disable Optimus. Do try it, then see the power of the 5900HX.
@@sicka9048 If I have to dedicate some place at home for a screen/keyboard/mouse/etc., I'd rather put my desktop PC there with a true 3090, and use my laptop on the go.
These are some very interesting results. Even though they're different laptops, the GPU power draw seems to be roughly the same, so a good selection of laptops there. I have seen Ryzen CPUs be more efficient in most benchmarks, but these results show Intel in the lead. It could be due to the different cooling systems, but they're nowhere near thermal throttling. I really like how you sync everything, please keep doing laptop benchmarks ;D
@@TRX25EX Actually I'm an Intel fanboy, but the Ryzen wasn't working properly, so it isn't a fair comparison. I still think Intel would beat it regardless.
Wow, no way is it going to support 2K ultrawide at high settings in AAA games. I think I'll get a PC with a 3060 Ti, and if GPU prices keep falling, I'll replace the GPU.
@@asindiangamer6238 There's more L2 and L3 cache and higher clock speeds. Tiger Lake is a beast that makes desktop Zen 2 and Comet Lake look stupid in comparison.
For everyone out there wondering about the mux switch: I don't know about the MSI GP66 (Ryzen CPU), but the ASUS Scar 15 (i7 CPU) doesn't have a mux switch. So IMO the i7 CPUs are good to go if you are buying a laptop just for gaming and don't care about battery, price, and overclocking. If you are buying a laptop for other purposes, i.e. video rendering etc., I suggest an AMD CPU. The laptops I mentioned above are the ones tested in this video, as you can see in the description.
Loved the video, it makes sense... what doesn't make sense is spending another 1000 for a 3080... a 3060 is more than enough, and I don't get why people max games.
I agree that a 3060 is fine for a laptop over the price for a 3080. People max games because they like the eye candy. My friend has a full size pc with a 3090 and Cyberpunk looks absolutely amazing all maxed out with ray tracing on.
@@theplayerofus319 It's true 6 GB can be a bottleneck, but no game is over 6 GB of VRAM usage on high settings, only like 3 games on ultra textures... lowering one or two settings to high is epic.
It's only an unfair advantage in that the 5000H series is on PCIe 3.0 while 11th gen has 4.0, which helps make Intel the winner but makes a like-for-like comparison disappointing.
The Ryzen 7 and 9 series are actually made for gaming. I tested a laptop with the R9 5900HX: mind-blowing. RDR runs like on a PlayStation, at full quality and with no stuttering.
@@sergeivolkov8770 You just haven't used the 11800 much. I have two laptops, one with a 5800 and one with an 11800. Developers don't want to optimize their creations for AMD. All these clock jumps, FPS jumps. In the same mode, Apex loads the processor, mind you, up to 1.1. I haven't noticed this in other games yet. To get Apex working normally you have to boost performance on the laptops. Who the hell needs that hassle?)