
It's getting Even Worse... RTX 4060 Ti tested on a PCIe 3.0 System 

der8auer EN
203K subscribers
220K views

Support me on Patreon:
/ der8auer
---------------------------------------------------------
Music / Credits:
Outro:
Dylan Sitts feat. HDBeenDope - For The Record (Dylan Sitts Remix)
---------------------------------------------------------
Paid content in this video:
- /
Samples used in this video:
- RTX 4060 Ti
---------------------------------------------------------
Timestamps:
0:00 Intro
1:14 GPU upgrade?
2:37 PCIe 4.0 x8 with PCIe 3.0 CPU
3:26 3DMark Time Spy Extreme GT1
4:01 Remnant: From the Ashes - 1440p
4:19 A Plague Tale Requiem - 1080p
4:34 Battlefield 2042 - 1440p
4:52 Cyberpunk 2077 - 1080p
5:00 Cyberpunk 2077 DLSS
5:23 DLSS theme
6:40 Power consumption
7:32 Power consumption: 3060 Ti vs 4060 Ti
10:31 Price issue
11:48 Summary/Conclusion
12:28 Outro

Science

Published: 12 Jul 2024

Comments: 1.9K
@sojirou
@sojirou Год назад
Thank you for testing the 4010Ti so comprehensively, not a lot of channels would put in this kind of effort for a simple 2d display adapter.
@_mrsahem_
@_mrsahem_ Год назад
GT 4030 when
@potatoes5829
@potatoes5829 Год назад
@@_mrsahem_ nah it'll be the RT4030. gotta sell that raytracing to someone :)
@KamalSingh-ny9vw
@KamalSingh-ny9vw Год назад
NAH bruh it's GTX 4010Ti
@SinNumero_no41
@SinNumero_no41 Год назад
@@potatoes5829 lol rt 4030, what a great idea, ngreedia should hire you :)
@C0bblers
@C0bblers Год назад
@@_mrsahem_ The 75w, PCI 2.0, 12bit bus, 16gb model with the oversized cooler be the one to buy.
@Barrydick186
@Barrydick186 Год назад
Reviewers recommending the 4060 GPU for older systems have absolutely lost every bit of confidence i had in them.
@BikingChap
@BikingChap Год назад
Quite; I run a 9900K and hadn't seen this referred to elsewhere. Perhaps because everyone runs absolutely top-end hardware, they don't think of checking on older platforms.
@m-copyright
@m-copyright Год назад
@@BikingChap They all use the best cpu at the time in order to exclude bottlenecking. And while that's great because it shows what the card can do without being held back, it also has downsides like the one we see here. We still have older cpus that are still way more than capable of handling newer gpus.
@j33k83
@j33k83 Год назад
If you're into upgrading old hardware, I don't think Nvidia is the right choice.
@sketters9400
@sketters9400 Год назад
@@j33k83 agreed, upgrading old hardware is always better through AMD
@BikingChap
@BikingChap Год назад
@@m-copyright sure, I can see why they do that and it’s difficult to think of every scenario but now someone’s called it out it seems obvious. Easy to be wise after someone else has seen it I guess.
@Nelthalin
@Nelthalin Год назад
It would have been nice if results for the 4060 Ti at PCIe 4.0 x8 were also in the tables. I guess it makes quite a difference, but it would have been nice to compare them directly.
@Alberto_RiscvsCisc
@Alberto_RiscvsCisc Год назад
Nvidia paid for this test thats why ...
@Igor-Bond
@Igor-Bond Год назад
Please tell me which video card is better to buy: the RX 7600, which costs 42,599 rubles, or the RTX 4060 Ti, which costs 43,990 rubles?
@MsNyara
@MsNyara Год назад
@@Igor-Bond RX 6700 XT at 29120 rubles is the way to go, beats both cards by a generous margin.
@joefowble
@joefowble Год назад
Agree, but then you're not running 4.0 on the same intel 10th gen test bench.
@darvengames
@darvengames Год назад
@@Igor-Bond 4060 Ti
@DownwithEA1
@DownwithEA1 Год назад
Thank you! I've also been very curious about the power consumption comparisons. This was a very helpful perspective.
@soulshinobi
@soulshinobi Год назад
DLSS2 is a useful tool, but I agree with Hardware Unboxed that DLSS3 should not be included in benchmarks because it does not provide the responsive experience that the frame rate suggests. It's just a visual filter and should not be included in benchmarks at all.
@chubmouse
@chubmouse Год назад
Yep. And then there's the instances where it adds input latency as a trade-off, which is an objective gameplay experience downgrade the worse it gets.
@superpulaski9767
@superpulaski9767 Год назад
The best use cases I have for DLSS 3 with frame gen are the titles it has to be modded into, lol. I use it in my Skyrim mod list and in Elden Ring, but using it in Darktide or Cyberpunk makes the experience feel very meh. Using a 4090.
@no-barknoonan1335
@no-barknoonan1335 Год назад
@@chubmouse It always adds more latency, even if only a little. The only game it's worth it in is MS Flight Sim imo.
@alderwield9636
@alderwield9636 Год назад
b-b-but dlss is every nvidia fanboys last resort if they were cornered 😢
@TheVanillatech
@TheVanillatech Год назад
How much is Putin .... I mean, AMD paying you?!?!?!?
@JCBeastie
@JCBeastie Год назад
Nvidia really asking us to overlook the disappointing hardware because they have a software trick that makes it look better on paper.
@jacobrzeszewski6527
@jacobrzeszewski6527 Год назад
That's literally been Nvidia's thing since the beginning.
@lyonscultivars
@lyonscultivars Год назад
Well said.
@dagnisnierlins188
@dagnisnierlins188 Год назад
After today's keynote at Computex it's clear that AI and data centres are the priority for them now.
@mr.unknown4589
@mr.unknown4589 Год назад
Those nGreedia suckers who show the middle finger to customers are just to be avoided for good.
@pandemicneetbux2110
@pandemicneetbux2110 Год назад
@@jacobrzeszewski6527 Yeah, but they did at least kind of deliver before. Back when Maxwell came out it was truly one of the most efficient and most performant GPU generations, even though back then nVidia literally got sued for lying about the VRAM on the 970 (and lost; not a good look when a corpo and its fleet of attorneys can still manage to lose and not buy a legal victory). Pascal was also good, it also delivered on performance, and AMD back then just didn't; their main value was in the lower and midrange, but even then the R9 300 was just not as performant and efficient. Also, back then they didn't put their bullshit front and center: TXAA, PCSS and HBAO+ weren't being parroted in every outlet like they were some important thing to have, and nobody ever implied you should buy a GPU just for better Hairworks performance, but here we are and that's exactly what they are doing. I don't give two shits about frame generation. It also looks weird and bad upscaling, so it kind of defeats the point to enable raytracing when you run worse fps and it looks worse too thanks to DLSS. And no, idgaf that it's better now than 1.0; 1.0 was literally unplayable garbage and anyone who bought an RTX 2000 series ended up feeling cheated. The problem is they keep getting worse and worse. It's like they're drunk on their own hubris at this point, and truly expecting their loyal customers to be brainless sheep. "The more you spend the more you save", "we're like the new iPhone"... I mean, fucking really?
@wagnonforcolorado
@wagnonforcolorado Год назад
I think adding the DLSS, XeSS and FSR frame rates is useful information. At the same time, showing latency values while using these options is needed, so a consumer can decide if the latency penalty is worth the increased frame rate. Thank you for covering the PCIe 3.0 information as well. A lot of systems were built on gen 3, and a reviewer pointing out the possible bandwidth limitations helps in deciding which upgrades to make.
@Erik_Dz
@Erik_Dz Год назад
It's important to tell consumers that it's not a real frame rate increase either. They are fake frames. They are not rendered by the game but 'generated' by the GPU. It is frame 'smoothing', not actually improved FPS. This is why they should not replace the actual rendered FPS on graphs with frame generation data. For example, in the Cyberpunk 2077 graph with the added DLSS bar, an actually powerful GPU rendering at 152 fps will not only have lower latency but will look visually better than a 4060 Ti using DLSS. DLSS 3 may be better, but it is still just fake frames generated to make the game appear to have higher FPS than it actually does. It's not magically improving FPS by flipping a switch (which is what Nvidia wants people to think).
@KibitoAkuya
@KibitoAkuya Год назад
@@Erik_Dz XeSS and FSR are NOT generating frames, that's not how they work. They (basically) render the frames at a lower resolution and feed them to an algorithm that upscales them while maintaining good quality (it's a little bit more complicated than that; just upscaling images alone makes them worse, but Intel uses an AI model to try and improve the quality and AMD uses a few techniques together to do the same thing, trying to get as close as they can to what a native image would have looked like).

Neither does DLSS 3, *BUT* it has an extra option for frame interpolation, which basically uses AI to "guess" a frame in between two already generated frames. THIS is what worsens latency, because there is always one already-rendered real frame ahead of every frame you're presented (NVIDIA misleadingly markets this as "reducing latency" because frame interpolation mode forces Reflex on to mitigate that latency penalty, but you are indeed going to experience penalties if you were already using Reflex beforehand).

Now, AMD's recently unveiled FSR 3 will also feature interpolation (with their own techniques to do so), but they do admit that there will be latency penalties with frame interpolation.
@zodwraith5745
@zodwraith5745 Год назад
@@Erik_Dz While that's true with DLSS3FG, it's not true with DLSS2 or 3 minus the FG. (DLSS3 works on 20 and 30 series, it's only the frame gen aspect that nvidia gatekept) DLSS upscaling can often produce a _better_ image than native and HUB already proved that. FG is up in the air and entirely subjective if you like high fps for the visuals or the responsiveness. That depends on the types of games you play. But anyone that blindly turns off DLSS for scaling is an idiot when it _always_ gives you superior latency, and often gives you better than native visuals as long as you're using quality mode. This is where we get into the grey area of games that support DLSS but not FSR. This needs to be a note in benchmarks cause anyone that _needs_ the extra performance is ALWAYS going to enable DLSS. It's not like they're going to say "I can't enable that because it's not fair to AMD users."
@zodwraith5745
@zodwraith5745 Год назад
@@KibitoAkuya I've got a sinking feeling FSR3 is going to suck. It's taking really long to come out and they don't have dedicated hardware for it. 30 series DOES have the dedicated hardware but doesn't have as big a frame buffer as 40 series. (Nvidia's bullshit excuse) FSR3 would most likely work _better_ on 30 series than RDNA, but of course AMD wants to change direction and limit FSR3 to AMD only. It's already been theorized Nvidia could easily enable FG on 30 series but they badly wanted a new feature they could gatekeep to 40 series. Typical Nvidia. The worst part is, imagine anyone that dropped 2 freaking grand on a 3090ti to have Nvidia only 6 months later spit in your face on DLSS3. I'd be fucking pissed. It pisses me off with my 3080ti and I spent far less than MSRP for it.
@bosstowndynamics5488
@bosstowndynamics5488 Год назад
​​@@KibitoAkuya Is DLSS3 frame generation that basic? Would have thought with the fact that it requires specific support that it could do asynchronous reprojection and just fine tune the result, with reprojection being a much older technique that genuinely reduces latency (very common in VR games on mobile headsets). The only real disadvantage for AMD for "work" (you didn't specify so I'm making assumptions about which workload you have) is that it won't support CUDA software, but most client side applications don't use CUDA for hardware acceleration and AMD support in the GPGPU realm is getting better rapidly, so unless you're doing AI you *should* be good (and *might* be good even on AI - be sure to research hardware support for your intended application)
@iseeu-fp9po
@iseeu-fp9po Год назад
The PCIe 3.0 comparison is so important here. I have been wondering about that myself but have not seen any other reviewer mention this.
@Saieden
@Saieden Год назад
As an owner/custodian of two B450M Mortar systems, this is extremely relevant to me if I decide to upgrade the 3060ti and 6700xt they house at the moment.
@ManuSaraswat
@ManuSaraswat Год назад
I'm in the same boat. I built my system with a Ryzen 1700 and an Aorus B350 board; when it died, Aorus gave me a free B450 upgrade. Now I'm running a 5800X and a 1070. No way I'm buying Nvidia, because their pricing and products make no sense right now; 6700 XT it is.
@Saieden
@Saieden Год назад
@@ManuSaraswat Yeah, I've been milking the first B450 for 4 years now and still running a 3700x in my rig that I want to replace with a secondhand 5800x3d/5900x/5950x/ for another few years when the time comes. I'd completely forgotten that I was still on 3.0 until this popped up lol.
@AndrewTSq
@AndrewTSq Год назад
I have a Asus Prime B450 with a 3600 that I never upgraded bios on, so its still a PCIE4 system :)
@grants7390
@grants7390 Год назад
@@AndrewTSq According to the Asus website that's a PCIe 3 board.
@AndrewTSq
@AndrewTSq Год назад
@@grants7390 Yes, because AMD forced all B450 boards that supported PCIe 4 to remove it with a BIOS update... but I never updated mine, so it's still PCIe 4. Just search "PCIE4 removed from B450" on Google. Not only Asus; Gigabyte had it too: "Gigabyte, who introduced a Firmware BIOS switch in their X470 motherboards allowing you to enable PCIe Gen 4.0 (partially). Once that happened a lot of motherboard manufacturers felt obligated to follow. Meanwhile, AMD has been stating that non-500 series motherboards will not get PCIe Gen 4.0 support."
@1Grainer1
@1Grainer1 Год назад
No DLSS is the best way of testing: no 4 scenarios to include, no wasting reviewers' time, it gives the "worst case scenario" in the performance department, and it takes into account games that don't support it. Overall it's a better way of testing, since it gives a rough estimate for the majority of games that are not bugged or badly optimised. DLSS/FSR/XeSS can be included in an RT low-performance scenario to show if it can go from 10 fps to 30 fps and be at least playable. Edit: with the last one, I meant DLSS 2 / FSR 2 / XeSS, since those are somewhat supported by games. DLSS 3 is frame transition smoothing, which could only help in games like "Life is Strange", if anything, since that's the only kind of game where you really don't care about latency (choosing dialog options would still be horrible, probably).
@bradhaines3142
@bradhaines3142 Год назад
problem is nvidia pushes reviewers really hard to pitch that so it looks better
@cpt.tombstone
@cpt.tombstone Год назад
I'd agree if Frame Generation's performance uplift were as linear as the performance uplift of DLSS. With DLSS, you can just calculate the difference in pixel counts between the output resolution and the render resolution and you will be very close to the actual performance uplift you get in game. With Frame Generation, it's really not that easy, because the base framerate with FG on and off is not the same: on my 4090, in a GPU-bound scenario, base framerate can be 25-30% lower with Frame Generation on. FG then doubles the effective framerate, and you end up with a 65-80% boost overall. However, in a CPU-bound scenario, you get a clean 100% uplift. This is very evident in poor quality PC ports like Hogwarts Legacy and Jedi Survivor, where if you enable RT, you cannot get over 60 fps, but Frame Generation doubles the framerate very effectively. Elden Ring is similar: there is a 60fps lock on the game, and Frame Generation will double it to 120 fps without messing up the gameplay with invulnerability frames not lasting as long as intended. So I think it could be very informative to test games with frame generation as well (not instead of native, that's disingenuous).
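For reference, a rough sketch of the pixel-count estimate described in the comment above. The per-mode render-scale factors below are the commonly cited DLSS presets and should be treated as assumptions; the ratio is also an upper bound, since per-frame fixed costs mean real FPS gains come in somewhat lower.

```python
# Back-of-the-envelope estimate of DLSS upscaling headroom from pixel counts.
# Scale factors are the commonly cited presets (assumed, not measured here).

MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def pixel_ratio(out_w: int, out_h: int, scale: float) -> float:
    """Output pixels divided by render pixels (~= 1 / scale**2)."""
    render_pixels = (out_w * scale) * (out_h * scale)
    return (out_w * out_h) / render_pixels

for mode, s in MODES.items():
    print(f"{mode:<11} render scale {s:.3f} -> ~{pixel_ratio(2560, 1440, s):.2f}x pixel-count headroom")
```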
@salmon_wine
@salmon_wine Год назад
@@cpt.tombstone frame generation really kind of is cheating though as it only appeals to the eye candy factor of a high frame rate, and a high frame rate is already pretty superfluous to an eye-candy focused game. Thus, the main use of frame generation is to hit 60fps in a title where you are using graphics settings definitely too high for your gpu. Idk, I feel like there's more validity in testing native vs DLSS vs FSR than there is including frame generation in there
@joey_f4ke238
@joey_f4ke238 Год назад
@@salmon_wine Any game that isn't some competitive multiplayer will benefit from frame generation, it is a really great tool for maximum visual fidelity while retaining a smooth experience, i have always dropped graphic settings to get at least 60 fps or more for every game i played and i would have loved such a tool back when i played mh world or other really pretty games and my gpu just wasn't quite up to the task
@cpt.tombstone
@cpt.tombstone Год назад
@@salmon_wine I strongly disagree. 60->120 in a game that cannot achieve more than 60 fps is very strong use case for the tech. I've been playing a heavily modded Skyrim with frame generation, I only get 60-70 fps in the wilds at 4K even with a 4090. Turning on Frame Generation, now it's an almost perfect 116 fps experience, and the game is super smooth. Very-very minor latency impact as well, even with Reflex not working in this version of the FG mod.
@DanTheYeen
@DanTheYeen Год назад
Glad you're pointing this out. One thing I'd love to see next time is a card's PCIe 3 vs 4 performance plotted on the same graph.
@FiveFiveZeroTwo
@FiveFiveZeroTwo Год назад
Indeed, IMO the current graphs / benchmarks don't show the potentially lost performance at all.
@allyoucouldate9510
@allyoucouldate9510 Год назад
Congrats for covering the PCIe 3.0 x8 issue! My concern with this is the stuttering that can occur when the system moves data between VRAM and RAM, because you will have 8 GB/s on x8 lanes and not the 16 GB/s you would have on x16 lanes, so theoretically double the stutter time. This issue is amplified by the small 8GB VRAM; the 16GB version of the card doesn't have these bandwidth issues.
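For reference, a quick sketch of where those bandwidth figures come from, assuming the standard 128b/130b link encoding; real-world throughput is a bit lower because of protocol overhead.

```python
# Theoretical one-direction PCIe bandwidth for the configurations discussed above.

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Gen 3 runs 8 GT/s per lane, Gen 4 runs 16 GT/s, both with 128b/130b encoding."""
    gts_per_lane = {3: 8.0, 4: 16.0}[gen]
    gb_per_lane = gts_per_lane * 128 / 130 / 8   # GB/s per lane after encoding overhead
    return gb_per_lane * lanes

print(f"PCIe 3.0 x8 : {pcie_bandwidth_gbs(3, 8):.1f} GB/s")   # ~7.9, the '8 GB/s' above
print(f"PCIe 3.0 x16: {pcie_bandwidth_gbs(3, 16):.1f} GB/s")  # ~15.8, the '16 GB/s' above
print(f"PCIe 4.0 x8 : {pcie_bandwidth_gbs(4, 8):.1f} GB/s")   # what the x8 card was designed around
```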
@jakov175
@jakov175 Год назад
Thank you for putting emphasis on the power consumption. Especially important with something like the 4090 because in gaming it rarely ever runs at the max listed TDP.
@hop-skip-ouch8798
@hop-skip-ouch8798 Год назад
Same. Our power bill is calculated in slabs, so the more we consume, the more expensive it gets per unit. And it barely gets talked about in most reviews.
@marinipersonal
@marinipersonal Год назад
Exactly. My 4080 uses much less power than my 3080 Suprim. Runs very cool and as I don’t use high frame rates, only ultra on image quality, usually around 200w. Not bad.
@raulsaavedra709
@raulsaavedra709 Год назад
2x. Furmark is good to include to know the peak sustained power a card can pull, e.g. good to know maybe for power supply capacity planning. But that's way too high a load compared to everyday usage, even for the most demanding games or graphics applications. A call for better/more granular power analysis from other reviewers was very much pertinent.
@alexrusu6417
@alexrusu6417 Год назад
Copium!
@dakoderii4221
@dakoderii4221 Год назад
@@hop-skip-ouch8798 You pay more for utilities to stop "climate change" and "racism". The experts would never lie for nefarious reasons. It is impossible that someone with an education could ever do any wrong. It's best to blindly believe them and persecute the heretics who dare defy the gods of science. Now eat ze bugs! You'll own nothing and be happy!
@markus.schiefer
@markus.schiefer Год назад
Frame Generation should for the foreseeable future not be part of any comparative chart. Way too many games don't support it and depending on person and / or game it has negative side effects like increased latency. As for DLSS / FSR2: While I think that it is working well enough, the quality is depending on resolution and settings. Quality or balanced settings might be fine on 4k for most people, but for many even quality settings might be not acceptable at 1080p. On top of that, some popular games, even new ones sometimes, still don't support it. DLSS1 / FSR1 shouldn't even be considered. At the moment, comparison at native resolution without those features is simply the fairest.
@RN1441
@RN1441 Год назад
Frame generation and upscaling technologies are very vulnerable to abuse. In concept Nvidia can generate any framerate they want at any resolution they want if they are ok with grotesque artifacts and distortion. I really prefer to see the apples to apples raw performance comparisons and go from there.
@LuminousSpace
@LuminousSpace Год назад
dlss and fsr is really great for mobile devices like 7940U and asus rdna3
@Jaml321
@Jaml321 Год назад
Frame generation should never be the focus in testing a GPU. It is a nice feature to have, but I pay for raw performance, not for fake frames.
@Val-bx6gn
@Val-bx6gn Год назад
The thing that many people tend to forget or even misunderstand about FG is that it isn't even for people with low fps. The lower your fps is the harder it is to interpolate between frames. And it does nothing to fix the input lag which usually is pretty bad at low fps. This feature is for people who already have pretty decent >60 fps and who also have high refresh monitors, it allows them to saturate the pipeline to further visual smoothness. It's a "win-more" feature.
@Anankin12
@Anankin12 Год назад
@@RN1441 Not to mention, DLSS 3 literally worsens performance in online competitive gaming. It increases latency, introduces false information (it uses ML to predict what an object will do; if that object suddenly changes direction, DLSS 3 will predict wrong and show you wrong information), etc. It's OK in slower-paced stuff, but at that point... do you truly need the high fps? It can't even be used to reach "playable" on heavy titles, because if native FPS is low enough you'll get untenable latency anyway. At the moment, the only actual use for it is basically VSync in single-player titles: you can set it to match your screen refresh rate and that's it. Basically if you get 45 FPS in a non-latency-sensitive game, you can use it to get 120 fps with minimal drawbacks. Below 45 fps (though it depends on one's tolerance) it's useless, because even if you get 100 fake frames you'll have an actual frame time of 60 ms or more, which is about equivalent to playing at 15 fps. Literally unplayable. It can be a great technology, but in gaming it has very limited use at the moment (and in the future if they don't change something about it).
@nukedathlonman
@nukedathlonman Год назад
BIG thank you - I'm glad someone looked into this as it is very helpful information to have when people are looking at older systems for upgrades.
@garbuckle3000
@garbuckle3000 Год назад
Thank you for testing this! This was exactly what I was wondering about when I heard only x8 for the 4060ti. I know sites did testing when the 6600XT came out with only x8 and even more people still had PCIe 3.0 (I was one of them). I admit I saw a slight increase in performance once I upgraded my system to PCIe 4.
@pandemicneetbux2110
@pandemicneetbux2110 Год назад
Man I'd be so pissed by that. Mainly because the board is literally the ONLY thing in my rig I don't expect ever to touch in the lifetime of the system period, literally even the PSU I would imagine needing to replace before having to get a different board. That's literally the exact reason why I got the pricey motherboard to begin with, because I didn't want to have stupid problems like these and not have any hassle because it's supposed to be a rock solid x570 that shouldn't even need to be touched until I replace my whole rig by 2030 (God willing, barring terrible unforseen consequence knock on wood).
@DanishBashir-sz6vs
@DanishBashir-sz6vs Год назад
I am buying a PCIe 3 motherboard and a Ryzen 2600. I know it's very, very old, but it's extremely cheap. And very oddly I am buying a 3090 or 4070. I know, I know, bottleneck. But I am going to be playing at 4K or even higher in my VR setting. Tell me how much difference PCIe 3 vs PCIe 4 will make at, for example, 8K?
@MsNyara
@MsNyara Год назад
@@DanishBashir-sz6vs Nothing; x16 4.0 cards do not bottleneck on 3.0 setups. x8 4.0 cards do bottleneck on 3.0 setups, but the amount of bottleneck depends on the CPU and the specific card pair; generally weaker cards bottleneck less, and AMD cards bottleneck less due to Infinity Cache doing the lifting, with Nvidia having no equivalent. I would heavily advise at least buying a Ryzen 5500, as it just costs $85 and doubles the performance, and it can be used on the same motherboard you had your eyes on by just asking the seller to update the BIOS if it has not been updated yet.
@DanishBashir-sz6vs
@DanishBashir-sz6vs Год назад
@@MsNyara 5500 was exactly next in line and you made me reconsider again lol. Okay, I will think about processor once again.
@nothingisawesome
@nothingisawesome Год назад
You are 100% right about benchmarks with DLSS; including those in the baseline benchmarks is a horrible idea. I know it's more work for reviewers, but I think it's fine to give those figures out; I would just summarize them at the end somehow, i.e. here is the DLSS/FSR number. But as it stands now I wouldn't include frame gen. Also consider that the point of benchmarks is having the same settings on all the products tested. Man, that's just annoyingly complicated.
@---le7cy
@---le7cy Год назад
It's a 50-class card, named as a 60 Ti and priced as a 70...
@ignacio6454
@ignacio6454 4 месяца назад
On 8 lanes. It is garbage.
@toututu2993
@toututu2993 26 дней назад
With 2 useless features that make gaming experience worse
@earthtaurus5515
@earthtaurus5515 Год назад
Thanks for covering this Roman 👍🏽👍🏽, haven't seen many reviewers mention this at all.
@philmccracken2012
@philmccracken2012 Год назад
Also, I wanted to add I absolutely love your videos! Thank you for what you do and how you do it!
@gyokzoli
@gyokzoli Год назад
Awesome content as usual! You really deserve a bigger audience.
@truckerallikatuk
@truckerallikatuk Год назад
The issue with DLSS is that it's a massive quality drop at 1080p; it's only realistically usable at 1440p or higher. Edit: at least it was for DLSS 2.0. The base render resolution when using DLSS didn't have enough resolution steps below 1080p to maintain quality even with the AI upscaling.
@Xayc__
@Xayc__ Год назад
It's the same with DLSS 3, because it uses the same DLSS super resolution. The difference between them is that DLSS 3 also uses frame generation tech that draws a bunch of fake frames and makes games visually smoother, but it doesn't improve responsiveness (input lag) and usually makes it worse, while a real fps boost improves input lag.
@mryellow6918
@mryellow6918 Год назад
Still looks bad at 1440p
@caribbaviator7058
@caribbaviator7058 Год назад
​@@mryellow6918 agreed it doesn't look great on 1440p. Fsr looks even worse @1440p
@TheFriendlyInvader
@TheFriendlyInvader Год назад
You really shouldn't notice that much of a difference on the higher quality settings of DLSS; if you do, it's probably not DLSS that's the issue as much as a shader on the game side bugging out due to sampling issues. Of course if you push it to the limit and use ultra performance you'll start seeing resolution issues, but that's the exception to the norm.
@stevensgarage6451
@stevensgarage6451 Год назад
it looks better than native in cyberpunk
@xingbairong
@xingbairong Год назад
Excellent video. Glad to see someone putting more attention to power consumption. To be honest the card isn't bad, the price however... Just few years ago this card would've probably been in the $220-$250 range and realistically that's where it should be.
@marstedt
@marstedt Год назад
I disagree, because the name of this product implies that a performance upgrade should be expected. If it were cheaper AND renamed the 4060 (or 4050 Ti), then I think it would be worthy of praise. I think most people would expect a 4060 to be equal to or better in performance than a 3060 Ti AND consume the same or less power. Nvidia has failed / misled the public on two fronts: price and performance.
@vsm1456
@vsm1456 Год назад
@@marstedt 4060 (non Ti) should perform around 3070, that would be a proper generational improvement as we usually saw in the past
@MsNyara
@MsNyara Год назад
No, the card is bad: it has a serious imbalance of memory and bus for its otherwise real power, leading to severe bottlenecks with itself, which is a very bad design.
@xingbairong
@xingbairong Год назад
@@MsNyara So if the RX 7600 and RTX 4060 Ti were the same price you would tell people to get the RX 7600?
@MsNyara
@MsNyara Год назад
@@xingbairong The RTX 4060 Ti if it's the same price or up to $30 more, but that is mostly comparing two terribly designed cards with each other. Just buy an RX 6600 for $200, an RTX 3060 12GB for $280 or an RX 6700 XT for $350; the rest does not make sense in the performance segment.
@CorvoPT
@CorvoPT Год назад
Nice explanation! Congrats!... cheers from Portugal!
@stratos7755
@stratos7755 Год назад
6:30 All I want to see is pure raster performance.
@iamdarkyoshi
@iamdarkyoshi Год назад
Love to see constructive feedback between channels. I love GN's content as well, and agree some power tests per game would be a great addition.
@-opus
@-opus Год назад
It is definitely something they are lacking, but I guess most of their viewers are from the US and they don't (according to a lot of youtube comments) seem to care about power usage because of the power cost, or the cost to the environment.
@GrizzAxxemann
@GrizzAxxemann Год назад
@@-opus A lot of places in the US are on nuclear power. It's about as clean and efficient as you can get.
@19alive
@19alive Год назад
Also, GN needs to change how he presents the numbers in those bar charts; it looks like a mess, with too many numbers on top of each other.
@-opus
@-opus Год назад
@@19alive Agreed, too many comparisons. The only issue is that there are often people questioning why certain models are still excluded.
@-opus
@-opus Год назад
@@GrizzAxxemann I guess that explains a lot, perhaps it is why they like rgb so much, it too glows in the dark. I am surprised no one has made a PC case shaped like a 3 eyed fish
@ibangladeshi1161
@ibangladeshi1161 Год назад
thanks for this bro, was looking for this, subscribed
@sortofsmarter
@sortofsmarter Год назад
Your work and quality of information are great; I have never understood why you don't have 3x more subs. Your content is worth it, and then purchasing test components would be less of an impact on the channel's bottom line. Keep it up, and thanks.
@Viking8888
@Viking8888 Год назад
Thx Roman! I appreciate you testing the differences between pci-e 3 to 4 with this card. It really shows how delusional nvidia has been with the 40 series pricing.
@nelsonrobe-from3278
@nelsonrobe-from3278 Год назад
I needed an upgrade, and after watching the 4060 Ti reviews I immediately went out and bought a 6700 XT, which has almost the same performance, 50% more VRAM, and was $70 cheaper than the 4060 Ti.
@ResidentWeevil2077
@ResidentWeevil2077 Год назад
Ngreedia wants to go the high end/workstation route, and that's exactly what ended their former competitor 3dfx. Now it seems Ngreedia is bent on following in 3dfx's footsteps and they'll soon get their just rewards for their nonsense. The irony of this situation is highly amusing; it'd be even more amusing if Ngreedia goes under and sells their IP to Intel.
@EbonySaints
@EbonySaints Год назад
​@@ResidentWeevil2077 Look, I'm rooting for Intel on their GPU front, but do you *really* want the company that kept 4 cores and 8 threads for almost eight years to take over?
@username8644
@username8644 Год назад
@@EbonySaints They had zero competition, not even remotely close competition. And to be fair to Intel, the worst thing they ever did was not improve their CPUs enough with each generation for a couple of generations. That's significantly better than all the issues AMD has been having with their CPUs these last few generations. Also, AMD single-handedly killed the workstation lineup with Threadripper by pricing them at absolutely ridiculous levels. I mean, first-gen Threadripper was good value, but then it went through the roof to $4k for a CPU. We used to be fuming at Intel for a $1k CPU. Honestly Intel looks way better than AMD now, and at least their CPUs are rock solid stable. AMD is pretty touchy. Edit: also, Intel literally made all the high core count chips first, before anyone else, so your comment is slightly ironic. I'm referring to 6, 8, 10, 12, 14, 16, 18, 20, and 22 cores; Intel did it all first. They did that during your 8-year 4-core/8-thread time period. And yes, they were expensive at the time, but AMD is now the high core count king and they priced their stuff 4x higher than Intel ever did. So I don't see how Intel is worse.
@firecat6666
@firecat6666 Год назад
@@ResidentWeevil2077 Nvidia seems to be going for the AI market now, judging from some stuff I saw regarding their most recent financial report. Hardware for that area has much greater overlap with high end/workstation hardware than with gaming hardware, so no wonder their gaming products aren't getting much R&D attention. With all the projections for the growth of the AI market, Nvidia's decision seems pretty solid compared to 3dfx's back then. Also, it would probably be bad for the gaming GPU market if Nvidia decided to quit, we'd be back to duopoly again, except we'd then be left with the two people that currently offer the lower quality products.
@tqrules01
@tqrules01 Год назад
I like your points; there are just a few things here I would like to add. 1: Most gamers really don't care about power unless the gap is huge (look at all the Core i9 9900K users). 2: The 8GB frame buffer is waaaaaay too low... 3: Nvidia believes they are better and bigger than the consumer and won't budge on price or volume.
@WaterZer0
@WaterZer0 Год назад
People in Europe will care more about power than the US for reasons made obvious in this video.
@korana6308
@korana6308 Год назад
Good points. 1 I completely agree. And I seriously don't get that power efficiency thing. Because it depends on the settings that you play your game at. 2 This 8GB bs is completely ridiculous since we already had it on a GTX 1070 !!! 8 bleeding years ago. Ngreedia releasing it with 8 gb is completely ridiculous, to me the 8Gb should only be at the baseline of gaming which means only a 4050 should have it and the rest should go up. 4050 8Gb, 4060 12Gb , 4070 16Gb , 4080 20 Gb , 4090 24 Gb, and anything outside of that is Ngreedia just screwing us. 3 I think eventually they will though, but not as fast as people think. It will take a few months for them to organize a corporate meeting, to finally realize oh gush darnit we screwed up with pricing. They will have to lower the price... I saw a perfect comment on one of the 4060 videos, so now I repeat it everywhere. This 4060 is a 4050 at 4070' price.
@badass6300
@badass6300 Год назад
80%+ of their income comes from enterprise/server/AI and another 10% or so from laptops, they don't really care much to sell volume on the desktop side, they'd rather have premium products. I'm more surprised that AMD are even trying, because last year less than 10% of their revenue came from their GPUs.
@MahbuburRahman-uc7np
@MahbuburRahman-uc7np Год назад
Excellent review. Just one suggestion: for the TDP cost-saving calculations, I think it's better to test the total system power draw rather than just the GPU power draw, because you cannot run your GPU without the rest of the system. Doing so, the gap will shrink and will reflect real-world use better.
@master21mark
@master21mark Год назад
Thanks for the extensive review. It's great that you factor in the power consumption cost too. My thought on the bandwidth limitation: would overclocking the PCIe lanes help with this? I know it's possible to cause instability this way, but I'd be interested to see the impact. It could also make a nice overclocking video, as there is a lot less content on the topic.
@ObakuZenCenter
@ObakuZenCenter 10 месяцев назад
You can overclock the card, but not PCIE lanes.
@_mrsahem_
@_mrsahem_ Год назад
Really appreciate the follow up video. I actually commented on the original video asking about Gen3 performance. Totally forgot Gen3 8x == Gen4 4x. Regardless I think the 4060 ti is terrible value right now. Hopefully AMD can provide some better alternatives.
@notrixamoris3318
@notrixamoris3318 Год назад
7600 is the same...
@syncmonism
@syncmonism Год назад
In the US, you can get a new 6750 XT for 330 USD (including shipping) right now
@notrixamoris3318
@notrixamoris3318 Год назад
@@syncmonism everyone did a price drop...
@tilapiadave3234
@tilapiadave3234 Год назад
@@notrixamoris3318 Doesn't look too bad at the new price: 260 of those green American pesos.
@astra6640
@astra6640 Год назад
​@@notrixamoris3318 at least AMD's pricing is less wack... the 7600 having the same pitfalls isn't as bad when you factor in that it's lower in the product stack and priced according lower as well. Afaict, it's not *meant* to be a direct competitor to the 4060 Ti. Which makes the limited parameters a lot more excusable on AMD's side, since you get a low end limited product, but you do pay fittingly less for it. Whereas Nvidia's offering is kind of criminal...
@davidbrennan5
@davidbrennan5 Год назад
Native benchmarks only please, no gimmicks turned on. The new games are going to be very demanding; the requirements for Immortals of Aveum (Unreal 5 engine) are very high even at lower resolutions. I wish AMD/Nvidia had spent time on improving the actual native performance rather than adding RT and FSR/DLSS. This is a great channel, thank you for the hard work.
@davidbrennan5
@davidbrennan5 Год назад
@@mikeycrackson That is part of the problem but new games have very busy scenes and a lot of details that use up memory. They recommend a 12700K/5700X CPU and a 6800xt or 3080ti GPU @1440p 60FPS for medium to high settings for Immortals of Aveum. This game has an area where you are fighting on a moving mech and a battle is going on in the background and you have a hud screen etc...
@joannaatkins822
@joannaatkins822 Год назад
Thank you for your work, I really appreciate your insights. In general I would have liked to see a more in depth comparison between pcie 3.0 and 4.0 on the RTX 4060 ti, as such a big deal was made of that in the recent past with the RX 6500 xt. I'm guessing that without DLSS 3.0 the differences would be noticeable if not profound. With so many GPU launches, and with so many changes to specifications and platforms people are getting confused with the slew of tepid offerings from nvidia and amd. I've had so called techsperts recommending the Intel ARC cards to pure gamers recently, which I think is misguiding people at best. I've had people wondering why the RX 6700 XT is getting slammed and I've had to help them realise that it's a much newer and less useful product that is being slated. With all this in mind, so much is changing that short, clear, concise videos like yours are extremely helpful. Massive in depth reviews are not digestible enough for many people, even enthusiasts, so please keep up your efforts. They are greatly appreciated
@pandemicneetbux2110
@pandemicneetbux2110 Год назад
Anyone who calls himself a "techspert" is probably neither. That's pure 100% marketing bullshittery right there. Also again the only sole reason for mentioning DLSS is for making raytracing playable. You either have a new enough graphics card you can probably run your game natively anyway, or your GPU is old and slow enough to actually need it but also old enough not to have DLSS and therefore you're going to be using FSR anyway. That's for the tiny amount of games which even have it mind you, that's completely irrelevant for literally 99% of my games (I have iirc 7 games tops out of a 700+ library on GOG, Steam, and Epic that even have the option to use FSR/DLSS or RT, none of my other games do). So DLSS is really so much of a niche use case for such a small proportion of gamers that I literally see it along the lines of mentioning AMD's drivers for Linux, or nVidia's NVENC encoding and CUDA, yeah I"m sure AMD does run great on Linux and I'm sure your media business is doing well with your rendering and encoding, but it's completely irrelevant for the overwhelming majority of gamers just like DLSS is.
@DGCastell
@DGCastell Год назад
Right when you mentioned it was running on x8 on the last video, I knew it was gonna be a big problem for people like us who are still on PCIe 3.0. This is important because I know quite a lot of people will get this card not knowing about this potential performance loss.
@EightPawsProductionsHD
@EightPawsProductionsHD Год назад
There's no excuse in 2023 for buying new hardware without checking out the numerous online written and video reviews first.
@WSS_the_OG
@WSS_the_OG Год назад
Yes, but! The large L2 makes up for everything, even the lack of RAM. Right? Right?
@Soundsaboutright42
@Soundsaboutright42 Год назад
@@EightPawsProductionsHD There hasn't been that excuse for a very long time. People are just dumb, to be honest. I'm shocked some people can even Velcro their shoes in the morning.
@jemma2607
@jemma2607 Год назад
So, what GPU do you recommend for 1080-1440p with pcie 3? I've read about the rx 6750xt but some say this won't be a good option if I also want to work with the system so... Can you give me any recommendations?
@simonb.8868
@simonb.8868 Год назад
@@EightPawsProductionsHD there is no excuse for them to cut 8 lanes
@thepatriot6966
@thepatriot6966 Год назад
I didn't even know the 4060 Ti ran on only 8 PCIe lanes. As you said, no other reviewer spoke of this at all to my knowledge. What is Nvidia thinking? What a disaster of a card.
@WaterZer0
@WaterZer0 Год назад
They are thinking: $$$
@arenzricodexd4409
@arenzricodexd4409 Год назад
Cost saving. AMD has been doing it longer than nvidia.
@BobMoran
@BobMoran Год назад
Thanks for the updated content. Honestly when I saw this was x8 pci 4, I knew this card was not for me. I'll be on am4 pci 3 for at least three more years before I'll even debate updating my whole system. Power usage has very little effect on my decision, same with dlss/fsr. I'm most concerned with productivity numbers for video production in da Vinci resolve. I suspect the bandwidth downgrade with pci 3 would impact this but I may be incorrect. Either way, thanks again for your content.
@andrebrait
@andrebrait Год назад
I think this card is too underpowered to suffer any capping due to PCIe bandwidth at 8x 3.0
@juss-passin-thru
@juss-passin-thru 3 месяца назад
@@andrebrait Exactly
@tinfxc
@tinfxc 11 месяцев назад
Nice review! But hey, it would be nice if we also saw how it performs on a PCIe 4 system, to see if it's worth jumping to it or better to stay on PCIe 3 a bit longer. Thanks!
@KookyBone
@KookyBone Год назад
Great, I had just been thinking about the two topics of how the 8 lanes might affect the performance and how much the power savings could help. But as a bargain hunter (Sparfuchs), I look around a lot for used GPUs here in Germany and I often find the RTX 3080 (10GB) for about 410-430€... so it is even cheaper. But really great point: in the past Nvidia was nearly always matching the used price of their 80 series with the 60 (later Ti) series, but now they don't seem to care about this anymore... and with a market full of used 3080s this doesn't make sense. Really great video, keep on doing them.
@daviddesrosiers1946
@daviddesrosiers1946 Год назад
I've noped out on team red and green. Waiting on my Acer Predator Bifrost A770 OC. I'll take my chances with the guys who aren't trying to sell me an 8gb 128bit bus card for an outrageous price.
@pf100andahalf
@pf100andahalf Год назад
Almost every time I tell people that used 3080's are down to $400 now (I got one for $450 last october 2022 when prices finally dropped), and that they are a great deal, I get lots of pushback saying, "I won't buy used!" Glad to see there are some sane people around here. The 3080 is a beast and if you buy used it might be the best deal going right now.
@lat78610
@lat78610 Год назад
Dude, you're lucky. In France people are insane and basically selling used cards at shop prices.
@AndrewTSq
@AndrewTSq Год назад
I would actually get a new 4070 over a used 3080 anyday in the week. More vram, more speed, and you get a warranty. That is worth the extra $199 for me. edit: or if you dont mind AMD a 6800xt new is $529.
@main_stream_media_is_a_joke
I think people are somewhat scared to open and clean what is quite possibly the most expensive part of the PC. Just yesterday I got a used 3060ti for 230$. It also has around 4 years of warranty still left....Zotac model. I got lucky as the guy who sold it to me was moving out and have had barely used it for just around a year....card is in great shape and I plan to open and clean it soon.
@yasunakaikumi
@yasunakaikumi Год назад
The problem with used is that if it was used for mining, there's a higher chance of the fan bearings dying mid-use... If mining wasn't a thing back when it was released, I'd recommend it anytime, but since it's only 10GB of VRAM... it's not that much different from having 8GB.
@seeibe
@seeibe Год назад
With European energy prices, getting a used 3080 at $400 vs a new 4070 at $600 is still a tough choice. In gaming the 3080 uses a solid 100W more than the 4070ti. If you game 2 hours a day for 5 years at 0.45€/kWh (prices will only go up from here), that comes down to 164€, which puts the costs of the cards more in line with each other. And that's not accounting for people like me, who also use their PC for work 8 hours a day and thus idle/low usage power consumption matters a lot. In conclusion, don't pay more than 400€ for a used 3080 if you live in Europe lol
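For reference, a quick check of that running-cost arithmetic; all input figures (100 W extra draw, 2 h/day, 5 years, 0.45 €/kWh) are the assumptions from the comment above.

```python
# Extra energy cost of a card drawing 100 W more, under the assumptions above.

extra_watts   = 100      # W difference under gaming load
hours_per_day = 2
years         = 5
eur_per_kwh   = 0.45

extra_kwh  = extra_watts / 1000 * hours_per_day * 365 * years   # = 365 kWh
extra_cost = extra_kwh * eur_per_kwh                             # ~164 EUR

print(f"Extra energy: {extra_kwh:.0f} kWh -> about {extra_cost:.0f} EUR over {years} years")
```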
@Kingramze
@Kingramze Год назад
Thank you for this video! Most systems 3+ years old and older have PCIe3 boards, and those are the same customers that would consider a 4060 class card for an upgrade. Knowing the bandwidth is limited makes a big difference. It makes it even worse value for them, and I'd definitely recommend cards with full 16 lanes for such a system just so they get the most out of whatever card they choose. Very few cards are limited by PCIe3 16 lanes - and certainly only the high end ones. As for power efficiency, I think the reason it's not discussed in general is because the USA market doesn't find it a priority. I personally don't care because electricity is so cheap, it's not a factor. However, I DO care about noise and heat dissipation which are related to efficiency. If a card can use less power and therefore need less of a noisy cooling solution, then I'm very interested.
@melbornefischer4493
@melbornefischer4493 Год назад
Thank you for showing the power consumption. :) I watched the other YT reviews and was wondering, if they match last gen, what's the power consumption then? Nobody showed enough, so thank you.
@jonathanellis6097
@jonathanellis6097 Год назад
I was surprised by the lack of coverage about the PCIe lanes. Not that long ago AMD got lambasted for doing the same thing, so I was surprised at the lack of testing on an older Gen 3 board for the 4060 Ti, as I would imagine a lot of people with an older system would be looking at this card as an easy drop-in replacement for an aging video card.
@ABaumstumpf
@ABaumstumpf Год назад
"Not that long ago AMD got lambasted for doing the same thing" They were rightfully pointed out as it was only 4!!! lanes.
@jonathanellis6097
@jonathanellis6097 Год назад
@@ABaumstumpf Yeah, AMD should have been, and were, called out, and it received lots of negative press. The odd thing is Nvidia seems to have been given a pass on the issue of PCIe lanes. Nvidia should have been called out for it as well.
@ABaumstumpf
@ABaumstumpf Год назад
@@jonathanellis6097 again - twice the lanes and as he has shown even with PCIe3 not a real problem - contrary to AMD.
@gruzzob
@gruzzob Год назад
Hi Roman, couple of thoughts on this whole thing: 1. I know you tested the 4060Ti on a PCIe 4.0 system in your previous video, but it would have been good to see the results included here as well. Even though it was a different system (and probably more powerful. though there are ways to control for that), it could help give an idea of what kind of performance was being lost over the older bus, 2. re DLSS3: I say show it, people do use it, it does give extra framerate (potential quality issues aside), and it allows intergenerational testing 3. Keep up the good work, you have a good perspective on things
@racoonchief
@racoonchief Год назад
Awesome videos!
@zwerushka1
@zwerushka1 Год назад
Excellent sound card with a video output )))
@uncrunch398
@uncrunch398 Год назад
A difference in power consumption can also mean a difference in what you pay for cooling your room, if it's hot a significant portion of the year. I already want to run a heat-transfer system using pumped liquid so I can run my hot CPU all day when the ambient is already uncomfortably warm. I just can't afford it yet.
@palmariusbjrnstad1682
@palmariusbjrnstad1682 Год назад
Indeed, this is also big in data centres. You pay for the power consumption twice due to cooling. It can work the other way too, though. In Norway, most of the year, the PC power consumption will be exactly balanced by needing less heating. It's very common to have resistive electric heaters and then the (marginal) cost of PC power usage is zero (though, more efficient heating options like heat pumps are becoming widespread). [Edit: and in the summer the electricity tends to be cheap here, due to large solar production on the continent]
@PainterVierax
@PainterVierax Год назад
@@palmariusbjrnstad1682 Yeah, heat pumps can mitigate the heating cost. Though they don't work well below zero Celsius, so there might be less incentive to invest in one at your latitude than in southern Europe (especially units with a reverse heat-pump option to replace an existing HVAC).
@lonelymtbrider3369
@lonelymtbrider3369 Год назад
My thoughts on FSR, DLSS and frame generation are that they should always be included as a separate bar, but always together with the raw performance so we can see both sides of the coin. I am one of those who gladly use DLSS + FG if they are available, because most of the time it's a free performance gain with little to no negatives, but sometimes in some games DLSS looks like sh** and it's better without.
@amunak_
@amunak_ Год назад
You also absolutely need it to be able to judge performance for titles that don't have it.
@lonelymtbrider3369
@lonelymtbrider3369 Год назад
@@amunak_ Yes, most definitely. Nvidia thinks DLSS3 should be compared to other cards running "raw" but that's just wrong. Nevertheless, I do want to see what kind of gains DLSS3+fg gives, but that should be an additional test, because raw performance is super important for the sake of comparison.
@ashberto6041
@ashberto6041 Год назад
Your review has been the only worthwhile one of all the reviews out there, well done. I am the target market, but I have a 9900K, so it is not a viable upgrade, and the PCIe lanes are something I need to consider.
@maxview99
@maxview99 Год назад
Excellent video, I agree power consumption is an important metric.
@pandabuttonftw745
@pandabuttonftw745 Год назад
DLSS3 isn't the selling point nvidia thinks it is. fake frames don't mean squat since it introduces artifacting (more than "normal" dlss) and the input latency is not improved at all.
@subaction
@subaction Год назад
Trillion dollar company, folks.
@daviddesrosiers1946
@daviddesrosiers1946 Год назад
This gen of cards should have the market chasing Nvidia's execs down the street with pitchforks.
@subaction
@subaction Год назад
Not at all. It's good for the stock value to sell overpriced cards that will feel obsolete in one or two years of use. The execs are doing what they are required to do: make the most money for the company. They just happen to have noticed that the best way to do that is to exploit consumer stupidity.
@pjsb5757
@pjsb5757 6 месяцев назад
Thanks for the information, very useful to me!!!
@floodo1
@floodo1 Год назад
Thanks so much for always highlighting power efficiency and showing the benefits of DLSS3/FSR. Very important to consider the significant power efficiency improvements in 4000 vs 3000 series as well as frame rate with DLSS3. So many reviews ignore these benefits
@TheVektast
@TheVektast Год назад
And then FSR3 enters the chat.
@tee_es_bee
@tee_es_bee Год назад
DLSS, similarly to RT, is just an extra _optional_ feature that _might_ add value in _some_ games. Thus, I would primarily focus on pure rasterization, then include RT, then include DLSS/FSR/whateverIntelsTechIs. Thank you for including the power numbers. As someone who donates his compute when not gaming, power is much more important to me. 🧡💛🧡
@beatsntoons
@beatsntoons Год назад
Agreed on the power metric. It's hugely important to me. I went from a 2070S to a 4070, and the power draw is about the same or a little less, but the performance is better. I'm happy with similar performance as well, as long as the power draw keeps falling. PC parts can consume so much power these days and power generation is expensive.
@pliat
@pliat Год назад
XESS
@Melsharpe95
@Melsharpe95 Год назад
DLSS is such an important tech that it does need to be tested and factored into the purchase. Especially seeing as AMD themselves are working on their own DLSS 3.0 variant.
@totojejedinecnynick
@totojejedinecnynick Год назад
Roman, just a quick note on consumption calculations - you should consider additional cooling costs associated with card. You may need to ramp up case fans a bit to evacuate extra heat, you may have to run your air-conditioning harder in the summer... of course it won't make much difference in this case, but some high-end 600W cards - well, you have to consider it then. Remember that old datacenters had a rule of thumb that high-density servers can eat ~20-30% of total system power just to run their chassis fans, not to mention datacenter cooling overall.
@coppercore6287
@coppercore6287 Год назад
I agree with this, it does add a good deal of complexity to consider, but looking at a worst case during the summer months of having 150-200W or even more heat dumped into your house could be something.
@klopferator
@klopferator Год назад
This is hard to quantify in Germany because most houses and apartments here don't have AC anyway.
@Lishtenbird
@Lishtenbird Год назад
To be fair, you could get the reverse effect in winter, not having to run a heater as high.
@PainterVierax
@PainterVierax Год назад
@@Lishtenbird with this electricity price, it could be better to massively invest in heatpumps rather than resistive heaters.
@PainterVierax
@PainterVierax Год назад
Another point is to not forget that older cards also have higher power consumption running light loads than newer ones. Those dozen watts can make quite a difference when the computer is used for more than 4hr/day gaming sessions and is sometimes barely powered off.
@YamiInu55
@YamiInu55 Год назад
As someone with an older system, thank you so much for this video. You saved me quite a bit of money!
@STONE-wh2en
@STONE-wh2en Год назад
Congrats. Excellent and smart video.
@Verpal
@Verpal Год назад
$400 is a price point at which I would expect budget gamers to be buying, and many of them would be on PCIe 3.0. NVIDIA truly did gamers dirty this time; good that sales volume seems to be abysmal, people spoke with their wallets.
@subaction
@subaction Год назад
The problem is that a lot of people "speak with their wallet" by buying a higher end nvidia card instead, rewarding their gambit.
@toututu2993
@toututu2993 26 дней назад
I bought an RX 6600 for around $220. Such a great GPU, without the 2 useless features Nvidia has been scamming you with.
@TheChocolateGoat
@TheChocolateGoat Год назад
As always, great video :) I think people care less than you think about the running costs of power, simply because they notice it less when it's spread over the years. I still think it's a really important point to make; more power going through the device is still going to mean more uncomfortable heat in the summer, more chip ageing, etc. Unfortunately our brothers in gaming have been buying into the upscaling promise for years now, and those fake frames with DLSS 3 will definitely do a good job for the marketing aspect... More numbers more good, they say, right?
@NPurvis7622
@NPurvis7622 Год назад
Just underclock...
@edtubemotion1
@edtubemotion1 Год назад
Thank you for running this test.
@ernestweiss9438
@ernestweiss9438 Год назад
NO DLSS for reviews. I only use DLSS or FSR as a last resort if the game is not playable without it.
@einarcgulbrandsen7177
@einarcgulbrandsen7177 Год назад
As long as so few games have this option, there's no need to test it.
@endame90
@endame90 Год назад
Totally, I don't even think of turning on FSR/DLSS unless it's absolutely necessary.
@spektrumB
@spektrumB Год назад
Same here.
@Afthrast
@Afthrast Год назад
@@einarcgulbrandsen7177 Even if they did, they still reduce fidelity and introduce ghosting and artifacts 99% of the time (the 1% is when they make text more readable).
@fredEVOIX
@fredEVOIX Год назад
Same thing: DLSS should be a last-resort crutch, not the new normal.
@paulc0102
@paulc0102 Год назад
I totally get why GPU benchmarks use the fastest CPU and memory to mitigate the influence of the platform on the comparison, but there's also a place for demonstrating performance on older platforms - if reviewers are really trying to allow viewers to make an informed choice, then to me they are currently failing. Credit to you for attention to detail :)
@-opus
@-opus Год назад
But how many different setups do you expect youtubers to benchmark on?
@paulc0102
@paulc0102 Год назад
@@-opus That is an issue. Relative performance (with a fixed high-end platform) is great for showing (generational) improvements (or the lack of them), etc. But this example is a case in point (and I think Jay's video is an extreme illustration) that not considering other factors can skew the result. I would suggest that there are a lot of folks happy with their "older" Intel systems and AM4 kit who might want to upgrade their GPU, so testing at least a couple of the most popular configurations wouldn't hurt, particularly where there is a fundamental change between current and last gen, e.g. PCIe 3 to PCIe 4. Most of us upgrade incrementally (I certainly do). Nobody running a 13900KS on a $1,000 motherboard is going to be gaming with an RTX 4060 anyway.
@-opus
@-opus Год назад
@@paulc0102 I agree that it makes sense for significant changes, I just don't see youtubers being keen on doubling their workload and complicating charts even further.
@mryellow6918
@mryellow6918 Год назад
Well, then just watch a CPU review; a CPU can only give you so many frames.
@99mage99
@99mage99 Год назад
@@paulc0102 I 100% agree that benchmarks for older systems are super useful, but even adding one older platform can drastically increase the workload and production time for a video, so I get why they don't when they already have such limited time to benchmark the samples they get. It'd be great if AMD, Intel, Nvidia, etc. would send the samples out sooner to give reviewers more time to test more configurations.
@DigitalIP
@DigitalIP Год назад
I mentioned a PCIe 3.0 system either on this channel or on Jayz's before, so YAYYYYY. Glad someone's testing on it.
@FancyaBevMate
@FancyaBevMate Год назад
Great video mate. Credibility is everything for a CC. Well done for being honest. Some CCs I'm sure are being 'paid' by Nvidia to give false information 😢. Cheers
@RylTheValstrax
@RylTheValstrax Год назад
I agree - DLSS/FSR are not native raster performance, so they should not be treated as native raster performance. It's like the whole thing back in the 90s and 00s where companies were cheating by reducing graphical quality in benchmark programs, except now it's being marketed as a feature you can use in games instead of a cheat baked into the drivers to be used in benchmarks. It is useful, don't get me wrong, but it's not properly comparable, not objectively like-for-like.
@Melsharpe95
@Melsharpe95 Год назад
AMD are working on their own version of DLSS 3.0, just as they did with DLSS 1.0 and 2.0. Current cards are all tested with DLSS/XeSS/FSR on. It's a technology that helps, so why not test it and report on it?
@robertlawrence9000
@robertlawrence9000 Год назад
It's good to compare the cards primarily without upscaling or frame generation, and then also show results with it on for comparison. Good to see as much as possible of what the cards are capable of.
@SirDragonClaw
@SirDragonClaw Год назад
I disagree. At this point DLSS 2.x and 3.x have gotten good enough that you should almost always use them when available, and when set to quality mode, with or without frame generation, I (and most people) would argue that more often than not you are perceptually getting better-than-native or equivalent-to-native rendering quality, which is what really matters. So I say include them as the new default (labeled, though) for Nvidia cards, and when AMD and Intel get theirs up to a similar quality, where it looks as good or better, then start including those by default too. It really reminds me of years ago when rendering quality differed between cards and brands in the early 2000s: you would adjust them to have similar visual quality when benchmarking, to get the most out of the cards while keeping the comparison fair.
@robertlawrence9000
@robertlawrence9000 Год назад
@@SirDragonClaw Did you understand what I said and why I said it? Not all games have upscaling technology, and it's good to still see results without it for comparison, and then with it on.
@danytoob
@danytoob Год назад
Excellent work, Roman. Most interesting analysis of the electricity consumption, and when factoring in global green worries... well, it makes it just that much more convoluted from a personal-responsibility point of view. Thanks for your great contribution to the community, as usual!
@Cblan1224
@Cblan1224 Год назад
Yo, how long until we get the Mycro direct-die block? There are a few threads of people anxiously checking on this every day, including myself! Keep up the great work. Thanks!
@an7on413
@an7on413 Год назад
How do these cards perform with and without SAM/Resizable BAR? I feel like the low bandwidth of these cards could make those features essential.
@Biker_Gremling
@Biker_Gremling Год назад
This was going to be the RTX 4050 Ti, but due to the 4080 cancellation reshuffle, low-end cards were pushed up a tier. Nvidia apparently decided not to stick to the original names and prices, and now we have the worst GPU line-up in history.
@lewisgrey6010
@lewisgrey6010 Год назад
Theoretically the unlaunch of the 4080 12gb has pushed the rest of the lineup down a tier in relation to their original plan, not up. (Current 4070 would’ve been 4070ti, 4060ti would’ve then been 4070 etc).
@korana6308
@korana6308 Год назад
I mean, it could all be fixed with the price; then it would be a normal line-up. 4060 Ti (16GB): fair price $300. 4070 Ti: fair price $600. 4080: fair price $800. 4090: fair price $1000.
@jaymorrow8058
@jaymorrow8058 Год назад
In my opinion this card would have done fine if it had been named RTX 4060 and sold for $300. If it drops that much ($100/€100) in price (I wouldn't hold my breath), Radeon is screwed.
@XenonG
@XenonG Год назад
No, NVIDIA pushed the naming up a chip tier, and it was all done many months ahead of time with press releases showing the specs, plus leaks. People looked at the "4080" "12GB" specs and it just looked like it was actually supposed to be a tier down, given the compute core count and memory bus width. Backlash put it down to where it was supposed to be, but the pricing is still too much. You can also tell by looking at the chip codenames; NVIDIA could have named those up a tier too, but the audacity to be honest there is quite admirable. My response is the same as a certain kernel developer's: "FUCK YOU NVIDIA".
@gutterg0d
@gutterg0d Год назад
It's not worse, just more expensive.
@H4GRlD
@H4GRlD Год назад
Great review🙃
@JonathanSias
@JonathanSias Год назад
Thank you for retesting this on PCIe 3.0 against a 3060 Ti! DLSS should not be compared directly against older hardware; testing should be done with identical settings to show the hardware changes. DLSS 3.0 can be added to the results after that, but: 1. Not all games support it, so it's not always relevant. 2. DLSS 3.0 introduces latency. It's not high-refresh-rate gameplay, it's just visual. If a gamer is sensitive to input lag, they may never choose to use DLSS.
@qlum
@qlum Год назад
I think the main use case for a PCIe 3.0 system could be someone who is on an older AM4 motherboard but upgraded the CPU to, for example, a 5800X3D, which is quite a reasonable thing to do.
@-opus
@-opus Год назад
the pros and cons of a long life platform...
@smbu
@smbu Год назад
Well, Intel also stayed on PCIe 3.0 for a looooong time. It wasn't until their 11th-gen CPUs running on a 500-series motherboard that they used PCIe 4.0, and then 5.0 with 12th/13th gen. I'm sure there are many people still running those "older" Intel systems, like the 8-core 9900K or 10-core 10900K, that are limited to PCIe 3.0.
@qlum
@qlum Год назад
@@smbu That's very much true, but at that performance level a 5800X3D makes sense as an upgrade today.
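For context on what this thread is arguing about, here is a quick sketch of the theoretical link bandwidth per direction, using the standard per-lane PCIe transfer rates; the x8 wiring of the 4060 Ti is the point made in the video:

```python
# Theoretical PCIe bandwidth per direction for a given link, using 128b/130b
# encoding (gen 3 and gen 4). Illustrative only; real transfer rates are lower.

def pcie_gb_s(gt_per_s: float, lanes: int) -> float:
    """Return usable GB/s per direction for a PCIe 3.0/4.0 link."""
    return gt_per_s * lanes * (128 / 130) / 8  # GT/s -> GB/s after encoding overhead

print(f"PCIe 3.0 x8:  {pcie_gb_s(8.0, 8):.2f} GB/s")    # ~7.9 GB/s
print(f"PCIe 4.0 x8:  {pcie_gb_s(16.0, 8):.2f} GB/s")   # ~15.8 GB/s
print(f"PCIe 3.0 x16: {pcie_gb_s(8.0, 16):.2f} GB/s")   # what a full gen-3 slot could offer
```

In other words, an x8-wired card in a gen-3 board gets roughly half the bandwidth it would have on a gen-4 board, which is exactly why the older platforms matter here.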
@wikwayer
@wikwayer Год назад
Sir, thank you for the autopsy.
@virepri9871
@virepri9871 Год назад
Including DLSS benchmarks is a good idea, but it needs to be a separate chart, clearly indicating that the results are DLSS results. Personally, I tend to enable frame gen when available, as at higher refresh rates, frame gen making up a significant share of the frames is much less noticeable than at low frame rates (e.g. 90->175 is less noticeable than 30->60).
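A minimal frame-time sketch of why that 90->175 vs. 30->60 comparison feels so different; the numbers are purely illustrative arithmetic, not measured latency:

```python
# Frame-time arithmetic behind the comment above. Frame generation roughly
# halves the presented frame time, but input latency stays tied to the native
# frame rate (plus some overhead). Illustrative numbers only.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for native_fps, generated_fps in [(30, 60), (90, 175)]:
    print(f"native {native_fps} fps -> {frame_time_ms(native_fps):.1f} ms per real frame, "
          f"presented {generated_fps} fps -> {frame_time_ms(generated_fps):.1f} ms on screen")
# At 30 fps a whole ~33 ms still sits between your input and the next real frame,
# which is why interpolated 30->60 tends to feel worse than 90->175.
```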
@megumi_0
@megumi_0 Год назад
Great stuff on the power consumption calculations. It's pretty wild that over 3 years it might be cheaper to buy the older card instead of the newer one, even though it draws more power.
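As a rough sketch of that running-cost trade-off, here is a small calculation; the wattages, hours, and price below are assumptions for illustration, not the figures used in the video:

```python
# Back-of-the-envelope running-cost comparison between two cards over three
# years. All inputs are assumptions, not the video's numbers.

def energy_cost(avg_watts: float, hours_per_day: float, years: float,
                price_per_kwh: float) -> float:
    kwh = avg_watts * hours_per_day * 365 * years / 1000
    return kwh * price_per_kwh

old_card_w, new_card_w = 200, 160   # assumed average gaming power draw, watts
hours, years, price = 3, 3, 0.40    # assumed daily hours, span, EUR/kWh

savings = (energy_cost(old_card_w, hours, years, price)
           - energy_cost(new_card_w, hours, years, price))
print(f"Energy saved by the newer card: {savings:.0f} EUR over {years} years")
# If the newer card's price premium over the older one exceeds this figure,
# the "efficiency upgrade" never pays for itself within that window.
```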
@BikingChap
@BikingChap Год назад
This is a really good call-out and one I hadn't picked up on, and my Intel system is PCIe 3. I think you're right re DLSS and excluding it from frame-count charts right now, but it's worth debating if future versions give no discernible quality drop. In other words, from a user's point of view, do we worry about the quality and smoothness we actually see, or about how much raw performance the GPU has before subsequent processing? Hmm.
@mikekleiner3741
@mikekleiner3741 Год назад
Whenever I am able to afford a GPU that makes sense, I am not buying Nvidia. They don't care about gamers, just miners and now AI deep learning. 2017 was so long ago for my 1080 Ti, and yet there is still no reason to buy a GPU unless you are rich.
@mromutt
@mromutt Год назад
This is super helpful. I was actually wondering how the PCIe side would work; I wasn't sure if there would be some kind of smart negotiation from the card to run at x16 3.0, or if it is just dumb and reports x8 no matter what, or even possibly only physically wired for x8, so that no matter what it's plugged into, that's all it could ever run at. It sounds like it's a case of being wired x8, so no hope of even a future firmware patch. In the end it just means that even in the future, if the price on these drops to like $200 USD, it won't be worth it, sadly.
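If you want to check what your own card actually negotiated, nvidia-smi can report the current link generation and width; the query fields below are given to the best of my knowledge, so verify them against `nvidia-smi --help-query-gpu` on your driver version:

```python
# Query the currently negotiated PCIe generation and link width via nvidia-smi.
# Field names are to the best of my knowledge; verify with
# `nvidia-smi --help-query-gpu` if this errors on your driver.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=pcie.link.gen.current,pcie.link.width.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True)

# Take the first GPU's line; a multi-GPU system prints one line per card.
gen, width = [s.strip() for s in result.stdout.splitlines()[0].split(",")]
print(f"Negotiated link: PCIe gen {gen} x{width}")
# An x8-wired card in a gen-3 slot should report "3" and "8" here, which is
# exactly the bandwidth ceiling discussed in the video.
```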
@NeroKoso
@NeroKoso Год назад
7:50 in Finland we also have to add the electricity transfer cost, which makes the electricity bill double. Plus tax.
@Ghozer
@Ghozer Год назад
I'm glad you did the power-cost math... More channels should do this, and it's something that's been on my mind recently, especially with the energy crisis atm. "The TRUE cost of gaming": the cost of the electricity :D
@ionamygdalon2263
@ionamygdalon2263 Год назад
Great review! I strongly agree that other YouTubers should focus more on power consumption! PS: I found it very interesting to see those GPUs compared on PCIe 3.0. The last time PCIe 3.0 was an issue in benchmarks was with the RX 6500 XT and the RX 6400.
@taylorshin
@taylorshin Год назад
Because they had a whopping 4 lanes!
@PainterVierax
@PainterVierax Год назад
@@taylorshin Yes, the RX 6500 only dropped 4 lanes compared to the RX 5500, which is half of what the 4060 Ti dropped. And TBF this is less of an issue on the 6400, as it's still way better than every
@TheSkuldChan
@TheSkuldChan Год назад
Good video. Can you do a comparison with the PCIe 4.0 system to see how the numbers change?
@p3rf3ctxzer0
@p3rf3ctxzer0 Год назад
Hey der8auer, is there a way to limit overall power draw in, I don't know, an MSI or ASUS BIOS? What would be the effects? I am okay with slightly less performance from a setup that can over-perform. What I mean is: can I limit, let's say, max CPU draw, so that I am comfortable with having a max CPU clock of 4.0 GHz and keeping the GPU always at a max heat point of 80F?
@DragonOfTheMortalKombat
@DragonOfTheMortalKombat Год назад
This is Nvidia's 6500XT.
@ve3dy
@ve3dy Год назад
Can you compare an undervolted + OCed 3060 Ti vs a stock 4060 Ti? It seems interesting whether you can achieve the same power draw and the same performance as the 4060 Ti without any extra cost.
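The comparison being asked for boils down to a perf-per-watt figure; a tiny sketch of that metric is below, with placeholder FPS and power numbers rather than anything measured:

```python
# Perf-per-watt sketch for the undervolted-3060 Ti vs. stock-4060 Ti question.
# FPS and power numbers are placeholders, not measurements from the video.

cards = {
    "3060 Ti (undervolted + OC)": {"fps": 95, "watts": 150},
    "4060 Ti (stock)":            {"fps": 100, "watts": 160},
}

for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['watts']:.3f} FPS per watt")
# If the two land close together, the "upgrade" is mostly a sidegrade once
# the purchase price is factored in.
```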
@27Zangle
@27Zangle Год назад
Interesting thought!
@ShantalhaitianPrincess
@ShantalhaitianPrincess Год назад
Love the fact that you add the watt consumption to the performance charts; few if any reviewers do this. Also, I thought that our electric rates here in Florida were bad at 18 cents, but my family in France and Holland are paying 5x more.
@Tokamak3.1415
@Tokamak3.1415 Год назад
Between 4 and 10 PM here in CA our rate is $0.52/kWh. During the morning it's $0.39/kWh. The only way to get anywhere near $0.18/kWh is to play midnight to 6 AM, which is $0.26/kWh with SCE. SDG&E and PG&E in Northern CA are about the same. I turn off all lights in the house except the one room the kids and I game in, and we try to vent the heat out the windows, because adding AC load on top really adds to the monthly bill. Fortunately the kids can still get by on APUs; I fear the day they want real dedicated GPUs.
@ShantalhaitianPrincess
@ShantalhaitianPrincess Год назад
@@Tokamak3.1415 Same here in Florida, Georgia, Tennessee, and Texas; maybe those prices are in the Midwest states or places that use dams. Also, here in Florida the largest company, FPL, charges higher rates at night, smh. I'm moving to a South American country when I retire if things get worse here in America.
@knuckleheadcomputers
@knuckleheadcomputers Год назад
Great video
@pickle-o1151
@pickle-o1151 Год назад
PCIe 5.0 cards when? I am ready with a motherboard that will never use those sweet 5.0 GPU lanes
@IscAst4
@IscAst4 Год назад
x2 😢
@swizzler8053
@swizzler8053 Год назад
Maybe next year.
@einarcgulbrandsen7177
@einarcgulbrandsen7177 Год назад
Currently PCIe 4 is more than enough. It's gonna be a few years before any GPU needs PCIe 5. But it's good for marketing specs...
@kimnice
@kimnice Год назад
Well, the RTX 4090 shows barely any bottlenecking when using PCIe 2.0 x16 bandwidth (an RTX 3080 limited to AGP bus speeds, from 2003, is still faster than an RTX 3070 with PCIe 4.0 x16). It will take a few generations before high-end cards need 4.0. Maybe next year these cards with an insufficient amount of VRAM can utilize PCIe 5.0, assuming they also cripple the PCIe connector.
@GameBacardi
@GameBacardi Год назад
Ray tracing + DLSS = don't need them, not using them.
@sk0mi
@sk0mi Год назад
Finally! I asked on several channels about x8 on PCIe 3.0, and literally no one tested it or even thought about it, nor mentioned it at all. Several even said it would be a good upgrade from a 1060 or 2060S for budget-built computers. If I am running a 1060 or 2060S, then I am probably on a PCIe 3.0 mobo, maybe even 2.0, since there are mobos with PCIe 2.0 that can still push an overclocked i7 4770K very well. Thank you for this video.
@jedrula77
@jedrula77 Год назад
Thanks for the test. After the connector on my RTX 3060 Ti FE burned and the NV adapter died (yes, I connected it properly XD), I bought a new modding cable with a 6-pin power supply connection, undervolted to 0.925 V at 1950 MHz (0.9 V works perfectly, but I like full 100% stability), and now the GPU eats only 140-160 W. Thanks, NV, for your work...
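Undervolting like this is usually done through the voltage/frequency curve in a tool such as MSI Afterburner, but a crude way to get a similar power cut from the command line is a driver-level power limit; the sketch below uses an example value of 160 W, and the supported range depends on the card:

```python
# Crude alternative to a manual undervolt: cap the board power limit through
# the NVIDIA driver. The 160 W value is only an example; check the allowed
# range with `nvidia-smi -q -d POWER` first. Requires admin/root rights.
import subprocess

subprocess.run(["nvidia-smi", "-pl", "160"], check=True)  # set a 160 W power limit
# This won't match a tuned V/F-curve undervolt for efficiency, but it is a
# quick way to trade a few percent of performance for a much cooler card.
```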
@nickw6068
@nickw6068 Год назад
On the subject of energy efficiency: it would be good to hear about idle power. I game several hours a few times a week, but I use the PC for 10-12 hours a day, 4 or 5 days a week.
@ryutenmen
@ryutenmen Год назад
7 W idle, 13 W on video playback. It's all on Nvidia's page.
@centurion1443
@centurion1443 Год назад
Thanks for the video. This issue has also been raised in PCWorld's The Full Nerd, ep. 258.
@PinoyBlender
@PinoyBlender 11 месяцев назад
great review
@notwhatitwasbefore
@notwhatitwasbefore Год назад
THANK YOU! I was wondering if the techtuber crowd had all lost the plot a little. Many pointed out that last gen's 6500 XT, with its x4 PCIe 4.0, would be an issue for people on older systems, who would likely get much lower performance than those on the latest platform, yet Nvidia seems to get a pass on a significantly more expensive product having a similar issue.

It would have been nice to see the 4060 Ti on a PCIe 3.0 system compared to a 4.0 system, as I feel that's the only way to really see if it makes any real-world difference. Honestly, it's the only testing I actually want to see on this card: if it makes little to no difference in most games, then Nvidia's claims about improvements to cache and memory performance would seemingly be proven, but if there is a noticeable drop in FPS on the older PCIe standard, then that's probably the single most important piece of consumer advice about this product, and no one has done that test as far as I can see.

Again, there was lots of that testing for the AMD card on day one, and that was a card at half the price of this one that was probably of little interest to most gamers, being basically the RX 580 again; the 4060 Ti is much higher profile and will be looked at as an upgrade path by many people, despite what I would assume is a potentially huge bottleneck for those users. Shame no one picked up on it, but Nvidia always gets a pass from the press; this is just the latest example. Once again, thank you and well done for at least bringing it up.
@shyawkward
@shyawkward Год назад
The 4060 Ti will age far worse than the 3060 Ti due to the bandwidth limits, both PCIe and bus width. A good example is the RX 580, which can still play modern games on respectable settings, even at 1440p, but ONLY if you have the 8GB version.
@caribbaviator7058
@caribbaviator7058 Год назад
That's been their plan all along: buy the 4060 Ti now and upgrade to the 5060 Ti when it's out. The 2060 Super was my last midrange card from Nvidia.