@@club4ghz wrong and terrible comparison. When a GPU that is 4 generations behind performs similarly to a card 4 generations ahead, I'm not paying money to a greedy company.
@@MrBALAKUMARA You want to upgrade from high end to low end? The 4070 Super is $100 less than you paid for the 1080 Ti, 2-2.5 times faster, plus the latest technology.
@@joaonazario3764 if he lives in a poor country, as you seem to imply, then it would be an even worse decision to do that, and he should have better priorities or investments
Can you do 1080p on those 2 GPUs please? I wonder what FPS they would get, because clearly neither is cut out for 1440p in the latest titles. Maybe ultra or high at 1080p
Considering you can currently get a GTX 1080 Ti for around $200, the choice is clear. This card is still fighting after 7 years, which is impressive. I still have one and use it :). One of the best video cards ever made.
This shows that since 2017 there has been no significant increase in computing power in standard rasterization, and manufacturers are trying to save themselves with AI. If this GTX 1080 Ti had been equipped with the same units that handle DLSS-related calculations, it would be faster.
Proud owner of a 1080 Ti since launch here. You don't have to worry about power spikes like on the 30 series (nobody has tested this on the 40 series yet, so idk), so you can basically run this card with a decent CPU like an i3-12100F and get solid gaming performance even on a 450W PSU. I used to run a 650W PSU and overclock the card to 2GHz at 330W power draw with no problem.
It's insane that people will still pay 220 euros for a used 1080 Ti that can die at any time. Prices are that high in my country. I got a new triple-fan 4060 for 340 and it is an amazing card. I never had ray tracing before; it looks really good.
@@randallnguyen7405 that's great if you believe that card is good. I got a used Poco X3 Pro 256GB for $100, and the guy who sold it to me paid 300, but that was 3 years ago. The phone doesn't have a scratch and works fine for now. So there must be good used GPUs as well, but I just got a new one so I don't have to worry at all for the next 3 years. I'm glad yours works :)
Well, they're comparing the top card of the 1xxx generation against the lowest-end card of the 4xxx generation. And the 4060 is better in every game with lower power consumption. That's a win, I'd say
Well, judging by the price difference, there's no win to be seen yet. Frame generation is the only real feature of the 4060; basically the 4060 is junk. Go straight for the 4070 Super or nothing. At 1440p only a 4070 can play properly, and it's unclear what the 4060 is even for if it can be compared to the 10 series, which is about 8 years old now @@ИванБокк-ь1х
My dad said he had a 1080 Ti and he's coming back to our house to find it and confirm he didn't scrap it, so hopefully it still works and I can pour all of my budget into the CPU, RAM, and storage
Upgraded from a 1080 HoF to a 4060. It is slightly faster AND consumes about 75% less energy. Living in Germany with high electricity costs, this is a point to consider
Don't get me wrong, I am very pro efficiency, but that'll take time to pay off... 1) the 1080 draws about 180W, the 4060 about 100W, so that's not even 50% less energy; 2) assuming 30 c/kWh, 1h of gaming will cost you roughly 3 cents less; 3) assuming €300 for the 4060, you will need over 10,000 hours of gaming to make up your purchase; 4) assuming 3h of gameplay per day, you will need roughly 10 years for the slightly faster upgrade to pay off
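The payback math in that comment can be sketched in a few lines. To be clear, all the inputs (watts saved, electricity price, card price, hours per day) are the commenter's assumptions, not measured figures:

```python
# Rough payback estimate for the 1080 -> 4060 upgrade math above.
# All inputs are the commenter's assumptions, not measured values.

def payback_years(price_eur: float, watts_saved: float,
                  eur_per_kwh: float, hours_per_day: float) -> float:
    """Years of gaming until electricity savings cover the card price."""
    savings_per_hour = watts_saved / 1000 * eur_per_kwh  # kWh saved * tariff
    hours_needed = price_eur / savings_per_hour
    return hours_needed / (hours_per_day * 365)

# 80W saved, 30 c/kWh, a EUR 300 card, 3h of gaming per day
print(round(payback_years(300, 80, 0.30, 3), 1))  # -> 11.4
```

So under those assumptions the break-even point lands at roughly a decade of daily gaming, which matches the commenter's estimate.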
Everyone knows the RTX 4060 is actually an RTX 4050, since it has 8GB of VRAM (and also because there's a huge performance difference between the 4060 and 4070). Nvidia just slapped a 60 nametag on it along with a 60-class price tag. Edit: Sorry guys, it turns out I also made a mistake. I just decided to slap an Nvidia nametag on Ngreedia
Doesn't matter what the name is. They could have named it 4050 and the price would still be $300. That's the reality of how expensive node shrinks have become. AMD did not even dare to use 5nm on the 7600, most likely because using 5nm would have meant selling those 7600s below cost.
@@AdaptacionGamer Yes, for a side-by-side comparison like this it's no problem to read it, but it gets interesting when more components are compared, and at more than one resolution, for example
my 1080 Ti with a stable OC runs a 1987MHz core clock and 6000MHz on the memory, and I guarantee it's better than the 4060. I play on a 1440p 165Hz display. It's a beast of a card even a whole 7 years later, which is nuts.. 🤷🏻♂️
A 1080 Ti is about $180 used, and a 4060 is 230-250, with about the same FPS. Pretty cool to see the entry-level card matching a flagship beast of a card for a much lower cost than either new MSRP
Thought this was obvious🤦♂ Because it's a GPU benchmark, and using upscalers would only skew the results, especially against cards like this 1080 Ti that don't even support DLSS and frame gen
@@EX0007, I usually watch tests out of personal interest. I once had a 1080 Ti and I'm extremely interested in what it can still show with FSR3, especially since older video cards support this technology. Nothing prevents the tester from running both tests at once.
Maybe because frame generation isn't for everyone. It's noticeable for me, along with DLSS, FSR, etc. These features still have a long way to go; maybe 8-10 years from now they will be good, but increasing the price because of these still-in-development features is crazy.
Nobody playing a game will use frame generation; it's legit the worst tech they've released so far. The input lag is insane and the image quality is horrible. Not worth the "fake" frames it produces. As for DLSS, I do agree: not only does DLSS arguably look better, it also runs better in most titles, and the same goes for FSR on the 1080 Ti.
@@KelvinKMS I think people are too sensitive about Nvidia's price increases. To me it's obvious: they have better technology, which costs many times more in R&D than AMD's, and their commitment to supporting their excellent tools is worth the extra payment. In the end, TSMC's increase in processing prices over the last 2 years is the main reason. You can see AMD was affected too when their prices also increased, but because everyone focuses on Nvidia, it's rarely talked about. Combine that with AMD's inferior technology and low pricing to eat the small piece of the cake Nvidia left behind, and look at the horrifying amount of electricity the RX 7000 series consumes relative to its performance. I guess AMD will continue with terrible consumption of about 400W-600W for the RX 8900 XTX while the performance is only a few % better than the 5080 and it consumes twice as much power as the RX 7900 XTX. I'm doing fine with my 4080.
Sad to see how far the mighty 1080 Ti has fallen, but even more shocking is how pathetic the 4060 is for a current-gen card, as it isn't much faster, though it's definitely far more efficient
@@gametime4316 How does that refute what I said? And if it is a 4050, like people say, then it needs to cost way less. Remember that the 3060 beat the 2070, and in previous generations each card beat the ones before it, like the 1060 destroying the 970, and the 2070 beating down the 1080 and coming close to the 1080 Ti. It should have been no different this generation. Hell, even the 4080 completely dunks on not only the 3080 Ti, it even beats the 3090 Ti by a decent amount. The 4060 was a slap in the face of the consumer at its price point.
@danielkowalski7527 if you care that much about power draw use integrated graphics or an APU bruh. Graphics cards are meant to be graphical powerhouses not laptop graphics.
Cyberpunk 2077 with Phantom Liberty, Alan Wake II, Avatar: Frontiers of Pandora, Hellblade II: benchmark titles for graphics! Also, the native resolution should have been 1080p for this comparison (considering the 4060 is an overpriced 900p-tier card at best in terms of native res; after the RTX 50 series release it will drop down to 720p tier for sure)
It's a rebranded 4050 with the 4060 naming scheme and price attached to it. It's beyond a joke, and yet there are still deranged fanboys out there who defend Nvidia for this.
@@jspringer86 This is why competition is so important because it benefits consumers. It is rumoured that AMD has ditched the high end and enthusiast class segments for RDNA 4. However, if it ends up being super competitive in the entry level and midrange segments, then it could put a lot of pressure on Nvidia and hopefully convince them to pack significantly more processing power into their entry level and midrange GPUs for reasonable prices. It would be a shame if RDNA 4 didn't feature any high end and enthusiast class GPUs.
However, Battlemage is rumoured to launch later this year and is said to be competitive in the midrange segment. I don't think drivers would be nearly as big of a problem for Battlemage as they were for Alchemist. After RDNA 4 and Battlemage, RDNA 5 and Celestial should be expected to launch after that, hopefully not too far after. AMD is expected to return to the high end and enthusiast class segments with RDNA 5, and Celestial shouldn't be too far away from launching then as well. Leakers reckon Intel will compete in the high end and enthusiast class with Celestial, which should bring the heat to Nvidia and AMD, so hopefully there will be strong competition in the gaming GPU market within a few years. There is still hope left.
Maybe more VRAM would be beneficial, but the 128-bit bus did not hamper 4060 performance. People obsess too much over those numbers and forget about more important aspects.
@@Gamer-q7v what's important is the total bandwidth, not an individual spec like bus width. You can give any recent card the 512-bit-bus treatment and it is still useless if you pair it with GDDR3. And ultimately AIBs need to make money, so it comes down to a balance where AIBs can still make decent money for themselves. PCBs with wider buses are more expensive.
@@arenzricodexd4409 That's irrelevant, though, because no company in the right mind would pair a 512 bit bus with GDDR3. Bus width is important because it determines how quickly the GPU can access and store data in its memory. A wider memory bus allows for more data to be transferred simultaneously, which is crucial for rendering complex 3D textures in a virtual environment. A wider bus width also increases the overall bandwidth, which plays a big factor in gaming performance. Performance also scales much better when there is an ample amount of bandwidth to leverage the cores effectively, which ensures they are fed with data. While cache can help alleviate bandwidth bottlenecks, it can only do that to an extent. Wider memory buses and faster memory should definitely be top priorities for next gen GPUs.
@@Gamer-q7v that's my point. What happens when GPU makers use faster memory? They use a much smaller bus width. Ultimately they strive to find the balance appropriate for the product. The 4060's primary target is 1080p; there is no need to give the card excessive bandwidth when buyers don't intend to use the GPU at 1440p or 4K. The only issue is that Nvidia probably should have given the card more VRAM, but that is a separate topic. The sub-$300 market is at the point where it is almost impossible for AIBs to make money, so they need to reduce cost as much as possible.
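The bus-width vs memory-speed tradeoff this thread argues about is easy to put into numbers: peak bandwidth is just pins times per-pin data rate. The clock figures below are approximate public specs, used purely as an illustration:

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * data rate per pin.
# Shows how a narrow bus with fast memory compares to a wide bus with slower memory.
# Figures are approximate public specs, used only for illustration.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(352, 11))  # GTX 1080 Ti: 352-bit bus, 11 Gbps GDDR5X -> 484.0
print(bandwidth_gbs(128, 17))  # RTX 4060: 128-bit bus, 17 Gbps GDDR6 -> 272.0
```

So even with much faster GDDR6, the 4060's narrow bus leaves it with well under the 1080 Ti's raw bandwidth, which is why it leans on a large L2 cache instead.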
This is the funniest thing to me after just seeing a comment like "who gets a 1080ti for 1080p" with 10 laugh emojis from 5 years ago. If your goal is powering some exotic 500 Hz 1080p screen on modern games, a 4090 could be your 1080p card, and there are more and more games releasing where having 8GB or less VRAM is causing texture loading failures regardless of your goal framerate. It's all arbitrary marketing bullshit. Cards are simply good performance for the money, or they aren't. Most of these new cards aren't
RTX 4060 vs GTX 1080 Ti:
8GB vs 11GB VRAM
128-bit vs 352-bit bus
272 vs 485 GB/s bandwidth
118 vs 145 Gpixels/s pixel rate
236 vs 372 GTexels/s texture rate
All the important fights the 1080 Ti wins over the 4060 by a huge margin
@@KelvinKMS Yes, I played it a long time ago, and I played through many games of that era on this card. But now I keep up with the times and play modern single-player projects that are worth completing, while waiting for the RTX 5080
@Respbury The 1080 Ti gains like 15% from an OC, while the 4060 does not. With both OC'd they are on par, and the 1080 Ti actually beats the 4060 in some titles. It's an amazing card; I have one myself lying around.
@Respbury Absolutely not. I tested the 1080 and 4060 like 6 days ago, on like 6 games. It is not overrated, it is still that good. Both paired with a 5800X3D.
The top 2017 card draws 200-250W. You can pair a $150-200 1080 Ti with an i3-10100/12100 (65-90W) or an i5-10400/12400 (roughly 80-100W) on a 500W or maybe even 450W PSU. You can build this very cheap system NOW with incredible performance at 1080p-1440p high/very high settings in every game for years to come, and sometimes even 2160p if you want it. The top 2024 cards draw 400W+; an RTX 4080 Super or 4090 sometimes doesn't even work with an 850W PSU. Yeah, system performance with one of those cards will be insanely high in any scenario, but at what cost...
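A quick sanity check of that PSU sizing: the GPU and CPU wattages come from the comment above, while the "board/RAM/SSD/fans" figure is my own assumed placeholder, not a measurement:

```python
# Rough PSU headroom check for the budget 1080 Ti build above.
# GPU/CPU wattages are the commenter's estimates; the remaining
# figure is an assumed placeholder for board, RAM, storage, and fans.

def fits_psu(component_watts: dict, psu_watts: int, headroom: float = 0.2):
    """Return (total draw, True if it stays under the PSU rating minus a safety margin)."""
    total = sum(component_watts.values())
    return total, total <= psu_watts * (1 - headroom)

build = {"GTX 1080 Ti": 250, "i3-12100": 90, "board/RAM/SSD/fans": 60}
total, ok = fits_psu(build, psu_watts=500)
print(total, ok)  # -> 400 True
```

With a 20% safety margin the build just fits a 500W unit, which lines up with the "500W or maybe 450W" claim being tight but plausible.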
Guys, look at the RAM consumption, the watt consumption, and on top of that the 4060 has DLSS 3.0, which makes game performance so good with RT on. BTW, take a look at the CPU: would a low-budget guy buy that high-end CPU with a low-end GPU?
Not at all if you use FG on the 4060, and good luck enabling ray tracing on an RDNA2 or RDNA3 GPU. With the DLSS + FG mod even the 3060 blows past the 6700 XT. AMD GPUs are pure crap, and the Steam charts show nobody really wants them unless you pass on ray tracing, DLSS, and FG (buy a Series X then, lmao)
@@atharhazrianto1329 That's as shortsighted as when people said in 2020 "playing with DLSS sucks, lol". But of course FSR FG sucks a lot; you need DLSS-compatible hardware and the mod to make it look good. DLSS 3 doesn't have that problem at all. It's only an issue for AMD GPUs or pre-RTX Nvidia ones, due to the lack of AI hardware for upscaling.
@@dimcamus6454 honestly? I have a 4070 Ti and I have never played a single game with FG so far. In all the games I played that had it, I preferred the lower FPS (usually 60-90 FPS in heavy games) over double the FPS with FG but lower image quality and higher input latency.
@@gametime4316 I wasn't talking about which one is good or bad; I don't approve of FG too much myself, to be honest. My point is that a general comparison video should be more generalized, with the same settings, not your personal preference; people can decide for themselves. If you don't like the new features on the 40 series cards, that's fine: make a new video and explain it or whatever, and I'll probably be interested to watch. But turning off DLSS and FG makes the 4060 look like it has the same FPS as the 1080 Ti, and that's just not fair.
The 1080 Ti is considered the best ever, but I wouldn't blame Nvidia for hiking up prices; their GPUs are just top notch and always a step ahead. It's just sad to see it that way.
They are not gonna make the same mistake again. They want customers to keep buying their GPUs every generation. Making them buy 1 is not enough. Some companies actually go out of business because their products last so long and are so reliable. Eventually, everyone in the world will have this company's product and will not buy any more. Which is why one of the biggest competition companies have is themselves. They have to compete with their old products in order to encourage consumers to buy their new products
@@jasonzhu9742 I know, but games don't even suit the 3000 series GPUs well anymore, or even the low-end 4000 series. That's why the 5000s were said to focus on generating frames instead of rendering, because it's not ideal to ramp up power usage to play games at the highest presets in 4K or 1440p.
I am not ready to pay $1000 for a 1440p 60 fps card, whatever their tech is; this is the most dangerous mindset. The 1080 Ti doesn't support RT and DLSS, but the RTX 2080 Ti does. So what? I have to pay an extra $500?! A big NO. But you know what? They did! They succeeded in that. They released the 2080 Ti for $1200 (the 1080 Ti was $700) because of the "tech" they are using. They continued to hike the prices and are still doing it. The reason? YOU! You, the customers, feed their greediness and now we are paying for it. This is the mindset Nvidia is using to milk their customers, and their level of greediness is on a whole other level. A 1080p card is a 1080p card and a 1440p card is a 1440p card. I am not going to spend my hard-earned money because they have some good tech. I will pay only if the card delivers playable performance, not because of the technology they are using.
@@MrBALAKUMARA even Hardware Unboxed said people want AMD to succeed so that Nvidia lowers its prices. That's just the way it is; they are the best. The 1080 Ti is the GOAT just because of the price, but nonetheless its time is up. The 1080 Ti to 2080 Ti upgrade was for more frames and better technology: you paid for the engineers and the people who worked on its new features, not just the card itself. And yes, I want AMD to succeed so Nvidia lowers prices so that I can buy.
@@prabhakardhar1379 actually, yes. I have a 3070 at home and a 4060 Ti 16GB at work, and my 3070 still performs about 20% better despite having less VRAM. VRAM WAS NEVER WHAT DETERMINES GPU PERFORMANCE; it's the CUDA cores
Of course, stick an old video card into a new platform with DDR5, AM5, and a 2022-2023 processor. You should have tested a fully old-generation system against a new one; then, naturally, there would be a difference
Now do DLSS 3 and frame generation and it would beat the 1080 by 50 to 60 fps. People not utilizing the 40 series have no idea, and anybody not utilizing it is an idiot. Go buy AMD then, come on; it's time to accept that these things Nvidia does look amazing while improving performance insanely. Put it to you this way: last year even the 3080 had issues at 1440p in Cyberpunk, while the 4060 can get 60 fps at 1440p with DLSS 3. It's insane. We have comparisons of the 4060 Ti competing with the 3090 Ti when the 3090 Ti uses DLSS 2 and the 4060 Ti uses DLSS 3 and frame generation. We are talking within 5 to 10 fps, with the 4060 Ti winning in some games. That's massive. And the people acting like it doesn't look good are beyond ridiculous too. It's hilarious to hear people trash things they can't even see; yet those same people would drop from 4K or 1440p to 1080p on high, and meanwhile think DLSS 3 at 1440p ultra is going to look worse, haha. Funny mofos
First, if you use DLSS it's not real 1440p. Second, I used a 4060 in the past, just for testing; a friend bought it and I tested it. In Cyberpunk 2077, both the main game and Dogtown, even at 1080p ultra with DLSS I was getting drops below 60 fps. Third, when I turned on frame gen, dear lord, it was showing 75, 68, but with input lag of 38 to 45. An unplayable mess. So I decided to keep my RTX 2060 for now.
@candidosilva7755 even DLSS 3 alone is fine, and it's going to get optimized further along with frame generation. Disregard the frame generation; DLSS 3 on its own is amazing. Idc if it's not native. How many times do people have to prove that how it looks matters more than whether it's native or not? They have found ways to give you almost as much quality as native plus massive performance boosts. Not using it means you shouldn't even be with Nvidia; DLSS is always the reason given for buying Nvidia, and disregarding it is just ridiculous. It's why Nvidia covers 70% of the market this year, with AMD at 16%! Don't buy Nvidia if you aren't utilizing what they gave you for a performance boost. It's like saying "all I want is 1080p with high frames", then when you use DLSS at 1440p, "omg it isn't native!" So what? It gives you the extra performance and even turns their base card into a monster. I've never been as impressed with Nvidia as I have been with DLSS 3; it's legit why I switched. I wasn't a believer, but thank god I got some common sense and pulled the trigger. I'll sell the 4060 Ti now and go 4070 Super or 4070 Ti Super, we shall see. Regardless, I'm very impressed; I'm whopping my 7600, and with DLSS 3 it's a massive boost, and idc how they do it, I care about the performance and how far I can push the card. I'm also a sim racer, it's all I do, so even if I use frame generation I don't feel the lag as much since I'm not playing shooters. These cards were made for me, lol. Literally all over Reddit, when people ask Nvidia vs AMD, every comment goes "oh, DLSS, def go Nvidia", and for good reason now! The days of "oh, AMD is close" are gone. AMD cannot compete right now, no matter how people want to spin it.
@@reviewforthetube6485 bro, that was my experience, and I tested that thing hard for two weeks on several games with a 24-thread CPU. In Cyberpunk at 1440p, DLSS on quality at ultra, no ray tracing and without even the max settings (the ones you need to set by hand), the 4060 was at 60 fps, but with many NPCs together I had drops to 38 using the mantis blades with heat. The only way I found to keep the game running stable was 1080p ultra settings, 80 fps constant, no ray tracing or path tracing, just ultra. I didn't see the need to upgrade my RTX 2060: I installed the mod to use frame gen, DLSS on quality, 1080p high settings, and I get around the same fps as the 4060 at ultra settings, also with DLSS on quality and frame gen. Now, we can turn on some ray tracing on both, but it's a very choppy mess with crappy input lag; for me it's impossible to play on either with RT.