@@shyamlok bruhh you're so positive it'll be slower than a 4070 Super with 8GB of VRAM. Remember the 4060? That thing was slower than the 3060 Ti while also having less VRAM than the 3060.
Does anyone remember when the 10 series came out and the 1060 was just as powerful as the 980, like within 5% difference in performance or something, and we were mind-blown? I wish we'd get that nowadays but no.
Recently, you guys asked a poll question about B-roll footage. I would like to slightly amend my answer to: the balance is fine, but if we could maybe get B-roll footage (over, say, the outro of the video) of you guys doing the photoshoots for the thumbnail images, that'd be grand.
I've had an RTX 4090 for almost a year and a half. It's still overpowered in most of my games. It's not Nvidia's style, but I do hope they use the 50 generation to focus on improvements in the low to mid range cards rather than a 5090 that is 50% faster (and more expensive) than an already OP card. Every card should have 16GB VRAM at minimum, even at US$299, not to mention GDDR7 with a >128-bit bus and even DisplayPort 2.0.
Not to rain on your parade, but the 5060, if it ever comes out, will likely be an 8GB card for $400. It will likely perform about the same as a 4060 ti. Nobody will buy it but that's fine, Nvidia is making bank on the AI chips and the $2000 rtx 5090. They don't care!
It's overpowered in MOST gaming use cases. The one place where even the 4090 is not enough is in the super high end VR space where there are now headsets coming out that need to render more pixels than an 8k display
@@zwieback1988 sorry, my initial comment was quite unclear I reckon. I mean it's too late now to buy into this generation. I can wait another 6 months and I'll just buy a complete new PC (9800X3D, either a 5080 or the AMD equivalent). I don't know why I was so naive, but I wanted/hoped prices would settle lower and faster than they did. The current 7900 XT price point (1000 CAD) is the price point I wanted, but it happened so late I'm just going to sit this gen out.
@@karlhungus545 Def. They make way more money selling AI/pro cards, at that point selling gaming gpus is just a side thing that is not as profitable and a bad business decision when you think in financial terms only.
I managed to pick up an open box / ex-display (still had all the peels on the card, basically brand new) 7800 XT Merc for £375 while eBay had one of their discount codes for some sellers on. Upgraded from an RX 580; I just wasn't willing to wait any longer and I'm really happy with the upgrade.
People forget that we used to play consoles for 7 years at a time until the next gen came out. Now companies are trying to update their products annually to make you spend more.
yeah, the only reason I upgraded is cuz my PC was almost a decade old with a GTX 960 and Intel 4th gen. It was still usable, but I couldn't play Elden Ring, and The Witcher wasn't the best experience on it. If you don't even have something to do with a GPU, just don't upgrade.
Exactly this. I've had my 4090 since launch week and up to now I've only played about 8 new games total since then. Forget hardware, I want more interesting games.
didn't wanna wait anymore and got 12.5% off on a 4080 Super ProArt, which was already one of the cheaper models out of all the 4080s for some reason... with the AI BS going on I am not expecting 5080s to deliver much better price/performance vs. current ones... and I didn't want to wait that long either... upgrading from a 3060 Ti which is already struggling to keep up with my needs, so it will be a nice improvement... I don't love the price but whatever, it doesn't seem to get any better
I think at some point you have to accept that inflation is a thing. However using inflation as an excuse for the 4060 ti, 4070 and 4080 pricing, that's just silly.
@@ofon2000 It's not inflation, it's margins. They've been wanting to kill off the low-end for a few years now and focus solely on the lucrative halo products which make them the most money (even if some of the $15-$30 entry level graphics cards they might produce would still be priced in the
@@sulphurous2656 The Steam hardware survey is only one platform & there are many other platforms. You can't just take a single platform & claim that it's the whole market.
@@sulphurous2656 ?? just the 580 and 6600 combined have a higher percent share than the 4090, so saying that there are more 4090s than all AMD GPUs just doesn't make sense.
@@pinkgrognak It's a true statement when talking about the 4090 vs current gen AMD GPUs on the Steam survey, or at least it was at some point in the past (only the 7900 XTX makes the list, with about 0.4% usage, while the 4090 has around 1% on Steam). He either forgot or he's parroting something he saw but didn't quite understand.
@vagamer522 was just a touch out of my range. So I pulled the 7700x and paired it with a 7700xt. Might have been able to squeeze a 7800xt, but life has other priorities.
A few weeks ago I sold my 1.5-year-old RTX 4090 (basically for MSRP) and bought an RX 6800 XT for $350 to use until the release of the 5000 series of cards. But now I'm questioning if I even need a top tier GPU; all the games I play run just fine on the RX 6800 XT. Maybe I'll just skip that generation altogether
Unless you're running RT (and to be honest, only a handful of games look that amazing with it), it's all you need. Besides, running a bang-for-the-buck build gives some satisfaction too. Wise step imo
If the 5080 doesn't dethrone the 4090 and launches at $1200, then the 4090 price should hold on the second hand market, although probably between current prices and the 5080 price (below the price you got for it). I learned my lesson from the crypto mining boom: don't sell until the replacement card is in your possession. With Ch!na eyeing Taiwan, I'll hold on to my 4090 till I get a replacement card.
This is the crux. 4090 performance is very nice, but is there a game that a 4080 can't play? 4080 is nice, but is there any game a 4070ti can't play? Ok, I might not play in max settings 4K, but cranking settings down by like 10% and saving myself a casual £1000 is just a no-brainer.
@@Patrick-y4d1z a 4080 is a waste; get a 4070 Ti Super or a 4090. A 4080 Strix is at the MSRP of a 4090, so at that price spend a couple hundred more bucks for a noticeable increase in performance, especially if you want to run at native 4K
@@Patrick-y4d1z 4K is just an absurd waste of money. 1440p, or even 1440p ultrawide high refresh, will provide a better gaming experience than 4K 100% of the time. I fell into the 4K trap with the 1080 Ti and spent the last 7 years regretting it. Unless you have a larger than 34" monitor there is basically zero visual difference between 1440p and 4K, and for that you get way lower fps, increased power draw, more expensive cards that don't last as long, etc. A 7900 XTX is pushing way over 100 fps at 1440p in even the most demanding games; a 4090 will fail to hold 60 fps at 4K long before an XTX fails to hold 60 fps at 1440p, and I get the benefit of 165Hz at native resolution in many other games. Oh, and RT is dumb and will remain dumb for at least another generation, if not 2, but even if you like RT, 1440p monitors are still superior there too.
I've never had my own gaming PC. I've been saving up starting this summer. I'm looking at a 7800x3d and 7900 GRE. I don't really have anything else to use, such a tricky situation for me
Mate that is nothing to sniff at. Easy to cool, easy to get pretty cheap memory with, it'll be a really sensible build and has a pretty good medium term upgrade path.
If you are near a MicroCenter, they do have bundles that include a GPU. Their current AMD 4-piece bundle is 7800X3D, decent ASUS mainboard, 32GB G.Skill 6000 RAM, and AMD reference 7900XT, for $210 off the regular price. I live about 15 minutes from MC so I always get their stuff. If you don't live near one, I'm sorry.
You’re making the smart play with AM5. The 9800X3D will be compatible. Probably the 11800X3D after that. Considering that, don’t feel too much pressure to get a 7800X3D right away. Although it IS an awesome chip, you could alternatively get a 7600 for now with the plan to upgrade CPU to 9800X3D or 11800X3D later.
If the 5080 is similar in performance to the 4090 and only a little cheaper, that might not be worth waiting for. The 5090 looks to be the only new card with a big leap in performance, but that's probably going to cost an obscene amount. Thumbs up for great videos.
And what we keep on showing _ every _ time _ is that they can get away with it. If so, then why would they change? Start to understand that, and then you'll get the change we should see.
People keep buying it. It is the people who hold the power; the sheep are just too stupid to realize that if they hold out with their perfectly capable cards and force almost no sales of Ngreedia's new cards, Nvidia will be forced to drop prices drastically to please their masters on the board
For me the problem is the lack of increase in performance for the same price. A $250-$350 card is in the same performance tier as 3 gens ago. All that's changed is the extras like DLSS; RT at the low end really doesn't matter, as you don't have the FPS to spare for it.
Thanks for the video Tim, too bad that the prices haven't dropped as much as expected. I know that you guys are really not very fond of the Intel GPUs, but any news on the Battlemage lineup/price/launch date?
Why not buy now?! AMD won't launch anything this year, and neither will Intel; only the RTX 5090 will launch. So if you have a PC, even if it has a 1050 Ti, stay on it and wait, but keep in mind you'll be waiting until next year. You'll have to wait until next summer for discounts and new GPUs. I learned that the hard way with the last 2 gens.
That monopoly happened because no one is really interested in a market that's an expensive venture with little return. Otherwise we'd already see ARM, Imagination and Qualcomm discrete GPUs on store shelves right now.
@@arenzricodexd4409 I think there's a huge logic fault in your statement. If it's an expensive venture with little return, then why is Nvidia the most valuable company in the world? And where are those discrete GPUs, are they here in the room with us? Wishful thinking at best.
@@mc_sim Nvidia has existed for almost 30 years now. Back in the 90s there were dozens of companies competing in graphics accelerators. Where are they now? Why didn't they become trillion-dollar companies like Nvidia? Because the gaming hardware market didn't really bring in big money. Nvidia became a 3-trillion-dollar company because of AI, not gaming.
@@arenzricodexd4409 Partially I agree with you, but you still can't say this market isn't profitable. Intel tried entering it a few times, and definitely not for the purpose of losing money; their failure also doesn't mean it's not profitable.
We aren't Nvidia's customers. If not a single gamer EVER bought an NVIDIA GPU again, they wouldn't even care. AI is going to carry that company until everyone realizes it's a stupid bubble that bursts in 5-10 years.
I think if you're spending four figures or more you should wait. If you're gonna spend that, you damn well better be getting the best card(s) on the market, period, and you won't be getting that for long if you buy now. I'm gonna wait. Plus I've got the 1080 Ti, so not only could I wait out the rest of this generation, I could even skip the next gen and go for the upgrade in the 2026-27 timeframe if I really wanted to.
I'll only upgrade if I can either get 24GB VRAM for relatively cheap ($1000) or 48GB on the high end. No, I'm not buying a 3090 that was used for crypto mining.
For most of us with high end 30 series or mid to high 40, there should be no need to upgrade for a few years. The games on PC coming out should be fine with our current builds.
True, but also for some (most?) people: fomo + bragging rights + enthusiast builds, so the 5090 will still sell at whatever ridiculous price they set it to
I'd argue a mid range 30 series is just fine as well. Even a 3060 can work perfectly fine, since the AAA games are kinda trash lately. Elden Ring, Ghost of Tsushima, etc. run perfectly well on that.
@@ghosthunter0950 To a degree - but we can see how the hardware is starting to struggle, especially when pushing UE5 features, for everything else though, I think a 3060 is ok.
my concern with a 5090 or 5080 is the 12-pin connector and the melting issue, as the current leaks suggest more power is going to go through that weak link. Will it work or become slag? Btw I have no intent to buy, as I have a 4080 Super which maxes out fps on a 165Hz 1440p monitor; a faster card isn't going to give better fps, is it?
"a faster card isn't going to give better fps is it?" are you serious? Yeah, a 4090 for example will give you notably more fps, but if I were you I'd upgrade my monitor to a 1440p 240Hz OLED panel at least
@@iDeparture but if your monitor is capped at 165Hz, how can a 4090 give you 200fps? It can't, as that breaks the laws of physics. I'd have to buy a higher refresh rate monitor or a 4K monitor to see the improvement the 4090 has to offer. I like my monitor; it has true black and HDR 600
@@tommyrotton9468 oh, I thought you just meant an increase in performance. And LCD monitors (IPS, VA and TN panels) don't have true blacks because they use a backlight, and they don't have true HDR because true HDR requires 10-bit or higher color depth, at least 1,000 nits to accurately display the increased contrast and color gamut, and local dimming. Some can come close, but trust me, get an OLED and you will not regret it
@@tommyrotton9468 the point is that you get a better experience with a faster card because you can't just look at average FPS; you have to look at 1% lows and even 0.1% lows. Let's say your 4080 maxes out your 165Hz screen: it does that in the AVERAGES. It will still have frame drops; the 1% lows can be like 70 FPS, for example, and the 0.1% lows can be like 30 FPS. That's game dependent, but the 1% and 0.1% lows are basically always lower than the average (at least if you don't play Tetris with a 4090 locked to 30 FPS, in which case there wouldn't be a single frame drop below 30 FPS, obviously). In more intensive games there's almost always room for improvement in the 1% and 0.1% lows, and those lows tell you how consistent your gaming experience is, basically how many hiccups/stutters you get. The closer the lows are to your average, the smoother your experience.

So yes, it will make a difference: the 4090 will spend more frames at the 165 FPS monitor limit, with fewer frame drops below that, while the 4080 will have worse 1% and 0.1% lows, aka a less consistent experience. Also, in new intensive games, especially UE5 games, you will use DLSS upscaling even with a 4090 to get to 165 FPS at 1440p, especially if you want to play maxed out with very high RT where available. And if we talk about PT (path tracing), you won't get close to 165 FPS, not even with a 4090. So there's always headroom, and as said, there's always room for higher 1% and 0.1% lows with a faster card.
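The 1% and 0.1% "lows" described in that comment are usually computed as the average FPS over the slowest slice of frames. A minimal sketch in Python of that idea (the `percentile_low` helper and the sample frame times are invented for illustration, not taken from any benchmarking tool):

```python
import statistics

def percentile_low(frame_times_ms, pct):
    """Hypothetical helper: FPS averaged over the slowest pct% of frames.

    frame_times_ms: per-frame render times in milliseconds.
    pct: 1 for "1% lows", 0.1 for "0.1% lows".
    """
    worst = sorted(frame_times_ms, reverse=True)        # slowest frames first
    n = max(1, round(len(frame_times_ms) * pct / 100))  # size of the slow slice
    return 1000.0 / statistics.mean(worst[:n])

# 100 frames: mostly ~6 ms (~165 fps) with three stutters mixed in
times = [6.0] * 97 + [20.0, 25.0, 33.3]
avg_fps = 1000.0 / statistics.mean(times)  # ~151 fps average
low_1 = percentile_low(times, 1)           # ~30 fps 1% low
```

The gap between `avg_fps` and `low_1` is exactly the "hiccups" point above: the average still looks high while the worst frames dip to around 30 fps.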
@@tommyrotton9468 I plan to buy a 5090 (I definitely will buy one) and I will pair it with a 240Hz 4K OLED. Even if the 5090 is like 40-60% faster than a 4090 (which it will be), I don't expect to come close to 240Hz in intensive 4K games. I will in some games, but I absolutely know that I won't be able to max it out in really intensive games (that currently exist) before I upgrade to like a 7090. And that's only if the newest games by that point aren't even more intensive than the current ones lmao. And then I'm sure that I could increase the 1% and 0.1% lows even further by using an even faster card with the same monitor. So there's always room for improvement, and if it's not for current games, it will be for future games. Because one thing is for sure: upcoming ultra high fidelity AAA games won't get easier to run.
I'm fully prepared to wait it out till Rubin/RDNA 5 and pair it with a mature AM5 platform and Zen 6 X3D, my 5800x3D and 3080 12gb are serving very well paired with an odyssey G7, until we see some competent 4k performance from sub thousand dollar GPUs it won't be worth it to upgrade for most people on a 3060 or better.
I have been with Team Red since the ATI HD-5990 days and currently own a 6800 XT. However, the time has come to change sides. The constant crashes and driver dropouts have become a joke over the past 4 years. That's why I'll be switching to Team Green and getting a RTX 4080 Super once the 50 series cards have been on shelves for a few months. I might even get a 50 series card, but it all depends on what prices look like. It's been a ride, ATI/AMD.
There are also rumors that the memory bus width and amount will be the same as RTX 4000 series, so it looks like my 3060 TI would have to serve me for one more generation. Oh well, 6060 TI it is then. Considering I was lucky enough to find it right after it came out in December 2020 before scalpers got their hands on it, I am extremely happy with my purchase. I just hope it doesn't die out unexpectedly. I also wish AMD gave more competition, especially where ray-tracing is concerned but they seem to be as content as nVidia.
Makes me sad we won’t be getting a 8800 XT or 8900 XTX. Though, I guess that just means that my 7900 XTX will need to carry me through to the 9900 XTX 😅
My good old 1080 Ti is starting to struggle quite a bit at 1440p in newish games. Even lowering the settings and turning on FSR, it still struggles to hit the 60fps mark or anywhere close to it. With the new 5000 series at least 6 months or so away, I'm considering going for something a little more modern that can hold 60fps at high to ultra settings.

AMD's FSR upscaling still has pretty bad image quality, especially when it comes to shimmering, ghosting and noise, compared to DLSS. A bit of ray tracing is nice to have on as well, along with the Topaz video/photo upscaling that I use, which is why I prefer an Nvidia card. I don't want to spend too much, as GPU prices in Australia are already ridiculous as it is. The 4070 Super can be found for $1000 here. I intend to keep it for at least a year or so, or even longer if it can maintain decent performance (60fps) in upcoming titles, especially the Unreal Engine 5 ones. My two monitors are only 60Hz, so it doesn't have to push crazy high 144fps.

Is it a good upgrade from my current card? Happy to hear others' input/opinions or suggestions. For reference, my current setup: 13900K, 32GB RAM 5600MHz, 1080 Ti, ASUS TUF Gaming 1000W
For longevity's sake ya really want the 4070 Ti Super with 16GB of VRAM (more so if you intend to sell it on and put the funds towards another upgrade in the not too distant future, as I imagine the others won't age so well), that is if you absolutely need to go the Nvidia route. Personally I think you'd be far better off getting yourself a 7800 XT and using whatever money you save to upgrade ya monitors for a much better overall experience. If ya can hold out longer and get yourself one on the second hand market, there are often some good deals to be had if ya know where to look, i.e. places like Marketplace in rich neighbourhoods, with people who have more money than sense and don't know the true value of things, or who just want a quick sale so they can upgrade to the latest and greatest whatever the cost and simply wanna make a few hundred bucks back.

Can't say I know much about Topaz, but a brief look says it supports AV1 encoders and they're working with AMD to improve things, so maybe worth keeping an eye on. No point worrying about FSR or ray tracing if ya locked down to 1080p 60Hz. Personally I'd ditch the 13900K at this point and get a 7800X3D along with a second hand board and 32GB DDR5 6000. Sell the CPU, mobo, RAM, 1080 Ti, monitors and the 1000W PSU together or separately while they still hold value, and pick up a cheap 750-800W one. Of course it really depends on ya personal needs, but I think many people often over-spec without realising it, and figuring out what ya don't need is just as important as what ya do. If ya juggle things right and can make some money back on what you already have, then ya can get yourself a good upgrade. By my estimates (using some conversions since I'm in the UK), let's say you can make $2000 (AUD) from selling absolutely everything. I reckon for $3000 (AUD), give or take, you could upgrade everything as well as buy a new monitor, based on a build I've been estimating putting together, and that's all from new.

If ya go the second hand route then ya can save a bit more money or get yourself a better case/mobo or whatever for half the price, so ya might not save more, but at least ya system will be worth more, which will add to the value if you decide you wanna sell it and go back to the overpriced Intel/Nvidia route. It all depends how much effort ya willing to put in and whether that saving's worth it to you, but as someone who's been studying this shit for the past 4yrs wanting to upgrade myself, this is the most current logical conclusion I've come to thus far. Another option would be to just sell the 1080 Ti and pick up a cheap 3080 Ti for the time being, then sell the system as a whole. That's just what I'd do if I were in your shoes, or what I'd be aiming for as a bare minimum over the next 6-12 months, and of course as prices continue to fall maybe my recommendations will change to getting a 7900 XT or XTX. I mean, there was a time I was looking at £2000-2500 ($3800-4700 AUD) for a new build, but now that I've got things a bit more dialled in to my actual needs I'm looking at spending more like £1300-1600 ($2500-3000 AUD), and maybe with a lil thought you can do the same.
Doesn't matter when. Most 50-series cards will be slower than their 40-series counterparts (according to most who have way more experience than me), and if the flagship 5090 is going to be like the 4090, I'll hard pass. Still seeing repair videos from people who are fixing 25 4090s a week from the power socket burning out. Got a 7900 XTX, more than happy; I was going to go 4090, but I'm not dropping 2k for a card that'll melt connectors.
I consider the 7700XT / 4060ti to be high end cards. Also, most games that I play don't even use a 5600XT to 60-70%, I might wait a year or 5 before upgrading
I really hope RDNA 4 does super well, but I hope they make a massive, massive leap. The issue is we hear that RDNA 4 is around 7900 XT performance at the high end. That's not great for a new gen launch, and they said RDNA 4 was going to be a monster, yet here we are. Lol. But if the leak about a 7900 XTX level card at 400 dollars is true, they may annihilate the market. The issue is, as we've seen even with the outstandingly priced GRE, people are still paying more for Nvidia, so I guess AMD has to really undersell things to even keep up. Yet the 6000 series from them was just incredible and so many more people bought AMD then; this switch to Nvidia has been crazy to see for sure.
Wait in either case; price drops coming on some current cards might be worth it later. If you don't have a card, buy a cheap one that will get you by for now.
A Radeon 7600 for under $200 (USA) is not too bad for entry level imho, I see those deals here all the time these days. Great for 1080p gaming, maybe some 1440p even. Was curious about the equivalent Intel stuff, but they seemed to be power hogs by comparison, and I'm still waiting to get the verdict on if Intel has finally learned to make halfway decent GPU drivers - their integrated stuff was never well supported in the past.
I am curious about RDNA 4 pricing for midrange. Also I am curious about its RT performance and features. The second iteration of Arc is interesting too
the leaks are saying the ray tracing performance should be a significant upgrade over RX 6000/7000, but I'll believe it when I see it. Also, the same GPU architecture could power the PS5 Pro.
I built a new machine in March of 2024 with a 7800X3D and 7900 GRE. I'm very impressed by the 7900 GRE for the price:performance and it has 16GB VRAM. With a simple OC in Adrenalin it trades blows with NVIDIA competition for less money. The 7900 GRE is my go to recommendation this generation for people shopping in the ~$500 tier.
My current predicament is that my gpu is dead. I'm not upgrading, it's completely dead. Now I want to get a high end card, but it's at a time where that would be very stupid. But I really need a functioning pc for daily use and gaming.
11:48 Then it's going to be another skip generation. I can't wait to skip it! 👌 *They can shove their 192-bits.* 😘 256-bits on the 70-class or gtfo. ALL 70-class cards in the last DECADE came with 256-bit, since the GTX 670. Nvidia really screwed the pooch with the 40-series. Then they wonder why lowest sales in over 20 years. 🤔 Gee... I WONDER WHY.
@@darudesandstorm7002 That's because everyone and their grandma thinks they can make money from AI. And other companies' shareholders demand AI slapped on everything because they think it will generate more money for them. They're wrong. It will be the same as with the crypto shit. Nvidia will milk them for everything, then move on to the next best thing. I'm kind of sick of hearing "AI". AI mice, AI hats, AI monitors, AI apps, AI thermal paste, AI chewing gum. ENOUGH already! Jesus! The Nvidia CTO came out in 2023 and said that crypto is useless to society (implying that Ai sUrELy iSn'T). The bubble WILL burst and the other shoe will drop. We just have to suffer through ridiculous prices rn, by holding off on purchases.
Who wonders here? Who is "they", I WONDER. Obviously, "they" isn't Nvidia, because they obviously know what they use their wafer allocation for and which market to milk for them juicy profits...
@@elgonzo7239 There were numerous articles last year when the 40-series came out saying that this generation had the lowest sales in over 20 years. Because all of the GPUs (except for maybe the 4090) were shifted by 1-2 tiers: - the "4070" is really a 4060. - the "4060 Ti" is really a 4050. - the "4060" is really a 4030 Ti. The REAL 4070 is now $800. And is called "4070 Ti SUPER". Same price as the 1080 Ti was (arguably the best GPU Nvidia ever made). 😐
I wouldn't expect too much from Battlemage. From the leaks there's no info from card partners, so it's not even planned for production yet; some even suggest June 2025. Also, depending on what they do, with drivers fixed they can ask for more money, and the A770 wasn't cheap. I'd like to remind everyone that it uses 400mm^2 of a 6nm node; the 7800 XT uses 360mm^2 of 6nm as the closest contender... so if it's not sold at cost or below, it could go up substantially in price
Most I spent on a gpu was nearly $500 and over $350 on a cpu. But these days I will never do that again for a cpu as I'll stick to $200 range and try between $300-400 for a gpu so I agree with this.
with the 5000 series, GPUs will cost more than a whole computer... What you said about shelves full of older models, that's not true, at least where I live (in Canada)... stores are full of NEW models ONLY.
don't buy now wait until the new ones come out, then when the new ones come out, wait even longer because the prices will drop, then continue to wait until the new ones are about to come out
IMO if it's less than a year until new generation and you don't desperately need a new GPU, you should wait. You shouldn't wait for price drops though and you probably shouldn't wait more than a year.
Just sold my series X and SSD for a 3080 Strix 12GB… CPU will be my next upgrade 9900k even at 5.0ghz is showing its age for 1440p 170hz… hoping the 5070 comes in at
Judging by success of 4090, it seems that NVIDIA doesn't really need to significantly improve fps/$ value, but just add some more raw performance, as apparently there are enough people out there willing to pay a top dollar for top performance.
can't wait for green team... I bought a day 1 2080... a 3090 (that was the horrible covid time, had to wait 45 days)... a 4090, and now a 5090. Hope it will be in stock...
Well, hopefully RX 8000 will finally be an upgrade from the RX 6000 series. As for Nvidia, I don't have much hope. Intel: it would be cool to see a well priced 1440p card. The saddest part is that for doing local AI stuff with LLMs, Stable Diffusion and other models, you are heavily CUDA dependent and need tons of VRAM, so multiple dated, used RTX 3090s are still the best option
Historically, when AMD ditches the high end, they launch an amazing card for the price. Think about the RX 480 and RX 5700 XT. I just hope that can be true for the RX 8800 XT or whatever it will be called.
my prediction is the 5090 will be a beast in performance but again priced like it's made of gold; the 5080 and cards below will be similar in performance to their 40-series counterparts, but Nvidia will release DLSS 4.0 and gatekeep it to 50-series cards, so if you want new features you'll need to upgrade.
I'm on a 1050 Ti. I want to upgrade but I just don't know what to buy. I'm looking for a 1080p card; the 4060 seems like the best option to me, but all the controversy around it makes me unsure about it. I could get the 3060 12GB, it's cheaper and 12GB VRAM could come in handy with bad ports, but it has a higher power draw and power is not cheap where I'm from, and I'm on a 450W PSU, so the 4060 seems better. Also, I've never gotten to try all the cool AI RTX DLSS features. I just think it would be really fun to mess around with them in games, and I'm worried that if I get a 30 series card I might not get newer features.
I've got a 6900 XT, which is a pretty killer card for 1440p, but after selling my monitor, getting an OLED TV and hooking my PC up to it, I feel like I could use a little more grunt for playing at 4K. The 7900 XTX doesn't seem to be that much of an increase and I hate Nvidia's new power connector on the 4080, soooooo guess I'm waiting a while lol
Extremely happy with my 7900 XTX and 7900 XT. Likely won't have a need to upgrade this generation, but it's still gonna be upsetting if Radeon takes a step back and doesn't have any high end cards to at least compete with the 80 series like they've been able to do the last couple generations.
Gone are the days when you could expect a 5x to 10x improvement in a generation or two. Compare a 4070 Super to a 1660 Super: four years and three generations later, the 4070 Super is not much over twice as fast, but uses nearly twice the power. The next generation may have a board that costs 5x as much as a 1660 Super did, but it won't be 5x as fast. Lots of people look excitedly to the next generation of GPUs hoping that finally we'll see real progress again, but I think those days are behind us.
I run an RX 6800 that I just got. I usually buy a card these days when I see a bargain. I bought the 6800 for the vram really. I was running a 6700. The last bleeding edge GPU I bought was an R9-290. I got a great deal on it ( $200 ) and a free game. I think it was a Far Cry game, but it could also be DiRT 3. In the early-mid 2000s I had new cards every 6 months. Then when all the DX upgrades went down I bought cards...the R9-290 was one such purchase. I think I bought 3870s for DX 10 and HD 6850s for DX 11.
In short, the prices are still ridiculous, and if you're aiming for a high end or midrange GPU you might as well wait for the next gen or November Prime Day, unless you want to lose $200 or more on the GPU within a couple of months. At the entry level it might make sense to buy now, as that tier won't be replaced until next year; waiting a year for it is very bad advice. The only brand new entry level GPUs I recommend are the RX 6600 8GB @ $200 and the RX 6750 XT 12GB @ $300. When next gen launches they won't lose too much value, maybe at worst $50-70 less.
Planning on building my first PC around November when there are deals. My current prebuilt has a 1660 Ti and I'm thinking of getting the 7900 XTX. That should be a good upgrade, yeah? I don't have to wait until next year to get a better graphics card?
For me, the used market and older games are where the fun is. Both are cheap. Though I've been rocking a UHD 620 for 4 years now, and I am not complaining... much.
Looking to have a dual-PC streaming setup this year. My plan was to get an AMD high end GPU for the gaming PC, but I just found out they aren't going that route. Should I buy the current RDNA 7900 XTX, or should I wait it out and get the new mid range GPU from AMD? Mind you, it's gonna be one PC for gaming and one for streaming! Not sure if I need the high end, but I also don't want to be needing an upgrade within the next 5 years
a couple of weeks ago I bought a used 6800 XT Red Dragon for $280. Even after three years of mining, the card remains quiet, cool and offers high performance. great option to wait for RDNA5
Nvidia currently has an absolute monopoly in graphics cards. AMD's RDNA 4 does not have high-end models, so the RTX 50 lineup is even more arrogant. The raster performance improvement is small, but what can you do? Nvidia doesn't care about players' feelings at all. The 5060 and 5060 Ti only have 8GB of video memory, the 5070 even uses the cut-down GB205, the 5070 and 5070 Ti still have 12GB of video memory, and the gap between the 5080 and the 5070 is even bigger.
I'll be looking for a 4090 once the 5000 cards have launched. I would have jumped on a 4080 if it had launched at $1000 but they reduced its price too late. These cards are for playing games and anything over $1000 is way too much I'd rather just reduce my graphics settings and stick with what I've got.
I'd like to say the prices are too high across the board, but considering morons keep buying 4090s at $1600, i have lost all hope in the pricing being corrected.
As someone who is considering a 7900 XT currently it puts me in a hard spot. I want to buy AMD but not sure if I can wait until the new generation is "announced" early next year. My current card is a GTX 1080 Ti so it's not like I'm hurting for frames in the games I play just yet. The itch to upgrade is getting bad though since I've had this card for 6 years now lol.
I was also thinking a lot about just waiting for next gen or for prices to drop more, since my 6600 XT was performing decently fine, but at that point I would just be waiting forever instead of playing the games I want to play. I upgraded to a 7900 XTX in February and I'm very happy with it; high resolution, high refresh rate VR is amazing... Now my CPU is a massive bottleneck in some scenarios lol.