Not sure if anyone has mentioned this, but the ARGB header is an *output* header. It's meant to power additional ARGB lights; don't connect it to your motherboard, as that may result in a short.
Good shout!! Hopefully people buying this will see the note. I picked up on that too after seeing it on another video. It could fry the mobo or GPU if you're unlucky. The video could really do with an edit so people aren't caught out, as there are apparently zero instructions included with the card.
No, it will not "maybe result in a short", it will result in a short and break the RGB, or the entire GPU in the worst-case scenario. There will be a short if you connect it. DO NOT CONNECT IT TO THE MB!
"My 500w GPU is overheating in my build with no intake and most likely a CPU drawing 250w itself. I don't understand how AMD would release such a bad card." -Amazon reviewer, probably
Picked up an XFX Merc 310 7900 XTX last week on sale for $30 over reference to replace my 1070 Ti. I didn't even have it a day before I swapped the stock fans out for 3 quieter 120mm fans. Crazy this cooler is so big it actually fits 3 120mm fans like they're supposed to be there! lol My cooling performance suffered slightly, and now it reaches a max temp of 63C instead of 55C, with a hotspot of 81C, but I can hardly hear the fans anymore. Even with the "worse" cooling performance I'm still hitting 2.9GHz regularly while gaming, and that is on the stock profile, but with the VBIOS switched to the higher power target. So, still not bad cooling when you realize I'm pulling 450-500W. lol It's paired with a 5800X3D, and at 4K with the 7900 XTX it's a perfect combo. I have had some driver issues/timeouts while playing some games, but nothing major. I upgraded to a 43" 4K 144Hz monitor last Cyber Monday, and really had to turn settings down for my 1070 Ti. The 7900 XTX was really the upgrade I was hoping for, and I'm able to use this monitor to its full potential. Overall, very satisfied, and would recommend.
@@polomarco7053 I had just zip tied them to begin with, then I ended up designing and 3D printing a fan shroud adapter that bolts onto the card, and allows the fans to screw into that.
@@swizzler8053 Yeah, it was available here in Australia when others were waiting for 3080s. I paid top dollar for it but just didn't have enough money for a 3090; I wanted the most VRAM I could afford. The 7900 XTX is going for the same price I paid for my 6800 XT, AU$1900, or a bit more for the Taichi.
Like all first-generation designs, it needs time to iron out the many quirks and bugs. Personally I am still waiting for the 7800 and 7700 models to be announced to see if they will be worthwhile to get.
I think the performance won't be anything more than the 6900. Might have RT performance equal to a 3080, but the 6900 is probably cheaper now than the 7800 will be for a long time.
They didn't change a damn thing, so I'm not sure what that means... The design of the card is exactly the same. And why you'd buy one of those over just getting a used 6800/6900 XT is whatever, but hey, do you.
@@96kylar RDNA 3 is the first MCM card on the market for regular consumers. Guess you didn't know that. The Infinity Cache and the GPU use different nodes. It's obvious they are breaking through with something that will take some time to reach its full potential, just like Ryzen and chiplets.
I just grabbed this same card. It arrives tomorrow. I dumped my 3080ti because I kept running out of VRAM. I can't wait to install this thing when it gets here. I got the same deal as you, but paid $999 since it was new. Still a steal with the free game if you ask me
I recently upgraded to the 7900XTX, and while I did go with Asrock as a brand I chose the reference version of the card. I had specific requirements, I needed only 2x8pin connectors, and because of case clearance I needed the short length of the reference version. I also really like the vapor chamber cooler over traditional heat pipes. Out of the box this card is really fast but I undervolted it and it’s an absolute beast. Running around 2400-2500mhz on the card most of the time, and temps never above 65c (80c junction). It’s relatively quiet and runs nearly all my games at 4k 144FPS (with or without FSR, depending on the game). I upgraded from a 3080 and I couldn’t be happier.
Same, I got the Sapphire Pulse, and I run at 2150 to 2350MHz at 0.765V. Temps are 75C and 73C for hotspot and memory at 65% fan speed, at 270 to 330W depending on the game.
@@paranikumarlpk I got the Phantom series one hoping it doesn't overheat or have issues of that sort. Bought a 2k build off of PCPartPicker that someone recommended, so I'm hoping it's good.
@Dfiyz I searched a lot and found Sapphire makes very high quality 7900 XTs and 7900 XTXs. Lots of Reddit users posted their OC and undervolt results and I'm very impressed. Also I have an LG C2, which only has HDMI ports. Sapphire has 2x HDMI, which makes more sense for me, and it's pretty good XD
The quality difference between the ASRock Phantom Gaming line and the Challenger line is night and day. I bought a 5700 XT Challenger and it had very poor quality plastic. I bought a Phantom Gaming 6800 XT and it's super solid.
I picked up the XFX 7900 XTX and it is Godly!!! I've played/am playing TLOU and Jedi Survivor, Etc. all completely maxed on Day 1... No Problem!! Hey, Star Wars Survivor...you're using 23GB+ of VRAM you say...!!!?? Forget about it!!! I feel like I can hear people without as much VRAM crying while I play at native 4k/60 max on everything... it's hilarious...lol
I can tell you right now, it has nothing to do with the two 8-pins. Two 8-pins are rated for more current than 12VHPWR. I know people will throw out the "150 watt limit", but that limit is a PCI-SIG specification on the GPU side and has nothing to do with the load rating of the Mini-Fit Jr standard or the wire. Ironically, a lot of GPUs don't even adhere to that rating anyway, since they know that even with crappy 20ga wire, a single 8-pin is rated for about 250 watts. Buildzoid and Jonny Guru have talked about this; people are conflating an arbitrary number that PCI-SIG came up with, with Molex's official load ratings for the Mini-Fit Jr standard. Even on 18ga wire, it's rated at 8.5 amps per terminal, which gives you 612 watts for two connectors. If you've got 16ga, it's rated up to 10 amps, giving you 720 watts for two 8-pins. After all, Corsair isn't using 12VHPWR on the PSU side; it's using two 8-pins over 16ga wire to 12VHPWR. They wouldn't be doing that if it couldn't handle it. So, the stability issues on the ref card are related to something else, not the fact that it's only two 8-pins.
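The wattage figures in the comment above can be sketched as a quick calculation (a sketch, assuming the per-terminal amp ratings quoted in the comment and three live 12V circuits per 8-pin connector):

```python
# Rough physical capacity of PCIe 8-pin connectors, using the per-terminal
# Mini-Fit Jr ratings quoted in the comment above.
# Assumption: each 8-pin PCIe connector carries three live 12V circuits.
LIVE_CIRCUITS_PER_8PIN = 3
VOLTAGE = 12.0

def capacity_watts(amps_per_terminal: float, connectors: int = 2) -> float:
    """Total wattage for `connectors` 8-pin plugs at a given per-terminal rating."""
    return amps_per_terminal * VOLTAGE * LIVE_CIRCUITS_PER_8PIN * connectors

print(capacity_watts(8.5))   # 18ga wire: 612.0 W for two 8-pins
print(capacity_watts(10.0))  # 16ga wire: 720.0 W for two 8-pins
print(150 * 2)               # PCI-SIG spec limit for two 8-pins: 300 W
```

Which matches the 612W/720W numbers above and shows how far the physical rating sits above the 300W that the PCI-SIG spec nominally allows for two connectors.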
I thought it was the vapor chamber and AMD offered replacements to those affected? The title of this vid makes it seem like all reference models are faulty
@@blueversace4447 That was an issue, but it was more of a temp issue... which could cause stability problems, but if his temps are fine, it's not the reason for his stability issues.
@@TheGameBench Unless it's a problem with overheating on a part of the die that has no temp sensor. Could be poor contact with the vapor chamber in a small spot.
I have the Taichi version and it's easily the nicest card I've ever owned in terms of build quality. I've always been a Sapphire/Powercolor kind of guy, but I think I'm an Asrock user from here on out. Makes me want to try their Taichi motherboards now.
As someone who was all in for Sapphire on AMD cards, I am very happy with my Phantom Gaming 7900 XT; the thing is built really nice, so I'd bet the Taichi is even sturdier. With the Sapphire models being $100-300 more than the ASRock models, it's hard to justify the extra cost. ASRock also has a 3yr warranty vs Sapphire's 2yr 🤷
Looking at RX 6900/6950 XT cards, ASRock's top-end models like the Taichi or OC Formula have some of the best build quality boards; their fans/cooling are the weakest part of the build. Sapphire and PowerColor PCBs were close to the reference models, but with better cooling than ASRock.
@@Brabant076 My ASRock PG went back as it developed the 110C junction temp issue after a few weeks. 68 on die and 110 on junction isn't right, and it smelled🤬
A 7900 XTX ref triggered OCP on my 850 watt Corsair PSU. Once the machine crashed, I could no longer turn it back on, but had to flip the power switch on the PSU to drain the caps; then after five minutes, it could boot again. A 1200 watt PSU fixed the problem completely.
So months later I have a 7900 XTX, a non-reference model, but it only has two 8-pins. However, if you're underclocking or undervolting, you'll be below the 375W draw anyway, so it should be fine; it's only when you're overclocking and raising the power limit that you'll go above 375W. Approximately 408W is the 15% power increase over 355W that most cards will allow you. I have found Starfield to be unstable with anything more than a 10mV undervolt.
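The 408W figure above is just the power-limit slider applied to the stock board power; a quick check of the arithmetic (assuming the 355W stock board power and +15% slider as stated in the comment):

```python
# Stock board power and the typical +15% power-limit slider,
# both as quoted in the comment above.
BOARD_POWER_W = 355
POWER_LIMIT_PCT = 15

max_draw_w = BOARD_POWER_W * (1 + POWER_LIMIT_PCT / 100)
print(round(max_draw_w))  # 408 W, just above the 375 W that two 8-pins + slot provide per spec
```

So at the stock limit you stay inside the 375W spec envelope; only the raised limit nudges past it.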
I don't think ray-tracing is a gimmick, it looks nice when it's implemented well (which is rare in itself), I just know that it's the very first thing to turn off in a game that dips below 100 FPS (my monitor runs at 3440x1440 @ 160 Hz). So it really is pointless to buy Nvidia, when they're so far behind AMD in the performance per dollar ratio in rasterization, which is what matters when your card is no longer the brand new top-of-the-line graphics card it used to be. The reality is that we are many years away from enjoying ray-tracing games, and when we are it will be as old news as when tessellation was a talking point! So be penny-wise, save some money, stick with AMD.
@@filipealves6602 That's it, it's really nice, but its performance is still terrible... :D Nvidia being better than AMD at ray tracing doesn't mean Nvidia's GPUs are massively capable of hardcore ray tracing. Turning it on is often more like punishing your beloved pricey GPU for its great rasterization performance.
Dimensions on the 7900 XTX PG at 8:03 are wrong as far as I can tell. Should be 330 x 140 x 57.6 mm. Don't want anyone to use that thinking it'll fit in a case that it won't. I'm trying to squeeze one into the Fractal Ridge, but it's going to be 3mm over height and 0.5mm over depth. I'm going to keep an eye on return policies lol.
Yes, that's what I witnessed too. Are the actual dimensions different from the dimensions stated on the official website? I would love to fit this in an ITX configuration, so any clarification would be greatly appreciated.
As long as you don't use the reference AMD model and you're not interested in ray tracing, it's a great buy honestly. The 24GB of VRAM is underrated, especially since 8GB of VRAM is becoming inadequate for ultra texture settings in 2023 game releases. That 24GB should be good for 3-4 years.
I bought my GTX 1080 with 8GB of VRAM in 2016. Only in 2023 do games seem to require more than 8GB. The 24GB will probably last 6-7 years, I guess. Besides, PS5 consoles only have 16GB of RAM shared between the CPU and GPU, so game development must take this into consideration.
@@fleurdewin7958 Of course, it just depends what sort of framerates you're happy with too; personally I like over 100 with high settings at a high resolution. I don't see the point in putting everything on ultra; the difference between ultra and high is diminishing returns tbh.
EDIT: bought mine 4 days after launch also... Weird, my ref model has been fantastic. You do know that the ref model is basically 98 percent of any AIB, right? Software and all. Skip the extra cost for nothing.
I have the ASRock Phantom Gaming 6900 XT and it's been absolutely great. Whenever I upgrade my GPU again (RX 8000 maybe?) I will definitely consider ASRock again.
I’ve used Asrock motherboards for years, both Intel and AMD, and they’ve always been fantastic bang for the buck boards. I’ve never used one of their GPUs but certainly considering it for my next card
@@96kylar I got my 6900 XT brand new for $700 when the reference cards were still retailing for $999. Also the reference 7900 XT/X vapor chamber would like a word with you.
Hmm, it might not be as good as the 4090 (non-RT), but it's bumping close to the 4080, a card that goes for $200+ more. Ray tracing is just not worth it still, IMO. Too huge of an FPS loss for me, and at 4K, many games still look incredible with it off.
I've started to believe that RT is just a scam. I mean sure, the lighting and reflections look nice to most people, but there are many more important things to focus on when it comes to enjoying a game's graphics: textures, animations, character/item models. I think Nvidia went way overboard with convincing everyone that they need extremely high quality lighting when it's just not necessary to me. And I have to admit (maybe because I am 45 years old and perhaps my eyes are not that great anymore), I actually cannot tell any VISUAL difference between RT lighting and standard lighting. This is good for me though; I can save a lot of money by ignoring RT. I feel that the standard lighting and reflections already available are good enough to enjoy, especially with HDR.
@@EternaL1fe Not a PC guy but I am building my first gaming PC (2nd PC build for me) and I agree. I'd rather have 4K than RT, which I'd only notice on water or glass for a small portion of my gaming time, whereas I'd see 4K 100% of the time.
I've delidded my 7950x and undervolted. It can run at 5.3ghz ccx0 and 5.2ghz ccx1 at 1.24v. But I've decided to run it at 5.1ghz and 5ghz at 1.12v. Temps drop from 80c to 72c and PC stays cool with 3x360mm rads. Negative 300 on all cores too. Max power usage drops from 240w to 180w.
@@ddd45125 I've overclocked it a bit today LOL Couldn't resist. Now it's running at 5.2ghz and 5.1ghz. no negative offset. 5.75ghz single core. Plus, FCLK is 2167mhz, which helps a LOT with memory latency.
I bought this card a couple weeks ago at Micro Center for $999! Love this card. I have it in a Lian Li O11 Mini. I have mine undervolted through the Adrenalin software.
I just got this card but the 7900XT version. It is the same card physically. Same PCB, shroud, fans, cooler, everything. So it's nice to see a cooler capable of cooling 500w+ on a chip with a max power limit of 304w (short ppt).
Firstly I would like to thank you for your unbiased review; as always, good work here! Two 8-pin power connections are NOT the issue with the reference card; my LQ Vega 64 pulled well over 500W on two 8-pins all day without issue. Not sure why you think that is the problem, but it's most certainly not. It could just be a faulty card; my 6900 XTXH (ASRock OC Formula) was doing the same thing. I contacted AMD, they confirmed my issues to be a faulty card, so I sent it in and the replacement has been perfectly stable. Also wanted to note that ASRock demanded photos of the void-if-removed sticker on the rear of the card before they would do an RMA, and in the USA that is against the law; in fact, even placing those stickers on a product is against the law in the States. So my most recent GPU is the Sapphire 7900 XTX Nitro. It also has 2 HDMI and 2 DP, although I would much rather have 3 DP and 1 or 2 HDMI, since it's for a PC and you can convert DP to HDMI without issues; not so sure the same can be said about HDMI to DP at high refresh rates and resolutions. Good day sir, hope you continue to make the world a less shitty place to be :D
Yeah, there is no hard cutoff to the power you can draw through a wire; there's no reason it would be the problem. The wires just get warmer as more current goes through them; the only 'limit' is when they burst into flames. I have a 6900 XT with 3x 8-pin, and after a month I noticed one of the connectors was only partially connected. Never had any issues drawing about 375W.
I’m from America 🇺🇸 and I was at Micro Center in Michigan. We bought two 7900 XTX ASRock Phantoms. These cards are very beautiful and I really like them. I don’t understand why people are sleeping on this card?! I know some people don’t like the name because it’s not 🆒 enough, or the colors. This card is a great card.
I have the 7900 XTX Liquid Devil, and while it can't ray trace as well as the Nvidia competition, it certainly can ray trace. I'm using a 1440p 144Hz monitor, and in Hogwarts Legacy when I turn ray tracing up to max I still get 60-100fps, which is still very playable. I just feel like you talk down its ability to do ray tracing to the point where people will assume it can't really do it at all, which it certainly can at 1440p. I haven't tested it at 4K, so I assume its 4K ray tracing performance would be very lacklustre, but still, it's a great 1440p raster and ray tracing card. Love the videos by the way, cheers.
Am I the only one that prefers the 7900 XT reference over the XTX reference? I just think the dimensions are better and the design is MUCH cleaner (thumbs up for not combusting itself too).
Thnx for the review. I looked into this card, but quickly lost interest due to user reports of quality control issues. Instead I went for the XFX SPEEDSTER MERC310 RX 7900 XTX (got it for 985 euros from the USA). I'm currently playing 1440p/144Hz, no upscaling/RT, and I'm looking to upgrade to 4K/120Hz.
@Antonis24 Yo brother. After doing some research, that particular monitor, even though it advertises 120Hz, only gets 60 because of firmware. If this is the monitor you want, then fine; however, I believe the Gigabyte M32U would probably better suit your needs and wallet. It's a 144Hz IPS monitor with near-identical specs at a significantly lower price point, $649 new.
I actually have a Reference 7900 XTX and it's been quite a treat when tuned properly. Cooling is great even at 365W power usage and my performance improved with a memory overclock and undervolt. Good to know the Phantom Gaming is a good model, especially since it's going for MSRP in the US nowadays
ASRock just rules! I only use ASRock motherboards & video cards in ALL my builds! I hope they will make PSUs, memory, and SSDs in the future!
Asrock is an offshoot of Asus. It was originally created to be the cheaper budget brand of Asus, but has since splintered off to have non-budget hardware.
FFS, these cables are certified for 150W, but they can do over 300W each without noticeable voltage drop. It has nothing to do with the number of connectors; if the reference model is unstable, there is some reason for that, but it's not the number of cables.
Same case with the OC Formula 6900/6950 XT. 21 power phase design on those bad boys (I imagine that helps with undervolting?)... and they were THE CHEAPEST variants when prices started to drop... I got a full ASROCK/Razer rig atm, and I love it.
Hoihoi, could you please comment on how the undervolting affected power consumption at idle, while watching YouTube, and while watching FHD/4K movies? This is what's holding me back from buying any AMD cards. Thanks! ;))
The biggest drawback of this card for me is how bad FSR looks compared to DLSS. With my 3090 I can play games at 4K with DLSS on Performance and they still look very good. With a 6900 XT I've tested putting FSR on Performance; it looks bad, and even on Balanced games look soft. Like, the 7900 XTX gets a lot of fps in COD, but my 3090 with DLSS on Performance probably gets just as much and looks totally fine to me.
Blame the devs, or even more realistically, Nvidia. They shove loads of money toward devs to implement DLSS. The newer DLSS versions are definitely okay, but so is FSR. FSR 1 was, like the first versions of DLSS, performance-driven, but it made the quality suffer. In later iterations on some games, FSR 2 and later outshines DLSS quality-wise, and in performance it's on par. But there's still not a lot of titles supporting it, because that sweet Nvidia money makes the devs want to implement DLSS more. I guess that's where the Nvidia tax goes whenever someone buys one of their products.
Airflow is the key. The Silverstone RL06 looked average but is brilliant for airflow. Chuck in some Noctua or other high performance, silent fans - you have a sleeper build.
Just snagged a Sapphire Pulse model, mostly because it's the only $1000 card that has 2 HDMI 2.1 ports. Thankfully, it has 3x 8-pins too. Not something I realized would be an issue. Thanks for pointing it out!
@@taramjwi57 Oh... lol, that is a 12 dollar fix, but, again... (and MSRP on that is higher than the ref model, my guy, no reason to bullshit or be salty) happy gaming.
@@96kylar All good, not salty; I was just confused by your reply, as you were referencing something I didn't say. I also enjoy explaining my reasoning, I am one of those types =P. I bought it for $1000, the same MSRP as the reference model. This is what I was trying to say in my last comment. No price hike; I was avoiding paying more than that.
Atomic Heart is crashing for pretty much everyone I know; it's not a GPU-related thing. The game just released in a very poor state. Outside of that I had a ton of issues related to drivers; the newest drivers have fixed most of them, but there are still problems, for example OBS using 30% GPU with no scenes, and high power usage at idle on multi-monitor setups. I have the XFX 7900 XTX OC'd to 3100MHz (it rarely gets that high, usually hovers around 2900); it uses 462 watts, and the highest temp I've seen is 65C. Absolutely amazing design from XFX. The biggest problem for me is the drivers, like I said, and apart from Atomic Heart and Sons of the Forest (known for crashing on this card) I've had 0 issues in all other games.
Getting the best value out of gpus means waiting a bit and let others be the beta testers. Buying on release day is pretty much gambling. For example my card (4090 phantom) is great and all... but also really loud when fans are above 45%. I could try and replace the fans or have a lower custom curve that let the card run at 80° or buy some noise cancelling headphones. I think it will end up on the 2nd hand market well before the 50 series comes around.
I love your channel and what you do. However, I completely disagree with your choice of sponsor. 99.9% of the time, when websites sell cheap keys, they are obtained through credit card fraud. The money is then laundered through these websites, which is the only way they can sell keys at such a low price. I understand that it's great for RU-vidrs to have sponsors, and I know that's how you pay your bills. But in my opinion, it's wrong to support a company that is profiting from credit card fraud. I wouldn't take a dime from such a sponsor.
I would like a video on how DDR4 memory timings affect these cards, and also how they affect Resizable BAR and its impact on performance. We have seen a 20 to 40% bump in fps from tuning RAM timings. I wonder how that affects new systems and technologies, like a 3800 CL24 build.
Good video! Nice to see that undervolting in Afterburner works! Did you do anything to enable it, or did it just work out of the box? On the 6000 series I believe it was virtually impossible to do in Afterburner.
@@96kylar I disagree. After many AMD GPUs, I've experienced that Adrenalin keeps resetting 100% stable profiles. Afterburner does not, on my setup at least.
I recently changed from the Hellhound 7900 XTX to the ASRock Phantom Gaming 7900 XTX, and after uninstalling the driver using DDU and reinstalling (due to issues when making the swap), I seem to have problems with the GPU crashing A LOT and performing pretty badly. The Hellhound was performing almost perfectly, but for some reason I can't seem to get this card to work as it should. Going to try a fresh Windows install after a bit more troubleshooting, but I'm hoping to get this card working.
It was cheaper than the Hellhound due to being on sale, and I have the matching motherboard, so it went with my build better. Also, not that it was a major thing, but the RGB also lets me add a little more customization if I wanted, outside of blue and purple only.
@El Cactuar Why not? And thanks for the grammar lesson. I was completely unaware that it was incorrect. I guess no one could understand what I meant there.
Same thing happened to me. The Phantom graphics card was so bad that I returned it and got the Sapphire Nitro+ Vapor-X instead. It works fine without any issues
Hey Brian, how about reviewing Thermalright FC140 (Frost Commander 140) CPU Cooler... No one has done it yet, seriously. - The specs & looks are so promising ! 🤔
Had a ref model since November, and it's been great. Runs quiet and cool. Zero issues. Weird. Edit: and it wasn't a "heat pipe issue", otherwise every single card would have issues, by design. It was that some had less liquid. Odd how I'm seeing more of these PowerColor and similar ads lately. The ref card is fine. More than fine. Didn't you also recommend the same maker on the 6800 XT? Odd.
DUDE, REALLY agree with the 2 HDMI ports on GPUs. It's a common input for TVs so if you use the PC with TVs as monitors it's more versatile with 2 HDMI outputs. Good review. I agree that these should have 3 power inputs for the XTX. Good to see an AIB gets it right and system more stable. Personally waiting for RDNA 4. My suspicion is that's what could be used for Pro game consoles which also means AMD will be working with Microsoft and Sony on getting it right, and fast, just like RDNA 2 was excellent. I'm thinking RT performance will be much better with RDNA 4 and that's what I want for a new GPU. I have a little more faith that RDNA 4 is going to launch better than RDNA 3. And I want something like a 5090 or 8800 XT to have awesome 4K performance but use less power than a 4090, so closer to 4090 but a gen later, at a lower tier, lower cost and lower power consumption and that's when I move into 4K gaming.
Just wanting to throw in my two cents with a different model of the 7900 XTX. I have a PowerColor Hellhound, running with a 5800X3D, inside a 5000D Airflow. Something I noticed when I finally got around to mounting the card vertically was that the GPU temp and GPU junction temperature dropped by about 15-20 degrees when running at full load. Before I did this I was consistently reaching 95-97 C for a junction temp. I believe something like this happened to some of the reference model cards, but I'm not 100% sure. The Hellhound model only has two 8-pin power connectors, which is less than ideal. However, the price was $100 less than some other factory OC models, so that was an acceptable trade-off for me, and stock was a nightmare back in December. I've never hit over 400 watts, even with a pretty significant OC to about 3,075 MHz. I barely got any more performance out of the OC, so I dropped it back down to default and the temps went down with it while not really affecting performance. I constantly run into a CPU bottleneck at 1440p, so I'm hoping the 7800X3D can help that issue. Great video and analysis!
You CONSTANTLY saw 97C? Like the second you turned the rig on? EDIT: oh well, then a 3rd 8-pin would completely change that. If you're picking up on my sarcasm.
@@96kylar Sorry that was a little unclear. I wasn’t constantly running at 97 C, I would just see it end up there after an hour or so of gaming if it was at 100% utilization. Under the same circumstances with the card mounted vertically it would rarely exceed 80 C. I’d say a normal gaming session has the junction temp sitting in the mid 60s with the card vertically mounted. Right now idle at the desktop the GPU temp is 34 with the junction at 39 You’re right about the third power connector, I don’t see that influencing temperatures at all.
Kinda weird that mounting it vertically improved your temps, because it usually worsens them when the card is pretty close to the side panel. Sounds like your exhaust fans sucked most of the fresh air out before it could reach the gpu.
I would buy a 7900 XTX if the board partners would add a USB-C port. I have an RTX 2070 Super Founders Edition, and the USB-C port on it is so very, very convenient. Edit to add: Would be great if the board partners could find a way to downsize their air-cooled 7900 series cards. Second edit: On 12 May 2023 I purchased the ASRock reference version RX 7900 XTX from Newegg. Runs like a champ. Absolutely no hotspot issues. A wonderful, wonderful gem of a card. Its USB-C port is nearly every bit as strong as the port on the RTX 2070 Super Founders Edition; it charges rapidly. File access and transfers are swift as well. Using Adrenalin 23.4.3 drivers. The PSU is an EVGA 850 G5; it handles my PC power requirements just fine. I always have my ATX panel removed. I have no issues with 'bad' fan noise.
@@clintonleonard5187 My Prestige X570 Creation motherboard has a USB-C port. That port does not have nearly as much power as the USB-C port on the 2070. Ampere reports 300mA charging on the motherboard USB-C port, 980mA on the 2070 Super USB-C port. Moving files from my cell phone to the PC via the RTX 2070 Super USB-C port 'seems' snappier as well, mainly photos and videos of family gatherings. Upgrading to a 4K monitor with a built-in USB-C data port might be an option. I have watched monitor reviews, such as the Cooler Master GP27U and other 4K monitors, but no one seems to test the USB-C port's charge rate or file transfer capabilities. Upgrading to one of those larger 4K monitors would free me up to purchase a regular, more modern GPU. My current monitor is an LG 24UD58-B 4K monitor. I have an SB X-Fi Titanium Fatal1ty Pro in the lower slot; no room in the other slots. I did have it in the top PCIe x1 slot, above the GPU slot, but the sound card would get hot sandwiched up there between the GPU and CPU. The CPU is a 5950X with a Noctua NH-U12A.
Good video. Just wanted to make a point about default fan curves. I have the reference 7900 XT, and the default fan curve tops out at 66%. This meant the GPU was running up to 75C under load, with junction temps in the low 90's. I manually tuned the curve to top out at 80% and temps dropped by over 10 degrees. AMD should not have defaulted the fan profile to such a slow speed.
Odd, and I dropped my whole fan curve. No need to run them that high, at all. You're just using voltage on spinning something vs in the rig. But hey, do you.
Hey, one of those reviews on newegg is mine. Just wanted to say my testing was with an open case, so the hot spot reaching 100C+ is more likely due to bad mounting since yours at default is hitting only high 80s C.
I just got one and it hits 100 hot spot completely stock in a well ventilated case. Pretty much everyone on Reddit says theirs is hotter than youtubers'. I undervolted it which helped, and I just set an old CPU cooler on my GPU which took off like 4 degrees even without thermal pad contacts.
It's def not because of only having two 8-pins. Two 8-pins can physically provide 600 watts, plus 75 watts from the PCIe slot. The aftermarket 40 series 12-pins only plug into two 8-pins on the PSU side anyway.
@@nevoburi You're actually correct, the spec is 150 watts per 8-pin connector. But each lead from the PSU has two 8-pin connectors on it; the PSU and the wiring are capable of carrying 300 watts per lead as-is (unless it's a cheap PSU). If the card asks for it, the 8-pin will def push more than 150 watts.
@@96kylar I've been running a 500 watt 3090 BIOS on a 2x8-pin card for over 2 years. I don't know anyone personally running a 1000 watt BIOS plus power-modded 2x8-pin card, but people def do it on 3. 450 < 1000. So yeah.
I have to wonder if AMD was even aware of the 7900 XTX issues before launch. And if so, how in good conscience could they have released it. Great program TYC.
Speaking as a software developer - devs ALWAYS know about problems before release, but they're not the ones who decide when the product goes out the door :/
@@96kylar Vapor chamber issues on several batches of the reference design, for one thing. Don't get me wrong I'm a fan of the XTX but that card did not launch issue-free.
@@FastFilmFX There is about a 0% chance they had hands-on time with any of the faulty vapor chamber units; considering it was a small number (a few percent) of launch units from a bad batch of coolers, any of the testing units they were sent were probably fine. Do you think they ordered the coolers and PCBs and assembled them like gift baskets at AMD HQ? They're assembled and packaged by contracted companies, while AMD receives test samples. They probably gave Cooler Master a good beating for the cooler fail though.
Until it's $900 max, I am not buying it. In fact, I probably wouldn't buy it until it's $849. At $849 I can definitely say it's a decent improvement over RDNA 2.
I have a fairly compact case and I barely fit my 1080 Ti Strix card. Getting the fancy big AIB cards comes with the extra cost of a new case on top of their already higher price tag.
Just bought an XTX 3 weeks ago and haven't had issues with the hotspot, even though my case airflow isn't the best. The undervolt is good too; got an extra 30 fps in 3DMark. Next step is buying fans to increase airflow to help the GPU, or maybe a new case like the Fractal Torrent.
I think it's safe to say skip both the RTX 4000 cards and RX 7000 cards this gen and save up for a killer upgrade next gen. Way too much power for such a little performance increase from both companies, even when undervolted. Ray tracing is another thing that needs another 80-130% performance increase.
Eh... The 4090 brought a huge performance increase, and its power draw is even less than the previous-gen 3090 Ti's. The 4080 is also a significant boost over the 3080, and it's even more efficient than the 4090. That said, I'm sure the next gen will be even more impressive, BUT there is one factor many of you are not taking into consideration: the geopolitical situation. Things are not looking so pretty for the places where these GPUs are made, or the supply lanes required to bring them to other countries. If something does happen, and there are signals that it will at some point (sooner rather than later), we can kiss the availability of new GPUs goodbye. I hope it doesn't happen, but it is something to strongly consider.
I live in Japan, and this is probably the cheapest 24GB-VRAM card I can find; at half the price of an RTX 4090, it's the best option. Inflation is seriously a problem. However, I wanted to know whether a Ryzen 7 5700X3D & RX 7900 XTX combo would run on a 750W PSU with no problem. The PSU can handle the 3 power connectors fine; my concern is just the power the card draws, and the CPU's too. It would be great if you replied back.
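A rough back-of-the-envelope check for the PSU question above. The wattage figures are ballpark assumptions, not measured values: ~355W reference board power for the RX 7900 XTX, ~105W for the 5700X3D, an allowance for the rest of the system, and a margin for transients; the function name is hypothetical:

```python
# Rough PSU-sizing sketch for the 5700X3D + RX 7900 XTX build asked about.
# All wattages are assumptions for illustration, not measurements.
GPU_W = 355       # RX 7900 XTX reference board power (assumed)
CPU_W = 105       # Ryzen 7 5700X3D package power (assumed)
REST_W = 75       # fans, drives, RAM, motherboard (rough allowance)
MARGIN = 1.25     # ~25% headroom for transient spikes and efficiency

def min_psu_watts(gpu=GPU_W, cpu=CPU_W, rest=REST_W, margin=MARGIN):
    """Estimated minimum PSU rating for the assumed component draws."""
    return (gpu + cpu + rest) * margin

print(min_psu_watts())  # 668.75 -- a quality 750W unit leaves some margin
```

Under these assumptions a good 750W unit should be fine, though an aggressive AIB card with a raised power limit would eat into that headroom.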
@V. D. My particular case is a Cooler Master HAF XB EVO. It's almost like a test-bench layout, with perforated case sides, so plenty of easy airflow. The thing runs very cool. The fans don't hit more than 1400 RPM on anything I've played so far, so it's very quiet. I think the overbuilt heatsink helps with that. At 2.1 inches longer, 1 inch taller, and a quarter inch wider, it's got a lot more surface area than the reference model.
The 7900 XTX beat the 4080 across the entire test suite on TechPowerUp as of December 12th, 2022, and it's presumably even better now, so the argument can be made that the 4080 only makes sense if you're willing to lose a few frames in most games, at a $200 premium, to enjoy better ray tracing performance. At the 7900 XT level, it directly competes with the 3090 Ti and eats the 4070 Ti's lunch; the only reason either of those cards is really worth considering is if you need the slight efficiency gains, because the 4070 Ti doesn't really offer anything more with its 12GB, 192-bit VRAM buffer. I anticipate the tech reviewer videos in two years will be calling the 4070/Ti shit for having a gimped memory bus.
Not convinced the additional power connector on the ASRock is what makes it more stable; it might just be luck of the draw in the silicon lottery. I got shafted hard by it in the GTX 970 days: the first one was unstable with the tiny overclock Gigabyte had given it. The second one got unstable with a 7MHz overclock, and over the years lost even that and had to be underclocked. The third one was alright, though, and could do a decent overclock. (All in the same system; the system wasn't the problem and it ran even an R9 Fury fine.)
Hi mate, can you do a video showing the MSI Afterburner setup for this GPU? Also, should you install only the barebones AMD driver when using MSI AB, or can you install the full driver with the software? I've used MSI AB for years with NVIDIA, but the settings seem a little strange to set on an AMD GPU (this being my first). Cheers.
You are soooooo right in that all 7900 XTXs are NOT created equally. In fact, I found that one of the worst 7900 XTXs is this ASRock Phantom. It has HORRIBLE cooling. Those fans worked so hard, and yet it was ridiculously hot. The PowerColor and Sapphire versions are much better. BUT, they are not the best one I've tried... The best one I've tried thus far is the MSI Gaming Trio version of the 7900 XTX. That card is SUPER QUIET and VERY COOL. I've run relentless torture tests on this card, and the temp stayed below the mid-70s C, with the hotspot never getting above 84C, or more than approximately 12-16 degrees above the card temp. This card was so quiet, I had to get my light out and check the fans. They were definitely spinning well. So I opened the case to hear them better, and it was still hard to even tell they were spinning unless I put the light on them. This card performs very well, and seems better than any of the other ones I've purchased/returned. It's also very stylish, and only a 2.5-slot card, so it will fit most cases. It's the best of all worlds thus far. The question is, will it stay that way...?
I have the reference model, and with the updates I have had no crashes. At first I was running it with a +15% power limit, but now it's at stock power and stable with a 50mV undervolt and +150MHz on the memory.
Question: regardless of price, which would you go with, the 7900 XTX or the 4080? All things being equal and ignoring price, which would you pick?
Neither; you buy the 4090. IMO it is better to buy a 4090, buy a used 30-series/6000-series, or skip everything and wait for the 50-series/8000-series. RDNA3 is pretty mediocre overall, and I think those who buy now will be looking to upgrade again next generation.
I unfortunately had to send back my ASRock Phantom Gaming 7900 XTX after it crashed in multiple games and my second monitor stopped displaying. So I went back to my 6700 XT; maybe I'll wait a year to upgrade again.