I must just say Philip, I know your technology videos aren't the most watched on your channel, but I REALLY appreciate them. Any time something new is released and you have made a video about it, you are my choice over everyone else.
Respectfully, I think that the Hardware Unboxed benchmarks are the industry's best work. This video is very interesting and all, but I need to check HWUB's benchmarks every time before making a purchasing decision.
AMD won't bring their version of DLSS, called Super Resolution, to APUs, since these kinds of features have special hardware requirements. To run DLSS, for example, you need an RTX card with tensor cores, and most likely the same will apply to AMD's Super Resolution. Unless they happen to include that hardware with their next generation of APUs, which I doubt they will, you won't be able to upscale games for improved performance.
@@andreiarg Dude, all next-gen features are built into every CU. Unless AMD redesigns the entire architecture to remove these features, we are probably gonna see most of them in the next gen of APUs. (Granted, they will surely make the jump to RDNA on AM5 together with the new DDR5 memory.)
Oh yeah, "Smart Access Memory" is another rebranded feature, more commonly known as "resizable BAR support" or ">4GB IOMMU". However, as it was not widely available on Windows, devs could never really take advantage of it. Perhaps that will change with this fancy marketing name.
Seeing as it was there for all games in the benchmark, it looks like a driver-level implementation, and whether it works at all is game-independent. The gains might be quite dependent on the game though.
I think AMD will see how Nvidia handled it and try to make sure they don't do the same thing. The guidelines on how retailers should sell their cards should be a big indicator of that.
I don't think AMD is going to do a lot better than nVidia. Not because of some nihilism or "company bad", I just don't think they can keep up with demand, no matter how much they stock. I'm certain they'll have more stock at launch and more people will be able to get their cards, but they won't be able to keep a stock for long.
@@metroplexprime9901 Yeah, that sounds about right. Considering what's going on in the world right now, they're definitely not going to be in stock for long. But I wish AMD the best. No one could have expected that they would be able to seriously compete with both Intel and Nvidia in the same year.
@@cunt5413 I do too. I'm really rooting for AMD to compete at all ends of the market, low and high. While nVidia hasn't slumped like intel, it's still going to put on the pressure to make better and better cards at lower prices. Stock will certainly be an issue, but I hope that there will be a 5 minute window for people to order in on launch.
This is about what I expected. Even without DLSS, the 3000 series has a lot of power, and if DLSS sees wider-spread adoption then it's not a fair competition.
@@jocyanide The 3000 series is severely bottlenecked by the amount of VRAM at 4k, so even if the 6800 had the same rasterization performance as the 3070 it would be better value. And don't forget that AMD teased "Super Resolution", which is supposedly a DLSS competitor. But to be fair, it is probably not going to work as well as DLSS 2.0, otherwise they would talk about it more.
Hey Philip, they did mention how much Rage Mode and SAM matter (just not in the live announcement): Rage Mode 1-3%, SAM 6-7%. Although I do agree with you that not everyone will be able to use SAM, they only used it in their 3090 vs 6900 XT benchmarks, and people who get a $999+ GPU will likely also get the new CPUs and 500 series boards. Their versions of DLSS and raytracing are still in development, and their DLSS equivalent won't be available at launch (a driver patch will come later). Raytracing will be about 20% worse. Anyway, thank you for the good summary of the event, and let's hope for a good price war to come.
AMD fighting a full-stack war with NVIDIA and winning is HUGE. How AMD, a company far smaller than both of its main competitors, is able to compete at such a level is absolutely amazing.
I hope they can compete with GPUs on laptops as well... they won this year with the beast that is the 4800H, but it would be good to compete with Nvidia here as well, because Nvidia seems really lazy when it comes to laptop GPUs... like this year they only released Super variants of the 2070 and 2080 and unlocked the 2060 to 115W from 80W...
4:50 There are currently rumblings from inside Nvidia that the higher-VRAM 3070 and 3080 models were cancelled for unknown reasons, so I wouldn't be surprised if these cards never release. Nvidia certainly does need something new to compete, however, perhaps a 3070 Ti, although it seems that Nvidia is trying to avoid cluttering the lineup with Ti and Super cards. I wouldn't be surprised if Nvidia releases a Titan with the full GA102 die to take back the performance crown, because the 3090 is slightly cut down from the full die.
Imagine getting those fancy features on iGPUs in mobile Ryzen chips, like the ones in the average laptop that parents would buy for their kids for school. That would be awesome.
I find it weird that ray tracing gets more attention than VR. We've had beautiful games running at hundreds of FPS on a pancake screen for many years. VR is the area where upgrades are actually needed. SMP (Nvidia) is cool in iRacing but otherwise there's actually just nothing.
@@ganondorf66 I don't think it is. Nothing can run any decent looking VR games (Dirt: Rally 2.0, Assetto Corsa, iRacing, ACC..) at good settings at 144hz at Valve Index resolutions. My RX5700XT can run any flat game out there at 1440p or 1080p. VR.. not so much.
All of this directly affects VR. At the end of the day, it's just 2 displays in a headset. Most of the important things related to VR are the HW features; there is no standard yet, and that makes SW optimisation a lot more difficult. You can already do ray tracing in VR, it's just way more demanding.
If you're curious about the performance gains attributed to the different features: GN asked AMD about it and they stated that about 2-3% are gained by "Rage mode".
I'm really loving your tech videos, they are a good mix between entertaining and informative, whereas on YouTube it's usually either-or with these things.
According to the slides SAM (smart access memory) was not used with the 6800xt benchmark, it was used with the 6800 (but no "rage mode") and with the 6900xt both SAM and "rage mode" were enabled.
smart access memory basically amounts to resizable BAR support. unless AMD is imposing restrictions on older CPUs, it should work fine on any board that supports large enough MMIO sizes
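Since SAM basically amounts to resizable BAR, you can already inspect BAR sizes on Linux: every PCI device exposes a `resource` file in sysfs whose lines hold the start address, end address, and flags of each BAR in hex. A minimal Python sketch of computing a BAR's size from one such line; the example line (a 256 MiB aperture, typical for a GPU before resizable BAR) is hypothetical, not from any specific card:

```python
def bar_size(resource_line: str) -> int:
    """Compute a PCI BAR's size from one line of a sysfs
    /sys/bus/pci/devices/<addr>/resource file.
    Each line holds 'start end flags' as hex values."""
    start, end, _flags = (int(x, 16) for x in resource_line.split())
    if start == 0 and end == 0:
        return 0  # unused BAR
    return end - start + 1

# Hypothetical line describing a 256 MiB aperture:
line = "0x00000000e0000000 0x00000000efffffff 0x000000000014220c"
print(bar_size(line) // (1024 * 1024), "MiB")  # 256 MiB
```

With resizable BAR enabled you'd expect one BAR to cover the whole VRAM size instead of a small window like this.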
As much as I want to be excited for this, I know that these cards are going to be scalped like crazy. Best case scenario is that they aren't scalped anywhere near as fast as the Geforce 30 series cards, lasting 10 minutes as opposed to -666 minutes.
Here's some more info: Rage Mode only got you a 1-2% increase, while SAM (Smart Access Memory) was responsible for the rest of it. All Rage Mode does is increase the power slider for the card (slightly, not to the max), and the fan speed curve is changed to compensate for the thermals.
A couple of inaccuracies in the review... 1. They do have a comparison available showing the difference with SAM on and off, and 2. Rage Mode isn't one-click overclocking, it's a boost to the power limit, which means the card is more likely to achieve its boost clocks for more of the time. The actual boost clock isn't changed, so there is still potential for overclocking.
Some inaccuracies here: Rage Mode isn't overclocking, clock speeds are identical. What it does do is increase the power limit and allow the card to boost to its highest clock for longer.
@@2kliksphilip It's not specified whether it refers to the already existing one-click overclocking or if the two can be combined. Even so, why would they make a button that does the same thing as another one in the same menu? Call it semantics, but by definition Rage Mode alone isn't increasing the max clock speed, the same way raising the power limit doesn't. However, if you end up reviewing these cards, I'd love to see whether manual overclocking benefits from the additional power headroom, how thermals are affected, and whether it can be stacked with other effects. Gamers Nexus, for example, has made multiple videos, old and new, discussing power modding on Radeon and how it's not actually overclocking. If you're interested I recommend checking them out, as they've done more testing than I have. The reason I seemingly got so hung up on semantics is that some may interpret what you said as a cause of instability and voiding of the card's warranty, as overclocking traditionally is, whereas Rage Mode does neither.
About the SAM feature: Gamers Nexus reported that they asked AMD whether the benchmarks used it, and AMD said it was not used in any situation unless noted at the bottom of the chart. So hopefully that's just raw performance.
AMD drivers are like fine wine, they just keep getting better and better even years later, but that doesn't mean they can't bring a working driver on release day!
You appear to be unaware of FidelityFX. It is another name for a mix of in-driver features and middleware to be implemented by game developers. AMD GPUOpen is the equivalent of NVIDIA GameWorks (the major difference being that GPUOpen is under the MIT license and hence less black-boxy to a developer). So when they say "we have an improved denoiser", it means that their collection of middleware contains a better denoiser. That is not a hardware or driver feature. But they also mix DX12 Ultimate and recent Vulkan features in there. DLSS is middleware afaik; contrast adaptive sharpening, for example, is not. I presume AMD's "Super Resolution" feature will amount to an industry-standard library that more devs might be willing to adopt. Also, we can expect great implementations from game developers working on primarily console-focused upscaling algorithms that would trickle down to PC (this has already happened with some great temporal upsampling implementations).
That suspicious 50% power consumption reduction you pointed out got me thinking, and I actually think it's some slightly deceptive marketing in the form of a misrepresentative graph. I think in reality it's roughly a one-third reduction from the previous models, and the "50%" is being measured against the new, lower number. For example: original power 100, new power ~66. A 50% increase from 66 would take you back up to ~100, but the deceptive part is calling the drop from 100 to 66 a "50% decrease", when a true 50% decrease would land at 50.
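That asymmetry is easy to verify; a quick sketch using the same illustrative 100/66 figures (not AMD's actual numbers):

```python
# Illustrative figures from the comment above, not AMD's real specs.
old_power = 100.0
new_power = 66.0

# Going down from 100 to 66 is only a 34% decrease...
decrease = (old_power - new_power) / old_power   # 0.34
# ...but going back up from 66 to 100 is a ~52% increase,
# because the same difference is divided by a smaller base.
increase = (old_power - new_power) / new_power   # ~0.515

print(f"decrease: {decrease:.0%}, increase: {increase:.0%}")
```

Marketing can pick whichever base makes the percentage look bigger.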
The black screen issue actually happens due to the way the 5000 series was built. To truly fix it you need a PSU with larger transistors (like a newer PSU) to handle the sudden voltage surge that causes the issue; the drivers can only make it less likely to happen... Unless you're on Linux, where it didn't happen at all with the open source driver.
I'm still mainly interested in the RTX 3080, as I just like being able to use some of Nvidia's specific software, but I'm so glad AMD is really bringing competition this generation. It'll drive down prices for older cards and spread out the inventory for people on both sides, as there will be less of a focus on the 3080's inventory when people start buying the 6800 and 6800 XT instead. At the end of the day, everyone wins with this announcement.
Radeon SAM has been a thing on Linux for a while already and works on past GPUs and CPUs. Rumors say that it might come to other CPUs on Windows as well.
I am waiting for DLSS-like technology to be included in budget GPUs, especially AMD's APUs. Playing new games on medium graphics at 1080p@60fps with budget parts? That sounds like a great deal and the best way to enter PC gaming.
Honestly, while this is cool and all, these cards are still very expensive. I was really hoping AMD would undercut Nvidia in terms of price for the same performance, but it seems for the most part both companies are focusing on the enthusiast and not their lower-end cards.
You addressed my exact worries about this presentation. I'm not SUPER worried about ray tracing. I can handle having way less fps with it on (as long as it's ~60 at 1440p) in select singleplayer games like Metro and Cyberpunk. However, not having a DLSS equivalent is going to hurt badly in any game that's got DLSS 2.0 or 2.1, as DLSS reaches or even beats that 60 fps mark in most modern titles. That might be the real pain here for AMD. Otherwise, I'm pretty happy with what they showed and I'm definitely considering a 6800 XT.
Glad you mentioned the drivers. In the Linux community they somehow keep being praised for being open source... but what good does that do if they keep crashing and locking your system up? I'm now back on Nvidia and using their closed-source drivers; they additionally do a better job on power management, which matters a lot on a laptop.
It's very true that people don't trust AMD drivers/compatibility. Multiple people said I should go for Nvidia because of better drivers, but the 5700 XT was 100 euros cheaper than a comparable Nvidia card, so I went AMD. Sometimes when gaming I get a random 'GPU disconnected' error, as well as VR errors only solved by a PC restart (no idea if it's GPU-related, but I get the idea it is). Sometimes weird sound too.
A few days late, but the 'suspicious looking' bar graph really isn't. The right bar is 54% more than the left bar, meaning the darker part should be about 35% of the full bar, which it appears to be (100 × 1.54 = 154; 1 − 100/154 ≈ 0.35).
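For anyone who wants to double-check that arithmetic:

```python
# Verifying the bar-chart proportions described in the comment above.
left_bar = 100.0
right_bar = left_bar * 1.54          # 54% taller than the left bar

# The 'extra' dark segment as a fraction of the full right bar:
dark_fraction = 1 - left_bar / right_bar
print(f"dark segment: {dark_fraction:.0%} of the full bar")  # 35%
```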
This Smart Access Memory that you are worried about sounds a lot like DMA (Direct Memory Access), which has been around for a long time (I found references that point back to DX9). I'm hoping that Smart Access Memory improves the speed and latency of DMA regardless of Intel or AMD GPU.
AMD's new cards will be a little heavy, like 1-1.3 kilograms. Just watch Lisa Su's hand at 3:18: she's actually struggling to hold the card up for the showcase.
CPU access to the GPU's memory is GROUNDBREAKING!!!!! I come from the world of autonomous vehicle simulation, where the biggest bottleneck is simulating LIDAR (laser scanner) sensors. We use the GPU to render the scene in depth mode and then sample it on the CPU, which takes forever in terms of compute time. In gaming, imagine games where AI and gameplay can react to EVERYTHING you see. Usually stuff that runs on the GPU can't react to colour/particles/shading in an efficient manner. Imagine physics engines with direct GPU memory access! Given that usually everything AMD does is open source and shared, this is amazing news!
@@2kliksphilip Yeah, you're right. Think they only showed the difference with RAGE Mode also enabled. AMD told Steve at Gamers Nexus that RAGE mode in those slides was 1-2% of the total improvement shown, with SAM being responsible for the rest of it.
10:40 We actually already have all the numbers, for the 6800 XT and 6900 XT at least. In the event video: 6800 XT with SAM OFF & RAGE OFF; 6900 XT with SAM ON & RAGE ON. On the AMD website: 6800, 6800 XT, and 6900 XT all with SAM ON & RAGE OFF. Based on this, RAGE is an extra ~0-2% and SAM an extra ~3%. It depends on the game, of course.
2:10 The main advantage of the 6800 is that it's likely a little faster, and most importantly it has double the VRAM of the 3070. AMD did well to offer 16 GB of VRAM in all 3 cards; the relatively small amount of VRAM on the 3080 (10 GB) made for a good debate on whether it'd be enough for 4k in a couple of years, and the extra VRAM on AMD cards will be important when deciding what to buy.
I'd love to see how RTX IO and AMD's Direct Storage support improve their respective products. I wouldn't be surprised if it turns out both are just the same thing with different branding.
If AMD has supply, A LOT of people will switch IMO. Nvidia should not have launched their cards first. Whilst some people won't cancel their preorders, I wouldn't be surprised if many did. The added bonus is that if you have an AMD CPU, and many now do, you can get an extra 5-10% performance out of an AMD card through drivers and software.
I was expecting similar results. I remember seeing something suggesting that AMD knew about the 3080 and how good it was for its price and already had the 6800 XT ready to compete, but they were not expecting the 3090, and were expecting the 6900 XT to be the fastest card on the market until the next RTX Titan, but at 1/3rd the price. Raytracing performance was leaked too, I think: the 6800 XT gets performance similar to a 3070 with raytracing on and a 3080 with it off, which is disappointing but expected from 1st-generation raytracing, as with raytracing on, the 3070 would perform the same at $200 less.
Quite frankly, I don't see how this Smart Access Memory technology could have helped AMD's benchmarks too much here. It's not a GPU performance feature, it's a CPU performance feature. It increases performance if your CPU is bottlenecking the GPU, by feeding it with more bandwidth, something that AMD CPUs tend to be sensitive about. And these benchmarks were done at 4k. It might have helped a bit in a couple of the more CPU-bound games like Forza, but by itself it wouldn't really increase, say, the Division 2 numbers much, since that game is incredibly GPU-bound at 4K.

What MIGHT have helped massage AMD's figures in the 3090 comparison is Rage mode. But there's some confusion going on about what Rage mode is. AMD called it an overclocking tool in their marketing, but that's not really what it is. If you've used AMD's Wattman, you're familiar with the GPU frequency curve control it offers. Rage mode doesn't touch the frequencies at all. It just raises the power and temperature limits. For the 6900 XT, which has to be restricted to running at 300 watts stock, Rage mode is the 3090 button, letting it chug down possibly as much power as a 3090 to keep voltages higher for longer and increase the average boost frequency.

It's a dumb tool. It's there so AMD could show off what its normally sensibly quiet and cool marvel of 7nm engineering can do when it's playing by Ampere rules, without them actually having to give it an Ampere TDP and an Ampere cooler. They could have created this "auto overclocking tool" an hour before the press conference, that's how simple it is: run the benchmarks with it to show the 6900 mopping the floor with the stock 3090 for 400 dollars less, slap a little disclaimer on the benchmark graph, then advertise it as their latest and greatest "auto overclocking software" like it's some innovative feature, when really a monkey could accomplish the same goal by playing with the power consumption slider in Wattman for a second.
Quite brilliant, if you ask me. But one way it seems to be backfiring is that it's making people think Rage mode is the full extent of the RX 6900 XT's overclocking headroom. It isn't. Not by a long shot. AMD cards are famous for coming out of the factory with insanely restrictive power consumption limits. The 5700 comes out of the box with a 180 watt power limit; I've pushed mine to 270, at which point the air cooling simply could not keep up, but other than that it was stable. AMD cards want to scarf down more power and spit out higher frequencies. It's just how they work. If the 6900 XT comes out stock at 300, same as the 6800 XT, while having more cores, you can probably push it well past 400 watts on water. I'd like to see the 4k numbers you'd get out of that monstrosity.

It can't be overstated that AMD have built an amazing GPU here. True enthusiast-class hardware like we haven't really seen from them since the Fury X. It pushes the boundaries of GPU technology in some very interesting ways. I hope the Infinity Cache is overclockable as well.
DLSS 2.0 needs to be supported by the game. Let's hope Radeon upscaling is a function of the GPU that can be used at will with any game, and doesn't depend on whether the game supports it.
I think AMD is still not releasing their full potential. I mean, they have been on the 7 nm process way ahead of the competition. They can pack some serious punch into their products. But it seems like they are just positioning themselves in the market so that the competition can play catch-up.
6800: 16 GB VRAM, $579. 3070: 8 GB VRAM, $499. Even at the higher price the 6800 looks more compelling: double the VRAM and estimated higher performance (even without SAM).
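Purely as a toy dollars-per-gigabyte-of-VRAM comparison (which of course ignores everything else that differs between the two cards):

```python
# MSRP and VRAM figures from the comment above.
cards = {"RX 6800": (579, 16), "RTX 3070": (499, 8)}

for name, (price_usd, vram_gb) in cards.items():
    print(f"{name}: ${price_usd / vram_gb:.0f} per GB of VRAM")
# RX 6800 works out to roughly $36/GB vs ~$62/GB for the 3070.
```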
You forgot an important part of the power consumption story. Nvidia uses TDP, which they consider to be only the power drawn by the GPU die, not the memory and other components. AMD uses TBP, aka Total Board Power, which counts the power of everything and is much closer to real power usage. That 20W difference? The stated gap is smaller than the actual gap, due to Nvidia being deceptive.
@@2kliksphilip Yes: a 320W TDP card draws more total power than a 320W TBP card, since TDP doesn't count all components, so AMD GPUs have a bigger advantage there since they're measured with TBP.
@@2kliksphilip I've confused myself. Yes, AMD GPUs should be more than 20W better and use much less power than Nvidia overall. TechPowerUp and Gamers Nexus should have real power usage numbers soon.
Me in 1998: won't touch ATI cards because drivers.
Me in 2008: won't touch AMD cards because drivers.
Me in 2020: won't touch AMD cards because drivers.
AMD: consistency achieved.
Too bad Radeon doesn't have anything like Nvidia's CUDA, which is very useful if you want to do some fast computing on the GPU and is essential for deep learning. Maybe in the future there will be something comparable from Radeon's side.
It is important to note that the 6900 XT is a gaming card, while the 3090 is a workstation card. The 6900 XT wins hands down if we compare it to the 3080, which is marketed as a gaming card.
@@2kliksphilip Wait, my bad. After looking at the page myself, it does look like the 3090 is marketed as a gaming card. However, the Nvidia 3090 page says "TITAN level performance", and the price reflects more of a TITAN card. While the card is marketed as a gaming card, I think its appeal is more for a workstation user, given the large amount of VRAM and the price.
@@rinhato8453 I am curious to see how much more performance a 3090 Ti could give. If Nvidia is to compete with the 6900 XT, they need to reduce the price of the 3090. And keep in mind the 2080 Ti was listed at $999 when it first came out, so for Nvidia to market the 3090 with "TITAN level performance" at a price of $1,499 suggests it is more intended for workstation users.
Being able to directly address VRAM on consoles is one of their biggest strengths as a platform. But, it's telling that AMD is calling it "smart" memory access rather than "direct", massive caveats incoming.
@@ShawFujikawa Direct memory access (DMA) is a term that's been used since the dawn of computing; I get the feeling that if they said "DMA" it would be deceptive. "Smart" implies it's going to be making some kind of automatic decisions or something. Why would I need "smart" memory access? But maybe you're right.
I was already putting myself in your shoes when I was watching the presentation, I like your influence! I personally don't care much about Ray Tracing, but DLSS is absolutely revolutionary. I just hope AMD hasn't forgotten about budget gamers who are still playing on 1080p and like performance more than graphics.
All new graphics cards are already overkill for 1080p. That’s no longer important really. The 3070 or 6800 will be more than enough for 1080p in all circumstances
An AMD 5000 CPU combined with an Nvidia 3000 series card will give you the absolute highest fps at 1080p, which almost everyone is still on. Every competitive game (LoL, Fortnite, CS:GO, Valorant, WoW, PUBG, Dota, CoD Warzone, BRs etc.) plays at 1080p only and will continue to. I will definitely not get an AMD Radeon card, but an RTX 3080 I am getting!
I don't really see the problem with SAM... Do you have it? Great, then have some free FPS. You don't have it? Your GPU works anyway at almost the same speed...