What a shame, I got an RX 480 two years ago and couldn't be happier with it, but even back then I noticed that many people didn't even consider AMD to be an option.
Top job on the analysis. One area I don't often see mentioned is that when Polaris prices hit rock bottom, around £160 for an RX 480 and £130 for an RX 470, there were a lot of shops selling excess AMD Furys for under £200. Even I'm guilty of giving out that advice, as at the time you were getting about 10% less performance than a 1070 in most titles for essentially RX 480/GTX 1060 money. As for mining, the only real downside is mindshare. I built a fair share of PCs using the RX 470 when it hit its lowest price, the same price as the GTX 1050 Ti (£130), which was miles worse and lacked FreeSync (which I find to be a real game changer overall). But since mining hit, AMD prices have yet to really return to a stable level. Sure, we get the odd sale down to £150 for some of the RX cards, but usually the GTX cards will be the same price if not cheaper, meaning those just end up selling better. Still, it's good AMD could profit from it; realistically, at the end of the day, they are a company. AMD really needs to do something about mindshare, and Polaris could have been just that with its very promising low power consumption, but unfortunately that ship has sailed, and I'll be surprised if the RX 590 uses under 200 watts like it should be capable of doing. All AMD cards I've had (bar the HD 7770, which was flawless from the factory) needed an undervolt: my R9 285, my Fury, etc. The performance stays the same, but I can shave off a lot of power consumption by changing a few numbers in Wattman. Their Vega chips continued this, and we've all seen how scalable they are. To this day Nvidia has handled their "mistakes" better than AMD handles its issues, examples being the FX 5000 series, which was followed up by the decent (albeit lacklustre) 6000 series, and Fermi, which was followed up with Fermi 2.0 (hands down one of the best-aging architectures that isn't GCN), fixing people's major gripes within one generation.
I'm not a fanboy for either company, but I do tend to buy AMD as it's best for what I do, and when you consider how good and cheap FreeSync is, the choice of GPU when you have a budget in mind becomes vastly skewed. TL;DR: Polaris was good, AMD made some money, but it hurt their brand image, especially when paired with the poor Vega launch.
That Polaris launch was tainted by the crypto-mining boom at the time. We never saw that $200 price tag, at least not here in my country. More like $300 at the very least.
I proudly own an RX 580. I owned a GTX 660 Ti for 5 years and decided to switch to AMD due to the 660 Ti's weak long-term driver support and performance. While at the time of release the 660 Ti was faster than the AMD competitors (7850 and 7870), its relative performance decreased over time. I even decided to pay a premium for the 580 over the 1060, as I purchased during the mining craze. My only complaint has been inconsistent FreeSync performance on my 144 Hz monitor at frame rates low in the range (48-60), but this is outweighed by the fact that I could actually afford a 144 Hz monitor. At the moment, I plan to continue purchasing AMD unless they become completely uncompetitive.
issaciams You mean regarding FreeSync? Very few games. I noticed the issue when testing FreeSync using a benchmark, and have only seen it in games like Fallout 4 (though FreeSync remains an improvement). It's really a non-issue; it was just the one thing that caused me frustration since my purchase.
But the 660 Ti didn't compete with the 7850 and 7870; the Tahiti LE 7870 XT was its competitor. The 7870 competed with the 660, and the 7850 was up against the 650 Ti Boost.
@@instantkarma1978 Those were the cards within a similar price bracket, and in my research I looked at relative performance. You're right, though, which makes this comparison even more striking: 660ti: www.userbenchmark.com/PCGame/FPS-Estimates-The-Witcher-3--Wild-Hunt/3855/7683.0.0.0.0 Normal 7870: www.userbenchmark.com/PCGame/FPS-Estimates-The-Witcher-3--Wild-Hunt/3855/7714.0.0.0.0
A lot of benchmarks are set at unreasonable settings, making people think these cards aren't very capable today. Lowering settings is pretty underrated.
@@Pudgeee I have an RX 560 (non-D) and it's not that great. Sure, I try to play at 1440p and should have bought at least a 570. Still, the RX 560 did well for a while at 1440p; now it's almost impossible to play anything at that resolution even on the lowest settings. I could play GTA 5 at 1440p on medium-high settings, but now I can only play CoD WW2 at 1440p on the lowest settings (except AF and textures) at 45 fps, and Far Cry 5 at 1440p needs near-lowest settings and 0.7 resolution scaling just to hold a stable 40 fps. Meanwhile an RX 570 or 580 does 1440p just fine. I can't comment on Nvidia this gen, but after owning a GTX 650 Ti and remembering how they gimped its drivers (DOOM only ran at 640x480, lowest settings, at a sweet 35 fps, and sadly I couldn't even see that cinematic experience well), I don't expect anything good out of the 1050 or 1050 Ti in the long term. If I only played at 1080p, the RX 560 would be a much better card, but the sweet spot is still the 570 or 580, which are about twice as fast for a 30-50 percent increase in price.
@@MJ-uk6lu Well, why would you play at 1440p with a 560? It's a budget 1080p card (or lower), and expecting good performance at anything above that is fairly unreasonable.
@@Pudgeee Because my monitor is 1440p? Anyway, I play mostly much older games, and they run great at 1440p on ultra settings. I really don't care about modern games, and got it because I was worried the GTX 650 Ti wouldn't handle UT2004 at max settings at 1440p.
@@MJ-uk6lu I just don't get what your point was when bringing up the fact that it can't play modern games at 1440p. If anyone were to play today's games on that resolution, they wouldn't rely on a 560. Older games are obviously entirely different.
@@dontuse10 It's as if every single game out there suddenly felt like shit, and no new release excites me any more, please send help... :( As of this month I've used my phone more than the triple-monitor gaming computer next to me...
Same here. I had a 1050 Ti, but I opted to sell it and get an AMD RX 560 4G OC because of the drivers. (I got it on sale too, so even though there's a decent performance hit, it's still much better value for me, alongside the open-source drivers.)
This is such an informative channel. I'm only recently getting into more technical PC hardware, but I feel like videos like these are pretty good starters on how the market and cards work. I really love listening to kliksphilip's voice too.
The RX 480 was really a good investment for me. I paired it with an Intel i5 6500, and it worked beautifully until the Intel CPUs were nerfed by the patches for the Windows vulnerability discovered earlier this year, which caused a bottleneck in some games. It wasn't really behind the 1060, often being faster in some games, and with 8 GB of memory I thought I could use it for VR in case that took off.
Philip, IMPORTANT: the Steam survey takes information from every PC worldwide PER ACCOUNT, meaning if you and your friend used the same PC with the same hardware, Steam would register the hardware twice. And some countries, especially China, are known for PC gaming cafes, which usually have mid-tier, power-efficient GPUs, meaning they have GTX 1060s and 1070s. Imagine hundreds of people logging into those PCs every day and Steam registering the hardware every single time on the very same PC; of course Nvidia's power-efficient cards would seem rather inflated.
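The per-account counting effect described above is easy to see with a toy model. This is just an illustrative sketch with invented numbers (the function `survey_share` and all machine counts are made up, not from Steam's actual methodology):

```python
# Toy model of survey skew: each *account* that opts in reports the
# hardware of the PC it happens to be on, so one cafe PC shared by many
# accounts gets counted many times. All numbers here are invented.
from collections import Counter

def survey_share(machines):
    """machines: list of (gpu_name, number_of_accounts_using_that_machine)."""
    counts = Counter()
    for gpu, accounts in machines:
        counts[gpu] += accounts  # one survey entry per account, not per PC
    total = sum(counts.values())
    return {gpu: n / total for gpu, n in counts.items()}

# 10 home PCs with one user each (half AMD, half Nvidia), plus
# 5 cafe PCs with a GTX 1060 and 20 accounts each:
machines = ([("RX 480", 1)] * 5 + [("GTX 1060", 1)] * 5
            + [("GTX 1060", 20)] * 5)
shares = survey_share(machines)
print(shares)  # AMD shows ~4.5% despite being in a third of the machines
```

Under these made-up numbers, AMD sits in 5 of 15 physical machines but only 5 of 110 survey entries, which is the kind of distortion the comment is pointing at.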
ChinaSteam I'm not sure you understand how the People's Republic of China works. There are very few Chinese people who dare fuck with anything Western, so they have their own clients and games made specially for internet cafes.
Speed (faster is better) - ofc Power consumption - ???? Size (smaller is better) - ok Process (smaller is better) - i trust you Price (lower is better) - *spits coffee*
Lower power consumption is always better when talking about similar cards. A smaller process is always better no matter what. A lower price is always better when talking about similar cards.
Everybody knows that people only buy graphics cards in order to enlarge their virtual penis. If you don't have the most expensive card with ray tracing cores, hyper-threaded anti-aliasing processors and quantum field chronometers, there's really no reason to even consider playing Fortnite.
After getting thoroughly hooked on researching computer specs over the last month, it's awesome timing for you to release a GPU hindsight video. Helps me to validate what I have learned. :D
RX 480s are actually pulling ahead of GTX 1060 6GBs in a lot of newer games now, though it varies a bit depending on the specs of the exact models being compared.
Thanks Philip. As someone who likes hardware, I got very familiar with Nvidia's lineup of GPUs, but I never knew the naming schemes on the AMD side. This video is great at explaining the names and positioning.
12:17 Glanced at my screen as this table popped up and did a double take. Looking at the first AMD entry, the first thing I thought was "How many people have AMD cards in crossfire??"
VR is taking off, but slowly. It's not gonna go away; it's just too damn good. Once the price of VR headsets drops, it will explode, because they are the most immersive and fun thing I have ever experienced.
I thought I remembered that a good part of the power and efficiency problem for Polaris and Vega was using GlobalFoundries' 14nm process. TSMC's 16nm, IIRC, is superior by a fairly good margin.
I bought a PC with a GTX 1070 over the 480 in early 2017. It was a while ago, but from memory the deciders for me (at the time I wasn't as well read in hardware) were, firstly, that the price difference in Australia wasn't as competitive for AMD. The 1070 had, and still has, an excellent reputation. My primary goal was to build a VR machine, and while I didn't put too much stock in driver debates, fans on both sides were pretty adamant Nvidia was the way to go for that. Finally, because Nvidia has the biggest market share, it's a lot easier to find an array of user reviews; something I put less stock in now, but for a budding enthusiast that was important to me then. I believe at the time the R9 Fury and the 390 were also competing on price; while it was clear where the 980 and 1070 sat generation-wise, AMD's lineup was more confusing in that particular area. The GTX 1070 was an excellent choice and I don't regret it. What I do regret, but only in hindsight, was not getting a Ryzen CPU, though even then I probably would have got a cheap motherboard that couldn't have been upgraded through to the end of the AM4 generation chips.
The 7970 did prove faster than the 680, and the 7970 GHz was faster full stop, even at the time; now it's an outright victory. The 290X competed with the 780 Ti, which used similar power and had similar performance, but the 780 Ti aged worse and was much bigger. Hawaii was the last time AMD fully beat Nvidia, not the 4000, 5000, or 6000 series. It was the 290X.
I was one of the idiots who went with a GTX 770 (same chip as the 680) instead of a 290 because of 'power and heat' considerations, despite getting a pretty hot and noisy Windforce 3X that was not much different from the Sapphire 290 I was looking at.
But the Hawaii cards got a bad reputation because of the shitty coolers. As much as I like AMD/ATI, they've always failed on customer experience. Then mining blew it for AMD.
@@snetmotnosrorb3946 You're completely wrong about mining. It's been a saving grace for RTG and is funding what comes after Navi. Gaming is no longer the single most profitable market for GPUs; there are a lot of other uses for them nowadays.
@@snetmotnosrorb3946 The Sapphire Tri-X cooler for the 290 was better than or on par with what I had on a similarly priced GTX 770; I had just fallen for the marketing trap.
I had an R9 390 and quite liked that card; it just felt powerful... but it also ran hot and thus a bit too loud for my taste. I, sadly, got dragged into the mining "business" by a friend, and when things started going off the rails I sold my 390 and replaced it with an RX 580 that I'd previously used for mining. It felt like the exact same card, but with a much quieter design. I quite liked it, but it really didn't feel like an improvement in terms of performance, so I soon ditched it for another card from the depths of the Ethereum mines: a Vega 56. That really was an improvement in the performance department, and I quite liked how fast games ran with it... on the rare occasions when I could actually hear them. Man, was that card loud! At that point I ran out of options: I wanted something faster than Polaris that was still quiet. For the first time in my life I decided to go with Nvidia instead and bought a 1070 Ti. In terms of efficiency this card was a huge improvement; it's a little bit faster than the Vega 56 was, and it's really quiet. And I hate it. Through all the years I was using AMD cards I was always being told how their drivers suck donkey balls, and all that. You know what? There ISN'T A SINGLE DRIVER for my Nvidia card that works well with all my games. Some driver versions cause stuttering in X games, other drivers make Y games stutter; it's ridiculous. And G-Sync, what a worthless piece of... You pay quite a lot for it, and then it works properly with, what, 10% of the games currently on the market, at best? Thanks to all my issues I became a regular visitor on the Nvidia forums as well, and the ignorance going on there is mind-blowing. Nvidia refuses to fix issues; they just concentrate on marketing their new products and selling new stuff... Once they've got their claws in you, once you've already given your money to them, you're no longer important; you're just a problem, not a respected customer.
I can totally understand all the people here rooting for AMD; the very first day Navi comes out, I'll be back on the red team! Nvidia is like the Apple of PC hardware: looks tasty, smells good, the sugar coating is perfect, the price is high too, so it MUST be good, right? Right?! Then you take the first bite, and meeehhhh, crap.
Had a 480 since launch. Loved it. It did great, and I even made a bit mining with it. Ended up selling it cheap to a friend who was using a GTX 660, and bought a used 1080 when prices fell below $350 (got mine for $325).
Great video, except for the use of the Steam hardware survey at 12:30-ish. The problem with using this data is that it fails to account for the immense popularity of Nvidia's cards in the Asian market and its heavy use of PC bangs and internet cafes, many of which are either directly sponsored by Nvidia or get subsidies from Nvidia to use their cards. This causes a massive artificial influx of Nvidia cards in the Steam hardware survey, as many of those users haven't actually bought an Nvidia or AMD card at all; they just use them. The average person building a gaming PC, especially in the West, probably has a much higher chance of having an AMD card in their system.
It is going to be incredibly difficult for AMD to make a major comeback. There are a lot of reasons to take Nvidia over AMD right now, and it has been that way for quite a long time now. The sad thing is without real competition Nvidia card prices are going way up.
If there's any time for a comeback to be made, it's now. The RTX launch is leaving a sour taste in a lot of people's mouths (regardless of Nvidia's motivation behind it), and AMD is likely to announce their 7nm architecture's GPUs in January. It's still not very likely - Nvidia would probably have to bungle several launches in a row to waste the mindshare they have in the consumer market - but AMD might be able to make a dent in things if the stars align. They managed to give Intel a hell of a wake-up call with Zen last year, after all.
@@Jordan-rNvgQ Umm... I'm pretty sure you can find something similar to or better than ULMB. As for the most-used reason from Nvidia users, "emulation": only Cemu has this problem, and the rest work fine. As for connectivity, different AIB cards can have different outputs, and there are adapters for old monitors, so those aren't reasons to buy one over the other. With a bit of research you might find what you need on both, and I don't think anyone would buy a GTX 1080 Ti/2080 Ti or a Vega 56/64 strictly for emulation. What I'm trying to say is that I don't find your reasons strong enough to rule out one over the other.
I used to use a 290X with three fans, and that thing was like a jet engine in volume. I've upgraded to a 1080 now and it's a good 2-3 times quieter, not to mention the performance hike.
Hey, so the reason why I got a 1050 Ti was that it was on sale for ~$100 US . AMD's closest offering (the 460) was listed at ~$130 at the time. It was just a better deal for me.
Yup, even my friend went with a 1050 Ti instead of an RX 500 series card, and with an Intel Pentium G instead of a Ryzen 2200G at the same freaking price (yes, we're from Malaysia). I can't understand this brand loyalty thing, which also comes from past reputation. Past reputation makes people stick with iPhones instead of more versatile and equally powerful Androids like the latest Samsung Note (3/4 of the price in my country), OnePlus (a fraction of the price), or Pixel (almost the same price but with a far better user experience, cloud service, etc., as confirmed by a friend of mine who's been a loyal iPhone fan since the 3GS era up until the X, or Ten, whatever you want to call it, and never once switched brands, and also by lots of reviewers out there), and many more.
Personally, as an aspiring content creator, I don't think I would ever have bought AMD back then, no matter what. As you said, they lacked an equivalent to Shadowplay back then, and to be fair, they kind of still do. Sure, they have ReLive or whatever, but they don't have NVENC, which is supported by OBS and a handful of other programs. They also don't really have an equivalent to CUDA that's as well supported. That makes Nvidia far more appealing for people doing any kind of 3D work, and much more. The same will go for the RTX series, which is great for professionals working with ray tracing and AI. Back then AMD was also at a super low point on the CPU side as well, Ryzen not being out yet IIRC. AMD was not very competitive in CPUs, so people often went Intel/Nvidia instead of AMD. Another issue was marketing around VR. Although it's nice in theory to have VR for $200, and it's achievable at a low price point, its appeal still mainly lay with enthusiasts, especially back then. This means that people whose price point was $200 didn't really care about VR. VR headsets were still very expensive, and the games were limited. People who could justify getting into VR at the time were usually the same people who could justify purchasing something far better, like a 1070 or 1080. We're also starting to get more "less knowledgeable" people into the gaming scene, who just want what everyone else is using, which is usually Intel/Nvidia, as that's what all the content creators and high-end gaming rigs use. People don't really care that much about looking into price/performance like they used to, or at least the people I talk to don't any more. Sometimes people just want the most expensive thing for whatever reason; look at smartphones. That's personally why I think AMD failed during this launch, and why I believe AMD needs to start focusing on the content creator for their GPUs, like they did with Ryzen, as Nvidia's feature set kinda blows AMD's away.
@@3kliksphilip They do now, but it's not widely supported like NVENC is, and I'm also not aware of which cards specifically support it. Thanks for the reply btw, ily. The last time I got a reply was when YouTube had direct messages and I sent you a shitty map I made when I was 13.
The hardware H.264 encoder on Polaris has been supported fine in OBS; I use it on my RX 480 all the time and have been doing so for a while now. In the early Polaris days you actually needed this bad boy: obsproject.com/forum/resources/amd-advanced-media-framework-encoder-plugin-for-obs-studio.427/ Yeah, Nvidia was 3-4 years ahead with Shadowplay, so naturally their NVENC got baseline implementation way sooner, but please don't make up nonsense about OBS and support for AMD hardware. :) In fact, I HAD to use OBS on my GTX 760 for hardware capturing, as the Nvidia software would not let me use Shadowplay; it wanted me to have a 770 or higher despite the 760 having the NVENC hardware, which meant I couldn't use it for ages until the software came along. But yeah, I totally agree on the CUDA and industry thing (and RTX); they have that on lockdown, and introducing alternatives that aren't 1:1 compatible is useless at this point. I actually remember RTG making some kind of program to convert CUDA code into code a Radeon could crunch, but I never really heard much more about it, and I don't even remember what it's called any more... That was only useful for converting standalone sciency CUDA stuff anyway; it does nothing for 3D programs that need to natively hook into the hardware. The alternative, OpenCL, still seems to have trouble taking off. Luckily my 3D program of choice is Blender, which is one of the exceptions that does have half-decent OpenCL support these days (even CPU+GPU rendering).
@@elikay2101 Mhm, do you know Blender? A Vega 64 renders any scene faster than a GTX 1080 Ti, and now with AMD's pro renderer it's even faster. AMD's form of ray tracing has been available for a long time, known as "FireRays". For AI, a Vega card would be a no-brainer against a GTX 1080 Ti, considering it beats it in almost every scientific workload. BTW, what are you creating?
"but they don't have NVENC, which is supported by OBS" - hardware-level encoding has been present since their first-gen GCN 1.0 architecture (I think that includes the 7000 series, not sure). As Philip said, it's called VCE and could be used in OBS. I personally used to record 1080p 30fps at 20k bitrate on a mid-range R9 270X, and the more mid-range R7 260X had GCN 1.1 (or 1.2, don't remember for sure) and improved on that to support 1080p 60fps at the same bitrate... Usually people don't do proper research and then say "oh, this didn't have that or that or that".
@@3kliksphilip The comment I made wasn't worded well; what I meant to say is that I wish I knew more about computers and the history of companies and products. I don't really know much about the history of computer tech (all I really know is that a 1080 Ti is good); it's mainly all the acronyms and numbers that get me. Don't worry, the way you presented the information was great; I liked how you actually explained what the facts and statistics meant. Sorry if the comment was a bit provocative, great video as always.
@i Draw Generally you don't use mining as your sole source of income; some people willing to invest insane amounts of money do, but for most it's a mostly passive form of income. The ROI can take 8 months or more. When you do hit break-even, though, it can mean an extra few hundred dollars a month (depending on where you live). People also tend to eventually resell their cards to recoup more of that money faster. Also, the biggest cause of the shortages was actually the big groups that would buy hundreds or more cards at a time, not the people who would buy 10 or so, especially people like this who probably bought used cards.
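The break-even reasoning in the comment above is simple arithmetic, sketched below. All figures (card cost, monthly revenue, electricity cost) are hypothetical placeholders, chosen only so that the example lands on the "8 months or more" ROI the comment mentions:

```python
# Hypothetical mining break-even sketch: months until a card pays for
# itself, given its purchase cost, monthly mining revenue, and the
# monthly electricity bill it adds. Numbers below are invented.
def breakeven_months(card_cost, monthly_revenue, monthly_power_cost):
    profit = monthly_revenue - monthly_power_cost
    if profit <= 0:
        return float("inf")  # never breaks even at these rates
    return card_cost / profit

# e.g. a $400 card earning $80/month with $30/month in electricity:
months = breakeven_months(400, 80, 30)
print(f"break-even after {months:.0f} months")  # 400 / 50 = 8 months
```

After break-even, that same $50/month (per card, under these assumptions) is the "extra few hundred dollars a month" once scaled across several cards, which is why resale value matters so much to the overall return.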
Something to add is that there are perhaps more prebuilts sold with Nvidia than with AMD, and because of that, people buying prebuilts are more likely to end up with a PC containing an Nvidia GPU than an AMD one.
I was considering an AMD card for a future build due to FreeSync and better performance in some games I play, but given the hardware survey I'm a bit skeptical about it now. I've been pretty dependent on Shadowplay on my current system, and I have the impression that ReLive is not as good. With their dev leadership gone, I hope they still manage to succeed with their next launch, which should be soon according to their roadmap. Given Nvidia's poor behaviour at the moment, we really need some good ol' competition once again.
I haven't kept up to date, but a few months after ReLive came, benchmarkers/testers claimed it had better video quality and had less performance impact than Shadowplay.
I haven't used Shadowplay (although I guess I could test it), but as far as I can tell from the settings, ReLive is superior in most ways, with options for both AVC and HEVC, audio quality settings, and a few extras for livestreaming.
I'd been using a used 760 Ti for about 2 years, and it was a great little card for the £70 I paid on eBay, but I had to upgrade for newer games in late 2018. Looking at Nvidia's ridiculous pricing, I chose to switch to AMD and got an RX 580 Nitro for £170! The closest thing from the Nvidia camp was the crappy 3GB 1060, which was around £200 and performs more like an RX 570. That was an amazing purchase. Zipping through Forza Horizon 4 on Ultra at 60 FPS is what I needed, and at that price it was a no-brainer. Nvidia is high if they think they can be the Apple of GPUs and constantly raise their prices like this.
One important thing: these graphics cards were not used for mining Bitcoin; Bitcoin mining is nowadays only done with ASIC miners. The miners were all mining other currencies that are somewhat resistant to ASIC mining.
@12:30 The reason you see such a disparity between Polaris and Pascal at the 1060 level and below is that blockchain pretty much priced all of the AMD GPUs out of consumer hands. Pascal pricing didn't start inflating too badly until an RX 480 started costing $400.
I've been using an AMD RX 470 for about 3 years now and I've never looked back. This little thing runs like a dream and my friend is inheriting it from me once the next good AMD GPU comes out. Could be the RX 590, who knows.....
Excellent analysis. I think you understated the problem of miners buying up Polaris and how this shows up in what people actually game with. If you look at AMD's figures, their market share should be much larger than what shows up in something like the Steam hardware survey; AMD had millions of Polaris GPUs going out the factory door per month that just aren't out there being used for gaming. Ex-mining AMD cards are indeed flooding the second-hand market right now. Interestingly, miners would undervolt and underclock Polaris cards and keep them well cooled and usually less dust-clogged than a gamer's card (the difference between a garage and someone's dusty bedroom, I guess), so the cards hitting the market are hardly worn out at all and probably have a long life ahead of them. Gamers just aren't into buying second-hand stuff, but I wonder if that habit will change now, and how that will shape the market for a while.
AMD's market share in the Steam statistics might be very low, but cryptocurrency miners bought a TON of these cards. Also, they supply the CPUs and GPUs for both the PS4 and Xbox One, and their Ryzen CPUs compete very well with Intel at every price point. Hopefully they'll be able to compete with Nvidia in consumer GPUs again in the near future.
Cards that were used for mining tend to be MORE reliable, not less, than other used GPUs (assuming the seller is reputable). They are typically undervolted while mining, and the fact that they are run 24/7 means there should be no defects and will continue to run for the foreseeable future.
I am very happy with my Polaris-based card, which is sold under the "RX Vega" brand, inside an Intel CPU. And by "happy" I mean confused. But also happy that Intel and AMD can get along to do that. Seriously, it's a thing that exists!
I live in Brazil, and at least before the crypto boom the RX 480 was an awesome card price-wise. I bought mine for R$200 less; the model I chose was the Gaming X from MSI for R$1200. I got an overclockable card that runs pretty cool, which beats paying R$1400 for a basic 3GB 1060 Mini from Zotac.
I remember a couple of years ago I was on the RX 480 train. I bought two of them and tried to OC them, but this is where I learned that backing off the voltage on these cards would actually increase performance, since AMD's stock voltage was so high relative to the power limit that clock speeds suffered. The VRMs weren't that great either if you ran them at full power (on the reference cooler), so I water-blocked them. Ended up selling that PC and did a build with a Vega 56; I'm loving this card.
As someone who preordered an RX 460 and had one from launch until very recently, I think Polaris is good; the card was strong enough to run everything I needed it to, and it was really cheap (about $120).
Miners bought Polaris cards en masse; I hadn't seen them in stock at MSRP until recently here in Finland. And even before the craze they weren't well stocked: it was AMD on back order or Nvidia immediately.
I just bought an MSI RX 570 8GB card, a bit late, but thanks to FreeSync support it's still a very capable full-HD card, and the price was really great (155 EUR). I was surprised, though, that I could undervolt it massively (1.15 V -> 925 mV) to keep it cool at 65°C, powerful at 1286/1950 MHz, and quiet, all at the same time. It consumes just below 90 W in Battlefield 1 with a 48-72 Hz FreeSync monitor. Very remarkable indeed; the default values are awful in comparison. What a shame that only now, at the end of its shelf life, more gamers get to appreciate so much value at its intended price range.
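As a rough sense of why an undervolt like the one above saves so much power: dynamic power in a chip scales roughly with clock frequency times voltage squared. The sketch below is a simplification (it ignores leakage, memory, and board power, so the real saving will differ) applied to the 1.15 V -> 925 mV figures from the comment:

```python
# Rough estimate of dynamic-power savings from a GPU undervolt.
# Approximation: P_dynamic ~ f * V^2. This ignores leakage and board
# power, so treat the result as a ballpark, not a measurement.
def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    """Ratio of new dynamic power to old, at the given voltages/clocks."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Same clocks, voltage dropped from 1.15 V to 0.925 V:
ratio = dynamic_power_ratio(0.925, 1.15)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power")  # ~35% less
```

That ballpark ~35% reduction in the dynamic component is consistent with why a card that would otherwise run hot can sit at 65°C and under 90 W after an undervolt, with no clock (and hence performance) loss.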
The 4000 series was amazing. People still bought the 200 series in droves, and they were happy when its price dropped to compete with ATI, even though the 4000 series was still a massively better deal. No matter how good ATI's cards were, nobody bought them.
Polaris was solid, but I'm interested in what's next. nVidia has gone crazy with their prices and it seems like we no longer have a great 970-like deal for midrange consumers.
I had a 480, an Asus Dual 4GB model. Fantastically good card. I've upgraded to Vega now for rendering, but I will always stay with AMD. I'll likely have this card for a while.
Both are a great deal. German pricing is currently €200 for the RX 580 and €220 for the GTX 1060. The RX cards come with Assassin's Creed in the bundle (the key is worth about €40), so if you don't need it and sell it, you'd effectively have paid around €160... it can't get any cheaper.
Oh, I had the R9 290! Yeah, that was a very loud one. It was so loud that my family in the other room asked me why I had a kitchen mixer in my PC. Little did they know it was a GPU with its fans at 30% speed... I hope I never stumble upon such a loud piece of hardware again. I instantly sold it. I forgot to mention one little detail: it was overheating even at max fan speed, reaching over 110°C in just Battlefield 3.
I eventually settled on the Nvidia 1060 over the Radeon because I was using an Nvidia 680 before, and was getting into the Nvidia "GameStream" system to stream games to other devices over my home network. Back then I was only dabbling with this, but today I don't think I could buy an AMD card unless they came up with a system like this of their own. Steam streaming is good, but a lot lossier than GameStream, performance-wise. So it's not really brand loyalty. More... needed functionality.
Back when these cards released I was set to buy an upgrade for my 960. I ended up with a 1060 because the AMD cards just weren't available from reputable retailers in my country, and resellers on our equivalent of eBay would overprice the cards by 50% even before crypto took the stage. Even now the local Amazon only lists a single Vega 64 and two Vega 56s for sale. AMD MUST improve availability for Navi if they want to sell more.
Bear in mind, Mr kliksphilip, that prebuilt PCs here in the UK mainly use Nvidia GPUs. This would account for a large percentage of users: the casuals who spend too much money so someone else builds their tower. It's a shame, but my friend bought a 1060 only because there was no 470 available.
Great video. I think many of the factors you mentioned contributed to Nvidia dominating the mindshare among PC gamers. When I was upgrading my GPU a couple of years back, AMD barely entered the conversation. At the mid-range, they weren't faster, cheaper or cooler. The 1060 was a no-brainer by comparison. In that scenario, I'd have had to either come across a crazy deal on an AMD card, or ignore the factors I mentioned above and back the underdog out of principle. Back then I went with the safe option, as many others clearly did according to that - slightly alarming - Steam data. Now though, I'm watching and rooting for AMD more than ever (well, as much as it's appropriate to root for a billion-dollar company), given the various pricing and ethical shenanigans surrounding Nvidia and Intel. The 2019 tech dream is to jump on PCPartPicker - money in hand - and configure an all-AMD system with a badass Zen 2 processor, a (hopefully) 1440p-capable 590, and a sleek FreeSync monitor. Fingers crossed, eh? 🤞🤓
Well, the 590 is out and it just barely manages to beat the two-year-old 1060... AMD needs to step up or Nvidia will keep those prices high AF. I'm upgrading my PC soon, and am waiting to see what the 2060/1160 (whatever it ends up being) will be like before deciding whether I get a 1080 or a 2060/70.
When I was building my PC, Polaris was simply not an option for me. The mining craze was at its peak, and getting a Polaris card would have cost me double the price of a 1050 Ti (the card I got). If I were building a PC right now, I'd probably go with AMD (although they're a bit more expensive where I live).
I had a 460 for around a year. I bought a 1060 6GB because it was during the mining craze and it was on Newegg for a proper price (not $600). Had a 480 been available, I would have gotten that, but I guess only my CPU is red for now.
You forgot to mention how the GTX 200 series outsold the HD 4000 series, despite being hotter, louder, larger, and more power-hungry. Nvidia in the past had times when it was worse than AMD on every scale, and no one gave a shit, but the second a lack of funds bites AMD in the ass, everyone laughs at them and calls them crap. If AMD pulls back ahead in the future, nothing will change. People will STILL buy Nvidia over AMD, just like what's happening now with Intel, where AMD is better in all regards. That 2-10 FPS difference is so negligible at 144+ FPS that it's outstanding people still seem to think it qualifies as an argument. And people wonder why "AMD fanboys" are toxic.
The Steam hardware survey, I think, is not a good indication of the Polaris cards' market share because, due to mining, you could not find RX cards on the market (out of stock, or at double the price) for a year and a half... almost the entire life span of the 400 series.
I had an R9 270 for years. Solid card. Then I wanted something new and couldn't decide between the 480 and a 1070. I went with the 1070 because the 480 had issues with GTA 5, and I also wanted something more powerful. Power consumption wasn't much of a concern to me, tbh.
I had a 4870 and then an R9 290 and have a good impression of AMD, but I am pulling the trigger on a 1080 Ti on Thursday, as the prices have dropped a bit and AMD does not really have an answer for the raw power of the 1080 Ti.
I'm from the future! Navi has launched and AMD slapped a 2x price increase on top of it, just like Nvidia. So I joined the 2% of people with brains and bought an RX 580 for $140, new.
I had two HD 7970s in CrossFire, and a few months before the release I was hoping to replace them with one RX 480. Instead I had to upgrade to a GTX 1070 for 490 euro. Currently I have a 32" FreeSync monitor and a VR headset. It would be nice if Navi were a good upgrade over my current GTX 1070...
I have a singular reason for buying a 1060 instead of a 480: I dabble in CUDA programming and machine learning, and the software support for CUDA is great.