Radeon HD 7000 Series with Hindsight: Great, or Disappointing? 

2kliksphilip
654K subscribers
69K views

Published: 28 Sep 2024

Comments: 396
@juanignacioaschura9437 2 years ago
Actually, the 7000 moniker came full circle two times:
- ATI RADEON 7000 (R100): 2000
- AMD RADEON HD7000 (GCN 1.0): 2011/2012
- AMD RADEON RX7000 (RDNA3): 2022
@ahreuwu 2 years ago
can't wait for the amd radeon xl7000 (rgp3) in 2033
@DigitalJedi 2 years ago
I have one of each of the first 2. Hoping to switch away from Nvidia for the RX7000 series to go full circle.
@pauls4522 2 years ago
Yup, I found that pattern too. Almost every 10 years AMD loops back to a 7000 series: the original Radeon 7000 series, the HD 7000 series, and now the upcoming 7000 XT series. I have personally owned the awful 32MB ATI Radeon 7000, a card which I actually fondly remember because it was my first GPU in my first ever PC that was all mine and not shared with family. I did not learn it was a gimped card until years later. Gimped as in a 64-bit memory bus, so even 3dfx cards from late 1998/early 1999 could beat it, despite it having DDR memory instead of SDRAM. For example, my sister had a 16MB 3dfx Voodoo 3500TV, a 1GHz Athlon CPU and 384MB of RAM. I had a 500MHz Intel Pentium 3, 768MB of RAM, and the 32MB ATI Radeon 7000. Back then I thought her PC was faster because she had a drastically better processor and a much nicer monitor providing cleaner visuals, but now I know her Voodoo card was at least 30% faster than mine as well. I also owned an HD 7970 and still fondly remember it as my workhorse GPU for 5.5 years; I would have kept using it if the card hadn't started dying around the 5-year mark, leading me to replace it in February 2019.
@albal156 2 years ago
OMG 10 years apart too lol.
@GamingRevenant 2 years ago
The AMD HD5870 is literally the GPU which powered the start of my channel, and for more than 10 years allowed me to record videos and play video games with little to no problems (on 1080p). I consider it the best GPU pricewise vs. its performance that I have ever had. When I went ahead to buy a HD 6870, the salesperson literally told me it was not worth upgrading because all AMD did was shift the previous generation along the model names down without really improving. I'm happy he was honest about it, and so it seems he was right.
@e-cap1239 2 years ago
Actually, the 6870 is worse, as it has fewer shaders. But it was sold for nearly half the price. I personally also had an HD 5870 2GB and its performance was quite good, even if nowadays it is comparable to a GTX 1050.
@raresmacovei8382 2 years ago
But that's not actually correct. Performance was mostly the same, but the card was smaller, more efficient, cheaper, and great at Crossfire.
@stevy2 2 years ago
I had the HD 6950, unlocked to a 6970. That thing was a budget beast back in the day.
@Cheesemonk3h 2 years ago
@@stevy2 I used a 6950 up until like 2018. It was severely outdated by that point, but I still used it to play Overwatch at launch. Shame how quickly that game got ruined by identity politics.
@Annyumi_ 2 years ago
@GamingRevenant You could have started your 10-year channel with a GTX 260, since all the games you play are DX9, so you could have used that instead of a DX11 GPU.
@01chohan 2 years ago
AMD putting 3GB of VRAM in the 7900 series made such a massive difference as the GPUs aged. Reminds me of the i5 2500k vs i7 2700k comparison
@Orcawhale1 2 years ago
Both of which are completely wrong, as the 7970 ran out of performance long before it turned into a VRAM problem. The same thing happened to the 2500k.
@AFourEyedGeek 2 years ago
Haha, no it didn't, the 3GB was an awesome decision. Check out the '7970 gaming in 2020' YouTube video. The card was doing really well for such an old card, helped by the higher amount of RAM on it. My Vega 64 is still going strong thanks to the 8GB of RAM on it.
@Orcawhale1 2 years ago
@@AFourEyedGeek No, your Vega 64 is still going strong on account of the fact that we've hit a point of diminishing returns in graphics, which means there's no longer the same push for increases in graphics. And I'm afraid you're wrong; the 7970 does in fact run out of performance way before the VRAM buffer hits max.
@AFourEyedGeek 2 years ago
@@Orcawhale1 The video I suggested was evidence to the contrary regarding the 7970: games using just under 3GB of VRAM and performing relatively well. Increasing texture sizes has a minimal hit on performance but eats into the VRAM. Higher textures can make games look good, though you'll have to lower other settings to maintain decent fps. Different settings stress different areas of a system, so tweaking settings for your specific system can be beneficial; even a mismatch, with more VRAM than the GPU's relative performance warrants, can offer a reasonable trade-off.
@pauls4522 2 years ago
@@Orcawhale1 Incorrect, dude. By the late 2010s, around 2016, the 3GB of VRAM was more of a limiting factor for the card than its raw performance. I have personally noticed this in plenty of games from around that era, such as Mirror's Edge Catalyst. On the less commonly used 6GB variant, games from that era can have maxed-out textures and still hit 1080p/60 just fine. With the 3GB variant, however, you would see fps in the 20s if you attempted to max out the textures.
@mariuspuiu9555 2 years ago
I believe the 7000 series is considered legendary because it improved a lot with driver updates (aka AMD Finewine)
@RealMephres 2 years ago
AMD's Drivers are somehow very good cross-platform in almost every regard. Them unnecessarily increasing prices is annoying, but you got to give respect to their software department.
@Orcawhale1 2 years ago
AMD Finewine has never been a thing. It was simply down to the architecture being reused over and over again.
@mariuspuiu9555 2 years ago
@@Orcawhale1 It was a major thing early on with GCN. The driver updates gave that architecture major improvements over time (especially for those who bought the HD 7000 and 200 series). For example, in Doom Eternal the GCN cards are at times twice as fast as Nvidia's 600/700 series (direct competitors at launch). The 7970's competitor should have been the 580, but it can beat the 780 Ti in some titles now.
@AFourEyedGeek 2 years ago
The AMD drivers kept improving performance for GCN, while Nvidia's drivers hurt performance over time and then dropped support early. It probably does have a lot to do with AMD using GCN for so long; however, the outcome for 7000 series owners is that the GPU was improving over time.
@Orcawhale1 2 years ago
@@mariuspuiu9555 Doom Eternal uses Vulkan, which itself is based on AMD Mantle from 2013, so obviously GCN is going to perform better than Nvidia. What's more, the increased performance was simply down to the fact that developers became more familiar with AMD's GCN architecture, as the PS4 and Xbox One both used AMD hardware.
@simpson6700 2 years ago
I feel like we need history videos like this whenever a new generation comes out, reminding us of the performance, price, die size, and power draw. I feel like Nvidia and AMD went off the rails this gen with the shortage, and I don't think we will ever go back to normality, because the shortage was profitable.
@rzarectz 2 years ago
Kliks your tech content is top notch. Keep it up!
@thepcenthusiastchannel2300 2 years ago
It's actually easy to tell how much of the performance uplift going from a 6970 to a 7970 was due to the new architecture and how much was due to the die shrink. A die shrink allows you to pack in more transistors and often allows you to achieve higher clocks. In this case, 28nm didn't allow for that much of a clock speed improvement, going from 880MHz to 925MHz. The bump in ALUs ("shader units") also wasn't that large, going from 1536 to 2048, pushing FP32 performance up from 2.7 TFlops to 3.8 TFlops. The number of ROPs (still the main performance-determining factor) was the same at 32 for both. This meant that the pixel fill rate only went from 28.2 GPixel/s to 29.6 GPixel/s, yet the performance went up significantly between the two cards. This is because the fill rate efficiency went up, and the pixel shader efficiency went up as well. The move AMD made was from VLIW to SIMD. So what we're seeing is a boost in efficiency predominantly led by architectural improvements rather than node shrink improvements.

However, GCN was meant to service two markets at once, and this was its downfall in the consumer market. It was meant for the professional workstation market as well as the consumer gaming market. This meant that GCN tended to be extremely powerful compute-wise but lacked pixel fill rate performance. nVIDIA, on the other hand, segmented their architectures: nVIDIA's gaming line had more ROPs than AMD's and less compute performance. nVIDIA basically went the way of the ATI/AMD 4000/5000 series with Kepler and Maxwell. Less power usage, more gaming oriented.
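For anyone who wants to double-check the figures in the comment above: they follow from the standard theoretical peak formulas, pixel fill rate = ROPs × clock and FP32 throughput = 2 ops (one fused multiply-add) × ALUs × clock. A minimal sketch in Python, assuming the reference clocks and unit counts quoted above:

```python
# Sanity check of the HD 6970 -> HD 7970 numbers quoted above, using
# theoretical peak formulas (not measured performance).

def gpixel_per_s(rops: int, clock_mhz: float) -> float:
    """Pixel fill rate: one pixel per ROP per clock."""
    return rops * clock_mhz * 1e6 / 1e9

def fp32_tflops(alus: int, clock_mhz: float) -> float:
    """FP32 throughput: 2 ops (one FMA) per ALU per clock."""
    return 2 * alus * clock_mhz * 1e6 / 1e12

cards = [("HD 6970", 1536, 32, 880),
         ("HD 7970", 2048, 32, 925)]
for name, alus, rops, mhz in cards:
    print(f"{name}: {gpixel_per_s(rops, mhz):.1f} GPixel/s, "
          f"{fp32_tflops(alus, mhz):.2f} TFLOPS")
# HD 6970: 28.2 GPixel/s, 2.70 TFLOPS
# HD 7970: 29.6 GPixel/s, 3.79 TFLOPS
```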
@Peterscraps 2 years ago
I got a 3090 because it was literally the only thing I could get my hands on; the same thing happened back in early 2016 when I got my R9 390X. It seems I got stuck in the upgrade cycle that has the worst power efficiency and market conditions. Don't get me fucking started on the abomination that is the 3090 Ti: double the price, needing a new power supply, with energy costs going through the roof. Nvidia is playing the market because they didn't last time. It's like seeing your drunk uncle piss away his inheritance at Betfred.
@veryfunnyguy4991 2 years ago
“Another successful office blunt rotation. pass that shit homie. Actually Norman you're being kicked out of the circle. Oh you can't do this to me I started this sesh! You chief that shit for like 6 puffs before you pass you're out Norman. DO YOU KNOW MUCH MUCH I SPENT ON THIS WEED!? Couldn't be much dog this is mid af. Fuck these opps I'll buy my own weed. I would like to purchase your best loud. Yeah fella try this out it's called goblin gas. WHAT THE FUCK DID YOU PUT IN THIS SHIT!? AH HAHAHAHA FELLA I'M TWEAKIN!!!”
@kiri101 2 years ago
The HD 7850 2GB has aged like a fine wine, mine's still powering forwards with modded drivers on Windows and excellent out-of-the-box support on Linux.
@Orcawhale1 2 years ago
Then obviously it hasn't aged like fine wine.
@kiri101 2 years ago
@@Orcawhale1 It received considerable performance improvements in its lifetime, especially in open source drivers, and still continues to benefit from increased functionality and performance with modded drivers. So yes, like fine wine.
@hellwire4582 2 years ago
I got a 7950 as a replacement for a 6670 in 2020. I was blown away when I could run Red Dead 2 at 1600x900 low settings, with an OC'd i7 950 and the 7950, at 40 fps ^^ The game still looked totally amazing and felt smooth.
@FatSacks 2 years ago
that's badass
@Orcawhale1 2 years ago
Then I'm afraid you don't know what smooth is.
@hellwire4582 2 years ago
@@Orcawhale1 i know your brain is smooth
@dklingen 2 years ago
Great video and a nice perspective - sadly, the days of new cards delivering more performance at the prior generation's pricing seem to be lost, as we are hitting $2K for top tier (insanity).
@akn8690 2 years ago
I'm subscribed to both this and the kliksphilip channel, but videos from this channel won't drop into my subscriptions page; I saw this on my home page. Weird. Also, I would like to see a video about why AMD drivers were bad, whether they are still bad, and whether we should pay more for Nvidia cards for their better and more stable software support (both DLSS/Nvidia Broadcast kinds of things, and drivers).
@unison_moody 2 years ago
Same
@gennoveus 2 years ago
The video was hidden for me, too...
@timothypattonjr.4270 2 years ago
The 7970 aged better than expected because AMD continued to use GCN for far longer than Nvidia used Kepler.
@Loundsify 2 years ago
It also helped that a lot of game engines on console were built around GCN.
@TrueThanny 2 years ago
Crossfire actually worked very well at the time; the number of games that didn't work well with it was very small. I had a pair of 7970s clocked at 1200MHz each, and that was faster than any nVidia card for years afterwards. As for efficiency, AMD is well ahead with RDNA 2, and is likely to get further ahead with RDNA 3. It's an error to conclude that the difference in power consumption is down to the node: the power efficiency difference between TSMC's 7nm node and Samsung's 8nm node is far smaller than the efficiency difference between RDNA 2 and Ampere. Whether or not RDNA 3 takes the absolute performance crown, it's looking like it will utterly destroy nVidia in performance per watt. And that's with nVidia seemingly having a slight node advantage. But we'll see once we have actual data rather than just rumors.
@additivent 2 years ago
They seem to have aged well, only failing now because of AMD's shoddy driver support. The 7770 would have been a legitimate budget option for me 2 years ago, if it weren't for a friend offering me a 1060 3GB for cheap.
@RuruFIN 2 years ago
The modified NimeZ drivers give the older cards some extra life support.
@cyjanek7818 2 years ago
Wasn't it shown on the Linus Tech Tips channel that older AMD cards actually work well with newer games (better driver support)?
@MandoMTL 2 years ago
@@cyjanek7818 Fine Wine. Yep.
@conenubi701 2 years ago
"Shoddy driver support"? This is an ancient card; you can't reasonably expect it to still be supported by drivers in this day and age.
@peters.7428 2 years ago
Custom drivers?
@simonrazer8303 2 years ago
it felt like you said "in conclusion" 50 times in this video
@simonrazer8303 2 years ago
@@2kliksphilip Oh okay. Well, it was a good video regardless 👍
@Squeaky_Ben 2 years ago
I remember building my very first gaming PC in 2011. 8 GB of RAM, a 6 Core Phenom II and a HD 6970. All to play Crysis 2 and Battlefield 3 at maximum resolution. Those really were the days.
@GD-mt4pe 2 years ago
I didn't get this video in my subscription feed, I saw it in my recommended 4 days later.
@0Blueaura 2 years ago
the switch to 6000 series at 2:00 got me really good xD
@bmcreider 1 year ago
Awesome video on my current rabbit hole of nostalgia I’ve ventured down.
@tambarskelfir 2 years ago
It's a nice recap of the situation, but there's some context missing. At the time of the HD 7000 series, Rory Read decided that there was no future in discrete GPUs and that AMD wasn't going to compete in that market going forward. AMD under Read genuinely believed that the future was APUs, and the GCN architecture was perfect for APUs: good enough for graphics, also excellent for GPGPU. This perennial lack of a competing product from AMD in discrete GPUs was in no small way due to corporate policy, which didn't change until after Ryzen.
@beetheimmortal 2 years ago
That's absolutely insane. You just gotta love stupid decisions made by the leaders.
@Lyazhka 2 years ago
Nice and informative video! Looking forward to seeing FSR 2.0 at extra-low resolutions!
@danielmadstv 2 years ago
This was awesome and I really enjoyed the history. I got into gaming with the HD 6950 and honestly didn't know much about the exact history afterwards. This was super interesting and I hope you make more like this. I'm also eagerly awaiting your FSR 2.0 video! Can't wait to hear your take, that's the one I respect the most as you are my designated upscaling and AI expert. Thank you for your excellent work as always.
@ffwast 2 years ago
[gently pats old 7950] "Don't worry buddy, I still love you, a home theater PC is an important job you know"
@Lenk9 2 years ago
It was only one year ago that I upgraded from my Radeon 7800 to my current GTX 1080. You served me well, Radeon.
@bakuhost 2 years ago
What, the HD 7000 was 10 years ago? Can't believe it's that old.
@haven216 2 years ago
My first GPU that I ever owned was the HD 7730. For a little cheap GPU, it did quite well with the games I played at the time like Minecraft and Terraria.
@Venoxium 2 years ago
My first GPU was the Radeon 7790! Had an ASUS one, and I remember arguing with myself over whether to go with the cheaper 7770 GHz Edition and buy an FX-8320, or stick with the FX-6300 and go with the 7790.
@CompatibilityMadness 2 years ago
Great video on an interesting topic, but I have to ask: is the performance used in the table (@6:00) based on your own tests, time-of-release data/reviews, or even later tests (done near the end of its driver support, circa 2020)? Because this impacts GCN quite a lot (AMD drivers needed to mature before full performance could be seen). The same was true of Navi and RDNA cards (Polaris/Fury to a lesser extent, since they aren't as radical a change vs. the 7970). In GCN 1.0's case, for example, we only got good frame pacing (1% and 0.1% lows) after Nvidia showed its impact, and tools were released that actually measure it. Anyone remember that whole thing?

My $0.50 on this topic: multi-generation performance comparisons should always be made on a PC with the highest-IPC CPUs available (today that would be Ryzen 5000 series or Alder Lake for GCN/Maxwell 2.0 GPUs). There is simply no way to effectively/accurately measure the best cards with era-specific hardware (OC'ed or not).

Furthermore, GCN was made to be a more general-compute-focused architecture than TeraScale, as it was meant to rival or at least contest Nvidia's CUDA platform in supercomputers and other large-scale compute projects (which can bring in A LOT of money for more efficient compute designs). I think this is why its efficiency in "pure graphics" can be viewed as not as good as the previous generation's (the transistor budget was used to make it more versatile, instead of faster in a specific area). Simply put: it's not the GPU's fault it's not as efficient in pure graphics as previous generations; it's how the GPU market changed in the few years since TeraScale cards were released that made it more profitable to leave some "game performance" behind for a better overall product.
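For readers who haven't met the 1% and 0.1% lows mentioned above: they summarize frame pacing by looking only at the slowest frames. One common way to compute them (a minimal sketch; exact methodology varies between reviewers) is to average the slowest 1% or 0.1% of frame times from a log and convert back to FPS:

```python
# Illustrative 1% / 0.1% low FPS from a frame-time log (milliseconds).
# Reviewers differ on the exact method; this averages the slowest frames.

def low_fps(frame_times_ms: list[float], percent: float) -> float:
    """Average FPS over the slowest `percent` of frames."""
    slowest = sorted(frame_times_ms, reverse=True)  # worst frames first
    n = max(1, round(len(slowest) * percent / 100))
    avg_ms = sum(slowest[:n]) / n
    return 1000.0 / avg_ms

# Mostly 60 fps, with ten 30 fps stutter frames mixed in.
times = [16.7] * 990 + [33.3] * 10
print(f"average:  {1000.0 * len(times) / sum(times):.1f} fps")  # ~59.3
print(f"1% low:   {low_fps(times, 1.0):.1f} fps")               # ~30.0
print(f"0.1% low: {low_fps(times, 0.1):.1f} fps")               # ~30.0
```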
@Trovosity 2 years ago
Love the information, love the music
@Maupa. 2 years ago
Hi 2kliksphilip, do you decide what videos will be visible in the Subscription feed? Because I don't see this video on mine and that's really annoying.
@unyu-cyberstorm64 2 years ago
My ATI Radeon HD 5870 runs Dreadnought at 1080P 60 at medium settings. It’s very nice, and is DX11 compatible
@lmchucho 2 years ago
Philip, this coming generation is the third 7000 generation from ATI. The very first Radeon card was the Radeon 7200, from the year 2000.
@leonader9465 2 years ago
For some reason I was craving a 2kliksphilip video about hardware. lol Thanks.
@pauls4522 2 years ago
I have seen multiple YouTubers construct the same data and results about the 7k series, but I have to say your presentation was absolutely phenomenal. Fun fact though: the HD 7970 was actually released in late December 2011, in relatively limited supply. Not that the extra week really matters much in retrospect. The OG HD 7970 was originally beaten by the GTX 680, but with driver improvements over many years it actually beat the Nvidia card. There was also a less commonly talked about, super-overkill-for-the-time 6GB variant of the HD 7970. The large YouTubers, in the rare case it is mentioned, instantly dismiss it as not mattering because "DERP- the HD7970 is not powerful enough to fully utilize 6gb vram -DERP" without even doing a proper analysis of it, which I find to be a shame. I personally know quite a few games, such as Mirror's Edge Catalyst and others released around 2016/2017, which were bottlenecked by the 3GB VRAM buffer and could have still seen maximized textures at 60fps on the 6GB variant. I bought my HD 7970 when its price dropped to $300 and it came with 3 free games, in September of 2013. I used the card until February 2019, and the only reason I retired it then was because it was starting to show signs of dying, such as no longer being able to overclock, GPU driver crashes in 1-2 games, and occasional artifacts in some games that would go away with a restart. Most of the time the card still ran great, but I knew it was not sustainable, so I took advantage of the early 2019 mining crash to pick up a dirt cheap
@cavegamer5989 1 year ago
I had both a 6750 and a 6970 in 2015-17; they were very capable even then. Ran the new Battlefront from 2015 at over 60fps, so I was happy. Had a 7970 for a few days after that, then just traded it for a GTX 1050, and the experience was just so much better. Now I've come full circle and have an RX 6600 XT. Great card.
@mikehunt42069 2 years ago
Still sitting on an HD 7970 and 2500k. The GPU is starting to artifact; I even tried repasting and completely blowing out the dust, but to no avail. My guess is it's the memory modules dying. The CPU is a bit degraded too; it's not holding 4.5GHz like it used to. I'm still happy with the performance, but if AMD's upcoming RDNA 2 APUs can exceed my HD 7970's performance I'd be happy to upgrade to that platform.
@jorge69696 2 years ago
Same. Still on a 2500k with a 380 which performs basically the same as a 7970. Can't wait to upgrade.
@estevaoang 2 years ago
I fixed mine by holding an iron to the die for like 40 minutes lol, and also fixed an HD 7770 by just changing the BIOS. Maybe try it.
@JwFu 2 years ago
Since I moved out (more than a decade ago) and had to pay energy bills myself, I started to care more about efficiency. Good video; somehow it didn't pop up in my sub tab.
@vAlcatr4z 2 years ago
The HD 7870 was the first GPU I ever owned, and I can say that it's still the top card in my GPU list. It ran many games at near-max settings back then with no fps issues, until I started playing Rust, which months later killed the GPU (the image would freeze a few minutes after launching any game). What a great legacy.
@beachbum111111 2 years ago
The Radeon 7950 was the first PC gaming card I had. Previously I had a laptop with a Radeon 5650 that only lasted me 2 years, but that 7950 survived until this last year, when it seemed to crash for good, and man did it get me through a lot.
@AlleyKatPr0 2 years ago
You failed to mention pipelines, as we are now moving into the era of mesh shaders with DX12 Ultimate. The reason this is worth mentioning is that there is every chance that AMD's performance will be different.
@sadmanpranto9026 2 years ago
I don't understand this stuff... good different or bad different (AMD performance)?
@AlleyKatPr0 2 years ago
@@sadmanpranto9026 "Different" in that frames might be better or worse, or that the cards might not scrub the memory cache as fast as, or faster than, Nvidia.
@AlleyKatPr0 2 years ago
@@2kliksphilip I would research the DX12U capabilities of AMD and Nvidia to see which of them has an edge... maybe ask them questions about it... maybe ask Valve about it... it's your channel, man - but mesh shaders are new(ish) and here. The days of vertex shaders being the dominant force in game/realtime rendering are coming to an end... whoever leads this will lead your next Excel spreadsheet in Calibri 12 regular font when you do your analysis. AMD may (or may not) have a better future than Nvidia for ALL of your raised points on power usage and efficiency - purely because their nm scale and chip design might be better suited to DX12U usage than Nvidia's. Time will tell, but research the topics, make some predictions.
@AlleyKatPr0 2 years ago
@@2kliksphilip You made several historical comparisons of the changes that Nvidia and AMD have made to their graphics card designs and where they were focusing their efforts. It is a forward projection, as it is of interest to anyone looking at GPU designs - just as you were making predictions of the future and assumptions about what would have happened in the past if AMD had made different choices. This video of yours reads very much as if you are speaking specifically about the future and your hopes for what AMD might be doing regarding power and efficiency. Mesh shaders are the compute term AMD are knee-deep in - if they get this right, then Nvidia will be faced with a competitor which has a more efficient design. What you are speaking of is, in effect, 'yield': if the GPUs are designed well, they will offer more GPU power for less money, as they would be more efficient, giving a greater yield. As GPU 'power' is (basically) running an API and making very specific types of calculations involving real-time rendering in DirectX or Vulkan or whatever, the best GPU is the one which does those types of calculations better than the competitor. Mesh shaders are therefore the main focus, and the performance gain is quite significant. If I gave the impression you were making a video purely to cover the past and not in any way imply the future, then I apologise. microsoft.github.io/DirectX-Specs/d3d/MeshShader.html gpuopen.com/directx12-ultimate/
@luckythewolf2856 2 years ago
I’m pretty sure the 6970 was the real replacement for the 5870 because the 5970 was a dual gpu card and the 6990 was the successor to that. This means the naming was shifted up one tier of card.
@luckythewolf2856 2 years ago
@@2kliksphilip Ah, I see! Yes, I agree; hopefully the next generation can help us, although I'm not super hopeful about the power consumption being great.
@parkerlreed 2 years ago
I still have my RX 480 8 GB from 2016 and it's going great. For me that was the last of the great performers at a great price.
@PackardKotch 2 years ago
I know this is a Radeon 7000 series video, but damn does 6000 series hold a special place in my heart. First dedicated graphics (in AIO pc) and first dedicated graphics in desktop pc (6770m and 6950 1gb respectively)
@benedictjajo 2 years ago
Ah... I will never forget my 2GB 7870, which I used for 7 years. It still works, but it's getting its well-deserved rest in the storeroom.
@Jackikins 2 years ago
My first computer had a 7870, and the CPU was an FX-8350. Oh boy, did it run HOT. But it served me well until it eventually fell after my father tried to stick another one in there for Crossfire; the case didn't have enough airflow, and it basically died because both cards ended up running around 110C on Minecraft's title screen. That computer went through a lot and survived things it shouldn't have, like my dumb young mind blindly trying to plug in the power supply connector and taking a huge jolt of electricity, after which it kind of remained in a "zombie" state. It ran for around 3-4 more years, bluescreening, with 100% disk usage, ad nauseam. That computer moved to a GTX 970 sometime shortly after the shock and eventually fell to one last bluescreen around 2018, when I got another PC, self-built. The CPU is an 8700k, with 16GB of RAM (now upgraded to 64GB) and my beautiful baby child, the 1080 Ti that I got for around £550, just before they were discontinued. I'll never forget that first PC. I made a lot of mistakes, a lot I never repeated again with computers. It annoyed the hell out of me, but I still have a soft spot for it since it was my true introduction to PC gaming.
@benhinton5475 2 years ago
The 7000 series was pretty fantastic for compute when it came out
@SyphistPrime 2 years ago
My fondest memory of this era of GPUs was one not posting in my PC, so I had to go with a 6850. Makes me kinda sad, because a 7000 series card would have stayed usable for light Linux gaming for much longer than the 6850 lasted. It was night and day, though, going from a 6450 to a 6850. I still have both of those GPUs to this day. The 6850 is unused, but the 6450 is a great display adapter for my server PC, in case I need to hook up a display for console output.
@SweepiNetworks 2 years ago
Sadly, the HD 7970 was only on par with Nvidia's second-largest chip of the Kepler architecture (GK104), allowing Nvidia to sell the GK104 as a high-end tier card labeled GTX 680, and to use their most powerful Kepler chip (GK110) to establish a new "Enthusiast" tier with the GTX Titan.
@DogeCharger 2 years ago
Ahhhh, I remember my 7850. I had to turn on the second power supply with a paper clip because I was using a Lenovo prebuilt.
@J0elPeters 2 years ago
I still have my 7950 which I bought used in 2014. Such a fantastic card
@SaltyMaud 2 years ago
I've had good timing with my past GPU upgrades. 4890 in 2009, 7950 in 2013 and GTX1080 in 2016, they've all been great cards at a great time. Not so sure about picking up a RTX3070 in 2022, but since I managed to yoink one at MSRP, I just went with it, might not be the best GPU purchase I've made lately if RTX4000 is coming out soon and it's as crazy as it's said to be.
@MarcinSzklany 2 years ago
I love the hardware videos you do. Very insightful. Thanks!
@TiborSzarvas 2 years ago
I remember my first ATI card after a decade of nVidia reign: a Club3D X800 RX in 2005. Then in 2008, my last VGA card was a Sapphire HD 3850 with 256MB - those were the times! I was one of the last to change to PCI-E; around 2013-14 I got a Sapphire R7 250 1GB card, which got me through till the end of 2018, when I purchased an 8GB RX 580... I still have the RX 580; it has no problems running even the newest games in 1080p (on medium). I was today years old when I learned it consumes 330 watts, which is way too much for my 550W power supply. Thanks for another entertaining and informational video, Philip.
@Loundsify 2 years ago
330W is total system power; the RX 580 itself uses 185W.
@TiborSzarvas 2 years ago
@@Loundsify Well, thanks. So one still needs that much power in that "department" of the power supply. My previous supply was 550W also, but it distributed less power to the card, and the PC kept freezing/restarting while gaming. Changing to a better brand solved this, but I'm still thinking maybe I haven't got enough wattage for the card or some other component. Idk; anyway, thanks for the info.
@aqueousdog 2 years ago
Do you think you could do a video dedicated to the price creep of gpus? It's definitely something to see the price of high-end cards then vs now.
@marsaurelius 1 year ago
4:52 Oh boy, how we've come full circle. The RTX 4000 and RX 7000 series generation is more performance for A LOT more money, and the same performance for a little bit more money.
@FatheredPuma81 2 years ago
The most interesting thing about the 7000 series was that it started "AMD Fine Wine" and also ended it. Because it was re-released into oblivion, it got driver updates for a long time, but that also meant that when it lost driver support, everything else did too, regardless of its age. It's kind of Nvidia Fine Wine at this point...
@ocudagledam 2 years ago
My experience with the 7000 series was that I bought a 7950 for a song exactly three years after it launched, kept it for 4 years, during which it fairly happily chewed through what I cared to throw at it at high details at 1080p and even after that I was still able to resell it (for half of what I originally paid for it) as it still noticeably outperformed the RX560, which was AMD's entry level solution at the time. Regarding the 3GB, I can tell you that it made a heck of a difference in AC Unity, which, even at 1080p, struggled on the 2GB cards unless the textures were set to low, and, considering how long I'd kept it, I'd wager that it wasn't the only game where I benefitted from the extra gig. BTW, I previously owned a Radeon HD 6950 2GB for another four years (that is, more or less, since that card had launched) and while one generation up doesn't seem like much, and in theory there wasn't that much of a difference in raw power, in practice it was enough to give me another 4 years of moderate gaming, so I would call the 7000 series quite successful.
@livingthedream915 2 years ago
I find your comparisons to the Titan GPUs very difficult to stomach - almost no one had those GPUs to begin with, and the PC gaming community at large ignored those products entirely. Halo products at the far high end of product segments are usually overpriced to the point of absurdity by both GPU manufacturers (why buy a $500 7970 GHz when you can buy a $350 HD 7950 and OC it to match the $500 product?), and it was often the value proposition that caught gamers' interest.
@TrevorLentz 2 years ago
I remember grabbing a used MSI Twin Frozr 7950, upgrading my rig from a brand new 6850. My FX-8320 processor may have been a bottleneck, but it kept my main rig up to date enough to enjoy games at 1080p. I wish I had kept the eBay email receipts to help remember how much I spent back in those days. GCN 1.0 was actually really outstanding for used-parts folks at the time, and I never saw the 7950 as a disappointment. It was good enough for me (a dude who only wanted to play games on my budget FX-8320 build). Sure, AMD could have overclocked it and made it less power efficient, but the market was a whole other world back then. I'll admit I was, foolishly, a bit of an AMD fanboy back then, but I could still enjoy games without breaking the bank. Having a better job and more money now, I can afford to upgrade my PCs way more frequently. When I was younger and working a minimum wage job, the inferior products at a reasonable price were a perfect step into a better generation of PC hardware.
@Loundsify 2 years ago
I bought a 7870 XT (Tahiti), which was essentially a 7930, but AMD stupidly named it the 7870 XT. It was a cut-down 7950, but with 2GB VRAM. I thought it was a massive jump over the GTX 460 1GB I had.
@RyugaHidekiOrRyuzaki 2 years ago
This is going to seem weird, but YT didn't include this in my subscriptions page/feed. I had to manually look at your channel...
@mulymule12 2 years ago
Ooo, I had 2 7850s in Crossfire. Some games struggled, but otherwise they lasted well up to 2019 without issues.
@steel5897 2 years ago
Hot take: ETH mining probably kept AMD GPUs on the map. Nvidia has been better for gaming and especially encoding for a long, long time. But those RX 480 and RX 580 cards were a hot commodity for miners, insane mh/dollar and very power efficient.
@Fractal_32 2 years ago
Plus AMD actually had asynchronous compute, compared to Nvidia's 10 series, which didn't - or at least didn't benefit from it as much as AMD. I've also heard that Nvidia ripped out parts of their hardware scheduler on Pascal (or was it Maxwell?), although I couldn't find anything proving or disproving this claim. If this claim is correct, it would have made AMD cards better for game developers who want to use more advanced techniques for better images/better performance/*stability. *Stability as in not dropping frames/frame-time consistency when an explosion occurs, since a ton of particle effects have to be rendered.
@guimblon 2 years ago
Anyone else not getting notifications from ANY of Philip's channels?
@reviewforthetube6485 2 years ago
I'm a bit bummed to see they may not be taking that hardware acceleration route. But then again, if they could almost match it with just native gaming, that would be absolutely incredible. We see what they can do already, so I wouldn't put it past them to come out swinging lol. I just hope all of these companies' goal is to work towards the best native gaming performance. Imo hardware acceleration is great for when you need that extra bit, but it shouldn't be something we rely on. I went with Nvidia due to their hardware acceleration. Why? Because it's nice to have it lol.
@jelipebands1700 2 years ago
Got the MSI Twin Frozr HD 7950 at Fry's, and man, that card blew me away; Skyrim and Battlefield 4 never looked and ran so good. It lasted me a lot of years. I did upgrade to an RX 480 because they were so cheap at release. Then the second mining boom hit, I sold my RX 480 for more than double what I paid, and put my HD 7950 back into the system. I will never forget that card…
@priitmolder6475 2 years ago
I'm so old, the graphics cards I've had were: Nvidia MX200 with 32MB, GeForce 6200 with 128MB, Radeon 6770 with 512MB, 7870 with 1GB, R9 290X with 4GB. And I'm finally typing this comment on an R9 390X with 8GB.
@crylune 1 year ago
It would go against the nomenclature, but I hope AMD makes an RX 7970 or 7990 card just to honor this generation ;)
@h4m2_ 2 years ago
this video isn't showing up in my subscription page!
@FlappySock 2 years ago
Same here, really weird
@lukasg4807 2 years ago
2:03 thought you said it was 14nm and had to rewatch it a few times to hear 40nm lol.
@TheXev 2 years ago
You can point to two events when it comes to the efficiency of AMD graphics cards. One: David Wang leaves ATI/AMD - GPU efficiency goes to pot. Two: David Wang returns to AMD - RTG's GPU efficiency increases dramatically and becomes competitive again. RDNA 3 will be the first GPU he has fully had his hands in since his return to AMD, with a full 3 years of development. I expect great things out of RDNA 3.
@MandoMTL 2 years ago
NGL, I do miss the 3d/anime art on the shrouds.
@KaaptnIglo 2 years ago
There is definitely value in doing this :) was interesting and insightful indeed
@fups1 2 years ago
Fantastic video! I have a lot of great memories with my old 7950! If only I had held on to the millions of Dogecoin I mined with it a little while longer...
@dodger3294 2 years ago
Not a single one of your videos from any channel is showing up in my subs. What is going on? I have bell notifications on for every channel. I'm so mad.
@alexw1681 2 years ago
Heads up: for some reason this video hasn't appeared in either my notifications or my subscription feed, despite me being subscribed with the bell active. Just FYI.
@T1C 2 years ago
Oh wow, the future!
@Kato0909 2 years ago
This great video somehow did not show up in my subscriptions feed.
@Fezzy976 2 years ago
The HD 7000 series was amazing. You missed a lot of points here and only focused on the negatives. AMD bought ATI only a few years beforehand, mainly to focus on iGPUs for their upcoming APU series of products. Now factor in that Intel used horrendous marketing tactics (most of which AMD sued them for); Intel got the upper hand in the CPU market, and AMD had much smaller budgets to work with. Yet they still competed with the big boys in Intel and Nvidia.

The 7000 series was excellent in terms of raw performance, and its overclocking potential was insane. I had my 7970 GHz overclocked to 1.25GHz, and the memory was up at least 600MHz from what I can remember. This made it crush the 680, which was already pretty much at its max clocks and had zero room for overclocking. Then factor in double the framebuffer (1.5GB vs 3GB), and for games with huge texture mods this made the 7000 series almost double the speed of the 680 in games like Skyrim and Fallout 3.

AMD included a hardware scheduler within GCN, betting heavily that the APIs after DX11 would give more low-level access to the GPU, but Microsoft never included this. Nvidia spent millions creating a software scheduler in their driver to compensate, which is why Nvidia cards were always better in DX11 games. This is why AMD had to create their own API with low-level access, called Mantle, released for BF4 in a patch. Mantle was insane; the 0.1% and 1% lows made the 680 look pitiful in comparison. AMD gave their Mantle code away to the OpenGL team, who used it as a baseline for Vulkan, and also worked with Microsoft, and the code was included in DX12. This is why these cards started the whole "AMD Fine Wine" thing, where GCN cards would perform insanely well in newer games using the newer APIs: the APIs could finally take advantage of GCN's built-in hardware scheduler.

This was also the case with AMD's CPUs. AMD bet heavily on applications becoming more multithreaded, which is why they pushed for more cores and threads in CPUs such as Piledriver and Bulldozer. But this never came about, as Intel had such a stranglehold on the market, and we got 4-core, 8-thread CPUs for nearly a decade before AMD managed to create Zen and punished Intel for their complacency. Now AMD have more money coming in and bigger budgets to work with, and that is going to pay off big time, and not just in the CPU space.
@Orcawhale1 2 years ago
1. ATI was bought in 2006 and continued to operate in their own buildings, under their own name. What's more, AMD and Intel settled all of their disputes in 2009, so the notion that this somehow had any effect on the HD 7000 is just dumb.
2. Now you're just outright lying. The 7970 wasn't faster than the GTX 680, hence why the 7970 GHz Edition was released. What's more, the GTX 680 literally introduced GPU Boost, which essentially was auto OC. Not to mention the various extreme OC cards, like the 680 SOC or 680 Lightning.
3. Mods don't count for anything, and especially not in Bethesda games. That's a terrible argument.
4. Nope, not at all. You're confusing the Ashes of the Singularity and asynchronous compute debacle with Mantle.
5. Mantle wasn't "insane"; it wasn't widely supported, and it became irrelevant with DX12's release in 2015. Nor is it the reason for "fine wine"; you're lying again.
6. You can thank the PS4 and Xbox One for holding the PC market back. It has nothing to do with Intel or AMD.
TL;DR: None of your arguments work, or somehow make the HD 7000 amazing.
@Rmx2011 2 years ago
I rocked a 7950 for many years; it was the first GPU I bought with my own money. A good performer with no big issues, with 3GB of VRAM, no less. Also, it's really interesting to see now just how good Nvidia's 600 series architecture was; this truly was the moment when Nvidia took the lead and ran with it. It's just a shame that the 7000 series was set against Nvidia's 600-tier cards. I'm not sure the 7000 series could necessarily be called a failure; rather, the 600 series was just 'groundbreaking' at the time. Also, all this Titan talk being brought back up again makes me feel old, thanks.
@endersjehuty7721 2 years ago
My Asus HD 7950 DirectCU II is my savior; that 3 gigs of VRAM is the SHITSSS!!! I love it.
@jameslewis2635 2 years ago
I still have an HD 7870 from my old Windows 7 system, and it lasted me quite a long time. It was powerful enough to be competitive at the time against Nvidia's products. However, AMD's design team seemed to get totally lost when it came to making a follow-up series: everything that followed, up to the current 6000 series, trailed behind Nvidia's products, with price being the only place AMD could compete.
@morwar_ 2 years ago
Me and my brother had one of those. Good old times.
@jakubmastek5733 2 years ago
No notification for this despite having rung the bell.
@thomasbitler8798 2 years ago
Wasn't in my sub feed
@silvertree88 2 years ago
I've got an HD 6750 1GB card still going in my kids' PC; it still runs Minecraft and Lego games at 720p/60.
@divyjotsingh3879 2 years ago
Will never forget my laptop with the HD 7730M. Bad boy could run JC3 at around 35 fps 🤯
@winblasers2 2 years ago
I'm using a 6870 which is missing half of its fan blades; it makes my desk vibrate when on full load, but I can't afford an upgrade yet =D
@winblasers2 2 years ago
@@2kliksphilip Really not blown away by the quality
@Mr_Spade 2 years ago
Had two 7870 GHz Editions in my first ever custom build. Great cards, but my dumbass didn't realise how under-utilised Crossfire was by any software at the time.
@opoxious1592 2 years ago
I owned 2 types of the 7000 series: the 7850 and the 7990.
@LsAdventurer 1 year ago
I still have one of these GPUs... Though I think it's my CPU that makes my PC very slow.
@goblinphreak2132 2 years ago
I'm glad you didn't show the dishonest reviews from around that time, claiming a 7970 couldn't play Skyrim at 1080p ultra 4/4 (AA/AF) at 60fps... I was totally pissed, because I noticed their lies since I owned the card. Not only did I have a 7970, but my CPU was technically slower, as I was running an AMD CPU instead of Intel (still do). I tested the game at 1080p, ultra, 4/4, and sure enough I was pegged at 60 the whole game; it never dropped. Only when I changed AA/AF to 16x/8x respectively did I notice my fps dropping to 50, which matched the reviews. Turns out they were testing AMD on max settings and Nvidia on lower settings, to give Nvidia the edge in the review. I even took video proving their bullshit, but fanboys never listen. The 7000 series was god tier.

I loved my Asus 7970 blower edition (aka reference). After a year I had heating issues, removed the cooler, cleaned everything, and ended up applying Arctic Silver 5, the god-tier CPU thermal paste. Not only did my temps drop, they were even lower than with the stock paste, so I was able to run the card with a lower fan profile. The card is somewhere upstairs on a shelf.

AMD changes price-performance at specific times, like when we had the 390X and then later the RX 480, which was basically the same performance as the 390X for half the price. The 390X launched at $429, while the RX 480 launched for $200 - $229 less for the same performance, which was a huge boon for those who couldn't afford the higher-tier stuff. This is how things go: AMD always does this. Most generations are faster performance for either the same price or more, and then out of nowhere one generation will offer last-gen performance for significantly less. I could go back further into detail but I'm too lazy; it's happened more than once.
@realmrjangoon 2 years ago
Is that digadig?
@tuff_lover 2 years ago
Proud 7870 Ghz Edition user, still a very capable & cool card.
@ros8137 2 years ago
I didn't get this video in my subscriptions, odd.
@Belgarion115 2 years ago
Yes! I *AM* still running a 7970! And, if not for Half-Life: Alyx, I would have no reason to upgrade.
@JohnBarchard 2 years ago
Btw this video didn't show for me in my subscriptions or notifications (I'm subscribed and have the bell icon set to all). I only found it through my home page recommended
@bagmilker1558 2 years ago
I had one, I think from the 6xxx series, with 512MB, that could run games that required 2GB GPUs. Sure, with a little bit of burning, but it did.
@TRHardware 2 years ago
Very entertaining video!
@wertywerrtyson5529 2 years ago
The 270X, which was a renamed 7870, was the last time I bought a new GPU under 200 bucks that was fast enough for anything at the time. I got it because of Tomb Raider 2013, and it ran the game great, even with TressFX. I don't think that will ever happen again, I'm afraid. $150-250 used to be mid-range; now $400-500, which used to be high end, is the new mid-range. I think the 7000 series were great, but mostly once the prices had come down.