
Did Shader Model 3.0 Help the 6800 Ultra? 

PixelPipes · 20K subscribers
22K views

Published: 31 Oct 2024

Comments: 122
@loganiushere · 4 years ago
It reminds me of the Voodoo cards: one card has truly awe-inspiring performance but skimps on features, whereas its competitors have less powerful but more fully featured cards, and those cards were, in the end, more future-proof.
@wrmusic8736 · 8 months ago
Voodoo GPUs got hopelessly outdated two years into their lifetime. Granted, by 1998 standards two years were a technological eternity, but nowadays replacing a video card every year seems insane.
@hblankpc · 6 years ago
I've always wondered about the longevity SM3 gave the GF6/7, and this tackles it perfectly. I've got a lot to learn from you :)
@PixelPipes · 6 years ago
Hey Obsoletist! Welcome to the channel!
@raresmacovei8382 · 5 years ago
Basically, having a higher-end GeForce 6 or 7 allowed you to play almost all games released during the X360/PS3 era, while the ATI X800 series got axed past 2007, when games started requiring Pixel Shader 3.0.
@Shuttersound1 · 6 years ago
Such nostalgia! I'm so glad I found your channel. It makes me miss the good old days of playing games like Far Cry, Doom and HL2 on my Athlon XP 2800+ PC with an AGP 6600 GT. PC gaming hasn't really felt as special to me since then, so being able to relive those times through your videos is ace. :) Also, I remember an issue with Halo where cards like my 6600 with SM3 wouldn't display pixel shader effects like bump mapping and specular, but an X800 I picked up for a later system did. Maybe there was only support up to SM2 in Halo, since it was made with the SM1.1 GeForce 3 GPU in the Xbox? Or maybe I just had some weird driver issue at the time. Also (if you're still reading this), maybe you should make a video about the arrival of the 8800 GTX and stream processors? I remember how much of a big deal it was at the time, and when I bought my 8800 GTX it was like a night-and-day difference compared to what I'd seen from traditional pipe-based GPUs. One of the biggest differences was in Oblivion, where most cards before would have a massive disparity between indoor and outdoor framerates, but the 8800 GTX had the raw power to stabilise this. :)
@PixelPipes · 6 years ago
Glad I could help you relive the good ol' days! The 8800GTX is definitely a monumental moment in graphics card history and deserves some focus.
@HappyBeezerStudios · 6 years ago
Those were the machines I dreamt of back then :D
@levimaynard2237 · 2 years ago
Definitely the good old days...
@MFG9000 · 5 years ago
Some ATi owner disliked this video. I love watching your content, sometimes over and over again, for nostalgia and for its accurate information. Like this one, which reminded me of that Far Cry console variable that enabled HDR lighting.
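[For reference, this is the sort of tweak being described. HDR in Far Cry arrived with the 1.3 patch and worked only on SM3.0 cards, toggled from the in-game console. A sketch from memory, so the exact cvar name and accepted values may vary by patch version:

    r_HDRRendering 7    (enables HDR lighting; nonzero values select quality levels)
    r_HDRRendering 0    (returns to the standard render path)
]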
@PixelPipes · 5 years ago
Makes me proud to hear that. Thank you!
@vladmihai306 · 4 years ago
Not some, only one user. The others agree.
@SteelSkin667 · 7 years ago
I owned a X850XT at the time. It was infuriating to see all those games being purely and simply incompatible with an otherwise really fast card. The worst thing was that I was unable to upgrade for a while, meaning that I kept that useless thing until 2008, which is when I got a then brand new 8800 GT.
@PixelPipes · 7 years ago
@SteelSkin667 Ouch! Thank you for sharing!
@HappyBeezerStudios · 6 years ago
The X850XT was still a nice card, but going to an 8800 GT is a fine choice. Just hoping for you that it wasn't one of those awful single-slot reference designs, hot enough to cook on.
@SteelSkin667 · 6 years ago
@HappyBeezerStudios As a matter of fact, it was one of the single-slot reference models. Honestly, I didn't know better at the time, but fortunately it didn't give me any issues whatsoever. I have no idea how hot it got, but I do remember that it wasn't as noisy as the X850XT.
@HappyBeezerStudios · 6 years ago
Same goes for Half-Life 2 and the other Source engine games. I even read that some of them offer a DX7 render path. Oh, the dreadful single-slot 8800 GT. Had one back in the day, and it really started to shine after I modded the cooler with some case fans.
@GraveUypo · 6 years ago
Huh, I just posted something from the opposite perspective. Seems the 6800 series really was the right choice. Here's a copy-paste of my post: well, since I kept my 6800GS until early 2008, it paid off. When my first Xbox 360 red-ringed I was devastated, but since there were PC versions of Gears of War and Test Drive Unlimited (my favorite games on the Xbox 360 at the time), that kept me going for the year it took me to buy a new 360 (and an 8800GT), even though the 6800GS ran those like crap. At least it did run them. That said, I did run into that same issue with the GeForce 4. I hated seeing people with the older, slower 8500 Pro playing Battlefield 2 when I couldn't with my shiny, super-fast Ti 4400. Then the 9500 Pro came along and I realized the GF4 was not only incompatible with the latest games, it was also MUCH slower than its direct competition. Ugh, I hated that card. And it was the most expensive card I've ever bought (corrected for inflation and the exchange rate between the real and the dollar).
@fabiolorefice1895 · 7 years ago
Cool throwback. I remember the discussion, although I didn't really care at that moment as I was still rocking a GeForce4 Ti 4200. I did buy a 6600 GT (AGP) later in 2004, though. Keep your content coming, BTW! It is always nice to watch deep dives into this retro stuff.
@synixx9286 · 6 years ago
I remember being 12 years old in 2005 and buying Black & White 2, only to be confronted with the error "This game requires pixel shader 1.1 to run, please upgrade". The sales rep at PC World had told us our new Celeron D PC could play new games, but that was clearly a load of crap. So I went on eBay and bought the cheapest graphics card I could see, which was the Radeon 9250 256MB. The game now ran! I was playing Oblivion at the time, and even to my 12-year-old self it felt pretty choppy, so I eventually got a Radeon X1650 Pro 512MB and upgraded to 1GB of DDR RAM. Both games looked amazing with SM3 enabled, and I knew I could do better. So the next year, with Crysis on the horizon, I saved up my paper-round money over the entire year and built myself a PC with an AMD Athlon X2 4200+, an X1950 Pro and 2GB of RAM, as well as a kick-ass 22" 1680x1050 widescreen! It was pretty awesome, but when I eventually got my hands on an 8800 GT in late 2008 I was blown away. That card was phenomenal. I feel as though I missed out on the pre-2005 generation of graphics cards, but my early experiences with SM3 and the struggles of peasant gaming on a Celeron D continue to humble me to this day!
@PixelPipes · 6 years ago
Great story! Thank you!
@serenameep8565 · 5 years ago
Nearly the same story here. I had to get the MSI 9250, as the integrated graphics on my Aldi Medion Sempron system couldn't even play C&C or The Sims well enough! I went SLI 7600 GT to play Oblivion, and like you, when I got my first 8800 GT I was blown away by how everything became playable at high frame rates and details.
@SirDimpls · 3 years ago
I have a cheap old phone I bought as a backup, a Sony E4g, and it's crappy. It has a MediaTek 4-core chip with a low-end Mali-T760 MP2 GPU, which has a theoretical compute power of only 48 GFLOPS yet can do DX11 and SM5.0, while the GeForce 6800 Ultra discussed in this video has 54 GFLOPS. That realisation blows my mind.
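[As a back-of-the-envelope sketch of where such GFLOPS figures come from: theoretical throughput is just unit count times operations per unit per clock times clock rate. The per-pipe numbers below are illustrative assumptions, not verified specs; vendors count ALU operations differently, which is why quoted figures vary.

    # Theoretical throughput = units x FLOPs per unit per clock x clock rate.
    def gflops(units, flops_per_unit_per_clock, clock_mhz):
        return units * flops_per_unit_per_clock * clock_mhz / 1000.0

    # Assumed: 16 pixel pipes, ~8 FLOPs/pipe/clock (a 4-wide multiply-add), 400 MHz.
    print(gflops(16, 8, 400))  # ~51 GFLOPS, in the ballpark of the quoted 54
]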
@sinizzl · 6 years ago
I upgraded from a GeForce FX 5700 to an X1950XT in 2007... I did nothing but play Oblivion until the end of that year. Good times!
@justiny.1773 · 5 years ago
I had a 2900 XT 1GB in 2007. Good times.
@playingwithtrainerspcmt6407 · 4 years ago
Brings me back... at that time I think I only had the ATI X700. Great vids!
@retropcscotland4645 · 7 years ago
Reminds me of the old ATi Rage Pro chip. It had the CIF 3D API that wasn't widely used. The only company that released a working CIF 3D patch was Eidos, for Tomb Raider Gold. That patch works very nicely with the old ATi Rage Pro and a proper era CIF-enabled driver.
@HappyBeezerStudios · 6 years ago
The GF6 cards were fine; same goes for the GF7 series. That was a phase when both companies had reasonable cards to choose from after the disaster that was the GF FX. The GF8, on the other hand, changed a lot, with an insane performance gap and the 8800 GT being the "minimum requirement" for many games even almost a decade later.
@PixelPipes · 6 years ago
The 8800GT was a truly legendary card, almost as much as the 8800GTX itself.
@likeclockwork6473 · 6 years ago
I'd say the Radeon HD 7870 should take the 8800's legend status. I've never seen an old rebranded bargain GPU so thoroughly destroy a previous generation in sheer compatibility and general performance since the 8800. The HD 7870 was less than $300 when Nvidia's 6xx line released shortly after, and it remained well priced until now, actually. Sure, it's not ideal, but when you consider these GPUs can play everything the 8800s did and nearly everything today, it's hard not to build the legend up for them. Better than the HD 7970 in terms of market longevity. Look at the HD 7870 from this perspective: the R9 270 was selling for maybe $20 more than what the better versions of the GT 1030 sell for now, and that was years ago. The 1050 is its equivalent, and it's more expensive than the R9 270s were. What other GPU product have these manufacturers made that has remained competitive six solid years after launch?
@ChannelSho · 4 years ago
@likeclockwork6473 A little late to this party, but what made the 8800 GT so legendary was that it disrupted the then-current market very hard. You could basically get then-flagship performance for half the cost or less, and it had fewer requirements overall (such as a single-slot cooler and lower power draw). I don't think anything has really come close to that.
@tylermartin7245 · 2 years ago
I miss this
@wrmusic8736 · 8 months ago
Early-adopter hardware is seldom actually fit to handle the new feature set. For every exception like the Radeon 9700 or GeForce 8800 there are the GeForce 3, GeForce FX, Radeon HD 2000, Radeon HD 5000 and GeForce RTX 20, which couldn't handle their new stuff. In fact, since the GeForce 8800 we haven't had a single GPU feature pioneer that could reliably deal with its cool new thing, and that was 18 years ago.
@Carstuff111 · 4 years ago
When it comes to PC gaming I am, I admit, a bit behind the curve; most of the games I tend to play are not the latest and greatest. I had a Radeon X800 Pro that my roommate had bought brand new, and on day one he installed his own cooler (an AMD Athlon 64 X2 cooler he modded to fit) because he saw the card hit 80 degrees C almost instantly in a game. Once he put the modded cooler on, at full load and overclocked, it barely got 5 degrees C over ambient. And when I got that card, I ran it for a good, long time. Most of the games I played back then needed SM 2.0 or older, and the X800 was a huge upgrade over the Radeon 9800 Pro I had before it (also modded and HEAVILY overclocked) for a long while. After the X800 Pro paired with a dual-core Athlon 64, I upgraded power supplies and ran a factory-overclocked X1950 Pro for a little while, before jumping to a quad-core Athlon II with a Radeon 7770 (my first and only new video card ever); that same machine was later upgraded to a Phenom II X6 1045T and a Radeon 7850, both heavily overclocked. I have to say I have a love for ATi/AMD cards, to the point that my current rig has the first Nvidia card I have owned since a GeForce4 Ti 4200 that was BIOS-modded to a 4500se. At the time, the heavily overclocked GTX 1070 I am now running was the best bang-for-the-buck card to pair with my new-to-me AMD Ryzen 5 1600X.
@GraveUypo · 6 years ago
Well, since I kept my 6800GS until early 2008, it paid off. When my first Xbox 360 red-ringed I was devastated, but since there were PC versions of Gears of War and Test Drive Unlimited (my favorite games on the Xbox 360 at the time), that kept me going for the year it took me to buy a new 360 (and an 8800GT), even though the 6800GS ran those like crap. At least it did run them.
@PixelPipes · 6 years ago
The 6800GS was a great card! I v-modded and overclocked the crap out of mine!
@auroraattardcoleiro1455 · 6 years ago
Awesome channel :) Just subscribed. I hope that you grow as much as RetroGamingHD and Budget Builds!
@trajanaugustus8783 · 6 years ago
I had a PNY 6800GS with the pixel and vertex pipelines unlocked with RivaTuner; it ran on par with the 6800GT for less money. Great video, BTW.
@JamesSmith-sw3nk · 6 years ago
There were games released in 2010-2013 that still had DirectX 9 support, like Crysis 2, Far Cry 3 and Metro: Last Light. I suspect a lot of people used their 6800s for a long time. I remember having an AGP 6800GS that I unlocked into a 6800 Ultra.
@Knaeckebrotsaege · 6 years ago
I used a heavily modified and overclocked AGP 6800GT till I replaced the whole system in mid-2008 with a Core 2 Duo and an 8800GTS 320MB. I kept the old rig as a second PC and occasionally tried games on it. I remember being surprised at the 6800GT being able to run games like Race Driver: GRID no problem on pretty high settings.
@soylentgreenb · 3 years ago
I played Oblivion on a 9800 non-Pro at 800x600. Pretty sure I disabled HDR to get better framerates. At least I didn't have to use the "Oldblivion" mod like GeForce FX 5900 owners :P
@KinoKonformist · 7 months ago
I think back in 2004-2008 most people played at lower resolutions without problems, because most didn't have LCD displays yet. So playing at 1024x768 or even 800x600 was OK.
@lucskyes2748 · 6 years ago
Amazing work, bud. Hugs from Brazil!!
@rijatru · 5 years ago
Seeing the backlash against ray tracing brought me to this video. It costs half your performance on current high-end GPUs now, but it will be standard in most games in a couple of years.
@tHeWasTeDYouTh · 4 years ago
"Let's lay it out and not mince words. XFX's GeForce 6800 Ultra 512MB graphics card, priced at around £470 or so, doesn't represent decent value to the gamer right now." Back in the day when I read that, I made an account on the forums just to troll the entire site... lol, first time I did that. I loved my 6800 Ultra.
@likeclockwork6473 · 6 years ago
By the end of 2007, ATI users wanting to upgrade could have picked up an HD 3850 for only $180, and that supported Shader Model 4.0. Kind of silly to treat compatibility with games three years later as anything that special at the time.
@soylentgreenb · 4 years ago
Historically, the introduction of new features is not for the consumer initially, but for the developer. If the 6800 hadn't had SM3.0, consumers wouldn't have had games with good SM3.0 support by the 8000 series. Not just because developers wouldn't have had cards to develop on, but because there needs to be wide support before developers even bother. The GeForce 256 and Radeon had per-pixel fixed-function lighting; with something like six passes it could run Doom 3, though not playably, of course. T&L would eventually be assumed and polycounts could be made much higher, but not before cards without hardware T&L became uncommon. The DirectX 8 shaders on the Radeon 8500 and GeForce 3 could have been used to make the water in Half-Life 2 and other neat features, but instead they were used to make player models and levels look like they were smeared in vaseline and to make water look like liquid mercury. It wasn't until much faster DX9 cards that I saw them put to good use. This pattern has repeated over and over, where early adopters don't get much benefit before the card is already nearing obsolescence. As an exercise for the reader: where will the 2080 Ti be when hardware ray tracing is commonly used in games as a set-it-and-forget-it feature? (I never realized Oblivion looked so much less terrible with HDR off.)
@evolucion888 · 3 years ago
The issue was that dynamic branching over pixel batches smaller than about 64 pixels caused severe performance degradation on the 6000 series. On top of that, the issue with HDR and AA not working together came down to the ROPs doing the FP filtering for HDR, so they could do one thing or the other. AMD had much better shading performance on the X1K series, where the shaders handled the FP resolve for HDR while the ROPs did their magic for MSAA.
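[To illustrate the batch-granularity point in general terms: SIMD-style GPUs shade pixels in fixed-size batches, and any batch that contains both sides of a branch pays the cost of both code paths for every pixel in it, so coarser batches waste more work. A toy model, not NV40's actual scheduler; batch sizes and costs are made-up parameters:

    import random

    # Toy model: each pixel takes path 'a' or 'b'. Pixels are shaded in
    # fixed-size batches; a batch containing both outcomes executes BOTH
    # paths for every pixel in it.
    def shading_cost(pixels, batch_size, path_cost=10):
        total = 0
        for i in range(0, len(pixels), batch_size):
            group = pixels[i:i + batch_size]
            paths_taken = len(set(group))      # 1 if coherent, 2 if divergent
            total += len(group) * path_cost * paths_taken
        return total

    pixels = [random.choice('ab') for _ in range(4096)]
    for batch_size in (4096, 1024, 64, 4):     # coarse to fine granularity
        print(batch_size, shading_cost(pixels, batch_size))

With random branching, the coarse batches pay for both paths almost everywhere, while fine-grained batches skip much of the work, which is the advantage being attributed to the X1K's smaller batches.]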
@ofoosy · 6 years ago
You have to do a video on that 6800GS!
@foch3 · 3 years ago
That's what I'm talking about.
@DanielGT_93 · 1 year ago
I had an E2160 overclocked to 3 GHz with a PCI-e X800 XT at the time of the 8800 GT launch. Still played a lot of games until I got a used 8800 GTS 512 in 2010.
@vansyly9794 · 4 years ago
I upgraded from a GeForce4 Ti 4400 to a 6800GT and it was a great experience. A huge performance upgrade, with actual SM 3.0. Far Cry and SC: Chaos Theory gaming was great.
@Mantis4 · 6 years ago
Yet another great quality video, keep 'em coming :3
@Silikone · 6 years ago
ATI's SM3 inclusion was perfectly timed. I played Mass Effect 2, a game from 2010, on an X1950 Pro with satisfactory performance. Seems like the 6 series fell a tad short of the demands, though I'd love to be shown that this isn't the case.
@CompatibilityMadness · 6 years ago
The X1950 Pro has 36 pixel shaders; that's 12 more than the 7900 GTX :D Basically, any game that leans on pixel shader code will fly on them. Also, they support Oblivion's AA + HDR mode without problems :) However, you forgot about the X1800 series, which were the actual first ATI SM3.0 cards ;)
@GraveUypo · 6 years ago
No, it wasn't perfectly timed, it was late. If I had gone with an ATI at the time, I'd be as mad as I was with my GF4 Ti. Never get feature-lacking cards unless you have a short upgrade roadmap.
@DyoKasparov · 3 years ago
Man, I remember when Pixel Shader 3 was new and bloom was hip.
@hulkaman1a · 6 years ago
Great video, really enjoyed it. A++!! One observation I feel you left out (sorry if I missed it): in 2004 the 6800 could do SLI. This put Nvidia ahead of ATi in terms of performance, albeit at a heavy cost. An early adopter of 6800 Ultras in SLI would have benefited from more playable frame rates and higher resolutions than what was shown in your video. All of that aside, I remember buying a factory-overclocked BFG 6800 GT in 2004 for two games: Doom 3 and EverQuest 2.
@PixelPipes · 6 years ago
Crossfire definitely started in 2004 as well, albeit requiring an external dongle and "Master/Slave" cards. But I did not mention this, so it's a good point to bring up.
@hulkaman1a · 6 years ago
Crossfire was September of 2005. By then Nvidia had regained the performance crown with the 7800 GTX.
@PixelPipes · 6 years ago
Ah you're right! My memory fudged that one!
@hulkaman1a · 6 years ago
An honest mistake! I'm a retro builder too, and sometimes I get my facts mixed up. I have dedicated 2004 and 2005 machines, that I've worked on lately, so it's all fresh.
@Xbfg · 7 years ago
Keep up the fantastic content. I still have my Leadtek 6800GT, lol, it's in my collection.
@cyphaborg6598 · 9 months ago
Oh yes, Splinter Cell Chaos Theory. They couldn't be bothered with SM2.0 shader support, but added that POS StarForce.
@wishusknight3009 · 4 years ago
I was still rocking my 9700 Pro AIW at that time. I held onto it until replacing it with a 7950GT, perhaps the longest I've ever kept a video card. I only did that because I was able to use an ASRock 939Dual-SATA2 board, which could facilitate the upgrade path; the video card was the last piece. Though I was not exactly playing all the AAA titles as they came out, I generally waited for sales, and since then I have stayed a bit behind the curve in graphics performance.
@robinenbernhard · 5 years ago
Loved my SLI 6800 Ultras in BF1942 Desert Combat, with two HDDs in RAID 0, back in the day. I was the first one in the game, so I could get every vehicle.
@jetbrucelee · 4 years ago
OMG yes, SM3.0 was a big deal. I bought a laptop in 2005 with an ATI X700 and my buddy got a laptop with a 6600GT. We played Age of Empires 3 and I was so pissed I couldn't play it with SM3.0 on. It really made it look so good.
@inwork1 · 5 years ago
There was a community patch for BioShock that converted it to Shader Model 2.0b; it would be interesting to compare them.
@PixelPipes · 5 years ago
I investigated that and it's extremely glitchy. It was never completed and many parts of the game are incorrectly rendered (or not rendered at all).
@jakedill1304 · 1 year ago
2:15 Got to rebind Q and E from the lean buttons, LOL. C and V work way better, and that way you don't get stuck trying to lean your way into interactables. I have a sneaking suspicion that, aside from the 20-year time warp of the dual-analog controller, Q and E being the default for lean is why we don't see leaning anywhere near as much as we should in modern games.
@sonicblast19 · 6 years ago
I remember there was a hack for BioShock that made it run on Shader Model 2. I played it on my Radeon 9600.
@levimaynard2237 · 2 years ago
Some Shader Model 2.0 games look cleaner, in my opinion. Maybe it's the engines that were used, but as an example, the Source engine with the Half-Life 2 series, or Unreal Tournament 2004, have "less detail" in some areas, yet the overall presentation seems cleaner and subjectively prettier. Unreal Engine 3.0 games, by comparison, have that plastic look and it's a lot dirtier. It's hard to explain. As far as HDR is concerned, I think Day of Defeat: Source has some of the best implementations of it (also Shader Model 2.0).
@Synthematix · 4 years ago
I'd take the 6800GT any day. I'm using the Gainward Golden Sample 6800GT in my Win98 machine, and it's bloody awesome. The 6800 supported DirectX 9.0c; the ATI cards did not.
@LastOneLeft99 · 4 years ago
I went from a 6800 GT to an 8800 GTS just for BioShock.
@gregoryberrycone · 4 years ago
Chaos Theory wasn't a 360 title.
@cyphaborg6598 · 4 years ago
Huh!! That's odd. I know it was on the PS3 because I have all three (the trilogy), and I figured it would also be on the 360; I can't find a reason why it wasn't. (I knew it was already out before the 7th generation of consoles started.) Perhaps due to backwards compatibility? At some point the PS3 didn't have it. You can buy the Xbox version for the 360 via the marketplace (since mid-2019). *shrugs* That latter part doesn't make much sense, lol, but... oh right, Xbox One X enhanced.
@ching-chenhuang8119 · 4 years ago
Well, it's sort of too late; back in 2007 I changed my graphics card from a 6800GT to an 8800GT, and I played BioShock on my Xbox 360...
@Aranimda · 5 years ago
I still play DirectX 9.0c titles on a daily basis: Guild Wars and Guild Wars 2. Though the second one would not run at a proper framerate on a 6800 Ultra.
@BlackDragon-xn2ww · 6 years ago
This goes in the opposite direction, but I like to take an old card and see what a game looks like in software mode, or any mode really. It is surprising: some of the interesting color schemes give an almost rustic look to some games, lol.
@tHeWasTeDYouTh · 4 years ago
5:32 No joke, even in 2006 when Oblivion came out I thought the game looked like garbage. The art style looked really lame compared to other games, and the graphics really didn't sell me the game. In my mind I figured that since the game is so big, you have to have mediocre graphics and visuals to make it work.
@mauriciochacon · 4 years ago
I had the ATI X800 Pro and I could play all the next-gen games at low res. It sucked, but I still could, lol: Crysis 1, Tomb Raider: Legend (even with forced DX9.0c stuff), CoD4, GoW, UT3. BTW, you missed Tomb Raider: Legend with its Shader Model 3 settings.
@mr.hairyface8158 · 5 years ago
Nice work! :)
@XzPERFECTIONzX · 3 years ago
I got my 6800 Ultra today for 75 Euros! :)
@FatheredPuma81 · 4 years ago
Not sure why he didn't run Skyrim on this...
@Devilot91 · 6 years ago
Oh, the good old times. I was so upset that my 9800XT and then my X800 XT could not run any game requiring Shader Model 3.0. ATI was smart to focus on performance over technology, making really fast cards but with the compromise of the older Shader Model 2.0b (a slightly upgraded version of "regular" 2.0). Vice versa, nVidia pushed future technology with Shader Model 3.0, even though it was largely useless at the time, as explained in this video. For this I hated nVidia for years, over those damn 3.0 shaders. Good old times :D
@christiangarcia2533 · 6 years ago
Hey, you should do a video on an AMIC A276308A. On the back of the graphics adapter was this: 13k8214698. I found it in my mom's old computer. Do you think you could do a video on it, please?
@Dysphoricsmile · 6 years ago
I upgraded FROM an ATI X850 Platinum TO a 9800 GT, because SM 3.0+! And from the 9800 to a GTX 460, to a 560 Ti that I got free, to a GTX 770, and now to a Gigabyte G1 Gaming GTX 980 Ti! And BEFORE the X850 I had an FX 5600, and before that I owned THE GeForce4 Ti 4600! That is the GPU history of my PERSONAL rigs!
@jakedill1304 · 1 year ago
Weird, I was able to run BioShock way better on a 6800 Ultra, with nowhere near as many settings downgraded, for whatever reason. I had to do a lot of .ini tweaks and run a separate program to fix the FOV, but it wasn't that big a deal to get it playable. If only it were also a good game. Why did they have to release SWAT 4 right before this? Just to let us know what we could have had if this were a PC game?
@FatheredPuma81 · 4 years ago
"If you can stomach dropping down to 800x600"? M8, I played Team Fortress 2 at 640x480. It wasn't actually that bad, but TF2 was built to be playable at stupidly low resolutions, so...
@CamaroWarrior · 3 years ago
Did you make a video on the Shader Model 4.0 features, or even 5.0/5.1? I remember running BioShock back when it was new on a Shader Model 2.0 Radeon X850 XT 256MB card with a patch. It ran well and looked nice, but not as good as on an SM3 card.
@TimTaylor99 · 4 years ago
... and don't forget AMD being the first with real DX10.1 support 🧐😅🙏
@dabombinablemi6188 · 6 years ago
SM 3.0 really didn't help the lower-end models, such as the 6200A. It was simply too weak to see any benefit, and indeed was best suited to DirectX 8.1 and older titles, though it was still far better than the FX 5500/5200 Ultra despite its DDR2-266 being limited to a 64-bit bus.
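[For a sense of how constraining that bus is: peak memory bandwidth is just bus width times effective transfer rate. A minimal sketch; the 6800 Ultra line uses assumed reference specs (256-bit bus, 1100 MT/s GDDR3):

    # Peak bandwidth in GB/s = (bus width in bytes) x (mega-transfers/s) / 1000
    def bandwidth_gbs(bus_bits, mtps):
        return bus_bits / 8 * mtps / 1000.0

    print(bandwidth_gbs(64, 266))    # ~2.1 GB/s for the 6200A described above
    print(bandwidth_gbs(256, 1100))  # ~35 GB/s for a 6800 Ultra (assumed specs)
]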
@raresmacovei8382 · 2 years ago
A friend was happy with his GeForce 6200 due to being able to play Crysis and at least enable physics by going to the Medium preset (just for physics and nothing else). It was a bit ridiculous that you needed a Pixel Shader 3.0 card for anything more than the Low preset (including actually getting destruction physics), but here we are.
@dyslectische · 1 year ago
I had a 6800 LE, one you could unlock to a 6800 GS, but with the 6800 LE memory chips. A good boost, but really you could just go for a 6600 GT at the time. The funny thing is that there was a Gigabyte card with two 6600 GTs in SLI on one board, but it only worked on Gigabyte motherboards. Completely stupid that no unlocked BIOS was ever made for that card so you could use it on other boards.
@loganiushere · 1 year ago
40 FPS = Barely playable?!
@antwanarmstrong5987 · 7 years ago
Strange, my copy of Chaos Theory supports SM2.0.
@PixelPipes · 7 years ago
Oh hey you're right! A later patch added support for Shader Model 2.0! Apparently ATI worked with them to get it implemented. Interesting! www.bit-tech.net/reviews/gaming/pc/scct_sm20_sm30/1/
@DarkDrakee · 5 years ago
But there was also software to make SM3 games run on SM2 hardware.
@razvanfodor5653 · 6 years ago
ATI made a huge misstep, in my opinion, with the missing SM 3.0 support. They had had the edge in performance since the 9700 and banked on that alone. Even at that time Nvidia was making shady moves, so it was a shame ATI couldn't establish itself better.
@ricardobarros1090 · 1 year ago
Good and thank you. Jesus bless you
@thelasthallow · 6 years ago
So do a video on the 7800GT and compare it to the 6800 Ultra?
@CompatibilityMadness · 6 years ago
The 7800 GT will rape the 6800 Ultra (in everything). I would argue that the 6800 Ultra isn't that great an SM3.0 GPU, because most games with native support for it are optimised for the GeForce 7000 series of cards.
@thelasthallow · 6 years ago
Who gives a shit? That's literally the point of this channel, to bench old cards. I don't give two fucks if a 7800GT will "rape" a 6800, I just want to see what the performance increases would be.
@CompatibilityMadness · 6 years ago
OK, check this: forum.pclab.pl/topic/1160166-AGP-Aftermath/page__st__20__p__14279091&#entry14279091 Graphs with results are below the text; I hope you don't mind having AGP instead of PCI-e versions. 7900 GS (20PS/7VS) vs. 6800 Ultra (16PS/6VS).
@thelasthallow · 1 year ago
@CompatibilityMadness Five years late to reply, I know. OK, thanks for the links, I just checked them out. I'm building a retro PC and got a 6800GT, but I also bought two 7950GT 512MB cards for like $25 apiece on eBay. I also got a couple of 8800GTXs for super cheap. Gonna do SLI with one set of them, not sure which yet.
@sniglom · 6 years ago
Personally, I always disabled HDR in Oblivion. I thought it looked bad and wasn't worth the performance hit.
@vdrand9893 · 6 years ago
You can unlock a 6800 Ultra into a Quadro with software.
@hate_mate7054 · 6 years ago
Subscribed, because you talk about my favourite piece of hardware.
@Pidalin · 4 years ago
I am playing Far Cry at 1024x768 on my 6600 without problems, it runs well. :-D
@Vanu-i4o · 1 month ago
Yeah, but you're talking about games years later, when everyone had already upgraded... At the time it was just worse than the ATI.
@Vanu-i4o · 1 month ago
Before I press play.... NO!
@Protector1rk · 3 years ago
When games started to require SM3, the 6800 Ultra became a garbage piece of shit, while ATI had the X1xxx series cards with that feature, which outperformed the GeForce 7xxx.
@PyromancerRift · 1 year ago
Typical Nvidia: release a feature, but it's useless on the first gen.
@wakesake · 7 years ago
I have the X850 PE and the 6800 Ultra, and to be real, ATI killed them with ease. I never liked bloom or blur; I see that as a weakness. I respect detail only. BTW, let's get real, Biosmuck was/is a consolized rip-off of System Shock.
@CompatibilityMadness · 6 years ago
ATI killed them with fillrates; NV went for features and efficiency. Two different paths, but as an SM3.0-only GPU, ATI's X1900 series is THE king.
@GraveUypo · 6 years ago
Ouch, such fanboyery. Get that flag down, we're not brand-worshipping on this channel.