$1,200 GPU from 2002 - Can it run Doom 3? 

Tales of Weird Stuff
78K views

Published: 26 Sep 2024

Comments: 475
@STDRACO777
@STDRACO777 2 месяца назад
2002 was in the age where high-end PC gaming got redefined every 2 years. There were more dramatic changes between 2002 and 2005 than between 2016 and 2024.
@AmaroqStarwind
@AmaroqStarwind Месяц назад
Well… real-time raytracing has matured in the past few years.
@STDRACO777
@STDRACO777 Месяц назад
@@AmaroqStarwind It's not that I dislike RT. It just feels like we have been focusing on something that could have been simulated with pre-baked code, similar to how Quake handled lighting. What I disliked about RT is how it was marketed harder to devs than to consumers: as a shortcut so they don't have to prebake lighting in the game and can let the customer's GPU do it instead. I'm also not happy that many games will go on about RT but their devs have not even implemented proper physics or environmental textures.
@Garde538
@Garde538 Месяц назад
This is why my Mum was smart enough to buy me consoles, from the Sega Master System up to the PS2. Guaranteed future game support for the console's life. We had a Win 95 PC that had DOS games and Doom, Wolfenstein, etc. I rejoined PC gaming in 2019, as it makes sense now that PC setups are usable for many more years, like you said.
@fazum
@fazum Месяц назад
For gaming, not that much... RT is blsht! But for AI, the thing scaled real fast!
@STDRACO777
@STDRACO777 Месяц назад
@@Garde538 a 1080ti from 2017 is still giving very good frames with no RT.
@phirenz
@phirenz 3 месяца назад
I looked into the "only supporting integer" stuff. The "shader processors" that 3Dlabs are talking about are not the full equivalent of modern fragment shaders. 3Dlabs actually split the fragment shaders across two separate cores, these shader processors and the "texture coordinate processors", and the texture coordinate processors do support full 32-bit floating point. These two cores are detached from each other, presumably with a longish FIFO between them. The idea is that the texture coordinate processor does all the texture coordinate processing, fetches the textures, does dependent textures and then passes the final texture fetch colors on to the shader processors, which do the final calculations with 10-bit integer precision. The documentation explicitly points out that the texture processors can pass any raw value they calculate through to the shading units without needing to do a texture fetch. And if you check the ARB_fragment_shader extension (notice how the contact detail was 3Dlabs themselves) you will notice that it only requires the full 2^32 range for positions, normals or UV coords. Color values are only required to support the range of 2^10. This split between texture processors and final color processing was not unique to 3Dlabs. I believe all major graphics vendors implemented such a design for their register combiner and early pixel shader GPUs. It's why Pixel Shader Model 1.x and Pixel Shader 2.0 (not 2.0a) have the concept of "texture instructions" and "arithmetic instructions". I don't think the Pixel Shader Model documentation ever actually stated those types of instructions would be run on two completely independent shader cores... but that's what was happening under the hood. I believe it was Nvidia who unified both jobs into a single "pixel shader" core with their FX line of GPUs, and this unification was part of the reason why those GPUs had bad performance, because they had to use 32-bit floats for color. On the plus side, it was the first GPU with proper flow control. Even the Radeon R300 still had this split; I'm seeing references in the register documentation to separate "texture microcode" and "ALU microcode".
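For readers who have never seen that split from the application side, here is a minimal illustrative sketch (not code from the video) of an ARB_fragment_program string whose texture fetches (TEX) are grouped ahead of the arithmetic (MUL) that consumes them, roughly the texture-phase / ALU-phase bundling described above. It assumes a current OpenGL context and a driver exposing GL_ARB_fragment_program; on Mesa, defining GL_GLEXT_PROTOTYPES exposes the entry points directly, elsewhere they would come from an extension loader.

/* Sketch: ARB_fragment_program with its texture instructions (TEX) grouped
 * ahead of the arithmetic instruction that consumes the fetched colors,
 * roughly the "texture phase" / "ALU phase" split discussed above.
 * Assumes a current OpenGL context and GL_ARB_fragment_program support. */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>
#include <string.h>

static const char *fp_src =
    "!!ARBfp1.0\n"
    "TEMP base, light;\n"
    /* Texture phase: two fetches, no arithmetic yet. */
    "TEX base,  fragment.texcoord[0], texture[0], 2D;\n"
    "TEX light, fragment.texcoord[1], texture[1], 2D;\n"
    /* ALU phase: combine the fetched colors at (possibly lower) color precision. */
    "MUL result.color, base, light;\n"
    "END\n";

GLuint load_fragment_program(void)
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei) strlen(fp_src), fp_src);

    /* A non-negative GL_PROGRAM_ERROR_POSITION_ARB means the string was rejected. */
    GLint err_pos = -1;
    glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &err_pos);
    return (err_pos == -1) ? prog : 0;
}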
@noth606
@noth606 3 месяца назад
hmm interesting, in particular the bit about the FX stuff, I still should have one of the big boys of the FX line in storage and at the time had a bunch of discussions with people who were "pooping" on it due to articles that came out dissing the architecture. Sure, it was imperfect, but it was far from the absolute dog it was made out to be, at least the higher end models were quite OK - expecting big things from the fx5200 or how it was called was a strange idea anyway.
@cesaru3619
@cesaru3619 3 месяца назад
3dlabs FANBOY in denial.
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
The 3Dlabs split of higher precision texture coordinate processing and lower precision fragment processing seems most similar to the Geforce3 and Geforce4. NVIDIA had a very similar split there, and this can be seen in the NV_texture_shader and NV_register_combiners split. At least some modern architectures still have vestiges of this. On Intel GPUs, the texture instructions conceptually send a message to an asynchronous unit that performs the texture look up. Once the data has been fetched and processed, it is delivered back to the shader execution unit. Depending on numerous factors (e.g., caches) the texture access can take a wildly variable amount of time. The texture sampler hardware is on a separate piece of the die from the execution unit. The huge difference between current and vintage architectures is that current architectures are asynchronous (other things can happen in the shader while waiting for texture results) and vintage architectures were synchronous (the next phase of ALU operations would block until the texture unit was done). As I mentioned in the video, Geforce FX only had flow control in the vertex stages. Fragment processing had predication that could be used to emulate if-then-else structures (similar to conditional move instructions on CPUs), but loops were impossible. You are correct that a lot of hardware from this era has a concept of alternating "texture phases" and "ALU phases." These architectures have a limited number of texture indirections, so the compiler / driver has to try to group texture accesses into bundles to reduce the number of phases. Each Shader Model has a limit for the number of texture indirections that are guaranteed to be supported. On R200 it's really small... 2 or 3. I know ATI_fragment_shader only allowed 2 passes. I think I'm going to have to do a deep-dive video on the Radeon 8500 architecture. On Intel i915, it's also pretty small. I want to say 4, but it has been years since I did any serious work on that driver. The general idea is that a texture phase receives data from a previous stage. That is either the vertex processing / interpolation or a previous ALU phase. Texture look ups occur, and that data is fed to either the next ALU stage or the color / depth outputs. ALU stages have some number of values that can be persistent by passing through the texture stages (this is related to the "pass any raw value they calculate" bit that you mention) and some temporary values that are lost at each texture stage. Fragment shader color inputs (via gl_Color and gl_SecondaryColor variables), color outputs (via gl_FragColor), and depth output (via gl_FragDepth) are implicitly clamped to [0, 1] and may have reduced precision. Intermediate calculations are expected to be "full" precision. The spec is a little vague about what full precision would be. Issue 30 of the GLSL 1.10 spec defers to the OpenGL 1.4 spec, "It is already implicit that floating point requirements must adhere to section 2.1.1 of version 1.4 of the OpenGL Specification." Section 2.1.1 is where I assume you got the 2^32 and 2^10 values. That section also says, "The maximum representable magnitude for all other floating-point values must be at least 2^32 ." "All other floating-point values" would include intermediate calculations in a shader. NV30 and R300 had different representations. NV30 could use either 32-bit single precision or 16-bit half precision. The latter was not directly exposed in GLSL. R300 used some sort of 24-bit precision internally. 
It met the requirements of all the specifications at the time, but I have some memory of it causing problems for developers that expected / needed full single precision. Given that the P10 shaders can supposedly be quite large, there are a few ways they could have gotten around this. A clever compiler could analyze the shaders to determine that some calculations would be fine at lower precision. I implemented a similar pass in my compiler. For many values in simple shaders, it is very easy to prove that the range will be limited. That would cover a lot of things. The GLSL spec and OpenGL 1.4 are pretty loose in the definition of precision and the processing of exceptional values (infinity, NaN, etc.). Perhaps this enables the compiler to use a different representation for floating point values... like storing an integer numerator and integer denominator. This is just speculation. There is one thing that still bugs me. GLSL is a superset of ARB_fragment_program (the assembly language shader). If the driver and hardware can do the former, it can, by definition, do the latter. Why not support it? Why not support Shader Model 2.0?
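As a concrete illustration of that range argument (hypothetical, not from any driver): in the GLSL 1.10 fragment shader below, every intermediate value provably stays inside [0, 1], because the inputs are clamped colors and a fetch from an assumed 8-bit texture, and the only operations are multiplies and a mix. A compiler of the kind described above could legally evaluate the whole thing at 10-bit fixed point. The ARB_shader_objects calls assume a current OpenGL context.

/* Illustrative only: a GLSL 1.10 fragment shader whose intermediates provably
 * stay in [0, 1] -- clamped color inputs, an 8-bit texture fetch, and
 * multiply/mix operations -- so low fixed-point precision would be legal.
 * Assumes a current OpenGL context with GL_ARB_shader_objects support. */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>

static const char *fs_src =
    "uniform sampler2D base;                               \n"
    "void main()                                           \n"
    "{                                                     \n"
    "    vec4 tex = texture2D(base, gl_TexCoord[0].xy);    \n"  /* in [0,1] */
    "    vec4 lit = tex * gl_Color;                        \n"  /* in [0,1] */
    "    gl_FragColor = mix(lit, gl_SecondaryColor, 0.25); \n"  /* in [0,1] */
    "}                                                     \n";

GLhandleARB build_fragment_shader(void)
{
    GLhandleARB sh = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    glShaderSourceARB(sh, 1, &fs_src, NULL);
    glCompileShaderARB(sh);

    GLint ok = 0;
    glGetObjectParameterivARB(sh, GL_OBJECT_COMPILE_STATUS_ARB, &ok);
    return ok ? sh : 0;
}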
@remasteredretropcgames3312
@remasteredretropcgames3312 2 месяца назад
@@TalesofWeirdStuff Looking for someone like you for help with a project in id Tech 4. I'm trying to force the game to use more modern JPG formats by changing the hex code and renaming it. Forcing 3Dc instead of RXGB DXT5 would be a nice little upgrade to squeeze more memory bandwidth out of the engine.
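For background on the formats being discussed (a sketch, not id Tech 4 code): at the OpenGL level both DXT5 and RGTC2, the standardized descendant of ATI's 3Dc, cost 16 bytes per 4x4 block, but RGTC2 spends all of that on two channels, which is why it is attractive for tangent-space normal maps. The helper functions and the data/size parameters below are hypothetical placeholders for blocks read from a .dds file.

/* Background sketch: uploading a pre-compressed mip level as DXT5 versus
 * RGTC2 (the standardized descendant of ATI's 3Dc). Both are 16 bytes per
 * 4x4 block, but RGTC2 keeps only two channels (enough for a tangent-space
 * normal's X/Y) at better per-channel quality. 'data' and 'size' are
 * placeholders. Assumes a current context with S3TC and RGTC support. */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>

void upload_dxt5(GLsizei w, GLsizei h, const void *data, GLsizei size)
{
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                           w, h, 0, size, data);
}

void upload_rgtc2_normal_map(GLsizei w, GLsizei h, const void *data, GLsizei size)
{
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RG_RGTC2,
                           w, h, 0, size, data);
}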
@josue1996jc
@josue1996jc Месяц назад
Holy shit, I read you guys and I see gibberish xd. I wonder if that's how you IT guys feel when you hear us from the healthcare area talk xd
@seebarry4068
@seebarry4068 3 месяца назад
I had the 9700 pro. Good card, got about 8 years use from it.😊
@supabass4003
@supabass4003 3 месяца назад
Damn you're lucky, mine died in 2004 :( I had the Powercolor 9700Pro AIW.
@RJARRRPCGP
@RJARRRPCGP 3 месяца назад
@@supabass4003 Why did it die? That's a mystery, because the high-end, especially that high-end, aren't even likely to have bad caps. Except for motherboards, I usually would see bad caps more in 2004-2005.
@centigrams
@centigrams 3 месяца назад
@@supabass4003 My Radeon 9800XT 128MB (best core) is still alive to this day. Sad your Radeon 9700 Pro AIW died.
@picantoburito
@picantoburito 2 месяца назад
​@@RJARRRPCGP😊😢❤
@PatientXero607
@PatientXero607 2 месяца назад
I'll have to re-test Doom 3 on my fresh rebuilt Athlon XP 2500+ system. NF7 2.0 with a 9700 Pro. If memory serves, I think I upgraded to a 6800GT to play Doom 3.
@coolmoments2994
@coolmoments2994 3 месяца назад
I had bought a FireGL X1 it was about $1000 USD new, still have it somewhere actually, was able to play games back in the early 2000's at insane resolutions on a super high res CRT I used to have, was awesome.
@ichrismoku
@ichrismoku 2 месяца назад
Just got to 12:18 and the claims about 512mb RAM. Very serendipitous you should mention Doom 3 too... So here's the thing. There was a big competition around the release of Doom 3 that I remember to this very day. The prize for this competition? The world's first and only special edition 512mb graphics card - the point being it would allow you to play in "Ultra" graphics settings. To this day I can't find any mention of it online, no one seems to remember it, there's no trace of it. Perhaps this is something you might be interested in looking into.
@TalesofWeirdStuff
@TalesofWeirdStuff 2 месяца назад
I have some vague memory of that too. I don't recall for sure if that was Doom 3 or some other game, but I do recall something about that. I can't even imagine what a slide show Doom 3 on Ultra settings would be on this card.
@szilardfineascovasa6144
@szilardfineascovasa6144 2 месяца назад
Probably overshadowed historically by the "But can it run Crysis?" trope 🙂.
@Hugobros3
@Hugobros3 3 месяца назад
now you're getting into the really good stuff ! I absolutely love these, and especially the later P20/25 cards.
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
Those later cards seem like they might be more interesting. Did you have any of the 3Dlabs cards back in the day? I am curious to hear about people's real world experiences with them.
@Hugobros3
@Hugobros3 3 месяца назад
@@TalesofWeirdStuff I was born far too late for that. I got my first 3Dlabs card (Wildcat Realizm 200) in 2011 at a flea market, without a clue about the significance it had. It was already outdated by then but ran TrackMania and Minecraft okay-ish on the terrible hand-me-down computer I had. I have collected a few other cards since, and I even managed to score some ZiiLabs ZMS stuff (the embedded ARM SoCs with StemCell GPUs, derived from their existing IP).
@kenh6096
@kenh6096 3 месяца назад
​@@TalesofWeirdStuff I remember the Realizm cards, whilst impressive for some professional applications, had real issues if you attempted to game on them.
@remasteredretropcgames3312
@remasteredretropcgames3312 2 месяца назад
@@TalesofWeirdStuff Go to the Moddb Prey 2006 Remake page and look at the second photo's comments. I'm using DXT1, DXT3, DXT5, RXGB DXT5, TGA RLE, and JPG 4:2:0 at 87-96%, making up an over-60GB file that absolutely maxes out an RTX 4090's memory system, using custom resolutions for EVERY SURFACE in the game. The goal is to get Nvidia's attention and drop in path tracing beyond the screenspace stuff I'm stuck using, and scaling back to development hardware for the engine itself.
@xRichUKx
@xRichUKx 3 месяца назад
Very interested to hear about your career and experience writing graphics drivers!
@redgek
@redgek 3 месяца назад
Yes! I'd love to hear more about.
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
I expect that it will come out more as I continue to talk about graphics hardware and OpenGL. :)
@smakfu1375
@smakfu1375 3 месяца назад
@@TalesofWeirdStuff As a former driver developer, though not in the display space (and formerly part of NTDev in Redmond), I have a ton of questions about how you approach core display miniport development for these beasts. Without disclosing anything sensitive, I’d love to see a video about the general process, from spec sheet onwards (especially once we entered the era of programmability). (Also, this video was fascinating, as are some of your detail followup comments.)
@jrno93
@jrno93 2 месяца назад
Young people today will never understand how hard it was to get things running right back then. Today it's basically plug and play, especially when it comes to Steam and how it automatically downloads all dependencies.
@Typhon888
@Typhon888 2 месяца назад
It’s actually harder now that windows ships with malware.
@EvilestGem
@EvilestGem 2 месяца назад
And yet things now are not working as well as they should. For example, I have a relatively high-end setup and I have more problems with games than I ever had. The amount of software within hardware now means the possibility of conflict is ridiculously high. You could argue we have too many options in too many areas.
@jrno93
@jrno93 2 месяца назад
@@EvilestGem you must have AMD gpu
@EvilestGem
@EvilestGem 2 месяца назад
@@jrno93 I had both. The issue I've had persisted through several new builds including intel and AMD.
@M4XXST3IN-vp5vk
@M4XXST3IN-vp5vk 2 месяца назад
That is how I became so good with hardware and tweaking
@TokeBoisen
@TokeBoisen 3 месяца назад
I absolutely do not miss my first GPU, the GeForce FX 5600XT. At that time Nvidia used XT to mean "low-performance" versions, something 15-year-old me didn't know when I bought my first pre-built. Upgraded to a 6800GT later that year, and that thing was a beast at the time. Thanks for this blast from the past.
@noth606
@noth606 3 месяца назад
The FX5600XT, if I recall correctly, was a system-integrator-only SKU; I think it came as standard in some branded PC package deal? Pre-built can mean a local shop built PC package, or something totally different like an Acer, HP or similar. I recall at least Medion had FX5x00XT system-integrator-only cards that were strange ones; they seemed to be binning rejects or bus-width-gimped GPUs. I had either a 5500XT or 5600XT that was a "faulty/disabled" 5800 where the memory controller was limited to 64MB at half speed or something like that. Very strange card, but not very useful.
@goldenheartOh
@goldenheartOh 2 месяца назад
I can relate. My 1st PC around 2000 had a PCI nvidia graphics card. This was when AGP was standard. I was amazed how much better my friend's PC played Midtown Madness using integrated graphics than mine could play it with a GPU . Mine barely could play it at all. But hey, I knew enough to make sure my PC had a gpu. ☠️ $500 lesson. That PC was a true potato. Icing on the cake was it ran Windows ME.
@hanshanserlein576
@hanshanserlein576 2 месяца назад
I bought an FX5200 back then. It was a catastrophically bad GPU, even worse than the previous generation!
@Chbcfjkvddjkiuvcfr
@Chbcfjkvddjkiuvcfr 2 месяца назад
I had an SLI of 6800GT, it was a beast
@noth606
@noth606 2 месяца назад
@@Chbcfjkvddjkiuvcfr heh, I still have a pair of 7950GX2's. As useful as perfume on toast. But it did turn heads, heh, if not much else; quad SLI never worked right. I never really used those cards, other than to test and verify how crap they were. Still have a 6800 Ultra and a 5950 Ultra (V9950 Ultra), which both actually worked really well and did what it said on the tin. Have an 8800 Ultra too, etc. The single cards were good, but it was only the 8xxx series that worked well in SLI for me; I even have a pair of 8600-somethings in SLI that work surprisingly well, roughly 8800U-equivalent performance but cheaper and less prone to the GPU desoldering itself from the PCB spontaneously. The 8800U has problems, if any of them survive still in use; I have 2 dead ones and 1 working. The working one is because I don't use it except to briefly benchmark things on. The 8800GT was good, have a pair of them too, 512MB ones. The GTS is okay but a lot slower than it should be, and 2 slots wide without justification. Also have the 8800GT "Alpha Dog" which, if it works, is great, but it has the same problem as the Ultra: you'll have to dig through a stack of them to find a working one. If you do, cooler-swap it immediately and it might live on. Triple SLI 8800 Ultra was a hoot though, a million FPS, but the graphics bugs made everything unplayable unless you only played one game and found a magic modded driver for it; otherwise disable SLI and run just a single one.
@ridiculous_gaming
@ridiculous_gaming 2 месяца назад
The Radeon 9800 with DirectX 9 pixel shaders made Half-Life 2 and COD 2 look absolutely mind-blowing at the time on PC.
@DraponDrako
@DraponDrako Месяц назад
Water reflections and depth are still amazing.
@themidcentrist
@themidcentrist 3 месяца назад
I had 2 Voodoo 2 cards in SLI back in the day. Great stuff! I used it to play the original Tomb Raider with a patch that allowed it to support 3DFX glide.
@mproviser
@mproviser 2 месяца назад
How old are you... If you dont mind my asking :D
@angry_zergling
@angry_zergling 2 месяца назад
Around 2005 I spent $5k on a PC with an X2 4800 and dual 7800s... *512*.
@p1zzaman
@p1zzaman 2 месяца назад
I also had the Voodoo 2
@rpersen
@rpersen 2 месяца назад
Aah the good old Voodoo 2.
@DraponDrako
@DraponDrako Месяц назад
​@@mproviser Remember those school guys: "I have 2 Voodoos in SLI at home." Then when you ask to let us play it: "my parents don't allow any friends from school." Even if you broke into his house to play it, he'd say "my father puts it in a vault."
@DailyDoseOfShrooms
@DailyDoseOfShrooms 2 месяца назад
I feel like I'm going to be you in my 40s and 50s, explaining the graphics cards of today to the younger generation.
@TalesofWeirdStuff
@TalesofWeirdStuff 2 месяца назад
Begin your training by shaking your fist and yelling, "Get off my lawn!" 🤣
@Jabjabs
@Jabjabs 2 месяца назад
In 2004 I played through Doom 3 on a Radeon 9200 SE, back then it cost about $50 new. The SE meant that the ram bus was cut to 64bit width, about 1/4th the typical 256bit data bus. It ran Doom 3 with almost identical performance to this thing.
@TalesofWeirdStuff
@TalesofWeirdStuff 2 месяца назад
Given the scores the Radeon 9250 got (see the follow-up video)... ouch. Just... ouch.
@Proton_N
@Proton_N 2 месяца назад
Damn, I had the same Radeon 9200se 😁
@jbrizz99
@jbrizz99 2 месяца назад
I played Doom 3 when it came out on a GeForce4 MX 440, which was basically just a faster GeForce 2, at 640x480, and it ran better than this. There are videos on YouTube of the GeForce4 MX 440 playing Doom 3 just fine. There's something wrong with this card or setup, I think; it should run Doom 3 better than a GeForce4 MX 440.
@Proton_N
@Proton_N 2 месяца назад
@@jbrizz99 my best friend had mx440 and it was better than 9200se. Closer to 9600 pro
@GuyFromCrowd
@GuyFromCrowd 2 месяца назад
I did run the Doom 3 Alpha back in 2003/2004 on a 2.0 GHz Pentium and a Radeon 9000, but it was a slideshow.
@drewnewby
@drewnewby 3 месяца назад
From tomshardware, "Old Hand Meets Young Firebrand: ATi FireGL X1 and Nvidia Quadro4 980XGL" The FireGL X1 128 MB is shown at $749, and $949 for the 256 MB version.
@drewnewby
@drewnewby 3 месяца назад
Uwe Scheffel's other articles are worth a read too, in this case ... " British Understatement: 3Dlabs' Wildcat VP Put To The Performance Test"
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
Thanks for finding that. $1,200 is still quite a bit more, but it's not as absurd.
@KleinMeme
@KleinMeme 3 месяца назад
@@TalesofWeirdStuff Correlates with what I found (mostly German hardware/IT sites which had reports from 2002-2003 for these cards). :D A German hardware website had a report with benchmarks that stated the ATI FireGL X1 256MB was listed and sold for 995 €. The 3Dlabs Wildcat VP990 Pro was MSRP 999 € before taxes, 1160 € with VAT applied. The VP760 was listed for $449, the VP870 for $599, and the VP970 for $1199, as stated in a report from June 24th, 2002 on a German IT website.
@kamilwaek8326
@kamilwaek8326 2 месяца назад
@@TalesofWeirdStuff I found the price in an Australian magazine from the early 2000s (Atomic: Maximum Power Computing, issue 023, December 2002) and it said $2,595 (Australian dollars, presumably). The magazine is dumped on archive.org if you want to see it yourself.
@pacocarrion7869
@pacocarrion7869 3 месяца назад
512MB is a lot; remember that the Quadro FX 3000G had 256MB (released in 2003).
@johnmay4803
@johnmay4803 3 месяца назад
i just want 2 say i love your vids keep up the fantastic work pal
@supabass4003
@supabass4003 3 месяца назад
The moment you said you didn't own a 9700Pro in 2002, I knew you were going to say that you had an 8500LE!!!!
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
Ha! The worst part is... I just got it in October or November of 2002 to upgrade my original Radeon DDR. A little late to that particular party.
@raresmacovei8382
@raresmacovei8382 3 месяца назад
@@TalesofWeirdStuff Not really. Due to the PS2 and original Xbox, games ran on DirectX8.1 until around 2006, with some games even in 2007+ still running on that API.
@szilardfineascovasa6144
@szilardfineascovasa6144 2 месяца назад
@@TalesofWeirdStuff Ah, that Radeon was a sweet card... it died on me, switched to a GeForce2 Ti (or was it Pro?). The speed bump was quite noticeable on an Athlon at 1200 MHz in Max Payne, but the image quality of Nvidia was still lagging behind ATi.
@seasidegalaxystreet
@seasidegalaxystreet 3 месяца назад
I remember back in the day owning a £600 IIyama Vision Master Pro 510 22” NF diamondtron monitor and how cool that was. A great time to be in computing.
@drewnewby
@drewnewby 3 месяца назад
I had quite a few video cards prior, but I remember the ATI Radeon 9500 Pro as the first one I referred to as a graphics card proper.
@NavJack27gaming
@NavJack27gaming 3 месяца назад
Please Please cover more stuff like this! i had absolutely no idea you had this in-depth background with GPUs.
@nauglefest
@nauglefest 27 дней назад
That's not just a GPU, that's a workstation card. I recommend the brand and the model. 3DLabs cards were meant for workstations doing CAD and complex modeling, and they averaged as much as quadruple the amount of video RAM that a consumer-market card had available, because they were workstation cards. I remember 1998, when these workstation cards had 64MB of RAM at a time when consumer graphics cards only had 16MB available. They're for heavy workloads too.
@placeholdername3206
@placeholdername3206 2 месяца назад
I had a GeForce 5800 Ultra and an AMD Athlon 2800+ back during the Doom 3 days. It was my first gaming PC. Not sure how that compares to what he's using here, spec-wise. That game ran smooth on my setup, I remember that. That setup lasted me quite a few years.
@TalesofWeirdStuff
@TalesofWeirdStuff 2 месяца назад
In the follow-up video, I run a PCX 5900 on an Athlon X2 4800. The GPU is probably about the same as the 5800 Ultra, but the CPU is... a bit more. I got decently playable numbers on that setup.
@otonalencar5145
@otonalencar5145 Месяц назад
First time on this channel, less than 3 minutes on this video and this guy got my thumbs up and a sub. Awesome content
@jtsiomb
@jtsiomb 3 месяца назад
While you were scrolling through the extensions, I could see both ARB_vertex_shader and ARB_fragment_shader (and the obligatory umbrella ARB_shading_language_100), so it absolutely looks like it has full GLSL support. Which makes sense, because 3DLabs was at the time at the forefront of GLSL and GL2 development. I never heard of 3DLabs before reading their whitepapers and slides about it back then. The big question then is, does it have a GLSL noise() function that doesn't always return 0? Because I want one if it does.
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
That is a fabulous question! I will for sure check the noise function. For a while Mesa had a real noise function, but we took it out because lower-end GPUs couldn't do it. It generated way too many instructions.
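A quick way to answer that question on real hardware, sketched below (hypothetical test code, assuming a current OpenGL context with ARB_shader_objects support and a full-screen textured quad drawn elsewhere): compile a fragment shader that writes noise1() straight to the output and read back a couple of pixels; if they are identical, the driver's noise is almost certainly a constant.

/* Hypothetical test for whether a driver's GLSL noise1() is real or a
 * constant: render a full-screen quad with the shader below, then compare a
 * few pixels. Assumes a current context and GLSL (ARB_shader_objects) support. */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>

static const char *noise_fs =
    "void main()                                                    \n"
    "{                                                              \n"
    "    float n = noise1(gl_TexCoord[0].xy * 32.0);  /* [-1, 1] */ \n"
    "    gl_FragColor = vec4(vec3(n * 0.5 + 0.5), 1.0);             \n"
    "}                                                              \n";

/* Returns 1 if the framebuffer shows any variation (real noise), 0 otherwise. */
int noise_is_real(int width, int height)
{
    GLubyte a[4], b[4];
    glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, a);
    glReadPixels(width / 2, height / 2, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, b);
    return a[0] != b[0] || a[1] != b[1] || a[2] != b[2];
}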
@cmoore022
@cmoore022 2 месяца назад
Brings me back to memories of my Voodoo 5 5500, it lasted about 12 years with daily use. Miss that card.
@AmaroqStarwind
@AmaroqStarwind Месяц назад
If Intel's cancelled Larrabee cards had actually been released, you probably would've had a field day with them. Just out of curiosity, have you heard of a game called 4D Golf? I'd be curious to see what someone of your expertise thinks of its rendering code. The engine was recently made open source, and the developer also has a comprehensive build log on YouTube.
@buggerlugz6753
@buggerlugz6753 3 месяца назад
The capturing VGA impressed me! :)
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
I realized afterwards that since these cards have DVI, I could probably use my regular HDMI capture device. 🤦‍♂️ I'll try that for the next video.
@CharlesVanNoland
@CharlesVanNoland Месяц назад
I played Doom3 on a GeForce3 back in the day. It didn't run very well, and I'd acquired the GF3 THREE YEARS PRIOR specifically to play Doom3, because that's the GPU that Carmack mentions at the Macworld Expo where Doom3 was originally unveiled. Then the game took 3 more years to release, and in that time GPUs got better and faster, so by the time my Doom3-intended GPU actually rendered the game, it was antiquated and bare-minimum hardware. I did have fun playing with NV_register_combiners in OpenGL though, creating a normal-mapping demo, and manually creating normal maps in Photoshop 5.0. It basically was a fixed-function toggleable dot-product that you could route different inputs into. It was the earliest most basic "programmable graphics hardware" you could imagine - no actual "shader" support, but you could do normalmapping.
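For anyone curious what that looked like in code, here is a rough, hypothetical sketch of a single NV_register_combiners general stage doing the dot product described above: texture 0 holds the normal map, the primary color carries the range-compressed light vector, and the combiner expands both to [-1, 1] and writes N dot L to spare0. The call and enum names come from the NV_register_combiners spec; texture setup, the alpha portion, and error handling are omitted.

/* Rough sketch (not from the video): one NV_register_combiners general stage
 * computing N dot L for per-pixel lighting on GeForce-class hardware.
 * texture[0] = normal map, primary color = light direction, both stored
 * range-compressed as RGB in [0,1] and expanded back to [-1,1] here.
 * Assumes a current OpenGL context exposing GL_NV_register_combiners. */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>

void setup_dot3_combiner(void)
{
    glEnable(GL_REGISTER_COMBINERS_NV);
    glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

    /* A = normal map texel, B = light vector, both expanded to [-1, 1]. */
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                      GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                      GL_PRIMARY_COLOR_NV, GL_EXPAND_NORMAL_NV, GL_RGB);

    /* Write A dot B into spare0; discard the other products. */
    glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                       GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                       GL_NONE, GL_NONE,   /* no scale, no bias */
                       GL_TRUE,            /* AB is a dot product */
                       GL_FALSE, GL_FALSE);

    /* Final combiner computes A*B + (1-A)*C + D; set B = 1, C = D = 0 so the
     * fragment color is just spare0, i.e. N dot L. */
    glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO,
                           GL_UNSIGNED_INVERT_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
}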
@MrClassiccarenthusia
@MrClassiccarenthusia 2 месяца назад
I've recently seen Doom 3 running in a very similar manner, on my PIII 550 with an Nvidia FX 5500 PCI. It's running drivers that aren't entirely meant for it, the older ones that only support the 5200, as for whatever reason the newer ones for 98SE don't let System Shock 2 run in its original form, as well as a couple of other games. But even so, it comfortably runs Fahrenheit and Max Payne 1 & 2, though I primarily use it for older games. My Athlon Thunderbird 1.13 with a Voodoo 3000 can technically run modded Doom 3, but it looks like some kind of fever dream. For regular work, my 3800 X2 with an Nvidia 7900GTX 512MB, a PCI-e Ageia PhysX card, 2GB of RAM, and a transparent 140GB 10k Raptor does just fine. Oh, and one of those fancy Creative Z31 cards with the gold ports. Since I'm not versed in this anywhere near as well as you are, maybe you can answer this question. The Chronicles of Riddick: Escape From Butcher Bay looks absolutely marvellous on the 7900GTX. But on later unified shader cards, the lovely soft shadows and lighting don't look right. This isn't my imagination; I've run the games side by side on my old brick and my actual PC, and the difference is clear. I even had the remaster running on my brother's PC and compared it; it still doesn't look anywhere near as good. Are there any features that a vertex and pixel shader card could do that a unified shader card cannot? Because I know it's not the first time I've noticed this with games from around 2004-2006, where it looks cleaner and shadows and lighting look better on the older cards. 🤔 Or maybe the devs were lazy.
@TalesofWeirdStuff
@TalesofWeirdStuff 2 месяца назад
That is interesting. I don't know anything about that particular game, but I can't think of a particular reason why the shaders should have any effect. Is it just the shadows? There could be changes in how the texture samplers work that could have some impact on soft shadows.
@MrClassiccarenthusia
@MrClassiccarenthusia 2 месяца назад
@@TalesofWeirdStuff Well, I honestly didn't expect a response.. Okay, in TCoR: EFBB it's the shadows, and the interaction of light sources with shaded surfaces. The hardware at the time didn't necessarily permit copious use of polygons, so many surfaces were just flat, with textures and shaders to make bricks, tiles, wood etc. look like they were actually 3D. What I need to do is grab a capture card, record some of my favourites from back in the day, and stick the side-by-side comparisons in a YouTube video that no one will watch. I cannot say this with any degree of certainty, but either some level of precision was lost when moving from separate pixel and vertex shader pipelines to the unified architecture, or the devs who made those games / game engines didn't write the rendering code in a way that later drivers could interpret to produce the same exact image on the screen. One that definitely sprang to mind again was THQ's The Punisher, a game that by all accounts looks pretty bad even for the time. But on ATI X series cards (X300 ~ X800), old junk by today's standards, certain details could be made to look really good by tweaking driver options. Effects that couldn't be replicated when I moved over to an nVidia 7600 around that time. I'm firing up my old brick now to dig through my archive. Okay, bit sad looking at screenshots I took 19 years ago, but still. Amongst the many comparison shots, I found what I was looking for. With ATI APC the textures look sharper and cleaner, and it emphasised the metallic parts on the trench coat, buttons, edge lining etc. Without it, the game looked flat and textures were "standard". Anyway, don't let me bore you to death. When I was a teenager, these things mattered to me and my friends. Why does it look like this on this hardware, and like that with these driver settings, but that hardware looks slightly different and cannot achieve the same results? Don't worry, I'm just an idiot who still thumbs his nose at the past. 😂
@grumpybunny
@grumpybunny 2 месяца назад
Appreciate your format, and your outstandingly long experience with this stuff. Keep it coming I honestly love the stuff about extension implementation and compliance, fuckin' gold
@Nick1921945
@Nick1921945 18 дней назад
In high school, we had shop class where we learned how to use power tools and different kinds of hardware. We had a small period where we learned about computer hardware chips, and I will never forget our teacher being hyper serious about never touching the gold fins on them. He would constantly refer to it as "fingering the fins". Even on the test. Meanwhile, every single person on YouTube is fingering the heck out of these gold fins like they are the king of the world.
@gelijkjatoch1009
@gelijkjatoch1009 3 месяца назад
Finally a very light game looking good, nicely aged.
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
The bummer is that the scenery looks pretty good... too bad it's so dark you can't see any of it.
@gelijkjatoch1009
@gelijkjatoch1009 3 месяца назад
@@TalesofWeirdStuff The games were also so scary back then that people would rather turn it off.
@PixelPipes
@PixelPipes 3 месяца назад
This is so interesting! I wonder if Doom 3 is defaulting to fixed function mode. That framerate though... Does it not have occlusion culling or am I misunderstanding that? I do remember my friends and I back in the day saw this card on NewEgg and such and just gawked at the ridiculously huge framebuffer. But we all knew it wasn't suitable for gaming.
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
I'm going to delve into the rendering modes in the next video. I finally looked it up... "gfxinfo" will show which modes are available, and "r_renderer" can be used to specify a mode. Based on the available functionality, it should be able to do the NV20 mode, but it might be falling back to ARB. The card doesn't have occlusion query, but I don't think Doom 3 uses that. I'm not 100% sure, though. It's possible that's a factor.
@r9guy_
@r9guy_ Месяц назад
Doom 3 was released in 2004 and it ran okay on an Nvidia FX5200 with a Pentium 4, but it was designed to run on lower-end systems; I believe OpenAL and OpenGL 2.0 were what was necessary. By the way, OpenGL 2.0 could load shaders as extensions, a mechanism present in OpenGL from the initial design stages, back when it was called IrisGL. By the way, you explain it very well for people who programmed in OpenGL 1.2.
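To make the "shaders as extensions" point concrete, here is a hedged sketch of how pre-2.0 applications picked up the GLSL entry points at runtime from GL_ARB_shader_objects / GL_ARB_fragment_shader (Linux/GLX shown; Windows would use wglGetProcAddress instead). The function and typedef names are real; the surrounding checks are illustrative.

/* Sketch of the "shaders as extensions" mechanism: before OpenGL 2.0 folded
 * GLSL into core, the entry points came from GL_ARB_shader_objects and
 * friends and had to be resolved at runtime. Assumes a current GLX context. */
#include <string.h>
#include <GL/gl.h>
#include <GL/glx.h>
#include <GL/glext.h>

static PFNGLCREATESHADEROBJECTARBPROC p_glCreateShaderObjectARB;
static PFNGLSHADERSOURCEARBPROC       p_glShaderSourceARB;
static PFNGLCOMPILESHADERARBPROC      p_glCompileShaderARB;

int load_glsl_extension_entry_points(void)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    if (!ext || !strstr(ext, "GL_ARB_shader_objects") ||
        !strstr(ext, "GL_ARB_fragment_shader"))
        return 0;   /* no GLSL on this driver */

    p_glCreateShaderObjectARB = (PFNGLCREATESHADEROBJECTARBPROC)
        glXGetProcAddress((const GLubyte *) "glCreateShaderObjectARB");
    p_glShaderSourceARB = (PFNGLSHADERSOURCEARBPROC)
        glXGetProcAddress((const GLubyte *) "glShaderSourceARB");
    p_glCompileShaderARB = (PFNGLCOMPILESHADERARBPROC)
        glXGetProcAddress((const GLubyte *) "glCompileShaderARB");

    return p_glCreateShaderObjectARB && p_glShaderSourceARB && p_glCompileShaderARB;
}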
@cyrusjalali1571
@cyrusjalali1571 14 дней назад
I remember running Doom 3 on my Pentium 4 Northwood 2.4GHZ socket 478 with ATI 9000 AGP. Barely ran at 640x480. But in 2004 I upgraded to a x850xt agp and I was able to max it out. Pretty awesome.
@jholloway77
@jholloway77 Месяц назад
Is that a giant Sun Microsystems flag?!? So awesome
@TalesofWeirdStuff
@TalesofWeirdStuff Месяц назад
Yes it is. 😁 That's a pretty recent addition to the collection, too.
@AluminumHaste
@AluminumHaste 3 месяца назад
Man the Radeon 9700 Pro was such a beast, I had one for like 3 days before I returned it, my CPU was too slow to actually use all of it. It was 700$ CDN at Bestbuy back in like 2003ish.
@justinrose8661
@justinrose8661 2 месяца назад
I love how into this stuff you are. Reminds me of installing my Diamond Monster 3d Voodoo card in my first build as a kid. We've come a long way
@AnonyDave
@AnonyDave 3 месяца назад
Glad you have a sane definition for GPU. Nothing worse than someone looking at the most basic framebuffer (like, say, a Sun cg3, where it's literally just a chunk of RAM you can write to and a RAMDAC that converts data from that RAM into video data) and saying "look at this GPU".
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
I rarely "correct people," but it does annoy me when people call even something like a Voodoo 2 a GPU. But... I always tell people the correct way to say GIF is the *opposite* of however they just said it. 😂
@Bourougne
@Bourougne 2 месяца назад
Magical shortcut to 41:38
@kirillholt2329
@kirillholt2329 Месяц назад
Definitely interested in a video on the Radeon 8500/GeForce 3/4. GPU evolution in general is a very interesting and under-covered topic; the deeper the better (if time permits).
@slimebuck
@slimebuck 3 месяца назад
back then my gpu was a voodoo 3 3000. to this day it is my favorite ever gpu. it blew me away back then. It replaced my 2x Voodoo2 SLi set up
@Chbcfjkvddjkiuvcfr
@Chbcfjkvddjkiuvcfr 2 месяца назад
I had the 9800 pro … this graphic card was awesome, the golden age of hardware :-)
@NotAnonymousNo80014
@NotAnonymousNo80014 2 месяца назад
Got it to play Doom3 myself, two months later the 6600GT came out for the same price, all I was left with was buyer's remorse.
@OgOssman
@OgOssman 2 месяца назад
Even though I dont understand much of the lingo, I was able to finally understand what a gpu actually does. Thank you for that knowledge.
@JohnSmith-rx3tn
@JohnSmith-rx3tn 2 месяца назад
Finally youtube recommend me a decent and neat channel.
@TheMajorTechie
@TheMajorTechie 3 месяца назад
Very interesting! I've been interested in trying my hand at Windows graphics driver writing myself, though I haven't really had much of a clue of where to even start yet outside of grabbing the pre-made example drivers from the Windows DDK.
@homersimpson8955
@homersimpson8955 3 месяца назад
Radeons support vertex texturing starting from the DX10-era cards, R600 and higher. But I have a feeling that R500 should also have gotten support, but it was disabled for some reason. Maybe it was buggy. It would be interesting to read the R500 specification about this. And that feeling I got from Chronicles of Riddick: Assault on Dark Athena. This game specifically requires vertex texturing and treats R500 as unsupported, but a few Catalyst drivers actually pass the check and allow playing the game for a while. The game still crashes on level 3 and there's no way to go further on R500, but it made me think that vertex texturing may be implemented but buggy, or maybe level 3 is the first level which really requires it. :)) It's interesting that 3Dlabs implemented so many advanced features but forgot basic ones like occlusion query; also it's not 100% necessary, pretty sure DOOM 3 does not use occlusion query and does geometry culling on the CPU and fights overdraw with an early Z pass. Waiting for the next video with more games on the P10.
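For reference, this is roughly what the missing occlusion query feature looks like from the application side (GL_ARB_occlusion_query, core in OpenGL 1.5). Illustrative sketch only; draw_bounding_box() and draw_object() are hypothetical placeholders, and whether Doom 3 uses the feature on any path is a separate question.

/* Illustrative sketch of occlusion queries (core since OpenGL 1.5): draw a
 * cheap proxy with writes disabled and ask how many samples survived the
 * depth test. Assumes a current OpenGL 1.5+ context and a query id from
 * glGenQueries(). */
#define GL_GLEXT_PROTOTYPES 1
#include <GL/gl.h>
#include <GL/glext.h>

void draw_bounding_box(void);   /* hypothetical placeholder */
void draw_object(void);         /* hypothetical placeholder */

void draw_if_visible(GLuint query)
{
    GLuint samples = 0;

    /* Test pass: no color or depth writes, just count surviving samples. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glBeginQuery(GL_SAMPLES_PASSED, query);
    draw_bounding_box();
    glEndQuery(GL_SAMPLES_PASSED);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);

    /* Waiting immediately stalls the pipeline; a real engine would poll
     * GL_QUERY_RESULT_AVAILABLE or reuse last frame's result instead. */
    glGetQueryObjectuiv(query, GL_QUERY_RESULT, &samples);
    if (samples > 0)
        draw_object();
}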
@Ranguvar13
@Ranguvar13 Месяц назад
Thank you so much for your work on Linux! ❤ These are fascinating, never heard of em. My first card was a BFG FX 5200 (NV34), and while it was underpowered extremely quickly, I had to buy one recently to put on a shelf just for the nostalgia. It’s an incredible contrast with one tiny fan compared to the modern triple slot+ behemoths.
@timhaas6021
@timhaas6021 2 месяца назад
I had a beast of a 5900 at one time. Overclocked like a true champ. Paid way too much for it, but that was just a matter of bad timing. Never ran Doom 3 with it, I ran Doom on a pair of 7800s in SLi.
@ammaina01
@ammaina01 3 месяца назад
The right way. Without hardware and software that we know, your knowledge is abstract to me. What did I understand in the end? The Weird would have been a great character in the movie.
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
That comment made me laugh so hard. Thank you! 😆😂🤣
@jameskolby
@jameskolby 3 месяца назад
Thanks for the wonderful video! As a computer engineering student, I find more technical explanations of relatively old hardware to be a fascinating look into how we got to where we are today.
@MrJorgalan
@MrJorgalan 3 месяца назад
Nowadays, any graphics chip, even from the 8 and 16-bit era, is considered a GPU, which to me is an aberration. They were not GPUs, nor did we use that term back then! The term GPU started to be used with the GeForce 256 in 1999 and, as you mentioned, it should be programmable, which, as far as I know, happened a few years later. But as I said, back then we used the term graphics chip or VDP ("Video Display Processor"), or even PPU ("Picture Processing Unit") for the SNES. Your videos are super interesting.
@eurocrusader1724
@eurocrusader1724 3 месяца назад
Yup, I remember the GeForce 256 SDR (which I bought at the time for my Athlon 500 rig) being addressed as a geometry processing unit (GPU). Before that time, my Voodoo 2 12MB was defined as a 3D add-in card, just like my ATi Rage Pro 4MB, which can also be defined as a display adapter. Although my S3 ViRGE 2MB before that was my real first 3D card. I know it's bad, but seeing Dark Forces 2 run in 320x200 w/ texture filtering hooked me for life..
@dyslectische
@dyslectische 3 месяца назад
GPUs came after T&L support. But really, a full GPU is DirectX 10, after the introduction of unified shaders.
@seebarry4068
@seebarry4068 3 месяца назад
Video card was the term I was using back then.
@MrJorgalan
@MrJorgalan 3 месяца назад
@@seebarry4068 Today, the term "graphic card" is also used, I believe. I think the term "GPU" is more oriented towards referring to the chip itself in a strict sense and, in general, the video card
@Sean-fj9pn
@Sean-fj9pn 3 месяца назад
I remember in the 90s the term graphics accelerator was used a lot.
@luheartswarm4573
@luheartswarm4573 Месяц назад
the video itself is awesome, great topic, super well written and spoken! but I can't help myself to not stare at the cool opengl shirt!
@SianaGearz
@SianaGearz 3 месяца назад
The term GPU was introduced on PC by Nvidia with Geforce 256, which introduced a vector processor for a fixed function vertex pipeline and register combiners for the pixel pipeline. Configurable, not programmable. Outside the PC the term had popped up previously with very fixed function devices. SONY called the video subsystem of the original 1994 Playstation the GPU, and i believe there may have been earlier examples. But i guess if we agree to call only programmable units GPUs i wouldn't mind :D Dreamcast not having a DVD drive is... i don't know this seems largely irrelevant? I mean who cares that Gamecube has a drive that has a DVD pickup, the discs have barely more capacity than Dreamcast's, and you can't fit a DVD in there. The rotary speed of the Dreamcast drive is also not crazy high, it's not like any of the 52x PC drives, it's just noisy because there's a massive huge gap between the lid and the drive recess, so there is a lot of air noise from air running through that gap. I'm going to bet the ARB fragment shader support is very incomplete and that it will actually reject valid programs that don't conform to hardware constraints. This is not something they could sanely do with assembly ARB fragment program.
@goldenheartOh
@goldenheartOh 2 месяца назад
So this is the part of YouTube where the PC engineers hang out. Cool. I was thinking in 5 years we'll have a video asking "can a $1,600 GPU from 2022 run a modern AAA game?"
@CallOFDutyMVP666
@CallOFDutyMVP666 3 месяца назад
Great vid. Boosting engagements
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
Every little bit helps. :)
@Tarukeys
@Tarukeys 3 месяца назад
The Radeon 9800 was a DX9 card and supported Shader Model 2. I had one and it was an incredible card for its time. I was running Far Cry maxed out (needed 1 GB of RAM or there were stutters, I tested it) with this beast when the game came out. Before the FX 5900, nVidia had no card worth a damn against it or the previous one, the legendary Radeon 9700 Pro. The FX 5800 and the lower cards were really bad. I built a Pentium 4C 2.4GHz with dual-channel RAM (first gen with it), 1 GB of Corsair TwinX and the Radeon 9800. It was one hell of a PC and served me well for years, which was unusual back then. PCs went obsolete so fast in that era. Question: Are you the guy who was making the custom Omega drivers for ATI cards?
@harrylarkins1310
@harrylarkins1310 3 месяца назад
Are you sure it needed 1gb? I played it with a 8800gtx (and a 4200ti and 5900gtx before that) with 768mb and I don't recall that, maybe I'm wrong, It was a long time ago.
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
I think Far Cry needs 1GB system RAM, not video memory. I'm not the person making the Omega drivers, and... I'm not sure what they are. :) Another thing to research!
@Arivia1
@Arivia1 3 месяца назад
I've barely started this video but I love how you immediately dive into the programming and feature set nitty-gritty of this? these? GPUs and other graphics technologies. It's a clear tier above most any other historical GPU content on youtube (no shame on PixelPipes I love you but this is a whole other level).
@TalesofWeirdStuff
@TalesofWeirdStuff 3 месяца назад
I think there's plenty of room for lots of different perspectives. I equally enjoy LGR videos and Modern Vintage Gamer videos, but they tend to be very different levels of technical depth. I've been doing videos for almost 4 years, and I'm still trying to find where "my place" is.
@davuvnik
@davuvnik Месяц назад
To me a turning point of graphics, in my head at least, was when I saw the game demo named something like "krugen" that was compressed to less than a megabyte or something like that, but it looked like 4GB of data at least....
@TalesofWeirdStuff
@TalesofWeirdStuff Месяц назад
Years ago when I taught graphics programming, I'd show the students Debris by Farbrausch... and then tell them the whole file was 177K. Minds successfully blown.
@danthompsett2894
@danthompsett2894 3 месяца назад
I feel sorry that I missed this card in 2003. Seeing a label with 512MB on it, 2 years ahead of its competitors, would have really opened my eyes. Then again, this card really isn't designed for gaming; it's designed for making games or 3D rendering for making films, etc. But even now I refuse to pay over £300-£400 for a graphics card. Although I am leaning towards the 4070 Ti, I'm still holding out on spending over £700 just on a graphics card as long as possible, especially since I'm still stuck on AM4 with DDR4, not AM5 with DDR5, which bothers me, since der8auer with his Thermal Grizzly line of products has made delidding and water cooling the AM5 CPUs as easy as pie with their delidder and frame to make it compatible with any water cooler block.
@badwolfsat5
@badwolfsat5 10 дней назад
Fire GL 1 price would likely be found under Diamond Multimedia. I found it showing $599 at launch. Diamond Fire GL 1000 (3Dlabs Permedia NT)
@OnTheRocks71
@OnTheRocks71 3 месяца назад
I remember reading PC Gamer, I think, when the first GeForce launched. I think that was the first time I had read the term "GPU" and being kinda like "huh? what?" It's so weird looking back now because I think we just called video cards that had 3D acceleration...3D cards?
@JustAGuy85
@JustAGuy85 3 месяца назад
I played Doom 3 on a 9800PRO. Also played it on a 6600GT. The 6600GT played it far better, but that's when Nvidia was still KILLING image quality through the drivers, too. (They also were doing the same on the previous FX5000 series, too. BLURRY image quality for higher performance) Then ATi came out with drivers that increased Anti-Aliasing performance by 40% which made the 9800PRO like a new card. But, sure, the 6600GT was faster. And I even got a 6800GS AGP later, and then an HD3850 AGP (the last AGP ever made). Waste of money, because my 3400+ (s754) wasn't powerful enough to really unleash it. But yeah, the 9800PRO played Doom 3 "okay".
@mwitters1
@mwitters1 3 месяца назад
I would really like a video on the geforce ti4600. That was my first really high end GPU.
@wtfskilz
@wtfskilz 2 месяца назад
I remember playing the Demo in 2004 on my brother's computer on medium with some mid grade ATI card. There was nothing else like it at the time.
@robwebnoid5763
@robwebnoid5763 2 месяца назад
I didn't have that expensive card, but I did become a fan of 3DLabs in the late 1990's when I bought the FireGL1000Pro which had the Permedia2 processor on it. Because it was more or less a budget CAD card & general graphics, it ran games of that time less than decent but still good enough, especially when my PC was really only a Pentium 233mmx or K6-II or III at 400 or more. It did get a little better when I updated to a Pentium3 1.0 ghz cpu. But the biggest change was when I went to GeForce video cards. I did read about the P10 cards back then but at those prices I wasn't about to need one, heh. 07/16/24
@Kingslay3r
@Kingslay3r 27 дней назад
back then 400-500 was the price of flagship gpus, crazy how times changed right?
@Garde538
@Garde538 Месяц назад
Really enjoyed this video ✌️
@iambarnabas_
@iambarnabas_ 2 месяца назад
Would someone finally create the perfect dad video? The perfect dad video:
@Baz87100
@Baz87100 2 месяца назад
It's kind of nuts that a GPU that was 2 years old at the time of DOOM 3's release struggles to run it. In modern times you could run a brand-new title pretty well with a 5-year-old, top-end-at-the-time graphics card, even without upscaling.
@TalesofWeirdStuff
@TalesofWeirdStuff 2 месяца назад
I think that is because of consoles. Game devs who want to make the maximum profit ship for PC and various consoles. That means the practical minimum system requirements is a Switch, and that's a pretty low bar. As late as 2012 I remember seeing talks at GDC where game devs said, basically, they couldn't use interesting PC graphics features because they had to ship on PS3 and Xbox 360.
@FortniteOG420
@FortniteOG420 2 месяца назад
Wow, this reminds me how I played DOOM 3 back in the day on my mom's Windows XP machine. It had an OEM GPU that came with it; I finished the game with an average FPS of about 12 at the lowest settings.
@ashgonza92
@ashgonza92 Месяц назад
You didn't even exit out the game and restart the program after changing the settings
@SamFlador
@SamFlador 2 месяца назад
Had an ATI AIW 9600 that I flashed to think it was a Radeon 9600 PRO, overclocked it, it was awesome
@rootbeer666
@rootbeer666 Месяц назад
I had 128MB 8500LE that I bought for $20 back when Doom 3 came out. It played way better on that, even though 8500 was removed from Doom 3 recommended configs by launch time (it was targeted earlier in development).
@Bob-v3d8t
@Bob-v3d8t 2 месяца назад
I had the ATI 9500 that could be softmodded to a 9700 in an athlon xp 1800 system and it ran doom 3 great.
@SJ-co6nk
@SJ-co6nk Месяц назад
Maybe it's just because I was there at the time, but to me a GPU is a graphics card with hardware T&L, since that's why nvidia coined the term. So anything before that isn't a GPU, anything after that is, but shader architectures definitely should be seen as equally revolutionary to just that. I mean, a 4004 is technically a CPU, but not much of one.
@UncuredRandomDiagnoses
@UncuredRandomDiagnoses 2 месяца назад
A 6800 could run Doom 3 Ultra @ 1024x768 at 60 fps, and it was only a few months old. Something like a mid-2002 card would be single-digit fps in the menu at those settings. The GeForce 6800 was a graphics card by NVIDIA, launched on April 14th, 2004. Doom 3 was released in the United States on August 3, 2004.
@daniel-san836
@daniel-san836 2 месяца назад
In 2004 I was playing UT2004, Battlefield 1942, HL2, Battlefield Vietnam, DOOM 3 and C&C Generals - Zero Hour on my Leadtek GeForce FX 5200 128MB, an E6600 Core 2 Duo Intel socket 775 CPU with 512MB of DDR RAM.
@TheVanillatech
@TheVanillatech 3 месяца назад
My friend bought a Radeon 8500LE. I was running a Geforce 2 Ultra back then, I advised my friend to get a Geforce 3 to get similar performance to me (he was struggling to play Morrowind on his aging Geforce 2 MX). His budget didn't quite stretch to the £180 Geforce 3, so we settled on the Radeon 8500LE which was only £110. After some overclocking and tweaking, his card was surprisingly good! In 32-bit colour it could rival my GF2 Ultra in some titles, such as Giants : Kabuto. And Morrowind was vastly improved for him, along with those water shaders. Definitely not a full 8500, but still ... best bang for buck at the time.
@slinkyfpv
@slinkyfpv 2 месяца назад
That was a great video! Thank you so much.
@iamatlantis1
@iamatlantis1 2 месяца назад
Was the 9800 the "All-In-Wonder" one? The intro here was pretty nerdy, subbed.
@renemolinaledezma2512
@renemolinaledezma2512 2 месяца назад
I remember when I played doom 3 for the first time on my old ati radeon x850 pro agp, accompanied by a modest sempron 3100 socket 754. Greetings and blessings
@bobbobson1605
@bobbobson1605 Месяц назад
The main processor being very programmable reminds me in some indirect way of IBM's weird M-ACPA sound cards that used a chunky Ti MSP430 DSP (the driver disks came with some example code for decoding JPEGs on the sound card...)
@Kbcqw
@Kbcqw Месяц назад
Very informative video 👍 thank you
@deagt3388
@deagt3388 29 дней назад
Used to have ATI Radeon HD 3870 AGP card if my memory serves me well allegedly the fastest AGP card, could run 'Crysis' in 1080p, test it! ;-)
@PoulWrist
@PoulWrist Месяц назад
I had a 9700 pro, it's what I played Doom 3 on.
@MaTtRoSiTy
@MaTtRoSiTy 3 месяца назад
I had a 5600XT when this game released and I rushed out to buy a copy of D3 only to get a rude shock when I tried to play it. Thankfully I later upgraded to a 6600GT then a cheap used 6800GT which both ran it perfectly fine
@milosstojanovic4623
@milosstojanovic4623 Месяц назад
In 2002-2003 I got an ATI Radeon 9500, so I gamed like crazy back then :D
@Kuli24000
@Kuli24000 2 месяца назад
Shoot 2002 sounds so old. Please don't recognize the gpu. Please don't recognize the gpu.... dang it the 9800? That's fresh. My buddy who had a 9800 pro was able to run antialiasing on bf1942 which was crazy for the time.
@TalesofWeirdStuff
@TalesofWeirdStuff 2 месяца назад
Battlefield 1942! I completely forgot about that game! As much as I played that game... I should have included it in the follow-up video. Maybe next time.
@SouthWestI10
@SouthWestI10 Месяц назад
I haven't seen an AGP expansion card in literally 20 years.
@dancar2537
@dancar2537 3 месяца назад
Wow, video cards are some kind of wild thing. Parallelism without limits. Had I been a European politician looking to start an industry in Europe, I would hire you, if you would like to come.
@enermaxstephens1051
@enermaxstephens1051 2 месяца назад
It would be really helpful if all the old GPU makers were still around.
@TalesofWeirdStuff
@TalesofWeirdStuff 2 месяца назад
How big of a case would be required to fit a Voodoo41? 😆
@fxgamer-11yt
@fxgamer-11yt 3 месяца назад
i would hope 3d labs is still up there still producing stuff
@OzzyFan80
@OzzyFan80 3 месяца назад
That was very interesting. I love tech and hardware, especially GPUs, but just computers in general. I also come from the same time period as you do. To see these $1,200 video cards that can't run a game like Doom 3, while we now have $500 plastic boxes that can run it at 60 to 120 fps, is amazing. I just bought a new rig with a 4070 Ti Super and the price of that GPU was hard to swallow, but it will last quite some time. Before the 4070, the last GPUs I bought were 2 EVGA superclocked editions or something, and they were 768MB cards I think, without looking up the specifics, but yes, EVGA is a company we need to come back. They made some of the most stable cards back then.
@vinizan
@vinizan 2 месяца назад
It will probably run Half-Life 2 very well
@majbt45
@majbt45 3 месяца назад
Really cool and in depth discussion on this sophisticated piece of graphics technology.
@UmVtCg
@UmVtCg 2 месяца назад
As Doom was released 2 years later, it better be
@dodgethis1986
@dodgethis1986 2 месяца назад
I had an Athlon 3200+ at 2.2GHz, a GeForce4 Ti 4400 with 256MB, and a whopping 512MB of DDR back in the day, and Doom 3 was running super smooth at 1280x1024. I hated the game but it ran well.
@MakeSh00t
@MakeSh00t 3 месяца назад
Today people complain that the 4090 is expensive, but 20 years ago pay was 4 times worse in my country. And yes, my pay is only 800 euros and I have a 4090.
@drumsmoker731
@drumsmoker731 3 месяца назад
Priorities 😁
@morkalan5226
@morkalan5226 3 месяца назад
800 per month or per year? Not trying to be condescending, but sadly there are many, many countries where 800 per year is more than double or triple (or even more than that) the salary per year.
@jamespowell2531
@jamespowell2531 2 месяца назад
Bro just came to the comments to brag about having a 4090 💀
@lostskull7467
@lostskull7467 2 месяца назад
I mean, what they argue is that they've jacked the prices up while having worse performance per dollar each gen. All I always hear is that the 10 series spoiled us.
@Fractal_blip
@Fractal_blip 2 месяца назад
Lol he said "and YES I DO have a 4090"​@@jamespowell2531
@i64fanatic
@i64fanatic 2 месяца назад
I had a cheap GeForce FX 5500 in my 1st computer (the family computer at the time was a Pentium II 350MHz with an ATI Rage II Turbo 8MB VRAM AGP), which I bought to run Doom 3 specifically. No problem, despite it being a bad card at the original asking price. Until Prey came out: same engine, id Tech 4, but more demanding. Then I switched to Radeon, but I can't recall the model haha.