@@situatedillness Nope, 1 GHz didn't arrive until March 2000. On November 20, 2000, Intel released the Willamette-based Pentium 4 clocked at 1.4 and 1.5 GHz, and in April 2001 a 1.7 GHz Pentium 4 was launched.
Lol, I remember watching this in 1989 and thinking about how far we had come. Now I'm watching this on YouTube on my cell phone, which makes these desktops look like calculators.
Hahaha, I do believe I am the first person to overclock a 286/386. I'll never forget when I was a young engineer working in the radar field, responsible for what was called the Built-In Test Equipment (BITE). I was able to get it running faster by combining two speeds together (5 MHz to 10 MHz), and the data stream was twice what it was. I figured my 386 could benefit from an increase in clock speed too, so I looked up the schematic of my motherboard and saw that the CPU was running at 16 MHz. I called a vendor to order a 24 MHz crystal (xtal), and the guy asked me why I wanted to buy it. He told me he only sells to OEMs. I told him I wanted to test the idea of overclocking the CPU. He said he didn't think that would work. I looked at the bus for the I/O and the CPU and didn't see why it wouldn't. Lo and behold, I got the xtal and installed it in my motherboard. It was just plug and play. I booted my system, brought up MS Flight Simulator, and wow, I saw much better runtime graphics performance (remember though, this is like 1989 or 1990). I called the guy and let him know it worked. He asked me to send him instructions on how it was done and what adjustments I made to the clocks. I told him I didn't add any wait states above 0; it was just remove and replace. I don't know how many years later, but I started seeing ads from vendors talking about overclocking. If I'm not the first, I am certainly in at the beginning of all of that craze. Fun stuff!!!!!
I used a Commodore 64 from 1986-1992 (1 MHz, 64 KB of RAM), and then in 1992 my father bought a 386 DX-40 with 4 MB of RAM... it felt like a supercomputer in my house... and I received an Amiga the same year.
I will always fondly remember this era in personal computing. So many small vendors adding their own special upgrade, their own flair on something new.
Well, megahertz weren't the only thing that gave performance back then. The speed of a CPU's system bus affected performance greatly, though many didn't know this at the time. A faster bus made a huge difference in computer speed, so among the early Pentiums, the ones with a 50 MHz bus were the worst, 60 MHz was pretty good, and 66 MHz was the best performing back then. True fact.
Still remember our first family PC: a Gateway 2000 with a 386DX-33 MHz, 4 MB RAM, and a 35 MB HDD. Got a sound card for Christmas to play Wing Commander 2 with the voice add-on pack.
"In 10 years from now, we expect to be shipping processors in the range of 200MHz" lol well the end of 1999 to early 2000 they got up to 1GHZ. I guess they beat their own estimates :)
Intel was a bit off, but they also believed that they would be putting out 10 GHz machines by 2010. The predictions and reactions are comical now, but the megahertz wars were never expected to end. web.archive.org/web/20200304044703/www.geek.com/chips/intel-predicts-10ghz-chips-by-2011-564808/
I worked at a place that built custom computers from 1997-1999, and I remember getting shipments of the 333MHz and 366MHz Pentium 2s, and the workers got excited about it. We may have got up to 400MHz when I was there - my memory is a bit fuzzy on that. We never got 1GHz when I was at that job. That’s not to say they didn’t exist for higher end computers.
OH MY GOD! We had ONE golf game on a floppy we played up until 2003. The whole family would play it. And this is it! That feeling of accidentally stumbling over "that" game!
33MHz was blazing fast around 1990. My 386 had a "turbo" button to slow it down to 8MHz for when running old applications that ran too fast to be usable
In '89 I was still pushing a couple of 8 MHz 68000 machines: the 1040STE, which was new at the time, and an Amiga 500 with a trapdoor RAM expansion. When I went to a 100 MHz Pentium in '96, I couldn't believe my eyes.
The first computer I ever used was my older brother's VIC-20. I was little, and he had some old educational games on it he'd boot up for me. Then the first computer that was specifically mine was a second-hand Amiga 1000 setup with expanded RAM, a discarded Apple color monitor that occasionally needed to be smacked on the side to make the color come in, and my old Panasonic boombox for the audio out. Damn good sound too, since it had a 5-band equalizer. I loved that thing. Ignorant young me never made a backup of the Amiga Kickstart diskette, so I couldn't use it anymore once it eventually went bad. Then my brother cannibalized it for my games and peripherals around 1994, and I didn't use a computer again til 1996, when I finally got a standard PC with Windows 95. A couple years ago I bought a near-mint Amiga 500 from a family friend and got back almost every game I used to have, and then some. That's just a part of my childhood I can't leave in the past. I love the Amiga.
I remember those days; there was no public internet. There were bulletin boards that you could access with a dial-up modem by dialing a specific phone number over a land line. You could find the phone numbers literally on physical bulletin boards on the walls of the places that sold you the computers. Some of the computer BBSes were hosted by distant places like universities, and some were just local users much like yourself. People tended to network locally to avoid long-distance charges, and to restrict modem usage to off hours so as not to tie up their phone lines, or they added a second line just for the modem. Computer enthusiasts used to buy computer mags not only to read the latest news and see the latest gear, but because they contained line after line of computer code (BASIC) that people could enter into their computer to make it perform some task that's mundane by today's standards. If you were lucky enough to be able to afford it, you could back your programs up on a 5 1/4" floppy drive or an audio cassette. If not, everything you entered disappeared the moment you turned your computer off; there were no internal hard drives. It was a big thing at the time when computer manufacturers decided to add floppy drives.
While Gary was irreplaceable on the show, Jan was great as well. She was very charismatic and personable, and didn't sound like she was just reading a script.
The problem is how can you take any advice from a guy that blew off IBM and the chance to make billions? While he was a smart guy for sure, shooting yourself in the foot makes you look really stupid.
@@AcydDrop IBM offered Kildall a one-time payment of $250,000. That would never have made Kildall billions. Gates got the same deal; he never got an IBM royalty deal either, and he did not become a billionaire off IBM royalties. It wasn't until COMPAQ came on the scene and Gates could sell DOS to the IBM clone market that he became a billionaire.
@@AcydDrop Jan was one of the managers at Xerox PARC who saw no market for their products and basically let the rest of the industry (Apple and Jobs in particular) walk out the front door with it. Gary wasn't the only one who fucked up.
@@medes5597 Funny how you're all ignoring the fact that they're showing an XT clone and a 386SX while babbling about voice recognition and AI... and the demo for them is a game without any 3D graphics in it...
4:09 "We still have a long way to go in the speeds of our microprocessors. Today, we're at 33 MHz is the maximum speed of a microprocessor that Intel manufactures. Ten years from now, we expect to be shipping microprocessors that operate at speeds up in the range of 200 MHz." This was 1990. Little did he know! Ten years later, Intel was selling Pentium III processors running nearly 1 GHz.
I totally loved the P3, especially the Tualatin core. The P4 was a big disappointment. My 1.4 GHz P3 Tualatin was faster than the 1.8 GHz P4 I replaced it with. :(
That must be the same Neil Rubenking that wrote the "Pianoman" music program, which could produce pseudo-polyphonic music from the IBM PC's internal speaker.
Now it's just waiting for any program that utilizes more cores. I have not run into anything yet that my old four-core i5 cannot handle with a GTX 580.
@@adityasumanth6122 Technically yes, because if you go back in time far enough, CPUs were at one time made rectangular. We could make them circular too if we wanted to, but it's up to the manufacturer to choose the shape, not you, unfortunately. Look up the 6502 for an example: it's made as both a square and a rectangular CPU depending on which model you order.
Boy, that brings back memories. My first computer was a 10 MHz 8088, bought from an ad in Computer Shopper, a rag that could have been used in place of concrete building blocks.
I had an 8 MHz Amstrad PC 1640 ECD with a 30 MB HD. A very high-spec 80s PC running MS-DOS 3.2 and GEM Desktop, which had no 3rd-party programs to support it. I did have 640K and an EGA screen, so some games looked amazing. There was a game, Blood Money by Psygnosis, that played amazingly well and looked amazing. Also old DOS games like Road Wars that were totally engaging, and ancient ones like SW.exe, Space Wars: a 2-person keyboard-based game that was endlessly entertaining, made in the 60s to run on mainframes and ported to the 8086.

This computer opened up a legacy of computing that my Spectrum was never able to. I learnt to program on the Spectrum, but I learned so much more from my lowly Amstrad PC 1640. It got me through my first year of university as well. Turbo C from Borland worked OK on it; it took a while to compile, but it worked. Honestly, I remember this computer very, very fondly. It never let me down. It was slow at times when I asked more of it than it was ever designed for, but it always came through.

Amstrad always gets a bad rep, but I'd argue the 1512 and the 1640 were like the Spectrum in terms of introducing the youth to contemporary computing. The 1640 in particular could be specced to a really decent standard with EGA graphics and a 30 MB HD. It had a mouse as standard too. I ended up fitting an 8087 to it as well. No idea why; didn't need it, just got one and felt like fitting it. Fractint was my favourite program outside of games. It made nothing of the '87 because it used ints only. I spent many an hour zooming in on Mandelbrots and reveling in the psychedelic results. Luv and Peace.
We need a similar TV show in today's age, but instead focusing on the fast-moving areas of embedded technology, microcontrollers, and single-board computers.
Can't have a TV show anymore. Look where you're voicing your concern: YouTube channels are the new era of shows. It's all about finding what you like. There has to be someone making videos about it.
4:14 That guy was spot on. In 1989 he said in a decade there would be 200 MHz processors; in 1997 I got my very first computer, a Gateway with 200 MHz MMX technology.
Not quite spot on. It actually came sooner than he thought, since this was filmed in 1989. The first 200 MHz processor was released in 1996, just 7 years after this was filmed.
Now the sad thing, is that it's hard to find something that could still *read* those floppies, let alone copy them. Anyone that did copy them though, still has the software. The legit users got screwed.
Having been born in '95, I missed a lot of the earlier stuff like this, but it's amazing to see CPUs that are so low-power they don't need any cooling other than passive airflow.
You can still do that today, but... what's the point? I mean sure, phones are passively cooled, as are some laptops. But why passively cool a desktop CPU? There are very few reasons to.
@@t1e6x12 Yeah, it's like how having a very loud turbo on your car & revving the engine down a quiet residential neighbourhood is a guaranteed muff-magnet, but it's nothing compared to the panty-dropping power from the sound of an aftermarket cooler on your overclocked dual-core Pentium. 😉
I like the way The Computer Chronicles shows me all the computers I missed before, as a result of missing lots of job opportunities. To all those who chose not to employ me: I'm doing fine now, because I am typing on a Windows computer that has an Intel Core processor, with the latest business productivity programs, a color laser printer, and a professional label tape printer. Now you wish you could hire me, but I am doing fine.
In the 80's, computers were the realm of the corporate world. With the mass adoption of the internet and multimedia by the mid 90's, computers lost their business suit and tie.
Wow, in 88/89 I remember only being able to afford a turbo XT machine. A 386 was something very special; very envious of someone who had one. Often parties were held when someone was unboxing one, lol. They were wicked fast though. I can't recall so massive a visual speed difference from just one jump in chip generation (the 286 didn't count since it sucked). The 386-to-486 jump wasn't as visually impressive; even though I know the 486 had a lot of processing power, software demands increased with it, so it pretty much leveled out.
Way back in 1997 I had a PII running at 333 MHz on a 0.35 μm technology node, with 32 MB of RAM. Today, we have multi-core supercomputers in our pockets that can render 4K HDR video on a 5 nm node.
My first computer was also in 1997: a 200 MHz Sony VAIO, also with 32 megs of RAM and a whopping 3.4 gigabyte HDD. How will I ever use all that power and storage space, I wondered as a 17-year-old. Oh, how the times change...
The trick IBM played on him was disgraceful. But they got their comeuppance. They totally lost control of the Market, and at least Gary lived long enough to see it happen.
I remember this episode airing and thinking 33MHz was blazing, how much more were we going to really need. 80386DX-33 was also pretty wicked fast with DOS type programs. You could fly through those text prompt type menus.
Then Duke Nukem happened, and it wasn't fun... I am daily-driving a 486SX-33, and so many games choke... Still, I can see how this was an improvement over an ST.
Got my first computer in 1991 at age 12. It was a 286 running at 16 MHz, but 20 MHz with the turbo button pressed. 3.5" and 5.25" floppy drives, and a 40 MB HDD. Just thinking of that machine gives me joy. I built many upgrades since then, but sadly my mother donated that first one 10 years ago, and I cried when I tried to look for it and found out :( She said there was a lot of "crap" in the attic that needed cleaning. 😮
A 20 MHz 286 was rockin'. Most people only had 12 or 16 MHz. Plus, the 286 was a huge clock-for-clock speed increase over the 8086. And contrary to the myth this video started, a 386SX was slower than a 286; the reason it goes faster in the video is that they're adding a 387SX FPU along with the 386SX chip. So kudos to you: your 20 MHz 286 in 1991 was still a decent machine!
Computers were very expensive in the late 1980s. A 386SX-16 MHz computer with a monitor, keyboard, 1 MB of RAM, a 40 MB HDD, and a dot matrix printer cost about $2400. A Microsoft mouse cost me about $100 in 1990!
The only thing lacking in the Amiga 500 was the 8-bit sound. Other than that, it had great graphics, though the animation was choppy in flight simulator games.
Maria Gabriel always says her name funny... she pauses in the middle of her last name every... single... time... It's like, you have to wonder how that nuance happened, as it sounds like she just stops wanting to talk for an instant, then finishes saying her name. I know it sounds silly to bring up, but really, once you notice it, you can't un-notice it. :\
thomasg86 Well, pronunciation is cultural and subjective. I've heard quite a few varied pronunciations of my real last name, some that make sense and some that are totally fubar. Who knows.
If I had a time machine, one of the things I would do is go back to when this video was made and walk in with an iPad Pro to see everyone's response.
Only almost a hundred times faster? Seriously? You must have a very, very weak computer. 486 processors were about 10 megaflops (able to run Doom), while in 2010 there were already 100-gigaflop CPUs (the i7-980, for example). Current consumer desktop CPUs are up to 800 gigaflops, which is 80,000x faster than a 486. The PS5 has a GPU a million times faster than a 486.
1992: a 286 Tandy 1000 RLX with a 20 MB hard disk drive, 3.5" floppy drive, monitor, keyboard, mouse, and printer, plus a three-year warranty, which came in handy since the hard drive was faulty. It was replaced and worked great. I liked DOS, Windows 3.1, and DeskMate (an early, fun, easy-to-use graphical user interface)!
CPU speeds reached their apex in 2008 and 2009. My i7-960, a 2009 CPU, is still being used by me; it is only slightly slower than the i5-3570K I got in late 2013. You can do all your office work with that i7-960. I have an AMD Phenom II 945 which I got in 2009, and it is being used in my office. It is still a pretty fast computer for office use, more than adequate.

From 1987 to 1996 there were big leaps in CPU technology. We had the 286 at 8 MHz and 12 MHz, the 386SX-16, 386DX-33, 386SX-20, 386SX-25, 386SX-33, 486DX-25, 486DX-33, 486DX2-66, 486DX4-75, 486DX4-100, and Pentiums at 60, 75, 90, 100, 120, 133, MMX 166, and 200 MHz.

From 2001 to 2014, there hasn't been a comparable leap. In 2001, I had a Pentium 4 at 1800 MHz, and in 2014 I have an i7-4790 at 3600 MHz. In a 13-year period, the core clock only doubled; instead, the manufacturers increased the number of cores to 4. The performance difference between an i7-4790 and a 1800 MHz Pentium 4 is about 6 to 8 times, whereas the improvement from 1987 to 1996 was more like 16 to 20 times. CPU technology reached its zenith with transistor technology: the more transistors they add, the bigger the CPU gets, and the hotter it gets. That is the problem engineers are facing now.
I agree with what you're saying; however, at what point do we hit the cap? I mean, I'm sure there is a point where faster processing is only needed by the most hoggish software. Software, if anything, should be able to do more with less processing power, unless they're purposefully making things hoggish. In the next 5-10 years that cap should be met for 99% of users, unless you're the NSA having to extrapolate keywords from all the surveillance they're collecting. But for home users and even gamers there is a plateau. Same for monitors and GPUs: there's a point where the eye can't process any more data, in spite of our being able to produce more than the eye can handle.
Christopher Snow I think we already hit the cap in silicon-based transistor technology; however, RISC architecture could be utilized to improve performance with CPUs that require fewer transistors. A good example would be the ARM processors in smartphones: they use RISC designs to execute more than one instruction per cycle. As a result, you have a CPU that has 1/10 to 1/20 the number of transistors of a CISC processor. It would work for software that doesn't require real-number calculations, i.e. numbers with decimal points. A RISC processor would not work for CAD programs, flight simulators, or any kind of application that requires floating-point calculations. ARM and RISC processors are limited in what they can do; this is why we need desktops with powerful processors and GPUs.
Ace1000ks1975 Oh, I'm sure in that regard, but the real question is: for the average consumer, even a gamer, how much more processing power is actually needed? We can go from 4K to 8K monitors, but at what point does the eye not notice the difference between the two? Likewise in gaming, how much CPU/GPU processing gets used is capped. I think we're getting to the endgame on all "common" user needs. The only people who will need faster and faster processing are data centers, major corporations, and military cybersecurity applications. I know it's been said before, but this time I really do think we're hitting a hard cap. If anything, networking, wireless and otherwise, is still in need of speed; the US seriously lags the world when it comes to internet speeds and thus home networking.
Christopher Snow We need to transition from transistor-based electronics to optical electronics to break the bottleneck. Light moves much faster than electrons. It would be like transitioning from piston engines to jet engines: the fastest piston engines could propel an aircraft to 450 to 500 mph, while a jet engine can propel an aircraft at anywhere from 1000 mph to 1900 mph. For transistor-based electronics, they will have to come up with better cooling systems, like liquid cooling, to clock CPUs and GPUs at higher frequencies without burning them out. For those who can afford it, they could make motherboards with multiple CPUs; that would address the speed problem. A board like that with 3 to 4 CPUs would be out of the reach of most users, though; it would cost at least $6000 or more.
Ace1000ks1975 What I'm saying is, games are right on the verge of having ALL they will ever need in the way of processing power for both GPUs and CPUs. The only thing that will extend needs beyond the point we're already at is virtual reality simulators.
1:50 That woman is wrong! If the 8088 had been made to run in VGA mode, it would in fact have been faster. CGA graphics are pixel-packed, so there is a lot of bit-shifting, ANDing, ORing, etc. to actually put CGA graphics into video memory, whereas VGA in 320x200x8 mode is a single byte per pixel. No bit-fiddling required.
Yes, to fill the whole screen from a background buffer in VGA mode 13h you can simply go:

    mov ax, 0a000h        ; VGA framebuffer segment
    mov es, ax
    mov ax, [buffer segment]
    mov ds, ax
    mov si, [buffer offset]
    xor di, di
    mov cx, 32000
    rep movsw             ; 32000 words = 64000 bytes = full 320x200 screen

In CGA you need to take each pixel, shift it around, and OR it to the screen if you don't have a backbuffer set up, which makes blitting to the backbuffer a pain in the you-know-what. I did program a lot of graphics stuff back in the day, and mode 13h was the easiest of them all to work with, also the most performant, because the loop and the extra instructions needed to write to a CGA-compliant backbuffer had to be fetched byte by byte on an 8088 (or word by word on an 8086). That more than made up for the 4x larger screen memory to be transferred. CGA was really one of the worst things ever invented.
@referral madness x86 assembler. I think the example would work on even an 8088. However, it requires a VGA or an MCGA card in graphics mode 13h (320x200 @ 8-bit color depth).
@referral madness So many... I started out in Pascal in the late 80s, later x86 assembler and C/C++, then C#/F# and Haskell; now I'm into Python and Kotlin a lot.
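To make the pixel-packing contrast above concrete, here is a rough C sketch of the two addressing schemes. A minimal illustration, assuming a 16-bit DOS compiler such as Turbo C (the far keyword and dos.h are compiler-specific, not standard C); it is a sketch of the idea, not tested period code.

    #include <dos.h>   /* int86, union REGS (Turbo C / Borland) */

    /* Mode 13h (VGA 320x200, 256 colors): one byte per pixel, linear at A000:0000. */
    unsigned char far *vga = (unsigned char far *)0xA0000000L;

    void put_pixel_vga(int x, int y, unsigned char color)
    {
        vga[y * 320 + x] = color;            /* a single write, no bit-fiddling */
    }

    /* CGA 320x200, 4 colors: four 2-bit pixels per byte, 80 bytes per row,
       even scanlines at B800:0000 and odd scanlines at B800:2000. */
    unsigned char far *cga = (unsigned char far *)0xB8000000L;

    void put_pixel_cga(int x, int y, unsigned char color)
    {
        unsigned int off = (y >> 1) * 80 + (x >> 2) + ((y & 1) ? 0x2000 : 0);
        int shift = 6 - 2 * (x & 3);         /* leftmost pixel lives in the high bits */
        unsigned char mask = (unsigned char)(0x03 << shift);
        /* read-modify-write: clear the old 2-bit pixel, OR in the new one */
        cga[off] = (cga[off] & ~mask) | (unsigned char)((color & 0x03) << shift);
    }

    void set_mode(int mode)                  /* BIOS int 10h, AH=00h: set video mode */
    {
        union REGS r;
        r.x.ax = mode;
        int86(0x10, &r, &r);
    }

    int main(void)
    {
        int x;
        set_mode(0x13);                      /* VGA 320x200x256 */
        for (x = 0; x < 320; x++)
            put_pixel_vga(x, 100, 4);        /* one MOV per pixel */
        return 0;
    }

Four pixels per byte is why every CGA write needs the shift/AND/OR dance described above, while the mode 13h path is a single MOV.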
When I switched over from my Atari 800 with DOS 2.0s to my Pentium MMX 233 MHz with Windows 95 I noticed that there was a vast speed difference. Of course it was so obvious that there would be a speed difference between the two computers.
Consider that it was Intel talking. The real innovators in the CPU world through the 90s to mid 2000s were AMD; clock for clock AMD won hands down, and on price too. Intel just had, and still has, deeper pockets to grease the PC makers into using their chips.
TheRealWitblitz O GOD, don't get me started on VIA; the only thing they have ever gotten right has been onboard audio. I actually owned one of the VIA PC1 C7 boards, and I was always having trouble with it either being slow as dog shit or being driver hell. Same with the Everex laptops that had the Mobile C7 chipsets: I bought one for my niece and one for my g/f at the time, as I was able to get them for slightly over $500 from Walmart, and just about every week I was having to do tech support on those damn things, usually due to something VIA screwed up with a driver update through Windows. VIA also has about the worst Linux support I have ever seen from a company. I'm amazed they are still in business.
Commodorefan64 Many of the big innovations back in the 1990s to mid 2000s, such as copper interconnects and high-speed caches, were made by RISC CPU vendors such as IBM, DEC, and SGI.
The 386 was a potential step forward, but as a computing platform at the time it lagged sorely behind the 286, MHz for MHz, because of CISC and how the 32-bit instruction set was implemented. Yes, 32-bit was going to be the way of the future, but at that time the mainstream was solidly 16-bit, which meant that the 386 was just a waste of money.
@@ryanyoder7573 I was a techie at that time (my first ever overclock was an Intel 8088 from 4.77 MHz to the dizzying heights of 6 MHz). I switched to a 286 and over the years replaced the version I had until I eventually ended up with a Harris 25 MHz 286. I kept that all the way through the 386 generation, because for my main use case (for instance WordPerfect 5.1 for text processing) the 286 was BETTER than the AMD 386 computers I built for others (even though I told them they would be better off with a cheaper 286). It wasn't until the 486 that 32-bit started to become mainstream, and I built a 486 DX50 system for myself. The problem is that Intel put the 32-bit CISC instruction set ahead of the 8- and 16-bit instructions; thus, because software was overwhelmingly 8- and 16-bit, running those apps on a 386 meant every instruction had to run the 32-bit gauntlet before reaching the instructions the CPU could execute. Windows for Workgroups 3.11 could run a 32-bit version, but that was buggy as hell compared to the 16-bit one.
I'm not an architecture expert, but I suspect the cost/performance difference had more to do with bus-width differences and the capabilities of the more advanced 386 requiring more die space compared to the mature 286 design. Buses were interesting back in that era: some systems used 286s on the XT platform, which only had the 8-bit ISA bus instead of the 16-bit ISA the 286 was designed for. These machines even needed custom IDE implementations, since that standard was never meant to be used without a 16-bit bus.
I remember the checkerboard pattern as I loaded a 0.8 MB photo on a floppy into a 486 system in 1992. You could (and I did) go outside to smoke a cigarette. By the time you came back, it might be finished.
33 MHz, 256-color VGA graphics, and a 16-bit sound card, and you had an amazing game machine circa 1990-ish. Back in the late 80s, 16-color EGA was good, an 8-bit sound card was like a luxury car, and 16 MHz was "fast".
@KoivuTheHab Lol, yep, that was like mine... every time I pressed the turbo button it would go to 33 MHz. Thought it was stupid and didn't think about it much until I was an adult.
Lol, some people want to travel back in time to see dinosaurs, assassinate dictators, or meet their heroes, I just want to present a 2020 laptop and smartphone on this show and bring some poor Tandy marketing guy to tears
That discussion of doctors relying on fast computers to save lives hit home... Imagine how many lives have been saved by advanced imaging and surgical robots, all of which required silicon with orders of magnitude more whoopass.
Ahhh, the times when even the next Office version could slow a PC down, forcing users to buy a new one every two years or so, on a Moore's law cycle. Nah, I don't miss it.
+SomeGuyInSandy If by exciting you mean speed increases: nope, not even close. For the duration of the 90s the clock speed increased by about 50% per year, for an entire decade, and that's where nearly all of the performance came from. Since 2005, when Dennard scaling died, we get 10% per year performance increases and an occasional increase in the number of CPU cores.
That is not exactly what I meant. My comment was more general in nature, taking things like mobile computing (tablets, phones, etc.) into account. In addition, the plethora of applications and the maturing of the Internet make up the "computing" I was referring to =)
+SomeGuyInSandy Ah. From your perspective, I believe massive, predictable performance increases as in the 90s would be boring. It is with increasing spectacle that Intel is flailing about (e.g. the Intel Curie and Compute Stick), trying to find strange new uses and markets for their chips, because they can't deliver the increased desktop performance that drives _same, but faster_ desktop sales.
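As a quick sanity check on the "50% per year" figure mentioned two comments up, here's a small C sketch; the endpoints (33 MHz in 1990, roughly 1 GHz in 2000) are taken from the video and this thread, and nothing else is assumed.

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* Implied annual growth from 33 MHz (1990) to ~1000 MHz (2000). */
        double implied = pow(1000.0 / 33.0, 1.0 / 10.0);  /* ~1.41 */

        /* What a flat 50%/year would have produced from the same start. */
        double fifty = 33.0 * pow(1.5, 10.0);             /* ~1903 MHz */

        printf("implied growth: %.0f%% per year\n", (implied - 1.0) * 100.0);
        printf("50%%/yr from 33 MHz for 10 years: %.0f MHz\n", fifty);
        return 0;
    }

So the actual 90s clock ramp works out to roughly 40% per year; a literal 50% compounded over the decade would have landed near 1.9 GHz by 2000.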
Curiosity and interest, sure. But without the manufacturing, materials, physics, and other advancements to go with it, you may as well hand them schematics for a working tricorder, for all the practical good it would do them. Even reverse-engineering many of the constructs they would see would likely take them nearly as long as developing them did in this timeline. Or maybe there was a Henry Starling type who stole some 29th-century technology back in the 60s and created a temporal causality loop...
Monkey Robots Inc. Totally. Most people in these positions today are salespeople who read a few lines off Wikipedia and do their best to convince everyone they are experts. You can just tell by the way these folks are talking that they are knowledgeable and passionate about the subject.
+jaymorpheus11 Apple computers at the end of the 90s used a different architecture than Windows PCs; a 350 MHz iMac could beat a Pentium II of twice the clock speed.
I wonder what they would say about our modern desktops, with our fancy 16 cores and 3 GHz+ speeds, insanely fast NVMe M.2 storage, and the near-elimination of CD and DVD drives in favor of flash drives. To say nothing of the 16 GB of RAM standard in the bulk of modern Windows 10/11 machines.
It really is fascinating looking back at how much the culture of home computing has changed. Computing ubiquity now means complete idiots use powerful machines with little idea of their inner workings, and they look back at this period as if everyone was dumb and would be awestruck by modern tools. I think they would be struck more by the culture. Would you rather be playing games 24/7, or be the guy reviewing the sales figures of the games? Think about it.
Today, to show the difference between 5 years of CPU development, you would have to show a benchmark chart: "Here you can clearly see the new CPU takes you from 110 to 130 FPS!" If they put 2 PCs next to each other like that, no one would be able to tell the difference.
Wonderpierrot Those processors couldn't be overclocked. The first Pentium could, but only through hardware, not the BIOS. What you do is use duct tape to isolate the first and third pins on the processor, but it only gets you to the next speed grade; for instance, a Pentium 100 could go to 120, or a Pentium 120 to 133 MHz, etc.
@@semiborba7047 And depending on how far back you go in time, you wouldn't want to overclock. Earlier OSes and software adjusted their timing based on the processor clock speed. You could potentially go higher, but it might break the software even if the hardware could handle it.
@@fcex558 Even then, it could only do so much. Even when you lowered the clock speed and bus speed and disabled caches, the system might still be too fast. I remember on the 386 I used as a kid, even some really old games were still too fast with the turbo switch in "slow mode".
You clearly didn't have any experience with this era of computing. Now you can overclock an unlocked chip by clicking around in your BIOS and having a whole pile of settings auto-adjust based on what you selected: things like bus speeds, timings, and multipliers. None of this existed back then, and especially without multipliers, it was an extremely difficult thing to do manually.
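On the timing point a couple of replies up: many DOS-era programs paced themselves with a raw busy-wait calibrated on one machine. Here is a minimal C sketch of the pattern; the loop count is hypothetical, standing in for whatever the developer hand-tuned on their own box.

    #include <stdio.h>

    #define LOOPS_PER_FRAME 50000L   /* hand-tuned on, say, a 4.77 MHz 8088 */

    static void frame_delay(void)
    {
        volatile long i;             /* volatile stops the compiler from
                                        optimizing the empty loop away */
        for (i = 0; i < LOOPS_PER_FRAME; i++)
            ;                        /* burn cycles; wall time shrinks as MHz grows */
    }

    int main(void)
    {
        frame_delay();
        puts("frame done");
        return 0;
    }

Double the clock and every "delay" halves, which is exactly why a turbo button, or an overclock, could make an old game unplayably fast.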
I ran SETI@home from time to time over the last 20 years (it ended recently). A Pentium 200 in 2000 would finish a workunit per day on average, and my recent i5 @ 4.7 GHz was capable of crunching a couple hundred units per day.
I am on YouTube on a PC with an AMD Ryzen 7 5700X running at 3700 MHz. My Amiga A1200 has a 68060 at 50 MHz. Things have come a long way. My first PC was a 200 MHz MMX Socket 7 CPU.