My favorite part of the story comes later, when AMD created the Opteron and the AMD64 instruction set. It was just nice to see Intel using AMD's new instructions for a change.
It's my favourite part too, and it actually brings us to the end of the issue, because Intel would sign more deals with AMD to be able to use that instruction set, which by now is called x86-64; in a sense it's a product of both Intel and AMD. However, Intel introduced a trap: if AMD was bought out, the buyer would not be able to keep the license, a major reason AMD was not bought out during the early 2010s. The engineer behind AMD64 was also the man behind the Zen core (he left AMD after its launch), and last I heard he was working for Intel.
The particularly poetic follow-up to this story, the AMD64 extensions that Intel had to go to AMD for, hat in hand, after its own Itanium designs failed to achieve broad market success, is also an interesting chapter, and I hope you cover it too! These two companies have a complicated and deeply intertwined history, and it's in large part because of this that we still have meaningful multiple-vendor options for x86-compatible processors these days.
Actually, Microsoft forced that. Even besides Itanium, Intel was ready to introduce an incompatible 64-bit upgrade to x86, and Microsoft didn't want to have to support two different instruction sets, so they forced the two to come to terms. It's a classic story that I assume will be covered here in part deux.
It's a miracle that AMD is still standing and didn't go bankrupt after Bulldozer's massive failure, and in spite of all the anti-competitive practices Intel engaged in over the past few decades.
@@mozzinator Bulldozer is really far from the history covered in the video. They were much closer to dying in the period it covers than at any time during the Bulldozer era.
And now, looking at the 6000-series GPUs and 5000-series CPUs, AMD is competing quite well, with 90% of the performance at 70% of the power. AMD doesn't have the same edge their competitors have, but they compete well with decent margins all the same. I am immediately drawn to the underdog story. Thank goodness we have AMD to push other companies to innovate.
As we get on with the decade in which Intel struggles to recover its mojo -- and AMD basks in the sunlight it pined for, for the past generation... As always, well done, Jon.
Stick a fork in Intel. It's DONE. And rightly so. Sure, they'll reorganize, probably spin off their fabs to a new corp, and probably supply some x86 for a few years, but with RISC-V gaining momentum at just the right time, I am guessing this arch is going to be the new hotness in a few years. Who knows who it will be
@@kayakMike1000 I've heard that the biggest ball-and-chain around innovation in general-purpose PCs atm is the x86 arch - I'd love to see a new arch, be it RISC-based or otherwise. Apple made the switch from PPC to x86 to ARM, so it's definitely doable. They have a unified software-hardware ecosystem that makes the transition a lot easier, though. It's been tried with IA64 and that was a massive flop. I don't know the details of why it didn't get adopted, though.
My first PC was a 5-year-old Pentium, and that's when I discovered the K6, Athlon, and Duron. AMD was always a rebel, a skunkworks, but it wouldn't have been such without Intel. Very interesting and nostalgic video, reminding me of old computer magazine articles from childhood.
@@mr.ssergeev No, I was working in computers both hardware and software in the '80s. I read about things in the EE Times and saw the evolution of the chips over the decades that followed.
The AMD 386 DX 40 was an awesome awesome awesome CPU. It ran Linux like crazy. The AMD AM486 line was legendary back in the day. The AM5x86-133 was by far the best '486 ever made.
@@michaeldale837 Yeah, it was an AMD 386 DX 40 with a Weitek '387, 8 MB RAM and 128k of cache. I ran Slackware on it. It was a slick workstation. I used it for years.
Back in the day, central processors were relatively easy to R&D and manufacture, so the market was NOT an oligopoly of just Intel and AMD. In addition to the mentioned Zilog, Motorola and NEC, there were also IBM, DEC, HP, MIPS, SGI, Sun, Hitachi and many more. I especially note Cyrix's x86-compatible processors, which were not half as fast at the same frequency but also required less power and emitted less heat, so they could get by without a cooling fan when all the other x86s already needed one.
9:50 - The 386 is not just a 32-bit address bus; it is 32-bit internally as well, from the data bus to the registers to the addressing. The x86 pipelining may make that a bit opaque. A true 32-bit processor, which scared IBM, hence it lost its PC lead. You want to know the real reason for the Pentium name? When they added 100 to 486, they did not get 586. The Pentium's built-in math coprocessor had an obscure math bug.
You're right. 32-bit addressing & data processing was the primary selling point, not the 32-bit bus, as shown by the 386SX chips, which had the same internals but a cut-down 16-bit bus. Because I/O is expensive! And nice jab at the FDIV bug.
Thanks for the reminder about the FDIV bug. 1993 was pretty exciting in the PC space, with the Pentium shattering everything and AMD and Cyrix making a lot of noise about their upcoming K5 and M1 CPUs.
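For anyone curious about what the FDIV bug actually looked like: the flaw was in the lookup table of the Pentium's floating-point divider, and certain operand pairs returned slightly wrong quotients. A quick sketch of the most widely circulated trigger case (the buggy value below is as reported in contemporary accounts, not something a modern machine will reproduce):

```python
# The most famous FDIV trigger pair: 4195835 / 3145727.
# A correct FPU gives ~1.333820449; flawed early Pentiums
# reportedly returned ~1.333739068 instead.
a, b = 4195835.0, 3145727.0

quotient = a / b
print(quotient)  # ~1.333820449 on any correct divider

# The classic self-test from the era: x - (x/y)*y should be
# (essentially) zero. On a flawed Pentium it famously came
# out around 256 for this operand pair.
residue = a - (a / b) * b
print(abs(residue) < 1e-6)  # True on a correct FPU
```

Running this today just demonstrates the correct results, of course; the point is how cheap the check was, which is why the bug spread so quickly once Thomas Nicely publicized it.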
That was an interesting side note about Mostek. I was firmly in the Motorola camp from the time the Atari ST came out (and Amiga, I guess) powered by the MC68000, whose assembly language and processor architecture I found far more comprehensible than the Intel 386. I stayed in that camp until the NeXT computers were discontinued, by which time Intel/AMD had finally come around to a sane form of memory addressing.
I think there would be fewer x86 computers and more ARM ones. Less competition within x86, but way more variety overall. Maybe Amiga, RISC OS and PowerPC would be better known.
@@H0mework Let's be real, ARM doesn't scale that well to higher-power designs, and a RISC-based arch has the downside of being very dependent on compiler optimizations, since it lacks the micro-op translation layer of modern CISC-based ones like x86. Also, the vast x86 software library makes ARM quite uninteresting for developers outside embedded products running Android. You have to be the Raspberry Pi Foundation or Apple to have the leverage to make it attractive to consumers. As for the names you dropped: Amiga hasn't developed anything serious over the last 25 years; RISC OS has always been a niche compared to cross-platform OSes like Linux or BSD; and PowerPC is now the excellent OpenPOWER, but IBM made it for niche server applications, and despite its openness, the high costs and the fact that no manufacturer dared to make its own POWER CPUs just let the arch stagnate, and now x86 is catching up on the perf level for a fraction of the price and wattage.
@@PainterVierax Psion and Sybian were around too, and Windows XP 64-bit was rarely used even though it was "better". People usually gravitate towards the software they need, not operating systems. You are assuming that MS would have had a hegemony with few other OS developers, but there are many free Linux and BSD ports, so I'm certain we'd have seen more diversity. I remember my N810 fondly, until MS destroyed Nokia before buying it. Some people even still use the N900 to this day; a niche like that still attracted developers beyond its normal lifespan. My N810 had no shortage of software either.
@@H0mework Psion, SyMbian (with an M, it's not a sextoy), Sailfish or Tizen were and are anecdotal in terms of market share. They only evolved in embedded, often with little to no updates or even internet connectivity. Sure, people are more app-driven than OS-driven, but you forget that one of the main challenges of exotic architectures and OSes is software support. A ton of peripherals, SBCs and even microcontrollers are simply unusable nowadays because of the lack of software support, so having less variety means improved support from manufacturers as well as larger communities to rely on. I believed in ARM on the desktop ten years ago; then the reality of the myriad device-tree branches and the lack of GPU driver support from ARM itself (!!!) just made me go back to x86, even though nowadays UEFI and reverse-engineering projects allow a slightly better user experience when traveling outside the realm of Raspberry Pi and Apple products. For embedded (as well as datacenters), the royalty-free RISC-V has already eaten a considerable amount of ARM's market share in quite a short time, but again, long-term support will be hard on such a vast ecosystem.
21:31 A mention of the Pentium FDIV bug, and how it put Intel in a vulnerable position for some time. The F00F bug got almost no fame, though. It's been forgotten now, but at the time Intel was the laughing stock of IT because of it. It would be interesting to know how that bug was able to reach production products. Best wishes.
@@aliabdallah102 Don't confuse Intel having a bad season, or a couple of bad seasons, with them having no resources to produce good new products and make a comeback. They still outsell AMD many times over. Best wishes.
Thank you for this great history lesson on Intel & AMD. I always wondered why they stopped using the numbering system and changed to the Pentium brand name.
Competition is the only thing that keeps anyone honest and hungry. Self improvement is merely iterative and incremental because you don't know what you don't know until someone else does it.
my 1st pc was AMD 486 DX4 100mhz... followed by K6 200, K6-2 500, athlon xp 2500, athlon64 3000 - 3200, A8 5600k, A10 7870K, ryzen 3100 & now ryzen 5500... yeah I'm a fanboy 😂
Cyrix was also producing x86-compatible CPUs in that timeframe. Did they have an agreement with Intel? Or were they small enough to pass under the legal radar?
Cyrix did not have a licensing agreement with Intel and as a result had an even more contentious legal battle with Intel than AMD had (and which Cyrix ultimately won). Cyrix did not produce 386 clones though, their first CPU clone was the 486 (although in a 386 pin socket package). The Cyrix-Intel fight would make for an interesting Asianometry episode as well.
I think Cyrix used legal workarounds in order to sell x86 CPUs. Back then, companies like IBM and Texas Instruments had an x86 license. Cyrix had these two companies (along with STMicro, which also had a license) fab their chips for them. As part of the deal, these companies could sell Cyrix's designs under their own names, such as the Texas Instruments Ti486, which was a Cyrix Cx486.
@@MaxPower-11 Cyrix's first x86 product was, IIRC, the FasMath math co-processor for Intel 386 systems. Then they had the Cx486DLC/SLC, which were kinda like a cross between a 386 and a 486 and plugged into 386 motherboards. They later produced 'real' 486 CPUs, the Cx486DX, which was their own design; it benched a little slower than competing Intel and AMD products but also ran cooler.
While microcode can be updated/configured by firmware in some instances, it is still microcode and quite different from firmware in its function in a processor architecture. Microcode determines the multiple steps taken by the different hardware elements in a processor during the execution of each processor instruction. This is a fascinating period in computing history. My first PC (in 1989) used an 80286 made by Harris Semiconductor, another that was locked out of this market by the introduction of the 80386 design by Intel.
Ever since Ryzen came out, I've been a big fan of AMD. Just having competition back in the marketplace got Intel to straighten up and actually come out with decent products, plus knowing that you can get 95% of the performance at 2/3 the cost of an Intel chip is just good business.
17:48 I remember working on a similar-looking mobo decades ago. Pentium II processors (Slot 1) came in a kind of mini cartridge. ISA, PCI and AGP slots, wow! Miss those days.
Thanks for the history lesson 🤓 I only really came across AMD once Intel started their Pentium advertising - a mate had a 486 PC and another bragged about his P60 - a couple of years later, when transitioning from an Amiga to a PC, the decision to go AMD vs Intel with a K6-233 for uni was a no-brainer based on price - since the other hardware was pin-compatible, I wrongly assumed that, up until the divergence of sockets/slots, there was still some sharing of the underlying architecture - looking forward to the next instalment 👍👏
Good job. There was a bit of conflation there with microcode (is it firmware?). It would have helped to underline that it is entirely internal to the chip and operates the internals of the CPU. In any case, microcode was discarded with the Pentium series, KINDA. It actually lives on today in so-called "slow path" instructions like block moves in the later CPUs, which use microcode because nobody cares if they run super fast or not, since they are generally only used for backwards compatibility and got deprecated in 64-bit mode.

I await the second half of this! Things took off again with AMD64 and the "multicore wars". Despite the mess, the entire outcome probably could have been predicted on sheer economic grounds, that is, the market settling into a #1 and #2 player with small also-rans. Today's desktop market, at least, remains in the hands of the x86 makers, except for the odd story of Apple and the M-series chips.

Many have pronounced the end of the desktop, but it lives on. Many or even most of my colleagues use Apple Macs as their preferred development machines, but as I write this, I am looking out at a sea of x86 desktop machines. It's rare to see a Mac desktop, and at this moment even the ubiquitous MacBook Pros the trendsetters love are still x86-based, although I assume that will change soon.

Me? Sorry, x86 everything, desktop and laptop(s). At last count I have 5 machines running around my house and office and 4 laptops. I keep buying Mac laptops and desktops, cause, you know, gotta keep up with things, but they grow obsolete faster than a warm banana. Yes, I had PowerPC Macs, and yes, they ended up in the trash. And yes, I will probably buy Mac M2s at some point.
It would be nice to give some technical data on these early processors, meaning data and address bus sizes, register size and count, and clock rate. Also whether floating-point numbers are supported.
I'm not sure what your sources are, but the IBM PC used the 8088, which was a more economical version of the 8086. The 8086 was a true 16-bit CPU, whereas the 8088 was 16 bits internally with an 8-bit external data bus - much cheaper to implement. Also, the 80286 was used for the IBM AT and was very successful.
It's just a question of backward compatibility. That's also the reason why AMD64 got traction over Itanium. Software development has a lot of inertia, especially in low-level programming (OSes and drivers) and with the closed-source model. Even a big cult like Apple had to provide satisfying translation layers (the Mac 68k emulator, then Rosetta 1 and 2) to assure the success of each of its 3 architecture transitions.
The competition between AMD and Intel since the 386 has changed the world. Not metaphorically. We are fortunate the market isn't dominated by one player. Fast forward to this year, and Intel is back in the high-end desktop space with chips cheaper than AMD's. It hasn't happened often in the past that AMD was the more expensive option. Now I know why AMD always seems to have good support from motherboard makers. Great storytelling as usual.
What does '[other companies would not accept that novel chip] without A SECOND SOURCE' mean? That term occurs in a line of text starting at about 6:40 in the video.
The entirety of Intel's competitive strategy in the time period covered by this video was driven by its success in the "Orange Crush / Crush Motorola" campaign. (Another future video, maybe?) Grove, Barrett and the Intel senior leadership teams (at least up to Gelsinger) realized that using bad-faith legal arguments could squelch their competition, allowing Intel to milk the profits from a product over its lifecycle and preventing the entry of competitors until the product was no longer profitable. Once they had succeeded in driving Motorola out of their competitive space, they went after AMD, Cyrix (and later National), NEC, Intergraph, and so on. Why compete when you can lawyer your way into monopolistic practices, creaming off the profits and starving your competitors of access to capital through bad press? Oh, and forcing your competitors to spend hundreds of millions on legal fees instead of engineering research is a useful feature, too. Eventually, the string of legal losses started to really hurt, so Intel switched to leveraging its pile of cash into Marketing Development Funds as a rod for AMD's back, but I am sure that MDF will feature in future AMD-Intel videos from Jon.
The 386's legacy isn't only about the 32-bit address bus. IIRC it introduced virtual memory with a paging MMU. This is a huge evolution that is indispensable for multiuser, multiprocess machines.
And really nice and general addressing modes. Before the 386, there were annoying limitations on which registers you could combine (and how) to form a memory address. The 386 instruction set was much more orthogonal than the 86/186/286 instruction sets… but the encoding was of course not nice and orthogonal.
@@holmybeer No, I'm talking about the ability to write [eax+ecx*4+1234], which works fine in real mode, btw. Before the 386, you could only use a few specific registers in address calculations (bx, bp, si, di). If you wanted to access something on the stack, you almost had to use [bp+disp] (which conveniently defaulted to using the stack segment). You couldn't write ss:[sp+disp], so you were basically forced to use bp as the frame pointer.
Such good memories. For PCs, I started with an 80286 (my first computer was an MSX, using a Z80A). Then the 386 came in; AMD was pretty good and got a share because it was CHEAPER and worked perfectly fine. Who doesn't remember the 486 DX2 and DX4?
I had a Compaq 386 DX computer. It was really sturdy. I also had an NCR 286 computer. It had a high-quality SVGA monitor, nice for games. My dad bought it in Canada in the early '90s. I loved that machine.
I thought I read that the 4004 microprocessor was developed for use in places like elevator controls and someone then realized after the release of engineering samples that it could be the core chip for a calculator.
16:29 “Intel had developed the AM486 using what it called clean room … procedures” Interesting that Intel had to RE their own chip and rebrand it as an AMD product. ;)
Interesting that you referred to IBM's Boca Raton team as the 'toy' computer makers. Through the '80s, the IBM Fab in Burlington VT had really strong process tech, but mediocre products. IBM got second source rights to the 486D micro, and got good yield on initial builds. It would have been a great line filler to turn the focus from technical papers to manufacturing discipline, but the PC team used it for price leverage on Intel purchases.
Been rooting for AMD since around 1994. I hope you cover the events that transpired after this video, beginning with AMD deciding to create their own completely in-house CPU design, the K5, to Bulldozer, and now their rise with Ryzen, with their market cap going past Intel.
Technology has evolved along a much quicker path since the '80s. There's no way a product could hit the market 5 years late today and still create a billion in revenue.
Microcomputers were around before the IBM PC. Vector Graphic made the first pre-built PC, years before IBM's. IBM didn't want to get into PCs until the 1980s.
6:57 what? Intel had a second source? Intel WAS a source. Do you mean that buyers had a second source? And Mostek "went with Motorola?" Huh? Do you mean that Mostek decided to start making Motorola compatible chips instead of Intel compatible chips? This is confusing/unintelligible unless you already know what it means, which I don't.
Me too, I have no clue what is meant with "customers required second sourcing". I've got an MBA and I've worked with B2B sales. As a salesman I've heard all kinds of stupid excuses for not buying our terrific products, but I've never heard of this kind of obstacle. Did ALL potential CPU buyers have this strange and stubborn policy? Did they all also rent each of their office buildings from two different landlords?
This Intel + AMD story just shows that a single competitor can lead to better prices and outcomes for customers, WHILE both companies remain viable (Intel still made a ton of money, while AMD struggled more but survived). If Intel had been allowed to keep prices very elevated, the PC boom might not have played out as explosively as it did.