Some additional fun facts:
- The 6502 is the most-sold processor in history; in Q4 2019 alone, 5 billion (yes, billion) CPU units were sold, with a total of around 110 billion sold.
- The 6502 is the most-documented CPU architecture: over 32,500 pages have been written on the architecture, over 3 pages per transistor.
- The 65C02 overclocking record is 133 MHz, achieved in 2003.
- There were other, less-known micros that ran 6502 processors, like the Digital 65, the SYM and AIM series, the Elektor Junior, the EP series from Microdigital (Brazil), and the Atlacatl (El Salvador, with some small usage in Belize).
- There have been machines with second 6502 units: the BBC Micro had a 6502 accelerator board, and several KIM upgrade boards had a second CPU just for graphics.
- The PC Engine's CPU ran at 7.16 MHz, so it was faster than the SNES.
@@danielmantione I suppose all those small ICs and especially sensors which you find in almost every technical product and which do not require an ARM or other 32-bit CPU. Even recently designed products often contain either a 6502 or an 8051 CPU. And there are, for example, more than a hundred of those sensors in each modern car.
I pulled out my TG-16 a few weeks ago and it doesn't boot. I expect it needs to be re-capped. 😥 Still... I loved that machine and with full stereo sound and 64-bit graphics I could not understand why people owned Nintendos. LoL.
6502 is a canonical piece of computer hardware, similar to classic game consoles. They will keep getting made, emulated and implemented in FPGA forever.
The 8-Bit Guy is correct that CPU speed does not compare well across different CPUs. AMD's CPUs were clocked slower than the Pentium 4s but could seriously match or even outperform faster Pentium 4s, so the 6502 is not the only CPU that proves this point; it holds even with modern CPUs.
I wouldn't be so sure, machines are getting ridiculously complex, and I doubt you're going to be implementing RDNA GPUs with unified memory in an FPGA anytime soon. Let alone accurately. 😅
@@SaraMorgan-ym6ue Right, however on "modern" hardware (so even 20 years ago) the clock speed was just a tiny bit of the puzzle. The size and usage of the cache was in most cases much more important. CPUs have reached a complexity level that makes it almost impossible to predict performance ahead of time. They have dynamic pipeline optimisation stuff going on that's just wild. In the good old days you just picked up an instruction set sheet, looked up the clock cycles of each instruction, added them up, and got a fairly good estimate of how long your code would take to run. This isn't the case anymore.
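That old cycle-counting approach can be sketched in a few lines. This is a minimal illustration, assuming the base cycle counts from the NMOS 6502 datasheet and a made-up 10-iteration loop (page-crossing penalties ignored):

```python
# Estimate runtime of a small 6502 routine by summing documented cycle counts,
# the way programmers did straight from the instruction-set sheet.
# Base cycle counts per the NMOS 6502 datasheet (page-crossing penalties ignored).
CYCLES = {
    "LDA #imm": 2, "STA zp": 3, "INX": 2, "CPX #imm": 2,
    "BNE taken": 3, "BNE not taken": 2,
}

# A hypothetical 10-iteration loop: LDA #0 once, then STA/INX/CPX/BNE each pass.
per_pass = CYCLES["STA zp"] + CYCLES["INX"] + CYCLES["CPX #imm"]
total = CYCLES["LDA #imm"] + 9 * (per_pass + CYCLES["BNE taken"]) \
        + (per_pass + CYCLES["BNE not taken"])
print(total, "cycles")                 # deterministic: every cycle is knowable
print(total / 1_000_000, "seconds at 1 MHz")
```

On a modern out-of-order CPU with caches and branch predictors, no such table exists; the same addition would be off by an order of magnitude either way.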
@@Bunny99s Well, give it credit for surviving this long. No other CPUs have lasted as long in manufacture as the 65C02 and 65C816; they're in it for the long haul, plain and simple. All the other CPUs of that era have been discontinued.
My grandfather was the CTO of Zilog and raised me. He helped develop the Z80. I am very proud of him. He made life unrealistic for me... he raised me and spent all his free time with me building PCs and listening to whatever music I found. I've never found anyone who holds a candle to him. Not having him around has been tough. Knowing his work won't ever die in some capacity makes me happy. Not this IC... just in general with his work. He also helped GE in the '60s build their first mainframe. Dude was the most modest man you'd ever meet. Edit: I'm not going to put my family's personal info out there; I had no idea this comment would get any traction. Believe me if you want, or don't.
The TRS-80 Model I was my first home computer, and the first platform I learned assembly language on. The Z-80 will always have a special place of honor in my vintage-computing collection. 🙂
Interesting. My grandpa on my dad's side couldn't even find the on button on a PC and refused to use them. My grandpa on my mom's side knew a lot about analogue electronics like CRT TVs and VCRs, but when it came to computers... I didn't have one until 2008, and I kept getting malware.
A few thoughts...
1. I feel this only scratches the surface. I get that this must be a monster project, but I'd watch a 2-hour version of this going into endless detail. Or even a 4-part series. The 6502 strikes me as one of the most important CPUs in history; it deserves ALL the effort and detail.
2. The segment with Bill Mensch was very short. Perhaps there wasn't much to add, or he couldn't share anything new, but I wish there had been more.
3. The Z80 needs a similar video, as do the 6800/68000 and the 8086.
You've got some solid points, but The 8-Bit Guy has already made several videos on consoles and computers that were powered by the 6502, and he did go into fair detail about the chip, scattered across them. I hope David covers the Z80. I own a 20 MHz variant and I would love to learn more about it.
I first programmed the 6502 on a synthesizer in 1979. A small music company in Oklahoma City called PAiA Electronics (still in business!) used a 6502 for computer control of a modular synthesizer in the 1970s. John Simonton (en.wikipedia.org/wiki/John_Simonton ) was the designer. A fun fact is that Larry Fast of Synergy used the Apple II to play some of Simonton's software for the PAiA platform. Fast played with Peter Gabriel and relied on Simonton for tour support on the first tour through Texas in 1977. So many electronic music performers are very familiar with the Rockwell 6502.
@davidryle Do you remember a guy named Hal Chamberlin? He wrote a killer book called Musical Applications of Microprocessors. It was my “Bible” in 1980.
@@GodmanchesterGoblin I met synth legend Roger Powell about that time. He was working with an Apple II with a Mountain Hardware 16-channel A/D-D/A converter board in it. He used it to handle control voltages and triggers for his Moog and Serge modules. Cool stuff! I think he used it on his second solo album, Air Pocket.
At 4:38 that's me closing the VCR lid in the video "VHS VCRs Revisited" ... Big fan and honored to be mentioned/shown in an 8-bit-guy video! (First programming experience was BASIC on the Apple //e on a 6502 ...)
My first computer was in college in 1978: the MEK6800D2 development kit. I programmed a prototype for an automotive MPG display and a Times Square marquee, each in 512 bytes. Programming was all in assembly and hexadecimal. I would measure the frequency of the odometer and fuel sensor, divide distance pulses per second by fuel pulses per second, and convert that to MPG units, all using bit shifting. Storage was Kansas City standard audio tapes. My productivity improved immensely when I learned how to use the 6800 assembler on the university mainframe. I was unaware of the history of the 6502 and its connection to the 6800 at the time. It is great to hear from people like Bill Mensch, who were hands-on with some of these things that are now household words. Thousands of pacemaker users are happy to know they are using some of the most tried-and-true processors there are.
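The kind of shift-based division described above (the 6800, like the 6502, had no divide instruction) can be sketched in Python as classic restoring binary long division; the pulse counts here are made-up example values:

```python
def shift_divide(dividend: int, divisor: int, bits: int = 16):
    """Restoring binary long division using only shifts, compares, and
    subtracts -- the technique used on divide-less CPUs like the 6800/6502."""
    quotient, remainder = 0, 0
    for i in range(bits - 1, -1, -1):
        remainder = (remainder << 1) | ((dividend >> i) & 1)  # bring down next bit
        quotient <<= 1
        if remainder >= divisor:
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

# Hypothetical sensor readings: distance pulses/s divided by fuel pulses/s.
q, r = shift_divide(1200, 37)
print(q, r)  # 32, remainder 16
```

On the 6800 the same loop would be a handful of ASL/ROL/SUB/BCC instructions, easily fitting in a 512-byte program.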
Thanks so much for the informative videos. Your channel inspired me to transition into IT. Two years later, I’m working full time in the industry and couldn’t be happier! Your channel is the best.
I skipped that and wrote an 8086 emulator for MIPS and ARM 25 years ago so you could run PC programs on those processors for Windows CE. Found an undocumented feature of the 8086 on the way that MS-DOS was dependent on too.
Great video! Fun fact: The 6502 is essentially a simplified Motorola 6800 implemented using NMOS with a different pinout and a slightly modified instruction set. That's because most of its engineers came from the Motorola 6800 team!
Those simplifications were super important though: 6800 was eye-meltingly expensive at the time and the 6502 was cheap enough to practically sell by the barrel!
@@talideon I think it was $25 vs $125. I have the feeling that production cost was not the only reason the Motorola was so expensive. Misunderstanding the market cost Motorola the industry.
Fun fact(s): Similar to the 6800, the Data General Nova (1969) had two accumulators, two index registers, and a zero page! But historians, and Chuck Peddle himself (who worked on both the 6800 and 6502), said the 6800 was inspired by the PDP-11, although from a modern perspective the PDP-11 is a lot closer to the 68000. However, when you are coding the Motorola processors (6800, 6809 or 68000) vs. the MOS Technology 6502, it is apparent that the addressing-mode design favors C-style structs, and that does have PDP-11 roots. On the 6502 you end up wanting structure-of-arrays.
Yeah, as an electrical engineering student, I cut my teeth on assembly on the 6800. I guess the college must have gotten a deal going with Motorola so they could “hook” us early. We soon moved to the TI TMS9900. And holy cow that thing was 16 bit! Wow. …lol
Hello David. Since I am so broke, I can't really afford Patreon, so this is the only place I can say it, but I wanted to express my deep thankfulness for you, LGR, and Techmoan changing my life. I have really wanted to meet you all, get to know you a little better, and maybe send you something from the goodness of my heart, just to say thanks for getting me to where I am now. I am a sixteen-year-old nerd who, thanks to you three, got really interested in old tech and began collecting old software, computers, typewriters, hi-fi equipment, etc. I am now thinking about making my first video, but I haven't thought it through yet. I wanted to come on here and tell you guys I love all of you. I love watching your videos and learning new things, always learning new things and weirding out teachers at school because they don't expect someone like me to be talking about these things, things they grew up with (lol)! And it makes me happy that I am now so interested in something I have spent literal years diving into. Sounds weird to say, but I enjoy growing up with your videos and the other two's. It makes me proud. Anyway, that's all I have right now. Thank you! ❤📼💾
I appreciate these guys’ work too. But if you want real, eternal fulfilment I highly recommend you turn to the Lord Jesus. Read one of the Gospels today, perhaps. God bless. 🙏
You got tantalisingly close to giving away a key reason why the 6502 was so popular: it leaves the bus available for video half the time, in a predictable way. By luck (or judgement?) one clock cycle equalled one character time on the CRT, so during one half of the cycle the CPU guzzled data, and the other half was left to video. Essentially it gave way to an early form of DMA. This wouldn't work on a Z80, as it doesn't have a predictable bus, so instead you have to implement other, segregated ways to feed video.
It was just as easy to implement memory mapped graphics in Z80 machines. All clones of Sinclair Spectrum implemented it with a handful of ttl logic chips. Overall the complexity and performance were very similar.
Thank you! This was one of the most interesting 8-Bit Guy videos I watched in a long time. The only thing is that it kind of left me longing for more. Also, I expected a bit more of the Bill Mensch interview. I appreciate the effort that goes into making a video like this, but if it was up to me it could be at least twice as long without being boring. Would love to see more of this!
Fascinating, David. I learnt a lot, and I had no idea that the 6502 was still being made; it makes me happy that it is. Back in the day I had a Commodore 64 and a Commodore Plus/4 at home and used BBC computers at school, all 6502-based, and I used to write programs in BASIC for them all, so I understand about memory access and some of the CPU instructions. But what I never really understood is how it all fitted together. You turned it on and there was a BASIC prompt and a flashing cursor, but it all seemed a bit magic as to how that happened. I never knew how memory was addressed, or about stuff like chip-select lines. Between yours and Adrian's channels I have learnt so much over the last few years, and I now understand much better how the computers of my youth actually worked. Really enjoyed the video, thanks!
1:22 I wouldn't be surprised if there are cars running around with 6502s under their dashboards. My '85 F150 is one of the last vehicles Ford ever sold with a carb, but it had an Intel 8061 under the dash in the primitive EEC-IV system. It would be shocking if there ISN'T a ton of cars rolling around with 6502s running their primitive EFI setups / last-gen carb setups.
Ditto, I drive an '84 F150, also with the same 8061 EEC-IV feedback carb (300 I6). They only offered feedback carbs for 3 years (300 and 351 84-86, 302 84-85). A bit of a weird technology, but for an eccentric like me, it fits!
You wouldn't be wrong... GM actually commissioned a 6800 variant for their earliest engine control modules in the early '80s! (the 6502 began as a low-cost version of the 6800 after all)
There are a lot of cars using 8051-compatible processors for various tasks. In recent times, most of those have moved to ARM, but, possibly some simple tasks like seat adjustment might be done with a specialized motor controller running 8051 code.
10:50 The TurboGrafx/PC Engine's HuC6280 (a variant of the 65SC02) runs at 7.16 MHz and in 1987 may have been one of the fastest 6502 variants. You mentioned the SNES and Apple IIgs use the 65C816, a "16-bit" variation, which may make them a bad benchmark, whereas the TG16/PCE CPU is closer to the 8-bit 6502. 11:35 Although the TG16/PCE had a few RAM-expansion HuCards used by the CD-ROM attachment, there were no cartridge-based enhancement chips (11:11) such as those used in SNES games. The TG16/PCE 65SC02 does make use of a video controller (HuC6270) and a color encoder (HuC6260) which certainly do some heavy lifting, but I still think the TG16/PCE would be a very good showcase of what the 6502 can do. Oh, and there is a mouse available if someone decided to port some games over.
This statement is a bit deceptive. The HuC6280 had an instruction set extension for copying memory to / from its video controller, which was far faster than doing it in software with the regular LDA/STA instructions.
That video co-processor did a lot of the heavy lifting on the PC Engine, much like on most consoles of the time. The same even went for the C64. Things such as scrolling the screen, throwing sprites around, and collision detection, all of which could be very CPU-intensive, were handled by these chips, leaving the CPU free to handle other things. That's why if you look at C64 versions of games vs. a Spectrum or Amstrad version, no matter the quality of the gameplay itself (I can tell you from experience some Spectrum versions play better than their C64 versions), the C64 would typically have a smoother framerate and better scrolling. The only games I can think of where that isn't true are games that use 3D rendering.
David, this video was AWESOME. I learned so much; as always, thanks a ton 🙏. From the guns to the keys and CPUs, the sheer quality of your videos is admirable! Thanks!
I got my game dev career start on the Apple II, back in 1983 and coded the 6502 on the Apple, Vic-20, C64, Atari 8's, etc. until moving to 68000 on the ST, Amiga and Sega Genesis. 40 years later and I kind of miss the 6502 and regardless of the struggle and limitations, there was something really "fun" about getting things working back then vs. now.
@@Okurka. Oh, here we go... Well, I've coded several machines over my 40-year career. I guess I could get into the minutiae of every specific variant, but I was being general. But I guess I should be more pedantic, considering the video and likely audience. 🙄
Brilliant vid. I've picked up bits and pieces from these vids before, but it was great to have it all in one. Hopefully we'll see some similar vids on other chips in the future.
Wait. The damn furby is running on a 6502?! That's the kind of random factoid that I live for. The 6502 and Z80 are immortal in a way. These ancient chips from the 70s keep finding uses. You've listed many of the uses of the 6502. The Z80 also has some fun ones. Maybe you all remember those old MP3 players, the ones that were basically USB sticks with a battery and headphone jack. Those things use a Z80. It's fast enough to decode an MP3 as long as it doesn't need to do anything else.
There might be a Z80 core in some mp3 players for controlling them, but no way that thing does the mp3 decoding. That is done in a hardware decoder on the same chip.
I grew up with the 6502. Nostalgia makes me love it. Its abysmal register set brings me back to reality. I'd like to see a deeper dive. It was a transformative chip.
The first time I got to use a PC in real life was an OSI C1P and 4P, at 12 years old in the late seventies. I still remember the first words: OSI 6502 BASIC VERSION 3.2 (c) MS, etc. The MS part doomed me for life lol
@@maxxdahl6062 Similar. The Commodore also had a more advanced color display as well as the advanced sound mapping, but even the base code had subtle differences. The Apple II, Amstrad, etc. had subtle differences too. There is a page somewhere on Wikipedia that lists the different 6502 BASIC variants. From that OSI BASIC start I can code in any form of BASIC ever created. I often use FreeBASIC besides C and Python.
Once you realise all the zero page instructions are there to provide loads of “registers” that can be used in lots of flexible ways, the 6502 is actually replete with registers ;)
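The zero-page point can be made concrete with the actual NMOS 6502 encodings: a zero-page access is a byte shorter and a cycle faster than the equivalent absolute access, and the indirect-indexed mode (16-bit pointers) only exists through zero page, which is what makes the first 256 bytes feel like a big register file. A quick comparison, with opcode/size/cycle data taken from the 6502 datasheet:

```python
# The same logical load, zero page vs absolute, per the NMOS 6502 datasheet.
LDA_ZP  = {"opcode": 0xA5, "bytes": 2, "cycles": 3}  # LDA $42
LDA_ABS = {"opcode": 0xAD, "bytes": 3, "cycles": 4}  # LDA $0242

# Indirect indexed -- LDA ($42),Y -- requires its 16-bit pointer in zero page,
# so zero page is also where all your "pointer registers" have to live.
saved_bytes  = LDA_ABS["bytes"]  - LDA_ZP["bytes"]
saved_cycles = LDA_ABS["cycles"] - LDA_ZP["cycles"]
print(f"zero page saves {saved_bytes} byte and {saved_cycles} cycle per access")
```

Across a hot loop those single-byte, single-cycle savings add up quickly, which is why 6502 code treats zero page as scarce register real estate.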
Awesome video, great topic to cover. Been learning 6502 ASM over the last few years for doing NES homebrew projects, and I'm always amazed at the general efficiency of the processor and corresponding Assembly code. It really is fun to work with.
This was informative and approachable. Nice work, 8BG! Do you have any plans to release the full interview with Bill Mensch? That segment was shorter than I was hoping it would be.
The only CPU I learned to program in Assembly language!!!!!! I’ve got a soft spot in my heart for the 6502, and I’m amazed it is still alive and kicking after all this time!
The big idea behind ARM was to exploit memory bandwidth to the max. When they evaluated all the then-current 16-bit CPUs (80286, 16032, 68000, 65816) they were appalled by their inefficient use of memory bandwidth.
As did the 6502: its rising/falling-edge CLK logic almost doubled memory bandwidth (as a video somewhere on YouTube mentions). Acorn's MEMC1/1A may have been more efficient still, with near single-cycle memory accesses possible when using the load/store-multiple instructions. Pushing or pulling fifteen of the twenty-seven 32-bit registers as a block took 18 clock cycles, i.e. about 3.33 bytes per tick at 8 MHz, versus the standard 2 ticks per single ARM load/store (2 bytes per tick), let alone the 4 ticks the likes of a 68000 needed to read a 32-bit word (1 byte per tick). So memory access was 2 to 3.33 times faster on an Acorn Archimedes than on a 68000-based Amiga, ST, Mac, or Sun-1 at the same clock speed (and the Archimedes was also clocked marginally faster than the PAL variants of those machines). That made for some quick stack-accessing wizardry on a branch or return, along with some weird 7-bit reverse-page logic that initially eliminated the need for much of the slow blocking logic found in the CISC toys. Though only 22 bits (4 MB) of the address bus went anywhere (without a multi-MEMC motherboard fudge), and the VLSI logic/DRAM couldn't be clocked above 8 MHz for the first six years of ARM chippery. Until the ARM 610 appeared, you couldn't source a true 32-bit ARM SoC.
@@galier2 I remember reading, somewhere around the time the ARM1 came out, that they tested the other CPUs using the Tube on the BBC Micro, because that allowed them to separate specific aspects of performance, which exposed the issues. I held on to that idea that the BBC Micro has that important role in the development of ARM, which is now in billions of devices, but I heard something more recently that cast doubt on it. I'd love to have confirmation either way :)
@@sputukgmail The BBC Micro was developed by Acorn, the same people who created the ARM CPU (ARM = Acorn RISC Machine). And yes, they used BBC Micros to help develop the first ARM CPU and the software for it.
@@andrewgrant788 Yes, I'm aware of the origins of ARM (I used one of the first Archimedes and followed the development very closely), but it's the conflicting stories about how important using the Tube with second processors was in helping the team realise ARM was the right path that I'm hoping someone can clarify. As I say, I recall an article/interview where someone said how important it was, but I also watched a documentary about the development of ARM that suggested they had already decided RISC was the way to go before they got the other processors to experiment with and test using the Tube.
I have been a subscriber for a couple of years and come here every so often when I'm on YouTube. I have many other channels that I watch for different topics, but I always enjoy spending time here, as with this video. I remember all the hardware that used the 6502 back then, so it's like a trip down memory lane. You put your videos together in a way that makes them very enjoyable to watch, and this one is no exception. Thanks a lot for this, and please keep giving us the great content that you have been providing.
The PC Engine/TG16 ran a HuC6280 CPU, which is an upgraded CMOS version of the 6502. It ran at 7.16 MHz and was released in 1987. I believe that was the fastest-clocked 6502-based CPU of the time period you listed in the chart that has the SNES and Apple IIc Plus.
I think it is weird that the cartridge also ran at that clock. Hudson added an instruction to burst-copy data to VRAM (and RAM), so level loading was fast and you could spend the splash screen on decompression, or stream a level. Only problem: all its operands are immediate, so you either jump indirect or self-modify code.
Did it really run at 7.18 MHz, or did the system insert wait states? The NES and SNES have a system clock at 21.477272 MHz (6 times the NTSC color subcarrier frequency), but that doesn't mean that anything else in the system runs that fast. The graphics chips were the fastest, at 5.369318 MHz. Also, the SEGA Genesis had a system clock of 53.693182 MHz (15 x color subcarrier), with various dividers for the components. This is where MIPS benchmarks become more useful...
@@shinyhappyrem8728 Fanboys assured me that it does run at this clock. Remember: that was in 1987. Even the first draft/prototype of any circuit made in a NEC fab would run at that clock.
Wow, I knew the NES used a 6502, but I didn't know about the Furby and Tamagotchi; those chips powered my '90s childhood. Perhaps the 6502 was also in the Digimon "digivice" that had a pedometer, where you walked or shook the thing to get your Digimon to walk and play the game.
Love your videos!! This is a great video. I never realized that modern technology still uses the 6502. It really powers the whole world! And all because some guys wanted to make a copy of the Motorola 6800.
Well, I'm a big 6502 fan, but the world is probably powered by ARM. Which is not bad, because at the end of the day ARM was designed by Acorn, which previously used the 6502 but then needed something better that still didn't use much power, and the result was ARM. There is an excellent video on the history of ARM by the channel www.youtube.com/@LowSpecGamer; he also has very good videos on the history of the 6502 and the Z80.
I remember reading a book about the history of MOS. One chapter, which oddly you can hardly find any reference to online, talks about the toxic chemicals used in the creation of the masks, the fumes from which would turn paper yellow. In 1974 the MOS plant in Audubon was listed as a hazardous-waste site due to a leak of trichloroethylene (TCE), which caused local groundwater contamination. It stood out to me because I knew people who worked in the area who ended up with a profound, life-changing chemical sensitivity. While the 6502 may have powered a generation, its creation may also have damaged one. I just think it's important that those things are not forgotten too.
Wikipedia does mention the leak, and the factory becoming an EPA Superfund site just as Commodore was in the process of taking over. Sounds like MOS hadn't informed the new owners and left them holding the bag. It doesn't say what happened after that; if I were in Jack Tramiel's shoes, I'd have sued the former owners into the middle ages!
That's life in the 70's! I grew up where a chrome plant dumped their plating chemicals into the town aquifer to help boost testicular cancer rates. Probably why the EPA came about in the 70's, too.
I heard that when Commodore refused to manufacture the Clipper chip (the NSA spy chip for cellphones), they sent the EPA to close MOS Technology over "environmental concerns". This was in late '93. Commodore didn't kill itself...
Did they ever want to dispose of TCE properly, or just pay the shareholders and then go out of business? I don’t understand why everyone wants humans in a manufacturing or chemical plant. I also kinda hate the ISS.
I was at one fab in the 1970s. They used a gas to dope the wafers. They had the option of borane or arsine. In the gas cylinder room the arsine cylinders were untouched and covered with dust. "There is no way we are going to risk using that stuff".
Great video! Last year I actually did a project where I took a W65C02S, connected it to a microcontroller, and monitored its inputs and outputs. I used the microcontroller's memory space as the addressable memory for the 65C02S, and included peek and poke commands to monitor the memory addresses, as well as clock-step and breakpoint functionality. I even wrote a disassembler for 6502 assembly to go along with it. This was a fun project that I taught myself 6502 assembly with. It was a very cool experience. I'm now thinking of designing my own side-project device built around the W65C02S.
Great documentary video. Two comments. I've been a DJ for 36 years: disco peaked in 1978, not 1975, where you listed it as a notable moment of 1975. And I thought the interview with the creator of the 6502 could have been longer; it feels like it was cut short, but that's just my opinion. Overall, A+ doc-video.
One small but not insignificant mistake in the video: it says the WDC 65C816 is from 1985, but it actually dates from 1983. That means a backwards-compatible 16-bit successor to the C64 could have been done early enough to become the ancestor of our current PCs instead of the IBM PC, with Commodore still ruling the world now.
@@NuntiusLegis The design of the WDC65C816 started in 1983, Atari and Apple received prototype samples in the second half of 1984 and the official release of the CPU was early 1985.
The 6502 was a legendary CPU in so many ways. In a way, I think its use just about everywhere well into the '80s hurt the 65C816's chances of success, since it seemed like there was no urgency in designing a successor, IMO. The 65C816 wasn't available until 1985, and its chief competitor, the 68000, had been introduced 6 years earlier. The 68000 was comparatively much more expensive than a 6502, but once its price started to decline in the early '80s, interest in using it went up dramatically! If MOS or WDC had started working on what became the 65C816 years earlier, maybe it could have scored wins in devices that ended up using the 68000. Also, I can only imagine what the SNES could have been capable of had it come standard with the version of the 65C816 in the SA1 chip that was used in a few dozen games. The SA1 ran at 10.74 MHz, which is 3x the 3.58 MHz the CPU in the stock SNES ran at.
1:58 Thank you for including the OSI 4P. Warms my heart to see that :) 7:40 Thank you for the excellent illustrations so far. I am hoping to replicate some of this with the Atmel ATmega328 some time soon. The bus CS I recognized instantly, but I was curious about how they were getting the 16-bit pointers and integers. 25:16 Thanks for the presentation. That was awesome :)
Still used on TI-84 graphing calculators which are still in current production. Actually, was used for most TI graphing calculators before the TI92 (and 89 series) moved to the 68K.
@@ZenithMusicNet The MSX was sold worldwide (but not in England or the United States). That's why many people today don't know it. They don't know the rest of the world.
There are a few 8-bit microprocessors that are still in production:
* MOS Technology 6502
* Zilog Z80
* Motorola 6800/6808
* Intel 8051
* RCA 1802 (still used professionally in aviation, space, defence)
FWIW, the F8, then the Z80, then the 68000 were very heavily used in arcade machines.
Another awesome video! Thank you! I'd like to hear more of the interview with Bill Mensch and what he had to say about the 6502 architecture and the direction it might have gone.
@@EddieSheffield I read elsewhere here that the full interview didn't exactly... go so well. Apparently Bill went off on a lot of tangents, for example.
You're right, the Rubik's Cube was more of a 1978/1979 big thing. I remember being in 8th class in middle school when I was summoned to the school director, wondering what "crime" I had committed, only to have the director ask me to solve his cube.
@@galier2 I remember 1980 because it was my first year at highschool and couldn't wait to get my hands on one. There may be a Hungarian or two here in their late 50s who got one before the rest of us.
@@andrewdunbar828 Same here: 1st year at the comprehensive school and... let's just say someone was selling legit cubes cheap somehow. I was too naive to realise why they were cheap at the time, and only twigged months later that not everyone pays for things when they walk out of a store!
There is no way a 7 MHz 68000 was only 3 times faster than a 1 MHz 6502; in practical use it was more like 20 times faster. Although it required more cycles per operation, the operations were vastly superior in flexibility and capability, so most tasks could be performed in a third the operations, at 4 times the width. The 68000 also had hardware multiply and divide (16×16→32 multiply, 32÷16 divide), where the 6502 only had addition and subtraction of 8-bit numbers. Needless to say, the 68000 was hundreds of times faster at those tasks. I programmed assembly for both, and although the 6502 is very nimble for its time, size, and price, there is no competition between it and a 68000 in how powerful they really are. OK, you actually got to that at 14:47, and yes, you had to make your own binary math for multiplication, division, and numbers above 255. But even if you have the math, you sometimes use shift/logic operations instead for fixed numbers. At least that's how I did it.
I am so happy I made the time to sit down and watch this video. There was a ton of great tech heavy information here that was extremely easy to understand, and I feel like such a casual "retro" fan having a murky understanding of the fundamental functionalities of these computers.
As someone who's been a game programmer for 30 years.. I miss this era.. Coding was so much more "fun" back then.. Modern day developers struggle to understand the basics of optimization... I look at people complaining about how impossible it is to run their game on the Switch (for example) and shake my head.
I agree, but that doesn't mean it can't still be fun. I personally don't do a lot of actual optimization; I mostly try to write code that doesn't run poorly, and it's pretty fun. This makes me want to do a homebrew project for an older CPU at some point, though.
@@colleagueriley860: I work with hundreds of people now, and most of them think optimization is something you get from adding more power to the machine. You hear them bitch and moan about the power of hardware or engines in interviews with the press... it's depressing... And that's before we get to the simple fact that the "higher ups" don't care and don't want you spending time/money on making it go fast... just ship it out...
Well, optimization back then was absolutely necessary because of the slow CPUs, small memory, etc. There was no other way: you HAD to do it, and that's why a lot of people became very good at it. It wasn't because it was "fun".
Errata: @16:30 - The Atari 2600 only has 12 bits of address (as shown in your diagram) and can therefore only address 4 KB of memory space. The Asteroids cartridge shown is an 8 KB cartridge, but it used a very crude banking scheme.
The 6507 CPU in the Atari 2600 can address 8KiB of memory space, 4KiB is devoted to the cartridge ROM and 4KiB is devoted to RAM, I/O, timer video and audio registers inside the RIOT and TIA chips.
I came to say something similar. BrainSlugs83 is close, but NerdlyPleasures has it right. I'm currently working on my first Atari 2600 game. The 6507 also still has internal vectors for interrupts that have been moved to fit into the modified address space, even though it no longer has the external pins that trigger those interrupts.
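The 6507's truncated address space can be sketched as a rough decoding function. The mirroring follows from the pinout (the CPU only drives 13 address pins, so everything repeats every 8 KiB); the exact TIA/RIOT split below is simplified from the 2600 memory map:

```python
def decode_6507(addr: int) -> str:
    """Rough Atari 2600 address decode. The 6507 has only 13 address pins,
    so the 64 KiB 6502 address space is mirrored every 8 KiB."""
    addr &= 0x1FFF                 # pins A13-A15 simply don't exist
    if addr & 0x1000:              # A12 high -> cartridge ROM (4 KiB window)
        return "cartridge ROM"
    if addr & 0x0080:              # A7 high -> RIOT (128 bytes RAM + I/O + timer)
        return "RIOT (RAM/IO/timer)"
    return "TIA (video/audio registers)"

print(decode_6507(0xFFFC))  # reset vector lands in cartridge ROM
print(decode_6507(0x0080))  # zero-page RAM lives inside the RIOT
print(decode_6507(0x2000))  # mirror: decodes the same as 0x0000 (TIA)
```

This also shows why the reset/interrupt vectors at $FFFA-$FFFF still work: with the top three address bits ignored, they alias into the top of the 4 KiB cartridge window.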
4:23 Nice research work. The Rubik's Cube went on sale in 1980, not in 1975. Originally it was called the Magic Cube and had been for sale since 1977, but only in Hungary.
I don't think it's reasonable to say that the 6502 powered a generation. At least in the U.K., the ZX80/81 and the hugely popular Spectrum were Z80-based. My first micro, a TRS-80, was also Z80. The Z80 ran CP/M and was also a far easier chip to program.
@@Okurka. Poor? Not at all. It was used in many business machines. It was the basis of the MSX computer standard. It was also widely used in games consoles. It had a rich instruction set, including 16-bit instructions.
But the BBC Micro had a 6502, and that fact eventually led to the creation of ARM later in Acorn's history, so I think the impact of the 6502 even in the UK is probably higher, at least from today's standpoint. As for "powering a generation" in the '80s, of course you are correct; that title probably goes to the Z80, since Sinclair's computers were cheap.
@@mudi2000a The BBC Micro was expensive. That’s why the cheaper ZX Spectrum was so popular. I’m not saying that the 6502 wasn’t hugely popular, just that the Z80 was too, and it was also very successful so that the suggestion that the 6502 powered a generation of computers was wrong. It was a mixed market, and remained like that for a long time.
I really like these educational types of videos from The 8-bit Guy, he has a very clear and easy to understand way of explaining these processes and as somebody who couldn’t be further from understanding the inner mechanisms of how a computer operates, these videos are fascinating to me. Good job Bryan!👍
@@jbponzi1 Yes, it's sad for me, too, especially as I am an old and sentimental guy. But, on the bright side, the Z180 still lives; I wrote a lot of code for the Z180. And also the eZ80 - already over 20 years since introduced; seems like yesterday. Life is so short and CPUs come and go! 🙂
Thank you for the sidebar into multiplication with rotate and/or lookup tables, it was really elegantly explained. I gained a new appreciation for how the 6502 instruction set allows for a clever programmer (or compiler) to do multiplication in just a handful of clock cycles. I had previously assumed you were just stuck with "a series of additions" (with large-number multiplications just taking an absurd number of cycles) or "paying for die space with extra-fancy chip circuitry", with no middle ground.
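That shift-based middle ground can be sketched in Python. This mirrors the classic shift-and-add scheme a 6502 routine implements with ASL/LSR/ROL and ADC (it is a generic illustration, not the specific code from the video):

```python
def shift_add_multiply(a: int, b: int) -> int:
    """Multiply two unsigned integers using only shifts, bit tests, and adds --
    the same scheme a 6502 multiply routine builds from ASL/LSR/ROL and ADC."""
    product = 0
    while b:
        if b & 1:            # low bit of multiplier set?
            product += a     # ...then add the (shifted) multiplicand
        a <<= 1              # multiplicand doubles each step (like ASL/ROL)
        b >>= 1              # multiplier shifts right one bit (like LSR)
    return product

print(shift_add_multiply(13, 11))  # 143
```

For an 8-bit multiplier this is at most 8 shift/test/add steps, which is the "handful of clock cycles per bit" middle ground between repeated addition and dedicated multiplier circuitry.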