As far as I know, the ghosts in Pacman were "blinking" because only one sprite (out of the 4 ghost sprites) was rendered per frame. But this "optimization" makes them look more "ghosty" :)
+Simonschreibt The pacman game as it was sold was only a prototype. A programmer came up with a proof of concept, showed it to management and management decided to go with it instead of letting the programmer actually finish the game.
+bobnoel Atari Pacman wasn't unfinished; it's that they couldn't use the same maze because of technical limitations (not enough vertical pixels if they kept it vertical, and not laterally symmetrical, which the background graphics required, if they laid it on its side). Not having the same maze meant it simply sucked. The build-up and disappointment when it was released embittered an entire generation of people.
+bobnoel According to 'Racing the beam', it was _based_ on a prototype rather than giving the programmer time to rewrite from scratch. It also had to fit in a 4k ROM which limited the functionality. What's really amazing is how much better Ms Pacman was.
As a kid during the 80's, this stuff seemed like magic and it always had my mind wondering "how'd they do it?". Thanks to the internet and people such as yourself posting content like this, I can finally understand. It's calming to learn how this works. Thank you.
The Apple II used "artifact" colors -- same as with the hi-res modes of the Atari 8-bit computer systems (400, 800, XL, XE), Tandy/TRS-80 Color Computer, Dragon 32/64, and the "composite color" mode of some IBM CGA games. Atari 8-bit systems also had a trick where every scan line of the display could be in a different graphics mode. Games often used this to get more colors on the screen, or to show a status line (high score, lives left, etc.) at the top or bottom of the screen.
What you were thinking of is that each display list instruction (one per graphics-mode region) has an interrupt flag that, if enabled, interrupts the CPU, allowing code to execute before that region is drawn, such as changing a color register to a different color.
> Atari 8-bit systems also had a trick where every scan line of the display could be in a different graphics mode. !! Do you mean a different resolution or just a palette-swap like the CGA "cold" palette and "warm" palette??
@@jessfucket You could switch to different-resolution modes, or even between text and graphics modes (that was how Atari BASIC could put four lines of interactive text at the bottom of a graphics screen, very handy). It was also possible to swap colors in the color palette using an interrupt routine, or even to swap in a custom character set in the text modes for character graphics. And most of the graphics modes had color indirection for the colors, so it wasn't like CGA where you just had two possibilities--the particular mode might have 4 colors, but those colors could be any of 128 possibilities, and if you used display-list interrupts you could change those on different scan lines. So the fancier Atari games tended to have this mix of wildly different types of video display on different horizontal bands of the screen, and there was also a popular effect for title screens and such where you could have pulsing rainbow bands streaming down the screen. It's one of those things that screams "Atari 8-bit".
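The color-indirection idea described above can be modeled in a few lines of Python. This is a conceptual sketch, not hardware access: the register packing follows the usual hue-in-the-high-nibble convention for the Atari's 128-color palette, but treat the specific values as illustrative.

```python
# Rough model of Atari 8-bit color indirection: each on-screen pixel
# stores a small index, and a color register maps that index to one of
# 128 palette entries (16 hues x 8 luminance levels).

NUM_HUES, NUM_LUMAS = 16, 8
PALETTE_SIZE = NUM_HUES * NUM_LUMAS  # 128 possible colors

def make_color(hue, luma):
    """Pack hue/luma roughly the way the registers do:
    upper nibble = hue, lower nibble = (even) luminance."""
    return (hue << 4) | (luma << 1)

# In a 4-color mode, the screen data only holds 2-bit indices...
screen_row = [0, 1, 1, 2, 3, 0]
# ...while the color registers pick WHICH 4 of the 128 colors those mean.
color_registers = [make_color(0, 0),   # background: black
                   make_color(3, 6),   # a bright warm hue
                   make_color(9, 4),   # a medium cool hue
                   make_color(12, 7)]  # a bright cool hue

# A display-list interrupt could rewrite color_registers mid-frame,
# giving a different set of 4 colors on different scan lines.
decoded = [color_registers[i] for i in screen_row]
```

Rewriting `color_registers` between scan lines is exactly what makes those rainbow-band title screens cheap: the screen data never changes, only the indirection table does.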
They're not exactly the same, though. The Atari 8-bits, when in hi-res mode (320x192 pixels on a standard screen, which it didn't necessarily have to adhere to, by the way), use all 8 bits of each byte, and therefore can only produce black, white, and two other colors that depend on the particular graphics chips they use and their revisions. On the Apple II series, the 6 colors are consistent and predictable, while on the Atari, 2 of the 4 colors are kind of up in the air. Fortunately, the hue control can usually get you a couple of usable colors, but 4 is still less than 6, and the _Ultima_ games, contrary to what many Atari 8-bit owners seem to believe, suffer on the Atari 8-bits in comparison to the original Apple II versions. You're lucky if you even get blue water and green forests--sometimes these colors are reversed, and there is no orange or purple (or you get orange/brown and blue but neither green nor purple--it depends!). In fact, there were two versions of some/all _Ultima_ games on the Atari that reversed the colors in case that would help.

Maybe Origin should have used some of the Atari 8-bits' other modes and capabilities--of which there are many--for the _Ultima_ conversions. The problem with this, however, is that the resolution would have been much lower. On the C64/C128, color memory allowed it to more or less match the Apple II for the _Ultima_ series, including resolution. The Apple II, for which these games were originally designed of course, did some things better, while the C64/C128 did other things better, but they were a pretty close match where the _Ultima_ games were concerned. In most other ways and with most games they were strikingly different, and the same goes for the Atari 8-bits: no other 8-bit computer could match the latter when it came to games designed specifically for it, although the same could be said for the C64/C128 in different ways and with different games.
Great video, did not know about the Apple graphics. I didn't know about CPU graphics either but the SLOW and FAST commands on the Sinclair ZX81 make sense now. It's amazing how old computers were a result of the juggling act where designers had to balance functionality versus affordability.
Actually, if you set the color bit when using a monochrome monitor, it would shift those seven pixels half a pixel-width to the left. It did the same thing on a color screen; Steve Wozniak generated colors by taking advantage of how the composite video signal relates color to position. You could take advantage of this to get 560-pixel horizontal precision when positioning an object, even though the objects themselves were still limited to 280 distinct pixels (or 140 in color). I occasionally used the color bit to make things like smoother fonts.
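The half-pixel positioning trick above can be sketched numerically: each hi-res pixel occupies two "dots" at the 560-dot composite resolution, and a byte's high bit offsets its seven pixels by one dot. This is a conceptual model only (and treat the sign of the offset as illustrative; the point is the doubled positional precision, not the direction).

```python
# Sketch of the Apple II hi-res "color bit" positioning trick.
# Each screen byte holds 7 pixels plus a high bit; setting the high
# bit nudges those 7 pixels by half a pixel-width (one 560-dot step).

PIXELS_PER_BYTE = 7

def dot_positions(byte_index, high_bit_set):
    """Return the 560-resolution dot positions of a byte's 7 pixels."""
    base = byte_index * PIXELS_PER_BYTE * 2  # 2 dots per hi-res pixel
    shift = 1 if high_bit_set else 0         # the half-pixel nudge
    return [base + 2 * p + shift for p in range(PIXELS_PER_BYTE)]

# Same byte, two placements half a pixel apart:
unshifted = dot_positions(0, False)  # dots 0, 2, 4, ... 12
shifted = dot_positions(0, True)     # dots 1, 3, 5, ... 13
```

With 40 bytes per line, the last byte's last (shifted) pixel lands on dot 559, which is where the 560-position figure comes from.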
I love this series! I wasn't even around for most of these graphics but the history of it is super interesting! I hope you keep making these beyond the 3 parts, as you suggested at the end of the video. I'm all for it!
Please do the CGA, EGA & VGA graphics! I still remember writing 2D demos in assembler utilizing direct access to VGA memory. I never did anything past the basic VGA 320x200x8bit, as it was a simple write to the chip. No pages... Those were the times :)
+Jachym Lukes Yeah, and it behaved like you expected. If you knew the VGA registers, they'd work the same on virtually all video cards of the time. It was only when one got into VESA resolutions that things started to get weird and hard to deal with.
Phyphor666 I've just remembered how my schoolmate found some memory chips, pressed deep into his carpet. I got them from him and they fitted to my Trident VGA. With them I was able to view "images" downloaded from BBSes in True Color :)
Absolutely fascinating. A lot of it goes over my head of course because there's so much to cover, but I really like learning about these sort of limitations that developers encountered, and the different ways people addressed them using the same resources.
Growing up through the evolution of graphics modes, I'd LOVE to see an episode on CGA/EGA/VGA/SVGA. I remember being a kid with a green monochrome screen, and just being so wowed by the glory of CGA, blown away by EGA, but my head just about exploded when seeing VGA for the first time. Kids these days just have no clue how good they have it, graphically speaking. Haha.
I'm glad I randomly found your videos one day, they're interesting and fun and technical enough to keep me interested without becoming tedious. Love to see more stuff about graphics modes, but general 'how old machines did X' videos are always super interesting too.
I'd love videos about CGA, EGA and VGA! I've done some very simple assembly programming in the early 90s, and EGA really baffled me with its planar memory. VGA with mode 13 and mode X was really awesome! Now that I have a deeper understanding of computers, I'd love another explanation to see if it makes sense to me now :-D
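Part of why mode 13h felt so approachable, compared to EGA's planar memory, is that the whole screen is one flat byte array that fits in a single 64 KB segment. A quick sketch of the arithmetic (the helper name is made up; on real hardware you'd be writing bytes into the segment at `A000:0000`):

```python
# Why VGA mode 13h was "a simple write to the chip": 320x200 at
# 8 bits per pixel is one linear byte array, no planes, no pages.

WIDTH, HEIGHT = 320, 200

def pixel_offset(x, y):
    """Linear offset of pixel (x, y) from the start of video memory."""
    return y * WIDTH + x

frame_bytes = WIDTH * HEIGHT   # 64000 bytes -- just under one 64 KB segment
last = pixel_offset(319, 199)  # offset of the final visible byte
```

Mode X traded away that simplicity (it goes back to planes) to gain square-ish pixels, more vertical resolution, and page flipping.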
Good video, except those black lines on the left of the Atari 2600 screen aren't there because game code is running; they're a quirk of the video chip that appears when a player object is repositioned on the previous line and the write didn't happen on a particular clock cycle.
+MrTBoneSF - Interesting. I've always heard that it was a lack of CPU time. Not all games have the black lines, but that would make sense for why even a sprite is affected when it moves in that area.
+The iBookGuy Racing the Beam, if I remember correctly, has an explanation of this. Also, this question comes up a lot on the programming forum over at AtariAge ("why are they there?" and "how do I get rid of them"). Only games that have sprite reuse have the lines. Simple games with just two player sprites (like Combat) don't have to worry about the issue. Activision got around this by either having the kernel call HMOVE on the exact cycle (near impossible to do in general code) or just calling HMOVE on every line, so their games had a solid black border on the left. That area of the screen is usually in the non-playable space anyway, since they had to leave a border for overscan on CRT TVs, so there was no harm in "window-boxing" the game. If the game naturally has a black background, you also wouldn't see the lines until the post-game color cycling (e.g. Space Invaders).
+pocpic Not really; the visible screen was 160 color clocks wide, and the playfield (background) was only 20 bits, mirrored (or repeated) to form the right half. Sprites could be positioned anywhere, though. Most games had that full-screen symmetry, but you could do tricky things in the scan-line kernel, rewriting the playfield registers mid-frame to swap in new background graphics.
Darn, no mention of the Atari 8-bit computer's Display List Interrupts (which let you do some CPU-driven-style stuff, but without having to spend all of your time counting CPU cycles) -- color palette changes (useful for those typical rainbow color effects), repositioning sprites, etc. Also, on the Atari and many other systems, you had hardware support for vertical & horizontal smooth scrolling (so moving per-pixel, even in cell-based modes like a 40x24 text screen), redefined character sets (fonts, useful for tile-based games -- e.g., instead of using 8-9K for bitmap mode & moving tons of data around, you'd use 1K of graphics data + 1-2K of 'font' data), and the ability to point the graphics chips at different parts of RAM... on the Atari, even on a per-scanline basis. Practical example of combining a lot of these effects -- say you had a drawing that you basically 'compressed' into 4x8 multicolor cells, giving you around 160x200 pixels. It's only 1K for the font, plus 1K for the graphics data that defines which shapes are drawn where on the screen. But then say you only draw the top half of the picture. Use the handy point-wherever-in-memory (Direct Memory Access / DMA) feature, plus the "flip the characterset upside-down at basically no cost" feature, to draw the mirror-image on the bottom half of the screen. Then use some color palette changes to 'tint' your picture blue, and tweak the horizontal scroll value on each scanline along the bottom half of the screen. For less than 2KB you have a full-screen picture, in 8 colors (or more, if you do more of those per-scanline color changes), with an animated lake reflection effect. :) Such is the power of Atari's 1979 GTIA & ANTIC chips! They've aged quite well, I think!
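The memory budget in that mirrored-picture trick checks out. Here's the arithmetic as a sketch, using the usual ANTIC multicolor character-mode figures (4x8 cells, 40x25 of them, 128 shapes of 8 bytes each):

```python
# Back-of-the-envelope check of the "charset + mirrored bottom half"
# memory budget described above, for the Atari 8-bit.

CELL_W, CELL_H = 4, 8            # multicolor character cell, in pixels
SCREEN_W, SCREEN_H = 160, 200    # effective pixel resolution

cells_across = SCREEN_W // CELL_W         # 40 columns
cells_down = SCREEN_H // CELL_H           # 25 rows
screen_map = cells_across * cells_down    # ~1 KB of cell indices

font = 128 * 8                            # 128 shapes x 8 bytes = 1 KB

# Only the top half's shapes are unique; ANTIC's upside-down charset
# and per-scanline memory pointers redraw it mirrored for the bottom.
total = font + screen_map                 # squeaks in under 2 KB

full_bitmap = SCREEN_W * SCREEN_H * 2 // 8  # same screen at 2bpp: 8 KB
```

So the trick delivers a full-screen picture in about a quarter of the memory a plain bitmap would need, before even counting the free color and scroll effects layered on top.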
Great primer on old school computers! I miss those days (I programmed mostly for Atari 8-bit and Coleco). Atari called sprites "player/missile graphics". Four players 8 bits wide and 4 missiles 2 bits wide... but they were all 256 pixels tall (taller than can fit on the screen!). They only had registers for horizontal position and were vertically placed by shifting the visible image within that very tall area! It was odd, but it allowed tricks to vastly increase the number of sprites so long as only 4 appeared side by side at once: just define one player with multiple pictures within its 256-byte column, then change the horizontal position of the player mid-scan.

Atari 8-bit machines used "display list interrupts" to change not only the colour palette but even the display resolution, so you could have a section of text, a section of low res, a section of high res, etc. In some ways superior to what the C64 could do. The subject of how Atari graphics evolved (including how it influenced the Amiga) would be very interesting... from the Atari 2600 racing the beam and the first player/missile graphics, to the DLIs and modes of the 8-bit computers and 5200, to how the Amiga was developed by the same people and evolved it to create HAM mode, etc. Those pioneer engineers were amazing, with the creative ways they used limited technology to do so much!
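The vertical-positioning scheme described above can be sketched in a few lines. This is a conceptual model in Python, not actual hardware access: a player is just a tall column of bytes, one per scan line, and "moving" it vertically means copying its image to a different offset in that column.

```python
# Sketch of Atari player/missile vertical positioning: a player has
# NO vertical-position register -- it's a 256-byte column (one byte
# per scan line), and you move the sprite by relocating its image.

COLUMN_HEIGHT = 256

def place_player(image, y):
    """Return a fresh 256-byte player column with `image` at row y."""
    column = bytearray(COLUMN_HEIGHT)
    column[y:y + len(image)] = image
    return column

ball = bytes([0b00111100,
              0b01111110,
              0b01111110,
              0b00111100])  # a small 8-wide, 4-tall blob

col = place_player(ball, 100)  # sprite now spans rows 100-103
```

This is also what enables the multiplexing trick from the comment: stack several different pictures at different rows of the one column, then change the player's horizontal-position register mid-screen so each picture appears at its own x coordinate.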
Could not agree more with this. Nowadays they don't spend any time optimizing games/software, and it's even worse that this trend has reached smartphone/tablet technology. For example, the cheapest brand-new "Android" phone comes bloated with about 400 MB of software just from Google; install any other app and the phone freezes...
Actually, the C64 has raster interrupts, which are effectively the equivalent of DLIs on the Atari. It can (without any flickering) similarly multiplex its 8 sprites (each 24x21) using raster interrupts, among other things such as switching modes and changing color registers on specific scan lines--anything the CPU can do within the limited time available. I think that a great many Atari 8-bit owners/fans are unaware of this. The C64 is generally more powerful when it comes to sprites, although of course the Atari 8-bits have their own advantages over the C64. Both can do DLIs/raster interrupts, though. I'm unaware of other 8-bit computers that have this capability.
@@rbrtck That is true. The C64 has better sprites (more of them, plus multicolour support). ANTIC DLIs are a bit better than VIC-II raster interrupts because you don't need to involve the main CPU every scan line to change graphics modes, color palette, etc. You just set up the display list program and ANTIC does the rest. ANTIC is technically a full CPU itself, and the display list is a kind of program. As for other 8-bit machines that use similar tricks, the closest are the TMS9918/9928 and Yamaha 9938. For sprites, they actually only show 4 per line, but the hardware supports 32+ "virtual" sprites and repositions the 4 real ones (8 in the 9938) depending on the current scan line. That is why sprites flicker or disappear on the TI/Coleco/MSX when more than 4 (or 8) are side by side. Those chips only have interrupts on vertical blank, though; you need to poll for the current scan line, and they don't have hardware support for pixel-level scrolling, although the basic acceleration routines in the 9938 help with that a bit. It isn't really the same, though, since their sprite tricks are done in hardware rather than via raster interrupts.
@@markshanehayden4648 You're partially right about ANTIC, in that it can perform some of the dynamic screen operations on its own, albeit the more it does, the more cycles it "steals" from the CPU via DMA (the Atari 8-bit has a special version of the 6502 that can be halted for this purpose). I wouldn't call ANTIC a "full" CPU, though--it's more like a limited, special-purpose GPU, and DLIs that utilize the CPU are required for the more powerful, complex types of operations. While the design of Atari 8-bit's graphics subsystem is more elegant, in the end there is not much difference between what it and the C64 can do in terms of dividing up the screen. Keep in mind that the much-touted DLI function always interrupts and uses the CPU, and that the C64's raster interrupt does the same thing (the latter may be programmatically more clunky, but it's still fundamentally the same thing). Neither computer's CPU is "chasing the beam" from the vertical blanking interval (that's what ANTIC and VIC-II are for).
This series is awesome just found your videos and they are making me want to get back into programming. You are concise and informative while not becoming monotone or boring in the slightest. You got a me to sub after these two videos keep up the great work. I hope to see more of this series and will check out other stuff on your channel.
I love how programmers of the Commodore 64, years after its release, were able to get more colors and resolution out of it for static screens, not unlike the Amiga's HAM (Hold-and-Modify) mode, where it could produce 4096 colors on a static screen.
Would you consider doing a video like this for more systems like MS-Dos computers from the 90s, Sega Master System, Sega Genesis, Super Nintendo and comparing them?
The Mega Drive and Super Nintendo basically work like the NES but with more bits per pixel, and more and larger sprites. The Master System too, although it doesn't really use color palettes AFAIK, instead opting for a higher color depth in sprites and tiles (4bpp compared to the NES's 2bpp -- same as the MD/SNES). Old DOS PCs basically work like what he described in the first part of the first video -- the "unusable" method that used too much memory.
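The bits-per-pixel difference above translates directly into bytes. A quick sketch of what one 8x8 tile costs at each depth (the trade-off is memory versus colors per tile):

```python
# Storage cost of one 8x8 tile at the NES's 2bpp versus the
# Master System / Mega Drive / SNES 4bpp.

def tile_bytes(width, height, bpp):
    """Bytes of storage for one tile at the given bit depth."""
    return width * height * bpp // 8

nes_tile = tile_bytes(8, 8, 2)   # 16 bytes per tile
sms_tile = tile_bytes(8, 8, 4)   # 32 bytes per tile

nes_colors = 2 ** 2              # 4 entries per tile (one transparent)
sms_colors = 2 ** 4              # 16 entries per tile
```

Doubling the depth doubles the video memory every tile and sprite consumes, which is why the jump from 2bpp to 4bpp was a meaningful hardware-generation difference and not just a palette tweak.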
+Scott Blacklock EGA was introduced in 1984 and VGA in 1987, so not quite. Also, you know you could just pop in an EGA or VGA ISA card in something like the IBM 5150, right? Even 8-bit ISA sound cards like the Sound Blaster or Adlib will work fine in machines of that age.
These videos are wonderful! Thank you so much. I'm looking forward to seeing the video on game music. Do you think you could talk about SNES graphics, in particular Mode 7?
Great video. Clear and concise explanations. I still have my Sinclair Spectrum , got it to learn Z80 assembler back then. When I started as a young electronics technician, one of the hardest things was getting hold of information. Today we are so spoilt with the internet ! Look forward to more of your material
I vote for Amiga. There's loads of weird interesting hacks from the Amiga to talk about. All from 1985 too. HAM, Extra Half Bright, the Copper chip, the Blitter, bitplanes, reusing hardware sprites as you go down the screen. Also why it worked so well compared to a CPU bound Atari ST. This site details some amazing Copper chip tricks. www.codetapper.com/amiga/sprite-tricks/
The Amiga is the big brother of the Atari 800, its video chip having been designed by the same person (Jay Miner) who did the Atari 800's video chip. Atari eventually added a copper chip of its own.
Absolutely. Like this little gem on the Amiga 500 that uses HAM + Blitter objects and still has time to play the background music with sound effects: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-JJczdYO8N1c.html And this one for 1985's Amiga 1000: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-XDdMoglmUbs.html
These videos are so interesting! I definitely have more respect and appreciation for the programmers/designers that have come up with clever ways to circumvent these hardware limitations
+Gaming Jay Agreed. I would especially love to read about the register-level programming for these boards. The CGA card, in particular, could be programmed for all kinds of funky modes that IBM never intended. I remember one package we used in college (Micro PLATO, a coursework package) which tweaked CGA registers to produce a mode with fewer pixels horizontally, and more vertically. This often messed with the monitor's ability to sync, so we'd have to twiddle the vertical-hold knob in order to get a stable picture when using the program. There were other non-standard modes that were used by various games (like ElvenSoft's Round42). Sadly, very few of these games would play on more modern video cards, because they were using undocumented register-level hacks to produce the video modes. (I remember my ATI VGA Wonder card included a small application to put the card into a CGA-compatible mode that would let it support these modes. IBM's VGA products didn't have the ability at all.)
+Shamino0 that's neat. I find video and memory hacks very interesting. Reminds me of back when I used to code in Turbo C and had to work out all kinds of little hacks to get my programs to run
Great video! The previous one seemed more comprehensive, though this one still has a lot of golden information. Really looking forward to the next one. And +1 for the CGA, EGA and VGA episode!
+Fredrik Jagenheim The Wikipedia pages do a pretty good job. It's one of those modes, like Apple II double-hi-res, that is easier to describe (and implement in hardware) than to write software for. en.wikipedia.org/wiki/Original_Chip_Set#Denise en.wikipedia.org/wiki/Hold-And-Modify
You should do an episode on programming languages back then. What was used for the Apple II, the Atari 2600, Nintendo, Sega master system. What did the IDE look like? I have searched for Nintendo code but could not find anything. It seems that nobody has open sourced their games from that era.
NES games (and games for other computers and consoles from the 8-bit and 16-bit era) were almost always programmed in assembly language, which is basically giving instructions to the CPU directly (whereas code in C etc. has to be compiled first).
I've done a few videos with AppleSoft basic using an emulator and screen capture. I do have images for C and a few other languages. I also have a II GS emulator. Anything in particular you want covered?
There are different "levels" of code. First is "machine code", the lowest level. It's literally nothing but the raw bytes (often written in hexadecimal) that the CPU executes directly. It's the hardest to write by hand, but nothing has to translate it, so it takes up the least RAM (and ROM space); it was commonly entered directly on computers of the 50s and 60s.

Next is "assembly". These are machine-specific mnemonics--actual short words instead of raw bytes--that an assembler translates one-to-one into machine code. Assembly was still difficult to learn and master because it reflects how the CPU operates rather than how human minds do, but that is also what made it extremely efficient with RAM, so it dominated on old systems that didn't have much memory.

Next are the languages you might be more familiar with, like C, Java, and BASIC. These are designed to run on as many machines as possible and, with a few intentional exceptions, are modeled more on how the human mind works, which makes them easier to master than machine code or assembly. The trade-off is that the code has to be translated into machine code before it can run--by a compiler ahead of time, or by an interpreter while the program runs, which is more taxing on the CPU. Finally, you have programs that convert user input directly into usable code; examples include Raptor and WarioWare D.I.Y.
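The assembly-to-machine-code relationship above can be made concrete with a toy example. The three opcode byte values below are real 6502 encodings; everything else (the table-driven structure, the function name) is an illustration, not a real assembler, which would also handle labels, expressions, and the full set of addressing modes.

```python
# Toy assembler for three 6502 instructions, showing that each
# mnemonic maps to raw bytes the CPU executes directly.

OPCODES = {
    ("LDA", "imm"): 0xA9,   # load accumulator, immediate value
    ("STA", "abs"): 0x8D,   # store accumulator, absolute address
    ("RTS", None): 0x60,    # return from subroutine
}

def assemble(lines):
    out = bytearray()
    for mnemonic, mode, operand in lines:
        out.append(OPCODES[(mnemonic, mode)])
        if mode == "imm":
            out.append(operand)           # one operand byte
        elif mode == "abs":
            # the 6502 stores addresses low byte first
            out += operand.to_bytes(2, "little")
    return bytes(out)

# LDA #$07 / STA $D020 / RTS  ->  the bytes A9 07 8D 20 D0 60
code = assemble([("LDA", "imm", 0x07),
                 ("STA", "abs", 0xD020),
                 ("RTS", None, None)])
```

(On a C64, those six bytes would set the border color register at $D020 and return; the mnemonic form and the hex form carry exactly the same information, which is why hand-assembling on paper was feasible, just tedious.)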
Most 8-bit games back at the time were coded in assembly language. This means they used a so-called "assembler" (mostly a "macro assembler", which was a bit more comfortable). Some games were also coded in BASIC or other languages and then compiled (mostly strategy games etc.), but most action games were done in assembly language. Very CPU dependent! The N64 was mostly coded in a "higher level" language like C or C++, to my understanding, but for some time-critical things you still had to use assembly language. The problem isn't the 3D per se--it's that the bigger a project gets, the more time-consuming programming in assembly language becomes, and as the machines get faster you don't need the fastest speed for everything.
@@oldguy9051 A bit later on, many developers did their work on more powerful 16-bit computers, and used cross-assemblers and even cross-compilers (for languages like C) for the 8-bit computers. Additionally, BASIC was sometimes used wherever they could get away with it, and compiled BASIC for less CPU-intensive parts of games wasn't unheard of. But yeah, a lot of assembly/machine language was necessary to get the most out of the 8-bit computers.
Please do the Amiga too! The Amiga had the most wonderful graphics of any machine in the late 80s, and had things like the blitter, HAM6, EHB modes, multiple simultaneous resolutions, planar graphics. The list goes on!
@@Mikebumpful No, he's right. The Amiga really is the direct relative of the Atari 800: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-WUQ1mcyYbdk.html
+Bluewave256 Agreed. I had to keep turning up the volume during scenes when the iBook Guy was off camera, but then had to hastily turn it down when he was back on camera. This was something that should've easily been caught and fixed in post-production.
I'm more bothered by the environment change (going from an echoey room to a more silent room for voiceovers) than I am by the volume changes. I'm an audio engineer and I find the audio in most channels' video to be lacking. It's almost an afterthought.
4:09 - I don't think that on a modern system it would use 90% of the CPU. 1920x1080 = 2,073,600 pixels. Every pixel has three 1-byte colors (RGB). So 3 * 1 * 2,073,600 = 6,220,800 colored subpixels per frame. Most monitors are 60 Hz, so 6,220,800 * 60 = 373,248,000 calculations per second. Is that much for a modern 4-8 core 3-4 GHz CPU? It would take maybe 10-20% of one core. Correct me if I'm wrong.
The 90% came from the structure of the video signal: about 90% of the time the beam is visible and 'draws' the picture, while the other 10% it's off performing 'flyback', returning from right to left and, at the end of the frame, from bottom to top. That time between frames is the best for actual computing, because your code doesn't get interrupted after just a couple of microseconds. I highly doubt a modern CPU could switch context after sending out each pixel; that's a pretty slow operation (push all registers to the stack, change SP, pop registers from another stack, etc.) and would easily take 10 clock cycles, which is about all that's left per pixel. I'm not even sure 10 cycles is enough per pixel, because the CPU would be doing at least some clever decoding (e.g. a JPEG image on the fly), so it's not as simple as a mov from memory--otherwise we wouldn't need this CPU-assisted mode at all! So it's safe to think one core would be 90% busy drawing graphics even now.
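The raw numbers from this exchange are easy to verify. The 10-cycles-per-byte figure in the last step is the estimate from the reply above, not a measured value:

```python
# How many bytes per second a CPU would push to fill a modern
# framebuffer entirely in software, per the thread's arithmetic.

width, height, bytes_per_pixel, hz = 1920, 1080, 3, 60

bytes_per_frame = width * height * bytes_per_pixel  # 6,220,800
bytes_per_second = bytes_per_frame * hz             # 373,248,000

# At ~10 cycles of real work per byte (decode, copy, composite),
# that's ~3.7 billion cycles/s -- roughly one whole 3-4 GHz core.
cycles_estimate = bytes_per_second * 10
```

Both comments agree on the bandwidth; they differ on how many cycles each byte really costs, which is exactly where the 10-20% versus 90%-of-a-core estimates diverge.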
dude. this was invaluable for a nerd like myself. I've always been terrible with coding but I am always curious and wanting to learn. This broke everything down to such an easily comprehensible video. Thanks for your time!
+Jim Leonard Yes, for NTSC it happens 60 times a second; for PAL it happens 50 times per second. That is why the speed of the 6510 CPU in the C64 also differs between NTSC and PAL. But the television only updates full frames at 30 and 25 per second, using interlaced fields that alternate which lines are updated. When hooking up an old system to a new monitor you can clearly see this interlacing going on, resulting in a jagged look for animation.
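The NTSC/PAL difference mentioned above can be put in numbers using commonly cited C64 video timings (cycles per scan line times lines per frame); the figures below are the usual published ones, worth double-checking against a hardware reference:

```python
# Rough C64 frame timing: why NTSC and PAL machines run at
# slightly different effective speeds.

NTSC = {"cycles_per_line": 65, "lines": 263, "fps": 60}
PAL = {"cycles_per_line": 63, "lines": 312, "fps": 50}

def cycles_per_frame(sys):
    return sys["cycles_per_line"] * sys["lines"]

ntsc_cycles = cycles_per_frame(NTSC)    # 17,095 cycles per frame
pal_cycles = cycles_per_frame(PAL)      # 19,656 cycles per frame

# PAL refreshes less often but gives more cycles per frame:
ntsc_clock = ntsc_cycles * NTSC["fps"]  # ~1.03 MHz
pal_clock = pal_cycles * PAL["fps"]     # ~0.98 MHz
```

This is one reason games written against one region's frame rate ran visibly faster or slower (and with different music tempo) in the other region.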
This is really interesting. Looking forward for more. Excellent video and as a teacher myself, I can tell you that your explanations are clear and concise. Thanks again for the videos and keep up the good work!
Hi, I found this video really interesting and informative! I like the way people come up with really inventive ways to manage resources. Also, I like the new channel name; it makes sense with the info you're covering now.
+AcrOfSpades We don't have these memory limitations anymore, bruv. Game design is a joke these days compared to what the smartest engineers did back then in order to produce a proper game.
Tim Lesher If you compare today's growth in performance of what's considered a "standard" computer with the graphics of video games, you will notice that the demand for new hardware bears no relation to the "improvement" of the games. Games from 2010 look pretty much like games today, but now compare a 2010 computer's specs with today's. It's a shame that, instead of writing efficient code, developers would rather make people buy better hardware so they can write even messier code.
Tremendously awesome videos! Fascinating information, but I got super excited when I saw Ultima. I can honestly say that those games are what got me excited about computers and quite literally steered me toward my career and shaped my life.
Many CPUs have a small section of silicon dedicated to graphics--really a small integrated GPU. Of course, even those are leagues ahead of dedicated GPUs from decades ago...
My brother's notebook has an Intel Core i5 CPU @ 2.4 GHz with 4 cores, and an Nvidia GTX 520MX GPU. For a long time it used the CPU for graphics too, leaving my brother swearing at the low performance of games, until he found out about it and switched to "full performance" mode to use the GPU as well. So yeah, some notebooks ignore the GPU by default. I think my dad's netbook (or some really small notebook, anyway) doesn't even have a discrete GPU at all and uses the CPU in the same way.
Make the additional facts, please! I love this series! This is some really interesting behind the scenes stuff. I want all your graphics and game knowledge!
I am loving these videos. A lot of answers to things I always wondered! Whatever you cover next will definitely be interesting because this stuff is just fascinating :D
I really appreciated what you posted in the last video about sprites. Right before you covered it, I was typing up a question on how they were able to move above an area that was only allowed 2 colors :D
Seeing CGA, EGA & VGA graphics would be awesome! Just wanted to also say I love these videos! They are very informative and I enjoy learning about different types of graphics.
You're one of the smartest guys on RU-vid, in my opinion. Watching some of your other videos, you're able to put new things into an old product to improve it, and you're able to take things apart and fix them. I apologize for not being specific--there are many different videos--but regardless, it's all rocket science to me at points. I can't imagine the schooling you must have gone through to know all of this.
Nice video. It has me all nostalgic. I remember Richard Garriott explaining colours on the Apple ][ and how he drew the Ultima shapes on graph paper to map each pixel and colour.
Haha, oh wow... I got distracted by the awesome tracker music in the previous video (Part 1), and spent an hour downloading all my old favourites before finally returning to this video. Love hearing the tracker music in your videos.
Thanks man. My 8, 12, and 15 year old loves these vids. It got my 12 year old girl excited enough she is actually playing with simple graphics now on an Arduino and 8x8 matrix display. She is really starting to appreciate what you were saying as a result.