I'm taking an Atari programming course at my university, and holy shit, this guy is a god. What he was able to program on the Atari platform just blows my freaking mind.
This man and all the other nerds willing to count bytes and cycles to create games that have been exciting us for decades deserve every bit of credit given to them. I can't imagine what would have happened if Activision had never been founded. Hearing Mr. Crane talk about the limitations of early game consoles makes me want to see people like him really push modern hardware to its limits. Thank you for being the nerd that you are. Here's to all the other programmers, story writers, scientists, hardware designers and creators of art: write your names on your work so people will know who you are. You deserve that.
Sitting here with cheap DIY game kits, on computers that would laugh at a Cray from the 80s, I salute this man who made these incredible games on such a tricky, limited system way back when, and paved the way for said computers/kits/improvements!
I've written games and other software for the Atari VCS (my most recent being an algorithmic art piece called Transmat, released last year). And I have to tell you, of all the environments I've ever written software for, the VCS is one of the most rewarding, because of the tight coupling of the processor and display logic. It's hard to explain to people just how _immediate_ the effect of writing a register on the TIA was, and therefore how aware you had to be of the execution time of each and every instruction during the visible part of the screen (76 cycles per scan line, 2-7 cycles per instruction, so 15-30 instructions per line: make them count). You get a 40-bit playfield (only 20 bits of storage, split in half, repeated or reflected), two 8-bit objects, and three 1-bit objects on a line. You get 128 bytes of RAM to store _ALL_ program state. You have no interrupts; your only reprieve is WSYNC, a register that asserts the ready line on the CPU, causing it to wait until the current scan line has completed. Any and all vertical operations had to be done in software, changing the data in the registers BEFORE the TIA had the opportunity to shift them out to the display. You don't even have binary counters for horizontal positioning, only polynomial counters exposed as strobes, which you had to hit at the exact moment you wanted an 8-bit or 1-bit object to display. Only one problem: the best you could hope for from a typical timing loop was 5 cycles, which means you could only strobe in 15-pixel increments, so you then had to set a horizontal motion register and strobe HMOVE to bias the object -7 to +7 pixels. I could go on, but the point is, it's a well-defined, constrained universe, with enough open-endedness that any software engineer truly worth their salt can do something really great.
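The coarse-strobe-plus-HMOVE arithmetic described above can be sketched in a few lines of Python (this is my own illustrative split of a target column into a 15-pixel coarse position plus a small fine bias; it is not TIA register code, and the function name is made up):

```python
def plan_horizontal_position(target_x):
    """Split a target pixel column into a coarse strobe position
    (15-pixel granularity, since a 5-cycle loop = 15 color clocks)
    plus a fine HMOVE-style adjustment in the -7..+7 range."""
    r = target_x % 15
    if r <= 7:
        coarse = target_x - r   # strobe lands just left of target
        fine = r                # nudge right
    else:
        coarse = target_x + (15 - r)  # strobe lands just right of target
        fine = r - 15                 # nudge left
    return coarse, fine

# Example: to place an object at column 53, strobe at 60 and bias left 7.
print(plan_horizontal_position(53))  # (60, -7)
```

Every column from 0 to 159 is reachable this way, which is exactly why the two-step dance (coarse strobe, then motion register) was enough despite the 15-pixel strobe granularity.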
My first programming experience was with a Vtech Pre-Computer 1000, which, if you don't know what that is, was a training tool launched in the 80s to teach kids how to use computers. (My mom picked one up at a garage sale.) It had a monochrome LCD screen which could only display characters (and only 20 at that!). It had some built-in games like hangman, science/math/history/general knowledge quizzes, a calculator, and even a slot for expansion cartridges. But by far its best feature was a mini BASIC programming environment. It gave you about 2KB of storage to write your entire program, and I figured out all the tricks to squeeze everything I could out of those 2KB. (My best program was a memory match game, and I had a lot of fun with random story programs.) That said, the Atari 2600 sounds like a LOT of fun to program, and I'll have to give it a try!
This man is a complete legend who made our lives entertaining in 1982 with Pitfall for the Atari 2600. Look at us now, playing PC games on the internet with others!
I was one of those people who spent hours meticulously drawing their own map. I think I was the only person in our family to finish Pitfall, and it took AGES to do. I would dream about it. Thanks for the months and months of entertainment, Mr. Crane!
The amazing thing about 2600 programmers isn't that they had to work with a machine that lacked a display chip (it had one), it's that they made so many games with a display chip designed only for Pong and Combat -- so limited you could call it a half-scan-line buffer. The title of one book about the 2600 is "Racing the Beam" -- there weren't nearly enough clock cycles to both fetch graphics from memory and update the display registers, so coders invented a plethora of incredibly clever hacks.
I recall first seeing a screen shot of "Pitfall" in an Activision catalog a few months before it was released. I was stunned by the graphics. Home video game graphics were still rather blocky back then and each new Activision game looked better than the last. The color palette was beautiful.
@joaovictor1994 I think we loved the OLD Activision... the company today should just be called Call of Blizzard, since all they do is let Blizzard make money for them, push Call of Duty, and let every other project fall away and rot like Bizarre Creations.
My god do I love making maps for a game. Every time I find an old game that requires it, I get excited. I wish more modern games required a bit of cartography. Instead people generally consider it a negative if a game doesn't provide an in-game map. People praise Metroid: Zero Mission for adding a map for example, but drawing the map was possibly my favorite part of the original. At least we have the Etrian Odyssey series.
Haven't really followed the discussion, but as this argument has always irked me: A stated purpose is never a justification. If I start a club with the stated purpose of killing puppies, that won't excuse any ensuing puppy-killing. Saying that "corporations exist only to make money" as if that justifies the ways in which a corporation might go about that goal is the exact same thing.
56:38 - I remember going into a store in 1985 and buying Frogs and Flies and other Mattel games for 1 dollar! Yep! I couldn't believe my eyes either! I was 12, btw.
Check out "Muncher" (a Pac-Man clone) for the Bally Astrocade home console. The console only had enough RAM to draw the screen, so they had to pack the game code into 2 of the bits in each byte that was being used to draw the screen.
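The trick of hiding code bits inside pixel storage can be shown in miniature (the 2-bits-per-byte layout below is my own illustrative choice for the sketch, not the Astrocade's actual scheme):

```python
def pack(code: bytes, screen: bytearray) -> None:
    """Hide each code byte across four screen bytes, two bits at a time,
    in the low bits of each screen byte; high bits (the 'pixels') survive."""
    assert len(screen) >= 4 * len(code)
    for i, b in enumerate(code):
        for j in range(4):
            two = (b >> (2 * j)) & 0b11
            screen[4 * i + j] = (screen[4 * i + j] & 0b11111100) | two

def unpack(screen: bytes, n: int) -> bytes:
    """Recover n hidden bytes from the low bits of the screen buffer."""
    out = bytearray()
    for i in range(n):
        b = 0
        for j in range(4):
            b |= (screen[4 * i + j] & 0b11) << (2 * j)
        out.append(b)
    return bytes(out)
```

The cost, of course, is that the hidden bits show up on screen as visual noise unless the graphics are chosen to mask them.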
52:02 LOL it makes total sense that the Pressure Cooker tune was made by a composer with 9 notes to work with. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-qtaDzgFRhuQ.html
Thanks to David and his coworkers at Atari and Activision for creating groundbreaking games that became a big part of our lives growing up in the late 70's & 80's!
I wish someone would've asked if he had to take the same approach with the ColecoVision & Intellivision ports of Pitfall, or were there fewer limitations to work around?
I believe the episode of Heckle and Jeckle that Crane is referring to is called The Lion Hunt. I noticed it's not the only episode to feature alligators.
Great video. Very educational and inspiring. Hope David Crane gets around to making the source code available, or an educational iPhone app explaining the technical nuts and bolts of this game (Pitfall) one day.
Interesting that he thought adding lives was a good idea in the end. It's nice to know some game designers don't want the experience to be as frustrating as possible.
@15:05 What can the Atari 2600 do? ... There's a great interview of Chuck Jones speaking on discipline of animation.. Discipline meaning, what are your constraints? How are you limited in the telling of a story? This works across all mediums.. Audio, film, etc... What do you have to work with? Make it so.. Usually ends up in a sweet, sweet product.. Simple, clear, and defined. .. Thank you for sharing this talk!
Notice that even the Atari 2600 uses a "GPU" to render the actual graphics. I know the CPU is very busy controlling every single pixel, but I'm wondering if there was ever a system with no display chip at all, just a CPU with direct video output...
David Crane clearly explained what he and the other Activision programmers had to do to create enjoyable games within the limits of the Atari 2600.
When I was a child I would look at my Pitfall manual, see David Crane's photo, and always wonder who this man was that looked like Bruce Jenner. I am very grateful to have heard this presentation. Thank you for sharing it.
I did not imply that. It's the difference between "companies exist to make money" and "we value the money we save not equipping our ships with lifeboats more than we value the additional security that would bring to our crewmen". The former is a stated purpose, the latter a justification. Whether you agree with that justification is irrelevant for the example, but I feel people often avoid justifying themselves through meaningless statements that logically amount to "we do it because we do it".
@maiki60fps Yes, the Apple II, that would be the closest thing. The CPU would put the pixels into memory, and as the television scanned the beam, the Apple would sequentially dump the RAM to the television set. A first example of bitmapping. The stuff that does the scanning isn't really a GPU either. It's a handful of discrete logic gates, at best. Nothing more than a counter and a DAC.
@maiki60fps There always has to be hardware that takes the screen memory and turns it into a TV signal, because TVs can't read from RAM - but the hardware that does this is so simple that you may as well imagine that's what's happening.
@maiki60fps The ZX Spectrum did all of its graphics by the CPU writing directly to screen memory (from your profile I can see you're familiar with that machine!). And really, most DOS games were made by the CPU writing directly to a block of screen memory.
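That "CPU writes directly to screen memory" model boils down to a single store at a computed offset. Here is a minimal sketch, with a Python bytearray standing in for a VGA mode 13h-style framebuffer (320x200, one byte per pixel; the layout and names here are illustrative assumptions, not any particular machine's memory map):

```python
WIDTH, HEIGHT = 320, 200           # classic mode 13h dimensions
framebuffer = bytearray(WIDTH * HEIGHT)  # stands in for screen memory

def put_pixel(x, y, color):
    """All 'direct to screen' drawing is just this: compute an offset,
    store a byte. The video hardware scans the buffer out on its own."""
    framebuffer[y * WIDTH + x] = color

put_pixel(10, 5, 15)  # set pixel (10, 5) to palette color 15
```

The appeal for game programmers was exactly this simplicity: no API, no display chip to negotiate with, just arithmetic and stores.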
You imply that making money is an inexcusable crime. Unlike puppy killing, there are no real irreversible consequences. They make games. Someone else can make better games.
Man, this guy is monotone and shows no positive emotion, even when he's describing how successful it was... Pitfall is cool, but he sounds depressed about the whole thing!