"The very uselessness of the Altair is what drove the hobbyists together." Yep. I was a new student at a DeVry school in Toronto in 1982, and one guy who'd homebrewed an Altair-based system that could run CP/M had brought it into the school to do some work he couldn't do at home... and I was there the day he managed to prove his idea that an Altair with 32K of RAM could manage time-sharing by running two completely separate CP/M operating systems - each with its own memory and I/O space - with time slices allocated by a simple timer. For a newbie, it was pretty heady stuff.
@D HUH??????!?!?!? I don't own any Apple products and don't care for them. But last time I checked, aren't they the first TRILLION DOLLAR company? They lost half of their wealth back in the day with Gates, no? Now I think they're doing too well and have too much power with their sheep followers.
I'm 66. It was the most fun I've ever had. A moment in time that will probably never happen again. The only regret is I was so young I didn't know it was special. I thought most people did what they loved and worked with very bright and creative people. If I had a time machine, I'd go back... over and over.
I'm just 3 years younger than you... but I didn't think myself too young, and I certainly *knew* it was special. I was an 8th grade "computer science" student (a new program that year in my suburban Montreal high school) when I was 12 in 1972... and being connected to computers (originally through DTSS at Dartmouth) was a revelation.
I agree completely. Some people (relatively few) exude honesty and class. Woz is one of them, and one of my all-time favorites. His autobiography, iWoz, was one of the most fascinating tomes I've ever had the pleasure of reading.
I bought my very first computer in 2000, from a Salvation Army store. It was a Packard Bell 486DX2/66 MHz. It came with a monitor, keyboard, mouse, 14k modem, 340 MB hard drive, and 1 MB of memory, with Windows 3.1 already installed. WOW!! Cost me $50. A brand new one in 1996 cost $1,300.
I remember this documentary, it's a blast from the 90s. Good stuff. I also love The Machine That Changed the World; part 1 can't be viewed because someone complained and ruined it for us all over copyright.
@@JimTheKid Capitalism isn't an ideology, so it isn't going to be taking it over. Capitalism can work together with communism, which is a philosophical ideology.
I knew a 'nerd' who didn't like drop-down menus... he said they "took the mystery and the secrecy out of running a computer". My abrupt reply to him was, "Good, about time!" It was as if they made fun of all the commands he used to type in. I just reminded him, "We're in the 20th century, Peter." I'm glad of the drop-down menus... they made my life easier. Drop-downs revolutionized personal computers. It certainly put paid to the Commodore VIC-20 I used to own. My next one was an Atari 520ST, then the 1040. It wasn't compatible with other computers, but I used it for a lot of things, like typing songs out and sending them to my little Star dot-matrix printer. Oh, those were the days. Then Microsoft came out with Windows 3.1, and personal computers really came alive. The rest is history.
Most excellent and historically correct documentary on the PC. I bought an IMSAI 8080 PC kit in 1977. There was no OS for the IMSAI, only the BASIC language, which booted from the front panel and ran directly on the hardware without an OS. I wrote a disk OS for the IMSAI called FDOS, plus a language application for a manufacturing CNC interface, and showed it in a booth at the International Machine Tool Show in Chicago in 1978. I love PCs; I don't love social media, the cloud, and Windows 10. Ask me why.
@@rarRoarrar I currently develop manufacturing software applications for Windows on a 32-bit IBM PC with Windows XP that is NEVER connected to the internet. I also use a 64-bit Acer system with Windows 7 for system testing that is also NEVER connected to the internet. Both systems have run flawlessly for over 20 years without any virus protection software or software updates. I used a separate Windows 10 system, with virus protection software, for internet communications to send and receive client data. That Windows 10 system would typically work for only 1-2 years before becoming corrupted and would have to be completely purged and reloaded to its original state. I finally replaced the Windows system for internet communication with a Chrome system without virus protection, and it has been working now for 4+ years without purging and reloading. Although I constantly get and delete spam e-mail, including e-mail from virus protection companies notifying me that my (Windows) system is full of malware. I never use the cloud, and do all my social interactions on my mobile phone using text messages. Best regards, Larry
The only thing I find kind of sad and unfortunate about the evolution of technology is that back in the day someone could start a company with a small game made by a few people in a small room, as long as they had the drive for it, and the same went for computers. Now that it's all become so advanced, that's just not possible anymore :/ Games are so big and next-gen now that you pretty much need a big team to get anywhere, unless you're fine with just being that small indie dev that someone might have heard of.
2023. As a touring musician who is stepping into programming, I appreciate the Gibson Les Paul (the guitar behind Woz) so purposefully set in frame. 🍻 💻
Maybe you need menu options, browser, perhaps, to adjust documentary mirrors; kaleidoscope is only one set of settings capable of such options; other observatories exist; multithreading, you see; html...
This program would have been soooo much better had this clown just told the story and presented the interviews without all of the dressing up and parading around as though he was part of this. God save us from reporters and their egos.
I loved these videos when they were new. Don't even get me started on "Computer Chronicles"; best tech show ever! Nowadays, computers are mostly appliances... or worse yet, TOYS. 25+ years ago, I could not wait to get home from work and get on my computer, and log onto Prodigy. These days, I have a computer in my pocket all day. What went wrong?
@Jon I have observed that there are times in life when people don't adapt to changes, for multiple reasons. It isn't always about people being older.
@Jon Relax, he's just reflecting on how we take computers for granted nowadays. It's possible to appreciate today's technology while being wistful about how everyday it's become.
@@herrfriberger5 Yes, and one of the best CPUs ever; its assembler is so easy to program. Today's x86/x64 CPUs have thousands of instructions, and ARM CPUs come close these days, but maybe I never had the chance to program them in assembler.
There's an error at the 10:40 mark where the host speaks about high-level programming languages. He states that FORTRAN followed COBOL. This is incorrect: FORTRAN was released in 1957, COBOL in 1959 (and I think officially standardized in '61 or '62), and BASIC in 1964. FORTRAN was suited to performing complex calculations, so it was the natural progression from machine language for scientists and engineers at the time. COBOL was built for writing business applications, and BASIC, well, that was an all-purpose language (which back then compiled down to machine code; later versions developed by Microsoft didn't compile but ran interpreted, very much slower).
And these two films (parts one and two) are an ABC of today's situation. I think all of us, our friends, and above all our children should watch this and share it with others. It's important to KNOW where we live and what for, and also who's dealing the cards in the "game" for our minds ;) . After all these years, still relevant and motivating material. Huge "like" from me :) .
Mainly because at the time there were no college classes that truly taught what they were doing. They invented what they were doing. They were truly the pioneers of their craft.
We all love a great myth, but it's a bit false to portray them all as tech geniuses who "didn't even go to college" yet had a "vast knowledge of computer engineering and software programming". Consider that the very computer chips they were using had been developed by engineers with bachelor's, master's, and doctorate degrees in engineering, and that the underpinnings of the computer science they applied came out of decades of research and development in university, government, and private enterprise, where mainframes and what were called minicomputers led the way. Bottom line: there were thousands already employed who had vaster knowledge of computer engineering and software programming. The "genius" was in the application. FWIW: Bill Gates was a National Merit Scholar when he graduated from Lakeside School in 1973. He scored 1590 out of 1600 on the Scholastic Aptitude Test (SAT) and enrolled at Harvard College in the autumn of 1973. He chose a pre-law major but took mathematics and graduate-level computer science courses.
We all love a good myth, but most of them did not have "vast knowledge of computer engineering and software programming". Their genius was in applying newly developed technology to start the personal computer product segment. It should be noted that the computer chips and integrated circuits they were applying had been developed by engineers and scientists with bachelor's, master's, and doctorate degrees at companies such as Intel, Fairchild, Signetics, Mostek, Motorola, Texas Instruments, and National Semiconductor. The foundations and tools of computer science and computer engineering had been developed in the prior decades through university, government, and private industry research at companies such as IBM, Burroughs, Honeywell, CDC, Digital Equipment Corporation, et al., with the vast majority of those computer scientists and engineers holding bachelor's and advanced degrees. Mainframes and minicomputers were there as models for the "personal computer" hobbyists and product pioneers. For many years, the VAX-11/780 minicomputer was the benchmark personal computers were compared to. Bottom line: when it comes to "vast knowledge of computer engineering and software programming", there were easily thousands with vaster knowledge in those areas at the time. The "genius", vision, and wealth came from applying the technology and recognizing the opportunities much more than from engineering and programming skills. I.e., much more entrepreneurial genius than technical genius.
You're the best of the bunch, along with the late Steve Jobs, Bill Gates & Steve Wozniak - our heroes. And I love Coca-Cola and pizza with science class, with the American freedom ✔
I cannot believe that one of the first statements made here is that Paul Allen and Bill Gates invented the personal computer. That is a completely false statement. As he also mentioned, they started a SOFTWARE company, not a HARDWARE company. I am pretty sure Microsoft never built a computer until, I think, the Xbox in 2001 and the Surface in 2012. They were of course focused on software. They brought BASIC to almost every system and created operating systems to run on some of the first personal computers. So, to say they significantly helped build the personal computer industry would be a more accurate statement.
I believe you are correct. They did not even create the BASIC they were installing on every system in the early days. They stole Tiny BASIC, made a few changes, and made a fortune!!!
He never said they invented the PC. He said young men LIKE Gates and Allen did. He used them as an example because at the time Microsoft was a huge part of the PC industry and they both played a huge part in creating it, although they never made any hardware. Of course this was all explained here if you cared to listen.
@@JaredConnell That is the same as saying young men "LIKE" Jared Connell and Jeff Nay invented the personal computer, as they were guys just like us. I guess he could have simply said that the personal computer was created 20 years ago by a bunch of geeks in their garages, although it is more like 47 years ago now. I would prefer to say that people like Federico Faggin (8080/Z80) and Chuck Peddle (6502) actually helped invent the personal computer, as they created the actual CPUs that made things like home computers and video game systems possible. Neither one was mentioned. I have one of those first home computers, the Altair 8800, and of course the magazine that goes with it... although I look at the Micral N, a French computer based on the 8008 CPU that came out in early 1973, as the first home computer. It looks like Roberts may have based the Altair 8800 on this computer, as they look very similar... so describing the Altair 8800 as the oldest personal computer in the world "may" not be entirely accurate either. Thank you for reaching out. I did end up watching the entire video. I am a vintage computer collector and have all the computers mentioned in this video, as well as VisiCalc and Lotus 1-2-3 and every version of DOS from 1.03 to 5.
The early nerds realized nerds as a whole were gaining too much power and were too unregulated so they made sure to turn the nerds of tomorrow into lazy bums, with lesser IQs that just watch marvel and play video games. We live in a Nerdarchy.
Geeks from the 70s aren't much different from geeks today. They may have spent a bit more time outside and eaten a little less, but ultimately they're pretty much the same.
In 1989, My Second Quarter Electronics Teacher, Russ Spitolnick, Was Asked A Simple Question..."What Do You Think Will Be The Most Important Development In Future Personal Computing?"... Russ' Answer Was Immediate..."There's No Question About It...The Future Of Computers Is In Communications..." What, After All, Is YouTube And Facebook, BUT Communications?..Russ Was A Prophet, And I Thank My God That He Was My Second Quarter Electronics Instructor At New England Tech, West Palm Beach, Florida, Way, Way Back In 1989.. Before YouTube, Before Facebook.. Before ANY Of Us Knew What The Hell Was REALLY Going On In Electronics.... Russel Spitolnick Knew..Don't We Feel Better, Now?.....
aaaaaahhhh , the good old days ; my father got me hooked on those ; his first computer was a ( ??? ) , i have no idea but it had to be housed in a 3 story building and he used metal pins to actually program it ; he was SO glad when they went to punch cards ; for myself my first one was an 8088 programming machine language and basic ; frig binary programming 😵💫😵 ; then i went to the TRS80 model 3 and started learning cobol and fortran ; then progressed from there --- good times 😁
Anyone remember MICRON computers?? I recall them being among the best at one time. My dad bought a new one with the Pentium 90 and we thought we were really something lol. It was a big step up from the legendary 486. Of course, it wasn't long before the Pentium 100 came along and knocked us off our pedestal. Obsolescence is still a bitch today lol 😂 Of course this was the early 90s; I'm too young to remember the 80s. I do recall using DOS when I was very young to load up a chess game. We had a Battle Chess game, and you had to use those old floppy disks and DOS to load it.
I'm still watching this today. I wish I still had a CRT for all my old 4:3 games and programs. A 3.4 GHz CPU and 8 GB of RAM may seem like a lot compared to back then, but today's Intel i5s and i7s blow my computer out of the water.
+Matt Brine 15 years ago I had an 8088 and an 8086 (as well as state of the art computers), and my kids (in their 20's now) still miss the games Mars, Iceman, Moon Landing, etc, etc. Playing those games now is tough, and they end up being tiny on the monitor... IF you can find them.
You still can't beat the CRT for multiple resolution capability! The one way the LCD is still behind. I just ordered a new (to me) CRT - eBay has some great stuff!
+thomase13 Yeah. Amazing how bad the LCD screens are as far as colors. They change like crazy depending on the viewing angle. But I do like their light weight. You and I both know that they will get this all figured out. The advances in the last 25 years have been out of this world. I have a powerful computer in my pocket!
"at one time he ate nothing but fruit" Well, sounds like the kind of guy who'd use alt-medicine to cure cancer I guess... shame, but hey, sure accomplished a lot in the time he had.
What I really dislike about this documentary is that they only mention Apple, Microsoft, and IBM, making it seem they were the main companies that started it all off. But that's of course wrong and misleading. They mention the Intel 8080. At the time, it was one of the two predominant microprocessors; the other was the MOS 6502 (which was actually the heart of the Apple II). It was built into most of the personal computers of the time, including the Tandy machines and the Commodore machines like the PET and the VIC-20. Neither the processor nor these machines are mentioned. For this reason I don't consider this a good documentary. Also, the glorification of big money disgusts me a little.
The documentary was about computer companies, including their software. It wasn't about microprocessor manufacturers. The 8080 was only mentioned because of its connection with the Altair. Why does the idea of making money "disgust" you?
@@robertromero8692 "The documentary was about computer companies, including their software" Well, following the title, it is a documentary on personal computers - not about three big companies. The title is obviously misleading. "Why does the idea of making money "disgust" you?" You really ask that?
@@bierundkippen720 “it is a documentary on personal computers - not about three big companies.” What’s objectionable about focusing on the companies that had the biggest impact on personal computers? “You really ask that?” Yes, I do. What’s wrong with making money? Do YOU make money?
@@robertromero8692 "What’s objectionable about focusing on the companies that had the biggest impact in personal computers?" Well, it's stupid when the title is "Personal Computers". It's simple logic. Don't you get that?
@@bierundkippen720 There were a plethora of personal computer companies back in the day. It makes no sense for the documentary to try to cover them all. It covered the machines that had the biggest impact on the market. The PET wasn't one of them.
The computer industry is represented here by Apple founder Steve Jobs and Microsoft's Bill Gates. Those two really made progress in the computer industry. I can't program; those two can, and I like to work with their programs. Kind regards.
You jumped from the Altair 8800 to Microsoft???? What about CP/M? Digital Research? Gary Kildall? Micro Cornucopia? S-100 systems? Osborne, Kaypro, Big Board? MS-DOS was essentially a CLONE of CP/M.
I bought a TRS80 in '77, and upgraded to an Apple II in '79. It was the last Apple product I ever bought. It followed the design philosophy of Wozniak, meaning it was greatly customizable. Subsequent Apple products followed the Jobs philosophy, meaning the design was DICTATED, with NO opportunity for customization. I've always hated that attitude. It's no wonder that Apple has a small market share.
Five minutes in, and the reporter who made this would be fired from his job and never work again if it were made now.. Funny how truthful things tend to get you destroyed these days..
It's interesting how the winners rewrite history. The Apple II entered the market in June 1977, but the Commodore PET entered the market in January 1977 at the Winter CES show. Hmm, OK, so the first complete PC was not the Apple II, it was the Commodore PET. Damn.
I love this doc, but within the first minute he changes history: Gates and Allen did not create the personal computer; they created languages for the PC to use. As he loves to point out during the doc series, "it's all about the app"; a PC is just a box without the apps.
It's hilarious how this documentary gives some credit to the flower-children radicals of the late 60s and early 70s for making computers available to the masses. Steve Jobs @ 29:40: "I think that that same spirit can be put into products. And those products can be manufactured and given to people." Given to people? Apple products are usually much more expensive than their counterparts. And then it goes from this to the 60s radical turned computer nerd turned rich guy who likes to do his interviews in a hot tub. There seem to be a lot of late-60s/early-70s radicals who warmed up to capitalism. LOL!
Bob Cringeworthy (not a typo) always ignores the contributions of RadioShack, Atari, and arguably the most significant player in the early personal computer revolution: Commodore. Too much emphasis on Microsoft and Apple. Yeah yeah, to the winners go the spoils and they write the history yadda yadda. Those are poor, intellectually vapid arguments.
Apple the first mass-market computer company? What a twat. The first was Commodore, with the PET: the first mass-market computer that didn't require someone else's equipment to make a full system, an off-the-shelf computer for the masses (all 1000 of them who wanted one, lol). Apple always played second fiddle to Commodore, first with the C64, then the Amiga 1000, then the Amiga 4000/040 (the most powerful CPU and graphics of any off-the-shelf desktop computer for sale in the world, for a brief period when it was launched... and it wasn't an Apple Mac LC 040 that beat it lol)
In the late 1970s, nerds invented spreadsheets just to improve things. Forty years later I used them at work to improve things. Even today, most people don't know how useful the modern versions can be with careful programming.

One of my favorites was a calculator built in a spreadsheet that would compute the position of the cutterheads on a machine from critical part and cutterhead dimensions. I could even calculate what would need to change between the setup for one part and any other, which decreased the setup time for those machines.

I created another that could count not just available staff, but their training and skills as well, on a daily basis. I could schedule a work cell, and the spreadsheet would check whether a crew member was trained for that position, whether they were scheduled to be available that day, and whether I had inadvertently scheduled them somewhere else at the same time. Then I made it so the layout or contents of the department could easily be changed in the future by the manager, without them knowing a single formula, and it would still work like it always had.
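The cross-checks that spreadsheet performed can be sketched in a few lines of Python. This is a minimal illustration of the same idea, not the original workbook; the data layout and the `check_assignment` helper are my own hypothetical names:

```python
# Hypothetical re-creation of the spreadsheet's scheduling checks:
# is the person trained for the position, rostered that day, and
# not already assigned somewhere else at the same time?

trained = {"Ana": {"saw", "lathe"}, "Ben": {"lathe"}}   # skills per person
available = {"Mon": {"Ana", "Ben"}, "Tue": {"Ana"}}     # roster per day

def check_assignment(schedule, day, position, person):
    """Return a list of problems with assigning `person` to `position` on `day`.

    `schedule` maps (day, position) -> person for assignments already made.
    An empty list means the assignment is valid.
    """
    problems = []
    if position not in trained.get(person, set()):
        problems.append(f"{person} is not trained for {position}")
    if person not in available.get(day, set()):
        problems.append(f"{person} is not available on {day}")
    for (d, pos), who in schedule.items():
        if d == day and who == person and pos != position:
            problems.append(f"{person} is already assigned to {pos} on {day}")
    return problems

schedule = {("Mon", "lathe"): "Ana"}
print(check_assignment(schedule, "Mon", "saw", "Ana"))
# -> ['Ana is already assigned to lathe on Mon']
print(check_assignment({}, "Mon", "saw", "Ana"))
# -> []  (trained, available, and not double-booked)
```

The same shape works in a spreadsheet with lookup formulas over a skills table, a roster table, and the schedule itself; the point is that all three checks run from plain data tables the manager can edit without touching a formula.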
Wrong - Xerox PARC invented everything: the "personal computer", GUI, mouse, bitmapped display, Ethernet, object-oriented programming, laser printer, WYSIWYG, natural language processing; all before the Apple I, which was a kit and didn't even have a power supply or keyboard.