These product managers were excellent presenters. They even timed little side chit-chats perfectly to cover the disk-thrashing latency. Bob was an expert.
WOW, a one-button mouse... I never tired of the Mac fans telling us how the one-button mouse was so... vastly superior to the multi-button mouse on the PC.
Don't forget the Mac OS UI is much different from Windows: the menu bar is always at the top, and there is no popup menu for a selected element. The single button was meant to not confuse users and to reinforce the model: you select an element, and the menu is always at the top of the screen. There is no second button because there is no popup menu. A single-button mouse would not make sense on Windows. Popup menus only make sense on a high-resolution screen.
1:52 As a software guy, I resent the idea that a ground-breaking OS with integrated graphics engine could not be “technological innovation”. Remember, this thing let you interact in real-time with on-screen drawn images using just a CPU running at 7.8MHz, with no hardware acceleration!
More like the lack of hardware was innovative. Wozniak liked to do everything in software that used to be hardware-based, which led to a lot of flexibility on the software's part.
Amiga Inc was starving for a cash infusion and went to several companies before Commodore. Apple was one of them, with Jobs declining as he found the custom chip design too complex and potentially limiting for future expansion options. This turned out to be true, as the NTSC-based video system really constrained high-resolution video output, even to this day. The cost of further custom chip development was predicted to be high as well, another prediction that held true.
@SteelRodent While true, it ignores a huge part of what makes Apple successful - they took a system that was interesting but impractical, and tuned it for consumers (and businesses to an extent). That’s what people always miss when they talk about Apple: they always complain that Apple isn’t innovative, but they actually are - the innovation comes from design refinement, which is arguably just as important and difficult as the initial idea. Did that make it groundbreaking? Yes.
It certainly was a remarkable software achievement, but what the Atari ST and Commodore Amiga demonstrated was that there were significant gains from better hardware for a graphic task-set. Ex-Mac system engineers actually founded Radius for exactly that reason.
That was a very crazy time in the high-tech arena, especially in Silicon Valley, as I was working in administration for a high-tech startup in Mountain View. One engineering department manager wanted his whole group on Macs for development of CAD/CAE design products for ICs. The problem was porting those designs over so that Apollo workstations and Digital Equipment Corp [DEC] minicomputers would recognize them. I found out later that the manager was using that product-development ploy to chisel down the price of the Macs from Apple direct; he had no intention of using the Macs for development, but for his own personal use. Apple got wise to that stunt and slammed the door in the company's face for any Apple/Mac products at discounted prices.
I mean, maybe they should have... because they are worth less than nothing. I can't name a single Mac product that isn't on PC, or that works better than on PC. And pretty much since the early 2000s it's mostly been that way. The Mac fanboy art fans will say it's the display, but mind you, 120dpi to 144dpi is standard in all flat-panel monitors now. Oh, it's the hard drives? Nope, Samsung has that one. Oh, it's the RAM? Nope. MANY companies beat the snot out of Apple in pretty much everything, including on phones.
@@acidangel111 Starting in the year 2000, Mac began designing better operating systems, to the point that Microsoft copied many Mac features into Windows, from its beginnings to this day.
@@acidangel111 I worked on MANY Samsung phones. Total garbage. Also, I won't waste my time listing apps exclusive to macOS (and no, I don't mean GarageBand).
@@acidangel111 Also, what's your deal? No one is forcing you to use a superior OS. It's not your money, so why do you strike me as one of those people who think they're smarter than anyone who disagrees with them or doesn't know some suuuuuuuuper obscure drive specs, etc.?
@jimmybuffet4970 roflmao. Superior OS? Try garbage at best. What's my deal? I don't know, is this "what's the deal"? Is there a door number 1, 2, or a big box? I'll take the big box. Ohhhh, it contains trash? Well, still better than any Mac. Fun fact.
That was totally shoehorned in too, with no context for what they were showing. Not even a contextualizing comment from Stewart to explain how this PC thing was relevant to the discussion. I guess he was just letting Gary do his little ad segment and then moving it all along.
For those who don’t know, the host of the show invented GEM. Yup, this was a total conflict of interest and the negative attitude toward the Mac was totally biased.
I've gotta smile at Paul Schindler's look into the future of the Mac as I sit here at the end of 2021 watching this old vid on an iMac, running an OS whose roots are the result of Jobs' NeXT adventure, with a beautiful color screen, on the internet. This really isn't a put-down of Paul. He honestly looked at the landscape of the industry at the time and made a reasonable prediction. In fact, many Mac users (a more loyal group you can't find) in the late 80's and 90's went through a period of feeling abandoned by Apple. You just never know what's coming tomorrow.
He's right; even to this day the Macintosh is not taken seriously in the business world. After the iPhone became popular, some smaller businesses started using iMacs, and there's always been a place for Macs in creative and artistic endeavors. But the PC, pioneered by IBM and Microsoft, has always dominated the overall business market.
@@JaredConnell I don't wanna get into a religious war over MAC vs PC But I would ask you to look at Paul's reasons for discounting the Mac. He claims it's (the Mac) is flawed? Is it? How? He claims it's too expensive. True it's initially more expensive than a PC but it also has a longer life cycle, it's easier to use and has better security. He claims it's not fast. That's clearly no longer true. I just got my wife a new Mac with the M1 chip. It's blazingly fast. And I'm using an iMac which has the Intel chip in it and I still get great speed. It's got a small screen. Today that's laughable! It's in Black and White. Uh huh . . . seen the Retina display?? So bottom line, none of Paul's points as to why he rejected the Mac back in the 80's are valid today. And as far as it not being taken seriously, I guess that depends on the business you are in. If you do graphics and or video work it certainly is taken seriously. Or if you are an IOS developer, it's hard to find a better platform to work on.
@@Jim-mn7yq He was correct. Apple nearly went bankrupt in the 90s. Additionally, apple had two competing product lines, the Apple ii and upgrades, and the Mac. As to your points about apple nearly 40 years in the future... Mac's don't last as long as they are much harder to upgrade in hardware, though recently they are a bit easier now. Even so, they often require mac specific components, as opposed to generic components, meaning it is much more expensive. Additionally, old mac's can't upgrade to the newest MacOS, which hinders their software support and leaves them less secure. The brand new M1 has been shown to be slower than say, the most new AMD chipset and graphic card, for intensive tasks (transistor count is a physical limit you can't get around). The m1 based on ARM is far more efficient than x86-64, and this also makes it speedier for little, immediate, things. As for screens, for the same price, the apple monitor will usually be worse than a generic monitor. What 27"+ 4k retina monitors can you buy for under $300? Also, if it wasn't just for the iphone, apple as a computer company wouldn't really exists as it does today (it still would exist, but really the iphone is propping it up immensely).
@@lelsewherelelsewhere9435 "Apple’s market cap recently surpassed $2 trillion. A historic moment in the history of the U.S. stock markets. It is the first company in history to accomplish this making it the largest and most valuable publicly traded company in the U.S. It passed the $1 trillion mark only two years ago in 2018 after being a publicly traded company for 38 years. Phenomenally, it only took a mere two additional years to increase another trillion dollars in value." Need I say more?
@@Jim-mn7yq Well, yeah, current day Macs are a completely different product than the Macs back then. Of course his comments don't hold up if you apply them to current Macs. But the Macintosh he was reviewing had the flaws he went through and stated, and true to his prediction, it never was taken seriously in the business world.
The Validec for computer ordering at the restaurant was amazing and way ahead of its time. It's really fascinating and instructive to watch these shows and read Byte magazine, even nowadays.
I got to work with Jazz and the LaserWriter, the Mac Plus, and a lot of other computers when I was 23, working at ComputerLand, my first real job. The LaserWriter was over $10,000 in '85 dollars.
I am still using a 1984 Macintosh to this day. I regularly go online with it on Bulletin Boards and communicate with people. It’s the early internet and it runs very well on these old systems. Hard to believe my Macintosh was around when this episode was being filmed... all before I was even born.
@@Bendaak What BBS services are you dialing? How are you dialing them on the current digital phone networks? You say you're doing this, but you provide no explanation or proof of how you are doing it.
@@jessihawkins9116 I am not dialling via phone line however it is still possible today as there are BBS services available such as Level-29 which can be authentically dialled into. I connect via TELNET using a RS232 WiFi serial modem. This is the most common method of connecting to a network on a vintage machine.
When I originally saw this episode of Computer Chronicles back in 1985, I wanted to get a Macintosh computer, but I soon discovered that Apple had priced them out of my range and that there weren't any Apple stores in my hometown. I ended up getting myself a Windows PC just 12 years later instead.
18:43 The first LaserWriter we got, took so long to image graphics pages, that I named it “Deep Thought”. It was built around the legendary Canon LBP-CX engine, also used in HP LaserJets of the time. That thing could really cope with abuse--i.e. high print loads.
"low-hanging fruit" (noun, informal): a thing or person that can be won, obtained, or persuaded with little effort. "We know mining our own customer base is low-hanging fruit."
I haven't actually tried one myself (even in emulation) - I'm kind of curious about the extent to which this is actually true. Like, for instance, I used to run GEOS on a Commodore 128, like all the time. Lower overall screen resolution than a Mac 128K, slower CPU, technically a little more RAM since the display controller had its own 16kB... Better floppy drives (I had a 1581) - It was slow, of course, but I found it useful. Low bar perhaps but I have to imagine the Mac 128 was at least better than GEOS.
@Tetsujin - I own an original 1984 Macintosh (albeit upgraded by its former owner to a 512Ke with 1 MB RAM) and can tell you the above statement is accurate; it was basically a glorified "tech demo". It wasn't until the Macintosh Plus, with its 1 MB RAM (expandable to 4 MB), 800K floppy, SCSI hard disk support, proper keyboard (cursor keys!) and larger ROM firmware (more and better-optimized Toolbox routines), that the Macintosh actually became usable and viable. Some might even argue it wasn't until the Macintosh II, more than a year later, with its modular/expandable design and color support. And yes, I'm familiar with GEOS on the C64 and Apple IIe, but those were 8-bit machines, running with low graphical screen resolutions and very tight memory constraints. The Mac had a 16-bit CPU, could address far more memory (linear, not bank-switched), and had a fixed screen resolution with almost double the number of horizontal and vertical pixels to push around. It was far under-equipped to handle all this. Apple made the same mistake with the 16-bit Apple IIGS, shipping it with only 256K RAM, though at least that was user-expandable up to 8 MB. FYI, between its limitations and high price, the Mac was never even considered a home computer until the early 90's, with the introduction of the Macintosh LC. However, by that point it was far too little, too late; the DOS and Windows PC had become a much better choice. (I still think the Apple IIGS could've kept Apple in the home market game, given the chance, but they threw that opportunity away in favor of the lesser Macintosh.)
@@Apple2gs It's simple: RAM was very expensive back then. Even the Atari ST was planned with 128kB of RAM, same as the Amiga. In late 1984 RAM prices started to fall, so Atari dropped the 128kB version in favor of 512kB and the Amiga got 256kB as a baseline. It was around the same time Apple introduced the Mac 512. Even DOS computers in 1984 usually shipped with 128-256K RAM, and 512K was seen as a luxury. The IBM PCjr had a whopping 64K or 128K of RAM. RAM was expensive in 1984, I'm telling you...
I volunteered for their afternoon recruitment drive for drone "workers" , who thought "Coowell" .......was something then...enough for me then. "It is now" - Thanks guys!
His prediction was true: Steve Jobs was fired, the Apple II was their most popular machine, they lost money on the Lisa, and the Mac wasn't profitable. You could not make much use of it until it was upgraded to 512K. It didn't age well, and it took a few years for the Mac to become actually useful.
Imagine having to be the guy pretending the Macintosh is not a piece of crap compared to the Apple IIgs, which was deliberately slowed down because Steve Jobs had a petty ego.
@Luke He looked visibly uncomfortable to me! Also, let's not get into an argument about subjective versus objective reality here, okay? You're trying to tell me that the software applications made up for it but they could have made the same software, but better (with color) for the Apple IIgs. Ego gets into marketing big time. I know this because there are many business and sales models in operation right now that don't make any sense. Take for example the business of collectable miniatures: here we have a gaming company with a business model created by gamers, not businessmen. Just go check out that crapshow and you'll see what I mean. It's basically gambling.
@@quonomonna8126 Also, let's not forget the price. The Apple IIgs still beat the Mac in many areas even when it was slowed down because of Jobs' insistence on crippling it: it had GS/OS, it was in color, it had a faster CPU, and at the time it had more software and more games too. The Apple II team took the idea of the Mac and executed it way better. It's been confirmed many times that Steve hated the Apple II because he thought it was older tech and the Mac was the future, so when the Apple II team executed his vision better, he was angry. Even years later, at the introduction of the iMac, Wozniak wanted him to give a shoutout to the Apple II team because they were one of the main reasons the company was still alive at the time; he wouldn't do it.
@@quonomonna8126 Yes yes, this was mainly aimed at Luke, even though I forgot to tag him. It's just sad that so many people forget what a dirty start the Mac had.
Jobs left Apple a full year before the IIgs came out. And bear in mind you're kind of conflating two versions of the Apple IIgs here: the one that existed in the real world in 1986 (a year after this episode of Computer Chronicles aired) and sold with a bare minimum of memory, no display, and no disk drives for $1000 - and the imaginary version that lives in the minds of IIgs fans, the version that would have run at 8MHz or something because of that one time Woz said it'd be awesome to have a machine built around an 8MHz '816. The latter does not exist in reality. We don't know what kind of IIgs we could have got, and what that hypothetical machine would have cost, because it was never made. The closest thing we have is IIgs accelerator cards. Get a *really* good one of those, and a IIgs can *almost* keep up with an Amiga 500. The IIgs that exists in the real world has more wrong with it than a slow CPU. It has a slow *architecture*, because a lot of its architecture is Apple IIe architecture, stuck not only at an 8-bit data bus but also locked to 1MHz. The video RAM is part of that: in IIgs super-hi-res mode you get one frame buffer, which means no page flipping, and it's in slow RAM. Combined with no hardware sprites or background scrolling, it really creates a bad situation for games. The best case as I understand it is to double-buffer and use the IIgs's RAM shadowing, triggered with a technique like "PEI slamming" when it's time to update the display RAM. If I got the math right, that means you'd spend around 35ms updating the display from your double buffer, so you could maybe run the game at 15fps and only spend about *half* your overall CPU time telling the hardware to update video RAM from the double buffer! Or you can make "Arkanoid" or something where you only have to update smaller areas of the display on each frame. If Apple management really kneecapped the IIgs, Woz should have signed it "Alan Smithee" instead of "Woz".
(Though it is nice the guy got to sign a machine in the Apple II line.) I think its problems run much deeper than CPU speed; it's baked into the whole design. Look at the SNES for instance. It has a faster CPU than the IIgs, but not by much (3.58MHz vs. 2.8MHz), yet it is *way* better at moving graphics around the screen. That's because, like most good game systems of the era, it's all about the rest of the chipset - hardware support for layered scrolling playfields and sprites makes a *huge* difference. I agree that the early Macintosh machines were fairly limited. I think those limitations were a necessary compromise at that point in the platform's history. It was slow enough as it was, frankly, and going color would have made it slower - or more expensive. What it offered instead was a high-resolution display (at least, higher than most color displays at the time) with relatively low RAM and CPU requirements. Later Macs (1987 onward, the Macintosh II line, etc.) were more powerful, but with a price to match. Machines like the Amiga, Atari ST, etc. at the time were a good middle ground, reasonably affordable but reasonably powerful, and that's what the IIgs should have been - but it really fell short of that IMO. We can imagine what if maybe it hadn't... but that is not how it turned out in reality.
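Out of curiosity, that ~35ms "PEI slamming" figure checks out as a back-of-envelope calculation. A quick sketch, where the frame buffer size, the 6-cycles-per-PEI cost, and the 2.8MHz clock are my own rough assumptions rather than figures from the comment above:

```python
# Back-of-envelope check of the "PEI slamming" timing claim.
# All constants are illustrative assumptions, not measured values.

FRAME_BUFFER_BYTES = 32_000   # IIgs super-hi-res: 160 bytes/line * 200 lines
BYTES_PER_PEI = 2             # PEI pushes one 16-bit word per instruction
CYCLES_PER_PEI = 6            # rough cycle cost of a PEI (dp) instruction
CPU_HZ = 2_800_000            # IIgs 65C816 at 2.8 MHz
TARGET_FPS = 15

pei_count = FRAME_BUFFER_BYTES // BYTES_PER_PEI       # 16,000 instructions
copy_seconds = pei_count * CYCLES_PER_PEI / CPU_HZ    # time to slam one frame
frame_seconds = 1 / TARGET_FPS                        # frame budget at 15 fps

print(f"copy time: {copy_seconds * 1000:.1f} ms")                     # ~34.3 ms
print(f"share of a 15fps frame: {copy_seconds / frame_seconds:.0%}")  # ~51%
```

Under those assumptions the numbers hang together: roughly 34ms per full-screen copy, which is about half of the 66.7ms frame budget at 15fps, before the game does any actual work.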
I'm just glad the operating system was still called "System" then. I don't know if I could handle somebody on Computer Chronicles calling it "May Koss".
@25:33 A shame this Commodore laptop never came out. They owned some key LCD patents but then decided LCD was a path to nowhere and sold them off.
If you follow Bil Herd on YouTube: he was the designer of that computer and was not pleased that they never released it. He said that Commodore didn't see any future in portable computers!! I'm pretty sure he has one of the few prototypes that were made, and he even got his unit booting on a live stream he did a while back.
@@DavePoo2 It was scrapped due to the fact that Commodore bet all its money on the Amiga, and after the Amiga A1000 fiasco Commodore almost went bankrupt, so there was no room for an incompatible portable C64 anymore.
Paul Schindler was right here; there is very little macOS use in business these days. Even Chrome OS and Linux are in heavier use. iOS is different, though.
When the guy said it replaces the "A Greater Than" I literally had to play it again to figure out wtf he was talking about. Oh, like A:> prompt. I guess they didn't quite have terminology figured out back in the 80s.
@@andreiandrosoff1327 "Eh Colon Prompt" (I wouldn't verbalize the greater-than symbol.) Or just A-prompt. Or C-prompt or whatever. That seems to be what was commonly used in the 90s, which is when I really dug deep into computers originally.
From what I understand, this was the beginning of a time of tumult at Apple, such that years later, the company brought back Steve Jobs, getting that "weird Unixy NeXT thing" in the process.
The mathematician daughter of one of Britain's most talked-about poets figured out how to make that difference engine, had it been fully built, grind out "Bernoulli numbers".
@@mcswabin207 The trackball is smaller than the amount of space a mouse would need to operate. Apparently this concept is too much of a stretch for you to understand.
Practically everything he says is completely wrong, but he's at least more interesting than George Morrow. In Morrow's commentary segments he just says some obvious stuff every time
Neither allowed nor disallowed - it was out of scope. Multitasking is something an operating system has to support (or not). GEM by itself was just a graphical operating environment and widget toolkit that could be integrated into various products. What we see here are individual DOS apps using GEM for their UI: a file manager, a picture editor. However, the TOS operating system of Atari computers used GEM. TOS wasn't initially truly multitasking, but its later developments were.
Gee, it had both colors... black and white... and low-end sound. The Atari ST and the Amiga came out later in '85: 4-channel sound, a fully preemptive OS, 4096 colors... yep, the Amiga was a dinosaur compared to the Mac. (Geez, the Mac didn't have a fully preemptive OS till 2000-and-something.)
Computer Chronicles got a lot of things right, but one thing they got wrong was how much time they spent talking about TopView, which was ultimately a complete flop. IBM was a market leader, but they had really already stopped innovating in any significant way.
Yep, but you have to remember that IBM totally rewrote the microcomputer world at that moment in time. In just 3 years it was IBM or compatible as a standard. Everybody expected that it would be IBM to define GUI on the PC.
Why did Apple want into business instead of gaming? Now Windows is the gaming platform, and it's responsible for so much more spending than business applications. I drop thousands per year upgrading and buying gaming hardware and software. We only spend a small fraction of that at my business on productivity applications.
That's a fair question. First, Jobs had used psychedelics and knew that the computer would push the human race forward (his words). He probably spent very little time gaming himself. Second, Jobs left Apple to found NeXT, which was a computer for educational institutions (again, to push the human race forward). Gaming is a time hole, and he probably didn't see it as a life well lived. Third, the Mac did have games. Tetris comes to mind.
@@blackrockcity I reserve gaming for snowy cold days, but really enjoy it. I do tend to think that I could be doing something productive instead and the older I get the less time I spend gaming.
Jobs took Raskin's design for an affordable computer... and he screwed it up. Xerox knew a personal computer powerful enough to run a GUI would be hideously expensive at that time... and that's why they didn't market the Alto. See also: Canon Cat, Xerox PARC
What have we done with this awesome technology? We've enabled perpetual outrage mobs and #metoo and stupid cat videos! NOBODY could have foreseen how badly technology would be abused to fuel outrage culture, though a lot of people could see the loss of privacy that was coming.
Hugely overpriced and hugely underpowered, as Apple products so often tend to be. I guess it deserves credit for bringing mouse-driven GUIs to the masses? Just one year later the Amiga came out and completely obliterated this thing in every possible regard, while costing a lot less
Of course, looking back more than 30 years later, things are different today, but this show was shot in 1985, and that's the lens it was intended to be viewed through. You *did* hear where he listed the reasons *of the time*, right? Small (9-inch diagonal) black & white screen, too expensive, etc. Add to that that it only had 128KB of RAM, which was small even for its day, no hard drive, only 400KB floppy disks, etc.
@@SweetBearCub Well, technically the prediction stood its ground. The PC market ended up swallowing up 90% of the marketshare of all computers ever sold. The Mac held a niche market for things, but big corporate companies that started putting a computer on every desk for their employees in the late 80s never went to Apple.
For desktop computers and laptops, Macs are still a small part of the market. The install base of Windows/Linux PCs is many times the size of the Mac base. A major factor is pricing (you can get a budget computer for as little as $300), but also that people are familiar with using a PC with Windows and MS Office at work (Macs are mostly used in companies that do professional graphics or audio). And of course, the majority of gaming PCs are Windows machines because a lot of titles, especially indie games, are not Mac or Linux compatible (though Linux compatibility is a lot more common now than 10 years ago). (If you don't believe me, check this page on market share of different OSes per month for 2013-2019: www.statista.com/statistics/218089/global-market-share-of-windows-7/ )
The Mac-as-computer has almost always been a home-user or student product, with some small-volume niche markets like desktop publishing and video that no longer require a Mac in any way at all. Plus, the last good computer they made was the IIci.
The Mac was a waste of time; they would have done better by not restricting the Apple IIGS, which had color capability and backwards compatibility with the Apple II. Did I mention the color screen? I contend that if Steve Jobs hadn't neutered the GS's RAM and speed, the GS would have been far, far better. The Mac was a downgrade, not an upgrade.
Backwards compatibility itself kneecapped the IIgs before it even got out of the gate. And Jobs wasn't at Apple when the IIgs came out. So for instance, IIgs fans always moan about how terrible it is that Apple limited the CPU to 2.8MHz to avoid competing with Mac. But there's another, worse limitation baked into the IIgs architecture: the graphics memory (including the IIgs graphics modes) is part of the 1MHz "compatibility" core of the machine - so the slow CPU isn't even the limiting factor here: as slow as the CPU is, the video RAM is *slower*. In other late 80s platforms like Amiga and Atari ST, the CPU was supplemented by the chipset, which could ease the burden on the CPU. Even if the IIgs CPU had been exactly as fast as the Amiga CPU, the lack of this kind of support hardware meant it was forever limited by that display RAM bottleneck. I contend that the overall design of the IIgs (resulting primarily from choices made for backward compatibility) made it an underperforming, dead-end design mired in almost decade-old technology. On paper the system could look competitive with other late-80s machines like Amiga and Atari ST but couldn't back that up with real-world performance for a variety of reasons. I really think Woz was wrong on this one: Carrying forward the Apple II design was not a good plan for the future. But remember also the Mac came out before the IIgs. The original Mac was double the price of a comparable Apple II (A complete Apple IIc, with monochrome display, was $1300 in 1984, while the 1984 models of Macintosh were $2500-$2800) - but it had a faster CPU, wider memory bus, a higher-capacity disk drive (400K on the first Mac models, vs. 140K on the Apple II), and a higher-resolution display (512 x 342 vs. Apple II's 560 x 192 double hi-res, or the more typical 280 x 192 hi-res). 
The Apple II could display color, but it wasn't really good at it (typically 6 colors at 140 x 192, though some software used a 15 color 140 x 192 mode) so I don't count that as much of an asset. Mac was more expensive but it was a better machine. Then when the GS came out - shortly afterward the Mac platform got a major upgrade with the Macintosh II (including color graphics!) - but of course that was a much more expensive machine as well, around $5500. The IIgs started at $1000 (approx $1500 for a minimal complete system with one disk drive and a monitor) and the first model was shipped with a bare minimum of RAM (256kB, half of which was "slow RAM" in the 1MHz Apple IIe core) So at that point the price gap had widened to a factor of 4 or so, but Mac was still the better machine in my opinion. (If you wanted a better machine *and* a better price, you could get an Atari ST... Or an Amiga 500 once that came out.) The Apple II was a great machine for its time (i.e. late 1970s to early 1980s). But its real strengths were in the clever engineering hacks that allowed Apple to make such a great combination of power and economy. But those kinds of hacks don't really translate into a good long-term foundation for a computer platform. As it was, the Apple II (particularly the IIe) carried Apple through the 1980s, long enough for them to turn Macintosh into a platform that could carry them through the 1990s. I think they did about as well as they could have possibly hoped to have done, being one of very few prominent 1980s computer brands to still be relevant in the early 2000s (and the only one of the 1977 "big three" of Commodore, Atari, and Apple to even survive that long) - to say nothing of today, *another* 20 years later! I really doubt they would have done better than that by sticking with Apple II as the foundation for their machines.