Back then, what was available was scooped up and scrutinized heavily: magazines, books, manuals, television. The Internet existed (and the web was only just being developed), but at this time most people used Lynx, Usenet, FidoNet, and BBSes. There is content like this now; you just have to use RU-vid and find a content creator for the area you're interested in.
@@michaelmcconnell7302 Loved this stuff back in the day and watch LTT/GN pretty regularly now. You're right, same stuff, just newer. You can even draw parallels from Quick Bits to the Random Access file!
Performance today, yes. Quality, not so much. The build standards for modern computers, TVs, and really everything except cars have utterly collapsed. Nothing we accept today as "normal" was acceptable back then. If you had bought that 37-inch monitor back then, it would still work fine today. There are very few monitors, if any, built today that will still work 30 years from now.
@@tarstarkusz The biggest issue with this is the speed of modern technology. An Apple Cinema Display or Thunderbolt Display is something like 10-18 years old depending on the model, and many still work flawlessly; the later models are built of solid-quality aluminum and glass, and the Thunderbolt Display still holds up as a 2K display with a Thunderbolt 2 hub. But people don't want 2K; most people are spoiled by 4K. The Studio Display at 5K for $1,600 is twice the resolution of a 4K display, and the 6K Pro Display is $5,000. People are moving into VR now as well. So it doesn't even make sense to make equipment of extreme quality when we see it completely outpaced in under 5 years, and sometimes obsolete in less than 10. The speed of technology requires lower-quality materials to keep up with demand, advancements in technology, and realistic prices. I had an old General Electric radio from 1945. Solid wood, full-sized cabinet. It just doesn't make sense to make tech that beautiful and lasting when it will be technically outdated in a few years.
oldtwins I bought a 19" pivoting LCD display back in like 1999 for $1000 & I still have it. I paid cash, so no loan for me. :) The color sucks on it because the backlight has aged, but it still works.
Old school technology programs were such a vibe… everything seemed so cool and we all thought the future would be amazing! Looking back I feel we have lost our innocence for technology 😢
I know this is an old reply, but a place I worked at back in the mid 90s had one of these - they took it to conventions for their display booth. They had a special wooden crate/frame to have it freighted out to whatever show they went to.
Why? It's not like you personally had a hand in inventing any of the things you use, you were just fortunate enough to be born at the right time to purchase them.
@@blackneos940 Which tells you that technology changes, but human nature doesn't. Always going to be smug people with a false sense of superiority based on possessions.
Neat to see monitors that supported portrait orientations were available even in 1991. They are making something of a comeback today, especially with multi-monitor setups.
It was kind of silly with a 4:3 due to the minor difference in width vs height. With widescreens it makes a big difference (e.g. watching vertical videos /s).
@@MarcoMugnatto They are rarer because software nowadays does a better job of scaling. Word can be scaled to show one page, multiple pages, etc. That wasn't the case back when this aired, there was generally little in the way of scaling, so portrait made more sense. I think nowadays they tend to be more of a secondary display rather than a primary one.
In my lifetime of working with computers, I consider 1991 to be the year that computers started to become more interesting and capable. This episode encapsulates everything that made that so.
I believe the main difference was the falling price of hard drives, which made them more affordable and mainstream. Computing simply sucked when relying on floppy drives. Display cards helped a lot too, with enhanced resolutions and colors. I remember seeing a 1024x768, 256-color, flicker-free display on some PC around this time, and it was like looking at something sci-fi versus older display technologies.
At "+Geforce are for n00bs and CHILDREN. I use Quadro to edit 10-bit content": well, I hope they are better than those overhyped EIZO pieces of shit. Why have a remote for a 24-inch display just to set the brightness? And why are the colors so bad? Good thing I'm not grading on them.
NEC MultiSync were the bee's knees back in the day.... There were a few high-end monitor makers back then; EIZO and NEC ruled large-scale shadow mask before Sony changed the game with their Trinitron screens. EIZO licensed Trinitron technology, and IDEK/Iiyama came with their version, called Diamondtron. In 1996 I worked at an ISP which used those EIZO 21" Trinitron screens. Hugely expensive in those days.
I had an Eizo monitor, the image quality was fantastic and it would happily accept an Amiga video signal, but it had a bad tendency to loudly and dramatically blow capacitors.
Around 2001 I had a 22" CRT from NEC. It was a really great monitor, but it got damaged one day by a bad driver. After that I bought a 22" EIZO TFT monitor.
One of the most interesting episodes: the first 3D accelerator at 14:18, 3D displays, HD displays, a video card with a dangling oscillator at 17:59, $500 video cards that don't do much, and an extra $800 to go from a 15-inch to a 17-inch CRT monitor.
Yup, that chick from Orchid was asked a specific question and then just spat out her memorized spiel. She has absolutely no idea what the fuck they're talking about, lol.
"Today's standards" are worse than the old standards. LCDs, which are the most commonly used type of computer monitor today, are a downgrade from CRT monitors. They caught on because they are lighter and take up less space for a given screen size, neither of which have anything to do with video display performance. Also, they are way cheaper to manufacture than CRTs, and since the manufacturers have no obligation to pass those savings onto the customers, they love the extra profits that "today's standards" bring them.
@@MaximRecoil CRTs can't have perfect geometry, can't get as bright as LCDs and the contrast ratio is awful in the real world because the screen itself is not black so any ambient light will raise the blacks, and there's no way to support variable refresh rate on tubes. Today's LCDs have wide gamut and accurate colors, high brightness and contrast for HDR, and support VRR for tear-free gaming. Tubes' only saving grace in terms of performance are motion clarity and viewing angles but even those are being matched by next gen display techs like OLED and MicroLED. The past is not all roses and rainbows like how you might have remembered.
@@DripDripDrip69 "CRTs can't have perfect geometry"

They can have close-enough-to-perfect geometry that you can't tell it's not perfect by looking, so that's irrelevant.

"can't get as bright as LCDs"

You don't know what you're talking about. CRTs can get brighter than anyone would ever want to look at directly.

"the contrast ratio is awful in the real world because the screen itself is not black so any ambient light will raise the blacks"

Again, you don't know what you're talking about. The screen on my Dell P1230 is as black as any LCD screen. Furthermore, you don't even need a black screen for good contrast with a CRT; the contrast of old CRTs with light gray screens (which are mostly from the mid 1980s and earlier) looks fine in normal lighting. They don't look good if, e.g., the sun is shining directly on the screen, but neither does any other type of video display in existence.

"and there's no way to support variable refresh rate on tubes."

Everything you've said so far has established that you don't know what you're talking about, and this is no exception. "Variable refresh rate" originated on vector monitors, which use CRTs. The CRT itself does what the chassis circuitry tells it to do (and there's nothing about a CRT which prevents variable refresh rates from being implemented, obviously), and it's far more versatile than any digital display; just the fact that vector monitors exist, and can _only_ exist in CRT-based form, is a testament to this. It is 100% impossible to make a vector monitor using an LCD or any other type of digital display, because they inherently have a fixed pixel grid.

"Today's LCDs have wide gamut"

LOL. CRTs literally have an _infinite_ gamut. The gamut is limited only by the hardware that's sending them the video signal. The number of possible colors for an LCD or any other digital display is finite; for example, a 24-bit color display can produce exactly 256 × 256 × 256 colors, which = 16,777,216 colors. CRTs, on the other hand, don't use digital steps of color; the intensity of the red, green, and blue electron guns is determined by voltage. How many possible voltages exist between, say, 0 volts and 5 volts? An infinite number, obviously, which is why CRTs can generate an infinite number of colors.

"high brightness and contrast for HDR, and support VRR for tear-free gaming."

LOL (again). See above.

"Tubes' only saving grace in terms of performance are motion clarity and viewing angles"

Again, you don't know what you're talking about:
- CRTs don't have a fixed pixel grid like _all_ digital displays do, so they can display a wide range of resolutions natively without the ugly scaling you get with a digital display when it, e.g., fills a 1080p pixel grid with a 720p video.
- Vector monitors can't even exist without CRTs.
- CRTs can display an infinite number of colors, as I mentioned above. Digital displays inherently have a finite number of colors they can display.
- CRTs display the video signal in real time because the video signal directly drives the electron guns, so they have practically no display lag. It might take, say, 1 nanosecond for the electricity to make its way through the circuitry. On the other hand, even the "best" "gaming" digital monitors have at least 1 millisecond of display lag (and most of them have tens of milliseconds of display lag). Even 1 millisecond is an eternity compared to 1 nanosecond; it's literally a million times longer. That's why, if you go to a classic video game competition or exhibition, even today, you'll see a ton of old CRTs there: most people don't want to put themselves at a disadvantage by using an LCD or any other type of digital display. Display lag is best avoided when you're, for example, doing a Super Mario Bros. (NES) speedrun that requires a bunch of "frame-perfect" inputs to get a competitive completion time.

"The past is not all roses and rainbows like how you might have remembered."

That would be a great point if you knew what you were talking about, but unfortunately, you've proven that you don't (see above, in many places). And I don't need to rely on memory; I still use CRTs for everything, including the PC monitor I'm using right now. It's funny that CRTs stopped being actively developed nearly 20 years ago (the industry was more than happy to move to a far-cheaper-to-manufacture display technology, with no need to pass the savings on to you, because they tricked you into thinking it was superior), and current digital displays still don't hold a candle to them in terms of performance.
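Just for fun, a quick sketch (Python, nothing from the episode) checking the arithmetic cited in the comment above: the 24-bit color count and the millisecond-vs-nanosecond lag comparison.

```python
# Sanity-check the figures in the comment above.

def color_count(bits_per_channel: int) -> int:
    """Total colors an RGB digital display can show with n bits per channel."""
    return (2 ** bits_per_channel) ** 3

print(color_count(8))   # 24-bit "true color": 16,777,216
print(color_count(10))  # 30-bit "deep color": 1,073,741,824

# Display lag: 1 ms expressed in ns. A millisecond really is
# a million nanoseconds, as the comment says.
ns_per_ms = 1_000_000
print(ns_per_ms)
```

Note the counts grow as a cube of the per-channel depth, so they're finite but large; the comment's point is only that any digital step count is finite, unlike an analog voltage.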
Each of these computer display monitors was unique in what it could do. The only problem is that they were all beyond my price range; I couldn't have afforded any of them at all.
0:30 - "Guy, come on, we're tryin'-a watch the game here! Can you go shoot your TV show intro somewhere else?" "Oh I'm sorry, did my little monologue distract you from the action-packed, blink-or-you'll-miss-it, lightning-paced excitement of this BASEBALL GAME?"
I had a 20" monitor back in 1989, made by JC Penney, that was a VGA monitor, TV, and entertainment design. I paid like $200.00 at an upscale pawn shop one day. It still works!
The part of this with the Radius monitor: it's '91, after they went public. What Andy Hertzfeld and Burrell Smith did starting in '86 with Radius and the Mac products was something special, especially for two individuals improving on a product they had helped build before being forced out of Apple as things changed.
yup! I recognized those pics as well. The woman was actually nude in the headshot and I did see her entire body way back in the late 1990's through one of the last BBS's that had that image.
It's weird to think how old that woman is now. She must at least be in her 50's considering I first saw that pic when I was about 15 and she was in her late 20's.
I think this is the episode where, according to Stewart, the monitor maker complained that viewers would think it was their monitor running slowly, not the video card. Who did they think watched this show?
13:29 reminds me of a time when I thought I'd have a side gig in graphic design. "Pantone" sought to resolve the RGB/CMYK discrepancies, if the Aldus FreeHand manual (or that of Adobe Illustrator) was anything to go by
Pantone became less relevant in that regard once home color calibration kits were released. You'd print a reference image with CMYK patterns on it from your printer, then hook up a special light meter to your monitor. It would then compare the colors on the print with the colors on the screen, then adjust the screen so it'd match your print. It then creates a profile that you could load into Photoshop, Illustrator, FreeHand, Quark, etc. Apple had it all integrated with their ColorSync system that could utilize almost any color profile generated by the various meters. You no longer needed to load color profiles on a per-app basis, everything was pretty much automatic. Pantone now charges insane amounts of money for reference swatches for matching real-world objects to their Pantone color system, and will soon (if not already) be charging money to simply view the Pantone colors within an app like Photoshop. Complete ripoff.
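The RGB/CMYK discrepancies mentioned in the comments above can be illustrated with the textbook device-independent conversion. Real print matching uses calibrated ICC profiles (like the ones ColorSync manages), so this is only a naive sketch of where the K ("key") plate comes from:

```python
# Naive RGB -> CMYK conversion (no device profiles -- illustration only).

def rgb_to_cmyk(r: float, g: float, b: float) -> tuple:
    """Convert RGB in [0, 1] to CMYK in [0, 1] using the simple formula."""
    k = 1 - max(r, g, b)           # K is the black ("key") plate
    if k == 1.0:                   # pure black: avoid division by zero
        return (0.0, 0.0, 0.0, 1.0)
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return (c, m, y, k)

print(rgb_to_cmyk(1.0, 0.0, 0.0))  # pure red -> (0.0, 1.0, 1.0, 0.0)
print(rgb_to_cmyk(0.0, 0.0, 0.0))  # black    -> (0.0, 0.0, 0.0, 1.0)
```

The mismatch the calibration kits corrected for is exactly what this formula ignores: actual ink and phosphor gamuts differ per device, which is why the meter-generated profiles were needed.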
I used an NEC MultiSync 4FG around the mid '90s, awesome design and great picture, only beaten by the IDEK Vision Master 17 that replaced it (yes, with the backlit LCD below the bottom bezel). It was my workhorse for years and years and outperformed any LCD for at least a decade. Cons: power consumption. Pros: picture quality, speed. Its size demanded a large desk :-) These days screen surface is how we measure; back then it was volume and weight. If you could not carry it, you were not worthy :-)
People always rave about Gary Kildall and Stewart Cheifet and rightly so, but Jan Lewis is just as awesome as those guys! OK well not as awesome as Kildall (R.I.P), but still.
This was back when computers did work for people, when people were in charge. Fast forward to today, the computers are effectively in charge, tracking everything, work, quantity, quality, steps, breaths, clicks, taps, trends, emotions, purchases, entertainment, desires, curiosities, problems, solutions, frustrations. People work because of computers.
I remember that if you only wanted to BBS or use text-based programs and word processors, Hercules cards were the way to go. I had no idea that company also made 3D acceleration cards. It's sad their 3D cards didn't make it to the 21st century.
Imagine if they had integrated a camera into the monitor so it would project your face onto the shiny balls. That would have been so cool and trippy to look at back then.
The graphic overlays on this program seem to have been done using a Commodore Amiga with Broadcast Titler 2 from InnoVision and the AmiGen genlock from Mimetics Corporation. Remember those? I sure do.
AlainHubert they were huge Commodore and Macintosh fans at the Computer Chronicles, but they were very fair in representing the industry at the time. Newer broadcasts are much more polished, so they may have switched to a Macintosh app for screen overlays.
22:32 My high school had a few huge RGB monitor setups like this in the early to mid 90's to run educational software packages from OS/2 2.1 on a multimedia series PS/2 (486 DLC I think) with the program's video content on Laser Disc. Too bad we only ever used it 2 or 3 times in class. I was the official "unpaid" IT student so I got to play around with it after class. Well..look at that 26:42...there ya go.
Wow, though it's an NEC, I own a CRT that cost about ten grand back then. Yet it runs almost every arcade game out there at native resolution perfectly. Funny they never mention it could run low-resolution timings below VGA.
Graphic art on a 1991 computer. Now my phone can do more than those old computers of the '90s. I remember using those computers back in the '90s, the good old days. Man, I feel old.
14:18 Only 957 triangles? It looks like there's some tessellation going on, with subsurface smoothing. Is that just the video quality, or was that already a thing in 1991?
Yeah, but 1280x1024 in 1991 on a 37-inch monitor! No one else at the time had anywhere near 1080p resolution on their monitors. Today that huge CRT does look ridiculous, though. Imagine having to move house with that LMAO
hevad Hehe. That was somewhat funny, but it probably would have been better if the narrator guy had just found out why they used K. If I recall correctly, the K is the initial for Key: the black plate in printing used to be called the key plate, and illustrations, diagrams, cartoons, and such often have black lines, called key lines, defining the borders of the figures and areas of differing coloring.
Did Computer Chronicles ever cover video projectors? Those were a thing as well back then, but they were really expensive. Only some big corporations and such could afford them, I guess.
I would trade all the YouTube junk review videos for a program like this. Even Apple keynotes are terrible now, so full of fluff. I would welcome the old-style Apple keynotes of the early Jobs days. PS: this video reminded me of when I used to confuse After Dark for Windows with my favorite game, Alone in the Dark! 😅