Literally 90+% of the population requires an explanation of CRT technology. Likely you and everyone else who 'knew' what a CRT was had just seen or used one before and maybe knew the acronym.
Yeah, I grew up with SNES/N64/PS1, and in the 2000s when it became common to play those systems on emulators I hated the hideous flat pixelated look (PS1 games look like pixel vomit on an LCD). I stayed away from playing old games because of that. Then I saw a post of a gaming setup with a CRT monitor a few years back and loved the aesthetic. So I bought one, pulled out all my old systems from my closet, and they look absolutely stunning on a CRT. Nostalgia has nothing to do with it. It's how the games were built to be played. I went from RF modulator trash in the '90s to god-tier RGB in the '20s.
@@awill891 I'm 41 years old and have been playing games since I was 6. I lived through the CRT era and still have a CRT at home that I sometimes use for retro gaming when I feel like it. Also fine with using LCDs, with various settings. I do mod hacking on old NES games. I play old N64, Dreamcast, Gamecube, PS1 stuff, etc. C64, Amiga, and Atari 2600 too when the whim occurs once every several years or so. I have a lot of younger gamer friends, and very few of them can stomach my favorite games from the NES era. Nostalgia has a ton to do with it. CRT is certainly how they were played and how they were built, but the concept of "how the games were meant to be played" is just fanboi nonsense. It's how you prefer it to look. And that's fine. Just don't pretend authorial intent is an objective metric.
@@blarghblargh "fanboi nonsense" god you're so embarrassing to get this upset and call names. I have an OLED and LCD for modern stuff and CRT for old analog stuff. Have no clue why you're so butthurt. It's a fact pre-HD games look better on a CRT. Unless all the devs in 1991 were developing 240p games for a 4K OLED screen and I didn't know about it.
@@blarghblargh I think CRTs looking much better than LCDs with lower resolution content like retro games isn't really a matter of preference, unless you somehow love excessive blur and artifacting. It's probably more a case of it just happened to work out this way, rather than any sort of "how it was meant to be played" but still.
Linus: "remember those glow in the dark stars you had on your ceiling as a kid?" Me, a 22-year-old homeowner: "you mean the ones I put in my bedroom very recently... yes"
@@zuki9537 Any first-world nation excluding second-rate places like America. You could trip over and be in debt your whole life, and count yourself lucky some scum shot you at a school or fast food place for no reason so you didn't have to work 5 jobs with no leave to pay for a wooden house that could burn down at any point 🤣.
@@lifeofentropy 31 here, and I just had to give up on owning a home because I can barely afford rent in my work area... maybe someday, when I finally pay off all my debt, assuming I'm not forced to pay for healthcare, a new car, car repairs, traffic tickets, or any other surprises...
My heart skipped a beat when Linus skewed the electron beam with that magnetic bit... I'm old enough to remember when you used to need a TV repairman to use a "degaussing coil" after something magnetic was brought too close to the screen (it would leave the mask magnetized in small areas, skewing color "ghosts" onto the picture like a rainbow burn-in). When color CRT monitors became popular, they began fitting a degaussing coil inside the frame of the display face that sent a degaussing burst every time it was turned on. That's why bigger monitors always made that weird "whomp" sound on powerup, as the capacitor discharged that degaussing burst through the coil.
I like the degaussing whomp xD My old CRT screen had the function as well, but I only needed it once. I had a set of magnets (for building speakers) too close to it and my whole desktop was wonky xD
Actually that's because they have a big old flyback transformer inside that boosts the voltage to tens of thousands of volts so it can accelerate electrons at the screen; the capacitive effects on those things are crazy (the tube itself acts as a big capacitor). It's not entirely true that you need to discharge them for several days, either: they typically have a rubber "anode cap", and if you poke under the skirt with a good insulated screwdriver, a ground cable, and a very steady hand, you can discharge it. I would strongly advise the general population not to try, though. On top of that, relatively modern CRTs have a bleeder resistor in there and discharge fairly quickly after being turned off, but it's best not to assume that's the case. As for the mask getting magnetised: the shadow mask is steel, so it picks up magnetization from external magnetic fields (magnets, speakers, even the Earth's field), and those magnetized spots then deflect the beam away from its intended phosphors, causing the colours to trip out. That's not directly related to the CRT storing a lot of energy; that's just the flyback being heavy, metallic, and dealing with big voltages.
People still use CRT TVs here. I don't fix them anymore because they're all breaking down and I can't offer a warranty, but in 2021 I did what I expect to be my last CRT TV repair. Even with degaussing coils, the CRT degrades over time and gets irreversibly magnetised in some areas. What I used to do to reverse that was attach tiny permanent magnets to the CRT to minimize the colored spots. It was hit or miss, but it worked.
Not to mention that the average CRT monitor can't compare to a modern LCD for clarity/sharpness - if you're doing anything other than gaming on one, get ready for hard to read, shimmery text!
Okay, this might explain why in my head the games I played as a kid looked way better somehow, compared to when I replayed them on LCD later. The blur is mind blowing.
It is. CRT, in a way, is an inherently flawed technology, but games of that day were made with CRTs, for CRTs, and especially for RF or composite cables (or at best S-video), which added a blur of their own. Old SNES and Genesis/MD games that have dithered, blocky, banding color gradients when viewed on an LCD actually look smooth on a CRT, almost as if the console is outputting more colors than it is truly capable of. The same effect was even used to create transparency effects which look real on a CRT but wrong on an LCD. Like the shield or waterfalls in Sonic 1: they look like a dithered mess on an LCD, whereas on a CRT they're smooth and transparent.
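To illustrate the dithering trick described above: a per-pixel checkerboard of two colors, once blurred by the CRT/composite chain, averages out to the in-between color, which is how consoles faked 50% transparency for free. A toy sketch (pure Python, with a crude average standing in for the analog blur):

```python
def checkerboard(color_a, color_b, w=4, h=4):
    """Alternate two brightness values per pixel: the classic 50% dither."""
    return [[color_a if (x + y) % 2 == 0 else color_b for x in range(w)]
            for y in range(h)]

def blur_average(img):
    """Crude stand-in for CRT/composite blur: average the region."""
    flat = [px for row in img for px in row]
    return sum(flat) / len(flat)

# Dithering full brightness (255) against black (0)...
img = checkerboard(255, 0)
# ...reads as mid-gray once blurred, i.e. the waterfall looks translucent.
print(blur_average(img))  # 127.5
```

On a sharp LCD the same pattern stays a visible checkerboard, which is exactly the "dithered mess" the comment describes.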
100% this is why. I broke out my PS2 the other day and plugged it into a Sony Wega CRT, and it looked fantastic. Just like I remembered it. Modern displays are just absolute garbage when it comes to displaying sub-HD resolutions.
@@TheJRSvideos Well, they can do okay with non-HD film content. But anything digital is going to look like crap on a modern display if it was designed for and on a CRT, because the CRT's drawbacks heavily influenced how those games were made. Take away the CRT and you remove a massive part of the equation that led to the look those games had.
@@TheJRSvideos Key word: sub-HD. I really loved this video, more than usual; I often watch Linus videos for fun, and probably my favourite ones are about very old and very rare technology. Thing is, he's kind of got a salesman vibe, so while he's being fairly objective, I think a lot of people are getting the wrong impression or misremembering. A CRT might look "better" than a cheap display, but the games themselves looked bad, and the displays any middle-class family had looked bad too, with all kinds of different problems, especially the brightness issue and some of the other stuff Linus talked about. So I feel like a high-enough-refresh monitor with the least motion blur would still look best if you could just get the scaling right. But then, I also hated the graphics back then. Command & Conquer: Red Alert is just unplayable to me for so many reasons, and visual quality is one of them. Adding blur from a CRT is lipstick on a pig for graphics of that era, imo. Put it this way: The Division blows me away to this day with how good it looks, and it doesn't need blurring or CRT quirks to do so.
Retro games were designed around CRT use; that's why they look better on one. LCD panels didn't come out until long after, so they weren't even a consideration.
I can't even put into words how different this video was from your usual content, but damn was I glued to my screen for this. Study this, and make many more videos like it.
Honestly I am always fascinated by how good the best of old technology is. The same goes for digital cameras vs. analog cameras. The reason we can have old movies in 4K and 8K is that the analog film they were shot on holds so much more data than early digital cameras could capture. I remember the first digital camera we had... you can't even see the faces in the photos clearly.
Film quality depends on the physical quality of the film stock the movie was shot on. There are HD releases of old movies out there which look absolutely clean, and only part of that is the digital processing in remastering; there are also HD releases in which you can see the film grain, meaning the original stock wasn't that fine. So whether rescanning in 4K or 8K actually makes sense depends on the source material. Overall, though, I absolutely know what you meant, and I'm also amazed at how good very old movies can look once scanned in HD.
Old digital cameras with a CCD image sensor instead of a CMOS sensor can take better looking photos. Those old cameras were *expensive* and part of that expense was the big, glass lens. When you zoom in on a CCD image you see *pixels*. Zoom in on a CMOS image and you see *blurry fuzz*. CCD is "sharp all the way down". CMOS works around its built in fuzz by going to extreme resolutions. Like a Class D audio amp, the fuzz is buried in the signal.
There are 3 kinds of camera: digital, analog, and film. The ones we used in the 90s to take photos were color film cameras; they were not analog. The only analog cameras that ever existed were old analog TV broadcast cameras based on the iconoscope, an analog device that is basically a reverse CRT, kind of like how a microphone is a reverse speaker. Iconoscope cameras were the only analog cameras, and they basically weren't available for ordinary people to buy. We had film cameras instead.
@@greggv8 Really depends... The major disadvantage of CMOS is rolling shutter. CCDs read the whole frame out at once (a global shutter), so they don't have the rolling-shutter distortions that CMOS sensors can show. I read somewhere that CMOS is basically cheaper, so it's more widely used.
It took me a long while to understand why everyone wanted antialiasing. I never used it in the CRT days because it always looked worse, but now it's basically mandatory.
I thought it was just a me thing and never realized it was the CRT, because I would never use AA on a CRT. But like you said, now it's basically mandatory; I figured it was just a thing of the times. Wild.
The type of AA matters a lot. FXAA often just smudges the image; SSAA, MSAA, etc. are far better, since they downsample and/or use 3D geometry data. Still, yeah, on a CRT you don't need it.
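A rough sketch of why SSAA-style downsampling helps where FXAA's post-process blur doesn't: render (here, just sample) at a higher rate per pixel and box-filter down, so an edge pixel gets genuine partial coverage instead of a smudge. The toy "scene" and function names are made up for illustration:

```python
def render(x, y):
    """Toy 'scene': white above the diagonal edge y = x, black below."""
    return 255 if y < x else 0

def ssaa_pixel(px, py, factor=2):
    """Average a factor x factor grid of subsamples per pixel (box filter),
    which is the essence of supersampled antialiasing."""
    total = 0
    for sy in range(factor):
        for sx in range(factor):
            total += render(px + (sx + 0.5) / factor,
                            py + (sy + 0.5) / factor)
    return total / (factor * factor)

# A pixel straddling the edge lands between black and white, instead of
# snapping to one side as a single center sample would.
print(ssaa_pixel(3, 3))
```

A CRT's beam spot does a softer version of this averaging optically, which is part of why hard pixel edges were never visible on one in the first place.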
This is strange, Linus making such an obvious error. He also talks about scanlines, so it might be that this monitor supports 480i and this is a special mode. It would also be interesting to see how it was hooked up.
@@LucaMolteni84 It's an easy mistake to make, as in the retro world most things did NOT support 480p, only 480i; by the time progressive scan was mainstream, CRTs were on their way out...
Congrats on owning an FW900! I'm glad to see this monitor being talked about even 20 years after it was made! I have owned an FW900 for several years now and have used it as my main monitor every day since. After watching your video, there are a few things I wanted to mention regarding resolutions and display connections. You are correct that the Titan X is the last Nvidia card with a native DVI-I output to complement the FW900's VGA and RGBHV-via-BNC analog inputs; however, you can also use various DAC adapters such as HDMI to VGA, DP to VGA, and even USB-C to VGA. This is very helpful for people like myself who have an FW900 and a modern GPU such as an RTX 3070 or 3080. It is important to note that the majority of available DACs will limit the output to 1920x1080 @ 60 Hz. However, depending on which adapter you use, you can output up to 2560x1600 @ 60 Hz. Some will argue this is not recommended, as it goes above the user manual's list of supported resolutions and can put more strain and wear on the monitor over time. Certain DP to VGA adapters can display 2560x1600 @ 60 Hz because they have a higher pixel clock than others: to run that resolution, you need an adapter with a pixel clock of at least 350 MHz. Here is a small list of DP to VGA adapters that I can confirm will go past the 1920x1080 @ 60 Hz limitation (350 MHz pixel clock or higher):
- StarTech DP2VGAHD20
- IcyBox IB-SPL1031 (requires USB power)
- Delock DisplayPort 1.2 male > VGA female, black (No. 62967)
I can also confirm that all three listed adapters will allow 2304x1440 @ 80 Hz with no issues.
It goes without saying that the people at the Hard Forum have well documented these findings and can be found here: hardforum.com/threads/24-widescreen-crt-fw900-from-ebay-arrived-comments.952788/page-381 Otherwise, amazing video! Hope to see more of your FW900 in the near future!
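The ~350 MHz figure in the comment above checks out once you remember the DAC has to clock out blanking intervals as well as active pixels. A back-of-envelope estimate (the 35% blanking overhead is a rough assumption; real VESA CVT timings compute exact totals per mode):

```python
def approx_pixel_clock_hz(width, height, refresh_hz, blanking_overhead=0.35):
    """Active pixel rate plus an assumed blanking overhead.
    Real timing standards (e.g. VESA CVT) derive exact htotal/vtotal
    instead of a flat percentage, so treat this as an estimate."""
    return width * height * refresh_hz * (1 + blanking_overhead)

for w, h, hz in [(1920, 1080, 60), (2560, 1600, 60), (2304, 1440, 80)]:
    mhz = approx_pixel_clock_hz(w, h, hz) / 1e6
    print(f"{w}x{h} @ {hz} Hz: ~{mhz:.0f} MHz pixel clock")
```

Both 2560x1600 @ 60 Hz and 2304x1440 @ 80 Hz land in the 330-360 MHz ballpark, which is why adapters capped at the common ~165-270 MHz DAC clocks top out around 1920x1080 @ 60 Hz.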
So a big correction: The Trinitron didn't use slits, it used vertical wires. Those two faint lines at 1/3rd and 2/3rds of the way up are horizontal stabilizers. If you give a light bop to the side of the display you'll actually see the image shimmer and wobble as all the wires bounce around a bit. Also another fun trick that old CRT-designed games used was free transparency. They would render textures that were in various grid or line patterns and take advantage of the natural blur that would be introduced to get free transparent clouds or translucent texture effects.
The natural blur thing and the vampire example Linus gave is not a feature of CRTs but of the old composite video signal (that yellow cable that's usually paired with the red and white audio cables). That signal crammed all the analog brightness and color and timing data into one wire, which results in a lot of distortions like that. You won't get it on a CRT with more 'modern' inputs like S-video, component or RGB/VGA (in ascending order of quality). The two are often equated because especially in the states, all old TVs only had composite input (nowadays people 'RGB mod' their CRT TVs to get higher quality RGB input into the display)
If someone want to sell that widescreen CRT Sony GDM-FW900, I'm interested as long as it can be shipped to Montreal from British Columbia or Alberta, Saskatchewan or Manitoba; I can drive to pick it up in Ontario, or Nova Scotia, in the bilingual Province of/du New/Nouveau Brunswick, Province du Québec, Territory of Labrador or Newfoundland.
When I was a young teen, my dad decided we would tackle our basement finishing all by ourselves so he could take the savings and buy the latest, greatest, 32" Trinitron, like around 1989 or so. It was a BEHEMOTH of a TV. I think it stayed down there (we turned the main basement space into like a giant den with surround sound and everything) until only a couple of years ago when he finally let it go on a random, big-appliance trash day. Playing SNES and then N64 on that thing was BOMB ASS. Man, me and my brother didn't know how lucky we were.
@@Roshan_420 I think so. It's kind of a shame, but you never really know what's going to become valuable later on, and it's not worth hoarding every single piece of tech. It was basically an old TV, and my dad saw no value in it anymore. Not to mention, that thing weighed a TON. So glad I'll never have to move CRTs again, personally.
I used to review monitors for a magazine and I remember being very proud of myself for being able to lift this thing up on a desk without breaking it or hurting myself.
I had the 24" Dell version of this for which I paid about £60 in 2003/2004. I upgraded to an lcd telly in 2007 and I do not regret going to 720p because of the trade off in size of the screen. The dell was almost as deep as it was wide. Like John, I was proud the day I took it from my car, went up four flights of steps and placed it gently down while my shoulders and arms were on fire. I updated to a 1080p lcd screen within one or two years. $3000 is a joke for something that basically benefits only the retro feel.
@@goblinphreak2132 Goblin, you do understand that most people struggle with 100 lbs, let alone a 200 lb CRT! You must be popular when friends are moving home!
@@faisalkl The Sony FW900 and the Dell version you had went way higher than 720p or 1080p: it could happily do 2560x1600 @ 85 Hz or 1920x1200 @ 120 Hz. It is far beyond a retro monitor; sure, thanks to CRTs displaying low resolutions so beautifully it can happily show 640x480, but a 720p LCD would have been a huge downgrade for an FW900 owner.
@@olivermood8003 Yes, it could do much higher than 1080p, and my go-to resolution was 1920x1200. However, if I went higher than 1200p my eyes would hurt like crazy after a few minutes, and I never found any sweet spot better than that. Funnily enough, I now run 4K on a 28" with a much higher dpi and no ill effect on my eyes. I was well into running the Dell at high refresh rates, so I don't know what caused the discomfort; it's possible I was doing something wrong. The picture quality was sublime, though!
@@lopiklop Not quite the same. CRTs are made with several pounds of lead in the glass. Plus, most light bulbs are also LED now. CRTs will never be manufactured again based on the manufacturing process alone.
When I was in high school I had a teacher who had this exact monitor; they bought it themselves to use at school. One day they came in and it was gone, replaced with a crappy Dell monitor. Apparently the school district had replaced all the computers, and the IT guys took the monitor thinking it belonged to the school. It took 2 days of complaining before they gave it back.
I always wondered how many PVMs (professional video monitors) were thrown into dumpsters when security systems or whatever were upgraded. I remember seeing walls of these monitors in movies. They're expensive for their size, considering their age, but I always wanted one just for retro gaming.
Okay, here's the thing. I kinda wish that someone would step in with a brand new no compromises CRT for historical purposes, like for museums. Because a lot of historical footage is completely unwatchable on modern displays because of the upscaling artifacts.
If anyone were to manufacture tubes again they would be as expensive as they used to be (if not significantly more, given the lost economy of scale), and nobody would buy them. I got a new-old-stock 32" Trinitron for 500 bucks (a $700 TV in '06) and people in the CRT gaming subreddit flipped out because I'm "ruining the value" of tubes.
@@braydoncoate9583 if this video proves anything it is that if someone made it, certain people would happily pay through the nose for them as long as they are good quality. It'd be a license to print money for a company willing to bite that bullet of tooling up a factory to make them and then sell them at a 2000% markup coz where else are people gonna go to buy them?
@@braydoncoate9583 yeah. I'm assuming while there's documentation lingering in company offices and warehouses, most of it is likely to be proprietary, and all the people who designed the methods for manufacturing these puppies are retired or passed away. Hell, all the plants that made these surely scrapped or reused any tools they would have used, so a new manufacture would likely have to setup a plant from scratch. So the only way for a crt to make a come back would be if some insane billionaire did it as a pet project...
5:23 - One of the most useful tips I've heard on here. I hate it when my PC randomly opens things on the wrong monitor, especially when the 2nd screen is off. Win+Shift+arrowkey is the fix I was longing for!
What are you on about?... (presses Win+Shift+Right arrow) YouTube jumps over to the 2nd display. OMG, where has this been my whole life? As someone who plays games fullscreen and browses the web on the other display, this could be a game changer. Thank you; I didn't really get what was happening when Linus did it.
My back still remembers CRT displays. Moving entire CAD teams to new offices. 21" or 24" CRT monitors with full metal shielding. They were both really heavy and so massive that you couldn't really carry them safely unless there were two of you, which hardly ever happened.
When my father worked for Sun they had a special trolley for moving the monitors around when they'd do installation and repair as they were not only very heavy, but also very fragile and expensive. Of course he also had to lug the servers around too...
I just sold ONE of these in my Patreon yard sale... neither one to Linus, shame. EDIT: the 2nd unit's winner pulled out, so expect unit 2 back in the June yard sale. EDIT 2: you should take the bezel off and peel away the plastic anti-glare coating; you get a ton of nits back.
He's still missing ONE key component there: a VGA output for the Dreamcast. That doubles the vertical resolution, outputting VGA 640x480 or 480p instead of 480i. Those signals are all natively present inside the connection port there, as well as full SCART and composite etc. (You can also do this with a Sony PS2, using the correct adapter cord, however it only outputs 640x480 IIRC, as the games were all made for that aspect ratio to match TVs back then.)
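For anyone fuzzy on the 480i vs 480p distinction in the comment above: interlaced video sends each frame as two half-height fields (the odd-numbered lines, then the even-numbered ones), while progressive sends all 480 lines every refresh. A toy sketch of the field split (illustrative only, treating a frame as a list of line strings):

```python
def split_fields(frame):
    """480i-style: one field of every other line starting at the top,
    one field of the lines in between."""
    top_field = frame[0::2]     # lines 0, 2, 4, ...
    bottom_field = frame[1::2]  # lines 1, 3, 5, ...
    return top_field, bottom_field

frame = [f"line {n}" for n in range(480)]  # a full 480p frame
top, bottom = split_fields(frame)
# Each field carries only half the vertical detail per refresh,
# which is why 480i looks softer and flickers on fine horizontal lines.
print(len(frame), len(top), len(bottom))  # 480 240 240
```

The Dreamcast's VGA output sidesteps this entirely by sending the full 480 lines progressively, which is the "doubles the vertical resolution" point made above.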
Actually the PS2 was capable of 1080i; there even was a TV with one built into it, where they hardwired it to 1080 (like a built-in Roku vs one you stick in the HDMI port).
As a teen I had a Dreamcast with a VGA box and a Mitsubishi 'Trinitron'-type display. It was SUCH an amazing experience. It made my PS2 look so crappy. But now almost all games work on emulators.
But not all Dreamcast games support the VGA adapter. :( It's not that they *couldn't*; the companies that made them mostly neglected to enable it when the games were written. IIRC most of those games have since been patched to enable 480p, but of course the patched versions will only play on an emulator, cut down to be crammed onto a 700 MB CD-R (which won't boot on the final Dreamcast revision), or on a Dreamcast with a modded BIOS and a hard drive, SD card, or USB mod.
@@animeloveer97 It doesn't seem to make much difference overall though. I have a modded PS2 with Component out, HD resolutions can be chosen(and from memory even forced for some games that didn't originally support it), and I was struggling to determine which output was best(on an LCD TV). From memory I think some things looked slightly better & others slightly worse.
It’s sort of amazing that it took this long for other display technologies to *just now* make up for the features we lost when CRTs were discontinued. And some features in particular, like being able to operate natively at lower resolutions, still haven’t really been replicated.
Yes, and even OLED hasn't been able to fully "mimic" CRT behaviour. It's just a shame that we haven't really improved in image quality except for resolution and HDR.
@@kingotime8977 In some ways we haven't improved in image quality either. True resolution is really pixel density, dpi (dots per inch), not "1366x768"; nobody even advertises dpi anymore, and resolution constantly gets conflated with image size and raw pixel counts.

Big flatscreens may be 720p or 1366x768 or better, yada yada, but I find I need two monitors to run two programs. On a CRT running 640x480 or 800x600 I could cascade or tile windows however I wanted, running 3 or 4 programs comfortably at once on that small screen and still see everything quite clearly. Some of this is the trend from "programs" to "apps" and poor or nonexistent optimization, but a lot of it is the display and scaling: desktop icons and programs change size, you end up needing larger icons at higher resolutions, and many icons and programs don't scale up well at all. It's like blowing up a photo without the negatives; it blurs and distorts. Text-based programs and iconography designed for older versions of Windows don't scale well nowadays.

Sometimes it's better to read on a phone or tablet, because the desktop world is formatted for video content now, not text, while phones still optimize for text readability behind the scenes, and tablets are closely linked to e-ink and e-readers; an iPad is roughly the size of a sheet of paper and has to care about text reproduction for reading and for artists drawing with an Apple Pencil.

Reading on a modern computer display compared to a CRT or those other devices is like comparing a TV to an overhead or slide projector: yes, you can read on one, but why, when the experience is so poor and distorted?
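On the dpi point above: pixel density is what you get from the pixel counts and the physical diagonal, and it's easy to compute. A sketch (the panel sizes below are example assumptions, not figures from the comment):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count over physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Example: a 1366x768 panel at 15.6" vs an 800x600 CRT with ~14" of
# viewable diagonal (illustrative sizes).
print(f"{ppi(1366, 768, 15.6):.0f} ppi")
print(f"{ppi(800, 600, 14):.0f} ppi")
```

Two displays with the same "resolution" can therefore have very different densities, which is the conflation the comment is complaining about.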
@@tristanwegner Blacks on CRTs are much better. Unlike OLED, which just switches pixels off, a CRT can handle an enormous greyscale range and retain detail in even the darkest scenes. Trying to play horror games or watch dark movies/TV shows on an OLED is not easy, as you can't make anything out; a CRT can happily show so many different near-black shades, from the darkest of darks to the most subtle.
This has also happened within flat-panel technology. Plasma displays are insane: they have outrageous 40,000+:1 contrast ratios and produce very nice images with good internal response times. The move away from plasma to LED-backlit LCD was another regression that we are only now digging our way out of with OLEDs.
Sobering information: There haven't been new CRTs made in nearly two decades, even the runoff Chinese cheapos that you could still find after the big players moved away from them entirely and sold their production equipment to the Chinese manufacturers didn't last more than a few years. They required specialized equipment to make the windings for the deflection yokes. That equipment was, as mentioned before, sold off to the Chinese market and eventually retired and probably scrapped by now. Even if you found some sitting in a warehouse in China somewhere, it's got decades of neglect on it, not to mention you'd still need to produce the actual vacuum tubes and while those are easier, there aren't any of those production facilities configured to produce them and likely wouldn't be interested in taking on whatever small scale orders someone would put in for them to produce a couple thousand tubes. If you wanted CRTs, you'd have to bulk order them. Many tens of thousands. Quantities that even on the enthusiast market wouldn't move fast enough to recoup the costs. Whatever CRTs are out there now are the last of them. They are never coming back. If you're a retro gamer, get them while you can until they get too tired to keep putting out a decent picture.
@@goblinphreak2132 No it is not, unfortunately. Back in the CRT era, at first it was a product for the elite, costing nearly $10,000 in today's money, and those models were not profitable; they were an investment against growing demand and falling costs. It became profitable and accessible when produced in the millions, with factories reused and slightly modified across several models over the years, so profitability was achieved over time. Those factories require maintenance too, which is very expensive, as you can imagine. Not a profitable operation today, at all. The industry stopped production because it was no longer profitable; that is by far the biggest reason. Demand for these behemoths would not grow, at best stagnate, and their durability means low replacement demand overall.
@@goblinphreak2132 No it isn't. There used to be a CRT in every home and dozens to hundreds in every office. Even if 1 in 10 gamers wanted one, it still wouldn't be high enough volume for the manufacturing to be affordable.
@@goblinphreak2132 Everything you said is wrong. The retro scene is not anywhere near as big as you think it is, the percentage of retro gamers who actually want CRTs is smaller than you think it is, the ease at which anybody (even the biggest corporations) could begin production of CRTs is not anything at all like you think it is. If anyone actually _did_ start making CRTs it would be at a massive loss. There is no profit in it. Not even in the fantasy land you live in.
@@goblinphreak2132 I just think they would be way more expensive than people want to pay for because the economies of scale will be really low. For example DJs love Technics turntables, I have a pair of SL1200-MKII that I own and they're great. Technics stopped production ages ago but finally decided to resume production and the new one is $5,000. I bought my two MK2's for $400 each brand new still in box in 2007. Everyone thinks if they make CRTs again it will be $200 - $300 for a good one like it was in the late 90s but they'd likely have to cost like 10x that minimum which few people would pay for.
Still have my Iiyama VisionMaster Pro 454; this vid has encouraged me to break it out and find out if it still works... It was a beast in its day. Not quite as nice as the Sony, but super affordable.
Exactly, the 454 is a beast. Imo the Diamondtron series is better than the Trinitron, because Sony just didn't try to improve it much while their patent was intact; once it expired and others were able to copy it, some copied and improved on it. The 454 I have was sort of new old stock, bought by a company and only used for a few weeks before being replaced by early LCDs, then kept in a box. I had to replace the main line capacitor though; it blew up on me after I first powered it on and used it for a few hours. It was quite old anyway.
I've got an Iiyama VisionMaster 400 (found at the local dump, perfect condition), and it's really nice as well. I would be using it with my gaming rig, but for some reason windows completely locks up if I connect it...
I was curious about CRTs ever since I was old enough to understand how electronics worked. When I discovered an animated explanation of how a CRT works, it totally blew my mind. I just wondered how the hell humans developed something that could pull off such an amazing feat.
Interestingly, as I hear Linus 'grew up on a farm', the CRT was invented by a farmer too! He came up with the beam-bending idea to break pictures up into scan lines and so on. The color version was a refinement by Zenith, I think: they basically threw it at the engineers saying 'make a color version of this', and they figured it out by splitting the phosphors into color groups. Trinitron was a way to reduce the shadow mask's brightness loss by using wires instead of a solid sheet with a bunch of holes. Problems with Trinitron include the always-annoying horizontal stabilization wire(s), and around a lot of bass they can also get 'rainbow wave' artifacts, which you can sometimes see if you bang on them, depending on size. The brightest CRT I've ever seen was a fairly small random computer monitor, like 12", not big at all, just 'WOW that's really BRIGHT!' Interestingly, the Pioneer Kuro that fanboys rave about as so 'movie-like' because of its 'dark blacks' (for a plasma) is just like many of their other plasmas, but with a darker tinted layer in front, like sunglasses, because otherwise the gray background would be even less dark than a CRT's. Anyone who pretends CRT has better black levels than LCD or anything else is kidding themselves: just look at Linus there in the studio, staring directly at the displays and how light gray the 'blacks' on the CRTs are, while repeating the same 'CRTs have the best black levels' mantra. WTF?
@@Deathrape2001 Aside from all the hateful drivel, "the CRT was invented by a farmer" is plainly misleading. He grew up on a farm, but that had nothing to do with his discoveries. Upon graduating high school he spent a few months in the military before quitting to go to university. He and all the others who worked on it were physicists and the like by profession.
My 22-inch NEC MultiSync FE2111SB died 😢😢😢😢 At first the contact would drop out; give it a knock and it would work again. I went over all the boards with a magnifying glass and resoldered a few spots, but it was useless. And today it stopped starting altogether; the indicator LED just blinks 😢😢😢 And the monitor was top tier, one of the last and best in quality, with a Diamondtron tube, a serious competitor to the Trinitron.
I sold two rather high-end 19" monitors (I was an aspiring CS pro) to a researcher on the other side of the country (just Sweden, but still). Her requirements were that the monitors could do 120 Hz and higher. My Iiyama monitor could do 200(!) Hz and was the ultimate underdog to Sony's Trinitron technology; it was called Diamondtron as far as I can remember. Why did she need them at any price? High refresh rates weren't a thing with the TFT screens of the time. She just straight up bought them, sent a crate to my house, and had them delivered overnight. I think my asking price was a bit high, but these were top-of-the-line gaming monitors at the time; she would have paid triple the price. She was working with Uppsala University on grant money for studying flies: how they could be so energy efficient, considering how little energy their brains consume while doing complex calculations in flight and such. She was charmingly passionate about the possibilities.
Man, I think CRT monitors helped old games look the way we remember imagining them, like how remastered versions look today. Perhaps games don't need an actual remaster, just a good old CRT monitor.
As I already said up above, the problem with remasters isn't that the original looked better on older hardware, but that many remasters are incompetently done. BioShock and StarCraft are two that immediately spring to mind; Fable afaik got panned for its remaster too, though I played neither.

Artistry in gaming is partly making everything work together. Even if, say, realtime path tracing isn't available and you need all kinds of raster tricks, those tricks can look better than badly done RT. Alleged "HD texture packs" tend to clash wildly with everything else that was put together to work well in the original game. Remasters tend to modify some parts and not others, replacing elements of the original artistic vision, which looked more cartoony or blocky but simply looked better despite its poorer technical visual quality, and on top of it all they often clash with the very message or theme of the original game itself. Just imagine taking some old grainy movie's musical score, removing the tinny, staticky, lo-fi sound, and "upgrading" it by adding modern rock music to a black-and-white crime noir. Or imagine removing the original chipset sound of 90s console games and "remastering" it by introducing jazz and Lady Gaga. In Resident Evil. It doesn't suck because of a technical problem; it sucks because it's badly done and clashes with everything else.

I grew up then. Linus's monitor is a multi-thousand-dollar piece of equipment that costs far more than a midrange computer today, even at scalper pricing, for just the monitor. It's basically a 1440p display from the VGA period. No game was ever made to look good on that particular monitor; they were made for my family's and friends' monitors and TVs.
They may look better in a sense, but they still looked like ass, because it was the late 90s. Everything looked like low-fidelity Minecraft. That's why retrogamers all use pixel stuff, because that is how it all looked even back then on period-correct technology. Imagine somebody trying to cargo-cult late-90s games, cargo-culting all the absolute worst aspects that I'm glad are dead and gone, only to leave out the one good thing they often had, which was SOUL. Fallout, Baldur's Gate, Planescape, these had soul, Starcraft and Jedi Knight too, but they all looked awful, and so terrible was it visually that for a good 10 years the thing that mattered most to gamers was better graphics (partly because I don't think it ever occurred to us that storytelling, artistry, and mechanics would degrade in the process). t. cranky 90s boomer
@@GameCyborgCh For maybe a few thousand people within the entire world? Absolutely not. I see the appeal of CRTs for their niche use cases but that's all it is. An extremely niche group.
When I was a young adult, one of the first things I bought with my hard earned money was a Sony Trinitron CRT TV. It weighed 90lbs, it was a behemoth…. And I’ve never enjoyed an image on any screen so much since. It got left at my parents and sold at some point. I had no idea that image quality would still be playing catch-up this far into the future! I miss that thing. Lights off, Doom 3 running through my 5.1 sound setup…. It was magical
Intel tech upgrade in 2 years: Linus: "Anthony, is this LMG's CRT monitor?" Anthony: "Well, nobody was using it, sooooo..." *I do hope he gets to use it; he's the one that will appreciate it the most, and I'm sure he will keep it in the best shape.
Maybe make a retro gaming setup in the office so that all employees can enjoy it during break rather than a single person. But should a single person get to use it, my vote is Anthony.
Retro games are better on a large 480i TV: 4:3, high brightness, easy support for 240p, RGB hookups, etc. They might as well just sell it tbh; even a heavily used, worn-out FW-900 is worth a small fortune and shouldn't just end up sitting in a YouTuber's warehouse of unused tech.
One of the 'quality' measures of a CRT was its dot pitch, which referred to the distance between pixels ("picture elements"). Crappy, low-end monitors had a .51 dot pitch, whereas the standard was .28. IIRC, it got down to like .23 or .21 in the consumer segment? Either way, this was important when you were driving higher resolutions with smaller fonts. (Trying to recall from 25+ years ago when I was working at CompUSA) we'd put up 1024x768 (SVGA, baby!) resolution, and open up something like Word or Notepad. Change the font size to like 6. On the .51 dot pitch monitors, the letter "a" was nearly unreadable, as the pixels were simply too far apart to give you a recognizable character. Cut that gap down to .28 and now all the pixels that make up the "a" were closer together, which brought the "a" into focus (not literally, but that's the best way to describe it). Completely unrelated... kinda... The Amiga with a Commodore 1084S monitor was still the bomb gaming setup for MANY years. You wanna dig into history? Do a retro reflection video of that platform.
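The dot-pitch arithmetic here is easy to sketch. A rough illustration of my own (not from the comment): the 315 mm viewable width I use for a 17" tube is an assumption, and the calculation is only a crude upper bound on horizontal detail.

```python
# Rough sketch: estimate how many distinct horizontal dots a CRT's mask can
# resolve, by dividing viewable width by dot pitch. The 315 mm viewable width
# for a nominal 17" tube is my own assumption, not a measured figure.
def max_useful_dots(viewable_width_mm: float, dot_pitch_mm: float) -> int:
    """Approximate upper bound on horizontally resolvable dots."""
    return int(viewable_width_mm / dot_pitch_mm)

for pitch_mm in (0.51, 0.28, 0.23):
    dots = max_useful_dots(315, pitch_mm)
    print(f"{pitch_mm} mm pitch -> ~{dots} dots across")
```

On a .51 pitch the mask resolves well under 1024 dots across, which is consistent with the comment's point about 1024x768 text turning to mush on cheap monitors.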
I remember walking into a CompUSA back in the day and they had a row of PC CRTs set up. The Sony G400 stood out from the crowd due to its higher aperture-grille contrast punch vs the shadow-mask CRTs around it. I ended up buying the G400; it was a great monitor and one of many CRTs I'd end up owning. The G520P & FW900 were my favorites, and I regret not having the FW900 repaired after the flyback transformer failed.
The dot pitch measured the distance between physical shadow mask elements, not the pixels themselves. You could have pixels smaller than your dot pitch occasionally - I remember seeing that in the days of .39 dot pitch "interlaced" vs .28 dot pitch "noninterlaced", referring to whether 1024x768 pixel resolution was displayed interlaced or not. On a .39 display, that resolution had a kind of supersampled/downrezzed look, where you could somehow tell that there was more detail behind the shadow mask, but you weren't getting all of it.
@@jimmay8627 That's true. Because the mask only affects which part of the phosphor coating gets hit by the electron beam, it was/is theoretically possible for the beam power (and in practice the brightness of the phosphor) to change in the middle of a dot (assuming beam focusing is sharp enough, which didn't happen with cheaper monitors). However, the mask's dot pitch obviously puts heavy limitations on the usable resolution, because no information can pass through the shadowed parts of the mask.
I would love to rock a CRT but they just have one massive drawback. Most of the good ones have been all used up and are impossible to repair or best case is to get a slightly less used up one. They are all on the clock and only going up in price and it is such a huge bummer they are no longer in production. It is understandable why, but it would be so cool if someone took the technology a little farther and still produced fresh, wide aspect ratio crt monitors possibly with a proper DAC built in.
In the past two years I have snagged a dozen or so CRT TVs and monitors for free. One of the monitors has some burn in but the rest are in great shape. One of them is an old RCA pattern Colortyme console TV from the '80s and one is a Panasonic from the '90s with their pinhole type shadow mask but the rest are all Trinitron pattern. I used a couple of Trinitron monitors from 1995-2009. I keep a Toshiba flat Trinitron pattern 27" VCR/DVD combo in my living room for movies and a Sharp curved type Trinitron pattern 27" in my master suite for playing all my old consoles. Currently, I just have SNES, Gamecube and Wii, plus a Retron NES clone but I really want another PS2. I ended up owning 5 of them because I would think I was done, sell it and then eventually want another one. Dreamcast is getting too expensive so I just emulate that on an older laptop with analog output. One thing I need to do is get busy and start adding component input to all my TVs.
Love the CRT content, but a few things I noticed! You can absolutely use active DP and even some HDMI adapters with CRTs; they are just a little finicky to get working, and you have to use programs like Custom Resolution Utility to push high framerates and resolutions. Also, that Dreamcast is pushing 480p to that monitor; it's one of the few game consoles that can natively output it over VGA! I don't believe that monitor can even run at a 15 kHz horizontal refresh rate to take 480i and 240p signals. Typically only monitors meant for broadcast TV do.
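The scan-rate point is simple arithmetic: horizontal frequency is vertical refresh times total scanlines per frame (visible plus blanking). A quick sketch of my own, using nominal NTSC-ish line counts as assumptions:

```python
# Sketch (assumptions mine): horizontal scan rate = total lines per frame
# (visible + blanking) * vertical refresh. 240p/480i TV signals scan at
# roughly 15.7 kHz; most PC CRTs bottom out around 30 kHz, which is why
# they need 480p or higher.
def h_scan_khz(total_lines: int, vertical_hz: float) -> float:
    return total_lines * vertical_hz / 1000.0

print(f"240p (262 lines @ 60 Hz): {h_scan_khz(262, 60):.2f} kHz")  # TV-rate
print(f"480p (525 lines @ 60 Hz): {h_scan_khz(525, 60):.2f} kHz")  # VGA-rate
```

A monitor whose minimum horizontal rate is ~30 kHz simply can't lock onto the ~15.7 kHz signal, regardless of what adapter you feed it.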
Yeah, about 95% of Dreamcast games supported 480p progressive scan mode with the VGA box, which was already plentiful and easy to find back in 2000. It's not like the GameCube or PS2, where a minority of games supported progressive. The games that didn't support progressive were usually 2D games like SNK fighters. Fun fact: where I live, our OG Xbox didn't support anything but 480i, and it looked bad. PAL users got shafted by Microsoft in image quality.
@@anasevi9456 most GameCube games that you would care to play support progressive scan, but the only way to output 480p natively is with a first party adapter that sells for like $350 now.
@@nebby3 Yeah I should correct my comment to say 480p over VGA (RGBHV). PS2, Gamecube, and Xbox all support at least 480p over YPbPr (Component) but the circuitry for decoding those signals isn't in computer monitors for whatever reason. So you have to find converters for that which are pretty niche, the only one I know of is by the Beharbros, the same people who made the dreamcast box they use in the vid Edit: Also your xbox comment is super wild considering some games support 720p and 1080i on the OG Xbox in NA!
@@nebby3 This isn't as true as it used to be; check out the GC-Video Project. The whole port has been reverse engineered and there are clone solutions on the market for cheaper now. Also, the Wii is an option like that other guy said, but is known to have slightly softer quality.
The thing I miss most about CRTs is that when you had a demanding game, you could switch to a lower resolution and it still looked good. However, I think it's a fair tradeoff for LCDs' size, weight and power consumption.
I remember in the early 2000s I would play pretty much every game possible on a 400-euro 2003 family desktop. Even the newest games like Prince of Persia, Rayman, Morrowind etc. would run no problem because I would run them at 480p, sometimes even lower. I was so used to that, the concept of a graphics card and graphics performance was alien to me. So when I made the transition to LCD it was rough: "hmm, Oblivion sure is struggling, I'll lower the resolution to 480p... why does it look like shit?" But we didn't have a choice as media started to leave 4:3 for 16:9; I couldn't play my Xbox 360 anymore as most games would just be cut off at all corners and most text was unreadable. Fun fact: I played Skyrim day one on an old CRT television from 1999 and it did a great job at hiding the ugliness of the game.
@@champ6436 Screens that generate their light from the panel itself and not from a projected light have always and will always look better due to the way our eyes work and our brain processes the information. It's to help with things like pattern recognition which allows us to see things like tigers hiding in the forest undergrowth.
I remember running Crysis on my 8800 GTX and a 1920x1200 display and actually downgrading to the 16:10 equivalent of 720p just because it was still better looking than high resolution and low everything else. That stupid game was so demanding on hardware of the time. Though I also didn't have a very good CPU... my PC was a bit lopsided since I was 14 and had no idea how to make a PC properly.
The CRT I had 20 years ago was so heavy it bent my desk into a U shape. I still have the desk and it's still bent. Weight was my biggest issue with CRT monitors. If they made them as light as modern LED, I would have stuck with CRT.
Virtually impossible to do so. You need to create a massive glass tube that's strong enough to contain a vacuum. The bigger the screen, the bigger the tube and also the thicker the glass needs to be.
FED/SED displays were a continuation of CRT technology using a matrix of tiny electron emitters, essentially a miniature cathode ray tube per subpixel. They were slimmer, lighter, required less power, and were ready to be launched on the market, but were abandoned after LCDs became dominant...
Long-time CRT fan here, thanks for this great homage. The only thing that gives me peace with CRTs being dead (or dying) is the work of Mike Chi and his RetroTINK. With upcoming 4K60 scalers, the future is looking good for great CRT-like retro filters.
It will be different: the CRT beams the picture to the eyes one pixel at a time, taking the whole frame time to draw the picture, while an OLED presents the whole frame at once rather than drawing individual pixels sequentially. The brain gets a different feeling watching a CRT, especially at the casino.
Rear projection. Three six inch CRTs for each color channel, thus eliminating the need for a dot mask or aperture grille, and allowing 4k+ resolutions. Probably only 2 pounds per diagonal inch, if not less. Because the tubes are smaller, the electron guns don't need to travel as far for the same refresh, making higher refreshes possible. Also, having the DAC (or better yet, three of them -- ideally 10-bit or higher) inside of the monitor so you can have a DisplayPort connection out the back of it.
If you can use a totally black room, you can still go with CRT video projectors. Those are *really* finicky to tune (I had one about 20 years ago), but they have all the upsides of CRT monitors, and if you use a matte screen instead of a silver screen, the viewing angle is just as good as a CRT monitor's. I used to run a PS2 with about an 80" image around the year 2000, and because CRT (being a totally analog technology) has no native resolution, it was able to render everything the video signal had.
That's the thing about technology. As we stop using it, we lose the knowledge. CRTs made today use the simplest and cheapest 80's design. There's no improvement, and even "deprovement". New vinyl and cassette players are garbage compared to even mid-line ones from the 70's and 80's. It's not that it's not worth it... it's that it's not possible. To make a copy of that monitor, the amount of R&D needed, and the build-up to create the components, would be greater than it was at the time. It's the same reason we can't build a Saturn V today. We don't have the technology anymore. It's effectively lost.
Though this one isn't really a CRT, here's one of his videos on something close to it: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-Ngy9TIbREJE.html
7:28 The red bleed comes from limitations in NTSC color encoding in TVs. It's not found in technically superior RGB space like most computer monitors (sort of alluded to at 14:04). 15:17 Says Aperture Grille, shows a slot mask (not a shadow mask like at 15:12)
I remember how disappointed I was with my first LCD TV, because all the channels looked like dog crap. CRT basically has native up-scaling as a physical property.
That's cause all it's gotta do is make the electron beams closer together or farther apart, is also why when things get bigger (beams farther apart ) they get washed out colors like watering down paint
It's more auto-scaling than upscaling, since most CRT TVs never got past SD (525-625 lines). And LCD TVs still fail to adjust well to definitions lower than their native one. BTW, if you try to use a CRT TV with a computer that can display higher RGB resolutions (likely something like a Commodore Amiga or an Atari Falcon), you will have a TV that flickers in a very painful way.
Actually, it is often the circuits that drive the guns, rather than the phosphors, that get "tired" and unbalanced. To calibrate them and restore proper white point and brightness, there are trimmers (variable resistors) on the board inside the monitor, usually at the back of the tube. I have calibrated more monitors than I can count that people were going to trash, and gave them another 10 years of useful life. Be careful when you do this, because you're working on the high side of the circuits: voltages like 15-30 kV are usual. The front-side settings are only for fine adjustments; the gross adjustments are in the back. Now, if someone left the screen in full sun for months on end, you can still improve the image a bit by adjusting it from the back.
Yeah, Adrian's Digital Basement has a ton of good videos on how to do this kind of work, mainly for older tube monitors for PCs. And depending on the specific monitor, I've heard of recaps being very effective.
"I could never afford this kind of thing as a kid or teenager" I felt that. I was lucky to get one console from every generation, but the gaming PCs I saw in magazines were a pipe dream. Now I have a setup that was like $3k and no time to play anything.
My whole ambition is quickly becoming to work as little as possible, because man, I feel that. Even in my early 20s it's like "have some time to play, no money"; now I have as many games as I want on a great system and no time. Eventually the plan for most people is: grow old, have lots of money and lots of free time, but very little time left lol
Still remember this girl had a badass sony setup, back in the mid-late 90s when Sony was going high-end. The entire setup was LIGHTYEARS past my rigged-up e-machines. Her Dad was a car salesman with no tech background, have to imagine some sales team took him for a ride lol.
I was born in the 90s, so my childhood was surrounded by CRT screens. It just dawned on me that one thing a child today will never hear is a parent coming into the room when the screen is black with a little white dot in the middle to say, "oh, looks like the tube is blown." I remember when my last television had its tube blow and we got upgraded to a flat screen, and it was amazing. I was fascinated by it, and the thing is, it wasn't even that flat.
@@ancientaliensarecoming7201 John Linneman is a member of the Digital Foundry channel; they mainly talk about video games, graphics technology, video game history, and other technical stuff.
And I was surprised that CRT tech is all about "reaction time" in producing an image. Looking over at my Omen X 27, I got genuinely scared, even with its 3 ms real timings: slower than three grandpas sitting in a Ferrari outside their rest home. Also waiting for the day the Alienware QD-OLED is buyable on my budget.
It's kinda crazy to think that when we (the olds) were kids, we witnessed what was probably the peak of CRT quality, given the extreme rarity of unopened or unused units in the present day. And unless some niche company decides to make some brand-new, we'll be the only ones who got to see it.
@@austinwolfe7295 There are still a decent amount of good condition CRT monitors around, even a few new in box, so you can still experience top tier CRT displays for now at least.
@@olivermood8003 I miss my CRT Trinitron TV for Smash Melee, Sonic and F-Zero. They are unplayable on a traditional 60 or 75 Hz LCD, especially 2D Sonic.
I had a 21" Eizo Flexscan CRT pro monitor from 2003 to 2009, I got it second hand on an army surplus auction way back when. That thing was a beast, at least 35kgs heavy, huge AF , had a 1600X1200 resolution at 160hz I think. Sharp colors, super black blacks, noice. I loved that monitor. Used it for everything, games, tv series, and such. Sadly it met its demise when our house burnt down in 2009 along with all my stuff. Still miss it to this day.
I still hope that SED/FED is being worked on in some lab somewhere. The insane SED/FED resolution potential makes 4K monitors look like 480p, and it can deliver the blackest of black plus near zero latency of CRTs.
To be fair, old tech literally cost an arm and a leg. We complain about how Apple is overpriced now, but a DynaTAC 8000X cost $3,995 (adjusted for inflation, about $10,000 today) for just a wireless phone. If you want to cough up 10 grand, sure, there may be extremely durable and rugged laptops and phones in that bracket that will never die.
Thing is, even the cheapest CRT would give you perfect motion, next to zero input lag, and at least 85hz. While the cheapest LCD will have bad color, horrible ghosting, noticeable input lag, and only 60hz. LCD tech had only "caught up" at the highest (and priciest) end.
The cheapest LCDs cost less new than the cheapest CRTs did at the time. Good luck getting a new desktop crt for 60 dollars (which comes out to around 100 dollars in 2022 money).
So one major thing Linus missed: the CRT advantage is mainly for gaming. For real work, including graphics work, you want an LCD-based display (or one of the newer technologies that branched off from it). So unless you mainly game on your PC, a CRT will be basically trash. I went from a high-end CRT to a mid-range LCD screen in the AMD Athlon X2 days, and I'll say it: LCD wins. Just don't buy some cheapo LCD today; LCD quality varies as much now as it did back then when you shop ultra cheap.
But cheap CRTs would make high-pitched noises, and you couldn't sit in front of them for long periods. They even flickered a lot. The first, cheapest LCDs at 60 Hz had no flickering at all.
There's a company in the US that still appears to make CRTs, but I think it's for oscilloscopes and stuff. It would be cool if you could do a tour of their facility.
"Did I just waste my freakin' money!?" Linus: Yes... but also no. The retro games comparison was AMAZING. Because there's this thing where we think "Oh that game looked awesome!" and we see it today and it's total shit. But maybe, the fact that the technology was different had a part to play in it. Although idk if with a cheap generic CRT or TV this would apply. Like not many people would've had this Sony monitor I don't think.
Depends how you connected. A lot of people will have used composite or RF which look terrible. Most probably didn’t have VGA adapter for Dreamcast which was a huge leap.
CRT monitors with speakers, component inputs and VGA were rare and extremely pricey. Most consumers had 480i CRT TVs, which looked nothing like a 2048x1536 36" NEC monitor. Nor did they pay $20,000 USD.
@@Blox117 This guy knows. It really depends what monitor you're on, because a high-end monitor now wipes the floor with CRT, or even just 1440p75, 1080p144, etc. I played Fallout 2 on a normal civilian monitor and it looked bad, but we dealt with it because it's what we had, and that's one of the biggest reasons why "better graphics" became the whole focus of the industry from the early 90s through the 2000s until the stagnation period of the 2010s. Even not knowing better, you could tell playing Thief how bad it was, just based on what impressed us back then (he shot a bottle and it shattered! wow! so lifelike!) and how low our real expectations were.

I couldn't even imagine being a kid shown today's technology, even just the numbers: consumer-grade 16-core, hyperthreaded, 4.9 GHz chips with the northbridge on the die, 32 GB of RAM, terabytes of storage, microSD, graphics accelerator cards with a hundred times the VRAM that are many times faster than CPUs, let alone being shown today's games. I'd be so blown away that I've honest to God sat here and thought, "if I could tempt you as a devil, young me, what's the most I could've gotten out of you by offering this system in exchange for sacrificing something else, even part of your future?" and realized I'd get a hell of a lot out of old me. That's practically "die at age 30 in a car crash, no family or friends, and fall to Hell" tier to a late-90s kid. And it's not just the graphics, which 1999 me would literally not be able to distinguish from photographs and live-action acting (iirc motion capture had just started around that year), but everything down to the mechanics, even though most genres had already been developed by then. This includes things like destructible environments, combined-effect mechanics like oil + fire in D:OS2, getting thrown from vehicles, real damage to cars, falling from planes in GTA V, and all that is mid-2010s.
One MAJOR drawback of CRTs is the power consumption. A modern 24" panel is going to take between 40-50 watts, while the FW-900 sucks down 170 watts. That's possibly 4 times more power.
I recently played Silent Hill 3 for the PS2 on a Philips CRT TV manufactured in November 2005. It has component input and I still have the official Sony PS2 component cables. Even though it is a 480i game, it is still to this day absolutely one of the most gorgeous gaming experiences I've ever had.
You could probably make a DP to VGA adapter with FreeSync using a $1000 FPGA that supports 240 Hz etc.; most GPUs only came with slowish VGA converters (400 MHz), which sets some hard limits on your output. My first monitor was a 1600x1200 CRT and it definitely pushed my X800 XL hard :P
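As a back-of-envelope check on why a 400 MHz DAC limits output: pixel clock must cover visible pixels plus blanking overhead. A sketch of my own; the ~30% combined blanking overhead is an assumption (real CVT/GTF timings vary by mode):

```python
# Sketch (assumptions mine): a VGA DAC's pixel clock caps resolution and
# refresh, since pixel_clock >= total_pixels_per_frame * refresh, where
# the frame total includes blanking. The 1.30 overhead factor is a rough
# stand-in for real CVT/GTF blanking intervals.
def max_refresh_hz(pixel_clock_mhz: float, width: int, height: int,
                   blanking_overhead: float = 1.30) -> float:
    total_pixels = width * height * blanking_overhead
    return pixel_clock_mhz * 1e6 / total_pixels

# A 400 MHz RAMDAC driving the FW-900's 2304x1440 mode:
print(f"~{max_refresh_hz(400, 2304, 1440):.0f} Hz max")
```

Which lands in the low 90s Hz: plenty for that resolution, but nowhere near 240 Hz at high resolutions, hence the appeal of a faster FPGA-based converter.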
And people called me crazy when I got my 4:3 HD 32" Sony Trinitron CRT TV in 2006. It had the best of both worlds. It did 1080i or p, I really can't remember, but it also did console gaming like no other. I retired it when I moved and gave it to my cousin; he still has it, and his kids play retro games on it to this day. Damn thing weighed about 250, maybe even 300 lbs.
It did 1080i most likely. People gave a lot of crap to 1080i but that's only because they were viewing it on the wrong display. 1080i on a CRT whether it be a VGA monitor or a HD CRT looked amazing
@@superslash7254 Somehow the higher resolution felt like it was murdering my eyeballs, and that was when I was in my 20's. Nowadays I run 4K on a 28" screen and have no issues with eye fatigue.
The reason for the long start ups and being dim - this tube just needs a shot with a CRT rejuvenator to bring the guns back into spec and then it will be good for another 30 years. CRT rejuvenation is commonly done in the arcade community and usually you can bring a good tube back from the dead a few times.
I had one of these. It was the last CRT I bought. It was a great monitor, and I was able to set it to a custom resolution of 2304x1440 in the nVidia driver, making it the highest resolution of any monitor available at the time (unless you count things like the unobtainable IBM T221 LCD monitor). I used to play Guild Wars on it in that resolution. Eventually it started to develop issues, though, as CRTs can be finicky. I had to ship it to Sony's place in Southern California for in-warranty repairs (the shipping cost a ton), and that started a nightmare of *terrible* customer service that lasted months. In any case, the best way to run this monitor for the clearest picture was from the VGA out on the graphics card to the 5-BNC inputs on the back. It made a noticeable improvement in picture clarity over VGA to VGA. Hard to believe it's been over ten years now since that all went down. I'm glad we eventually graduated to LCDs, as they cost much less to ship and aren't as prone to develop the same kinds of annoying issues over time. I currently have a Dell UP3218K.
I used to love gaming on my CRT. I had a 21" ultra-sharp one; it seemed huge at the time, and I think the res was 1600x1200. Quake and Red Alert looked immense.
i was laughing so hard when linus says "RGB, one for red one for blue and one for green (RBG not RGB)" and the editor colour coded the words like that XD that was a brilliant touch 🤣🤣
*You can use an HDMI to VGA adapter and push it over 60 Hz*; you just have to make a custom resolution in the Nvidia Control Panel! I am daily driving a CRT (Sony Trinitron Multiscan E400) with an HDMI to VGA adapter, and it's doing 91 Hz (the max my CRT supports at 1920x1440).
Man this was the best video. I have a long love of CRTs and used to run my 19" at 1024x768 at 100Hz just to read books, then flip it back to 1280x1024 @ 85 Hz for gaming. I loved that thing and I am so, so envious of this Sony LTT bought.
@@lulkLogan While I appreciate your comment, I need to point out these were CRTs, so that aspect ratio had a much smaller impact, just as Linus pointed out in the video. The games I played at that time all supported 4:3 or 5:4 without issue.
@11:00 I had an extremely similar experience recently. I dug my Xbox 360 out of my parents' place and hooked it up to my 4K monitor, and it looked horrible. But once I hooked it up to a 1080p plasma display, something that would have been accurate to the time period, I literally can't tell it's 1080p; it looks so good. It's weird how we forget that and say the graphics have aged and only looked good relative to other graphics of the era, when in reality display technology has changed and isn't capable of displaying how good things used to look.
A bonus feature of the CRT displays is you could use a lightgun, which was absolute black magic to me as a kid. I kinda miss that era of couch multiplayer and renting gimmick controllers to play arcade games in the living room.
@@NoorquackerInd Really simple. Display a frame with black. Display a frame with white at the target. If it detects that it saw light, you got a hit. Alternatively, paint the frame black. Then paint the frame white, and measure how long it took from the start of the white frame to the detection of light by the gun. Because the computer is controlling the electron gun with the horizontal and vertical blanking signals it generates, it knows exactly what part of the screen the CRT is drawing, down to the microsecond. When the gun detects light, the computer knows exactly where it was pointed based on what part of the frame was being drawn.
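That timing trick can be sketched in a few lines. This is an illustration of my own, not any console's actual code; the line period, raster size, and the simplification of ignoring blanking are all assumptions:

```python
# Illustration (assumptions mine): because the CRT beam position is a pure
# function of time since vsync, a console can map "photodiode saw light at
# time t" back to screen coordinates. NTSC-ish ~63.6 us per scanline is
# assumed; the active portion is simplified to the whole line and vertical
# blanking is ignored, for clarity.
def beam_position(t_us: float, line_period_us: float = 63.6,
                  visible_lines: int = 240, width: int = 320) -> tuple:
    """Map microseconds since vsync to (x, y) in a 320x240 raster."""
    line = int(t_us // line_period_us)          # how many full lines drawn
    frac = (t_us % line_period_us) / line_period_us  # progress along the line
    return int(frac * width), min(line, visible_lines - 1)

# Light detected ~7640 us after vsync lands about 120 lines down the screen.
x, y = beam_position(7640.0)
```

The real hardware refinement is latching the horizontal and vertical counters in the video chip at the instant the photodiode fires, which is the same idea with no software timing jitter.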
My first job in 2000 was doing software development, and each workstation had two 21” Sony Trinitron monitors. When upgrades happened, everyone got to take one of their monitors home. Hauling that bad boy to a LAN party was a workout in itself.
Windows+Shift+Arrow key... holy crap, most useful tech tip ever. My third monitor is in my case, and windows keep popping up on it sometimes and are a pain to grab from there. This is a life saver.
Yeah, found out about it when I had to connect my AV receiver to my PC via HDMI and realized it's not possible to just transfer audio. So I always have a nonexistent 3rd monitor. Close second for me is Windows+Shift+S for making a quick screenshot.
If you use a CRT, Unity Engine is a nightmare, literally the worst engine ever: you need two displays and have to FORCE the game to the second monitor with Win+Shift+Arrow to set the resolution down from 4K or 1440p to something lower (in my case, 16:10 on a 21" NEC 4:3 tube, 19" viewable), or it will display the window out of bounds, and the Unity game's HUD and overlays will be out of frame.
I never thought I'd miss my old monitor as much as I do right this moment. Didn't realize how big of a role it played in the experience of not just old games, but for sure old artwork as well. Imagine how it affected even forum avatars back then, damn. Nostalgia will be the death of me.
I have an old 17" HP CRT monitor and realized when I was young that older games just look much better on CRT than LCD. I have a GameCube and noticed that playing my games on a flat screen looked like absolute crap compared to the cheap CRT TV I had when I was a kid.
My favorite part of this video was learning the Windows Shift Arrowkey shortcut to move windows from one screen to another. As someone who uses a (gasp!) HDMI tv monitor as a second display, this was sometimes an issue if I had the other display disconnected or off for whatever reason. Issue no more!
I LOVED running a CRT at 100 hz back in the day for PC. Great experience. If someone made a new CRT at a reasonable price, I'd love to buy it for the image quality.
They would never be a reasonable price as all the mass production for CRT are mostly gone and even the CRTs that are still made are for very specific uses
I was always torn between wanting a higher resolution and a higher refresh rate, anything below 75Hz made the veins in my eyes pop so I was always forced to lower the resolution even though it looked so much better at higher resolutions. (It wasn't a very good CRT hehe)
15:55 Older folks will recall...there was no such thing as corners on a CRT computer monitor unless you had a trinitron monitor. I remember my first trinitron and the first time I actually had clean 90 degree corners on screen. I never went back from there.
This video treats CRT monitors like ancient artifacts. And that just makes me feel old. And Linus forgot to degauss the monitor, he forgot how satisfying it is.
6:45 The motion clarity struggle was my life since I switched to LCD; it got better with 120Hz at 120fps. I finally got the CRT motion clarity back for 60fps content with single-strobe BFI on OLED. Legit cried that day; I forgot how clear and smooth 2D Sonic really was.
@@bastol85 Have you seen the C1's BFI? I heard the C1's BFI is a step down from the CX's, and the C2 is allegedly even more of a step back, so now I'm thinking, damn, I need to get a barely used CX if I can find one.
@Retro Soul I haven't seen it in person, but yes, I have heard these aren't quite as good as the CX. Check the YouTube channel HDTVTest; really good reviews there.
Like he said in the video, they had been in development for 60 years, so they'd more or less reached their peak when the last one came out, unless someone crazy decides to put millions into R&D hoping to get something better.
@@shivansh7152 In essence, that is what the Trinitron is. Sony wanted to improve on the CRT design to allow cleaner, brighter images, and they poured a lot of money into the aperture grille design. When the FW900 released in 2001, almost all 19"–21" monitors could only do 1600x1200 or 1920x1440. The FW900 could do 160Hz at 1280x960, along with a few of the other 21–22" Trinitron monitors from other manufacturers. I had four, including an IBM and a DiamondTron, and all could hit 160–200Hz refresh rates at lower resolutions. These were the best for competitive Quake and CS play.
My 22-inch NEC MultiSync FE2111SB died 😢😢😢😢 At first the contact kept dropping; give it a tap and it would work again. I went over all the boards with a magnifying glass and resoldered a few spots, but it was useless. Today it stopped starting altogether; the indicator LED just blinks 😢😢😢 And that monitor was top-tier, one of the last of its quality, with a Diamondtron tube, a serious competitor to the Trinitron.
I have an HP p1230 at my parents' home; it's still in use occasionally when I'm there on weekends, with my Thinkstation S20 and some R9 GPU. I gotta say, no matter how good the LCD is, it just isn't the same; it took me years to finally accept LCD monitors. Might as well mention that this single monitor has had a greater lifespan than any LCD can hope to achieve: it was in constant use for 8 years by some graphic designer, eventually got promoted to gaming monitor for my brother, and then found its way onto my desk, where it has been for the last 7 years and will never leave. This monitor has worked without any errors all this time. I found an IBM keyboard a while back, a welcome upgrade over the cheap keyboard I had; maybe I'll even find a ball mouse somewhere to complete the setup :D
When I was a student, someone was throwing a 19” Sony Trinitron monitor away. I took it home and found that the picture had weird colours, but by bending the cable it worked. I put a cable tie around it to maintain the bend, and voilà, a working monitor worth a few hundred pounds! Got me through university and beyond. Flat screens all the way for me now, though. Reclaim the desk space!!!
The interesting thing about tech is that, due to its rate of advancement, it has a much sharper value curve than almost anything else: it's one of the most rapidly depreciating assets on the planet, halving in value and within a few years being worth maybe 15% of what you paid, but the depreciation progressively slows until the value starts going up again. I find it ironic that boomers keep throwing this stuff out, because I'd bet that within the next 15 years, period-correct Genesis+CRT and Pentium II+CRT battlestations are going to be incredibly valuable again, and from then on will only appreciate. Linus' channel shows some of them, and not all of this stuff is valuable because it's old high-end relics with Epic rarity stat bonuses; it's because, with its rapid depreciation and commonality, people foolishly begin throwing it away, like an overused meme, until finally the thing that was once common becomes almost impossible to find, when just ten years ago it littered roadsides. ...Just imagine when this becomes true of food, sand, clean fresh water, table salt, and potassium.
One interesting thing is that flat-screen CRTs were made (they were as flat as LCD monitors), but they never caught on because the market had already moved on to LCD.
I'm so happy to see that these old beasts are getting quite popular again, at least in the enthusiast kind of sphere. I love old tech and I can really relate to Linus' excitement for this thing haha
Oh man, it brought back so many memories hearing Linus describe Aperture Grille technology, and the little horizontal lines that the technology necessitated. I worked at Apple, who used Trinitron tubes for their Studio Displays, in the early 2000s, and fielded a handful of calls about those lines. I explained -- from our training -- pretty much exactly what Linus said. You stop noticing them after a while, but the moment your attention is called to them, they're all you can see.
This was a trip down memory lane! When I was growing up, the family computer had a pair of 21" Sony Trinitron monitors (from Sun, where my father worked). They gave a very good image, as you would imagine, though as they were not quite matched, one didn't support the frequency DOS and the BIOS used, so it wasn't usable until Windows had booted up (which was fine, as it was primarily connected to the Sun workstation my father needed for remote work). Very lucky to have such monitors back then! The weight though...
A CRT monitor for gaming use is a work of art: the much more beautiful, almost 3D picture, the lack of input lag, and the zero pixel response time. All of this together makes a CRT monitor an uncontested choice for games. But the saddest thing that happened after the gaming industry switched to LCD technology was a drop in the quality of the games themselves (!), which, played on LCD monitors with high input lag and long response times, could no longer provide a truly high-quality gaming experience. If CRT technology is not revived in the future, just as vinyl records, tube amplifiers, and planar magnetic headphones have been revived, the entire gaming industry may face a deep crisis.
It's crazy that my SyncMaster 955DF CRT looks like it has HDR even though it doesn't. Color gradients are perfect; it destroys my AMOLED phone on that alone, and highlights still keep their color saturation. My AMOLED phone loses color in the highlights; I noticed this in Tron: Legacy on specular lights like the bikes (my AMOLED is obviously brighter than my CRT). I also compared my SyncMaster and AMOLED phone against WOLEDs and QD-OLEDs: WOLEDs struggle with warm colors, and QD-OLEDs are worse than AMOLED. AMOLEDs are 100% pure RGB, like CRTs. QD-OLEDs aren't 100% pure RGB; they use blue emitters that are converted to other colors through quantum dots, not one dedicated emitter per color.
Underfunded public library? More like the "as seen in the series Ranczo (as the old school building), well known" library in Jeruzal. Greetings to all my Polish friends!
I installed a lot of Sun, HP, and other workstations back in the '90s; they all used Trinitron displays, and Philips made them too. With Sun, the display was black text on white, and EVERY new install mentioned the lines.
Yeah, but you plug it into the wall, so who cares. Of all the things that get brought up, I really don't care about energy efficiency, not because I don't pay my own bills (obviously I do), but because it's not a big enough difference to matter at the moment. I care more about the energy efficiency of internal components that become a pain to cool, which in turn can drive up the bill by needing the AC on more in the summer, and because added power draw on the GPU or CPU drives up overall system cost by needing beefier PSUs, expensive AIOs, etc. But yeah, you're right about that, although in that sense I wonder how many people unplug everything to save power (plugged-in appliances slowly leech power even when off).
I had a pair of 22" Diamondtron (Mitsubishi's version of Trinitron) CRTs. Wow, those things were heavy, but the contrast and color accuracy were amazing. I'm a graphics professional so switching over to LCD wasn't an option until IPS became reasonably affordable. TN panels at the time were just terrible in comparison to those old CRTs.
Always liked CRTs for the ability to run "non-native" resolutions without distortion and the lack of ghosting and such, but when LCD came out I switched. CRTs actually give me headaches; they tend to produce a high-frequency pitch on top of the flicker from the scanning. So trading daily headaches for a more pixelated fixed resolution was unfortunately worth it. Oh, also a lot less glare with a matte LCD.
There shouldn't be any noticeable high pitch sound with PC CRTs, as the noise is 30khz which is well beyond human hearing. CRT TVs tho, they're definitely annoying with their 15khz whine.
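The frequencies quoted above fall out of simple arithmetic: the whine comes from the horizontal deflection circuitry, so its pitch equals the horizontal scan rate, i.e. scanlines drawn per second. A quick sketch with typical line counts (the specific modes below are illustrative assumptions):

```python
# Horizontal scan frequency = total lines per frame x frames per second.
# This is the pitch of a CRT's "whine"; line counts are typical values.

def h_scan_khz(v_total_lines, refresh_hz):
    """Horizontal scan frequency in kHz."""
    return v_total_lines * refresh_hz / 1000

tv = h_scan_khz(262.5, 60)   # NTSC TV field: ~15.75 kHz, within human hearing
pc = h_scan_khz(525, 60)     # 640x480@60 VGA: ~31.5 kHz, beyond human hearing
```

Higher-resolution PC modes scan even faster (well into the tens of kHz), which is why a healthy PC CRT is silent to human ears while a TV's ~15.7 kHz line rate sits right at the edge of hearing, especially for younger listeners.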
@@-Keen- I started up an old Sony Trinitron CRT some years ago, much like the one in the video. No high-frequency pitch when that thing was new, but boy does it have it now. Had to turn it off after a minute of use.
@@DroidEater Damn, maybe I just got lucky. I have one from 1998 that's still chugging along silently. Still looks almost as good as OLED too, it's crazy.