*Sigh* I found a P1130 in a recycling dumpster at a PC repair place (after asking for permission to check what was out there). A year or two later, I ended up dumping the thing to "make more room on my desk" and "not turn my room into a furnace" (I had a three-CRT AMD Eyefinity array going). That was just back in 2015 or so. Such. Regret. Back then, I was just enamored with the black levels and near-zero input lag of CRTs - I wasn't even using them in a way that allowed for good motion clarity (my FPS never matched my refresh rate). The things I'd do for another one of these... Motion clarity beats everything, IMO, and every time I think about looking into "better" computer monitors, I'm reminded that I have a curved VA monitor I don't bother using at all (my wife uses it for WFH sometimes), and that I'd just be angry about the blur in everything I play. CRT and VR headsets, all the way, forever (except when playing/watching with other people - then the DLP projector suffices. I bought it thinking it'd be clear too, but I have a double-image issue, and I have no idea what can be done about that, or if it's just inherent to DLP/this projector).
Huh, the thing you said about CCFL is quite a surprise to me. I have an NEC monitor from 2006 with a CCFL backlight and it looks amazing. After I bought a new gaming monitor, the only differences I see are pixel response time and refresh rate, ofc. This NEC monitor has really good colors, I think it can get above 100% sRGB, and its max brightness is 470 nits. I compared it a few times to other monitors and its brightness is just on another level. But idk, looks like this monitor is just really good for its time :D
Recently I mistakenly bought this monitor, sadly without seeing this review first. I may just return to 1080p 144Hz and use this as a secondary. I noticed in your video that the image fills all the way to the very edge of the screen. Is this normal?
This is a novel feature, but it has too many drawbacks. The added latency is a deal-breaker on top of all the other downsides, as opposed to simply using an HDMI-to-VGA adapter.
Can you create a program that intercepts mouse inputs and adds latency on one side, so you can benchmark yourself in-game, like in the firing range of Apex? I feel like it's so hard to guess latency when you don't have a reference.
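For anyone curious, here's a rough sketch of how such a tool could work on Windows, using the pynput library - the 30 ms delay and the whole pynput-based approach are just my assumptions, not a finished tool. It suppresses real mouse movement with a low-level hook and re-injects it as a relative move after a fixed delay:

```python
# Rough sketch of a mouse latency injector for Windows (pip install pynput).
# Real motion is suppressed by a low-level hook, then re-injected after
# ADDED_DELAY. Fast flicks during the delay window can interleave with our
# injected moves and skew the reconstructed deltas; a serious tool would
# filter at the driver level instead.
import queue
import threading
import time

from pynput import mouse

ADDED_DELAY = 0.030      # 30 ms of artificial latency (assumption, tune freely)
LLMHF_INJECTED = 0x01    # flag set on events injected via SendInput
WM_MOUSEMOVE = 0x0200

controller = mouse.Controller()
pending = queue.Queue()
last_pt = None

def replayer():
    # Replay each suppressed movement once its due time arrives.
    while True:
        due, dx, dy = pending.get()
        wait = due - time.monotonic()
        if wait > 0:
            time.sleep(wait)
        controller.move(dx, dy)  # re-inject as a relative move

def event_filter(msg, data):
    # Pass through our own injected events, or we'd delay them twice.
    global last_pt
    if msg != WM_MOUSEMOVE or data.flags & LLMHF_INJECTED:
        return True
    if last_pt is not None:
        pending.put((time.monotonic() + ADDED_DELAY,
                     data.pt.x - last_pt[0], data.pt.y - last_pt[1]))
    last_pt = (data.pt.x, data.pt.y)
    listener.suppress_event()  # swallow the real event

threading.Thread(target=replayer, daemon=True).start()
listener = mouse.Listener(win32_event_filter=event_filter)
with listener:
    listener.join()
```

Whether a given game accepts SendInput-injected motion is another question entirely (anti-cheat may reject it), so treat this as a desktop experiment first.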
I had a 17-inch Samsung CRT and I remember playing at 1024*700 at 80Hz or something like that... Doom 3, Far Cry, Half-Life 2... amazing times. In 2006 I got my first 5:4 19-inch BenQ LCD monitor, and I loved it. Never had any problems with gaming, loved the brightness, the text clarity, and the massive space I had on my desk.
Is there a reason to be using a GPU like the 980 or Titan X when doing the passthrough? Or can you actually get away with something much older and perhaps passively cooled?
I have 3 CRT monitors... nothing comes close to their motion clarity, nothing... but I'm not using them to play anymore; flickering is a problem for me, I guess I got too used to LCDs... I kinda love LCDs too.
Hey, man, I know it's been a long time, but your videos have been highly influential in helping me figure out the major I want to get into. I know you don't upload as often nowadays, but I'll definitely give the game a try despite being poor at these types of games now. I do hope you're doing well in whatever ventures you're in, even if you aren't going to do more monitor reviews due to varying circumstances, since I understand it's expensive.
@@ApertureGrille Computer engineering. My uni lets me specialize a bit into digital image processing, though I admit I'm still a slow learner.
@@elisebright I recently did a lot of work in Python, which was new to me, and I found it incredibly intuitive and powerful. Lots of great libraries for basically anything you'd want to do, even image processing. But it's definitely something I wish I'd learned in a more structured way in college. I think you'll do fine! And thank you for giving the game a try! It's actually now free on Steam, but it's a highly niche style made for Quake nerds, so no worries if it's not your thing.
So should I be using the "normal" color temp setting for the monitor's color temperature? Also, how can I calibrate to see better in the dark scenes of, for example, anime? Is the default just good enough? Some scenes seem a bit darker than I'd like... Another thing: how can I accurately and correctly calibrate the gamma as you've done in this video with those curves, as opposed to what Asus defaults to as you showed?
Do you think you could make an updated version of this video, with a newer gaming test rig with either an RTX 4090 or an RX 7900 XT as the render GPU, and testing out older GPUs to decide which ones to recommend for the VGA/DVI-I output?
Bought an EVGA Hybrid 980 Ti WC with 2017 tax money ($800). A year later, added BIOS voltage, GPU/MEM OC, and liquid metal mods. FurMark pulls 420W (in GPU-Z) at a steady 70C; she ran 420W+ for 3 days straight with no issues (FurMark over the weekend while I was gone). She kept ticking with games and video encoding until I sold it in 2023. The 980 Ti is a king in my books.
Nvidia sucks at interlacing. You can do it, but it's a process to get it working, and you have to repeat that process every time a fullscreen application shows up and every time you boot up Windows. It's a temporary solution that becomes a pain to deal with on the daily. AMD is just plug and play: create the resolution, then set it, and it always works. Even when you restart your PC, it boots up into your interlaced resolution no problem.
I tested it. My conclusion: it can work, but make sure you have a ton of PCIe lanes/bandwidth to work with, because I tried this with a PCIe x1 riser and it basically kneecapped my FPS - I'm getting around 20 FPS at 1600x1200. I'm going to install Win11 and test to see if my results differ.
I tried it on Win11, and Win11 is a lot easier to get working because you can select the rendering GPU, unlike in Win10. I tried with Win10 and it's not worth doing this there; it's too much work, because you have to have one monitor connected to the analog GPU and one to the rendering GPU, then you need to start the game on the rendering GPU's monitor and move it to the analog display. In Win11 you can select the rendering GPU and don't have to go through the long process of having two monitors connected to two GPUs and moving things around.
Sorry for my possible ignorance, but I use a DisplayPort-to-DVI connector for one of my old LCD displays and it works just fine - why not just use that? Or maybe DisplayPort to VGA, or DP to DVI to VGA in a cursed chain of adapters? :D
I currently use a DisplayPort-to-VGA adapter and it works for the most part, but I can't get the most out of my monitor because the adapter has a max pixel clock of 165 MHz. I'm currently running it at 2160x1620i at 68Hz, but my monitor can actually run that exact resolution at 110Hz - 110Hz requires a pixel clock of 266 MHz, while a direct connection to a VGA GPU can give you around 300 MHz to work with. And finding a high-quality adapter is really hard or impossible in some countries; I live in a third-world country, and it would literally be cheaper to get a second GPU than to import one. I'm not sure about interlacing, but I think using a second GPU can let you use it, since Nvidia stopped supporting interlacing after the 10-series and AMD after the RX 500 series. Interlacing is a game changer because you can run high resolutions and framerates at the same time instead of trading one for the other.
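For anyone who wants to sanity-check their own modes: the pixel clock is roughly total pixels per frame (active plus blanking) times refresh rate. Here's a quick back-of-the-envelope calculator - the ~30% horizontal and ~5% vertical blanking fractions are rough GTF-style guesses, not this monitor's exact timings:

```python
# Rough pixel clock estimate: htotal x vtotal x refresh.
# Blanking fractions are ballpark GTF-style figures, not exact timings.
def pixel_clock_mhz(h_active, v_active, refresh_hz, interlaced=False,
                    h_blank=0.30, v_blank=0.05):
    htotal = h_active * (1 + h_blank)
    vtotal = v_active * (1 + v_blank)
    if interlaced:
        vtotal /= 2  # each field scans half the lines
    return htotal * vtotal * refresh_hz / 1e6

print(pixel_clock_mhz(2160, 1620, 68, interlaced=True))   # ~162 MHz: just fits a 165 MHz adapter
print(pixel_clock_mhz(2160, 1620, 110, interlaced=True))  # ~263 MHz: close to the 266 MHz figure above
```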
Technically yes, but I think you should get an AMD analog GPU instead; getting interlaced working on Nvidia sucks and is a long process, and you have to redo that process every time you want to run an interlaced res. Also, you can do this on Windows 10, but you won't be able to select the GPU you want the application to render with, and it's a pain to get an app to use the render GPU. Windows 11 allows you to simply select the render GPU.
@@jskilabe5986 So you recommend I use an AMD analog GPU plus my 1660? Am I right? In that case, what GPU would be a good one? Or could I also use an actual AMD GPU plus the GT 710? Would that work?
Hey, does someone have a hint for me? I am running an iiyama 454 on my GTX 960, and the other card, currently a 4080, should do the rendering. I have 2 other monitors connected to the 4080. I try running BG3, Cyberpunk, or RDR2, but it does not seem to work. I have to choose the graphics driver for each game in the graphics settings, but if I choose the 4080, I can't choose the iiyama as the display :( Maybe someone has a tip or a reference I can try out. Thanks in advance. (Tried switching with Windows+Shift+Arrow.)
@@jskilabe5986 Hey, thanks for the suggestions. I tried doing that, but it had something to do with the other displays: some games just did not want to recognize the iiyama if other monitors were connected to the 4080. I will post my current settings and setup below.
I now have the iiyama, and optionally other monitors, connected solely to the 960, not to the 4080 anymore. I also moved away from always having 3 monitors connected when gaming, or even in general. There is something calming about focusing on one screen; I didn't even realize it beforehand, but multiple connected monitors evoke some kind of pressure. Might be all work-related (I work as a developer). Either way: all games are set to render their graphics with a specific GPU, namely the 4080, and the output is forced through the 960, because everything is connected there. This works like a charm, with some minor issues here and there; sometimes the framerate is capped to a specific value, mainly in Microsoft games. It helps to keep something like the FPS counter from the Game Bar on top to fix that. And it really helped to set a specific resolution and Hz for my monitor with the resolution tool. Don't use the Nvidia scaling tools; always rely on your hardware for that. Questions? Please shoot, I'll try to answer - the comment is already quite long haha
I've had this exact pass-through idea kicking around in my head for weeks, and I FINALLY came across this video to prove to me that not only is it possible, but it exists and works. I'm about to put a high-end CRT into service as a secondary screen, and my adapter is like 95% reliable for what I'm trying to do. This, on the other hand, is EXACTLY what I want to do. Gonna take some work, but I think it'll be worth the effort. Thank you!
Once you get into the 90kHz horizontal scan range you will start to feel the pinch of the adapter, unless you have a decent one. I have a 165 MHz adapter and the pinch is real; at certain resolutions I'm running at like 60% of the refresh rate my monitor can handle, because of the pixel clock limitation.
The motion clarity of a CRT is truly something else. Its persistence is less than 1ms. 240fps@240Hz has a persistence of around 4ms, which is what most plasmas had. I do wonder what the persistence of the 120fps@120Hz backlight-strobing one is.
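If it helps: on a sample-and-hold display the persistence is just the frame period, while with backlight strobing it's the strobe pulse width instead, independent of the refresh rate. A tiny illustration - the 1 ms pulse width is a typical strobing value I'm assuming, not a measured figure for any particular monitor:

```python
# Sample-and-hold persistence = frame period; strobed persistence = pulse width.
def persistence_ms(refresh_hz, strobe_pulse_ms=None):
    return strobe_pulse_ms if strobe_pulse_ms is not None else 1000 / refresh_hz

print(persistence_ms(240))                       # ~4.17 ms sample-and-hold, as noted above
print(persistence_ms(120, strobe_pulse_ms=1.0))  # 1 ms with strobing at 120 Hz
```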
Why do you say you're stuck with 2015 graphics performance forever? Couldn't you do this with a modern RTX 3070 and a 980 Ti? I can't see why that would bottleneck you.
For sure. They've been eliminating features from their GPUs for years, like making it impossible to use interlaced resolutions or output 240p/480i@60Hz - y'know, things that you might want to do with CRT screens - and these are totally artificial limitations imposed by them. It sucks.
So adapters don't have lag?? That's weird, because I do feel like there is some lag with my StarTech DP2VGAHD20 DisplayPort adapter on a CRT at 160Hz while playing osu!