True, my laptop is a 16" 1080p model, but I never use 1080p; I use 900p, because at 1080p everything on screen looks a little small, and while 125% scaling makes it bigger, some games don't support scaling and the FOV gets really weird (especially FPS games).
I remember 1024x768 being the gold standard resolution to achieve. Back on my old Athlon / Radeon 9600 machine, if I could manage to run a new game at that resolution I felt like a god.
@@RandomGaminginHD Nice. I remember getting Crysis for Christmas that year but knew I couldn't run it on my 8500 GT. I still tried, and yeah lol, disappointing results.
@@koreannom I bet! Legendary card, still have mine. Would not be a bad idea to build it out into a retro pc. Some of the older games I play have issues at times on newer hardware/software.
I was just upgrading my monitor and was able to pass my old one on to my daughter; she had been using an old, small TV with a VGA input at this resolution. I took the old monitor to try on my little "tinker" PC with a 3000G, and it really helped a lot of games run so much better, and it makes a nice, cute little low-res gaming setup. I'm gonna keep it out for smaller games for guests and kids and eventually upgrade to a 5600G. Great videos as always, have a good one!
Honestly, if it weren't so hard to get a decent small, low-res monitor (that isn't TN - yuck!), pairing one with a 5600G or 5700G would give really nice results. They still struggle with some stuff at full HD, but at 720p you'd get a smooth experience with tons of games... Those APUs are nearly on par with the 1050 non-Ti shown in the video. :)
Oh man, this resolution was the "1080p" back in the day. I still remember feeling so good about running F.E.A.R. and Half-Life 2 at 60 fps on my 7800 GT in my knock-off Alienware build. And yes, Alienware's case designs were actually highly regarded at one point.
The old ones are considered popular?? Damn, they look pretty hideous. But their latest one (the Aurora) is gorgeous; it fits a space-age sci-fi prop, like something out of the movie "Oblivion". I gotta add that one to my office. (Useless info) I've always had an eye for good design and own plenty of designer items in my collection that I now realize are considered timeless (which is futuristic).
@@mifiamigahna Beyond being the minimum acceptable resolution, 1080p will probably lose its majority among PC gamers over the next several years now that GPU prices are returning to normal. 1440p and 4k monitors will continue to drop in price and newer graphics cards run games at high enough frame rates that you don't lose noticeable performance by upgrading from 1080p.
I still have two Samsung CRTs. They do 85Hz no problem :) They are some of the very last CRTs Samsung made, so they kinda have all the bells and whistles. Playing retro games on them is certainly an experience.
@@denizkilic6022 Unless they're a budget 15" model from ~2000 or earlier. If those do support 1024x768, it's often at 60Hz only; otherwise you gotta drop to VGA/SVGA resolutions.
I had a DiamondPro which went to, I think, 1600x1200 at 85Hz and 1024x768 at 130Hz, that sort of range. But on lower-end models there'd be compromises, of course.
I once got a decommissioned video projector from my former school for free. That thing had a 1024x768 resolution and I had a lot of fun playing games on it with my HD 3200 toaster laptop. Eventually, though, the xenon lamp wouldn't ignite anymore, so I disassembled it to learn how it works. It has a DLP micromirror chip in it: a silicon chip with 1024x768 individually tiltable microscopic mirrors. It's quite mind-blowing.
@@RetroArcadeGuy Yep, but replacement lamps are quite expensive, as they contain platinum-coated electrodes, and they're hard to find since that beamer was from 2002. Also, that beamer was as loud as a vacuum cleaner, since it had to cool the lamp with a strong fan. There are much more energy-efficient and inexpensive LED beamers today, so I thought I'd disassemble the old one to learn how the technology works :)
I have absolutely no problem with playing games in the 4:3 ratio. Playing games in this resolution is basically 720p, but I prefer 1280x1024 monitors because that's basically 1080p
Just curious, but what about 1366x768 monitors? Are they 1080p as well? Also, how do we determine whether a monitor is 720p or 1080p or something else?
800x600 is still quite well supported in some modern games, though if you are planning on playing at a widescreen resolution I would recommend adding 1024x576 or 960x540 as a custom one. This process could also let you (if really desperate) go as low as 480x270, though that's only ideal for games which don't rely too much on draw-distance clarity.
And in some of the games where it isn't supported, you can add the desired resolution by messing with the configuration files (most Unreal Engine games accept that).
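For anyone curious what that config edit looks like, here's a rough sketch. The section name and `ResolutionSizeX`/`ResolutionSizeY` keys follow the usual Unreal Engine `GameUserSettings.ini` layout, but the exact file path and key names vary per game, so treat everything below as a placeholder and back the file up first:

```python
# Hedged sketch: force a custom resolution in a UE-style GameUserSettings.ini.
# File location and key names are assumptions -- check your specific game
# (often under %LOCALAPPDATA%\<GameName>\Saved\Config\ on Windows).
import configparser

INI = "GameUserSettings.ini"  # placeholder path

cfg = configparser.ConfigParser(strict=False)  # UE inis may repeat keys
cfg.optionxform = str  # keep key capitalization as UE expects it

cfg.read(INI)
section = "/Script/Engine.GameUserSettings"
if not cfg.has_section(section):
    cfg.add_section(section)
cfg[section]["ResolutionSizeX"] = "1024"
cfg[section]["ResolutionSizeY"] = "576"

with open(INI, "w") as f:
    cfg.write(f)
```

Some games re-validate the resolution on launch and overwrite it, in which case marking the file read-only after editing is the usual workaround.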
The only problem with low resolutions is getting some games to work. Wolfenstein: The New Order, for example, was a nightmare to make run at 852x480: it doesn't appear in the game menu, it only works in borderless fullscreen, and navigating Windows at anything lower than 1024x768 is a nightmare.
This is why I can't play DQXI properly, playing at the appropriate res and settings for me makes it stutter like nobody's business, lowering the res makes text blurry to the point it hurts my eyes to look at.
@@aweigh1010 Not true. When I was a kid, photorealistic graphics mattered as much as the gameplay. I still remember firing up the IGI demo and being blown away by how real it looked, back in 2001 or something. But I guess that applies mainly to kids like me who grew up with a NES, wishing my video games looked more immersive. These days everything looks respectably realistic, so that whole searching-for-realism angle is out the window.
A lot of 1024x768 monitors also have a 75hz mode, which means good frame pacing at a 25fps cap if your PC can't hit a stable 30 (plus extra smoothness if you can hit 75fps). Same for 1280x1024 monitors, which also have the bonus of being able to display 720p and 960p without needing to scale, which means no blur.
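The frame-pacing point above comes down to divisibility: a cap gives even pacing when each frame is displayed for a whole number of refreshes. A quick sketch of that check (the function name is just for illustration):

```python
# List the fps caps that divide evenly into a refresh rate: at those caps,
# every frame is held for a whole number of refreshes, so pacing is consistent.
def even_caps(refresh_hz):
    return [refresh_hz // n for n in range(1, refresh_hz + 1)
            if refresh_hz % n == 0]

print(even_caps(75))  # [75, 25, 15, 5, 3, 1] -- 25fps = 3 refreshes per frame
print(even_caps(60))  # 60, 30, 20, 15, 12, 10, ... -- note: no even 25fps cap
```

This is why a 25fps cap judders on a 60Hz panel (frames alternate between 2 and 3 refreshes) but looks steady at 75Hz.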
I also had an AMD A4-3300 with the 6150D APU. I used that computer from 2013 to 2018. One day I searched for a benchmark of it and it took me to one of your videos benchmarking the A4-3300. Since then, I've never stopped watching your videos, and you've even become well known. You taught me a lot about computers, a lot. Now I game on a 2010 Xeon, the X5660, and a GTX 970. I really like your videos and your style of making them; keep it up.
I loved playing Skyrim at a 3:2 aspect ratio and really appreciated the vertical real estate. There is definitely some appeal to playing on closer-to-square displays.
I feel that those of us who grew up in this resolution boom appreciate the nostalgia of low-res gaming, whereas many young'uns growing up with 1080p and 4K gaming feel they need the latest and greatest to enjoy gaming. It's the game itself that makes a game great, NOT the resolution 👍😇🤯
Let them be dumb, we enjoy the good things while they are miserable. I play at 720p and have a blast, and when I get too cold I get into my bed, turn on my CRT and play some Famicom AV, and I couldn't be happier.
I remember when Duke Nukem 3D launched and I played it at 800x600 resolution. It seemed such a high resolution to me, because I was playing games at 640x480 or 320x240 before that.
I still use 900p on my laptop and am grateful that so many games still support the resolution. On a 14" display it really doesn't look bad, and it makes the frame rate perfectly usable.
This is a great demonstration of why CRTs were so cool; you could change the resolution to get better performance without the graphics looking terrible from all the scaling that LCDs do. Unfortunately, there's a bit of a problem I've noticed if you're using a CRT now. Most games don't actually run in a true fullscreen mode, meaning if you change the resolution from 1600 x 1200 to 640 x 480, it doesn't actually change the desktop resolution, and only changes the render resolution and then scales it to your desktop resolution. I noticed this in particular when playing Cyberpunk 2077 a while ago. Maybe they've changed it since then, but I wouldn't hold my breath on that if they haven't.
I've been watching your videos since 2016, I believe, and I remember using every single tip and copying everything to get some of those delicious FPS. Thanks for all the memories, and congratulations on everything you've achieved!
I remember buying my first laptop, with a Celeron in it. Every time I installed a new game, I'd go to settings and choose the lowest settings with the lowest resolution, then adjust and increase things bit by bit from there. Skyrim with a low-res texture pack was awesome, and Dark Souls at 15 fps looked kinda realistic, because my character moved in slow motion, which is how I'd imagine someone fully plated and dual-wielding greatswords would move in real life.
I have a 1440p 165Hz monitor, but I also have a 1024x768 CRT monitor running at 85Hz, and it's wonderful. Control and Prey in the proper 4:3 aspect ratio look glorious.
Never played BFV, but I played the heck out of 1942, and they used the same game engine. While you could play it in a variety of resolutions, the UI looked best at a multiple of 800x600. So I played at that res on my GeForce2 MX and at 1600x1200 on my X800 XL.
I have a high-end gaming rig, but I maintain a "retro" build on a P8H67 with a Windows XP installation and a GTX 970, the last generation of graphics cards that still support native VGA output, which I use with a 19" 2048x1536 Samsung 997MB CRT monitor. It is very nice to play games scaling up to native resolution. I played some Cyberpunk at a custom 1400x1050 @ 85Hz resolution and the result was quite pleasant. I also have 1024x768 @ 100Hz and 1920x1440 @ 72Hz set up, IIRC.
I remember playing GTA IV at 800x600 at 25fps and I was happy :) I also managed to run it on a Pentium 4 and a GeForce 7600 GT; it showed 20 fps and didn't render the sea. My friend used to run Crysis on a Pentium 4 and an ATI Radeon with 128 MB of memory... He had to wait five real minutes for a level to load. He couldn't finish the game because the ice levels didn't work at all. Those were the times.
Truth be told, the resolution I started out with for games was 1366x768, on a Lenovo B570 laptop with an Intel Pentium B950, 4GB of DDR3-1333 RAM, and a 320GB HDD on Windows 7. Team Fortress 2 at lowest settings ran at about 30 FPS with drops into the 20s. It wasn't optimal, but my best memories of gaming were on that laptop. Glad to see you do videos like this again, RGiHD. Would be fun to see more of them!
I've been playing Apex Legends at this res for 2 years now with my GTX 750 Ti. I even overclocked my monitor's refresh rate from 60Hz to 85Hz, and the game feels so much smoother.
I think I'd prefer 1280x720, unless on a CRT. Which makes me think the 1050 is still decent enough if that's what you game on, and CRTs are still great for gaming.
@@youtubeshadowbannedme I play on a 1050 Ti 4GB. I can launch every game, and every game is playable (40-50fps) on low or medium settings. I fly in X-Plane 11 with the 1050 Ti and get around 30fps in cities and 40-50fps elsewhere (30-50fps is very good in flight sims lol). I can even play VR with zero problems. Edit: I always play at 1920x1080. My headset is a Quest 2.
@@youtubeshadowbannedme They are all valid. Just tweak the settings to low or medium. I haven't seen any complaints so far, unless you care about 100+ fps.
Good times; this was pretty nostalgic. I had a 4:3 768p monitor until 2012, when I finally upgraded to a 1080p IPS screen, which was the bleeding edge back then. Nowadays I still keep one around, and also a CRT for retro games/emulators. The CRT is vastly superior, though, and since they're resolution-agnostic you can play at any res and everything will look sharp, even 480p.
I play on a 19" CRT monitor and it's great. They can be tuned to higher refresh rates at lower resolutions, so mine can do 105Hz at 768p or, on the extreme low end, 132Hz at 800x600, which I've used for many hours of rhythm games. Here in the US I've picked up several monitors for cheap or free (and sold most of them on after having my fun); you just have to be patient and watch local selling sites. I even played a tiny bit of Cyberpunk at 768p on the 14" Hot Wheels monitor, and it looked great with my somewhat slow Vega iGPU.
I don't understand how you're so surprised most games support it; they're supposed to run at whatever resolution the OS supports. Lower resolutions such as 600p and 480p probably aren't supported because the developers think they wouldn't look good, even though they would run fine. I hope they don't stop supporting it.
This is actually a brilliant idea; the viewer who told you about this and inspired you is a smart cookie. Many people seem to forget that pixels are actual physical objects, so games running at a monitor's native resolution will always look better than upscaling a smaller one or supersampling a bigger one. This could help those who are just starting out with PC gaming and can only afford cards like a GT 1030 get the best look and frame rates out of their games.
Note: people may think it looks pixelated and compressed, but it won't look like that on a native 1024x768 monitor; it will look as clean as 1080p does on a 1080p monitor, so it's fine. On a CRT it will look even cleaner, because there are no square pixels; it will be a bit blurry, but that's barely noticeable, and the fun part is it won't look any different from 1080p if you compare them side by side.
It absolutely will not look as clean as 1080p on a 1080p monitor. It will look better than this online example, but you will not see the same clarity with 38% as many pixels. That's just physically not possible. If it was possible they never would have gone past 480p or 600p because it would have been "no different". Even with it being a CRT, it won't be as clean and clear. As you said yourself, it will be blurry. And that's assuming you go spend $100+ on a good condition CRT.
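The "38% as many pixels" figure above is just raw pixel-count arithmetic, which is also where the performance gain at 1024x768 comes from:

```python
# Pixel counts behind the "38%" claim: 1024x768 renders just over a third
# of the pixels of 1920x1080 -- less work per frame, but also less detail.
xga = 1024 * 768    # 786,432 pixels
fhd = 1920 * 1080   # 2,073,600 pixels

print(xga, fhd)
print(f"{xga / fhd:.0%}")  # -> 38%
```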
@@brandonjohnson4121 yeah with the GPU shortage it was supposed to be temporary but once I got windows 10 installed and saw the quality and brightness I was hooked. I can't understand why they stopped making them all over the world. Not a single new one on sale ever again. ☹️
Growing up in a developing household in a developing country, I started out my first PC with a GT640. I remember looking at the specifications for Middle Earth: Shadow of Mordor and thinking "wow, I'm never gonna be able to play that". But play it I did, scaling it all the way down to 1024x768. The constant 50+ fps on all low on that trash bucket of a PC was more than worth the abysmal image quality. I remember trying to play Rainbow Six Siege and absolutely shitting out 2-4 fps. I then built my second PC, this time with a 1060, and even still, certain games, if they offered the choice to play at lower resolutions, even if I have a 32 inch monitor, I'd play on it then. Siege comes to mind first in that regard, I still played with 800x600. I no longer play Siege, and I now have a 3080 which, frankly, doesn't require me to scale anything down anymore. But having that option, having to grow up with those standards, it really helped a lot.
Duuuuuuudeeeeeeee, A4-4000 here!!!!! The highest I went on that one was Rise of the Tomb Raider (10-15 fps) and The Witcher 3 (15-20 fps). I used that thing like hell for more than 7 years, and it's still got plenty of life left in it! I'm way more proud of it than my current 2070 Super / 3800X rig! The day it dies I'll make a showpiece out of it, decorated with gold or something! So many happy memories with it! Btw, the resolution was also the same as yours.
I would recommend buying a CRT like the P1130, LaCie 22 Blue IV and so on. It is a huge image-quality booster at low resolutions; games just look amazing! I played Doom 3 with my old GTX 460 and a Samsung 1100DF with the resolution tweaked to achieve a 104Hz refresh rate, and it was fire.
I remember playing PvZ Garden Warfare 1 at 1024 with a 1GB-VRAM Radeon, and it ran at 50-90 fps. It was amazing; many hours of play. The CPU was an E8500 at 3.8GHz.
Reminds me of when I got my 360. I used an old CRT monitor and played @ 1280x1024 and all I had to do was buy the VGA cable for 50 USD. A few years later I acquired another CRT monitor someone was throwing out and it actually supported 1080p. I also bought myself a component to VGA converter for my PS3 and Wii. I didn't have to buy a legit HD monitor for years.
My iMac G3 rebuild with a 750 Ti in it has a 1024x768 screen, and it doesn't look too bad. It means a lot of older games will run maxed out. I'd like to put a 1650 in it and see what it can do with newer games.
@@9852323 I did this: pulled out the CRT and hot-glued in an old Dell 15" 4:3 display. Removed the motherboard and stripped the internals down to nothing but the metal plate. Mounted a new motherboard (I put mine on its side with L-brackets) and a slimline PSU at the back. Single-fan 750 Ti, then mounted a case fan pulling air up out of the holes at the top. 8GB RAM, 2 SSDs, wireless keyboard Bluetooth receiver and WiFi dongle. Then I replaced the old speakers with larger ones that I just stuck in on either side behind the holes. Wired in a new power button, and that was it done.
I had to do this last year when my monitor broke down, playing RDR 2 on that square monitor from my childhood was really something else. It was like my younger version got games from the future, and was marveling at how far we've come!
I can tell you guys a hack. If you have something like an RX 560 4GB and you're playing at 1080p and the performance is terrible, invest in a good 720p monitor; most games from the past few years run really well at 720p on an RX 560 4GB.
Hey man, thanks for making this video. It was really interesting. I actually do game at 1024x768. My main monitor is a crt monitor running 1024x768 at 120 hz 😊
I grew up in Venezuela; compared to the UK, we are rather behind on a technological level. In terms of leisure and entertainment, spending money on a gaming computer was out of the question, so I was limited to a pirated version of Counter-Strike for a very long time... My monitor was a 1024x768 HP office monitor. Nowadays I live here in the UK and have an MSI gaming monitor, but I still can't play CS on any monitor other than a 1024x768 one. There's a nostalgia to that aspect ratio for me that I can't quite describe. Good video; took me back.
I quite enjoyed playing games at 1024x768. When Crysis first came out I had an 8800 GT, which was a competitive card during that period; however, Crysis performance was not particularly good at higher resolutions. For me, though, I had an old CRT monitor which maxed out at 1024x768, so I was able to nearly max out the graphics while still getting playable frame rates.
Jesus, I'm using a 1050 right now I didn't know I could get such great performance by switch to the monitor that's been sitting in my garage for 15 years, thanks!
Say what you want, but playing games at this resolution actually saved me from being stuck with the same old games back in the day. I had this beautiful 1600x1200 monitor, so playing at 800x600 meant fullscreen, no problem, but... halved. Worked like a charm for so many years.
As a 720p streamer, I find playing at 1024x768 windowed comfortable, as it gives me space to check anything I need, such as the chat, stream info and so on, all without the need for a second monitor.
Steve, I bought a Lenovo laptop about 6 years ago for gaming. Now, gaming was something it was not built for, but the discrete-GPU thing got me hooked. I still play GTA V at 1024x768 at the lowest settings, because that's the only way I get more than 30 FPS. I never had regrets until I saw the game at 1080p.
Not just for gaming: by getting a lower-res monitor (or setting our monitor to a lower res), we can enjoy YouTube videos with better clarity when playing them at 720p or below. Perfect for people with lower-speed internet. Personally, I feel like owning two monitors would be great: one at 1600x900 or lower, another at QHD or 4K.
I gave my youngest son my old 750ti and my old 1024x768 monitor. He plays fortnite at a solid 60fps and loves it. I'm impressed with how great the 750ti still is.
I was looking specifically for a 1280×1024 SXGA monitor in order to play one of my favs (HoMM IV) with some nostalgia, so when a good friend of mine said he had one, I was over the moon grateful, despite already having Full HD (free from the trash as well) and knowing all of the patches the community made for that game. So, if nothing else, nostalgia alone can easily justify "downgrades" like that!
I remember playing Fallout: New Vegas on my AMD A3 33-something at this resolution, at like 30 to 40 FPS. I can't believe I completed it with the DLCs like that.
I just got a 3060 Ti and still game on a 19" Samsung CRT at 1400x1050 @ 78Hz. It is pretty glorious lol. The motion clarity is butter smooth, and it looks great to me 😁 Sometimes I will do 1600x1200 @ 60Hz, but I prefer the flicker reduction of the higher refresh rate.
The reason why I watch your channel is because you are always in touch with reality. So many PC tech tubers only prefer the most expensive products and want nothing to do with low-end hardware.
I'm using an EPSON EB-E01 1024x768 projector to play older games and emulators. I found myself using it for everything gaming-related with a 1050 Ti, amazed at how much better newer games run at this resolution. Can't beat a 100-inch display on the wall, either. I think I made a video about that last year. Still running perfectly, and it's on its second lamp atm.
As someone who has an Acer Aspire with an i5-4440 and Intel HD Graphics 4600, the Dell CRT monitor from 1998 that I picked up for 5 bucks today really helps my performance. The big advantage is the refresh rate; as a rhythm gamer, that really matters. I got StepMania running at 400fps at 1024x768. I have never seen anything run that smooth on this computer.
2:47 This video made me realize that the "PS2 aesthetic" I love so much is more tied to the resolution than I expected. Elden Ring and Cyberpunk feel like modern PS2 titles; it's bizarre.
I recently picked up a native 1366×768 monitor out of curiosity. At the same time I was making use of some old parts and threw together an i5-4570 and RX 560 4GB machine with 16GB of RAM. I had initially tested it on a 1080p monitor and it really struggled, but when I hooked it up to the 1366×768 monitor I was pleasantly surprised how well it performed. All of a sudden it could handle the majority of games thrown at it. The funny part was that it really didn't look that bad, and it was really quite satisfying to play on.
What do you mean by aging graphics cards? I feel like this is a little bit misleading because most people may assume that this means their graphics cards will perform worse over time which is not true. As long as thermals are okay, your 10 year old graphics card should perform as well as it did when you bought it brand new (excluding factors like newer games and unstable drivers)
Nostalgia! I remember when I was stuck on an antique 640x480 monitor that physically could not run 800x600, and games were increasingly requiring this higher res as a minimum. My brother finally found a multisync monitor for me and I loved that thing. Amusingly it lasted exactly as long as I needed it to, as it died while I was offloading data off of my old computer onto the new one that I had just gotten. Many monitors and older LCDs also support 1280x1024, which is actually 720p but taller!
I used this trick back in high school (2015) when I got a PC with a 960 and decided "hmm, 1080p gaming won't be very enjoyable", so I got a 900p monitor. That system still performs brilliantly for normal games, some video editing and daily use to this day.
Oh man when the original Far Cry came out and I upgraded my GPU to play it at 1024x768 it was terrific. This was on a Sony Trinitron CRT that I honestly wish I still had. Fun video. Cheers!
Depends on what your game genre is. I use a 768p 32-inch TV as my main monitor; if you play any FPS on it, it feels like you're just staring at pixels. But for driving games/simulators, which I often play, it's perfectly serviceable.
My experience using old monitors with new hardware: I actually used my old 1440x900 60Hz TV that I bought around 2006 with my previous PC until about 2018, when I bought a 1080p 144Hz monitor. Oh, and I was using a GTX 1080 with an i7-5775C on said 1440x900 monitor.
I used to use 1024x768 natively. I had a 1366x768 monitor (well, a TV), but my PC only output VGA that I could use (it output DisplayPort too, but I didn't have a compatible monitor; it was essentially an Intel 4th-gen office PC), and since I had no money for a GPU, I had to use 1024x768 stretched to 16:9. Then I got a DP-to-HDMI adapter, replaced the old TV with a new one (same native resolution, but a decade more modern, and technically supporting 1080p, albeit with a few issues), and I got a new PC too: a 6th-gen i5 office PC.
Going from an Intel Atom + eMMC (Cherry Trail, to be fair), which was my computer before getting my first desktop, to a Pentium G3420 + SSD was a HUGE performance increase compared to going from that to an i5-6500. To be fair, the i5-6500 PC has an HDD; I plan to swap it out ASAP.
Wow, talk about a blast from the past... I remember back in 2005, I built a computer with an Intel Pentium 4 650 3.4 GHz CPU, 2x1GB DDR2-800 RAM, a 74GB WD Raptor SATA HDD, and an ATI Radeon X850 XT PE 256MB GDDR3 PCI Express 1.0 GPU. Paired with a 4:3 19" ViewSonic CRT (I forget the exact model), I would play games like Half-Life 2 at 1024x768 on ultra/high settings at either a 75 or 90 Hz refresh rate. Those were the days... I ended up giving that computer and monitor to my nephews years later, and I think that CRT is still in use today, since one of them is a retro gamer.
based. I picked up a 1366x768 19"er a few months ago for €10 to help my 1650 if it were to struggle with games in the future and it's great. You still get that 16:9 aspect ratio, half as many pixels as 1080p and also it's wider than my laptop's 15.6" screen
You might be remembering wrong; around 2001-2002, 1024x768 was the high resolution. For most of us peasants with lower-end GPUs, and even with not-so-lowly stuff like a GeForce 6800 GS around 2004/2005, that resolution was pretty expensive and not always necessary, since we could get away with playing at 800x600 with a higher refresh rate on that era's CRTs and cheap, crappy LCDs. I only moved on from 800x600 in 2007, when I got a 17-inch LG LCD that didn't suck. But honestly, 800x600 was pretty sharp and nice on a CRT, and you got the added bonus of a very high refresh rate with no input lag whatsoever. It was a great time.
I just moved to Japan, and my senpai gave me his 1366x768 32-inch Sony TV from 2008. Boy, it's such a huge upgrade from my 13-inch MacBook; I mean, 32 inches is huge! With my 6500 XT eGPU and Virtual Super Resolution set at 1080p, the gaming experience is superb 👌🏻
I have a Samsung LE32B350F1W 32-inch TV at my dad's house which supports 1366x768 at 50Hz (although through HDMI it's 60Hz, not that I'm complaining). For my laptop with a 1060 3GB it's more than perfect, and ironically the image looks amazing when playing, say, AC Odyssey at around 55-ish average FPS on low settings. I'm thinking once I swap the RX 5600 XT in my main rig for something more powerful this year, I'll build my dream emulation + gaming PC and use the TV as the main monitor! It's gonna be glorious!! Also, keep up the good job!
Yes I sometimes use this resolution…but not for the same reason. I have a 1440p monitor on its side as my main monitor. Using a windowed game, you often end up at strange resolutions like this one. I do also have a 4k projection screen. But who wants to use that in summer.
I'd used a 19" ViewSonic CRT monitor at 1024x768 at 116Hz since 2007, when I built my first PC. Just last year my CRT finally kicked the bucket, so I swapped to a 24" ViewSonic LCD. 1024x768 is a very usable resolution, but I'd found that most modern games don't support it. I'm so used to 4:3 that 16:9 makes me feel bug-eyed, and I use 1280x1024 with black bars to this day!
I have a newer PC and still use a 900p-native 75Hz Asus monitor at times. At its native resolution it looks at least as sharp as 1080p for gaming. Growing up on CRT monitors and low resolutions brings back memories 🤤😉
I was using the same computer with an AMD Radeon HD 6350 for 10 years, so I was used to 768p gaming; even my desktop was at 768p. Now I've bought a new computer with an RTX 3050, my old gamer-boy soul has caught fire again, and I'm happily playing games at 1080p.
What makes 1024x768 so great is that it's an easy resolution to get native image quality at without scaling, because of the 1366x768 TVs and monitors, and the fact that it's easy to get games running at this resolution. And the performance boost isn't only because of the fewer pixels to render; the aspect ratio helps too, reducing the extra field of view rendered in wide aspect ratios.
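The aspect-ratio saving mentioned above is easy to put a number on, assuming equal vertical resolution (using 1366x768 as the common widescreen "768p" for comparison):

```python
# At the same 768 vertical lines, a 4:3 frame has about 25% fewer pixels
# than a 16:9 one -- a performance saving before any settings are touched.
four_three = 1024 * 768     # 786,432 pixels
sixteen_nine = 1366 * 768   # 1,049,088 pixels (common "768p" widescreen)

print(f"{1 - four_three / sixteen_nine:.0%} fewer pixels")  # -> 25% fewer pixels
```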
What I really loved about CRT monitors is that you could play at whatever resolution you wanted (or could manage) without any blurriness. My brother and I had a great 19" LG Flatron with a 1600x1200 resolution. The first PC we connected to that CRT was an Intel Celeron 466MHz with 320MB of 100MHz RAM and a GeForce2 MX 400 32MB, and we played everything at 800x600, especially demanding games like Max Payne. Then we got an upgrade from our parents, a Pentium 4 2.4GHz with 512MB DDR RAM and a GeForce4 Ti 4200 (legendary card), and we could play our games at 1280x960, some at 1600x1200.
A funny fact that goes along with this video: many projectors (by which I mean all before a certain date) that claim to be 1080p are actually native 1024x768. But they use 3 LCDs to render red, green, and blue separately and then combine the images using a prism and lenses. The combined image actually has more dots than a 1080p display, but they are separate colours; your eyes blend them. Basically, if you have a projector you bought more than a few years ago because you wanted to watch your new Blu-ray movies on a wall, that projector is probably a native 768 display and will work perfectly for this kind of gaming.
I personally get eye strain on small monitors, so I decided to upgrade to an ultrawide. The problem, however, was that most of them are 1440p+ in resolution. I snagged a 1080p ultrawide and barely took a hit to FPS for the reduced eye strain!
Fun fact about my build: I'm running a 3090 Ti but ran low on budget after buying it, so I still use my old Dell 1024x768 monitor with an HDMI-to-VGA adapter. NGL, I haven't experienced 1080p apart from my TV or my phone. And if you're wondering why I don't use my TV as a monitor: well, it's big and hanging on my living room wall, and I still like my old monitor; it gives me back that childhood feeling.