@@aminbagheri9044 I am not sold on RTX Remix yet. I remember that it upscales the textures etc. It can work if you also have an artist who goes over the AI-upscaled textures and fixes them up; even in the demo they showed during the conference (which likely shows somewhat inflated best-case results) there were instances where the "AI upscaler" would just straight up throw detail out the window and kind of "un-dither" the texture, making it look too smooth. And I kind of disagree about Quake 2 looking better with RTX (that demo also adds a couple of effects like motion blur etc.). The mood of the game is lost; the RTX lighting in that game looks kinda half-assed.
I actually ordered this laptop, 3050 and all. Didn't need it for much gaming, maybe just some indies here and there. To my surprise, I was sent the 3060 variant and only charged for the 3050! Still stoked about that; it's been a great laptop and performs quite well.
@@joelconolly5574 That was probably 6-7 months ago and I never did hear any fuss. It was an Amazon vendor; idk, maybe they didn't really care or didn't have any 3050 models to send lol.
I think one thing to keep in mind about the laptops with the 3050/3050 Ti is that their screens are usually around 15 inches. Definitely a lot smaller than the monitor used for testing in this video. The smaller screen does a better job of hiding the imperfections caused by using upscalers like DLSS and FSR.
FidelityFX on my Steam Deck is amazing, so with a current-gen Ryzen APU I'd like to see how it compares. Can't wait till the 7000 series hits Dawid's hands.
@@scottcol23 I've seen somewhere that the iGPU on the new Ryzen CPUs is the same as the one on the previous gen. Looks like they decided not to change it...
1) Quake - I lost 600 frames per second. 2) FN - I dropped from 200 to 60 frames per second. 3) BF5 - I'm not sure I can tell a difference visually with RTX on. Dawid, you just summed up ray tracing perfectly!
The DirectX12 stutters are a known issue, and not just with Fortnite. It has something to do with Exploit Protection in Windows 10, there are tons of articles that can be found via Google on how to fix it.
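For anyone who doesn't want to dig through those articles: the fix usually described is excluding the game's executable from the Control Flow Guard mitigation. A hedged sketch using Windows' built-in `Set-ProcessMitigation` cmdlet from an elevated PowerShell — the executable name below is Fortnite's, and whether CFG is actually the culprit on your system is an assumption, so check first and change one thing at a time:

```powershell
# Inspect which exploit mitigations currently apply to the game's executable
Get-ProcessMitigation -Name "FortniteClient-Win64-Shipping.exe"

# Disable Control Flow Guard for just that executable
# (system-wide Exploit Protection settings are left untouched)
Set-ProcessMitigation -Name "FortniteClient-Win64-Shipping.exe" -Disable CFG
```

The same per-executable override can be set in the GUI under Windows Security → App & browser control → Exploit protection settings → Program settings.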
In my experience with DX12, it adds stutters as it fills the shader cache when a game launches. For example, playing Ready or Not, I need to play each map once with stutters, but the second time it's smooth. In some games I feel the stutter never goes away, like in Battlefield.
Thanks for this video bro, I've got the same laptop as the one tested here and was happy to see that the RTX 3050, as a budget laptop GPU, can still perform well with newish titles in 2023. Happy to see that RTX and DLSS also work reasonably well on this laptop :). I had my concerns when most reviews I've watched rated the mobile RTX 3050 as meh. So thanks for clearing that up for me; on a budget, I feel a lot happier now about my ROG Strix G15.
I really never use RT on my 3060. I tried it in Control a couple times and didn't feel it was worth the lower FPS. It pretty much just makes things shiny, but I'm not a crow, so it didn't work on me.
I was watching the video happily, and when you reached the end talking about upscaling technologies, the mention of iGPU testing made me realize how dumb I am to never have thought about using that on my slow laptop. This entire summer I've only had access to my laptop, which can barely run even League of Legends at 60 fps, and I've put up with that performance in somewhat more demanding titles. Now you have made me realize what a way better experience it could have been with FidelityFX! Thanks for that lol.
@@spacecy Yeah, if you want to see crappy FPS on a powerful Nvidia GPU, just run Linux and you'll see why Linus flipped Nvidia the bird in that one clip on YouTube. He was pissed at them for basically saying "we can't help you make drivers, or at least help you understand our hardware so you can make decent drivers for our GPUs like AMD does".
The RTX in Control was phenomenal imo. I played it on a 2070 at 1080p w/ DLSS (so the render resolution was 720p) and it looked amazing; still does tbh. I managed to get a 3080 some time later and god damn, at a solid 144fps and maxed out, it still is the best looking implementation of raytracing imo. The fact that the game itself is great is pretty cool too.
Raytracing, not RTX. RTX is the marketing term from Nvidia. The technology is called raytracing, it is part of DirectX 12, and it has been around for decades.
@@edenjung9816 Well yeah, obviously that's what I meant, no need to be pedantic about it my dude. I even wrote that "it still is the best looking implementation of raytracing", so...
Yup. Played it and beat it on a 3050 at 1080p w/ DLSS. Loved it. Got a 3070 a few months ago and yeah - at 60+ fps at native resolution with full ray tracing, it is a feast for the eyes. Incredible game.
My assessment of raytracing from my experience is that its quality varies from game to game. More often than not I found that, with a few exceptions, unless you are really looking for it or it is something noticeable, you really can't tell the difference between raytracing being turned on or off. Plus there's no standard application: some games just use it for reflections, which is the most noticeable effect, while others use it only on lighting and shadows, which are barely noticeable at all. In the end, the graphical upgrade it provides feels too small to justify the performance hit you would take, especially if the game isn't optimized appropriately to use it.
I am not sure if it has already been mentioned in the 590 previous comments, but the thing with Battlefield V is: it only features raytraced reflections. In the game scenes you showed at first there simply were no reflections, so the framerate did not tank as much. In the hangar, though, there was water on the floor, which has raytraced reflections in it. So that was what tanked your performance there, not so much the fire and the smoke. :)
I noticed that smearing effect on bright things happens in other games too; even with DLSS disabled it does that, but DLSS seems to make it more noticeable. It's actually a cool effect.
iGPU FidelityFX would be great, especially with AMD's new CPUs, 'cause those can already be gamed on pretty darn well (I mean, 2 GPU cores will only get you so far, but we're looking at last-gen base-model console performance on an iGPU that isn't even intended to be gamed on. Kinda makes me excited for if they ever make a G variant)
I have a Dell G15 with Ryzen, and I think the GPU power limit is 90W in my case. Even with a GeForce Experience overclock I have a stable 2050MHz. Gets the job done really well for me.
I would love to see the difference between manually tuning the video settings and the ones suggested by GeForce Experience and whatever AMD is doing these days. To be honest, I have just been using GeForce Experience, and it's been going fine for me. It's been a while since I have had to dig through a game's settings.
I've noticed that GeForce Experience tends to do ok, but it will also randomly suggest lower settings in some games where they don't make much sense. You don't really need to turn on DLSS for Fist running on a 3080, but it REALLY wants to, etc. Also, the suggestions tend to get worse when a new product launch happens.
@@MarginalSC I know you might know this already, but you can change the performance level for each program in the GeForce Experience app. My Valorant had its settings automatically set to high because of GeForce Experience, but then I found the setting and set it to low.
@@art_nt_nk8353 Yeah, but if you're just going off the defaults, things can get weird. Quake RTX? How about you turn off that global illumination? Your 3080 can't handle that...
That feeling when my RTX 2070 Mobile does much, MUCH better than a 3050. They really gave the 3050 the short end of the stick this round. I kind of expected a card with the same CUDA core count as the 2070 to have equal or better performance.
The RTX 2070 has almost double the TMU and ROP count, almost triple the tensor cores, and again double the RT cores... and of course 8GB of VRAM on a 256-bit bus... So yeah, not even close...
Your mobile 2070 laptop probably also cost more than twice what a 3050 laptop costs. They are really good chips for the money (especially during the gpu shortage period).
I had a laptop with an RTX 3050, and that was my first device that was actually capable of running modern games. That GPU really introduced me to what was possible with PC gaming. Now I have a proper desktop PC where I can max out the settings in any game.
Awesome tests - I'm looking to replace my aging gaming laptop - i7-7700hq + gtx 1060m 3gb with a 3050/3050ti laptop. DLSS / FSR really seems like a must for this type of card... Over time the 4gb of VRAM seems like it will really create a problem for these GPUs keeping up.
Don't take the 3050, because that 4GB of VRAM won't get you anywhere unless you plan to game on low to medium settings. You'd rather take the 1660 Ti, or save up a little more and get a 3060.
@@r3tro290 That is where I find myself at the moment. I do a bit of gaming and a bit of video editing. Seems to be the case I'm going to need more VRAM either way
I would absolutely recommend at least a 3060, if not a 3070. The VRAM and overall performance are more than worth the extra couple hundred it'll likely cost. It's anywhere from a 20-30% performance increase depending on the wattage being pushed through it. You'll definitely notice a difference, and it should keep up for quite a few more years than the 3050 would.
I managed to find a €900 laptop for €661 with a 3050 and 144Hz screen, so decent value since I don't use it often for gaming and I don't mind playing on medium settings. So far so good, runs pretty well honestly and the GPU only takes 60W.
I find DX12 less stuttery with my 3060, but Fortnite is all over the place when it comes to performance; it's always a surprise after an update comes out. Edit: (I'm using a stock 3700X, water-cooled with a 360mm rad.)
The S4 launch definitely introduced stutters and gave me frame drops but I think they were mostly related to CPU usage and memory leaks. Even in performance mode I had bad fps. I run a mobile 2060.
I'm using an 11th-gen i5 and a 1650 with 16GB of RAM. Not the best hardware, but it also stutters, and it only happens in Fortnite; even Forza Horizon 5 runs smoothly.
It says 1.6 at the bottom; that's the latest game version. Devs have to enable FSR 2.1 themselves, and CDPR are clearly working with Nvidia, given their continued support for Nvidia features like DLSS 3 and RTX Overdrive. 1.6 still only has FSR 1.0, though there is a mod for FSR 2.x.
SSRTGI on Skyrim vastly improves the experience, with a real need to carry a light source in dark places and far more atmospheric environments. It needs some tuning, and it obviously tanks performance, even with a 3090 pulling 600W.
They showed off raytraced morrowind as part of their recent events showcasing their raytracing modding pipeline for the high end 40 series graphics cards... Or was that another fever dream...
With the improved power efficiency (at the low end) of next gen I think the 4050 might be a pretty sweet card. I'm very curious what DLSS 3.0 ends up looking like.
I have an RTX 3050 laptop and it boosts up to 2010-ish MHz, but if I turn turbo mode off it's usually 1900. I've found that it's a great GPU for what I play and what I paid for the system (£649).
Fact of the matter: very few games actually fully implement ray tracing. And why would they? In Battlefield 5, for example, there are not that many reflective surfaces, so the game for the most part looks basically the same with or without RT. Cyberpunk is more of an urban game, so RT does benefit it - sometimes. DLSS actually works against RT, though people do not realize this: DLSS tries to reconstruct a high-res picture from a low-res one, so fine RT detail only gets in the way.
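To put numbers on the "low res pic" part: DLSS renders internally at a fraction of the output resolution and upscales from there. A quick sketch using the commonly cited per-axis scale factors — these are approximate, vary by DLSS version, and should be treated as assumptions rather than official constants:

```python
# Approximate internal render resolutions for DLSS quality modes.
# Per-axis scale factors are the commonly cited ones, not official values.
DLSS_MODES = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    """Return the internal (pre-upscale) resolution for a DLSS mode."""
    scale = DLSS_MODES[mode]
    return round(out_w * scale), round(out_h * scale)

# 1080p output in Quality mode renders internally at roughly 720p --
# the same "1080p w/ DLSS = 720p render" trade-off mentioned in this thread.
print(render_resolution(1920, 1080, "Quality"))      # -> (1280, 720)
print(render_resolution(1920, 1080, "Performance"))  # -> (960, 540)
```

So the GPU is ray tracing a much smaller image, and the upscaler has to reconstruct any fine reflection or shadow detail afterwards.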
My mrs bought a Lenovo laptop for just short of £700. It's a nice spec; she opted for 16GB of RAM with the RTX 3060M over 8GB with the 3050. She's well pleased with it.
Idk why the GPU power limit is 80W. My Dell G15 has an RTX 3050 at 95W, and I think the Strix also gets a 95W version of the RTX 3050 if you update the BIOS. Nevertheless, laptop GPUs are also good enough for gaming.
30 FPS isn't bad, though I imagine the sole reason for the laptop RTX 3050 is for workstation/on-the-go studio usage. The price of these machines had been going down for quite a while, too...
Nvidia should come up with better names for their laptop GPUs. Just calling it a 3050 when it's in fact not one is just misleading. They could call it 3050-ish instead.
DX12 is terrible for Fortnite; it's a known problem that some systems stutter. You need to put the game in Performance mode, which is what most competitive players would be using anyway instead of everything on low.
On my "totally for work" laptop, somehow wielding only a 6800H, that FidelityFX is definitely a blessing for me; I'm able to plug into my 1440p monitor and play games at decent graphics quality.
I remember when I used to watch ray tracing videos thinking it was the shit. Then I got an RTX 3060 Ti, turned on ray tracing in Control (70+ fps at 1080p), turned it off again and carried on playing the game... I wonder when people will stop exaggerating how graphics look in video games. The main things in a video game should be good textures and very well-developed mechanics, not shiny floors and bright halls.
How cool is it to buy a device for a lot of money, turn on RTX, try to see the difference with a microscope and not see it, and get FPS like on your parents' laptop from 2013 as a bonus.
Honestly, I feel like RDR2 would also be a good game to test GPUs with. (It has DLSS and FSR, no ray tracing, but it is still very demanding.) Edit: also, for the Metro Exodus smear with DLSS, use DLSS Swapper to swap in an updated version of DLSS (I found that 2.4.6 has the best results overall).
1650 and 3050 gangs rise up. Jokes aside, I’m alright on my 1650ti Nitro 5. Waiting for more clearance on 3060 laptops or desktops, but the last few years on my 1650ti have been surprisingly great.
@@Naarii14 Can you imagine ray-traced Skyrim? That would be a Skyrim worth buying right there. Cause I have RTX, I want to turn it on in Skyrim to make things look pretty🤪
Ray tracing in old games with low-poly models and low-resolution textures doesn't make much sense; it just looks out of place. And in games with an art style that is far from realistic, like Minecraft, it's just straight up dumb.
Hey Dawid! Since Chapter 3 Season 4 came out in Fortnite there was a good amount of performance decrease, and 1 day ago they fixed the issue!