Best to test the voxel shaders in a cave or at night time. They use a dynamic lighting system, which is why they're so intensive. Torches, lava and other coloured light sources can cause the surrounding blocks to cast hard shadows that update in real time. These shaders are also based on the Complementary Reimagined shaders, which is why they have the vanilla-looking water by default.
Yeah, for real! I strongly agree, and strongly recommend it for people who like caving a lot! Having a shader enabled while exploring a cave looks really good, and sometimes it almost feels like a real-life experience to me! There are a lot of people who are really interested in it! I think these specific shaders (Voxel shaders) are very cool, even the water if you see some in a cave! Tbh, I strongly suggest people try these shaders at least ONE time!
@@W1LlzA well most are entry or low tier tbh, especially if they're older than 2-3 years. My 1660 Ti was mid tier for sure in the past; now it's entry level at best
This shader list is so much better than what you normally run! I love how you have raytracing shaders to squeeze every last bit of performance out of the card. It would be really interesting to see you run SEUS HRR 3 on older tech too
Personally I like the SEUS PTGI shaders the most. They don't look that great with the vanilla Minecraft textures; they're meant to be used with PBR texture/resource packs. That way they look way better, though the game can get a little more intensive.
The reason you, Kryzzp, aren't seeing any godrays in Sildur's Vibrant shaders is that volumetric lighting turns off godrays internally, even if you turned them on. You can see godrays by turning off the Volumetric Lighting option in the "Sky and Lighting" settings. Thank me later, and love your content!
Hello I am a 74 year old man from Uruguay, and can I just say that I love your content! The deep philosophical barriers really reflect on what post war modernism has done to the average psyche. Your in depth arguments truly evoke discussion, for example in this video you commented on the obscenity of the war on drugs in Switzerland. What an amazing statement! Please continue what you are doing... the world needs people like you
With Rethinking Voxels, DO NOT set its profile to Ultra, because it makes no difference in visuals but for some reason destroys performance. Where it's gorgeous is block lights: set the time to night, go to a village or a cave, and place torches and other light sources. It looks absolutely beautiful; each one creates nice, sharp, colored shadows, even better than something like SEUS HRR. It is intensive, but with the right settings it runs quite alright. On my GTX 1050 Ti at 1080p I managed a solid 30+ fps with beautiful visuals; on an RTX 4070 at 1440p it may sit at 120 fps.
I wish that SEUS PTGI would work on AMD as well. Also, Rethinking Voxels needs loading time to compile the shaders, plus the latest Iris. I would recommend the Fabulously Optimized modpack for testing
My brother in law just showed me your video. He said you look just like my son. I didn't believe him. So here I am and sure enough, you look just like my son!!!
yeah, should be 500, max 550, since you can buy an RX 6800 XT, which has 16GB of VRAM, for 550 euros. The 4070 delivers the same performance as the RX 6800 XT, with the downside that the 4070 is more likely to run into VRAM issues (spillover, stutters, etc.)
@@limeisgaming I remember getting an open-box 1070 FE for 300 dollars back in 2018. I used this video and the 6900 XT MC shader video (I upgraded to the 6950 XT from my 1070) to see the difference between the two cards, as they were around the same price, $600. According to these two videos, the 6900 XT beats the 4070 (the Project LUMA results have a 100 FPS gap, OMG). Not sure how far drivers have progressed since the 6900 XT video came out, so it might be different, but still, damn. I chose the 6950 XT originally because, although the 40 series had all the new bells and whistles, I mainly wanted a GPU that could play Minecraft modpacks with shaders at large render distances. DLSS 3 and Frame Generation were not going to help me get over 240 Hz on Minecraft Java, so the Nvidia fanboy feature argument I see in YT comments was out the window. Funny thing is, even though by buying RDNA2 instead of the 40 series I gave up the potential to enjoy frame gen in newer games (if I decided to play them), apparently AMD is going to bring it to my GPU anyway with FSR 3, and **maybe** even give me the option to enable frame gen in older DX11/DX12 games. It's funny how things end up working out. Sorry for the essay, I just wanted to share my thoughts on a video about the main use case for me having such a powerful GPU.
you can lower the simulation distance. If you set it to something like 6, then beyond 6 chunks mobs will stop moving, redstone systems will stop, etc. It will help the CPU.
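For reference, on a dedicated server this is a `server.properties` key (a sketch; defaults can vary by version — in singleplayer it's under Options > Video Settings instead):

```
# server.properties
# limits ticking (mob AI, redstone, crop growth) to this many chunks
simulation-distance=6
# render distance only affects what is drawn, not what ticks,
# so it can stay higher without the same CPU cost
view-distance=12
```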
idk if you know this, but you can use /locate structure to get the coordinates of the nearest structure you want, for example /locate structure Village (the syntax got updated, in 1.20.6 I think, not sure about that one), and /locate biome plains to get the coordinates of a plains biome
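A few example invocations (a sketch; exact ids depend on the game version — since the 1.19 command split, structures and biomes generally need namespaced ids or tags):

```
# nearest plains village (structure tag form also works: #minecraft:village)
/locate structure minecraft:village_plains

# nearest plains biome
/locate biome minecraft:plains
```

The command replies with coordinates you can teleport to with /tp.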
I think after 2 more videos you're gonna blow up that monster GPU 😂 After testing 16K res at Ultra settings and getting 10 fps, what do you think you'll try next? 😂❤
This is the best video for me right now, because I'll be building a PC in a few months and I've planned on buying the RTX 4070 Ghost, and I want to play MC on it mainly
I wonder what the 4060 Ti will be like, and at what price point. I'm guessing it will be like a 3070 Ti in power, with DLSS 3 of course, at a price of probably 450
SEUS HRR 2.1 and HRR 3 are amazing, but you need to go into the settings and play around, because the defaults feel a little too vibrant. Also, I recommend HRR 2.1 because it's more stable and gets more fps than HRR 3.
Sildur's shaders disable godrays when volumetric lighting is enabled, I think; you can't have godrays and VL at the same time. It's because of how volumetric lighting works
@@zWORMzGaming Good stuff. I'm planning on upgrading from a Ryzen 3600 to a 5800X3D, and my old GTX 1070 will most likely be replaced with a 4070, since my case is small (Meshify C)
I would go for the 1TB SSD for now. When you run out, buy a cheap SATA SSD. Even though I have an SSD for Windows, I have 6TB of HDD storage for my games and I really regret it. Load times aren't fun.
I have an i9, an RTX 4070, an M.2 SSD, 32GB of RAM and an 800W PSU, but if I use shaders my PC always crashes. It reports a critical error but doesn't state the cause; it just says Event ID 41, Kernel-Power. Do you have any ideas?
Minecraft is the only game that can run on potato GPUs and can also destroy an RTX 4090 with shaders, physics mods and texture packs, thanks to its big community
Kryzzp, wouldn't you like to make a video where you talk in Portuguese with subtitles for once? It could even be a short one, but it would still be an attractive video as part of your channel. You could use some idioms, sayings, or equivalents of "what the heck" and "God damn it". I think that's a cool idea.
24:18 What, 80 years ago? 😂 Or am I hearing it wrong? Also, decrease the simulation distance to 5; that reduces CPU load by a lot without any noticeable change.
Hey bro, I was wondering, what are the actual clocks of your CPU? Do you see any performance benefit going from the stock 5.1 to another overclock, say 5.5? I've tested it with so many games and it feels like the 13600K performs the same at any clock. I paired it with a 3080 10G.
I run 5.6GHz on the P-cores and 4.2GHz on the E-cores. You don't see a difference with the 3080 because the i5 is already good enough at stock not to bottleneck it. In that case, instead of overclocking the CPU I suggest undervolting it, so it consumes less power while giving you the same performance :) You'd start seeing a difference when overclocking with something like an RTX 4070 Ti or 7900 XT and above in performance, or if you use low settings / low resolutions in games!
I have a quick question that I hope can be resolved here: how do you consistently keep your GPU at 90-100% usage? Even after mass tweaking my game, it always underperforms while staying at ~50% usage (this is with an RTX 3080 Ti)
@@DravenCanter No, not at all. When I run benchmarks the scores seem fine, but whenever I'm in game I immediately notice I'm lacking performance. In his video with the 3080 Ti running MC he's constantly hitting 200+ fps at 100% usage, while I sit at 100-ish frames (1440p), going as low as 40-50 with it at around 40-60% usage
To add to this, I also run into VRAM issues, usually running out of it with applications that shouldn't be exhausting all my VRAM, even with all my other apps closed
I think a really fun thing you could do in these Minecraft videos is spawn a bunch of mobs or explode a lot of TNT to see the performance. And btw, if you want to find a specific biome or structure in Creative, you can type /locate biome jungle, for example! Hope this helps
Can you do a CPU comparison for Minecraft at this point? Like 5800X3D vs 5900X vs 13600K with the RTX 4080, at 1080p through 4K. You could do as many CPUs as you want, but it would be nice to see which CPU works best for this game. Also, spawn a huge group of mobs inside a fence to see how they handle a cow farm, for example.
@@yancgc5098 I did not ask that. I asked for a comparison. In fact you could be wrong. What if minecraft takes advantage of the 3d V-cache in multiplayer scenarios. Its just not that simple of an answer. I've also seen the 13600k lose to the 5800x3d many times. Its all test scenario bc they are both better in certain areas.
@@Purified1k Well you can throw the 5900X out of there since Minecraft Java is very single threaded. Between the 13600K and the 5800X3D it depends whether Minecraft loves that amount of 3D V-Cache or prefers the adequate amount of cache alongside the superior single core performance of the 13600K. My money is on the Intel CPU
@@yancgc5098 But you're forgetting that I just want the comparison in general. So yes, even though the 5900X might not be as good, I want to see how it stacks up.