I bought an $8 CPU to find out one thing: can it run Minecraft? I almost got scammed, almost cried, but something very unexpected happened... ⭐Guiny Merch! - guiny.merchforall.com/ 📩 Business - guinybusiness@gmail.com
Intel Xeons aren’t scams; they’re just very cheap because they were mass-produced back when everybody wanted them, and nobody wants them now. They’re typically used in workstations, and they’re so cheap because offices and the like clear them out all in one go to get new ones. They also make pretty decent budget gaming PCs if you can find a decently newer version of one. Btw, they’re also server CPUs. Edit: thank you guys, I never got so many likes before!
@CodingK1d I have an Intel Core i5-6500 CPU that clocks at 3.2 GHz and it can only run Fortnite at 45 FPS on the lowest settings (you must have a good or decent GPU then)
Ye, I have a Xeon E3-1220 v2 and it can run MC at like 80 FPS at 32 chunks as long as I'm using some optimization mods; my GPU is the GT 730 4GB and it underperforms for my CPU
not really, smartie. The Intel Pentium is so old that it bottlenecks the rest of his system by a lot; it's like having an RTX 3060 and using it like an Intel HD Graphics 4000 (integrated). So think before u say something
wtf no? Lunar got 1000 FPS for 2 seconds, and that was while the chunks were loaded. It doesn't lag because default Minecraft is really unoptimized, and Lunar adds performance mods/patches to the client that make it run better and do the math faster
Minecraft's code isn't made to run on the GPU. You can mod the shit out of the game, but the backend of the game is designed to be handled by the CPU. You would have to completely rewrite Minecraft's code (and not in Java to begin with) so it could hand off all the complex calculations like chunk gen, mob AI, etc. to the GPU instead of the CPU. While Minecraft sort of supports multithreading for chunk gen, it's somewhat experimental because it increases the chance of generating faulty chunks. Modern GPUs have thousands of cores and great performance when those cores work together, but not when they work alone. GPUs have horrible single-core performance, and a single chunk is generated on a single thread, so chunk gen speed would be horrendous on the GPU. Minecraft would have to implement a system that lets the game generate one chunk across multiple cores at the same time, but that would probably be highly complex and just not worth it. The GPU is only used for things like lighting, shadows, particles, vertex shaders, tangent vectors, rendering triangulated polyhedra, etc.
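The one-chunk-per-thread point above can be sketched in plain Java. This is a hypothetical stand-in for real terrain generation, not Mojang's actual code: each chunk is one independent task submitted to a CPU thread pool, which is exactly the coarse granularity that suits a few fast CPU cores rather than thousands of weak GPU cores.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical sketch: chunk gen parallelizes per-chunk on CPU threads.
// Each chunk is an independent unit of work; splitting the *inside* of one
// chunk across thousands of GPU cores would be far harder to get right.
public class ChunkGenDemo {
    // Stand-in for real terrain generation: fill a 16x16 heightmap.
    static int[] generateChunk(int chunkX, int chunkZ) {
        int[] heights = new int[16 * 16];
        for (int i = 0; i < heights.length; i++) {
            // deterministic pseudo-terrain so results are reproducible
            heights[i] = 64 + ((chunkX * 31 + chunkZ * 17 + i) % 8);
        }
        return heights;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<int[]>> futures = new ArrayList<>();
        // Submit a 4x4 grid of chunks; each chunk runs whole on one thread.
        for (int x = 0; x < 4; x++) {
            for (int z = 0; z < 4; z++) {
                final int cx = x, cz = z;
                futures.add(pool.submit(() -> generateChunk(cx, cz)));
            }
        }
        for (Future<int[]> f : futures) {
            System.out.println("chunk done, corner height=" + f.get()[0]);
        }
        pool.shutdown();
    }
}
```

Note the pool runs whole chunks in parallel but never splits a single chunk; that is the restructuring the comment says Minecraft would need before GPU-side generation could ever pay off.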
by disabled do you mean broken? Because there is no physical damage, and the likelihood of the transistors being broken is very slim. Or do you mean that the ASUS motherboard is not compatible with the CPU on a software level?
@@Mase.- nope, it's running off the GPU. If it were running off the integrated graphics, the HDMI cable would be plugged into the motherboard, not the GPU, and in the F3 menu you can see that it's being rendered by the GPU
@@Guiny Wow, I didn’t expect you to respond! I’m glad that you look to improve and learn more about tech, and I admire you for that. Not many people are willing to learn, but I’m happy that you do. I don’t have much time to tell you everything, but I think the primary issue with your testing is the mods used, as well as the balance between the components. Vanilla Minecraft is incredibly CPU-heavy, which is why you saw poor performance with the Xeon. However, Fabric and OptiFine are primarily GPU-heavy, which is why you saw a major performance improvement when running Lunar Client. The other thing to consider is that Xeons will naturally perform worse in Minecraft than desktop consumer-segment chips from the same era due to lower clock speeds and slower single-thread performance. This is primarily because workstation tasks need more cores, not faster cores. But overall, I think this is just more of an issue on my part, as I personally look forward to the analytical parts of tech videos like this. Maybe it’s because I’m just a computer nerd, but keep up the good work.
hey guiny! great vid as always. I have a quick question! So I'm going to make a video on the cheapest phone, and see if it can run Minecraft. Do you have any tips on how to edit it, or thumbnails??
Wrong. Bedrock is written in C++, which is way faster than Java. Bedrock and Java are barely different in how they divide the work: where everything is handled in the system is the same. Chunk generation, mob AI and so on still run on the CPU, while stuff like lighting, shadows, particles, vertex shaders, tangent vectors, rendering triangulated polyhedra, etc. is still handled by the GPU.
@@luigiistcrazy Bedrock is not "way faster than Java" because it's written in C++, it's faster (though even this is debatable) because Minecraft Java Edition is the codebase equivalent of a festering garbage tip.
Playing Minecraft Java 1.8.8 with my trusty CMClient on this old Intel Celeron 900 2.19GHz CPU is always an adventure. Can you believe I still manage to keep up a decent 60-100 FPS on my ancient Windows 7 32-bit, 3GB RAM, single-core setup? Crazy, right? Though my CPU usage hitting 100% most of the time is a bit of a hassle. I've watched so many optimization videos on YouTube. I remember the day I installed Minecraft and got 5-10 FPS; my laptop even crashed and got the blue screen. I got this laptop when I was 14 years old on February 1st, 2023, and now it's February 22nd, 2024, so it's been a year since I've had this laptop. This laptop has a heating problem, and I bought a cooling pad with the money I saved, but it still heats up. This laptop? It's like a treasure passed down from my dad. It's been with us for 11 years now. Sure, it's ancient, but it's my lifeline. I'm from a middle-class family, so an upgrade isn't happening anytime soon. But you know what? I'm grateful for what I have. By the way, I'm 15 years old. This laptop isn't for gaming; it's for work only. I make and edit videos for my channel. Your videos have been a constant source of joy on this aging machine. A little heart in the comments would truly make my day :) EDIT: It's worse now, I can't even play Minecraft, so I'm leaving it :) and now it's summer so my laptop is going to heat up more lol, but yk this laptop is my lifeline
Absolutely, I'm beyond grateful! It's been a whirlwind tinkering with this trusty old laptop, but I'm thrilled it's still going strong. Although, I must admit, it's not always smooth sailing, especially with the heating issues causing my Minecraft FPS to drop. Nonetheless, your videos have been my constant source of inspiration and guidance along the way. Keep up the amazing work, you're truly the best! @@Guiny
To be honest, you should learn about opening up computers if you're still on a 15-year-old laptop, just saying. Those kinds of skills will come in handy if you keep going with systems THAT old.
iirc on 100- and 200-series motherboards u can disable something in the BIOS to be able to support Xeons. You won't have to do that with LGA 2011-v3 and older sockets, as they have consumer-compatible Xeons which can just be slapped in, easy game
my main computer right now is rocking two Xeon X5675s and they're pretty good. You need a server motherboard and ECC memory for the Xeon you bought to run.