can this comment get 1000 likes?
edit: dang we smashed that, but can we get 1000 subscribers? 😂
yes and i’m first lmao that’s amazing im 45 seconds late tho that’s sad
yes it is easy
No ony video
Yes
😮
Intel Xeons aren’t scams, they’re just very cheap because they were mass produced back when everybody wanted them and nobody wants them now. They’re typically used in workstations, and they’re so cheap because offices and the like clear them out all in one go to get new ones.
They also make pretty decent budget gaming PCs if you can find a reasonably newer version of one. Btw they’re also server CPUs.
Edit: thank you guys I never got so many likes before!
my pc is worse but has the same cpu bc it has integrated graphics
i have one and it's obviously ass but it can run fortnite at 70 fps on medium settings
@@CodingK1d depends which one you have really
@CodingK1d I have an Intel Core i5-6500 that clocks at 3.2 GHz and it can only run Fortnite at 45 fps on the lowest settings (you must have a good or decent GPU then)
Ye, I have a Xeon E3-1220v2 and it can run MC at like 80 fps at 32 chunks as long as I'm using some optimization mods; my GPU is a GT 730 4GB and it underperforms for my CPU
U PUT THE CPU IN A MOTHERBOARD WITH THE WRONG SOCKET
by disabled do you mean broken? because there is no physical damage and the likelihood of the transistors being broken is very slim. or do you mean that the asus motherboard is not compatible with the CPU on a software level?
@@electricminecrafter The motherboard is no longer compatible. That's what he meant
Bro you can't put in a CPU that doesn't match the motherboard's socket
@@golustheplayer you can
@@golustheplayer some sockets are very similar or exactly the same, just different bus speeds
I bet more money was spent on the case than the actual CPU
Cases can be expensive so yeah
bro put a bet
Ofc
@@RaidenBrice meh they are also not needed
@@epicgamer66941 They're optional, yeah
Next video idea: find a GPU for 8 dollar
the terror
Nvidia GeForce2 MX from my 2004 Apple Mac has entered the chat
good idea i hope that he like this idea
old radeon gpu would lowkey work
gtx 210 has entered the chat
the reason why lunar client got 1000 fps was bc it was mainly running off the gpu, which is like miles better than the cpu
wtf no? lunar got 1000 fps for 2 seconds and that was while the chunks were already loaded. it doesn't lag because default minecraft is really unoptimized and lunar uses, i guess, some performance mods/addons in the client that make it run better and do the math faster
Minecraft's code isn't made to be run on the GPU. You can mod the shit out of the game, but the backend of the game is designed to be handled by the CPU. You would have to completely rewrite Minecraft's code (and not in Java to begin with) so it can hand off all the complex calculations like chunk gen, mob AI, etc. to the GPU instead of the CPU. While Minecraft sort of supports multithreading for chunk gen, it's somewhat experimental because it increases the chance of generating faulty chunks. Modern GPUs have thousands of cores and great performance when these cores work together, but not when working alone. GPUs have horrible single-core performance, and a single chunk can only be generated on a single thread. Chunk gen speed would be horrendous when running on the GPU. Minecraft would have to implement a system that lets the game generate one chunk on multiple cores at the same time, but this would probably be highly complex and just not worth it.
The GPU is only used for things like lighting, shadows, particles, vertex shaders, tangent vectors, rendering triangulated polyhedra, etc.
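(For anyone curious what "generate chunks on multiple cores" could even look like, here's a minimal hypothetical Java sketch. It is not Mojang's actual code; ChunkPos, Chunk and generateChunk are made-up placeholders. It farms chunk generation out to a CPU thread pool, and each chunk is still built by one thread, which is why single-core speed matters so much.)

```java
// Hypothetical illustration only: parallel chunk generation on a CPU thread pool.
// ChunkPos, Chunk and generateChunk are placeholders, not Minecraft's real classes.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelChunkGen {
    record ChunkPos(int x, int z) {}
    record Chunk(ChunkPos pos) {}

    // Stand-in for terrain noise, biome placement, structures, etc.
    static Chunk generateChunk(ChunkPos pos) {
        return new Chunk(pos);
    }

    public static void main(String[] args) throws Exception {
        // One worker per CPU core; each individual chunk is still generated by a single thread.
        ExecutorService pool = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());
        List<Future<Chunk>> pending = new ArrayList<>();
        for (int x = -8; x < 8; x++) {
            for (int z = -8; z < 8; z++) {
                ChunkPos pos = new ChunkPos(x, z);
                pending.add(pool.submit(() -> generateChunk(pos)));
            }
        }
        for (Future<Chunk> f : pending) {
            f.get(); // wait for every chunk to finish
        }
        pool.shutdown();
    }
}
```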
@@luigiistcrazy totally right
and even if the game ran on the gpu, it would still be bottlenecked by the cpu
@@luigiistcrazy Lunar client has gpu compatibility. If he removed the gpu he would get 20fps max
Your pc is being carried by your GPU.
not really smartie, the intel pentium is so old that it bottlenecks his gpu by a lot, it's like having an rtx 3060 and it performing like intel hd graphics 4000 (integrated), so think before u say something
the gpu just renders the 3d; his player moving and turning, world generation and everything else is handled by the cpu
and for crying out loud, the CPU is more important than the GPU. integrated graphics may be bad, but the CPU is necessary.
@@whkn fr
@@whkn who tf said it isn't, that wasn't my point
him looking for the cheapest and weakest cpu ❌
me having a weaker cpu than the one he bought ✅
Dawg you have a high end gpu of course it's going to work
Minecraft relies heavily on the CPU.
You can have the strongest GPU in the world, but if your CPU is 20 years old, Minecraft ain't gonna run.
@@ebonygyro8736 Yeah but he literally put it at 2 chunks and didn't load stuff in, the CPU wasn't the one pulling the fps up
@@noisnecsa995 My point still stands. The fact is, unless you're using shaders, Minecraft relies on the CPU more. That's not really up for debate.
He is using an RX 580 which is the equivalent of a GTX 1650 so it’s not rlly high end is it
The technological ignorance pains me so much...
I'm a to-be computer engineering student and I can't even bear watching this for more than 30 seconds
Could you tell me a few main things to improve on? I want to improve lol
@@Guiny Wow I didn’t expect you to respond! I’m glad that you look to improve and learn more about tech, and I admire you for that. Not many people are willing to learn, but I’m happy that you do.
I don’t have much time to tell you everything, but I think the primary issue with your testing is the mods used, as well as the balance between the components. Vanilla Minecraft is incredibly CPU-heavy, which is why you saw poor performance with the Xeon. However, Fabric and OptiFine are primarily GPU-heavy, which is why you saw a major performance improvement when running Lunar Client. The other thing to consider is that Xeons will naturally perform worse in Minecraft compared to desktop consumer-segment chips from the same era, due to lower clock speeds and slower single-thread performance. This is primarily because workstation tasks need more cores rather than faster cores.
But overall, I think this is just more of an issue on my part, as I personally look forward to the analytical parts of tech videos like this. Maybe it’s because I’m just a computer nerd, but keep up the good work.
@@ItsHonski i doubt he actually read all that 💀
@@Greenscreenkid_798 that's not a lot of words
one time i read a comment for minutes haha
Guiny always manages to make his videos so goofy in a serious way 😂
We love goofing
@@Guiny frfr
@@Guiny ik
why did you spend 500 usd on an i7 when you could get the 7800x3d? @@Guiny
npc ass comment srsly
6:33 first reaction when my pc came back to life after days without working somehow
imagine him not putting the CPU in the correct position, i would be so pissed at him cuz there is a freaking tab that shows you where to place it
The Goofy Ahh Guiny vids are the BEST VIDS ON THE INTERNET and I love it! keep it goin Guiny!
Thanks man!
the second power cable for the gpu wasn't even plugged in😭😭
That doesn’t matter, it can run on integrated graphics
@@Mase.- nope, it's running off the GPU. if it were running from the integrated graphics, the HDMI cable would be plugged into the motherboard, not the GPU, and in the F3 menu you can see that it's rendered on the GPU
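(If you want to double-check outside the game which GPU is doing the rendering, here's a rough sketch, assuming LWJGL 3 is on the classpath, the same GLFW/OpenGL bindings Minecraft Java ships with: it prints the OpenGL renderer string, which is the same name the F3 screen reports.)

```java
// Rough sketch, assuming LWJGL 3 is available; prints which GPU the OpenGL driver exposes.
import org.lwjgl.glfw.GLFW;
import org.lwjgl.opengl.GL;
import org.lwjgl.opengl.GL11;

public class WhichGpu {
    public static void main(String[] args) {
        if (!GLFW.glfwInit()) throw new IllegalStateException("GLFW init failed");
        GLFW.glfwWindowHint(GLFW.GLFW_VISIBLE, GLFW.GLFW_FALSE); // hidden window, we only need a GL context
        long window = GLFW.glfwCreateWindow(1, 1, "gpu-check", 0, 0);
        GLFW.glfwMakeContextCurrent(window);
        GL.createCapabilities();
        // Same renderer name the F3 debug screen shows (e.g. the dedicated card vs. an iGPU).
        System.out.println(GL11.glGetString(GL11.GL_RENDERER));
        GLFW.glfwDestroyWindow(window);
        GLFW.glfwTerminate();
    }
}
```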
guiny is literally my fav youtuber. keep going man!
Thank u :D
@@Guiny:D
@@Badguylolpo:D
@@Guiny Can I get a shout-out pwease
6:30 Hey! at least you don't have to buy a MacBook
6:33 Enjoy ur MacBook :) - We are all waiting
THE WAY YOU SAY LITERALLY AT 1:59 IS SO FUNNY 😂😂😂😂😂😂😂😂😂😂😂😂 ( I LIKED BTW )
Minecraft Bedrock has worked so well because it's more of a GPU based game.
Wrong. Bedrock is written in C++ which is way faster than Java. Bedrock and Java are barely different in the way they handle API calls. Where in the system everything is handled is the same: chunk generation, mob AI and so on still run on the CPU, while stuff like lighting, shadows, particles, vertex shaders, tangent vectors, rendering triangulated polyhedra, etc. is still handled by the GPU.
@@luigiistcrazy Bedrock is not "way faster than Java" because it's written in C++, it's faster (though even this is debatable) because Minecraft Java Edition is the codebase equivalent of a festering garbage tip.
the chance of Guiny giving me a shoutout is 0%
shout out!
@@Guiny lol ty ig
8:30 yes we all know about the "small balls" world 😂😂💀💀
Banger, loved it.
Guiny always manages to make his videos so entertaining, even when discussing budget tech options. Love the content!
W glaze
i love how he finds every piece of cheap tech to run minecraft as smoothly as possible
7:42 does your 2,000 dollar pc freeze and greenscreen on youtube?
2:35 man in labor
Cuh💀
GET A GPU FROM TEMU 😊😊
Good idea
Guiny, your production quality has come so far. I didn't even realize you made this video, it's so well made!
Thank you!!!!
vid idea: minecraft on a $50 custom pc
Did no one notice that at 0:42 in the search history it was written: "Why guiny is so cool"
Playing Minecraft Java 1.8.8 with my trusty CMClient on this old Intel Celeron 900 2.19GHz CPU is always an adventure. Can you believe I still manage to keep up a decent 60-100 FPS on my ancient Windows 7 32-bit, 3GB RAM, single-core setup? Crazy, right? Though, my CPU usage hitting 100% most of the time is a bit of a hassle.
I've watched so many optimization videos on YouTube. I remember the day when I installed Minecraft and I got 5-10 FPS; my laptop even crashed and got the blue screen. I got this laptop when I was 14 years old on February 1st, 2023, and now it's February 22nd, 2024, so it's been a year since I've had this laptop. This laptop has a heating problem, and I bought a cooling pad with the money I saved, but it still heats up.
This laptop? It's like a treasure passed down from my dad. It's been with us for 11 years now. Sure, it's ancient, but it's my lifeline. I'm from a middle-class family, so an upgrade isn't happening anytime soon. But you know what? I'm grateful for what I have. By the way, I'm 15 years old. This laptop isn't for gaming. It's for doing work only. I make and edit videos for my channel.
Your videos have been a constant source of joy on this aging machine. A little heart in the comments would truly make my day :)
EDIT : It's worse now i can't even play minecraft so, I'm leaving it :) and now it's summer so my laptop is going to heat more lol but yk this laptop is my lifeline
Dude you are amazing! Congrats on making it work so well!!
Bruh just buy a used pc, they are dirt cheap nowadays
Absolutely, I'm beyond grateful! It's been a whirlwind tinkering with this trusty old laptop, but I'm thrilled it's still going strong. Although, I must admit, it's not always smooth sailing, especially with the heating issues causing my Minecraft FPS to drop. Nonetheless, your videos have been my constant source of inspiration and guidance along the way. Keep up the amazing work, you're truly the best! @@Guiny
How please teach me master
To be honest, you should learn about opening up computers if you are still on a 15-year-old laptop, just saying. That kind of skill will come in handy if you keep going with systems THAT old.
its a nice day when guiny uploads
bro has on his search history: 'why is Guiny so cool' made me laugh. you are cool
thanks lol
I honestly thought you stopped uploading but you were just making a video for a really long time
We all love you guiny! 💯
Thank you!!
You're the best :D @@Guiny
For every like this comment gets I will do 2 pushups
Lmaoooo
No one cares
@@ThươngChu-p9p I do
Bro failed miserably 💀
American megatrends exit buttons: esc delete enter
I fogor to like my own comment :(
COOL 💯💯💯 🔥🔥🔥
9:47 literal heart attack lol
i love the why is guiny so good
I am wrong here: the RX 580 (the GPU he is using in this build) only requires an 8-pin power connector from the PSU, so the Xeon was the only problem for the system before, my bad on that one!
So please laugh at my stupidity down here :D
|
|
V
4:40 It also helps to have both your GPU connectors plugged in!
That's why the display didn't output anything, also the Xeon would be incompatible as you later discover and thus the PC would display a message similar to "Incompatible CPU/Processor"
Your search history 0:43
Why guiny is soo cool
7:29 this doesn't work if the monitor can't support 8k; this usually happens when u try to play an 8k vid
his videos are amazing
i love your video!
The Xeon wasn't "disabled", it was simply running in a 9-year-old socket; you are lucky that nothing got damaged because that could have very well caused a short
(59 seconds in) "quad core, that's actually good." 🤣
Guiny never fails to fill us up with his juicy content
This comment is questionable
lol
5:54
Bro be dancing real good
5:25 Just as I thought, you were gonna buy another CPU that's below 8 dollars. This is indeed not clickbait.
Edit: Fixed the timestamp.
Edit 2: Yes, you finally tried Bedrock Edition; buggy, but better performance.
Underrated youtuber. Keep up the good work!
THANK U!
6:30 You didn't plug in the second GPU power cable.... and the xeon e3-1220v5 has 4 cores not 2
4:48... noticed that the graphics card does not have both PCIe power cables plugged in, this could also be causing issues.
I really like seeing you make Pc related videos.
his pc didn't work, then the apple gods intervened so he would buy a $2000 MacBook lol
10:02 bro not just anything, even the computer that you made in minecraft with redstone can run bedrock pretty well.
Xeons are workstation/server-related CPUs, for example
Xeon e3-1230v2 is i7-3770
xeon e5640 is i7-7600 and so on
but as LEO said they can be used for a decent gaming pc (My main pc runs on e3)
"oh wait i'm meant to unplug it" famous last words
Guiny held up the monitor I use and said "It's pretty cheap"
it was $120 3 years ago lol
@@Guiny I use this monitor also, I can confirm it is the goat of all monitors
Just by saying this processor is a no brainer summons the entirety of the pc building community
Have a good day guiny and everyone
iirc on 100- and 200-series motherboards u can disable smth in the bios to be able to support xeons, you won't have to do that with LGA 2011-v3 and older sockets as they have consumer xeons which can just be slapped in and easy game
great video idea!!!!
i will give guiny an award for making gaming setups out of scrap & ebay
7:30 when i was watching youtube my screen went green like that too, and my dad's pc also has an intel pentium cpu and i get 4 fps in minecraft
"I was very disappointed"
No shit, it's 9 bucks.
Kinda fun to mention; currently running Minecraft with a $15 CPU. Versions prior to 1.13 run like a charm so blame Mojang. I even run near 200 mods in 1.12.2 with no problem.
7:59 Pong: Am i a joke to you?
“My laptop battery is gonna end”
“The pc in his background “🗿
7:43 what do you mean you can't tell the difference, the cpu is like a fat man climbing stairs 🤣
knowing that my laptop runs fine with an intel celeron 900, your performance makes sense
Nice edit😊
why bro got so happy when the computer started with a pentium 💀
now get a $8 Graphics Card
Guiny try and make your NEW Macbook run Minecraft at 1000 frames at least! gr8 vid! :D
Please make more videos like this
hey guiny! great vid as always. i have a quick question! So i'm going to make a video on the cheapest phone, and see if it can run minecraft. do you have any tips on how to edit it, or thumbnails??
Are you the twitter guy? I sent you a message answering this question lol haha
@@Guiny YES HAHA
That was very nice
first time watching you and 30 seconds in I'm already subscribed LMAO
Thank you!
I'll never touch a Skylake or Kaby Lake CPU even if they were free, they had such a long list of errata that Microsoft had to release patches just to accommodate those specific processors.
i don't know how many mind-numbing agents i'm meant to take before handling this editing style
but my guess was much lower than what it really is
Guiny, these videos are enjoyable since i love videos about CPUs
3:04 Bib Bidid XDDDDDD
50 Likes ???? 😢😢😢😅😅😅😮😮😮😂😂😂🎉🎉🎉❤❤❤😊😊😊
4:31 Riiiiiiiiiiii ????!?!?!?!??! 😅😅😅😅😅😮😮😮😮😂😂😂😂🎉🎉🎉🎉❤❤❤❤
6:35 oh oh oh oh s- cuts out LMAO
i swore lol
4:30 bro when did u last clean your keyboard
Next video Idea: seeing how much more performance I get when I plug in my gpu correctly
Intel: It will never be spoken up like this
bruh exploded my eardrums when he screamed
I got an Intel Core 2 Quad for 70 cents once. Minecraft worked fairly well, and was fine for basic web browsing. Worth the purchase!
Man, you are my idol, you do such crazy things and I like it 😊
You can run the first processor you bought by modding the motherboard's BIOS, and the first processor you bought is much more powerful than the second one.
bro will spend 100 dollars total on just cheap CPUs and laptops at this point breh
4:49 the issue wasn’t the cpu, it was that u didn’t plug in all the gpu power cables… IM 12 YRS OLD AND I KNOW MORE THAN U DO, HOW???
I was afraid your PC would explode from CPU Bottleneck lmao
that's crazy how you found it lol
first time watching you and its a genuinely enjoyable video
Thank U!!!
you are a genius! thanks for the content!!
Thanks man
bro can u please play with me
i can give u my discord id
plzzzzzzzzz
@@Guiny bro can u please play with me
Guiny the goat
Tbh the intel sockets are almost impossible to break, you can basically throw the cpu in and there won't be an issue
Nearly every CPU might be super laggy if you don’t use the discrete GPU and turn off integrated graphics
hehe he said ebay but ebay was already there xd