This CPU Cost LESS than $26 New...
- Published 6 Sep 2022
- I bought this CPU New for less than $26 and obviously I had to try gaming on it.
Get some AWESOME Dawid merch here: dawiddoesmerch.com
Join this channel and help us buy weird computers and tech to make videos on:
/ @dawiddoestechstuff
Or you can support the channel on Patreon: / dawiddoestechstuff
Follow me on whichever Social media you don't hate
Discord: / discord
Twitch: / dawiddoestechstuff
Twitch bits on YouTube: / @dawiddoestwitchstuff1128 - Science & Technology
this man gave us smth to watch when the apple event started getting boring
bro actually same
so the whole event?
I had no idea there even was an event...and I'm chronically online ...
Accurate bro
@@streetle nah only waiting for apple watch to finish . want to see the 14's
This would be perfect to diagnose malfunctioning 10th gen / 11th gen intel systems
And that's the only thing it's good for.
@@thecomicguy2785 or watching a dawid video
Honestly, as an HTPC or file server this little thing is just perfect with its low power draw and thus thermals. You can run it without a fan, with just a heatsink, and it's fine.
@@MarioPL989 yeah right, I considered it for my home NAS but got an i3 for dirt cheap instead. 2 strong cores are better than 4 fairly old cores, especially where efficiency matters.
You can actually get a 10th gen i3 for $65 that's actually usable, it's an F SKU so no internal graphics.
This motherboard costs like 8-10x what the CPU costs!!! That's perfect system budgeting right there.
What motherboard is it? Looks really good for an all white build
@@ianodundo8322 its an nzxt board, aka SHIT, cuz nzxt makes bad things
@@lnfinyx Actually made by ASrock with NZXT branding, so yeah, rubbish.
@@lnfinyx I don’t agree, NZXT make awesome cases. Also their AIOs are pretty good. Although yeah, this Asrock board isn’t the greatest.
ASRock boards may not be the best performer but based on my experience they're quite reliable.
Not gonna lie, one of the first thoughts I had when I found out it was a Celeron was “I hope he uses that 3080 Ti.” Didn’t disappoint!
That poor 3080ti is a legend on this channel for getting used in every stupid system ever. The terrible bastardized MSI SFF system was one of its best moments.
It could be interesting to see what GPU wouldn't be CPU bottlenecked by the little Celeron.
@@cheebsgod At this point that 3080 Ti is going to turn into a masochist.
Clearly the 3080Ti was too weak for the celeron...
Imo he should've gone with a 3090Ti 🤦🏽♂️
Haha!! Yeah, this 3080ti has lived one hell of a life.
As we all know, Intel reserves its Celeron Branding to the most top of the line performance CPUs
I3, the one everyone buys, arbitrarily not the flagship anymore, iPeen, celeron
They work hard to make sure that Celerons are garbage because they don't want a repeat of the Celeron / Pentium III debacle. The Celeron could be overclocked faster than the Pentium III (at the time Pentium was the premium product).
If you got a good one you could hit 1 GHz. Which at the time was crazy to think about.
Rofl
I agree, that stupid 3080ti was clearly the bottleneck, he should've used a 3090ti
@@megan_alnico I had this Celeron 300A ! With an Abit BH6 mobo ! Damn those were good times to be alive ! 🥰😂😂
In 9 minutes, you've just summed up all of my frustration with customers determining a computer's speed by the number of 'gigas' it has.
Haha!! Yeah “gigas” can be real misleading.
time to buy that 3.6 GHz Pentium 4 with "2 gb Nvidia graphics" (hint: it's a 8400 GS)
@@HappyBeezerStudios and then it runs slow as hell, and they are like wtf? this thing is as screaming fast as an i7🤣🤣🤣
Dawid put single channel ram into a pc... he has become what he swore to destroy
In the name of Science some sacrifices must be committed.
Dawid is merely following the advice of the elders , " If you cant beat em, join em "
When looking to torture a celeron, anything goes. 😅
2:45 holy shit I expected this thing to run way worse than that. When you think about it it's super impressive to see GTA V running on just 4GBs of single channel RAM, Intel iGPU and with the cheapest new CPU on the market
I never thought I'd see the beautiful image of an Intel Loser Stock Cooler on a high-end NZXT motherboard.
It's art!
PS3 levels of performance with just the igpu. Impressive! People were paying good money to buy gta v to play at this kind of quality back in the day!
Haha! Yeah. 😁
ehhh nice joke, but seriously the PS3 could run the game at at least 30fps.
It may have helped to cap the FPS when you put in the video card. The CPU was at 100% most of the time with the 3080Ti installed. I had the same problem when I replaced my GTX1060 with an RTX3060Ti. Outer Worlds, particularly, immediately started stuttering. Wondering what was up, I ran stats and all 4 cores of my 7600K were pegged at 100%. The problem abated when I limited the FPS. Don't know if that will help, but that's what happened to me.
I was going to suggest the exact same thing
Bruh. What would help is not using a Celeron to game.
These videos are for memes.
@@rustler08 they sure are, but the guy gave great advice! Someone with a similar problem might stumble upon this comment and find a solution. Can't see a point in being mean to OP
@@rustler08 yeah, but sometimes you have to spitshine a turd, you know?
That's the real meme.
Other than the friends we made along de wey.
@@rustler08 Incorrect. They are more of a what-if scenario. The only meme-like part of it is going massively overboard with the gpu upgrade. Imagine someone already has this cpu and they want to game. The first upgrade is a gpu. A cpu upgrade comes later. The 3080 Ti is just thrown in there to maximize the cpu, but we could easily get the same result with a lower power gpu. Slapping the 3080 Ti in there is just much more convenient because we're guaranteed that hard cpu bottleneck.
I love it when you test out the cheap stuff. I've always been curious!
In that case you might want to check out RandomGaminginHD as well.
@@KimBoKastekniv47 I'm mostly entertained by Dawid's humor and analogies
I also really enjoy the cheap stuff. 😁
@@KimBoKastekniv47 For sure. RGinHD is awesome! 😄
I mean, we have a billion channels that only deal in higher end parts. Cheap stuff barely gets any love.
it's hilarious to see that even the Pentium, which was a high-end CPU in the 90s, is now just a cheap low-end CPU.
I know.... silly way to rebrand: take your top of the line product name and reuse it on nearly the lowest end processor you manufacture.
@@volvo09 actually fucking brilliant ngl 25-50 dollar CPUs that can be shoved into prebuilts or bought in bulk for a company or whatever with the pentium and Celeron name easy fucking money
@@LlamaCraft yeah, regular people do grab them based on name alone... It is an iconic name.
Same thing happened to the Athlon. To be fair, the modern low-end chips do outclass the 20-odd year old originals by quite a margin :) Just not relative to their peers
Celerons are by definition bottom-bin CPUs, but ok.
Whenever I see the word Celeron, I immediately think "office computer"
There are millions of them out there.
@@glenndavis341 Unfortunately.
I think this CPU deserves another video. Put it in a really small form factor build, with some tweaked ram and maybe a Linux install. Try indies and much older stuff.
G5900 also doesn't have AVX/AVX2, the LGA 1700 celeron has AVX-512 support
deserves to be thrown in the dumpster of the year 2000 and never opened again.
indie games are an unoptimized mess, 2d games that chug down 16 gigs of ram
@@mentalasylumescapee6389 the space is flooded with videos of mid to high-end hardware and what it can do. It's pretty interesting to see the opposite end. Not in a "how much does it struggle" way, but a "what classics can you play" one.
It’ll run fine on older games. Would’ve liked to see it paired with a 1650 or rx570.
Just starting a game is already a good processor ")
Haha!! True.
Another interesting video that we didn't know that we needed! Great job Dawid.
At the beginning of the year I wanted to build a cheap secondary system to replace my teenage Xeon S775 system. I have a friend who had a spare H510 and was willing to give that to me, so I was searching for a Celeron/Pentium or even a used 10th gen i3, but all of them cost more than 50€ plus the RAM cost. Thank god I found a used X58 combo with a DFI motherboard, an i7, a cooler, and 12Gb of RAM for a total of 40€, and it's perfectly fine for browsing, movies, and some light gaming. I believe, especially when it comes to gaming, it is much better than 2 physical cores and 2 or 4 threads.
the severity of the condition of our economic circumstances is beyond many peoples comprehension and many continue to deny its existence.
Damn, I thought by adding dual-channel memory with a total of 64gb would’ve unlocked the Celeron’s….. “11900k god mode” cpu functionality 😂😅🤭🤣😋
nope, you need at least 128 gigs for that option to be unlocked, 64 just won't cut it🤣🤣🤣
This setup is sick ...and totally twisted like a real OEM machine.
You never cease to amaze me with your content
Would have been interesting to see an optimal pairing for this cpu. Ultra low budget build getting the absolute max out of every penny would be cool.
Or what kind of games could actually be run on it. Like some popular indie games etc. Then we would know how much the cheapest Indiegame PC would cost.
it would also be interesting to see how it compares to a 4th gen i5 since they're 20-30$ these days.
Probably be cheaper to just take an old phone and emulate stuff like that ha....
@@eddybooboo9916dunno, how good is the windows XP emulation on iOS these days?
Honestly, it is already optimally matched by its iGPU. If you want, you could pair it with a GT 1030, RX 550, perhaps even a 1050 Ti or RX 570, but in this case the CPU would be the bottleneck. Without hyperthreading this is simply a budget office CPU. Get at least a Pentium if you want to play something semi-modern.
Dawid, I have no idea how you continually think up and implement some of the most interesting and unique ideas and experiments to play around with and test, but bravo friend. You seem to do it again and again. Also, as usual, top notch editing! Thank you, sir!
it crashed because it couldn't handle the software; the poor little Celeron was in over its head
I think you found the perfect CPU match for a 1630!
1010 bro
@IRON V8 nah it’s too old lol
I loved seeing the comparisons with single and dual channel ram. makes me wish this old laptop i have here that i brought back to life had a second ram slot in it. while it's actually quite usable now, it's missing that graphics push.
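The single vs dual channel difference these comments describe is mostly a bandwidth story. Here is a quick sketch of the theoretical peak math, assuming DDR4-2666 and a 64-bit bus per channel (illustrative numbers, not values measured in the video):

```python
def ddr4_bandwidth_gbs(mt_per_s, channels, bus_bits=64):
    """Theoretical peak bandwidth in GB/s:
    transfers/s x bytes per transfer x channel count."""
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

print(round(ddr4_bandwidth_gbs(2666, 1), 1))  # 21.3 GB/s, single channel
print(round(ddr4_bandwidth_gbs(2666, 2), 1))  # 42.7 GB/s, dual channel
```

Real-world gains are smaller than the 2x peak, but an iGPU sharing that bus with the CPU feels the difference immediately.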
I love when you experiment with older equipment. Gives some of us hope that we won't have to send our cpu to the old folks home so soon.
I love how he knows it's going to be entertaining even if it runs like crap🤣🤣🤣
I'm curious how it would have performed with a more dated GPU...like a GTX 970 or 1070.
even a 3060 would be a better match, but the results would be pretty much the same. That Celeron is so weak, even those older GPUs would get bored.
@@HappyBeezerStudios Facts. He's slapping a Lamborghini engine in a Ford Pinto.
This thing is significantly faster than my dual-core i5 5257U. Intel truly did make terrible CPUs in the pre-Ryzen era.
Hell naw man, the i5 4gens are CRAZY good still
The lineup of their U-series CPUs is crazy in retrospect. Even the i7s were dual-core until 8th gen (well after Ryzen launched)
only the U series was horrible; Intel just didn't go over 4 cores until 8th gen because pre-Ryzen AMD CPUs were so much worse despite having more cores that Intel didn't need more than 4C/8T for consumer i7s
The first and only prebuilt I've ever purchased was an HP featuring a 433MHz Celeron. Once I learned how to eliminate the bloatware it was amazing how much gaming fun I actually had on the little north bridge chipset graphics. A couple of years later I added RAM and bought a dedicated gpu that consisted of a very early EVGA pci card (NOT pcie!) and got a couple more years out of her.
Looking forward to seeing how this one runs games. Great video as always
A frame cap and older games would love those two cores. A good pairing would have been a GT 1030 or 730
The company I work for runs these Celeron cpus. They came in HP prebuilds with 4gb single channel ram and HDDs. They could barely run Chrome after a while. I convinced one of my managers to buy a 16gb kit for her PC to fix the issues she was having, then I took her stock 4gb of ram and put it in the PC I use the most here so it could have a not unreasonably small amount of ram. They both work relatively well now doing office-y things. It's been more of a chore to convince them to switch to ssds though.
Yeah, I have the same problem, even worse I might add... the CPU is not that bad, it's an i3 6098, but coupled with a single channel 4gb of some crap ram and even worse, a dying Toshiba HDD...
The fun part is that the IT director made a stupid remark when I asked him to slap in another 4gb: "no, everyone is asking for the ram, but it's the CPU". Well, I asked to swap the CPU: "we don't do that, we have to change the whole PC"!!
Do I have to mention that the GPU is a dedicated GT 210??
@@TheSilviu8x tfw you feel like you know more than your company's IT guy... And you don't even know anything that anyone shouldn't. I waited till my boss's went home for the day to do this btw because they never listen to my suggestions lol
For the SSDs, mention less energy use, longer lifespan, more reliability, and less likelihood of failure than spinning rust.
smart move getting the manager the upgrade first, that way they see the difference first hand and it would be easier to convince them to upgrade the other machines as well.
I see you really were sorry about keeping me waiting! Thanks for the fresh upload to get me through my day, love your videos and loser stock coolers.
Prediction, when inserting the celeron into the computer, the computer spontaneously caught on fire and gave your neighborhood aids lol
It’s technically 2 Comet Lake cores (same architecture as i7 10700 for example) and 3.4GHz with a 58W TDP… impressive for a Celeron!
Should make for a super fast experience for basic tasks and the Windows environment which for the price is already insane!
This CPU is for like a child's tablet device.
@@microtasker yet I'm sure my parents would find it a fast CPU for what they are doing on computers
I was hoping you'd try 1440p and 4K with the 3080 Ti Celeron combo; curious if it would run at the same bottlenecked CPU fps or have other anomalous side effects. Shame it didn't happen
As it is CPU limited it will run at the same FPS whether at 720p or 2160p. This is true for most games.
There are however some game engine effects that put a greater load on your CPU the higher the resolution the effect is rendered at. This happened on the recent Crysis remaster, and the devs did more work in patches to reduce this CPU bottleneck, which was tied to resolution.
This also is possible with Raytraced workloads as higher resolution can mean greater CPU dependency.
@@micb3rd it was still a wasted opportunity as any side effects / anomaly would have helped viewers to identify if their system was cpu bottlenecked at higher resolutions. I myself have a 10400F with a TUF 3080, I run at 1440p with DLDSR 2.25x to 4k resolution. So years down the road, I would love to find out what anomaly symptoms I may start seeing CPU bottlenecks at the higher res.
That’s true, I should have tried different resolutions. However, after that first GTA crash, pretty much everything just refused to launch in the first place.
@@XMG3 The symptom will be easy to spot, remove any frame rate cap and use MSI Afterburner overlay. If your GPU is not any longer being utilised at 95% and above then there is a bottleneck and it is most likely the CPU.
If you want to see this effect it will already happen to you on Cyberpunk 2077 on your set-up (4K DLDSR and DLSS performance) when using Ray Tracing for reflections.
In places where there is high crowd density like the City markets then GPU is not fully loaded and CPU is the limit.
Lets hope game developers keep a close eye on this CPU load as it is a shame to have a top end card and not be able to fully utilise it.
@@micb3rd Yes, i use MSI afterburner, it is for sure good indication when GPU is not utilized near max, but that's not the only determining factor, for example, saints row 2022 has very poor GPU utilization (35%-100% for me) due to poor optimization. There's always many variables involved, so more tests always help
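The rule of thumb in this thread — with the frame cap removed, sustained GPU utilization well below roughly 95% usually points at the CPU — can be sketched as a tiny check. The threshold and the sample numbers below are illustrative assumptions, not values from the video:

```python
def likely_cpu_bound(gpu_util_samples, threshold=95.0):
    """True if average GPU utilization sits below the threshold,
    i.e. the GPU is waiting on something else (usually the CPU)."""
    return sum(gpu_util_samples) / len(gpu_util_samples) < threshold

# A 3080 Ti stuck behind a weak CPU idles well below full load
print(likely_cpu_bound([62.0, 55.0, 70.0, 58.0]))  # True
print(likely_cpu_bound([98.0, 99.0, 97.0]))        # False
```

As the reply above notes, poor engine optimization can also hold GPU utilization down, so treat this as a hint rather than proof.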
Got an i5 3570 new on AliExpress a year ago for my test bench! Still rocking!
Always fun to see what you are going to do next Dawid .
Problem was you should've used a 3090ti, the 3080 clearly wasn't fast enough to keep up.
That power draw combined with those videos about low power consumption GPUs makes me wonder: What would the most power efficient PC (i.e. entire build) look like?
Interestingly the best power efficiency is with higher end parts with their power settings turned down. That's basically what the commercial offerings like Quadro cards are. However I think that A2000 that he already did a video on would be a bit more in the spirit of what you're talking about than putting in a 3090ti just to tune it down.
The power supply would also be really expensive for an 80 plus titanium unit, but it would also have to be a low wattage one to keep it in the wattage range where it is actually most efficient which is something like half the total rating. But that's not all, power supplies run most efficiently on 240 volt power, which is perfectly doable in North America but he's no electrician.
@@ffwast Absolutely! I'm running an undervolted Ryzen 7 5800X and 6900 XT. My buddy has a 3080 TI and was livid the day I showed him my entire system uses just a bit more power than just his GPU, and pulls competitive framerates. It's something that's actually been on my mind for years. Just, "How silly can you actually get with power efficiency?"
Power efficiency is VERY relevant in Europe/UK for the foreseeable future. Thanks to Putin's Ukraine war and oil/nat gas sanctions, electric bills are 4x-5x higher.
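Frames per watt is the usual way to compare a tuned build like the undervolted 5800X/6900 XT system above against a stock one. A minimal sketch, with entirely made-up numbers (none of these are measured figures):

```python
def frames_per_watt(avg_fps, system_watts):
    """Efficiency metric: delivered FPS per watt of wall power."""
    return avg_fps / system_watts

tuned = frames_per_watt(120, 300)  # undervolted high-end build (hypothetical)
stock = frames_per_watt(130, 550)  # same tier at stock settings (hypothetical)
print(tuned > stock)  # True: fewer FPS, but far more work per watt
```

With electricity at the prices mentioned above, a metric like this is what actually decides which build is cheaper to run.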
had a Pentium G4560, which is essentially the same thing but with 4 threads instead of 2. I loved that thing
7:30 I died at this part, the confidence, the music and then the results lmao
Releasing a modern dual core with no hyperthreading should be considered a war crime
It's also locked.... Like wtf, why is that still a thing in 2022..
I have a 12 year old chip thats still getting 144fps+ so long as the graphics card is capable.
But then again, socket 1366 was peak Intel (I think most of these still have the OC record, last I checked, too).
If you don't know -- check out what the hexacore server chips can do when OC'd. Entire channels dedicated to the niche, and they also float around 25 dollars, too.
Also has triple channel memory instead of dual, which makes it still work even today with slower memory as it's being accessed more often.
Only thing that's going to force me to stop using it is when a PCIE 2.0 x 16 is finally fully saturated during gaming.
Only issue is finding good boards, but that happens on every platform. I'm running a Xeon X5460 at 4.1 GHz here and even that thing is a beast, especially considering its age.
Sure, might not be a hexacore with HT, but still impressive.
@@HappyBeezerStudios yup, my 5650 has been at 4.6ghz and 75 degrees the whole time. Chugging like a champ.
Thats on a single slot aio rad, too.
Lots of old mom and pop electronics resale stores will be a goldmine when it comes to the boards.
Still the odd Sabertooth popping up every now and then on EBay, too.
Even at max use, the CPU never passed 45C. I imagine overclocking it would make a huge difference.
Love to see you do a low power gaming build now that all the electricity prices are going up
I'd be interested in seeing what GPU this little processor does like. Could this be a secret budget CPU for like a 1050 ti or something?
If you pair it with a 1050 Ti you would have to cap FPS in most games to avoid stuttering. The Pentium is a much better budget CPU because it has 2c/4t. Hyperthreading does help a lot in modern-ish games.
When you're adding more RAM, can't you up the VRAM allocation in the BIOS or in Windows? I feel as though that would make a significant difference to stutter, even if the fps is still the same
The vram allocation was 32gb. Not sure how much more that igpu needs. 😂
@@DawidDoesTechStuff Lol! Yeah fair enough
$26 CPU
$426 Motherboard
$426 RAM kit
$852 Video card
Must be a Dawid video...
I would've done one more test:
Using a 360mm AIO cooler, combined with the last setup that's been used in this video.
Also, I would've locked the FPS to 60, or 30.
if it was unlocked and we could magically throw 4 of them into a single huge motherboard and somehow even got quad channel memory, then those CPUs would be a damn beast at that price. I can already imagine something like making a 5+GHz 8 core out of all of them for cheap 😁
The only reason they are cheap is because they're locked and you can't use a few of them together
No it won't. People have tried to use dual socket Xeons for gaming. They suck real bad because the cores are not on the same silicon, but on a different CPU. You get a huge latency penalty for it, which matters most for games.
@@fleurdewin7958 I was talking if it magically worked kinda as a joke 😁
Anyway, Intel Xeons mostly didn't get good gaming performance because they lacked high frequencies, so the majority had poor single thread performance compared to a normal i7...
I was thinking the whole time: "now overclock it", then I realized that this is not an AMD product and you can't overclock it because it's locked
der8auer did overclock a 12th gen Celeron, and with impressive results.
Would love to see a follow up video comparing single 4gb stick Windows Vs Linux, just for frame rate comparisons
7:59 That's how my old gaming PC ran GTA5 (when I 1st got GTA5). It was an AMD A8-5500 (quad-core @ 3.2Ghz).. paired with a GTX 950 SC+. After I upgraded to an Intel i5-4690k with the same GTX 950 it easily ran medium 1080p @ 60fps (had fps locked to 60 since my monitor only ran 60hz and fps never dipped below 60)
The GTA V benchmark is still better than the PS4 version of the game...
I don’t really understand the purpose of this price point…
The CPU is so cheap, whatever build you put it in it's going to be a minority of the price. By the time you're at a full build with a case, motherboard, RAM, PSU, SSD, you're looking at $300… and at that point you can spend another 50 and get a decent CPU?…
They are usually used for budget office machines. Cheap PC case with "500 W" PSU (actually more like 250) included, costing $25-30 . 2666Mhz no name RAM . Cheapest SSD or even HDD for boot drive. Keyboard, mouse, headphones with mic combo for $25-30. Most important - if you buy in bulk you get the discount, and on a scale $50 difference could grow to $50 000. Of course, poor office workers are not entirely happy, but overall this computer is just a tool .
Cheap cooling, PSUs, cases, etc; mean that you can go lower in price than you might think.
Most big OEMs would buy the chips in trays because they can use coolers that cost even less than the Intel stock one. Add a $35 case/PSU combo (I use a Rosewill one myself, its actually nice), $30 RAM, $50 mobo, $20 SSD and you have a cheapo ~$180 PC for an office slave to sit in front of all day
@@capsulate8642 PCPartPicker a system together… it doesn't make sense not to spec up to a higher processor.
Factor in “future proofing”… this thing makes no sense.
Low end CPUs will come with an included cooler.
@@aleksazunjic9672 that’s a false economy.
You’re paying the staff $10 an hour, 8 hours a day. $400 a week. If you spend an extra $50 for an i3, you’d be recouping your investment in just couple weeks of additional productivity.
@@robertpatton172 My point was that they don't come with coolers for OEM purchases (which is most of them).
Anyway, spending any amount of extra money for unneeded performance and heat and power draw is pointless. Gamer brand marketing might make you think otherwise, but it's the truth.
4:00 - I used to play GTA V at 800x600 resolution back when I was using a laptop without a dedicated GPU. It was almost impossible to hit further away targets because of the pixelation.
In my experience from playing KSP and cities skylines I can confirm that 20 fps indeed feels much better than 11.
Good thing I still have my old Core 2 Quad here. Which is old, but it shows that that tiny Celeron isn't meant for gaming when a 13 year old used chip outperforms it by a factor of 3
Could it run at 4k or 8k in order to reduce the cpu bottleneck on the system?
nope, vram limitations
i wish i had this cpu, today i use an i5 650 with 4gb ddr3 ram IN SINGLE CHANNEL
This is obviously Dawid's evil clone. Dawid would NEVER use one stick of RAM
Dawid, I spent 10 years of my life gaming on Intel chipset graphics, a dual core Pentium E2140, and just 1 gig of DDR2 RAM, and literally ran Windows 10 on it for 3 years. It felt like I was shot with adrenaline after I got my Ryzen 5 1500X and GTX 1050 Ti PC.
Dude, your old rig could literally have been nicknamed "Gaming for Masochists". A machine like that will run basic office tasks on Windows XP just fine (BTDT - E2180 on i915 with IIRC 2 GiB of RAM and 80 gig harddrives), but even its memory compression won't save Windows 10 with this little RAM. How long did it take to even boot, 10 minutes? And then the horrors of old Intel GMA graphics (you must have had something a little less dated than the GMA 900 at least, as that one doesn't even have Windows 10 drivers), not to mention pre-AHCI ICHs.
I am not one bit surprised that the difference was night and day, but a few select upgrades (notably anywhere from 2 to 4 GiB of RAM and possibly a faster harddrive or even an SSD) could have made the old rig a lot more bearable at least.
@@PileOfEmptyTapes well. It surprisingly booted under 5 mins. I had a 160 gig hdd that died to a virus. And then i ended up adding a 1tb wd blue hdd. It took literally 1 min for the start menu to open when clicked. It had a gigabyte 945gcmx board that could actually overclock the Pentium with varying frequencies every restart XD but that board died as well. It then ended up having a Chinese zebronics motherboard that also died as well. Windows 10 would run the Microsoft wddm drivers for the GMA graphics. Then the ram stick also died and it got upgraded to just 2 gigs and the difference was night and day. System speed improved by a big margin. But since the board died, i ended up buying a ryzen 5 pc and made a lovely keychain of that cpu.
GTA V has always been pretty RAM heavy, so I wasn't surprised by the framerate boosts from the memory increase at all, since I had to suffer the same issues but with 6GB of RAM... till I upgraded my RAM and realized how much it REALLY ate, since singleplayer can go as high as 8-12+GB while online is always above 10GB of RAM usage.
My goodness that CPU scores about the same as a Core2Quad QX9770 in passmark. In my head Core2Quad still resonates as high end when i hear it.
yeah, I'm getting like 3 times the performance on my old core 2 quad. if the power draw wasn't that high, I could totally see Penryn and Nehalem having a good time in budget htpcs and office builds even today.
I have the older version of this in the laptop that my French high school gave me, and in fact, it is really usable if you reset the PC and add some RAM, but if you leave the bloatware that is installed and taking 100%, the poor Celeron N4000 can't do anything...
N series is next level of terrible.
Setting an fps limit to 60 might've helped with the stuttering with the gpu. Also RAM SPEED would have been interesting to look at. If you wanted to push it further, BCLK overclocking for shits and giggles would be interesting
Actually having a 3.4GHz 2c/2t CPU back in 2007 was very good, as software was optimized for dual cores (with many software being optimized for single core use). Don't even think there was a stock 3.4GHz dual core back then, though many could reach 3.7GHz on light overclock.
Great video, good job!👍
Probably would have worked better sticking with an fps cap there too, like 35 - 65 fps, besides low settings. Also turning off Nvidia shader cache to reduce the sudden game stutters might help as well, unless you are planning to just leave the game running for like 6+ hours.
Wonder how well this would have done with a gtx 1050 ti with reasonable settings though for e-sports titles.
2:50 Oof, you almost killed the cat there, buddy 🫠
Amazing. I was very entertained :)
I had to get a Celeron G6900 for troubleshooting at one point. So I threw together a Celeron G6900 in a Maximus Formula, 32GB DDR5-6600, RTX 3090...
It was amazing.
LOL, I won a Z690 Classified and got a G6900 for $40 to throw in it for lolz. Sounds like you did pretty much the same as what I'll do eventually.
20 fps is amazing compared to 11 fps cause 2x fps
dang the gpu improved it so much
So true. Even though it's still trash, it's trash inside a golden bag. And that means something.
The G6900 and G7400 was on sale recently here and I bought the Alder Lake Pentium Gold for like 40 bucks. I’ll be plugging it into a CRT for some older games. The motherboard a MSI pro H610 even has VGA but I’ll be using a GTX 660 that even on the box shows how to connect it to a CRT. The newest generation with the ability to connect to a CRT is the GTX 900 series so at some point I’d like to get a 970 or something but the Pentium might bottleneck it a bit. Interestingly the Pentium is about as good as an older i5 like the 3570 on 4 threaded applications and much faster in single threaded.
Excellent video, I'm going to link my users this the next time they put in a ticket asking for a grip of RAM to be installed in their laptop as the silver bullet cure-all for their performance problems.
A good idea for a next video would be to build a gaming pc with old parts and compare it to gaming on a top of the line cpu with its integrated graphics (no gpu)
you can get the devil's canyon i5 4690k for $25 and easily overclock it to 4.5ghz all core with a cheap after market cooler.
Some people have even managed to push this to 4.8ghz and 5ghz on 2 cores on air coolers.
$25 for a 4690k is insanely cheap considering the hyper-threaded version of this cpu, the 4790k, goes for $80.
in all seriousness this proves that a CPU bottleneck is the worst type of bottleneck; I had a smoother GTA V experience using an i5 4460 and a 1050ti back in the day than the one shown in this video
When I started this video, the introduction was all in slow motion. I thought it was funny as it was done on purpose to reflect the slowness of this CPU. But then I realised that is not the case when you started to talk... Then I recall I changed the setting to 0.75 speed for my kid yesterday... hahaha... cracked me up real good
I have its sibling (G5925) in my local Plex Server. Low power usage but well supported hardware transcoding.
If you press the 4 key and the 5 key on your keyboard, YouTube will cut to different segments of the video, both of which Dawid says "although" in
I'm glad you bought an expensive socket cover.
Performance wise, this CPU is on par with a Core i7 920 - basically flagship... back in 2008. With a TDP that's around 10% - impressive efficiency!
The iGPU and faster memory do give it a nice uplift however. For a basic Internet browsing PC it's honestly not that bad - it is kinda amazing that they sell something like this though, although I suppose that's the joy of product binning. It's hilarious that the UHD610 graphics uses a good deal more of the TDP than the CPU does.
I have the step up to that one, the Pentium Gold G6400T. 2 cores/4 threads, 3.4 GHz, that I picked up used for $35. In Geekbench 5 it scores just above a i3-6100 2 core/4 thread at 3.7GHz. In Sysbench the i3-6100 beats the G6400T by just a little, so that is the CPU I compare it to.
Should have tried an older, lower end GPU, maybe even the same specs or slightly better than the UHD 610 iGPU. The Celeron could more easily keep up and not crash, and its duties would have been lessened.
A good idea would be comparing a regular Intel processor to their lower power t- series variants
I'd be interested to see the little cpu paired with like a 1050ti or 1060 or something, that might be interesting.
would have liked to have seen an OC attempt on the cpu
Great video, I laughed way more than I thought I would! I had a theory… would slower RAM give you a better experience with the 3080?
An old Haswell, Broadwell or Skylake CPU may be a better alternative when it comes to the performance per dollar ratio, esp. if you can snatch a bundle consisting of CPU + mainboard + RAM. Memory OC brings very nice gains, esp. on Haswell, while Broadwell got that sweet L4 cache that bumps its gaming IPC.
I bought an old Dell Inspiron 660s with an Intel Core i3-2100. I swapped it with an Intel Core i7-3770s after a bit because I found one for $15. It's remarkable just how usable it still is.
A used I7-2600 costs about the same and will perform better in all but very old single-threaded games. But for a 2-threader it's adequate for everyday usage.
One of my friends uses a Celeron G5925 (the highest clocked celeron ever) for his main gaming computer…paired with a GTX 1660 Super
Dawid starting with a single stick of RAM?
"You were the Chosen One! You were supposed to destroy the Sith, not join them!"
😂
What software is it that you use for monitoring frames and temps? Also I love your vids, keep up the great work my guy👍🏻
He uses msi afterburner
Your videos are underrated, you should get a lot more views
was gonna get an rtx 4080 for my celeron but deciding against it now, thanks Dawid!
You should try to get the ASRock Riptide B660 with a 12th gen Celeron and BCLK OC. That would be a perfect match for the 3080ti
that base frequency is my turbo, and the only advantage I have is that my i7-6600U can do multithreading; after that it is all downsides. PLUS the said not very fast iGPU is a "not very fast" one, and mine is older than that one, so damn, that Celeron be looking like a good buy
thats actually pretty good. Well, good compared to what I play on now; that's probably at least 3 times better than my current “gaming” setup
I mean for retro gaming, I bet this thing is decent. Problem is, the motherboard and hardware around it will cost too much.
Hi Dawid, are you able to disable the internal graphics?
Your videos are so moreish, even when I almost perfectly predict what will happen I can't help but watch 😂❤