I regularly have all four cores and eight threads of my R3 3100 pegged. I agree with the point about occasional FPS drops in games, but the difference in overall app responsiveness is hardly noticeable.
This is one reason I went for a 3700X. More resources to cope with older installs of W10 and background tasks. Some tests on an old install would be interesting.
I'm poor. Whenever I purchase anything, I ALWAYS want to know the precise area of the market where the returns are the best... right at the top of the value-for-your-$ curve... right at the point where you start paying a lot for smaller improvements, that's where I want to stop moving up. We're talking the point where, if you took some option off the table, it would be a killer blow in performance or happiness. I would like an entire channel of this stuff for all areas of life... the cheapest car, PC, smartphone, etc. you can get that is still solid. I have kicked myself SO often for going with the cheapest decent-seeming option because it ends up being TOO crap... especially when usually just a few bucks more would get you a mountain of difference.
Haha I love the car analogy, however, your average person races in straight lines. Mainly lights to lights or on the freeway. And wheelies lol people love wheelies 🤣🤣🤣
It’s why I gave up on “gaming Jesus”. The tests they do are always clean installs with one thing running, and they say nothing about what people actually use computers for. People are in for a huge reality check when their build doesn’t perform like Nexus benchmarks. The reason is that Nexus is, in my experience, not good enough at judging a component as part of a system ecosystem. Like, nobody cares that a 5900X drags an i9 over the coals in 7zip. I can’t name one damn person on Earth who spends all day zipping and unzipping archives. Once you get into those workstation loads, you’re looking at business-grade hardware. I would also hazard a guess that they naively believe “gigabit” Ethernet ports are capable of fully saturating the link bandwidth, when it completely depends on the integration of the hardware. I’ve seen this before on Cisco routers and switches, where the fabric and switching capacity is real.
Hello :) I have a Ryzen 3700x and a 5600x, and my motherboard is an Asus X570-P. Which CPU would you recommend I keep for gaming and everyday office use? Thanks for the reply :)
It is about the budget and the amount of money one is willing to spend on a PC, and in my view the majority of people are still at the mid-range and lower end of the spectrum. High-end systems get a lot of focus while the majority are ignored. It's not that people don't know that 8- and 12-core CPUs are better than 6- or 4-core CPUs, but most have a limited budget which they may stretch by an extra 10% or so; they won't stretch it by 40-50%.

I take your point that you see 8-core, 16-thread CPUs as the best value in terms of future-proofing, and I agree with that assessment, but people are already stretching their budget to get a 6-core CPU and a mid-range GPU. If the higher-end 8/10-core CPUs were just 10-15% more expensive, most would definitely go for them, but they are not; they are mostly well above people's limits.

The point is that for gaming one spends more on the GPU than on the CPU, and both prices have to fit within the same budget. The prices of mid-range, value-for-money GPUs have already gone up. I have a GTX 1060 and am thinking of buying an RTX 3060 Ti, which I think is the value card for 1080p gaming. That in itself is a big jump in mid-range GPU pricing, and then also asking for an 8/10-core CPU would be a bit too much. I believe 6-core, 12-thread CPUs like the Ryzen 5600 with its IPC improvements, or its Intel equivalent, will be the value-for-money buys that last a long time.
True benchmarks include multiple monitors running while Chrome windows are up, plus Twitch, MSI Afterburner, OBS screen recording and broadcasting to Twitch or YouTube, and rendering some kind of file, all while running a benchmark for a game. That is where the real bench numbers come from.
I disagree. Then tell me one single case that is applicable to all of the users on the planet. When we benchmark a game, the only thing using most of the resources is the game itself. Yes, there will be some resources used by background Windows telemetry etc., but it's a game benchmark, not a multitasking-while-playing benchmark where you have 10 background apps and 30 Chrome tabs open and suddenly everyone needs 32 or 64 gigs of RAM.
I regularly game with 40+ Chrome tabs, monitoring software, and a music player open in the background, sometimes having videos playing in the background, with small variation in the smoothness of the game. (PC has a Ryzen 5 3600X and 16 GBs of 3600 MHz, CL16 RAM.)
And this is why my $199 Threadripper 1920x and my $850 Threadripper 2990wx are still better choices than many other less expensive CPUs. Depending on the use case, of course. The whole story depends on how it's configured and what it's used for.
The drag race analogy is wrong. They would be a Dhrystone-style synthetic benchmark. Benchmarks in games are like lap times on a track in sunny conditions: very close to real-world racing, but possibly a little different in the wet, etc.
Facts. When I try to play CSGO on my 3900x with 32 GB of RAM and a 1080, if I have a ton of Firefox tabs open (which I usually do), my game will get high FPS but it will feel sluggish. Sometimes it runs buttery smooth, and other times I have to close certain tabs. Twitch streams especially make my game feel like trash. Looking at charts wouldn't show that; it would just show that I'm still getting 300+ FPS. But as an actual user I can tell you it sucks and affects my experience. I'm really hoping my 5950x will fix this issue.
No... You don't need more cores to fix the drop in performance... CSGO uses 4 cores at most, and it also gives you the most performance with a 4-core affinity assigned in Task Manager. Instead of blaming it on the CPU, learn how to manage your processes' affinities manually and take advantage of both CCDs your 3900X has, damn. You literally have a 3600X + 3600 in one chip connected via Infinity Fabric. There is so much potential and planning you can use to easily increase performance.
@@TriWaZe Although the architecture is improved, this issue will still happen on the 5900X and 5950X. Even with the unified memory, you deal with latency from shared CCDs. What I'm talking about is assigning every background process to the weaker CCD and using the better CCD (which will be basically 98% free) to game on, without feeling as much of an FPS penalty as before, and you can even overclock it. For example, since you use tons of Firefox tabs, you would want to assign Firefox's core affinity to cores 12 to 23 and leave the game on cores 0 to 11. For CS:GO, it is best to assign 4 physical cores, or 2 cores + 2 SMT cores, as long as you run 4 logical cores in Windows. I should also add that I achieved a single-core score of 521 in Cinebench on the gaming CCD, which is about the same as a 10700K's. It will not perform exactly like one, but the optimizations pay off really well.
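As a rough illustration of the affinity idea above: a minimal Python sketch, assuming the psutil package and a 12C/24T chip like the 3900X; the process names are hypothetical and Windows-flavored.

```python
# Minimal sketch: pin a game to the first CCD and background apps to the second
# on a 12C/24T Ryzen 9 3900X. Requires the psutil package; process names are
# hypothetical.
import psutil

GAME_CORES = list(range(0, 12))         # logical CPUs 0-11: first CCD, for the game
BACKGROUND_CORES = list(range(12, 24))  # logical CPUs 12-23: second CCD, for everything else

GAME_EXE = "csgo.exe"
BACKGROUND_APPS = {"firefox.exe", "discord.exe", "spotify.exe"}

for proc in psutil.process_iter(["name"]):
    try:
        name = (proc.info["name"] or "").lower()
        if name == GAME_EXE:
            proc.cpu_affinity(GAME_CORES)        # game keeps the whole first CCD
        elif name in BACKGROUND_APPS:
            proc.cpu_affinity(BACKGROUND_CORES)  # background load moves to the second CCD
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass  # processes can vanish or be protected; skip them
```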
Great video. We have several computers in our household and we are always upgrading and swapping around hardware. My son sold his Skylake i3 system, and we swapped the 1050 Ti with the RX 560 in my other son's computer before he sold it, because the 1050 Ti was supposed to be better. My son was running an R3 1200, and believe it or not, the computer RAN WORSE. I exchanged my 1070 for it in my i5-6600K system (which I'm going to sell soon), and it MYSTICALLY RAN BETTER in that system (obviously not as well as the 1070, but it's going to auction anyway). Trust me when I say we did ALL THE DRIVER THINGS!!
Fully agree... thanks... do you think the R9 3900x is still a worthwhile CPU today (end of 2020 / early 2021)? Would you recommend buying it instead of spending a bunch of money on an R9 5900x? I play at 4K and do some video editing sometimes.
Yeah bro, it's one of the best CPUs. Just because it's not the latest doesn't mean it's trash; it's actually pretty good, plus the 5900x is pretty difficult to find.
This topic reminds me of how, in recent years, the same testers would benchmark gaming CPUs/GPUs in Ashes of the Singularity to prove that AMD is better and more future-proof than Intel/Nvidia. :)
I like benchmarks...because I can run them on a new build to see if I've set everything up correctly and the hardware is working as intended as compared to other similar builds. Also, it still works for bragging rights 🤣😂
Good honest information and very true, but the real line comes down to what each individual can afford. If you have money to burn, then by all means get the most cores and the most and fastest memory, because you can afford it. For those like me who can't afford all that stuff, I just get what I can afford, which generally is technology that's at least 3 years behind. My current computer was bought in 2008 for under $500 and is still running Windows 7. But I am going to get, if it's still in stock (at last check the store had 2), a prebuilt HP with a Ryzen 5 3500 and a GTX 1650 Super GPU for $599. It may not be the best, but compared to what I have now it is a huge leap, and that's all I need to be happy. It will be capable of doing everything I need it to do for a long time, especially if I can get my hands on an RTX 2060 or 2060 Super; my power supply will handle up to that. That should get me 1440p resolution for video quality, which should be good for a few years for me. After all, I am still going on Windows 7, DDR3 RAM, and an Athlon processor.
Every aspect of your PC is relative to its performance. This is the only video I've ever seen that explains PC behavior this way: a PC is only as fast as its slowest part.
Currently using a 5600x and I don't have any problem streaming on discord while I game in CPU bound scenarios. The only time my CPU hits 100% is in benchmarks. 6 cores is easily enough for my use. Two 3440x1440 monitors (one for games, one for discord/task manager/fan curves/browser). The sort of people that need that many threads and don't use their PC for work must be the same people that leave every app on their phone running and have 10+ browser tabs open constantly. I just can't see it.
(*Nasally voice*) But in single threaded applications....games don't utilize no more than x amount of cores...And I am here to defend my purchase!.. All the regurgitated fecal matter I heard in the forums.
Well, it's a lot to digest, so I'll just take away the summary that more cores means a better, more responsive computer where you can do many things at once, as I do. I stream on Discord and run my team's voice chat in Discord while playing Dota 2, and while finding a match I watch YouTube with Chrome running in the background. So it's going to be a better experience with more cores.
About to spend 1300 on a new build. My CPU is going to be an i9 10850k, with 32 GB of 3200 RAM and a Z490 MSI board with 2 M.2 PCIe Gen 3 slots. I want my 10 cores and 20 threads. Oh, and the graphics card is a 5500 XT for now, the only thing I can find, but I mainly play older games.
I have a 1660, an R5 3600 and 16 GB of RAM, and I'm looking to upgrade my GPU so I can run Cyberpunk 2077 on my 4K monitor (not necessarily with RTX and maxed out, I don't mind running on high), or at least run it smoothly on my 1080p monitor at max. I don't want to start buying the top-tier card every year, so my question is: should I get a second-hand 2080 Ti for around £500, or should I suck it up and get the 3080 when it's back in stock?
It's better to wait, as Nvidia may launch more Ti or Super versions with more VRAM any time next year. Even the regular 3080 has just 10 GB of VRAM, so that is not a great deal either.
Benchmarks are plenty relevant and anyone using a computer should know enough about how to keep background tasks and/or apps closed when they play games for max performance. Or if they don't know or want to know all that stuff then they should pay for more cores and more memory.
I found your channel after I ordered a 5600x... -_- for €260 on Black Friday. In the end I stuck with the 5600x, because I don't know if my Brocken 3 White Edition cooler can handle a 5800x. Still far better than the i5 4690K.
It's like how sports car performance is more than just their straight line speed.
Sent this before you made the same reference in the video lol
Duh he said that.
@@adamhymas4620 mans can't read
Even movies about illegal street racing demonstrated this. Someone has a super-high-performance car his dad owned and is afraid to drive it; its speed and quickness are too much to handle once the driver actually has to steer.
It's why the Mazda Miata is one of the greatest sports cars ever created.
Benchmarks are telling the truth, from a certain point of view. Like obi-wan.
exactly
Thank you.
I always see this way... "Facts are part of reality"
I know you guys don't like to put out "longer" Byte Size Tech vids but I think 22 minutes is perfectly acceptable! Especially when it's cut from a 4+ hour stream!
Yes, i totally agree
That's why they call them benchmarks. Nobody should expect them to apply precisely to their use case; they're a general rule of thumb.
The good thing about the Tech Deals channel is that they also share the user experience in addition to the benchmark results.
A good example is HDD vs. SSD. In theory, a PC with an HDD instead of an SSD only has lower disk benchmarks, assuming the CPU/RAM/GPU are the same. However, the user experience after upgrading to an SSD feels like a CPU that's twice as fast: everything runs faster except... the synthetic benchmarks.
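A rough way to see why: what you feel day to day is small random-read latency, which throughput-style disk benchmarks barely capture. A minimal Python sketch, assuming you can create a scratch file on the drive you want to test (the OS page cache will flatter the numbers, so treat it as indicative only):

```python
# Rough sketch: time many small random reads from a scratch file. HDDs pay a
# seek penalty on every one of these; SSDs do not, which is most of what
# day-to-day "snappiness" is made of. Path and sizes are illustrative.
import os, random, time

PATH = "scratch.bin"
FILE_SIZE = 256 * 1024 * 1024   # 256 MB test file
READ_SIZE = 4096                # 4 KB, typical small I/O
N_READS = 2000

if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(os.urandom(FILE_SIZE))

with open(PATH, "rb") as f:
    start = time.perf_counter()
    for _ in range(N_READS):
        f.seek(random.randrange(0, FILE_SIZE - READ_SIZE))
        f.read(READ_SIZE)
    elapsed = time.perf_counter() - start

print(f"{N_READS} random 4 KB reads: {elapsed * 1000 / N_READS:.3f} ms per read")
```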
Benchmarks never lie. They are used to measure something, i.e. CPU/GPU performance, in a controlled, repeatable environment. We use benchmarks to support our buying decisions. Benchmarks are never intended to replicate real-world usage. Anyone who expects their FPS to be the same as in the benchmark video does not understand how benchmarks work.
That's a lot of people.
Holy crap. As a n00b to PC building I needed to hear all this. Made me feel better about the parts I chose for my first build.
The way you have always talked about how a game feels when you are playing on one system vs another, and how stable a platform is, is the true golden nugget of your hardware reviews.
I know that all reviewers get accused of being biased but at some point you are entitled to actually have an opinion about the hardware you prefer using.
For me having a stable computer that lacks a “personality” is very high on the list when choosing the parts.
Exactly my sentiments, and I'd easily give 2 gold stars for the speech marks on personality!
Not only that, but no one is going to have the exact same system as their test bed, and even if they did the results will be different. It’s just to give you a “general” idea of how one component compares to another.
Benchmarking as a hobby forced me to dig deeper into what all the little things in the BIOS are all about, and how to control them. Benchmarking may not apply to real-world gaming or running apps, but I'm learning more about my system, and having fun doing it.
I feel bad for people queueing up near stores in rain, snow, thunder, lightning to buy Nvidia 3000 and AMD 6000 series cards while I sit at home drinking hot soup and happily gaming with my I5 8400+RX 580
More people need to understand this. I had a 2600, and when playing RuneScape (yes, I know... RuneScape) the FPS would be completely fine with nothing else open.
However, when I tried to watch 4K YouTube videos on my 2nd monitor at the same time, my FPS would take a dive. I would typically get 80-ish FPS, and a stuttery 25 with YouTube running.
My 3900x was a great upgrade for quality of life use
I have Adhd and rarely do I watch a video word for word.
Best information I’ve captured yet regarding this whole CPU war mess.
I've used a Zen quad and Zen 6 core and a Zen+ 8 core. I'm not sure about stutters in games, but the Windows and browsing experience is EXACTLY the same on all of those.
Well I would assume regular windows experience wouldn’t change much. Maybe if you’re running a high end game with other stuff running in the background & then trying windows browsing 🤔
If it was just browsing, it should be the same. What they meant here is probably something like playing a game and Alt+Tabbing out of it to do something else, or streaming and doing something else at the same time.
I think we are starting to reach the point where CPUs have become powerful enough for most people that they can't feel much difference in the normal things they do on their PC, just like how a mid-range Android phone was rubbish 6 years ago, but now the user experience of a mid-range vs. a flagship phone isn't much different except for gaming.
I doubt there's a lot of difference between Zen and Zen+. I moved from an R5 1600 to an R7 3700X and I can feel a certain amount of difference in Windows; it is snappier.
@@akashsurya3879 That's what I'm implying. Cores don't matter, architecture does. I think Tech is seriously wrong about this.
I was gonna get a 5600x, but a few weeks ago a friend of mine went from a 2600 to a 2700x and told me "don't go 6 core! The numbers may seem the same, but more cores is much smoother", and now this vid pops up. New consoles are 8 core; don't go under 8 cores.
I'm pretty sure I've watched every Byte Size Tech video. They are just so interesting, helpful, and short. Thank you so much, keep it up :)
True
Are these two married? They seem JUST like my dad and his wife... the guy w a bit of ego being constantly reinforced by the female...
(and don't misinterpret me. I absolutely have the same personality as the female. I'll generally fully back whomever I am speaking with in order to help him/her fully-realize his/her perspective before I start with any critical appraisal).
I switched from a 4790k to a 10850k. I play at 4K, and the benchmarks are very similar, but my GOD there is a huge difference in experience.
Same CPU I am going to buy; it's only 400 dollars, a great deal.
Hey, what cooler are you using? I'm trying to figure out which one to get for the i9 10850k.
@@ThorntonWillie I splurged on the Fractal Design Celsius+ S36. Overkill, but it looks great.
Ok going to go check it out. Ty
@@ThorntonWillie A Noctua NH-D15, Dark Rock Pro 4, or any 280mm cooler will be great.
I've found that 4C/8T nowadays is really not enough. With every passing quarter it struggles more and more. Have Chrome open streaming music while playing a modern game? Say goodbye to your 0.1% lows. Even an ancient 980x 6C/12T handles things more gracefully, especially overclocked. Sure, it may not push out as many frames, but the overall experience is better.
If you're buying a CPU today, the minimum core count you should be getting is 8 cores. Those 9900Ks for $290 were a killer deal if you managed to get one.
Word! It's a struggle with my 6700K. I'm getting a 10850K sometime in the near future, hopefully today; I'm just waiting on it and I can't f**king wait!
I agree. 4790K served me well for years but it's time for retirement.
Benchmarks are for YouTubers who are running an advertising-focused business through creating content. They are not interested in actual real-world performance. They need a medium to communicate with a wide audience in the most widely understandable format. Benchmarks provide a one-dimensional, synthetic view of hardware performance.
On top of all those mixed and varied use cases in the real world, there are also all the 'quality of life' factors that get ignored even more because they are super hard to turn into numbers. These are some of the most pertinent things to users, such as reliability, stability, longevity, ease of maintenance, how close to the red line they are running, thermal issues, software issues, etc. These have more impact on a user's experience; most users couldn't spot a 10% difference in frames or latency even if there was money on the line.
That said, I think most PC enthusiasts know this; they might get a little dazzled by the numbers, but they put a more sensible hat on when buying. I think the people who do get duped are the casuals and fanboys who take it verbatim, so good on you for producing this video, though I'm not sure it will change much.
Reviewers seem to live above scrutiny, and I am sure they know exactly what they are doing. But I still blame people for being sheep; it is not the sheepdog's fault. There is a good reason they are called influencers: they are the modern-day drug pushers, they know their product is toxic, but they are there to make money, not to care about the junkies. Every drug pusher pretends to be your friend, and every junkie falls for it. That is just life.
I'm a designer by trade, and whenever I work with technical people such as engineers, I often find that most of them just think in numbers rather than thinking like an average user, so I totally agree with this! It even makes you wonder why some reviewers bash products for being too expensive when the price doesn't match the benchmark numbers, while discounting the human experience. Most benchmark reviews only produce quantitative results, which doesn't paint the whole picture. Not to mention, reviewers tend to put too much focus on overclocking hardware that is already ridiculously fast. Perception is a valuable qualitative measure which is often ignored. UX needs to be taken into account; it is called human-computer interaction after all. You essentially need a good balance of quantitative and qualitative reporting to provide better context in a review. This is a very academic and insightful take! Good job, Tech!
This is postmodern. Watching GN videos and making some spreadsheet is still the best way to decide what to buy for the overwhelming majority of people.
Like they said, should also check other tests of people doing more than just gaming. It'll help to pick the best chip for your budget.
@@xpodx Think of a specific scenario where that makes sense. Most consumers don't care if websites load 2ms faster by going Intel or whatever. They care about fps per dollar and frame to frame interval consistency.
@seanlutzke1694 Some people want the fastest and are only into one brand; my buddies don't do AMD. Well, one buddy of mine has an AMD CPU. But yeah, I don't care about performance per dollar, I just want the fastest I can afford, and I sell my old part to get the better one.
@@xpodx GN Benchmarks are great for that. My point still stands. If you want the most bang for your buck you should watch GN videos about the products you're considering. The only reason not to is if you're too dumb to understand the data.
@@seanlutzke1694 I watch his stuff, super informative guy. Some of his thoughts I don't agree with, but most things I do.
I could be wrong, but for me personally, I have gone from a 7700k to an 8700 to a 3800xt within the last 2 years, and haven't really felt a difference in speed within Windows. Not sure if there's something else to this, but that's just my experience.
Gaming on an R5 3600, Cyberpunk is hammering all of its cores and threads to 100%. Wish I had at least 8C/16T, or better yet 12C/24T.
What you say was the main selling point of the first-gen Ryzen CPUs! AMD showed how the 7700K (4C/8T) beats the Ryzen 1700 (8C/16T) in games and then loses all of its advantages to the 1700 once you start streaming and heavy multitasking. So while the 7700K system lost responsiveness and its FPS dropped drastically, the 1700 kept doing its own thing. But yeah, the 7700K would win all the gaming benchmarks by a mile.
Erm. But if you're simply gaming on a 7700k or an R7 1700, even with background apps running (but not streaming), the 7700k would still beat the 1700. The background apps are such negligible things that modern computers, even an R3 or i3, will run them with ease, as long as it's not streaming/rendering something else while gaming.
@@seahawkd5203 Ryzen 3000 is not trash in gaming lol, what are you on about? Especially 1440p/4K they’re the same performance as the new chips
@@seahawkd5203 Yes, In my second sentence I say at 1440P/4K they have the same performance.
100% agree with you man, imagine people actually comparing a 6 core to a 10 core...
But tons of people are going for the 5600x because "it is better than the i9-10900k in a few games".
@@unclej3910 is it the same price tho?
@@MoneyMaking861 Ya, but ask yourself: which would you rather have, a 10-core 10900K for $299 or a 12-core Threadripper 1920x running all-core over 4.2 GHz for $199? In the end it all depends on what you are doing.
Going from a 2700x to a 5600x was a really good move. It got rid of stuttering and gave all my FPS levels a boost at 1080p and 1440p, in the real world, not benchmarks.
I love how he said "if they're the same price", stressing the "if", and that's the exact situation I'm facing with the 10850k vs the 5600x/5800x. I've kinda leaned towards the 10850k because 10 cores and 20 threads is overkill for what I need, and I would rather have overkill than passable, especially for the same price.
I wholeheartedly agree with Tech's viewpoint. Every time I run Windows updates on my retired HTPC with a Core 2 Quad Q9400 and 8 GB of RAM, I'm glad I don't use it anymore as my daily rig. Just running Windows updates it sits around 60-80% CPU usage and takes AGES to download and install them. Now granted, it's a really outdated CPU; since then I've used a 4790k and a 3700x (currently running a 5950x), but the only CPU I could keep using would be the 3700x, not the 4790k. Why? Because once you get used to a 3700x, and now a 5950x, it would be really hard to go back to a 4790k. Sure, the 4790k would still run most games adequately, but you are really on the edge, especially in games like Shadow of the Tomb Raider, Assassin's Creed Origins/Odyssey, etc., and god help you running something extra in the background like watching a Twitch stream, installing games on Steam, or updating Windows/games.
Also, from my personal experience, I always aim for a CPU with more cores/threads. Obviously single-core speed is extremely important as well, but if you have the choice and don't sacrifice much single-core performance, always take the CPU with more cores/threads than you need right now. Trust me (or don't), but it will come in handy in the future, however crazy it might sound now; a 5900x/10900k is a must for a high-end pure gaming rig. Changing the CPU/mobo/RAM is the hardest part of PC upgrades, and the longer you can keep them, the more they will pay off in the long run.
Horsepower is meaningless for a passenger car, as it primarily affects straight-line top speed. A nice flat, fat torque curve is what you want, as it is the source of acceleration and responsiveness. FPS in expert reviewers' benchmarks is all fine and good, but real-world performance will always be lower.
Indeed, which is why electric motors are superior to combustion engines for this use case. Fun fact, the first cars were electric. It's just that combustion technology very rapidly outperformed the crappy batteries and less than ideal motors at the time.
@@LiveType An internal combustion engine car will continue to be superior for long trips as long as we lack fast charging and sufficient stations.
Cars don't even use horses anymore.
Love the fact that you used a Doug DeMuro analogy! I love tech and cars, and I 100% agree about the objectivity versus subjectivity argument. Benchmarks ≠ end-user experience
Based on current prices, the R5 5600X is 405 and the i9 10850k (basically a 10900k) is 450. There is no way you should buy the R5 in this case either... yeah
I do think benchmarks would benefit from more statistics in their analysis, even if the setup is itself subjective. It's worth knowing the level of deviation observed in each setup when assessing how much more capable something is.
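For instance, a tiny sketch of the kind of statistic being asked for here; the run results are invented, only the mean-and-deviation reporting is the point:

```python
# Illustrative only: average-FPS results from repeated runs of the same benchmark
# (numbers invented), reported as mean plus standard deviation so the run-to-run
# deviation sits next to the headline figure.
from statistics import mean, stdev

runs_fps = [143.2, 141.8, 145.0, 139.6, 144.1]  # hypothetical repeated runs

avg = mean(runs_fps)
dev = stdev(runs_fps)  # sample standard deviation
print(f"Average FPS: {avg:.1f} +/- {dev:.1f} ({dev / avg:.1%} run-to-run variation)")
```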
The benchmarks are "best case scenarios", and you can "compare between various graphics cards" to get an idea of which one is best. CPUs are more iffy, because of "background tasks".
I’m sold on the Honda Odyssey with the vacuum cleaner....😂👍
The final line... "there's more to using a computer than the frame rate." I wholeheartedly agree!!
Clean test bench benchmarking is analogous to doing algebra.
Real-world testing is multivariable calculus (where some variables are subjective and others can't be controlled for, so the test can't be repeated).
That's a decent analogy
From all your recent videos, I would summarize the idea as "the best setup isn't the perfect setup, but the one that best fits."
Well.... in Australia right now:
3600x - $380
3900x - $760
I don't see how double the price justifies "it just feels better to use."
I would love to get higher than a 6-core CPU, but at double the price, plus needing to get a cooler, the 5900x and 3900x are not better for most gamers on a budget. Take that $300+ and use it in a couple of generations, after DDR5 is out and affordable; by then all CPUs will likely have more cores as well.
I can’t control crazy Aussie prices. :)
Ah yeah fair, I probs should have generalised rather than use Aussie prices. Prices will change anyways and at some price point one will be worth it over the other.
The 'use more than just a benchmark' and 'factors that are hard to benchmark' parts are great though. And what I've been loving about this channel, always giving more angles to view and think about things.
Also, if you ever are able to control the Aussie prices, feel free to make them a little less crazy
Amazing video, can't believe more tech folks don't talk about this. Lol, if you read the comment section of benchmark videos, a lot of people complain they don't get close to the benchmark numbers.
There is a serious bloatware problem, especially with the software the motherboard vendors provide (or iCUE). Also, all it takes is a cheaper motherboard with a poor VRM, or a cheap power supply, and it can cripple the whole system. Combine the two, and you've got seriously poor performance relative to the numbers in the benchmarks.
tech will never let this go lol
Well... I subscribed on this video. Disappointed in some ways, because I now think I haven't made the best decision on my new PC components. On the other hand, I can appreciate the wider picture and know how I'm going to upgrade my rig in a couple of years if I want to play top games.
I love this video so much. This channel deserves so much more love!
@@ploperdung Oh wow I didn't know that! Subscribed to Tech deals, thanks for the heads up. Looks like I have something to binge watch again.
Can you please make a different channel all about buying the best cars at different budgets? That would be amazing. I don't know anything about cars, I'm going to need to buy one in the future, and I need a buying guide.
Yep a car channel and housing/ personal finance channel, too!
If Chrysler would ditch the ancient Mercedes platform the Charger, Challenger and 300 are built on, they might be able to make it weigh less than a black hole while still being strong enough and being able to take a turn.
It's more about the latency than it is about the fps. Well said!
This channel is exactly why I'm going from 7700k to 5900x.
I can stream and enjoy my pc with no care in the world.
Even with the new NVENC, my 4C/8T CPU is showing its age. Cyberpunk is pushing it to its limit.
I also can't get my 240Hz screen yet because my CPU is too weak :(
Cyberpunk is the new Crysis. Terrible example. Your new CPU will not fix the poor performance in that game.
@@jokernumbaonefive2413
Yes it will. That game prefers 8C/16T; there are plenty of CPU scaling benchmarks out there.
Yes, it is badly optimised, but that is not why I'm upgrading.
I stream, and I also want to play at 240Hz in some games, and the new Ryzen CPUs would help me do that.
@@C0d0ps Well, the 7700k was the worst purchase you ever made, because the 8700k came out the year after. I see you're a streamer, so can't you just stream at 720p and not take a performance hit in game?
@@C0d0ps Cyberpunk is CPU-bound as much as it is GPU-bound. AND it is unoptimized. None of the CPU and GPU combinations you can get today will provide the scaling you are paying for. Keep that in mind. You'll also not see a significant difference between 6 and 8 cores for at least 4-5 years, and by then your expensive CPU will be obsolete. Of course, if you 'multitask' while playing a demanding game like Cyberpunk, go ahead with an 8-10 core CPU. But that is a lot of money to waste, since you can get a 6-core chip for $200.
My personal background tasks: HWiNFO, Steam, EA Origin, Epic, printer software, a VPN, Corsair iCUE, Windows Update, Outlook, a YouTube playlist playing in the background, aaaand the game launchers downloading and updating some games... On top of that, I have the nasty habit of opening everything that interests me in a new tab. It is very easy to "use up" what was an entire PC a couple of years ago, meaning 4-8 threads and 8 GB of RAM, without actually doing anything. Hardly something one can measure on a clean benchmark system. Benchmark results, imho, should be taken as a baseline of what is tested (the game) and of how individual components compare relative to each other on an even playing field. I'd calculate in some headroom if you want to reach that suggested level of performance in real life; where technology currently stands, that's especially CPU cores and RAM... OR be prepared to run an everyday system as clean as a benchmark system.
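A minimal sketch, assuming the psutil package, of how you could check what that pile of background apps is actually costing before comparing your numbers to a clean-bench chart:

```python
# Sketch: list the busiest processes by CPU and memory so you can see how far
# your everyday desktop is from a "clean benchmark" system. Requires psutil;
# run it with your normal apps open.
import time
import psutil

procs = list(psutil.process_iter(["name"]))
for p in procs:
    try:
        p.cpu_percent(None)  # prime the per-process CPU counters
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(2.0)  # sample over a short window

rows = []
for p in procs:
    try:
        rows.append((p.cpu_percent(None), p.memory_info().rss, p.info["name"]))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

for cpu, rss, name in sorted(rows, key=lambda r: (r[0], r[1]), reverse=True)[:15]:
    print(f"{cpu:5.1f}% CPU  {rss / 2**20:8.1f} MB  {name}")
```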
I went from 6 cores to 12 and know exactly what you mean. It’s something you can’t benchmark but you feel when you use the computer.
Yeah, you can't really see it. Looking at a live game benchmark you can't see the latency difference between 6 and 12 cores when clicking buttons, opening and closing programs, doing Chrome things, but holy heck you can just feel the difference.
Hitches and dropped frames would show up in the 1% lows and 0.1% lows.
Also, the vast majority of people who own a gaming PC use it primarily for gaming, and they don't multitask.
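For reference, a minimal sketch of how those 1% and 0.1% low figures are typically derived from raw frame times; the frametime capture below is hypothetical (e.g. a PresentMon/CapFrameX export), and tools differ slightly in their exact definition:

```python
# Minimal sketch: derive average FPS and 1% / 0.1% low FPS from frame times (ms).
# One common method: take the slowest 1% (or 0.1%) of frames and report the
# average FPS of just that slice.
def low_fps(frametimes_ms, fraction):
    worst = sorted(frametimes_ms, reverse=True)      # slowest frames first
    n = max(1, int(len(worst) * fraction))
    slice_avg_ms = sum(worst[:n]) / n
    return 1000.0 / slice_avg_ms

# Hypothetical capture: mostly 60 FPS frames with a handful of hitches.
frametimes_ms = [16.7] * 980 + [33.3] * 15 + [100.0] * 5

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
print(f"Average FPS:  {avg_fps:.1f}")
print(f"1% low FPS:   {low_fps(frametimes_ms, 0.01):.1f}")
print(f"0.1% low FPS: {low_fps(frametimes_ms, 0.001):.1f}")
```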
Nothing that you just said... was true...
@@ByteSizeTech host a straw poll and put this one to rest
That's like saying most people who buy a house use it just for sleeping in at night. Buying a multipurpose device for one single purpose doesn't really make a lot of sense.
@@ByteSizeTech I have to disagree here. No 'budget-conscious' gamer would waste core utility that could go to the game on Discord and Chrome tabs in the background, especially in a demanding game. Gaming while multitasking is a very premium-end, first-world problem. You are asking people to spend at least 2x the money (going from a 3600 to a 3700x/10700) for this reason, and that's the best-case scenario, since the Intel chips are even more expensive. That money is better spent upgrading the GPU. Your CPU core arguments are heavily biased towards premium spenders, where the argument is moot since they can technically buy the best of the best. Your job is to find the deal for people on a budget, not to suggest people spend 2x the money, waving away 6 cores and pushing 8-10 cores with a minimal FPS jump now and undetermined future potential. 6 cores, one monitor, no multitasking, with a 5700 XT-ish GPU will be good for 1080p gaming for 4-5 years. Even if the argument involves 2K and 4K, the money is still better spent on the GPU rather than on 'extra cores'.
@@crazyahhkmed
I didn't say it was just for gaming.
I said it was primarily for gaming.
which means when they game they do in fact turn off other programs such as browsers and other memory/CPU hungry tasks. Sane human beings do not have 16 Chrome tabs open at the same time that they are gaming. That would just be a waste of money/performance. You know you can close them and then restore all of your tabs once you open Chrome again.
I'm glad you are doing these because all the recent Tech Deals videos are all really long.
this channel is underrated, just got me subscribed.
I think AMD showing us FPS in their cpu presentation without explaining what SAM is might be a good example
I suggest you create a "non-clean" test bench, where you show the lack of power from the number of cores. If you are saying this effect is tangible, it SHOULD be benchmarkable. In a scientific world, there is no way you can say a CPU's power is not enough but you can't show it. Please give me a scientific test where a 5600X loses to a 9900K in gaming. I'm just so tired of hearing this "you need a number of cores for something" stuff. That would mean a 1700X would be better than a 5600X. Which it is not. Forget the number of cores and look at the performance of the chip... Tech should be savvy enough for this. IPC and MHz make the chip, not the number of cores.
In fairness they did address this point, to a degree. Ultimately I think the only way to do it, given the inconsistencies, would be to create a program that actually uses up set amounts of CPU and RAM. And I'm no expert, but that may not be easy to do because of how operating systems fundamentally work. It's both why I love PCs, but at the same time, on a simple, pure basis, consoles are insane value for money.
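For what it's worth, a very rough sketch of that idea might look like the following: a repeatable synthetic background load that holds some RAM resident and keeps a few cores partly busy while a game benchmark runs. The worker count, duty cycle and memory size below are arbitrary placeholders, not anything the channel has tested, and real background apps behave messier than this.
```python
# Hedged sketch of a "dirty test bench" load generator: hold some RAM and
# keep N worker processes busy at a rough duty cycle while a benchmark runs.
import multiprocessing
import time

def busy_worker(duty_cycle, period=0.1):
    """Burn CPU for `duty_cycle` of each period, sleep for the rest."""
    while True:
        end = time.perf_counter() + period * duty_cycle
        while time.perf_counter() < end:
            pass                                   # spin = CPU work
        time.sleep(period * (1 - duty_cycle))

if __name__ == "__main__":
    ram_block = bytearray(4 * 1024**3)             # hold ~4 GB, like a pile of browser tabs
    workers = [multiprocessing.Process(target=busy_worker, args=(0.5,), daemon=True)
               for _ in range(4)]                  # 4 workers at ~50% duty cycle (placeholder)
    for w in workers:
        w.start()
    input("Background load running - start the game benchmark, press Enter to stop.")
```
Even this only approximates the problem, since real apps also hit the disk, the network and the GPU, which is exactly the "too many uncontrollables" point made below.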
If he or anybody else could've done it, they would've.
Real world is exactly that, real world.
Even if you replicate everything - the Chrome version, the number of tabs, the exact websites visited and what you're doing on each site, what files you're reading and writing - it will NOT be the same on every bench run, because there are too many uncontrollable variables. I'm sure you can grasp this.
I know right, the key word is then... it is subjective, and like Tech always says: "It depends". Which is why I would steer away from focusing on the number of cores between different chips (Intel and AMD, older gen vs new gen) and rather focus on the (measurable) performance. The benchmarks are not lying, they just focus on games and productivity tests (not streaming, or four-monitor setups where you play three different games while watching YouTube and updating Windows at the same time).
One thing I disagree with Tech on: there is a scenario where a 5600X makes much, much more sense than an i9-10900K, which is thermally limited applications (small ITX cases, etc.). The power draw and heat output of the Intel part is just staggering compared to a Zen 3 chip.
The 9900K is much closer in single core performance to the 5600X than the 1700X is to the 9900K. Obviously Zen 1 is slower no matter the cores.
@@anotherbricknwall The thing is the other channels are doing the same thing by indirectly trashing this channel and stating that he is spouting "nonsense". The word on the street has become "buying anything more than a 6 core gaming CPU is stupid".
I feel like it gives an idea of the right place to look, but it never gives you an exact result, especially for every other individual.
We need experienced brains like yours to properly educate the guys/gals trying to understand everything with a TLDR mindset !!
I love the way you think, it's so good and very unique. You're a very open-minded person.
To be perfectly honest, no one takes benchmarks as the end-all-be-all; they're there to give people an idea of what to expect out of a certain part.
You don't know that. There's likely some people who do take it as Gospel.
They hated Tech for he spoke the truth.
I have four threads per core pegging my R3 3100. I agree somewhat with the point about FPS drops in games; the difference in overall app responsiveness is hardly noticeable.
This channel is so underrated! I love it
This is one reason I went for a 3700X. More resources to cope with older installs of W10 and background tasks. Some tests on an old install would be interesting.
Intel: Benchmarks don't matter!
Byte Size Tech: Benchmarks don't matter!
I'm poor. Whenever I purchase anything, I ALWAYS want to know that precise area of the market where the returns are the best... right at the top of the value-for-your-$ curve...
Right at the point where you start paying a lot for less-great improvements, that's where I want to stop moving up.
We're talking the point where, if you took some option off the table, it's going to be a killer blow in performance or happiness.
I would like an entire channel of this stuff for all areas of life... the cheapest car, pc, smart phone, etc you can get that is still solid.
I have kicked myself SO often for going with the cheapest decent-seeming option because it ends up being TOO crap... especially when usually just a few bucks more would get you a mountain of difference.
Your channel is slowly convincing me I need a 5950X.
Haha I love the car analogy, however, your average person races in straight lines. Mainly lights to lights or on the freeway. And wheelies lol people love wheelies 🤣🤣🤣
Keep dropping the knowledge for those that may not be tech savvy.
Its amazing how uncommon this knowledge is.
It's why I gave up on "gaming Jesus". The tests they do are always clean installs, one thing running, and say nothing about what people use computers for.
People are in for a huge reality check when their build doesn't perform like Nexus benchmarks. The reason is that Nexus is, in my experience, not good enough at judging a component as part of a system ecosystem.
Like, nobody cares that a 5900X drags an i9 over the coals in 7-Zip. I can't name one damn person on Earth who spends all day zipping and unzipping archives. Once you get into these workstation loads, you're looking at business grade hardware.
I would also hazard a guess that they naively believe "gigabit" Ethernet ports are capable of fully saturating the link bandwidth, when it completely depends on how the hardware is integrated. I've seen this before on Cisco routers and switches, where fabric and switching capacity are very real limits.
Hello :), I have a Ryzen 3700X and a 5600X, and my motherboard is an ASUS X570-P. Which CPU would you recommend I keep for gaming and everyday office use? Thanks for replying :)
It is about the budget and the amount of money one is willing to spend on a PC, and the majority of people, in my view, are still in the mid range and lower end of the spectrum. The high end systems get a lot of focus while the majority are ignored. It's not that people don't know that 8 and 12 core CPUs are better than 6 or 4 core CPUs, but most have a limited budget which they may stretch by an additional 10% or so; they won't stretch it by 40-50%. I take your point that you see 8 core 16 thread CPUs as the best value in terms of future proofing, and I agree with that assessment, but people are already stretching their budget to get those 6 core CPUs and mid range GPUs, and if the higher end 8/10 core CPUs were just 10-15% more expensive, most would definitely go for them - but they are not, and are mostly well above their limits. The point is that one spends more on the GPU than on the CPU for gaming, and the prices of both have to fit within one's budget. The price points of mid range and value-for-money GPUs have already risen compared to before, and one would rather stretch the budget on the GPU than the CPU, since far fewer games are limited by the CPU than by the GPU. I have a GTX 1060 and am thinking of buying an RTX 3060 Ti, which I think is the value card for 1080p gaming. That in itself is a big jump in mid range GPU price, and then asking people to spend on 8/10 core CPUs on top would be a bit too much. I believe 6 core 12 thread CPUs like the Ryzen 5600 with its IPC improvements, or its Intel equivalent, will be the value-for-money buys that last a long time.
True benchmarks include multiple monitors running while Chrome browsers are up, Twitch, MSI Afterburner, OBS screen recording and broadcasting via Twitch or YouTube, and rendering some kind of file, all while running a benchmark for a game. That is where the real bench numbers come from.
I disagree. Then you tell me one single case that is applicable to all of the users on the planet. When we benchmark a game, the only thing using most of the resources is the game itself. Yes, there will be some resources used by background Windows telemetry etc., but it's a game benchmark, not a multitasking-while-playing benchmark where you have 10 background apps open, 30 Chrome tabs, etc., as if everyone needs 32 or 64 gigs of RAM.
I regularly game with 40+ Chrome tabs, monitoring software, and a music player open in the background, sometimes with videos playing in the background too, with only small variation in the smoothness of the game. (PC has a Ryzen 5 3600X and 16 GB of 3600 MHz CL16 RAM.)
And this is why my $199 Threadripper 1920X and my $850 Threadripper 2990WX are still better choices than many other less expensive CPUs. Depending on the use case, of course.
The whole story depends on how it's configured and what it's used for.
i always keep an eye on average cpu usage and 1% lows
Loving these byte size conversations.
The drag race analogy is wrong. That would be a Dhrystone-style synthetic benchmark. Benchmarks in games are like lap times on a track in sunny conditions - very close to real world racing, but it might be a little different in the wet, etc.
So, what do you think about Intel's 11th gen sacrificing cores for single core performance?
Facts. When I try to play CS:GO on my 3900X with 32 GB of RAM and a 1080, if I have a ton of Firefox tabs open (which I usually do), my game will get high FPS but it will feel sluggish. Sometimes it runs buttery smooth and other times I have to close certain tabs. Twitch streams especially make my game feel like trash. Looking at charts wouldn't show that; it would just show that I'm still getting 300+ FPS. But as an actual user I can tell you it sucks and affects my experience. I'm really hoping my 5950X will fix this issue.
Same thing used to happen to me: even though I was getting 160+ FPS in Valorant, it felt sluggish on my i7-5700; now it feels much better with my i9-9900K.
No... you don't need more cores to fix the drop in performance... CS:GO uses 4 cores maximum and it also gives you the most performance with a 4 core affinity assigned in Task Manager. Instead of blaming it on the CPU, learn how to manage your processes' affinities manually and take advantage of both CCDs your 3900X has, damn. You literally have a 3600X + 3600 in one chip connected via Infinity Fabric. There is so much potential, and a bit of planning can easily increase performance.
@@rooscoe1061 Not talking about more cores. The 5950x has combined memory and has less latency.
@@TriWaZe Although the architecture is improved, this issue will still happen on the 5900X and 5950X. Even with the combined memory, you deal with latency from shared CCDs. What I'm talking about is assigning every background process to the weakest CCD and using the best CCD (which will then be basically 98% free) to game on, without feeling much of the FPS penalty you had before, and you can even overclock it.
For example, since you use tons of Firefox tabs, you would want to assign the core affinity of Firefox to the range of cores 12 to 23 and leave the game on cores 0 to 11. For CS:GO, it would be best to assign 4 physical cores, or 2 cores + 2 SMT cores, as long as you run it on 4 logical cores in Windows.
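As a hedged illustration of that idea (not anything shown in the video), a small psutil script on Windows could do the same pinning as Task Manager: push background apps onto one CCD and leave the other free for the game. The process names and the 0-11 / 12-23 split are assumptions based on the usual Windows enumeration of a 3900X; verify your own core topology before reusing the numbers.
```python
# Hedged sketch: pin background apps to one CCD, leave the other for the game.
import psutil

GAME_CPUS       = list(range(0, 12))    # first CCD (assumed): reserve for the game
BACKGROUND_CPUS = list(range(12, 24))   # second CCD (assumed): browser, Discord, etc.

def set_affinity_by_name(exe_name, cpus):
    """Pin every running process whose executable name matches `exe_name` to `cpus`."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == exe_name.lower():
            try:
                proc.cpu_affinity(cpus)
            except (psutil.AccessDenied, psutil.NoSuchProcess):
                pass  # protected/system processes will refuse; skip them

# Example process names - adjust to whatever you actually run.
set_affinity_by_name("firefox.exe", BACKGROUND_CPUS)
set_affinity_by_name("csgo.exe", GAME_CPUS)
```
Note that affinity set this way only lasts until the process restarts, so it has to be reapplied (or scripted to run at launch), which is the same limitation as doing it by hand in Task Manager.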
I should also add that I achieved a single-core score of 521 in Cinebench for the gaming CCD, which is like the 10700K's. It will not always perform as such, but the optimizations pay off really well.
Great video. We have several computers in our household and we are always upgrading and swapping around hardware. My son sold his Skylake i3 system, and we swapped the 1050 Ti with the RX 560 in my other son's computer before he sold it, because the 1050 Ti was supposed to be better. My son was running an R3 1200, and believe it or not, the computer RAN WORSE. I exchanged my 1070 for it in my i5-6600K system (which I'm going to sell soon), and it MYSTICALLY RAN BETTER in that system (obviously not as well as the 1070, but it's going to auction anyway). Trust me when I say we did ALL THE DRIVER THINGS!!
Fully agree... thanks... do you think the R9 3900X is still a worthwhile CPU today (end 2020 / early 2021)? Would you recommend buying it instead of spending a bunch of money on an R9 5900X??? I play at 4K and do some video editing sometimes.
Yeah bro, it's one of the best CPUs. Just because it's not the latest doesn't mean it's trash; it's actually pretty good. Plus, the 5900X is pretty difficult to find.
@UNITYtechFUTURE yep... but 6 core is less future proof I guess...
@@andreanovelli3265 Thanks!
This topic reminds me of how, in recent years, some testers/benchmarks used Ashes of the Singularity for gaming CPU/GPU tests to prove that AMD is better and more future proof than Intel/Nvidia. :)
I like benchmarks...because I can run them on a new build to see if I've set everything up correctly and the hardware is working as intended as compared to other similar builds. Also, it still works for bragging rights 🤣😂
I was expecting him to say Gamers Nexus before he said it.. XD
Nobody with an i9 and a high end graphics card is playing games at 1080p, so benchmarks like that are meaningless.
Good, honest information and very true, but the real line comes down to what each individual can afford. If you have money to burn, then by all means get the most cores and the most and fastest memory, because you can afford it. For those like me who can't afford all that stuff, I just get what I can afford, which generally is technology that's at least 3 years behind. My current computer was bought in 2008 for under $500 and is still running Windows 7. But I am going to get (if it's still in stock; at last check the store had 2) a prebuilt HP with a Ryzen 5 3500 and a GTX 1650 Super GPU for $599. It may not be the best, but compared to what I have now it is a huge leap, and that's all I need to be happy. It will be capable of doing everything I need it to do for a long time, especially if I can get my hands on an RTX 2060 GPU or the 2060 Super; my power supply will handle up to that. That should get me 1440p resolution for video quality, which should be good for a few years for me. After all, I am still going on Windows 7, DDR3 RAM and an Athlon processor.
It's OK, but it would be a good idea to do a test bench from a gaming point of view with a 5600 and a 10900, then do some work on it and see what happens.
This is why I watch so much of Tech.
Every aspect of your PC is relative to its performance. This is the only video I've ever seen that explains PC behavior... a PC is only as fast as its slowest part.
So do your benchmarks with other programs running at the same time and show the results.
That really misses the point of the discussion... you are still focused on a FPS number in a chart.
@@ByteSizeTech but while other stuff is running, there has to be something to be used as a measurement.
Currently using a 5600x and I don't have any problem streaming on discord while I game in CPU bound scenarios. The only time my CPU hits 100% is in benchmarks. 6 cores is easily enough for my use. Two 3440x1440 monitors (one for games, one for discord/task manager/fan curves/browser). The sort of people that need that many threads and don't use their PC for work must be the same people that leave every app on their phone running and have 10+ browser tabs open constantly. I just can't see it.
try a 12 core 5900x and your game will run smoother
@@freshdiaper427 Huh? How do you figure?
(*Nasally voice*) But in single threaded applications....games don't utilize no more than x amount of cores...And I am here to defend my purchase!.. All the regurgitated fecal matter I heard in the forums.
Well, it's a lot to digest, so I'll just take the summary as: more cores means a better and more responsive computer where you can do many things at once, as I do. I stream on Discord plus run the audio of my team's voice chat in Discord while playing Dota 2, and when finding a match I go watch YouTube with Chrome running in the background. Therefore it's going to be a better experience with more cores.
Cool factor!
It sounds like a Game Pro review.
What about the Chrysler Pacifica? I work at the factory that makes them! 😎
I have multiple monitors but I don't get why this should hurt performance.
About to spend $1300 on a new build. My CPU is going to be an i9-10850K, with 32 GB 3200 RAM and a Z490 MSI board with 2 M.2 PCIe Gen 3 slots. I want my 10 cores and 20 threads. Oh, and the graphics card is a 5500 XT for now, the only thing I can find, but I mainly play older games.
I have a 1660, an R5 3600 and 16 GB of RAM, and I'm looking to upgrade my GPU so I can run Cyberpunk 2077 on my 4K monitor (not necessarily with RTX and maxed out; I don't mind running on high), or at least run it smoothly on my 1080p monitor at max. I don't want to start buying the top tier card every year, so my question is: should I get a second hand 2080 Ti for around £500, or should I suck it up and get the 3080 when it's back in stock?
It's better to wait, as Nvidia may launch more Ti or Super versions anytime next year with more VRAM. Even the regular 3080 has just 10 GB of VRAM, so that is not a great deal either.
Main test... does it do what I want and am I happy with it... if I like a Walmart PC... then I like it.
Benchmarks are plenty relevant and anyone using a computer should know enough about how to keep background tasks and/or apps closed when they play games for max performance. Or if they don't know or want to know all that stuff then they should pay for more cores and more memory.
I found your channel after I ordered a 5600X... -_- for 260€ on Black Friday. In the end I went with the 5600X because I don't know if the Brocken 3 White Edition can handle the 5800X. But it's still far better than the i5-4690K.
Your videos have taught me a lot, thank you!