Threadripper vs. Core i9 for Unreal Engine Development in 2024

  • Published Nov 21, 2024

COMMENTS • 96

  • @threadripper979 · 4 months ago · +31

    Looking good, Gordon. You're a tough dude.

  • @Baldmaxx · 4 months ago · +15

    You're looking and sounding awesome, Gordon! We are all thinking about you and wishing you the best.

  • @technopriest8686 · 4 months ago · +14

    You guys are filling a huge void of YT content by tackling this subject. As a new indie game dev, this is crucial info. Thank you!

    • @donnydarko7624 · 4 months ago · +2

      What really got me into watching LTT, GN, and PC Gamer was wanting to build a computer so I could learn Blender, and maybe ZBrush and Houdini, and needing to figure out which specs were and weren't important for a modeling/rendering machine. I came to that decision at probably the worst time in recent history for the average Joe to buy or build a machine for this purpose: 2021. I definitely wasn't going to pay a system integrator a stupid amount of money I couldn't afford for a machine worth a lot less than I paid for it. So I decided to take it slow: the GPU shortage wouldn't last forever, and hopefully by the time GPUs could be found I'd have learned enough to make the right choices and build the best machine within my means. I have to say, it took me about a year of drilling down to learn that GPU VRAM is probably the most important specification as it pertains to scene size for 3D modeling, animation, and rendering, and it came from some obscure channel that no longer uploads videos. I was extremely surprised that Andrew (Blender Guru) didn't have that information laid out or posted anywhere, or link to a site with build guides, at least not that I remember. None of the 3D channels seem to cover it; most just say "get an Nvidia GPU for viewport real-time ray tracing," as far as I recall. At least I don't have to go look up the information now, because it's already in my head.

    • @technopriest8686 · 4 months ago · +2

      @@donnydarko7624 Yeah, I totally get that. I only recently learned about VRAM too. It's unfortunate that so much of the market is geared exclusively toward gaming. Where are you located? I think it's easier to find local game devs to learn from now than before, especially in big cities.

    • @donnydarko7624 · 4 months ago

      @@technopriest8686 Minneapolis. There was an Activision office here, but I don't know if it still is. I'm more interested in animated shorts, live visuals, and VFX though. Being a game dev is too much like being a film director, IMO: you spend two years or more of your life focusing on a singular idea in hopes that it will be well received. I don't think I could handle that level of anxiety. I also write electronic music and DJ, so I have a concept I'm eventually trying to put together that combines live music performance and live visuals, where the visuals are fed input from the changes on the synths I'm performing with.

  • @moravianlion3108 · 3 months ago · +5

    Getting into full-time game dev. Paired a 7950X with 192GB of DDR5, so I guess I'm somewhere in the middle here? Saved a bunch of money by buying a 7900 XTX instead of a 4090. It still has 24GB of VRAM and can also run CUDA (thx ZLUDA) and other parallel tasks fairly quickly.
    The overall PC hardware cost me under $3,500 with everything included: 4TB of NVMe storage, PSU, and so on.

  • @HanmaHeiro · 4 months ago · +6

    Thank you! This is the kind of coverage some of us have been starved for. You guys cover what seems niche but turns out to be ahead of its time. I remember your early Frore Systems coverage.

  • @graphisn · 4 months ago · +2

    This video is a bit of a nostalgia trip for me. Back in 2016-2021, I worked as a system builder at an Australian 'Puget-Systems-equivalent' company, building workstations for SOLIDWORKS/Archicad/Twinmotion. Unfortunately Covid killed off the business.
    That being said, many of the talking points mentioned here are very relatable, especially the 'time spent when the computer is locked up, is time that you're paying someone to sit there and do nothing'.
    Kinda like Jensen's meme of 'the more you buy, the more you save'.
    Fun fact: we 'almost' landed a deal with Weta Digital (now Weta FX) back in '15 before they ultimately went with BOXX.

  • @aridhol22 · 4 months ago · +7

    Good to see Gordon, lookin good

  • @jamescampbell6728 · 4 months ago · +8

    Ultimately, these high-end consumer CPUs are basically still HEDT CPUs. I hear a lot of game devs use 7950X3D systems and tailor their workflow to them. Not only do you save on the CPU, but also on the platform. I hear non-Pro Threadripper also has much worse motherboards because they just don't sell as well anymore, so you'd have to shell out for a Pro anyway. So if I were starting a game studio, I'd get an i9 or an R9 system.

    • @CNC-Time-Lapse · 1 month ago

      Yeah, totally agree. TR PRO really shines if you need more than 128GB of RAM, as WRX90 supports up to 2TB of 8-channel memory and 148 PCIe lanes (128 of which are PCIe Gen 5.0). Expect to pay easily $5-10K+ for the system. I built my own and paid over $7K for my TR PRO 7965WX system. If you don't need all of that, though, a high-end consumer desktop is going to be the best choice for a startup or small studio.

  • @milohajek · 4 months ago · +1

    Yay, Gordon and Will! What do you have for us today?

  • @JBrinx18 · 4 months ago · +3

    The one that is the most reliable, clearly

  • @EthelbertCoyote · 4 months ago · +1

    I build my own, but as an animator you really have to have a sit-down with yourself: buying from a company allows leasing and, more importantly if it's a good company, drop-shipping. Also, no matter how technical you are, remember that being your own IT means faster solutions, yes, but also a huge time and research investment. Like anything in life, make sure your investments pay off the way you want.

  • @DavidWB147 · 4 months ago · +3

    I really hope we can see some HEDT competition. I do a lot of CPU rendering, and the gulf in price between my 13700K and a newer Threadripper is enormous. I'd happily discard the advanced server-like/Pro features and such for a 64-thread, $1,000 HEDT part. The £1,300 baseline price for the 7960X feels a bit uncomfortable for 48 threads.

    • @Javierm0n0 · 4 months ago · +2

      But you can't just compare thread count. Applications just like different CPUs sometimes.

    • @paulct91 · 4 months ago · +1

      Does this include Threadripper Pro AND Threadripper (non-pro)?

    • @DavidWB147 · 4 months ago

      @@paulct91 For price? The cost of £1300 was for the non-pro TR 7960X. At least here anyway.

    • @DavidWB147 · 4 months ago · +2

      @@Javierm0n0 For my specific application of rendering on CPU, with examples like highly parallel raytracing, the 7960X comes in with a time that is almost half that of my 13700K, but unfortunately it is between triple and quadruple the price. I get why, but it's a nice aspiration that one could maintain a more linear cost per thread or cost per core at higher counts. I can't see any evidence that a Threadripper or the new Xeon 6 will do this at the "lower" end of 48-56 threads.

    • @Javierm0n0 · 4 months ago

      @@DavidWB147 makes sense to me.

  • @rory-red · 4 months ago · +1

    I use a 24-core Threadripper with 256GB of quad-channel memory, an OC ASUS 4090, and dual 4TB NVMe drives in mirror, with another 4TB scratch drive.

  • @forbiddenera · 4 months ago · +1

    I was literally googling this a month ago while playing with UE5, twiddling over which to consider getting.
    (Decided to wait for next gen, maybe.)

  • @donnydarko7624 · 4 months ago · +3

    Surprised companies wouldn't run the final renders on something like AWS, or even have contracts with Amazon for a dedicated monthly allotment of render time.

    • @paulct91 · 4 months ago

      Why?

    • @donnydarko7624 · 4 months ago

      @@paulct91 Stability, less upfront hardware investment, near-infinite scalability. If the company couldn't invest in a render farm of its own, there would still be no downtime, because no employees' workstations would be needed to render at all. And the cost may actually be cheaper than operating their own render farm when you consider the combined hardware costs, the electricity from the direct kWh costs of the render process, and the incidental kWh needed for additional climate control. The pricing to render using AWS is extremely reasonable, and on top of the retail pricing, I'm willing to bet that larger businesses especially could negotiate better deals.

    • @macabo · 4 months ago · +3

      By rendering, do you mean compiling/building? Companies like Epic do use AWS for some of their build infrastructure; the problem is cost. Epic already spends millions a month on continuous builds for their projects. Their usage needs are near constant, so the costs of using cloud services are much higher than in-house hardware. Cloud is better when you have brief periods of high demand, not so much for 24/7 steady demand. This is why most VFX studios still maintain their own massive render farms. Cloud is usually for burst capacity in extenuating circumstances, because it's damn expensive.
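
      (For a rough sense of the tradeoff described above, here is a toy Python break-even sketch. Every number in it is a made-up placeholder, not real AWS or studio pricing; the only point is that cloud wins at low utilization and in-house wins at near-constant utilization.)

        # Toy break-even model: cloud render cost scales with busy hours,
        # while owned hardware costs its amortized purchase price regardless.
        CLOUD_RATE_PER_NODE_HOUR = 3.00   # hypothetical $/hour for a render-class instance
        OWNED_NODE_COST = 12_000.00       # hypothetical purchase price per render node
        AMORTIZATION_YEARS = 3            # write the hardware off over ~3 years
        OWNED_OVERHEAD_PER_HOUR = 0.40    # hypothetical power + cooling per busy node-hour

        def yearly_cost(busy_hours_per_year):
            """Return (cloud_cost, owned_cost) in dollars for one render node per year."""
            cloud = busy_hours_per_year * CLOUD_RATE_PER_NODE_HOUR
            owned = OWNED_NODE_COST / AMORTIZATION_YEARS + busy_hours_per_year * OWNED_OVERHEAD_PER_HOUR
            return cloud, owned

        for hours in (200, 2000, 8000):   # occasional bursts vs. near-24/7 load
            cloud, owned = yearly_cost(hours)
            print(f"{hours:>5} busy hours/year: cloud ${cloud:,.0f} vs. owned ${owned:,.0f}")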

    • @donnydarko7624 · 4 months ago

      @@macabo It makes 100% sense why VFX houses and animation studios would run their own; they're always working on a million different projects. Whether it's light bakes, rendering, or compiling, it all comes down to a cost analysis of what works better for one's specific needs. But if you're a small/mid-size dev making 3D games, I feel like that's the ideal situation where using a cloud service would really be worth looking at as an option.

    • @donnydarko7624 · 4 months ago

      @@macabo maybe the pricing has drastically increased since I last looked into it in like 2021

  • @nempk1817 · 4 months ago · +5

    Would like to see the 7950X compared to the 14900K because of the E-cores.

  • @Lambretta_G · 4 months ago · +21

    I thought that games being unoptimized lately because devs are running Threadrippers with 4090s was a joke... I guess not!

    • @jolness1 · 4 months ago · +17

      There's a big difference between compiling, baking, and light mapping versus actually playing the game. Doing that work on a higher-end workstation is not the cause of unoptimized games. Threadrippers are much worse at gaming than consumer CPUs, so if that were the case, games would be better CPU-optimized. There are a variety of reasons why games are poorly optimized now, but this is not one of them.

    • @SMorales851 · 4 months ago · +5

      It's sort of true. Having a very powerful machine is necessary for compiling, baking, and packaging. However, there's also the issue that running an Unreal game in Editor mode is significantly slower, so the hardware needs to compensate for that in order to be able to playtest quickly. That means the dev can't reliably know what the performance of the game is really like, even if they were running a more "pedestrian" setup.
      Though, if I had to guess, the main reason for unoptimized games, in Unreal specifically, is that the engine is optimized for a certain kind of game. Some things simply won't run efficiently when used in certain ways, and some games need to replace them with a custom implementation (which doesn't always get done).
      For example, I know for a fact that the built-in character movement and animation in Unreal can get out of hand VERY quickly. It is designed to work with, at most, one or two dozen characters. Crowds and anything of the sort will require a specially optimized implementation.

    • @paulct91 · 4 months ago

      @@SMorales851 Usually they'd just package it for deployment in alpha/beta testing to help narrow down performance, or, if working mainly with a specific GPU vendor, that vendor might have dedicated staff to optimize specifically for their hardware features (as opposed to 'general' optimizations).

    • @macabo · 4 months ago · +1

      Some large game projects in UE require 128GB+ RAM just to open the projects in the editor. Companies having shitty practices and testing around scalability and perf on various target platforms is another problem.

    • @iLegionaire3755 · 2 months ago · +1

      @@jolness1 Threadrippers are not worse at gaming. This is not true. I have seen the 7995WX.

  • @ThePrimaFacie · 4 months ago · +1

    Yes, this is ad space, nothing wrong with that, and the Northwest box looks nice too. It's just 2 to 3 times more (without tax/shipping) for the same GPU. You could run three 4090 machines, have one down, and still be OK if doing GPU rendering. IDK, I understand wanting peace of mind, but I wonder how many indie 3D-heavy UE devs there are? Thanks for the vid.

  • @RurouTube · 4 months ago · +2

    I would say the Intel part is not unstable because it is a consumer part; it's just that Intel wanted to win benchmarks over AMD. I had multiple Intel systems (pre-Alder Lake) that all ran stable under full load many times, and I had an AMD system stable under the same load. The load is 3D rendering. So the notion that a consumer Intel CPU is not suitable for this kind of work mainly applies to these newer Intel CPUs. They are a piece of garbage built to make consumers believe the CPU runs faster than AMD, when in fact Intel was straining the CPU to the point that it can degrade faster than normal. Honestly, everyone buying a 13900 or 14900 should be able to replace their CPU indefinitely, and shouldn't have to run it at a lower speed/multiplier just to get it stable, because that was not what was promised in Intel's slides. There is no "you can only get this level of performance if you only run it for a certain number of minutes." You can't get Intel's Cinebench MT numbers without running at basically unlimited power, and if I buy a 14900 based on Intel's presentation, I expect to be able to run it for 3D rendering at the same performance level shown in that presentation without degradation, and everyone should expect the same. Ask for a replacement and don't just settle for a less performant profile.

    • @iLegionaire3755 · 2 months ago · +1

      I have asked, Intel has refused to speak to me over the phone regarding my very dead Intel Core i9 14900K, and kept me on hold two different times. I have little faith Intel will RMA Raptor Lake, so I switched to AMD and the 7800X3D, and without a doubt I don't regret it.

  • @deuswulf6193 · 4 months ago · +2

    Wow Gordon, I don't know what happened to you but hang in there.

    • @pcworld · 4 months ago · +4

      ua-cam.com/video/lIoyetk2vfE/v-deo.html

  • @yumri4 · 4 months ago

    For most people doing light maps, I can see them using GPGPU, and it takes a few minutes, not 20 to 30 minutes, just to do the light map for an enclosed space. There are ways to optimize the workflow so you have a minimal amount of downtime, though having a GPU server to do it on makes sense for a team of 4 or more. For a smaller team, maybe get the artist a big enough tablet to draw on that is a separate computer from the one doing the light map baking. With that, they can start or continue drawing the next part to be applied, based on what they think or know the next part will be. Yes, it is a two-computer solution: one powerful machine and one powerful enough to be a drawing tablet.
    Another reason for the computer + tablet solution is planning what happens next, though you will get to a part of the process where everything is already laid out on a design sheet and design document. At that point the tablet will just be used for doodling to pass the time and going through the image library to see what to do next, then planning it out in their head.
    Baking entire levels at a time makes little sense when you can bake only the part you are working on instead. UE 5.3 might not allow for that, which is the one issue I can think of where, yes, baking the entire level would be required each time.

  • @mrhassell · 4 months ago · +3

    The Intel Xeon W9-3495X is the equivalent CPU. The 14900K is Intel's consumer line, equal to the AMD Ryzen 9 7950X. AMD Threadripper is a workstation part; its Intel counterpart is Xeon.

    • @mrhassell · 4 months ago · +2

      Comparing an $8,600 AUD AMD processor to an $860 Intel CPU is just dumb.

  • @milohajek · 4 months ago · +3

    Most companies' system configurators are so limited.
    Why 48GB of RAM on the 14900?
    I have 128GB in my two most-used systems; 48GB seems odd.

    • @rory-red · 4 months ago

      Some of these new RAM sticks come in 24GB sizes. I have an AM5 system that maxes out at 192GB, but only 96GB in my gaming system. 48GB is becoming more normal now as they put more on each RAM stick.

    • @Vegemeister1 · 4 months ago

      48 GiB is the most you can have with 2 single-rank DIMMs. Dual-rank goes to 96. More than that needs 2DPC, which tanks memory clock speed (or more than 2 channels, like Threadripper).

    • @milohajek · 4 months ago

      @rory-red I have a triple-channel MSI motherboard from 2007 that has 48GB of RAM; this limit they are throwing out there is just nonsense. Every system I have built in the past 5-6 years (out of my 32+ years of system building, during which I have personally built over 11,400 systems) has gotten 32GB modules, so you can start with 32GB in single channel and have plenty of room to upgrade both in speed/data transfer and in memory capacity. My two newest desktops both have 128GB of Corsair Vengeance DDR5 and I find it extremely helpful, and YES, there are plenty of times where I check Task Manager and I'm using 68GB to 74GB of RAM, so 32GB (or in this case, 48GB) is a bare-minimum starting point. I mostly do video editing, website design and building, day trading, and a lot of research (so it's not unheard of for me to have 300 to 500 Chrome or Edge tabs open), and DaVinci Resolve also works better with more system and GPU RAM. Bottom line: this "48GB thing" is a cop-out. If you can't get even 16GB, 32GB, 64GB, 128GB, or 256GB* of RAM in a system these days, then you're either NOT doing your research or you're skimping on the cost of the parts.
      *Only some non-server-grade motherboards support 256GB of RAM, although more and more are coming out that support 192GB to 256GB without stepping off that massive financial cliff that is the server market.

    • @milohajek · 4 months ago

      @Vegemeister1 @rory-red As for dual-rank DIMMs, that's just some sort of shortcut, which seems to have too many downsides compared to a single-rank DIMM (Dual In-line Memory Module). As a history lesson, I have a triple-channel MSI motherboard from 2007 that has 48GB of RAM, and sure, it is slightly faster than the ASUS dual-channel motherboard I have with 32GB of RAM, but it's slower than the Gigabyte Z79 board with quad-channel RAM from that similar 2007-2009 era, so this limit they are throwing out there is just nonsense.

    • @Vegemeister1 · 4 months ago

      @@milohajek You do not know what you do not know.
      1. The 48 and 96 GiB configurations on modern platforms are not from 3-channel motherboards like in the old days, or any kind of "cop-out". They exist because the largest available DDR5 chips are (unusually) 3 GiB.
      2. A single-rank DIMM has 8 chips (or rarely, 4). A dual-rank DIMM has 16 (8 on each side). They aren't a "shortcut". They're the only way to fit the largest possible amount of memory in a slot. The number of ranks is the number of chips wired in parallel to each data line from the slot.
      3. 2-DIMM-per-channel configurations (4 DIMMs on recent desktop platforms) are harder to run at high clock frequencies than 1-DIMM-per-channel. Similarly, dual-rank DIMMs are harder than single-rank. Therefore, if you only care how *fast* your RAM is, but not how much you have, the best config is 2 single-rank DIMMs in a motherboard that only has 2 slots.
      4. 48 GiB comes from 2 channels * 1 DIMM/channel * 1 rank/DIMM * 8 chips/rank * 3 GiB/chip.

  • @Vegemeister1 · 4 months ago · +1

    12:16 Heavy parallel computations on Windows really keep you from using the machine for other things? WTF is Microsoft doing with their scheduler? Is there a Windows equivalent of "nice" you could stuff into a build script?
    (It's also possible this is a problem that would be solved by Moar RAM.)
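
    (For what it's worth, Windows does have a rough equivalent of "nice": process priority classes. Below is a minimal Python sketch of kicking off a long build at reduced priority so the desktop stays responsive; the RunUAT path and arguments are placeholders, not the exact command used in the video.)

      # Launch a build at below-normal CPU priority on Windows (Python 3.7+),
      # roughly what `nice` does on Linux.
      import subprocess
      import sys

      # Placeholder build command -- substitute your real UnrealBuildTool/UAT invocation.
      build_cmd = [
          r"C:\UE_5.3\Engine\Build\BatchFiles\RunUAT.bat",
          "BuildCookRun", "-project=MyGame.uproject", "-build", "-cook",
      ]

      # BELOW_NORMAL_PRIORITY_CLASS keeps the build from starving interactive apps;
      # IDLE_PRIORITY_CLASS pushes it even further into the background.
      result = subprocess.run(
          build_cmd,
          creationflags=subprocess.BELOW_NORMAL_PRIORITY_CLASS,
      )
      sys.exit(result.returncode)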

  • @KnowingDigitalTruth · 4 months ago · +1

    The premise of the whole video is: the Threadripper took 20 minutes and the Intel i9-14900K took 31 minutes. Would I leave my long history of Intel to run to Threadripper, to save 11 minutes?
    NO.

    • @kazioo2 · 2 months ago

      You missed the 28 min vs 68 min test.

  • @johnlewis6226 · 4 months ago

    Can you make the benchmark available?

  • @daveg4417 · 4 months ago · +2

    The comparison should have been an AMD Threadripper versus an Intel Xeon W-2400/W-3400. That would be apples-to-apples HEDT; an i9 is not the same class of processor.
    That said, an i9 versus Threadripper will perhaps show consumer-level users whether the HEDT workstation is worth the extra price.
    As a 25-year pro game dev veteran, neither of these systems is what I would have picked for UE5 use: not enough RAM in either.
    FYI, last summer I built an Intel Xeon W7-2495X 24-core/48-thread, ASUS W790-ACE, 512GB DDR5 RDIMM, ASUS RTX 3090 24GB, Corsair 5000D, WD SN850X 4TB x2, etc., for UE5 use.
    It works great for UE5 and is much better than my previous game dev system, an AMD R9 5950X 16-core, ASUS ROG X570, 128GB DDR4, ASUS RTX 3090 24GB.
    UE5 wants more than 128GB of memory for such things as large landscape imports and building HLODs. 128GB + 24GB is the absolute minimum for any serious UE5 work.

  • @ye849 · 4 months ago · +1

    Why take a Threadripper to develop games? It would be slower and orders of magnitude more expensive with absolutely no pros.

  • @TheCastle2k · 4 months ago

    *Workstation PC: I make your games, but I'm not good at playing them!
    *Gaming PC: I play games, but I struggle to make them!
    Moral: we're only comfortable in our domains.

    • @moravianlion3108 · 3 months ago · +1

      Workstation PCs are pretty good at playing games; just don't expect excessive frame rates, that's all.

  • @A2theC · 4 months ago · +1

    The difference between CPU-bound games on either of these machines could easily be chalked up to games being more optimized for Intel chipsets than for AMD's multicore platform; this can be seen in many comparisons between the two brands, with many cores simply not being used at all on AMD systems.
    That being said, both are extreme overkill for most gaming, and you could 'live with' a significantly lower-spec CPU and almost certainly take less than a 5% hit to fps. How in hell do you get only 100 fps in Fortnite on a 4090? Running 4K with max ray tracing?
    For game production or rendering, I'd have figured an RTX 6000 Ada would be a better choice, having more VRAM to work with, but looking at some workload performance numbers, the 4090 somehow still outperforms the purpose-built card in 6/7 tasks.
    Way to go, Nvidia 🤦 "This card is the best for rendering, pls pay 6x as much for stability... also don't look at our other 'cheaper' card".

  • @chrisbullock6477 · 4 months ago · +2

    Guys, this is old news; most studios, even VFX studios, are using Threadripper machines. It's no big mystery, plus if your networking and servers are AMD Threadripper Pro based, it makes IT's job so much easier. Plus it's a lot more cost-efficient, on top of lower power and better overall performance, no matter the size of your team/company, from startup to international.

    • @deuswulf6193 · 4 months ago · +1

      Eh, that's not entirely true. From what I have seen, most (not all) VFX studio workstations are fairly generic "gaming"-style rigs, which also double as units in a larger render farm when the artist is not actively using them.

    • @chrisbullock6477 · 4 months ago · +1

      @@deuswulf6193 It's either "not entirely true" or relatable, depending on who's answering. I guess it's best to say "based on my experience in my area," etc.

    • @macabo · 4 months ago · +1

      People's time is the biggest expense so most large studios use the latest and greatest HEDT available. Big name studios also receive a lot of free or cheap hardware in exchange for marketing. ILM had a partnership with AMD and HP for a long time (may still) and with SGI before that.

    • @Vegemeister1 · 4 months ago

      The cost of electricity is infinitesimal compared to the salaries of the people using the computers. Nobody cares about workstation power usage, unless it's so high the HVAC can't keep up.

  • @MrBratkenSolov · 4 months ago

    Genius: let's take a workstation AMD CPU with tons of cores and compare it with a consumer Intel CPU with 8+16 cores.
    Very nice. Such fair.

  • @KenjiEspresso · 4 months ago

    How much does it cost?

  • @Marcelis · 4 months ago

    Damn, Will really let himself go since the slap😅😅

  • @lightweighth2609 · 1 month ago

    So buy a $4,000 CPU, plus the additional expense of a board that is equally more expensive, versus buying everybody a $1,500 GPU and saving roughly $3,000+ on the Intel setup, which is going to absolutely smash the CPU bake machines on light bake times... make that make sense.

  • @xpodx · 4 months ago

    For gaming, you were testing at 1080p, right?
    What about 4K, 5K, and 8K? What's the fps difference then? It should be very similar: GPU-bound.

  • @aiksjdijdemlfnewklfn7092 · 4 months ago

    I think it is fine, as programming and designing need a lot of brain power; those minutes let them refresh. They have to be relaxed or performance drops significantly. At some point they might accidentally mess things up a lot when their brain is fried. That is especially true for the ones born in 2000 or later. In those jobs even the psychological state of the person affects their work, especially for new people.
    I think, on average, making sure they work to the max will do more harm than good in the high-skill, high-thinking parts of IT and tech. Unless you make sure you hire experienced pros, which will cost more money, and they'll keep asking for a raise. I'm pretty sure they will cost far more for less, proportionally speaking, in the long run.

  • @chrisbullock6477 · 4 months ago · +1

    The people that know just know; outside of that, it just sounds like you're trying to pander to both sides without hurting the other company's feelings 🤷‍♂

  • @grtitann7425 · 4 months ago · +1

    And people wonder why the game runs like garbage on AMD GPUs...

  • @FirdausAmir · 4 months ago

    In my tests, Intel is better at compiling textures in Unreal.

  • @alpzepta · 3 months ago

    Only the i9-10980XE is true HEDT.

  • @yeahitsme4427 · 4 months ago · +1

    So he says the 13900 started to degrade after some time running at full load!! So the conclusion is easy: Intel 13th and 14th gen cannot be used for this kind of workload; they have flaws.

  • @MrMcnamex · 2 months ago

    One costs 10x more than the other, lol.

  • @shmookins · 4 months ago · +4

    Pointless 25-minute ad.
    I had to stop at the dumb line 'our custom build is faster than your custom build' when the site just uses off-the-shelf parts that you can get even cheaper elsewhere. But hey, you can pay extra for the logo to be in different colors! ...

  • @thesupremeginge · 4 months ago · +2

    Blame the pandemic for the VR game startup failing. A pandemic is a best-case scenario for VR. Maybe just admit that nobody really wants VR games.

    • @ApplePotato · 4 months ago · +3

      VR really didn't take off because no one wanted to be in a clunky headset for hours.

    • @MalachiConstant1980 · 4 months ago · +4

      Do you have any idea of what Will’s VR studio was doing? It seems like you don’t. He wasn’t just making a stationary VR shooter. Maybe just don’t throw stuff out without having at least some of the information.

    • @tlv8555 · 4 months ago · +2

      ​@@MalachiConstant1980 No one wants VR games

    • @notthatwillsmith · 4 months ago

      We made a tool that Cartoon Network and Adult Swim used to make cartoons live, in front of a live studio audience. We used VR hardware and software to do that, but no one wanted to do live shoots in 2020, and by the time things were normal again, Zaslav had fired everyone at CN and A|S that we worked with.

    • @thesupremeginge · 4 months ago

      @@MalachiConstant1980 You didn't grasp the concept: VR is dead.

  • @MOPO3JCg · 4 months ago · +3

    "Unreal Engine" is absolute crap.

  • @LeesChannel · 4 months ago · +1

    Any time I get to see Gordon is a pleasure. I hope you are doing well, sir! 🫡