The Ultimate Linux Gaming Trashcan

  • Published Nov 21, 2024

COMMENTS • 449

  • @ActionRetro  1 year ago +11

    Go to www.piavpn.com/ActionRetro to get 83% off Private Internet Access with 4 months free!

    • @chandlerbing7570  1 year ago

      Thank you S̶p̶o̶n̶s̶o̶r̶b̶l̶o̶c̶k̶ for blocking today's s̶p̶o̶n̶s̶o̶r̶

    • @dylanlindsay1993  1 year ago

      Please tell me that Sceptre monitor isn't the $75-80 one off Amazon! I had the $80 one for 7 days before I returned it and put the money toward a 165Hz 1080p LG UltraGear monitor that Walmart had on rollback for about $5 more than the Sceptre. I will never buy a Sceptre monitor again! No cap, my backup monitor, a 2008 Dell I got for $8 at Goodwill, had WAY BETTER viewing angles than that Sceptre. I had to sit dead center in front of it because the viewing angles were so bad, and that's why I returned it!

    • @BigKelvPark  9 months ago

      So if they don't keep any of your data - how do they bill you monthly?

  • @AutumnRain  1 year ago +586

    Well, I have to compliment the design a bit - you don't often see a computer where installing RAM makes you feel like you're replacing uranium fuel cells.

    • @raven4k998  1 year ago +16

      Well, he's making that trash can useful by running it without Apple's trash in the can 🤣🤣🤣

    • @ChairmanMeow1  7 months ago +4

      The design of the Cube was INCREDIBLE, but sadly it was form over function in the end. But turning this one over and pulling the CORE out - that felt nuclear as well.

    • @herkatron  3 months ago

      😂

  • @retropuffer2986  1 year ago +251

    Action Retro: "If you enjoyed taking heavily depreciated professional grade tech and using it entirely unprofessionally..."
    Me: Stares at maxed out NeXT Station Color Turbo...."Yes, yes I do."

    • @BrianMoore-uk6js  1 year ago +20

      What to do with a NeXT Station Color Turbo professional workstation in 1991? John Carmack: "let's create Doom on it."

    • @ordinosaurs  1 year ago +9

      My little LAN of SparcStation 10 (Debian), SparcStation 20 (NeXTStep 3.3) and Alpha Miata (Windows 2K ß) agrees...

    • @davel4030  5 months ago +1

      🤤😂

  • @alasdairlumsden670  1 year ago +197

    As someone who had his hands in many tens if not hundreds of Xeon class servers over a period of 15 or so years, I can say that the thermal compound looks totally normal and would have been working fine. It looks crusty as it solidifies with heat, so when you pull it off it leaves a textured appearance. But strapped to the CPU it would have been uniform. The difference in single thread performance is likely down to the larger 30MB cache on the E5-2697v2 vs 10MB of the E5-1620v2.

    • @ahmetrefikeryilmaz4432  1 year ago +8

      Geekbench is extremely touchy with its memory...

    • @alasdairlumsden670  1 year ago +8

      @@ahmetrefikeryilmaz4432 That actually raises a fair point - the test wasn't totally like for like as the memory was changed as well!

    • @ahmetrefikeryilmaz4432  1 year ago +6

      @@alasdairlumsden670 I was actually affirming your statement. Cache is still memory.
      I don't really think there should be a drastic difference between the old and the new RAM kits.

    • @mndlessdrwer  1 year ago

      Oof, I remember the v2 series of Xeon E series processors. They were good for the time, but I opted for the v4 version in my desktop build for a good reason. Better ongoing longevity. I've got a 20 core, 40 thread processor and it's rarely ever seeing serious utilization, particularly in Windows where thread allocation routines are kinda borked at the best of times.

    • @Pasi123  1 year ago +6

      For some reason his Geekbench 6 results on both CPUs are way lower than they should be. Comparing to CPUs that I've tested even a laptop Core2 Duo P8700 scored higher in single thread (366/550).
      A generation older Xeon E5-2690 8c/16t scored 724/4001 beating the faster E5-2697 v2 in both single and multi thread even though it shouldn't. And no, I'm not confusing Geekbench 6 with Geekbench 5 which has different scoring.

  • @betonmischer_86  1 year ago +94

    I'm sure someone has already mentioned this, but plugging the monitor directly into the eGPU is preferable; otherwise display data coming back from the GPU will take up some of the already very limited Thunderbolt 2 bandwidth.

    • @Cyba_IT  1 year ago +5

      That's what I was thinking, and I thought that was the default way to use eGPUs. It seems very inefficient not to do it that way unless you're just using the eGPU as some sort of cache or hardware accelerator or something.

  • @polocatfan  1 year ago +78

    When the graphics card is bigger than your computer, you're definitely doing something right.

    • @raven4k998  1 year ago +2

      Nope nope nope, you should fear for the CPU's safety then 🤣🤣🤣

  • @RonLaws  1 year ago +44

    Oh no, he's installing the Nvidia driver from Nvidia instead of the GPU repo :(

  • @ok-tr1nw  1 year ago +56

    The first law of Nvidia on Linux is to never use the Nvidia installer; on Ubuntu, use the graphics-drivers PPA (sketch after this thread).

    • @cnr_0778  1 year ago +6

      This.

    • @roccociccone597  1 year ago +10

      I mean the best rule of thumb is to just not use nvidia to begin with. The hours I've wasted fixing broken kernel updates thanks to nvidia's incompetence left me scarred for decades to come.

    • @cnr_0778  1 year ago

      This. If you have a choice you should just not use nVidia to begin with.@@roccociccone597

    • @akshat8586  3 months ago

      @@roccociccone597 Never ever had any issue with Nvidia Drivers, instead AMD was causing problems for me.

    • @Katky1  3 months ago

      @@akshat8586 were you on windows

  • @Astravall  1 year ago +52

    The 4070 only has a single eight-pin connector, which can supply a maximum of 150 watts... plus 75 watts max from the PCIe slot, so it cannot draw more than 225 watts. The extra 650-watt power supply is overkill for both graphics cards IMHO.

    • @juanignacioaschura9437  1 year ago +3

      The problem is not the TBP for the nVIDIA Cards, but rather the transient power spikes they have.

    • @nated4wgy  1 year ago

      @@juanignacioaschura9437 This is not an issue on Ada Lovelace cards. It was with Ampere, but not anymore. See Gamers Nexus video reviews of the 40-series cards; it was one of the first things they tested.

    • @Objectorbit  1 year ago +12

      @@juanignacioaschura9437 The 4000 series lessened the issues with those spikes. A 400 watt PSU would have been fine for powering nothing but the GPU

    • @Madrrrrrrrrrrr  2 months ago +1

      Yep, 650W is for a whole system, erring on the safe side. But I would bet the eGPU works in OS X with the AMD card.

  • @spoopyangie  1 year ago +32

    You're using 22.04 LTS. Its kernel predates RDNA2. That might be a (not the) problem... (see the kernel note after this thread)

    • @JakeR0bH  1 year ago +1

      Won't the linux-firmware package contain up-to-date firmware for AMD GPUs? I've been running an RX 6700 XT from 22.04 to 23.04 and it's been working fine.

    • @spoopyangie  1 year ago +1

      @@JakeR0bH Yeah, 23.04 should be fine. But 22?

    • @egsuslo  1 year ago +1

      Kernel 6.2 (which is used in the latest 22.04) supports RDNA2. My rx 6700xt runs just fine on latest Ubuntu 22.04.

    • @spoopyangie  1 year ago +1

      @@egsuslo 22.04.3 LTS does indeed use 6.2. I stand corrected 😅
      Besides blacklisting the old radeon driver to force the amdgpu module, I don't really see a solution, or the actual problem...
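
    If the stock 22.04 kernel does turn out to be the blocker, the usual fix (a sketch, assuming a standard Ubuntu 22.04 install) is the HWE stack rather than a self-compiled kernel:

      uname -r                                                        # check the running kernel
      sudo apt install --install-recommends linux-generic-hwe-22.04
      sudo reboot                                                     # boots the newer HWE kernel with current amdgpu support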

  • @Mitch3D  1 year ago +5

    I like that these systems are becoming so cheap; it's good to know they're actually a great Windows gaming PC for the price.

  • @arranmc182  1 year ago +12

    When using Thunderbolt 2, if you ask it to send the image back to be output over the built-in graphics on the computer, you are throwing away performance. If you want the best performance possible, run a monitor directly from the external GPU; that way you're not chucking away bandwidth sending too much data over the slow Thunderbolt 2 link.

  • @jobalisk6649  1 year ago +3

    So, RAM speed is only reduced if you install 1866MHz RAM. I have 128GB of 1066MHz RAM in mine and there's no reduction in speed with my 12-core CPU.

  • @MR.Peanut2  1 year ago +30

    I thought you got your hands on the leaked Xbox

    • @MateoThePro  1 year ago +6

      Xbox Series X Digital Edition 😂😂😂

    • @TechTonic420  1 year ago +2

      It had to run some sort of Windows and some Nvidia GPU in order to qualify as an Xbox 😂. Hold on.....
      If you installed the lightest-ever Linux distro that could emulate the Xbox, it would be an even better Xbox.

    • @ok-tr1nw  1 year ago +1

      @@TechTonic420 Funny you say Nvidia when the only Xbox with an Nvidia GPU is the very first one.

    • @poggerfrogger9327  1 year ago +2

      Xbox 720

  • @sappy.7z  1 year ago +9

    bro refuses to install mint 😭😭

  • @bryans8656  1 year ago +8

    I did this exact same upgrade (except for the video card) when I had my Pro 13 a few years ago. I liked the challenge of the CPU replacement.

  • @nmihaylove  1 year ago +7

    Can you pass the eGPU through to a Windows VM? That sounds more in the spirit of this channel than running Windows natively.

  • @Somelucky  1 year ago +5

    After having watched several Jeff G videos about getting these big gaming cards working over external interfaces, it was pretty clear you were in for a challenge. Glad to see that the integrated video on the can works well in Windows at least.

  • @lucasjones6295  1 year ago +4

    Just a quick tip: the GPU power supply requirement assumes you have one power supply powering the entire computer, so the 400W unit would be fine for any GPU except maybe an x900-class AMD GPU or an xx90-class Nvidia GPU, assuming the GPU's max power draw is less than the 400 watts the power supply can provide.

  • @majinshinsa  1 year ago +6

    That plot twist was unexpected but wicked cool

  • @mausmalone  1 year ago +18

    This kind of puts some cold water on my idea of setting up an eGPU for my linux laptop so I could theoretically game when I dock it at home. It seems like the sort of project where I could spend a boat load of money and still come up with absolutely nothing in the end.

    • @RockyPeroxide  1 year ago +2

      I think you can get your money back if you decide it's not for you within 2 weeks or something.
      At least, that's the consumer protection law here for online shopping.
      It should be able to work on Linux, but then again, I once tried to do surround sound using different sound cards; that should also have worked, but I couldn't get it going.

    • @Brant92M  1 year ago +4

      It works much better when you use normal hardware

    • @archlinuxrussian  1 year ago

      It depends on the hardware and how it's actually hooked up. I'd recommend looking around on Phoronix or another Linux-centric forum (gamingonlinux maybe? or /r/linux?) for information on specific hardware. The setup in this video was quite esoteric and niche, so I wouldn't take it as indicative of all eGPU setups. And I have some reservations regarding his use of "AMDGPU drivers" and how the monitor was hooked up upon first trying the eGPU...

    • @CFWhitman  1 year ago +10

      I wouldn't read too much into this experiment unless you are trying to use a Thunderbolt 3 eGPU with a Thunderbolt 2 device.

    • @tarajoe07  1 year ago +2

      USB-C ones work great

  • @Astravall  1 year ago +14

    Wait... why do you need a driver for the AMD 6750 XT graphics card? OK, it might be different over Thunderbolt, but the Linux kernel should support the card out of the box. Which Ubuntu and kernel version did you use? I'm certain a recent kernel will support it.
    And I'm pretty sure 650 watts is recommended for a complete system with CPU etc. A Radeon 6750 XT only draws 250 watts maximum. Sure, there might be spikes that are higher, but 400 watts should be enough for powering the card alone.

    • @CommodoreFan64  1 year ago +1

      I have an ASRock AMD RX 6650 XT 8GB GPU on my Erying Intel 12900H MATX mobo with 32GB of DDR4-3200 RAM, and running Manjaro GNOME in traditional desktop mode on the latest kernels it works just fine, with DP-to-HDMI adapters driving two of the same 1080p 75Hz Sceptre monitors he showed in the video. Even Steam works just fine for the most part.
      Having said that, I stopped using Ubuntu and Ubuntu-based distros a few years back when they started doing more and more of their own BS, like not shipping the latest kernels, trying to kill 32-bit libs, Snaps, etc. That's part of why companies like Valve moved away from Debian/Ubuntu as the base for SteamOS and moved to an Arch base. Maybe he should just try installing Manjaro GNOME with the latest updates and newest kernel and see if that doesn't help solve some things (not sure it will for the internal dual GPUs on that trash can, but worth a shot).

    • @RealOny  1 year ago +4

      @@CommodoreFan64 I think he used Ubuntu 22.04.3, which runs an over-a-year-old kernel; that would explain all the issues. Maybe switch to a bleeding-edge distro like Arch, or any distro offering a recent kernel.

    • @CommodoreFan64  1 year ago

      @@RealOny I'm personally not a huge fan of Pop!_OS, or of distros based on Ubuntu in general, because once Ubuntu makes major changes that are bad for the community, many of the distros based on it just follow along like sheep. Rolling pure Arch on something like the Apple trash can has its own set of headaches, which is why I suggest Manjaro GNOME as a balance: it still has the latest kernels and is based on Arch, but packages are somewhat more vetted before being pushed to the main repos, and it's easy to set up and get going fast. Plus the Manjaro team has been working really hard to make Manjaro more stable, with far, far fewer breakages than in the past.

    • @MashonDev  5 months ago

      Just remember to install Mesa and Vulkan!

  • @jwoody8815  1 year ago +2

    IDK, I'm not really an Apple guy, but that "trashcan", upgraded, would probably make a damned good Steam box; it even kinda looks the part of some unknown modern game console.
    Wouldn't mind having one myself to tinker with.
    FYI, a 2060 to 3050 would have made a pretty competent gaming setup.

  • @JARVIS1187  1 year ago +2

    Had a 970 EVO in my 2019 iMac and was SO disappointed. macOS had trouble with TRIM, so every restart was a test of patience - it took about 5 minutes each time. Booting into Windows via Boot Camp was no problem; it was just macOS.
    Just something to consider when you swap back to macOS (maybe, possibly, eventually) and notice the same: it is the SSD.

  • @coyote_den  1 year ago +2

    I love the design of the trashcan Mac Pro... but I compared the maxed-out Geekbench scores to the M1 Pro I'm on right now, and good god.

  • @DigitalDiabloUK  1 year ago +6

    I've had a similar fight with a trashcan trying to work with eGPUs. The problem was that Windows often wouldn't boot with the Thunderbolt card connected, and I couldn't get the OS X patches to support a modern eGPU. I might have another go as the weather turns colder and having a space heater would be useful in the office 😂

  • @whophd  1 year ago +1

    Things I've done: 1. Upgraded my trashcan to 12-core (with a lot of help, but it felt like open-heart surgery!). 2. Used Windows on a trashcan for gaming. 3. Used a 6900 XT eGPU with a massive 750W power supply on a 2018 Mac mini for AAA gaming - if you can call MSFS 2020 a AAA title? It sure needs the processing power. But what I've never tried is to grab my TB2-to-TB3 adapter and connect the eGPU to my trashcan. I've seen a few articles about it though - you need to add special support to get an eGPU working over TB2. Read up online first! And that's before you even go beyond macOS.

  • @philtkaswahl2124  1 year ago +1

    An Intel Apple product with Linux (then Windows) installed on it, an AMD GPU (then an NVidia one, then stock), and a controller for a Microsoft console.
    Gotta love the mishmash.

  • @seshpenguin  1 year ago +2

    I daily drove Arch on my MBP with an eGPU for a while, though it was a whole thing to set up. What I learned was two things:
    1. You cannot hotplug the GPU; it needs to be there when the system boots.
    2. You need to tell the window manager to use the GPU as the primary graphics device (in KDE/KWin, I had to set the KWIN_DRM_DEVICES environment variable to /dev/dri/card1). Then after logging in, KDE would spring to life on the eGPU. (There's a sketch of this after the thread.)

    • @archlinuxrussian  1 year ago +2

      Also, at least in this case, I have a hunch that plugging the monitor directly into the eGPU may have made a difference? Also a bit concerned when he mentioned "AMDGPU drivers"...not sure if he meant the proprietary ones or not (which I'd almost never recommend).
      Also, nice profile picture :)

    • @seshpenguin  1 year ago +2

      @@archlinuxrussian Yea, I don't think I ever got display loopback to work, which was fine because that would decrease performance anyway (especially on TB2 bandwidth on the Mac Pro).
      Also thanks :P
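
    A minimal sketch of the KWin setup @seshpenguin describes above; the card1/card0 ordering is an assumption - check /dev/dri/by-path to see which node is actually the eGPU:

      # /etc/environment - list the eGPU first so KWin treats it as the primary device
      KWIN_DRM_DEVICES=/dev/dri/card1:/dev/dri/card0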

  • @MediocreTCG  1 year ago +2

    Been thinking of buying a trashcan Mac. It'd only be my second Mac ever after my 07 polycarb, but those little turds always spoke to me.

  • @karlc11  1 year ago

    Just picked one of these up for £100 and I'm stoked. It's much faster than my 3770K and 7970 GHz Edition, and the size and quietness are fantastic. I've always been an Apple hater, but the aesthetics of this thing are great, and so is the build quality.

  • @AndromedasCartoon  7 months ago +1

    For those interested in playing more modern games on a viable low-to-mid-end computer (as of 2024), I'd recommend getting the version with the 2697v2 as stock, as well as the dual FirePro D700s. Oh, and you may as well put 128GB of RAM in there.
    Boom! You've got a decent CPU (12c/24t, 3.5GHz), a decent GPU setup (7 teraflops, 12GB of VRAM), and a very high-RAM setup at 128GB!

  • @GaiusIuliusCaesar1  1 year ago +6

    I would suspect there is a weird PCIe 4 to PCIe 3/2 mismatch. I've seen that happen with PCIe 4 compatible cards running on a PCIe 3 riser. I had to force the primary PCIe x16 slot to run at PCIe 3 (the default is auto) in the BIOS. Don't know if that can be done for the eGPU enclosure.

  • @madson-web  1 year ago +2

    I was watching and suggesting Windows in my mind, and then you did it. It works well for hardware of that generation.

  • @globalvillagedrunk  1 year ago +5

    Loving this series. I did the same CPU upgrade with a cheap Mac Pro off eBay recently. OpenCore Sonoma guide next?

  • @eDoc2020  1 year ago +1

    TLDR: If your software uses fewer than 9 cores, the E5-2667v2 is a better CPU than the 2697v2.
    The E5-2697v2 is the highest-end CPU for the platform but is not the fastest for single-core performance. It only turbos to 3.5GHz (3.0 for all cores), so for better single-threaded performance you'd want the E5-2667v2, which turbos up to 4.0GHz, or 3.6GHz on all cores. I made a spreadsheet, and the 2667v2 is clocked about 14-20% faster across the board as long as you have fewer than 8 active cores. For 9 active cores the 2697 is clocked 6% slower overall (core speed * core count) but has a bit more cache, so it probably comes out near equal. For 10, 11, and 12 cores the 2697v2 is 4, 14, and 25% faster.
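
    Rough all-core arithmetic from the clocks quoted above, as a sanity check on that last figure:

      E5-2667 v2:  8 cores x 3.6 GHz ≈ 28.8 GHz of aggregate clock
      E5-2697 v2: 12 cores x 3.0 GHz = 36.0 GHz of aggregate clock   (36.0 / 28.8 ≈ 1.25, i.e. the 25% gap at 12 active cores)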

  • @tralphstreet  1 year ago +11

    I would have gone with Arch, or some other rolling release distro. Having the latest packages really helps when it comes to gaming. Also, why not plug the display to the eGPU directly?

    • @serqetry  1 year ago +8

      Yes, Ubuntu is a terrible distribution to be doing this kind of stuff with.

    • @peppefailla1630  1 year ago +2

      Fedora is a rolling release

    • @peppefailla1630  1 year ago +1

      @@initial_kd The situation got better over time. Nowadays Arch even has a TUI installer; even like 3 years ago that would have been CRAZY.

    • @Motolav  1 year ago +1

      @@peppefailla1630 Fedora isn't entirely rolling release; it's a hybrid distro.

  • @arthurswart4436  1 year ago +3

    The firmware probably doesn't support the 6750xt. Would be nice to try again with a 1080ti.

  • @cogspace  9 months ago

    Used one of these as a workstation at my job in like 2014. Always liked the design even if it was hilariously impractical in many ways.

  • @u0aol1  1 year ago +1

    Looks like the kind of design a technician would have nightmares about. Looks gorgeous though

  • @VIRACYTV  1 year ago +2

    Someone needs to make translucent shells for this. Show the beautiful guts it has

  • @icanrunat3200mhz  1 year ago +4

    I used to be an "old school [K]Ubuntu fanboy" as well, until I tried OpenSUSE. It just seems that little bit more polished. Plus, YaST is super slick for managing things that would otherwise be kinda tedious.

    • @joe--cool  1 year ago +3

      Basically anything other than Ubuntu is much slicker, faster and smaller. I still cannot fathom why Canonical still clings to their failed snaps.

    • @WyvernDotRed  1 year ago +3

      Back before the move to Snaps, I got into Linux with Ubuntu MATE and was happy enough with the OS not to distro-hop further.
      But as Snaps got rolled out, the modern laptop I ran it on got slow at both system and application startup.
      That led me to hop to Manjaro, and after getting annoyed with their bad management of the distro I somehow ended up moving to Garuda Linux.
      It has its (significant) problems too, but it's a great OS for me and I'm sticking with it long-term, as I like Arch's ease of installing up-to-date obscure software.
      After trying it on an older laptop, openSUSE has become my backup distro, as I can replicate all the setup I like, including most of the obscure stuff, with personal repos.
      openSUSE Tumbleweed is significantly more stable than Arch, both in terms of package churn and breakage on updates, but I'm not changing a system I'm happy with.

  • @CharlesRedstone  1 year ago +8

    I believe CrossFire is the reason this Mac performs better graphically on Windows; if Apple had enabled CrossFire on macOS instead of leaving the decision to developers, the Mac Pro would've been a little more powerful. And yes, Linux to this day is still a mess with drivers and components (especially gaming ones). I would suggest you try again with some debloated Windows version (like tiny10 or Ghost Spectre) just to see how much of a performance upgrade you can get. By the way, great video - it was exactly what I would do if I bought this machine.

    • @BridgetSculpts  1 year ago +3

      Both GPUs work great in tandem in Blender for Cycles rendering!
      The Unreal Engine editor does run on the second GPU too, but only if you set it to, which helps with performance.
      But yeah, it would be nice if everything worked with both GPUs. I have not found a single game that does.

  • @complexacious  1 year ago +2

    I wonder if the reason the 6750 is failing is because you're using it in a PCIe3 capable Thunderbolt 3 enclosure so it attempts to operate at PCIe3 specs, but then you've got the thunderbolt 2 to 3 adapter which can't tell the card that the link is only PCIe2 capable. People ran into these kinds of problems with PCIe3 risers on PCIe4 capable motherboards. The card would see the mobo was capable of 4 and try to use it, not knowing that there was a non-compliant riser in the way. You had to tell the motherboard to only use PCIe3 signalling but you had a chicken and egg problem.
    The 1050Ti maybe worked because it was old enough that it was only trying to use a slower specification. Maybe it's only 8x wired? Maybe some quirk made it decide to be a PCIe2 card?
    eGPUs are always fiddly.
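
    One way to sanity-check that theory from Linux (the 0c:00.0 address is a placeholder; find the card's address with lspci | grep -i vga):

      sudo lspci -vv -s 0c:00.0 | grep -E 'LnkCap|LnkSta'   # compare the advertised vs. negotiated PCIe speed and width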

  • @JapanPop  1 year ago +6

    You inspired me to install Ubuntu on an old Dell machine with an Nvidia 1050ti GPU. Problem is, I still can’t figure out how to get Steam to work. Nevertheless, I will prevail!

    • @RukaIXR  1 year ago +2

      Good luck! I hope it goes well for you!

  • @Francois_L_7933  1 year ago +1

    It's funny because every time you remove the case on this one I get the same feeling I had when I saw Darth Vader remove his helmet in The Empire Strikes Back..

  • @nrg753  1 year ago +3

    The interesting thing about those Universal Blue images (like Bazzite) is that you can switch to any other image with a single command (example after this thread). Or switch to another ostree-based OS like the official Silverblue or Kinoite. You don't even lose your data or Flatpak apps.

    • @Cyba_IT  1 year ago +1

      A SteamOS-like OS would be pretty cool on the can. It would limit its functionality somewhat, but it would be pretty cool next to the TV just for gaming.
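
    The switch itself is a single rebase on ostree-based systems; the image reference below is illustrative - check the Universal Blue docs for the exact one you want:

      rpm-ostree rebase ostree-image-signed:docker://ghcr.io/ublue-os/bazzite:stable
      systemctl reboot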

  • @thavith  1 year ago +2

    I was really hoping the Linux only way would work. I am sure there are some Linux guys out there that have done this.

  • @goodasdead4303  1 year ago +2

    I've told people for years that these weren't bad computers; people were just expecting way too much from a company like Apple.

    • @CommodoreFan64  1 year ago

      That's true, but sadly so many people fall for Apple's marketing wank, and get caught up in the hype thus leading to unrealistic expectations.

    • @BridgetSculpts  1 year ago

      I think he means the toxic PC gamers, not people 'disappointed' with their $3k purchase. Most people who bought this computer at the time were pretty happy with it. I remember when it came out, and all the videos of people trying to build a Hackintosh with the same specs for the same price - it wasn't possible, and it usually came out more expensive.
      If you bought this computer back in the day, I'm sure you could justify it if it was for work. @@CommodoreFan64

  • @robsquared2  1 year ago +3

    Remember, any trash can can be a gaming trash can if you use it as a basketball hoop.

  • @VeronicaExplains  1 year ago +1

    Me: [starts shopping for a trashcan]

  • @fwah23  5 months ago

    Your videos are allowing me to get on with my life while protecting me from janking up my life too hard with my ADHD brain. Thank you

  • @hermean  1 year ago +1

    I wanna see gaming on a ThinkPad with an eGPU, myself. I got it to work on my x200, exactly once. The ultimate Linux gamer experience, surely.

  • @jaredwright5917  1 year ago +1

    I remember when the trash can came out and was always wondering how the hell it was built, given its weird shape. I thought they might have built the whole thing on a large flexible PCB and rolled it up to fit the case.

  • @beenine5557  1 year ago

    Too bad about the plot-twist ending after all that work, but hey! You've saved a bunch of Linux geeks a lot of time by letting us know what _doesn't_ work.

  • @DouglasWalrath  1 year ago +1

    It showing up as "NVIDIA Corporation Device" is just because the system doesn't have the updated PCI IDs for it. This won't prevent the card from working, since the name is entirely cosmetic and for the user.
    If you want it to show the actual name of the device, run sudo update-pciids and it'll fetch the latest PCI IDs from the internet.
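
    For example (both commands ship with pciutils on most distros):

      lspci -nn | grep -i vga   # shows an unnamed "NVIDIA Corporation Device xxxx" entry with its raw IDs
      sudo update-pciids        # refreshes the local pci.ids database from the internet
      lspci -nn | grep -i vga   # same card, now listed under its marketing name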

  • @jjohnson71958  1 year ago +1

    ican needs a pc case fan mod

  • @CodingItWrong  1 year ago +1

    The constant references to Spindler are especially hilarious with reference to a 2013 machine😂

  • @semperfidelis316  1 year ago +2

    That CPU swap has been my favorite one to do. Yes it's a crap ton of work and shouldn't be, but I guess I'm weird and like computers like that. I let my kid drill a hole in the old cpu and put a key ring on it. He thinks it's so cool.

  • @gregzambo6693  1 year ago +2

    It would look great with a clear cover.

  • @btarg1  1 year ago +2

    I am very surprised that this machine has an M.2 slot; PCs didn't have those as a common standard for another couple of years.

    • @JosephHalder  1 year ago +3

      It's not exactly an M.2 slot; it's proprietary, but still PCIe like M.2, which is why it can be adapted.

  • @arseniysemin1361  3 months ago

    If I recall correctly from my Mac days: the eGPU will only function if the display is connected to the GPU - you can actually see that your fans are not spinning, which means no signal (and I may be wrong, but not all docks are supported; you can dig through macOS eGPU forums for more info). Also, never install Nvidia drivers from the package on Nvidia's website; in my 5 years of Linux it never worked once, especially on Ubuntu (apt install should do it). Trashcans are fun - I'm hunting for one to use as a homelab.

  • @thedefectivememe  11 months ago +1

    Keep the Linux dream alive and maybe try using KVM! You could do something tricky by dedicating the integrated GPU to video output on the Linux host and then passing the Nvidia GPU through to the Windows VM.
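
    A very rough sketch of that passthrough setup; the 10de:xxxx values are placeholders for the card's vendor:device IDs from lspci -nn, and the exact steps vary a lot by machine:

      # /etc/default/grub - enable the IOMMU and reserve the Nvidia GPU (and its audio function) for vfio-pci
      GRUB_CMDLINE_LINUX_DEFAULT="quiet splash intel_iommu=on iommu=pt vfio-pci.ids=10de:xxxx,10de:yyyy"

      sudo update-grub && sudo reboot
      # then attach the reserved GPU to the Windows VM as a PCI host device (e.g. in virt-manager)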

  • @macbuff81  10 months ago

    Yes, I did the upgrade to 128GB, and yes, it reduces the RAM speed from 1866MHz down to 1066MHz. The issue here is not the memory but a technical limitation of the Xeon processor's memory controller. However, that is still a lot faster than any swap file, even one on an NVMe storage medium. Everything pretty much runs off memory on my 2013 trash can at this point. I also installed an AngelBoard, which allows me to install regular M.2 drives.
    I also have an eGPU; however, the newest macOS (Mojave) won't allow the use of it. Thanks, Apple. They say it's because of stability issues, which is BS of course. Yes, you won't get the full bandwidth the card is capable of (in my case, it's a 5700 XT) due to the Thunderbolt 2 port being capable of only 20Gb/s vs. TB3, which can do double that. However, it could still work with the TB3-to-TB2 adapter from Apple. Apple simply wants you to buy a new system (their standard strategy). There is a workaround for this which does not require turning off SIP (System Integrity Protection). It's called "Kryptonite." It is still a bit of a hassle though.

  • @Arachnoid_of_the_underverse  1 year ago +2

    I think it looks great without a cover - real R2-D2-ish. Maybe you could make a clear perspex cover for it.

    • @beenine5557  1 year ago +1

      Exactly what I was thinking! Except in my mind, this was clearly an Imperial darkside astromech, given the all black color-scheme.

  • @jangosch8647  6 months ago

    If you have two GPUs, the standard setting for the Nvidia driver is "On demand". The desktop then always uses the integrated GPU, and the discrete GPU is only used when needed or when you tell the system to use it. Ubuntu has a "Use discrete graphics" option when you launch programs. Actually, you can run the Steam launcher on the integrated GPU and the games will run on the discrete one, because most games wouldn't run on the integrated GPU anyway. At least this is how it works on my Lenovo with a discrete Nvidia card.
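
    On that kind of PRIME "On demand" setup, forcing a specific program onto the discrete GPU from a terminal or a Steam launch option looks roughly like this (standard Nvidia render-offload variables):

      __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"   # quick check of which GPU renders
      __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%                          # as a Steam launch option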

  • @TylerComptonShow  1 year ago +2

    In my experience, installing the Linux drivers directly from Nvidia causes nothing but pain and suffering. In my very early days of using Linux, I ruined a couple installs trying to do that!

  • @c.n.crowther438  9 months ago

    If that shroud were clear with a bit of lighting, that thing would look cool af.

  • @svrsakura  1 year ago

    The difference between a pro and an amateur is that pros get paid... You, sir, are a pro!

  • @gumbyx84  1 year ago +1

    I love these crazy videos. If I had the extra cash, I'd be very tempted to pick up my own trashcan Mac

  • @ahmetrefikeryilmaz4432  1 year ago +1

    Dude, the PSU in that external whatever is perfectly fine for anything less than a 3090...

  • @FavoritoHJS  1 year ago

    About bus speeds: if memory serves, capacity by itself doesn't limit clock rates, but 4 sticks of memory tend to run at lower clock rates than 2 due to bus loading, and larger-capacity sticks might have lower maximum clock rates by themselves.
    Of course, Apple could also be mucking things up by automatically setting the speed based on RAM count...

  • @hrbt78  1 year ago +1

    The 400W power supply gets the RX 6750 XT running without any problems. The recommended 600W is for the entire system; the card itself consumes 250W. But still a great project. I actually don't think the Mac is that ugly.

  • @Cyba_IT  1 year ago +1

    I would love to have one of these simply because they are very cool computers and they actually run Windows very well. Problem is a reasonably well specced one is still around $900 ($1.4K here in New Zealand) which is a lot for a PC just to have as an ornament! I would use it of course but just for messing around on.

  • @yjk_ch  1 year ago +2

    12:16 Depending on how old the kernel is in the distro you've chosen, the amdgpu driver in the kernel may not support your GPU.
    I couldn't get Debian working with my RX 6600 XT, for instance, because the kernel was too old and didn't even attempt to initialize the card (not a single dmesg line). Maybe recent Debian works fine, but idk.
    I could have compiled the latest Linux kernel myself, but I didn't want to go that route.

  • @jbettcher1  1 year ago +3

    It's probably been said, but that RTX 4070 only draws like 200W. I know you probably were just addicted to the jank, I get it.

    • @fuelvolts  1 year ago +2

      I was so confused by what he was doing. I don’t know of any consumer GPU that wouldn’t work with 400w when that PSU is only powering the GPU. No need for a janky second PSU.

  • @kevinmckenna5682  1 year ago +2

    I would love to see you benchmark a bunch of new-ish games on that Win10 trashcan system.

  • @dustinschings7042  1 year ago +4

    The PSU in the external GPU should have been more than enough to run any of the GPUs you tried. 400W is plenty for almost any GPU out there. Yes, the AMD card recommended a 650W PSU, but it also is accounting for you having a whole PC powered by the PSU as well, not just the GPU. If you could not get almost anything to work with only the PSU inside the enclosure, then I would guess that the 400W PSU is simply not working properly.

  • @Astinsan  1 year ago

    8:38 ice trays.. what a great idea

  • @notjustforhackers4252  1 year ago +5

    The RX 6750 XT works plug and play on a Linux 5.17 kernel paired with Mesa 22.2. I'm currently using the same card on Fedora 38 with a 6.4 kernel and the latest version of Mesa. There was absolutely no reason that card shouldn't have just fired up on the version of Ubuntu you were using. I can only imagine you were trying to install the AMD Pro drivers - if so, why? As for Nvidia, why the hell are you downloading drivers directly from Nvidia? NEVER, EVER download GPU drivers from a website on Linux; use your distro's package or driver manager. Your video doesn't make it clear exactly what you did, but it looks like user error to me, sorry. Can you clarify what you did, please?

  • @ran2wild370  1 year ago +2

    LOL, r/linux is going to fire plasma torches from all exhaust orifices. :-))

  • @jgaming2069  1 year ago

    I like how you integrated the ad into the theme of the video.

  • @rog2224  1 year ago +2

    Without its case, the Trashcan looks like something from the Death Star.

  • @brett9000  1 year ago +2

    You did not need a second PSU, as the recommended power supply rating is for a whole PC, not just the GPU. Also, Fallout 4 isn't a very good test, as it has a 60FPS cap and can run on weak GPUs; I used to play it at ultra settings on a GTX 960M, and the 960M is a laptop GPU.

  • @HikingFeral  1 year ago +2

    Linux is so good for gaming now that I hardly ever log into my Windows partition. I need it for university work because their software runs like rubbish in a VM, but it's been like 2 months since I logged into Windows, and I have been gaming, being productive, and watching films and videos every day. I would recommend an Arch-based distro for gaming because of access to better GPU drivers and the latest programs, but Ubuntu-based systems are fine. Debian's packages are a little too old for proper gaming in my opinion, but vintage/retro-style gaming is great on Debian. I am on the 535 Nvidia drivers on Arch, but anything past 525 on Ubuntu is really unstable.

  • @tassoss13  1 year ago +1

    Please don't use Ubuntu for these experiments. I really recommend retrying with a different distro that is not based on Ubuntu.

  • @hahohihi-zm7ou  1 year ago +1

    Try connecting the GPU through the M.2 adapter interface. There are adapters for this, e.g. M.2-to-PCIe or M.2-to-OCuLink.

  • @UpLateGeek  1 year ago +4

    I'm wondering whether you'd have more luck running it through a native TB2 EGPU (correctly pronounced egg-poo) enclosure, rather than using a TB3 unit with the TB2 adapter?

  • @jaimeduncan6167  1 year ago +1

    Sometimes it just doesn't work. Loved the video. I wonder if Apple built a machine like this today: it could have 3 M2 Ultras and a bus, and it would be easier to clean than the Studio (which I love, don't get me wrong). Simply put, the thermals were not there. It was evident to me, but for some reason it was not to Apple.

  • @ronsonwagner9401  3 months ago

    This was really entertaining to watch. I couldn't help but wonder about that power supply. It looked like it might be a pretty standard ATX power supply in that eGPU enclosure. If so, it might be worth it to swap the psu out for a stronger one just in case you need that overhead in the future.

  • @tk421dr  1 year ago +2

    Installing RAM on this is like swapping isolinear chips on an LCARS terminal. Kinda want one just for that; it could be a neat home server.

  • @NdxtremePro  7 months ago

    It's accurate; that is a limitation of the CPU/RAM design.

  • @hattree  1 year ago

    If you've ever seen Auralnauts Star Wars ...."Thank you, Magic trashcan."

  • @dmug  1 year ago

    Good vid, just wanted you to know though there’s a 60Hz hum in the background audio.

  • @MacCrafter707  2 months ago

    My hand cramped the instant you showed the mouse. LOL

  • @maillouski  1 year ago

    Sean, it's really funny to see your beard grow throughout the video, showing how long and hard it was to get to something working :P

  • @annieworroll4373  6 months ago

    Neat, a 970 EVO. I've got two 1TB units: one in my desktop and one that replaced the 256GB Gen 2 SSD my laptop came with.

  • @BRBTechTalk  1 year ago

    18:53 Bravo ha, I love your comment and I agree with you 100%

  • @N30Dr4g0n  1 year ago +1

    I find it hard to believe that Ubuntu couldn't get that Radeon 6750 XT working. I have Linux Mint on my main rig with an Asus TUF Gaming 6800 OC, and Mint recognized it right off the bat and offered to install the correct driver. It has been running like a dream ever since.
    More than likely it is Apple's fault, not Ubuntu's. Apple is notorious for setting up their systems so they don't work with certain third-party GPUs. However, Apple used to use Nvidia for their GPUs and for a long time never removed support for Nvidia cards even though they stopped using them. So in the end this doesn't surprise me at all, as I have seen it happen with several older Apple systems that I and others have tried to upgrade.

    • @chitan1362  11 months ago

      I wonder if the amdgpu driver ever works through thunderbolt at all.

  • @JonneBackhaus  1 year ago

    "Old school" and "Ubuntu" in the same sentence made me chuckle.