Go to www.piavpn.com/ActionRetro to get 83% off Private Internet Access with 4 months free!
Thank you S̶p̶o̶n̶s̶o̶r̶b̶l̶o̶c̶k̶ for blocking today's s̶p̶o̶n̶s̶o̶r̶
Plz tell me that Sceptre monitor ain't the $75-80 one off Amazon! I had the $80 one for 7 days before I returned it and used the money to buy a 165Hz 1080p LG UltraGear monitor that Walmart had on rollback for like $5 more than the Sceptre! I will never buy a Sceptre monitor ever again! No cap, my backup monitor, a Dell from 2008 that I got for $8 at Goodwill, had WAY BETTER viewing angles than that Sceptre! I had to sit dead center in front of the Sceptre because the viewing angles were so bad, and that's why I returned it!
So if they don't keep any of your data - how do they bill you monthly?
well i have to compliment the design a bit - you don't often see a computer where installing ram makes you feel like you're replacing uranium fuel cells
well, he's making that trash can useful by running it without Apple trash in the can 🤣🤣🤣
The design of the Cube was INCREDIBLE, but sadly it was form over function in the end. Turning it over and pulling the CORE out felt nuclear as well.
😂
Action Retro: "If you enjoyed taking heavily depreciated professional grade tech and using it entirely unprofessionally..."
Me: Stares at maxed out NeXT Station Color Turbo...."Yes, yes I do."
What to do with a NeXT Station Color Turbo professional workstation in 1991? John Carmack: "let's create Doom on it."
My little LAN of SparcStation 10 (Debian), SparcStation 20 (NeXTStep 3.3) and Alpha Miata (Windows 2K ß) agrees...
🤤😂
As someone who had his hands in many tens if not hundreds of Xeon class servers over a period of 15 or so years, I can say that the thermal compound looks totally normal and would have been working fine. It looks crusty as it solidifies with heat, so when you pull it off it leaves a textured appearance. But strapped to the CPU it would have been uniform. The difference in single thread performance is likely down to the larger 30MB cache on the E5-2697v2 vs 10MB of the E5-1620v2.
Geekbench is extremely touchy with its memory...
@@ahmetrefikeryilmaz4432 That actually raises a fair point - the test wasn't totally like for like as the memory was changed as well!
@@alasdairlumsden670 I was actually affirming your statement. Cache is still memory.
I don't really think there should be a drastic difference between the old and the new RAM kits.
Oof, I remember the v2 series of Xeon E series processors. They were good for the time, but I opted for the v4 version in my desktop build for a good reason. Better ongoing longevity. I've got a 20 core, 40 thread processor and it's rarely ever seeing serious utilization, particularly in Windows where thread allocation routines are kinda borked at the best of times.
For some reason his Geekbench 6 results on both CPUs are way lower than they should be. Comparing to CPUs that I've tested even a laptop Core2 Duo P8700 scored higher in single thread (366/550).
A generation older Xeon E5-2690 8c/16t scored 724/4001 beating the faster E5-2697 v2 in both single and multi thread even though it shouldn't. And no, I'm not confusing Geekbench 6 with Geekbench 5 which has different scoring.
when the graphics card is bigger than your computer you're definitely doing something right.
nope nope nope you should fear for the cpu's safety then🤣🤣🤣
I'm sure someone has already mentioned this, but plugging the monitor directly to the eGPU is preferable, otherwise display data coming back from the GPU will be taking up some of the already very limited Thunderbolt 2 bandwidth.
That's what I was thinking and thought that was the default way to use eGPU's. Seems very inefficient to not do it that way unless you're just using the eGPU as some sort of cache or hardware accelerator or something.
oh no he's installing the nVidia driver from nVidia instead of the gpu repo :(
The 4070 only has a single 8-pin connector, which can supply a maximum of 150 watts... plus 75 watts max from the PCIe slot, so it cannot draw more than 225 watts. The extra 650-watt power supply is overkill for both graphics cards IMHO.
The problem is not the TBP for the nVIDIA Cards, but rather the transient power spikes they have.
@@juanignacioaschura9437 this is not an issue on Ada Lovelace cards. It was with Ampere, but not anymore. See Gamers Nexus video reviews of the 40 series cards for this; it was one of the first things they tested.
@@juanignacioaschura9437 The 4000 series lessened the issues with those spikes. A 400 watt PSU would have been fine for powering nothing but the GPU
Yep, 650 is for a whole system, erring on the safe side. But I would bet the eGPU works in OSX with the AMD card.
You're using 22.04 LTS. Its kernel predates RDNA2. That might be a (not the) problem...
Won't the linux-firmware package contain up-to-date firmware for AMD GPUs? I've been running an RX 6700 XT from 22.04 to 23.04 and it's been working fine.
@@JakeR0bH Yeah, 23.04 should be fine. But 22?
Kernel 6.2 (which is used in the latest 22.04) supports RDNA2. My rx 6700xt runs just fine on latest Ubuntu 22.04.
@@egsuslo.3 LTS does indeed use 6.2. I stand corrected 😅
Besides blacklisting the old radeon driver to force the amdgpu module, I don't really see a solution, or the actual problem...
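For anyone following along at home, a quick sanity check for this whole "is the kernel too old for RDNA2" question, assuming Ubuntu/Debian-style tooling (the outputs mentioned in comments are illustrative):

```bash
# Which kernel is actually running? 22.04 GA shipped 5.15; the HWE stack ships 6.2.
uname -r
# Did amdgpu even try to probe the card? No output at all suggests the kernel
# predates the GPU (or the module never loaded).
sudo dmesg | grep -i amdgpu
# Confirm which kernel driver (if any) is bound to the card.
lspci -k | grep -iA3 vga
```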
I thought you got your hands on the leaked Xbox
Xbox Series X Digital Edition 😂😂😂
It had to run some sort of Windows and some Nvidia GPU in order to qualify as an xbox 😂. Hold on.....
If you would install the lightest ever linux distro that could emulate the xbox, then it would be an even better xbox
Xbox 720
When using Thunderbolt 2, if you ask it to send the image back to be output over the built-in graphics on the computer, you are throwing away performance. If you want the best performance possible, run a monitor directly from the external GPU; that way you're not chucking away bandwidth sending too much data over the slow Thunderbolt 2 link.
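Rough back-of-envelope numbers for why the loopback hurts (raw pixel rate only, ignoring compression and protocol overhead):

```bash
# An uncompressed 1080p60, 24-bit display stream sent back over the same link:
echo "$(( 1920 * 1080 * 24 * 60 / 1000000 )) Mbit/s"   # prints 2985
# Thunderbolt 2 offers ~20000 Mbit/s total, shared with all the GPU's PCIe
# traffic, so the return video stream alone eats roughly 15% of the pipe.
```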
I did this exact same upgrade (except for the video card) when I had my Pro 13 a few years ago. I liked the challenge of the CPU replacement.
After having watched several Jeff G videos about getting these big gaming cards working over external interfaces, it was pretty clear you were in for a challenge. Glad to see that the integrated video on the can works well in Windows at least.
bro refuses to install mint 😭😭
That plot twist was unexpected but wicked cool
Wait... why do you need a driver for the AMD 6750 XT graphics card? OK, it might be different for Thunderbolt, but the Linux kernel should support the card out of the box. Which Ubuntu and kernel version did you use? I'm certain a recent kernel will support it.
And I'm pretty sure 650 watts is the recommendation for a complete system with CPU etc. A Radeon 6750 XT only draws 250 watts maximum. Sure, there might be spikes that are higher, but 400 watts should be enough for powering the card alone.
I have an ASRock AMD RX 6650 XT 8GB GPU on my Erying Intel 12900H MATX mobo with 32GB of DDR4 3200MHz RAM, and running Manjaro GNOME in traditional desktop mode on the latest kernels, it works just fine with DP to HDMI adapters powering 2 of the same 1080p 75Hz Sceptre monitors he showed in the video. Even Steam works just fine for the most part.
Having said that, I stopped using Ubuntu and Ubuntu-based distros a few years back when they started doing more and more of their own BS wank: not shipping the latest kernels, trying to kill 32-bit libs, snaps, etc. That's part of why companies like Valve moved away from using Debian/Ubuntu as the base for SteamOS and moved to an Arch base. Maybe he should just try installing Manjaro GNOME with the latest updates and newest kernel and see if that solves some things (not sure it will for the internal dual GPUs on that trash can, but worth a shot).
@@CommodoreFan64 I think he used Ubuntu 22.04.3, which runs an over-a-year-old kernel; that would explain all the issues. Maybe switching to a bleeding-edge distro like Arch, or any distro offering a recent kernel, would help.
@@RealOny I'm personally not a huge fan of Pop!_OS, or distros based on Ubuntu in general, since once Ubuntu makes major changes that are bad for the community, many of the distros based on it just follow along like sheep. And rolling pure Arch on something like the Apple trash can has its own set of headaches, which is why I suggest Manjaro GNOME as a balance: it still has the latest kernels and is based on Arch, but packages are somewhat more vetted before being pushed to the main repos, and it's easy to set up and get going fast. Plus the Manjaro team has been working really hard to make Manjaro more stable, with far, far fewer breakages than in the past.
Just remember to install Mesa and Vulkan!
IMO part of the issue was the thunderbolt eGPU setup. That card barely had a fraction of the PCIe bandwidth it needs to function and in my experience AMD cards/drivers do not handle having such limited bandwidth very well.
I like that these systems are becoming so cheap; it's good to know they're actually a great Windows gaming PC for the price.
Thank you @ActionRetro for another great episode. This is the second video that I've seen in this series, and I enjoy the shenanigans. I also liked the sound effects while you were disassembling the "trash can" to install the 12-core Xeon. You've definitely earned a sub ... looking forward to the next one =)
Remember, any trash can can be a gaming trash can if you use it as a basketball hoop.
Just a quick tip: the GPU power supply requirements assume you have one power supply powering the entire computer, so the 400W unit would be fine for any GPU except maybe an x900 AMD GPU or an xx90 NVIDIA GPU, as long as the GPU's max power draw is less than the 400 watts the power supply can provide.
So, RAM speed is only reduced if you install 1866MHz RAM; I have 128GB of 1066MHz RAM in mine and there's no reduction in speed with my 12-core CPU.
Can you passthrough the eGPU to a Windows VM? Sounds like more in the spirit of this channel than running Windows native.
This kind of puts some cold water on my idea of setting up an eGPU for my Linux laptop so I could theoretically game when I dock it at home. It seems like the sort of project where I could spend a boatload of money and still come up with absolutely nothing in the end.
I think you can get your money back if you decide it's not for you within 2 weeks or something.
At least, that's the consumer protection law here for online shopping.
It should be able to work on Linux, but then again, I once tried to do surround sound using different sound cards; that should have worked too, but I couldn't get it to work.
It works much better when you use normal hardware
It depends on the hardware and how it's actually hooked up. I'd recommend looking around on Phoronix or another Linux-centric forum (gamingonlinux maybe? or /r/linux?) for information on specific hardware. The setup in this video was quite esoteric and niche, so I wouldn't take it as indicative of all eGPU setups. And I have some reservations regarding his use of "AMDGPU drivers" and how the monitor was hooked up upon first trying the eGPU...
I wouldn't read too much into this experiment unless you are trying to use a Thunderbolt 3 eGPU with a Thunderbolt 2 device.
USB-C ones work great
Loving this series. I did the same CPU upgrade with a cheap Mac Pro off eBay recently. OpenCore Sonoma guide next?
I've had a similar fight with a trashcan trying to work with egpus. The problem was that Windows often wouldn't boot with the thunderbolt card connected, and I couldn't get the OSX patches to support a modern eGPU. I might have another go as the weather turns colder and having a space heater would be useful in the office 😂
I would suspect that there is a weird PCIe 4 to PCIe 3/2 mismatch. I've seen that happen with PCIe 4 compatible cards running on a PCIe 3 riser; I had to force the primary PCIe x16 slot to run at PCIe 3 (the default is auto) in the BIOS. Don't know if that can be done for the eGPU enclosure.
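On the Linux side you can at least verify what the link actually negotiated; a minimal check, assuming the card shows up at a PCI address like 01:00.0 (yours will differ):

```bash
# LnkCap = what the device supports; LnkSta = what was actually negotiated.
sudo lspci -vv -s 01:00.0 | grep -E 'LnkCap:|LnkSta:'
# Something like "LnkCap: ... Speed 16GT/s, Width x16" alongside
# "LnkSta: ... Speed 5GT/s, Width x4" would confirm the card fell back
# to PCIe 2.0 x4 behind the TB2 adapter.
```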
Me: [starts shopping for a trashcan]
Someone needs to make translucent shells for this. Show the beautiful guts it has
IDK, I'm not an Apple guy really, but that "trashcan" upgraded would prolly make a damned good Steam box; it even kinda looks the part of some unknown modern game console.
Wouldn't mind having one myself to tinker with.
FYI, a 2060-3050 would have made a pretty competent gaming setup.
Had a 970 EVO in my iMac 2019 and was SO disappointed. macOS had trouble with trim, so every restart was a test of patience. Took about 5 minutes each time. Booting into Windows via Bootcamp was no problem, it was just macOS.
Just for you to consider when you swap back to macOS (maybe possibly eventually) and notice the same: it is the SSD.
I would have gone with Arch, or some other rolling release distro. Having the latest packages really helps when it comes to gaming. Also, why not plug the display to the eGPU directly?
Yes, Ubuntu is a terrible distribution to be doing this kind of stuff with.
Fedora is rolling release
@@initial_kd The situation got better over time. Nowadays Arch even has a TUI installer; even like 3 years ago that would have been CRAZY
@@peppefailla1630 Fedora isn't entirely rolling release; it's a hybrid distro
I was watching and suggesting Windows in my mind, and then you did it. It works well for hardware of that generation.
12:16 Depending on how old the kernel is on the distro you've chosen, the amdgpu driver in the kernel may not support your GPU.
I couldn't get Debian working on my RX 6600 XT, for instance, because the kernel was too old and didn't even attempt to initialize the card (not even a single dmesg line). Maybe recent Debian works fine, but idk.
I could have compiled the latest Linux kernel myself, but I didn't want to go that route.
I love the design of the trashcan Mac Pro... but I compared its maxed-out Geekbench scores to the M1 Pro I'm on right now, and good god.
The constant references to Spindler are especially hilarious with reference to a 2013 machine😂
Been thinking of buying a trashcan Mac. It'd only be my second Mac ever after my 07 polycarb, but those little turds always spoke to me.
The firmware probably doesn't support the 6750xt. Would be nice to try again with a 1080ti.
You never cease to amaze me with these wild builds/setups! Also, interesting you got an SSD that would not work well on macOS!
Looks like the kind of design a technician would have nightmares about. Looks gorgeous though
do you ever feel like saying "the plans for that battle station are stored in this droid"?
I wonder if the reason the 6750 is failing is because you're using it in a PCIe3 capable Thunderbolt 3 enclosure so it attempts to operate at PCIe3 specs, but then you've got the thunderbolt 2 to 3 adapter which can't tell the card that the link is only PCIe2 capable. People ran into these kinds of problems with PCIe3 risers on PCIe4 capable motherboards. The card would see the mobo was capable of 4 and try to use it, not knowing that there was a non-compliant riser in the way. You had to tell the motherboard to only use PCIe3 signalling but you had a chicken and egg problem.
The 1050Ti maybe worked because it was old enough that it was only trying to use a slower specification. Maybe it's only 8x wired? Maybe some quirk made it decide to be a PCIe2 card?
eGPUs are always fiddly.
I daily drove Arch on my MBP with an eGPU for a while, though it was a whole thing to set up. I learned two things:
1. You cannot hotplug the GPU; it needs to be there when the system boots.
2. You need to tell the window manager to use the eGPU as the primary graphics device (in KDE/KWin, I had to set the KWIN_DRM_DEVICES environment variable to /dev/dri/card1). Then, after logging in, KDE would spring to life on the eGPU.
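For anyone replicating step 2, a minimal sketch; the card numbering varies per machine, so verify which node is the eGPU first:

```bash
# Map DRM card nodes to PCI devices to spot which one is the eGPU.
ls -l /dev/dri/by-path/
# Make KWin (Wayland) treat the eGPU node as the primary GPU for the session,
# e.g. set before the session starts (card1 = eGPU here, adjust to taste):
export KWIN_DRM_DEVICES=/dev/dri/card1:/dev/dri/card0
```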
Also, at least in this case, I have a hunch that plugging the monitor directly into the eGPU may have made a difference? Also a bit concerned when he mentioned "AMDGPU drivers"...not sure if he meant the proprietary ones or not (which I'd almost never recommend).
Also, nice profile picture :)
@@archlinuxrussian Yea, I don't think I ever got display loopback to work, which was fine because that would decrease performance anyway (especially on TB2 bandwidth on the Mac Pro).
Also thanks :P
8:38 ice trays.. what a great idea
An Intel Apple product with Linux (then Windows) installed on it, an AMD GPU (then an NVidia one, then stock), and a controller for a Microsoft console.
Gotta love the mishmash.
It's funny because every time you remove the case on this one I get the same feeling I had when I saw Darth Vader remove his helmet in The Empire Strikes Back..
"runs non-modern games"
"oh my god it runs"
If normal computer videos are "tech porn", this is the hardcore BDSM dubious-consent equivalent
The interesting thing about those universal blue images (like bazzite) is that you can switch out to any other image with a single command. Or switch to another ostree based OS like the official Silverblue or Kinoite. You don't even lose your data or flatpak apps.
A SteamOS-like OS would be pretty cool on the can. It'd limit its functionality somewhat, but it would be pretty cool next to the TV just for gaming.
Used one of these as a workstation at my job in like 2014. Always liked the design even if it was hilariously impractical in many ways.
It showing as "NVIDIA Corporation Device" is just because your system doesn't have the updated PCI IDs for it. This will not prevent the card from working, since the name is entirely cosmetic and just for the user.
If you want it to show the actual name of the device, run sudo update-pciids and it'll fetch the latest PCI IDs from the internet.
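Usage is as simple as it sounds (the output described is illustrative):

```bash
sudo update-pciids      # refreshes the local pci.ids database (part of pciutils)
lspci | grep -i vga     # should now show the card's marketing name instead of
                        # a bare "NVIDIA Corporation Device xxxx" entry
```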
TLDR: If your software uses fewer than 9 cores, the E5-2667v2 is a better CPU than the 2697v2.
The E5-2697v2 is the highest-end CPU for the platform but is not the fastest for single-core performance. It only turbos to 3.5GHz (3.0 for all cores), so for better single-threaded performance you'd want the E5-2667v2, which turbos up to 4.0GHz, or 3.6GHz on all cores. I made a spreadsheet, and the 2667v2 is clocked about 14-20% faster across the board as long as you have 8 or fewer active cores. For 9 active cores the 2697v2 is clocked 6% slower overall (core speed * core count) but has a bit more cache, so it probably comes out near equal. For 10, 11, and 12 cores the 2697v2 is 4, 14, and 25% faster.
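The aggregate-clock math from the spreadsheet is easy to reproduce; a quick sketch using the all-core turbo bins cited above (3.6GHz for the 2667v2, 3.0GHz for the 2697v2):

```bash
# Aggregate throughput proxy: all-core clock (MHz) * active cores.
echo "2667v2 @ 8 cores:  $(( 3600 * 8 )) MHz-cores"    # 28800
echo "2697v2 @ 9 cores:  $(( 3000 * 9 )) MHz-cores"    # 27000 -> ~6% lower
echo "2697v2 @ 12 cores: $(( 3000 * 12 )) MHz-cores"   # 36000 -> ~25% higher
```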
Things I’ve done:
1. Upgraded my trashcan to 12-core (with a lot of help, but it felt like open heart surgery!)
2. Used Windows on a trashcan for gaming.
3. Used a 6900XT eGPU with a massive 750W power supply on a Mac Mini 2018 for AAA gaming, if you can call MSFS 2020 a AAA title? It sure needs the processing power.
But what I’ve never tried is to grab my TB2-TB3 adaptor and connect the eGPU to my trashcan. I’ve seen a few articles about it though - you need to add special support to get an eGPU working over TB2. Read up online first! And that’s before you even go beyond macOS.
For those who are interested in playing more modern games and want a viable low-to-medium-end computer (as of 2024), I'd recommend getting the version with the 2697v2 as stock, as well as the dual FirePro D700s. Oh, and you may as well put 128GB of RAM in there.
Boom! You've got a very high-RAM setup with a decent CPU (12c/24t, 3.5GHz), a decent GPU pairing (7 teraflops, 12GB of VRAM), and 128GB of RAM!
Just picked one of these up for £100 and I'm stoked. It's much faster than my 3770K and 7970 GHz Edition, and the size and quietness are fantastic. I've always been an Apple hater, but the aesthetics and build quality of this thing are great.
I’m surprised you didn’t have more issues than you did with the GPU setup. Both the 6750XT and the 4070 are designed for PCIe 4.0 x16 (though x8 provides enough bandwidth) and the minimum you would ever want to run those cards on is PCIe 3.0 x16. But with this setup you’re, at BEST, running those cards on PCIe 2.0 x4 which would mean you’re essentially only giving these cards 1/8th of the bandwidth they were designed for.
And frankly I’m surprised it works at all. I had a PCIe 3.0 motherboard start to go bad and it was letting my 6800XT only use x8 of PCIe 3 and it caused huge performance issues and crashes that weren’t solved until I upgraded to a newer MB that finally gave the card the PCIe 4.0 x16 it wanted.
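The bandwidth gap is easy to put numbers on; a rough sketch, using the usual per-lane effective rates after encoding overhead:

```bash
echo "PCIe 3.0 x16: $(( 985 * 16 )) MB/s"   # ~15.8 GB/s (8 GT/s, 128b/130b encoding)
echo "PCIe 2.0 x4:  $(( 500 * 4 )) MB/s"    # ~2.0 GB/s (5 GT/s, 8b/10b) -> roughly 1/8th
```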
I've told people for years these weren't bad computers, they were just expecting way too much from a company like apple.
That's true, but sadly so many people fall for Apple's marketing wank, and get caught up in the hype thus leading to unrealistic expectations.
@@CommodoreFan64 I think he means the toxic PC gamers, not people 'disappointed' with their $3k purchase. Most people that bought this computer at the time were pretty happy with it. I remember when it came out, and all the videos of people trying to build a Hackintosh with the same specs for the same price - it wasn't possible; it usually came out more expensive.
If you bought this computer back in the day, I'm sure you could justify it if it was for work.
If I recall correctly from my Mac days: the eGPU will function only if a display is connected to the GPU - you can actually see that your fans are not spinning, which means no signal (and I may be wrong, but not all docks are supported; you can dig through the macOS eGPU forums for more info). Also, never install NVIDIA drivers through the package from the NVIDIA website; in my 5 years of Linux it has never worked once, especially on Ubuntu (apt install should do). Trashcans are fun - I'm hunting for one to use as a homelab.
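For reference, the packaged route on Ubuntu looks like this (the driver version below is just an example; install whatever the tool recommends):

```bash
sudo ubuntu-drivers devices           # lists detected GPUs and the recommended driver package
sudo apt install nvidia-driver-535    # the packaged driver, not the .run file from nvidia.com
sudo reboot
```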
I remember when the trash can came out and was always wondering how the hell it was built, given its weird shape. I thought they might have built the whole thing on a large flexible PCB and rolled it up to fit the case.
If you have two GPUs, the standard setting for the nvidia-driver is "On demand". The desktop then always uses the integrated GPU, and the discrete GPU is only used when needed or when you tell the system to use it. Ubuntu has an option to launch programs using the discrete graphics card. You can actually run the Steam launcher on the integrated GPU and the games will run on the discrete one, since most games wouldn't run on the integrated GPU anyway. At least this is how it works on my Lenovo with a discrete NVIDIA card.
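In case it helps anyone, the same per-app offload can be done from a terminal or in Steam launch options using NVIDIA's PRIME render offload variables (this is effectively what Ubuntu's discrete-graphics launch option does):

```bash
# Run one program on the NVIDIA GPU while the desktop stays on the integrated GPU.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor
# In Steam, the equivalent launch option would be:
# __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%
```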
About bus speeds: if memory serves, capacity by itself doesn't limit clock rates, but 4 sticks of memory tend to run at lower clock rates than 2 due to bus loading, and larger-capacity sticks might have lower maximum clock rates on their own.
Of course, Apple could also be mucking things up by automatically setting the speed based on RAM count...
The RX 6750 XT works plug-and-play on a Linux 5.17 kernel paired with Mesa 22.2. I'm currently using the same card on Fedora 38 with a 6.4 kernel and the latest version of Mesa. There was absolutely no reason why that card shouldn't have just fired up on the version of Ubuntu you were using. I can only imagine you were trying to install the AMD Pro drivers - if so, why?... As for NVIDIA, why the hell are you downloading drivers directly from NVIDIA? NEVER, EVER download GPU drivers from a website for Linux... use your distro's package or driver manager. Your video doesn't make it clear exactly what you did, but it looks like user error to me, sorry. Can you clarify what you did, please?
How on earth does a decade old motherboard hold up so well all these years later
Keep the Linux dream alive and maybe try using KVM! You could do some trickery by dedicating the integrated GPU to video output on the Linux OS, then passing the NVIDIA GPU through to a Windows VM.
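A rough sketch of the VFIO prep that passthrough would involve; the PCI IDs below are placeholders, so substitute whatever lspci reports for your card:

```bash
# Find the GPU's vendor:device IDs (the HDMI audio function usually rides along).
lspci -nn | grep -i nvidia          # e.g. [10de:xxxx] for video, [10de:yyyy] for audio
# Enable the IOMMU and reserve the card for vfio-pci via the kernel cmdline, e.g.
# in /etc/default/grub:
#   GRUB_CMDLINE_LINUX_DEFAULT="... intel_iommu=on vfio-pci.ids=10de:xxxx,10de:yyyy"
sudo update-grub && sudo reboot     # then attach the device to the VM (virt-manager etc.)
```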
It would look great with a clear cover.
I would love to see you benchmark a bunch of new-ish games on that Win10 trashcan system.
I am very surprised that this machine has an M.2 slot; PCs didn't have those as a common standard for another couple of years.
It's not exactly an M.2 slot - it's proprietary, but still PCIe like M.2, which is why it can be adapted.
It's too bad it only has a single M.2 slot, otherwise you could add a GPU the jank way via an M.2 to PCIe slot adapter. I guess maybe you could run the OS off external storage instead, but I'm not sure if that would work.
Unfortunately CrossFire is a rather mixed bag and doesn't work in most modern games.
i'm thinking of doing it that way lol, i'll just boot off usb lol
@@ActionRetro I boot Windows 11 off USB and it works really well. The hardware seems to have a bug, though, where it won't properly detect the drives at USB 3 and will run them at USB 2. I basically have to disconnect and reconnect them and swap which USB jack they're in until they get recognized as USB 3, but once it does, it persists across reboots.
I think it looks great without a cover, real R2-D2-ish. Maybe you could make a clear perspex cover for it.
Exactly what I was thinking! Except in my mind, this was clearly an Imperial darkside astromech, given the all black color-scheme.
I love these crazy videos. If I had the extra cash, I'd be very tempted to pick up my own trashcan Mac
Your videos are allowing me to get on with my life while protecting me from janking up my life too hard with my ADHD brain. Thank you
1:00 I died when you added "Tire"
LOLLLLLLLL
18:53 Bravo ha, I love your comment and I agree with you 100%
If that shroud were clear with a bit of lighting, that thing would look cool af.
This was really entertaining to watch. I couldn't help but wonder about that power supply. It looked like it might be a pretty standard ATX power supply in that eGPU enclosure. If so, it might be worth it to swap the psu out for a stronger one just in case you need that overhead in the future.
ican needs a pc case fan mod
In my experience, installing the Linux drivers directly from Nvidia causes nothing but pain and suffering. In my very early days of using Linux, I ruined a couple installs trying to do that!
I was really hoping the Linux only way would work. I am sure there are some Linux guys out there that have done this.
Without its case, the Trashcan looks like something from the Death Star.
that's why I named mine VaderPro.
You did not need a second PSU, as the recommended power supply rating is for a whole PC, not just the GPU. Also, Fallout 4 isn't a very good test, as it has a 60FPS cap and can run on weak GPUs; I used to play it at ultra settings on a GTX 960M, which is a laptop GPU.
You get a bun, he gets a bun, and I get a bun too! We all get a bun!
The PSU in the external GPU should have been more than enough to run any of the GPUs you tried. 400W is plenty for almost any GPU out there. Yes, the AMD card recommended a 650W PSU, but it also is accounting for you having a whole PC powered by the PSU as well, not just the GPU. If you could not get almost anything to work with only the PSU inside the enclosure, then I would guess that the 400W PSU is simply not working properly.
Dude, the PSU in that external whatever is perfectly fine for anything less than a 3090...
My hand cramped the instant you showed the mouse. LOL
You inspired me to install Ubuntu on an old Dell machine with an Nvidia 1050ti GPU. Problem is, I still can’t figure out how to get Steam to work. Nevertheless, I will prevail!
Good luck! I hope it goes well for you!
Linux is so good for gaming now that I hardly ever log into my Windows partition. I need it for university work because their software runs like rubbish in a VM, but it's been like 2 months since I logged into Windows, and I have been gaming, being productive, and watching films and videos every day. I would recommend people use an Arch-based distro for gaming because of access to better GPU drivers and the latest programs, but Ubuntu-based systems are fine. Debian's packages are a little too old for proper gaming in my opinion, but vintage/retro-style gaming is great on Debian. I am on the 535 NVIDIA drivers in Arch, but anything past 525 on Ubuntu is really unstable.
Sean, it's really funny to see your beard grow throughout the video, showing how long and hard it was to get to something working :P
😂
Too bad about the plot-twist ending after all that work, but hey! You've saved a bunch of Linux geeks a lot of time by letting us know what _doesn't_ work.
I wanna see gaming on a ThinkPad with an eGPU, myself. I got it to work on my x200, exactly once. The ultimate Linux gamer experience, surely.
I used to be an "old school [K]Ubuntu fanboy" as well, until I tried OpenSUSE. It just seems that little bit more polished. Plus, YaST is super slick for managing things that would otherwise be kinda tedious.
Basically anything other than Ubuntu is much slicker, faster and smaller. I still cannot fathom why Canonical still clings to their failed snaps.
Back before the move to snaps, I got into Linux with Ubuntu MATE and was happy enough with the OS not to distro-hop further.
But as snaps got rolled out, the modern laptop I ran it on got slow at both system and application startup.
That led me to hop to Manjaro, and after getting annoyed with their bad management of the distro, I somehow ended up moving to Garuda Linux.
It has its (significant) problems too, but it's a great OS for me and I'm sticking with it long-term, as I like Arch's ease of installing up-to-date obscure software.
After trying it on an older laptop, though, OpenSUSE has become my backup distro, as I can replicate all the setup I like, including the most obscure stuff, with personal repos.
OpenSUSE Tumbleweed is significantly more stable than Arch, both in terms of package churn and breakage on updates, but I'm not changing a system I'm happy with.
I'm wondering whether you'd have more luck running it through a native TB2 EGPU (correctly pronounced egg-poo) enclosure, rather than using a TB3 unit with the TB2 adapter?
I like how you integrated the ad into the theme of the video
On my computer with a 3060 I can't get it to say 3060 in neofetch... also I didn't expect a 4060 to work since in my macpro5,1(4,1) the newest card to work is a 1080 ti. Also from my experience MacPros run better on windows... except if it's cluttered like mine is😅.
If you've ever seen Auralnauts Star Wars ...."Thank you, Magic trashcan."
I would love to have one of these simply because they are very cool computers and they actually run Windows very well. Problem is a reasonably well specced one is still around $900 ($1.4K here in New Zealand) which is a lot for a PC just to have as an ornament! I would use it of course but just for messing around on.
Yes, I did the upgrade to 128GB, and yes, it reduces the RAM speed from 1866MHz down to 1066MHz. The issue here is not the memory, but rather a technical limitation of the Xeon processor's memory controller. However, that is still a lot faster than any swap file, even one on an NVMe storage medium. Pretty much everything runs out of memory on my 2013 trash can at this point. I also installed an AngelBoard, which allows me to install regular M.2 drives.
I also have an eGPU; however, the newest macOS it runs (Mojave) won't allow the use of it. Thanks, Apple. They say it's because of stability issues, which is BS of course. Yes, you won't get the full bandwidth the card is capable of (in my case, it is a 5700XT) due to the Thunderbolt 2 port being only capable of 20Gb/s vs. TB3, which can do double. However, it could still work with the TB3 to TB2 adapter from Apple. Apple simply wants you to buy a new system (their standard strategy). There is a workaround that does not require turning off SIP (System Integrity Protection). It's called "Kryptonite." It is still a bit of a hassle though.
13:18 ain't no way, what a madlad
I don't know if this is the problem or not, but on a Hackintosh system I have, Mac OS won't boot with an XFX GPU.
Something about the XFX VBIOS that Mac OS doesn't like. I forcibly flashed the card with another VBIOS (I think I used an ASUS one) and it worked.
I don't know if this is the issue here, with a real Mac, but just something I thought of.
The difference between a pro and an amateur is that pros get paid... You, sir, are a pro!
Installing RAM on this is like swapping isolinear chips on an LCARS terminal. Kinda want one just for that; it could be a neat home server.
Shut up, Wesley! :D
Try connecting the GPU through the M.2 adapter interface. There are adapters, e.g. M.2 to PCIe or M.2 to OCuLink.