Stop data brokers from exposing your information. Go to my sponsor aura.com/bringusstudios to get a 14-day free trial and see if your personal information has been compromised
Meanwhile people with the SponsorBlock extension like 3 days after release of this video: (they don't have to watch this ad)
Well at least we know who sponsored this.
Yea
Bottle neck bringus
Where is your delorean?
I just stumbled across this channel again and bro hooks up a GPU to a Google Meet conference computer. Like what in the name of
Yes
its in the name of gaming
gaming
He's a spiritual successor to Druaga1! Instead of shoving SSDs into shitty computers, Bringus plays games on whack hardware lol.
Subscribe
21:26 you're CPU bottlenecked (or PCIe link speed bottlenecked)!
It's like you're a cook in a restaurant, and have all the ingredients and tools ready. There's only one waiter on shift, and they only get to you every half an hour. One evening, all orders are glasses of water! You get a glass, fill it, give it to the waiter... And wait 29 minutes for the next order of water. You deliver exactly two glasses of water an hour!
Another evening, you get orders for nice dishes you can make in exactly 30 minutes. You're working all the time, but you can keep up with the waiter: you deliver 2 dishes an hour. Same as before, but now tastier :)
At lower settings the GPU instantly delivers a frame and then just waits for new information to get to it. At higher settings, the PCIe bus still only delivers information for 80FPS, but now the GPU is working all the time and making nicer frames.
Higher settings don't make the load on the PCIe link worse, as the textures and heavy stuff is already stored on the GPU. The CPU just has to tell the GPU how to piece the stuff it has in memory together:)
If it's a CPU bottleneck it's the same thing, but instead of information being slow to be delivered it's the CPU that struggles to calculate what to tell the GPU to do. We now have 20 waiters in the restaurant, but only one customer every half an hour!
It's more likely to be the CPU than the bus, as the PCIe link speed only really matters when loading new textures for new areas; you'd see bad dips sometimes and normal gameplay most of the time.
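If it helps to see the waiter analogy as numbers, here's a tiny Python sketch of the "slowest stage sets the pace" idea; all the millisecond figures are made up for illustration, not measured from the video:

```python
# Toy model of the bottleneck idea above: each frame has to pass through the
# CPU/PCIe stage ("the waiter") and the GPU stage ("the cook"), and the
# overall frame rate is set by whichever stage is slower.
# All numbers below are made-up illustrations, not measurements.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frames per second when the slowest stage sets the pace."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# CPU/PCIe side can only prepare a frame every 12.5 ms (~80 FPS).
cpu_ms = 12.5

# Low settings: GPU renders a frame in 4 ms, but still waits on the CPU.
print(fps(cpu_ms, 4.0))   # ~80 FPS, GPU mostly idle

# High settings: GPU now takes 12 ms per frame -- busier, prettier frames,
# but the delivered frame rate barely changes.
print(fps(cpu_ms, 12.0))  # still ~80 FPS
```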
That is actually a brilliant way of describing it, thanks! And now I know both what the 'x2, x8, x16' parts of a PCI-e slot definition mean, and how the link speed works :D.
Nice explanation
I love how you used the Jeremy Clarkson approach of "more power solves everything"
Speed and power saves lives.
POWEEEEERRRRR
Sometimes my genius, it’s almost frightening
@@beedslolkuntus2070 sometimes if you listen carefully, you can hear my genius
hey, im the 1000th person to like you, cool
Bringus: "Alright Google meet are you ready for another video?"
Google Meet: "I'm tired boss"
iBoss
I’m beating my google meet 🥵
thats fine, just make sure youre not in a conference and- ohhhh man! now there are a thousand people watching you! @@monkeymanjoe2
Im beating it to nissan gtr rule 34 @@monkeymanjoe2
The bottleneck is actually the CPU, the GPU probably got a few more beans. That said, the PCI-E 2x link definitely contributed to the bottleneck, since it had less bandwidth to communicate with the GPU.
indeed, that's why the games run very similarly as well, to answer his question at the end of the video. But the CPU is mostly the reason.
A good way to measure is how much power is the dGPU drawing
I'd even go as far as to say it's not the CPU but the 2x connection making it act like a CPU bottleneck.
@@ChrisD__ Actually the PCIe x2 limits how many CPU instructions can get to the GPU, besides competing with texture data.
it was drawing 80 watts in GTA V lol, crazy fast gpu
I just need to point out how genuinely welcome and pleasant the Half-Life HEV suit 'use' button spam was. True artistry.
In your GTA V testing with the 6950 XT, the CPU is the limiting factor and the reason there is no FPS change when you changed the video settings. The CPU can only handle so much in the game, thus limiting the FPS from going as high as the GPU could go. That's why you don't see 100% usage on the GPU on either normal or ultra settings. The CPU simply cannot process the logistics needed for a higher FPS. It is "bottle-necking" the GPU.
Here's a metaphor that might help. It's like putting a Lamborghini in LA traffic. The flow of traffic is the CPU and the Lambo is the GPU. The Lambo can only go as fast as the flow of traffic allows, so the CPU is limiting the GPU from its full potential FPS.
🤓
@@jackneely7772 😢 sorry you're slow
so I guess CPU can't handle more FPS, no matter the picture quality?
I really wasn't sure if it was the CPU or the PCIe 3.0 x2 bottlenecking the setup but that makes sense as to why it would be the CPU
wouldn't the most straightforward metaphor be "it's like having a narrow bottle neck on a bottle."
I use these devices everyday at work, and this series is hilarious. My AV team and I love this series.
When I heard "We're going to minimize the jank", I didn't expect the same amount of hardware maiming (rip m.2 key adapter that got physically bent in half)
Not all goals are met lmao
@@BringusStudios lol why didnt you leave the wifi card where it was and install the m.2 on the back? (if you didnt i dont know yet im still watching)
thats the first thing that came to my mind the moment I saw the new adaptor
I have absolutely zero clue what you're talking about in the more technical bits like 80% of the time and almost all related jokes go completely over my head, and yet I'm entertained.
Subscribed.
Rule of thumb: PCIe 1.0 x1 is 250 MB/s. You multiply by the number of lanes (here, x2: 500 MB/s) and double for each generation (so PCIe 1.0 x2: 500 MB/s -> PCIe 2.0 x2: 1000 MB/s -> PCIe 3.0 x2: 2000 MB/s). A PCIe 4.0 x16 can handle 32000 MB/s, but most GPUs use 3.0 x16 or 4.0 x8 (both are 16000 MB/s).
During most gameplay, this does not matter too much. The instructions sent to the GPU (geometry, ...) are not that large.
But remember, at 100 fps you have a frame every 10 ms. If the GPU needs to load 160 MB of new textures before starting the rendering of the next frame (which takes 10 ms at 16000 MB/s), the framerate drops to 50 fps (10 ms + 10 ms -> 20 ms frametime). On your machine, it would take 80 ms to load 160 MB over PCIe 3.0 x2, causing a sudden fps drop to 11 fps! (80 ms + 10 ms = 90 ms frametime)
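For anyone who wants to poke at the numbers, here's a small Python version of that rule of thumb and the frametime math; the 250 MB/s-per-lane baseline and the clean doubling per generation are the same simplification as above, real-world encoding overhead differs a little:

```python
# Rough PCIe bandwidth rule of thumb from the comment above, plus the
# frametime math. Approximation only: real link overhead varies by generation.

def pcie_bandwidth_mb_s(generation: int, lanes: int) -> float:
    """Approximate one-direction bandwidth in MB/s."""
    return 250 * lanes * (2 ** (generation - 1))

print(pcie_bandwidth_mb_s(3, 2))    # 2000 MB/s  (this machine's x2 link)
print(pcie_bandwidth_mb_s(4, 16))   # 32000 MB/s

def fps_with_texture_upload(base_frametime_ms: float, upload_mb: float,
                            link_mb_s: float) -> float:
    """FPS for a frame that must first stream `upload_mb` of textures."""
    upload_ms = upload_mb / link_mb_s * 1000
    return 1000 / (base_frametime_ms + upload_ms)

# 100 FPS baseline (10 ms/frame) needing 160 MB of new textures:
print(fps_with_texture_upload(10, 160, 16000))  # ~50 FPS on PCIe 3.0 x16
print(fps_with_texture_upload(10, 160, 2000))   # ~11 FPS on PCIe 3.0 x2
```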
@AdxmZI Clown
Everything you said is right, but the performance also depends on how well the game is optimised for that scenario. If you know that you have a lot of video memory and little pcie bandwidth, you could optimise your performance by loading everything into memory during loading screens. Most games don't do that (GTA V for example is well known for missing textures with slow systems), but games like Doom and CS2 run really well because the textures are already there before they are needed. They were programmed and designed with really good frametimes in mind.
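Rough sketch of that trade-off in Python, with completely made-up sizes (the 4000 MB / 160 MB figures are just illustrative, not from any real game):

```python
# Sketch of the trade-off described above: preloading everything into VRAM
# during the loading screen vs. streaming textures over a slow PCIe link
# mid-game. All sizes are invented for illustration.

LINK_MB_S = 2000          # PCIe 3.0 x2
LEVEL_TEXTURES_MB = 4000  # everything the level could ever need
PER_AREA_MB = 160         # what a streaming engine pulls in when you move

# Preload strategy: pay once up front, then zero uploads during gameplay.
preload_cost_s = LEVEL_TEXTURES_MB / LINK_MB_S
print(f"extra loading-screen time: {preload_cost_s:.1f} s, mid-game stalls: 0 ms")

# Streaming strategy: fast initial load, but each new area stalls a frame.
stall_ms = PER_AREA_MB / LINK_MB_S * 1000
print(f"loading screen: fast, but ~{stall_ms:.0f} ms stall when new textures stream in")
```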
That was an 8th gen Intel CPU so I think that's PCIe gen 3.
🤯
🤓
22:00 Google weren't lying about that 400 fps thing.
he took it off tho lol
@@DRMAOMMAIZEINNG it's residual
The thought of not installing the old SSD on the other side is now permanently burnt into my brain, truly one of the gaming move I've ever seen.
Same
Had the same thought at first. But looking at the thermal pads for the TPU suggests that the SSD + adapter would be too thick for the back side
@@LucienneGainsborough dremel
i watched a video about oculink made by an italian youtube channel like 2 days ago and literally 2 days later bringus makes a video about it lol
yes
Yeah, and also Linus mentioned it for the first time in my memory in a recent video
the italian channel who i said made a video about OcuLink is called "MVVBlog" btw
IL GOAT MARCO VALLEGGI
I watched this on my Pixel 6 (not the worst, but not the best speakers I've had on a phone) and I didn't really notice all the audio problems you were apologising for.
Sometimes I forget that I'm the only one listening extremely closely to my own voice all day long while editing lmao
you wont notice on a phone. I heard everything through my headphones
Binge watching the entire channel on decent headphones right now, I literally forgot about the "audio issues" until I saw this comment. Bro makes amazing content and speaks clearly, that's enough to make it enjoyable.
@@anon-fq3ud And does stupid things too lmao
I didn't notice it either on my Note 20...@@BringusStudios
My favorite thing about Bringus Studios is I can watch this man do amazing things with incredibly advanced technology, make it all seem so simple, and yet I still learn absolutely nothing
Thumbs up for the programmers and hardware designers that made this possible. Thats some serious flexibility at play here. I didnt even know you could just leave PCIe lines unconnected.
It's honestly not that hard to make communications this flexible, given how PCIe works. It uses a bunch of individual serial lanes for communication, so it's relatively easy to just shove the data over less lanes, because serial is already sequential. If it was a parallel bus, then it would be somewhere between very hard to impossible to make it this flexible.
It's so it can be used with an enclosure on an ultrabook over Thunderbolt; those are x2 or x4 as well
the crispy camera audio just adds to the vibe of this
Good to see that bingus studios also watches my favorite jankmaster.
dawid is the king of jank
the fact that this sad baby 1.8ghz of a dual core chip is able to handle all of these games does blow my mind
PS4 and Xbox One CPUs were also dogwater so everyone's been optimizing for the worst CPUs since 2013
@@Wheagg Ps4 was pretty good but Xbox one had a sad bulldozer AMD that shat the bed most of the time I kinda feel bad for em lmao
It’s a quad core with hyper threading.
@@Dargin ps4 and xbox used the exact same cpu, just different vram config, and ps4 had a 200 mhz clock speed advantage
It's a quad-core with hyperthreading and Intel Turbo Boost to 4.0GHz. That's why it can handle those games 😂
This is some crazy ass gaming computer
this sure is uhh.. something
@@veemoneoYou've never seen a gaming PC before ?
Honestly runs better than mine
same man my goofy ass i7 7700 and 1050ti can't even run gta v at 60fps 1080p
Your videos always make me feel cosy, but I can't tell why. 10/10
Google meet box led me to this channel, it was chilling in my fyp for a week until I finally clicked, and boy did it not disappoint. I've since binged all the vids. So thanks Lenovo and Google
Hey you, don't agonize about sound quality so much. You are perfectly audible and you're coming through on both channels. That is NOT what can be said about some *other* content creators who sometimes don't even bother to notify people about their audio problems. So you're good. And the vid was good too. Good job. ;')
As someone who's never played GTA V; thanks to these kinds of videos, I'm intimately familiar with the first two minutes of gameplay.
My expectation for this setup is that you should set texture resolutions relatively low to limit data transfer, but you should be able to crank the resolution as high as you want with no performance drop.
Hardware Unboxed and Gamers Nexus tested the relevance of the PCIe lanes, and their result was that cards with low VRAM needed PCIe bandwidth the most, i.e. 8 and 16 gig cards had no problem running at fewer lanes, but the low-end cards really needed it.
Makes sense. A lot of the time at runtime isn't spent sending big bricks of data over, it's commanding the GPU to do stuff and sending small bits of data. It's loading times that would be affected by bandwidth limitations.
I like how you became a lot more active on your channel
Weirdest bottleneck
“Its not about the length. Its about how you use it”
-Bringus Out Of Context
Add a server fan to the chassis letting the cpu cool off like no tomorrow haha
you sir, deserve my subscription... the use of game effect sounds is deserving alone.
Even sexier in the audio equivalent of 240p.
Any audio engineers think the buzzing might be a ground loop?
Doom coders know how to code a graphics pipeline 😎
You have the time, patience and skills that people like me only dream of having ❤
I’d like to think that AI chip would probably use more PCIe lanes, did you try it?
That connector (A+E key) only offers PCIe x1 afaik.
M.2 M key is the variant of the M.2 family with the most lanes available, and even then manufacturers sometimes route only x1 of it if that's how they designed the board
audio is fine, adds character
The settings with the RX 6950 XT don't make that big of a difference because the bottleneck is in the number of draw calls the CPU can make to the GPU over the PCI-E link. A draw call is the CPU basically telling the GPU "go do this", and when you turn the settings up the CPU doesn't necessarily make more draw calls. It changes what's being asked: "go do that".
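A little toy Python sketch of that point; the Renderer/scene names are made up for illustration, not any real graphics API, but it shows why cranking settings doesn't have to add draw calls:

```python
# Simplified sketch of why higher settings don't necessarily mean more draw
# calls: the CPU still submits roughly the same list of "draw this object"
# commands, it just asks for more expensive shading per command.

class Renderer:
    def __init__(self):
        self.draw_calls = 0

    def draw(self, mesh, material, quality):
        # One draw call regardless of quality; only the GPU-side cost changes.
        self.draw_calls += 1

def render_frame(renderer, scene, quality):
    for obj in scene:
        renderer.draw(obj["mesh"], obj["material"], quality)

scene = [{"mesh": f"mesh_{i}", "material": "pbr"} for i in range(5000)]

low, ultra = Renderer(), Renderer()
render_frame(low, scene, quality="low")
render_frame(ultra, scene, quality="ultra")
print(low.draw_calls, ultra.draw_calls)  # 5000 5000 -- same CPU-side work
```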
I think the low core clock also contributes to the bottleneck
Ah yes, finally the youtube algorithm brought me something worth laying my eyes on for almost half an hour. Deserves a sub
amazing video thank you mr bringus
the dawid clip made my day - im surprised at the amount of tech people I know who have no idea that creators like you or dawid or even linus exist
As someone who daily drives a headless laptop that has two desktop CPU coolers to cool the CPU and GPU, I can’t believe someone has created something more jank
Sounds like a Linux user
@@Rocky712_ i have a ubuntu partition on my boot drive, however i daily drive windows lmao.
Well, i wish i could find a good deal for a headless gaming laptop, seems fun
how u connect fan to laptop and not crush the chips ? + power the fans!
@@lelkasa361 i use the stock heatsink, and attach the tower coolers to that, and for a fan, its a table fan on my desk
Yo, just found this channel and u are the version of a car guy sticking an LS in whatever it can fit, and shit, if it don't fit we'll make it fit, and I love it. This is an awesome ass channel
The hard FPS cap is probably because of a CPU bottleneck. The times where the frames suddenly dip to 1fps occasionally is most likely because of PCIe bandwidth limitation.
this is like giving dope to a toddler and expecting it to be a grown man (but he covers the whole room in sh*t)
I love this. Thank you for combining old tech with new.
ok I think this was one of the most exquisite experiences I ever had with a video, but also, I'd love to have something this good, holy crap wdym jank, that's what peak performance looks like
This guy's like Dawid but he can pronounce "Thirty"
THIDDY
@@BringusStudios LET'S INSTALL IT IN MY PEECEE
And less jank, 3d printing was involved instead of duct tape
@@steelfox1448 3d printing is the duct tape of the additive manufacturing world.
😂
As I came on YouTube for the first time in 5000 years, I see this video. I instantly clicked on it and subscribed.
Personally love this type of content. Would kill to see this finished clean and polished
We are gaming. Gamers
Facts
Yes
gamers grab your monster energy and lets freakin gaaaaaaaame
WEVE FINALLY ACHIEVED GAMING.
The design is very gamer.
The indifferent cruelty of the universe vs. the indomitable PCIe backwards compatibility
this video is so unhinged. premium bringus content! keep it up!
Props for the endurance and dedication to this project! Really enjoyable!
When first watching I was hoping for an easy cheap project but didn’t turn out that way :D
9:08 damnnnnn, who did you have to pay for THOSE epic special effects!
I drive by Poo Poo Point for work in Issaquah, WA a few times a week lol
Common Chrultrabook W
I have to thank YouTube recommendations for making me stumble across this crazy channel.
12:20
Bro's trying to fit a bus in a bike lane right now 💀
petition to name the contraption the Google Meat
God I love your sound effects
Man, I get so pumped from your videos! Would love to see the Google Meet Gaming Hub with the proper cooling for the CPU, maybe liquid metal + Noctua fan and then design the new back cover for ventilation?
You did very well in minimising the jank. I think it looks very decent. Well done!
18:15 that was DEFINITELY "doom audio"
Now build a full enclosure to make it into a full pc
I greatly appreciate your plentiful usage of half life sfx.
This feels like an LTT video but with less short Linus dropping things
im so happy to see a continuation
i love how he relocated the wifi card to the back and rerouted the antenna cables instead of just installing the SSD where the coral chip was
Chrultrabook spotted, 10/10 video
16:38 gave me flashbacks and i thought my computer blue screened
I just got finished with a jankier version of this two hours before the video premiered. We are all truly That Predictable
I've gamed with a GPU over an x1 slot and it worked surprisingly fine
Again, love it when he makes a video after school
I actually would pay this guy to make a mini pc like this
0:04 It will, it absolutely will... 😂
You're right, the realization of it being a PCIe 3 x2 did disgust me
this google meet is better than my pc 💀💀💀💀
Totally addicted to you man, the first video of yours i watched was the Android on iPhone thing, since then i watch every video that gets recommended to me. Totally worth it. It's so fun to see you game on stuff that no one imagined was possible
8:57 I don't know why I laughed so hard at this. Probably because I have bought enough shitty peripherals from Aliexpress to expect that the entire thing is going to shit. 🤣
I honestly love projects like this so much. I've never even heard of Google Meat before this, and I'm loving it being absolutely twisted from what it was meant to be lol
It’s probably google using a cheap m.2 connector since they probably just cheaped out and pushed these to workplaces in 2020-2021 to make a quick buck
21:10 It's because the gpu was being bottlenecked. If you run higher graphics settings, the gpu just uses the headroom it already had, so there's less of a gpu bottleneck and the fps stays the same. The same applies when you run a higher resolution (for example, going from 1080p to 1440p)
Reminder that a PC case also shields any "buzzing" through every ground cable, motherboard, device and connection. The GPU and PSU being caseless might also be the issue, or you have a heck of a microphone that can pick up coil whine. If you didn't have any buzz before replacing the PSU, that also says a lot about the difference in build quality between brands... Or having too much high-voltage electronics near a microphone could also be the issue.
Amazing this thing can game when it wasn't meant to game, lol.
Windows buzzing crash blue screen death
Regarding Far Cry 6 with switching the resolution so that RSR would kick in: you didn’t actually have to do that. The game actually has AMD FSR 1.0 built in, which is exactly the same as RSR but without the UI being affected. Enabling FSR with the Quality preset would have given you the same result as dropping the resolution to 720p with RSR enabled without having to restart the game.
So you moved the wifi card to be able to connect the ssd that you needed to move to be able to connect the gpu, but couldn't you just put the ssd in the slot you moved the wifi to, so you didn't need to reroute the cables?
He doesn't reply 😢
@@Lewisnotfound1980 He replied to someone else and said "I tried, but the bottom of the board didn't have enough clearance with the shell when the SSD + adapter was installed."
That thing is actually pretty absurdly well equipped with 3 separate M.2 slots! pretty rare to see in a mobile device, most people end up using USB WiFi or whatever if they do this.
14:45 it could actually be that the cable or something else is dodgy, and although the slot is wired for x4, the PCIe negotiation is failing at x4 and so it's falling back to x2
Basically, PCIe will automatically drop the link width down to the maximum that works reliably in a particular hardware setup, so it will work even with dodgy PCIe card and/or motherboard designs, dodgy riser cables, or hardware whose differential-pair length or other mismatches add up to push the pair outside the tolerance PCIe allows. (A differential pair is an electrical signalling method that uses two conductors of equal length and matched characteristics, driving opposite voltages onto the two, to get more reliable and less error-prone signalling than ground-referenced high/low voltage signals.)
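Here's a very simplified Python sketch of that fallback idea; the trains_ok check is just a stand-in for the real electrical link-training process, not how the spec actually does it:

```python
# Rough sketch: during link training, both ends settle on the widest link
# that actually works, stepping down (x16 -> x8 -> x4 -> x2 -> x1) until the
# lanes train cleanly. `trains_ok` stands in for the electrical training.

STANDARD_WIDTHS = [16, 8, 4, 2, 1]

def negotiate_width(card_max: int, slot_max: int, trains_ok) -> int:
    upper = min(card_max, slot_max)
    for width in STANDARD_WIDTHS:
        if width <= upper and trains_ok(width):
            return width
    raise RuntimeError("link failed to train at any width")

# Slot wired for x4, card supports x16, but a marginal riser only passes
# training at x2 or below:
marginal_riser = lambda w: w <= 2
print(negotiate_width(card_max=16, slot_max=4, trains_ok=marginal_riser))  # 2
```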
I don't have the ability to write off a 6950 XT for my potato projects, so when I use an eGPU setup, I just use my old RX 580 8GB card. Still a solid card today since it supports all the newer AMD upscaling tech. The 8GB of VRAM is more of a limitation than its raw processing power, IMO.
4:40 wifi card will remember that
Thank god for @BringusStudios living out my impulsive thoughts for me, so many electronics saved by him scratching the itch :P
24:12 first time hearing an American use Celsius instead of Fahrenheit
We use Celsius for electronics, we don’t use Fahrenheit for that.
The buzzing gives it ✨️personality✨️
15:18 big gpu go fast
Hi, Bringus! Thank you for this video about the Google Meet conference device! Just one thing I want to explain: the Coral AI board is meant for neural network / AI workloads, the kind of thing behind AIs like ChatGPT. It isn't a GPU for gaming.
bottle neck speedrun any%
The PCI-E probably splits between the Wi-Fi slot and the m.2
16:49 because it's fullscreen windowed, choose fullscreen
at 1080p, cpu usage is higher than at 4k for example; higher resolutions make the gpu do more of the work
huh endeavour's NVMe Root Port is config'd for x4 lanes so that slot should work better, if it doesn't that's probably the fault of the aliexpress adapter lel
then again I'm probably not discovering anything the chrultrabook guys haven't already found
We recently found out that the nvme port is only wired for x2 on endeavour, despite the fizz baseboard setting it to x4. Other fizz boards seem to not be affected, only endeavour. I imagine this is related to the extra coral slots but I'm not too sure.
Also, in lspci we found that the root port was only running at x2.
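If anyone wants to check their own negotiated width, something like this works on a Linux box with lspci installed; it's just a convenience wrapper around `lspci -vv`, which may need root to show the link fields:

```python
# Prints each device's advertised link capability (LnkCap) vs. what it
# actually negotiated (LnkSta), pulled from `lspci -vv` output.
import re
import subprocess

out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

device = None
for line in out.splitlines():
    if line and not line.startswith(("\t", " ")):
        device = line.strip()  # new device header line
    m = re.search(r"(LnkCap|LnkSta):.*?(Speed [^,]+, Width x\d+)", line)
    if m:
        print(f"{device}\n    {m.group(1)}: {m.group(2)}")
```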