Dawid: runs everything off of a USB external drive.
Also Dawid: everything is stuttery
It’s USB C. 😂
@@DawidDoesTechStuff USB-C just means it's running at a higher speed, it means close to nothing when it comes to response time, hence the stutteriness
I run games off a USB C external SSD and it's not stuttery at all tbh. I have OSD set up the same way so I can see the FT graph as well. No clue why it would be in this scenario... And Doom Eternal wasn't doing it... could the VGA adapter or monitor itself affect it? Or maybe the USB cable/port or the external drive needs a scan?
I only figured out this was my problem with two games earlier this year; one of them I stopped playing 2 years ago due to the stutter lol
@@CarkBurningMoo do you own a 3090 tho?
It's crazy how good the Doom game engine is. No frame dropping. Every other game you get so many frame drops!
because he's running it on an external USB drive
“Hello Nvidia, this is Linus. Any idea where my 3090Ti went?” “You sent it to who? And he did WHAT with it? Wow!”
I can hear this in Linus voice
Hopefully Nvidia mentioned to Linus that Dawid did not drop the GPU! 🙂
I can hear his voice too
@@smd9591 Dogs start awooooing in the background
Linus doesn't ask for one, he asks for 10, as always, if not more
14:32 A lot of Nvidia's higher end cards have coil whine like this. For anyone reading this and unaware of a solution, a mild undervolt may help, at the cost of a very mild performance drop. In the case of this 3090 Ti, I imagine the power delivery is quite aggressive, hence the resulting whine.
the whine comes from the vrms right?
@@falkez1514 The whine comes from the transistors in the GPU/CPU. It is usually associated with cheap power delivery (which includes the VRM).
@@Jmich69 oh shit, 1y ago and still came back for it, pretty chad ngl
so the whine becomes worse when the power delivery is hit then, that makes sense I guess. but I thought inductors were the biggest whine generators, how can transistors whine?
@@falkez1514 Something about the frequency of the transistors hitting an audible harmonic. I honestly don't know enough about the science and detailed reasoning behind it, but I understand that having a higher quality VRM can help prevent a GPU from having coil whine. Something about the cleaner and more consistent power delivery. You can most often find bad coil whine in cards with the bare minimum allowed power delivery. You can also find coil whine in a top end ROG STRIX card. It's much less likely, but it can happen.
@@Jmich69 It's called coil whine for a reason. GPU power consumption varies during different stages of rendering a frame. This causes fluctuations in the inductor current of the VRM stages and therefore in the magnetic fields inside them. The copper windings of the inductors start to vibrate a little under the influence of this changing magnetic field. The reason high-end cards have this more often is because they draw more current per stage and have more stages that make the noise, while also increasing the frequency as they are slightly out of phase by design, pushing it into a range where the human ear is more sensitive. Frame rate also affects the frequency, which is why in some games it's possible to "modulate" the frequency by turning the camera.
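For anyone wanting to script the undervolt mitigation suggested above: a proper undervolt is done on the voltage/frequency curve in a tool like MSI Afterburner, but a rough, scriptable stand-in is capping the board power limit through NVML. This is only a sketch under assumptions not stated in the thread (an NVIDIA card, the nvidia-ml-py package, admin rights), and the 85% figure is an arbitrary example, not a recommendation for this card.

```python
# Rough sketch (not from the video): cap the board power limit via NVML as a
# blunt stand-in for a real undervolt. This only limits total draw, which can
# still tame whine during power spikes. Needs nvidia-ml-py and usually
# admin/root rights for the set call.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
print(f"Default power limit: {default_mw / 1000:.0f} W")

# Example only: drop the limit to ~85% of stock (values are in milliwatts).
pynvml.nvmlDeviceSetPowerManagementLimit(handle, int(default_mw * 0.85))

pynvml.nvmlShutdown()
```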
The 3090ti - undisputed king of 540p gaming.
800x600
@@gpubenchmarks7905 540p is a tiny bit heavier than 800x600. But yeah, below that is just dog shit.
540p? Even my old GTX 960 could do 540p without getting hot! I upgraded from it to a 3060 and may sell the 960. I don't expect to get a fortune for it, I paid maybe $200ish for it long before the scalping sh!tstorm began.
@@charleshines1553 gimme it for 20 bucks
@@charleshines1553 my 4090 absolutely destroys 144p
Discovered this channel a few days ago. The main reason I love it is that this madman will go out of his way to do incredibly stupid, incredibly entertaining things lol
shame he didn't use a CRT with the Mesozoic-era connector, would have been more nostalgic
watching people play with tech i will never ever be able to afford makes me happy for some reason
Until 15 years later when they become semi-affordable
That would make me extremely upset and jealous.
This setup made me a little jealous, and I have a Ryzen 9 5950X, 64 GB DDR4, RX 6900 XT.
@@TheRealNeoFrancois That’s true. 10 or 15 years from now this GPU will be like $100.
@@DawidDoesTechStuff Doubt it’d even be that much considering prices on flagship GPUs from 10-15 years ago now, although we’ll probably be in another shortage by then 😂
The higher power thing was a really good hint about the 4000 series and the phase array for those extremely power hungry (according to rumors) cards. Keep up the hard work!
definitely should have tested it with a CRT monitor
viewing on my crt
@@KokoroKatsura same
I think the best match would be an 80-column monochrome or amber CRT ;-)
Some of those can actually go up to 100 Hz, and colors are pretty vivid on them too. I think he should give it a shot.
Especially considering he's using a 1080p60 monitor which is still a decent resolution and VGA is perfectly capable of displaying it. This video was pretty pointless.
I love that david makes videos about situations that we can all relate to and totally aren't batshit crazy
love how he gets sent stuff by Gigabyte and then hints at fire hazards with their PSUs
My friend actually had a GPU from Gigabyte that he only had for 1 year, and it stopped working. He sent it in to Gigabyte under warranty and they refused to fix it. They're not a very good company
(explosions in the distance)
@@marcogenovesi8570 (half life 2 explosion sound)
@@AtomSquirrel not the first time I've heard of that
NZXT is the bigger fire hazard…😁😆
DOOM is so well optimized, it always uses 100% of the hardware available. In CP 2077 you can see the GPU never hitting higher than 80%... so basically there's potentially 20+ more FPS there, but the game isn't drawing the power needed to render them.
In every resolution comparison I've seen with high end/enthusiast range GPUs it's always the same: low GPU usage, so not that much performance gain going from even 8K down to 1080p (MSFS 2020 shows a 10 FPS difference from 4K to 1080p, for example).
cp
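If you want to check the kind of GPU-usage gap the comment above describes for yourself, a minimal utilization logger does the job. This is just a sketch and assumes an NVIDIA card with the nvidia-ml-py bindings, neither of which the comment mentions.

```python
# Minimal utilization/power logger to see whether a game is actually keeping
# the GPU busy (assumes an NVIDIA card and the nvidia-ml-py package).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)      # .gpu / .memory, percent
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # reported in milliwatts
        print(f"GPU {util.gpu:3d}%  memory ctrl {util.memory:3d}%  {power_w:6.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```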
You took the most powerful GPU right now and basically turned it into the butter robot from Rick and Morty. I love this channel.
What is my purpose
You turn a 3090 Ti into a 2060 Super
…….oh myyy Ghhhhhaaaaaaad
Me too. Really!
But I really do not like Rick and Morty or any kind of comparison with this glorious YT channel :D ;) No offense bro!
I love this YouTube channel (and that analogy), haters gonna hate.
@@JohnDoe-wq5eu True...
It emasculated me with its huge die, so I'll emasculate it with VGA. 😂
Best thing they did with the Ti was getting all the memory on one side. The back side memory on my 3090 ran so freaking hot, 110°C, I was forced to get a double waterblock, front and back. Works great, but really heavy.
Can't imagine spending this much only to get coil whine
If it didn't use as much power as a particle accelerator, no doubt it wouldn't whine as much eh.....
lol it's pretty extreme as well, it's not supposed to be that variable and noticeable
coil whine is a combination of the PSU and GPU. Really depends on the luck of the mix on whether you get it or not. My 3080 coil whines badly with my EVGA 800W PSU but not with my Seasonic 1300W PSU.
I dunno, with some skill you could use it as a full-blown instrument. Sounds like a feature to me xD
@@tonymorris4335 Seasonic. I will never buy a different brand.
1:13 I spy a bent fin on that VRM cooler, tut tut Gigabyte...😂
Every Tech Tuber: "We got the 3090ti in to test. Look at this massive cooling set-up! Now check out these graphs."
Dawid: "Let's make this modern beast of a card, weep like a baby. I'm gonna defeat the purpose of all its power, by using a 1080p 60hz monitor from the before times."
🤣😂🙃😅😭
"Next week, I'll put the whole system into the oven at see 200C. Just to see if you can still game, while living on the planet Venus"
This cooling solution is crazy AF!
It's now starting to snow here on Venus!
Damn! That Venus idea is actually really good. 😂
it was 1080p
he could have gone for those shit 1366x768 monitors
I have one of those, believe me, it would be fun to see such a high end card on a shitty monitor
im fucking dead xd
@@DawidDoesTechStuff now you need a nasa sponsorship
Nvidia designing their GPUs to double as bagpipes is a feature I never knew I wanted.😁
14:30 for reference🎶
Love the dedication to saying "3090 Tie" every time, lol. What even is Nvidia's marketing anymore
oh yeah, we love being subjected to linguistic abuse. Was it wrong of me to pierce my earballs with a knife to stop the pain of listening to dawid say "TIE" over and over again?
saying "TIE" every time winds up Linus.😁
@@hamyncheese The nvidia presenter in their intro video for this card way back when they teased it said "tie" instead of T I
This seems to be the _official_ pronunciation.
@@stephen1r2 They seem to use both pronunciation, depending on who from the company is talking at any given moment.
@@Vengir I think so, that's what makes it funnier. A very high end marketing gaffe. The older ones _were_ marketed as T I, weren't they? Since Ti is titanium, not a Star Wars throwback memory.
Did not expect this to simply work with a generic adapter... I was under the impression NVIDIA GPUs since the 10 series didn't have analog output, requiring an active converter to output VGA.
I just used a DP to VGA cable for a LOOONG while on my 1080P monitor with my 3070. And it worked well
This PCB is the same one that's planned to be used for the 600W 4090. That's the reason for the blank spots.
Wtf. We gotta have a nuclear reactor to run that thing
@@alibozkuer5022 The founders edition will have a Molten salt reactor in the package, should be able to run the 4090 undervolted.
Oh cool! That makes sense.
@@DawidDoesTechStuff YOU'RE SPECIAL ED!!!!!
@@IR4TE 4090 ti AORUS Chernobyl master z360 edition
Great job screwing in the VGA connector at 0:26. "It's proper procedure" -Anthony
I think the next logical step is using this setup with a CRT monitor.
You can decrease resolution for higher refresh rate
1024 x 768 baby! Gotta see Crysis as originally intended.
@Monochromatik Early generation CRTs had translucent faces so they actually can have backlight bleed.
exactly what I wanted to see, and I would like to see if those adapters could run at 1600x1200@85Hz, because that is what I still use on one of my gaming computers with an old high end CRT. The best GPU with analog output is the GTX 980 Ti, but I'm currently using a GTX 970
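If anyone wants to actually try pushing one of these adapters to 1600x1200@85Hz, here's a sketch of registering the custom mode on a Linux/X11 box; the output name "VGA-1" is a guess, so check `xrandr` first. On Windows, CRU or the NVIDIA control panel's custom resolution page does the same job. Whether the adapter works at all is another question, since many cheap HDMI-to-VGA converters top out well below the pixel clock this mode needs.

```python
# Hedged sketch for an X11/Linux box: generate and register a 1600x1200@85
# mode for a VGA output using cvt + xrandr. "VGA-1" is a placeholder output
# name; run `xrandr` first to see what your adapter actually shows up as.
import subprocess

modeline = subprocess.check_output(["cvt", "1600", "1200", "85"], text=True)
# cvt prints something like: Modeline "1600x1200_85.00"  234.75  1600 1720 ...
fields = modeline.splitlines()[-1].split()[1:]
name, timings = fields[0].strip('"'), fields[1:]

subprocess.run(["xrandr", "--newmode", name, *timings], check=True)
subprocess.run(["xrandr", "--addmode", "VGA-1", name], check=True)
subprocess.run(["xrandr", "--output", "VGA-1", "--mode", name], check=True)
```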
Muting the music for the protective film peels, this guy gets it.
4:29 Dawid giving Gigabyte what they asked for, the true Linus video experience
I love how you built this ultra expensive system on top of the boxes. Classic Dawid test bench setup. Awesome.
Not sure if somebody else mentioned it, but if I'm not mistaken, the PCB is actually the same design that's going to be used on the 40 series, and this is basically testing the new power delivery connection
Hasn’t even been that long… come on give the gamers a break from the whole race with scalpers nvidia
heheha
@@Fxmbro "come on give the gamers a break from the whole race with scalpers nvidia" free market and capitalism, wouldn't trade it for anything ;)
a 360 rad on a GPU, I love it. Seems a lot easier to have a 360 GPU rad on the bottom and a 360 CPU rad on top with some exhaust fans than making a whole custom loop
Total budget of the video: like $10-20 for the HDMI to VGA converter.
and $100 for power to run the 3090 tie and 12900k that I want both of so bad
@@shuttleman27c Same, my pc do be kinda old
Here's the chain I want to see.
HDMI to VGA , VGA to RCA/AV, RCA to RF modulator/75 ohm coax, 75 ohm coax to 300 ohm transformer, 300 ohm connector to black and white tv.
Dawid, I would just like to say I've been subbed for a long time now and your growth has been amazing. You're my top 3 favorite tech tuber of all time. Your one liners are top tier caps and your content is top tier botulism antidote. Love you bro, stay awesome.
I have no idea what the video was about but you have an entertaining voice. Could listen to you for hours.
Dawid giving us some more useful consumer advice, now that $2000 price tag on the RTX 3090 Ti makes a lot of sense
This is probably the most DANK tech channel on YouTube.
when they are sending you their best products you know you've made it.
Holy crap, call the fire department near you!! Some disgruntled employee at Gigabyte didn't put all the VRMs on your board and wants to make another combustible PC component. Maybe there are supposed to be VRMs in those empty spots, but Gigabyte's CEO read a Stephen King book and wants to get his GPUs into the advertising for the upcoming "Firestarter" movie before it releases on May 13. O_o I hope they do another MONSTER GPU that uses all those VRMs and they could call it the XXXtreme 3090Ti++ HAHAHAHA You did an incredible critique of that beast GPU and I wish more YouTubers would bring some fun into their reviews like you do.
Love this concept, Dawid! Might I suggest taking it further and running the least demanding games you can think of on it? Let's see if you can get that framerate counter up in the thousands in gameplay. :)
Would the frame rate budge from 999 in Half-Life 2 1080p High Settings?
F.E.A.R. Let Alma loose!
Windows Solitaire 240P 2560fps
i bet minesweeper would get insane fps on that card
Clustertruck at 480p. The coil whine will make glass evaporate.
Dude, STOP using an external drive in your benchmarks! That's where half the stutters were coming from. You should know better than this! Just plug in a SATA SSD with the game files already on it. You can have Steam scan the folder and it'll recognize the game installs. Don't do this again! Some games like Cyberpunk would have run much faster with an SSD installed on the motherboard.
this should be common knowledge.
My external drive is fast enough to game on, his probably is too. I've had mine for 3-4 years…
I wish there was a "Dawid Option" for all sorts of entertainment and information. I enjoy all of your videos, and am glad that it looks like your channel is gaining momentum. Thanks for all the chuckles.
Would go ballistic for a GPU like this.
Awesome vids, Dawid, thanks for testing all this stuff and making us laugh.
16:07 Who needs a guitar when you have a 3090TI.
"Fins Array III", sounds like a new space telescope they might be building up in Scandinavia.
You should hook up 4 of the Mesozoic era monitors to it and see how it does with them. Might need a new monitor arm as well
that would be so epic for multi monitor setup then 🤣😂🙃😅
@@raven4k998 it really would be, I'd love to see him do it. He needs to use his porn music as he builds the setup
that's a good test, how many Mesozoic period monitors can a 3090 "tie" run?
@@Angmar3 ah yes, the 3090 "tie"... Imma go with 4 lol
@@Angmar3 imagine it running 10 at the same time. The problem is the amount of adapters and shit you need for that sort of setup though
Most entertaining tech CC out there. Always caught off guard when a joke is dropped in.
I would love to see a 3090 TIE being put onto a 1st gen PCI-E slot or a really low lane m.2 slot with many dongles for a 16 lane adaptor.
Just slap on some rgb to this setup and it will make the gaming 42069% better. Trust me bro
holy shit your channel grew fast. big grats
Hey Dawid, I love the video! When companies send over stupidly expensive products like this, do you have to send them back, or do you get to keep them?
I think he gets to keep them, as the Corsair power supply used in the video was apparently sent to him a while ago.
Stupidly expensive stuff gets sent back, that's normal practice
It generally gets sent back; when they can keep it, they say so
@@insert_username_here Corsair's probably gonna wait a while before sending more things over XD
This content was a great Dawid justification of just saying: "ye I'd like to have a 3090 laying around".
"That power button is worth $200 at least!"
My GOD Dawid!!! DONT TELL THEM THAT! 🤣
he doesn't need to
that's already what they're charging for them
@@Shpoovy Exactly what I was gonna type. 😂
@@DawidDoesTechStuff
ua-cam.com/video/BNvoM0T8tXs/v-deo.html
man... and to think my old ddr2 ECS board has both power and reset buttons lol.
I love this channel for the creativity, not just the usual mainstream reviewing stuff.
You go out of your way for entertainment and do things just for the heck of it rather than a straight-up review.
Love your videos man
had a similar experience with a USB game drive, in many games the fps is very stuttery especially at the beginning of the game, and it won't go away even with USB 3.2 Gen 2x2 speed. I'm using an NVMe SSD in a USB external enclosure.
after switching to a SATA SSD as the game drive the whole thing went away
🤔
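One way to check whether the external drive is actually the stutter culprit, as the comments above suspect: time some small random reads against a big game file on each drive and compare the worst case. A rough, self-contained sketch; the file paths are placeholders, not anything from the video.

```python
# Rough stutter check: time small random reads from a large file on the
# suspect external drive vs. an internal SSD. Paths below are placeholders.
# Run it a couple of times; OS caching can flatter the second pass.
import os
import random
import time

def sample_read_latency(path, reads=200, block=64 * 1024):
    size = os.path.getsize(path)
    total = worst = 0.0
    with open(path, "rb", buffering=0) as f:
        for _ in range(reads):
            f.seek(random.randrange(0, max(1, size - block)))
            t0 = time.perf_counter()
            f.read(block)
            dt = time.perf_counter() - t0
            total += dt
            worst = max(worst, dt)
    return total / reads * 1000, worst * 1000  # average ms, worst ms

for label, path in [("external", "E:/Games/bigfile.pak"),
                    ("internal", "C:/Games/bigfile.pak")]:
    avg_ms, worst_ms = sample_read_latency(path)
    print(f"{label}: avg {avg_ms:.2f} ms, worst {worst_ms:.2f} ms")
```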
It's actually amazing what you're capable of doing over analog.
since AM4's end is near, can you make a tier list of lowlow, low, normal, good, medium, medium well, high, and so-high-you-need-to-sell-your-body motherboards, or any AM4-compatible device? i.e. GPUs, APUs, or just whichever you think is worth it on a money-to-insides ratio?
Good idea. I feel less guilty hacking the shit out of cheap last-gen gear.
@@SchoolforHackers Why would you feel guilty about it? It is your hardware now, do as you please with it.
I'm fairly certain that the extra spaces for the power delivery (and the additional space for a second 16 pin power connector on the top right corner) is for the same PCB to be compatible with the RTX 4090 in the future which keeps some costs down
DUDE dat coil whine OMG 🤣
I can see why they made the shell for the pcb plastic, I mean I’m pretty sure it’s already heavy with the block on it so the plastic would help with sagging
You should have gone with a DP to VGA converter instead, since that gives you the most "native" VGA signal
Great vid. Thanx Dawid. you are my fav uploader!
8:29, am I seeing that correctly that he is only using one power connector for the CPU, while the CPU is the most power-hungry CPU to ever exist? :O
I mean it is an eight pin cpu connector
@@jordangames2560 but there are two of them and I think plugging both in isn't a bad idea with a 12900K
8:02 GIGABYTE be like we will never make any psus again !
Fact: NVidia actually call it a tie, not TI like we all do
It's T I 💦💦
Only in recent years. Throughout the older gens, pre-30 series, it's been T I.
Is it really a tie with the 6900XT? 🤔
Yes, it comes from the word Titanium! Sounds stupid imo
Thai
Man, these golden videos just keep rolling! Keep it up man 👍
Finally someone who actually takes the cards apart, thank you!
13:29 That bit cracked me up! you crazy man lol
I think my first Intel motherboard had Cretaceous-period technology
Dawid said sponsors and my mind went instantly to the word Linode. That word is a massive part of your branding, or at least to me. ;)
Great video, tonnes of Dawid brand humour. That 3090ti was sexy!
This is the kind of Dawid Does Savagery that I like.
14:30 Haha The GPU was actually singing Doomly... 🤣 😂....
that sound was air trapped in the pump of the GPU. The rad was lower than the pump/block, so air was trapped and we were hearing the pump rev up to account for more activity.
That is a monster graphics card, my next upgrade someday. Still fun watching
4:16 As an actual IRL oil baron, I seriously appreciate this comment.
I just love the way you talk about the parts
5nm GPUs are monsters
I hope we see more budget 5nm GPUs next year
2010: This is how you get a 10-15 fps boost
2022: This is how you get a couple hundred fps boost
I don't know exactly what it is, but there is that distinctive crap-factor to all your videos (with a fairly high production value tbh) that I absolutely adore. Oh, and your way of describing certain circumstances with absolutely hilarious made-up diseases and other states of being: lyrical masterpieces :D
I found this channel a few years ago trying to run Fortnite on my crappy PC. Insane to see the growth from those days. Well deserved, keep it up.
And this is why I watch Dawid: he doesn't do what everyone else does
A good quote from Jurassic Park that applies to this
-Is it heavy?
-Yes.
-Then it’s expensive, put it down.
Dawid, you have those horrible scammers pretending to be you. I did NOT win a PC from you, but what you do win is my subscription. Love you bud, keep it up
You know you've hit the big league when Dawid gets a 3090 Ti and maxes out 960p on a 60hz Dell dinosaur monitor 🥰😇👍😜
Somehow this is a video I would absolutely expect from Dawid lol
Is it just me, or is every heatsink on that mainboard expertly designed to trap heat and mercilessly cook whatever component sits below?
That should be the actual Microcenter jingle
I request that Gigabyte let you keep the card for future content, and send one to all of your viewers as well.
I'm relatively new to the PC building space, and from all that I have learnt in these 6 months or so, seeing a graphics card attached to an AIO from the factory (this is the first time I'm seeing one) has to be wild. Let's hope my expectations are not dashed by the end of this video
0:00 love that kurzgesagt poster
I never realised how amazing DOOM's engine is. I only recently discovered how badly some games can run even when there's a lot more GPU potential available to them. 90+% usage at the lowest settings on a 3090 Ti is really impressive. I didn't even think 80% usage of a 12900K was possible from just gaming yet
Thanks to Nvidia and Gigabyte for sending you that stuff. I love these videos and glad to see you get the recognition!!
"Burn down a puppy orphanage" LOLOLOLOLOLOL Dawid, you clever bastard
"So what instrument do you play"
Dawid-"Oh I play the Graphics card"
xD
I think it would be great if you could release some info on the thickness of the thermal pads and even do a thermal pad replacement video. You could measure the difference between high end thermal pads and the stock ones for VRAM temperatures. You could even apply some quality thermal paste to the GPU while you're at it. As seen in the video, it doesn't seem too tricky to remove the shrouds and install new pads and paste.
They definitely left those spaces open so that later down the road they can continue to use that same board on newer models and push even more power through it.
Dawid when he got the motherboard and GPU was like: sike, that's the wrong number
They even give you a badge in the top foam piece. Cool.
Oh, oh, TWO sponsors in a single video, Dawid has hit the big time, next video Dawid goes house shopping.. No, wait, he's in Canada, forget it.
Aye. I have the regular 3090 version of that card. I was almost upset when I didn't immediately see you get the figure as well.
I like how dawid measures size in motherboards
Seriously love your videos. Look forward to each new one. Also, you could get 6 vga monitors running with 2 3090 Tis and some adapters. 🤣🤣🤣 please do it.
that should be micro center's ad intro of some sort from now on