You know you could run 6 DisplayPort monitors off of this card, as DisplayPort can daisy chain from one monitor to another. Just a thought :). If I recall, the board design for the 3090 Ti can also be used on NVidia's next generation GPUs, which would explain the extra power delivery spaces given the expected power draw of the next generation GPUs from NVidia.
I'm a relatively new person in the PC build space, and from all that I have learnt in these 6 months or so, seeing a graphics card attached to an AIO from the factory (this is the first time I am seeing one) has to be wild. Let's hope my expectations are not dashed by the end of this video
I was going to make a joke about how Gigabyte needed to shill out the advertisement dosh after the PSU scandal, but now I'm more focused on Dawid's joke about the watercooled GPU's shroud. "Oh, it's made out of plastic? I was expecting mythril..." 🤣
I think it would be great if you could release some info on the thickness of the thermal pads and even do a thermal pad replacement video. You could measure the difference between high end thermal pads and the stock ones for VRAM temperatures. Can even apply some quality thermal paste for the GPU as well while you are at it. As seen in the video it doesn't seem too tricky to remove the shrouds and install new pads/paste.
I would be very upset with that zip tied fan wire if I spent that much on a card, why is that hanging out instead of being run under the nylon sleeve...?
I have the same PSU but 850W. I got it for the Asus TUF RTX 3080. I am a bit worried because after I bought it, I read some bad reviews about it on Amazon. Love your video :)
I don't know exactly what it is, but there is that distinctive crap-factor to all your videos (with a fairly high production value tbh) that I absolutely adore. Oh, and your way of describing certain circumstances with absolutely hilarious made-up diseases and other states of being: lyrical masterpieces :D
Try to overclock the display with the CRU app, certain screens support that. It can be nice to see the difference between an overclocked screen and a real high quality screen
“Hello Nvidia, this is Linus. Any idea where my 3090Ti went?” “You sent it to who? And he did WHAT with it? Wow!”
I can hear this in Linus voice
Hopefully Nvidia mentioned to Linus that Dawid did not drop the GPU! 🙂
I can hear his voice too
@@smd9591 Dogs start awooooing in the background
linus doesn't ask for one, he asks for 10, as always, if not more
Dawid: runs everything off of a USB external drive.
Also Dawid: everything is stuttery
It’s USB C. 😂
@@DawidDoesTechStuff USB-C just means it's running at a higher speed, it means close to nothing when it comes to response time, hence the stutteriness
I run games off a USB C external SSD and it's not stuttery at all tbh. I have OSD set up the same way so i can see the FT graph as well. No clue why it would be in this scenario.. And Doom Eternal wasn't doing it.. could the VGA adapter or monitor itself affect it? Or maybe the USB cable/port or the External drive needs a scan?
I only figured out this was my problem with two games earlier this year, after one of them I stopped playing 2 years ago due to the stutter lol
@@CarkBurningMoo do you own a 3090 tho?
The 3090ti - undisputed king of 540p gaming.
800x600
@@gpubenchmarks7905 540p is a tiny bit heavier than 800x600. But yeah, below that is just dog shit.
540p? Even my old GTX 960 could do 540p without getting hot! I upgraded from it to a 3060 and may sell the 960. I don't expect to get a fortune for it, I paid maybe $200ish for it back long before the scalping sh!tstorm began.
@@charleshines1553 gimme it for 20 bucks
@@charleshines1553 my 4090 absolutely destroys 144p
14:32 A lot of Nvidia's higher end cards have coil whine like this. For anyone reading this and unaware of a solution, a mild undervolt may help, at a very mild performance drop. In the case of this 3090 Ti, I imagine the power delivery is quite aggressive, hence the resulting whine.
the whine comes from the vrms right?
@@falkez1514 The whine comes from the transistors in the GPU/CPU. It is usually associated with cheap power delivery (which includes the VRM).
@@Jmich69 oh shit, 1y ago and still came back for it, pretty chad ngl
so the whine becomes worse when the power delivery is hit then, that makes sense i guess. but i thought inductors were the main whine generators, how can transistors whine?
@@falkez1514 Something about the frequency of the transistors hitting an audible harmonization. I honestly don't know enough about the science and detailed reasoning behind it, but I understand that having a higher quality VRM can help prevent a GPU from having coil whine. Something about the cleaner and more consistent power delivery. You can most often find bad coil whine in cards with the bare minimum allowed power delivery. You can also find coil whine in a top end ROG STRIX card. It's much less likely, but it can happen.
@@Jmich69 It's called coil whine for a reason. GPU power consumption varies during different stages of rendering a frame. This causes fluctuations in the inductor current of the VRM stages and therefore in the magnetic fields inside them. The copper windings of the inductors start to vibrate a little under the influence of this changing magnetic field. The reason high-end cards have this more often is because they draw more current per stage and have more stages that make the noise, while also increasing the frequency as they are slightly out of phase by design, pushing it into a range where the human ear is more sensitive. Frame rate also affects the frequency, which is why in some games it's possible to "modulate" the frequency by turning the camera.
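The frame-rate part of that explanation can be sketched in a few lines of Python. This is a toy illustration with made-up assumptions (a clean per-frame load pulse, a 1–5 kHz "most sensitive" hearing band), not a measurement of any real card:

```python
# Toy model: if VRM load pulses once per rendered frame, the inductor
# current carries harmonics of the frame rate. Count how many of those
# harmonics land in the ~1-5 kHz band where the ear is most sensitive.
# The band edges and harmonic cutoff are illustrative assumptions.
def audible_harmonics(fps, max_harmonic=100, band=(1000.0, 5000.0)):
    """Return the harmonics of `fps` (Hz) that fall inside `band`."""
    lo, hi = band
    return [k * fps for k in range(1, max_harmonic + 1) if lo <= k * fps <= hi]

if __name__ == "__main__":
    for fps in (60, 240):
        hits = audible_harmonics(fps)
        print(f"{fps} fps: {len(hits)} harmonics in the sensitive band, "
              f"starting at {hits[0]:.0f} Hz")
```

It also matches the "modulate the frequency by turning the camera" observation: changing the frame time shifts every harmonic at once, so the pitch slides with the frame rate.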
watching people play with tech i will never ever be able to afford makes me happy for some reason
Until 15 years later when they become semi affordable
That would make me extremely upset and jealous.
This setup made me a little jealous, and I have a Ryzen 9 5950X, 64 GB ddr4, RX 6900X.
@@TheRealNeoFrancois That’s true. 10 or 15 years from now this GPU will be like $100.
@@DawidDoesTechStuff Doubt it’d even be that much considering prices on flagship GPUs from 10-15 years ago now, although we’ll probably be in another shortage by then 😂
It's crazy how good the Doom game engine is. No frame dropping. Every other game you get so many frame drops!
because he's running it on an external USB drive
love how he gets sent stuff by gigabyte and then hints at fire hazards with their psus
My friend actually had a GPU from Gigabyte that he only had for 1 year, and it stopped working. He sent it in to Gigabyte under warranty and they refused to fix it. They're not a very good company
(explosions in the distance)
@@marcogenovesi8570 (half life 2 explosion sound)
@@AtomSquirrel not the first time I've heard of that
NZXT is the bigger fire hazard…😁😆
Discovered this channel a few days ago. The main reason I love it is that this madman will go out of his way to do incredibly stupid, incredibly entertaining things lol
shame he didn't use a crt with the Mesozoic era connector, would have been more nostalgic
You took the most powerful GPU right now and basically turned it into the butter robot from Rick and Morty. I love this channel.
What is my purpose
You turn 3090ti into 2060 super
…….oh myyy Ghhhhhaaaaaaad
Me too. Really!
But I really do not like Rick and Morty or any kind of comparison with this glorious yt-channel :D ;) No offense bro!
I love this UA-cam channel (and that analogy) haters gonna hate.
@@JohnDoe-wq5eu True...
It emasculated me with its huge die, so I'll emasculate it with VGA. 😂
The higher power thing was a really good hint about the 4000 series and the phase array those extremely power hungry ( according to rumors ) cards. Keep up the hard work!
definitely should have tested it with a CRT monitor
viewing on my crt
@@KokoroKatsura same
I think the best match would be an 80-column monochrome or amber CRT ;-)
Some of those can actually go up to 100Hz, and colors are pretty vivid on them too, I think he should give it a shot.
Especially considering he's using a 1080p60 monitor which is still a decent resolution and VGA is perfectly capable of displaying it. This video was pretty pointless.
I love that david makes videos about situations that we can all relate to and totally aren't batshit crazy
Every Tech Tuber: "We got the 3090ti in to test. Look at this massive cooling set-up! Now check out these graphs."
Dawid: "Let's make this modern beast of a card, weep like a baby. I'm gonna defeat the purpose of all its power, by using a 1080p 60hz monitor from the before times."
🤣😂🙃😅😭
"Next week, I'll put the whole system into the oven at, say, 200C. Just to see if you can still game while living on the planet Venus"
This cooling solution is crazy AF!
It's now starting to snow here on Venus!
Damn! That Venus idea is actually really good. 😂
it was 1080p
he could have gone for those shit 1366x768 monitors
i have one of those, believe me, it would be fun to see such a high end card on a shitty monitor
im fucking dead xd
@@DawidDoesTechStuff now you need a nasa sponsorship
Did not expect this to simply work with a generic adapter... I was under the impression NVIDIA GPUs since the 10 series didn't have analog output, requiring an active converter to output VGA.
I just used a DP to VGA cable for a LOOONG while on my 1080P monitor with my 3070. And it worked well
Can't imagine spending this much only to get coil whine
If it didn't use as much power as a particle accelerator, no doubt it wouldn't whine as much eh.....
lol it's pretty extreme as well, it's not supposed to be that variable and noticeable
coil whine is a combination of the PSU and GPU. Really depends on the luck of the mix on whether you get it or not. My 3080 coil whines badly with my EVGA 800W PSU but not with my Seasonic 1300W PSU.
I dunno, with some skill you could use it as a full-blown instrument. Sounds like a feature to me xD
@@tonymorris4335 Seasonic, never will ever buy a different brand.
DOOM is so well optimized, it always uses 100% of the hardware available. In CP 2077 you can see the GPU never hitting any higher than 80%... so basically more than 20FPS potentially there, but the game not using the power needed to render them.
In every resolution comparison I've seen with high end/enthusiast range GPUs it's always the same: low GPU usage, so not that much performance gain going from even 8k down to 1080p (MFS 2020: 10FPS difference from 4k to 1080p, for example).
cp
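That "GPU stuck below 100%" pattern is the classic CPU bottleneck, and the diminishing returns from dropping resolution fall out of a one-line model. The frame timings below are invented example numbers, not benchmarks of any real game:

```python
# Toy frame-time model: GPU work scales roughly with pixel count,
# while CPU work per frame does not. The frame rate is set by the
# slower of the two, so once the CPU side dominates, lowering the
# resolution stops helping - the GPU just sits partially idle.
def fps(cpu_ms, gpu_ms_at_4k, pixel_fraction):
    """Frames per second given per-frame CPU time and resolution-scaled GPU time."""
    gpu_ms = gpu_ms_at_4k * pixel_fraction  # GPU time ~ resolution
    return 1000.0 / max(cpu_ms, gpu_ms)

# 1080p has 1/4 the pixels of 4K, 540p has 1/16
for name, frac in (("4K", 1.0), ("1080p", 0.25), ("540p", 0.0625)):
    print(f"{name}: {fps(cpu_ms=8.0, gpu_ms_at_4k=12.0, pixel_fraction=frac):.0f} fps")
```

With these assumed timings the model plateaus: 1080p and 540p both land on the CPU's 8 ms floor, which is exactly the "8k to 1080p barely matters" behaviour described above.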
Love the dedication to saying "3090 Tie" every time, lol. What even is Nvidia's marketing anymore
oh yeah, we love being subjected to linguistic abuse. Was it wrong of me to pierce my earballs with a knife to stop the pain of listening to dawid say "TIE" over and over again?
saying "TIE" every time winds up Linus.😁
@@hamyncheese The nvidia presenter in their intro video for this card way back when they teased it said "tie" instead of T I
This seems to be the _official_ pronunciation.
@@stephen1r2 They seem to use both pronunciation, depending on who from the company is talking at any given moment.
@@Vengir I think so, that's what makes it funnier. A very high end marketing gaffe. The older ones _were_ messaged as T I, weren't they? Since Ti is titanium, not a Star Wars throwback memory.
13:00 Do you know that VGA monitors have an auto-adjust button ?
This PCB is the same one that's planned to be used for the 600W 4090. That's the reason for the blank spots.
Wtf. We gotta have a nuclear reactor to run that thing
@@alibozkuer5022 The founders edition will have a Molten salt reactor in the package, should be able to run the 4090 undervolted.
Oh cool! That makes sense.
@@DawidDoesTechStuff YOU'RE SPECIAL ED!!!!!
@@IR4TE 4090 ti AORUS Chernobyl master z360 edition
Best thing they did with the TI was get the memory on one side. The back side memory on my 3090 ran so freaking hot, 110c, I was forced to get a double waterblock, front and back. Works great, but really heavy.
4:29 Dawid giving Gigabyte what they asked for, the true Linus video experience
1:13 I spy a bent fin on that VRM cooler, tut tut Gigabyte...😂
I think the next logical step is using this setup with a CRT monitor.
You can decrease resolution for higher refresh rate
1024 x 768 baby! Gotta see Crysis as originally intended.
@Monochromatik Early generation CRTs had translucent faces so they actually can have backlight bleed.
exactly what I wanted to see, and I would like to see if those adapters could run at 1600x1200@85Hz, because that is what I still use on one of my gaming computers with an old high end CRT. best GPU with analog output is GTX980Ti, but I'm currently using a GTX970
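Whether a cheap adapter can drive 1600x1200@85Hz mostly comes down to pixel clock, and a quick back-of-envelope check suggests it's doubtful. Both numbers below are assumptions (roughly 35% blanking overhead is typical of CRT timings, and ~165 MHz is a common ceiling on budget HDMI-to-VGA dongles), so check your specific adapter's spec:

```python
# Back-of-envelope pixel clock for a CRT mode: active pixels times
# refresh rate, inflated by blanking overhead. The 1.35 factor and
# the 165 MHz dongle limit are typical assumptions, not product specs.
def pixel_clock_mhz(width, height, refresh_hz, blanking=1.35):
    """Approximate required pixel clock in MHz for a CRT-style mode."""
    return width * height * refresh_hz * blanking / 1e6

clk = pixel_clock_mhz(1600, 1200, 85)
print(f"1600x1200@85 needs ~{clk:.0f} MHz pixel clock")
```

That lands well above a ~165 MHz converter limit, which is why this mode tends to need a higher-end active DAC rather than a generic dongle.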
4:16 As an actual IRL oil baron, I seriously appreciate this comment.
Not sure if somebody else mentioned it, but if I’m not mistaken, the pcb is actually the same design that’s going to be used on the 40series, and this is basically testing the new power delivery connection
Hasn’t even been that long… come on give the gamers a break from the whole race with scalpers nvidia
heheha
@@Fxmbro "come on give the gamers a break from the whole race with scalpers nvidia" free market and capitalism, wouldn't trade it for anything ;)
Muting the music for the protective film peels, this guy gets it.
16:07 Who needs a guitar when you have a 3090TI.
I'm fairly certain that the extra spaces for the power delivery (and the additional space for a second 16 pin power connector on the top right corner) is for the same PCB to be compatible with the RTX 4090 in the future which keeps some costs down
Dawid, I would just like to say I've been subbed for a long time now and your growth has been amazing. You're my top 3 favorite tech tuber of all time. Your one liners are top tier caps and your content is top tier botulism antidote. Love you bro, stay awesome.
Nvdia designing their gpus to double as bagpipes is a feature I never knew I wanted.😁
14:30 for reference🎶
Total budget of the video: Like 10-20 $ for the Hdmi to Vga converter.
and $100 for power to power the 3090 tie and 12900k that i want both of so bad
@@shuttleman27c Same, my pc do be kinda old
Great job screwing in the VGA connector at 0:26. "It's proper procedure" -Anthony
when they are sending you their best products you know you've made it.
14:30 Haha The GPU was actually singing Doomly... 🤣 😂....
Dawid giving us some more useful consumer advice, now that $2000 price tag on the RTX 3090 Ti makes a lot of sense
I love how you built this ultra expensive system on top of the boxes. Classic Dawid test bench setup. Awesome.
Love this concept, Dawid! Might I suggest taking it further and running the least demanding games you can think of on it? Let's see if you can get that framerate counter up in the thousands in gameplay. :)
Would the frame rate budge from 999 in Half-Life 2 1080p High Settings?
F.E.A.R. Let Alma loose!
Windows Solitaire 240P 2560fps
i bet minesweeper would get insane fps on that card
Clustertruck at 480p. The coil whine will make glass evaporate.
Man, these golden videos just keep rolling! Keep it up man 👍
I wish there was a "Dawid Option" for all sorts of entertainment and information. I enjoy all of your videos, and am glad that it looks like your channel is gaining momentum. Thanks for all the chuckles.
Here's the chain I want to see.
HDMI to VGA , VGA to RCA/AV, RCA to RF modulator/75 ohm coax, 75 ohm coax to 300 ohm transformer, 300 ohm connector to black and white tv.
You should hook up 4 of the Mesozoic era monitors to it and see how it does with them. Might need a new monitor arm as well
that would be so epic for multi monitor setup then 🤣😂🙃😅
@@raven4k998 it really would be, I'd love to see him do it. He needs to use his porn music as he builds the setup
that's a good test, how many Mesozoic period monitors can a 3090 "tie" run?
@@Angmar3 ah yes, the 3090 "tie"... Imma go with 4 lol
@@Angmar3 imagine it running 10 at the same time the problem is the amount of adapters and shit you need to do that sort of setup though
I can see why they made the shell for the pcb plastic, I mean I’m pretty sure it’s already heavy with the block on it so the plastic would help with sagging
Hey Dawid, I love the video! When companies send over stupidly expensive products like this, do you have to send them back, or do you get to keep them?
I think he gets to keep them, as the Corsair power supply used in the video was apparently sent to him a while ago.
Stupidly expensive stuff gets sent back, that's normal practice
It gets sent back generally, when they can keep it they say it
@@insert_username_here Corsair's probably gonna wait a while before sending more things over XD
I have no idea what the video was about but you have an entertaining voice. Could listen to you for hours.
had similar experience with usb game drive, in many games the fps are very stuttery especially at the beginning of the game, it won’t go away even with usb 3 gen 2 X 2 speed, i’m using nvme ssd with usb external enclosure.
after switching to sata ssd as game drive the whole thing went away
🤔
"So what instrument do you play"
Dawid-"Oh I play the Graphics card"
xD
"That power button is worth $200 at least!"
My GOD Dawid!!! DONT TELL THEM THAT! 🤣
he doesn't need to
that's already what they're charging for them
@@Shpoovy Exactly what I was gonna type. 😂
@@DawidDoesTechStuff
ua-cam.com/video/BNvoM0T8tXs/v-deo.html
man... and to think my old ddr2 ECS board has both power and reset buttons lol.
Holy crap call the fire department near you!! Some disgruntled employee at Gigabyte didn't put all the VRMs on your board and wants to make another combustible PC component. Maybe there are supposed to be VRMs in those empty spots, but Gigabyte's CEO read a Stephen Edwin King book and wants to get his GPUs into the upcoming "Firestarter" movie advertising before it releases on May 13. O_o I hope they do another MONSTER GPU that uses all those VRMs and they could call it the XXXtreme 3090Ti++ HAHAHAHA You did an incredible critique of that beast GPU and I wish more youtubers would bring some fun into their reviews like you do.
I would love to see a 3090 TIE being put onto a 1st gen PCI-E slot or a really low lane m.2 slot with many dongles for a 16 lane adaptor.
Just slap on some rgb to this setup and it will make the gaming 42069% better. Trust me bro
I love this channel for the creativity and not just mainstream reviewing stuff.
You go out of your way for entertainment and doing things just for the heck of it rather than a straight review.
Love your videos man
since am4's end is near, can you make a tier list of lowlow, low, normal, good, medium, medium well, high and "so high you need to sell your body" motherboards, or any device am4 compatible? i.e. gpus, apus, or just which one you think is worth the money to insides ratio?
Good idea. I feel less guilty hacking the shit out of cheap last-gen gear.
@@SchoolforHackers Why would you feel guilty about it? It is your hardware now, do as you please with it.
holy shit your channel grew fast. big grats
DUDE dat coil whine OMG 🤣
This is probably the most DANK tech channel on UA-cam.
You should have gone with a DP to VGA converter instead, since that gives you the most "native" VGA signal
Great vid. Thanx Dawid. you are my fav uploader!
8:29, am I seeing that correctly that he is only using one power connector for the CPU, while it's the most power hungry CPU to ever exist? :O
I mean it is an eight pin cpu connector
@@jordangames2560 but there are two of them and I think plugging them both in isn't a bad idea with a 12900k
a 360 on a gpu, I love it. Seems a lot easier to have a 360 gpu on the bottom, 360 cpu on top with some out take fans than making a whole custom loop
Dude, STOP using an external drive in your benchmarks! That's where half the stutters were coming from. You should know better than this! Just plug in a SATA SSD with the game files already on it. You can have Steam scan the folder and it'll recognize the game installs. Don't do this again! Some games like Cyberpunk would have run much faster with an SSD installed on the motherboard.
this should be common knowledge.
My external drive is fast enough to game on, his probably is too, I’ve had mine for 3/4 years…
8:02 GIGABYTE be like we will never make any psus again !
Fact: NVidia actually call it a tie, not TI like we all do
It's T I 💦💦
Only in the recent years. Throughout the older gens pre 30 series, it's been TI.
Is it really a tie with the 6900XT? 🤔
Yes, deriving from the word Titanium! Sounds stupid imo
Thai
Would go ballistic for a GPU like this.
Awesome vids, Dawid, thanks for testing all this stuff and making us laugh.
14:30 my msi laptop (7300HQ, 960M) does that too lol, whenever i press a mouse button or do smth in a game it makes a strange noise like the one in the video
14:28 that's the coil whine, which is the most important piece of technology used to increase wattage (turbo speed) for the cpu and other parts, it's integrated on the motherboard
the gpu has its own coil whine integrated on the chipset, that's why you need to connect a power cable separately to the gpu rather than connecting it through the motherboard
it may sound annoying sometimes when there is a lot of load on the gpu / cpu
13:29 That bit cracked me up! you crazy man lol
"Burn down a puppy orphanage" LOLOLOLOLOLOL Dawid, you clever bastard
9:12 Rounded corners.
I think his Dell monitor might be newer than my (Dell 1909W) monitor.
One thing I noticed: in some games I had to set the max frame rate in the Nvidia panel's global settings to my monitor's refresh rate, which fixed the stutters. The GPU was pushing more frames than the display could show, which must have been the cause
Most entertaining tech CC out there. Always caught off guard when a joke is dropped in.
2010: This is how you get 10-15 fps boost
2022: This is how you get a couple hundred fps boost
A good quote from Jurassic Park that applies to this
-Is it heavy?
-Yes.
-Then it’s expensive, put it down.
Great video, tonnes of Dawid-brand humour. That 3090 Ti was sexy!
6:22 I believe it is 20 phases, plus 2 phases for the GDDR6X. It has 6 blank spots for up to 26 phases! Good lord. It might be the same PCB design as next-gen Lovelace? The 4090 supposedly draws 500-600W!
Could the GPU's pump be making that sound from sitting at the highest point of the loop in the AIO? Jump to 8:50 to see what I am talking about, 14:32 for the sound.
Dawid said "sponsors" and my mind went instantly to the word Linode. That word is a massive part of your branding, or at least to me. ;)
Great video Dawid. Very entertaining.
Oh, oh, TWO sponsors in a single video, Dawid has hit the big time, next video Dawid goes house shopping.. No, wait, he's in Canada, forget it.
This content was a great Dawid justification of just saying: "ye I'd like to have a 3090 laying around".
Dawid, you have those horrible scammers pretending to be you. I did NOT win a PC from you, but what you do win is my subscription. Love you bud, keep it up
Isn't the 'weird flickering' you're referring to screen tearing? That's what it looks like in the video. Games tend to do that at high framerates on a 60hz display, so it'd make sense. G-sync or free-sync won't work over VGA, and turning on V-sync would limit your frame rates, so I'd wager that's the issue.
Looks like it
surprised you haven't asked for a fifth of tea yet...! LOL love you brother! 🤣🤣
This is the kind of Dawid Does Savagery that I like.
“Look at all those rays being traced!” Lmao Jesus Christ. The one-liners…
Nvidia: Uhm I don't know if this dude is Linus
@4:30 " Oh no I almost dropped it"
Nvidia: never mind, he's our guy.
14:30
Please, I need an explanation for this.
I made the mistake of starting "Every time he says 'Mesozoic Era' I take a drink." and now I'm dead.
You know you could run 6 DisplayPort monitors off of this card, as DisplayPort can daisy-chain from one monitor to another. Just a thought :)
If I recall, the board design for the 3090 Ti can also be used on Nvidia's next-generation GPUs. That would explain the extra power-delivery spaces, given the expected power draw of Nvidia's next-generation GPUs.
I'm relatively new to the PC building space, and from all I've learnt in these 6 months or so, a graphics card attached to an AIO from the factory (this is the first one I've ever seen) has to be wild. Let's hope my expectations aren't dashed by the end of this video
Great video Dawid!
I was going to make a joke about how Gigabyte needed to shell out the advertisement dosh after the PSU scandal, but now I'm more focused on Dawid's joke about the watercooled GPU's shroud. "Oh, it's made out of plastic? I was expecting mythril..." 🤣
I think it would be great if you could release some info on the thickness of the thermal pads, and even do a thermal pad replacement video. You could measure the VRAM temperature difference between high-end thermal pads and the stock ones, and apply some quality thermal paste to the GPU while you're at it. As seen in the video, it doesn't seem too tricky to remove the shrouds and install new pads and paste.
When I was a kid one of my brothers had a TI SR56 calculator. It also makes different noises when pressing keys and performing calculations.
@Dawid Does Tech Stuff The 3090 Ti PCB will be used for the RTX 4000 series. That is why you have blank spaces.
lol is that rock background music from Rebel Galaxy?
I would be very upset with that zip-tied fan wire if I spent that much on a card. Why is it hanging out instead of being run under the nylon sleeve...?
Gigabyte after hearing Dawid refer to their product as a chode. "Yeah we definitely sent this to the wrong inbox."
I have the same PSU but the 850W version. I got it for the Asus TUF RTX 3080. I'm a bit worried because, after I bought it, I read some bad reviews about it on Amazon. Love your video :)
14:32 the GPU is trying to act as a sound card in DOOM Eternal, playing The Only Thing They Fear Is You, but barely.
I don't know exactly what it is, but there's a distinctive crap-factor to all your videos (with fairly high production value, tbh) that I absolutely adore. Oh, and your way of describing certain circumstances with absolutely hilarious made-up diseases and other states of being: lyrical masterpieces :D
Try overclocking the display with the CRU app; certain screens support that. It can be nice to see the difference between an overclocked screen and a real high-quality screen