Watch our Intel A750 GPU review: ua-cam.com/video/de51aJ33dMk/v-deo.html Grab a GN Tear-Down Toolkit! store.gamersnexus.net/products/gamersnexus-tear-down-toolkit or a GN Large Modmat! store.gamersnexus.net/products/modmat-volt-large (both in stock & shipping at time of posting) Watch our Intel Arc A770 review & benchmarks: ua-cam.com/video/nEvdrbxTtVo/v-deo.html
@@XtreeM_FaiL some one do the meme tictoc/yt short.. get an old busted videocard but take a Drill to it and screw like 30 Deck screws into it.. watch the intel sales pitch video of SCREWLESS design!! flash over to you holding your videocard thats literally got Deck screws all through it even take an old ide cable and screw screws all through the cable .. then look dissapointed at your Screw design videocard.. and fade out to you dreaming of a Screwless intel videocard.. then show the classic windows bluescreen as an ending lol
We should probably add another 1 or 2 on that number. 1 for Steve for going thru the whole process of tearing it down. And another for the poor person that has to put it back together. (probably Steve again).
I work in engineering for a company that does small electronics manufacturing and our assemblers complain to engineering when we put 6 screws on a product. If we put 20 screws on a product, our whole manufacturing department would quit. If we put 55 on one product, they would probably find out where the engineers live and burn our houses down.
And if there's only tape and 0 screws, the Right to Repair enthusaists may hunt down the homes of those assemblers wanting to remove screws, seal tape the doors and windows, and gas them all in CS for 24 hours. emphasis on 24 hours because riot gas is not lethal unless you make them to be.
They used glue and tape which means putting it back together is going to be so much fun. Any type of rma feels like it's better off just replacing the card
Honestly looks like a laptop engineering and manufacturing team were outsourced by Intel for this. Very similar elements in many machines I've disassembled
Considering that this is Intel first serious try I propably wouldn't say outsourced more as they had to create the team from scrath. Remeber they mostly are making CPU's so their Graphics card looking similar is to be expected. Personally I m just happy that we finally have An RGB (AMD, Nvidia and Intel)
@@tomaszzalewski4541 None of this seems very in-house. The design looks classic business-casual Intel and in line with their new stock coolers, but the finish, materials, assembly and build quality feel very out of place for intel. I think the in-house design went as far as a 3D cad render. My guess is that they banked on board partners to take over. The founder edition is called 'limited edition'. They had to get these cards out or it'd be another chicken and egg problem. Nobody really bit other than Asrock.
@@xynonners my bad, I must have misunderstood it when someone said the FE is only made by Intel. Acer made one, which is barely available and terrible value. The only thing it does well is catch eyes. It's more a billboard that acer makes gpus now rather than a serious product
I can see fan replacement procedures for these to consist of: 1: attack shroud with hacksaw. 2: strap noctua fans on with tape. 3: connect noctua fans to chassis header.
Craziest part is, that FE is one of the smaller 4090's, der8auer put up a comparison with the Aorus Master 4090 and it makes the FE look small somehow.
Watching you take that apart was stressful. I can only imagine what it's like assembling or servicing it. RMA/maintenance teams are going to love these cards.
@@SlenderSmurf Some notebooks have the same hell but most of them offer quite easy fan replace without the need of full disassembly. This GPU looks neat but repairability is near zero.
Coming back to this a year later makes you realize: They probably never repaired defective units anyway, because they're probably happy for every card they get out of a warehouse - even if it's just a replacement for a defective one.
when you took off that cheap backplate, i was like "why did they put that over?" it looks actually pretty nice with it off. watching this tear down gives me dell computer vibes with over use of plastic and weird engineer.
How are you guys so good with using the "Thanks Steve" footage without it ever getting old? I still laugh every time it comes up, you guys are masterful at finding the right moments to invoke it.
Love these tear down videos! These are not manufactured to be disassemble frequently, but listening to Steve whine and complain makes it all worthwhile!
God that's even more of a cursed thought when you consider that cars now have subscriptions for built in features like heated seats and stereo . . . We're fucking doomed and the cause is going to be wasted silicon
Nah, Ford still has them beat. I have to remove my bumper and the headlight housing itself just to change a light bulb. GPUs don't even have what should be considered "consumable" components, particularly when they go to the trouble of sourcing Cooler Master fans for their card. Short of Noctua, they're about as good as you can get in terms of sheer reliability. Really, the only thing that you might want to change before 5 years passes, since the fans aren't terribly likely to die before then, is the thermal paste, which is a less involved process now that we know you don't need to remove the screws holding down the little LED control PCB. use a plastic spudger to carefully lift the back panel, unscrew the handful of screws holding the back plate to the midplate and the GPU die to the cooler and pop it apart. Use a pair of pliers or forceps to unseat the connectors and you're good to go with replacing the thermal paste and pads.
I repasted my old MSI GTX 1080 Gaming X (non-Ti) recently. Four screws and two plugs (logo LEDs and fans) to take the entire giant cooler off. Meanwhile I actually had to look up a fcking tutorial for my RX 6800...
As someone who frequently works on laptops, I feel for you. Also, a tip on taped stuff. Put it on wax paper. It prevents dust buildup and stops it from sticking to anything else.
@cas curse Do you even do professional computer repair? That is so unbelievably impractical. People don't want to wait a month and be charged an extra $20 because a technician can't be bothered to properly reuse parts. It's also a huge waste. Yes, keep some double sided tape for strengthening things, but don't order replacements for everything when it isn't actually necessary.
@cas curseActually now that i read it again, your comment might be about fully replacing just adhesive itself with an "official" kit or whatever, if so this is mostly reasonable if you can keep these kits in stock on site. You are still being cringe about it though and keeping a stock of official adhesive replacement kits might be impossible for new models of "certain brands", because they don't allow you to keep anything in stock lol. But you can mostly disregard everything below if that's the case and you aren't actually replacing entire part if it is at all glued. Still, relax a little. My brother in christ, you aren't sending rockets to the space, you are repairing low grade consumer electronics, stop being uppity about it. If you are replacing any part that needs glue or tape, you either are an idiot or lying about what you do. What if the back of a perfectly fine phone screen is glued, gonna order a replcement too? Just take a blade and some alcohol, scrape the glue residue off and replace the adhesive with whatever's appropriate
If only there were some company with experience in graphics card manufacturing that suddenly had excess capacity in terms of their entire graphics card manufacturing not doing anything right now, then Intel could talk to them about manufacturing graphics cards.
@@monzarace Wow! Fuck having any subtlety in humour! Guys! lmao haha Is the company EVGA????????!?!?!?! xdxdxdxd No FUCKING SHIT IT'S THAT SAME COMPANY.
Let's see how much they improve in their next gen cards. Both in performance and build quality. I'm really hoping to see them competing with the mid range ATI and nvidia cards with solid quality, then we'll just have to worry about drivers.
Ehhh Don’t hold your breath There was a A780 that was supposed to come out that met the 3070 in performance It didn’t work Next gen Intel MIGHT come in 2024, and by that point, it’s gonna be competing against RDNA5 and RTX 6000 They would do good if they reach 4060 performance That’s if Intel doesn’t can Arc completely on desktop
You do realize that NUC is a separate company from Intel and part of the Intel Partner Alliance but they did make quite a few Ryzen based NUCs too! Some where under the Ruby name I think!
@@Justchuck69 No, NUCs are specifficaly designed by intel and name NUC itself is trademarked by them, AMD alternatives exist but they are not and can't be called NUC. NUC alternatives for AMD are called mini pcs
Nobody likes crazy, elaborate designs more than smartphone engineers. They will use half a dozen flat-flex ribbon cables to shave half a millimeter off of the packaging size of the components compared to just building a larger PCB. They also love auxiliary PCBs more than any other industry, because it lets them move things around to wherever they have space left for it. And that's before we get to the last couple of generations of phones where the modern "sandwich PCB" has come into common use, where they basically use soldered connectors to bind two layers of PCB together with basically no gap between them so they can squeeze as much components into the most compact space possible. We should just be glad that Intel didn't hire smartphone PCB engineers for their GPUs. We should also probably be grateful that laptops haven't started to use those designs, because then they would be even less repairable.
Love your work and I value your opinions in my purchases. I design PCB's for automotive use for a living. I can add a little insight to one part of this. That LED ring PCB isn't actually the worst idea from a design standpoint. It's extremally simple and you'd be surprised how cheap it would be to manufacture especially in bulk. I also bet it was designed in a few hours. Not allot of time or money in that specific part. The way it's attached on the other hand, I have to agree with you there. Small plastic clips on the housing, screws really anything but tape or glue.
I design pcbs as well. They probably had the rings and edge boards done on the same panel. Given how thin they are I wouldn't expect the board to be much more than a dollar, if that. Low current as well and that drops the copper thickness down as well.
Yeah the fan circles 24:18 and led shroud pcb 24:00 fit neatly inside one another and the space inside each circle is enough for the ARC led pcb 13:11 and that other LED pcb 11:54 are all probably made in one piece so it isn't as wasteful as people would expect. The use of threaded inserts where "wood" screws into hollow plastic posts would've sufficed means this was designed by someone used to low volume production though.
Heyoh, i build systems for automotive. And i really liked these LED-PCB's and they are really cheap if you consider all functions they do. What i don't like at all are those extra power PCB. You have a connector from a PCB, to a PCB to get to a PCB in on single unit ... just why. It's like 3-5 bucks and a punch of possible failures for nothing. Also casing any PCB in plastic was never a good idea ever ...but we tend to do it all the time ...
@@MrHaggyy experience has a lot to do with it. If the engineer is unfamiliar with things, they'll be making accessory boards for everything. Also, mechanical constraints given by other departments. They'll decide a long wire is better than running a trace to a connector that can sandwich to another pcb removing the need for wires, and other similar ideas.
This video makes me feel better because even a company like Intel produces products like this when rushed. Cost and efficiency was bypassed for the launch date and I agree with that decision. They probably didn't account for the pressure AMD put on.
I don’t think they’d want to make video cards that have broken software that’ll result in tons of returns and rmas. But they’d be the perfect partner to produce red lined variants.
@@joemarais7683 Not the first time EVGA dealt with weird ass/half working products. My friend loved EVGA Classified shit that I was there trying to fix them as well
Cars today are glued at many poinst. It only depends on the type/quality. Of course there is a high chance that even Intel used special, durable double sided tape, you as a user can't buy something so special so once you disassembled this card it is better to skip the backplate.
@@EM83D Yeah, that's why a bunch of stuff in the interiors of my car is falling apart. And that's not something that is put into an environment with components that run very hot inside.
Hearing the tape separate as the thin metal backplate bends and the plastic clatter of the led strips reminded me of a Dawid Does Tech Stuff quote: "It's like taking a blade to a kitten's face."
@@chrisdpratt I don't think that's likely. I highly doubt that Steve (or any of the press) was the first to tell people at Intel about the issues re:manufacturing and thermals. They've probably known for a very long time.
@@chrisdpratt They finalized the design of the cooler shroud? That makes no sense... they don't even have the full data on final power consumption at this point. You're full of shit.
@@chrisdpratt They might have finalized the design of the chip itself and related stuff (like memory and power delivery) but the actual design of the boards, the PCBs, etc. should have enough time to finalize.
Seeing just how complicated the teardown of this one was really makes me appreciate the way more servicable Design of the Gigabyte Gaming OC RX 6600 XT and PNY XLR8 3060ti on wich I did a deepclean and repaste a few weeks ago. On both you basically only had to remove the screws you could see on the Backplate to seperate the Cooler from the card and then like four to six additional screws to separate the Finstack from the Fanshroud. There really is no need to overcomplicate it like Intel did here with their Reference design.
And they pass that cost onto us! The folks who they would really like to find enough cost competitive advantage with ARC to buy it instead of Nvidia or AMD, both of whom are launching much better cards soon in this price range. Botched. Kinda. Maybe Intel doesn't super care about making money on this generation.
@@_--_--_ AMD fanboi detected. The Intel Arc is absolutely a strong competitor to AMD equivalents in every way... only exception is driver maturity which will improve quickly. ATI card drivers were absolute shit for quite a while when they first entered the market.
@@mikerzisu9508 What is the relevance of complaining about ATI's drivers from the AIW Radeon days? And the fanboy detected comment - seriously. Just admit the 6600xt is an excellent value compared to all other AMD, Nvidia and Intel current offerings. No need for time machines to 2008.
I’ve seen some truly baffling tear-downs on this channel. But this one… we’ll I wouldn’t say it takes the cake, but it’s certainly memorable. When I saw those custom PCBs for the LEDs… my lawd… Screwless indeed
The threaded inserts are one of the better ways to screw things into plastic. They can handle a surprisingly large load especially the ones for injection molding.
I saw back in the day when u improvised to water cool a titan V, great stuff i have 6 of them mining strong.. But now i wonder if you can point me in the same dirrection of what i can buy and modify slightly to cool a HighPoint SSD7540 8x M.2 NVMe SSD Card. Thanks in advance for all the reviews you guys do.
27:12 I wonder if the card physical design complications were done for a $700 card since they were aiming for the performance and mining-era prices for 3070 cards; now that prices have dropped and the A7 performance is being positioned against the 3060 the amount of extra in the design isn't appropriate for the price it's being sold at today.
@@infernaldaedra they're produced in such volume that the practical benefits of a torx type drive (longer tool life and tool can spin before it is at full depth) make it necessary, or at least worth the trouble to swap tools from whatever was already in the machines, due to economies of scale
Hey man, I just wanted to say I appreciate your videos. You have a great way of putting things out for the simple layman, such as my self. I salute you sir! Thank you for your efforts!
Intel could surely benefit from a board partner for at least or one or two generations until their own design team really understands Video Card Efficiency, A to Z. Maybe they could even actually HIRE AWAY a few Video Card Engineers from EVGA at this point or at least contract a few of them for a year or two. THANK YOU STEVE. 😉
How does anything in this video have any bearing on whether or not this piece so equipment functions well as a graphics card. You aren’t taking it on hikes. .01 percent of people are taking apart their graphics cards at any point in their life. That is way beyond an enthusiast thing to do. The conclusion at the end is so strained. The thing that actually matters is the price to performance ratio. And pretending like “too many 5 cent leds” hurts that is laughable.
@@hastyscorpion Because sometimes you have to get in to clean, to redo thermal paste, to replace borked and failing fans all to avoid shelling out better than half a weeks takehome pay just to keep playing the games that keep me sane.
@@Bill_Falsename You're in the .01 percent Hastyscorpion mentioned because normal users aren't going to "get in to clean, to redo thermal paste, to ...".
@@chankwanting I consider myself a normal user but I had to fix an issue with my old Asus 980 Strix where the fan had to be slightly adjusted because it was making contact with the fins. It was very easy to take apart and access the fans and remove those, and since it was open I went ahead and redid the paste. It doesn't take a .01 user to want to fix an issue that is as simple as a bent fan fin. I would also assume a normal user wouldn't want to replace an entire GPU just because of an issue as simple as that.
If I remember correctly, you took apart one of their NUCs a while back and were really impressed with how good and intelligent the build was. I guess they didn't carry that process over to their GPUs. You'd think that a company trying to break into a well-established market dominated by another player (or two) would pay attention to best practices and things Steve has said in the past (Thanks, Steve!). Also, is there any reason to re-install that thin, aluminum plate you took off first? The card actually looks better without it and the open air ought to drop the temperature a bit.
To be fair, these choices don't affect most consumers, who will never even repaste their card, and they had to cut costs when they realized they were competing with the 3060 not the 3070 I'm sure.
@@eclipsegst9419 you're right. As a budget gamer looking to build my first PC soon with an Rx 6600, this Intel GPU may very well be a viable upgrade path for me, we'll have to see down the line
@@Junya01 it's not much faster than the 6600, I would get one or the other. I myself am trying to debate between 6600xt and a770. I'm curious to see what kind of performance gains the first few driver updates bring. But, 6700xt is getting cheaper by the day too. I would love to help keep a 3rd competitor in the market though.
for all we know the teams that worked on their NUCs are different and might not have anything to do with the ARC GPU teams. Its like that most of the time on these big companies.
Thank you for suffering through this for us steve. When I need to tear it down, your videos are so valuable, and will save me completely destroying it. It won't go back exactly the same though, which really dissapointed me.
I was thinking the same thing at first. But, really, I can't think of any video card or computer component I've ever needed it for. Now, phones, tablets, and other small electronics you'll absolutely need them. Especially with waterproof/ water resistant designs. But it's kinda a different ballgame, ya know? Computer hardware has (at least traditionally) been much more serviceable, and not used the same design philosophy as phones
Thank you for the video papa! Also something worth mentioning... The RMA team is going to have ptsd dreams about tape and jst headers. Repair gate keeping via poor design accessibility is stupid on so many levels.
@@caramelldansen2204 Workers: Hey this design is pretty bad can you change it before production? Shareholders: Oh my! It needs more RGBs! Add this now!
You're assuming they'll attempt repairs. This looks like the sort of thing where any replacements go into the bin. Internal repairability will probably be a concern only when designing the next gen if there's more failures than anticiptated.
That nudge "Thank you pappa" - shows how well they understand their market. Intel's idea is that fathers buy GPUs for their Sons. When the truth is those sons and daughters buy GPUs for themselves.
@@razorblade7108 lol, ouch. I mean at least they are being realistic with themselves and pricing it accordingly and being honest about its performance.
i really wonder who developed this. if they would have used only 50% more brain the cards would cost easily 30-50$ less. how complicated is this!? especially the manufacturing process for a card must be insane there are so many steps that only have to be done by hand. completely insane tbh!
@@youtubeisgarbage900 not defending anyone. I couldn't give a shit whether Intel succeed or not. Simply saying that these experts in the comments should be working for these big companies instead of wasting their time here. They really sound like they know exactly what to do.
Nothing like being a new player in the game, and you not only do not innovate, you actually push things in the opposite direction for design. Well done Intel, showing us all how it's done.
I like the white table, especially with the dark card. Any plans for a "light mode" modmat, maybe not for full scale production but at least as a UA-cam piece
In case anyone was curious, the fabric tape they're using is high-temp wiring loom tape. It's fantastic stuff because it bonds well to paint, plastic, nylon, and even smooth-finish silicone as it is meant to wrap wiring looms in automotive production. It is also very thin and doesn't readily lose its adhesion like Kapton tape sometimes does over time. It makes sense that they'd use a product seemingly purpose-built for wire management to manage their wires. Honestly, I'd rather have all of the little auxiliary PCBs for LED management if it means they can use fewer, easier to manage cables instead of really going out of their way to create a rat's nest of cables inside the housing. As long as the plugs that go into the connectors are obvious, labeled, or don't need to be removed to service the card then I'm largely fine with their implementation, as it makes the card relatively tidy for the amount of LEDs they're using. I'd much rather them have an absolute stunner of a card that is slightly less convenient to disassemble than make something that looks boring or trashy as their "Limited Edition" flagship product. But it's still a game of qualifiers. If the card didn't look so good and wasn't priced so competitive to its performance level, then I'd be less forgiving of these minor annoyances. Also, a pair of forceps is a fantastic tool for pulling connectors from their sockets, particularly if you can get right-angle ones. All in all, an annoying GPU to work on, but at least they're trying to use quality components so you hopefully won't need to. At the very least, if your card doesn't arrive DoA then it should last until the next model gets released, assuming Intel doesn't follow through on axing the whole project.
I hear there's some amazing board and card designers from EVGA that are looking for a new gig. Maybe Intel could pick up all of them to make something meaningful come out of all of this. I look forward to what Intel eventually offers, but for going into my server for encoding/decoding, not so much gaming, though I won't say it couldn't be a good secondary card to run my additional monitors at some point.
The individuals working on graphics cards would be held to the same noncompete agreement as EVGA as a whole, so for at least a year or two they can't work at a competitor of Nvidia. It's not that Evga wouldn't make cards for another company but they literally can't for a period after ending Nvidia contract, and any employees with access to Nvidia Intellectual property would be required to be bound as well
a lot of double-sided tape like that, starts becoming unadered at higher temperatures commonly seen inside pcs. I'm curious as to how it's going to hold up over time
Not if done properly. Cars have lots of adhesives. Entire body panels held on with them. My comma 3 is held up with 3m vhb and records temps up 70c as its in the windshield. Other users that live in hot climates have reported higher.
@@PinkFZeppelin Well floyd and zep tapes hold up better than everything produced since, and the manufacturing process probably entailed everything BUT glue.
This reminds me of working on my car and having to deal with both SAE and metric fasteners. The only difference is me saying "Thanks, GM!" instead of "Thanks, Intel!"
"It's the totality of the contrivancy that drives those complaints" Never change mate. In German we say "ich bin mit der Gesamtsituation unzufrieden", and your wording captures that sentiment perfectly.
@@Nost2682 to be fair, it's not a bad first time GPU. Realistically it is better if Intel never fully catches up to the likes of AMD or Nvidia GPUs because they'll always have to sell for a little cheaper. That would be a win for mid range and lower gamers which let's be honest. That's most of us. But if they do catch up, maybe that may turn up the heat on the other vendors to innovate more or reduce prices. I doubt significant price reduction will happen thou. Not if they use giant GPU dies on the latest and greatest nodes. They have to figure out a way to do more with less. And AMD/Nvidia are already doing just that with DLSS/FSR, chiplets, and it's still not enough to reduce prices because they still use huge dies.
I get it, it's called the Limited Edition. That's why it's taped together with double-sided tape. Sales will probably also live up to that name. The non-limited edition might be held together with powerful magnets! Looking forward to it!
@@Fulano5321 Time will tell, I guess. The first run will be shaky, but maybe in a third or fourth generation, Intel will step it up. Either way, it makes for entertaining tear-down videos by GN! 😄
I wish more manufactures had function first designs. My favorite is my speed queen washer and dryer, lol. Now I just need Ford or Chevy to make a basic truck without any creature comforts, I don't even want a radio in it or comfy seats, just a basic functional work truck.
Then you'll be more interested in their general consumer models and not their "Limited Edition" variant with its fancy aesthetics. The other variants have less RGB.
Had this card for 4 months now, and It performs VERY well. The only thing is that they've focused so much on DX12 that older games (some I must say as I've played DX11 & DX10 (and even games like GTA:SA) games that work perfectly good) might not perform very well, but then again I don't play much of these. The memory size is very nice to have, games do utilize it even if it doesn't max out on cards with less memory on the same settings, which speeds up rendering alot and gives more FPS. Never had thermal issues, and never planned on taking it apart.
For pulling cable connections like you were @11:30 I find oldschool IC lifters are really useful. You can grab both shoulders of a connector and pull it out straight out without putting stress on the wires.
I use forceps. Great to have around for fishing wires through cable management routing or for pulling connectors. Also for retrieving stainless steel screws when they fall inside the case.
Amazing screw less design Intel! Absolutely no screws to have to worry about or keep track of and especially nice to not have to deal with all different types of screws as well! Amazing!
The little mention of "if the drivers keep getting made" really hurt me as a user of the legendary value card R9 390. I get regular GPU driver crashes now
When I saw this thing with just the backplate removed, I had a feeling the complete teardown was going to be a nightmare. Thanks, Steve! * in my best Intel keynote voice *
It's almost as if they never designed these thinking "there may be a few cards with RMAs that need to be repaired, so let's design a card that can be taken apart easily for our RMA team!" Also, backplates that don't actually add any cooling or functionality is a pet peeve of mine. Complete waste in manufacturing cost and design.
@@razorblade7108 Right, it's even worse than worthless. Thank you Papa Intel! I feel like GPU manufacturers have been like "let's throw a backplate on it for no reason" for several years now. There are even been GPUs with pointless plastic backplates. I have more respect for a minimally designed GPU made to reduce costs, than a card with a ton of bloat with no thought put into the big picture. There are other ways to add value other than making a product look flashy. EVGA gained its reputation thanks to its stellar warranties. Leaning towards right to repair by providing board schematics is another way to add perceived value. I feel like that's the only thing Intel can put on the table at this point - other than better GPU drivers.
Gotta say all the GN products are super classy, I only have the coasters but it’s pretty clear the pride you guys take on providing useful features. Good stuff loving all the new product reviews
It might as well have a "no user serviceable parts inside" tag on it. This is something you would find on a flea market tail gate sale for $50 quality-wise.
It is nice they tried to keep the cost down anyhow. Graphics cards don't always need to be expensive. They DO however need to last some expected time frame!!
I wish all component manufacturers would stop wasting their time and resources on stupid LED's. For those who want LED's, there are plenty of separate LED's and controllers. Can't they just buy those and decorate their PC's with the LED's themselves? Why must the LED's be integrated into the components?
agreed. at least put a physical rgb switch like what msi did on their mobo's, so people can turn off rgb without silly softwares eating 10% cpu usage. this way you could still sell the card to rgb fanboys. but still, no rbg at all in a flagship card would be nice.
Watch our Intel A750 GPU review: ua-cam.com/video/de51aJ33dMk/v-deo.html
Grab a GN Tear-Down Toolkit! store.gamersnexus.net/products/gamersnexus-tear-down-toolkit or a GN Large Modmat! store.gamersnexus.net/products/modmat-volt-large (both in stock & shipping at time of posting)
Watch our Intel Arc A770 review & benchmarks: ua-cam.com/video/nEvdrbxTtVo/v-deo.html
Cringe Af
@@Lamiishere the only cringe thing is your non-existant father
Intel is a Beast, the performance of this card is Ace a1😵🪴
From the guys that brought you: AMD just glued together their chiplets comes: a truly glued together card
Hey Steve, does your team have any plans to release a shorty set of the GN screwdriver kit
A screwless design with only 55-56 screws. Impressive.
It is also wireless.
@@XtreeM_FaiL some one do the meme tictoc/yt short.. get an old busted videocard but take a Drill to it and screw like 30 Deck screws into it.. watch the intel sales pitch video of SCREWLESS design!! flash over to you holding your videocard thats literally got Deck screws all through it even take an old ide cable and screw screws all through the cable .. then look dissapointed at your Screw design videocard.. and fade out to you dreaming of a Screwless intel videocard.. then show the classic windows bluescreen as an ending lol
Anything under 64 screws is practically no screws at all. This is a well known industry standard, yes?
We should probably add another 1 or 2 on that number. 1 for Steve for going thru the whole process of tearing it down. And another for the poor person that has to put it back together. (probably Steve again).
and multiple different kinds so you now must add factory processes to slow production to make sure the right ones go in the right place...
I work in engineering for a company that does small electronics manufacturing and our assemblers complain to engineering when we put 6 screws on a product. If we put 20 screws on a product, our whole manufacturing department would quit. If we put 55 on one product, they would probably find out where the engineers live and burn our houses down.
And if there's only tape and 0 screws, the Right to Repair enthusaists may hunt down the homes of those assemblers wanting to remove screws, seal tape the doors and windows, and gas them all in CS for 24 hours.
emphasis on 24 hours because riot gas is not lethal unless you make them to be.
They used glue and tape which means putting it back together is going to be so much fun. Any type of rma feels like it's better off just replacing the card
Sounds about right. "TOOLS?! WE HAVE TO USE A SCREWDRIVER?! UGHHHHHH"
But, hear me out, 55 screws *and* there's 8 extremely similar variants throughout the product decided almost randomly
@@bradonhoover3002 It looks like they had random variety of screws, why else would they have different variants on the same location
We've gone from intel trowing shade at glued CPUs to Intel's glued together GPUs
That definitely should have been referenced in the video. Great point! haha
they deserve to get shit on for that.
Intel locks down OC on their mobos and CPU tiers for no reason, so it's not really surprising they don't want you opening your GPU
@@GamersNexus This approach to construction is why the Apollo 1 crew capsule burned. 👎🏿👎🏿
Indubitably.
Honestly looks like a laptop engineering and manufacturing team were outsourced by Intel for this. Very similar elements in many machines I've disassembled
Considering that this is Intel first serious try I propably wouldn't say outsourced more as they had to create the team from scrath. Remeber they mostly are making CPU's so their Graphics card looking similar is to be expected.
Personally I m just happy that we finally have An RGB (AMD, Nvidia and Intel)
@@tomaszzalewski4541 None of this seems very in-house. The design looks classic business-casual Intel and in line with their new stock coolers, but the finish, materials, assembly and build quality feel very out of place for intel. I think the in-house design went as far as a 3D cad render.
My guess is that they banked on board partners to take over. The founder edition is called 'limited edition'. They had to get these cards out or it'd be another chicken and egg problem. Nobody really bit other than Asrock.
@@deldareland acer
Also, the LE moniker is exclusive to the A770, the A750 FE is not considered "LE."
@@xynonners my bad, I must have misunderstood it when someone said the FE is only made by Intel.
Acer made one, which is barely available and terrible value. The only thing it does well is catch eyes. It's more a billboard that acer makes gpus now rather than a serious product
@@xynonners Intel has A750 limited edition ;).
I can see fan replacement procedures for these to consist of:
1: attack shroud with hacksaw.
2: strap noctua fans on with tape.
3: connect noctua fans to chassis header.
And i bet better temps
Probably a more effective use of time.
Sounds like a future Dawid video.
@@Jerevinan It would have to be Noctua knock-offs from Ali Express though.
Doubly important since apparently the official driver has no fan control support.
Intel: "Check out our screwless design"
GN: "this GPU is basically made of screws"
But what is really laughable is that not even all 65 screws are the same... there are up to 5 different types among all of them... wtf?????
@@JeanPiFresita At least they didn't come up with custom heads...
@@getsideways7257 Yea, at least they are not Apple bad.... Yet.
@@lop1652 If they'll invest more into intricate design, we might see videos of even higher entertainment value once their next gen hits.
@The13thRonin Well, people always ask for variety...
I'm still amazed how comically large 4090 is.
It looks like an oversized novelty toy graphics card lmao
Craziest part is, that FE is one of the smaller 4090's, der8auer put up a comparison with the Aorus Master 4090 and it makes the FE look small somehow.
It looks like a literal brick. I bet you could build a pretty structurally sound house of them!
@@tymandude1510If you could run all the GPUs you would probably make the house a sauna lol
Is it just me or it looks like A770 is about half as thicc as 4090
Watching you take that apart was stressful. I can only imagine what it's like assembling or servicing it. RMA/maintenance teams are going to love these cards.
Pretty sure this thing has more screws and cables than my laptop, and that was a small hell to replace the fans on
@@SlenderSmurf Some notebooks have the same hell but most of them offer quite easy fan replace without the need of full disassembly. This GPU looks neat but repairability is near zero.
"here you, a replacement rtx 3060. what do you mean, that wasnt your card? shhh shhh, its better this way, just pretend its okay and move on"
Coming back to this a year later makes you realize: They probably never repaired defective units anyway, because they're probably happy for every card they get out of a warehouse - even if it's just a replacement for a defective one.
7:24 love the screwless screws in the screwless design.
This was not in the announced announcement of the announcement.
Screwless screws are pretty amazing. You don't even need to unscrew a screw to unscrew the screw.
@@GamersNexus We're screwed!
@@GamersNexus Screwless screw, reminds me of my wife 😶
the drivers had a few loose screws with so they overcompensated with the shroud design
It was leftover glue from AMD "gluing" their chips together...Intel got it at a great price.
Intel is finally sticking it to AMD.
No they're sticking it to their own GPUs.
@@spankbuda7466 "Sticking to AMD"
@@dra6o0n lmao
when you took off that cheap backplate, i was like "why did they put that over?" it looks actually pretty nice with it off. watching this tear down gives me dell computer vibes with over use of plastic and weird engineer.
4:38
Perfectly cut drop.
Whoever's editing this, Thanks.
I personally love these little touches.
yeah i love these random little easter eggs in the GN videos. funny af too
Maybe it's an LTT advertisement hehe
If whoever edits these thinks they've at some point used the "thanks Steve" clip too much, you haven't. Makes me laugh every time
I'm still trying to figure out who the guy is that stands up and goes "ow my knees... :("
@@Lazaerus Wendell from LEVELONETECHS is the guy with the sore knees
@@Lazaerus he’s from Level1Techs
Been following him since Tek Syndicate days
Please, do not stop. It cracks me up everytime.
_Thank you papa_
"Thanks Steve" never gets old. Always makes me chuckle. Always well placed. Keep up the good work, can't wait for the new shirt!
Ow, my knees!
Thanks, Denki
Where the hell is that from?
Thank you papa
*taps chest*
Yeah... 🥰
@@Zwettekop Intel's presentation at Computex 2021. Here's the Gamer's Nexus video on it: ua-cam.com/video/OnNZ3hCjIvs/v-deo.html
That's one cool LED board by Intel. It even has video card as an extra!
Never get tired of hearing Steve have an "Ohhh come on!!!" outburst in a tear-down video.
I think the funniest part is where Steve tries to pick up that top plate from the table. Well, at least it stuck to that...
Cheers.
How are you guys so good with using the "Thanks Steve" footage without it ever getting old? I still laugh every time it comes up, you guys are masterful at finding the right moments to invoke it.
i know right, complete opposite of that emotional damage meme that got old and annoying so fast with other youtubers
they use it after steve says "thanks intel". it's not rocket science finding an area to place that footage in the vid.
Thanks Papa is the new meme, it's priceless
Love these tear down videos! These are not manufactured to be disassemble frequently, but listening to Steve whine and complain makes it all worthwhile!
Congratulations Intel, you managed to make modern car engineers look competent at design and ease of servicing.
God that's even more of a cursed thought when you consider that cars now have subscriptions for built in features like heated seats and stereo . . .
We're fucking doomed and the cause is going to be wasted silicon
@@o_sagui6583 Most likely.
Let's not go that far lol
Nah, Ford still has them beat. I have to remove my bumper and the headlight housing itself just to change a light bulb. GPUs don't even have what should be considered "consumable" components, particularly when they go to the trouble of sourcing Cooler Master fans for their card. Short of Noctua, they're about as good as you can get in terms of sheer reliability. Really, the only thing that you might want to change before 5 years passes, since the fans aren't terribly likely to die before then, is the thermal paste, which is a less involved process now that we know you don't need to remove the screws holding down the little LED control PCB. use a plastic spudger to carefully lift the back panel, unscrew the handful of screws holding the back plate to the midplate and the GPU die to the cooler and pop it apart. Use a pair of pliers or forceps to unseat the connectors and you're good to go with replacing the thermal paste and pads.
How many miles before you change the oil on your GPU?
I recently did a thermal repaste on a decade old card and I was amazed by how easy it was. I certainly used to be much easier to tear down cards.
It's not used to be, it just depends on if the manufacturer wants to be user friendly
Nah you had utter nightmares in the past too, its just a matter of asshole design
@Michaels Carport Same, its was a 6950 by a company that I don't think even exists anymore. It was 4 or so screws and one connector for the fans.
I repasted my old MSI GTX 1080 Gaming X (non-Ti) recently. Four screws and two plugs (logo LEDs and fans) to take the entire giant cooler off. Meanwhile I actually had to look up a fcking tutorial for my RX 6800...
Yep I did a fan replacement and repaste on an RX590 recently. Knocked about 6 degrees off a Timespy run, and about 10 off of ambient.
As someone who frequently works on laptops, I feel for you. Also, a tip on taped stuff. Put it on wax paper. It prevents dust buildup and stops it from sticking to anything else.
Hey I appreciate the tip. 19 years later, hate laptops still. I never once thought or read of this. Lol.
@cas curse Do you even do professional computer repair? That is so unbelievably impractical. People don't want to wait a month and be charged an extra $20 because a technician can't be bothered to properly reuse parts. It's also a huge waste. Yes, keep some double sided tape for strengthening things, but don't order replacements for everything when it isn't actually necessary.
@@Joostinonline are you smoking crack? you should always have enough stock of the things you use the most for repairs, it's common sense 101
@cas curseActually now that i read it again, your comment might be about fully replacing just adhesive itself with an "official" kit or whatever, if so this is mostly reasonable if you can keep these kits in stock on site. You are still being cringe about it though and keeping a stock of official adhesive replacement kits might be impossible for new models of "certain brands", because they don't allow you to keep anything in stock lol. But you can mostly disregard everything below if that's the case and you aren't actually replacing entire part if it is at all glued. Still, relax a little.
My brother in christ, you aren't sending rockets to the space, you are repairing low grade consumer electronics, stop being uppity about it. If you are replacing any part that needs glue or tape, you either are an idiot or lying about what you do. What if the back of a perfectly fine phone screen is glued, gonna order a replcement too? Just take a blade and some alcohol, scrape the glue residue off and replace the adhesive with whatever's appropriate
wow, didnt even think of that. thanks
If only there were some company with experience in graphics card manufacturing that suddenly had excess capacity in terms of their entire graphics card manufacturing not doing anything right now, then Intel could talk to them about manufacturing graphics cards.
But they'd need to actually deliver GPU dies to said company, without being late by a whole two quarters.
@@WhenDoesTheVideoActuallyStart since when has intel been concerned about delaying a product?
would that be a certain company who has just announced that they dropped nvidia...?...
Yeah, if only such a company would exist...
@@monzarace Wow! Fuck having any subtlety in humour! Guys! lmao haha Is the company EVGA????????!?!?!?! xdxdxdxd
No FUCKING SHIT IT'S THAT SAME COMPANY.
Let's see how much they improve in their next gen cards. Both in performance and build quality. I'm really hoping to see them competing with the mid range ATI and nvidia cards with solid quality, then we'll just have to worry about drivers.
Ehhh
Don’t hold your breath
There was a A780 that was supposed to come out that met the 3070 in performance
It didn’t work
Next gen Intel MIGHT come in 2024, and by that point, it’s gonna be competing against RDNA5 and RTX 6000
They would do good if they reach 4060 performance
That’s if Intel doesn’t can Arc completely on desktop
@@bornonthebattlefront4883
in 2024 intel is gonna compete against rtx 50xx not 60xx
bro actually called them ATI not AMD
@@lukerutledge1357 IYKYK
@@lukerutledge1357 I'm just barely old enough to remember that I had one of the last ATI Cards in my first laptop
Laughed so hard at the Wendell cameo for a "Thanks Steve!". I swear it only gets funnier.
Ow my knee!
That one really took me by surprise haha
This for sure looks like the NUC engineers had a field day. They love those crazy elaborate designs.
Probably exactly what happened. "We have an entire division that's used to making small enclosures, right? Have them design it"
You do realize that NUC is a separate company from Intel and part of the Intel Partner Alliance but they did make quite a few Ryzen based NUCs too! Some where under the Ruby name I think!
@@Justchuck69 No, NUCs are specifficaly designed by intel and name NUC itself is trademarked by them, AMD alternatives exist but they are not and can't be called NUC. NUC alternatives for AMD are called mini pcs
Nobody likes crazy, elaborate designs more than smartphone engineers. They will use half a dozen flat-flex ribbon cables to shave half a millimeter off of the packaging size of the components compared to just building a larger PCB. They also love auxiliary PCBs more than any other industry, because it lets them move things around to wherever they have space left for it. And that's before we get to the last couple of generations of phones where the modern "sandwich PCB" has come into common use, where they basically use soldered connectors to bind two layers of PCB together with basically no gap between them so they can squeeze as much components into the most compact space possible. We should just be glad that Intel didn't hire smartphone PCB engineers for their GPUs. We should also probably be grateful that laptops haven't started to use those designs, because then they would be even less repairable.
I'm sure they would have an amazing get-together with BMW engineers lmao
You and this channel has resurrected the super nerd in me that i had sadly ran away from in my school years. Im so glad to have found this channel
Never forget that the non-nerds need nerds to fix their expensive electronics XD
Beautiful screwless design, very impressive!
Now show Paul Allens graphicscard.
Love your work and I value your opinions in my purchases. I design PCB's for automotive use for a living. I can add a little insight to one part of this. That LED ring PCB isn't actually the worst idea from a design standpoint. It's extremally simple and you'd be surprised how cheap it would be to manufacture especially in bulk. I also bet it was designed in a few hours. Not allot of time or money in that specific part. The way it's attached on the other hand, I have to agree with you there. Small plastic clips on the housing, screws really anything but tape or glue.
I design pcbs as well. They probably had the rings and edge boards done on the same panel. Given how thin they are I wouldn't expect the board to be much more than a dollar, if that. Low current as well and that drops the copper thickness down as well.
Yeah the fan circles 24:18 and led shroud pcb 24:00 fit neatly inside one another and the space inside each circle is enough for the ARC led pcb 13:11 and that other LED pcb 11:54 are all probably made in one piece so it isn't as wasteful as people would expect.
The use of threaded inserts where "wood" screws into hollow plastic posts would've sufficed means this was designed by someone used to low volume production though.
Heyoh, i build systems for automotive. And i really liked these LED-PCB's and they are really cheap if you consider all functions they do.
What i don't like at all are those extra power PCB. You have a connector from a PCB, to a PCB to get to a PCB in on single unit ... just why. It's like 3-5 bucks and a punch of possible failures for nothing.
Also casing any PCB in plastic was never a good idea ever ...but we tend to do it all the time ...
@@MrHaggyy experience has a lot to do with it. If the engineer is unfamiliar with things, they'll be making accessory boards for everything. Also, mechanical constraints given by other departments. They'll decide a long wire is better than running a trace to a connector that can sandwich to another pcb removing the need for wires, and other similar ideas.
This video makes me feel better because even a company like Intel produces products like this when rushed. Cost and efficiency was bypassed for the launch date and I agree with that decision. They probably didn't account for the pressure AMD put on.
good grief - this was such a pain to watch that I hope EVGA sees this and partners up with intel out of sheer pity,
I don’t think they’d want to make video cards that have broken software that’ll result in tons of returns and rmas. But they’d be the perfect partner to produce red lined variants.
EVGA probably knows, there's some things even a pro just can't fix?
@@lilblackduc7312 EVGA probably knows a pro can fix anything. ...At a cost.
@@joemarais7683 Not the first time EVGA dealt with weird ass/half working products. My friend loved EVGA Classified shit that I was there trying to fix them as well
7:26 Thanks Steve never gets old. I need a GN "Thanks Steve" shirt.
Not to mention that many adhesives do not age well when constantly exposed to heat and are susceptible to dust as well.
The whole design felt like it was meant to be discarded after 2-3 years. Low cost, lots of tape and glue, hard to self-repair.
@@GL1TCH3D Intel understood limited edition and it took it literally
Most double sided tapes fall into that category sadly
Cars today are glued at many poinst. It only depends on the type/quality. Of course there is a high chance that even Intel used special, durable double sided tape, you as a user can't buy something so special so once you disassembled this card it is better to skip the backplate.
@@EM83D Yeah, that's why a bunch of stuff in the interiors of my car is falling apart. And that's not something that is put into an environment with components that run very hot inside.
Hearing the tape separate as the thin metal backplate bends and the plastic clatter of the led strips reminded me of a Dawid Does Tech Stuff quote:
"It's like taking a blade to a kitten's face."
Definitely feels like the first version of a product. Curious what changes Battlemage will make.
Likely none, since they've already finalized the design of that before they even sent Alchemist out for review.
@@chrisdpratt I don't think that's likely. I highly doubt that Steve (or any of the press) was the first to tell people at Intel about the issues re:manufacturing and thermals. They've probably known for a very long time.
@@chrisdpratt They finalized the design of the cooler shroud? That makes no sense... they don't even have the full data on final power consumption at this point. You're full of shit.
Half the number of screws, twice the Powah!
@@chrisdpratt They might have finalized the design of the chip itself and related stuff (like memory and power delivery) but the actual design of the boards, the PCBs, etc. should have enough time to finalize.
20:24 haha
Seeing just how complicated the teardown of this one was really makes me appreciate the way more servicable Design of the Gigabyte Gaming OC RX 6600 XT and PNY XLR8 3060ti on wich I did a deepclean and repaste a few weeks ago. On both you basically only had to remove the screws you could see on the Backplate to seperate the Cooler from the card and then like four to six additional screws to separate the Finstack from the Fanshroud. There really is no need to overcomplicate it like Intel did here with their Reference design.
Shoutout to Steve for knowing the importance of prying at the connections via flathead (or some other device) rather than simply yanking the cable.
Part 2 please. I wanna see Steve put this back together. 😂
Oh man, that “Thanks Steve!” Intel reply at 7:27 is golden please keep that clip in use, shit was hilariously timed.
I like whenever they use it. The "ow my knees" is pretty good too.
Wysi
Oh my god 727
9:23 is even better XD
I laughed so hard
Your constant using of Intel's own presentation is top notch. 💯😂
I can't imagine the build cost of this thing - those crazy PCBs and insane assembly.
And they pass that cost onto us! The folks who they would really like to find enough cost competitive advantage with ARC to buy it instead of Nvidia or AMD, both of whom are launching much better cards soon in this price range.
Botched. Kinda. Maybe Intel doesn't super care about making money on this generation.
@@MrJamesonStyles ?? This is cost competitive in many ways, on DX12/modern games. $289 is a hell of a deal.
@@theSato Not when comparing to a 6600xt, which is cheaper, and has good drivers, and is significantly faster in a lot of games.
@@_--_--_ AMD fanboi detected. The Intel Arc is absolutely a strong competitor to AMD equivalents in every way... only exception is driver maturity which will improve quickly. ATI card drivers were absolute shit for quite a while when they first entered the market.
@@mikerzisu9508 What is the relevance of complaining about ATI's drivers from the AIW Radeon days? And the fanboy detected comment - seriously. Just admit the 6600xt is an excellent value compared to all other AMD, Nvidia and Intel current offerings. No need for time machines to 2008.
I’ve seen some truly baffling tear-downs on this channel. But this one… we’ll I wouldn’t say it takes the cake, but it’s certainly memorable. When I saw those custom PCBs for the LEDs… my lawd…
Screwless indeed
The threaded inserts are one of the better ways to screw things into plastic. They can handle a surprisingly large load especially the ones for injection molding.
Also they're serviceable. You can get away with self-tapping screws into plastic if you only plan to put the thing together once.
Large Load 😂
I saw back in the day when u improvised to water cool a titan V, great stuff i have 6 of them mining strong.. But now i wonder if you can point me in the same dirrection of what i can buy and modify slightly to cool a HighPoint SSD7540 8x M.2 NVMe SSD Card. Thanks in advance for all the reviews you guys do.
Please never stop the quick cuts to "Thanks Steve"
27:12 I wonder if the card physical design complications were done for a $700 card since they were aiming for the performance and mining-era prices for 3070 cards; now that prices have dropped and the A7 performance is being positioned against the 3060 the amount of extra in the design isn't appropriate for the price it's being sold at today.
In this episode of ''Dissapointing Steve'', intel finds a way to make steve both dissapointed and angry.
it's not his fault he is a drama queen it's just how he was born and raised
@@slaytronic On the contrary he trained at many dojos and adventured around the world to hone those skills.
If you ever need to replace a fan:
Just break the old one out of the housing and hotglue a case fan to the front.
Oddly brilliant
Well this card's going to be fun to maintenance for some people.
I gotta give intel credit where it's due, this is the first time I've seen Torx screws being used(structurally) in a GPU
It`s because only advanced people understand Torx.
Coming next: Gamebit screws!
OEM GPUs would like to have a word lol
@@infernaldaedra they're produced in such volume that the practical benefits of a torx type drive (longer tool life and tool can spin before it is at full depth) make it necessary, or at least worth the trouble to swap tools from whatever was already in the machines, due to economies of scale
NVIDIA and AMD both have used them for a few generations now.
1. love the hidden Intel card getting picked up from behind the 4090 2. nice desk top!!! Is that underlit glass? Love that
Hey man, I just wanted to say I appreciate your videos. You have a great way of putting things out for the simple layman, such as my self. I salute you sir! Thank you for your efforts!
I would pay good money for a Steve uncensored video where he just goes off.
Intel could surely benefit from a board partner for at least or one or two generations until their own design team really understands Video Card Efficiency, A to Z. Maybe they could even actually HIRE AWAY a few Video Card Engineers from EVGA at this point or at least contract a few of them for a year or two. THANK YOU STEVE. 😉
It's interesting that acer is making a version of this card. Not sure if they have ever made a consumer sku gpu before.
@@Kona_Shred nope, but ASRock is pretty new to the Radeon market too and their cards are ok
That gag at the start with the giant RTX card (probably 4090) in front was great 😂
Oh my god, "thank you papa" is going to be a new meme
I can't 🤣
55.5 screws is still 20 less than it takes to hold down the ROG laptop keyboards.
Intel may be realizing that hiring the guy that tanked AMD's GPU division to run their new GPU division could not have been the best idea ever.
How does anything in this video have any bearing on whether or not this piece so equipment functions well as a graphics card. You aren’t taking it on hikes. .01 percent of people are taking apart their graphics cards at any point in their life. That is way beyond an enthusiast thing to do. The conclusion at the end is so strained. The thing that actually matters is the price to performance ratio. And pretending like “too many 5 cent leds” hurts that is laughable.
@@hastyscorpion This is the teardown video. It is not a review.
@@hastyscorpion Because sometimes you have to get in to clean, to redo thermal paste, to replace borked and failing fans all to avoid shelling out better than half a weeks takehome pay just to keep playing the games that keep me sane.
@@Bill_Falsename You're in the .01 percent Hastyscorpion mentioned because normal users aren't going to "get in to clean, to redo thermal paste, to ...".
@@chankwanting I consider myself a normal user but I had to fix an issue with my old Asus 980 Strix where the fan had to be slightly adjusted because it was making contact with the fins. It was very easy to take apart and access the fans and remove those, and since it was open I went ahead and redid the paste. It doesn't take a .01 user to want to fix an issue that is as simple as a bent fan fin. I would also assume a normal user wouldn't want to replace an entire GPU just because of an issue as simple as that.
Repasting one of these in a year or two are going to be an absolute nightmare
If I remember correctly, you took apart one of their NUCs a while back and were really impressed with how good and intelligent the build was. I guess they didn't carry that process over to their GPUs. You'd think that a company trying to break into a well-established market dominated by another player (or two) would pay attention to best practices and things Steve has said in the past (Thanks, Steve!). Also, is there any reason to re-install that thin, aluminum plate you took off first? The card actually looks better without it and the open air ought to drop the temperature a bit.
To be fair, these choices don't affect most consumers, who will never even repaste their card, and they had to cut costs when they realized they were competing with the 3060 not the 3070 I'm sure.
@@eclipsegst9419 you're right. As a budget gamer looking to build my first PC soon with an Rx 6600, this Intel GPU may very well be a viable upgrade path for me, we'll have to see down the line
@@Junya01 it's not much faster than the 6600, I would get one or the other. I myself am trying to debate between 6600xt and a770. I'm curious to see what kind of performance gains the first few driver updates bring. But, 6700xt is getting cheaper by the day too. I would love to help keep a 3rd competitor in the market though.
Re: the plate, to prevent dust buildup maybe?
for all we know the teams that worked on their NUCs are different and might not have anything to do with the ARC GPU teams. Its like that most of the time on these big companies.
Thank you for suffering through this for us steve. When I need to tear it down, your videos are so valuable, and will save me completely destroying it. It won't go back exactly the same though, which really dissapointed me.
I'm surprised that Steve doesn't have a set of spudgers like most Pro repairers do. Necessary for phones and many other devices.
I was thinking the same thing at first. But, really, I can't think of any video card or computer component I've ever needed it for.
Now, phones, tablets, and other small electronics you'll absolutely need them. Especially with waterproof/ water resistant designs.
But it's kinda a different ballgame, ya know? Computer hardware has (at least traditionally) been much more serviceable, and not used the same design philosophy as phones
I've never used one for any pc component, only mobile devices.
I appreciate the extra effort for the new "Thanks Intel" videos, Thanks Steve!
Thank you for the video papa!
Also something worth mentioning... The RMA team is going to have ptsd dreams about tape and jst headers. Repair gate keeping via poor design accessibility is stupid on so many levels.
That's the _workers'_ problem. As long as shareholders are happy, anything goes!
@@caramelldansen2204 Workers: Hey this design is pretty bad can you change it before production?
Shareholders: Oh my! It needs more RGBs! Add this now!
For those that want to find the papa reference, it's at 9:30 in the intel keynote. And it's really weird
You're assuming they'll attempt repairs. This looks like the sort of thing where any replacements go into the bin. Internal repairability will probably be a concern only when designing the next gen if there's more failures than anticiptated.
That nudge "Thank you pappa" - shows how well they understand their market. Intel's idea is that fathers buy GPUs for their Sons. When the truth is those sons and daughters buy GPUs for themselves.
1:01 is my favorite moment of the video
well ya know, i think its super cool to see someone else gettin into the GPU scene. I hope they keep gettin better
It won't take much effort getting better than that at least
@@razorblade7108 lol, ouch. I mean at least they are being realistic with themselves and pricing it accordingly and being honest about its performance.
Nobody believed me when I said that RGB is the COWBELL of gaming these days. You never have enough of it.
I wish it would just go away.
I've got a FEVER and the only prescription is more ARE BE GEE.
Its free 10% performance uplift
I will always be a fan of the 808 cowbell until the day I die
It’s just light pollution imo
i really wonder who developed this. if they would have used only 50% more brain the cards would cost easily 30-50$ less. how complicated is this!? especially the manufacturing process for a card must be insane there are so many steps that only have to be done by hand. completely insane tbh!
They saved money with the screwless design.....oh wait.
Sounds like you are an expert and should be the chief advisor to these huge manufacturers. Get in touch with them and share your expertise.
But... but... Look how pretty the lights look! 50 dollars less???? Naaaaa... who would want to save that on an ugly graphics card with no lights????
@@youtubeisgarbage900 oh just shame the thousands of engineers working in the company which makes life easy for billions of people across the globe
@@youtubeisgarbage900 not defending anyone. I couldn't give a shit whether Intel succeed or not. Simply saying that these experts in the comments should be working for these big companies instead of wasting their time here. They really sound like they know exactly what to do.
Nothing like being a new player in the game, and you not only do not innovate, you actually push things in the opposite direction for design. Well done Intel, showing us all how it's done.
When you get used to "competing" in a Quasi-Monopoly, these things happen.
I like the white table, especially with the dark card. Any plans for a "light mode" modmat, maybe not for full scale production but at least as a UA-cam piece
It's actually an entire lightbox! Haha, it illuminates and has RGB and everything. We do have ideas for future modmats!
@@GamersNexus
I'd love to draw and fix up my sketches on that light box! Looks beautiful, just like you papa ❤
In case anyone was curious, the fabric tape they're using is high-temp wiring loom tape. It's fantastic stuff because it bonds well to paint, plastic, nylon, and even smooth-finish silicone as it is meant to wrap wiring looms in automotive production. It is also very thin and doesn't readily lose its adhesion like Kapton tape sometimes does over time. It makes sense that they'd use a product seemingly purpose-built for wire management to manage their wires.
Honestly, I'd rather have all of the little auxiliary PCBs for LED management if it means they can use fewer, easier to manage cables instead of really going out of their way to create a rat's nest of cables inside the housing. As long as the plugs that go into the connectors are obvious, labeled, or don't need to be removed to service the card then I'm largely fine with their implementation, as it makes the card relatively tidy for the amount of LEDs they're using. I'd much rather them have an absolute stunner of a card that is slightly less convenient to disassemble than make something that looks boring or trashy as their "Limited Edition" flagship product. But it's still a game of qualifiers. If the card didn't look so good and wasn't priced so competitive to its performance level, then I'd be less forgiving of these minor annoyances.
Also, a pair of forceps is a fantastic tool for pulling connectors from their sockets, particularly if you can get right-angle ones.
All in all, an annoying GPU to work on, but at least they're trying to use quality components so you hopefully won't need to. At the very least, if your card doesn't arrive DoA then it should last until the next model gets released, assuming Intel doesn't follow through on axing the whole project.
What we want: Good thermal management, easy access, good build quality.
What we get: 90 programmable lightbulbs
I really like the look of these GPUs, they nailed the aesthetics, I hope they continue the look or other manufacturers make more GPUs this clean.
I'd rather look at the display signal it's outputting.
@@armorgeddon Most GPUs have gaudy ornamentation, so simple designs fall in line with that.
I mean this is proof EVGA and Intel would really be a heavy hitter combo
Seriously, they need help from a brand like EVGA in the worst way. They'd practically solve 90% of those card construction problems in a heartbeat.
furry detected, opinion discarded.
EVGA can't make a bad card great. This card can't even run many pre dx12 titles at all. Only intel can fix that.
@@madarab hating furries went out of style in like 2011, you're just embarrassing yourself
@@madarab ???
Do people really get this offended over a profile pic nowadays?
Jesus christ the modern internet is a cesspool.
I appreciate your hard work and lack of sleep to get these videos out. Best of luck to the entire team.
I hear there's some amazing board and card designers from EVGA that are looking for a new gig. Maybe Intel could pick up all of them to make something meaningful come out of all of this. I look forward to what Intel eventually offers, but for going into my server for encoding/decoding, not so much gaming, though I won't say it couldn't be a good secondary card to run my additional monitors at some point.
awesome idea
The individuals working on graphics cards would be held to the same noncompete agreement as EVGA as a whole, so for at least a year or two they can't work at a competitor of Nvidia. It's not that Evga wouldn't make cards for another company but they literally can't for a period after ending Nvidia contract, and any employees with access to Nvidia Intellectual property would be required to be bound as well
@@montgomeryfitzpatrick473 /whoosh
@@silverthorngoodtree5533 what was the joke I missed?
@@montgomeryfitzpatrick473 gamers Nexus already did a video updating the EVGA saga where the company responded they never signed such an agreement.
Gamers Nexus you've become really funny lately good to see so much of your personality show through these latest videos.
Please never stop the “Thanks Steve” I laugh literally every time
a lot of double-sided tape like that, starts becoming unadered at higher temperatures commonly seen inside pcs. I'm curious as to how it's going to hold up over time
Not if done properly. Cars have lots of adhesives. Entire body panels held on with them. My comma 3 is held up with 3m vhb and records temps up 70c as its in the windshield. Other users that live in hot climates have reported higher.
@@PinkFZeppelin Well floyd and zep tapes hold up better than everything produced since, and the manufacturing process probably entailed everything BUT glue.
This reminds me of working on my car and having to deal with both SAE and metric fasteners. The only difference is me saying "Thanks, GM!" instead of "Thanks, Intel!"
What car has any SAE? The last cars to use imperial/SAE were from about 50 years ago.
@@IRefuseToUseThisStupidFeature oh how I wish that was true. A buddy of mines 2020 Chevrolet has both metric and sae parts.
Been doing a few things in my Ford, it's been only metric so far thankfully
@@Silverhks Reading both your and hockeyfan's comments here have hurt me more than words ever have before. WHY WOULD THEY DO THAT?! 😭
"It's the totality of the contrivancy that drives those complaints"
Never change mate. In German we say "ich bin mit der Gesamtsituation unzufrieden", and your wording captures that sentiment perfectly.
For their first time designing a card, I will say it is one of the best looking graphics card I’ve ever seen.
It has a lot of lipstick on it but not a lot of brains under that makeup
@@moldoveanu8 But it will cost you a night to find out.
@@Nost2682 to be fair, it's not a bad first time GPU.
Realistically it is better if Intel never fully catches up to the likes of AMD or Nvidia GPUs because they'll always have to sell for a little cheaper.
That would be a win for mid range and lower gamers which let's be honest. That's most of us.
But if they do catch up, maybe that may turn up the heat on the other vendors to innovate more or reduce prices.
I doubt significant price reduction will happen thou. Not if they use giant GPU dies on the latest and greatest nodes.
They have to figure out a way to do more with less. And AMD/Nvidia are already doing just that with DLSS/FSR, chiplets, and it's still not enough to reduce prices because they still use huge dies.
@@moldoveanu8 performance to price shows that there is a lot of brains under the cover ;).
I get it, it's called the Limited Edition. That's why it's taped together with double-sided tape. Sales will probably also live up to that name. The non-limited edition might be held together with powerful magnets! Looking forward to it!
So like a limited lifespan instead of limited supply?
@@Fulano5321 Time will tell, I guess. The first run will be shaky, but maybe in a third or fourth generation, Intel will step it up. Either way, it makes for entertaining tear-down videos by GN! 😄
I love these tear downs.
It is a good looking card but i'd appreciate function-first design more.
I wish more manufactures had function first designs. My favorite is my speed queen washer and dryer, lol. Now I just need Ford or Chevy to make a basic truck without any creature comforts, I don't even want a radio in it or comfy seats, just a basic functional work truck.
Then you'll be more interested in their general consumer models and not their "Limited Edition" variant with its fancy aesthetics. The other variants have less RGB.
@@Myers100684 With 30 screws less and none of that fance PCB LED stuff this card could literally sell for 30-50 dollars less.
Yeah. Function over form.
Had this card for 4 months now, and It performs VERY well. The only thing is that they've focused so much on DX12 that older games (some I must say as I've played DX11 & DX10 (and even games like GTA:SA) games that work perfectly good) might not perform very well, but then again I don't play much of these. The memory size is very nice to have, games do utilize it even if it doesn't max out on cards with less memory on the same settings, which speeds up rendering alot and gives more FPS. Never had thermal issues, and never planned on taking it apart.
For pulling cable connections like you were @11:30 I find oldschool IC lifters are really useful. You can grab both shoulders of a connector and pull it out straight out without putting stress on the wires.
I use forceps. Great to have around for fishing wires through cable management routing or for pulling connectors. Also for retrieving stainless steel screws when they fall inside the case.
Not a bad idea!
i love how your simple explanations of general concepts feel like sick burns on intel.
Amazing screw less design Intel! Absolutely no screws to have to worry about or keep track of and especially nice to not have to deal with all different types of screws as well! Amazing!
The editor had a field day with this video, I love it. 😂
The little mention of "if the drivers keep getting made" really hurt me as a user of the legendary value card R9 390. I get regular GPU driver crashes now
Strange, I have a r9 290 and even with the last driver pack it has been rock solid
I don't really play new games though
When I saw this thing with just the backplate removed, I had a feeling the complete teardown was going to be a nightmare. Thanks, Steve! * in my best Intel keynote voice *
It's almost as if they never designed these thinking "there may be a few cards with RMAs that need to be repaired, so let's design a card that can be taken apart easily for our RMA team!"
Also, backplates that don't actually add any cooling or functionality is a pet peeve of mine. Complete waste in manufacturing cost and design.
It actually creates an air pocket which traps heat like winter jackets
@@razorblade7108 Yes I also watched the video
Intel will not be doing any RMA's. Intel will tell you to keep the GPU and will send you a "Thank You Papa" gift card instead.
@@razorblade7108 Right, it's even worse than worthless. Thank you Papa Intel! I feel like GPU manufacturers have been like "let's throw a backplate on it for no reason" for several years now. There are even been GPUs with pointless plastic backplates.
I have more respect for a minimally designed GPU made to reduce costs, than a card with a ton of bloat with no thought put into the big picture. There are other ways to add value other than making a product look flashy. EVGA gained its reputation thanks to its stellar warranties. Leaning towards right to repair by providing board schematics is another way to add perceived value. I feel like that's the only thing Intel can put on the table at this point - other than better GPU drivers.
I'm impressed, pretty good engineering there. Steve typically overshoots with his sarcasm and wisdom but that's his shtick
Gotta say all the GN products are super classy, I only have the coasters but it’s pretty clear the pride you guys take on providing useful features. Good stuff loving all the new product reviews
I got the glasses with the coasters. They are very nice!
It might as well have a "no user serviceable parts inside" tag on it. This is something you would find on a flea market tail gate sale for $50 quality-wise.
I've never had a fan die (personally) but I did have to repaste a GPU recently and being able to do that is important.
It is nice they tried to keep the cost down anyhow. Graphics cards don't always need to be expensive. They DO however need to last some expected time frame!!
the LED implementation really looks like an afterthought, especially that tiny board for the external rgb connector.
I wish all component manufacturers would stop wasting their time and resources on stupid LED's. For those who want LED's, there are plenty of separate LED's and controllers. Can't they just buy those and decorate their PC's with the LED's themselves? Why must the LED's be integrated into the components?
agreed. at least put a physical rgb switch like what msi did on their mobo's, so people can turn off rgb without silly softwares eating 10% cpu usage. this way you could still sell the card to rgb fanboys. but still, no rbg at all in a flagship card would be nice.
55 screwless design, nice Intel, nailed it.
I don't think it's nailed, it's literally screwed together
@@Emptiness_Machine_2001 don' give them ideas, they may nail parts together next ^^'
@@ledoynier3694 Nailing it would literally make for nightmares.
This definitely proves intel can overengineer a video card lol, I can't complain. My jaw was on the floor when you flipped over the led assembly.
If it gets too hot it literally melts, amazing. truely a fluent designe!