China doesn't want me to have this GPU - Moore Threads MTT S80

  • Published 21 Nov 2024

COMMENTS • 6K

  • @TARS..
    @TARS.. Рік тому +3743

    I think what the recent newcomers into the gpu space have shown us is how important driver support is.

    • @alaa341g
      @alaa341g Рік тому +216

      of course, nvidia always wins over AMD with drivers; and btw this is just a 2-year creation, give it a few more years and it will flood the market with lots of good midrange GPUs like they did with phones

    • @GlitchedCattt
      @GlitchedCattt Рік тому +28

      I was thinking the same thing. No drivers for this GPU are available yet, or rather not yet available to consumers. OR maybe they ship in the box with the GPU, so only the owner of the card Linus is investigating has them and they haven't been released to the public yet

    • @Grimmers
      @Grimmers Рік тому +130

      Nvidia has a decades-long policy of tuning for specific games in their drivers, even going as far as baking optimized shaders for specific games into the driver. Must be an absolute nightmare to be AMD or Intel and have to try to support all that old spaghetti of legacy optimizations.

    • @macgyver9134
      @macgyver9134 Рік тому +22

      this isn't news. Anyone who had an ATI card in the late '90s or early 2000s knew this!!

    • @5Hydroxytryptophan
      @5Hydroxytryptophan Рік тому +87

      Not only newcomers... Nvidia's drivers still suck big time on Linux.

  • @blze0018
    @blze0018 Рік тому +7134

    "Took them 2 years to make a GPU and you 3 years to make a screwdriver" - Fuckin gottem.

    • @naamadossantossilva4736
      @naamadossantossilva4736 Рік тому +284

      TBF the LTT screwdriver is great and the GPU is trash even compared to the Arc dumpster fire.

    • @killa4rilla146
      @killa4rilla146 Рік тому +42

      Had me dying laughing
      🤣🤣🤣🤣🤣🤣🤣🤣

    • @Chris_Garman
      @Chris_Garman Рік тому +11

      @@naamadossantossilva4736 Oh yeah? What does that screwdriver cost again?

    • @epicnicity916
      @epicnicity916 Рік тому +298

      ​@@naamadossantossilva4736 tbf a screw driver is just plastic with a steel rod, GPU is much more complicated

    • @KobeLoverTatum
      @KobeLoverTatum Рік тому +21

      15:09

  • @thexgamer8240
    @thexgamer8240 Рік тому +11735

    Imagine a Chinese secret agent watching Linus drop the GPU and break their hidden spy camera.

    • @geetanshkaul3419
      @geetanshkaul3419 Рік тому +200

      ahahahhahahahahahahahahahahahahah bro💀

    • @wescrowther655
      @wescrowther655 Рік тому +464

      I’d say they saw it coming.

    • @GoofinDailyIsAlsoMyIGN
      @GoofinDailyIsAlsoMyIGN Рік тому +32

      ​@@wescrowther655 yea

    • @williamchupin
      @williamchupin Рік тому +16

      lmfao

    • @gggoom2443
      @gggoom2443 Рік тому +885

      Dunno why the US government keeps emphasising China's spying on everyone… isn't this what the US has been doing to the rest of the world?😂😂

  • @ApplePotato
    @ApplePotato Рік тому +490

    AMD Polaris and Vega were developed out of their Shanghai office, so China does have enough talent to make the hardware. In fact, any university-level computer architecture course can teach you how to make a beefy GPU. But the magic sauce is the optimization within the drivers. If you can't fully utilize the GPU, the performance/watt is going to suffer. I suspect their drivers don't implement the DirectX APIs 100% at any level, which is why they only support certain games.

    • @kelmanl4
      @kelmanl4 Рік тому +33

      It's not about talent, China has a lot of it in the computer science and R&D space, but they need access to parts that take a lot of time to create. This tech takes time: AMD and Nvidia are working on projects right now that won't see the light of day for another 2-4 years, and it will take time to create everything. There's a reason Intel has used x86 for decades.
      They also need driver support in applications and games, which requires cooperation from both the developers and the card makers, and that isn't an option for everyone.
      Intel has been working on GPUs for a decade on and off and they still have driver and support issues with games a year after launch.

    • @anarchyandempires5452
      @anarchyandempires5452 Рік тому

      I believe they legally can't support DirectX. That's Microsoft property, and unless I'm remembering wrong, Microsoft is one of the companies that was banned by China during the trade war. I would imagine I'm probably wrong, though.

    • @ApplePotato
      @ApplePotato 6 місяців тому +5

      @@kelmanl4 Yes, it does take time to come out with the silicon. But drivers imo are the toughest part. Even Nvidia and AMD have to constantly release patches and game-specific fixes or optimizations. DirectX, OpenGL and Vulkan are not going to be easy to implement from scratch, and imagine doing it across multiple versions.
      Intel or AMD could easily drop x86 and go for a RISC ISA. The reason they don't is that there is still huge money to be made from backwards compatibility. Intel has proposed dropping all the legacy 32-bit rings and 16-bit real mode support in hardware from future CPUs, though. Hardly anyone these days is booting DOS or a 32-bit OS. And it's updated every so often with new SIMD instructions. x86 is still very alive and evolving.

    • @jamesj.mccombie5031
      @jamesj.mccombie5031 4 місяці тому +1

      Or steal IP.

    • @TheJuanjo234
      @TheJuanjo234 3 місяці тому +3

      The architecture of a GPU is more complex than what a university computer architecture course covers. There are a lot of optimizations to do in the hardware architecture to reduce power consumption. Drivers aren't the only magic sauce.

  • @S41L0R
    @S41L0R Рік тому +10612

    Ah, finally a GPU brand that you haven't dropped a card from.

    • @Vyle_One
      @Vyle_One Рік тому +597

      yet

    • @NLRevZ
      @NLRevZ Рік тому +93

      Has he dropped a Matrox card yet? Hmm...

    • @wh2960
      @wh2960 Рік тому +102

      Chances are low, but never zero.

    • @GeraltOfRivia69
      @GeraltOfRivia69 Рік тому +17

      He can't take chances

    • @Hablift
      @Hablift Рік тому +3

      😂😂🤣

  • @ProjectPhysX
    @ProjectPhysX Рік тому +348

    Some people in Germany got their hands on MTT S80 GPUs, and they tested a couple of things for me: Unfortunately there is no OpenCL support yet, although MTT advertised it. Drivers are still very early. (A rough way to check for OpenCL support is sketched after this thread.)

    • @decreer4567
      @decreer4567 Рік тому +1

      If they don't support OpenCL then it's total garbage.

    • @ProjectPhysX
      @ProjectPhysX Рік тому +42

      @@decreer4567 this is likely to change with driver updates. OpenCL support is mandatory to have any user base in the compute / data center segment. I'm not surprised there is no support yet; Intel's OpenCL support at Arc launch was abysmal too but has gotten significantly better already. Give them some time.

    • @tylerclayton6081
      @tylerclayton6081 Рік тому

      @@ProjectPhysX The last thing westerners should do is support Chinese chip companies, no matter how small they may be. Why are Germans so supportive of dictatorships? Didn't y'all learn a lesson from the whole over-dependence on Russian energy thing?
      The west has gotta decouple from China, not buy advanced components from them that they could then use for spying or collecting data on us

    • @ProjectPhysX
      @ProjectPhysX Рік тому +59

      @@tylerclayton6081 OpenCL is an open standard; if the hardware supports it, it can run a large variety of software. Open standards are a good thing.
      I'm not supporting Chinese chip companies and their dictatorships. But there is no reason to be dismissive of the Chinese people either, they are not that different from us. I come from academia and value international collaboration, regardless of nationality. International collaboration/communication solves problems, decoupling through stereotypes and building walls does not.

    • @jamesjamey8596
      @jamesjamey8596 Рік тому +4

      I get lackluster game support but I was expecting the situation to be far better for compute workloads! I feel sorry for the poor souls who have to use these to work on their research projects!
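    A minimal sketch of the kind of OpenCL check discussed in this thread, assuming Python with the pyopencl package (pyopencl and these device queries are illustrative assumptions, not anything MTT documents): if the vendor ships no working OpenCL driver, the platform list simply comes back empty or the query errors out.

        # Minimal OpenCL support check, assuming the pyopencl package is installed
        # (pip install pyopencl). With no working vendor driver, no platforms show
        # up (some ICD loaders raise an error instead of returning an empty list).
        import pyopencl as cl

        try:
            platforms = cl.get_platforms()
        except cl.Error:
            platforms = []

        if not platforms:
            print("No OpenCL platforms found - no usable OpenCL driver installed.")
        for platform in platforms:
            print("Platform:", platform.name, "|", platform.version)
            for device in platform.get_devices():
                print("  Device:", device.name,
                      "| compute units:", device.max_compute_units,
                      "| global memory MiB:", device.global_mem_size // (1024 * 1024))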

  • @Euathlus1985
    @Euathlus1985 Рік тому +1817

    Finally a GPU where the power connector is in the right position

    • @NoNameAtAll2
      @NoNameAtAll2 Рік тому +3

      wdym?

    • @leastE
      @leastE Рік тому +192

      @@NoNameAtAll2 most (well, maybe all) GPUs have their power connector "on top" of the card, so when you mount the GPU, unless you mount it vertically, you always end up seeing the PCIe power cables in front of the GPU.
      the card in the video has it to the side of the fan, which kinda hides the cable from sight, which is kinda cool

    • @СусаннаСергеевна
      @СусаннаСергеевна Рік тому +8

      It makes absolutely no difference, the cables are covered up by the side panels anyway and even if you’re running with the sides off for thermals you’re not spending any significant amount of time under your desk staring into the computer.

    • @Midz350i
      @Midz350i Рік тому +82

      @@СусаннаСергеевна I think in EVGA testing they found it affects airflow a little bit. Also it looks cleaner, and on some cards the wires will hit the glass panel. On the original O11 I couldn't close the side panel when I had the EVGA 1080 Ti Hybrid due to it hitting the power wires.

    • @zetsubou3704
      @zetsubou3704 Рік тому +95

      @@СусаннаСергеевна Bro do you use a fkin Lenovo Thinkcentre from the 2000's ☠️. Almost all modern cases have transparent glass and acrylic side panels in case (pun intended) you didn't know.
      Also yeah many people keep their PC's on their desk 🗿

  • @jwickerszh
    @jwickerszh Рік тому +28

    I'm impressed, I didn't even expect this to work... because I'm pretty sure MTT's first focus isn't gaming.

    • @aaroncruz9181
      @aaroncruz9181 3 місяці тому

      This is where Chinese gamers train to defeat Taiwan in real life.
      At least COD will have a decent title afterwards.

  • @gnuplusmatt
    @gnuplusmatt Рік тому +893

    Apparently it will use a standard PowerVR module on Linux, and PowerVR has a Vulkan driver on Linux - there might be some interesting testing to be done by Anthony

    • @davidgoodnow269
      @davidgoodnow269 Рік тому +39

      That makes sense.
      Doesn't China still officially use their own flavor of Red Hat Linux?

    • @tilsgee
      @tilsgee Рік тому +15

      @@davidgoodnow269 isn't Kylin the official name of the Linux distro for China?

    • @myusernameiscooldude
      @myusernameiscooldude Рік тому

      nope, still garbage full of stolen tech, this is literally a scam to suck up government funding

    • @Yukicanis
      @Yukicanis Рік тому +4

      Including AI using ncnn.

    • @648
      @648 Рік тому +4

      @@davidgoodnow269 they also use Win 10 G, basically a Chinese Windows 10 mod without MS account functionality

  • @Cyhawkx
    @Cyhawkx Рік тому +2114

    I like how they list Dwarf Fortress as compatible and supported... Dwarf Fortress doesn't use any GPU, it's entirely CPU bound.

    • @Intake2365
      @Intake2365 Рік тому +153

      There's a new Dwarf Fortress, in case you missed it! Check it out. :)

    • @luziferius3687
      @luziferius3687 Рік тому +127

      It displays stuff on screen, so it uses the GPU. In fact, it uses OpenGL to render the screen. Factorio is 2D, yet it uses OpenGL on Linux and DX11 on Windows.

    • @SvendDesignsSD
      @SvendDesignsSD Рік тому +228

      @@luziferius3687 No it doesn't. OP is right - the new release of Dwarf Fortress on Steam is entirely CPU bound, even for rendering graphics. It uses a hardcoded version of OpenGL and CPU clock sync to render the 2D graphics.

    • @jayyvonkush1941
      @jayyvonkush1941 Рік тому +1

      So just like every Chinese product, it's a lie.

    • @theboxofdemons
      @theboxofdemons Рік тому +12

      @@SvendDesignsSD OK even if everything in game is rendered by the CPU, surely the window itself is drawn by your gpu onto your desktop.

  • @MattSitton
    @MattSitton Рік тому +1055

    I like how this is sponsored by xsplit but they use OBS to test on the GPU

    • @Luniii737
      @Luniii737 Рік тому +69

      Wait. XSplit is still a thing?

    • @Guitarhero1000
      @Guitarhero1000 Рік тому +9

      @@AlexDatcoldness Don't you need to pay Xsplit a fee for using it?

    • @DryRoastedLemon
      @DryRoastedLemon Рік тому +3

      @@AlexDatcoldness Why, actually? Genuine question.

    • @theres-kc4bb
      @theres-kc4bb Рік тому

      @@AlexDatcoldness just use obs. they have updated it like half a year ago. spend 10 minutes on it, earn a lot of money.

    • @ahyaan2552
      @ahyaan2552 Рік тому +2

      @@AlexDatcoldness boiler went today and i am down £1500, think i will have to continue to pirate keys :(

  • @w10537543
    @w10537543 Рік тому +68

    "What is the use of a child who has just learned to walk?"
    "He will eventually become an adult."

  • @Jatin-Gaur
    @Jatin-Gaur Рік тому +1175

    Linus described this GPU as a nuclear weapon that every president is after

    • @aaronlay1210
      @aaronlay1210 Рік тому +3

      ik

    • @PointingLasersAtAircraft
      @PointingLasersAtAircraft Рік тому +51

      GPUs will be a vital resource in WWWIII.

    • @Jatin-Gaur
      @Jatin-Gaur Рік тому +86

      @@PointingLasersAtAircraft WW3 will be played in COD lobbies, modernization exists everywhere! Joe will prolly appear with the best of pay to win weapons

    • @Deadxet
      @Deadxet Рік тому +37

      @@PointingLasersAtAircraft World Wide Web 3?

    • @hebleh5771
      @hebleh5771 Рік тому +8

      @@PointingLasersAtAircraft Tarkov seems to agree

  • @steverogers8163
    @steverogers8163 Рік тому +480

    I have to imagine its priority is server first, desktop second. I was surprised it didn't seem to allow any real video encode/decode, as that's a giant use case for server GPUs. Though maybe it does and this is really a case of a locked-down hardware/software package deal, i.e. maybe Chinese YouTube is using footbrake instead of handbrake.

    • @itsalexjones
      @itsalexjones Рік тому +27

      Almost certainly the media transcode capability is exposed through an SDK (like NVENC) and the enterprises buying these will just implement it. Trying it in OBS is a good test, but if OBS hasn't implemented the SDK, like they have for NVENC, then obviously it won't work

    • @duncanlanceoliver194
      @duncanlanceoliver194 Рік тому

      it's likely good for background rendering where it doesn't have to display the image it makes but just does the math fine, which is what the readouts and the display we saw hinted at, given the low display rate but high processing ability; it's only when new assets are obtained or installed that the system hitches, like server background GPUs used to do in the early 2010s. they were great cards for the price then, like $120 for a decent CAD work server, but crap for any 3D gaming or live texture rendering

    • @duncanlanceoliver194
      @duncanlanceoliver194 Рік тому +1

      further note: companies that would get such a card would do so because they'd take what they saved on the worker systems and supply a finalisation system within the office that workers would upload their final projects to be worked on; rather than having 20 machines costing 15k each, you can have 50 machines worth 2k and one single system worth 30k

    • @y__h
      @y__h Рік тому +25

      It supported PyTorch out of the box, it's literally an AI accelerator in a GPU trenchcoat

    • @RacingSlow
      @RacingSlow Рік тому

      It's listed under desktop, not server, on their site

  • @electronash
    @electronash Рік тому +88

    To give them credit, "Moore Threads" is a really clever name. lol

    • @churro6160
      @churro6160 Рік тому +5

      I guess you can say Moore's law isn't dead 🤔

  • @kirkh4205
    @kirkh4205 11 місяців тому +182

    The good news is that if Moore Threads can actually become a contender with the heavy-hitting GPU manufacturers, then their products could help keep prices down in the high-end graphics card market.

    • @st.altair4936
      @st.altair4936 10 місяців тому +25

      Especially when you consider it's China; their products tend to be insanely competitively priced.

    • @littletweeter1327
      @littletweeter1327 9 місяців тому

      They might only care about the Chinese market, which is why they made a card similar in performance to a 3060. Because 99% of gamers in China only play shitty mobas that can run on a 750ti

    • @DI-ry5hg
      @DI-ry5hg 8 місяців тому +3

      @@st.altair4936 Just imagine that any software you use in the future will need to be censored by the Chinese government before it can be optimized accordingly.

    • @MimOzanTamamogullar
      @MimOzanTamamogullar 5 місяців тому +10

      ​@@DI-ry5hg I've used Chinese phones, none of the memes actually hold up. It's literally just a phone. There's no reason to think their GPUs would be different.

    • @cvspvr
      @cvspvr 5 місяців тому

      ​@@DI-ry5hg someone will figure out a bypass just like they bypassed nvidia's limited hashrate

  • @michealvalois4463
    @michealvalois4463 Рік тому +379

    Adam is a man after my own heart. A fan of TF2 who's not blinded by base-game nostalgia and enjoys the absolute chaos.

    • @punimarudogaman
      @punimarudogaman Рік тому +4

      did u know tf2 will release a major update this year? they announced it.

    • @War_Parrot
      @War_Parrot Рік тому

      ​​@@punimarudogaman some hats, emotes, effects and maps? I'm too lazy to check it by myself.

    • @scidja8567
      @scidja8567 Рік тому +35

      ​@@punimarudogaman yeah sure buddy whatever you say.... Major update my ass link me the source right now bruh

    • @BitchlessNigga
      @BitchlessNigga Рік тому +1

      ​@@punimarudogaman Stop the cap brudda

    • @sheesh1101
      @sheesh1101 Рік тому +3

      i started playing tf2 at a time when cosmetics and weapons were already a thing
      the weapons are genuinely a great addition to the base game

  • @Thurrock2
    @Thurrock2 Рік тому +268

    Love how the power port has finally been moved. Now someone just needs to put it on the bottom and cases will be looking much cleaner without those 2 cables jumping over the MB!

    • @macicoinc9363
      @macicoinc9363 Рік тому +3

      They will never do that, it would be such a pain in the ass to deal with.

    • @Thurrock2
      @Thurrock2 Рік тому +3

      @@macicoinc9363 How so? Moving a small part on a PCB?

    • @cgiacona
      @cgiacona Рік тому +1

      how would you access it on the bottom? it would be completely blocked by the motherboard

    • @Thurrock2
      @Thurrock2 Рік тому +12

      @@cgiacona I meant the bottom on the back clearly, but even still, larger GPUs extend past the MB anyway.

    • @UlrichLeland
      @UlrichLeland Рік тому

      It just depends on how you customise your own pc, I have a 3090 FE, my 12 pin connector looks super clean with a braided 12 pin to dual 8 pin connector it's not in the way at all.

  • @gobbel2000
    @gobbel2000 Рік тому +44

    Maybe it's time for you to prepare some PyTorch/TensorFlow benchmarks. Those could be more the target applications of these cards. (A rough sketch follows this thread.)

    • @MaddTheSane
      @MaddTheSane Рік тому +3

      Are there even drivers for the GPU's AI acceleration?

    • @maskedtomato3005
      @maskedtomato3005 Рік тому +10

      @@MaddTheSane They do make drivers for Linux, and that's their main purpose. Our company is considering using this product to do some AI computing in order to prepare for the future chip ban that might happen.

    • @roro-v3z
      @roro-v3z Місяць тому

      @@MaddTheSane we are using this to avoid US sanctions
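    For what it's worth, the benchmark being suggested here can start as something as simple as a matrix-multiply throughput comparison. A rough sketch, assuming stock PyTorch; the "cuda" device string is only a stand-in for illustration, since an MTT card would expose whatever device name its own PyTorch build uses.

        # Rough matmul throughput comparison, CPU vs. accelerator, in stock PyTorch.
        # "cuda" is a placeholder device name used purely for illustration here.
        import time
        import torch

        def matmul_tflops(device, n=4096, iters=20):
            a = torch.randn(n, n, device=device)
            b = torch.randn(n, n, device=device)
            c = a @ b                      # warm-up
            if device != "cpu":
                torch.cuda.synchronize()
            t0 = time.perf_counter()
            for _ in range(iters):
                c = a @ b
            if device != "cpu":
                torch.cuda.synchronize()
            dt = time.perf_counter() - t0
            return 2 * n ** 3 * iters / dt / 1e12   # ~2*n^3 FLOPs per n x n matmul

        print("CPU:", round(matmul_tflops("cpu", n=1024), 2), "TFLOP/s")
        if torch.cuda.is_available():
            print("GPU:", round(matmul_tflops("cuda"), 2), "TFLOP/s")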

  • @PopTartNeko
    @PopTartNeko Рік тому +134

    I'd support them just to break the duopoly we have rn honestly.
    What Chinese brands did to the smartphone market was a great thing for everyone. Folks here in Southeast Asia can finally get high-end-spec phones without them costing an arm and a leg.
    AND cheaper competitors also forced the likes of Samsung and Apple to be more competitive in terms of features/quality/pricing.
    The same kind of thing happening to the GPU market would be a net positive for consumers. Just saying.

    • @Dell-ol6hb
      @Dell-ol6hb Рік тому +15

      I agree 100%, American companies are scrambling to stop them because it would break their oligopoly on the market and they would actually have to make good, competitive hardware.

    • @yono_yume7083
      @yono_yume7083 Рік тому +19

      Agreed, I'm tired of +5% performance and +50% price.

    • @kierancusack7774
      @kierancusack7774 Рік тому

      yeah but they also use that cellphone architecture through Huawei to basically put a stranglehold on the communications networks of developing countries and spy on people, so although it's great for normal people in the short run it will have nasty long-term ramifications for countries like Bangladesh and Malaysia

    • @greenjobs2153
      @greenjobs2153 Рік тому +1

      ​@@yono_yume7083😂honesty

  • @rmo9808
    @rmo9808 Рік тому +627

    It might take them a decade but once they get halfway decent cards they'll just flood the market.

    • @TheMsdos25
      @TheMsdos25 Рік тому +204

      About time someone puts the Nvidia/AMD duopoly on their toes.

    • @bobsemple9341
      @bobsemple9341 Рік тому

      Doubt it. China is known for crap

    • @vyor8837
      @vyor8837 Рік тому

      Try 90 decades.
      You know, how long it'll take to get back on their feet after china collapses into civil war(again).

    • @skyisreallyhigh3333
      @skyisreallyhigh3333 Рік тому +76

      I'm gonna guess that by their 3rd or 4th generation they will have cards comparable to the cards Nvidia, AMD, and Intel will be putting out at the same time. They are getting a share of that $1.4T and that helps tremendously.

    • @vyor8837
      @vyor8837 Рік тому +49

      @@skyisreallyhigh3333 And what are they going to build them with? Hopes and prayers?

  • @LuLeBe
    @LuLeBe Рік тому +389

    15:08 that "2 years for this GPU, 3 years for your screwdriver" is exactly what I'm thinking. The effort to even get a GPU working in whatever way possible is massive. From there you need to work on various features (like Tesselation) and likely revisit the hardware design multiple times to get it right. But 2 years is a pretty short time for the initial product.

    • @ComradePhoenix
      @ComradePhoenix Рік тому +34

      By the same token, though: The LTT screwdriver favorably compares to name-brand, high-quality ones. The GPU does not.

    • @rfouR_4
      @rfouR_4 Рік тому +105

      People can shit on this product all they want, but 2 years from nothing to an actual product is crazy. If they can keep up the pace, I'm excited to see what they can accomplish in the coming years. Amazing what state-driven investment into technological innovation can accomplish. It's almost like having a government that invests in its own infrastructure, development and progress instead of one that spends all its money on bombs and 800+ foreign military bases while it rots from the inside out is maybe superior. Weird how that works. Probably has nothing to do with how China has managed to lift 850 million of its people out of poverty. 🤔

    • @ilovetheatf
      @ilovetheatf Рік тому +56

      ​@@rfouR_4 CCP bot

    • @jackharan3791
      @jackharan3791 Рік тому +54

      @@rfouR_4 they literally licensed the IP from another company, it's not like they did all the R&D themselves, they just put the money down to put themselves in the position to make a glorified 1030

    • @milkmilky157
      @milkmilky157 Рік тому +7

      2 years is the time spent on sourcing, sanding and reprinting😂

  • @djangoryffel5135
    @djangoryffel5135 Рік тому +132

    I like that they use EPS. 400 W with one non-fire-hazardous plug; it seems only logical to go this route.

    • @tylerclayton6081
      @tylerclayton6081 Рік тому +6

      How many fires started with the 4090 power connectors? Pretty sure it was zero. A lot more AMD cards had vapor chamber issues

    • @oxfordsparky
      @oxfordsparky Рік тому +3

      @@tylerclayton6081 the only melted plugs were caused by user error, installing them incorrectly.
      Mine has had zero issues.

    • @Ruhrpottpatriot
      @Ruhrpottpatriot Рік тому

      @@username8644 No it is a user error. Didn't you watch the GN video on the subject?

    • @TheF4llen77
      @TheF4llen77 Рік тому +5

      ​@@Ruhrpottpatriot the point is that the connector is designed like shit

    • @Ruhrpottpatriot
      @Ruhrpottpatriot Рік тому

      @@username8644 "They designed a power connector is that very finicky, much more than a regular power connector."
      It's literally the same style of power connector just with more pins.
      "You should never be able to melt a power connector because it wasn't fully plugged in"
      Improperly seated power connectors, no matter whether they are inside a PC or not, are one of, if not the, most common causes of residential fires. And that includes stupid NEMA connectors as well as the CEE 7/x plugs.
      The problem with the connector, as the GN video showed, is that the tolerances are very tight and it's easy to not push it in as far as it should go, which can be solved by looking a bit closer and pulling on it to see if the clip has engaged.
      Tight tolerances are not a bad thing per se, in case of electricity you want to have as little wiggle room as possible, especially if high currents are involved.
      If you don't have that you get sparks at the connection point, which increase resistance even further (thus increasing heat) and can lead to the connections even fusing together.
      " Grow some balls, use your brain, and start calling out these scummy companies who keep getting away with this crap."
      I'm all for more responsibility for big tech, but this isn't a problem of a scummy company (remember: Fault rate is less than 0.1%) and rather users not doing their due diligence; ignoring bad cables caused by manufacturing defects and not adhering to the given standards, but that's not NVIDIA's fault.
      Seriously Gamers Nexus did a whole series on that topic.

  • @Enakaji
    @Enakaji Рік тому +78

    Hmm, PowerVR is a name in the GPU space I haven't heard in ages. I actually used to have PowerVR Kyro and Kyro II GPUs way back in the early GeForce T&L days. Back then it was basically the "brute force" hardware T&L approach on GeForce vs the "efficient" deferred renderer on the Kyro cards.

    • @PicturesqueGames
      @PicturesqueGames Рік тому +6

      powervr self-yeeted from desktop space and were doing a metric ton of smartphone gpus

    • @godzil42
      @godzil42 Рік тому

      @@PicturesqueGames No, they were not making their own chips and were relying on partners to do it, and one of the partners (ST) decided to leave the market, leaving VideoLogic (as they were known at the time) in the dust. Fighting against ATI and NVIDIA was not necessarily something they could sustain over time, they were not as big. But the original disappearance of PowerVR from the PC market was not VideoLogic's choice to start with.

    • @indiasuperclean6969
      @indiasuperclean6969 Рік тому +1

      WOW VERY DANGEROUS SIR! !! 😠 😠 BUT THIS WHY IM SO LUCKY LIVE IN SUPER INDIA THE CLEANEST COUNTRY IN THE WORLD 🇮🇳🤗 , WE NEVER SCAM! WE GIVE RESPECT TO ALL WOMEN THEY CAN WALK SAFELY ALONE AT NIGHT AND WE HAVE CLEAN FOOD AND TOILET EVERYWHERE 🇮🇳🤗🚽, I KNOW MANY POOR PEOPLE JEALOUS WITH SUPER RICH INDIA 🤗🇮🇳🤗🇮🇳🤗🇮🇳🤗🇮🇳🤗🇮🇳

    • @chillhour6155
      @chillhour6155 9 місяців тому

      Think I had a cheetah card, well that's what was on the box at least

  • @davidcousins8407
    @davidcousins8407 Рік тому +46

    I had an old PowerVR GPU many, many years ago; they used tile-based rendering and avoided rendering anything not visible. Cheap cards that did well price/performance-wise vs the ATI and 3dfx cards at the time. They had a few compatibility issues with newer versions of Windows, and Nvidia did their usual dirty tricks to help torpedo them.

    • @atomicskull6405
      @atomicskull6405 Рік тому

      PowerVR divides the scene into tiles and renders them in on-die memory.

    • @scheeseman486
      @scheeseman486 Рік тому +4

      @@atomicskull6405 Every modern GPU is tile based now

    • @MasticinaAkicta
      @MasticinaAkicta Рік тому +3

      Ah, occlusion. Didn't Nvidia use that around the GeForce FX line to prop up its rather questionable performance? Once the "oops, you sure you need 24-bit textures?" trick didn't get that much more performance, they put a simpler trick in one of the drivers: knowing the path of the camera in the benchmark, they pre-occluded it. Aka, not using on-chip/in-driver occlusion but recognizing the software and running pre-scripted blacked-out areas.
      Of course a free-flowing camera POV broke that little lie.

    • @atomicskull6405
      @atomicskull6405 Рік тому +1

      @@scheeseman486 So PowerVR was right then.

  • @NatjoOfficial
    @NatjoOfficial Рік тому +337

    TLDR: The card effectively has performance that varies between a GT 1030 and a 1660 but also consumes 250 watts of power (the 1030 consumes 30 watts, I think they said)

    • @mihnealazar7039
      @mihnealazar7039 Рік тому +2

      Thanks

    • @nexusyang4832
      @nexusyang4832 Рік тому +48

      In the grand scheme of it, the fact someone even made something that works is pretty bananas. I sure as hell can't make a gpu.

    • @paxtonlarcher615
      @paxtonlarcher615 Рік тому +85

      @@nexusyang4832 then again, you probably don't have a team of electrical engineers, folks who understand discrete structures very well, and pros in assembly. they do, so I'd expect them to have a good working GPU that didn't consume as much as an RTX 3070 while giving 1030 performance

    • @pauloa.7609
      @pauloa.7609 Рік тому +74

      Imagine using more power than a 1080 Ti and delivering the performance of a 1030. Oof, and all that probably after stealing Nvidia's tech.

    • @chewyslimei289
      @chewyslimei289 Рік тому +12

      yes, maximum power draw for the 1030 is 30 W and for the 1660 it's 120 W
      The 3070 is 220 W

  • @Liaomiao
    @Liaomiao Рік тому +90

    I wouldn't underestimate them, it's a giant step forward for them.
    Manufacturing high-end semiconductors is the single most technically challenging industry and they haven't been in the game for long. The west kinda forced their hand by preventing China from sourcing high-end chips elsewhere, and now I'm worried that may have been too short-sighted

    • @z_nytrom99
      @z_nytrom99 Рік тому +16

      A great leap forward* for them 😆

    • @funbarsolaris2822
      @funbarsolaris2822 Рік тому +1

      ​@@z_nytrom99lol

    • @putinslittlehacker4793
      @putinslittlehacker4793 Рік тому +12

      I mean honestly, if anything the ban on them importing certain chips will only increase their investment in domestic manufacturing. Why would you back down when your sole source of something essential to your economy threatens to be cut off?

    • @Dell-ol6hb
      @Dell-ol6hb Рік тому +7

      @@putinslittlehacker4793 true this was a very dumb move from the perspective of the west, it’s not like China is some small undeveloped nation, they would easily be able to develop industry for making high end computer chips if they wanted to 😂

    • @Dandandandandandandandandanda1
      @Dandandandandandandandandanda1 Рік тому +3

      Eh it's great for us consumers that there's more competition.

  • @pizza_man4329
    @pizza_man4329 Рік тому +436

    I actually really wanted another big gpu competitor considering the gpu market that we have...

    • @smittyvanjagermanjenson182
      @smittyvanjagermanjenson182 Рік тому +26

      Highly unlikely for another decade minimum. Nvidia and AMD have been in the game longer than the rest and know the ins and outs of GPU design. Hence why Intel's cards were laughable when they arrived. Price only got out of hand with the newest cards because up-to-5-year-old cards are still worthwhile for almost everything. New games are finally starting to push the envelope and ruin performance on older cards, and even then you can just drop some settings and get solid performance again.

    • @divinehatred6021
      @divinehatred6021 Рік тому

      They do not make those GPUs to compete with anyone; they are making them so they won't be left without anything if the USA does something horrible again in an attempt to have complete world control.

    • @Rabolisk
      @Rabolisk Рік тому +3

      Agree. But what's holding the market back from having GPUs available is fab capacity for making the GPUs and a lack of board partners willing to make graphics cards because of low profit margins.

    • @divinehatred6021
      @divinehatred6021 Рік тому +4

      @@Rabolisk also, even then, nothing will stop woke game studios from releasing games that wont run well even on 4090, neither on consoles.

    • @archangel7052
      @archangel7052 Рік тому +1

      ​@@smittyvanjagermanjenson182 A decade or two is nothing...

  • @MacgyverFreitas
    @MacgyverFreitas Рік тому +450

    It's a really impressive GPU for only 2 years of development. And due to the western restrictions, the support should be better on Linux, Vulkan, and other open technologies

    • @seppi3201
      @seppi3201 Рік тому +15

      is it even possible? 3060TI in two years?

    • @MacgyverFreitas
      @MacgyverFreitas Рік тому +28

      @@seppi3201 well, with the same power efficiency I doubt it, but in raw performance maybe we'll see it in a couple more years with more mature drivers

    • @pengwin_
      @pengwin_ Рік тому

      corporate espionage will do that

    • @MacgyverFreitas
      @MacgyverFreitas Рік тому +101

      @@brianransom16 Industrial espionage is not restricted to the CCP, Western companies do it whenever they can, AMD for example started with reverse engineering Intel chips. Nowadays, the most practical and legalized way is to hire the competition's main engineers to develop “new technologies”, a strategy that all the big technology companies use all the time. I agree that they wouldn't develop as fast from scratch, but nothing starts from scratch. I think the difference with CCP-affiliated companies is that they are less willing to hide spying as they are less susceptible to lawsuits.

    • @LordCyler
      @LordCyler Рік тому +14

      @@seppi3201 When you dont GAF about IP or copyright law, its 100% possible. Didn't manage it in this case, but its possible.

  • @MenkoDany
    @MenkoDany Рік тому +175

    Most importantly, their GPUs are based on IP from Imagination Technologies, which has been around since the 90s but simply moved into the mobile (phone) space. So it's not so much a new 4th competitor as a resurrection of one from the 90s
    EDIT:
    Oh, Linus talks about it at the end! All is good

    • @damienkram3379
      @damienkram3379 Рік тому +5

      You forgot about Adreno and Mali...
      And VIA Technologies and S3 Graphics are still alive out there somewhere...

    • @madson-web
      @madson-web Рік тому +1

      Don't forget about Dreamcast

    • @FixedFunction
      @FixedFunction Рік тому +4

      @@damienkram3379 S3 Graphics is a skeleton crew within VIA that only works with Zhaoxin in China making iGPUs. VIA hasn't funded a new GPU architecture from S3 since ~2012 and is still shipping the same Chrome 600 series cores (2-4 CUs max, DX11.1, OpenGL 4.1, no OpenCL) for a basic integrated option.

    • @meneldal
      @meneldal Рік тому

      That's, I assume, mostly the GPU compute part. I doubt they also made all the other IP from scratch; I wouldn't be surprised if there was some stuff that was stolen, like ARM stuff that isn't properly attributed (stuff they got from previous projects made in China where they got the IP legally). Not to mention the EDA tools, no way it's all 100% Chinese either.

    • @MenkoDany
      @MenkoDany Рік тому

      @@damienkram3379 Adreno is an anagram of Radeon so you can guess where that came from. Mali is an original design by ARM, but its history goes back to 1998

  • @asirimaduranga8697
    @asirimaduranga8697 10 місяців тому +32

    i hope they succeed and give a proper competition. best wishes to them.

  • @jabinstech
    @jabinstech Рік тому +980

    It's not like they can send a spy balloon to your house with a box that says "give it back"
    edit: wow so much like

    • @rahuld_as
      @rahuld_as Рік тому

      But they can send the virus through the gpu drivers #hidden_backdoor

    • @JohnSmith-vn8dm
      @JohnSmith-vn8dm Рік тому +84

      This is actually why a lot of foreign tech companies are either closing or refusing to expand their offices in China right now. They are worried about hiring local Chinese employees who will then take IP and knowledge and start their own government backed competitors. ASML just sent a delegation of suppliers to countries like India, Vietnam and Indonesia to look for new locations to get local talent and build local factories. There's much less risk for them in these friendly countries with better IP rights.

    • @grumpyoldwizard
      @grumpyoldwizard Рік тому +11

      The card IS the spy...

    • @789know
      @789know Рік тому +7

      ​@@JohnSmith-vn8dm Also it isn't like any of these countries could set up a local competitor even if the people who previously worked there decided to take the knowledge and IP to a supposedly new company

    • @thomasying4990
      @thomasying4990 Рік тому +3

      ​@@JohnSmith-vn8dmso just making a gpu is infringing on nvidia's ip?

  • @5ebliminal
    @5ebliminal Рік тому +76

    moving the VRMs to the top of the card is a simple idea that helps vent the hot air directly out the top instead of stuffing them in the middle of the board where they can heat-soak more easily

    • @hovant6666
      @hovant6666 Рік тому +14

      VRMs are easy to cool, placing them along the top stretches the trace lengths badly and provides asymmetric delivery to the die, leading to worse voltage drop-off - or spikes to compensate

  • @llynellyn
    @llynellyn Рік тому +263

    The fact they have a better power connector setup than the Nvidia 4000 series is comical xD

    • @ABaumstumpf
      @ABaumstumpf Рік тому

      The fact that you have no clue yet are talking big is just - sad.
      Just a hint for the retards that will come crying:
      EPS 8pin is specified by the same spec as the 12VHPWR.

    • @fleurdewin7958
      @fleurdewin7958 Рік тому +14

      Agree. The EPS12V power connector is better than the 12VHPWR. We have been using the EPS connector for decades and never seen them melt or catch fire, despite older Intel HEDT CPUs pulling a crap tonne of power through them.

    • @oxfordsparky
      @oxfordsparky Рік тому +10

      @@fleurdewin7958 the EPS12V is rated way lower, and the 12VHPWR connector isn't an Nvidia creation, it's a standard connector, yet so many idiots keep blaming Nvidia for it.
      The only ones that have failed have been due to improper installation.

    • @ABaumstumpf
      @ABaumstumpf Рік тому +4

      @@oxfordsparky Even better - The EPS was specified by the same company as the ATX standard (from which the 12vhpwr comes).
      But people are too ignorant to learn from their mistakes. The connector never was at fault.

    • @realms4219
      @realms4219 Рік тому +4

      @@oxfordsparky Nvidia still came up with it, and helped make it a standard.

  • @allergictobs9751
    @allergictobs9751 Рік тому +18

    This is actually very very impressive, considering their age in this industry. I am fking blown away.

    • @NotSoFakeTaxi
      @NotSoFakeTaxi 4 місяці тому +1

      But the card is crap for how much power it’s drawing. It’s actually horrible

  • @HKlink
    @HKlink Рік тому +80

    I once had a GT 1030 in my system and upgraded to a 1660 Super. I can confirm that this comparison is about right. The 1030 simultaneously *doesn't exactly run like garbage* and *will bottleneck you hard.*

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 Рік тому +2

      yeah, iirc it was replaced with a GTX 970; everything ran well and smooth, I just never really tested it, but gamed on it a lot - my first Witcher 3 gameplay was also on that PC
      it was a forbidden MSI Tiger, 1 fan not spinning after 2 years, and the MSI board also failed, replaced it with a Gigabyte board

    • @zeroa69
      @zeroa69 Рік тому +2

      1030s are awesome for low-power, high-end MAME stations

    • @joshgts9675
      @joshgts9675 Рік тому

      The 1030 is a card to drop in pre-built systems. Buying one for a custom ATX system is moronic. A used GPU would be a far better value.

    • @Ndkksooejn
      @Ndkksooejn 9 місяців тому

      Yeah, gt 1030s are going for $200 to $250 aud new here.
      Can get a used 980ti for the same price 😂
      ​@@joshgts9675

  • @LithFox
    @LithFox Рік тому +101

    You know, speaking of things like PCIe 5 and stuff, there's one game in particular I have, called iRacing, that supposedly has a lot of bandwidth-related issues as far as sending data to the GPU, because they send the entire world space every frame. I'm curious what impact bandwidth over the bus actually has for GPUs, particularly in a game environment that does such a thing, and I wonder if anyone at the lab would be able to provide insight on that. (A rough bandwidth sketch follows this thread.)

    • @LithFox
      @LithFox Рік тому +15

      I feel like it's an under-discussed aspect of gaming as a whole, because we just assume that it's straight-up GPU power that's needed, but I think a lot of people forget that the CPU still needs to send instructions to the GPU and that data still has to get over to the GPU across the motherboard.

    • @ABaumstumpf
      @ABaumstumpf Рік тому

      So you have a game that is just insanely stupid. It is sad just how bad most games, even the big AAA games, are coded. they literally waste a decade of hardware-improvements just by being retarded programmers.
      Was one of the strange things when DX12 was announced: OpenGL already offered low-level access and nearly nobody used that cause it is a lot of work and easy to get wrong. And with Dx12 and Vulkan the same problems came up. Not only that we have seen that the code it self is yet again mostly bad to the point that the drivers of AMD and Nvidia have to do the heavy lifting by dynamically reshuffling data as the code as written would just result in slideshows if anything.

    • @racerex340
      @racerex340 Рік тому +26

      Except an RTX 4090 can operate at 98% of its performance in a PCIe gen3 x16 slot compared to a PCIe gen4 x16 slot. We're only just now seeing GPUs that are finally outgrowing PCIe gen3. I'm betting it's at least two generations, maybe 3, before a card is knocking on the door of PCIe gen4 x16 limitations.

    • @xthelord1668
      @xthelord1668 Рік тому +3

      @@racerex340 this depends on how heavy PCIe bandwidth usage gets in the future with the likes of DirectStorage
      remember we're starting to see PCIe 3.0 show its age after that many years; 4.0 is 2x the bandwidth of 3.0, which even with DirectStorage could take years to be fully saturated

    • @stefanl5183
      @stefanl5183 Рік тому

      @@racerex340 But LithiumFox point is that's application dependent and there are some applications where more bandwidth is necessary. One such application might be AI and applications that benefit from memory pooling of multiple cards. Nvidia's solution to this has been Nvlink, but a higher bandwidth PCIe connection could be a cheaper alternative solution. This could also perhaps be used to bring back multi-gpu gaming like SLI, without needing special sli connectors between the GPUs. Anyway, there are definitely situations this could benefit.
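    The bus-bandwidth question raised in this thread is easy to put rough numbers on. A minimal sketch, assuming stock PyTorch on an NVIDIA card purely for illustration: time repeated pinned-host-to-GPU copies and compare against the theoretical link rate (roughly 16 GB/s for PCIe 3.0 x16, roughly 32 GB/s for 4.0 x16).

        # Rough host-to-GPU copy bandwidth measurement (illustrative; assumes an
        # NVIDIA card and stock PyTorch, since that is what's easy to test on).
        import time
        import torch

        buf = torch.empty(64 * 1024 * 1024, dtype=torch.float32).pin_memory()  # 256 MiB
        buf.to("cuda")                      # warm-up copy
        torch.cuda.synchronize()
        t0 = time.perf_counter()
        for _ in range(20):
            gpu_copy = buf.to("cuda", non_blocking=True)
        torch.cuda.synchronize()
        dt = time.perf_counter() - t0
        gb_per_s = 20 * buf.numel() * 4 / dt / 1e9
        print(f"Host-to-device: {gb_per_s:.1f} GB/s "
              "(PCIe 3.0 x16 tops out around 16 GB/s, 4.0 x16 around 32 GB/s)")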

  • @K3V0M
    @K3V0M Рік тому +157

    I think we definitely underappreciate how well some of our stuff works together.

    • @reuven2010
      @reuven2010 Рік тому +9

      PCs in general are extremely underappreciated. The tech is absolutely insane.

    • @warymane6969
      @warymane6969 Рік тому +6

      True. This video is nostalgic. I got the same vibe about 10 years ago when China got banned from the International Space Station. The best way to stop China's progress is to ban them and then mock them. It clearly works.

    • @prashantmishra9985
      @prashantmishra9985 Рік тому

      ​@@reuven2010 Wdym?

    • @didyoumissedmegobareatersk2204
      @didyoumissedmegobareatersk2204 Рік тому +1

      ​@@warymane6969 😂😂they made 7nn chip

    • @leosmi1
      @leosmi1 9 місяців тому

      THAT'S CALLED P-R-O-T-O-C-O-L-S

  • @aison2735
    @aison2735 Рік тому +17

    The specifications of this graphics card are very high, similar to the RTX 3060, but for driver optimization reasons it has not fully realized its true performance. In recent months this Chinese graphics card company has been constantly updating and optimizing drivers at a high frequency; the performance of the S80 has greatly improved compared to before, and it can now support most mainstream games. Its GPU usage during operation is currently less than 20%, so its potential is still great. Currently its price is only $163, and distributors' inventory was quickly snapped up. They are preparing to launch the next generation, the S90.

  • @BramVanroy
    @BramVanroy Рік тому +145

    Seems obvious to me that it is productivity- or research-oriented. That's why you see the AV1 support as well as PyTorch and TensorFlow. It also explains the higher memory capacity and Tensor-like cores, typical for deep learning cards. So it would be fairer to run some DL throughput benchmarks on this.

    • @fostena
      @fostena Рік тому +24

      Except all the productivity suites they tested won't start, while some games did. It's a glorified prototype, a marketing stunt. They re-purposed their server GPU for consumer use, and it's still useless. If they're able to iterate maybe they will succeed, but it depends on how much money the Chinese Government are willing to spend on Moore Threads

    • @guzilayerken5013
      @guzilayerken5013 Рік тому +19

      @@fostena They issued the card to prove to investors that they could deliver a real product. Second, many people in the West do not understand that not every random enterprise can be funded by the government. Moore Threads started with private capital, and they need to convince investors to invest more.

    • @fostena
      @fostena Рік тому +11

      @@guzilayerken5013 you don't need to explain publicly funded enterprises to me! I live in Italy, we were almost a "socialist" country once, by USA standards 😄! We got plenty of government intervention in the economy. That's what I said, by the way: the card is a marketing stunt, it's a proof, a prototype

    • @Oktokolo
      @Oktokolo Рік тому +5

      @@fostena Maybe it runs well under Linux. China will probably not optimize a product for an OS they won't be able to legally update next year.

    • @haijiazhu3148
      @haijiazhu3148 Рік тому

      @@fostena Not really. This card is not designed for general use, such as gaming. It was designed for machine learning.

  • @Hulkeq2
    @Hulkeq2 Рік тому +14

    Even if this is absolute crap, it's backed by (truly murky) capital and it does something. Like you said, "I wouldn't be able to do it", and on a pertinent level this is pretty much on point. We need more players. We need competition coming in from other directions. Give it time and in a few years we might get something from them at Arc level, and a few years after that something that can compete. Voodoo wasn't supplanted by the NVIDIA Riva TNT in one generation. These things need time and a steady flow of cash. I'm not saying a world where the fastest GPUs are China-made is a better world to live in, I'm saying competition will bring prices down between competitors.

    • @pauloa.7609
      @pauloa.7609 Рік тому

      These Chinese companies will never be able to sell their products in the west; all that stolen tech would make it impossible unless they are prepared to fight legal wars until the world ends. So you, me and everyone else in the west don't win anything from this.

    • @Hulkeq2
      @Hulkeq2 Рік тому

      @@pauloa.7609 Predictions based on wants or shoulds don't mean much. The world is changing; a family being able to sit on their laurels for generations because an ancestor thought of putting a piece of plastic on the end of a shoelace is, when you take a step back, just as ridiculous as Chinese-made video cards.
      There is never a patent on a result, just on a means.
      We will have to see.

  • @apathyzen9730
    @apathyzen9730 Рік тому +23

    9:21 I like how the supposedly "China-only" card has FCC and CE logos.

    • @venosaur121212
      @venosaur121212 Рік тому +5

      That is probably the Chinese CE logo.
      It looks almost the same.

    • @fotografotimido
      @fotografotimido Рік тому

      @@venosaur121212 It has both the Conformité Européenne and the FCC logos (or at least the spacing in the CE logo indicates it is the Conformité Européenne logo)

    • @llih0074
      @llih0074 Рік тому +1

      Not surprisingly, most of the motherboards like Asus, Gigabyte, MSI, etc., are manufactured in China and printed with FCC and CE Logos, even some models that are only sold in China.

  • @jackie2-g8l
    @jackie2-g8l Рік тому +273

    They made this in only 2 years? Wtf, that's actually impressive; in such a really short period of time they made a working GPU. If given more time to produce GPUs, then I guess they have a higher chance of being compatible with more hardware

    • @seelevollerei-mc5fo
      @seelevollerei-mc5fo Рік тому

      If China succeeds, Nvidia and AMD will cut prices significantly. That's great

    • @BETAsin
      @BETAsin Рік тому +37

      Hope so for the consumer market. Nvidia's prices are completely out of whack with what consumers are willing to pay, AMD is no better, and the stock is always out anyway. It's time an actual disruptor came onto the market to break the duopoly.

    • @aboomination897
      @aboomination897 Рік тому

      Maybe the compatibility issues (especially with certain game titles) are a feature, knowing a bit about how much China's GOV loves to censor.

    • @CrythornMadness
      @CrythornMadness Рік тому

      The 2 years isn't that impressive when you remember most products home-grown in China are from stolen tech lol. They don't innovate, they steal and make rip-offs. The ultimate Chinese knock-off!

    • @lumpython5351
      @lumpython5351 Рік тому +23

      Lol, they bought technology from the UK; Imagination was the company that provided the graphics technology for the Sega Dreamcast. Nothing marvellous.

  • @lemidnite
    @lemidnite Рік тому +7

    Whoever edited the intro is based and an absolute mad lad, let me shake his hand😭😭

    • @TheSkytherMod
      @TheSkytherMod Рік тому

      Exactly! Surprised more people didn't notice

  • @T90MT
    @T90MT Рік тому +18

    Hearing Linus say "Thanks pal ❤" was so heartwarming.

  • @alexlun4464
    @alexlun4464 Рік тому +179

    The GPU itself isn't scary.
    The scary part is that China was able to put that GPU together without having an international supply chain like Nvidia or AMD does to make their GPUs.
    What's more, this iteration is close to the 3060 Ti, which you might say "oh but that's mid range", but you would be wrong, because for a lot of people a 3060 Ti is a high-end GPU.

    • @MadChrisp
      @MadChrisp Рік тому

      Not scary once you realize it's all stolen ip... As usual

    • @Talesofaweedsmoker
      @Talesofaweedsmoker Рік тому

      What? China is more than capable of manufacturing GPUs like this; they already make 75 percent of the world's products. Idk why people assume China is a third-world country; they are not, they are a highly developed, industrialized country. The USA knows China is the only one that can rival US tech, which is why the USA is in an economic war with China; we've stopped selling them chips and it doesn't matter.

    • @nelsonk1341
      @nelsonk1341 Рік тому +21

      What do u mean by scary💀 it's a freaking gpu bro not a next gen deadly weapon

    • @alexlun4464
      @alexlun4464 Рік тому +63

      It's scary because a long supply chain is currently needed to make computer chips. A collaborative effort between Taiwan, Japan, Korea and the US. These are first-world nations that need each other to make these chips.
      Now, because of tensions between China and the US, China basically said they were going to make their own chips with blackjack and hookers, and they actually did. They are on their way to beating those other nations at chip manufacturing.
      I don't mean it as in "ohh scary, China bad". But the fact that they actually pulled it off is insane. China is already becoming this century's new superpower.

    • @Spyrit2011
      @Spyrit2011 Рік тому +13

      Good for China, I have purchased Blackview phones from China, love them.

  • @09gdt
    @09gdt Рік тому +17

    Hearing Linus say " let's try DP" that'll scare anyone for life 😂

  • @Vanderfate
    @Vanderfate Рік тому +19

    Heard pretty much the same thing 15 years ago about Chinese cell phones.
    Looking forward to the next 15 years.

    • @mobiusflammel9372
      @mobiusflammel9372 Рік тому +1

      I think the biggest difference there is there wasn't a burgeoning economic/tech war brewing at the time. Who knows how this will pan out; I'm only saying there's a bit of a different context here.

    • @Vanderfate
      @Vanderfate Рік тому +2

      @@mobiusflammel9372 hmmmmm, how about ISS

  • @seljd
    @seljd Рік тому +52

    82 Hz refresh might be just half of the 165 Hz that many monitors use

    • @ОлегЖданов-ъ1д
      @ОлегЖданов-ъ1д Рік тому +1

      This is obvious, but the whole video is built on making fun of gpu problems.

    • @nobrakes7892
      @nobrakes7892 Рік тому +3

      @@ОлегЖданов-ъ1д but china bad , hahaha

  • @raghardeishi972
    @raghardeishi972 Рік тому +38

    I like how they managed to get AWESOME cooling, next-to-zero-noise operation and a low price. Of course, there are hard restrictions on motherboards, and it won't let you launch games after warning they are not tested and supported, even though perhaps they would work...
    I actually kinda wonder why both Intel and China didn't concentrate on Vulkan support. Force games to support Vulkan as the main API instead of DX12 and everyone would be happy, with the exception of developers.

    • @keigansabo9330
      @keigansabo9330 Рік тому

      dx12 is alot better than vulkan lol

    • @poor_youtuber1390
      @poor_youtuber1390 Рік тому +2

      @@keigansabo9330 how?

    • @zhanucong4614
      @zhanucong4614 Рік тому

      @@keigansabo9330 as a Nintendo pirate I disagree, with an integrated GPU I can run Scarlet

  • @maciejszymanski2502
    @maciejszymanski2502 Рік тому +27

    if it's AI-oriented you could test it by running a basic training run on it with a ready-made baseline: put it through 10-20 epochs, see how long it takes, and compare against CPU performance and some entry-level GPU like the 1660 in this video (a rough sketch of that follows this thread)

    • @wsippel
      @wsippel Рік тому +3

      I was kinda interested, so I looked around - couldn't find anything. According to some people who played around with it, it's quite difficult to get the Linux software for those cards (UnixCloud has some pretty old drivers for previous MTT GPUs on their website, but not for the S80/ S3000), and I'm not sure anyone managed to get access to any Torch builds for MTT GPUs. And no code has been submitted upstream at all as far as I can tell.

    • @qiyuxuan9437
      @qiyuxuan9437 Рік тому

      Probably need to do it in linux tho. This card is not mainly built for windows.

    • @fcukcensorship783
      @fcukcensorship783 Рік тому

      How? Does torch, tf or any other ml API support these cards?

    • @sophiophile
      @sophiophile Рік тому

      ​@@fcukcensorship783 the screenshot from their website they showed said it supports pytorch.
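    A minimal sketch of the epoch-timing comparison described at the top of this thread, assuming stock PyTorch and a tiny model on random data; the "cuda" device string is again just a placeholder, since an MTT-specific PyTorch build would expose its own device name.

        # Tiny training loop timed per epoch, CPU vs. accelerator (illustrative).
        import time
        import torch
        import torch.nn as nn

        def seconds_per_epoch(device, epochs=10):
            x = torch.randn(8192, 256, device=device)
            y = torch.randint(0, 10, (8192,), device=device)
            model = nn.Sequential(nn.Linear(256, 512), nn.ReLU(),
                                  nn.Linear(512, 10)).to(device)
            opt = torch.optim.SGD(model.parameters(), lr=0.01)
            loss_fn = nn.CrossEntropyLoss()
            t0 = time.perf_counter()
            for _ in range(epochs):
                opt.zero_grad()
                loss_fn(model(x), y).backward()
                opt.step()
            if device != "cpu":
                torch.cuda.synchronize()
            return (time.perf_counter() - t0) / epochs

        print("CPU  s/epoch:", round(seconds_per_epoch("cpu"), 3))
        if torch.cuda.is_available():
            print("GPU  s/epoch:", round(seconds_per_epoch("cuda"), 3))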

  • @SaperPl1
    @SaperPl1 Рік тому +9

    That power connector at the back side of the card should come back to all GPUs, as in the PCIe spec reference design...

  • @taiwanluthiers
    @taiwanluthiers Рік тому +56

    It's likely this GPU isn't designed for gaming, but for other applications - like trying to play games with a Quadro instead of a GeForce GPU. It's likely designed with computational power in mind (AI, cryptocurrency/blockchain, professional-level 3D animation, etc.) and gaming is likely an afterthought.

    • @JustSomeDinosaurPerson
      @JustSomeDinosaurPerson Рік тому +10

      Except Quadros still deliver stellar gaming performance, because they have rock-solid driver support.

    • @jyrolys6
      @jyrolys6 Рік тому +4

      It's a lot more likely people underestimate the amount of work that went into driver support for games over the years. For a new player to try their hand at it, this is impressive, but obviously not a good product for actual use now.

    • @hakureicirno6059
      @hakureicirno6059 Рік тому

      Feels like some kind of rasterization module slapped onto a GPGPU/floating-point accelerator.

    • @januszkurahenowski2860
      @januszkurahenowski2860 Рік тому

      As they said, it's a repurposed server GPU, but this version is supposed to be for gaming, and they didn't do a great job of it since it can only run a handful of games at the performance level of a 1030

  • @ak47zaq
    @ak47zaq Рік тому +306

    The way I see it, China does have the market to support a new GPU company starting from scratch; they might be able to come up with a brand-new tech tree that performs better and costs less at the same time, just like a lot of other things made in China. I think overall it's a good thing. Judging by how Nvidia is doing right now, I'd say the more competitors the better for us gamers. Just give them a couple of years and we will see.

    • @DerpSenpai
      @DerpSenpai Рік тому +18

      Also, just to add a bit: they are licensing the same GPU IP that ImgTech sells to phone makers. They simply don't have the drivers done at all. The GPU is most likely stronger than what it actually shows but can't use the performance due to bad drivers. Moore Threads did the hardware and worried about software later. However, we shouldn't expect more than a 1660 in gaming even when it's fully optimized.

    • @thinkingcashew6
      @thinkingcashew6 Рік тому +8

      The reality is that this is a CCP company that is going to make chips for MIL/AI use in the long term. While they would probably like a commercially viable product it is unlikely they will be able to do that any time soon.

    • @ShawnX1995
      @ShawnX1995 Рік тому +38

      I am glad Chinese tech companies are making GPUs for the gaming market. This will change the balance built by AMD and NVIDIA. I believe both of them will release more competitive products under pressure from outside.

    • @ectomyology
      @ectomyology Рік тому

      @@thinkingcashew6 do you have the idea that every single company in China is controlled by the government? Well, guess what. You are wrong.😢😢😢

    • @candle86
      @candle86 Рік тому

      I'll never run one, same reason I won't own Lenovo: they are spy tools of the CCP. If you want to claim there's no way China would slip code into the firmware to spy on Westerners, I have a bridge between Tokyo and New York for sale - are you interested in buying it?

  • @lucasljs1545
    @lucasljs1545 Рік тому +47

    If it is their first product, that's actually nice. It could get better really fast as they gain experience.

    • @zaxwashere
      @zaxwashere Рік тому +2

      just limited by their fabrication capabilities.
      Not having 7nm or below is going to be a HARD limit.

    • @didyoumissedmegobareatersk2204
      @didyoumissedmegobareatersk2204 Рік тому +4

      ​@@zaxwashere They do have a 7nm chip though

    • @zaxwashere
      @zaxwashere Рік тому +1

      @@didyoumissedmegobareatersk2204 oh neat. I didn't know that. Kinda surprising that they did it without EUV, but we'll see if it hits any sort of production.
      I correct my post
      Gonna be tough without ~~7nm~~ uhhh
      5nm

    • @Dunewarrior00
      @Dunewarrior00 Рік тому +2

      @@zaxwashere Their 7nm was found on a mining chip. This tells us it's probably not a mature 7nm node (meaning yields are low and the actual performance isn't there). However, the fact that it exists at all is a big deal, because it's like getting your foot in the door to greater things.

  • @peterkovacs184
    @peterkovacs184 Рік тому +91

    Nvidia has had the H100 out with PCIe 5.0 for a while, but so far it seems unnecessary for the rest of their GPUs, so it hasn't been worth it for them to add support.

    • @Alucard-gt1zf
      @Alucard-gt1zf Рік тому +8

      I don't think even the 4090 fully utilises PCIe 4.0, much less 5.0

    • @h_alkhalaf
      @h_alkhalaf Рік тому +2

      H100 is not a graphics card, it doesn't even have display outputs.

    • @iawindowss4061
      @iawindowss4061 Рік тому +7

      @@h_alkhalaf Yea it's an ai accelerator card

    • @PhantomMattcraft
      @PhantomMattcraft Рік тому +5

      @@h_alkhalaf It's a graphics card, mate - a GPU in a card format. There are many, many GPUs that exist without display outputs. Look at any "mining" GPUs, which are literally 100% complete replicas of consumer GPUs with absolutely zero differences apart from the lack of display outputs and different drivers/firmware. Nvidia literally calls the H100 a GPU in the full product name, "NVIDIA H100 Tensor Core GPU". I cannot stress this enough: just because a GPU has no display outputs does not mean it isn't a GPU.

    • @wingcommanderbob8268
      @wingcommanderbob8268 Рік тому +3

      @@PhantomMattcraft It doesn't support DirectX, OpenGL, or Vulkan in any form, and is therefore incapable of actually rendering graphics

  • @LunaWuna
    @LunaWuna Рік тому +11

    It makes ARC look polished...
    Waiting for the 1 month of moore threads challenge :)

    • @marioalexanderski9598
      @marioalexanderski9598 Рік тому

      ARC has recently become much more polished after Intel fixed their terrible drivers.

  • @MrTeddy12397
    @MrTeddy12397 Рік тому +353

    -999999999999999999 social credit

    • @daliag_439
      @daliag_439 7 місяців тому +13

      Oh no!

    • @therealfakenews2274
      @therealfakenews2274 6 місяців тому +6

      How many canadian social credits is that?

    • @purplebeast8536
      @purplebeast8536 6 місяців тому +1

      @therealfakenews2274 chinese bot replying to 1 year old comments

    • @therealfakenews2274
      @therealfakenews2274 6 місяців тому +3

      @@purplebeast8536canadian thought police?

    • @daliag_439
      @daliag_439 6 місяців тому +3

      @purplebeast8536 what?

  • @Mantis4
    @Mantis4 Рік тому +58

    I would be interested in a revisit to see if and how the drivers have improved or not

    • @ivermektin6874
      @ivermektin6874 Рік тому

      It's a mobile GPU replicated badly to get to desktop tier performance. It will take a while but eventually it should improve.

    • @張元隆-z9i
      @張元隆-z9i Рік тому +1

      It does add support for more games every week, but it will still take a long time to compete with even a 1660 Ti. Right now, in some games it performs similarly to a 1660 Ti, especially in older or specifically chosen titles, but generally speaking it is still not even close, which is pretty sad. The only reason to buy this card is to support a third-party company competing with the larger players; other than that, just go with AMD if you only play games.

  • @ElAnikilador001
    @ElAnikilador001 Рік тому +18

    That part at 0:30 had me dying

  • @mfcfbro
    @mfcfbro Рік тому +97

    As meh as this is, I'm still really excited that someone else is stepping in. They clearly have good engineers, and the software will come with time. I mean, it took AMD 10 years to make good video drivers. Any competition is good for the market.

    • @kevin_mx
      @kevin_mx Рік тому +1

      TBH their drivers still kinda suck even to this day xD

    • @紫阳花开
      @紫阳花开 Рік тому

      Forget about competition; whether this company can even survive is an open question. It's manufactured on TSMC 7nm, which means that even if it survives, since it can't manufacture the chips itself, it will end up just like Huawei.

    • @monkev1199
      @monkev1199 Рік тому

      At least on Linux amdgpu and the open source userspace drivers have been rock solid in my experience. Windows sadly gets shafted again

    • @randomvg00
      @randomvg00 Рік тому

      Idk, they don't seem interested in making GPUs for anywhere except China, and the plan is probably to make everything themselves and just shut out all Western-made hardware. And even compared to Intel Arc at launch, Arc still had better performance, stability, and support than this card, so I'm more excited to see where Intel is going with their GPUs.

  • @eytbits
    @eytbits Рік тому +25

    Hope they improve soon and compete toe to toe with the big 3. Definitely a win for consumers.

    • @ctrash
      @ctrash Рік тому +10

      @@jack99889988 Most likely. America is terrified of competition.

    • @keyrifnoway
      @keyrifnoway Рік тому

      ​@@ctrash the "competition" is a gtx 1030 running at 255 watts

    • @eytbits
      @eytbits Рік тому +3

      It's their first consumer-grade release. Really rooting for this company to catch up. They don't have to release it in the US; there's the rest of the world, you know.

    • @maysartas5581
      @maysartas5581 Рік тому +8

      @@keyrifnoway People said the same thing about Chinese phones. Now they sell hundreds of millions of phones every year, even without the US market. (Same for Chinese electric cars - they've started to gain market share in Southeast Asia, Europe, and the Middle East.)
      China has the money to burn to play catch-up, unlike most countries. They'll just keep throwing money at it until it becomes semi-competitive. In a few years it'll sell like hotcakes in Asia, then a few years more, Europe.

    • @maysartas5581
      @maysartas5581 Рік тому +8

      @@jack99889988 still definitely a win for the other 97% of the world population

  • @castorwong9096
    @castorwong9096 Рік тому +3

    As a Chinese person, I firmly doubt the claim at 1:39. We only know that it's the first Moore Threads gaming GPU, completely domestic, and of course only the geeks would buy it to run benchmarks. Nvidia was ordered to limit the export of cutting-edge GPUs to China, and the government established a project to fund domestic chip companies that develop GPU and CPU chips. Moore Threads is just one of those companies.

  • @ZDY66666
    @ZDY66666 Рік тому +67

    Gotta give credit where it's due. The naming of Chinese companies especially the English names are fire...I mean "Moore Threads"? C'mon....that's genius 🤣

    • @macicoinc9363
      @macicoinc9363 Рік тому +3

      Gotta give it to them, the English namings are always something. Like their recycled lithium batteries that were branded fire explosion or some shit lol.

    • @fenix2k1
      @fenix2k1 Рік тому +7

      Having spent plenty of time in South East Asia a good English business name is more of a happy "infinite number of monkeys with typewriters" coincidence than inspired genius.

  • @harrydang9
    @harrydang9 Рік тому +11

    I'm disappointed you didn't have the lab benchmark this card against the GTX 1660, RTX 3060, and GT 1030. I think I would have preferred that over the unscripted gameplay tests (one way to pull comparable numbers out of frame-time captures is sketched after this thread)

    • @altrag
      @altrag Рік тому +1

      I'm guessing none of the benchmarking tools actually functioned, given how poor this card's support seems to be.

    • @那人冷靜一點
      @那人冷靜一點 Рік тому

      Why waste time
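
For what it's worth, even without a full benchmark suite you can get comparable numbers across cards from a frame-time capture. A minimal sketch below, assuming a PresentMon-style CSV with an `MsBetweenPresents` column; the capture tool, column name, and `summarize` helper are my assumptions, not anything from the video.

```python
# Summarize a frame-time capture into average FPS and 1% lows so different
# cards can be compared. Assumes a PresentMon-style CSV with a column named
# "MsBetweenPresents" holding milliseconds between consecutive frames.
import csv
import sys

def summarize(path: str) -> tuple[float, float]:
    with open(path, newline="") as f:
        frame_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    frame_ms.sort()                                    # slowest frames end up at the back
    avg_fps = 1000.0 / (sum(frame_ms) / len(frame_ms))
    worst = frame_ms[int(len(frame_ms) * 0.99):] or frame_ms[-1:]
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))  # FPS over the slowest 1% of frames
    return avg_fps, low_1pct_fps

if __name__ == "__main__":
    avg, lows = summarize(sys.argv[1])
    print(f"avg: {avg:.1f} fps, 1% low: {lows:.1f} fps")
```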

  • @Wulfryk
    @Wulfryk Рік тому +11

    You should keep in mind that League of Legends runs mostly off your CPU, not your GPU. They kept its requirements extremely low so it works on pretty much everything.

    • @tristanstebbens1358
      @tristanstebbens1358 Рік тому +1

      Yeah, if you're running on the lowest graphical settings. Mainly because all the number crunching and client-side processing is CPU-based (pretty standard). But actually increasing the graphics settings of course increases GPU usage by a wide margin. Essentially, a poor GPU can run League on the lowest settings; a moderate GPU is needed to run League at higher settings, let alone 4K on top of that.

    • @g2fiora
      @g2fiora Рік тому

      @@tristanstebbens1358 Yeah, but it still maxes out at ~25% usage on my 6700 XT, so anything above a ~1060 should run the game exactly the same (within margin of error)

  • @jaydenli4758
    @jaydenli4758 Рік тому +17

    In fact, we Chinese hardware fans think two years is a really short time to build a GPU. It can work! We can't wait to see more products. My friend bought a Moore Threads GPU to play games, although it's not very stable. (Moore Threads didn't really sell this GPU to the broader market, because they know it's a dev version. Brand reputation matters very much.)

  • @RoamGaming
    @RoamGaming Рік тому +12

    nice to know that even in a video sponsored by Xsplit you chose to use OBS.

  • @ZironZ
    @ZironZ Рік тому +16

    Honestly if they just focus a ton on drivers this seems like it could be promising. They have already proven they can get decent performance in some games.

  • @ClassicRockRadioEU
    @ClassicRockRadioEU Рік тому +14

    Interesting and reminds me of the nineties when mobos and video cards were released with drivers that had not 'matured' (not forgetting games that ran like snot until they were patched numerous times)

  • @josephbornman8462
    @josephbornman8462 Рік тому +1

    Really interesting to get the detailed take on how things actually get built in the real world in the last 4 minutes

  • @nujuat
    @nujuat Рік тому +40

    Yeah, this seems like a rebadged general-purpose GPU (probably for AI and maybe crypto). All you need is one person to make it compatible with whatever GPGPU application and then it becomes super useful. Given the high VRAM-to-core-count ratio, I'm guessing loading big datasets for AI purposes is the priority (some rough numbers are sketched after this thread)

    • @nolanlewis538
      @nolanlewis538 Рік тому

      Or they are just trying to scam their way into some sweet govt funding. Lot of chinese companies have done that in the past. With one trillion $ there is enough for everyone to share.

    • @Sparks95
      @Sparks95 Рік тому

      a gpu is as useful as you want it to be.....

    • @Azzysdesignworks
      @Azzysdesignworks Рік тому +3

      Copied chinese tech? Noooooo.... lol

    • @zhanucong4614
      @zhanucong4614 Рік тому

      @@Sparks95 Not true in so many ways: Nvidia is perfect for Blender, AMD for Linux (Nvidia runs poorly on Linux)

    • @Sparks95
      @Sparks95 Рік тому

      @@zhanucong4614 interesting, what are homegrown Chinese gpus suited for?
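
Some rough numbers behind that guess: a back-of-the-envelope sketch of training memory use. The model and batch sizes below are illustrative assumptions on my part; the only figure taken from the card itself is its advertised 16GB of VRAM.

```python
# Back-of-the-envelope VRAM estimate for fp32 training: weights + gradients
# + Adam optimizer state, plus one batch of saved activations. All sizes
# here are illustrative assumptions, not anything from the card's spec sheet.
def training_vram_gib(params: int, batch: int, acts_per_sample: int) -> float:
    bytes_total = (
        params * 4                      # fp32 weights
        + params * 4                    # gradients
        + params * 8                    # Adam first/second moment buffers
        + batch * acts_per_sample * 4   # activations kept for the backward pass
    )
    return bytes_total / 2**30

if __name__ == "__main__":
    # e.g. a ~350M-parameter model, 64-sample batches, ~5M activation values
    # per sample -> roughly 6-7 GiB, leaving headroom inside 16GB of VRAM
    print(f"{training_vram_gib(350_000_000, 64, 5_000_000):.1f} GiB")
```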

  • @Guy-jb9qf
    @Guy-jb9qf Рік тому +21

    I don't care if this gpu only exists because of the US/China tech cold war, having another competitor in the market is good for consumers.

    • @st.altair4936
      @st.altair4936 10 місяців тому +1

      The more the US sanctions China, the more innovative and self-sufficient they get lol.

  • @pawelkapica5363
    @pawelkapica5363 Рік тому +5

    We actually need this card. A cheap GPU out of China that threatens the GPU cartel's market share is exactly what we need. If they fix their drivers it could be viable for 1080p, as long as they keep the price down.

    • @edward0116
      @edward0116 Рік тому

      The US's never-ending ban game won't let it happen. Your blood is for them to suck, not for others.

  • @razvidanish1826
    @razvidanish1826 Рік тому +3

    Yooo, I watch Linus's videos just from seeing his face on the thumbnail 😂 - how old the video is doesn't matter 😅. It's super informative and entertaining ❤

  • @danevileye
    @danevileye Рік тому +4

    12:18
    "Yeah, let's try DP" 🤣🤣🤣
    Sebastian, Linus - Mar/11/2023

  • @TheUberRed
    @TheUberRed Рік тому +7

    09:39 True, and Based

  • @intetx
    @intetx Рік тому +7

    I would really like to see more content with that card

  • @a1ter120
    @a1ter120 7 місяців тому

    honestly knowing that the boys at home are making progress is enough to make me proud no matter how bad it is

  • @DawnKnight1993
    @DawnKnight1993 Рік тому +4

    Just bought a PC from Build Redux yesterday. Glad to see they are still a sponsor, and even more excited to try my new PC out!

  • @CtIyst
    @CtIyst Рік тому +8

    Just got my first pc and got all the parts from your $1000 budget pc. thanks man!

    • @kylecruz7040
      @kylecruz7040 Рік тому +1

      It's insane how 1000 dollars is considered a budget system nowadays..

    • @MegaEmmanuel09
      @MegaEmmanuel09 Рік тому

      @Kyle Cruz I was gonna scroll past until I saw a reply. Budget doesn't mean cheap or inexpensive; it's just the amount of money you plan to use. You can have a $550 budget for a PC, or one that's $2,550, and that's still *a* budget.

  • @metroplex29
    @metroplex29 Рік тому +4

    That gpu shroud design looks clean AF

  • @yanxuan4101
    @yanxuan4101 Рік тому +12

    Seems it has some driver issues, since based on the specifications it shouldn't have such low performance. And considering its very small support list, they are probably focused on just making games run at all rather than maximizing performance.

  • @MWDoom
    @MWDoom Рік тому +5

    China having a domestic GPU with limited SKUs is a boon for gaming in the West. It'll give developers a singular target to optimize for.

  • @dil6969
    @dil6969 Рік тому +4

    14:05 - TF2 is the GOAT for me. While this would NEVER happen, if Valve ever released a TF3 and it was good, I would be over the moon. I don't think I've had more fun in any other FPS, other than the OG Halo trilogy. Overwatch feels like the spiritual successor but it never got me hooked like TF2 did.

  • @dippst
    @dippst Рік тому +17

    I wouldn't be surprised in the least if the "gaming" GPUs were QC rejects from the enterprise production line.

  • @hanyangxu3139
    @hanyangxu3139 Рік тому +14

    It is pretty much common knowledge among Chinese industry insiders that they licensed the GPU's soft IP from Imagination Technologies in the UK - the same company that licensed GPU soft IP architectures to Apple way back. But again, there is so much more to getting a GPU out besides having the RTL.
    Edit: Also, given that they bought Imagination's IP, I don't think they actually stole any patents or IP...

    • @Hayan_Yeou
      @Hayan_Yeou Рік тому +2

      And yet we regularly find Chinese tech spies here in Korea. China is investing huge money in stealing things, and it's not just the CCP, and not just tech.

    • @xiaoteam575
      @xiaoteam575 Рік тому +5

      Kimchi is going to become Korean now 😂😂

    • @qixun1127
      @qixun1127 Рік тому

      @@xiaoteam575 Isn't it??

    • @南霁云-w6u
      @南霁云-w6u Рік тому +5

      @@qixun1127 Is it really? Napa cabbage is a plant native to China; Korea doesn't have it at all and has to import large amounts of cabbage from Shandong every year, and now you're telling me kimchi is Korean?

    • @qixun1127
      @qixun1127 Рік тому

      @@南霁云-w6u Shandong makes kimchi?

  • @tytv6920
    @tytv6920 Рік тому +15

    I myself own a GT 1030, and for the price it's actually really good. It definitely does more than I ever thought it would

    • @Tonyx.yt.
      @Tonyx.yt. Рік тому +4

      No, it's not, because its performance per $ is much worse than mid-range GPUs

    • @belgarath6508
      @belgarath6508 Рік тому +3

      ​@@Tonyx.yt. The issue is that if you can't afford a mid-range GPU / don't wanna spend a few hundred bucks, the FPS/$ value is irrelevant.

    • @temp50
      @temp50 Рік тому

      @@Tonyx.yt. "performance per $" doesn't matter if you have limited amount of money. In this case you have a "performance per $" sublist to choose a card from.

  • @Glubokij
    @Glubokij Рік тому +6

    Indeed, this is a threat and a good competitor for AMD and Nvidia. The same thing happened with phones, when Chinese phones took a huge part of the EU and US market. The US government acted fast and banned Huawei before it could take the market completely.
    With that huge funding, new GPUs will pop up in 2-3 years easily. Also, the TSMC factories, which are the only ones in the world capable of 4nm lithography, are under Chinese influence.

    • @son_guhun
      @son_guhun Рік тому

      TSMC is not under Chinese influence unless they decide to invade Taiwan, lol
      Also, Chinese phones were never really that popular in the US. They really prefer to spend their money on needlessly expensive iPhones instead.

    • @Shatterfury1871
      @Shatterfury1871 Рік тому +1

      Chinese lithography? Puhahaha!!!!😂😂😂😂

  • @yankeebotanist4699
    @yankeebotanist4699 Рік тому +4

    The coughing reference was just so subtle and on point, well done Linus 😂

  • @toddkrueger1125
    @toddkrueger1125 Рік тому +4

    I think it would only be fair to run this card on the fully Chinese P3 processor with a Chinese board, or the highest-end one available from China, and then see what we get.

  • @owlstead
    @owlstead Рік тому +11

    With cards like the Intel Arc A380, I can see people buying it to get enough monitor outputs, 3D acceleration (CS:GO will play at a decent framerate), encoding options and, hell, maybe some use of the AI capabilities. It will also let your CPU run more smoothly, since an APU uses system RAM for video. Some people (like me) don't care about high-end gaming at all and find these kinds of GPUs perfectly fine. I don't have any reason to upgrade my RX 580 anyway, which I managed to buy when it was still considered a "medium-priced" GPU at around €210, and it has fine performance and fan stop for my silent PC.

    • @rickyray2794
      @rickyray2794 Рік тому

      RX580, lol.

    • @craftmine5889
      @craftmine5889 Рік тому

      You're really brave. Moore Threads made this card to show everyone watching that they're actually doing the work.
      Also, if you regularly report bugs, the GPU's compatibility will keep getting better.

    • @paulie-g
      @paulie-g Рік тому +2

      @@rickyray2794 I had an RX580 up to a few months ago and it ran 3440x1440 on my coding workstation flawlessly. Didn't have any issues when playing with CAD and a few other 3d things either. So no, not 'lol'.

    • @umr4h138
      @umr4h138 Рік тому

      ​@@rickyray2794 you know nothing about GPUs. The rx580 is a solid card

  • @BanjoGate
    @BanjoGate Рік тому +12

    Got to love Adam and his love for TF2! Really want to see him cover it more on testing

    • @Thornskade
      @Thornskade Рік тому

      And it's actually a rather interesting game to test because of how CPU-bound it is, and how much clock speed it needs for achieving competitive framerates

  • @knockedgoose4206
    @knockedgoose4206 Рік тому +31

    It's crazy how much less Linus waves his hands around when he's carrying an ultra rare component.

    • @AB-80X
      @AB-80X 9 місяців тому

      Who cares? It's garbage.

  • @technician4you182
    @technician4you182 Рік тому

    I recently bought a desktop with a display port I didn't recognise. This video taught me that it is DisplayPort 1.4a. Thanks Linus

  • @gaganvs4090
    @gaganvs4090 Рік тому +5

    0:32 is that a COVID reference?

  • @gq-ym6rd
    @gq-ym6rd Рік тому +18

    Moore Threads released a new MTT S70 graphics card on May 31st: 3584 MUSA cores, 7GB of graphics memory.
    They are expected to release a new driver supporting DX11 at the end of June, supporting games such as Genshin Impact and Dark Souls 3

  • @coryplum5375
    @coryplum5375 Рік тому +4

    Developing a GPU is not only about hardware; software is important too. It's easier to get more TFLOPS than to build suitable driver software and game-engine support.

  • @harmonylivingston3460
    @harmonylivingston3460 Рік тому +3

    Cue the Chinese bots calling every minor issue fake and western propaganda instead of just being satisfied they made a GPU in 2 years at all.

  • @MetalMan1245
    @MetalMan1245 Рік тому +217

    This just makes me feel more validated for buying an Intel card.

    • @1IGG
      @1IGG Рік тому +11

      How is your experience with the card?

    • @00sean00
      @00sean00 Рік тому +59

      Nice try Mr Intel bot, you're not fooling us.

    • @darwinjackson3560
      @darwinjackson3560 Рік тому +7

      @@1IGG the card is getting better with the driver updates

    • @FNLNFNLN
      @FNLNFNLN Рік тому +16

      @The Big Sad Given the absolutely brutal expense of building a modern CPU/GPU, basically the only countries that would ever try to develop a homegrown semiconductor industry from the ground up (+/- a couple stolen IPs) are those in opposition to the current US-dominated world order, who don't have reliable access to established manufacturers.
      In the modern world, that's basically only China, Russia (lol), and maybe sort of, kind of, India.

    • @ffpr1
      @ffpr1 Рік тому

      @@FNLNFNLNNorth Korea?😂