of course , nvidia always wins over AMD with drivers ; and btw this is just a 2 years craetion , give more years , and it will fload the market with lot of good midrange GPU like they did with phones
I was thinking the same thing. No drivers for this GPU is yet available or rather not yet available for the consumer, OR maybe it's in the manual together with the GPU that only the owner of the GPU linus is Investigating has it and it's wasn't given to the public yet
Nvidia's decades long policy of tuning for specific games in their drivers and even going as far to optimize the shaders of specific games into the driver. Must be an absolute nightmare to be AMD or Intel and having to try to support all that old spagetti of legacy optimizations.
AMD Polaris and Vega was developed out of their Shanghai office, so China does have enough talent to make the hardware. In fact any university level computer architecture course can teach you how to make a beefy GPU. But the magic sauce is the optimization within the drivers. If you can't fully utilize the GPU, the performance/watt is going to suffer. I suspect their drivers does not implement DirectX APIs 100% at any level, that is why they only support certain games.
It's not about talent, China has a lot of it in the computer science and r&d space but they need access to parts that require a lot of time to create this tech, AMD and Nvidia are working on project right now that won't see the light of day for another 2-4 years, it will take them time to create everything. There a reason Intel has used x86 for decades. They also need driver support in application and games which requires cooperation on both the developers and the card makers which isn't an option for everyone. Intel has been working on gpus for a decade on and off and they are still have driver and support issues with games a year after launch.
I believe they legally can't support DirectX, That's a Microsoft property and unless I am remembering wrong Microsoft is one of the companies that was banned by China during the trade war, I I would imagine I'm probably wrong,
@@kelmanl4 Yes it does take time to come out with the silicon. But drivers imo are toughest. Even Nvidia and AMD have to release patches or games specific fix or optimization constantly. DirectX, OpenGL and Vulkan are not going to easy to implement from scratch and imagine doing it across multiple versions. Intel or AMD can easily drop x86 and go for a RISC ISA. The reason why they don't is because there are still huge money to be made for backwards compatibility. Intel has proposed to dropped all the legacy 32bit rings and 16bit real mode support in hardware from future CPUs though. Hardly anyone these days are booting with DOS or a 32bit OS. And its updated every so often with new SIMD instructions. x86 is still very alive and evolving.
The architecture of an GPU is more complex than the computer architecture course in a uni. There are a lot of optimization to do in hardware architecture to reduce consumption. Drivers isn't the only magic sauce.
Some people in Germany got their hands on MTT S80 GPUs, and they tested a couple things for me: Unfortunately there is no OpenCL support yet, although MTT adverdised it. Drivers are still very early.
@@decreer4567 this is likely to change with driver updates. OpenCL support is mandatory to have any user base in the compute / data center segment. I'm not surprised there is no support yet; Intel's OpenCL support at Arc launch was abysmal too but has gotten significantly better already. Give them some time.
@@ProjectPhysX The last thing westerners should do is support a Chinese chip companies no matter how small they may be. Why are Germans so supportive of dictatorships. Didn’t y’all learn a lesson from the whole over dependence on Russian energy thing? The west has gotta decouple from China. Not buy advanced components from them that they could then use for spying or collecting data from us
@@tylerclayton6081 OpenCL is an open standard, if the hardware supports it, it can run a large variety of stoftware. Open standards are a good thing. I'm not supporting Chinese chip companies and their dictatorships. But there is no reason to be dismissive of the chinese people either, they are not that different from us. I come from academia and value international collaboration, no matter of nationality. International collaboration/communication solves problems, decoupling through stereotypes and building walls does not.
I get lackluster game support but I was expecting the situation to be far better for compute workloads! I feel sorry for the poor souls who have to use these to work on their research projects!
@@NoNameAtAll2 most (well maybe all of them) gpus have their power connector "on top" of the card which when you mount the gpu, unless you vertically mount it, you always end up seeing the pci-e power cables in front of the gpu. the card in the video has it to the side of the fan, which kinda hides the cable in sight, which is kinda cool
It makes absolutely no difference, the cables are covered up by the side panels anyway and even if you’re running with the sides off for thermals you’re not spending any significant amount of time under your desk staring into the computer.
@@СусаннаСергеевна I think in EVGA testing they found it affect air flow little bit. Also it looks cleaner and on some cards the wires will hit the glass panel. On the original O11 I couldn't close the side panel when I had the EVGA 1080ti hybrid due to hitting the power wires.
@@СусаннаСергеевна Bro do you use a fkin Lenovo Thinkcentre from the 2000's ☠️. Almost all modern cases have transparent glass and acrylic side panels in case (pun intended) you didn't know. Also yeah many people keep their PC's on their desk 🗿
Apparently it will use a standard PowerVR module on Linux, and PowerVR has a Vulkan driver on Linux - there might be some interesting testing to be done by Anthony
It displays stuff on screen, so it uses the GPU. In fact, uses OpenGL to render the screen. Factorio is 2D, yet uses OpenGL on Linux and DX11 on Windows.
@@luziferius3687 No it doesn't. OP is right - the new release of Dwarf Fortress on steam is entirely CPU bound even for rendering graphs. It uses a hardcoded version of OpenGL and CPU clock sync to render the 2d graphics.
@@PointingLasersAtAircraft WW3 will be played in COD lobbies, modernization exists everywhere! Joe will prolly appear with the best of pay to win weapons
I have to imagine its priority is server first, desktop second. I was surprised it didn't seem to allow any real video encode/decode as that's a giant use case for server GPUs. Though maybe it does and this is really a case of locked down hardware/software package deal. ie maybe Chinese UA-cam is using footbrake instead of handbrake.
Almost certainly the media transcode capability is through an SDK (like nvenc) and the enterprises buying these will just implement it. Trying it on OBS is a good test, but if OBS hasn't implemented the SDK, like they have for nvenc, then obviously it wont work
its likely good for background rendering were it doesnt have to display the image it make but does the maths work fine as that was the read out were say and the display were saw hinted at given the low display rate but high processing abilty and its only when new assits are obtained or installed that system hitches like server bacground gpu use to do in the early 2010's they were great cards for the price then like $120 for decent cad work server but crap for any 3d gameing or texture live rendering
further note companies that would get such card would do so cause they would take what they saved over the worker system and supply a finalisation system within tehoffice that worker would upload their final project to be worked on to the system rather then have 20 machine costing 15k each you can have 50 machine worth 2k and one single system worth 30k
The good news is that if Moore Threads can actually become a contender with the heavy hitting GPU MFGs, then their products could help to keep the price down on in the powerful graphics card market.
They might only care about the Chinese market, which is why they made a card similar in performance to a 3060. Because 99% of gamers in China only play shitty mobas that can run on a 750ti
@@st.altair4936 Just imagine that any software you use in the future will need to be censored by the Chinese government before it can be optimized accordingly.
@@DI-ry5hg I've used Chinese phones, none of the memes actually hold up. It's literally just a phone. There's no reason to think their GPUs would be different.
Love how the power port has finally been moved. Now someone just needs to put it on the bottom and cases will be looking much cleaner without those 2 cables jumping over the MB!
It just depends on how you customise your own pc, I have a 3090 FE, my 12 pin connector looks super clean with a braided 12 pin to dual 8 pin connector it's not in the way at all.
@@MaddTheSane They do make drivers on Linux and that's their main purpose. Our company it considering to use this product to do some AI computing in order to prepare the furture chip ban that might happen.
I'd support them just to break the duopoly we have rn honestly. What chinese brands did to the smartphone market was a great thing for everyone. Folks here in southeast asia can finally get high-end spec phones without it costing arm and leg. AND cheaper competitors also forced the likes of Samsung and Apple to be more competetive in terms of features/quality/pricing. The same kind of thing happening to happening to GPU market would be a net positive for consumers.. Just saying.
I agree 100%, Americans companies are scrambling to stop them because it would break their oligopoly on the market and they would actually have to make good competitive hardware.
yeah but they also use that cellphone architecture through huawei to basically put a strangle hold on the communications networks of developing countries and spy on people so although its great for normal people in the short run it will have nasty long term ramifications for countries like bangladesh and malaysia
I'm gonna guess that by their 3 or 4th generation they will have cards comparable to the cards Nvidia, AMD, and Intel will be putting out at the same time. They are getting a share of that $1.4T and that helps tremendously.
15:08 that "2 years for this GPU, 3 years for your screwdriver" is exactly what I'm thinking. The effort to even get a GPU working in whatever way possible is massive. From there you need to work on various features (like Tesselation) and likely revisit the hardware design multiple times to get it right. But 2 years is a pretty short time for the initial product.
People can shit on this product all they want, but 2 years from nothing to an actual product is crazy. If they can keep up the pace excited to see what they can accomplish in the coming years. Amazing what state driven investment into technological innovations can accomplish. It's almost like having a government that invests in it's own infrastructure, development and progress instead of one that spends all it's money in bombs and 800+ foreign military bases while it rots from the inside out is maybe superior. Weird how that works. Probably has nothing to do with how China has managed to lift 850 million of its people out of poverty. 🤔
@@rfouR_4 they literally licensed the IPs from another company, its not like they did all the RnD and all that themselves, they just put the money down to put themselves in the position to make a glorified 1030
@@username8644 "They designed a power connector is that very finicky, much more than a regular power connector." It's literally the same style of power connector just with more pins. "You should never be able to melt a power connector because it wasn't fully plugged in" Improperly seated power connectors, no matter if they are inside a PC or not, are one, if not the most common cause for residential fires. And that includes stupid NEMA connectors as well as the CEE 7/x plugs. The problem with the connector, as the GN video showed, is that the tolerances are very tight and it's easy to push them not as far as they should, which can be solved by looking a bit closer and pull on it to see if the clip has arrested. Tight tolerances are not a bad thing per se, in case of electricity you want to have as little wiggle room as possible, especially if high currents are involved. If you don't have that you get sparks at the connection point, which increase resistance even further (thus increasing heat) and can lead to the connections even fusing together. " Grow some balls, use your brain, and start calling out these scummy companies who keep getting away with this crap." I'm all for more responsibility for big tech, but this isn't a problem of a scummy company (remember: Fault rate is less than 0.1%) and rather users not doing their due diligence; ignoring bad cables caused by manufacturing defects and not adhering to the given standards, but that's not NVIDIA's fault. Seriously Gamers Nexus did a whole series on that topic.
Hmm, PowerVR is a name in the GPU space I haven't heard in ages. I actually used to have a PowerVR Kyro and Kyro II GPU way back in the early Geforce T&L days. Back then it basically was the "Bruteforce" Hardware T&L Approach on Geforce vs the "efficient" deferred Renderer on Kyro cards.
@@PicturesqueGames No they were not making their own chip and was relying on partners to make the chip, and one of the partner (ST) decided to leave the market leaving Video Logic (as they were known at the time) in the dust, and fighting against ATI and NVIDIA was not necessarily something they could over the time, they were not as big. But the original disparition of the PowerVR from the PC market was not Video Logic will to start with.
WOW VERY DANGEROUS SIR! !! 😠 😠 BUT THIS WHY IM SO LUCKY LIVE IN SUPER INDIA THE CLEANEST COUNTRY IN THE WORLD 🇮🇳🤗 , WE NEVER SCAM! WE GIVE RESPECT TO ALL WOMEN THEY CAN WALK SAFELY ALONE AT NIGHT AND WE HAVE CLEAN FOOD AND TOILET EVERYWHERE 🇮🇳🤗🚽, I KNOW MANY POOR PEOPLE JEALOUS WITH SUPER RICH INDIA 🤗🇮🇳🤗🇮🇳🤗🇮🇳🤗🇮🇳🤗🇮🇳
I had an old Power VR GPU many many years ago, they used tile based rendering and avoided rendering anything not visible. Cheap cards that did well price/performance wise vs the ATI and 3dFX cards at the time. They had a few compatibility issues with newer versions of Windows, and Nvidia did their usual dirty tricks to help torpedo them.
Ah occlusion, didn't Nvidia use that around the Geforce FX line to prop up its rather questionable performance. Once the "Oops you sure you need 24bits textures" trick didn't get that much more performance they put in one of the drivers a simpler trick, knowing the path of the camera in the benchmark they pre-occluded it. Aka, not use on chip/in driver occlusion but recognized the software and ran pre scripted blacked out areas. Of course a free flowing camera POV broke that little lie.
TLDR: The card is effectly a card which has performance that varies between a gtx 1030 and 1660 but which also consumes 250 watts of power (the 1030 consumes 30 watts I think they said)
@@nexusyang4832 then again, you probably don thave a team of Electric Engineers, Folks who understand Discrete Structures very well, Pro's in Assembly. they do, id expect them to have a good working GPU that didnt consume as much as an RTX 3070 while giving 1030 performance
I wouldn't underestimate them, it's a giant step forward for them Manufacturing high end semiconductors is the single most technically challenging industry and they haven't been in the game for long. The west kinda forced their hand by preventing China from sourcing high end chips elsewhere, and now I'm worried it may have been too short sighted
I mean honestly. If anything the ban of them from importing certain chips will only increase there investment in domestic manufacturing. Why would you back down when your sole sorce of something essential to your economy threatens to be cut odd
@@putinslittlehacker4793 true this was a very dumb move from the perspective of the west, it’s not like China is some small undeveloped nation, they would easily be able to develop industry for making high end computer chips if they wanted to 😂
Highly unlikely for another decade minimum. Nvidia and AMD have been in the game longer than the rest and know the ins and out of GPU design. Hence why Intels cards were laughable when they arrived. Price only got out of hand with the newest cards because upto 5 year old cards are still worth while for almost everything. New games are finally starting to push the envelope and ruin performance on older cards, even then you can just drop some settings and get solid performance again.
They do not make those GPU to compete with anyone , they are making them so they wont be left without anything if USA does something horrible again in attempt to have complete world control.
Agree. But what's holding back from the market having available GPUs is fab capacity for making the GPUs and a lack of board partners willing to make graphics cards because of low profit margins.
It's a really impressive GPU for only 2 years of development. And due the western restrictions, the support should be better on Linux, Vulkan, and others open technology's
@@brianransom16 Industrial espionage is not restricted to the CCP, Western companies do it whenever they can, AMD for example started with reverse engineering Intel chips. Nowadays, the most practical and legalized way is to hire the competition's main engineers to develop “new technologies”, a strategy that all the big technology companies use all the time. I agree that they wouldn't develop as fast from scratch, but nothing starts from scratch. I think the difference with CCP-affiliated companies is that they are less willing to hide spying as they are less susceptible to lawsuits.
Most importantly, their GPUs are based on Imaginative Technologies, which has been around since the 90s, but simply moved into the mobile (phone) space. So it's not as much a new 4th competitior, as a resurrection of one from the 90s EDIT: Oh, linus talks about it at the end! All is good
@@damienkram3379 S3 Graphics is a skeleton crew within VIA that only works with Zhaoxin in China making iGPUs. VIA hasn't funded a new GPU architecture from S3 since ~2012 and is still shipping the same Chrome 600 series cores (2-4 CUs max, DX11.1, OpenGL 4.1, no OpenCL) for a basic integrated option.
That's I assume mostly the GPU compute part. I doubt they also got made all the other IP from scratch, I wouldn't be surprised there was some stuff that was stolen like ARM stuff that isn't properly attributed (stuff they got from previous projects made in China where they got the IP legally). Not to mention the EDA tools, no way it's all 100% Chinese either.
@@damienkram3379 Adreno is an anagram of Radeon so you can guess where that came from. Mali is an original design by ARM, but its history goes back to 1998
This is actually why a lot of foreign tech companies are either closing or refusing to expand their offices in China right now. They are worried about hiring local Chinese employees who will then take IP and knowledge and start their own government backed competitors. ASML just sent a delegation of suppliers to countries like India, Vietnam and Indonesia to look for new locations to get local talent and build local factories. There's much less risk for them in these friendly countries with better IP rights.
@@JohnSmith-vn8dmAlso it isn't like any of these countries can set up local competitor even if the people who previously work there decided to take the knowledge and ip and work in said supposedly new company
moving the vrm's to the top of the card is kind of a simple idea to help vent the hot air dirtectly out the top instead of stuffing them in the middle of the board where it can heat soak easier
VRMs are easy to cool, placing them along the top stretches the trace lengths badly and provides asymmetric delivery to the die, leading to worse voltage drop-off - or spikes to compensate
The fact that you have no clue yet are talking big is just - sad. Just a hint for the retards that will come crying: EPS 8pin is specified by the same spec as the 12VHPWR.
Agree. The EPS12V power connector is better than the 12VHPWR . We have been using the EPS connector for decades, never seen them melt or catch fire despite seeing older Intel HEDT CPU pulling a crap tonne of power from them.
@@fleurdewin7958 the EPS12v is rated way lower and the the 12vhpr connector isn’t an nVidia creation, it’s a standard connector, so many idiots keep blaming nVidia for it. The only ones that have failed have been due to improper installation.
@@oxfordsparky Even better - The EPS was specified by the same company as the ATX standard (from which the 12vhpwr comes). But people are too ignorant to learn from their mistakes. The connector never was at fault.
I once had a GT 1030 in my system and upgraded to a 1660 Super. I can confirm that this comparison is about right. The 1030 simultaneously *doesn't exactly run like garbage* and *will bottleneck you hard.*
yeah iirc it was replaced to gtx970 everything run well and smooth just i never really tested it, but game on it alot, first witcher 3 gameplay also with that pc it was forbidden msi tiger 1 fan not spinning after 2 years, and the msi board also failed, replaced it with gigabyte board
You know speaking of things like PCI-5 and stuff, there's a game I have in particular called iRacing that supposedly has a lot of bandwidth related issues as far as sending data to the GPU because of the fact that they send the entire world space every frame. I'm curious with the impact of bandwidth over the bus actually has for GPUs particularly in a game environment that does such a thing and I wonder if anyone at the lab would be able to provide insight on that.
I feel like it's an under discussed aspect of gaming as a whole because we just assume that it's just straight up power that's needed but I think a lot of people forget that the CPU still needs to send instructions to the GPU and that data still has to get over to the GPU over the motherboard.
So you have a game that is just insanely stupid. It is sad just how bad most games, even the big AAA games, are coded. they literally waste a decade of hardware-improvements just by being retarded programmers. Was one of the strange things when DX12 was announced: OpenGL already offered low-level access and nearly nobody used that cause it is a lot of work and easy to get wrong. And with Dx12 and Vulkan the same problems came up. Not only that we have seen that the code it self is yet again mostly bad to the point that the drivers of AMD and Nvidia have to do the heavy lifting by dynamically reshuffling data as the code as written would just result in slideshows if anything.
Except an RTX 4090 can operate at 98% performance in a PCIe gen3 x16 slot than when in a PCIe gen4 x16 slot. We're only just now seeing GPUs that are finally outgrowing PCIe gen3. Im betting its at least two generations, maybe 3 before a card is knocking on the door of PCIe gen4 x16 limitations.
@@racerex340 this depends on how heavy PCIe bandwidth usage gets in future with likes of direct storage remember were starting to see PCIe 3.0 show age after that many years,4.0 is 2x bandwidth off of 3.0 which even with direct storage could take years to be fully saturated
@@racerex340 But LithiumFox point is that's application dependent and there are some applications where more bandwidth is necessary. One such application might be AI and applications that benefit from memory pooling of multiple cards. Nvidia's solution to this has been Nvlink, but a higher bandwidth PCIe connection could be a cheaper alternative solution. This could also perhaps be used to bring back multi-gpu gaming like SLI, without needing special sli connectors between the GPUs. Anyway, there are definitely situations this could benefit.
True. This video is nostalgic. I got same vibe about 10 year ago when china got banned from only International space station. The best way to stop china progress to banned and then mock them. it clearly work
The specifications of this graphics card are very high, similar to the GTX3060, but due to driver optimization reasons, it has not fully utilized its true performance. In recent months, this Chinese graphics card company has been constantly updating and optimizing drivers at a high frequency, the performance of this s80 graphics card has greatly improved compared to before, and it can now support most mainstream games. Its GPU usage during operation is currently less than 20%, so its potential is still great. Currently, its price is only $163, and the inventory of product distributors was quickly snapped up. They are preparing to launch the next generation of the S90
Seems obvious to me that it is productivity or research oriented. That's why you see the AV1 support as well as Pytorch and Tensorflow. It also explains the higher memory capacity and Tensor-like cores, typical for deep learning cards. So it would be fairer to run some DL throughout benchmarks on this.
Except all the productivity suites they tested won't start, while some games did. It's a glorified prototype, a marketing stunt. They re-purposed their server GPU for consumer use, and it's still useless. If they're able to iterate maybe they will succeed, but it depends on how much money the Chinese Government are willing to spend on Moore Threads
@@fostena They issued the card to prove to investors that they could deliver a real product. Second, many people in the West do not understand that not all random enterprises can be funded by the government. The Moore Thread started with private capital, and they need to convince investors to invest more.
@@guzilayerken5013 you don't need to explain to me public-funded enterprises! I live in Italy, we where almost a "socialist" country once, by USA standards 😄! We got plenty of government intervention in the economy. That's what I said, by the way, the card is a marketing stunt, it's a proof, a prototype
Even if this is absolute crap, it's backed by - true murky - capital and does something. Like you said "I wouldn't be able to do it" and on a pertinent level this is pretty much on point. We need more players. We need competition coming in from other directions. Give it time and in a few years we might get something from them on ARC level, a few years after that something that can compete. Voodoo wasn't supplanted by NVIDIA Riva TNT in one generation. These things need time and a steady flow of cash. I'm not saying a world where the fastest GPU4s are china made is a better world to live in, i'm saying competition will bring prices down between competitors.
These chinese companies will never be able to sell their products in the west, all that stolen tech would make it impossible unless they are prepared to do legal wars until the world ends. so, you, me and everyone else in the west dont win anything from this.
@@pauloa.7609 Predictions based on wants or should be don't mean much. The world is changing, a family being able to sit on their laurels for generations because an ancestor thought about putting a piece of plastic at the end of a shoelace is, when you take a step back just as ridiculous as chinese made video cards. There is never a patent on a result, just on a means. We will have to see.
@@venosaur121212 It has both the Conformité Européene And the FCC logos (Or at least the spacing in the CE logo ondicates is the Conformité Européene logo)
Not surprisingly, most of the motherboards like Asus, Gigabyte, MSI, etc., are manufactured in China and printed with FCC and CE Logos, even some models that are only sold in China.
They made this only for 2 years? Wtf that's actually impressive in such a really short period of time they made a working GPU, if given more time to produce GPUs then I guess they have a higher chance to being more compatible to more hardwares
Hope so for the consumer market. Nvidia's prices are completely out of wack with what consumer's are willing to pay, AMD is no better and the stock is always out anyways. It's time an actual disrupter came on the market to break the diopoly.
The 2 years isnt that impressive when you remember most products home grown in china are from stolen tech lol. They dont innovate, they steal and make rip offs. The ultimate chinese knock off!
The gpu itself isn't scary. The scary part is that China was able to put that gpu together without having an international supply chain like nvidia or amd does to make their gpus. All the more, this iteration is close to the 3060ti which you might say "oh but that's mid range" but you would be wrong because for a lot of people a 3060ti is a high-end gpu.
What china is more then capable to manufacture GPUs like this they already make 75 percent of the world's products. Idk why people assume china is a third world country they are not they a highly developed industrialized country. The usa knows china is the only one with that can rival usa tech which is war the usa is in a economic war with china we've stopped selling them chips it don't matter.
It's scary because a long supply-chain is currently needed to make computer chips. A collaboration effort between Taiwan, Japan, Korea and the US. These are first world nations that need each other to make these chips. Now because tensions between China and the US, China basically said they were going to make their own chips with blackjack and hookers and they actually did. They are on their way to beat those other nations at chip manufacturing. II don't mean it as in "ohh scary, china bad". But the fact that they actually pulled it off is insane. China is already becoming this century new super-power.
I think the biggest difference there is there wasn't a burgeoning economic/tech war brewing at the time. Who knows how this will pan out, I"m only saying there's a bit of a different context here.
I like how they managed to get AWESOME cooling and next to zero noise operation and have low price. Of course, hard restrictions on MBs, and not allowing to launch games after warning they are not tested and supported but perhaps they would work... I actually kinda wonder why both Intel and China didn't concentrate on Vulcan support. Force games to support Vulcan as main instead of DX12 and everyone would be happy, with exception of developers.
if it's an AI oriented you could test it by running a basic training on it with a ready base, put it through 10-20 epochs see how long it takes, compare against cpu performance and some entry level gpu like the 1660 in this video
I was kinda interested, so I looked around - couldn't find anything. According to some people who played around with it, it's quite difficult to get the Linux software for those cards (UnixCloud has some pretty old drivers for previous MTT GPUs on their website, but not for the S80/ S3000), and I'm not sure anyone managed to get access to any Torch builds for MTT GPUs. And no code has been submitted upstream at all as far as I can tell.
It's likely this GPU isn't designed for gaming, but for other applications. Like trying to play games with Quadro instead of Geforce GPU's. It's likely designed with computational power in mind (AI, cryptocurrency/blockchain, professional level 3D animation, etc.) and gaming is likely an afterthought.
It's a lot more likely people underestimate the amount of work that went into driver support for games over the years. For a new player to try their hand at it, this is impressive, but obviously not a good product for actual use now.
As they said, it's a repurposed server GPU but this version is supposed to be for gaming and they didn't do a great job at it since it can run only a handful of games at performance levels of a 1030
The way I see this is that China does have the market there to support a new GPU company to start from scratch, they might able to come up with brand new tech tree that perform better and cost less at the sametime, just like all the other things the made in China. I think overall it's a good thing, judging by how Nvidia is doing right now, I'd say the more competitors the better for us gamers. Just give them couple of years, we will see.
Also to just add a bit. They are licensing the same GPU IP that ImgTech sells to phone makers. They simply don't have drivers done at all. The GPU most likely is stronger than what it actually shows but can't use the performance due to bad drivers. Moore Threads did the hardware, worry about software later. However, we shouldn't expect more than 1660 in gaming even when it's fully optimized.
The reality is that this is a CCP company that is going to make chips for MIL/AI use in the long term. While they would probably like a commercially viable product it is unlikely they will be able to do that any time soon.
I am glad Chinese tech companies are making GPU for gaming market. This will change the balance built by AMD and NVIDIA. I believe both of them will release more competitive products under the pressure from outside.
I'll never run one, same reason i won't own Lenovo, they are spy tools of the CCP, if you want to claim no way China would slip in code into the firmware to spy on westerner's, I have a bridge between Tokyo and New York for sale are you interested in buying it?
@@didyoumissedmegobareatersk2204 oh neat. I didn't know that. Kinda surprising that they did it without EUV, but we'll see if it hits any sort of production. I correct my post Gonna be tough without ~~7nm~~ uhhh 5nm
@@zaxwashere Their 7nm was found on a mining chip. This tells us that its probably not a mature 7nm node (meaning yields are low, actual performance isn't there). However, the fact that it exists by itself is a big deal, because its like getting your foot in the door to greater things.
Nvidia has the H100 out with pcie5.0 for a while, but so far it seems unnecessery for the rest of the gpus that it is not worthy for them to add support for it.
@@h_alkhalaf It's a graphics card mate, a GPU in a card format. There are many, many GPUs that exist without display outputs. Look at any "mining" GPUs that are quite literally 100% complete replicas of consumer GPUs with absolutely zero differences apart from lack of display outputs and different drivers/firmware. Nvidia literally called the H100 a GPU in the full product name, "NVIDIA H100 Tensor Core GPU". I cannot stress this enough, that just cause a GPU has no display outputs does not mean it isnt a GPU.
it does support games in every week, but still it will take a long time to compete with even 1660ti, right now, in some games it will perform similar to a 1660ti, especially in some old games or some chosen games, but general speaking, it is still not even close to 1660ti, which is pretty sad. the only reason that you would like to buy this card is you want to support a third party company to compete with those larger companies, other than that, just go with AMD if u only play games
As meh as this is, I'm still really excited that someone else is stepping in. They clearly have good engineers, but software will come with time. I mean it took AMD 10 years to make good video drivers. Any competition is good for the market.
Idk, they didn't seem to be interested in making gpu for anywhere except china and probably the plan is that they would make everything themself and just ban out all the western made hardwares, and still compared to intel arc at launch, that still have better performance, stability and support than this card so i'm more excited on seeing where intel going with their gpu
their first consumer grade release. really rooting for this company to catch up. they don't have to release it in the US. there's the rest of the world you know
@@keyrifnoway people said the same thing about Chinese phones. Now they sell hundreds of millions of phones every year. Even without the us market. (Same for Chinese electric cars they’ve started to gain marketshare in southeastern asia, Europe and middle east) China has the money to burn to play catchup unlike most countries. They’ll just keep throwing money at it until it becomes semi competitive. In a few years it’ll sell like hotcakes in Asia then a few years more Europe.
As a Chinese ,I firmly doubt 1:39 ,We only know that it's the first Moore Threads gaming GPU, complety domestic and of course only the geeks would buy it to run benchmark. NV was ordered to limit the export of cutting-edge GPUs to China, and the goverment esbalished a project to fund those domestic chip companies who develop GPU CPU chips . Moore Thread is just one of those companies.
Gotta give credit where it's due. The naming of Chinese companies especially the English names are fire...I mean "Moore Threads"? C'mon....that's genius 🤣
Gotta give it to them, the English namings are always something. Like their recycled lithium batteries that were branded fire explosion or some shit lol.
Having spent plenty of time in South East Asia a good English business name is more of a happy "infinite number of monkeys with typewriters" coincidence than inspired genius.
I'm disappointed you didnt have labs benchmark this card against the GTX 1660, RTX 3060, and GT 1030. i think i would have prefered that over the nonscripted gameplay tests
should keep in mind that League of legends runs mostly of your CPU not GPU. they kept it extremely low performance so it works on pretty much everything
yeah if you're running on the lowest graphical settings. Mainly because all the number crunching and client side processing is CPU based (pretty standard). But actually increasing the graphic settings of course increases GPU usage by a wide margin. Essentially, a poor gpu can run league on the lowest settings. A moderate gpu is needed to run league at higher settings, let alone 4K on top of that.
@@tristanstebbens1358 yea but it still maxes out at ~25% usage on my 6700 XT so anything above a ~1060 should run the game the exact same (within margin of error)
In fact, Our chinese harware fans think that two years are really a short time for GPU manufactring. It can work ! We cant wait to see more products. My friend bought a moor gpu to play games , although it's not very stable. ( Moor did not sell this gpu to the market, beacuse they know it is a dev version. Brand repuation matters very much)
Honestly if they just focus a ton on drivers this seems like it could be promising. They have already proven they can get decent performance in some games.
Interesting and reminds me of the nineties when mobos and video cards were released with drivers that had not 'matured' (not forgetting games that ran like snot until they were patched numerous times)
Yeah this seems like a re-painted general purpose GPU (probably for AI and maybe crypto). All you need is one person to make it compatible for whatever GP GPU application and then it becomes super useful. Given the high VRAM to core count ratio, I'm guessing loading big datasets for AI purposes is the priority
Or they are just trying to scam their way into some sweet govt funding. Lot of chinese companies have done that in the past. With one trillion $ there is enough for everyone to share.
We actually need this card. A cheap gpu out of china that threatens the gpu cartel|s market share is exactly what we need. If they fix their drivers it could be viable for 1080p as long as they keep the price down.
Yooo I watch linus's videos just by looking his face on the thumbnail😂, like how old the video is doesn't matter 😅, it's super informative and entertaining❤
@Kyle Cruz I was gonna scroll past until I saw a reply. Budget doesn't mean cheap or inexpensive; it's just the amount of money you plan to use. You can have a $550 budget for a PC, or one that's $2,550, and that's still *a* budget.
Seems it has some drive issues since from all specifications it should not have such low performance, and considering its very small support list they are probably not working on maximizing the performance rather than making it run first.
14:05 - TF2 is the GOAT for me. While this would NEVER happen, if Valve ever released a TF3 and it was good, I would be over the moon. I don't think I've had more fun in any other FPS, other than the OG Halo trilogy. Overwatch feels like the spiritual successor but it never got me hooked like TF2 did.
It is pretty much common knowledge to Chinese industry insiders that they bought the GPU's soft IP from imagination technology from the UK. The same company that licensed Apple way ago with GPU Soft IP architectures. But again, there is so much more to get a GPU out besides having the RTL. Edit: Also, given that they bought Imagination's IP, I don't think they actually stole any patents or IP...
@@Tonyx.yt. "performance per $" doesn't matter if you have limited amount of money. In this case you have a "performance per $" sublist to choose a card from.
Indeed this is a threat and a good competitor for the AMD and Nvidia. The same has happened to the phones when Chinese phones took a huge part of EU and US market. US government acted fast and banned Huawei before it could took the marked completely. With that huge funding the new GPUs will pop up in 2-3 years easily. Also, TSMC factories which are the only in the world capable for 4nm lithography are under the China influence.
TSMC is not under Chinese influence unless they decide to invade Taiwan, lol Also, Chinese phones were never really that popular in the US. They really prefer to spend their money on needlessly expensive iPhones instead.
I think it would be only fair to run this card on the fully Chinese P3 processor with the Chinese board or the highest, and one that is available from China then see what we get.
With cards like the Intel ARC 380 I can see people buying it to get enough monitor support, 3D acceleration (CS:GO will play with enough framerate), encoding options and hell, maybe some use of the AI possibilities. It will also allow your CPU to run more smoothly as an APU also uses system RAM for video. Some people (like me) don't care about high end gaming at all, and find these kind of GPU's perfectly fine. I don't have any reason to upgrade my RX 580 anyway, which I managed to buy when it was still considered a "medium priced" GPU at around €210, and it has fine performance and FAN stop for my silent PC.
@@rickyray2794 I had an RX580 up to a few months ago and it ran 3440x1440 on my coding workstation flawlessly. Didn't have any issues when playing with CAD and a few other 3d things either. So no, not 'lol'.
And it's actually a rather interesting game to test because of how CPU-bound it is, and how much clock speed it needs for achieving competitive framerates
Moore Thread has released a new MTT S70 graphics card on May 31st: 3584 MUSA core, 7GB graphics memory, It is expected to release a new driver supporting DX11 at the end of June, supporting games such as Genshin Impact and Dark Soul 3
Developing GPU is not only on hardware, software is important too. It's easier to get more TFlops than suitable drive software and gaming software engines.
@The Big Sad Given the absolutely brutal expense of building a modern CPU/GPU, basically the only countries that would ever try to develop a homegrown semiconductor industry from the ground up (+/- a couple stolen IPs) are those who are in opposition to the current US/dominated world order, who don't have reliable access to established manufacturers. In the modern world, that's basically only China, Russia (lol), and maybe sort of, kind of, India.
I think what the recent newcomers into the gpu space have shown us is how important driver support is.
of course , nvidia always wins over AMD with drivers ; and btw this is just a 2 years craetion , give more years , and it will fload the market with lot of good midrange GPU like they did with phones
I was thinking the same thing. No drivers for this GPU is yet available or rather not yet available for the consumer, OR maybe it's in the manual together with the GPU that only the owner of the GPU linus is Investigating has it and it's wasn't given to the public yet
Nvidia's decades long policy of tuning for specific games in their drivers and even going as far to optimize the shaders of specific games into the driver. Must be an absolute nightmare to be AMD or Intel and having to try to support all that old spagetti of legacy optimizations.
this isn't news. Anyone who's had an ati card from the late 90's early 2000's knew this!!
Not only newcomers.. Nvidia's driver still sucks big time in Linux.
"Took them 2 years to make a GPU and you 3 years to make a screwdriver" - Fuckin gottem.
TBF the LTT screwdriver is great and the GPU is trash even compared to the ARC dumpsterfire.
Had me dying laughing
🤣🤣🤣🤣🤣🤣🤣🤣
@@naamadossantossilva4736 Oh yeah? What does that screwdriver cost again?
@@naamadossantossilva4736 tbf a screw driver is just plastic with a steel rod, GPU is much more complicated
15:09
Imagine a Chinese secret agent watching Linus drop the GPU and break their hidden spy camera.
ahahahhahahahahahahahahahahahahah bro💀
I’d say they saw it coming.
@@wescrowther655 yea
lmfao
Dunno why the US government keeps emphasising China’s spying on everyone… isn’t this what the US has been doing to rest of the world?😂😂
AMD Polaris and Vega was developed out of their Shanghai office, so China does have enough talent to make the hardware. In fact any university level computer architecture course can teach you how to make a beefy GPU. But the magic sauce is the optimization within the drivers. If you can't fully utilize the GPU, the performance/watt is going to suffer. I suspect their drivers does not implement DirectX APIs 100% at any level, that is why they only support certain games.
It's not about talent, China has a lot of it in the computer science and r&d space but they need access to parts that require a lot of time to create this tech, AMD and Nvidia are working on project right now that won't see the light of day for another 2-4 years, it will take them time to create everything. There a reason Intel has used x86 for decades.
They also need driver support in application and games which requires cooperation on both the developers and the card makers which isn't an option for everyone.
Intel has been working on gpus for a decade on and off and they are still have driver and support issues with games a year after launch.
I believe they legally can't support DirectX, That's a Microsoft property and unless I am remembering wrong Microsoft is one of the companies that was banned by China during the trade war, I I would imagine I'm probably wrong,
@@kelmanl4 Yes it does take time to come out with the silicon. But drivers imo are toughest. Even Nvidia and AMD have to release patches or games specific fix or optimization constantly. DirectX, OpenGL and Vulkan are not going to easy to implement from scratch and imagine doing it across multiple versions.
Intel or AMD can easily drop x86 and go for a RISC ISA. The reason why they don't is because there are still huge money to be made for backwards compatibility. Intel has proposed to dropped all the legacy 32bit rings and 16bit real mode support in hardware from future CPUs though. Hardly anyone these days are booting with DOS or a 32bit OS. And its updated every so often with new SIMD instructions. x86 is still very alive and evolving.
Or steal IP.
The architecture of an GPU is more complex than the computer architecture course in a uni. There are a lot of optimization to do in hardware architecture to reduce consumption. Drivers isn't the only magic sauce.
Ah, finally a GPU brand that you haven't dropped a card from.
yet
Has he dropped a Matrox card yet? Hmm...
Chances are low, but never zero.
He can't take chances
😂😂🤣
Some people in Germany got their hands on MTT S80 GPUs, and they tested a couple things for me: Unfortunately there is no OpenCL support yet, although MTT adverdised it. Drivers are still very early.
If they don't support OpenCL then it's total garbage.
@@decreer4567 this is likely to change with driver updates. OpenCL support is mandatory to have any user base in the compute / data center segment. I'm not surprised there is no support yet; Intel's OpenCL support at Arc launch was abysmal too but has gotten significantly better already. Give them some time.
@@ProjectPhysX The last thing westerners should do is support a Chinese chip companies no matter how small they may be. Why are Germans so supportive of dictatorships. Didn’t y’all learn a lesson from the whole over dependence on Russian energy thing?
The west has gotta decouple from China. Not buy advanced components from them that they could then use for spying or collecting data from us
@@tylerclayton6081 OpenCL is an open standard, if the hardware supports it, it can run a large variety of stoftware. Open standards are a good thing.
I'm not supporting Chinese chip companies and their dictatorships. But there is no reason to be dismissive of the chinese people either, they are not that different from us. I come from academia and value international collaboration, no matter of nationality. International collaboration/communication solves problems, decoupling through stereotypes and building walls does not.
I get lackluster game support but I was expecting the situation to be far better for compute workloads! I feel sorry for the poor souls who have to use these to work on their research projects!
Finally a GPU where the power connector is on the right position
wdym?
@@NoNameAtAll2 most (well maybe all of them) gpus have their power connector "on top" of the card which when you mount the gpu, unless you vertically mount it, you always end up seeing the pci-e power cables in front of the gpu.
the card in the video has it to the side of the fan, which kinda hides the cable in sight, which is kinda cool
It makes absolutely no difference, the cables are covered up by the side panels anyway and even if you’re running with the sides off for thermals you’re not spending any significant amount of time under your desk staring into the computer.
@@СусаннаСергеевна I think in EVGA testing they found it affect air flow little bit. Also it looks cleaner and on some cards the wires will hit the glass panel. On the original O11 I couldn't close the side panel when I had the EVGA 1080ti hybrid due to hitting the power wires.
@@СусаннаСергеевна Bro do you use a fkin Lenovo Thinkcentre from the 2000's ☠️. Almost all modern cases have transparent glass and acrylic side panels in case (pun intended) you didn't know.
Also yeah many people keep their PC's on their desk 🗿
I'm impressed, I didn't even expect this to work... because I'm pretty sure MTT first focus isn't gaming.
This is where Chinese gamers train to defeat Taiwan in real life.
At least COD will have a decent title afterwards.
Apparently it will use a standard PowerVR module on Linux, and PowerVR has a Vulkan driver on Linux - there might be some interesting testing to be done by Anthony
That makes sense.
Doesn't China still officially use their own flavor of Red Hat Linux?
@@davidgoodnow269 isn't Kylin is the official name for Linux distro for china?
nope, still garbage full of stolen tech, this is literally a scam to suck up government funding
Including AI using ncnn.
@@davidgoodnow269 they also use Win 10 G, basically a China windows 10 mod without ms acc functionality
I like how they list Dwarf Fortress as compatible and supported. . . Dwarf Fortress doesn't use any GPU, its entirely CPU bound.
There's a new Dwarf Fortress, in case you missed it! Check it out. :)
It displays stuff on screen, so it uses the GPU. In fact, uses OpenGL to render the screen. Factorio is 2D, yet uses OpenGL on Linux and DX11 on Windows.
@@luziferius3687 No it doesn't. OP is right - the new release of Dwarf Fortress on steam is entirely CPU bound even for rendering graphs. It uses a hardcoded version of OpenGL and CPU clock sync to render the 2d graphics.
So just like every Chinese product its a lie.
@@SvendDesignsSD OK even if everything in game is rendered by the CPU, surely the window itself is drawn by your gpu onto your desktop.
I like how this is sponsored by xsplit but they use OBS to test on the GPU
Wait. XSplit is still a thing?
@@AlexDatcoldness Don't you need to pay Xsplit a fee for using it?
@@AlexDatcoldness Why, actually? Genuine question.
@@AlexDatcoldness just use obs. they have updated it like half a year ago. spend 10 minutes on it, earn a lot of money.
@@AlexDatcoldness boiler went today and i am down £1500, think i will have to continue to pirate keys :(
”What is the use of a child who has just learned to walk?“
”He will eventually become an adult“
Linus described this GPU as a nuclear weapon that every president is after
ik
GPUs will be a vital resource in WWWIII.
@@PointingLasersAtAircraft WW3 will be played in COD lobbies, modernization exists everywhere! Joe will prolly appear with the best of pay to win weapons
@@PointingLasersAtAircraft World Wide Web 3?
@@PointingLasersAtAircraft Tarkov seems to agree
I have to imagine its priority is server first, desktop second. I was surprised it didn't seem to allow any real video encode/decode as that's a giant use case for server GPUs. Though maybe it does and this is really a case of locked down hardware/software package deal. ie maybe Chinese UA-cam is using footbrake instead of handbrake.
Almost certainly the media transcode capability is through an SDK (like nvenc) and the enterprises buying these will just implement it. Trying it on OBS is a good test, but if OBS hasn't implemented the SDK, like they have for nvenc, then obviously it wont work
its likely good for background rendering were it doesnt have to display the image it make but does the maths work fine as that was the read out were say and the display were saw hinted at given the low display rate but high processing abilty and its only when new assits are obtained or installed that system hitches like server bacground gpu use to do in the early 2010's they were great cards for the price then like $120 for decent cad work server but crap for any 3d gameing or texture live rendering
further note companies that would get such card would do so cause they would take what they saved over the worker system and supply a finalisation system within tehoffice that worker would upload their final project to be worked on to the system rather then have 20 machine costing 15k each you can have 50 machine worth 2k and one single system worth 30k
It supported PyTorch out of the box, it's literally an AI accelerator in a GPU trenchcoat
It’s listed under desktop not sever on their site
To give them credit, "Moore Threads" is a really clever name. lol
I guess you can say Moore's law isn't dead 🤔
The good news is that if Moore Threads can actually become a contender with the heavy hitting GPU MFGs, then their products could help to keep the price down on in the powerful graphics card market.
Specially when you consider it's China; their products tend to be insanely competitively priced.
They might only care about the Chinese market, which is why they made a card similar in performance to a 3060. Because 99% of gamers in China only play shitty mobas that can run on a 750ti
@@st.altair4936 Just imagine that any software you use in the future will need to be censored by the Chinese government before it can be optimized accordingly.
@@DI-ry5hg I've used Chinese phones, none of the memes actually hold up. It's literally just a phone. There's no reason to think their GPUs would be different.
@@DI-ry5hgsomeone will figure out a bypass just like they bypassed nvidia's limited hashrate
Adam is a man after my own heart. A fan of TF2 whos not blinding by basegame nostalgia and enjoys the absolute chaos.
did u know tf2 will release a major update this year ? they annouced it.
@@punimarudogaman some hats, emotes, effects and maps? I'm too lazy to check it by myself.
@@punimarudogaman yeah sure buddy whatever you say.... Major update my ass link me the source right now bruh
@@punimarudogaman Stop the cap brudda
i started playing tf2 at a time when cosmetics and weapons were already a thing
the weapons are genuinely a great addition to the base game
Love how the power port has finally been moved. Now someone just needs to put it on the bottom and cases will be looking much cleaner without those 2 cables jumping over the MB!
They will never do that, that would be such a pain in the ass to deal with it.
@@macicoinc9363 How so? Moving a small part on a PCB?
how would you access it on the bottom? it would be completely blocked by the motherboard
@@cgiacona I meant the bottom on the back clearly, but even still, larger GPUs extend past the MB anyway.
It just depends on how you customise your own pc, I have a 3090 FE, my 12 pin connector looks super clean with a braided 12 pin to dual 8 pin connector it's not in the way at all.
Maybe it's time for you to prepare some Pytorch/Tensorflow benchmarks. That could be more the target applications of these cards.
Are there even drivers for the GPU's AI acceleration?
@@MaddTheSane They do make drivers on Linux and that's their main purpose. Our company it considering to use this product to do some AI computing in order to prepare the furture chip ban that might happen.
@@MaddTheSane we are using this to avoid us sanctions
I'd support them just to break the duopoly we have rn honestly.
What chinese brands did to the smartphone market was a great thing for everyone. Folks here in southeast asia can finally get high-end spec phones without it costing arm and leg.
AND cheaper competitors also forced the likes of Samsung and Apple to be more competetive in terms of features/quality/pricing.
The same kind of thing happening to happening to GPU market would be a net positive for consumers.. Just saying.
I agree 100%, Americans companies are scrambling to stop them because it would break their oligopoly on the market and they would actually have to make good competitive hardware.
Agreed, I'm tired of +5% performance and +50% price.
yeah but they also use that cellphone architecture through huawei to basically put a strangle hold on the communications networks of developing countries and spy on people so although its great for normal people in the short run it will have nasty long term ramifications for countries like bangladesh and malaysia
@@yono_yume7083😂honesty
It might take them a decade but once they get halfway decent cards they'll just flood the market.
About time someone puts the Nvidia/AMD duopoly on their toes.
Doubt it. China is known for crap
Try 90 decades.
You know, how long it'll take to get back on their feet after china collapses into civil war(again).
I'm gonna guess that by their 3 or 4th generation they will have cards comparable to the cards Nvidia, AMD, and Intel will be putting out at the same time. They are getting a share of that $1.4T and that helps tremendously.
@@skyisreallyhigh3333 And what are they going to build them with? Hopes and prayers?
15:08 that "2 years for this GPU, 3 years for your screwdriver" is exactly what I'm thinking. The effort to even get a GPU working in whatever way possible is massive. From there you need to work on various features (like Tesselation) and likely revisit the hardware design multiple times to get it right. But 2 years is a pretty short time for the initial product.
By the same token, though: The LTT screwdriver favorably compares to name-brand, high-quality ones. The GPU does not.
People can shit on this product all they want, but 2 years from nothing to an actual product is crazy. If they can keep up the pace excited to see what they can accomplish in the coming years. Amazing what state driven investment into technological innovations can accomplish. It's almost like having a government that invests in it's own infrastructure, development and progress instead of one that spends all it's money in bombs and 800+ foreign military bases while it rots from the inside out is maybe superior. Weird how that works. Probably has nothing to do with how China has managed to lift 850 million of its people out of poverty. 🤔
@@rfouR_4 CCP bot
@@rfouR_4 they literally licensed the IPs from another company, its not like they did all the RnD and all that themselves, they just put the money down to put themselves in the position to make a glorified 1030
2 years is the time spent on sourcing, sanding and reprinting😂
I like, that they use EPS. 400W with one, not fire hazardous plug, seems only logical to go this route.
How many fires started with the 4090 power connectors? Pretty sure it was zero. A lot more AMD cards had vapor chamber issues
@@tylerclayton6081the only melted plugs were caused by user error, installing them incorrectly.
Mine has had zero issues.
@@username8644 No it is a user error. Didn't you watch the GN video on the subject?
@@Ruhrpottpatriot der punkt ist das der connector scheiße designed ist
@@username8644 "They designed a power connector is that very finicky, much more than a regular power connector."
It's literally the same style of power connector just with more pins.
"You should never be able to melt a power connector because it wasn't fully plugged in"
Improperly seated power connectors, no matter if they are inside a PC or not, are one, if not the most common cause for residential fires. And that includes stupid NEMA connectors as well as the CEE 7/x plugs.
The problem with the connector, as the GN video showed, is that the tolerances are very tight and it's easy to push them not as far as they should, which can be solved by looking a bit closer and pull on it to see if the clip has arrested.
Tight tolerances are not a bad thing per se, in case of electricity you want to have as little wiggle room as possible, especially if high currents are involved.
If you don't have that you get sparks at the connection point, which increase resistance even further (thus increasing heat) and can lead to the connections even fusing together.
" Grow some balls, use your brain, and start calling out these scummy companies who keep getting away with this crap."
I'm all for more responsibility for big tech, but this isn't a problem of a scummy company (remember: Fault rate is less than 0.1%) and rather users not doing their due diligence; ignoring bad cables caused by manufacturing defects and not adhering to the given standards, but that's not NVIDIA's fault.
Seriously Gamers Nexus did a whole series on that topic.
Hmm, PowerVR is a name in the GPU space I haven't heard in ages. I actually used to have a PowerVR Kyro and Kyro II GPU way back in the early Geforce T&L days. Back then it basically was the "Bruteforce" Hardware T&L Approach on Geforce vs the "efficient" deferred Renderer on Kyro cards.
powervr self-yeeted from desktop space and were doing a metric ton of smartphone gpus
@@PicturesqueGames No they were not making their own chip and was relying on partners to make the chip, and one of the partner (ST) decided to leave the market leaving Video Logic (as they were known at the time) in the dust, and fighting against ATI and NVIDIA was not necessarily something they could over the time, they were not as big. But the original disparition of the PowerVR from the PC market was not Video Logic will to start with.
WOW VERY DANGEROUS SIR! !! 😠 😠 BUT THIS WHY IM SO LUCKY LIVE IN SUPER INDIA THE CLEANEST COUNTRY IN THE WORLD 🇮🇳🤗 , WE NEVER SCAM! WE GIVE RESPECT TO ALL WOMEN THEY CAN WALK SAFELY ALONE AT NIGHT AND WE HAVE CLEAN FOOD AND TOILET EVERYWHERE 🇮🇳🤗🚽, I KNOW MANY POOR PEOPLE JEALOUS WITH SUPER RICH INDIA 🤗🇮🇳🤗🇮🇳🤗🇮🇳🤗🇮🇳🤗🇮🇳
Think I had a cheetah card, well that's what was on the box at least
I had an old Power VR GPU many many years ago, they used tile based rendering and avoided rendering anything not visible. Cheap cards that did well price/performance wise vs the ATI and 3dFX cards at the time. They had a few compatibility issues with newer versions of Windows, and Nvidia did their usual dirty tricks to help torpedo them.
PowerVR divides the scene into tiles and renders in on die memory.
@@atomicskull6405 Every modern GPU is tile based now
Ah occlusion, didn't Nvidia use that around the Geforce FX line to prop up its rather questionable performance. Once the "Oops you sure you need 24bits textures" trick didn't get that much more performance they put in one of the drivers a simpler trick, knowing the path of the camera in the benchmark they pre-occluded it. Aka, not use on chip/in driver occlusion but recognized the software and ran pre scripted blacked out areas.
Of course a free flowing camera POV broke that little lie.
@@scheeseman486 So PowerVR was right then.
TLDR: The card is effectly a card which has performance that varies between a gtx 1030 and 1660 but which also consumes 250 watts of power (the 1030 consumes 30 watts I think they said)
Thanks
In the grand scheme of it, the fact someone even made something that works is pretty bananas. I sure as hell can't make a gpu.
@@nexusyang4832 then again, you probably don thave a team of Electric Engineers, Folks who understand Discrete Structures very well, Pro's in Assembly. they do, id expect them to have a good working GPU that didnt consume as much as an RTX 3070 while giving 1030 performance
Imagine using more power than a 1080ti a delivering the performance of a 1030. Oof and all that probably after stealing nvidia's tech.
yes maximum power draw for 1030 is 30 W and 1660 is 120 W
The 3070 is 220 W
I wouldn't underestimate them, it's a giant step forward for them
Manufacturing high end semiconductors is the single most technically challenging industry and they haven't been in the game for long. The west kinda forced their hand by preventing China from sourcing high end chips elsewhere, and now I'm worried it may have been too short sighted
A great leap forward* for them 😆
@@z_nytrom99lol
I mean honestly. If anything the ban of them from importing certain chips will only increase there investment in domestic manufacturing. Why would you back down when your sole sorce of something essential to your economy threatens to be cut odd
@@putinslittlehacker4793 true this was a very dumb move from the perspective of the west, it’s not like China is some small undeveloped nation, they would easily be able to develop industry for making high end computer chips if they wanted to 😂
Eh it's great for us consumers that there's more competition.
I actually really wanted another big gpu competitor considering the gpu market that we have...
Highly unlikely for another decade minimum. Nvidia and AMD have been in the game longer than the rest and know the ins and out of GPU design. Hence why Intels cards were laughable when they arrived. Price only got out of hand with the newest cards because upto 5 year old cards are still worth while for almost everything. New games are finally starting to push the envelope and ruin performance on older cards, even then you can just drop some settings and get solid performance again.
They do not make those GPU to compete with anyone , they are making them so they wont be left without anything if USA does something horrible again in attempt to have complete world control.
Agree. But what's holding back from the market having available GPUs is fab capacity for making the GPUs and a lack of board partners willing to make graphics cards because of low profit margins.
@@Rabolisk also, even then, nothing will stop woke game studios from releasing games that wont run well even on 4090, neither on consoles.
@@smittyvanjagermanjenson182 A decade or two is nothing...
It's a really impressive GPU for only 2 years of development. And due the western restrictions, the support should be better on Linux, Vulkan, and others open technology's
is it even possible? 3060TI in two years?
@@seppi3201 well, with the same power efficiency I doubt, but in raw performance maybe we see in a cople more year with more mature drivers
corporate espionage will do that
@@brianransom16 Industrial espionage is not restricted to the CCP, Western companies do it whenever they can, AMD for example started with reverse engineering Intel chips. Nowadays, the most practical and legalized way is to hire the competition's main engineers to develop “new technologies”, a strategy that all the big technology companies use all the time. I agree that they wouldn't develop as fast from scratch, but nothing starts from scratch. I think the difference with CCP-affiliated companies is that they are less willing to hide spying as they are less susceptible to lawsuits.
@@seppi3201 When you dont GAF about IP or copyright law, its 100% possible. Didn't manage it in this case, but its possible.
Most importantly, their GPUs are based on Imaginative Technologies, which has been around since the 90s, but simply moved into the mobile (phone) space. So it's not as much a new 4th competitior, as a resurrection of one from the 90s
EDIT:
Oh, linus talks about it at the end! All is good
You forgot bout Adreno and Mali...
And somewhere is still alive VIA technologies and S3 Graphics...
Don't forget about Dreamcast
@@damienkram3379 S3 Graphics is a skeleton crew within VIA that only works with Zhaoxin in China making iGPUs. VIA hasn't funded a new GPU architecture from S3 since ~2012 and is still shipping the same Chrome 600 series cores (2-4 CUs max, DX11.1, OpenGL 4.1, no OpenCL) for a basic integrated option.
That's I assume mostly the GPU compute part. I doubt they also got made all the other IP from scratch, I wouldn't be surprised there was some stuff that was stolen like ARM stuff that isn't properly attributed (stuff they got from previous projects made in China where they got the IP legally). Not to mention the EDA tools, no way it's all 100% Chinese either.
@@damienkram3379 Adreno is an anagram of Radeon so you can guess where that came from. Mali is an original design by ARM, but its history goes back to 1998
i hope they succeed and give a proper competition. best wishes to them.
It's not like they can send a spy balloon to your house with a box that says "give it back"
edit: wow so much like
But they can send the virus through the gpu drivers #hidden_backdoor
This is actually why a lot of foreign tech companies are either closing or refusing to expand their offices in China right now. They are worried about hiring local Chinese employees who will then take IP and knowledge and start their own government backed competitors. ASML just sent a delegation of suppliers to countries like India, Vietnam and Indonesia to look for new locations to get local talent and build local factories. There's much less risk for them in these friendly countries with better IP rights.
The card IS the spy...
@@JohnSmith-vn8dmAlso it isn't like any of these countries can set up local competitor even if the people who previously work there decided to take the knowledge and ip and work in said supposedly new company
@@JohnSmith-vn8dmso just making a gpu is infringing on nvidia's ip?
moving the vrm's to the top of the card is kind of a simple idea to help vent the hot air dirtectly out the top instead of stuffing them in the middle of the board where it can heat soak easier
VRMs are easy to cool, placing them along the top stretches the trace lengths badly and provides asymmetric delivery to the die, leading to worse voltage drop-off - or spikes to compensate
The fact they have a better power connector setup than the Nvidia 4000 series is comical xD
The fact that you have no clue yet are talking big is just - sad.
Just a hint for the retards that will come crying:
EPS 8pin is specified by the same spec as the 12VHPWR.
Agree. The EPS12V power connector is better than the 12VHPWR . We have been using the EPS connector for decades, never seen them melt or catch fire despite seeing older Intel HEDT CPU pulling a crap tonne of power from them.
@@fleurdewin7958 the EPS12v is rated way lower and the the 12vhpr connector isn’t an nVidia creation, it’s a standard connector, so many idiots keep blaming nVidia for it.
The only ones that have failed have been due to improper installation.
@@oxfordsparky Even better - The EPS was specified by the same company as the ATX standard (from which the 12vhpwr comes).
But people are too ignorant to learn from their mistakes. The connector never was at fault.
@@oxfordsparky Nvidia still came up with it, and helped make it a standard.
This is actually very very impressive, considering their age in this industry. I am fking blown away.
But the card is crap for how much power it’s drawing. It’s actually horrible
I once had a GT 1030 in my system and upgraded to a 1660 Super. I can confirm that this comparison is about right. The 1030 simultaneously *doesn't exactly run like garbage* and *will bottleneck you hard.*
yeah iirc it was replaced to gtx970 everything run well and smooth just i never really tested it, but game on it alot, first witcher 3 gameplay also with that pc
it was forbidden msi tiger 1 fan not spinning after 2 years, and the msi board also failed, replaced it with gigabyte board
1030 are awesome for low power high end mame stations
The 1030 is a card to drop in pre-built systems. Buying one for a custom ATX system is moronic. A used GPU would be a far better value.
Yeah, gt 1030s are going for $200 to $250 aud new here.
Can get a used 980ti for the same price 😂
@@joshgts9675
You know speaking of things like PCI-5 and stuff, there's a game I have in particular called iRacing that supposedly has a lot of bandwidth related issues as far as sending data to the GPU because of the fact that they send the entire world space every frame. I'm curious with the impact of bandwidth over the bus actually has for GPUs particularly in a game environment that does such a thing and I wonder if anyone at the lab would be able to provide insight on that.
I feel like it's an under discussed aspect of gaming as a whole because we just assume that it's just straight up power that's needed but I think a lot of people forget that the CPU still needs to send instructions to the GPU and that data still has to get over to the GPU over the motherboard.
So you have a game that is just insanely stupid. It is sad just how bad most games, even the big AAA games, are coded. they literally waste a decade of hardware-improvements just by being retarded programmers.
Was one of the strange things when DX12 was announced: OpenGL already offered low-level access and nearly nobody used that cause it is a lot of work and easy to get wrong. And with Dx12 and Vulkan the same problems came up. Not only that we have seen that the code it self is yet again mostly bad to the point that the drivers of AMD and Nvidia have to do the heavy lifting by dynamically reshuffling data as the code as written would just result in slideshows if anything.
Except an RTX 4090 can operate at 98% performance in a PCIe gen3 x16 slot than when in a PCIe gen4 x16 slot. We're only just now seeing GPUs that are finally outgrowing PCIe gen3. Im betting its at least two generations, maybe 3 before a card is knocking on the door of PCIe gen4 x16 limitations.
@@racerex340 this depends on how heavy PCIe bandwidth usage gets in future with likes of direct storage
remember were starting to see PCIe 3.0 show age after that many years,4.0 is 2x bandwidth off of 3.0 which even with direct storage could take years to be fully saturated
@@racerex340 But LithiumFox point is that's application dependent and there are some applications where more bandwidth is necessary. One such application might be AI and applications that benefit from memory pooling of multiple cards. Nvidia's solution to this has been Nvlink, but a higher bandwidth PCIe connection could be a cheaper alternative solution. This could also perhaps be used to bring back multi-gpu gaming like SLI, without needing special sli connectors between the GPUs. Anyway, there are definitely situations this could benefit.
I think we definitely underappreciate how well some of our stuff works together.
PCS in general are extremely underappreciated.. The tech is absolutely insane .
True. This video is nostalgic. I got same vibe about 10 year ago when china got banned from only International space station. The best way to stop china progress to banned and then mock them. it clearly work
@@reuven2010 Wdym?
@@warymane6969 😂😂they made 7nn chip
THAT'S CALLED P-R-O-T-O-C-O-L-S
The specifications of this graphics card are very high, similar to the GTX3060, but due to driver optimization reasons, it has not fully utilized its true performance. In recent months, this Chinese graphics card company has been constantly updating and optimizing drivers at a high frequency, the performance of this s80 graphics card has greatly improved compared to before, and it can now support most mainstream games. Its GPU usage during operation is currently less than 20%, so its potential is still great. Currently, its price is only $163, and the inventory of product distributors was quickly snapped up. They are preparing to launch the next generation of the S90
Seems obvious to me that it is productivity or research oriented. That's why you see the AV1 support as well as Pytorch and Tensorflow. It also explains the higher memory capacity and Tensor-like cores, typical for deep learning cards. So it would be fairer to run some DL throughout benchmarks on this.
Except all the productivity suites they tested won't start, while some games did. It's a glorified prototype, a marketing stunt. They re-purposed their server GPU for consumer use, and it's still useless. If they're able to iterate maybe they will succeed, but it depends on how much money the Chinese Government are willing to spend on Moore Threads
@@fostena They issued the card to prove to investors that they could deliver a real product. Second, many people in the West do not understand that not all random enterprises can be funded by the government. The Moore Thread started with private capital, and they need to convince investors to invest more.
@@guzilayerken5013 you don't need to explain to me public-funded enterprises! I live in Italy, we where almost a "socialist" country once, by USA standards 😄! We got plenty of government intervention in the economy. That's what I said, by the way, the card is a marketing stunt, it's a proof, a prototype
@@fostena Maybe it runs well under Linux. China will probably not optimize a product for an OS they won't be able to legally update next year.
@@fostena Nor really. This card are not design for general use, such as gaming. It was designed for machine learning.
Even if this is absolute crap, it's backed by - true murky - capital and does something. Like you said "I wouldn't be able to do it" and on a pertinent level this is pretty much on point. We need more players. We need competition coming in from other directions. Give it time and in a few years we might get something from them on ARC level, a few years after that something that can compete. Voodoo wasn't supplanted by NVIDIA Riva TNT in one generation. These things need time and a steady flow of cash. I'm not saying a world where the fastest GPU4s are china made is a better world to live in, i'm saying competition will bring prices down between competitors.
These chinese companies will never be able to sell their products in the west, all that stolen tech would make it impossible unless they are prepared to do legal wars until the world ends. so, you, me and everyone else in the west dont win anything from this.
@@pauloa.7609 Predictions based on wants or should be don't mean much. The world is changing, a family being able to sit on their laurels for generations because an ancestor thought about putting a piece of plastic at the end of a shoelace is, when you take a step back just as ridiculous as chinese made video cards.
There is never a patent on a result, just on a means.
We will have to see.
9:21 I like how supposedly "China-only" card has FCC and CE logos.
That is probably the chineese CE logo.
It looks almost the same.
@@venosaur121212 It has both the Conformité Européene And the FCC logos (Or at least the spacing in the CE logo ondicates is the Conformité Européene logo)
Not surprisingly, most of the motherboards like Asus, Gigabyte, MSI, etc., are manufactured in China and printed with FCC and CE Logos, even some models that are only sold in China.
They made this only for 2 years? Wtf that's actually impressive in such a really short period of time they made a working GPU, if given more time to produce GPUs then I guess they have a higher chance to being more compatible to more hardwares
If China succeeds, Nvidia and AMD will cut prices significantly,That's great
Hope so for the consumer market. Nvidia's prices are completely out of wack with what consumer's are willing to pay, AMD is no better and the stock is always out anyways. It's time an actual disrupter came on the market to break the diopoly.
Maybe the compatibility issues (especially with certain game titles) are a feature, knowing a bit about how much China's GOV loves to censor.
The 2 years isnt that impressive when you remember most products home grown in china are from stolen tech lol. They dont innovate, they steal and make rip offs. The ultimate chinese knock off!
Lol, brought technology from UK, Imagination was the company offers graphic technology for Sega Dreamcast. Nothing marvellous.
Whoever edited the intro is based and an absolute mad lad, let me shake his hand😭😭
Exactly! Surprised more people didn't notice
Hearing linus say "Thanks pal ❤" Was so heartwarming.
The gpu itself isn't scary.
The scary part is that China was able to put that gpu together without having an international supply chain like nvidia or amd does to make their gpus.
All the more, this iteration is close to the 3060ti which you might say "oh but that's mid range" but you would be wrong because for a lot of people a 3060ti is a high-end gpu.
Not scary once you realize it's all stolen ip... As usual
What china is more then capable to manufacture GPUs like this they already make 75 percent of the world's products. Idk why people assume china is a third world country they are not they a highly developed industrialized country. The usa knows china is the only one with that can rival usa tech which is war the usa is in a economic war with china we've stopped selling them chips it don't matter.
What do u mean by scary💀 it's a freaking gpu bro not a next gen deadly weapon
It's scary because a long supply-chain is currently needed to make computer chips. A collaboration effort between Taiwan, Japan, Korea and the US. These are first world nations that need each other to make these chips.
Now because tensions between China and the US, China basically said they were going to make their own chips with blackjack and hookers and they actually did. They are on their way to beat those other nations at chip manufacturing.
II don't mean it as in "ohh scary, china bad". But the fact that they actually pulled it off is insane. China is already becoming this century new super-power.
Good for China, I have purchased Blackview phones from China, love them.
Hearing Linus say " let's try DP" that'll scare anyone for life 😂
Heard pretty much the same thing 15 years ago about Chinese cell phone.
Looking forward to the next 15 years.
I think the biggest difference there is there wasn't a burgeoning economic/tech war brewing at the time. Who knows how this will pan out, I"m only saying there's a bit of a different context here.
@@mobiusflammel9372 hmmmmm, how about ISS
82Hz refresh might be just half of 165Hz that many monitors use
This is obvious, but the whole video is built on making fun of gpu problems.
@@ОлегЖданов-ъ1д but china bad , hahaha
I like how they managed to get AWESOME cooling and next to zero noise operation and have low price. Of course, hard restrictions on MBs, and not allowing to launch games after warning they are not tested and supported but perhaps they would work...
I actually kinda wonder why both Intel and China didn't concentrate on Vulcan support. Force games to support Vulcan as main instead of DX12 and everyone would be happy, with exception of developers.
dx12 is alot better than vulkan lol
@@keigansabo9330 how?
@@keigansabo9330 as nitendo pirate i disagree,with integrated gpu i can run scarlet
if it's an AI oriented you could test it by running a basic training on it with a ready base, put it through 10-20 epochs see how long it takes, compare against cpu performance and some entry level gpu like the 1660 in this video
I was kinda interested, so I looked around - couldn't find anything. According to some people who played around with it, it's quite difficult to get the Linux software for those cards (UnixCloud has some pretty old drivers for previous MTT GPUs on their website, but not for the S80/ S3000), and I'm not sure anyone managed to get access to any Torch builds for MTT GPUs. And no code has been submitted upstream at all as far as I can tell.
Probably need to do it in linux tho. This card is not mainly built for windows.
How? Does torch, tf or any other ml API support these cards?
@@fcukcensorship783 the screenshot from their website they showed said it supports pytorch.
That power connector at the back side of the card should get back to all GPUs as in the pci-e spec reference design...
It's likely this GPU isn't designed for gaming, but for other applications. Like trying to play games with Quadro instead of Geforce GPU's. It's likely designed with computational power in mind (AI, cryptocurrency/blockchain, professional level 3D animation, etc.) and gaming is likely an afterthought.
Except quaddro's still deliver stellar gaming performance because they have rock solid driver support.
It's a lot more likely people underestimate the amount of work that went into driver support for games over the years. For a new player to try their hand at it, this is impressive, but obviously not a good product for actual use now.
Feels like some kind of rasterization module slapped onto a GPGPU/floating-point accelerator.
As they said, it's a repurposed server GPU but this version is supposed to be for gaming and they didn't do a great job at it since it can run only a handful of games at performance levels of a 1030
The way I see this is that China does have the market there to support a new GPU company to start from scratch, they might able to come up with brand new tech tree that perform better and cost less at the sametime, just like all the other things the made in China. I think overall it's a good thing, judging by how Nvidia is doing right now, I'd say the more competitors the better for us gamers. Just give them couple of years, we will see.
Also to just add a bit. They are licensing the same GPU IP that ImgTech sells to phone makers. They simply don't have drivers done at all. The GPU most likely is stronger than what it actually shows but can't use the performance due to bad drivers. Moore Threads did the hardware, worry about software later. However, we shouldn't expect more than 1660 in gaming even when it's fully optimized.
The reality is that this is a CCP company that is going to make chips for MIL/AI use in the long term. While they would probably like a commercially viable product it is unlikely they will be able to do that any time soon.
I am glad Chinese tech companies are making GPU for gaming market. This will change the balance built by AMD and NVIDIA. I believe both of them will release more competitive products under the pressure from outside.
@@thinkingcashew6 do you have the idea that every single company in China is controlled by the government? Well, guess what. You are wrong.😢😢😢
I'll never run one, same reason i won't own Lenovo, they are spy tools of the CCP, if you want to claim no way China would slip in code into the firmware to spy on westerner's, I have a bridge between Tokyo and New York for sale are you interested in buying it?
If it is their first product that's actually nice. It could become better really fast after experience.
just limited by their fabrication capabilities.
Not having 7nm or below is going to be a HARD limit.
@@zaxwashere they do have 7nn chip tho
@@didyoumissedmegobareatersk2204 oh neat. I didn't know that. Kinda surprising that they did it without EUV, but we'll see if it hits any sort of production.
I correct my post
Gonna be tough without ~~7nm~~ uhhh
5nm
@@zaxwashere Their 7nm was found on a mining chip. This tells us that its probably not a mature 7nm node (meaning yields are low, actual performance isn't there). However, the fact that it exists by itself is a big deal, because its like getting your foot in the door to greater things.
Nvidia has the H100 out with pcie5.0 for a while, but so far it seems unnecessery for the rest of the gpus that it is not worthy for them to add support for it.
I don't think even the 4090 fully utilises even pcie 4.0 much less 5.0
H100 is not a graphics card, it doesn't even have display outputs.
@@h_alkhalaf Yea it's an ai accelerator card
@@h_alkhalaf It's a graphics card mate, a GPU in a card format. There are many, many GPUs that exist without display outputs. Look at any "mining" GPUs that are quite literally 100% complete replicas of consumer GPUs with absolutely zero differences apart from lack of display outputs and different drivers/firmware. Nvidia literally called the H100 a GPU in the full product name, "NVIDIA H100 Tensor Core GPU". I cannot stress this enough, that just cause a GPU has no display outputs does not mean it isnt a GPU.
@@PhantomMattcraft it doesnt support directx, opengl or vulkan in any format, and is therefore incapable of actually rendering graphics
It makes ARC look polished...
Waiting for the 1 month of moore threads challenge :)
ARC has recently become polished after Intel fixing their terrible drivers, making them much better.
-999999999999999999 social credit
不好了!
How many canadian social credits is that?
@therealfakenews2274 chinese bot replying to 1 year old comments
@@purplebeast8536canadian thought police?
@purplebeast8536 what?
I would be interested in a revisit to see if and how the drivers have improved or not
It's a mobile GPU replicated badly to get to desktop tier performance. It will take a while but eventually it should improve.
it does support games in every week, but still it will take a long time to compete with even 1660ti, right now, in some games it will perform similar to a 1660ti, especially in some old games or some chosen games, but general speaking, it is still not even close to 1660ti, which is pretty sad. the only reason that you would like to buy this card is you want to support a third party company to compete with those larger companies, other than that, just go with AMD if u only play games
That part at 0:30 had me dying
As meh as this is, I'm still really excited that someone else is stepping in. They clearly have good engineers, but software will come with time. I mean it took AMD 10 years to make good video drivers. Any competition is good for the market.
TBH their drivers still kinda sucks even to this day xD
别考虑什么竞争了,这家企业能不能活下去都是未知数。制造他的工艺是tsmc 7nm,这意味着即便活下去了,由于无法自主制造,会和华为一个结局
At least on Linux amdgpu and the open source userspace drivers have been rock solid in my experience. Windows sadly gets shafted again
Idk, they didn't seem to be interested in making gpu for anywhere except china and probably the plan is that they would make everything themself and just ban out all the western made hardwares, and still compared to intel arc at launch, that still have better performance, stability and support than this card so i'm more excited on seeing where intel going with their gpu
Hope they improve sooner and compete toe to toe with the big 3. Definitely a win for consumers.
@@jack99889988 Most likely. America is terrified of competition.
@@ctrash the "competition" is a gtx 1030 running at 255 watts
their first consumer grade release. really rooting for this company to catch up. they don't have to release it in the US. there's the rest of the world you know
@@keyrifnoway people said the same thing about Chinese phones. Now they sell hundreds of millions of phones every year. Even without the us market. (Same for Chinese electric cars they’ve started to gain marketshare in southeastern asia, Europe and middle east)
China has the money to burn to play catchup unlike most countries. They’ll just keep throwing money at it until it becomes semi competitive. In a few years it’ll sell like hotcakes in Asia then a few years more Europe.
@@jack99889988 still definitely a win for the other 97% of the world population
As a Chinese ,I firmly doubt 1:39 ,We only know that it's the first Moore Threads gaming GPU, complety domestic and of course only the geeks would buy it to run benchmark. NV was ordered to limit the export of cutting-edge GPUs to China, and the goverment esbalished a project to fund those domestic chip companies who develop GPU CPU chips . Moore Thread is just one of those companies.
他说的没错啊
There is more companies who will build GPUs?
Gotta give credit where it's due. The naming of Chinese companies especially the English names are fire...I mean "Moore Threads"? C'mon....that's genius 🤣
Gotta give it to them, the English namings are always something. Like their recycled lithium batteries that were branded fire explosion or some shit lol.
Having spent plenty of time in South East Asia a good English business name is more of a happy "infinite number of monkeys with typewriters" coincidence than inspired genius.
I'm disappointed you didnt have labs benchmark this card against the GTX 1660, RTX 3060, and GT 1030. i think i would have prefered that over the nonscripted gameplay tests
I'm guessing none of the benchmarking tools actually functioned given the poor lack of support this card seems to provide.
Why waste time
should keep in mind that League of legends runs mostly of your CPU not GPU. they kept it extremely low performance so it works on pretty much everything
yeah if you're running on the lowest graphical settings. Mainly because all the number crunching and client side processing is CPU based (pretty standard). But actually increasing the graphic settings of course increases GPU usage by a wide margin. Essentially, a poor gpu can run league on the lowest settings. A moderate gpu is needed to run league at higher settings, let alone 4K on top of that.
@@tristanstebbens1358 yea but it still maxes out at ~25% usage on my 6700 XT so anything above a ~1060 should run the game the exact same (within margin of error)
In fact, Our chinese harware fans think that two years are really a short time for GPU manufactring. It can work ! We cant wait to see more products. My friend bought a moor gpu to play games , although it's not very stable. ( Moor did not sell this gpu to the market, beacuse they know it is a dev version. Brand repuation matters very much)
Jayden. China numba waaaan!
继续等着看吧,摩尔还有很长路要走…
@@jobturkey7418 number Yuan!
nice to know that even in a video sponsored by Xsplit you chose to use OBS.
Honestly if they just focus a ton on drivers this seems like it could be promising. They have already proven they can get decent performance in some games.
Interesting and reminds me of the nineties when mobos and video cards were released with drivers that had not 'matured' (not forgetting games that ran like snot until they were patched numerous times)
Really interesting to get the detailed take on how things actually get built in the real world in the last 4 minutes
Yeah this seems like a re-painted general purpose GPU (probably for AI and maybe crypto). All you need is one person to make it compatible for whatever GP GPU application and then it becomes super useful. Given the high VRAM to core count ratio, I'm guessing loading big datasets for AI purposes is the priority
Or they are just trying to scam their way into some sweet govt funding. Lot of chinese companies have done that in the past. With one trillion $ there is enough for everyone to share.
a gpu is as useful as you want it to be.....
Copied chinese tech? Noooooo.... lol
@@Sparks95 not true in soo many ways,nvidia is perfect for blender,amd is linux(nvidia runs poorly on linux)
@@zhanucong4614 interesting, what are homegrown Chinese gpus suited for?
I don't care if this gpu only exists because of the US/China tech cold war, having another competitor in the market is good for consumers.
The more the US sanctions China, the more innovative and self-sufficient they get lol.
We actually need this card. A cheap gpu out of china that threatens the gpu cartel|s market share is exactly what we need. If they fix their drivers it could be viable for 1080p as long as they keep the price down.
US never ending ban game won't let it happen. Your blood is for them to suck, not others.
Yooo I watch linus's videos just by looking his face on the thumbnail😂, like how old the video is doesn't matter 😅, it's super informative and entertaining❤
12:18
"Yeah, let's try DP" 🤣🤣🤣
Sebastian, Linus - Mar/11/2023
09:39 True, and Based
I would really like to see more content with that card
honestly knowing that the boys at home are making progress is enough to make me proud no matter how bad it is
Just bought a PC from Build Redux yesterday. Glad to see they are still a sponsor, and even more excited to try it my new PC out!
Just got my first pc and got all the parts from your $1000 budget pc. thanks man!
It's insane how 1000 dollars is considered a budget system nowadays..
@Kyle Cruz I was gonna scroll past until I saw a reply. Budget doesn't mean cheap or inexpensive; it's just the amount of money you plan to use. You can have a $550 budget for a PC, or one that's $2,550, and that's still *a* budget.
That gpu shroud design looks clean AF
Seems it has some drive issues since from all specifications it should not have such low performance, and considering its very small support list they are probably not working on maximizing the performance rather than making it run first.
China having a domestic GPU with limited SKUs is a boon for gaming in the West. It'll give developers a singular target to optimize for.
14:05 - TF2 is the GOAT for me. While this would NEVER happen, if Valve ever released a TF3 and it was good, I would be over the moon. I don't think I've had more fun in any other FPS, other than the OG Halo trilogy. Overwatch feels like the spiritual successor but it never got me hooked like TF2 did.
i wouldn't be surprised in the least if the "gaming" gpus were qc rejects from the enterprise production line.
It is pretty much common knowledge to Chinese industry insiders that they bought the GPU's soft IP from imagination technology from the UK. The same company that licensed Apple way ago with GPU Soft IP architectures. But again, there is so much more to get a GPU out besides having the RTL.
Edit: Also, given that they bought Imagination's IP, I don't think they actually stole any patents or IP...
And yet we regularly fine Chinese tech spies here in Korea. Chinese are investing huge money on stealing things and it's not just CCP and techs.
泡菜要变成韩国的了 😂😂
@@xiaoteam575 不是吗??
@@qixun1127 难道是吗? 白菜是中国土生土长的植物,韩国根本就没有,每年还要从山东进口大量白菜,现在你跟我说泡菜是韩国的?
@@南霁云-w6u 山东做泡菜?
I myself own a GT 1030 and for the price it's actually rly good. It definitely does more than I ever thought it would
no, its not, because performance per $ is much worse than mid range gpus
@@Tonyx.yt.issue is that I'd you can't afford a midrange gpu/ don't wanna spend a few hundred bucks, the FPS/$ value is irrelevant.
@@Tonyx.yt. "performance per $" doesn't matter if you have limited amount of money. In this case you have a "performance per $" sublist to choose a card from.
Indeed this is a threat and a good competitor for the AMD and Nvidia. The same has happened to the phones when Chinese phones took a huge part of EU and US market. US government acted fast and banned Huawei before it could took the marked completely.
With that huge funding the new GPUs will pop up in 2-3 years easily. Also, TSMC factories which are the only in the world capable for 4nm lithography are under the China influence.
TSMC is not under Chinese influence unless they decide to invade Taiwan, lol
Also, Chinese phones were never really that popular in the US. They really prefer to spend their money on needlessly expensive iPhones instead.
Chinese lithography? Puhahaha!!!!😂😂😂😂
The coughing reference was just so subtle and on point, well done Linus 😂
I think it would be only fair to run this card on the fully Chinese P3 processor with the Chinese board or the highest, and one that is available from China then see what we get.
With cards like the Intel ARC 380 I can see people buying it to get enough monitor support, 3D acceleration (CS:GO will play with enough framerate), encoding options and hell, maybe some use of the AI possibilities. It will also allow your CPU to run more smoothly as an APU also uses system RAM for video. Some people (like me) don't care about high end gaming at all, and find these kind of GPU's perfectly fine. I don't have any reason to upgrade my RX 580 anyway, which I managed to buy when it was still considered a "medium priced" GPU at around €210, and it has fine performance and FAN stop for my silent PC.
RX580, lol.
你真的很勇敢,摩尔线程造这张卡就是为了告诉所有关注者真的在做。
另外,如果你能经常反馈一些bug,那么这个GPU的兼容性就会越来越好
@@rickyray2794 I had an RX580 up to a few months ago and it ran 3440x1440 on my coding workstation flawlessly. Didn't have any issues when playing with CAD and a few other 3d things either. So no, not 'lol'.
@@rickyray2794 you know nothing about GPUs. The rx580 is a solid card
Got to love Adam and his love for TF2! Really want to see him cover it more on testing
And it's actually a rather interesting game to test because of how CPU-bound it is, and how much clock speed it needs for achieving competitive framerates
It's crazy how much less Linus waves his hands around when he's carrying an ultra rare component.
Who cares? It's garbage.
i recently bought a desktop with a display port i dont recognise. this video taught me thats it is display port 1.4a. thanks linus
0:32 is that a COVID reference?
Moore Thread has released a new MTT S70 graphics card on May 31st: 3584 MUSA core, 7GB graphics memory,
It is expected to release a new driver supporting DX11 at the end of June, supporting games such as Genshin Impact and Dark Soul 3
ccp bot
@@AceGigaloCIA bot
Developing GPU is not only on hardware, software is important too. It's easier to get more TFlops than suitable drive software and gaming software engines.
Cue the Chinese bots calling every minor issue fake and western propaganda instead of just being satisfied they made a GPU in 2 years at all.
This just makes me feel more validated for buying an Intel card.
How is your experience with the card?
Nice try Mr Intel bot, you're not fooling us.
@@1IGG the card is getting better with the driver updates
@The Big Sad Given the absolutely brutal expense of building a modern CPU/GPU, basically the only countries that would ever try to develop a homegrown semiconductor industry from the ground up (+/- a couple stolen IPs) are those who are in opposition to the current US/dominated world order, who don't have reliable access to established manufacturers.
In the modern world, that's basically only China, Russia (lol), and maybe sort of, kind of, India.
@@FNLNFNLNNorth Korea?😂