New GPUs are Bad??... "F**k it, I'll Do it Myself."

  • Published 22 May 2024
  • Check out Dylan's DIY GPU!: www.furygpu.com/
    • FuryGPU displaying Win...
    Dylan Barrie made his OWN graphics card. It's wild. Making GPU hardware and software is extremely difficult; we've seen companies like Moore Threads enter the space recently, and their launch has been quite rocky. Intel's GPUs have improved SIGNIFICANTLY, but it is clear they still have hurdles to overcome. These are companies that have tons of resources and insider information... so how did Dylan manage to do it on his own? Let's take a look!
    ==JOIN THE DISCORD!==
    / discord
    GN: • China's Moore Threads ...
    • Intel Arc 2024 Revisit...
    LTT: • China doesn't want me ...
    store.steampowered.com/hwsurv...
    www.mthreads.com/
    www.xilinx.com/products/silic...
    www.amd.com/en/corporate/xili...
    0:00- New GPUs are....
    0:51- Making GPUs is hard
    4:12- The ULTIMATE DIY Graphics Card
    6:46- The Challenges
    8:10- What about the chip?
    10:12- How does this fit in?
  • Science & Technology

COMMENTS • 725

  • @sanishnaik2040
    @sanishnaik2040 Місяць тому +1732

    Im from nvidia. This guy has to be stopped.

    • @AbbasDalal1000
      @AbbasDalal1000 Місяць тому +479

      Im from this guy nvidia has to be stopped

    • @KianFloppa
      @KianFloppa Місяць тому +65

      amd better

    • @e....3235
      @e....3235 Місяць тому +35

      Nice and AMD has to be stopped

    • @danieltoth9742
      @danieltoth9742 Місяць тому +138

      I'm from stop. Nvidia and AMD needs to be this guy'd.

    • @udaysingh9_11
      @udaysingh9_11 Місяць тому +70

      I'm in this guy, he wants me not to stop.

  • @ClamChowder95
    @ClamChowder95 Місяць тому +640

    I think he is underselling his work. This could be a stepping stone for future open source GPUs. He should be incredibly proud of what he did all by himself.

    • @k_kubes
      @k_kubes Місяць тому +56

      This feels awfully similar to how Linux started: basically someone toying around with a concept, not expecting the project to become that big of a deal even after going open source, and it ended up exploding in popularity.

    • @jasonstephens6109
      @jasonstephens6109 Місяць тому +29

      Yeah, this guy is laying the groundwork for something with huge potential. Even if an open source GPU never competes with the big boys, it could enable niche features that become mainstream.
      There's also the potential for custom instruction sets that give new life to defunct GPUs.

    • @jnharton
      @jnharton Місяць тому +8

      @@k_kubes Linux was more than just "toying around", but it was definitely a personal project that was never originally intended to go public, let alone become a mainstream OS.
      I actually saw some files once that were supposedly part/all of Linux (maybe just the kernel) prior to version 1.0.

    • @harrydean9723
      @harrydean9723 Місяць тому +2

      this should be a thing

    • @xavierrodriguez2463
      @xavierrodriguez2463 Місяць тому +8

      Open source GPU to go with RISC-V

  • @AnimeGIFfy
    @AnimeGIFfy Місяць тому +638

    open source software + open source hardware. this needs to happen if you want anything good to happen in your life.

    • @DengueBurger
      @DengueBurger Місяць тому +21

      At-cost open-source GPUs when

    • @DengueBurger
      @DengueBurger Місяць тому +25

      The Linux of GPUs

    • @TheDoomsdayzoner
      @TheDoomsdayzoner Місяць тому +12

      That's how we evolved as a species: when "secret" sciences like algebra and geometry became available to everyone, and when "secret" tech like household appliances became available to everyone.

    • @wwrye929
      @wwrye929 Місяць тому

      Making the hardware would be pricey, and that would make it harder for game makers.

    • @AnimeGIFfy
      @AnimeGIFfy Місяць тому +4

      @@wwrye929 im not talking about everyone making their own hardware from scratch

  • @SIedgeHammer83
    @SIedgeHammer83 Місяць тому +524

    Voodoo and Kyro GPUs need to make a comeback.

    • @bryndaldwyre3099
      @bryndaldwyre3099 Місяць тому +17

      Imagine this gpu being able to use a glidewrapper.

    • @xgamer25125
      @xgamer25125 Місяць тому +66

      Wasn't 3dfx (maker of the Voodoos) bought and absorbed into Nvidia..?

    • @RAgames-mc3kp
      @RAgames-mc3kp Місяць тому

      Yes i think ​@@xgamer25125

    • @gh975223
      @gh975223 Місяць тому +31

      @@xgamer25125 it was

    • @itsnot1673
      @itsnot1673 Місяць тому +9

      And matrox
      maybe even trident

  • @POLARTTYRTM
    @POLARTTYRTM Місяць тому +209

    The guy who made the gpu is the real "he's good with computers" guy.

    • @thevoidteb1285
      @thevoidteb1285 Місяць тому

      Did you listen to him in this video? Hes a novice.

    • @akazza69
      @akazza69 Місяць тому +1

      Why the heck am i experiencing a deja-vu
      This comment
      This reply above me
      I am getting crazy

    • @POLARTTYRTM
      @POLARTTYRTM Місяць тому

      @@akazza69 That's normal, we all have it from time to time.

    • @0xD1CE
      @0xD1CE Місяць тому +7

      @@thevoidteb1285 Being a novice with HDL programming and writing windows kernel drivers does not mean he's not good with computers. That's like saying I'm not good with music because I never played the Pikasso guitar...

    • @yoshi596
      @yoshi596 Місяць тому +2

      @@thevoidteb1285 Oh is that so? Then go ahead and do it better than him. Notify me when you upload a video about your custom made GPU, I'll wait.

  • @Karti200
    @Karti200 Місяць тому +582

    I saw the news day one when it came out, and it just pissed me off how many out-of-touch people there were about it…
    Like… people literally roasted the creator because "it is too weak"... like c'mon, what the heck is wrong with some people?
    This is literally an openware / shareware version of a GPU made by a community; this is an amazing milestone if you ask me

    • @elysian3623
      @elysian3623 Місяць тому +76

      Let's be real: consumers themselves have no idea what stuff is, what is good for them, or how stuff works. They just consume, and they're easily fooled into consuming stuff they don't need; 90% of Nvidia cards have probably never been used for a CUDA workload, but they were sold based on their dominance in certain tasks.
      I still live in hope that AMD makes their software stack fully open and somebody comes along with a working prototype of something absolutely game-changing, and they work together to use their combined technology to advance GPUs. Currently the stagnation in GPU performance is down to die shrinks and ramming things like AI acceleration into cards that really just need to be affordable and play games well.

    • @Dingbat1967
      @Dingbat1967 Місяць тому

      The average person is an idiot. That's pretty much why. The intertubes just made it more obvious.

    • @TechBuild
      @TechBuild Місяць тому +29

      People who have some idea how GPUs work and their differences from CPUs will easily understand the work this person has done. It is a phenomenal task of making a GPU yourself which does 3D rendering, even a basic one, so well. In the CPU space, there are lots of architectures available to build upon but not in the GPU space.

    • @tsl_finesse_6025
      @tsl_finesse_6025 Місяць тому +21

      People don't understand that Nvidia has around 30k employees while he did one full project all by himself 😬😬. Bro is fire and a genius 💪🏾

    • @rextrowbridge8386
      @rextrowbridge8386 Місяць тому +7

      Because they are ignorant of how hard it is to make a gpu from the ground up.

  • @crumbman2065
    @crumbman2065 Місяць тому +123

    Really interesting video.
    As an electrical engineer working with FPGAs, I can assure you it takes a heck of a lot of (probably Verilog) code to get this thing working as it's supposed to. The biggest issue with doing this on an FPGA is that they run at really low clock speeds (typically ~100 MHz, max ~250 MHz), so you can't really gain speed just by increasing the clock (like NV and AMD have been doing ever more aggressively recently). A rough fill-rate sketch follows this thread.
    Props to this man

    • @kesslerdupont6023
      @kesslerdupont6023 Місяць тому +2

      Are consumer FPGAs big enough to scale to entire architectures or are they typically cut-down and more of an evaluation tool?

    • @noth606
      @noth606 Місяць тому +6

      @@kesslerdupont6023 Well, you wouldn't be able to make a full-size flagship FPGA-based GPU that competes with the big boys, if that's what you mean. What I think this is, without looking deep into it, is a rendering pipeline built mostly from scratch; my guess is that it's OpenGL-era, based on choosing Quake to test it, so it's unlikely to support anything approaching DirectX 10-type functionality, because that's at least one if not multiple orders of magnitude more complex.
      So it's definitely impressive work, but I don't think Nvidia or AMD are shaking in their boots.
      I'd guesstimate, based on what I know off the top of my head, that you'd need a few FPGAs, like 3-6, maybe more, to build up a full DX10/11-type unit with enough ROPs, shaders, etc. to do something useful, including all the jank you have to have around it: memory management, things to handle texture/geometry/shader-code RAM, plus output handling. It kind of depends on how 'strict' a model you aim for, because to a point you can choose to do a lot on the host system, or not. The more you do on the host in code, the less dedicated hardware/FPGA space you need.
      It could be that this is a DX10+ model project just not completed far enough to currently run more than basic, OpenGL-equivalent stuff. I hope so.

    • @myne00
      @myne00 Місяць тому +4

      @kesslerdupont6023 They are absolutely used by Intel and AMD to test design changes. They probably have really big ones that can do an entire (single) core.
      Nvidia probably uses them too, but it would most likely be a very cut-down GPU.
      They would absolutely perform horribly, but it's about comparing different hardware approaches and validation.
      E.g. if FPU design X does 100 FLOPS and design Z does 102 FLOPS, you have your architectural comparison.
      Then you run through a bunch of tests to validate the results; you don't want a chip that gives incorrect answers.
      FPGAs are used in the real world in applications like telecommunications signal processing, where a new technique could be released every year or so.
      I'm not aware of any other real-world applications aside from the "MiSTer", which is mostly used to emulate old game consoles.

    • @kesslerdupont6023
      @kesslerdupont6023 Місяць тому

      @@noth606 Thanks for the helpful comment. I don't know much about DirectX but maybe I should look more into it.

    • @kesslerdupont6023
      @kesslerdupont6023 Місяць тому

      @@myne00 thanks for the info. During validation is it normal to have to fuzz the chip across all possible combinations of input/output or are there shortcuts to validation?
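
    A rough, purely illustrative Python sketch of the fill-rate point made at the top of this thread. The one-pixel-per-clock pipeline and the 720p60 target are assumptions made for the sake of the arithmetic, not details taken from FuryGPU:

    ```python
    # Back-of-the-envelope only: how far a low FPGA clock stretches for
    # rasterisation, assuming an idealised pipeline that retires exactly one
    # pixel per clock cycle. All figures are illustrative assumptions.

    def pixels_per_second_needed(width: int, height: int, fps: int) -> int:
        """Pixels that must be produced each second for a given resolution/rate."""
        return width * height * fps

    TARGET = pixels_per_second_needed(1280, 720, 60)   # 720p at 60 Hz

    for clock_mhz in (100, 250):                        # typical FPGA clock range
        budget = clock_mhz * 1_000_000                  # pixels/s at 1 px/clock
        print(f"{clock_mhz} MHz: {budget / TARGET:.1f}x headroom over 720p60 "
              "(before overdraw, texturing and memory stalls)")
    ```

    Even at 100 MHz there is nominally ~1.8x headroom over 720p60, which is why a software-era title like Quake is a realistic target while anything shader-heavy is not.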

  • @grgn5412
    @grgn5412 Місяць тому +129

    This is WAY better than you may think.
    By simply converting his FPGA design into an ASIC, you get roughly a 10x performance increase.
    ASICs are expensive (from one to a few million dollars to make the mask which allows mass production), and FPGAs have long been used to prototype them: the languages to program the one or design the other are the same (VHDL or Verilog), so this conversion is very common in the industry (it happened for crypto mining, for instance). A back-of-the-envelope cost sketch follows this thread.

    • @SUCHMISH
      @SUCHMISH Місяць тому

      I looked the chips up, and the good news is that you can buy them in bulk for a low price used... The only problem is that they are used... But I feel like this idea has some merit to it!

    • @aymenninja8120
      @aymenninja8120 Місяць тому +1

      I saw a video of a guy who made his own ASIC, and he didn't sound super rich; I think the technology for making ASICs is getting more affordable.

    • @Endgame901
      @Endgame901 Місяць тому +3

      @@aymenninja8120 He didn't really make his own ASIC, he basically got in on a group buy for space on the silicon for an ASIC. Still pretty dope, but not quite the same.

    • @aymenninja8120
      @aymenninja8120 Місяць тому

      @@Endgame901 And that group is not a big corporation or something; my point is that it is now possible to make ASICs for clients other than big companies, if I got things right.

    • @Endgame901
      @Endgame901 Місяць тому +1

      @@aymenninja8120 you're not wrong, per se, but an ASIC like tinytapeout isn't really in the same scope as this, even if you purchased every single "block" of silicon space.
      The type of chip you'd need _would_ cost Big Company money.
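
    A back-of-the-envelope Python sketch of the ASIC-vs-FPGA economics described at the top of this thread. The "one to a few million dollars" NRE figure is the one quoted in the comment (I use $2M here); the per-die cost, FPGA price, and volumes are made-up placeholders just to show where the crossover lands:

    ```python
    # Illustrative amortisation only: none of these figures come from FuryGPU.

    def asic_unit_cost(nre_usd: float, per_die_usd: float, volume: int) -> float:
        """Effective per-unit cost once the one-off mask/NRE is spread over volume."""
        return nre_usd / volume + per_die_usd

    FPGA_MODULE_USD = 400.0   # assumed price of a mid-range FPGA module
    NRE_USD = 2_000_000       # within the "one to a few million" range quoted above

    for volume in (1_000, 10_000, 100_000, 1_000_000):
        unit = asic_unit_cost(NRE_USD, per_die_usd=20, volume=volume)
        verdict = "cheaper" if unit < FPGA_MODULE_USD else "more expensive"
        print(f"{volume:>9,} units: ${unit:,.0f} per ASIC ({verdict} than the FPGA)")
    ```

    The point is simply that the mask cost dominates at hobby volumes, which is why one-off projects and group shuttles stay on FPGAs while anything mass-produced moves to an ASIC.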

  • @lineandaction
    @lineandaction Місяць тому +122

    We need open source gpu

    • @namebutworse
      @namebutworse Місяць тому +5

      Im getting flash backs the moment i hear "source"
      (I am Team Fortress 2 player)

  • @baget6593
    @baget6593 Місяць тому +23

    I dont care if fury gpu succeeds, i just want nvidia to lose

  • @ThreaT650
    @ThreaT650 Місяць тому +27

    Respect for putting me onto the FuryGPU, this is dope! That thing performs around the level of an old Radeon 8500 or something! Impressive!

  • @toblobs
    @toblobs Місяць тому +153

    I can only imagine what Dylan could do with Nvidia's budget with his talent

    • @kaptenhiu5623
      @kaptenhiu5623 Місяць тому +33

      Becomes a trillionaire and dominate the AI market like NVIDIA does rn?

    • @blanketmobbypants
      @blanketmobbypants Місяць тому +3

      But I would definitely buy it

    • @josephdias5859
      @josephdias5859 Місяць тому +4

      or just enough of a budget to produce cards to play all the 90s games and early 2000s games

    • @ThreaT650
      @ThreaT650 Місяць тому +2

      And a team of engineers, YUP!

    • @ThreaT650
      @ThreaT650 Місяць тому

      Seems to have his head screwed on straight too. Would be great at the consumer level.

  • @tech6294
    @tech6294 Місяць тому +54

    8:57 The VRAM GDDR chip is to the lower left of the fan cooler. That mini storage card might be BIOS? Not sure lol. Great video! ;)

    • @jashaswimalyaacharjee9585
      @jashaswimalyaacharjee9585 Місяць тому +9

      Yup, I guess he is hot loading the vBIOS via SD Card.

    • @dbarrie
      @dbarrie Місяць тому +10

      That’s the DisplayPort splitter, which takes the DP signal from the FPGA and splits it into DP/HDMI to supply the outputs. All of the (slow, DDR3!) RAM on the device is part of the Kria SoM, underneath the fan!
      SD card is there to update the firmware when I’m not running with the card hooked up to the dev machine!

    • @greyhope-loveridge6126
      @greyhope-loveridge6126 Місяць тому +2

      @@dbarrie That's a really good way of updating the firmware on the fly - I'm amazed you got this running!
      DDR3 isn't actually the worst memory you could've used, and maybe you could use some older, broken GPUs and de-solder some GDDR6 or GDDR5 to transplant while you get stuff working too? I am fascinated to see how far this comes.

    • @Vifnis
      @Vifnis Місяць тому +1

      @@greyhope-loveridge6126 I doubt this would even work since GPUs aren't out-of-the-box FPGAs... might need to check JEDEC standards first to even see if that's possible, and iirc they only started with GDDR4 and up, wasn't everything before that 666MHz DDR3 VRAM?

    • @proudyy
      @proudyy Місяць тому

      @@greyhope-loveridge6126 Definitely, he has to keep going. The amount of potential in this is crazy. And even though he said in this video that the goal never was competition, he still can get a competitor in the future :P
      Or even found a company which competes in the future, whatever...

  • @liutaurasleonavicius2680
    @liutaurasleonavicius2680 Місяць тому +25

    There was also one person who literally combined an Nvidia and an AMD GPU and was able to use DLSS and AFMF together; this must be dark magic.

    • @user-xe6sm4jv8f
      @user-xe6sm4jv8f Місяць тому +10

      No magic, you literally just put 2 cards into 2 slots on your motherboard and they work🤣

    • @callyral
      @callyral Місяць тому

      ​@@user-xe6sm4jv8fIt just working makes it seem more like magic

    • @shiro3146
      @shiro3146 Місяць тому +1

      @@user-xe6sm4jv8f i dont think it was as easy as that bruh

    • @sirseven3
      @sirseven3 Місяць тому +2

      ​@@shiro3146it totally can work just like that. Just as you can have 2 NIC's operating at the same time. The main thing to worry about is the drivers for the card. You won't be able to pool the memory space as NVLINK but you do get an additional processor. Utilizing profile inspector you technically could get it to work as NVLINK but it takes manual config. I've ran different ram types with xmp and sometimes I got bluescreens but it was semi stable with a solid overclocking of said RAM

    • @granatengeorg
      @granatengeorg 16 днів тому +1

      I do the same in blender, rendering on an rx while using optix denoising on my gtx, all together in realtime in the viewport. Was also quite surprised that it just worked lol.

  • @POLARTTYRTM
    @POLARTTYRTM Місяць тому +27

    I've had people tell me that the guy's GPU is not impressive at all because writing the drivers, the APIs, and all that, including the card itself, is the "bare minimum" they would expect from any software engineer, apprentice or professional; yet I don't see new cards made by software engineers coming out every week.

    • @taxa1569
      @taxa1569 Місяць тому +2

      DO NOT believe them. Doing from scratch something that massive companies have been iterating on over and over for the past 20 years is like saying the person who added the sauce to a meal that has been in preparation for 3 hours is now the chef, and that these 'chefs' can ALL do what this guy did. Except he prepared the whole meal and THEY just put on the sauce.
      The bare minimum was, in fact, coming into the GPU development space and adding on to the well-prepared, already existing meal.

    • @POLARTTYRTM
      @POLARTTYRTM Місяць тому +3

      @@taxa1569 They gave every excuse possible, that the guy just messed with an FPGA and sourced the other parts... In my mind I was like: then do it, I want to see it; if it's that easy, you can do it, right? Anyone can, if they are given the tools.

  • @jungle2460
    @jungle2460 Місяць тому +13

    If that SD card actually turns out to be the VRAM, that's genius. I'd love to be able to swap SD cards to upgrade VRAM

    • @HamguyBacon
      @HamguyBacon Місяць тому +11

      SD cards are not fast enough to be vram, thats just the bios.

    • @barela3018
      @barela3018 29 днів тому +1

      Way to slow, ram and ssds are made for different purposes, that’s why there is a loading time before games, ssd is loading the system to the ram and then when needed by the gpu, the ram will select information from the program loaded.

  • @tropixi5336
    @tropixi5336 Місяць тому +104

    "YoU DiDnT mEnTiOn InTeL"

    • @AryanBajpai_108
      @AryanBajpai_108 Місяць тому +19

      He did 😂

    • @Prince-ox5im
      @Prince-ox5im Місяць тому +12

      ​@@neon_archhe's mocking someone's comment not saying it

    • @ranjitmandal1612
      @ranjitmandal1612 Місяць тому

      😂

    • @tropixi5336
      @tropixi5336 Місяць тому

      @@AryanBajpai_108 im talking about the start where he said "NVidia and amd are top contenders" ....

    • @urnoob5528
      @urnoob5528 Місяць тому +1

      @@ranjitmandal1612 smh

  • @oktc68
    @oktc68 Місяць тому +6

    This is the most interesting PC oriented video I've seen for ages. Nice1 Vex, nice change of pace.

  • @oglothenerd
    @oglothenerd Місяць тому +11

    Someday we will have open source GPU instruction sets. I know it. The community always wins.

    • @fatemanasrin7579
      @fatemanasrin7579 Місяць тому +1

      And they'll be spoiled 9-year-old kids who will buy these thinking they're smart and then break them or set them on fire...

    • @xeschire706
      @xeschire706 15 днів тому +1

      Or we can just take RISC-V, a custom and extended version of the 6502 that supports a 64-bit architecture, Nyuzi, or even the MIAOW ISA, and modify and optimize them for efficient graphics processing for use in our custom, open source GPUs instead, which I think would be a far better route.

    • @oglothenerd
      @oglothenerd 15 днів тому +1

      @@xeschire706 I like that idea.

  • @vishalkumar-dr8wq
    @vishalkumar-dr8wq Місяць тому +15

    It's amazing what he was able to achieve. I used FPGAs during my undergraduate studies, and what he achieved takes an amazing amount of skill and work.

    • @jnharton
      @jnharton Місяць тому +1

      True, but don't discount his 20-30 years of writing code for software rendering.
      Not only does that mean he has a considerable foundation in understanding what the hardware needed to be capable of, it also meant he could write his own kernel driver and an interface API to make his GPU usable under a modern Windows OS!

    • @cooldudep
      @cooldudep Місяць тому

      ​@@jnharton what part of op's comment discounts the guy's decades of work?

    • @mrnorthz9373
      @mrnorthz9373 28 днів тому

      ​@@cooldudep i dont think he means the comment discounted his skill, he means to anyone that may think this is a miracle or something unexpected from a guy of this magnitude

  • @StuffIThink
    @StuffIThink Місяць тому +6

    I've been watching him try to make this thing work forever super cool to see someone else giving him some exposure.

  • @Drunken_Hamster
    @Drunken_Hamster Місяць тому +3

    The future where I can piece together and upgrade my GPU like I can the rest of my system would be lit. NGL I'd love for the GPU scene to instead have motherboard chip slots similar to the CPU socket, with their own section for memory, special compute units (to improve ray tracing or AI separately from rasterized processing), and output pathways so you never have to worry about finding a card with the types and quantities of outputs that you want.
    It'd also make cooling simpler and likely more compact, kinda like how it is for CPUs with semi-universal setups that only require certain amounts of height as the sole variable. And it'd DEFINITELY make liquid cooling more accessible, not that I want to do that as much as I once used to.

  • @maxcarter5922
    @maxcarter5922 Місяць тому +19

    What a great contribution! Crowdsource this guy?

    • @arenzricodexd4409
      @arenzricodexd4409 Місяць тому

      To play Quake at 60FPS?

    • @mrnorthz9373
      @mrnorthz9373 28 днів тому +5

      ​@@arenzricodexd4409quake at 60 fps today, cyberpunk at 60 fps tomorrow.

    • @scudsturm1
      @scudsturm1 19 днів тому

      @@arenzricodexd4409 dont complain if u cant build a gpu yourself and write the driver yourself

    • @arenzricodexd4409
      @arenzricodexd4409 19 днів тому +1

      @@scudsturm1 Nah, this guy does it for fun. Crowdsource it? That would be an attempt to take away his passion for this.

  • @Hemeltijd
    @Hemeltijd Місяць тому +2

    This is so informative and cool. If you find any more topics alike, can you make more videos like this?

  • @rainbye4291
    @rainbye4291 Місяць тому +9

    My man just did the unthinkable. Great effort for making a gpu this good ALONE.

  • @AleksanderFimreite
    @AleksanderFimreite Місяць тому

    I would assume the rates displayed around 5:30 indicate what speed the internal game updates (ticks) are running at.
    One render seems to take around 25-45 ms to draw, and another 5-10 ms to clear the data for the next render. This indicates a total range of 30-55 ms per update.
    The formula for the rate per second is (1000 ms / x), which comes out at roughly the 33-22 range; that seems consistent with how choppy the enemies' movement is (a quick calculation sketch follows this comment).
    Camera motion is much smoother than their movement. Despite this, I'm also impressed by the efforts of individuals trying to tackle such a daring project.
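
    A quick Python sketch of the frame-rate arithmetic in the comment above. The 25-45 ms draw and 5-10 ms clear figures are the ones quoted in the comment, read off the on-screen counter; this is purely illustrative:

    ```python
    # Converts the frame times quoted above into frames per second.

    def fps_from_frametime(total_ms: float) -> float:
        """Frames per second for a given total frame time in milliseconds."""
        return 1000.0 / total_ms

    for draw_ms, clear_ms in [(25, 5), (45, 10)]:      # values quoted above
        total = draw_ms + clear_ms
        print(f"{draw_ms} ms draw + {clear_ms} ms clear = {total} ms "
              f"-> {fps_from_frametime(total):.1f} FPS")
    ```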

  • @BOZ_11
    @BOZ_11 Місяць тому +27

    Fury?? ATI Technologies 'bout to make a complaint

    • @core36
      @core36 Місяць тому +5

      I don’t think ATI is going to make any complaints anytime soon

    • @BOZ_11
      @BOZ_11 Місяць тому +2

      @@core36 so i see sarcasm isn't your strongest suit

    • @spooderderg4077
      @spooderderg4077 Місяць тому +3

      AMD may still own the trademark though. Might not since they haven't even used the ATI brand in a decade. But they potentially could.

    • @urnoob5528
      @urnoob5528 Місяць тому +2

      @@BOZ_11 tell that to urself

    • @MarioSantoro-ig5qh
      @MarioSantoro-ig5qh 24 дні тому

      Unless he starts selling them they probably wont do anything.

  • @jameshadaway8621
    @jameshadaway8621 Місяць тому +1

    Great video. I remember the longing for 3dfx cards in the 90s; I always wanted to work in IT, and it's good that people can build their own cards as a hobby.

  • @PeterPauls
    @PeterPauls Місяць тому +3

    My first GPU was a 3Dfx Voodoo 3 3000 and 3Dfx made the first GPU available for the masses (AFAIK) and they disappeared around 2000-2002.

  • @Revoku
    @Revoku 29 днів тому

    The CPU/GPU on a Raspberry Pi is an ARM CPU plus whatever internal GPU, with fixed instructions/pathways for both. An FPGA is a chip where you can program the gates/instructions/pathways:
    you run code that changes the configuration of the chip.

  • @sturmim
    @sturmim Місяць тому +7

    He could solder on some old GDDR6 chips from broken or old GPUs. Would like to see that.

    • @kesslerdupont6023
      @kesslerdupont6023 Місяць тому +5

      It may be good enough to just put some DDR5 on there depending on what speed the GPU is currently using.

    • @jcoyplays
      @jcoyplays Місяць тому +1

      He could've used DDR3 and been on par/overkill (800-2133 MT/s, or about 400-1066 MHz, which would match/exceed the FPGA clock speed).

    • @kesslerdupont6023
      @kesslerdupont6023 Місяць тому

      @@jcoyplays yeah that is true

  • @jaywhite15_AL
    @jaywhite15_AL Місяць тому

    LOVE the Meze's you're running.

  • @miguelcollado5438
    @miguelcollado5438 Місяць тому +16

    Real3D, Mellanox, Realtek made decent GPU's in the 90's as well... but they have eventually all been absorbed by the same 3 major brands in the 2000's...
    Dylan Barrie deserves our community's full support for his work.

  • @poohbear4702
    @poohbear4702 Місяць тому

    I've been wondering for a few years whether someone would do this. Very cool!

  • @cybernit3
    @cybernit3 Місяць тому +10

    The biggest hurdle to making a GPU is that you need lots of money to make the GPU chip if it's an ASIC, though later on the ASIC chip would be cheaper than using an FPGA. I wish they could make high-performance FPGAs that are cheap, not so expensive. I have to give this FuryGPU guy credit for making it; this could lead to something decent in the future or inspire future GPU designers. There is also the Vampire Amiga, which extends the AGA Amiga graphics chipset.

    • @Tyrian3k
      @Tyrian3k Місяць тому +2

      It simply can't be as cheap as a chip that is tailor made for the specific desired purpose.
      It's like wanting a van that can perform like an F1 car without it ending up costing more than the F1 car.

  • @sannyassi73
    @sannyassi73 Місяць тому

    I wonder how well those MTT GPUs perform with AI; 16 GB of memory is decent. That might be their main purpose.
    It's also becoming easy to make/mill your own circuit boards with different combinations of certain CNC machines and lasers. Most chiplets/capacitors/etc. on a board are very cheap to buy individually, and then you make your own actual board. It's pretty neat. I don't have the tools to do it, but I've been looking into it, and it's getting affordable to do at home for prototyping and even some mass production of simpler parts.

  • @Ele20002
    @Ele20002 20 днів тому

    I love projects like this. Making a driver fully compatible with windows, and actually creating all the required ports to connect via PCIe is insanely impressive. Using an existing graphics API would be even more impressive, but modern APIs are so complex these days it'd be a hell of a task for one person, so I can understand skipping that step.
    It'd really be great to get more GPU designs into the open though. GPU architecture isn't really shared in that much detail - everyone does their own thing, so you can only take inspiration from the higher level concepts.
    There's a good reason to hide GPU ISAs behind the driver though - a lot of optimisations can be enabled by the compiler and features integrated into each new version that'd otherwise need convincing developers to add support into their game for.
    Breaking into the GPU space in performance is also difficult because so many optimisations are made specifically targeted at a certain GPU, forcing newcomers to support hardware acceleration of that feature as well to not fall behind in benchmarks, even if there's another way to achieve the same effect that's more efficient on their hardware.

  • @Desaved
    @Desaved Місяць тому +10

    We're at the point where GPUs are more expensive than the entire rest of the computer!

  • @SL4PSH0CK
    @SL4PSH0CK Місяць тому +1

    Unironically, the Chinese market has been a blessing for budget builders, notably the e-sports-aimed GPUs with the "SP" variants, and when the brand they represent is a sister brand like Palit (Taiwan) and Inno3D.

  • @Powerman293
    @Powerman293 Місяць тому +1

    I could see this project eventually turn into the PC equivalent of those FPGA clone consoles but for 90s GPUs.
    A very cool demonstration of tech that fills a niche market but ultimately is not threatening to the big players.

  • @xeschire706
    @xeschire706 15 днів тому

    I've been thinking about doing something like this for a while, but instead taking a pre-existing ISA like RISC-V, or an open source and upgraded implementation of the 6502 ISA, then modifying and optimizing either architecture for efficient graphics processing in order to create a custom open source GPU. Of course my target would strictly be microcontrollers, embedded devices, Arduinos, retro-style game consoles and handhelds, and also retro PCs for now, as little baby steps before I take anything like that further.

  • @guarand6329
    @guarand6329 Місяць тому

    I bet if he took the transistor design and converted that to dedicated silicon vs the fpga, it would run faster.
    Pretty cool that he created a gpu design, also wrote the driver, and it's working.

  • @CrowandTalbot
    @CrowandTalbot 23 дні тому +1

    Wasn't Quake the game that used to prove or break computer builds back in the day? And his GPU handles it gorgeously? That's enough for me to know he's onto something.

  • @bubmario
    @bubmario Місяць тому

    I think what is important about something like this is that it can open gateways which are normally locked and up to the vendor to approve. If there is something that cannot be done on an Nvidia or AMD GPU, maybe this card can be a solution to that in years to come. Open source stuff is really important for that.

  • @UltraVegito-1995
    @UltraVegito-1995 Місяць тому +53

    *If only Moore Threads became a successful AI GPU maker in China, causing them to neglect Nvidia or AMD....*

    • @arthurwintersight7868
      @arthurwintersight7868 Місяць тому +9

      I just want to see more actual competition, to drive down prices.

    • @zerocal76
      @zerocal76 Місяць тому +4

      😅😅 You must know very little about China to make a comment like that. The last thing anyone in the world wants is a gov like China's to become completely tech-independent, especially in the hardware accelaration & AI space!

    • @arthurwintersight7868
      @arthurwintersight7868 Місяць тому +5

      @@zerocal76 - China is highly likely to implode under their own pressure at some point. Especially if their shoddy construction work at the Three Gorges ends up being as bad as people think. In the meantime they can drive down GPU and NAND prices.

    • @arenzricodexd4409
      @arenzricodexd4409 Місяць тому

      Still does not make them to ignore AI.

    • @hey01e5
      @hey01e5 Місяць тому +1

      unfortunately, if moore threads became competitive they'd get sanctioned for "national security reasons", leaving us westerners stuck with nvidia and AMD who will just price gouge the GPUs

  • @cashmoney2159
    @cashmoney2159 Місяць тому

    I just got a laptop and it has a 12 GB DDR5 RAM stick, but the problem is that in my country only 8 and 16 GB RAM sticks are sold now. What should I do? Which RAM would be the better option for getting that dual-channel boost, 8 GB or 16 GB?

  • @CheesyX2
    @CheesyX2 Місяць тому

    Seriously impressive stuff!

  • @aggressivefox454
    @aggressivefox454 6 днів тому

    For how customizable and "freeing" the PC market is, with a wide variety of interchangeable parts, operating systems, etc., I would have expected there to be less of a monopoly on GPUs. I always got the impression they were just mini computers, so I kind of thought you might be able to build them yourself (granted, not as easily as a PC). Open source and custom-built GPUs would be awesome to see though. I'd love to have a multi-GPU setup that I can build myself.

  • @deathVIAavatar
    @deathVIAavatar Місяць тому

    I'm already imagining a cool, very affordable project PC bundle that could have this in it. Something that includes projects for people to make. Almost like a modern Commodore 64, but with a dedicated GPU as opposed to an SBC, and with plenty of ram to allow all kinds of editing of beginner projects. Maybe something that could render up through PS1/Nintendo 64 style graphics without sweating it. I know this sort of thing kinda exists, but with the open source nature of this GPU, that would align with the tinkering factor.

  • @thevikinggamer454
    @thevikinggamer454 Місяць тому

    Loved the video. It would be really good for us consumers if more companies tried to get into the GPU game; it would force prices down and accessibility up, and it's impressive to see one person actually make a functional one. When it comes to Intel GPUs and Starfield, it works like a charm, btw. One of the kids' gaming PCs is running a Sparkle Intel Arc A770 TITAN OC 16GB, and it runs both Starfield and Cyberpunk 2077 at well over 100 FPS on max settings with lows well over 80 FPS, so I don't understand how testers get such low numbers on those two. Intel is also putting out almost a new driver update every week now, making the card better and better, and in some games my kid actually outperforms my 4080. So Intel is well on their way to making at least the Arc A770 a really solid contender for a fraction of the price of my 4080.

  • @rua893
    @rua893 Місяць тому +1

    great video .. soo interesting 👍👍

  • @aleanscm9350
    @aleanscm9350 Місяць тому

    I am working on optimizing the performance of low-end GPUs and other components, but if there were open source GPUs where you could optimize from the metal up, it would beat beefy stuff.

  • @ohnoitsaninja
    @ohnoitsaninja Місяць тому +3

    It's not hard to make a graphics card, if we abandoned our current software library.
    It's very hard to make a graphics card thats compatible and performant on every version of opengl, directx, vulkan that has ever come out.

  • @zealotoffire3833
    @zealotoffire3833 Місяць тому

    I'm into designing simple computer architectures and then running programs on them. Even for CPUs it took a lot of research to get something working, and I couldn't find anything detailed enough to design one from just logic gates. Then I tried to make a simple GPU and I was stuck; the closest thing to any detail was a university paper someone wrote about the order in which it interprets instructions, which was barely anything, so I pretty much have to reinvent the wheel (I have not even attempted to make a GPU from logic yet, since I need to plan it out first and I don't know how to do that at this point lol).

  • @Skullkid16945
    @Skullkid16945 Місяць тому

    I hope there is an open source hardware boom soon, or at the least more big names getting involved in finding ways to make things like this more available to the open source community as a whole.

  • @andreasmeow452
    @andreasmeow452 Місяць тому

    3:28 It's a known issue that with MSAA turned on in GTA V, Arc performs very poorly. Maybe Steve had that enabled, as it does say "VeryHigh-Ultra-Custom". But I mean, it is also 4K, so it'd be a bit hard to run super well regardless.

  • @YuNherd
    @YuNherd Місяць тому

    I am waiting for ARM-based GPUs, hope someone will make them.
    Edit:
    He could pitch this GPU as modular, where you can change the core and the amount of RAM, with changeable heatsinks too.

  • @Mantrevouir
    @Mantrevouir Місяць тому

    Your haircut looks so badass keep it mate👍

  • @sidburn2385
    @sidburn2385 Місяць тому

    This is very good news for the gaming community and will take off as more and more get involved.

  • @dennisestenson7820
    @dennisestenson7820 Місяць тому

    About 10-15 years ago I worked on a product that used an FPGA to generate video output. It's impressive, but definitely not unheard of.

  • @Ninetails94
    @Ninetails94 Місяць тому

    It wouldn't be too hard to make the GPU itself; the self-written code is the hardest part.
    So the fact that some dude made a GPU from scratch is pretty neat; hope someday we get custom GPUs that outperform the major players.

  • @c.n.crowther438
    @c.n.crowther438 Місяць тому

    I will be following Dylan Barrie's work with great interest.

  • @histerical90
    @histerical90 Місяць тому +1

    You know the problem with that survey? It also counts people with Steam Decks, ROG Allys, and other APUs; I think that percentage is mostly from there, while for Nvidia those are just proper full GPUs.

  • @kaseyboles30
    @kaseyboles30 Місяць тому

    If this FPGA design were transferred to a full-blown ASIC of its own, it could probably take that 720p 60 FPS and turn it into 1440p 60 FPS, especially on a more recent node. It could probably double the number of cores on top of the speedup; heck, the design might scale up to 4K60 for that game.

  • @pauloisip3458
    @pauloisip3458 Місяць тому +7

    I can see this guy becoming successful unless nvidia makes a move on the guy

    • @AssassinIsAfk
      @AssassinIsAfk 27 днів тому +1

      One of two things will happen:
      1) Nvidia goes Nintendo/Sony and sends a cease and desist, or
      2) they offer him a job to fix their budget cards, or AMD/Intel offer him a job.

    • @NicCrimson
      @NicCrimson 12 днів тому

      @@AssassinIsAfk A cease and desist for what?

    • @AssassinIsAfk
      @AssassinIsAfk 12 днів тому

      @@NicCrimson I don't think you understand the joke

  • @shiro3146
    @shiro3146 Місяць тому

    Can't wait for open source CPUs and GPUs.
    It would be very crazy cool, considering there's Linux in the software space but nothing like it in hardware.
    Sure, hardware isn't as easy as software and would cost unimaginable money, but hey, this kind of niche has already proved it can become the start of a standard too.

  • @phillangstrom8693
    @phillangstrom8693 Місяць тому

    I would like to see one of the companies make a video card with not only 32 GB of fast RAM but also a high-speed 200 GB M.2 drive for caching the entire game, so it doesn't have to use the CPU as much to read from the main drive. Or I am still waiting for game makers to adopt the new particle physics engine that would make high-end GPUs unnecessary, because high-quality graphics at a high frame rate would be possible on standard integrated APUs and low-end GPUs like a GT 980.

  • @andrewvader1955
    @andrewvader1955 Місяць тому

    This is super cool! I would be happy if it runs old titles.

  • @roythunderplump
    @roythunderplump Місяць тому

    Louis Rossmann would love this story piece, hope more jump on board with these electronic projects.

  • @softwarelivre2389
    @softwarelivre2389 Місяць тому +1

    We need to run Super Tux Kart on that ASAP

  • @gamma2816
    @gamma2816 Місяць тому +3

    Future prediction:
    1. Hardware will become open source but lackluster but you can now build individual pieces of your PC just like you could the PC itself before.
    2. At first the trend will be for people in the know and won't affect the market.
    3. Some content creator, say Linus, will build an insane GPU and CPU and people copy it.
    4. Now the open source hardware is so adept that it's actually a market threat for Nvidia and AMD.
    5. Building becomes more streamlined, like PC building, and you can now buy parts that just click in place, building your own chips like Legos, again much like PC building.
    6. Mainstream GPUs and CPUs cannot compete; much like with mods for games, the vanilla experience just can't compete with what Johnny cooked up in his mom's basement.
    7. For Nvidia and AMD to compete they now have to adapt, so either they, some up-and-comer, OR Intel will start making licensed parts for projects like this and will promise "XYZ if you buy XYZ from company X and not Y or Z!", and it will be up to individuals to check which combination is best for performance, just adding another customisation choice for PC gamers.
    8. A huge wave of insane complaints online about performance because optimising for this many combinations of parts on PC is impossible for developers.
    9. Developers are forced to focus on specific gear so they make games optimised for specific systems, this way the top company will buy performance from devs making them optimise for their parts.
    10. Now open source is once again dead because "Why buy your own gear with bad performance when you can buy a full Nvidia board with optimisation lol!"
    11. Back to square one as people are forced to buy full boards or brand items, aka a premade GPU, aka what we have today.
    12. 🤷‍♂️🤣

    • @powercore9277
      @powercore9277 Місяць тому +1

      Really doubt it, because have you seen the machines Intel uses to make their CPUs? They need machines made by one specific Dutch company, ASML, and those machines are not cheap.

    • @gamma2816
      @gamma2816 Місяць тому

      @@powercore9277 Well, I agree, probably not in the near future, but tech evolves, and a lot of things that were "too expensive for civilian use" in the past are now cheaply available, relatively speaking anyway. There was a time when cars were not an everyman's thing, but here we are. So near future, absolutely not, but who knows later down the line. 😝
      But then again, we are at a bit of a pause in tech evolution, as transistors have reached the limit of how small they can be, if I understood it right. If quantum tech reaches civilian usage then maybe we'll continue; I know nothing about this, it's just what's currently suggested. But hey, if the theories are true and quantum computers reach civilian gaming, then ping will SUPPOSEDLY be a thing of the past, with instantaneous server reach.

    • @sergemarlon
      @sergemarlon Місяць тому +1

      I didn't see you mention AI. You pointed out the step in which human developers will fail and it seems like you don't think that's the perfect role for AI to fill.

    • @gamma2816
      @gamma2816 Місяць тому

      @@sergemarlon Very true! I guess I just hope not. 😅 I love AI tech but it freaks me out; it's like staring the thing that will end you in the face. AI is great, but terrifying, so I don't want to think about it too much. Guess it's much like the nuke: great in power and as a scientific marvel that ended wars, but terrifying when you think about it for too long. 😅
      But you're right, AI could probably handle it; it's just that an AI built to build more machines we don't understand (hence why we need it to do it) is a little uncomfortable for me. 😝

  • @kaimanic1406
    @kaimanic1406 Місяць тому +1

    I can't even imagine to build my own GPU. This guy is amazing!

  • @melexxa
    @melexxa Місяць тому

    Man, I really want to mod my GPU by adding more VRAM chips using clamp shell but I'm pretty scared to do so.

  • @TheCustomFHD
    @TheCustomFHD 25 днів тому

    I "know" someone that has been reversing WDDM, the video driver stack since Win Vista. That work would help this gpu probably a bit.

  • @zawadlttv
    @zawadlttv Місяць тому +1

    The SD card probably holds the programming of the chip; that's probably the easiest way to update it.

  • @emiljagnic2101
    @emiljagnic2101 18 днів тому

    Awesome, thank you for reporting about this!

  • @litlle_elctro_engeneer
    @litlle_elctro_engeneer Місяць тому

    I think that microSD card is there for:
    * storing the program (because it's so large)
    and, as he says, for storage

  • @flakes369
    @flakes369 Місяць тому +3

    TempleOS energy

  • @drenewoo_irl
    @drenewoo_irl 20 годин тому

    I would contribute to this project; I already made a kind of custom GPU over USB for Linux, with a modified DXVK, so it runs pretty well.

  • @FlockersDesign
    @FlockersDesign Місяць тому +2

    If his frametime is around 38 ms, this means it's running below 30 FPS.
    60 FPS is a frametime of around 12 ms.
    And yes, before someone asks, this has been my job as an environment/lighting artist in the game industry for 12 years.

    • @diaman_d
      @diaman_d Місяць тому

      16.6 ms to be precise

  • @legiongaming99
    @legiongaming99 27 днів тому

    I don't mind if they hit the technical limit; more GPUs in the market means cheaper GPUs for all, and if it did hit a hard limit, that's good for competitive games.

  • @JohnBernas-ll2si
    @JohnBernas-ll2si 4 дні тому

    The frametime for 60 FPS with working vsync should be about 16.666667 ms, so considering what we saw in the counter, it runs at about 35-40 FPS at maximum, or I'm not getting something about his GPU.

  • @Wobble2007
    @Wobble2007 Місяць тому

    Amazing what he has achieved, though MiSTer has already done this with Groovy-Mame & MiSTerCast, which lets you use your MiSTer as a GPU.

  • @KaitsuYT
    @KaitsuYT Місяць тому

    I'm curious, would it be any easier to write the drivers for Linux? He should try it out, maybe.

  • @noth606
    @noth606 Місяць тому +1

    The Raspberry Pi has zero to do with FPGAs, just a relatively important point; Pis use off-the-shelf chips, slap them on a custom board with some RAM and shizzle, connectors, etc., and call it a day. The FuryGPU is a significantly more involved project than that. I could throw together a Raspberry Pi sort of equivalent in a couple of weeks for the hardware, and a month or two for adapting the codebase to the board, assuming the chosen CPU is reasonably supported. It would take me years to fart out something approaching the FuryGPU, if I'm lucky and have loads of resources. I am not bullshitting; I have done sort of similar things to the Raspberry Pi, but for a different purpose and at lower power. I ended up not proceeding beyond making a first series of fully fabbed and tested boards, but that didn't have anything to do with the hardware or software: I split up with my then-girlfriend, had an argument with my business partner at the same time, had to move, and within less than a year found a new girlfriend, decided to get married, and did. My new wife soon got pregnant and priorities shifted from fugging around with tech to different things. 10 years ago.

  • @CipherDiaz
    @CipherDiaz Місяць тому +1

    An FPGA is *NOT* in a Raspberry Pi. They are quite expensive. Also, programming these FPGAs is nowhere near similar to C/C++ or anything like that; you are basically describing the logic for how signals flow between components. An FPGA also has a clock rate, and the higher the rate, the more these things cost. So to get, say, 150 FPS, he would most likely need a fast onboard clock, and most likely more gates to work with, since the only way to truly optimize anything on an FPGA is heavy use of tables, which might not apply to a GPU since it's basically moving a ton of data around memory as quickly as possible.
    But yeah, awesome project!

  • @MyouKyuubi
    @MyouKyuubi Місяць тому +2

    the GT 1030 is an absolute gigachad of a card, no joke... It's the best graphics card you can get that can get by purely off of passive cooling (Radiator with no fan)! So it's brilliant for like tiny, almost pocket-sized "portable stationary" PC builds. :)

    • @vylrent
      @vylrent 26 днів тому +1

      It absolutely is a joke. Mini PCs can run off integrated graphics that beat it into the ground, and Intel and some Nvidia board partners are making tiny GPUs that are still quiet, as passive cooling is only a good idea if you have something extremely underpowered, dust concerns, or you can't bear 20 dB of noise. It's about time the GT 1030 dies.

    • @MyouKyuubi
      @MyouKyuubi 26 днів тому

      @@vylrent Integrated graphics can't run entirely off of passive cooling though... They need fans blowing air on them. :P

    • @vylrent
      @vylrent 26 днів тому

      @@MyouKyuubi Passive heatsinks made out of copper..

    • @MyouKyuubi
      @MyouKyuubi 25 днів тому +1

      @@vylrent Nah, dude, integrated graphics use the CPU... there's absolutely no way you can run passively cooled integrated graphics without overheating the CPU playing something like Half-Life: Source. :P
      You're gonna need a MASSIVE heatsink at the very least for passive cooling to work, at which point we're no longer at the pocket-sized scale of computers.
      Get real, bro.

  • @pv8685
    @pv8685 Місяць тому

    Dylan's card is impressive! The fact alone that it runs and shows a picture is a miracle; that it even runs Quake is mind-blowing. And it even has a feature that every other GPU misses: a microSD card slot!

  • @bgtubber
    @bgtubber Місяць тому

    Impressive for 1 guy! I wish him success.

  • @SalveMonesvol
    @SalveMonesvol Місяць тому

    This line of work could be amazing to emulate old consoles.

  • @J-Ernie
    @J-Ernie Місяць тому

    I have the same microphone as you. Can you share how you power it and if you use the goxlr could you please share the settings?

    • @vextakes
      @vextakes  Місяць тому +1

      I might make a video on setup stuff in the future. Mine is pretty simple… the Shure MV7 runs XLR into the Behringer UMC204HD interface, and I use a FetHead to boost the signal a little bit for less signal noise. The interface is quite cheap, same with the FetHead.
      I record the microphone raw and do all the processing in post: the EQ, compression, de-essing, expansion, and limiting. Although, if you use a GoXLR, I do something a little similar: I use Elgato's Wave Link software to put in-software live processing on my mic for calls and live-streaming. The difference with the GoXLR is that I can still capture the raw microphone for recording; you would need something like an XLR splitter if you want to do that on the GoXLR while using its live processing.
      Hope this helps!

    • @J-Ernie
      @J-Ernie Місяць тому

      @@vextakes It would be amazing if you could make a video. I'll subscribe so I don't miss it. Thank you for your reply.

    • @vextakes
      @vextakes  Місяць тому +1

      Yeah np
      I don’t think it would go on main channel, or it might be members-only or something. Just a heads up

    • @J-Ernie
      @J-Ernie Місяць тому

      @@vextakes Ok, Thanks for the heads up. If it's okay, I have one more question. I have researched the Fetheads and saw several Triton Audio FetHeads. Could you kindly let me know which one is recommended for our microphone? I appreciate your help. Thank you!

    • @vextakes
      @vextakes  Місяць тому +1

      There’s only 1 fethead- it’s a brand name. I use that triton audio one. You can also use a cloud lifter or many cheap generic ones I’ve actually heard are decent. You use the 48v phantom power that u usually use for condenser mics to boost the signal
      Honestly tho, the gain lifter doesn’t make a huge difference and I wouldn’t stress about it. It takes away a little bit of noise, but I used mine for months without one and nobody could tell especially watching thru a UA-cam video on a phone speaker, tv, whatever. Nobody can tell man. Just make sure your voice sounds clear in the video and is pleasant to listen to. Almost no one is listening on studio level equipment, those that do, you shouldn’t listen to. I’ve had people try to tell me shit and they’re just nitpicking.

  • @adilam6128
    @adilam6128 Місяць тому

    Hey Vex! I need your help! I built a PC and when I launch an AAA game it gets too loud within 5 minutes of launching. I disabled the case fans and set silent profiles for the CPU/GPU fans, and nothing helped!

    • @GTORazor
      @GTORazor Місяць тому

      Do you have CPU core temp monitoring? Sounds like the CPU is getting very hot if the fans ramp up when launching games.

    • @adilam6128
      @adilam6128 Місяць тому

      @@GTORazor when I play god of war cpu temp doesn't go above 70-72c max and still loud!

    • @GTORazor
      @GTORazor Місяць тому

      @@adilam6128 what CPU cooler are you running? Are all fans tied into motherboard or case fan controller? My new build runs 25C or less at idle and never over 50C during extended gaming with near silent operation. i5-12600K, Peerless Assassin 120 cooler. 5 case fans, Zalman ZL1 case.

  • @gravecode
    @gravecode Місяць тому

    I'm praying an open-source GPU community develops; low-key, the world needs one now more than ever.

  • @user-cx6rg6mr7d
    @user-cx6rg6mr7d Місяць тому

    kudos to DIY project

  • @WilliamAshleyOnline
    @WilliamAshleyOnline 25 днів тому

    FPGAs are used in audio interfaces that run custom DSP plugins; they run anywhere from a few dollars to $100 per chip. I'd love to see open source DSP for video and audio processing at an affordable price... even interfaces with a couple of chips can run in the hundreds of dollars, if not thousands. I think FPGAs will replace the older SHARC chips used for audio processing and missile acquisition technologies over this decade.

  • @keiyano
    @keiyano Місяць тому

    The hardest part of making new devices is always the Windows drivers. It's indeed a lot better nowadays, since Windows has support for user-mode drivers written in C#, which is much more understandable, and they do give examples of writing them. But they need more tutorials and examples to help devs who aren't familiar with writing drivers. Also, there is driver signing, which is confusing.

  • @sanketsbrush8790
    @sanketsbrush8790 Місяць тому

    I always wanted to make my own gpu with my own bare hands , but I don't have any knowledge of it

  • @HyperDev00
    @HyperDev00 Місяць тому

    Basically, an FPGA is a chip whose logic can be programmed: you can specify gates, decoders, multiplexers, and all that without dealing with the hardware directly, because it is programmed more like software.

  • @jasont80
    @jasont80 16 днів тому

    He's basically using a single-board computer to render graphics in software. It will never be close to a modern GPU, but this level of tinkering is amazing. Love it!

  • @applebumpcaster8240
    @applebumpcaster8240 Місяць тому

    Now THAT is very cool! We can do Open source GPU now?
    Thank God.