Graphics Processing Unit (GPU)

  • Published 12 Jan 2025

COMMENTS • 115

  • @h1k0usen13
    @h1k0usen13 4 years ago +94

    Never stop what you're doing. You're a life savior and this is the best CS-related channel on UA-cam.

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago +13

      Thank you so much :)KD

    • @naseef2075
      @naseef2075 4 years ago +4

      Yes please! Been really helpful during my A Level studies, I'll continue to benefit from your videos in the future!

    • @h1k0usen13
      @h1k0usen13 4 years ago +1

      @@ComputerScienceLessons Also, do you have any plans to do a video/series on kernels (monolithic, hybrid, micro..)?

  • @ognjend2798
    @ognjend2798 4 years ago +11

    Fantastic content, thank you! Your British (Australian?) accent makes it even more enjoyable to watch :)

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago +7

      Mainly British, childhood in NZ. Well spotted. Thanks for the lovely comment :)KD

  • @Sal3600
    @Sal3600 4 years ago +4

    I like the brief but important info about the CPU at the beginning.

  • @khushjain8679
    @khushjain8679 3 years ago +8

    Great informative video, thanks a lot.
    Seeing the technology used, and mapping every little detail of a graphic in vectors, seems such an incredible idea; that it all happens in real time makes me appreciate technology and the beautiful minds behind it exponentially. Kudos to them.

  • @shitshow_1
    @shitshow_1 2 years ago +1

    Thank you for posting great content! 🙂
    The only way to repay you is by sharing your videos with my peers and showing them how great this content is : )

    • @ComputerScienceLessons
      @ComputerScienceLessons  2 years ago +1

      Telling other people about my channel would be fantastic. Thank you :)KD

  • @smiiks2691
    @smiiks2691 10 months ago

    Just incredible. Going straight to the point. Thanks a million for these masterpieces!!

  • @EnderSlimeygaming
    @EnderSlimeygaming 3 years ago +2

    Simple and easy to understand visually, you are awesome!

  • @Someone-ve7yn
    @Someone-ve7yn 2 years ago +1

    This channel is just amazing, thank you!

  • @prilk1704
    @prilk1704 4 years ago +23

    The small figures make it hard on the eye; I recommend you fit the diagram to the full width of the screen, from edge to edge, or at least close to that.

  • @evanescent7536
    @evanescent7536 2 years ago +4

    This is a very informative video. It helps me a lot with my assignment as a computer engineering student. Thank you, sir.

    • @ComputerScienceLessons
      @ComputerScienceLessons  2 years ago

      Thank you. Delighted to help. You might like this one (if you already know something about vectors)
      ua-cam.com/video/Cb4aoihvh-o/v-deo.html :)KD

  • @STRAGGLER36
    @STRAGGLER36 3 years ago +1

    Your explanation methods are top notch. Please keep doing what you're doing. Everything is better with graphic explanations. This is the best. Thank you, I'm learning new stuff.

    • @ComputerScienceLessons
      @ComputerScienceLessons  3 years ago

      Thank you so much. I do enjoy making these, and of course, I'm learning all the time too :)KD

  • @Tobi-gl2lb
    @Tobi-gl2lb 2 years ago

    Godlike video mate. Thank you!!

  • @skillissuexd
    @skillissuexd 1 year ago +1

    I'm a PC enthusiast; your explanation is very simple and understandable, thank you very much ❣

  • @izzyakram
    @izzyakram 4 years ago +1

    Thank you so much for this clear and concise information. This helps a lot for my CS major.

  • @flavienrobert2516
    @flavienrobert2516 1 year ago +1

    This is a true teacher !

  • @AjinkyaMahajan
    @AjinkyaMahajan 4 years ago +1

    Great Explanation. It added a drop to my knowledge pool and increased the volume multifold. 🎇✨
    Thanks
    Cheers

  • @oplemath
    @oplemath 2 years ago +1

    Amazing video!!

  • @soualehmohamedaya6625
    @soualehmohamedaya6625 3 years ago +1

    the best video about GPU

  • @tonalddrumpboe5151
    @tonalddrumpboe5151 2 years ago +1

    Video RAM (VRAM) / frame buffer
    GDDR6 DRAM has a wider memory bus than regular DRAM.
    The GPU runs the graphics pipeline (rendering pipeline), which turns numerical computer data into something to display on the screen.
    As opposed to a CPU, a GPU has hundreds of lightweight cores (or shader cores).
    Single Instruction Multiple Data (SIMD) paradigm: a cluster of cores shares a single control unit. One instruction is given to each core, which performs the same operation on different sets of data. The GPU has multiple instruction streams.
    Cache:
    Level 1: shared by a cluster of cores
    Level 2: shared by all cores
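
    The SIMD idea summarized above can be sketched in a few lines of Python. This is only a toy illustration of the execution model, not real GPU code; the function name `simd_apply` and the sample data are made up for this sketch.

```python
# Toy sketch of SIMD: one instruction is issued once (by a shared control
# unit) and applied across a whole set of data elements, the way a cluster
# of GPU shader cores would each run the same operation on different data.

def simd_apply(instruction, data):
    """Apply a single operation to every element of the data set."""
    return [instruction(x) for x in data]

pixels = [10, 20, 30, 40]
brightened = simd_apply(lambda p: p + 1, pixels)
print(brightened)  # [11, 21, 31, 41]
```

    On real hardware the loop body runs in parallel, one element per core; the sequential list comprehension here only models the "same instruction, different data" relationship.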

  • @_BWKC
    @_BWKC 4 years ago +3

    Very good video.

  • @zoriiginalx7544
    @zoriiginalx7544 2 months ago

    6:47 Those are not rotation matrices. They are just translation matrices 2 units in the y direction.
    Also, that’s not how matrix-vector multiplication works. The vector must be on the RHS otherwise it’s ill-formed unless you are using row vectors.
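
    The point made above (these are translation matrices, and the vector must sit on the right-hand side of the matrix) can be shown with a minimal Python sketch using 2D homogeneous coordinates. The helper `mat_vec` and the sample point are made up for illustration.

```python
# Translation by 2 units in the +y direction, written as a 3x3 matrix
# acting on a 2D point in homogeneous form (x, y, 1).

def mat_vec(M, v):
    """Multiply a 3x3 matrix by a 3-component column vector (vector on the RHS)."""
    return [sum(M[r][c] * v[c] for c in range(3)) for r in range(3)]

T = [
    [1, 0, 0],
    [0, 1, 2],   # the 2 in the last column adds 2 to y
    [0, 0, 1],
]

p = [3, 4, 1]          # the point (3, 4) in homogeneous coordinates
print(mat_vec(T, p))   # [3, 6, 1] -> the point (3, 6)
```

    Writing the product the other way round (a column vector times a matrix) is ill-formed, which is the commenter's second point; only a row vector on the left with the transposed matrix gives the same result.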

  • @jaysanprogramming6818
    @jaysanprogramming6818 4 years ago +4

    Thank you for this well made video. Would you consider making a video about the evolution of graphics APIs to further this topic?

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago +1

      I will definitely take a look at it. I've been looking into OpenGL and Direct3D. CUDA is also set to become significant in big data mining and AI. :)KD

    • @jaysanprogramming6818
      @jaysanprogramming6818 4 years ago

      @@ComputerScienceLessons OpenGL is old and has evolved greatly over the years, but its main strength was its portability. The fact that Apple has deprecated it in favor of its own Metal library is a step backward in my opinion. But the future mainstream "open" library seems to be Vulkan.

  • @jonathanmoore5619
    @jonathanmoore5619 3 years ago +1

    Great video. Thank you.

  • @mareksulecki419
    @mareksulecki419 3 years ago +1

    Very well explained

  • @watchingyou
    @watchingyou 3 years ago +1

    Very informative, thank you.

  • @Jeko_9785
    @Jeko_9785 3 years ago +4

    "8-10 Gb"
    *Stares with a 6 gig 1660 Super*

    • @ComputerScienceLessons
      @ComputerScienceLessons  3 years ago

      I don't want to sound smug, but I haven't looked back since I upgraded to an RTX 3070 :)KD

    • @Jeko_9785
      @Jeko_9785 3 years ago +1

      @@ComputerScienceLessons :/

  • @polakamkamalnath1000
    @polakamkamalnath1000 2 years ago

    awesome explanation!😍

  • @bottomstudio3552
    @bottomstudio3552 2 years ago

    Keep doing your job ❤️
    🔥 🔥

  • @sivabalanj
    @sivabalanj 4 years ago +1

    pretty good video....to set up the basics....
    nice work...keep it up...hehehe

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago

      Thank you. I am working on a more detailed video about the graphics pipeline, but it's hard to pin down. :)KD

    • @sivabalanj
      @sivabalanj 4 years ago

      @@ComputerScienceLessons hehe.... i'm waiting.... ill research on it while i have some more free time....😁

  • @abhisheksarkar4897
    @abhisheksarkar4897 4 years ago

    I have been working on GPUs for 7 days; now I am satisfied.

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago +1

      Glad to help. It's one of those topics that's hard to pin down; the tech is advancing so quickly. :)KD

    • @abhisheksarkar4897
      @abhisheksarkar4897 4 years ago

      @@ComputerScienceLessons Yup, I am also taking help from Professor Onur Mutlu's lectures.

  • @UladzimirKarol
    @UladzimirKarol 11 months ago +1

    Very good.

  • @ashokphour3655
    @ashokphour3655 2 years ago +1

    Is the graphics adapter always outside the SoC, or can it be integrated inside the SoC?

    • @ComputerScienceLessons
      @ComputerScienceLessons  2 years ago

      Early PC architecture (back in the 80s and 90s) put the graphics controller inside the CPU. These days, it's either integrated into the CPU (common with regular laptops) or, if you want better quality, it's done by a discrete graphics adapter. This video and web link may interest you :)KD
      ua-cam.com/video/_I8CLQazom0/v-deo.html
      www.computer.org/publications/tech-news/chasing-pixels/the-integrated-graphics-controller

  • @arthurklause5251
    @arthurklause5251 2 years ago +1

    Thanks a lot !

  • @jbot9993
    @jbot9993 4 years ago +2

    What does it mean when you say that HDMI is slow at 60 Hz? Am I doing something wrong when connecting my 144 Hz monitor with HDMI, or does this have nothing to do with the display itself?

    • @Wrtvrxgvcf55
      @Wrtvrxgvcf55 4 years ago +2

      Most high-refresh-rate gaming monitors are indeed hooked up via DVI or DisplayPort, but the resolution also matters; HDMI 2.0 supports 4K up to 60 Hz (1080p at 120 Hz, but could go higher), while HDMI 2.1 supports 10K up to 120 Hz.

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago +2

      You should match the refresh rate of the monitor with a cable that can support it. Frequency mismatch will cause shearing effects. Displayport is the way to go with a gaming monitor :)KD

  • @edumeli02
    @edumeli02 4 years ago +3

    I am currently studying linear algebra at university and I always wondered what's a real world purpose for it. Now I know😁

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago +1

      I'm working on a new series about quantum computers. At least one of the videos in that series will be about linear algebra; quantum logic gates are based upon it. I have no doubt quantum computing is going to bring about a paradigm shift in human evolution - so linear algebra is definitely worth knowing about. :)KD

    • @edumeli02
      @edumeli02 4 years ago

      @@ComputerScienceLessons Fascinating!!! Hopefully by the time I get my computer science degree, advancements will have been made in the world of quantum computing.

    • @edumeli02
      @edumeli02 4 years ago

      @math It won't. "Conventional" computing and quantum computing are two separate concepts that have different purposes and different use cases. It wouldn't be practical to have a quantum computer at home, as you'd have to cool it down to near 0 kelvin🤣🤣

  • @danielamancillav
    @danielamancillav 1 year ago

    I love you, sir, thank you!!!

  • @TerenceA72
    @TerenceA72 2 months ago

    GPU used to mean geometry processing unit, crazy how these things change and are forgotten

    • @TerenceA72
      @TerenceA72 2 months ago

      In the late '90s and very early 2000s, graphics cards still relied on the CPU to plot the coordinates of vertices for polygons. Nvidia introduced cards with a dedicated 'GPU'; the term wasn't used in that narrow sense for long before it just came to mean a graphics card.

  • @mareechmakuach8378
    @mareechmakuach8378 4 years ago +1

    Fantastic

  • @user-om2ev8wz6g
    @user-om2ev8wz6g 4 years ago +1

    What software are you using to make these videos? I want to make these neat presentations too.

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago

      This one was done with Adobe Fireworks (for the images) and Microsoft PowerPoint. :)KD

  • @geekionizado
    @geekionizado 4 years ago +1

    In which program do you draw these components? It's beautiful.

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago

      I use Fireworks and PowerPoint for the diagrams and circuit boards, and Blender for the rest. Thanks for the lovely comment. :)KD

  • @a.alsuleim4609
    @a.alsuleim4609 4 years ago

    Hi, can somebody tell me which program is used for producing these slides? Thank you.

    • @ComputerScienceLessons
      @ComputerScienceLessons  4 years ago +2

      I use Fireworks for the 2D images like circuit boards, PowerPoint for simple block diagrams and Blender for the more complex 3D stuff. :)KD

    • @naseef2075
      @naseef2075 4 years ago +2

      @@ComputerScienceLessons Blender FTW!

  • @x86_architecture10
    @x86_architecture10 2 years ago +1

    I am the one-thousandth liker of this video.

  • @MeariBamu
    @MeariBamu 4 years ago

    I hope you make the background dark, like dark mode.

  • @toddlask
    @toddlask 2 years ago +1

    please explain how the graphics card is used for crypto mining in your new crypto series!

    • @ComputerScienceLessons
      @ComputerScienceLessons  2 years ago

      Nice idea. I will definitely cover this one day, as an intellectual exercise. However, I'm uncomfortable with the environmental impact. Perhaps I need to cover renewable energy too :)KD

    • @toddlask
      @toddlask 2 years ago

      @@ComputerScienceLessons Yes, I keep hearing about the environment with crypto too. Explaining that would also be interesting.

  • @homescriptone
    @homescriptone 8 months ago

    In 2024, we know there is no documentation on the architecture of a GPU like there is for the CPU.

  • @ScottFranklin-of3nz
    @ScottFranklin-of3nz 1 year ago

    Ray tracing in real time sounds expensive.

  • @mirzobeksultonnazarov976
    @mirzobeksultonnazarov976 3 years ago +1

    hey Javid, is it you?

    • @ComputerScienceLessons
      @ComputerScienceLessons  3 years ago

      No this is Kevin. I'll let Javid know you were looking for him :)KD

    • @mirzobeksultonnazarov976
      @mirzobeksultonnazarov976 3 years ago

      @@ComputerScienceLessons The voice in the video is very similar to Javid's.
      By the way, I was looking for information about how graphics data passes through the GPU, or how the GPU distributes data (vertex, index, color, etc.) to its cores, but I didn't find a sufficiently detailed explanation. Can you tell me where I can find out?

  • @pyprogramming599
    @pyprogramming599 4 years ago

    Like it.
    And tell me how to use the GPU of my cheap Android phone.
    I'm asking how it can be used instead of the CPU.

  • @moonhowler667
    @moonhowler667 4 months ago

    I have a number of issues with this video. First, the video ram is not the framebuffer. It's also where any CBVs, SRVs, textures, vertex buffers and index buffers live. Second, HDMI has supported over 60Hz for literally years. Third, your render pipeline is very wrong. Almost every name is wrong and you missed steps. In fact most of what you have in your "render pipeline" are actually things done within the vertex shader, which is only part of said render pipeline.
    It doesn't help anyone trying to learn if you give them half baked information you barely took the time to google.

  • @ashokbuttowski
    @ashokbuttowski 3 years ago +2

    Voice not at all clear and crisp

    • @ComputerScienceLessons
      @ComputerScienceLessons  3 years ago +2

      I agree - cheap microphone I'm afraid. I hope the content was useful.

    • @ashokbuttowski
      @ashokbuttowski 3 years ago

      @@ComputerScienceLessons yeah useful buddy ✌

  • @aminaalfaitory
    @aminaalfaitory 3 years ago

    Please, can you help me with information about the GPU memory hierarchy, with books or references? If you can, I will share my e-mail for communication.

  • @ЮрийШпорхун
    @ЮрийШпорхун 3 years ago +1

    Your voice is so familiar to me. I don't know why.

    • @ComputerScienceLessons
      @ComputerScienceLessons  3 years ago

      I've been told I sound like various other people :)KD

    • @watchingyou
      @watchingyou 3 years ago

      He just sounds like your typical British narrator, perfect for this kind of content.

  • @babylongate
    @babylongate 9 months ago

    I wish you spoke a bit faster; like 10x would be fine, so we wouldn't get distracted by other daily routines.

  • @flat-earther
    @flat-earther 3 years ago +1

    White background hurts eyes, I suggest dark background.

    • @ComputerScienceLessons
      @ComputerScienceLessons  3 years ago

      A few people have suggested that. TBH, I prefer white. I'll take another look. :)KD

  • @yuktikumari6042
    @yuktikumari6042 4 years ago +2

    first comment :)

  • @Mohanraj-fe3uj
    @Mohanraj-fe3uj 7 months ago

    Thank you so much bro ❤❤❤

  • @japcy6669
    @japcy6669 3 years ago +1

    Yeahh thank youu