Never stop what you're doing. You're a lifesaver and this is the best CS-related channel on UA-cam.
Thank you so much :)KD
Yes please! Been really helpful during my A Level studies, I'll continue to benefit from your videos in the future!
@@ComputerScienceLessons Also, do you have any plans to do a video/series on kernels (monolithic, hybrid, micro..)?
Fantastic content, thank you! Your British (Australian?) accent makes it even more enjoyable to watch :)
Mainly British, childhood in NZ. Well spotted. Thanks for the lovely comment :)KD
I like the brief but important info about the CPU at the beginning.
Yes - I think context is important. :)KD
Great informative video, thanks a lot.
Seeing the technology used, with every little detail of the graphics mapped as vectors, seems such an incredible idea, and the fact that it all happens in real time makes me appreciate technology and the beautiful minds behind it exponentially. Kudos to them.
I totally agree :)KD
Thank you for posting great content! 🙂
The only way to repay you is by sharing your videos with my peers and showing them how great this content is : )
Telling other people about my channel would be fantastic. Thank you :)KD
Just incredible. Going straight to the point. Thanks a million for these masterpieces!!
Simple and easy to understand visually, you are awesome!
This channel is just amazing, thank you!
You're very kind. Thank you :)KD
The small figures make it harder on the eye; I recommend fitting the diagrams to the width of the screen, from edge to edge, or at least close to that.
noted. :)KD
This is a very informative video. It helps me a lot with my assignment as a computer engineering student. Thank you, sir.
Thank you. Delighted to help. You might like this one (if you already know something about vectors)
ua-cam.com/video/Cb4aoihvh-o/v-deo.html :)KD
Your explanation methods are top notch. Please keep doing what you're doing. Everything is better with graphic explanations. This is the best. Thank you, I'm learning new stuff.
Thank you so much. I do enjoy making these, and of course, I'm learning all the time too :)KD
Godlike video mate. Thank you!!
I'm a PC enthusiast; your explanation is very simple and understandable, thank you very much ❣
You are most welcome :)KD
Thank you so much for this clear and concise information. This helps a lot for my CS major.
Glad to help :)KD
This is a true teacher !
Thank you so much :)KD
Great Explanation. It added a drop to my knowledge pool and increased the volume multifold. 🎇✨
Thanks
Cheers
Amazing video!!
Thank you :)KD
the best video about GPU
Thank you :)KD
Video RAM (VRAM) / frame buffer
GDDR6 DRAM has a wider memory bus than regular DRAM.
The GPU runs the graphics pipeline (also called the rendering pipeline): it turns numerical computer data into something that can be displayed on the screen.
As opposed to a CPU, a GPU has hundreds of lightweight cores (or shader cores).
Single Instruction Multiple Data paradigm (SIMD): a cluster of cores shares a single control unit. One instruction is issued to the cores, and each core performs the same operation on a different set of data (see the sketch below). The GPU runs multiple instruction streams.
Cache:
Level 1: shared by a cluster of cores
Level 2: shared by all cores
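To make the SIMD idea above concrete, here is a minimal CUDA sketch (not from the video; the kernel name, array size and launch configuration are purely illustrative). Every thread executes the same instruction stream, but each thread works on a different element of the arrays.

#include <cstdio>
#include <cuda_runtime.h>

// One instruction stream, many data elements: each thread adds one pair.
__global__ void vectorAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
    if (i < n) {
        c[i] = a[i] + b[i];  // same operation, different data
    }
}

int main()
{
    const int n = 1 << 20;               // one million elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);        // unified memory, visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough blocks of 256 threads to cover all n elements.
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vectorAdd<<<blocks, threadsPerBlock>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);         // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

In practice the hardware groups threads into warps that share a control unit, which is exactly the "one instruction, many sets of data" behaviour described above.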
Very good video.
Thank you :)KD
6:47 Those are not rotation matrices. They are just translation matrices 2 units in the y direction.
Also, that’s not how matrix-vector multiplication works. The vector must be on the RHS otherwise it’s ill-formed unless you are using row vectors.
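For reference, here is the distinction written out (my own minimal sketch, not taken from the video). In 2D homogeneous coordinates, a translation by 2 units in y has the column vector on the right-hand side of the matrix:

\[
\begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix}
=
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 2 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} x \\ y \\ 1 \end{pmatrix}
=
\begin{pmatrix} x \\ y + 2 \\ 1 \end{pmatrix}
\]

whereas a genuine rotation about the origin through an angle \theta would be

\[
R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\]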
Thank you for this well made video. Would you consider making a video about the evolution of graphics APIs to further this topic?
I will definitely take a look at it. I've been looking into OpenGL and Direct3D. CUDA is also set to become significant in big data mining and AI. :)KD
@@ComputerScienceLessons OpenGL is old and has evolved greatly over the years, but its main strength was its portability. The fact that Apple has deprecated it in favor of its own Metal library is a step backward in my opinion. But the future mainstream "open" library seems to be Vulkan.
Great video. Thank you.
You're welcome :)KD
Very well explained
Thank you :)KD
Very informative, thank you.
You're welcome :)KD
"8-10 Gb"
*Stares with a 6 gig 1660 Super*
I don't want to sound smug, but I haven't looked back since I upgraded to an RTX 3070 :)KD
@@ComputerScienceLessons :/
awesome explanation!😍
Keep doing your job ❤️
🔥 🔥
Ok
pretty good video....to set up the basics....
nice work...keep it up...hehehe
Thank you. I am working on a more detailed video about the graphics pipeline, but it's hard to pin down. :)KD
@@ComputerScienceLessons hehe.... I'm waiting.... I'll research it while I have some more free time....😁
I have been working on GPUs for 7 days; now I am satisfied.
Glad to help. It's one of those topics that's hard to pin down; the tech is advancing so quickly. :)KD
@@ComputerScienceLessons Yup, I am also getting help from Professor Onur Mutlu's lectures.
Very good.
Thank you :)KD
Is the graphics adapter always used outside the SoC, or can it be integrated inside the SoC?
Early PC architecture (back in the 80s and 90s) put the graphics controller on its own chip, outside the CPU, either on the motherboard or on an expansion card. These days, it's either integrated into the CPU (common with regular laptops) or, if you want better quality, it's done by a discrete graphics adapter. This video and web link may interest you :)KD
ua-cam.com/video/_I8CLQazom0/v-deo.html
www.computer.org/publications/tech-news/chasing-pixels/the-integrated-graphics-controller
Thanks a lot !
You're most welcome :)KD
What does it mean when you say that HDMI is slow at 60 Hz? Am I doing something wrong by connecting my 144 Hz monitor with HDMI, or does this have nothing to do with the display itself?
Most high-refresh gaming monitors are indeed hooked up via DVI or DisplayPort, but the resolution also matters: HDMI 2.0 supports 4K up to 60 Hz (1080p at 120 Hz, though it could go higher), while HDMI 2.1 supports up to 10K at 120 Hz.
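As a rough back-of-the-envelope check (my own figures, ignoring blanking intervals and encoding overhead), an uncompressed 4K stream at 60 Hz with 24 bits per pixel needs

\[
3840 \times 2160 \times 60\,\mathrm{Hz} \times 24\,\mathrm{bit} \approx 11.9\ \mathrm{Gbit/s}
\]

which is already close to the roughly 14.4 Gbit/s of usable data rate HDMI 2.0 provides. Doubling the refresh rate to 120 Hz roughly doubles the requirement, which is why 4K at high refresh rates needs HDMI 2.1 (up to 48 Gbit/s) or DisplayPort.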
You should match the refresh rate of the monitor with a cable that can support it. A refresh-rate mismatch will cause tearing effects. DisplayPort is the way to go with a gaming monitor :)KD
I am currently studying linear algebra at university, and I always wondered what its real-world purpose is. Now I know 😁
I'm working on a new series about quantum computers. At least one of the videos in that series will be about linear algebra; quantum logic gates are based upon it. I have no doubt quantum computing is going to bring about a paradigm shift in human evolution - so linear algebra is definitely worth knowing about. :)KD
@@ComputerScienceLessons Fascinating!!! Hopefully, by the time I get my computer science degree, advancements will have been made in the world of quantum computing.
@math It won't. "Conventional" computing and quantum computing are two separate concepts that have different purposes and different use cases. It wouldn't be practical to have a quantum computer at home, as you'd have to cool it down to near 0 kelvin 🤣🤣
I love you, sir, thank you!!!
You're very welcome :)KD
GPU used to mean geometry processing unit, crazy how these things change and are forgotten
Up until the late 90s and very early 2000s, graphics cards still relied on the CPU to transform the coordinates of vertices for polygons. Nvidia then introduced cards with a dedicated 'GPU'; the term wasn't used in that narrow sense for long before it just came to mean any graphics card.
Fantastic
Thank you :)KD
What software are you using to make these videos? I want to make neat presentations like these too.
This one was done with Adobe Fireworks (for the images) and Microsoft PowerPoint. :)KD
In which program do you draw these components? It's beautiful.
I use Fireworks and PowerPoint for the diagrams and circuit boards, and Blender for the rest. Thanks for the lovely comment. :)KD
Hi, can somebody tell me which program is used for producing these slides? Thank you.
I use Fireworks for the 2D images like circuit boards, PowerPoint for simple block diagrams and Blender for the more complex 3D stuff. :)KD
@@ComputerScienceLessons Blender FTW!
I am the onethousand'th liker of this video.
I hope you make the background dark, like dark mode.
Please explain how the graphics card is used for crypto mining in your new crypto series!
Nice idea. I will definitely cover this one day, as an intellectual exercise. However, I'm uncomfortable with the environmental impact. Perhaps I need to cover renewable energy too :)KD
@@ComputerScienceLessons Yes, I keep hearing about the environment with crypto too. Explaining that would be interesting as well.
In 2024 we know there is no documentation on the architecture of a GPU like there is for the CPU.
Ray tracing in real time sounds expensive.
hey Javid, is it you?
No this is Kevin. I'll let Javid know you were looking for him :)KD
@@ComputerScienceLessons the voice on the video is very similar to javid.
By the way, I was looking for information about how graphics data passes through the GPU, or how the GPU distributes data (vertex, index, color, etc.) to its cores, but I didn't find a sufficiently detailed explanation. Can you tell me where I can find out?
like it
And tell me how to use the GPU of my cheap Android phone.
I asked how I can use it instead of the CPU.
I have a number of issues with this video. First, the video RAM is not just the framebuffer; it's also where any CBVs, SRVs, textures, vertex buffers and index buffers live. Second, HDMI has supported over 60 Hz for literally years. Third, your render pipeline is very wrong. Almost every name is wrong and you missed steps. In fact, most of what you have in your "render pipeline" are actually things done within the vertex shader, which is only part of said render pipeline.
It doesn't help anyone trying to learn if you give them half-baked information you barely took the time to Google.
I hope you find what you are looking for elsewhere.
Voice not at all clear and crisp
I agree - cheap microphone I'm afraid. I hope the content was useful.
@@ComputerScienceLessons yeah useful buddy ✌
Please can you help me with information about the GPU memory hierarchy, such as books or references? If you can, I will share my e-mail for communication.
Your voice is so familiar to me. I have no idea why.
I've been told I sound like various other people :)KD
He just sounds like your typical British narrator, perfect for this kind of content.
I wish you spoke a bit faster; even x10 would be fine, so we wouldn't get distracted by other daily routines.
Trouble is, some people ask me to speak more slowly. Can't win!
The white background hurts the eyes; I suggest a dark background.
A few people have suggested that. TBH, I prefer white. I'll take another look. :)KD
first comment :)
Ta dah :)KD
Thank you so much bro ❤❤❤
Yeahh thank youu
You're welcome :)KD