UCSC CSE160
United States
Joined Jan 17, 2020
This is a channel for the CSE160 - Introduction to Computer Graphics class from the University of California, Santa Cruz (UCSC). Here we have videos of our lectures, lab sections and short tutorials to guide students on the programming assignments.
Ohhh that is so great!!!! I highly appreciate your advice!!!!!! I have watched many videos, but you are my favorite!!!!
Perfect, you make it easy to understand glFrustum. Thanks so much.
this guy is the man
How are you able to edit? I found the inspect and source tab, but I can't change anything or find an option to edit.
Wish my uni had classes like this. Really cool that someone is teaching these things. The only things they taught us in computer graphics were Photoshop and basic 3ds Max, and almost everyone lost the will to keep studying because of the even more agonizing ways to model. Btw, have you heard of SmallUPBP or CPPM? They are probably the best methods so far.
glPerspective implementation?
No comments on the guy having this class in his bed?
Is it possible to do it in Firefox?
I guess not.
Thanks
Thanks, this was useful.
Wow, that's what I was missing.
Yeah, yeah. RTX is the most realistic. Yeah, yup, yup, absolutely. And there is absolutely no such thing as photon mapping.
Did you make up your own version of reality and believe it as you were writing this comment? Photon mapping has been a thing for 10+ years, and so has ray tracing.
Awesome, I just used this to change the number of decimals on something that wasn't adding up. Thanks a lot, ser
Which language are you using, please?
Amazing video
Thanks, I'm quite surprised people don't show this in JS tutorials; I had to specifically search for it.
thanks😘
My understanding is that they use the rasterized image as the source for ray tracing in the game engine and take a minimal number of samples (with the 2000 series this minimal number can be quite high) into a buffer before the frame is presented, then use a method of approximation, probably with the help of a neural network, to display the image as if it had been ray traced with a higher sample count. It feels like an enhanced version of the rasterized image to me rather than a ray-traced one. Having everything baked in the game engine also helps, I suppose.
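A toy sketch of that idea, purely as an illustration (traceRay and rasterColor are hypothetical placeholders, and this is nowhere near what an actual RTX/DLSS denoising pipeline does): take a few jittered ray samples per pixel, average them, and blend the noisy result with the cheap rasterized color.

```js
// Hypothetical helpers: traceRay(x, y) and rasterColor(x, y) each return [r, g, b].
// This is only a toy stand-in for the "few samples + approximation" idea above.
function hybridPixel(x, y, samplesPerPixel = 2, rayWeight = 0.5) {
  let r = 0, g = 0, b = 0;
  for (let i = 0; i < samplesPerPixel; i++) {
    // Jitter inside the pixel so the few samples we take aren't identical.
    const c = traceRay(x + Math.random(), y + Math.random());
    r += c[0]; g += c[1]; b += c[2];
  }
  r /= samplesPerPixel; g /= samplesPerPixel; b /= samplesPerPixel;

  // Lean on the rasterized image for the rest of the signal.
  const base = rasterColor(x, y);
  return [
    rayWeight * r + (1 - rayWeight) * base[0],
    rayWeight * g + (1 - rayWeight) * base[1],
    rayWeight * b + (1 - rayWeight) * base[2],
  ];
}
```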
Where can we find the tool/software you used? This is very useful for teaching and learning!
♫ The BEST Video! There is also a good video about gluLookAt (+SFML): ua-cam.com/video/MZmyzfYz6CY/v-deo.html ☻
Really good explanation. Thank you for your help ++
I think you forgot to mention that the normal matrix also needs to be passed to the shader.
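For anyone hitting this in the lab code, a minimal sketch, assuming the cuon-matrix-style Matrix4 helper used in the assignments, that initShaders has stored the program on gl.program, and that the vertex shader declares a uniform named u_NormalMatrix (these names are assumptions, not something fixed by the video):

```js
// The normal matrix is the inverse transpose of the model matrix; lighting in
// the vertex shader needs it in addition to the model/view/projection matrices.
const normalMatrix = new Matrix4();
normalMatrix.setInverseOf(modelMatrix); // modelMatrix: the existing Matrix4 for the object
normalMatrix.transpose();

// Pass it to the shader, e.g. for: uniform mat4 u_NormalMatrix;
const u_NormalMatrix = gl.getUniformLocation(gl.program, 'u_NormalMatrix');
gl.uniformMatrix4fv(u_NormalMatrix, false, normalMatrix.elements);
```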
This was exactly what I was looking for!

Use case of non-symmetrical l/r t/b: let's say your face is not in the center of your monitor. Or you have multiple monitors (where the outer ones are not pointed toward you). Or perhaps you are walking around your monitor. In other words, any time your monitor (image plane) is at an angle to you. Taking head position (and therefore the relative vanishing point) into account, non-symmetrical values would create the properly skewed perspective on your monitor that, when viewed at an angle, would be the correct perspective to your eyes. This relative transformation could be considered keystone correction. Games that have portals inadvertently do this to correctly project the other side of the portal onto the image plane of the portal, as far as I understand.

Currently no software takes head position into account in this way. There are some head-tracking implementations (TrackIR), but as far as I understand they just change camera orientation. I propose that if head tracking were taken into account in this way, then monitors could be nearly as holographically awesome as VR. This is one of the things I'm currently playing with.
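One way to sketch the asymmetric frustum described above (a rough illustration under assumed conventions, not code from the lecture): treat the monitor as a rectangle in the z = 0 plane of a "screen space", measure the tracked eye position in the same units with eye.z > 0 in front of the screen, and project the screen rectangle onto the near plane to get the non-symmetrical left/right/bottom/top values.

```js
// offAxisFrustum is a made-up helper name; the math is the standard off-axis projection.
function offAxisFrustum(eye, scrLeft, scrRight, scrBottom, scrTop, near, far) {
  // Scale factor from the screen plane (distance eye.z) down to the near plane.
  const s = near / eye.z;
  return {
    left:   (scrLeft   - eye.x) * s,
    right:  (scrRight  - eye.x) * s,
    bottom: (scrBottom - eye.y) * s,
    top:    (scrTop    - eye.y) * s,
    near,
    far,
  };
}

// Example: eye 0.6 m in front of a 0.5 m x 0.3 m monitor, head shifted 0.1 m to the right.
const f = offAxisFrustum({ x: 0.1, y: 0.0, z: 0.6 }, -0.25, 0.25, -0.15, 0.15, 0.1, 100.0);
// Feeding f.left/f.right/f.bottom/f.top into glFrustum (or an equivalent setFrustum call)
// gives the skewed, keystone-corrected projection the comment is talking about.
```

Because left/right (and bottom/top) stop being symmetric whenever the head is off-center, the resulting image only looks correct from that head position, which is exactly the effect the comment proposes.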
Really crazy to me that they took so long to answer in the beginning 😂 uni is attended only by rich & spoiled babies
Not sure what you're on about. This is pretty much normal (hence the lack of reaction from the lecturer) here in Germany (where education is free) as well. You have to consider that this could be a lecture attended by a few hundred people, so lots of psychological effects can be going on. For instance, you've got introverts who know the answer right away but do not want to stand in the spotlight presenting it, and you've got people who have said a lot during the lecture already and want others to have a learning experience as well...

On top of that, this is an online lecture (likely Corona-lockdown related), where it's always hard to gauge the general mood of the room. Do people raise their hands? Do they look like they know? Ever since the Corona semesters, this has been a huge problem for me when lecturers don't use meeting features like emotes to ask for answers or to ask whether someone knows. People hesitate a lot more to just activate their mic and speak than they would to raise a hand in person.

This has nothing to do with being rich / spoiled in the slightest, imho.
@BGFutureBG Understandable tbh, it was just funny because I never went to uni and I would have answered directly.
sorry. what is this?
good
How do I calculate the up vector if I want to orient the camera inclined to the scene and not always horizontal?
Nice brief introduction!
10:45 I can tell the difference between 60 Hz and 144 Hz. I would bet $10,000 that I would get it right 100 times in a row in a blind test.
Oops I did an inky inky! Oopsie poopsie!
Nice explanation!
Can you please share a link to the software you are using (for teaching purposes)?
Great video, thanks.
thaaaaaaaanks, man, I got it because of your explanation!!!!!
goat status
In Windows 10 PowerShell, I was at first trying to use commands starting with $ python3 <cmd>, following the instructions at 8:00; however, it did not work until I removed the 3 and ran $ python <cmd>.
Great explanation!
Great lab!
Thank you very much, I needed this video!