Constantly moving to the next shiny thing while never taking time to refine. This is why I LOVE stuff like Pico-8. I think the concept of virtual-only consoles is really clever. These types of self-imposed restrictions inspire creativity. I also love seeing new physical hardware like the Playdate (and its included dev environment).
can anybody here explain to me why we actually need graphics drivers? my cpu doesn't need any drivers either. if i have a program which says to calculate 1+1 on the cpu then the cpu understands it directly. why is it that when i tell my gpu to calculate 1+1 it suddenly needs a driver for it?
because the GPU has its own instruction set, and you are sending the instructions via your CPU, so the bare minimum your CPU needs to know how to do is send "some arbitrary" data to the GPU, which is what the driver does under the hood
@@TheOnlyJura But the CPU also has its own instruction set. We just compile to a different target. We could very well do the same for GPUs and even have different languages. We just need an open specification for the hardware, then anybody could make a compiler for it.
@@TheOnlyJura my cpu has its own isa and my gpu has its own isa. c programs are compiled to machine code and shaders are compiled so they run on the gpu isa. PCIe is a clearly defined standard where no driver help should be needed.
Most APIs in general are bad. Typically they are full of unnecessary abstractions, tie you into an intrusive mental model, and don't support granular steps from quick prototyping down to controlling as much yourself as you need to get the production release done. With that said, it takes many iterations to make a great API. Every time it takes more than a few minutes (as a first-time user) to figure out how to use the API to achieve what you need, that's a symptom of bad API design in my book. Don't settle just because the norm is bad.
But… the hardware manufacturers don't make the standards. Different consortiums make the standards, like Khronos or the DirectX group at Microsoft. So what is John's point, since the manufacturers don't make the evil he complains about?
@@monkev1199 They are involved, certainly, but there is no single one of them that makes the standards, which is what John is complaining about. It's not some exclusive thing per vendor.
I kinda phased myself out. Back in the day I was in tight with opengl, directx and all that. Then I kinda had some life changes and was out of anything really programming or even computer related for about 15 years. Now the tools are the same but changed, and relearning all the tools and modern best practices is pretty daunting..
@@Ipanienko Your point underscores mine. C alone doesn't cover all FPGA data types; it relies on HLS for translation. Yet HLS is only somewhat encompassing, leaving us (developers) to handle specific FPGA requirements manually. In the end, modern graphics shaders are just more extensive.
@@Ipanienko well there is already an intermediate format for shading languages, called SPIR-V. don't know if macOS supports it, but you can just convert this format to MSL and you are good to go.
The thought of manufacturers designing their products for control. My naive mind can only think of one solution for this: anti-consumerism. But then, after buying second-hand laptops, I still contributed to the problem more or less. I don't fucking know what the solution is anymore, apart from DIY microchips and open source projects. But then, trying to source decent equipment, and Android (Google), is a shitshow already.
The poor state of modern APIs is kind of an indication that capitalism isn't quite giving us better products - competition seems to affect only the hardware. Same thing with modern entertainment - competition hasn't worked to give us better Alien, Terminator or Predator movies. It's all garbage. Why isn't competition working?
The electronic engineers and the embedded software dev community need to do their coming out and put in place an open source collaboration with independent graphics hardware makers, graphics driver programmers, and graphics programmers on the "front end". It really sounds utopian, and I am so sorry for everyone who is impacted by this shit situation.
I can understand and accept the sentiment, but hardware nowadays really is too complicated to be accessed directly. Still, open-sourcing specs, drivers and frameworks would make everyone's life so much easier.
i want to see more 3d menu systems, and title screens, make the interface fully extruded 3d and rotating around, i would like to see that, not done that often!!!
This is something that I am quite excited about the future of wgpu for - it compiles down (and quite effectively too, an actually good use of Rust's absolute overkill of 'zero-cost' abstractions) into basically whatever backend API you want: Vulkan, DX, Metal, and if you want to support antique toasters it can do OpenGL no problem, plus WebGL and WebGPU (confusingly, wgpu is also the core of Firefox's WebGPU implementation. Mozilla moment I guess). And it's a heck of a lot nicer to write than Vulkan, speaking from experience. While I've seen bindings for other languages out in the wild, it is still Rust native - and does depend on its features kinda deeply. So if you don't want to use Rust, you've still got a while to wait most likely. also, grrr CUDA >:{

One of the big names at Intel, don't remember exactly who, said the world should push towards open standards (specifically calling out CUDA) for how to interact with a GPU. Rare hardware manufacturer W? Be interesting to see how much Intel follows up on that in their own product lines
I still think the most universal model for graphics is writing directly to the screen pixels. Unfortunately it seems you just can't get any speed out of that. APIs have not favored the programmer; instead we're having to bend ourselves and bang our heads on the wall as we build cabins out of sawdust and glue. I don't think there's been a more hostile programming era for the new programmer. We have tools that attempt to do everything for us, yet we have horrid abstractions around graphics, we need entire engines just to manage everything, and nobody truly understands what's going on at the end of the day because everybody's worried about learning the 10% of glue they need to make everything work.

A big part of it too is the expectations of the people who receive the games. If the graphics API was simply coloring screen pixels, your development kit consisted of a core language and some multimedia libraries, and the customer would settle for much simpler games with less content, we'd all be better off. You look at game programming books in the '90s and early 2000s and you essentially see a core language, some supporting multimedia functionality, and that's it; you get a game in around 200 pages. Go back to the '80s and you basically get 4 pages of BASIC to make a game. So we've gone from a program listing of four pages, to a book of 200 pages, to five books covering various topics or just blindly learning a game engine.
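To make the "just write screen pixels" model concrete, here is a minimal sketch in C. It renders into a plain memory buffer and dumps it to a PPM file, so nothing in it depends on any graphics API; the resolution and the red-gradient "scene" are placeholder choices for illustration, not anything from the comment above.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define W 320
#define H 240

int main(void)
{
    /* The whole "graphics API": one 32-bit 0x00RRGGBB value per pixel. */
    uint32_t *fb = malloc((size_t)W * H * sizeof *fb);
    if (!fb) return 1;

    /* "Rendering" is just arithmetic on the buffer: here, a red gradient. */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            fb[y * W + x] = (uint32_t)(x * 255 / (W - 1)) << 16;

    /* Dump as a PPM image so the result can be viewed with no API at all. */
    FILE *f = fopen("frame.ppm", "wb");
    if (!f) return 1;
    fprintf(f, "P6\n%d %d\n255\n", W, H);
    for (int i = 0; i < W * H; i++) {
        unsigned char rgb[3] = { (fb[i] >> 16) & 0xFF, (fb[i] >> 8) & 0xFF, fb[i] & 0xFF };
        fwrite(rgb, 1, 3, f);
    }
    fclose(f);
    free(fb);
    return 0;
}
```

Build it with any C compiler (e.g. `cc frame.c -o frame`) and open frame.ppm in an image viewer; the contrast with a 1000-line modern-API triangle is exactly the point being made above, with the caveat the comment already concedes: a plain CPU loop like this cannot match GPU speed.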
OpenGL has fragment shaders. With unified shaders you can pull in the rasteriser, or just leave it out? Nanite and Cycles use the GPU, but not its rasteriser. Would be cool if we could just use pure functions, forEach, and a hierarchical z-buffer. Cube maps as code.
You understand that in the 80s it was all software rendering, right? And that with modern GPU APIs, the responsibility is moved from the driver/API implementation to the application? Do you really think moving from DirectX 11 to 12 suddenly added tons of code? It just shifted who has to write the code needed to drive the hardware.
We kinda are going back to software rendering. Maybe in the future we will have a unified CPU/GPU architecture with massively parallel general purpose compute units that just write data to the framebuffer. I think Sony originally planned something like this with their CELL processor on the PS3. Apparently the PS3 wasn't going to have a GPU originally and all of the graphics would be rendered using SPE cores.
Metal is an amazing shading language; shame that Apple locked it to their proprietary hardware. But on the other hand, part of the benefit is that Metal can be strongly coupled with their GPU hardware, which means they can make it less general and more efficient. This is pretty important when considering GPUs.
Ughh this is the exact problem I'm in the process of solving, but the likelihood of piracy and/or no payment is kicking me in the gut so hard. /whinge
Did this dude just complain about shader languages? Of all the problems that face video game developers, is the inability to rewrite the shader language really something to focus on? There is literally no game that would become possible to make in reasonable time if you had a new shader language. I'm reminded of the Steve Jobs Takes Criticism video where the dude complains about Java and OpenDoc. There aren't any customers that care, dude.
@@Morimea You misunderstand, friend. I'm not saying there isn't any _technology_ you can't implement. I said there aren't any _games_ that would be unlocked by shader language innovation. C++ is far more "outdated" than GL, but there aren't any games that will become possible with a new programming language.
we need to reimagine the gpu from the ground up using first principles. we can't just leave nvidia, amd, and intel to keep using their bloated proprietary architectures
Sorry about the audio on this video. It will be fixed in the next upload.
Can you post the dates of the videos? Are they months or years old?
That means you'll reupload the video? Because in this state it's hard to follow along at all :D
Great man, you cleanse my mind and my view of programming... I'm only a script guy andy... I'm new to programming and I'm lucky to have encountered your channel before going into this journey... I love scripts because they are straightforward.
@@itsdanott I'm glad someone else pointed it out, I was gonna say, this feels like the most schizophrenic clip I've listened to in ages, the audio issues were actually making me frustrated with the cake and tea tangents
As a graphics programmer I never considered any of this, but he's absolutely right. To my knowledge, Vulkan is the most frankenstein of them all because it attempts to find a common abstraction for every vendor. In a perfect world where every vendor opened up their hardware, abstractions like Vulkan's notoriously awful descriptor sets would either not exist or be a lot simpler to implement.
You guys still write 1000 lines of code to draw a basic damn triangle? Man, you gotta be paid more than React devs at Netflix. That shit is the hardest thing I have ever seen in my life.
When I saw this, I totally gave up and went to network programming, I'm so chill on the Linux terminal screen.
@@-Engineering01- Huh?
You act like those abstractions aren't there for a reason. In this hypothetical world where every vendor opened up their hardware, do you think graphics programmers are going to learn dozens of different ISAs and implement engine backends for dozens of GPU-specific APIs? Seems like you'd still just end up with some GPU-independent abstraction...
@@-Engineering01- You're lucky if it's just 1000 lines. If you learn it long enough, you'll develop Stockholm Syndrome for the API.
Maybe it is just me but concepts in Vulkan were very straightforward for me. Yes, it is a lot of boilerplate at the beginning but the way to abstract them away for a game felt very natural.
I started programming about 15 years ago, and I feel like I was right on the edge of "old school" and "modern" development. I kinda wish I had the opportunity to be a software engineer during the days of actually needing to implement stuff and not just being a glorified infra engineer stitching together a bunch of NPM libraries.
You're comparing old school programming to JavaScript or web based programming? That's not a great comparison for understanding how modern programming differs from the old days. A more reasonable comparison would be doing the same thing, for example making a game in 2005 vs 2020. Many people in 2020 will probably use hardware rendering, which comes with its own complexities and setup, and also have to create much more elaborate, generalized and complicated systems. Many people still do that even though we have things like unity, so if you're really interested and have the skills, you can still make things from scratch which comes with many benefits.
Jokes on you, we've always been digital plumbers.
@@gabereiser Not true.
It took me weeks to learn that opengl is basically a state machine, and you're just calling functions that turn on/off features. I wouldn't say it's simple but I feel like that should have been the first line of every beginner tutorial I did.
Can you explain further?
The best resource I found for OpenGL is the Khronos Group wiki; I think they mention early on that OpenGL is a state machine. As a side note, every concept there is explained really well. Would recommend over any tutorial or book
It IS explained in the first chapter in most if not all popular opengl learning resources. Khronos' "What is OpenGL" page also explains this.
You really need to be good at low-level stuff (pointers, memory layout, structs, image formats, file I/O) to use opengl
In OpenGL's defense, they do state this fact in the very first chapter of the learnopengl tutorial: that it's all basically a state machine.
As a C graphics programmer, 100% agree on this
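Since several comments above describe OpenGL as one big state machine, here is a rough sketch of what that means in code. It is an assumption-laden fragment, not a full program: it presumes a current GL context and loaded function pointers (e.g. via GLFW plus glad), and that the `program` and `vao` handles were created elsewhere.

```c
#include <glad/glad.h>   /* assumed loader; any GL function loader works */

void draw_frame(GLuint program, GLuint vao)
{
    /* Every call below just flips a switch in one big global state machine;
     * nothing is drawn until glDrawArrays reads whatever state is current. */
    glEnable(GL_DEPTH_TEST);                 /* turn a feature on             */
    glDisable(GL_BLEND);                     /* turn another feature off      */
    glClearColor(0.1f, 0.1f, 0.1f, 1.0f);    /* set a piece of state          */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glUseProgram(program);                   /* "currently bound" shader      */
    glBindVertexArray(vao);                  /* "currently bound" vertex data */
    glDrawArrays(GL_TRIANGLES, 0, 3);        /* draws with whatever is bound  */

    glBindVertexArray(0);                    /* reset state so later code     */
    glUseProgram(0);                         /* isn't surprised by leftovers  */
}
```

The "first line of every tutorial" point is that none of these calls take the objects they affect as arguments; they read and write hidden global state, which is why forgetting one bind somewhere else silently changes what this function draws.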
This is why everybody ran for the hills and started using web browsers for everything. On a short term basis it's easier to deal with the insanity of web browsers than the rocket science of native applications.
100%, we give out about web devs, but there are valid reasons why things devolved into the current state of affairs; drivers and graphics APIs ruined native development. It's still doable if you have the know-how to avoid most of the complexity, but it's far from ideal
Then somebody came up with React and thought it was the best thing invented in the history of the universe.
@@CianMcsweeney Yeah you can still do things if you understand the full "ecosystem" of garbage but even then it's not going to be very portable without abstracting the entire machine to death anyway.
And nearly 95% of applications, aside from games, can be done on the web without noticing serious performance problems. Even a lot of 3D applications can be written thanks to WebGPU.
I mean I don't care about games at all (don't play them much), so the web is OK for me; I even play basic games in my browser only.
@@-Engineering01- that would be true if web apps were actually written well, most of them are not
Jonathan Blow on cake as palate cleanser
Lol
I’m a software engineer who uses OpenGL for graphics in our company’s software. In my opinion, it is completely impossible to gather the graphics rings of power and bind them in darkness. Instead, I have resigned myself to the fact that I will need to just know each shading language and each use case, and just do the rework a bunch of times. There’s no better solution unless there’s only one GPU manufacturer.
you already use a bunch of shading languages, which compile to SPIR-V, with Vulkan
My first introduction to "programming" was HTML and Visual Basic. I was on the cusp of the design philosophy changeover you mention. When I first started, it seemed the only real way to get anything done was to target older machines where you had direct access to the hardware. Things had started to be abstracted so far that you needed to learn dozens of things just to render something on the screen. It was really disheartening. I ended up pushing towards Ruby and Python for a little bit. Everyone at the time was heralding them as the future of programming. Then I got stuck on Java for a little bit. Over the years I ended up building skills for DevOps more so than explicitly programming, so I pushed for a career in IT. I do wish I had been exposed to more bare metal programming first. I would have fallen more into software development had that been the case.
The original purpose of computation was to reduce friction. You program a system so that you can solve a problem which previously had high friction. Bank transfers or company document storage were already solved with physical technology, but now we are able to do them frictionlessly. People saw how they were able to make these physical solutions frictionless by creating software, so they abstracted the idea: what if the purpose of software isn't to make originally physical systems faster, but instead, how can we make the process of creating software itself frictionless?
Many of the tools we developed in the past were motivated by solving these physical problems, and thus needed to provide what was necessary to make something frictionless. What has happened in recent years, say the last 1.5 decades, is that many of the tools we had in the past have been replaced by ones which are themselves "frictionless". The problem is that these new tools lack the ability to solve the real physical problem, which was to make these originally physical systems frictionless.
Basically we took the tools which enabled us to make frictionless systems, pointed them only at the tools themselves, and that resulted in the loss of actually solving the problem.
The worst part is these hardware manufacturers don't even get the competitive advantage they seek. What actually happens is they just create significantly more work for engine developers, who have to support multiple shader APIs. Basically nobody wins and everyone else loses.
agree, a large reason why many games never released for PC was due to how difficult it was (and still is) to release bug-free pc releases due to the driver and graphics api mess
@@CianMcsweeney PC has the biggest game library out of any platform, you are talking shit.
@@doltBmB And yet there are still hundreds of console games that were - and probably will be - never ported to PC.
@@doltBmB I'm talking about the releases most average people want to play. Of course there's loads of great indie games on pc. My point was, that graphics vendors have made it unnecessarily difficult to develop for PC
@@CianMcsweeney you live in a consolified bubble if you think people only want to play AAA third person shooters
I totally understand what Jon is saying and largely agree, but I feel like his analysis tends to stop prematurely at the technical level.
Are AAA games really hampered today primarily by old technical debt? I'm sure that's a major factor, but I would argue it's a similar problem with movies. When 100 million dollars are being thrown at a project, no investor is ever going to take on the liability of giving artists the freedom they need to make anything but the most mass-appeal trough slop you've ever seen. All while trying to save money by using new developers desperate for work, or offshoring pieces of their development entirely.
In this kind of environment, is it really a surprise nobody takes the time to learn the intricacies of their machines? It's not out of fear or ignorance that people don't know their stuff. I would argue it's almost entirely a form of market failure. We racked up too much technical debt and traded quality for development speed, and now in many ways it's become a defining quality trait to actually do things well instead. The pendulum eventually HAS to swing back that way.
And no, I don't agree with Jon's "bronze age collapse" model. I think we'll be able to dust off the C manuals, establish better hardware standards or all switch to Rust before we just pack it in and decide nobody can use computers any more, lol.
The cuts in the video are driving me nuts.
Vulkan and DX12 are considered modern graphics APIs. DX11 (and previous), OpenGL, and Metal are considered more as Legacy APIs. The main difference between modern and legacy is that the burden of syncing the CPU and GPU is now put in the hands of the developer with modern APIs. Although this leads to more complex code bases and a steeper learning curve, the good thing about this is we can have (almost) complete control over resources shared between the CPU/GPU. This is huge for performance possibilities! Keep in mind, having control like this is also a double edged sword. Not everyone will know how to write code that manages resources correctly.
Additionally, these changes help the embedded world once it catches on (embedded seems to always lag so far behind, so who knows how many years until OpenGL ES is phased out)
If you are new to learning graphics APIs in general, start with the legacy ones first because they handle the hard part of memory management and GPU sync. Starting with DX12/Vulkan will be much more challenging, and you will end up just learning more memory management than the graphics related aspects. But if that's what you want, go for it!
Metal isn't a legacy API. It is also an explicit API, it's just not as verbose as Vulkan.
Metal isn't a legacy API. It's everything Vulkan should have been. Its only major issue is that it's only available for Apple platforms. There is a reason why macOS doesn't have this desktop app issue that Windows/Linux have. It's extremely easy to make desktop apps on macOS.
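To make the earlier point about modern APIs handing CPU/GPU synchronization to the application concrete, here is a minimal hedged sketch in C using a Vulkan fence. It assumes a `device`, a `queue` and an already-recorded command buffer exist, and it omits error handling; it is an illustration of the pattern, not a complete renderer.

```c
#include <stdint.h>
#include <vulkan/vulkan.h>

/* In GL/D3D11 the driver hid this hand-off; in Vulkan/DX12 the application
 * owns the fence and decides when the CPU waits on the GPU. */
void submit_and_wait(VkDevice device, VkQueue queue, VkCommandBuffer cmd)
{
    VkFence fence;
    VkFenceCreateInfo fence_info = { .sType = VK_STRUCTURE_TYPE_FENCE_CREATE_INFO };
    vkCreateFence(device, &fence_info, NULL, &fence);

    VkSubmitInfo submit = {
        .sType = VK_STRUCTURE_TYPE_SUBMIT_INFO,
        .commandBufferCount = 1,
        .pCommandBuffers = &cmd,
    };
    vkQueueSubmit(queue, 1, &submit, fence);  /* hand work to the GPU; returns immediately */

    /* Explicitly block the CPU until the GPU signals the fence. */
    vkWaitForFences(device, 1, &fence, VK_TRUE, UINT64_MAX);
    vkDestroyFence(device, fence, NULL);
}
```

Real code would keep fences (and semaphores) per frame in flight rather than waiting immediately; the point here is only that the wait is now something the developer writes, which is both the extra control and the extra burden the comment describes.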
Man my nigga I dig this dog. Mf thought deep and shared some real shit ,bigsub
The biggest obstacle to fixing graphics APIs is that Apple, Sony, Microsoft, nVidia etc. don't want to fix them. Everyone wants to push their own library on the platform they control.
So options are:
1. Write graphics on browser
2. Use some graphics API abstraction layer.
“Everything is too big and too horrible.” lol 2020s in a nutshell.
Yet Open Source keeps winning!
Blender isn't bloated and horrible, Godot isn't bloated and horrible. The world is waking up to the fact that Open Source software is just better due to its sole desire to give the end user the best experience over monetary considerations.
i was thinking of learning dos because, as a hobby, i would like to develop games in c for a simple system that isn't subject to terrible change. i'm happy to go down that path but the fact that i couldn't find a sweet spot like that in a relevant modern platform has made me feel very disappointed
Gamedevs don't put up with all of the APIs and issues - they resort to using an existing game engine instead, which is a bummer for the reasons JB outlines here, all the complexity of the undertaking entailed.
I come here purely for the Jon Blow cake reviews, I don't know what all y'all are talking about here. Like, is Jon known for anything other than his amazing cake reviews?
I miss the "old days" of programming.
Are you a psychopath? It was worse than it is today.
Omfg this is so true and even tho i dont make video games it applies to webdev and just software development in general
I started writing a new kind of forward predictive renderer. I don't see a need for more than simple compute shaders to crunch numbers to utilize the full GPU.. I decided to instead leverage how AIs like NeRFs and Gaussian splat neural networks calculate graphics. I honestly think we are heading in this direction. It will be more about how many numbers you can crunch at the end of the day :)
Cram more buzzwords in next time.
@@lithium forgot my trusty circular quadratic algebra! 🤣
Cool idea! I'm curious how it develops
But GPUs have hardware dedicated to rasterizing triangles and specialized caching organization and mechanisms for texture accesses, among other things, and it's always going to be faster than a shader core executing algorithms to do the same work.
What did you write? It sounds cool.
GPUs are actually very simple. Writing a direct driver for a particular GPU wouldn't be much harder than using Vulkan. Vulkan is basically the lack of a graphics library. It's just a thin abstraction layer providing things that most GPUs have in common.
The only game in town is OpenGL and it’s also the one everyone is waiting for to die. The API scene is so bad that OpenGL cannot die.
Apple stopped supporting it, right? That was what killed Adobe Flash.
it is already marked as a legacy api by almost everyone. nobody uses opengl anymore.
Did you time travel here from 2004?
Vulkan has more problems than OpenGL does. I was sent from 2004 to save you all.
@@smallbluemachine what about directX?
Your analogy is flawed.
It's not that when C was made, someone made a compiler for ALL architectures - far from it. C was made for a specific reason, a specific need on a specific platform. Then, over the years it prevailed because it filled a gap, and it prevailed with lots of vendors implementing it in their own way - gradually then becoming a true standard one could rely upon and so on and so forth.
Doing the same for a GPU would be trivial; you can make your own toy language for GCN for example, call it G. But it probably would not really serve a big need, and thus won't spread and become a standard, etc. etc.
It's true. It's sad but it's the reality of the situation. Every GPU/System company wants to have control over the process so all of them have their own GPU language. There is no incentive for them to support or care about a universal language. That's the only reason we have Direct3D, Vulkan, OpenGL, Metal, and all the other proprietary console graphics API's, each with their own shading language.
I don't see how this is different from CPUs. There are already shader languages that compile to different GPUs respectively, depending on your toolchain; GLSL, HLSL, WGSL and Futhark just to name a few.
@@dealloc it is different from CPUs: amd and intel share the same instruction set, x64/x86; on desktop only Apple uses a different architecture/instruction set. Sure, microcode is different, but the programmer doesn't need to care about that. If there was some common hardware interface or instruction set used between GPU vendors we wouldn't be in this situation. The ideal futuristic option is to not need a GPU at all, and to rely on expanding the width of CPU vector registers to make GPU ops doable on CPUs with good speed
@@CianMcsweeney That would be awesome and make draw calls far less expensive surely.
@@quaker5712 potentially yes, would be a massive change however so not sure how likely it is to happen
@@CianMcsweeney > on desktop only Apple use a different architecture/instruction set
Apple's Silicon CPUs are RISC-based SoCs built on the ARM architecture, which is deployed on far more than just Apple's devices.
RISC-V is developing an open-source GPU. So the problem of shading language access may be alleviated in the near future.
Even RISC-V cores are used in NVIDIA GPUs nowadays. But that doesn't necessarily mean that any of it is exposed to the user, aka the programmer. That's why I like embedded devices from small vendors... they include hardware documentation with every single bit in every single register described. The downside is: if I owned such a device and wanted to take full advantage of that and write everything besides the Linux kernel and GNU tools myself, that would be a task for the next ten years. As mentioned in the video: things got WAY too complex.
There are already a couple out there. GPLGPU is an implementation of a 1990s-era GPU (so it's all fixed-function stuff), and Nyuzi and MIAOW are manycore compute cores (the latter using AMD's Southern Islands ISA).
Oh, and of course, there are HDL re-implementations of the GPUs from classic games consoles for use in FPGA projects like MiSTer.
100% on point
Jonathan, I work for Imagination Technologies (PowerVR GPUs). We made the ISA public for our latest two generations (some people even managed to get SGX reversed engineered for PSVita).
Unfortunately, even if we manage to expose ASM directly to programmers so they can run shaders the way they want, very few brave graphics developers will attempt to do it. Also keep in mind that not only between different families of GPUs, but even within the same family, you can write code on one PowerVR BXM core that will basically hang the GPU on a slightly different permutation (just because that instruction is trying to access or do something that is not supported). This will only work on closed systems where you know that the hardware is never going to change.
I'm reminded of the Nintendo 64's RSP (8 x 16bit vector unit), mostly used for vertex processing and triangle setup and sound mixing, where games would normally not use custom RSP code but rather use pre-written "microcodes" supplied in the development kit... except for Factor 5 for their Indiana Jones and the Infernal Machine game, which showed Factor 5's demoscene roots and used custom RSP code and rendered in high resolution! ... and came out very late in the life cycle of the N64, and was not one of the top 50 selling N64 games... It seems that for these kind of game system, it's more important to get good performance from middling code, than great performance from exceptional code...
@@boptillyouflop I thought N64 RSP microcode was not available (at least at the beginning).
@@SuSh0xka Oh... Well, I don't know I guess. You have a point... I guess that they must have only been supplied the pre-compiled RSP .o object files (which the SDK code would copy into the RSP instruction RAM to do vertex and audio processing).
I'm a medical laboratory technician. But I don't know why, I really like everyone in the programming world and appreciate their contribution, by whatever means they give it to the world.
Along with our contributions, you should also count our negative impact. For example, Twitter is a massive shit stain on humanity, we made disinformation commonplace, we are ridiculously overpaid and bring prices up for everybody, we facilitated mass distribution of digital pornography which is an epidemic, we made scamming a viable career choice, and the list goes on.
While a medical laboratory technician's work might be more straightforward than solving engineering problems, a functioning society is fucked without your team, while it does OK without my team.
@@StarContract still more positives than negatives
We created TikTok to lower the intelligence of human beings.
We created Whatsapp, Google and Facebook to gather and sell all of your data, messages, pictures and videos to the governments and companies.
We created Microsoft Windows, Android and iOS to spy on you, collecting all of your private data and handing it to the CIA and FBI.
Do you still respect us ?
@@StarContract If the net benefit was so terrible, there would be no hesitation to combat the industry. Let's not pretend like our industry doesn't bring any value
Gaining a fundamental understanding of everything in your stack/all the tools/concepts you use to build things is very time consuming. Like for example, if I wanted a fundamental understanding of what projection in 3d graphics actually does / what projection matrices are, I would need to study linear algebra.
Studying linear algebra doesn't necessarily require trigonometric knowledge, but there's a fundamental relationship between trig & projection & inner products & magnitude so you really should.
Eventually, if you're like me, you forget what you were actually studying for in the first place.
This is a real issue... I mean I'm literally here because I was looking for Godot Tutorials, now I'm watching videos on graphics APIs... like what am I even doing here?! 😅
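For anyone wondering what a projection matrix "actually does": it maps view-space points into clip space so that the later divide by w produces perspective foreshortening. Below is a small sketch of the usual OpenGL-style perspective matrix in C; the column-major layout and the [-1, 1] depth range are assumptions matching GL conventions, not anything stated in the comments above.

```c
#include <math.h>

/* Fills a column-major 4x4 matrix for a GL-style perspective projection. */
void perspective(float m[16], float fov_y_rad, float aspect, float z_near, float z_far)
{
    float f = 1.0f / tanf(fov_y_rad * 0.5f);
    for (int i = 0; i < 16; i++) m[i] = 0.0f;

    m[0]  = f / aspect;                                /* scale x by field of view  */
    m[5]  = f;                                         /* scale y                   */
    m[10] = (z_far + z_near) / (z_near - z_far);       /* squeeze z into clip range */
    m[14] = (2.0f * z_far * z_near) / (z_near - z_far);
    m[11] = -1.0f;  /* copies -z_view into w; the divide by w afterwards is what
                       makes distant things smaller (the "perspective divide")     */
}
```

After multiplying a view-space point by this matrix, the GPU divides x, y and z by w, and that single division is where the trig, inner products and magnitudes the comment mentions all meet.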
Modern Graphics Programming and its consequences have been a disaster for the human race
Lol troll
based blow pilled comment
Honestly this is a poor take IMO.
What do you mean "locking up shading languages" ? You know that HLSL can be compiled in Vulkan, right? Also GLSL runs on any vendor right now, except maybe Apple. So, I don't understand this complain.
Secondly, do you know that OpenCL exists? Is the closest thing to C and it's more of the general compute platform that you are talking about. This is also runs on any vendor (again, except Apple), so, again, what's the problem here?
Do you want a simple API for graphics, well, you can always use OpenGL or D3D11, they still work you know. Now, as developer, I am grateful that Dx12 and Vulkan exist, because now you can optimize your app or game to take full potential of the hardware. In the past, we had to depend on the vendor to optimize the driver for specific games, it was worse than now, so, again I don't understand your rant.
Yes, Dx12 and Vulkan ARE complicated, and there is a reason, there is no free lunch, GPUs are complicated, and they are not complicated for some conspiracy theory of Nvidia to make them complicated, is just progress. They are required to do thousand of things now that were not a thing in the past, video encoding/decoding, general compute, AI inference, raytracing, etc. You can write a Basic interpreter for GLSL if you want, you can also go an write your open source driver for Intel if you want (like the i915).
Finally, are you advocating for Open Source drivers? or Open Source RTL? Like the design of the GPU? Who will fund this? Who will validate this? Is not a raspberry PI you know. Seems to me you are trying to reduce everything to: complicated bad, simple good.
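Since the comment above points at OpenCL as "the closest thing to C" for general GPU compute, here is a stripped-down sketch of what that looks like: a vector-add kernel written in OpenCL C plus the host-side boilerplate. Error checking is omitted and the first platform/device is assumed; treat it as an illustration under those assumptions, not production code.

```c
#include <stdio.h>
#include <CL/cl.h>

/* The kernel itself really is almost plain C. */
static const char *src =
    "__kernel void add(__global const float *a, __global const float *b,\n"
    "                  __global float *out) {\n"
    "    int i = get_global_id(0);\n"
    "    out[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], out[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Grab the first platform/device; real code should check every return value. */
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL); /* deprecated but ubiquitous */

    /* The driver compiles the kernel source for whatever GPU it finds. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "add", NULL);

    cl_mem A = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
    cl_mem B = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
    cl_mem O = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof out, NULL, NULL);
    clSetKernelArg(k, 0, sizeof A, &A);
    clSetKernelArg(k, 1, sizeof B, &B);
    clSetKernelArg(k, 2, sizeof O, &O);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, O, CL_TRUE, 0, sizeof out, out, 0, NULL, NULL); /* blocking read */

    printf("out[3] = %f\n", out[3]); /* expect 9.0 */
    return 0;
}
```

Link against an OpenCL SDK (e.g. `cc add.c -lOpenCL`); which vendor's runtime ends up executing it is exactly the portability question both sides of this thread are arguing about.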
Like almost everything this clown says, it's a non-issue that no one else is hung up on. This doesn't matter at all, and it's especially funny coming from someone working on pissant indie games that don't even NEED a modern 3D API to begin with.
Isn't SPIR-V basically a universal shading language?
It's more of an assembly format.
It's not officially supported by macOS or the web (wgpu) AFAIK, so you still need to transpile to MSL and WGSL respectively.
@@nolram Sure, but since we have SPIR-V, what hinders us from coming up with a new language that compiles to SPIR-V and uses existing toolchains to then transpile to specific implementations?
I know it sounds a bit ridiculous, but we do much crazier stuff nowadays
@@dahahaka Good news: that is, partially, what we are already doing! Although SPIR-V is mainly (and pretty much only) consumed by Khronos standards (OpenGL, Vulkan, OpenCL), it is very easy for shader compilers to target. Even the Microsoft DirectX Shader Compiler (DXC), which compiles for DirectX, can also output SPIR-V, which is why most Vulkan applications use HLSL, as that is the input language to DXC.
I'd say it's more like bytecode for the Java Virtual Machine. Compile your Java (or other language) program to bytecode, then load that up in the JVM to actually run it, and it need not know about the specifics of what architecture it's actually running on. Same idea for SPIR-V, except with compilation, rather than interpretation. You compile your GLSL shader to SPIR-V and save it to disk, then at load-time the SPIR-V version gets handed to the graphics driver, which compiles it to whatever the GPU wants.
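To make the load-time half of that concrete, here's a minimal sketch in C using the Vulkan API (assuming you already have a VkDevice and the compiled SPIR-V words in memory; error handling omitted):

```c
#include <vulkan/vulkan.h>
#include <stddef.h>
#include <stdint.h>

/* Hands precompiled SPIR-V to the driver, which lowers it to the GPU's own ISA. */
VkShaderModule load_spirv(VkDevice device, const uint32_t *words, size_t size_bytes) {
    VkShaderModuleCreateInfo info = {0};
    info.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
    info.codeSize = size_bytes;   /* size in bytes, must be a multiple of 4 */
    info.pCode    = words;        /* the SPIR-V "bytecode" */

    VkShaderModule module = VK_NULL_HANDLE;
    vkCreateShaderModule(device, &info, NULL, &module);
    return module;
}
```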
Gone are the days of bare-metal coding like on the C64 and Amiga. Oh, how glorious were those days ;)
Yes.
I don't think this is true; the real problem is that your shader-language fix would suck about as much as GLSL does. Just look at how many tries it has taken people to get programming languages right. When we finally got Crablang, some people thought it was so good that they made WGSL, with its 'Vec4' that nobody uses.
Vec4 is the ultimate in optimizing for genericism over the most common case.
Like, I get it. It makes perfect sense. But it doesn't feel good to use so it doesn't fuckin matter
Why does it skip every 10-or-so seconds?
because proprietary software is trash and things are too complicated nowadays
Bug when exporting
Because he's eating cake.
Why isn't there a shader-language transpiler, if having so many shader languages is such a big problem?
Transpiling doesn't solve anything if the semantics of the shader languages are drastically different. Transpiling is only possible if there are semantic equivalents you can rearrange syntax to make.
There is. Loads. Even the DirectX Shader Compiler, which is open source, supports SPIR-V output.
@@nolram Then what is Blow's complaint about? It seems like this problem is solved.
@@HairyPixels I DON'T KNOW! No idea what he is complaining about. I get his point about modern rendering being overly complex, that makes sense, but most of the rest of this rant seems pretty nonsensical to me... Vulkan can be used to counter most of his points about vendor lock-in.
@@HairyPixels What's the complaint? That's like saying the problem of game development is solved because engines exist. Game development requires low-level control; he doesn't want a transpiler, he wants to communicate with the GPU directly, the way he does with the CPU.
@JonathanBlow So, basically, the reason software development is so abstracted and 'crappy' is the need to constantly evolve the underlying hardware and systems, driven by marketing and profit. I used to use the Amiga ecosystem as an example of what happens when you have a hardware base that stays more or less the same over a long time period. The creativity and amount of problem-solving that went into making extraordinary experiences with very constricted technology was astounding. And I question whether the same could happen now, IF hardware weren't so rapidly evolving. Software that actually caters to the hardware 100% takes so much longer to develop than the hardware itself. Very interesting take there!
Constantly moving to the next shiny thing while never taking time to refine.
This is why I LOVE stuff like Pico-8. I think the concept of virtual-only consoles is really clever. These types of self-imposed restrictions inspire creativity. I also love seeing new physical hardware like the Playdate (and its included dev environment).
Can anybody here explain to me why we actually need graphics drivers? My CPU doesn't need any drivers. If I have a program that says to calculate 1+1 on the CPU, the CPU understands it directly. Why is it that when I tell my GPU to calculate 1+1, it suddenly needs a driver for it?
Because the GPU has its own instruction set, and you are sending the instructions via your CPU. So the bare minimum your CPU needs to know how to do is send "some arbitrary" data to the GPU, which is what the driver does under the hood.
@@TheOnlyJura But the CPU also has its own instruction set; we just compile to a different target. We could very well do the same for GPUs, and even have different languages for it. We just need an open specification for the hardware; then anybody could write a compiler for it.
@@TheOnlyJura My CPU has its own ISA and my GPU has its own ISA. C programs are compiled to machine code, and shaders are compiled so they run on the GPU's ISA. PCIe is a clearly defined standard, so no driver help should be needed there.
Because GPU instructions are purposely hidden away behind the driver. But I'm not an expert; I'm sure there are other reasons for it too.
@@anonymouscommentator your network card needs a driver too
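One way to see what the driver actually does for you: with OpenCL (mentioned above), the application hands the driver portable kernel source at runtime, and the driver compiles it to whatever ISA the installed GPU happens to speak. A minimal sketch in C, assuming an OpenCL runtime is installed and skipping all error checks:

```c
#include <CL/cl.h>
#include <stdio.h>

int main(void) {
    /* The portable "1+1 on the GPU" part: plain kernel source, no GPU ISA in sight. */
    const char *src =
        "__kernel void add(__global const float *a, __global const float *b,"
        "                  __global float *out) {"
        "    size_t i = get_global_id(0);"
        "    out[i] = a[i] + b[i];"
        "}";

    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);

    /* This is the driver's job: compile the portable kernel, at runtime,
     * to the instruction set of whatever GPU is actually present. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);

    printf("kernel built for the installed GPU\n");
    return 0;
}
```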
There are compatibility libraries like bgfx; maybe that's the best way currently.
Low level API's have been an absolute disaster.
High level you mean
@@botbeamer No.
Most APIs in general are bad: typically full of unnecessary abstractions, tied to an intrusive mental model, and lacking support for granular steps from quick prototyping down to controlling as much yourself as you need to get the production release done. With that said, it takes many iterations to make a great API. Every time it takes a first-time user more than a few minutes to figure out how to use the API to achieve what they need, that's a symptom of bad API design in my book. Don't settle just because the norm is bad.
But… the hardware manufacturers don't make the standards. Different consortiums make the standards, like Khronos or the DirectX Group at Microsoft.
So what is John's point, since the manufacturers don't make the evil he complains about?
Take a quick look at the Khronos members list and come back to me about the hardware manufacturers not being involved.
@@monkev1199 They are involved, certainly, but there is no single one of them that makes the standards, which is what John is complaining about. It's not some exclusive thing per vendor.
what's with the sound?
I kinda phased myself out. Back in the day I was in tight with OpenGL, DirectX and all that. Then I had some life changes and was out of anything programming- or even computer-related for about 15 years. Now the tools are nominally the same but have changed, and relearning all of them plus modern best practices is pretty daunting...
You don't need a shader language; you just need a generic, pre-programmed PBR shading system. Yes, like when it was a generic texture-rasterization system, but with T&L.
I honestly think it's not comparable: C had to deal with a few different data types/formats, versus the massive complexity of modern graphics shaders.
C runs on FPGAs. It's not just a few different data types.
@@Ipanienko Your point underscores mine. C alone doesn't cover all FPGA data types; it relies on HLS for translation. Yet HLS is only somewhat encompassing, leaving us developers to handle specific FPGA requirements manually. In the end, modern graphics shaders are just more extensive.
@@brhvitor4 That's not my point. You are still vastly underselling the flexibility of C
@@Ipanienko Well, there is already an intermediate format for shading languages, called SPIR-V. I don't know if macOS supports it, but you can just convert that format to MSL and you're good to go.
Will AMD ROCm fit in here somewhere? It’s open source, but I’m unaware of its use outside of deep learning applications.
I'm unaware of its use even in the field of deep learning lol everything is CUDA.
The thought of manufacturers designing their products for control. My naive mind can only think of one solution for this: anti-consumerism.
But then, after buying second-hand laptops, I still contributed to the problem more or less.
I don't fucking know what the solution is anymore, apart from DIY microchips and open-source projects.
But then, trying to source decent equipment is already a shitshow, and so is Android (Google).
The poor state of modern APIs is kind of an indication that capitalism isn't quite giving us better products - competition seems to affect only the hardware.
Same thing with modern entertainment - competition hasn't worked to give us better Alien, Terminator or Predator movies. It's all garbage. Why isn't competition working?
The electronics engineers and the embedded software dev community need to step up and put in place an open-source collaboration between independent graphics hardware makers, graphics driver programmers, and graphics programmers on the "front end". It really sounds utopian, and I am so sorry for everyone who is impacted by this shit situation.
I can understand and accept the sentiment, but hardware nowadays really is too complicated to be accessed directly.
Still, open-sourcing specs, drivers and frameworks would make everyone's life so much easier.
Windows 3.11 came with drivers. Only a chosen few could improve the driver on the N64 (microcode).
I wonder how I can apply this to LSPs
I want to see more 3D menu systems and title screens. Make the interface fully extruded 3D and rotating around; I would like to see that, it's not done that often!!!
You want that Navi computer from Lain, don't you?
@@4.0.4 Just saying that title screens and menus are always 2D; it would be cool to see a 3D one, or lots of them. :)
might work in VR, but poorly on a flat screen
What's his take on WebGPU?
This is something I am quite excited about for the future of wgpu: it compiles down (and quite effectively too, an actually good use of Rust's absolute overkill of 'zero-cost' abstractions) into basically whatever backend API you want. Vulkan, DX, Metal; if you want to support antique toasters it can do OpenGL no problem, and even WebGL and WebGPU (confusingly, wgpu is also the core of Firefox's WebGPU implementation. Mozilla moment, I guess). And it's a heck of a lot nicer to write than Vulkan, speaking from experience. While I've seen bindings for other languages out in the wild, it is still Rust-native and depends on Rust's features pretty deeply. So if you don't want to use Rust, you've most likely still got a while to wait.
Also, grrr, CUDA >:{ One of the big names at Intel, I don't remember exactly who, said the world should push towards open standards for how to interact with a GPU (specifically calling out CUDA). A rare hardware-manufacturer W? It'll be interesting to see how much Intel follows up on that in their own product lines.
Isn't Vulkan open source?
i agree
Intellectual Property ruins everything per usual
I still think the most universal approach, model-wise, to graphics is writing directly to the screen pixels. Unfortunately it seems you just can't get any speed out of that. APIs have not favored the programmer; instead we're having to bend ourselves and bang our heads on the wall as we build cabins out of sawdust and glue. I don't think there's been a more hostile programming era for the new programmer. We have tools that attempt to do everything for us, yet we have horrid abstractions around graphics; we need entire engines just to manage everything, and nobody truly understands what's going on at the end of the day, because everybody's worried about learning the 10% of the glue they need to make everything work. A big part of it, too, is the expectations of the people who receive the games. If the graphics API were simply coloring screen pixels, your development kit consisted of a core language and some multimedia libraries, and the customer would settle for much simpler games with less content, we'd all be better off. Look at game programming books from the '90s and early 2000s and you essentially see a core language, some supporting multimedia functionality, and that's it; you get a game in around 200 pages. Go back to the '80s and you basically get four pages of BASIC to make a game. So we've gone from a program listing of four pages, to a book of 200 pages, to five books covering various topics, or just blindly learning a game engine.
Any good book recommendations?
OpenGL has fragment shaders. With unified shaders you can pull in the rasteriser, or just not? Nanite and Cycles use the GPU, but not its rasteriser. It would be cool if we could just use pure functions, forEach, and a hierarchical z-buffer. Cube maps as code.
You understand that in the '80s it was all software rendering, right?
And that with modern GPU APIs, the responsibility is moved from the driver/API implementation to the application?
Do you really think moving from DirectX 11 to 12 suddenly added tons of code? It just shifted who has to write the code needed to drive the hardware.
We kinda are going back to software rendering. Maybe in the future we will have a unified CPU/GPU architecture with massively parallel general purpose compute units that just write data to the framebuffer. I think Sony originally planned something like this with their CELL processor on the PS3. Apparently the PS3 wasn't going to have a GPU originally and all of the graphics would be rendered using SPE cores.
@@VivienBihl I'm saying you didn't really even need an API for software rendering. You poked the color directly into the pixels.
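For reference, that "poke the color into the pixels" model really is about this much code. A minimal sketch in C, assuming a plain linear 32-bit framebuffer that you blit or present with something else (SDL, /dev/fb0, whatever):

```c
#include <stdint.h>

#define WIDTH  320
#define HEIGHT 240

static uint32_t framebuffer[WIDTH * HEIGHT];   /* 0xAARRGGBB pixels */

static void put_pixel(int x, int y, uint32_t color) {
    if (x < 0 || x >= WIDTH || y < 0 || y >= HEIGHT) return;
    framebuffer[y * WIDTH + x] = color;        /* literally poking the color in */
}

int main(void) {
    /* Fill the screen with a simple gradient, old-school style. */
    for (int y = 0; y < HEIGHT; y++)
        for (int x = 0; x < WIDTH; x++)
            put_pixel(x, y, 0xFF000000u | ((uint32_t)(x & 0xFF) << 16) | (uint32_t)(y & 0xFF));
    return 0;
}
```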
GLaDOS gave you cake? 😂
Metal is an amazing shading language; it's a shame that Apple locked it to their proprietary hardware. On the other hand, part of the benefit is that Metal can be strongly coupled with their GPU hardware, which means they can make it less general and more efficient. This is pretty important when it comes to GPUs.
It has 100% efficiency because no one on earth uses it.
Metal is a graphics API, not a shading language.
@@nolram It's both; you write Metal shaders on iOS, not GLSL.
Ugh, this is the exact problem I'm in the process of solving, but the likelihood of piracy and/or no payment is kicking me in the gut so hard. /whinge
Did this dude just complain about shader languages? Of all problems that face video game developers, is the inability to rewrite the shader language really something to focus on?
There is literally no game that would become possible to make in reasonable time if you had a new shader language.
I'm reminded of the Steve Jobs Takes Criticism video, where the guy complains about Java and OpenDoc. There aren't any customers that care, dude.
@@Morimea You misunderstand, friend. I'm not saying there's some _technology_ you can't implement. I said there aren't any _games_ that would be unlocked by shader-language innovation. C++ is far more "outdated" than GL, but there aren't any games that will only become possible with a new programming language.
@@wheatandtares-xk4lp So there's no benefit to using C++ then? Everything can be written in Brainfuck.
We need to reimagine the GPU from the ground up using first principles. We can't just leave Nvidia, AMD, and Intel to keep using their bloated proprietary architectures.
What? But they make the chips, what are you going to do?
@@turolretar No he said we need to reimagine, not actually *make* anything
too big to fail in video games?
Good thing Unreal Engine has a material editor