Code generation and AST parsing/unparsing **is** metaprogramming! You just use tools outside of the compiler, which is just as bad as the C preprocessor (which also operates outside of the compiler).
tbf, code generators can make sense for certain things, protobuf for example,
but beyond cases like that, I want to work on actual types as the compiler understands them
Well, not quite. You cannot do any type checking with the C preprocessor, for example. But nothing prevents you from doing that from your external tools.
@@Evan490BC They are equally bad in the sense of not being integrated with the compiler and thus not understanding the code in the same way. Although you could build external tools that understand the code just as well as the compiler, that is rarely done for tasks like code generation.
I was a passive Zig fan for a while (not really doing anything with it, but generally aligned with the goal of usurping C), but I've grown a little more interested in Hare's approach, because it's uninterested in self-extension. If you need extension, you really want a Lisp instead of a Fortran (using the Chuck Moore characterization of "there are only four languages, Fortran, Lisp, Forth, and APL"). But as soon as you make a Lisp, you make a jungle of Lisps, none of them quite alike. And this is borne out by everything long-lived that has tried to add some general-purpose extension: C macros, C++ templates, and of course the Common Lisp and Scheme landscape. So I'd rather define extension within a second language made to be a good code generator, e.g. Janet. Nailing down a systems programming language with good defaults is more prone to bikeshedding than pushing the problem onto extension, but it leads to a concise implementation that's easy to understand.
I randomly stumbled upon Hare casually browsing the OpenBSD repos about two weeks ago. Considering C has been around for 50 years and is still going strong, I think a 100-year lifespan is actually very achievable.
They've got 50 years to mess it up
C will still run strong for the next 50 years too. Its best longevity secret is its community. Hare, with this mindset (supporting no proprietary OSes), won't last 50+ years.
@@maxrinehart4177 proprietary OSs are going away in the next 20. They're not profitable anymore. Their epoch has come and gone. Hence MS announcing this year their plans to cram theirs full of ads and spyware. If they've gone full mask off, it's game over. They're done providing value, and are ready to raze the place and rake in profits until everyone leaves
This is what I thought when he talked about a programming language lasting for the next 100 years. Did the creators of the C language intend it to be around for 50 years? I don't know them, but perhaps not. My view is that a language will almost always still be around as long as there are computers. It persists by inertia. Whether it was the intention or not, I think most modern programming languages will still be around by the end of the 21st century, that is, if the electronic computer civilization does not collapse. But even in the Fallout world they still had computers.
C will still be around in 50 years, I wonder if anyone will remember Hare?
Thank you Kris for such amazing talks. This is the only no nonsense podcast on tech yt I watch.
you should shift to watching more no-nonsense yt content.
Interviewer said "Lye-nicks" instead of "Lin-nucks". My man! Instant sub!
Lol
So glad to have found this channel, love what you're doing!
C3 when?
Linear types are interesting but pretty awkward to work with in practice since you have to thread the resource handle through every function call.
I honestly think at this point the "how is it better than C?" question should be forbidden when talking about new languages. It's really not that hard to find something better than a dead horse in 2024. Yes, we know errno is bad, we know null-terminated strings are bad, we know C macros are bad. No new language competes against C; it competes against all the other languages that compete against C. Tell me how this new language is better than Zig, Odin and Rust.
recently saw tsoding messing with it. seems pretty interesting. maybe in a few years we'll see if the toolchain for it gets easier to use.
it seemed very straightforward, and very complete. But I had to compile everything from the latest sources; the distro package was a no-go
The advantage with having some kind of macros in a language is that it gives you something similar to what you get with Zig's `comptime`... and this is really really really useful when you're working on very constrained systems, like microcontrollers. Got me thinking what has Hare got for `comptime`????
I have to give a sub, your work is phenomenal! Engaging questions, love the tone you set. I hope you really grow this channel!
> We've got solutions for managing generic resources, like Python's `with` for example. And we've got solutions for managing memory, like Rust's borrow checker. But nobody's really tried to unify that.
I don't think the difference between `with` and what Rust is doing is really about the *kind* of resource being managed. It's more about the *scope*. The `with` keyword (like `using` in C#, or `defer` in Go, which runs at the end of the function) is tied to a particular block of code. The resource in question always gets cleaned up at the end of that block. Rust's borrow checker is intimately related to its model of ownership and move semantics, and the lifetime of a resource can be more complicated than any one block. This makes the borrow checker suitable for managing all sorts of things: memory, iteration, mutability, files, locks, threads, etc. But of course Python would be unusable (any language would be unusable?) if every single object had to have a known scope.
I get a HolyC feeling here.
You should complete the series with Vale language.
Yes, definitely want to do Vale. (Though I suspect the series will never be complete. 😅)
@@DeveloperVoices have you done Odin yet?
@@shaurz Sure have! That's over here: ua-cam.com/video/aKYdj0f1iQI/v-deo.html
16:00 So you should also implement massively parallel programming primitives in the language to 'talk' to GPUs using CUDA or OPENCL type dev.
In less than 100 years, 16, 32 and 64 cores will be normal on the desktop; some of these may take the form of efficiency (non-performance) cores. There should be a primitive to 'talk' to these 32 cores on a CPU, or 1024 cores on a GPU, to aid parallel development, but without a major change in the syntax; maybe use a toggle switch in the language.
53:50 liked and heading to the hills
Right now, parallel primitives aren't expected from a language, since the mindset is that libraries such as basic linear algebra subprograms (BLAS, clBLAS, cuBLAS...) take care of that. The problem might be interoperating with closed-source video drivers. But the new open-source drivers, and Vulkan, are making that avenue more accessible. Maybe a good direction as more such hardware becomes widely available and reachable directly through software.
My computer is 3-ish years old, and it has 16 true cores, 32 hyper threaded. This reality isn't even 10 years out.
@@makeitreality457 yes, right now, but if you create a language with the explicit goal of making it for the future, especially with the goal of it not changing anymore, you need to think (or rather guess) about how things are going to be at that point
and if you think that this is practically impossible to do and pretty much entirely luck based, I guess you also answered yourself how likely it is for such a language to succeed
I don't get why somebody can't just take C and strip out macros and header files; add some mechanism for generics (which also leads the way to result types); revamp the type system (I don't like typing uint16_t etc.). I'm not a good programmer at all, and every language I learn feels like it's in my way. At least with C, I know I have to build everything from scratch, but it's such a simple language that I sort of know the path I have to take. I can appreciate the safety of Rust, but it feels like a wrestling match against the compiler (I always lose).
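For what it's worth, here's roughly what I mean about generics leading to result types. Today you hand-roll one of these per payload type in plain C; a generics mechanism would let you write it once. A minimal sketch (the names `int_result` and `div_checked` are just made up for illustration):

```c
/* A sketch of a hand-rolled result type in plain C (C11 for the
 * anonymous union). You end up writing one of these per payload
 * type, which is exactly the boilerplate generics would remove. */
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    bool ok;
    union {
        int value;        /* valid when ok is true  */
        const char *err;  /* valid when ok is false */
    };
} int_result;

static int_result div_checked(int a, int b) {
    if (b == 0)
        return (int_result){ .ok = false, .err = "division by zero" };
    return (int_result){ .ok = true, .value = a / b };
}

int main(void) {
    int_result r = div_checked(10, 0);
    if (r.ok)
        printf("result: %d\n", r.value);
    else
        printf("error: %s\n", r.err);
    return 0;
}
```

The anonymous union keeps it compact but needs C11; the point is just that you copy-paste this for every payload type today.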
You should take a look at modern C. It's much better. Also, C is not a dead language; it is actively being developed and features are added every year. It's also a common misconception that you have to do everything from scratch in C. I write C every day and at no point do I have to do everything from scratch; I use a variety of libraries to get the job done.
zig does some of this. By replacing macros with comptime code execution (kinda like lisp) you get a lot of power (including generics), but instead of having to learn a new language for your language, it's still zig. As for the type system I am not totally sure; it may or may not be to your liking. It solves at least two of your problems with C though, so probably worth checking out in your case. Might learn it myself, actually.
@@abowden556 it does. I like zig a lot but it's not stable yet. There's not much information (that I can find) about how to write build.zig. I would like to be able to write zig for embedded but don't know what to do about linking. That's what I mean when I say I'm not a good programmer; what seem to be simple tasks put me at a standstill
take a look at C3
@@poggybitz513 how is it better? if you still need to create header files, it doesn't sound better at all
The real problem with older C programs is that people dynamically link glibc rather than statically linking it; that's usually where the three-year-old binaries fail. glibc, despite claiming to make C more stable, is the most unstable thing in the universe, but anything that was statically linked (or just makes syscalls directly) still runs perfectly fine.
and glibc is so not meant to be statically linked that it may break just from doing that (yes, I have seen that happen; no, I don't understand how),
so I would recommend statically linking a different libc, like musl, instead.
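Roughly what that looks like in practice, assuming musl and its `musl-gcc` wrapper are installed (package names and paths vary by distro):

```c
/* hello.c: build it fully static against musl instead of glibc.
 * Assuming the musl-gcc wrapper script is installed:
 *
 *   musl-gcc -static -o hello hello.c
 *   file hello    # should report "statically linked"
 *   ldd hello     # should report "not a dynamic executable"
 *
 * The resulting binary has no libc .so dependency, so it keeps
 * working across glibc upgrades on the host. */
#include <stdio.h>

int main(void) {
    puts("hello from a statically linked binary");
    return 0;
}
```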
hare looks interesting, I like the idea of the language not breaking things once it's 1.0.
I don't really care if my code runs in 100 years, but currently with the programming languages I use, it's very tedious to add a new feature or fix a bug in a 2 year old project. It's half a day of updating the code or even longer setting up a VM with the old compiler and libraries before I can even start to make the changes I want to.
I've just imported a 25 year old C++ program into MS VS 2022, and only a few changes were required around for loop scope. Took 10 mins. The code was originally developed on Windows 95 with MSVC 4.0. Needless to say, I was rather surprised that it was so easy.
@@toby9999 That's awesome.
Love these interviews
Adobe is pushing the language Val which also relies on linear types.
What baffles me about Hare is that multithreading is not supported at all, to the point where they explicitly state that you should not do it, that you're on your own, and that the stdlib could break if you do (since it's not thread-safe).
Did they even look at how computer hardware evolved over the last ~~10~~ 5 years? Or at where hardware manufacturers say things are going in the next decade?
If this weren't a systems programming language, fine, a higher-level language can get away without support for it (look at Python), but a systems programming language?
I only looked a bit at Hare but I already like its concept a bit more than Zig. I think Zig's error handling design is its biggest mistake, only being able to return an error code without context makes it too limiting.
Hare on the other hand seems to do this better while otherwise looking mainly like a cleaner C. Nonetheless I think both languages are promising.
Since Zig has no RAII and no GC it can’t support arbitrary error value types. It would be a shame to have to defer destroy() all errors unless you return them. Error handling in Rust is quite verbose compared to Zig unless you use ”anyhow” which puts errors on the heap and turns Rust into Go.
@@krumbergify not really true. There's no reason why an error couldn't contain more context than a simple error code while not needing to be destroyed
@@thebatchicle3429 Agreed, and it could for example be nice to see where in a file a JSON parse error occurred. It would however require some discipline not to make error unions too big and not to put in pointers to things with a non-static lifetime. If you put two extra integers in your error type, that will grow the size of every instance of anyerror as well.
After playing around with it, Hare is terrific for Linux. But it lacks the cross-compile targets of Zig. And apparently you can also repurpose the Zig compiler to compile C and C++, run a complete alternative build system (build.zig), and even run code directly.
I am not sure the error handling in Zig is set in stone. I think there's still an open proposal they are considering. After having used Rust for a while, I agree that just having error codes in Zig is a little sucky. Still better than exceptions though.
And I still am more bullish on Zig than I am on Rust. The cross compiling, C/C++ migration, hot reloading and incremental compiling, and being able to switch out LLVM (eventually) all are killer features.
Ahh, Hare-brained Scheme sent me 😆 (it would probably be a cool project too!)
Are linear types similar to what ATS uses?
Thank you both for this interview!
I'd love having something like golang with sum types and better error handling or rust with implicit traits and GC so you never have to think about lifetimes. I really think the perfect language is somewhere between those two and would love to see some experimentation in that direction.
Check out the V language, it might be a good fit for what you are describing
I've heard that OCaml is Rust with a GC, but it's very far from the C procedural programming style
> rust with implicit traits and GC
Totally. Could be super useful.
> so you never have to think about lifetimes
Aaa! The single thing that makes Rust Rust is the no-mutable-aliasing rule. Even if you have GC, you don't want to get rid of that rule, and that rule requires lifetimes. I'm summarizing a great article called "Notes on a Smaller Rust", but if I link to it YouTube will black-hole my comment :(
Hare looks nice. It is not overly inventive, but it has nice defaults.
I don't know what it is but using colons for the module separator turns me instantly off
And I thought I am the only grumpy one, with my intense distaste for Python's indentation... I quite like Hare's surface syntax, though!
C macros are trash, agreed, but I'm a little confused that the solution is external code generating programs writing to source files. The extent to which this can even be considered a significant upgrade to doing codegen with C (which is already my personal go-to strategy for meta-programming in C) would seem to come down to the quality of internal ast manipulation tools. Assuming this is done well, you're a hop, skip, and a jump from something much closer to typed lisp-style macros. It sounds pretty good still, don't get me wrong, I just don't understand not going the extra mile to get compile-time superpowers. Will be checking out Hare either way.
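To make that concrete, a toy version of the external-codegen approach being discussed might look like the sketch below: a tiny generator you run at build time that emits type-specialized, ordinary C source. Everything here (`gen_vec`, `vec_int_push`) is made up for illustration, not anything Hare ships:

```c
/* gen_vec.c: a toy external code generator. Run it at build time
 * and it emits a type-specialized growable-array implementation as
 * ordinary C source you can read and debug.
 *
 *   cc -o gen_vec gen_vec.c
 *   ./gen_vec int > vec_int.h
 */
#include <stdio.h>

int main(int argc, char **argv) {
    const char *T = (argc > 1) ? argv[1] : "int";

    printf("/* generated by gen_vec, do not edit */\n");
    printf("#include <stdlib.h>\n\n");
    printf("typedef struct { %s *data; size_t len, cap; } vec_%s;\n\n", T, T);
    printf("static inline void vec_%s_push(vec_%s *v, %s x) {\n", T, T, T);
    printf("    if (v->len == v->cap) {\n");
    printf("        v->cap = v->cap ? v->cap * 2 : 8;\n");
    printf("        v->data = realloc(v->data, v->cap * sizeof *v->data);\n");
    printf("    }\n");
    printf("    v->data[v->len++] = x;\n");
    printf("}\n");
    return 0;
}
```

The upside is that the output is plain source you can read and step through in a debugger; the downside, as noted, is that the generator knows nothing about your types unless you teach it.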
Yes... comptime is the feature that most attracted me to Zig, and procedural macros to Rust... running external codegen programs is how I do things with Kotlin and assembly language; can't say I find it that great.
The disadvantage of any metaprogramming language feature is that it's always going to be more difficult to understand than the equivalent non-metaprogramming code. Code generation has the same problem, but with it you still get real, concrete code on the other side that you can look at and debug.
An interesting example of this is the D programming language's "mixin" feature, which allows you to take any compile-time-known string and interpret it as code. Cool idea, but understanding "mixin"-based code is always more difficult than the equivalent static code. There was a proposal to take all mixins and write the generated code out to disk so the code could be inspected/debugged later, but at that point it's really just a glorified code generator, which all languages can already do without having to support advanced language features like mixin.
That being said, metaprogramming isn't "bad" or strictly worse than code generation... it can be very nice/convenient to use. The point is that language designers should always compare any metaprogramming feature to the "code generator" solution and consider the tradeoffs. It's clear that Drew understands this equivalence and the tradeoffs.
@@jonathanmarler5808 proper meta-programming is strictly superior to code generation, unless there's something preventing you from having some function that converts an internal ast to formatted code. It's not even close, since a meta-programming system like we're discussing has type information that you generally need to separately track yourself for code generation. There are tons of things in software development that are subjective or simply a matter of tradeoffs where the best answer is "it depends", but metaprogramming vs codegen is not one of them.
@@evan_ca your take on this intrigues me. I think by your description, Dlang's mixin feature would not qualify as proper metaprogramming as it doesn't operate on the ast nor have knowledge of types (it's much more like code generation). I'm open to learn more if you can point me to resources to check out?
Very interesting
The OS with its capability model and messaging system is reminiscent of Google's Fuchsia
... if I was implementing a microkernel, I would also put ideas similar to NaCl-like (Google Native Client) technologies directly into the microkernel.
What makes NaCl different from other byte-code interpreters?
That does not seem like something that should be in a microkernel. But using interpreters as a fundamental server could of course be viable.
people complaining hare is targeting primarily real operating systems such as linux instead of toy operating systems like windows
truly don't understand why someone would make such a thing, what a waste of time. a niche operating system made solely for playing videogames will never take off
sounds like a 100 year lang that'll take 100 years to finish.
This guy might want to rethink calling his language hair.
Who else came (pun intended) here from tsoding?
100-year-old language... imagine still writing for-loops in a hundred years
Sadly I can imagine we’ll still be writing COBOL in 100 years. 😅
24:24
No Windows support => Dead in the water
but no Windows.. it's like a delusional programming language.. try looking at Odin.. it's the best low-level language out there before Jai comes out (will it ever?)
wait, he said "ed", not "ed"!
Since there has not yet been a single low level language to provide me a macro system, I still will not be leaving C.
What is it with kids and refusing to allow programmers control over the language, and not just the computer?
Yeah, there's a lot of appeal (and certainly a lot of brain food) in having a language that lets you manipulate its own code. Buuuuut... I wouldn't hold C up as a great example of that. Its text-based macro system kinda sucks. IMO, Lisp had the right idea here: let people manipulate the language *in the language*, with AST transforms. It works so much better than string-munging.
You might find Zig interesting. It's a C-style language whose compile-time metaprogramming (comptime) works at the language level, rather than the source-string level.
YMMV of course. 🙂
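For anyone who hasn't been bitten by it, here's the classic illustration of why textual macro expansion is a problem (a standard example, nothing Hare- or Zig-specific):

```c
/* The classic double-evaluation trap with C's textual macros.
 * MAX looks like a function, but its arguments are pasted in as
 * text, so any side effects can run more than once. */
#include <stdio.h>

#define MAX(a, b) ((a) > (b) ? (a) : (b))

int main(void) {
    int i = 5, j = 3;
    int m = MAX(i++, j);   /* expands to ((i++) > (j) ? (i++) : (j)) */
    /* Because i > j, the true branch re-evaluates i++: i ends up as 7
     * and m as 6, not the 5 you probably expected. An AST-level macro
     * system (or a plain inline function) wouldn't have this problem. */
    printf("m = %d, i = %d\n", m, i);
    return 0;
}
```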
@@DeveloperVoices The issue for me is that there are whole universes of language design space left entirely unexplored because no one has even concieved that it is possible. APL, J, K, and all the Iversonian languages have proven the benefit of notation, failing to mention the power that mathematical notation unlocks in the human mind. And yet, if I'm going to use a low-level language, I'm forced to use the most cumbersome and unweildy notation ever concieved for problem solving. How many times must I jam out for loop syntax before I get a language level construct for implicit iteration? K iteration looks like this: fn'array
Nim?
@@nanthilrodriguez Any way to create this mystical land of notation without it devolving into a write-only language like literally every Iversonian language? Maybe some examples of typical problems solved in C/other systems languages that can be better clarified with your preferred style of notation?
@@nanthilrodriguez I agree, since last century several high level languages have ways to avoid loops, which clarifies code once you get used to reading it. Why don't new low level languages implement this? I also like mathematical notation, but this probably isn't that useful when writing operating systems, otherwise it would have been implemented in some of the low level languages.
Hyprland jumpscare BOO!
Boo! I have no COC!
@@mari3434drew seethes daily because the person who made the best wayland compositor is not him.
@@happygofishing vaxryGODS..... we won......
So in a hundred years we'll still have to deal with semicolons?
😂
I’m quite confident that in 100 years we’ll still be maintaining COBOL. 😅
Hare is pretty much useless for serious projects because it only supports non-proprietary operating systems (both as host and as compile target). This means popular systems like Windows and macOS are not supported at all. I like the general idea and concept of Hare, but this "no proprietary OS" thing is holding it back. For Hare to gain any popularity, they will need to reconsider this design choice.
100%. I came across this in the docs and immediately closed the browser tab and disregarded the language. It is some sort of martyrdom that guarantees the failure of the language.
I understand that their stance is that anyone can create a compiler based on the spec; however, that encourages the sort of fragmentation that C suffers from.
so fork it
@@JakobKenda why would I? What incentive do I have? There are already other languages out there that are good and support all major platforms.
Windows and Mac suck, prolly won't be around for long without breaking changes or will get replaced by some new version like they have been. They're terrible targets for anyone who wants to make a project stand the test of time.
@@pyrotek45 that’s absolute and utter nonsense.
No preprocessor, no multithreading, no thanks
Hmm ... [18:29] "try to find a binary that was built for Linux three years ago". Easy: take a statically built mysqldump 5.7.25 (for linux-glibc 2.12; GA 2019-01-21), it runs on a freshly updated Ubuntu 20.04 (glibc 2.31), and I have little doubt it runs on 22.04 or newer. That said, I admit it ain't easy to build/maintain this crap, which I dealt with firsthand, so I'm all for throwing C out of the supply chain. Long live Hare 1.0!
I like C as long as macros aren't overused, but I like C++ better. People just love to bash C (and C++) because it's seen as the cool thing to do by the 'in crowd'.
Odin
Anything that considers Niklaus Wirth to be a design idol has to be worth a look!!! :)
But, like Hare, no ARM32 and no MIPS32 support is a deal breaker for me. :(
@@actualwafflesenjoyer if you like to mess around with PIC33 microcontrollers or old embedded devices and OpenWRT... yup!
Hare has bad syntax and design because Drew can't write a good text parser. Everything is compromised to make his life easier.
This language sounds worse to me than Rust and Zig and a bunch of other C or C++ wannabes.
Use-only-once types seem like a bunch of unnecessary overhead for no good reason.
doesn't support windows; it's not going to be the language of 100 years. it's not going to be anything at all.
2123 will be the year of the linux desktop
Honestly true, I think this is their biggest mistake. Sure, QBE is simpler than LLVM and all, but not supporting one of the major platforms will hurt adoption for sure.
@@satinxs8 it's always kind of sad when developers bring their personal ideologies into their work.
Andrew is also hostile towards Windows, but at least he begrudgingly supports it.
The most chill out of the bunch is Ginger Bill, and we know Windows will be a first-class citizen because Odin is developed on a Windows machine.
@androth1502 I don't mind people bringing ideology into their projects; it's their brainchild, after all. But they shouldn't honestly expect widespread adoption
Makes no sense not supporting windows
Retro languages are not the future. The more features in a language the better. C++ is a huge improvement over C. Java is a huge improvement over C++ (for business apps). Scala is an improvement over Java.
Literal web-developer-tier opinion
I worry about your job security in the upcoming decade.
@@vyyr Java and PHP and JavaScript will dominate for decades and decades. AI will take my job, not some Go or Rust programmer.
@@donwinston 1. That's exactly what I was suggesting, but DECADES AND DECADES is a wild statement regarding codemonkey webdev jobs: 1 decade till early adopters, 2 decades till you are phased out, probably.
2. Both Go and Rust programmers are slowly taking over the backend market.
3. It's not about languages, it's about understanding the underlying concepts of programming and hardware.
Java is good but it's too verbose, and everything being an "object" is an abomination.
Go is a good example of a modern language.
What about Lurcher.
DeVault is also a curmudgeon: when people point out issues in his crappy code, he gets uber-defensive and attacks them. Hahaha.