It's that moment where you're casually scrolling through YouTube. You see a recommended video. It intrigues you, and a minute in you realise you've hit an absolute goldmine. Subscribed.
same here lol ... thanks algorithm and thanks for the great video and channel 👍
Same here 😅
The power of the title and thumbnail!
Just like Rust!
Yeah, especially the haired man, which is very rare on YouTube, since they tend to attract toxic people.
You know it's going to be good when Kay starts with some historical artifact
Every single upload is a masterpiece
Agreed, even though your statement is still an understatement 🙂
I'm taking a class on compilers this semester. You have no idea how great it is to click on YouTube and see a video like this. Thanks
Dude I just started writing a compiler to learn how they work. This video couldn't have come at a more perfect time! Great vid as always
Then you also might want to read Writing a C Compiler by Nora Sandler, which came out very recently.
@@LinuxIsBetter43 damn, I definitely wanna read this. Modern compiler book, with all the modern techniques.
This guy has the blood of Newton or some other great people
This channel is exactly what I've been looking for. Thank you algorithm god 🙌🏻
I developed a smart form system a few years ago for my past employer, where users could enter custom formulas in fields that were dependent on other fields and/or other objects within our system. Really typical of a lot of CRMs these days, but we needed to roll our own since it was specific to metrology.
So I took a deep dive into this topic and ended up learning more about ANTLR and g4 syntax than I expected, and now I'm stuck in this rabbit hole forever.
You're cooking so hard Kay, thanks for putting out this awesome content.
You know, ever since I found this channel I find myself downloading videos to my archive and using them as teaching materials.
I can't say how thankful I am to you for this great work.
I wish I could do more to support it, but all I can do is point to your channel whenever I can.
amazing stuff, glad to see theory applied, wish my teachers at uni would show this side of the course
I wish I'd watched this before my exam today, where I was shaky on pushdown automata... great video!
This has become one of my favorite channels
I knew these topics would come up in your videos someday… great to hear from you!
"Crafting Interpreters" by Nystrom is a good treatment on the subject for those that would like a deeper dive
Amazing work! I think Knuth’s research really ought to be touched on more here though. The approach in the recursive descent section is combinator parsing (every language has a parser combinator library now, so you can write a compiler in basically anything), but Knuth had the idea to do it with lookup tables that tell you when you’ve parsed enough characters for a rule based on the next character. The new rule for pushdown automata (precedence)? Another lookup table. It’s exponentially harder to scale than parser combinators so it doesn’t hold up today, but it’s an important part of the history!
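To make the "precedence is just another lookup table" point concrete, here is a rough C sketch of precedence climbing driven by a tiny table. This is only an illustration, not Knuth's actual LR construction; the operator set and table values are made up for the example.

    #include <ctype.h>
    #include <stdio.h>

    /* Toy precedence table: higher value binds tighter, 0 = not an operator. */
    static int prec(char op) {
        switch (op) {
            case '+': case '-': return 1;
            case '*': case '/': return 2;
            default:            return 0;
        }
    }

    static const char *p;   /* cursor into the input string */

    /* A primary is just an unsigned integer literal in this sketch. */
    static int parse_primary(void) {
        int v = 0;
        while (isdigit((unsigned char)*p)) v = v * 10 + (*p++ - '0');
        return v;
    }

    /* Precedence climbing: the table decides whether to keep consuming. */
    static int parse_expr(int min_prec) {
        int lhs = parse_primary();
        while (prec(*p) >= min_prec && prec(*p) > 0) {
            char op = *p++;
            int rhs = parse_expr(prec(op) + 1);   /* +1 gives left associativity */
            switch (op) {
                case '+': lhs += rhs; break;
                case '-': lhs -= rhs; break;
                case '*': lhs *= rhs; break;
                case '/': lhs /= rhs; break;
            }
        }
        return lhs;
    }

    int main(void) {
        p = "1+2*3-4";
        printf("%d\n", parse_expr(1));   /* prints 3 */
        return 0;
    }

The point of the table is that the parsing loop never needs to know grammar rules by name: the next character plus its table entry is enough to decide whether to keep going or hand the result back up.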
Thanks for sharing your thoughts on this! There was definitely a lot I didn't cover here, and it's useful to hear which parts are highlights for you. Will keep this in mind.
Nice video. Thank you for sharing. From language communication to social constructs. Religions imprinted from social structures. The check of types. Ask them, but knowing they have an imprint structure and things that differ. Bias sets and bias blindness, and linking to phrasing/word structure and lexicons/words used.
I love these videos! My course focused too much on theory and on running algorithms by hand, like minimization of DFAs, NFA to DFA conversion, etc., which was a nightmare.
Bonkers how appropriate, for me, this is right now. Thank you
aye
I don't know what a modern equivalent would be, but if you want a super simple explanation of recursive descent, just to get something working without understanding any of the theory, then Crenshaw's tutorial is the best I've ever found. If it didn't use Pascal and generate 68k assembly it would still be directly usable today, but hardly anyone uses Pascal anymore, fewer still need code generated for a 68k processor, and if you're just starting out you won't understand how to translate those parts.
If anyone knows of a good modern equivalent that is written in C, generates C, or both, that would probably be what I'd recommend. If not, then I'll have to add it to my project list, because someone needs to write one.
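For anyone who wants the flavour before such a tutorial exists, here is a hedged sketch of the first few Crenshaw chapters translated to C: single-digit arithmetic, emitting a made-up stack-machine text instead of 68k assembly. The function names and the output "instructions" are invented for the example.

    #include <ctype.h>
    #include <stdio.h>
    #include <stdlib.h>

    static const char *src;   /* input cursor */

    static void expected(const char *what) {
        fprintf(stderr, "error: expected %s\n", what);
        exit(1);
    }

    static void emit(const char *op) { printf("\t%s\n", op); }

    /* <factor> ::= digit  -- Crenshaw starts with single-digit operands. */
    static void factor(void) {
        if (!isdigit((unsigned char)*src)) expected("a digit");
        printf("\tPUSH %c\n", *src++);
    }

    /* <term> ::= <factor> (('*' | '/') <factor>)* */
    static void term(void) {
        factor();
        while (*src == '*' || *src == '/') {
            char op = *src++;
            factor();
            emit(op == '*' ? "MUL" : "DIV");
        }
    }

    /* <expression> ::= <term> (('+' | '-') <term>)* */
    static void expression(void) {
        term();
        while (*src == '+' || *src == '-') {
            char op = *src++;
            term();
            emit(op == '+' ? "ADD" : "SUB");
        }
    }

    int main(void) {
        src = "1+2*3";
        expression();   /* emits: PUSH 1, PUSH 2, PUSH 3, MUL, ADD */
        return 0;
    }

Each grammar rule becomes one function and code is emitted as soon as a construct is recognised, which is the structure Crenshaw teaches; retargeting it mostly means changing what emit() prints.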
Love the vids. I am learning the same topic in uni right now. Nice overview. Thank you very much. Continue the good work, pretty please!😊
thank you. this almost made me shed a tear
i literally lol'd at the example of "the grammar of a laugh". well played.
Your videos keep getting better and better!
Quick correction for 10:50 - context-sensitive languages can be recognised by linear bounded automata, which are less powerful than Turing machines. Turing machines recognise the recursively enumerable languages, which are a superset of the context-sensitive languages.
these videos are so great. happy i found the channel :)
As Kay is extremely thorough, I shall refrain from commenting until the end since invariably whatever I have to add is eventually covered
Loved the video, Jai Shree Ram!!
holy smokes this channel has exploded. kinda knew it would tho
Kay has lots of knowledge and a beautiful English voice for a guy from Barcelona. Amazing.
My face lit up. Thanks, Kay!
Awesome work! Have you ever noticed how recursion in programming is similar to a proof by induction in mathematics? In both you begin with the "base" case 🙂
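A tiny illustration of that parallel, as a sketch in C (the function is just an example): the base case plays the role of the induction base, and the recursive call plays the role of the induction step that assumes the claim holds for n-1.

    #include <stdio.h>

    /* Induction base: 0! = 1.                                 */
    /* Induction step: assuming (n-1)! is computed correctly,  */
    /* n! = n * (n-1)!.                                        */
    static unsigned long factorial(unsigned int n) {
        if (n == 0) return 1;            /* base case      */
        return n * factorial(n - 1);     /* inductive step */
    }

    int main(void) {
        printf("%lu\n", factorial(5));   /* prints 120 */
        return 0;
    }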
Yo, that’s dope.
Greetings from Peru
Thanking you most kindly from English England
I've wanted to go through Sipser's computation course on mitocw for a while, already have his book. Maybe this popping up is a sign 😂
Amazing channel!
Awesome video, thanks so much. What do you use to make these visualizations? They're beautiful!
Thanks :) These ones are coded with Motion Canvas - motioncanvas.io - takes just a little while to get used to but really nice to work with once you do.
hi Kay, hope you are doing well. I'm 25 and getting into programming; idk if I should go for functional programming in OCaml or C++. Any advice? love ur videos!!
THAT WAS AWESOME !
Thanks Kay
Thanks.
nice video
im glad i watched this
Awesome!
Hi, great shirt.
6:41 is there some reason why (Tail ::= 'a' Tail) is only in the right tree?
Noam Chomsky
can you do PEG grammar?
Long time ago, I wrote a parser after I heard that compilers first tokenize. That concept clicked as it was easy to understand and implement: just group digits into numbers and letters into strings. I thought I was halfway. With a lot of suffering, I implemented what I now know to be a recursive descent parser.
Years later I discovered the academic theory and...I still don't understand it. Or rather, I grew to understand how impractical it is for programmers.
Firstly, it's explained in terms of 'producing' text, which is already confusing for programmers, as that's not what we want to do.
Secondly, the grammar notation is not only very abstract but it's also incomplete. You casually mention multiple times that we are assuming a specific order, in case of multiple choices, even labelling a specific order as "greedy".
I think no grammar checker or parser generator uses standard BNF or even EBNF, so if these notations are so flawed, why are academics still teaching them?
At least you mentioned that top-down production is not what we want to do, and that this theory was initially developed to understand human languages, but you should start with that info!
(But even then, no human "produces" sentences in a top down manner - you start with the concept e.g. "swimming" and then build the sentence around it. "I want to go swimming!" Interestingly, notice how in this case we "produced" from the right, starting with the last and most important word of the sentence.)
As I'm writing this, I just realised that all this theory was created to analyse and classify text, but that it's wrong to assume it's equally useful for practically producing or transforming (compiling).
I know that compiler theory does give some tools to compile text into code, such as the method of merging (reducing) tokens into non-terminal nodes, but you mentioned that highly demanding compilers for systems programming languages still use the "old fashioned" manually written recursive parsers.
I think that's a symptom that language and compiler theory may not be that useful, otherwise it would have resulted in more advanced (in the sense of more capable) compilers.
I guess what I want to say is :
1. Don't start compiler writing lessons with the graveyard of the theory, just as you wouldn't start a class on general relativity by explaining the 4 essential elements: earth, wind, fire and water.
2. Don't parrot. Be critical of theory and throw away or refine what is wrong.
3. If you want to know how good a car is, don't listen to the CEO but talk to the mechanic. Ask language builders what part of the theory they used...and what bits of theory they came up with as they were working, as this surely is the way that grammar notation was improved in a gazillion ways but never standardized!
Thanks for taking the time to put these thoughts into writing, appreciate it 👍
The Laugh language I think is wrong. How do you process 'haha' given the rules?
'h' -> Head ::= 'h' Tail
'a' -> Tail ::= 'a' Tail
Now where to?
Hi! Do you mean the version around 14:06? If so, yes that is a scaled back version just for that example (so won't match 'haha'). If I misunderstand though let me know.
can you create a parser using an n-dimensional Gaussian
why do I feel like listening to supergrass all of a sudden
yah lanang
I've had a nasty experience with umami. It seemed that adding attribute values for some recorded events made my whole app crash (I didn't really have time to debug what the problem was, so I just removed those attribute values and it worked as expected). Other than that, very pleased with this great tool!
What is 0 more times? Around 10:04
Zero *or* more times. The "or" is very faint, but if you play the video at 1x or slower you can hear it, even if the captions don't capture it. If you need further explanation of what "zero or more times" means: it means that something can repeat any number of times, or not appear at all.
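For a concrete feel of "zero or more", here is a tiny C sketch (the laugh-style language is just an example for this thread, not necessarily the exact grammar from the video): it accepts one 'h' followed by zero or more 'a's, so "h", "ha" and "haaaa" all match, while "ah" does not.

    #include <stdio.h>

    /* Recognises 'h' 'a'* : exactly one 'h', then zero or more 'a'. */
    static int matches(const char *s) {
        if (*s++ != 'h') return 0;   /* the single required 'h'      */
        while (*s == 'a') s++;       /* 'a' may repeat, or not occur */
        return *s == '\0';           /* nothing else may follow      */
    }

    int main(void) {
        printf("%d %d %d %d\n",
               matches("h"), matches("ha"), matches("haaaa"), matches("ah"));
        /* prints: 1 1 1 0 */
        return 0;
    }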
Ok that explains a lot. Thanks for the explanation
So the rumour that Sanskrit is an artificial language is true, what I person.
if ode5 is so good why is there no ode6
You should be able to solve this.
Subject + Verb + Object * preposition = X^adverb
Solve for X when Subject = Topic
Show your work