LOGO was the first language I ever learned about. It was in middle school. We had these old 486 computers with early Windows and PrintShop and Logo. I had a lot of fun making different looking designs. In a year or two I would buy a used TI-83 for $20 and teach myself TI-BASIC, which is incredibly easy because it's an already very limited version of BASIC. Those were some fun times!
"Every Programming Language Ever…" is quite an exaggeration. For instance, these are missing (in no particular order): Object Pascal (Delphi can be considered a dialect of it); Modula2, Modula3, Oberon; E; AmigaE (not related to E); REXX; Eiffel; J (inspired by APL); Racket, Scheme, Clojure, …; Forth; TCL; Korn shell, Z shell, Bash …; AWK; D; BCPL; groovy; Oz; REBOL (and red); Haxe; occam; … and a lot, lot more. Moreover, non-Turing complete languages (as SQL, HTML, CSS) shouldn't be listed in a "programming languages" list. As pointed out by others, there are mistakes. I want to focus on a languages-related few of them: § **Raku** is a different language than Perl (they decided to change the name from Perl6 to Raku to stress the fact that it is a different language); § Go is not like Python at all - and Go is a compiled lang, Python is not § maybe just ALGOL, not ALGOL 60; otherwise you should list other ALGOLs as well (58, 68, W) § C was influenced by Algol 68, and B (derivative of BCPL)
@@doigt6590 I suspect that the "influenced by Algol ..." label could be given to many languages. Then one can argue that Algol 68 wasn't the first and only to use this or that feature/concept/syntax; then one should know the languages the C's creators were exposed to… and so on. Hard to do a correct tracking; anyway I think it isn't unlikely it has influenced someway all the languages created in the time when it was, er, influential :)
@@MauroPanigada Instead of suspecting something, you could actually look it up. The authors of C wrote on the history of its making and they are quite clear on the relation between C and Algol.
@@doigt6590 Looked it up on wikipedia hoping it would have an easy reference of such a history lesson… they have a citation which seems to contradict your "Saying C was influenced by Algol 68 is quite a stretch too." The quote is this: "The scheme of type composition adopted by C owes considerable debt to Algol 68, although it did not, perhaps, emerge in a form that Algol's adherents would approve of."
4:17 I'd just like to interject for a moment. What you're refering to as Linux, is in fact, GNU/Linux, or as I've recently taken to calling it, GNU plus Linux. Linux is not an operating system unto itself, but rather another free component of a fully functioning GNU system made useful by the GNU corelibs, shell utilities and vital system components comprising a full OS as defined by POSIX. Many computer users run a modified version of the GNU system every day, without realizing it. Through a peculiar turn of events, the version of GNU which is widely used today is often called Linux, and many of its users are not aware that it is basically the GNU system, developed by the GNU Project. There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine's resources to the other programs that you run. The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called Linux distributions are really distributions of GNU/Linux!
No, Richard, it's 'Linux', not 'GNU/Linux'. The most important contributions that the FSF made to Linux were the creation of the GPL and the GCC compiler. Those are fine and inspired products. GCC is a monumental achievement and has earned you, RMS, and the Free Software Foundation countless kudos and much appreciation. Following are some reasons for you to mull over, including some already answered in your FAQ. One guy, Linus Torvalds, used GCC to make his operating system (yes, Linux is an OS -- more on this later). He named it 'Linux' with a little help from his friends. Why doesn't he call it GNU/Linux? Because he wrote it, with more help from his friends, not you. You named your stuff, I named my stuff -- including the software I wrote using GCC -- and Linus named his stuff. The proper name is Linux because Linus Torvalds says so. Linus has spoken. Accept his authority. To do otherwise is to become a nag. You don't want to be known as a nag, do you? (An operating system) != (a distribution). Linux is an operating system. By my definition, an operating system is that software which provides and limits access to hardware resources on a computer. That definition applies wherever you see Linux in use. However, Linux is usually distributed with a collection of utilities and applications to make it easily configurable as a desktop system, a server, a development box, or a graphics workstation, or whatever the user needs. In such a configuration, we have a Linux (based) distribution. Therein lies your strongest argument for the unwieldy title 'GNU/Linux' (when said bundled software is largely from the FSF). Go bug the distribution makers on that one. Take your beef to Red Hat, Mandrake, and Slackware. At least there you have an argument. Linux alone is an operating system that can be used in various applications without any GNU software whatsoever. Embedded applications come to mind as an obvious example. Next, even if we limit the GNU/Linux title to the GNU-based Linux distributions, we run into another obvious problem. XFree86 may well be more important to a particular Linux installation than the sum of all the GNU contributions. More properly, shouldn't the distribution be called XFree86/Linux? Or, at a minimum, XFree86/GNU/Linux? Of course, it would be rather arbitrary to draw the line there when many other fine contributions go unlisted. Yes, I know you've heard this one before. Get used to it. You'll keep hearing it until you can cleanly counter it. You seem to like the lines-of-code metric. There are many lines of GNU code in a typical Linux distribution. You seem to suggest that (more LOC) == (more important). However, I submit to you that raw LOC numbers do not directly correlate with importance. I would suggest that clock cycles spent on code is a better metric. For example, if my system spends 90% of its time executing XFree86 code, XFree86 is probably the single most important collection of code on my system. Even if I loaded ten times as many lines of useless bloatware on my system and I never excuted that bloatware, it certainly isn't more important code than XFree86. Obviously, this metric isn't perfect either, but LOC really, really sucks. Please refrain from using it ever again in supporting any argument. Last, I'd like to point out that we Linux and GNU users shouldn't be fighting among ourselves over naming other people's software. But what the heck, I'm in a bad mood now. 
I think I'm feeling sufficiently obnoxious to make the point that GCC is so very famous and, yes, so very useful only because Linux was developed. In a show of proper respect and gratitude, shouldn't you and everyone refer to GCC as 'the Linux compiler'? Or at least, 'Linux GCC'? Seriously, where would your masterpiece be without Linux? Languishing with the HURD? If there is a moral buried in this rant, maybe it is this: Be grateful for your abilities and your incredible success and your considerable fame. Continue to use that success and fame for good, not evil. Also, be especially grateful for Linux' huge contribution to that success. You, RMS, the Free Software Foundation, and GNU software have reached their current high profiles largely on the back of Linux. You have changed the world. Now, go forth and don't be a nag. Thanks for listening.
The question is how in the world did the GNU Project arrive at the worst thing to pronounce with the fewest possible letters. Seriously--it's only three letters and when people see it for the first time, there's about three possibilities that come to mind for how to say it. It's simply a case of stereotypical computer geeks doing something brilliant and not knowing the first thing about marketing, and now we get to hear the oft-repeated lamentations of disrespect when that's not the case at all. Linux is just easy to say and kinda catchy--that's it.
So many goofs. FORTRAN is not slow. FLOWMATIC ran on a Univac 1, definitely not a supercomputer. Algol inspired Algol W, Algol 68, Pascal, but C, not so much. C didn’t inspire go, Java, JavaScript, PHP, or Python. Also you only hit a few languages, in reality there are hundreds, mostly special-purpose languages and every assembly-language.
7:55 functional programming is good not just for math, but also business applications. Check the talks "Domain Modeling made functional" and "Making Impossible States Impossible"
this isn't every programming language. There were lots of Russian based programming languages in the Soviet union, a French based programming language in France, some newer programming languages like v, some older ones also missing like s, a+, b, d, e, Erlang, oCaml, and some more modern Non-English programming languages like Produire (プロデル (Japanese)) and 1C enterprise programming language (Russian). There's also tons of others missing in here. It would've helped if the video had some disclaimer for how it was classifying these because it seems to be leaving a lot out.
A lot of missing things here: A few examples of the top of my head: tcl Modula2 Modula3 BCPL B BLISS FOCAL FORTH Scheme Utopia Eiffel bash scripts csh scripts D DIBOL MDL
LISP : The only computer programming language I've ever been shown that I just cannot get my head around. I have no particular training or practice with Python or Rust, but they're just programming languages. Lisp seems to be something else.
Nice list! Thanks for the video. On what criteria were these languages chosen? As an example, why were V Lang and Crystal not included? They're my recent favorites :) both really deserve to be mentioned
Knowing our fortune, if Crystal was mentioned, it'd be two seconds long section and said to be "Ruby, but with better speed" :< How he massacred my boy Nim.... I know it's niche language and it can be summarized to be "Python, but C", but there's so much you can include to expand it even with few seconds more... like memory management system you can choose, or macros allowing you to basically rewrite AST to your needs!
And what about Focal and two independently made languages called C - - ?.. What about Modula-2/3 and Oberon? And most important of all, where's PL/I ?! And Forth?!.
@@edwardfalk9997 it's used most commonly for developing iOS apps, but it's a programming language on its own, it can be used for other things like web services, etc.
This is the case with macOS and iOS in general, it's not even unique to Swift. This is due to Apple's licensing. Also, Xcode is not just for Swift; it's much older than Swift, and there are other IDEs for Swift. Swift and Xcode are not tied to each other. This dude had so many things wrong about these...
@@edwardfalk9997 I think they mean you can’t compile to a macOS or iOS executable from Linux or Windows. Swift has run on Linux for about 7 years and Windows for about 3. It was released under the Apache license starting with the first version to support Linux.
@@jaybutler I think they mean you can't BUY a compiler to do it. You can WRITE a compiler for anything to anything in ANY environment which supports STANDARD IN, STANDARD OUT and STANDARD ERROR on any architecture. If it was saleable someone would have done it. PS I used to compile programs in Algol on IBM 360 series mainframes to execute on DEC PDP10 machines.
@@KanashimiMusic The two are not directly correlated. Low level gives you more control over memory management, that part was right. You could say the same about a pair of scissors and a chainsaw. If you have no idea how a chainsaw works, its useless.
@@NetherFX They are definitely correlated, very strongly I would even say. I don't know what exactly you mean by "not directly", but most traits that are inherent to higher-level languages naturally come with performance impacts.
1. As you yourself said, low level gives you more control over memory management, and if you don't have that control, the language (or rather, the runtime) has to do the work for you, very often through a garbage collector, which has some performance impact. That is unless the compiler manages the memory for you, like in Rust, but calling Rust a "higher-level" language is debatable at best.
2. High-level languages usually include a lot of non-zero-cost abstractions and overhead which can have significant performance impacts.
3. All high-level languages that I can think of are interpreted. Even languages that are usually referred to as "compiled" are really not - they're compiled to machine code for a virtual machine like the JVM which then interprets or "just-in-time compiles" that code to actual machine code at runtime. And quite obviously, interpreting also has a significant performance impact. Granted, the JVM JIT specifically is known for being very good at optimizing in real time, but even it can't possibly beat ahead-of-time compiled, ACTUAL machine code.
I'm aware that low-level vs high-level is not binary, rather, it is a spectrum that's not very well-defined or universally agreed upon. But it's impossible to deny that many of the traits that make a language "high-level" inherently lead to programs written in that language being slower.
It's weird hearing "Fortran" and "slower" together. Fortran compilers have been around so long and are so refined that they generate ASM so optimized it's considered one of the only languages that runs faster than C. It's a *rocket ship* of a language (and yes, it'll do your GPU stuff via its vector extensions and run it crazy fast). It's also so old it's possibly your actual grandfather.
No language will ever come close to fully replacing C (at least for now), but Rust has the biggest chances. This is because of the support for C on essentially everything that runs code and its established role in tech
great video, getting to the point so fast, but it isn't *every programming language ever*. you missed some languages, as a couple of commenters already pointed out. oh and at 5:14: floats technically aren't the real numbers, since they can't represent irrational numbers at all. they can only represent a certain subset of the rational numbers.
If you want to be _super_ technical, IEEE-754 floats are a subset of the dyadic rationals (plus 4 extra values, one of which isn't a value and one of which is the same as an existing value but not). The dyadic rationals are a "ring" that consists of all integers multiplied by arbitrary powers of ½. A "ring" is a set of "numbers" that can be added, subtracted and multiplied while behaving nicely. Floats let you divide by rounding to the nearest representable dyadic rational even if the dyadic rationals themselves don't support division.
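A quick way to see the dyadic-rational point from C: 0.1 is not an integer times a power of two, so the nearest representable double gets stored instead, while 0.5 (exactly 2^-1) round-trips perfectly. A minimal sketch:

#include <stdio.h>

int main(void) {
    printf("%.17g\n", 0.1);  /* prints 0.10000000000000001 -- nearest double, not exact */
    printf("%.17g\n", 0.5);  /* prints 0.5 -- a dyadic rational, stored exactly */
    return 0;
}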
Need to add the languages Forth and ARexx to the list. My list of languages over the years is: BASIC, Forth, assembly (multiple CPUs (6502, 680x0, VAX)), Pascal, COBOL, C, C++, ARexx, HTML, SQL (I've looked at other computer languages but haven't programmed in them)
After the line: "You can still find Assembly in modern world: Web Assembly" i realized that this person knows nothing about any aspect of programming excluding wikipedia info.
Memory leak isn’t the issue of using too much memory. Program is still fine, as long as it knows what memory it’s using. Memory leak is the problem where the program forgets to actually delete the memory it no longer uses; the computer (operating system) thinks that the program is still using the memory but the program thinks it’s not.
The problem arises when the program forgets to free the memory, but then thinks it needs MORE memory and asks for it. If this happens in a loop the system will eventually run out of memory.
I run into this with some of my older classroom machines and have to teach middle school students about this issue. They have no clue!
Resulting in the program using too much memory
@@FranLegon Doesn't necessarily result; The program can leak just a few bytes on starting because it forgot to delete some initialization variables. But the rest of the program memory could still be managed properly. It's still dirty in the sense that you forgot to take the dirt off your shoes before entering the house.
A memory leak can be thought of as forgetting to throw out the trash. Usually users don't notice it. The problem only gets noticed when the trash raises a stink. Like how humans treat the environment.
@@GaryFerrao ok that's a good point. I was thinking it always build up to "too much" but you're right, it might not. Thanks for the response
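To make the thread above concrete, here is a minimal, deliberately pathological C sketch of the leak-in-a-loop scenario (the buffer size and names are arbitrary, and you probably don't want to run it for long):

#include <stdlib.h>

int main(void) {
    for (;;) {
        char *buf = malloc(4096);   /* ask the OS for more memory...        */
        if (buf == NULL) return 1;  /* ...until allocation eventually fails */
        buf[0] = 'x';               /* use the block briefly                */
        /* missing free(buf): the pointer is overwritten on the next
           iteration, so the block becomes unreachable to the program
           yet still counts as "in use" as far as the OS is concerned */
    }
}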
Sadly this video has many mistakes, but it was still fun to watch.
Yeah the part about Swift is wrong as it's available on Windows and Linux (the Arc browser is even building its Windows app on Swift). Yes, iOS and Mac development probably requires a Mac but you can 100% write command line apps or backends on Windows and Linux. Also Kotlin is useful for more than Android development, it's fully supported in the popular Spring framework.
Right off the bat with assembly too 😂
Agree.
This is more for entertaining than for accuracy.
I don't think anyone would watch a video of this kind for its accuracy. It's clearly aimed for the common layman (i.e. someone who's only getting started in programming)
@@ashwinnarasimhan2729 technically yes, practically no. Same as with C#/.Net, which can be used with Mono outside of Windows. But why would you do that?
I love how the whole comment section is just programmers correcting this video
fr XD
he doesn't know #garbage collection exists in the first place🤣🤣🤣💯👎
Nothing like a triggered autistic programmer.
Oh hey man, thanks for mentioning Holy-C. The dude behind it I actually talked to online a lot. He was actually a really nice guy when the schizophrenia wasn't screwing him up. He was a weird crazy genius, and ultimately he died tragically alone because of his mental illness. TempleOS and Holy-C are mostly remembered for the mental illness behind them, but Terry was a real human, and he deserves to be remembered as a mad pioneer who could have been so much more if it wasn't for the fact that this world is terribly unequipped to support people with significant mental illness. RIP Terry.
I also got happy when I saw Holy-C
Was surprised, but pleased to see it mentioned too.
110% agree on not only how he should be remembered, but on how he could have been so much more.
Dude was a genius.
RIP Terry.
I'm always surprised whenever somebody does not remember Terry for his genius self, but for the "controversial" attitude he had. Shows what society values more and how we don't deserve people like Terry. Hope he's in a better world now.
Is there still content of him around somewhere? I saw some clips of him and would really like to see him program or smth
I more remember him for saying the N word and ranting about government agencies.
This video is just completely riddled with inaccuracies, please viewers of this video, take everything said with a large grain of salt and do your own research and investigation of programming and computing topics.
A veritable mountain of salt.
Memory leak isn't using too much memory, that's the consequence of memory leak.
♥
Go is formally known as Go, not Golang. Golang is something we use for easier SEO
Some do, and then there are "the others", who call the language Golang, in written and spoken language.
@@tomasprochazka6198 "Formally" is the key part you're missing. I know you meant no harm but it's annoying when someone tries to pull the "but actually 🤓" and isn't even right.
@@dejangegic golang is rarely used informally as well. Between devs, we know what go is, we don't need to call it golang just because search engines can't
I wish R would do that. Trying to find anything code-related in a language whose name is a single letter is horrible; you end up just looking up packages most of the time
The way you explained object oriented language is making me cry, I am *NOT* joking.
yes, classes are not required for OOP, for example. Class based OOP is only one of the many ways to implement OOP.
saying C cannot handle OOP is plain wrong.
@@notkamui9749 I mean, by definition, it's right
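Since "C cannot handle OOP" keeps coming up: here's a minimal sketch of class-free OOP in plain C, using a struct for the data and a function pointer for dynamic dispatch. All the names here (Shape, area, and so on) are invented for illustration:

#include <stdio.h>

typedef struct Shape {
    double w, h;
    double (*area)(const struct Shape *self);  /* acts as a "virtual method" */
} Shape;

static double rect_area(const Shape *s)     { return s->w * s->h; }
static double triangle_area(const Shape *s) { return s->w * s->h / 2.0; }

int main(void) {
    Shape shapes[] = {
        { 3.0, 4.0, rect_area },
        { 3.0, 4.0, triangle_area },
    };
    for (int i = 0; i < 2; i++)  /* dispatch through the function pointer */
        printf("area = %.1f\n", shapes[i].area(&shapes[i]));
    return 0;
}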
10:40 note that HTML and CSS are not scripting languages, but markup languages. You can't have logic with these, right? Despite CSS having a calc() function which can be abused to have some logic, it's still not scripting or programming, just markup. Also I miss Hack lang (PHP extended by Facebook) and Vala, both are great and could be on the list!
Why do they need to have "logic", after all, isn't programming just instructions to a computer?
@@HappyPurpleFella Computers run on logic, so in order to write instructions to a computer the language needs to be able to perform logic. That's why CS students learn Discrete math at minimum, it's a mathematics field devoted to formalized binary logic, i.e. exactly what computers do. I'm reading a book called Code by Charles Petzold that does a great overview as to how and why computers developed as they did, if you're interested in learning more :)
Edit: This just came to me, maybe it's a good example. This need to perform logic could be why you could code a game in Excel, but not in Word. Excel can perform logic and math, but Word can't
HTML and CSS are quite literally turing complete, if that doesn't reach the bar for scripting language then I'm not sure what does.
Okay.. And?
@@gintoki_sakata__ and it should not be on the list of programming languages 🤷♂
Web Assembly is not assembly. I've written it in C++, but I think other languages are supported.
Assembly is not a well-defined term. Most assemblies are higher-level than you would expect. For example, procedures/subroutines don't even exist in machine code, so saying WebAssembly is Assembly is debatable, but I'd say it is not a false statement. Furthermore, WASM targets a VM, so it is the lowest-level language for that specific machine.
eeeeh, sort of. "Assembly" is pretty non-existent as a unified concept. "assembly" is basically just putting names to sequences of raw machine code; it's not really a language in and of itself. For instance, even on a RISC architecture nothing is stopping you from implementing x86-style inline memory referencing, and then just binding that formatting to a sequence of bytes which does that, even though technically it'd be several distinct machine instructions.
Calling "Assembly" a language is sort of like calling macros a language; neither is really a language in itself, they just sort of stand in for the language itself. (Assembly just puts words to sequences of raw bytes of machine code whereas macros just put words to sequences of source code.) In that sense, there isn't really a difference between "assembly" and *_any_* compiled code, and WebAssembly is assembly as much as anything else can be.
Granted, WebAssembly is a *_unified_* compilation target, i.e. it's not running *_completely_* bare metal and it's not *_completely_* raw bytecode, but this is more an issue with the classification IMO. You wouldn't call Java an interpreted language, but it similarly works in a 'virtual machine'. There is a bit of grey area between compiled languages and interpreted languages; that grey area is just typically so small as to not really matter. WebAssembly is basically trying to be as close to completely native as possible while staying a unified compilation target, so I'd say calling it 'assembly' (given, as mentioned previously, 'assembly' basically just means bytecode with extra steps) is a fair classification. It's a bit inaccurate, but every classification is.
WebAssembly is a binary format, so it's not human-readable, while the whole purpose of assembly is to be a human-readable representation of machine code.
WAT (WebAssembly Text Format) could be called an assembly
You don't write WebAssembly with C++. You can compile C++ into Assembly and WebAssembly
Indeed. WASM has support in many programming languages. Rust probably has some of the best support, though. For C/C++, a tool called Emscripten can compile them to WASM.
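For anyone curious what that looks like in practice, here's the usual Emscripten hello-world, assuming the Emscripten SDK is installed (the file name is arbitrary):

/* hello.c -- plain C that Emscripten can compile to WASM */
#include <stdio.h>

int main(void) {
    printf("Hello from WebAssembly!\n");
    return 0;
}

Compiling it with emcc hello.c -o hello.html produces the .wasm module plus the JavaScript/HTML glue needed to load and run it in a browser.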
I wish JS was only used for web browsers, but people made the stupid decision to use it in desktop apps, on web servers, and way more
You mean Electron apps? Which are websites packaged in a Chromium browser...
@@theramendutchmani think he meant nodejs
what’s so stupid about deciding to use JS on web servers?
@@danielegvijs sucks
@@emireri2387 specifically how
There is no mention of FORTH; the one language seemingly everyone forgets.
no mention of Algol 68, Modula 2, JOVIAL, etc .....🤥
Forth was a brilliant language. I spent years reading about it before I ever got a chance to use it because I was utterly obsessed with its concept. And man, when I finally got to use it, it was like a duck to water for me. It's basically inverted Lisp: one simple concept, stacks (as opposed to Lisp's one simple concept, lists), taken to its logical end. The end result is that while Lisp naturally produces functional programming, Forth naturally produces structured programming, leading to a very very interesting conclusion: the difference between functional and structured programming is the order you remove items from a linked list. Beautiful mathematical comp-sci in the form of a language.
Everyone should learn 3 languages, purely for deepening their understanding, not for job skills: Lisp, Forth and Smalltalk. Those three languages distill the purest essence of their respective paradigms of coding: functional, structural, and OOP respectively. And all three of them you can learn in a few days each, tops.
@@shayneoneill1506 I did not know Smalltalk was the OOP proto-language. I'd always been under the impression that C++ was, which I've always found intimidating. I will look into Smalltalk, thank you.
I've legit never heard of Forth. But maybe before my time?
@@Stettafire Yes. It was created by a man named Chuck Moore, in the mid 1970s.
: STAR 42 EMIT ;          \ emit ASCII 42, i.e. print one asterisk
: STARS 0 DO STAR LOOP ;  \ ( n -- ) print n asterisks
: DOUBLE DUP + ;          \ ( n -- 2n ) double the top of the stack
: QUAD DOUBLE DOUBLE ;    \ ( n -- 4n ) double it twice, i.e. quadruple
16 4 QUAD STARS           \ 4 QUAD leaves 16, so STARS prints 16 asterisks
If you enter the above into GNU Forth, it will print a row of asterisks (4 QUAD evaluates to 16, so STARS prints 16 of them; the initial 16 is simply left on the stack).
+ BCPL, Forth, Modula, Modula-2, Metafont, PostScript
I didn't hear C# or Bash mentioned.
s, a+, b, d, e, Erlang, oCaml, Produire, 1C enterprise programming language, v, some old soviet programming languages in Russian, and even more are missing than just those.
around 8:50 when "he" started to read the website names i realized that the script is read by ai voice
now that i think about it, whole video is probably created by ai
Was created about two months ago, tons of factual mistakes and weird choices and organization (web assembly? powershell? (and not bash?) zig, but no rust?), and plenty of bizarre unnatural speech. Plus, totally unfocused content, none of the other videos are programming. I didn't see it but I think you're totally right, this is mostly or entirely AI.
I think you two are right. The whole structure of this video is pretty sus
Rust is in there. Although I thought the exact same thing: "Zig and not Rust?" Hovered the timeline and there it is. About as meaningful an overview as everything else on the list.@@omnisel
@@mashydp1780 Oh huh, yeah, you're right. I must have missed it or was thinking about something else he said, or something, lol. It was 14 seconds long, in my defense
I thought about this as well. It is so incredibly simple to have AI generate a list of 50+ (or however many there were) programming languages and write a short description about them regarding either their history or noteworthy components and then use an AI voice to read it. It explains why there's quite a few errors with some of the statements.
WebAssembly is NOT an assembly language.
Next you should do "Every JavaScript framework ever explained in 60 minutes"!
Underrated Comment.
Impossible without running the video on at least 2x
But there will be two more in the time it takes you to watch the video
More like 60 years 😂😂😂
I genuinely did not realize this video was read (and probably made) by AI until a comment pointed out how weird it sounded when it read out the website names. On closer examination, this video is definitely read by AI and had many signs pointing to it, but if you weren't purposefully looking for it, it just goes to show how good AI is getting now.
AI can have a lisp now??
maybe script written by ai but the voiceover is a real dude (although misinformed)
@@lestusmestus if it really is AI, then the creator of this video likely trained the model himself on his own voice
@@FilterBud I mean could be. there is pretty good voice cloning sites out there
I can't believe that you didn't mention that Erlang was designed to work even if you change the code *while* it's running etc.
And LISP deserves a mention of the Lisps, the languages descending from it, due to how easy it was to make new personalized languages in LISP to begin with.
Html and CSS are not scripting languages. In fact, they are not programming languages at all as they have no functional abilities. They are called markup languages and are completely static unless manipulated by a programming language like JavaScript.
Wow, I taught Pascal and C in the Borland IDE that you used in your IDE screenshot-MAN THAT BRINGS BACK MEMORIES (esp. malloc()!)
HTML is a markup language, not a programming language. Likewise, CSS is also not a programming language, but a style sheet language. Furthermore, JavaScript is not technically related to Java; the name was just a marketing ploy by the folks behind Netscape.
While PowerShell was originally Windows command line on steroids, it's now pretty much a scripting interpreter for C#/.NET (i.e. doesn't require compiling)
14:26
"Developers test their code thoroughly"
I put my fist through my desk I was laughing so hard.
As a dev, that's why we have QA! I support QA in all their endeavours! They allow me to avoid doing the crap I hate.... (I'm only slightly joking. I still do TDD and write unit tests. But if you think TDD can replace proper QA you're fucking deluded)
Bro called fortran slow xD
and called lua fast
@@Om4r37 lua is fast though compared to fortran or js or the hell that is python
@@Om4r37 no shit
@@Om4r37 lua is one of the fastest interpreted languages
@@Om4r37 javascript is slow, but it's fast when u run it on BunJS env
Fun fact, PHP had a good meaning for its name... but then they made it:
PHP: Hypertext Preprocessor
IT'S RECURSIVE
Originally it meant personal home page
@@mollthecoder When Rasmus Lerdorf used it for his personal home page!
When he gave up the language to the community that had been built around it, they decided to rename it to PHP: Hypertext Preprocessor, as it was no longer the language for Lerdorf's personal home page but still a hypertext preprocessor, built for applications on HTTP!
Now it means PHP Hates Programmers
You mentioned powershell, but forgot bash.
try doing ur own research instead of asking chatgpt to write the script and then using an ai model to read it out, this is so wrong on so many things that it doesnt count as educational
Lots of small wrong/misleading info but i learned a lot about the languages
LLVM is not a virtual machine...
Unfortunately, I have a correction. Swift is no longer only available for Apple devices. It can now be written on and for Windows and Linux. Great video otherwise tho!
You can even write a server in Swift. A better question is, why would you?
@@dejangegic apple paid you to
@@dejangegic Swift compiles to a binary, like Go. Unlike Go, there’s no garbage collection: just super-fast automatic reference counting. Swift has lots of language niceties like a strong type system, excellent type inference, Protocol types, type extensions, maybe the best enum implementation ever made. Swift’s Foundation library contains so much functionality that most other languages need a patchwork of packages to match. There’s an excellent server ecosystem covering nearly all other use cases.
There are plenty of downsides. Using Swift on Linux has different behavior than using Swift on Apple systems for things like URLSession. So, writing Server Swift will mean adopting third party libraries like AsyncHTTPClient. Most Swift server frameworks rely on the SwiftNIO library because Swift’s async/await system can hop threads (though, I know this is being worked on if not already in Swift 5.9). And if you aren’t using Docker then you have to install the Swift runtime on your host, which can be annoying.
@@dejangegic it’s actually a pretty amazing language
The title is nonsense. There are thousands of programming languages around...
well you certainly lived up to your name
For example brainf%%k
Android Studio may be the most used IDE for Android development, but IntelliJ IDEA is the world's most used IDE for Kotlin
Interesting and informative video.
A few minor things.
You missed FBD, STL, and LAD - all three are very common if you do industrial automation (but virtually unheard of outside that) (usually you end up doing all three from the same tools, and learning them at the same time - they all complement each other but conform differently to different ways to approach a program)
You missed SPARK, the high-reliability version of Ada (given just how insanely safe Ada is to begin with this is kinda mindboggling).
As for the largest user of Ada, it depends on how you view it: if you go by "organisation size" then yes, it is DoD, but in terms of industries it is used in banking, aviation, railways, and medical fairly heavily (basically in situations where failure is not an option). Nvidia also uses Ada a fair bit.
The different Ada versions are notably different (for instance the jump between Ada83 and Ada95 is basically the Ada-counterpart to the jump between C and C++)
Portability of Java is due to practices rather than technical reasons; you can compile Java code to platform-specific binaries, and you can compile most (compiled) languages to Java bytecode (the compiled Java that runs in a JVM). (For that matter, virtual machines specific to a particular bytecode aren't exactly rare)
There was actually a “turtle”. It was a simple turtle-like robot that could hold a pen and draw the results of the Logo program.
SQL is not a programming language, it is a Structured QUERY Language.
I'm so happy you skipped Groovy. For anyone wondering, Groovy walked so Kotlin could run
what bro?😂
@@K.Parth_Singh What haven't you understood?
Well then, i thank Groovy but he has stopped walking, i gotta catch Kotlin now if you don't mind.
@@dejangegic groovy walked?
@@K.Parth_Singh It's a figure of speech. Search for "You have to walk before you can run"
Nice thing about Scratch is that every variable is every type of variable simultaneously. You can set a variable to 5, divide it by 2, and then set it to hello all in the same script and all without the requirement of using annoying converting functions.
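For contrast, here is roughly what a statically typed language like C has to build by hand to get a variable that can be a number one moment and text the next: a tagged union (all names invented; this is just a sketch of the idea, not how Scratch is actually implemented):

#include <stdio.h>

typedef struct {
    enum { NUM, STR } tag;                     /* what the value currently is */
    union { double num; const char *str; } as;
} Value;

int main(void) {
    Value v = { NUM, { .num = 5.0 } };         /* set a variable to 5...      */
    v.as.num /= 2.0;                           /* ...divide it by 2...        */
    printf("%g\n", v.as.num);                  /* prints 2.5                  */
    v = (Value){ STR, { .str = "hello" } };    /* ...then set it to "hello"   */
    printf("%s\n", v.as.str);
    return 0;
}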
FORTRAN was the first high level language that was commercialized, not the first high level language.
dude I have been immersed in tech my whole life, and never have I been told that the first compilers were actual physical machines with moving parts. That's actually kinda wild
It was also a job, like there were people that would compile your code into punchcards
Wild, yes, and also not true.
It was really fantastic to see a mention of MUMPS - it's still the greatest!
Still alive and well and in active use - be that via evolution (M -> Caché -> IRIS, gaining a bunch of functionality such as OO along the way), or the purer variant (GT.M)
Hi @giociampa, I worked for one client just a few years ago and got IRIS shortlisted, but in the end, they went for what "they knew" even though IRIS blew everything out of the water on cost/performance, etc. 🙄
High level languages are not necessarily slower than low level ones. Some compiled languages, C for example, have no runtime and are as fast as assembly (because the compiler literally compiles to assembly)
Well assembly will always be faster, but the real question is can we code perfect assembly 😉
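You can watch this happen yourself. A minimal sketch, assuming gcc (clang accepts the same flag):

/* add.c -- a trivial function for peeking at the compiler's assembly output */
int add(int a, int b) {
    return a + b;
}

Running gcc -S -O2 add.c writes the generated assembly to add.s; on x86-64 the whole body typically boils down to a single lea/add instruction plus a ret, though the exact output varies by compiler version and target.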
Fortran makes use of columns too. On the old Fortran pads where you wrote your code, there was a special column for the C character, which marked a comment.
It's been a while but aren't there columns still reserved for labels and to indicate line continuations as well?
@@mikechappell4156 I can't remember. I can only remember the C column. I was a 9-year old child when I last wrote a Fortran program on a pad.
So the website “99 bottles of beer” (a site devoted to having programs to print the song lyrics) has 1500+ languages. Given 1 second per language it would take 25 minutes to say just the names…
My grandpa used to code in RPG in the 60s. Really cool to see a shout out
I, and many others, will code some RPG today.
13:53 Swift is open source, so you can also use it on Linux and windows (I’ve tried it), you just won’t have access to apple specific things like SwiftUI.
Calling it "Every Programming Language Ever" is a bit of a stretch. Others I have worked with include Mercury Autocode, CLIST, ISPF, Paradigm Plus, PL/1, REXX, and GML.
Holy C is a wild ride of a story haha
liked cuz you included terry, dudes a legend
Where is FORTH?
13:46 "replace objective C" sounded like "replay subjective C" lol
5:56 - there is no C++ in Linux. I needed to point it out, since "no C++ in the Linux kernel" is almost a religious thing for us Linux users at this point
god now I need to make a version of the Fuse T-posing over the crying hostage meme with Rust and C++
There is some C++ in the source tree, but only C++98 or older is allowed currently.
@@BlessedDog And yet, they're starting to use Rust :/
@@gamerk316 The Rust cult is strong lol. Linus said something about needing to attract new maintainers for the Linux kernel, as a lot of the maintainers are getting older.
Adding Rust is imo a way to "advertise" the Linux kernel. Also, Rust doesn't have most of the things Linus despises about C++ (exceptions, operator overloading, dynamic dispatch, etc...).
@@briannormant3622 Otherwise known as "Linus hates things that makes developers lives easier because of his ego".
Linus (and the GPL) are the main reasons why I don't develop for Linux.
ATLAS was a programming language used by the DoD and airline industry for diagnostic equipment used for testing and repair of radar and computer components in aircraft. It could also be used for CNC and other machine operations.
3:02 "A = 1. what is a? A is a REAL SCALAR! a = "now a character array". what is a? A is a 21 element CHARACTER ARRAY!"
Matlab is "MATrix LABoratory", not "Matrix Library".
Two things that I noticed:
1. You can code in Swift on Windows, although the setup is slightly complicated.
2. Rust is not easy. Easier than C and C++? Maybe (I haven't done enough Rust to say that for sure). Rust is, by the way, NOT used for applications where performance is EXTREMELY important - games and OSs do not fall in this category; performance there is important, but not extremely so - because the safety features it provides slow it down. Programs where performance is key are found in trading and written in C or C++.
Rust has in many cases been shown to outperform C and C++. Rust is extremely good when performance and safety are needed.
@@dtimer319 correct, Rust is extremely good when it comes to speed and safety, but as far as I know, Rust is slower than C++ - only marginally so, but sometimes this is relevant. I would like to see the cases where it is faster, since I would personally question the speed of the code itself rather than what it has been compiled to.
@@dtimer319
It's a lie.
For the exact same program, C++, C and Rust will compile to the same machine code. Rust and C++ can be faster than C in some cases, for example (std::sort vs qsort) the compiler can directly inline the sorting operation. But a C programmer optimizing the hell out of their program can use macros and/or simply rewrite qsort for their specific type to get to the same speed as C++/Rust.
If you compare the assembly code emitted by C, C++ and Rust, it's for all intents and purposes exactly the same. If you check a test showing a perf comparison between those 3, check the code used; it won't always do the exact same thing. It's quite easy to understand why: Rust uses the LLVM framework, the same one Clang uses to compile C and C++ code, so the same optimizations can be performed on all 3 programs.
With modern processors, when it comes to performance, the best way to optimize is to keep cache locality and branchless programming in mind.
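To make that inlining point concrete, a minimal C++ sketch (function names invented for illustration): qsort only receives an opaque function pointer, while std::sort is a template, so the comparison gets inlined into the generated sort loop.

    #include <algorithm>
    #include <cstddef>
    #include <cstdlib>

    // C-style: the comparator is a function pointer the optimizer
    // usually can't see through, so every comparison is an indirect call.
    int cmp_int(const void* a, const void* b) {
        int x = *static_cast<const int*>(a);
        int y = *static_cast<const int*>(b);
        return (x > y) - (x < y);
    }
    void sort_c(int* v, std::size_t n) { std::qsort(v, n, sizeof(int), cmp_int); }

    // C++-style: std::sort is a template; the comparison (operator< on int)
    // is inlined directly at compile time.
    void sort_cpp(int* v, std::size_t n) { std::sort(v, v + n); }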
@@entropy4959 there are plenty of benchmarks. At least for math problems, the best Rust solutions are often the fastest or second fastest. However, the little-known D language beats it in terms of speed any time it isn't forgotten from those tests, because D is the fastest boy in the race, when it does happen to race.
You definitely deserve a thumbs up for this one.
This video is a straight-up masterpiece. In 15 minutes I learned more important concepts than one university course would be able to teach
what kind of shitty university are you planning on going to
@@valerikitipov1389 doing your mum academy
LOGO was the first language I ever learned about. It was in middle school. We had these old 486 computers with early Windows and PrintShop and Logo. I had a lot of fun making different looking designs. In a year or two I would buy a used TI-83 for $20 and teach myself TI-BASIC, which is incredibly easy because it's an already very limited version of BASIC. Those were some fun times!
"Every Programming Language Ever…" is quite an exaggeration. For instance, these are missing (in no particular order): Object Pascal (Delphi can be considered a dialect of it); Modula2, Modula3, Oberon; E; AmigaE (not related to E); REXX; Eiffel; J (inspired by APL); Racket, Scheme, Clojure, …; Forth; TCL; Korn shell, Z shell, Bash …; AWK; D; BCPL; groovy; Oz; REBOL (and red); Haxe; occam; … and a lot, lot more.
Moreover, non-Turing-complete languages (such as SQL, HTML, CSS) shouldn't be listed in a "programming languages" list.
As pointed out by others, there are mistakes. I want to focus on a few language-related ones:
§ **Raku** is a different language than Perl (they decided to change the name from Perl6 to Raku to stress the fact that it is a different language);
§ Go is not like Python at all - and Go is a compiled lang, Python is not
§ maybe just ALGOL, not ALGOL 60; otherwise you should list other ALGOLs as well (58, 68, W)
§ C was influenced by Algol 68, and B (derivative of BCPL)
Saying C was influenced by Algol 68 is quite a stretch too. It's a distant ancestor, through CPL, BCPL, B and finally NB.
@@doigt6590 I suspect that the "influenced by Algol…" label could be given to many languages. Then one can argue that Algol 68 wasn't the first and only language to use this or that feature/concept/syntax; then one should know the languages C's creators were exposed to… and so on. Hard to track correctly; anyway, I think it isn't unlikely that it somehow influenced all the languages created in the time when it was, er, influential :)
@@MauroPanigada Instead of suspecting something, you could actually look it up. The authors of C wrote on the history of its making and they are quite clear on the relation between C and Algol.
@@doigt6590 Looked it up on wikipedia hoping it would have an easy reference of such a history lesson… they have a citation which seems to contradict your "Saying C was influenced by Algol 68 is quite a stretch too."
The quote is this: "The scheme of type composition adopted by C owes considerable debt to Algol 68, although it did not, perhaps, emerge in a form that Algol's adherents would approve of."
@@MauroPanigada Lol, wikipedia, while I am quoting the language's authors. I think I'll trust the language's authors over what wikipedia says haha.
What about the Brainf*ck programming language?
All 1500+ languages? I doubt it.
How many total languages exist? Thank you.
4:17
I'd just like to interject for a moment. What you're referring to as Linux, is in fact, GNU/Linux, or as I've recently taken to calling it, GNU plus Linux. Linux is not an operating system unto itself, but rather another free component of a fully functioning GNU system made useful by the GNU corelibs, shell utilities and vital system components comprising a full OS as defined by POSIX.
Many computer users run a modified version of the GNU system every day, without realizing it. Through a peculiar turn of events, the version of GNU which is widely used today is often called Linux, and many of its users are not aware that it is basically the GNU system, developed by the GNU Project.
There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine's resources to the other programs that you run. The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called Linux distributions are really distributions of GNU/Linux!
No, Richard, it's 'Linux', not 'GNU/Linux'. The most important contributions that the FSF made to Linux were the creation of the GPL and the GCC compiler. Those are fine and inspired products. GCC is a monumental achievement and has earned you, RMS, and the Free Software Foundation countless kudos and much appreciation.
Following are some reasons for you to mull over, including some already answered in your FAQ.
One guy, Linus Torvalds, used GCC to make his operating system (yes, Linux is an OS -- more on this later). He named it 'Linux' with a little help from his friends. Why doesn't he call it GNU/Linux? Because he wrote it, with more help from his friends, not you. You named your stuff, I named my stuff -- including the software I wrote using GCC -- and Linus named his stuff. The proper name is Linux because Linus Torvalds says so. Linus has spoken. Accept his authority. To do otherwise is to become a nag. You don't want to be known as a nag, do you?
(An operating system) != (a distribution). Linux is an operating system. By my definition, an operating system is that software which provides and limits access to hardware resources on a computer. That definition applies wherever you see Linux in use. However, Linux is usually distributed with a collection of utilities and applications to make it easily configurable as a desktop system, a server, a development box, or a graphics workstation, or whatever the user needs. In such a configuration, we have a Linux (based) distribution. Therein lies your strongest argument for the unwieldy title 'GNU/Linux' (when said bundled software is largely from the FSF). Go bug the distribution makers on that one. Take your beef to Red Hat, Mandrake, and Slackware. At least there you have an argument. Linux alone is an operating system that can be used in various applications without any GNU software whatsoever. Embedded applications come to mind as an obvious example.
Next, even if we limit the GNU/Linux title to the GNU-based Linux distributions, we run into another obvious problem. XFree86 may well be more important to a particular Linux installation than the sum of all the GNU contributions. More properly, shouldn't the distribution be called XFree86/Linux? Or, at a minimum, XFree86/GNU/Linux? Of course, it would be rather arbitrary to draw the line there when many other fine contributions go unlisted. Yes, I know you've heard this one before. Get used to it. You'll keep hearing it until you can cleanly counter it.
You seem to like the lines-of-code metric. There are many lines of GNU code in a typical Linux distribution. You seem to suggest that (more LOC) == (more important). However, I submit to you that raw LOC numbers do not directly correlate with importance. I would suggest that clock cycles spent on code is a better metric. For example, if my system spends 90% of its time executing XFree86 code, XFree86 is probably the single most important collection of code on my system. Even if I loaded ten times as many lines of useless bloatware on my system and I never executed that bloatware, it certainly isn't more important code than XFree86. Obviously, this metric isn't perfect either, but LOC really, really sucks. Please refrain from using it ever again in supporting any argument.
Last, I'd like to point out that we Linux and GNU users shouldn't be fighting among ourselves over naming other people's software. But what the heck, I'm in a bad mood now. I think I'm feeling sufficiently obnoxious to make the point that GCC is so very famous and, yes, so very useful only because Linux was developed. In a show of proper respect and gratitude, shouldn't you and everyone refer to GCC as 'the Linux compiler'? Or at least, 'Linux GCC'? Seriously, where would your masterpiece be without Linux? Languishing with the HURD?
If there is a moral buried in this rant, maybe it is this:
Be grateful for your abilities and your incredible success and your considerable fame. Continue to use that success and fame for good, not evil. Also, be especially grateful for Linux' huge contribution to that success. You, RMS, the Free Software Foundation, and GNU software have reached their current high profiles largely on the back of Linux. You have changed the world. Now, go forth and don't be a nag.
Thanks for listening.
The question is how in the world did the GNU Project arrive at the worst thing to pronounce with the fewest possible letters. Seriously--it's only three letters and when people see it for the first time, there's about three possibilities that come to mind for how to say it. It's simply a case of stereotypical computer geeks doing something brilliant and not knowing the first thing about marketing, and now we get to hear the oft-repeated lamentations of disrespect when that's not the case at all. Linux is just easy to say and kinda catchy--that's it.
@@jonfeuerborn5859 I hope you're not serious. You're aware that this is a copypasta?
I just wanna say: i use arch btw
Please consider doing a vid on what languages are most popular, aimed at helping a new student decide what to study. Thank You
Nice ride! But wot no forth? And bcpl? Some influential blasts from the distant past - and my youth! 😄 Also, OCCAM?
So many goofs. FORTRAN is not slow. FLOW-MATIC ran on a UNIVAC I, definitely not a supercomputer. Algol inspired Algol W, Algol 68, Pascal, but C, not so much. C didn’t inspire Go, Java, JavaScript, PHP, or Python. Also you only hit a few languages; in reality there are hundreds, mostly special-purpose languages and every assembly language.
Music: Mario Kart Wii OST - Coconut Mall
Forgetting PL/1 is no problem for me, but forgetting Forth? Seriously?
I wish I could program.
I have some amazing ideas that the world will never see.
Did you mention FORTH ?
Thanks for the video. I don't know anything about programming. This clarified a lot.
Nice video. I am your 274th subscriber!
7:55 functional programming is good not just for math, but also for business applications. Check out the talks "Domain Modeling Made Functional" and "Making Impossible States Impossible"
Excellent. And BASIC was based on Algol.
10:50 uhm... <center> is deprecated...
center is the best element in the world, everyone should use it, it's pure magic, and nobody will change my mind just because it's deprecated.
@@doigt6590 div { display: flex; justify-content: center; }
this isn't every programming language. There were lots of Russian-based programming languages in the Soviet Union, a French-based programming language in France, some newer programming languages like V, some older ones also missing like S, A+, B, D, E, Erlang, OCaml, and some more modern non-English programming languages like Produire (プロデル, Japanese) and the 1C:Enterprise programming language (Russian). There are also tons of others missing here. It would've helped if the video had some disclaimer about how it was classifying these, because it seems to be leaving a lot out.
A lot of missing things here. A few examples off the top of my head:
tcl
Modula2
Modula3
BCPL
B
BLISS
FOCAL
FORTH
Scheme
Utopia
Eiffel
bash scripts
csh scripts
D
DIBOL
MDL
LISP: The only computer programming language I've ever been shown that I just cannot get my head around. I have no particular training or practice with Python or Rust, but they're just programming languages. Lisp seems to be something else.
lisp isn't a programming language, lisp is a way to live, for real
@@DeciPaliz~ See? I told you I didn't understand it.
You totally missed FORTH, developed to control telescopes and the basis of computer-controlled manufacturing
ThePrimeTime reacted to you! You're going to be famous.
What about dBase/FoxPro/Clipper?
Nice list! Thanks for the video. On what criteria were these languages chosen? As an example, why were V Lang and Crystal not included? They're my recent favorites :) both really deserve to be mentioned
I think it's a top 60 or something like that; just the video title is misleading. By the way, Crystal is great! As an ex-Ruby dev I really like Crystal
Knowing our fortune, if Crystal was mentioned, it'd be two seconds long section and said to be "Ruby, but with better speed" :<
How he massacred my boy Nim.... I know it's a niche language and it can be summarized as "Python, but C", but there's so much you could include to expand it, even with a few seconds more... like the memory management system you can choose, or the macros allowing you to basically rewrite the AST to your needs!
R is also required for social sciences like political science.
And what about Focal and the two independently made languages called C--?.. What about Modula-2/3 and Oberon? And most important of all, where's PL/I?! And Forth?!
you can code in Swift on Windows and Linux, but you can't compile it for Mac or iPhone.
?? I thought Swift was written to replace Objective C, especially in iPhone
@@edwardfalk9997 it's used most commonly for developing iOS apps, but it's a programming language in its own right; it can be used for other things like web services, etc.
This is the case with macOS and iOS in general, it's not even unique for Swift.
This is due to Apple's licensing.
Also, Xcode is not for Swift - it's much older than Swift - and there are other IDEs for Swift; Swift and Xcode are not related.
This dude had so many things wrong about these...
@@edwardfalk9997 I think they mean you can’t compile to a macOS or iOS executable from Linux or Windows. Swift has run on Linux for about 7 years and Windows for about 3. It was released under the Apache license starting with the first version to support Linux.
@@jaybutler I think they mean you can't BUY a compiler to do it. You can WRITE a compiler for anything to anything in ANY environment which supports STANDARD IN, STANDARD OUT and STANDARD ERROR, on any architecture. If it was saleable, someone would have done it.
PS I used to compile programs in Algol on IBM 360 series mainframes to execute on DEC PDP-10 machines.
I wish Scratch would grow up.
Literally.
I am an adult, I need a non-kid version to learn and create with.
Swift is fully open source and capable of being run on many different operating systems today, not just Apple.
A few errors and some missing languages, but all in all a solid video. Keep it up
DANG, you covered a _LOT_ of ground there.
Great video, but I hope you understand that claims like "low level runs much faster than high level" are debatable, if not outright wrong.
It's true though, especially for interpreted languages, although this guy clearly doesn't know what he is talking about.
Low level literally runs faster than high level though.
@@KanashimiMusic The two are not directly correlated.
Low level gives you more control over memory management, that part was right.
You could say the same about a pair of scissors and a chainsaw. If you have no idea how a chainsaw works, it's useless.
@@NetherFX They are definitely correlated, very strongly I would even say. I don't know what exactly you mean by "not directly", but most traits that are inherent to higher-level languages naturally come with performance impacts.
1. As you yourself said, low level gives you more control over memory management, and if you don't have that control, the language (or rather, the runtime) has to do the work for you, very often through a garbage collector, which has some performance impact. That is unless the compiler manages the memory for you, like in Rust, but calling Rust a "higher-level" language is debatable at best.
2. High-level languages usually include a lot of non-zero-cost abstractions and overhead which can have significant performance impacts.
3. All high-level languages that I can think of are interpreted. Even languages that are usually referred to as "compiled" are really not - they're compiled to bytecode for a virtual machine like the JVM, which then interprets or "just-in-time compiles" that code into actual machine code at runtime. And quite obviously, interpreting also has a significant performance impact. Granted, the JVM JIT specifically is known for being very good at optimizing in real time, but even it can't possibly beat ahead-of-time compiled, ACTUAL machine code.
I'm aware that low-level vs high-level is not binary, rather, it is a spectrum that's not very well-defined or universally agreed upon. But it's impossible to deny that many of the traits that make a language "high-level" inherently lead to programs written in that language being slower.
love the powershell part
I miss Standard ML and Scheme 🙂. Functional programming is interesting from a mathematical point of view.
He says IBM rpg and my mind is like 'ibm has a roleplaying game?? :o'
It's weird hearing "Fortran" and "slower" together. Fortran compilers have been around so long and are so refined that they generate ASM so optimized it's considered one of the only languages that runs faster than C. It's a *rocket ship* of a language (and yes, it'll do your GPU stuff via its vector extensions and run it crazy fast). It's also so old it's possibly your actual grandfather.
No language will ever come close to fully replacing C (at least for now), but Rust has the biggest chances
This is because of the support for C on essentially everything that runs code and its established role in tech
memory leak is when you didn't free the allocated memory and have no "access" to it anymore
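A minimal C++ sketch of exactly that (names invented for illustration): the allocation stays alive, but the last pointer to it gets overwritten, so it can never be freed.

    #include <cstdlib>

    void leak() {
        // grab 100 ints from the heap...
        int* buf = static_cast<int*>(std::malloc(100 * sizeof(int)));
        // ...then overwrite the only pointer to them.
        buf = nullptr;
        // The allocator still considers the block in use, but the
        // program can no longer reach or free it: a memory leak.
    }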
great video, getting to the point so fast, but it isn't *every programming language ever*. you missed some languages, as a couple of commenters already pointed out.
oh and at 5:14:
floats aren't technically real numbers, since they can't represent irrational numbers. they can only represent a certain subset of rational numbers.
If you want to be _super_ technical, IEEE-754 floats are a subset of the dyadic rationals (plus 4 extra values, one of which isn't a value and one of which is the same as an existing value but not).
The dyadic rationals are a "ring" that consists of all integers multiplied by arbitrary powers of ½. A "ring" is a set of "numbers" that can be added, subtracted and multiplied while behaving nicely. Floats let you divide by rounding to the nearest representable dyadic rational even if the dyadic rationals themselves don't support division.
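A quick C++ sketch of that, assuming IEEE-754 doubles (which nearly every platform uses): 0.25 is a dyadic rational and is stored exactly, while 0.1 is not, so it gets rounded to the nearest representable one.

    #include <cstdio>

    int main() {
        // 0.25 = 1 * 2^-2 is a dyadic rational: stored exactly.
        std::printf("%.20f\n", 0.25); // prints 0.25000000000000000000
        // 0.1 has no finite binary expansion: stored as the nearest
        // representable dyadic rational instead.
        std::printf("%.20f\n", 0.1);  // prints 0.10000000000000000555
        return 0;
    }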
Need to add the languages Forth and ARexx to the list. My list of languages over the years: BASIC, Forth, assembly (multiple CPUs: 6502, 680x0, VAX), Pascal, COBOL, C, C++, ARexx, HTML, SQL (I've looked at other computer languages but haven't programmed in them)
You gotta teach me assembly man
Pascal is the ancestor of all good programming languages.
Dude. At least you got the names mostly right, I guess.