Why I Chose Rust Over Zig

  • Published Aug 27, 2024
  • Recorded live on twitch, GET IN
    Article
    turso.tech/blo...
    By: Pekka Enberg | x.com/penberg?...
    My Stream
    / theprimeagen
    Best Way To Support Me
    Become a backend engineer. It's my favorite site
    boot.dev/?prom...
    This is also the best way to support me: support yourself by becoming a better backend engineer.
    MY MAIN YT CHANNEL: Has well-edited engineering videos
    / theprimeagen
    Discord
    / discord
    Have something for me to read or react to?: / theprimeagenreact
    Kinesis Advantage 360: bit.ly/Prime-K...
    Get production ready SQLite with Turso: turso.tech/dee...

COMMENTS • 643

  • @NeilHaskins
    @NeilHaskins a month ago +361

    "A mutex is just a semaphore of length one."
    "A monad is just a monoid in the category of endofunctors."

    • @linkernick5379
      @linkernick5379 a month ago +11

      Both clauses are true ❤

    • @taatuu25
      @taatuu25 a month ago +55

      I'd just like to interject for a moment. What you're referring to as a binary semaphore, is in fact a binary semaphore with ownership, or as I've recently taken to calling it, a Mutex. A binary semaphore does not have unlocking restrictions by itself, but is rather another free attribute of a fully functioning mutex type made useful by its ownership semantics.
      Many programming languages implement a binary semaphore today, without enforcing the ownership. Through a peculiar turn of events, the most popular implementations today which are widely used are often called "Mutexes", and many of its users are not aware that it's basically unsafe. There really is a binary semaphore, and these people are using it, but it is just a part of the type they use.
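The ownership point made above can be sketched in safe Rust, where `Mutex::lock` returns a guard and the unlock happens only when that guard is dropped, so unlocking from code that never took the lock is not expressible (a minimal illustration, not code from the video):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn `n` threads that each increment a shared counter through a Mutex.
fn parallel_count(n: usize) -> i32 {
    let counter = Arc::new(Mutex::new(0));
    let handles: Vec<_> = (0..n)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                // lock() returns a guard that owns the lock; the unlock
                // happens when the guard is dropped at end of scope, unlike
                // a raw binary semaphore where any code may signal/release.
                let mut guard = counter.lock().unwrap();
                *guard += 1;
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

fn main() {
    println!("{}", parallel_count(4)); // prints 4
}
```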

    • @alexandrustefanmiron7723
      @alexandrustefanmiron7723 a month ago +1

      Welcome to basic 12th-grade algebraic structures! Wow, amazing! PS: joking, probably 1st- or 2nd-year university algebra. PS: I don't remember when lambda calculus came up.

    • @OnFireByte
      @OnFireByte a month ago +2

      Semaphore is just a mutex that has length

    • @Microphunktv-jb3kj
      @Microphunktv-jb3kj a month ago +3

      pretty sure C was considered a high-level language a few decades ago, at least
      what's next... JavaScript a low-level language in 2050?

  • @OutBoXeR007
    @OutBoXeR007 a month ago +362

    Came to this channel for programming, stayed for the moustache

    • @eotfofiw2748
      @eotfofiw2748 a month ago +11

      Came for the moustache... twice

    • @nyssc
      @nyssc a month ago +4

      Same. Also came on the mustache.

    • @SpocksBro
      @SpocksBro a month ago +1

      Technically it's a pornstache though, more specifically a 70's pornstache. Very nice

    • @bobby9568
      @bobby9568 a month ago +3

      does this guy even program?

    • @npc-drew
      @npc-drew a month ago +2

      Mustache ain’t real pal, it’s all a simulation to condition you for the AI overlord who will also have a mustache 😅😂🙂

  • @user-kn9lc9po1v
    @user-kn9lc9po1v a month ago +195

    5:00 It appears that Prime mistakenly interprets the phrase 'If it compiles, it works' to mean 'If it compiles, it works correctly.' However, no one using this phrase believes that programs written in Rust or Haskell are free of logic bugs. Instead, it means that a compiled program won't crash unexpectedly due to issues like mismatched types or use-after-free errors.

    • @j-p-d-e-v
      @j-p-d-e-v a month ago +16

      Agreed. When I'm writing Rust and it compiles successfully, the next thing I do is write more test cases to see if the application works as expected.

    • @petrusboniatus
      @petrusboniatus a month ago +6

      It's also a matter of speed; it's just faster to get these kinds of errors from the LSP than at runtime.

    • @user-kn9lc9po1v
      @user-kn9lc9po1v a month ago +4

      @@samuraijosh1595 Right, I was wrong; out-of-bounds checks happen at runtime.

    • @havenselph
      @havenselph a month ago

      Most things in Rust keep you from crashing; they very much beg you to handle panics @samuraijosh1595

    • @isodoubIet
      @isodoubIet a month ago +4

      "However, no one using this phrase believes that programs written in Rust or Haskell are free of logic bugs."
      Yes they are. Defend your bailey.

  • @LtdJorge
    @LtdJorge a month ago +118

    I think "If it compiles, it works" should be reworded to something like "if it compiles, it does what you described". If what you described is wrong, then the output will be wrong, but that's not really a bug, it's just wrong logic. Rust eliminates, by default, the ambiguity that unsafe code can carry. You could have described the correct algorithm, but have data races which do introduce bugs, for example.
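That distinction can be made concrete with a minimal Rust sketch (illustrative names, not from the video): the compiler rejects type and ownership mistakes, but a wrong formula that type-checks still compiles and runs:

```rust
// Intended: arithmetic mean. The bug (dividing by len + 1) type-checks,
// so "if it compiles" guarantees nothing about this kind of error.
fn buggy_mean(xs: &[f64]) -> f64 {
    xs.iter().sum::<f64>() / (xs.len() + 1) as f64
}

// The correct version has the exact same types and also compiles.
fn mean(xs: &[f64]) -> f64 {
    xs.iter().sum::<f64>() / xs.len() as f64
}

fn main() {
    let xs = [2.0, 4.0];
    assert_eq!(mean(&xs), 3.0);
    assert_eq!(buggy_mean(&xs), 2.0); // compiles, runs, wrong answer
    println!("ok");
}
```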

    • @bennyboiii1196
      @bennyboiii1196 a month ago +21

      I think it mostly gives people a false sense of security in rust, especially for devs that are new to systems programming. Also, it was butchered heavily. I’m pretty sure the original saying was “if it compiles, it runs” which has way different connotations.

    • @MechanicaMenace
      @MechanicaMenace a month ago +7

      @@bennyboiii1196 problem with that is it doesn't sound as revolutionary. Also... I know this is going to be a massive shock, but a lot of coders are pedantic AF contrarians and would purposely misunderstand "it runs" as "well, anything that compiles technically runs..."

    • @LtdJorge
      @LtdJorge a month ago +2

      @@bennyboiii1196 Yeah, true. "It runs" is better than "it works".

    • @TheSulross
      @TheSulross a month ago

      Real Rust programs are written using foreign code and libraries, which use lots of unsafe blocks as wrappers

    • @LtdJorge
      @LtdJorge 15 days ago +2

      @@TheSulross so? If those uphold the safety contracts, are correctly implemented, the things they depend on (OS, drivers, libraries, etc) are not bugged and Rust itself doesn't have bugs, then unsafe code is irrelevant.
      Safety in Rust means the code you write does what you told the compiler it should do, if you wrote it incorrectly the compiler cannot do anything, and if any of your dependencies doesn't work as it says, that's not a problem with your code.
      So if the dependencies you rely on are well tested, fuzzed, etc then there should be no problem. The thing with Rust is the low level FFI code should be put in unsafe libraries that are extremely well tested and verified, and also written by knowledgeable developers, then wrapped in safe interfaces and consumed by developers oblivious to the unsafety under it. It's the same as when people say that you can do fully safe C code with static + dynamic analysis tools (you actually can't 100%). Then do the same with the low level unsafe Rust lib making it "safe", letting you use fully safe Rust on top of it, instead of also having to use analysis tools in your higher level C code.

  • @seiakari
    @seiakari a month ago +93

    I read the title as "Why I Choose Rush over Zerg", and I thought that didn't make a lot of sense.

    • @LewisCowles
      @LewisCowles a month ago +3

      Well you're right, the zerg will inherit the galaxy dude.

    • @TunaCanGuzzler
      @TunaCanGuzzler a month ago +5

      To be fair, Terran rushes are doing pretty well these days. Reapers and hellions, baby!

    • @thunder____
      @thunder____ 15 days ago

      Well, Rush is a Terran player, so that tracks

  • @christopher8641
    @christopher8641 a month ago +68

    If it compiles, then all that is remaining is business logic bugs. I like that

    • @TheSulross
      @TheSulross a month ago +2

      And bad pointer dereferencing

    • @christopher8641
      @christopher8641 a month ago +3

      @@TheSulross in safe rust?

    • @TheSulross
      @TheSulross a month ago +2

      @@christopher8641 well, people should not pretend that real Rust programs are not going to be interfacing with highly useful (and very often necessary) libraries that aren’t written in Rust. And those integrations mean Rust unsafe blocks.

  • @DmitryVoytik
    @DmitryVoytik a month ago +109

    Prime, if your worst bugs are logic bugs, it simply means you write code in dynamic languages. The worst bugs in C are silent memory corruption and data races. Only C systems SW engineers know what heisenbugs are, not from reading memes, but from spending days trying to debug them.

    • @sortof3337
      @sortof3337 a month ago +38

      Amen. I do telco hardware, and web developers saying memory issues are not real is laughable. I also write Python for shims, and having logic bugs in Python is a skill issue, which I clearly have. But like you said, some of the worst bugs in C are memory errors and segfaults which literally occur once in a blue moon, or when lightning strikes near the town drunk within a range of 10 km. I've had to literally travel 1000 km to fix bugs which would never happen in Rust.

    • @complexity5545
      @complexity5545 a month ago

      Wow, I haven't heard "heisenbug" in a while. You test (in sections), then the bug goes away. You go live, the bug shows up. Good one. I'm going to have to teach this one to some of the contractors I occasionally hire (for audio stuff). We're having problems with GTK and some C++ stuff coming down to [audio frame] memory problems and crashes; I bet he can't find the problem because of his testing/debug/flags environment. "Heisenbug" is a good word for the documents.
      Thanks.

    • @hagaiak
      @hagaiak a month ago +4

      @jonkoenig2478 Honest question, is the person I'm replying to a bot? I wonder if there are bots trying to create random hate on the internet lately...? If you're human write the answer to "42 - 7".

    • @grimonce
      @grimonce 8 days ago

      That's only true because you don't write business logic in C... Having worked in both embedded and now financial domains, I'd say both have really weird problems, where the latter is mostly a headache because of really complicated business processes and legislative requirements... Both take a toll on your sanity; the stupid comparison between the two, trying to prove that your problems are the real ones and the others are children playing in the dark, is a sanity issue in itself.

  • @freeideas
    @freeideas a month ago +60

    I object, your honor, to calling the debugging of tricky memory issues in C/C++ code a "skill issue". The best C/C++ programmers in the world have this problem. So, unless "not having god-like omniscience" is considered a "skill issue", this is not a skill issue. It is a language issue. I'm not saying it is a language issue that isn't worth the extra work. Not saying we should all stop with C/C++, just saying that it isn't exactly a skill issue.

    • @GrizikYugno-ku2zs
      @GrizikYugno-ku2zs a month ago +24

      Sounds like you have a skill issue.

    • @isodoubIet
      @isodoubIet a month ago +10

      "I don't have a skill issue"
      - Person who described the language as "C/C++"

    • @freeideas
      @freeideas a month ago +9

      Actually, I DO have a skill issue with those languages. I'm just saying that people who are wizards with those languages still struggle with memory issues.

    • @isodoubIet
      @isodoubIet a month ago +6

      @@freeideas This is why it's a problem that you conflated C and C++.
      In C, yes, everyone will struggle with memory issues. C++? Not so much. The language provides tools for managing memory that make it so you hardly ever have to think about it.

    • @rusi6219
      @rusi6219 a month ago +5

      The language told you that you are responsible for the memory, so how is it a language issue if YOU mismanage YOUR memory? You Rust cultists have a serious problem with taking responsibility for your failings.

  • @brod515
    @brod515 a month ago +74

    @3:00 "when you have a black uncle you have to do something"...
    as a black person... FACTS!

    • @LtdJorge
      @LtdJorge a month ago +8

      Lmfao

    • @tirushone6446
      @tirushone6446 a month ago +3

      it's yet another case of PrimeTime saying stuff that sounds out of context even in context

    • @TheSulross
      @TheSulross 29 days ago

      Came for the CS geeking - stayed for the pithy sociological observations

  • @Hardware-pm6uf
    @Hardware-pm6uf a month ago +194

    Zig vs rust > Vim vs emacs

    • @Hardware-pm6uf
      @Hardware-pm6uf a month ago +75

      Go is the vscode of programming languages

    • @ivymuncher
      @ivymuncher a month ago +5

      @@Hardware-pm6uf which language is Helix?

    • @notuxnobux
      @notuxnobux a month ago +5

      @@Hardware-pm6uf it just works?

    • @Hardware-pm6uf
      @Hardware-pm6uf a month ago +1

      @@notuxnobux yup

    • @Hardware-pm6uf
      @Hardware-pm6uf a month ago +3

      @@ivymuncher I think OCaml, though I don't know a good fit for Helix

  • @stysner4580
    @stysner4580 a month ago +86

    I really don't get the point of hating on "if it compiles it works"... It eliminates a class of bugs completely; logic bugs are there in any language.

    • @RogerValor
      @RogerValor a month ago +19

      yeah, I had the same thought; complaining about language-independent logic bugs makes no sense, since they will happen in any language

    • @MarcelRiegler
      @MarcelRiegler a month ago +22

      It's like claiming that type systems are pointless because they can't catch logic bugs either. Ideally the computer should take care of as much menial work as possible, and both a type system and the borrow checker do that.

    • @skulver
      @skulver a month ago +2

      They never claim if it compiles it's good, just that it works, which for this purpose means it won't randomly crash because of a preventable error.

    • @UnidimensionalPropheticCatgirl
      @UnidimensionalPropheticCatgirl a month ago +2

      @@skulver but that's genuinely not true in Rust; there are whole classes of bugs which Rust cannot statically check, out-of-bounds errors for example, so it can still crash.

    • @daphenomenalz4100
      @daphenomenalz4100 a month ago +1

      @@UnidimensionalPropheticCatgirl it still happens less often, whereas in C you must cover various things beforehand to avoid them

  • @jmw1515
    @jmw1515 a month ago +20

    I just love Rust, that’s not going to change

  • @julkiewicz
    @julkiewicz a month ago +12

    Dude, it's not true that ownership is not a biggie. It is a biggie. Especially in a kernel environment where you have to worry about concurrency and reentrancy (you could call user code and then be called again from that user code). The hardest bugs to find are, by far, those to do with memory safety, because ultimately you're not writing a closed system where you have control over everything. You're writing an open system where most of the running code is user code or kernel module code. It's incredibly hard to reproduce issues. It's much, much easier to fix invalid-logic bugs than memory-management bugs in such an environment.

  • @callumbirks
    @callumbirks a month ago +51

    Re: "If it compiles, it works":
    As someone who comes from C/C++ like Pekka, this definitely *feels* partially true in Rust. Yes, logic bugs are still a problem, but in my experience most of the bugs I have to deal with in C++ are memory errors (working with low-level systems programming). Almost all of these are compiler errors in Rust.

    • @isodoubIet
      @isodoubIet a month ago +4

      If most errors you see in C++ are memory errors, you're using the language incorrectly. If you're tracking memory allocation correctly (i.e. using dedicated memory-management types such as std::vector and std::unique_ptr instead of having each of your classes carry that responsibility in addition to their business role, and tracking pointers with types such as std::span and std::string_view, which also carry size information) and just being minimally thoughtful about ownership (which you also have to do in Rust), you should hardly ever see any problems.

    • @CGMossa
      @CGMossa a month ago +4

      That's a lot of boxes as a necessity... At least in Rust, you start without boxes.

    • @isodoubIet
      @isodoubIet a month ago +1

      @@CGMossa It's not really onerous to say "your class should have a vector of things instead of a pointer to an array of things". What I described is literally the easiest way to set things up. People go out of their way to make C++ hard.

    • @p37ert
      @p37ert a month ago +4

      @@isodoubIet Rust is still safer than C++ even with std::vector and std::unique_ptr, because references into either can be invalidated if the memory is released or the std::vector is resized. Almost all data structures in the C++ standard library have reference- or iterator-invalidation caveats that can typically only be detected at runtime with address sanitizers, and detection can still be flaky. This can't happen in Rust due to lifetimes enforced at compile time.

    • @iverbrnstad791
      @iverbrnstad791 a month ago +4

      Yeah, logic problems are much easier to exclude too. I do C++ at work, and still find that refactoring Haskell I wrote last year is easier than C++ I wrote yesterday. The ability to reduce the problem space with enum arguments, and then have the compiler enforce completeness makes it much easier to write code I can reason about, and easier to keep track of all possible paths through my code. This seems to be the case for Rust as well.

  • @DrGeoxion
    @DrGeoxion a month ago +19

    Async Rust doesn't use green threads though. It's agnostic.
    Sure, Tokio's runtime for async uses green-ish threads, but that's not the only runtime!
    You can use async on embedded devices without allocators and without OS threads too.
    Async Rust really is just polling objects and having wakers that can notify the executor to poll again. Anything more than that is runtime specific and is different for each runtime you may use

    • @hanifarroisimukhlis5989
      @hanifarroisimukhlis5989 a month ago +6

      Async Rust is even more powerful. With a bit of a hacky workaround, you can implement generators from it. I think the generators proposal was sort of based on async. Async iterators: boom, easy and included.
      I once tried to handle a state machine, and I ended up using async + other stuff. It's surprisingly usable and easy for the end user.

    • @RiwenX
      @RiwenX a month ago +4

      Yeah, I use async Rust on embedded devices, and it's awesome. It all depends on the executor/runtime.

    • @DrGeoxion
      @DrGeoxion a month ago

      @jonkoenig2478 hmmm no it's not. Almost all production async Rust code I've written is not running on Tokio

    • @RiwenX
      @RiwenX a month ago +1

      @jonkoenig2478 dude said ===, js fanboy detected 😭😭

    • @hagaiak
      @hagaiak a month ago

      Tokio uses regular OS threads, no?
      By default a pool of `cpu_core_count` OS threads.

  • @lawrencejob
    @lawrencejob a month ago +7

    To distill the author's point, they point out that Zig doesn't have a dedicated (meta) language for types because Zig is used for both metaprogramming and programming. The argument for making the type language the same as the main language is that you don't have to learn two languages and the interplay is good. The argument against is that a meta language might have different requirements and benefit from being a different language (usually declarative, with different primitives like, say, Type).

  • @zaper2904
    @zaper2904 a month ago +109

    NGL, "I prefer the pre-processor over Zig comptime" is such an insane take I genuinely feel like it invalidates the entire article.

    • @purewantfun
      @purewantfun a month ago +16

      right, it just reeks of skill issues

    • @sortof3337
      @sortof3337 a month ago +20

      @@purewantfun yes, skill issue of you guys. The C pre-processor literally powers almost the entirety of the internet and telecom.

    • @diadetediotedio6918
      @diadetediotedio6918 a month ago +26

      No, I pretty much agree with him. He did not say "I prefer the pre-processor over Zig comptime"; he said "I prefer the pre-processor over Zig comptime IF we can't have generics", which is another thing. Comptime in Zig feels more like a workaround for the fact that type systems are a specific part of a language than anything concrete in itself.

    • @christopher8641
      @christopher8641 a month ago +14

      @@purewantfun I don't think the skill issue applies to someone who contributes to the Linux kernel. Maybe you have the skill issue

    • @sentient_carbon
      @sentient_carbon a month ago +7

      @@diadetediotedio6918 right. It's really cool writing "argument: anytype" and knowing very lean code will be generated at comptime and the compiler will scream at me if it's invalid, but when I'm reading a function trying to understand how to use it and what it expects of me, and I read "argument: anytype", all I can think is "well, shit"

  • @LusidDreaming
    @LusidDreaming a month ago +6

    If you build any heavily concurrent system, I think ownership actually does cause a lot of bugs. I worked on an engineering simulation software that did a ton of concurrent calculations, some of which were interdependent, and the two biggest classes of bugs I dealt with were serialization issues (more related to networking) and data races.
    Even currently, working in the cloud, I deal with more race conditions across services than logic bugs. I feel like the majority of logic bugs were found early on in testing because they are typically deterministic and thus easily reproducible. But with race conditions, there have been a handful where they were seen but then couldn't be reproduced, and the QA engineer would end up getting gaslit into thinking they must've just not executed the test correctly.

  • @Bolpat
    @Bolpat a month ago +3

    The difference between Zig's comptime and C++ templates and constexpr is that Zig's stuff is unified. In C++, you can call constexpr functions at compile time to produce compile-time constants, e.g. for array sizes or template value arguments. Template stuff is declarative; constexpr function execution is imperative. That makes sense given where C++ is coming from, but modern language designers realized that the clear separation of types and values isn't useful at compile time.
    In Zig, types are values at compile time. A function can have parameters that contain types as values and can return types as values at compile time. In C++ lingo, that would mean that not only can you write a consteval function that takes in something and returns a number (for an array bound), but you can also write a function that returns a type to be used as the type of a run-time value. Essentially, type_info at compile time isn't meaningfully different from a type, so why have them be different things?
    In C++, there are function templates, class templates, value templates, and alias templates. In D, there are just templates that happen to produce functions, classes, values, or aliases (which is a lot more streamlined), and in Zig, there are no templates, but functions that shove types around as if they were values. Sorting types at compile time? No harder than sorting values at compile time, which is no harder than sorting values at run time. Why copy values and alias types? If a type is a value (which it is at compile time), an alias of a type is exactly a (const) copy of a type.
    So in Zig, things that were separated for no good reason (in hindsight) aren't separate. The only difference between types and other values is that types don't exist at run time, which means that if you'd end up having to run a function that manipulates types at run time (e.g. because of a run-time value argument), the compiler will complain.

  • @ralph_d_youtuber8298
    @ralph_d_youtuber8298 a month ago +13

    The mentality of doing things "the Rust way" is real. Even when I go back to writing Java for school work, I now prefer using interfaces and enums over inheritance. Additionally, I make sure that all nullable objects are wrapped in an Optional.
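Those two habits translate directly back into Rust; a small sketch with illustrative names, showing an enum with exhaustive matching in place of an inheritance hierarchy, and Option in place of null:

```rust
// Modeling with an enum instead of an inheritance hierarchy: every
// variant is known to the compiler, and matches must be exhaustive.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

// Option instead of null: absence must be handled explicitly.
fn find_rect(shapes: &[Shape]) -> Option<&Shape> {
    shapes.iter().find(|s| matches!(s, Shape::Rect { .. }))
}

fn main() {
    let shapes = vec![
        Shape::Circle { radius: 1.0 },
        Shape::Rect { w: 2.0, h: 3.0 },
    ];
    assert_eq!(area(&shapes[1]), 6.0);
    assert!(find_rect(&shapes).is_some());
    println!("ok");
}
```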

    • @rusi6219
      @rusi6219 a month ago

      Procedural people have been telling you for decades that full-on OOP is a bad idea, so why does Rust get all the credit? Just because they have a toxic PR policy?

    • @omg33ky
      @omg33ky a month ago

      Better to show than tell, and Rust did that well @@rusi6219

    • @protowalker
      @protowalker a month ago +6

      @@rusi6219 why are you so mad?

    • @RustIsWinning
      @RustIsWinning 4 days ago

      @@rusi6219 🤡🤡🤡

  • @mkvalor
    @mkvalor a month ago +6

    Rust async does NOT use green threads. The original pre-1.0 Rust did use/expose them. Both Tokio and async-std use OS thread pools. It is true that these pools are of the M:N variety, but that is a separate concept from green threads, which typically facilitate cooperative multitasking.

    • @stevenhe3462
      @stevenhe3462 a month ago

      Yeah. Work-stealing event loops that are multi-threaded by default.

  • @remrevo3944
    @remrevo3944 a month ago +3

    How Rust async executors work is actually not /that/ complicated.
    Basically you have a queue of top-level futures, created using things like `tokio::spawn()`, that get polled when they are woken.
    The fact that this executor executes the queue multi-threaded is mostly just an implementation detail.
    And futures are nested state machines that somewhere down the line depend on a "Resource" that `.wake`s the associated Waker when it is ready, which signals the executor to re-poll the associated top-level future.
    (One example of a resource is AsyncFd, which uses something like epoll to signal when it is ready.)
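That poll-and-wake loop can be condensed into a toy single-future executor using only the standard library. This is an illustrative sketch of the mechanism, not how Tokio is actually implemented:

```rust
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};

// A future that is Pending on its first poll; it immediately invokes its
// waker, which tells the executor to poll it again.
struct YieldOnce {
    yielded: bool,
}

impl Future for YieldOnce {
    type Output = u32;
    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<u32> {
        if self.yielded {
            Poll::Ready(42)
        } else {
            self.yielded = true;
            cx.waker().wake_by_ref(); // the "resource" is instantly ready again
            Poll::Pending
        }
    }
}

// A one-future executor: park the thread until the waker unparks it.
struct ThreadWaker(std::thread::Thread);
impl Wake for ThreadWaker {
    fn wake(self: Arc<Self>) {
        self.0.unpark();
    }
}

fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = Box::pin(fut);
    let waker = Waker::from(Arc::new(ThreadWaker(std::thread::current())));
    let mut cx = Context::from_waker(&waker);
    loop {
        match fut.as_mut().poll(&mut cx) {
            Poll::Ready(v) => return v,
            Poll::Pending => std::thread::park(), // sleep until woken
        }
    }
}

fn main() {
    let v = block_on(YieldOnce { yielded: false });
    println!("{v}"); // prints 42
}
```

Multi-threaded, work-stealing executors replace the single parked thread with a queue of such top-level futures distributed across worker threads, which is the implementation detail the comment refers to.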

  • @7etsuo.c
    @7etsuo.c a month ago +10

    They forced us to make red-black trees by hand in our data structures course.

  • @Kiyuja
    @Kiyuja a month ago +42

    Rust in the title is an immediate neuron activation for me

    • @agnesakne4409
      @agnesakne4409 a month ago +1

      this comment is good

    • @Kiyuja
      @Kiyuja a month ago

      @@agnesakne4409 thx buddy, I gave it my all and tried to be honest

    • @Mglunafh
      @Mglunafh a month ago

      A potential salt mine / spice warehouse 🧠⚡ so yeah, exactly that

  • @EhdrianEh
    @EhdrianEh a month ago +5

    Never used Zig, but I agree that this is_duck example seemed pretty hacky. You accomplish the same thing in Rust in your type declarations, without having to write type checks in the body of your function. I understand why this would be used in a pythonic language though (Mojo): you have things like "isinstance" in the middle of your code, and it follows the same pattern.

  • @bluelinden
    @bluelinden a month ago +12

    "if it compiles it works" is less about "bug-free" binaries and more about completely getting rid of undefined behavior. as a former TypeScript dev, Rust is a godsend because it's so much easier to debug than TS.

    • @justsomeguy8385
      @justsomeguy8385 a month ago

      What undefined behavior are you encountering in TS?

    • @Gridfen
      @Gridfen a month ago

      @@justsomeguy8385 Do you know whether JSON.parse throws? How did you discover that information?

    • @Luxalpa
      @Luxalpa a month ago

      @@justsomeguy8385 JSON.parse() comes to mind

  • @masu33
    @masu33 a month ago +3

    Red-black: think of it like this:
    The coloring is there to show you how long a branch is compared to log n. When a branch gets too long, the coloring (and the rotations) are there to balance it, so updating the colors is a chore that makes the rotations possible and keeps the tree balanced within limits.
    It's just like what you do with a B-tree: while there you overpopulate the theoretical, perfectly log n tree horizontally a bit, on a red-black tree you overpopulate vertically; the color limit of the RB tree is like the node-size limit of the B-tree.
    At least this is how I used to teach it at the university...
    Hope this helps someone randomly finding it. :D

  • @BosonCollider
    @BosonCollider a month ago +4

    The main complaint I have about Zig is that in industry it is kind of difficult to push over just using a linter-enforced subset of C++ or Swift. It is an incremental improvement over C, but it doesn't provide a fundamentally new capability that you can't get from any of the million other C replacements that have popped up over the years (Ada, D or Nim without GC, V, Odin, Pascal, Delphi, Fortran, C2 and C3, HolyC, and the list just goes on and on; also, HolyC is cooler than Zig).
    Rust, on the other hand, genuinely does provide a new capability (a non-GCed language with memory safety). Yes, it can be painful to write, and yes, I wish it were smaller and closer to what Graydon originally wanted, and yes, you should probably just use threads instead of async/await, but at least I can immediately explain what the point of it is, and parallel (not concurrent) code with Rayon is better than pretty much any alternative I've seen.
    If you want a better C that is simple like Go, I like C3. It's a sane "fixed C" that does not try to do everything like C++; it just removes the obviously broken C footguns by making things defined, and adds defer, interfaces, slices, Odin-like error handling, and Go-like build tooling, while staying fully ABI compatible. If you want a C that is a bit more like Go, that's what I would suggest.

  • @mayur9876
    @mayur9876 a month ago +36

    I tried giving Zig so many chances, but it never clicks. I keep drifting back to C; maybe it's just wired into my brain. Rust feels like a breath of fresh air, but its colourful, toxic community keeps me away.

    • @LtdJorge
      @LtdJorge a month ago +13

      I'd recommend ignoring the Rust community. When I started reading top crates' code and doing syscalls manually (not really syscalls, but direct libc calls) is when I started understanding Rust deeply.

    • @mayur9876
      @mayur9876 a month ago +7

      @@samuraijosh1595 mojo ... cough cough

    • @pietraderdetective8953
      @pietraderdetective8953 a month ago +3

      ​@@samuraijosh1595 you're describing Mojo!

    • @gideonunger7284
      @gideonunger7284 a month ago +8

      The Rust community sucks, but your use of "colourful" just sounds like you are a bigot instead of having any substantive criticism.

    • @CGMossa
      @CGMossa a month ago +20

      @@gideonunger7284 your response made his comment seem sensible. 😮

  • @nateb1804
    @nateb1804 a month ago +9

    Most of my career was also C in the kernel (the Windows kernel in this case). You get really good at C after years and rarely write bugs of the security-vulnerability nature. There are linters and safe string libraries and the like to make things safer. The book "Writing Solid Code" set me on the right path early on.

    • @corvoworldbuilding
      @corvoworldbuilding a month ago

      Thanks for the book recommendation. It looks great

    • @tychoides
      @tychoides a month ago +2

      As you said, you get good with C after years. I am doing C for numerical stuff and I still mess up indexes and crash my programs. Also, C programmers are elite programmers, better than most. So if you can write C right, you're special, not average.

    • @RustIsWinning
      @RustIsWinning 5 days ago

      Thank you for the recent IPv6 CVE! God damn even Windows kernel devs are too blind to see that they cannot write safe C.

  • @F_Around_and_find_out
    @F_Around_and_find_out a month ago +24

    My learning strat right now is: study C, use the C API to work with SQL, and learn both at the same time. Git gud with C. Continue grinding on C till Zig finally reaches 1.0.

    • @mysterry2000
      @mysterry2000 a month ago +4

      I'd normally suggest Rust.
      But if C is your way to go, I'd suggest contributing to Open Source for the sake of learning.
      Always worth everyone's time, yours included 😁

    • @monsterhunter445
      @monsterhunter445 1 month ago +1

      C is great, but your jobs are gonna be embedded; might as well learn C++.

    • @pietraderdetective8953
      @pietraderdetective8953 1 month ago

      You can do both C and Zig in tandem. That's what I'm doing currently.
      Yeah Zig is not 1.0 yet but it's perfectly usable.

    • @lmnts556
      @lmnts556 1 month ago +1

      I hope they improve Zig's syntax by then.

    • @LinkEX
      @LinkEX 1 month ago

      @@mysterry2000 Any more details on getting started with Rust?
      Any good ways of finding Open Source projects with easy tasks to get familiar with a Rust project, for instance?
      As for books, I've recently heard a podcast episode that recommended Hands-On Rust.

  • @Bolpat
    @Bolpat 1 month ago +1

    13:40 This is why I fell in love with the D language ~10 years ago. It allows you to approach problems the way they come up in your head. There is no special D way.

  • @remrevo3944
    @remrevo3944 1 month ago +2

    31:25 There is a great blog article the Tokio team made about their executor, called "Making the Tokio scheduler 10x faster".

  • @yokebabjr3866
    @yokebabjr3866 1 month ago +2

    Hey, I just want to add some precision.
    At 6:14, when you briefly explain the basics of Rust, what you're talking about is RAII (Resource Acquisition Is Initialization), which is a concept from C++ that Rust adopted. To describe what makes Rust unique, I would have talked about the borrow checker, which ensures mutation XOR aliasing at compile time. Others have since taken this concept, like Mojo, but Rust was the one that introduced it.
    At 9:29, when you describe the way Arc works, you wrongly place the counter on the stack; it should be on the heap, next to the value inside the Arc. Indeed, if it were on the stack, each clone would have its own copy of the counter, which would defeat the whole point of reference counting.
    Regarding the green threading thing, I agree that the terminology is confusing. For anyone who wants to look it up: Go has what's called "stackful coroutines" and Rust has "stackless coroutines".
    I don't want to blame or anything, I know it's hard to popularize tech content; please keep up the good work.

  • @decathorpe
    @decathorpe 1 month ago +8

    First, cloning in Rust has *nothing* to do with whether a value is on the stack or on the heap. A clone is always about getting a second owned copy of a value, whether on the stack or not.
    Second, I am not surprised that he doesn't understand how async/await works in Rust if he doesn't understand what "green threads" (stackful coroutines) means and how what Rust does (stackless coroutines / state machines) is different.
    Third, if you like comptime so much compared to generics in Rust, you can do very similar things with "macros by example" in Rust (generating copies of functions for different types, etc.). You *can* do that, but it's just objectively worse than using proper generics for that use case 🤷🏻‍♂️
    This kind of misinformation about Rust (whether intentional or not) is making me start to distrust any of his takes. Like, if you don't understand it, just don't say anything about it?

  • @t1nytim
    @t1nytim 1 month ago +3

    This is not having watched the video, but I've been using Rust as my primary language for about 18 months now and loved it, primarily the strictness of the language, like the borrow checker and all that. And I wanna learn Zig, but I'm hesitant, so I keep telling myself I'll pick it up when it reaches 1.0 and see how I go. It's probably also worth admitting I'd only used Python for over a decade prior to giving Rust a go and enjoying it, so it's just a lack of exposure thing. Though I have been using Haskell for the last 3 weeks as something new to learn, so I won't be ready to pick a new language for at least a year anyway.

  • @swojnowski453
    @swojnowski453 1 month ago +4

    There are, and only ever have been, two languages worth learning. One is C, the other PHP. You write them once every 20 years, and when you come back later they are still fine. Everything else is shit for those who like to start rewriting before they finish writing... Good luck, boys, with your new toys Zig and Rust ;).

  • @Nightwulf1269
    @Nightwulf1269 1 month ago +1

    Correction! Go doesn't usually "spawn threads" when calling a goroutine. The runtime spawns a bunch of threads when the program is started, and then Go distributes work chunks onto those existing threads (hence they are called lightweight threads), because you save the overhead of creating the thread context every time you spawn a goroutine. This is why they are that fast.

  • @matt-xq1xv
    @matt-xq1xv 1 month ago +33

    The point the article author made about C being amazing until you need a data structure that you don't want to roll on your own is so true. It almost makes you consider C++ until you realize how shit it is.

    • @EnDeRBeaT
      @EnDeRBeaT 1 month ago +11

      what's with the C++ hate on this channel, the language is decent despite being older than most of the viewers

    • @isodoubIet
      @isodoubIet 1 month ago +1

      Saying "I don't like C++ so I'll stick with C" is wild
      Like a completely bonkers take. The worst parts of C++ are those it has in common with C, my dude.

    • @hanifarroisimukhlis5989
      @hanifarroisimukhlis5989 1 month ago +4

      @EnDeRBeaT Decent? With types so long that "auto" was invented for them? And however many constructors and assignment overloads there are? Oh, and how many ways of allocating objects? Yeah, C++ beyond C++98 is a huge mess.
      That's kinda why I don't touch any C++11 features at all. Give me plain old malloc() please.

    • @isodoubIet
      @isodoubIet 1 month ago +5

      @@hanifarroisimukhlis5989 "The language is bad because it has type inference"
      Wild, wild, wild.

    • @EnDeRBeaT
      @EnDeRBeaT 1 month ago

      @hanifarroisimukhlis5989 I would take long types any time of the day over void*. The constructor complaints are valid; however, most of the time you just need a constructor and a destructor. Many ways of doing something? Choose one and stick to it. If you're given many tools, you don't need to use every single one.

  • @julkiewicz
    @julkiewicz 1 month ago +5

    Rust's borrow checker also guarantees no data races (though not all race conditions), so...

  • @zea_64
    @zea_64 1 month ago +2

    Prime, you should try writing an executor, it's actually not that hard to make a crappy executor! The harder part is reactors, which are, conceptually, a thing that holds info on async I/O tasks that the executor polls to check what I/O is ready. If your Future impl can check for itself if it's done, you can just busy wait instead. I have my own Future types that want to batch requests to something, so they just enqueue requests (or send if the buffer fills up) and then my executor sends the batch on idle.

  • @hagaiak
    @hagaiak 1 month ago +1

    For clone stack vs clone heap, you're right. That's a bad design decision in Rust's standard library: the `Clone` trait is used for both.
    They're currently working on introducing a new trait (maybe named `Claim`) to differentiate actual clones (like cloning a `Vec`'s heap data) from "cheap"/"invisible" clones such as increasing the ref-count of an `Rc` or `Arc`.
    This will also fix the problem of having to call `.clone()` on `Arc`s when sharing ownership with closures (lambdas). Today you have to sprinkle `.clone()`, but `.claim()` can be implicit (with an option to opt out and force it to be explicit for code sections or crates where atomic increments/decrements can harm performance).
    It's insane that this change is possible, thanks to editions in Rust! Pretty good foresight to design the language with editions so it can evolve like that.

  • @gruntaxeman3740
    @gruntaxeman3740 1 month ago +2

    I like C too, and I know how to write C in a way that never crashes or has bugs.
    It just needs a very strict development process and verification. C is actually good for making reliable software, because the tools are verified to work correctly, there are tools for verification, and there isn't much complexity.
    It won't work if it is written the same way junior web developers write JavaScript. It requires a very different mindset.

  • @headlibrarian1996
    @headlibrarian1996 1 month ago +3

    Rust's drop is essentially no different from a C++ destructor. Not exactly difficult to understand.

  • @pmcgee003
    @pmcgee003 1 month ago +1

    Green threads are basically coroutines with an executor coordinating them. Cooperative multitasking.

  • @alexeiboukirev8357
    @alexeiboukirev8357 1 month ago

    Green threads are cooperative in nature and tasks have to explicitly yield to let other tasks take turns. Go compiler inserts yields into the code at strategic points (typically when functions are called or returned from) behind the scenes. Hardware threads use interrupts to switch tasks and do not require explicit yields. Interrupts can happen at any time/point in the code.

  • @greasedweasel8087
    @greasedweasel8087 1 month ago +3

    10:32 As respectfully as possible (I’m not as experienced with Rust as you are, I don’t think) I would like to disagree. I believe Copy is the trait that clones the stack: indeed it is the trait that says “I have no information beyond what is stored in the stack, and I can leave scope without doing clean-up.” Clone, on the other hand, says “to copy me I need to do some nontrivial work that involves things other than the stack, like copying heap buffers.” I do grant that Arc doesn’t actually affect the heap when cloned and so you weren’t *wrong* by saying that it “clones the stack,” but my point is that when I’m reading Rust and I run into a call to Clone::clone, I read that as saying “I am doing some expensive data manipulation that affects more than just the stack.” Idk if that’s a good heuristic but I don’t think it has failed me yet.

    • @antoniong4380
      @antoniong4380 1 month ago

      Copy is just a hint to the user.
      Anything that impls Clone can impl Copy, but only if the owner of the type decides to do so.
      The Copy trait is just telling the compiler, "It's OK to implicitly clone this. If need be, clone it."
      This saves you from having to constantly do some_fn(my_var.clone()) to solve ownership problems. That's why all integer and floating-point types implement Copy: it's cheap to clone, and, sidetracking a bit, it makes more sense to copy because a reference to a u8 would take more bytes (8 bytes on a 64-bit target) than simply copying 1 byte.

    • @raymarch3576
      @raymarch3576 1 month ago +2

      no, his entire understanding of this is wrong. Almost every single sentence.
      Copy/Clone has nothing to do with stack/heap:
      //this is a `Copy` from heap to heap
      let mut a = Box::new(1);
      let b = Box::new(2);
      *a = *b;
      Also, his description of Arc cannot possibly be true, because if the refcount were on the stack, then how could the other owners of Arcs access that refcount?
      You said "arc doesn't actually affect the heap when cloned"; that is wrong.
      Arc stores the T as well as the refcount on the heap. That way the other Arc instances can modify the shared refcount, and any Arc handle can be dropped without deallocating the refcounter, regardless of whether the handle lives on the stack or the heap.
      It is painful to see so much misunderstanding presented so confidently in Prime's videos, where most people don't have the expertise to question it.
      Your intuition about Clone is good imo. I hope you don't let yourself get too sidetracked by prime content.

    • @greasedweasel8087
      @greasedweasel8087 1 month ago +1

      @@raymarch3576 Oh of course! As soon as you said that the count is stored on the heap it clicked for me! Apologies for also being confidently wrong about the Arc stack/heap, and thank you for your correction.

    • @hagaiak
      @hagaiak 1 month ago +1

      By the way, note that he is right about the problem with `Arc` and `Rc` implementing `Clone`.
      I think many experienced Rustaceans are currently in the process of thinking up a fix in a future Rust edition. Basically, we want something like a `Claim` trait (in addition to `Copy` and `Clone`) to differentiate "heavy" data clones (such as `Vec` clones) from logical ref-count increases. `Arc::claim()` would be used instead of `Arc::clone()`.
      This would allow tracing all actual heavy clones by grepping for `.clone()`, and would also make capturing `Arc`s in closures (a very common thing to do) much simpler and implicit. Simply do:
      let foo = Arc::new(...);
      // `foo.claim()` implicitly called because `foo` is still needed later in the function (still "lively").
      let bar = || { do_something(foo); };
      do_something_else(foo);
      For most projects, `claim()` can be implicit, as atomic increments/decrements are not a performance concern. For code sections and/or crates where such operations should not be implicit, there will be an opt-out so that `claim()`s must be explicit.
      But you're right that Prime went on to confidently explain things very wrong :p

    • @raymarch3576
      @raymarch3576 1 month ago

      @hagaiak I remember I heard of `Claim`. I didn't look deep into it, but to me it feels like they should call it `Assign` and make it the missing trait that corresponds to the "=" operator, unless there's some detail I missed.
      As someone who has wasted a lot of their life debugging C++, I tend to object to every proposal that adds implicit function calls, even though I totally understand where they're coming from.
      In the end, even if they add a `Claim`/`Assign` trait, the copy situation will still not be remotely as messy as that of C++, thankfully.

  • @nonefvnfvnjnjnjevjenjvonej3384
    @nonefvnfvnjnjnjevjenjvonej3384 1 month ago +4

    I think it's a bit of a trope that you can't understand anything about the borrow checker. I wonder how much this is true. Maybe if you are absolutely new to programming, or if you have only ever done React or something, but otherwise it isn't really that bad. You read the rules once and you get it.
    For me, getting into Rust has been a breeze.

  • @jaysistar2711
    @jaysistar2711 1 month ago +4

    Given that almost all languages use libraries written in C to do their work, I think C has more libraries than any other language. When people say "You have to write a lot of code if you use C," that just means the lack of a good package manager, plus the general laziness of people, causes people to roll their own libraries more often in C. However, just like in other languages, you can do anything in a single line of code: a function call. Also, due to inline functions, you pay less of a performance cost for the abstraction.

    • @hanifarroisimukhlis5989
      @hanifarroisimukhlis5989 1 month ago +4

      And how do you compile those libraries, each with its own build tooling, because C doesn't have any? It's getting slightly better, but I digress.

    • @Luxalpa
      @Luxalpa 1 month ago +2

      C isn't just missing a good package manager, it's also missing a good api layer. It doesn't have generics, it doesn't have that big of a standard library, and because of that, the C ecosystem doesn't have a whole lot of things that you can rely on.

    • @jaysistar2711
      @jaysistar2711 1 month ago

      @@hanifarroisimukhlis5989 Exactly. I've reimplemented many build systems in CMake, but it's proven to be more trouble than it's worth. Zig's build system and Cargo (Rust's package manager and build system) are far better, and easier to work with, as well.

    • @jaysistar2711
      @jaysistar2711 1 month ago

      @Luxalpa A huge standard library is a problem. Templates were originally made by using C preprocessor macros, so C does have them; I've used them, but they're ugly and hard to maintain.

    • @rusi6219
      @rusi6219 1 month ago

      @@Luxalpa C standard library is pretty bad but the fact that it's tiny is one aspect of it that's actually good

  • @krumbergify
    @krumbergify 1 month ago +7

    C developers overuse linked lists because they are simple to implement and get right. Unfortunately linked lists are really slow on modern hardware because walking a list involves constantly dereferencing pointers and that is very cache unfriendly.

    • @blarghblargh
      @blarghblargh 1 month ago +5

      depends on how you implement them, and what your program does with each node.
      if they're intrusive and allocated with a free list into a block of contiguous memory, they can be fast.
      if they are partitioned into large chunks and those chunks are linked, they can be fast.
      it's really only the trivial individually heap allocated linked list that is slow, when your loop is tight enough that the cache misses are dominating your performance. if a lot is done with that individual node, and local caches get a lot of use while handling an individual node, it can still cause negligible slow down.
      but yeah, if you're doing mostly CPU work and hardly pulling in any RAM besides the nodes contents, then an individually linked list can be extremely slow in comparison to a contiguous array.
      people often cause similar problems in other higher level languages when using associative arrays. if the hash table implementation is distributing access too randomly/broadly across its buckets, then you can get similar cache thrashing. sometimes just doing a linear search on a contiguous array can be much faster, even though its complexity is O(N) instead of O(1).

    • @rusi6219
      @rusi6219 1 month ago

      Ackshually C devs always use arrays unless a linked list is an objective net gain

  • @diadetediotedio6918
    @diadetediotedio6918 1 month ago +10

    I pretty much agree with the comptime take (which extends to the notion of types as values).
    * It makes it harder or almost impossible for the compiler to infer types.
    * It is only resolved during compilation, at instantiation, so you cannot expect reasonable hinting from the tooling.
    * It is based on the strange conception that "type systems should not use a different syntax or be in a different world", while simultaneously making this distinction implicitly and explicitly many times, because that is just how things work. It feels like a workaround for a problematic worldview more than a properly designed feature.

    • @TheFreshMakerHD
      @TheFreshMakerHD 1 month ago

      Not even. Comptime is useful for generics in zig or for dynamic programming problems

    • @hanifarroisimukhlis5989
      @hanifarroisimukhlis5989 1 month ago +2

      @TheFreshMakerHD But template-style metaprogramming is functional, while comptime is inherently imperative. With the functional style we can use things like type theory and immutability, which are way easier to reason about.

  • @mateusvmv
    @mateusvmv 1 month ago +1

    @ 3:00
    Well, you don't need a balanced tree if you have a radix tree.
    Want to find something? Is the bit zero? Go left; otherwise go right.
    Want to do a range query between left and right, (l, r)? You can use the most significant digit for indexing.
    Is the bit on the left equal to the bit on the right? Then go to its subtree with the same range.
    Are the bits different? If left is higher than right, then you're out of bounds; otherwise you can merge the ranges (l, inf) and (0, r) on both subtrees.
    Now, of course the logarithm of the number of items (complexity of a balanced tree) is always lower than the logarithm of the number of possible items (complexity of a trie), but you can't beat this simplicity with a balanced tree.

  • @Euphorya
    @Euphorya 1 month ago +1

    That comptime take was WILD! Preferring preprocessor macros over comptime seems insane to me.

    • @chriss3404
      @chriss3404 22 days ago

      I think so too for the most part, but Pekka really is onto something. Preprocessor macros are complex and more than a little terrifying at times, but with preprocessor macros, C folks have settled on a set of standard predefined macros and idioms for applying them.
      The simplicity is there at different levels.
      C preprocessor macros are horrible to build safely or implement, but easy to logically compose, because they aren't complicated: it's all textual source-code generation.
      Zig comptime is incredibly powerful and easy to understand and dig into, but much harder to logically compose, because it empowers you to do so much at compile time.
      A comptime function, unlike a preprocessor statement:
      - can be entirely bespoke, meaning that you're sometimes REQUIRED to dig into the implementation to understand what it really does or how to use it
      - is easy to write, so you're likely to encounter tons of them (the only restraint engineers have is their own laziness!)
      - introduces a ton of complexity if you are just trying to do interfaces
      I think that over time, Zig will standardize on some idioms for comptime functions, including common techniques and packages for implementing patterns like interfaces, and then these problems won't matter much.
      However, for the time being, when there isn't a book/definitive resource on when Zig programmers should zig or zag, whenever I see a new comptime function, I have to read it to know what on earth is going on, and that just makes me sad.

  • @tinrab
    @tinrab 1 month ago +15

    I wish the article talked more about features Rust has over Zig. Macros, especially proc macros, for example. I have written macros that save me tens of thousands of lines. It makes the language more complicated, but the cost is worth it in the long run.

    • @jonnyso1
      @jonnyso1 1 month ago +4

      I love using macros, I hate writing them.

    • @NoX-512
      @NoX-512 1 month ago +2

      That’s what comptime is for in Zig

    • @tinrab
      @tinrab 1 month ago +8

      @NoX-512 comptime is not nearly as powerful.

    • @towel9245
      @towel9245 1 month ago

      Can you elaborate on what you saved on writing? I'm curious.

    • @LtdJorge
      @LtdJorge 1 month ago

      @towel9245 I don't know if you're interested in or knowledgeable about Wayland, but take as an example the Smithay crates implementing the Wayland protocol definitions in Rust. Take wayland-protocols-wlr: except for types specific to that protocol, the entire thing is a call to a macro from wayland-scanner which reads the XML definitions and generates the Rust equivalents.

  • @JasonWood100
    @JasonWood100 1 month ago +1

    I have no idea how to program anything, yet I love watching this channel

  • @krumbergify
    @krumbergify 1 month ago +1

    Zig is actually more low-level than C, because you can design data structures with dedicated padding and specific sizes of enums in a way you could not do in portable C prior to C23.

  • @fcoder1
    @fcoder1 1 month ago +2

    >> How could he not understand comptime??
    >> Async is too magical! (explains async incorrectly)
    sorry if you find this toxic

    • @RustIsWinning
      @RustIsWinning 2 days ago

      Classic Andrew Date and toxic masculinity.

  • @sub-harmonik
    @sub-harmonik 1 month ago

    I've changed my mind on colored functions. The thing is, it is useful to 'annotate' functions that do have 'continuations', for many reasons, and obviously that has to be applied recursively to all callers.

  • @LucasGalfaso
    @LucasGalfaso 1 month ago

    A key point when comparing OS threads and green threads is that the former is a preemptive multitasking environment and the latter is a cooperative multitasking environment (besides the cost of running one or the other, of course).

  • @benheidemann3836
    @benheidemann3836 1 month ago +1

    Minor correction: async/await in Rust doesn't use green threads. Green threads use time slicing, while the concurrency model for async/await is based on cooperative multitasking. You can configure Tokio to be single-threaded, and if you block forever, nothing else will execute (unlike with green threads).

    • @maniacZesci
      @maniacZesci 1 month ago

      Rust does use "green threads". Green threads are basically scheduled tasks which are run by a scheduler using OS threads, and the scheduler reuses a fixed number of threads. Both Rust and Go in essence do that, and both use work-stealing to redistribute work.

    • @benheidemann3836
      @benheidemann3836 1 month ago

      @@maniacZesci yeah, I was using a narrower definition of green threads. The one you provided, though not ubiquitous, is essentially the same as the one used by Tokio. Given this definition, I mostly agree with you, but it’s worth noting that Rust does not implement a scheduler. Rust also does not implement work-stealing either. Schedulers can implement whichever scheduling algorithms they like. Tokio is the most common and does use work stealing by default. Also worth noting that given this definition of green threading, async/await in JS is also green-threaded, which is interesting as most people seem to agree JS doesn’t have threads.

    • @maniacZesci
      @maniacZesci 1 month ago +1

      @benheidemann3836 I'm not familiar with JS internals, but I believe JS uses an event loop which runs in a single thread, so I wouldn't claim that JS has green threads.
      Yes, Rust leaves the scheduler and runtime implementation to library authors, and work-stealing is not the only algorithm.
      What I meant is that in Rust you can have green threads too if you want (using libraries like Tokio), not that Rust comes with batteries included for that like Go.
      Rust used to have that prior to version 1.0, but they removed it because it comes with additional overhead, and for Rust, which aims to be a systems programming language too, that was unacceptable.

    • @benheidemann3836
      @benheidemann3836 1 month ago +1

      @@maniacZesci thanks for the clarification. I think we’re on the same page now 🙂
      The observation I was making with JS was that Tokio (for example) can be configured to execute on a single thread, which means it behaves functionally (nearly) the same as the JS event loop (I know there are some subtle differences). If we were to say that Tokio is green threads even if it’s configured to run on a single OS thread, then JS has green threads too in a sense.
      Having had this OS threads, green threads, async/await discussion a couple times, people seem to fall into two camps: those who require parallel execution (> 2 OS threads managed by the executor) and those who feel that concurrency is enough as long as there’s an executor. Interestingly, the definition used by Tokio on their docs seems to imply the latter, and therefore that JS has green threads.

    • @maniacZesci
      @maniacZesci 1 month ago

      @benheidemann3836 I haven't looked at the Tokio docs for some time, but I think Tokio uses a thread per core by default, so if the CPU has 8 cores, Tokio will spawn 8 threads, for example.
      Configuration for a single thread is reserved for those rare cases should someone need it for some reason; I really can't think of any reason for it tbh, but what do I know.
      It can be configured to spawn any number of threads, ignoring thread-per-core too.

  • @astral6749
    @astral6749 1 month ago +1

    Thank you for the brief overview about threads in this video.

  • @GoodVolition
    @GoodVolition 1 month ago +1

    I think you should also think about C and C++ in terms of stack and heap. Even in the kernel world kernels have heaps as well or at least those I'm most familiar with do. It's pretty much unavoidable.

  • @nordgaren2358
    @nordgaren2358 1 month ago +1

    I think the counter for Arc is on the heap, with the data.

  • @jenreiss3107
    @jenreiss3107 1 month ago +3

    Rust is a good language because it doesn't try to pretend that the world isn't complex. Being a Go programmer is like being a flat-earther: yeah, you can get by 90% of the time in your day-to-day, but you're putting a hard cap on the possible things you can do with the language. Rust doesn't hide that the world is actually round, and because of that it's maybe a little more difficult to wrap your head around, but then you can do stuff like landing on the moon.

    • @rusi6219
      @rusi6219 1 month ago

      Except that the earth is demonstrably flat and your argumentum ad populum doesn't change that fact

    • @jenreiss3107
      @jenreiss3107 1 month ago

      @@rusi6219 found the golang user

    • @RustIsWinning
      @RustIsWinning 3 days ago

      ​@@rusi6219you exposed yourself by thinking the earth is flat lmao

    • @rusi6219
      @rusi6219 3 days ago

      @@RustIsWinning you exposed yourself by thinking that Rust is winning.

    • @RustIsWinning
      @RustIsWinning 3 days ago

      @@rusi6219 Yea and? Oh my days I just realized that it's you again with that profile picture LMAO. No wonder you think that way...

  • @sundae6610
    @sundae6610 1 month ago +3

    Rust traits and enums are less painful than being forced to use inheritance.

    • @isodoubIet
      @isodoubIet 1 month ago +4

      Nobody's forcing anyone to use inheritance

  • @mudandstars
    @mudandstars 1 month ago

    We need a thorough video on how the Rust runtime works exactly

  • @CallousCoder
    @CallousCoder 1 month ago +2

    Indeed, the worst bugs are always in business logic. Often stacked logic where there are exceptions to the standard logic. Ugghhh… I've had few bugs that prevented code from compiling or crashed randomly, but plenty of off-by-ones and weird logic trees that seem to work but fail once every 100 runs.

  • @TheOnlyJura
    @TheOnlyJura 1 month ago +1

    A Mutex is not a semaphore of length one, and for sync stuff we have Rc; or maybe you should learn to use lifetimes properly.
    And for async stuff you don't necessarily need a Mutex if you don't need mutability.

  • @szeredaiakos
    @szeredaiakos 29 days ago

    I've played enough with mutexes to know that they are not simple or easy. But if tuned right, they are blazing fast.
    By tuning, I mean that the write of one thread happens while the other threads are processing, eliminating most of the wait-to-write time.

  • @varomix
    @varomix 1 month ago +3

    Seems like people are just writing clickbait articles for Prime to read

  • @FrankHarwald
    @FrankHarwald 1 month ago +1

    The fundamental problem with compile-time evaluation (Turing-complete evaluation of expressions at compile time, limited only by time) is the problem which arises from ANY Turing-complete system: it's entirely possible (but not necessary) for Turing-complete systems to fail at some place at some time in a way which you CANNOT predict, understand, or even analyze just from its components (not just not in a limited amount of time, but AT ALL). This very much follows from the halting problem (and, relatedly, Gödel's incompleteness theorem). Of course, almost every practical program is written in a language which is for all intents and purposes Turing-complete, meaning these issues will arise in any language anyway in some way, at least at runtime. So why would it be so much worse when a language like Zig also adopts the same concept at compile time? Well, it means that software is not just potentially impossible to get working well enough even when its parts are correct when you run it; no, it means that the compilation, debugging, build process, and hence software development itself is now also potentially impossible to get right EVEN when all of its parts are correct! That wouldn't be the case if Zig didn't have a Turing-complete compile time, because then if you compiled a program whose parts you knew all compiled correctly, the whole would always also compile correctly. That's a hidden danger: a very real but not well-known side effect of Turing-complete systems. The reason Turing-complete systems are used anyway is because one has to, e.g. because there is no other option to do some things: some programs either strictly require a system this complete or would be impractically hard to write down or execute otherwise.
    The problem I fear is that the problem Zig is trying to solve by putting a Turing-complete system in its compile time DOESN'T REQUIRE such a complete system to solve, meaning you get all the problems that a Turing-complete system brings, with questionable benefits and hidden dangers, when it could ultimately be solved differently without those additional dangers, even if admittedly sometimes in a more difficult way.

    • @hanifarroisimukhlis5989
      @hanifarroisimukhlis5989 Місяць тому +1

      Technically, any generics system strong enough for Peano arithmetic is automatically Turing complete. But functional-style templating has an inherent ease that discourages many non-halting conditions; imperative/comptime-style metaprogramming makes it very easy to get into non-halting.

  • @KeyT3ch
    @KeyT3ch Місяць тому +1

    zig's comptime is like C++'s templates, type traits and SFINAE, but on steroids. And then you add a bit of reflection on top of it. It's pretty fucking cool

    • @maleldil1
      @maleldil1 Місяць тому +1

      It also has many of the same problems: duck typing, weird type errors (stuff fails when it's used, so the type error can be far away from what really caused it), and severely limiting the ability of LSP.

    • @KeyT3ch
      @KeyT3ch Місяць тому +1

      @@maleldil1 true! Very true! And that's why Zig is still in 0.1x status; it's not stable

  • @chris3079
    @chris3079 Місяць тому +1

    Started learning C as part of OS dev over the last two months, and I'm loving C; I thought it would be scary. But I know I'm not good enough for it, so I need to use Rust.

  • @krumbergify
    @krumbergify Місяць тому +2

    comptime is terrible for api docs. It doesn’t tell you what the actual requirements are on the type you are supposed to pass in. Traits / interfaces are much more clear here.

    • @Presenter2
      @Presenter2 Місяць тому

      True, but some of the problems are shared. When you see a function that accepts something that implements trait A, you need to know what implements the trait to use the function. In that case, reading type bounds from a function's docs (which is what you would typically do in Zig to notify the user of a required parameter's type properties) is no worse. Fair point though 👍
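
For illustration, a minimal Rust sketch of the trait-bound point made above (the function name and bounds here are hypothetical, not from the video or article): the requirements on `T` live in the signature, so they appear directly in generated API docs.

```rust
use std::fmt::Debug;

// The bound `T: Debug + Clone` is part of the signature itself, so a
// reader (or rustdoc) sees exactly what the caller must provide.
fn log_twice<T: Debug + Clone>(value: T) -> (T, T) {
    println!("{:?}", value);
    (value.clone(), value)
}

fn main() {
    let (a, b) = log_twice(42);
    println!("{a} {b}");
}
```

In a comptime-style design, the equivalent requirements would only surface inside the function body, which is the documentation gap the comment describes.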

  • @nevokrien95
    @nevokrien95 Місяць тому +2

    I am kinda blown away by how much Prime is blown away by Zig's tooling
    Looks like valgrind... like, am I missing something?

  • @mghinto
    @mghinto Місяць тому +12

    Finally, a new video to reinforce my belief that Rust is the best lang

  • @alexpyattaev
    @alexpyattaev Місяць тому +10

    Comptime has one major problem - lack of bounds on inputs. So you can not make a channel that only accepts types that are safe to send across threads. Might sound like a small issue, but it is at the root of the "rust experience".

    • @Presenter2
      @Presenter2 Місяць тому

      You can build that system using comptime. Of course, it will not be as "elegant" as in Rust: you will need to write the bounds in the function body, and part of them will be checked through some kind of handmade "comptime interface", I guess. The problem is that none of those bounds will be visible in the function's signature, which is a downside compared to Rust's approach.

    • @alexpyattaev
      @alexpyattaev Місяць тому +1

      @@Presenter2 let us hope that one day we will see languages converge to something that is flexible like in zig but remains capable of strictly enforcing constraints like in rust.
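
A small Rust sketch of the `Send`-bound idea discussed in this thread (the `spawn_worker` wrapper is a hypothetical name for illustration): the bound in the signature is what makes it impossible to hand a non-thread-safe value to another thread.

```rust
use std::sync::mpsc;
use std::thread;

// The `Send + 'static` bounds mean only thread-safe values can cross
// into the worker; passing an `std::rc::Rc`, which is !Send, would be
// rejected at compile time.
fn spawn_worker<T, F>(f: F) -> thread::JoinHandle<T>
where
    T: Send + 'static,
    F: FnOnce() -> T + Send + 'static,
{
    thread::spawn(f)
}

fn main() {
    let (tx, rx) = mpsc::channel::<String>();
    spawn_worker(move || tx.send("hello".to_string()).unwrap())
        .join()
        .unwrap();
    assert_eq!(rx.recv().unwrap(), "hello");
}
```

This is the compile-time-enforced input bound that, per the comment above, comptime can only approximate with checks buried in the function body.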

  • @Aken0o
    @Aken0o Місяць тому +1

    That one lmao: "coolkids call it comptime, dummies call it structural macros, c++ degens call it template magic"

  • @freeideas
    @freeideas Місяць тому +1

    I believe anything done with async/await can be done with less complexity with plain old blocking threads. The price is more memory -- except with java virtual threads, which have the simplicity of old-fashioned threads and the memory cost of microthreads. Of course, if you are using an async/await infested library, there is not much you can do to simplify.
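
A minimal Rust sketch of the blocking-threads alternative this comment describes (the `fetch` function is hypothetical, with a sleep standing in for blocking I/O): each task gets its own OS thread and simply blocks, with the memory cost of one stack per thread.

```rust
use std::thread;
use std::time::Duration;

// Stand-in for a blocking I/O call that an async runtime would await.
fn fetch(id: u32) -> u32 {
    thread::sleep(Duration::from_millis(10)); // fake blocking I/O
    id * 2
}

fn main() {
    // Run the "requests" concurrently on plain OS threads, then join.
    let handles: Vec<_> = (0..4).map(|i| thread::spawn(move || fetch(i))).collect();
    let results: Vec<u32> = handles.into_iter().map(|h| h.join().unwrap()).collect();
    assert_eq!(results, vec![0, 2, 4, 6]);
    println!("{results:?}");
}
```

No executor, no `Future` state machines; the trade-off, as the comment notes, is memory per thread.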

  • @redcrafterlppa303
    @redcrafterlppa303 Місяць тому

    5:05 Using Rust's extensive type system, you can model most logic as types and variants. This way the idea works if you write "perfect Rust" without any boolean logic. Of course this is not feasible in reality, but it would work in theory.
    Moving as much logic to compile time as possible is the strategy I think will be the future of programming. Compilers nowadays understand the code we write better than we ourselves do. So with more interactivity between the compiler and the programmer, the compiler will be able to help even with logic bugs. Static asserts (C++) and strong types (Rust) are the first step on that journey toward compile-time-guaranteed code.

  • @npc-drew
    @npc-drew Місяць тому +4

    The power of Rust,
    The power of many.
    The power of Rust,
    The power of many.

    • @rusi6219
      @rusi6219 Місяць тому +3

      classical conditioning

  • @eldarshamukhamedov4521
    @eldarshamukhamedov4521 Місяць тому +1

    It's funny how similar this whole "switch to Zig from Rust" thing feels to the FE framework thrash everyone complains about. Imagine being an engineer trying to get into systems programming. A year ago, everyone couldn't shut up about Rust, let's do everything in Rust, etc. Now it's Zig, switch to Zig, it's so much better than Rust.

    • @crab-cake
      @crab-cake 7 днів тому

      i think it's because rust is actually being used in the industry now. it's no longer a theoretical idea. zig is the new cool thing around the block. i think zig came in a little too late though and it's unlikely that it will get widespread adoption, primarily because of how quickly rust has gained adoption and also because zig doesn't offer radical solutions to problems. it's not similar enough to c to be what kotlin is to java or what typescript is to javascript, and it's not radically different enough from c to be its own thing. it's in this weird middle ground with languages like odin, nim, etc.

  • @vlknckl
    @vlknckl Місяць тому +1

    Rust doesn't use green threads; using green threads requires a GC, which is how Go handles them.
    Zig and Rust will never use green threads because of C-compatibility issues (green threads have their own heap and stack).

  • @tiagocerqueira9459
    @tiagocerqueira9459 Місяць тому +1

    Jon Gjengset demystifies how async runtimes work in his video "Decrusting the tokio crate" and it's not that hard conceptually

  • @elvispalace
    @elvispalace Місяць тому +6

    bro, I'm just ok with rust, ok?

  • @shirshak6738
    @shirshak6738 Місяць тому +2

    I write java all day, but i feel so productive with Rust, because I don't feel fear :D

  • @Omnifarious0
    @Omnifarious0 Місяць тому

    C macros are text replacement. And a big implication of this is that they also have no real access to any information about the C program. For example, there would be no way to sensibly ask if there was a function `quack` that could be applied to an argument to the function.
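
To illustrate the contrast this comment draws, here is a hedged Rust sketch (the `Quack` trait and names are invented for the example): where a C macro is pure text substitution with no type information, a trait bound encodes exactly the question "does this type have a `quack`?", and the compiler answers it.

```rust
// A C macro cannot ask whether a `quack` function exists for its
// argument; this trait bound makes that requirement explicit and checked.
trait Quack {
    fn quack(&self) -> String;
}

struct Duck;

impl Quack for Duck {
    fn quack(&self) -> String {
        "quack".to_string()
    }
}

fn make_noise<T: Quack>(animal: &T) -> String {
    animal.quack()
}

fn main() {
    assert_eq!(make_noise(&Duck), "quack");
    println!("ok");
}
```

Calling `make_noise` with a type that lacks a `Quack` impl fails at the call site with a named missing trait, rather than with post-substitution text errors.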

  • @ayesaac
    @ayesaac 7 днів тому

    That description of becoming proficient with rust is why we can't have nice things. You almost never actually want to clone, and you usually don't actually want a mutex. You just do it because you're lazy.

  • @dniliveact
    @dniliveact Місяць тому

    Building on Rust the last few months and I love it

  • @w1sh832
    @w1sh832 Місяць тому

    Had to do thread pooling and build a worker queue in my OS class. Would recommend, 10/10, will make you smarter
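
In the spirit of that OS-class exercise, a minimal Rust worker-queue sketch (this is an illustrative design, not the commenter's actual assignment): N workers pull boxed jobs from a shared channel guarded by a mutex, and closing the sender shuts the pool down.

```rust
use std::sync::{mpsc, Arc, Mutex};
use std::thread;

// A job is any one-shot closure that can be sent to a worker thread.
type Job = Box<dyn FnOnce() + Send + 'static>;

fn main() {
    let (tx, rx) = mpsc::channel::<Job>();
    let rx = Arc::new(Mutex::new(rx));

    // Spawn four workers that loop until the channel is closed.
    let workers: Vec<_> = (0..4)
        .map(|_| {
            let rx = Arc::clone(&rx);
            thread::spawn(move || loop {
                // Lock only long enough to receive; the job runs unlocked.
                let job = rx.lock().unwrap().recv();
                match job {
                    Ok(job) => job(),
                    Err(_) => break, // queue closed, worker exits
                }
            })
        })
        .collect();

    // Submit eight jobs that report back on a results channel.
    let (done_tx, done_rx) = mpsc::channel();
    for i in 0..8 {
        let done_tx = done_tx.clone();
        tx.send(Box::new(move || done_tx.send(i).unwrap())).unwrap();
    }
    drop(tx); // close the queue so workers exit after draining it

    let sum: i32 = done_rx.iter().take(8).sum();
    assert_eq!(sum, 28); // 0 + 1 + ... + 7
    for w in workers {
        w.join().unwrap();
    }
    println!("sum = {sum}");
}
```

Dropping the sender is what turns `recv()` into an error for idle workers, which is the usual clean-shutdown trick for this pattern.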

  • @michaellatta
    @michaellatta Місяць тому

    We often call C a portable assembly language.

  • @transire3450
    @transire3450 Місяць тому +1

    Comptime only solves the problem of abstracting code over types; it does not let us establish a contract between caller and callee. That's why C++ added concepts: pure templates have the same problem.

    • @infastin3795
      @infastin3795 Місяць тому

      comptime allows it. It's just much more verbose compared to C++ concepts.

  • @maniacZesci
    @maniacZesci Місяць тому +1

    Maybe the man is not hype-train traveler, unlike some UA-camrs.

  • @thegeniusfool
    @thegeniusfool Місяць тому +14

    Zig is for people who kind of learned C and it looks geeky cool to them, but they still don’t want to feel dusty.
    Zig should only be used for old C teams. Definitely the real C++

  • @raddinox2707
    @raddinox2707 Місяць тому

    That ARC thing sounds exactly like the C++ object wrapper class I made back in the early 2000s to count references to an object and auto-delete it

    • @seneca983
      @seneca983 Місяць тому +1

      As a minor nitpick, Rust has both Rc and Arc. The difference is that in the case of Arc, incrementing and decrementing the reference count is *atomic* so it's also guaranteed to work with multiple threads (but might be a bit slower than Rc). I would guess that you didn't make your reference count atomic. Or maybe you did?
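
A short Rust sketch of the `Rc` vs `Arc` distinction in this reply (the example values are arbitrary): both count references, but only `Arc`'s atomic count lets clones move across threads.

```rust
use std::rc::Rc;
use std::sync::Arc;
use std::thread;

fn main() {
    // Rc: non-atomic refcount, cheaper, single-threaded only.
    let local = Rc::new(vec![1, 2, 3]);
    let local2 = Rc::clone(&local);
    assert_eq!(Rc::strong_count(&local), 2);
    assert_eq!(local2[0], 1);

    // Arc: atomic refcount, so a clone may be sent to another thread.
    let shared = Arc::new(vec![1, 2, 3]);
    let handle = {
        let shared = Arc::clone(&shared);
        thread::spawn(move || shared.iter().sum::<i32>())
    };
    assert_eq!(handle.join().unwrap(), 6);
    println!("ok");
}
```

Swapping `Arc` for `Rc` in the threaded half fails to compile because `Rc` is `!Send`, which is the "atomic so it works with multiple threads" point the reply makes.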

  • @avinashmalik9441
    @avinashmalik9441 Місяць тому +1

    The duck example is just CRTP in C++, very common in C++. As I thought, everyone is just copying C++!

  • @JoacimNitz
    @JoacimNitz Місяць тому

    Don't know much about programming, but I love your way of communicating your thoughts and how you explain things. 👌

  • @HalfMonty11
    @HalfMonty11 Місяць тому +3

    Is comptime cool? Yeah... but I have a hard time coming up with a genuinely good reason to use it that isn't going to add more complexity than some alternative. If you find yourself leaning on comptime, I think that's a bit of a code smell: re-evaluate what the hell you are doing

  • @OREYG
    @OREYG Місяць тому +3

    What lengths are you going to go to avoid writing C++? You have a horrible bias against C++. It is, and will be a better language that Zig, Rust and the likes. While you spend your time pointlessly relearning the same things for the third time, a real programmer (who writes in C++ btw), would do 10x more, and 10x better.

    • @rusi6219
      @rusi6219 Місяць тому

      The language is bloated and the devs are conceited

  • @RenderingUser
    @RenderingUser Місяць тому +2

    _mom said its my turn to be validated_