3 Levels of Sorting Algorithms - FASTEST Comparison Sort!

  • Published 11 Jan 2025

COMMENTS • 195

  • @iwktwom
    @iwktwom 4 роки тому +252

    Just started a computer science degree at 44, long way to go. Great video, thanks.

    • @kushnayak1619
      @kushnayak1619 4 роки тому +6

      Ok

    • @xaos9036
      @xaos9036 4 роки тому +32

      Never too late. Keep doing and don't give up!

    • @JannisAdmek
      @JannisAdmek 4 роки тому +5

      you can do it 🔥

    • @billgordon6954
      @billgordon6954 4 роки тому +26

      Lol. I’m 49 and a junior right now for CS degree. Old guys rule!

    • @roccov3614
      @roccov3614 4 роки тому +17

      I started at 50. In my second year right now.

  • @viacheslav7870
    @viacheslav7870 3 роки тому +187

    i didn't understand a single thing but still gave a like

  • @DanielSposito
    @DanielSposito 4 роки тому +109

    I've watched hours of algo videos during the past few weeks and this video is thorough yet significantly easier to understand than most others. Thank you!

  • @diamondsmasher
    @diamondsmasher Рік тому +13

    Based on the sheer volume of documentation on sorting algorithms, I honestly thought it would be a significant factor in my programming career.
    I doubt in the last 20 years I’ve ever sorted a single thing….
    But still fascinating.

  • @dcrey7
    @dcrey7 3 роки тому +30

    Lvl 1 - O(n^2): bubble, selection, insertion
    Lvl 2 - O(n log n): merge, quick, heap

  • @daniel.watching
    @daniel.watching Рік тому +13

    The main advantage of TimSort is that it works really well on partially sorted data. It's pretty rare to come across truly random data and when the data is already partially organised TimSort has the intelligence to skip over the already sorted sections while other algorithms will naively sort them.
    Also there should be an honourable mention to non-comparative sorts like bin and radix sorts. Radix, especially, can be incredibly fast when the sort keys can be expressed as integers with a known range. It's O(n) complexity, and while it does a lot of swaps, it has good memory locality, which helps with cache misses. (A radix sketch follows this thread.)

    • @justvladcreate
      @justvladcreate 8 місяців тому

      agree on radix sort and imo it is the best but as was shown in the vid it is worse with partially sorted data
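
A minimal Python sketch of the LSD radix sort mentioned in this thread, for non-negative integers; the function and variable names are illustrative, not from the video.

    def radix_sort(nums, base=10):
        """LSD radix sort for non-negative integers: O(d * (n + base)),
        where d is the number of digits. Each pass is a stable bucket pass."""
        if not nums:
            return nums
        max_val = max(nums)
        exp = 1
        while max_val // exp > 0:
            # Stable distribution by the current digit.
            buckets = [[] for _ in range(base)]
            for x in nums:
                buckets[(x // exp) % base].append(x)
            nums = [x for bucket in buckets for x in bucket]
            exp *= base
        return nums

    # radix_sort([170, 45, 75, 90, 802, 24, 2, 66]) -> [2, 24, 45, 66, 75, 90, 170, 802]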

  • @SimGunther
    @SimGunther 2 роки тому +8

    For those who can't see the text at the end graphs, Introsort took 1st followed closely by Timsort and Quicksort in 2nd/3rd place respectively

  • @AlyssaMarie-vr8cc
    @AlyssaMarie-vr8cc 2 роки тому +19

    Thank you for the explanation about insertion sort. There seems to be some conflicting information out there about whether it is actually quicker than bubble or selection sort. I think the important distinction is that it is in fact faster than those methods in the best-case scenario, and the same in the worst-case scenario; in between best and worst case, it is still potentially quicker than the other two.
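
To make that best-case point concrete, here is a generic insertion sort sketch (not the video's code): on already-sorted input the inner while loop never runs, so only about n comparisons happen, whereas selection sort always scans the whole unsorted section.

    def insertion_sort(a):
        """Worst case O(n^2), but roughly O(n) comparisons when the
        input is already (or nearly) sorted: the while loop exits
        immediately for elements that are already in place."""
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]   # shift larger elements right
                j -= 1
            a[j + 1] = key
        return a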

  • @venkat.sairam
    @venkat.sairam Рік тому

    🎯 Key Takeaways for quick navigation:
    00:00 🧭 *Introduction to Sorting Algorithms*
    - Sorting algorithms are crucial for various operations like search and database operations.
    00:16 📊 *Levels of Sorting Algorithms*
    - Sorting algorithms are categorized into three levels: n squared complexity, n log n complexity, and hybrid algorithms.
    00:31 🚀 *Level 1: n squared Sorting Algorithms*
    - Bubble sort, selection sort, and insertion sort are basic n squared sorting algorithms.
    - Bubble sort repeatedly compares adjacent elements and swaps if needed.
    - Selection sort divides the list into sorted and unsorted sections, selecting the smallest element.
    - Insertion sort iterates through the list and inserts elements into the sorted section.
    02:51 🔍 *Level 2: n log n Sorting Algorithms*
    - Merge sort, quicksort, and heapsort are n log n complexity sorting algorithms.
    - Merge sort divides the list into sublists and merges them to sort.
    - Quicksort picks a pivot, partitions the list, and recursively sorts sublists.
    - Heapsort maintains a heap structure in the unsorted section while extracting elements.
    06:53 💡 *Level 3: Hybrid Sorting Algorithms*
    - Hybrid algorithms like Tim Sort and Introsort combine features from multiple algorithms.
    - Tim Sort collects runs and merges them efficiently, optimized for nearly sorted data.
    - Introsort starts with quicksort, switches to heapsort when the recursion gets too deep, and uses insertion sort for small partitions.
    08:58 🔄 *Conclusion on Sorting Algorithms*
    - Sorting algorithms have various characteristics, including time complexity, stability, and space requirements.
    - The choice of sorting algorithm depends on the specific application and data characteristics.
    10:11 📚 *Closing Remarks*
    - The video summarizes the key points about sorting algorithms and encourages viewers to subscribe for more content.
    Made with HARPA AI
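
For reference, here are generic sketches of one level-1 and one level-2 algorithm from the summary above: bubble sort (O(n^2)) and merge sort (O(n log n)). These are textbook versions, not the code used for the video's animations.

    def bubble_sort(a):
        """Level 1: repeatedly compare adjacent elements and swap them
        if they are out of order. O(n^2) comparisons in the worst case."""
        n = len(a)
        for i in range(n - 1):
            swapped = False
            for j in range(n - 1 - i):
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
                    swapped = True
            if not swapped:          # no swaps means the list is sorted
                break
        return a

    def merge_sort(a):
        """Level 2: split the list in half, sort each half recursively,
        then merge the two sorted halves. O(n log n) time, O(n) extra space."""
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]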

  • @Axel_Andersen
    @Axel_Andersen Рік тому +11

    I think it would have made sense to mention that these comparisons here are, AFAIU, based on the idea of equal access times to all elements. This is seldom true nowadays with several levels of caches and virtual memory. In the old days this was even more of an issue when the data was on multiple tapes, where the seek times were enormous compared to anything else. I guess merge sort is then an important step in sorting, as you can sort a section in main memory, write it out to tape, then read one item at a time from several tape units, pick the largest one, and write it all out as you go to yet another tape unit. I'm not old enough to 'have been there, done that', but old enough to have seen the start/stop operation of a large number of tape units in movies, and I always suspected that was because of merge sorting. At least in many cases.

  • @aakashbashyal1822
    @aakashbashyal1822 4 роки тому +27

    After watching this video, I remembered my lecturer who taught us those sorting algorithms the same way as presented here, that is, just reading from the slides. In class I pretended to understand everything and listened carefully, but in the end I didn't understand anything.

  • @zaico09
    @zaico09 4 роки тому +49

    Radix sort?

    • @elliott8175
      @elliott8175 4 роки тому +42

      In fairness, the title said "fastest comparison sort".

    • @lordtejas9659
      @lordtejas9659 4 роки тому

      @@elliott8175 nice!

    • @arshakyessayan4087
      @arshakyessayan4087 4 роки тому +1

      And counting sort. They are the best variants of sorting.

    • @lordtejas9659
      @lordtejas9659 4 роки тому +1

      @@arshakyessayan4087 No. It doesn't work and it's not stable either; it fails practically for large amounts of data.

    • @lextr3110
      @lextr3110 4 роки тому

      @@lordtejas9659 radix sort with bucket is not stable? or with counting?

  • @dhananjaypanage2472
    @dhananjaypanage2472 4 роки тому +14

    Very underrated channel.
    Excellent content. Keep it up❤️❤️❤️

  • @static-san
    @static-san Рік тому +1

    I found Sedgewick's "Algorithms" to be great at going into lots more detail about sorts (and other things). That's where I also learnt that MergeSort and HeapSort came about because of using and building tree-oriented data structures.

  • @Inspirator_AG112
    @Inspirator_AG112 2 роки тому +2

    Distribution sorts can get to the minimum time complexity of O(n).
    *Examples:*
    Bucket sort
    Counting sort
    Flash sort
    Pigeonhole sort
    Radix sort

  • @davidjames1684
    @davidjames1684 3 роки тому +6

    It seems like someone could do analysis on which algorithm works best on what type of data (size, data type, % already sorted...), and just select the proper algorithm(s) to sort. For example, if it is known that algorithm A works best on almost sorted data and that condition is detected, then run algorithm A on it.

    • @heron3140
      @heron3140 3 роки тому +7

      Mh yes, that's what hybrid sort algorithms do

  • @CompilerStuck
    @CompilerStuck 2 роки тому +7

    After watching this, I was finally able to sort my life out

  • @revanslacey
    @revanslacey 3 роки тому +2

    Wow that was quick. I would have liked to have seen the finish of the race.

  • @kingsfan10000000
    @kingsfan10000000 2 роки тому

    You are way smarter than I am. You could have been speaking Latin. I didn’t understand a thing but you get the gold star from me. Dropping a like

  • @robbertwethmar5612
    @robbertwethmar5612 Рік тому

    nice overview and graphical representation! Thanks. Last time I thought about sorting algorithms was decades ago, I assumed current libraries would do something smart. Nice to hear what ;-)

  • @AbhishekChandraShukla
    @AbhishekChandraShukla 10 місяців тому

    This is Dope brother!

  • @JordanBeagle
    @JordanBeagle 2 роки тому

    Thank you, just the video I was looking for after the last one, thanks for the info

  • @GameCyborgCh
    @GameCyborgCh Рік тому

    Don't mind me just watching videos to pass the time until bogosort has sorted the list

  • @mrtn9026
    @mrtn9026 4 роки тому +4

    I created an algorithm that puts two numbers in their right places in one iteration.
    However, you need a whole copy of the array of numbers and a few other variables.
    So basically a loop runs through the copied array once and saves the indices of the biggest and smallest elements in variables so you can access them.
    Then this smallest element gets written to array[0] (first) and the biggest to array[size-1] (last).
    0 is represented by a variable and gets incremented after the next step.
    size-1 is also a variable and gets decremented after the next step.
    So you know where to put your next smallest and biggest values from the copied array.
    The inserted elements from the copied array are now overwritten with -1 so they are "hidden" for the next iteration, since our smallest value needs to be 0 or larger.
    And this gets repeated until the bigger index in the real array is smaller than the small index,
    because then every element is sorted.
    I'll post Java code in a second.
    Hope this doesn't exist yet.
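
A rough Python sketch of what this comment appears to describe: a double-ended selection sort that uses a working copy and writes -1 over elements that have already been placed. The names here are made up, and the -1 sentinel means it assumes non-negative inputs, as the comment says.

    def double_ended_selection_sort(arr):
        """Each pass finds both the smallest and the largest remaining
        value in the copy and writes them to the two ends of the result.
        Two elements are placed per pass, but it is still O(n^2) overall."""
        copy = list(arr)
        result = [None] * len(arr)
        lo, hi = 0, len(arr) - 1
        while lo <= hi:
            live = [i for i, x in enumerate(copy) if x != -1]  # not yet placed
            i_min = min(live, key=lambda i: copy[i])
            i_max = max(live, key=lambda i: copy[i])
            result[lo] = copy[i_min]
            if lo != hi:
                result[hi] = copy[i_max]
            copy[i_min] = -1          # "hide" the placed elements
            copy[i_max] = -1
            lo += 1
            hi -= 1
        return result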

  • @KiteHQ
    @KiteHQ  4 роки тому +1

    We set up a Facebook group for people really serious about learning Python or helping others with their learning journey: facebook.com/groups/505658083720291

  • @jan861
    @jan861 4 роки тому +13

    Can you make a video on how you did the animations? I assume you programmed the timing specifically (one time unit for iteration)?

    • @KiteHQ
      @KiteHQ  4 роки тому +6

      Good idea! We did code this up in Python

    • @jan861
      @jan861 4 роки тому +4

      ​@@KiteHQ I always find the animations and diagrams at least similarly interesting as the content of the presentation. :D
      (I'm thinking of 3Blue1Brown right now.) --> "Manim"

  • @kaustabhchakraborty4721
    @kaustabhchakraborty4721 3 роки тому +2

    Plz a video on radix sort.

  • @tommylau7457
    @tommylau7457 3 роки тому

    If sorting ints without duplicates, sort by putting all elements into a boolean array (size = the largest element) and then generate an array according to that boolean array.
    If you want to remove duplicates after sorting, you can also use this method; the time complexity is O(1).
    If you want to keep the duplicates, link the elements into a linked list.
    If you want to reduce the memory size, use a hash function to deal with collisions (arr[i] % num). (A sketch of the basic idea follows at the end of this thread.)

    • @SupaKoopaTroopa64
      @SupaKoopaTroopa64 3 роки тому +4

      Sounds kinda like gravity sort or counting sort. They are used as a subroutine in Radix sort, an O(n) algorithm.

    • @tommylau7457
      @tommylau7457 3 роки тому

      @@SupaKoopaTroopa64 yea that's exactly counting sort actually.
      I see people using quick sort then use another function to remove the duplicates.
      but using counting sort without counting the duplicates will automatically cancels out the duplicates for you already XD
      it's interesting to see how the algos work :D
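
A tiny sketch of the presence-array idea from this thread (effectively a counting sort that drops duplicates), assuming non-negative integers; the names are illustrative.

    def presence_sort_unique(nums):
        """Sort non-negative integers and drop duplicates by marking a
        boolean array indexed by value, then reading it back in order.
        O(n + k) time and O(k) space, where k = max(nums) + 1."""
        if not nums:
            return []
        seen = [False] * (max(nums) + 1)
        for x in nums:
            seen[x] = True
        return [v for v, present in enumerate(seen) if present]

    # presence_sort_unique([3, 1, 4, 1, 5, 9, 2, 6]) -> [1, 2, 3, 4, 5, 6, 9]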

  • @davidjames1684
    @davidjames1684 3 роки тому +8

    I "sort" of understand all of this.

  • @iwersonsch5131
    @iwersonsch5131 4 роки тому +4

    Insertion sort can be done O(N*log(N)) as well if you find the insertion place by comparing to the middle of the possible spots rather than the extremum.
    Say you have 1023 elements in the sorted part of your list, and the next one to insert is the 612th smallest. Instead of comparing to elements 1023, 1022, 1021 etc. until 611 of the sorted list, you compare to the 511th, then the 767th, 639th, 575th, 607th, 623rd, 615th, 611th, 613th, and 612th and after log2(1024)=10 comparisons you already know exactly where to place the new element.
    This keeps the advantages of insertion sort while guaranteeing the speed that quicksort can only hope and pray for

    • @AAA-de6gt
      @AAA-de6gt 4 роки тому +1

      It's still O(n^2) because the elements need to be copied over every time something is inserted.

    • @iwersonsch5131
      @iwersonsch5131 4 роки тому +2

      ​@@AAA-de6gt So if you were trying to avoid using extra space (like I was when initially typing this), you would need O(N) and not O(1) time to copy one list of length N? That's sad if that's true.
      Still only O(N log N) comparisons though, so if a comparison takes 50 times as long as a copy, it is still asymptotically 51 times as fast as basic insertion sort - and even if it takes the same time, it's asymptotically twice as good.
      Even if copying really takes that long, you can still force O(N log N) time by using O(N log N) space to describe the positions of all N elements and only editing those. At this point it would probably become pretty impractical since I'd be losing the advantages of insertion sort for almost sorted lists

    • @orangenostril
      @orangenostril 2 роки тому

      So basically, binary search through the sorted section for the right spot? That's actually genius.

    • @iwersonsch5131
      @iwersonsch5131 2 роки тому

      @@orangenostril Pretty much. Pretty simple, isn't it?

    • @orangenostril
      @orangenostril 2 роки тому

      @@iwersonsch5131 Super simple, super clever
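
A short sketch of the binary-search insertion idea from this thread: bisect finds each insertion point in O(log n) comparisons, but, as the replies point out, inserting into a Python list still shifts elements, so the worst case stays O(n^2) moves. (Illustrative code, not the commenter's.)

    import bisect

    def binary_insertion_sort(a):
        """Insertion sort that finds each insertion point with binary
        search: O(n log n) comparisons, O(n^2) element moves worst case."""
        result = []
        for x in a:
            pos = bisect.bisect_right(result, x)  # equal elements keep their order
            result.insert(pos, x)
        return result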

  • @Wyld1one
    @Wyld1one Рік тому

    You forgot radix sort. Given the proper linked-list implementation it is in place, in order, and O(n).

  • @md2perpe
    @md2perpe 4 роки тому +4

    I wouldn't say that the heap is unsorted but partially sorted.

    • @seneca983
      @seneca983 Рік тому

      It's partially sorted in the wrong direction.

  • @joyburd2
    @joyburd2 3 роки тому +2

    Great explanation. Exactly what I was looking for. Thank you.

  • @JordanBeagle
    @JordanBeagle 2 роки тому

    8:50 It sounds like these 2nd-tier and hybrid algorithms all have a sorting time of O(N*log(N)), so wouldn't that mean they were all the same speed?

    • @liam8398
      @liam8398 Рік тому +1

      Time complexity doesn't actually mean speed. Two algorithms can both be O(n*log(n)), but big O notation doesn't take into account the constant time factor that each algorithm might have. For example, a function that iterates through a list of n elements ONCE is O(n), but another function that iterates through a list of n elements TWICE is also O(n), but it is 2 times slower than the first.
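
A toy illustration of that constant-factor point, using a made-up example: both functions below are O(n), but the first scans the data twice and the second only once.

    def min_and_max_two_passes(nums):
        """Two separate scans: roughly 2n comparisons, still O(n)."""
        return min(nums), max(nums)

    def min_and_max_one_pass(nums):
        """One scan: roughly n iterations, also O(n), but about half the work."""
        lo = hi = nums[0]
        for x in nums[1:]:
            if x < lo:
                lo = x
            elif x > hi:
                hi = x
        return lo, hi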

  • @Classicv5
    @Classicv5 3 роки тому +1

    Does your example of bubble sort sort in ascending order while selection is descending or am I missing something?

  • @SkyboxMonster
    @SkyboxMonster Рік тому

    I came up with a sorting technique that runs zero comparisons. but requires a custom chip to run on. I doubt that trying to emulate the chip design would yield much in time savings.

  • @tamnker8465
    @tamnker8465 2 роки тому +1

    I just got into computer science AP classes and this video might as well be a horror show for me.

  • @valentin6824
    @valentin6824 Рік тому

    Long story short, it is impossible to get below O(n*log(n)) with normal sorting algorithms. There is a mathematical way of proving that, which has something to do with how recursion works.
    However, there are other specialized algorithms which work in O(n), but with the difference that the sorting range must be limited for them.
    You can imagine them like a container in which you store all possible elements, and then go through the list which needs to be sorted. Every time you find an element, you count +1 in your container for the element you found, and output the result in the end.
    That works for limited problems, but since you don't know the elements which need to be sorted in most cases, it's not used very often.

    • @kasuha
      @kasuha Рік тому

      With regard to the proof, there are n! possible permutations of the input data, and to identify which one you have in hand you need at least log(n!) comparisons. And O(log n!) is approximately the same as O(n log n).
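
The "container with +1 counts" described above is counting sort. A minimal sketch for integers in a known range 0..k (illustrative names); being non-comparative, it is not subject to the n log n lower bound discussed in the reply.

    def counting_sort(nums, k):
        """Sort integers in the range 0..k in O(n + k) time by counting
        occurrences and reading the counts back in order."""
        counts = [0] * (k + 1)
        for x in nums:
            counts[x] += 1
        out = []
        for value, count in enumerate(counts):
            out.extend([value] * count)
        return out

    # counting_sort([4, 2, 2, 8, 3, 3, 1], k=8) -> [1, 2, 2, 3, 3, 4, 8]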

  • @DatascienceConcepts
    @DatascienceConcepts 4 роки тому +3

    Nice :) Will refer others!

  • @zyro8623
    @zyro8623 Рік тому

    Let it finish sorting!!

  • @adgarza
    @adgarza 4 роки тому +4

    Although Shell sort is a variant of bubble sort, just speedier, I think it would be worth a mention.

    • @KiteHQ
      @KiteHQ  4 роки тому +1

      Thanks for the feedback! :)

    • @romeolz
      @romeolz 4 роки тому +2

      Wait, wasn't Shell sort a variant of insertion sort, and comb sort the bubble sort variant?

    • @adgarza
      @adgarza 4 роки тому

      @@romeolz I don't think so. As I see it, it is more a variant of bubble sort with pivots.

    • @DeGuerre
      @DeGuerre 3 роки тому +2

      Shell sort can be thought of as a variant of bubble sort OR insertion sort. It's not deployed very often. I've only ever used it once, in a piece of embedded firmware on a tiny CPU where I had lots of ROM available but essentially no additional RAM to spare (not even enough for quick sort), and knew the maximum input size (1000 or so elements) was too small to make heap sort viable.

    • @adgarza
      @adgarza 3 роки тому

      @@DeGuerre Yes, you're absolutely right. It is a kind of bubble sort, but faster. I think it would be good to see the difference in speed. I think it could have had a place after quick sort.
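
For reference, a generic Shell sort sketch; as the thread notes, it is usually presented as a gap-based generalisation of insertion sort. The simple n/2, n/4, ..., 1 gap sequence here is just an illustrative choice.

    def shell_sort(a):
        """Insertion sort over progressively smaller gaps; with gap 1 it
        finishes as an ordinary insertion sort on an almost-sorted list."""
        n = len(a)
        gap = n // 2
        while gap > 0:
            for i in range(gap, n):
                key = a[i]
                j = i
                while j >= gap and a[j - gap] > key:
                    a[j] = a[j - gap]
                    j -= gap
                a[j] = key
            gap //= 2
        return a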

  • @pablodumas9446
    @pablodumas9446 2 роки тому

    This video is a gift from God.

  • @fatimamaychine
    @fatimamaychine 3 роки тому

    Hello, I have a question for an interview about the running time of the ''sort in'' method of an object. Do you have any idea?

  • @Sanmayce
    @Sanmayce 3 роки тому

    Who can make a comparison between Introsort and Quicksort 'Magnetica'?

  • @sodiboo
    @sodiboo 3 роки тому

    the title says “FASTEST COMPARISON SORT”, but the thumbnail implies there’s one that’s faster than O(n log n)? isn’t that impossible with comparisons only?

  • @killjaqular
    @killjaqular 3 роки тому +4

    No RADIX sort? RADIX is incredibly powerful on large datasets.

    • @nm_crazy
      @nm_crazy 3 роки тому

      It's not a comparison sort.

    • @DeGuerre
      @DeGuerre 3 роки тому

      @@nm_crazy is correct, in that this video is about comparison sorts.
      MSD radix sort is partition-based, much like quick sort, and as such the two work well together sometimes. In databases, for example, you often have secondary keys.

  • @-_James_-
    @-_James_- Рік тому

    Why doesn't Insertion Sort binary search the sorted section of the list?

  • @nicreven
    @nicreven Рік тому

    glorious counting sort O(N) master race

  • @Axel_Andersen
    @Axel_Andersen Рік тому

    This video is about as much as you need to know about sorting, because if you find yourself writing a sorting algorithm you are probably on the wrong path. Use a standard library instead.

    • @seneca983
      @seneca983 Рік тому

      Someone has to write those standard libraries.

    • @Axel_Andersen
      @Axel_Andersen Рік тому

      @@seneca983 Of course. But they have been written already. I don't know, maybe 1 in a million programmers is going to need more info about sorting than this? Yeah, I know, I, like thousands of people every year, learnt this in detail at uni, but in reality more than this video gives you is rarely needed.

    • @seneca983
      @seneca983 Рік тому

      @@Axel_Andersen Programmers program all sorts of things so a lot of videos on such subjects are only going to be relevant to a small subset of them. There are certainly exceptions like "here's how React works" etc. but you can probably find a lot of videos on those too.

    • @Axel_Andersen
      @Axel_Andersen Рік тому

      @@seneca983 True. I don't really get what you are driving at. My point was that this video is something every programmer should know, but that it is all most programmers need to know about sorting. No need to go deeper, so this was perfect.

    • @seneca983
      @seneca983 Рік тому +1

      @@Axel_Andersen I didn't have any deep point. It was just an offhand comment that some programmers need to know more about these niche topics (and a lot of topics tend to be somewhat niche). There wasn't that much more to it. However, now that you ask I think I would add to your original comment that I think there's at least one more thing that would be good for programmers to be aware of, namely radix sorts. A programmer might not need to understand them but it's good to be aware that they can bring significant speedups for certain kinds of data.

  • @dragonking7092
    @dragonking7092 2 роки тому +2

    bogosort is clearly the fastest, since it can sort everything in one try

  • @seshagirik4066
    @seshagirik4066 4 роки тому

    Thanks , very useful.

  • @siddheshmore231
    @siddheshmore231 3 роки тому +1

    What do you mean by not stable?

    • @xGOKOPx
      @xGOKOPx Рік тому

      He literally said it

    • @seneca983
      @seneca983 Рік тому

      Stable means that equal keys retain their relative order.
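
A small example of what "stable" means in practice, using Python's built-in sorted(), which is stable (it is Timsort under the hood): records with equal keys keep their original relative order. The data here is made up.

    # Sorting by age only; the two 30-year-olds stay in their input order
    # because sorted() is stable.
    people = [("alice", 30), ("bob", 25), ("carol", 30)]
    by_age = sorted(people, key=lambda person: person[1])
    # -> [("bob", 25), ("alice", 30), ("carol", 30)]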

  • @fussyboy2000
    @fussyboy2000 4 роки тому +4

    Why does nobody include bogosort as the default comparator?

    • @10spen11
      @10spen11 3 роки тому +1

      This video only looks at comparison sorts unfortunately 😔

  • @GabrielsEpicLifeofGoals
    @GabrielsEpicLifeofGoals 2 роки тому +1

    O(n) is possible! Just not with comparison sorts.

  • @revllanes7354
    @revllanes7354 2 роки тому

    Third is O(color sorting). Not O(?)

  • @BeeBaux
    @BeeBaux Рік тому

    How did you create the graphs?

  • @TheDarkOne629
    @TheDarkOne629 Рік тому

    Why does nobody give bucket sort and pigeonhole sort some love? :(
    Also bogosort. It's really fun to confuse students with it when they try to calculate the complexity.

  • @Jkauppa
    @Jkauppa 2 роки тому

    Hash sort, very quick

    • @Jkauppa
      @Jkauppa 2 роки тому

      like radix sort

    • @Jkauppa
      @Jkauppa 2 роки тому

      the only requirement is a continuous measurable value space

    • @Jkauppa
      @Jkauppa 2 роки тому

      how about making an empty array the same size then scanning all the source array for the smallest, then select it, then insert in the empty sorted array in the next highest position

    • @Jkauppa
      @Jkauppa 2 роки тому

      Hash sort is rather Hash bin sort, because it has more than two bins

  • @dactel
    @dactel 2 роки тому +1

    Why the homie keep leaning over?

  • @AmodeusR
    @AmodeusR Рік тому

    It would be better if the sortings in the final were separated by their level.

  • @5ikronoz
    @5ikronoz 2 роки тому

    Where is Kite for Linux?

  • @B_Ahmed1234
    @B_Ahmed1234 3 роки тому

    How do I sort my socks with these algorithms?

  • @jonaw.2153
    @jonaw.2153 Рік тому

    Is there any benefit to using a slower sorting algorithm? Why wouldn't you go for the fastest algorithm whenever possible?

    • @katrinabryce
      @katrinabryce Рік тому +1

      Some of the faster ones require more memory, and if that means swapping out to disk rather than doing it in RAM, it could actually take a lot longer due to IO bottlenecks.

  • @nightglide_
    @nightglide_ Рік тому

    Try bogosort with 1 planck time delay

  • @magicmulder
    @magicmulder 4 роки тому +1

    I wonder how a learning AI would fare here. Would it end up with something close to introsort? Or something else altogether? IIRC it’s impossible to be faster than O(n log n) on average anyway.

    • @DeGuerre
      @DeGuerre 3 роки тому

      It is indeed impossible to do better than O(n log n) comparisons in a comparison-based sort. The proof is quite straightforward if you know a little information theory, namely, that to represent a number between 1 and M, you need log M bits.
      There are n! (n factorial) possible permutations of an array of size n. To sort the array, you need to discover which permutation the array is in, that is, you need to discover a number between 1 and n!.
      A comparison operation (say, a < b) gives you at most one bit of information, so you need at least log(n!) comparisons, which is O(n log n).
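
The bound the two comments above sketch, written out (the standard decision-tree argument, not something from the video): each comparison yields at most one bit, there are n! possible input orders, and log(n!) grows like n log n.

\[
\log_2(n!) \;=\; \sum_{k=1}^{n} \log_2 k \;\ge\; \frac{n}{2}\,\log_2\!\frac{n}{2} \;=\; \Theta(n \log n),
\]
so any comparison sort needs at least \(\log_2(n!) = \Theta(n \log n)\) comparisons in the worst case.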

  • @juancho5327
    @juancho5327 3 роки тому

    Internet Explorer uses stable sorting; is that slower than all of those?

  • @raedius_music
    @raedius_music 4 роки тому

    Great video! I'm assuming you have strong face gestures so you don't fall asleep mid sentence?

  • @chrispysaid
    @chrispysaid 3 роки тому +1

    I know the definitions of all the words you're using, but I have no idea what you're saying.

  • @AnonUsername473
    @AnonUsername473 3 роки тому +1

    his face looks generated by an AI

  • @dulithaperera3211
    @dulithaperera3211 11 місяців тому

    Ryan Ghosling??

  • @alpers.2123
    @alpers.2123 3 роки тому

    What about wolfsort

  • @cerealbowl7038
    @cerealbowl7038 Рік тому

    Bogosort is the best sorting algorithm.

  • @mlucasl
    @mlucasl Рік тому

    3 Levels? In that regard, you forgot 2 types of sorting algorithms that are quite important, Bucket sort, which can be O(n) with known parameters, and Bogosort, which can be O(?) and really frustrating.

    • @LiamLimeLarm
      @LiamLimeLarm Рік тому

      dont forget quantum bogosort, objectively the best sorting algorithm with O(1) (as long as you are in the right universe)

    • @mlucasl
      @mlucasl Рік тому

      @@LiamLimeLarm Quantum bogosort would be as useful as the collapse function you use. And in that case, the collapsing algorithm would be more of a bucket sort. Because it will have to have a zoning for setting index position of the current element, to the sorted index place. There is no "(as long as you are in the right universe)" in quantum computer, if it was just like that, it would be useless, you will always be "in the right universe" if you have the correct collapsing function (the hard part). If you need to be in the right universe, then bogosort always works at the first try, if not, you are just not on the correct universe.

  • @krishivgoel2122
    @krishivgoel2122 7 місяців тому

    Dude u forgot LSD Radix Sort

  • @buddihinbabu
    @buddihinbabu 4 роки тому +1

    Great Job! Thanks A Lot....

  • @TomeczekH
    @TomeczekH 4 роки тому +3

    radix would win on random data in very big list...

    • @segmentsAndCurves
      @segmentsAndCurves 4 роки тому

      Try sorting real numbers, complex numbers, and struct data.
      It ain't goin' well.

    • @romannasuti25
      @romannasuti25 3 роки тому

      @@segmentsAndCurves The problem is, how do you even establish relative sort orders for those elements? Most real sorting is done with elements of at least similar if not identical types. There are ways to map many kinds of similar data types to an absolute sort order, and in such cases you can still produce a radix index (used to sort the array) reading every element only once, then partitioning to increment the correct elements. The only serious problem with radix sort is its relatively high constant cost in those cases where you need to convert data types to produce an absolute sort order. For identical data types, like sorting and querying a column in relational data, radix count sorts are fast, simple to parallelize, and even work across nodes in a distributed cluster given a query master that can "reduce" the counts to produce a single index, which it then sends back down to the nodes to actually sort the data, with either inter-node communication or, admittedly not optimally, sending back chunks of data with their new locations to the query master.

    • @segmentsAndCurves
      @segmentsAndCurves 3 роки тому

      @@romannasuti25 What's your point, again? I don't really get it.

    • @studiousboy644
      @studiousboy644 3 роки тому

      @@segmentsAndCurves
      You can't really compare complex numbers lmfaoooo.

    • @segmentsAndCurves
      @segmentsAndCurves 3 роки тому

      @@studiousboy644 Or can you?
      Yeah, you can't, but what I mean is multiple field comparisons

  • @simonmultiverse6349
    @simonmultiverse6349 3 роки тому +2

    You don't seem to know what the letter O stands for. It means "order of", i.e. the quantity is at most a constant multiplied by the expression inside. As an example, "O(n^2)" is pronounced "order of n squared", meaning the running time is at most a constant multiplied by n^2. One can count time, number of swaps, number of comparisons, memory usage, or any quantity which is relevant to the performance of the algorithm.

  • @cesaltinofelix
    @cesaltinofelix 4 місяці тому

    i was here 🔥

  • @Pilosofia
    @Pilosofia 3 роки тому

    You forgot level 4 and the best one: bogosort.

  • @chonchjohnch
    @chonchjohnch 3 роки тому

    Sleepsort uses the least ram

  • @classsix6491
    @classsix6491 2 роки тому

    Heap sort is O(n) space complexity because you need a heap that will hold the whole data structure if it's not an array.

    • @nandoaires
      @nandoaires Рік тому

      You can do heapsort in-place.

    • @seneca983
      @seneca983 Рік тому

      It's O(1) space complexity. You don't need *extra* space like that.
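
A compact in-place heapsort sketch illustrating the replies' point: the heap lives inside the array itself, so only O(1) extra space is needed. This is a generic textbook version, not the video's code.

    def heapsort(a):
        """Build a max-heap inside the array, then repeatedly move the
        root (current maximum) to the end and restore the heap.
        O(n log n) time, O(1) extra space."""
        def sift_down(root, end):
            while 2 * root + 1 <= end:
                child = 2 * root + 1
                if child + 1 <= end and a[child] < a[child + 1]:
                    child += 1                  # pick the larger child
                if a[root] < a[child]:
                    a[root], a[child] = a[child], a[root]
                    root = child
                else:
                    return

        n = len(a)
        for start in range(n // 2 - 1, -1, -1):  # heapify the whole array
            sift_down(start, n - 1)
        for end in range(n - 1, 0, -1):          # extract maxima one by one
            a[0], a[end] = a[end], a[0]
            sift_down(0, end - 1)
        return a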

  • @HondaCivic20019
    @HondaCivic20019 3 роки тому

    Where can I do this sorting? I wanna try it.

    • @Banzybanz
      @Banzybanz 3 роки тому +1

      Create a random list of numbers, and write a programme to sort them in your favourite programming language.

  • @monkyyy0
    @monkyyy0 3 роки тому

    Wow, a sorting video that doesn't cover radix sort except for mentioning in passing that it's about comparison sorts..... I'm shocked once again.

  • @hh126
    @hh126 3 роки тому +1

    lucky bogosort = fastest

  • @LetsKetchup
    @LetsKetchup Рік тому

    You forgot bogosort😂

  • @palernitana
    @palernitana 4 роки тому

    Hi, I'm Adal.

  • @KasesTBW
    @KasesTBW Рік тому

    Bogosort to the moon

  • @tyapka
    @tyapka 4 роки тому +2

    You should skip explaining how the algos work, because except for trivial algos your explanations are not good at all; instead you'd better explain in what situation I would choose one or the other algo.

  • @Runehorn
    @Runehorn 3 роки тому +1

    Too much bobblehead and not enough sorting on screen.

  • @panasonic_youth
    @panasonic_youth 3 роки тому

    Yep. I didn't understand a single thing in this video

  • @rosek6585
    @rosek6585 3 роки тому

    Radix sort: PATHETIC

  • @hanyanglee9018
    @hanyanglee9018 4 роки тому

    Lvl 4: sorting networks.

  • @veeragbrahma4574
    @veeragbrahma4574 4 роки тому +1

    yaaaaaaaaaaaaaaaaaaaaakkkkkkkkkkkk

  • @veeragbrahma4574
    @veeragbrahma4574 4 роки тому +1

    No code link to download, and the teaching is also very fast, which is not understandable for all non-English students. Please give the code links and also provide such PPTs and book tutorials for free, because if you provide more for the students it will be very helpful.

    • @ng2250
      @ng2250 4 роки тому

      no

    • @veeragbrahma4574
      @veeragbrahma4574 4 роки тому

      @@ng2250 Then you can carry on with your videos yourself. It's a waste of time for me and the students to spend time on your videos then.

    • @oliverstransky4254
      @oliverstransky4254 4 роки тому +2

      @@veeragbrahma4574 Are ya coding son?

    • @veeragbrahma4574
      @veeragbrahma4574 4 роки тому

      @@oliverstransky4254 if you are then you can continue son

    • @MrFreeGman
      @MrFreeGman 4 роки тому

      use the closed captions?