If you have to order a stack of printed invoices by date, bucket sort is good. There are 4 buckets that a human can scan quickly: 1-9 (single digit), 10-19 -> 1, 20-29 -> 2, 30-31 -> 3. After that you use insertion sort on each bucket, which is possible on a normal sized desk. A girl I worked with would always use insertion sort right away, with an auxiliary array in the form of a huge physical table that she would selfishly block. She was slower, but it always gave the impression that she had more work. I always wanted to sabotage the stupid mess she made.
In terms of complexity, sleep sort is actually weirdly interesting to discuss. Just sleep as long as each number's value, and the items will likely print in order, assuming the minimum difference between sleep times is greater than any expectable jitter in the thread handling. While the time complexity is O(n), the effort is actually barely anything. While certainly in the list of impractical sorting algorithms, it is at least actually implementable.
One thing possibly missing from the obscure sorts list is examples of program synthesis and machine learning producing novel algorithms which are not easily explained.
4:40 - your definition of Big-O here is technically correct (except maybe it should be less than or equal to, but I'm not sure if that changes anything), but it's more common to phrase it as f(x) ≤ c·g(x) for all sufficiently large x.
I love how my immediate instinct for every recursive sorting algorithm I’ve coded has been to add a base case that just says “BUT IF THE LENGTH OF THE LIST IS LESS THAN 15, ITS SHAKER TIME BABYYYYY”
I feel like the statement radix can only sort integers should come with an asterisk. Shouldn't you be able to map any object to an integer and achieve the same results? This would require some preprocessing if it's not baked in to the object itself but could still be done quickly in some cases. For example if you're sorting cities by country->city you could generate an integer encoding for that data to then perform radix on.
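For instance, a hypothetical country-then-city encoding might look like this sketch. The rank tables (`country_ranks`, `city_ranks`) are assumptions here, precomputed once from sorted name lists:

```python
def city_key(record, country_ranks, city_ranks):
    """Pack a (country, city) record into one integer, so that sorting by the
    integer orders records by country first, then by city within a country."""
    return country_ranks[record[0]] * len(city_ranks) + city_ranks[record[1]]

# Hypothetical precomputed rank tables (would come from sorted name lists)
country_ranks = {'Canada': 0, 'France': 1}
city_ranks = {'Lyon': 0, 'Paris': 1, 'Toronto': 2}
records = [('France', 'Paris'), ('Canada', 'Toronto'), ('France', 'Lyon')]
ordered = sorted(records, key=lambda r: city_key(r, country_ranks, city_ranks))
```

Any integer sort (radix included) then works on those packed keys.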
In languages with pointers, you can make a list where each element has a pointer to the next element. Using this structure, you can "shift" the entire array just by updating pointers, rather than repeatedly reading and writing elements.
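A minimal sketch of that idea: insertion sort on a singly linked list, where inserting an element is just two pointer writes instead of an O(n) shift.

```python
class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

def linked_insertion_sort(values):
    """Insertion sort on a singly linked list: insert by rewiring pointers."""
    head = None
    for v in values:
        node = Node(v)
        if head is None or v < head.value:        # new smallest goes in front
            node.next, head = head, node
        else:
            cur = head
            while cur.next and cur.next.value <= v:  # walk to the insertion point
                cur = cur.next
            node.next, cur.next = cur.next, node     # splice in: two pointer writes
    return head

def to_list(head):
    """Flatten a linked list back into a Python list, for inspection."""
    out = []
    while head:
        out.append(head.value)
        head = head.next
    return out
```

Finding the insertion point is still O(n) per element, of course; the pointers only remove the shifting cost.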
Here's an interesting variation on Bogo sort: it only randomizes groups of sorted sequences, then at the next check it does it again with the new groups. Will this be better or worse than regular Bogo? The intuitive answer would be better, but how would you prove it? My approach would be that there are situations where the list is piecewise sorted and the randomization sorts it correctly, so it can be a reduction in time complexity, though an inconsistent one, because it only helps in that one condition. It is also a significant increase in space complexity, because you need to keep track of which sublists are sorted.
Great video. I don't give a fuck about computers but this was fun to learn about. I never really thought about sorting pancakes by flipping them, but I guess that actually would work.
When it comes to bucket sort, I'm guessing you can preserve the efficiency of the algorithm on non-evenly distributed arrays if you know what distribution your data is expected to follow, so that you can put more buckets on the densest parts of the range. Also, I wonder if you could do multiple rounds of evenly distributed buckets, but only split buckets that are above a certain size. My guess is that it would boil down to another algorithm.
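The first idea can be sketched like this, assuming the caller supplies bucket boundaries that match the expected distribution (Python's built-in `sorted` stands in for the per-bucket sort):

```python
import bisect

def bucket_sort(items, boundaries):
    """Bucket sort with caller-supplied boundaries: boundaries placed closer
    together in dense regions give those regions more buckets."""
    buckets = [[] for _ in range(len(boundaries) + 1)]
    for x in items:
        # bisect finds which bucket x falls into, in O(log k)
        buckets[bisect.bisect_right(boundaries, x)].append(x)
    result = []
    for b in buckets:
        result.extend(sorted(b))   # per-bucket sort (insertion sort in spirit)
    return result
```

With well-chosen boundaries each bucket stays small, which is what keeps the average case near O(n).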
Me when seeing gravity sort somehow have everything go into place (particularly on a circular visualization): oh so that makes everything fall to its correct place “Gravity sort *insert stuff blah blah blah it uses gravity ok*” Me: oh it literally does that
My favourite sorting algorithm is the O(0) (or would it be O(1)?) Ba Sing Sort, aka "there is no out of place value in the array". You don't do anything, and the array is already sorted. Trust me.
What if you use the "median of medians" algorithm in the implementation of quicksort, to almost always be sure that you split into 2 equal subsequences? Would it improve its time complexity in the worst case?
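For reference: median-of-medians guarantees a pivot within the middle portion of the data in O(n), which does bring quicksort's worst case down to O(n log n), at the cost of a noticeably bigger constant factor. A rough sketch of the combination:

```python
def median_of_medians(a):
    """Pick a guaranteed-decent pivot: median of the medians of groups of five."""
    if len(a) <= 5:
        return sorted(a)[len(a) // 2]
    medians = [sorted(a[i:i + 5])[len(a[i:i + 5]) // 2]
               for i in range(0, len(a), 5)]
    return median_of_medians(medians)

def quicksort_mom(a):
    """Quicksort with a median-of-medians pivot (sketch, not in-place)."""
    if len(a) <= 1:
        return list(a)
    pivot = median_of_medians(a)       # pivot is always an element of a
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort_mom(less) + equal + quicksort_mom(greater)
```

In practice the extra work per partition usually loses to a simple median-of-three or randomized pivot, which is why libraries rarely use it for sorting (it shows up more in selection algorithms).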
Hey! Loved this video :). Small observation I would make is when you're comparing algorithms, maybe use a gradient from green to red to order their complexity :)
I think there is a problem with the way you described quicksort. If the pivot happens to be the largest element in the list, A will scroll all the way to the end of the list and meet B without making any swaps. Then if you place the pivot to the left of them, the element to the right of the pivot (where A and B meet) will remain smaller than the pivot, but also be on the wrong side of it, with no chance of ever getting sorted.
You're right, there is a discrepancy between that and my code. My code goes like this:

    int quickPart(int from, int to) {
        int piv = a[from];
        int c = from + 1;
        int d = to;
        while (c < to && a[c] < piv) { check(c); c++; }
        while (d > from && a[d] > piv) { check(d); d--; }
        while (c < d) {
            switcheroo(c, d);
            show();
            c++;
            d--;
            while (c < to && a[c] < piv) { c++; }
            while (d > from && a[d] > piv) { d--; }
        }
        switcheroo(from, d);
        return d;
    }

A goes right until it finds a piece >= the pivot (or until it passes B). B goes left until it finds a piece <= the pivot (or until it passes A).
I've written a node.js package for sorting in the past and just went back to mess with it again a little and I found out that I wasn't benchmarking node's native sort method for Arrays correctly (it was sorting the input array in-place and then in subsequent runs already had the pre-sorted input). Now I wanna rewrite my package and try double selection sort while I'm at it.
Fun fact, around 9 years ago, I made up a song that had a section like C C# E E D# C. It was my favorite part of the song. When formulating my intro, I was messing around on a keyboard and inevitably found myself rediscovering the song I had made all those years ago. I was inspired by Europe in midi art to change the last note to a D.
I just thought of an idea for a new sorting algorithm, and I think it should be called scam sort or wishful sort: go through the array and find the minimum and maximum value, then return a new array evenly interpolating between them, and pray that the input data matches the incredibly niche test case it was designed for. With the right data, this will sort in O(n), if I'm not mistaken.
I guess this can be generalized to any sorting algorithm that assumes something about the data without looking at all of it. For example, just look at the first element of the array and make a new array that repeats that value for the length of the array. This has O(1) reads, which is probably the best you'll get from a sorting algorithm, except maybe intelligent design sort.
Correction at 9:35: shouldn't the pivot be a random value in the list that we swap to be the first value? That way, we make it less likely that some pre-existing pattern in the data will cause us to get the worst-case runtime.
The original Quicksort started with the first piece as the pivot, sorry. Anyway, the idea of a random pivot still likely wouldn't improve the algorithm, as it would be nondeterministic and would still run into the same issues compared to the method shown here or the method from pdqsort.
in the discussion of radix and related sorts, I fail to see how, in practical implementation, the range can be anything other than O(1), since any practical number representation has a fixed bit count (short of arbitrary precision data types, but who actually uses those). A radix256 sort of a 32bit number can be done in 4 passes, no variance. It doesn't matter if the 32 bit values are int or float, since floats can be turned into a value-ordered sequence of integers by a simple bitwise operation during the first pass, that is easily undone during the last pass (forward: flip all bits of negative numbers and just sign bit of positives). Otherwise a very clear, concise and well presented review of sorting. Thank you. Moving on to ep.2 now...
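The float trick described above can be sketched like this, assuming IEEE-754 single precision (the `struct` round-trip stands in for reinterpreting the bits in place):

```python
import struct

def float_to_ordered_u32(f):
    """Map a 32-bit float to a u32 whose unsigned order matches float order:
    negative numbers get all bits flipped, positives get just the sign bit."""
    bits = struct.unpack('<I', struct.pack('<f', f))[0]
    if bits & 0x80000000:              # negative: flip every bit
        return bits ^ 0xFFFFFFFF
    return bits | 0x80000000           # positive: flip only the sign bit
```

After this transform a plain unsigned radix-256 sort over the four bytes orders the floats correctly, and the inverse transform on the last pass restores the original values.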
Hybrid sort - a combination of quick sort, heap sort and insertion sort; see std::sort
Timsort - a hybrid of merge sort and insertion sort
Strings in radix sort.
But can I do it myself?
I propose what I have heard referred to as "Stalin Sort":
in O(n) remove any element which would cause the array to be unsorted.
Unlike most sorting algorithms it doesn't preserve the contents of an array, but the result is indeed sorted.
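A minimal sketch of Stalin Sort as described, one O(n) pass:

```python
def stalin_sort(items):
    """Keep only the elements that don't break ascending order."""
    result = []
    for x in items:
        if not result or x >= result[-1]:
            result.append(x)   # in order: allowed to stay
        # otherwise the element is purged
    return result
```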
you've heard of merge sort, now get ready for purge sort
@@Kuvina heck yeah, it's purging time!
Lol. This sounds like an algorithm created by an AI when you don't specify the requirements explicitly.
O(1) sort: set array size to 0.
You know, I wonder, what would happen if, instead of deleting the list forever, you stored the deleted values to another array? Then recursively sorted that purged list, until you're left with a bunch of sublists. Then, simply insert each element in every sublist in the list from the recursion before it, until eventually the list is sorted?
I think my favorite sorting algorithm is Miracle Sort. The steps are as follows:
Check if the array is sorted. If not, check if the array is sorted. Repeat until sorted.
Also heck yeah, enbies represent! Happy Pride!
Mfw random radiation-induced bit flips
@@DaemonWorx its best case time complexity is still O(1) because there's always that small chance radiation flips all the bits in perfect order
You'd have to recreate so many SM64 solar bit mutations in order for that to be even a fraction as viable as bogosort
@@DZ-DizzyDummy it makes bogobogosort look like the speed of light
My favorite sorting algorithm is "don't sort". Its where you figure out a way to solve your problem without sorting the list
Redefine Sort:
step 1: redefine the sort qualifications so that the list you have is sorted
step 2: marvel at your "sorted" list
Delete Sort
step 1: delete all data
step 2: claim that it's all sorted
step 3: realize that you could say the same for it being not sorted
step 4: panic
Replace Sort
Step 1: replace the data with the set 1, 2, 3
step 2: profit
@@therealelement75 No, the empty list is definitely sorted
Step 1 do step two
Step 2 refer back to step number one
Or
Step 1:Go to step 3 ignore step 2
Step 2:go to step 1 ignore step 3
Step 3:go to step 2 ignore step 1
I have done this before, when getting the top 5 things from a list, I just got the top thing 5 times
It's like the fourth time I'm rewatching this series; seriously, this is the best resource for an intermediate understanding of sorting.
We need to have Quantum Bogosort in the next one; it's too good of a joke algorithm to not include. Great video! :D
Yeah it's like "what? you checked them one at a time? Just do every possible permutation simultaneously and grab the sorted one ez"
@@warriorsabe1792 Wait, I thought it was "shuffle list, then if not sorted, destroy the universe". Assuming the Many-Worlds interpretation of the universe holds, then all universes where the shuffle did not sort the list will be destroyed, so all surviving universes will have the list sorted in O(n) time and O(1) space complexity.
I watched this expecting it
69th like
@@KinuTheDragon In some cases it's O(1). Just don't check if it's sorted, assume it is, and only destroy the universe if there is an error.
I come here to learn more about the radix sort, but end up watching all of the video. It is very interesting, thank you for making this
Pattern defeating quicksort. Also if you're looking for "joke" ones: sleepsort. It'd may be interesting to go into algorithms designed with L1/L2 cache and branches/branchless in mind.
For optimizing cache you'll want to look into "Cache-oblivious algorithms" which is a way of analyzing loads & stores in big O notation (and probabilistic cache misses), the currently most known optimal algorithm for this is Funnelsort (which is a small improvement on Merge sort). Really Quick Sort & Merge Sort are both nearly optimal (which you probably guessed) but at least we have a model to tell us "why" quick sort is so good.
20:08 You really thought you could sneak that in there and have nobody notice
In Bogo Sort, instead of permuting the elements randomly, you could just permute the elements sequentially (think of generating the index permutations in lexicographic order, then checking if the reordering sorts the list). It is still O(n!), but at least it is guaranteed to sort in finite time.
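That sequential variant might look like this sketch; `itertools.permutations` conveniently yields index permutations in exactly the lexicographic order described:

```python
from itertools import permutations

def deterministic_bogo(items):
    """Try index permutations in lexicographic order until one sorts the list."""
    for perm in permutations(range(len(items))):
        candidate = [items[i] for i in perm]
        if all(candidate[i] <= candidate[i + 1] for i in range(len(candidate) - 1)):
            return candidate
```

Still factorial time, but it visits each arrangement exactly once, so it always terminates.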
I put this video on knowing it would knock me out; worked like a charm. When I woke up, I re-watched the entire thing because it was such an interesting video.
The world needs more YouTubers like you and fewer YouTubers of the kind you don't even want to ever find.
That was a great explanation of big O notation, which doesn't make it sound illogical!
Binary insertion is actually one of, if not the best way to sort physical objects. The reason it's slow on a computer is that shifting is O(n), but with real objects you can shift everything at once in O(1). Like when sorting cards: you look for the right spot to put a card and then just stick it in there.
wow this video was super interesting! I somewhat recently programmed an application that sorted tiles in a 2D grid, and coding a fair amount of algorithms made me feel like I at least had a somewhat good grasp of how everything worked, but seeing this video I was really surprised at how deep the subject is, and how much I was missing! Amazing video!
With all this knowledge, I learned that the optimal sorting algorithm for all these animations is simply returning list(range(n)), since after assuming so many wonderful things (n uniform whole numbers between 1 and n) you already know the answer.
Oh my gosh, I was just getting recommended a bunch of sorting algorithm ASMR(?) videos and was trying to find something to explain each of the sorting algorithms, but none were really helping me.
Haven't watched the video yet, but you've done such a good job explaining other complex topics, that I'm pretty sure this one will be of great help to me.
99% of bogo sorters quit right before they’re about to sort the list
This is an amazing video, it's like those "sorting algorithms visualized" videos, but with a concise explanation of how each one works. Thank you, it really helped me understand them.
I wanted to see how Radix works, and this video explained it so well. I'm only really a beginner at coding, and I had an "OHHHH that makes sense" moment in this
I'm literally playing a programming game right now and I'm wondering which sorting algorithm is the easiest to implement. The timing for this video couldn't be better lol. Apart from that, it's probably a good way to refresh my memory and likely learn something new, so you are instantly getting watch time from me 👀
If I need to sort something & I don't care about optimization, I use what I call NaïveSort
It's selection sort, except instead of swapping elements, you copy the element into a temporary list & delete the original, then at the end you replace the original list with the temporary
It's literally just worse than SelectionSort, which is already pretty bad, but it's dead easy to implement -- maybe even the easiest of them all
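A sketch of the described approach (function name is mine, not the commenter's):

```python
def naive_sort(items):
    """Selection sort, but moving minima into a temporary list instead of swapping."""
    source = list(items)          # work on a copy of the original
    result = []
    while source:
        smallest = min(source)    # find the minimum of what's left
        source.remove(smallest)   # delete it from the original...
        result.append(smallest)   # ...and copy it into the temporary list
    return result
```

Same O(n^2) comparisons as selection sort, plus O(n) extra space and the cost of the deletions, but there is genuinely almost nothing to get wrong.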
@@aformofmatter8913 always loved selection sort for how stupidly easy it is to implement, literally just call min and swap in a loop and that's it, but imo yours feels needlessly sophisticated, in a way. Like, for me it is easier to just call swap than to create a new container. Although I see why you like it, it's an interesting approach
What game?
@@elnico5623 it was probably "Turing complete". It's not a programming game per se, instead it teaches you to create a computer from logic gates and then you program it. It's available on steam and is very fun, pretty much a sandbox for any digital logic simulation
The space/time issue was a huge one for me when I did embedded programming. I used to write code for things as slow as 8 MHz 8-bit hardware, with a few KB to a few hundred KB of RAM. Merge sort was the usual candidate, but there were actually times that we would say stability be damned and free memory addresses from those twin sorted arrays as soon as they were in the final array, and then just return the final array.
I look forward to the next part!
There's another algorithm I'm not sure is in the end list (since I don't remember the name), but it had to do with a _very particular_ sort of memory device that was able to use chains of charge pump cells to be able to rapidly interleave and shuffle lists together. The algorithm wasn't particularly good asymptotically, but the amount of silicon gate area required for each cell was just _tiny_ and the device could be run at clock speeds fast enough to more than keep up in practical terms.
kinda makes me think of what hardware might look like for impractical but asymptotically or even thermodynamically optimal sorting if foregoing solid state
I couldn't help but think of Liam Neeson…
A very particular sort of memory. The sort of memory that's a nightmare for unsorted lists…
There are a couple of algorithms used on really old, serial access (i.e. tape) backing store.
The first is the Fibonacci sort:
Augment the list to make a Fibonacci number (say F[n]) of items and split them between two tapes, the first (tape A) containing F[n-1] and the second (tape B) F[n-2] items.
Each item on these tapes can be considered a sorted list of length one.
Merge the first lists from tapes A and B onto a third tape (tape C), and repeat F[n-2] times.
Now, rewind tapes B and C. Tape A has F[n-1] - F[n-2] = F[n-3] lists left; this becomes the new tape B. Tape B is finished, so it can be used as the new tape C. Tape C has F[n-2] sorted lists, so it can be the new tape A.
Repeat this until there is just one sorted list.
The other one is an extension of this, called the cascade sort, where, instead of just three tapes, there are k tapes, and tapes 1 to k-1 are merged to tape k, then the tapes are cycled, so tape 1 becomes the new tape 2, tape 2 becomes the new tape 3, …, tape k - 1 becomes the new tape k and the tape k becomes the new tape 1.
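The Fibonacci scheme above can be sketched with Python lists standing in for tapes, each tape being a list of sorted runs (padding with empty runs, as described; `heapq.merge` plays the role of the tape merge):

```python
import heapq

def fibonacci_tape_sort(items):
    """Polyphase merge with Fibonacci run distribution, tapes as lists of runs."""
    runs = [[x] for x in items]              # every item starts as a run of length 1
    fibs = [1, 1]
    while fibs[-1] < max(len(runs), 1):      # find the Fibonacci count to pad up to
        fibs.append(fibs[-1] + fibs[-2])
    runs += [[] for _ in range(fibs[-1] - len(runs))]   # augment with empty runs
    tape_a, tape_b = runs[:fibs[-2]], runs[fibs[-2]:]   # F[n-1] and F[n-2] runs
    while len(tape_a) > 1 or tape_b:
        # merge one run from each tape until the shorter tape (B) is exhausted
        tape_c = [list(heapq.merge(tape_a.pop(0), tape_b.pop(0)))
                  for _ in range(len(tape_b))]
        tape_a, tape_b = tape_c, tape_a      # leftover of A becomes the new B
    return tape_a[0] if tape_a else []
```

Each pass only "rewinds" two of the three tapes, which is the whole point on real tape drives.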
Can't wait to see Thanos sort. If the array isn't sorted, remove half the items at random. Repeat until the array is sorted.
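Thanos sort as described fits in a few lines (which half survives each snap is random, so the surviving elements differ from run to run):

```python
import random

def thanos_sort(items):
    """While the list isn't sorted, snap away a random half of its elements."""
    a = list(items)
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        survivors = random.sample(range(len(a)), len(a) // 2)  # pick who survives
        a = [a[i] for i in sorted(survivors)]                  # keep original order
    return a
```

It always terminates (an unsorted list has at least 2 elements, so every snap strictly shrinks it), and the output is always sorted, just rarely complete.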
The host is teaching me more math than school ever did, and making it interesting.
Fun fact: a quantum bogosort would in theory always resolve in 1 step (if done correctly)
Commenting for the algorithm!
The sorting algorithm
That pancake sort might actually work on some kind of strange doubly-linked list if each link specifies which link of the following element to continue with
30:00 It's really cool to see algorithms that only work in the real world! I'm really glad you included these.
There is a sorting algorithm that can sort in 1 round, and it actually works similar to bogo-sort! However, it does depend on the multiverse hypothesis being true.
1) Shuffle the list according to a truly random source of random numbers (atomic decay is a good one).
2) Check the list
3) If sorted, then you're done
4) If not sorted, then destroy the universe
I think this is probably the best explanation of Quick Sort I've encountered. Finally got my head around it after all these years.
No matter how many times I get it explained to me I can’t ever fully understand big O notation. Which is funny because back in the day I was an avid sorting algorithm enjoyer, having made over 90 sorts myself.
22:42
For best performance you would count the digits without first copying the array to the second buffer; after counting, you would do the sorted copy into the other buffer, then count again, sort-copy back, and keep going until one of the two arrays has the final sorted result.
Computational complexity is the same, but it's more performant.
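The ping-pong variant described might look like this sketch for an LSD radix sort on non-negative integers (two buffers that swap roles each pass, no copy-back):

```python
def radix_sort(items, base=10):
    """LSD radix sort that ping-pongs between two buffers instead of copying back."""
    src = list(items)
    dst = [0] * len(src)
    largest = max(src, default=0)
    exp = 1
    while largest // exp > 0:
        counts = [0] * base
        for x in src:                        # counting pass reads src in place
            counts[(x // exp) % base] += 1
        for d in range(1, base):             # prefix sums give final positions
            counts[d] += counts[d - 1]
        for x in reversed(src):              # stable placement into the other buffer
            counts[(x // exp) % base] -= 1
            dst[counts[(x // exp) % base]] = x
        src, dst = dst, src                  # swap roles: no copy-back needed
        exp *= base
    return src                               # after the final swap, src is sorted
```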
ive been using bubble sort to maintain the sortedness of arrays, works pretty well if nothing huge needs to change
20:08 I love you for this
Amazing video, really! I was not expecting this high quality, but here it is! You gained a new subscriber.
Bogo has always been my favorite. Not many sorts can go from "1st try" to "heat death of the universe."
I can't explain what a major W this video is for my brain
in-place merge sort would've been a nice mention. Also the two ways of quicksort partitioning. And in terms of joke sorts, demon sort is one of the more interesting ones :)
And then there would be threading optimization with odd-even sort and merge sort… ah, but I haven't seen the second video yet. Great work, anyway!
This explained sorts in plain language, much better than others have done. Nice flag, too (0:06).
2:35 - A Numberphile video showed that Selection and Insertion are the same sort but reversed.
20:00 - That's not the only way to do Shell Sort. If a pair is swapped at the current gap, compare to the left at the same gap size, repeat until you reach the start of the list. It guarantees that all items at the current gap size will be in order, moving items which might have been missed with a larger gap on previous iterations. Example with gap of three: 4,7,2,9,1,6,5,8,0,3
compare 4 & 9; compare 9 & 5, swap, compare 4 & 5; compare 9 & 3, swap, compare 5 & 3, swap, compare 4 & 3
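The back-propagating variant traced above can be sketched like this (it amounts to a gapped insertion sort; the name `shell_sort` and the simple halving gap sequence are assumptions on my part, not from the comment):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Shell sort where a swap at the current gap keeps comparing
// leftward at the same gap until the start of the list, so items
// missed by a larger gap on earlier passes still move into place.
void shell_sort(std::vector<int>& a) {
    for (std::size_t gap = a.size() / 2; gap > 0; gap /= 2) {
        for (std::size_t i = gap; i < a.size(); ++i) {
            // bubble a[i] leftward in gap-sized steps while out of order
            for (std::size_t j = i; j >= gap && a[j - gap] > a[j]; j -= gap)
                std::swap(a[j - gap], a[j]);
        }
    }
}
```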
Loved the visual style
Just stumbled upon this and gotta say it's really well made! Keep it up :)
I accidentally subscribed.
*+1 sub*
mine: you have lists A and B. Make list B the numbers from the smallest to the biggest number in list A, then go through list B and remove any items not in list A. Because we made list B smallest-to-biggest, it was already sorted, so once we remove the ones not in A, we're left with only the ones in A.
I was just about to cover this topic, but you did it first.
"Words are just 26 bit numbers"
God i've never thought about it like that.
Fluxsort and Quadsort would be pretty interesting to cover as well for the part 2!
2:50 fun fact, binary insertion is actually slower than plain insertion in practice due to cache locality. ref: a cpp talk given by someone at cppcon.
I'm not complaining about the content, because that is ACE, but I'd low-cut the vocal track with an EQ or HPF at a minimum of 85 Hz. Maybe you can even cut at 120-130 Hz based on the tonal features of your voice; do it in post-process, not pre.
Just a little free tip that will make the sound more consistent through the series (that is, if you use the same microphone and the same room, of course). There is a very audible deep humming which I suspect you can just filter away.
34:57 sorts of my interest:
* Bitonic (i never understood this in class)
* Pairwise Sorting Network (ik this, but just noting here)
* Intro (C++ STL)
* Tim (Python)
Radix sort is where it went over my head, I think. Could be that 20 minutes of thinking is my limit for one sitting though.
brilliant video and your voice is really nice too. will definitely rewatch next year when i have an intro to programming class :)
Yaay, SFML! I used it last semester for some assignments but haven’t ever seen it named in a project someone else has done, so I wondered if it was commonly used or not.
the best video ive seen on this subject
This is a very good explanation of these sorts! Are you planning to make the code for the visualisations public?
I am considering it, but as of right now, the code is a mess with no comments, so I would like to make it readable before that.
Sleep Sort, where you make a thread for each item, let each sleep for a time proportional to its value, and then they wake up in order
This video is really helpful. Keep going bro 🔥🔥
If you have to order a stack of printed invoices by date, bucket sort is good. There are 4 buckets that a human can spot quickly:
1-9 single digit
10-19 -> 1
20-29 -> 2
30-31 -> 3
After that you use insertion sort on each bucket, which is feasible on a normal-sized desk.
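The manual process above can be sketched directly, assuming the invoice dates are reduced to day-of-month integers for illustration (the name `bucket_sort_days` is my own):

```cpp
#include <array>
#include <cstddef>
#include <utility>
#include <vector>

// Bucket invoices by the tens digit of the day (1-9, 10-19, 20-29,
// 30-31), then insertion-sort each small bucket, just like sorting
// paper on a desk.
std::vector<int> bucket_sort_days(const std::vector<int>& days) {
    std::array<std::vector<int>, 4> buckets;
    for (int d : days) buckets[d / 10].push_back(d);
    std::vector<int> out;
    for (auto& b : buckets) {
        // insertion sort within the bucket
        for (std::size_t i = 1; i < b.size(); ++i)
            for (std::size_t j = i; j > 0 && b[j - 1] > b[j]; --j)
                std::swap(b[j - 1], b[j]);
        out.insert(out.end(), b.begin(), b.end());
    }
    return out;
}
```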
A girl I worked with would always use insertion sort right away, with an auxiliary array in the form of a huge physical table that she would selfishly block.
She was slower, but it always gave the impression that she had more work.
I always wanted to sabotage the stupid mess she made.
In terms of complexity, sleep sort is actually weirdly interesting to discuss. Just sleep as long as each number, and the items will likely print in order, assuming the minimum difference in sleep times is greater than any expected jitter in the thread scheduling. While the time complexity is O(n), the effort is barely anything. It's certainly on the list of impractical sorting algorithms, but it is at least implementable.
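The sleep sort these comments describe is easy to sketch with standard threads (the name `sleep_sort` and the 50 ms scale factor are my own choices; the scale must exceed scheduler jitter or the order breaks, which is exactly the caveat above):

```cpp
#include <chrono>
#include <mutex>
#include <thread>
#include <vector>

// One thread per value; each sleeps proportionally to its value and
// then appends itself, so values arrive in ascending order. A toy,
// not a real sort: wall-clock time is proportional to the maximum value.
std::vector<int> sleep_sort(const std::vector<int>& in) {
    std::vector<int> out;
    std::mutex m;
    std::vector<std::thread> threads;
    for (int v : in)
        threads.emplace_back([&, v] {
            std::this_thread::sleep_for(std::chrono::milliseconds(50 * v));
            std::lock_guard<std::mutex> lock(m);  // serialize the appends
            out.push_back(v);
        });
    for (auto& t : threads) t.join();
    return out;
}
```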
one thing possibly missing from the obscure sorts list is examples of program synthesis and machine learning producing novel algorithms which are not easily explained
4:40 - your definition of Big-O here is technically correct (except maybe it should be less than or equal to, but I'm not sure if that changes anything), but it's more common to phrase it as f(x) ≤ c·g(x) for all sufficiently large x.
Wow, this video is a really good work, congratulations!
33:40 I had the subs on and it said:
"*Bogosaur* is unique because it's not guaranteed to sort the list on finite time." 😂
I love how my immediate instinct for every recursive sorting algorithm I’ve coded has been to add a base case that just says “BUT IF THE LENGTH OF THE LIST IS LESS THAN 15, ITS SHAKER TIME BABYYYYY”
I feel like the statement that radix can only sort integers should come with an asterisk. Shouldn't you be able to map any object to an integer and achieve the same results? This would require some preprocessing if it's not baked into the object itself, but could still be done quickly in some cases. For example, if you're sorting cities by country->city, you could generate an integer encoding for that data and then perform radix on it.
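The country->city encoding suggested here could be sketched like this, assuming precomputed rank indices per country and per city (the `City` struct and `key` function are hypothetical names, not from the video):

```cpp
#include <cstdint>
#include <string>

// Pack (country rank, city rank) into one integer so an integer sort
// such as radix can order the records. The country occupies the high
// bits, so it acts as the most significant "digit".
struct City {
    uint16_t countryRank;  // precomputed ordinal of the country
    uint16_t cityRank;     // precomputed ordinal of the city name
    std::string name;
};

uint32_t key(const City& c) {
    return (uint32_t(c.countryRank) << 16) | c.cityRank;
}
```

Sorting the records by `key(...)` (with radix or anything else) then orders them by country first and city second.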
Nuclear Bomb Sort: sets EVERY element in the array to 0 and checks if it's sorted
This is technically O(n) time, and actually has a space complexity of 0
But it is less useful than stalin sort
I think you don't need merge sort to use the additional array; you can do it all in the first one with pointers.
amazing video you explain everything so well
20:09 AMONG US on purple bar
in languages with pointers, you can make a list where each element has a pointer to the next element. using this structure, you can shift the entire list by updating pointers instead of repeatedly reading and writing elements.
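A sketch of that idea: inserting into a sorted singly linked list changes only two pointers, where an array would have to shift every later element (the `Node` struct and `sorted_insert` name are my own, not from the video):

```cpp
// Minimal singly linked list node.
struct Node {
    int value;
    Node* next;
};

// Insert `n` into the sorted list starting at `head`, keeping it
// sorted. Only two pointers change, no matter where `n` lands.
Node* sorted_insert(Node* head, Node* n) {
    if (!head || n->value < head->value) {  // new smallest: n becomes head
        n->next = head;
        return n;
    }
    Node* cur = head;
    while (cur->next && cur->next->value <= n->value)
        cur = cur->next;                    // walk to the insertion point
    n->next = cur->next;                    // relink: two pointer writes
    cur->next = n;
    return head;
}
```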
isn't pointer arithmetic like a taboo?
13:43 actually, selection sort can be stable, depending on how you find the smallest/largest number, and how you implement the algorithm
Here's an interesting variation on Bogo sort: it only randomizes groups of sorted sequences, then at the next check it does it again with the new groups. Will this be better or worse than regular Bogo? The intuitive answer would be better, but how would you prove it?
My approach would be that there are situations where the list is piecewise sorted and the randomization happens to finish the job, so it can be a reduction in time complexity, though an inconsistent one, since it only helps in that one condition. It's also a significant increase in space complexity, because you need to keep track of which sublists are sorted.
Great video. I don't give a fuck about computers but this was fun to learn about. I never really thought sorting pancakes by flipping them, but I guess that actually would work.
When it comes to bucket sort, I'm guessing you can preserve the efficiency of the algorithm on non-evenly-distributed arrays if you know what distribution your data is expected to follow, so that you can put more buckets on the densest parts of the range. Also, I wonder if you could do multiple rounds of evenly distributed buckets, but only split buckets that are above a certain size. My guess is that it would boil down to another algorithm.
From now on I'll exclusively use BOGO sort for everything.
Me when seeing gravity sort somehow have everything go into place (particularly on a circular visualization): oh so that makes everything fall to its correct place
“Gravity sort *insert stuff blah blah blah it uses gravity ok*”
Me: oh it literally does that
13:02 exactly. but why. stability. thanks a lot for backing it up so well
My favourite sorting algorithm is the O(0) (or would it be O(1)?) Ba Sing Sort, aka "there is no out of place value in the array". You don't do anything, and the array is already sorted. Trust me.
masterful presentation, lots of nuance is packed into every part
What if you use the `median of medians` algorithm in the implementation of quicksort, to almost always be sure that you split into two equal subsequences? Would it improve its time complexity in the worst case?
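For context, a minimal sketch of median-of-medians pivot selection (my own illustration, not the video's method): split into groups of 5, take each group's median, and recurse on those medians. The result is guaranteed to have a constant fraction of elements on each side, which does make quicksort O(n log n) worst case, though the constant factor usually makes it slower than a plain pivot in practice.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Returns a guaranteed "good" pivot: at least ~30% of the elements
// lie on each side of it. Copies its input since this is a sketch.
int median_of_medians(std::vector<int> v) {
    if (v.size() <= 5) {  // base case: median of a tiny list directly
        std::sort(v.begin(), v.end());
        return v[v.size() / 2];
    }
    std::vector<int> medians;
    for (std::size_t i = 0; i < v.size(); i += 5) {
        std::size_t end = std::min(i + 5, v.size());
        std::sort(v.begin() + i, v.begin() + end);  // sort the group of <=5
        medians.push_back(v[i + (end - i) / 2]);    // keep its median
    }
    return median_of_medians(medians);  // recurse on the medians
}
```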
Hey! Loved this video :). Small observation I would make is when you're comparing algorithms, maybe use a gradient from green to red to order their complexity :)
I think there is a problem with the way you described quicksort. If the pivot happens to be the largest element in the list, A will scroll all the way to the end of the list and meet B without making any swaps. Then if you place the pivot to the left of them, the element to the right of the pivot (where A and B meet) will remain smaller than the pivot but sit on the wrong side of it, with no chance of ever getting sorted.
You're right, there is a discrepancy between that and my code. My code goes like this:
int quickPart(int from, int to)
{
    int piv = a[from];
    int c = from + 1;
    int d = to;
    // advance c right past pieces already smaller than the pivot
    while (c < to && a[c] < piv)
    {
        check(c);
        c++;
    }
    // advance d left past pieces already larger than the pivot
    while (d > from && a[d] > piv)
    {
        check(d);
        d--;
    }
    // swap out-of-place pairs until the scanners cross
    while (c < d)
    {
        switcheroo(c, d);
        show();
        c++;
        d--;
        while (c < to && a[c] < piv)
        {
            c++;
        }
        while (d > from && a[d] > piv)
        {
            d--;
        }
    }
    // place the pivot at its final position
    switcheroo(from, d);
    return d;
}
A goes right until it finds a piece >= the pivot (or until it passes B);
B goes left until it finds a piece <= the pivot.
the start kinda looks like a musical of the europe continent in fl studio if you remember
I've written a node.js package for sorting in the past and just went back to mess with it again a little and I found out that I wasn't benchmarking node's native sort method for Arrays correctly (it was sorting the input array in-place and then in subsequent runs already had the pre-sorted input). Now I wanna rewrite my package and try double selection sort while I'm at it.
That intro music sounds like the western half of Iceland.
Fun fact, around 9 years ago, I made up a song that had a section like C C# E E D# C. It was my favorite part of the song. When formulating my intro, I was messing around on a keyboard and inevitably found myself rediscovering the song I had made all those years ago. I was inspired by Europe in midi art to change the last note to a D.
I just thought of an idea for a new sorting algorithm and i think it should be called scam sort or wishful sort.
go through the array and find the minimum and maximum value, then return a new array evenly interpolating between them and pray that the input data matches the incredibly niche test case it was designed for. with the right data, this will sort it in O(n) if I'm not mistaken.
i guess this can be generalized to any sorting algorithm that assumes something about the data without looking at all of it. for example just look at the first element of the array and make a new array that repeats that value for the length of the array. this has O(1) reads which is probably the best you'll get from a sorting algorithm, except maybe intelligent design sort.
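The "wishful sort" described above is simple enough to sketch exactly as stated (the name `wishful_sort` is my own): read only the min and max, then emit an evenly interpolated array and hope the input happened to match.

```cpp
#include <cstddef>
#include <vector>

// Ignores the actual data beyond min/max: returns an evenly spaced
// ramp from min to max. Only "correct" for inputs that already were
// such a ramp, which is the joke.
std::vector<double> wishful_sort(const std::vector<double>& a) {
    if (a.empty()) return {};
    double lo = a[0], hi = a[0];
    for (double v : a) {  // single O(n) pass for min and max
        if (v < lo) lo = v;
        if (v > hi) hi = v;
    }
    std::vector<double> out(a.size());
    double step = a.size() > 1 ? (hi - lo) / (a.size() - 1) : 0.0;
    for (std::size_t i = 0; i < a.size(); ++i)
        out[i] = lo + step * i;
    return out;
}
```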
20:09 I see you
This is amazing. Thank you
Very cool thank you so very much
20:09 *a m o g u s*
This video is very gender
Bruh
I’m Sortsexual.
Wut
@MoolsDogTwoOfficial insertion sorting fr
Mood
Great Job!
Binary insertion and its precursor, nonbinary insertion? I don't know if that's actually funny or not, but it popped into my head.
Amazing video!
Correction at 9:35: Shouldn't the pivot be a random value in the list that we swap into the first position? That way, we make it less likely that some pre-existing pattern in the data will cause us to hit the worst-case runtime.
The original Quicksort started with the first piece as the pivot, sorry. Anyways the idea of a random pivot still likely wouldn’t improve the algorithm as it would be nondeterministic and also would still run into the same issues compared to using the method shown here or the method from PDQsort.
the. Best. video high. at 1.5x speed ofc. followed by half•alive’s still feel. music video
in the discussion of radix and related sorts, I fail to see how, in any practical implementation, the range can be anything other than O(1), since any practical number representation has a fixed bit count (short of arbitrary-precision data types, but who actually uses those). A radix-256 sort of a 32-bit number can be done in 4 passes, no variance. It doesn't matter if the 32-bit values are int or float, since floats can be turned into a value-ordered sequence of integers by a simple bitwise operation during the first pass that is easily undone during the last pass (forward: flip all bits of negative numbers and just the sign bit of positives).
Otherwise a very clear, concise and well presented review of sorting. Thank you. Moving on to ep.2 now...
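The float bit trick mentioned in the previous comment can be sketched like this, assuming IEEE 754 single-precision floats and 32-bit `uint32_t` (function names are my own):

```cpp
#include <cstdint>
#include <cstring>

// Map a float to a uint32_t that compares in the same order as the
// float: flip all bits of negatives, just the sign bit of the rest.
uint32_t float_to_ordered(float f) {
    uint32_t u;
    std::memcpy(&u, &f, sizeof u);  // type-pun safely via memcpy
    return (u & 0x80000000u) ? ~u : (u | 0x80000000u);
}

// Exact inverse of float_to_ordered.
float ordered_to_float(uint32_t u) {
    u = (u & 0x80000000u) ? (u & 0x7FFFFFFFu) : ~u;
    float f;
    std::memcpy(&f, &u, sizeof f);
    return f;
}
```

Radix sort the transformed integers, then undo the transform on the last pass, as the comment says.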
20:08 AMONGUS
Awesome stuff!