See, the problem with quantum bogo sort is that what the algorithm _really_ does is basically guarantee that the computer will experience a failure that causes it to incorrectly report the randomized list as sorted, since even that minuscule probability of a hardware fault is still the more likely of the two outcomes by a huge margin.
That's where the cleverness of quantum programming comes in. That description is a bit more joke-ified than the way it would really work, which involves causing the combined wavefunction to interfere with itself in such a way that it massively amplifies the probability of the desired outcome while simultaneously destroying the probability of the other ones. That's how quantum computers work in general: by doing things to the wavefunctions that mess with probability so the correct answer always happens. Imagine adding numbers by rolling a magic die that weights itself to only ever roll the number you want, for example.
You could fix that by creating n! + 1 universes (one for each permutation of the list, plus one extra), n! of which attempt to sort the list, while the remaining one just immediately throws an error.
The sorting universes check their permutation. If it's not sorted, they delete themselves. However, if the list is sorted, they delete the error universe.
This ensures that a universe will always exist, even if the list is, for whatever reason, unsortable. It also allows for multiple solutions.
My two favourites that haven't been mentioned are sleep sort, which works on numbers only: you sleep for X amount of time for each value X and append each value as its sleep ends.
But my favourite is miracle sort: you check if it's sorted, and if it's not, you wait a bit and check again to see if, because of a miracle, it's now sorted. You can pray or sacrifice something to (try to) speed up the sorting time.
I also forgot Stalin sort, where you purge any data that isn't in order. Pros: it's always O(n). Cons: worst case, you've purged every value except the first one.
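A minimal Python sketch of Stalin sort as described above (the function name is just illustrative):

```python
def stalin_sort(values):
    """Keep only the elements that are already in order; 'purge' the rest.
    One pass, so O(n) time, but data can be lost."""
    result = []
    for v in values:
        if not result or v >= result[-1]:
            result.append(v)
    return result

print(stalin_sort([3, 6, 1, 5, 9]))  # [3, 6, 9] - the 1 and 5 get purged
```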
Thank you! These 3 are probably the algorithms I've gotten the most comments about and they'll definitely be in part 3 !
There's also intelligent design sort: The odds of the list randomly being in the specific order that you find it in are astronomically low, 1/n!. Thus, it was clearly set up in that order by an Intelligent Sorter, according to whose divine plan it is already sorted, and who are we mortals to question the Intelligent Sorter?
I've worked at a clothes store, and just realised I've implemented a version of patience sort independently. I need to sort stacks of clothes by their size, and they start out basically completely shuffled. The main difference is that I sort the intermediate piles in reverse order, so that when I reassemble the final stack, it comes out the right way up. (cos I add new clothes to the top of the stack) Glad to see the idea has been studied with a name and theory behind it
I had a similar experience when I was an intern at an office. I had to sort stacks of numbered paper documents (thousands of them, I think around 14k), but there were sequences of already sorted papers, and I found a way to take advantage of that: I would take two small stacks of sorted papers, merge them into one larger stack (like the merge step of merge sort), then take two other small stacks, merge them, and then merge the two resulting larger stacks, and I just kept going like this. Some time later I learned about Timsort, which is a sorting algorithm that's pretty similar to my paper-sorting method. It's cool how many sorting algorithms in computer science can also be used in the real world.
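That run-merging approach is essentially natural (bottom-up) merge sort, the core idea Timsort builds on. A rough Python sketch of that idea, not of Timsort itself:

```python
def merge(left, right):
    """Merge two sorted stacks into one sorted stack."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    return out

def natural_merge_sort(values):
    """Split the input into already-sorted runs, then repeatedly merge pairs of runs."""
    runs, run = [], [values[0]] if values else []
    for v in values[1:]:
        if v >= run[-1]:
            run.append(v)
        else:
            runs.append(run)
            run = [v]
    if run:
        runs.append(run)
    while len(runs) > 1:
        runs = [merge(runs[i], runs[i + 1]) if i + 1 < len(runs) else runs[i]
                for i in range(0, len(runs), 2)]
    return runs[0] if runs else []

print(natural_merge_sort([5, 6, 1, 2, 3, 4]))  # [1, 2, 3, 4, 5, 6]
```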
It's insane how good these videos are compared to the view count. You deserve millions with these videos.
Thank you so much!
YOU ARE GOD DAMN RIGHT
Slowsort is cool and all, but I'm surprised there was no accompanying mention of worstsort - what if we cranked the inefficiency factor up to *nested factorials*
the fun thing about sorting networks is that, by dedicating a piece of hardware to each comparison, you can pipeline the sort so that the system as a whole sorts a full list per cycle. It's the kind of thing you might find on an FPGA. Although it'll have far higher latency than one cycle, the throughput would be absurd.
Radiobogo sort:
Place some tritium rods on your CPU and wait for the array to be sorted by the beta particles
another joke sorting "algorithm" that i enjoy is Intelligent Design Sort; it goes like this
1: observe that there are n! possible orders that the array could have been in
2: further observe that this means that the odds of it being in the order that it is by sheer chance are 1/n!
3: this is so improbable that the possibility can be discounted.
4: therefore, an intelligent designer must have put the elements of the array in that order for reasons beyond our understanding.
5: therefore, the list is already in the correct order.
this "algorithm" technically has a time complexity of 1, but, well...
The bitonic comparison graph has another application within Factorio as a belt balancing graph.
literally saw that too, especially the larger balancers like 32 by 32 and above.
As a geometry fan, I am incredibly excited for your next video
me too
Excellent video!
Not really covered is the significance of any/all knowledge about the data you're sorting.
Almost sorted already, dense/sparse data, etc. can be used to choose an appropriate algorithm.
Also, briefly mentioned, is the significance of the machine (if any) you're using to sort. "By hand" vs "ordinary language/computer" vs "machine level coding" vs "parallel processing".
17:07 There actually is a more practical implementation of the Quantum Bogo sort - just send the list to enough computers to cover every single permutation, then only return the result from the one that got it sorted. This way you reduce the number of universes you need to one. Extremely effective, no sci-fi tech needed.
I saw one called crow sort once, it only works on integers with no duplicates. You read each value, place the value in the index for its value, then delete each empty spot. It does zero comparisons besides empty checking. Kinda fun.
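A sketch of "crow sort" as described, assuming non-negative integers with no duplicates (the name is the commenter's; the slot array is the only real cost):

```python
def crow_sort(values):
    """Drop each value into the slot matching its value, then squeeze out the
    empty slots. No comparisons beyond the 'is this slot empty' check."""
    slots = [None] * (max(values) + 1) if values else []
    for v in values:
        slots[v] = v
    return [v for v in slots if v is not None]

print(crow_sort([7, 2, 9, 4]))  # [2, 4, 7, 9]
```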
This video was awesome, please go over bogobogo sort and miracle sort in the next video, I find them hilarious.
they did it!
obsessed with the small stipulation that quantum bogo sort isn't stable
thumbs up for making baiai look like the subset of odd-even that it is
Can't wait for part 3!
You really think he could make 6 of these? Gee golly I hope so(factorial)
@@ClementinesmWTF they go by they/them btw
@@m_affiliates ok…noted. But also, wooosh
My dad knows Tim Peters, who made Timsort. It is based on insertion and merge sort, and is also stable. It's fairly fast.
Can you do miracle sort too? Amazing video btw, I learned a lot.
I think that the pairwise sorting network is the same as the odd-even except that instead of merging 2 adjacent sorted lists, it merges 2 interleaved lists.
So the "divide" part of divide-and-conquer is done by splitting the data according to the least significant bit of the indices, instead of the most significant one.
Also, it wins extra points by being the prettiest.
Miracle sort plz. I mean at least the chance of an alien messing with it is an O(1). XD
The quality is insane, you deserve orders of magnitude more subscribers!
Here's another sorting algorithm: For each element in the list, you start a new thread. The thread sleeps for as many microseconds (any other unit of time would also work) as its value, then writes its value back to the list.
Here is my analysis of this algorithm: It isn't stable, its space complexity is O(n), and the time complexity depends on the biggest element in the list.
It also isn't guaranteed to sort the list since computers aren't perfect machines.
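That thread-per-element idea in a minimal Python sketch (the 0.01 s scale factor is an arbitrary assumption, and as noted, the result isn't guaranteed):

```python
import threading
import time

def sleep_sort(values):
    """One thread per value; each sleeps in proportion to its value, then
    appends it. Not stable or reliable - scheduler jitter can reorder
    values that are close together."""
    result, lock = [], threading.Lock()

    def worker(v):
        time.sleep(v * 0.01)
        with lock:
            result.append(v)

    threads = [threading.Thread(target=worker, args=(v,)) for v in values]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return result

print(sleep_sort([3, 1, 4, 1, 5]))  # usually [1, 1, 3, 4, 5]
```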
I think it has a name: Sleep Sort. Not sure if that's what you're referring to tho.
This algorithm could have big trouble when sorting 1, 2, 9999.
This algorithm can't sort negative values either.
Once quantum computers become common, quantum bogo sort wouldn't even be a joke algorithm - and with what I know of quantum programming, I believe if you entangle all the elements you can even do the steps as a single operation each, possibly making it even faster than O(n). The "destroy the universe" thing is a bit tongue-in-cheek, though, because what it's really doing is setting up an interference pattern that destroys the possibility of that universe ever happening in the first place when the wavefunction is collapsed.
It has been proven that quantum computers cannot do comparison sorts faster than n log n, so time-wise they are no better.
However, there are space-wise improvements over the classical algorithms.
I've always been fascinated by trees, so the tree sort is one of my favorites. But instead of red-black, I'd go for something like the AVL tree. But anyway, great video with some great graphics. Thanks so much.
Legal note for Quantum Bogosort: since its kill-count complexity is k(n!-1), where k = the population of Earth, it potentially violates the Geneva Conventions and is overall ethically dubious
Yay Quantum Bogosort!
That's awesome, and without the third part it will never be complete.
FACTORIO BALANCER SORT- I MEAN BITONIC SORT
wait thats so true
Odd-even sort works well if it's parallelized such that every pair in an iteration is compared, then swapped if needed, at the same time.
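Reading this as odd-even transposition (brick) sort: within each phase the compared pairs don't overlap, so on parallel hardware every compare-and-swap in a phase could run simultaneously. A sequential Python simulation of those phases:

```python
def odd_even_transposition_sort(a):
    """n phases; even phases compare pairs (0,1),(2,3),..., odd phases
    compare (1,2),(3,4),... The pairs within a phase are independent."""
    a = list(a)
    for phase in range(len(a)):
        start = phase % 2
        for i in range(start, len(a) - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(odd_even_transposition_sort([5, 3, 8, 1]))  # [1, 3, 5, 8]
```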
I came here to ask that you actually put (Part x) on your video titles please. I watch YouTube on my TV and it is hard to get to the description, and not having the Part 2/3/4/etc. in the title makes finding the right video difficult. Thank you for your time and videos. They are very informative and I am enjoying them. Keep up the good work.
Exchange sort is also called sandpaper sort, which is what I've been calling it
I was expecting patience sort to be "wait for cosmic rays to sort the array".
That's kinda the miracle sort.
I came up with an algorithm as a joke, but I think it's decent: grab the last item of the list and place it into an auxiliary array, then compare it with the new last item; if that one is smaller, place it to the left, otherwise place it to the right, and repeat until the original list is empty, then iterate. The worst case scenario is something like 10,1,2,3,4,5,6,7,8,9: it moves the 10 eight times, scrambling the other numbers as it goes (though it does re-sort some of them). I tested it by hand on a few small lists: with 10 elements the worst case took about 27 rounds and a random 10 took about 10 rounds; the worst case with 4 was 7 rounds, and with 4 there were also about 10 cases needing 1 round and 5 or 6 needing 2 rounds.
reminds me of an algorithm i came up with which had O(n) best case but O(2^n) average and worst case time complexity, and O(1) space. don't remember if it was stable
But what's the algorithm???
while the list is not sorted, let k be the position of the first element which is smaller than its predecessor (i.e. not in order), then rotate elements 1 to k to the right by one
worst case for this is the sorted list rotated one to the right, so for four elements it would be 4 1 2 3; it takes about 2^(n - 1) steps with n elements
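A direct Python sketch of that procedure (0-indexed, with the prefix rotation done by a slice):

```python
def rotation_sort(a):
    """While unsorted: find the first element smaller than its predecessor
    and rotate everything from the front up to it one step to the right."""
    a = list(a)
    while True:
        for i in range(1, len(a)):
            if a[i] < a[i - 1]:
                a[:i + 1] = [a[i]] + a[:i]  # rotate the prefix right by one
                break
        else:
            return a  # no descent found, so the list is sorted

print(rotation_sort([4, 1, 2, 3]))  # [1, 2, 3, 4]
```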
@@Nick-pu8jg Wow, I'd bet it looks cool when visualised. And I have an idea for optimising it (maybe). I thought of doing something like cocktail sort, where you go forward and back. So what if this forward-and-back thing was added to your sorting algorithm, and the part that goes backward would of course look for the first one that is bigger than its predecessor? Maybe this will work, maybe not
@@Breuhh never visualized it so would also be interested to see what that'd look like, could see if doing it back and forth works out later ig
16:02 What's notable about slow sort is that it's (theoretically) the slowest sorting algorithm that still makes progress. Obviously bogo sort is slower, but that's because bogo sort is constantly resetting, and a faulty sorting algorithm that simply never stops is even slower than bogo sort, but it is also not making progress. Slow sort is funny in that way: as inefficient as possible, without sitting down and twiddling your thumbs.
I love your videos so much! You are so good at explaining math and science! 😄
this video is awesome ! i watched both parts and everything is explained really well, i like the animation style, and they're pretty entertaining ! also a fan of the nb flag pin
ill definitely check out your other videos too !
Thank you so much!
I think i accidentally discovered a variation of one of the Sorting Networks when trying to figure out how you could best run a tournament that ranked every player. (assuming all players not sharing a match could play at the same time, the goal would be to optimize the number of "rounds" which is effectively trying to optimize a parallel sorting algorithm)
My solution was also O(n (log n)^2), where the n is gotten rid of in parallelization. The other sorting networks mentioned here were of the same time complexity, so I hypothesize (log n)^2 is the best you can do for a parallel sorting algorithm, but I don't know for sure.
Apparently, there is an nlogn network called the AKS network. But it does not become faster than the odd even network until n is above 302 sextillion.
@@Kuvina call that ass-ymptotic complexity idk
The bitonic one reminds me of the pixel sorting video by acerola
Bogosort can be improved by creating every possible unique permutation and deleting the incorrect ones by trial and error, using knowledge of what the sorted state looks like.
1:05 cool colours, is it a coincidence?
13:00 Blue and pink on white. I see what you did there. 😏
I prefer the Stalin sort. It's also an O(n) sorting algorithm, and is less destructive than the Quantum Bogo sort.
I like how Strand Sort is Stalin Sort except everyone is saved
The Wikipedia article on slowsort says it's stable, but the table on Wikipedia says it is not. I don't know enough about these things to look into this discrepancy myself, however.
Amazing video!
wait stooge sort is n^e?
Fantastic as always :D
wouldn't quantum bogo be O(1)? it's fixed time regardless of size of array, right?
I believe it takes O(n) time to check whether the list is sorted, but I'm not actually sure about how quantum hardware works.
@@Kuvina Fair, I've always seen O(x) as the sorting time, from what I remember of university, so an algorithm that sorted a list in fixed time regardless of size would be O(1). I could totally be misremembering though.
The biggest joke is that quantum bogo is a valid interpretation of some real quantum sorting algorithms.
It is just that rather than destroying the universes that output incorrect sorts we duplicate the universe that outputs a properly sorted list enough times that a universe chosen at random will almost certainly be one where the result was correctly sorted. The various many worlds interpretations of quantum mechanics can get really strange.
ok but how does one destroy the universe?
asking for... reasons
9:30 instead of merging between ops, maybe try allocating an n^2 destination array and then do a sort on each column. repeat this by transposing the matrix over and over again, using gravity and then sorting until you have a sorted triangle of values. At each step, the greatest value will be in the corner which can be taken away. This is O(unpleasant).
The way he says "auxiliary" is off the hook
I made my own sorting algorithm with terrible space complexity. It creates 2 auxiliary arrays, one keyed to the smallest element and one keyed to the largest element in the list. It then places each element of the list into the auxiliary list whose key it is closest to (is it closer to the lowest or the highest value?). It then destroys the original list and runs recursively on each auxiliary list, being sorted when the length of each auxiliary list is 1.
Does this already have a name?
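I don't know a standard name for it, but it behaves like a quicksort that always pivots on the midpoint between the current minimum and maximum. A sketch with base cases added so duplicates can't recurse forever (the function name is made up):

```python
def extremes_split_sort(values):
    """Partition by whether each element is closer to the current minimum
    or the current maximum, recurse on both halves, then concatenate."""
    if len(values) <= 1:
        return list(values)
    lo, hi = min(values), max(values)
    if lo == hi:  # all elements equal - nothing left to split
        return list(values)
    low_side = [v for v in values if v - lo <= hi - v]   # ties go to the low side
    high_side = [v for v in values if v - lo > hi - v]
    return extremes_split_sort(low_side) + extremes_split_sort(high_side)

print(extremes_split_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```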
could you do miracle sort?
it does nothing, but just checks in on the array once in a while, with no goal other than to wait for a miracle
technically it has zero operations, but really takes forever
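A minimal sketch, with the check interval as an arbitrary assumption (it returns immediately only if a miracle, or an already-sorted input, has occurred):

```python
import time

def miracle_sort(a, check_every=60.0):
    """Do nothing to the array; just peek at it now and then and hope."""
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        time.sleep(check_every)  # wait for a miracle (or a stray cosmic ray)
    return a

print(miracle_sort([1, 2, 3]))  # returns immediately; anything unsorted may take a while
```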
Its table row should look like: n, inf, inf, memory: 1, stable: no
you could do a variation of miracle sort where there are one, arguably two, operations: moving a piece of uranium next to the memory and retracting it once sorted.
This way it could theoretically actually work on a human timescale
i think you can improve stooge sort a little by using 2*floor(n/3) for the final step
There is an easy way to make gnome sort faster: When the gnome finds a piece to swap, then write the index of that piece to a variable. Then, when the gnome is done swapping back, he can just look at the index in that variable and go back there.
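A sketch of that optimization: remember where the gnome started walking back, then jump straight back there after the swaps.

```python
def gnome_sort_with_memory(a):
    a = list(a)
    i = 0
    while i < len(a):
        if i == 0 or a[i] >= a[i - 1]:
            i += 1
        else:
            remembered = i              # where the out-of-place element was found
            while i > 0 and a[i] < a[i - 1]:
                a[i], a[i - 1] = a[i - 1], a[i]
                i -= 1
            i = remembered + 1          # jump back instead of walking forward again
    return a

print(gnome_sort_with_memory([4, 2, 7, 1]))  # [1, 2, 4, 7]
```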
im pretty sure that is just insertion sort lol
Yeah the video was good, but they actually continued the music at the end and cut it off at a point that sounded good, which no other YouTuber ever does.
Odd even looks cool. Agree?
impressive work!
Quantum Bogo: I need to sort this list of 100 names, I guess I need to destroy 100!-1 universes to do it
Congrats on your graduation! 🎉
i kind of want bogo^x sort. If the degree x is 1, it just does regular bogo sort; otherwise it sorts the first n-1 values using bogo^(x-1) sort, and if the next value isn't in order it shuffles the list and starts over
I estimated it takes O(n!^x) for bogosort^x
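One possible reading of bogo^x sort, sketched recursively; the exact recursion is my interpretation, and it is only safe to run on very small lists:

```python
import random

def is_sorted(a):
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def bogo_power_sort(a, degree):
    """degree 1 is plain bogosort; higher degrees bogo^(degree-1)-sort the
    first n-1 values, then reshuffle everything if the last value doesn't fit."""
    if degree <= 1:
        while not is_sorted(a):
            random.shuffle(a)
        return a
    while True:
        a[:-1] = bogo_power_sort(a[:-1], degree - 1)
        if len(a) < 2 or a[-2] <= a[-1]:
            return a
        random.shuffle(a)

print(bogo_power_sort([3, 1, 2], 2))  # [1, 2, 3], eventually
```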
8:23 auxiliary array
You should probably also include Stalin sort and vacuous sort (which is just return [];) in part 3 as well
Stalin Sort is basically just Strand Sort except you give up after one iteration
So strand sort is recursive Stalinsort?
I keep seeing Shatter Sort in visualizations. I have never heard of it.
a bit specific, but a less known joke algorithm I like is shell-shell sort, where instead of insertion sorting the sublists, you recursively call shell-shell sort. it's very slow, but has some neat properties, like if you use 3 coprime gaps it drops down to N (log N)^2 time complexity (while still being slow)
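One way to read shell-shell sort in Python; the gap sequence (7, 5, 3) and the fall-back to plain insertion sort for small lists and the final gap-1 pass are assumptions to keep the recursion finite:

```python
def insertion_sort(a):
    for i in range(1, len(a)):
        j = i
        while j > 0 and a[j] < a[j - 1]:
            a[j], a[j - 1] = a[j - 1], a[j]
            j -= 1
    return a

def shell_shell_sort(a, gaps=(7, 5, 3)):
    """Shell sort, except each gap-strided sublist is sorted by another call
    to shell-shell sort instead of by insertion sort."""
    a = list(a)
    if len(a) <= max(gaps):
        return insertion_sort(a)
    for gap in gaps:
        for start in range(gap):
            a[start::gap] = shell_shell_sort(a[start::gap], gaps)
    return insertion_sort(a)  # the final gap-1 pass

print(shell_shell_sort([9, 4, 7, 1, 8, 2, 6, 3, 5, 0]))  # [0, 1, ..., 9]
```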
17:17 quantum bogo using stalin sort
hey. whenever you do part 3, could you include bogobogosort? its like bogosort but worse. thank u and cool video
-liz
On the quantum Bogo sort: if the universe wasn't destroyed by using the sort, would it be possible to remove the timelines that are unstable as well?
what about grail sort?
thank you youtube algorithm for showing me this 😭🙏
Brilliant ❤
Can you do block sort/block merge sort/wikisort?
Why is exchange worse than selection?
this one is amazing too! see my other comment for suggestions for a part 3 ^^
5:36
So just n·log₂(n) - (√2)·n
I never noticed it was so close to sqrt(2). But I looked into it, and I think the -1.415 actually comes from log_2(3) - 3.
en.wikipedia.org/wiki/Merge-insertion_sort
So underrated
I was hoping to see pigeonhole sort
What about pigeonhole sort?
What about fire sort?
Why does no one test sorting algorithms on sets of numbers that aren’t in fixed increments or integers? I feel like a lot of sorting algorithms would only work on whole numbers that have no gaps.
It's time to go to bed,
Welllllllllllll just one more video 😂
part 3?
Gravity sort please
Gravity sort is in part 1!
ua-cam.com/video/AAwYzYkjNTg/v-deo.html
@@Kuvina Now where is stalin sort?
36159
|
\/
369
So if a number is in the wrong order, it just destroys the numbers that are out of place
You missed the greatest sorting algorithm of all time, the Stalin sort
Mom i cant sort anymore i am tired.
cheese!😁
_hi_
how about you just set everything to 0, it's sorted, easy as that
you forgot stalin sort. Bad video.
get a good mic
Oooh boy you haven't been on the internet enough if you consider this a bad mic.
that's the first hate comment of that type that I've seen