Great video. Now, I have a question. Since the sorted array is one of the permutations, I wonder: is there any sorting algorithm that uses symmetric groups and group theory?
That’s a really interesting question! I am not aware of any algorithms that explicitly use them, but I think they explain the basis of how sorting algorithms work. Maybe theoretically all sorting algorithms use them? Cheers for this question mate, and cheers for watching :)
If you claim that O(nlogn) is slow, take an O(n!) algorithm. We can write such an algorithm for sorting, because sorting is in fact choosing the permutation which satisfies some condition.
i think your audio is a bit weird maybe? either your mic is set to omni-directional or your room is made of glass, brick and tiles.. perhaps some acoustic foam would help, and a reverb adjustment? great video regardless. nice job mate.
Radix sort is the fastest sorting algorithm for two reasons: its number of data bit accesses is the theoretical minimum, and it uses the spatial and temporal locality of the memory cache to the maximum extent.
I noticed on those videos which show the sorting algorithms and have sound effects along with them, that the Radix sort sounds the kewlest. ;) Edit, here's the one I am talking about: ua-cam.com/video/kPRA0W1kECg/v-deo.html
@@WhatsACreel Yeah, I like how some of them sound. Also, visualizing the sorts like this is fascinating and shows how they work. Some of them are amazing at how they do things.
To sort your 4 elements, all you did was ask 5 questions, so it looks to me like you only need n+1 iterations/checks/comparisons. What about swapping numbers which are in reverse order until the last iteration doesn't have anything to swap anymore? If you are lucky and the list only needs 1 swap, that kind of sort is quite fast, and it's also a comparison sort. Just check if value 1 is larger than value 2. If so, swap numbers 1 and 2; if not, do nothing. Then move on to numbers 2 and 3, then 3 and 4. Don't forget to record the number of swaps you make while running through the list. Then start again from the first number and compare it against number 2, and so on. If the number of swaps is still 0 at the end of the list, it's sorted. For a small list of any data, be it numbers or text, I find it fast enough to use even in a real-time game, like sorting the inventory of an MMORPG.
I think you'd go through the list and check if every number is greater than the previous one. I don't think there is any faster way than that, so I guess it's O(n)? Cheers for watching mate :)
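The swap-until-no-swaps routine described above is essentially bubble sort. A minimal sketch in Python (the function name is just for illustration), using the swap counter the comment suggests:

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs until a full
    pass makes no swaps; returns a sorted copy of the list."""
    items = list(items)  # don't mutate the caller's list
    while True:
        swaps = 0
        for i in range(len(items) - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swaps += 1
        if swaps == 0:  # a clean pass means the list is sorted
            return items
```

On an already-sorted list this finishes after a single O(n) pass, but its worst case is O(n^2) comparisons, so it only really pays off for small or nearly-sorted lists like a game inventory.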
Timsort is frequently faster than O(nlogn). When the data is partially ordered already, Timsort is very good at exploiting that ordering. And inputs to a sorting algorithm are frequently not 100% random.
@@WhatsACreel Yes, it's a hybrid of a binary insertionsort, and a tweaked mergesort that manages to be in place. It's stable, O(n) best case, O(nlogn) average case, O(nlogn) worst case. I've got an m4-ized pure python and cython version at stromberg.dnsalias.org/svn/sorts/compare/trunk/timsort_reimp.m4
Hello! Great videos! Please tell me, what is the picture behind you on the right (your left) the one with the red border? At first I thought MC Escher, but screen capping and blowing up, doesn't look like it.
My father drew both the pictures in the background. The left one is a koala playing cricket, and the right one is a man waiting at a train station. Dad is easily the most talented artist I have ever met - he produces photo realism with nothing but Faber Castell 6B pencils!! I will never know how he does it. He says it’s just patience, and off he goes drawing like that! hahaha Amazing man :) Maybe I will include a closeup of these images in a video at some point? They are extraordinary! Cheers for watching :)
@@WhatsACreel PLEASE do include those, and give your father my best! Your sorting algorithm videos made me REALLY understand how important computer SCIENCE is in our world!
12:40 there is no guarantee that this is "the single sorted permutation", since a comparison can have 3 results, but you are always eliminating after asking only a true-or-false question; you may eliminate other valid solutions along the way
@@WhatsACreel I'm sure most caught this, but I get annoyed when I notice a mistake or inaccuracy and it hasn't been documented in the comments yet, so here's my contribution. Cheers
I think this depends on what the rule of sorting is if there are duplicates in a list. If the rule is to group all duplicate elements together, then the algorithm can simply be extended to first ask whether the two elements are the same, then eliminate all the permutations where the elements aren't grouped together, and THEN ask the greater-or-less-than question.
This is fabulous! How would you explain the conversion of number of comparisons, i.e. Log2(N!), to O(NLogN)? This must be obvious to folks more familiar with Big O notation...which is not me.
Bit late maybe, but there's a thing called Stirling's formula, which states n! ~= sqrt(2 pi n) * (n/e)^n (that's e ≈ 2.718, not 2.7), which is roughly on the same order as n^n. So log(n!) ~= log(n^n) = nlogn
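Stirling's approximation is easy to sanity-check numerically; a small sketch comparing the exact log2(n!) with the Stirling estimate and with n*log2(n):

```python
import math

def log2_factorial(n):
    # exact: log2(n!) is the sum of log2(k) for k = 2..n
    return sum(math.log2(k) for k in range(2, n + 1))

def stirling_log2(n):
    # Stirling: n! ~ sqrt(2*pi*n) * (n/e)^n, taken in base-2 logs
    return 0.5 * math.log2(2 * math.pi * n) + n * math.log2(n / math.e)

for n in (10, 100, 1000):
    print(n, round(log2_factorial(n), 2), round(stirling_log2(n), 2),
          round(n * math.log2(n), 2))
```

The first two columns agree closely, and both grow like the n*log2(n) column, which is why log(n!) is O(n log n).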
Yep, just not great acoustics in this room, unfortunately :( Oh, and my sound engineering skills are pretty hit and miss! Cheers for watching anywho, and have good one :)
Sage advice! Well, I recorded the three vids for this little series already, but I'll look forward to improving my mic technique in the next one after that :)
@@WhatsACreel Yeah, put some soft furniture/pillows etc in the room, and it will absorb the sound quickly as it bounces around. The "soft walls" they often placed in open-space offices have this exact purpose. They prevent reverb and dampens sound - making the office space more quiet. Audio signal filtering might work - but often these kinds of filters can make the sound seem a bit off/weird.
I hope this isn't too dumb a question but I was wondering about the use of big O in the video. Shouldn't it be big_Omega(nlogn) instead O(nlogn) since the nlogn is a lower bound for the complexity?
Implementing exclusion logic in an algorithm without using exclusion in the algorithm itself. The combination of facts/comparisons is much more than the sum of the statements; it's a multiplication.
3blue1brown is maybe the best yt vids on the whole net! Extraordinary talent there :) Thank you for watching my vids mate, I hope you had a good sleep :)
Sorting gets even more interesting when the data to be sorted is too big to fit in the available random access memory... Sure you can "Just go virtual", but that road could lead to really bad thrashing!
But in my research I found that the industry pretty much uses Quick sort, or at least that was my conclusion. Is Radix really more practical than other algorithms?
Yes, comparison sorts are very common! They're easy to implement and flexible - since we only need to define < for our objects and we can comparison sort just about anything! There's benefits to both comparison and radix. Cheers for watching :)
there is a problem with the comparison sorts: it is not true that you can always halve the number of possible permutations, even when the number is even.
Yes, you are right! Sorting 13 elements takes 34 comparisons, but the ceiling of Log2(13!) is 33. Sorry if that was misleading. Sounds like you are aware of the following list: oeis.org/A036604 I did try not to say specifically that 'only' odd numbers lead to an inability to halve. Might have edited it wrong, but I was aware of that point. It's a great topic, really! So funny how the simplest questions lead to ridiculously difficult problems! What is the minimum number of comparisons required to sort 1000 elements? Nobody knows... :) Anywho, great observation, and thank you for watching :)
I remember a proof from school which showed that you can only do slightly better than n log(n). I don't remember the exact formula, but I do remember the "pyramid" of comparisons, which you just cannot get around. EDIT: oh yeah!! log(N!)
Big O designates an expression that bounds the exact time complexity from above in the long term, but not necessarily the closest expression. If something is O(n), it is also O(n^2), but the smallest one is usually the most significant. So something like O(log2(N!)) is technically O(Log2(N^N)), since the latter is bigger, but in this case we can extract an N to get the complexity in the video.
You forgot about Stalin sort witch is O(n). You simply go through the array of elements and eliminate ones which are not in order. In the end you end up with sorted array.
Is this a known CS joke? because it's hilarious
Or Hope Sort. You return the original array, hoping it was already sorted.
But on average your time is n!/2
@@MatheusAugustoGames Why not Index Sort? You just sort by the index in the array, and, wow! It's already sorted! It's like Hope Sort but with extra steps!
You also have to destroy all evidence of the 'removed' elements
1:10 This is the fastest sorting algorithm I've seen! EyeballSort is O(n) with NO overhead!
This look at sorting is just brilliant. Thanks for making this video. Eagerly waiting for part 2.
That's great mate! Hopefully I can upload later today. Thank you for watching :)
I wish this was available when I was taking comp sci in college
Thank you, cheers for watching :)
Just found this!
I'm still in Computer Science right now! 😁
I love your enthusiasm! I found your channel a view days ago through a video explaining branchless programming. I was hooked 😀 you gained yourself a longterm subscriber 😎😃
That's great, welcome mate! Thank you for watching :)
I'm basically just going to watch all your videos to prepare me until I get a real job since I graduated university. Love your work and thank you for the clear and concise explanation of everything you do!!!
Dude - I have just finished binge-watching about 20 of your videos back to back, and just wanted to drop you a line to say how fantastic your content is, and to wish you and your family a fantastic Crimbo. I don't know if you have a background in teaching (and I'm too lazy to check), and I'm sure you've heard this a million times before, but you if you haven't already, you should totally consider a career in this. It's impossible to watch your vids without a smile on my face. After today's session, and thanks to your tutes, I've found that Radix sort has reawakened the CompSci beast within me (don't tell the missus). Have a good one mate, and thanks again for everything you're putting out there. Massive appreciation.
I'm glad I know this 2 years before actually taking comp sci, let's see if I forget by then and come back
I will take a bet and say that you will comment here in a few years. I would like you to tag me when that happens, pls :D
Start now. You will be ahead by the time you get there and have lots of free time. I always did weeks of work in 1 day and played games the rest of my time.
Talk about a cliffhanger... How can you go faster than N log(N)?!
Brilliant walkthrough, look forward to parts 2 and 3!
For anyone wondering how he gets from log(N!) to N*log(N): I figured it out. N! is N*(N-1)*(N-2)*(N-3)*...*1, where the last factor can be written as (N-N+1). N*(N-1) is close to N^2, so we approximate it that way. As an approximation, computing N! amounts to multiplying N by itself N times, which can be written as N^N. So log(N!) ≈ log(N^N), and by the log property we can pull the N out, making it N*log(N).
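The approximation above can be checked both ways: n! is at most n^n, and its first n/2 factors are each at least n/2, so log2(n!) is squeezed between (n/2)*log2(n/2) and n*log2(n). A small sketch:

```python
import math

for n in (4, 16, 64, 256):
    exact = math.log2(math.factorial(n))
    upper = n * math.log2(n)            # from n! <= n^n
    lower = (n / 2) * math.log2(n / 2)  # the top n/2 factors are each >= n/2
    assert lower <= exact <= upper
    print(n, round(lower, 1), round(exact, 1), round(upper, 1))
```

Both bounds are of the form c * n * log n, which is why log(n!) and n*log(n) have the same big-O growth.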
Oh now i finally understand why logn appeared all the time! Fantastic video
That's really great! Thanks for watching :)
This is exactly how I'd like to have been taught comparison sorts.
just going to appreciate the choice for green. i'm color blind and you cannot believe how often people use that super light green that's indistinguishable from yellow for me.
You managed to present this topic in an interesting (even exciting!) way and brought me a deeper understanding of what sorting is about. Thank you!
I love your explanations, I love your attitude, and I love the good energy. I actually watched the radix series in reverse order (oops) but still enjoyed it and found a lot of good information nonetheless. I'm in my last semester as a Computer Science student, and I can tell you they never teach us like this in college. I had to get my information and skills from watching guys like you explain it, instead of listening to a professor saying "Sorting cannot be done in less than O(nlogn), write that down and memorize it". Again, I hope you keep it up and have a great day!
This is actually a really refreshing and nice view on sorting algorithms! I know about sorting but you brought something new to the table. Subscribed!
The explanation for why number of permutations is a factorial was so simple and so intuitive that it finally clicked for me WHY that is. Previously I knew that was the case and sorta took it as granted without actually getting it. I just sorta memorized it and accepted it as the way things were
My mind has been blown. Thanks for blowing it for me.
Wow, thanks! I have been dabbling with permutations of unique elements of tree names for weeks, and I watched this only now?!? Radix Sort popped up in my recommended a week ago. I thought your thumbnail was just too promising to be true 😂
The amount of background you provide is pitch perfect, for me, at least. Thank you.
just wow!!! even i knew this concept but i learned it the hardway back 2 days ago and now i found your video....
sorry u are underrated
I've never thought of sorting algorithms that way! Thanks a lot of this new insight, and I sure can't wait for the next parts!
Glad you liked it! Thank you for watching :)
Ever since your sorting race video I was so curious about radix sort! What an awesome video great work :D
Edit: Can't wait for the next video!
Radix Sort is an amazing algorithm! Hopefully I can upload later today. Thank you for watching :)
You could have just covered the whole stability topic in one sentence. And maybe one phrase for why it is useful.
"Sorting is said to be stable: WLOG if index of 436[green] < index of 436[yellow] before sorting, then index of 436[green] < index of 436[yellow] after sorting."
"It's a useful property, for example, if you want to use said sorting algorithm on the same set of numbers multiple times, e.g. sort a sequence digit by digit, starting at the least significant digit: 42, 18, 12 => 42, 12, 18 => 12, 18, 42"
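A digit-by-digit sort like this relies on stability: when each pass is stable and runs from the least significant digit up, you get LSD radix sort. A tiny sketch using Python's sorted(), which is guaranteed stable:

```python
nums = [42, 18, 12]
for d in range(2):  # pass 0 = units digit, pass 1 = tens digit
    # sorted() is stable: ties keep their previous relative order,
    # so the work of earlier passes is preserved
    nums = sorted(nums, key=lambda x: (x // 10 ** d) % 10)
print(nums)  # → [12, 18, 42]
```

After the units pass the list is [42, 12, 18]; the stable tens pass then leaves 12 before 18 and yields the fully sorted list.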
If the distribution is random, then yes O(N * log N) is the best you can do for comparison based sorts.
But if the distribution is not random, and the algorithm notices, it can take advantage of that and do less sorting; it can be O(N). An example of that is natural merge sort, where the sort takes advantage of any natural runs (or reversed runs, which it reverses).
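A hedged sketch of that idea in Python (function names are just for illustration): detect the existing ascending runs first, then merge them pairwise, so a fully sorted input costs only the O(n) scan.

```python
def merge(x, y):
    """Merge two sorted lists, keeping x's elements first on ties (stable)."""
    out, i, j = [], 0, 0
    while i < len(x) and j < len(y):
        if y[j] < x[i]:
            out.append(y[j]); j += 1
        else:
            out.append(x[i]); i += 1
    return out + x[i:] + y[j:]

def natural_merge_sort(a):
    if len(a) < 2:
        return list(a)
    # 1. split the input into maximal ascending runs
    runs, start = [], 0
    for i in range(1, len(a) + 1):
        if i == len(a) or a[i] < a[i - 1]:
            runs.append(a[start:i])
            start = i
    # 2. merge runs pairwise until a single run remains
    while len(runs) > 1:
        merged = [merge(runs[j], runs[j + 1])
                  for j in range(0, len(runs) - 1, 2)]
        if len(runs) % 2:
            merged.append(runs[-1])
        runs = merged
    return runs[0]
```

On already-sorted input step 1 finds one run and the merge loop never executes; on random input there are about n/2 runs and it behaves like ordinary O(n log n) merge sort.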
Do CPUs use counting sort for small data?
Your enthusiasm is infectious
Eagerly waiting for part 2! Never really thought about turning sorting into a tree
Don't know the first thing about compsci but this is a very good video, thank you for making this, man
At first I read the title as "why is radix sort of fast?"
Close enough, isn't it?
Always important to cover the basics to understand why something is as good as it is. Again, thank you. (Just had to go through all these and upvote them all. Don't want YT recommending pt3 before pt 1 :D )
That's very nice.
I'm looking forward to next videos about this!
Hopefully I can upload later today! Thanks for watching mate :)
Excellent graphics! Perfect pace!
Cheers mate, thanks for watching :)
I was taught that n*log(n) was from recursively dividing the array in half, sorting the halves and merging, which needs a depth of log(n) multiplied by a linear time to merge sorted arrays, but this permutation approach is far more elegant.
That's a merge sort, different approach, same time complexity.
@@clefablelover7801 Quicksort is also roughly dividing the array in half and sorting the halves, just without merging. So the same logic applies to it.
you make those depressing themes so fun
Your videos on each topic are highly detailed yet very easy to understand. I just found out about your channel and love watching the explanations. I hope you will keep making more! :)
Beautiful explanation and visualisation! I always get confused by big-O, but I think I'll remember this one for a long long time :)
time well spent! now I understand better what O(NlogN) means.
Really interesting video. Not sure how I managed to subscribe to your channel, but I'm glad I did. Although I already knew that a creel was a basket for catching lobster.
Ha! Glad you came back mate! I will be certain to employ a creel when I next find a lobster :)
I think you finally made sorting interesting for me. Would've loved a detailed discussion on why the time complexity of the comparison sort is log2(n!) though.
The time complexity of the comparison sort is not log_2(n!) because there is no such thing as "the" comparison sort. Comparison sorting is a class of sorting techniques (which includes very different methods like selection sort and merge sort), and not all have the same complexity. But none can do better (in the worst case, or even on average) than log_2(n!) , for the reason explained in the video (each comparison can at best guarantee halving the number of remaining viable permutations).
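The information-theoretic bound described here is easy to tabulate (a sketch; the function name is just for illustration). Note it is only a lower bound: 13 elements, for instance, actually need 34 comparisons, one more than the bound.

```python
import math

def comparison_lower_bound(n):
    # ceil(log2(n!)): each yes/no comparison can at best halve
    # the set of remaining candidate permutations
    return math.ceil(math.log2(math.factorial(n)))

for n in range(2, 14):
    print(n, comparison_lower_bound(n))
```

Any comparison sort must make at least this many comparisons in the worst case, whatever its strategy.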
Nice video, but you should consider using audio absorbers cause the reverberation is quite massive
Really insightful video once again mate. Keep it up!
Cheers mate, thank you for watching :)
Great explanations so far! Looking forward to seeing the next part! :)
That's great! Hopefully I can upload later today. Thank you for watching :)
@@WhatsACreel perfect👍
I don’t even study computer science.. What is Radix.. and yet here I am learning about sorting algorithms
Mind blowing explanation! Can't wait for part 2!
Great to hear mate! All 3 vids are up now! Cheers for watching :)
*Edit : Confusion Solved by "Luca". Explained by "Some One"*
At 13:32 it was said that the number of paths can be odd. But I think it can't be, because mathematically we are taking a factorial ( ! ), and the result of a factorial can never be an odd number. Simply put, we can always divide the number by 2 in the end. Please consider it, and tell me if I'm wrong.
11:17
At the beginning there is an even number of paths. But since there are only a few n for which n! is a power of 2, there is always a point where the number of remaining paths is odd.
@@Lucaazade Oh okay, I thought that was about the "Total paths". Thanks for the correction.
@@oliwiermoskalewicz1988 Oh okay, Got the point. Thank you so much. I was considering the whole paths (total paths)
Hey, great video! I have been meaning to ask you what you think of RISC V, considering its lack of conditional moves and its vector extensions?
I haven’t played around with a risc v processor yet. I like the newer SIMD proposal more than the old one, where they were going to use the regular registers for SIMD. It seems to have some really great ideas! Not sure when or if the vector extensions will be part of the standard, but I think it all looks really interesting!
Great question! Hopefully at some point we can explore other Assembly languages on this channel. I’d love to cover Atmel, PIC and ARM at some point.
Well, thank you for watching :)
@@WhatsACreel Thanks!
B[it operations] expansion will have cmovs, I think
This is really really really well explained
This is the kind of thinking that I like to do on my own, but I'm always partially stuck at the beginning or the end because I'm missing one critical point: thinking about the identity of the method, and the standard objective of what it is I want to measure at any given point of the figuring-out part, or of the sorting process. Really good video man
Please send part 2 😭 great video
really glad that radix is getting some attention
Amazing algorithm!! Cheers for watching mate :)
@@WhatsACreel No worries. Great content! Could do with a better mic or mic settings though. I gave a talk about radix sort myself ua-cam.com/video/C65vYWeN7-k/v-deo.html
Great talk, looks like a very interesting use case! You nailed it :)
I’ve already recorded the 3 vids for this mini-series, but I will look into improving my microphone technique for the future. Thank you for the advice :)
Fantastic video, helps give another perspective to look at comparison sorts, very useful!
thx youtube algorithm recommend this awesome video
Pretty cool explanation of nlogn
I was taught that the Bubble Sort was the easiest sort to understand but the least efficient. However, there was no question that it produced a correctly sorted result. Surely any sort will work, no matter how slow it is and how much RAM it uses. Later I learned the Quick Sort. As the name implies, it's pretty good and, using recursion, is also easy to code.
In my opinion bubble sort should not be used as a sorting algorithm for teaching purposes. I do understand that as an introduction intuitive algorithms are discussed first. But I think that bubble sort is not intuitive at all (and also has no application). I mean ever sorted a deck of cards with bubble sort?
15:00 earned my sub
Great visualization
Philosophically Wonderful
Theoretically Beautiful
that was an amazing explanation
I feel like you should've covered "nearly sorted lists" in your sort olympics. In many cases, we have to sort after each data insertion and this metric is the most important.
That's a really good topic! Cheers for the suggestion! And thank you for watching :)
Just use insertion sort for that case, it's nearly linear for mostly sorted lists.
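A quick sketch of why insertion sort is nearly linear on almost-sorted input (plain Python, nothing fancy):

```python
def insertion_sort(a):
    """Sort a list in place. On nearly sorted input the inner while
    loop almost never runs, so the sort approaches O(n)."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements larger than key one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

Each out-of-place element only costs as much as the distance it has to travel, which is tiny when the list is already mostly in order.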
13:31 With the exceptions of 0 and 1, all factorials are actually even! So you actually should never have an odd number of paths (except in the cases of 0 or 1, where you'll know your list is technically already sorted anyway).
That only guarantees that the *first* operation is even. Every factorial greater than 2! will have at least one odd prime factor, so you'll always have *some* uneven splits.
Fantastic as always
Thank you! Cheers for watching :)
Very good video but how bad is the sound? I strongly suggest you put some absorbing material on the walls and keep that mike a little closer (or get a dynamic one that might be better suited for your studio)
Yep, I messed up the audio in some of these! There was a couple of vids where I lost the audio from the mic all together and had to use the camera mic! The audio is improving bit by bit. Thanks for the suggestions, and thanks for watching :)
@@WhatsACreel cool, I know a bit about audio because my son wants to be a YouTuber, and the single best thing we did to improve his videos was to take care of the sound. If you used the camera mic, it explains why the sound is so terrible 😭 And yes, your videos are otherwise VERY good 😊😊
It's tangential to your video but when it comes to "why are there n! permutations?" I like to explain it inductively.
How many ways are there to arrange one shape, a circle? Well, one. It's there. Our answer is 1 = 1 (this will make more sense later).
How many ways are there to arrange two objects, circle and triangle? Well, two. First we take our arrangement of the circle in the back and just put the triangle in front. Then we put the circle in front and arrange the triangle in our one way. Exciting stuff. 2 × 1 = 2, so that's our answer.
How many ways are there to arrange circle, triangle, and star?
Let's first write down that we can put the star in front and do both arrangements of circle and triangle. But then we can also replace the star with the circle and arrange the other two in two ways, and then we can replace the circle with the triangle and arrange the circle and the star in two ways. So that's 3 × 2 × 1 = 6 combinations.
With four shapes we can apply the same logic. Do our six variants with one shape in front, then replace it with one of the others. We can do four replacements with the other shapes now. So we have 4 × 3 × 2 × 1 = 24.
This sort of construction might not seem exciting but it is useful for explaining the amount of permutations possible if not all items are distinct (five red marbles, three black marbles, two blue marbles, or maybe two 436).
I realise it wouldn't be helpful for the angle of your video, just consider it neat.
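The induction above can be sanity-checked in a few lines of Python (itertools does the enumerating):

```python
from itertools import permutations
from math import factorial

# Count distinct arrangements of the first n shapes and compare with n!.
shapes = ["circle", "triangle", "star", "square"]
for n in range(1, len(shapes) + 1):
    count = len(list(permutations(shapes[:n])))
    assert count == factorial(n)
    print(n, "shapes ->", count, "arrangements")
```

The counts come out 1, 2, 6, 24, exactly the 1, 2 × 1, 3 × 2 × 1, 4 × 3 × 2 × 1 built up in the argument.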
Mate, love your explanation!
Great video. Now, I have a question. Since, sorted array is one of the permutations, I wonder is there any sorting algorithm using symmetric groups and group theory?
That’s a really interesting question! I am not aware of any algorithms that explicitly use them, but I think they explain the basis of how sorting algorithms work. Maybe theoretically all sorting algorithms use them?
Cheers for this question mate, and cheers for watching :)
If you think O(nlogn) is a slow algorithm, consider an O(n!) one.
For sorting we could even write such an algorithm, because sorting is in fact choosing the permutation which satisfies some condition.
i think your audio is a bit weird maybe? either your mic is set to omni-directional or your room is made of glass, brick and tiles.. perhaps some acoustic foam would help, and a reverb adjustment? great video regardless. nice job mate.
I love the shirt! Oh yeah, as always the content was also good.
Radix sort is the fastest sorting algorithm for two reasons: its number of data bit accesses is the theoretical minimum, and it exploits the spatial and temporal locality of the memory cache to the maximum extent.
Optimal cache usage is actually funnel sort; this has been proven.
I noticed on those videos which show the sorting algorithms and have sound effects along with them, that the Radix sort sounds the kewlest. ;)
Edit, here's the one I am talking about: ua-cam.com/video/kPRA0W1kECg/v-deo.html
Hahaha, wow! Sounds like laser fights :)
@@WhatsACreel Yeah, I like how some of them sound. Also, visualizing the sorts like this is fascinating and shows how they work. Some of them are amazing at how they do things.
Couldn't agree more! For such a simple problem, folks have come up with some crazy ways to solve it!
Best part about radix sort is that it can be easily adapted for processing strings.
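A minimal sketch of what that adaptation can look like: an LSD radix sort over equal-length ASCII strings (assuming fixed-width keys; variable-length strings usually need an MSD variant):

```python
def lsd_radix_sort(strings):
    """LSD radix sort for equal-length ASCII strings: a stable
    counting-style pass per character position, rightmost first."""
    if not strings:
        return strings
    width = len(strings[0])
    for pos in range(width - 1, -1, -1):
        # One bucket per possible byte value; appending keeps it stable.
        buckets = [[] for _ in range(256)]
        for s in strings:
            buckets[ord(s[pos])].append(s)
        strings = [s for b in buckets for s in b]
    return strings
```

Because each pass is stable, earlier (more significant) positions settle ties left by the later ones, so after the leftmost pass the whole list is in lexicographic order.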
This is brilliant you are brilliant
What we actually need is to find in which branch of the multiverse the list is already sorted and get it from there
To sort your 4 elements, all you did was ask 5 questions, so it looks to me like you only need n+1 iterations/checks/comparisons. What about swapping numbers which are in reverse order until the last iteration doesn't have anything left to swap? If you are lucky and the list only needs 1 swap, that kind of sort is quite fast, and it's also a comparison sort. Just check if value 1 is larger than value 2; if so, swap numbers 1 and 2, otherwise do nothing. Then move on to numbers 2 and 3, then 3 and 4. Don't forget to record the number of swaps you make while running through the list. Then start again from the first number, compare it against number 2, and so on. If the number of swaps is still 0 at the end of a pass, the list is sorted. For a small list of any data, be it numbers or text, I find it fast enough even for a realtime game, like sorting the inventory in an MMORPG.
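What's described there is bubble sort with an early exit; a rough Python sketch:

```python
def bubble_sort(a):
    """Bubble sort that counts swaps per pass and stops as soon as
    a full pass makes no swaps (i.e. the list is sorted)."""
    n = len(a)
    while True:
        swaps = 0
        for i in range(n - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swaps += 1
        if swaps == 0:  # nothing swapped this pass: done
            return a
```

Best case (already sorted, or one adjacent swap away) is indeed about one or two O(n) passes; the catch is that the worst case is still O(n²), which is why it only feels fast on small lists.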
What is the fastest way to check whether a list of elements is sorted? Would it still be O(NLogN)?
I think you'd go through the list and check if every number is greater than the previous one. I don't think there is any faster way than that, so I guess it's O(n)? Cheers for watching mate :)
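A minimal Python sketch of that single-pass O(n) check (using ≤ so duplicates count as sorted):

```python
def is_sorted(a):
    """One O(n) pass: every element must not exceed its successor."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))
```

You can't do better than O(n) here, since any unexamined element could be the one that's out of place.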
Awesome video bud
That's good! Thank you for watching mate :)
Timsort is frequently faster than O(nlogn). When the data is partially ordered already, Timsort is very good at exploiting that ordering. And inputs to a sorting algorithm are frequently not 100% random.
TimSort is great!! If it's the one I'm thinking of. It's a hybrid, no? Cheers for watching mate :)
@@WhatsACreel Yes, it's a hybrid of a binary insertionsort, and a tweaked mergesort that manages to be in place. It's stable, O(n) best case, O(nlogn) average case, O(nlogn) worst case. I've got an m4-ized pure python and cython version at stromberg.dnsalias.org/svn/sorts/compare/trunk/timsort_reimp.m4
Hello! Great videos! Please tell me, what is the picture behind you on the right (your left) the one with the red border? At first I thought MC Escher, but screen capping and blowing up, doesn't look like it.
My father drew both the pictures in the background. The left one is a koala playing cricket, and the right one is a man waiting at a train station. Dad is easily the most talented artist I have ever met - he produces photo realism with nothing but Faber Castell 6B pencils!! I will never know how he does it. He says it’s just patience, and off he goes drawing like that! hahaha Amazing man :)
Maybe I will include a closeup of these images in a video at some point? They are extraordinary! Cheers for watching :)
@@WhatsACreel PLEASE do include those, and give your father my best! Your sorting algorithm videos made me REALLY understand how important computer SCIENCE is in our world!
12:40 there is no guarantee that this is "the single sorted permutation", since a comparison can have 3 results, but you are always eliminating after only asking a true or false question, you may eliminate other valid solutions along the way
Yes, sorry, there might be duplicates!
@@WhatsACreel I'm sure most caught this, but I get annoyed when I notice a mistake or inaccuracy and it hasn't been documented in the comments yet, so here's my contribution. Cheers
@@hymnsfordisco Great attention to detail! thanks for the contribution :)
I think this depends on what the rule of sorting is when there are duplicates in the list. If the rule is to group all duplicate elements together, then the algorithm can simply be extended to first ask whether the two elements are the same, then eliminate all the permutations where the elements aren't grouped together, and THEN ask the greater-or-less-than question.
This is fabulous! How would you explain the conversion of number of comparisons, i.e. Log2(N!), to O(NLogN)? This must be obvious to folks more familiar with Big O notation...which is not me.
Bit late maybe, but there's a thing called Stirling's formula, which states n! ≈ sqrt(2πn)·(n/e)^n, and that's roughly of the same order as n^n. So log(n!) ≈ log(n^n) = n·log(n).
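A quick way to see that approximation numerically (a Python sketch, using lgamma since lgamma(n+1) = ln(n!)):

```python
from math import lgamma, log, log2

def log2_factorial(n):
    # lgamma(n + 1) = ln(n!), accurate even for huge n
    return lgamma(n + 1) / log(2)

# Compare log2(n!) with n*log2(n); their ratio creeps toward 1 as n grows.
for n in (10, 100, 1000, 100000):
    exact = log2_factorial(n)
    approx = n * log2(n)
    print(n, round(exact, 1), round(approx, 1), round(approx / exact, 3))
```

The convergence is slow (the n·log2(e) correction term from Stirling fades only relatively), but it makes the big-O claim concrete.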
@@ralphmb980 thank you!
He has a microphone, but why does his voice sound like he forgot to turn the mic on?
Yep, just not great acoustics in this room, unfortunately :( Oh, and my sound engineering skills are pretty hit and miss! Cheers for watching anywho, and have good one :)
That's a really excellent idea! I think it would make a big improvement :)
What's a Creel? make sure you stay on-axis to the microphone. I can't really see but it almost looks like it's pointing at an angle to your voice.
Sage advice! Well, I recorded the three vids for this little series already, but I'll look forward to improving my mic technique in the next one after that :)
@@WhatsACreel Yeah, put some soft furniture/pillows etc in the room, and it will absorb the sound quickly as it bounces around. The "soft walls" they often placed in open-space offices have this exact purpose. They prevent reverb and dampens sound - making the office space more quiet.
Audio signal filtering might work - but often these kinds of filters can make the sound seem a bit off/weird.
I hope this isn't too dumb a question, but I was wondering about the use of big O in the video. Shouldn't it be big Omega(nlogn) instead of O(nlogn), since nlogn is a lower bound for the complexity?
I hadn't even thought of it this way, this is super duper cool. Well done!
Hello Phoenix
just get the first element. by definition a sorted list
Now: why radix sort looks like it'll be fast, but is actually slower than most common sorts on real computers in common use cases.
This reminds me of the inner workings of Hamming codes as explained in the 3blue1brown video. A very strange feeling of similarity.
Implementing exclusion logic in an algorithm without using exclusion in the algorithm itself. The combination of facts/comparisons is much more than a sum of statements; it's a multiplication.
I really need some sleep
3blue1brown is maybe the best yt vids on the whole net! Extraordinary talent there :) Thank you for watching my vids mate, I hope you had a good sleep :)
Sorting gets even more interesting when the data to be sorted is too big to fit in the available random access memory... Sure you can "Just go virtual", but that road could lead to really bad thrashing!
Oh this is a great channel.
Unless... maybe....
I literally can’t concentrate on what you are saying you’ve got stunning look jesus
This guy is amazing.
Why would it be NLogN and not logN? Unless I'm missing something, I don't see how you do N comparisons logN times.
Simply, YES!
Thank you Sir Universe :)
Brilliant!
Amazing, you're amazing! Thank you very much, matey ! DATA DATA DATA not DATA DATA DATA ! haha
Where did that log2(n!) come from?
But in my research I found that the industry pretty much uses quicksort, at least that was my conclusion. Is radix really more practical than other algorithms?
Yes, comparison sorts are very common! They're easy to implement and flexible - since we only need to define < for our objects and we can comparison sort just about anything! There's benefits to both comparison and radix. Cheers for watching :)
there is a problem with the comparison sorts: it is not true at all that you can halve the number of possible permutations, even if the number is even.
Yes, you are right! Sorting 13 elements takes 34 comparisons, but Log2(13!) ceiling is 33. Sorry if that was misleading. Sounds like you are aware of the following list: oeis.org/A036604
I did try not to say specifically that 'only' odd numbers lead to an inability to halve. I might have edited it wrong, but I was aware of that point. It's a great topic, really! So funny how the simplest questions lead to ridiculously difficult problems: what is the minimum number of comparisons required to sort 1000 elements? Nobody knows... :)
Anywho, great observation, and thank you for watching :)
I remember a proof from school which showed that you can only do slightly better than n log(n). I don't remember the exact formula, but I do remember the "pyramid" of comparisons, which you just cannot get around.
EDIT: oh yeah!! log(N!)
Going in: wot is dis comparison sort?
Leaving: oh its just trash
I don't understand how we went from log2(N!) to Nlog(N). Could someone please explain to me?
Big O designates an expression that is an upper bound on the exact time complexity in the long term, but not necessarily the tightest one. If something is O(n), it is also O(n^2), but the smallest one is usually the most significant. Something like O(log2(N!)) is technically also O(log2(N^N)), since the latter is bigger, and in this case we can extract an N to get the complexity in the video.
Great Video :)