You know what I like about the MIT lectures? They tell you the application/use case of what you're being taught. That makes a huge difference for beginners who have no way of visualizing these abstract concepts. Many people who get discouraged with stuff like this aren't able to relate with the content and feel like it's something crazy out there. It's the simple things.
I could not agree more with this. It's the biggest difference I spotted throughout the lecture.
That's not the case for junior or senior level MIT OCW courses. The same professor teaches an advanced data structures course on UA-cam and those are so academic and abstruse that he doesn't write code and only sometimes gives applications. Like his succinct binary trees data structure video: he gives one use (the Oxford English Dictionary), but besides that he just explains its math.
The secret is that some of the professors actually don't KNOW why they are teaching what they are teaching. Professors aren't always allowed to just "profess" what they know these days. This is the grand illusion of academia: many who know what's going on don't teach, and many who teach would simply prefer to spend their time in a laboratory doing research work.
Right you are my friend.
u so real
00:40 graph search
02:00 recall graph
05:20 applications of graph search
10:30 pocket cube 2x2x2 example
20:25 graph representations
20:40 adjacency lists
26:00 implicit representation of graph
29:05 space complexity for adj list
31:05 breadth-first search
34:05 BFS pseudo-code
36:58 BFS example
43:27 shortest path
48:35 running time of BFS
Hey there! There are some other videos in this course playlist that explain the terms used in this one.
- represent graph in Python: ua-cam.com/video/5JxShDZ_ylo/v-deo.html
- adjacency list in Python: ua-cam.com/video/C5SPsY72_CM/v-deo.html
- examples of theta, O, omega: ua-cam.com/video/P7frcB_-g4w/v-deo.html
- what is hashing: ua-cam.com/video/0M_kIqhwbFo/v-deo.html
- python implementation of iterator: ua-cam.com/video/-DwGrJ8JxDc/v-deo.html
20:57: Representation of graphs
31:10: BFS
thanks man
why would you wanna skip? this guys chit chat is excellent, I can listen to it all day :)
it helps me save up some time thx
thanks so much
BF
Can we just take a moment to appreciate how brilliantly the camera work accompanied this already perfect lecture!
Super genius guy!
From Wikipedia
Demaine was born in Halifax, Nova Scotia, to artist sculptor Martin L. Demaine and Judy Anderson. From the age of 7, he was identified as a child prodigy and spent time traveling across North America with his father.[1] He was home-schooled during that time span until entering university at the age of 12.[2][3]
Demaine completed his bachelor's degree at 14 years old at Dalhousie University in Canada, and completed his PhD at the University of Waterloo by the time he was 20 years old.[4][5]
omg can't believe I graduated from the same university as him.
Breadth-first search (BFS) is an algorithm for traversing or searching tree or graph data structures. It starts at the tree root (or some arbitrary node of a graph, sometimes referred to as a 'search key'[1]) and explores the neighbor nodes first, before moving to the next level neighbors.
BFS was invented in the late 1950s by E. F. Moore, who used it to find the shortest path out of a maze,[2] and discovered independently by C. Y. Lee as a wire routing algorithm (published 1961).[3][4]
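For anyone who wants to see the level-by-level exploration concretely, here is a minimal BFS sketch in Python (the adjacency dict `adj` and the vertex names are made up for illustration, not from the lecture):

```python
from collections import deque

def bfs(adj, s):
    """Return a dict of shortest distances (in edges) from s, given an
    adjacency dict mapping each vertex to a list of its neighbors."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:          # not yet visited
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Small undirected example graph.
adj = {'s': ['a', 'x'], 'a': ['s', 'z'], 'x': ['s', 'd'],
       'z': ['a'], 'd': ['x']}
print(bfs(adj, 's'))  # {'s': 0, 'a': 1, 'x': 1, 'z': 2, 'd': 2}
```

The `dist` dict doubles as the "visited" set, which is what keeps each vertex from being enqueued twice.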
One of THE best explanations of BFS I've come across. It's something about the way he explains. Brilliant.
I think Erik's lectures are very good
MIT teachers make student love the subject they teach :)
This is amazing, I am a student of Algorithms at the University of Nevada, Las Vegas. I really appreciate this class, thank you so much for the UA-cam videos MIT!!!!
damn that’s crazy i’m studying cs there right now, im taking algorithms at the moment
His friendly tone makes the revising process so much easier!
"There are more configurations in a 7*7*7 cube than the number of particles in the known universe" 27:35
- Erik Demaine (2011)
So how does anyone solve it?
@@wetbadger2174 by not trying all possible combinations/permutations but trying only the ones that make sense.
Totally appreciate you mentioning the O(n^2/log n) diameter of the n x n x n Rubik's cube!!
Scissors cuts Paper
Paper covers Rock
Rock crushes Lizard
Lizard poisons Spock
Spock smashes Scissors
Scissors decapitates Lizard
Lizard eats Paper
Paper disproves Spock
Spock vaporizes Rock
(and as it always has) Rock crushes Scissors
no
LMFAO
BBT ha
I am sorry, can you repeat that?
His T-Shirt lol
Much better than the newer version. Glad I come back and watch this.
Ayo.... ☠️ MIT FOR A REASON... Such in depth lecture 😁
This guy's lectures really put the Computer Science lectures at UofT to shame. But then again, he is a prodigy; that still doesn't justify why we get grad students as "profs" though. Just my 2 cents.
Well in the UoC, our prof doesn't even go into much depth... The guy in this video is amazing.
They go "Breadth-First" I suppose.
At UIUC, we were taught such stuff in C++ :(
When I was a kid I used a screwdriver to pry the Rubik's cube apart and put it back together solved. That's when my parents knew I would be an engineer and not a mathematician.
My god! His t-shirt also has a graph!! Brilliant!
Yup, and a complete graph at that.
bros before hoes
Time and space complexity

The time complexity can be expressed as O(|V| + |E|),[5] since every vertex and every edge will be explored in the worst case. |V| is the number of vertices and |E| is the number of edges in the graph. Note that O(|E|) may vary between O(1) and O(|V|^2), depending on how sparse the input graph is.

When the number of vertices in the graph is known ahead of time, and additional data structures are used to determine which vertices have already been added to the queue, the space complexity can be expressed as O(|V|), where |V| is the cardinality of the set of vertices (as said before). If the graph is represented by an adjacency list it occupies Θ(|V| + |E|)[6] space in memory, while an adjacency matrix representation occupies Θ(|V|^2).
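As a quick sanity check on the Θ(|V| + |E|) adjacency-list bound: the storage (and the BFS scan) is proportional to |V| plus the sum of all list lengths, which for an undirected graph is 2|E|. A toy example (graph and names are my own):

```python
# Undirected graph stored as an adjacency list (dict of lists).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
V = len(adj)                                      # number of vertices
E = sum(len(nbrs) for nbrs in adj.values()) // 2  # each edge appears twice
print(V, E)  # 4 4
```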
Best lecture on BFS..
Erik Demaine rockss....
Wonderful. I wake up watching these lectures and sleep watching them.
The grind!
This lecture was really "Breadth"-taking :-D ty Erik
oh my.
oh you
SD ex cc
You are breadth-taking.
Personally, I prefer a less “edgy” pun, with greater “depth”. (Haha!, graph humor!)
Love the way he is wearing a t-shirt with 5 vertices and 5 directed edges, which would require a space complexity of O(10) to be stored.
undirected ?
It's a reference from a rock paper scissors game from the big bang theory, so it is directed.
So O(1)
"There are more configurations in this cube than there are particles in the known universe. Yeah. I just calculated that in my head, haha" - Erik
Thanks MIT and Erik. Best teaching, and that too for free.
I really like the sound of the chalk.
Maybe the camera person should consider finding a balance between landscape and portrait shooting, instead of always shooting in portrait. It gets difficult to see the contents of the board along with the lecturer, since the focus is set to one "focussed" part of the blackboard. This is just my thought. btw MIT rocks!
Sujivson Titus True, a little frustrating when he's pointing at something but it's off the screen, or you're reading through something but part of it is cut off
Erik Demaine - "...but in the textbook, and I guess in the world..." lol
Don't understand why people complain about the chalk. As he writes, so am I writing in my notebook, and I find this to work very smoothly [especially as I can pause the video, think for myself, and try to prove what he said]. I'm getting most of what he says, less so the on-the-spot proof for n x n x n. But hey, he published a paper with coauthors on this subject, so :) that's accessible for later.
Lectures like this make me feel how lucky MIT students are !!!
@@SimonWoodburyForget but I think the point is to just give students an introduction of the subject so they can work on real problems
@Simon WoodburyForget Because you are forgetting that Computer Science is a...science. CS is not programming. Programming is monkey work. Algorithms are at the heart of CS. You don't need code because this isn't meant for practical uses. CS is just math for computers.
Erik is the best teacher who explains data structure and algorithms so clearly and in a simple way.
Thank you MIT for providing these lectures. These are very helpful.
42:40 Just want to applaud at an amazing explanation and demo 👏
Here are points in other videos in this course's playlist that explain terms used in this video:
- represent graph in Python: watch?v=5JxShDZ_ylo&t=1709s
- adjacency list in Python: watch?v=C5SPsY72_CM&t=189s
- examples of theta, O, omega: watch?v=P7frcB_-g4w&t=130
- what is hashing: watch?v=0M_kIqhwbFo&t=22
- python implementation of iterator: watch?v=-DwGrJ8JxDc&t=978
I found this useful. Hope some of you find it useful as well. If you find more terms for which I can add pointers, let me know.
If a few people think that this is useful, I can add this information for a few more videos. If you are looking for this info in any specific videos, let me know. If I have made these notes for those videos, I will add.
Cheers!
Very useful. Would be more convenient if the links were given properly.
Thanks to Erik Demaine
[FOR MY REFERENCE]
1) Graph Applications
2) Graph BFS Algo.
3) Time Complexity
Dear all,
In this lockdown stage at home, please provide games and equipment for your children/students to play. If not, help them watch "Math Art Studio" on YouTube. They will play with their names and learn different concepts in mathematics. Those who have seen it have learnt maths and enjoyed its beauty every day.
Amazing lecture, sir, and of course you are from MIT because your level of knowledge is very high!! Thanks for your time.
Wish my professor wasn't lazy and would write all the notes on the board like this instructor. I can't keep up with half-assed PowerPoints that my professor rushes through.
I don't know if he gave you guys the powerpoint slides, but if he did, then you wouldn't have to spend time copying them down because you'd know you would get them. That way, you can spend time writing down the things that will be more helpful for you.
Mine used Wolfram Mathematica live, and it was an absolute mess 🤦🏻♂️
Not sure if Erik consciously changed his teaching style from Lec 1. He used to explain an example of how things move and build an algorithm on top of it. But this lecture is different: it puts down the algorithm first and then explains an example of how it proceeds.
Thank you! Love Erik's lectures!
Absolutely must watch, adding where they are used and application gives good perception which otherwise made graph dry subject for me.
This is a great lecture. I really appreciate the level of teaching from MIT. This is what makes a good university: its professors.
Even though this video is 7 years old, I can't believe they're using chalkboards at MIT. Whiteboards are so much cleaner and easier to read/write on.
Very clear and intuitive 💎 Thanks for sharing these invaluable resources! Big shout out to MIT 🔥🎓
Better than my College's class, thumbs up.
School is so outdated: 50 mins of chalk work for 7-10 mins of content.
Thank YOUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUU, it took me 3 days to understand how to track the path in bfs
I can smell the chalk dust through the video. Takes me back, great stuff.
There are two ways to study algorithms: the MIT way, or the hard way
this guy is the man. his lectures are awesome.
His T-Shirt: Scissors cuts paper, paper covers rock, rock crushes lizard, lizard poisons Spock, Spock smashes scissors, scissors decapitates lizard, lizard eats paper, paper disproves Spock, Spock vaporizes rock, and as it always has, rock crushes scissors.
Man I feel smarter just by sitting here even though I have no clue what happened in those 50 minutes.
You are such an amazing teacher! I wish I had you in college
I wish I had u 🥺👉👈
This makes me think about the extraordinary gap in intellect between human beings.
30:40 BFS
14:44 bookmark
I like the way he writes.
Amazing lecturer. Mr. cameraman, please drink coffee or something and keep up.
Cameraman did a good job of knowing when we'd rather look at the board than him walking.
he is a great teacher
You deserve a white board set up like that #neverSettle
Thank you so much for posting this video!!! It's too hard to find videos that explain algorithms clearly and are easy to understand.
Clear. I read the corresponding chapter in the Algorithm Design Manual, but it didn't feel like it all came together; this did it for me.
In that code frontier will never be False resulting in an infinite loop. It should be while len(frontier):
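For reference, here is a runnable sketch of the lecture-style frontier BFS in Python; note that an empty Python list is already falsy, so `while frontier:` and `while len(frontier):` behave the same (the graph and variable names below are made up):

```python
def bfs_levels(adj, s):
    """Level-by-level BFS in the frontier style; returns {vertex: level}."""
    level = {s: 0}
    frontier = [s]
    i = 1
    while frontier:              # an empty list is falsy, so this terminates
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in level:
                    level[v] = i
                    nxt.append(v)
        frontier = nxt
        i += 1
    return level

adj = {'s': ['a', 'x'], 'a': ['s'], 'x': ['s', 'd'], 'd': ['x']}
print(bfs_levels(adj, 's'))  # {'s': 0, 'a': 1, 'x': 1, 'd': 2}
```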
Thank you for a very clear explanation.
MIT is amazing for postgrad and PhD work. But really, you don't have to go to MIT to learn this info. The junior and senior years are probably where it is differentiated from other universities.
2*2*2 cube has 8!*3^7 vertices not 8!*3^8
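For scale, that count is easy to check in Python; dividing by the 24 rotations of the whole cube gives the commonly quoted number of distinct 2x2x2 configurations:

```python
from math import factorial

states = factorial(8) * 3**7   # 8 corner positions x 7 free corner twists
print(states)                  # 88179840
print(states // 24)            # 3674160 distinct, up to whole-cube rotation
```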
yay MIT lecture in my room
Thanks for the wonderful lectures Erik!
This guy is so cool!
Thank You MIT.
Simply evergreen content!!
Thank you!
His t-shirt also has a graph on it!
For an n x n x n Rubik's cube, it looks like the solution would be 2^(n+1)+n+1, based on the 2x2x2 and 3x3x3 values. Is that right?
How can someone dislike this video?
eksisozlukbirsikibeyenmemetimi
Akshansh Thakur Good question
by just clicking on the thumbs down button
you made me laugh very hard, good day to you :P
Fat fingers & envy
At 34:13, if anyone cares to change the subtitles from (INAUDIBLE), what he says sounds like "pseudocode".
Changed! Thanks for the feedback. :)
Just looked up this lecturer in Wikipedia. Holy sh*t he's a genius.
Prof. Demaine is just incredible. Enjoyable lecture, lots of examples and applications. Would love to meet him in person!
Is that the guy from the Nova origami special who proved you can make any 3-dimensional shape by folding a flat sheet of something?
Yes, Erik Demaine was in that Nova program on Origami. :)
Thank you Eric! Eric choupo moting
Eric is great!
it's, in fact, from 34:14
Why is it n^2 cubies and not n^3? Since 2×2×2 has 8 cubies I think and 3×3×3 has 27
How are there 24 possible symmetries?
I think it could be possible to implement multiple graphs even using object-oriented programming. Instead of v.neighbours, this property can be an array, so v.neighbours[0] would be the 1st graph, v.neighbours[1] the second, and so on.. Am i wrong?
No you're not wrong. But it is not practical.
You will have to remember, for each vertex, the number of graphs it belongs to. So, if you need to analyse, say, vertices v and u, you need to make sure that v.neighbours[0] and u.neighbours[0] are referring to the same graph (you need to align, 'synchronize', the array indices, which can be a headache).
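One way to avoid the index-synchronization headache is to store adjacency per graph (e.g. a dict per graph keyed by vertex) instead of an array per vertex. A sketch, with all names made up:

```python
# Each graph owns its own adjacency dict; vertices are just shared keys,
# so no per-vertex index bookkeeping is needed.
g1 = {'v': ['u'], 'u': ['v']}
g2 = {'v': ['w'], 'w': ['v'], 'u': []}

def neighbours(graph, v):
    """Neighbours of v in a specific graph (empty if v isn't in it)."""
    return graph.get(v, [])

print(neighbours(g1, 'v'))  # ['u']
print(neighbours(g2, 'v'))  # ['w']
```

This is essentially the "adjacency list as a dict" representation from the course, with the graph, rather than the vertex, as the owner of the neighbour lists.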
Nowadays it goes on slides; missing the old-fashioned teaching.
46:17 How do you find the shortest path from "f" to "c"? They are connected by an edge, but according to this algorithm the parent of "f" is "d", the parent of "d" is "x", and the parent of "x" is "s". You cannot reach "c" from "f".
The path you are talking about is where 's' is the starting point, so it will give the shortest path from 's' to any node in the graph. To compute the shortest path between 'f' and 'c', you need to set 'f' or 'c' as the starting point...
If f and c are connected in some way, then you can get the path from f to c by backtracking from parent[c], assuming the vertices have some sort of pointer. If they are a class type of Node/Vertex with a parent pointer, it's a lot easier to trace.
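The parent-pointer backtracking described here can be sketched in a few lines of Python (the `parent` dict below is a made-up example of what BFS from 's' might build, with `parent[s] = None`):

```python
def path_from_source(parent, t):
    """Walk parent pointers from t back to the BFS source, then reverse."""
    path = []
    while t is not None:
        path.append(t)
        t = parent[t]
    return path[::-1]

# Hypothetical parent dict, as BFS from 's' might produce it.
parent = {'s': None, 'x': 's', 'd': 'x', 'f': 'd'}
print(path_from_source(parent, 'f'))  # ['s', 'x', 'd', 'f']
```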
I've always wanted to know HOW a Rubik's cube is actually mechanically put together such that it allows so much random rotation of everything without breaking or getting gummed up. :O
Love Erik's lecture, but the sound when he scrubs the blackboard...
Maybe someone can help me:
I'm from Brazil and I study CS. I noticed that in Introduction to Algorithms people there already know algorithm analysis, study graphs, and that kind of stuff. Here we just learn the basics: we start with C, from the basics of the language up to structs, and we see a bit of divide and conquer, sorting (quicksort, merge sort, and the n^2 algorithms), and we work with matrices and files (binary and text). Then in the second year we study design and analysis of algorithms, which is when we learn algorithm analysis and paradigms like dynamic programming, divide and conquer (in depth), greedy algorithms, and so on. Now I'm in the third year and I'm studying graphs. I'd like to know if the students don't get confused by studying this kind of stuff early (and if they actually study it early, because I don't really know if this is a 1st-year course).
Luciano My CS courses are very intense. We start with Intro to Programming (2D/3D arrays, OOP, Big O, merge sort, a ton of other fundamentals); Data Structures and Algorithms includes topological graphs, greedy algorithms, heaps, etc. This is done in our first year. We tend to go super fast and learn C for Computer Architecture and Systems Ops (the next semester of classes, usually coupled with Discrete Structures, Multivariable Calc, and Linear Algebra). They teach us Java in our regular CS classes, so they make Computer Architecture and Systems Ops very rigorous since we're beginning C (to appreciate and completely understand memory allocation). So basically, if you take 3 CS classes a semester, we'll take Artificial Intelligence in our second year; that way we can go on to learning machine learning, advanced data structures, and top-level engineering classes. Most of us are coupling our CS degree with Math, so most likely we do Calc 5 and pure maths by our second or third year. Anyway, I hope this gave you some insight. It's hard to measure difficulty with others since this is the pace that I've adapted to.
In case someone else sees this and wants to know: according to MIT's website for this course, MIT 6.006, there are 2 prerequisite classes that must be taken before this one, Introduction to EECS I (6.01) and Mathematics for Computer Science (6.042). This is likely a 2nd-year course, though I'm sure a 1st-year could take it in their 2nd semester as long as they took or tested out of the prerequisites.
The lectures are awesome but the camera work is bad; it's much better in the recent MIT algorithms series. However, I missed Prof. Demaine not teaching graphs there.
I'm interested to see whether or not quantum computers will be able to find god's number for larger sized rubik's cubes, or if we will never know them.
What if I want to know all the shortest paths to a node in the example in this lecture? For example, there might be exponentially many ways to get to node f from s, and there might be many shortest paths. But BFS gives us only one! I want to know the number of shortest paths between two nodes s and f. How can I achieve this?
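One standard extension of BFS handles this: keep a per-vertex counter and, whenever you find another edge into a vertex at its shortest distance, add the predecessor's count. A sketch (not from the lecture; graph and names are my own):

```python
from collections import deque

def count_shortest_paths(adj, s, t):
    """Number of distinct shortest paths from s to t via counting BFS."""
    dist = {s: 0}
    count = {s: 1}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:             # first visit: new shortest distance
                dist[v] = dist[u] + 1
                count[v] = count[u]
                q.append(v)
            elif dist[v] == dist[u] + 1:  # another shortest path into v
                count[v] += count[u]
    return count.get(t, 0)

# Diamond graph: two shortest s->t paths, both of length 2.
adj = {'s': ['a', 'b'], 'a': ['t'], 'b': ['t'], 't': []}
print(count_shortest_paths(adj, 's', 't'))  # 2
```

Listing all the paths themselves can take exponential output size, but counting them stays within the usual O(V+E) BFS time.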
Can Someone explain how the number of possible states is derived for 2*2*2 cube?
Can anyone tell me how he came to the conclusion about the total number of moves required to solve a cube of n*n*n?
Thanks
+deepthi g In the video, the professor mentioned a paper where this research was published. This is the paper: erikdemaine.org/papers/Rubik_ESA2011/paper.pdf
40:35 minutes the example of the implementation
Sheldon would be proud of his tshirt
What is the reference book aligned to this course?
The required textbook for this course is: Cormen, Thomas, Charles Leiserson, Ronald Rivest, and Clifford Stein. Introduction to Algorithms. 3rd ed. MIT Press, 2009. ISBN: 9780262033848. (www.amazon.com/exec/obidos/ASIN/0262033844/ref=nosim/mitopencourse-20) See the course on MIT OpenCourseWare for more information and materials at: ocw.mit.edu/6-006F11
I wish I had studied at MIT.
great lectures, thank you for uploading this
There is an undirected graph on his shirt!