0:00 Opening.
3:25 Dick Vlot about Architecture and Agile Software Development.
9:30 Presentation of Uncle Bob.
11:19 Diffraction: Why do incandescent lights glow?
15:27 Architecture Introduction / I've built lots of apps / "I want to be a programmer" anecdote.
26:09 The Architecture rules are independent of every other variable.
29:21 Working vs. Right.
30:45 What is Design in Architecture?
31:50 What is the goal of Software Architecture?
33:28 Case study of bad Architecture.
38:55 Executive View / What went wrong / Secret to going fast.
44:43 Messes aren't faster even in the short term.
48:15 Solution of the Executive's Dilemma / Two Values of Software.
52:41 Behavior / Are we going to see self-driving cars?
1:00:14 Scope vs. Shape / Stakeholders want changes.
1:10:33 Urgency and Importance / Eisenhower Matrix.
1:13:09 Fight for the Architecture.
1:15:14 A Rails App / The web is a Delivery Mechanism.
1:19:06 Architecture Floor Plans / A Use Case Driven Approach.
1:24:05 Interactors / Entities / Interface Objects.
1:27:11 Request Model.
1:30:21 What about MVC? / Design Patterns / How MVC goes wrong as a web Architecture.
1:34:53 Model View Presenter / Dependency Rule.
1:39:10 What about the Database? / The Database is a detail / ORM.
1:48:00 FitNesse: a wiki page project development.
1:53:54 A good Architecture allows major decisions to be deferred! / About IntelliJ and Visual Studio.
2:03:44 Frameworks / Plugin Model.
I believe that putting this in the description of the video will make YouTube add chapter marks on the timeline and name the sections accordingly.
Gold ♥ The first time I saw a computer, I didn't sleep that night, thinking how I could make one with sticks & lights lol, because it wasn't mine. I was 10, ~30 years ago. And it's been pure pleasure every day since.
Charismatic as always. Although I do not agree with everything, and the rest might be seen as obvious truths (at least for people with 20+ years of experience and a heart for constant learning), I would like to thank you, Robert, for all your efforts to make our work more reasonable and to show that we technical people are not completely mad and out of this (business) world :) This was a portion of energy to work even harder.
I wish I had seen this 2 years ago. The first 3 lessons described my team (including myself), and now the project has been deprecated because it is a disaster! I am no manager and no lead, but by explaining this to my teammates at the time, we could have avoided reaching the point of no return.
(1:47:48, 2:03:44, 2:09:00) Most of these topics also work as arguments against blind usage of frameworks, which clients usually demand as a requirement in order to have some control over maintainability, even though sometimes the framework itself gets in the way of that or simply bloats the whole solution. The load of dependencies and their dependencies, cascading.
Regarding microservices: I have been developing one of the biggest information systems in Poland using BEA (Oracle) Tuxedo (a direct precursor of WebLogic and the Java EE standard). It was similar to today's microservices in terms of the general programmatic approach, the design of the API, how the front end and back end interact, how we divided our services into groups enclosing them in separate modules/programs, and how we defined routing and redirecting between them. There was also the famous actor model (which Scala's Play! claimed to invent), used by default by Tuxedo at the back end (IPC keys, or memory inter-mapping between the backend processes). That was 22 years ago, and Tuxedo is even older. And there were RPC and CGI/FastCGI even before that (if I remember correctly).
How do you know you are watching quality content? When there are more upvotes than there are comments. This series is essential learning for any programmer that is passionate about the profession.
Maybe they don't know it's a mess, because they don't know what is clean. BTW this goes for every profession. Mechanical Engineering, Car mechanics everyone.
The IBM Jeopardy computer answer is very cool. I am reminded of the Hitchhiker's Guide to the Galaxy books. The answer to life, the universe and everything is 42; the problem was that the people did not understand the question. Watson got the answer correct, we just did not have a precise enough question. This is the singularity issue and the existential crisis in AI in a nutshell. Yeah, it will happen, but why? Well, we probably asked a bad question, and we don't understand the answer. It's probably correct, but it will take a lifetime to understand it.
Great lesson. Just to nitpick, because I've found that Bob got a few things wrong when he was not talking about software: Moore's law was never about clock rate. Or speed. Moore's law was about the number of transistors on a single chip, so it wasn't even about density. This means that Moore's law has definitely not died a decade ago. If you look at the graph on the Wikipedia page, it may very well be alive today: en.wikipedia.org/wiki/Moore%27s_law#Moore's_second_law . Actually, there is a section on forecasts that says most experts (including Moore himself) expect it to end around 2025. Now, the exponential performance gain (whether it's per chip or, more importantly, per watt) may have come off the exponential curve already. I'm not sure, but that's a different thing, even if it's more important from the perspective of what he's saying.
As far as I remember, this explanation is not as complete as it might be. It's not about the electron shaking, but about the electron changing orbit, which involves emitting a portion of energy, which might be a photon.
It's been more than 5 years since clock speeds stagnated. I saw 3-4 GHz machines back in 2006, but they were very expensive. The absolute limit of motherboards since 2006 is 5 GHz; I know because I have been designing hardware since 2005. But 5 GHz is not sustainable because of heat generation and cross-talk, and it becomes cost-prohibitive even at 4 GHz because the company loses profit. The cross-talk at 5 GHz becomes so high that the error correction actually slows down the chip compared to slower speeds. We verified this with FEA and we built test boards to measure actual performance, and CST Microwave Studio, a $30,000 simulation package, predicted everything exactly as we measured it on the test boards. We have known what he is talking about since 2006.
This series has been great so far; I really enjoyed it and I'm grateful for it, and I'm already looking for ways to start learning and implementing the things mentioned by Bob. But I think in future events the camera mix operator should try to focus more on the slides when Bob is making references to them; it helps to put a picture to what's being said. Thanks once again for this wonderful series, it has been more than helpful for me and I can't wait to implement the things mentioned by Bob.
Q at 57:32, ChatGPT 4: “One American city that fits this description is Chicago, Illinois. The two airports are Chicago Midway International Airport, named after the Battle of Midway, and Chicago O'Hare International Airport, named after Edward O'Hare, a World War II flying ace.”
I was disappointed when he said that Moore's law is dead and self-driving cars will not be here in 20 years. Up until then the things he said were really good, but in this case he is probably wrong. I would love to discuss this further!
How is Moore's law not dead? I've had the same computer for almost 10 years now. In that time I've upgraded the video card once and the RAM once, and my CPU is pretty darn old, but I couldn't do all that much better even if I bought a brand new top-of-the-line computer. Moore's law was that computing power doubled every 18 months; computing power has barely doubled in 10 years. You'd have to use a different metric, like maybe miniaturization, or ubiquity.
I agree that he made some great points, but believing that self-driving cars won't happen is dead wrong. Also, the prior lesson about not using the debugger (often) seems strange, as it gives you invaluable information about the code as it runs.
Moore's law has been dead since 2016 in some areas. Tesla exists too. Bob is wasting our time with his fun way of talking; I like entertainment, but I'm getting old, so I couldn't watch it all, some parts at 2x speed. :(
~1:55:15 "Good architecture maximises the number of implementation [DETAILS] not made" - A critical thought is to reflect on what is 'detail' from the [usage] architecture view point, rather than the coding viewpoint.
At the beginning he talks about the first experiences that caused him to become a programmer. I personally find it ironic that he shares this in the same presentation where he insults DBs and SQL, because SQL is what got me interested in programming in the first place. I was taking a college class which included learning Microsoft Access, but I wasn't satisfied with using the built-in tools; I wanted to know how it worked, and I spent my spring break reading SQL books. I find that very similar to Uncle Bob's dissatisfaction with his original manual and getting the more advanced one, and I can relate to his reaction of trying his own things and thinking how cool it was that it did exactly what you wanted it to.
Well, I'm sorry, I can't relate. What does he actually mean, by the way? Because the only explanation I can think of is that you write your own query language back there. Queries shouldn't travel across a wire anyway, from what I know.
Lol, I was just thinking that whilst watching this... No 8-core CPUs, huh... oh, erm... Bob, the first gen of those would have been out when this was uploaded. I'm thinking this is a very old lecture if he thinks 2.6 GHz is the pinnacle of CPU clock speeds.
@@franciscomagalhaes7457 With a quick google search I found a blog post written in 2019 about attending this event "A few weeks ago, ..." source: themobilecompany.com/blog/coding-a-better-world-with-uncle-bob-2019/
@@iMaxos I did the search too, but I think you can find several editions of this and other talks by Uncle Bob, or at least talks covering parts of these subjects. So I don't think this means this one is necessarily from 2019...
I doubt anyone will see this, but having studied black-body radiation in my classes, Uncle Bob's description at the start hurts. His description is basically classical, and the classical description of black-body radiation leads to physical impossibilities, like anything that's glowing giving off a stupid amount of X-rays (known as the UV catastrophe). The way to fix this problem is to invoke some quantum mechanics, in that atoms can only be in a discrete number of energy states. Max Planck's solution to this problem of black-body radiation was actually one of the early problems that led to the development of quantum mechanics.
Whilst I've loved this series of videos so far and have a deep respect for Bob, I'm seriously puzzled by one topic in this particular vid. Why would dependency injection "infiltrate" itself? DI is purely the idea that you pass in pre-created instances of an abstract interface. Regarding pre-packaged IoC container libraries which assist with creation of dependencies, I've used Autofac, Simple Injector and .NET Core DI, and not one of them infiltrates your system. They are all used early on to specify implementation-to-interface mappings along with scope, but we only ever need to put the interface names in the constructors. Nothing container-specific at all past that top-level boundary. DI is just a way of coding, not a framework. All code I've written just has the interface passed to the constructors, as it would if you were manually newing up implementations of the interface and passing them in AS that interface, and with the IoC container I use I can always switch to another one. His take on this is particularly odd to me, making out it's something you tie yourself to, but it really isn't. Maybe it's not something that can be done correctly in Java? Weird.
As with all things there are degrees to it, and some frameworks may not be as intrusive. Having experienced Spring in enterprise code firsthand, what he is saying here is absolutely true of that framework. Spring is a parasite that becomes deeply ingrained into your project, and if you ever want to change frameworks, it will require a considerable amount of effort. Because there's this magic glue code that does reflectiony things behind the scenes (the auto-wired annotations mentioned in the lecture), it also makes testing more difficult, and made the system harder to reason about. What ends up there at runtime? Gotta go digging through Spring configuration to figure it out. You don't need a framework to do DI: you can instantiate the dependencies yourself and inject them via constructors that ask for them by interface. Misko Hevery has a now decade-old post on his testing blog talking about this sort of thing: misko.hevery.com/2010/05/26/do-it-yourself-dependency-injection/
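The do-it-yourself injection described above can be sketched in a few lines. This is a hypothetical Python illustration (all names invented), showing constructor injection against an interface with no container or framework involved:

```python
from typing import Protocol

class UserStore(Protocol):
    """Abstract interface the business logic depends on."""
    def find_name(self, user_id: int) -> str: ...

class InMemoryUserStore:
    """One concrete implementation; a DB-backed one could be swapped in."""
    def __init__(self) -> None:
        self._users = {1: "Ada"}

    def find_name(self, user_id: int) -> str:
        return self._users[user_id]

class Greeter:
    """Depends only on the interface, injected via the constructor."""
    def __init__(self, store: UserStore) -> None:
        self._store = store

    def greet(self, user_id: int) -> str:
        return f"Hello, {self._store.find_name(user_id)}!"

# "Do it yourself" wiring at the top level: no container, no annotations.
greeter = Greeter(InMemoryUserStore())
print(greeter.greet(1))  # Hello, Ada!
```

The wiring lives in one top-level spot, which is the same job an IoC container does, just written out by hand.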
@@Exa4096 while I agree that spring is particularly bad about it - why would you use a framework if it didn't save you any effort? The corollary is that not using that framework will inevitably cost you some effort.
@@Reashu what Bob is saying is that these frameworks save you time in the short term but cost you far more in the long term. He urges caution with their use and to isolate it as much as possible from your code to prevent it from breaking everything when the framework inevitably changes. They do not have your interests in mind, but rather their own.
@@Exa4096 Yes, I also saw the video, you do not need to quote it. However, he does not demonstrate it, nor have I seen it demonstrated. Maybe he is talking about extreme usage of Spring that goes beyond what I've seen, but it seems to me that this point is badly elaborated and lacks nuance.
@@Reashu If you have seen Struts, you should know Struts 1.3. I have seen companies entirely built on Struts 1.3, and suddenly Struts 2 came in and the switch could not happen. Same with AngularJS to Angular 2, and the same thing Spring will do one day, because they look to their own future. History will repeat itself; that's what Uncle Bob said. You said it was not demonstrated; can you say that again after thinking of the companies which used Struts 1.3 and AngularJS? How much did it cost them to switch? Some are yet to switch from AngularJS to Angular 2. Is that sufficient?
30:24 It's amazing watching a video on software design and development, and having a religious insight =)) What are sins if not bad practices that cause headache in long run, especially when we apply a stoic worldview?
Great speaker. I am a bit upset at his pessimism for self-driving cars lol, otherwise what great advice for new senior developers such as myself. It's hard sometimes to push back.
I am an electrical engineer and nowadays I program programmable logic controllers in machines, which is a different field of "programming" with more constraints. To be honest, people in the industry think of it as something different from the programming done by IT people. I feel the responsibility every day when the people from production and management act as if they knew what the problem is, what the cure is and when the date is, and treat me like a cell in the Excel file that should be green by the date. Last time I shouted at them. I am not proud of it. But ordinary people are not able to make a distinction between simple terms like "not open" and "closed", and they can use their hierarchy against us if they are not satisfied. In this situation I would like to say, from my heart: please do not use your position as knowledge, and I will not use my knowledge as power.
@@hrgwea OK, I was listening to something else when I saw your comment. I agree that we will have self driving cars as soon as we have the infrastructure in place. It won't be "everywhere" on earth but it will be in Cities. We already know how to automate machinery. Self driving cars are just a larger scale of automation. Cheers!
@@EvenStarLoveAnanda It really depends on what you call a "self-driving car". I believe the only way we can have a car without a driver is if the car is physically guided, and we kinda have those already, but for larger groups of people, namely trains, and even those still have a driver! Edit: A thought on why I think this. The world is just too chaotic, which computers don't like. At a certain point, when there's too much chaos, no amount of logic in the computer will be sufficient to make order out of the chaos, which it needs to function. I think there's a threshold; I'm curious now what the science behind this is :) By restricting the physical world the car acts in (what I meant by "physically guided"), i.e. creating an infrastructure like the railways, the amount of chaos is reduced so that the computer can cope. Maybe I'm just rambling, I smoked a joint :)
Hi, I just have a question. I have internalized the concepts around Clean Architecture, but I think people treat clean architecture as a set of folders, a kind of folder structure that can be applied to all projects; I mean, they use the same layers for frontend and backend development. The web is part of the infrastructure layer, but the web IS NOT A NEW SOLUTION, it is part of a solution. Actually it is not a question; I would very much like an answer from anybody who sees this scenario in their company or personal development. Do we need use cases inside an Angular/React frontend app? Or are we misunderstanding clean architecture? Thanks.
19:01 - he's probably talking about Digi-Comp 1 (en.wikipedia.org/wiki/Digi-Comp_I)
Someone tell him that there have been 8-core CPUs since 2011... He just didn't buy one (since most people don't need them). His point still stands, of course. We got a 16-core CPU only 8 years later... And here it will probably stop (we'll move on to better things, I believe)... In general: great series! (except the editor :( )
I am surprised by Uncle Bob's lack of understanding of databases. Has he looked at Postgres lately? I have not seen an object-method implementation with the power of SQL (many have tried). There are times when you want to work on large sets of data, not individual objects. The database provides much more than persistence: most importantly data integrity and schema, and the indexing and linking of data are also important. With an object-relational database (like Postgres) you can also implement business logic. His comment about injection is wrong: look at how PHP implements PDO; there is no chance of SQL injection if it is implemented correctly.
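The prepared-statement point generalizes beyond PHP's PDO. A minimal sketch with Python's built-in sqlite3 (illustrative only; PDO behaves analogously with bound placeholders):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# Hostile input that would break a naively concatenated query string.
evil = "alice' OR '1'='1"

# The ? placeholder binds the value as data, never as SQL text,
# so the injection attempt simply matches no row.
rows = conn.execute("SELECT name FROM users WHERE name = ?", (evil,)).fetchall()
print(rows)  # []

# A legitimate value matches normally through the same placeholder.
rows = conn.execute("SELECT name FROM users WHERE name = ?", ("alice",)).fetchall()
print(rows)  # [('alice',)]
```

The driver keeps the query shape and the parameter values separate all the way down, which is why correct placeholder use closes the injection hole.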
I decided to become a developer at 13 (I'm 30 now hehe). I went to a course about graphic design, 3D design and, for some reason, PHP, and I loved it; a bit later I learned Java hehe.
They can only drive in the easiest situations. For example, how would a self-driving car change lanes in a traffic jam? Or how can it enter a roundabout filled with cars? Does it stop in the middle of the road and wait for rush hour to end?
I do know he has a blog, called the clean code blog, and one of the... blogs? Mentions the toy and has a picture of it. The blog wasn't named "my toy," it was some story. Maybe it was the one about types, the first of the two such stories. It's a fun read too!
I feel like if there were actual code examples in this series it'd make things much easier to understand and less vague. In this first episode there was a concrete example which helped, but there are fewer and fewer as it goes on, and it's so long that it feels like he's waffling. A shame, because he's definitely saying important stuff, but it's just hard to take it on board if he's not giving examples.
Moore's law isn't about core clock frequency: "Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years." Also, if it were really about frequency, as Uncle Bob says, then it would become harmful to humans once it reaches 6 GHz and above. So basing Moore's law on CPU clock frequency is silly, given that we already know there's a limit at which it stays safe for humans.
Right. And the growth in transistor count has indeed slowed below Moore's law, to doubling about every 2.5 years. What do you mean by harmful to humans regarding the CPU clock speed?
@@RoiTrigerman this graph github.com/karlrupp/microprocessor-trend-data shows otherwise. > What do you mean by harmful to humans regarding the CPU clock speed? Electromagnetic radiation: once the CPU hits 6 GHz and higher, it will leak and is dangerous to humans. The only way forward is either some sort of shielding or a multiprocessing architecture, which is what modern computing is doing. If you think about it, Uncle Bob is right that the maximum frequency didn't go up at all; it just hovered around 3-4 GHz, and that was more than 10 years ago. Every manufacturer, in both software and hardware, is focusing on power consumption and/or multi-core architectures. I'm just stating that Moore's law isn't about clock frequency but about the number of transistors; that's what Uncle Bob got wrong.
@@chris-ew9wl The graph you linked to shows what I claimed. It shows that the number of transistors increased by more than 10 times in 10 years; it's hard to get accurate numbers from it, but that seems about right. If the number doubled every 2 years, then after 10 years it would be 32 times bigger. Since the graph looks closer to 16 times, the number doubles after a little more than 2 years. Wikipedia says about every 2.5 years, and that seems right according to your graph.
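The doubling-period arithmetic here is easy to check. A quick Python sketch, taking a rough 16x growth over 10 years as the reading of the linked graph (that figure is this thread's estimate, not a measured value):

```python
import math

years = 10
growth_factor = 16  # rough read of the transistor-count graph over a decade

doublings = math.log2(growth_factor)   # 16x growth = 4 doublings
doubling_period = years / doublings    # 10 years / 4 doublings
print(doubling_period)  # 2.5

# For comparison: strict 2-year doubling over the same 10 years
# gives 2**5 = 32x, not 16x.
print(2 ** (years / 2))  # 32.0
```

So a 16x decade is consistent with a ~2.5-year doubling time, while the canonical 2-year cadence would have produced 32x.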
And are you sure you are not confusing CPU clock speed with radio frequency? Can you find any resource saying that the CPU clock speed is somehow dangerous to humans?
The point is not files vs. a database. I think it was that they were able to defer the decision until the last minute and realized that they did not need a DB in the first place: files worked for their use case. If they had made the decision in haste at the start, they would have over-engineered their app by using a DB, and would have locked themselves into MySQL's licensing or whatever. Instead they were able to defer the decision until the last minute and still have the ability to use a DB like MySQL if they REALLY need it.
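The deferral works because the application talks to an abstraction, not to a storage technology. A hypothetical Python sketch (all names invented, not FitNesse's actual code) of a file-backed store that a DB-backed one could later replace:

```python
import json
import tempfile
from pathlib import Path
from typing import Protocol

class PageStore(Protocol):
    """The only storage contract the rest of the app ever sees."""
    def save(self, name: str, text: str) -> None: ...
    def load(self, name: str) -> str: ...

class FilePageStore:
    """One JSON file per wiki page; plenty for many use cases."""
    def __init__(self, root: Path) -> None:
        self._root = root
        root.mkdir(parents=True, exist_ok=True)

    def save(self, name: str, text: str) -> None:
        (self._root / f"{name}.json").write_text(json.dumps({"text": text}))

    def load(self, name: str) -> str:
        return json.loads((self._root / f"{name}.json").read_text())["text"]

# A hypothetical SqlPageStore implementing the same Protocol could be
# introduced later: the DB decision is deferred, not foreclosed.
store = FilePageStore(Path(tempfile.mkdtemp()))
store.save("FrontPage", "Welcome!")
print(store.load("FrontPage"))  # Welcome!
```

Because callers depend only on `PageStore`, swapping the implementation touches one line of wiring, which is exactly what makes the decision cheap to postpone.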
He said sure, we have cars that can drive in good conditions with a human occupant, but we probably won't have completely automatic cars that drive themselves with no human supervision. Has this changed?
@@BienestarMutuo Haha, framework. It's a pattern, Bob's whole point is you should build your software using the pattern, not use some framework making you dependent on the framework and restricting your freedom to change the system.
@@Rob81k Yes, that is right, but it is also true that you need an open-source framework that makes your code easy, so that you don't have to develop all the libraries from scratch for each project.
So his critique of Ruby on Rails is that because it has an opinionated directory structure, it somehow implies that Rails is tightly coupled to the delivery mechanism (web/IO) and that it's hard to determine the intent of the app? ... What? Shouldn't he realize that because Rails has an opinionated directory structure, it's very easy to determine the intent of *any* Rails app by looking at the same directories each time (e.g. app/models, app/controllers, config/routes.rb)? As opposed to non-opinionated projects, where every project has a different structure according to the team that built it and thus requires more time to determine the intent? There are many valid criticisms of Rails, but this is just lazy.
We had a class on managing a project. I already knew the agile manifesto well enough to tell that the professor twisted it a bit. It takes a lot of therapy to digest Uncle Bob properly
I'm dubious about the Interactor as a Use Case. If one manufactures a knapsack, the use cases refer to the many ways one might interact with the bag, not some decision maker in the process, do they not?
I'm happy that in this video we get to see the screen all the time while Bob is talking about it, as opposed to the previous videos
Yes. I find it irritating whenever a video of a lecture focuses on the person instead of the presentation content on screen that the person is explaining.
Depending on what lecture you're watching, it's very hit and miss. It's such a joy to find a video that focuses on the content instead of the person.
Only possible because they did a retrospective at the end of the previous sprint.
UA-cam is an unfortunate platform for hosting talks. What you really want is a platform that presents the video/audio of the speaker in a small box and then presents an actual full-fidelity PDF copy of the slides that is synced with the talk and lets you peruse it at your will. In that case, the video/audio of the speaker would likely be hosted on UA-cam and the slides on your platform from some CDN. Leaving those who view just the video on UA-cam (likely a stolen copy illegally uploaded) at a disadvantage.
This whole series is gold
In general. But he says some very stupid things as well, so absolutely don't take everything to heart. Think critically, evaluate what is being said. That goes for what I just said as well.
@@herrpez pls give us example
@@herrpez What did he say that was very stupid?
If you don't answer, that is admission of stupidity on your part.
Like two people have asked you this already.
@@herrpez Correct. You're talking bollocks!
:-)
Well, TBH his opinion on self-driving cars is likely to be proven false soon... And Moore's law wasn't dead when this talk happened... It's nearly dead now, but still not fully dead.
I appreciate the semi-transparent overlay slides, kudos to the editor. So much better than the first part where we barely ever got to see the slides!
I'm a mathematician on the verge of changing my profession to programming, and this lecture series is the best content about the practice of coding I have ever seen. By a landslide. This Uncle Bob guy is amazing!
You should read the whole "Clean" series. It's best in this order:
1. Clean Code
2. Clean Coder
3. Clean Architecture
4. Clean Agile
After your first huge project, read them again and you will get even more out of them. Besides Uncle Bob's books, I can recommend The Pragmatic Programmer.
Been there many years ago. The best programmers I have worked with came from a mathematics background, so you are more likely than others to succeed. However, if you are making a living as a mathematician, my recommendation would be to stick with it; it is a much more satisfying, carefree job, and the software world is like the Wild West. On the other hand, so many more programmers are being produced each year that in a few years the job market will not be as good as it is now, but there are always going to be far fewer mathematicians.
@@marioepsilon that's a very good point. Unfortunately it is, for some reason, extremely difficult to find work as a mathematician where I live. Hence the career change. :/
@@TNeulaender I've found your book recommendation order very useful, thanks for sharing.
Did you make the leap? How did it work for you? The job market sucks now but it was amazing when you posted this comment.
I've watched 4 of these today... I'll come back another day.
Been struggling with my code architecture for a while. It starts out alright and I keep thinking "separate everything, make it modular". Down the line I realize that all these "modules" have been linked together all over the place and it's just a mess, they all depend on each other.
This talk made me realize what I've been doing wrong. What I've been doing is like putting a brain in every limb to make it think for itself, but then it also has to communicate and coordinate with every other limb. This is of course not how the body works.
The hand does not feel heat and then think "what should I do about this?", it sends that information to the brain and gets told which move to make. The brain never asks for information, it gets told.
It's obvious, I know, but up until now I had not thought about it this way and it drastically changes how I think about my code structure.
Thank you Uncle Bob for being an Uncle Bob to all the programming community.
It seems that even the best of us have a hard time predicting the future, as is evident from the statement about 8-core processors in laptops. It is now July 2020 and both the red team and the blue team offer 8-core chips for notebooks.
On a side note, Moore's law was never about clock speed, but about the complexity of integrated circuits. If it were about clock speed, it would have died together with the Pentium 4.
I exited the full screen just to read this comment :D
@@lapieuvreee you can scroll down on full screen brother
0:00 Opening.
3:25 Dick Vlot about Architecture and Agile Software Development.
9:30 Presentation of Uncle Bob.
11:19 Diffraction: Why do incandescent lights glow?
15:27 Architecture Introduction / I've built lots of apps / "I want to be a programmer" anecdote.
26:09 The Architecture rules are independent of every other variable.
29:21 Working vs. Right.
30:45 What is Design in Architecture?
31:50 What is the goal of Software Architecture?
33:28 Case study of bad Architecture.
38:55 Executive View / What went wrong / Secret to going fast.
44:43 Messes aren't faster even in the short term.
48:15 Solution of the Executive's Dilemma / Two Values of Software.
52:41 Behavior / Are we going to see self driving cars?
1:00:14 Scope vs. Shape / Stakeholders want changes.
1:10:33 Urgency and Importance / Eisenhower Matrix.
1:13:09 Fight for the Architecture.
1:15:14 A Rails App / The Web is a Delivery Mechanism.
1:19:06 Architecture Floor Plans / A Use Case Driven Approach.
1:24:05 Interactors / Entities / Interfaces Objects.
1:27:11 Request Model.
1:30:21 What about MVC? / Design Patterns / How MVC goes wrong as a web Architecture.
1:34:53 Model View Presenter / Dependency Rule.
1:39:10 What about the Database? / The Database is a detail / ORM
1:48:00 Fitnesse: a wiki page project development.
1:53:54 A good Architecture allows major decisions to be deferred! / About IntelliJ and Visual Studio.
2:03:44 Frameworks / Plugin Model.
I believe that putting this in the description of the video will make YouTube add marks on the timeline of the video and name the sections of that timeline accordingly.
1:21:12 Stairway to Heaven
1:53:25 "A good architecture allows you to defer critical decisions as long as possible"
Gold ♥
The first time I saw a computer, I didn't sleep that night, thinking how I could make one with sticks & lights lol, because it wasn't mine. I was 10, ~30 years ago.
and pure pleasure everyday since.
I can listen to this for days ... great videos
Charismatic as always. Although I do not agree with everything, and the rest might be seen as obvious truths (at least for people with 20+ years of experience and a heart for constantly learning to improve), I would like to thank Robert for all his efforts to make our work more reasonable and to show that we technical people are not completely mad and out of this (business) world :) This was a portion of energy to work even harder.
I wish I had seen this 2 years ago. The first 3 lessons described my team (including myself), and now the project has been deprecated because it is a disaster! I am no manager and no lead, but by explaining this to my teammates at the time, we could have avoided reaching the point of no return.
(1:47:48, 2:03:44, 2:09:00) Most of these topics also serve as arguments against blind usage of frameworks, which clients usually demand as a requirement to keep some control over maintainability, though sometimes the framework itself gets in the way of that or simply bloats the whole solution - the load of dependencies and their dependencies, cascading.
Regarding microservices: I helped develop one of the biggest information systems in Poland using BEA (Oracle) Tuxedo (a direct precursor of WebLogic and the Java EE standard). It was similar to today's microservices in terms of the general programmatic approach: the design of the API, how the front end and back end interact, how we divided our services into groups enclosing them in separate modules/programs, and how we defined routing and redirecting between them. There was also the famous actor model (which Scala's Play! claimed to invent) on the backend side, used by Tuxedo by default (IPC keys, or memory inter-mapping between the backend processes). That was 22 years ago, and Tuxedo is even older. And there were RPC and CGI/FastCGI even before that (if I remember correctly).
@UnityCoin thanks so much for this series
Quality content through the whole series! Thanks!
29:42 Editor: You want to see the graphic? Well, I think you need to see the speaker. 💡 ooh, I know!
How do you know you are watching quality content? When there are more upvotes than there are comments. This series is essential learning for any programmer that is passionate about the profession.
when everyone is cheering, stay critical.
The rant about the new languages and new frameworks that all boil down to just more of the same old stuff was so satisfying
always energetic and inspiring
Maybe they don't know it's a mess, because they don't know what is clean.
BTW this goes for every profession.
Mechanical engineering, car mechanics, everyone.
The IBM Jeopardy computer answer is very cool. I am reminded of the Hitchhiker's Guide to the Galaxy books. The answer to life, the universe and everything is 42 - the problem was that the people did not understand the question. Watson got the answer correct; we just did not have a precise enough question. This is the singularity issue and the existential crisis in AI in a nutshell. Yeah, it will happen, but why? Well, we probably asked a bad question, and we don't understand the answer. It's probably correct, but it will take a lifetime to understand.
Great lesson. Just to nitpick, because I've found that Bob got a few things wrong when he was not talking about software: Moore's law was never about clock rate. Or speed. Moore's law was about the number of transistors on a single chip. So it wasn't even about density. This means that Moore's law has definitely not died a decade ago. If you look at the graph on the Wikipedia page, it may very well be alive today: en.wikipedia.org/wiki/Moore%27s_law#Moore's_second_law . Actually, there is a section on forecast that says that most experts (including Moore himself) expect it to end around 2025.
Now, the exponential performance gain (whether it's per chip or, more importantly, per watt) may have come off the exponential curve already. I'm not sure, but that's a different thing, even if it's more important from the perspective of what he's saying.
As far as I remember, this explanation is not as complete as it could be. It's not about the electron shaking, but about the electron changing orbit, which involves emitting a portion of energy, which might be a photon.
Scientists explain it that way because we can't think in metaphysics; everything must stay materialistic or our logic fails to continue understanding.
It's been more than 5 years since clock speeds stagnated.
I saw 3-4 GHz machines back in 2006, but they were very expensive.
The absolute limit of motherboards since 2006 is 5 GHz.
I know; I have been designing hardware since 2005.
But 5 GHz is not sustainable because of heat generation and cross-talk; it becomes cost prohibitive even at 4 GHz because the company loses profits.
The cross-talk at 5 GHz becomes so high that the error correction actually slows the chip down compared to slower speeds.
We verified this with FEA, and we built test boards to measure actual performance.
And CST Microwave Studio, a $30,000 simulation package, predicted everything exactly as we measured it on the test boards.
We have known what he is talking about since 2006.
This series has been great so far; I really enjoyed it and I'm grateful for it, and I'm already looking for ways to start learning and implementing the things Bob mentions. But I think in future events the camera operator should try to focus more on the slides when Bob is referencing them; it helps to put a picture to what's being said.
Thanks once again for this wonderful series; it has been more than helpful for me, and I can't wait to implement the things Bob mentioned.
"The mess slows you down"
Q at 57:32, ChatGPT 4: “One American city that fits this description is Chicago, Illinois. The two airports are Chicago Midway International Airport, named after the Battle of Midway, and Chicago O'Hare International Airport, named after Edward O'Hare, a World War II flying ace.”
i want to see his opinion on chatgpt :P
I was disappointed when he said that Moore's law is dead and that self-driving cars will not be here in 20 years. Up until then the things he said were really good, but in this case he is probably wrong. I would love to discuss this further!
He is right: 80% automation is the easy part; the remaining 20% is the really difficult part. But 80% automation is a lot, and very good for many people.
How is Moore's law not dead? I've had the same computer for almost 10 years now. In that time I've upgraded the video card once and the RAM once, and my CPU is pretty darn old, but I couldn't do all that much better even if I bought a brand-new top-of-the-line computer. Moore's law was that computing power doubled every 18 months; computing power has barely doubled in 10 years. You'd have to use a different metric, like maybe miniaturization, or ubiquity.
I agree that he made some great points, but believing that self-driving cars won't happen is dead wrong. Also, the prior lesson about not using the debugger (often) seems strange, as it gives you invaluable information about the code as it runs.
ML has been dead since 2016 in some areas. Tesla exists too. Bob is wasting our time with his fun way of talking; I like entertainment, but I'm getting old, so I couldn't watch it all - some parts at 2x speed. :(
I came to learn why lights glow. I stayed for the software engineering
~1:55:15 "Good architecture maximises the number of implementation [DETAILS] not made" - A critical thought is to reflect on what is 'detail' from the [usage] architecture view point, rather than the coding viewpoint.
I feel, the last question about concurrency had not been answered. Anyway, great talk, thanks for uploading.
Interesting about his comments on not seeing more than 4 cores on a CPU.
I genuinely cried when he told his story about becoming a programmer at the age of 12. There's something about it that's just poetic to me
I strongly agree with the SQL part.
Surprised he didn't invoke the US Army Rangers ethic. "Slow is smooth, and smooth is fast."
At the beginning he talks about the first experiences that caused him to become a programmer. I personally find it ironic that he shares this in the same presentation where he insults DBs and SQL, because SQL is what got me interested in programming to begin with. I was taking a college class which included learning Microsoft Access, but I wasn't satisfied with using the built-in tools and wanted to know how it worked, and I spent my spring break reading SQL books. I find that very similar to Uncle Bob's dissatisfaction with his original manual and getting the more advanced one, and I can relate to his reaction to trying his own things and thinking how cool it was that it did exactly what you wanted it to.
Well, I'm sorry, I can't relate. What does he actually mean, by the way? Because the only explanation I can think of is that you write your own query language back there. Queries shouldn't travel across a wire anyway, from what I know.
55:40 and now we have threadripper
Lol I was just thinking that whilst watching this ...
No 8-core CPUs, huh... oh, erm... Bob, the first generation of those would have been out when this was uploaded.
I'm thinking this is a very old lecture if he thinks "2.6ghz is the pinnacle of CPU clock speeds".
You have any idea when this talk happened? The upload date is 2019, but it looks like it may have happened around the early '10s...
@@franciscomagalhaes7457 With a quick google search I found a blog post written in 2019 about attending this event "A few weeks ago, ..."
source: themobilecompany.com/blog/coding-a-better-world-with-uncle-bob-2019/
Not in your laptop...
@@iMaxos I did the search too, but I think you can find several editions of this and other talks by uncle bob. Or at least talks about parts of these subjects. So I don't think this means this one is necessarily from 2019...
I doubt anyone will see this but having studied black body radiation in my classes, Uncle Bob's description at the start hurts.
His description is basically classical, and the classical description of black body radiation leads to physical impossibilities, like anything that's glowing giving off a stupid amount of X-rays (known as the UV catastrophe). The way to fix this problem is to invoke some quantum mechanics, in that atoms can only be in a discrete number of energy states. Max Planck's solution to this problem of black body radiation was actually one of the early problems that led to the development of quantum mechanics.
Whilst I've loved this series of videos so far and have a deep respect for Bob, I'm seriously puzzled by one topic in this particular vid. Why would dependency injection "infiltrate" itself? DI is purely the idea that you pass in pre-created instances of an abstract interface. Regarding pre-packaged IoC container libraries which assist with creation of dependencies, I've used AutoFac, Simple Injector and .NET Core DI, and not one of them infiltrates your system. They are all used early on to specify implementation-to-interface mappings along with scope, but past that we only ever need to put the interface names in the constructors. Nothing container-specific at all past that top-level boundary. DI is just a way of coding, not a framework. All code I've written just has the interface passed to the constructors, as it would if you were manually newing up implementations of an interface and passing them in AS that interface, and with the IoC container I use I can always switch to another one. His take on this is particularly odd to me, making out it's something you tie yourself to, but it really isn't. Maybe it's not something that can be done correctly in Java? Weird.
As with all things there are degrees to it, and some frameworks may not be as intrusive.
Having experienced Spring in enterprise code firsthand, what he is saying here is absolutely true of that framework. Spring is a parasite that becomes deeply ingrained into your project, and if you ever want to change frameworks, it will require a considerable amount of effort.
Because there's this magic glue code that does reflectiony things behind the scenes (the auto-wired annotations mentioned in the lecture), it also makes testing more difficult, and made the system harder to reason about. What ends up there at runtime? Gotta go digging through Spring configuration to figure it out.
You don't need a framework to do DI: you can instantiate the dependencies yourself and inject them via constructors that ask for them by interface. Misko Hevery has a now decade-old post on his testing blog talking about this sort of thing: misko.hevery.com/2010/05/26/do-it-yourself-dependency-injection/
@@Exa4096 While I agree that Spring is particularly bad about it - why would you use a framework if it didn't save you any effort? The corollary is that not using that framework will inevitably cost you some effort.
@@Reashu what Bob is saying is that these frameworks save you time in the short term but cost you far more in the long term. He urges caution with their use and to isolate it as much as possible from your code to prevent it from breaking everything when the framework inevitably changes. They do not have your interests in mind, but rather their own.
@@Exa4096 Yes, I also saw the video, you do not need to quote it. However, he does not demonstrate it, nor have I seen it demonstrated. Maybe he is talking about extreme usage of Spring that goes beyond what I've seen, but it seems to me that this point is badly elaborated and lacks nuance.
@@Reashu If you have seen Struts, you should know Struts 1.3. I have seen companies built entirely on Struts 1.3, and suddenly Struts 2 came along and the switch could not happen. The same with AngularJS to Angular 2, and the same thing Spring will do one day, because they look to their own future; history repeats itself. That's what Uncle Bob said. You said it was not demonstrated - can you say that again after thinking of the companies which used Struts 1.3 and AngularJS? How much did it cost them to switch? Some are yet to switch from AngularJS to Angular 2. Is that sufficient?
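The do-it-yourself approach mentioned earlier in the thread fits in a few lines. This is a minimal sketch, not any framework's actual API: the names `MessageStore` and `Greeter` are made up, and Python stands in for Java. The class asks for its dependency by interface in the constructor, and a hand-written "main" does the wiring a container would otherwise do.

```python
from typing import Protocol

class MessageStore(Protocol):
    """Abstract boundary: callers depend on this, never on a concrete store."""
    def save(self, message: str) -> None: ...

class InMemoryStore:
    """One possible implementation; a file- or DB-backed one could replace it."""
    def __init__(self) -> None:
        self.messages = []
    def save(self, message: str) -> None:
        self.messages.append(message)

class Greeter:
    """Declares its dependency by interface; never news it up itself."""
    def __init__(self, store: MessageStore) -> None:
        self._store = store
    def greet(self, name: str) -> str:
        text = f"Hello, {name}!"
        self._store.save(text)
        return text

# "Main" wires everything up by hand -- this is the entire DI "framework".
store = InMemoryStore()
greeter = Greeter(store)
print(greeter.greet("Ada"))  # Hello, Ada!
```

Swapping the store for another implementation touches only the wiring code, which is the whole point of keeping the container (if any) at the top-level boundary.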
His entry should be called: Better Call Bob 😜
Better Call Saul vibes😂
You are awesome. Thank You.
Love the ST:TOS reference from "The Apple" at 1:10😅
Skip to 11:20 to avoid all the opening BS and get to Uncle Bob.
Yes, go straight to listening to bullshit about lights.
30:24 It's amazing watching a video on software design and development, and having a religious insight =))
What are sins if not bad practices that cause headaches in the long run, especially when we apply a stoic worldview?
Prevention, right focus, constructive thinking about what could make you faster.
Ever since I was 2 and deleted an important financial file on my parents' computer, I knew I was a programmer.
Nothing you can do to go faster? I guess uncle Bob never heard of cocaine.
@Brandon Busby No, it won't decrease the number of bugs; you'll just write more code.
In the context of Robert's talk, cocaine actually makes you the hare.
In short, programmers are playing Tetris, trying to fit new pieces into the software.
1:21:12 I GET IT UNCLE BOB!!! Stairway to heaven live in New York 1973!
Great speaker.
I am a bit upset at his pessimism for self-driving cars lol, otherwise what great advice for new senior developers such as myself.
It's hard sometimes to push back.
I am an electrical engineer, and nowadays I program programmable logic controllers in machines, which is a different field of "programming" with more constraints. To be honest, people in the industry think of it as something different from programming done by IT people. I feel the responsibility every day when the people from production and management act as if they know what the problem is, what the cure is and when the deadline is, and treat me like a cell in an Excel file that should be green by that date. Last time I shouted at them. I am not proud of it. But ordinary people are not able to distinguish between simple terms like "not open" and "closed", and they can use their hierarchy against us if they are not satisfied.
I would like to say in this situation, from my heart: please do not use your position as knowledge, and I will not use my knowledge as power.
53:34 I'll remember this 5 years from now.
It's been more than 5 years since clock speeds stagnated.
I saw 3-4 GHz machines back in 2006, but they were very expensive.
The absolute limit of motherboards since 2006 is 5 GHz.
I know; I have been designing hardware since 2005.
But 5 GHz is not sustainable because of heat generation and cross-talk; it becomes cost prohibitive even at 4 GHz because the company loses profits.
The cross-talk at 5 GHz becomes so high that the error correction actually slows the chip down compared to slower speeds.
We verified this with FEA, and we built test boards to measure actual performance.
And CST Microwave Studio, a $30,000 simulation package, predicted everything exactly as we measured it on the test boards.
We have known what he is talking about since 2006.
@@EvenStarLoveAnanda I'm not referring to clock speeds.
Listen to the first sentence at that timestamp.
@@hrgwea OK, I was listening to something else when I saw your comment.
I agree that we will have self-driving cars as soon as we have the infrastructure in place.
It won't be "everywhere" on earth but it will be in Cities.
We already know how to automate machinery.
Self driving cars are just a larger scale of automation. Cheers!
@@EvenStarLoveAnanda It really depends on what you call a "self driving car", I believe the only way we can have a car without a driver is if the car is physically guided, and we kinda have those already, but for larger groups of people, namely trains and even those still have a driver!
Edit: A thought on why I think this. The world is just too chaotic, which computers don't like. At a certain point, when there's too much chaos, no amount of logic in the computer will be sufficient to make order out of the chaos, which it needs to function. I think there's a threshold. I'm curious now what the science behind this is :) By restricting the physical world the car acts in (what I meant with "physically guided"), i.e. create an infrastructure like the railways, the amount of chaos is reduced so that the computer can cope.
Maybe I'm just rambling I smoked a joint :)
Hi, I just have a question. I have internalized the concepts around Clean Architecture, but I think many people believe clean architecture is a set of folders, a kind of folder structure or something like that, and that the same structure can be applied to all projects; I mean, they use the same layers for frontend and backend development. The web is part of the infrastructure layer, but the web IS NOT A NEW SOLUTION; it is part of a solution. Actually, it is not really a question - I would very much like to hear from anybody who sees this scenario in their company or personal projects. Do we need use cases inside an Angular or reactive frontend app? Or are we misunderstanding clean architecture? Thanks.
19:01 - he's probably talking about Digi-Comp 1 (en.wikipedia.org/wiki/Digi-Comp_I)
Someone tell him that there have been 8-core CPUs since 2011... He just didn't buy one (since most people don't need them).
His point still stands, of course. We got a 16-core CPU only 8 years later... And here it will probably stop (we'll move on to better things, I believe)...
In general, Great series! (except the editor :( )
Same here, but much later: ZX Spectrum 48k, 12 years old, decision made. :)
good morning drill sergeant!
Try doing this.
Whenever I skipped some part, I saw Uncle Bob moving from right to left every time.
I am surprised by uncle Bob's lack of understanding of databases. Has he looked at Postgres lately?
I have not seen an object method implementation with the power of SQL (many have tried). There are times when you want to work on large sets of data, not individual objects.
The database provides much more than persistence. Most importantly data integrity and schema. Indexing and linking of data is also important. With an object relational (like Postgres) database you can also implement business logic.
His comment about code injection is wrong; look at how PHP implements PDO - there is no chance of SQL injection if it is used correctly.
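For readers outside PHP: the mechanism PDO relies on is parameterized queries, and the same idea exists in most database APIs. A minimal sketch using Python's built-in `sqlite3` (the table and values are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# Input that would break naive string concatenation into the SQL text:
evil = "alice' OR '1'='1"

# The placeholder (?) sends the value separately from the SQL statement,
# so the driver can never interpret it as SQL syntax.
hits = conn.execute("SELECT name FROM users WHERE name = ?", (evil,)).fetchall()
print(hits)  # [] -- the injection attempt matches nothing

safe = conn.execute("SELECT name FROM users WHERE name = ?", ("alice",)).fetchall()
print(safe)  # [('alice',)]
```

Concatenating `evil` directly into the query string would instead return every row, which is the classic injection the commenter is describing.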
Uncle Bob is right about code but really unaware of Tesla's capabilities
I decided to become a developer at 13 (I'm 30 now, hehe). I went to a course about graphic design, 3D design and, for some reason, PHP, and I loved it; a bit later I learned Java, hehe.
"If you give me a system that works perfectly now, but I cannot change it, that software will be worthless tomorrow."
Don't tell him about Blockchain.
Thats true
Damn, please fix up the camera switching next time... it sucks when the camera cuts to the speaker while the speaker is talking about the presentation.
52:00 Those self-driving cars are a reality by now.
No, it's not. You can't travel by self-driving car to another country.
What? They are not. What are you talking about?
They can only drive in the easiest situations. For example, how would a self-driving car change lanes in a traffic jam? Or how can it enter a roundabout filled with cars? Does it stop in the middle of the road and wait for the end of rush hour?
Anybody know what toy exactly Uncle Bob is talking about here? I'd love to get my hands on one if possible. Would be a neat little trinket, I think.
I do know he has a blog, called the Clean Code Blog, and one of the... posts? Mentions the toy and has a picture of it. The post wasn't named "my toy"; it was some story. Maybe it was the one about types, the first of the two such stories. It's a fun read too!
1:00:00 - 1:00:12 - cracked me up so hard!
Why use output boundary when you can just return ResponseModel?
I don't know who SBM is, but they should reconsider their video editing career path
What year was this recorded?
I feel like if there were actual code examples in this series it'd make things much easier to understand and sound less vague. In the first episode there was a concrete example, which helped, but there are fewer and fewer, and it's so long that it feels like he's waffling.
It's a shame, because he's definitely saying important stuff, but it's just hard to take it on board when he's not giving examples.
Read his book; you'll have all the examples there, bad and then good.
Moore's Law Is Dead, the YouTube channel, just got a good bit of free advertising
Moore's law isn't about core clock frequency: "Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years."
Also, if it were really about frequency like Uncle Bob is saying, then it would become harmful to humans once it reaches 6 GHz and above. So basing Moore's law on CPU clock frequency is silly, given that we already know there's a limit at which it stays safe for humans.
Right. And the growth in transistor count has indeed slowed below Moore's law, to a doubling about every 2.5 years.
What do you mean by harmful to humans regarding the CPU clock speed?
@@RoiTrigerman this graph github.com/karlrupp/microprocessor-trend-data shows otherwise.
> What do you mean by harmful to humans regarding the CPU clock speed?
Electromagnetic radiation: once the CPU hits 6 GHz and higher, it leaks and is dangerous to humans. The only way forward is either some sort of shielding, or a multiprocessing architecture, which is what modern computing is doing.
If you think about it, Uncle Bob is right in that the maximum frequency didn't go up at all; it just hovered around 3-4 GHz, and that was more than 10 years ago. Every manufacturer, in both software and hardware, is focusing on power consumption and/or multi-core architectures.
I'm just stating that Moore's law isn't about clock frequency, but rather the number of transistors. That's what Uncle Bob got wrong.
@@chris-ew9wl The graph you linked to shows what I claimed. It shows that the number of transistors increased by more than 10 times in 10 years; it's hard to get accurate numbers from it, but that seems about right. If the number doubled every 2 years, then after 10 years it would be 32 times bigger. Since the graph shows noticeably less than that, it means the number doubles after a little more than 2 years. Wikipedia says about 2.5 years (which would give 16x per decade), and that seems right according to your graph.
@@RoiTrigerman Oh, you're totally right. My math was wrong lol; I just divided by 10, where it should have been exponential like you said.
And are you sure you're not confusing CPU clock speed with radio frequency? Can you find any resource saying that CPU clock speed is somehow dangerous to humans?
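The arithmetic in this thread is easy to check: a doubling every p years over a 10-year window gives a factor of 2^(10/p).

```python
# Growth factor over a 10-year span for a given doubling period (years).
def growth_over_decade(doubling_period_years: float) -> float:
    return 2 ** (10 / doubling_period_years)

print(growth_over_decade(2.0))  # 32.0 -- Moore's original 2-year doubling
print(growth_over_decade(2.5))  # 16.0 -- the slower ~2.5-year doubling
```

So an observed ~10-16x per decade is what suggests the doubling period has stretched from 2 years toward 2.5-3 years, rather than Moore's law being flatly dead.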
Yeah if we are slow to deliver code, we are fired.
Love the pontification! Good business model from faked knowledge!
Coding better!!
got curious about the programming toy he mentions in the beginning, anyone knows the name of it?
It's the en.wikipedia.org/wiki/Digi-Comp_I
@@garypopov9158 That's pretty cool, thanks.
59:24 statement on self-driving cars and liability
25:28 The moment I started to think I was a god was when I dragged a file onto CMD and it opened 🤣🤣🤣🤣
When was this recorded?
I believe 2019
Thanks
59:24 -> It would be funnier if the answer of the car was "CHICAGO" ~LOL~!
I don't get it - files but not a database?
I think you get it
The point is not files vs. database. It's that they were able to defer the decision until the last minute and realized that they did not need a DB in the first place; files worked for their use case. If they had made the decision in haste at the start, they would have over-engineered their app by using a DB, and would have locked themselves into MySQL's licensing or whatever. Instead they were able to defer the decision, and they still have the ability to use a DB like MySQL if they REALLY need it.
@@blasttrash deferred persistence
what's not to get?
@@Ashalmawia got it, deferred persistence
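The "defer the decision" trick usually works by putting persistence behind an interface from day one, so "files vs. database" later becomes a one-class swap. A minimal sketch in Python (names like `PageStore` are made up for illustration; this is not FitNesse's actual code):

```python
from typing import Protocol

class PageStore(Protocol):
    """Persistence boundary: the wiki sees only these two methods."""
    def save(self, name: str, text: str) -> None: ...
    def load(self, name: str) -> str: ...

class InMemoryPageStore:
    """The simplest thing that works. A file-based or SQL-backed store
    with the same two methods could be dropped in later without touching Wiki."""
    def __init__(self) -> None:
        self._pages = {}
    def save(self, name: str, text: str) -> None:
        self._pages[name] = text
    def load(self, name: str) -> str:
        return self._pages[name]

class Wiki:
    """Business logic depends on the boundary, not on any storage technology."""
    def __init__(self, store: PageStore) -> None:
        self._store = store
    def edit(self, name: str, text: str) -> None:
        self._store.save(name, text)
    def view(self, name: str) -> str:
        return self._store.load(name)

wiki = Wiki(InMemoryPageStore())
wiki.edit("FrontPage", "Welcome!")
print(wiki.view("FrontPage"))  # Welcome!
```

Because nothing outside the store knows how pages are kept, the database decision stays deferred until (and unless) real requirements force it.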
Are we going to see self driving cars?
Uncle bob: Not a chance.
Deep Learning: Haha ;)
He also said that we wouldn't have 8-core CPUs.
He said sure, we have cars that can drive in good conditions with a human occupant, but we probably won't have completely autonomous cars that drive themselves with no human supervision. Has this changed?
Does anybody have a working implementation of the architecture Bob talks about?
do you mean a framework?
@@BienestarMutuo No, not a framework, just a code implementation. Is there any framework that uses Bob's clean architecture?
I implemented one myself in a laravel project. This might help someone.
github.com/rahamatj/clean-laravel
@@BienestarMutuo Haha, framework. It's a pattern, Bob's whole point is you should build your software using the pattern, not use some framework making you dependent on the framework and restricting your freedom to change the system.
@@Rob81k Yes that is right, but also is right that you need a framework "open sourced" that make your code easy and not develop all the "libraries" each time for each development.
So his critique of Ruby on Rails is that because it has an opinionated directory structure, it somehow implies that Rails is tightly coupled to the delivery mechanism (web/IO) and that it's hard to determine the intent of the app? ... What?
Shouldn't he realize that because Rails has an opinionated directory structure, it's very easy to determine the intent of *any* rails app by looking at the same directories each time (e.g. app/models, app/controllers, config/routes.rb)? As opposed to non-opinionated projects where every project has a different structure according to the team that built it and thus requires more time to determine the intent?
There are many valid criticisms of Rails, but this one is just lazy.
play at 1.5x or faster.
12:08 Yes! - A DVD and a cereal box :-)
25:30 lol like the community episode XD
I wonder how many front-end framework enthusiasts know what GWT is. Anybody?
57:47 That's unfortunate. The *question* is "What is Chicago?"
Do any Universities stress Uncle Bob's principles for software development?
We had a class on managing a project. I already knew the agile manifesto well enough to tell that the professor twisted it a bit. It takes a lot of therapy to digest Uncle Bob properly
Me laughing in 16 cores.
The power of 2 curve is gone, but processors are still getting bigger, just more slowly.
skip to 1:35 for the genius part
I'm dubious about the Interactor as a Use Case. If one manufactures a knapsack, the use cases refer to the many ways one might interact with the bag, not some decision maker in the process, do they not?
I feel bad that uncle bob is so wrong about self driving cars.
it's ok to be wrong. you can even sometimes _hope_ to be wrong. although I'm not aware that he's wrong yet about that, is he?
Meanwhile, driverless taxis started operating in Moscow in October, I believe.
TORONTO!