This is why it is essential that the software you design and write is "Easy To Change." If your software is written in such a way that it cannot easily change or evolve, it is no longer *soft*; rather, it is effectively firmware.
Well said. Management always looks to the "unicorn" devs who ship things fastest and then spend the next three months fixing bugs. Still, they're happier with these unicorn devs than with someone who takes more time initially but doesn't spend months fixing bugs. Writing clean, well-thought-out code takes time, but it helps everyone in the long run.
@@SufianBabri It takes time, but shipping a bug-free product six months later might still be the point at which you discover it needs radical change because it addresses the wrong problem. Then all that effort needs to be redone, you're six months behind, and you'll ship the next bug-free version in a year. The other type of dev, the one that's sought after, will ship with some risk, change fast, and deliver a product people want a year earlier.
@@centerfield6339 I am not talking about a bug-free product, which is virtually impossible to achieve. I don't think an extra week on an estimated two months of work is too much, especially if the resulting work has fewer issues. Putting pressure on developers only encourages them to leave behind a mess of hacky code and bugs, and worse yet, race conditions. It is easier to do something right the first time than to fix it once it has reached production. PS: even better than shipping buggy software (if doing that has ever made anyone profitable) is to do it the agile way. You don't need to spend six months in development before getting feedback from QA, investors, or customers.
Excellently put. This is probably because what we humans "want" is something in our imagination, and we haven't used it yet to learn its downsides, unlike the things we don't like.
As a developer in an aging, highly regulated enterprise, my connection to end users is mediated through lawyers, content specialists, designers, and marketing experts. With so many variables affecting the product's evolution and so many stakeholders that might change their minds, small incremental changes are all the more important to prevent the whole thing from evolving into a suicidal platypus.
As a developer who spent much of his career working in "highly regulated enterprises", change is still constant, even though some orgs and some people try to deny it 😉 This doesn't stop being true, and working in small steps is still the answer.
@@ContinuousDelivery I'm currently in one of these highly regulated enterprises: health care, pharmacy benefit management. The biggest issue I see is that, due to the nature of the business and the project-based relationship with the "business side", people get into this task-based groove and just do things the way they've always been done, "'cause that's how it's done". But that kills our creative problem-solving process, as well as the exploratory process of figuring out what the heck the software even needs to be able to do.
I think an important distinction to make is requirements changing because of user feedback, or lessons learned, vs. requirements constantly changing due to stakeholders or product owners having “commitment issues” and chasing every new user suggestion or shiny new idea they have. That’s a very different matter and will almost unavoidably result in code rot and dev burnout. It’s much worse than requirements just evolving because the things you’ve built are getting used (which is actually a good thing).
This, this is the problem. Changing requirements because of user needs is fine. But what we get most of the time is the other way around: management wanting to rush deadlines, everything being urgent (and when everything is urgent, nothing is), etc.
In a previous life I was an electronics maintenance technician for Uncle Sam, and one of the principles was that a guy who had to maintain a piece of equipment had to be an expert on operating it; otherwise, how could he know whether it was working correctly? In a like manner, a person developing a piece of software needs to know the user workflows. This knowledge will often enable us to deduce the requirements with minimal input from the user community.
I am glad I found this channel while I learn everything essential to get my first software job. I am seeing things differently now. I probably have the baseline skill to get a job already, but I want to see the overall picture so I can make sure I build great habits instead of bad ones.

Strange thing is, in 2020 I gave up on looking for a software job and got into digital painting. The process I built for learning and doing was similar to the software practices I was learning. Then I looked back and realized I didn't need to know everything. I don't need to study for years. All I have to be is open to change, and know that my solution at the time is not the best, but it's better than nothing. I can always come back and reshape or polish it to make it better. Now I am re-reading everything, and it all looks so different given the way I now think about writing software.

As I went through the fundamentals of art and used the concepts to build digital art, even though I didn't fully understand them, I still made decent art. The whole process of making rough sketches, playing around with the fundamentals, and then producing the final piece feels so similar to software. At the beginning stages of a painting I keep it rough and vague. I can add or remove rough objects that stand in for the real things and try to find the art I want. Then, when I have the overall rough picture, I can go in and start detailing the objects, turning them from random shapes into the actual objects they represent. Going back to reading about software after that, something clicked in my head. Now all I have to do is finish my reading, try out multiple concepts, and complete a decent project to expand my knowledge of the tools and concepts I can use to build software like an art piece.

In art I developed my own process for learning and creating things. I realized that although I could make programs, I didn't have a guide to all the information I needed to create a reliable process.
This is why my code was always horribly bad, especially when I tried TDD. That's what made me give up in 2020, by the way. I thought my process in software was decent; it got me through college. I wasn't open to change, and I never really saw others create better software, so I had nothing to go by.

So in late 2022 I bought a subscription to ACM and tried to find books on multiple subjects. Somehow, by Feb 2023, after a lot of trial and error, I finally found decent ones that didn't assume the reader already had a job and knew half the stuff they didn't cover.

1. Java: A Beginner's Guide. I have no clue what I learned in college, but this book made me see Java so much differently. It made me question whether I really knew Java all this time.

2. Spring Start Here. Where was this book in 2020? I think it was released in late 2021. This book finally gave me a path to create a whole process from front end to back end and build upon it. Before this, all I saw were bits and pieces of Spring that made no sense whatsoever.

Now I am finding all the data structures, algorithms, and design pattern books I can. I wonder what college ever taught me. I am sure it was something, but I know now that my young brain back then didn't see things the way it does now. I was so concerned about passing the classes and not having to pay thousands to repeat them that I only did what was needed. I don't think I ever truly understood the stuff. Having the time to actually explore it at my own pace is so eye-opening.
Constantly changing user requirements are life. Software is automation. The difference between now and the past is not that we are done now, but that we are automating different things. The focus of software development has also shifted as hardware capacity constraints went from limited to seemingly limitless. Software development should be about user collaboration. In the same way you need to test the logic of your software, you need to test its usefulness. As long ago as the early eighties I had end-users test my software. I learned very early in my career that it can be much more productive to be a developer on a team of users than on a team of developers.
@@ContinuousDelivery True. Another approach is to flip the script: users develop the software. It's been my experience that most developers tend to believe they're smarter than the users. However, while disciplines may require technical knowledge, that knowledge doesn't define smarts: it's highly likely that there are users of any software who are "smarter" than the people who developed it. At least that's what I intend to find out with my platform. If I'm right, you, Dave, will likely want to interview me ~2 years from now. If I'm wrong... we will never speak of this! 😅 _(but I'm not wrong)_
This is a fun meta-level conversation. But on a more pragmatic level, when a client approaches me, it's either openSpec == openCheque or fixed cost == fixed spec. Honestly, I rather prefer the more fluid projects; constraints exercise creativity, after all. But not when it's an abuse of me as a resource. Enjoying your content, cheers!
My dad has been a software developer a long time. He once told me that requirements do not change; your understanding of the problem just becomes clearer as you are immersed in the problem space.
Sorry to disagree with your Dad, but I am afraid that I do. I think that they do change, nearly always. The requirements for version 1 of Windows were VERY VERY different to the requirements for version 11, for example. Requirements are, in my opinion, an ever-evolving thing, at least they are for successful products and projects. I agree with your Dad that as you become more immersed in the problem space, your understanding also changes, but it does this as well as the requirements actually changing I think. With best wishes to your Dad 😉
The project I'm currently working on is rooted in the Model T "design pattern." We're trying very hard to expand it in a modular way, and we have a reimplementation in the works. But new features and changes are always coming in, making modernization extremely hard. It seems to be a never-ending hurdle.
It's amazing the detriment bad management can have on engineers. So many people, especially self-taught folks trying to land that first job, have this mindset that software is all about "build build build" ... but really it's a LOT slower than that, and it's actually much less "task" oriented and much more discovery/exploratory based. People get into this mindset of "oh, here is this list of tasks I need to do, and we do this here, and this there...." but that actually ends up killing their creativity and possibly hindering our exploratory process for figuring out what exactly our software even needs to do. Once you free your mind to the millions of ways there are to go about solving problems everything changes.
Great video, thank you! I've always liked how you frame the quality of code (or a system) as the measure of how easy it is to change. From this video, I'm going to add the following to my high-level principles:
- make change easy, safe, and low-cost
- allow for mistakes
I really like the concept "Cost Of Change". What does it take for a process / piece of code / etc. to change? Work to minimize this cost as much as possible (within reason)
I think that if you do that you will always be in a better place. The tricky part is that, for this strategy to work, it has to be easy to make things easy to change 😉 This is really, fundamentally, what Continuous Delivery is all about.
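The "cost of change" idea discussed above can be sketched in code. This is a minimal, hypothetical example (the `Notifier` interface and its implementations are invented for illustration, not taken from the video): when calling code depends on a small interface rather than on a concrete implementation, swapping behaviour costs one line at the call site instead of a rewrite.

```python
from typing import Protocol


class Notifier(Protocol):
    """Anything that can deliver a message to a user."""
    def send(self, message: str) -> str: ...


class EmailNotifier:
    def send(self, message: str) -> str:
        return f"email: {message}"


class SmsNotifier:
    def send(self, message: str) -> str:
        return f"sms: {message}"


def alert_user(notifier: Notifier, event: str) -> str:
    # Depends only on the Notifier interface, so the delivery
    # mechanism can change without touching this function.
    return notifier.send(f"event occurred: {event}")


print(alert_user(EmailNotifier(), "login"))  # email: event occurred: login
print(alert_user(SmsNotifier(), "login"))    # sms: event occurred: login
```

The point is not the pattern itself but the property it buys: the decision about *how* to notify users is isolated in one place, so changing it later is cheap.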
@@haxwithaxe True enough. However, if you're not a user there is very little chance of you ever relating to the diversity within your base. Conversely, if you're a user, then you will be able to relate to said diversity. You just need to define what being a user means to you: are you locked in on one workflow or use, or are you open to discovering new workflows and uses for the products you create?
The biggest enemy to a good development process is to put a management hierarchy between the end users and development teams. We just spent 3 weeks building the wrong thing due to whispered requirements. Turned out that we'd already implemented what they actually needed. Wasn't completely wasted, did some useful refactoring along the way, useful for the next sprint that is. Not the first time this has happened.
The iPhone example is even more telling. Jobs called it a phone, an iPod and an internet communicator. The first two were known concepts and got a huge applause. The last one was something abstract and unknown and got a mild applause. But the big revolution was the internet communicator (and the App Store later on). Social media on the phone, etc. And indeed streaming media instead of the iPod functionality.
Software developers should get clear and concise requirements. If changes are to be made, a proper amount of extra time should be given to take those changes into account. Most problems with changing requirements result from a lack of time to properly implement changes (which sometimes demand a complete rewrite of already-written code) and from pressuring the dev team to deliver even a broken feature. This way, every software stack becomes a hot steaming pile of garbage over time.
If you write in small chunks and get user feedback frequently, your rewrites are small chunks too, and you don't need to throw away huge amounts of code. That only happens if you aren't actually talking to the users, so that when the real users show up their feedback requires you to trash everything. That's a communication problem, though, not a dev-process problem.
I have this question in mind: how do other engineering disciplines differ from software engineering? Obviously, they plan things in advance and can make innovative new designs as well. So why can't we do that in software engineering? Why can't we plan ahead? Please enlighten me.
Because we can iterate much faster, and we know very little ahead of time. You're not going to find someone that knows exactly what the market wants ahead of time.
@@PavelHenkin If that's the case, then why can't we do some kind of research to determine that first, before actually writing the code? Basically, know the market first?
I work in engineering on the mechanical side of things -- I'm afraid your premise is incorrect. As soon as you add "innovative" to "planned in advance", we run into a huge mire of missteps and strife. Planning ahead only works with very clearly defined problems, often with very strictly regulated solutions. Even then, we run into tremendous problems (see pretty much any large construction project anywhere). Any prototyping effort will go through numerous revisions and refinements before you get anywhere close to a finished product.
Hillel Wayne did a load of interviews to research this question. As I remember, the biggest conclusion is that there's huge variation among non-software engineering processes.
Well, I guess it's only recently, with 3D printing, that mechanical engineers have been able to go agile. Sure, they could always optimize a design virtually, but now they can print a new piece over and over again, like us. That said, we get much closer to production than they can, and once we reach production it's much easier for us to change direction.
Without users and their changing requirements, I'd be sat at home writing boring coding katas AND my bank account would be empty... long live the users say I.
I think we have to evolve the discussion. Yes, we have to embrace change, and we have embraced it. But given changing requirements, companies should also provide extra resources and allocate extra time for developers. Strangely enough, we haven't evolved to that, and some don't even want to discuss it; it seems they want to have their cake and eat it too.
No, this is not true. Product development is a process of discovery. Software development as a wider phenomenon is not entirely a process of discovery (as in discovery with a user). Linux kernel development is not a discovery process with a user. It is an engineering process where the user gets what the engineers provide.
Except it's open source. The user can discover something and then create a change. It depends on who the user is, too. In this case, it's engineers making things for other engineers most of the time. Maybe a user discovered a security vulnerability?
@@antdok9573 A user can file a bug or vulnerability report against the majority of products, regardless of whether they're open or closed source. That doesn't determine the techniques used for their development.
I'm sorry, but I do not understand your viewpoint. Linux counters your argument directly. There seems to be a misunderstanding about the overlap between product and software development, especially when the product is the software. In open-source projects like the Linux kernel, the line between "user" and "developer" is blurred. Many users are also developers. So, saying users just receive what engineers give is off the mark. Users can modify, contribute to, and even fork projects. With 20,000+ contributors to Linux, who's the real user? Even closed-source projects, particularly those that update frequently, often use open dialogues and real-time metrics to gauge how their software is utilized.
@@antdok9573 Well, that depends on how we define a user. Taking into account that all Androids, many clouds, etc. have underpinnings in Linux, I would say that the vast majority of users have absolutely no influence on or knowledge of this thing. They take what is given to them. But maybe you are correct that the Linux kernel is not the best example; it is a complicated thing. My point is that software development comes in many shapes and forms, and hitting everything with the Agile hammer isn't the best thing under the sun. Google isn't using Agile techniques, and it knows what it's doing. I like the debate between Robert C. Martin and James Coplien. It was about "how would you design a bank, and would your user have any meaningful input into it?". The answer is "rather not", because if you designed it with your user, you would design a calculator. A bank is not a calculator, although bank transfers have some appearance of one. Your user is an obstacle in this case. What you need is a very deep understanding of the domain. Your user just receives an endpoint to a very deep system of which he has no knowledge.
All software dev is about discovery on multiple levels. Sure, Linux was an exploration: Linus wondered, "Could I build my own UNIX to run on the x86 processors that are in the computers I use?" Look at all the different Linux distros and UI options; what is that if not exploration of customer need? A "user" isn't a different species. For some things, like a programming language or a fairly technical OS like Linux, programmers and other technical people are of course "users". Linux has evolved enormously since its birth to meet user demand.
Hmm I see streaming as an evolution of the iPod not a replacement of it. The original iPhone couldn't have had streaming capabilities because the overwhelming majority of cell phone infrastructure didn't support the data requirements it would have taken. (Note: while I'm reasonably confident in what I typed, I didn't look up any numbers or anything to verify)
All this is fine and dandy if you are employed by the company for which you are developing software. If you are a software project manager in a third-party contractor organisation where you are handed a fixed price contract with a bunch of requirements, and the customer keeps changing the requirements, but refuses to approve change requests because they don't want to pay more, you will develop an aversion to humanity.
What I said is still true in the case that you describe; it is just that the people involved are acting irrationally. This means it can't work out very well in reality, which is what usually happens: the customer doesn't get what they want, because they don't know what they want at the start, or the creators are cheated by being asked to make one thing, and being paid for that, but then expected to build another. So the only sane answer is to do what I say in the video: EVERYONE needs to agree that change is inevitable, and then find a fair way to deal with that. The trouble is that much of the world *is* irrational.
@@ContinuousDelivery Oh, they know what they are doing. They know they can shortchange the contractor because they have the power to arm-twist. Sometimes I have learned from sources inside the client organisations that this is the modus operandi of some procurement managers: contract for something within the budget they are given and sneak in more work, so that they can showcase their "achievement beyond target" or "savings realized" in their KRAs. My team and I have ended up working late hours and weekends to "maintain good client relationships", until burned out. But who cares? Software developers are a "dime a dozen". What you suggest is fine when people act fairly, not when they are being driven to exceed the targets in their KRAs.
In my experience, frustrating requirement changes happen mostly when somebody has a monopoly on design and decisions. If one person designs the requirements and their decisions aren't being challenged before implementation, then the solution will have a lot of room for improvement, and the requirements will change to match. If you always question the requirements and ensure that they are both well understood and actually addressing the core user need, you will see that the solutions being developed need much less refining afterwards.
It's not the changes that are the problem, it's the arrogant and belittling way they are imposed from above by people who can't be bothered to put in the legwork to properly analyze the problem, but just make a wild guess about what implementation detail change might do the trick.
Yes. This is what I say: we shouldn't still be debating changing requirements, as we already embrace them. Let's talk instead about the places that haven't evolved yet and, it seems, don't want to embrace them.
Not so. What most product experts say is that users don't know what they want, but they are clear about what they don't want, though only once they have seen it. So the job is to show users our best guess, find out what they don't like, fix that, and try again. It is a process of design/product evolution!
Because users don't know what they want, until they get to see what they don't want.
This is called Humphrey's Law.
I've been programming since 1965. This is 100% correct. Users don't know what they want. They only know what they don't want.
😎
"suicidal platypus" is opening at tonight's show.
Users have general ideas about what they want, but specific ideas about what they _don't_ want.
Yes, exactly, so you have to show them your best idea, see how they dislike it, change your ideas, and try again.
@@ContinuousDelivery True. Another approach is to flip the script: users develop the software.
It's been my experience that most developers tend to believe they're smarter than the users. However, disciplines may require technical knowledge but they don't define smarts: it's highly likely that there are users of any sw that are "smarter" than the people who developed it. At least that's what I intend to find out with my platform. If I'm right, you, Dave, will likely want to interview me ~2 years from now. If I'm wrong.... we will never speak of this! 😅 _(but I'm not wrong)_
This is a fun meta-level conversation. But on a more pragmatic level, when a client approaches me, it's either openSpec == openCheque or fixed cost == fixed spec. Honestly, I rather prefer more fluid projects; constraints exercise creativity, after all. But not when it's an abuse of me as a resource. Enjoying your content, cheers!
Failure is not an option... it's a requirement. Not enough people get that and I'm glad you devoted an entire video to the concept.
My dad has been a software developer a long time, he once told me that requirements do not change, your understanding of the problem just becomes clearer as you are immersed in the problem space.
Sorry to disagree with your Dad, but I am afraid that I do. I think that they do change, nearly always.
The requirements for version 1 of Windows were VERY VERY different to the requirements for version 11, for example. Requirements are, in my opinion, an ever-evolving thing, at least they are for successful products and projects.
I agree with your Dad that as you become more immersed in the problem space, your understanding also changes, but it does this as well as the requirements actually changing I think.
With best wishes to your Dad 😉
you guys are getting requirements?!
The project I'm currently working on is rooted in the Model T "design pattern." We're trying very hard to expand it in a modular way, and we have a reimplementation in the works. But new features and changes are always coming in, making modernization extremely hard. It seems to be a never-ending hurdle.
It's amazing what a detriment bad management can be to engineers. So many people, especially self-taught folks trying to land that first job, have this mindset that software is all about "build build build"... but really it's a LOT slower than that, and it's actually much less task-oriented and much more discovery/exploration based.
People get into this mindset of "oh, here is this list of tasks I need to do, and we do this here, and this there...." but that actually ends up killing their creativity and possibly hindering our exploratory process for figuring out what exactly our software even needs to do.
Once you free your mind to the millions of ways there are to go about solving problems everything changes.
Great video thank you! I've always liked how you frame the quality of code (or a system) as the measure of how easy it is to change.
From this video, I'm going to add the following to my high-level principles:
- make change easy, safe and low-cost
- allow for mistakes
I really like the concept "Cost Of Change". What does it take for a process / piece of code / etc. to change? Work to minimize this cost as much as possible (within reason)
I think that if you do that you will always be in a better place.
The tricky part is that, for this strategy to work, it has to be easy to make things easy to change 😉 This is really, fundamentally, what Continuous Delivery is all about.
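The "Cost of Change" idea in this thread can be sketched in code: keep the decision that is most likely to change behind a small seam, so a changed requirement lands in one place instead of being scattered through the codebase. A minimal, hypothetical Python sketch (all names here are invented for illustration, not from the video):

```python
from typing import Callable

# The volatile decision (how an item is priced) lives behind a tiny seam.
PricingRule = Callable[[float], float]

def standard_price(base: float) -> float:
    """Today's requirement: charge the base price."""
    return base

def discounted_price(base: float) -> float:
    """Tomorrow's requirement: a 10% promotion."""
    return base * 0.9

def invoice_total(items: list[float], rule: PricingRule) -> float:
    # The billing logic never changes when the pricing rule does.
    return sum(rule(item) for item in items)

# Swapping the rule is a one-line change at the composition point:
total = invoice_total([100.0, 50.0], standard_price)    # 150.0
promo = invoice_total([100.0, 50.0], discounted_price)  # 135.0
```

The point is not the pattern itself but the property it buys: the cost of the requirement change is one line at the call site, which is about as low as the "Cost of Change" can get.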
If you are not a user of your own software, you will never truly understand your users.
If the system is big or complex enough we're unlikely to effectively understand users with different patterns of use despite being a user.
@@haxwithaxe True enough. However, if you're not a user there is very little chance of you ever relating to the diversity within your base. Conversely, if you're a user, then you will be able to relate to said diversity. You just need to define what being a user means to you: are you locked in on one workflow or use, or are you open to discovering new workflows and uses for the products you create?
@@yapdog yup, exactly. I was agreeing just extending.
That is certainly ideal, but not always possible; even so, you still need to understand the software as though you were a user - that's the job, or should be!
@@ContinuousDelivery For clarity's sake, for what types of projects is it not possible?
The biggest enemy to a good development process is to put a management hierarchy between the end users and development teams. We just spent 3 weeks building the wrong thing due to whispered requirements. Turned out that we'd already implemented what they actually needed.
It wasn't completely wasted; we did some useful refactoring along the way. Useful for the next sprint, that is.
Not the first time this has happened.
The iPhone example is even more telling. Jobs called it a phone, an iPod and an internet communicator. The first two were known concepts and got a huge applause. The last one was something abstract and unknown and got a mild applause. But the big revolution was the internet communicator (and the App Store later on). Social media on the phone, etc. And indeed streaming media instead of the iPod functionality.
Software developers should get clear and concise requirements. If changes are to be made, a proper amount of extra time should be given to take those changes into account. Most problems with changing requirements result from a lack of time to properly implement the changes (which sometimes demand a complete rewrite of already-written code) and from pressuring the dev team to deliver even a broken feature. This way every software stack becomes a hot steaming pile of garbage over time.
If you write in small chunks and get user feedback frequently, your rewrites are small chunks too, and you don't need to throw away huge swathes of code. You only trash everything when you haven't actually been talking to the users and the real users show up with feedback that demands it. That's a communication problem, though, not a dev-process problem.
I have this question in mind: how do other engineering disciplines differ from software engineering? Obviously, they plan things in advance and can make innovative new designs as well. So why can't we do that in software engineering? Why can't we plan ahead? Please enlighten me.
Because we can iterate much faster, and we know very little ahead of time. You're not going to find someone that knows exactly what the market wants ahead of time.
@@PavelHenkin If that's the case, then why can't we do some kind of research to determine that first, before actually writing the code? Basically, knowing the market first?
I work in engineering on the mechanical side of things -- I'm afraid your premise is incorrect. As soon as you add "innovative" to "planned in advance", we run into a huge mire of missteps and strife. Planning ahead only works with very clearly defined problems, often with very strictly regulated solutions. Even then, we run into tremendous problems (see pretty much any large construction project anywhere). Any prototyping effort will go through numerous revisions and refinements before you get anywhere close to a finished product.
Hillel Wayne did a load of interviews to research this question. As I remember, the biggest conclusion is that there's huge variation among non-software engineering processes.
Well, I guess it's only recently, with 3D printing, that mechanical engineers could go agile: sure, they can optimize a design virtually, but now they can also print a new piece over and over again, like us. That said, we get much closer to production than they can, and once we reach production it's much easier for us to change direction.
Without users and their changing requirements, I'd be sat at home writing boring coding katas AND my bank account would be empty... long live the users say I.
I think we have to evolve the discussion. Yes, we have to embrace changes, and we have embraced them. But given changing requirements, companies should also provide extra resources and allocate extra time for developers. Strangely enough, we haven't evolved to that, and some don't want to discuss it, as if they want to have their cake and eat it too.
No, this is not true. Product development is a process of discovery. Software development as a wider phenomenon is not entirely a process of discovery (as in discovery with a user). Linux kernel development is not a discovery process with a user. It is an engineering process where the user gets what the engineers provide.
Except it's open-source. The user can discover something and then create a change. Depends on who the user is, too. In this case, it's engineers making things for other engineers most of the time. Maybe a user discovered a security vulnerability?
@@antdok9573 A user can file a bug or vulnerability report against the majority of products, regardless of whether they are open or closed source. That doesn't determine the techniques used for their development.
I'm sorry, but I do not understand your viewpoint.
Linux counters your argument directly.
There seems to be a misunderstanding about the overlap between product and software development, especially when the product is the software.
In open-source projects like the Linux kernel, the line between "user" and "developer" is blurred. Many users are also developers. So, saying users just receive what engineers give is off the mark. Users can modify, contribute to, and even fork projects. With 20,000+ contributors to Linux, who's the real user?
Even closed-source projects, particularly those that update frequently, often use open dialogues and real-time metrics to gauge how their software is utilized.
@@antdok9573 Well, that depends on how we define a user. Taking into account that all Android devices, many clouds, etc. have underpinnings in Linux, I would say that the vast majority of users have absolutely no influence over or knowledge of this thing. They take what is given to them. But maybe you are correct that the Linux kernel is not the best example; it is a complicated thing.
My point is that software development comes in many shapes and forms, and hitting everything with the Agile hammer isn't the best thing under the sun. Google isn't using Agile techniques, and it knows what it's doing. I like the debate between Robert C. Martin and James Coplien. It was about how you would design a bank, and whether your user would have any meaningful input into it. The answer is "rather no", because if you did this with your user you would design a calculator. A bank is not a calculator, although bank transfers have some of the appearance of one. Your user is an obstacle in this case. What you need is a very deep understanding of the domain. Your user just receives an endpoint to a very deep system of which he has no knowledge.
All software development is about discovery on multiple levels. Sure, Linux was an exploration: Linus wondered, "Could I build my own UNIX to run on the x86 processors that are in the computers I use?"
Look at all the different Linux distros and UI options; what is that if not exploration of customer need?
A "user" isn't a different species. For some things, like a programming language or a fairly technical OS like Linux, programmers and other technical people are of course "users".
Linux has evolved enormously since its birth to meet user demand.
Change is the only constant.
Hmm I see streaming as an evolution of the iPod not a replacement of it. The original iPhone couldn't have had streaming capabilities because the overwhelming majority of cell phone infrastructure didn't support the data requirements it would have taken. (Note: while I'm reasonably confident in what I typed, I didn't look up any numbers or anything to verify)
All this is fine and dandy if you are employed by the company for which you are developing software. If you are a software project manager in a third-party contractor organisation where you are handed a fixed price contract with a bunch of requirements, and the customer keeps changing the requirements, but refuses to approve change requests because they don't want to pay more, you will develop an aversion to humanity.
What I said is still true in the case that you describe; it is just that the people involved are acting irrationally. Which means that this can't work out very well in reality, and that is what usually happens: the customer doesn't get what they want, because they don't know what they want at the start, or the creators are cheated by being asked to make one thing, and being paid for that, but building another.
So the only sane answer is to do what I say in the video, and EVERYONE needs to agree that change is inevitable, and then find a fair way to deal with that.
The trouble is that much of the world *is* irrational.
@@ContinuousDelivery Oh, they know what they are doing. They know they can shortchange the contractor because they have the power to arm-twist. Sometimes I have learned from sources inside the client organisations that this is the modus operandi of some procurement managers: contract for something within the budget they are given and sneak in more work, so they can showcase their "achievement beyond target" or "savings realized" in their KRAs. My team and I have ended up working late hours and weekends to "maintain good client relationships", until burned out. But who cares? Software developers are "a dime a dozen". What you suggest is fine when people act fairly, not when they are being driven to go beyond target in their KRAs.
In my experience, frustrating requirement churn happens mostly when somebody has a monopoly on design and decisions. If one person designs the requirements and their decisions aren't challenged before implementation, then the solution will have a lot of room for improvement, and the requirements will change to match. If you always question the requirements and ensure that they are both well understood and actually address the core user need, you will see that the solutions being developed need much less refining afterwards.
It's not the changes that are the problem, it's the arrogant and belittling way they are imposed from above by people who can't be bothered to put in the legwork to properly analyze the problem, but just make a wild guess about what implementation detail change might do the trick.
Yes. This is what I'm saying: we shouldn't still be debating changing requirements, as we embrace them already. Let's talk about where projects haven't evolved yet and, it seems, don't want to embrace change.
This. Just this.
If no one knows the correct answer, any software would be perfect to deliver to customers.
Not so. What most product experts say is that users don't know what they want, but they are clear about what they don't want, though only once they have seen it.
So the job is to show users our best guess, find out what they don't like, fix that, and try again. It is a process of design/product evolution!
The most precise user requirement description is the code implementation.
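That aphorism is often made concrete by expressing a requirement as an executable test: prose leaves boundary conditions ambiguous, while code does not. A tiny, hypothetical Python sketch (the function and numbers are invented for illustration):

```python
def shipping_cost(order_total: float) -> float:
    """Requirement: orders of 50.0 or more ship free; otherwise a flat 4.99 applies."""
    return 0.0 if order_total >= 50.0 else 4.99

# The same requirement, stated precisely as executable checks.
# Note how the tests nail down the detail prose usually omits:
# is the 50.0 threshold inclusive or exclusive?
assert shipping_cost(50.0) == 0.0    # boundary is inclusive: exactly 50.0 ships free
assert shipping_cost(49.99) == 4.99  # just below the threshold pays the flat rate
```

Whether the implementation or the test suite is "the requirement" is debatable, but either way it is the code, not the written spec, that answers the edge-case questions.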