🎯 Key Takeaways for quick navigation:
00:00 🎤 Richard Campbell, an experienced software developer, discusses the future of software development, including trends and challenges in the next decade.
02:07 🧠 Moore's Law, the trend of doubling the number of transistors on a chip every 18-24 months, is reaching its limits, and future improvements in computing power will be constrained.
05:39 💻 Architectural improvements and optimizing processors become crucial as Moore's Law ends, and companies like Apple's M1 and M2 chips showcase advancements in architecture.
09:59 📉 Network limitations, especially with 5G and 6G, will be a significant constraint in the future, impacting data transfer and communication between devices.
17:37 🌐 Ubiquitous computing, where every device has an IP address and connects to the network, will become more prevalent, driven by innovations like satellite networks and increased connectivity.
19:43 🦠 The COVID-19 pandemic has accelerated cloud migration and changed work dynamics, but economic impacts, loss of expertise, and supply chain disruptions have also posed challenges.
21:47 💡 Software developers may face economic challenges and a shift from a growth mindset to a focus on return on investment as the industry evolves.
22:31 🏭 Companies should focus on building tools that help them make money and work more efficiently.
23:00 📈 Companies are becoming more reflective and reevaluating their priorities due to economic changes and technological advancements.
24:09 🌐 The browser market is relatively stable, with Chrome, Safari, Edge, and Firefox being major players, and web development is still mostly focused on browsers.
25:17 🧩 Blazor and WebAssembly offer opportunities for running client-side code in various environments, with potential for server-side use as well (see the first sketch after this list).
29:11 🕸️ The concept of decentralized web is gaining popularity, offering more control and flexibility while avoiding some centralization drawbacks.
32:07 🚀 Web 3.0, with concepts like blockchain, has potential but has been widely misapplied and faces issues with crypto and centralization.
33:33 🆕 .NET has made significant advancements, transitioning to a cloud-centric, heterogeneous client platform while maintaining compatibility and familiar skills.
35:05 🔮 MAUI (Multi-platform App UI) by Microsoft aims to provide a unified client development model for multiple platforms, although it's still evolving.
36:45 🛠️ The Power Platform empowers domain experts to build apps and solves UI multi-platform issues, leading to increased productivity and ease of use.
39:03 🛡️ Containerization of software is growing, with potential applications in desktop machines to enhance security against exploits and attacks.
42:28 🧭 Building a successful career in development involves choosing between staying on the Leading Edge or becoming an expert in a specific technology stack.
43:23 🤖 Artificial Intelligence (AI) is an umbrella term often used for technologies that don't work yet; when they do, they receive new names like deep learning or predictive analytics.
44:19 🖥️ The current era of software development incorporates various AI technologies like image recognition, speech bots, and form recognizers that are readily available as libraries, making it easier to use AI in software development.
45:30 🤖 Machine learning models, like form recognizers, can be used to automatically recognize forms and associated data, streamlining the digitization of forms without manual coding (see the second sketch after this list).
46:23 📈 Analyzing data using advanced analytics and machine learning is a valuable career opportunity, as the volume of data continues to increase, and the cloud provides better tools for advanced data analysis.
47:47 ⚙️ Beyond predictive analytics, the integration of prescriptive analytics, which combines predictive models with actions, enables automated decision-making, as seen in personalized marketing strategies and emergency planning.
49:25 🎮 Tools built on machine learning models, like GitHub Copilot, are becoming valuable for programming, assisting developers in generating code snippets, but human evaluation and discernment remain essential.
52:01 🕶️ Although AR headsets are a promising technology, their commercial adoption faces challenges, and 2023 might not be a significant year for AR headset advancements.
58:07 ⚛️ Quantum computing's potential lies in solving complex problems in fields like agriculture, chemistry, and material science, but the technology is still in its early stages and requires significant development.
01:05:49 🛠️ Quantum computing is still at a stage reminiscent of early mechanical mainframes, with various approaches to building quantum computers and the need for a stable and reliable qubit technology.
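To make the Blazor/WebAssembly takeaway above a bit more concrete, here is a minimal sketch of a Razor component. The file name and markup are illustrative, not from the talk; the point is that the same C# component model can be hosted client-side in the browser via WebAssembly or server-side over SignalR.

```razor
@* Counter.razor - illustrative component; the name and markup are hypothetical.
   Hosted with Blazor WebAssembly it runs as .NET compiled to wasm in the browser;
   hosted with Blazor Server the same component runs on the server over SignalR. *@
<h3>Counter</h3>
<button @onclick="Increment">Clicked @count times</button>

@code {
    private int count;                    // UI state lives in C#, not JavaScript
    private void Increment() => count++;  // the framework re-renders automatically
}
```

Which hosting model you pick is a project-level choice rather than a rewrite, which is part of the "various environments" appeal the takeaway refers to.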
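And for the form recognizer takeaway, a rough sketch of calling a prebuilt document model from .NET. This assumes the Azure.AI.FormRecognizer client library, with placeholder endpoint, key, and file name; exact type names can differ between SDK versions, so treat it as an outline rather than the speaker's example.

```csharp
// dotnet add package Azure.AI.FormRecognizer
// The endpoint, key, and file name below are placeholders, not values from the talk.
using System;
using System.IO;
using Azure;
using Azure.AI.FormRecognizer.DocumentAnalysis;

var client = new DocumentAnalysisClient(
    new Uri("https://<your-resource>.cognitiveservices.azure.com/"),
    new AzureKeyCredential("<your-key>"));

using var form = File.OpenRead("scanned-form.pdf");

// The prebuilt "document" model extracts key/value pairs with no custom training.
AnalyzeDocumentOperation operation =
    await client.AnalyzeDocumentAsync(WaitUntil.Completed, "prebuilt-document", form);

foreach (var pair in operation.Value.KeyValuePairs)
{
    if (pair.Key is not null && pair.Value is not null)
        Console.WriteLine($"{pair.Key.Content} -> {pair.Value.Content}");
}
```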
thank you very much
Bravo🎉🎉 Thanks for summary ❤❤
Thanks, you saved my time
So, he is describing the state of things, not the next 10 years?
And you, my friend, are a real hero.
Interesting topic and well presented, I certainly learned a few new things. Thanks!
Wish I had that level of confidence and body language. Great presentation.
"John Carmack, the guy behind Oculus" 🤔 ... Doom be upon us
This has put me off watching it now
He quit Meta a while ago.
Wonderful talk. Thank you for sharing this with all of us!
Good talk, but I wish Richard would have spent more time (or even some at all) on software development of the future as the title suggests.
It's very important to understand the architecture. Imagine: AI was not possible in the '50s because there was no nanotech in place. All AI algorithms rely on theories built decades ago.
Thanks, Uncle Richard, it really helped broaden the way I think.
Great talk. Nothing to do with software development.
Appreciate this talk. Thank you sir!
I feel like the talk could have been called "The Next Decade Of Microsoft Software Development"
Very much so. Living in the Linux space, this is very obviously framed from the perspective of a "developer in the corporate enablement space".
If you are genuinely trying to build infrastructure - kernel dev work, building GUI toolkits, firmware/hardware dev work - this ignores a whole host of issues.
No one talks about the compute overhead of webapps added up across an entire electrical grid. It's not a problem for one user, or even one corporate user, but it is bad from a social infrastructure standpoint, because you are building a standard where A) applications are bottlenecked by their container/wrapped browser, and B) even though resources are abundant, the aggregate inefficiency shouldn't be ignored.
That doesn't even include the fact that you lose sovereignty over your software stack. Your software provider can now - and this started with subscription models (the "pay me again this year for the privilege of using work I did last year" model) - arbitrarily modify the tools you use. For most end users it doesn't matter, but for anyone trying to do genuinely technical work it can be a serious resource drain. Stable, standalone software is valuable.
When we think about software development, it helps to first look back decades to when the first software was designed, and then at how software design has changed over the last four decades - a big change that will keep going without end. Mr. Richard Campbell, this was such a helpful and interesting video. Thanks so much
Thanks, from Vietnam 😊
Such a good talk. I want him to make a new one every three months due to the release of these generative models like Gemini 1.5!
Thought-provoking talk, Richard. Thanks for presenting this, so many takeaways!
No one could have predicted that the "choose Unity" part of this talk would be what aged poorly just 5 months later 😆
People don't use Chrome because they prefer the browser. They like the integration with the Google mother ship.
Wonderful talk.
Very interesting. Thank you!
Up-to-date and informative presentation here. That's awesome. Thank you!
WOW. Great Talk 🤯
This explains so much! 😮
Great talk
Funny to mention Visual Studio 2010. In the current version of Visual Studio 2022 (17.7.0), even the Start Page is back in place of the startup screen 🤣. No confusion there.
Amazing presentation!!
Thanks for confirming some of our predictions about AI, for example. It is useful to understand trends in order to be able to create software that will remain relevant.
Understanding the trends and advancements in AI is crucial for creating software that stays relevant and meets the evolving needs of users. By staying updated on the latest developments, we can harness the power of AI to deliver innovative solutions that align with the changing landscape.
appreciate the learnings from the talk
Wow, such great information. Thank you.
I am enjoying this
Another thing that might help is to document the component if there is no existing documentation. That will also help you deepen your knowledge of the feature.
We're going to exhaust Moore's "Law" because we're going to run out of atoms, and then architectural performance should return to being the more relevant discussion.
A form of containerization, such as Wasm is doing, should give us the next abstraction layer for web development.
“Blockchain is not inherently stupid, it’s just wildly misapplied”, maybe we’ll use the remaining of it to do something useful one day.
The big collection of concepts we call AI is being used to create handy tools we should keep an eye on, especially solutions that bring end users closer to their domain-oriented problems.
HoloLens (and similar headgear devices) sounds like a relevant goal, but it seems we're still a bit far from it doing what we really want it to do.
Quantum computing should give us power to solve problems we couldn’t solve yet, like pinpointing specific chemistry reactions.
The best way to predict the future is to make it :)
This was really insightful
good talk
Great presentation and insights by Richard! Love the video ❤
The talk has nothing to do with the next decade of software development but rather the advancements throughout the last century
Yes, many think that if we pack in processing power and memory we can then ship bulky applications full of generated code. So instead of a few custom classes in JS, we pull in a whole library to do the same thing, and network efficiency drops even further.
The solution will be pretty cool, but an interesting point is that with the increase from 2G to 6G we will be able to create better 3D mapping of people in real time, and as we cycle back down the range toward 2G we can see our hearts beating :P. So we could flip the WiFis around if we wanted :P :P :P
Then, interestingly, where this will go is that you can map the interface onto our electromagnetic fields so we can visualize and interact with it in our imagination; headsets become a thing of the past.
Loved watching your talk, also makes me wonder where our technology will finally be implemented :D
Brilliant Talk
The problem with things like 5G is that since I still only get 3-10 GB per month at anything like affordable prices, I literally couldn't care less, and range was already a lot more important, so any reduction in that is just shit. As for connected devices, telemetry, etc., LTE is already fast enough anyway, heck, it's enough for video streaming
As a software engineering student, my question is: should I learn coding to make games and apps, or switch to no-code tools?
Don't switch to no-code tools. Learn how it all works and then use the tools for jobs if you need it.
Someone will always need to build the no code tools
Dont take shortcuts… they never work
From my experience, no-code tools work up to the point where you want a simple application without specific features; beyond that point it gets too difficult to build and maintain using them.
When I was young and didn't know anything about programming, I used to make games with no-code tools like GameMaker and Construct 2D, and I felt limited while making them for the reasons above. After learning programming I would never again make games without code; expressing complex applications is much easier in code, once you know how.
Also, no-code tools are mostly abstract frameworks that leave a lot of optimization on the table, which in certain applications you may need.
Do what’s hard.
Pretty good tech talk.
great fair summary; thanks;
This was awesome. Well done!
Ahh sweet sweet performance mockery. I love it. Gotta strap ourselves to the Quantum computing rocket at some distant future point, to get the performance explosions again.
It makes me appreciate how small we are compared to Mother Nature: a quantum computer is needed just to understand the nitrogen fixing that a simple plant does just like that.
Yes, nature is insanely complex. God is just wise and powerful beyond words
@@larslover6559 Bro, you could be presented with a 1:1 account of how our universe came to be, where the Big Bang came from and how it was even possible, and you would still be spewing religious nonsense.
Thank you, very interesting
Nice talk
great talk
He pretty much covered all the technologies we are developing or using, but he forgot IoT/IIoT and light (optical) computing, which is a domain of computing between silicon and quantum. Nice talk! The computing industry marches on...
Light computing is just classical computing (minus the handful of quantum light-based computers).
MS Power Platform - domain experts can design their own apps and forms reasonably well, but mostly for internal apps rather than public-facing ones, and users have to be authenticated members of the tenant.
Exceptional speaker; a bit weak on the outlook for what's going to happen in the mid term, but captivating anyway.
Amazing!
Make the future.
I thought the presentation was about software development
Good talk! I agree that the Power Platform is really taking off, although I did try the Power Automate copilot and wasn't impressed. Power Automate and Power BI are heavily used and depended upon in our org.
My main takeaway: the container-based software model for security.
Strange to hear “compute” used as a noun.
random find. Great video
A+
Great
Really? The inflation we are experiencing is from supply chain issues? Think again my brother. It's about central bank and government policy.
Decentralized computing is good as a concept. But the current Web3 implementation is so bad that the costs wildly outshine the benefits.
Most accessible Web3 projects are heavily centralized with their off-chain parts and are, imo, indistinguishable from a normal web with fancy but slow databases.
Feels like a 10-year-old video when it talks about AR, thanks to the Apple Vision Pro presentation.
How about web development, and in particular Blazor's future?
The trend of billionaires messing up their companies started with Steve Ballmer.
Wow👏
1:02:05 Aaah!!! What you said there about leaving the land to lie fallow for some time is also in the Bible. 😄
"It is dangerous to make predictions - especially about the future" is even in the talk and yet he goes on to make those predictions.
Predictions about the past are far safer.
Sorry, I am a developer: Edge is much better than Chrome. Both are built on Chromium, but Edge is much better and uses fewer resources. I think you meant to say IE?
Seems to me that we should be getting better at making predictions, especially about technology.
In 1980, which computer producers were still going to be relevant after the year 2000?
Pretty sure people thought IBM, DEC and Unix.
What happened to DEC, and why?
Building up a knowledge base like this over time should mean we get better at this difficult task (predicting the weather is surely far more difficult, but we haven't given up on that).
Similar predictions for languages... I hadn't heard of Java until the mid '90s; at what stage was it apparent that COBOL was no longer going to be relevant, and what happened between then and the decade preceding?
Cobol is still relevant. It's just not as popular.
@@bobweiram6321 Not sure what you mean by 'relevant'; it is still being used, but how often is it being chosen as the language for a new application?
Did I miss it or did he totally ignore photonic chips?
And what about AlphaFold! (way more relevant for agriculture than quantum computers)
When it comes to the electron cloud configuration, AlphaFold can't help. It's a different kind of problem. AlphaFold can only do the organic stuff (proteins) around the metal ions.
Photonic chips are just classical computers. At best, they might provide a constant-factor speedup to existing compute. If this speedup is modest, then it's not going to change the world, and there's a whole bunch of technical issues making it very hard to achieve without making something worse than a silicon computer.
someone summarize it please !
This was a waste of time for me, learned nothing new other than some microsoft marketing.
Exactly my experience!
Thanks for intro)
What was the name of the paper on ML?
Do you know about applied quantum computing to chemistry?
I assume you are one of a kind researcher
😂
Guess that’s what happens when you plan a speech the night before..
The next is ....... Subject-Oriented Programming on Universal Software Model. No OOP! No classes!
LOL. A Microsoft person talking about Napster and Crypto being nefarious.
That part with Apple and AR didn't age well, lol
The MS UI platforms absolutely suck. Flutter solved it in a nice way.
Can you do it with C#, or do you still need JS?
I wanna invite this guy to a BBQ and talk the whole day.
The title is a little misleading... it says nothing about software development or its trends, in the first 20 mins at least.
We've been entering into an era of less is more for a long time now. Richard might want to look into Keto Diet.
was shocked when i saw the belly from the side
Really? The guy gives a brilliant speech that covered just about everything in IT and you came away concerned about his health? No - actually you just wanted to be condescendingly judgemental, based on observation from a distance. What do you know about his circumstances? What do you know about his health? What do you know about any medications he has to take? Nothing, nothing and nothing. So before you start being needlessly critical of others and making judgements in a knowledge vacuum, maybe you might wanna take a look in the mirror first. Maybe _that_ guy is the one who needs some good advice. Just sayin ... 🤨🤨🤨
TL;DR - absolutely nothing at all has changed in software development since at least the 1980s through to today.
It's literally Groundhog Day, every day, of every year, of every decade.
Only the names of the tools change, but everything else is exactly the same.
It will remain exactly the same next year, and the whole decade after that.
Somebody will be comparing today's supercomputer Frontier to a handheld device by 2035 😂
“Nobody’s mining bitcoin anymore!”
Yeah, but NVidia is still charging like they are
Very good talk, but it's a little out of date, specifically when he recommended Unity. Please don't use Unity; their board has tanked the company in the same way Facebook's and Twitter's did.
A few weeks later Apple has their consumer device - so much for a tech prophet!
At that price I don't think the vast majority of this world considers it a "consumer" product.
I wish they’d bring back Netscape
I changed the icon of my browser to Netscape 4.75 😆
They did, years ago, and it's called Firefox ;) Firefox was based on open-sourced Netscape code. Of course it has evolved over the years, but in some sense it's an evolution of Netscape.
The Mozilla organization was also created by Netscape (company).
Why?
Blockchain - companies want central control; other parties can introduce transactions into your data independently of you.
Damn this guy could probably do standup tbh.
Hmm...this was made before the Unity debacle...
Dude disqualified himself with "No one does Bitcoin mining anymore" - the hashrate is literally at an ATH.
He’s trying to be like Uncle Bob! Spent 30 minutes of my life on this which I won’t get back.
At least Uncle Bob isn't an obvious evangelist for a major player (Microsoft) and competitor in many of the product spaces based on the videos I have watched.
This content is overflowing with transformative thoughts. I found a book with like themes that redirected my journey. "AWS Unleashed: Mastering Amazon Web Services for Software Engineers" by Harrison Quill
The Facebook diss about “300 billion” was such a miss lol
20:35 - a wild @linustechtips appears 😂
Misleading title, bait for clicks
I wonder how much he got from Microsoft to promote it left and right?!
I get this is a fun presentation and not to be taken too seriously, but I think rn the only speculation we can do about the next decade is: *idk* 😄
"interpreted as inflation" that's inflation
Everything was fine until the speaker said "crypto is pretty much a Ponzi scheme".
He's clearly ignorant on that matter; he should never have commented on it.
Well, Unity seems to have tanked itself slightly.
Misleading title
Misleading title, patronising, overconfident, Microsoft-biased, spammy, and aging horribly just 2 months after being posted. I want my time back.