Yeah, true, but what is your alternative? Stack Overflow is hardly a bastion of correctness. Most blog posts were just demo code. You can write it yourself if you don't have any pressure to deliver, but even then you are going to trip up on every bug that everyone else already hit and solved when they first tackled it. Reinventing best practice is slow. Use it sceptically. Move in small features; generate just the function, like you would write yourself. Use git to compare what it suggests. Use your experience to know when it doesn't work. I have still been tripped up with it, but my speed and scale of coding are vastly increased, and it's not like I never made mistakes before LLMs came around.
For my side projects, I let LLMs do the boring stuff, i.e. boilerplate starter code. I do the fun stuff, i.e. the functions or methods that need proper logic or are critical to the application.
Ask ChatGPT "are you sure?" twice and watch those confident answers come rolling in. (So the student didn't understand the assignment and was trying to copy-paste, and the instructor was good enough to set something a Google search couldn't solve.)
This sounds like a natural development, there have always been lazy students who don't care about learning things properly as long as they pass. Like, this reminds me of some of my professors refusing to record and upload their lectures because they think the students will be worse if they don't go to class because they can't ask questions. Even though those students likely wouldn't be asking questions anyways. There's always gonna be people who use whatever resources are at hand properly and those who don't and imo it's a good thing that the latter are getting weeded out through this, especially in CS which is oversaturated with new grads.
Similar things were said about junior devs constantly using Google and Stack Overflow. It's just the new way of learning. Junior devs 5 years ago probably never learned the critical skills that a dev would have needed 20 years ago, because technology got better. Yes, the code that ChatGPT spits out is often bad. But it's still possible to learn and grow by using it.
Yeah, I just made a comment about this being the new Stack Overflow. People relied on that too heavily too, instead of learning. Me, I would use SO for learning new concepts and things, because there were some really, really good answers to questions there, from the actual devs, etc. I recall something about early JavaScript, and one of the original devs posting comments about it in a giant post. However, most people were just C/Ping answers they got from there, not learning why the answer was correct or how to use it. Very sad.
@AlvinKazu I guess it's sad in that it doesn't benefit people who actually know how to code. But you can be a programmer if you just know how to get the job done. And if you have seen that someone has already done what you want to do, why not just copy-paste? It gives newbies the freedom to create anything they want within a short space of time.
@@NegativeAccelerate The thing about C/P is that most people don't even edit the code, it's 100% copied. Newer coders should take code snippets and interact with them. Learn why they are doing what they are doing, so you can do it in the future. With the ease of access, people aren't even learning, it's just spoonfed, like a baby.
I hope my 20 years of doing code the old way will still pay off in the future. If I had these tools in school now, it would be really tempting to use them to hurry and get the answer.
I used ChatGPT for a module in my uni course that was already late, so I was just trying to rush through it. It made me not understand what the fuck I was actually doing; I was just saying "it needs to do this", I would get the code, and then if it didn't work I would say "it didn't work". Whenever I use ChatGPT now, it's only for theory stuff, not really for practical implementations.
I absolutely agree with you, but I don't really love writing CSS myself. I do the back end in Spring Boot, and it's complicated enough. Having ChatGPT do the front-end design in Bootstrap or Tailwind (which makes it easier to make it responsive) is always helpful for me. I usually have to fix a lot of bugs and debug a lot of code when using Claude or GPT-4, and that has actually been useful: I find debugging a lot easier now thanks to it.
@@andreivaughn1468 Meh. HTML/CSS is a tiny part of my job, and AI is never good enough to get it perfect, so I always have to understand the output so I can adjust it. I'm a backend dev and I don't do design, so during development it's really nice to just say "make me an error page using tailwindcss, use this kind of style" and paste the markup for some other page(s) on the site which have been designed and built. Once the project gets further along, real designers and front end devs can make a better page, but it really has increased the visual quality of tech demos and early stage projects.
It's decent at giving out styled components with shadcn/ui components and Tailwind CSS, but I always end up changing some major part of the style because it can never get it quite right.
@@andreivaughn1468 I do concede that you essentially forget it after some time, but that is still something you can control; I do read the code GPT provides most of the time. It's basically Stack Overflow on steroids lol
Lately, with newer versions, they are ultra-verbose even for the simplest of questions. I believe they are "optimized" by having them give longer responses in general. For anything logically serious, or deep into frameworks or libraries, they almost always produce the wrong answer; then once you catch them being wrong, you can "try" to gaslight them into submission, and good luck from there.
I can understand why new devs might get in trouble by using ChatGPT. I've been working in software development since 2019 (not that long), and now that I use GPT to build things faster, I ALWAYS check the code in order to understand what is happening and make sure I understood it. When I don't understand something, I look it up and LEARN from it. It's amazing; sometimes it gives solutions that would otherwise take me hours to get to by myself. That's why I make sure I get it, and try to memorize it in a way that when I come across a similar scenario in the future, I might remember it and do it on my own. I think newbies might not understand that process, maybe because they're starting from a point where GPT is the default tool they use. Btw, there are a lot of mistakes and adjustments I usually need to make to the code, so...!
Do juniors dependent on LLMs fall asleep thinking about a bug? Do they think about how to optimize their database while lying in bed next to their wife? Do they find solutions in their dreams? If not, more power to them, but you ain't gonna be no coder.
Hey, would this be a good medium to ask for a comment/video on the "problem solving" skill being destroyed by LLMs? I know what problems are, but not specifically what you mean: like knowing where to put a loop if required? Or how to approach, engineer, and execute the program? Enthusiast level here… be kind, good people!
@@rustyshackleford2841 I think problem solving is hard to grasp because it goes beyond tools. LLMs are a tool, frameworks are a tool, languages are a tool. A good analogy is mechanics: in that case, problem solving is not about knowing how to use a wrench, or even knowing which valve to adjust; it is understanding why you are building an engine, or what the purpose of the machine is (taking you from A to B vs. extruding materials). Solving problems is about analysing and understanding, not about using a tool. That comes later.
AI, unless highly trained on something specific, is not reliable at all and should be avoided for anything more than repetitive tasks. LLMs are like autocomplete on steroids: great for repetitive tasks (i.e. making insert statements for cities and/or states the way your DB likes them, generating random data within a pattern, etc.), but anything more complex than that and it will just start making stuff up. And of course, that is very unreliable unless what you need is a bread-and-butter solution, which is often not the case.
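To make the "autocomplete on steroids" point concrete, here is a minimal sketch of the kind of repetitive, pattern-following task the comment describes, done in plain Python. The table and column names are hypothetical, purely for illustration:

```python
# Generate repetitive INSERT statements from a list of city names.
# The "cities" table and "name" column are made up for this example.
cities = ["Austin", "Boston", "Chicago"]
stmts = [f"INSERT INTO cities (name) VALUES ('{c}');" for c in cities]
print("\n".join(stmts))
```

This is exactly the sort of mechanical work an LLM handles well; the moment the pattern requires real judgment (escaping, schema decisions, edge cases), reliability drops off.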
ChatGPT will destroy your problem solving skills, and you cannot take it into an interview with you when you want to move up to the next level. It is a useful tool in some ways: when learning a new concept it can remove a bit of the breaking-in pains, but you have to understand what you are doing and how you can fix it. I use ChatGPT as part of my training, where I get it to quiz me with questions on the subject, but you must be able to challenge everything it produces. It's absolutely insane to me when I hear about people just copying and pasting in education. What is the point of going to school/college if you're just copying AI? To get a job? Well, that goes back to my first point!!
For jrs, yes, this is a messy option. I take the view that, as much as I like coding, it is professionally delinquent of me to code it all by hand any more. That's like only using a text editor and only referencing books.

First is syntax. We move between 10 or more different languages all the time. Wasting time looking up each bit to refresh your memory is slow.

Second is the time saved looking for documentation, desperately trying to find a snippet on Stack Overflow that solves your same issue, and then trying to merge it. And the absolute wasted time when you hit a dead end, cannot get it to work, and have to start again with a different library.

Third is bugs. I have always preferred to reach for somebody else's solution, especially in web dev. Why? Because I can spend an hour or two crafting my own perfect solution, thinking I've solved it, without knowing that the community has found you have to add x in for IE10, or that Safari has a bug in its implementation of API y. It's not proper to sell clients code that is just the result of home-brewing it all, having fun, taking a first stab at it. It needs to have the wisdom of the community built into it.

There are plenty of ways to flex your skills with file organisation, high-quality comms, and applying design patterns to the higher-level structure it's being built into, but using whatever the latest established external brain is in your workflow is a must.
We're in a weird, uncomfortable limbo: GPT can go blazingly fast, but the code still needs to be read and verified by a human. GPT will become incredibly useful, and worth the investment being pumped into its development, once it's good enough to work without its output needing to be checked manually by a person.
Talking about bundling/webpack: that's really not a React thing. It predates it. You really should be learning about that when you're learning JavaScript, but people think they can just learn how to write a function and jump straight into React. I was one of them.
Interesting video, but here is my take: ChatGPT only compounds the user's base ability. If a Jr Dev is dumb, it will only make them dumber. If a Jr Dev knows how to use ChatGPT, it will only help them grow. A gun is a gun: in the wrong hands it's dangerous, in the right hands it can save your life.
ChatGPT turned me from a shitty junior-level developer (self-taught) into an OK mid-level developer. However, I don't know if I'm the exception or the rule in this regard. Also, I went through a phase of realizing (through the experience of working with it) that ChatGPT is actually extremely DUMB and doesn't understand anything. BUT... once you realize that at its core ChatGPT is an extremely dumb machine (that likes to hallucinate methods that don't exist) THEN you can really start leveraging it (as I did).
I fundamentally disagree with just about everything in the video. ChatGPT is great for doing all kinds of things. You can have it re-write your entire codebase in one shot to change it from using websocket to using webrtc, etc. Of course you don't just blindly copy and paste, but properly using chatgpt to be productive is an art form and the answer to using it properly is not 'just use it as a mentor'. These ai models are only going to get better. You're better off learning skills in interacting with AI models today so that you remain relevant in the future when the models are so good that everyone is heavily utilizing them.
exactly. It's a matter of not losing ground and knowing how to use the tool. I love how fast and easy it has made responsiveness (I always hated that part of web dev) but even so I always take a peek at its css to make sure I understand what is happening
Started using ChatGPT in May of this year. When I started my bootcamp. I’m about to graduate in 2 weeks, and ChatGPT has come A LONG friggin way when it comes to producing quality code. Keep telling yourself ChatGPT is trash, and won’t be integrated into all of y’all’s daily work in like 2-5 years lol
You have basically no experience, so GPT looks good to you. Actual experts solve problems that ChatGPT can't solve. You lack self-awareness of your beginner status.
It's a blessing and a curse. Do you want the problem fixed quickly, or do you want to spend all day trying to find documentation and examples? In my case I choose the former.
Basically, write your own code and use AI as a code reviewer. A lot of people let AI write the code while the human is the reviewer; that's a mistake. It should be your second pair of eyes. You shouldn't be its second pair of eyes. There are so many things I come up with through trial and error that wouldn't even have crossed my mind if I had simply asked for the correct answer.
As seniority increases, you will come across situations and challenges you need to fix that you won't just be able to plug into GPT. You will have to have developed thought processes and experience over the years in order to rule out some things and think about other possibilities. So many times it's some innocuous config setting buried deep somewhere not at all obvious, because there isn't actually a bug in the code; or it could be a line of code that isn't saving something in the correct order, or not remembering the state of something. And then one day an investor can pull the plug and reduce headcount at your business, and you're suddenly in a job market competing with hundreds of other people, and you will need to demonstrate why you are the person that should be hired. And it's not just fixing things: you will also be involved with third parties in meetings, designing integrations and identifying potential issues to deal with down the road, like security, performance, flexibility, and scalability. When you are new in dev you don't have those responsibilities yet, but this is the time where you learn about them so that you are ready to move up the chain.
One thing about AI is that it has become a double-edged sword. On one side, you can repeatedly ask questions without being crucified for not seeing the already-answered question in a thread 🧵 (which saves you a great amount of time and embarrassment). On the other hand, people are starting to use ChatGPT as a cheat sheet instead of a learning tool and actually submitting copy-pasted work into their projects. Copying work is the oldest trick in the book, but what happens when you are hit with that golden question, "Can you please explain your answer?", followed by a marker and a whiteboard?
I've used stackoverflow and superuser since roughly 2016 for computing and coding questions. Really stupid shit most of the time, really niche questions, too. But guess what, if you take the time to make sure you're not missing obvious answers, and that your question is novel, you will never be crucified for asking a question. Asking a person definitely beats asking a computer - the computer doesn't even know if what it's recommending is functional or correct, it's all guesses and probability.
I actually learn(ed) so much faster and easier with AI. Using AI is a must at this point; if you don't use AI, you will slow down too much. The key is to build a good foundation, get a job, struggle a lot, and then, when you know what you are doing, use ChatGPT as a mentor.
lol it can't teach you to write good code. I would recommend Uncle Bob's classic books on clean code. The guy is an absolute classic of programming literature, and he is the exact opposite of the average code your LLMs are learning from.
@@goldsucc6068 Indeed, learning, but not "how to program". By learning I mean patterns, optimization methods, principles, etc. I usually learn from books, but sometimes ChatGPT is so fast I can just learn from there.
I realized this while in my first role. ChatGPT got my tickets done faster, but maintaining the code it produced and actually growing as a dev caused more problems than it was worth. Frankly, I'd rather use a framework with code gen (à la Rails or Laravel) for productivity than AI code.
I think GPT is fine as long as you understand what it's actually doing.
@@johndoe-j7z Plus, correct ChatGPT on obvious bad practices and properly write tests for that code.
@@johndoe-j7z I agree but even when working in C# in Microservices that followed pretty consistent OOP and company standards, AI tools were helpful but did require pretty high diligence. The effort required to ensure the code it generated was helpful was not always worth it. It did occasionally suggest something I would have taken a while to come up with but making sure it wasn’t breaking anything or adding random dependencies in the process did eat up some of the time it saved.
It's like people have invented a tool that solves writer's block but you think it can write books for you. Maybe that's the problem...
@@daveogfans413 That is hilariously accurate I’m stealing this
This is purely anecdotal, I understand, but it is a moment that makes me clutch my pearls a bit.
I know juniors who constantly have ChatGPT prompts open. I have been in calls where they share their screens, and when I ask a question about their code, everything gets filtered THROUGH ChatGPT before I ever get an answer. I've asked, "What version of Node are you using?" and watched them prompt "how to check node version" before typing "node -v" in the terminal. I've asked that same question of the same person more than 3 times and watched them search it each time.
LLMs have made learning to code much more about learning to use a tool than learning a craft. It isn't engineering anymore; it's "what do I do to get what I want".
This is scarily dystopian. NPC creation in real time.
Humans are turning into robots, being controlled by robots, and asking their robot overlords what to do and how to do it.
It's actually really scary how far humanity is falling.
They aren't even processing information; it's just going in and out. Their minds are so rotted out...
@@almazingsk8er this.
I considered myself exceptionally good at math, only to get to college and realize I was taught fill-in-the-blank formulas and not how to actually think and solve problems using math and reason.
I struggled and had to essentially relearn how to learn math.
I taught myself programming as a hobby back in 2018. I didn't have the core knowledge of how to design a system or build an application; I just knew how to solve basic puzzle problems.
Now as someone who managed to get a job as a programmer... I find myself struggling to understand system design and software patterns. It feels the exact same way as it did relearning math. It's so easy to just ChatGPT how to do something, instead of actually thinking through and solving the problem. That knowledge gap is dangerous and it bothers me that I struggle to understand the problem and solution.
Garbage in, garbage out.
Cope in devs out
AI coding assistants are adequate for solving a singular problem (but many times need to have their solutions reviewed to find subtle bugs) - but using them on an established codebase is problematic. They can be good at getting the gist of what code is doing when you have no idea, but sometimes are wrong about idiomatic usages of some languages.
It confidently gives the wrong answers. It's good for simple tasks, but I find myself using the chat window as a kind of self-discussion on how to solve problems. 9/10 times I don't hit enter.
Get a rubber duck, it's cheaper than ChatGPT.
I find that if I don't already know the problem domain well enough, ChatGPT steers me in the wrong direction 9 times out of 10. The more you know, the better the answers, so it kinda defeats the purpose of even asking ChatGPT!
That's why I think weekends are key times to train yourself with code challenges, writing the code char by char on your own, like the old times. During sprints for a company it can sometimes be stressful to be too stuck, and maybe using GPT can release that stress accumulation, but there needs to be a time in the week specifically focused on training those muscles you are talking about, by coding like it's 1993.
ChatGPT gives me old, deprecated answers all the time; you can only go so far with it.
Great video. I agree. I’ve been coding for a few months now, & something that’s really helped me is asking ChatGPT for hints rather than a full solution when I am stumped.
It’s been very useful.
For example, recursion is still a bit tricky for me, so if I manage to arrive at a solution but I'm still unsure or unable to explain it, I'll ask, "Can you explain the execution context of this function line by line?" It helps me visualize what's going on (Fibonacci sequence).
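For anyone who wants to see that execution order without asking a model, a short trace is easy to build yourself. This is a hypothetical sketch (not from the comment) that prints each recursive Fibonacci call indented by its depth, making the branching call tree visible:

```python
def fib(n, depth=0):
    # Print each call indented by recursion depth to visualize the call tree.
    print("  " * depth + f"fib({n})")
    if n < 2:
        return n
    return fib(n - 1, depth + 1) + fib(n - 2, depth + 1)

fib(4)  # returns 3; the printout shows fib(4) branching into fib(3) and fib(2)
```

Stepping through a trace like this by hand builds the same intuition the line-by-line ChatGPT explanation provides, but you own the understanding.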
@@rtpHarry ChatGPT does not understand context. Imagine it like a small Wikipedia page: you have to know how to use it and train it well.
It's funny to me that there are guys who think this type of thing, such as ChatGPT, can take devs' jobs. It cannot.
Don't worry, CS majors spend a full semester on Recursion. Keep it up!
@@nikoryu-lungma You do not understand ChatGPT if you think it does not understand context.
Every time I use it, I give a lot of information for context to get better replies.
It works incredibly well. It is all about the context.
The context basically changes what the right answer is.
@@antman7673 Since YouTube deleted my comments, I'm just gonna say this.
I KNOW exactly what I'm saying, because I have evidence. I just hate that YouTube deletes my comments.
I stand firm on my point: ChatGPT does not understand context at all. You have to train it. Nothing you say can change my standpoint, unless you have ALL OF THE FACTS with you.
I use ChatGPT to generate quizzes, exercises, and practice problems; for every problem I get wrong, I have ChatGPT explain the concept(s) behind that problem. It's all about how you ask questions and what you have it do (i.e., do you give it the role of a teacher/tutor, or of an answer bank?).
I agree. The thing is, when you use ChatGPT, you have to have certain knowledge, not zero knowledge.
ChatGPT lacks context, and it answers like a child. You need to "train" the thing to answer better.
That requires your own knowledge.
The problem with junior developers using ChatGPT is that they lack knowledge. They lack complete knowledge of what's happening around them, and therefore they look dumber.
Claude is generally quite accurate. GPT 4o is good with some languages.
I'm not even a developer yet, just an amateur self-taught coder, and within a month I could tell it doesn't have any clue what it's talking about.
The ONLY thing I use it for is finding minor syntax issues in my code that the editor doesn't pick up on when there's no actual error.
Yuck. Using ChatGPT to finish school assignments is shooting yourself in the foot AND not understanding why your foot hurts, at the same time.
I can imagine that if someone doesn't have a strong enough mental model of how things work to self-learn, then ChatGPT can get them stuck in a loop of receiving answers but not understanding. I haven't personally found that it affects me negatively this way, partly because a lot of the problems I'm solving are specific to the business, and I just tend not to commit things that I don't understand. I don't get why there are memes about developers committing code without understanding how it works; that's not development.
I personally never bothered to remember syntax or the names of functions, mostly because I kept changing programming languages and libraries, so I would almost never build the same functionality over and over (with the exception of maybe some date and string functions). Only developers who work on one project for more than a year, in a very narrow scope, remember the syntax. For me, ChatGPT is just an advanced autocompletion. There is another phenomenon that has existed forever: people blindly copying code. It doesn't matter whether they copy it from other people's repositories, Stack Overflow, or ChatGPT; it can be very, very easily spotted, because the style changes. This blame really has nothing to do with ChatGPT. Every tool is just a tool. It usually leverages the skill, at some price.
This is exactly why I have sworn against using AI. And, even when I was using AI for general research, I noticed that AI (LLMs) were often making stuff up when I would try to verify some of the information. Apparently, it's called a "hallucination".
the irony of you typing some nonsense people were dismissing as nothingburger discussion two years ago is sort of hilarious
learn how to use a tool, don't get confused
@@minhuang8848 Cool story bro. Am still not using AI for coding. If I can't trust it for general research, I'm def not trusting it for my projects and work lol. If you need to rely on AI, that says more about you than me.
@@minhuang8848 Love how youtube keeps deleting my replies.
Yeah, true, but what is your alternative? Stack Overflow is hardly a bastion of correctness. Most blog posts were just "demo code".
You can write it yourself if you don't have any pressure to deliver, but even then you are going to trip over every bug that everyone else already hit and solved when they first tackled it. Reinventing best practice is slow.
Use it sceptically. Move in small steps, generate just the one function, like you would write yourself. Use git to compare what it suggests. Use your experience to know when it doesn't work.
I have still been tripped up with it, but my speed and scale of coding have vastly increased, and it's not like I never made mistakes before LLMs came around.
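The "small steps plus git as a safety net" workflow described above can be sketched like this (a hypothetical sketch; the file name, function, and commit message are made up for illustration):

```shell
# Commit a working baseline, paste in the LLM's suggestion, then diff it.
mkdir demo && cd demo && git init -q
echo 'function add(a, b) { return a + b; }' > util.js
git add util.js
git -c user.name=demo -c user.email=demo@example.com commit -qm "working baseline"

# ...overwrite the file with the generated change...
echo 'function add(a, b) { return Number(a) + Number(b); }' > util.js

git diff --stat           # shows exactly which files the suggestion touched
git checkout -- util.js   # one command restores the known-good version
```

Because every suggestion lands on top of a committed baseline, `git diff` makes the model's edits reviewable line by line, and reverting a bad suggestion is a single command.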
@@thelonercoder5816 I disagree. It is perfect for boilerplate code, especially migrations, and for identifying trivial bugs.
I’m glad to have started university before ChatGPT became so widespread.
I'm glad I never got duped and went to college in the first place and kept my original thoughts.
@@kevinsedwards Good thing I had a proper CS program and the only thoughts it changed are about recursion and matrix multiplication.
@@MsJavaWolf Different strokes
For my side projects, I let LLMs do the boring stuff, i.e. boilerplate starter code. I do the fun stuff, i.e. functions or methods which need proper logic or are critical to the application.
Ask ChatGPT "are you sure?" twice, and watch those confident answers come rolling in. (So the student didn't understand the assignment and was trying to copy-paste, and the instructor was good enough to make something a Google search wouldn't solve.)
This sounds like a natural development; there have always been lazy students who don't care about learning things properly as long as they pass. This reminds me of some of my professors refusing to record and upload their lectures because they thought students would be worse off not coming to class, since they couldn't ask questions, even though those students likely wouldn't have asked questions anyway. There are always going to be people who use whatever resources are at hand properly and those who don't, and imo it's a good thing that the latter are getting weeded out through this, especially in CS, which is oversaturated with new grads.
Like someone else said, it's just a tool and there's always been people who are better at using tools than others.
Similar things were said about junior devs constantly using Google and Stack Overflow. It's just the new way of learning.
Junior devs 5 years ago probably never learned the critical skills that a dev would have needed 20 years ago because technology got better.
Yes, the code that chatgpt spits out is often bad. But it's still possible to learn and grow by using it.
Yeah, I just made a comment about how this is the new Stack Overflow. People relied on that too heavily, instead of learning.
Me, I would use SO for learning new concepts and things, because there were some really, really good answers to questions there, from the actual devs, etc. I recall something about early JavaScript, and one of the original devs posting comments about it in a giant post.
However, most people were just C/Ping answers they got from there, not learning why the answer was correct or how to use it. Very sad.
@AlvinKazu I guess it's sad in that it doesn't benefit people who actually know how to code. But you can be a programmer if you just know how to get the job done. And if you have seen that someone has already done what you want to do, why not just copy-paste? It gives newbies the freedom to create anything they want within a short space of time.
@@NegativeAccelerate The thing about C/P is that most people don't even edit the code; it's 100% copied. Newer coders should take code snippets and interact with them, learning why they do what they do, so they can do it themselves in the future.
With the ease of access, people aren't even learning; it's just spoon-fed, like a baby.
I hope my 20 years of doing code the old way will still pay off in the future. If I'd had these tools in school, it would have been really tempting to hurry and just get the answer.
Always double-check your work, people.
I used ChatGPT for a module in my uni course that was already late, so I was just trying to rush through it. It made me not understand what the fuck I was actually doing; I was just saying "it needs to do this", I would get the code, and then if it didn't work I would say "it didn't work". Whenever I use ChatGPT now, it's only for theory stuff, not really for practical implementations.
Tbh, using ChatGPT to do the HTML/CSS has to be a must! Would you really waste time centering divs and grids?
If you lean on an AI or a "do it for you" tool long enough, you eventually lose the ability to do it yourself. Can happen with anything!
I absolutely agree with you, but I don't really love generating CSS with it.
I do the back end in Spring Boot and it's complicated enough.
Having ChatGPT do the front-end design in Bootstrap or Tailwind (which makes it easier to be responsive) is always helpful for me.
I usually have to fix a lot of bugs and debug a lot of code when using Claude or GPT-4, which is actually helpful to me; I find debugging a lot easier now thanks to that.
@@andreivaughn1468 Meh. HTML/CSS is a tiny part of my job, and AI is never good enough to get it perfect, so I always have to understand the output so I can adjust it. I'm a backend dev and I don't do design, so during development it's really nice to just say "make me an error page using tailwindcss, use this kind of style" and paste the markup for some other page(s) on the site which have been designed and built. Once the project gets further along, real designers and front end devs can make a better page, but it really has increased the visual quality of tech demos and early stage projects.
It's decent at giving out styled components with shadcn/ui components and Tailwind CSS, but I always end up changing some major part of the style because it can never get it quite right.
@@andreivaughn1468 I do concede that you essentially forget it after some time, but that is still something you can control. I do read the code GPT provides most of the time. It's basically Stack Overflow on steroids lol
Lately, with newer versions, they are ultra-verbose even for the simplest of questions. I believe they "optimize" them by having them give longer responses in general. For anything logically serious, or deep into frameworks or libraries, they almost always produce the wrong answer; and once you catch them being wrong and "try" to argue them out of it, good luck from there.
I can understand why new devs might get into trouble using ChatGPT. I've been working in software development since 2019 (not that long), and now that I've been using GPT to build things faster, I ALWAYS check the code in order to understand what is happening and make sure I understood it. When I don't understand something, I look it up and LEARN from it. It's amazing; sometimes it gives solutions that I would otherwise spend hours getting to by myself. That's why I make sure I got it and try to memorize it, so that in the future when I come across a similar scenario I might remember it and do it on my own. I think newbies might not understand that process, because maybe they're starting from a point where GPT is the default tool they use.
Btw, there are a lot of mistakes and adjustments I usually need to make to the code, so...!
LOL I loved the thumbnail so very much.
Never used it, got my WFH dev job, pretty happy.
Love this video and input so damn much, thank you. This matters coming from you.
Do juniors dependent on LLMs fall asleep thinking about a bug? Do they think about how to optimize their database while lying in bed next to their wife? Do they find solutions in their dreams? If not, more power to them, but you ain't gonna be no coder.
Hey, would this be a good medium to ask for a comment/video on "problem solving" skills being destroyed by LLMs? I know what problems are, but not specifically what you mean; like knowing where to put a loop if required?
Or how to approach, engineer, and execute the program?
Enthusiast level here... be kind, good people!
@@rustyshackleford2841 I think that problem solving is hard to grasp because it goes beyond tools. LLMs are a tool, frameworks are a tool, languages are a tool. A good analogy would be mechanics: in that case, problem solving is not about knowing how to use a wrench or even knowing which valve to adjust, it is understanding why you are building an engine, or what the purpose of the machine is (taking you from A to B vs. extruding materials).
Solving problems is about analysing and understanding, not about using a tool. That comes later.
@ good response, thank you.
AI, unless highly trained on something specific, is not reliable at all and should be avoided for anything more than repetitive tasks.
LLMs are like autocomplete on steroids. They're great for repetitive tasks (e.g. making insert statements for cities and/or states in the way your DB likes, generating random data within a pattern, etc.), but anything more complex than that and they will just start making stuff up. Of course, that is very unreliable unless what you need is a bread-and-butter solution, which is often not the case.
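The kind of repetitive task meant here is the sort of mechanical transformation that even a one-liner can do: turning a plain list of values into INSERT statements. A minimal sketch (the table and column names are made up for illustration):

```shell
# Turn each city name on stdin into one INSERT statement.
printf '%s\n' "Lisbon" "Porto" "Braga" |
while read -r city; do
  printf "INSERT INTO cities (name) VALUES ('%s');\n" "$city"
done
```

This is exactly the "pattern with a hole in it" shape that LLMs reproduce reliably; it is when the task stops being a fill-in-the-pattern exercise that the making-stuff-up problem starts.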
TBH it helped me a lot in my first real job, where the Sr and semi-Sr devs were @ssholes who never wanted to assist the trainees and jr devs.
Use it like Stack Overflow. That's about it.
ChatGPT will destroy your problem-solving skills, and you cannot take it into an interview with you when you want to upgrade to the next level. It is a useful tool in some ways: when learning a new concept it can remove a bit of the breaking-in pains, but you have to understand what you are doing and how you can fix it. I use ChatGPT as part of my training, where I get it to quiz me with questions on the subject, but you must be able to challenge everything it produces. It's absolutely insane to me when I hear about people just copying and pasting in education. What is the point of going to school/college if you're just copying AI? To get a job? Well, that goes back to my first point!!
For jrs yes this is a messy option.
I take the view that, as much as I like coding, it is professionally delinquent of me to code it all by hand any more. That's like only using a text editor and only referencing books.
First is syntax. We move between 10 or more different languages all the time. Wasting time looking up each bit to refresh your memory is slow.
Second is the time saved looking for documentation, desperately trying to find a snippet on Stack Overflow that solves your exact issue, and then trying to merge it in. And the absolute wasted time when you hit a dead end, cannot get it to work, and have to start again with a different library.
Third is bugs. I have always preferred to reach for somebody else's solution, especially in web dev. Why? Because I can spend an hour or two crafting my own perfect solution, thinking I've solved it, without knowing that the community has found that you have to add x in for IE10, or that Safari has a bug in its implementation of API y.
It's not proper to sell code to clients which is just the result of you home-brewing it all and having fun, taking a first stab at it. It needs to have the wisdom of the community built into it.
There are plenty of ways to flex your skills with file organisation, high-quality comms, and applying design patterns to the higher-level structure it's being built into, but using whatever the latest established "external brain" is in your workflow is a must.
We’re in a weird uncomfortable limbo, GPT can go blazingly fast but the code still needs to be read and verified by a human. GPT will become incredibly useful and worth the investment being pumped into its development once it’s good enough to work without needing its work to be checked manually by a person.
Talking about bundling/webpack: that's really not a React thing. It predates it. You really should be learning about that when you're learning JavaScript. But people think they can just learn how to write a function and jump straight into React. I was one of them.
Meanwhile, I wouldn't even have started my first programming project if it wasn't for GPT.
Interesting video, but here is my take: ChatGPT only compounds the user's base ability. If a jr dev is dumb, it will only make them dumber. If a jr dev knows how to use ChatGPT, it will only help them grow.
A gun is a gun, in the wrong hands, it's dangerous, in the right hands, it can save your life.
It's so bad sometimes. And obviously complicated for no reason.
I really hope you’re putting any sensitive passwords or keys in environment files and not the code itself!
ChatGPT turned me from a shitty junior-level developer (self-taught) into an OK mid-level developer.
However, I don't know if I'm the exception or the rule in this regard.
Also, I went through a phase of realizing (through the experience of working with it) that ChatGPT is actually extremely DUMB and doesn't understand anything.
BUT... once you realize that at its core ChatGPT is an extremely dumb machine (that likes to hallucinate methods that don't exist) THEN you can really start leveraging it (as I did).
This is why I stopped using chat gpt.. had an internship and chat could not help me at all bro lol
ChatGPT is a tool. We know some can use tools better than others.
Yes, the use of AI tools can be helpful, but overusing/abusing them is a double-edged sword and will end up hurting you in the long run.
I felt this months ago. I noticed that the more I used ChatGPT, the dumber I got.
I fundamentally disagree with just about everything in the video. ChatGPT is great for doing all kinds of things. You can have it re-write your entire codebase in one shot to change it from using websocket to using webrtc, etc. Of course you don't just blindly copy and paste, but properly using chatgpt to be productive is an art form and the answer to using it properly is not 'just use it as a mentor'.
These ai models are only going to get better. You're better off learning skills in interacting with AI models today so that you remain relevant in the future when the models are so good that everyone is heavily utilizing them.
exactly. It's a matter of not losing ground and knowing how to use the tool. I love how fast and easy it has made responsiveness (I always hated that part of web dev) but even so I always take a peek at its css to make sure I understand what is happening
water is wet
Started using ChatGPT in May of this year, when I started my bootcamp. I'm about to graduate in 2 weeks, and ChatGPT has come A LONG friggin' way when it comes to producing quality code. Keep telling yourself ChatGPT is trash and won't be integrated into all of y'all's daily work in like 2-5 years lol
You have basically no experience, so GPT looks good to you. Actual experts solve problems which ChatGPT can't solve. You lack self-awareness of your beginner status.
It's a blessing and a curse. Do you want the problem fixed quickly, or do you want to spend all day trying to find documentation + examples? In my case, I choose the former.
Basically, write your own code and use AI as a code reviewer. A lot of people let AI write the code while the human is the reviewer. That's a mistake. It should be your second pair of eyes; you shouldn't be its second pair of eyes. There are so many things I come up with through trial and error that wouldn't even have crossed my mind if I had simply asked for the correct answer.
Skill issue
It's always the junior dev's fault. I think your perception is broken.
taking the rest of the day off to think about it is wild. lol
You guys cope so much, man. ChatGPT is helping us new devs a lot more than it's harming us. You all would have used it when you were starting out.
When seniority increases, you will come across situations and challenges you need to fix that you won't just be able to plug into GPT. You will have to have developed thought processes and experience over the years in order to rule out some things and think about other possibilities. So many times it's some innocuous config setting buried deep somewhere not at all obvious, because there isn't actually a bug in the code; or it could be a line of code that isn't saving something in the correct order, or not remembering the state of something. And then one day an investor can pull the plug, headcount gets reduced at your business, and you're suddenly in a job market competing with hundreds of other people, where you will need to demonstrate why you are the person who should be hired.
And not just fixing things, you will also be involved with third parties in meetings designing integrations and identifying potential issues to deal with down the road, security, performance, flexibility, scalability. When you are new in dev you don't have those responsibilities yet, but this is the time where you learn about it so that you are ready to move up the chain
One thing about AI is that it has become a double-edged sword. On one side, you can repeatedly ask questions without being crucified for not seeing the already-answered question in a thread 🧵 (which saves you a great amount of time and embarrassment).
On the other hand, people are starting to use ChatGPT as a cheat sheet instead of a learning tool, and are actually submitting copy-pasted work into their projects.
Copying work is the oldest trick in the book, but what happens when you are hit with that golden question, "Can you please explain your answer?", followed by a marker and a whiteboard?
I've used stackoverflow and superuser since roughly 2016 for computing and coding questions. Really stupid shit most of the time, really niche questions, too. But guess what, if you take the time to make sure you're not missing obvious answers, and that your question is novel, you will never be crucified for asking a question. Asking a person definitely beats asking a computer - the computer doesn't even know if what it's recommending is functional or correct, it's all guesses and probability.
ChatGPT often lacks an in-depth explanation of the why and the how.
I actually learn(ed) so much faster and easier with AI. Using AI is a must at this point; if you don't use AI, you will slow down too much. The key is to build a good foundation, get a job, struggle a lot, and then, when you know what you are doing, use ChatGPT as a mentor.
@@ExistentialSadness what prompt do you use for that?
lol it can't teach you to write good code. I would recommend Uncle Bob's classic books on clean code. This guy is an absolute classic of programming literature, and he is the exact opposite of the average sheet code your LLMs are learning from.
@@goldsucc6068 Ehm, who said that I use AI to learn to code?
@ You said the word "mentor", which kind of assumes a learning process.
@@goldsucc6068 Indeed, learning, but not "how to program". By learning I mean patterns, optimization methods, principles, etc. I usually learn from books, but sometimes ChatGPT is so fast I can just learn from there.