It can't do creative coding for you. Sure, it gives you ideas from existing code and famous algorithms, but give it any competitive programming question from Codeforces and it struggles.
You could be teaching your students how to program their own AI and learn how it works, instead of trying to fight against the grain. How about making them write an AI, and everyone has to use it to take a test based on what it was trained on, instead of trying to discourage use? Anyone who ignores the utility of AI will fall further behind.
I'm sure in time, AI will be able to do most of it. But I think the goal is to make AI lighten the workload, i.e. the programmer can write more natural statements, in the way humans think and speak.
@@JacobSorber Ah, that makes more sense. I've never done embedded programming (or whatever it was called). I had no projects I wanted to do in that direction, so it would've been a pointless expense.
ChatGPT launched at around the same time I started learning C on this channel. It was mostly useless for me then, because it spewed out code and technical words (my knowledge of programming terms was low at the time). It was only when I stopped using it that I saw improvements (C was not a good language to begin with 🤣).
Students who take my classes (my target audience) sometimes ask GPT to write entire programs. As for Energia, it was relevant to a project that my embedded systems students were working on, so they (and I) cared about Energia. FWIW, this is also the code equivalent of the ad hominem fallacy.
Fair enough! While your interest in Energia is commendable, this example does not effectively demonstrate how to use the tool. According to Andrew Ng, improving prompt engineering skills is essential for effectively utilizing a GPT system. Asking GPT to write an entire program using a relatively obscure entity like Energia is not an example of good prompt engineering. I’ve successfully used GPT to create entire programs by providing detailed and precise prompts. Additionally, you didn’t mention which version of ChatGPT you used. The latest version, GPT-4o, is remarkable, though it requires a paid subscription. As someone who loves the C programming language and has written many lines of code in it, I appreciate your efforts in teaching it. However, your example using GenAI is not compelling. Given all that, I agree with you that students must NOT rely on GenAI to do their Homework. They need to struggle with languages such as C to understand how computers work. Thanks!
In short: web developers, you are deprecated, ChatGPT is smart enough to make websites 🤣, while embedded developers are not being replaced in the near future 😎
You are using 3.5, which is really a mess, but hold on, try GPT-4 and you will totally change your mind. I dare you to try it and re-record. Otherwise this video is useless; the point is to recreate it using version 4. We all know version 3.5 is a mess.
I find your video a bit unfair. LLMs, and especially ChatGPT 3.5, are not meant to be used with your brain in hibernation mode, at least not yet. You still have to be able to question ChatGPT's output, especially with niche or complex prompts, and you still have to have somewhat of a clue about the topic you're asking about and double-check its answers accordingly, especially if something seems way off. But imo, in 90% of cases ChatGPT gives precise answers to, and that's the magic word, PRECISE prompts. I would never just directly copy what ChatGPT outputs, but it gives a nice overview of certain concepts and how to possibly implement them.
Misleading title. Simply calling a tool useless is like calling yourself useless, which isn't true. Anyone and everyone is only as good as the tools around them. Don't blame the tool; blame its wielder. If you're not using AI as a tool to aid you in solving tasks, then you will be left behind. This video is like saying "don't use calculators, because if you mess up, your math is messed up", which is terrible logic. It comes down to the person using the tool and providing accurate information to the tool to better reach an end resolution.
Continued: in the instance of this video, the creator gave lazy, inaccurate, incomplete, and no real directions to ChatGPT, which we all know are required for it to more accurately provide a working base solution for the user to extrapolate and refactor as necessary.
@@_________________404 Not true. If you're doing basic arithmetic, then maybe your logic holds. But if you're doing math that involves time- and session-based parameters, your argument goes out the window. Instead of thinking up outliers that aren't relevant, look at the substance of the post, which is where you'll see I'm discussing how the video is misleading and that tools are only as good as they are used or created. Seeing as AI is made by programmers, it too falls into this category of reliance upon its creators and users. So sorry, friend, I think you have it confused: it is entirely "about the person", and if you fail to understand that, then you need to understand what AI even is. Many of us programmers have made our own forms of AI for different use cases, so we clearly understand AI has many implicating factors, but it mostly comes down to how well "I" the programmer build or implement it, and then, after it's ready for use, how well I use the tool I made.
I firmly believe he did mention the "tool" isn't useless. It's just not as reliable for a larger code base. Maybe, just maybe, it's fine for learning purposes, or as a teaching tool for simple concepts.
@@SimGunther Revisit this in 5 years. Programming is not some special gift from god; it's logic. And most of the code written today is crap, btw. Good luck, champ.
Honestly this gives chatGPT more credit than it deserves.
In my experience, there are basically two use cases for genAI code:
1. quick documentation retrieval.
2. boilerplate code
In both cases the developer has to be able to quickly determine whether the code is correct or not. In any even slightly complex use case, genAI fails pretty badly and you can in no way rely on it to solve your problems.
I think I agree. When I was first learning Rust, ChatGPT was very helpful in reminding me what syntax to use for certain things, but it utterly failed at, for example, nearly anything involving a library.
best tutor, worst assistant.
@@vvert1506 I definitely don't agree with the "best tutor" part. You rely on your tutor to be correct, which is certainly not guaranteed with chatGPT. It's an ok "idea generator", I guess.
And unit tests. Just be sure to proofread them.
An intermediate C coder here: I couldn't agree more. In a longer session, it made the same error three times, even though "we" had agreed that it was totally wrong. It also tried to typecast a char to a byte. "Get a second opinion" has never been more true for me.
In another lengthy session, it managed to make simple working client-server code for Linux, and C code boilerplate using CUBS in Linux. It has been a help with CodeBlocks, Raylib graphics, and Linux, and gave me a great transition from Windows to Linux Mint LMDE. Now I have a small working relational database, with a graphical interface, editing, search, and printing.
And I save a lot of time using it as a super search engine.
I have some dyslexic issues, and it helps me a lot with those and with learning English.
ChatGPT is like a rubber ducky that can respond. Or like a slightly incompetent assistant whom you have to double-check in case he's lying.
I question your use of "slightly". ;)
Amen. You don't learn anything if you let someone else do it for you.
100% 👍👍👍
ChatGPT is just like an automated Stack Overflow, without the comments and upvotes.
Depends on how you use it
Especially if the someone else has an IQ of actually zero.
Chatgpt or overly program friendly libraries?
ChatGPT made programming so much harder for me. It's like getting advice from someone who thinks they know what you're thinking, but they have no idea what you're talking about; just tons of assumptions.
It's great at doing basic things but once you ask it to do something a little more specific it'll just hallucinate a bunch of stuff or misinterpret you completely. I just use it to give me some advice on things I could refactor or clean up, it's pretty decent at that.
I find ChatGPT to be wrong more often than it's right. The most annoying thing is it seems to "forget" stuff you said earlier:
1) Ask for A
2) A comes back wrong
3) Ask for A with additional context B
4) A comes back wrong using context B
5) Elaborate on context B
6) A comes back wrong again
7) Ask for A using B and additional context C
8) A comes back with C, but not B
9) Remind it to use context B
10) A comes back wrong using context B, but not C
11) And over, and over, and over again.
Yeah... I'm not worried about AI taking my job anytime soon.
As a CS student I can confirm 😂
No pain, no gain
The best use for ChatGPT is to write comments for functions. You give it the template for a comment, then you feed it the function, and it writes the comments. Usually you need to change a thing or two, but damn, it does a great job.
My favorite use is testing my understanding of general concepts: I explain my logical thought process and let it fill in any gaps, which I can then double-check.
My experience using ChatGPT to learn:
1) It's a very good tool for learning simple concepts, or if you want to break down an algorithm and understand how it works.
2) You can also use ChatGPT like an interviewer, or to prepare for a test by having it ask you questions (although it has a tendency to repeat the same content).
3) You shouldn't just copy-paste code from GPT, as it can make minor mistakes in syntax, like a missing semicolon or colon, or missing statements, or even logical errors. If you're working on a large code base or building a project, using GPT might not be a good idea.
Overall, GPT is a pretty nice tool while learning to code, if you have some understanding and the patience to actually read the code and debug it!
It's ok at generating simple programs for languages that have a lot of documentation and example code for the AI model to train on, but it absolutely sucks at generating code for less popular languages or low level stuff like Assembly. A first year student could probably throw all their simple JavaScript assignments in and get working code, but the later projects are too complex and too difficult for an AI to generate anything coherent. They'd spend more time fixing broken code than actually learning the things correctly the first time.
This reminds me of something that happened at work a few weeks ago. I asked a colleague, who I consider more skilled than me, a question after reading the documentation, so he proceeded to ask BingGPT about the topic and, a few minutes after that, he answered with the same thing I had already read in the documentation. I don't think he even saved time by doing that, which is... curious, to say the least
Beyond the high school level, chatGPT gets things wrong so often that using it slows your work.
In my experience you can use ChatGPT to write scripts in Tcl, Linux bash, or Python alongside your actual code, whether it's C, C++, etc. It may help find errors in your code, or spark ideas.
Keep in mind that ChatGPT doesn't even compile the code it generates.
I think we'd be better off learning how to use a calculator than how to use ChatGPT.
Most students finish school without ever using an "advanced" button on the calculator, even the one that stores a value to memory.
Learning to use the specialized functions on a Casio calculator is like learning to use a new library: it involves reading documentation, understanding the use case, figuring out the input/output, and developing the skills to check whether the results you get back are correct.
I'm using it for maths. Sometimes it makes mistakes, so I mostly research the material beforehand and ask it to double-check my answers. Sometimes when I don't have the info, though, I ask it to explain. It's better than nothing, but I wish I could have a human teacher.
That's just using it as a shitty calculator that isn't always right lol
As an Ecole 42 student, we don't have teachers. Some projects come along and the system says "do it; I know you don't know anything about this project, but you will learn while doing it." In this situation, of course I'm using ChatGPT. But "Hey GPT, this is my subject PDF. Write all of this project" is not my prompt. Besides, GPT can't do a whole project by itself (yet). Generally my prompt is "GPT, what is deadlock? Can you give me an example?". I am learning so many things with this method. We must catch the train, but not let AI do all the work and make us lazy. We must push ourselves.
I agree, if you ask short well structured questions it is really helpful. I'm also into Philosophers now😊
@@ekaterinamikhailova5201 Ah, good to see an Ecole 42 student :D Where are you from? I finished cub3d yesterday. My intra nick is yciftci xd
Yes, right. I'm doing printf right now.
Something Copilot IS really good for is when you know how to program more or less, but you don't know a language's std library.
Ex. I know C, but my boss wants to throw me into C++ for a while. Your velocity is much higher when you know what you want to do, but not the function calls/syntax needed to do it.
That said, it won't be as good as knowing the libs yourself.
Even if those tools worked better, the actual question is: how good is a natural language at describing/expressing a computing-specific problem vs. a programming language describing/expressing the same problem? Kind of similar to maths: a few symbols might express something uniquely, vs. expressing it in text, which might be ambiguous.
Have you tried using TDD with LLMs? You write the tests and ask the LLM to write the program that makes the tests pass. The LLM gets all your test results, the compiler output, and, if an executable was created, the results of your test program.
The question is whether you should learn programming at all at this point. AI is growing at a breakneck rate. Despite not being actually intelligent, it may be just good enough most of the time. This example shows that ChatGPT is not good enough yet in this particular use case. However, companies may develop their own code-generating AIs, suited to their needs. That may not be a silver bullet, but it can reduce the required workforce. That's just a guess for now; we'll see what the future brings. Personally, I would rather advise young people to go into manual labor now. More and more often it pays as well as a mid-level developer job. Additionally, I doubt real estate bubbles will ever pop, since governments happily support them. At the same time, the IT eldorado seems to be ending.
I used ChatGPT to adapt a straight command-line dos sound utility to Windows. All of the real work was being done in the SoX library, which I linked statically, so the Windows code was basically a wrapper. ChatGPT saved me some time, but it was only plugging in code that any commercial IDE would have auto-generated. Just boilerplate.
You say it saved you some time... but.... did it?
I agree with you. Our professor asked "would these progressively impressive new GANs replace us?", and all the yes answers were naive.
my best use case for ChatGPT is for a quick refresher on a topic and the worst one is asking for something specific. It will get it wrong.
This was an absolutely excellent video on the current state of AI. It put my thoughts into words almost flawlessly, especially the bit at the end that it can't solve new problems.
A really great video to see after months of hearing people treat AI as though it were actually a genius that could eventually replace every programmer on earth one day.
I'd tried various things myself, and the second I gave it one of my real-world problems that I had solved (but that there isn't a Stack Overflow solution for), it would pretty much break down and give complete garbage. Once, I told it that a particular part of the code was wrong to see if it'd fix it, and it actually argued that it was correct, before explaining that it had done exactly the opposite of what it actually did.
waited long for such video👌
I see ChatGPT as a junior developer. Does it replace a senior dev? No, but it can sometimes provide good suggestions. I'm saying this as someone who learned programming before AI; as a beginner it may be tempting to use it, and it may be fine if you use it as a complement to other learning methods like code examples, documentation, books, practice, and so on.
Not even close, ChatGPT is overrated, it has some uses but it's nowhere close to a human developer except maybe Frontend JS developers.
@@haroldcruz8550 Yes, maybe "junior dev" was overstating its capabilities. I don't have expectations of its solutions; it's more like "here is some code that resembles the average of what people write in similar situations." If the situation is easy, it's very helpful, like "parse this file line by line in this programming language I'm not so familiar with." The downside is that it gets details dangerously wrong.
The very-few times I asked ChatGPT to generate some C code for a problem, its first few "upbeat" solutions failed to account for edge cases (eg: division by zero or negative numbers). Additionally, its data representation was "beginner level" and its suggested algorithms simply "brute force". Don't expect "Quake Square Root (approximation)" to fall out of its machinations. Is "good enough" really good enough?
That code looks like it's meant to be used within Code Composer Studio. Did you try that code in CCS?
Lol, makes sense as to why there were so many mistakes.
It'd be cool if you made one about C mocking libraries.
LLMs are a better search engine for coding... Being exposed to code is being exposed to code... What are you going to do... Go back to writing assembly? What are you going to do... stop using a calculator... Remember you won't have a calculator with you all the time in the future so you got to learn the maths! LOL... Use a code specific LLM... Learn the weaknesses.. and then forget all that in a few months as it improves...
It would be quite impressive if it weren't wrong most of the time. I've resorted to using it for cooking recipe advice. It's quite good at that. If you don't like cinnamon on your Weetabix, you ask for alternatives until it gives you one you like.
I read that GNU pthreads run at kernel level. Is there any threading library on Linux that runs at user level? I mean, where the kernel is unaware of the threads.
If the kernel is unaware of the threads... then they're not really threads.
10:21 This code is right, but the comments are somewhat wrong, because if you remove the initialization, the ints in the array might contain trap representations, and reading a trap representation causes undefined behavior.
Asking ChatGPT to write a giant complex script? No no no...
Using LLM-based Copilot for coding? YES! YES! YES!
What do you have to say about Devin?
ChatGPT reduces the amount of documentation you have to read and the second good use is to ask it to explain code you cannot understand. I haven’t found good usage other than that.
Students need to do it in two parts - learn programming, and once they have some level of comfort, learn tools at least enough to convince employers. Thankfully, employers don't ask for certifications in common IDEs, so hopefully they don't ask for chatgpt/copilot certifications.
The problem with LLMs is that they are not honest.
better for intuition and common stuff than complex stuff
I barely use it
ChatGPT's capabilities will have zero to do with its influence on the turnover rate for this industry. I would focus more on ChatGPT as business logic.
8:07 No, ChatGPT is not meant to be smart. It is machine learning, which by definition gets as close to the correct answer as it can without being guaranteed to be correct.
ChatGPT 4 gives this:
#include <msp430.h>
// Assuming the button is connected to P1.1
// and using the internal pull-up resistor
void setup() {
// configure button pin as input with pull-up resistor
pinMode(PUSH2, INPUT_PULLUP);
// enable interrupt on the button pin
attachInterrupt(PUSH2, wakeUp, FALLING);
}
void enterLowPowerMode() {
// enter low power mode
__bis_SR_register(LPM4_bits + GIE); // Enter LPM4 w/interrupt
}
void wakeUp() {
// This function will be called when the button is pressed
// Exit low power mode
__bic_SR_register_on_exit(LPM4_bits);
}
void loop() {
enterLowPowerMode();
// Your normal code goes here
}
I think that for documentation, or if I'm stuck with an error, it can help me figure things out, like Googling or using Stack Overflow, but finding the solution faster.
Of course, for projects or programs with some logic it makes errors and doesn't work, but I think that is not the point.
@@_________________404 I use it for making examples and exercises without the solution, and then try to solve them myself.
I didn't use it for high-level programming, only for studying.
Was this tested on ChatGPT 4.0? It seems even more inaccurate than usual.
Nah, he did it with 3.5. What did he expect?
Is he too cheap for a subscription?
I'm using GPT 4.0 for C, and I would say it's just as bad.
To me, ChatGPT is a glorified rubber duck, and sometimes useful for finding documentation.
well, the bigger question is: are we friends with ChatGPT? cuz if not, it's definitely coming for us
it can't do creative coding for you. sure, it gives you ideas from existing code and famous algorithms, but give it any competitive programming question from Codeforces and it struggles
you could be teaching your students how to program their own AI and learn how it works, instead of trying to fight against the grain. how about making them write an AI, and everyone has to use it to take a test based on what it was trained on, instead of trying to discourage use? anyone who ignores the utility of AI will fall further behind
I'm sure in time, AI will be able to do most of it. But I think the goal is to make AI lighten the workload, i.e. the programmer can write more natural statements in the way human think and speak.
3:54... how can you have 5 and a *half* pins? It's either 5 or 6. It's not physically possible for a pin to both exist and not exist.
not sure if you're joking. 🤔 P5.5 means port 5 pin 5.
@@JacobSorber Ah, that makes more sense. I've never done embedded programming or whatever it's called. Had no projects that I wanted to do in that direction, so it would've been a pointless expense.
ChatGPT launched at around the same time I started learning C on this channel. It was mostly useless for me then, because it spewed out code and technical words (my knowledge of programming terms was low at the time). It was only when I stopped using it that I saw improvements. (C was not a good language to begin with 🤣)
Tell that to every employer 😂😂😂
Great 👍🏼
Excellent point! ChatGPT is for the unmotivated and lazy, like a character from the TV series South Park, and for brainless bugs.
You used GPT 3.5, which is the inaccurate version.
Cheap shot! Who asks the GPT to write an entire program like that? And who cares about Energia?!!
Students who take my classes (my target audience) sometimes ask GPT to write entire programs. As for Energia, it was relevant to a project that my embedded systems students were working on. So, they (and I) cared about Energia. Also FWIW, this is also the code-equivalent of the ad hominem fallacy.
Fair enough! While your interest in Energia is commendable, this example does not effectively demonstrate how to use the tool. According to Andrew Ng, improving prompt engineering skills is essential for effectively utilizing a GPT system. Asking GPT to write an entire program using a relatively obscure entity like Energia is not an example of good prompt engineering. I’ve successfully used GPT to create entire programs by providing detailed and precise prompts.
Additionally, you didn’t mention which version of ChatGPT you used. The latest version, GPT-4o, is remarkable, though it requires a paid subscription. As someone who loves the C programming language and has written many lines of code in it, I appreciate your efforts in teaching it. However, your example using GenAI is not compelling.
Given all that, I agree with you that students must NOT rely on GenAI to do their Homework. They need to struggle with languages such as C to understand how computers work. Thanks!
In short: web developers, you are deprecated, ChatGPT is smart enough to make websites 🤣. Embedded developers are not being replaced in the near future 😎
You are using 3.5, which is really a mess, but hold on, try GPT 4 and you will totally change your mind... I dare you to try it and re-record. Otherwise this video is useless; the point is to create it using version 4. We all know version 3.5 is a mess.
I find your video a bit unfair.
LLMs and especially ChatGPT 3.5 are not meant to be used with your brain in hibernation mode, at least yet.
You still have to be able to question ChatGPT's output, especially with niche or complex prompts, and you still have to have somewhat of a clue about the topic you're asking about and double-check its answers accordingly, especially if something seems way off.
But IMO, in 90% of cases ChatGPT gives precise answers to, and that's the magic word, PRECISE prompts.
I would never just directly copy what ChatGPT outputs, but it gives a nice overview about certain concepts and how to possibly implement them.
The video is commenting on how I see students using ChatGPT, not how it could or should be used.
Misleading title. Simply calling a tool useless is like calling yourself useless, which isn't true. Anyone and everyone is only as good as their use of the tools around them. Don't blame the tool; blame its wielder. If you're not using AI as a tool to aid you in solving tasks, then you will be left behind. This video is like saying "don't use calculators, because if you mess up, your math is messed up," which is terrible logic. It comes down to the person using the tool and providing accurate information to the tool so it can better aid you in reaching an end resolution.
Continued: in the instance of this video, the creator gave lazy, inaccurate, incomplete, and unclear directions to ChatGPT, which we all know are required for it to more accurately provide a working base solution for the user to extrapolate and refactor as necessary.
@@_________________404 Not true. If you're doing basic arithmetic, then maybe your logic holds, but if you're doing math that involves time- and session-based parameters, your argument goes out the window. Instead of thinking up outliers that aren't relevant, look at the substance of the post, where I'm discussing how the video is misleading and how tools are only as good as they are used or created. Seeing as AI is made by programmers, it too falls into this category of reliance upon its creators and users. So sorry, friend, I think you have it confused: it is entirely "about the person," and if you fail to understand that, then you need to understand what AI even is. Many of us programmers have made our own forms of AI for different use cases, so we clearly understand that AI has many implicating factors, but it mostly comes down to how well "I" the programmer build or implement it, and then, once it's ready for use, how well I use the tool or how well I made it.
wow, I think I touched a nerve. But, I'm not saying what you seem to think I'm saying.
I firmly believe he did mention the "tool" isn't useless. It's just not as reliable for a larger codebase. Maybe, just maybe, it works for learning purposes or as a teaching tool for simple concepts.
Bad advice... use the tools around you. We won't have programmers for much longer anyway.
You're funny
I like that you're the reason more engineers will be hired to clean up the garbage you make with ChatGPT and CoPilot 😊
@@SimGunther Revisit this in 5 years. Programming is not some special gift from god. It's logic. And most of the code written today is crap, btw. Good luck, champ.
you are definitely a web dev.