This is more obvious if you understand how neural networks work. They have error margins, so there will always be a need for some kind of check when one produces something. Images and video have leeway, since nobody notices a few wrong pixels. But for code, even if something compiles, someone needs to make sure it does what it's supposed to. And that means someone needs to understand what's being produced.
Exactly right! AI will speed up a LOT of tasks for us, but ultimately, humans will have to check and verify (i.e. the boring jobs) everything it produces. One interesting point though: couldn't we program it to stop introducing vulnerabilities into the code? Nice talk! Thank you!
You cannot program it to stop doing something it doesn't even know it's doing wrong; it thinks it's doing what it's supposed to do.
What software/program was he using at 3:02 ?🤔
PowerPoint
This is just wishful thinking. The AI is going to steamroll this kid's future.