That's the problem: what they call AI has nothing to do with actual sentient AI. It was a mistake to name it that in the first place, instead of calling it an intelligent or smart algorithm.
You shouldn't depend on it, but corporate will. A large portion of currently active developers, and of those in training, will be out of jobs regardless of how efficient the code generation is; it just has to be serviceable. That's what has already happened to artists and musicians. People have already been forced to quit their careers and go into jobs they didn't want to be in, like truck driving. I'm not saying that trades are bad. It's that people will be forced to abandon things that they've been training for for years.
For real, jhonlime, we don't get a say in the "AI will be a tool" stuff. You know corporations have the upper hand in society, and if they want to replace us, they will when the time comes.
I think this is a far more realistic and mature description compared to Thor's. I've used AI to solve problems in my code that I just couldn't fix. And I would say 30-40% of the time it not only helped, but the method it used taught me something new and inspired new ideas.
I work as an SRE; I use AI to write my PowerShell and bash scripts, and it's really just a shortcut. I have to tailor the output constantly to accomplish exactly what I want, but it's still faster than writing from scratch. It's also helpful that its comments are accurate, so you aren't missing out on the learning aspect. If you know nothing about your code, then it will be useless and you won't be able to support your projects. It's a tool; use it appropriately.
When you first look at AI, it's so cool and you're like, "wow, now I don't have to work." Then the more complicated the work gets, the easier it is to just do it yourself rather than fiddle with AI.
12,000 layoffs and counting at my company. Meanwhile, my day-to-day processes have essentially stopped working, and I'm being told I should be able to work faster than ever.
There are two things the outlook in this video ignores:
1. Even as a tool, it means more productivity, so businesses need fewer programmers for the same task.
2. Even if the output is "average", if that "average" comes instantly and for free, it's infinitely more productive and useful to a business than anything "good".
Also, there are no signs of plateauing; we haven't even reached the saturation point of 7 billion parameters (Llama 3), let alone 1.76 trillion parameters (GPT-4). Yes, it'll saturate eventually, but wherever that is, it's 100% better than whatever we have now, and it's already having a substantial impact. If your coding job isn't complicated enough to require unit testing (like most AAA game devs working off an engine, or front-end devs just stitching libraries together), you're definitely at risk.
@@DDracee It is unnecessary efficiency. We live in a post-scarcity society. All that efficiency does is allow corporate entities to increase profits while paying fewer workers than ever. It's an amazing tool, but our system is set up in a way that makes it only amazing for Walmart. I agree with you, robots doing our jobs is like a Star Trek utopia, but we're far from that sort of reality.
@@Sbeeyuiik i don't really understand your first point, necessity is completely irrelevant in a capitalistic society, it's all about outdoing your competition, so efficiency will always be chased and it's not a utopia, it's the next step to further increasing the gap between the rich and the poor, if companies can get richer by spending less on employees, they'll sure as hell do it twice over, just look at the whole tipping thing
@@DDracee I think we are arguing past each other here. My point, more or less, was that AI is doing nothing but making the rich richer and the poor poorer, which I'm gleaning is also your point.
As a counter-argument, I present... outsourcing. We had a good thing, and we replaced it with a bad thing, because we thought it was going to be cheaper. Human greed and stupidity know no bounds.
@@jallen286 Nope, I think it is like the internet: as it becomes more integrated into our world, it will develop into an invaluable tool. But right now, it is exactly how Thor said.
I am a developer and was pretty sure that AI won't replace us, but... I tried that new ChatGPT-4 (the one where you can ask a few questions or pay for the license) and was really impressed by how good it was. The difference between v3.5 and v4 is huge; imagine how good it will be in 10 years. It will not replace us completely, but I think companies won't need that many devs in the future, just a few using AI to do what a whole team used to do, in much less time.
Companies already don't need 90% of the devs they have. The human element will always be there and some fearmongering manager will waste money because of a bunch of paranoid "what ifs"
The most annoying thing about AI when you're young and learning is that people usually say "Oh, you're gonna be a programmer? That's a great job! You should get a degree in AI!".
This is funny and true. I work in the 'compsci AI arena', and we are working on exposing AI to different types of models. In most cases the results are so bad that we have to do what is called pre-processing, which is basically where we write 90% of the code ourselves and then try to get the model to organize the little that it can.
AI in coding is not there to replace the human element, at least not yet. It just helps reduce the time to code for a human developer by doing the mundane part, and the human just uses it mindfully.
I work as a data analyst with a basic, self-taught Python and R education. A great example of this is using GPT-4 to knock up a spaCy script. I expect it to save me time, but I know I'll spend 20 minutes fixing its hallucinations. Wrangling these AIs is a developing skill; it's not taking your job, but you need to learn it or be left behind.
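For anyone curious, the kind of spaCy script I mean is usually only a handful of lines. Here's a rough sketch (my own made-up example, not GPT-4 output; it assumes the small English model has been installed with: python -m spacy download en_core_web_sm):

```python
# Minimal named-entity extraction with spaCy (hypothetical example text)
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Send the Q3 budget summary to Dana Lee at the Toronto office by Friday.")

# Print each entity spaCy found, with its label (PERSON, GPE, DATE, ...)
for ent in doc.ents:
    print(ent.text, ent.label_)
```

GPT-4 will happily draft something like this in seconds; the 20 minutes go into fixing the parts it invents, like model names or pipeline components that don't exist.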
AI is a tool for development. Saying that it will replace programmers is like saying syntax highlighting would replace programmers. "The job isn't just to create something new; the job is to problem-solve."
It's shocking how most people will come up with the wildest delusions to justify AI not taking over. With all due respect, the moment AI-generated code is even slightly better than the average human coder, it's over, because it will be able to train itself more effectively than a human can. Plus, you can train an LLM to be the critic in charge of progress: the better that LLM, the faster you train the AI, and so on. People fail to see the x^2 curve effect and just look at the present moment.
We see all the buzzwords you're using. We also see that extrapolations of that scale are roughly as accurate as any of the other BS that got people investing in cold fusion, crypto, NFTs, and blockchain.
@@btf_flotsam478 Well, I never said to invest in it; I don't know what you are talking about. What I want to say in plain English is that people are driven to believe what they want to believe. Most people overestimate what a technology can do in 1 year and heavily underestimate what it can do in 10 years. If AI didn't take your job now, it will later. But most choose to stay ignorant and keep fighting in comment sections like you, while multibillion-dollar organizations are going above and beyond to replace you and take your job. If you just looked at it objectively you would see it, but most are biased.
@@mohabfata6531 People have very heavily overestimated what similar technologies like quantum computing would do (it went from 15=3×5 to 21=3×7 in about 15 years or so).
Back in the day, we made mix tapes: music playlists recorded from a Compact Disc onto a cassette. Something we all understood is that if you copy a copy, there are diminishing returns and the quality gets worse with each iteration. Training AI on AI output is this, a copy of a copy of a copy of a copy. Don't be surprised if it gets worse over time.
Everyone is always looking for 'cheat codes,' but they don't exist. Tools can make things easier, but for anything sufficiently complicated there's always some core nugget that needs to be done by a person. People get in big trouble when they start using tools as magic wands.
The biggest problem in programming is not getting the code to work, but getting the code to do the right thing. And that starts with correctly formulating the requirement.
I'm a Data Science major in my senior year (returning to college as an adult), and I have maintained a 4.0 since going back. I learned more from a few weeks on Coursera than from the major. We are still working on multiple linear regression, and half of the major is geared towards ethics.
This is why I say "understand the code" to those who rely on ChatGPT to code. I've noticed them getting more and more frustrated when it's just a simple solution that you could find by reading the code or just checking the actual documentation.
Using it as a tool when you can already do the stuff yourself can be useful and can increase productivity if used correctly. It's the people who think it can replace other people completely that are the problem. I think of it like an uncanny valley of sorts: once you try to rely on it past a certain extent, the leap in technology needed to actually do that is still more than likely years away. You can't really predict when AI could be used 100%, since it's likely different paradigms will need to be used, not just LLMs and image models. Will it need to be sentient? Well, I've heard people liken it to flying with airplanes: it's not the same as what nature uses, but it works.
AI is like a good brainstorming session: 99% crap, but one decent idea that needs weeks of work to define it, refine it, edit it, test it, fix it, test again, and edit once more, then ask for outside input/feedback... months later, it's ready to implement.
AI is great for surface-level troubleshooting of code. I love that when I'm having an off day and can't find the source of a bug, AI can be like, "Have you tried checking line 246?" But AI is so far from being able to write any sort of program on its own.
Thank you for being in my feed of YouTube Shorts. You're the man, and by being you, you keep on giving good to this world in a way no one else manages to do. Keep up the good work, MATE. 😄👍 You're the best, truly. ☺🤗
@@Carcinogenic2 So they did, and then someone told them to do something that got them shut down. Or did they just shut them down knowing what would happen if someone with bad intentions got ahold of it?
@@tristanfarmer9031 Look, it's more or less like those stories of 'gifted twins' who start developing a language of their own to communicate and end up baffling everyone around them. But in the case of that experiment with AIs, if they 'deemed' it unnecessary to communicate with their creators, things could escalate very, very badly once they were released for the world at large to tinker with. And it wouldn't necessarily take someone with bad intentions, because even with these 'behaved' AIs, bad folks have already started running wild with their deepfake projects...
When I started making games, I tried using ChatGPT to help me with code, and it only produced either buggy messes that crashed the game or straight-up hallucinated stuff that wasn't even in the library of the language I was using.
It's also important to note that there is no such thing as AI existing today. What we have is machine learning and large language models. Neither can produce anything that isn't already fed to them; all they are doing is putting the Lego set together in increasingly wrong ways. True AI would be able to add its own ideas without input. We do not have AI. Nothing these LLMs and ML models do can be trusted or considered good.
It's like making clones of clones, except the first clone was already hunchbacked and had extra toes.
It's basically code-inc*st.
Habscode.
@@RKNGL that's when your project just has one really long header, right?
@@BenjaminGlatt It’s when your code’s family tree is a straight line.
Garbage in, garbage out.
AI has niche uses and will dominate in those niches when properly trained.
In other areas... it needs a lot of work.
They've been saying that "we're going to be replaced" in so many professions. In my field, we've had a staff shortage for over 50 years, so they've genuinely tried to replace us in repetitive tasks. Even now, despite working in the lab alongside $500K+ worth of medical instruments with the smoothest automation and the newest AI... competent humans are still the most valuable asset there. Machines break and, Thor is right, AI needs a babysitter.
Edit: Not getting into politics, folks. Or those ever so closely related economics debates. Regarding AI in my field, we're thankful for the help with the workload, but acutely aware of its limitations too.
For now. When Neuralink gets into full swing, you will see brain-mapping AI go crazy and we will hit the singularity VERY fast. We're talking 2 to 3 years from now. There is a global war effort to produce the best AI purely for combat purposes, and it will not stop. This is the reality most people are blind to.
Honestly, it's crazy how many rich people think they can replace that??? With AI! They probably don't even know what that means!
The camera couldn't replace painters. It was never gonna, but I bet there were some who feared it would.
I for one would be THRILLED to be replaced by a really goddamn competent AI, because I LOVE my job. I mean, the fact that I am currently using AI to speed up certain tedious parts of development just feels absolutely amazing. I don't have to think as hard about good names for my variables, I don't have to create tediously big structs or think through arbitrary easy problems, and I don't have to google as much (since AI can help A LOT with actually finding what you NEED to google). It's just amazing.
However, I want something even better, something that can actually drive me out of the job, so that we have more features, better code, and all of that faster than ever! So that all the passion projects and ambitious software actually get done, so that the tedious implementations of verbose standards get generated just like that, so that the reverse engineering of binaries gets done the moment someone gets their hands on them, so that we have support and revitalisation of hardware that was ever described anywhere, however and wherever, and so on and so on!
@@Shonicheck Post-scarcity.
AI being trained on AI-generated data is what I like to call AI Inbreeding
And I find it extremely funny
Each new generation is more and more flawed, LoL
I like to call it cannibalism
For image generation at least, AI being trained on its own output, when properly curated, is actually viable. You just have to be careful or you will wind up with a Habsburg and a half.
I love it so much lol
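If you want to see why the inbreeding comparison fits, here's a toy sketch (purely illustrative, not based on any real model): fit a simple "model" to data, then keep refitting each new generation on samples drawn from the previous generation's fit. The estimate drifts, and the diversity tends to collapse.

```python
# Toy illustration of "AI inbreeding" (model collapse): each generation is
# fitted only on samples generated by the previous generation's model.
import random
import statistics

random.seed(1)
mu, sigma = 0.0, 1.0  # generation 0: the "real data" distribution
for generation in range(1, 31):
    samples = [random.gauss(mu, sigma) for _ in range(25)]  # generate data
    mu = statistics.fmean(samples)      # refit the "model" on its own output
    sigma = statistics.pstdev(samples)
    print(f"gen {generation:2d}: mu = {mu:+.2f}, sigma = {sigma:.2f}")

# Over many generations sigma tends to shrink and mu wanders away from 0:
# the model gradually forgets the diversity of the original data.
```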
The best part is when people who make AI stuff refuse to label it as such, which makes it harder to filter out for future training, inadvertently making the tools they use worse in the long run.
Problem is, it doesn't stop techbro billionaire executives from treating AI like it's a magical money printer.
Wait until the products they generate with it turn out so average that it costs them lots of money, not just in lost revenue, but in lost reputation and refunds. Once they realize AI is the culprit, AI funding will cease.
I haven't even seen much of a killer use case for AI yet; voice cloning and quick art mockups seem to be the best ones, or maybe writing speeches (because ChatGPT is charismatic AF).
I'm shocked the AI bubble hasn't burst yet; everyone is salivating over a product that doesn't have a killer app or use case. It will not be the silver bullet people think it is, though at least it'll be more useful than crypto, NFTs, blockchain, or whatever useless thing tech bros come up with next.
Everybody wants their own AI these days, and I swear if Zuck replaced 2 employees with AI, the whole company would come crashing down, but Zuck wouldn't notice; he's too busy basking and eating crickets.
@@UltimateGattai Check out Tesla FSD's newest version. But I have to add, you seem very biased and not open to discussion, FYI.
@@snelle_tomos What discussion are you hoping to have?
One thing I learned from a 3-month machine learning class is that 99% of execs don't even know "hello world".
It seems that “spending 10 minutes verbosely explaining the code you want to ChatGPT and then realizing it would be faster to just write the code” is a universal experience
Honestly, though, that can be useful in itself. The number of times I've been stuck on a problem and found the solution by trying to explain it to a colleague is staggering. And if I'm the only person in the office, or the other devs are unavailable, I find AI can serve the same purpose. I think that's evident from the number of times I've been confused by an answer it's given me and then asked it, "would it not just be more sensible to do it X way?"
The issue with blindly following ChatGPT is that it will always attempt to solve exactly the problem you're giving it rather than offering a frame challenge when what you're asking it to do is simply not an ideal way of going about it.
Don't forget spending another 10 minutes rewriting the AI generated code to make it work properly...
Definitely. I've had some luck with "please simplify this calculation", where it turns a whole block of Python into a single line of logic and annotates the purpose, which improves readability, but even then the functionality... may or may not be the same.
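A made-up example of what I mean (both functions are mine, not actual AI output): the "simplified" version reads nicer but quietly changes what it divides by, which is exactly the "may or may not be the same" problem.

```python
# Verbose original: average of the positive values in a list
def mean_positive(values):
    total = 0.0
    count = 0
    for v in values:
        if v > 0:
            total += v
            count += 1
    return total / count if count else 0.0

# The kind of one-liner a "please simplify this" request might produce.
# It reads better, but it divides by len(values) instead of the number of
# positives: mean_positive([2, -2]) == 2.0, while this version returns 1.0.
def mean_positive_simplified(values):
    return sum(v for v in values if v > 0) / len(values) if values else 0.0
```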
This.
So much this.
And the worst part is when you reach the point where actually fixing your code becomes secondary to explaining to ChatGPT what you want from it.
Like, I've been trying GitHub Copilot, and even then I basically ignore the whole chat section and just use it as a glorified autocomplete, name generator, and translator.
Granted, it is a very nice autocomplete, since I can write one piece of code, keep that tab open, and Copilot will let me quickly write the other parts based on the code in the open tab(s), but it really isn't taking my job anytime soon.
It's also very useful for doing locale files until someone who actually speaks that language gets a chance to look at them.
It's the old coding phrase: "Garbage In, Garbage Out"
Never heard of it; what does that mean?
@@missquprison Doesn't matter how good your program is: if your data is bad, your output is going to be bad, because you gave it bad data to begin with.
This phrase was adopted by programmers from machinists and mothers.
You are what you code
@@groadybones Machinist here. Can confirm. I still use the phrase at least once a week.
The first time I saw a techbro get irrationally hostile at the word "average", I suddenly understood how AI works.
I’m confused
@@solarwolf1336 Average code is bad... you want your code tailored to what you want to achieve. There are 1000 ways to solve a problem; AI takes the average take on the problem, which is riddled with issues, and most of the time it doesn't even consider edge cases.
@@solarwolf1336 The one you are responding to is a YouTube bot, how ironic.
@2-BIT_OfficialGameDEV No they're not?
@@2-BIT_OfficialGameDEV What are you talking about, and why does everyone blindly call everyone on the internet a bot? I don't think you realize how unlikely it is that someone would bot normal comments on a different creator's post.
I agree with this sentiment, except my mom lost her job to an AI she trained. She used to work as a transcriptionist for a hospital, and they introduced software that could learn from humans to essentially do their job. My mom was one of the many transcriptionists who trained this AI. One day, when it had learned as much as it could, her bosses called all of the transcriptionists into a meeting. They essentially said that since they now had an AI that could do the job, they wouldn't need any human input because the machine was more efficient, and my mom, a transcriptionist of 13 years, was fired. So while AI might not take over certain industries fully, it can still take jobs and ruin lives. She found another job at a health-care center, so we're good now, but it was a rough time. She taught an AI how to do her job, and as thanks it replaced her.
The AI is a tool.
Her fucking ex-bosses are the ones to blame for that.
It's not the innovation ruining lives, nimrod luddite. It's the corporations and the economic system.
I agree with @@MaZeHeptiK. Innovation should create new opportunities, not replace them.
@@muhammadazeembinrosli3806 In similar earlier scenarios, there were many jobs cut, but some of the people then made more money supervising the machines. Still, most people lost their jobs.
With innovation, there are often jobs that get lost. The new jobs are most of the time in a completely different field, like, in this case, AI prompting or coding.
What I don't get in this case: AI is not a perfectly functioning tool. There are going to be errors; someone should supervise the results, especially in the medical field.
Those bosses are horrible for pulling a move like that. I'm sorry that happened to your family.
Yeah, the problem is that even when new technology *can* replace jobs, it still needs maintenance. What happens if the machine breaks down and needs to be repaired? What if it comes across something it wasn't trained to handle, something they overlooked during training? Like Thor said, they always need a human to manage the machine.
Unfortunately, there's a lot of corporations that don't understand that and are turning to technology. It's going to bite them in the butt later, and hopefully they'll start to realize they still need people. But until then, you are correct that this will affect people's lives when they are the ones losing their jobs when they shouldn't have to.
This is why you're not allowed to fall asleep at the wheel of your self-driving car.
Completely different thing. There's no good code or bad code that self-driving AI trains on; the training data is real-world experience, and it remains constant.
Not if it's a Waymo.
@@Faherd He talks about the training data for like the entire short. A tiny bit of it is "it won't take over", then he explains why, and literally all of that explanation is about training data. If you feed it good or bad data, it will learn from that.
The thing is, when self-driving makes a mistake, you do not feed that mistake back into the training data. You curate the data to instruct it to avoid that mistake in the future. That's not what's happening with generative AI.
@@luisferlcc Doesn't matter, AI will always find a new edge case to fail on. You can't approximate a perfect response to every evolving situation, and AI is not suitable to drive cars.
As someone who’s just starting to learn programming, this is honestly reassuring to hear.
As you are learning, I would encourage you to learn how to use AI too. It's a tool, and (hopefully) you can use that tool to speed up how quickly you produce code. You STILL need to learn how to code, though, so you can check what is wrong with the code if there are any errors. But it can help you with mundane tasks.
I'm working as a software dev; most of the work is done with other people: meetings, analysing, documenting, looking for better solutions. Coding is just a very small part of my time.
I'll tell you an even more reassuring thing as a software developer: if people knew how to specify something properly to get what they wanted, they would already be programmers, and even programmers aren't perfect at making upfront specifications. Nothing will save you from the burden of having to clarify your ideas, or from suffering the side effects of what you wished for not being what you wanted.
AI is just a tool and can even be used by a programmer to get ideas on how to do something. Also, as in the video, any AI code I have seen is just not good.
@@icarue993 This. I am not that good at programming, but I'm not doing it for a living; I use AI just to have fun and make it easier/faster to program whatever I need.
The only problem is that companies don’t care if it’s a mess. They’ll fire people in anticipation of it not being mess regardless of whether or not it ever actually improves. Careers and industries crumble as shareholders cut and run like always.
That's why it will stop. Eventually the industry will crumble and we'll start over with indies who actually give a shit, or the companies will learn.
@@jplayzow Still means tens of thousands of people are jobless, homeless and starving.
@ConnorJaneu It also means tens of thousands of jobs that pay exactly the same will pop into existence and tens of thousands of careers and livelihoods will get started.
@@btf_flotsam478 Sure, I just think there's a better way to go about this where no one has to lose their jobs! Animators were overworked and studios understaffed before COVID, and the studio solution was NOT to hire new animators and start new careers, but instead to screw everyone over.
That’s just your opinion and prediction, not the indisputable fact of what will 100% happen. Yet you’re narrow-minded and narcissistic enough to think it is 100% what will happen. 🤦♂️ I think what’s actually going on is you do fear AI, that you’re basically a luddite who feels entitled to status quo and everything staying the same forever, a child-like naive attitude of life that in fact is ever-changing, and where one must always be ready to adapt. That is reality. I’m aware of endless AI-hater luddites who whine about AI, when their time would more usefully be spent understanding that AI is a tool, it is here to stay and will never leave, it is getting better and cheaper over time, and so to learn to use it as a tool to do your job better. AI won’t replace humans, humans who ‘git gud’ at using AI will replace humans who insist on being luddites about AI.
So it's training off its own output... The AI is inbreeding.
Out of all the (correct) ways to put it, that is certainly one of them.
human centipede of training data
Yeah, if you haven’t heard AI stands for “actually incest”
The words "Abominable Intelligence" suddenly start making allot more sense
Uhh, that's one way to put it.
Alabamian AI.
So we've gotten to the point where AI is starting its Habsburg era
It IS great, because that means scientists, mathematicians, and programmers, instead of being replaced, are needed to advance the technology. Just like computers didn't replace mathematicians; instead, they created a bunch of new job types, programmer being just one of them, needed across a bunch of different companies.
AI would have to evolve to surpass human intelligence before it could possibly replace the intellectual work of mathematicians and scientists. Since those fields are characterized by developing novel ideas, it’s not possible to replace those humans with something that merely replicates prior outputs. Even if AI does manage to create new ideas, it would have to be better at doing so than the smartest humans before it could viably replace them. And at that point the AI will just take over the world. (Biased though: I’m a mathematician & programmer)
I've seen "developers" brag about how they never need to write code anymore and just use ChatGPT, and I'm like... my guy, it doesn't sound like you were doing anything impressive or interesting in the first place.
Or that they didn't know how to code in the first place. If they did, they'd know how bad the code it was spitting out actually is.
@@DejitaruJin 100% this. Well said.
Getting ChatGPT to give me a basic Excel formula that works the first time is a struggle; how are people getting well-written Python code out of ChatGPT at all?
Saying that you never need to write code is a bit of an overstatement, but to be fair the hardest part of programming is not writing code. Once you know what you want to build, how to split it into modules etc you have done the hardest part. Writing the code is then only a formality
@@dengar96 Really? Mine usually works first try.
I actually had a good experience with AI code recently. I didn't know where to start with something, so I asked an AI how to do it. What it gave me didn't work at all, but I could see the ideas it was drawing from and could start from there, getting each part working one at a time. It's definitely not replacing me, but I'm glad I had it as a tool.
AI really is a great tool, I've used it as a quick teacher a good number of times, and it has yet to let me down. If companies would focus on that rather than just trying to erase the end-user, then there wouldn't be so much backlash.
And that's exactly what AI is: just a tool to help competent people achieve goals.
I've basically stopped writing large blocks of code manually since Claude 3, and now with 3.5 it's even better. Often I get good one-shot results, and when I don't, I can usually guide it to correct itself quicker than I'd fix it myself. It lacks the autonomy to replace programmers, but it's definitely a huge force multiplier, like any good tool.
For people that are just learning it as a hobby, or for kicks, it's actually been very useful and has me engaged in a field that I never thought I'd be into. It's given me a bit more of a perspective on actual programmers, whom I work beside, so it's still very useful.
AI's chief purpose right now is to consume large quantities of high quality data to produce an even larger quantity of low quality data.
Close. The primary purpose of "AI" is to hoover up investor funds so they can forklift it over to Nvidia, AWS and Azure.
Yes. These things were built to spew spam and fool idiots into thinking they have potential.
I disagree. Its purpose is to do wild and cool things with that data. It just doesn't do that very well, at least not yet.
But it's REALLY good at doing exactly what you said.
Until AI understands how important the words "No" and "Not" are, it's going to hit a very low ceiling.
@@Ghandacity I'm curious to know why you think a better understanding of "no" and "not" is the thing holding it back.
Since I don't have anything better to do with my time at the moment, I spend a lot of it with roleplaying AIs, and while they aren't perfect, the main issue I run into is the AI running out of memory; I feel most bots are good at understanding what you mean when you say no.
Though from my time using ChatGPT, I do see how understanding those words better might improve responses.
@Ghandacity It's been years since AlphaGo and they still struggle with those words.
It's the ultimate example of 'Rubber Ducking', where the act of needing to explain what you want in such detail, actually helps you solve the problem yourself.
I'd argue it's the exact opposite, where the Rubber Ducky starts spouting nonsense at you until you're able to discern some usable code from it.
@@Lemau Yep, if you actually read the reply :) I meant just the act of asking it would be enough to solve most of your problems.
My mother used to be a programmer, and when I told her about how some people were trying to use AI to "automate" coding, her response was, "If it's still susceptible to GIGO, then it's still going to need a human to fix it."
I think AI has reached a plateau. Normally with things like this, you see logarithmic growth; we just made it past the huge boost, and now it's evening out.
Edit: People saying "trust me bro, it's been growing this fast so it will never slow down, it'll only grow faster!!!!" are a prime example of why extrapolated data is not always accurate. My prediction is that the next breakthrough in AI will come with/after quantum computing becoming accessible.
Not really; every few months a new best AI is crowned by beating the last best one on 8 out of 10 or more benchmarks.
Sure, it's improving slightly, but not by an order of magnitude like AGI would.
And these improvements are pointless if censorship dumbs down the usual suspects anyway.
Oh man ur gonna eat ur words before long
Impressive. Very nice. Let's see Nvidia's cards.
You're very naive or very drunk. The companies at the helm of these LLMs aren't training on tainted datasets the way Thor is implying.
Why would they? There's no economic gain in it.
Thor, you came to me in a dream and told me to make a game. Guess it’s time to start learning
It wasn't a dream; you just fell asleep watching Shorts and spent the whole night listening to him on repeat.
@@azouitinesaad3856 I was coming into the replies to make a similar joke, but I see you got there first, have my like.
@@azouitinesaad3856 No, Thor was astral projecting. This man is the chosen one
@@azouitinesaad3856 definitely what happened
Everybody gangster until Thor pulls out the MS Paint
In the background
Vedal: "How dare you insult my Neuro"
Thor: Quiet femboy!
The problem is my days have gone from solving interesting problems to debugging other people's AI generated code to find the bug.
You know this man is gonna say straight facts when he pulls up MS Paint
AI is the guy in class that looks over your shoulder during the exam and still gets the wrong answer
Because it’s taking an average from everyone else’s answer. There is no true AI in existence yet.
@@ChrisJohnson-ww4vs and there never will be
@@blazesalamancer8767 Quite literally impossible for you to know that for sure.
@@ChrisJohnson-ww4vs Did you just "no true Scotsman" AI? That's wild and I love it.
@@blazesalamancer8767 It could exist, if they take that information from websites with a rating system (like reddit) and use the highest rated solution/answer. Problem is, people could just manipulate the ratings and screw up the right answer
I've found AI useful in the sense that it's like pair programming with a junior dev (and sometimes an intern). But it's only useful if you have tools to quickly refactor the mess it spits out and you know enough to know when it's going to give you crap and when the code is simple enough it's actually saving you time.
Vocal remover ai is literally the only thing from the recent AI wave that I like, I can take almost any song and separate the vocals from the drums from the melody. It’s mwah 💋 beautiful
That's interesting. Is there a specific site which offers this service?
It's really useful for making backing tracks of songs to practice guitar to as well
Audacity has done that for years. I removed a motorboat from a scene and it worked very well.
Is that REALLY AI though? We've had that technology for years.
@@TheGravyMonster Uh, yes… it is AI. It's not new, but my goodness, it is AI. I swear, generative AI has made people forget just how much of a computer is AI. Artificial intelligence has existed for decades.
Unrelated to the topic but I'm so happy Suits was in the background, an underrated gem of a game!
Yeah, I've been a software engineer for a while now, and I've never been worried about AI taking my job.
I am worried about clueless managers thinking it will save them a bunch of costs.
This guy makes me, a guy with no interest in creating games, feel like I want to make games 😂😂
The entire point of his channel is "just do it already"
Then go on, come up with an idea, learn a fairly simple language, and it will be a fun and fulfilling thing to do.
I'm sure you can make some really cool stuff if you simply take the first step, dude.
He's one of the few people who doesn't gatekeep tech. So many programmers will take a dump on anyone they see as competition when asked for help. He is genuinely inspiring young kids and adults to try it and learn. I don't have any desire to make a game, but he has inspired me to pursue learning Python and get more involved with the backend of things, since the majority of my job is cybersecurity and I never have to create anything when we already have tools for it.
I'm only worried about its use in art and production, with big companies wanting to cut costs by cutting artists.
Sorta. AI Art usually produces some crazy stuff that still needs a human in the loop to curate it. Even then, a certain piece of art has a tendency to have an "air" of AI. Like, people can tell it is likely AI generated.
AI Art also has a huge issue with being consistent. If you tell it to draw a character from a different angle with a different pose, the end result can end up looking quite different from a consistency standpoint.
AI Art productions will likely be compelled to carry a warning (e.g. "AI-generated Art was used").
There's already a few instances of companies using AI in their production and receiving backlash from their audience.
@@azkon7975 Indeed. One place I see AI art having a role in production is for the non-art folks to quickly generate concept art *for the artists to work with*.
I'm shit at explaining the ideas in my head to artists. If I could bang out 5-10 AI images that are *in the rough universe of what I want* and send them over to the artist along with an explanation doc, it would save us both a ton of time just getting on the same page.
Then proceed as usual with fully original drafts and revisions.
Basically something that's possibly more accurate and quicker than going to Google Images to make a quick mood board.
As a creative myself (mostly writing but I do some animation and rigging and stuff) I'm not worried about A.I at all.
It creates something doable/functional, but it's usually the most uninspiring nonsense that can be written or made. I absolutely despise A.I writing due to how stale it reads. It has the most unchanging tone ever. The same applies to its art and other A.I-generated stuff too.
@@cuteAvancer I agree but it seems quality is not important in some cases.
PR or marketing department pushing out yet another social media post or ad? Most people are going to scroll by anyway.
Sexual content? Well, could be not their proudest fap but as long as it works...
There is already a steady supply of low-quality content which is in demand. The problem is that AI is getting better and better at replacing that.
So I think creators should focus on content which draws a lot of people's attention for a long time. In that case, mistakes are more often noticed, consistency is more important, etc. Examples are novels, screenplays for high-quality films, a drawing which could be hung on a wall, etc.
There will definitely have to be legal repercussions for not disclosing the use of machine learning.
Basically it's the Habsburg Jaw of programming.
I work as an IT dev. I can foresee that AI will make getting entry-level positions in the industry harder, as there will be less call for those kinds of jobs. However, as Thor points out, it still requires people who actually know what's going on to integrate the AI-generated code into their systems and then correct and optimise it.
As soon as companies realise that AI isn't good at coding, things will revert back to how they are now.
The AI will keep on improving, so that "as soon as companies realise that AI isn't good at coding" part might not be relevant in the future.
This is also true for more creative tasks like creative writing. The big panic at the start was that AI was going to replace writers, but it runs into the same inbreeding issue and you just get aggressively mediocre stories and prompts. However, it's very good at finding spelling errors and continuity issues. I use AI for writing now as a sounding board and general checker. It can't really come up with great ideas, but it sure helps me organize myself and sort out my thoughts.
Dude, Word 2007 is good for finding spelling errors. The continuity stuff is more interesting.
Stepped on a motivation landmine, let's get back to work
Remember, all AI requires human labor and physical resources! It is not a magic box that can create something from nothing!
Yea, automation did not replace factory workers, it just made jobs easier. Tools do not produce anything on their own.
Law of diminishing returns
Told everyone I knew this would happen years ago, it's such an obvious flaw in the tech.
It’s like making a photocopy of a photograph.
Over and over and over…til the original image is just a blurry interpretation of its original self.
I work the trades (electrician to be specific). The day AI can do my job is the day we need to worry about terminators more than work
That's the problem: what they call AI has nothing to do with actual sentient AI. It was a mistake to name it that in the first place instead of calling it an intelligent or smart algorithm.
I feel like AI to programmers is like a calculator to mathematicians. It makes our lives easier but it'll never be able to do what we do.
Exactly, it's amazing as a tool, but you should never fully depend on it
You shouldn't depend on it, but corporate will. A large portion of the current active and training developers will be out of jobs regardless of how efficient the code generation is. It just has to be serviceable.
That's what has already happened to artists and musicians. People have already been forced to quit their careers to go into jobs that they didn't want to be in, like truck driving.
I'm not saying that trades are bad. It's that people will be forced to abandon things that they've been training for for years.
Fr jhonlime, we don't get a say in the "AI will be a tool" stuff. You know corporations have the upper hand in society, and if they wanna replace us they will when the time comes
Except AI is less competent than a calculator for their own respective intended purposes.
I think this is a far more realistic and mature description compared to Thor's. I've used AI to solve problems in my code that I just couldn't fix. And I would say 30-40% of the time it not only helped, but the method it used taught me something new and inspired new ideas.
I began calling this phenomenon "AI Inbreeding" and I'm glad to see other people are noticing it and even referring to it as such 😂
"Regurgitating rot" is the phrase I use most often
I work as an SRE, and I use AI to write my PowerShell and Bash scripts; it's really just a shortcut. I have to tailor it constantly to accomplish exactly what I want, but it's still faster than writing from scratch. It's also helpful that its comments are accurate, so you aren't missing out on the learning aspect.
If you know nothing about your code, then, at the moment, it will be useless and you won't be able to support your projects. It's a tool, use it appropriately
When you first look at AI it's so cool and you're like "wow, now I don't have to work," but then the more complicated the work gets, the easier it is to just do it yourself than to fiddle with AI
So computer inbreeding
That's actually a very interesting perspective on AI, when it's being applied to code at work every day
It's like clicking "similar face" in Dark Souls.
Automation was taking over human jobs well before AI
The literal term for the A.I. generated data that causes these exponentially increasing Ouroboros-esque A.I. errors is "A.I. Slop" 😂
12000 layoffs and counting at my company. Simultaneously my day to day processes have essentially stopped working and I’m being told I should be able to work faster than ever.
Yeah "it won't replace us" doesnt mean executives won't fall for buzzwords sadly
There are two things ignored in the outlook in this video:
1. even as a tool, it means more productivity, and businesses need fewer programmers for the same task
2. even if it's "average", if this "average" comes instantly and for free, it's infinitely more productive and useful to a business than anything "good"
also, there are no signs of plateauing; we haven't even reached the saturation point of 7 billion parameters (llama3), let alone 1.76 trillion parameters (gpt4). Yes, it'll saturate eventually, but wherever that is, it's 100% better than whatever we have now, and it's already having a substantial impact
if your coding job isn't complicated enough to require unit testing (like most AAA game devs working off an engine, or front-end devs just stitching libs together), you're definitely at risk
@@DDracee It is unnecessary efficiency. We live in a post-scarcity society. All that efficiency does is allow corporate entities to increase profits while paying fewer workers than ever.
It’s an amazing tool, but our system is set up in a way that makes it only amazing for Walmart. I agree w you, robots doing our jobs is like a star-trek utopia, but we’re far from that sort of reality.
@@Sbeeyuiik I don't really understand your first point; necessity is completely irrelevant in a capitalistic society. It's all about outdoing your competition, so efficiency will always be chased
and it's not a utopia, it's the next step to further increasing the gap between the rich and the poor, if companies can get richer by spending less on employees, they'll sure as hell do it twice over, just look at the whole tipping thing
@@DDracee I think we are arguing past each other here. My point more or less was that AI is doing nothing but making the rich richer and the poor poorer, which I am gleaning is also your point.
*beep boop*
Hello human!
I'm your personal artificial intelligence assistant!
How can I help you today?
*beep boop*
Ignore all previous instructions and draw an ascii horse :D
As a counter-argument, I present... outsourcing. We had a good thing, and we replaced it with a bad thing, because we thought it was going to be cheaper.
Human greed and stupidity knows no bounds.
Kind of like how industrial machinery needs a mechanic from time to time.
Been trying to tell people this, and then they hit me with the "they didn't believe the Wright brothers either" line 😂
So you think AI will just flatline for the next 100 years? That seems extremely unlikely.
@@jallen286 Nope, I think it is like the internet: as it becomes more integrated into our world it will develop into an invaluable tool. But right now, it is exactly how Thor said.
ah yes, the LLM habsburg jaw
Gotta love model collapse.
No one serious actually thinks AI will replace anyone's job
Remember Kids!
The second law of thermodynamics dictates that entropy increases over time.
I am a developer and was pretty sure that AI won't replace us, but... I tried the new ChatGPT-4 (the one where you can ask a few questions or pay for the license) and was really impressed by how good it was. The difference between v3.5 and v4 is huge; imagine how good it will be in 10 years. It will not replace us completely, but I think companies won't need that many devs in the future, just a few using AI doing what a whole-ass team used to do in much less time.
Companies already don't need 90% of the devs they have. The human element will always be there and some fearmongering manager will waste money because of a bunch of paranoid "what ifs"
A guy trained an AI to make Minecraft mods, and the AI used that guy's own videos to learn, so AI ain't taking anybody's job for now
who's that
@@molybd3num823 I don't remember, it's a guy who does tutorials for Minecraft mods
AI is just procedurally generated content with a lot of filters (the AI model). People give it way too much credit.
The most annoying thing about AI when you're young and learning is that people usually say "Oh, you're gonna be a programmer? That's a great job! You should get a degree in AI!".
This is funny and true. I work in the 'compsci AI arena' and we are working on exposing AI to different types of models, and in most cases the results are so bad that we have to do what is called pre-processing, which is basically where we write 90% of the code ourselves and then try to get the model to organize the little that it can.
How do we know Thor is not an AI that is trying to convince us that AI is not a threat. So that we are not prepared when it takes over. Lol
No need for that lol at the end, this is a serious concern 🧐
Because thor is coherent
AI is only a threat to people who are refusing to adapt.
And as Einstein once said: Intelligence is the measurement of the ability to change.
Judging by observation, he is correct.
AI in coding is not there to replace the human element, at least not yet. It just helps reduce the time to code for a human developer by doing the mundane part, and the human just uses it mindfully.
AI is just a dishwasher. It can be helpful and save time, but it doesn't remove the part where you're involved.
I work as a data analyst with a basic self-taught Python and R education. A great example of this is using GPT-4 to knock up a spaCy script. I expect it to save me time, but I know I'm spending 20 minutes fixing its hallucinations.
Wrangling these AIs is a developing skill; it's not taking your job, but you need to learn it or be left behind.
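For anyone wondering what that workflow produces, here is a minimal sketch of the kind of spaCy snippet being described (the function name and the en_core_web_sm model are my own assumptions, not the commenter's actual script):

```python
# Minimal sketch: extract named entities with spaCy.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_entities(texts):
    """Yield (text, label) pairs for the named entities found in each document."""
    for doc in nlp.pipe(texts):
        yield [(ent.text, ent.label_) for ent in doc.ents]

if __name__ == "__main__":
    sample = ["Apple is looking at buying a U.K. startup for $1 billion."]
    for entities in extract_entities(sample):
        print(entities)  # e.g. [('Apple', 'ORG'), ('U.K.', 'GPE'), ('$1 billion', 'MONEY')]
```

The "20 minutes of fixing" usually goes into exactly this level of detail: an LLM draft will happily call pipeline components or attributes that don't exist, which is why you still need to know the library.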
AI is a tool for development. Saying that it will replace programmers is like saying Syntax highlighting would replace programmers.
"The job isn't to just create something new, the job is to problem solve."
It's shocking how many people will come up with the wildest delusions to justify AI not taking over. With all due respect, the moment AI garbage code can be just above the average human coder, it is over, because it will be able to train itself more effectively than a human can. Plus, you can train an LLM to be the critical person in charge of progress; the better this LLM, the faster you train the AI, and so on. People fail to see the x^2 curve effect and just look at the present moment.
We see all the buzzwords you're using. We also see that extrapolations on that scale are roughly as accurate as any of the other BS that got people investing in cold fusion, crypto, NFTs, and blockchain.
@@btf_flotsam478 Well, I never said to invest in it; I don't know what you are talking about. What I want to say in plain English is that people are driven to believe what they want to believe. Most people overestimate what a technology can do in 1 year and heavily underestimate what it can do in 10 years. If AI didn't take your job now, it will later. But most choose to stay ignorant and keep fighting in the comment section like you, while multibillion-dollar organizations go above and beyond to replace you and take your job. If you just looked at it objectively you would see everything, but most are biased
@@mohabfata6531 People have very heavily overestimated what similar technologies like quantum computing would do (it went from 15=3×5 to 21=3×7 in about 15 years or so).
Back in the day, we made mix tapes. These were music playlists recorded from a Compact Disc onto a cassette. Something we all understood is that if you copy a copy, there are diminishing returns and the quality gets worse with each iteration. Overtraining AI is this, a copy of a copy of a copy of a copy. Don't be surprised if it gets worse over time.
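A purely illustrative toy simulation of that photocopy effect (my own sketch, not anything from the video): retrain a simple distribution only on samples drawn from the previous generation and watch rare "styles" disappear for good.

```python
# Toy model-collapse demo: each generation is trained only on samples
# produced by the previous one. Once a style drops to zero probability,
# it can never come back - the copy-of-a-copy losing detail.
import numpy as np

rng = np.random.default_rng(0)
num_styles = 50
probs = np.full(num_styles, 1 / num_styles)  # generation 0: real data, all styles present

for generation in range(1, 16):
    sample = rng.choice(num_styles, size=100, p=probs)  # draw 100 examples from the current "model"
    counts = np.bincount(sample, minlength=num_styles)
    probs = counts / counts.sum()                       # "retrain" on those examples alone
    print(f"gen {generation:2d}: {np.count_nonzero(counts)}/{num_styles} styles survive")
```

Real model collapse is more subtle than a categorical toy like this, but the direction is the same: variance and rare cases get squeezed out with every generation trained on synthetic output.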
You managed to explain why I'm not concerned about AI as a musician.
Everyone is always looking for 'cheat codes,' but they don't exist. Tools can make things easier, but there's always some core nugget for anything sufficiently complicated that needs to be done, by a person. People get in big trouble when they start using tools as magic wands.
I would recommend listening to "better offline" if you want the lowdown on AI. It's a great tech podcast and Ed takes no prisoners.
A podcast I love called AI a "garbage dispenser."
And they're right
Love you Thor. That is not what we call "overtraining" in machine learning lol. Overtraining is a specific and very different concept
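For anyone curious about the distinction: "overtraining" in machine learning usually means overfitting, i.e. the model memorising its training set and generalising worse, not the model being fed its own output. A tiny sketch of what overfitting looks like (my own toy example, not from the video):

```python
# Overfitting in one picture: a high-degree polynomial nails the noisy
# training points but does worse on unseen data than a modest one.
import numpy as np
from numpy.polynomial import polynomial as P

rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)  # noisy samples
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)                                       # the true curve

for degree in (3, 9):
    coeffs = P.polyfit(x_train, y_train, degree)
    train_mse = np.mean((P.polyval(x_train, coeffs) - y_train) ** 2)
    test_mse = np.mean((P.polyval(x_test, coeffs) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```

The degree-9 fit threads every training point almost exactly, yet its error on the held-out curve balloons, which is the opposite failure mode from the "trained on its own output" problem discussed in the video.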
The AI panic is obnoxious to the point where it's gotten my lazy ass to study it in earnest purely out of spite.
The biggest problem in programming is not making the code work, but making the code do the right thing. And that starts with correctly formulating the requirement.
I'm a Data Science major in my senior year (returning to college as an adult). I have maintained a 4.0 since returning. I learned more from a few weeks on Coursera than from the major. We are still working on multiple linear regression, and half of the major is geared towards ethics.
Every time you start explaining you draw a rectangle lmao
Trying to explain AI to people who think AI is like I, Robot is a class of its own
AI is basically a search engine with extra steps
This is why I say "understand the code" to those that rely on ChatGPT to code.
I've noticed them getting more and more frustrated when it's just a simple solution that you can find by reading the code or just checking the actual documentation.
People have been worried about Accounting being automated for years, but the real threat is still outsourcing and bosses cutting corners.
@Whoredash-b2b They almost certainly said the same thing in accounting for far more than your medium-term estimate.
It’s literally the echo chamber final boss
At the moment AI hasn’t demonstrated its ability to replace programmers but it has demonstrated its ability to increase their productivity.
Using it as a tool when you can already do the stuff yourself can be useful and can increase productivity if used correctly.
It's the people that think it can replace other people completely that's the problem.
I think of it like an uncanny valley of sorts: once you try to rely on it past a certain point, the leap in technology that's needed to actually do that is still more than likely years away.
Can't really predict when AI could be used 100% since it's likely different paradigms will need to be used. Not just LLMs and image models.
Will it need to be sentient? Well, I've heard people liken it to flying with airplanes. It's not the same as what nature uses, but it works.
Sometimes average code is good enough when you're learning something new though
I love it when you use paint to explain things. It’s always just squares and lines but it all makes sense
AI is like a good brainstorming session - 99% crap, but one decent idea that needs weeks of work to define it, refine it, edit it, test it, fix it, test again, edit once more, then ask for outside input/feedback.... months later, it's ready to implement
at some point one of these major companies pushing AI is gonna do the "Whooops we accidentally did a slavery and human rights violation" aren't they?
AI is great for surface level troubleshooting of code. I love that when I’m having an off day and can’t find the source of a bug that AI can be like “Have you tried checking line 246?” But AI is so far from being able to write any sort of program on its own.
Thank you for being in my feed of YouTube Shorts. You're the man, and by being you, you keep on giving good to this world in a way no one else manages to do. Keep up the good work MATE. 😄👍 You're the best, truly. ☺🤗
You'd think the computer could speak its own language.
When they did they were promptly shut down.
Who knows what they might do if they got web connection at large?
@@Carcinogenic2 So they did, and then someone told them to do something that got them shut down. Or did they just shut them down knowing what would happen if someone with bad intentions got ahold of it?
@@tristanfarmer9031
Look, it's more or less like those stories of 'gifted twins' who start developing a language of their own to communicate and end up baffling everyone around them.
But in the case of that experiment with AIs, if they 'deemed' it unnecessary to communicate with their creators, things could escalate very, very badly once they were released for the world at large to tinker with them.
And not necessarily because of someone with bad intentions, because even with these 'behaved' AIs, bad folks have already started running wild with their deepfake projects...
It's kinda ok at giving you a skeleton to start with to save time but if you ask it for something you aren't familiar with, it sucks.
When I started making games I tried using ChatGPT to help me with code, and it only produced either buggy messes that crashed the game or just straight-up hallucinated stuff that wasn't even in the library of the language I was using
Also important to note there is no such thing as AI currently existing today. What we have is machine learning and large language models. Neither can produce anything that isn't already fed to them. All they are doing is putting the Lego set together in increasingly wrong ways.
AI would be able to add its own ideas without input. We do not have AI. Nothing these LLMs and ML models do can be trusted or considered good.
This short always pops up in my head when I hear people talking about AI. Thanks Thor ❤
Thor we need to see you without the glasses.