I'm in my third year of teaching post-graduate school (29 years old so fairly young for college profs) and am teaching at my undergrad school as an adjunct in their music dept. I still allow my students to bring laptops to class and my tests are take-home open book tests. If I see a student in class doing something on their laptop that isn't class related, they get 1 warning. If I see it again, I tell them to leave and come back next time since they obviously aren't paying attention to the lecture. I've been very open with the chair of my dept (someone who has been a mentor figure for me in anything music/education related since I was a senior in high school) about my thought process on the use of learning AI in classes and how I basically feel bottlenecked because every assignment I come up with can be completed in a couple minutes using ChatGPT. He and I are currently working on designing a couple new versions of current courses that make it harder to pass the class using AI.
You might not believe a guy named Michael Howard is going to have a 40 point triple double tonight, but maybe that's just because the super smart roboman knows more than you.
@@sauceinmyface9302 eh, I don't really agree. People should be expected to write things in person, but requiring pencils specifically seems pointless. That just screws over people who can't write like that, and doesn't actually prevent people from using chatgpt, since they can just generate it and then write it out...
I mean it's also just cracked at teaching. I use it for my Calc II class all the time since, as long as I know most of the stuff from class, any questions I have about specific cases I can just ask it instead of my GSI who does not want to be there. I can work on a problem set for 2 hours, then throw a screenshot of it into ChatGPT and see if I got all the answers right, and if I disagree with a problem I can ask it to elaborate on its step by step process to learn. I figured if it got anything wrong I'd be able to tell since its explanations wouldn't make sense, but so far it genuinely hasn't gotten a single thing I've asked it about Calc II wrong yet! It's kinda crazy. im thankful im very motivated to learn Calc II cause it's cool, cause otherwise it would be kinda discouraging knowing this word-predictor can do every problem better than i can lol
@@lostinthesauce3986 it is really good, if you know what to ask, how to ask, and how you best learn. if you don't, it will just be used to teach and all the knowledge will be lost, my young son
Most of the time the explanations it gives are much better than the instructor's. Teaching is an art, and most instructors are not very good at it. With AI I can ask as many follow up questions as I want and it won't ever get annoyed.
You're mostly right about this, but I want to add a caveat: Chat GPT is good at teaching you things that are well-documented on the internet. There's a lot of Calc I, II, and even III material out there, so Chat GPT, which is really just a fancy algorithm to predict the next word, can give you a pretty good summary with its neural network. It has a large network to pull from, after all, since it has read through so much stuff on that topic. However, it is very easy to identify weaknesses in Chat GPT when it comes to "niche" or undocumented material. For example, I gave it a super basic question about Assembly (low level programming language that is very old with relatively small use cases) and it was waaaaay off in its answer. Why? Because its neural network didn't have a lot of material to pull from. It can help me a lot with stuff like C and Java, which are more modern and well documented, but ultimately it comes down to using the right tool for the job. Anyway, just wanted to give my two cents.
@ most teachers do not get annoyed. and if they do, talk to a counselor, or talk to a tutor somewhere else. i promise you, chat gpt is good, but it got stuck on such simple calc 1 answers when i tried it before
Growing up, google never felt like enough. Like when something is on the tip of your tongue or is just a vague idea, you can explain it and ChatGPT tells you what it is. It's amazing
im in university and my courses are starting to treat students’ use of Gen AI (mostly ChatGPT) like asking a classmate for help. sometimes itll be right, sometimes itll be wrong. when its wrong, you can tell it why and try to go from there. when its right, you can have it explain to you. there will also be times when you dont know if its right, though, and its just confidently answering wrong. literally like a classmate. principally it makes sense bc ChatGPT basically just outputs what it thinks sounds correct, and i guess so do people in a way when learning things.
Not necessarily daily. But some questions you can just simply not Google. Ones you have to describe because there isn't really a word for what you are looking for.
yeah, other than helping with coding, the main thing I use it for is helping me find words I am forgetting while writing lol. I still stand by that chatgpt is nowhere near as good a writer as most people, but.
This is one of the best uses for AI imo and no one seems to talk about it. Also talking through things, like trying to figure out if there may be anything to worry about with your pet (not as a serious diagnosis, but in the same preliminary way as you would normally do with google, like if they threw up twice in a row or something). It's very useful for being able to navigate that type of info. Google just gives you the same generic slop that doesn't apply to your situation, across a million listicles, whereas you can actually discuss and progress the state of the inquiry smoothly with AI
@@cbot9302 ChatGPT is and will always be very cliché when writing. It always gives the most common counterpoints when asked for arguments, etc. It's just how AI works. But if you're looking only for that - it's perfectly awesome as a starting point.
I mean, you CAN. That simply requires the very important skills needed to conduct basic research, use search engines and evaluate sources. Core components of media literacy taught in high school. The very scenario you describe is where AI performs the worst in terms of reliability
As a student that holds to academic integrity, it’s pretty frustrating seeing the majority of my peers use ai to do their work for them. Like you pay thousands of dollars to come here and you don’t learn anything. They’ll do the assignments in a tenth of the time it takes me and get the same grade or better, and then they get a D on the exam and tell me how smart I am when I pass tests with no problems. Like no dude, I’m just up studying and doing homework/school literally all day from when I wake up till midnight just so I can keep up with my classes. And then they ask me why I don’t have free time, like what are we saying bro? What worries me is how many classes shift to online exams because now there isn’t even a precedent where it’s hard to cheat and you actually need to know what’s going on. Schools are making it easier for people to cheat, and now what, people in my engineering major are going to be getting jobs and not know anything from their degree. Don’t tell me about how hard your life is when you just cheat on everything bro.
After an exam for a course that doesn't have an immediate follow-up (like analogue circuits 1 to 2), do you actually remember that course from teaching yourself and sticking to academic honesty? Do you have any hobbies outside of uni? Do you have any friends outside of class?
Yes, I can remember a lot of what was in the classes my first couple years here. I participate in clubs as well, and hang out with friends there, but this semester I legitimately haven’t had time to hang out with friends outside of clubs/school sports games.
In regard to programmers, we already had the issue that there are many people who can only write scripts and understand the syntax a bit but do not really understand what they're programming; it's gonna become worse now with GPT
I'm glad I graduated college when I did. My senior yr chatgpt was starting to be well known. But it was at the point where it wasn't that great and often gave wrong answers, and in my senior yr in engineering our classes were pretty advanced, so chatgpt would give clearly wrong answers and everyone just asked it stupid stuff to see what it would do. Never used it for homework.
As a college student myself, I often find myself using chatgpt not to write my papers for me, but to help with the wording or phrasing of things I want to say. It just saves me hours of rereading the same few sentences over and over until I get the words right in my head. Or to fluff up the paper with higher level vocabulary. Once I have something started with a strong voice in it, it helps me maintain that same tone throughout the paper. I just think the biggest thing with AI is understanding how to use it right now. It's not meant to do the work for you, but it can help support smaller tasks.
I think this is a pretty reasonable use case. I can struggle with writing in similar ways, so I understand where you're coming from. For myself, I describe it as sometimes my brain struggles to get into "writing mode" (especially for topics that are not interesting to me), but it usually goes pretty smoothly once I get into the flow. However, I also have a general disdain for current AI. Not the concept of AI and neural nets in general, but how it has been shoved into everything and the way that we are using this technology as a society. At the moment, it just seems so wasteful as it requires sooooo much electricity for such little benefit. I also doubt that I'd be able to shortcut my brain getting into "writing mode". With that being said, if it works for you then that's great. I'm not trying to criticize your use case at all.
Same. I know the course material, study, take notes, participate in online discussions, what have you. When it comes to essays, however, my writing style sucks. It doesn't sound like a college student wrote it. I use chatgpt to pull up "example" essays (with explanations on how to write an engaging hook, formulate a strong thesis, etc) so I can feel more confident in my ability to churn out an essay, and it has been working alright for me so far. Obviously the ultimate goal is to not need help with writing essays, but I'm getting there slowly but surely
I use it regularly to 'search' for things that nobody is writing about, or at least nobody who is SEO abusing enough for Google. You have to fact check it religiously, but it will give you AN answer to start you in the right direction. Today I asked what the smallest modern V6 was. The first answer was wrong (Nissan 200SX 1.6L) but its second answer (Lexus IS250 2.5L) was a good start even if I'm not sure it's technically the smallest.
That's an interesting example because although I'm not a car guy, this does seem like a cool fact and so I gave it a quick search: it seems as though the IS 250 was discontinued in 2015 (I don't see it on Lexus' current website), and so no current Lexus model uses that 2.5L engine. You're right that chatgpt is great for getting a start on research, but from what I found it does seem as though the smallest V6 engine currently in production is 3.5L from a couple different manufacturers, including Lexus with the IS 350
That’s why you tell it to search online and it searches google for the answer and gives you sources, this is what I got: “The smallest modern production V6 engine is the Mitsubishi 6A10, a 1.6-liter (1,597 cc) engine introduced in the early 1990s. This compact V6 was available in models like the Mitsubishi Mirage and Lancer, delivering approximately 140 horsepower at 7,000 rpm.  Another notably small V6 is Mazda’s 1.8-liter (1,845 cc) K8 engine, used in the Mazda MX-3. This engine produced around 130 horsepower and featured Mazda’s Variable Resonance Induction System (VRIS) to optimize performance across different RPM ranges.  These engines were developed to offer the smoothness and performance of a V6 while adhering to tax regulations in certain markets that favored engines under 2.0 liters.” And then I asked for the newest: “As of November 2024, the smallest V6 engines currently in production are: • Ford’s 2.7-liter EcoBoost V6: This twin-turbocharged engine delivers 315 horsepower and 400 lb-ft of torque. It’s available in models like the 2024 Ford Ranger.  • Volkswagen Group’s 2.9-liter EA839 V6: This twin-turbocharged engine produces up to 331 kW (444 hp) and 600 Nm (443 lb-ft) of torque. It’s used in high-performance models such as the Audi RS4 and RS5.  These engines exemplify modern engineering’s ability to balance performance and efficiency in smaller-displacement V6 configurations.”
@@ericolivier1271 well ChatGPT is still my goat, when I asked it to search google for the newest smallest v6 engine, it gave me a 2.7 litre Ford and 2.9 litre Audi engine, both are pretty hard to find on a google search by yourself, evidently as you too said 3.5 is the smallest you can find
Edit - It looks like a lot of people are literally reading the calc hater comment and replying. I'm not anti-calculator because I think everything should be done in your head. At least read everything first - As someone who was in high school / college when AI became popular, and a previous calculator hater because my friends who were bad at math never improved since they could just use a TI-83 to solve any problem, we are truly cooked. Not even about "this isn't fair" whiny business, but people are genuinely becoming dumber. Tutoring people in high school who don't understand basic math is jarring.
I think an important part of the 'at what point is it acceptable to have something else do the work for you' situation is that you can only learn so many things. You do not have infinite time to learn everything. I'd love to learn spanish. But I'm also working 2 jobs right now and don't even have the energy to clean my room more than once a week. I am a computer wiz and can do great mental math, but if you ask me how to cook a meal, I'd have nothing to offer you. I want to learn to cook, but I'm busy learning to draw, as that's the skill I'm currently focused on. My point is, getting upset at people for relying on calculators is crazy. They're everywhere, and are not hard to use, and give reliable answers. I'm sure you have areas in life where you lack knowledge, and I doubt that you would appreciate someone acting like your lack of knowledge in that field is shameful. And don't give me the business about how math is a core skill. It is, but so are tons of other things. Something being a core skill depends on what you do in life. If your life path doesn't involve needing to know spanish, knowing spanish is not a core skill. If you always have a machine in your pocket that perfectly solves every math problem you encounter and is easy to use, then, well, would you call mental math a core skill? Probably not. AI changes the situation, as AI is actually pretty inconsistent, so using it to cover your weaknesses is a gamble. For some people that risk/reward dynamic is preferable, and for some people, it's not. You can't really generalize it to 'nobody should use this thing'. There are situations where you shouldn't use it, absolutely, like an electrician needing to ask chatgpt about everything over and over instead of properly learning the job. But that's only one situation out of millions.
Dang, it's new to me that TI-84s could solve for x, aka 90% of college math classes. More seriously, any half decent math teacher will design the problems around the calculator you are allowed to use in the class
@@godlyvex5543 relying on calculators for basic math is crazy though. Learning your times tables in elementary school but needing a calculator for them by high school is bad
I'm a TA for some upper level/grad CS courses at my college and the difference between students pre and post chat gpt is dramatic. I sometimes have to explain things that they should have learned in the first semester despite them being juniors and seniors. Everyone's becoming super reliant on chat gpt, skipping all the fundamentals and then failing when things begin to get challenging. Some professors have just stopped giving out homework and only give out tests and quizzes to counter this.
The point about using AI to detect if something is produced by an AI is actually at the heart of the previous generation of generative AI. There was a whole area of study on training two competing AIs: one to create content and another to discern whether content was real or generated, using the feedback from the discriminator to train the generator. They're called GANs and they're already mostly out of date. The point I want to make here is that trying to create an AI that detects AI slop isn't going to fix anything. After all, that's almost definitely how the system that false-flagged those nine human essays was made anyway.
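(For anyone curious what that generator-vs-discriminator feedback loop actually looks like, here's a minimal sketch of a GAN training loop. The use of PyTorch, the layer sizes, learning rates and the toy "real" data are all my own assumptions for illustration, not anything from the comment above.)

```python
# Minimal GAN sketch: a generator learns to fool a discriminator,
# and the discriminator's feedback is what trains the generator.
import torch
import torch.nn as nn

noise_dim, data_dim = 16, 2

# Generator: maps random noise to fake "data" points.
G = nn.Sequential(nn.Linear(noise_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
# Discriminator: scores how likely a point is to be real (1) vs generated (0).
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(1000):
    real = torch.randn(64, data_dim) * 0.5 + 2.0   # toy "real" data cluster
    fake = G(torch.randn(64, noise_dim))           # generated samples

    # 1) Train the discriminator to tell real from fake.
    d_loss = loss_fn(D(real), torch.ones(64, 1)) + \
             loss_fn(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator (the feedback loop).
    g_loss = loss_fn(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```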
While ChatGPT on assignments is clearly an issue I do think it reveals the inherent flaws of this kind of assessment in the first place. I’d argue that exams especially push you to not care about the material content and what it means, but to remember the right things making it only half a step up from the 0 thought of ChatGPT. It still helps develop general essay writing and thinking skills but we end up completely forgetting a lot of what we write essays about anyway which is why students often feel like they’re meaningless and don’t care to put thought into them.
yeah school was so awful for me, I'd say up until 5th grade I was learning pretty important stuff, but as time went on, the topics became less and less relevant to my life. Science was important to learn the basics of reality, but learning about how DNA works? For 6 months? Wouldn't that time be better dedicated to like, teaching me how taxes work? I don't need to know about the 4 building blocks of dna, their names, and how dna replicates... it feels like they just arbitrarily decided to have those 5 main subjects just extend throughout the rest of school instead of cutting them off once people understood the important stuff. Most of the things I learned after 8th grade in math was not relevant to my life. History is an endlessly explorable topic, so you have to draw the line somewhere, and yet they decided to just keep teaching me about increasingly obscure and irrelevant things until the end of school. English/writing is one of the few subjects that I think deserves to continue forever, but my issue is how it keeps rehashing the same few ideas repeatedly. I swear, I was taught more about analyzing fiction than I was about interpreting anything relevant to my life. Why, in the last years of high school, was I still writing short stories, and not learning how to write correspondence and be convincing to people? Why wasn't there more focus on TALKING to people? Phone call etiquette? How are these extremely important and relevant skills left untouched while we get assigned another few essays about some random thing nobody cares about? I dropped out of school around when I became a senior, since I had been skipping school for large chunks of time. Occasionally I came back for big tests and aced them despite not being there for the classes. I took my GED and got amazing scores, and to be honest the test was just so easy it could be completed by an 8th grader. Not because it's bad or anything, but because it focused the main 4 topics to things relevant to real life. Turns out, when you arbitrarily limit education to 4 topics, there's only so much mileage you can get out of those while still staying relevant to life.
@@godlyvex5543 this argument is so washed. taxes aren't that complicated, you figure them out after like two years of having to file them, and on the other hand if you didn't learn about dna in school for 6 months there's unironically a good chance you'd be an antivaxxer right now
It’s always been the case in my classes that there are people who study for the exam and people who study to learn. That’s a personal choice you make. There’s no shift in responsibility there to society, or the structure of classes, or anything external to you. No matter how you restructure the course you can’t make somebody care about something. If you cared about learning the material, exams would not be an issue for you. Essay writing specifically is about repetitions. The more you do it about a diversity of topics the better of a writer you become. The individual snapshot of writing this paper at this time doesn’t matter long term for your life. The cumulative writing experience you’ve gained from your education does.
@@Exisist5151 right, and I understand that, but my point is that exams specifically are not a great way to test intelligence or even long term knowledge. With the status quo you only have to remember the information for the specific period, and the information that you need to know is hyperspecific and often feels like it's learnt only for the sake of learning. During school I thought I didn't like learning because I struggled to attach myself to what felt like meaningless topics, but as I've gotten older (still young) I have realised that I love learning and have an innate curiosity that I satisfy through YouTube and university, which is much more related to the topics that I find interesting. You are right that you can't force someone to be interested, but the school system does not do a very good job at even trying to make learning engaging. Individual teachers try really hard I'm sure, but the ones I had often didn't and treated it like reading a textbook to the class.
i've been using it as a sort of teacher/mentor/helper. I'd never had to replace the flapper in the back of a toilet, so i pointed my camera at it, uploaded the pic, asked what it is, then where i can buy it, and then bought it and fixed it myself. Fixed the leak in the toilet. I never had anyone teach me these things or show me. The second thing i've been using it for is just general questions and some clarification, because I have ADHD and i just have trouble digesting information. Being able to take something that's sometimes deliberately written in a confusing way and slapping it into chatgpt to say "hey can you explain this as if i was 5?" has made my life easier. It's been one of the most life changing things next to the internet for me, big A.
I said this a while ago. The solution is simple, oral exams. But that would require *gasp* teachers to ditch testing methods that scale well from a grading perspective.
Oh that explains why all of my classes are flipped classrooms, which I don’t learn shit in because I’m not good at teaching myself. I do so well in normal classes, I miss them :(
@ interesting, my understanding of flipped classrooms was that students would get more hands on time being taught by teachers rather than using the classroom time sitting in silence watching lectures… but i can imagine a lot of classes just use it as an excuse to be lazy and cut staff
@@hjewkes Usually classes just hand out worksheets, I have no idea what the right answers are, the teacher grades on completion so I never end up actually learning the topic in depth because there’s no lecture.
I was watching a reaction youtuber and i stg he would ask chatgpt “hey what did the creator of the video mean when they made this” EVERY TIME IT WAS SO WEIRD
@@JaMaMaa1 i was watching the new tyler album reactions and randomly found them. i stopped watching cause if i watch a reaction channel i want people to have their own insights, not asking chatgpt
It's interesting they're using AI interviewers. I'm normally alone at my shop, but today the manager of a different department who mostly works from home came in to do phone interviews all day long. It looked like a massive time investment. Luckily my company isn't doing that yet. I learned a lot listening to his questions and got to get some really good career advice from someone who hires similar positions to what I want.
A friend of mine at uni doesn't even go to lectures anymore. He just copy and pastes the lecture slides from our classes into gpt and asks it to explain it
@@Asternius its not like i can pay attention to a 2 hour lecture anyway. shit would be useful to get the cliff notes as well, anything at this point is good for me
I think AI and Chat GPT really can help you understand things. Bc in the end you have to write an exam, and if you only use AI without trying to understand anything, you won't get far. And it really helps me to understand things. And find patterns in some topics and connections between topics.
Takes like two hours for the kid who messed with Runescape bots to make something that turns a text into key presses with short varying length waits between them and a couple of longer pauses, and share it with the class.
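(Something like that Runescape-bot-style auto-typer really is only a handful of lines. A rough sketch, assuming the pyautogui library handles the key presses; the delays, function name and sample text are placeholders of mine, not from the comment:)

```python
# Turn a block of text into "human-looking" key presses:
# short, varying waits between keys plus occasional longer pauses.
import random
import time

import pyautogui  # assumed input-automation library; any equivalent works


def type_like_a_human(text: str) -> None:
    time.sleep(5)  # time to click into the target text box first
    for ch in text:
        pyautogui.write(ch)                       # press the key for this character
        time.sleep(random.uniform(0.05, 0.25))    # short, varying wait between keys
        if random.random() < 0.02:
            time.sleep(random.uniform(1.0, 3.0))  # occasional longer "thinking" pause


type_like_a_human("Paste the essay text here...")
```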
I spoke with one of my college professors who works a lot with admin. He basically said that they are moving to completely change essays and assignments so that it doesn’t matter if you use AI to help. All traditional homework problems are now only done during proctored tests.
All homework problems are done via proctored tests? That’s not how homework works nor can it be enforced 😂 I’d love to see a school try and force surveillance while you’re doing homework.
@ no, students are allowed to use AI on assigned homework. The homework is either not graded the same or is designed so that using AI doesn't really give an unfair advantage. The old traditional homework problems that AI can do on its own are used in in-class quizzes and tests, which are proctored.
People acting all surprised that AI can output work at levels equal to or greater than a recent college grad... Then also acting surprised that that's 99% of the people that use it
I'm in college and used chatgpt to generate ideas for formatting the conclusion of a scientific research paper. I did not copy any of the chatgpt work, just used it to inspire me, but it freaked me out because the way it wrote sounded exactly how I write papers. I'm terrified I'll get accused of using chatgpt eventually even though I've almost never used the thing
I’m a high school history teacher. I really wanted to do lots of research papers, but chatgpt has made any work that is not handwritten impossible. All writing work has to happen in class, and even then kids will use chatgpt quickly in the bathroom and do their best to copy it to their work. It makes genuinely teaching writing impossible
I use it often to check my understanding of concepts from my classes. The notes can be vague sometimes and if I can't get a good googlable result it helps me conceptualize. Keep in mind this is just because I don't have friends in my classes.
5:50 easiest way to game the AIs if they are based on chatGPT is to just say statements instead of questions, basically leave no room for implication.
Dude chat gpt has helped me like crazy. Sometimes to just double check, or using it as a backboard to bounce ideas off. Expand ideas. I don't know if I would use it for finance lol. But using it to start off ideas.
I feel like it's useful for when you put in information, like giving it a context and stuff because of its historical knowledge. But not for just straight up factual information that can change over time
My dad uses it. He once tried to find a picture of his graduating class with it. He was so amazed it could find it and showed the whole family. Finally I pointed out to him that it was just a generated picture and he was really sad. He still uses it though
Debate me. I am an engineering student using AI when allowed. I have had a few teachers not prohibit AI because it "won't help", but I find workflows that almost trivialize the work for me. I find at worst the practice is benign, and at best it gives me a severe leg up over my non AI-doped peers. Having an intern in my pocket to do the low level things at an adequate level, whose work I need to verify and tidy, lets me focus on the high level goal of understanding and applying material as best as possible. There are AI doomers that trust it fully, which I can't agree with. I put genuine rigor into learning the tool and experimenting to find its biggest strengths and applications, and feel enriched while using it at arm's length. The only fear I have is that OpenAI turns to ClosedAI and I lose access to the software that I have partially built my understanding around. I would pay serious bucks to acquire an NVIDIA chip and own my AI if I had full access to the model.
I'll just become a plumber or some shit. On a real note though: I am a teacher myself, but I teach in an elementary school, and I don't give out written assignments at all. And on an extremely rare occasion that I do, it's either during the class, or I just ask a bunch of questions afterwards to make sure they didn't just AI it. Though I doubt they would anyway, since I teach English as a foreign language, so they wouldn't even know what prompt to write in to get an essay back. Not to mention I haven't heard them talk about AI at all, so they might not even know that's a thing they can do, hopefully. That being said, if I was teaching in a high-school, I'd probably just... quit and wouldn't. This seems like an absolute nightmare. Like, if I was a student now, I'd probably AI this shit, but at least I'd change some stuff, you know? It seems crazy to me that tech literacy is going down instead of up since tech is literally everywhere now. Even your fucking fridge can have Wi-Fi these days, man.
As a recruiter, AI interviews won't ever be fully implemented unless labor laws drastically change. There's so many ethical violations just from the example you showed. While it may seem like she's just cutting him off, an AI not being able to recognize different accents and not allowing someone to continue speaking is technically a form of discrimination (in terms of labor law). Its little intricacies like that could end up putting companies in a lot of hot water. They'll probably use it until they get sued. But who knows, I could be wrong. I just don't see it working with our current laws.
as a student i hate the ai cheating. one of my profs is giving impossible quizzes now and those of us who refuse to cheat are taking the hit. I bet there'd be a perfect bimodal distribution if you graphed the grades.
I agree that calculators didn't replace learning, but when I was in school Wolfram Alpha definitely did for math problems. Some people would only do their homework by themselves because they knew they'd be fucked in exams if they didn't. Maybe paper-writing goes the same way, where you just have in-person handwritten papers that take much more emphasis than the homework. I'd done most of my schooling in an Indian system, and our grades each term were entirely based on exams we'd write for each subject twice a term anyway
engineering uni freshman here, everyone I know hits their "take a picture" limit so much they've created alt accounts, AI has atp overtaken YouTube tutorials (at least for the simpler stuff)
When i was getting my IT diploma, AI was on the come up. It got used a lot, but it doesn't help as much in tasks that actually need to be performed outside of theory. It was also accepted for use in exams, since my school held a policy of learning the way you'd work in the workforce. Basically you had any tool and even the internet at your disposal on most tests. So you could quite nicely use it to sum up key points, autocorrect and write some simple scripts. I still did all the rest of the work though. When used like that it's actually a nice tool.
Had a TA lead a discussion with some example problems. He chat gpt'd the answer and literally the first step tried to solve a series resistance with a parallel resistor equation. Everyone was stupefied over how the TA didn't notice the most obvious thing taught in high school physics. When we asked him to go through it himself he just straight up said: "no" 😂😂
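(For reference, that mix-up is the kind of thing a one-line sanity check catches: series resistances add, parallel ones combine by reciprocals. A tiny sketch with made-up resistor values:)

```python
# Series vs. parallel resistance (values are made up for illustration).
r1, r2 = 100.0, 220.0  # ohms

r_series = r1 + r2              # series: resistances add -> 320.0 ohms (larger than either)
r_parallel = 1 / (1/r1 + 1/r2)  # parallel: reciprocals add -> ~68.75 ohms (smaller than either)

print(r_series, r_parallel)  # mixing the two formulas up changes the answer drastically
```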
I'm a CS student, and at my university they allow the use of AI and just have people present their work more often so they can see that people understand what they wrote
In my CS department they actually encourage us to use it. Not as a replacement for actual understanding of course; we mostly use it as a handy tool to quickly search and summarize documentation
The only thing I've used AI for to any significant degree is generating atmospheric landscape images for the D&D game I'm running, which I would have absolutely zero hope of drawing - but I can describe the details I want accurately, and iterate on whichever images match what I'm looking for most until I get pretty close to what I saw initially in my mind's eye.
I use chat gpt all the time in uni. I've even used it for the abstract in my papers. Our course responsible/teachers sometimes allow it as long as stuff is referenced correctly. I.e. basically the only thing you can use it for is the abstract. Not that chat could write our papers either way though. It doesn't even know how to reason about polarity. It's also a great way to ask about a concept you have a hard time wrapping your head around. Like a second teacher. I had a really hard time understanding how a membrane selective potentiometer worked fundamentally. But after tinkering with the prompts I got a corrected response I understood. Unlike what the teacher, Google and the textbook said.
Chatgpt is great for stuff I'm working on. It helps me get together books and research papers. It can help me write out study plans for subjects or take notes etc. It doesn't understand how to do optical math or really any math in general, but for study and work it's great. It also helps me generate concepts for python scripts I'm writing at work for automation; that only takes a few secs and works well for simple stuff.
I use AI almost every day for cooking. Asking how long and at what temp to cook stuff… What to do with leftovers… what flavors go well with others… It’s pretty helpful
There are some big exceptions for student AI cheating at the university level, which are many humanities classes. I'm a philosophy grad student at an ivy who teaches at other colleges on the side, and many of my colleagues and I have found pretty solid ways to detect and pre-empt cheating. Most philosophy content is VERY poorly "understood" by GPT. There's a set of readings and concepts that're extremely common in intro classes, but if you just switch it up to be more niche readings the cheating becomes obvious. For example, I decided to replace half a unit on empiricism as family of theories in philosophy of science with a more niche unit on values in science -- something that is not readily googleable -- and the cheating became instantly obvious. Yeah, I had 15 cheaters in a class of 30, but none substantively claimed that they didn't cheat. I've seen colleagues in psych, sociology, cultural studies, and anthro do the same thing with fairly good results. It might be that non-intro humanities classes will become the hardest classes to cheat in within a few years.
My wife works at a high school and all of the kids' computers are integrated with Grammarly, which has an AI function the kids are encouraged to use. A lot of them read/write at a 1st-3rd grade level without AI and it's the only way they will graduate
While it definitely does just "guess" the next word as an output, it does it prodigiously well, and has a lootttt of info to back it up. You NEED to be careful when using it; being aware of when its training data ends (making real time bets usually slop guesses) and realizing it will hardly ever understand when to say "I don't know" is so important. But it's absolutely more than a word guesser. Especially specialized "AI" used to do specific tasks like hybrid ray tracing or robotic movement training akin to inverse kinematics.
I work tutoring one-to-one, and I can see individual interviews / oral testing coming back in a big way. It's meaningful for the teacher, even if it's harder than lecturing to a whole class at once, and it's already just clear as day what the student knows and what they are trying to 'guess' is the answer you want to hear, so someone quoting AI is just not gonna slip under the radar in any context.
I actually gave a seminar lecture on the advancements of technology in education in 2021, and I explicitly mentioned how AI was going to be a major game changer in the near future, and educators would not be prepared for it. Then a year later ChatGPT was a thing
I use chatgpt to ask questions that usually take multiple google searches, like the comparison between two certain motorcycles. It's just more convenient.
My stepmother is a lecturer at an Australian university and they just design the entire course around testing the students' ability with the assumption that they are AI assisted.
I've never used ChatGPT a single time. I honestly don't think that I ever will, and I feel like maybe I won't be better off for it. but philosophically this is what I want. its ubiquity and how much it's become relied on by students, as well as other AI, really makes me trepidatious and actually scared of the future in ways that I never thought I would be.
I have used it; as it stands it's 10 times worse at getting you relevant and accurate information than Google is, and Google is fucking dogshit compared to what it used to be. You're not gonna be losing out at all for a long while still
Thank God someone else is mentioning women using character AI for AI boyfriends. Everyone jokes about AI girlfriends, but women are going crazy with character AI
ive found predictive text programs super helpful specifically for translating historical documents, where i might be able to read 4 letters out of a word but the 5th is illegible. works great for that. i would not trust a single thing it spits out regarding actual history though. i tried it a few times and it's just wrong over and over.
I found out that I could use chat gpt to find links for my communication assignment, asked it for links for my arguments. Also I could see chat gpt being the new search engine
I am divided on this topic: on one hand I feel as if AI is a tool that SHOULD be used, on the other hand it's highly dangerous if it replaces actual learning. During my last year of uni, I used GPT on multiple math questions and equations as a way to understand the process that goes into solving an equation. The current chatgpt will not replace learning, as it usually gets the correct answers, but it rarely uses the easiest formulas. And I think that will be what differentiates the "A" students from the "C" students. During an "offline" math exam, the students that are copy-pasting from GPT will struggle to finish in time. When it comes to essays, it is and will be like a war. Sort of like between YouTube ads and adblockers. Patches will come and go, but it will continue to evolve. I don't think there is a solution, except for mandated in-house writing on individual computers or allowing some sort of control over your device.
I feel like there might be a return of more in-person evaluation and examination overall, which could be a positive thing. Homework is already proven to not be helpful in many cases in developing student skills and focus, so just get rid of it and do in-person examination more. More or the same time in school, no homework, in-class writing, done. Good luck for university though, but a similar process could be applied for examination. For jobs though there might be some need for regulation, like a centralised network where your applications are tracked and your qualifications are automatically filled in when applying (I know I am describing linkedin, but I mean it as even more central to the process)
When presented with having to write an essay on two vacation locations vs each other, pull out that damn AI and spend your time on something meaningful instead. I support THAT. More interesting and informative topics, then no AI.
In some of my classes in hs we only wrote on paper / safe exam browser to train ourselves on reasoning from only stuff we know. I reckon something like that could be done, but with a textbook, to avoid AI but still do writing.
I wanted to say that as a university student, the way teachers are fighting back against AI is removing written assignments as much as possible and making classes more exam based. My business law class is 100% weighted by exams.
So i could take a test interview first, where i score poorly, to write down the questions they're gonna ask me, then prepare in advance for the best interview score ever seen.
I use chatgpt for literally everything. My niche use case is if I can't think of a word or concept; I give it near-concepts of the word and what I think it might start with and it always gets it.
I'm so glad i speedran college and got out before this became such a huge thing. If i had been in for the extra two years a lot of it would have sucked.
My English teacher uses the Google Docs feature to see document history and uses that to determine if something was written by AI. for example, she knows if it’s written by AI if large portions of text are clearly copy and pasted into the document. I think this feature can also be used to prove that you wrote each word yourself and didn’t use AI however people might just type it out and fake errors so idk.
Being a tech-literate young adult in this time is truly a sobering experience, like what do you mean you're relying on autocorrect+ to make bets
for sure, I don't really talk to anyone about tech irl, but whenever it does come up it does feel like people treat it like magic, maybe that's just natural
Autocorrect+ lmao (to be clear I agree, just funny)
In fairness it's so much more helpful than Google. I forgot a keyboard shortcut in After Effects the other week. Google gave me nothing but 9 minute tutorials that MAYBE contained what I was after. I asked ChatGPT out of curiosity and it got it spot on in 3 sentences.
@@Purriah This is such an unneeded comment
How so
What really worries me is the number of engineering students who can't function without it. I'm doing a robotics capstone and another one of my teammates can't understand a single thing I do without asking chatgpt to "translate", and it often doesn't fully get it right. It's really frustrating not being able to communicate about the work I'm doing with him; he's not even willing to try and digest it himself.
Woooah, I thought 2020 was a bad year to graduate, but maybe not so bad since at least I had to learn the material myself.
Speaking as someone who entered the industry a few years ago, this has always been the case. Truthfully, the way that engineering material and concepts are presented to students is often unhelpful, outdated, or irrelevant when applied to the real world, and although it helps to be able to approach scientific concepts and mathematical principles with confidence unassisted, in my eyes there's not much difference between using chatgpt and google for research on a given subject. The main reason why I personally don't rely on chatgpt for this purpose is because currently, I don't feel it's a more reliable tool for accurately explaining higher level concepts than say wikipedia, or engineerstoolkit.
😂😭
That's bad. I graduated just in time (mechanical engineering), when chatgpt was still too dumb to do advanced math questions, so no one really used it. Hard to see new engineering grads in the next few years being able to get a career started when you can't comprehend the basics. The only good part is it will make it easier for those of us who actually worked for our degree to find jobs 😂.
3rd year ECE student and yeah it’s crazy 😭
As a teacher, it's definitely difficult. The key is that our education system is built on testing for results and end products, whereas now we have to start finding ways to assess process and continual progression. It's really hard to do, but will ultimately make us better educators.
This guy gets it
This is the unintended silver lining to the whole thing. I sincerely hope they can't figure out how to stop kids using ai. The way forward is teaching kids to learn not teaching for exams.
@@cuineform3708 Figure out education reform? Maybe after a generation or so of kids dumbed down by ai. I hope they can figure it out too, but they're already failing the kids with reading as it is.
Common Core and its consequences have been a disaster for the human race.
The problem with that is it’s going to turn everything into math class, where you have to get to the answer in a specific way or it’s wrong even if the answer is correct.
the AI recruiter makes me so glad my workplace is heavily unionized and has a history of strikes when management tries to overstep its bounds
Scabs go home!
The recruiter issue would only apply to recruits, i.e., people who don't have union protections yet
IMO, an AI recruiter would probably do a better job of giving a full and thorough interview than a human. Can't be tired or drunk or in a bad mood. Can't be concerned with getting through X interviews in a day, etc.
@@RachelTension it's more about not having to deal with doing these interviews and the protection of human roles in my current industry. I did work as a recruiter and we were unionized, it's just that we were based in Aus
@@XetXetable The problem is, the AI recruiter's scoring can easily have biases nobody notices. Could be something innocuous like certain types of microphones or webcam angles giving slightly higher scores, or it could be that it gives lower scores to black people or anyone with a Muslim name. Any biases will be hard to prove, and the recruiter using the tool is likely to think a computer is "guaranteed to be neutral" because of how it's marketed.
yesterday I saw an ad for apple intelligence where some moron manager sent out a delegation email and had it re-written to sound more corporate and polite. I cannot begin to describe the levels of anger I felt at the idea of the person making more money than me, while making me do all of the work, being an incompetent baboon that just sits around sending out 10 AI emails a day and sending me spiraling. It's not who can do the job, it's who gets hired first
I saw that ad too and honestly I don’t think it’s gonna win apple any favours either
That baboon has parents and grandparents who were in much more respectable positions in society than yours, and Apple is here to make sure traditions are continued.
as someone in a managerial role... i hate when my E-Mails sound super corporate... to be fair half of our Slack channels are meme channels where i post daily... hopefully my employees appreciate me ending my E-Mails with emojis, because in my head every message that doesn't end with an emoji sounds super cold and stiff
My favorite is the guy who procrastinated his presentation so he had AI churn out a powerpoint in a few minutes. My question would just be why do we even need the dude at that point? Have AI just churn out the powerpoint and throw it up on the screen with an AI voice summarizing it. Why do we need the doofus standing in front of it reading it word for word? People are going to AI themselves out of their own jobs soon
As a young college instructor/grad student who is trying to get a full time professor job it is awful. Since Covid, in every lecture I have been in I see 90% of students with a laptop open, not listening to the professor at all. Then I see that the online homework I assign now is done on average in half the time (some will do a 20 minute exercise in 5 minutes). Then I watched exam averages tank 10+ points lower. I finally decided to ban computers because at least they are then forced to sit in class and kind of listen, which did help their averages, but now my teaching ratings are suffering because students think that rule is unfair. I normally wouldn't care, but these ratings impact my ability to get hired. It's like a lose-lose: at least before, if they cheated on the HW, they could learn some of the methods or fall back on class notes for the exams, but now they aren't even trying on the HW and not listening in class. If I don't do anything they will just do badly on the exams and blame me, but if I do something to help them they blame me anyway.
I'm currently a student (though at a German uni) and they are also struggling with ChatGPT. One of my professors decided that they will offer a "Probeklausur", aka a test exam under exam conditions. Also, all professors explained pretty thoroughly that using AI in any submitted document (even hw assignments) would result in consequences and that by using ChatGPT they are only cheating themselves. Some just straight up state that the hw is voluntary.
Honestly I also don't know how to help you. A test exam without ChatGPT would definitely show the students how unprepared they are, and maybe some discussion about it and what you have observed (like telling them that since ChatGPT got introduced your exam averages have dropped and that by banning computers they have increased) might do some good, but it's difficult to state helpful facts when so little is known
I was on the student end of this exact situation, although pre-ChatGPT. I had a professor dedicate a whole lecture to showing us the data behind why they made the decision and the differences in exam averages. It helped cut down on the hate that was previously directed towards the professor. This was a very technical major where the students were able to understand, so depending on your area of study this could help
@@maxakaphoenix Sorry, but what kind of kindergarten university are you at? Having a bachelor's and 2 master's from the Freie Universität Berlin... no prof would've given 2 fucks if anyone pays attention or given us test exams. This is definitely some Abi (high school) or FH (applied sciences) shit, not university.... there's like 120-300 people in a lecture on average... i promise you the prof does not give 2 fucks, they get paid even if everyone fails
@@LoFiAxolotl Ok, maybe I've painted too rosy a picture. I am at the TU Darmstadt. I am in my 3rd semester and I have participated in 13 modules so far. Of those, 3 profs give candy if you actively participate, regularly ask for feedback, make use of online quizzes, and try to make sure that the majority comes along. 6 don't bother but accept feedback and ask the students if they're going too fast/slow, and 4 straight up don't give two fucks like you described. However, almost every single one had a slide in their presentation about AI (ChatGPT specifically) and how using it in any submission unmarked would result in consequences like a 0 on the assignment (though around 4 invoked the threat of Exmatrikulation, i.e. expulsion, in egregious cases), and those that didn't have it on the slides mentioned this verbally. Furthermore, out of those 13, 3 gave us the opportunity to participate in Probeklausuren (test exams), of which 2 I believe gave a bonus (though the 5 I'm currently in might give us an opportunity later).
In conclusion, I'm sorry if you believe that receiving any support and having professors that care makes a TU a kindergarten university
I'm in my third year of teaching post-graduate school (29 years old, so fairly young for a college prof) and am teaching at my undergrad school as an adjunct in their music dept.
I still allow my students to bring laptops to class and my tests are take-home open book tests. If I see a student in class doing something on their laptop that isn't class related, they get 1 warning. If I see it again, I tell them to leave and come back next time since they obviously aren't paying attention to the lecture.
I've been very open with the chair of my dept (someone who has been a mentor figure for me in anything music/education related since I was a senior in high school) about my thought process on the use of learning AI in classes and how I basically feel bottlenecked because every assignment I come up with can be completed in a couple minutes using ChatGPT.
He and I are currently working on designing a couple new versions of current courses that make it harder to pass the class using AI.
One time I asked Chat GPT for betting advice because I was curious and it deadass named me players that weren’t even on the team.
Yep, they retweet the 'glitch' now, but it won't be long before the 'mad cuz GPT lied' posts come
You might not believe a guy named Michael Howard is going to have a 40 point triple double tonight, but maybe that's just because the super smart roboman knows more than you.
The AI overlord predicted future players and which of them would win.
My high school got rid of all writing assignments on computers and now everything is hand written 😂
thats cruel but also based as fuck
@@sauceinmyface9302 eh, I don't really agree. People should be expected to write things in person, but specifically with pencils seems pointless. That just screws people who can't write like that, and doesn't actually prevent people from using chatgpt, since they can just generate it then write it out...
understandable although my hand hurts reading this
All you have to do is write it out; ChatGPT can still do the solving
@@anvo4806 Your teacher will surely approve when you type in prompts to your laptop during your in person midterm
My favourite guy is the person who tries to hit on the AI recruiter.
I mean it's also just cracked at teaching. I use it for my Calc II class all the time, since as long as I know most of the stuff from class, any questions I have about specific cases I can just ask it instead of my GSI, who does not want to be there. I can work on a problem set for 2 hours, then throw a screenshot of it into ChatGPT and see if I got all the answers right, and if I disagree with a problem I can ask it to elaborate on its step-by-step process to learn. I figured if it got anything wrong I'd be able to tell since its explanations wouldn't make sense, but so far it genuinely hasn't gotten a single thing I've asked it about Calc II wrong yet! It's kinda crazy. I'm thankful I'm very motivated to learn Calc II because it's cool, because otherwise it would be kinda discouraging knowing this word-predictor can do every problem better than I can lol
It really is a godsend for better understanding calculus and where you make errors.
@@lostinthesauce3986 It is really good, if you know what to ask, how to ask, and how you best learn. If you don't, it will just be used to teach and all the knowledge will be lost, my young son
Most of the time the explanations it gives are much better than the instructor's. Teaching is an art, and most instructors are not very good at it. With AI I can ask as many follow up questions as I want and it won't ever get annoyed.
You're mostly right about this, but I want to add a caveat: Chat GPT is good at teaching you things that are well-documented on the internet. There's a lot of Calc I, II, and even III material out there, so Chat GPT, which is really just a fancy algorithm to predict the next word, can give you a pretty good summary with its neural network. It has a large network to pull from, after all, since it has read through so much stuff on that topic. However, it is very easy to identify weaknesses in Chat GPT when it comes to "niche" or undocumented material. For example, I gave it a super basic question about Assembly (low level programming language that is very old with relatively small use cases) and it was waaaaay off in its answer. Why? Because its neural network didn't have a lot of material to pull from. It can help me a lot with stuff like C and Java, which are more modern and well documented, but ultimately it comes down to using the right tool for the job. Anyway, just wanted to give my two cents.
@ Most teachers do not get annoyed. And if they do, talk to a counselor, or talk to a tutor somewhere else. I promise you, ChatGPT is good, but it got stuck on such simple Calc 1 answers when I tried it before
2:25 TEKKEN MENTIONED 🫵🧍♂️🗣
Tengen Fyters
LET'S FUCKING GO!! BIG A YOSHIMITSU ARC INCOMING
DORYA
WE SKEET AGAIN
Marvelous!
Growing up, Google never felt like enough. Like that thing on the tip of your tongue or a vague idea: you can explain it and ChatGPT tells you what it is. It's amazing
im in university and my courses are starting to treat students’ use of Gen AI (mostly ChatGPT) like asking a classmate for help. sometimes itll be right, sometimes itll be wrong. when its wrong, you can tell it why and try to go from there. when its right, you can have it explain to you. there will also be times when you dont know if its right, though, and its just confidently answering wrong. literally like a classmate. principally it makes sense bc ChatGPT basically just outputs what it thinks sounds correct, and i guess so do people in a way when learning things.
I use ChatGPT daily. Mostly for images of Big A as a glizzy.
based
Could also use ChatGPT to write some code to automate the daily process of generating Big A as a glizzy images.
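For anyone who actually wants to try that, here's a rough sketch of what such a daily automation script could look like. The endpoint, key, prompt, and response format are all placeholders (no real image API is assumed), so treat it as an illustration rather than working glue code:

```python
# Hypothetical sketch: ask an image-generation service for the daily image
# and save it to disk. Endpoint, key, and response format are made up.
import datetime
import requests

API_URL = "https://example.com/v1/images"   # placeholder, not a real API
API_KEY = "YOUR_KEY_HERE"

def generate_daily_image() -> str:
    prompt = "Big A as a glizzy, photorealistic"
    resp = requests.post(
        API_URL,
        json={"prompt": prompt},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=60,
    )
    resp.raise_for_status()
    filename = f"big_a_glizzy_{datetime.date.today()}.png"
    with open(filename, "wb") as f:
        f.write(resp.content)   # assuming the service returns raw image bytes
    return filename

if __name__ == "__main__":
    print(generate_daily_image())   # schedule with cron/Task Scheduler to run once a day
```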
Not necessarily daily. But some questions you simply can't Google. Ones you have to describe because there isn't really a word for what you are looking for.
yeah, other than helping with coding, the main thing I use it for is helping me find words I am forgetting while writing lol. I still stand by the view that ChatGPT is nowhere near as good a writer as most people, but.
This is one of the best uses for AI imo and no one seems to talk about it. Also talking through things, like trying to figure out if there may be anything to worry about with your pet (not as a serious diagnosis, but in the same preliminary way you would normally use Google, like if they threw up twice in a row or something).
It's very useful for being able to navigate that type of info. Google just gives you the same generic slop that just doesn't apply to your situation, across a million listicles
Whereas you can actually like discuss and progress the state of the inquiry smoothly with AI
@@cbot9302 ChatGPT is and will always be very cliché when writing. It always gives the most common counterpoints when asked for arguments, etc. It's just how AI works. But if you're looking only for that - it's perfectly awesome as a starting point.
A prime example is a movie or song you can't remember the name of and can only vaguely describe
I mean, you CAN. That simply requires the very important skills needed to conduct basic research, use search engines and evaluate sources
Core components of media literacy taught in high school
The very scenario you describe is where AI performs the worst in terms of reliability
As a student that holds to academic integrity, it’s pretty frustrating seeing the majority of my peers use ai to do their work for them. Like you pay thousands of dollars to come here and you don’t learn anything. They’ll do the assignments in a tenth of the time it takes me and get the same grade or better, and then they get a D on the exam and tell me how smart I am when I pass tests with no problems. Like no dude, I’m just up studying and doing homework/school literally all day from when I wake up till midnight just so I can keep up with my classes. And then they ask me why I don’t have free time, like what are we saying bro? What worries me is how many classes shift to online exams because now there isn’t even a precedent where it’s hard to cheat and you actually need to know what’s going on. Schools are making it easier for people to cheat, and now what, people in my engineering major are going to be getting jobs and not know anything from their degree. Don’t tell me about how hard your life is when you just cheat on everything bro.
After an exam for a course that doesn't have an immediate follow-up (e.g. Analogue Circuits 1 to 2), do you actually remember that course from teaching yourself and sticking to academic honesty? Do you have any hobbies outside of uni? Do you have any friends outside of class?
@@cuineform3708 Doing a class legit has got to be better than depending on AI to do your work and thinking for you for the rest of your life
Yes, I can remember a lot of what was in the classes my first couple years here. I participate in clubs as well, and hang out with friends there, but this semester I legitimately haven’t had time to hang out with friends outside of clubs/school sports games.
He just like me fr
In regard to programmers, we already had the issue that there are many people who can only write scripts and understand the syntax a bit but do not really understand what they're programming; it's gonna become worse now with GPT
I'm glad I graduated college when I did. My senior yr ChatGPT was starting to be well known. But it was at the point where it wasn't that great and often gave wrong answers, and in my senior yr in engineering our classes were pretty advanced, so ChatGPT would give clearly wrong answers and everyone just asked it stupid stuff to see what it would do. Never used it for homework.
As a college student myself, I often find myself using chatgpt not to write my papers for me, but help with the wording or phrasing of things I want to say. It just saves me hours of rereading the same few sentences over and over until I get the words right in my head. Or fluff up the paper with higher level vocabulary. Once I have something started with a strong voice in it, it helps me maintain that same tone throughout the paper. I just think the biggest thing with AI is understanding how to use it right now. It’s not meant to do the work for you, but can help support in smaller tasks.
I think this is a pretty reasonable use case. I can struggle with writing in similar ways, so I understand where you're coming from. For myself, I describe it as sometimes my brain struggles to get into "writing mode" (especially for topics that are not interesting to me), but it usually goes pretty smoothly once I get into the flow.
However, I also have a general disdain for current AI. Not the concept of AI and neural nets in general, but how it has been shoved into everything and the way that we are using this technology as a society. At the moment, it just seems so wasteful as it requires sooooo much electricity for such little benefit. I also doubt that I'd be able to shortcut my brain getting into "writing mode".
With that being said, if it works for you then that's great. I'm not trying to criticize your use case at all.
Same. I know the course material, study, take notes, participate in online discussions, what have you. When it comes to essays, however, my writing style sucks. It doesn't sound like a college student wrote it. I use chatgpt to pull up "example" essays (with explanations on how to write an engaging hook, formulate a strong thesis, etc) so I can feel more confident in my ability to churn out an essay, and it has been working alright for me so far. Obviously the ultimate goal is to not need help with writing essays, but I'm getting there slowly but surely
"i like to notice these things" the noticer has logged on
I use it regularly to 'search' for things that nobody is writing about, or at least nobody who is SEO abusing enough for Google. You have to fact check it religiously, but it will give you AN answer to start you in the right direction.
Today I asked what the smallest modern V6 was. The first answer was wrong (Nissan 200SX 1.6L) but its second answer (Lexus IS250 2.5L) was a good start even if I'm not sure it's technically the smallest.
That's an interesting example because although I'm not a car guy, this does seem like a cool fact and so I gave it a quick search: it seems as though the IS 250 was discontinued in 2015 (I don't see it on Lexus' current website), and so no current Lexus model uses that 2.5L engine. You're right that chatgpt is great for getting a start on research, but from what I found it does seem as though the smallest V6 engine currently in production is 3.5L from a couple different manufacturers, including Lexus with the IS 350
That’s why you tell it to search online and it searches google for the answer and gives you sources, this is what I got:
“The smallest modern production V6 engine is the Mitsubishi 6A10, a 1.6-liter (1,597 cc) engine introduced in the early 1990s. This compact V6 was available in models like the Mitsubishi Mirage and Lancer, delivering approximately 140 horsepower at 7,000 rpm. 
Another notably small V6 is Mazda’s 1.8-liter (1,845 cc) K8 engine, used in the Mazda MX-3. This engine produced around 130 horsepower and featured Mazda’s Variable Resonance Induction System (VRIS) to optimize performance across different RPM ranges. 
These engines were developed to offer the smoothness and performance of a V6 while adhering to tax regulations in certain markets that favored engines under 2.0 liters.”
And then I asked for the newest:
“As of November 2024, the smallest V6 engines currently in production are:
• Ford’s 2.7-liter EcoBoost V6: This twin-turbocharged engine delivers 315 horsepower and 400 lb-ft of torque. It’s available in models like the 2024 Ford Ranger. 
• Volkswagen Group’s 2.9-liter EA839 V6: This twin-turbocharged engine produces up to 331 kW (444 hp) and 600 Nm (443 lb-ft) of torque. It’s used in high-performance models such as the Audi RS4 and RS5. 
These engines exemplify modern engineering’s ability to balance performance and efficiency in smaller-displacement V6 configurations.”
@@ericolivier1271 Well, ChatGPT is still my goat. When I asked it to search Google for the newest smallest V6 engine, it gave me a 2.7 litre Ford and a 2.9 litre Audi engine, both of which are pretty hard to find on a Google search by yourself; evidently, as you too said, 3.5 is the smallest you can find
Edit - It looks like a lot of people are literally just reading the calc hater comment and replying. I'm not anti-calculator because I think everything should be done in your head. At least read everything first -
As someone who was in high school / college when AI became popular, and a previous calculator hater because my friends who were bad at math never improved since they could just use a TI-83 to solve any problem, we are truly cooked. Not even about "this isn't fair" whiny business, but people are genuinely becoming dumber. Tutoring people in high school who don't understand basic math is jarring.
Being a calc hater is crazy ngl.
Mfukas in 1642 be like
I think an important part of the 'at what point is it acceptable to have something else do the work for you' situation is that you can only learn so many things. You do not have infinite time to learn everything. I'd love to learn spanish. But I'm also working 2 jobs right now and don't even have the energy to clean my room more than once a week. I am a computer wiz and can do great mental math, but if you ask me how to cook a meal, I'd have nothing to offer you. I want to learn to cook, but I'm busy learning to draw, as that's the skill I'm currently focused on. My point is, getting upset at people for relying on calculators is crazy. They're everywhere, and are not hard to use, and give reliable answers. I'm sure you have areas in life where you lack knowledge, and I doubt that you would appreciate someone acting like your lack of knowledge in that field is shameful.
And don't give me the business about how math is a core skill. It is, but so are tons of other things. Something being a core skill depends on what you do in life. If your life path doesn't involve needing to know spanish, knowing spanish is not a core skill. If you always have a machine in your pocket that perfectly solves every math problem you encounter and is easy to use, then, well, would you call mental math a core skill? Probably not.
AI changes the situation, as AI is actually pretty inconsistent, so using it to cover your weaknesses is a gamble. For some people that risk/reward dynamic is preferable, and for some people, it's not. You can't really generalize it to 'nobody should use this thing'. There are situations where you shouldn't use it, absolutely, like an electrician needing to ask chatgpt about everything over and over instead of properly learning the job. But that's only one situation out of millions.
Dang, it's news to me that TI-84s could solve for x, aka 90% of college math classes.
More seriously, any half decent math teacher will design the problems around the calculator you are allowed to use in the class
@@godlyvex5543 relying on calculators for basic math is crazy though. Learning times tables in elementary school, but by high school needing a calculator for them is bad
AI interview is crazy, I’m for sure getting the interviewer to start talking like a pirate
It’d be crazy if schools taught in a way that engaged students and made them want to learn.
I'm a TA for some upper level/grad CS courses at my college and the difference between students pre and post ChatGPT is dramatic. I sometimes have to explain things that they should have learned in the first semester, despite them being juniors and seniors. Everyone's becoming super reliant on ChatGPT, skipping all the fundamentals and then failing when things begin to get challenging. Some professors have just stopped giving out homework and only give out tests and quizzes to counter this.
The point about using AI to detect if something is produced by an AI is actually at the heart of the previous generation of generative AI. There was a whole area of study on training two competing models, one to create content and another to discern whether content was real or generated, using the feedback from the discriminator to train the generator. They're called GANs and they're already mostly out of date.
The point I want to make here is that trying to create an AI that detects AI slop isn't going to fix anything. After all that's almost definitely how the system that false-flagged those nine human essays was made anyway.
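For anyone who hasn't seen what that generator-vs-discriminator setup looks like, here's a minimal toy sketch of the GAN idea (PyTorch, made-up 1-D data, nothing like a production model):

```python
# Toy GAN: a generator learns to produce samples the discriminator can't
# tell apart from "real" data, while the discriminator learns to tell them apart.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # noise -> fake sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # sample -> P(real)

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(1000):
    real = torch.randn(32, 1) * 0.5 + 2.0    # "real" data: samples from N(2, 0.5)
    noise = torch.randn(32, 8)
    fake = G(noise)

    # 1) train the discriminator to separate real from generated samples
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    loss_d.backward()
    opt_d.step()

    # 2) train the generator to fool the discriminator
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(32, 1))
    loss_g.backward()
    opt_g.step()
```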
While ChatGPT on assignments is clearly an issue, I do think it reveals the inherent flaws of this kind of assessment in the first place. I'd argue that exams especially push you not to care about the material and what it means, but just to remember the right things, making them only half a step up from the zero thought of ChatGPT. It still helps develop general essay writing and thinking skills, but we end up completely forgetting a lot of what we write essays about anyway, which is why students often feel like they're meaningless and don't care to put thought into them.
yeah school was so awful for me, I'd say up until 5th grade I was learning pretty important stuff, but as time went on, the topics became less and less relevant to my life. Science was important to learn the basics of reality, but learning about how DNA works? For 6 months? Wouldn't that time be better dedicated to like, teaching me how taxes work? I don't need to know about the 4 building blocks of dna, their names, and how dna replicates... it feels like they just arbitrarily decided to have those 5 main subjects just extend throughout the rest of school instead of cutting them off once people understood the important stuff. Most of the things I learned after 8th grade in math was not relevant to my life. History is an endlessly explorable topic, so you have to draw the line somewhere, and yet they decided to just keep teaching me about increasingly obscure and irrelevant things until the end of school. English/writing is one of the few subjects that I think deserves to continue forever, but my issue is how it keeps rehashing the same few ideas repeatedly. I swear, I was taught more about analyzing fiction than I was about interpreting anything relevant to my life. Why, in the last years of high school, was I still writing short stories, and not learning how to write correspondence and be convincing to people? Why wasn't there more focus on TALKING to people? Phone call etiquette? How are these extremely important and relevant skills left untouched while we get assigned another few essays about some random thing nobody cares about?
I dropped out of school around when I became a senior, since I had been skipping school for large chunks of time. Occasionally I came back for big tests and aced them despite not being there for the classes. I took my GED and got amazing scores, and to be honest the test was just so easy it could be completed by an 8th grader. Not because it's bad or anything, but because it focused the main 4 topics to things relevant to real life. Turns out, when you arbitrarily limit education to 4 topics, there's only so much mileage you can get out of those while still staying relevant to life.
imagine the adults actually experience learning the first time in their lives through trying to hook up nodes for their custom waifu chatbots
@@godlyvex5543 This argument is so washed. Taxes aren't that complicated, you figure them out after like two years of having to file them, and on the other hand if you didn't learn about DNA in school for 6 months there's unironically a good chance you'd be an antivaxxer right now
It’s always been the case in my classes that there are people who study for the exam and people who study to learn.
That’s a personal choice you make. There’s no shift in responsibility there to society, or the structure of classes, or anything external to you.
No matter how you restructure the course you can’t make somebody care about something.
If you cared about learning the material, exams would not be an issue for you.
Essay writing specifically is about repetitions. The more you do it about a diversity of topics the better of a writer you become. The individual snapshot of writing this paper at this time doesn’t matter long term for your life. The cumulative writing experience you’ve gained from your education does.
@@Exisist5151 Right, and I understand that, but my point is that exams specifically are not a great way to test intelligence or even long term knowledge. With the status quo you only have to remember the information for a specific period, and the information that you need to know is hyperspecific and often feels like it's learnt only for the sake of learning. During school I thought I didn't like learning because I struggled to attach myself to what felt like meaningless topics, but as I've gotten older (still young) I have realised that I love learning and have an innate curiosity that I satisfy through YouTube and university, which is much more related to the topics that I find interesting. You are right that you can't force someone to be interested, but the school system does not do a very good job at even trying to make learning engaging. Individual teachers try really hard I'm sure, but the ones I had often didn't and treated it like reading a textbook to the class.
I've been using it as a sort of teacher/mentor/helper. I had never replaced the flap in the back of a toilet, so I pointed my camera at it, uploaded the pic, asked what it was, then where I could buy it, and then bought it and fixed it myself. Fixed the leak in the toilet. I never had anyone teach me these things or show me.
The second thing I've been using it for is just general questions and some clarification, because I have ADHD and I just have trouble digesting information. Being able to take something that's sometimes deliberately written in a confusing way and slap it into ChatGPT to say "hey, can you explain this as if I was 5?" has made my life easier. It's been one of the most life-changing things next to the internet for me, big A.
I said this a while ago.
The solution is simple: oral exams. But that would require *gasp* teachers to ditch testing methods that scale well from a grading perspective.
It seems like a good opportunity for the flipped classroom - have the kids watch the lectures at home and do the assignments in class.
Oh that explains why all of my classes are flipped classrooms, which I don’t learn shit in because I’m not good at teaching myself. I do so well in normal classes, I miss them :(
@ interesting, my understanding of flipped classrooms was that students would get more hands on time being taught by teachers rather than using the classroom time sitting in silence watching lectures… but i can imagine a lot of classes just use it as an excuse to be lazy and cut staff
@@hjewkes Usually classes just hand out worksheets, I have no idea what the right answers are, the teacher grades on completion so I never end up actually learning the topic in depth because there’s no lecture.
I was watching a reaction youtuber and i stg he would ask chatgpt “hey what did the creator of the video mean when they made this” EVERY TIME IT WAS SO WEIRD
why were you watching them and was this a dream
@@JaMaMaa1 I was watching the new Tyler album reactions and randomly found them. I stopped watching because if I watch a reaction channel I want people to have their own insights, not to be asking ChatGPT
It's interesting they're using AI interviewers. I'm normally alone at my shop, but today the manager of a different department who mostly works from home came in to do phone interviews all day long. It looked like a massive time investment. Luckily my company isn't doing that yet. I learned a lot listening to his questions and got to get some really good career advice from someone who hires similar positions to what I want.
A friend of mine at uni doesn't even go to lectures anymore. He just copies and pastes the lecture slides from our classes into GPT and asks it to explain them
Lowkey might do this 😂
@@tommysoulz not a good idea
@@Asternius It's not like I can pay attention to a 2 hour lecture anyway, shit would be useful to get the cliff notes as well, anything at this point is good for me
@ a 2 hour lecture is too hard to pay attention to? sheesh
@@tommysoulz If you can't pay attention for 2 hours just change career path
I think AI and ChatGPT really can help you understand things. Because in the end you have to write an exam, and if you only use AI without trying to understand anything, you won't get far.
And it really helps me to understand things. And find patterns in some topics and connections between topics.
to prove your essay is real, show the document history of your google doc
Git commit history for cs classes
Takes like two hours for the kid who messed with RuneScape bots to make something that turns a text into key presses with short, varying-length waits between them and a couple of longer pauses, and to share it with the class.
@@afrofantom6631 ong
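To spell out what that comment means (and why document history on its own isn't bulletproof proof of authorship), here's roughly what such a replay script looks like; the pyautogui usage and the timings are just illustrative:

```python
# Illustrative sketch only: replays prepared text as keystrokes with
# randomized gaps, mimicking the cadence a doc's edit history would record.
import random
import time
import pyautogui

def replay_as_keystrokes(text: str) -> None:
    for ch in text:
        pyautogui.write(ch)                       # press one character at a time
        time.sleep(random.uniform(0.05, 0.30))    # short, varying waits between presses
        if random.random() < 0.02:                # occasional longer "thinking" pause
            time.sleep(random.uniform(2.0, 8.0))

if __name__ == "__main__":
    time.sleep(5)  # time to click into the target document first
    replay_as_keystrokes("Essay text would go here...")
```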
I spoke with one of my college professors who works a lot with admin. He basically said that they are moving to completely change essays and assignments so that it doesn’t matter if you use AI to help. All traditional homework problems are now only done during proctored tests.
All homework problems are done via proctored tests?
That’s not how homework works nor can it be enforced 😂
I’d love to see a school try and force surveillance while you’re doing homework.
@ No, students are allowed to use AI on assigned homework. The homework is either not graded the same or is designed so that using AI doesn't really give an unfair advantage. The old traditional homework problems that AI can do on its own are used in in-class quizzes and tests, which are proctored.
10:46 that’s so cooked 😂
People acting all surprised that AI can output work at levels equal to or greater than a recent college grad... Then also acting surprised that that's 99% of the people that use it
I'm in college and used ChatGPT to generate ideas for formatting the conclusion of a scientific research paper. I did not copy any of the ChatGPT work, just used it to inspire me, but it freaked me out because the way it wrote sounded exactly like how I write papers. I'm terrified I'll get accused of using ChatGPT eventually even though I've almost never used the thing
Wouldn't be surprised if the solution is handwritten stuff with a camera on the student for the whole duration of the assignment.
I’m a high school history teacher. I really wanted to do lots of research papers, but chatgpt has made any work that is not handwritten impossible. All writing work has to happen in class, and even then kids will use chatgpt quickly in the bathroom and do their best to copy it to their work. It makes genuinely teaching writing impossible
@@adamb.4404 Good writing is time-consuming and inefficient. Actually teach your students by making it interesting
@@adamb.4404 Are you not able to enforce the students don't carry their phones to the bathroom?
@@iks-zeroHow would you enforce that? Should we station a guard outside the bathrooms and perform searches?
@@paroxysm6437 Well, they're in the class initially, right? Just enforce that they show their phones and go to the restroom without them
It's all I do. Rarely go to Google anymore except for maps and images.
I use it often to check my understanding of concepts from my classes. The notes can be vague sometimes and if I can't get a good googlable result it helps me conceptualize. Keep in mind this is just because I don't have friends in my classes.
"Hey I'm an AI recruiter!"
Me: I'M OUT
The best thing I’ve seen chatgpt used for was making Pokémon type puns out of peoples’ names, like John Doe -> John Flow (water type)
5:50 The easiest way to game the AIs, if they are based on ChatGPT, is to just say statements instead of questions; basically leave no room for implication.
A.I. recruiter - Tell me about your last employment.
Me - You're gorgeous! What are you doing after this? 😂😂😂
Dude, ChatGPT has helped me like crazy.
Sometimes just to double check, or using it as a backboard to bounce ideas off. Expand ideas. I don't know if I would use it for finance lol.
But using it to start off ideas.
I feel like it's useful when you put information in, like giving it context and stuff, because of its historical knowledge. But not for straight up factual information that can change over time
My dad uses it. He once tried to find a picture of his graduating class with it. He was so amazed it could find it and showed the whole family. Finally I pointed out to him that it was just a generated picture and he was really sad. He still uses it though
Debate me.
I am an engineering student using AI when allowed. I have had a few teachers not prohibit AI because it "won't help", but I find workflows that almost trivialize the work for me. I find that at worst the practice is benign, and at best it gives me a severe leg up over my non-AI-doped peers.
Having an intern in my pocket to do the low level things at an adequate level, whose work I need to verify and tidy, lets me focus on the high level goal of understanding and applying material as best as possible. There are AI doomers who trust it fully, which I can't agree with. I put genuine rigor into learning the tool and experimenting to find its biggest strengths and applications, and feel enriched while using it at arm's length.
The only fear I have is that OpenAI turns to ClosedAI and I lose access to the software that I have partially built my understanding around. I would pay serious bucks to acquire an NVIDIA chip and own my AI if I had full access to the model.
4:57 Atrioc Smith: Can a robot feel love? Does it know what a partnership in life feels like?
AI: Do you?
I'll just become a plumber or some shit.
On a real note though: I am a teacher myself, but I teach in an elementary school, and I don't give out written assignments at all. And on an extremely rare occasion that I do, it's either during the class, or I just ask a bunch of questions afterwards to make sure they didn't just AI it. Though I doubt they would anyway, since I teach English as a foreign language, so they wouldn't even know what prompt to write in to get an essay back. Not to mention I haven't heard them talk about AI at all, so they might not even know that's a thing they can do, hopefully.
That being said, if I was teaching in a high-school, I'd probably just... quit and wouldn't. This seems like an absolute nightmare. Like, if I was a student now, I'd probably AI this shit, but at least I'd change some stuff, you know? It seems crazy to me that tech literacy is going down instead of up since tech is literally everywhere now. Even your fucking fridge can have Wi-Fi these days, man.
As a recruiter, AI interviews won't ever be fully implemented unless labor laws drastically change. There are so many ethical violations just from the example you showed. While it may seem like she's just cutting him off, an AI not being able to recognize different accents and not allowing someone to continue speaking is technically a form of discrimination (in terms of labor law). It's little intricacies like that which could end up putting companies in a lot of hot water. They'll probably use it until they get sued. But who knows, I could be wrong. I just don't see it working with our current laws.
Gotta love doing weekly discussion posts, and it's just a bunch of robots talking to each other. Dead internet theory frfr
Man I wish I had AI for those stupid weekly discussion posts. Such a waste of time
If a job has an AI recruiter I'm not applying
as a student i hate the ai cheating. one of my profs is giving impossible quizzes now and those of us who refuse to cheat are taking the hit. I bet there'd be a perfect bimodal distribution if you graphed the grades.
I agree that calculators didn't replace learning, but when I was in school Wolfram Alpha definitely did for math problems. Some people would only do their homework by themselves because they knew they'd be fucked in exams if they didn't. Maybe paper-writing goes the same way, where you just have in-person handwritten papers that take much more emphasis than the homework. I'd done most of my schooling in an Indian system, and our grades each term were entirely based on exams we'd write for each subject twice a term anyway
engineering uni freshman here, everyone I know hits their “take a picture” limit so much they’ve created alt accounts, AI has atp overtaken YouTube tutorials (at least for the simpler stuff)
I'm guilty...
When I was getting my IT diploma, AI was on the come-up. It got used a lot, but it doesn't help as much in tasks that actually need to be performed outside of theory.
It was also accepted for use in exams, since my school held a policy of learning the way you would in the workforce. Basically you had any tool, and even the internet, at your disposal on most tests.
So you could quite nicely use it to sum up key points, autocorrect, and write some simple scripts. I still did all the rest of the work myself though. When used like that it's actually a nice tool.
Had a TA lead a discussion with some example problems. He ChatGPT'd the answer and in literally the first step it tried to solve a series resistance with a parallel resistor equation. Everyone was stupefied over how the TA didn't notice the most obvious thing taught in high school physics. When we asked him to go through it himself he just straight up said: "no" 😂😂
Yo tell the teacher. That TA is getting paid to answer your questions and work through problems with you. 2:26
Im a CS student and at my university they allow the use of AI and just have people present their work more often so they can see that people understand what they wrote
In my CS department they actually encourage us to use it. Not as a replacement for actual understanding of course; we mostly use it as a handy tool to quickly search and summarize documentation
The only thing I've used AI for to any significant degree is generating atmospheric landscape images for the D&D game I'm running, which I would have absolutely zero hope of drawing, but I can describe the details I want accurately and iterate on whichever images match what I'm looking for most until I get pretty close to what I saw initially in my mind's eye.
I use ChatGPT all the time in uni. I've even used it for the abstract in my papers. Our course coordinators/teachers sometimes allow it as long as stuff is referenced correctly, i.e. basically the only thing you can use it for is the abstract.
Not that ChatGPT could write our papers either way though. It doesn't even know how to reason about polarity.
It’s also a great way to ask about a concept you have a hard time to wrap your head around. Like a second teacher.
I had a really hard time understanding how a membrane selective potentiometer worked fundamentally. But after tinkering with the prompts I got a corrected response I understood.
Unlike what the teacher, Google and textbook said.
ChatGPT is great for stuff I'm working on. It helps me gather books and research papers. It can help me write out study plans for subjects or take notes, etc. It doesn't understand how to do optical math or really any math in general, but for study and work it's great. It also helps me generate concepts for Python scripts I'm writing at work for automation; it takes a few secs and works well for simple stuff.
I use AI almost every day for cooking. Asking how long and at what temp to cook stuff… What to do with leftovers… what flavors go well with others…
It’s pretty helpful
real. for basic reminders, it's extremely useful
You could just use Google or YouTube for that, or even just ask other humans. Stop taking shortcuts
@@roycampbell586 Why not take the shortcut if it's faster and easier?
There are some big exceptions for student AI cheating at the university level, which are many humanities classes. I'm a philosophy grad student at an ivy who teaches at other colleges on the side, and many of my colleagues and I have found pretty solid ways to detect and pre-empt cheating. Most philosophy content is VERY poorly "understood" by GPT. There's a set of readings and concepts that're extremely common in intro classes, but if you just switch it up to be more niche readings the cheating becomes obvious. For example, I decided to replace half a unit on empiricism as family of theories in philosophy of science with a more niche unit on values in science -- something that is not readily googleable -- and the cheating became instantly obvious. Yeah, I had 15 cheaters in a class of 30, but none substantively claimed that they didn't cheat. I've seen colleagues in psych, sociology, cultural studies, and anthro do the same thing with fairly good results. It might be that non-intro humanities classes will become the hardest classes to cheat in within a few years.
My wife works at a high school and all of the kids' computers are integrated with Grammarly, which has an AI function the kids are encouraged to use. A lot of them read/write at a 1st-3rd grade level without AI and it's the only way they will graduate
AI is too new to be blamed for that
is this high school perhaps on MLK Drive?
that's horrible, our education system is cooked
@@ThugHunterfromIsrael get bent bigot
The old human essays that were flagged as AI-generated were probably part of the model's training data
While it definitely does just “guess” the next word as an output, it does it prodigiously well, and has a lootttt of info to back it up. You NEED to be careful when using it; being aware of when its training data ends (making real-time bets usually slop guesses) and realizing it will hardly ever understand when to say “I don’t know” is so important. But it’s absolutely more than a word guesser. Especially specialized “AI” used to do specific tasks like hybrid ray tracing or robotic movement training akin to inverse kinematics.
I work tutoring one-on-one, and I can see individual interviews / oral testing coming back in a big way. It's meaningful for the teacher, even if it is harder than lecturing to a whole class at once, and it's already just clear as day what the student actually knows versus what they are trying to 'guess' is the answer you want to hear, so someone quoting AI is just not gonna get under the radar in any context.
It's funny, I'm in my senior year of my electrical engineering degree and ChatGPT is not good enough to help with 99% of my assignments
As a fellow electrical engineering student you are just wrong
Give an example @@paroxysm6437
maybe if Google didn't sabotage their search engine, ChatGPT wouldn't be killing them.
A lot of our assignments for english were already done on paper in class. I imagine that will spread further
Nothing gives me a greater sense of dread for the future of humanity than an Atrioc video 🎉
I actually gave a seminar lecture on the advancements of technology in education in 2021, and I explicitly mentioned how AI was going to be a major game changer in the near future, and educators would not be prepared for it. Then a year later ChatGPT was a thing
I use ChatGPT to ask questions that would usually take multiple Google searches, like a comparison between two specific motorcycles. It's just more convenient.
My stepmother is a lecturer at an Australian university and they just design the entire course around testing the students' ability with the assumption that they are AI-assisted.
I’ve never used ChatGPT a single time. I honestly don’t think that I ever will and I feel like maybe I won’t be better off for it. but philosophically this is what I want. it’s ubiquity and how much it’s become relied on by students, as well as other AI, really make me trepidatious and actually scared of the future in ways that I never thought I would be.
Blockbuster never used the internet and now they gone forever lmao
@@SyntaxWyntax blockbuster is a business not a person, idiot
Using it seems kind of weak lmao
@@SyntaxWyntax more of a lead situation than a Blockbuster one. Lead works better but is also poisonous.
I have used it; as it stands it's 10 times worse at getting you relevant and accurate information than Google is, and Google is fucking dogshit compared to what it used to be. You're not gonna be losing out at all for a long while still
Thank God someone else is mentioning women using Character AI for AI boyfriends. Everyone jokes about AI girlfriends, but women are going crazy with Character AI
Coffee glizzy
Chatgpt is very good for rapid acquisition of knowledge
I've found predictive text programs super helpful specifically for translating historical documents, where I might be able to read 4 letters of a word but the 5th is illegible. Works great for that. I would not trust a single thing it spits out regarding actual history though. I tried it a few times and it's just wrong over and over.
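That use case maps pretty directly onto masked-language-model inference. Here's a minimal sketch with Hugging Face transformers, working at the word level rather than single letters (the model choice and the example sentence are just placeholders):

```python
# Suggest candidates for an illegible word in a transcribed line.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-cased")  # [MASK] is this model's mask token
line = "The shipment arrived at the [MASK] on the third of June."

for guess in fill(line, top_k=5):
    print(f"{guess['token_str']:>12}  score={guess['score']:.3f}")
```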
I found out that I could use ChatGPT to find links for my communication assignment; I asked it for links to support my arguments.
Also I could see chat gpt being the new search engine
I work in the medical field and we have a lot of doctors using it for diagnosis and treatment plans. It does a pretty good job tbh.
this is just another AI doomer video from people who don't use it
I’m for sure promising to submit myself to the ai overlords to get my interviewer to give me a 100/100
I am divided on this topic. On one hand I feel as if AI is a tool that SHOULD be used; on the other hand it's highly dangerous if it replaces actual learning. During my last year of uni, I've used GPT on multiple math questions and equations as a way to understand the process that goes into solving an equation. The current ChatGPT will not replace learning, as it usually gets the correct answers but rarely uses the easiest formulas. And I think that will be what differentiates the "A" students from the "C" students. During an "offline" math exam, the students that are copy-pasting from GPT will struggle to finish in time. When it comes to essays, it is and will be like a war, sort of like between YouTube ads and adblockers. Patches will come and go, but it will continue to evolve. I don't think there is a solution, except for mandated in-house writing on individual computers or allowing some sort of control over your device.
What dish is the editor?
A dish? Which one damn it!
I feel like there might be a return of more in-person evaluation and examination overall, which could be a positive thing.
Homework has already been shown not to be helpful in many cases for developing student skills and focus, so just get rid of it and do more in-person examination.
The same or more time in school, no homework, in-class writing, done. Good luck for university though, but a similar process could be applied for examination.
For jobs though there might be some need for regulation, like a centralised network where your applications are tracked and your qualifications are automatically filled in when applying (I know I am describing LinkedIn, but I mean it as even more central to the process)
2:56 I had an AI recruiter call me. It was eerie how good it’s gotten. It was much better than the one in this vid tho
When presented with having to write an essay on two vacation locations vs each other, pull out that damn AI and spend your time on something meaningful instead.
I support THAT.
More interesting and informative topics, then no AI.
In some of my classes in high school we only wrote on paper / Safe Exam Browser to train ourselves to reason from only stuff we know. I reckon something like that could be done, but with a textbook, to avoid AI but still do writing.
The AI interview could be cool for an AI security job. If you manage to identify its flaws and jailbreak it, you’re hired.
Shit man, you got me. I got clickbaited. I don't make my own grocery lists anymore man. ChatGPT my personal little Wall-E, my goat.
I wanted to say that as a university student, the way teachers are fighting back against AI is removing written assignments as much as possible and making classes more exam based. My business law class is 100% weighted by exams.
It's scary how many people take LLM responses at face value and accept them as fact without a second thought
So I could take a test interview first, where I score poorly, just to write down the questions they're gonna ask me, then prepare in advance for the best interview score ever seen.
I use chatgpt for literally everything. My niche use case is if I can't think of a word or concept; I give it near-concepts of the word and what I think it might start with and it always gets it.
I use it for emails, ideas and organizing the main points at meetings
I'm so glad I speedran college and got out before this became such a huge thing. If I had been in for the extra two years a lot of it would have sucked.
My English teacher uses the Google Docs feature to see document history and uses that to determine if something was written by AI. For example, she knows it's written by AI if large portions of text are clearly copied and pasted into the document. I think this feature can also be used to prove that you wrote each word yourself and didn't use AI; however, people might just type it out and fake errors, so idk.