Tanner explains the reality of being in the technology field very well. Timing is everything. You are hired because you have the greatest potential to create value for the company, which is what justifies extreme pay. Only the top 1% get those tech jobs, or the very lucky, but expectations are always high. Cutting-edge tech jobs are at minimum 60 to 70 hours a week if you include time maintaining skills. Elon Musk led the way on tech job cuts with Twitter. Twitter had a tough transition, but is doing well with about a third of the previous staff.
Couldn't agree more. I am an "AI Engineer" on paper, but more than half of my work is Infra/AIOps. I'd say it's very rare to work as a model developer in this market. Amazing video btw! :)
AIOps/Infra: are there enough resources on the internet to learn it? 😢 Or is it so complicated and sophisticated that it must be learned in a real working environment? I mean, when companies are hiring AI engineers, the number of papers matters; it seems the hiring process is still concerned first with candidates' academic ability, not the practical stuff. Is this because that stuff is hard for students to learn at school or by themselves? Will companies train you from zero? 😢
@@minorikushieda2733 bro, just master DevOps + data science (exercise your concepts on Kaggle) + ML engineering stuff (how to build apps using LLM APIs, LangChain and the like). That's all.
Unless you're working for a handful of companies at the forefront of AI research, or embedded within a university research group as a postdoc specialist, that's essentially what being an AI engineer means. Schools do a disservice by selling the myth that jobs exist where you do abstract modeling and then hand it off to some implementation team to actually make it work in the real world.
I think companies either hire software engineers with a background in ML to implement their models, as Tanner says, or PhDs who do the training, etc. of the models. Someone who "only" has a Master's but also doesn't have any significant software engineering experience is in a bit of an awkward middle: not practical enough for an implementation role, but not theoretical enough for a training/research role. That's probably the issue Tanner is facing.
I'm somewhat in that awkward middle class you mention, although it's typically enough for me to get AI internship offers. That let me get relevant experience, and I've been moving towards GPU computing. Just my experience, but I think internships are a great route for those stuck in the middle.
@@texasgoat2991 Same on my end, not enough to get a job but enough to get an internship in a middle to big sized company, which hopefully should increase chances later on for a job!
I got into the AI job market at the end of 2022. I deliberately looked for a job at startups, because I figured that at a big company I would handle just one part of the pipeline, and I wanted to learn the whole process, from data engineering right through to deployment in a production environment. So far I have developed processes for every stage, and I'm now testing the full pipeline for some parts of the project. I am the only person on the Machine Learning Team, so I have had to work out how to design each step myself, with only the internet and the knowledge I got from a coding boot camp.
Yeah dude. Same here. Built whole AI systems for our startup, broke production, rebuilt it, deployed large models on different cloud services, did testing...all by myself. Startup teaches you a lot.
Physician degrees are gatekept: only a fixed number are allowed to graduate each year. That's the only reason the field is so stable. Honestly, they should do it for every job field to prevent people from going into markets that can't support them.
@@Zuranthus False. There is a limited amount of space to get in; if that's what you're referring to, then yes. But no, they don't gatekeep how many students can graduate. SWE cannot be gatekept because you can learn it without school.
@@redkoadplayer7289 Residencies are limited: "44,853 applicants and 41,503 certified positions across 6,395 residency training programs." There are doctors who never get into a residency program their entire career.
What companies want from their ML engineers is to show them how to add value to the business. Either one needs to understand the business problems or partner with a team that knows those problems.
You are right - you need to show up with solutions to real problems and it takes years in the business to develop those skills. It's only going to get worse.
@@rockpadstudios Companies are growing dumb. Companies want AI but they don't know why! This is why I encourage people to not stick with companies and find ways to build their own thing.
One issue I am observing is that lots of companies skipped over conventional statistics that could help them and jumped straight to AI to brute-force solutions that aren't going to work. They were sold the lie that it could solve anything; they will get disappointing results and be averse to hiring for years to come.
Companies are just pandering to their shareholders. The AI concept is very hot now, and although they don't know anything about tech, retail investors and funds are chasing it insanely. OK, you run a company and you want your share price to go up. Naturally, you want your business to become associated with AI, so that blind money will smell it and rush to buy your shares. However, I don't think this will have much effect on the job market. These companies will not hire many people; they are just pretending, so they don't need to.
Can you please expand on this for someone not very familiar with the subject? I was able to build models with both traditional and AI approaches using the same limited training set (100 data points). Why should I have continued with traditional statistics if the AI one worked well?
@Nuhopoclik1 Not saying it is always bad. But let's say you are an economist and want to model inflation trends. Doing conventional time series analysis, you might test your data for unit roots, autocorrelation, stationarity, etc. Is there even a relationship worth modeling? With AI you might be tempted to just throw as much data at it as you have, when the problem might not even be modelable at all. So to do accurate modeling, which you will eventually have to explain to executives, you do have to go through those steps, and I don't feel that's happening in a lot of places. It's just AI, black box, more data.
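A minimal sketch of the kind of pre-modeling sanity check described above, using only numpy (the series here are synthetic, purely for illustration; real work would use proper tests like the Augmented Dickey-Fuller test from statsmodels):

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation: values near 1 hint at a unit root / non-stationarity."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

rng = np.random.default_rng(42)
white_noise = rng.normal(size=2000)              # stationary by construction
random_walk = np.cumsum(rng.normal(size=2000))   # unit root: non-stationary

print(lag1_autocorr(white_noise))  # near 0: worth modeling directly
print(lag1_autocorr(random_walk))  # near 1: difference the series first
```

The point isn't this particular statistic; it's that a few lines of diagnostics can tell you whether there is structure worth modeling before you reach for a black box.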
Because if you can get good enough results using simpler techniques, it's better to use them. That's Occam's razor. Plus, don't forget about the interpretability of your models. If your models are not interpretable and you build your management system on them "no matter what" (targeting, advertising, clients, or loan applications), then your business eventually becomes unmanageable. Especially if you have several models in the pipeline.
I finished my master's in AI a year ago, and I still can't find a job. Why? Because my area is not LLMs and, as was said in this video, all the jobs are for seniors.
Sorry to hear that… but it sounds accurate. It's also sad that people ignore every other aspect of AI and deep learning. I hope we soon see people relax and explore more.
Going to be blunt: you should look for the problem within yourself. Why did you study what you studied if the market doesn't want it? Are you going into industry, or do you want to be an educator? Think hard and plan your path with purpose from now on, instead of blindly going into a field just because you like it.
Well yeah, that's because a degree doesn't automatically qualify you for a tech role; you need skills. Most people who have these jobs don't have a degree, because skills are what get you hired, not degrees. All a degree does is prove you can do homework and pass exams. The only people who have no trouble finding roles are those who understand that skills are what matters.
I have multiple degrees in engineering & business and a graduate degree in AI, and I have worked both in research and in industry. AI has multiple sub-fields. I'm always surprised when someone studies ML by itself and considers that enough to breeze through the industry; it's not. You either need to be a domain expert in some other field or at the very least have DevOps skills, because most companies do not have large teams while they are still assessing the feasibility and business value of AI. Having a team without domain knowledge of the problems it is working on increases the likelihood of those projects failing or not attaining the best business value. It's perhaps not easy to have multiple degrees and work experience from various industries, but if you can invest the time it will always pay off. When I did my degrees I just didn't want to be stuck in one field, and when I studied AI it wasn't such a hype; everything changed so fast, and now companies are scrambling and people are jumping on the AI bandwagon without a proper strategy. Take your time and do proper research so you don't waste it.
The misconception you are referring to is actually engineered by AI solution/consultancy sellers. Step 1) build the hype around AI, step 2) get the contracts with clients, step 3) figure out if we can really do something useful, while we burn the client's money.
Becoming a better prompt engineer is, for the most part, a waste of time. I've noticed GPT4 has been getting better in the background. And of course, their whole goal is for the system to respond usefully without slinging special prompting tricks lol. For example, GPT4 used to completely rewrite my code and cut parts out, whereas GPT4o initially would just repeat the code back to me. Recently though, GPT4o can do 300+ lines of python and preserve my original program while implementing the prompted tweaks. This all happened recently without any announcements.
It still hallucinates constantly; I would never trust an LLM not to fuck up 300 lines of code. I also notice that the ability to consume and produce higher token volumes has made responses less concise and digestible, which, combined with hallucinations, means a solid engineer is often just better off using Google and Stack Overflow a lot of the time. Especially if you work for a company whose code you can't just hand over to OpenAI. Often, if you Google keywords similar to what your prompt would have been, you find that the data these LLMs are synthesizing is nearly always on the first page of Google results. The hype cycle got really out of hand and it is time for it to die off.
AI engineering is not at all the same as prompt engineering. Prompt engineering is a fancy name for typing effectively to LLMs. AI engineering involves knowledge of how neural networks are created: building vision models and other transformers, optimizing them, etc. That is then combined with business process mapping, so you know what and how to automate different processes in whatever industry you end up in. Vastly different skills.
Yeah. I see prompt engineering as a similar activity to SEO, i.e., trying to "game the algorithm/model". But SEO is working in an environment with one very dominant and relatively stable search algorithm, and with a very clear relationship to a revenue stream (more clicks, more sales, etc.). Prompt engineering is doing the same but in a landscape with many different LLMs that are changing relatively quickly (appreciable differences in most models every 6mo to 1yr), and in most cases the revenue streams have not been proven yet. And it seems like it would be extremely frustrating work.
@@mike200017 Not exactly comparable to SEO. There are research papers on things like SoT and other mechanisms that improve accuracy for LLMs. The models aren't incredibly different from one another, and you don't need to chase all of them; just stick to the best ones, which so far keep being GPT-4o, or the Gemini or Claude upgrades. The results you are trying to get with LLMs are real-time, and the methods for accuracy are known. So it's really more like: the better your ability to model processes, map them in your mind, and express them in writing as an operating procedure, step by step, the better your results. That's just one part. You also forgot that LLMs are just NLP models. They process language into commands, commands that can be linked to functions, like automating workflows, etc. However, the LLM needs access to vision models or backend code, with each button and field labeled, in order to be the universal action model we are all seeking.
I second that the boom was in 2022. My gov lab was even peddling AI summer classes, which they'd never done before. Then 2023 was contracts-no-renewal year lol. Also in 2023, I vividly remember how nobody wanted to admit they were using ChatGPT to help write papers.
I worked many years writing "classical algorithms" next to a research team working on ML/AI approaches to the same problems. They worked overtime and were constantly swamped with issues trying to keep their massive data collection and training infrastructure up and running and it was moving forward at a snail's pace. I think they never quite understood (or were in denial of) why I would give them such a bewildered look whenever they were bragging about all the things ML/AI could do without needing to code anything. I'm not pooping on ML, but I think we are still in the early days, a lot of maturing of technology and experimentation still has to happen.
A better way to define AI hallucination is that the model is responding to a prompt with an incorrect prediction. Figuring out why a model is predicting incorrectly is the challenging part. Sometimes we can't even tell that the prediction is wrong.
The sad reality happened in 1994: Eternal September. Then, during blitzscaling, when templated and untyped languages became mainstream, and to suppress wages, technicians started being called "developers" and "engineers", which they are absolutely not. Specifically to address the raised questions:
* in 99% of cases (like this one) it's a misnomer; "ML engineers" like this are neither "ML" nor "engineers". The best name I've seen is "data scientist", but a more honest one would be "data cleanser"
* "prompt engineering" is nonsense, not even going to address it
* math is required, unless you want to be relegated to lower roles or underperforming teams
* CS is required, same
* those who join the industry for money usually have aptitude for neither, hence they mostly try cargo-culting and end up disposed of once a surge of low-complexity manual work is completed.
Now you may cry "elitist" all you want, but the era of cynically catering to amateurs to boost low-quality growth has left a deep scar in the industry, so don't expect any consolation prizes. Just look at Boeing or Intel.
Honestly, I always assumed that ML engineers were full-stack devs with a specialization in ML. I had no idea this area of study did not include general software engineering. Personally, I'm a lifelong developer who did a couple of courses to pick up ML and understand how to use it within my projects where necessary. But I always felt like maybe there was something special I was missing out on by not having studied it at college. Either way, leveraging an LLM has become insanely easy now; I just construct prompts within my app and submit them.
I've had a 30-year career, beginning with COBOL and dBase. I work on and with AI every day. I've been deeply depressed the last 3 years watching my industry rip itself apart *again* over hype driven by horrid SV VC scumbags selling AI. I'm really pleased to see everyone waking up. Make no mistake: when CI/CD pipelines, dev tools, etc. are leveraged with generative AI by the next generation of developers, productivity will explode. My hope is that highly secure local deployment environments will emerge, with AI-connected data lakes and a full stack of standard dev tools backed by LLMs fine-tuned on configuration management, security, and so on. Then every type of business software can be constructed locally by devs using plugin components, with the entire stack managed in the background. That means MORE developers, not fewer.
The sad reality is that most famous universities focus on research. The culture of prestige has screwed up the university system in the US, Canada, and the UK. I've been working in consulting for more than 15 years, and most graduates don't know practical engineering. Most take 2-5 years to learn how to build, ship, and manage applications.
Same issue here. I tried to find an AI job last winter. Tbh I got 1 or 2 offers, but not the ones I wanted. So I decided to go the senior SWE route and become a tech lead instead. I've always seen AI in the broader context of process automation. I used to freelance as well, developing customized tooling to support work processes. So yeah, as said in the video, nowadays you need to be a good software engineer and have AI skills on top. For me, AI is just another tool in my toolbox for building cooler automations. I can't count the times I joined some CSVs with pandas and cleaned the data for some quick insights, etc. The practical data mining skills are underestimated IMO.
Love this video. One of those rare ones that remind you of the harsh reality out there with regard to jobs in this field. Of course, in any field, landing a lucrative job is not easy, and as some have mentioned in the comments, a key factor is being in the right place at the right time. However, this video is all the more pertinent during this period, considering the mad rush towards anything AI/ML-related and the danger that jobs in this field are over-hyped.
I think this was a nice talk about the gap between what companies think it takes to implement something and how it actually needs to be done and integrated. The part I can relate to is that, more often than not, these projects only succeed with a good data engineering team that plans and implements how the data is extracted, transformed, and stored for the ML/AI engineers to actually feed the models.
It always boils down to demand plus learning curve / barrier to entry. If a machine can do everything and your contribution is almost zero (meaning zero learning curve), then income = zero. If the learning curve is high, or the barrier to entry is high, like solving hard math problems, but there's no demand, then it won't become your job; people would rather solve it the easier, more economical way.
Degree matters. So I advise people with a non-IT/math background to stop jumping into this sinkhole. In today's market, AI is a tool; it is no longer a field of knowledge. It is better to do something else that makes use of AI, unless you're in the top 0.1% who invent the wheel.
Very well said! And there is a lot of opportunity in applying what we currently have. We've seen how much easier it is to work with companies once they actually understand what it does and, most importantly, what the limitations are. Then you can build strategies that scale, extract that value today, and stay flexible for tomorrow.
If one does have a math background (i.e. a bachelor's in mathematics), would this still be a good area to go into? Is there room for people who want to work more on the abstract side of things?
Guys, as someone starting out, my plan was to read theory for one more month, then try FastAPI blogs, then Karpathy, then implement any papers I like and can understand, by trial and error. The message I'm getting here is to focus on software engineering and get to building projects right away. Is that right? If you were to hire me, would you want to see in my GitHub how many papers I've implemented, or how many actual real-world problems I've solved using AI? Or both, maybe? Advice in this regard would really help!
Just my opinion: I'd pick the latter if you want to get hired easily. Of course, it's a big plus if you understand the architectures underlying the models, and implementing them will help in that sense. But unless you want to be a researcher (and it's super hard to get a job as a researcher, since those roles mostly require a PhD), the latter option (building projects plus familiarity with the AI architectures) will likely help you get a job relatively easily.
@@swapnilchand338 Well, hard to say. I'd say applied, but not really software engineering; it could be called "Research Engineer". My main job is not writing papers, i.e., working on novel algorithms. I mostly work on improving foundation models (LLMs, Stable Diffusion, etc.). Our team modifies/improves open-source models to meet customers' (usually B2B, not B2C) requirements. As you know, the open-source models themselves are not always adequate for business use without modification. Of course, those improved models can sometimes be genuinely novel algorithms; if so, we write papers on them. Right now I'm working on video generation: T2V, V2V, I2V stuff. So I'm not really a software engineer or a research scientist; I prefer "research engineer". But as you know, these terms aren't strictly defined, so companies may use them in different ways. You might already know this, but one way to find out what you need in order to get hired is to search the job positions you're interested in and see what skills or experience they require.
@@anonymous-random What has your background been? Say, x years of software engineering at big tech, y years at ML startups, z years in a master's? Can we connect on any socials?
Seems AI/ML people also aren't good at software engineering, but they sure can pretend to know PyTorch… I have a master's in the subject, but I'm also a trained software engineer. You can't just be one thing in today's world.
ML is a super broad area, and it's really hard to teach at the MS level unless schools restrict intake to math undergrads or impose a very tough math qualifier covering calculus, advanced stats, information theory, advanced linear algebra, and convex optimization. Almost no school has such strict entry criteria. I am doing my master's in ML and AI at a top-tier school, and my perception is that they don't teach enough theory for you to come up with your own algorithms on the fly, which leaves you unqualified to be an ML algorithm designer. You will know enough to apply standard algorithms, but frankly that's not very tough even without a CS degree of any sort. As for teaching practical CS skills, it's hard to teach that fully at school; that's the job of industry. The schools do teach the basics thoroughly through assignments, to the point that you understand basic Python libraries such as numpy, pandas, sklearn, and PyTorch, but those are kind of like toy examples. To do a full real-life project, including interventions, you would have to carve out a full year for a real-life internship. But isn't that literally what a job is? So basically, an MS by itself gets you to mastery of neither theory nor practical CS application. I don't think the characterization that schools are focused on just preparing you for PhDs is accurate. The issue is that real life has no easy arbitrage opportunities where you can do an MS, not keep filling the gaps in what you don't know, and still expect to earn a million-dollar salary as an entitlement. Never happens!
The conversation is quite interesting. If I may, I think it is important to differentiate research roles from other types of roles, even in industry. Big, serious, strong companies are mostly interested in research scientists, because those are the people who create the novel things; as a result, there is less and less need for people who cannot do the research and can only do the engineering part. Even universities and institutions know that, which is why they divide their master's programs into two paths, research-based and course-based, and I think those are self-explanatory. The fact that the research path is the more important of the two is not stressed enough; unfortunately, very few people are on that path compared to the alternative, creating the discrepancies we see in the roles. Also, for most of those AI roles a PhD is going to be preferred over any other degree, and even before joining a PhD it is assumed (and some professors test it) that you are already an engineer.
Very interesting observation! Unfortunately, at the end of the day economics plays a big role too: industry not only offers higher pay but also support with compute, whereas academia is lacking those resources all around. And thus we see industry, meaning corporations, eating up the talent. However, at a corporate job your research is constrained to the goals of the corporation, so the engineering part plays a big role too. It's a bit sad to see, because we would like to have more independent researchers advancing the field without commercial incentives. But where money goes, attention follows.
I wouldn't discredit engineering in favour of research. Someone's got to build the data engineering bits and that's often what academics know very little about. Every researcher needs at least a few engineers to support their infrastructure.
@@bla7091 Yes, but if you have a few engineers and one researcher, you would like that researcher to be advanced. I studied ML myself, and I must say that while the level taught in a master's degree is interesting (and nowadays probably a core subject for undergrad CS students too), once you understand what it's doing you can throw away the math in application; any sub-par engineer (without even advanced coding knowledge) can whack up a few lines of Python that run a prediction, and ChatGPT can do that now. There is no point in hiring a junior ML person.
Excellent interview. As a current MS student in AI, I am wondering whether I need an AI PhD or not... it doesn't seem like the best use of my time.
2023 had a lot of layoffs because of economic factors that affected the US job market. So it might not be very accurate to judge demand for AI experts in isolation.
In many large companies, the problem is they don't even understand data science as a concept. They have a data science project but hire a database consultant. There isn't even a DevOps-deployed development environment with actual dataset access. The dataset is safely locked away in a production-only server environment, deliberately designed to protect it from any meddling software engineers or data scientists. The project is treated as a pure data-engineering task: build a data pipeline without any visibility into the dataset itself. Then there will usually be some shitty procurement vendor who immediately wants to control access to both code and data to maintain their monopoly over the project.
@@godago I hate to be such a statistic, but I'm human, and gotta give in sometimes. And, I will validate your instincts, because I totally clicked on this because of that (skillfully maintained) mustache. And now I'm subbing, because I look for content like these topics anyway. So, props to @godago!
Insightful! Side note: there are some issues with the sound, possibly related to the editing.
If you substitute "machine learning" for "data scientist" or "big data engineer" you could be talking about the last 10 years data/ml hypes. Welcome to the industry
Great content. Too bad about the audio engineering. Hopefully you can make sure that never happens in future videos. Were you using a noise gate (set poorly)? Or mu-law compression over the transmission path?
Good interview. Bad audio. There are full words that are missing in the audio. Yes, I turned on CC, but in this day and age, it should not be difficult to get something that is completely audible.
It seems the way ML is taught would require not just a shift but more business-oriented elements in the curriculum, dealing with its application. That wouldn't mean dumbing it down, but rather increased time requirements for new entrants into the industry. Realistically, it will be opportunists from other disciplines (mostly STEM) who have a much easier time joining.
What you'd have to offer as a new entrant (CS knowledge, focus on ML, understanding of the infrastructure and operations side) is more readily available from the already existing pool of CS applicants and jobless STEM people.
So the best move for people who have absolutely no clue is to just start actually doing the thing itself and jump naked into the market. Otherwise they're looking at 8+ years of education and then junior experience, minimum.
Hey, nice video. As an AI enthusiast, I became a data scientist right after graduating with a CS degree. However, I found the work to be simple tree modeling and drudgery, so I transitioned to backend development to improve myself further. Despite this, I'm still curious about AI. As a backend engineer, will my AI skills be valuable in the future? Or am I wasting my time in backend?
00:01 Expectations vs. Reality in the AI Job Market
01:48 The AI job market boom was in 2022 and declined in 2023
03:34 The AI job market requires skills beyond theoretical knowledge
05:24 Machine learning engineers need to focus on infrastructure and monitoring models
06:58 Shift in demand towards practical skills for the AI job market
08:44 Deep learning models are widely applicable and increase productivity
10:36 Importance of continuous learning and self-improvement in the AI field
12:14 He shares personal experience with prompt engineering
You are conflating the general hiring downtrend with the peak of 2022. It was COVID, and we are still dealing with it. I'm a CAIO and ML engineer. The graph maps directly onto programmer hiring.
Such a great point about academia dropping the ball. Students most often get hired by businesses that need solutions that work for them and their processes. If you don't know how to understand the processes and needs of businesses, because you were never taught this, it's a huge gap.
This is classic complaining that university is not a bootcamp, when in fact nobody expects a fresh graduate to be able to introduce data science into a company by themselves. In fact, even computer science graduates don't really have the skills to do that.
What is it like, though, for people with a PhD? Of course not everyone with a PhD will end up at OpenAI, but do you have an idea of where most of them work?
I'm a senior generative AI engineer at a FAANG, with a PhD in theoretical physics finished in 2021, and not top-notch programming/engineering skills (20 LeetCode problems total to my name, none of which I solved on my own). What I do have: the nitty-gritty details of backpropagation, probability, statistics, statistical tests, metrics, broad training techniques, the ability to read papers, and an end-to-end understanding of model training and productionization.
You are describing an entire product application under the assumption that AI engineers/data scientists have the skill to solve it without other roles involved. Biased data engineer here: if you can't effectively get and clean the data at scale, then deliver it to applications on time, these models are useless. Too much hype around one job function, for me.
More businesses should interview the IT staff of universities as well as the graduates to find the weak points. I remember during my PhD, the IT people who ran the high-performance computing center came to our department to give a special lecture about why we were creating so many files; it was basically a lack of database knowledge. Someone in my research group was creating thousands of directories with over 1M individual files. "I want a different file for each subset of data, so what?" basically, haha.
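The usual fix for the million-tiny-files pattern described above is a single database file. A minimal sketch with Python's built-in sqlite3 (the table and values are made up for illustration; an in-memory database stands in for a real file like "results.db"):

```python
import sqlite3

# One database file replaces thousands of per-subset files.
conn = sqlite3.connect(":memory:")  # in practice: sqlite3.connect("results.db")
conn.execute("CREATE TABLE results (subset TEXT, step INTEGER, value REAL)")

# Each row replaces what would have been one tiny file on disk.
rows = [("run_a", s, s * 0.1) for s in range(3)] + [("run_b", s, s * 0.2) for s in range(3)]
conn.executemany("INSERT INTO results VALUES (?, ?, ?)", rows)
conn.commit()

# Querying one subset replaces globbing and opening one directory of files.
got = conn.execute(
    "SELECT step, value FROM results WHERE subset = ? ORDER BY step", ("run_a",)
).fetchall()
print(got)  # [(0, 0.0), (1, 0.1), (2, 0.2)]
```

The filesystem stays clean, the HPC metadata servers stay happy, and "a different file for each subset" becomes a `WHERE` clause.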
$925,000 a year sounds great, but how much of that money is he actually going to keep? $500k per year? Don't get me wrong, that's still a very good salary, but taxes are a thing.
Correct observation! There are also other benefits baked into the final offer that depend on multiple factors. And a big one is living costs, which would cut a big chunk out of that money. But you are also right: even with all things considered, it is still a big offer on the table, and competition for talent is fierce.
Dude, if you can't survive on $500k you don't deserve to get paid that much. You could get an apartment for $10k per month and still live off that.
What everyone still needs, hires, and pays about a million a year for, but nobody talks about, is DevOps/MLOps, data engineering, the good old IT. You simply cannot exist as a competitive medium-sized business without good infrastructure, and it not only needs to be updated every few years, it also must be maintained. ML engineers and data scientists are a dime a dozen; infrastructure and system architecture you don't learn at school.
This is why I didn't bother trying to teach myself AI/ML or any data science concepts. I don't really see a huge benefit to my employment from learning this stuff over the conventional engineering stuff I'm doing now, because most of these AI tools and products are useless vaporware IMO. I haven't really seen a lot of useful applications of it, and the useful ones don't seem very profitable when you assess the cost of building and maintaining models like these. At the end of the day it's always money in, money out for a business.
Prompt engineering is dead. o1 gets worse if you add prompting patterns versus just talking to it naturally. All the chain-of-thought and top patterns are already built into the system, so adding them on top seems to actually make the results worse. I've been saying this since the term "prompt engineering" started gaining popularity: prompt engineering will just become "communication skills."
Good DevOps engineers are worth their weight in gold and don't need any additional glorification. That's why they're still being hired by the bucketload while so many theory-heavy, practice-light ML engineers are falling by the wayside.
Hmm... no problem was stated here. Normal scenario: you create a decent model, you put it in production, and of course there will be problems. To me you guys sound like little kids complaining about harsh life or stupid management, even though you may be educated, especially at the end 🤦 Well, I explored ChatGPT in 2022 and have already created many things.
Tanner explains the reality of being in the technology field very well. Timing is everything. You are hired because you have the greatest potential to create value for the company, which justifies extreme pay. Only the top 1% get those tech jobs, or the very lucky, but expectations are always high. Cutting-edge tech jobs are at minimum 60 to 70 hours a week if you include time spent maintaining skills. Elon Musk led the way with tech job cuts at Twitter. Twitter had a tough transition but is doing well with about a third of the previous staff.
Really well said!
And a fraction of its revenue 😂
Twitter is about to go bankrupt.
@@realnapster1522 Based on what datasets?
As a free speech platform, it should never go bankrupt.
Never try to time the market, whether jobs, investments, dating, or otherwise. Do it because you like it; otherwise you will get burned.
Couldn't agree more. I am an "AI Engineer" on paper, but more than half of my work is Infra/AIOps. I'd say it's very rare to work as a model developer in this market. Amazing video btw! :)
So awesome to hear that your experience matches! Thanks for sharing this comment!!
Agree. And I think most AI jobs revolve around operations and infra, setting up model/data pipelines.
AIOps/Infra: are there enough resources on the Internet to learn it? 😢 Or is it so complicated and sophisticated that it must be learned in a real working environment?
I mean, when companies are hiring AI engineers, the number of papers matters; it seems that in the hiring process they are still concerned first with candidates' academic ability, not the practical stuff.
Is this because that stuff is hard for students to learn at school or by themselves? Will the company train you from zero? 😢
@@minorikushieda2733 Bro, just master DevOps + data science (exercise your concepts on Kaggle) + ML engineering stuff (how to build apps using LLM APIs: LangChain and such). That's all.
Unless you're working for a handful of companies at the forefront of AI research, or embedded within a university research group as a postdoc specialist, that's essentially what being an AI engineer means. Schools do a disservice by selling the myth that jobs exist where you do abstract modeling then hand over to some implementation team to actually make it work in the real world.
I think companies either hire software engineers with a background in ML to implement their models, as Tanner says, or PhDs who do the training, etc. of the models. Someone who "only" has a Master's but also doesn't have any significant software engineering experience is in a bit of an awkward middle: not practical enough for an implementation role but not theoretical enough for a training/research role. That's probably the issue that Tanner is facing.
I'm somewhat in that awkward middle class you mention, although it is typically enough for me to get AI internship offers. This enabled me to get relevant experience, and I've been moving toward GPU computing. Just my experience, but I think internships are a great route for those stuck in the middle.
@@texasgoat2991 Same on my end: not enough to get a job but enough to get an internship at a mid- to large-sized company, which hopefully should increase the chances of a job later on!
This is very true.
I got into the AI job market at the end of 2022. I deliberately looked for a job at startups, because I thought that if I got a job at a big company I would handle one part of the pipeline, and I wanted to learn the whole process, from data engineering right through to implementing it in a production environment. So far I have developed processes for every stage and am now testing the full pipeline for some parts of the project. I am the only person on the machine learning team, so I have had to work out how to design each step myself with only the internet and the knowledge I got from a coding boot camp.
Fantastic, good for you! Mind telling us the company name?
So am I :) cheers from brazil!
Yeah dude. Same here. Built whole AI systems for our startup, broke production, rebuilt it, deployed large models on different cloud services, did testing...all by myself. Startup teaches you a lot.
Good luck
Damn. Can you share some tips? Any stuff that's a waste of time that I shouldn't focus on while learning ML?
turns out our asian parents were right all along: become a doctor
Physician degrees are gatekept; only X amount are allowed to graduate each year. That's the only reason the field is so stable. Honestly, they should do it for every job field to prevent people from going into markets that can't support them.
@@Zuranthus False, there is a limited amount of space to get in. If that's what you're referring to, then yes, but no, they don't gatekeep how many students can graduate. SWE cannot be gatekept because you can learn it without school.
@@redkoadplayer7289 residencies are limited
"44,853 applicants and 41,503 certified positions across 6,395 residency training programs."
There are doctors who never get into a residency program their entire career.
General physicians will be replaced with LLMs trained on medicine.
@@SahilP2648 I'm just going to guess you've never seen the inside of an EMR if you're saying goofiness like this lol
What companies want from their ML engineers is to show them how to add value to the business. Either one needs to understand the business problems or partner with a team that knows those problems.
You are right: you need to show up with solutions to real problems, and it takes years in the business to develop those skills. It's only going to get worse.
@@rockpadstudios Companies are growing dumb. Companies want AI but they don't know why!
This is why I encourage people to not stick with companies and find ways to build their own thing.
One issue I am observing is that lots of companies skipped over the conventional statistics that could help them and just jumped into AI to brute-force solutions that aren't going to work. They were sold a lie that it could solve anything; they will get disappointing results and be averse to hiring for years to come.
Could not agree more!
Companies are just pandering to their shareholders.
The AI concept is very hot now; retail investors and funds are chasing it insanely, even though they don't know anything about tech.
OK, say you run a company and you want your share price to go high. Of course you want your business to become associated with AI; then that blind money will smell it and rush in to buy your shares.
However, I don't think this will have much effect on the job market. These companies will not hire many people; they are just pretending, so there's no need to do that much.
Can you please expand on this for someone not familiar with the subject? I was able to build models with both traditional and AI approaches using the same limited training set (100 data points). Why should I have continued with traditional statistics if the AI one worked well?
@Nuhopoclik1 I'm not saying it is always bad. But let's say you are an economist and want to model inflation trends. Doing conventional time series analysis, you might test your data for unit roots, autocorrelation, stationarity, etc. Are there even relationships worth modelling? With AI you might be tempted to just throw as much data at it as you have, when the problem might not even be modelable at all. So to do accurate modelling that you will eventually have to explain to executives, you do have to go through those steps, and I don't feel that is happening in a lot of spaces. It's just AI, black box, more data.
Because if you can get good enough results using simpler techniques, it's better to use them. This is Occam's razor. Plus, don't forget about the interpretability of your models. If your models are not interpretable and you build your management system on them regardless, targeting advertising clients or loan applications, then your business eventually becomes unmanageable. Especially if you have several models in the pipeline.
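To make the "test before you model" point from this thread concrete, here is a rough numpy-only sketch of one such sanity check. This is not a proper unit-root test (statsmodels has real ADF tests for that); it just estimates the lag-1 autoregressive coefficient, which drifts toward 1.0 when the series is a random walk, i.e. when naive trend modelling is likely meaningless.

```python
import numpy as np

def ar1_coefficient(series):
    """Least-squares estimate of the lag-1 autoregressive coefficient.
    A value near 1.0 hints at a unit root (non-stationary series)."""
    y, y_lag = series[1:], series[:-1]
    return float(np.dot(y_lag, y) / np.dot(y_lag, y_lag))

rng = np.random.default_rng(0)
noise = rng.normal(size=2000)

# A stationary AR(1) process: x_t = 0.5 * x_{t-1} + e_t
stationary = np.zeros(2000)
for t in range(1, 2000):
    stationary[t] = 0.5 * stationary[t - 1] + noise[t]

# A random walk (unit root): x_t = x_{t-1} + e_t
random_walk = np.cumsum(noise)

print(round(ar1_coefficient(stationary), 2))   # typically near 0.5
print(round(ar1_coefficient(random_walk), 2))  # typically near 1.0
```

The point of the commenter stands: a check like this takes a few lines, and skipping it to "just throw data at AI" risks fitting a model to a process that has no stable structure to learn.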
I finished my master's in AI a year ago, and I still can't find a job. Why? Because my area is not LLMs, and as was said in this video, all the jobs are for seniors.
Sorry to hear that... but it sounds accurate. And it's also sad that people ignore every other aspect of AI or deep learning. Hope we soon see people relax more and explore more.
@MarcAyouni try it at Palantir Technologies
Going to be blunt: you should look for the problem within yourself. Why did you study what you studied if the market doesn't want it? Are you going into industry or do you want to be an educator? Think hard and plan your path with purpose from now on instead of blindly going into a field just because you like it.
@@weiguohao3128 Interest in specializing only in LLMs won't last long.
Well yeah, that's because a degree doesn't automatically qualify you for a tech role. You need skills. Most people who have these jobs don't have a degree, because skills are what get you hired, not degrees; all a degree does is prove you can do homework and pass exams.
The only people who have no trouble finding roles are those who understand that skills are what matters.
I have multiple degrees in engineering and business plus a graduate degree in AI, and I have worked both in research and in industry. AI has multiple sub-fields. I'm always surprised when someone studies ML by itself and considers it enough to breeze through the industry; it's not enough. You either need to be a domain expert in some other field or at the very least have DevOps skills, because most companies do not have large teams while they are still assessing the feasibility and business value of AI. Having a team without domain knowledge of the problems they are working on increases the likelihood of those projects failing or not attaining the best business value. It's perhaps not easy to earn multiple degrees and work experience from various industries, but if you can invest the time it will always pay off. When I did multiple degrees I just didn't want to be stuck in one field, and when I studied AI it wasn't such a hype, but everything changed so fast, and now companies are scrambling and people are jumping on the AI bandwagon without a proper strategy. Take your time and do some proper research so you don't waste it.
Are you familiar with DevOps? What is your expertise?
AI expert showing off his tech skills by providing perfect microphone / audio quality
The misconception you are referring to is actually engineered by AI solution/consultancy sellers. Step 1: build the hype around AI. Step 2: get the contracts with clients. Step 3: figure out whether we can really do something useful while we burn the client's money.
Becoming a better prompt engineer is, for the most part, a waste of time. I've noticed GPT-4 has been getting better in the background, and of course their whole goal is for the system to respond usefully without slinging special prompting tricks lol. For example, GPT-4 used to completely rewrite my code and cut parts out, whereas GPT-4o initially would just repeat the code back to me. Recently, though, GPT-4o can handle 300+ lines of Python and preserve my original program while implementing the prompted tweaks. This all happened without any announcements.
It still hallucinates constantly; I would never trust an LLM not to fuck up 300 lines of code.
I've also noticed that the ability to consume and produce higher token volumes has made responses less concise and digestible; combined with hallucinations, this means a solid engineer is often just better off using Google and Stack Overflow. Especially if you work for a company whose code you can't just hand over to OpenAI. Often, if you Google keywords similar to what your prompt would have been, you find that the data these LLMs are synthesizing is nearly always on the first page of Google results.
The hype cycle got really out of hand and it is time for it to die off
AI engineering is not at all the same as prompt engineering. Prompt engineering is a fancy way of saying typing effectively to LLMs. AI engineering involves knowledge of how neural networks are created: building vision models and other transformers, optimizing them, and so on.
That is then combined with business process mapping to know what and how to automate different processes in whatever industry you end up in.
Vastly different skills.
Yeah. I see prompt engineering as a similar activity to SEO, i.e., trying to "game the algorithm/model". But SEO is working in an environment with one very dominant and relatively stable search algorithm, and with a very clear relationship to a revenue stream (more clicks, more sales, etc.). Prompt engineering is doing the same but in a landscape with many different LLMs that are changing relatively quickly (appreciable differences in most models every 6mo to 1yr), and in most cases the revenue streams have not been proven yet. And it seems like it would be extremely frustrating work.
@@mike200017 It's not exactly comparable to SEO. There are research papers on things like SoT and other mechanisms that improve accuracy for LLMs. The models aren't incredibly different from each other, and you don't need to chase all of them; just stick to the best ones, which so far keep being ChatGPT-4o, or the Gemini and Claude upgrades.
The results you are trying to get with LLMs are real-time, and the methods for accuracy are known. So it's really more like: the better your ability to model processes, map them in your mind, and express them in writing as a step-by-step operating procedure, the better your results.
That's just one part. You're also forgetting that LLMs are essentially NLP systems: they process language into commands, and those commands can be linked to functions, like automating workflows. However, the LLM needs access to vision models or backend code, with each button and field labeled, in order for it to become the universal action model we are all seeking.
I second that the boom was in 2022. My gov lab was even peddling AI summer classes, which they'd never done before. Then 2023 was the contracts-no-renewal year lol.
Also in 2023, I vividly remember how nobody wanted to admit they were using ChatGPT to help write papers.
I worked many years writing "classical algorithms" next to a research team working on ML/AI approaches to the same problems. They worked overtime and were constantly swamped with issues trying to keep their massive data collection and training infrastructure up and running and it was moving forward at a snail's pace. I think they never quite understood (or were in denial of) why I would give them such a bewildered look whenever they were bragging about all the things ML/AI could do without needing to code anything. I'm not pooping on ML, but I think we are still in the early days, a lot of maturing of technology and experimentation still has to happen.
A better way to define AI hallucination is that the model is responding to a prompt incorrectly. Trying to figure out why a model is predicting incorrectly is the challenging part. Sometimes we can't even tell that the prediction is wrong.
The sad reality happened in 1994, Eternal September.
Then during blitzscaling, when templated and untyped languages became mainstream, and to suppress wages, technicians started being called "developers" and "engineers", which they are absolutely not.
Specifically to address the raised questions:
* it's in 99% of cases (like this one) a misnomer; "ML engineers" like this are neither "ML" nor "engineers". The best name I've seen is "data scientist", but a more honest one would be "data cleanser"
* "prompt engineering" is nonsense; I'm not even going to address it
* math is required, unless you want to be relegated to lower roles or underperforming teams
* CS is required, same
* those who join the industry for money usually have no aptitude for either, hence they mostly resort to cargo culting and end up disposed of once a surge of low-complexity manual work is completed.
Now you may cry "elitist" all you want, but the era of cynically catering to amateurs to boost low quality growth has left a deep scar in the industry, so don't expect any consolation prizes. Just look at Boeing or Intel.
AI is not just AI. Some can do what others have done before, some can do what no one has done before. There is a price difference between the two.
None can do what others have not done before
Developer vs researcher
Kudos to you Goda, what an incredible channel you have built here. Congratulations!!!
Thank you so so much!
Honestly, I always assumed that ML engineers were full-stack devs with a specialization in ML. I had no idea this area of study did not include general software engineering. Personally, I'm a lifelong developer who did a couple of courses to pick up ML and understand how to use it within my projects where necessary. But I always felt like maybe there was something special I was missing out on by not having studied it at college. Either way, just leveraging an LLM has become insanely easy; I simply construct prompts within my app that get submitted.
True. I have met many professors and PhD students in ML who cannot code other than the Python needed to train their models
When he is finished with his PhD in a year, the AI hype bubble will be gone 😂
I've had a 30-year career, beginning with COBOL and dBase. I work on and with AI every day. I've been deeply depressed the last 3 years watching my industry rip itself apart *again* because of hype driven by horrid SV VC scumbags selling AI. I'm really pleased to see everyone waking up. Make no mistake: when CI/CD pipelines, dev tools, etc. are leveraged with generative AI by the next generation of developers, productivity will explode. My hope is that highly secure local deployment environments will emerge, with AI-connected data lakes and a full stack of standard dev tools backed by LLMs fine-tuned on configuration management, security, and the like. Then every type of business software can be constructed locally by devs using plugin components, with the entire stack managed in the background.
That means MORE developers, not fewer.
Machine learning engineer @ 2024 = software engineer + data engineer + LLM / RAG
Software engineer - fullstack application
Data Engineer - build data pipeline
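The "LLM / RAG" piece of that equation can be sketched in plain Python. This is a toy illustration only, not any particular product's pipeline: a real system would use embedding models and a vector store instead of word-count vectors, and would hand the assembled prompt to an actual LLM API. All names here (`retrieve`, `build_prompt`, the sample documents) are invented for the example.

```python
from collections import Counter
import math

# Toy document store; a real pipeline would load and index far more data.
DOCS = [
    "Refunds are processed within 5 business days.",
    "Our support line is open Monday through Friday.",
    "Premium plans include priority GPU access.",
]

def tf_vector(text):
    """Bag-of-words term-frequency vector (stand-in for an embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = tf_vector(query)
    return sorted(docs, key=lambda d: cosine(q, tf_vector(d)), reverse=True)[:k]

def build_prompt(query, docs):
    """Stuff the retrieved context into the prompt handed to the LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How long do refunds take?", DOCS))
```

Even this toy version shows why the "software engineer + data engineer" halves of the equation dominate: the retrieval and prompt-assembly plumbing is ordinary engineering, and the model itself is just one call at the end.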
The sad reality is that most famous universities focus on research. The culture of prestige has screwed up the university system in the US, Canada, and the UK. I've been working in consulting for more than 15 years, and most graduates don't know practical engineering. Most graduates take 2 to 5 years to learn how to build, ship, and manage applications.
Same issue here. I tried to find an AI job last winter. Tbh I got 1 or 2 offers, but not the ones I wanted. So I decided to go the senior SWE route and become a tech lead instead.
I've always seen AI in the broader context of process automation. I used to freelance as well and developed customized tooling to support work processes.
So yeah, as said in the video, you need to be a good software engineer and have AI skills on top nowadays. For me, AI is just another tool in my toolbox to develop cooler automations.
I cannot tell you how many times I've joined some CSVs with pandas and cleaned the data for some quick insights, etc. Practical data-mining skills are underestimated, IMO.
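For anyone wondering what "joined some CSVs with pandas and cleaned the data" looks like in practice, here is a minimal sketch. The column names and values are made up for illustration; in a real job the two frames would come from `pd.read_csv(path)` on actual files.

```python
import pandas as pd
from io import StringIO

# Two toy "CSV files": one with an order missing its amount,
# and one customer (103) missing from the lookup table.
orders = pd.read_csv(StringIO("order_id,customer_id,amount\n1,101,50\n2,102,\n3,103,75"))
customers = pd.read_csv(StringIO("customer_id,region\n101,EU\n102,US"))

# Left join keeps orders even when the customer record is missing.
df = orders.merge(customers, on="customer_id", how="left")
df["amount"] = df["amount"].fillna(0)          # patch missing values
df["region"] = df["region"].fillna("unknown")  # flag rows that failed to join

print(df.groupby("region")["amount"].sum())
```

The quick insight here is exactly the kind of thing the comment describes: a join, two cleanup lines, and a groupby tell you where the revenue sits and where your reference data has holes.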
Love this video. One of those rare ones that remind you of the harsh reality of jobs in this field. Of course, in any field, landing a lucrative job is not easy, as some have mentioned in the comments, with the key factor being whether you are in the right place at the right time. However, this video is all the more pertinent now, considering the mad rush toward anything AI/ML-related and the danger that jobs in this field may be over-hyped.
I think this was a nice talk about the gap between what companies think it takes to implement something and how it actually needs to be done and integrated. The part I can relate to is that, more often than not, these projects are only completable with a good data engineering team that plans and implements how the data is extracted, transformed, and stored for the ML/AI engineers to actually feed the models.
It always boils down to demand plus learning curve/barrier to entry. If a machine can do everything and your contribution is almost zero (meaning zero learning curve), then income = zero. If the learning curve or barrier to entry is high, like solving hard maths problems, but there is no demand, then people would rather solve it the easier, more economical way.
Degrees matter. So I advise people with a non-IT/math background to stop jumping into this sinkhole. In today's market, AI is a tool; it is no longer a field of knowledge. It is better to do something else that makes use of AI, unless you're in the top 0.1% who invent the wheel.
Very well said! And there is a lot of opportunity in applying what we currently have. We've seen how much easier it is to work with companies once they actually understand what it does and, most importantly, what the limitations are, so that you can build strategies that scale and extract value today while staying flexible for tomorrow.
If one does have a math background (i.e. a bachelor's in mathematics), would this still be a good area to go into? Is there room for people who want to work more on the abstract side of things?
@@sirnonapplicable if you want to work on that side, get a theoretical degree
@@therealjezzyc6209 A bachelor's in mathematics or computer science is good to go. But maybe you'll want to get a master's after that.
Literally no one has passion now. A student already thinking about retiring from tech after earning a million dollars? Bro, graduate first and get a damn job.
Guys, as someone starting out, my plan was to read theory for one more month, then try FastAPI blogs, then Karpathy, then implement any papers I like and can understand, by trial and error. The message I'm getting here is to focus on software engineering and get to building projects right away. Is that right?
If you were to hire me, would you rather see in my GitHub how many papers I've implemented or how many actual real-world problems I've solved using AI? Or both, maybe?
Advice in this regard would really help!
Just my opinion: I'd pick the latter if you want to get hired easily. Of course, it's a big plus if you understand the architectures underlying the models, and implementing them will help in that sense. But unless you want to be a researcher (and it's super hard to get a researcher job, as they mostly require a PhD), the latter option (building projects + familiarity with the AI architectures) will likely help you get a job (relatively) easily.
@@anonymous-random ok! getting hands on asap. what do u do research or applied?
@@swapnilchand338 Well, hard to say. I'd say applied? But not really software engineering either; it could be called "Research Engineer". My main job is not writing papers, i.e., working on novel algorithms. I mostly work on improving foundation models (such as LLMs, Stable Diffusion, etc.). Our team modifies/improves open-source models to meet customers' (usually B2B, not B2C) requirements. As you know, the open-source models themselves are not always adequate for business use without modification. Of course, those improved models can sometimes be really novel algorithms; if so, we write papers on them.
As of now, I'm working on video generation: T2V, V2V, I2V stuff.
So I'm not really a software engineer or a research scientist; I prefer to say research engineer. But, as you know, these terms aren't strictly defined, so companies might use them in different ways.
You might already know this, but one way to find out what you need to get hired is to search for job positions you're interested in and see what skills or experience they require.
@@anonymous-random What has your background been? Say, x years of software engineering at big tech, y years at ML startups, z years in a master's? Can we connect on any socials?
Seems AI/ML people also aren't good at software engineering but sure can pretend to know PyTorch... I have a master's in the subject, but I'm also a trained software engineer. You can't just be one thing in today's world.
ML is a super broad area, and it's really hard to teach it at the MS level unless the program restricts intake to math undergrads or has a very tough math qualifier covering calculus, advanced stats, information theory, advanced linear algebra, and convex optimization. Almost no school has such strict entry criteria. I am doing my Master's in ML and AI at a top-tier school, and my perception is that they don't teach enough theory for you to come up with your own algorithms on the fly, thereby leaving you unqualified to be an ML algorithm designer. You will know enough to apply standard algorithms, but frankly that's not very tough even without a CS degree of any sort.
As for teaching practical CS skills, it's hard to teach that fully at school; that's the job of industry. The schools do teach you the basics through assignments, to the point that you understand basic Python libraries such as NumPy, pandas, scikit-learn, and PyTorch, but those are kind of toy examples. To work through a full real-life example, including interventions, you would have to carve out a full year doing a real-life internship. But isn't that literally what a job is?
So basically an MS by itself gets you to mastery of neither theory nor practical CS application. I don't think the characterization that schools are focused on just preparing you for PhDs is accurate.
The issue is that real life has no easy arbitrage opportunities where you can just do an MS, never focus on filling the gaps in what you don't know, and still expect to earn a million-dollar salary as an entitlement. Never happens!
The conversation is quite interesting. If I may, I think it is important to differentiate research roles from other types of roles, even in industry. Big, serious, strong companies are mostly interested in research scientists, because those are the people who create novel things; as a result, there is less and less need for people who cannot do the research but can only do the engineering part. Universities and institutions know that too, which is why they divide their master's programs into two self-explanatory paths: research-based and course-based. The fact that the research path is the more important of the two is not stressed enough; unfortunately, very few people are on it compared to the alternative, creating the discrepancies that we see in the roles. Also, for most of those AI roles a PhD is going to be preferred over any other degree, and even before joining a PhD it is assumed (and some professors test it) that you are an engineer.
Very interesting observation! Unfortunately, at the end of the day economics plays a big role too: industry not only offers higher pay but also support with compute, whereas academia lacks those resources all around. And thus we see industry, meaning corporations, eating up the talent. However, at a corporate job your research is constrained to the goals of the corporation, so the engineering part plays a big role too. It's a bit sad to see, because we would like to have more independent researchers advancing the field without commercial incentives. But where money goes, attention follows.
I wouldn't discredit engineering in favour of research. Someone's got to build the data engineering bits and that's often what academics know very little about. Every researcher needs at least a few engineers to support their infrastructure.
@@bla7091 Yes, but if you have a few engineers and one researcher, you want that one researcher to be advanced. I studied ML myself, and I must say that the level taught in a master's degree, while interesting (and nowadays probably a core subject for undergrad CS students too), is such that once you've got what it's doing, you can throw away the math in application. Any sub-par engineer (without even advanced coding knowledge) is capable of whacking up a few lines of Python that run a prediction; ChatGPT can do that now. There is no point hiring a junior ML person.
Wait a doggone minute… the most interesting part of the video was in the last 20 seconds of the video!!
(In my opinion)
This comment made my day! Thank you!
Excellent interview. As a current MS student in AI, I am wondering whether I need an AI PhD or not... it doesn't seem like the best use of my time.
2023 had a lot of layoffs because of economic factors that affected the US job market. So it might not be very accurate to judge demand for AI experts in the job market in isolation.
Totally agree!
In many large companies, the problem is they don't even understand data science as a concept. They have a data science project but hire a database consultant. There isn't even a DevOps-deployed development environment with actual dataset access. The dataset is safely locked away in a production-only server environment, deliberately designed to protect it from any meddling software engineers or data scientists. The project is treated as a pure data-engineering task: build a data pipeline without any visibility into the dataset itself. Then there will usually be some shitty procurement vendor who immediately wants to control access to both code and data to maintain their monopoly over the project.
It's just been hard to get a job since 2023 in all developer roles.
Thanks for this video; let me know when you have a class for ‘integration engineers’.
P.S. Will say hi on LinkedIn 😊
amazing mustache !
If Tanner on the thumbnail won't make this video CTR go boom, I don't know what can :D
@@godago lmaooo
@@godago I hate to be such a statistic, but I'm human, and gotta give in sometimes. And, I will validate your instincts, because I totally clicked on this because of that (skillfully maintained) mustache. And now I'm subbing, because I look for content like these topics anyway. So, props to @godago!
Insightful! Side note: there are some issues with the sound, possibly related to the editing.
If you swap "machine learning" for "data scientist" or "big data engineer", you could be talking about any of the last 10 years of data/ML hype.
Welcome to the industry
Great content. Too bad on the audio engineering. Hopefully you make sure that never happens in future videos. Were you using a noise gate (set poorly)? Or mu-law compression over the transmission path?
They fcking had to shift this education thing right when I graduated 😭 so now you are telling me I have to do a PhD to be employed 😭😭
Please reupload the podcast again as the audio is bad @Goda Go
I know... Unfortunately, it was an issue with the mic, not an issue with exporting. ;/
Good interview. Bad audio. There are full words that are missing in the audio. Yes, I turned on CC, but in this day and age, it should not be difficult to get something that is completely audible.
Audio cooked
I know :(
Great video. I recruit for AI and have a start up client in NYC looking for top CS degree, full stack dev exp w/interest or some knowledge of AI.
It seems like the way ML is taught would require not just a shift but more business-oriented elements in the curriculum, dealing with its application. That wouldn't mean dumbing it down, but rather increased time requirements for new entrants into the industry. Realistically, it will be opportunists from other disciplines (mostly STEM) who have a much easier time joining.
What you'd have to offer as a new entrant (CS knowledge, a focus on ML, an understanding of the infrastructure and operations side of things) will be more readily available from the already existing pool of CS applicants and jobless STEM people.
So the best approach for people who have absolutely no clue is to just start doing the thing itself and jump naked into the market. Otherwise they're looking at a minimum of 8+ years of education plus junior experience.
Audio is really bad on this, had to quit. Might as well just make a blog post if it needs the captions that bad.
Fully understand! And it’s a great idea. The conversation was just so good that I decided to take the risk :)
Hey, nice video. As an AI enthusiast, I became a data scientist right after graduating with a CS degree. However, I found the work to be simple tree modeling and drudgery, so I transitioned to backend development to improve myself further. Despite this, I'm still curious about AI. As a backend engineer, will my AI skills be valuable in the future? Or am I wasting my time in backend?
I don't think anyone can say for sure. But backend has a lot of moving parts
@@niamhleeson3522 thanks for the answer
If you like backend, look at getting a certification with Snowflake and then you can work on massive data projects as a contractor or employee.
@@donventura2116 thanks for the advice
@@niamhleeson3522 I hope so
Oh shit, I went to McGill as well, working as an MLE. Been meaning to apply to MILA!
Thanks so much for this enlightening feedback!
The new problem is how to handle the amount of electricity AI uses, its GPU demand, plus the waste heat. We need physics engineers.
Exactly right!
Every time I see an icon-less account leave a massive comment, I think it's a bot. This is the true state of the AI field.
00:01 Expectations vs. Reality in AI Job Market
01:48 AI job market boom was in 2022 and saw a decline in 2023
03:34 AI job market requires skills beyond theoretical knowledge
05:24 Machine learning engineers need to focus on infrastructure and monitoring models.
06:58 Shift in demand towards practical skills for AI job market
08:44 Deep learning models are widely applicable and increase productivity
10:36 Importance of continuous learning and self-improvement in AI field
12:14 He shares his personal experience with prompt engineering
Awesome! Thanks!
9:02 Cat
You are conflating the general hiring downtrend with the peak of 2022. It was Covid, and we are still dealing with it.
I'm a CAIO and ML engineer. The graph maps directly onto programmer hires.
So is it better now or worse?
@@lostwanderer8693still bad
Very informative - viewing time well spent! (new subscriber)
So glad it was valuable! And happy to have you here! A second part of this conversation is coming out soon!
Very nice and informative!🎉
Such a great point about academia dropping the ball. Students most often get hired by businesses that need solutions that work for them and their processes. If you don't know how to understand the processes and needs of businesses because you were never taught this, it's a huge gap.
This world is finished. I'd rather have not been born.
Rite, fk this shaaat!!!
Exactly,
Colin Farrel is now a ML engineer?
:D
This is classic complaining that university is not a bootcamp, when in fact nobody expects a fresh graduate to be able to introduce data science into a company by themselves. In fact, even computer science graduates don't really have the skills to do that.
the cuts make it unbearable to listen to
Pithy interview/discussion. Thank you very much.
Thank you for tuning in for this conversation ❤️
what is it like though for people with a Phd? Of course not everyone with a PhD will end up at openai but do you have an idea where most of them work on/ at?
I’m a senior generative AI engineer at a FAANG, PhD in theoretical physics was finished in 2021, and not top notch programming/engineering skills (20 leetcode problems total to my name, none I’ve been able to solve on my own).
I understand the nitty-gritty details of backpropagation, probability, statistics, statistical tests, metrics, and broad training techniques, and I have the ability to read papers and an end-to-end understanding of model training and productionization.
Really poor audio when Tanner was speaking.
It's really mostly all integration. A lot of the magic to be built is there.
Exactly!
It is timing. I will spend some of my free time building AI.
I sometimes like to watch videos at a higher playback speed, and the audio on this one is too bad for that.
What's with his microphone?
I know... we are getting a good mic to Tanner, airpods and recording software did not like each other ;)
why is the audio so ass
The software we filmed with failed us… :( We are getting Tanner a proper mic too now.
why the heck has this trend of overlay captions caught on? its annoying
Oh… I used them because the audio was damaged on Tanner's side, and I thought it would be easier to comprehend what he says with captions :)
You are describing an entire product application where the assumption is that AI engineers/data scientists have the skills to solve it without other roles involved. Biased data engineer here. If you can't effectively get and clean the data at scale, then deliver it to applications on time, these models are useless. Too much hype around one job function for me.
struggling with audio quality...
We are getting Tanner a good mic, but yes... the hardware failed us here.
Prompt engineering sounds like a painful chore on the verge of masochism.
It indeed can be :D
More businesses should interview the IT staff of the universities as well as the graduates to find out the weak points. I remember during my PhD, the IT people that ran the high-performance computing center came to our department to give us a special lecture about why we were creating too many files; it was basically a lack of database knowledge. Someone in my research group was creating thousands of directories with over 1M individual files. "I want a different file for each subset of data, so what?" basically, haha.
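The fix the commenter's HPC staff were hinting at can be sketched with Python's built-in sqlite3: instead of one file per data subset, keep subsets as rows in a single database file (the table and column names here are made up for illustration):

```python
import sqlite3

# One database instead of a million files (in-memory here for the demo;
# a real run would use a path like sqlite3.connect("results.db")).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (subset TEXT, value REAL)")

# Instead of writing subset_0000.txt, subset_0001.txt, ..., insert rows.
conn.executemany(
    "INSERT INTO results VALUES (?, ?)",
    [(f"subset_{i:04d}", i * 0.5) for i in range(1000)],
)
conn.commit()

# Any subset is then a query, not a filesystem walk.
rows = conn.execute(
    "SELECT value FROM results WHERE subset = ?", ("subset_0042",)
).fetchall()
print(rows)  # → [(21.0,)]
```

One file on disk, one open file handle, and the filesystem's inode limits stop being your problem.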
I'm living on the island now. It's really cool when you can make money from every place in the world. With trading it is a reality
$925,000 a year sounds great, but how much of that money is he actually going to get? $500k per year? Don't get me wrong, that's still a very good salary, but taxes are a thing.
Correct observation! There are also other benefits baked into the final offer that depend on multiple factors. And a big one is living costs, which would cut a big chunk of that money. But you are also right: even all things considered, it is still a big offer on the table, and competition for talent is fierce.
@@godago AI-generated comment?! It doesn't make much sense.
Dude, if you can't survive on 500k, you don't deserve to get paid that much. You could get an apartment for 10k per month and still live off that.
@godago Is this an AI response? If so, that is so disrespectful towards the audience; I will unsubscribe from the channel.
🤦♂️
What everyone needs, still hires, and pays about a million a year for, but nobody is talking about, is DevOps/MLOps, data engineering, the good old IT. You simply cannot exist as a competitive medium-sized business without good infrastructure, and it needs to be not only updated every few years but also maintained. ML engineers and data scientists are a dime a dozen; infrastructure and system architecture you don't learn at school.
Love it!!!!!!!
AI is a bubble
Now it's just waiting until investors get it
Sadly, I kind of agree. We see our clients extract a lot of value, but navigating to that point is not a straight road.
AI ppl are learning what it's like to be an infrastructure engineer lol. They didn't even touch on how you host/transfer data in regulated areas.
This is why I didn't bother trying to teach myself AI/ML or any data science concepts. I don't really see a huge benefit to my employment from learning this stuff over the conventional engineering I'm doing now, because most of these AI tools and products are useless vaporware imo. I haven't really seen a lot of useful applications of it, and the useful ones don't seem very profitable when you assess the cost of building and maintaining models like these. At the end of the day it's always money in, money out for a business.
the weird bg music had me confused
valuable, thank you, chin up
AI is dumb.....
Aren't companies like PLTR also eating these ML engineers' lunch?
Great video, but Tanner might've struggled with his job search because of his mic quality lol
You guys dug a hole for the other engineers
Prompt engineering is dead. o1 gets worse if you add prompting patterns versus just talking to it naturally. All the chain-of-thought and top patterns are already built into the system, so adding them seems to actually make the results worse. I've been saying this since the term "prompt engineering" started gaining popularity: prompt engineering will just become "communication skills".
The only ones that really need prompt engineering are those that build and train the actual LLM systems
Good news: Digitalism is killing capitalism. A novel perspective, first in the world! Where is capitalism going? Digitalism vs. Capitalism: The New Ecumenical World Order: The Dimensions of State in Digitalism by Veysel Batmaz is available for sale on Internet.
Everyone is hiring if you're an expert. No one wants a noob.
with this principle, no one should start anything or study anything :D
Haha well said@@godago
If you don't work on the models, you are just a glorified DevOps engineer fueled by the cash dumped into the AI hype.
Good devops engineers are worth their weight in gold and don't need any additional glorification. That's why they're still being hired by the bucket load while so many theory-heavy practice-light ML engineers are falling by the wayside.
Goda, Pardon me if you think I am being "fresh," but that dress you wore during his interview is very pretty and becoming of you.
Colin Farrell?🤔
He’d make so much from doing only fans omg
This video would be cool if it wasn’t just a long ad masquerading as insight
Ad for what?
Hmm... no problem was stated here. Normal scenario: you create a decent model, put it in production, and of course there will be problems. To me you guys sound like little kids complaining about harsh life or stupid management, even though you may be educated, especially at the end 🤦. Well, I explored ChatGPT in 2022 and have already created many things.
If an employer sees high resilience and agility of a model as something natural and expected by default, show them the middle finger.