I think software design patterns is one of the subjects that needs to be included here. Once I had this subject in university I worked on a project and actually got an opportunity to try out the things that I had learnt in the subject.
I think they're great to learn about, but personally I've just stumbled upon the "correct" way to design a system many times and didn't even know I could find the conceptual design described in some book. Point being, they can be learned rather quickly through practice, so they do not fall under the category of "never learned", but I'd agree that they're great to read about so you're at least aware of what you didn't know as a self-taught dev.
Systems design is something that usually comes with experience; most companies don't ask for it in junior roles, it's more of a staff or principal engineer thing. But you should definitely learn it at least at a basic level before an interview.
Speaking for myself (self-taught programmer over the last 45 years) I can't imagine being unaware of any of this. You can find textbooks on pretty much all of this material in any good library.
Two courses I was taught in university that aren't mentioned here (which were a pain in the ass back then) but are helpful now were "Theory of Automata" and "Compiler Construction". Interesting stuff. Knowing how a compiler works does make you a better programmer.
Being a self-taught developer with 15+ years of experience, to this very day I feel that in some areas I have much more knowledge than "academic" developers who have the same years of experience, yet in other areas, "academic" developers learn certain things in their first year that I still haven't learned properly to this day or struggle to understand, which feels very intimidating to me. Especially the algorithm challenges you get in interviews are something I never learned properly, and I struggle with them.
Linear algebra has a lot to do with physics, shaders, graphics and even encryption (see AES-GCM); calculus, of course, is about physics. What I can honestly say is that the most important thing I learned at university is teamwork, clean code, etc. That gives you a huge advantage as a junior to transition quickly to mid-level.
It’s kind of interesting that I had to learn Discrete Mathematics for my linguistics degree, especially when it came to set theory, set builder notation, and proofs using propositional logic. Also learned boolean algebra for my semantics courses.
There is a lot more in discrete mathematics, like counting and probability, recurrence relations and generating functions, graphs and trees and their applications (like scheduling), the pigeonhole principle, and so on.
Man when I saw the title I thought to myself "ugh this is probably gonna be one of those clickbaity titles where he'll tell us to learn how to write comments or something stupid" but man you actually delivered an amazing video. I'm always trying to improve myself but I have a lot of trouble understanding a lot of what people mean when I read documentation or articles, and there seems to be just a massive influx of "developers" posting videos on youtube that all seem to copy each other posting the exact same stuff and it's difficult sorting through everything that I already know. thanks.
I've been programming for 40 years, and much of my expertise was self taught. The topics that you cover are ALL things I've had to learn, and lots more, e.g. systems administration, large Hadoop style systems, networking, security... If you're to be successful as a programmer, you need to keep up with whatever's popular, which means you'll be continually reeducating yourself. The good news is that for many topics, one only needs to learn the jargon, as the new topic will closely resemble something you already know. "They rarely change what we do in computing, but they change what they call it" -- attributed to Grace Hopper.
Idea for your future tutorial videos: pick an important subject taught in universities and make it accessible for self taught devs through a series of tutorial videos.
Self taught devs don't need those things. Learning them is on a need-to-know basis. That said, any avid learner would explore topics outside of his sphere to have a feel for the scope of his discipline. I did; the issue is, most theoretical education suffers from ultra poor tuition - most books are unnecessarily convoluted beyond the complexity of the subject itself. Then, most lessons fail to show relevance to the learner, making the material doubly unattractive and useless to the autodidact.
I strongly disagree. Theoretical principles help people know what they are doing. Theory is unavoidable in many applications: if you work with images or sounds, you have to understand mathematical analysis and linear algebra; if you work with text (or other sequences) you need to know a bit of graphs and discrete math; if you work with the web, you need basic knowledge about threading. And if you program, you must know about the functional programming style, otherwise your code will probably be disastrous (mine definitely was).
@@Daniel_Zhu_a6f I partially get your point but you don't get mine. Schools teach you lots of theory, hoping you'll have an arsenal to tackle your field with. This is what I call knowledge warehousing. You may use some and never use others. So why fill your mind with so much junk? On the other hand, with lots of practice and trial and error, which is what self teaching will expose you to, you'd develop a logical mindset. Then with some theory, not much, you'd know how to design systems. Then with other projects out of your range, you can research and learn how to tackle them. E.g. if I'm to design sound systems, or graphics, I'd research and learn until I know enough to get the job done. Same with AI/machine learning, robotics. I'd be qualified through the rigorous training of building systems via the trial and error of self taught programming.

E.g. I work with texts and sequences and have never used graph theory. I do use lots of lists, sets etc. I've never used discrete maths explicitly, though I do programming. Over time, from experience, I've discovered some patterns with boolean operations/algebra. I've read up on discrete maths a bit but found it dry, irrelevant and redundant. Yeah, I filled out truth tables and all ad nauseam and wrote some math proofs, but it never came in useful when tackling a programming problem at all. When writing a calculator etc. I don't stop to think of computing boolean operations on a truth table! When writing an arithmetic parser, I don't start thinking of automata theory! I write the parser, it works, I'm done. At the end I've done automata theory without knowing it in theory. This won't fly for all cases but many times it will.

Self taughts approach problems differently. E.g. when trying to solve some numerical problem, I noticed that Java had no way of computing exponentials with non-integer, floating point, arguments and I badly needed this functionality, so I began researching ways to achieve my goal, which led me to looking for ready made solutions that could help (found no free one) and trying to understand the math behind fast exponential computation. Basically, learning as I needed to.
As an electrical engineer, the first language I learnt to program in was assembly. It did help with my understanding of what the actual hardware was up to when the computer was writing machine code etc.
This is an awesome video man. I work for a FANG company and I'm a self-taught programmer. My start in tech was as a network engineer and I transitioned to a developer. I was fortunate to find a mentor that told me some of the things you mentioned that I needed to learn. After learning about data structures, paradigms, and architecture, I felt languages like Python are a bad first language for folks. When I learned Golang, that's what really forced me to learn more about these very concepts you're talking about. This is super applicable to the folks reading; I didn't go to a computer science school and I have a few friends that haven't either, and I'm usually giving them similar advice on things to become better devs. Great video.
It really depends on what you're doing. When we are talking about graphics programming (especially 3d) then things like vectors and matrices are pretty much entry level to that field.
What I remember from my college level math classes: Calc 2 was a weed-out class at my school, as most students were coming in with college credit from high school. Linear Algebra wasn't difficult, just tedious from not being able to show multiple steps in a transformation.
I think the title should be “what they don’t teach you in a 20-hour online course”, as after 5 years of working as a software developer you need to know most of the subjects mentioned here. Perhaps the math part is not super necessary unless you’re doing 3D engines - in that case Linear Algebra becomes vital - or other math if you’re writing software for digital signal processing, compression or cryptography.
Thank you for this video. Found it very insightful. Most of the items you covered, I have run into at one point or another as a self-taught developer (been developing in one form or another since about 1982). My concern has always been that developers aren't learning the "low" level computing concepts (like CPU architecture and Operating systems), and now I can see why. I've bookmarked this video as I'm going to make it a point to learn more stuff based on what you've outlined.
One which I think even most college students never learn, unless they've taken a course on compiler design, is Finite State Automata. This is a shame, because they are remarkably useful in the design of several types of programs, ranging from search engines to TCP/IP stacks. Similarly with parsing, which again is mostly associated with compiler development but is useful in any number of other areas. Relational theory is another topic which you generally only get with a specialized course, which is often at the graduate level. Lambda calculus is rarely taught at all these days, which is something of a shame. As for operating system design, for self-taught programmers there is the OSDev forum and wiki, though there are several caveats about the accuracy and datedness of the latter and the vitriolic tone of the former.
I am a university student and our curriculum does have theory of computation as a subject, which goes deep enough to understand finite state machines, including non-deterministic ones. The course also covers regular languages and their applications in compiler design, including parsing.
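For readers who haven't met finite state automata, here is a minimal sketch of the idea in Python; the machine below is a toy example of my own (not from the video): a DFA that accepts binary strings containing an even number of 1s.

```python
# A tiny deterministic finite automaton (DFA) that accepts binary strings
# containing an even number of 1s. States and transitions are illustrative.

TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}
ACCEPTING = {"even"}


def accepts(string: str, start: str = "even") -> bool:
    state = start
    for symbol in string:
        state = TRANSITIONS[(state, symbol)]  # follow one transition per input symbol
    return state in ACCEPTING


print(accepts("1010"))  # True  (two 1s)
print(accepts("111"))   # False (three 1s)
```

Real uses (lexers, protocol state machines, regex engines) are the same shape, just with larger transition tables.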
I would say a lot of what a CS degree teaches you isn't directly applicable, but it allows you to understand topics at a deeper level and so will be able to solve more complex and fundamental problems.
As a self taught Developer (and self taught with english language too, sorry) I think your video is very useful for many of us because it gives ways to improve. A self taught learner chooses what to learn and from who. The knowledge of the others remains important. I've seen many of the topics you listed because I've a long history as a programmer (beginning in 1981, assembler, many many books, and often using languages where I manage the memory, threads, ...). Sadly not enough time for all the math stuff. Thank you
Really good video, you keep making better content every day (congrats on doing that). I have been a subscriber since you started the first pygame series and I have seen almost every single one of your videos, and I just thought that I should leave a comment to say how much I love and support what you do, and that without you I probably would not have started programming! Keep doing what you do, you're great and not far from reaching the 1 million subs ;).
This covers many useful computer related topics. Having a basic understanding of all the "tools" helps you choose a good approach when facing a new problem. You can always brush up on how to use the technique, but if you don't know it exists, you will never think to use it. What most programs overlook is domain specific skills. For example, writing an accounting program: knowing the GUI and SQL interfaces is necessary but not enough. You also need to understand accounting well enough to understand the needs and requests of the users. Almost all retail sales systems SUCK because the programmers didn't understand how the systems are used.
No DSP? That math is pretty interesting and intense, and has a lot of stuff to learn for programmers, especially those ever touching signals (audio, video, etc)
You made good points here Tim. I'm also a self taught developer and sometimes have difficulties writing software with good and simple design. And my understanding of software paradigms, computer architecture in general and also mathematics in general is quite decent. Every self-taught developer should keep these points in mind. The ability to code and make software work is not the only skill a good developer or engineer should have. But if you are self taught in the first place, it should not be a problem to learn all the other concepts too.
It just takes quite a bit of time, which you cannot afford, otherwise you would have gotten a proper education in the first place. Catching pieces here and there will not give you the big picture which any development should start with. Lack of skill in solving problems in a general way gives the impression that programming is easy, because you never actually program, but keep hardcoding and repeating yourself.
OS was great. I think it's one of the most relevant topics, especially when it comes to real world relevance. Not only do you have a better understanding of how and why a lot of things work or don't work with operating systems, filesystems and concurrency, it also helped me, for example, to understand asyncio better, and it makes reading documentation easier as you know the basic terminology. Would do it again immediately.
Never is too harsh a word; it truly depends on the person. I'm a self taught developer, and except for maths I learned all the others. My journey was a year long, though; the thing I did was always ask why and how, then search the internet on those.
Great video. I was a self-taught programmer with a degree in Physics, was terrible at advanced math. After 25 years of systems engineering with a little "programming on the bare metal" machine code, I went back to school, got a degree in Software Engineering--my first impression was, "I've spent 25 years in a different profession with the same name." After graduating with a master's degree, I proceeded to self-teach myself the rest of computer science by getting a job teaching the upper-division undergraduate courses (none of which I took in school myself), all of which made me a much better programmer and able to solve difficult problems easily. FSMs, functional programming, and "Big O" are the most helpful concepts. Still coding and learning 56 years after grinding out my first program in new employee orientation at Univac.
I know that's a clickbait title, but saying that if you don't go to uni you won't learn this material is a bit of a stretch. Now you have access to all the books and online resources you would ever need to learn all these topics. It's only a matter of your own curiosity to dive deeper into some of the more theoretical topics.
In one company I worked for, they didn't know how to migrate databases, because they were not indexed and indexing them would have taken days (at least). At some point I passed by, we discussed it and the issue popped up. So, I replied that it is easily solved with dichotomic (binary) search. It took me 5 minutes to solve an issue they had been wrapping their heads around. When asked how I came up with that so fast, I replied that it is literally something you learn in the first quarter of the first year at university. So my solution came straight from the book. (Of course, I had to recall it.) Same with my current boss, to whom I can swiftly answer, just because I learned the material. As a stupid example, if we need to do some intensive processing, I'll see if it is possible to hand off the problem to an FPGA. Self taught programmers do not even know that it exists and can't design hardware at all. Before the academic track, I was a self taught programmer and I was already good. Learning made me even better in the sense that I have more horizons.
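As a rough illustration of the kind of textbook win described above, a dichotomic (binary) search over sorted data takes O(log n) steps instead of a full scan; the record IDs below are invented, and the sketch uses Python's standard bisect module.

```python
import bisect

# Hypothetical sorted list of record IDs from a database dump (~333K entries).
record_ids = list(range(0, 1_000_000, 3))  # sorted by construction


def contains(sorted_ids, key):
    """Binary search: O(log n) comparisons instead of scanning every entry."""
    i = bisect.bisect_left(sorted_ids, key)
    return i < len(sorted_ids) and sorted_ids[i] == key


print(contains(record_ids, 999_999))    # True  (multiple of 3)
print(contains(record_ids, 1_000_000))  # False
```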
As a first class Nigerian student of mathematics and computer science, this is so relatable. What people get wrong about computer science is that it is first, a SCIENCE itself... You can't just wake up someday and say you wanna be a computer scientist or a programmer without understanding the core principles of computing... My advice to self taught developers is that they should try and go to the university to back up their already acquired knowledge of programming.
For a university class, I learned a language called Haskell, which sounds a bit similar to OCaml, no loops, and super strong typing. It wouldn't let you even compile/run the program without every part of the program running being typed out beforehand. Our final project was to use a parsing library and implement our own programming language from scratch, using Haskell, and it really taught me a lot about how to construct a programming language, as well as forming a plan and format before starting to code.
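To give a flavour of what such a project involves (in Python rather than Haskell, and far simpler than a real language), here is a sketch of a recursive-descent parser/evaluator for a toy grammar with +, * and parentheses; the grammar is my own invention, not the course's.

```python
import re

# Toy grammar:  expr -> term ('+' term)*    term -> factor ('*' factor)*
#               factor -> NUMBER | '(' expr ')'
TOKEN = re.compile(r"\s*(\d+|[+*()])")


def tokenize(src):
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN.match(src, pos)
        if not m:
            raise SyntaxError(f"bad input at position {pos}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens


class Parser:
    def __init__(self, tokens):
        self.tokens, self.i = tokens, 0

    def peek(self):
        return self.tokens[self.i] if self.i < len(self.tokens) else None

    def eat(self, tok=None):
        cur = self.peek()
        if tok is not None and cur != tok:
            raise SyntaxError(f"expected {tok!r}, got {cur!r}")
        self.i += 1
        return cur

    def expr(self):                      # expr -> term ('+' term)*
        value = self.term()
        while self.peek() == "+":
            self.eat("+")
            value += self.term()
        return value

    def term(self):                      # term -> factor ('*' factor)*
        value = self.factor()
        while self.peek() == "*":
            self.eat("*")
            value *= self.factor()
        return value

    def factor(self):                    # factor -> NUMBER | '(' expr ')'
        if self.peek() == "(":
            self.eat("(")
            value = self.expr()
            self.eat(")")
            return value
        return int(self.eat())


print(Parser(tokenize("2 + 3 * (4 + 1)")).expr())  # 17
```

A real language project adds an AST, a type checker and an interpreter or code generator on top, but the grammar-to-function structure stays the same.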
I'm sort of self taught. I learned C and C++ in college, along with some higher math, Calculus and Linear Algebra (I don't remember much but I remember enough of the concepts to know what to google). Now I'm mainly a C# dev. I'm completely self taught in SQL and front end, though.
Calculus, algebra, these are the foundations of engineering, especially software and computer science. Can't imagine professional software developer without good math background. Sure, you can build a bridge without the math - as long as it is an amateur endeavor in your backyard.
Do we get bonus points if we _did_ learn this as a self taught developer? Math is super important if you want to do any kind of game development. Low level hardware knowledge is invaluable when you need to write high throughput highly optimized code in a big datacenter environment, and I consider it a big separator between an average programmer and a great one -- it's very obvious when someone does or does not know this stuff.
Yes, I am sad he did not mention DOD (data-oriented design), which utilizes knowledge of the cache. I'm self taught and I guess I'm learning that, which according to this video, isn't even taught to university students.
To add to this, some of the topics that were left out are:
1. Real time and embedded systems
2. Advanced computer networks
3. Software quality and testing
4. Distributed systems
5. Introduction to robotics and industrial automation
6. Compiler design
7. VLSI design
8. Mobile computing and applications
9. Software design and architecture
10. Software requirements
11. System programming
If I was allowed to subscribe to only one channel on YouTube, I would choose this! Hands down the most knowledgeable and thorough youtuber when it comes to computer technology. You're doing a great job Tim, keep it up!
Fun fact... I was studying these advanced math topics in the last 2 years of high school... the amazing Eastern European system, where you get a ton of garbage to study too early, while in western countries they study such things in 1st-2nd year of uni. I had a friend who went to uni to the UK (CS degree) and the 1st year, when it came to math, he didn't have to study anything because he already knew the topics.
Title of the video did its job - no apology needed, the YouTube algorithm encourages click baity titles so we learn to live with it. See Veritasium’s video on the subject if you need convincing.
I transferred universities when going into second year, and now I'm taking Logic Design again just so I can take Principles of Compilation in the 3rd year. By far the course I'm most looking forward to, really hope they don't cancel it.
Tbf I am a college student, but I consider myself self-taught since I skip almost every class and only study for the exams. I knew everything you mentioned from the internet. Not all self taught developers are uninterested in these topics. For instance I'm very passionate about different paradigms even though we haven't tackled this subject yet in college (and I don't think we will since they're OOP fanatics lmao)
Honestly the Operating Systems class might be the most important. So many things are built upon those concepts, and it is really difficult to get your head around the reason why many of the things you are using work like they do, without knowing how a kernel works.
As a Junior, I took a class in Linear Algebra in 1984 because it sounded easy. It ended up being the only C I got in 4 years and I still have no idea what Linear Algebra is.
I feel you about maybe or maybe not going back to school. I completed an associate's degree while I was already working as a software developer and was thinking about getting a bachelor's but the requirements are just too dumb. Why do I need to know anthropology for a computer science degree?! Obviously the courses you listed here are probably beneficial in some shape or form (except calculus, I took calculus II and still haven't used it and likely forgot everything about it at this point). But having a degree serves as a pass where your application may not be thrown away if you don't have one.
This is encouraging to a point. I'm not a developer... but been teaching programming to myself since, I don't know, 12 years old? And just now I'm really seriously considering to get into development as a job. And the thing is... despite the fact that I had formal education in computing, I have a working knowledge of every single course/topic you mentioned... and acquired just out of pure curiosity and affinity towards formalism and the systematicism on these subjects ... (and to a certain extent... out of a little disdain/contempt towards the 'hacky/grokky' way of doing things...)
If you really dive into what you are doing the more you also want to understand what's going on under the hood. If you are serious about your journey you can learn everything. University courses are one way, but the point is to have advanced peers or a teacher as in order to learn effectively we all need different perspectives...
Haven't watched Tim in a long time. This is a great video and it's super cool to see how much he's grown. Like literally, he looks older! But still sounds like the same Tim :D
Great insight in this video, thanks Tim. For someone on the self-taught route without a CS / Math degree, from what I have read, watched and experienced, having practical knowledge of advanced math is only really beneficial if you're working in the Artificial Intelligence, Machine Learning, or Augmented / Virtual Reality spaces. But, I could very well be wrong! I'm documenting my self-taught journey on my channel with the hopes of landing a role before the end of 2022. Happy New Year!
I have two degrees in IT. One is in web development. The other in system admin. I also taught myself how to develop and program software among many other things. The difference is a formal education presents you the information. Self-education forces you to think things through yourself. There are advantages and disadvantages to both, but your point is correct. When you teach yourself, it is you deciding what you don't know when it is impossible for you to know what that is. It is you looking outside but you are on the inside. In a classroom, you can see inside yourself through others, and that is how you can see yourself from the outside. You also get a holistic view that is already laid out for you to learn, but in self-education you have to put together the pieces to make the picture whole. Though I do not think I learned nearly as much in school as I did on my own, I think school created the right frame of mind and made the path clear for me. Also, it is common knowledge in school that you are only getting started there and won't learn everything. The notion you will learn everything comes from your recruiter. The professor/instructor will be the first to say you are not going to learn everything there. In school you learn how to be a pro, not all of what a pro can know. That takes time to achieve. I agree with all your points, and with the need to know advanced math particularly. That is because of how much easier it is to be clever in your code. Every programmer should know how to create advanced algorithms instead of mere mathematical operations.
This is just my two cents but- If you're considering whether or not you should finish a degree in Computer Science, then there's nothing wrong with choosing another major. However, if you're debating whether or not to finish college at all, I'd recommend at least getting a Bachelor's degree. You seem bright enough that you could probably learn the topics on your own anyways, but having a bachelor's degree will help you in job interviews later on down the line. Two years might seem like a long time to finish your degree, but you could think of it as just accumulating tools for your toolchest of knowledge for whatever it is you plan to do later on. However- If you've got a strong sense that there's something more important; like a good business opportunity or something in life that you want even more than a degree, there's nothing wrong with jumping on that opportunity. Just keep in mind that finishing a degree later on is much harder than delaying your plans for two years. Whichever path you choose, there will be a trade off.
linear algebra can be useful when working with some applications of arrays, but I can definitely see why they wouldn't be used too much. It is a lot more important when it comes to understanding the underlying math behind statistics and forecasting - the data science side of "programming".
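As one concrete bridge between linear algebra and the forecasting side, an ordinary least-squares fit is just solving an over-determined linear system; the monthly figures below are made up for illustration.

```python
import numpy as np

# Made-up monthly sales with a roughly linear trend plus noise.
months = np.arange(12, dtype=float)
sales = 100 + 7.5 * months + np.random.default_rng(0).normal(0, 5, 12)

# Fit sales ~ a*month + b by least squares: solve X @ [a, b] ~= sales.
X = np.column_stack([months, np.ones_like(months)])
(a, b), *_ = np.linalg.lstsq(X, sales, rcond=None)

print(f"trend {a:.2f} per month, intercept {b:.2f}")
print("forecast for month 12:", a * 12 + b)
```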
My opinion on calc? I don't necessarily use it terribly often, but when I do, it's an absolute lifesaver. I once made a little flight sim with a variable timestep in MS Excel and eventually I switched it to storing acceleration and tracking jerk. Absolutely no way I could have made it work with only knowledge of trig and below. Calc can also be useful for things like easy definite or indefinite integrals of some shape, or for easy derivatives, which is pretty nice for things like terrain generation code.
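A hedged sketch of the idea (all numbers and the timestep pattern are invented): if you store jerk, each frame you numerically integrate it up through acceleration and velocity to position, which is the bit of calculus the comment above is leaning on.

```python
# Simple explicit-Euler integration: jerk -> acceleration -> velocity -> position.
# Values are invented for illustration; a real sim would use a better integrator.
position, velocity, acceleration = 0.0, 0.0, 0.0
jerk = 2.0                                # m/s^3, e.g. easing the throttle forward

for step in range(5):
    dt = 0.1 if step % 2 == 0 else 0.05   # variable timestep, as in the spreadsheet sim
    acceleration += jerk * dt             # a(t+dt) ≈ a(t) + j*dt
    velocity += acceleration * dt         # v(t+dt) ≈ v(t) + a*dt
    position += velocity * dt             # x(t+dt) ≈ x(t) + v*dt
    print(f"step {step}: a={acceleration:.3f} v={velocity:.3f} x={position:.4f}")
```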
I’m a sr engineer at my company and I do interviews for all kinds of skill levels. I ask about programming paradigms in every interview. We work in JavaScript and I find it spans across multiple paradigms, and most modern languages are multi-paradigm. Knowledge of that is important & if someone talks about only OOP, it’s a bit of a red flag
I agree that discrete math is quite important as it's also the basis for an algorithm analysis course (typically a junior level class that may be combined with data structures). In programming, making the code functionally correct is only half the battle. For example, one should never use bubble sort as it's highly inefficient - this course will tell you why. It will then allow you to perform similar analysis on your own algorithms.
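To make the bubble-sort point concrete, here is a rough, self-contained timing comparison (the list size is chosen arbitrarily): bubble sort does O(n^2) comparisons while Python's built-in Timsort is O(n log n).

```python
import random
import time


def bubble_sort(items):
    """O(n^2): compares adjacent pairs on every pass."""
    items = list(items)
    for end in range(len(items) - 1, 0, -1):
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
    return items


data = [random.random() for _ in range(3_000)]

t0 = time.perf_counter()
bubble_sort(data)
t1 = time.perf_counter()
sorted(data)                      # built-in Timsort, O(n log n)
t2 = time.perf_counter()

print(f"bubble sort: {t1 - t0:.3f}s, built-in sort: {t2 - t1:.3f}s")
```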
I'm self taught, and I understand how 1s and 0s get user input from mouse and keyboard and eventually send an email through a browser tab. Not tryna brag, but also tryna say: non-cs-takers, don't let anything like these discourage you, take your time.
You are quite right, the OS fundamental courses I did in university came in quite handy when working with large Java projects where multi-threading and thread synchronisation presented major problems to solve.
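In Python terms (the Java situation described above would look analogous), the classic fix for a shared counter is a lock around the read-modify-write; the counter and thread counts here are arbitrary.

```python
import threading

counter = 0
lock = threading.Lock()


def add_many(n):
    global counter
    for _ in range(n):
        # Without the lock, `counter += 1` is a read-modify-write that can
        # interleave between threads and silently lose updates.
        with lock:
            counter += 1


threads = [threading.Thread(target=add_many, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # always 400000 with the lock in place
```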
I agree on most points except linear algebra - which does come in handy when writing graphics applications. Another few points I'd like to add are: (1) human perceptive psychology (2) languages (in general) (3) computers and their effects on society (4) IT specific law (5) economics (6) software design patterns.
One of the things I've learned only after going to university is how to pronounce boolean. There was no YouTube back then and all my self teaching was from books.
I'm going to say Big-Oh notation and what it means. This is best illustrated by searching a sorted list for a number (linear time) vs doing a binary search on a sorted list (log time). Most of the hardware today is powerful enough to make this a moot point for home projects; however, this becomes super important at big corporations with dbs containing millions of rows.
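A minimal illustration of that gap (the "table" below is just a range of a million integers): binary search on sorted data needs about log2(n) comparisons, so roughly 20 for a million rows versus up to a million for a linear scan.

```python
import math


def binary_search_steps(sorted_seq, key):
    """Return (found, comparisons) for a classic binary search."""
    lo, hi, steps = 0, len(sorted_seq) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_seq[mid] == key:
            return True, steps
        if sorted_seq[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, steps


rows = range(1_000_000)                        # a million sorted "rows"
print(binary_search_steps(rows, 999_999))      # found in ~20 steps
print("linear scan would need up to", len(rows), "comparisons")
print("ceil(log2(n)) =", math.ceil(math.log2(len(rows))))
```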
As a self taught low level programmer, I learned computer architecture but without the Boolean algebra part. I also taught myself some compiler theory. My only regret is not having a math background. Although it's not necessary to have a math background, I think math helps you a lot to become good at problem solving.
While I'll definitely concede about math, I did definitely pick up the rest of the subjects you mentioned on my own. I started learning programming to learn more about computers however so it was always part of my interests to go as low level and learn as much as I could because I have a problem and sunlight scares me.
I’m a little biased on this one. I’m a 4th year CS student myself and if I hadn’t been self-learning in my free time, I’d be in a bad place. At least in my particular experience, I’ve got a vast list of things I found out I needed to learn over the years just by trying to create something and getting stuck and then researching the latest technologies to tackle an issue. My projects are also personal and have utility (not compilers, parsers, tic tac toe, etc) so I definitely have more focus when I’m into it. My background definitely helped me a lot too. I was an electronics repair technician (for anything with basic circuitry, modem cards, power amplifiers etc) for 6 years reading individual bits of information with special machines and schematics before going into CS. I already had a pretty good idea of computer architecture so I can’t say self-taught development from ZERO is going to get you there but after almost graduating with my CS degree, I realize it was a year of nonsense, a year of (maybe or maybe not) relatable mathematics depending on intended dev field, and a year of programming knowledge that I could have gotten for free online. I saved all the high level courses for last and those are the only ones I can say are worth it in a formal CS program (algorithms, data structures, architecture, security, concurrency, etc)
The funny part is that my Mechatronics Engineering degree combined with my Math Minor taught all of this. The languages also covered a wide swathe of paradigms. Verilog, Assembly, C, MATLAB, as well as going from literally figuring out gates from CMOS diagrams, all the way to Assembly, and then C, as well as converting between the two.
Hey everyone. I think a lot of people took this video the wrong way and that’s totally my fault. I don’t mean to say ALL self taught developers never learn these topics; they are simply the things I know personally I hadn’t considered learning until attending university and that, in my experience, a lot of devs that haven’t been in a traditional school system don’t learn. I’m sure many self taught developers do learn these topics and are far better at them than I am! I also wanted to clarify that not everything on this list is necessary to learn (as I stated many times in the video). This video’s objective was to introduce some topics some self taught devs may not have considered learning so that they can have a kind of road map or idea of topics they might want to look at next. Apologies if it came off the wrong way 👍
no worries
The title literally says NEVER in all caps. Does youtube not allow creators to change titles?
As someone who is in college and also self taught... I think you are right. I mean, most self taught devs didn't consider learning them, so it is a great thing that you just tell us these things to consider. Thank you.
The title is very arrogant. There are a lot of people with degrees other than computer science that work in this industry. I have 35 course hours of mathematics and electrical engineering, including calculus and linear algebra. I think system administrators know a little about operating systems. I studied Introduction to Algorithms by Thomas H. Cormen and Tanenbaum's Modern Operating Systems and audited Mathematics for Computer Science from MIT. I have been in industry for over 25 years and currently work as a DevOps Engineer. The only never is never stop learning.
You should consider changing the title if you feel you have misrepresented something.
As a self taught coder I've also taught myself a lot of maths in the process. If you can teach yourself to code you can teach yourself other advanced concepts. I also taught myself electronics, so I know the computer all the way down to the transistor level.
Same, I'd been learning more maths than some grads by the end of high school by self teaching. Self taught means you can learn anything. I don't understand the "credentials give you secret knowledge" argument.
@@cybrdelic credentials get you in the door. It's really just proof that you signed your life away to student loans so your potential employer can have leverage over you. They don't care if you are good at what you do or not. They can train you on what you need to know to do your job.
@@LordOfNihil I got in the door without it
I am learning the same stuff u learnt now but it takes a lot of time.
I enjoy learning all this stuff but want to know how long it may take before i get a job. Say i started from scratch how long would it take to learn the basics and job relevant skills?
With no degree and 30 projects on your portfolio... you are a hot cake...
But if you want to, go pay a whole load of cash to some college that uses the same online resources to teach you.
Rename the title to: What Self-Taught Developers should also learn, because you could learn all this stuff online too.
Right
Not as click baity
I would say this is a fairly inaccurate title as well. A lot of this knowledge is great, and it's also widely available. There's not really a need to learn any of this, outside of just pure enjoyment.
It's not interesting as it is now
Even though it's clickbait, I think the title is fine as most self taught developers won't take time to learn this anyway.
Most of the self-taught devs that I know do learn these things. They're usually the ones that are constantly learning, and never happy with not knowing something.
I'm one of them. I have learnt 13 languages. Earned money with 6 of them. I have lots of experience with the 5 major cloud technologies. Became a Linux and containerization expert. I develop distributed systems as a DevOps engineer who has real world knowledge/experience on the operational, developer and automation sides. I'm working on open source projects, so anyone can check. And lastly, English is not my native language, so I spent lots of time learning English to be able to learn computer science.
Me too
literally me in a nutshell XD
Quite right. I came up in the culture of British home computing in the 80s, and I knew of so many talented developers who learned a lot of this stuff before they even reached university age.
Yea your comment makes sense. People that WANT to learn this will always go above and beyond. Even reading books. Most programmers in schools will never do this, thinking the internet is always superior.
I have a math degree. I recently learned about a little inefficiency at my company and, after a couple months of just thinking about it, I realized it was a giant constrained optimization problem that could be modeled with linear algebra. I was and still am awful at Python, but in my excitement to solve this problem I managed to pull it together and write a program to generate the matrix and constraints (both the values and dimensions will change day by day, so it needed to be algorithmically generated)... it worked, and now I am trying to learn Python as fast as possible to move that whole system into Python, and hoping more problems pop up that can be solved by a quick optimization.
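For anyone curious what "generate the matrix and constraints" can look like in code, here is a minimal linear-programming sketch using scipy.optimize.linprog; the products, profits and resource limits are invented stand-ins, not the commenter's actual problem.

```python
from scipy.optimize import linprog

# Invented toy version of a constrained optimization problem:
# choose how many units x1, x2 of two products to make, maximizing profit
# 3*x1 + 5*x2 subject to machine-hour and material limits.
profit = [-3.0, -5.0]                 # linprog minimizes, so negate to maximize
A_ub = [[1.0, 2.0],                   # machine hours used per unit of x1, x2
        [3.0, 1.0]]                   # material used per unit of x1, x2
b_ub = [14.0, 18.0]                   # available machine hours / material

result = linprog(profit, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x, -result.fun)          # optimal plan (~[4.4, 4.8]) and max profit (~37.2)
```

In a real setting, `A_ub` and `b_ub` would be built programmatically from each day's data, which is exactly the matrix-generation step described above.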
You have unlocked mankind’s greatest and most powerful tool. Keep at it! Generalization is the best part of mathematics.
But why is Python mostly used in those programs? It's not even close to being a fast runtime language.
@@khodis2002 he still ridin a tricycle bro
@@mistatank1231 if these math libraries are C code wrappers, then that makes sense.
But I've heard that NumPy works slower due to the large method call overhead, whereas the C library optimizes the whole thing.
@@khodis2002 As someone who knows a ton of languages from 6502 assembler, C, pascal, all the way through to golang, I can safely say that language speed, while important, is not the only factor when writing code. It's a balance between speed of language and speed/TCO of development. Python is pretty easy to read and understand, and quick to throw scripts together.
This video is very relevant. Knowing what to focus on in CS can trip you up learning on your own. Being an older learner /relearner, OOP was a very difficult concept, but after a very long struggle I got it. Also USE what you're learning, as you're learning it- best lesson I've learned. And watch Tech with Tim!
This is what I really need. Coz I've been literally studying all the free courses on YouTube just to fill in what I think I'm lacking.
Linear algebra is very useful for coders who do data analysis.
Yes, there are awesome books
""" ver useful """ is actually an understatement of how much it's needed
For coders who aren't data analysts it can be a waste of time at the start.
Also for low level graphics, most math used in rendering is related to linear algebra
@@NivAwesome It is also important in scientific applications and other simulations. Btw., most computer graphics is also based on it. Stop omitting something just because you don't know the application. That's ignorant.
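As a small taste of how that shows up in graphics code, rotating a 2D point is just a matrix-vector product (the angle and point are chosen arbitrarily):

```python
import numpy as np

theta = np.radians(90)                      # rotate 90 degrees counter-clockwise
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

point = np.array([1.0, 0.0])
print(rotation @ point)                     # ~[0, 1]: the x-axis maps onto the y-axis
```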
Software Engineering is a vast field. Not everyone needs to learn everything. Just learn it when required.
Partially right. You need to know what is available to you so you are able to gauge whether it's worth investigating more.
You would never consider hooking an FPGA to your server for heavy processing (e.g. cryptographic hashes), because you don't know it's a possibility.
As an SE, your job is not to reply to some request, but to help your client/employer do the right thing. That may even mean no programming at all. (Changing processes.)
So, no need to learn in-depth, but surely required to learn as much as possible.
Where it's nuanced is that you can do it in an intelligent way. If you know Angular, Dagger does not need to be learned, as it's basically the same principles at play. If you know a C family language, C, C# or Java shouldn't be that complicated, etc.
But if you totally ignore one aspect, then you'll never learn it when required, unless it's part of some specifications you got.
It's why in the academic track I took, we did a grand tour. I even programmed robots you find in pharmaceutical companies. I also did some German assembly, designed and built PCBs for interfacing, analysed a library's processes, etc.
Stuff I would have not done on my own and that gives me an edge.
In one of my jobs, my interview was cut short with "Ok, you are hired". I asked how they came to it and they replied that I was the only one who pulled out some UML and asked the right questions.
They asked me to make a small program which was an ETL, and before digging into the code, I did a small design. I created only one class to get the "feel" of it, then called them to discuss the solution before churning through code.
In the end, though, there is ample room for self taught people. They can do basic algorithms, which are 90% of the jobs (mine is a bad example as I mainly work with tree transformations). Then you need only one guy who knows how to tie everything together while being efficient and keeping it readable and maintainable. That's also the guy who can tell what is worth learning. Still, there is a huge added value in having some IT culture.
I'm 100% self taught when it comes to computers from the transistor to python!
I can somewhat agree with that statement for a basic coder or programmer... But as for an actual Engineer, I kind of beg to differ. Let's pick on the C/C++ languages with the modern C/C++ compilers... any major brand will do: Clang, GCC, MSVC, or even Intel... and take a simple little program that goes through a loop from 1 to 100, performs division on some values and prints the results. Do this with optimizations off, then do it with optimizations on. I'm more familiar with the MSVC compiler and with the x86 architectures, however I do have some knowledge of GCC's compiler and the 6502 and RISC-V architectures. Not too familiar with MIPS or ARM.

However, when you compile your C/C++ under Clang there is an intermediate file, usually object files, that your C/C++ code is translated into before the linkage or linker stage. Clang translates to LLVM IR before that gets translated to actual assembly - machine code. Now if you examine the actual LLVM code without and with optimizations, then compare the two assemblies... you'll see that some of that division is replaced with bit-shifts as opposed to actual division, and other parts are replaced with something that looks esoteric (magic numbers) and bit banging. This is what your compiler does under the hood. It makes its best attempt at optimizing your code to use the fewest and/or fastest segments within your hardware. If it can use registers before going out to main memory it will do so. If it can replace division with shifting or even multiplication it will do so...

Knowing what's going on under the hood of your compiler, assembler and even within your CPU, as well as the rest of the system, gives you the necessary tools to be a better Engineer. To be a developer it may not be necessary... just rely on the tools and take them for granted. But if you're the one who's designing the hardware or device drivers, or writing processor intensive code on limited resources, then you better know how to minimize cache line misses! How to avoid memory leaks! And so on... Not even knowing your compiler is enough... knowing how your compiler, operating system, its kernel and HAL work with and manipulate your assembler and disassembler is vital information; even if you are programming in Java, JavaScript and or Python it can help, as it helps you design better algorithms. Such as learning how to minimize branch mispredictions, how to organize your data to easily stay within the cache blocks... knowing when or when not to use dynamic memory, or the heap versus the stack... knowing when to use pointers and pointer arithmetic... and, to squeeze a few more clock cycles out, knowing how and when to bit-bang!

There are coders, there are programmers and then there are Engineers. And you don't necessarily have to have a "college" education to be an Engineer!
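The division-to-shift trick mentioned above is easy to sanity-check from a high-level language too; this little Python check only demonstrates the arithmetic identity the compiler exploits, not the generated assembly.

```python
# For non-negative integers, dividing by a power of two equals a right shift,
# which is the identity an optimizing C/C++ compiler exploits when it replaces
# `x / 8` with `x >> 3`. (Signed negative values need the extra "magic number"
# fix-ups alluded to above.)
for x in (0, 7, 8, 1023, 123_456_789):
    assert x // 8 == x >> 3
    assert x // 16 == x >> 4
print("x // 2**k == x >> k holds for these non-negative samples")
```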
@@skilz8098 You are so right. I've argued with so many people over this. I've dared to say that one could teach oneself up to PhD level without entering a classroom, if the learning materials can convey the knowledge and the field doesn't require a complex lab setup, as it does in the physical sciences. It's about understanding, and the mind is the seat of it; it doesn't require middlemen to unlock insight from knowledge. Unfortunately, modern society equates attainment of understanding with credentials: credentialism. There are arguments for this culture, e.g. protection of clients and the public, but the truth is that what's possible is much broader in scope.
@@collinsa8909 Lol, I never looked at diplomas and certificates when I interviewed. I just ran through the protocol.
Still, I do not recall any self-taught candidate doing well... nor those who put their credentials forward. It's mostly those who thought they hadn't done that well who passed.
Self-taught people have issues you'll rarely find in those who went through "professional school" (so, not university, but the kind of place where you learn to "do").
Mind you, university usually means not immediately actionable: they get the theory, but the practice is not there yet.
@@wasimraja2980 Exactly, that's why I said I partially agree, as it is coming from a different perspective. Not once have I had a curriculum-credited class in computer science, software engineering, or electronic/hardware engineering. Yet over the past 20 years I have acquired the knowledge of my own volition. The only difference for me is that I want to know everything that is involved, even down to the quantum mechanics at the atomic level of the electronic components that make up your circuitry. Why? I like computers and technology, but more than that I like physics! Now, I'm not really a physicist, but I enjoy it, especially when it applies to engineering.
In India, calculus is taught in school, in grades 11 and 12. So by the time students join university or college, they have already had a glimpse of what calculus is.
The same goes for most European countries: derivatives and integrals are being taught in the last two years of high school, as well as permutations, combinations, matrices and determinants.
University calculus starts with line/surface/volume/contour integrals and differential equations. The difference is very visible when using American textbooks for first-year university physics: the US books (e.g. Ohanian's Physics) only contain Maxwell's equations in global (integral) form, while European professors teach the local form (differential, i.e. nabla notation) immediately.
And I learned about logic gates in primary school (back in the 1970s) and basic set theory (on a Venn diagram board game where we had to put images of faces on the correct position, e.g. a face with curly hair and blue eyes in the intersection of the curly and blue sets) even before that in kindergarten.
@@koenlefever Actually, most Indian teaching methods are derived from the Europeans (specifically the English).
@@koenlefever We learn it in the last 3 months in high school but in a very competitive way in my country
In the UK, calculus and discrete maths are part of A-level maths, so they're not necessarily something computer science students have learned. In practice though I don't recall ever meeting a computer science student that didn't have an A-level in maths.
@mayank Garg the problem in Indian education is we are not taught how to apply it practically.
I think software design patterns is one of the subjects that needs to be included here. Once I had taken this subject at university, I worked on a project and actually got the opportunity to try out the things that I had learnt.
I think they're great to learn about, but personally I've just stumbled upon the "correct" way to design a system many times and didn't even know I could find the conceptual design described in some book. Point being, they can be learned rather quickly through practice, so do not fall under the category of "never learned" but I'd agree that they're great to read about so you're at least aware of what you didn't know as a self-taught dev.
Systems design is something that usually comes with experience; most companies don't ask for it in junior roles. It's more of a staff or principal engineer thing.
But you should definitely cover it at least at a basic level before an interview.
Speaking for myself (self-taught programmer over the last 45 years) I can't imagine being unaware of any of this. You can find textbooks on pretty much all of this material in any good library.
2 courses I was taught in university not mentioned here (which were a pain in the ass back then) but are helpful now were "Theory of Automata" and "Compiler Construction". Interesting stuff; knowing how a compiler works does make you a better programmer.
Being a self-taught developer with 15+ years of experience, to this very day I feel that in some areas I have much more knowledge than "academic" developers with the same years of experience, yet in other areas, "academic" developers learn certain things in their first year that I still haven't learned properly or struggle to understand, which feels very intimidating to me. Especially the algorithm challenges you get in interviews are something I never learned properly, and I still struggle with them.
Operating Systems is probably my favorite class because it's the most useful non-programming class for a programmer.
linear algebra has a lot to do with physics, shaders, graphics and even encryption (see AES-GCM)
calculus of course is about physics
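To give the linear algebra point a concrete face, here's a minimal, illustrative sketch (the square and the angle are arbitrary choices, not from the comment): rotating 2D points with a rotation matrix, which is the kind of operation graphics and shader code performs constantly.

```python
import math

def rotate_2d(points, angle_rad):
    """Rotate a list of (x, y) points about the origin using a 2x2 rotation matrix."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    # Matrix-vector product: [c -s; s c] @ [x; y]
    return [(c * x - s * y, s * x + c * y) for x, y in points]

# Rotate the corners of a unit square by 90 degrees.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(rotate_2d(square, math.pi / 2))  # approximately [(0, 0), (0, 1), (-1, 1), (-1, 0)]
```

Real graphics code does the same thing with 4x4 matrices and a library, but the underlying linear algebra is identical.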
What I can honestly say is that the most important things I learned at university were teamwork, clean code, and the like. That gives you a huge advantage as a junior in transitioning quickly to mid-level.
It’s kind of interesting that I had to learn Discrete Mathematics for my linguistics degree, especially when it came to set theory, set builder notation, and proofs using propositional logic. Also learned boolean algebra for my semantics courses.
There is a lot more in discrete mathematics, like counting and probability, recurrence relations and generating functions, graphs and trees and their applications (e.g. in scheduling), the pigeonhole principle, and so on.
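As a small illustration of how directly this material maps to code, here's a hypothetical Python sketch (the particular set and law are chosen just for the example): a set comprehension mirroring set-builder notation, and a truth-table check of De Morgan's law.

```python
from itertools import product

# Set-builder notation: { x in 0..19 : x is even and x is not divisible by 3 }
evens_not_div3 = {x for x in range(20) if x % 2 == 0 and x % 3 != 0}
print(sorted(evens_not_div3))  # [2, 4, 8, 10, 14, 16]

# Truth-table check of De Morgan's law: not(p and q) == (not p) or (not q)
for p, q in product([False, True], repeat=2):
    lhs = not (p and q)
    rhs = (not p) or (not q)
    print(f"p={p!s:5} q={q!s:5} lhs={lhs!s:5} rhs={rhs!s:5} equal={lhs == rhs}")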
Man when I saw the title I thought to myself "ugh this is probably gonna be one of those clickbaity titles where he'll tell us to learn how to write comments or something stupid" but man you actually delivered an amazing video. I'm always trying to improve myself but I have a lot of trouble understanding a lot of what people mean when I read documentation or articles, and there seems to be just a massive influx of "developers" posting videos on youtube that all seem to copy each other posting the exact same stuff and it's difficult sorting through everything that I already know. thanks.
I've been programming for 40 years, and much of my expertise was self taught. The topics that you cover are ALL things I've had to learn, and lots more, e.g. systems administration, large Hadoop style systems, networking, security... If you're to be successful as a programmer, you need to keep up with whatever's popular, which means you'll be continually reeducating yourself. The good news is that for many topics, one only needs to learn the jargon, as the new topic will closely resemble something you already know.
"The rarely change what we do in computing, but the change what they call it"
-- Attributed to Grace Hopper.
The video production quality is getting better and better. Good work, Tim!
Idea for your future tutorial videos:
Pick an important subject taught in universities and make it accessible for self-taught devs through a series of tutorial videos.
Self-taught devs don't need those things. Learning them is on a need-to-know basis. That said, any avid learner will explore topics outside of their sphere to get a feel for the scope of their discipline. I did. The issue is, most theoretical education suffers from ultra-poor tuition: most books are unnecessarily convoluted beyond the complexity of the subject itself. Then, most lessons fail to show relevance to the learner, making the material doubly unattractive and useless to the autodidact.
I strongly disagree. Theoretical principles help people know what they are doing. Theory is unavoidable in many applications: if you work with images or sounds, you have to understand mathematical analysis and linear algebra; if you work with text (or other sequences), you need to know a bit about graphs and discrete math; if you work with the web, you need basic knowledge about threading. And if you program, you should know about the functional programming style, otherwise your code will probably be disastrous (mine definitely was).
@@Daniel_Zhu_a6f I partially get your point, but you don't get mine. Schools teach you lots of theory, hoping you'll have an arsenal to tackle your field with. This is what I call knowledge warehousing. You may use some of it and never use the rest, so you fill your mind with so much junk. On the other hand, with lots of practice and trial and error, which is what self-teaching exposes you to, you develop a logical mindset. Then with some theory, not much, you know how to design systems. For projects out of your range, you can research and learn enough to tackle them. E.g. if I had to design sound systems, or graphics, I'd research and learn until I knew enough to get the job done. Same with AI/machine learning, or robotics. I'd be qualified through the rigorous training of building systems via the trial and error of self-taught programming.
E.g. I work with texts and sequences and have never used graph theory. I do use lots of lists, sets, etc. I've never used discrete maths explicitly, though I do program. Over time, from experience, I've discovered some patterns with boolean operations/algebra. I've read up on discrete maths a bit but found it dry, irrelevant and redundant. Yeah, I filled out truth tables and all, ad nauseam, and wrote some math proofs, but it never came in useful when tackling a programming problem at all. When writing a calculator, I don't stop to think about computing boolean operations on a truth table! When writing an arithmetic parser, I don't start thinking about automata theory! I write the parser, it works, I'm done. At the end I've applied automata theory without knowing it in theory. This won't fly in all cases, but many times it will.
Self-taught devs approach problems differently. E.g. when trying to solve some numerical problem, I noticed that Java had no way of computing exponentials with non-integer, floating-point arguments, and I badly needed this functionality, so I began researching ways to achieve my goal. That led me to looking for ready-made solutions that could help (I found no free one) and to trying to understand the math behind fast exponential computation. Basically: learning as I needed to.
As an electrical engineer, the first language I learnt to program in was assembly. It did help with my understanding of what the actual hardware was up to when the computer was writing machine code etc.
Assembly was the third language I learned. It's what finally helped me understand the concept of recursion.
This is an awesome video, man. I work for a FANG company and I'm a self-taught programmer. My start in tech was as a network engineer, and I transitioned to developer. I was fortunate to find a mentor who told me some of the things you mentioned that I needed to learn. After learning about data structures, paradigms, and architecture, I felt languages like Python are a bad first language for folks. When I learned Golang, that's what really forced me to learn more about the very concepts you're talking about. This is super applicable to the folks reading: I didn't go to a computer science school, and to the few friends I have who haven't either, I'm usually giving similar advice about things to learn to become better devs. Great video.
Can I learn web development and mobile app development on my own? How long should I give myself?
Can I get a good job with it?
Which subjects lead to a high-paying job?
I have a mechanical engineering degree.
digital systems and computer architecture is so fun, it definitely changes the way you see the world.
It really depends on what you're doing. When we are talking about graphics programming (especially 3d) then things like vectors and matrices are pretty much entry level to that field.
I love how everyone watching your videos is a self-taught developer and completely disagrees with this video.
What I remember from my college-level math classes: Calc 2 was a weed-out class at my school, as most students were coming in with college credit from high school, and Linear Algebra wasn't difficult, just tedious from not being able to show multiple steps of a transformation at once.
I think that linear algebra requires a new way of thinking about things (different to what you would have been exposed to in school).
What most self-taught developers never learn is automata theory, computational complexity theory and computability theory
r/iamverysmart material.
What does that even mean
useful if you want to make a computer with potatoes
@@jdeep7 compiler engineering
The real question is how much you make each year :D
"Java is object-oriented"
Me when I'm in Java "hippity hoppity your variables are now my property." 🤣
C++: Here are your public working papers; the rest is private property!
@@skilz8098 Rust: here you go, you're the user, you're now my puppet, so everything is private.
😅😅 inheritance... I get it
I think the title should be "what they don't teach you in a 20-hour online course", as after 5 years of working as a software developer you need to know most of the subjects mentioned here. Perhaps the math part is not super necessary unless you're doing 3D engines, in which case linear algebra becomes vital, or other math if you're writing software for digital signal processing, compression or cryptography.
Thank you for this video. Found it very insightful. Most of the items you covered, I have run into at one point or another as a self-taught developer (been developing in one form or another since about 1982). My concern has always been that developers aren't learning the "low" level computing concepts (like CPU architecture and Operating systems), and now I can see why. I've bookmarked this video as I'm going to make it a point to learn more stuff based on what you've outlined.
You are just wrong
One which I think even most college students never learn, unless they've taken a course on compiler design, is Finite State Automata. This is a shame, because they are remarkably useful in the design of several types of programs, ranging from search engines to TCP/IP stacks. Similarly with parsing, which again is mostly associated with compiler development but is useful in any number of other areas.
Relational theory is another topic which you generally only get with a specialized course, which is often at the graduate level. Lambda calculus is rarely taught at all these days, which is something of a shame.
As for operating system design, for self-taught programmers there is the OSDev forum and wiki, though there are several caveats about the accuracy and datedness of the latter and the vitriolic tone of the former.
I am a university student and our curriculum does have theory of computation as a subject, which goes deep enough to cover finite state machines, including non-deterministic ones. The course also covers regular languages and their applications in compiler design, including parsing.
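For anyone wondering what a finite state automaton looks like in code, here's a minimal, illustrative Python sketch (the state names and the "101" pattern are invented for the example, not taken from any course): a deterministic automaton that accepts binary strings containing the substring "101".

```python
# A tiny DFA that accepts binary strings containing "101".
# States track how much of "101" we've seen so far; "found" is the accepting state.
TRANSITIONS = {
    ("start", "1"): "saw_1",
    ("start", "0"): "start",
    ("saw_1", "0"): "saw_10",
    ("saw_1", "1"): "saw_1",
    ("saw_10", "1"): "found",
    ("saw_10", "0"): "start",
    ("found", "0"): "found",
    ("found", "1"): "found",
}

def accepts(s: str) -> bool:
    state = "start"
    for ch in s:
        state = TRANSITIONS[(state, ch)]
    return state == "found"

print(accepts("0010110"))  # True  (contains "101")
print(accepts("0011000"))  # False
```

The same table-driven shape scales up to lexers, protocol handlers and regex engines; only the states and transitions change.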
Integrals and derivatives can be super useful when you wanna play with physics!!!
I would say a lot of what a CS degree teaches you isn't directly applicable, but it allows you to understand topics at a deeper level and so will be able to solve more complex and fundamental problems.
As a self taught Developer (and self taught with english language too, sorry) I think your video is very useful for many of us because it gives ways to improve.
A self taught learner chooses what to learn and from who. The knowledge of the others remains important.
I've seen many of the topics you listed because I've a long history as a programmer (beginning in 1981, assembler, many many books, and often using languages where I manage the memory, threads, ...).
Sadly not enough time for all the math stuff.
Thank you
Really good video, you keep making better content every day (congrats on that). I have been a subscriber since you started the first pygame series and I have seen almost every single one of your videos, and I just thought that I should leave a comment to say how much I love and support what you do, and that without you I probably would not have started programming! Keep doing what you do, you're great and not far from reaching the 1 million subs ;).
This covers many useful computer related topics. Having a basic understanding of all the "tools" helps you choose a good approach when facing a new problem. You can always brush up on how to use the technique, but if you don't know it exists, you will never think to use it.
What most programs overlook is domain-specific skills. For example, writing an accounting program: knowing the GUI and SQL interfaces is necessary but not enough. You also need to understand accounting well enough to understand the needs and requests of the users. Almost all retail sales systems SUCK because the programmers didn't understand how the systems are used.
No DSP? That math is pretty interesting and intense, and has a lot of stuff to learn for programmers, especially those ever touching signals (audio, video, etc)
You made good points here, Tim. I'm also a self-taught developer and sometimes have difficulty writing software with a good, simple design, even though my understanding of software paradigms, computer architecture, and mathematics in general is quite decent. Every self-taught developer should keep these points in mind. The ability to code and make software work is not the only skill a good developer or engineer should have. But if you are self-taught in the first place, it should not be a problem to learn all the other concepts too.
It just takes quite a bit of time, which you cannot afford, otherwise you would have gotten a proper education in the first place. Catching pieces here and there will not give you the big picture which any development should start with. A lack of skill in solving problems in a general way gives the impression that programming is easy, because you never actually program, but keep hardcoding and repeating yourself.
OS was great. I think it's one of the most relevant topics, especially when it comes to real-world relevance. Not only do you get a better understanding of how and why a lot of things work or don't work with operating systems, filesystems and concurrency, it also helped me, for example, understand asyncio better, and it makes reading documentation easier because you know the basic terminology. Would do it again immediately.
"Never" is too harsh a word; it truly depends on the person.
I'm a self-taught developer. Except for the maths, I learned all the others; my journey was a year long, though. The thing I did was always ask why and how, then search the internet for those.
It was a clickbait. Chill
Great video. I was a self-taught programmer with a degree in Physics, and was terrible at advanced math. After 25 years of systems engineering with a little "programming on the bare metal" machine code, I went back to school and got a degree in Software Engineering--my first impression was, "I've spent 25 years in a different profession with the same name." After graduating with a master's degree, I proceeded to teach myself the rest of computer science by getting a job teaching the upper-division undergraduate courses (none of which I took in school myself), all of which made me a much better programmer, able to solve difficult problems easily. FSMs, functional programming, and "Big O" are the most helpful concepts. Still coding and learning 56 years after grinding out my first program in new employee orientation at Univac.
I know that's a clickbait title, but saying that if you don't go to uni you won't learn this material is a bit of a stretch. Nowadays you have access to all the books and online resources you would ever need to learn all these topics. It's only a matter of your own internal curiosity to dive deeper into some of the more theoretical topics.
In one company I worked at, they didn't know how to migrate databases, because the tables were not indexed and indexing them would take days (at least). At some point I passed by, we discussed it and the issue popped up. I replied that it is easily solved with a dichotomic search. It took me 5 minutes to solve an issue they had been wrapping their heads around. When asked how I came up with that so fast, I replied that it is literally something you learn in the first quarter of the first year at university. So my solution came straight from the book (of course, I had to recall it).
Same with my current boss, to whom I can answer swiftly, just because I learned the material. As a simple example, if we need to do some intensive processing, I'll see if it is possible to hand the problem off to an FPGA. Self-taught programmers don't even know that it exists and can't design hardware at all.
Before the academic track, I was a self-taught programmer and I was already good. Studying made me even better, in the sense that I have broader horizons.
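To make the dichotomic (binary) search mentioned above concrete, here's a minimal sketch using Python's bisect module; the sorted keys are invented for illustration, not the actual database.

```python
import bisect

# Hypothetical sorted list of record keys.
keys = [3, 8, 15, 23, 42, 57, 91, 108, 256, 512]

def contains(sorted_keys, target):
    """Binary (dichotomic) search: O(log n) probes instead of scanning every row."""
    i = bisect.bisect_left(sorted_keys, target)
    return i < len(sorted_keys) and sorted_keys[i] == target

print(contains(keys, 42))   # True
print(contains(keys, 100))  # False
```

On a million sorted keys this is roughly 20 probes per lookup instead of up to a million, which is the whole point of the textbook trick.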
As a first-class Nigerian student of mathematics and computer science, this is so relatable. What people get wrong about computer science is that it is, first of all, a SCIENCE itself... You can't just wake up one day and say you wanna be a computer scientist or a programmer without understanding the core principles of computing... My advice to self-taught developers is that they should try to go to university to back up their already acquired knowledge of programming.
For a university class, I learned a language called Haskell, which sounds a bit similar to OCaml: no loops, and super strong typing. It wouldn't even let you compile/run the program unless every part of it was fully typed beforehand. Our final project was to use a parsing library and implement our own programming language from scratch, using Haskell, and it really taught me a lot about how to construct a programming language, as well as about forming a plan and format before starting to code.
I'm sort of self taught. I learned C and C++ in college, along with some higher math, Calculus and Linear Algebra (I don't remember much but I remember enough of the concepts to know what to google). Now I'm mainly a C# dev. I'm completely self taught in SQL and front end, though.
Calculus, algebra: these are the foundations of engineering, especially software and computer science. I can't imagine a professional software developer without a good math background. Sure, you can build a bridge without the math, as long as it is an amateur endeavor in your backyard.
Maybe should've included Computation Theory and Formal Languages.
As a self taught programmer, I can say that this is all true. Your channel inspired me to create my own Python channel. Thanks for all you do!
Do we get bonus points if we _did_ learn this as a self taught developer? Math is super important if you want to do any kind of game development. Low level hardware knowledge is invaluable when you need to write high throughput highly optimized code in a big datacenter environment, and I consider it a big separator between an average programmer and a great one -- it's very obvious when someone does or does not know this stuff.
Yes, I am sad he did not mention DOD (data-oriented design), which utilizes knowledge of the cache. I'm self-taught and I guess I'm learning that, which, according to this video, isn't even taught to university students.
To add to this, some of the topics that were left out are:
1. Real time and embedded systems
2. Advanced computing networks
3. Software quality and testing
4. Distributed system
5. Introduction to robotics and industrial automation
6. Compiler design
7. VLSI design
8. Mobile computing and applications
9. Software design and Architecture
10. Software requirements
11. System programming
If I were allowed to subscribe to only one channel on YouTube, I would choose this one! Hands down the most knowledgeable and thorough YouTuber when it comes to computer technology. You're doing a great job Tim, keep it up!
I really appreciate that! Thanks :)
Totally agree. Without this knowledge, a self-learner can only be a coder, never a programmer.
Fun fact... I was studying these advanced math topics in the last 2 years of high school... the amazing Eastern European system, where you get a ton of garbage to study too early, while in western countries they study such things in 1st-2nd year of uni. I had a friend who went to uni to the UK (CS degree) and the 1st year, when it came to math, he didn't have to study anything because he already knew the topics.
The title of the video did its job: no apology needed. The YouTube algorithm encourages clickbaity titles, so we learn to live with it. See Veritasium's video on the subject if you need convincing.
The compilers course was really fun. Highly recommended.
I transferred universities when going into second year, and now I'm taking Logic Design again just so I can take Principles of Compilation in the 3rd year. By far the course I'm most looking forward to, really hope they don't cancel it.
Tbf I am a college student, but I consider myself self-taught since I skip almost every class and only study for the exams. I knew everything you mentioned from the internet. Not all self taught developers are uninterested in these topics. For instance I'm very passionate about different paradigms even though we haven't tackled this subject yet in college (and I don't think we will since they're OOP fanatics lmao)
Honestly the Operating Systems class might be the most important. So many things are built upon those concepts, and it is really difficult to get your head around the reason why many of the things you are using work like they do, without knowing how a kernel works.
These are literally the first things I learned/continue to sharpen
As a self taught python and c++ and c# programmer, it took me experience to fully grasp OOP.
Thanks for the video as always Tim, don't mind those people who are pissed about the video's title.
As a Junior, I took a class in Linear Algebra in 1984 because it sounded easy. It ended up being the only C I got in 4 years and I still have no idea what Linear Algebra is.
I feel you about maybe or maybe not going back to school. I completed an associate's degree while I was already working as a software developer and was thinking about getting a bachelor's but the requirements are just too dumb. Why do I need to know anthropology for a computer science degree?!
Obviously the courses you listed here are probably beneficial in some shape or form (except calculus, I took calculus II and still haven't used it and likely forgot everything about it at this point).
But having a degree serves as a pass where your application may not be thrown away if you don't have one.
This is encouraging, to a point. I'm not a developer... but I've been teaching programming to myself since, I don't know, 12 years old? And just now I'm really seriously considering getting into development as a job. And the thing is... despite the fact that I had formal education in computing, I have a working knowledge of every single course/topic you mentioned, acquired just out of pure curiosity and an affinity towards formalism and the systematic treatment of these subjects (and, to a certain extent, out of a little disdain/contempt towards the 'hacky/grokky' way of doing things...).
If you really dive into what you are doing the more you also want to understand what's going on under the hood. If you are serious about your journey you can learn everything. University courses are one way, but the point is to have advanced peers or a teacher as in order to learn effectively we all need different perspectives...
You can't say whatever you feel,
Most of these are useful in one way or another.
--respect
Haven't watched Tim in a long time. This is a great video and it's super cool to see how much he's grown. Like literally, he looks older! But still sounds like the same Tim :D
Great insight in this video, thanks Tim. For someone on the self-taught route without a CS/Math degree, from what I have read, watched and experienced, having practical knowledge of advanced math is only really beneficial if you're working in the Artificial Intelligence, Machine Learning, or Augmented/Virtual Reality spaces. But I could very well be wrong! I'm documenting my self-taught journey on my channel with the hope of landing a role before the end of 2022. Happy New Year!
I went in thinking "Ha! I know bash!" but I was treated with a whole bunch of goodies i know nothing of. Great video! Made my day.
Even though I'm doing CS right now I still got something out of this video, thanks a lot tim.
I have two degrees in IT. One is in web development. The other in system admin. I also taught myself how to develop and program software among many other things. The difference is a formal education presents you the information. Self-education forces you to think things through yourself. There are advantages and disadvantages to both, but your point is correct.
When you teach yourself, it is you deciding what you don't know, when it is impossible for you to know what that is. It is you looking outside while you are on the inside. In a classroom, you can see inside yourself through others, and that is how you see yourself from the outside. You also get a holistic view that is already laid out for you to learn, whereas in self-education you have to put the pieces together yourself to make the picture whole.
Though I do not think I learned nearly as much in school as I did on my own, I think school created the right frame of mind and made the path clear for me. Also, it is common knowledge in school that you are only getting started there and won't learn everything. The notion you will learn everything comes from your recruiter. The professor/instructor will be the first to say you are not going to learn everything there.
In school you learn how to be a pro, not all of what a pro can know. That takes time to achieve. I agree with all your points, and with the need to know advanced math particularly. That is because of how much easier it is to be clever in your code. Every programmer should know how to create advanced algorithms instead of mere mathematical operations.
Thanks for the content!
This is just my two cents but- If you're considering whether or not you should finish a degree in Computer Science, then there's nothing wrong with choosing another major. However, if you're debating whether or not to finish college at all, I'd recommend at least getting a Bachelor's degree. You seem bright enough that you could probably learn the topics on your own anyways, but having a bachelor's degree will help you in job interviews later on down the line. Two years might seem like a long time to finish your degree, but you could think of it as just accumulating tools for your toolchest of knowledge for whatever it is you plan to do later on.
However- If you've got a strong sense that there's something more important; like a good business opportunity or something in life that you want even more than a degree, there's nothing wrong with jumping on that opportunity. Just keep in mind that finishing a degree later on is much harder than delaying your plans for two years. Whichever path you choose, there will be a trade off.
Programming paradigms will be the topic I'll have to learn.
-Advanced math proof
-Digital systems and computer Architecture
-Programming Paradigms
-Programming Language Concepts
-Operating Systems
I'm gonna be like you someday🥺(13 year old)
I've been coding since I was 8. Now I'm glad to have reached 40, earning money for doing the same things I did as a little boy.
Me too
Thanks for this...
I started when I was 17, and it pains me knowing that Tim is a year younger than me and can code circles around me, one handed. Keep going kid.
Add software design patterns, software architecture, requirements engineering, and clean coding to the list.
linear algebra can be useful when working with some applications of arrays, but I can definitely see why they wouldn't be used too much. It is a lot more important when it comes to understanding the underlying math behind statistics and forecasting - the data science side of "programming".
Thanks, this was interesting!
Did you learn these courses from internet while being enrolled in a university program or from your teachers in uni ?
Why am I feeling so good that I know all of these (except Operating Systems) even though I'm self-taught? Where did I learn all this 😅?
My opinion on calc?
I don't necessarily use it terribly often, but when I do, it's an absolute lifesaver. I once made a little flight sim with a variable timestep in MS Excel and eventually I switched it to storing acceleration and tracking jerk. Absolutely no way I could have made it work with only knowledge of trig and below.
Calc can also be useful for things like easy definite or indefinite integrals of some shape, or for easy derivatives, which is pretty nice for things like terrain generation code.
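As an illustration of that jerk → acceleration → velocity → position chain, here's a minimal fixed-step Euler sketch in Python (not the Excel sheet from the comment; the timestep and jerk values are made up):

```python
# Simple Euler integration: jerk -> acceleration -> velocity -> position.
dt = 0.01            # timestep in seconds (made-up value)
jerk = 2.0           # constant jerk in m/s^3 (made-up value)
accel, vel, pos = 0.0, 0.0, 0.0

for _ in range(1000):        # simulate 10 seconds
    accel += jerk * dt       # integrate jerk to get acceleration
    vel += accel * dt        # integrate acceleration to get velocity
    pos += vel * dt          # integrate velocity to get position

# Analytically, pos = jerk * t^3 / 6 ≈ 333.3 for t = 10 s;
# this simple Euler scheme lands slightly above that.
print(round(pos, 1))
```

Knowing the calculus tells you what the answer should be and how the discretization error behaves, which is exactly where the math earns its keep in a sim with a variable timestep.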
We love you Tim, thanks for your advice. Don't sleep, keep creating!
Hey! Bought AlgoExpert this week and I love it! I see it like a $90 video game with 160 levels... I'm grinding it 😎 lots of fun.
I took a class in set theory back in '68; I didn't know what I could do with it back then, but today it is really helpful in Blender.
I’m a sr engineer at my company and I do interviews for all kinds of skill levels. I ask about programming paradigms in every interview. We work in JavaScript and I find it spans across multiple paradigms, and most modern languages are multi-paradigm. Knowledge of that is important & if someone talks about only OOP, it’s a bit of a red flag
I agree that discrete math is quite important as it's also the basis for an algorithm analysis course (typically a junior level class that may be combined with data structures). In programming, making the code functionally correct is only half the battle. For example, one should never use bubble sort as it's highly inefficient - this course will tell you why. It will then allow you to perform similar analysis on your own algorithms.
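A quick, illustrative sketch of why an algorithm-analysis course warns against bubble sort (the list sizes are arbitrary): counting comparisons shows the classic quadratic growth, roughly quadrupling every time the input doubles.

```python
import random

def bubble_sort_comparisons(values):
    """Classic bubble sort; returns the number of comparisons it performed."""
    a = list(values)
    comparisons = 0
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return comparisons

for n in (1_000, 2_000, 4_000):
    data = [random.random() for _ in range(n)]
    # Comparisons grow roughly 4x each time n doubles: O(n^2) behaviour.
    print(n, bubble_sort_comparisons(data))
```

An O(n log n) sort like the built-in sorted() grows far more slowly, which is the kind of conclusion the analysis course teaches you to reach before running anything.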
I'm self-taught, and I understand how 1s and 0s get user input from the mouse and keyboard and eventually send an email through a browser tab.
I'm tryna brag, but also tryna say: non-CS-takers, don't let anything like this discourage you, take your time.
AlgoExpert ad before the video, AlgoExpert ad in the video itself, AlgoExpert ad on the video, AlgoExpert ad at the end of the video.
You are quite right, the OS fundamental courses I did in university came in quite handy when working with large Java projects where multi-threading and thread synchronisation presented major problems to solve.
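For readers who haven't hit thread synchronisation yet, here's a minimal Python sketch of the kind of problem that comment refers to (not the Java project itself): several threads updating shared state, with a lock turning the increment into a critical section.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:          # without this critical section, updates can be lost
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000, reliably, because the increment is protected by the lock
```

Remove the lock and the count can come up short, because `counter += 1` is a read-modify-write that two threads can interleave: exactly the race condition an OS course names and explains.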
That's the best type of content
I agree on most points except linear algebra, which does come in handy when writing graphics applications. A few more points I'd like to add: (1) human perceptual psychology (2) languages (in general) (3) computers and their effects on society (4) IT-specific law (5) economics (6) software design patterns.
One of the things I only learned after going to university is how to pronounce "boolean". There was no YouTube back then and all my self-teaching was from books.
How did your mind pronounce it by default? Bowlion? Etc?
Boo-lean.
Should have been boo-le-an.
@@ShunyValdez ahh so like scary ghost Boo micropause lean back
I'm going to say Big-Oh notation and what it means. This is best illustrated by searching a sorted list for a number (linear time) vs doing a binary search on a sorted list (log time). Most of the hardware today is powerful enough to make this a moot point for home projects; however, this becomes super important at big corporations with dbs containing millions of rows.
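A minimal sketch of that point, counting probes rather than timing (the data and target are made up): on a million sorted numbers, a left-to-right scan can need a million comparisons while a binary search needs about twenty.

```python
def linear_probes(sorted_list, target):
    """Left-to-right scan: O(n) comparisons in the worst case."""
    for probes, value in enumerate(sorted_list, start=1):
        if value == target:
            return probes
    return len(sorted_list)

def binary_probes(sorted_list, target):
    """Binary search on a sorted list: O(log n) comparisons."""
    lo, hi, probes = 0, len(sorted_list) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if sorted_list[mid] == target:
            return probes
        elif sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return probes

data = list(range(1_000_000))
print(linear_probes(data, 999_999))  # 1000000 probes
print(binary_probes(data, 999_999))  # around 20 probes
```

That gap is invisible on a home project and decisive on a table with millions of rows, which is exactly what Big-O notation is for.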
As a self-taught low-level programmer, I learned computer architecture but without the Boolean algebra part. I also taught myself some compiler theory. My only regret is not having a math background. Although it's not strictly necessary, I think math helps you a lot to become good at problem solving.
I have a degree in mathematics and I'm also a programmer. It's not worth the regret.
While I'll definitely concede about math, I did definitely pick up the rest of the subjects you mentioned on my own.
I started learning programming to learn more about computers however so it was always part of my interests to go as low level and learn as much as I could because I have a problem and sunlight scares me.
I’m a little biased on this one. I’m a 4th year CS student myself and if I hadn’t been self-learning in my free time, I’d be in a bad place. At least in my particular experience, I’ve got a vast list of things I found out I needed to learn over the years just by trying to create something and getting stuck and then researching the latest technologies to tackle an issue. My projects are also personal and have utility (not compilers, parsers, tic tac toe, etc) so I definitely have more focus when I’m into it. My background definitely helped me a lot too. I was an electronics repair technician (for anything with basic circuitry, modem cards, power amplifiers etc) for 6 years reading individual bits of information with special machines and schematics before going into CS. I already had a pretty good idea of computer architecture so I can’t say self-taught development from ZERO is going to get you there but after almost graduating with my CS degree, I realize it was a year of nonsense, a year of (maybe or maybe not) relatable mathematics depending on intended dev field, and a year of programming knowledge that I could have gotten for free online. I saved all the high level courses for last and those are the only ones I can say are worth it in a formal CS program (algorithms, data structures, architecture, security, concurrency, etc)
The funny part is that my Mechatronics Engineering degree combined with my Math Minor taught all of this. The languages also covered a wide swathe of paradigms. Verilog, Assembly, C, MATLAB, as well as going from literally figuring out gates from CMOS diagrams, all the way to Assembly, and then C, as well as converting between the two.