4. Assembly Language & Computer Architecture

  • Published 28 Apr 2024
  • MIT 6.172 Performance Engineering of Software Systems, Fall 2018
    Instructor: Charles Leiserson
    View the complete course: ocw.mit.edu/6-172F18
    YouTube Playlist: • MIT 6.172 Performance ...
    Prof. Leiserson walks through the stages of code from source code to compilation to machine code to hardware interpretation and, finally, to execution.
    License: Creative Commons BY-NC-SA
    More information at ocw.mit.edu/terms
    More courses at ocw.mit.edu

COMMENTS • 281

  • @ModusPonenz
    @ModusPonenz 2 years ago +244

    A friend and I were taking an x86 assembly language class together back when we were working on our Computer Science degrees. We worked at Intel at the time. At some point during the semester the instructor discovered that we were Intel employees. We worked in the Memory Components Group at that time and had nothing to do with CPU products. Even so, from that point on, whenever he was explaining some of the peculiarities of the x86 architecture, he jokingly glared at us as if it were our fault. He would say, "Why does this instruction work the way it does? Because Intel decided that's the way it is." And look directly at us. We both got an A in the class.

    • @Meiyourenn
      @Meiyourenn 2 years ago +4

      Did you have a good understanding of MASM?

    • @ModusPonenz
      @ModusPonenz 2 years ago +9

      @@Meiyourenn It's been a long time since I took this class, but yes we had a good understanding of MASM. We were more familiar with the assembler that Intel sold with their development systems. But the class was using the Microsoft assembler. Luckily the Intel and Microsoft assemblers weren't all that different. At least the operand order was the same.

    • @ndotl
      @ndotl 2 years ago +9

      I worked for IBM, and had an adjunct professor who worked at IBM across the hall from me. Needless to say, I got a B+. (Intel) Assembly was my favorite language, until I learned C the next semester. C was my favorite language until I learned C++ the next semester. Even after learning Java, C++ remained my favorite language. Currently learning C# and relearning C++, while coding in Apex on the Salesforce platform.
      P.S.: I got a B+ when I really deserved an A, but I really enjoyed the experience and could go back to it if I really had to. I was at a Salesforce conference in San Francisco a year or so before COVID, and met some guy on the bus coding in Assembly on his laptop. In a way I kind of envied him, because whatever it was, it was going to be fast. I remember that in IBM assembly, which I did not code in, MVCL was considered by programmers to be one of the worst instructions ever invented.

    • @mr.incognito4640
      @mr.incognito4640 5 months ago

      Can you please recommend some good books on assembly?

    • @mr.incognito4640
      @mr.incognito4640 5 months ago

      @@ndotl Can you suggest some good books for assembly?

  • @jellyjams7217
    @jellyjams7217 2 years ago +74

    I do not have the prerequisites to be watching this video, but still watched the whole thing, learning a lot about something I know nothing about.

    • @justcurious1940
      @justcurious1940 6 months ago

      Same here, I enjoyed it.

    • @user-nk6dc2wk6p
      @user-nk6dc2wk6p 2 months ago

      Bro... stop it! I know you know at least something about computers... you're just being curious about what's happening inside the hardware and software you see on screen...

    • @saritshull3909
      @saritshull3909 1 month ago

      this is the way.
      not my way though lol. I'm trying to pass

  • @user-rz3uy5ok7x
    @user-rz3uy5ok7x 3 years ago +212

    I just love how you can see that he loves his job and that he truly enjoys teaching. Thank you for sharing this lecture!

    • @intuit13
      @intuit13 2 years ago +3

      Must be an amazingly charmed life to be super smart and love academia. A perfect life. :x I can only imagine

    • @muhammadsubhani7420
      @muhammadsubhani7420 2 years ago +2

      Love his passion for explaining architecture for software development implications!

  • @div6601
    @div6601 1 year ago +17

    Prof. Leiserson is an amazing instructor. I love watching his lectures. He never rushes through the material and always prioritizes quality over quantity. Thank you Prof. Leiserson and MIT OCW 🙏🏽

  • @paulmarkert5907
    @paulmarkert5907 1 year ago +2

    This lecture is so nostalgic for me and reminds me of my first programming assignments, literally in Assembler (albeit mainframe). I enjoyed it.

  • @mattgraves3709
    @mattgraves3709 2 years ago +4

    Great series!
    I operate at a couple of layers above this currently, but I love learning about the layers below and maybe one day will get to work a bit closer to the hardware on some performance critical projects.

  • @wmffmw1854
    @wmffmw1854 11 months ago +4

    I used to teach Assembly Language, DOS, CPU architecture, and the instruction-set timing and bus-cycle operation for several systems, including the DEC PDP-11, Intel MCS-86 and MCS-51 and earlier. My class started with hardware operation from the first clock pulse after the release of the reset button, including machine instruction and bus cycles and how instruction execution controlled hardware operation. We also introduced hardware emulators. To support the class, I wrote a textbook on the theory and operation of microprocessor hardware and software; the Z-80 / 8080 CPUs were used as the example architecture. My class supported the NTX 2000 NATO Telephone Switching System back in 1980 for NATO & ITT, at the introduction of fully integrated digital switching systems for central-office telephone switches: basically, 256 microprocessors operating in fully redundant pairs, all under the control of a custom minicomputer executing a real-time, multi-tasking operating system.

  • @kbflom4500
    @kbflom4500 2 years ago +6

    Thank you for sharing and giving some insight into the workload our computers handle under the hood. This class inspires me, and even if the topics are abstract, it reminds me how lucky we are to have advanced human-interface applications - and it all scales down to processor Hz and memory - but all of that is thanks to students and lecturers digging deep persistently, and manufacturers even deeper.

  • @madyogi6164
    @madyogi6164 1 year ago +3

    Joyful to watch, even as entertainment.
    Assembly is very fun to do. Maybe a suicidal task if you're doing something for a normal OS, but pure joy when there is no such thing as an OS, no C, and you're starting almost from scratch - microcontrollers, for example!

  • @grzesiek1x
    @grzesiek1x 1 year ago +1

    It became clearer to me once I started doing digital electronics projects myself, and what is amazing is that I come up with ideas someone has already invented - but it is amazing sometimes to invent them myself! It is like exploring history or archaeology, and you find out what is going on inside your computer.

  • @fluctura
    @fluctura 2 years ago +16

    Thank you SO MUCH for putting this online!! I'm writing an assembler by myself to actually really understand Assembly and machine code, and this one is an eye-opener in so many ways for me

    • @kewtomrao
      @kewtomrao 1 year ago

      How's the project going?
      I want to do it myself too but I'm scared to start it :)

  • @headoverbars8750
    @headoverbars8750 3 years ago +12

    Thank you so much MIT and Professor Leiserson!!
    I write primarily to a VM (the JVM via Kotlin), which compiles to Java bytecode... I am loving that I get almost two levels of abstraction down by working through these lectures in C... I mean, I wasn't oblivious, but I didn't learn assembly as I should have.

    • @NazriB
      @NazriB 2 years ago

      Lies again? Hello DLC

  • @kabel74
    @kabel74 2 years ago +11

    Among my favourite computer science subjects when I was in college way back then. It tunes your mind to think really hard about optimizing your code for given hardware. Amazing lectures and brilliant labs by a similarly enthusiastic lecturer too 🙂

    • @jrwickersham
      @jrwickersham 2 years ago

      I loved the living crap out of the physics classes I took that used Fortran (91??)
      Especially when looking at the problems that looked into route optimization across a large matrix. Prof at the time was a former Honeywell employee. I tossed out a comment “wow, this looks like figuring out how to plan a route for a Tomahawk missile..”.
      There was a look and a wink, and a sly “yep..”

  • @davidpanetta5492
    @davidpanetta5492 2 years ago +3

    Interesting! I, much like the professor, started by doing most software projects in assembly code. Over the years I learned many; once you get the hang of one, others come more easily. I was reflecting back on one project I worked on where the code ran on an IBM 370. There were many groups involved, each writing code with only a specification sheet. I later learned it was a secret government project, so one group didn't know much about any other; all we ever saw of other groups' work was object code modules for the linker, which got linked to our code. Software breaks, and in several cases the code-authoring group no longer existed (legacy), and if there was source code it was classified, so when we traced a fault to such an object module we often had to disassemble it to figure out what it was doing. Our group usually wrote in IBM 370 assembly, but we soon discovered that other groups had used many different languages - you can recognize compiler constructs after a while. Some code was written in C, FORTRAN, PL/1, COBOL, even FORTH and assembly. Once we figured out what was broken, we had to write a patch for that object module - remember, it was legacy code, no source available. Sometimes it was fun, sometimes it was stressful!

  • @brucefelger4015
    @brucefelger4015 2 years ago +1

    The first program I ever wrote was for CS101, which we had to write in binary literals, so 0s and 1s. Learning from the bottom up has helped me more times than I can name. Thanks for this.

  • @GamerX84
    @GamerX84 4 years ago +26

    Since it wasn't mentioned, the older 8/16 bit systems often used bank switching to access more hardware resources within the 64KB address space. The CPU could only see 64KB at a time, but the hardware mappings within the 64KB could be changed on the fly as is possible with the Commodore 64.
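    To make the bank-switching idea above concrete, here is a small C sketch. It is only a toy simulation: the bank count, the window size and the select_bank interface are made up for illustration, not the Commodore 64's actual memory map.

    #include <stdio.h>
    #include <stdint.h>

    #define BANK_SIZE 0x4000u          /* a 16 KB window inside the 64 KB address space */
    #define NUM_BANKS 8                /* 8 x 16 KB = 128 KB of physical memory         */

    static uint8_t physical[NUM_BANKS][BANK_SIZE]; /* more RAM than the CPU can see      */
    static uint8_t *window;                        /* what the CPU currently sees        */

    /* Model of a write to a bank-select register: remap the window on the fly. */
    static void select_bank(int bank) {
        window = physical[bank];
    }

    int main(void) {
        select_bank(2);
        window[0x0000] = 0xAA;         /* write lands in physical bank 2                */
        select_bank(5);
        window[0x0000] = 0x55;         /* same CPU address, different physical byte     */
        select_bank(2);
        printf("bank 2 still holds 0x%02X\n", window[0x0000]);  /* prints 0xAA          */
        return 0;
    }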

    • @legion_prex3650
      @legion_prex3650 1 month ago

      It is obvious that you were game programming with (Turbo) Assembler on the C64 back in the '80s... :) Am I right?

  • @rustycherkas8229
    @rustycherkas8229 1 year ago +1

    At the other end of my career, I caught a glimpse of what was then called "micro code"...
    An "instruction decode" cycle means that the instruction bits en-/dis-able circuitry of data signal pathways. (Eg: enable a shift register to multiple/divide by 2, or enable the block of 2-bit adders to sum bytes...) I envisioned this magic as a really complex railway switching yard. This is the coalface where machine code's 1s or 0s appear as 'high or low' electrical potentials.
    Then came learning about micro code, the embedded multi-step gating/latching operations that would occur (synchronously) within one or a few clock cycles. Way back then, the hardware I saw didn't have advanced micro code or machine code or Assembly to multiply two integers; multiply was done with many Assembly instructions (usually a library 'routine')...
    It helped when thinking how the effect of C's pre-/post-increment (decrement) instructions could be achieved, for instance.
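    As a rough illustration of the kind of multi-instruction multiply routine mentioned above, here is a shift-and-add sketch in C - an assumption about how such a library routine might be structured, not the actual code from that hardware.

    #include <stdio.h>
    #include <stdint.h>

    /* Shift-and-add multiply, the way a library routine might do it on a CPU
       with no multiply instruction: one conditional add and two shifts per bit. */
    static uint16_t mul8x8(uint8_t a, uint8_t b) {
        uint16_t product = 0;
        uint16_t multiplicand = a;
        while (b) {
            if (b & 1)                /* low bit set: add the shifted multiplicand */
                product += multiplicand;
            multiplicand <<= 1;       /* shift left = multiply by 2 */
            b >>= 1;                  /* move on to the next bit    */
        }
        return product;
    }

    int main(void) {
        printf("23 * 45 = %u\n", mul8x8(23, 45));  /* 1035 */
        return 0;
    }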

  • @WacKEDmaN
    @WacKEDmaN 9 months ago +2

    I'm learning assembly on the Z80 and now, after watching half of this course, I'm thinking about moving to C and hand-compiling the required optimised assembly... this playlist is great stuff... sure, a lot of it I won't need... but it's nice to see not much has changed in the x86-64 world...
    Big thanks to MIT for sharing this stuff with us all :)

  • @Rfc1394
    @Rfc1394 2 years ago +10

    When I went to City College back in the late '70s, Assembly Language was commonly taught. Today, nobody worries about assembler except people who need extreme performance and compiler writers. I used to work at a shop that wrote almost all their software in Assembly. Nobody does that anymore.

    • @psycronizer
      @psycronizer 2 years ago +1

      It's a shame really, because it is right down at machine level, and not that difficult to learn; it just takes longer to get certain things done without relying on library calls etc. I loved it. Using my ZX Spectrum I wrote a Defender-type game; took me months, lots of coffee, all on paper, no assembler.

    • @mytech6779
      @mytech6779 2 years ago +6

      Nobody on x86.
      Asm is still needed for the kajillions of microcontrollers embedded in various products, where they still have only a kilobyte of space for cost savings and a low-clock-speed 8-bit processor for minimal power consumption. They may get the overall program compiled from C++, but then it needs adjustments. These specialty compilers do not have the broad support that the x86 compiler-optimizers enjoy.

  • @Jeremygee
    @Jeremygee 3 years ago +5

    By far the best intro to x86 assembly I’ve ever seen

    • @pauleveritt3388
      @pauleveritt3388 2 years ago

      Come on! It's MIT, for Pete's sake. Would you expect any less?

  • @fernandoochoaolivares8829
    @fernandoochoaolivares8829 2 years ago +1

    One of the best coders I've seen in a while...

  • @thomasclapton2010
    @thomasclapton2010 1 year ago

    Thank you for sharing and giving some insight into the workload our computers handle under the hood.

  • @charlespackwood
    @charlespackwood 2 years ago +6

    Every computer nerd needs to learn Assembly Language.

  • @kd1s
    @kd1s 2 years ago +12

    My introduction to assembly language was on a TRS-80 back in the day. I found I could speed things up by poking values into memory. Then I got the assembler and life was fantastic. And I still remember the video addresses: 3C00 to 3FFF.

    • @pauleveritt3388
      @pauleveritt3388 2 years ago +1

      I wrote a program on my TRS-80 that would draw the X-Y axis centered on the screen and then plot cardioids using polar coordinates. Not bad for a 9-line BASIC program.

    • @jimmccusker3155
      @jimmccusker3155 2 years ago

      @@pauleveritt3388 Remember this?
      LD HL, 3C00H   ; HL -> start of video RAM
      LD DE, 3C01H   ; DE -> the next byte
      LD BC, 3FFH    ; byte count for the block copy
      LD (HL),20H    ; put a space (20H) in the first cell
      LDIR           ; block copy (HL)->(DE), filling the screen with spaces

    • @kd1s
      @kd1s 2 years ago +1

      @@pauleveritt3388 Ah very cool. Yeah I did all sorts of things, had a voice input module and a speech synthesizer hooked up to mine.

    • @psycronizer
      @psycronizer 2 years ago

      @@jimmccusker3155 Was this some display buffer program? The load-increment-repeat instruction and LD (HL),20H are the clue for me. I never used a TRS, but that was a Z80A, just like my ZX Spectrum, I believe. I wrote a Defender-type game in assembly, using paper and POKE statements; took me fucking ages! It didn't use sprites, just the larger character set attributes, but it ran! The Spectrum split up the display screen in a really weird way though.

    • @jimmccusker3155
      @jimmccusker3155 2 years ago

      @@psycronizer Yes, the TRS-80's video was mapped from 3C00H - 3FFFH and those assembly instructions were the quickest way to clear the screen using a space character (20H).

  • @leixun
    @leixun 3 years ago +168

    *My takeaways:*
    1. Assembly language 0:22
    - Why we want to have a look at the assembly language 6:46
    - Expectation of students 10:05
    - If you really want to understand something, you should understand one level below what you normally need to use 12:20
    2. X86-64 ISA primer 13:28
    - 4 important concepts: Registers 14:03, Instructions 20:25, data types 28:15 and memory addressing modes 35:50 (a small addressing-mode sketch in C follows this list)
    - Assembly idiom 43:08
    3. Floating-point and vector hardware 47:54: SIMD
    4. Overview of computer architecture 56:55
    - Historically, computer architects have aimed to improve processor performance by exploiting parallelism (e.g. instruction-level parallelism (ILP), vectorization, multicore) and locality (caching)
    - ILP 1:00:35: pipelining, pipeline stall, hazards
    - Superscalar processing 1:09:08
    - Out-of-order execution 1:11:40: bypassing, from in-order to out-of-order, register renaming
    - Branch prediction 1:15:44: speculative execution
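    For the memory-addressing-modes takeaway above, here is a small sketch of the usual pattern in C. The assembly shown in the comment is only a typical gcc/clang-style rendering; the exact instructions depend on the compiler.

    #include <stdio.h>

    /* A typical compiler turns a[i] on x86-64 into a single load that uses the
       base + index*scale addressing mode, e.g. (assumed gcc/clang-style output):
           movl (%rdi,%rsi,4), %eax    # load a[i]: base=a, index=i, scale=4 bytes
    */
    int load_element(const int *a, long i) {
        return a[i];
    }

    int main(void) {
        int data[4] = {10, 20, 30, 40};
        printf("%d\n", load_element(data, 2));  /* 30 */
        return 0;
    }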

  • @veramentegina
    @veramentegina 4 years ago +20

    love this guy!! Great lecture really!!

  • @Anonnius
    @Anonnius 5 months ago

    Thank you for making this lecture freely available!

  • @georgealex19
    @georgealex19 2 years ago

    Great lecture. I graduated 6+ years ago and I already knew most of the stuff, but I just wanted to say I really appreciate the lecture and really liked the professor's attitude and presentation 👌

    • @georgealex19
      @georgealex19 2 years ago

      Edit: I’m actually impressed I knew the answer to how many bytes the quad-word has and also why it historically has 8 😂 man, I’m getting old

  • @KuldeepYadav-jw7jn
    @KuldeepYadav-jw7jn 2 months ago

    Prof. Leiserson is the professor every CS student on this earth deserves... I wish he had taught me.

  • @snaplash
    @snaplash 2 years ago

    I used to maintain SDS Sigma series mainframes, and in my spare time I'd write blinky light programs directly in machine language, entered via the front panel switches. No assemblers needed :)

  • @skywatchers9675
    @skywatchers9675 1 year ago +1

    Charles Eric Leiserson is a computer scientist, specializing in the theory of parallel computing and distributed computing, and particularly practical applications thereof. As part of this effort, he developed the Cilk multithreaded language. He invented the fat-tree interconnection network, a hardware-universal interconnection network used in many supercomputers, including the Connection Machine CM5, for which he was network architect. He helped pioneer the development of VLSI theory, including the retiming method of digital optimization with James B. Saxe and systolic arrays with H. T. Kung. He conceived of the notion of cache-oblivious algorithms, which are algorithms that have no tuning parameters for cache size or cache-line length, but nevertheless use cache near-optimally. He developed the Cilk language for multithreaded programming, which uses a provably good work-stealing algorithm for scheduling. Leiserson coauthored the standard algorithms textbook Introduction to Algorithms together with Thomas H. Cormen, Ronald L. Rivest, and Clifford Stein.
    Leiserson received a B.S. degree in computer science and mathematics from Yale University in 1975 and a Ph.D. degree in computer science from Carnegie Mellon University in 1981, where his advisors were Jon Bentley and H. T. Kung.
    He then joined the faculty of the Massachusetts Institute of Technology, where he is now a professor. In addition, he is a principal in the Theory of Computation research group in the MIT Computer Science and Artificial Intelligence Laboratory, and he was formerly director of research and director of system architecture for Akamai Technologies. He was Founder and chief technology officer of Cilk Arts, Inc., a start-up that developed Cilk technology for multicore computing applications. (Cilk Arts, Inc. was acquired by Intel in 2009.)
    Leiserson's dissertation, Area-Efficient VLSI Computation, won the first ACM Doctoral Dissertation Award. In 1985, the National Science Foundation awarded him a Presidential Young Investigator Award. He is a Fellow of the Association for Computing Machinery (ACM), the American Association for the Advancement of Science (AAAS), the Institute of Electrical and Electronics Engineers (IEEE), and the Society for Industrial and Applied Mathematics (SIAM). He received the 2014 Taylor L. Booth Education Award from the IEEE Computer Society "for worldwide computer science education impact through writing a best-selling algorithms textbook, and developing courses on algorithms and parallel programming." He received the 2014 ACM-IEEE Computer Society Ken Kennedy Award for his "enduring influence on parallel computing systems and their adoption into mainstream use through scholarly research and development." He was also cited for "distinguished mentoring of computer science leaders and students." He received the 2013 ACM Paris Kanellakis Theory and Practice Award for "contributions to robust parallel and distributed computing."
    WIKIPEDIA

  • @magnuswootton6181
    @magnuswootton6181 3 years ago +2

    Thanks for the ILP lesson! Now I know what it's called!!! ILP is the future, I think.

  • @legion_prex3650
    @legion_prex3650 3 years ago +5

    cool! Thanks Professor Leiserson! Great lecture! I still love assembly.

  • @leonlao744
    @leonlao744 2 years ago +2

    I will absolutely give this lecture a thumbs up

  • @vetiarvind
    @vetiarvind 2 years ago +5

    Holy shit he's one of the authors of the CQRS book. We're in the presence of a legend.

    • @danman6669
      @danman6669 2 years ago +1

      You mean CLRS. What's really ironic is the 'L' in CLRS stands for Professor Leiserson's last name, yet you got it wrong and wrote 'Q' instead, even though the whole point to your comment was to point out that he is one of the co-authors of the book.

  • @DrewryPope
    @DrewryPope 2 years ago

    A truly great overview, thank you. I've watched this multiple times.

  • @Marius-vw9hp
    @Marius-vw9hp 1 year ago

    I love his enthusiasm for the historical confusion X)

  • @lil-hooves
    @lil-hooves 4 years ago +25

    Incredibly helpful, thank you!

  • @Ali-kl3ql
    @Ali-kl3ql 2 years ago +1

    What a great mindset! 13:19: "Go one step beyond, and then you can come back!"

  • @antonfernando8409
    @antonfernando8409 2 years ago +5

    I came here to understand some things about C++ optimizations, but this is another level altogether; it's been 30 years since I last wrote assembler code (Motorola 68k to multiply matrices, it was fun). Still not a bad idea to get a refresher on assembly. So loop decrementing is more efficient than counting up - interesting, now I know why. That instruction pipeline stuff is just crazy shit, way out of my pay grade. Thanks prof, enjoyed it.
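    A minimal sketch of the count-up vs. count-down loops mentioned above, in plain C. Whether the count-down form actually wins depends on the CPU and on what the compiler emits for each; the point is only that decrementing toward zero lets the branch reuse the flags set by the decrement instead of needing a separate compare against n.

    #include <stdio.h>

    #define N 8

    /* Counting up: the loop test typically needs an explicit compare against n. */
    static long sum_up(const int *a, long n) {
        long s = 0;
        for (long i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    /* Counting down to zero: the decrement itself sets the condition flags the
       branch tests, so no separate compare is needed. */
    static long sum_down(const int *a, long n) {
        long s = 0;
        for (long i = n; i-- > 0; )
            s += a[i];
        return s;
    }

    int main(void) {
        int a[N] = {1, 2, 3, 4, 5, 6, 7, 8};
        printf("%ld %ld\n", sum_up(a, N), sum_down(a, N));  /* 36 36 */
        return 0;
    }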

  • @therealb888
    @therealb888 2 years ago +8

    13:10 The one level below & one step beyond learning philosophy is something I've followed as well.

    • @skilz8098
      @skilz8098 2 years ago +4

      I'm self taught and I have learned about every level of abstraction within computers from the science (mathematics, physics and chemistry side of things) all the way up the chain to logic gates, cpu and ISA design to OS and Compiler design and various other fields such as 3D Graphics - Game Engine design, to Hardware Emulation all primarily in C/C++ and a little bit of assembly, and now learning about Neural Nets, A.I. Programming and Machine Learning within Python. I didn't start at the bottom though nor the top. I started near the middle tackling some of the hardest aspects of software engineering with C++ while learning DirectX and OpenGL. C++ is one of the hardest languages to become proficient and accurate with as a first language and 3D Graphics and Game Engine Programming incorporates almost every and all other aspects of programming throughout a majority of all other industries. The Graphics Rendering is just one small part... You also have Audio Processing, Animation, Physics Engines, A.I. programming such as Path Finding and scripting dialogs for in game NPCs. Terrain Generation, Foliage, Weather and Environment Mapping and Generation, Various Tree Structures with Memory Management, Networking with Server and Client side applications that deal with sockets, packets, ip addresses and more if the game is meant to be Online, Compression & Decompression, and Security with encryption and decryption and much much more as this is only the "game engine" side of things as this doesn't include any "game logic" or "game rules"... Many Engines will have their own Parsers and some will even have their own built in scripting languages for that specific engine which also leads into compiler or interpreter design... But this is where I started... I branched from here going in both directions. Going up the ladder of abstraction I am now learning Python, Java and JavaScript with Machine Learning Algorithms and Techniques as my target goal and going down the ladder I'm learning more about Assembly and how it is designed through the implementation of its targeted hardware (ISA, CPU - Hardware Design). I never stop learning! That's why I'm watching this video... It's a refresher and I might pick up something new that I didn't know from before! But yes that 1 step above and below is a very good method to follow.

    • @MJ-ur9tc
      @MJ-ur9tc 2 years ago +1

      @@skilz8098 you must be kidding ! How could somebody learn so many things by just self studying!!!? And if your age is below 30 then it does not seem to be feasible to self learn so many concepts by your age. Please enlighten me how you have achieved this feat if you have really learnt so much within a time span of few years.

    • @skilz8098
      @skilz8098 2 years ago +1

      @@MJ-ur9tc Well, I am 40 and I never stop reading into things! Also, when I was a teenager either in middle or high school, I still knew more then than most 40 year olds did.

    • @therealb888
      @therealb888 2 years ago +1

      @@skilz8098 I mean, re-reading your comment just makes me pause. It's like word for word what I want to do. Like, are you trying to social engineer me? I'm not exaggerating, and I hope you're not either, but the things you've mentioned here are everything I want to learn and get good at myself. Moving down and moving up the abstraction ladder is what I want to do as well, but it's not something that's easy. It's like EECS, ECE/TCE + IT/NE. Those are like 3-4 fields. You have different teams, hell, different departments for them in an organization. Damn! I thought I was overreaching and doing something like that is impossible, but you have done it.
      Could you please share a timeline of your learning too?

    • @skilz8098
      @skilz8098 2 years ago +1

      ​@@therealb888 The learning or timeline that you are referring to doesn't really exist, it's kind of random and what's of interest to me at the time... but it would look something like this...
      From about 2000 - 2018 I have taught myself C and C++ and my primary motivation was to learn how to build a 3D Graphics or 3D Game Engine from scratch. I wasn't so much interested in making a game even though that was on the todo list; I was more interested in learning how to make the actual engine. I wanted to know how lines, triangles, etc. are being drawn to the screen. How to simulate physics, motion and animation, even 3D Audio was a part of this. By learning how to build an actual Game Engine and all of the different aspects and components of the Engine it teaches you memory management, resource management and object lifetime. It teaches you databasing, file parsing, writing your own scripting language sort of like an assembler or compiler, etc... and this is just to open, read, load, write to files either being graphics, audio, vertex data, later on shaders, initialization files, configuration files, save files, etc. Then comes the Graphics Pipeline and all of its various stages. It teaches you how to setup up and how to both integrate and use a Graphics API such as DirectX, OpenGL, and now Vulkan and I've learned all 3. Well, I've learned DirectX from version 9.0c upto version 11.1 as I currently don't have a Windows 10 machine and don't have access to hardware that supports DirectX 12. The same goes for OpenGL. I started with Legacy OpenGL or v 1.0, I then skipped version 2... and went straight to modern OpenGL which was 3.3 at the time and now its 4.5 or 4.6... And even to this day I'm still interested in learning so I read through the documents, forums, help pages, watch videos, etc... just to remain current. Even your programming language such as C++ changes with time. When I first started I was using Visual Studio Express 2003 or 2005... My current latest compiler is Visual Studio 2017 and when I finally get my new rig, I'll have Visual Studio 2019 or newer for C++ 20... I can only do C++ 17 and there are some features that I can not, but for C++ 14 and 11, I can do just about all of them.
      Outside of just the graphics portion, I then started looking into and trying to learn how compilers, assemblers and disassemblers are made, for I was diving into how the actual programming languages themselves were built. As for the assembly side of things, I started to learn this because of my newer interest in learning how hardware emulators are built or engineered. This is still the software side of things. At this point I had some knowledge of basic circuitry, more on the math side, but had never soldered a circuit board, though I've always had an interest in how they are made and how we are able to get them to behave in the manner we prescribe. So this led me into doing some research and I came across Ben Eater's YouTube page on building an 8-bit breadboard CPU. I've been following his channel ever since! I even went from there and started to watch videos on YouTube from MIT, Stanford, etc... on computer architecture design and more. I also found a free program called Logisim that allows you to place icons onto a grid to build circuits. And yes, you can build a CPU within it. I went this route because I didn't have the physical materials to follow along with Ben Eater's videos. And I was able to build my own implementation of his 8-bit breadboard CPU within Logisim, following and adhering to his ISA (Instruction Set Architecture). I can take the binary form of his assembly language and run it through my CPU within Logisim and it will run the same functions that he demonstrates.
      Before this I had already knew about the internal hardware components of a modern computer such as the CPU, the RAM, ROM(Hard drives or external storage), The Main Bus, CMOS originally then BIOS, expansion cards, video cards, sound cards etc. and I've always been able to build, configure and install all of the hardware and this goes as far back as the early to mid 90s. I was about 12 when I got my first PC and it had Windows 3.1 & Dos 6.0. When you turned it on, you got some writing on the screen about the connected hardware, the ram, etc... then you'd get a command prompt! C:\ If you wanted to use Windows, you had to be at the root of the drive and type "cd Windows" without the quotes, this would then put you in the Windows directory, and then you had to type Windows or Windows.exe for it to load and run... CDs never mind DVDs or BluRays were not exactly new, but most computers then didn't have CD Rom support, they still had the old floppies, both the smaller 3.5 and the larger flimsier 5.25 diskettes. The internet was around but it was both young and most didn't have it as it was expensive and there weren't that many graphics nevermind videos. It wasn't until the late 90s that we first got internet and even then (pre dot com boom) most websites had only a couple of graphics or images because their download times were slow. Over 90% of the internet back then was text and hypertext links and many sites were still found on ftp sites.
      So I've always had a decent knowledge of the hardware but never knew how they were built through their circuitry. Due to my interest in wanting to learn Hardware Emulation Programming, this lead me into CPU Architect and Design as well Electronic Engineering beyond the integrated circuits right down to the logic gates themselves and their connections or pathways. And I didn't stop there, I even learned how the logic gates themselves are made from the actual components such as transistors, resistors, capacitors, etc... and around 2005 - 2008 I was trying to put myself through college to get a degree in both Software and Hardware or Electronic Engineering. And within my Calc Based Physics Class I learned more about Electricity and the various components. Not so much the circuit diagrams, but what chemicals they are made of, how they interact with each other, what happens to them when you apply a voltage or a charge, or put them near a magnetic field, etc...
      Now as for the math behind it all my highest level of math from a classroom is Calc II. However I was teaching myself Linear Algebra and Vector Calculus while I was learning Calc I. This was from write the functions to build a working 3D Scene with a viewpoint and view frustum to have a working Camera as well as applying transformations from one coordinate system to another as well as applying transformations onto objects to simulate both physics and animations.
      When you start working with Audio especially 3D Audio this is an entire different ball game! Now you have to learn FFTs, bitrate samples, frequencies, compression and decompression algorithms etc...
      And it doesn't end there... Just recently within the past year or two, I have an interest in two different topics. AI - Programming, Neural Networks, Machine Learning and Operating System Design. The first got me into learning Python and the later refreshed my understanding of Assembly and C as C++ is my primary language of choice. So even to this day I'm still learning!

  • @davereid-daly2205
    @davereid-daly2205 2 years ago +1

    Fantastic explanations and diagrams. Extremely helpful indeed, thank you SO much !!!!!!

  • @lucas404x
    @lucas404x 3 years ago +107

    Great video. I'm trying to get this knowledge without being in college. Thanks Professor Leiserson! :)

    • @Uvisir
      @Uvisir 2 years ago +6

      same here

    • @ian_b
      @ian_b 2 years ago +29

      Back in the old days when microcomputers first came out, many of us had to learn Assembly from a handful of books and magazine articles. Nowadays there are massive resources available online. If we could do it, you can! Best wishes for your learning.

    • @therealb888
      @therealb888 2 years ago +3

      @@ServitorSkull Oh, Ben Eater - I knew I was familiar with it! He should have used the 8086 though.

    • @brucemunro8598
      @brucemunro8598 2 years ago +4

      @@therealb888 You can still buy new 6502 chips, and they are very cheap as well.

    • @Mr_ToR
      @Mr_ToR 2 years ago

      Watch Ben Eater. First watch his videos about building a CPU, then his other stuff.

  • @laboratoriodojulio
    @laboratoriodojulio 2 years ago

    For me.. this is the best class ... great.. best regards professor.

  • @amyh4606
    @amyh4606 2 years ago +2

    I remember this class. It was interesting to learn how programs work under the covers. That said, you wouldn't want to write a program in assembly.

    • @sbalogh53
      @sbalogh53 2 years ago +2

      I wrote many programs in Z80 assembler code back in the 1980s. I also reverse engineered large blocks of binary code stored in EPROMS, modified the code to suit new requirements, assembled the new code and burned it back into EPROMS. They were fun times. Not sure I would enjoy the experience with X86 code though.

    • @williamdrum9899
      @williamdrum9899 1 year ago

      These days, no. The x86 has a huge number of instructions, and the ARM compresses its immediate operands meaning that not every 32-bit value can be loaded into a register as a one-liner (usually on the ARM, constants are stored in nearby data blocks and you load from there instead). I don't mind writing in Motorola 68000 assembly tbh.

  • @naruto6918
    @naruto6918 2 years ago +2

    His book "Introduction to Algorithms" is just awesome ❤️

  • @JakeBechtold
    @JakeBechtold 2 years ago +4

    Man I wish I could have had this professor in college! Guess that's why he's at MIT.

    • @picklerix6162
      @picklerix6162 2 years ago +1

      I wish I had this guy in college. The professor who taught my assembly language class barely knew the topic but that didn’t stop me from learning.

    • @sbalogh53
      @sbalogh53 2 years ago

      @@picklerix6162 ... I taught myself Honeywell Easycoder (its assembler language) and Z80 assembler. It was not that hard back then, although today's chips seem to be far more complex, so there may be a need to reference the manual more often.

  • @sikendongol4208
    @sikendongol4208 2 years ago +4

    12:46 My experience is that if you really want to understand something, you want to understand it to the level that's necessary and then one level below that.

    • @skilz8098
      @skilz8098 2 years ago +1

      I'm bi-directional when it comes to learning about computers and I'm 100% self taught. I like to peek into 2 levels below and 2 levels above. I'll go all the way down to the transistors and how they are made within the realm of chemistry and physics all the way up to scripting languages such as Python.
      If I was to design a Degree Program at an arbitrary University here's a road map or an outline for each level of abstraction:
      Mathematics - General fields: Algebra, Geometry, Trigonometry, Calculus 1 and 2 at a minimum and maybe some Linear Algebra
      Physics - Basic Newtonian up to some Quantum Mechanics including the beginning stages of Electricity and Simple Circuits
      Chemistry - Basic Chemistry at the College Level primarily focused on the various chemicals and compounds that can act as conductors, insulators and inductors. (No need for Organic Chemistry)
      This would cover the basics needed for Entry Level Electronic Engineering
      Mathematics - Analytical Geometry, Vector Calculus, more Linear Algebra, Lambda Calculus, Probability & Statistics, Logic - Boolean Algebra, Introduction to Truth Tables & Karnaugh Maps.
      Physics - Quantum Mechanics continued, deeper theories such as wave propagation
      Circuit Design - Ability to both read and build electric diagrams or schematics using both DC and AC type currents learning about the differences between circuits components that are either in parallel or in series, voltage and current dividers. Also covering both Analog and Digital Electronic design patterns leading us into our Logic Gates.
      Chemistry - Specialized on the components that make up your transistors, resistors, diodes, capacitors, rectifiers, and more...
      Still no programming yet...
      Mathematics - More on Boolean Algebra and Discrete math that is Log2 Base mathematics (Binary Arithmetic and Logic calculations), Extended Truth Tables & Karnaugh Maps, State Machines covering both Mealy and Moore machines. Arbitrary Languages as Sets with a defined Alphabet.
      Physics - building various digital circuits and analyzing them with voltage meters and oscilloscopes.
      Chemistry - diving deeper into the components at the quantum levels
      Mathematics - Complex Number Analysis, Euler's numbers and formulas, Fourier Series, Laplace Transforms, Signal Analysis and FFTs(Fast Fourier Transforms)... and more
      Physics & Chemistry - maybe specialized fields at this point
      Bringing it all together, your first day as a programmer, well hardware / software engineer!
      Digital Circuits I - Combinational Logic - Integrated Circuits and Complex Circuit Design building an adder, a multiplexer and demux or selector, comparators, binary counters.
      Digital Circuits II - Sequential Logic - Feedback loops, SR Latch, D-Latch, Flip Flop, JK Flip Flop, Toggle, Timers, Registers
      Digital Circuits III - High and Low Logic, Synchronous vs Asynchronous, Bit Addressing, Memory and Data Paths
      Digital Circuits IV - Putting It All Together (ISA Design) - Refresher on Truth Tables, Karnaugh Maps & State Machines.
      *Introduction to HDL and Verilog
      **Any additional mathematics, physics or chemistry that is needed
      ISA I - Basic Turing Machines - Implementing one on a breadboard then move to either Proto Board and or a PCB
      ISA II - Pipelining
      ISA III - Caches
      ISA IV - Branch Prediction
      ISA V - Out of Order Execution
      **Any additional mathematics, physics or chemistry that is needed
      Transition from Hardware to Software:
      - Building an Assembler, Disassembler and Hex Editor.
      - Learning about the C Language and Building a basic C Compiler, Linker & Debugger
      - More Mathematics
      *Advanced Courses: GPU(Graphical Processing) - APU(Audio Processing) design, Storage Design, Monitor or Display Design, I/O Peripheral Design (Keyboards and Mouse) these would be basic and simplified versions of what you can buy on the market, something that could theoretically be done on a breadboard and easily be done on something like either on a ProtoBoard, PCB, of FPGA ... These may also include various classes related to Mechanical Engineering and other required mathematics, physics and chemistry courses.
      Operating System Design (OSD)
      OSD I - CMOS, Bios and or EFI/UEFI, Bootstrapper, the Kernel, the HAL and more basic features.
      OSD II - Vector Tables, Memory Addressing and Mapping, Virtual Memory, Processes and Threads, I/O handling and Interrupts, and more...
      OSD III - File Handling, Compression & Decompression, Encryption & Decryption
      OSD IV - Drivers and Peripheral Support
      OSD V - Graphics - Font and 2D Image Rendering, Audio and Networking (Ports)
      OSD VI - Generating an Installer
      OSD VII - Installing and Configuring the OS for First USE, extending your C compiler to support C/C++ based on your assembly language.
      Using your homebrew built computer with your own OS and compiler and if you elected to build your own or just interface with I/O, storage, graphic and audio devices... now it's time to use them and put it to good use. We can now jump into basic Game Design by starting off with implementing DOOM in a combination of ASM, C and or C++ to test out your hardware.
      The final stages: Designing your own high level dynamic scripting or interpreted language using your custom built CPU-PC to write high level languages within it.
      ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
      If you can build your own CPU and ISA from the ground up, build your own basic assembler and compiler that is enough to be able to write, build, and run DOOM on it, design your own high level scripting language to work on it... Then I could easily say you have graduated and become both a Hardware and Software Engineer!
      If I was coordinating a degree program at some university, something like this would be the outline for the Dual Bachelors/Masters Degree programs for both Software and Hardware Engineering. There may be some fields of mathematics, physics, etc. that I didn't cover here. Such as Data Sets, Algorithms, etc. which would be important yet I feel they would be more related towards the Computer Science side of things as opposed to the Engineering side of things. Yet to transition from the Bachelors to the Masters, the Bachelors in Computer Science side of things would be required.
      If one was to take every course available, they'd end up with a Masters for the following fields: Mathematics, Physics, Computational Chemistry, Computer Science, Software Engineering, Electrical-Electronic/Hardware Engineering, Operating System Design & Data Management and possibly a Bachelors in some of the follow: Mechanical Engineering, Data Structures & Algorithms, Information Theory, Compiler & OS Design and a few more...

  • @arnolddalby5552
    @arnolddalby5552 3 years ago +2

    Excellent, loved it.

  • @GoogleUser-ee8ro
    @GoogleUser-ee8ro 1 year ago

    This lesson was filmed in 2018, and if I recall correctly it was around the time Intel introduced AVX-512 in its Skylake architecture; if MIT were to film this course again, I wonder if it would update it to an ARM assembly version 🙂

  • @DHorse
    @DHorse 2 years ago

    What a great speaker.

  • @juanmamani2110
    @juanmamani2110 2 years ago

    Lecture effect: memory jumps to the 70s and 80s - Intel 8080, 8088, 286, 386, 486...
    Excellent lecture as an intro to asm instructions.

  • @alexandersviridov8682
    @alexandersviridov8682 4 years ago +12

    Great explanation. TY from Russia.

  • @jakefischer8281
    @jakefischer8281 2 years ago

    Love his enthusiasm!

  • @erichlow3109
    @erichlow3109 2 years ago

    Excellent intro course; maybe consider, in a further lesson, a look at ARM M33 and similar architectures, which are very present in IoT use cases.

  • @pfever
    @pfever 3 years ago +3

    Amazing professor! :)

  • @pial2461
    @pial2461 4 years ago +3

    Wow! Awesome content

  • @truehurukan
    @truehurukan 1 year ago

    Assembly is a very nice programming language but damn, it is complicated ^^ I'm a French-speaking programmer/teacher and I would give all my positive feedback for this lecture. And yes... I did not need Google's subtitles to understand the instructor... that's great !!

  • @dynamics
    @dynamics 3 years ago +2

    Thank you! 🤍💗

  • @mostafar8514
    @mostafar8514 9 months ago

    My only experience with assembly is Zachtronics' EXAPUNKS game; it led me here and I understand most of it surprisingly well. To anyone who wants to learn assembly in a fun way, definitely check the game out.

  • @mohamed_5765
    @mohamed_5765 4 years ago +2

    thanks for sharing

  • @nishthagupta1357
    @nishthagupta1357 2 years ago +1

    Such an eloquent speaker he is! ❤

    • @danman6669
      @danman6669 2 years ago

      "He is such an eloquent speaker!"

    • @nishthagupta1357
      @nishthagupta1357 2 years ago

      @@danman6669 I tried complicated sentence formation technique. Unlike you, who did the simple kind.

  • @mariodrechsler2618
    @mariodrechsler2618 1 year ago

    I began with assembly in 1987 but sadly ended some lines of code later. I knew how almighty it is, but at that time I was studying architecture. Today I sometimes think how helpful it would have been, all this time, to have skills like that... This is the real reason why Mies van der Rohe said "Less is more" ;)

  • @mhaddadi
    @mhaddadi 2 years ago

    The 8087 did the floating-point calculations - the math coprocessor (mathco) chip.

  • @dpz34
    @dpz34 3 years ago +6

    Thank God YouTube and MIT both exist

  • @user-yp5gz5ip1j
    @user-yp5gz5ip1j 3 years ago +1

    Great lecture! Thanks!

  • @justcurious1940
    @justcurious1940 6 months ago

    Thanks for the free lecture.

  • @laohu5511
    @laohu5511 2 years ago

    Great video, great introduction.

  • @TranscendentBen
    @TranscendentBen 2 years ago +1

    1:00:06 Here's a point where he misspoke on memory vs. registers: "I would say also the fact that you have a design with registers, that also reflects locality, 'cause what the processor wants to do is fetch stuff from memory, doesn't want to operate on it in memory, that's very expensive, it wants to fetch things into memory [he meant registers], get enough of them there that you can do some calculations, do a whole bunch of calculations, and then put them back out there [to memory]."
    This reminds me of the February 1984 BYTE Magazine, in an interview with the original Macintosh system-level programmers, one of them said of writing efficient code for the Mac's 68000 processor, "keep the registers full." The hit (time delay) of accessing main memory vs. a register wasn't nearly as bad back then, but keeping as much data in registers as you can keeps from having to swap out a lot of data to main memory.

    • @williamdrum9899
      @williamdrum9899 1 year ago +1

      I always feel like I don't have enough registers on the 68000 and yet I never feel this way with machines that have far fewer registers like the 6502. Sure you've got 8 on the 68k but it seems like I need to give up a data register any time I have to index an array so I can get the right offset

  • @allanrichardson9081
    @allanrichardson9081 2 years ago

    The assembly language for this one-chip microprocessor is more complex than the assembly language for the 1964 models of System/360! Learned some very interesting information. Now I need to get the details somewhere without going BACK to college!

    • @raybod1775
      @raybod1775 2 years ago +1

      I’m a retired IBM mainframe programmer, nothing but respect for any assembler programmer who can write quality code. Assembler is completely unforgiving, takes so much diligence and concentration. I did write macro assembler code to run Cobol programs, but someone else had written the original code I modified and used. Great for getting a true feel about how computers operate.

    • @schmetterling4477
      @schmetterling4477 2 years ago

      @@raybod1775 Assembler is no more unforgiving than any other language. Poor code is poor code in any language. There is, unfortunately, a religious belief among younger programmers without a solid computer science background that computer languages and compilers make programming easier. That is not the case. A language is simply a collection of shortcuts to achieve certain side-effects. It is, sort of, an implicit library to a set of often used algorithms. The exact same results can be achieved with explicit libraries of assembly level functions. What will be missing are the (extensive) compiler optimizations. An assembler programmer would have to work much harder to get the same level of optimization of out the code as is possible with a modern compiler. Other than that no language can transform a hard programming problem into an easy one.

    • @schmetterling4477
      @schmetterling4477 2 years ago

      @@raybod1775 I have written 6502 assembler before you were even born, Ray.
      You can do absolutely everything on a well designed compiler, Ray. C, for instance, has a so called _asm_ statement. Guess what that does? Python lets you bind C code to your Python program directly. Just because you are chicken to use these facilities doesn't mean they don't exist and aren't being used by people who actually know how to use computers. You clearly don't. So what? So nothing except that a guy called Ray has to educate himself.

  • @okaro6595
    @okaro6595 2 years ago +1

    The 8086 could address 1 megabyte by using the segment registers. The weird memory architecture was created partially to make it compatible with the 8080.
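    A small C sketch of the 8086 segment:offset arithmetic described above (real mode only; the wrap at 1 MB models the 8086's 20 address lines):

    #include <stdio.h>
    #include <stdint.h>

    /* 8086 real-mode address translation: a segment register selects a
       16-byte-aligned base, and physical = segment*16 + offset (20 bits). */
    static uint32_t phys_addr(uint16_t segment, uint16_t offset) {
        return (((uint32_t)segment << 4) + offset) & 0xFFFFFu; /* wraps at 1 MB */
    }

    int main(void) {
        /* Classic example: B800:0000 is the start of color text-mode video memory. */
        printf("B800:0000 -> %05X\n", (unsigned)phys_addr(0xB800, 0x0000)); /* B8000 */
        /* Many segment:offset pairs alias the same physical byte. */
        printf("B000:8000 -> %05X\n", (unsigned)phys_addr(0xB000, 0x8000)); /* B8000 */
        return 0;
    }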

    • @williamdrum9899
      @williamdrum9899 1 year ago

      Yeah, I remember trying C for the 8086 and thought "OK, but how do I pick a segment to load from?", not realizing I could just specify the full 20-bit address and the compiler did the rest... Lesson learned: registers are the compiler's job.

  • @insidiousmaximus
    @insidiousmaximus 2 years ago +9

    2am
    me: better go to bed
    YouTube: ...Wanna learn Assembly?
    me: ....mmmmmm yeah sure

  • @FreestateofOkondor
    @FreestateofOkondor 4 years ago +8

    Awesome video, but why does the playlist have some videos in the wrong order?

    • @mitocw
      @mitocw  4 years ago +26

      No idea why. We were sure they were in order before the playlist was made public... but it's been fixed. Thanks for your note!

  • @WeconTechnology
    @WeconTechnology 2 years ago

    Very nice video about language and computers.

  • @chang-kp9sp
    @chang-kp9sp 2 years ago +2

    Surprised to see that current programmers do not learn much assembly language. It is very useful and insightful if they really want to know the inside of a computer.

    • @sbalogh53
      @sbalogh53 2 years ago

      Most current programmers probably just patch together a bunch of library functions when writing their "program". This explains why so much of current software feels as if it is running on a 1980's computer instead of the super fast machines we use today. I have seen some insanely stupid code written by "young" programmers. For example a customer counter was stored in an SQL database that was located in another machine in another building. Every time that counter was incremented, which was often in that application, meant a series of SQL instructions over a local area network. They had code like that yet still wondered why their million dollar Sun servers could only handle 70-80 web requests a second. Then occasionally the network connection would time out and the program would fail. I tried to mention that this was a silly idea but was told to stop being so negative and be a "team player". I am so glad to be retired and not have to deal with these people, although I still have to suffer crappy software every day.

    • @toby9999
      @toby9999 2 years ago

      @@sbalogh53 Spot on. Most of the developers I've worked with wouldn't have a clue about what's under the hood. It's mostly high level stuff these days.

    • @williamdrum9899
      @williamdrum9899 1 year ago +1

      Things I learned from assembly:
      * The compiler will avoid multiplication and division at all costs (except by powers of 2)
      * x % 256 = x & 255 (same for other powers of 2; see the sketch after this list)
      * for(;;) is a goto
      * while(true) is a goto
      * in fact, every control structure is just a goto wearing a trench coat
      * The largest integer on any computer is -1
      * Arrays are secretly one-dimensional
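      A quick check of the x % 256 = x & 255 item above. The identity holds for unsigned values (which is why compilers emit a mask rather than a divide); for negative signed values the two expressions differ.

      #include <stdio.h>

      int main(void) {
          /* Reducing modulo a power of two is just a mask for unsigned operands. */
          for (unsigned x = 250; x < 260; x++)
              printf("%u %% 256 = %u,  %u & 255 = %u\n", x, x % 256, x, x & 255);
          return 0;
      }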

  • @abevigoda3149
    @abevigoda3149 2 years ago

    When I needed to really optimize code, the only option was doing it in assembly. I have no doubt it still applies today: knowing the target processor well (the instruction set and the bugs you can exploit) can dramatically increase performance in iterative code. It's not for lazy people though.

    • @NinjaRunningWild
      @NinjaRunningWild 2 years ago

      True, it can be tedious. But, to quote Michael Abrash, "Rarely can you get more than a 2x performance increase by doing everything in Assembly Language."

    • @HarnessedGnat
      @HarnessedGnat 2 years ago

      Today, compilers can be VERY good at producing optimized code. Sometimes it's a matter of knowing how not to get in the compiler's way. Write readable code, and then maybe have a look to see what it did.

    • @abevigoda3149
      @abevigoda3149 2 years ago

      @@HarnessedGnat Sure, and compilers will get even better in the future, but knowing how not to get in the compiler's way requires knowledge of how the compiler works - the deeper the better - and not all programmers take the time to do that. They just compile their code and expect it to be thoroughly optimized. In some applications it is best to do things yourself (use your brain and natural intelligence) instead of relying on an automated, rule-following algorithm that will overlook even the obvious for its lack of intelligence; then you reap the benefits of assembly programming.

  • @RolandNSI
    @RolandNSI 2 years ago +2

    Assembly gives you power ! The ultimate power over your machine. Use it wisely. ( and please don't put viruses in the cracks you make )

    • @AntonySimkin
      @AntonySimkin 2 years ago +3

      just put some sleeping helpers if you need some huge resources to access some servers or may be calculate something lol

  • @ralfneitzel3935
    @ralfneitzel3935 5 months ago

    Really good! Thanks!

  • @LukeAvedon
    @LukeAvedon 2 years ago

    Wow! this is great.

  • @manavnaik1607
    @manavnaik1607 2 years ago

    This is good content

  • @engcre
    @engcre 1 year ago +1

    I liked it a lot 🇧🇷

  • @brahimd8683
    @brahimd8683 2 years ago

    Quantum computers will be very fast, then they will be developed with complex algorithms where they can decide how to work and charge etc. Then they will be developed into 3D, through a new technology that is still in the preliminary research stage, I call it 3D lights, These 3D lights will also be used in TV and even smartphones, etc.

  • @rabbitcreative
    @rabbitcreative 2 years ago +2

    *chuckles* He said, "similar-in-structure". How Korzybskian.

  • @emvdl
    @emvdl 2 years ago

    Thank you!

  • @dmpase
    @dmpase 2 years ago +3

    11:15 "Vector operations tend to be faster than scalar operations." Well, not as much as they used to. I have plots of DAXPY (y[i]=a*x[i]+y[i]) for different sizes of x and y on Intel and ARM architectures. Vector operations are much faster when x and y both fit into registers or L1. When they fit in L2, vector is still faster but not as much. When they only fit in L3 or memory, there is no difference in speed. Gather and scatter operations show no difference in speed (vector vs. scalar) regardless of cacheability. I haven't found an example where vector ops are slower, but I do have examples where vector and scalar ops are very similar in speed. Much different now than the CDC 7600, FPS-264, CRAY Y-MP and C-90 vector architectures of our younger days.
    BTW, I love your work Dr. Leiserson. I was very happy to find this lecture. This topic is not taught enough. Please don't take my comments as a criticism. I only wish to say that your example, "vector is faster", is not nearly as true today as it once was.
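    For reference, the DAXPY kernel discussed above written as plain scalar C. Whether it gets vectorized, and how much that helps, depends on the compiler flags and on where x and y sit in the cache hierarchy, as the measurements above point out.

    #include <stdio.h>

    #define N 8

    /* DAXPY: y[i] = a*x[i] + y[i].  With -O2/-O3 and vector units enabled,
       compilers commonly auto-vectorize this loop. */
    static void daxpy(long n, double a, const double *x, double *y) {
        for (long i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

    int main(void) {
        double x[N], y[N];
        for (int i = 0; i < N; i++) { x[i] = i; y[i] = 1.0; }
        daxpy(N, 2.0, x, y);
        printf("y[3] = %.1f\n", y[3]);   /* 2*3 + 1 = 7.0 */
        return 0;
    }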

    • @schmetterling4477
      @schmetterling4477 2 years ago

      If you are busting the cache, then you are a shite programmer. :-)

  • @Bytheocean
    @Bytheocean 3 years ago +4

    I applaud you for teaching this elegant language. One point from a 40+ year programmer and fellow professor: everyone I have ever known has pronounced "Clang" as "C-Lang" or "See-Lang".

    • @lawrencemanning
      @lawrencemanning 2 years ago +2

      Are you sure? Acronyms are nearly always made into short, easily pronounced utterances. See: picofarad to "puff", SSH to "shush" (fairly unusual, admittedly) and even WWW to "wah-wah-wah" or "dub-dub-dub". FWIW, every time I've talked about Clang with other programmers it was "Klang".

    • @darrellee8194
      @darrellee8194 2 years ago

      I've also never heard the option flag referred to as "minus" but always "dash".

  • @starriet
    @starriet 2 years ago

    I don't know how YouTube recommended this to me (maybe related to my googling),
    but anyway, excellent job YouTube!!! Now you're working.

  • @RobWinchesterBoston
    @RobWinchesterBoston 2 years ago +1

    2**128 addressing... well I can sort of see databases using that eventually as SSD and RAM continue to blur (though for normal use I agree x64 will be around for a long, long time)

  • @user-sd4eg5wi6c
    @user-sd4eg5wi6c 2 years ago

    Thank you. ♥♥♥

  • @TranscendentBen
    @TranscendentBen 2 years ago

    1:07:16 "FMA? Fused Multiply and Add." Is this Intel coming up with a new name for something that already existed? In DSP terminology (which goes back a few decades) it's MAC, for Multiply and Accumulate.
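    A small C sketch of what the single rounding of an FMA (or a DSP's MAC) buys, using C99's fma() from math.h. Note this is only a numerical illustration: depending on contraction settings, a compiler may turn the separate-ops line into an FMA as well, and you may need -lm to link.

    #include <stdio.h>
    #include <math.h>

    #pragma STDC FP_CONTRACT OFF   /* keep a*a + c as two separately rounded ops */

    int main(void) {
        /* fma(a, b, c) computes a*b + c with a single rounding - the same thing
           an FMA / multiply-accumulate instruction does in hardware. */
        double a = 1.0 + ldexp(1.0, -27);      /* a*a = 1 + 2^-26 + 2^-54 exactly   */
        double c = -(1.0 + ldexp(1.0, -26));   /* minus the product rounded to double */

        printf("a*a + c (two roundings): %g\n", a * a + c);    /* 0: the 2^-54 bit is lost */
        printf("fma(a, a, c)           : %g\n", fma(a, a, c)); /* about 5.55e-17 (2^-54)   */
        return 0;
    }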

  • @lequangdung589
    @lequangdung589 3 years ago +1

    Very cool

  • @dhikshith12
    @dhikshith12 2 years ago +3

    I wish I could hear the MIT students' questions/answers properly

    • @w1d3r75
      @w1d3r75 2 years ago +1

      Maybe in the course transcripts ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-172-performance-engineering-of-software-systems-fall-2018/lecture-videos/lecture-4-assembly-language-computer-architecture/#vid_playlist

    • @dhikshith12
      @dhikshith12 2 years ago

      @@w1d3r75 thanks 🙂

  • @kabukidlifevlogtv9939
    @kabukidlifevlogtv9939 2 years ago

    excellent

  • @KingXKok
    @KingXKok 5 months ago

    Other than setting flags, why would we ever want to write to the same register twice in a row instead of just eliding the first instruction?

  • @Basieeee
    @Basieeee 3 years ago +3

    6:55,
    me in my head, REVERSE ENGINEERING, REVERSE ENGINEERING.

  • @artie5172
    @artie5172 3 months ago

    I have a doubt: if x86 documentation is freely available for students to learn from, then why is x86 a closed or proprietary ISA?

  • @jvolstad
    @jvolstad 2 years ago +2

    I'm a retired COBOL developer. 👍

  • @pekertimulia125
    @pekertimulia125 2 years ago

    Kalau di atas mah assembly silicon jadinya transistor
    Itu silicone di campur dofol terus ribdipe gitulsh..

  • @NinjaRunningWild
    @NinjaRunningWild 2 years ago

    Just CISC architecture. There are pros & cons to that. One con is a complicated & verbose instruction set.

  • @andrewherrera7735
    @andrewherrera7735 2 years ago

    59:02 All this is the duct tape holding together Moore's law.