I've been a programmer for 20 years and I learn new things in many of your videos. They are a treat for absolutely everyone!
Carrie Anne is such an amazing instructor. Her voice, volume, and energy are just perfect. Now if only the subject matter weren't so technical for me and I didn't get headaches afterward, this would be my favorite course.
Keep up the good work.
Honestly, after seeing 23 of her videos, I have gotten sick of her voice. But she is still an amazing teacher
@@ismaelnehme379 I still like her voice, guess I have a lot of tolerance.
@@dulguunmurunbarsbold210 I find her voice beautiful.
Fun fact: "pixel" is a portmanteau of "PICTure ELement".
I don't ever want this series to end.
3 years deep into my CS degree and this is the video that finally demystified the most fundamental things I've been curious about.
hands down my favourite season of anything on youtube. so so so so so interesting every single video, and carrie-anne you breathe the life of computer science through me, and help make my studies not only make sense, but also so so so engaging!!! thxsm
We wove you Carrie Anne
*love
*wuv
Connor Major I don't think that's a typo...
As a computer graphics researcher, this is so important for me. Thanks!
You guys are seriously underrated
Have a strange feeling knowing that I have been watching this from #1 all the way to #23. What an inspiring journey; I have learnt a lot. Thanks Carrie and all the people involved in making these videos possible !!!
Learned a lot from this whole course so far. Great job, Carrie Anne. I never understood vector vs. raster; now I have a better idea.
Thanks for explaining the drawRectangle function. I've taken a class in computer science, and our class implemented that function, but it was never explained to us how it worked.
Your explanation was clear and helpful.
What I learned from this course made me appreciate the hard work that went into the technology we use today.
I'm curious what innovations are still to come. I would like to be part of them too.
Ali Mürteza Yeşil I’m with you man
This series is great, and I mean GREAT. I'm an IT student and it's been very helpful to me..
Greetings from all the way in the middle east, Yemen :)
Thanks Carrie Anne..
Greetings from Syria !
Here is one from Libya
Fun fact: modern computers and graphics cards still support 80 x 25 character displays for legacy reasons. Anyone who has delved into the arcane witchcraft of operating system development can tell you that the screen buffer for colour display is at 0xB8000. That is one address I will never forget.
I did not know that. Are there any drawbacks to the legacy support?
The fact that you can only display characters in an 80 x 25 grid? :P
In order to draw individual pixels in full colour, you need to specifically ask the gpu to do so, but for anything more than a toy, it's better to use full gpu acceleration.
OSDev.org has a lot of very in-depth information on this sort of thing.
Are you referring to the small grid itself as the drawback, or the availability of the support of that small grid?
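For the curious, here's roughly what poking that legacy buffer looks like. This is a minimal sketch assuming a freestanding x86 environment where the text-mode buffer is mapped at 0xB8000 (hobby-OS territory, as the comment above says; it won't run as an ordinary user-space program):

```c
#include <stdint.h>

/* VGA text mode: 80 x 25 cells, each cell = ASCII byte + attribute byte. */
#define VGA_TEXT  ((volatile uint16_t *)0xB8000)
#define VGA_COLS  80

/* Write one character; attr packs foreground (bits 0-3), background
   (bits 4-6), and blink (bit 7). */
static void vga_putc(int row, int col, char ch, uint8_t attr)
{
    VGA_TEXT[row * VGA_COLS + col] = (uint16_t)((uint8_t)ch | (attr << 8));
}
```

The two-bytes-per-cell layout is also why the grid is characters rather than pixels: the hardware's character generator does the actual drawing.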
I love these video productions. Seeing those screens in the 1960/70's building circuit diagrams was surprisingly impressive!!
I'm so happy it's NOT an outtakes episode!
Beautifully done as usual!
this series is my favorite. Never expected to be this enthusiastic about the technicalities of the computer. :D thanks!!!
4:32 slight hairsplitting: that's not a DOS program, but rather a Linux program called Midnight Commander (mc). Granted, it's a clone of Norton Commander for DOS and does indeed use line-drawing characters if the terminal/console supports them, so I'm kind of straying from the point.
The CRT at 0:31 is not used for inspecting memory content (primarily), it IS the memory. CRTs were used on some early computers as a primary storage device. It is called a Williams Tube. It was also possible to inspect the individual bits if one really wanted to. It was an advance compared to mercury delay lines.
Soooo, can we assume there will be a video on 3D graphics as well? I'm excited :)
Gnoccy It's gonna be very hard to explain lol
If they show a 3D engine code example I'll be so happy
Valink it's not that hard if we start at ray tracing and end with triangles...
Computerphile has done some great videos on 3D graphics. In short, it's just drawing lots of little triangles really really quickly.
Valink I have some; do you want me to upload it to GitHub? (It's in Java.)
Valink, They've done very well in explaining things that are very hard to explain.
I think this is the first Crash Course episode I've watched in two years. What was I thinking!? The great thing is that I have two years of Crash Course to catch up on now. :)
The Film CC with WheezyWaiter is also great, highly recommend.
Still remember the video memory range for the TRS-80 I had when I was a child. It ran from 3C00 to 3FFF, which was 1K of screen memory.
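A quick sanity check of that range, in a hedged sketch assuming the Model I's 64 x 16 character layout:

```c
#include <stdio.h>

int main(void)
{
    /* TRS-80 Model I memory-mapped video: 0x3C00 .. 0x3FFF inclusive */
    int bytes = 0x3FFF - 0x3C00 + 1;   /* = 0x400 = 1024 bytes = 1K */
    printf("%d bytes = 64 x %d character cells\n", bytes, bytes / 64);
    return 0;
}
```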
I swear this crash course is more information rich than my computer class in high school
Could you imagine if we went down the path of vector based screens rather than pixels? I'd find that amazing. I want a vector based screen.
Vector-based graphics have this awesome quirk: they are not affected by "resolution scaling". Today we use SVG (Scalable Vector Graphics) to store simple pictures that can be scaled to any size.
Yeah, it's awesome. I created my channel icon as an SVG and after it was done I rasterized it (I think that's the word) to the suggested size for channel icons. However, at the end of the day SVGs are still converted to pixels to display on our screens, which means lines that are not horizontal or vertical will never look quite right.
Cool, I didn't notice at first. And yeah you are quite right, at the end of the day it has to be crammed into pixels for our current displays.
If you get an analogue oscilloscope with an x-y mode you have a vector display of sorts. If you add a microprocessor you can draw vector graphics.
I have the oscilloscope...
Stating the most obvious outcome, screen resolutions for the same aspect ratio (for 16:9 aspect ratios, it would be 1280*720, 1366*768, 1600*900, 1920*1080, 2560*1440, 3840*2160, etc) would become irrelevant. It also means that, in the case of LinusTechTips, the only reason to have a ridiculous multi-monitor display (examples: their 8K and 16K monitor setups) is to make pictures literally bigger, not more high-def, because bigger is the same as high-def. (If you got a 34" 1080p display to replace a 24" 1080p display, the only reason you would do that is if you wanted to see the screen from farther away or if you wanted to enlarge things, but it's still the same number of pixels; if that 34" display were 1440p or even 4K, your reason would be to see more detail, but only if the content being displayed can be scaled up.)
Also, there would be no need for anti-aliasing because the point of anti-aliasing is to remove those pixelated edges off of objects.
My question is this: would you still be able to represent 16,777,216 (24-bit) colours this way, or gradients? Also, what would happen to pixel art? Pixel art is its own art style.
Oh I hope we revisit this to look at a specific machine's method of displaying graphics. Even comparing the Apple II's sub-pixel drawing in '77 to the C64 in '82, and on to CGA, EGA, and VGA (which is still with us to this day), is a rabbit hole worth a series in and of itself.
I wanna know more about the Code Club Australia tshirt Carrie Anne's wearing.
4:33 DOS example?! That's Midnight Commander, not Norton Commander. And it's clearly running on some form of Unix or Linux.
And it's still an option in most modern distributions :)
The number of times I typed 'mc' instead of 'mv' and Midnight Commander would start...
And I still use it today (and I'm not that old ;)
Penny Lane I felt the same, as a Linux user :$
No wonder the file names didn't seem to check out (and xterm?)
I had to write a program to draw rectangles in a computer class in middle school. Don't ask when it was or I'll get kicked out for being too old.
I had to in elementary school - nah you're probably not too old. So when was it? :p
I had to do something similar in college as part of a beginner programming course.
Alverant You used TURTLE, too?
lol, if computer classes existed when you were in middle school you aren't as old as I am. I was writing programs on punch cards when I was in college :-D
Who needs sleep when Crash Course has posted a new Computer Science video?
Good Episode!
To me, this is the base for a clearer future. Thank you, Team!
This series is amazing!
Awesome presentation ... keep up the good work
4:33 It's not DOS, it's Midnight Commander on a *nix system :)
I love this series.
Respect Your Elders, They Graduated Without The Internet.
+Slither Gamer
Nope, the elders should respect us; we are graduating with the distraction of the Internet, including social media, YouTube, and an overload of useless information.
While in the past, it was simple: you picked a field you liked, and you didn't get distracted by instant text messages, phone calls, emails, YouTube uploads, Instagram posts, and other useless online distractions.
@@BangMaster96 For some of us the Internet opened a whole new world of education. Not everyone is blessed with Internet access or full-time connected social media devices for leisure.
Ok boomer
I missed how the vector display can move at an angle. How does it calculate the angle it needs to move along to get to a new point? Sounds like pretty fancy footwork.
Sourcedrop linear algebra.
With these two tools we shall build a world!
She mentioned something about a device that moved the electron beam around. I wonder how it moves so freely since most things back then are shown to be very linear, like left to right, top to bottom. You'd think the beam would have to move in stairsteps, not angled lines.
It moved much like the magnets inside a speaker move: the preset locations are digital (they can be 1 or 2 but not 1.5), but the movement between those locations is smooth; it has no reason to stop halfway unless you tell it to.
In addition to Linear Algebra and Trigonometry, Analytical Geometry might also be a useful tool.
What branch of mathematics was most useful in designing graphical user interfaces?
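To make the "fancy footwork" concrete: one hedged sketch of the math is plain linear interpolation between endpoints. No trigonometry is needed; the angle falls out of stepping x and y proportionally. (This is an illustration, not how any particular 1960s display actually worked; move_beam_to is a hypothetical stand-in for whatever drives the deflection hardware.)

```c
/* Sweep from (x0,y0) to (x1,y1) in n equal steps (n >= 1). */
void sweep_line(float x0, float y0, float x1, float y1, int n,
                void (*move_beam_to)(float x, float y))
{
    for (int i = 0; i <= n; i++) {
        float t = (float)i / (float)n;       /* parameter 0.0 .. 1.0 */
        move_beam_to(x0 + t * (x1 - x0),     /* x and y advance      */
                     y0 + t * (y1 - y0));    /* proportionally       */
    }
}
```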
For additional reading search "bresenham algorithm" for line drawing. That is all.
I think I implemented that on DOS with DJGPP back in my university days.
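For anyone who doesn't want to search: a standard textbook rendering of Bresenham's line algorithm, sketched here with set_pixel as a placeholder for whatever plots into your framebuffer. Its trick is tracking an integer error term, so it never touches floating point:

```c
#include <stdlib.h>   /* abs */

/* Bresenham's line algorithm; handles all octants with integer math. */
void draw_line(int x0, int y0, int x1, int y1,
               void (*set_pixel)(int x, int y))
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;                  /* integer error term */

    for (;;) {
        set_pixel(x0, y0);
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }   /* step in x */
        if (e2 <= dx) { err += dx; y0 += sy; }   /* step in y */
    }
}
```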
This was a hell of an episode to watch drunk :o
Here is where I recommend people look into creative coding. Using tools like Processing 3 or P5.js you can make amazing stuff while learning about the way computers work with graphics. There are a lot of great youtube channels about this, but I recommend looking into Coding Train's Coding Challenges to get an idea of the things you can do.
EDIT: Oh!! And I forgot to say, the syntax used in places like at 9:33 is based on BASIC, a very good beginners' programming language (it's like an old precursor to Python and similar languages). If you want to learn graphics you could try QB64, a modern-computer port of QuickBASIC, a version of BASIC for MS-DOS computers with a decent way to render graphics. It's a bit archaic, but it would also make you appreciate a bit more the conveniences of modern programming languages. There are a few tutorials on QB64 here on youtube, but keep in mind that you are not likely to be using QB64 for any serious commercial application any time soon; it's just a way to explore programming old-school style.
My god, that Norton Commander shot brought back more memories than I'd have imagined. Yup, I had it all decked out with custom viewers, players, editors and unpackers.. ARJ was king, I scripted custom DOSSHELL menus for all my game diskettes, and I made music on trackers.
Those were the days, those were the days..
This is not Norton but Midnight Commander, and that's not DOS but some Unix distro.
Ah I see now, yeah the xterms bit at the bottom kind of gives it away. The user interface is eerily similar though.
devjock Both are very similar, though.
Code Club Australia got some great free PR from this episode :)
really great video! Thank you so much :)
5:29 oh my god it's just like the pen from scratch
Brings back memories; coders these days are spoilt with a bazillion libraries and APIs. I still remember having to read a bitmap from a file and write it into a video buffer.
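For anyone who never had to do it, the chore looked roughly like this. A hedged sketch assuming a headerless 8-bit-per-pixel image and VGA mode 13h's framebuffer at 0xA0000 (real code also parsed BMP headers, set the palette, and only worked in a real-mode DOS environment):

```c
#include <stdio.h>
#include <stdint.h>

#define SCREEN_W 320
#define SCREEN_H 200

/* VGA mode 13h: 320x200, one byte per pixel, memory-mapped at 0xA0000. */
static volatile uint8_t *framebuffer = (volatile uint8_t *)0xA0000;

int blit_raw_image(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f) return -1;
    for (int y = 0; y < SCREEN_H; y++) {
        /* copy one scanline straight into video memory */
        if (fread((void *)(framebuffer + y * SCREEN_W),
                  1, SCREEN_W, f) != (size_t)SCREEN_W) {
            fclose(f);
            return -1;
        }
    }
    fclose(f);
    return 0;
}
```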
I want to see an example of a mechanical computer and display, you know, just in case a sun flare causes an EMP that takes out all our electronics. Don't ask what I need to display so bad.
You could potentially build one with a big enough version of this: www.kickstarter.com/projects/871405126/turing-tumble-gaming-on-a-mechanical-computer
That looks pretty cool. If I had kids I would buy one of those for myself.
EM shielding is quite impressive; you can protect modern electronics up to stupidly high tolerances.
6:21 Lost opportunity there: Spacewar.... exclamation point.
this series is exactly like the intro to computer science class in college
The first character generators were actually small vacuum tubes.
I like to think that the reason we started using monitors for computers in the first place was because if you wanted to see a different set of data outputs, you simply "change the channel" like you would with a TV.
Well, what if I wanna watch multiple TV channels at once?!
LCD doesn't use raster scanning. All pixels that change color are refreshed at specific intervals, usually 60, 120 or 144 times a second. If a pixel stays the same color and intensity, it is not refreshed.
GPUs would sometimes choose to refresh the top half first, and then the bottom, and sometimes divide the display into even more sections for sequential refresh. That is to maintain sync for input, output and image processing. This may cause an artifact known as "tearing" in video games.
Can you please do CRASH COURSE ART HISTORY?!?!?! PLEASE!!
Amazing, thank you!
Sketchpad looks impressive to me even by today's standards.
Literally more coverage than 10 lectures... with less math. Which is sad :(
Can someone help me find the episode discussing the role of the NSF in laying groundwork for modern computer science? This is a fantastic series, and I want to share it with my colleagues in federal consulting. Thanks!!
After the thing on sketchpad, they showed a Unigraphics workstation. I still use that software, except it's called NX now...
That GUI for DOS reminded me of Norton Commander. Couldn't tell if that's what it was for sure, though.
Someone else mentioned Midnight Commander, a FOSS Norton Commander clone.
The one I remember using was X-Tree Gold.
Great video
I feel like that whole line about lines kind of crossed the line
that pixel Mongol was really cute
Got a circuit you wanna build? Test it out here first. Type: 'Circuit Solver' by Phasor Systems on Google Play.
If that pen at around 7:15 is real (as in the video is real), it seems to have far lower latency than today's digital pens ._.
2:25 Nowadays many LCDs aren't using scanning anymore, I guess.
4:32 That's obviously some clone for *nix machines but darn that made me nostalgic for Norton Commander :)
www.ghisler.com/
:D
perhaps midnight-commander.org/ ?
7:41 Nope, it actually blows my mind. Now my mind has turned into ash.
I learn computer science in school, but it's just not the same; the teachers don't have your energy, they are bored. I come on youtube many times to understand how things work, because they are unable to teach us properly =)))
That's not DOS! That's Midnight Commander!
Some of the best games still use ASCII graphics, I guess because they can add anything they'd like without having to worry about art assets. By the way, I'm talking about 'Dwarf Fortress' and 'Cataclysm DDA'.
Not mentioned here was color-cell graphics, which sort of falls between ASCII graphics and bitmap graphics. Many early computers (ex: C64) and game consoles (NES, Sega Genesis, etc ...) did their graphics this way. Typically, graphics were represented as a pair of colors for a pixel block with 1 or 2 bits per pixel to select the desired color, or sometimes by using a special-purpose font built directly for the graphics in question.
Shut up boi nobody cares
"color-cell" That's an interesting term. It sounds like the pre-cursor to palletized graphics, which are still used both to save memory and (more often) to get a retro aesthetic.
Regarding the ASCII graphics in the original post, I usually default to ZZT as an example, even though that came out in the early 90's, because it's a free game with a world editor that includes a fairly simple scripting language. It was even sold back in the day on the basis of favoring depth over the flashy graphics of the time, which I find amusing nowadays since the creator went on to lead the Unreal engine team.
yeah; color-cell was popular in the 80s, and also early-90s video codecs (ex: MS-CRAM and QuickTime RPZA). palette graphics were an independent development, which generally use more space but were also a little more flexible. color-cell still lives on though in the form of GPU texture-compression and similar (and some specialized image/video codecs).
I am considering color-cell for a possible FPGA CPU project, partly so that I may be able to fit a 320x200 framebuffer into 16kB of SRAM, whereas I would need 64kB with an 8-bit palette (and/or to stream in the framebuffer from external DRAM to do a screen refresh).
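For readers who've never seen the format, here's the gist in code. A hedged sketch of a made-up 1-bit-per-pixel color-cell layout, where each 4x4 block stores two palette colors plus a 16-bit mask choosing between them (real C64 and codec layouts differ in the details):

```c
#include <stdint.h>

/* One 4x4 block: 4 bytes for 16 pixels, vs 16 bytes uncompressed. */
struct color_cell {
    uint8_t  color0, color1;   /* the block's two palette entries    */
    uint16_t mask;             /* bit i selects the color of pixel i */
};

/* Decode one block into an 8-bit-per-pixel image at block (bx, by). */
void decode_cell(const struct color_cell *c, uint8_t *out,
                 int stride, int bx, int by)
{
    for (int y = 0; y < 4; y++)
        for (int x = 0; x < 4; x++) {
            int bit = (c->mask >> (y * 4 + x)) & 1;
            out[(by * 4 + y) * stride + (bx * 4 + x)]
                = bit ? c->color1 : c->color0;
        }
}
```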
0:59 is that a touchscreen ???
Did I just see touch screen in the 1960's? 1:00
No, that is not a touch screen. That is a light pen. en.wikipedia.org/wiki/Light_pen
First "finger" touchscreen was invented in 1967 by E. A. Johnson. It was based on capacitive coupling, just like today's smartphone screens. Multitouch appeared in the early 1970s. These technologies are a lot older than you might think!
Dear Crash Course,
Just asking "for fun": what would the vector display instructions be to draw a circle?
Depends on the display. Using my old oscilloscope, which is technically a vector display of sorts, I can draw a circle using AC power, a capacitor, an inductor and a few resistors. I can change the size in x or y by changing some resistances or by turning some knobs. I can similarly move it around the screen by turning some other knobs.
To make two different circles at once, I need faster sine waves and a triac or some transistors to switch between them.
This is not really computer programming though.
To really answer your question, I wouldn't be surprised if the closest thing to a circle would be a regular polygon.
Abram Thiessen Thanks for your comprehensive answer, I get it now.
But what is the science behind your method of making a circle?
Do the resistors create a magnetic field or what?
No, it is just that if you have two sine waves offset by 90 degrees and set one as x and the other as y it draws a circle. The capacitor and inductor are used to create the phase shift, the resistors are there to lower the current so that it doesn't damage the oscilloscope and also to control the size.
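The digital version of the same trick: generate sine/cosine pairs and hand them to the display one point at a time, which does indeed draw a regular polygon approximating a circle. A hedged sketch, with plot_point standing in for whatever your vector display or plotting library accepts:

```c
#include <math.h>

#define TWO_PI 6.28318530717958647692

/* Approximate a circle of radius r centred at (cx, cy) as an n-gon. */
void draw_circle(double cx, double cy, double r, int n,
                 void (*plot_point)(double x, double y))
{
    for (int i = 0; i <= n; i++) {
        double t = TWO_PI * i / n;
        /* two waves 90 degrees apart, exactly like the oscilloscope */
        plot_point(cx + r * cos(t), cy + r * sin(t));
    }
}
```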
Why were early CRTs used to show memory values? Wasn't it cheaper and more efficient to use nixie tubes or LED displays to show just numbers?
For very early computers one option for memory (besides mercury delay lines and magnetic drums) was special CRTs known as Williams tubes (en.wikipedia.org/wiki/Williams_tube). I am pretty sure this was explained in a previous episode. A side effect of using this technology is that you can directly observe the current contents of all the memory as a bunch of glowing dots.
One of the smallest early computers was the LGP-30 (en.wikipedia.org/wiki/LGP-30), which had a drum memory with special tracks as registers. A built-in oscilloscope CRT could show the contents of the registers as waveforms, since the computer processed just one bit at a time (so each register was a single wire that had to be looked at).
All other computers used tiny light bulbs to show the internal state (mostly replaced by LEDs in the 1970s).
But in this video she says that the use for displays in early computers was to keep track of memory values, because constantly printing them on paper would be too expensive. And obviously it looks better to see a "3234" printed on a screen than a bunch of little lights one would need to look at and decode into the actual value (like the Williams tube you linked). And as Carrie Anne also said in this video, early CRTs had very bad contrast and looked pretty bulky; a device with a memory buffer where the computer can store values that need to be displayed, plus a circuit to convert said values into easy-to-see displays (like nixie tubes or LED displays), would probably have been cheaper and smaller.
Just to clarify, I'm not implying I'm smarter than the computer scientists back in the day, I just wanna know why they didn't use other ways of displaying if the main objective was to display memory values.
Sending the contents of memory to a printer was called a "core dump" (since magnetic core memories were the most popular type from the very late 1950s to the mid 1970s), but as she said this was awkward and wasteful. CRTs were not used as an alternative to this, however, until terminals or built-in displays replaced the control panels with blinking lights in the late 1970s. See the big deal Woz makes of this about his Apple I, for example. Before that, CRTs were used as I explained and as graphical output devices like she said. But not for observing memory.
While many computers used binary output and input in their front panels (and so light bulbs and LEDs like I mentioned) others, including my first computer (www.retrotechnology.com/restore/6800D2_run.jpg), had octal or hexadecimal panels and so did use 7 segment LED displays or equivalent.
Thanks, I guess it's the same as programming in machine code: it may look hard to do, but that's how it was done back in the day.
sebastian carreira LED and LCD screens just didn't exist yet in any practical form. Nixie tubes were, however, used.
GUI? The code should just cascade before your eyes... on a completely unrelated note, have you seen the one?
I think I just saw Jayne there
You say that programmers don't write the drawing functions, but use graphic libraries with ready-to-go functions. Are these still just draw functions but made by someone else for time-saving, or is it an entirely different way of drawing graphics?
It's the former and it can be written in a lower level language for faster execution.
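To illustrate the layering (a minimal sketch; the names echo the episode's pseudocode, and set_pixel is a hypothetical routine supplied by the driver or framebuffer below you):

```c
/* Lowest layer: someone, somewhere, still sets individual pixels. */
extern void set_pixel(int x, int y);

/* Middle layer: lines built from pixels (axis-aligned for brevity). */
static void draw_hline(int x0, int x1, int y)
{
    for (int x = x0; x <= x1; x++) set_pixel(x, y);
}

static void draw_vline(int x, int y0, int y1)
{
    for (int y = y0; y <= y1; y++) set_pixel(x, y);
}

/* Top layer: the ready-to-go library call an application sees. */
void drawRectangle(int x0, int y0, int x1, int y1)
{
    draw_hline(x0, x1, y0);   /* top    */
    draw_hline(x0, x1, y1);   /* bottom */
    draw_vline(x0, y0, y1);   /* left   */
    draw_vline(x1, y0, y1);   /* right  */
}
```

So yes: the library calls are ultimately "just draw functions made by someone else", layered so each level only has to know about the one below it.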
I always thought that VRAM stood for "Vanilla RAM"
I know this is exceptionally petty, but the pixellated video resolution didn't quite match up against the grid :)
This 'new level of abstraction' thing is soooo dumb.
Besides that, this whole series is totally invaluable.
That was not a pixel...
That "dos example" looked a lot like Midnight Commander, not Norton Commander.
Since when has Midnight Commander on a *nix machine been DOS?
Looks basically the same as Norton Commander, which was DOS.
varana312 but the picture is not of a FAT system.
Yeah, I didn't want to imply that this actually was a DOS system that they showed.
But as the topic was how these programs drew lines with character sets, what's written in there was not that important. I'd guess it was easier for someone on the team to get that picture on *nix instead of trying to run an actual NC on a modern machine.
Graphictistic!!😀
What's the language used in the examples, please?
it's sudocode which means that it isn't technically a programming language.
SweedishSlenderman
pseudocode*
ahhh remember Prodigy?
no such things as halfway crooks
I know what's gonna be this year's pizza-john
She said “done with word play” and then went on to do a word play.
Bet the rastascan has some good kush
This course is amazing. However, the last few chapters have been 90% history and 10% CS...
It still really bugs me that the Levels of Abstraction animation's "level" number isn't in binary...
Sam Freed By now it's tradition. Too late to change it.
Carrie Anne got a Carrie tan
*Carrie Fat-Anne. Takes a lot of bits to carry that big plump Anne
You Suck Better than dating a stick imo lol
Wow great vid
*is ten seconds in
When I was a kid I used to start a game called Platoon on my C-128 just to watch the intro screen. That's how amazing it was: s-media-cache-ak0.pinimg.com/736x/98/af/39/98af39ec3cf6f8a6ef584c521516e513.jpg
4:33 DOS example? That's bash.
{codeclub austrailia} what is the meaning of this, mate?
Man, all this talk about vector displays makes me want to find a Vectrex that isn't going for a million dollars 😔
Just a question guys: who votes down this video? What do you think was wrong? (No, I don't work for Crash Course)
The presenter?
There are trolls that go around downvoting videos for no reason. Gain enough views, and no matter what the video is, you will always get dislikes.
Barnard Rabenold and some people misclick too
let brainState = 🤯
Always have to play these videos at 0.75x speed :(