Awesome talk! Small correction: the Game Genie does not write to RAM but intercepts reads from the ROM. If the CPU requests a certain address in the ROM data, the Game Genie just replaces the value with something predefined. There may also be a check value to make sure the address points to the correct ROM bank: if(address == X && valueAtAddressX == Y) return Z;
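Expanded into a runnable C sketch (my reconstruction of the idea, not the actual hardware logic; all names and values are made up):

```c
#include <stdint.h>
#include <stdio.h>

/* The Genie sits between console and cartridge, watches every ROM
   read, and substitutes a value when the address matches -- optionally
   only if the cartridge would have returned an expected "compare"
   value, which guards against patching the wrong bank on bank-switched
   games. */
typedef struct {
    uint16_t address;      /* ROM address to patch    */
    uint8_t  value;        /* value to return instead */
    uint8_t  compare;      /* expected original value */
    int      has_compare;
} GenieCode;

uint8_t intercept_read(uint16_t addr, uint8_t rom_value, const GenieCode *c) {
    if (addr == c->address && (!c->has_compare || rom_value == c->compare))
        return c->value;
    return rom_value;              /* pass the cartridge's byte through */
}

int main(void) {
    GenieCode code = { 0x9123, 0xAD, 0x05, 1 };            /* hypothetical */
    printf("%02X\n", intercept_read(0x9123, 0x05, &code)); /* AD: patched  */
    printf("%02X\n", intercept_read(0x9123, 0x17, &code)); /* 17: compare miss */
    return 0;
}
```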
Yeah, the Action Replay and GameShark do it with RAM instead. Actually... I wonder how they do it. The Game Genie sits between the cartridge and the system, so it can just act as a middleman and switch the values up, but the GameShark is also there, so how does it intercept the RAM? I guess it could switch instructions? Like, if there is an LDA for a value it has a cheat for, it changes the LDA so it loads from ROM instead of RAM? But then that wouldn't really work when it's not directly loading the RAM address, like if it's using LDA $adr, X, because the GameShark doesn't know the value of the X register, or whether the instruction is loaded from RAM instead of ROM.
Thanks for the video! The collision part really helped me! I've started making my own NES game, which also requires some sort of RNG. I came up with a different approach that works pretty well for me: I read my key inputs each frame and add a value to a variable that represents the random number. If I hold down the up key, 1 is added each frame; if I hold down the right key, 2 is added; if I hold down the down key, 3 is added, etc. Since the number is stored in a byte, it wraps back to 0 when it reaches 256. I have a controller that spawns enemies at random positions at a fixed interval, so just by playing the game you automatically change the random number that is used as the spawn position for the enemies.
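In C, that approach looks roughly like this (a sketch with invented names; the left-key increment is my guess, since the comment only says "etc."):

```c
#include <stdint.h>
#include <stdio.h>

/* Input-driven RNG: every frame, each held direction adds a different
   constant to an 8-bit counter, which wraps at 256 automatically. The
   timing and mix of the player's inputs make the value effectively
   unpredictable. */
enum { KEY_UP = 1, KEY_RIGHT = 2, KEY_DOWN = 4, KEY_LEFT = 8 };

static uint8_t rng_value = 0;

void update_rng(uint8_t keys_held) {
    if (keys_held & KEY_UP)    rng_value += 1;
    if (keys_held & KEY_RIGHT) rng_value += 2;
    if (keys_held & KEY_DOWN)  rng_value += 3;
    if (keys_held & KEY_LEFT)  rng_value += 4;   /* assumed increment */
}

int main(void) {
    /* simulate 90 frames of the player holding up+right */
    for (int frame = 0; frame < 90; frame++)
        update_rng(KEY_UP | KEY_RIGHT);
    printf("spawn x = %u\n", rng_value);   /* 90*3 = 270 -> wraps to 14 */
    return 0;
}
```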
I think when programming it's important to prioritize intent and apply it appropriately to what you are developing: applicable programming with intent. As you mentioned, there is no reason to create a "true" random number generator and spend time doing so if it is not necessary and is many levels removed from the actual purpose and intent of the overall code (which is the game). If it works, it works. The actual product and gameplay are the priority, so they should be the most important aspect of development. Now, if some pseudo-random number essentially leads to predictability and diminishes the difficulty/fun factor... that is another story.
Thanks man, really good video. Your section on graphics has given me a bit more enthusiasm for this Bomber King disassembly I've been doing for the last 3 years. Maybe I'll finally work out how the level data is stored ;)
There are some other NES titles that were legendary classics to play while also being programming masterpieces of their time: The Legend of Zelda, Metroid and The Guardian Legend!
Fun fact: the second random number approach (a hard-coded list of 256 random numbers you just cycle through) is also used by the legendary Doom. The reason, however, is the demo feature. Carmack needed a way to create the illusion of randomness, but in a way that was deterministic. If the numbers generated were always completely random, demos would desynchronise and you couldn't play them back properly.
@Guy Dude The implicit state of that 16-bit LFSR, as shown for Tetris here, is just two bytes - with your own implementation, making it fully repeatable is not a problem at all. I think Doom chose precomputed tables purely for performance reasons.
Really nice video! I started doing software development in the late '70s, but did not work with the 6502 or game development very much. I learned a lot of interesting things about how these things worked from your talk. Thanks! BTW, developers today are really spoiled. Back in the day, we toiled for hours on a subroutine to try to reduce the byte count from 200 bytes to 150 bytes. We also counted individual instruction execution times to get our routines to run in under some small number of milliseconds.
Exactly, cycle counting was incredibly important - especially if you wanted your game to stand out. While the internet has changed much of our lives for the good, it has also very negatively affected our ability to think for ourselves; to problem solve - why work out the problem yourself when stackoverflow can do that for you, lol. Not sure what the answer is but we are definitely dumbing down on so many things.
The design-guaranteed life of the save battery is 10 years. It's fairly rare for them to fail before 20 years are up. Beyond that, it's a matter of luck. I suspect the primary failure cause is just the chemical degradation of the cell over time, not the current being drawn and the capacity being depleted.
Super Mario Bros. actually uses three bytes for Mario's x-coordinate. For instance, Mario's x-position might be something like 50 C3 02 (in big-endian), which would mean he was two screens plus 195 and 5/16 pixels from the left edge of the level. Each screen is 256 pixels wide, and each pixel is 16 "subpixels" (in the sense that the lowest nibble is never used). So rather than 8.8 fixed-point, it's more like 8.8.8. Speed has only two bytes for each coordinate. The high byte is in units of 1/16 px/frame, i.e. 1 subpx/frame, so the maximum expressible speed is 16 px/frame (or nearly four screens per second). The low byte is in units of 1/4096 px/frame, i.e. 1/256 subpx/frame. However, when Mario moves, his position only ever changes according to the value of the high byte; the low byte is ignored (which is why his position is always a multiple of 1/16 px). The low byte for speed (aka "subspeed") only matters for making acceleration smoother, because in the acceleration calculation, the low byte of speed is updated and rolls over to the high byte.
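In C, the scheme might look like this (my reconstruction for illustration, not SMB's actual code; the struct and names are invented):

```c
#include <stdint.h>
#include <stdio.h>

/* Position is 24 bits of screen.pixel.subpixel (8.8.8); the speed's
   high byte is in sixteenths of a pixel per frame, and the low
   "subspeed" byte only accumulates during acceleration and carries
   into the high byte -- which is why the position always stays a
   multiple of 1/16 px. */
typedef struct {
    uint32_t x;        /* screen<<16 | pixel<<8 | subpixel          */
    int8_t   speed;    /* sixteenths of a pixel per frame           */
    uint8_t  subspeed; /* 1/256 of that unit, acceleration use only */
} MarioX;

void move_x(MarioX *m) {
    /* only the high speed byte moves Mario; 1 speed unit = 16 x-LSBs */
    m->x = (m->x + (uint32_t)((int32_t)m->speed * 16)) & 0xFFFFFF;
}

void accelerate(MarioX *m, int8_t accel_hi, uint8_t accel_lo) {
    int lo = m->subspeed + accel_lo;         /* 16-bit add with carry */
    m->subspeed = (uint8_t)lo;
    m->speed = (int8_t)(m->speed + accel_hi + (lo >> 8));
}

int main(void) {
    MarioX m = { 0x02C350, 24, 0 };  /* the "50 C3 02" example, moving */
    move_x(&m);
    printf("x = %06X\n", m.x);       /* 02C4D0: moved 24/16 = 1.5 px   */
    return 0;
}
```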
This was a really fun and comprehensive talk. Makes me kind of wish I was a developer back in those days so I would have had to tackle those types of low-level challenges. I'd like to try my hand at Game Boy development which, if I'm not mistaken, involves some similar patterns.
Great talk! However, a point of order: the Atari 2600 was NOT the immediate predecessor to the NES. There was an intermediate generation - the Atari 5200 and the ColecoVision. That second system, as I recall, has the distinction of hosting the first port of Donkey Kong as its pack-in title. The NES was still a marked improvement over both of these consoles, as well as the Mattel Intellivision, which lived between the VCS and Coleco generations, but it was far less of an improvement than over the VCS. Among other things, the Pitfall II cart was the first game I am aware of with a multi-part, four-instrument soundtrack accompanying the entire game, with different songs played for different situations.
What's interesting about this to me is that, as a beginner game dev, I needed an RNG system for drop rates that felt random but couldn't actually be random. My workaround was to have the player character constantly generate a new number in a linear sequence every frame, which determines whether an enemy will drop anything. If the number allows the enemy to drop something, the enemy reads its own linearly generated number that determines what it drops. Between the frame-perfect timing, the "randomness" of what number the player character could have, and the fact that the enemy's value is specific to each object and only starts counting once it's on screen, it truly does feel random and fair. Is it stupid? Yes. Is it not actually random? Of course. But is it functional? Yes, so it's good enough for me.
One thing that can be considered for all practical purposes random is polling user input and feeding that into the existing pseudo-random number sequence in some way. Works particularly well with analog inputs since they have a wide range of values. I'd do really stupid shit like string them together in nonsensical equations with math functions to produce unpredictable results. This was on modern hardware, but I was paranoid. =)
Normally you'd just give your platform's random function a set seed, that way you can get a random sequence that is always the same if you give it the same seed
Really interesting! Now I really understand why Super Mario Bros 3 was such a technical masterpiece. Wish you broke that down 😅👌 Excellent speech regardless!
23:00 not as dumb as it sounds, because 1) if the random numbers are used for more than one thing, the "offset" keeps shifting all the time, and 2) the numbers are probably computed modulo something, which makes the same roll essentially uncorrelated with any other roll taken modulo something else
Also, it depends on the condition of the hardware. As you play, the hardware warms up, changing the time it takes to do calculations and therefore the amount of time left over to add to the random number.
Ex NES dev here. Our RNG was also stupid, basically just multiplied by 5 and added 1 with an 8-bit value.
It's actually not that stupid. That's how a lot of PRNGs are implemented even today. It's called a "linear congruential generator" (LCG). Look it up on Wikipedia.
The great thing about multiplying by a "relatively" large number (5 is relatively large for byte values) is that it quickly overflows. The whole sequence of (x*5+1)%256 is:
--
0, 1, 6, 31, 156, 13, 66, 75,
120, 89, 190, 183, 148, 229, 122, 99,
240, 177, 118, 79, 140, 189, 178, 123,
104, 9, 46, 231, 132, 149, 234, 147,
224, 97, 230, 127, 124, 109, 34, 171,
88, 185, 158, 23, 116, 69, 90, 195,
208, 17, 86, 175, 108, 29, 146, 219,
72, 105, 14, 71, 100, 245, 202, 243,
192, 193, 198, 223, 92, 205, 2, 11,
56, 25, 126, 119, 84, 165, 58, 35,
176, 113, 54, 15, 76, 125, 114, 59,
40, 201, 238, 167, 68, 85, 170, 83,
160, 33, 166, 63, 60, 45, 226, 107,
24, 121, 94, 215, 52, 5, 26, 131,
144, 209, 22, 111, 44, 221, 82, 155,
8, 41, 206, 7, 36, 181, 138, 179,
128, 129, 134, 159, 28, 141, 194, 203,
248, 217, 62, 55, 20, 101, 250, 227,
112, 49, 246, 207, 12, 61, 50, 251,
232, 137, 174, 103, 4, 21, 106, 19,
96, 225, 102, 255, 252, 237, 162, 43,
216, 57, 30, 151, 244, 197, 218, 67,
80, 145, 214, 47, 236, 157, 18, 91,
200, 233, 142, 199, 228, 117, 74, 115,
64, 65, 70, 95, 220, 77, 130, 139,
184, 153, 254, 247, 212, 37, 186, 163,
48, 241, 182, 143, 204, 253, 242, 187,
168, 73, 110, 39, 196, 213, 42, 211,
32, 161, 38, 191, 188, 173, 98, 235,
152, 249, 222, 87, 180, 133, 154, 3,
16, 81, 150, 239, 172, 93, 210, 27,
136, 169, 78, 135, 164, 53, 10, 51
--
This is a full cycle through all 256 values without repeats. Yes, the distribution isn't perfect; for that, the multiplication factor needs to be higher so you get a greater "avalanche effect" between two successive numbers in the sequence. But multiplying by 5 is cheap with shift and add (left shift by 2, then add the original value: in essence, multiply by 4 and add one more copy to get 5). So it's a great compromise between practicality and quality.
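A minimal C sketch of that generator (matching the sequence above; the mod 256 falls out of 8-bit overflow for free):

```c
#include <stdint.h>
#include <stdio.h>

/* One step of the 8-bit LCG: x' = (x*5 + 1) mod 256.
   uint8_t arithmetic wraps on overflow, so the mod is implicit. */
static uint8_t lcg_step(uint8_t x) {
    /* Multiply by 5 the cheap way: (x << 2) + x is 4x + x = 5x,
       i.e. one shift and one add on most 8-bit CPUs. */
    return (uint8_t)((x << 2) + x + 1);
}

int main(void) {
    uint8_t x = 0;
    for (int i = 0; i < 16; i++) {   /* first 16 values of the cycle */
        printf("%u ", x);
        x = lcg_step(x);
    }
    printf("\n");                    /* 0 1 6 31 156 13 66 75 ...    */
    return 0;
}
```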
I've implemented the standard 32-bit Turbo Pascal LCG (multiplier 0x8088405) in Lua for some simple encryption in ComputerCraft :) Of course it's not really secure, but it works quite well. It serves two purposes: obfuscation of wireless modem messages and more resilience against random malformed messages. My system even used a fixed encryption key as well as a random per-message key, so it's harder to reverse engineer or detect patterns ^^.
@@Bunny99s Whoa, I should have expected to see a ComputerCraft mention out here! Do you do this stuff solo or is there some kind of community I'm not aware of?
What games did you develop?
I don’t get why it’s “stupid” if it works fine.
Can you like figure out this RNG pattern and abuse it to your advantage or something?
@@Bunny99s did you tell an actual NES dev to look it up on Wikipedia? 🤭
I love all those crazy and creative optimizations to make games fit and run on hardware that's so limited that it might seem that a game like that would not be possible :)
I'm only halfway into the video, so he may mention this, but my favorite optimization in Mario is that there are not 32 distinct levels. Many levels are the same level, and the game just checks whether it's the hard or easy version. Like, there are plants in EVERY pipe, but it checks "is this 1-1? If so, no plants." This is why the castle levels repeat, but some have more fire bars or a maze while the other version may not.
This drastically reduces the number of levels to store.
It's nice to see the NES's modern descendant, the Switch, doing some similar stuff.
I like that "random" generator which works on frames. In C64, the timer was located in zeropage which is very fast to access, so you could do the same thing essentially, just reading the timer instead for example. The idea to use the time between title page and start-of-game is also good to get a random number from start.
I really love these cute optimizations. Any recommendations for similar books/videos that show examples of good optimizations?
I love videos like this. I used to develop games in Macromedia Director, and it had a ton of limitations. When I made Mario Paint Composer I couldn't store music the way I wanted to, because Director didn't handle arrays very well. You couldn't just jump to the last note in the song - Director had to read every single item in the array before getting there. So the longer the song, the more time it would take to read subsequent notes.
So what I did is I saved the songs as image files. I could jump to any pixel and read its RGB value. So I'd store music that way. The R value in a pixel might be the note, the instrument, or the velocity. Then for each new note I'd just read the next horizontal pixel, then the next, then the next.
Finally, I understand my childhood. Thank you for building this talk from the ground up, with 0 required knowledge.
The amazing thing is how Kevin Zurawel explains this in such a simple way. This was 34 minutes, but to me it felt like 5. I want more, for sure.
Nice work, Kevin!
"If it's stupid and it works, it's not stupid." Great talk!
The ability to explain complex topics to people that aren’t too seasoned in the field is so impressive
I love that you could describe the idea of "generating" random numbers from a lookup table as "if you can't make your random numbers from scratch, store-bought is fine, too"
Another famous game that does it: DOOM. It also advances the index according to certain actions, since those take random numbers too. For example, any direct attack that hits something with HP queries the random table to deal a random amount of damage, from 1/4 to twice the stated power. (Explosives work differently; their damage is power minus the distance to the target, or no hit at all if that's negative - and up/down distance is completely ignored.) At some points, the A.I. running the creatures uses random numbers for its decisions, too.
But the bottom line is that DOOM demo recordings (the ones you can play back to watch what other players did) only work because the starting position is always the same. If you repeat the input in a frame-perfect way, you get the same result. And that's exactly what the playback subroutine does; the file contains user input and timing data only, while actual damage figures, shotgun spread, and A.I. decisions are recomputed.
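The whole mechanism fits in a few lines of C (a sketch - the table values shown are illustrative, not Doom's actual m_random.c contents):

```c
#include <stdint.h>
#include <stdio.h>

/* Doom-style RNG: a fixed 256-entry table plus an index that advances
   on every call and wraps at 256. Resetting rndindex to 0 before
   playback reproduces the exact same number stream, which is what
   keeps demos in sync. */
static const uint8_t rndtable[256] = {
    0, 8, 109, 220, 222, 241, 149, 107, 75, 248,
    /* ...remaining entries omitted; C zero-fills them in this sketch */
};

static int rndindex = 0;

int table_random(void) {
    rndindex = (rndindex + 1) & 0xFF;
    return rndtable[rndindex];
}

int main(void) {
    for (int i = 0; i < 5; i++)
        printf("%d ", table_random());   /* 8 109 220 222 241 */
    printf("\n");
    return 0;
}
```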
@@achtsekundenfurz7876 Doom can generate random numbers using the clock. It doesn't use a lookup table.
The very significant restrictions of the 8-bit era led to many incredibly ingenious solutions, all driven by expert-level knowledge of the CPU's instructions and equally ingenious memory-management optimisation. A truly unique, golden era of programming. I miss those days! I still program professionally today, and while it was of course inevitable, my historical viewpoint makes me somewhat saddened by how much has been abstracted away - so much is now handed to the developer that the absolute requirement for critical thinking present in the '80s is, quite simply, rapidly heading toward 'not required' status.
I love the idea of making NES games today, creating within the original hardware limitations as a creative practice. In my experience, creativity thrives under constraints.
It's amazing how occupied I stayed with such a small amount of data for so long. Contra collision detection was pretty fascinating.
Agreed. Now we just take the computing power we have for granted.
Worth noting on the graphics: each tile was 2 bits per pixel, allowing 4 colors per tile. Which 4 colors were used could be changed, or "palette swapped", efficiently allowing reuse with minimal space. This is why many enemy, player and background sprites look the same.
NES had multiple ways to use tiles and colours, this presentation could spend the whole 30 minutes just on that
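As a taste, a C sketch of how one 8x8 tile's two bitplanes combine into those 2-bit pixel values (the standard NES CHR layout: 16 bytes per tile, bitplane 0 then bitplane 1):

```c
#include <stdint.h>
#include <stdio.h>

/* Decode one 8x8 NES tile (16 bytes of CHR data) into pixel values
   0-3. Bytes 0-7 are bitplane 0, bytes 8-15 are bitplane 1; a palette
   then maps each 2-bit value to an actual color, which is exactly
   what makes cheap palette swaps possible. */
void decode_tile(const uint8_t chr[16], uint8_t out[8][8]) {
    for (int y = 0; y < 8; y++) {
        uint8_t lo = chr[y];        /* bitplane 0 row */
        uint8_t hi = chr[y + 8];    /* bitplane 1 row */
        for (int x = 0; x < 8; x++) {
            int bit = 7 - x;        /* leftmost pixel is the high bit */
            out[y][x] = (uint8_t)(((lo >> bit) & 1) |
                                  (((hi >> bit) & 1) << 1));
        }
    }
}

int main(void) {
    const uint8_t solid_color3[16] = {   /* both planes all ones */
        0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,
        0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF };
    uint8_t px[8][8];
    decode_tile(solid_color3, px);
    printf("%u\n", px[0][0]);            /* prints 3 */
    return 0;
}
```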
The most impressive thing is that those programmers did all this without the internet, helped only by shitty documentation translated from Japanese.
Understanding how the NES worked was already an enigma to solve before you even started coding for it.
29:25 "...2.5inch disks, here they are with a save icon for size reference" That amused me way more than it probably should have :D
Haha it made me go crazy at the realization that we still use floppies as save icons.
Same reaction here. I'm like "interesting lecture, cool stuff", then "save icon" XD
I used to make very small compressed (4 kilobyte) games for programming competitions. There are a lot of parallels to the optimization and simplification they had to do for those old platforms. Modern games don't really worry about such detailed optimizations much unless it impacts framerate / streaming speed, which ends up in those 8-gigabyte patches for a few changes in the game world and logic.
As a software developer that is still enamored with video games, this still fills me with admiration for the game devs of old. Hardware limits were there but that just forced devs to optimize like crazy.
Toshihiko Nakago, that's the guy who programmed Super Mario Bros. And The Legend of Zelda. And the following games of both series. He deserves a lot of praise, yet he doesn't even have his own Wikipedia page in English. 🙁
pioneer of the game!
“If it’s stupid and it works, it’s not stupid”
That linear feedback XOR register 'algorithm' is exactly how we generated random noise in our hardware for dithering audio. Super useful.
Galois LFSRs are super commonly used everywhere. CRC is built on them, for example. I was first exposed to them when I read Fabien Sanglard's explanation of how the Wolf3D killscreen was drawn.
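For anyone who hasn't seen one, a single Galois LFSR step is tiny in C (textbook 16-bit taps shown here, not necessarily the polynomial Tetris or Wolf3D used):

```c
#include <stdint.h>
#include <stdio.h>

/* One step of a 16-bit Galois LFSR with taps 16,14,13,11 (feedback
   mask 0xB400): shift right, and if the bit that fell out was 1, XOR
   the taps in. Any nonzero seed cycles through all 65535 nonzero
   states before repeating. */
uint16_t lfsr_step(uint16_t s) {
    uint16_t lsb = s & 1u;
    s >>= 1;
    if (lsb)
        s ^= 0xB400u;
    return s;
}

int main(void) {
    uint16_t s = 0xACE1u;            /* arbitrary nonzero seed */
    for (int i = 0; i < 8; i++) {
        s = lfsr_step(s);
        printf("%04X ", s);
    }
    printf("\n");
    return 0;
}
```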
I absolutely adore the creative problem-solving that goes into making these old games fit such tiny storage limitations. The example of how Contra handles collision blew my mind. Excellent talk and presentation.
Great talk. I had no idea how far they were pushing the limits of the system to provide the games I grew up on. New appreciation.
Contra's random number system is the best system presented. User feedback and timings are much better than any pseudo random system. The important point about the other two systems presented is what is the seed? Contra's system essentially creates new seeds constantly.
I was gonna say that it sounded like the most random one
Great talk, Kevin! I wrote a few NES emulators back in the day, and wrote z80 assembly for GameBoy Color professionally for a couple years back in the 1900's, and this presentation was spot on.
I've been coding since about 1980 and I loved this talk! Some I knew and some I didn't know.
Another noteworthy use of random numbers is the noise channel of the Audio Processing Unit, used for generating snare drums, cymbals, gunshots, etc. The linear-feedback shift register method happens to create a full spectrum of sounds by varying the sample rate, or, by shortening the sequence of bits, producing a metallic tone as heard in the Mega Man games.
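A sketch of that noise LFSR in C, following the NESdev wiki's description (15 bits; feedback is bit 0 XOR bit 1 normally, or bit 0 XOR bit 6 in the short mode that yields the metallic tone):

```c
#include <stdint.h>
#include <stdio.h>

/* One clock of the APU noise channel's 15-bit LFSR. In short mode the
   register often falls into a short loop, which is what produces the
   buzzy, metallic timbre. Bit 0 drives the audio output. */
uint16_t noise_step(uint16_t reg, int short_mode) {
    uint16_t feedback = (reg ^ (reg >> (short_mode ? 6 : 1))) & 1u;
    reg >>= 1;
    reg |= feedback << 14;           /* feedback shifts in at bit 14 */
    return reg & 0x7FFFu;
}

int main(void) {
    uint16_t reg = 1;                /* register is seeded nonzero */
    for (int i = 0; i < 16; i++) {
        reg = noise_step(reg, 0);
        putchar('0' + (int)(reg & 1));   /* the output bit stream */
    }
    putchar('\n');
    return 0;
}
```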
This is exactly why the Atari had a hardware random number generator (a 17-bit shift register). LDA RANDOM is so much easier on the Atari 8-bit.
Always amazes me how powerful abstraction is - how much power you can achieve just by dismissing some unnecessary details.
This is kind of the opposite of how abstraction is usually handled in modern software development: build something complex and, when that's done, provide an interface (Java ruined this word) so people can use it without worrying about the underlying meticulous construction. I find that algorithmic problems are usually easiest to solve with as few lines as possible: write simple code, just a few lines, and run it. It's surprising how often it just works or requires only 1 or 2 small changes. Neurotically checking for null before the program is even tested is just bad, because it makes writing it a lot more mentally taxing and half the code ends up completely unnecessary.
Slow burn up to 34:00 was worth the like and applies to all of life, not just software. Thank you!
I'm an old-school programmer from the 70s. APL and BASIC had pretty good random functions. If you wanted to help that along a little, you would specify a seed, which is an offset into the random table. One easy way to generate a random seed was to run a timing loop after requesting an input, to see how many cycles it took the human to answer the question - "Number of Players?" or whatever it was. When coding in Z-80 assembly, if you needed a pseudorandom number, the R register came in handy. That was the RAM refresh register, which kept track of which memory address was being refreshed. You could take that value, AND it with however many bits you needed, and VOILA! Good enough for gubment work, as we said.
Thanks for this talk! It was really interesting to get an overview of all the struggles developers had to deal with back in the day and the creative solutions they came up with in the end.
Regarding randomly generated numbers, a great example that uses the RNG table is Doom, since the developers wanted players to be able to record their gameplay into demo files and replay them on any PC that had the same Doom version. By using the RNG table, you could always expect the same results (behaviors, values, etc.) when replaying the demos. This ultimately led to the popularization of speedrunning in general, since players could now record and distribute their playthroughs without a camcorder or other bulky, expensive solution, while also producing tiny *.LMP (demo) files.
Actually, the LFSR as shown here for Tetris is also completely deterministic and is easily reset to a known internal state; from that state on, it always reproduces the same sequence. I think the random table was chosen merely because it was faster and good enough.
wow that bit about contra and how it handles collision detection is actually really ingenious.
Holy smokes! I first saw this video about a year ago, and I just recently went through the entirety of what has been published so far of the Famicom Party book without realizing Kevin was the author. It's a great blog series, and I'm looking forward to more parts of it being released.
In case Kevin sees this, "Thank you very much for what you have provided."
Same here! The other way around tho, I started the book a few days ago :)
Great talk! No mention of sound and music, an important feature of the NES. I've been asked how game programs had so many things going on at once; I did it with finite state machines. One racing game I developed needed trig tables, which I precomputed; they took up an entire bank of ROM in the cartridge. We called the tiles "characters", which is why they're named chr. Also not mentioned are the 2x2 cells for background characters that specify their palette. We generally used layers of indirection to define large worlds with few bytes, which you call abstraction. Sprite chrs could also be mirrored horizontally and vertically, and we created tools to compress artists' graphics into character maps so they didn't need to be done manually. Music was also compressed with pattern recognition (abstraction) during assembly.
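Precomputing such a table is straightforward on a modern machine; a C sketch that generates a 256-entry sine table in signed 8.8 fixed point (the size and format here are my assumptions, not this dev's actual table):

```c
#include <math.h>
#include <stdio.h>

/* Emit a 256-entry sine table, scaled to signed 8.8 fixed point and
   formatted for pasting into assembly source. All of this has to
   happen offline: the 6502 has no floating point, or even a multiply
   instruction, so the baked table is the whole trick. */
int main(void) {
    const double pi = acos(-1.0);
    for (int a = 0; a < 256; a++) {
        int v = (int)lround(sin(a * 2.0 * pi / 256.0) * 256.0);
        printf("%6d%s", v, (a % 8 == 7) ? ",\n" : ",");
    }
    return 0;    /* compile with -lm */
}
```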
I have watched so many videos about 8-bit systems, but to be honest this guy explained it so easily that I actually understand how it works now!
the benefits of "embracing the stupid" are so incredibly real; I'm VERY new to game dev and have learned so much from just going "whatever; I'll do it this way for now" and moving on to something else for a while with my technically-functioning solution intact. I've been able to go BACK later with skills I learn doing Other Stuff and make my stupid solutions better / more like the way I wanted them to work originally... & obviously at a certain point, it's reasonable to just say "this is good enough" if your game functions and you don't have anything else to fuss with before you declare it Finished. I love the thing about just writing the save file 8 times especially... sometimes brute force is exactly what you need!
Excellent and interesting presentation! It's easy to appreciate the creativity and pragmatic problem solving game devs had to use.
I loved this talk! I enjoyed it so much I actually wish it was a series where I could move on to the SNES or Genesis next, then the N64 or PlayStation, and so on to learn about the advancements and new techniques introduced over time!
wow that would be awesome, if you ever find a series like that on YT hit me up. I'll do the same :)
It's impressive how optimized those games were and how clever the devs were able to reduce the size. Nowadays it's "Yeah, sure, 90 Gigabytes are absolutely fine! We have basically unlimited space, so why optimize?"
Should be noted that the average 90GB game's size usually consists of very compressed texture data and model data. Game studios usually buy services or hire consultants to help reduce these sizes. It's very important to them and not at all something they shrug off. This is a standard "gamer doesn't understand the problem complexity and calls game dev lazy" pattern.
The reason game devs REALLY care about this is that you need to get this data from CPU to GPU, and for the game to run ok you need to be able to do this very quickly. There are plenty of middleware companies that specialize in exactly this sort of thing. So they're optimizing for CPU->GPU transfer rate, and this leads to complex trade-offs that surprisingly don't always involve making the texture as small in memory as possible.
I'm not saying the "who cares" attitude doesn't exist, but there's no room for it if you want to be a successful competitor at the top.
Also consider that textures these days are WAY larger than in the past. You might think a 4096x4096 texture is about four times larger than a 1024x1024 texture, but it's actually 16 times the number of pixels. So while you might be surprised that we've gone from 10GB games to 120GB games in the last few years, that's entirely within that 16x multiplier.
Keep in mind that sometimes a bigger size is used for performance. As an example, Crash Bandicoot on PS1 put filler data at the center of the disc so the actual data sat at the edge, allowing more throughput as the disc spins.
A lot of the data in modern games that makes them so big is actually audio, in most cases. When you want a ton of voice acting and you need it to play basically instantly, without the overhead of decompressing it first, you end up with a crap ton of large audio files. It was a lot easier to keep things small when all conversations were just a series of generated beeps accompanying the text roll, heh.
It's even worse if the game has been localized into several languages and each gets its own set of audio files that goes out in every copy, instead of just to the regions that need them. That's one of those areas where a lot of studios aren't taking proper advantage of modern game download services, which can technically use a different depot for each language and only download the one that matches the user's settings.
this is extremely eye opening, thanks for posting!
Wolfenstein 3D and DOOM both use a hardcoded table of 256 values for random numbers (the same one, in fact). Lookup tables were fairly common for everything back when CPUs weren't much faster than RAM: sin/cos tables, fixed-point math tables.
Great lesson at the end there. I tend to find myself trying to perfect a feature I'm putting in a game rather than dumbing it down.
I completely agree with the "Embrace the Stupid" concept. I often run into people trying to achieve perfection for stuff that doesn't require sophistication. If mainstream makers understood the concept, many things could be accelerated first and then, if required, refined toward perfection...
This is why games like this were awesome. Similar to Star Wars - the original. Limitations breed creativity and passion; unlimited resources lead to trash. Awesome talk.
Cyberpunk 2077 moment
This was a random watch, but sweet jesus this was interesting! I still don’t understand half of it, but it felt like I understood it all due to the great presentation
Same for me. Haven't watched the full thing, but already know it's gonna be guud.
19:20 "In real life, we throw bullets at people and see if they hit, in contra, we throw people at bullets and see if they hit, is more efficient!"
Incredible level of ingenuity in the '80s. I was also going to say 'thinking outside the box' was mandatory, however, there really was no box at this time - you literally had to solve many problems yourself i.e. invent solutions. Remember also, no internet here - so no Google for your docs/resources, no stackoverflow/forum, etc, to answer all your questions. It was an incredibly exciting time as the home computer/console era exploded into so many households, with 6502/10, Z80, etc coders busy perfecting those assembly routines. It's equally incredible how far this tech has advanced. There is a real concern, though, in the obvious over-reliance on the internet to increasingly do all our 'thinking' - especially the 'hard thinking'.
This was excellent. It's amazing how elegant the best solutions are.
Great talk! And I totally agree: as long as something fulfills its purpose, it doesn't matter how dumb you go about getting there. Classic videogames are so interesting in that regard.
32:07 I dropped my jaw when I heard how Dragon Warrior saves its games. No wonder my saves on Dragon Warrior outlasted all other saves.
I really enjoyed this video. Great presentation, very well spoken. Funny, and smart. Great job!
Very interesting how problems got solved in the early days. I think we can learn a lot about good abstraction from those programmers :)
As a kid I remember being impressed that FF1 found a way to avoid people save scumming (resetting the game to get better rng, more favourable battles). Eventually I did guess that they just had one big random number though.
So technically I did notice.... but if anything it made the game better that it wasn't really random, rather than worse.
what a delightful talk
Holy shit...as someone who had just been born when the NES came out, the amount of tedious work necessary to squeeze out every byte of memory and CPU cycle and all sort of tricks the devs had to do back then is just amazing. Today you just take Unity/Godot/etc., throw in a tilemap and some sprites (who cares if mario.png is 2kB or 28kB), hook up the (pre-made) input system and there you go. But then again, making a game is much more complex today and you have to struggle with different shit...so not _that_ much has changed actually. :)
The NES had hardware sprites - luxury.
Tedious work? Almost every 8-bit game had between 1 and 3 programmers working on it, usually just one; the art and design team were another 2-3 people, plus one more person for sound, and the size of the code is counted in kilobytes. Each assembler instruction takes a byte plus data, and if you printed the whole codebase of a game out nice and legible, you'd have a fairly thin booklet. Most games were done in mere months. Today, an average production takes a thousand people, around 200 at the head studio and the rest outsourced, and takes around 3 years. Even just clicking together a shitty little health overlay in Unity can take you days; they'd have been done with that in a couple of hours. The scope is completely different. I'd say the work done back then may seem unusual from today's point of view - a lot of that basic skill is no longer conveyed and is thus kind of lost, and not everyone was equally successful back then either - but it's not actually all that high up on the difficulty scale.
@@SianaGearz Lovely point. I recall watching someone finish the Mario game (just the game), and my jaw dropped at the number of developers on just that one game, given how long it took the credits to scroll all the way.
Today you don't just take Unity, Unreal, etc. Almost every game for Nintendo Switch requires careful resource management, from Nintendo games like Mario Odyssey to Witcher 3, Doom and Wolfenstein. And late in a regular console generation, like with the PS4 and Xbox One, some talented developers manage to press every drop of performance out of those weak 2013 AMD Jaguar-based APUs in those systems, which is also quite astonishing in some cases.
Very cool talk!
Also, in the 2nd pattern table at 06:20, all that garbage at the bottom is actual program code: they couldn't fit the game's code into PRG-ROM, so they put the overflow in CHR-ROM, copied it to RAM, and ran it from there to fit into the 32 kB + 8 kB footprint.
I didn't think you could copy from CHR-ROM to RAM. I was under the impression that anything on the CHR chip was totally inaccessible to the CPU.
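It is reachable, just not directly: the CPU can read CHR through the PPU's address/data ports ($2006/$2007) while rendering is off, which is how code or data can be copied out of CHR-ROM into RAM. A minimal C-style sketch of the idea (register addresses per the NES docs; function and variable names are mine):

    #include <stdint.h>

    #define PPUADDR (*(volatile uint8_t *)0x2006)  /* PPU address port */
    #define PPUDATA (*(volatile uint8_t *)0x2007)  /* PPU data port */

    /* Copy len bytes out of CHR space (pattern tables at PPU $0000-$1FFF)
       into CPU RAM. Assumes rendering is disabled and the PPU's address
       auto-increment is set to +1. Reads through $2007 are buffered, so
       one dummy read is needed before real data comes out. */
    void copy_chr_to_ram(uint16_t src, uint8_t *dst, uint16_t len) {
        uint16_t i;
        PPUADDR = (uint8_t)(src >> 8);   /* high byte first */
        PPUADDR = (uint8_t)(src & 0xFF);
        (void)PPUDATA;                   /* discard the buffered byte */
        for (i = 0; i < len; i++)
            dst[i] = PPUDATA;            /* address auto-increments per read */
    }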
I feel like at times he was pausing just in case there was a giggle or two. But overall a 100% INSANE presentation. I knew the basics of level design and sizing, but how you broke it down was great! I already shared this with my fellow gaming nerds!! Thanks again for posting!
Metroid should have been the dumbest RNG example: it pulls something from the initial state of RAM, and it only does this once, which is why certain enemies **don't** vary like they should.
To answer the final question: yes. The random function returns a float (from memory, everything in JS is a float) between 0 inclusive and 1 exclusive, so if you multiplied by 255 you'd get a number between 0 and 254, not 0 and 255, given that floor returns the integer portion of the float.
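In C terms the difference looks like this (a small sketch; rand01() here is just a stand-in for a uniform [0, 1) source):

    #include <stdint.h>
    #include <stdlib.h>

    /* Stand-in for Math.random(): uniform double in [0, 1). */
    static double rand01(void) { return rand() / (RAND_MAX + 1.0); }

    void scale_demo(void) {
        double r = rand01();
        uint8_t a = (uint8_t)(r * 255.0);  /* 0..254: 255 itself is unreachable */
        uint8_t b = (uint8_t)(r * 256.0);  /* 0..255: the full byte range */
        (void)a; (void)b;
    }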
23:18 The original Doom does exactly this: 256 numbers uniquely spread out in an array. Whenever anything needs a random number: pointer + 1, take it out of the table, and wrap around. Simple.
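That matches the released Doom source: M_Random just bumps an index into a fixed 256-byte rndtable. A sketch (the first few entries are from Doom's actual table; the remaining 248 are elided here and default to zero, where the real table fills all 256):

    #include <stdint.h>

    static const uint8_t rndtable[256] = {
        0, 8, 109, 220, 222, 241, 149, 107  /* ...248 more bytes... */
    };
    static uint8_t rndindex = 0;

    uint8_t m_random(void) {
        rndindex++;                 /* uint8_t wraps past 255 on its own */
        return rndtable[rndindex];
    }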
I was going to say that. It allows for synchronisation across networks and recording of demos, because the randomness is essentially generated by player input. That wouldn't work with Tetris, because there the player input has to respond to the random element. I suppose that user-generated randomness is kind of like an early version of some forms of CAPTCHA.
With a 6502 processor I usually used the 1/100-seconds value of the timer. So long as the random numbers were required at random times, this always produced a genuinely random number from 0 to 99.
I really appreciate the mantra about stupid ideas. I was developing an algorithm in C++ that I felt was stupid: a trivial method to return … random strings. I just concatenated what was needed, picking characters at random indices, and wrote the result to stdout. Fast forward, and I find an open source program with a similar function; reading the code, I noticed it was done in a more elegant but, in essence, identical way.
It made me smile, because what I, as an amateur, thought was so dumb, a paid professional had arrived at too. Not such a dumb idea after all.
Unless of course the individual copied my code and refactored it 🙄!
Either way, Jonathan - you can still pat yourself on your back!
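For what it's worth, the 'dumb' version really is only a few lines. A sketch in C (charset and names invented here, not the commenter's actual code):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static const char charset[] =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";

    /* Fill out with len characters drawn at random indices of charset. */
    static void random_string(char *out, size_t len) {
        size_t i;
        for (i = 0; i < len; i++)
            out[i] = charset[rand() % (sizeof charset - 1)]; /* -1 skips NUL */
        out[len] = '\0';
    }

    int main(void) {
        char buf[17];
        srand((unsigned)time(NULL));
        random_string(buf, 16);
        puts(buf);
        return 0;
    }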
Amazing presentation. Made such a lot of information so easily understandable and interesting. Kudos!
As you were describing how devs solved random number generation, I couldn't help but think about how pokemon games handled it (they were also 8-bit, but with significantly more hardware to work with). An absolutely perfect example of a dumb solution that works really well (but that you can exploit in-game if you know what's going on in the code) is called DSUM. Basically there were two variables: the first codes for which pokemon will appear in a random encounter, the second for whether or not an encounter is generated on a given check. Those two numbers are manipulated each frame in a somewhat predictable manner, such that the cycling of the values can be tracked. Any given in-game area has a set of pokemon that can be encountered there, each with a specified chance, but these "chances" are coded as ranges of values out of 256, so you could have a roughly 20% chance to see a pokemon (51/256), a roughly 10% chance (25/256), and so on.
The thing is, because you can track the DSUM which cycles roughly every 6.5 seconds, you know which pokemon can appear at any given time. This means you can decide when to take actions in game that can generate an encounter to manipulate which pokemon actually appears (rather than waste time with getting the wrong encounter). You can't really track whether or not the encounter will be generated in the first place, but by only attempting to generate one when the results would be in your favor, you can save a significant amount of time searching. This is particularly relevant in speedruns, where fractions of a second matter.
If anyone is curious about this, you can find more by searching "DSUM manipulation" on Google, or by watching any generation 1 or 2 pokemon speedrun, especially the ones on the Games Done Quick channel, as those are marathon runs where the runner explains what they are doing.
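The slot check itself is just one byte compared against cumulative cutoffs. A rough sketch (thresholds are illustrative, not the actual Gen 1 tables):

    #include <stdint.h>
    #include <stdlib.h>

    /* Stand-in for the game's RNG read. */
    static uint8_t rand_byte(void) { return (uint8_t)(rand() & 0xFF); }

    /* Pick an encounter slot by comparing one random byte against
       cumulative thresholds out of 256. */
    int encounter_slot(void) {
        uint8_t r = rand_byte();
        if (r <  51) return 0;   /* 51/256, roughly 20% */
        if (r < 102) return 1;   /* another ~20% */
        if (r < 141) return 2;   /* 39/256, roughly 15% */
        if (r < 166) return 3;   /* 25/256, roughly 10% */
        /* ...and so on down to the rarest slots... */
        return 9;
    }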
what I like is that it's true creativity, of a kind you'd never suspect
As a coder from that era, this is great stuff
The best thing ever to happen to this world is the NES home system. Still playing it.
Played through Mega Man 6 while watching this, myself!
I'm not into programming, so I have very little knowledge, and I loved that presentation, thanks for making it so easy to understand!
Loved the conclusion! Will take with me for life.
Awesome talk! Small correction: the Game Genie does not write to RAM but intercepts the ROM. If the CPU requests a certain address in the ROM data, the Game Genie just replaces the value with something predefined. There may also be a check value to make sure the address points to the correct ROM bank.
if(address == X && valueAtAddressX == Y) return Z;
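Spelled out a little more as a sketch (variable and function names are mine; the optional compare value is how the longer 8-character codes are documented to behave):

    #include <stdint.h>

    uint16_t patch_addr;     /* address encoded in the code */
    uint8_t  replacement;    /* value to substitute */
    uint8_t  compare_value;  /* only present in 8-character codes */
    int      has_compare;

    uint8_t cart_read(uint16_t addr);  /* stand-in for the real cartridge */

    /* The Game Genie sits between console and cartridge, so every ROM
       fetch passes through it. */
    uint8_t genie_read(uint16_t addr) {
        uint8_t value = cart_read(addr);
        if (addr == patch_addr && (!has_compare || value == compare_value))
            return replacement;  /* compare keeps bank-switched ROMs honest */
        return value;            /* otherwise pass the real byte through */
    }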
Yeah, the Action Replay and GameShark do it with RAM instead. Actually... I wonder how that works. The Game Genie is between the cartridge and the system, so it can just act as a middle man and switch the values up, but the GameShark sits there too, so how does it intercept the RAM? I guess it could switch instructions? Like, if there's an LDA for a value it has a cheat for, it changes the LDA so it loads from ROM instead of RAM? But then that wouldn't really work when it's not directly loading the RAM address, like with LDA $adr,X, because the GameShark doesn't know the value of the X register, or whether the instruction is loaded from RAM instead of ROM.
Thanks for the video! The collision part really helped me!
I've started making my own NES game, which also requires some sort of RNG. I came up with a different approach that works pretty well for me: I use the key inputs in each frame and add a value to a variable that represents the random number. If I hold the up key, 1 is added each frame; if I hold the right key, 2 is added; if I hold the down key, 3 is added; and so on. Since the number is stored in a byte, it wraps back to 0 past 255.
I have a controller that spawns enemies at random positions at a fixed interval. When you play the game, you automatically change the random number that is used as the spawn position for the enemies.
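A sketch of that scheme in C (the button masks are hypothetical):

    #include <stdint.h>

    #define PAD_UP    0x08   /* hypothetical controller bit masks */
    #define PAD_DOWN  0x04
    #define PAD_LEFT  0x02
    #define PAD_RIGHT 0x01

    static uint8_t rng = 0;

    /* Called once per frame with the current pad state;
       uint8_t arithmetic wraps mod 256 on its own. */
    void update_rng(uint8_t pad) {
        if (pad & PAD_UP)    rng += 1;
        if (pad & PAD_RIGHT) rng += 2;
        if (pad & PAD_DOWN)  rng += 3;
        if (pad & PAD_LEFT)  rng += 4;
    }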
I think when programming it's important to prioritize intent and apply it appropriately to what you are developing: programming with intent. As you mentioned, there is no reason to create a "true" random number generator, and to spend time doing so, if it is not necessary and is many levels removed from the actual purpose and intent of the overall code (which is the game).
If it works, it works. The actual product and gameplay are the priority, so they should be the most important aspect of development. Now, if some pseudo-random number essentially leads to predictability and diminishes the difficulty/fun factor… that is another story.
Thanks man, really good video. Your section on graphics has given me a bit more enthusiasm for this Bomber King disassembly I've been doing for the last 3 years. Maybe I'll finally work out how the level data is stored ;)
There are some other titles from the NES that are and were legendary classics to play while also being programming masterpieces of their time. The Legend of Zelda, Metroid and The Guardian Legend!
Fun fact: the second random number approach (a hard-coded list of 256 random numbers that you just cycle through) is also used by the legendary Doom.
The reason, however, is the demo function. Carmack needed a way to create the illusion of randomness, but in a way that was deterministic. If the numbers generated were always completely random, demos would desynchronise and you couldn't play them back properly.
LFSRs *are* deterministic though
@Guy Dude The implicit state of a 16-bit LFSR like the one shown for Tetris here is just two bytes; with your own implementation, making it fully repeatable is not a problem at all. I think Doom chose precomputed tables purely for performance reasons.
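For comparison, a 16-bit Galois LFSR in C is tiny. The mask 0xB400 is a standard maximal-length choice (not necessarily the taps Tetris used):

    #include <stdint.h>

    static uint16_t lfsr = 0xACE1;  /* any nonzero seed works */

    /* One step: shift right, and if a 1 fell off the end, XOR in the
       tap mask. With a maximal mask the period is 65535. */
    uint16_t lfsr_step(void) {
        uint16_t lsb = lfsr & 1;
        lfsr >>= 1;
        if (lsb)
            lfsr ^= 0xB400;
        return lfsr;
    }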
Wonderful talk, super interesting thanks Kevin!
This was a fun watch
Really nice video! I started doing software development in the late '70s, but did not work with the 6502 or game development very much. I learned a lot of interesting things about how these things worked from your talk. Thanks!
BTW, developers today are really spoiled. Back in the day, we toiled for hours on a subroutine to try to reduce the byte count from 200 bytes to 150 bytes. We also counted individual instruction execution times to get our routines to run in under some small number of milliseconds.
Exactly, cycle counting was incredibly important, especially if you wanted your game to stand out. While the internet has changed much of our lives for the good, it has also very negatively affected our ability to think for ourselves and to problem-solve: why work out the problem yourself when stackoverflow can do that for you, lol. Not sure what the answer is, but we are definitely dumbing down on so many things.
Thank god for compilers.
Excellent presentation overall! Clear, thorough, and enjoyable :)
The design-guaranteed life of the save battery is 10 years. It's fairly rare for them to fail before 20 years are up. Beyond that, it's a matter of luck. I suspect the primary failure cause is just the chemical degradation of the cell over time, not the current being drawn and the capacity being depleted.
my Sonic 3 & Knuckles batteries still haven't died after 27 years :)
Absolutely amazing lecture.
Awesome talk. I did wonder why the NES had to be reset before being turned off after saving in Final Fantasy.
Very nice presentation indeed. A lot of the solutions to things made sense.
A lot of what he talked about coincides with the methods I came up with when I was learning. Excellent presentation.
Super Mario Bros. actually uses three bytes for Mario's x-coordinate. For instance, Mario's x-position might be stored as screen = 02, pixel = C3, subpixel = 50, which would mean he was two screens plus 195 and 5/16 pixels from the left edge of the level. Each screen is 256 pixels wide, and each pixel is 16 "subpixels" (in the sense that the lowest nibble is never used). So rather than 8.8 fixed point, it's more like 8.8.8.
Speed has only two bytes per coordinate. The high byte is in units of 1/16 px/frame, i.e. 1 subpixel/frame, so the maximum expressible speed is 16 px/frame (or nearly four screens per second). The low byte is in units of 1/4096 px/frame, i.e. 1/256 subpixel/frame. However, when Mario moves, his position only ever changes according to the value of the high byte; the low byte is ignored (which is why his position is always a multiple of 1/16 px). The low byte of speed (aka "subspeed") only matters for making acceleration smoother, because in the acceleration calculation the low byte is updated and its carry rolls over into the high byte.
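A sketch of that arithmetic (struct and names are mine, following the layout described above; leftward movement and signs are ignored for brevity):

    #include <stdint.h>

    typedef struct {
        uint8_t screen;    /* which 256-px screen */
        uint8_t pixel;     /* pixel within the screen */
        uint8_t subpixel;  /* sixteenths of a pixel, in the high nibble */
    } PosX;

    static uint8_t speed_hi;  /* 1/16 px per frame */
    static uint8_t speed_lo;  /* 1/4096 px per frame; only feeds acceleration */

    /* Acceleration: the low bytes accumulate and the carry rolls over
       into the high byte, which is what makes it smooth. */
    void accelerate(uint8_t accel_hi, uint8_t accel_lo) {
        uint16_t sum = (uint16_t)speed_lo + accel_lo;
        speed_lo = (uint8_t)sum;
        speed_hi += accel_hi + (uint8_t)(sum >> 8);
    }

    /* Movement: only the high byte of speed is applied, in sixteenths. */
    void move(PosX *p) {
        uint16_t sub = (uint16_t)(p->subpixel >> 4) + speed_hi;
        p->subpixel = (uint8_t)((sub & 0x0F) << 4);
        uint16_t px = (uint16_t)p->pixel + (sub >> 4);
        p->pixel = (uint8_t)px;
        p->screen += (uint8_t)(px >> 8);
    }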
Awesome talk!
This is really fascinating!
i learned so much from this video. things i have always wondered ....
33:10 for my future reference..
Also known as KISS - Keep It Simple, Stupid
Literally was at 33:08 as I read this lol
This was a really fun and comprehensive talk. Makes me kind of wish I was a developer back in those days so I would have to tackle those types of low level challenges.
I'd like to try my hand at Game Boy development which, if I'm not mistaken, carries some of similar patterns.
Great talk! However, a point of order: the Atari 2600 was NOT the immediate predecessor to the NES. There was an intermediate generation, the Atari 5200 and the ColecoVision. That second system, as I recall, had the distinction of hosting the first home port of Donkey Kong as its pack-in title. The NES was still a marked improvement over both of these consoles, as well as the Mattel Intellivision, which lived between the VCS and the Coleco generations, but it was far less of a leap than over the VCS. Among other things, the Pitfall II cart was the first game I am aware of with a multi-part, four-instrument soundtrack accompanying the entire game, with different songs played for different situations.
This entire talk was fascinating! Loved the split between abstract concepts and how they were implemented. hope to see more like it.
What's interesting about this to me is that, as a beginner game dev, I needed an RNG system for drop rates that felt random but couldn't actually be random.
My workaround was to have the player character generate a new number in a linear sequence every frame the game runs, which determines whether an enemy will drop anything. If that number allows the enemy to drop something, the enemy then reads its own linearly generated number, which determines what it drops.
Between the frame-perfect timing, the "randomness" of what number the player character could have, and the fact that the enemy's value is specific to each object and only starts counting once it's on screen, it truly does feel random and fair.
Is it stupid? Yes. Is it not actually random? Of course. But is it functional? Yes, so it's good enough for me.
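A sketch of those two counters (numbers and names invented):

    #include <stdint.h>

    #define NUM_DROP_TYPES 4

    static uint8_t drop_counter;  /* advanced by the player every frame */

    void player_frame_tick(void) { drop_counter++; }

    /* Does this kill produce a drop at all? Here, one frame in five. */
    int should_drop(void) { return drop_counter % 5 == 0; }

    /* Each enemy keeps its own counter, started when it comes on screen. */
    typedef struct { uint8_t counter; } Enemy;

    void enemy_frame_tick(Enemy *e) { e->counter++; }

    int drop_type(const Enemy *e) { return e->counter % NUM_DROP_TYPES; }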
One thing that can be considered random for all practical purposes is polling user input and feeding it into the existing pseudo-random number sequence in some way. It works particularly well with analog inputs, since they have a wide range of values. I'd do really stupid shit like string them together in nonsensical equations with math functions to produce unpredictable results. This was on modern hardware, but I was paranoid. =)
@@kevinfishburne The Apple II did that back in the late 70s. While it waited for input, it would increment a 16-bit random seed.
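The trick is easy to picture. A sketch (the increment-while-waiting behavior is as described above; the polling functions here are stand-ins):

    #include <stdint.h>

    static uint16_t seed;

    int key_available(void);  /* stand-in for polling the keyboard */
    uint8_t read_key(void);

    /* Spin the seed while waiting; human timing makes the value at the
       moment of the keypress effectively unpredictable. */
    uint8_t wait_key(void) {
        while (!key_available())
            seed++;
        return read_key();
    }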
Normally you'd just give your platform's random function a set seed, that way you can get a random sequence that is always the same if you give it the same seed
22:38 The table of random numbers is also what Doom uses. And it even works when you are replaying a demo file, since every random number comes out the same.
What a great talk.
Hi Strange Loop ! you rock.. haha was clapping all the way through this until I stopped and hit download. Thanks, will be back :-)
Really interesting! Now I really understand why Super Mario Bros 3 was such a technical masterpiece. Wish you'd broken that down 😅👌 Excellent talk regardless!
The one thing I have not been able to find anywhere on YouTube is what kind of programs they used to write these games?!
This was really interesting and fascinating :D
23:00 not as dumb as it sounds, because 1) if the random numbers are used for more than one thing, the "offset" keeps shifting all the time, and 2) the numbers are probably computed modulo something, which makes the same roll essentially uncorrelated with any other roll taken modulo something else
It also depends on the condition of the hardware. As you play, your hardware warms up, changing the time it takes to do calculations and changing the amount of time left over to add to the random number.
It would be mod 256 no matter what because it's just 8-bit addition with no overflow protection.