I was thinking 20 bits, i.e. a 20-bit integer. Because the next state is always a direct function of the current state, the first time a number repeats we can see the period. The Boolean functions could act as a pseudo-random number generator, but this requires a long period and a few other properties of the numbers generated.
This reminds me of FPGA programming, if each logic element (node) is a LUT with inputs from N neighbours, feeding into a flip-flop with a global clock. You could write a program to generate Verilog to run random Boolean networks with tens of thousands of elements running at hundreds of MHz. Although I don't think you can have truly random routing in an FPGA, there are limits on the connections between logic elements you can have.
Doesn't randomness in the output depend on the pseudorandomness of the algorithm used? In other words - since under the hood, every randomness generated that is not fed actual random data (such as user mouse movement, electric grid noise, lava lamp pictures etc.) is based on some mathematical function being used, doesn't actually this show up in the end result?
Interested folks would do well to scan the descriptions of Stuart Kauffman's books "The Origins of Order" and "At Home in the Universe" -- any interest at all will be kindled into a roaring flame. Thanks for this video
This is very similar to the Ising model used in statistical mechanics. Might be able to use statistical mechanics to describe the behavior of the system.
Does math not predict this behaviour? There are 2^20 different states the array of booleans can be in, so in 2^20 + 1 iterations you must hit at least one state you have found before, and from that point on to where you see the state again you would get a repeating pattern?
The fact that the pattern repeats is not surprising. The fact that even complicated repeating cycles may be robust (ie. stabilize again to the same cycle, when change is introduced) is the interesting bit.
Could the reason for the repeating patterns be that the computer is not actually truly random and is actually spitting out numbers in some pattern? Maybe the same experiment with a lava lamp or something would be truly random..
Alright mate how was your weekend. IT was great I played with random boolean values to see what happened 😂😂😂😂😂😂😂😂😂 I think this guys been locked down for too long hes looking for patterns in random 1's and 0's.
Of course there is a possibility that repeating patterns could emerge, that's exactly how ring oscillators work. So Random Boolean Networks just show that the technology we've been using for a long time exists, doesn't have any practical use and doesn't prove anything.
@@vylbird8014 true, I guess it is a finite state machine. What I meant by repeating patterns is a sequence of more than one state that repeats, rather than every cycle being the same; otherwise it's obvious that it would repeat.
Could you use this for encryption? It seems like if you start with enough nodes and enough connections it would be very hard to calculate backwards without knowing the start. The starting parameters would be some sort of key in this case.
anti-chaos, actually. A chaotic system is defined by simple rules but is extremely sensitive to initial conditions so can't be predicted. These systems are also defined by simple rules but are not very sensitive to initial conditions so they give predictable outputs.
@@jecelassumpcaojr890 How are they not very sensitive to initial conditions? Look at the total difference in results, including ones that are kinda stable in the end. Small differences in the initial conditions of the network can suddenly result in a totally unstable network. Or one that repeats. Or a random one. The fact that some of them exhibit robustness doesn't refute that either. IMHO it makes it even more interesting and fits with chaos theory as well.
@@Ddeletham I should have said that binary random networks include some that are not sensitive to initial conditions and so are anti-chaotic. Not all of them have the same behavior, just as not all cellular automata have the same nature.
So, depending on the network, some are chaotic and some are not. Makes sense, after all not even the Lorenz System is chaotic for all parameter values.
This is making something complex where complexity is not needed to find that patterns would in fact emerge....1d cellular automata does these exact patterns all the time, the only difference to this complex system is the choice in the neighbors. If you randomly pick neighbors and don't change them afterward, then all you've done is re-arrange the 1d lattice....which you can then just sort to get back to a proper 1d CA. This isn't special, sorry.
This is a generalisation of GoL. The difference is that GoL needs a specific network architecture (i.e. a lattice) and a particular set of rules which in this case are generated randomly
This is a cellular automaton. In fact, it is a generalisation of elementary cellular automata assuming you allow a node to be connected to itself. Because in elementary cellular automata the structure is fixed, they are even simpler yet show similar behaviour.
Depends how you define it, but they are definitely related. This framework seems a bit more general, though: e.g. there's no need to keep the degree of all vertices the same, whereas a cellular automaton assumes the degree (number of neighbours) of each vertex/cell is equal, since it lies in a grid.
YES! I thought the same thing: it's like a cellular automaton, but not on the plane or in space, rather on some kind of graph with irregular neighbourhoods.
That's why first thing this reminded me of was Conway's game of life
I was actually wondering if they were related. I had just stumbled onto Conway's game of life three days ago and I was stunned by the resemblance.
IMHO. It seems to me that these are just random logical circuits.
Professor Turner was my lecturer at uni.. the best lecturer I had. The man brought my confidence and programming ability to a new level. Great man, thanks for your help Alex, you were the guy that made programming click for me. You are greatly missed at Hull uni.
Doesn't take much thinking to see the rules that would result from this, as it's really a finite state machine:
- Any RBN, left to develop long enough, will eventually settle into a cycle.
- For any RBN, there will exist a finite number of possible cycles. This number will be at least one, and at most 2^n where n is the number of nodes.
- Each cycle likewise will have a minimum length of one (ie, nothing changes each iteration) and a maximum length of 2^n.
- If you perturb a network, the state in which you place it will be either part of one of those cycles, or a state which will enter one of those cycles after enough iterations. The cycle it is in (or will end up in) may or may not be the same cycle you perturbed it from.
In a sense it is a state machine of state machines: while the RBN itself has a finite 2^n states, each of these states is itself part of a much smaller set that contains all the states in a particular cycle and those which lead to it.
You could generalise it to an infinite number of nodes, each following the same rule defined using relative positions... but then you've just invented a one-dimensional cellular automaton.
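Those bullet points are easy to check empirically. Here is a minimal sketch in Python of the 20-node, 3-input setup from the video (all function names and seeds are my own, purely illustrative):

```python
import random

def make_rbn(n=20, k=3, seed=42):
    """Build a random Boolean network: each node reads k other nodes
    and looks up its next value in its own random truth table."""
    rng = random.Random(seed)
    inputs = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """One synchronous update: every node reads its inputs at time t."""
    nxt = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]  # pack the k input bits into a table index
        nxt.append(tables[i][idx])
    return tuple(nxt)

def transient_and_cycle(state, inputs, tables):
    """Iterate until a state repeats; return (transient length, cycle length)."""
    seen = {}
    while state not in seen:
        seen[state] = len(seen)
        state = step(state, inputs, tables)
    return seen[state], len(seen) - seen[state]

inputs, tables = make_rbn()
start = tuple(random.Random(0).randint(0, 1) for _ in range(20))
transient, period = transient_and_cycle(start, inputs, tables)
print(transient, period)
```

Running this from different starting states (or after flipping a few bits of a state on the cycle) is exactly the perturbation experiment from the video.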
Like a linear feedback shift register eh? Isn't this how pseudo random numbers can be generated?
If each node connects to three other random nodes, wouldn't it be 3^n?
And you can even easily calculate all the states which will (after some iterations) enter one of the states of the cycle.
@ModestJoke 2^n is the maximum length of a cycle because that's the maximum number of unique states, and as soon as the network returns to a state it has been in before, it is in a cycle. The 2 comes from the number of options for a node: it can only be on or off, so 2 options.
@@Rob9 so ? FPS can randomise that ?
Turing has a paper on these types of networks: google "turing intelligent machinery". Pretty smart guy, back in 1948.
Stuart Kauffman's RBN model was introduced slightly before Conway's Game of Life (1967 - 1969). Conway's Game of Life was created in 1970, which is 14 years *before* Stephen Wolfram wrote his paper in 1984 classifying the dynamics of 1-D cellular automata (CA). RBNs are like cellular automata but they don't have a spatial grid/neighborhood associated with how they update, meaning they are more abstract and more powerful than Wolfram's CAs and can therefore demonstrate self-emergent order in the dynamics of complex systems that operate *without* the constraint of a grid. Stephen Wolfram wrote a popular book which gets a lot of attention from a younger generation of scientists and computer scientists, but he has a reputation for talking up his own work while saying very little about whose shoulders he stood on to do it (von Neumann, Ulam, Codd, Conway, and more), as well as the contributions to our knowledge about CAs from other researchers (Toffoli, Margolus, Langton, Packard, and more).
"Not too dissimilar" - guy giving a lecture on unnecessarily complex boolean behavior
"You can't say that this isn't too dissimilar" isn't what my stance would be.
If you allow more connections between nodes and design your transition rule table you should be able to represent Conway's Game of Life as a Boolean Network. So I'm not too surprised these networks show complex behavior.
It seems like the Game of Life is a very specific case of this, except that there’s an infinite number of nodes in GoL, whereas in a random Boolean network there’s an arbitrarily large but still finite set of nodes
I was thinking the same. In addition to what you said, in Conway's Game of Life the node connections are predefined instead of random.
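To make that reduction concrete: each Game of Life cell becomes a node with k = 9 inputs (itself plus its 8 lattice neighbours), and its 512-row truth table tabulates this function (a sketch; `gol_rule` is my name for it, not an established API):

```python
def gol_rule(self_state, neighbours):
    """Conway's Game of Life as a per-node Boolean function.
    self_state is 0/1; neighbours is the list of the 8 lattice
    neighbours' 0/1 values."""
    live = sum(neighbours)
    if self_state == 1:
        return 1 if live in (2, 3) else 0  # survival on 2 or 3 live neighbours
    return 1 if live == 3 else 0           # birth on exactly 3 live neighbours
```

The only things GoL adds over a random Boolean network are the fixed lattice wiring and this one specific table shared by every node.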
Drink every time he says boolean? Friday night sorted.
He says boolean? And i was wondering why he talks about the random booty network.
bewlean
How about every other time he says boolean, you drink. :)
Liver all sorted too lol
@@Xaminn haha
2:15 Does the definition of a random Boolean network require the edges to be ordered? If not, there could be as few as four unique states for three edges. Though, the truth table for a Boolean network with ordered edges can be arranged to function identically to one without.
Actually, because you have a finite number of states, by the pigeon hole principle, EVERY boolean network repeats eventually, with varying amounts of randomness at the beginning. As soon as you have a single repeated state, everything since the first occurrence of that state becomes the "loop". An interesting problem would be to find the maximal loop or the maximal randomness at the beginning.
What do you mean for "maximal randomness"?
@@RussellTeapot I mean, maximum amount of time before the first instance of the first repeated state. For example, in the sequence 1,4,3,6,5,2,7,8,7,8,7,8, maximizing the length of the 1,4,3,6,5,2 part (where there is no obvious discernible pattern without knowing the boolean table that defines the pattern)
@@jako7286 Oh, I see! Is it a solvable problem? Note that I'm very ignorant in mathematics: my flimsy intuition is that is not possible until you encounter a repeated state or you reach the maximum number of states possible, given the size of the network
@@RussellTeapot It is possible to brute force solve it with a computer for the 20 nodes and 3 connections per node case. There are no more than about 9.5 billion initial states (probably much much less when you reduce it for symmetry, which I didn't do in that rough calculation) combinations possible. It gets much more difficult the more nodes and the more connections you add though.
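A toy version of that brute force, shrunk to 2^10 states so it runs instantly (this is my own sketch, not the commenter's calculation):

```python
import random

def brute_force_max_transient(n=10, k=3, seed=1):
    """For one fixed random Boolean network, run every one of the 2^n
    initial states and return the longest transient: the number of steps
    before the trajectory first revisits a state, i.e. enters its cycle."""
    rng = random.Random(seed)
    inputs = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

    def step(s):  # states are n-bit integers; bit i holds node i
        nxt = 0
        for i in range(n):
            idx = 0
            for j in inputs[i]:
                idx = (idx << 1) | ((s >> j) & 1)
            nxt |= tables[i][idx] << i
        return nxt

    succ = [step(s) for s in range(2 ** n)]  # full successor table
    best = 0
    for s0 in range(2 ** n):
        seen, s = {}, s0
        while s not in seen:
            seen[s] = len(seen)
            s = succ[s]
        best = max(best, seen[s])  # seen[s] is the transient length for s0
    return best

print(brute_force_max_transient())
```

The same loop at n = 20 means a million starting states against a million-entry successor table, which is still tractable, but as noted it blows up quickly as n grows.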
I like all three of these words as individual topics; a video about all three combined is bound to be fantastic!
I like the size of the network he picked: 20 bits, small enough to fit on a text screen, AND small enough to iterate through all the possibilities in a fraction of a second. Since there are no hidden states, the longest possible pattern is 2^20 states, or just over 1 million.
It is more complex, if you consider the number of possibilities how the 20 nodes are randomly configured with 3 connections to other nodes. Plus the function input to output also adds possibilities.
@@diegamingkatze I think Mike's point is still correct. Given any set of rules or set of connections, the longest possible pattern is 2^20 cycles long-- and almost certainly shorter. The number of possible patterns (as opposed to their maximum length) is dependent on the factors that you describe: (19 * 18 * 17)^20 different ways to connect the nodes, and then 2^8 different input/output maps.
It looks like it's still too big to look at every combination fully. There are C(19,3) = 969 different possibilities to connect each node, so 969^20 graphs. And 8 truth table entries for each node, so 2^(8*20) possible truth tables for each graph. And 2^20 initial conditions for each of those. So about 8e113 total graphs and initial states.
@@gkoan The number of possible patterns is dependent upon the ways the nodes are connected and the input-output map, but you can put bounds on it: it is at minimum one (ie, it iterates through all 2^20 possible states) and at maximum 2^20 (ie, every possible state leads to itself).
It's funny you say you didn't expect the pattern to emerge, but you could totally encode the game of life with this binary system and have it work as expected
Random ?
This system has a finite number of states, so of course there will be repetition after enough iterations...
There are 2^20 possible states (never mind the number of possible connections between each node!), you wouldn't necessarily expect to find repeating patterns so easily. Also, I'm sure 20 was only picked as an example, I wouldn't be surprised if they examine much bigger networks than this.
Each digit of pi has a finite state (0-9), but no pattern / repetition.
@@JensRiggelsen the digits of pi are not a state machine, the next digit is not computed as a function of the prior one.
@@tommy_1446 Ah... Yes, you're right, my bad.
The fact that there's a finite number of states does not imply that there must be a cycle
The behavior is fascinating, but not that surprising if you look at it from the perspective of the network as a nonlinear dynamical system. However its state might be represented in some mathematical phase space, there may be attractors (settling on a single value) and strange attractors (complex repeating dynamical behavior). The graph nodes and connection rules are a different way to model the recursive equations that one might use to make a fractal, for example. Basically, a different way to view the same phenomenon.
This seems like a critical link between Game of Life and Neural Networks. Seems like there is a lot of potential for innovation here.
Ooo random stuff, I like these the most :)
ok mr random generator
@@chairwood rude
@@noamlima9402 didnt mean for it to be rude
@@chairwood so its mandatory to you make that ? Cant you feel better without making bad to yourself and others
@@noamlima9402 also I don't understand what "so its mandatory to you make that ? " means :( those words dont make sense to me in that order
Stephen Wolfram's book "A New Kind of Science" explores these types of systems in great depth. Interestingly, the types of behavior demonstrated here (simple terminating, repetitive/nested, seemingly random, and some combination of the latter two) show up in a LOT of different systems that evolve according to simple rules.
Not knowing much about it, I picked up a copy of his book from Goodwill a few years ago for $1.99. I flip it open and, surprise surprise, he had signed it. Went to look it up online to see if it might be worth something to someone, but the heft to price ratio is just sad - and it's easy to see why - cellular automatons and complexity, very interesting; a full grown man self-ingratiating and expounding established ideas as his own foundational revelations, not so much.
It seems like the network falls into these stable patterns. Sometimes these disturbances will knock it into a different pattern, or may not and will return to the original pattern. I think this is a great learning tool.
I feel bad for Dr. Turner, that scratch-off map will stay like that for a long time it seems :-P
uuuuu alive
It would be interesting to combine this with a survival mechanic. Networks which freeze into some state "die" and those that keep changing "survive" and "multiply", changing a couple random bits. Run a couple hundred generations and see what happens.
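That survival mechanic is easy to prototype. A hedged sketch, where the fitness rule, mutation scheme, and all parameters are my own choices rather than anything from the video:

```python
import random

def evolve_rbns(pop_size=20, n=12, k=3, generations=30, seed=0):
    """Toy survival mechanic: networks that freeze into a fixed point 'die';
    networks that keep cycling are copied with one truth-table bit flipped."""
    rng = random.Random(seed)

    def new_net():
        inputs = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
        tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
        return inputs, tables

    def cycle_len(net):
        """Cycle length from one random starting state."""
        inputs, tables = net
        s = tuple(rng.randint(0, 1) for _ in range(n))
        seen = {}
        while s not in seen:
            seen[s] = len(seen)
            s = tuple(tables[i][int("".join(str(s[j]) for j in inputs[i]), 2)]
                      for i in range(n))
        return len(seen) - seen[s]

    pop = [new_net() for _ in range(pop_size)]
    for _ in range(generations):
        survivors = [net for net in pop if cycle_len(net) > 1]  # frozen nets die
        if not survivors:
            survivors = [new_net()]  # restock if everything froze
        pop = []
        while len(pop) < pop_size:
            inputs, tables = rng.choice(survivors)
            child = [row[:] for row in tables]
            child[rng.randrange(n)][rng.randrange(2 ** k)] ^= 1  # mutate one bit
            pop.append((inputs, child))
    return pop
```

Calling `evolve_rbns()` returns the final population; a natural follow-up experiment is to track the distribution of cycle lengths across generations.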
I'm halfway through and I'm drunk already 🍺
Computerphile makes my day
At 6:10 there IS a repeating pattern you can see.. it takes about 52 steps to repeat.
that "self healing cyclic behaviour" (imma call it that, no idea if there's a real name for that) sounds like it could make for interesting generative music applications.
tbh it just seems like the randomly assigned truth table biases a certain feedback cycle, is it really that surprising that many systems recover?
Is there some kind of metric that describes the resistance to change? like measures the connections and starting configuration to determine the robustness?
Yes, it is called the Lyapunov exponent in dynamical systems theory.
6:06 You actually can see the repeating cycle here. It starts on column 6 and ends on column 58.
This is more like Wolfram code than Conway's Game of Life.
The set of bits representing the next states is equivalent to the Wolfram Code for elementary cellular automata.
I myself thought neural networks.
Yes, but Wolfram's automata are potentially infinite.
This is just game of life with extra steps.
2:40 The last time I had one tab in Notepad++ is when I installed it for the first time. I need to clean that now.
The hunting autofocus is making me feel a bit sick
Watching on a smaller screen or from further away can mitigate this. But yes, it's intrusive.
It's called a random boolean zoom effect.
Exactly!
...and the audio is terrible because of the cheap headset.
Please could you allow subtitles one day? :(
Does the "repeating pattern" output just shows that random parameters have created an obfuscated 20-bit counter?
On second thought, I guess it's more than just an obfuscated 20-bit counter. When you increase to 200 or 2000-bit counter, then the repeating patterns are an "interesting / useful" subset of the possible values.
Now, would that work for _any change_ of these nodes (including changing only some of them) to anything (well, 0 or 1)?
And: Do those nodes have a significance to the repeating sequence?
What I'm asking is essentially: Is the "restoration" process agnostic to its target, and is it stable enough to recover something meaningful about the whole "code" when damage such as this (destroying the information in the first 5 bits) occurs?
Correction: is the "restoration" process agnostic to _the condition of_ its target
This is fascinating! Now if only I had a reason to use this in my day job (working as a developer of web services).
It is interesting that a pattern tends to be resilient to changes in values. I'm curious how resilient it is to changes in the relationships and/or ruleset? I'm inclined to think virtually any change in either of these would be much more likely to permanently change the pattern?
Most interesting to me would be an example where, after the perturbation, the system settles into a new repeating cycle but one with a different period. Flipping bits is a special case of starting the same network with a different original state. Since the system has no memory, flipping a limited number of bits is just equivalent to restarting with a new fully random origin with the same connections and truth table. Flipping six bits is neither less nor more relevant than flipping five or seven. What's interesting is discovering, for a given network design, how many different steady states/cycles it supports across all possible inputs. You get an approximation of that when you flip five bits, but it is a very shallow exploration of the machine's properties.
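Counting those distinct cycles exhaustively is feasible at toy sizes. A sketch over all 2^10 states of one fixed network (the canonical-label trick of naming a cycle by its smallest state is mine):

```python
import random

def count_attractors(n=10, k=3, seed=7):
    """For one fixed small network, run every one of the 2^n initial states
    to its cycle and count how many distinct cycles (attractors) exist."""
    rng = random.Random(seed)
    inputs = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

    def step(s):  # states are n-bit integers; bit i holds node i
        nxt = 0
        for i in range(n):
            idx = 0
            for j in inputs[i]:
                idx = (idx << 1) | ((s >> j) & 1)
            nxt |= tables[i][idx] << i
        return nxt

    succ = [step(s) for s in range(2 ** n)]
    attractors = set()
    for s0 in range(2 ** n):
        seen, s = set(), s0
        while s not in seen:  # walk until we land on the cycle
            seen.add(s)
            s = succ[s]
        cyc, c = {s}, succ[s]  # collect the whole cycle
        while c != s:
            cyc.add(c)
            c = succ[c]
        attractors.add(min(cyc))  # label each cycle by its smallest state
    return len(attractors)

print(count_attractors())
```

Flipping five bits only samples a handful of basins; this enumerates every basin of the network.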
I've experimented with random turing machines on a cyclic space and used inverse compressibility as a metric to find interesting state transition graphs.
What is "inverse compressibility"?
This is either the funniest comment of all time, or I am dumber than I think. 🤔
@@GreenBoiler it could be both? I was intending to be quite serious though.
@@RussellTeapot if it doesn't compress well I assume it's interesting to prioritize my manual effort.
I wonder whether the truth table or the connections map is more influential, and to what degree.
Fascinating. Thanks!
What happens if you add a sensor node? Like, one that changes with time and one that changes every time you press a key?
Another cool question is whether there are periods that are impossible for a given number of bits. I'd bet a period of 2^n - 1 is impossible, but I don't know how to prove it, or how to settle it for other periods.
The "perturbation leads to the same structure" sounds like a lemma of fixed point theory.
Most of the system-wide cyclical states are "factorable", consisting of multiple subnetworks, each having their own period, with the total system period being the least common multiple of all the periods.
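That factoring can be checked numerically: on a cycle, each node's own bit sequence has some minimal period, and the global period is the least common multiple of those per-node periods. A sketch (network details are illustrative assumptions):

```python
import math
import random

rng = random.Random(1)
n, k = 10, 2
wires = [tuple(rng.randrange(n) for _ in range(k)) for _ in range(n)]
tables = [tuple(rng.randrange(2) for _ in range(2 ** k)) for _ in range(n)]

def step(state):
    return tuple(tables[i][sum(state[w] << b for b, w in enumerate(wires[i]))]
                 for i in range(n))

# Run until the global state repeats, so we have one full cycle in order.
state = tuple(rng.randrange(2) for _ in range(n))
seen = {}
while state not in seen:
    seen[state] = len(seen)
    state = step(state)
cycle = [s for s, t in seen.items() if t >= seen[state]]
global_period = len(cycle)

def smallest_period(seq):
    """Smallest divisor p of len(seq) such that seq repeats with period p."""
    for p in range(1, len(seq) + 1):
        if len(seq) % p == 0 and all(seq[i] == seq[i % p] for i in range(len(seq))):
            return p

node_periods = [smallest_period([s[i] for s in cycle]) for i in range(n)]
print(global_period, node_periods, math.lcm(*node_periods))
```

The global state repeats exactly when every node's column repeats, so the LCM of the per-node periods always equals the global period.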
I noticed a pattern immediately in the initial setup of the random truth table.
It lines up with this rule: if the middle column has a 1 on either side of it, return 1; if the middle column has 0 on both sides, return the middle column XOR 1.
That's just an example. Both the truth tables for each node and the starting value of each node are randomized on each run.
Nice video and all, but what's up with the map of the world behind him? It seems off..
What is going on with middle America, the Mediterranean and Indonesia in that map??
I was asking myself the same
I think it might be a map of the last ice age.
It's more than that. Look at the isthmus of Panama, it's way too thick.
Enjoy this video. Very cool.
Why was the repeating pattern unexpected? There is a finite number of possible states the array of booleans can have, and if the rules stay the same it must one day return to a state it was in before (the maximum loop length is bounded by 2^n, no?)
edit: even if there is a "mutation", if it hits a previous state it had before the "mutation" it will go back, so the robustness is finally just the case where two starting states lead to the same output?
It's surprising that it recovers so quickly, but those could be cherry-picked examples.
I was thinking of the 20 bits as a 20-bit integer. Because the next state is always a direct function of the current state, the first time a number repeats we can read off the period.
The boolean functions could form a pseudo-random number generator, but that requires a long period and a few other statistical properties of the numbers generated.
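That "first repeat reveals the period" idea works for any deterministic map on a finite state space. A sketch, using a linear congruential step as an illustrative stand-in for a Boolean network (scaled to 16 bits so it runs instantly; the same pigeonhole argument caps a 20-bit state at a period of 2^20):

```python
def find_period(step, start):
    """Iterate a deterministic map until a state repeats.
    Returns (transient_length, period)."""
    seen = {}
    x = start
    while x not in seen:
        seen[x] = len(seen)
        x = step(x)
    return seen[x], len(seen) - seen[x]

# a=5, c=1, m=2^16 satisfies the Hull-Dobell conditions (c odd,
# a-1 divisible by 4), so this LCG has the full period 2^16 and
# no transient: every state lies on one big cycle.
transient, period = find_period(lambda x: (5 * x + 1) % (1 << 16), 0)
print(transient, period)   # 0 65536
```

A Boolean network's step generally isn't a bijection, so unlike the LCG it typically has a nonzero transient before it enters its cycle.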
GPS PRN codes are generated this way.
what is that?
Instructions unclear. Are the 3 nodes always the same or do they change with each tick? Does the table stay the same or change with each tick?
Stays the same, to both questions.
Does anyone know what the map behind him is?
Wouldn't it be interesting to set each state a different kind of sound and listen throughout the process?
This reminds me of FPGA programming: each logic element (node) is a LUT with inputs from N neighbours, feeding into a flip-flop with a global clock. You could write a program to generate Verilog to run random Boolean networks with tens of thousands of elements at hundreds of MHz. Although I don't think you can have truly random routing in an FPGA; there are limits on which logic elements can connect to each other.
Doesn't randomness in the output depend on the pseudorandomness of the algorithm used?
In other words: since, under the hood, any randomness not fed actual random data (user mouse movement, electric-grid noise, lava-lamp pictures, etc.) is based on some mathematical function, doesn't that actually show up in the end result?
None of your examples of random data is random, either.
This is the 666th video on this channel. Let that sink in.
This just seems like the Game of Life with extra steps
Why not just have only two inputs for each node and use basic logic gates for the truth tables?
Interested folks would do well to scan the description of Stuart Kauffman's books "The Origins of Order" and "At Home in the Universe" -- any interest at all will be kindled into a roaring flame.
Thanks for this video
This is very similar to the Ising model used in statistical mechanics. Might be able to use statistical mechanics to describe the behavior of the system.
Does math not predict this behaviour? There are 2^20 different states the network can be in,
so within 2^20 + 1 iterations you must hit a state you have seen before, and from that point back to where you first saw that state you get a repeating pattern?
The fact that the pattern repeats is not surprising. The fact that even complicated repeating cycles may be robust (ie. stabilize again to the same cycle, when change is introduced) is the interesting bit.
@@KohuGaly oh, thank you for clarifying that
So, what's the point? It's finite, so it always falls into cycles. Still, it's stimulating, thank you.
The point is not the cycles, it's the recovery after a "mutation" of a "gene", implying biology could be much simpler than we think.
Got it, thanks. They are a sort of discrete local minima.
Could the reason for the repeating patterns be that the computer is not actually truly random and is actually spitting out numbers in some pattern? Maybe the same experiment with a lava lamp or something would be truly random..
So the network at t + 1 is a function applied node-wise to the inputs that the adjacency matrix selects from the network at t — loosely S_{t+1} = F(A, S_t). (A literal matrix product A * S_t would only keep the sum of each node's inputs, which isn't enough information for a general truth table.)
What is that map?
So if I go back in time and kill a butterfly... It likely won't have that big of an effect on the timeline... cool
Depends on how you go about killing that butterfly.
I was looking for something like this, going to write a paper about it
6:13 ofc you can see a pattern.
Mandelbrot sets ?
Now I'm curious about simulating bonds between elemental materials...
This seems related to Wolfram Physics and his work on hypergraphs and branchial space anyone want to comment on that
Is the truth table generated on each step? It seems that it isn't.
Alright mate, how was your weekend?
It was great, I played with random boolean values to see what happened 😂😂😂😂😂😂😂😂
I think this guy's been locked down for too long, he's looking for patterns in random 1's and 0's.
What is this random network good for? What could/should its real-life application be?!
you must be new here
Shout out to hulls best lecturer
Of course there is a possibility that repeating patterns could emerge, that's exactly how ring oscillators work. So Random Boolean Networks just show that the technology we've been using for a long time exists, doesn't have any practical use and doesn't prove anything.
This is a finite state machine. It's not only a possibility that repeating patterns could emerge: it's an inevitability.
@@vylbird8014 True, I guess it is a finite state machine. What I meant by repeating patterns is a sequence of more than one state that repeats, rather than every cycle being the same; otherwise it's obvious that it would repeat.
Look to control theory for good explanations on system (in)stability.
Try visualising this in processing.org. It'll make for prettier pictures.
Homeostasis, interesting 🤔
This seems a lot like an LFSR.
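The LFSR comparison is apt: an LFSR is exactly a Boolean network whose "truth tables" happen to be XORs of fixed tap positions. A sketch of the standard 16-bit Fibonacci LFSR with taps at 16, 14, 13, 11, a well-known maximal-length tap set:

```python
def lfsr16_step(state):
    """One tick of a 16-bit Fibonacci LFSR with taps 16, 14, 13, 11.
    Each output bit is an XOR of a few fixed input bits: a Boolean
    network with hand-picked, non-random truth tables."""
    bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return (state >> 1) | (bit << 15)

# A maximal-length LFSR visits every nonzero state before repeating.
state, period = lfsr16_step(1), 1
while state != 1:
    state = lfsr16_step(state)
    period += 1
print(period)   # 65535
```

A random truth table almost never lands on such a maximal cycle, which is why the video's networks settle into much shorter periods.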
Boolean every time he says drink
So a computer counts 0,1,0,1 as an array to process its value.
Could you use this for encryption?
It seems like if you start with enough nodes and enough connections it would be very hard to calculate backwards without knowing the start. The starting parameters would be some sort of key in this case.
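One caveat for the encryption idea: a random Boolean step is generally not invertible, so "calculating backwards" is often ill-posed rather than merely hard — many states have several predecessors and many have none ("Garden of Eden" states). A small sketch counting preimages exhaustively (the network and its parameters are illustrative assumptions):

```python
import random
from itertools import product
from collections import Counter

# Tiny illustrative network so preimages can be counted exhaustively.
n, k = 8, 3
rng = random.Random(3)
wires = [tuple(rng.randrange(n) for _ in range(k)) for _ in range(n)]
tables = [tuple(rng.randrange(2) for _ in range(2 ** k)) for _ in range(n)]

def step(state):
    return tuple(tables[i][sum(state[w] << b for b, w in enumerate(wires[i]))]
                 for i in range(n))

# How many predecessors does each reachable state have?
preimages = Counter(step(s) for s in product((0, 1), repeat=n))
unreachable = 2 ** n - len(preimages)
print(f"{unreachable} states with no predecessor; "
      f"most predecessors for one state: {max(preimages.values())}")
```

Because the map loses information, it can't serve directly as a block cipher; at best the wiring and tables could key something PRNG-like, subject to the period concerns raised elsewhere in the thread.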
JFC, I was researching the exact same thing as you.
Chaos theory
anti-chaos, actually. A chaotic system is defined by simple rules but is extremely sensitive to initial conditions so can't be predicted. These systems are also defined by simple rules but are not very sensitive to initial conditions so they give predictable outputs.
@@jecelassumpcaojr890
How are they not very sensitive to initial conditions? Look at the total difference in results, including ones that are kinda stable in the end.
Small differences in the initial conditions of the network can suddenly result in a totally unstable network. Or one that repeats. Or a random one.
The fact that some of them exhibit robustness doesn't refute that either. IMHO it makes it even more interesting and fits with chaos theory as well.
@@Ddeletham I should have said that binary random networks include some that are not sensitive to initial conditions and so are anti-chaotic. Not all of them have the same behavior, just as not all cellular automata have the same nature.
So, depending on the network, some are chaotic and some are not.
Makes sense, after all not even the Lorenz System is chaotic for all parameter values.
sounds like a game of life with random rules
It has CHAOS
It's a dynamic system
Is that an ice age map behind Alex?
Subtitles?
For a moment, I thought this was video for chaos theory...
This is making something complex where complexity is not needed to find that patterns would in fact emerge... 1D cellular automata do these exact patterns all the time; the only difference in this complex system is the choice of neighbours. If you randomly pick neighbours and don't change them afterward, then all you've done is rearrange the 1D lattice, which you can then just sort to get back to a proper 1D CA.
This isn't special, sorry.
And you use that for ... ?
is there a git for the program generating the random sequence?
Haha, who'd expect a repeating pattern in a stochastic network!
This is a generalisation of GoL. The difference is that GoL needs a specific network architecture (i.e. a lattice) and a particular set of rules which in this case are generated randomly
Am I the only one who hears "booty network" every time?
You are not.
Like Wolfram's book: A New Kind of Science
It's Ethan Klein's lost European little brother!