FINALLY someone makes sense of this crap. Thank you.
what program are you in?
love how clear you explain these topics, plus you have such a calming voice!
Thank you so much for your series on the Theory of Computation. You have a very clean, concise, and informative style.
NFAs would be integral to developing quantum computing languages. The principle explained at 12:30 and the ensuing discussion map directly onto such topics, including the remark about how we have some advantages over this simpler NFA model, perhaps by means of a "magical oracle", like the one in our head or soul, or else in a quantum computing array.
Thank you. Your definitions were clear and your example at 10:25 was just what I needed to see.
Thank you for this series. You have put so much effort into this and such a great outcome.
You're a computation wizard, Harry Porter
20:40 accepted by the NFA, not the DFA
Thank you sir, Like your explanations.
Thank you for this series!
thank you!!!!!!
I'm slightly confused. Shouldn't keystrokes and mouse pointer positions be considered deterministic as well? They are the input to the FSM and, at least in theory, can be enumerated.
When I was doing this kind of stuff, questions like these came up too. What I used to do was refer to an element of computer memory. Can we model it as a glass of water? Yes, but it is a leaky glass in a shallow pool. In an ideal state the glass is either full or empty. This never happens; we never have a perfectly full or perfectly empty glass, exactly because it leaks into the pool when, say, more than half full, and leaks in from the pool when more than half empty. What is a practical solution? One might say that we can draw a mark at half the glass height and consider the glass full when the water is above this level and empty when below it. Imagine, then, that we have to look at some 50 glasses in a short sequence from afar. We won't be able to tell whether a glass is more than half full or half empty when the water level is very close to the mark. Our solution is to make two marks and call the area between them an error. Our leaky glass of water/memory bit shows an error at some point. A lot of the work inside a computer is internal error correction. More importantly, what I presented is the essence of every digitisation/quantisation.
Determinism is a privilege of God, and humans are capable of taking deterministic actions only to a crude approximation. We cannot calibrate our actions exactly, and we cannot foresee the consequences of our actions infinitely far into the future: the unintended consequences. You don't have to believe in God; you just have to agree that it is a useful concept.
Last but not least, when I was taught physics, every course started with some propaedeutics of measurement theory. Physics without measurement theory is simply invalid as a tool to model and shape reality.
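The two-mark idea above can be sketched in a few lines. This is a minimal illustration, not anything from the lecture, and the threshold values are made up: an analog level between 0.0 (empty) and 1.0 (full) is read as a bit, and the band between the two marks is reported as an error.

```python
# Illustrative thresholds (assumptions, not from the video):
LOW_MARK = 0.4   # at or below this, read the bit as 0 (empty)
HIGH_MARK = 0.6  # at or above this, read the bit as 1 (full)

def read_bit(level: float) -> str:
    """Quantise an analog level into '0', '1', or 'error'."""
    if level <= LOW_MARK:
        return "0"
    if level >= HIGH_MARK:
        return "1"
    return "error"  # the area between the two marks

for level in (0.05, 0.45, 0.55, 0.95):
    print(level, "->", read_bit(level))
# 0.05 -> 0, 0.45 -> error, 0.55 -> error, 0.95 -> 1
```

Real hardware comparators often add hysteresis on top of this so a level hovering near a mark doesn't flicker between readings, which is essentially the error-correction work mentioned above.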
I think technically everything is deterministic. It's just that when you have an incomplete model, things that weren't included in your model appear to have "random" behavior.
Example: a coin flip may appear non-deterministic; however, if you model the wind velocity, gravity, coin shape, etc., then it becomes deterministic.
If a program does not reach the accepting state, does that mean the program will reject the input? Hope you can respond. Thanks
Yes, and that means the DFA doesn't recognize the language consisting of that string
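A tiny sketch of that accept/reject rule (the state names and the example language are mine, not from the lecture): a DFA accepts exactly when the run ends in an accepting state, so ending anywhere else is a rejection. Here the language is binary strings ending in 1.

```python
def run_dfa(s: str) -> bool:
    """Simulate a 2-state DFA over {0,1} accepting strings that end in '1'."""
    # Transition function delta: (state, symbol) -> state
    delta = {("q0", "0"): "q0", ("q0", "1"): "q1",
             ("q1", "0"): "q0", ("q1", "1"): "q1"}
    state = "q0"              # start state
    for ch in s:
        state = delta[(state, ch)]
    return state == "q1"      # q1 is the only accepting state

print(run_dfa("0101"))  # True: the run ends in the accepting state q1
print(run_dfa("0110"))  # False: the run ends in q0, so the input is rejected
```

Note the empty string is also rejected here, since the run never leaves the non-accepting start state.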
great work sir...so selfless of u :)
gud