Excellent video. Do you have any suggestions for an introductory textbook (or book simpliciter) on bayesian epistemology, especially in regard to philosophy (e.g. philosophy of religion, philosophy of science, philosophy of mind, etc)?
A shot in the dark here, but I am almost sure that the source material for this is Peter Godfrey-Smith's book Theory and Reality.
You were hungry in this one! (kept hearing belly grumbles, lol), great video, thanks!
hungry for knowledge
Excellent. Looking forward to part 2.
Thank you for the new video!
Such a clear explanation. Thank you a lot for this!
Excellent explanation. Thank you!
Fantastic video!
I just want to add a minor comment related to the subjective Bayesianism criticism. If I believe God is a deceiver and the earth is actually younger than it seems, I don't agree that all evidence "confirms" my hypothesis. You have to remember that a Bayesian compares how surprising the evidence would be under both hypotheses before updating the priors.
How likely is it that my new scientific discovery exists in the world where I've been lied to by God, compared to the world where the earth is in fact older? I'm not certain that this would be in favour of the hypothesis of the deceiving God. At best, the likelihoods would be equal, which means it neither confirms nor disconfirms.
And that would only be in the case where God is absolutely omnipotent, and absolutely intent on deceiving us with no exception. But that should also rationally decrease the prior probability of the hypothesis, because there's the extra unknown of why God would want to deceive us so badly.
But even more importantly, this issue would not be solved by the "objective" approach, because what you describe is an issue with updating on evidence, not with having wrong priors.
I am more of a subjectivist in that regard; I believe priors should be subjective. However, in the case of God wanting to deceive us regarding the age of the earth, it seems clear that it would be irrational to hold this prior without sufficient justification. Without going into detail about why a person would assign such high confidence to this prior, I would imagine it's a consequence of faulty Bayesian induction. At the same time, I recognize it's entirely possible that they grew up in a world where everything they've experienced could make it likely (say, they grew up in a world where people in authority constantly and abusively deceive them), in which case they would be entirely rational to hold a view like that.
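The likelihood comparison this comment describes can be sketched numerically. All the probability values below are purely illustrative assumptions, not anything from the video:

```python
# Illustrative Bayesian update comparing two hypotheses.
# All numbers are made-up assumptions for the sketch.

def posterior(prior_h, likelihood_h, likelihood_not_h):
    """Posterior P(H|E) via Bayes' theorem over a binary partition {H, not-H}."""
    prior_not_h = 1.0 - prior_h
    evidence = likelihood_h * prior_h + likelihood_not_h * prior_not_h
    return likelihood_h * prior_h / evidence

# H = "a deceiving God made the earth look old"; E = a new dating result.
prior = 0.5            # even granting a generous prior for H
p_e_given_h = 0.5      # a perfect deceiver makes E unsurprising...
p_e_given_not_h = 0.5  # ...but E is equally expected if the earth really is old
print(posterior(prior, p_e_given_h, p_e_given_not_h))  # 0.5: E neither confirms nor disconfirms
```

When the likelihoods are equal, the posterior equals the prior, which is exactly the "doesn't confirm nor disconfirm" point above.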
Great video, but you are wrong about the trickster god. Such a hypothesis would have to assign positive probabilities to events that are impossible according to quantum mechanics, so its likelihood for observed events has to be smaller than the one given by quantum mechanics, and given the astronomical number of observations we make that confirm quantum mechanics, the trickster god is quickly disproven.
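The "astronomical number of observations" point can be illustrated with a toy calculation; the per-observation likelihood ratio here is an invented assumption, chosen only to show how quickly the odds collapse:

```python
# Toy illustration: if the trickster-god hypothesis assigns even slightly
# lower likelihood to each observation than quantum mechanics does, the
# posterior odds collapse fast. Numbers are illustrative assumptions.
import math

log_odds = math.log(1.0)      # start at even prior odds (trickster : QM)
per_obs_ratio = 0.99 / 1.00   # assumed trickster/QM likelihood ratio per observation

n_observations = 10_000
log_odds += n_observations * math.log(per_obs_ratio)

posterior_odds = math.exp(log_odds)
print(posterior_odds)  # on the order of 1e-44: effectively ruled out
```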
Great video.
I don't think "we will never have a certainty of convergence if this is true" is a real flaw with this theory, since we clearly don't have a certainty of convergence in reality (given how many people persist in believing silly things despite the evidence), no matter what philosophy we pick. That a philosophy doesn't make the world perfect isn't a reason to reject it, as the best possible philosophy might just maximize our chance of being right without giving us a certainty of being right.
*William James intensifies*
Dang, you followed Titelbaum so closely you even kept the two-book format ;) just kidding, love your work dog
Correct, I did indeed closely follow a book that was not published until two years after I made the video. It's nice to see that my time travelling achievements are finally getting recognition! 😉
@@KaneB omg, so the only conclusion I can now reach is that in fact it was Titelbaum doing the following!
Thank you.
Look who it is hahaha, thanks for the suggestion!
Great channel!
Excellent.
26:15
Well done! Note that belief in God does not necessarily entail the literal interpretation of Genesis. And traditional belief in God does include belief that God cannot deceive as this would contradict God's nature.
In a respectful manner, I think it's fair to say our current theory of everything is rationalized by a Bayesian mind. An inside-out deal.
If you're a dualist or a oneness believer, all naturalism, all physicalism, then Bayesianism may accurately rationalize how you think.
Easy to be replicated and replaced. Unless you're subjective and objective; if one of those mediates on your behalf, it might as well be one and the same.
Ontologically speaking, I'm having a hard time distinguishing computation from about 75% of the world's beliefs.
Unless you're exercising a triality-of-self epistemology that believes in critical extreme states of correlating entanglement, like a soul, both within oneself, Hilbert space theory / a spiritual dimension, with a three-way transfer of indefinable intuitive data.
I just don't see how anyone else escapes the Chaldean-minded Plato's cave analogy, which is very dualistic in my classical American founding principles and faithful understanding.
The founders used Newton's equations, the knowledge of good and evil, a cause-and-effect hierarchy, to escape in a philosophical way.
Temporary analytical doubt for about 100 years.
Then quantum physics, or Max Planck, said yep, the classical American founders were right, and theology was also.
I'd say our classical faithful take it even farther, with air-tight seals.
Thanx a lot!
Bayesianism seems really important and useful, yet I find it boring as all heck...
Frequentism is 10x more boring, especially statistical tests...
The probability of me hearing somebody pronounce “H” like the presenter is not near zero.
Garbage in, garbage out. There's no reason why a Bayesian should be concerned over whether our probability values conform to the laws of probability, considering that our probability values are meaningless. If our prior probabilities can be whatever random value we feel like choosing, then that effectively means that our posterior probabilities are just as random and just as subject to the control of our whims, and so the entire process is pointless.
The only apparent point to using Bayes' theorem this way would be to lend an appearance of respectability to our arbitrarily held beliefs by running them through a mathematical formula, and thereby suggesting that we've given our beliefs serious consideration. Unfortunately, there's nothing serious about applying a mathematical formula when some of the numbers are plucked from thin air.
Objective Bayesians are really no better than subjective Bayesians, since objective Bayesians are just demanding that people choose particular meaningless probability values rather than letting them pick their own meaningless probability values. Unless our numbers come from some meaningful source, there's no point in using them in calculations.
You could use Occam's Razor to select the priors. And before you say it's unprincipled, it is actually in fact a theorem that least complex hypotheses are most probable. Check out Solomonoff Induction, a formalisation of Occam's Razor.
@@iVideoCommenter "It is actually in fact a theorem that least complex hypotheses are most probable."
Solomonoff Induction _assumes_ that the least complex hypotheses are most probable. It gives a formal technique for evaluating complexity and thereby assigning probability. We can't use that to prove that the least complex hypotheses are actually the most probable. We have to decide if we want the least complex hypotheses to be the most probable, and if we do then we might want to use Solomonoff Induction, and so it's just another way of arbitrarily assigning probability based on the whim of the person doing the assigning.
Unfortunately such evaluations must always be arbitrary because the complexity of a hypothesis and the truth of a hypothesis are unconnected. Sometimes very complex things are true, and sometimes very simple things are true, and there is no apparent process where the universe checks the complexity before determining which will be true and which will be false. The closest we can come to a connection is in the minds of people who prefer simple things over complicated things, and therefore we wish that simple things were more likely to be true.
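For reference, the "formal technique for evaluating complexity and thereby assigning probability" being debated here can be sketched as a simplicity-weighted prior. The hypotheses and their description lengths below are hypothetical stand-ins, not real program encodings:

```python
# Sketch of a Solomonoff-style simplicity prior: weight each hypothesis
# by 2^(-description length), then normalise. The hypotheses and the
# length values are hypothetical placeholders for illustration.

hypotheses = {
    "earth is old": 10,           # shorter description -> higher prior weight
    "deceiver god faked age": 25,
}

weights = {h: 2.0 ** -k for h, k in hypotheses.items()}
total = sum(weights.values())
priors = {h: w / total for h, w in weights.items()}

for h, p in priors.items():
    print(f"{h}: {p:.6f}")
```

Note that, as the reply above argues, this only formalises the simplicity preference; it does not prove the preference correct.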
Trash in, treasure out. So claims the 'washing out of the priors'.
Here is a copy and paste I found in a forum comment section roughly explaining this.
*Bayes' theorem washing out of the priors.*
"* the final probabilities are in the same 'direction' regardless of one's prior probabilities (i.e., the evidence either increases or decreases the probability of some hypothesis)
* the 'magnitude' of both final probabilities is either above or below 0.5."
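The "washing out" claim can be checked with a simple simulation: agents with very different priors, updating on the same evidence stream, end up close together. The likelihood values are illustrative assumptions:

```python
# Simulation of priors "washing out": three agents with very different
# priors update on the same evidence stream and converge. The likelihoods
# (0.8 under H, 0.4 under not-H) are illustrative assumptions.

def update(prior, likelihood_h, likelihood_not_h):
    evidence = likelihood_h * prior + likelihood_not_h * (1.0 - prior)
    return likelihood_h * prior / evidence

priors = [0.01, 0.5, 0.99]
# 20 pieces of evidence, each twice as likely if H is true.
for _ in range(20):
    priors = [update(p, 0.8, 0.4) for p in priors]

print([round(p, 4) for p in priors])  # all three posteriors are now close to 1
```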
@@Ansatz66 Oh no, my two sentences were unconnected. A theorem that complex propositions are less likely (ceteris paribus) can be shown easily as follows: P(A&B) ≤ P(A), since P(A&B) = P(A)P(B|A) and P(B|A) ≤ 1.
@@iVideoCommenter "P(A&B) ≤ P(A)"