Thank god for you and thank YOU!
Truly appreciating these videos on logic! Some of these concepts never stuck with me when I was first introduced to them. These videos seem to help a ton with that :)
Thanks - glad they helped!
Nice and simple breakdown. Looking forward to some deeper content on the topics! 👍
Thanks! More on this coming over the next few weeks.
Any questions? Leave me a comment below!
Well done... an example of simple explanations for some bizarre stuff.
Thanks!
Would you say that sound and complete means syntax is equivalent to semantics? That a formal system can indeed capture meaning? Some would argue that formal systems are too "small". But if a formal system is dialetheic, it can be complete and also sound.
Yes, if you've got soundness & completeness, then the proof system & the semantics give the same results. But sometimes you still get more information out of one or the other system. For example, if you *can't* prove something, your proof system may or may not explain why. Often, the most helpful explanation then is semantic, in terms of a counter-model showing how the premises can all be true but the conclusion false.
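To give a quick concrete example (mine, not from the video): the argument from p → q and q to p isn't valid. A counter-model: let p be false and q true. Then both premises, p → q and q, come out true while the conclusion p is false, so p → q, q ⊭ p. Searching for a proof and failing doesn't tell you that on its own; the counter-model does.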
How could a dialetheic system be sound?
thank you so much!
You’re welcome!
Great video! If we were to suppose something like a Fitch system to be unsound, is it possible that it could ever be complete? I'm just wondering if we can have something be complete but unsound? Thanks!
Sure! In fact, unsoundness makes completeness easier. In the extreme case, take the proof rule that allows you to derive any sentence. It’s as unsound as it gets but complete, since it proves any theorem.
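Spelled out (my gloss): suppose the rules let you derive any sentence φ from any premises Γ, so Γ ⊢ φ always holds. Completeness says: if Γ ⊨ φ then Γ ⊢ φ, and that holds trivially. Soundness says: if Γ ⊢ φ then Γ ⊨ φ, and that fails badly, e.g. p ⊢ q even though p ⊭ q (just take a model where p is true and q is false).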
I don't quite understand the distinction between a logic and a proof system. Is it that a logic contains both syntactic and semantic notions, while proof systems contain only syntactic ones? But if logic is also about semantics, wouldn't that make it not ontologically neutral (as I thought it was supposed to be), since it would be about truth values and variables?
Second-order logic is complete, but you have to enlarge the space of semantics. Sets, functions and relations aren't a large enough semantics for second-order logic. Henkin proved that so-called generalized models can provide a complete semantics for second-order logic. Then there are semantics that category theory offers that allow one to prove completeness of second-order logic and variants of it.
It very much depends on what we mean by "second order logic". Henkin models (aka general models, which are just set-theoretic constructions - think of them as models which restrict which relations and functions exist) give a notion of validity for which the usual second-order axioms are complete. But there's a real question whether that counts as genuine second order logic. The logic of Henkin models can be translated into many-sorted FIRST-order logic, and so in that sense, doesn't genuinely go beyond first-order notions. FULL second-order logic, which goes genuinely beyond first-order logic, can't be interpreted with Henkin models, and lacks a completeness theorem (as well as the Downward Löwenheim-Skolem Theorem).
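For anyone curious how that translation goes, here's a rough sketch (compressed, from memory): add a second sort of objects for relations, plus an application predicate App(X, x) saying that x falls under X; read the second-order quantifier ∀X φ(X) as a first-order quantifier over the new sort; and add comprehension axioms saying which relations exist. Full second-order semantics instead demands that the relation sort covers every subset of the domain, and that demand is exactly what no effective proof system can capture, which is why full second-order logic has no completeness theorem.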
@@AtticPhilosophy Second-order logic also lacks an upwards Löwenheim-Skolem theorem; this is evident from the existence of categorical theories in second-order logic (all models of T are isomorphic, meaning T only has models of a single cardinality). There are some vestiges of Löwenheim-Skolem properties for second-order logic (check out Hanf numbers and Löwenheim numbers).
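A standard illustration of that point: second-order Peano arithmetic with full semantics is categorical (every model is isomorphic to the standard natural numbers), so it has infinite models of only one cardinality, which is exactly the situation an upward Löwenheim-Skolem theorem would rule out.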
We can prove the soundness of a given logic with metalogic. But how can we be sure that the metalogic itself is sound?
That’s the problem of the justification of deduction, discussed by Dummett and others. It’s a philosophical, rather than logical, problem. Metalogic uses the generally agreed proof techniques of mathematics (induction, conditional proof and so on).
Soundness is more important than consistency
I have a problem with this. I don't see that semantic entailment is even a meaningful concept. OK, for simple systems you can create something like a truth table, but the syntax of these systems will always be sound and complete, ie their 'semantics' will be exactly the same as their 'syntax'; they are just different ways of proving the same thing. But systems that are too complex for truth tables just don't have a 'semantics'; they are entirely represented by their syntactic form. So semantic notions are either redundant or undefined, and so basically useless.
Once you bring in the idea of a sentence being true/false, or verified/unverified or whatever, you’re in the realm of semantics. Most logicians/mathematicians are suspicious of formal systems without any semantics, largely because we want to know the system is consistent (ie doesn’t prove any contradiction or absurdity). Nearly all formal systems in use - including programming languages - have several semantics available, and finding new kinds of semantics is still a big task for working logicians. Semantics definitely isn’t limited to simple systems with truth-tables.
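To spell out one reason semantics earns its keep here (my gloss, not from the video): if you can exhibit a single model in which all the axioms are true, and your rules are sound, then the system can't prove a contradiction, because a contradiction is false in every model. That's information the bare syntax doesn't hand you on its own.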
Thanks for your reply. I see the semantics as contained in the syntax. Take, for example, the expression (x ≠ x): this must mean False in any useful logic. It does not require an interpretation, as its semantics is contained in its syntax. What am I missing here?
@@tomrobingray Strictly speaking, syntax is just the form of the sentence: x ≠ x is a well-formed sentence, the negation of an atomic sentence, etc. Syntax doesn't say what a sentence means. Add a proof system, and we can say that x ≠ x proves a contradiction, is the negation of a theorem, etc. But as soon as you want to say that it's true or false, you're in the realm of semantics. Some logicians are happy to stick to the level of proof systems, arguing that we can justify proof rules without going into truth and falsity, but that's a minority view. For most logicians, we want to be able to establish that the proof rules we use are sound (ie that they preserve truth).
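To make the semantic side concrete (my gloss, assuming the standard semantics for identity): an interpretation assigns '=' the identity relation on the domain, so whatever value x takes, x = x comes out true and x ≠ x comes out false. That 'false in every interpretation' is a fact about models; the syntax alone only tells you it's the negation of an atomic identity sentence.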
Is there a name for the school relying on proof systems, and any notable proponents?
@@tomrobingray There are several: *inferentialism* is the general theory of meaning that says the meaning of a concept or proposition is given by what we can infer from it. In logic, *proof-theoretic semantics* is the idea that the meaning of a proposition is given by the collection of proofs we can give for it (or the ways it can be derived from given premises). A specific historical example of this is the BHK semantics for intuitionistic logic. More contemporary approaches use type theory as a kind of proof-theoretic semantics. Historically, all these approaches draw to some extent on Wittgenstein's idea of meaning as use. (So, in the logical case, *use* is understood in terms of inferential deductive relations.)
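For a taste of the BHK idea: a proof of A ∧ B is a pair consisting of a proof of A and a proof of B; a proof of A → B is a method for converting any proof of A into a proof of B; a proof of ∃x A(x) is a witness t together with a proof of A(t); and nothing counts as a proof of ⊥. Meaning is explained in terms of what counts as a proof, rather than in terms of truth conditions.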