Conversation with a child language acquisition lecturer, paraphrased: "young children are bombarded with language 24/7 and are sensitively attuned to it because their attention is always focused on their parents" "so why is UG premised on the paucity of the stimulus?" "my guess is that Chomsky never spent much time raising children". 😵🔥
@@johnnye87 Of course there is poverty of input. Universal Grammar (UG) approach claims that there is a universal set of principles and parameters that control the shape of human language. But the entire corpus of studied language is so small that it is impossible to derive what this universal set of principles and parameters actually is from using the entirety of human language as input. If the entirety of human language does not contain enough input to enable the principles and parameters of UG to be extracted, then how could a child receive enough input to work them out, unless they are already built in?
@@stevencarr4002 My issue isn't so much with the hypothesis that the evolved human language faculty prefers certain patterns over others, so that you can get a "complete" language without having to expose the child to enough data to unambiguously derive every rule. That seems pretty obvious and unremarkable. My issue is that they tend to treat the formulae they come up with as though they represent actual mental processes, when they bear no resemblance at all to how any other part of the human mind works. We don't operate on strict mathematical rules. A neural network like the human brain runs on weights and fuzzy logic, and the languagey bits of our brain aren't isolated from the non-languagey bits. Chomskyans produce formalised descriptions of a phenomenon and then mistake the map they drew for the territory.
@@stevencarr4002 The argument question-beggingly assumes there are principles of UG to be worked out. The usual anti-UG line is that language evolved to be learnable by human brains, not the other way around. Cognitive biases were around long before language, so any language that arose would of course be shaped by those biases. We don't work out the laws of physics to learn to walk either, so is the correct formalization of physics innate also?
I read the Chomsky book that was mentioned in this video in an upper division linguistics class. The first time I read it was really confusing, as is the case with many academic writings, especially ones from philosophers. I'm glad people like you are here to read through it fully and share your understanding with us because it provides more discussion on the topics these academics talk about. To me, the more we discuss these ideas, the more minds we can get working on them and the faster it will be to realize a solution. I think it's really beautiful that you make these kinds of videos and share this on the internet. Thank you for the great work!
This channel feels so relevant to any grammar thing I'm currently thinking about. I feel like my mind is being read or something. I was thinking about this yesterday. Great video
So great to see this video. I had a similar experience with Chomsky in linguistics and cognitive science (misunderstanding and dismissing the straw man definitions of UG or LAD, working through the topic myself, realizing what he was actually positing, finding his work invaluable and awesome), and I’ve been butting heads with social scientists (my native field is anthropology) about it for 15 years. Love your channel!
Great stuff! Pauses are immensely important for your viewers, especially when the content has any level of complexity. More often than not I feel the need to speed up a video, but you're the first person I've ever felt I needed to slow down or have to rewatch. Would love to hear your reasoning against the naysayers, perhaps another video idea or so.
@@languagejones6784 Address Everett's claim that recursion is not the base on which language is constructed, including his claim that Piraha has no past tense
@@gordonbgraham Don't get me wrong, Everett might still be proved one day to be a hack or a bigot, but at this point no one can prove him wrong. And unlike Chomsky, he is making claims that can be refuted; meanwhile Chomsky just argues that UG is about biology, makes no predictions, and is not a theory but a field of study - all of which can't be proven wrong.
@@impendio The lack of a concept of the past, of number (other than a little or a lot), and of words for colour, plus the apparent absence of recursion in Piraha (something that would counter Chomsky's claim vis-à-vis recursion), seems to suggest language is not fixed in its conceptual properties. That language is biological certainly can't be refuted, but that doesn't address the issue of recursive grammar, of which Chomsky is a proponent.
Oh man more computer science content please, I'm sure I'm not the only one who wants to see how you modeled words in Python. Plus there are probably a lot of computer science nerds in your audience :D
Yes, this is me. I like languages. I love computers. I love math. If I hadn't luckily discovered my incompatibility with academia as an undergrad, I might've ended up somewhere in the confluence of those studying generative linguistics and/or NLP right around the time that LLMs began blowing up... Eh. For the best.
Part of UG is the idea that there is a certain set of universals, which are all either debunked (like that all languages can form subclauses, which isn't true - some don't do that) or so vague that they can mean anything (like that every sentence has a subject, and each time you find one without, they will say something like "well, it's not expressed" or point to random shit to be the subject. One language was analyzed by two guys and they came to different conclusions about what the subject is). I believe that the ability for language is innate; that doesn't make Chomskyan generative grammar true. It's a model, and as a smart guy once said, "All models are wrong but some models are useful". Take that from a computer scientist who wrote a master's thesis on a very non-Chomsky syntax theory (called Role and Reference Grammar).
The problem with UG is that it's totally unfalsifiable. It's not a scientific theory. Also Chomsky is a hack, but that's not strictly related. There is no evidence in any field that the universal elements of human language are distinct from the definitional elements of language as a concept.
I find your criticism of null subjects to be very weak. UG never needed to have mandatory subjects; in fact the EPP was only proposed in the 1980s, as I assume you know. Even if every sentence in all languages we know of only had overt subjects, the EPP/obligatory subjects wouldn't HAVE to be part of UG. More recently, Chomsky (2013, 2015) even did away with the Extended Projection Principle. Besides, I don't see how misidentifying the subject of a language has any bearing on UG. It doesn't even mean the language doesn't have subjects (whether overt or not).
@@amazingcaio4803 I did not know that EPP was a later development. I never even heard the term, but I guess it's the same as obligatory subjects. My point is that all the universals Chomsky proposed are refuted or irrefutable. And my broader point is that UG has a lot more baggage than the simple statement "Language is innate". That said, I wasn't referring to pro-drop (like Spanish "te quiero") or dummy subjects (like English "it is raining"). Here I see that the subject is not expressed or is semantically empty. But what about the German impersonal passive ("Hier wird nicht gespielt", word for word: "Here is not played")? And let me rephrase my other point: if two Western linguists analyze a very non-Western language and come to different conclusions about what the subject is, maybe "subject" is a Western concept that doesn't fit with foreign languages. The syntax theory I worked with tried very hard not to be Western-centered, and here is an article on why it is so cool: www.computerworld.com/article/2929085/blame-chomsky-for-non-speaking-ai.html The article didn't age too well and the Wikipedia article isn't too long, but it has some links.
@@twipameyer1210 I do get some of your points. I also dislike that some claims people make are so wild they're essentially unfalsifiable. Not only would this be unscientific, but it would also go against one of the central goals of UG - to restrict language so that it explains language acquisition. There is indeed a lot more to UG than innateness. I don't even think innateness itself is controversial; language acquisition must ultimately be encoded in our genes. What's controversial is whether there is a domain-specific language faculty. While hypothesing its existence can easily lead to speculation, I do believe it is a worthy line of inquiry. I don't think that nowadays 'subject' has a formal definition in Chomskyan generative grammar. At most it's generally used as a synonym for the 'specifier of T(ense)', which is a theory-internal concept. This is hardly a Western concept (the obligatoriness of subjects, OTOH, could be) and people doing generative grammar actually like studying other languages so that they can gather enough data to refine the theory of UG. Hungarian was partially used to support the DP (determiner phrase) hypothesis and data from Chinese, Japanese, and Serbo-Croatian helped refine proper government and the Empty Category Principle. Chomsky also once proposed the configurational parameter based on data from Japanese. Even if the concepy of subjects (and/or the idea that subjects are obligatory) were Western-specific and didn't apply to other languages, this would mean that our analysis is biased, but I still wouldn't think we should abandon the idea of a universal grammar, especially because that's incidental to UG (there's no a priori reason for our account of UG to include subjects). Regarding subjects specifically, the general idea is that T must have a specifier, so the focus of inquiry isn't on the subject (read, the specifier of T), but on T itself (why does it need a specifier?). Most accounts don't really restrict the subject to a specific kind of element (any restriction is usually derivable from other principles or is lexically determined), so the subject could be a null pronoun, an expletive, or any other type of constituent (e.g. a locative, as in your example in German). Sure, looks can be deceiving and one could argue 'hier' isn't even the subject of your example sentence, based on empirical evidence or theory-internal reasons. That's why syntactitians may have different analyses. In fact, isn't that expected or even desired? In the end, it doesn't mean UG is refuted or irrefutable. Many universals still seem to be true. One example is Lasnik and Saito's (1984) restrictions on syntactic Wh-movement. (Their account may be outdated, but the empirical phenomenon still holds true, as far as I know.) Edit: BTW I personally have nothing against other theories of grammar and I don't deny their usefulness.
@@amazingcaio4803 Let me first thank you for this conversation! I enjoy it very much and think it's very fruitful, especially compared to other conversations on the internet. Let me now reiterate my claim: all models are wrong but some models are useful. UG is good at explaining some things; other theories explain or visualize other things better. Some Chomskyans will claim that UG is literally, objectively what happens in our brains while strawmanning non-UG linguists as not believing language is innate. That's what I felt the video did, and that is what I was attacking. I don't think you think so, so we are good. And to lay my biases open: I'm more a computer scientist than a linguist. I did some basic linguistic lectures with Chomskyan professors and wrote my master's thesis in the computational linguistics department with a professor who hated Chomsky so much that he unironically claimed that language is just a cognitive ability like any other. My (Chomskyan) syntax professor was so fond of the idea that the subject is universal that I had the impression it's central to UG. I'm glad that isn't the case, apparently. I find myself somewhere between these two worlds.
Most people know Chomsky as a political writer. His talks are so hard to get into, that once when he gave separate "politics" and "linguistics" talks at the University of Maryland in the 1990s, I joined a bunch of people for the linguistics talk, where the Q&A was all politics questions. Chomsky's theories about universal grammar have stood up very well over the decades, but within linguistics there are camps that favor his theories and those who are critical. When I worked for AAAS and Science magazine in the late 1990s, I helped plan the annual conference. We always had to make sure that programs in the linguistics track represented both camps.
As a linguistics enthusiast and conlanger, the concept of universal grammar is interesting to me because I like thinking about the earliest limits of language development in prehistory: how did people go from single word utterances to tell each other about food, predators, or danger, to complete sentences? Is the process of stringing words together in a way that creates meaning something that evolved within our brains, or is it a social construct? How much of language is instinct, and how much is learned? As a conlanger, I'm often amazed and delighted when my made-up words that I put together in a made-up sentence structure can still produce meaning, in some very interesting ways. If there wasn't some kind of basal syntax deep within our brains, conlanging would be very different, if not impossible.
One could also start to study Language/languages not only as they developed over millions of years among primates, but also as how a mother tongue develops from the bond between mother and child, from the baby's perspective. In the last months in the womb the unborn child hears both music and speech and other sounds; thus it is familiarized with the world of sound surrounding it. I have heard babies use sentence melodies, cadences, both declamatory and question modes, when they could not talk and walk yet. They also recognize music and imitate melodies and entertain themselves. And I have heard babies take turns in interaction with their mother, using syllables of sounds and minimal pauses, as if they were words. Babies were in such situations very pleased with their sounds and the interaction they could bring about, ready to repeat the situation again and again. Patient mothers gently mold these sounds into (let's call it) a prelanguage, and then into the baby's mother tongue. Dads can do this, too, if they are around enough. Often parents understand the baby, and later the toddler, when nobody else does. Many cute situations come up, of entertaining value for the adult, too, but also misunderstandings - which are *meaningful* experiences, as part of learning how rules regulate language use. These foundations could contribute to explaining why our languages are not as logical as binary sequences; instead there is a haphazardness to words and their forms within the sound system of the mother tongue (and the anatomy of toddlers).
But Chomskian grammar (assuming we're talking about generative grammar, which is a grammatical theory, and not the pseudoscientific innate syntax hypothesis which is frequently referred to as "universal grammar") has nothing to do either with how languages work in our brains or with how they historically developed. Syntax is mainly a pathway to linearizing expressions that initially mirror our thinking, which is primarily associative in nature: thoughts don't have hierarchy, they don't have constituents, they don't even have any linear order (even though they are experienced linearly due to the nature of time); they only have concepts and relations between them, relations between sets of related concepts - those can be active, patientive, causative, instrumentative, spatial, temporal, etc. - i.e. meaningful relations, not abstract-structural ones. Generative grammar is an abstract model that is supposed to be able to predict, mostly based on the structure of linguistic expressions, whether those linguistic expressions would be considered structurally possible within a given language - i.e. it is supposed to predict whether a person *perceiving* the text would consider it *structurally* inappropriate, and if they would, it is supposed to tell us the structural principle that was violated. Generative grammar does not and was never meant to predict or explain the way syntactic structure is arrived at, either in use or in historical development. Some structural patterns are obviously there, just like, for example, every walking animal moves its feet the same way as the other animals of the same species and the same age, even though they are free to move any other way they are physically capable of - they just won't be doing that most of the time, as long as one of the millions of biomechanical alternatives is noticeably more comfortable in the long run; animals just have places they are aiming to get to, and all they do is step, step, step, etc., without having any "overall way of moving feet" in mind. With languages it also seems that this structure of speech isn't something that is ever the goal, but something that emerges on its own, out of the systematicity characteristic of the expressive means of a language - of the lexical units and their various counterparts within the lexicon, of the means for expressing predicative relations, semantic modification, communicative relations, rhetorical relations and so on - the systematicity that is quite necessary to ensure the possibility of unambiguously reconstructing the initial conceptual content that had led to this very structure that is observed (i.e. the systematicity whose loss would make it impossible to consistently "interpret" the resulting linguistic expressions). It's not the structural principles that lead the development of speech complexity - those principles aren't there yet when speech never reaches any complexity: the language stays incapable of expressing that as long as the society that generates that language stays culturally incapable of the necessary verbal coherence - there are human languages that don't have any means of precisely expressing most kinds of relations between concepts, not even the predicate-subject-object relations. It is the cognitive development and cultural "fossilization" (roughly speaking, concept → myth → habit → tradition) that gradually lead to the development of this structure that we observe and that we can describe using the notion of generative grammar.
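To make the "predicts structural possibility" point concrete: a generative grammar in this narrow sense is just a device that either assigns a string a structure or fails to. A toy sketch using NLTK, with three invented rules (nothing here is a claim about any real analysis):

```python
import nltk

# A toy grammar: it "accepts" strings it can assign a structure to,
# and rejects everything else - nothing more.
grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the'
    N -> 'dog' | 'cat'
    V -> 'chased'
""")
parser = nltk.ChartParser(grammar)

def structurally_possible(sentence: str) -> bool:
    """True iff the grammar can assign the string at least one structure."""
    try:
        return any(True for _ in parser.parse(sentence.split()))
    except ValueError:  # a word the grammar doesn't cover at all
        return False

print(structurally_possible("the dog chased the cat"))  # True
print(structurally_possible("chased the the dog cat"))  # False
```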
The linguist Guy Deutscher wrote a popular linguistics book all about how language development could have happened in prehistory: it is titled _The Unfolding of Language: The Evolution of Mankind's Greatest Invention_
Great video! In linguistics at Cambridge, we studied principles and parameters in intro to syntax in first year, and in later years it got completely overhauled by minimalism and other approaches. Coming from a biological science background (I studied neurology and neuroanatomy) I’ve always been super sceptical of UG and only managed to reconcile what I know about cognitive science with what linguists from largely humanities and philosophical backgrounds refer to as “the language faculty” by describing UG as our innate bias for language acquisition in the vaguest way possible. The fundamental principle underlying syntax is recursion, which is what leads to hierarchical structures. I was taught that human language is recursive and that is what distinguishes it from other animal sounds. It’s only after I graduated that I learned humans think and process things recursively, which begs the question of whether it’s because of our recursive perception that we developed recursive language or if recursive language helped us perceive other things recursively. Super interesting stuff!
Also a Cantab, had the same experience here (though I came to these conclusions during undergrad). I've ended up working on paradigmatic morphology, which doesn't really fit such a compositional theory of language easily (contra Distributed Morphology and morphemic theory more generally), so I don't really interact with Chomsky's work much these days, but my view is that Chomsky's work features (1) a confusion between description and explanation and (2) successive rounds of having less substantial things to say (e.g. minimalism has a much-reduced language faculty relative to P&P and GB).
Thank you!! I saw a grad student doing parse trees at the LGBT Center and learned more in a 5 min discussion than I had understood from 8 years of Spanish education! I was only able to comprehend and fully parse foreign language when I was given some basic operators for structuring how the modules worked together. After taking a syntax class with Beatrice Santorini, I picked up Japanese quite easily and the sensei even commented on how rapidly I understood grammatical distinctions such as the は/が topic vs subject marker or how adjectives seemed to be either fundamentally noun-ish or akin to a verb for having that quality. They appeared to be either noun-ish + the copula word that conjugated tense, mood, aspect, negation, and conjunction, or almost quasi-verbs which then conjugated normally. That structure paralleled a similar verbifying structure that was "to do" + a noun: rugby → play rugby, study → to study, marriage → to marry/(get) married. Once we were shown adjectives as they conjugated for the past, the pattern jumped out at me: at some level there were nouns you were like and nouns you do, or verbs. Which opened a whole backdoor to philosophy and all these meditations on what red is, whether qualities belong to the object, and if not, what they "are". I never understood why we discussed that before then... Syntax changed my life (as I've mentioned before 😳)
Very interested in the Intro to Linguistics class. I'm a computer software programmer with a ton of Python experience. When I was in school in the 80s, I took Intro to Linguistics, and Computers and Linguistics... and when I dropped out, my linguistics prof called me on the telephone (!) to ask if I was ok. I was to take her Transformational Grammar class, and I sorely regret not having done that.
I think that a nuanced approach to language learning is important. An overwhelming majority of languages are observed and not constructed. Therefore, the so-called "rules" are just someone's observations, and their opinions on those observations. For example, as far as I know, the first academic mention of a "modal verb" was by Leonard Bloomfield in the 1933 book Language (thus making the idea of a "modal verb" itself less than 100 years old). However, what is "deemed" a modal verb was basically established by Quirk and Greenbaum in 1973. Since then, many have questioned whether there should be a finite list of modals, like Quirk and Greenbaum propose, or if the broad definition of what a modal verb is leaves room for interpretation. Is "have to" considered a modal verb, since it sets out to do what other clearly defined modal verbs (like 'must') already do? Does "need to" fall into that same category? Now, what I mentioned might be a uniquely English problem, since we don't have an "English Academy" regulator. Still, even regulators such as the Académie française, while regulating the language, are still doing it through the interpretation of its members.
as a biologist and linguist, i have difficulties accepting UG (although your python-related input is quite interesting). instead of a language capacity (or language organ or else), there is rather something like language readiness, a synergy of different older neural networks, which took over the processing of language but has not evolved to be language-specific (although it may later evolve to be more efficient). the statistical learning approach seems to me a more plausible account of how language works, as advocated by morten h. christiansen et al. it would be interesting to know where you stand on this approach 😁
Your research sounds promising. But I'm sure I'm not alone wondering what kind of PhD program trained biologists AND linguists. Not my field. Just curious.
But Chomsky doesn't claim it evolved to be language specific. His claim is actually very similar to yours. There was some specific evolutionary change at some point that allowed other neural functions to click together and express this latent capacity as language. As more and more people spoke and languages evolved and changed, statistically, more and more different possible arrangements of the computationally restricted possibilities of language emerged. Since language is a layer cake of different computational systems with various constraints - pragmatics on top of semantics on top of syntax on top of morphology on top of phonology (itself constrained by the physical limits of phonetics) - you have an almost infinite possibility of configurations. It's quite an elegant and simple solution to the problem of the vaaaast diversity of languages that exists.
@@Robespierre-lI you have to study them separately :-) although, i think it would be quite beneficial for linguists to combine their training with the biological one, especially in evolutionary biology, as there are plenty of shared fields of interest - e.g. population genetics. i started with biochemistry through molecular and system biology, and after some time, in fact, almost 20 yrs later, i went for general linguistics. my major interests in lingo are cognitive linguistics and sociolinguistics.
@@Eruntano42 from the biological point of view, the evolution of language and the respective neural structures has to be more stepwise. e.g., arbib et al. hypothesise that there were at least seven distinct steps (synergies of new neural networks) and, gradually, certain forms of language communication appeared, which were much simpler than contemporary human language but still helped to survive in more and more complicated human societies. the thesis of "pragmatics on top of semantics on top of syntax on top of morphology on top of phonology" is, according to this kind of study, not necessarily valid; e.g., pragmatics does not require syntax or morphology in the earliest assumed forms of language. it is still in the early development stage in all these directions, but what seems to be a sign of good science is that they can explain more than previous hypotheses.
The language acquisition literature is very much based on specific universal grammar mechanisms, and they are more productive than statistical learning.
Thanks for your work Dr Jones. I tried to read Chomsky when I was a teenager in the 1970's. I got a bit lost, but I like your programming analogy. I have always been a language fan.
As an English speaker I think I learned more about English grammar while learning French grammar than anywhere else. I may have forgotten what English grammar was actually covered in what our schools called the language arts class, but when the French teacher was talking about using the subjunctive with the past perfect in French, I know it was a revelation for my understanding of English. English grammar was always "that does not sound right" or "that sounds right" or even "that sounds better".
When I studied Russian in school one of the best books I had was called English Grammar for Students of Russian. I think there are versions for several other languages as well.
@@joshrotenberg5872 YES!!! I recommend that series for everyone who wants to learn a language covered by that series. There's even 'Spanish Grammar for Students of English' written in Spanish.
As someone old enough to have learned to read through phonics and had English grammar explicitly taught in grade school and later got a master's in teaching ESL, I can tell you that 2/3 to 3/4 of English/ESL teachers actually have a very poor grasp of both English grammar and English spelling.
About 5:20-6:24 we have an analysis of how words are put together to make syntactical structures. Does it matter that when learning a language, children parse phonemes, not words? The sound 'ed' indicates past in English, so children often say the ungrammatical sentence 'I wented'. Obviously this does not falsify Chomsky's theory (nothing can), but can UG account for how children learn language when they don't hear separate words, but a string of continuous sound?
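The 'I wented' error itself is easy to model as a memorized irregular form with the regular rule applied on top - a toy sketch with assumed forms, not a claim about the actual acquisition mechanism:

```python
IRREGULAR_PAST = {"go": "went", "see": "saw"}

def adult_past(verb: str) -> str:
    # Use the stored irregular form if there is one, else apply the +ed rule.
    return IRREGULAR_PAST.get(verb, verb + "ed")

def child_overregularized(verb: str) -> str:
    # The attested error: retrieve the irregular, then still apply the rule.
    return IRREGULAR_PAST.get(verb, verb) + "ed"

print(adult_past("go"), child_overregularized("go"))  # went wented
```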
As a computer scientist myself (although not doing any computational linguistics), I too am interested in the intro to linguistics course. I am fairly good at English grammar and have a lot of professional writing experience (not just technical writing, but communicative/persuasive writing and creative writing as well), and love what you do here. I too see the overlap with Comp Sci, and with my love of computer language theory, I often talk about similarities and differences between computer languages and spoken languages with programming students. Thanks for what you do, and I look forward to more.
I appreciate this exposition of the issue of UG, and agree with most of it. Here are two points where I have some reservations. I don't think it's simply misunderstanding or proliferation of straw men. There's a deeper philosophical aspect of the dispute between (some) anti-Chomskians and UG types. The issue, which Chomsky explicitly takes a position on in, I think, Knowledge of Language, is the old nature vs. nurture dispute, aka Plato vs. Aristotle. Generative(TM) Linguistics isn't only about the insight that something like rules generates output. It's a specific stance that that capacity is hard-wired in a language module or language modules, as opposed to simply being the outgrowth of general intelligence understood as an overall learning capacity. The idea of innate knowledge is deeply offensive to some people who connect it to the worst excesses of evolutionary psychology and even social darwinism. That seems like a stretch to me, but the whole issue gets interfered with by the sociology of the field, branding (as you point out), and toxic personalities. Another thing I would want to modulate a bit is your colorful characterization of sociolinguists, a tribe I'm a member of. Essentially, since Tony Kroch's arrival at U Penn (a place I think you're familiar with), Labov, G. Sankoff et al. have been much more sympathetic to a UG-type approach (although without the term) than a lot of others. In fact, the field seems split between an innate/modular cognition approach and the functionalist/connectionist strand. There are a number of researchers who do variationist analyses along with generative syntactic ones.
The way I heard it, people kept finding cases in languages that don't fit the model, and the model had to be generalized until it was meaningless. The most famous case was Pirahã not using recursive structures at all. No clauses, no noun phrases within noun phrases. Kinda like parsing an assembly program where every sentence/line of code says only one thing.
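To make the recursion point concrete: a single self-embedding rule like NP -> NP's N gives unbounded nesting, and the Pirahã claim is that no such rule exists. A toy illustration (invented English example, not Pirahã data):

```python
def nested_np(possessors):
    # Apply the recursive rule NP -> NP's N once per possessor.
    np = "John"
    for noun in possessors:
        np = f"{np}'s {noun}"
    return np

print(nested_np(["brother", "friend", "dog"]))  # John's brother's friend's dog
# Drop the recursive rule and every noun phrase stays one level deep,
# like assembly, where each line states exactly one thing.
```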
I’d be interested to know what you think about Daniel Everett’s work on the Pirahã language and his assertion that it does not have recursion (Chomsky’s essential property of language)
UG is very interesting, but I disagree with Chomsky on many of his other opinions outside of linguistics haha. Super interested in the course you're putting together here on YouTube. Really looking forward to watching it!
OMG, huge nostalgia for when I took UG at UT Austin. I loved it so much. Phonology not so much (in fact, I almost changed majors because I just thought I was DUMB). But Phonetics?? OMG, IPA is heaven to my eyes and ears. Yes, Intro to Linguistics, por favor!
Thanks for this video eloquently laying out the beauty and truth of Chomsky's syntactic theories. He's still very much swimming against the current despite his popularity in his own field of study
My Phil of Lang and linguistics professors at the University of Kansas were MIT Chomskyans. Thanks for giving this major breakthrough in cognitive linguistics a fair shake.
I did an undergraduate degree in linguistics at a school that required all students to study a semester's worth of Minimalism. Got absolutely clobbered by it, failed it, had to take it twice. Years later, I went back to school to do an undergraduate computer science degree at a school that required all students to do a semester of computability theory. Imagine my surprise when good ol' Noam pops up with his binary trees and context free grammars! Never before in my life had I wanted to write my old Syntax professor a Christmas card.
I was a ling major at Maryland in the mid-80s. The first course after baby-Ling (this is a phoneme) was basically a semester on the evidence and motivation for UG. It was a VERY Chomskyan school. My professors always pronounced Noam to rhyme with Rome :-) I have no idea if that was right, since they had pretty strong accents of their own! And then in my computer science classes (dual major), there was Chomsky again, of course.
'Gnome Chomsky' made me laugh, but familiarity does tend to make names sound more like plausible native words. Not a lot of native English vocab with an o-a transition :)
@@rasmusn.e.m1064--- I don't think I've heard his name pronounced any other way than 'gnome'. Then again we're talking grad school linguistics in the early '90's.
@@ak5659 I'm not a native English speaker, but I typically pronounce his first name in English something like [ˈnɔ͡ʊ̆ˌəm], which I perceive as my realisation of /ˈnɔ͡ʊ̆ˌæm/ but I can see why the final schwa might easily get reduced in this particular constellation of //First name_Surname//, where the surname tends to get a lot of stress.
Your vids are great, thank you. I would like to see a more detailed explanation of "Universal Grammar". What is it, where's its place in the development of linguistics, and what's the controversy?
my teacher quickly went over UG in my linguistics I class the other day, I wrote down what she said so that I could look it up later (because "wtf? no?") and you just saved me! wonderful video!
The story about how you started to reinvent computational context-free grammars sounds like exactly the sort of rabbit hole I'd fall into 😆. I could totally see myself doing something like that. This video is right up my alley and honestly, so is this whole channel
I'm a computer scientist halfway through an introductory university course on linguistics as well as a language philosophy class. There is a person for every niche youtube video. I guess I am the one for this video. 😂
I always thought grammar is like a programming language. And I wished there was a full documentation of every language (or is there something like that?). Most language learning books aren't really precise with their explanation of grammar. Like in Japanese, most of the books tell you that there are na-adjectives (but technically these are special nouns), or that the particle wa indicates the subject (this is also not really true). And then the teachers are confused why I make strange sentences.
This is certainly an interesting presentation on universal grammar, but there are many reasons why people disagree with its tenets (besides the representation of how sociolinguists criticize the "ideal speaker" concept). Ewa Dąbrowska has a really good paper that outlines criticisms towards universal grammar; I would really recommend the read to anyone who's interested in some more solid criticism of generative theory. Ewa Dąbrowska (2015). What exactly is Universal Grammar, and has anyone seen it? Frontiers in Psychology, 6.
The main problem is with UG selling itself as the sole theory that does all of this!! Like, parameters exist in other theories, they're just conceptualized differently and the claim about how we get them is different. Like when you talk about the generation of sentences: Chomsky reacted to Behaviorism and Skinner. Rightfully. But in the process he somehow seems to negate that things like dependency grammar existed (also successfully used in NLP without needing to posit that these grammatical relations are innate!). UG was a novel way of computationally defining sentence structure but it was not the first, by a long shot, to introduce bloody "hierarchical structure"!!! Things like "unergatives" or "unaccusatives" are the work of typologists, many of whom are not necessarily into UG, nor do they *need* UG to explain what an unergative does!! It's not impossible to describe all of what you did in construction grammar or, better yet, dependency grammar. You can analyze grammatical structures in dependency grammar (and it's been done since at least the 1930s) without needing to posit innateness. All of the "other" theories make assumptions about hierarchical structures! What people take issue with is not the theory of generating sentences in itself but everything around it: the "language module" claim (no real evidence and in some descriptions unfalsifiable); the poverty of the stimulus claim; the one genetic mutation claim; UG generally making too much of a distinction between syntactic structure and everything else; UG essentially making hierarchical structure "their" thing rather than accepting that other, even older, theories clearly have that too, etc. The main problem is that someone thinks they must explain similarities through innateness and the existence of a language module rather than the fact that language uses a combination of domain-general skills that are common to humans and, therefore, lead to humans communicating in similar ways but also with many, many differences, like one language having accusative alignment and another having crazy stuff like split-ergativity. I really like your content, so this is not coming from an angry position and I'm sorry for the length, but: First, while it's true that UG has transformed to mean way less than it initially did, there are still plenty of people who assume way more to be hardwired than is plausible. In language acquisition, there are people who literally assume that we have to have an innate predisposition for "determiners" or else we couldn't learn them. And they'll fight anyone who claims that this is possibly not very reasonable. UG, unfortunately, has many researchers who like to come up with claims that are essentially unfalsifiable, like the poverty of stimulus, or that merge must be one single genetic mutation, or that there is a specific language module (the last two might be one and the same, depending on decade of publication). Second, if Chomsky ever claimed that children produce linguistic *structures* they had never heard, we need to define what a structure is. If it's supposed to mean something like the passive construction, then no, that's not what's happening. A child is able to generalize properties from some structures and use them with parts that did not appear in those structures in the input. Now, the big fight concerns how this is possible. Third, nobody who does construction grammar assumes a lack of hierarchies, yet UG people like to claim they do (another straw man).
Nobody working in non-UG assumes that UG literally says "all languages essentially have the same underlying structure." In this case, despite you having good intentions, it seems almost like a straw man of the straw man counter-positions. What people *do* doubt is that linguistic abilities necessarily require a "language module", as in a particular skill that is in situ *just* for language, given the fact that i) many things can be explained by domain-general abilities; and ii) other species have certain features as well, and some of those, especially pertaining to pro-social communicative units, suggest complex structures. So the main issue with Chomsky's (et al.) claim is that there is an assumption of a mutation that makes us able to have merge and that merge is what makes up the narrow faculty of language. Fourth, the computation point you mention around 3:30 is a perfect example of what happens in a lot of reporting on UG results. You rightfully say that it is perfectly computable to say a sentence backwards to express something like a question. True. But is that actually easily computed in natural human interactions? No. And that has to do with our *general* cognitive abilities and *decidedly not* with a language module, yet here it's used as "evidence" for regularities in the world's languages? We know a ton about how much we can generally keep in our memory stack for sentence production, so that is a constraint, and obviously the other person also has to be able to parse the sentence; another constraint. And there are many more. Besides having a marker like prosody or an extra word or moving words around, there is very little that could be done without requiring extreme adaptation. So, seeing these "similarities" as evidence is reaching. Or the mention of unergatives or unaccusatives as somehow relating to UG, as if they wouldn't be equally well described in other theoretical frameworks. Generally, I tend to follow the principle that I don't care if someone has a UG background or not, unless it affects how they debate results. So interesting observations will always be interesting. But what frequently happens is this derailing into "and UG is the only way to explain this", which is unnecessary. Lastly, while this entire debate might be unknown outside of linguistics, you must at least acknowledge that this "crazy" idea is the dominant one in US linguistics. PS: I don't know when you studied linguistics but the computer-phobic comment seems a bit dated. I'm a computational neuroscientist (coming from a Physics undergrad, so I had to learn a lot of new stuff) and currently work in a cognitive science department, with about 80% of my work involving linguists, and I teach computational approaches to language perception as well as coding and statistics classes. But even in programming, to write the code, you have to think about how to encode rules first. It's not like computation solves itself.
Great video thanks! What would you recommend for someone who dabbles in linguistics, but needs a nice overview of the main ideas and concepts in language learning and acquisition? A nice, readable book for example? Asking for a friend.
UG provided plenty of employment for people who today would migrate into software engineering. It’s very appealing to anyone with an analytical mind. I loved it when I first encountered it 30 years ago. Unfortunately it is a theory, and to cope with real languages it ends up becoming horribly complex. I’m glad you referred to adding another epicycle, that’s exactly where I was heading. Archeoanthropology suggests that language might be a million years old. We know that a language that was spoken 6,000 years ago lacked certain conceptual structures that exist in modern languages. And we can see how language evolves over time, and make educated guesses about the evolution of grammatical structures. My guess is that the brain has a relatively simple architecture for processing language, and that the complexity in modern languages has evolved culturally. That generalised neural networks can produce language is surely proof that great complexity is not needed, just an absurdly large number of neurons.
This brings back memories of the "language wars" at MIT in the late 60s and early 70s. I had taken classes in AI from Marvin Minsky and Seymour Papert. My problem with formal linguistics was that it didn't make sense from a computational point of view, and it didn't deal with intrinsic ambiguity. It was very interesting to see how effective syntax emerged, so I took psycholinguistics. One interesting example was using a click test (clicks in one ear, language in the other). For example, it showed that the subjunctive was in decline. Also, the meaning is not in the sound waves but in the context. Generating language doesn't teach much. Understanding and acting in ongoing contexts is more interesting. Just language mixed still sense maketh.
I think the most important deep theoretical split is between people who believe natural languages are the way they are primarily because of quickly degenerating *competencies* of very young humans (LAD), or because of cognitive *weaknesses* in those same young brains, which languages evolved to more effectively colonize (the coevolution theory prominently associated with Terence Deacon).
Referring to the LAD, do you mean that the older a kid is when first exposed to language, the less likely he is to acquire native speaker fluency? If yes, I taught people in that situation years before Language Deprivation Syndrome became a thing. Please excuse the awkward wording. Grad school was a looooong time ago.
Decades and decades ago, I was equally skeptical of deep grammar & the like, but the Army was sending me to language school. Learning well was the ticket to not ending up in the infantry... So I picked up Bickerton's book _Language & Species_... It ended up being a lifesaver. Possibly literally... I walked into Russian fully understanding and just intuiting case structure, feeling my way through what all my buddies sweated over hours a day. Serendipitous... Not saying it would have helped with any language... Time I went back for Arabic, didn't help a bit. But, if you're learning Russian...
Aaaaa, intersectional disciplines uniting their fields to produce new forms of understanding, it's modern day magics. It is really cool to see there's room for programmers to work in the field too! Thanks for these videos, and Chag Pesach Sameach to you and yours!
When you hear the phrase "I saw the man" but parse "saw" as a present-tense verb performed with the noun that is a homonym of the verb - you know, like in a magic act (or a horror movie).
I remember two textbooks that I used to learn Japanese: Tae Kim and Gakushudo. Both of them teach you set "sentence patterns" with "variables" that you can substitute with other words, e.g. [Thing]NI[Action], where you can simply substitute [Thing] with other nouns and [Action] with other verbs to say that you did some activity there. But they never give you a tree diagram that splits a sentence into SubjectPhrase and VerbPhrase. Instead they teach you that the verb is the core of the sentence while all other things like Topic/Subject/Object/Location/Origin/Destination etc. are just additional information that is optional. Is Tae Kim's approach the same as Chomsky's universal grammar? Generative grammar? Dependency grammar?
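For reference, that textbook method is essentially template substitution, something like the following sketch (the romaji pattern and vocabulary are my own illustration, not taken from Tae Kim):

```python
# The pattern is a template; the "variables" are slots filled by plain words.
PATTERN = "{topic} wa {place} ni {action}"   # roughly [Topic]WA[Place]NI[Action]

def realize(**slots):
    return PATTERN.format(**slots)

print(realize(topic="watashi", place="gakkou", action="iku"))
# -> watashi wa gakkou ni iku ("I go to school")
```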
I would be interested in linguistics for regular people, even if I am not sure I would fit in the regular people category. I get the idea of that meaning of "ideal" - it's similar to the "ideal" in "ideal gas law".
1. I don't believe that there is a universal grammar, but our minds are able to form a grammar. The difference is the subject: mind (whatever it is) vs grammar.
2. It is so easy to form a sentence with an unfamiliar word. I wonder if you know the word sepulka, but I bet you already have some idea about sepulling and sepulkarium.
3. My own attempts to program something that could build sentences ended with the impossibility of teaching the machine the notion of I - except by hard-coding it, which is not interesting. Ideally, I'd ask the machine "Who are you?" and it would reply "I am Parrot", and if I asked "Who am I?", it would reply "You are the Creator".
Funny coincidence. I just started learning German, and am in the middle of a coding spurt to generate very simple sentences. Using Python, too, to start, although I plan on transitioning to Javascript soon enough and presenting everything with nice colour clues in a browser. I've written all the classes you said you started out with, and have parsed and wrangled a lot of case tables from Wiktionary to have some data. There's flexibility in all the categories across which nouns inflect, and the ability to toggle their representations into pronouns. I haven't read any syntax books though, and don't plan on expanding what I have beyond very simple sentences that can get my feel for noun genders and cases off the ground. It feels like overkill to make a complete representation of German grammar just for this. So I won't. I know I'd soon give up if I tried.
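The core of it is little more than a noun class consulting a case table - a stripped-down sketch of the kind of thing described above (hypothetical names, definite articles in two cases only):

```python
from dataclasses import dataclass

ARTICLES = {  # definite article by (case, gender); a tiny slice of the real table
    ("nom", "m"): "der", ("nom", "f"): "die", ("nom", "n"): "das",
    ("acc", "m"): "den", ("acc", "f"): "die", ("acc", "n"): "das",
}

@dataclass
class Noun:
    lemma: str
    gender: str   # "m" / "f" / "n"

    def with_article(self, case: str) -> str:
        return f"{ARTICLES[(case, self.gender)]} {self.lemma}"

hund, mann = Noun("Hund", "m"), Noun("Mann", "m")
print(hund.with_article("nom").title(), "sieht", mann.with_article("acc"))
# Der Hund sieht den Mann
```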
I'm interested in the intro to linguistics
Me too
Me too!@@DoughBrain
Me three!
I'd love to see this too!
Me too!
As a card-carrying, bona fide Regular People, I'm very interested in the Intro to Linguistic for Mes
"What's the singular for people?"
I need to remember to use this comment in the intro!
We used Chomsky’s sentence tree around 25 years ago in a company that developed machine translation software. The source language was parsed (analysed) by monolingual rules, based on its Chomsky-tree structure, from the top sentence level down to its smallest entities - the individual words. This structure was then transformed by rules into the grammatically correct tree structure of the target language. We only handled English paired with French, German, Italian, Spanish and Brazilian Portuguese, plus French-German, so no languages with completely different language systems. German, due to its very different word order, was actually the most difficult language because it required a 100% correct analysis of the source language and made use of preposition attachment and other interesting things to make sure semantic units were not ripped apart when creating the correct word order for the target side. It was a brilliant system for its time, but also proof of how complex languages are and how difficult it is to capture all of it with a rule-based system.
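Schematically, a transfer rule operated on trees something like this (a minimal toy sketch with invented names - the real rule base was vastly larger):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str                      # phrase label ("S", "NP", "VP") or a word
    children: list = field(default_factory=list)

def transfer_vp_to_german(vp):
    # Toy transfer rule: English [VP aux V NP] -> German [VP aux NP V],
    # moving the participle to clause-final position without splitting the NP.
    aux, verb, obj = vp.children
    return Node("VP", [aux, obj, verb])

# "has seen the man" -> "hat den Mann gesehen" (after lexical substitution)
vp = Node("VP", [Node("has"), Node("seen"),
                 Node("NP", [Node("the"), Node("man")])])
print([n.label for n in transfer_vp_to_german(vp).children])  # ['has', 'NP', 'seen']
```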
Obviously, no serious language translation company today allows Chomsky-like ideas into their computers.
I don’t know any German but I would expect a machine to look at a long German word and occasionally divide it into sub words incorrectly with results that are hilarious nonsense to a human. In fairness, I have done this with Hebrew many times, such as wondering why Betzalel was named “the onion of G-d” instead of realizing it was “in the shadow of G-d”
@@wafelsen The old MT systems used dictionaries. If a word was not in these dictionaries, the systems could not translate it and definitely would not attempt to break it up into meaningful snippets. This would have slowed down the translation process. Our system (and I assume most if not all of the systems of other companies) used an embedded complex morphology analyzer that recognized all possible inflected word forms on the source side and created the corresponding forms on the target side as a final step. The dictionaries contained only the uninflected base forms such as singular forms, infinitives and uncontracted forms - plus codes (morphology) and lots of tags (grammatical, semantic and syntactic information) used by the engine and the rule system.
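Conceptually, the dictionary plus morphology analyzer looked something like this (a hedged sketch with invented codes and tags - the real inventories were proprietary and far richer):

```python
# Base-form dictionary: uninflected entries plus morphology codes and tags.
LEXICON = {
    "see":   {"pos": "verb", "morph": "irregular", "past_part": "seen",
              "tags": ["perception", "transitive"]},
    "house": {"pos": "noun", "morph": "regular", "tags": ["concrete"]},
}

def analyze(surface):
    # Toy analyzer: map an inflected surface form back to its base entry.
    if surface in LEXICON:
        return surface, LEXICON[surface]
    if surface.endswith("s") and surface[:-1] in LEXICON:   # houses -> house
        return surface[:-1], {**LEXICON[surface[:-1]], "number": "plural"}
    return None   # unknown word: the old systems simply gave up here

print(analyze("houses"))
```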
@@wafelsen Flashbacks to me thinking that the "Gamali-" part of "Gamaliel" meant "my camel"...
I wonder why you didn't stick to languages within ONE language family. You went for two. I think because they are European?
Came for the linguistics, stayed for the academic in-fighting stories.
I have LOADS of those
@@languagejones6784 Share more please 🙏
As a computer science student and language enthusiast I thoroughly enjoyed this video. You timed this great for me as I'm taking Discrete mathematics where I've learned about Regular Languages from a math perspective AND I'm taking a class on compilers where I'm learning about syntax trees and writing code to generate them based on grammar I've defined. I'm loving the content that dives a little into linguistic academia in concise interesting videos. I'm definitely interested in hearing more about generative language theory as I don't have a firm grasp on it or why it exists.
Check out the book Snow Crash by Neal Stephenson.
As a linguist, my quippy answer to the title would be: It's a lot less wrong than it sounds, but it is wrong.
Zing!
Conversation with a child language acquisition lecturer, paraphrased: "young children are bombarded with language 24/7 and are sensitively attuned to it because their attention is always focused on their parents" "so why is UG premised on the paucity of the stimulus?" "my guess is that Chomsky never spent much time raising children". 😵🔥
@@johnnye87 Of course there is poverty of input.
Universal Grammar (UG) approach claims that there is a universal set of principles and parameters that control the shape of human language.
But the entire corpus of studied language is so small that it is impossible to derive what this universal set of principles and parameters actually is from using the entirety of human language as input.
If the entirety of human language does not contain enough input to enable the principles and parameters of UG to be extracted, then how could a child receive enough input to work them out, unless they are already built in?
@@stevencarr4002 My issue isn't so much with the hypothesis that the evolved human language faculty preferences certain patterns over others, so that you can get a "complete" language without having to expose the child to enough data to unambiguously derive every rule. That seems pretty obvious and unremarkable.
My issue is that they tend to treat the formulae they come up with as though they represent actual mental processes, when they bear no resemblance at all to how any other part of the human mind works. We don't operate on strict mathematical rules. A neural network like the human brain runs on weights and fuzzy logic, and the languagey bits of our brain aren't isolated from the non-languagey bits. Chomskyans produce formalised descriptions of a phenomenon and then mistake the map they drew for the territory.
@@stevencarr4002 The argument question-beggingly assumes there are principles of UG to be worked out. The usual anti-UG line is that language evolved to be learnable by human brains, not the other way around. Cognitive biases were around long before language, so any language that arose would of course be shaped by those biases. We don't work out the laws of physics to learn to walk either, so is the correct formalization of physics innate also?
I read the Chomsky book that was mentioned in this video in an upper-division linguistics class. The first time I read it, it was really confusing, as is the case with many academic writings, especially ones from philosophers. I'm glad people like you are here to read through it fully and share your understanding with us, because it provides more discussion of the topics these academics talk about. To me, the more we discuss these ideas, the more minds we can get working on them and the faster we'll arrive at a solution. I think it's really beautiful that you make these kinds of videos and share them on the internet. Thank you for the great work!
Intro to Linguistics for regular people, please
I loved that BC joke. Please keep these videos coming. I love these.
This channel feels so relevant to any grammar thing I'm currently thinking about. I feel like my mind is being read or something. I was thinking about this yesterday. Great video
So great to see this video. I had a similar experience with Chomsky in linguistics and cognitive science (misunderstanding and dismissing the straw man definitions of UG or LAD, working through the topic myself, realizing what he was actually positing, finding his work invaluable and awesome), and I’ve been butting heads with social scientists (my native field is anthropology) about it for 15 years. Love your channel!
I am also super interested in the intro to Linguistics!! Great video btw, thanks for clearing up a lot of the confusion I had around UG!
I'm glad you mentioned the branding problem at the end. That explains a lot. So let's just say that my entire grad school experience was three years of back-to-back branding debacles regarding anything related to Chomsky.
I'm pretty sure it's a religion and we're watching, as is tradition, a battle of heresiarchs.
11:15 Understand that this pseudo-erudite straw-manning is precisely Chomsky's own approach to any area of philosophy he can't be bothered to understand well enough to comment on, and instead chooses to classify as nonsense.
It seems like a kind of poetic justice, really. He feels confident enough in his intellect to write off anything he can’t understand as “incomprehensible” and others do the same with him. Oh the hubris of the hyperspecialized.
Hey Adam, you sound a little angry
That was interesting! I’d love an intro to linguistics course that provides a structured explanation. Thanks for the great content!
Great stuff! Pauses are immensely important for your viewers, especially when the content has any level of complexity. More often than not I feel the need to speed up a video, but you're the first person I've ever felt I needed to slow down or have to rewatch. Would love to hear your reasoning against the naysayers, perhaps another video idea or so.
Would you consider doing a video on the Pirahã language and associated academic controversies?
I’ve been wary of it, but I think I should
@@languagejones6784 Address Everett's claim that recursion is not the base on which language is constructed, including his claim that Piraha has no past tense
@@languagejones6784 Would watch. I have had to explain UG too many times because of this controversy.
@@gordonbgraham Don't get me wrong, Everett might still be proved one day to be a hack or a bigot, but at this point no one can prove him wrong. And unlike Chomsky, he is making claims that can be refuted; meanwhile Chomsky just argues that UG is about biology, makes no predictions, and is not a theory but a field of study, all of which can't be proven wrong.
@@impendio The lack of a concept of the past, the lack of number (other than "a little" or "a lot"), the absence of words for colour, and the apparent absence of recursion in Pirahã (something that would counter Chomsky's claim vis-à-vis recursion) seem to suggest language is not fixed in its conceptual properties. That language is biological certainly can't be refuted, but that doesn't address the issue of recursive grammar, which Chomsky is a proponent of.
I would absolutely LOVE an intro to linguistics series 👀
Oh man more computer science content please, I'm sure I'm not the only one who wants to see how you modeled words in Python. Plus there are probably a lot of computer science nerds in your audience :D
Computer science! Or at least computers pay the bills.
Yes, this is me. I like languages. I love computers. I love math.
If I hadn't luckily discovered my incompatibility with academia as an undergrad, I might've ended up somewhere in the confluence of those studying generative linguistics and/or NLP right around the time that LLMs began blowing up...
Eh. For the best.
yay for the computer science and linguistic theoretical crossovers
I was surprised, too. Language enthusiast, but also programmer and math nerd.
That.. Sure helped me to clarify some stuff about Chomsky's work that were hard to wrap my head around while I was doing my bachelor's. Thanks
I should probably stop wondering why I, a computer scientist, like linguistics
Part of UG is the idea that there is a certain set of universals, which are all either debunked (like the claim that all languages can form subclauses, which isn't true: some don't) or so vague that they can mean anything (like "every sentence has a subject", where each time you find one without, they'll say something like "well, it's not expressed" or point to random shit as the subject; one language was analyzed by two guys and they came to different conclusions about what the subject is).
I believe that the ability for language is innate; that doesn't make Chomskyan generative grammar true. It's a model, and as a smart guy once said, "All models are wrong, but some models are useful." Take that from a computer scientist who wrote a master's thesis on a very un-Chomskyan syntax theory (called Role and Reference Grammar).
The problem with UG is that it's totally unfalsifiable. It's not a scientific theory. Also Chomsky is a hack, but that's not strictly related.
There is no evidence in any field that the universal elements of human language are distinct from the definitional elements of language as a concept.
I find your criticism of null subjects to be very weak. UG never needed to have mandatory subjects; in fact the EPP was only proposed in the 1980s, as I assume you know. Even if every sentence in all languages we know of only had overt subjects, the EPP/obligatory subjects wouldn't HAVE to be part of UG. More recently, Chomsky (2013, 2015) even did away with the Extended Projection Principle.
Besides, I don't see how misidentifying the subject of a language has any bearing on UG. It doesn't even mean the language doesn't have subjects (whether overt or not).
@@amazingcaio4803 I did not know that the EPP was a later development. I'd never even heard the term, but I guess it's the same as obligatory subjects.
My point is that all the universals Chomsky proposed are refuted or irrefutable. And my broader point being that UG has a lot more baggage than the simple statement "Language is innate".
That said, I wasn't referring to pro-drop (like Spanish "te quiero") or dummy subjects (like English "it is raining"). Here I see that the subject is not expressed or is semantically empty. But what about the German impersonal passive ("Hier wird nicht gespielt", word for word: "Here is not played")?
And let me rephrase my other point: if two Western linguists analyze a very non-Western language and come to different conclusions about what the subject is, maybe "subject" is a Western concept that doesn't fit other languages.
The syntax theory I worked with tried very hard not to be Western-centered, and here is an article on why it is so cool: www.computerworld.com/article/2929085/blame-chomsky-for-non-speaking-ai.html
The article didn't age too well, and the Wikipedia article isn't too long, but it has some links.
@@twipameyer1210 I do get some of your points. I also dislike that some claims people make are so wild they're essentially unfalsifiable. Not only would this be unscientific, but it would also go against one of the central goals of UG - to restrict language so that it explains language acquisition.
There is indeed a lot more to UG than innateness. I don't even think innateness itself is controversial; language acquisition must ultimately be encoded in our genes. What's controversial is whether there is a domain-specific language faculty. While hypothesing its existence can easily lead to speculation, I do believe it is a worthy line of inquiry.
I don't think that nowadays 'subject' has a formal definition in Chomskyan generative grammar. At most it's generally used as a synonym for the 'specifier of T(ense)', which is a theory-internal concept. This is hardly a Western concept (the obligatoriness of subjects, OTOH, could be), and people doing generative grammar actually like studying other languages so that they can gather enough data to refine the theory of UG. Hungarian was partially used to support the DP (determiner phrase) hypothesis, and data from Chinese, Japanese, and Serbo-Croatian helped refine proper government and the Empty Category Principle. Chomsky also once proposed the configurational parameter based on data from Japanese. Even if the concept of subjects (and/or the idea that subjects are obligatory) were Western-specific and didn't apply to other languages, this would mean that our analysis is biased, but I still wouldn't think we should abandon the idea of a universal grammar, especially because that's incidental to UG (there's no a priori reason for our account of UG to include subjects).
Regarding subjects specifically, the general idea is that T must have a specifier, so the focus of inquiry isn't on the subject (read, the specifier of T), but on T itself (why does it need a specifier?). Most accounts don't really restrict the subject to a specific kind of element (any restriction is usually derivable from other principles or is lexically determined), so the subject could be a null pronoun, an expletive, or any other type of constituent (e.g. a locative, as in your example in German). Sure, looks can be deceiving and one could argue 'hier' isn't even the subject of your example sentence, based on empirical evidence or theory-internal reasons. That's why syntacticians may have different analyses. In fact, isn't that expected or even desired?
In the end, it doesn't mean UG is refuted or irrefutable. Many universals still seem to be true. One example is Lasnik and Saito's (1984) restrictions on syntactic Wh-movement. (Their account may be outdated, but the empirical phenomenon still holds true, as far as I know.)
Edit: BTW I personally have nothing against other theories of grammar and I don't deny their usefulness.
@@amazingcaio4803 Let me first thank you for this conversation! I enjoy it very much and think it's very fruitful, especially compared to other conversations on the internet.
Let me now reiterate my claim: all models are wrong, but some models are useful. UG is good at explaining some things; other theories explain or visualize other things better. Some Chomskyans will claim that UG is literally, objectively what happens in our brains, and straw-man non-UG linguists by saying they don't believe language is innate. That's what I felt the video did, and that's what I was attacking. I don't think you think so, so we're good.
And to lay my biases open: I'm more a computer scientist than a linguist. I took some basic linguistics lectures with Chomskyan professors and wrote my master's thesis in the computational linguistics department with a professor who hated Chomsky so much that he unironically claimed that language is just a cognitive ability like any other. My (Chomskyan) syntax professor was so fond of the idea that the subject is universal that I had the impression it was central to UG. I'm glad that apparently isn't the case. I find myself somewhere between these two worlds.
Most people know Chomsky as a political writer. His talks are so hard to get into that once, when he gave separate "politics" and "linguistics" talks at the University of Maryland in the 1990s, I joined a bunch of people for the linguistics talk, where the Q&A was all politics questions.
Chomsky's theories about universal grammar have stood up very well over the decades, but within linguistics there are camps that favor his theories and those who are critical. When I worked for AAAS and Science magazine in the late 1990s, I helped plan the annual conference. We always had to make sure that programs in the linguistics track represented both camps.
As a linguistics enthusiast and conlanger, the concept of universal grammar is interesting to me because I like thinking about the earliest limits of language development in prehistory: how did people go from single word utterances to tell each other about food, predators, or danger, to complete sentences? Is the process of stringing words together in a way that creates meaning something that evolved within our brains, or is it a social construct? How much of language is instinct, and how much is learned?
As a conlanger, I'm often amazed and delighted when my made-up words that I put together in a made-up sentence structure can still produce meaning, in some very interesting ways. If there wasn't some kind of basal syntax deep within our brains, conlanging would be very different, if not impossible.
One could also start to study Language/languages not only as they developed over millions of years among primates, but also as a mother tongue develops from the bond between mother and child, from the baby's perspective. In the last months in the womb the unborn child hears music, speech and other sounds, and is thus familiarized with the world of sound surrounding it. I have heard babies use sentence melodies and cadences, in both declamatory and question modes, when they could not yet talk or walk. They also recognize music, imitate melodies and entertain themselves. And I have heard babies take turns in interaction with their mother, using syllables of sounds and minimal pauses, as if they were words. Babies in such situations were very pleased with their sounds and the interaction they could bring about, ready to repeat the situation again and again. Patient mothers gently mold these sounds into (let's call it) a prelanguage, and then into the baby's mother tongue. Dads can do this too, if they are around enough. Often parents understand the baby, and later the toddler, when nobody else does. Many cute situations come up, of entertainment value for the adult too, but also misunderstandings, which are *meaningful* experiences, part of learning how rules regulate language use. These foundations could help explain why our languages are not as logical as binary sequences; instead there is a haphazardness to words and their forms within the sound system of the mother tongue (and the anatomy of toddlers).
But Chomskyan grammar (assuming we're talking about generative grammar, which is a grammatical theory, and not the pseudoscientific innate-syntax hypothesis frequently referred to as "universal grammar") has nothing to do with either how languages work in our brains or how they historically developed. Syntax is mainly a pathway to linearizing expressions that initially mirror our thinking, which is primarily associative in nature: thoughts don't have hierarchy, they don't have constituents, they don't even have any linear order (even though they are experienced linearly due to the nature of time); they only have concepts and relations between them, relations between sets of related concepts: those can be active, patientive, causative, instrumentative, spatial, temporal, etc., i.e. meaningful relations, not abstract-structural ones. Generative grammar is an abstract model that is supposed to predict, mostly from the structure of linguistic expressions, whether those expressions would be considered structurally possible within a given language; i.e. it is supposed to predict whether a person *perceiving* the text would consider it *structurally* inappropriate, and if they would, it is supposed to tell us the structural principle that was violated (there's a toy sketch of this recognizer idea below, after this comment). Generative grammar does not, and was never meant to, predict or explain the way syntactic structure is arrived at, either in use or in historical development.
Some structural patterns are obviously there, just as, for example, every walking animal moves its feet the same way as other animals of the same species and age, even though they are free to move any other way they are physically capable of; they just won't do that most of the time, as long as one of the millions of biomechanical alternatives is noticeably more comfortable in the long run. Animals just have places they are aiming to get to, and all they do is step, step, step, etc., without having any "overall way of moving feet" in mind. With languages it also seems that this structure of speech isn't something that is ever the goal, but something that emerges on its own, out of the systematicity characteristic of the expressive means of a language: of the lexical units and their various counterparts within the lexicon, of the means for expressing predicative relations, semantic modification, communicative relations, rhetorical relations and so on; the systematicity that is quite necessary to ensure the possibility of unambiguously reconstructing the initial conceptual content that led to the very structure that is observed (i.e. the systematicity whose loss would make it impossible to consistently "interpret" the resulting linguistic expressions).
It's not the structural principles that drive the development of speech complexity; those principles aren't there yet while speech has not reached any complexity: the language stays incapable of expressing it as long as the society that generates that language stays culturally incapable of the necessary verbal coherence. There are human languages that don't have any means of precisely expressing most kinds of relations between concepts, not even predicate-subject-object relations. It is cognitive development and cultural "fossilization" (roughly speaking, concept → myth → habit → tradition) that gradually lead to the development of the structure we observe and that we can describe using the notion of generative grammar.
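To make the "abstract model that predicts structural possibility" idea concrete, here is a minimal sketch in Python. The grammar and vocabulary are toys invented for illustration, not anyone's actual theory; the point is only that a generative grammar can be read as a recognizer that classifies word strings as derivable or not:

```python
# A toy context-free grammar used as a recognizer: it says whether a
# string of words is structurally possible, and nothing about how
# speakers actually produce or process it.

GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["saw"], ["slept"]],
}

def derives(symbol, words):
    """True if `symbol` can derive exactly the word sequence `words`."""
    if symbol not in GRAMMAR:               # terminal symbol
        return list(words) == [symbol]
    return any(matches(rhs, words) for rhs in GRAMMAR[symbol])

def matches(rhs, words):
    """True if the symbols in `rhs` can jointly derive `words`."""
    if not rhs:
        return not words
    first, rest = rhs[0], rhs[1:]
    # try every split point: first symbol takes a prefix, rest takes the rest
    return any(derives(first, words[:i]) and matches(rest, words[i:])
               for i in range(len(words) + 1))

print(derives("S", "the dog saw the cat".split()))  # True  (well-formed)
print(derives("S", "dog the saw cat the".split()))  # False (rejected)
```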
The linguist Guy Deutscher wrote a popular linguistics book all about how language development could have happened in prehistory: it is titled _The Unfolding of Language: The Evolution of Mankind's Greatest Invention_
Great video! In linguistics at Cambridge, we studied principles and parameters in intro to syntax in first year, and in later years it got completely overhauled by Minimalism and other approaches. Coming from a biological science background (I studied neurology and neuroanatomy) I've always been super sceptical of UG, and I only managed to reconcile what I know about cognitive science with what linguists from largely humanities and philosophical backgrounds refer to as "the language faculty" by describing UG as our innate bias for language acquisition in the vaguest way possible. The fundamental principle underlying syntax is recursion, which is what leads to hierarchical structures. I was taught that human language is recursive and that is what distinguishes it from other animal sounds. It's only after I graduated that I learned humans think and process things recursively, which raises the question of whether it's because of our recursive perception that we developed recursive language, or whether recursive language helped us perceive other things recursively. Super interesting stuff!
Also a Cantab; had the same experience here (though I came to these conclusions during undergrad). I've ended up working on paradigmatic morphology, which doesn't really fit such a compositional theory of language easily (contra Distributed Morphology and morphemic theory more generally), so I don't even really interact with Chomsky's work much these days. But my view is that Chomsky's work features (1) a confusion between description and explanation and (2) successive rounds of having less substantial things to say (e.g. Minimalism has a much-reduced language faculty relative to P&P and G&B).
Thank you!! I saw a grad student doing parse trees at the LGBT Center and learned more in a 5-minute discussion than I had understood from 8 years of Spanish education! I was only able to comprehend and fully parse a foreign language when I was given some basic operators for structuring how the modules worked together. After taking a syntax class with Beatrice Santorini, I picked up Japanese quite easily, and the sensei even commented on how rapidly I understood grammatical distinctions such as the は/が topic vs. subject marker, or how adjectives seemed to be either fundamentally noun-ish or akin to a verb for having that quality. They appeared to be either noun-ish + the copula word that conjugated tense, mood, aspect, negation, and conjunction, or they were almost quasi-verbs which then conjugated normally. That structure paralleled a similar verbifying structure that was "to do" + a noun: rugby → play rugby, study → to study, marriage → to marry/(get) married. Once we were shown adjectives as they conjugated for the past, the pattern jumped out at me: at some level there were nouns you "are like" and nouns you "do", or verbs. Which opened a whole backdoor to philosophy and all these meditations on what red is, whether qualities are of the object, and if not, what they "are". I never understood why we discussed that before then... Syntax changed my life (as I've mentioned before 😳)
I really like your approach. Please do make that course.
Very interested in the Intro to Linguistics class. I'm a computer software programmer with a ton of Python experience, and when I was in school in the 80s I took Intro to Linguistics, and Computers and Linguistics... and when I dropped out, my linguistics prof called me on the telephone (!) to ask if I was OK. I was to take her Transformational Grammar class, and I sorely miss not having done that.
I think that a nuanced approach to language learning is important. An overwhelming majority of languages are observed and not constructed. Therefore, the so-called "rules" are just someone's observations, plus their opinions on those observations. For example, as far as I know, the first academic mention of a "modal verb" was by Leonard Bloomfield in his 1933 book Language (thus making the idea of a "modal verb" itself less than 100 years old). However, what is "deemed" a modal verb was basically established by Quirk and Greenbaum in 1973. Since then, many have questioned whether there should be a finite list of modals, as Quirk and Greenbaum propose, or whether the broad definition of what a modal verb is leaves room for interpretation. Is "have to" considered a modal verb, since it sets out to do what other clearly defined modal verbs (like "must") already do? Does "need to" fall into that same category?
Now, what I mentioned might be a uniquely English problem, since we don't have an "English Academy" regulator. Still, even regulators such as the Académie française, while regulating the language, are doing it through the interpretation of its members.
as a biologist and linguist, i have difficulties accepting UG (although your python-related input is quite interesting). instead of language capacity (or language organ or else), there is rather something like language readiness, a synergy of different older neural networks, which took over the processing of language, but it has not evolved to be language-specific (although it may later evolve to be more efficient). the statistical learning approach seems to me to be a more plausible way the language works, as advocated by morten h. christiansen et al. it would be interesting to know where you stand on this approach 😁
Your research sounds promising. But I'm sure I'm not alone wondering what kind of PhD program trained biologists AND linguists.
Not my field. Just curious.
But Chomsky doesn't claim it evolved to be language-specific. His claim is actually very similar to yours. There was some specific evolutionary change at some point that allowed other neural functions to click together and express this latent capacity as language. As more and more people spoke and languages evolved and changed, statistically, more and more different possible arrangements of the computationally restricted possibilities of language emerged. Since language is a layer cake of different computational systems with various constraints (pragmatics on top of semantics on top of syntax on top of morphology on top of phonology, itself constrained by the physical limits of phonetics), you have an almost infinite possibility of configurations. It's quite an elegant and simple solution to the problem of the vaaaast diversity of languages that exists.
@@Robespierre-lIyou have to study them separately :-) although, i think it would be quite beneficial for linguists to combine their training with the biological one, especially in evolutionary biology, as there are plenty of shared fields of interest - e.g. population genetics. i started with biochemistry through molecular and system biology, and after some time, in fact, almost 20 yrs later, i went for general linguistics. my major interests in lingo are cognitive linguistics and sociolinguistics.
@@Eruntano42 from the biological point of view, the evolution of the language and the respective neural structures has to be more stepwise. e.g., arbib et al. hypothesise that there were at least seven distinguished steps (synergies of new neural networks) and, gradually, certain forms of language communication appeared, which were much simpler than contemporary human language but still helped to survive in more and more complicated human societies. the thesis of "pragmatics on top of semantics on top of syntax on top of morphology on top of phonology" is, according to this kind of study, not necessarily valid, e.g., pragmatics does not require syntax or morphology in the earliest assumed forms of language. it is still in the early development stage in all these directions, but what seems to be a sign of good science is that they can explain more than previous hypotheses.
The language acquisition literature is very much based on specific universal grammar mechanisms, and they are more productive than statistical learning.
Definitely interested in that intro course duder.
Thanks for your work Dr Jones. I tried to read Chomsky when I was a teenager in the 1970's. I got a bit lost, but I like your programming analogy. I have always been a language fan.
absolutely interested in the intro to linguistics for regular people class!
I would definitely watch an introduction to linguistics. Keep us posted!
As an English speaker I think I learned more about English grammar while learning French grammar than anywhere else. I may have forgotten whatever English grammar was actually covered in what our schools called the language arts class, but when the French teacher was talking about using the subjunctive with the past perfect in French, I know it was a revelation for my understanding of English. English grammar was always "that does not sound right" or "that sounds right" or even "that sounds better".
Per Goethe: “Wer fremde Sprachen nicht kennt, weiß nichts von seiner eigenen” - "Those who don't know a foreign language know nothing of their own."
When I studied Russian in school one of the best books I had was called English Grammar for Students of Russian. I think there are versions for several other languages as well.
@@joshrotenberg5872--- YES!!! I recommend that series for everyone who wants to learn a language covered by that series. There's even 'Spanish Grammar for Students of English' written in Spanish.
As someone old enough to have learned to read through phonics and had English grammar explicitly taught in grade school and later got a master's in teaching ESL, I can tell you that 2/3 to 3/4 of English/ESL teachers actually have a very poor grasp of both English grammar and English spelling.
Around 5:20-6:24 we get an analysis of how words are put together to make syntactic structures.
Does it matter that when learning a language, children parse phonemes, not words? The sound 'ed' indicates past in English, so children often say the ungrammatical sentence 'I wented'
Obviously this does not falsify Chomsky's theory (nothing can), but can UG account for how children can learn language when they don't hear separate words, but hear a string of continuous sound?
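For what it's worth, the segmentation question has a well-studied non-innatist answer on offer: the statistical-learning approach another commenter brings up below (transitional probabilities between syllables dip at word boundaries), made famous by the infant experiments of Saffran and colleagues. A toy Python sketch of the idea, with an invented three-word vocabulary standing in for continuous sound:

```python
# Toy word segmentation by transitional probability: within-word syllable
# transitions are reliable, across-word transitions are not, so we posit a
# boundary wherever the probability dips. The "language" here is made up.

from collections import Counter
import random

random.seed(0)
words = ["badi", "kupa", "tigo"]                 # hypothetical words
stream = [random.choice(words) for _ in range(300)]
syllables = [w[i:i + 2] for w in stream for i in range(0, len(w), 2)]

pairs = Counter(zip(syllables, syllables[1:]))
firsts = Counter(syllables[:-1])

def trans_prob(a, b):
    """P(next syllable is b | current syllable is a)."""
    return pairs[(a, b)] / firsts[a]

segmented, current = [], syllables[0]
for a, b in zip(syllables, syllables[1:]):
    if trans_prob(a, b) < 0.5:      # unreliable transition: word boundary
        segmented.append(current)
        current = b
    else:                           # reliable transition: same word
        current += b
segmented.append(current)           # flush the final word

print(segmented[:8])                # recovers badi / kupa / tigo
```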
Awesome discussion! Many thanks, from a guy who majored in Computer Science and German.
These videos are so, so good. Your channel is a breath of fresh air in the sea of bold claims which is language youtube.
As a computer scientist myself (although not doing any computational linguistics), I too am interested in the intro to linguistics course. I am fairly good at English grammar and have a lot of professional writing experience (not just technical writing, but communicative/persuasive writing and creative writing as well), and love what you do here. I too see the overlap with Comp Sci, and with my love of computer language theory, I often talk about similarities and differences between computer languages and spoken languages with programming students. Thanks for what you do, and I look forward to more.
We called it “generative grammar” when I was in grad school.
6:06, I think he's having a stroke
I appreciate this exposition of the issue of UG, and agree with most of it. Here are two points where I have some reservations.
I don't think it's simply misunderstanding or a proliferation of straw men. There's a deeper philosophical aspect to the dispute between (some) anti-Chomskyans and UG types. The issue, which Chomsky explicitly takes a position on in (I think) Knowledge of Language, is the old nature vs. nurture dispute, aka Plato vs. Aristotle. Generative(TM) Linguistics isn't only about the insight that something like rules generates output. It's a specific stance that that capacity is hard-wired in a language module or modules, as opposed to simply being the outgrowth of general intelligence understood as an overall learning capacity. The idea of innate knowledge is deeply offensive to some people, who connect it to the worst excesses of evolutionary psychology and even social Darwinism. That seems like a stretch to me, but the whole issue gets interfered with by the sociology of the field, branding (as you point out), and toxic personalities.
Another thing I would want to modulate a bit is your colorful characterizations of sociolinguists, a tribe I'm a member of. Essentially, since Tony Kroch's arrival at U Penn (a place I think you're familiar with) Labov, G. Sankoff et al. have been much more sympathetic to a UG-type approach (although without the term) than a lot of others. In fact, the field seems split on the subject of an innate/modular cognition approach and the functionalist/connectionist strand. There are a number of researchers who do variationist analyses along with generative syntactic ones.
The way I heard it, people kept finding cases in languages that didn't fit the model, and the model had to be generalized until it was meaningless. The most famous case was Pirahã not using recursive structures at all: no clauses, no noun phrases within noun phrases. Kinda like parsing an assembly program where every sentence/line of code says only one thing.
Exactly. Keep on adding epicycles until it fits.
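The recursion point in the comment above is easy to make concrete. A purely illustrative Python sketch, with toy rules that are not a model of Pirahã or of any actual analysis of it: a rule set where NP can contain another NP generates unboundedly deep phrases, while a flat rule set says exactly one thing per phrase:

```python
# Self-embedding vs. flat rules. The recursive grammar can nest phrases
# to arbitrary depth; the flat one never can, like the one-statement-per-
# line assembly program in the comment above.

import random
random.seed(3)

RECURSIVE = {
    "NP": [["the", "N"], ["the", "N", "of", "NP"]],   # NP inside NP
    "N":  [["friend"], ["brother"], ["teacher"]],
}
FLAT = {
    "NP": [["the", "N"]],                             # no self-embedding
    "N":  [["friend"], ["brother"], ["teacher"]],
}

def generate(grammar, symbol):
    if symbol not in grammar:                         # terminal word
        return [symbol]
    expansion = random.choice(grammar[symbol])
    return [w for part in expansion for w in generate(grammar, part)]

print(" ".join(generate(RECURSIVE, "NP")))
# e.g. "the brother of the friend of the teacher" (arbitrary depth)
print(" ".join(generate(FLAT, "NP")))
# always exactly "the <noun>"
```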
"downloaded nasals in a sleep cycle" 😂
I’d be interested to know what you think about Daniel Everett’s work on the Pirahã language and his assertion that it does not have recursion (Chomsky’s essential property of language)
UG is very interesting, but I disagree with Chomsky on many of his other opinions outside of linguistics haha. Super interested in the course you're putting together here on UA-cam. Really looking forward to watching it!
OMG, huge nostalgia for when I took UG at UT Austin. I loved it so much. Phonology, not so much (in fact, I almost changed majors because I just thought I was DUMB). But phonetics?? OMG, IPA is heaven to my eyes and ears. Yes, Intro to Linguistics, por favor!
Thanks for this video eloquently laying out the beauty and truth of Chomsky's syntactic theories. He's still very much swimming against the current despite his popularity in his own field of study
My Phil of Lang and linguistics professors at the University of Kansas were MIT Chomskyans. Thanks for giving this major breakthrough in cognitive linguistics a fair shake.
I did an undergraduate degree in linguistics at a school that required all students to study a semester's worth of Minimalism. Got absolutely clobbered by it, failed it, had to take it twice. Years later, I went back to school to do an undergraduate computer science degree at a school that required all students to do a semester of computability theory. Imagine my surprise when good ol' Noam pops up with his binary trees and context free grammars! Never before in my life had I wanted to write my old Syntax professor a Christmas card.
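For anyone who hasn't sat through both courses: it really is the same machinery. A minimal sketch of the compilers version, with a toy arithmetic grammar invented here, turning a token stream into a binary syntax tree by recursive descent:

```python
# Recursive-descent parsing of a toy grammar:
#   Expr -> Term (("+" | "-") Term)*
#   Term -> NUMBER
# The result is a binary tree, just like the trees in a syntax problem
# set, only with numbers instead of noun phrases.

def parse(tokens):
    pos = 0

    def term():
        nonlocal pos
        tok = tokens[pos]; pos += 1
        return ("num", int(tok))

    def expr():
        nonlocal pos
        node = term()
        while pos < len(tokens) and tokens[pos] in "+-":
            op = tokens[pos]; pos += 1
            node = (op, node, term())    # left subtree, right subtree
        return node

    return expr()

print(parse("1 + 2 - 3".split()))
# ('-', ('+', ('num', 1), ('num', 2)), ('num', 3))
```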
I was a ling major at Maryland in the mid-80s. The first course after baby-Ling ("this is a phoneme") was basically a semester on the evidence and motivation for UG. It was a VERY Chomskyan school. My professors always pronounced Noam to rhyme with Rome :-) I have no idea if that was right, since they had pretty strong accents of their own! And then in my computer science classes (dual major), there was Chomsky again, of course.
You can’t escape!
'Gnome Chomsky' made me laugh, but familiarity does tend to make names sound more like plausible native words. Not a lot of native English vocab with an o-a transition :)
@@rasmusn.e.m1064--- I don't think I've heard his name pronounced any other way than 'gnome'. Then again we're talking grad school linguistics in the early '90's.
@@ak5659 I'm not a native English speaker, but I typically pronounce his first name in English something like [ˈnɔ͡ʊ̆ˌəm], which I perceive as my realisation of /ˈnɔ͡ʊ̆ˌæm/ but I can see why the final schwa might easily get reduced in this particular constellation of //First name_Surname//, where the surname tends to get a lot of stress.
8:34 i'm interested!
0:40 Sums up Chomsky pretty well in general really.
Definitely interested in the intro linguistics course.
I would LOVE an intro to linguistics from you!!
Are you still working on it or is it available somewhere?
Finally an entertaining and concise linguistics video that delves into the various aspects of theory. I'd appreciate more of the sort
what are your thoughts on the work done by Adele E. Goldberg, Michael Tomasello and others into usage-based / construction-based theory of language?
Your vids are great, thank you. I would like to see a more detailed explanation of "Universal Grammar". What is it, where's its place in development of linguistics, and what's the controversy?
I'm interested in the intro to linguistics idea!
I am most definitely interested in that intro to linguistics class
my teacher quickly went over UG in my linguistics I class the other day, I wrote down what she said so that I could look it up later (because "wtf? no?") and you just saved me! wonderful video!
Interested in intro to linguistics
I admire you for weaving computer science and computability into your discussion. I love all your videos.
The story about how you started to reinvent computational context-free grammars sounds like exactly the sort of rabbit hole I'd fall into 😆. I could totally see myself doing something like that. This video is right up my alley and honestly, so is this whole channel
I'm a computer scientist halfway through an introductory university course on linguistics as well as a language philosophy class.
There is a person for every niche youtube video. I guess I am the one for this video. 😂
I always thought grammar was like a programming language, and I wished there was full documentation of every language (or is there something like that?).
Most language-learning books aren't really precise in their explanations of grammar. In Japanese, for example, most books tell you that there are na-adjectives (but technically these are special nouns), or that the particle wa indicates the subject (which is also not really true).
And then the teachers are confused about why I make strange sentences.
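The na-adjective point is easy to show in code. A rough sketch with a tiny invented lexicon and deliberately simplified romanized morphology: if na-adjectives are modeled as noun-like words that lean on the copula for tense, and i-adjectives as verb-like words that conjugate themselves, the "two kinds of adjective" fall out of two ordinary word classes:

```python
# Two word classes instead of two kinds of "adjective".
NA_ADJ = {"shizuka": "quiet", "kirei": "pretty"}  # noun-ish: need the copula
I_ADJ  = {"takai": "tall",   "samui": "cold"}     # verb-ish: self-conjugating

def past(word):
    if word in NA_ADJ:
        return word + " datta"        # the copula carries the tense
    if word in I_ADJ:
        return word[:-1] + "katta"    # -i becomes -katta on the word itself
    raise ValueError(f"not in this toy lexicon: {word}")

print(past("shizuka"))   # shizuka datta ("was quiet")
print(past("takai"))     # takakatta     ("was tall")
```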
This is certainly an interesting presentation on universal grammar, but there are many reasons why people disagree with its tenets (besides the representation of how sociolinguists criticize the "ideal speaker" concept). Ewa Dąbrowska has a really good paper that outlines criticisms towards universal grammar, I would really recommend the read to anyone who's interested in some more solid criticism of generative theory.
Dąbrowska, E. (2015). What exactly is Universal Grammar, and has anyone seen it? Frontiers in Psychology, 6.
Looking forward to reading it, and who knows? Maybe even making a video
The main problem is UG selling itself as the sole theory that does all of this!! Like, parameters exist in other theories; they're just conceptualized differently, and the claim about how we get them is different. Or when you talk about the generation of sentences: Chomsky reacted to Behaviorism and Skinner, rightfully. But in the process he somehow seems to negate that things like dependency grammar existed (also successfully used in NLP, without needing to posit that these grammatical relations are innate!). UG was a novel way of computationally defining sentence structure, but it was not the first, by a long shot, to introduce bloody "hierarchical structure"!!!
Things like "unergatives" or "unaccusatives" are the work of typologists, many of whom are not necessarily into UG, nor do they *need* UG to explain what an unergative does!! It's not impossible to describe all of what you did in construction grammar or, better yet, dependency grammar. You can analyze grammatical structures in dependency grammar (and it's been done since at least the 1930s) without needing to posit innateness. All of the "other" theories make assumptions about hierarchical structures!
What people take issue with is not the theory of generating sentences in itself but everything around it: the "language module" claim (no real evidence and in some descriptions unfalsifiable); the poverty of stimulus claim; the one genetic mutation claim; UG generally making too much of a distinction between syntactic structure and everything else, UG essentially making hierarchical structure "their" thing rather than accepting that other, even older, theories clearly have that too, etc.
The main problem is that someone thinks they must explain similarities through innateness and the existence of a language module rather than the fact that language uses a combination of domain-general skills that are common to humans and, therefore, lead to humans communicating in similar ways but also with many, many differences like one language having accusative alignment and the other having crazy stuff like split-ergativity.
I really like your content, so this is not coming from an angry position and I'm sorry for the length, but:
One, while it's true that UG has transformed to mean way less than it initially did, there's still plenty of people who do assume way more to be hardwired than is plausible. In language acquisition, there are people who literally assume that we have to have an innate predisposition for "determiners" or else we couldn't learn them. And they'll fight anyone who claims that this is possibly not very reasonable.
UG, unfortunately, has many researchers who like to come up with claims that are essentially unfalsifiable, like the poverty of stimulus, or that merge must be one single genetic mutation, or that there is a specific language module (the last two might be one and the same, depending on decade of publication).
Second, if Chomsky ever claimed that children produce linguistic *structures* they had never heard, we need to define what a structure is. If it's supposed to mean something like the passive construction, then no, that's not what's happening. A child is able to generalize properties from some structures and use them with parts that did not appear in those structures in the input. Now, the big fight concerns how this is possible.
Third, nobody who does construction grammar assumes a lack of hierarchies, yet UG people like to claim they do (another straw man). Nobody working in non-UG assumes that UG literally says "all languages essentially have the same underlying structure."
In this case, despite you having good intentions, it seems almost like a straw man of the straw man counter-positions.
What people *do* doubt is that linguistic abilities necessarily require a "language module" as in a particular skill that is in situ *just* for language, given the fact that i) many things can be explained by domain general abilities; and ii) other species have certain features as well and some of those, especially pertaining to pro-social communicative units suggest complex structures. So the main issue with Chomsky's (et al.) claim is that there is an assumption of a mutation that makes us able to have merge and that merge is what makes up the narrow faculty of language.
Four, the computation example you mention around 3:30 is a perfect example of what happens in a lot of reporting on UG results. You rightfully say that it is perfectly computable to say a sentence backwards to express something like a question. True (see the toy sketch after this comment). But is that actually easily computed in natural human interactions? No. And that has to do with our *general* cognitive abilities and *decidedly not* with a language module, yet here it's used as "evidence" for regularities in the world's languages? We know a ton about how much we can generally keep in our memory stack for sentence production, so that is a constraint, and obviously the other person also has to be able to parse the sentence; another constraint. And there are many more. Besides having a marker like prosody or an extra word or moving words around, there is very little that could be done without requiring extreme adaptation. So seeing these "similarities" as evidence is reaching. Or the mention of unergatives or unaccusatives as somehow relating to UG, as if they wouldn't be equally well described in other theoretical frameworks.
Generally, I tend to follow the principle that I don't care if someone has a UG background or not, unless it affects how they debate results. So interesting observations will always be interesting. But what frequently happens is this derailing into "and UG is the only way to explain this" which is unnecessary.
Lastly, while this entire debate might be unknown outside of linguistics, you must at least acknowledge that this "crazy" idea is the dominant one in US linguistics.
PS: I don't know when you studied linguistics, but the computer-phobic comment seems a bit dated. I'm a computational neuroscientist (coming from a physics undergrad, so I had to learn a lot of new stuff) and currently work in a cognitive science department, with about 80% of my work involving linguists, and I teach computational approaches to language perception as well as coding and statistics classes. But even in programming, to write the code, you have to think about how to encode the rules first. It's not like computation solves itself.
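On the "say the sentence backwards" point above: it fits in three lines of Python, which is rather the point. The rule is trivially computable, so whatever rules it out of every natural language has to come from somewhere else (memory, parsing, structure-dependence, or a language module, depending on who you ask):

```python
def mirror_question(sentence):
    """A perfectly computable, completely unattested 'question rule'."""
    return " ".join(reversed(sentence.split())) + "?"

print(mirror_question("the dog can see the cat"))
# cat the see can dog the?   (no human language forms questions this way)
```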
Great video thanks! What would you recommend for someone who dabbles in linguistics, but needs a nice overview of the main ideas and concepts in language learning and acquisition? A nice, readable book for example? Asking for a friend.
Needless to say: yes to the intro in linguistics
Extremely interested in the linguistics class!
UG provided plenty of employment for people who today would migrate into software engineering. It’s very appealing to anyone with an analytical mind. I loved it when I first encountered it 30 years ago. Unfortunately it is a theory, and to cope with real languages it ends up becoming horribly complex. I’m glad you referred to adding another epicycle, that’s exactly where I was heading. Archeoanthropology suggests that language might be a million years old. We know that a language that was spoken 6,000 years ago lacked certain conceptual structures that exist in modern languages. And we can see how language evolves over time, and make educated guesses about the evolution of grammatical structures. My guess is that the brain has a relatively simple architecture for processing language, and that the complexity in modern languages has evolved culturally. That generalised neural networks can produce language is surely proof that great complexity is not needed, just an absurdly large number of neurons.
This brings back memories of the "language wars" at MIT in the late 60s and early 70s. I had taken classes in AI from Marvin Minsky and Seymour Papert. My problem with formal linguistics was that it didn't make sense from a computational point of view, and it didn't deal with intrinsic ambiguity. It was very interesting to see how effective syntax emerged, so I took psycholinguistics. One interesting example was a click test (clicks in one ear, language in the other). For example, it showed that the subjunctive was in decline. Also, the meaning is not in the sound waves but in the context. Generating language doesn't teach much; understanding and acting in ongoing contexts is more interesting. Just language mixed still sense maketh.
This misunderstanding has been grinding my gears for ages. Thanks!
As CS grad (so I've had a lot of academic contact with Chomsky's work) with a weird interest in linguistics, this video was great
thank you so much I needed this video! please, do the intro!
I think the most important deep theoretical split is between people who believe natural languages are the way they are primarily because of quickly degenerating *competencies* of very young humans (LAD), or because of cognitive *weaknesses* in those same young brains, which languages evolved to more effectively colonize (the coevolution theory prominently associated with Terence Deacon).
Referring to the LAD, do you mean that the older a kid is when first exposed to language, the less likely he is to acquire native-speaker fluency? If yes, I taught people in that situation years before Language Deprivation Syndrome became a thing.
Please excuse the awkward wording. Grad school was a looooong time ago.
I'm interested in the intro to linguistics. Thanks!
I love your work and would be ecstatic about a Course for Ordinary People.
Btw, definitely interested in the intro to linguistics!
yes would love to see the intro to linguistics
Decades and decades ago, I was equally skeptical of deep grammar and the like, but the Army was sending me to language school. Learning well was the ticket to not ending up in the infantry... So I picked up Bickerton's book _Language & Species_... It ended up being a lifesaver. Possibly literally... I walked into Russian fully understanding and just intuiting case structure, feeling my way through what all my buddies sweated over for hours a day. Serendipitous... Not saying it would have helped with any language... By the time I went back for Arabic, it didn't help a bit. But, if you're learning Russian...
Aaaaa, intersectional disciplines uniting their fields to produce new forms of understanding; it's modern-day magic. It is really cool to see there's room for programmers to work in the field too!
Thanks for these videos, and Chag Pesach Sameach to you and yours!
Very interested in the Intro to Linguistics!
I'd be interested in the course!
As a non-practicing mathematician, I find this extremely fascinating. Thank you.
When you hear the phrase "I saw the man" but parse it as a present-tense verb performed with a noun that is a homonym of the verb... you know, like in a magic act (or a horror movie).
I remember two textbooks that I used to learn Japanese: Tae Kim and Gakushudo. Both of them teach you set "sentence patterns" with "variables" that you can substitute with other words, e.g. [Thing]NI[Action], where you can simply substitute [Thing] with other nouns and [Action] with other verbs to say that you did some activity there. But they never give you a tree diagram that splits a sentence into SubjectPhrase and VerbPhrase. Instead they teach you that the verb is the core of the sentence, while all other things like Topic/Subject/Object/Location/Origin/Destination etc. are just additional, optional information.
Is Tae Kim's approach the same as Chomsky's universal grammar? generative grammar? dependency grammar?
I'm trying to build every sentence structure with minimal vocab... I didn't know where to start...
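If it helps: the textbook "sentence pattern" approach is basically template substitution, which is a perfectly workable place to start in code. A toy sketch, romanized and simplified, with pattern names invented here:

```python
# Flat patterns with substitutable slots, in the style of the textbooks
# described above, rather than a SubjectPhrase/VerbPhrase tree.

PATTERNS = {
    # the "[Thing] NI [Action]" pattern, e.g. motion toward a place
    "to_place": "{place} ni {action}",
    # a topic-comment pattern: "[Topic] WA [Comment]"
    "topic":    "{topic} wa {comment}",
}

def build(pattern, **slots):
    return PATTERNS[pattern].format(**slots)

print(build("to_place", place="gakkou", action="iku"))  # "go to school"
print(build("topic", topic="neko", comment="kawaii"))   # "the cat is cute"
```

Add more patterns and bigger word lists, pick randomly from both, and you have a drill generator with minimal vocab.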
I would be interested in linguistics for regular people, even if I'm not sure I'd fit in the regular-people category. I get the idea of that meaning of "ideal": it's similar to the "ideal" in "ideal gas law".
I see your point, and that is why comparative grammar works 😅
I'd love to watch the intro to linguistics!
I have a background in linguistics already and I still would be interested in your intro to linguistics for normal people.
When's the intro to linguistics coming out?
Very interesting - thanks.
1. I don't believe that there is a universal grammar, but our minds are able to form a grammar. The difference is the subject: mind (whatever it is) vs. grammar.
2. It is so easy to form a sentence with an unfamiliar word. I wonder if you know the word sepulka, but I bet you already have some idea about sepulling and the sepulkarium.
3. My own attempts to program something that could build sentences ended with the impossibility of teaching the machine the notion of "I", except by hard-coding it, which is not interesting. Ideally, I'd ask the machine "who are you?" and it would reply "I am Parrot", and if I asked "who am I?", it would reply "You are the Creator".
Funny coincidence. I just started learning German, and am in the middle of a coding spurt to generate very simple sentences. Using Python, too, to start, although I plan on transitioning to Javascript soon enough and present everything with nice colour clues in a browser.
I've written all the classes you said you started out with, and have parsed and wrangled a lot of case tables from Wiktionary to have some data. There's flexibility in all the categories across which nouns inflect, and the ability to toggle their representations into pronouns. I haven't read any syntax books, though, and don't plan on expanding what I have beyond very simple sentences that can get my feel for noun genders and cases off the ground. It feels like overkill to make a complete representation of German grammar just for this. So I won't. I know I'd soon give up if I tried.
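For anyone wanting to try the same thing, here's a guess at what such a class might look like; emphatically my own sketch, not the commenter's actual code, with invented class and field names wrapped around a real declension table for "Hund":

```python
from dataclasses import dataclass

@dataclass
class Noun:
    gender: str   # "m", "f", or "n"
    forms: dict   # (case, number) -> surface form, e.g. from Wiktionary

    def inflect(self, case, number="sg"):
        return self.forms[(case, number)]

hund = Noun(gender="m", forms={
    ("nom", "sg"): "der Hund",   ("acc", "sg"): "den Hund",
    ("dat", "sg"): "dem Hund",   ("gen", "sg"): "des Hundes",
    ("nom", "pl"): "die Hunde",  ("acc", "pl"): "die Hunde",
    ("dat", "pl"): "den Hunden", ("gen", "pl"): "der Hunde",
})

# the "toggle into a pronoun" idea, masculine singular only for brevity
PRONOUNS = {("m", "nom"): "er", ("m", "acc"): "ihn", ("m", "dat"): "ihm"}

def mention(noun, case, as_pronoun=False):
    if as_pronoun:
        return PRONOUNS[(noun.gender, case)]
    return noun.inflect(case)

print(mention(hund, "acc"))                   # den Hund
print(mention(hund, "acc", as_pronoun=True))  # ihn
```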
"barbush" In the comments
I got no sound in the preview. Anyone else have this bug?
Thank you very much for this video.
But where CAN I memorize my common words in randomly generated sentences? Because now I'm really interested in that.
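One answer: roll your own in a few lines of Python. The grammar frame and word lists below are placeholders; swap in the vocabulary you're actually drilling:

```python
import random

VOCAB = {
    "N":   ["dog", "teacher", "idea"],
    "V":   ["sees", "describes", "ignores"],
    "Det": ["the", "every", "some"],
}
RULES = {"S": ["Det N V Det N"]}   # one flat frame; add more as needed

def sentence():
    frame = random.choice(RULES["S"]).split()
    return " ".join(random.choice(VOCAB[sym]) for sym in frame) + "."

for _ in range(3):
    print(sentence())   # e.g. "every idea ignores the teacher."
```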