Stephen Wilson 0:06
Welcome to Episode 21 of the Language Neuroscience Podcast. I'm Stephen Wilson, and I'm a neuroscientist at Vanderbilt University Medical Center in Nashville, Tennessee. This is a podcast about the scientific study of language and the brain. My guest today should need no introduction. Noam Chomsky is Institute Professor and Emeritus Professor of Linguistics at MIT, and since 2017, Laureate Professor of Linguistics at the University of Arizona. As you all surely know, Professor Chomsky is the founder of the field of generative linguistics, and has been the most prominent advocate of the view that language is a biological entity and should be studied as such, a view that I certainly share. He's one of the most influential and highly cited scholars in the world. I'm extremely grateful that he took the time to join me for a conversation on the podcast. My goal in this conversation is to draw Noam out as to his opinions on language and the brain, and to explore his views on the relationship between generative linguistics and the neuroscience of language. Did I succeed? I think so, up to a point. I'll let you be the judge. In any case, it's a fun conversation and an honor to be joined on the podcast by a living legend. Hi, Noam, how are you today?

Noam Chomsky 1:13
Doing fine.

Stephen Wilson 1:15
Thank you very much for joining me. I really appreciate it. And where are you at the moment? Are you in Tucson or no?

Noam Chomsky 1:23
Ah, right now, I happen to be in Brazil.

Stephen Wilson 1:25
Oh, really?

Noam Chomsky 1:26
But the internet doesn't care.

Stephen Wilson 1:29
What are you up to there?

Noam Chomsky 1:32
Visiting my wife's family. She's Brazilian.

Stephen Wilson 1:35
Uh huh. And um, so, I know that you moved to the University of Arizona, I think about six or seven years ago, which was just around the time that I'd moved on from there. How's that been for you?

Noam Chomsky 1:48
It's doing fine. We like it.

Stephen Wilson 1:50
Do you spend all your time there? Or do you just kind of come in and out at different times?

Noam Chomsky 1:54
All the time. Actually, this is the first time I've been out of the house for about two years.

Stephen Wilson 2:01
Wow!

Noam Chomsky 2:02
We've been isolated because of the pandemic.

Stephen Wilson 2:05
Do you have a nice place there?

Noam Chomsky 2:07
Yeah, we have a place out right beyond the city limits, Oro Valley. Pretty empty area, so it's quite pleasant.

Stephen Wilson 2:19
No, Oro Valley is very beautiful. You kind of have that view of the Catalinas there, right?

Noam Chomsky 2:25
Yeah, view of the Catalinas. We could see when there was a wildfire right across from Oracle Road, almost getting to us.

Stephen Wilson 2:36
Ah, yeah. There was that bad one a couple of years ago, on the mountain there. Yeah. But it didn't make it to your house.

Noam Chomsky 2:44
It didn't cross the road. It was close. If it had crossed the road, we'd probably have had to evacuate.

Stephen Wilson 2:51
Yeah, that's terrible. Well, I'm glad you could join me from Brazil. So, on the podcast, I'd like to start by asking people how they became the kind of scientists they are. And I know that you've written about your personal history a bit, but most people probably haven't read that. So I'm just going to ask you. Were you interested in languages as a kid?

Noam Chomsky 3:16
Not particularly. I mean, I happened to be interested in historical Semitic linguistics, but it was sort of an accident, and partly it was because my father was a Semitic scholar.
And when I was about 12 years old, I guess, I read his doctoral dissertation on a medieval Hebrew grammarian, which had a lot of historical Semitic linguistics in it, but it was just a curiosity. It was not a special interest.

Stephen Wilson 3:53
And what languages were spoken in your household? Because I know that both your parents were immigrants from Eastern Europe. Did you all speak English, or a variety?

Noam Chomsky 4:02
My parents' native language was Yiddish, but they would never let us hear it. I did study Hebrew from early childhood. I was fairly fluent in that.

Stephen Wilson 4:02
Uh huh. But they didn't, they just spoke English to you in the home, huh?

Noam Chomsky 4:24
Only English.

Stephen Wilson 4:25
All right.

Noam Chomsky 4:26
And they were fluent in English. My father, who came when he was 17, spoke with an accent. My mother came when she was a baby, so she was native.

Stephen Wilson 4:38
Oh okay, they came pretty young. So you were kind of probably, you know, surrounded by English but learning Hebrew and also hearing Yiddish, or at least, I mean, I bet you heard it from time to time. Were you immediately struck by the commonalities between these languages? So I know that you've written, and I'm going to quote: 'English is not Swahili, at least not quite. A rational Martian scientist would probably find the variation rather superficial, concluding that there was one human language with minor variants.' Did that observation strike you immediately as you started to learn these languages?

Noam Chomsky 5:11
Well, I read that observation when, actually, I think that's probably my observation.

Stephen Wilson 5:18
Yeah, it's your observation.

Noam Chomsky 5:20
It was much later. At the time when I started studying linguistics, late 40s, at college, the standard view was that languages could vary almost without limit, that there was very little that could be said in general about them, and that each language had to be studied on its own without prejudice from other languages. I think pretty much the same view was held at the time in biology with regard to organisms. A leading biologist said that they're all so different that almost anything could be a possible organism, so you have to study each one alone. That has been shown to be completely wrong about organisms. They are actually restricted to very narrow types. That's one of the greatest discoveries of the last 30, 40 years, deep homologies and so on. And since the Cambrian explosion, very limited types of organisms. And I think we're finding the same about languages. The more you learn about them, the more you find that the differences are really quite superficial. And in a certain sense, that almost has to be true, from the fact that a language is picked up by a child with almost no evidence, and very quickly. It was known 50 years ago that there are some problems about language acquisition. Now we know they're far more severe. It's been shown experimentally that the two to three year old child has basically the essentials of the language. And it's also been shown by just careful statistical analysis that the data is very sparse. And since children aren't adapted to one or another language, any child can pick up any language with equal facility. So putting all this together, it leads to the conclusion that languages must be fundamentally the same.
Stephen Wilson 7:28
So did you reach that conclusion by following that logic that you've just outlined, rather than from your own personal observations of similarities between the languages that you were familiar with or learning as a student?

Noam Chomsky 7:58
That is the reason why we expect it to be true. In fact, it's hard to imagine alternatives, given the general facts just mentioned. But then comes the task of trying to show that it is true, and that requires detailed studies of particular languages. And by now, there's extensive research showing that even languages that look on the surface radically different are fundamentally the same in their basic structure. There's a load of quite sophisticated work on that.

Stephen Wilson 8:37
Absolutely. So, you know, I think that at the time that you encountered the field of linguistics, I understand you were studying with Zellig Harris, and that was your first encounter with the field. At that time, people weren't really thinking about language as a biological property of the species in the way that you've advocated. How did you come around to that point of view? Was that in those early years, when you were first encountering the field?

Noam Chomsky 9:05
Well, I was studying structural linguistics, philosophy of language and so on, and there was an orthodoxy. Part of it was that languages could vary without limit. With regard to acquisition, the orthodoxy, to quote Leonard Bloomfield, the leading American linguist, generally accepted, was, as he put it, that language is a matter of habit and training. Move over to philosophy of language: the leading figure in the United States was W. V. O. Quine, who I happened to be studying with at the time. His view was that language is a complex of dispositions to respond to stimuli, established, he added, by conditioning, he thought operant conditioning. Those were the general views. There were a couple of us who just didn't believe any of it. It didn't make any sense. We were actually three grad students at Harvard. One was Morris Halle, my old friend, who I worked with for 70 years. Another was Eric Lenneberg, who went on to found the modern biology of language. We were just a small group by ourselves; none of this made any sense. So we all worked in a different direction. I mean, it just seemed obvious that language is a trait of an individual. It's a biological phenomenon, and it can't possibly be learned by habit and training; you get nowhere that way. Conditioning, even less so. It gets you nowhere. And now, that's all been pretty well established.

Stephen Wilson 11:01
You've made the point.

Noam Chomsky 11:04
At the time it was quite a heterodox position. We were essentially out on our own.

Stephen Wilson 11:28
Yeah, so...

Noam Chomsky 11:10
The ultimate question was also the basic approach of structural linguistics, which made sense under their assumptions and was procedural: you develop a system of analytic procedures which you can apply to any body of data, and it will yield the items, their arrangement, their organization, and so on. Harris's Methods in Structural Linguistics was the most sophisticated development of this idea, and that was actually my introduction to the field as an undergraduate. But, um, that's what got me interested in the field, actually. But I thought it was completely wrong. You can't get anywhere that way.

Stephen Wilson 12:02
That's fascinating. So, you know, as far as I understand it, you encountered Harris by serendipity.
And obviously, you know, took a lot from his work as you developed your early, you know, generative grammar. Whereas Quine, you chose to work with, evidently, but you thought that he was pretty much barking up the wrong tree. Like, did you know, when you chose to, you know, go to Harvard and work with Quine, that you were going to disagree with him already? Or did you kind of realize that once you got there?

Noam Chomsky 12:35
Well, frankly, I met Harris through separate connections, common political interests. At the time, I was 16, I was very bored with college, thinking of dropping out. He was a very brilliant person, very charismatic, lots of interests in all areas. And he suggested to me that I start taking his graduate courses. I had no preparation, but I did that. And I also started taking grad courses in other fields, in philosophy and mathematics, and basically had no background, but there were very fine faculty who didn't care much and were willing to tolerate my ignorance and let me just go ahead. And so I had a kind of eclectic undergraduate experience with almost no coursework of any kind. When it came to it, I had to do an undergraduate thesis. So Harris suggested that I do a structural grammar of Hebrew, which I knew pretty well. So I started doing it in the prescribed fashion: found an informant, went through the field methods, got answers. And after a couple of weeks, I realized this is ridiculous. Either I know all the answers or I don't care about them, because they're all in areas, like phonetics, that I wasn't interested in. So I just dropped it and started to do what seemed sensible: write what's now called a generative grammar. It didn't exist at the time. Basically, I had studied logic, so I tried to develop a recursive procedure that enumerates the infinitely many sentences of the language in their structures, and just started on that. And, it's interesting, I don't know if anybody ever read it, but it went on to become what became generative grammar.

Stephen Wilson 14:54
Yeah, that's fascinating. So, I talked on my podcast last year to Pim Levelt, who you probably know, you know, the Dutch psycholinguist. And in his view, the field of the scientific study of language basically has four pillars: theoretical linguistics, language acquisition, psycholinguistics, and the neuroscience of language. So you've written extensively about the relationships between those four parts of the field. But your technical endeavors have been very much in the first branch. Was that always going to be the case, or could that have been different? Like, were those just the kinds of questions that were most interesting to you, or could it have gone another way?

Noam Chomsky 15:36
They're all linked. I mean, you can't study acquisition of language without some conception of what language is, just like you can't study growth of an organism without knowing what the structure of the organism is. So, if you want to study acquisition of language, you have to begin with some conception of what it is that's acquired. You may change the conception in the course of your work. Of course, that's the way science works. But the core question is, what's the nature of the object, and everything starts with that. And after that, they all fall together. Like, language acquisition becomes the study of how scattered stimuli yield the language competence that a two or three year old child has.
And this work in generative grammar did stimulate child language studies, particularly those of the late Lila Gleitman, who was one of the leaders, and many others. Out of that has come the understanding, to everyone's surprise, that a child knows far more than it can exhibit. A child who's producing two-word expressions is understanding complex sentences, and already knows complex, fundamental properties of language, and it continues that way. We find that most of our knowledge is completely unconscious, inaccessible to consciousness. You have to study it from the outside, the way you study, say, the nature of vision or something like that. And as far as comparative linguistics is concerned, that gets an entirely new framework when you approach it this way. You have to ask, what are the respects in which languages share structure, and where are the apparent differences located? And by now, I think, research is converging on the conclusion, in my view, that the fundamental structure of language is almost completely shared, maybe completely shared, and that the variations are localized in a particular component of the linguistic system, which is a superficial component. I mean, basically, what we call a language has two fundamentally different parts. There's an internal part, a computational procedure that generates the infinitely many structures available to us, each of which pretty much constitutes a thought, or is very close to that. So it's basically a thought-generating system, which is, incidentally, a traditional view that was held pretty much until the 20th century, and that, I think, turns out to be correct. So, an internal thought-generating system. Then there's an ancillary system, which maps the structures onto the sensorimotor system, usually speech, but we now know it could be sign, could even be touch. And it's in this ancillary system, it seems, that the variation lies, which is not surprising, because that's the system that the child has direct information about. The child has almost no information about the internal system; it doesn't appear in the data available. So that presumably just comes from the inside; it's the nature of the organism. With the ancillary system, the child does hear sounds and the linear order of words and so on. So you'd expect possible variation there, and that also seems to be the locus of the apparent complexity of language. The internal system, we're beginning to learn, seems to be based on very simple computational principles. I should make it clear, this is not a common view; I'm giving you my own view, which for the last 70 years has been way at the fringe of the field and still is. So I'm talking for myself, not for the field. But it seems to me that we are finding that the internal system is remarkably simple as a computational system, maybe even almost perfect. That should not surprise us either, on evolutionary grounds. Humans, of course, are a very recent species. A new species. A couple of hundred thousand years is nothing. There is apparently no variation in linguistic capacity among humans. As far as we know, any child in any culture can pick up any language with equal facility. Well, that tells us something. We have genomic evidence that humans began to separate maybe on the order of roughly 150,000 years ago, something in that neighborhood. Humans didn't appear much before that. Two to three hundred thousand years, that's nothing in evolutionary time.
Prior to the appearance of Homo sapiens, there's almost no indication in the archaeological record of any serious, sophisticated symbolic activity, a couple of scratches on a bone or something like that. Not long after humans appear, again in evolutionary time, you get extremely rich evidence of complex symbolic activity. Well, put all that together, and it suggests that language, which is the source of this, probably emerged pretty much along with Homo sapiens. Which would mean it ought to be quite simple. Anything that emerged pretty suddenly, not after millions of years of evolutionary tinkering, ought to be a simple system.

Stephen Wilson 22:39
Right.

Noam Chomsky 22:40
So I think, just from general considerations, we would anticipate that there's a dual system. An internal system for generating infinitely many thoughts, basically a system of thought, which is common to humans, and is quite simple. And then an ancillary system, which is occasionally used to externalize the internal system, usually in sound. And that can be done in many ways, so you expect variation. That's the prior anticipation; then comes the task of trying to show it. That's the scientific...

Stephen Wilson 23:25
Right. So, in that view, a lot of the complexity arises from the many different ways in which you can linearize these internal structures that are not inherently linear, right?

Noam Chomsky 23:38
That are not linear. In fact, we have striking evidence for that. There's by now overwhelming evidence that as early as you can begin to test children, actually, it's down to 17 months in some of the tests, but certainly by two years old or so, children already ignore 100 percent of what they hear and attend only to what their minds construct. It's a property called structure dependence. The operations of language operate on structures, not on linear-order strings. Linear order is disregarded. And it's disregarded from the earliest moment, which means that it must be built in, and we expect it must be there for some fundamental computational reason. And we have good evidence about what it is: we take the simplest computational operation that will yield a digital infinity of expressions. It's essentially binary set formation; you can't get simpler than that. It's what's called Merge in the current literature. It turns out that that yields, among other things, the consequence that you have only structures and no strings, and it yields many other consequences. But that's the heart of work in the last 10 or 20 years, which, in my view at least, is beginning to converge on what we know must be true. But it is hard to show.

Stephen Wilson 25:21
Yeah. Cool. So, you know, it's clear that the generative program that you've described has had profound influences on the study of language acquisition, as you've just been discussing, also psycholinguistics and the neuroscience of language, which is my primary interest. But it seems to me that the influence has been mostly unidirectional. Do you agree with that?

Noam Chomsky 25:51
Well, in a way that has to be true. I mean, you can't study the neurology of vision if you don't know what vision is, at least some guidelines; otherwise you don't know what to look for. And the same is true of neurolinguistics, which, incidentally, is a much harder area, for a number of reasons. The number one reason is you just can't do invasive experiments.
There are lots of experiments you can dream up, you know, raising children in isolated environments, sneaking electrodes into Broca's area, and all kinds of things, but you just can't do them. The other problem is that human language is unique. We learn about vision with invasive experiments on cats and monkeys. But there's no other organism that has anything remotely like language. So you're stuck. There's no comparative evidence, and you can't do the experiments, so you have to be much more sophisticated. But nevertheless, there are some interesting results. One of the most interesting actually has to do with what we were just talking about, structure dependence. There's a clever experiment designed by my friend and colleague Andrea Moro, the Italian linguist, carried out in Milan, at the research centers there, and then replicated elsewhere in the world. But essentially, the design of the experiment is to take subjects, say German-speaking subjects for example, and give them two kinds of invented material. One kind is an invented language that's modeled on an existing language that they don't know, say, maybe Italian or something. The second set of experimental materials is an invented language that violates principles of language, violates structure dependence, for example, and then you test and see what happens. Well, basically, what happens is, when they get an invented language modeled on an actual language, you get normal behavior in the language-based centers, Broca's and Wernicke's areas. When you give them a language that violates the principle of structure dependence, you get diffuse activity over the brain, which means basically they're treating it as a puzzle. And that's probably the most striking result in neurolinguistics so far. There's other work, David...

Stephen Wilson 29:00
I know, just one moment. So, the paper that you just described is Musso et al., 2003. I'm gonna link it in the podcast notes for anybody that wants to read up further. And so yeah, you're saying you like David Poeppel's work too?

Noam Chomsky 29:17
David Poeppel has done work which has, for example, found neural correlates of the structures in the mind, phrase structure and so on, which, again, is interesting work. And by now, you can identify neural signatures of different kinds of deviation from grammatical expressions; syntactic and semantic ones differ in their neural signatures. And there are other things...

Stephen Wilson 29:45
You're alluding to Helen Neville's work there, do you think? You're alluding to Helen Neville's ERP study?

Noam Chomsky 30:04
Helen Neville was the person who did this. Yeah.

Stephen Wilson 30:07
Yeah.

Noam Chomsky 30:08
She was using ERP, not the fancy imaging.

Stephen Wilson 30:14
So, you know, it seems like a lot of this is, kind of, we are using neuroscience methods and seeing the reflection of these concepts that we know from generative linguistics to be true. And like you said, I mean, I guess if that's all it is, then there wouldn't be any influence back on linguistics. Is that all that neurolinguistics can do for you, to kind of show evidence that the brain respects some of these structures that you've motivated independently, or could there be more?

Noam Chomsky 30:45
In principle, I mean, in the study of the nature of language, there are certain conditions that have to be met. One condition is acquisition.
The nature of language must be such that it can be acquired under the actual empirical conditions of acquisition: very rapidly, from sparse data. It has to meet the evolutionary condition: the nature of language must be such that it could have evolved, and apparently in a very brief period. And another condition it has to meet is that it has to fit the brain. Whatever the structure of the brain is, language had to conform to it. So if we ever learn enough about the structure of the brain, it might impose conditions, maybe important conditions, on the study of just the basic nature of language. That hasn't really happened so far. But that's not a criticism, as I see it. Neurolinguistics is an extremely difficult problem. Your hands are tied; you can't do the experiments. There's no comparative evidence. So it is difficult.

Stephen Wilson 32:03
Yeah, I mean, I agree, as somebody for whom that's my profession. It's not super easy. But, you know, I feel like language is the best thing to study as a human neuroscientist, because anything else can be studied better in animals, right? Like, if I was a visual neuroscientist, then I'd be like, why am I scanning humans when I could be sticking electrodes in the brains of macaques? But with language, I'm justified in working with humans, which is what I want to do, because it's the only way to study it.

Noam Chomsky 32:33
Well, the study of the macaque brain has given some interesting, strong suggestions. That's especially Friederici's work, comparing macaque and human brains. She's found that in the human brain, there are certain clues to dorsal and ventral circuits between the two major language areas, and they develop in infancy; their myelination takes place in infancy. And in the macaque brain, it never closes. And she also has evidence that as the myelination increases in early infancy, it seems to correlate with linguistic capacities. Well, all of that together suggests what might be the evolutionary origin of language, or at least a large part of it: some simple closing of a circuit between two brain areas. It's oversimplifying, of course, but that's the rough idea. And as it happens, she is using macaques.

Stephen Wilson 33:55
Yeah, there is some nice work by Rilling that I think you're probably familiar with too, on that comparison between humans and macaques and, you know, the relative growth of the arcuate fasciculus in humans. So, let's talk about, so, you know, I kind of asked you about whether these other pillars of the field could influence linguistics. Let's talk about the influence of linguistics on these other pillars of the field. As you've said, you know, you need to know what language is before you can study the acquisition of it, or the processing of it, or the neural basis of it, and I completely agree. But I guess when I think about linguistics and the neuroscience of language, I have, like, what I want to call a Goldilocks view, which is that you need to have the right amount of linguistics, and I think you're going to disagree, I'm sure you're gonna disagree with this one, but I'm just gonna tell you my view. It's that brain and language researchers who know nothing about linguistics don't tend to do very good work, because they can't formulate good questions, they design stimuli that don't get at their questions, it just doesn't go well.
But I also think that people who know too much about linguistics don't tend to do the greatest brain and language studies. They tend to be always investigating really esoteric concepts from the latest trending theory, which is probably going to be abandoned in a few years anyway. And the brain doesn't seem to care about those subtle manipulations. So I feel like you want to know just the right amount of linguistics: enough about language to sort of know that it's a biological entity, and more or less how it works, but you don't want to take any of it too seriously. You kind of want to study the big-picture things that we can actually get at with the brain, things like nouns versus verbs, you know, the brain cares about that difference. Grammar versus lexicon, the brain cares about that. What do you think?

Noam Chomsky 35:50
Well, it's kind of like saying, I want to study the neurology of vision, but I don't care what vision is. It's not going to get anywhere. You can study all kinds of things, find all kinds of data, but whether the data means anything depends on whether it fits into some kind of theory of what the actual phenomenon is. So take, say, Stephen Kosslyn's recent work on the neurology of vision. This is guided by his discoveries in experimental psychology of how image creation develops in the brain. Is it sequential? Is it simultaneous? And so on. Well, that information guides the neurology of vision, but those experimental studies on vision were guided by knowledge of what the visual system is. That way, you know what to look for. I mean, that's just the way science proceeds. There's no point getting data if you don't know what you're looking for. You can get all kinds of data; it may mean something, it may not. Data doesn't become evidence until it's evidence for something. And evidence for something means there's a theoretical framework in the background. Maybe not totally fleshed out; in fact, the experiments should contribute to fleshing out or maybe modifying the framework. But experiments without a theoretical framework are just blind.

Stephen Wilson 37:40
I mean, I do agree. I mean, that's why I kind of said Goldilocks. I mean, obviously, I'm not advocating that you should go poking around in the brain trying to learn about brain and language without knowing anything about how language works. But take, let's say, what you brought up earlier, structure dependence. I mean, you know, that is obviously something that's absolutely fundamental to language, that no serious linguist would dispute, at least I hope not. And that's the kind of fact about language that I think everybody needs to know, and is the kind of thing that should be investigated in a neural context. If you take something more subtle, like, you know, the distinction between head movement and XP movement, to quote something that actually has been discussed in the literature as having potential neural correlates, I think when you get into those kinds of fine distinctions, you're getting a bit too fine. Like, you're kind of going to the sort of theory-specific distinctions that might not turn out to be real cuts in the system. Do you know what I mean?

Noam Chomsky 38:45
Well, it might. And in fact, there are even suspicions to that effect.
So for example, for a long time in the study of the nature of language, it was assumed that there are two basic kinds of operations. One kind of operation just puts things together: you have 'read' and 'books' and you can form 'read books'; you form a bigger unit from smaller ones. The second is the displacement operation. So if I say, 'What did you read?', the 'what' is understood as the object of 'read', even though it's not there. The underlying structure has got to be something like 'read what', and it comes out 'what... read'. Those operations, displacement included, are ubiquitous in language. And displacement always involves what's called reconstruction: you interpret it from the point from which it was displaced. Well, in recent years, we have good reason to believe that these are basically the same operation. And here, to turn to your point, it would be interesting to know if there's a neural basis for the apparent distinction between displacement and combination. And then there are open questions, very open, as to the nature of displacement. Is it actually displacement of an element? Or is it some interpretive mechanism, which takes the presented thing and finds the point at which you interpret it, in the original position? Or is it both? These are open questions, and they have a lot of linguistic consequences. And it's conceivable that neurolinguistic evidence could shed some light on this. There are many questions.

Stephen Wilson 39:25
Yeah. But I agree. I mean, that's very interesting, that you're making a different fundamental distinction now. But you know, where does that leave, you know, Yosef Grodzinsky, whose paper I think you realize I'm alluding to, claiming that people with aphasia had a selective deficit for XP movement and not head movement? Like, where does that leave him, when there's reportedly been empirical evidence produced, but now the concept that it was based on has gone by the wayside?

Noam Chomsky 41:25
Actually, Grodzinsky's work is a good illustration of theory at work. He started with the observation from linguistic studies that there are, as he put it, what are called traces: points in an utterance which refer to something you don't hear there, but you hear somewhere else. Like, 'What did you read?' So there's a trace position in the object of 'read', and he found evidence that aphasics differ in how they are capable of finding these positions, which couldn't have... In fact, there's even further work which shows that there are different kinds of these positions. So, there's the kind in 'What did you read?', and the kind in 'Who tried to win?'. The subject of 'win' is not there, but it turns out it's a different kind of empty element than the raised kind in 'What did you read?'. And there are some neural correlates of that; my colleague Tom Bever has done some of the main work on that. And all of this work can very well proceed, but on the basis of knowing what you're looking for.

Stephen Wilson 42:56
But I guess I don't...

Noam Chomsky 42:58
At least some guideline of what you're looking for; maybe you don't even need to fill out the details. So, like, Grodzinsky's work does fill out details on what the nature of these trace positions is. So there's an interplay, but again, this work, like everything, is theory guided; it's within the framework of an explanatory theory. Otherwise, there's no way to proceed.
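To make the combination-versus-displacement point above concrete, here is a minimal illustrative sketch, not part of the conversation itself; the function names and toy examples are invented for illustration. It treats Merge as the binary set formation Chomsky describes earlier, so the objects it builds are unordered structures rather than strings, and it shows how combining 'read' with 'books' and displacing 'what' in 'What did you read?' can come out as the same operation, applied either to two separate elements or to an element already contained in the structure being built.

```python
# Illustrative sketch only: Merge as binary set formation.
# frozensets are unordered, so the objects built here are structures, not strings.

def merge(x, y):
    """The single structure-building operation: form the two-membered set {x, y}."""
    return frozenset([x, y])

def contains(structure, item):
    """True if item occurs anywhere inside the (possibly nested) structure."""
    if structure == item:
        return True
    if isinstance(structure, frozenset):
        return any(contains(member, item) for member in structure)
    return False

# "External" Merge: combine two separate elements, 'read' and 'books', into {read, books}.
vp = merge("read", "books")

# A toy structure for 'you read what' (glossing over all the real detail):
clause = merge("you", merge("read", "what"))

# "Internal" Merge: 'what' is already contained in the clause, so merging it with the
# clause again is the same operation applied to one of its own parts -- displacement.
question = merge("what", clause)

assert contains(clause, "what")   # the displaced element originates inside the structure
print(question)                   # a nested set; printout order is arbitrary, since sets have no linear order
```

On this toy picture, 'reconstruction' is just the fact that the lower occurrence of 'what' is still there inside the structure to be interpreted; none of this is meant as more than a cartoon of the idea being discussed.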
Stephen Wilson 43:28
Yeah, but I mean, if the two kinds of movement are not seen as being fundamentally different anymore, what do you make of... I guess, I mean, I would say, first of all, that I don't think the empirical claim is true, that people with aphasia have a selective deficit for processing XP movement versus head movement. But, you know, whether you believe me about that or not is kind of maybe beside the point. I mean, what about the fact that this claim has been made based on a theoretical construct that isn't a part of the theory anymore, because you're saying now that those two kinds of movement are actually not fundamentally distinct?

Noam Chomsky 44:07
Well, take XP movement and head movement. There is a long linguistic tradition about this, distinguishing them. But it all overlooks a crucial theoretical point. There is no head movement. Head movement can't be formulated in any existing or suggested theoretical framework. It's used all the time, but mostly intuitively. When you look into its properties, it violates all the properties of movement, so it really can't be movement. That raises the question of what it is, and there's some recent work on that. Actually, I have a paper on it that just came out a couple of months ago, which suggests a different approach to the phenomenon that's called head movement, but it's not movement; it has different causes. Well, here it's conceivable that neurolinguistic evidence could have a bearing. You might ask whether what we call head movement has the neural signatures of XP movement or not, or whether it's more like the quasi-morphological properties that I'm talking about. This is a point right at the edge of current research, where neurolinguistic evidence could, in principle, provide some crucial materials. And things like that happen all the time at the edge, at the point at which you're trying to push inquiry further. There are always questions of this kind, open questions for which there's linguistic evidence; if you could find psycholinguistic evidence or neurolinguistic evidence, all to the good.

Stephen Wilson 46:00
Yeah. Yeah. Okay, I see where you're coming from. I'd like to, if you have a few more moments, just run through a couple of things that I think we've learned about the brain basis of language in the last 50-odd years, and ask you to talk about how they fit with your notions of modularity, especially. So, one of the things that I think we've learned, at least, I believe it, is that only a pretty small fraction of the brain is really critically involved in language, right? And, you know, we've talked about Broca's area and Wernicke's area and the connections between them, and I know that your colleagues in the neuro world, like Angela Friederici and so on, probably have the same view as me on this. We're talking about a subset of regions, and they're, you know, pretty much mostly in the left hemisphere. Certainly, from a practical point of view, one of the things that I do in my job is map out language areas for pre-surgical patients. And then, you know, the surgeons want to know, where are the language areas that I can't take out? You know, they don't care about the rest of the brain, they'll take out chunks here and there, everywhere, but they won't take out language areas. So, you know, from a practical point of view, it does seem that it is discrete.
Now, do you like that finding from the point of view of your sort of language organ concept? Or would you have been just as happy if the language organ turned out to be spatially distributed across the whole brain? How do you feel about the way things have gone with the evidence?

Noam Chomsky 47:34
It's a very tricky question. One thing we have to recognize is a fundamental limitation on the kinds of experiments that can be carried out. All the experiments, everything we've mentioned, is based on the study of performance. You're studying production and reception of language, same in the psycholinguistic area. But production and reception are different from knowledge. That's the distinction that Aristotle made, which was forgotten for 2500 years and revived with the development of the theory of computation in the 20th century. There is a difference, in Aristotle's terms, between possession of knowledge and use of knowledge. So take structure dependence. That's a principle that we possess; it's possession of knowledge. You find in performance that you adhere to it. But we can only study the performance, and the neurolinguistic studies are all studies of what parts of the brain are involved, and how they are involved, in performance. Nobody has any idea how to deal with the coding of the knowledge that's internal. That's x. So let's just take ordinary common sense. Nobody speaks English and understands German. So the speaking and the hearing are accessing the same internal system of knowledge, and the locus of that internal system, and the way it is encoded in the brain, are total mysteries. You don't even know how to investigate it. The knowledge is obviously coded somehow, but how, and how in such a way that it can be accessed by both the processes of production and the processes of reception? If you look at, say, Levelt's work on production, which you mentioned, he's done very extensive work on production, but only on the external aspect of production. Production actually involves several things. Like, if I produce a sentence now, the first thing that happens is my brain has to pick, out of the infinite array of possible thoughts, one thought that I'm going to express. That's the first step. The second step, once you've picked it out, is to carry out the mechanical procedure of turning it into a sound. Levelt's work deals with the second part. Nobody knows how to deal with the first one. How do you explore the infinite array of possible expressions and select one and say, that's my thought? I mean, it obviously happens, we're doing it all the time, and it's instantaneous. But no one has a clue as to how that proceeds.

Stephen Wilson 51:04
So, the fact that you can damage any part of the brain apart from so-called eloquent cortex without causing any apparent disruption to the ability to do all the things that make us think that people have this knowledge, i.e., they can still do grammaticality judgment tasks, they can still, you know, produce language and comprehend language, all these things for which we require that central system, doesn't that entail that the central system must be at least localized to those so-called language areas, even if we don't understand how it's coded there? Do we know at least that it must be there, that that's the part of the brain that's critical for that?

Noam Chomsky 51:45
It's got to be there, but it's very mysterious.

Stephen Wilson 51:51
Yeah, and beyond that, yeah, we don't know too much.
Noam Chomsky 51:54
Abstractly, you can construct it: a generative grammar is a theory of the knowledge; it's not a theory of the production. That was a distinction that was missed for millennia, literally. Take a look at the traditional work on how sentences are used; the study of language was always about use. So there are famous comments of Wilhelm von Humboldt about how somehow language makes infinite use of finite means. But notice, he was talking about use, not the generation from finite means. That is a different process. In fact, we know the distinction clearly now, basically from the mathematical theory of computation, which is the theory of generation, not production. If you're doing, say, metamathematics, the study of proof theory for arithmetic, you're not studying the process by which a mathematician produces a proof; you're studying the generation of an infinite number of geometrical objects which count as proofs. The mathematician has to somehow find his way to one; that's making a good proof. But it's different from what's in the head. The axiom system, maybe on paper, maybe in the head, is just yielding an infinite number of these things; then comes the task of finding one. You don't, you know, prove a theorem by starting with the axioms and getting to the next line, and the next line; you'd never get a proof that way. But that's what the generation system does. These are just different things. They're clear in the mathematical case, but they're the same, basically, in the linguistic case. So the generative grammar is not a theory of production, or of reception; it's neutral between them. It's just a system of knowledge that is accessed by the more or less reflex act of perception, and by the mechanical act of producing once you've picked the thought. That's the hard part.

Stephen Wilson 54:24
Yeah, but doesn't it give you any pleasure to know that we kind of know where it is? Or is that just not important anymore? Or maybe that was never the point; it's really about the nature of the computations.

Noam Chomsky 54:38
Actually, there are somewhat similar questions that arise with regard to voluntary action generally. Before I raise my finger, let's say, like this, somehow something inside was a decision to raise my finger. We can't say anything about it.

Stephen Wilson 55:05
Yeah, but, I mean, when you study the motor system, you're kind of studying the representations of the effectors. But when you study the language network, you're not. I mean, you know, there are speech production areas that are distinct from language areas, and there are speech perception areas that are distinct from language areas, and language areas, you know, they're their own thing. And actually, we don't have that for the motor system. Like you just said, we don't have a sort of central motor generator that we can identify and say, hey, don't take that out, because the person won't be able to think of any movement to make. I kind of think we're in slightly better shape than the motor folks in that sense.

Noam Chomsky 55:43
It's a very mysterious area, because you don't really have any good ideas as to how to investigate it. With these other things we've been talking about, like, what's the difference between head movement and XP movement at the neurological level, at least you have some idea what you're talking about, and you can think of ways of testing and experimenting with it.
But when you get to the question of how a system of knowledge is stored, abstracted from performance, we don't really know how to proceed.

Stephen Wilson 56:26
Yeah, I agree. Okay. If I can just ask one more thing, and then it's going to connect back to your friend Eric Lenneberg, who you mentioned earlier. This is just another kind of really fascinating fact about the neural organization of language that I think has become a lot clearer in recent years. So, obviously, you guys were all working together back in the 50s and 60s, and you know his early work suggesting that if a kid takes the kind of left hemisphere damage that would cause a devastating permanent aphasia in an adult, if that happens in a very young child, it does not result in a devastating permanent aphasia. Rather, you get essentially normal language, and your friend Lenneberg speculated that that was because language was growing in the right hemisphere. And I would say that in the last few years, with imaging, especially the work of Elissa Newport and her colleagues in Georgetown, that supposition has been abundantly confirmed. And moreover, we've learned that there are really only two possible ways language can be organized in the brain: it has a particular characteristic layout in the left hemisphere, or you can have the exact mirror opposite in the right hemisphere, but you don't get other random sets of language areas. You know, it's either one or the other. So I guess my question is, you know, again, is that kind of the way you would have thought a language organ would function, that it had to be somewhere, but it could be somewhere else, with only two possibilities? I mean, that's kind of interesting, don't you think?

Noam Chomsky 57:59
Well, as far as the general study of language is concerned, it doesn't say anything. You could very well have an image of the language system; the left-brain system could have an image in the right brain. We know the brain is pretty plastic. And it could very well be that if you damage the primary language areas in the left cortex, the right cortex could simply take over. It wouldn't be a surprise from what's known about the study of brain plasticity; I mean, even with sensory organs, this happens. But the study of language in abstraction from the brain just doesn't say anything about that. It could be whatever happens, happens, and it might tell you something. Eric Lenneberg's work found quite a remarkable variety of different circumstances in which language either is retained or is selectively damaged. I mean, some of the most remarkable are these microcephalic dwarfs with virtually no cortex, but rich language capacity. As far as I know, that's still pretty mysterious, right? But he also found a variety of dissociations between language capacity and other cognitive capacities, and there's been a great deal of work on that since. Susan Curtiss, at San Diego, is one of the main people who's worked on that. And it really seems to be a highly specialized capacity. Of course, it draws from, shares properties with, many others, especially in performance. But then there are a lot of, currently, quite unique ones.

Stephen Wilson 1:00:20
Yeah.

Noam Chomsky 1:00:20
It's a surprising phenomenon. Of course, we are a very strange species in many respects.
Stephen Wilson 1:00:29
No, I've definitely come around to the view that language is very distinct from all other aspects of cognition, and interacts with them only, you know, for production and perception. But I do think that there's really strong evidence that there's a core system that's distinct, as you have argued.

Noam Chomsky 1:00:50
It's just hard to imagine an alternative.

Stephen Wilson 1:00:54
Well, I mean, others in the field have certainly been able to imagine alternatives.

Noam Chomsky 1:01:02
There are lots of things we know that we have no awareness that we know, and they can't even be drawn out. I mean, look, I have to leave, no time to give examples. But you can find cases of things that people know, but they have no idea that they know: sentences that are multiply ambiguous, but people can't figure out the ambiguities unless you give them special hints, and so on and so forth. It's there, but you can't find it.

Stephen Wilson 1:01:33
Yeah. Yeah, I share that view. Well, it was really great talking to you. Thanks so much for making the time, and I know that the listeners of the podcast are really gonna appreciate hearing from you.

Noam Chomsky 1:01:50
Good to talk. Sorry, gotta run off. Work.

Stephen Wilson 1:01:52
Yeah, no, absolutely. Good luck with it and take care.

Noam Chomsky 1:01:58
Yep. Bye.

Stephen Wilson 1:01:58
Bye. Okay, well, that's it for Episode 21. Thanks to Noam for joining me on the podcast. And thank you all for listening. I've linked some key papers in the show notes and on the podcast website at langneurosci.org/podcast. I'd like to thank the journal Neurobiology of Language for supporting transcription of today's episode, and I'd like to thank Marcia Petyt for transcribing the conversation. See you next time.