Where did language come from? The question’s not as old as you may think, and the answer’s not as clear. There are all sorts of theories as to how and why humans developed language and the cognitive capacities that enable its use, and thus far there’s been little to recommend any one postulate over the others. The publication of More than Nature Needs, though, represents a breakthrough that author Derek Bickerton details below.
-----
This year marks the ninetieth anniversary of the Linguistic Society of America (LSA), the country’s largest association of professional linguists. The occasion was, naturally, celebrated with a commemorative symposium at the Society’s annual meeting, held January 2-5 in Minneapolis. Two issues were conspicuously absent: how language evolved, and where (if anywhere!) linguistics was going.
Granted, few linguists seemed even aware of language evolution until 1990, when Steven Pinker and Paul Bloom’s seminal paper “Natural Language and Natural Selection” and my book Language and Species were published. Since then, dozens of books and hundreds of articles on the topic have appeared. Yet a substantial majority of them still come from writers in other disciplines, and perhaps that’s the problem. Thinking about how language could have evolved entails dealing with data from several other sciences, and linguistics, balanced uneasily between the sciences and the humanities, has from its birth been a somewhat cloistered discipline.
The Linguistic Society of Paris got things off to a bad start in 1866 by banning discussion of language origins from its proceedings. Since then, things have been made worse by the way schools and universities teach linguistics. In America, a large majority of linguistics courses are open only to graduate students. Consequently, while most college-educated people have had significant exposure to physics or biology, linguistics remains terra incognita to all but a few. To the layman, “linguist” means merely someone proficient in several languages; people at cocktail parties have no notion that asking a linguist “How many languages do you speak?” is like asking a garage mechanic “How many cars do you own?”
For devotees of the Delphic injunction “Know thyself” (to be read, of course, as “Know what it means to be human” rather than “Know what it means to be Fred or Freda Smith”), these facts are tragic. A large part, maybe even the most crucial part, of being human is having language. Without it, how could we have science or religion, laws or codes of morality? Yet most books and articles on human evolution hardly mention language, and most books and articles on the evolution of language make no attempt to integrate it into the story of human evolution. As a result, we know more about the origins of the solar system, or even the cosmos, than we do about how we got language and all our other cognitive capacities. There are theories around, of course. But for most issues in science there’s one theory that most people accept: the Nebular Hypothesis for the solar system, the Big Bang for the cosmos. For human language and cognition, theories are legion and all over the map, and it’s hard to see what would make any one of them preferable to the others.
“Your theory of language evolution depends on your theory of language,” said linguist Ray Jackendoff, pinpointing yet another problem. For the last half-century, linguistics has been a battleground between nativists (people who believed that the grammatical structures of language were somehow imprinted on the human brain) and empiricists (people who believed that those structures had to be learned socially and inductively). It’s like those movie fights where one guy delivers what looks like a knockout blow, but no, the other’s up again and returns a blow of his own, with similar results. The only difference is that in the movies there’s a hero and a villain, and you know which is which.
In science, as opposed to the movies, you know there’s something wrong when a fight like this goes on for decades, with both sides delivering what look like knockout punches but neither one getting knocked out. That can’t happen in real life unless people are missing something. Neuroscientist Terrence Deacon saw the problem clearly: “Language is too complex and systematic, and our capacity to acquire it is too facile, to be adequately explained by cultural use and general learning alone. But the process of evolution is too convoluted and adventitious to have produced this complex phenomenon by lucky mutation or the genetic internalization of language behavior.”
Yet until my latest book More than Nature Needs appeared, no one had ever suggested a synthesis that would merge the empiricist thesis with the nativist antithesis. The best nativist punch was the ease, speed, and universality with which children learn language: how could they do that before they could tie their shoelaces, unless their heads contained a Universal Grammar that somehow spelled out the structure of all languages? The best empiricist punch was the diversity of the world’s six-thousand-plus languages, not just in their words (everyone agrees those have to be learned) but also in their structures: too varied, surely, for any kind of inborn mechanism to produce.
But suppose that just a few basic components of structure were hard-wired while the rest had to be learned. Then children would have, as it were, a starter kit: as soon as they had learned enough words, they could put them together in a fairly intelligible way, thereby communicating with grown-ups and getting what they wanted. And whenever what they said differed from the way grown-ups said it (and every parent has heard this happen!), they would gradually correct themselves, piece by piece.
Remember, the second thing missing from the LSA symposium was the future of linguistics. Was that because linguists think things will go on more or less the same way as before? As regards their subject matter, they’re probably right. There’s a lot of brouhaha at the moment about how tweeting, texting, and netspeak are changing language. In the appropriate media, such novelties as “lol” and “wtf” are timesavers (and time is what everyone’s short of nowadays), but they’re parasitic on language rather than changes to it. If, like the unfortunate writer of a condolence letter who thought “lol” meant “lots of love,” you don’t understand how they’re derived from plain English, you don’t understand them either. As for Twitter, I think that’s a positive force: it teaches people to say more in fewer words, a lesson most of us need.
But if linguists thought their discipline would continue with no end to the nativist-empiricist slugfest, they could now be wrong. Let’s wait ten years and see what the LSA centenary symposium will have to say.