Grammar as a Language System

Language is a faculty residing in the human brain. Language understanding and generation occur in Wernicke’s area and Broca’s area, respectively. In the cognitive process, a thought is formed into a sentence by means of the mental grammar that we have acquired since childhood. Although we don’t know how language is wired into the network of neurons or what it might look like, we can still approximate the shape of the mental grammar as a set of structural rules.

In linguistics, we define grammar as a set of structural rules governing the construction of sounds (phonology), words (morphology), and phrases, clauses, and sentences (syntax) in natural language. (A toy example of such rules is sketched after the list below.) There are three types of grammar with respect to how it is used:

  • Mental grammar: the real grammar inside the brain
  • Descriptive grammar: a set of structural rules derived from analyzing and describing how a language is actually used by its speakers
  • Prescriptive grammar: a set of structural rules devised to promote one particular way of using the language over others
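
To make the idea of a grammar as “a set of structural rules” concrete, here is a minimal sketch in Python of a toy descriptive grammar written as rewrite rules. The symbols, rules, and the generate helper are illustrative inventions for this post only, not a claim about how the mental grammar is actually represented in the brain.

```python
import random

# A toy fragment of English syntax, written as rewrite rules.
# Each left-hand symbol expands into one of its right-hand alternatives;
# lowercase strings are terminal words.
TOY_GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["child"], ["language"], ["grammar"]],
    "V":   [["learns"], ["uses"]],
}

def generate(symbol="S"):
    """Expand a symbol into a list of words by applying the rules recursively."""
    if symbol not in TOY_GRAMMAR:       # terminal word: nothing left to expand
        return [symbol]
    expansion = random.choice(TOY_GRAMMAR[symbol])
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate()))             # e.g. "the child learns a grammar"
```

A real descriptive grammar contains far more rules and far more nuance, but it is the same kind of object: a set of symbols and the structural rules that combine them.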

Grammar serves as a communication system. When we learn a foreign language in adulthood, language learning becomes a lengthy and difficult process, and one may never form a mental grammar of the foreign language as good as that of one’s mother tongue. That’s because, in reality, grammar is highly complex in its sounds and word orders and highly abstract in its meaning.

In contrast, language learning in children is much simpler. Despite an immature mind, a child quickly grasps the essence of language like a sponge, formulating a mental grammar from a small number of examples (i.e. linguistic experience). In the early years of life, the majority of his linguistic input comes in the form of noisy speech. How can he distinguish his parents’ voices from the background noise? How can he distinguish sound units (phonemes) within those voices? How can he imitate a sound unit with his vocal apparatus? How can he associate sound units with a concept in his mind?

To address these baffling questions in the field of child language acquisition, there are two great schools of thought: nativism (Chomsky, 1965) and connectionism (McClelland and Rumelhart, 1981).

Nativism: This school of thought originated with the Poverty of the Stimulus argument proposed by Chomsky (1965). Chomsky believes that the language acquisition ability is innate, i.e. biologically built into our brains. To be precise, our brain is equipped with a hypothetical module called the Language Acquisition Device (LAD). He argues that natural language is unlearnable given the relatively limited data available to children learning a first language. Moreover, in the language development process, parents don’t explicitly teach a grammar to their children, yet the children can somehow formulate a mental grammar. He concludes that linguistic experience alone is not sufficient for language learning; it must be supplemented by some innate linguistic capacity. At one extreme, we call any linguist who strongly believes in this theory a nativist.

Connectionism: This school of thought is the main rival of nativism. The notion of connectionism can be traced back to the emergence of early neural networks (McCulloch and Pitts, 1943) and was carried on into an implementation of a computational neural-cognitive system called Parallel Distributed Processing (PDP) (McClelland and Rumelhart, 1981). McClelland and Rumelhart argue that language acquisition can be seen as mental operations stimulated by inputs. Linguistic knowledge is represented as patterns of numerical activity across large sets of simple processing units, and processing is done by transformations of activity patterns across large sets of connections. Language learning occurs in the form of interaction between (1) the domain-general architecture and learning mechanism and (2) the linguistic experience. At the other extreme, we call any linguist who strongly believes in this theory a connectionist.
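
As a rough illustration of this picture, here is a minimal sketch in Python/NumPy of an activity pattern being propagated through two layers of connections. The layer sizes, the random weights, and the sigmoid activation are arbitrary placeholders for illustration; this is not the actual interactive activation model of McClelland and Rumelhart (1981).

```python
import numpy as np

rng = np.random.default_rng(0)

# An input "pattern of activity", e.g. which acoustic features are currently active.
input_pattern = rng.random(8)

# In a connectionist model, linguistic knowledge lives in the connection
# weights rather than in explicit symbolic rules.
weights_input_to_hidden  = rng.normal(size=(8, 5))
weights_hidden_to_output = rng.normal(size=(5, 3))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Processing = transforming one activity pattern into another by
# propagating it across the connections.
hidden_pattern = sigmoid(input_pattern @ weights_input_to_hidden)
output_pattern = sigmoid(hidden_pattern @ weights_hidden_to_output)

print(output_pattern)   # a distributed representation, e.g. of a lexical choice
```

Learning, in this view, is nothing more than gradually adjusting these weights in response to linguistic experience.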

Because these schools of thought place their theories at opposite extremes, there is an ongoing, so-called linguistic war between the nativists and the connectionists. Several computer scientists have started to synergize the two extremes by incorporating innateness into statistical models and artificial neural networks as prior knowledge (see the sketch below) to maximize the performance of their NLP tasks. This synergistic approach has looked very promising in recent years.
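
One simple way to picture “innateness as prior knowledge” is additive smoothing of grammar-rule probabilities, where prior pseudo-counts play the role of a built-in bias and observed counts play the role of linguistic experience. The rule names and the numbers below are made up for illustration; they are not drawn from any particular model in the literature.

```python
from collections import Counter

# "Linguistic experience": how often each expansion of NP was observed in data.
observed = Counter({"NP -> Det N": 7, "NP -> Det Adj N": 2, "NP -> Pronoun": 1})

# "Innate" prior knowledge, expressed as pseudo-counts (a Dirichlet prior):
# the model is biased toward some expansions before seeing any data at all.
prior = {"NP -> Det N": 2.0, "NP -> Det Adj N": 1.0, "NP -> Pronoun": 1.0}

def posterior_probs(observed, prior):
    """Posterior-mean estimate under the Dirichlet prior: (count + pseudo-count) / total."""
    total = sum(observed[rule] + prior[rule] for rule in prior)
    return {rule: (observed[rule] + prior[rule]) / total for rule in prior}

for rule, prob in posterior_probs(observed, prior).items():
    print(f"{rule}: {prob:.2f}")
```

The same intuition carries over to neural models, where the “prior” typically takes the form of an architectural bias, a pretrained initialization, or a regularizer.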

When linguists study a language, they break down the mental grammar into seven components:

  • Phonetics: articulation of speech sounds
  • Phonology: phoneme formation
  • Morphology: word formation
  • Syntax: word orders
  • Semantics: meaning of words
  • Discourse: relations among sentences
  • Pragmatics: meaning in context

Children acquire the first four components to formulate the mental grammar during the language development process. Once they reach the adult-grammar stage, they acquire the last three as supplements through social interaction.

References

  • Chomsky, N. (1965). Aspects of the Theory of Syntax. Cambridge, MA: MIT Press.
  • McCulloch, W. S., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5, 115-133.
  • McClelland, J. L., & Rumelhart, D. E. (1981). An interactive activation model of context effects in letter perception: Part 1. An account of basic findings. Psychological Review, 88(5), 375-405.
