Archive for September, 2008

The Grammar Of Thought

September 3rd, 2008

Update: Found this interesting book related to this post – The Language Instinct [link]

I have just started to scratch the surface of Natural Language Processing for my next project (involving NLP and Twitter – details to follow), and I already have a dozen questions bothering me. I shall attempt to put forth a few of these ideas and questions in this post. Let's talk briefly about the structure of language. Language has structure at several levels:

  1. discourse – a group of sentences
  2. sentences
  3. phrases
  4. words
  5. and so on…

Between ‘sentences’ and ‘words’ lies the syntactic structure of language. This syntactic structure is built from the parts of speech of words: nouns, verbs, etc. Words are grouped into phrases whose formation is governed by grammar rules, for example:

Sentence -> ‘Noun Phrase’ . ‘Verb Phrase’
‘Noun Phrase’ -> Determiner . Adjective . Noun
‘Verb Phrase’ -> Verb . ‘Noun Phrase’

A sentence is grammatically correct if it adheres to the grammar of the language (such as the rules described above). With just this much knowledge about language (something you might have learnt in the 5th grade), we can see that for a candidate sentence to make sense in some language, it has to be composed of meaningful components, and those components have to appear in some specific order for the sentence to make logical sense.
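To make this concrete, here is a minimal sketch of the idea using NLTK (my choice of library here, with a made-up toy lexicon): encode the rules above as a context-free grammar and check whether a candidate sentence parses under it. If the parser finds at least one parse tree, the sentence is grammatical with respect to this grammar.

    import nltk

    # Toy grammar mirroring the rules above: S -> NP VP, NP -> Det Adj N, VP -> V NP
    grammar = nltk.CFG.fromstring("""
        S   -> NP VP
        NP  -> Det Adj N | Det N
        VP  -> V NP
        Det -> 'the' | 'a'
        Adj -> 'quick' | 'lazy'
        N   -> 'fox' | 'dog'
        V   -> 'chased'
    """)

    parser = nltk.ChartParser(grammar)
    sentence = "the quick fox chased a lazy dog".split()

    # Prints a parse tree for each way the grammar can derive the sentence;
    # no output means the sentence is ungrammatical under this grammar.
    for tree in parser.parse(sentence):
        print(tree)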

Grammar of Thought

This has led me to ponder whether an analogous grammar exists for ‘thought’. Our thoughts can also be broken down into meaningful components, and these components also have to follow some implicit ordering for the ‘thought’ to make sense. If you think about the way you think, you will notice that as you move from one thought to another there is some logical connection between them, just as there is between the sentences of a paragraph. If we could somehow arrive at a formal representation of this grammar, wouldn't it enable machines to think?

Language and Thought

There is plenty of literature out there linking the structure of language with the structure of thought. Benjamin Whorf states in his writings:

the structure of a human being’s language influences the manner in which he understands reality and behaves with respect to it

Thus, human cognition is shaped by the structure of language, which in turn is defined by the grammar of that language. Hence a machine capable of generating a sequence of grammatically correct sentences which also fit together logically (a discourse) should have some ability of cognition. Even the Turing test uses natural language as a test for some level of cognition. Is this perspective – Natural Language Processing as a means of provisioning cognition to a machine – correct? Could this be another path to achieving artificial intelligence? I would love to get an answer to this from the NLP experts out there.
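As a rough illustration of only the "generating grammatically correct sentences" half of this (not the cognition half!), here is a sketch using NLTK's generate helper with another toy grammar of my own. Every output is grammatical by construction, but whether the outputs hang together as a meaningful discourse is exactly the open question.

    import nltk
    from nltk.parse.generate import generate

    # A deliberately tiny grammar and lexicon (hypothetical, for illustration only)
    grammar = nltk.CFG.fromstring("""
        S   -> NP VP
        NP  -> Det N
        VP  -> V NP
        Det -> 'the'
        N   -> 'machine' | 'thought'
        V   -> 'generates'
    """)

    # Enumerate the first 5 sentences licensed by the grammar – all well-formed,
    # none guaranteed to mean anything.
    for tokens in generate(grammar, n=5):
        print(" ".join(tokens))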

Or is this just another one of my posts that doesn't make sense because it's 3am and I'm half asleep?