How Computers Understand Language - Natural Language Parsing
Article by Rick Leinecker, September 30, 2007

A few days ago, my daughter Beth was talking about her English homework. She was telling me about diagramming sentences, and then she asked the typical high school question: "When am I ever going to use this?" I develop quite a few programs that use artificial intelligence techniques, and her question got me thinking about how computers understand language. I'd like to share my thoughts with you now.

For starters, computers have no knowledge base of what things mean. They can't take a sentence and immediately understand it. They have to go through a process that's known as Natural Language Parsing (NLP). The process starts by breaking sentences down into parts, much the way we diagram sentences. But we're better at it than computers are. For instance, a word that can be either a noun or a verb throws the computer off. We intuitively know the difference from context, but a computer has to explicitly analyze the syntax surrounding a word that can be a verb or a noun - much more difficult than our implicit conclusions.
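To make the noun-versus-verb problem concrete, here's a minimal sketch of how a program might tag words and use context to break the ambiguity. The lexicon and the two context rules are made-up examples for illustration, not any real parser's rules:

```python
# Toy part-of-speech tagger illustrating noun/verb ambiguity.
# The lexicon entries and context rules below are hypothetical examples.
LEXICON = {
    "the":   ["DET"],
    "dog":   ["NOUN"],
    "duck":  ["NOUN", "VERB"],   # ambiguous: "a duck" vs. "duck your head"
    "flies": ["VERB", "NOUN"],   # ambiguous: "time flies" vs. "fruit flies"
}

def tag(words):
    """Assign one part-of-speech tag per word, using crude context rules."""
    tags = []
    for word in words:
        options = LEXICON.get(word.lower(), ["UNKNOWN"])
        if len(options) == 1:
            tags.append(options[0])          # unambiguous word
        else:
            prev = tags[-1] if tags else None
            if prev == "DET" and "NOUN" in options:
                tags.append("NOUN")          # after "the", prefer a noun
            elif prev == "NOUN" and "VERB" in options:
                tags.append("VERB")          # after a noun, prefer a verb
            else:
                tags.append(options[0])      # fall back to first listing
    return tags

print(tag("the duck flies".split()))
```

Even this tiny example shows why the computer's job is hard: "duck" and "flies" each have two possible tags, and the program must consult the neighboring words to pick one - the explicit analysis that we do without thinking.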

Computers perform NLP as the first step at understanding a sentence. And that's why we need to learn to diagram sentences: it helps us better understand what we read and hear. The better we can diagram sentences, the better we understand the component parts of the sentences. And the better we understand the component parts, the better we understand the meaning.

Of course, there's more than just diagramming sentences to full understanding. Once a sentence is broken down, a knowledge base helps us move from language parts to language understanding. The knowledge base for the computer is a specialized dictionary in which information is stored that can be used to discern meaning. For us, it's our memory. Everything we've learned from the time we were born makes up the knowledge base that our mind uses to interpret language meaning. For instance, when we hear the word apple, our brain finds the appropriate memory that relates to the sentence.
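A computer's knowledge base can be pictured as a dictionary of facts keyed by word. This is only a sketch of the idea - the entries and field names here are invented for illustration:

```python
# Toy knowledge base: a specialized dictionary mapping a word to facts
# a program could consult to discern meaning. All entries are made up.
KNOWLEDGE_BASE = {
    "apple": {"is_a": "fruit",   "edible": True,  "colors": ["red", "green"]},
    "car":   {"is_a": "vehicle", "edible": False, "has": ["wheels", "engine"]},
}

def lookup(word):
    """Return the stored facts about a word, or None if it's unknown."""
    return KNOWLEDGE_BASE.get(word.lower())

print(lookup("apple"))
```

When the program encounters "apple" in a parsed sentence, it looks up these facts - a crude stand-in for the way our memory supplies the appropriate association when we hear the word.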

Going beyond NLP (for computers) and diagramming (for humans), we need context to fully understand language. People do this by instinct. Computers don't do it well at all. To this day there aren't any mainstream techniques for computers to do this. Yes, there is research that's going on that makes a reasonable attempt. But you won't find any software on the market that can understand a sentence in the context of an entire paragraph.

A computer can't understand a metaphor, a simile, or an analogy. That's because these literary techniques convey meaning indirectly. And by the same token, a computer can't create a metaphor, either. There is some research being done at the University of Indiana that's getting close, but at this time the metaphor-creating programs are still relatively primitive.

Those are the basics of natural language parsing.