As mentioned in the previous posts, as soon as ChatBots can interact like humans in natural language, ChatBots (and later digital voice assistants) will come and there will be no way back.

But when can a ChatBot interact like a human?
Personally, my vision is that as soon as a ChatBot understands the user, it can interact well and create an exceptional User Experience. Most ChatBots use Natural Language Processing (NLP). NLP can be used to generate natural language (the messages created by the ChatBot) and to understand natural language (the messages created by the user). NLP-generated text is used by publishers and by very advanced ChatBots. Actually, generating natural language is relatively easy compared with understanding natural language. That is why most ChatBots use predefined messages, which will hardly be noticed as such by the user and still give a good user experience.
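
To give an idea of why generation is the easier direction: a predefined message is often just a template filled with a few variables. A minimal sketch in Python (the templates, intent names and slot names here are made up purely for illustration, not taken from any platform):

    # Minimal sketch of template-based response generation.
    # The templates, intent names and slot names are invented for illustration.
    def generate_response(intent: str, slots: dict) -> str:
        templates = {
            "order_status": "Your order {order_id} will arrive on {date}.",
            "greeting": "Hi {name}, how can I help you today?",
        }
        return templates[intent].format(**slots)

    print(generate_response("order_status", {"order_id": "12345", "date": "Friday"}))
    # -> Your order 12345 will arrive on Friday.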

The Artificial Intelligence part of a ChatBot lies in its understanding of natural language. This topic is called Natural Language Understanding (NLU) and is a subtopic of Natural Language Processing in artificial intelligence that deals with machine reading comprehension.

Why is Natural Language Understanding (NLU) needed for ChatBots?
The main tasks of NLU are mapping the given natural language input into useful representations and analyzing different aspects of the language. As the name says, it is about understanding: we need to understand what the user's input means, or at least understand the intention of the user and the details of that intention.
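
To make the idea of a "useful representation" a bit more concrete, here is a toy sketch that maps a user message to an intention plus its details. Everything in it (the intent names, keywords and slot logic) is invented for illustration and far simpler than what real NLU does:

    # Toy sketch: map a user message to an intent plus details (slots).
    # The intents, keywords and slot logic are invented for illustration only.
    import re

    def understand(message: str) -> dict:
        text = message.lower()
        if any(word in text for word in ("book", "reserve")) and "flight" in text:
            destination = re.search(r"to (\w+)", text)
            return {
                "intent": "book_flight",
                "destination": destination.group(1) if destination else None,
            }
        if any(word in text for word in ("hello", "hey", "good morning")):
            return {"intent": "greeting"}
        return {"intent": "unknown"}

    print(understand("Hello!"))                         # {'intent': 'greeting'}
    print(understand("Please book a flight to Paris"))  # {'intent': 'book_flight', 'destination': 'paris'}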

How does Natural Language Understanding (NLU) work?
There are different approaches but most natural language understanding systems share some common components.

The main components are:
A lexicon of the language: A lexicon is the vocabulary of a person, language, or branch of knowledge (such as nautical or medical). An example for English is the WordNet lexicon.
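
As a small example of consulting such a lexicon, WordNet can be queried through the NLTK library (this assumes NLTK is installed and the WordNet data has been downloaded with nltk.download):

    # Querying the WordNet lexicon through NLTK.
    # Assumes: pip install nltk and nltk.download('wordnet') have been run.
    from nltk.corpus import wordnet as wn

    # All senses (synsets) of the word "book" known to the lexicon.
    for synset in wn.synsets("book"):
        print(synset.name(), "-", synset.definition())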

A parser: A parser is a software component that takes input data (frequently text) and builds a data structure – often some kind of parse tree, abstract syntax tree or other hierarchical structure – giving a structural representation of the input, checking for correct syntax in the process. The parsing may be preceded or followed by other steps, or these may be combined into a single step. The parser is often preceded by a separate lexical analyser, which creates tokens from the sequence of input characters.
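
To make this concrete, here is a small sketch using the spaCy library as the parser (assuming spaCy and its small English model en_core_web_sm are installed): the lexical analyser step produces the tokens and the parser links them into a structural representation.

    # Sketch: tokenising and parsing a sentence with spaCy.
    # Assumes: pip install spacy && python -m spacy download en_core_web_sm.
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("The ChatBot understands the user")

    # Each token with its grammatical relation to its head word,
    # i.e. a structural representation of the input.
    for token in doc:
        print(f"{token.text:12} {token.dep_:10} -> {token.head.text}")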

Grammar rules: A set of rules to break sentences into an internal representation. Grammar is a set of structural rules governing the composition of clauses, phrases, and words in any given natural language.
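
A minimal sketch of such grammar rules, written as a toy context-free grammar with NLTK (the grammar below is made up and only covers this one example sentence):

    # Toy context-free grammar and parser with NLTK.
    # The grammar rules are invented and only cover this one example sentence.
    import nltk

    grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the'
    N -> 'chatbot' | 'user'
    V -> 'understands'
    """)

    parser = nltk.ChartParser(grammar)
    for tree in parser.parse("the chatbot understands the user".split()):
        tree.pretty_print()  # prints the parse tree (the internal representation)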

What are the steps of Natural Language Understanding (NLU)?

Lexical Analysis − It involves identifying and analyzing the structure of words. The lexicon of a language means the collection of words and phrases in that language. Lexical analysis is dividing the whole chunk of text into paragraphs, sentences, and words.
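
A small sketch of this step with NLTK (assuming the 'punkt' tokenizer data has been downloaded):

    # Lexical analysis: split a chunk of text into sentences and words.
    # Assumes: pip install nltk and nltk.download('punkt') have been run.
    from nltk.tokenize import sent_tokenize, word_tokenize

    text = "I want to book a flight. It should leave tomorrow."
    sentences = sent_tokenize(text)
    words = [word_tokenize(sentence) for sentence in sentences]

    print(sentences)  # ['I want to book a flight.', 'It should leave tomorrow.']
    print(words)      # [['I', 'want', ...], ['It', 'should', ...]]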

(Figure: the steps in Natural Language Processing)

Syntactic Analysis (Parsing) − It involves analysis of words in the sentence for grammar and arranging words in a manner that shows the relationship among the words.
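
A small illustration with NLTK's part-of-speech tagger, which labels each word with its grammatical role (assuming the tagger data has been downloaded):

    # Syntactic analysis: tag each word with its grammatical role.
    # Assumes: nltk.download('punkt') and nltk.download('averaged_perceptron_tagger').
    from nltk import pos_tag, word_tokenize

    tokens = word_tokenize("The ChatBot understands the user")
    print(pos_tag(tokens))
    # prints (word, part-of-speech tag) pairs, e.g. ('understands', 'VBZ')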

Semantic Analysis − It draws the exact meaning or the dictionary meaning from the text. The text is checked for meaningfulness. It is done by mapping syntactic structures to objects in the task domain. The semantic analyzer disregards sentences such as “hot ice-cream”.
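
A toy, hand-rolled sketch of such a meaningfulness check (the domain knowledge in the dictionaries below is invented for illustration):

    # Toy semantic check: reject combinations that contradict simple domain knowledge.
    # The 'domain' and 'modifiers' dictionaries are invented for illustration only.
    domain = {
        "ice-cream": {"temperature": "cold"},
        "coffee": {"temperature": "hot"},
    }
    modifiers = {"hot": ("temperature", "hot"), "cold": ("temperature", "cold")}

    def is_meaningful(modifier: str, noun: str) -> bool:
        attribute, value = modifiers[modifier]
        expected = domain.get(noun, {}).get(attribute)
        return expected is None or expected == value

    print(is_meaningful("hot", "coffee"))     # True
    print(is_meaningful("hot", "ice-cream"))  # False -> disregarded as meaningless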

Discourse Integration − The meaning of any sentence depends upon the meaning of the sentence just before it. In addition, it also influences the meaning of the immediately succeeding sentence.
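
A very simplistic, hand-rolled sketch of discourse integration: resolve a pronoun in the current sentence to the last noun mentioned in the previous one. Real systems use much more sophisticated coreference resolution; this is only meant to show the idea:

    # Toy discourse integration: resolve "it" to the last noun of the previous sentence.
    # Real coreference resolution is far more involved; this is illustration only.
    # Assumes nltk with 'punkt' and 'averaged_perceptron_tagger' downloaded.
    from nltk import pos_tag, word_tokenize

    def resolve_pronoun(previous_sentence: str, current_sentence: str) -> str:
        tagged = pos_tag(word_tokenize(previous_sentence))
        nouns = [word for word, tag in tagged if tag.startswith("NN")]
        if nouns and " it " in f" {current_sentence} ":
            return current_sentence.replace(" it ", f" {nouns[-1]} ")
        return current_sentence

    print(resolve_pronoun("I want to book a flight.", "Can it leave tomorrow?"))
    # -> Can flight leave tomorrow?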

Pragmatic Analysis − During this step, what was said is re-interpreted in terms of what it actually meant. It involves deriving those aspects of language which require real-world knowledge (a small sketch follows below).

Source: tutorialspoint.com AI – Natural Language Processing
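
The toy sketch below illustrates the pragmatic step: the literal yes/no question "Can you X?" is re-interpreted as a request to do X. The single rewrite rule is, of course, invented and far too simple for real conversations:

    # Toy pragmatic analysis: re-interpret a literal question as the intended request.
    # The single rewrite rule below is invented for illustration only.
    import re

    def pragmatic_interpretation(sentence: str) -> str:
        # "Can you send me the invoice?" literally asks about ability,
        # but pragmatically it is a request: "send me the invoice".
        match = re.match(r"can you (.+)\?", sentence.strip(), re.IGNORECASE)
        if match:
            return f"REQUEST: {match.group(1)}"
        return f"STATEMENT: {sentence}"

    print(pragmatic_interpretation("Can you send me the invoice?"))
    # -> REQUEST: send me the invoice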

Semantic Analysis and Discourse Integration are also called Contextual Reasoning. The last three steps, from Semantic Analysis to Pragmatic Analysis, are the parts that understand the context.

What are the difficulties in NLU?
As mentioned in the introduction, as soon as ChatBots can interact like humans in natural language, the User Experience will be very good. But NLU is the key, and it is very difficult because:

  • Natural Language has an extremely rich form and structure.
  • Lexical ambiguity − ambiguity at a very primitive level, such as the word level.
  • Syntax-level ambiguity − a sentence can be parsed in different ways.
  • Referential ambiguity − referring to something using pronouns. One input can have different meanings, and many different inputs can mean the same thing.
  • Understanding the context (semantic analysis and discourse integration) is very difficult because of the ambiguities mentioned above (sometimes also for humans).
  • If the context of the whole conversation is not clear, pragmatic analysis or application reasoning and execution will never come close to 100%.

Conclusion
It will be extremely difficult to build NLU in such a way that it understands everything in all languages, but with time it will improve. It will probably go much faster now that Deep Learning is more widely available. Most ChatBot platforms (like, for example, wit.ai) only support Lexical Analysis and Syntactic Analysis (Parsing), not the context. This actually means that they don't understand the context; in these platforms, relations between certain words or phrases are mapped to intents, which trigger pre-defined answers of the bot. As soon as bot platforms can understand the context, the User Experience will improve very much.

And as a bonus, I read that Watson from IBM also doesn't understand the context, so I am really wondering how it won Jeopardy!?

Do you want to know more about Natural Language Understanding or ChatBots? Then please contact me via the contact form.
