Chatbots don’t quite understand us yet. We speak, and they process our commands. With a chatbot like Yahoo Weather, you ask about the forecast in Seattle, and the bot returns an answer. Natural Language Processing (NLP) can read what you say and interpret some meaning.
You don’t want to know the current temperature in Seattle; you want to know what the weather will be like in the future. That capability is helpful because the dialogue can branch out. You can ask about a different city or the next morning without knowing anything about how to find those features within the app (without even needing to install the app).
Yet most of us don’t communicate by following a specific set of commands; we tend to be pretty random. We ask more complicated questions, like this: “What’s the weather in Miami, and then Orlando the next day?” For now, chatbots don’t know what you mean. Ask the Yahoo Weather chatbot and you’ll only get a report for Miami; the bot ignores the second part of your request.
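To make the gap concrete, here’s a toy sketch in Python (my own illustration, with made-up names like `WeatherQuery`; it is not how Yahoo Weather or Pat actually work). A keyword-matching bot stops at the first city it recognizes, while an NLU-style parser has to keep reading, return a list of structured queries, and carry a phrase like “the next day” over to the right city.

```python
# A toy illustration, not how Yahoo Weather or Pat actually work.
from dataclasses import dataclass

KNOWN_CITIES = {"Miami", "Orlando", "Seattle"}

@dataclass
class WeatherQuery:          # hypothetical structure for a parsed request
    city: str
    day_offset: int          # 0 = today, 1 = the next day, ...

def naive_parse(text: str) -> WeatherQuery:
    # A keyword-matching bot grabs the first city it recognizes and stops,
    # which is why you only get a report for Miami.
    for token in text.replace(",", " ").split():
        if token.capitalize() in KNOWN_CITIES:
            return WeatherQuery(city=token.capitalize(), day_offset=0)
    raise ValueError("no city found")

def compound_parse(text: str) -> list[WeatherQuery]:
    # An NLU-style parser keeps reading after the first match and applies a
    # relative phrase like "the next day" to the city it refers to.
    queries: list[WeatherQuery] = []
    for token in text.replace(",", " ").split():
        word = token.capitalize()
        if word in KNOWN_CITIES:
            queries.append(WeatherQuery(city=word, day_offset=0))
        elif token.lower() == "next" and queries:
            queries[-1].day_offset += 1   # "the next day" shifts the most recent query
    return queries

request = "What's the weather in Miami, and then Orlando the next day"
print(naive_parse(request))      # WeatherQuery(city='Miami', day_offset=0)
print(compound_parse(request))   # Miami today, Orlando the next day
```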
That’s not the case with Pat, a new technology that uses Natural Language Understanding (NLU). In a recent demo, the developers showed me how it works. In one chat, the bot can understand when you ask about “the president” and then ask about his age in the same request. Interestingly, this command works in the Google search app on your phone only if you specify that you mean President Obama.
NLU looks at the words you are using and flags them with meaning. It looks at a sentence asking about the “president” and interprets it the way it’s most commonly meant. Humans can be vague: when we say “president,” we usually mean the leader of the U.S.
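One way to picture this is as a lookup from a vague term to its most likely referent, with context in the sentence able to override the default. This is a hypothetical sketch, not Pat’s actual approach:

```python
# Hypothetical sketch of resolving a vague word to its most likely referent.
DEFAULT_REFERENTS = {
    "president": "President of the United States",   # the most common meaning
    "chancellor": "Chancellor of Germany",
}

CONTEXT_OVERRIDES = {
    ("president", "france"): "President of France",
    ("president", "company"): "the company's chief executive",
}

def resolve(word: str, sentence: str) -> str:
    # If the sentence contains a clue pointing at a more specific referent,
    # use it; otherwise fall back on the most common meaning, the way people do.
    for context_word in sentence.lower().split():
        if (word, context_word) in CONTEXT_OVERRIDES:
            return CONTEXT_OVERRIDES[(word, context_word)]
    return DEFAULT_REFERENTS.get(word, word)

print(resolve("president", "how old is the president"))        # President of the United States
print(resolve("president", "who is the president of france"))  # President of France
```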
Pat also understands grammar. In another test, the bot was able to understand context. There might be two different subjects mentioned in a request or statement, but the bot looks for meaning not only in the words you use but also in how the words are constructed. The demo sentence mentioned “the chancellor” and “protesters,” and the bot understood that it was the protesters who were angry.
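Pat’s grammar engine isn’t public, but one common way to get at “how the words are constructed” is a dependency parse, which links each word to the word it modifies. Here’s a rough sketch using spaCy and an invented example sentence; the exact output depends on the model, so treat this as an illustration, not Pat’s method:

```python
# Illustration only: spaCy's dependency parser standing in for whatever Pat uses.
import spacy

nlp = spacy.load("en_core_web_sm")   # requires: python -m spacy download en_core_web_sm
doc = nlp("The protesters who confronted the chancellor were angry.")

for token in doc:
    # An adjective attached to the main verb (dep_ == "acomp") describes that
    # verb's subject, so "angry" traces back to "protesters", not "chancellor".
    if token.dep_ == "acomp":
        subjects = [child for child in token.head.children if child.dep_ == "nsubj"]
        for subject in subjects:
            print(f"'{token.text}' describes '{subject.text}'")
```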
What makes this interesting to me is the potential for understanding text chats in the future. An intelligent assistant would not need to ask multiple follow-up questions or request more details. It could look at the construction of a sentence and determine our intent.
We might ask about a train route, the weather, and flight delays all in one request, and the bot would know that we’re traveling, that we need to take a train and then board a plane, and that we’re concerned about the weather when we travel (this requires understanding, not just processing).
NLU is not an optional add-on for chatbots. The technology, even in a fairly limited demo, shows that the interactions between a bot and a human need to handle random requests. In another demo, the bot understood a request that involved an interruption.
You might say “what’s the weather in Miami, no Orlando” and the bot would know you mean Orlando. Today, this kind of input is confusing to bots and leads to a lot of miscues.
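In the same toy-sketch spirit as above (a guess at the logic, not Pat’s implementation), a correction word like “no” can act as a signal that the next city replaces the previous one instead of starting a second query:

```python
# Toy sketch: a correction word makes the next city overwrite the previous one.
KNOWN_CITIES = {"Miami", "Orlando", "Seattle"}
CORRECTION_WORDS = {"no", "actually", "sorry", "wait"}

def parse_cities(text: str) -> list[str]:
    cities: list[str] = []
    pending_correction = False
    for token in text.replace(",", " ").split():
        word = token.capitalize()
        if token.lower() in CORRECTION_WORDS and cities:
            pending_correction = True        # the next city should replace the last one
        elif word in KNOWN_CITIES:
            if pending_correction:
                cities[-1] = word            # "no Orlando" overwrites Miami
                pending_correction = False
            else:
                cities.append(word)          # a plain second city would be a second query
    return cities

print(parse_cities("what's the weather in Miami, no Orlando"))       # ['Orlando']
print(parse_cities("what's the weather in Miami and then Orlando"))  # ['Miami', 'Orlando']
```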