Chatbots - Building ForkChoice with Watson
Watson. Almost everyone - at least in the tech industry - has heard of it: the IBM supercomputer used across multiple fields, ranging from crucial medical recommendations on cancer treatment to analyzing the tone of customer support cases.
But what if you wanted to use Watson to build your own chatbot?
Bluemix - A developer playground
IBM Bluemix provides a number of services for developers, including Watson Conversation, Text-to-Speech and Tone Analyzer. These services make it easy to build (intelligent) chatbots. With Watson Conversation, the chatbot will be able to understand your users' intents and answer advanced questions.
How we structured the ForkChoice bot
To understand how to build a bot, we have to think about what a conversation really is. We rarely give this any thought, since talking to each other comes naturally. But to build a chatbot, we need to know what a conversation actually consists of.
Usually, when two people talk, the conversation will consist of:
- One or more intents.
The intent could be to find a place to eat.
- The matter of the intent (entities in Watson).
This could be some demands for the restaurant, for example ‘cheap’, ‘nearby’ or a type of food.
- A persistent context.
For example, saving the user's name so the bot can say goodbye [name] at the end of the conversation.
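The building blocks above can be sketched in a few lines of Python. This is a toy handler, not Watson's actual API - the intent and entity names are made up - but it shows how a single turn recognizes an intent, picks up entities and carries context (like the user's name) from one turn to the next:

```python
def handle_turn(user_input, context):
    """Toy conversation handler: illustrates intents, entities and context."""
    intents = []
    entities = []
    text = user_input.lower()

    # Intent: the user wants to find a place to eat
    if "eat" in text or "restaurant" in text:
        intents.append("find_place_to_eat")

    # Entities: demands attached to the intent, e.g. 'cheap' or 'nearby'
    for demand in ("cheap", "nearby", "burger"):
        if demand in text:
            entities.append(demand)

    # Context: remember the user's name across turns
    if text.startswith("my name is "):
        context["name"] = user_input[11:].strip()
        return intents, entities, context, "Nice to meet you, %s!" % context["name"]
    if "bye" in text:
        return intents, entities, context, "Goodbye, %s!" % context.get("name", "friend")
    return intents, entities, context, "How can I help?"

context = {}
_, _, context, reply = handle_turn("My name is Anna", context)
print(reply)  # Nice to meet you, Anna!
_, _, context, reply = handle_turn("Bye", context)
print(reply)  # Goodbye, Anna!
```

Because the context dictionary is passed back in on every turn, the goodbye at the end still knows the name given at the start.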
Our goal was to make the ForkChoice bot able to understand these four main intents:
- Finding a place to eat/drink
- Getting inspiration for food/drinks
- Showing more information about a place
- Asking for help using the chatbot
Besides the four main intents, a number of smaller intents were added, e.g. random questions and positive/negative reactions. These mainly help the chatbot lead a more natural conversation.
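As a rough illustration of what intent matching does (Watson Conversation learns intents from training examples; this keyword lookup and the intent names are our own simplification), a classifier over the four main intents plus a fallback for the smaller ones might look like:

```python
# Simplified, hypothetical stand-in for Watson's trained intent classifier.
INTENT_KEYWORDS = {
    "find_place":      ["find", "where", "place to eat", "place to drink"],
    "get_inspiration": ["inspiration", "suggest", "recommend"],
    "more_info":       ["more information", "details", "tell me more"],
    "help":            ["help", "how do i"],
}

def classify_intent(text):
    text = text.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    # Anything else falls through to the smaller intents:
    # random questions, positive/negative reactions, etc.
    return "fallback"

print(classify_intent("Where can I find a burger?"))  # find_place
print(classify_intent("Can you help me?"))            # help
```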
Then, there is the question of what the user is actually trying to find. We tried to determine a number of ways a person could search for food and drinks. One way is to search by the type of food or drink - e.g. “Find a burger” or “Where can I find a smoothie?”
Another way could be by the type of place - e.g. “Find a restaurant” or “Find a café”.
These two main types of searches became our main entities (food-type, drink-type and place-type).
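Conceptually, each entity is a dictionary of values Watson can spot in the user's input. A minimal sketch of that lookup (the example values here are ours, not our full entity lists):

```python
# Hypothetical, abbreviated versions of our three main entities.
ENTITIES = {
    "food-type":  ["burger", "pizza", "sushi"],
    "drink-type": ["smoothie", "coffee", "beer"],
    "place-type": ["restaurant", "café", "bar"],
}

def extract_entities(text):
    """Return (entity, value) pairs found in the input text."""
    text = text.lower()
    found = []
    for entity, values in ENTITIES.items():
        for value in values:
            if value in text:
                found.append((entity, value))
    return found

print(extract_entities("Where can I find a smoothie?"))
# [('drink-type', 'smoothie')]
```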
In addition to the search keyword itself, we also needed the bot to understand different location and sorting criteria. For example, a user might search for “the best restaurant within 3 kilometers” or “A café in Copenhagen”.
“Best” and “closest” were simple to add as entities - but what about processing numbers and cities?
For this, IBM provides “system entities”, which can extract numbers and locations from user input.
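To see roughly what system entities extract for us, here is a stand-in (the regex and the city list are our illustration, not Watson's implementation):

```python
import re

# Toy version of number/location extraction; Watson's system entities
# do this for us out of the box. The city list is just an example.
CITIES = ["copenhagen", "aarhus", "odense"]

def extract_system_entities(text):
    lowered = text.lower()
    numbers = [float(n) for n in re.findall(r"\d+(?:\.\d+)?", text)]
    locations = [city for city in CITIES if city in lowered]
    return {"numbers": numbers, "locations": locations}

print(extract_system_entities("the best restaurant within 3 kilometers"))
# {'numbers': [3.0], 'locations': []}
print(extract_system_entities("A café in Copenhagen"))
# {'numbers': [], 'locations': ['copenhagen']}
```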
For the user to perceive the conversation with a chatbot as advanced, it is important to be able to save context from earlier inputs. We use context to save the user's location and other values that persist throughout the conversation.
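The key pattern here is that the context returned with one response must be sent back with the next request - the service itself is stateless. A minimal simulation of that round-trip (the function below is a stand-in, not a real API call):

```python
def send_message(text, context):
    """Stand-in for a conversation API call; returns a reply and updated context."""
    context = dict(context)  # the service returns an updated copy
    if "copenhagen" in text.lower():
        context["location"] = "Copenhagen"
    reply = "Searching near %s" % context.get("location", "you")
    return {"output": {"text": [reply]}, "context": context}

context = {}
response = send_message("A café in Copenhagen", context)
context = response["context"]        # persist the context for the next turn
response = send_message("Find a burger", context)
print(response["output"]["text"][0])  # Searching near Copenhagen
```

Because the second request carries the context from the first, the bot still knows where to search even though the user never repeated the city.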
And now for the fun part - seeing it all in action. Below is a screenshot of how the conversation looks, and how the values above are all put together to create a meaningful conversation with the ForkChoice chatbot.
Feel free to try the app, and tell us what you think: forkchoice.com