A no-code approach to Rasa forms, FormAction and FormPolicy

We recently released an exciting new feature in Botfront Cloud: conversational forms.

If you're familiar with Rasa's FormPolicy and FormAction you will find a lot of similarities, as Botfront is based on Rasa. We have ported the Rasa developer experience to an intuitive interface that lets you create forms in minutes. Under the hood, it still uses the FormPolicy.

In this tutorial series we'll go through all the steps required to create a simple but resilient lead generation form without a single line of code.

A form that allows a natural experience where users can ask for clarification, make and correct mistakes, and provide information in different ways.

What is slot filling?

Slot filling is a process where the virtual assistant collects a defined set of information before proceeding.

Two examples:

  • Collect the necessary input to invoke an API (e.g. cities and dates for a flight search)
  • Collect lead information on your website.

The main difference from a regular story-based conversation flow is that the assistant automatically iterates through the questions it needs to ask to gather the information. You don't need to model this part. You use stories to control how the form is invoked and how deviations from the happy path are handled.

1. The happiest of the happy paths

We'll start small to get a good understanding of how forms work. In this first step we'll suppose the user is collaborative and answers questions exactly as expected.

Set up our policies

In order to make this form work, you will need at least the following in your policies:

policies:
  - name: AugmentedMemoizationPolicy
  - name: FormPolicy
  - name: KerasPolicy
    epochs: 100

You can set your policies in the Stories screen using the Policies button on top of the sidebar.

Create the form

We first need to create a form. We'll name it leadgen_form, add an informative description, and specify the three slots we want to collect: first_name, work_email and company_size.

We'll also enable the collect in Botfront switch in order to retrieve form submissions in the Incoming section:

When we save, Botfront will create (unfeaturized) slots for us if they don't exist, and show them in the left sidebar under the form. You can re-order them if you'd like the questions to be asked in a different order.
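
For readers who know Rasa, this is roughly the FormAction that Botfront manages for you under the hood. It is only a sketch: the class name and file name are illustrative.

```python
# actions.py -- illustrative sketch of the FormAction behind leadgen_form
from rasa_sdk.forms import FormAction


class LeadgenForm(FormAction):

    def name(self):
        # must match the form name used in stories
        return "leadgen_form"

    @staticmethod
    def required_slots(tracker):
        # the form asks for these slots, in this order,
        # skipping any that are already filled
        return ["first_name", "work_email", "company_size"]

    def submit(self, dispatcher, tracker, domain):
        # runs once every required slot is filled,
        # right before the form deactivates
        return []
```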

Configure the slots

Once the form is created we can configure the slots. For each slot, this means:

  • Define a question to ask.
  • Validate the user response and provide error and confirmation messages.
  • Define how to extract the slot value from the response.

The first_name slot

We'll set the following:

  • The question to ask is "What is your first name?"
  • The validation simply requires a response with more than one character
  • In the extraction tab we state that the slot will be filled with the whole content of the user message.
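
In hand-written Rasa terms, this configuration corresponds roughly to a from_text slot mapping and a validate_first_name method on the FormAction sketched earlier. The error message below is made up.

```python
# methods of the LeadgenForm class sketched earlier
def slot_mappings(self):
    # fill first_name with the whole text of the user message
    return {"first_name": self.from_text()}

def validate_first_name(self, value, dispatcher, tracker, domain):
    # accept any answer longer than one character
    if value and len(value.strip()) > 1:
        return {"first_name": value}
    dispatcher.utter_message("Sorry, I didn't catch that. What is your first name?")
    # returning None keeps the slot empty so the question is asked again
    return {"first_name": None}
```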

The work_email slot

Now let's do the same for the work_email slot. The only difference here is that we want to make sure it's a valid email address:
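
If you were writing this by hand, the email check could look like the sketch below; the regular expression and the error message are illustrative, not what Botfront uses.

```python
import re

# rough email pattern, for illustration only
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# method of the LeadgenForm class sketched earlier
def validate_work_email(self, value, dispatcher, tracker, domain):
    if value and EMAIL_RE.match(value.strip()):
        return {"work_email": value.strip()}
    dispatcher.utter_message("That doesn't look like a valid email address. What is your work email?")
    return {"work_email": None}
```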

The company_size slot

Here it's slightly different, since we are interested in ranges rather than an exact number. We'll offer buttons to the user, and the range will be set as an entity in the quick reply payload. In the extraction tab, we need to tell Botfront where to extract the value from, since it's not an entire text message: we simply specify that the value should be taken from the company_size entity, and we condition it on the inform intent. More on that later.
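
Under the hood this maps roughly to a from_entity mapping conditioned on the inform intent. In the sketch below the button payload and the range value are illustrative examples.

```python
# method of the LeadgenForm class sketched earlier.
# Each button sends a payload such as /inform{"company_size": "1-50"},
# so the value arrives as a company_size entity on the inform intent.
def slot_mappings(self):
    return {
        "first_name": self.from_text(),
        "work_email": self.from_text(),
        "company_size": self.from_entity(entity="company_size", intent="inform"),
    }
```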

Integrate the form in a story

Now that our form is configured, we can use it in a story.

Our story will start with the intent start_form. This is just to give us a way to get into the form.

Then we start the form. Remember that when the form is active, it iterates through all the questions and utters your error or confirmation messages depending on the validity of the user's responses. None of this needs to be in your stories.

That is why the only thing you need to model explicitly is what happens after completion, when all the slots have been filled.

Testing our form and retrieving submissions

Now we can train our assistant and try to fill the form.

Finally, we can export form submissions as a CSV file from the Incoming section as follows:

2. Handling digressions

We have covered the basics of setting up a conversational form; however, the functionality is still very limited:

  • It is not possible to exit or interrupt the form.
  • It is not possible to ask for clarification or side questions.
  • It works if the user utters the expected response as a single word, but not in a sentence, such as My name is Nathan.

We'll see how to create natural and resilient flows by combining forms with branches and links.

We will start by allowing interruptions: when the user asks to stop (i.e. with the stop intent), we'll ask for confirmation. If they confirm, we'll acknowledge and stop the form; otherwise we'll resume it and repeat the last question.

To do this we need to adjust our stories and our slots configuration.

Adjusting our stories

1. Let's split our story in order to have all form-related directives in a single story:

At this point, we haven't changed the functionality. Having a separate story handling the whole form process will make it easier to handle deviations.

Don't forget to link the Start form story to the Happy path one.

2. Let's create two branches, Happy path and Interruptions, and move the completion path to the Happy path branch.

3. We start the Interruptions branch with the stop intent. The first thing we need to do is deactivate the form (a sketch of what deactivation does under the hood follows these steps). Deactivating is more important when side questions are enabled (next post), but it's good practice to do it systematically. Then we ask for confirmation.

4. In the confirmation branch (the user says yes) we just need to acknowledge since the form is already deactivated. The user can start asking other questions.

5. In the resume branch, we acknowledge and link back to the story (see at the very end of the video). The form will be reactivated and the last question will be repeated.
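
For the curious, deactivating a form in Rasa boils down to clearing the active form and the slot it was about to request. The sketch below shows roughly what the built-in deactivation does if you wrote it yourself; the action name is hypothetical.

```python
from rasa_sdk import Action
from rasa_sdk.events import Form, SlotSet
from rasa_sdk.forms import REQUESTED_SLOT


class ActionStopLeadgenForm(Action):
    """Illustrative equivalent of deactivating the form mid-story."""

    def name(self):
        return "action_stop_leadgen_form"  # hypothetical action name

    def run(self, dispatcher, tracker, domain):
        # clear the active form and the slot it was about to ask for
        return [Form(None), SlotSet(REQUESTED_SLOT, None)]
```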

Adjusting our slots configuration

Finally, we need to let our form know that it should not try to fill the slot if the stop intent is recognized.

If you don't do this, at the first_name step the assistant would take stop as your first name, and at the work_email step it would just complain that the email is invalid. Since we already restricted the intent to inform at the company_size step, we don't have to do anything else.
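
In FormAction terms, this is the same as excluding the stop intent from the free-text mappings, roughly as sketched below.

```python
# method of the LeadgenForm class sketched earlier
def slot_mappings(self):
    return {
        # ignore messages classified as the stop intent, so "stop"
        # is never mistaken for a name or an email address
        "first_name": self.from_text(not_intent=["stop"]),
        "work_email": self.from_text(not_intent=["stop"]),
        "company_size": self.from_entity(entity="company_size", intent="inform"),
    }
```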

Checking the results

Now we can chat with our assistant and play with the interruption flow. Don't forget to train first.

Where do we go from here

We handled interruption in our flow, but you could use the exact same mechanism to answer generic side questions that a user might ask while answering yours. In a pizza ordering bot, a user could check the delivery time while you're asking which type of cheese they want.

The next step is offering the clarifications users may need in order to answer a particular question the form is asking.

3. Handling contextual questions

A user might ask for clarification directly related to the question being asked. An extreme example is when the same question can have a different answer depending on the context, for example Why are you asking that?

The first step is to enable contextual questions in the form. When you enable it, Botfront sets up the categorical requested_slot slot with the names of all the slots in your forms as categories.

Add contextual questions

Adding contextual questions is similar to adding generic question flows or interruption flows.

The main difference is that the first element of the branch needs to refer to the slot currently being filled. In the case of first_name, all you need to do is set a condition where the requested_slot is first_name.

The video shows how to change our story so our assistant can answer the question Why are you asking that? in context.

You can add more questions by adding branches under the slot condition.
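
The story branches above are the no-code way to do this. Purely as an illustration of the same requested_slot mechanism, here is how a hand-coded custom action could answer Why are you asking that? in context; the action name and the explanations are made up.

```python
from rasa_sdk import Action
from rasa_sdk.forms import REQUESTED_SLOT

# hypothetical per-slot explanations
EXPLANATIONS = {
    "first_name": "Your first name lets us personalise our replies.",
    "work_email": "We need a work email to send you the follow-up.",
    "company_size": "Company size helps us route you to the right team.",
}


class ActionExplainQuestion(Action):

    def name(self):
        return "action_explain_question"  # hypothetical action name

    def run(self, dispatcher, tracker, domain):
        # requested_slot tells us which question the form is currently asking
        asked_slot = tracker.get_slot(REQUESTED_SLOT)
        dispatcher.utter_message(EXPLANATIONS.get(asked_slot, "It helps me tailor my answer."))
        return []
```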

We also need to exclude the why intent from the allowed intents in the extraction tab of each slot.

Now all we have to do is train our assistant and check the result. As you can see, the bot can now answer the question in context.

4. Adding NLU

We have added a lot of flexibility and resilience to our form, but we are still missing some of the basics: we have assumed until now that users would just type their first name and email address.

But what if someone says Hi, I'm Nathan and my email is nathan@whatev.er?

We want our assistant to get the slot value from the right place. If the NLU determines the user is providing their first_name or work_email, then we should take the value from the entity. Otherwise we should just take the content of the message provided it is valid.

In the video we show how to add multiple extraction conditions. We changed two things:

  • We specified that first_name and work_email can be extracted from entities when the detected intent is inform.
  • We excluded inform from the allowed intents when taking the value from the user message, so the two conditions can't conflict (see the sketch below).
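
In hand-written FormAction terms, the final extraction setup would look roughly like the sketch below. The entity names mirror the slot names, which is an assumption; adjust them to whatever your NLU model actually extracts.

```python
# method of the LeadgenForm class sketched earlier
def slot_mappings(self):
    return {
        # prefer the entity when the message is classified as inform,
        # otherwise fall back to the raw message text -- but never for
        # inform, stop or why, so the two conditions can't clash
        "first_name": [
            self.from_entity(entity="first_name", intent="inform"),
            self.from_text(not_intent=["inform", "stop", "why"]),
        ],
        "work_email": [
            self.from_entity(entity="work_email", intent="inform"),
            self.from_text(not_intent=["inform", "stop", "why"]),
        ],
        "company_size": self.from_entity(entity="company_size", intent="inform"),
    }
```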