COVID-19 Chatbot with Rasa 2.0: open source conversational AI

Overview

A COVID-19 chatbot for India built with Rasa Open Source 2.0. It answers questions about confirmed cases by city and by date range, using live data from a public API.

Table of Contents

  1. Introduction
  2. COVID-19 data
  3. Conversation Flow
  4. Implementation
  5. Installation
  6. References

Introduction

As natural language processing (NLP) technology and chatbot systems have evolved quickly over the past few years, the usefulness of chatbots has also increased. The motivation for chatbots is productivity: they have instant access to the information they refer to and are efficient in assisting users (Brandtzaeg, 2017, Why People Use Chatbots). A COVID-19 chatbot is an excellent use case for this technology.

The content of a chatbot consists of its personality, its conversation flows and the information it can deliver to the user. Personality is created through interactions and responses, and by acting differently in different situations. These responses should be designed to maximise the engagement between the bot and the user (Katz, 2019, The Ultimate Guide to Chatbot Personality, Chatbots Magazine). The COVID-19 chatbot described here aims to follow these principles, although, given the effort required, in a rather minimalistic way that leaves plenty of room for future improvements, e.g. in how chitchat is handled.

COVID-19 data

The COVID-19 data used here comes from https://api.rootnet.in/covid19-in/stats/history, which provides COVID-19 statistics for India freely to developers.
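The endpoint returns the full day-by-day history as a single JSON document. The snippet below is a minimal sketch of pulling it with the requests library; the key names ("data", "day", "regional", "loc", "totalConfirmed") are assumptions about the response layout and should be verified against a live call.

import requests

# Public history endpoint used by the bot (see above).
API_URL = "https://api.rootnet.in/covid19-in/stats/history"

def fetch_history():
    """Download the day-by-day COVID-19 history for India as a list of records."""
    response = requests.get(API_URL, timeout=10)
    response.raise_for_status()
    # Key names below are assumptions about the response layout.
    return response.json().get("data", [])

if __name__ == "__main__":
    history = fetch_history()
    latest = history[-1]                       # most recent day in the series
    print(latest.get("day"))
    for region in latest.get("regional", []):  # per-region breakdown
        print(region.get("loc"), region.get("totalConfirmed"))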

Conversation Flow

The conversation is initiated by the end user. A greeting or a goodbye resets any prior assumptions or knowledge collected by the bot during previous interactions. When the query does not contain a time period or a specific COVID-19 detail, the bot reports the current, general COVID-19 situation. When the city is not provided in the query, the bot asks for it. Any further specifics in the query are answered in more detail if the information is available.

Apart from greetings, goodbyes, etc., there are three major kinds of query a user can perform.

As this is a demo of how to integrate an API into Rasa, the COVID-19 bot is designed to answer a limited set of queries, as follows:

Query 1

The user can ask for the current confirmed COVID-19 cases of any city in India.

Examples:

- what is the number of cases currently in Delhi?
- how's the covid currently in Delhi?
- Tell me about covid currently in Maharashtra
- covid cases in Punjab currently

Query 2

The user can ask for the combined current confirmed cases of any two cities in India.

Examples:

- Hey, what’s the total count of confirmed cases in Delhi, Maharashtra altogether?
- total confirmed cases in Delhi and Maharashtra together?
- total cases currently in Delhi and Maharashtra
- Get me the total current confirmed cases in Delhi and Maharashtra
- Tell me the total current confirmed cases in Delhi and Maharashtra

Query 3

The user can ask for the total cases between any two dates (format: yyyy-mm-dd).

Examples:

- What’s the confirmed case count from 2020-10-01 to 2020-10-12?
- Hey, what's the current confirmed cases from 2020-10-01 to 2020-10-12?
- currently cases from 2020-10-01 to 2020-10-12?
- Hey, what’s the total count of confirmed cases from 2020-10-01 to 2020-10-12?
- total confirmed cases from 2020-10-01 to 2020-10-12 in India?
- total cases from 2020-10-01 to 2020-10-12 in India
- Get me the total current confirmed cases from 2020-10-01 to 2020-10-12 in India
- Tell me the total current confirmed cases from 2020-10-01 to 2020-10-12 in India

Implementation

All components are defined to support the conversation flow. The end-user intents are: who_are_you, covid_in_city, covid_in_two_city, covid_in_period, covid_without_city, greet, goodbye, affirm, deny, mood_great, mood_unhappy, bot_challenge, how_are_you and capabilities. In Rasa, slots are used to pass information back and forth between Rasa and external actions. Four slots are required: city, city2, init_date and final_date.

The responses, where the personality is also largely created, are: utter_greet, utter_goodbye, utter_ask_city (triggers city_form), utter_iamabot, utter_capabilities, utter_im_well and so on. This also includes the external action, action_covid, which fetches the COVID-19 data, parses it and generates the COVID-19 response sentence.
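For illustration, a minimal sketch of such a custom action with the Rasa SDK is shown below. The slot names match those listed above, while the helper covid_data (described in the next paragraph and sketched after it) and the exact wording of the reply are assumptions rather than the project's verbatim code.

from typing import Any, Dict, List, Text

from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher

from covid_api import covid_data  # assumed helper, sketched after the next paragraph


class ActionCovid(Action):
    """Reads the slots filled by the NLU model and replies with COVID-19 figures."""

    def name(self) -> Text:
        return "action_covid"

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        city = tracker.get_slot("city")
        city2 = tracker.get_slot("city2")
        init_date = tracker.get_slot("init_date")
        final_date = tracker.get_slot("final_date")

        total = covid_data(city, city2, init_date, final_date)
        dispatcher.utter_message(text=f"Total confirmed cases: {total}")
        return []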

External actions are user-defined functions written in Python. Only one action, action_covid, is required. It is split into two parts here: actions.py receives the slots city, city2, init_date and final_date from Rasa and queries the COVID-19 data for the specific city via covid_api.py, where a function covid_data(city, city2, init_date, final_date) is defined. The function filters the single-call JSON returned by https://api.rootnet.in/covid19-in/stats/history and returns the COVID-19 data (total cases) to action_covid, which then forms a response sentence to be passed back to Rasa.
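A possible shape for covid_data is sketched below. It only illustrates the filtering idea (pick the requested dates or the requested city/cities and sum the confirmed counts); the JSON key names are the same assumptions as in the earlier snippet, and the date-range handling (difference of the country-wide cumulative totals) is one plausible interpretation, not necessarily the repository's exact logic.

import requests

API_URL = "https://api.rootnet.in/covid19-in/stats/history"


def covid_data(city=None, city2=None, init_date=None, final_date=None):
    """Return total confirmed cases filtered by city/cities or by date range.

    Key names ("data", "day", "summary", "total", "regional", "loc",
    "totalConfirmed") are assumptions about the rootnet.in response layout.
    """
    history = requests.get(API_URL, timeout=10).json().get("data", [])

    # Date-range query: difference of the country-wide cumulative totals.
    if init_date and final_date:
        by_day = {entry["day"]: entry["summary"]["total"] for entry in history}
        return by_day.get(final_date, 0) - by_day.get(init_date, 0)

    # City query (one or two cities): sum the latest per-region totals.
    latest = history[-1]
    wanted = {c.lower() for c in (city, city2) if c}
    return sum(
        region.get("totalConfirmed", 0)
        for region in latest.get("regional", [])
        if region.get("loc", "").lower() in wanted
    )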

The user intents, stories and rules are used for training the NLU model. The intent examples cover tens of different ways of asking questions and teach the model how to find the values for the slots and what the user's intent is. The stories contain the conversation flows, and the rules can stop a conversation or force it onto a different path.

Installation

Installation assumes an existing installation of Miniconda or Anaconda: https://www.anaconda.com/

pip3 & Rasa

Below are the steps for creating a virtual environment and installing pip and Rasa Open Source 2.0.

conda create -n RasaEnv python=3.7.6 
conda activate RasaEnv
conda install -c anaconda pip
pip3 install rasa==2.8.11  

In case of issues, please refer to the Rasa Open Source installation pages: https://rasa.com/docs/rasa/installation/

Creating and initialising a new project:

mkdir rasa
cd rasa
rasa init --no-prompt

This will create a new directory, in which Rasa creates all the necessary directories and files.

Replace all files in the rasa directory with the files in the project.

Train the model and run the bot

Train the model with the command:

rasa train

The custom action defined in the actions.py and covid_api.py files must be served by an action server before the bot can be evaluated. Run the following commands in two separate terminals:

rasa run actions

Then, in the second terminal, start the conversation with the bot:

rasa shell

References

- Rasa. (n.d.). Rasa: Open source conversational AI. URL: https://rasa.com

Owner
Aazim Parwaz
I am a third-year computer science undergraduate at NIT Srinagar.