This is a chatbot project built with the Rasa framework. It connects to two different APIs: the Google Books API and the OpenAI API. The chatbot can provide book information, including descriptions, prices, and links to books, using the Google Books API. Additionally, it can recommend books based on authors and descriptions using the OpenAI API.
- Retrieve book information from the Google Books API (see the sketch below).
- Recommend books based on authors and descriptions using the OpenAI API.
- Basic HTML front-end for interacting with the chatbot.
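
The book lookup is handled by a Rasa custom action that calls the Google Books volumes endpoint. The snippet below is a minimal, illustrative sketch only; the action name `action_search_book` and the slot name `book_title` are assumptions for the example, and the repository's real implementation lives in its `actions.py` and `domain.yml`.

```python
# Illustrative sketch of a Rasa custom action that queries the Google Books API.
import os
from typing import Any, Dict, List, Text

import requests
from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher


class ActionSearchBook(Action):
    """Look up a book on the Google Books API and reply with its details."""

    def name(self) -> Text:
        return "action_search_book"  # hypothetical action name

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        title = tracker.get_slot("book_title")  # hypothetical slot name
        resp = requests.get(
            "https://www.googleapis.com/books/v1/volumes",
            params={"q": title, "key": os.getenv("GOOGLE_API_KEY"), "maxResults": 1},
            timeout=10,
        )
        items = resp.json().get("items", [])
        if not items:
            dispatcher.utter_message(text=f"Sorry, I couldn't find '{title}'.")
            return []
        volume = items[0]["volumeInfo"]
        dispatcher.utter_message(
            text=f"{volume.get('title')}: "
                 f"{volume.get('description', 'No description available.')[:200]} "
                 f"More info: {volume.get('infoLink')}"
        )
        return []
```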
Before you begin, ensure you have met the following requirements:
- Python (3.x recommended) installed on your system.
- Rasa framework installed. You can install it using `pip`: `pip install rasa`
Before you can run the chatbot, you need to set up the required environment and install the necessary dependencies.
- Python 3: If you don't have Python 3 installed, you can download and install it from the official Python website.
- OpenAI API: You'll need an OpenAI API key. You can obtain one by following the instructions on the OpenAI website.
- Google Books API: You'll also need to set up a Google Books API key. Follow the steps outlined in the Google Books API documentation to obtain your API key.
- dotenv: This project uses `dotenv` to manage environment variables. You can install it using `pip`: `pip install python-dotenv`
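
With `python-dotenv`, the API keys can be kept in a local `.env` file and loaded at startup. A minimal sketch, assuming the keys are stored under the `GOOGLE_API_KEY` and `OPENAI_API_KEY` variable names used later in this README:

```python
# Illustrative sketch of loading API keys with python-dotenv.
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from a .env file in the working directory

GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")

if not GOOGLE_API_KEY or not OPENAI_API_KEY:
    raise RuntimeError("Missing GOOGLE_API_KEY or OPENAI_API_KEY in the environment/.env file.")
```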
- Clone the repository: `git clone https://github.com/kakhi1/Book-Bot.git`
- `cd Book-Bot`
- `pip install -r requirements.txt`
- Add your Google API key to `GOOGLE_API_KEY`
- Add your OpenAI API key to `OPENAI_API_KEY`
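
The recommendation side uses the `OPENAI_API_KEY`. The snippet below is a minimal sketch of how such a call could look with the official `openai` Python client (v1+); the function name `recommend_books`, the model choice, and the prompt wording are illustrative assumptions, not the repository's actual code.

```python
# Illustrative sketch of an OpenAI-backed book recommendation call (openai>=1.0 client).
import os

from openai import OpenAI

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))


def recommend_books(author: str, description: str) -> str:
    """Ask the model for a few book recommendations based on an author and a description."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # model choice is an assumption
        messages=[
            {"role": "system", "content": "You recommend books."},
            {
                "role": "user",
                "content": f"Recommend three books similar to works by {author}: {description}",
            },
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(recommend_books("Ursula K. Le Guin", "character-driven science fiction"))
```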
Train the Rasa model with `rasa train`, then start the chatbot with `rasa run`.
- Access the chatbot via the HTML front-end by opening the index.html file in your web browser.
- Interact with the chatbot by providing queries for book information and recommendations.
To start the Rasa chatbot server, use the following command:
`python -m rasa run --port 8098 --cors "*"`
In a separate terminal, start the custom actions server:
`rasa run actions`
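
The HTML front-end talks to the bot over Rasa's REST channel. Below is a minimal Python sketch of the same request the front-end sends, assuming the REST channel is enabled in `credentials.yml` and the server is running on port 8098 as started above.

```python
# Illustrative sketch: send a message to the running bot over Rasa's REST channel.
import requests

RASA_URL = "http://localhost:8098/webhooks/rest/webhook"  # port matches the run command above

payload = {"sender": "demo-user", "message": "Tell me about Dune"}
response = requests.post(RASA_URL, json=payload, timeout=10)

# The REST channel returns a list of bot responses, e.g. [{"recipient_id": ..., "text": ...}]
for bot_message in response.json():
    print(bot_message.get("text", bot_message))
```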
- Thanks to Rasa for providing the chatbot framework.
- Thanks to Google for the Google Books API.
- Thanks to OpenAI for the recommendation capabilities.
To Dockerize the project, we'll create separate Dockerfiles for the Rasa chatbot, custom actions server, and the NGINX frontend server.
Rasa chatbot:

```shell
docker build . -t <account_username>/<repository_name>:<custom_image_tag>
docker run -d -p 8098:5005 <account_username>/<repository_name>:<custom_image_tag>
```

Custom actions server:

```shell
docker build . -t <account_username>/<repository_name>:<custom_image_tag>
docker run -d <account_username>/<repository_name>:<custom_image_tag>
```

NGINX front-end server:

```shell
docker build . -t <account_username>/<repository_name>:<custom_image_tag>
docker run -d -p 80:80 <account_username>/<repository_name>:<custom_image_tag>
```
Now you have the Rasa chatbot, the custom actions server, and the NGINX front-end server running in separate Docker containers. When deploying the action server image, point the Rasa chatbot at it by setting the action server URL in the chatbot's `endpoints.yml` file. The Rasa chatbot URL can then be used for the front-end's POST requests.
- Kakhi Mtchedluri
- [email protected]