NLP server housing intent and NER models, as well as a LangChain memory agent, for Ditto assistant clients.
- Rename `.env.example` to `.env` and set `OPENAI_API_KEY` to your OpenAI API key.
- Rename `example_users.json` to `users.json` and add your user information.

```sh
docker build -t nlp_server .
docker run --env-file .env --rm -p 32032:32032 nlp_server
```
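If you prefer the server to run in the background, a detached variant of the run command also works; the flags below are standard Docker options and nothing here is specific to `nlp_server`:

```sh
# Run detached with a fixed name, then follow the logs and stop it when done.
docker run -d --rm --env-file .env -p 32032:32032 --name nlp_server nlp_server
docker logs -f nlp_server
docker stop nlp_server
```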
- Main Google Search Agent:
  - Create an account on serpapi.com and set `SERPAPI_API_KEY` to your API key in `.env`.
- Fallback Agent:
  - Create an account on serper.dev and set `SERPER_API_KEY` to your API key in `.env`.
- If you prefer using HuggingFace's API, set `HUGGINGFACEHUB_API_TOKEN` to your HuggingFace API key and set `LLM=huggingface` in `.env`.
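Putting the configuration together, a filled-out `.env` might look like the sketch below. Only the variables mentioned above are included, the values are placeholders, and you only need the keys for the agents you actually use:

```env
# OpenAI
OPENAI_API_KEY=your-openai-api-key
# Main Google Search agent (serpapi.com)
SERPAPI_API_KEY=your-serpapi-api-key
# Fallback search agent (serper.dev)
SERPER_API_KEY=your-serper-api-key
# Only if you prefer HuggingFace's API instead of OpenAI
HUGGINGFACEHUB_API_TOKEN=your-huggingface-api-token
LLM=huggingface
```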
- Install Neo4j Desktop and create a new project.
- Add a new database named `ditto-memory` and set the password to `password`.
- That's it! You can now visualize the memory by opening the Neo4j browser and running the following query:

```cypher
MATCH (n) RETURN n LIMIT 100
```
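If you'd rather inspect the memory graph from code instead of the Neo4j browser, the official `neo4j` Python driver can run the same query. The sketch below assumes Neo4j Desktop's default Bolt URI (`bolt://localhost:7687`) and the default `neo4j` username, with the password set above:

```python
from neo4j import GraphDatabase

# Assumed defaults: local Bolt endpoint and the `neo4j` user,
# with the password configured in the steps above.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    for record in session.run("MATCH (n) RETURN n LIMIT 100"):
        node = record["n"]
        # Each node type (described below) stores its text in `description`.
        print(node.labels, node.get("description"))

driver.close()
```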
In the browser, all nodes can be expanded to view their properties. Each node holds textual information in the `description` property. The following are the types of nodes in the graph and what their `description` properties hold:

- Prompt Nodes: hold a summary of the user's prompt in `description`.
- Response Nodes: hold a summary of Ditto's response in `description`.
- Subject Nodes: hold the subject name of the conversation pair in `title` and `description` (same as `title` for Subject Nodes).
- Sub-category Nodes: hold a summary of each sub-category in Ditto's response in `description`.
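Because every node type stores its text in `description`, you can search memories directly from the Neo4j browser. The query below is a hypothetical example that relies only on the properties listed above (no particular node labels are assumed); `title` will simply be null for node types that don't have it:

```cypher
// Find memory nodes whose description mentions a given topic.
MATCH (n)
WHERE n.description CONTAINS "weather"
RETURN n.title, n.description
LIMIT 25
```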