# langchain-pinecone-qa

A sample Streamlit web application for generative question-answering with LangChain and Pinecone.

LangChain is an open-source framework created to aid the development of applications leveraging the power of large language models (LLMs). It can be used for chatbots, text summarisation, data generation, code understanding, question answering, evaluation, and more. Pinecone, on the other hand, is a fully managed vector database, making it easy to build high-performance vector search applications without infrastructure hassles. Once you have generated the vector embeddings using a service like OpenAI Embeddings, you can store, manage and search through them in Pinecone to power semantic search, recommendations, and other information retrieval use cases. See this post on LangChain Embeddings for a primer on embeddings and sample use cases.
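To make the idea concrete, here is a minimal, self-contained sketch of what a vector store does under the hood: documents are stored alongside embedding vectors, and a query is answered by ranking documents by cosine similarity to the query vector. The documents, the tiny 3-dimensional vectors, and the `search` helper are all hypothetical stand-ins for illustration; in the real app, embeddings come from a service like OpenAI Embeddings and the search is handled by Pinecone at scale.

```python
import math

# Hypothetical toy "embeddings": in the real app these come from an
# embedding service and live in Pinecone, not in a Python dict.
documents = {
    "doc-1": ("Pinecone is a managed vector database.", [0.9, 0.1, 0.0]),
    "doc-2": ("LangChain helps build LLM applications.", [0.1, 0.9, 0.0]),
    "doc-3": ("Streamlit turns scripts into web apps.", [0.0, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def search(query_vector, top_k=1):
    """Rank stored documents by similarity to the query vector."""
    scored = [
        (cosine_similarity(query_vector, vec), doc_id, text)
        for doc_id, (text, vec) in documents.items()
    ]
    return sorted(scored, reverse=True)[:top_k]

# A query vector close to doc-1's embedding retrieves doc-1 first.
results = search([0.8, 0.2, 0.1])
print(results[0][1])  # doc-1
```

A production retrieval pipeline adds chunking, metadata filtering, and approximate nearest-neighbour indexing, but the ranking step is conceptually this similarity search.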


For a detailed tutorial on generative question-answering with LangChain and Pinecone, see this post.

To deploy on Railway using a one-click template, click the button below.

Deploy on Railway