This project is a code assistant that helps developers write better code, learn new programming languages, and debug more efficiently. It serves as an educational tool for those looking to expand their coding expertise while also offering solid debugging support: when a developer hits an error, the assistant suggests likely causes and fixes, streamlining the debugging process and cutting development time.
- Python: The primary programming language, used for the backend logic and application code.
- Ollama: Runs large language models locally and efficiently, keeping computational demands low.
- CodeLlama: The code-specialized model served through Ollama, providing the code generation, explanation, and debugging capabilities behind the assistant.
- LangChain: Links the language model with prompts and other services, enabling more complex, composable language-based applications.
- Gradio: Provides the web interface through which users submit inputs and receive the model's outputs. A minimal sketch of how these pieces fit together follows this list.
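The snippet below is a minimal sketch of how these pieces connect: LangChain's Ollama wrapper talks to the locally served CodeLlama model, and a prompt template shapes the request. It assumes `langchain-community` is installed and an Ollama server is running; the prompt wording and variable names are illustrative, not taken from this repository.

```python
# Minimal sketch: LangChain prompting a locally served CodeLlama model via Ollama.
# Assumes `pip install langchain-community` and a running Ollama server;
# the prompt text and names are illustrative, not the project's own code.
from langchain_community.llms import Ollama
from langchain_core.prompts import PromptTemplate

llm = Ollama(model="codellama")  # talks to the local Ollama server

prompt = PromptTemplate.from_template(
    "You are a helpful code assistant.\n"
    "Explain the following error and suggest a fix:\n{error}"
)

chain = prompt | llm  # LangChain Expression Language: the prompt feeds the model

print(chain.invoke({"error": "TypeError: 'NoneType' object is not subscriptable"}))
```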
- Python 3.x
- Ollama
Clone the repo
git clone https://github.com/AmarnathaGowda/CodeLlamaAssistant.git
- Download the Ollama CLI from the official website
- In your favorite terminal, run the following command to pull and run the CodeLlama model
ollama run codellama
- Once the model download is complete, run the following command to build the custom model from the provided Modelfile (a quick way to test it from Python is sketched after this step)
ollama create LokiLogic -f Modelfile
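To confirm the custom model was created, you can send it a quick test prompt from Python using the official `ollama` client. This is only a smoke-test sketch, assuming `pip install ollama` and a running Ollama server; the prompt is an arbitrary example.

```python
# Quick smoke test for the freshly created LokiLogic model.
# Assumes the `ollama` Python package is installed and the Ollama server is running;
# the test prompt is just an example.
import ollama

response = ollama.generate(
    model="LokiLogic",
    prompt="Write a Python function that reverses a string.",
)
print(response["response"])  # the model's generated answer
```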
conda create -n CodeLlamaAssistantEnv python -y
conda activate CodeLlamaAssistantEnv
pip install -r requirements.txt
- Finally, run the following command to start the app
python app.py
- Open the local URL in your browser: http://127.0.0.1:7860
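For orientation, a Gradio front end over an Ollama-served model typically amounts to a few lines like the sketch below. This is an illustration of the pattern, not the project's actual `app.py`; the function name, textbox labels, and the use of the `LokiLogic` model here are assumptions.

```python
# Illustrative sketch of a Gradio front end over the Ollama-served model.
# Not the repository's actual app.py: the function name, labels, and wiring
# are assumptions that mirror the setup steps above.
import gradio as gr
from langchain_community.llms import Ollama

llm = Ollama(model="LokiLogic")

def assist(question: str) -> str:
    """Send the user's code, error, or question to the model and return its reply."""
    return llm.invoke(question)

demo = gr.Interface(
    fn=assist,
    inputs=gr.Textbox(lines=8, label="Paste code, an error message, or a question"),
    outputs=gr.Textbox(label="Assistant response"),
    title="CodeLlama Assistant",
)

if __name__ == "__main__":
    demo.launch()  # serves on http://127.0.0.1:7860 by default
```

Running `python app.py` starts the Gradio server, which is why the app is reachable at the localhost URL above.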