
Commit

fix: fix bad merge in AntonOsika#639 (AntonOsika#644)
ErikBjare authored and 70ziko committed Oct 25, 2023
1 parent 62e6672 commit 57e335a
Showing 3 changed files with 24 additions and 10 deletions.
13 changes: 3 additions & 10 deletions README.md
@@ -4,12 +4,11 @@
[![GitHub Repo stars](https://img.shields.io/github/stars/AntonOsika/gpt-engineer?style=social)](https://github.com/AntonOsika/gpt-engineer)
[![Twitter Follow](https://img.shields.io/twitter/follow/antonosika?style=social)](https://twitter.com/AntonOsika)


**Specify what you want it to build, the AI asks for clarification, and then builds it.**

GPT Engineer is made to be easy to adapt, extend, and make your agent learn how you want your code to look. It generates an entire codebase based on a prompt.

[Demo](https://twitter.com/antonosika/status/1667641038104674306)
[Demo](https://twitter.com/antonosika/status/1667641038104674306)[Documentation](https://gpt-engineer.readthedocs.io/en/latest/)

## Project philosophy

@@ -42,6 +41,7 @@ Either just:
Or:
- Create a copy of `.env.template` named `.env`
- Add your OPENAI_API_KEY in .env
- (advanced) Use a local model, see [docs](https://gpt-engineer.readthedocs.io/en/latest/open_models.html).

Check the [Windows README](./WINDOWS_README.md) for Windows usage.
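For reference, a minimal `.env` might look like the sketch below; the key value is a placeholder, not a real credential:

```bash
# .env — sketch only; replace the placeholder with your actual OpenAI API key
OPENAI_API_KEY=sk-...
```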

@@ -81,15 +81,8 @@ Editing the `preprompts`, and evolving how you write the project prompt, is how

Each step in `steps.py` will have its communication history with GPT4 stored in the logs folder, and can be rerun with `scripts/rerun_edited_message_logs.py`.

### Running with open source models

You can use gpt-engineer with open-source models by using an OpenAI-compatible API, such as the one offered by the [text-generation-webui `openai` extension](https://github.com/oobabooga/text-generation-webui/blob/main/extensions/openai/README.md). This can easily be set up with [TheBloke's Runpod template](https://www.runpod.io/console/gpu-secure-cloud?template=f1pf20op0z).

To do so, first set up the API according to the instructions linked above. Then open the text-generation-webui settings, enable the `openai` extension, and save. Next, expose TCP port 5001 in your Runpod config, which maps it to an external TCP port such as 40125. Restart your Runpod and check that the API is live by browsing: http://<public ip>:<port>/v1/models

Then, as an example, we can run it with WizardCoder-Python-34B hosted on Runpod: `OPENAI_API_BASE=http://<host>:<port>/v1 python -m gpt_engineer.main benchmark/pomodoro_timer --steps benchmark TheBloke_WizardCoder-Python-34B-V1.0-GPTQ`
You can also run with open source models, like WizardCoder. See the [documentation](https://gpt-engineer.readthedocs.io/en/latest/open_models.html) for example instructions.

Check your Runpod dashboard for the host and (exposed TCP) port; mine was something like 40125.

## Vision
The gpt-engineer community is building the **open platform for devs to tinker with and build their personal code-generation toolbox**.
1 change: 1 addition & 0 deletions docs/index.rst
@@ -20,6 +20,7 @@ Welcome to GPT-ENGINEER's Documentation
usage
readme_link
windows_readme_link
open_models


.. toctree::
20 changes: 20 additions & 0 deletions docs/open_models.rst
@@ -0,0 +1,20 @@
Using with open/local models
============================

You can integrate ``gpt-engineer`` with open-source models by leveraging an OpenAI-compatible API. One such API is provided by the `openai extension of text-generation-webui <https://github.com/oobabooga/text-generation-webui/blob/main/extensions/openai/README.md>`_.

Setup
-----

To get started, first set up the API with the Runpod template, as per the `instructions <https://github.com/oobabooga/text-generation-webui/blob/main/extensions/openai/README.md>`_.

Running the Example
-------------------

Once the API is set up, you can run the following example using WizardCoder-Python-34B hosted on Runpod:

.. code-block:: bash

    OPENAI_API_BASE=http://<host>:<port>/v1 python -m gpt_engineer.main benchmark/pomodoro_timer --steps benchmark TheBloke_WizardCoder-Python-34B-V1.0-GPTQ

To find the host and the exposed TCP port, check your Runpod dashboard. For example, my port was 40125.
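Before running, it can help to confirm the endpoint is reachable; a minimal sketch of that check (the host and port are placeholders taken from your own Runpod dashboard):

.. code-block:: bash

    # Sanity check: list the models served by the OpenAI-compatible endpoint.
    # A JSON response confirms the API is live.
    curl http://<host>:<port>/v1/models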
