
Is it possible to support gemini api #514

Closed
4t8dd opened this issue Mar 14, 2024 · 4 comments

4t8dd commented Mar 14, 2024

I am not sure how big the difference between GPT and Gemini is; based on some public reports, I assume it is small.
I wonder if it would be possible to support the Gemini API? It is free, after all.

@CheerfulPianissimo

https://github.com/zhu327/gemini-openai-proxy seems like it could be used for this.

CheerfulPianissimo commented Mar 18, 2024

OK, I can confirm that it works. With the proxy running, just set API_BASE_URL to http://localhost:8080/v1 and use your Gemini API key as the OpenAI key.
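
For reference, here is a minimal sketch of that setup. The port mapping is an assumption based on the localhost:8080 URL above, and API_BASE_URL / OPENAI_API_KEY are illustrative environment variable names; check your client's configuration for the exact names it reads.

# Run the proxy container (assumes it listens on 8080 inside the container;
# consult the gemini-openai-proxy README for the actual port).
docker run -d --name gemini -p 8080:8080 zhu327/gemini-openai-proxy:latest

# Point the OpenAI-compatible client at the proxy and pass the Gemini key
# where the OpenAI key would normally go.
export API_BASE_URL="http://localhost:8080/v1"
export OPENAI_API_KEY="<your-gemini-api-key>"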

4t8dd (Author) commented Mar 19, 2024

@CheerfulPianissimo Thanks for trying it out. It works for me too.
There is also another repo that lets you deploy this proxy to a SaaS platform, and that works perfectly for me.

4t8dd closed this as completed Mar 19, 2024
@nullnuller

Is anyone else having this issue?

"model": "gpt-4",
"messages": [{"role": "user", "content": "Say this is a test!"}],
"temperature": 0.7
}'
curl: (52) Empty reply from server

Here's the docker container:

$ docker ps -a
CONTAINER ID   IMAGE                               COMMAND         CREATED         STATUS         PORTS                    NAMES
9a49c5c0b3e4   zhu327/gemini-openai-proxy:latest   "/app/gemini"   3 minutes ago   Up 3 minutes   0.0.0.0:8081->8081/tcp   gemini
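
For comparison, a complete request against the proxy might look like the sketch below. It assumes the proxy exposes the OpenAI-compatible /v1/chat/completions endpoint and accepts the Gemini API key as the bearer token (GEMINI_API_KEY is an illustrative variable name). Note that the earlier comment points the client at port 8080 while the container above maps 8081, so the port may need adjusting to match how the proxy is actually exposed.

# Full request shape, mirroring the truncated paste above.
curl http://localhost:8081/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $GEMINI_API_KEY" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'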
