
[FR]: Enable Automatic Token and Cost Tracking for OpenAI LLM Calls via HTTP POST in Opik UI #1415

Open
Sanchita-P opened this issue Feb 27, 2025 · 3 comments
Labels
enhancement New feature or request

@Sanchita-P
Proposal summary

Opik currently tracks token counts and costs automatically when LLM calls go through the standard integrations (e.g. track_openai). In my project, however, OpenAI LLM calls are made through a raw HTTP POST request rather than the OpenAI client library, so automatic token and cost tracking is not available in the UI. This feature request asks for a way to enable automatic token and cost tracking for HTTP POST-based LLM calls in the Opik UI. If this functionality already exists, I would appreciate guidance on setting it up; if not, I kindly request that the feature be considered for development.

Code example of how I call the LLM:

    async with session.post(
        url, headers=system_info.headers, data=payload, params=params, timeout=300
    ) as response:
        response_data = await response.read()
        response_json = json.loads(response_data)

Motivation

Problem Statement: The current limitation prevents teams using HTTP POST requests for LLM calls from benefiting from automatic token and cost tracking in the Opik UI.

Current Workaround: Manually calculating tokens and costs using external tools or custom scripts, which is error-prone and inefficient.

Benefits: This feature would enhance transparency and monitoring capabilities, streamline workflows, and align HTTP POST-based LLM integrations with the standard API integrations in terms of analytics and reporting.

Thank you for considering this request. Please let me know if you need any additional information or if I can assist further in providing context or testing potential solutions.

Sanchita-P added the enhancement (New feature or request) label on Feb 27, 2025
@jverre
Collaborator
jverre commented Feb 28, 2025

Hi @Sanchita-P,

This is actually possible today! Let me update the docs and get back to you. Essentially, you need to specify the model and provider fields when logging a span.
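A minimal sketch of how the raw HTTP response could feed into a manually logged span. This assumes an OpenAI-compatible response body, which includes a top-level "usage" object with prompt_tokens, completion_tokens, and total_tokens; the exact Opik span parameters are whatever the updated docs describe, and the client call shown in the trailing comment is hypothetical:

```python
import json

def extract_usage(response_body: bytes) -> dict:
    """Parse a raw OpenAI-style HTTP response and pull out its token counts.

    Chat Completions responses carry a "usage" object with prompt_tokens,
    completion_tokens, and total_tokens; missing fields default to 0.
    """
    response_json = json.loads(response_body)
    usage = response_json.get("usage", {})
    return {
        "prompt_tokens": usage.get("prompt_tokens", 0),
        "completion_tokens": usage.get("completion_tokens", 0),
        "total_tokens": usage.get("total_tokens", 0),
    }

# The extracted usage dict, together with the model and provider fields,
# would then be attached to the manually logged span, e.g. (hypothetical
# call shape, not the confirmed Opik API):
#   span = trace.span(..., model="gpt-4o", provider="openai",
#                     usage=extract_usage(response_data))
```

Since the HTTP call in the issue already reads the full body into response_data, the same bytes can be passed straight to this helper before logging the span.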

@jverre
Collaborator

jverre commented Feb 28, 2025

@Sanchita-P I've added some docs here, let me know if you have any questions.

I've opened a PR to support it when using opik track decorators: #1430

@Sanchita-P
Author

Thanks, @jverre! I'll look into this :) What about the token counts displayed in the UI?
