
Add new openapi spec for Watsonx.ai extension #322

Open
dzzzzllz opened this issue Feb 3, 2025 · 0 comments


dzzzzllz commented Feb 3, 2025

Hi, I have created a new all-in-one OpenAPI spec for the Watsonx.ai extension that can connect wxAssistant/wxOrchestrate to external AI services and to models/templates deployed in the watsonx.ai runtime.

The new OpenAPI spec includes six paths:

  • generate text with wx.ai
  • generate text (stream)
  • generate text with deployed model/template
  • generate text with deployed model/template (stream)
  • generate text with deployed AI service
  • generate text with deployed AI service (stream)

This could be useful for anyone who wants to use external models on watsonx.ai, skip tuning prompts inside wxA, or run a RAG pipeline inside watsonx.ai.
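As an illustration of what one of these paths might look like, here is a minimal OpenAPI 3.0 sketch of a "generate text with wx.ai" operation. The server URL, path, parameter names, and schema fields below are assumptions for illustration only and may differ from the actual contributed spec.

```yaml
# Hypothetical sketch of a single text-generation path.
# Endpoint, parameters, and schemas are illustrative assumptions,
# not the actual contents of the contributed spec.
openapi: 3.0.3
info:
  title: Watsonx.ai text generation (sketch)
  version: "0.1.0"
servers:
  - url: https://{region}.ml.cloud.ibm.com
    variables:
      region:
        default: us-south
paths:
  /ml/v1/text/generation:
    post:
      summary: Generate text with wx.ai
      parameters:
        - name: version
          in: query
          required: true
          schema:
            type: string
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [model_id, input]
              properties:
                model_id:
                  type: string
                input:
                  type: string
                project_id:
                  type: string
      responses:
        "200":
          description: Generated text returned by the model
```

The streaming variants of each path would differ mainly in the response content type (e.g. an event stream rather than a single JSON body).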

I would like to add this to https://github.com/watson-developer-cloud/assistant-toolkit/tree/master/integrations/extensions/starter-kits/language-model-watsonx as an advanced use case.
