Hi, I have created a new all-in-one OpenAPI spec for a watsonx.ai extension that can connect watsonx Assistant (wxA) / watsonx Orchestrate to external AI services and to models/templates deployed on the watsonx.ai runtime.
The new OpenAPI spec includes six paths:
generate text with wx.ai
generate text (stream)
generate text with deployed model/template
generate text with deployed model/template (stream)
generate text with deployed AI service
generate text with deployed AI service (stream)
This could be useful for anyone who wants to use external models with watsonx.ai, skip prompt tuning inside wxA, or run a RAG pipeline inside watsonx.ai.
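To illustrate the general shape of such an extension spec, here is a minimal sketch of one of the six paths (the non-streaming "generate text with wx.ai" case). This is an assumption-laden illustration, not the published spec: the server URL pattern, query parameter, and request-body fields are modeled on the public watsonx.ai text-generation endpoint, and the schemas are simplified.

```yaml
# Illustrative fragment only. The actual extension defines six such paths;
# everything below (operation summary, schemas, examples) is a sketch.
openapi: 3.0.3
info:
  title: watsonx.ai text generation (sketch)
  version: 0.1.0
servers:
  - url: https://{region}.ml.cloud.ibm.com
    variables:
      region:
        default: us-south
paths:
  /ml/v1/text/generation:
    post:
      summary: Generate text with a watsonx.ai foundation model
      parameters:
        - name: version
          in: query
          required: true
          schema:
            type: string
            example: "2023-05-29"
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [model_id, input, project_id]
              properties:
                model_id:
                  type: string    # e.g. an IBM or third-party foundation model id
                input:
                  type: string    # the prompt text
                project_id:
                  type: string    # watsonx.ai project to bill/scope the call
                parameters:
                  type: object    # decoding options (max_new_tokens, etc.)
      responses:
        "200":
          description: Generated text
```

The streaming variants would follow the same pattern against the corresponding `_stream` endpoints, and the deployment-backed paths would target a deployment id rather than a `model_id`.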
I want to add this to https://github.com/watson-developer-cloud/assistant-toolkit/tree/master/integrations/extensions/starter-kits/language-model-watsonx as an advanced use case.