Merge pull request #350 from elsoul/ai
update chat response
POPPIN-FUMI authored Mar 19, 2024
2 parents 32e4e94 + a7037fa commit bc69e50
Showing 6 changed files with 58 additions and 37 deletions.
5 changes: 5 additions & 0 deletions .changeset/dirty-fans-compare.md
@@ -0,0 +1,5 @@
---
"@skeet-framework/ai": patch
---

Update chat response
6 changes: 4 additions & 2 deletions packages/ai/README.md
@@ -31,6 +31,7 @@ This plugin wraps the following AI models.

- [Vertex AI(Google Cloud)](https://cloud.google.com/vertex-ai/)
- [Open AI(ChatGPT)](https://openai.com/)
+ - [Claude AI](https://claude.ai/)

Fast and easy to deploy with Skeet Framework.

@@ -113,6 +114,7 @@ const input = 'What is the capital of France?'
const gemini = await chat(context, examples, input, 'Gemini')
const openai = await chat(context, examples, input, 'OpenAI')
+ const claude = await chat(context, examples, input, 'Claude')
```
# Skeet AI Docs
@@ -145,8 +147,8 @@ Powered by TAI, Cloud Functions, Typesaurus, Jest, Prettier, and Google Cloud.
- [GitHub CLI](https://cli.github.com/)
```bash
- $ npm i -g @skeet-framework/cli
- $ skeet create web-app
+ $ pnpm add -g @skeet-framework/cli
+ $ skeet new
```
## Contributing
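The README's usage snippet routes the same context, examples, and input to three backends just by switching the `aiType` string. A minimal sketch of that dispatch pattern — the stub clients and the `chatSketch` name are illustrative assumptions, not the package's real implementation:

```typescript
// Hypothetical sketch of an aiType dispatcher; the real package calls
// Vertex AI, OpenAI, and Anthropic. Stub functions stand in for them here.
type AiType = 'Gemini' | 'OpenAI' | 'Claude'
type InputOutput = { input: string; output: string }

const clients: Record<AiType, (prompt: string) => string> = {
  Gemini: (p) => `[gemini] ${p}`,
  OpenAI: (p) => `[openai] ${p}`,
  Claude: (p) => `[claude] ${p}`,
}

function chatSketch(
  context: string,
  examples: InputOutput[],
  input: string,
  aiType: AiType = 'Gemini',
): string {
  // Fold the context and few-shot examples into a single prompt string.
  const shots = examples.map((e) => `Q: ${e.input}\nA: ${e.output}`).join('\n')
  return clients[aiType](`${context}\n${shots}\nQ: ${input}`)
}
```

Keying the client table by a string union keeps the public call site stable as new models (such as Claude in this commit) are added.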
6 changes: 4 additions & 2 deletions packages/ai/src/lib/chat.ts
@@ -25,13 +25,15 @@ const CLAUDE = 'Claude'
* @param context - A string providing context or background information for the chat model to consider.
* @param examples - An array of `InputOutput` objects representing example input-output pairs to guide the model's responses.
* @param input - The user's input string for which a response is requested from the chat model.
- * @param aiType - Specifies the chat model to use. Defaults to 'Gemini'. Can be either 'Gemini' or 'OpenAI'.
+ * @param aiType - Specifies the chat model to use. Defaults to 'Gemini'. Can be 'Gemini', 'OpenAI', or 'Claude'.
* @param isStream - A boolean indicating whether to return a stream of the model's response. Defaults to true.
* @param isLogging - A boolean indicating whether to log the stream's content to the console. Defaults to true.
* @returns Returns a Promise resolving to a stream of the model's response. If logging is disabled, the raw stream is returned directly.
* @throws Exits the process with status code 1 if an error occurs.
*
* @example
+ * import { chat } from '@skeet-framework/ai'
+ *
* const examples = [
* { input: "Who was the first person in space?", output: "Yuri Gagarin" },
* { input: "Tell me about the Apollo missions.", output: "Gemini" }
@@ -118,13 +120,13 @@ export const chat = async (
    examples,
    input,
  ) as MessageParam[]
- console.log(prompt)
  if (isStream) {
    const stream = await claudeChatStream(prompt)
    if (!isLogging) {
      return stream
    }
    await readClaudeStream(stream)
+   return stream
  }
  const resp = await claudeChat(prompt)
  return resp
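The chat.ts hunk adjusts the stream-handling branch: with `isStream` true, the Claude stream is either returned raw (`isLogging` false) or read out to stdout first; otherwise a single full response is awaited. A hedged sketch of that control flow — `claudeStreamStub` and `chatBranch` are stand-in names, not the package's API:

```typescript
// Illustrative stub of the Claude streaming call; the real code uses the
// Anthropic SDK. Names here are hypothetical.
async function* claudeStreamStub(prompt: string): AsyncGenerator<string> {
  yield `echo: ${prompt}`
}

async function chatBranch(
  prompt: string,
  isStream = true,
  isLogging = true,
): Promise<AsyncIterable<string> | string> {
  if (isStream) {
    const stream = claudeStreamStub(prompt)
    if (!isLogging) return stream // hand the raw stream to the caller
    for await (const chunk of stream) process.stdout.write(chunk) // log it
    return stream // mirrors the patch: the stream has already been read here
  }
  return `resp: ${prompt}` // non-streaming: resolve with the full response
}
```

Note that returning an already-consumed async generator yields no further items; callers that want to iterate themselves should pass `isLogging: false`.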
12 changes: 10 additions & 2 deletions packages/ai/src/lib/readClaudeStream.ts
@@ -6,8 +6,16 @@ export const readClaudeStream = async (
streamingResp: Stream<MessageStreamEvent>,
) => {
  for await (const item of streamingResp) {
-   const text = JSON.parse(JSON.stringify(item))
-   process.stdout.write(chalk.white(text.delta?.text))
+   try {
+     const text = JSON.parse(JSON.stringify(item))
+     const msg = text.delta?.text
+     if (msg != null) process.stdout.write(chalk.white(msg))
+   } catch (error) {
+     process.stdout.write(
+       chalk.white('Something went wrong... Please try again 🙇'),
+     )
+     return error
+   }
  }
  // After the stream ends, write a newline as a separator
  process.stdout.write('\n')
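The readClaudeStream patch wraps per-event handling in try/catch and only writes when a text delta is present, so one malformed event ends the read gracefully instead of crashing the process. A self-contained sketch of that guarded-read pattern — `DeltaEvent`, `fakeStream`, and `readStream` are illustrative, not the Anthropic SDK types:

```typescript
// Sketch of the guarded stream-read pattern, driven by a fake event stream.
type DeltaEvent = { delta?: { text?: string } }

async function* fakeStream(): AsyncGenerator<DeltaEvent> {
  yield { delta: { text: 'Hello' } }
  yield { delta: { text: ', world' } }
  yield {} // an event without a text delta is skipped, not an error
}

async function readStream(stream: AsyncIterable<DeltaEvent>): Promise<string> {
  let out = ''
  for await (const item of stream) {
    try {
      const msg = item.delta?.text
      if (msg != null) out += msg // only append when a text delta is present
    } catch {
      return out + ' [stream error]' // report instead of crashing the process
    }
  }
  return out + '\n' // newline separator once the stream ends
}
```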
14 changes: 11 additions & 3 deletions packages/ai/src/lib/readGeminiStream.ts
@@ -1,18 +1,26 @@
import { StreamGenerateContentResult } from '@google-cloud/vertexai'
import { VertexAiResponse } from './types/vertexAiResponseTypes'
import chalk from 'chalk'
import { inspect } from 'util'

export const readGeminiStream = async (
streamingResp: StreamGenerateContentResult,
) => {
  for await (const item of streamingResp.stream) {
    // Treat item as VertexAiResponse; adjust the type assertion as needed.
    const text = JSON.parse(JSON.stringify(item)) as unknown as VertexAiResponse
-   // Use process.stdout.write instead of console.log to print the text without a newline
-   if (text.candidates[0].content.parts[0].text) {
+   try {
+     // Use process.stdout.write instead of console.log to print the text without a newline
+     if (text.candidates[0].content.parts[0].text) {
+       process.stdout.write(
+         chalk.white(text.candidates[0].content.parts[0].text),
+       )
+     }
+   } catch (error) {
      process.stdout.write(
-       chalk.white(text.candidates[0].content.parts[0].text),
+       chalk.white('Something went wrong... Please try again 🙇'),
      )
+     return error
+   }
  }

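The Gemini reader needs its guard for the same reason: the deeply nested `candidates[0].content.parts[0].text` access throws as soon as a chunk arrives without a candidates array. A small sketch of that failure mode — `GeminiChunk` and `extractText` are illustrative names, not the Vertex AI SDK types:

```typescript
// Extracting the nested text path with a guard, as in the patched reader.
type GeminiChunk = {
  candidates?: { content: { parts: { text?: string }[] } }[]
}

function extractText(chunk: GeminiChunk): string | undefined {
  try {
    // Mirrors text.candidates[0].content.parts[0].text from the patch;
    // throws on a chunk with no candidates array.
    return chunk.candidates![0].content.parts[0].text
  } catch {
    return undefined // malformed chunk: surface nothing rather than crash
  }
}
```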
52 changes: 24 additions & 28 deletions pnpm-lock.yaml

Some generated files are not rendered by default.
