
release: 4.15.0 #410

Merged
19 changes: 19 additions & 0 deletions .devcontainer/Dockerfile
@@ -0,0 +1,19 @@
# syntax=docker/dockerfile:1
FROM debian:bookworm-slim AS stainless

RUN apt-get update && apt-get install -y \
nodejs \
npm \
yarnpkg \
&& apt-get clean autoclean

# Yarn
RUN ln -sf /usr/bin/yarnpkg /usr/bin/yarn

WORKDIR /workspace

COPY package.json yarn.lock /workspace/

RUN yarn install

COPY . /workspace
20 changes: 20 additions & 0 deletions .devcontainer/devcontainer.json
@@ -0,0 +1,20 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/debian
{
"name": "Debian",
"build": {
"dockerfile": "Dockerfile"
}

// Features to add to the dev container. More info: https://containers.dev/features.
// "features": {},

// Use 'forwardPorts' to make a list of ports inside the container available locally.
// "forwardPorts": [],

// Configure tool-specific properties.
// "customizations": {},

// Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
// "remoteUser": "root"
}
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "4.14.2"
".": "4.15.0"
}
11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,16 @@
# Changelog

## 4.15.0 (2023-11-03)

Full Changelog: [v4.14.2...v4.15.0](https://github.com/openai/openai-node/compare/v4.14.2...v4.15.0)

### Features

* **beta:** add streaming and function calling helpers ([#409](https://github.com/openai/openai-node/issues/409)) ([510c1f3](https://github.com/openai/openai-node/commit/510c1f325ee55197b4c2f434475128c265500746))
* **client:** allow binary returns ([#416](https://github.com/openai/openai-node/issues/416)) ([02f7ad7](https://github.com/openai/openai-node/commit/02f7ad7f736751e0e7687e6744bae464d4e40b79))
* **github:** include a devcontainer setup ([#413](https://github.com/openai/openai-node/issues/413)) ([fb2996f](https://github.com/openai/openai-node/commit/fb2996f0d291210878145aacf9b952f8133d9414))
* streaming improvements ([#411](https://github.com/openai/openai-node/issues/411)) ([37b622c](https://github.com/openai/openai-node/commit/37b622c79ddbd6c286b730e740403c82b542e796))

## 4.14.2 (2023-10-30)

Full Changelog: [v4.14.1...v4.14.2](https://github.com/openai/openai-node/compare/v4.14.1...v4.14.2)
115 changes: 114 additions & 1 deletion README.md
@@ -21,7 +21,7 @@ You can import in Deno via:
<!-- x-release-please-start-version -->

```ts
import OpenAI from 'https://raw.githubusercontent.com/openai/openai-node/v4.14.1-deno/mod.ts';
import OpenAI from 'https://raw.githubusercontent.com/openai/openai-node/v4.14.2-deno/mod.ts';
```

<!-- x-release-please-end -->
@@ -102,6 +102,119 @@ Documentation for each method, request param, and response field are available i
> [!IMPORTANT]
> Previous versions of this SDK used a `Configuration` class. See the [v3 to v4 migration guide](https://github.com/openai/openai-node/discussions/217).

### Streaming responses

This library provides several conveniences for streaming chat completions, for example:

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
const stream = await openai.beta.chat.completions.stream({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Say this is a test' }],
stream: true,
});

stream.on('content', (delta, snapshot) => {
process.stdout.write(delta);
});

// or, equivalently:
for await (const part of stream) {
process.stdout.write(part.choices[0]?.delta?.content || '');
}

const chatCompletion = await stream.finalChatCompletion();
console.log(chatCompletion); // {id: "…", choices: […], …}
}

main();
```

Streaming with `openai.beta.chat.completions.stream({…})` exposes
[various helpers for your convenience](helpers.md#events) including event handlers and promises.

Alternatively, you can use `openai.chat.completions.create({ stream: true, … })`
which only returns an async iterable of the chunks in the stream and thus uses less memory
(it does not build up a final chat completion object for you).

If you need to cancel a stream, you can `break` from a `for await` loop or call `stream.abort()`.
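The raw-iterable form can be sketched without hitting the network. A minimal illustration — the `Chunk` type and `mockStream` generator below are stand-ins for the SDK's `ChatCompletionChunk` and the iterable returned by `create({ stream: true, … })`, not part of the library:

```typescript
// Simplified shape of the part of a chunk we consume.
type Chunk = { choices: Array<{ delta?: { content?: string } }> };

// Stand-in for `await openai.chat.completions.create({ …, stream: true })`,
// which yields chunks as they arrive over the wire.
async function* mockStream() {
  for (const piece of ['This ', 'is ', 'a ', 'test']) {
    yield { choices: [{ delta: { content: piece } }] } as Chunk;
  }
}

// Accumulate content deltas into the full reply. With a real stream,
// breaking out of the `for await` loop cancels the underlying request.
async function collect(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content || '';
  }
  return text;
}

collect(mockStream()).then((text) => console.log(text)); // prints "This is a test"
```

Because no final chat completion object is built up, this pattern keeps memory use proportional to a single chunk.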

### Automated function calls

We provide an `openai.beta.chat.completions.runFunctions({…})` convenience helper for using function calls
with the `/chat/completions` endpoint. It automatically calls the JavaScript functions you provide
and sends their results back to the `/chat/completions` endpoint,
looping as long as the model requests function calls.

If you pass a `parse` function, it will automatically parse the `arguments` for you and return any parsing errors to the model to attempt auto-recovery. Otherwise, the args will be passed to the function you provide as a string.

If you pass `function_call: {name: …}` instead of `auto`, it returns immediately after calling that function (and only loops to auto-recover parsing errors).

```ts
import OpenAI from 'openai';

const client = new OpenAI();

async function main() {
const runner = client.beta.chat.completions
.runFunctions({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'How is the weather this week?' }],
functions: [
{
function: getCurrentLocation,
parameters: { type: 'object', properties: {} },
},
{
function: getWeather,
parse: JSON.parse, // or use a validation library like zod for typesafe parsing.
parameters: {
type: 'object',
properties: {
location: { type: 'string' },
},
},
},
],
})
.on('message', (message) => console.log(message));

const finalContent = await runner.finalContent();
console.log();
console.log('Final content:', finalContent);
}

async function getCurrentLocation() {
return 'Boston'; // Simulate lookup
}

async function getWeather(args: { location: string }) {
  const { location } = args;
  // … look up the weather for `location` …
  const temperature = '50degF'; // Simulated result
  const precipitation = 'high';
  return { temperature, precipitation };
}

main();

// {role: "user", content: "How's the weather this week?"}
// {role: "assistant", function_call: "getCurrentLocation", arguments: "{}"}
// {role: "function", name: "getCurrentLocation", content: "Boston"}
// {role: "assistant", function_call: "getWeather", arguments: '{"location": "Boston"}'}
// {role: "function", name: "getWeather", content: '{"temperature": "50degF", "precipitation": "high"}'}
// {role: "assistant", content: "It's looking cold and rainy - you might want to wear a jacket!"}
//
// Final content: "It's looking cold and rainy - you might want to wear a jacket!"
```
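The `parse` hook in the example above can also validate: an error thrown from `parse` is returned to the model so it can retry with corrected arguments. A minimal hand-rolled validator as a sketch — a library like zod is the more typical choice, and `parseWeatherArgs` is illustrative, not part of the SDK:

```typescript
// Parses and validates getWeather's `arguments` string. Throwing here feeds
// the error message back to the model for auto-recovery.
function parseWeatherArgs(input: string): { location: string } {
  const parsed: unknown = JSON.parse(input);
  if (
    typeof parsed !== 'object' ||
    parsed === null ||
    typeof (parsed as { location?: unknown }).location !== 'string'
  ) {
    throw new Error('Expected a JSON object with a string `location` property');
  }
  return { location: (parsed as { location: string }).location };
}

console.log(parseWeatherArgs('{"location": "Boston"}')); // { location: 'Boston' }
```

Such a function would be passed as `parse: parseWeatherArgs` in place of the bare `parse: JSON.parse`.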

Like with `.stream()`, we provide a variety of [helpers and events](helpers.md#events).

Read more about examples such as integrating with [zod](helpers.md#integrate-with-zod),
[next.js](helpers.md#integrate-with-next-js), and [proxying a stream to the browser](helpers.md#proxy-streaming-to-a-browser).

## File Uploads

Request parameters that correspond to file uploads can be passed in many different forms:
11 changes: 11 additions & 0 deletions api.md
@@ -156,3 +156,14 @@ Methods:
- <code title="get /fine-tunes">client.fineTunes.<a href="./src/resources/fine-tunes.ts">list</a>() -> FineTunesPage</code>
- <code title="post /fine-tunes/{fine_tune_id}/cancel">client.fineTunes.<a href="./src/resources/fine-tunes.ts">cancel</a>(fineTuneId) -> FineTune</code>
- <code title="get /fine-tunes/{fine_tune_id}/events">client.fineTunes.<a href="./src/resources/fine-tunes.ts">listEvents</a>(fineTuneId, { ...params }) -> FineTuneEventsListResponse</code>

# Beta

## Chat

### Completions

Methods:

- <code>client.beta.chat.completions.<a href="./src/resources/beta/chat/completions.ts">runFunctions</a>(body, options?) -> ChatCompletionRunner | ChatCompletionStreamingRunner</code>
- <code>client.beta.chat.completions.<a href="./src/resources/beta/chat/completions.ts">stream</a>(body, options?) -> ChatCompletionStream</code>
88 changes: 87 additions & 1 deletion ecosystem-tests/node-ts-cjs-auto/tests/test.ts
@@ -1,4 +1,4 @@
import OpenAI, { toFile } from 'openai';
import OpenAI, { APIUserAbortError, toFile } from 'openai';
import { TranscriptionCreateParams } from 'openai/resources/audio/transcriptions';
import fetch from 'node-fetch';
import { File as FormDataFile, Blob as FormDataBlob } from 'formdata-node';
@@ -68,6 +68,92 @@ it(`streaming works`, async function () {
expect(chunks.map((c) => c.choices[0]?.delta.content || '').join('')).toBeSimilarTo('This is a test', 10);
});

it(`ChatCompletionStream works`, async function () {
const chunks: OpenAI.Chat.ChatCompletionChunk[] = [];
const contents: [string, string][] = [];
const messages: OpenAI.Chat.ChatCompletionMessage[] = [];
const chatCompletions: OpenAI.Chat.ChatCompletion[] = [];
let finalContent: string | undefined;
let finalMessage: OpenAI.Chat.ChatCompletionMessage | undefined;
let finalChatCompletion: OpenAI.Chat.ChatCompletion | undefined;

const stream = client.beta.chat.completions
.stream({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Say this is a test' }],
})
.on('chunk', (chunk) => chunks.push(chunk))
.on('content', (delta, snapshot) => contents.push([delta, snapshot]))
.on('message', (message) => messages.push(message))
.on('chatCompletion', (completion) => chatCompletions.push(completion))
.on('finalContent', (content) => (finalContent = content))
.on('finalMessage', (message) => (finalMessage = message))
.on('finalChatCompletion', (completion) => (finalChatCompletion = completion));
const content = await stream.finalContent();

expect(content).toBeSimilarTo('This is a test', 10);
expect(chunks.length).toBeGreaterThan(0);
expect(contents.length).toBeGreaterThan(0);
for (const chunk of chunks) {
expect(chunk.id).toEqual(finalChatCompletion?.id);
expect(chunk.created).toEqual(finalChatCompletion?.created);
expect(chunk.model).toEqual(finalChatCompletion?.model);
}
expect(finalContent).toEqual(content);
expect(contents.at(-1)?.[1]).toEqual(content);
expect(finalMessage?.content).toEqual(content);
expect(finalChatCompletion?.choices?.[0]?.message.content).toEqual(content);
expect(messages).toEqual([finalMessage]);
expect(chatCompletions).toEqual([finalChatCompletion]);
expect(await stream.finalContent()).toEqual(content);
expect(await stream.finalMessage()).toEqual(finalMessage);
expect(await stream.finalChatCompletion()).toEqual(finalChatCompletion);
});

it(`aborting ChatCompletionStream works`, async function () {
const chunks: OpenAI.Chat.ChatCompletionChunk[] = [];
const contents: [string, string][] = [];
const messages: OpenAI.Chat.ChatCompletionMessage[] = [];
const chatCompletions: OpenAI.Chat.ChatCompletion[] = [];
let finalContent: string | undefined;
let finalMessage: OpenAI.Chat.ChatCompletionMessage | undefined;
let finalChatCompletion: OpenAI.Chat.ChatCompletion | undefined;
let emittedError: any;
let caughtError: any;
const controller = new AbortController();
const stream = client.beta.chat.completions
.stream(
{
model: 'gpt-4',
messages: [{ role: 'user', content: 'Say this is a test' }],
},
{ signal: controller.signal },
)
.on('error', (e) => (emittedError = e))
.on('chunk', (chunk) => chunks.push(chunk))
.on('content', (delta, snapshot) => {
contents.push([delta, snapshot]);
controller.abort();
})
.on('message', (message) => messages.push(message))
.on('chatCompletion', (completion) => chatCompletions.push(completion))
.on('finalContent', (content) => (finalContent = content))
.on('finalMessage', (message) => (finalMessage = message))
.on('finalChatCompletion', (completion) => (finalChatCompletion = completion));
try {
await stream.finalContent();
} catch (error) {
caughtError = error;
}
expect(caughtError).toBeInstanceOf(APIUserAbortError);
expect(finalContent).toBeUndefined();
expect(finalMessage).toBeUndefined();
expect(finalChatCompletion).toBeUndefined();
expect(chatCompletions).toEqual([]);
expect(chunks.length).toBeGreaterThan(0);
expect(contents.length).toBeGreaterThan(0);
});

it('handles formdata-node File', async function () {
const file = await fetch(url)
.then((x) => x.arrayBuffer())
2 changes: 2 additions & 0 deletions examples/.gitignore
@@ -0,0 +1,2 @@
yarn.lock
node_modules