By Admin on Tuesday, 10 December 2024
Category: All

OpenAI Assistants Streaming

The OpenAI Assistant Client has been improved to support streaming responses when running a thread. Before this new feature, you had to poll the run object status until it was completed. Now you can use the new stream events to handle the streamed messages.

Streaming Events

Instead of waiting for the full response from the assistant, you can stream it using Server-Sent Events. Just pass the parameter Stream = True when calling the CreateRun function and the response will be streamed.

The streamed run is delivered as a sequence of named server-sent events, and the client exposes them through events such as OnStreamMessageDelta (see Step 4).
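For reference, the raw stream returned by the OpenAI Assistants API is a sequence of named Server-Sent Events; a heavily truncated run looks roughly like this (ids and payloads abbreviated):

  event: thread.run.created
  data: {"id":"run_...","object":"thread.run","status":"queued"}

  event: thread.message.delta
  data: {"id":"msg_...","object":"thread.message.delta","delta":{"content":[{"index":0,"type":"text","text":{"value":"Hello"}}]}}

  event: thread.run.completed
  data: {"id":"run_...","object":"thread.run","status":"completed"}

  event: done
  data: [DONE]

The client consumes this stream for you and surfaces it through its own events, for example OnStreamMessageDelta (used in Step 4) for the thread.message.delta entries.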

Step 1: Create an Assistant

An Assistant represents an entity that can be configured to respond to a user's messages using several parameters like model, instructions, and tools. 
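A minimal sketch of this step is shown below. The class, property and method names (TsgcAIOpenAIAssistant, APIKey, CreateAssistant) are assumptions for illustration only; check the demo source at the end of this post for the exact sgcWebSockets API.

  // All names below are assumed for illustration; adjust to the real API.
  OpenAI := TsgcAIOpenAIAssistant.Create(nil);   // assumed client class
  OpenAI.APIKey := 'sk-...';                     // your OpenAI API key
  // Configure the assistant with a model and instructions (tools are optional).
  AssistantId := OpenAI.CreateAssistant('gpt-4o', 'You are a personal math tutor.');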

Step 2: Create a Thread

A Thread represents a conversation between a user and one or many Assistants. You can create a Thread when a user (or your AI application) starts a conversation with your Assistant. 
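Continuing the sketch from Step 1 (CreateThread is an assumed method name), a Thread is created once and then reused for every message of that conversation:

  // Create the conversation thread (assumed method name).
  ThreadId := OpenAI.CreateThread;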

Step 3: Add a Message to the Thread and Run using Streaming

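Add the user's message to the Thread, then call CreateRun with Stream = True so the answer comes back as server-sent events instead of requiring polling. Continuing the sketch (AddMessage and the exact CreateRun signature are assumptions; only the Stream parameter is described in this post):

  // Add the user's message to the thread (assumed method name).
  OpenAI.AddMessage(ThreadId, 'I need to solve the equation 3x + 11 = 14. Can you help me?');
  // Run the thread; Stream = True switches the response to Server-Sent Events,
  // so there is no need to poll the run status until it is completed.
  OpenAI.CreateRun(ThreadId, AssistantId, {Stream=}True);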

Step 4: Handle the Response

Use the OnStreamMessageDelta event to read the message deltas streamed from the server.
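A minimal handler could look like the sketch below; the event name comes from this post, while the handler signature and control names are assumptions:

  // Assign the handler once, e.g. OpenAI.OnStreamMessageDelta := OpenAIStreamMessageDelta;
  procedure TForm1.OpenAIStreamMessageDelta(Sender: TObject; const Delta: string);
  begin
    // Append each streamed text fragment to the UI as it arrives.
    memoAnswer.Text := memoAnswer.Text + Delta;
  end;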

OpenAI Delphi Sample

The following compressed file contains the source code of the Assistants demo built for Windows using the sgcWebSockets library.
