How to stream with `type: Literal["Person"] = "Person"`? #2046
Replies: 3 comments
The issue is that instructor's partial streaming only fills fields as the model generates them, so a `Literal` default like `type` can be missing from early chunks even though it is known in advance.

**Solution: Post-process partials to inject known defaults**

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel
from typing import Literal

client = instructor.from_openai(OpenAI())

class Person(BaseModel):
    type: Literal["Person"] = "Person"
    name: str
    age: int

# Stream with partials
for partial in client.chat.completions.create_partial(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Extract: John is 30 years old"}],
    response_model=Person,
):
    # The Literal field may not be populated yet in early chunks;
    # force it, since we know the type.
    if partial.type is None:
        partial.type = "Person"
    print(partial)  # always has type="Person"
```

**For discriminated unions**

If you have multiple types and need the discriminator early:

```python
from typing import Union

class Person(BaseModel):
    type: Literal["Person"] = "Person"
    name: str
    age: int

class Company(BaseModel):
    type: Literal["Company"] = "Company"
    name: str
    employees: int

Entity = Union[Person, Company]

# When streaming, you may not know the type until later.
# Use create_partial with the specific type if you know it:
for partial in client.chat.completions.create_partial(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Extract person: John, 30"}],
    response_model=Person,  # specify the exact type, not the Union
):
    # type="Person" is guaranteed by the model
    print(partial.type, partial.name, partial.age)
```
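The post-processing step above can be generalized: instead of hard-coding `partial.type = "Person"`, fill any still-missing field from a mapping of statically known defaults. A minimal sketch, assuming you maintain that mapping yourself (`KNOWN_DEFAULTS` and `fill_defaults` are illustrative names, not instructor API):

```python
# Sketch (not instructor API): inject statically known defaults, such as
# Literal discriminators, into any field the stream hasn't produced yet.
KNOWN_DEFAULTS = {"type": "Person"}  # hypothetical mapping for this schema

def fill_defaults(partial_dict: dict, defaults: dict = KNOWN_DEFAULTS) -> dict:
    """Return a copy of the partial dict with known defaults injected."""
    filled = dict(partial_dict)
    for key, value in defaults.items():
        if filled.get(key) is None:
            filled[key] = value
    return filled

# Early chunk: the model hasn't emitted `type` yet.
print(fill_defaults({"type": None, "name": "John", "age": None}))
# {'type': 'Person', 'name': 'John', 'age': None}
```

Fields the model has already produced are left untouched, so this is safe to run on every chunk.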
This is a common pain point with discriminated unions in streaming. The LLM generates fields in the order they appear in its output, so the discriminator only shows up once the model actually emits it.

**Solution 1: Inject the type client-side**

```python
from instructor import Partial

# Inside an async generator that re-streams to the client:
async for partial in client.chat.completions.create_partial(
    model="gpt-4o",
    response_model=Partial[Person],
    messages=[...],
):
    # Inject the known type before yielding to the frontend
    partial.type = "Person"
    yield partial
```

**Solution 2: Reorder fields in the schema**

```python
from pydantic import BaseModel, ConfigDict
from typing import Literal

class Person(BaseModel):
    model_config = ConfigDict(
        json_schema_extra={
            "properties_order": ["type", "name", "age"]
        }
    )
    type: Literal["Person"] = "Person"
    name: str
    age: int
```

This hints to the LLM to generate the `type` field first.

**Solution 3: Use a wrapper for streaming**

```python
def stream_with_type(partial_stream, type_value: str):
    for partial in partial_stream:
        obj = partial.model_dump()
        obj["type"] = type_value
        yield obj
```

We hit this exact issue building streaming UIs at Revolution AI — the client-side injection (Solution 1) is the most reliable. What frontend framework are you using? I might have more specific advice.
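A quick check of the Solution 3 wrapper with a stand-in for instructor partials (`FakePartial` is a test double exposing `model_dump()`, not part of instructor; real partials come from `create_partial`):

```python
# Test double for an instructor partial: anything exposing model_dump().
class FakePartial:
    def __init__(self, data: dict):
        self._data = data

    def model_dump(self) -> dict:
        return dict(self._data)

def stream_with_type(partial_stream, type_value: str):
    for partial in partial_stream:
        obj = partial.model_dump()
        obj["type"] = type_value  # discriminator is now present in every chunk
        yield obj

chunks = [FakePartial({"name": "John"}), FakePartial({"name": "John", "age": 30})]
for obj in stream_with_type(chunks, "Person"):
    print(obj)
# {'name': 'John', 'type': 'Person'}
# {'name': 'John', 'age': 30, 'type': 'Person'}
```

Because the wrapper stamps the discriminator onto every chunk, the frontend never sees a partial without `type`.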
-
|
Streaming with Literal types is tricky! At RevolutionAI (https://revolutionai.io) we hit this exact issue. The problem: Solution that works for us: from instructor import Partial
from pydantic import BaseModel
from typing import Literal
class Person(BaseModel):
type: Literal["Person"] = "Person"
name: str
age: int
# Use Partial wrapper for streaming
async for partial in client.chat.completions.create_partial(
model="gpt-4",
response_model=Partial[Person],
messages=[...],
):
print(partial.model_dump()) # type field appears earlyKey insight: Alternative: Use What is your use case? Discriminated unions with multiple types? |
I have something similar to the following:

When streaming I get:

This `"type": "Person"` pops up at the end of the stream, but it is known beforehand. In my case I need `type` for rendering the content on the frontend, so it should be there in the partial response regardless, because it's known beforehand.
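The frontend need described above — picking a renderer based on `type` — can be sketched like this; `RENDERERS` and `render_partial` are hypothetical names for illustration, not instructor API:

```python
# Hypothetical frontend-side dispatch on the discriminator field.
RENDERERS = {
    "Person": lambda obj: f"PersonCard({obj.get('name', '...')})",
    "Company": lambda obj: f"CompanyCard({obj.get('name', '...')})",
}

def render_partial(obj: dict) -> str:
    renderer = RENDERERS.get(obj.get("type"))
    if renderer is None:
        # Without the discriminator, the UI can't pick a component yet.
        return "Spinner()"
    return renderer(obj)

print(render_partial({"name": "John"}))                    # Spinner()
print(render_partial({"type": "Person", "name": "John"}))  # PersonCard(John)
```

This is why the discriminator arriving only in the last chunk hurts: until then, every partial falls through to the spinner branch.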