How to disable streaming for models that don't support it¶
Some chat models, including OpenAI's new O1 models (depending on when you're reading this), do not support streaming. This can cause issues when using the astream_events API, which calls models in streaming mode and expects streaming to work properly.
In this guide we'll show you how to disable streaming for models that don't support it, ensuring that they are never called in streaming mode, even when invoked via the astream_events API.
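Under the hood, setting a chat model's disable_streaming parameter makes its stream/astream methods fall back to a single invoke/ainvoke call. Here is a minimal sketch of that fallback, assuming langchain-core's documented behavior (the prompt is just a placeholder):

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="o1-preview", temperature=1, disable_streaming=True)

# With disable_streaming=True, .astream() defers to .ainvoke() internally,
# so the full response arrives as a single chunk instead of a token stream.
async for chunk in llm.astream("hello"):
    print(chunk.content)
```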
API Reference: StateGraph | START | END
```python
from langchain_openai import ChatOpenAI
from langgraph.graph import MessagesState
from langgraph.graph import StateGraph, START, END

# o1-preview does not support streaming
llm = ChatOpenAI(model="o1-preview", temperature=1)

graph_builder = StateGraph(MessagesState)


def chatbot(state: MessagesState):
    # Call the model on the accumulated message history
    return {"messages": [llm.invoke(state["messages"])]}


graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)
graph = graph_builder.compile()
```
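As a quick sanity check (a sketch, not part of the original walkthrough): a plain invoke never puts the model into streaming mode, so it succeeds even before streaming is disabled:

```python
# .invoke() makes a single non-streaming request, so it works even though
# o1-preview rejects streaming.
result = graph.invoke({"messages": [{"role": "user", "content": "hi"}]})
print(result["messages"][-1].content)
```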
Without disabling streaming¶
Now that we've defined our graph, let's try calling astream_events without disabling streaming. Since the o1 model does not natively support streaming, this should throw an error:
```python
input = {"messages": {"role": "user", "content": "how many r's are in strawberry?"}}

try:
    async for event in graph.astream_events(input, version="v2"):
        if event["event"] == "on_chat_model_end":
            print(event["data"]["output"].content, end="", flush=True)
except Exception:
    print("Streaming not supported!")
```
Disabling streaming¶
Now, without making any changes to our graph, let's set the model's disable_streaming parameter to True, which will solve the problem:
```python
# Same graph as before, but with streaming disabled on the model
llm = ChatOpenAI(model="o1-preview", temperature=1, disable_streaming=True)

graph_builder = StateGraph(MessagesState)


def chatbot(state: MessagesState):
    return {"messages": [llm.invoke(state["messages"])]}


graph_builder.add_node("chatbot", chatbot)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)
graph = graph_builder.compile()
```
Now, rerunning with the same input, we shouldn't see any errors:
```python
input = {"messages": {"role": "user", "content": "how many r's are in strawberry?"}}

async for event in graph.astream_events(input, version="v2"):
    if event["event"] == "on_chat_model_end":
        print(event["data"]["output"].content, end="", flush=True)
```
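As a side note, disable_streaming also accepts the string "tool_calling" (per langchain-core's BaseChatModel docstring), which bypasses streaming only on calls where tools are bound, leaving ordinary calls free to stream:

```python
# Sketch: disable streaming only for tool-calling invocations.
llm = ChatOpenAI(model="o1-preview", temperature=1, disable_streaming="tool_calling")
```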