on_chain_start callbacks crash with serialized=None in v0.3 #26773
Comments
Hi! I am having the same problem using StructuredTool:
However, using this, I no longer hit the problem, but the agent is not working as accurately as before:
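For reference, the same defensive pattern applies to tool callbacks. Below is a sketch of a handler whose `on_tool_start` tolerates `serialized=None`; the class name is illustrative, and it is written as a plain class (rather than subclassing `langchain_core.callbacks.BaseCallbackHandler`, as real code would) so the guard logic stands on its own:

```python
from typing import Any, Dict, Optional


class SafeToolLoggingHandler:
    """Sketch: a tool callback that tolerates serialized=None.

    In real code this would subclass BaseCallbackHandler; it is kept
    dependency-free here to isolate the None guard.
    """

    def on_tool_start(
        self, serialized: Optional[Dict[str, Any]], input_str: str, **kwargs: Any
    ) -> None:
        # (serialized or {}) turns None into an empty dict, so .get() is safe
        name = (serialized or {}).get("name", "<unknown tool>")
        self.last_tool = name
        print(f"Tool {name} started with input: {input_str}")
```

Calling `on_tool_start(None, "query")` logs `<unknown tool>` instead of raising `AttributeError`.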
Fix merged; it will be resolved in the next langchain-core release.
The issue is fixed in the new release, thank you! I'll add, to clarify, that the removal of
Edit: I have solved this problem by modifying "on_chain_start" in my code.

```python
from typing import Any, Dict, List

from langchain_core.callbacks import BaseCallbackHandler
from langchain_core.messages import BaseMessage
from langchain_core.outputs import LLMResult


class LoggingHandler(BaseCallbackHandler):
    def on_chat_model_start(
        self, serialized: Dict[str, Any], messages: List[List[BaseMessage]], **kwargs: Any
    ) -> None:
        print("Chat model started")

    def on_llm_end(self, response: LLMResult, **kwargs: Any) -> None:
        print(f"Chat model ended, response: {response.generations}")

    def on_chain_start(
        self, serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any
    ) -> None:
        if "name" in kwargs:
            name = kwargs["name"]
        elif serialized:
            name = serialized.get("name", serialized.get("id", ["<unknown>"])[-1])
        else:
            name = "<unknown>"
        print(f"Chain {name} started")

    def on_chain_end(self, outputs: Dict[str, Any], **kwargs: Any) -> None:
        print(f"Chain ended, outputs: {outputs}")
```

I have updated langchain_core to 0.3.8 but still get "serialized=None" for "on_chain_start" callbacks. I have tried the exact code from How to pass callbacks in at runtime, but I did not get the expected result.
This is my code:

```python
import os
from typing import Any, Dict, List

from langchain_core.callbacks import BaseCallbackHandler
from langchain_core.messages import BaseMessage
from langchain_core.outputs import LLMResult
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI


class LoggingHandler(BaseCallbackHandler):
    def on_chat_model_start(
        self, serialized: Dict[str, Any], messages: List[List[BaseMessage]], **kwargs: Any
    ) -> None:
        print("Chat model started")

    def on_llm_end(self, response: LLMResult, **kwargs: Any) -> None:
        print(f"Chat model ended, response: {response}")

    def on_chain_start(
        self, serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any
    ) -> None:
        # Crashes when serialized is None
        print(f"Chain {serialized.get('name')} started")

    def on_chain_end(self, outputs: Dict[str, Any], **kwargs: Any) -> None:
        print(f"Chain ended, outputs: {outputs}")


callbacks = [LoggingHandler()]

llm = ChatOpenAI(
    model="Meta-Llama-3.1-405B-Instruct",
    base_url="https://api.sambanova.ai/v1",
    api_key=os.environ.get("SAMBANOVA_API_KEY"),
)
prompt = ChatPromptTemplate.from_template("What is 1 + {number}?")
chain = prompt | llm
chain_with_callbacks = chain.with_config(callbacks=callbacks)
result = chain_with_callbacks.invoke({"number": "2"})
```
@danielkerwin Make the following changes in your code:

```python
def on_chain_start(
    self, serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any
) -> None:
    if "name" in kwargs:
        name = kwargs["name"]
    elif serialized:
        name = serialized.get("name", serialized.get("id", ["<unknown>"])[-1])
    else:
        name = "<unknown>"
    print(f"Chain {name} started")
```
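The guard above can be factored into a standalone helper, which makes it easy to test without any LangChain machinery. This is a sketch; `resolve_chain_name` is a hypothetical name, not part of langchain-core:

```python
from typing import Any, Dict, Optional


def resolve_chain_name(serialized: Optional[Dict[str, Any]], **kwargs: Any) -> str:
    """Best-effort chain name resolution that tolerates serialized=None."""
    if "name" in kwargs:
        return kwargs["name"]
    if serialized:
        # Prefer an explicit "name"; fall back to the last segment of "id"
        return serialized.get("name", serialized.get("id", ["<unknown>"])[-1])
    return "<unknown>"
```

For example, `resolve_chain_name(None)` returns `"<unknown>"` instead of raising, and `resolve_chain_name({"id": ["langchain", "RunnableSequence"]})` returns `"RunnableSequence"`.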
Checked other resources
Example Code
Any callback manager includes default callbacks such as StdOutCallbackHandler, so we can simply instantiate an agent:

Error Message and Stack Trace (if applicable)
Error in StdOutCallbackHandler.on_chain_start callback: AttributeError("'NoneType' object has no attribute 'get'")
Description
Starting with LangChain v0.3, on_chain_start callbacks started receiving the serialized argument as None. This causes callback handlers, including the default ones, to crash. I doubt that this is intended, but if so, the built-in callbacks should be updated, as well as the docs.
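The failure mode can be reproduced without any LangChain machinery; this sketch just mimics what a handler does when it receives `serialized=None` and calls `.get()` on it:

```python
# Minimal illustration of the reported crash: attribute access on None.
serialized = None
try:
    serialized.get("name")  # raises AttributeError, as in the error above
except AttributeError as exc:
    print(f"Error in on_chain_start callback: {exc!r}")
```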
System Info
System Information
Package Information
Optional packages not installed
Other Dependencies