Does Langgraph sustain state variables across each graph run? #479
Replies: 4 comments 3 replies
-
To sustain state variables across invocations in your LangGraph deployment with LangServe, you can implement a custom memory class that retains and updates state variables. Here's an example of how you could modify the memory handling to achieve persistence:

```python
from typing import Any, Dict, List

from langchain_core.memory import BaseMemory


class PersistentMemory(BaseMemory):
    """Memory that retains state across invocations."""

    memories: Dict[str, Any] = dict()

    @property
    def memory_variables(self) -> List[str]:
        return list(self.memories.keys())

    def load_memory_variables(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        return self.memories

    def save_context(self, inputs: Dict[str, Any], outputs: Dict[str, str]) -> None:
        """Updates the memory with new context."""
        for key, value in outputs.items():
            self.memories[key] = value

    def clear(self) -> None:
        """Clears the memory."""
        self.memories.clear()
```

This approach directly addresses the issue of state variables being reset between invocations by ensuring that updates to the state are persisted. Implement this class and use it as the memory component in your LangGraph application to maintain state across different runs.
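A minimal usage sketch of the class above (the module-level instance and the `user_name` key are illustrative assumptions, not part of the original reply): because the same `PersistentMemory` object lives for the lifetime of the server process, values saved during one invocation remain readable in the next.

```python
# Hypothetical usage sketch: a single module-level instance lives for the
# lifetime of the LangServe process, so values written during one invocation
# remain readable in the next. Key names below are illustrative only.
memory = PersistentMemory()

# First invocation: persist something produced by the graph.
memory.save_context(inputs={}, outputs={"user_name": "Alice"})

# Second invocation: the value saved earlier is still available.
print(memory.load_memory_variables({}))  # {'user_name': 'Alice'}
```

Keep in mind that this only persists within a single server process; if LangServe is run with multiple workers, each worker would hold its own copy of the memory.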
-
@hwchase17 kindly give me direction here!
-
@hinthornw Hi, based on my understanding, the memory is persisted after the invoke. However, if I want to persist some file data in the state beforehand and then directly use this cached file data to answer questions during the first invoke, how can I achieve this? For instance, suppose I want to save my name in the memory beforehand.
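A minimal sketch of one way to pre-seed the memory before the first invocation, reusing the `PersistentMemory` class from the reply above (the `user_name` key and the merge-into-state step are illustrative assumptions):

```python
# Sketch: seed the shared memory before the graph is ever invoked, so the
# first run can already read the cached data. Names here are illustrative.
memory = PersistentMemory()
memory.save_context(inputs={}, outputs={"user_name": "Alice"})

# At the start of the first invocation (e.g. when building the input state,
# or inside the first graph node), merge the cached values in.
cached = memory.load_memory_variables({})
initial_state = {"question": "What is my name?", **cached}
print(initial_state["user_name"])  # "Alice"
```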
-
Checked other resources
Commit to Help
Example Code
Description
I am deploying a LangGraph graph through LangServe. When I make an invocation, the state variables get updated. When I make the second invocation, the state variables updated during the first invocation are destroyed. I am hitting the deployed graph through Postman. I want to sustain the state variables across invocations: whatever the state was in the previous run, I need to be able to read it in the next graph run.
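For illustration, a minimal sketch of the behavior described above (the state fields and node are made up, not the actual deployed graph): each `invoke` of a compiled `StateGraph` starts from the input state it is given, so updates made during the previous run do not carry over unless they are persisted somewhere.

```python
from typing import TypedDict

from langgraph.graph import END, StateGraph


class GraphState(TypedDict):
    count: int


def increment(state: GraphState) -> dict:
    # Node that updates the state during a run.
    return {"count": state["count"] + 1}


builder = StateGraph(GraphState)
builder.add_node("increment", increment)
builder.set_entry_point("increment")
builder.add_edge("increment", END)
graph = builder.compile()

print(graph.invoke({"count": 0}))  # {'count': 1}
print(graph.invoke({"count": 0}))  # {'count': 1} again: the first run's update is gone
```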
System Info
langchain-nomic
langchain_community
tiktoken
langchainhub
chromadb
langchain
langgraph==0.0.44
tavily-python
gpt4all
python-dotenv
openai
bs4
langchain-openai
langserve
fastapi
uvicorn