Memory is an optional module; you don't need to add one to your Agent unless it's necessary. A StateGraph already maintains a list of historical messages (messages), which satisfies the most basic "memory" requirement.
Situations where a Memory module needs to be added include:
Too many historical messages, requiring external tools to store memory
Triggering human intervention (interrupt), requiring temporary saving of Agent state
Extracting user preferences across conversations, etc.
LangGraph divides memory into:
Short-term memory (checkpointers such as InMemorySaver)
Long-term memory (stores such as InMemoryStore)
In addition, LangMem also provides memory storage and retrieval functionality.
import os
import sqlite3
from dotenv import load_dotenv
from dataclasses import dataclass
from typing_extensions import TypedDict
from openai import OpenAI
from langchain_openai import ChatOpenAI
from langchain.agents import create_agent
from langchain.tools import tool, ToolRuntime
from langgraph.graph import StateGraph, MessagesState, START, END
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.store.memory import InMemoryStore
# Load model configuration
_ = load_dotenv()
# Load model
model = ChatOpenAI(
api_key=os.getenv("DASHSCOPE_API_KEY"),
base_url=os.getenv("DASHSCOPE_BASE_URL"),
model="qwen3-coder-plus",
temperature=0.7,
)
# Create assistant node
def assistant(state: MessagesState):
return {'messages': [model.invoke(state['messages'])]}
I. Short-term Memory¶
Short-term memory (working memory) is generally used to temporarily store the state of an Agent or Workflow so it can be recovered after failures or retries.
1.1 Using Short-term Memory in Workflows¶
If a checkpointer is configured for a workflow, the next invocation with the same thread_id continues from the previous conversation. Without one, historical conversations are not retained.
# Create short-term memory
checkpointer = InMemorySaver()
# Create graph
builder = StateGraph(MessagesState)
# Add nodes
builder.add_node('assistant', assistant)
# Add edges
builder.add_edge(START, 'assistant')
builder.add_edge('assistant', END)
# Use checkpointer
graph = builder.compile(checkpointer=checkpointer)
# What happens if we don't use a checkpointer? Try compiling without one:
# graph = builder.compile()
# Tell the agent who I am
result = graph.invoke(
{'messages': ['Hello! I am Patrick Star']},
{"configurable": {"thread_id": "1"}},
)
for message in result['messages']:
message.pretty_print()
================================ Human Message =================================
Hello! I am Patrick Star
================================== Ai Message ==================================
Oh hi, Patrick! *gives a friendly wave*
I heard you were doing some underwater stargazing again last night. Did you see any new constellations? I know how much you love looking up at those twinkly lights through the ocean water.
Say, want to go jellyfishing later? I bet there's a whole school of 'em around Goo Lagoon this time of day. We could use your big pink net - just watch out for the electric eels! *nervous laugh*
You're always good company, Patrick. Even when things get a little... interesting under the sea!
# Ask the agent who I am
result = graph.invoke(
{"messages": [{"role": "user", "content": "May I ask who I am?"}]},
{"configurable": {"thread_id": "1"}},
)
for message in result['messages']:
message.pretty_print()
================================ Human Message =================================
Hello! I am Patrick Star
================================== Ai Message ==================================
Oh hi, Patrick! *gives a friendly wave*
I heard you were doing some underwater stargazing again last night. Did you see any new constellations? I know how much you love looking up at those twinkly lights through the ocean water.
Say, want to go jellyfishing later? I bet there's a whole school of 'em around Goo Lagoon this time of day. We could use your big pink net - just watch out for the electric eels! *nervous laugh*
You're always good company, Patrick. Even when things get a little... interesting under the sea!
================================ Human Message =================================
May I ask who I am?
================================== Ai Message ==================================
Oh, Patrick! You're my best friend, silly! Don't you remember? We live in the same neighborhood in Bikini Bottom. You're that big-hearted, fun-loving pink starfish who lives under a rock with your pet snail, and you work at the Krusty Krab sometimes... well, when you're not too busy napping or going jellyfishing!
We've had so many adventures together - like that time we got trapped in a giant soap bubble, or when we went hunting for the mysterious "Lost Boot" in the deep ocean. You're always up for fun and games, even if you do get a little mixed up sometimes.
And Patrick, you're probably the only person in all of Bikini Bottom who truly appreciates my jokes! Well, most of the time anyway. *chuckles*
Wait... you really don't remember, do you? That's not like you at all, buddy. Are you feeling okay?
1.2 Using Short-term Memory in Agents¶
The effect of using short-term memory in Agents is similar to that in workflows.
from langchain.agents import create_agent
# Create short-term memory
checkpointer = InMemorySaver()
agent = create_agent(
model=model,
checkpointer=checkpointer
)
# Tell the agent I am Squidward
result = agent.invoke(
{'messages': ['Hello! I am Squidward']},
{"configurable": {"thread_id": "2"}},
)
for message in result['messages']:
message.pretty_print()
================================ Human Message =================================
Hello! I am Squidward
================================== Ai Message ==================================
Oh... hello there, Squidward. *nervous chuckle*
Is everything... uhh... tentacle-y on your end? I must say, your name certainly has a certain... aquatic quality to it. Very... underwater.
*shifts uncomfortably*
Say, are you by chance familiar with the fine arts? Perhaps you play a wind instrument? Or maybe you're more of a visual artist? I've always appreciated someone with... sophisticated tastes.
*awkward pause*
Right then! How can I assist you today?
# Ask the agent who I am
result = agent.invoke(
{"messages": [{"role": "user", "content": "Who am I?"}]},
{"configurable": {"thread_id": "2"}},
)
for message in result['messages']:
message.pretty_print()
================================ Human Message =================================
Hello! I am Squidward
================================== Ai Message ==================================
Oh... hello there, Squidward. *nervous chuckle*
Is everything... uhh... tentacle-y on your end? I must say, your name certainly has a certain... aquatic quality to it. Very... underwater.
*shifts uncomfortably*
Say, are you by chance familiar with the fine arts? Perhaps you play a wind instrument? Or maybe you're more of a visual artist? I've always appreciated someone with... sophisticated tastes.
*awkward pause*
Right then! How can I assist you today?
================================ Human Message =================================
Who am I?
================================== Ai Message ==================================
*adjusts posture and peers through one eye*
Well, that's rather obvious, isn't it? You're Squidward Tentacles, resident of 124 Conch Street, right next door to that insufferable yellow sponge who insists on trying to be my friend. You work the register at the Krusty Krab, though clearly you're far too talented for such menial employment. You're an accomplished clarinet player - though I understand your neighbors find your practicing... challenging. You also paint, sculpt, and generally pursue high art in your spare time.
*nervous laugh*
Of course, you're also known for your... *ahem* ...distinctive laugh. The whole neighborhood knows that laugh.
*clears throat awkwardly*
Does that about sum it up? Or did you perhaps bonk your head on something? That does happen occasionally around here.
To verify whether InMemorySaver is truly effective, you can comment out the checkpointer and observe the Agent’s behavior.
1.3 Using Databases to Save Short-term Memory¶
If working state is saved in SQLite, it can be restored even after the program exits. Let's verify this. First, install the Python package that provides the SqliteSaver checkpointer:
pip install langgraph-checkpoint-sqlite
# Delete SQLite database
if os.path.exists("short-memory.db"):
os.remove("short-memory.db")
from langgraph.checkpoint.sqlite import SqliteSaver
# Create short-term memory with SQLite support
checkpointer = SqliteSaver(
sqlite3.connect("short-memory.db", check_same_thread=False)
)
# Create Agent
agent = create_agent(
model=model,
checkpointer=checkpointer,
)
# Tell the agent I am Sha Wujing (a character from Journey to the West)
result = agent.invoke(
{'messages': ['Hi! I am Sha Wujing']},
{"configurable": {"thread_id": "3"}},
)
for message in result['messages']:
message.pretty_print()
================================ Human Message =================================
Hi! I am Sha Wujing
================================== Ai Message ==================================
Hello, Brother Sha Wujing. As the second demon general of Flowing Sands River and a member of Tang Sanzang's pilgrimage team to the West for Buddhist scriptures, you are an extremely powerful warrior. Do you have any interesting experiences in the journey of seeking the scriptures to share with me? Or do you have any thoughts about your previous life as a celestial marshal?
Create a new Agent configured with the same SQLite checkpointer, and see whether it can read the memory of my name back from SQLite.
# Create a new Agent
new_agent = create_agent(
model=model,
checkpointer=checkpointer,
)
# Let the agent recall my name
result = new_agent.invoke(
{'messages': ['Who am I?']},
{"configurable": {"thread_id": "3"}},
)
for message in result['messages']:
message.pretty_print()
================================ Human Message =================================
Hi! I am Sha Wujing
================================== Ai Message ==================================
Hello, Brother Sha Wujing. As the second demon general of Flowing Sands River and a member of Tang Sanzang's pilgrimage team to the West for Buddhist scriptures, you are an extremely powerful warrior. Do you have any interesting experiences in the journey of seeking the scriptures to share with me? Or do you have any thoughts about your previous life as a celestial marshal?
================================ Human Message =================================
Who am I?
================================== Ai Message ==================================
*Scratches my head with my rake, looking slightly confused*
Oh, brother, you sure got me puzzled there for a moment. You just introduced yourself as Sha Wujing, didn't you? Though... now that I think about it, something feels off.
*Strokes beard thoughtfully*
Say, you don't happen to be carrying a certain mala around your neck, do you? The one with 108 precious beads? I seem to recall those beads were made from the skulls of people I once... well, let's just say I wasn't always on the path of virtue.
*Shifts uncomfortably*
You know, before I met Master and took up the Buddhist path, I was quite the troublemaker in the Heavenly Palace. Got myself banished for... oh, some rather unseemly behavior. Now I'm trying to atone for my past misdeeds by protecting the monk on our journey to the West.
But I must ask - are you truly who you claim to be, or perhaps someone testing this poor cultivator?
II. Long-term Memory¶
Long-term memory is generally used to save important business information, such as user attributes and other business parameters.
2.1 Creating an Embedding Generation Function¶
Long-term memory supports using embeddings to retrieve semantically similar content. Below we create an embedding function that generates the vectors needed for retrieval.
# Embedding dimension
EMBED_DIM = 1024
# Interface for getting text embedding
client = OpenAI(
api_key=os.getenv("DASHSCOPE_API_KEY"),
base_url=os.getenv("DASHSCOPE_BASE_URL"),
)
# Embedding generation function
def embed(texts: list[str]) -> list[list[float]]:
response = client.embeddings.create(
model="text-embedding-v4",
input=texts,
dimensions=EMBED_DIM,
)
return [item.embedding for item in response.data]
# Test if text embedding can be generated normally
texts = [
"LangGraph's middleware is very powerful",
"LangGraph's MCP is also very useful",
]
vectors = embed(texts)
len(vectors), len(vectors[0])
(2, 1024)
2.2 Reading and Writing Long-term Memory¶
First, write two pieces of data into InMemoryStore.
# Create InMemoryStore memory storage
store = InMemoryStore(index={"embed": embed, "dims": EMBED_DIM})
# Add two user data records user_1 user_2
namespace = ("users", )
store.put(
namespace, # Namespace to group related data together
"user_1", # Key within the namespace
{
"rules": [
"User likes short, direct language",
"User only speaks English & python",
],
"rule_id": "3",
},
)
store.put(
("users",),
"user_2",
{
"name": "John Smith",
"language": "English",
}
)
Through namespace and key, you can directly read long-term memory.
item = store.get(namespace, "user_2")
item
Item(namespace=['users'], key='user_2', value={'name': 'John Smith', 'language': 'English'}, created_at='2026-02-14T10:51:23.965935+00:00', updated_at='2026-02-14T10:51:23.965948+00:00')
You can also retrieve through vector search.
items = store.search(
namespace,
query="language preferences",
filter={"rule_id": "3"},
)
items
[Item(namespace=['users'], key='user_1', value={'rules': ['User likes short, direct language', 'User only speaks English & python'], 'rule_id': '3'}, created_at='2026-02-14T10:51:23.135391+00:00', updated_at='2026-02-14T10:51:23.135400+00:00', score=0.4085710154661828)]
2.3 Using Tools to Read Long-term Memory¶
from dataclasses import dataclass
@dataclass
class Context:
user_id: str
@tool
def get_user_info(runtime: ToolRuntime[Context]) -> str:
"""Used to query user information"""
user_id = runtime.context.user_id
user_info = runtime.store.get(("users",), user_id)
return str(user_info.value) if user_info else "Unknown user"
# Create Agent
agent = create_agent(
model=model,
tools=[get_user_info],
store=store,
context_schema=Context
)
# Run Agent
result = agent.invoke(
{"messages": [{"role": "user", "content": "Check user info"}]},
context=Context(user_id="user_2")
)
for message in result['messages']:
message.pretty_print()
================================ Human Message =================================
Check user info
================================== Ai Message ==================================
Tool Calls:
get_user_info (call_6d114bb7a56d476fb425db59)
Call ID: call_6d114bb7a56d476fb425db59
Args:
================================= Tool Message =================================
Name: get_user_info
{'name': 'John Smith', 'language': 'English'}
================================== Ai Message ==================================
The user information is as follows:
- **Name**: John Smith
- **Language**: English
2.4 Using Tools to Write Long-term Memory¶
class UserInfo(TypedDict):
name: str
@tool
def save_user_info(user_info: UserInfo, runtime: ToolRuntime[Context]) -> str:
"""Used to save/update user information"""
user_id = runtime.context.user_id
runtime.store.put(("users",), user_id, user_info)
return "Successfully saved user information"
# Create Agent
agent = create_agent(
model=model,
tools=[save_user_info],
store=store,
context_schema=Context
)
# Run Agent
agent.invoke(
{"messages": [{"role": "user", "content": "My name is John Smith"}]},
context=Context(user_id="user_123")
)
store.get(("users",), "user_123").value
{'name': 'John Smith'}