In this section, we will integrate MCP Servers into LangGraph. Before we can integrate anything, we have to develop the MCP Servers first. This happens to be my specialty: in the article “New Wine in Old Bottles: Card Magic MCP”, I summarized an efficient development method. Below, I will use that method to create two MCP Servers and then integrate them into LangGraph.
1. Developing MCP Services

1.1 Weather MCP
Taking get_weather_mcp as an example, we will write this MCP as a Python package, although for local use only. If you want to upload it to PyPI, you certainly can, but that's a separate process; please refer to my blog post “PyPI Packaging Notes”.
To make the project recognized as a Python package, we need to create an __init__.py file inside it; the main logic then goes in server.py.
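For reference, here is the directory layout I am assuming for the rest of this section (the mcp_server parent directory is what the python -m and supervisord commands below rely on). The __init__.py can even be empty; re-exporting the server module is just a convenient habit:

mcp_server/
├── get_weather_mcp/
│   ├── __init__.py
│   ├── __main__.py
│   └── server.py
└── math_mcp/
    ├── __init__.py
    ├── __main__.py
    └── server.py

# __init__.py
# -*- coding: utf-8 -*-
from . import server

And the main logic in server.py: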
# server.py
# -*- coding: utf-8 -*-
from fastmcp import FastMCP

mcp = FastMCP("get_weather_mcp")

@mcp.tool
def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

if __name__ == "__main__":
    mcp.run()

Then in __main__.py, use from . import server to import it. Finally, deploy it using the streamable-http method:
# __main__.py
# -*- coding: utf-8 -*-
import asyncio
import os

from . import server

host = os.getenv('HOST', '127.0.0.1')
port = int(os.getenv('PORT', 8000))

def stdio():
    """Stdio entry point for the package."""
    # mcp.run() is synchronous, so asyncio.run() needs the async
    # variant, run_async()
    asyncio.run(server.mcp.run_async(transport="stdio"))

def http():
    """streamable-http entry point for the package."""
    asyncio.run(server.mcp.run_async(transport="http",
                                     host=host,
                                     port=port,
                                     path="/mcp"))

if __name__ == "__main__":
    http()

That's all there is to it. Using __main__.py here has a clever purpose: it lets us run the package directly as a module from the command line. What does this mean? It means that python -m [package_name] is equivalent to directly running the special file __main__.py. Since http() is invoked in this file's main guard, we can conveniently and quickly start the MCP Server! For our get_weather_mcp, the startup command is:
python -m get_weather_mcp

(Run from the project root, the equivalent is python -m mcp_server.get_weather_mcp, which is the form used later.)

1.2 Math MCP
Do I need to elaborate on this? Just follow the same steps as above; it really is that templated. __init__.py and __main__.py are almost identical, and only server.py carries new logic (a sketch follows below).
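For completeness, here is a minimal sketch of what math_mcp/server.py could look like; the exact tool set is up to you, and the add and multiply tools below are my assumption, chosen to cover the (3 + 5) * 12 example later:

# math_mcp/server.py (a minimal sketch; the tool set is assumed)
# -*- coding: utf-8 -*-
from fastmcp import FastMCP

mcp = FastMCP("math_mcp")

@mcp.tool
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

@mcp.tool
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    mcp.run()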
In __main__.py the only thing that needs to change is the port number; generally just add 1. Here we change 8000 to 8001, and everything else remains the same:
# -*- coding: utf-8 -*-
import asyncio
import os

from . import server

host = os.getenv('HOST', '127.0.0.1')
port = int(os.getenv('PORT', 8001))

def stdio():
    """Stdio entry point for the package."""
    asyncio.run(server.mcp.run_async(transport="stdio"))

def http():
    """streamable-http entry point for the package."""
    asyncio.run(server.mcp.run_async(transport="http",
                                     host=host,
                                     port=port,
                                     path="/mcp"))

if __name__ == "__main__":
    http()

2. Using supervisord to Manage MCP Services
supervisord is a process management tool: you tell it which MCPs to run, and it will watch over your MCP babies, automatically restarting any that crash. These topics are briefly introduced in my blog post “Background Management Tools Introduction” (though that post focuses more on systemd and pm2).
First, open the project's mcp_server directory and create a configuration file mcp_supervisor.conf there for supervisord to use. My configuration is as follows:
[unix_http_server]
file=/tmp/supervisor.sock
[supervisord]
logfile=/tmp/supervisord.log
logfile_maxbytes=50MB
logfile_backups=10
loglevel=info
pidfile=/tmp/supervisord.pid
nodaemon=false
minfds=1024
minprocs=200
[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface
[supervisorctl]
serverurl=unix:///tmp/supervisor.sock
[program:math_mcp]
command=python -m mcp_server.math_mcp
directory=..
autostart=true
autorestart=true
startsecs=5
stopwaitsecs=10
stdout_logfile=/tmp/math_mcp.log
stderr_logfile=/tmp/math_mcp_err.log
[program:weather_mcp]
command=python -m mcp_server.get_weather_mcp
directory=..
autostart=true
autorestart=true
startsecs=5
stopwaitsecs=10
stdout_logfile=/tmp/weather_mcp.log
stderr_logfile=/tmp/weather_mcp_err.log
[group:mcp_servers]
programs=math_mcp,weather_mcp

That's it for the configuration of math_mcp and weather_mcp. There's no need to write this yourself; I had TRAE write it for me. Below is an explanation of the common commands!
2.1 Install supervisord

pip install supervisor

2.2 Start supervisord
supervisord -c ./mcp_supervisor.conf

2.3 Stop supervisord

pkill -f supervisord
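pkill works, but supervisor also ships a control client that talks to the running daemon over the unix socket configured above. For a status view or a graceful shutdown, something like this should work (note the group-qualified program name, since we defined [group:mcp_servers]):

# check the state of all managed MCPs
supervisorctl -c ./mcp_supervisor.conf status
# restart a single program
supervisorctl -c ./mcp_supervisor.conf restart mcp_servers:weather_mcp
# gracefully stop everything, including supervisord itself
supervisorctl -c ./mcp_supervisor.conf shutdown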
2.4 Check Port Status

lsof -i :8000
lsof -i :8001

3. Using MCP in LangGraph
Before using it, you need to install the Python package that supports this functionality:
pip install langchain-mcp-adapters

I'm really frustrated with the development team. In my opinion, LangChain and LangGraph should be combined into one package. Having to search for which package contains which feature is really tedious! And various features are split into tiny pieces. Look at how many packages I've installed so far:
langchain[openai]
langchain-mcp-adapters
langgraph
langgraph-cli[inmem]
langgraph-supervisor
langgraph-checkpoint-sqlite

If it weren't for the many good features added in LangGraph 1.0, I would truly look down on this open-source project. I sincerely hope that the rising star AgentScope absorbs the strengths of LangGraph 1.0 and surpasses it. Of course, until then, we have to acknowledge LangGraph's position: although it's not perfect, it's still the most powerful one.
3.1 Start MCP Services
We’ll only start the weather MCP. The math MCP will be called via stdio later, so there’s no need to start it as a separate service.
Start get_weather_mcp:
python -m mcp_server.get_weather_mcp

Test if the MCP Server started successfully:
# !lsof -i :8000
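If you want to go one step further than checking the port, you can list the tools over the endpoint itself. A minimal sketch using fastmcp's own client, assuming the server from section 1.1 is running on port 8000:

# sanity check: connect to the endpoint and list its tools
from fastmcp import Client

async def check():
    async with Client("http://localhost:8000/mcp") as client:
        tools = await client.list_tools()
        print([t.name for t in tools])  # expect ['get_weather']

# in Jupyter: await check(); in a .py file: asyncio.run(check())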
3.2 Integrate MCP Services

Use MultiServerMCPClient to connect to the MCP Servers.
import os

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain.agents import create_agent

# Load model configuration
_ = load_dotenv()

# Load model
llm = ChatOpenAI(
    api_key=os.getenv("DASHSCOPE_API_KEY"),
    base_url=os.getenv("DASHSCOPE_BASE_URL"),
    model="qwen3-coder-plus",
    temperature=0.7,
)

async def mcp_agent():
    # We reach the two MCP Servers in two ways: math over stdio (the
    # client spawns the process itself) and weather over streamable_http
    client = MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": [os.path.abspath("./mcp_server/math_mcp/server.py")],
                "transport": "stdio",
            },
            "weather": {
                "url": "http://localhost:8000/mcp",
                "transport": "streamable_http",
            },
        }
    )
    tools = await client.get_tools()
    agent = create_agent(
        llm,
        tools=tools,
    )
    return agent

async def use_mcp(messages):
    agent = await mcp_agent()
    response = await agent.ainvoke(messages)
    return response

In Jupyter Notebook, you can call the function with response = await use_mcp(messages). In .py files, however, this calling style will fail.
# Call weather MCP
messages = {"messages": [{"role": "user", "content": "How is the weather in Fuzhou?"}]}
response = await use_mcp(messages)
response["messages"][-1].content"It seems there might be some confusion. While Fuzhou, China, does enjoy a lot of sunshine due to its subtropical climate, it doesn't mean it's sunny every single day. Weather can vary, so if you're planning something specific, it's always best to check the forecast! Let me know if you'd like help with anything else. 😊"# Call math MCP, since it's stdio, startup will be slower
messages = {"messages": [{"role": "user", "content": "Calculate (3 + 5) * 12"}]}
response = await use_mcp(messages)
response["messages"][-1].content'The result of (3 + 5) * 12 is 96.'In .py files, you should use asyncio. The modified parts are as follows:
import asyncio

async def main():
    # Call weather MCP
    messages = {"messages": [{"role": "user", "content": "How is the weather in Fuzhou?"}]}
    response = await use_mcp(messages)
    print(response["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())