Get started building your own client that can integrate with all MCP servers.
System requirements
Before you begin, make sure your system meets these requirements:
Mac or Windows computer
Latest Python version installed
Latest version of uv installed
Set up your environment
First, create a new Python project with uv:
# Create project directory
uv init mcp-client
cd mcp-client
# Create virtual environment
uv venv
# Activate virtual environment
# On Windows:
.venv\Scripts\activate
# On Unix or MacOS:
source .venv/bin/activate
# Install required packages
uv add mcp anthropic python-dotenv
# Remove boilerplate files
# On Windows:
del main.py
# On Unix or MacOS:
rm main.py
# Create our main file
touch client.py
Set up your API key
Create a .env file to store your Anthropic API key:
# Create .env file
touch .env
Add your key to the .env file:
ANTHROPIC_API_KEY=<your key here>
Add .env to your .gitignore:
echo ".env" >> .gitignore
Creating the client
Basic client structure
First, let's set up our imports and create the basic client class:
import asyncio
from typing import Optional
from contextlib import AsyncExitStack
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from anthropic import Anthropic
from dotenv import load_dotenv
load_dotenv() # load environment variables from .env
class MCPClient:
    def __init__(self):
        # Initialize session and client objects
        self.session: Optional[ClientSession] = None
        self.exit_stack = AsyncExitStack()
        self.anthropic = Anthropic()
    # methods will go here
Server connection management
Next, we'll implement the method to connect to an MCP server:
async def connect_to_server(self, server_script_path: str):
    """Connect to an MCP server

    Args:
        server_script_path: Path to the server script (.py or .js)
    """
    is_python = server_script_path.endswith('.py')
    is_js = server_script_path.endswith('.js')
    if not (is_python or is_js):
        raise ValueError("Server script must be a .py or .js file")

    command = "python" if is_python else "node"
    server_params = StdioServerParameters(
        command=command,
        args=[server_script_path],
        env=None
    )

    stdio_transport = await self.exit_stack.enter_async_context(stdio_client(server_params))
    self.stdio, self.write = stdio_transport
    self.session = await self.exit_stack.enter_async_context(ClientSession(self.stdio, self.write))

    await self.session.initialize()

    # List available tools
    response = await self.session.list_tools()
    tools = response.tools
    print("\nConnected to server with tools:", [tool.name for tool in tools])
Query processing logic
Now let's add the core functionality for processing queries and handling tool calls:
async def process_query(self, query: str) -> str:
    """Process a query using Claude and available tools"""
    messages = [
        {
            "role": "user",
            "content": query
        }
    ]

    response = await self.session.list_tools()
    available_tools = [{
        "name": tool.name,
        "description": tool.description,
        "input_schema": tool.inputSchema
    } for tool in response.tools]

    # Initial Claude API call
    response = self.anthropic.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1000,
        messages=messages,
        tools=available_tools
    )

    # Process response and handle tool calls
    final_text = []

    assistant_message_content = []
    for content in response.content:
        if content.type == 'text':
            final_text.append(content.text)
            assistant_message_content.append(content)
        elif content.type == 'tool_use':
            tool_name = content.name
            tool_args = content.input

            # Execute tool call
            result = await self.session.call_tool(tool_name, tool_args)
            final_text.append(f"[Calling tool {tool_name} with args {tool_args}]")

            assistant_message_content.append(content)
            messages.append({
                "role": "assistant",
                "content": assistant_message_content
            })
            messages.append({
                "role": "user",
                "content": [
                    {
                        "type": "tool_result",
                        "tool_use_id": content.id,
                        "content": result.content
                    }
                ]
            })

            # Get next response from Claude
            response = self.anthropic.messages.create(
                model="claude-3-5-sonnet-20241022",
                max_tokens=1000,
                messages=messages,
                tools=available_tools
            )

            final_text.append(response.content[0].text)

    return "\n".join(final_text)
Interactive chat interface
Now we'll add the chat loop and cleanup functionality:
async def chat_loop(self):
    """Run an interactive chat loop"""
    print("\nMCP Client Started!")
    print("Type your queries or 'quit' to exit.")

    while True:
        try:
            query = input("\nQuery: ").strip()

            if query.lower() == 'quit':
                break

            response = await self.process_query(query)
            print("\n" + response)

        except Exception as e:
            print(f"\nError: {str(e)}")

async def cleanup(self):
    """Clean up resources"""
    await self.exit_stack.aclose()
Main entry point
Finally, we'll add the main execution logic:
async def main():
    if len(sys.argv) < 2:
        print("Usage: python client.py <path_to_server_script>")
        sys.exit(1)

    client = MCPClient()
    try:
        await client.connect_to_server(sys.argv[1])
        await client.chat_loop()
    finally:
        await client.cleanup()

if __name__ == "__main__":
    import sys
    asyncio.run(main())
You can find the complete client.py file in the linked example code.
Running the client
To run your client with any MCP server:
uv run client.py path/to/server.py # python server
uv run client.py path/to/build/index.js # node server
How it works
When you submit a query, the following happens (a concrete sketch of the message shapes follows this list):
The client gets the list of available tools from the server
Your query is sent to the LLM along with the tool descriptions
The LLM decides which tools (if any) to use
The client executes any requested tool calls through the server
The results are sent back to the LLM
The LLM provides a natural language response
The response is displayed to you
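To make that round trip concrete, the sketch below shows roughly what the messages list looks like after a single tool call. The shapes follow the Anthropic Messages API tool-use format used in process_query; the tool name, id, and values here are made up for illustration:

# Illustrative message shapes after one tool call (all values are made up)
messages = [
    {"role": "user", "content": "What's the weather in Tokyo?"},
    {"role": "assistant", "content": [
        {"type": "tool_use", "id": "toolu_123",
         "name": "get_forecast", "input": {"city": "Tokyo"}}
    ]},
    {"role": "user", "content": [
        {"type": "tool_result", "tool_use_id": "toolu_123",
         "content": "Sunny, 24°C"}
    ]}
]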
Best practices
Error handling
Always wrap tool calls in try-except blocks (see the sketch after this list)
Provide meaningful error messages
Handle connection issues gracefully
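As one concrete illustration of the first point, the sketch below wraps the call_tool invocation from process_query so a single failing tool reports an error instead of aborting the whole query; the exact error text is just an example:

# Sketch: wrap the tool call from process_query so a failing tool is reported, not fatal
try:
    result = await self.session.call_tool(tool_name, tool_args)
    tool_result_content = result.content
except Exception as e:
    # Feed a readable error back instead of crashing the chat loop
    tool_result_content = f"Tool {tool_name} failed: {e}"
    final_text.append(f"[Tool {tool_name} raised an error: {e}]")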
Resource management
Use AsyncExitStack for proper cleanup (see the sketch after this list)
Close connections when you're done
Handle server disconnections
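One way to guarantee the AsyncExitStack is always closed is to expose the client as an async context manager. This is an optional refactor sketched under the assumption that the rest of MCPClient stays as written above; it is not something the MCP SDK requires:

# Optional sketch: make MCPClient usable with "async with" so cleanup always runs
class MCPClient:
    ...  # __init__, connect_to_server, etc. as defined earlier

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Closing the exit stack tears down the ClientSession and stdio transport
        await self.exit_stack.aclose()

# Usage:
# async with MCPClient() as client:
#     await client.connect_to_server(sys.argv[1])
#     await client.chat_loop()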
Security
Store API keys securely in .env
Validate server responses
Be cautious with tool permissions (one possible guard is sketched after this list)
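As one hedged way to act on the last point, you could gate execution behind an explicit allowlist inside process_query, before the call_tool invocation; ALLOWED_TOOLS and the tool names in it are hypothetical and not part of the MCP API:

# Sketch: only execute tools you have explicitly approved (ALLOWED_TOOLS is hypothetical)
ALLOWED_TOOLS = {"get_forecast", "get_alerts"}  # example names, adjust to your server

if tool_name not in ALLOWED_TOOLS:
    final_text.append(f"[Refusing to call unapproved tool {tool_name}]")
else:
    result = await self.session.call_tool(tool_name, tool_args)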
Common error messages
If you see:
FileNotFoundError: check your server path
Connection refused: make sure the server is running and the path is correct
Tool execution failed: verify that the tool's required environment variables are set
Timeout error: consider increasing the timeout in your client configuration (a timeout sketch follows this list)
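For the timeout case, one client-side option is to bound each query with asyncio.wait_for inside chat_loop; the 60-second value below is an arbitrary example, not a recommended default:

# Sketch: add a client-side time limit around process_query (60s is arbitrary)
try:
    response = await asyncio.wait_for(self.process_query(query), timeout=60.0)
    print("\n" + response)
except asyncio.TimeoutError:
    print("\nQuery timed out - the server may be slow or unresponsive")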