
Commit 13a36dd

eavanvalkenburg and nmoeller authored and committed
Python: Introducing support for using an MCP server as a plugin (microsoft#11334)
### Motivation and Context

Adds MCP server configs and a function that turns the server into a plugin, with each tool of the server represented as a function. Adds a sample showing how to use that with a Github MCP server.

Closes: microsoft#10785 and microsoft#11190

With special thanks to @nmoeller

### Description

### Contribution Checklist

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄

---------

Co-authored-by: Nico Möller <[email protected]>
1 parent 5b9fe94 commit 13a36dd

File tree

10 files changed: +1080 −116 lines changed


python/pyproject.toml (+4 −1)

@@ -79,6 +79,9 @@ hugging_face = [
     "sentence-transformers >= 2.2,< 5.0",
     "torch == 2.6.0"
 ]
+mcp = [
+    "mcp ~= 1.6"
+]
 mongo = [
     "pymongo >= 4.8.0, < 4.12",
     "motor >= 3.3.2,< 3.8.0"
@@ -114,7 +117,7 @@ qdrant = [
 redis = [
     "redis[hiredis] ~= 5.0",
     "types-redis ~= 4.6.0.20240425",
-    "redisvl >= 0.3.6",
+    "redisvl ~= 0.4"
 ]
 usearch = [
     "usearch ~= 2.16",

python/samples/concepts/mcp/README.md (+46, new file)

# Model Context Protocol

The Model Context Protocol (MCP) is a standard created by Anthropic that standardizes how applications provide context and tools to models. See the [official documentation](https://modelcontextprotocol.io/introduction) for more information.

It consists of clients and servers; servers can be hosted locally, or they can be exposed as an online API.

Our goal is that Semantic Kernel can act as both a client and a server.

This folder demonstrates the client side: it takes the definition of a server and uses that to create a Semantic Kernel plugin. The plugin exposes the tools and prompts of the server as functions in the kernel.

Those can then be used with function calling in a chat or agent.

## Server types

There are two types of servers, Stdio-based and SSE-based. The sample shows how to use the Stdio-based server, which is run locally, in this case by using [npx](https://docs.npmjs.com/cli/v8/commands/npx).

Some other common runners are [uvx](https://docs.astral.sh/uv/guides/tools/) for Python servers and [docker](https://www.docker.com/) for containerized servers.

The code shown works the same for an SSE server, except that an MCPSsePlugin needs to be used instead of the MCPStdioPlugin.

The reverse, using Semantic Kernel as a server, is not yet implemented, but will be in the future.

## Running the sample

1. Make sure you have [Node.js](https://nodejs.org/en/download/) installed.
2. Make sure [npx](https://docs.npmjs.com/cli/v8/commands/npx) is available in your PATH.
3. The Github MCP Server uses a Github Personal Access Token (PAT) to authenticate; see [the documentation](https://github.com/modelcontextprotocol/servers/tree/main/src/github) on how to create one.
4. Install Semantic Kernel with the mcp extra:

   ```bash
   pip install semantic-kernel[mcp]
   ```

5. Run the sample:

   ```bash
   cd python/samples/concepts/mcp
   python mcp_as_plugin.py
   ```

   or:

   ```bash
   cd python/samples/concepts/mcp
   python agent_with_mcp_plugin.py
   ```
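Before running either sample, the token from step 3 has to reach the server process. A minimal sketch, assuming the GitHub MCP server reads the `GITHUB_PERSONAL_ACCESS_TOKEN` environment variable (as its documentation describes; the placeholder value below is yours to replace):

```shell
# Assumption: @modelcontextprotocol/server-github reads the PAT from this
# environment variable; replace the placeholder with your own token.
export GITHUB_PERSONAL_ACCESS_TOKEN="<your-pat-here>"
echo "PAT configured: ${GITHUB_PERSONAL_ACCESS_TOKEN:+yes}"
```

Set it in the shell from which you launch the sample, so the `npx` child process inherits it.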
(new file, +119)

# Copyright (c) Microsoft. All rights reserved.

import asyncio

from semantic_kernel.agents import ChatCompletionAgent, ChatHistoryAgentThread
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.mcp import MCPStdioPlugin

"""
The following sample demonstrates how to create a chat completion agent that
answers questions about Github using a Semantic Kernel plugin from an MCP server.

The chat completion service is passed directly via the ChatCompletionAgent constructor.
Additionally, the plugin is supplied via the constructor.
"""


# Simulate a conversation with the agent
USER_INPUTS = [
    "What are the latest 5 python issues in Microsoft/semantic-kernel?",
    "Are there any untriaged python issues?",
    "What is the status of issue #10785?",
]


async def main():
    # 1. Create the agent
    async with MCPStdioPlugin(
        name="Github",
        description="Github Plugin",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-github"],
    ) as github_plugin:
        agent = ChatCompletionAgent(
            service=AzureChatCompletion(),
            name="IssueAgent",
            instructions="Answer questions about the Microsoft semantic-kernel github project.",
            plugins=[github_plugin],
        )

        # 2. Create a thread to hold the conversation.
        # If no thread is provided, a new thread will be
        # created and returned with the initial response.
        thread: ChatHistoryAgentThread | None = None

        for user_input in USER_INPUTS:
            print(f"# User: {user_input}")
            # 3. Invoke the agent for a response
            response = await agent.get_response(messages=user_input, thread=thread)
            print(f"# {response.name}: {response}")
            thread = response.thread

        # 4. Cleanup: delete the thread
        if thread:
            await thread.delete()


"""
Sample output:
GitHub MCP Server running on stdio
# User: What are the latest 5 python issues in Microsoft/semantic-kernel?
# IssueAgent: Here are the latest 5 Python issues in the
[Microsoft/semantic-kernel](https://github.com/microsoft/semantic-kernel) repository:

1. **[Issue #11358](https://github.com/microsoft/semantic-kernel/pull/11358)**
   **Title:** Python: Bump Python version to 1.27.0 for a release.
   **Created by:** [moonbox3](https://github.com/moonbox3)
   **Created at:** April 3, 2025
   **State:** Open
   **Comments:** 1
   **Description:** Bump Python version to 1.27.0 for a release.

2. **[Issue #11357](https://github.com/microsoft/semantic-kernel/pull/11357)**
   **Title:** .Net: Version 1.45.0
   **Created by:** [markwallace-microsoft](https://github.com/markwallace-microsoft)
   **Created at:** April 3, 2025
   **State:** Open
   **Comments:** 0
   **Description:** Version bump for release 1.45.0.

3. **[Issue #11356](https://github.com/microsoft/semantic-kernel/pull/11356)**
   **Title:** .Net: Fix bug in sqlite filter logic
   **Created by:** [westey-m](https://github.com/westey-m)
   **Created at:** April 3, 2025
   **State:** Open
   **Comments:** 0
   **Description:** Fix bug in sqlite filter logic.

4. **[Issue #11355](https://github.com/microsoft/semantic-kernel/issues/11355)**
   **Title:** .Net: [MEVD] Validate that the collection generic key parameter corresponds to the model
   **Created by:** [roji](https://github.com/roji)
   **Created at:** April 3, 2025
   **State:** Open
   **Comments:** 0
   **Description:** We currently have validation for the TKey generic type parameter passed to the collection type,
   and we have validation for the key property type on the model.

5. **[Issue #11354](https://github.com/microsoft/semantic-kernel/issues/11354)**
   **Title:** .Net: How to add custom JsonSerializer on a builder level
   **Created by:** [PawelStadnicki](https://github.com/PawelStadnicki)
   **Created at:** April 3, 2025
   **State:** Open
   **Comments:** 0
   **Description:** Inquiry about adding a custom JsonSerializer for handling F# types within the SDK.

If you need more details about a specific issue, let me know!
# User: Are there any untriaged python issues?
# IssueAgent: There are no untriaged Python issues in the Microsoft semantic-kernel repository.
# User: What is the status of issue #10785?
# IssueAgent: The status of issue #10785 in the Microsoft Semantic Kernel repository is **open**.

- **Title**: Port dotnet feature: Create MCP Sample
- **Created at**: March 4, 2025
- **Comments**: 0
- **Labels**: python

You can view the issue [here](https://github.com/microsoft/semantic-kernel/issues/10785).
"""


if __name__ == "__main__":
    asyncio.run(main())
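The PR description says each tool of the MCP server is represented as a kernel function. A dependency-free toy sketch of that idea, for illustration only (`ToolSpec` and `make_plugin` are invented names, not semantic_kernel or mcp APIs):

```python
# Illustrative sketch of "each tool becomes a named function" -- this does
# NOT use the real semantic_kernel or mcp packages; all names are invented.
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class ToolSpec:
    """A stand-in for a tool advertised by an MCP server."""

    name: str
    description: str
    handler: Callable[..., Any]


def make_plugin(tools: list[ToolSpec]) -> dict[str, Callable[..., Any]]:
    """Map each tool to a named callable, the way a plugin exposes functions."""
    return {tool.name: tool.handler for tool in tools}


tools = [
    ToolSpec("search_issues", "Search issues", lambda q: f"results for {q}"),
    ToolSpec("get_issue", "Get one issue", lambda n: f"issue #{n}"),
]
plugin = make_plugin(tools)
print(plugin["get_issue"](10785))  # prints "issue #10785"
```

In the real plugin, the handlers would forward calls to the server over stdio or SSE rather than run locally, but the name-to-function mapping is the shape function calling relies on.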
(new file, +115)

# Copyright (c) Microsoft. All rights reserved.

import asyncio

from samples.concepts.setup.chat_completion_services import Services, get_chat_completion_service_and_request_settings
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.mcp import MCPStdioPlugin
from semantic_kernel.contents import ChatHistory

"""
This sample demonstrates how to build a conversational chatbot
using Semantic Kernel. It creates a plugin from an MCP server config
and adds it to the kernel. The chatbot is designed to interact with
the user, call MCP tools as needed, and return responses.

To run this sample, make sure to run:
`pip install semantic-kernel[mcp]`

or install the mcp package manually.

In addition, different MCP Stdio servers need different commands to run.
For example, the Github plugin requires `npx`, others use `uvx` or `docker`.
Make sure those are available in your PATH.
"""

# System message defining the behavior and persona of the chat bot.
system_message = """
You are a chat bot, and you help users interact with Github.
You are especially good at answering questions about the Microsoft semantic-kernel project.
You can call functions to get the information you need.
"""

# Create and configure the kernel.
kernel = Kernel()

# You can select from the following chat completion services that support function calling:
# - Services.OPENAI
# - Services.AZURE_OPENAI
# - Services.AZURE_AI_INFERENCE
# - Services.ANTHROPIC
# - Services.BEDROCK
# - Services.GOOGLE_AI
# - Services.MISTRAL_AI
# - Services.OLLAMA
# - Services.ONNX
# - Services.VERTEX_AI
# - Services.DEEPSEEK
# Please make sure you have configured your environment correctly for the selected chat completion service.
chat_service, settings = get_chat_completion_service_and_request_settings(Services.OPENAI)

# Configure the function choice behavior. Here, we set it to Auto, where auto_invoke=True by default.
# With `auto_invoke=True`, the model will automatically choose and call functions as needed.
settings.function_choice_behavior = FunctionChoiceBehavior.Auto()

kernel.add_service(chat_service)

# Create a chat history to store the system message, initial messages, and the conversation.
history = ChatHistory()
history.add_system_message(system_message)


async def chat() -> bool:
    """
    Continuously prompt the user for input and show the assistant's response.

    Type 'exit' to exit.
    """
    try:
        user_input = input("User:> ")
    except (KeyboardInterrupt, EOFError):
        print("\n\nExiting chat...")
        return False
    if user_input.lower().strip() == "exit":
        print("\n\nExiting chat...")
        return False

    history.add_user_message(user_input)
    result = await chat_service.get_chat_message_content(history, settings, kernel=kernel)
    if result:
        print(f"Mosscap:> {result}")
        history.add_message(result)

    return True


async def main() -> None:
    # Create a plugin from the MCP server config and add it to the kernel.
    # The MCP server plugin is defined using the MCPStdioPlugin class.
    # The command and args are specific to the MCP server you want to run.
    # For example, the Github MCP Server uses `npx` to run the server.
    # There is also an MCPSsePlugin, which takes a URL.
    async with MCPStdioPlugin(
        name="Github",
        description="Github Plugin",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-github"],
    ) as github_plugin:
        # Instead of using this async context manager, you can also use:
        #   await github_plugin.connect()
        # and then `await github_plugin.close()` at the end of the program.

        # Add the plugin to the kernel.
        kernel.add_plugin(github_plugin)

        # Start the chat loop.
        print("Welcome to the chat bot!\n  Type 'exit' to exit.\n")
        chatting = True
        while chatting:
            chatting = await chat()


if __name__ == "__main__":
    asyncio.run(main())
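The control flow of the sample above, reduced to a dependency-free sketch: a turn function that returns `False` on exit, driven by a loop. `EchoService` and the scripted inputs are invented stand-ins, not semantic_kernel APIs:

```python
# Dependency-free sketch of the sample's chat loop; EchoService is an
# invented stand-in for a chat completion service.
import asyncio


class EchoService:
    """Stand-in for a chat service: echoes the last user message."""

    async def get_chat_message_content(self, history: list[str]) -> str:
        return f"You said: {history[-1]}"


async def chat(service: EchoService, history: list[str], user_input: str) -> bool:
    """One turn of the loop; returns False when the user exits."""
    if user_input.lower().strip() == "exit":
        return False
    history.append(user_input)
    result = await service.get_chat_message_content(history)
    history.append(result)
    return True


async def main() -> list[str]:
    history: list[str] = []
    service = EchoService()
    for user_input in ["hello", "exit"]:  # scripted instead of input()
        if not await chat(service, history, user_input):
            break
    return history


transcript = asyncio.run(main())
print(transcript)  # prints ['hello', 'You said: hello']
```

The real sample follows the same shape: the loop keeps running until `chat()` returns `False`, and the shared history accumulates both sides of the conversation between turns.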
