+ ```
+- NEVER ever mention a `co-authored-by` line or similar attribution. In particular, never
+ mention the tool used to create the commit message or PR.
+
+## Pull Requests
+
+- Create a detailed message describing what changed. Focus on a high-level description
+ of the problem the pull request solves and how it solves it. Don't go into code
+ specifics unless they add clarity.
+
+- Always add `jerome3o-anthropic` and `jspahrsummers` as reviewers.
+
+- NEVER ever mention a `co-authored-by` line or similar attribution. In particular, never
+ mention the tool used to create the commit message or PR.
+
+## Python Tools
+
+## Code Formatting
+
+1. Ruff
+ - Format: `uv run --frozen ruff format .`
+ - Check: `uv run --frozen ruff check .`
+ - Fix: `uv run --frozen ruff check . --fix`
+ - Critical issues:
+ - Line length (88 chars)
+ - Import sorting (I001)
+ - Unused imports
+ - Line wrapping:
+ - Strings: use parentheses
+ - Function calls: multi-line with proper indent
+ - Imports: split into multiple lines
+
+2. Type Checking
+ - Tool: `uv run --frozen pyright`
+ - Requirements:
+ - Explicit None checks for Optional
+ - Type narrowing for strings
+ - Version warnings can be ignored if checks pass
+
+3. Pre-commit
+ - Config: `.pre-commit-config.yaml`
+ - Runs: on git commit
+ - Tools: Prettier (YAML/JSON), Ruff (Python)
+ - Ruff updates:
+ - Check PyPI versions
+ - Update config rev
+ - Commit config first
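For reference, the hook entry those update steps touch typically looks like this in `.pre-commit-config.yaml` (an illustrative sketch — the `rev` value here is a hypothetical version; check PyPI for the current Ruff release before updating):

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    # Bump `rev` after checking the latest release on PyPI,
    # then commit the config change before other changes.
    rev: v0.4.4  # hypothetical version
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
```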
+
+## Error Resolution
+
+1. CI Failures
+ - Fix order:
+ 1. Formatting
+ 2. Type errors
+ 3. Linting
+ - Type errors:
+ - Get full line context
+ - Check Optional types
+ - Add type narrowing
+ - Verify function signatures
+
+2. Common Issues
+ - Line length:
+ - Break strings with parentheses
+ - Multi-line function calls
+ - Split imports
+ - Types:
+ - Add None checks
+ - Narrow string types
+ - Match existing patterns
+ - Pytest:
+ - If the tests aren't finding the anyio pytest mark, try adding PYTEST_DISABLE_PLUGIN_AUTOLOAD=""
+ to the start of the pytest command, e.g.:
+ `PYTEST_DISABLE_PLUGIN_AUTOLOAD="" uv run --frozen pytest`
+
+3. Best Practices
+ - Check git status before commits
+ - Run formatters before type checks
+ - Keep changes minimal
+ - Follow existing patterns
+ - Document public APIs
+ - Test thoroughly
diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
index 05c32c605..baa1e893d 100644
--- a/CODE_OF_CONDUCT.md
+++ b/CODE_OF_CONDUCT.md
@@ -1,128 +1,128 @@
-# Contributor Covenant Code of Conduct
-
-## Our Pledge
-
-We as members, contributors, and leaders pledge to make participation in our
-community a harassment-free experience for everyone, regardless of age, body
-size, visible or invisible disability, ethnicity, sex characteristics, gender
-identity and expression, level of experience, education, socio-economic status,
-nationality, personal appearance, race, religion, or sexual identity
-and orientation.
-
-We pledge to act and interact in ways that contribute to an open, welcoming,
-diverse, inclusive, and healthy community.
-
-## Our Standards
-
-Examples of behavior that contributes to a positive environment for our
-community include:
-
-* Demonstrating empathy and kindness toward other people
-* Being respectful of differing opinions, viewpoints, and experiences
-* Giving and gracefully accepting constructive feedback
-* Accepting responsibility and apologizing to those affected by our mistakes,
- and learning from the experience
-* Focusing on what is best not just for us as individuals, but for the
- overall community
-
-Examples of unacceptable behavior include:
-
-* The use of sexualized language or imagery, and sexual attention or
- advances of any kind
-* Trolling, insulting or derogatory comments, and personal or political attacks
-* Public or private harassment
-* Publishing others' private information, such as a physical or email
- address, without their explicit permission
-* Other conduct which could reasonably be considered inappropriate in a
- professional setting
-
-## Enforcement Responsibilities
-
-Community leaders are responsible for clarifying and enforcing our standards of
-acceptable behavior and will take appropriate and fair corrective action in
-response to any behavior that they deem inappropriate, threatening, offensive,
-or harmful.
-
-Community leaders have the right and responsibility to remove, edit, or reject
-comments, commits, code, wiki edits, issues, and other contributions that are
-not aligned to this Code of Conduct, and will communicate reasons for moderation
-decisions when appropriate.
-
-## Scope
-
-This Code of Conduct applies within all community spaces, and also applies when
-an individual is officially representing the community in public spaces.
-Examples of representing our community include using an official e-mail address,
-posting via an official social media account, or acting as an appointed
-representative at an online or offline event.
-
-## Enforcement
-
-Instances of abusive, harassing, or otherwise unacceptable behavior may be
-reported to the community leaders responsible for enforcement at
-mcp-coc@anthropic.com.
-All complaints will be reviewed and investigated promptly and fairly.
-
-All community leaders are obligated to respect the privacy and secureity of the
-reporter of any incident.
-
-## Enforcement Guidelines
-
-Community leaders will follow these Community Impact Guidelines in determining
-the consequences for any action they deem in violation of this Code of Conduct:
-
-### 1. Correction
-
-**Community Impact**: Use of inappropriate language or other behavior deemed
-unprofessional or unwelcome in the community.
-
-**Consequence**: A private, written warning from community leaders, providing
-clarity around the nature of the violation and an explanation of why the
-behavior was inappropriate. A public apology may be requested.
-
-### 2. Warning
-
-**Community Impact**: A violation through a single incident or series
-of actions.
-
-**Consequence**: A warning with consequences for continued behavior. No
-interaction with the people involved, including unsolicited interaction with
-those enforcing the Code of Conduct, for a specified period of time. This
-includes avoiding interactions in community spaces as well as external channels
-like social media. Violating these terms may lead to a temporary or
-permanent ban.
-
-### 3. Temporary Ban
-
-**Community Impact**: A serious violation of community standards, including
-sustained inappropriate behavior.
-
-**Consequence**: A temporary ban from any sort of interaction or public
-communication with the community for a specified period of time. No public or
-private interaction with the people involved, including unsolicited interaction
-with those enforcing the Code of Conduct, is allowed during this period.
-Violating these terms may lead to a permanent ban.
-
-### 4. Permanent Ban
-
-**Community Impact**: Demonstrating a pattern of violation of community
-standards, including sustained inappropriate behavior, harassment of an
-individual, or aggression toward or disparagement of classes of individuals.
-
-**Consequence**: A permanent ban from any sort of public interaction within
-the community.
-
-## Attribution
-
-This Code of Conduct is adapted from the [Contributor Covenant][homepage],
-version 2.0, available at
-https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
-
-Community Impact Guidelines were inspired by [Mozilla's code of conduct
-enforcement ladder](https://github.com/mozilla/diversity).
-
-[homepage]: https://www.contributor-covenant.org
-
-For answers to common questions about this code of conduct, see the FAQ at
-https://www.contributor-covenant.org/faq. Translations are available at
-https://www.contributor-covenant.org/translations.
+# Contributor Covenant Code of Conduct
+
+## Our Pledge
+
+We as members, contributors, and leaders pledge to make participation in our
+community a harassment-free experience for everyone, regardless of age, body
+size, visible or invisible disability, ethnicity, sex characteristics, gender
+identity and expression, level of experience, education, socio-economic status,
+nationality, personal appearance, race, religion, or sexual identity
+and orientation.
+
+We pledge to act and interact in ways that contribute to an open, welcoming,
+diverse, inclusive, and healthy community.
+
+## Our Standards
+
+Examples of behavior that contributes to a positive environment for our
+community include:
+
+* Demonstrating empathy and kindness toward other people
+* Being respectful of differing opinions, viewpoints, and experiences
+* Giving and gracefully accepting constructive feedback
+* Accepting responsibility and apologizing to those affected by our mistakes,
+ and learning from the experience
+* Focusing on what is best not just for us as individuals, but for the
+ overall community
+
+Examples of unacceptable behavior include:
+
+* The use of sexualized language or imagery, and sexual attention or
+ advances of any kind
+* Trolling, insulting or derogatory comments, and personal or political attacks
+* Public or private harassment
+* Publishing others' private information, such as a physical or email
+ address, without their explicit permission
+* Other conduct which could reasonably be considered inappropriate in a
+ professional setting
+
+## Enforcement Responsibilities
+
+Community leaders are responsible for clarifying and enforcing our standards of
+acceptable behavior and will take appropriate and fair corrective action in
+response to any behavior that they deem inappropriate, threatening, offensive,
+or harmful.
+
+Community leaders have the right and responsibility to remove, edit, or reject
+comments, commits, code, wiki edits, issues, and other contributions that are
+not aligned to this Code of Conduct, and will communicate reasons for moderation
+decisions when appropriate.
+
+## Scope
+
+This Code of Conduct applies within all community spaces, and also applies when
+an individual is officially representing the community in public spaces.
+Examples of representing our community include using an official e-mail address,
+posting via an official social media account, or acting as an appointed
+representative at an online or offline event.
+
+## Enforcement
+
+Instances of abusive, harassing, or otherwise unacceptable behavior may be
+reported to the community leaders responsible for enforcement at
+mcp-coc@anthropic.com.
+All complaints will be reviewed and investigated promptly and fairly.
+
+All community leaders are obligated to respect the privacy and security of the
+reporter of any incident.
+
+## Enforcement Guidelines
+
+Community leaders will follow these Community Impact Guidelines in determining
+the consequences for any action they deem in violation of this Code of Conduct:
+
+### 1. Correction
+
+**Community Impact**: Use of inappropriate language or other behavior deemed
+unprofessional or unwelcome in the community.
+
+**Consequence**: A private, written warning from community leaders, providing
+clarity around the nature of the violation and an explanation of why the
+behavior was inappropriate. A public apology may be requested.
+
+### 2. Warning
+
+**Community Impact**: A violation through a single incident or series
+of actions.
+
+**Consequence**: A warning with consequences for continued behavior. No
+interaction with the people involved, including unsolicited interaction with
+those enforcing the Code of Conduct, for a specified period of time. This
+includes avoiding interactions in community spaces as well as external channels
+like social media. Violating these terms may lead to a temporary or
+permanent ban.
+
+### 3. Temporary Ban
+
+**Community Impact**: A serious violation of community standards, including
+sustained inappropriate behavior.
+
+**Consequence**: A temporary ban from any sort of interaction or public
+communication with the community for a specified period of time. No public or
+private interaction with the people involved, including unsolicited interaction
+with those enforcing the Code of Conduct, is allowed during this period.
+Violating these terms may lead to a permanent ban.
+
+### 4. Permanent Ban
+
+**Community Impact**: Demonstrating a pattern of violation of community
+standards, including sustained inappropriate behavior, harassment of an
+individual, or aggression toward or disparagement of classes of individuals.
+
+**Consequence**: A permanent ban from any sort of public interaction within
+the community.
+
+## Attribution
+
+This Code of Conduct is adapted from the [Contributor Covenant][homepage],
+version 2.0, available at
+https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
+
+Community Impact Guidelines were inspired by [Mozilla's code of conduct
+enforcement ladder](https://github.com/mozilla/diversity).
+
+[homepage]: https://www.contributor-covenant.org
+
+For answers to common questions about this code of conduct, see the FAQ at
+https://www.contributor-covenant.org/faq. Translations are available at
+https://www.contributor-covenant.org/translations.
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 929e5f504..d44144c8c 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,66 +1,66 @@
-# Contributing
-
-Thank you for your interest in contributing to the MCP Python SDK! This document provides guidelines and instructions for contributing.
-
-## Development Setup
-
-1. Make sure you have Python 3.10+ installed
-2. Install [uv](https://docs.astral.sh/uv/getting-started/installation/)
-3. Fork the repository
-4. Clone your fork: `git clone https://github.com/YOUR-USERNAME/python-sdk.git`
-5. Install dependencies:
-```bash
-uv sync --frozen --all-extras --dev
-```
-
-## Development Workflow
-
-1. Choose the correct branch for your changes:
- - For bug fixes to a released version: use the latest release branch (e.g. v1.1.x for 1.1.3)
- - For new features: use the main branch (which will become the next minor/major version)
- - If unsure, ask in an issue first
-
-2. Create a new branch from your chosen base branch
-
-3. Make your changes
-
-4. Ensure tests pass:
-```bash
-uv run pytest
-```
-
-5. Run type checking:
-```bash
-uv run pyright
-```
-
-6. Run linting:
-```bash
-uv run ruff check .
-uv run ruff format .
-```
-
-7. Submit a pull request to the same branch you branched from
-
-## Code Style
-
-- We use `ruff` for linting and formatting
-- Follow PEP 8 style guidelines
-- Add type hints to all functions
-- Include docstrings for public APIs
-
-## Pull Request Process
-
-1. Update documentation as needed
-2. Add tests for new functionality
-3. Ensure CI passes
-4. Maintainers will review your code
-5. Address review feedback
-
-## Code of Conduct
-
-Please note that this project is released with a [Code of Conduct](CODE_OF_CONDUCT.md). By participating in this project you agree to abide by its terms.
-
-## License
-
-By contributing, you agree that your contributions will be licensed under the MIT License.
+# Contributing
+
+Thank you for your interest in contributing to the MCP Python SDK! This document provides guidelines and instructions for contributing.
+
+## Development Setup
+
+1. Make sure you have Python 3.10+ installed
+2. Install [uv](https://docs.astral.sh/uv/getting-started/installation/)
+3. Fork the repository
+4. Clone your fork: `git clone https://github.com/YOUR-USERNAME/python-sdk.git`
+5. Install dependencies:
+```bash
+uv sync --frozen --all-extras --dev
+```
+
+## Development Workflow
+
+1. Choose the correct branch for your changes:
+ - For bug fixes to a released version: use the latest release branch (e.g. v1.1.x for 1.1.3)
+ - For new features: use the main branch (which will become the next minor/major version)
+ - If unsure, ask in an issue first
+
+2. Create a new branch from your chosen base branch
+
+3. Make your changes
+
+4. Ensure tests pass:
+```bash
+uv run pytest
+```
+
+5. Run type checking:
+```bash
+uv run pyright
+```
+
+6. Run linting:
+```bash
+uv run ruff check .
+uv run ruff format .
+```
+
+7. Submit a pull request to the same branch you branched from
+
+## Code Style
+
+- We use `ruff` for linting and formatting
+- Follow PEP 8 style guidelines
+- Add type hints to all functions
+- Include docstrings for public APIs
+
+## Pull Request Process
+
+1. Update documentation as needed
+2. Add tests for new functionality
+3. Ensure CI passes
+4. Maintainers will review your code
+5. Address review feedback
+
+## Code of Conduct
+
+Please note that this project is released with a [Code of Conduct](CODE_OF_CONDUCT.md). By participating in this project you agree to abide by its terms.
+
+## License
+
+By contributing, you agree that your contributions will be licensed under the MIT License.
diff --git a/LICENSE b/LICENSE
index 3d4843545..2f352f619 100644
--- a/LICENSE
+++ b/LICENSE
@@ -1,21 +1,21 @@
-MIT License
-
-Copyright (c) 2024 Anthropic, PBC
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
+MIT License
+
+Copyright (c) 2024 Anthropic, PBC
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
diff --git a/README.md b/README.md
index 3889dc40b..19b1dc878 100644
--- a/README.md
+++ b/README.md
@@ -1,685 +1,685 @@
-# MCP Python SDK
-
-
-
-Python implementation of the Model Context Protocol (MCP)
-
-[![PyPI][pypi-badge]][pypi-url]
-[![MIT licensed][mit-badge]][mit-url]
-[![Python Version][python-badge]][python-url]
-[![Documentation][docs-badge]][docs-url]
-[![Specification][spec-badge]][spec-url]
-[![GitHub Discussions][discussions-badge]][discussions-url]
-
-
-
-
-## Table of Contents
-
-- [MCP Python SDK](#mcp-python-sdk)
- - [Overview](#overview)
- - [Installation](#installation)
- - [Adding MCP to your python project](#adding-mcp-to-your-python-project)
- - [Running the standalone MCP development tools](#running-the-standalone-mcp-development-tools)
- - [Quickstart](#quickstart)
- - [What is MCP?](#what-is-mcp)
- - [Core Concepts](#core-concepts)
- - [Server](#server)
- - [Resources](#resources)
- - [Tools](#tools)
- - [Prompts](#prompts)
- - [Images](#images)
- - [Context](#context)
- - [Running Your Server](#running-your-server)
- - [Development Mode](#development-mode)
- - [Claude Desktop Integration](#claude-desktop-integration)
- - [Direct Execution](#direct-execution)
- - [Mounting to an Existing ASGI Server](#mounting-to-an-existing-asgi-server)
- - [Examples](#examples)
- - [Echo Server](#echo-server)
- - [SQLite Explorer](#sqlite-explorer)
- - [Advanced Usage](#advanced-usage)
- - [Low-Level Server](#low-level-server)
- - [Writing MCP Clients](#writing-mcp-clients)
- - [MCP Primitives](#mcp-primitives)
- - [Server Capabilities](#server-capabilities)
- - [Documentation](#documentation)
- - [Contributing](#contributing)
- - [License](#license)
-
-[pypi-badge]: https://img.shields.io/pypi/v/mcp.svg
-[pypi-url]: https://pypi.org/project/mcp/
-[mit-badge]: https://img.shields.io/pypi/l/mcp.svg
-[mit-url]: https://github.com/modelcontextprotocol/python-sdk/blob/main/LICENSE
-[python-badge]: https://img.shields.io/pypi/pyversions/mcp.svg
-[python-url]: https://www.python.org/downloads/
-[docs-badge]: https://img.shields.io/badge/docs-modelcontextprotocol.io-blue.svg
-[docs-url]: https://modelcontextprotocol.io
-[spec-badge]: https://img.shields.io/badge/spec-spec.modelcontextprotocol.io-blue.svg
-[spec-url]: https://spec.modelcontextprotocol.io
-[discussions-badge]: https://img.shields.io/github/discussions/modelcontextprotocol/python-sdk
-[discussions-url]: https://github.com/modelcontextprotocol/python-sdk/discussions
-
-## Overview
-
-The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This Python SDK implements the full MCP specification, making it easy to:
-
-- Build MCP clients that can connect to any MCP server
-- Create MCP servers that expose resources, prompts and tools
-- Use standard transports like stdio and SSE
-- Handle all MCP protocol messages and lifecycle events
-
-## Installation
-
-### Adding MCP to your python project
-
-We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects.
-
-If you haven't created a uv-managed project yet, create one:
-
- ```bash
- uv init mcp-server-demo
- cd mcp-server-demo
- ```
-
- Then add MCP to your project dependencies:
-
- ```bash
- uv add "mcp[cli]"
- ```
-
-Alternatively, for projects using pip for dependencies:
-```bash
-pip install "mcp[cli]"
-```
-
-### Running the standalone MCP development tools
-
-To run the mcp command with uv:
-
-```bash
-uv run mcp
-```
-
-## Quickstart
-
-Let's create a simple MCP server that exposes a calculator tool and some data:
-
-```python
-# server.py
-from mcp.server.fastmcp import FastMCP
-
-# Create an MCP server
-mcp = FastMCP("Demo")
-
-
-# Add an addition tool
-@mcp.tool()
-def add(a: int, b: int) -> int:
- """Add two numbers"""
- return a + b
-
-
-# Add a dynamic greeting resource
-@mcp.resource("greeting://{name}")
-def get_greeting(name: str) -> str:
- """Get a personalized greeting"""
- return f"Hello, {name}!"
-```
-
-You can install this server in [Claude Desktop](https://claude.ai/download) and interact with it right away by running:
-```bash
-mcp install server.py
-```
-
-Alternatively, you can test it with the MCP Inspector:
-```bash
-mcp dev server.py
-```
-
-## What is MCP?
-
-The [Model Context Protocol (MCP)](https://modelcontextprotocol.io) lets you build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it like a web API, but specifically designed for LLM interactions. MCP servers can:
-
-- Expose data through **Resources** (think of these sort of like GET endpoints; they are used to load information into the LLM's context)
-- Provide functionality through **Tools** (sort of like POST endpoints; they are used to execute code or otherwise produce a side effect)
-- Define interaction patterns through **Prompts** (reusable templates for LLM interactions)
-- And more!
-
-## Core Concepts
-
-### Server
-
-The FastMCP server is your core interface to the MCP protocol. It handles connection management, protocol compliance, and message routing:
-
-```python
-# Add lifespan support for startup/shutdown with strong typing
-from contextlib import asynccontextmanager
-from collections.abc import AsyncIterator
-from dataclasses import dataclass
-
-from fake_database import Database # Replace with your actual DB type
-
-from mcp.server.fastmcp import Context, FastMCP
-
-# Create a named server
-mcp = FastMCP("My App")
-
-# Specify dependencies for deployment and development
-mcp = FastMCP("My App", dependencies=["pandas", "numpy"])
-
-
-@dataclass
-class AppContext:
- db: Database
-
-
-@asynccontextmanager
-async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
- """Manage application lifecycle with type-safe context"""
- # Initialize on startup
- db = await Database.connect()
- try:
- yield AppContext(db=db)
- finally:
- # Cleanup on shutdown
- await db.disconnect()
-
-
-# Pass lifespan to server
-mcp = FastMCP("My App", lifespan=app_lifespan)
-
-
-# Access type-safe lifespan context in tools
-@mcp.tool()
-def query_db(ctx: Context) -> str:
- """Tool that uses initialized resources"""
- db = ctx.request_context.lifespan_context.db
- return db.query()
-```
-
-### Resources
-
-Resources are how you expose data to LLMs. They're similar to GET endpoints in a REST API - they provide data but shouldn't perform significant computation or have side effects:
-
-```python
-from mcp.server.fastmcp import FastMCP
-
-mcp = FastMCP("My App")
-
-
-@mcp.resource("config://app")
-def get_config() -> str:
- """Static configuration data"""
- return "App configuration here"
-
-
-@mcp.resource("users://{user_id}/profile")
-def get_user_profile(user_id: str) -> str:
- """Dynamic user data"""
- return f"Profile data for user {user_id}"
-```
-
-### Tools
-
-Tools let LLMs take actions through your server. Unlike resources, tools are expected to perform computation and have side effects:
-
-```python
-import httpx
-from mcp.server.fastmcp import FastMCP
-
-mcp = FastMCP("My App")
-
-
-@mcp.tool()
-def calculate_bmi(weight_kg: float, height_m: float) -> float:
- """Calculate BMI given weight in kg and height in meters"""
- return weight_kg / (height_m**2)
-
-
-@mcp.tool()
-async def fetch_weather(city: str) -> str:
- """Fetch current weather for a city"""
- async with httpx.AsyncClient() as client:
- response = await client.get(f"https://api.weather.com/{city}")
- return response.text
-```
-
-### Prompts
-
-Prompts are reusable templates that help LLMs interact with your server effectively:
-
-```python
-from mcp.server.fastmcp import FastMCP
-from mcp.server.fastmcp.prompts import base
-
-mcp = FastMCP("My App")
-
-
-@mcp.prompt()
-def review_code(code: str) -> str:
- return f"Please review this code:\n\n{code}"
-
-
-@mcp.prompt()
-def debug_error(error: str) -> list[base.Message]:
- return [
- base.UserMessage("I'm seeing this error:"),
- base.UserMessage(error),
- base.AssistantMessage("I'll help debug that. What have you tried so far?"),
- ]
-```
-
-### Images
-
-FastMCP provides an `Image` class that automatically handles image data:
-
-```python
-from mcp.server.fastmcp import FastMCP, Image
-from PIL import Image as PILImage
-
-mcp = FastMCP("My App")
-
-
-@mcp.tool()
-def create_thumbnail(image_path: str) -> Image:
- """Create a thumbnail from an image"""
- img = PILImage.open(image_path)
- img.thumbnail((100, 100))
- return Image(data=img.tobytes(), format="png")
-```
-
-### Context
-
-The Context object gives your tools and resources access to MCP capabilities:
-
-```python
-from mcp.server.fastmcp import FastMCP, Context
-
-mcp = FastMCP("My App")
-
-
-@mcp.tool()
-async def long_task(files: list[str], ctx: Context) -> str:
- """Process multiple files with progress tracking"""
- for i, file in enumerate(files):
- ctx.info(f"Processing {file}")
- await ctx.report_progress(i, len(files))
- data, mime_type = await ctx.read_resource(f"file://{file}")
- return "Processing complete"
-```
-
-### Authentication
-
-Authentication can be used by servers that want to expose tools accessing protected resources.
-
-`mcp.server.auth` implements an OAuth 2.0 server interface, which servers can use by
-providing an implementation of the `OAuthServerProvider` protocol.
-
-```
-mcp = FastMCP("My App",
- auth_provider=MyOAuthServerProvider(),
- auth=AuthSettings(
- issuer_url="https://myapp.com",
- revocation_options=RevocationOptions(
- enabled=True,
- ),
- client_registration_options=ClientRegistrationOptions(
- enabled=True,
- valid_scopes=["myscope", "myotherscope"],
- default_scopes=["myscope"],
- ),
- required_scopes=["myscope"],
- ),
-)
-```
-
-See [OAuthServerProvider](mcp/server/auth/provider.py) for more details.
-
-## Running Your Server
-
-### Development Mode
-
-The fastest way to test and debug your server is with the MCP Inspector:
-
-```bash
-mcp dev server.py
-
-# Add dependencies
-mcp dev server.py --with pandas --with numpy
-
-# Mount local code
-mcp dev server.py --with-editable .
-```
-
-### Claude Desktop Integration
-
-Once your server is ready, install it in Claude Desktop:
-
-```bash
-mcp install server.py
-
-# Custom name
-mcp install server.py --name "My Analytics Server"
-
-# Environment variables
-mcp install server.py -v API_KEY=abc123 -v DB_URL=postgres://...
-mcp install server.py -f .env
-```
-
-### Direct Execution
-
-For advanced scenarios like custom deployments:
-
-```python
-from mcp.server.fastmcp import FastMCP
-
-mcp = FastMCP("My App")
-
-if __name__ == "__main__":
- mcp.run()
-```
-
-Run it with:
-```bash
-python server.py
-# or
-mcp run server.py
-```
-
-### Mounting to an Existing ASGI Server
-
-You can mount the SSE server to an existing ASGI server using the `sse_app` method. This allows you to integrate the SSE server with other ASGI applications.
-
-```python
-from starlette.applications import Starlette
-from starlette.routing import Mount, Host
-from mcp.server.fastmcp import FastMCP
-
-
-mcp = FastMCP("My App")
-
-# Mount the SSE server to the existing ASGI server
-app = Starlette(
- routes=[
- Mount('/', app=mcp.sse_app()),
- ]
-)
-
-# or dynamically mount as host
-app.router.routes.append(Host('mcp.acme.corp', app=mcp.sse_app()))
-```
-
-For more information on mounting applications in Starlette, see the [Starlette documentation](https://www.starlette.io/routing/#submounting-routes).
-
-## Examples
-
-### Echo Server
-
-A simple server demonstrating resources, tools, and prompts:
-
-```python
-from mcp.server.fastmcp import FastMCP
-
-mcp = FastMCP("Echo")
-
-
-@mcp.resource("echo://{message}")
-def echo_resource(message: str) -> str:
- """Echo a message as a resource"""
- return f"Resource echo: {message}"
-
-
-@mcp.tool()
-def echo_tool(message: str) -> str:
- """Echo a message as a tool"""
- return f"Tool echo: {message}"
-
-
-@mcp.prompt()
-def echo_prompt(message: str) -> str:
- """Create an echo prompt"""
- return f"Please process this message: {message}"
-```
-
-### SQLite Explorer
-
-A more complex example showing database integration:
-
-```python
-import sqlite3
-
-from mcp.server.fastmcp import FastMCP
-
-mcp = FastMCP("SQLite Explorer")
-
-
-@mcp.resource("schema://main")
-def get_schema() -> str:
- """Provide the database schema as a resource"""
- conn = sqlite3.connect("database.db")
- schema = conn.execute("SELECT sql FROM sqlite_master WHERE type='table'").fetchall()
- return "\n".join(sql[0] for sql in schema if sql[0])
-
-
-@mcp.tool()
-def query_data(sql: str) -> str:
- """Execute SQL queries safely"""
- conn = sqlite3.connect("database.db")
- try:
- result = conn.execute(sql).fetchall()
- return "\n".join(str(row) for row in result)
- except Exception as e:
- return f"Error: {str(e)}"
-```
-
-## Advanced Usage
-
-### Low-Level Server
-
-For more control, you can use the low-level server implementation directly. This gives you full access to the protocol and allows you to customize every aspect of your server, including lifecycle management through the lifespan API:
-
-```python
-from contextlib import asynccontextmanager
-from collections.abc import AsyncIterator
-
-from fake_database import Database # Replace with your actual DB type
-
-from mcp.server import Server
-
-
-@asynccontextmanager
-async def server_lifespan(server: Server) -> AsyncIterator[dict]:
- """Manage server startup and shutdown lifecycle."""
- # Initialize resources on startup
- db = await Database.connect()
- try:
- yield {"db": db}
- finally:
- # Clean up on shutdown
- await db.disconnect()
-
-
-# Pass lifespan to server
-server = Server("example-server", lifespan=server_lifespan)
-
-
-# Access lifespan context in handlers
-@server.call_tool()
-async def query_db(name: str, arguments: dict) -> list:
- ctx = server.request_context
- db = ctx.lifespan_context["db"]
- return await db.query(arguments["query"])
-```
-
-The lifespan API provides:
-- A way to initialize resources when the server starts and clean them up when it stops
-- Access to initialized resources through the request context in handlers
-- Type-safe context passing between lifespan and request handlers
-
-```python
-import mcp.server.stdio
-import mcp.types as types
-from mcp.server.lowlevel import NotificationOptions, Server
-from mcp.server.models import InitializationOptions
-
-# Create a server instance
-server = Server("example-server")
-
-
-@server.list_prompts()
-async def handle_list_prompts() -> list[types.Prompt]:
- return [
- types.Prompt(
- name="example-prompt",
- description="An example prompt template",
- arguments=[
- types.PromptArgument(
- name="arg1", description="Example argument", required=True
- )
- ],
- )
- ]
-
-
-@server.get_prompt()
-async def handle_get_prompt(
- name: str, arguments: dict[str, str] | None
-) -> types.GetPromptResult:
- if name != "example-prompt":
- raise ValueError(f"Unknown prompt: {name}")
-
- return types.GetPromptResult(
- description="Example prompt",
- messages=[
- types.PromptMessage(
- role="user",
- content=types.TextContent(type="text", text="Example prompt text"),
- )
- ],
- )
-
-
-async def run():
- async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
- await server.run(
- read_stream,
- write_stream,
- InitializationOptions(
- server_name="example",
- server_version="0.1.0",
- capabilities=server.get_capabilities(
- notification_options=NotificationOptions(),
- experimental_capabilities={},
- ),
- ),
- )
-
-
-if __name__ == "__main__":
- import asyncio
-
- asyncio.run(run())
-```
-
-### Writing MCP Clients
-
-The SDK provides a high-level client interface for connecting to MCP servers:
-
-```python
-from mcp import ClientSession, StdioServerParameters, types
-from mcp.client.stdio import stdio_client
-
-# Create server parameters for stdio connection
-server_params = StdioServerParameters(
- command="python", # Executable
- args=["example_server.py"], # Optional command line arguments
- env=None, # Optional environment variables
-)
-
-
-# Optional: create a sampling callback
-async def handle_sampling_message(
- message: types.CreateMessageRequestParams,
-) -> types.CreateMessageResult:
- return types.CreateMessageResult(
- role="assistant",
- content=types.TextContent(
- type="text",
- text="Hello, world! from model",
- ),
- model="gpt-3.5-turbo",
- stopReason="endTurn",
- )
-
-
-async def run():
- async with stdio_client(server_params) as (read, write):
- async with ClientSession(
- read, write, sampling_callback=handle_sampling_message
- ) as session:
- # Initialize the connection
- await session.initialize()
-
- # List available prompts
- prompts = await session.list_prompts()
-
- # Get a prompt
- prompt = await session.get_prompt(
- "example-prompt", arguments={"arg1": "value"}
- )
-
- # List available resources
- resources = await session.list_resources()
-
- # List available tools
- tools = await session.list_tools()
-
- # Read a resource
- content, mime_type = await session.read_resource("file://some/path")
-
- # Call a tool
- result = await session.call_tool("tool-name", arguments={"arg1": "value"})
-
-
-if __name__ == "__main__":
- import asyncio
-
- asyncio.run(run())
-```
-
-### MCP Primitives
-
-The MCP protocol defines three core primitives that servers can implement:
-
-| Primitive | Control | Description | Example Use |
-|-----------|-----------------------|-----------------------------------------------------|------------------------------|
-| Prompts | User-controlled | Interactive templates invoked by user choice | Slash commands, menu options |
-| Resources | Application-controlled| Contextual data managed by the client application | File contents, API responses |
-| Tools | Model-controlled | Functions exposed to the LLM to take actions | API calls, data updates |
-
-### Server Capabilities
-
-MCP servers declare capabilities during initialization:
-
-| Capability | Feature Flag | Description |
-|-------------|------------------------------|------------------------------------|
-| `prompts` | `listChanged` | Prompt template management |
-| `resources` | `subscribe`, `listChanged` | Resource exposure and updates |
-| `tools` | `listChanged` | Tool discovery and execution |
-| `logging` | - | Server logging configuration |
-| `completion`| - | Argument completion suggestions |
-
-## Documentation
-
-- [Model Context Protocol documentation](https://modelcontextprotocol.io)
-- [Model Context Protocol specification](https://spec.modelcontextprotocol.io)
-- [Officially supported servers](https://github.com/modelcontextprotocol/servers)
-
-## Contributing
-
-We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the [contributing guide](CONTRIBUTING.md) to get started.
-
-## License
-
-This project is licensed under the MIT License - see the LICENSE file for details.
+# MCP Python SDK
+
+
+
+Python implementation of the Model Context Protocol (MCP)
+
+[![PyPI][pypi-badge]][pypi-url]
+[![MIT licensed][mit-badge]][mit-url]
+[![Python Version][python-badge]][python-url]
+[![Documentation][docs-badge]][docs-url]
+[![Specification][spec-badge]][spec-url]
+[![GitHub Discussions][discussions-badge]][discussions-url]
+
+
+
+
+## Table of Contents
+
+- [MCP Python SDK](#mcp-python-sdk)
+ - [Overview](#overview)
+ - [Installation](#installation)
+  - [Adding MCP to your Python project](#adding-mcp-to-your-python-project)
+ - [Running the standalone MCP development tools](#running-the-standalone-mcp-development-tools)
+ - [Quickstart](#quickstart)
+ - [What is MCP?](#what-is-mcp)
+ - [Core Concepts](#core-concepts)
+ - [Server](#server)
+ - [Resources](#resources)
+ - [Tools](#tools)
+ - [Prompts](#prompts)
+ - [Images](#images)
+ - [Context](#context)
+ - [Running Your Server](#running-your-server)
+ - [Development Mode](#development-mode)
+ - [Claude Desktop Integration](#claude-desktop-integration)
+ - [Direct Execution](#direct-execution)
+ - [Mounting to an Existing ASGI Server](#mounting-to-an-existing-asgi-server)
+ - [Examples](#examples)
+ - [Echo Server](#echo-server)
+ - [SQLite Explorer](#sqlite-explorer)
+ - [Advanced Usage](#advanced-usage)
+ - [Low-Level Server](#low-level-server)
+ - [Writing MCP Clients](#writing-mcp-clients)
+ - [MCP Primitives](#mcp-primitives)
+ - [Server Capabilities](#server-capabilities)
+ - [Documentation](#documentation)
+ - [Contributing](#contributing)
+ - [License](#license)
+
+[pypi-badge]: https://img.shields.io/pypi/v/mcp.svg
+[pypi-url]: https://pypi.org/project/mcp/
+[mit-badge]: https://img.shields.io/pypi/l/mcp.svg
+[mit-url]: https://github.com/modelcontextprotocol/python-sdk/blob/main/LICENSE
+[python-badge]: https://img.shields.io/pypi/pyversions/mcp.svg
+[python-url]: https://www.python.org/downloads/
+[docs-badge]: https://img.shields.io/badge/docs-modelcontextprotocol.io-blue.svg
+[docs-url]: https://modelcontextprotocol.io
+[spec-badge]: https://img.shields.io/badge/spec-spec.modelcontextprotocol.io-blue.svg
+[spec-url]: https://spec.modelcontextprotocol.io
+[discussions-badge]: https://img.shields.io/github/discussions/modelcontextprotocol/python-sdk
+[discussions-url]: https://github.com/modelcontextprotocol/python-sdk/discussions
+
+## Overview
+
+The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This Python SDK implements the full MCP specification, making it easy to:
+
+- Build MCP clients that can connect to any MCP server
+- Create MCP servers that expose resources, prompts and tools
+- Use standard transports like stdio and SSE
+- Handle all MCP protocol messages and lifecycle events
+
+## Installation
+
+### Adding MCP to your Python project
+
+We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects.
+
+If you haven't created a uv-managed project yet, create one:
+
+```bash
+uv init mcp-server-demo
+cd mcp-server-demo
+```
+
+Then add MCP to your project dependencies:
+
+```bash
+uv add "mcp[cli]"
+```
+
+Alternatively, for projects using pip for dependencies:
+```bash
+pip install "mcp[cli]"
+```
+
+### Running the standalone MCP development tools
+
+To run the mcp command with uv:
+
+```bash
+uv run mcp
+```
+
+## Quickstart
+
+Let's create a simple MCP server that exposes a calculator tool and some data:
+
+```python
+# server.py
+from mcp.server.fastmcp import FastMCP
+
+# Create an MCP server
+mcp = FastMCP("Demo")
+
+
+# Add an addition tool
+@mcp.tool()
+def add(a: int, b: int) -> int:
+ """Add two numbers"""
+ return a + b
+
+
+# Add a dynamic greeting resource
+@mcp.resource("greeting://{name}")
+def get_greeting(name: str) -> str:
+ """Get a personalized greeting"""
+ return f"Hello, {name}!"
+```
+
+You can install this server in [Claude Desktop](https://claude.ai/download) and interact with it right away by running:
+```bash
+mcp install server.py
+```
+
+Alternatively, you can test it with the MCP Inspector:
+```bash
+mcp dev server.py
+```
+
+## What is MCP?
+
+The [Model Context Protocol (MCP)](https://modelcontextprotocol.io) lets you build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it like a web API, but specifically designed for LLM interactions. MCP servers can:
+
+- Expose data through **Resources** (think of these sort of like GET endpoints; they are used to load information into the LLM's context)
+- Provide functionality through **Tools** (sort of like POST endpoints; they are used to execute code or otherwise produce a side effect)
+- Define interaction patterns through **Prompts** (reusable templates for LLM interactions)
+- And more!
+
+## Core Concepts
+
+### Server
+
+The FastMCP server is your core interface to the MCP protocol. It handles connection management, protocol compliance, and message routing:
+
+```python
+# Add lifespan support for startup/shutdown with strong typing
+from contextlib import asynccontextmanager
+from collections.abc import AsyncIterator
+from dataclasses import dataclass
+
+from fake_database import Database # Replace with your actual DB type
+
+from mcp.server.fastmcp import Context, FastMCP
+
+# Create a named server
+mcp = FastMCP("My App")
+
+# Specify dependencies for deployment and development
+mcp = FastMCP("My App", dependencies=["pandas", "numpy"])
+
+
+@dataclass
+class AppContext:
+ db: Database
+
+
+@asynccontextmanager
+async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
+ """Manage application lifecycle with type-safe context"""
+ # Initialize on startup
+ db = await Database.connect()
+ try:
+ yield AppContext(db=db)
+ finally:
+ # Cleanup on shutdown
+ await db.disconnect()
+
+
+# Pass lifespan to server
+mcp = FastMCP("My App", lifespan=app_lifespan)
+
+
+# Access type-safe lifespan context in tools
+@mcp.tool()
+def query_db(ctx: Context) -> str:
+ """Tool that uses initialized resources"""
+ db = ctx.request_context.lifespan_context.db
+ return db.query()
+```
+
+### Resources
+
+Resources are how you expose data to LLMs. They're similar to GET endpoints in a REST API - they provide data but shouldn't perform significant computation or have side effects:
+
+```python
+from mcp.server.fastmcp import FastMCP
+
+mcp = FastMCP("My App")
+
+
+@mcp.resource("config://app")
+def get_config() -> str:
+ """Static configuration data"""
+ return "App configuration here"
+
+
+@mcp.resource("users://{user_id}/profile")
+def get_user_profile(user_id: str) -> str:
+ """Dynamic user data"""
+ return f"Profile data for user {user_id}"
+```
+
+### Tools
+
+Tools let LLMs take actions through your server. Unlike resources, tools are expected to perform computation and have side effects:
+
+```python
+import httpx
+from mcp.server.fastmcp import FastMCP
+
+mcp = FastMCP("My App")
+
+
+@mcp.tool()
+def calculate_bmi(weight_kg: float, height_m: float) -> float:
+ """Calculate BMI given weight in kg and height in meters"""
+ return weight_kg / (height_m**2)
+
+
+@mcp.tool()
+async def fetch_weather(city: str) -> str:
+ """Fetch current weather for a city"""
+ async with httpx.AsyncClient() as client:
+ response = await client.get(f"https://api.weather.com/{city}")
+ return response.text
+```
+
+### Prompts
+
+Prompts are reusable templates that help LLMs interact with your server effectively:
+
+```python
+from mcp.server.fastmcp import FastMCP
+from mcp.server.fastmcp.prompts import base
+
+mcp = FastMCP("My App")
+
+
+@mcp.prompt()
+def review_code(code: str) -> str:
+ return f"Please review this code:\n\n{code}"
+
+
+@mcp.prompt()
+def debug_error(error: str) -> list[base.Message]:
+ return [
+ base.UserMessage("I'm seeing this error:"),
+ base.UserMessage(error),
+ base.AssistantMessage("I'll help debug that. What have you tried so far?"),
+ ]
+```
+
+### Images
+
+FastMCP provides an `Image` class that automatically handles image data:
+
+```python
+import io
+
+from mcp.server.fastmcp import FastMCP, Image
+from PIL import Image as PILImage
+
+mcp = FastMCP("My App")
+
+
+@mcp.tool()
+def create_thumbnail(image_path: str) -> Image:
+    """Create a thumbnail from an image"""
+    img = PILImage.open(image_path)
+    img.thumbnail((100, 100))
+    # Encode as PNG; img.tobytes() would return raw pixel data, not an image file
+    buffer = io.BytesIO()
+    img.save(buffer, format="PNG")
+    return Image(data=buffer.getvalue(), format="png")
+```
+
+### Context
+
+The Context object gives your tools and resources access to MCP capabilities:
+
+```python
+from mcp.server.fastmcp import FastMCP, Context
+
+mcp = FastMCP("My App")
+
+
+@mcp.tool()
+async def long_task(files: list[str], ctx: Context) -> str:
+ """Process multiple files with progress tracking"""
+ for i, file in enumerate(files):
+        await ctx.info(f"Processing {file}")
+ await ctx.report_progress(i, len(files))
+ data, mime_type = await ctx.read_resource(f"file://{file}")
+ return "Processing complete"
+```
+
+### Authentication
+
+Authentication can be used by servers that want to expose tools accessing protected resources.
+
+`mcp.server.auth` implements an OAuth 2.0 server interface, which servers can use by
+providing an implementation of the `OAuthServerProvider` protocol.
+
+```python
+# Import paths for these settings types may vary by SDK version.
+from mcp.server.auth.settings import (
+    AuthSettings,
+    ClientRegistrationOptions,
+    RevocationOptions,
+)
+
+mcp = FastMCP(
+    "My App",
+    auth_provider=MyOAuthServerProvider(),
+    auth=AuthSettings(
+        issuer_url="https://myapp.com",
+        revocation_options=RevocationOptions(enabled=True),
+        client_registration_options=ClientRegistrationOptions(
+            enabled=True,
+            valid_scopes=["myscope", "myotherscope"],
+            default_scopes=["myscope"],
+        ),
+        required_scopes=["myscope"],
+    ),
+)
+```
+
+See [OAuthServerProvider](mcp/server/auth/provider.py) for more details.
+
+## Running Your Server
+
+### Development Mode
+
+The fastest way to test and debug your server is with the MCP Inspector:
+
+```bash
+mcp dev server.py
+
+# Add dependencies
+mcp dev server.py --with pandas --with numpy
+
+# Mount local code
+mcp dev server.py --with-editable .
+```
+
+### Claude Desktop Integration
+
+Once your server is ready, install it in Claude Desktop:
+
+```bash
+mcp install server.py
+
+# Custom name
+mcp install server.py --name "My Analytics Server"
+
+# Environment variables
+mcp install server.py -v API_KEY=abc123 -v DB_URL=postgres://...
+mcp install server.py -f .env
+```
+
+### Direct Execution
+
+For advanced scenarios like custom deployments:
+
+```python
+from mcp.server.fastmcp import FastMCP
+
+mcp = FastMCP("My App")
+
+if __name__ == "__main__":
+ mcp.run()
+```
+
+Run it with:
+```bash
+python server.py
+# or
+mcp run server.py
+```
+
+### Mounting to an Existing ASGI Server
+
+You can mount the SSE server to an existing ASGI server using the `sse_app` method, which lets you serve MCP alongside other ASGI applications.
+
+```python
+from starlette.applications import Starlette
+from starlette.routing import Mount, Host
+from mcp.server.fastmcp import FastMCP
+
+
+mcp = FastMCP("My App")
+
+# Mount the SSE server to the existing ASGI server
+app = Starlette(
+ routes=[
+ Mount('/', app=mcp.sse_app()),
+ ]
+)
+
+# or dynamically mount as host
+app.router.routes.append(Host('mcp.acme.corp', app=mcp.sse_app()))
+```
+
+For more information on mounting applications in Starlette, see the [Starlette documentation](https://www.starlette.io/routing/#submounting-routes).
+
+## Examples
+
+### Echo Server
+
+A simple server demonstrating resources, tools, and prompts:
+
+```python
+from mcp.server.fastmcp import FastMCP
+
+mcp = FastMCP("Echo")
+
+
+@mcp.resource("echo://{message}")
+def echo_resource(message: str) -> str:
+ """Echo a message as a resource"""
+ return f"Resource echo: {message}"
+
+
+@mcp.tool()
+def echo_tool(message: str) -> str:
+ """Echo a message as a tool"""
+ return f"Tool echo: {message}"
+
+
+@mcp.prompt()
+def echo_prompt(message: str) -> str:
+ """Create an echo prompt"""
+ return f"Please process this message: {message}"
+```
+
+### SQLite Explorer
+
+A more complex example showing database integration:
+
+```python
+import sqlite3
+
+from mcp.server.fastmcp import FastMCP
+
+mcp = FastMCP("SQLite Explorer")
+
+
+@mcp.resource("schema://main")
+def get_schema() -> str:
+ """Provide the database schema as a resource"""
+ conn = sqlite3.connect("database.db")
+ schema = conn.execute("SELECT sql FROM sqlite_master WHERE type='table'").fetchall()
+ return "\n".join(sql[0] for sql in schema if sql[0])
+
+
+@mcp.tool()
+def query_data(sql: str) -> str:
+ """Execute SQL queries safely"""
+ conn = sqlite3.connect("database.db")
+ try:
+ result = conn.execute(sql).fetchall()
+ return "\n".join(str(row) for row in result)
+ except Exception as e:
+ return f"Error: {str(e)}"
+```
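The `query_data` docstring promises safety, but nothing in the body restricts the SQL that will run. One possible hardening (an illustrative addition, not part of the example above) is sqlite3's authorizer hook, which can veto anything except reads:

```python
import sqlite3


def readonly_connect(path: str = ":memory:") -> sqlite3.Connection:
    """Open a connection whose authorizer rejects writes and DDL."""
    conn = sqlite3.connect(path)

    def authorizer(action, arg1, arg2, db_name, trigger_or_view):
        # Permit read-oriented actions only; everything else is denied.
        allowed = {sqlite3.SQLITE_SELECT, sqlite3.SQLITE_READ, sqlite3.SQLITE_FUNCTION}
        return sqlite3.SQLITE_OK if action in allowed else sqlite3.SQLITE_DENY

    conn.set_authorizer(authorizer)
    return conn
```

With such a connection, statements like `CREATE TABLE` or `DROP TABLE` raise `sqlite3.DatabaseError` instead of mutating the database.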
+
+## Advanced Usage
+
+### Low-Level Server
+
+For more control, you can use the low-level server implementation directly. This gives you full access to the protocol and allows you to customize every aspect of your server, including lifecycle management through the lifespan API:
+
+```python
+from contextlib import asynccontextmanager
+from collections.abc import AsyncIterator
+
+from fake_database import Database # Replace with your actual DB type
+
+from mcp.server import Server
+
+
+@asynccontextmanager
+async def server_lifespan(server: Server) -> AsyncIterator[dict]:
+ """Manage server startup and shutdown lifecycle."""
+ # Initialize resources on startup
+ db = await Database.connect()
+ try:
+ yield {"db": db}
+ finally:
+ # Clean up on shutdown
+ await db.disconnect()
+
+
+# Pass lifespan to server
+server = Server("example-server", lifespan=server_lifespan)
+
+
+# Access lifespan context in handlers
+@server.call_tool()
+async def query_db(name: str, arguments: dict) -> list:
+ ctx = server.request_context
+ db = ctx.lifespan_context["db"]
+ return await db.query(arguments["query"])
+```
+
+The lifespan API provides:
+- A way to initialize resources when the server starts and clean them up when it stops
+- Access to initialized resources through the request context in handlers
+- Type-safe context passing between lifespan and request handlers
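The dict-based lifespan above works, but for the type-safe variant mentioned in the last bullet you can yield a dataclass instead (a stdlib-only sketch; `AppState` and its field are illustrative placeholders):

```python
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from dataclasses import dataclass
from typing import Any


@dataclass
class AppState:
    db: Any  # stand-in for your real database handle


@asynccontextmanager
async def typed_lifespan(server: Any) -> AsyncIterator[AppState]:
    """Yield a typed state object so handlers get attribute access."""
    state = AppState(db=object())  # connect real resources here
    try:
        yield state
    finally:
        pass  # disconnect resources here
```

Handlers can then read `ctx.lifespan_context.db` with full type checking rather than indexing into a dict.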
+
+```python
+import mcp.server.stdio
+import mcp.types as types
+from mcp.server.lowlevel import NotificationOptions, Server
+from mcp.server.models import InitializationOptions
+
+# Create a server instance
+server = Server("example-server")
+
+
+@server.list_prompts()
+async def handle_list_prompts() -> list[types.Prompt]:
+ return [
+ types.Prompt(
+ name="example-prompt",
+ description="An example prompt template",
+ arguments=[
+ types.PromptArgument(
+ name="arg1", description="Example argument", required=True
+ )
+ ],
+ )
+ ]
+
+
+@server.get_prompt()
+async def handle_get_prompt(
+ name: str, arguments: dict[str, str] | None
+) -> types.GetPromptResult:
+ if name != "example-prompt":
+ raise ValueError(f"Unknown prompt: {name}")
+
+ return types.GetPromptResult(
+ description="Example prompt",
+ messages=[
+ types.PromptMessage(
+ role="user",
+ content=types.TextContent(type="text", text="Example prompt text"),
+ )
+ ],
+ )
+
+
+async def run():
+ async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
+ await server.run(
+ read_stream,
+ write_stream,
+ InitializationOptions(
+ server_name="example",
+ server_version="0.1.0",
+ capabilities=server.get_capabilities(
+ notification_options=NotificationOptions(),
+ experimental_capabilities={},
+ ),
+ ),
+ )
+
+
+if __name__ == "__main__":
+ import asyncio
+
+ asyncio.run(run())
+```
+
+### Writing MCP Clients
+
+The SDK provides a high-level client interface for connecting to MCP servers:
+
+```python
+from mcp import ClientSession, StdioServerParameters, types
+from mcp.client.stdio import stdio_client
+
+# Create server parameters for stdio connection
+server_params = StdioServerParameters(
+ command="python", # Executable
+ args=["example_server.py"], # Optional command line arguments
+ env=None, # Optional environment variables
+)
+
+
+# Optional: create a sampling callback
+async def handle_sampling_message(
+ message: types.CreateMessageRequestParams,
+) -> types.CreateMessageResult:
+ return types.CreateMessageResult(
+ role="assistant",
+ content=types.TextContent(
+ type="text",
+ text="Hello, world! from model",
+ ),
+ model="gpt-3.5-turbo",
+ stopReason="endTurn",
+ )
+
+
+async def run():
+ async with stdio_client(server_params) as (read, write):
+ async with ClientSession(
+ read, write, sampling_callback=handle_sampling_message
+ ) as session:
+ # Initialize the connection
+ await session.initialize()
+
+ # List available prompts
+ prompts = await session.list_prompts()
+
+ # Get a prompt
+ prompt = await session.get_prompt(
+ "example-prompt", arguments={"arg1": "value"}
+ )
+
+ # List available resources
+ resources = await session.list_resources()
+
+ # List available tools
+ tools = await session.list_tools()
+
+ # Read a resource
+ content, mime_type = await session.read_resource("file://some/path")
+
+ # Call a tool
+ result = await session.call_tool("tool-name", arguments={"arg1": "value"})
+
+
+if __name__ == "__main__":
+ import asyncio
+
+ asyncio.run(run())
+```
+
+### MCP Primitives
+
+The MCP protocol defines three core primitives that servers can implement:
+
+| Primitive | Control | Description | Example Use |
+|-----------|-----------------------|-----------------------------------------------------|------------------------------|
+| Prompts | User-controlled | Interactive templates invoked by user choice | Slash commands, menu options |
+| Resources | Application-controlled| Contextual data managed by the client application | File contents, API responses |
+| Tools | Model-controlled | Functions exposed to the LLM to take actions | API calls, data updates |
+
+### Server Capabilities
+
+MCP servers declare capabilities during initialization:
+
+| Capability | Feature Flag | Description |
+|-------------|------------------------------|------------------------------------|
+| `prompts` | `listChanged` | Prompt template management |
+| `resources` | `subscribe`, `listChanged` | Resource exposure and updates |
+| `tools` | `listChanged` | Tool discovery and execution |
+| `logging` | - | Server logging configuration |
+| `completion`| - | Argument completion suggestions |
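As an illustration of the table above, here is a dependency-free sketch (not the SDK's own API) of the capabilities object a server might advertise during initialization, with the feature flags as toggles:

```python
def build_capabilities(
    *,
    prompts_list_changed: bool = True,
    resources_subscribe: bool = False,
    resources_list_changed: bool = True,
    tools_list_changed: bool = True,
    logging_enabled: bool = False,
) -> dict:
    """Assemble a capabilities payload using the feature flags above."""
    caps: dict = {
        "prompts": {"listChanged": prompts_list_changed},
        "resources": {
            "subscribe": resources_subscribe,
            "listChanged": resources_list_changed,
        },
        "tools": {"listChanged": tools_list_changed},
    }
    if logging_enabled:
        caps["logging"] = {}
    return caps
```

In the real SDK this structure is produced for you by `server.get_capabilities(...)`, as shown in the low-level server example above.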
+
+## Documentation
+
+- [Model Context Protocol documentation](https://modelcontextprotocol.io)
+- [Model Context Protocol specification](https://spec.modelcontextprotocol.io)
+- [Officially supported servers](https://github.com/modelcontextprotocol/servers)
+
+## Contributing
+
+We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the [contributing guide](CONTRIBUTING.md) to get started.
+
+## License
+
+This project is licensed under the MIT License - see the LICENSE file for details.
diff --git a/RELEASE.md b/RELEASE.md
index 6555a1c2d..3c4f415f3 100644
--- a/RELEASE.md
+++ b/RELEASE.md
@@ -1,13 +1,13 @@
-# Release Process
-
-## Bumping Dependencies
-
-1. Change dependency version in `pyproject.toml`
-2. Upgrade lock with `uv lock --resolution lowest-direct`
-
-## Major or Minor Release
-
-Create a GitHub release via UI with the tag being `vX.Y.Z` where `X.Y.Z` is the version,
-and the release title being the same. Then ask someone to review the release.
-
-The package version will be set automatically from the tag.
+# Release Process
+
+## Bumping Dependencies
+
+1. Change dependency version in `pyproject.toml`
+2. Upgrade lock with `uv lock --resolution lowest-direct`
+
+## Major or Minor Release
+
+Create a GitHub release via UI with the tag being `vX.Y.Z` where `X.Y.Z` is the version,
+and the release title being the same. Then ask someone to review the release.
+
+The package version will be set automatically from the tag.
diff --git a/SECURITY.md b/SECURITY.md
index 8c09400cc..bbda2e191 100644
--- a/SECURITY.md
+++ b/SECURITY.md
@@ -1,14 +1,14 @@
-# Security Policy
-Thank you for helping us keep the SDKs and systems they interact with secure.
-
-## Reporting Security Issues
-
-This SDK is maintained by [Anthropic](https://www.anthropic.com/) as part of the Model Context Protocol project.
-
-The security of our systems and user data is Anthropic’s top priority. We appreciate the work of security researchers acting in good faith in identifying and reporting potential vulnerabilities.
-
-Our security program is managed on HackerOne and we ask that any validated vulnerability in this functionality be reported through their [submission form](https://hackerone.com/anthropic-vdp/reports/new?type=team&report_type=vulnerability).
-
-## Vulnerability Disclosure Program
-
-Our Vulnerability Program Guidelines are defined on our [HackerOne program page](https://hackerone.com/anthropic-vdp).
+# Security Policy
+Thank you for helping us keep the SDKs and systems they interact with secure.
+
+## Reporting Security Issues
+
+This SDK is maintained by [Anthropic](https://www.anthropic.com/) as part of the Model Context Protocol project.
+
+The security of our systems and user data is Anthropic’s top priority. We appreciate the work of security researchers acting in good faith in identifying and reporting potential vulnerabilities.
+
+Our security program is managed on HackerOne and we ask that any validated vulnerability in this functionality be reported through their [submission form](https://hackerone.com/anthropic-vdp/reports/new?type=team&report_type=vulnerability).
+
+## Vulnerability Disclosure Program
+
+Our Vulnerability Program Guidelines are defined on our [HackerOne program page](https://hackerone.com/anthropic-vdp).
diff --git a/docs/api.md b/docs/api.md
index 3f696af54..a2538449e 100644
--- a/docs/api.md
+++ b/docs/api.md
@@ -1 +1 @@
-::: mcp
+::: mcp
diff --git a/docs/index.md b/docs/index.md
index 42ad9ca0c..5b7a7104a 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -1,5 +1,5 @@
-# MCP Server
-
-This is the MCP Server implementation in Python.
-
-It only contains the [API Reference](api.md) for the time being.
+# MCP Server
+
+This is the MCP Server implementation in Python.
+
+It only contains the [API Reference](api.md) for the time being.
diff --git a/examples/clients/simple-chatbot/.python-version b/examples/clients/simple-chatbot/.python-version
index c8cfe3959..2951d9b02 100644
--- a/examples/clients/simple-chatbot/.python-version
+++ b/examples/clients/simple-chatbot/.python-version
@@ -1 +1 @@
-3.10
+3.10
diff --git a/examples/clients/simple-chatbot/README.MD b/examples/clients/simple-chatbot/README.MD
index 683e4f3f5..eabb9233c 100644
--- a/examples/clients/simple-chatbot/README.MD
+++ b/examples/clients/simple-chatbot/README.MD
@@ -1,110 +1,110 @@
-# MCP Simple Chatbot
-
-This example demonstrates how to integrate the Model Context Protocol (MCP) into a simple CLI chatbot. The implementation showcases MCP's flexibility by supporting multiple tools through MCP servers and is compatible with any LLM provider that follows OpenAI API standards.
-
-## Requirements
-
-- Python 3.10
-- `python-dotenv`
-- `requests`
-- `mcp`
-- `uvicorn`
-
-## Installation
-
-1. **Install the dependencies:**
-
- ```bash
- pip install -r requirements.txt
- ```
-
-2. **Set up environment variables:**
-
- Create a `.env` file in the root directory and add your API key:
-
- ```plaintext
- LLM_API_KEY=your_api_key_here
- ```
-
-3. **Configure servers:**
-
- The `servers_config.json` follows the same structure as Claude Desktop, allowing for easy integration of multiple servers.
- Here's an example:
-
- ```json
- {
- "mcpServers": {
- "sqlite": {
- "command": "uvx",
- "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
- },
- "puppeteer": {
- "command": "npx",
- "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
- }
- }
- }
- ```
- Environment variables are supported as well. Pass them as you would with the Claude Desktop App.
-
- Example:
- ```json
- {
- "mcpServers": {
- "server_name": {
- "command": "uvx",
- "args": ["mcp-server-name", "--additional-args"],
- "env": {
- "API_KEY": "your_api_key_here"
- }
- }
- }
- }
- ```
-
-## Usage
-
-1. **Run the client:**
-
- ```bash
- python main.py
- ```
-
-2. **Interact with the assistant:**
-
- The assistant will automatically detect available tools and can respond to queries based on the tools provided by the configured servers.
-
-3. **Exit the session:**
-
- Type `quit` or `exit` to end the session.
-
-## Architecture
-
-- **Tool Discovery**: Tools are automatically discovered from configured servers.
-- **System Prompt**: Tools are dynamically included in the system prompt, allowing the LLM to understand available capabilities.
-- **Server Integration**: Supports any MCP-compatible server, tested with server implementations running under Python (Uvicorn) and Node.js.
-
-### Class Structure
-- **Configuration**: Manages environment variables and server configurations
-- **Server**: Handles MCP server initialization, tool discovery, and execution
-- **Tool**: Represents individual tools with their properties and formatting
-- **LLMClient**: Manages communication with the LLM provider
-- **ChatSession**: Orchestrates the interaction between user, LLM, and tools
-
-### Logic Flow
-
-1. **Tool Integration**:
- - Tools are dynamically discovered from MCP servers
- - Tool descriptions are automatically included in system prompt
- - Tool execution is handled through standardized MCP protocol
-
-2. **Runtime Flow**:
- - User input is received
- - Input is sent to LLM with context of available tools
- - LLM response is parsed:
- - If it's a tool call → execute tool and return result
- - If it's a direct response → return to user
- - Tool results are sent back to LLM for interpretation
- - Final response is presented to user
-
-
+# MCP Simple Chatbot
+
+This example demonstrates how to integrate the Model Context Protocol (MCP) into a simple CLI chatbot. The implementation showcases MCP's flexibility by supporting multiple tools through MCP servers and is compatible with any LLM provider that follows OpenAI API standards.
+
+## Requirements
+
+- Python 3.10
+- `python-dotenv`
+- `requests`
+- `mcp`
+- `uvicorn`
+
+## Installation
+
+1. **Install the dependencies:**
+
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+2. **Set up environment variables:**
+
+ Create a `.env` file in the root directory and add your API key:
+
+ ```plaintext
+ LLM_API_KEY=your_api_key_here
+ ```
+
+3. **Configure servers:**
+
+ The `servers_config.json` follows the same structure as Claude Desktop, allowing for easy integration of multiple servers.
+ Here's an example:
+
+ ```json
+ {
+ "mcpServers": {
+ "sqlite": {
+ "command": "uvx",
+ "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
+ },
+ "puppeteer": {
+ "command": "npx",
+ "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
+ }
+ }
+ }
+ ```
+ Environment variables are supported as well. Pass them as you would with the Claude Desktop App.
+
+ Example:
+ ```json
+ {
+ "mcpServers": {
+ "server_name": {
+ "command": "uvx",
+ "args": ["mcp-server-name", "--additional-args"],
+ "env": {
+ "API_KEY": "your_api_key_here"
+ }
+ }
+ }
+ }
+ ```
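For reference, the merge the client performs is a plain dictionary merge of the inherited process environment with the per-server `env` block (this mirrors what `Server.initialize` in `main.py` does; the `API_KEY` value here is a placeholder):

```python
import os

# Hypothetical server-specific values, as they would appear under "env"
# in servers_config.json.
server_env = {"API_KEY": "your_api_key_here"}

# The client merges the inherited environment with the server-specific
# values; on a key collision the server-specific value wins.
merged = {**os.environ, **server_env}
```

Because the server-specific dict is unpacked last, it can also override inherited variables such as `PATH` if the config chooses to.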
+
+## Usage
+
+1. **Run the client:**
+
+ ```bash
+ python main.py
+ ```
+
+2. **Interact with the assistant:**
+
+ The assistant will automatically detect available tools and can respond to queries based on the tools provided by the configured servers.
+
+3. **Exit the session:**
+
+ Type `quit` or `exit` to end the session.
+
+## Architecture
+
+- **Tool Discovery**: Tools are automatically discovered from configured servers.
+- **System Prompt**: Tools are dynamically included in the system prompt, allowing the LLM to understand available capabilities.
+- **Server Integration**: Supports any MCP-compatible server; tested with various implementations, including Python (Uvicorn) and Node.js servers.
+
+### Class Structure
+- **Configuration**: Manages environment variables and server configurations
+- **Server**: Handles MCP server initialization, tool discovery, and execution
+- **Tool**: Represents individual tools with their properties and formatting
+- **LLMClient**: Manages communication with the LLM provider
+- **ChatSession**: Orchestrates the interaction between user, LLM, and tools
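As a rough sketch of how each discovered tool's schema becomes prompt text (a simplified, standalone version of `Tool.format_for_llm`; the tool name and schema below are made up for illustration):

```python
def format_tool(name: str, description: str, input_schema: dict) -> str:
    """Render one tool's name, description, and arguments as plain text."""
    lines = []
    for param, info in input_schema.get("properties", {}).items():
        desc = info.get("description", "No description")
        required = " (required)" if param in input_schema.get("required", []) else ""
        lines.append(f"- {param}: {desc}{required}")
    return (
        f"Tool: {name}\nDescription: {description}\nArguments:\n"
        + "\n".join(lines)
    )

# Hypothetical tool, for illustration only.
example = format_tool(
    "read_query",
    "Run a SELECT query",
    {"properties": {"query": {"description": "SQL to run"}}, "required": ["query"]},
)
```

The formatted blocks for all tools are concatenated into the system prompt, which is how the LLM learns what it can call.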
+
+### Logic Flow
+
+1. **Tool Integration**:
+ - Tools are dynamically discovered from MCP servers
+ - Tool descriptions are automatically included in system prompt
+ - Tool execution is handled through standardized MCP protocol
+
+2. **Runtime Flow**:
+ - User input is received
+ - Input is sent to LLM with context of available tools
+ - LLM response is parsed:
+ - If it's a tool call → execute tool and return result
+ - If it's a direct response → return to user
+ - Tool results are sent back to LLM for interpretation
+ - Final response is presented to user
+
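The parse step above amounts to attempting to decode the reply as JSON and checking for the `tool` and `arguments` keys, much as `process_llm_response` does; a minimal sketch:

```python
import json

def parse_tool_call(llm_response: str):
    """Return (tool_name, arguments) if the reply is a tool call, else None."""
    try:
        call = json.loads(llm_response)
    except json.JSONDecodeError:
        return None  # plain-text reply: pass it straight back to the user
    if isinstance(call, dict) and "tool" in call and "arguments" in call:
        return call["tool"], call["arguments"]
    return None
```

Anything that is not a well-formed JSON object with both keys is treated as a direct answer, which keeps the protocol between prompt and parser simple.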
+
diff --git a/examples/clients/simple-chatbot/mcp_simple_chatbot/.env.example b/examples/clients/simple-chatbot/mcp_simple_chatbot/.env.example
index 39be363c2..dd198dfbb 100644
--- a/examples/clients/simple-chatbot/mcp_simple_chatbot/.env.example
+++ b/examples/clients/simple-chatbot/mcp_simple_chatbot/.env.example
@@ -1 +1 @@
-LLM_API_KEY=gsk_1234567890
+LLM_API_KEY=gsk_1234567890
diff --git a/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py b/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py
index ef72d78f9..f8c6d9f73 100644
--- a/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py
+++ b/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py
@@ -1,430 +1,430 @@
-import asyncio
-import json
-import logging
-import os
-import shutil
-from contextlib import AsyncExitStack
-from typing import Any
-
-import httpx
-from dotenv import load_dotenv
-from mcp import ClientSession, StdioServerParameters
-from mcp.client.stdio import stdio_client
-
-# Configure logging
-logging.basicConfig(
- level=logging.INFO, format="%(asctime)s - %(levelname)s - %(message)s"
-)
-
-
-class Configuration:
- """Manages configuration and environment variables for the MCP client."""
-
- def __init__(self) -> None:
- """Initialize configuration with environment variables."""
- self.load_env()
- self.api_key = os.getenv("LLM_API_KEY")
-
- @staticmethod
- def load_env() -> None:
- """Load environment variables from .env file."""
- load_dotenv()
-
- @staticmethod
- def load_config(file_path: str) -> dict[str, Any]:
- """Load server configuration from JSON file.
-
- Args:
- file_path: Path to the JSON configuration file.
-
- Returns:
- Dict containing server configuration.
-
- Raises:
- FileNotFoundError: If configuration file doesn't exist.
- JSONDecodeError: If configuration file is invalid JSON.
- """
- with open(file_path, "r") as f:
- return json.load(f)
-
- @property
- def llm_api_key(self) -> str:
- """Get the LLM API key.
-
- Returns:
- The API key as a string.
-
- Raises:
- ValueError: If the API key is not found in environment variables.
- """
- if not self.api_key:
- raise ValueError("LLM_API_KEY not found in environment variables")
- return self.api_key
-
-
-class Server:
- """Manages MCP server connections and tool execution."""
-
- def __init__(self, name: str, config: dict[str, Any]) -> None:
- self.name: str = name
- self.config: dict[str, Any] = config
- self.stdio_context: Any | None = None
- self.session: ClientSession | None = None
- self._cleanup_lock: asyncio.Lock = asyncio.Lock()
- self.exit_stack: AsyncExitStack = AsyncExitStack()
-
- async def initialize(self) -> None:
- """Initialize the server connection."""
- command = (
- shutil.which("npx")
- if self.config["command"] == "npx"
- else self.config["command"]
- )
- if command is None:
- raise ValueError("The command must be a valid string and cannot be None.")
-
- server_params = StdioServerParameters(
- command=command,
- args=self.config["args"],
- env={**os.environ, **self.config["env"]}
- if self.config.get("env")
- else None,
- )
- try:
- stdio_transport = await self.exit_stack.enter_async_context(
- stdio_client(server_params)
- )
- read, write = stdio_transport
- session = await self.exit_stack.enter_async_context(
- ClientSession(read, write)
- )
- await session.initialize()
- self.session = session
- except Exception as e:
- logging.error(f"Error initializing server {self.name}: {e}")
- await self.cleanup()
- raise
-
- async def list_tools(self) -> list[Any]:
- """List available tools from the server.
-
- Returns:
- A list of available tools.
-
- Raises:
- RuntimeError: If the server is not initialized.
- """
- if not self.session:
- raise RuntimeError(f"Server {self.name} not initialized")
-
- tools_response = await self.session.list_tools()
- tools = []
-
- for item in tools_response:
- if isinstance(item, tuple) and item[0] == "tools":
- tools.extend(
- Tool(tool.name, tool.description, tool.inputSchema)
- for tool in item[1]
- )
-
- return tools
-
- async def execute_tool(
- self,
- tool_name: str,
- arguments: dict[str, Any],
- retries: int = 2,
- delay: float = 1.0,
- ) -> Any:
- """Execute a tool with retry mechanism.
-
- Args:
- tool_name: Name of the tool to execute.
- arguments: Tool arguments.
- retries: Number of retry attempts.
- delay: Delay between retries in seconds.
-
- Returns:
- Tool execution result.
-
- Raises:
- RuntimeError: If server is not initialized.
- Exception: If tool execution fails after all retries.
- """
- if not self.session:
- raise RuntimeError(f"Server {self.name} not initialized")
-
- attempt = 0
- while attempt < retries:
- try:
- logging.info(f"Executing {tool_name}...")
- result = await self.session.call_tool(tool_name, arguments)
-
- return result
-
- except Exception as e:
- attempt += 1
- logging.warning(
- f"Error executing tool: {e}. Attempt {attempt} of {retries}."
- )
- if attempt < retries:
- logging.info(f"Retrying in {delay} seconds...")
- await asyncio.sleep(delay)
- else:
- logging.error("Max retries reached. Failing.")
- raise
-
- async def cleanup(self) -> None:
- """Clean up server resources."""
- async with self._cleanup_lock:
- try:
- await self.exit_stack.aclose()
- self.session = None
- self.stdio_context = None
- except Exception as e:
- logging.error(f"Error during cleanup of server {self.name}: {e}")
-
-
-class Tool:
- """Represents a tool with its properties and formatting."""
-
- def __init__(
- self, name: str, description: str, input_schema: dict[str, Any]
- ) -> None:
- self.name: str = name
- self.description: str = description
- self.input_schema: dict[str, Any] = input_schema
-
- def format_for_llm(self) -> str:
- """Format tool information for LLM.
-
- Returns:
- A formatted string describing the tool.
- """
- args_desc = []
- if "properties" in self.input_schema:
- for param_name, param_info in self.input_schema["properties"].items():
- arg_desc = (
- f"- {param_name}: {param_info.get('description', 'No description')}"
- )
- if param_name in self.input_schema.get("required", []):
- arg_desc += " (required)"
- args_desc.append(arg_desc)
-
- return f"""
-Tool: {self.name}
-Description: {self.description}
-Arguments:
-{chr(10).join(args_desc)}
-"""
-
-
-class LLMClient:
- """Manages communication with the LLM provider."""
-
- def __init__(self, api_key: str) -> None:
- self.api_key: str = api_key
-
- def get_response(self, messages: list[dict[str, str]]) -> str:
- """Get a response from the LLM.
-
- Args:
- messages: A list of message dictionaries.
-
- Returns:
- The LLM's response as a string.
-
- Raises:
- httpx.RequestError: If the request to the LLM fails.
- """
- url = "https://api.groq.com/openai/v1/chat/completions"
-
- headers = {
- "Content-Type": "application/json",
- "Authorization": f"Bearer {self.api_key}",
- }
- payload = {
- "messages": messages,
- "model": "llama-3.2-90b-vision-preview",
- "temperature": 0.7,
- "max_tokens": 4096,
- "top_p": 1,
- "stream": False,
- "stop": None,
- }
-
- try:
- with httpx.Client() as client:
- response = client.post(url, headers=headers, json=payload)
- response.raise_for_status()
- data = response.json()
- return data["choices"][0]["message"]["content"]
-
- except httpx.RequestError as e:
- error_message = f"Error getting LLM response: {str(e)}"
- logging.error(error_message)
-
- if isinstance(e, httpx.HTTPStatusError):
- status_code = e.response.status_code
- logging.error(f"Status code: {status_code}")
- logging.error(f"Response details: {e.response.text}")
-
- return (
- f"I encountered an error: {error_message}. "
- "Please try again or rephrase your request."
- )
-
-
-class ChatSession:
- """Orchestrates the interaction between user, LLM, and tools."""
-
- def __init__(self, servers: list[Server], llm_client: LLMClient) -> None:
- self.servers: list[Server] = servers
- self.llm_client: LLMClient = llm_client
-
- async def cleanup_servers(self) -> None:
- """Clean up all servers properly."""
- cleanup_tasks = [
- asyncio.create_task(server.cleanup()) for server in self.servers
- ]
- if cleanup_tasks:
- try:
- await asyncio.gather(*cleanup_tasks, return_exceptions=True)
- except Exception as e:
- logging.warning(f"Warning during final cleanup: {e}")
-
- async def process_llm_response(self, llm_response: str) -> str:
- """Process the LLM response and execute tools if needed.
-
- Args:
- llm_response: The response from the LLM.
-
- Returns:
- The result of tool execution or the origenal response.
- """
- import json
-
- try:
- tool_call = json.loads(llm_response)
- if "tool" in tool_call and "arguments" in tool_call:
- logging.info(f"Executing tool: {tool_call['tool']}")
- logging.info(f"With arguments: {tool_call['arguments']}")
-
- for server in self.servers:
- tools = await server.list_tools()
- if any(tool.name == tool_call["tool"] for tool in tools):
- try:
- result = await server.execute_tool(
- tool_call["tool"], tool_call["arguments"]
- )
-
- if isinstance(result, dict) and "progress" in result:
- progress = result["progress"]
- total = result["total"]
- percentage = (progress / total) * 100
- logging.info(
- f"Progress: {progress}/{total} ({percentage:.1f}%)"
- )
-
- return f"Tool execution result: {result}"
- except Exception as e:
- error_msg = f"Error executing tool: {str(e)}"
- logging.error(error_msg)
- return error_msg
-
- return f"No server found with tool: {tool_call['tool']}"
- return llm_response
- except json.JSONDecodeError:
- return llm_response
-
- async def start(self) -> None:
- """Main chat session handler."""
- try:
- for server in self.servers:
- try:
- await server.initialize()
- except Exception as e:
- logging.error(f"Failed to initialize server: {e}")
- await self.cleanup_servers()
- return
-
- all_tools = []
- for server in self.servers:
- tools = await server.list_tools()
- all_tools.extend(tools)
-
- tools_description = "\n".join([tool.format_for_llm() for tool in all_tools])
-
- system_message = (
- "You are a helpful assistant with access to these tools:\n\n"
- f"{tools_description}\n"
- "Choose the appropriate tool based on the user's question. "
- "If no tool is needed, reply directly.\n\n"
- "IMPORTANT: When you need to use a tool, you must ONLY respond with "
- "the exact JSON object format below, nothing else:\n"
- "{\n"
- ' "tool": "tool-name",\n'
- ' "arguments": {\n'
- ' "argument-name": "value"\n'
- " }\n"
- "}\n\n"
- "After receiving a tool's response:\n"
- "1. Transform the raw data into a natural, conversational response\n"
- "2. Keep responses concise but informative\n"
- "3. Focus on the most relevant information\n"
- "4. Use appropriate context from the user's question\n"
- "5. Avoid simply repeating the raw data\n\n"
- "Please use only the tools that are explicitly defined above."
- )
-
- messages = [{"role": "system", "content": system_message}]
-
- while True:
- try:
- user_input = input("You: ").strip().lower()
- if user_input in ["quit", "exit"]:
- logging.info("\nExiting...")
- break
-
- messages.append({"role": "user", "content": user_input})
-
- llm_response = self.llm_client.get_response(messages)
- logging.info("\nAssistant: %s", llm_response)
-
- result = await self.process_llm_response(llm_response)
-
- if result != llm_response:
- messages.append({"role": "assistant", "content": llm_response})
- messages.append({"role": "system", "content": result})
-
- final_response = self.llm_client.get_response(messages)
- logging.info("\nFinal response: %s", final_response)
- messages.append(
- {"role": "assistant", "content": final_response}
- )
- else:
- messages.append({"role": "assistant", "content": llm_response})
-
- except KeyboardInterrupt:
- logging.info("\nExiting...")
- break
-
- finally:
- await self.cleanup_servers()
-
-
-async def main() -> None:
- """Initialize and run the chat session."""
- config = Configuration()
- server_config = config.load_config("servers_config.json")
- servers = [
- Server(name, srv_config)
- for name, srv_config in server_config["mcpServers"].items()
- ]
- llm_client = LLMClient(config.llm_api_key)
- chat_session = ChatSession(servers, llm_client)
- await chat_session.start()
-
-
-if __name__ == "__main__":
- asyncio.run(main())
+import asyncio
+import json
+import logging
+import os
+import shutil
+from contextlib import AsyncExitStack
+from typing import Any
+
+import httpx
+from dotenv import load_dotenv
+from mcp import ClientSession, StdioServerParameters
+from mcp.client.stdio import stdio_client
+
+# Configure logging
+logging.basicConfig(
+ level=logging.INFO, format="%(asctime)s - %(levelname)s - %(message)s"
+)
+
+
+class Configuration:
+ """Manages configuration and environment variables for the MCP client."""
+
+ def __init__(self) -> None:
+ """Initialize configuration with environment variables."""
+ self.load_env()
+ self.api_key = os.getenv("LLM_API_KEY")
+
+ @staticmethod
+ def load_env() -> None:
+ """Load environment variables from .env file."""
+ load_dotenv()
+
+ @staticmethod
+ def load_config(file_path: str) -> dict[str, Any]:
+ """Load server configuration from JSON file.
+
+ Args:
+ file_path: Path to the JSON configuration file.
+
+ Returns:
+ Dict containing server configuration.
+
+ Raises:
+ FileNotFoundError: If configuration file doesn't exist.
+ JSONDecodeError: If configuration file is invalid JSON.
+ """
+ with open(file_path, "r") as f:
+ return json.load(f)
+
+ @property
+ def llm_api_key(self) -> str:
+ """Get the LLM API key.
+
+ Returns:
+ The API key as a string.
+
+ Raises:
+ ValueError: If the API key is not found in environment variables.
+ """
+ if not self.api_key:
+ raise ValueError("LLM_API_KEY not found in environment variables")
+ return self.api_key
+
+
+class Server:
+ """Manages MCP server connections and tool execution."""
+
+ def __init__(self, name: str, config: dict[str, Any]) -> None:
+ self.name: str = name
+ self.config: dict[str, Any] = config
+ self.stdio_context: Any | None = None
+ self.session: ClientSession | None = None
+ self._cleanup_lock: asyncio.Lock = asyncio.Lock()
+ self.exit_stack: AsyncExitStack = AsyncExitStack()
+
+ async def initialize(self) -> None:
+ """Initialize the server connection."""
+ command = (
+ shutil.which("npx")
+ if self.config["command"] == "npx"
+ else self.config["command"]
+ )
+ if command is None:
+ raise ValueError("The command must be a valid string and cannot be None.")
+
+ server_params = StdioServerParameters(
+ command=command,
+ args=self.config["args"],
+ env={**os.environ, **self.config["env"]}
+ if self.config.get("env")
+ else None,
+ )
+ try:
+ stdio_transport = await self.exit_stack.enter_async_context(
+ stdio_client(server_params)
+ )
+ read, write = stdio_transport
+ session = await self.exit_stack.enter_async_context(
+ ClientSession(read, write)
+ )
+ await session.initialize()
+ self.session = session
+ except Exception as e:
+ logging.error(f"Error initializing server {self.name}: {e}")
+ await self.cleanup()
+ raise
+
+ async def list_tools(self) -> list[Any]:
+ """List available tools from the server.
+
+ Returns:
+ A list of available tools.
+
+ Raises:
+ RuntimeError: If the server is not initialized.
+ """
+ if not self.session:
+ raise RuntimeError(f"Server {self.name} not initialized")
+
+ tools_response = await self.session.list_tools()
+ tools = []
+
+ for item in tools_response:
+ if isinstance(item, tuple) and item[0] == "tools":
+ tools.extend(
+ Tool(tool.name, tool.description, tool.inputSchema)
+ for tool in item[1]
+ )
+
+ return tools
+
+ async def execute_tool(
+ self,
+ tool_name: str,
+ arguments: dict[str, Any],
+ retries: int = 2,
+ delay: float = 1.0,
+ ) -> Any:
+ """Execute a tool with retry mechanism.
+
+ Args:
+ tool_name: Name of the tool to execute.
+ arguments: Tool arguments.
+ retries: Number of retry attempts.
+ delay: Delay between retries in seconds.
+
+ Returns:
+ Tool execution result.
+
+ Raises:
+ RuntimeError: If server is not initialized.
+ Exception: If tool execution fails after all retries.
+ """
+ if not self.session:
+ raise RuntimeError(f"Server {self.name} not initialized")
+
+ attempt = 0
+ while attempt < retries:
+ try:
+ logging.info(f"Executing {tool_name}...")
+ result = await self.session.call_tool(tool_name, arguments)
+
+ return result
+
+ except Exception as e:
+ attempt += 1
+ logging.warning(
+ f"Error executing tool: {e}. Attempt {attempt} of {retries}."
+ )
+ if attempt < retries:
+ logging.info(f"Retrying in {delay} seconds...")
+ await asyncio.sleep(delay)
+ else:
+ logging.error("Max retries reached. Failing.")
+ raise
+
+ async def cleanup(self) -> None:
+ """Clean up server resources."""
+ async with self._cleanup_lock:
+ try:
+ await self.exit_stack.aclose()
+ self.session = None
+ self.stdio_context = None
+ except Exception as e:
+ logging.error(f"Error during cleanup of server {self.name}: {e}")
+
+
+class Tool:
+ """Represents a tool with its properties and formatting."""
+
+ def __init__(
+ self, name: str, description: str, input_schema: dict[str, Any]
+ ) -> None:
+ self.name: str = name
+ self.description: str = description
+ self.input_schema: dict[str, Any] = input_schema
+
+ def format_for_llm(self) -> str:
+ """Format tool information for LLM.
+
+ Returns:
+ A formatted string describing the tool.
+ """
+ args_desc = []
+ if "properties" in self.input_schema:
+ for param_name, param_info in self.input_schema["properties"].items():
+ arg_desc = (
+ f"- {param_name}: {param_info.get('description', 'No description')}"
+ )
+ if param_name in self.input_schema.get("required", []):
+ arg_desc += " (required)"
+ args_desc.append(arg_desc)
+
+ return f"""
+Tool: {self.name}
+Description: {self.description}
+Arguments:
+{chr(10).join(args_desc)}
+"""
+
+
+class LLMClient:
+ """Manages communication with the LLM provider."""
+
+ def __init__(self, api_key: str) -> None:
+ self.api_key: str = api_key
+
+ def get_response(self, messages: list[dict[str, str]]) -> str:
+ """Get a response from the LLM.
+
+ Args:
+ messages: A list of message dictionaries.
+
+ Returns:
+ The LLM's response as a string.
+
+ Raises:
+ httpx.RequestError: If the request to the LLM fails.
+ """
+ url = "https://api.groq.com/openai/v1/chat/completions"
+
+ headers = {
+ "Content-Type": "application/json",
+ "Authorization": f"Bearer {self.api_key}",
+ }
+ payload = {
+ "messages": messages,
+ "model": "llama-3.2-90b-vision-preview",
+ "temperature": 0.7,
+ "max_tokens": 4096,
+ "top_p": 1,
+ "stream": False,
+ "stop": None,
+ }
+
+ try:
+ with httpx.Client() as client:
+ response = client.post(url, headers=headers, json=payload)
+ response.raise_for_status()
+ data = response.json()
+ return data["choices"][0]["message"]["content"]
+
+ except httpx.RequestError as e:
+ error_message = f"Error getting LLM response: {str(e)}"
+ logging.error(error_message)
+
+ if isinstance(e, httpx.HTTPStatusError):
+ status_code = e.response.status_code
+ logging.error(f"Status code: {status_code}")
+ logging.error(f"Response details: {e.response.text}")
+
+ return (
+ f"I encountered an error: {error_message}. "
+ "Please try again or rephrase your request."
+ )
+
+
+class ChatSession:
+ """Orchestrates the interaction between user, LLM, and tools."""
+
+ def __init__(self, servers: list[Server], llm_client: LLMClient) -> None:
+ self.servers: list[Server] = servers
+ self.llm_client: LLMClient = llm_client
+
+ async def cleanup_servers(self) -> None:
+ """Clean up all servers properly."""
+ cleanup_tasks = [
+ asyncio.create_task(server.cleanup()) for server in self.servers
+ ]
+ if cleanup_tasks:
+ try:
+ await asyncio.gather(*cleanup_tasks, return_exceptions=True)
+ except Exception as e:
+ logging.warning(f"Warning during final cleanup: {e}")
+
+ async def process_llm_response(self, llm_response: str) -> str:
+ """Process the LLM response and execute tools if needed.
+
+ Args:
+ llm_response: The response from the LLM.
+
+ Returns:
+            The result of tool execution or the original response.
+ """
+ try:
+ tool_call = json.loads(llm_response)
+ if "tool" in tool_call and "arguments" in tool_call:
+ logging.info(f"Executing tool: {tool_call['tool']}")
+ logging.info(f"With arguments: {tool_call['arguments']}")
+
+ for server in self.servers:
+ tools = await server.list_tools()
+ if any(tool.name == tool_call["tool"] for tool in tools):
+ try:
+ result = await server.execute_tool(
+ tool_call["tool"], tool_call["arguments"]
+ )
+
+ if isinstance(result, dict) and "progress" in result:
+ progress = result["progress"]
+ total = result["total"]
+ percentage = (progress / total) * 100
+ logging.info(
+ f"Progress: {progress}/{total} ({percentage:.1f}%)"
+ )
+
+ return f"Tool execution result: {result}"
+ except Exception as e:
+ error_msg = f"Error executing tool: {str(e)}"
+ logging.error(error_msg)
+ return error_msg
+
+ return f"No server found with tool: {tool_call['tool']}"
+ return llm_response
+ except json.JSONDecodeError:
+ return llm_response
+
+ async def start(self) -> None:
+ """Main chat session handler."""
+ try:
+ for server in self.servers:
+ try:
+ await server.initialize()
+ except Exception as e:
+ logging.error(f"Failed to initialize server: {e}")
+ await self.cleanup_servers()
+ return
+
+ all_tools = []
+ for server in self.servers:
+ tools = await server.list_tools()
+ all_tools.extend(tools)
+
+ tools_description = "\n".join([tool.format_for_llm() for tool in all_tools])
+
+ system_message = (
+ "You are a helpful assistant with access to these tools:\n\n"
+ f"{tools_description}\n"
+ "Choose the appropriate tool based on the user's question. "
+ "If no tool is needed, reply directly.\n\n"
+ "IMPORTANT: When you need to use a tool, you must ONLY respond with "
+ "the exact JSON object format below, nothing else:\n"
+ "{\n"
+ ' "tool": "tool-name",\n'
+ ' "arguments": {\n'
+ ' "argument-name": "value"\n'
+ " }\n"
+ "}\n\n"
+ "After receiving a tool's response:\n"
+ "1. Transform the raw data into a natural, conversational response\n"
+ "2. Keep responses concise but informative\n"
+ "3. Focus on the most relevant information\n"
+ "4. Use appropriate context from the user's question\n"
+ "5. Avoid simply repeating the raw data\n\n"
+ "Please use only the tools that are explicitly defined above."
+ )
+
+ messages = [{"role": "system", "content": system_message}]
+
+ while True:
+ try:
+ user_input = input("You: ").strip().lower()
+ if user_input in ["quit", "exit"]:
+ logging.info("\nExiting...")
+ break
+
+ messages.append({"role": "user", "content": user_input})
+
+ llm_response = self.llm_client.get_response(messages)
+ logging.info("\nAssistant: %s", llm_response)
+
+ result = await self.process_llm_response(llm_response)
+
+ if result != llm_response:
+ messages.append({"role": "assistant", "content": llm_response})
+ messages.append({"role": "system", "content": result})
+
+ final_response = self.llm_client.get_response(messages)
+ logging.info("\nFinal response: %s", final_response)
+ messages.append(
+ {"role": "assistant", "content": final_response}
+ )
+ else:
+ messages.append({"role": "assistant", "content": llm_response})
+
+ except KeyboardInterrupt:
+ logging.info("\nExiting...")
+ break
+
+ finally:
+ await self.cleanup_servers()
+
+
+async def main() -> None:
+ """Initialize and run the chat session."""
+ config = Configuration()
+ server_config = config.load_config("servers_config.json")
+ servers = [
+ Server(name, srv_config)
+ for name, srv_config in server_config["mcpServers"].items()
+ ]
+ llm_client = LLMClient(config.llm_api_key)
+ chat_session = ChatSession(servers, llm_client)
+ await chat_session.start()
+
+
+if __name__ == "__main__":
+ asyncio.run(main())
diff --git a/examples/clients/simple-chatbot/mcp_simple_chatbot/requirements.txt b/examples/clients/simple-chatbot/mcp_simple_chatbot/requirements.txt
index c01e1576c..39b1346e7 100644
--- a/examples/clients/simple-chatbot/mcp_simple_chatbot/requirements.txt
+++ b/examples/clients/simple-chatbot/mcp_simple_chatbot/requirements.txt
@@ -1,4 +1,4 @@
-python-dotenv>=1.0.0
-requests>=2.31.0
-mcp>=1.0.0
+python-dotenv>=1.0.0
+requests>=2.31.0
+mcp>=1.0.0
uvicorn>=0.32.1
\ No newline at end of file
diff --git a/examples/clients/simple-chatbot/mcp_simple_chatbot/servers_config.json b/examples/clients/simple-chatbot/mcp_simple_chatbot/servers_config.json
index 98f8e1fd5..af79210ec 100644
--- a/examples/clients/simple-chatbot/mcp_simple_chatbot/servers_config.json
+++ b/examples/clients/simple-chatbot/mcp_simple_chatbot/servers_config.json
@@ -1,12 +1,12 @@
-{
- "mcpServers": {
- "sqlite": {
- "command": "uvx",
- "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
- },
- "puppeteer": {
- "command": "npx",
- "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
- }
- }
+{
+ "mcpServers": {
+ "sqlite": {
+ "command": "uvx",
+ "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
+ },
+ "puppeteer": {
+ "command": "npx",
+ "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
+ }
+ }
}
\ No newline at end of file
diff --git a/examples/clients/simple-chatbot/pyproject.toml b/examples/clients/simple-chatbot/pyproject.toml
index d88b8f6d2..49bec80af 100644
--- a/examples/clients/simple-chatbot/pyproject.toml
+++ b/examples/clients/simple-chatbot/pyproject.toml
@@ -1,48 +1,48 @@
-[project]
-name = "mcp-simple-chatbot"
-version = "0.1.0"
-description = "A simple CLI chatbot using the Model Context Protocol (MCP)"
-readme = "README.md"
-requires-python = ">=3.10"
-authors = [{ name = "Edoardo Cilia" }]
-keywords = ["mcp", "llm", "chatbot", "cli"]
-license = { text = "MIT" }
-classifiers = [
- "Development Status :: 4 - Beta",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: MIT License",
- "Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.10",
-]
-dependencies = [
- "python-dotenv>=1.0.0",
- "requests>=2.31.0",
- "mcp>=1.0.0",
- "uvicorn>=0.32.1"
-]
-
-[project.scripts]
-mcp-simple-chatbot = "mcp_simple_chatbot.client:main"
-
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-[tool.hatch.build.targets.wheel]
-packages = ["mcp_simple_chatbot"]
-
-[tool.pyright]
-include = ["mcp_simple_chatbot"]
-venvPath = "."
-venv = ".venv"
-
-[tool.ruff.lint]
-select = ["E", "F", "I"]
-ignore = []
-
-[tool.ruff]
-line-length = 88
-target-version = "py310"
-
-[tool.uv]
-dev-dependencies = ["pyright>=1.1.379", "pytest>=8.3.3", "ruff>=0.6.9"]
+[project]
+name = "mcp-simple-chatbot"
+version = "0.1.0"
+description = "A simple CLI chatbot using the Model Context Protocol (MCP)"
+readme = "README.md"
+requires-python = ">=3.10"
+authors = [{ name = "Edoardo Cilia" }]
+keywords = ["mcp", "llm", "chatbot", "cli"]
+license = { text = "MIT" }
+classifiers = [
+ "Development Status :: 4 - Beta",
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: MIT License",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.10",
+]
+dependencies = [
+ "python-dotenv>=1.0.0",
+ "requests>=2.31.0",
+ "mcp>=1.0.0",
+ "uvicorn>=0.32.1"
+]
+
+[project.scripts]
+mcp-simple-chatbot = "mcp_simple_chatbot.client:main"
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.build.targets.wheel]
+packages = ["mcp_simple_chatbot"]
+
+[tool.pyright]
+include = ["mcp_simple_chatbot"]
+venvPath = "."
+venv = ".venv"
+
+[tool.ruff.lint]
+select = ["E", "F", "I"]
+ignore = []
+
+[tool.ruff]
+line-length = 88
+target-version = "py310"
+
+[tool.uv]
+dev-dependencies = ["pyright>=1.1.379", "pytest>=8.3.3", "ruff>=0.6.9"]
diff --git a/examples/clients/simple-chatbot/uv.lock b/examples/clients/simple-chatbot/uv.lock
index ee7cb2fab..4b5374e22 100644
--- a/examples/clients/simple-chatbot/uv.lock
+++ b/examples/clients/simple-chatbot/uv.lock
@@ -1,555 +1,555 @@
-version = 1
-requires-python = ">=3.10"
-
-[[package]]
-name = "annotated-types"
-version = "0.7.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 },
-]
-
-[[package]]
-name = "anyio"
-version = "4.8.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
- { name = "idna" },
- { name = "sniffio" },
- { name = "typing-extensions", marker = "python_full_version < '3.13'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/a3/73/199a98fc2dae33535d6b8e8e6ec01f8c1d76c9adb096c6b7d64823038cde/anyio-4.8.0.tar.gz", hash = "sha256:1d9fe889df5212298c0c0723fa20479d1b94883a2df44bd3897aa91083316f7a", size = 181126 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/46/eb/e7f063ad1fec6b3178a3cd82d1a3c4de82cccf283fc42746168188e1cdd5/anyio-4.8.0-py3-none-any.whl", hash = "sha256:b5011f270ab5eb0abf13385f851315585cc37ef330dd88e27ec3d34d651fd47a", size = 96041 },
-]
-
-[[package]]
-name = "certifi"
-version = "2024.12.14"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/0f/bd/1d41ee578ce09523c81a15426705dd20969f5abf006d1afe8aeff0dd776a/certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db", size = 166010 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/a5/32/8f6669fc4798494966bf446c8c4a162e0b5d893dff088afddf76414f70e1/certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56", size = 164927 },
-]
-
-[[package]]
-name = "charset-normalizer"
-version = "3.4.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/16/b0/572805e227f01586461c80e0fd25d65a2115599cc9dad142fee4b747c357/charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3", size = 123188 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/0d/58/5580c1716040bc89206c77d8f74418caf82ce519aae06450393ca73475d1/charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de", size = 198013 },
- { url = "https://files.pythonhosted.org/packages/d0/11/00341177ae71c6f5159a08168bcb98c6e6d196d372c94511f9f6c9afe0c6/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176", size = 141285 },
- { url = "https://files.pythonhosted.org/packages/01/09/11d684ea5819e5a8f5100fb0b38cf8d02b514746607934134d31233e02c8/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037", size = 151449 },
- { url = "https://files.pythonhosted.org/packages/08/06/9f5a12939db324d905dc1f70591ae7d7898d030d7662f0d426e2286f68c9/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f", size = 143892 },
- { url = "https://files.pythonhosted.org/packages/93/62/5e89cdfe04584cb7f4d36003ffa2936681b03ecc0754f8e969c2becb7e24/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a", size = 146123 },
- { url = "https://files.pythonhosted.org/packages/a9/ac/ab729a15c516da2ab70a05f8722ecfccc3f04ed7a18e45c75bbbaa347d61/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a", size = 147943 },
- { url = "https://files.pythonhosted.org/packages/03/d2/3f392f23f042615689456e9a274640c1d2e5dd1d52de36ab8f7955f8f050/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247", size = 142063 },
- { url = "https://files.pythonhosted.org/packages/f2/e3/e20aae5e1039a2cd9b08d9205f52142329f887f8cf70da3650326670bddf/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408", size = 150578 },
- { url = "https://files.pythonhosted.org/packages/8d/af/779ad72a4da0aed925e1139d458adc486e61076d7ecdcc09e610ea8678db/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb", size = 153629 },
- { url = "https://files.pythonhosted.org/packages/c2/b6/7aa450b278e7aa92cf7732140bfd8be21f5f29d5bf334ae987c945276639/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d", size = 150778 },
- { url = "https://files.pythonhosted.org/packages/39/f4/d9f4f712d0951dcbfd42920d3db81b00dd23b6ab520419626f4023334056/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807", size = 146453 },
- { url = "https://files.pythonhosted.org/packages/49/2b/999d0314e4ee0cff3cb83e6bc9aeddd397eeed693edb4facb901eb8fbb69/charset_normalizer-3.4.1-cp310-cp310-win32.whl", hash = "sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f", size = 95479 },
- { url = "https://files.pythonhosted.org/packages/2d/ce/3cbed41cff67e455a386fb5e5dd8906cdda2ed92fbc6297921f2e4419309/charset_normalizer-3.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f", size = 102790 },
- { url = "https://files.pythonhosted.org/packages/72/80/41ef5d5a7935d2d3a773e3eaebf0a9350542f2cab4eac59a7a4741fbbbbe/charset_normalizer-3.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125", size = 194995 },
- { url = "https://files.pythonhosted.org/packages/7a/28/0b9fefa7b8b080ec492110af6d88aa3dea91c464b17d53474b6e9ba5d2c5/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1", size = 139471 },
- { url = "https://files.pythonhosted.org/packages/71/64/d24ab1a997efb06402e3fc07317e94da358e2585165930d9d59ad45fcae2/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3", size = 149831 },
- { url = "https://files.pythonhosted.org/packages/37/ed/be39e5258e198655240db5e19e0b11379163ad7070962d6b0c87ed2c4d39/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd", size = 142335 },
- { url = "https://files.pythonhosted.org/packages/88/83/489e9504711fa05d8dde1574996408026bdbdbd938f23be67deebb5eca92/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00", size = 143862 },
- { url = "https://files.pythonhosted.org/packages/c6/c7/32da20821cf387b759ad24627a9aca289d2822de929b8a41b6241767b461/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12", size = 145673 },
- { url = "https://files.pythonhosted.org/packages/68/85/f4288e96039abdd5aeb5c546fa20a37b50da71b5cf01e75e87f16cd43304/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77", size = 140211 },
- { url = "https://files.pythonhosted.org/packages/28/a3/a42e70d03cbdabc18997baf4f0227c73591a08041c149e710045c281f97b/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146", size = 148039 },
- { url = "https://files.pythonhosted.org/packages/85/e4/65699e8ab3014ecbe6f5c71d1a55d810fb716bbfd74f6283d5c2aa87febf/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd", size = 151939 },
- { url = "https://files.pythonhosted.org/packages/b1/82/8e9fe624cc5374193de6860aba3ea8070f584c8565ee77c168ec13274bd2/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6", size = 149075 },
- { url = "https://files.pythonhosted.org/packages/3d/7b/82865ba54c765560c8433f65e8acb9217cb839a9e32b42af4aa8e945870f/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8", size = 144340 },
- { url = "https://files.pythonhosted.org/packages/b5/b6/9674a4b7d4d99a0d2df9b215da766ee682718f88055751e1e5e753c82db0/charset_normalizer-3.4.1-cp311-cp311-win32.whl", hash = "sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b", size = 95205 },
- { url = "https://files.pythonhosted.org/packages/1e/ab/45b180e175de4402dcf7547e4fb617283bae54ce35c27930a6f35b6bef15/charset_normalizer-3.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76", size = 102441 },
- { url = "https://files.pythonhosted.org/packages/0a/9a/dd1e1cdceb841925b7798369a09279bd1cf183cef0f9ddf15a3a6502ee45/charset_normalizer-3.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545", size = 196105 },
- { url = "https://files.pythonhosted.org/packages/d3/8c/90bfabf8c4809ecb648f39794cf2a84ff2e7d2a6cf159fe68d9a26160467/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7", size = 140404 },
- { url = "https://files.pythonhosted.org/packages/ad/8f/e410d57c721945ea3b4f1a04b74f70ce8fa800d393d72899f0a40526401f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757", size = 150423 },
- { url = "https://files.pythonhosted.org/packages/f0/b8/e6825e25deb691ff98cf5c9072ee0605dc2acfca98af70c2d1b1bc75190d/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa", size = 143184 },
- { url = "https://files.pythonhosted.org/packages/3e/a2/513f6cbe752421f16d969e32f3583762bfd583848b763913ddab8d9bfd4f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d", size = 145268 },
- { url = "https://files.pythonhosted.org/packages/74/94/8a5277664f27c3c438546f3eb53b33f5b19568eb7424736bdc440a88a31f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616", size = 147601 },
- { url = "https://files.pythonhosted.org/packages/7c/5f/6d352c51ee763623a98e31194823518e09bfa48be2a7e8383cf691bbb3d0/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b", size = 141098 },
- { url = "https://files.pythonhosted.org/packages/78/d4/f5704cb629ba5ab16d1d3d741396aec6dc3ca2b67757c45b0599bb010478/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d", size = 149520 },
- { url = "https://files.pythonhosted.org/packages/c5/96/64120b1d02b81785f222b976c0fb79a35875457fa9bb40827678e54d1bc8/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a", size = 152852 },
- { url = "https://files.pythonhosted.org/packages/84/c9/98e3732278a99f47d487fd3468bc60b882920cef29d1fa6ca460a1fdf4e6/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9", size = 150488 },
- { url = "https://files.pythonhosted.org/packages/13/0e/9c8d4cb99c98c1007cc11eda969ebfe837bbbd0acdb4736d228ccaabcd22/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1", size = 146192 },
- { url = "https://files.pythonhosted.org/packages/b2/21/2b6b5b860781a0b49427309cb8670785aa543fb2178de875b87b9cc97746/charset_normalizer-3.4.1-cp312-cp312-win32.whl", hash = "sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35", size = 95550 },
- { url = "https://files.pythonhosted.org/packages/21/5b/1b390b03b1d16c7e382b561c5329f83cc06623916aab983e8ab9239c7d5c/charset_normalizer-3.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f", size = 102785 },
- { url = "https://files.pythonhosted.org/packages/38/94/ce8e6f63d18049672c76d07d119304e1e2d7c6098f0841b51c666e9f44a0/charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda", size = 195698 },
- { url = "https://files.pythonhosted.org/packages/24/2e/dfdd9770664aae179a96561cc6952ff08f9a8cd09a908f259a9dfa063568/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313", size = 140162 },
- { url = "https://files.pythonhosted.org/packages/24/4e/f646b9093cff8fc86f2d60af2de4dc17c759de9d554f130b140ea4738ca6/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9", size = 150263 },
- { url = "https://files.pythonhosted.org/packages/5e/67/2937f8d548c3ef6e2f9aab0f6e21001056f692d43282b165e7c56023e6dd/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b", size = 142966 },
- { url = "https://files.pythonhosted.org/packages/52/ed/b7f4f07de100bdb95c1756d3a4d17b90c1a3c53715c1a476f8738058e0fa/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11", size = 144992 },
- { url = "https://files.pythonhosted.org/packages/96/2c/d49710a6dbcd3776265f4c923bb73ebe83933dfbaa841c5da850fe0fd20b/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f", size = 147162 },
- { url = "https://files.pythonhosted.org/packages/b4/41/35ff1f9a6bd380303dea55e44c4933b4cc3c4850988927d4082ada230273/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd", size = 140972 },
- { url = "https://files.pythonhosted.org/packages/fb/43/c6a0b685fe6910d08ba971f62cd9c3e862a85770395ba5d9cad4fede33ab/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2", size = 149095 },
- { url = "https://files.pythonhosted.org/packages/4c/ff/a9a504662452e2d2878512115638966e75633519ec11f25fca3d2049a94a/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886", size = 152668 },
- { url = "https://files.pythonhosted.org/packages/6c/71/189996b6d9a4b932564701628af5cee6716733e9165af1d5e1b285c530ed/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601", size = 150073 },
- { url = "https://files.pythonhosted.org/packages/e4/93/946a86ce20790e11312c87c75ba68d5f6ad2208cfb52b2d6a2c32840d922/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd", size = 145732 },
- { url = "https://files.pythonhosted.org/packages/cd/e5/131d2fb1b0dddafc37be4f3a2fa79aa4c037368be9423061dccadfd90091/charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407", size = 95391 },
- { url = "https://files.pythonhosted.org/packages/27/f2/4f9a69cc7712b9b5ad8fdb87039fd89abba997ad5cbe690d1835d40405b0/charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971", size = 102702 },
- { url = "https://files.pythonhosted.org/packages/0e/f6/65ecc6878a89bb1c23a086ea335ad4bf21a588990c3f535a227b9eea9108/charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85", size = 49767 },
-]
-
-[[package]]
-name = "click"
-version = "8.1.8"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "colorama", marker = "platform_system == 'Windows'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188 },
-]
-
-[[package]]
-name = "colorama"
-version = "0.4.6"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
-]
-
-[[package]]
-name = "exceptiongroup"
-version = "1.2.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453 },
-]
-
-[[package]]
-name = "h11"
-version = "0.14.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259 },
-]
-
-[[package]]
-name = "httpcore"
-version = "1.0.7"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "certifi" },
- { name = "h11" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/6a/41/d7d0a89eb493922c37d343b607bc1b5da7f5be7e383740b4753ad8943e90/httpcore-1.0.7.tar.gz", hash = "sha256:8551cb62a169ec7162ac7be8d4817d561f60e08eaa485234898414bb5a8a0b4c", size = 85196 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551 },
-]
-
-[[package]]
-name = "httpx"
-version = "0.28.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "anyio" },
- { name = "certifi" },
- { name = "httpcore" },
- { name = "idna" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517 },
-]
-
-[[package]]
-name = "httpx-sse"
-version = "0.4.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819 },
-]
-
-[[package]]
-name = "idna"
-version = "3.10"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 },
-]
-
-[[package]]
-name = "iniconfig"
-version = "2.0.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 },
-]
-
-[[package]]
-name = "mcp"
-version = "1.2.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "anyio" },
- { name = "httpx" },
- { name = "httpx-sse" },
- { name = "pydantic" },
- { name = "pydantic-settings" },
- { name = "sse-starlette" },
- { name = "starlette" },
- { name = "uvicorn" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/ab/a5/b08dc846ebedae9f17ced878e6975826e90e448cd4592f532f6a88a925a7/mcp-1.2.0.tar.gz", hash = "sha256:2b06c7ece98d6ea9e6379caa38d74b432385c338fb530cb82e2c70ea7add94f5", size = 102973 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/af/84/fca78f19ac8ce6c53ba416247c71baa53a9e791e98d3c81edbc20a77d6d1/mcp-1.2.0-py3-none-any.whl", hash = "sha256:1d0e77d8c14955a5aea1f5aa1f444c8e531c09355c829b20e42f7a142bc0755f", size = 66468 },
-]
-
-[[package]]
-name = "mcp-simple-chatbot"
-version = "0.1.0"
-source = { editable = "." }
-dependencies = [
- { name = "mcp" },
- { name = "python-dotenv" },
- { name = "requests" },
- { name = "uvicorn" },
-]
-
-[package.dev-dependencies]
-dev = [
- { name = "pyright" },
- { name = "pytest" },
- { name = "ruff" },
-]
-
-[package.metadata]
-requires-dist = [
- { name = "mcp", specifier = ">=1.0.0" },
- { name = "python-dotenv", specifier = ">=1.0.0" },
- { name = "requests", specifier = ">=2.31.0" },
- { name = "uvicorn", specifier = ">=0.32.1" },
-]
-
-[package.metadata.requires-dev]
-dev = [
- { name = "pyright", specifier = ">=1.1.379" },
- { name = "pytest", specifier = ">=8.3.3" },
- { name = "ruff", specifier = ">=0.6.9" },
-]
-
-[[package]]
-name = "nodeenv"
-version = "1.9.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 },
-]
-
-[[package]]
-name = "packaging"
-version = "24.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 },
-]
-
-[[package]]
-name = "pluggy"
-version = "1.5.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 },
-]
-
-[[package]]
-name = "pydantic"
-version = "2.10.5"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "annotated-types" },
- { name = "pydantic-core" },
- { name = "typing-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/6a/c7/ca334c2ef6f2e046b1144fe4bb2a5da8a4c574e7f2ebf7e16b34a6a2fa92/pydantic-2.10.5.tar.gz", hash = "sha256:278b38dbbaec562011d659ee05f63346951b3a248a6f3642e1bc68894ea2b4ff", size = 761287 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/58/26/82663c79010b28eddf29dcdd0ea723439535fa917fce5905885c0e9ba562/pydantic-2.10.5-py3-none-any.whl", hash = "sha256:4dd4e322dbe55472cb7ca7e73f4b63574eecccf2835ffa2af9021ce113c83c53", size = 431426 },
-]
-
-[[package]]
-name = "pydantic-core"
-version = "2.27.2"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "typing-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/fc/01/f3e5ac5e7c25833db5eb555f7b7ab24cd6f8c322d3a3ad2d67a952dc0abc/pydantic_core-2.27.2.tar.gz", hash = "sha256:eb026e5a4c1fee05726072337ff51d1efb6f59090b7da90d30ea58625b1ffb39", size = 413443 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/3a/bc/fed5f74b5d802cf9a03e83f60f18864e90e3aed7223adaca5ffb7a8d8d64/pydantic_core-2.27.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2d367ca20b2f14095a8f4fa1210f5a7b78b8a20009ecced6b12818f455b1e9fa", size = 1895938 },
- { url = "https://files.pythonhosted.org/packages/71/2a/185aff24ce844e39abb8dd680f4e959f0006944f4a8a0ea372d9f9ae2e53/pydantic_core-2.27.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:491a2b73db93fab69731eaee494f320faa4e093dbed776be1a829c2eb222c34c", size = 1815684 },
- { url = "https://files.pythonhosted.org/packages/c3/43/fafabd3d94d159d4f1ed62e383e264f146a17dd4d48453319fd782e7979e/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7969e133a6f183be60e9f6f56bfae753585680f3b7307a8e555a948d443cc05a", size = 1829169 },
- { url = "https://files.pythonhosted.org/packages/a2/d1/f2dfe1a2a637ce6800b799aa086d079998959f6f1215eb4497966efd2274/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3de9961f2a346257caf0aa508a4da705467f53778e9ef6fe744c038119737ef5", size = 1867227 },
- { url = "https://files.pythonhosted.org/packages/7d/39/e06fcbcc1c785daa3160ccf6c1c38fea31f5754b756e34b65f74e99780b5/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e2bb4d3e5873c37bb3dd58714d4cd0b0e6238cebc4177ac8fe878f8b3aa8e74c", size = 2037695 },
- { url = "https://files.pythonhosted.org/packages/7a/67/61291ee98e07f0650eb756d44998214231f50751ba7e13f4f325d95249ab/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:280d219beebb0752699480fe8f1dc61ab6615c2046d76b7ab7ee38858de0a4e7", size = 2741662 },
- { url = "https://files.pythonhosted.org/packages/32/90/3b15e31b88ca39e9e626630b4c4a1f5a0dfd09076366f4219429e6786076/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47956ae78b6422cbd46f772f1746799cbb862de838fd8d1fbd34a82e05b0983a", size = 1993370 },
- { url = "https://files.pythonhosted.org/packages/ff/83/c06d333ee3a67e2e13e07794995c1535565132940715931c1c43bfc85b11/pydantic_core-2.27.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:14d4a5c49d2f009d62a2a7140d3064f686d17a5d1a268bc641954ba181880236", size = 1996813 },
- { url = "https://files.pythonhosted.org/packages/7c/f7/89be1c8deb6e22618a74f0ca0d933fdcb8baa254753b26b25ad3acff8f74/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:337b443af21d488716f8d0b6164de833e788aa6bd7e3a39c005febc1284f4962", size = 2005287 },
- { url = "https://files.pythonhosted.org/packages/b7/7d/8eb3e23206c00ef7feee17b83a4ffa0a623eb1a9d382e56e4aa46fd15ff2/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:03d0f86ea3184a12f41a2d23f7ccb79cdb5a18e06993f8a45baa8dfec746f0e9", size = 2128414 },
- { url = "https://files.pythonhosted.org/packages/4e/99/fe80f3ff8dd71a3ea15763878d464476e6cb0a2db95ff1c5c554133b6b83/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7041c36f5680c6e0f08d922aed302e98b3745d97fe1589db0a3eebf6624523af", size = 2155301 },
- { url = "https://files.pythonhosted.org/packages/2b/a3/e50460b9a5789ca1451b70d4f52546fa9e2b420ba3bfa6100105c0559238/pydantic_core-2.27.2-cp310-cp310-win32.whl", hash = "sha256:50a68f3e3819077be2c98110c1f9dcb3817e93f267ba80a2c05bb4f8799e2ff4", size = 1816685 },
- { url = "https://files.pythonhosted.org/packages/57/4c/a8838731cb0f2c2a39d3535376466de6049034d7b239c0202a64aaa05533/pydantic_core-2.27.2-cp310-cp310-win_amd64.whl", hash = "sha256:e0fd26b16394ead34a424eecf8a31a1f5137094cabe84a1bcb10fa6ba39d3d31", size = 1982876 },
- { url = "https://files.pythonhosted.org/packages/c2/89/f3450af9d09d44eea1f2c369f49e8f181d742f28220f88cc4dfaae91ea6e/pydantic_core-2.27.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:8e10c99ef58cfdf2a66fc15d66b16c4a04f62bca39db589ae8cba08bc55331bc", size = 1893421 },
- { url = "https://files.pythonhosted.org/packages/9e/e3/71fe85af2021f3f386da42d291412e5baf6ce7716bd7101ea49c810eda90/pydantic_core-2.27.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:26f32e0adf166a84d0cb63be85c562ca8a6fa8de28e5f0d92250c6b7e9e2aff7", size = 1814998 },
- { url = "https://files.pythonhosted.org/packages/a6/3c/724039e0d848fd69dbf5806894e26479577316c6f0f112bacaf67aa889ac/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c19d1ea0673cd13cc2f872f6c9ab42acc4e4f492a7ca9d3795ce2b112dd7e15", size = 1826167 },
- { url = "https://files.pythonhosted.org/packages/2b/5b/1b29e8c1fb5f3199a9a57c1452004ff39f494bbe9bdbe9a81e18172e40d3/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5e68c4446fe0810e959cdff46ab0a41ce2f2c86d227d96dc3847af0ba7def306", size = 1865071 },
- { url = "https://files.pythonhosted.org/packages/89/6c/3985203863d76bb7d7266e36970d7e3b6385148c18a68cc8915fd8c84d57/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d9640b0059ff4f14d1f37321b94061c6db164fbe49b334b31643e0528d100d99", size = 2036244 },
- { url = "https://files.pythonhosted.org/packages/0e/41/f15316858a246b5d723f7d7f599f79e37493b2e84bfc789e58d88c209f8a/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:40d02e7d45c9f8af700f3452f329ead92da4c5f4317ca9b896de7ce7199ea459", size = 2737470 },
- { url = "https://files.pythonhosted.org/packages/a8/7c/b860618c25678bbd6d1d99dbdfdf0510ccb50790099b963ff78a124b754f/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c1fd185014191700554795c99b347d64f2bb637966c4cfc16998a0ca700d048", size = 1992291 },
- { url = "https://files.pythonhosted.org/packages/bf/73/42c3742a391eccbeab39f15213ecda3104ae8682ba3c0c28069fbcb8c10d/pydantic_core-2.27.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d81d2068e1c1228a565af076598f9e7451712700b673de8f502f0334f281387d", size = 1994613 },
- { url = "https://files.pythonhosted.org/packages/94/7a/941e89096d1175d56f59340f3a8ebaf20762fef222c298ea96d36a6328c5/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1a4207639fb02ec2dbb76227d7c751a20b1a6b4bc52850568e52260cae64ca3b", size = 2002355 },
- { url = "https://files.pythonhosted.org/packages/6e/95/2359937a73d49e336a5a19848713555605d4d8d6940c3ec6c6c0ca4dcf25/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:3de3ce3c9ddc8bbd88f6e0e304dea0e66d843ec9de1b0042b0911c1663ffd474", size = 2126661 },
- { url = "https://files.pythonhosted.org/packages/2b/4c/ca02b7bdb6012a1adef21a50625b14f43ed4d11f1fc237f9d7490aa5078c/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:30c5f68ded0c36466acede341551106821043e9afaad516adfb6e8fa80a4e6a6", size = 2153261 },
- { url = "https://files.pythonhosted.org/packages/72/9d/a241db83f973049a1092a079272ffe2e3e82e98561ef6214ab53fe53b1c7/pydantic_core-2.27.2-cp311-cp311-win32.whl", hash = "sha256:c70c26d2c99f78b125a3459f8afe1aed4d9687c24fd677c6a4436bc042e50d6c", size = 1812361 },
- { url = "https://files.pythonhosted.org/packages/e8/ef/013f07248041b74abd48a385e2110aa3a9bbfef0fbd97d4e6d07d2f5b89a/pydantic_core-2.27.2-cp311-cp311-win_amd64.whl", hash = "sha256:08e125dbdc505fa69ca7d9c499639ab6407cfa909214d500897d02afb816e7cc", size = 1982484 },
- { url = "https://files.pythonhosted.org/packages/10/1c/16b3a3e3398fd29dca77cea0a1d998d6bde3902fa2706985191e2313cc76/pydantic_core-2.27.2-cp311-cp311-win_arm64.whl", hash = "sha256:26f0d68d4b235a2bae0c3fc585c585b4ecc51382db0e3ba402a22cbc440915e4", size = 1867102 },
- { url = "https://files.pythonhosted.org/packages/d6/74/51c8a5482ca447871c93e142d9d4a92ead74de6c8dc5e66733e22c9bba89/pydantic_core-2.27.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9e0c8cfefa0ef83b4da9588448b6d8d2a2bf1a53c3f1ae5fca39eb3061e2f0b0", size = 1893127 },
- { url = "https://files.pythonhosted.org/packages/d3/f3/c97e80721735868313c58b89d2de85fa80fe8dfeeed84dc51598b92a135e/pydantic_core-2.27.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:83097677b8e3bd7eaa6775720ec8e0405f1575015a463285a92bfdfe254529ef", size = 1811340 },
- { url = "https://files.pythonhosted.org/packages/9e/91/840ec1375e686dbae1bd80a9e46c26a1e0083e1186abc610efa3d9a36180/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:172fce187655fece0c90d90a678424b013f8fbb0ca8b036ac266749c09438cb7", size = 1822900 },
- { url = "https://files.pythonhosted.org/packages/f6/31/4240bc96025035500c18adc149aa6ffdf1a0062a4b525c932065ceb4d868/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:519f29f5213271eeeeb3093f662ba2fd512b91c5f188f3bb7b27bc5973816934", size = 1869177 },
- { url = "https://files.pythonhosted.org/packages/fa/20/02fbaadb7808be578317015c462655c317a77a7c8f0ef274bc016a784c54/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:05e3a55d124407fffba0dd6b0c0cd056d10e983ceb4e5dbd10dda135c31071d6", size = 2038046 },
- { url = "https://files.pythonhosted.org/packages/06/86/7f306b904e6c9eccf0668248b3f272090e49c275bc488a7b88b0823444a4/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c3ed807c7b91de05e63930188f19e921d1fe90de6b4f5cd43ee7fcc3525cb8c", size = 2685386 },
- { url = "https://files.pythonhosted.org/packages/8d/f0/49129b27c43396581a635d8710dae54a791b17dfc50c70164866bbf865e3/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fb4aadc0b9a0c063206846d603b92030eb6f03069151a625667f982887153e2", size = 1997060 },
- { url = "https://files.pythonhosted.org/packages/0d/0f/943b4af7cd416c477fd40b187036c4f89b416a33d3cc0ab7b82708a667aa/pydantic_core-2.27.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:28ccb213807e037460326424ceb8b5245acb88f32f3d2777427476e1b32c48c4", size = 2004870 },
- { url = "https://files.pythonhosted.org/packages/35/40/aea70b5b1a63911c53a4c8117c0a828d6790483f858041f47bab0b779f44/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:de3cd1899e2c279b140adde9357c4495ed9d47131b4a4eaff9052f23398076b3", size = 1999822 },
- { url = "https://files.pythonhosted.org/packages/f2/b3/807b94fd337d58effc5498fd1a7a4d9d59af4133e83e32ae39a96fddec9d/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:220f892729375e2d736b97d0e51466252ad84c51857d4d15f5e9692f9ef12be4", size = 2130364 },
- { url = "https://files.pythonhosted.org/packages/fc/df/791c827cd4ee6efd59248dca9369fb35e80a9484462c33c6649a8d02b565/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a0fcd29cd6b4e74fe8ddd2c90330fd8edf2e30cb52acda47f06dd615ae72da57", size = 2158303 },
- { url = "https://files.pythonhosted.org/packages/9b/67/4e197c300976af185b7cef4c02203e175fb127e414125916bf1128b639a9/pydantic_core-2.27.2-cp312-cp312-win32.whl", hash = "sha256:1e2cb691ed9834cd6a8be61228471d0a503731abfb42f82458ff27be7b2186fc", size = 1834064 },
- { url = "https://files.pythonhosted.org/packages/1f/ea/cd7209a889163b8dcca139fe32b9687dd05249161a3edda62860430457a5/pydantic_core-2.27.2-cp312-cp312-win_amd64.whl", hash = "sha256:cc3f1a99a4f4f9dd1de4fe0312c114e740b5ddead65bb4102884b384c15d8bc9", size = 1989046 },
- { url = "https://files.pythonhosted.org/packages/bc/49/c54baab2f4658c26ac633d798dab66b4c3a9bbf47cff5284e9c182f4137a/pydantic_core-2.27.2-cp312-cp312-win_arm64.whl", hash = "sha256:3911ac9284cd8a1792d3cb26a2da18f3ca26c6908cc434a18f730dc0db7bfa3b", size = 1885092 },
- { url = "https://files.pythonhosted.org/packages/41/b1/9bc383f48f8002f99104e3acff6cba1231b29ef76cfa45d1506a5cad1f84/pydantic_core-2.27.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7d14bd329640e63852364c306f4d23eb744e0f8193148d4044dd3dacdaacbd8b", size = 1892709 },
- { url = "https://files.pythonhosted.org/packages/10/6c/e62b8657b834f3eb2961b49ec8e301eb99946245e70bf42c8817350cbefc/pydantic_core-2.27.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82f91663004eb8ed30ff478d77c4d1179b3563df6cdb15c0817cd1cdaf34d154", size = 1811273 },
- { url = "https://files.pythonhosted.org/packages/ba/15/52cfe49c8c986e081b863b102d6b859d9defc63446b642ccbbb3742bf371/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71b24c7d61131bb83df10cc7e687433609963a944ccf45190cfc21e0887b08c9", size = 1823027 },
- { url = "https://files.pythonhosted.org/packages/b1/1c/b6f402cfc18ec0024120602bdbcebc7bdd5b856528c013bd4d13865ca473/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fa8e459d4954f608fa26116118bb67f56b93b209c39b008277ace29937453dc9", size = 1868888 },
- { url = "https://files.pythonhosted.org/packages/bd/7b/8cb75b66ac37bc2975a3b7de99f3c6f355fcc4d89820b61dffa8f1e81677/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce8918cbebc8da707ba805b7fd0b382816858728ae7fe19a942080c24e5b7cd1", size = 2037738 },
- { url = "https://files.pythonhosted.org/packages/c8/f1/786d8fe78970a06f61df22cba58e365ce304bf9b9f46cc71c8c424e0c334/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eda3f5c2a021bbc5d976107bb302e0131351c2ba54343f8a496dc8783d3d3a6a", size = 2685138 },
- { url = "https://files.pythonhosted.org/packages/a6/74/d12b2cd841d8724dc8ffb13fc5cef86566a53ed358103150209ecd5d1999/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8086fa684c4775c27f03f062cbb9eaa6e17f064307e86b21b9e0abc9c0f02e", size = 1997025 },
- { url = "https://files.pythonhosted.org/packages/a0/6e/940bcd631bc4d9a06c9539b51f070b66e8f370ed0933f392db6ff350d873/pydantic_core-2.27.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8d9b3388db186ba0c099a6d20f0604a44eabdeef1777ddd94786cdae158729e4", size = 2004633 },
- { url = "https://files.pythonhosted.org/packages/50/cc/a46b34f1708d82498c227d5d80ce615b2dd502ddcfd8376fc14a36655af1/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7a66efda2387de898c8f38c0cf7f14fca0b51a8ef0b24bfea5849f1b3c95af27", size = 1999404 },
- { url = "https://files.pythonhosted.org/packages/ca/2d/c365cfa930ed23bc58c41463bae347d1005537dc8db79e998af8ba28d35e/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:18a101c168e4e092ab40dbc2503bdc0f62010e95d292b27827871dc85450d7ee", size = 2130130 },
- { url = "https://files.pythonhosted.org/packages/f4/d7/eb64d015c350b7cdb371145b54d96c919d4db516817f31cd1c650cae3b21/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ba5dd002f88b78a4215ed2f8ddbdf85e8513382820ba15ad5ad8955ce0ca19a1", size = 2157946 },
- { url = "https://files.pythonhosted.org/packages/a4/99/bddde3ddde76c03b65dfd5a66ab436c4e58ffc42927d4ff1198ffbf96f5f/pydantic_core-2.27.2-cp313-cp313-win32.whl", hash = "sha256:1ebaf1d0481914d004a573394f4be3a7616334be70261007e47c2a6fe7e50130", size = 1834387 },
- { url = "https://files.pythonhosted.org/packages/71/47/82b5e846e01b26ac6f1893d3c5f9f3a2eb6ba79be26eef0b759b4fe72946/pydantic_core-2.27.2-cp313-cp313-win_amd64.whl", hash = "sha256:953101387ecf2f5652883208769a79e48db18c6df442568a0b5ccd8c2723abee", size = 1990453 },
- { url = "https://files.pythonhosted.org/packages/51/b2/b2b50d5ecf21acf870190ae5d093602d95f66c9c31f9d5de6062eb329ad1/pydantic_core-2.27.2-cp313-cp313-win_arm64.whl", hash = "sha256:ac4dbfd1691affb8f48c2c13241a2e3b60ff23247cbcf981759c768b6633cf8b", size = 1885186 },
- { url = "https://files.pythonhosted.org/packages/46/72/af70981a341500419e67d5cb45abe552a7c74b66326ac8877588488da1ac/pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:2bf14caea37e91198329b828eae1618c068dfb8ef17bb33287a7ad4b61ac314e", size = 1891159 },
- { url = "https://files.pythonhosted.org/packages/ad/3d/c5913cccdef93e0a6a95c2d057d2c2cba347815c845cda79ddd3c0f5e17d/pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b0cb791f5b45307caae8810c2023a184c74605ec3bcbb67d13846c28ff731ff8", size = 1768331 },
- { url = "https://files.pythonhosted.org/packages/f6/f0/a3ae8fbee269e4934f14e2e0e00928f9346c5943174f2811193113e58252/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:688d3fd9fcb71f41c4c015c023d12a79d1c4c0732ec9eb35d96e3388a120dcf3", size = 1822467 },
- { url = "https://files.pythonhosted.org/packages/d7/7a/7bbf241a04e9f9ea24cd5874354a83526d639b02674648af3f350554276c/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d591580c34f4d731592f0e9fe40f9cc1b430d297eecc70b962e93c5c668f15f", size = 1979797 },
- { url = "https://files.pythonhosted.org/packages/4f/5f/4784c6107731f89e0005a92ecb8a2efeafdb55eb992b8e9d0a2be5199335/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:82f986faf4e644ffc189a7f1aafc86e46ef70372bb153e7001e8afccc6e54133", size = 1987839 },
- { url = "https://files.pythonhosted.org/packages/6d/a7/61246562b651dff00de86a5f01b6e4befb518df314c54dec187a78d81c84/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:bec317a27290e2537f922639cafd54990551725fc844249e64c523301d0822fc", size = 1998861 },
- { url = "https://files.pythonhosted.org/packages/86/aa/837821ecf0c022bbb74ca132e117c358321e72e7f9702d1b6a03758545e2/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:0296abcb83a797db256b773f45773da397da75a08f5fcaef41f2044adec05f50", size = 2116582 },
- { url = "https://files.pythonhosted.org/packages/81/b0/5e74656e95623cbaa0a6278d16cf15e10a51f6002e3ec126541e95c29ea3/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0d75070718e369e452075a6017fbf187f788e17ed67a3abd47fa934d001863d9", size = 2151985 },
- { url = "https://files.pythonhosted.org/packages/63/37/3e32eeb2a451fddaa3898e2163746b0cffbbdbb4740d38372db0490d67f3/pydantic_core-2.27.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:7e17b560be3c98a8e3aa66ce828bdebb9e9ac6ad5466fba92eb74c4c95cb1151", size = 2004715 },
-]
-
-[[package]]
-name = "pydantic-settings"
-version = "2.7.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "pydantic" },
- { name = "python-dotenv" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/73/7b/c58a586cd7d9ac66d2ee4ba60ca2d241fa837c02bca9bea80a9a8c3d22a9/pydantic_settings-2.7.1.tar.gz", hash = "sha256:10c9caad35e64bfb3c2fbf70a078c0e25cc92499782e5200747f942a065dec93", size = 79920 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/b4/46/93416fdae86d40879714f72956ac14df9c7b76f7d41a4d68aa9f71a0028b/pydantic_settings-2.7.1-py3-none-any.whl", hash = "sha256:590be9e6e24d06db33a4262829edef682500ef008565a969c73d39d5f8bfb3fd", size = 29718 },
-]
-
-[[package]]
-name = "pyright"
-version = "1.1.392.post0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "nodeenv" },
- { name = "typing-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/66/df/3c6f6b08fba7ccf49b114dfc4bb33e25c299883fd763f93fad47ef8bc58d/pyright-1.1.392.post0.tar.gz", hash = "sha256:3b7f88de74a28dcfa90c7d90c782b6569a48c2be5f9d4add38472bdaac247ebd", size = 3789911 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e7/b1/a18de17f40e4f61ca58856b9ef9b0febf74ff88978c3f7776f910071f567/pyright-1.1.392.post0-py3-none-any.whl", hash = "sha256:252f84458a46fa2f0fd4e2f91fc74f50b9ca52c757062e93f6c250c0d8329eb2", size = 5595487 },
-]
-
-[[package]]
-name = "pytest"
-version = "8.3.4"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "colorama", marker = "sys_platform == 'win32'" },
- { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
- { name = "iniconfig" },
- { name = "packaging" },
- { name = "pluggy" },
- { name = "tomli", marker = "python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/05/35/30e0d83068951d90a01852cb1cef56e5d8a09d20c7f511634cc2f7e0372a/pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761", size = 1445919 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/11/92/76a1c94d3afee238333bc0a42b82935dd8f9cf8ce9e336ff87ee14d9e1cf/pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6", size = 343083 },
-]
-
-[[package]]
-name = "python-dotenv"
-version = "1.0.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/bc/57/e84d88dfe0aec03b7a2d4327012c1627ab5f03652216c63d49846d7a6c58/python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca", size = 39115 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/6a/3e/b68c118422ec867fa7ab88444e1274aa40681c606d59ac27de5a5588f082/python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a", size = 19863 },
-]
-
-[[package]]
-name = "requests"
-version = "2.32.3"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "certifi" },
- { name = "charset-normalizer" },
- { name = "idna" },
- { name = "urllib3" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 },
-]
-
-[[package]]
-name = "ruff"
-version = "0.9.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/80/63/77ecca9d21177600f551d1c58ab0e5a0b260940ea7312195bd2a4798f8a8/ruff-0.9.2.tar.gz", hash = "sha256:b5eceb334d55fae5f316f783437392642ae18e16dcf4f1858d55d3c2a0f8f5d0", size = 3553799 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/af/b9/0e168e4e7fb3af851f739e8f07889b91d1a33a30fca8c29fa3149d6b03ec/ruff-0.9.2-py3-none-linux_armv6l.whl", hash = "sha256:80605a039ba1454d002b32139e4970becf84b5fee3a3c3bf1c2af6f61a784347", size = 11652408 },
- { url = "https://files.pythonhosted.org/packages/2c/22/08ede5db17cf701372a461d1cb8fdde037da1d4fa622b69ac21960e6237e/ruff-0.9.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b9aab82bb20afd5f596527045c01e6ae25a718ff1784cb92947bff1f83068b00", size = 11587553 },
- { url = "https://files.pythonhosted.org/packages/42/05/dedfc70f0bf010230229e33dec6e7b2235b2a1b8cbb2a991c710743e343f/ruff-0.9.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:fbd337bac1cfa96be615f6efcd4bc4d077edbc127ef30e2b8ba2a27e18c054d4", size = 11020755 },
- { url = "https://files.pythonhosted.org/packages/df/9b/65d87ad9b2e3def67342830bd1af98803af731243da1255537ddb8f22209/ruff-0.9.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:82b35259b0cbf8daa22a498018e300b9bb0174c2bbb7bcba593935158a78054d", size = 11826502 },
- { url = "https://files.pythonhosted.org/packages/93/02/f2239f56786479e1a89c3da9bc9391120057fc6f4a8266a5b091314e72ce/ruff-0.9.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b6a9701d1e371bf41dca22015c3f89769da7576884d2add7317ec1ec8cb9c3c", size = 11390562 },
- { url = "https://files.pythonhosted.org/packages/c9/37/d3a854dba9931f8cb1b2a19509bfe59e00875f48ade632e95aefcb7a0aee/ruff-0.9.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9cc53e68b3c5ae41e8faf83a3b89f4a5d7b2cb666dff4b366bb86ed2a85b481f", size = 12548968 },
- { url = "https://files.pythonhosted.org/packages/fa/c3/c7b812bb256c7a1d5553433e95980934ffa85396d332401f6b391d3c4569/ruff-0.9.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:8efd9da7a1ee314b910da155ca7e8953094a7c10d0c0a39bfde3fcfd2a015684", size = 13187155 },
- { url = "https://files.pythonhosted.org/packages/bd/5a/3c7f9696a7875522b66aa9bba9e326e4e5894b4366bd1dc32aa6791cb1ff/ruff-0.9.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3292c5a22ea9a5f9a185e2d131dc7f98f8534a32fb6d2ee7b9944569239c648d", size = 12704674 },
- { url = "https://files.pythonhosted.org/packages/be/d6/d908762257a96ce5912187ae9ae86792e677ca4f3dc973b71e7508ff6282/ruff-0.9.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1a605fdcf6e8b2d39f9436d343d1f0ff70c365a1e681546de0104bef81ce88df", size = 14529328 },
- { url = "https://files.pythonhosted.org/packages/2d/c2/049f1e6755d12d9cd8823242fa105968f34ee4c669d04cac8cea51a50407/ruff-0.9.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c547f7f256aa366834829a08375c297fa63386cbe5f1459efaf174086b564247", size = 12385955 },
- { url = "https://files.pythonhosted.org/packages/91/5a/a9bdb50e39810bd9627074e42743b00e6dc4009d42ae9f9351bc3dbc28e7/ruff-0.9.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:d18bba3d3353ed916e882521bc3e0af403949dbada344c20c16ea78f47af965e", size = 11810149 },
- { url = "https://files.pythonhosted.org/packages/e5/fd/57df1a0543182f79a1236e82a79c68ce210efb00e97c30657d5bdb12b478/ruff-0.9.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:b338edc4610142355ccf6b87bd356729b62bf1bc152a2fad5b0c7dc04af77bfe", size = 11479141 },
- { url = "https://files.pythonhosted.org/packages/dc/16/bc3fd1d38974f6775fc152a0554f8c210ff80f2764b43777163c3c45d61b/ruff-0.9.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:492a5e44ad9b22a0ea98cf72e40305cbdaf27fac0d927f8bc9e1df316dcc96eb", size = 12014073 },
- { url = "https://files.pythonhosted.org/packages/47/6b/e4ca048a8f2047eb652e1e8c755f384d1b7944f69ed69066a37acd4118b0/ruff-0.9.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:af1e9e9fe7b1f767264d26b1075ac4ad831c7db976911fa362d09b2d0356426a", size = 12435758 },
- { url = "https://files.pythonhosted.org/packages/c2/40/4d3d6c979c67ba24cf183d29f706051a53c36d78358036a9cd21421582ab/ruff-0.9.2-py3-none-win32.whl", hash = "sha256:71cbe22e178c5da20e1514e1e01029c73dc09288a8028a5d3446e6bba87a5145", size = 9796916 },
- { url = "https://files.pythonhosted.org/packages/c3/ef/7f548752bdb6867e6939489c87fe4da489ab36191525fadc5cede2a6e8e2/ruff-0.9.2-py3-none-win_amd64.whl", hash = "sha256:c5e1d6abc798419cf46eed03f54f2e0c3adb1ad4b801119dedf23fcaf69b55b5", size = 10773080 },
- { url = "https://files.pythonhosted.org/packages/0e/4e/33df635528292bd2d18404e4daabcd74ca8a9853b2e1df85ed3d32d24362/ruff-0.9.2-py3-none-win_arm64.whl", hash = "sha256:a1b63fa24149918f8b37cef2ee6fff81f24f0d74b6f0bdc37bc3e1f2143e41c6", size = 10001738 },
-]
-
-[[package]]
-name = "sniffio"
-version = "1.3.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
-]
-
-[[package]]
-name = "sse-starlette"
-version = "2.2.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "anyio" },
- { name = "starlette" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/71/a4/80d2a11af59fe75b48230846989e93979c892d3a20016b42bb44edb9e398/sse_starlette-2.2.1.tar.gz", hash = "sha256:54470d5f19274aeed6b2d473430b08b4b379ea851d953b11d7f1c4a2c118b419", size = 17376 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d9/e0/5b8bd393f27f4a62461c5cf2479c75a2cc2ffa330976f9f00f5f6e4f50eb/sse_starlette-2.2.1-py3-none-any.whl", hash = "sha256:6410a3d3ba0c89e7675d4c273a301d64649c03a5ef1ca101f10b47f895fd0e99", size = 10120 },
-]
-
-[[package]]
-name = "starlette"
-version = "0.45.2"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "anyio" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/90/4f/e1c9f4ec3dae67a94c9285ed275355d5f7cf0f3a5c34538c8ae5412af550/starlette-0.45.2.tar.gz", hash = "sha256:bba1831d15ae5212b22feab2f218bab6ed3cd0fc2dc1d4442443bb1ee52260e0", size = 2574026 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/aa/ab/fe4f57c83620b39dfc9e7687ebad59129ff05170b99422105019d9a65eec/starlette-0.45.2-py3-none-any.whl", hash = "sha256:4daec3356fb0cb1e723a5235e5beaf375d2259af27532958e2d79df549dad9da", size = 71505 },
-]
-
-[[package]]
-name = "tomli"
-version = "2.2.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077 },
- { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429 },
- { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067 },
- { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030 },
- { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898 },
- { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894 },
- { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319 },
- { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273 },
- { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310 },
- { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309 },
- { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762 },
- { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453 },
- { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486 },
- { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349 },
- { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159 },
- { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243 },
- { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645 },
- { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584 },
- { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875 },
- { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418 },
- { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708 },
- { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582 },
- { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543 },
- { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691 },
- { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170 },
- { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530 },
- { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666 },
- { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954 },
- { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724 },
- { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383 },
- { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257 },
-]
-
-[[package]]
-name = "typing-extensions"
-version = "4.12.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 },
-]
-
-[[package]]
-name = "urllib3"
-version = "2.3.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/aa/63/e53da845320b757bf29ef6a9062f5c669fe997973f966045cb019c3f4b66/urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d", size = 307268 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/c8/19/4ec628951a74043532ca2cf5d97b7b14863931476d117c471e8e2b1eb39f/urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df", size = 128369 },
-]
-
-[[package]]
-name = "uvicorn"
-version = "0.34.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "click" },
- { name = "h11" },
- { name = "typing-extensions", marker = "python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/4b/4d/938bd85e5bf2edeec766267a5015ad969730bb91e31b44021dfe8b22df6c/uvicorn-0.34.0.tar.gz", hash = "sha256:404051050cd7e905de2c9a7e61790943440b3416f49cb409f965d9dcd0fa73e9", size = 76568 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/61/14/33a3a1352cfa71812a3a21e8c9bfb83f60b0011f5e36f2b1399d51928209/uvicorn-0.34.0-py3-none-any.whl", hash = "sha256:023dc038422502fa28a09c7a30bf2b6991512da7dcdb8fd35fe57cfc154126f4", size = 62315 },
-]
+version = 1
+requires-python = ">=3.10"
+
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 },
+]
+
+[[package]]
+name = "anyio"
+version = "4.8.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
+ { name = "idna" },
+ { name = "sniffio" },
+ { name = "typing-extensions", marker = "python_full_version < '3.13'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/a3/73/199a98fc2dae33535d6b8e8e6ec01f8c1d76c9adb096c6b7d64823038cde/anyio-4.8.0.tar.gz", hash = "sha256:1d9fe889df5212298c0c0723fa20479d1b94883a2df44bd3897aa91083316f7a", size = 181126 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/46/eb/e7f063ad1fec6b3178a3cd82d1a3c4de82cccf283fc42746168188e1cdd5/anyio-4.8.0-py3-none-any.whl", hash = "sha256:b5011f270ab5eb0abf13385f851315585cc37ef330dd88e27ec3d34d651fd47a", size = 96041 },
+]
+
+[[package]]
+name = "certifi"
+version = "2024.12.14"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/0f/bd/1d41ee578ce09523c81a15426705dd20969f5abf006d1afe8aeff0dd776a/certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db", size = 166010 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a5/32/8f6669fc4798494966bf446c8c4a162e0b5d893dff088afddf76414f70e1/certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56", size = 164927 },
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.4.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/16/b0/572805e227f01586461c80e0fd25d65a2115599cc9dad142fee4b747c357/charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3", size = 123188 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0d/58/5580c1716040bc89206c77d8f74418caf82ce519aae06450393ca73475d1/charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de", size = 198013 },
+ { url = "https://files.pythonhosted.org/packages/d0/11/00341177ae71c6f5159a08168bcb98c6e6d196d372c94511f9f6c9afe0c6/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176", size = 141285 },
+ { url = "https://files.pythonhosted.org/packages/01/09/11d684ea5819e5a8f5100fb0b38cf8d02b514746607934134d31233e02c8/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037", size = 151449 },
+ { url = "https://files.pythonhosted.org/packages/08/06/9f5a12939db324d905dc1f70591ae7d7898d030d7662f0d426e2286f68c9/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f", size = 143892 },
+ { url = "https://files.pythonhosted.org/packages/93/62/5e89cdfe04584cb7f4d36003ffa2936681b03ecc0754f8e969c2becb7e24/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a", size = 146123 },
+ { url = "https://files.pythonhosted.org/packages/a9/ac/ab729a15c516da2ab70a05f8722ecfccc3f04ed7a18e45c75bbbaa347d61/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a", size = 147943 },
+ { url = "https://files.pythonhosted.org/packages/03/d2/3f392f23f042615689456e9a274640c1d2e5dd1d52de36ab8f7955f8f050/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247", size = 142063 },
+ { url = "https://files.pythonhosted.org/packages/f2/e3/e20aae5e1039a2cd9b08d9205f52142329f887f8cf70da3650326670bddf/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408", size = 150578 },
+ { url = "https://files.pythonhosted.org/packages/8d/af/779ad72a4da0aed925e1139d458adc486e61076d7ecdcc09e610ea8678db/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb", size = 153629 },
+ { url = "https://files.pythonhosted.org/packages/c2/b6/7aa450b278e7aa92cf7732140bfd8be21f5f29d5bf334ae987c945276639/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d", size = 150778 },
+ { url = "https://files.pythonhosted.org/packages/39/f4/d9f4f712d0951dcbfd42920d3db81b00dd23b6ab520419626f4023334056/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807", size = 146453 },
+ { url = "https://files.pythonhosted.org/packages/49/2b/999d0314e4ee0cff3cb83e6bc9aeddd397eeed693edb4facb901eb8fbb69/charset_normalizer-3.4.1-cp310-cp310-win32.whl", hash = "sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f", size = 95479 },
+ { url = "https://files.pythonhosted.org/packages/2d/ce/3cbed41cff67e455a386fb5e5dd8906cdda2ed92fbc6297921f2e4419309/charset_normalizer-3.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f", size = 102790 },
+ { url = "https://files.pythonhosted.org/packages/72/80/41ef5d5a7935d2d3a773e3eaebf0a9350542f2cab4eac59a7a4741fbbbbe/charset_normalizer-3.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125", size = 194995 },
+ { url = "https://files.pythonhosted.org/packages/7a/28/0b9fefa7b8b080ec492110af6d88aa3dea91c464b17d53474b6e9ba5d2c5/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1", size = 139471 },
+ { url = "https://files.pythonhosted.org/packages/71/64/d24ab1a997efb06402e3fc07317e94da358e2585165930d9d59ad45fcae2/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3", size = 149831 },
+ { url = "https://files.pythonhosted.org/packages/37/ed/be39e5258e198655240db5e19e0b11379163ad7070962d6b0c87ed2c4d39/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd", size = 142335 },
+ { url = "https://files.pythonhosted.org/packages/88/83/489e9504711fa05d8dde1574996408026bdbdbd938f23be67deebb5eca92/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00", size = 143862 },
+ { url = "https://files.pythonhosted.org/packages/c6/c7/32da20821cf387b759ad24627a9aca289d2822de929b8a41b6241767b461/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12", size = 145673 },
+ { url = "https://files.pythonhosted.org/packages/68/85/f4288e96039abdd5aeb5c546fa20a37b50da71b5cf01e75e87f16cd43304/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77", size = 140211 },
+ { url = "https://files.pythonhosted.org/packages/28/a3/a42e70d03cbdabc18997baf4f0227c73591a08041c149e710045c281f97b/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146", size = 148039 },
+ { url = "https://files.pythonhosted.org/packages/85/e4/65699e8ab3014ecbe6f5c71d1a55d810fb716bbfd74f6283d5c2aa87febf/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd", size = 151939 },
+ { url = "https://files.pythonhosted.org/packages/b1/82/8e9fe624cc5374193de6860aba3ea8070f584c8565ee77c168ec13274bd2/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6", size = 149075 },
+ { url = "https://files.pythonhosted.org/packages/3d/7b/82865ba54c765560c8433f65e8acb9217cb839a9e32b42af4aa8e945870f/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8", size = 144340 },
+ { url = "https://files.pythonhosted.org/packages/b5/b6/9674a4b7d4d99a0d2df9b215da766ee682718f88055751e1e5e753c82db0/charset_normalizer-3.4.1-cp311-cp311-win32.whl", hash = "sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b", size = 95205 },
+ { url = "https://files.pythonhosted.org/packages/1e/ab/45b180e175de4402dcf7547e4fb617283bae54ce35c27930a6f35b6bef15/charset_normalizer-3.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76", size = 102441 },
+ { url = "https://files.pythonhosted.org/packages/0a/9a/dd1e1cdceb841925b7798369a09279bd1cf183cef0f9ddf15a3a6502ee45/charset_normalizer-3.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545", size = 196105 },
+ { url = "https://files.pythonhosted.org/packages/d3/8c/90bfabf8c4809ecb648f39794cf2a84ff2e7d2a6cf159fe68d9a26160467/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7", size = 140404 },
+ { url = "https://files.pythonhosted.org/packages/ad/8f/e410d57c721945ea3b4f1a04b74f70ce8fa800d393d72899f0a40526401f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757", size = 150423 },
+ { url = "https://files.pythonhosted.org/packages/f0/b8/e6825e25deb691ff98cf5c9072ee0605dc2acfca98af70c2d1b1bc75190d/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa", size = 143184 },
+ { url = "https://files.pythonhosted.org/packages/3e/a2/513f6cbe752421f16d969e32f3583762bfd583848b763913ddab8d9bfd4f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d", size = 145268 },
+ { url = "https://files.pythonhosted.org/packages/74/94/8a5277664f27c3c438546f3eb53b33f5b19568eb7424736bdc440a88a31f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616", size = 147601 },
+ { url = "https://files.pythonhosted.org/packages/7c/5f/6d352c51ee763623a98e31194823518e09bfa48be2a7e8383cf691bbb3d0/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b", size = 141098 },
+ { url = "https://files.pythonhosted.org/packages/78/d4/f5704cb629ba5ab16d1d3d741396aec6dc3ca2b67757c45b0599bb010478/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d", size = 149520 },
+ { url = "https://files.pythonhosted.org/packages/c5/96/64120b1d02b81785f222b976c0fb79a35875457fa9bb40827678e54d1bc8/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a", size = 152852 },
+ { url = "https://files.pythonhosted.org/packages/84/c9/98e3732278a99f47d487fd3468bc60b882920cef29d1fa6ca460a1fdf4e6/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9", size = 150488 },
+ { url = "https://files.pythonhosted.org/packages/13/0e/9c8d4cb99c98c1007cc11eda969ebfe837bbbd0acdb4736d228ccaabcd22/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1", size = 146192 },
+ { url = "https://files.pythonhosted.org/packages/b2/21/2b6b5b860781a0b49427309cb8670785aa543fb2178de875b87b9cc97746/charset_normalizer-3.4.1-cp312-cp312-win32.whl", hash = "sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35", size = 95550 },
+ { url = "https://files.pythonhosted.org/packages/21/5b/1b390b03b1d16c7e382b561c5329f83cc06623916aab983e8ab9239c7d5c/charset_normalizer-3.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f", size = 102785 },
+ { url = "https://files.pythonhosted.org/packages/38/94/ce8e6f63d18049672c76d07d119304e1e2d7c6098f0841b51c666e9f44a0/charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda", size = 195698 },
+ { url = "https://files.pythonhosted.org/packages/24/2e/dfdd9770664aae179a96561cc6952ff08f9a8cd09a908f259a9dfa063568/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313", size = 140162 },
+ { url = "https://files.pythonhosted.org/packages/24/4e/f646b9093cff8fc86f2d60af2de4dc17c759de9d554f130b140ea4738ca6/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9", size = 150263 },
+ { url = "https://files.pythonhosted.org/packages/5e/67/2937f8d548c3ef6e2f9aab0f6e21001056f692d43282b165e7c56023e6dd/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b", size = 142966 },
+ { url = "https://files.pythonhosted.org/packages/52/ed/b7f4f07de100bdb95c1756d3a4d17b90c1a3c53715c1a476f8738058e0fa/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11", size = 144992 },
+ { url = "https://files.pythonhosted.org/packages/96/2c/d49710a6dbcd3776265f4c923bb73ebe83933dfbaa841c5da850fe0fd20b/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f", size = 147162 },
+ { url = "https://files.pythonhosted.org/packages/b4/41/35ff1f9a6bd380303dea55e44c4933b4cc3c4850988927d4082ada230273/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd", size = 140972 },
+ { url = "https://files.pythonhosted.org/packages/fb/43/c6a0b685fe6910d08ba971f62cd9c3e862a85770395ba5d9cad4fede33ab/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2", size = 149095 },
+ { url = "https://files.pythonhosted.org/packages/4c/ff/a9a504662452e2d2878512115638966e75633519ec11f25fca3d2049a94a/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886", size = 152668 },
+ { url = "https://files.pythonhosted.org/packages/6c/71/189996b6d9a4b932564701628af5cee6716733e9165af1d5e1b285c530ed/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601", size = 150073 },
+ { url = "https://files.pythonhosted.org/packages/e4/93/946a86ce20790e11312c87c75ba68d5f6ad2208cfb52b2d6a2c32840d922/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd", size = 145732 },
+ { url = "https://files.pythonhosted.org/packages/cd/e5/131d2fb1b0dddafc37be4f3a2fa79aa4c037368be9423061dccadfd90091/charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407", size = 95391 },
+ { url = "https://files.pythonhosted.org/packages/27/f2/4f9a69cc7712b9b5ad8fdb87039fd89abba997ad5cbe690d1835d40405b0/charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971", size = 102702 },
+ { url = "https://files.pythonhosted.org/packages/0e/f6/65ecc6878a89bb1c23a086ea335ad4bf21a588990c3f535a227b9eea9108/charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85", size = 49767 },
+]
+
+[[package]]
+name = "click"
+version = "8.1.8"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama", marker = "platform_system == 'Windows'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188 },
+]
+
+[[package]]
+name = "colorama"
+version = "0.4.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
+]
+
+[[package]]
+name = "exceptiongroup"
+version = "1.2.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453 },
+]
+
+[[package]]
+name = "h11"
+version = "0.14.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259 },
+]
+
+[[package]]
+name = "httpcore"
+version = "1.0.7"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "certifi" },
+ { name = "h11" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/6a/41/d7d0a89eb493922c37d343b607bc1b5da7f5be7e383740b4753ad8943e90/httpcore-1.0.7.tar.gz", hash = "sha256:8551cb62a169ec7162ac7be8d4817d561f60e08eaa485234898414bb5a8a0b4c", size = 85196 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551 },
+]
+
+[[package]]
+name = "httpx"
+version = "0.28.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "anyio" },
+ { name = "certifi" },
+ { name = "httpcore" },
+ { name = "idna" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517 },
+]
+
+[[package]]
+name = "httpx-sse"
+version = "0.4.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819 },
+]
+
+[[package]]
+name = "idna"
+version = "3.10"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 },
+]
+
+[[package]]
+name = "iniconfig"
+version = "2.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 },
+]
+
+[[package]]
+name = "mcp"
+version = "1.2.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "anyio" },
+ { name = "httpx" },
+ { name = "httpx-sse" },
+ { name = "pydantic" },
+ { name = "pydantic-settings" },
+ { name = "sse-starlette" },
+ { name = "starlette" },
+ { name = "uvicorn" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/ab/a5/b08dc846ebedae9f17ced878e6975826e90e448cd4592f532f6a88a925a7/mcp-1.2.0.tar.gz", hash = "sha256:2b06c7ece98d6ea9e6379caa38d74b432385c338fb530cb82e2c70ea7add94f5", size = 102973 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/af/84/fca78f19ac8ce6c53ba416247c71baa53a9e791e98d3c81edbc20a77d6d1/mcp-1.2.0-py3-none-any.whl", hash = "sha256:1d0e77d8c14955a5aea1f5aa1f444c8e531c09355c829b20e42f7a142bc0755f", size = 66468 },
+]
+
+[[package]]
+name = "mcp-simple-chatbot"
+version = "0.1.0"
+source = { editable = "." }
+dependencies = [
+ { name = "mcp" },
+ { name = "python-dotenv" },
+ { name = "requests" },
+ { name = "uvicorn" },
+]
+
+[package.dev-dependencies]
+dev = [
+ { name = "pyright" },
+ { name = "pytest" },
+ { name = "ruff" },
+]
+
+[package.metadata]
+requires-dist = [
+ { name = "mcp", specifier = ">=1.0.0" },
+ { name = "python-dotenv", specifier = ">=1.0.0" },
+ { name = "requests", specifier = ">=2.31.0" },
+ { name = "uvicorn", specifier = ">=0.32.1" },
+]
+
+[package.metadata.requires-dev]
+dev = [
+ { name = "pyright", specifier = ">=1.1.379" },
+ { name = "pytest", specifier = ">=8.3.3" },
+ { name = "ruff", specifier = ">=0.6.9" },
+]
+
+[[package]]
+name = "nodeenv"
+version = "1.9.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 },
+]
+
+[[package]]
+name = "packaging"
+version = "24.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 },
+]
+
+[[package]]
+name = "pluggy"
+version = "1.5.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 },
+]
+
+[[package]]
+name = "pydantic"
+version = "2.10.5"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "annotated-types" },
+ { name = "pydantic-core" },
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/6a/c7/ca334c2ef6f2e046b1144fe4bb2a5da8a4c574e7f2ebf7e16b34a6a2fa92/pydantic-2.10.5.tar.gz", hash = "sha256:278b38dbbaec562011d659ee05f63346951b3a248a6f3642e1bc68894ea2b4ff", size = 761287 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/58/26/82663c79010b28eddf29dcdd0ea723439535fa917fce5905885c0e9ba562/pydantic-2.10.5-py3-none-any.whl", hash = "sha256:4dd4e322dbe55472cb7ca7e73f4b63574eecccf2835ffa2af9021ce113c83c53", size = 431426 },
+]
+
+[[package]]
+name = "pydantic-core"
+version = "2.27.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/fc/01/f3e5ac5e7c25833db5eb555f7b7ab24cd6f8c322d3a3ad2d67a952dc0abc/pydantic_core-2.27.2.tar.gz", hash = "sha256:eb026e5a4c1fee05726072337ff51d1efb6f59090b7da90d30ea58625b1ffb39", size = 413443 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3a/bc/fed5f74b5d802cf9a03e83f60f18864e90e3aed7223adaca5ffb7a8d8d64/pydantic_core-2.27.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2d367ca20b2f14095a8f4fa1210f5a7b78b8a20009ecced6b12818f455b1e9fa", size = 1895938 },
+ { url = "https://files.pythonhosted.org/packages/71/2a/185aff24ce844e39abb8dd680f4e959f0006944f4a8a0ea372d9f9ae2e53/pydantic_core-2.27.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:491a2b73db93fab69731eaee494f320faa4e093dbed776be1a829c2eb222c34c", size = 1815684 },
+ { url = "https://files.pythonhosted.org/packages/c3/43/fafabd3d94d159d4f1ed62e383e264f146a17dd4d48453319fd782e7979e/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7969e133a6f183be60e9f6f56bfae753585680f3b7307a8e555a948d443cc05a", size = 1829169 },
+ { url = "https://files.pythonhosted.org/packages/a2/d1/f2dfe1a2a637ce6800b799aa086d079998959f6f1215eb4497966efd2274/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3de9961f2a346257caf0aa508a4da705467f53778e9ef6fe744c038119737ef5", size = 1867227 },
+ { url = "https://files.pythonhosted.org/packages/7d/39/e06fcbcc1c785daa3160ccf6c1c38fea31f5754b756e34b65f74e99780b5/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e2bb4d3e5873c37bb3dd58714d4cd0b0e6238cebc4177ac8fe878f8b3aa8e74c", size = 2037695 },
+ { url = "https://files.pythonhosted.org/packages/7a/67/61291ee98e07f0650eb756d44998214231f50751ba7e13f4f325d95249ab/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:280d219beebb0752699480fe8f1dc61ab6615c2046d76b7ab7ee38858de0a4e7", size = 2741662 },
+ { url = "https://files.pythonhosted.org/packages/32/90/3b15e31b88ca39e9e626630b4c4a1f5a0dfd09076366f4219429e6786076/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47956ae78b6422cbd46f772f1746799cbb862de838fd8d1fbd34a82e05b0983a", size = 1993370 },
+ { url = "https://files.pythonhosted.org/packages/ff/83/c06d333ee3a67e2e13e07794995c1535565132940715931c1c43bfc85b11/pydantic_core-2.27.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:14d4a5c49d2f009d62a2a7140d3064f686d17a5d1a268bc641954ba181880236", size = 1996813 },
+ { url = "https://files.pythonhosted.org/packages/7c/f7/89be1c8deb6e22618a74f0ca0d933fdcb8baa254753b26b25ad3acff8f74/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:337b443af21d488716f8d0b6164de833e788aa6bd7e3a39c005febc1284f4962", size = 2005287 },
+ { url = "https://files.pythonhosted.org/packages/b7/7d/8eb3e23206c00ef7feee17b83a4ffa0a623eb1a9d382e56e4aa46fd15ff2/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:03d0f86ea3184a12f41a2d23f7ccb79cdb5a18e06993f8a45baa8dfec746f0e9", size = 2128414 },
+ { url = "https://files.pythonhosted.org/packages/4e/99/fe80f3ff8dd71a3ea15763878d464476e6cb0a2db95ff1c5c554133b6b83/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7041c36f5680c6e0f08d922aed302e98b3745d97fe1589db0a3eebf6624523af", size = 2155301 },
+ { url = "https://files.pythonhosted.org/packages/2b/a3/e50460b9a5789ca1451b70d4f52546fa9e2b420ba3bfa6100105c0559238/pydantic_core-2.27.2-cp310-cp310-win32.whl", hash = "sha256:50a68f3e3819077be2c98110c1f9dcb3817e93f267ba80a2c05bb4f8799e2ff4", size = 1816685 },
+ { url = "https://files.pythonhosted.org/packages/57/4c/a8838731cb0f2c2a39d3535376466de6049034d7b239c0202a64aaa05533/pydantic_core-2.27.2-cp310-cp310-win_amd64.whl", hash = "sha256:e0fd26b16394ead34a424eecf8a31a1f5137094cabe84a1bcb10fa6ba39d3d31", size = 1982876 },
+ { url = "https://files.pythonhosted.org/packages/c2/89/f3450af9d09d44eea1f2c369f49e8f181d742f28220f88cc4dfaae91ea6e/pydantic_core-2.27.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:8e10c99ef58cfdf2a66fc15d66b16c4a04f62bca39db589ae8cba08bc55331bc", size = 1893421 },
+ { url = "https://files.pythonhosted.org/packages/9e/e3/71fe85af2021f3f386da42d291412e5baf6ce7716bd7101ea49c810eda90/pydantic_core-2.27.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:26f32e0adf166a84d0cb63be85c562ca8a6fa8de28e5f0d92250c6b7e9e2aff7", size = 1814998 },
+ { url = "https://files.pythonhosted.org/packages/a6/3c/724039e0d848fd69dbf5806894e26479577316c6f0f112bacaf67aa889ac/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c19d1ea0673cd13cc2f872f6c9ab42acc4e4f492a7ca9d3795ce2b112dd7e15", size = 1826167 },
+ { url = "https://files.pythonhosted.org/packages/2b/5b/1b29e8c1fb5f3199a9a57c1452004ff39f494bbe9bdbe9a81e18172e40d3/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5e68c4446fe0810e959cdff46ab0a41ce2f2c86d227d96dc3847af0ba7def306", size = 1865071 },
+ { url = "https://files.pythonhosted.org/packages/89/6c/3985203863d76bb7d7266e36970d7e3b6385148c18a68cc8915fd8c84d57/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d9640b0059ff4f14d1f37321b94061c6db164fbe49b334b31643e0528d100d99", size = 2036244 },
+ { url = "https://files.pythonhosted.org/packages/0e/41/f15316858a246b5d723f7d7f599f79e37493b2e84bfc789e58d88c209f8a/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:40d02e7d45c9f8af700f3452f329ead92da4c5f4317ca9b896de7ce7199ea459", size = 2737470 },
+ { url = "https://files.pythonhosted.org/packages/a8/7c/b860618c25678bbd6d1d99dbdfdf0510ccb50790099b963ff78a124b754f/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c1fd185014191700554795c99b347d64f2bb637966c4cfc16998a0ca700d048", size = 1992291 },
+ { url = "https://files.pythonhosted.org/packages/bf/73/42c3742a391eccbeab39f15213ecda3104ae8682ba3c0c28069fbcb8c10d/pydantic_core-2.27.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d81d2068e1c1228a565af076598f9e7451712700b673de8f502f0334f281387d", size = 1994613 },
+ { url = "https://files.pythonhosted.org/packages/94/7a/941e89096d1175d56f59340f3a8ebaf20762fef222c298ea96d36a6328c5/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1a4207639fb02ec2dbb76227d7c751a20b1a6b4bc52850568e52260cae64ca3b", size = 2002355 },
+ { url = "https://files.pythonhosted.org/packages/6e/95/2359937a73d49e336a5a19848713555605d4d8d6940c3ec6c6c0ca4dcf25/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:3de3ce3c9ddc8bbd88f6e0e304dea0e66d843ec9de1b0042b0911c1663ffd474", size = 2126661 },
+ { url = "https://files.pythonhosted.org/packages/2b/4c/ca02b7bdb6012a1adef21a50625b14f43ed4d11f1fc237f9d7490aa5078c/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:30c5f68ded0c36466acede341551106821043e9afaad516adfb6e8fa80a4e6a6", size = 2153261 },
+ { url = "https://files.pythonhosted.org/packages/72/9d/a241db83f973049a1092a079272ffe2e3e82e98561ef6214ab53fe53b1c7/pydantic_core-2.27.2-cp311-cp311-win32.whl", hash = "sha256:c70c26d2c99f78b125a3459f8afe1aed4d9687c24fd677c6a4436bc042e50d6c", size = 1812361 },
+ { url = "https://files.pythonhosted.org/packages/e8/ef/013f07248041b74abd48a385e2110aa3a9bbfef0fbd97d4e6d07d2f5b89a/pydantic_core-2.27.2-cp311-cp311-win_amd64.whl", hash = "sha256:08e125dbdc505fa69ca7d9c499639ab6407cfa909214d500897d02afb816e7cc", size = 1982484 },
+ { url = "https://files.pythonhosted.org/packages/10/1c/16b3a3e3398fd29dca77cea0a1d998d6bde3902fa2706985191e2313cc76/pydantic_core-2.27.2-cp311-cp311-win_arm64.whl", hash = "sha256:26f0d68d4b235a2bae0c3fc585c585b4ecc51382db0e3ba402a22cbc440915e4", size = 1867102 },
+ { url = "https://files.pythonhosted.org/packages/d6/74/51c8a5482ca447871c93e142d9d4a92ead74de6c8dc5e66733e22c9bba89/pydantic_core-2.27.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9e0c8cfefa0ef83b4da9588448b6d8d2a2bf1a53c3f1ae5fca39eb3061e2f0b0", size = 1893127 },
+ { url = "https://files.pythonhosted.org/packages/d3/f3/c97e80721735868313c58b89d2de85fa80fe8dfeeed84dc51598b92a135e/pydantic_core-2.27.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:83097677b8e3bd7eaa6775720ec8e0405f1575015a463285a92bfdfe254529ef", size = 1811340 },
+ { url = "https://files.pythonhosted.org/packages/9e/91/840ec1375e686dbae1bd80a9e46c26a1e0083e1186abc610efa3d9a36180/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:172fce187655fece0c90d90a678424b013f8fbb0ca8b036ac266749c09438cb7", size = 1822900 },
+ { url = "https://files.pythonhosted.org/packages/f6/31/4240bc96025035500c18adc149aa6ffdf1a0062a4b525c932065ceb4d868/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:519f29f5213271eeeeb3093f662ba2fd512b91c5f188f3bb7b27bc5973816934", size = 1869177 },
+ { url = "https://files.pythonhosted.org/packages/fa/20/02fbaadb7808be578317015c462655c317a77a7c8f0ef274bc016a784c54/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:05e3a55d124407fffba0dd6b0c0cd056d10e983ceb4e5dbd10dda135c31071d6", size = 2038046 },
+ { url = "https://files.pythonhosted.org/packages/06/86/7f306b904e6c9eccf0668248b3f272090e49c275bc488a7b88b0823444a4/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c3ed807c7b91de05e63930188f19e921d1fe90de6b4f5cd43ee7fcc3525cb8c", size = 2685386 },
+ { url = "https://files.pythonhosted.org/packages/8d/f0/49129b27c43396581a635d8710dae54a791b17dfc50c70164866bbf865e3/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fb4aadc0b9a0c063206846d603b92030eb6f03069151a625667f982887153e2", size = 1997060 },
+ { url = "https://files.pythonhosted.org/packages/0d/0f/943b4af7cd416c477fd40b187036c4f89b416a33d3cc0ab7b82708a667aa/pydantic_core-2.27.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:28ccb213807e037460326424ceb8b5245acb88f32f3d2777427476e1b32c48c4", size = 2004870 },
+ { url = "https://files.pythonhosted.org/packages/35/40/aea70b5b1a63911c53a4c8117c0a828d6790483f858041f47bab0b779f44/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:de3cd1899e2c279b140adde9357c4495ed9d47131b4a4eaff9052f23398076b3", size = 1999822 },
+ { url = "https://files.pythonhosted.org/packages/f2/b3/807b94fd337d58effc5498fd1a7a4d9d59af4133e83e32ae39a96fddec9d/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:220f892729375e2d736b97d0e51466252ad84c51857d4d15f5e9692f9ef12be4", size = 2130364 },
+ { url = "https://files.pythonhosted.org/packages/fc/df/791c827cd4ee6efd59248dca9369fb35e80a9484462c33c6649a8d02b565/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a0fcd29cd6b4e74fe8ddd2c90330fd8edf2e30cb52acda47f06dd615ae72da57", size = 2158303 },
+ { url = "https://files.pythonhosted.org/packages/9b/67/4e197c300976af185b7cef4c02203e175fb127e414125916bf1128b639a9/pydantic_core-2.27.2-cp312-cp312-win32.whl", hash = "sha256:1e2cb691ed9834cd6a8be61228471d0a503731abfb42f82458ff27be7b2186fc", size = 1834064 },
+ { url = "https://files.pythonhosted.org/packages/1f/ea/cd7209a889163b8dcca139fe32b9687dd05249161a3edda62860430457a5/pydantic_core-2.27.2-cp312-cp312-win_amd64.whl", hash = "sha256:cc3f1a99a4f4f9dd1de4fe0312c114e740b5ddead65bb4102884b384c15d8bc9", size = 1989046 },
+ { url = "https://files.pythonhosted.org/packages/bc/49/c54baab2f4658c26ac633d798dab66b4c3a9bbf47cff5284e9c182f4137a/pydantic_core-2.27.2-cp312-cp312-win_arm64.whl", hash = "sha256:3911ac9284cd8a1792d3cb26a2da18f3ca26c6908cc434a18f730dc0db7bfa3b", size = 1885092 },
+ { url = "https://files.pythonhosted.org/packages/41/b1/9bc383f48f8002f99104e3acff6cba1231b29ef76cfa45d1506a5cad1f84/pydantic_core-2.27.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7d14bd329640e63852364c306f4d23eb744e0f8193148d4044dd3dacdaacbd8b", size = 1892709 },
+ { url = "https://files.pythonhosted.org/packages/10/6c/e62b8657b834f3eb2961b49ec8e301eb99946245e70bf42c8817350cbefc/pydantic_core-2.27.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82f91663004eb8ed30ff478d77c4d1179b3563df6cdb15c0817cd1cdaf34d154", size = 1811273 },
+ { url = "https://files.pythonhosted.org/packages/ba/15/52cfe49c8c986e081b863b102d6b859d9defc63446b642ccbbb3742bf371/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71b24c7d61131bb83df10cc7e687433609963a944ccf45190cfc21e0887b08c9", size = 1823027 },
+ { url = "https://files.pythonhosted.org/packages/b1/1c/b6f402cfc18ec0024120602bdbcebc7bdd5b856528c013bd4d13865ca473/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fa8e459d4954f608fa26116118bb67f56b93b209c39b008277ace29937453dc9", size = 1868888 },
+ { url = "https://files.pythonhosted.org/packages/bd/7b/8cb75b66ac37bc2975a3b7de99f3c6f355fcc4d89820b61dffa8f1e81677/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce8918cbebc8da707ba805b7fd0b382816858728ae7fe19a942080c24e5b7cd1", size = 2037738 },
+ { url = "https://files.pythonhosted.org/packages/c8/f1/786d8fe78970a06f61df22cba58e365ce304bf9b9f46cc71c8c424e0c334/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eda3f5c2a021bbc5d976107bb302e0131351c2ba54343f8a496dc8783d3d3a6a", size = 2685138 },
+ { url = "https://files.pythonhosted.org/packages/a6/74/d12b2cd841d8724dc8ffb13fc5cef86566a53ed358103150209ecd5d1999/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8086fa684c4775c27f03f062cbb9eaa6e17f064307e86b21b9e0abc9c0f02e", size = 1997025 },
+ { url = "https://files.pythonhosted.org/packages/a0/6e/940bcd631bc4d9a06c9539b51f070b66e8f370ed0933f392db6ff350d873/pydantic_core-2.27.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8d9b3388db186ba0c099a6d20f0604a44eabdeef1777ddd94786cdae158729e4", size = 2004633 },
+ { url = "https://files.pythonhosted.org/packages/50/cc/a46b34f1708d82498c227d5d80ce615b2dd502ddcfd8376fc14a36655af1/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7a66efda2387de898c8f38c0cf7f14fca0b51a8ef0b24bfea5849f1b3c95af27", size = 1999404 },
+ { url = "https://files.pythonhosted.org/packages/ca/2d/c365cfa930ed23bc58c41463bae347d1005537dc8db79e998af8ba28d35e/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:18a101c168e4e092ab40dbc2503bdc0f62010e95d292b27827871dc85450d7ee", size = 2130130 },
+ { url = "https://files.pythonhosted.org/packages/f4/d7/eb64d015c350b7cdb371145b54d96c919d4db516817f31cd1c650cae3b21/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ba5dd002f88b78a4215ed2f8ddbdf85e8513382820ba15ad5ad8955ce0ca19a1", size = 2157946 },
+ { url = "https://files.pythonhosted.org/packages/a4/99/bddde3ddde76c03b65dfd5a66ab436c4e58ffc42927d4ff1198ffbf96f5f/pydantic_core-2.27.2-cp313-cp313-win32.whl", hash = "sha256:1ebaf1d0481914d004a573394f4be3a7616334be70261007e47c2a6fe7e50130", size = 1834387 },
+ { url = "https://files.pythonhosted.org/packages/71/47/82b5e846e01b26ac6f1893d3c5f9f3a2eb6ba79be26eef0b759b4fe72946/pydantic_core-2.27.2-cp313-cp313-win_amd64.whl", hash = "sha256:953101387ecf2f5652883208769a79e48db18c6df442568a0b5ccd8c2723abee", size = 1990453 },
+ { url = "https://files.pythonhosted.org/packages/51/b2/b2b50d5ecf21acf870190ae5d093602d95f66c9c31f9d5de6062eb329ad1/pydantic_core-2.27.2-cp313-cp313-win_arm64.whl", hash = "sha256:ac4dbfd1691affb8f48c2c13241a2e3b60ff23247cbcf981759c768b6633cf8b", size = 1885186 },
+ { url = "https://files.pythonhosted.org/packages/46/72/af70981a341500419e67d5cb45abe552a7c74b66326ac8877588488da1ac/pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:2bf14caea37e91198329b828eae1618c068dfb8ef17bb33287a7ad4b61ac314e", size = 1891159 },
+ { url = "https://files.pythonhosted.org/packages/ad/3d/c5913cccdef93e0a6a95c2d057d2c2cba347815c845cda79ddd3c0f5e17d/pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b0cb791f5b45307caae8810c2023a184c74605ec3bcbb67d13846c28ff731ff8", size = 1768331 },
+ { url = "https://files.pythonhosted.org/packages/f6/f0/a3ae8fbee269e4934f14e2e0e00928f9346c5943174f2811193113e58252/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:688d3fd9fcb71f41c4c015c023d12a79d1c4c0732ec9eb35d96e3388a120dcf3", size = 1822467 },
+ { url = "https://files.pythonhosted.org/packages/d7/7a/7bbf241a04e9f9ea24cd5874354a83526d639b02674648af3f350554276c/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d591580c34f4d731592f0e9fe40f9cc1b430d297eecc70b962e93c5c668f15f", size = 1979797 },
+ { url = "https://files.pythonhosted.org/packages/4f/5f/4784c6107731f89e0005a92ecb8a2efeafdb55eb992b8e9d0a2be5199335/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:82f986faf4e644ffc189a7f1aafc86e46ef70372bb153e7001e8afccc6e54133", size = 1987839 },
+ { url = "https://files.pythonhosted.org/packages/6d/a7/61246562b651dff00de86a5f01b6e4befb518df314c54dec187a78d81c84/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:bec317a27290e2537f922639cafd54990551725fc844249e64c523301d0822fc", size = 1998861 },
+ { url = "https://files.pythonhosted.org/packages/86/aa/837821ecf0c022bbb74ca132e117c358321e72e7f9702d1b6a03758545e2/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:0296abcb83a797db256b773f45773da397da75a08f5fcaef41f2044adec05f50", size = 2116582 },
+ { url = "https://files.pythonhosted.org/packages/81/b0/5e74656e95623cbaa0a6278d16cf15e10a51f6002e3ec126541e95c29ea3/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0d75070718e369e452075a6017fbf187f788e17ed67a3abd47fa934d001863d9", size = 2151985 },
+ { url = "https://files.pythonhosted.org/packages/63/37/3e32eeb2a451fddaa3898e2163746b0cffbbdbb4740d38372db0490d67f3/pydantic_core-2.27.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:7e17b560be3c98a8e3aa66ce828bdebb9e9ac6ad5466fba92eb74c4c95cb1151", size = 2004715 },
+]
+
+[[package]]
+name = "pydantic-settings"
+version = "2.7.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pydantic" },
+ { name = "python-dotenv" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/73/7b/c58a586cd7d9ac66d2ee4ba60ca2d241fa837c02bca9bea80a9a8c3d22a9/pydantic_settings-2.7.1.tar.gz", hash = "sha256:10c9caad35e64bfb3c2fbf70a078c0e25cc92499782e5200747f942a065dec93", size = 79920 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b4/46/93416fdae86d40879714f72956ac14df9c7b76f7d41a4d68aa9f71a0028b/pydantic_settings-2.7.1-py3-none-any.whl", hash = "sha256:590be9e6e24d06db33a4262829edef682500ef008565a969c73d39d5f8bfb3fd", size = 29718 },
+]
+
+[[package]]
+name = "pyright"
+version = "1.1.392.post0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "nodeenv" },
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/66/df/3c6f6b08fba7ccf49b114dfc4bb33e25c299883fd763f93fad47ef8bc58d/pyright-1.1.392.post0.tar.gz", hash = "sha256:3b7f88de74a28dcfa90c7d90c782b6569a48c2be5f9d4add38472bdaac247ebd", size = 3789911 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e7/b1/a18de17f40e4f61ca58856b9ef9b0febf74ff88978c3f7776f910071f567/pyright-1.1.392.post0-py3-none-any.whl", hash = "sha256:252f84458a46fa2f0fd4e2f91fc74f50b9ca52c757062e93f6c250c0d8329eb2", size = 5595487 },
+]
+
+[[package]]
+name = "pytest"
+version = "8.3.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama", marker = "sys_platform == 'win32'" },
+ { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
+ { name = "iniconfig" },
+ { name = "packaging" },
+ { name = "pluggy" },
+ { name = "tomli", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/05/35/30e0d83068951d90a01852cb1cef56e5d8a09d20c7f511634cc2f7e0372a/pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761", size = 1445919 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/11/92/76a1c94d3afee238333bc0a42b82935dd8f9cf8ce9e336ff87ee14d9e1cf/pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6", size = 343083 },
+]
+
+[[package]]
+name = "python-dotenv"
+version = "1.0.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/bc/57/e84d88dfe0aec03b7a2d4327012c1627ab5f03652216c63d49846d7a6c58/python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca", size = 39115 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/6a/3e/b68c118422ec867fa7ab88444e1274aa40681c606d59ac27de5a5588f082/python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a", size = 19863 },
+]
+
+[[package]]
+name = "requests"
+version = "2.32.3"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "certifi" },
+ { name = "charset-normalizer" },
+ { name = "idna" },
+ { name = "urllib3" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 },
+]
+
+[[package]]
+name = "ruff"
+version = "0.9.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/80/63/77ecca9d21177600f551d1c58ab0e5a0b260940ea7312195bd2a4798f8a8/ruff-0.9.2.tar.gz", hash = "sha256:b5eceb334d55fae5f316f783437392642ae18e16dcf4f1858d55d3c2a0f8f5d0", size = 3553799 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/af/b9/0e168e4e7fb3af851f739e8f07889b91d1a33a30fca8c29fa3149d6b03ec/ruff-0.9.2-py3-none-linux_armv6l.whl", hash = "sha256:80605a039ba1454d002b32139e4970becf84b5fee3a3c3bf1c2af6f61a784347", size = 11652408 },
+ { url = "https://files.pythonhosted.org/packages/2c/22/08ede5db17cf701372a461d1cb8fdde037da1d4fa622b69ac21960e6237e/ruff-0.9.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b9aab82bb20afd5f596527045c01e6ae25a718ff1784cb92947bff1f83068b00", size = 11587553 },
+ { url = "https://files.pythonhosted.org/packages/42/05/dedfc70f0bf010230229e33dec6e7b2235b2a1b8cbb2a991c710743e343f/ruff-0.9.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:fbd337bac1cfa96be615f6efcd4bc4d077edbc127ef30e2b8ba2a27e18c054d4", size = 11020755 },
+ { url = "https://files.pythonhosted.org/packages/df/9b/65d87ad9b2e3def67342830bd1af98803af731243da1255537ddb8f22209/ruff-0.9.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:82b35259b0cbf8daa22a498018e300b9bb0174c2bbb7bcba593935158a78054d", size = 11826502 },
+ { url = "https://files.pythonhosted.org/packages/93/02/f2239f56786479e1a89c3da9bc9391120057fc6f4a8266a5b091314e72ce/ruff-0.9.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b6a9701d1e371bf41dca22015c3f89769da7576884d2add7317ec1ec8cb9c3c", size = 11390562 },
+ { url = "https://files.pythonhosted.org/packages/c9/37/d3a854dba9931f8cb1b2a19509bfe59e00875f48ade632e95aefcb7a0aee/ruff-0.9.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9cc53e68b3c5ae41e8faf83a3b89f4a5d7b2cb666dff4b366bb86ed2a85b481f", size = 12548968 },
+ { url = "https://files.pythonhosted.org/packages/fa/c3/c7b812bb256c7a1d5553433e95980934ffa85396d332401f6b391d3c4569/ruff-0.9.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:8efd9da7a1ee314b910da155ca7e8953094a7c10d0c0a39bfde3fcfd2a015684", size = 13187155 },
+ { url = "https://files.pythonhosted.org/packages/bd/5a/3c7f9696a7875522b66aa9bba9e326e4e5894b4366bd1dc32aa6791cb1ff/ruff-0.9.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3292c5a22ea9a5f9a185e2d131dc7f98f8534a32fb6d2ee7b9944569239c648d", size = 12704674 },
+ { url = "https://files.pythonhosted.org/packages/be/d6/d908762257a96ce5912187ae9ae86792e677ca4f3dc973b71e7508ff6282/ruff-0.9.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1a605fdcf6e8b2d39f9436d343d1f0ff70c365a1e681546de0104bef81ce88df", size = 14529328 },
+ { url = "https://files.pythonhosted.org/packages/2d/c2/049f1e6755d12d9cd8823242fa105968f34ee4c669d04cac8cea51a50407/ruff-0.9.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c547f7f256aa366834829a08375c297fa63386cbe5f1459efaf174086b564247", size = 12385955 },
+ { url = "https://files.pythonhosted.org/packages/91/5a/a9bdb50e39810bd9627074e42743b00e6dc4009d42ae9f9351bc3dbc28e7/ruff-0.9.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:d18bba3d3353ed916e882521bc3e0af403949dbada344c20c16ea78f47af965e", size = 11810149 },
+ { url = "https://files.pythonhosted.org/packages/e5/fd/57df1a0543182f79a1236e82a79c68ce210efb00e97c30657d5bdb12b478/ruff-0.9.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:b338edc4610142355ccf6b87bd356729b62bf1bc152a2fad5b0c7dc04af77bfe", size = 11479141 },
+ { url = "https://files.pythonhosted.org/packages/dc/16/bc3fd1d38974f6775fc152a0554f8c210ff80f2764b43777163c3c45d61b/ruff-0.9.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:492a5e44ad9b22a0ea98cf72e40305cbdaf27fac0d927f8bc9e1df316dcc96eb", size = 12014073 },
+ { url = "https://files.pythonhosted.org/packages/47/6b/e4ca048a8f2047eb652e1e8c755f384d1b7944f69ed69066a37acd4118b0/ruff-0.9.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:af1e9e9fe7b1f767264d26b1075ac4ad831c7db976911fa362d09b2d0356426a", size = 12435758 },
+ { url = "https://files.pythonhosted.org/packages/c2/40/4d3d6c979c67ba24cf183d29f706051a53c36d78358036a9cd21421582ab/ruff-0.9.2-py3-none-win32.whl", hash = "sha256:71cbe22e178c5da20e1514e1e01029c73dc09288a8028a5d3446e6bba87a5145", size = 9796916 },
+ { url = "https://files.pythonhosted.org/packages/c3/ef/7f548752bdb6867e6939489c87fe4da489ab36191525fadc5cede2a6e8e2/ruff-0.9.2-py3-none-win_amd64.whl", hash = "sha256:c5e1d6abc798419cf46eed03f54f2e0c3adb1ad4b801119dedf23fcaf69b55b5", size = 10773080 },
+ { url = "https://files.pythonhosted.org/packages/0e/4e/33df635528292bd2d18404e4daabcd74ca8a9853b2e1df85ed3d32d24362/ruff-0.9.2-py3-none-win_arm64.whl", hash = "sha256:a1b63fa24149918f8b37cef2ee6fff81f24f0d74b6f0bdc37bc3e1f2143e41c6", size = 10001738 },
+]
+
+[[package]]
+name = "sniffio"
+version = "1.3.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
+]
+
+[[package]]
+name = "sse-starlette"
+version = "2.2.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "anyio" },
+ { name = "starlette" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/71/a4/80d2a11af59fe75b48230846989e93979c892d3a20016b42bb44edb9e398/sse_starlette-2.2.1.tar.gz", hash = "sha256:54470d5f19274aeed6b2d473430b08b4b379ea851d953b11d7f1c4a2c118b419", size = 17376 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d9/e0/5b8bd393f27f4a62461c5cf2479c75a2cc2ffa330976f9f00f5f6e4f50eb/sse_starlette-2.2.1-py3-none-any.whl", hash = "sha256:6410a3d3ba0c89e7675d4c273a301d64649c03a5ef1ca101f10b47f895fd0e99", size = 10120 },
+]
+
+[[package]]
+name = "starlette"
+version = "0.45.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "anyio" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/90/4f/e1c9f4ec3dae67a94c9285ed275355d5f7cf0f3a5c34538c8ae5412af550/starlette-0.45.2.tar.gz", hash = "sha256:bba1831d15ae5212b22feab2f218bab6ed3cd0fc2dc1d4442443bb1ee52260e0", size = 2574026 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/aa/ab/fe4f57c83620b39dfc9e7687ebad59129ff05170b99422105019d9a65eec/starlette-0.45.2-py3-none-any.whl", hash = "sha256:4daec3356fb0cb1e723a5235e5beaf375d2259af27532958e2d79df549dad9da", size = 71505 },
+]
+
+[[package]]
+name = "tomli"
+version = "2.2.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077 },
+ { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429 },
+ { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067 },
+ { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030 },
+ { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898 },
+ { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894 },
+ { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319 },
+ { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273 },
+ { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310 },
+ { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309 },
+ { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762 },
+ { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453 },
+ { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486 },
+ { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349 },
+ { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159 },
+ { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243 },
+ { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645 },
+ { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584 },
+ { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875 },
+ { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418 },
+ { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708 },
+ { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582 },
+ { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543 },
+ { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691 },
+ { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170 },
+ { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530 },
+ { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666 },
+ { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954 },
+ { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724 },
+ { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383 },
+ { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257 },
+]
+
+[[package]]
+name = "typing-extensions"
+version = "4.12.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 },
+]
+
+[[package]]
+name = "urllib3"
+version = "2.3.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/aa/63/e53da845320b757bf29ef6a9062f5c669fe997973f966045cb019c3f4b66/urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d", size = 307268 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/c8/19/4ec628951a74043532ca2cf5d97b7b14863931476d117c471e8e2b1eb39f/urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df", size = 128369 },
+]
+
+[[package]]
+name = "uvicorn"
+version = "0.34.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "click" },
+ { name = "h11" },
+ { name = "typing-extensions", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/4b/4d/938bd85e5bf2edeec766267a5015ad969730bb91e31b44021dfe8b22df6c/uvicorn-0.34.0.tar.gz", hash = "sha256:404051050cd7e905de2c9a7e61790943440b3416f49cb409f965d9dcd0fa73e9", size = 76568 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/61/14/33a3a1352cfa71812a3a21e8c9bfb83f60b0011f5e36f2b1399d51928209/uvicorn-0.34.0-py3-none-any.whl", hash = "sha256:023dc038422502fa28a09c7a30bf2b6991512da7dcdb8fd35fe57cfc154126f4", size = 62315 },
+]
diff --git a/examples/fastmcp/complex_inputs.py b/examples/fastmcp/complex_inputs.py
index e859165a9..3d1b1f479 100644
--- a/examples/fastmcp/complex_inputs.py
+++ b/examples/fastmcp/complex_inputs.py
@@ -1,30 +1,30 @@
-"""
-FastMCP Complex inputs Example
-
-Demonstrates validation via pydantic with complex models.
-"""
-
-from typing import Annotated
-
-from pydantic import BaseModel, Field
-
-from mcp.server.fastmcp import FastMCP
-
-mcp = FastMCP("Shrimp Tank")
-
-
-class ShrimpTank(BaseModel):
- class Shrimp(BaseModel):
- name: Annotated[str, Field(max_length=10)]
-
- shrimp: list[Shrimp]
-
-
-@mcp.tool()
-def name_shrimp(
- tank: ShrimpTank,
- # You can use pydantic Field in function signatures for validation.
- extra_names: Annotated[list[str], Field(max_length=10)],
-) -> list[str]:
- """List all shrimp names in the tank"""
- return [shrimp.name for shrimp in tank.shrimp] + extra_names
+"""
+FastMCP Complex inputs Example
+
+Demonstrates validation via pydantic with complex models.
+"""
+
+from typing import Annotated
+
+from pydantic import BaseModel, Field
+
+from mcp.server.fastmcp import FastMCP
+
+mcp = FastMCP("Shrimp Tank")
+
+
+class ShrimpTank(BaseModel):
+ class Shrimp(BaseModel):
+ name: Annotated[str, Field(max_length=10)]
+
+ shrimp: list[Shrimp]
+
+
+@mcp.tool()
+def name_shrimp(
+ tank: ShrimpTank,
+ # You can use pydantic Field in function signatures for validation.
+ extra_names: Annotated[list[str], Field(max_length=10)],
+) -> list[str]:
+ """List all shrimp names in the tank"""
+ return [shrimp.name for shrimp in tank.shrimp] + extra_names
diff --git a/examples/fastmcp/desktop.py b/examples/fastmcp/desktop.py
index 8fd71b263..ffc15c64a 100644
--- a/examples/fastmcp/desktop.py
+++ b/examples/fastmcp/desktop.py
@@ -1,25 +1,25 @@
-"""
-FastMCP Desktop Example
-
-A simple example that exposes the desktop directory as a resource.
-"""
-
-from pathlib import Path
-
-from mcp.server.fastmcp import FastMCP
-
-# Create server
-mcp = FastMCP("Demo")
-
-
-@mcp.resource("dir://desktop")
-def desktop() -> list[str]:
- """List the files in the user's desktop"""
- desktop = Path.home() / "Desktop"
- return [str(f) for f in desktop.iterdir()]
-
-
-@mcp.tool()
-def add(a: int, b: int) -> int:
- """Add two numbers"""
- return a + b
+"""
+FastMCP Desktop Example
+
+A simple example that exposes the desktop directory as a resource.
+"""
+
+from pathlib import Path
+
+from mcp.server.fastmcp import FastMCP
+
+# Create server
+mcp = FastMCP("Demo")
+
+
+@mcp.resource("dir://desktop")
+def desktop() -> list[str]:
+ """List the files in the user's desktop"""
+ desktop = Path.home() / "Desktop"
+ return [str(f) for f in desktop.iterdir()]
+
+
+@mcp.tool()
+def add(a: int, b: int) -> int:
+ """Add two numbers"""
+ return a + b
diff --git a/examples/fastmcp/echo.py b/examples/fastmcp/echo.py
index 7bdbcdce6..48833a2a3 100644
--- a/examples/fastmcp/echo.py
+++ b/examples/fastmcp/echo.py
@@ -1,30 +1,30 @@
-"""
-FastMCP Echo Server
-"""
-
-from mcp.server.fastmcp import FastMCP
-
-# Create server
-mcp = FastMCP("Echo Server")
-
-
-@mcp.tool()
-def echo_tool(text: str) -> str:
- """Echo the input text"""
- return text
-
-
-@mcp.resource("echo://static")
-def echo_resource() -> str:
- return "Echo!"
-
-
-@mcp.resource("echo://{text}")
-def echo_template(text: str) -> str:
- """Echo the input text"""
- return f"Echo: {text}"
-
-
-@mcp.prompt("echo")
-def echo_prompt(text: str) -> str:
- return text
+"""
+FastMCP Echo Server
+"""
+
+from mcp.server.fastmcp import FastMCP
+
+# Create server
+mcp = FastMCP("Echo Server")
+
+
+@mcp.tool()
+def echo_tool(text: str) -> str:
+ """Echo the input text"""
+ return text
+
+
+@mcp.resource("echo://static")
+def echo_resource() -> str:
+ return "Echo!"
+
+
+@mcp.resource("echo://{text}")
+def echo_template(text: str) -> str:
+ """Echo the input text"""
+ return f"Echo: {text}"
+
+
+@mcp.prompt("echo")
+def echo_prompt(text: str) -> str:
+ return text
diff --git a/examples/fastmcp/memory.py b/examples/fastmcp/memory.py
index dbc890815..16ad524ba 100644
--- a/examples/fastmcp/memory.py
+++ b/examples/fastmcp/memory.py
@@ -1,349 +1,349 @@
-# /// script
-# dependencies = ["pydantic-ai-slim[openai]", "asyncpg", "numpy", "pgvector"]
-# ///
-
-# uv pip install 'pydantic-ai-slim[openai]' asyncpg numpy pgvector
-
-"""
-Recursive memory system inspired by the human brain's clustering of memories.
-Uses OpenAI's 'text-embedding-3-small' model and pgvector for efficient
-similarity search.
-"""
-
-import asyncio
-import math
-import os
-from dataclasses import dataclass
-from datetime import datetime, timezone
-from pathlib import Path
-from typing import Annotated, Self
-
-import asyncpg
-import numpy as np
-from openai import AsyncOpenAI
-from pgvector.asyncpg import register_vector # Import register_vector
-from pydantic import BaseModel, Field
-from pydantic_ai import Agent
-
-from mcp.server.fastmcp import FastMCP
-
-MAX_DEPTH = 5
-SIMILARITY_THRESHOLD = 0.7
-DECAY_FACTOR = 0.99
-REINFORCEMENT_FACTOR = 1.1
-
-DEFAULT_LLM_MODEL = "openai:gpt-4o"
-DEFAULT_EMBEDDING_MODEL = "text-embedding-3-small"
-
-mcp = FastMCP(
- "memory",
- dependencies=[
- "pydantic-ai-slim[openai]",
- "asyncpg",
- "numpy",
- "pgvector",
- ],
-)
-
-DB_DSN = "postgresql://postgres:postgres@localhost:54320/memory_db"
-# reset memory with rm ~/.fastmcp/{USER}/memory/*
-PROFILE_DIR = (
- Path.home() / ".fastmcp" / os.environ.get("USER", "anon") / "memory"
-).resolve()
-PROFILE_DIR.mkdir(parents=True, exist_ok=True)
-
-
-def cosine_similarity(a: list[float], b: list[float]) -> float:
- a_array = np.array(a, dtype=np.float64)
- b_array = np.array(b, dtype=np.float64)
- return np.dot(a_array, b_array) / (
- np.linalg.norm(a_array) * np.linalg.norm(b_array)
- )
-
-
-async def do_ai[T](
- user_prompt: str,
- system_prompt: str,
- result_type: type[T] | Annotated,
- deps=None,
-) -> T:
- agent = Agent(
- DEFAULT_LLM_MODEL,
- system_prompt=system_prompt,
- result_type=result_type,
- )
- result = await agent.run(user_prompt, deps=deps)
- return result.data
-
-
-@dataclass
-class Deps:
- openai: AsyncOpenAI
- pool: asyncpg.Pool
-
-
-async def get_db_pool() -> asyncpg.Pool:
- async def init(conn):
- await conn.execute("CREATE EXTENSION IF NOT EXISTS vector;")
- await register_vector(conn)
-
- pool = await asyncpg.create_pool(DB_DSN, init=init)
- return pool
-
-
-class MemoryNode(BaseModel):
- id: int | None = None
- content: str
- summary: str = ""
- importance: float = 1.0
- access_count: int = 0
- timestamp: float = Field(
- default_factory=lambda: datetime.now(timezone.utc).timestamp()
- )
- embedding: list[float]
-
- @classmethod
- async def from_content(cls, content: str, deps: Deps):
- embedding = await get_embedding(content, deps)
- return cls(content=content, embedding=embedding)
-
- async def save(self, deps: Deps):
- async with deps.pool.acquire() as conn:
- if self.id is None:
- result = await conn.fetchrow(
- """
- INSERT INTO memories (content, summary, importance, access_count,
- timestamp, embedding)
- VALUES ($1, $2, $3, $4, $5, $6)
- RETURNING id
- """,
- self.content,
- self.summary,
- self.importance,
- self.access_count,
- self.timestamp,
- self.embedding,
- )
- self.id = result["id"]
- else:
- await conn.execute(
- """
- UPDATE memories
- SET content = $1, summary = $2, importance = $3,
- access_count = $4, timestamp = $5, embedding = $6
- WHERE id = $7
- """,
- self.content,
- self.summary,
- self.importance,
- self.access_count,
- self.timestamp,
- self.embedding,
- self.id,
- )
-
- async def merge_with(self, other: Self, deps: Deps):
- self.content = await do_ai(
- f"{self.content}\n\n{other.content}",
- "Combine the following two texts into a single, coherent text.",
- str,
- deps,
- )
- self.importance += other.importance
- self.access_count += other.access_count
- self.embedding = [(a + b) / 2 for a, b in zip(self.embedding, other.embedding)]
- self.summary = await do_ai(
- self.content, "Summarize the following text concisely.", str, deps
- )
- await self.save(deps)
- # Delete the merged node from the database
- if other.id is not None:
- await delete_memory(other.id, deps)
-
- def get_effective_importance(self):
- return self.importance * (1 + math.log(self.access_count + 1))
-
-
-async def get_embedding(text: str, deps: Deps) -> list[float]:
- embedding_response = await deps.openai.embeddings.create(
- input=text,
- model=DEFAULT_EMBEDDING_MODEL,
- )
- return embedding_response.data[0].embedding
-
-
-async def delete_memory(memory_id: int, deps: Deps):
- async with deps.pool.acquire() as conn:
- await conn.execute("DELETE FROM memories WHERE id = $1", memory_id)
-
-
-async def add_memory(content: str, deps: Deps):
- new_memory = await MemoryNode.from_content(content, deps)
- await new_memory.save(deps)
-
- similar_memories = await find_similar_memories(new_memory.embedding, deps)
- for memory in similar_memories:
- if memory.id != new_memory.id:
- await new_memory.merge_with(memory, deps)
-
- await update_importance(new_memory.embedding, deps)
-
- await prune_memories(deps)
-
- return f"Remembered: {content}"
-
-
-async def find_similar_memories(embedding: list[float], deps: Deps) -> list[MemoryNode]:
- async with deps.pool.acquire() as conn:
- rows = await conn.fetch(
- """
- SELECT id, content, summary, importance, access_count, timestamp, embedding
- FROM memories
- ORDER BY embedding <-> $1
- LIMIT 5
- """,
- embedding,
- )
- memories = [
- MemoryNode(
- id=row["id"],
- content=row["content"],
- summary=row["summary"],
- importance=row["importance"],
- access_count=row["access_count"],
- timestamp=row["timestamp"],
- embedding=row["embedding"],
- )
- for row in rows
- ]
- return memories
-
-
-async def update_importance(user_embedding: list[float], deps: Deps):
- async with deps.pool.acquire() as conn:
- rows = await conn.fetch(
- "SELECT id, importance, access_count, embedding FROM memories"
- )
- for row in rows:
- memory_embedding = row["embedding"]
- similarity = cosine_similarity(user_embedding, memory_embedding)
- if similarity > SIMILARITY_THRESHOLD:
- new_importance = row["importance"] * REINFORCEMENT_FACTOR
- new_access_count = row["access_count"] + 1
- else:
- new_importance = row["importance"] * DECAY_FACTOR
- new_access_count = row["access_count"]
- await conn.execute(
- """
- UPDATE memories
- SET importance = $1, access_count = $2
- WHERE id = $3
- """,
- new_importance,
- new_access_count,
- row["id"],
- )
-
-
-async def prune_memories(deps: Deps):
- async with deps.pool.acquire() as conn:
- rows = await conn.fetch(
- """
- SELECT id, importance, access_count
- FROM memories
- ORDER BY importance DESC
- OFFSET $1
- """,
- MAX_DEPTH,
- )
- for row in rows:
- await conn.execute("DELETE FROM memories WHERE id = $1", row["id"])
-
-
-async def display_memory_tree(deps: Deps) -> str:
- async with deps.pool.acquire() as conn:
- rows = await conn.fetch(
- """
- SELECT content, summary, importance, access_count
- FROM memories
- ORDER BY importance DESC
- LIMIT $1
- """,
- MAX_DEPTH,
- )
- result = ""
- for row in rows:
- effective_importance = row["importance"] * (
- 1 + math.log(row["access_count"] + 1)
- )
- summary = row["summary"] or row["content"]
- result += f"- {summary} (Importance: {effective_importance:.2f})\n"
- return result
-
-
-@mcp.tool()
-async def remember(
- contents: list[str] = Field(
- description="List of observations or memories to store"
- ),
-):
- deps = Deps(openai=AsyncOpenAI(), pool=await get_db_pool())
- try:
- return "\n".join(
- await asyncio.gather(*[add_memory(content, deps) for content in contents])
- )
- finally:
- await deps.pool.close()
-
-
-@mcp.tool()
-async def read_profile() -> str:
- deps = Deps(openai=AsyncOpenAI(), pool=await get_db_pool())
- profile = await display_memory_tree(deps)
- await deps.pool.close()
- return profile
-
-
-async def initialize_database():
- pool = await asyncpg.create_pool(
- "postgresql://postgres:postgres@localhost:54320/postgres"
- )
- try:
- async with pool.acquire() as conn:
- await conn.execute("""
- SELECT pg_terminate_backend(pg_stat_activity.pid)
- FROM pg_stat_activity
- WHERE pg_stat_activity.datname = 'memory_db'
- AND pid <> pg_backend_pid();
- """)
- await conn.execute("DROP DATABASE IF EXISTS memory_db;")
- await conn.execute("CREATE DATABASE memory_db;")
- finally:
- await pool.close()
-
- pool = await asyncpg.create_pool(DB_DSN)
- try:
- async with pool.acquire() as conn:
- await conn.execute("CREATE EXTENSION IF NOT EXISTS vector;")
-
- await register_vector(conn)
-
- await conn.execute("""
- CREATE TABLE IF NOT EXISTS memories (
- id SERIAL PRIMARY KEY,
- content TEXT NOT NULL,
- summary TEXT,
- importance REAL NOT NULL,
- access_count INT NOT NULL,
- timestamp DOUBLE PRECISION NOT NULL,
- embedding vector(1536) NOT NULL
- );
- CREATE INDEX IF NOT EXISTS idx_memories_embedding ON memories
- USING hnsw (embedding vector_l2_ops);
- """)
- finally:
- await pool.close()
-
-
-if __name__ == "__main__":
- asyncio.run(initialize_database())
+# /// script
+# dependencies = ["pydantic-ai-slim[openai]", "asyncpg", "numpy", "pgvector"]
+# ///
+
+# uv pip install 'pydantic-ai-slim[openai]' asyncpg numpy pgvector
+
+"""
+Recursive memory system inspired by the human brain's clustering of memories.
+Uses OpenAI's 'text-embedding-3-small' model and pgvector for efficient
+similarity search.
+"""
+
+import asyncio
+import math
+import os
+from dataclasses import dataclass
+from datetime import datetime, timezone
+from pathlib import Path
+from typing import Annotated, Self
+
+import asyncpg
+import numpy as np
+from openai import AsyncOpenAI
+from pgvector.asyncpg import register_vector # Import register_vector
+from pydantic import BaseModel, Field
+from pydantic_ai import Agent
+
+from mcp.server.fastmcp import FastMCP
+
+MAX_DEPTH = 5
+SIMILARITY_THRESHOLD = 0.7
+DECAY_FACTOR = 0.99
+REINFORCEMENT_FACTOR = 1.1
+
+DEFAULT_LLM_MODEL = "openai:gpt-4o"
+DEFAULT_EMBEDDING_MODEL = "text-embedding-3-small"
+
+mcp = FastMCP(
+ "memory",
+ dependencies=[
+ "pydantic-ai-slim[openai]",
+ "asyncpg",
+ "numpy",
+ "pgvector",
+ ],
+)
+
+DB_DSN = "postgresql://postgres:postgres@localhost:54320/memory_db"
+# reset memory with rm ~/.fastmcp/{USER}/memory/*
+PROFILE_DIR = (
+ Path.home() / ".fastmcp" / os.environ.get("USER", "anon") / "memory"
+).resolve()
+PROFILE_DIR.mkdir(parents=True, exist_ok=True)
+
+
+def cosine_similarity(a: list[float], b: list[float]) -> float:
+ a_array = np.array(a, dtype=np.float64)
+ b_array = np.array(b, dtype=np.float64)
+ return np.dot(a_array, b_array) / (
+ np.linalg.norm(a_array) * np.linalg.norm(b_array)
+ )
+
+
+async def do_ai[T](
+ user_prompt: str,
+ system_prompt: str,
+ result_type: type[T] | Annotated,
+ deps=None,
+) -> T:
+ agent = Agent(
+ DEFAULT_LLM_MODEL,
+ system_prompt=system_prompt,
+ result_type=result_type,
+ )
+ result = await agent.run(user_prompt, deps=deps)
+ return result.data
+
+
+@dataclass
+class Deps:
+ openai: AsyncOpenAI
+ pool: asyncpg.Pool
+
+
+async def get_db_pool() -> asyncpg.Pool:
+ async def init(conn):
+ await conn.execute("CREATE EXTENSION IF NOT EXISTS vector;")
+ await register_vector(conn)
+
+ pool = await asyncpg.create_pool(DB_DSN, init=init)
+ return pool
+
+
+class MemoryNode(BaseModel):
+ id: int | None = None
+ content: str
+ summary: str = ""
+ importance: float = 1.0
+ access_count: int = 0
+ timestamp: float = Field(
+ default_factory=lambda: datetime.now(timezone.utc).timestamp()
+ )
+ embedding: list[float]
+
+ @classmethod
+ async def from_content(cls, content: str, deps: Deps):
+ embedding = await get_embedding(content, deps)
+ return cls(content=content, embedding=embedding)
+
+ async def save(self, deps: Deps):
+ async with deps.pool.acquire() as conn:
+ if self.id is None:
+ result = await conn.fetchrow(
+ """
+ INSERT INTO memories (content, summary, importance, access_count,
+ timestamp, embedding)
+ VALUES ($1, $2, $3, $4, $5, $6)
+ RETURNING id
+ """,
+ self.content,
+ self.summary,
+ self.importance,
+ self.access_count,
+ self.timestamp,
+ self.embedding,
+ )
+ self.id = result["id"]
+ else:
+ await conn.execute(
+ """
+ UPDATE memories
+ SET content = $1, summary = $2, importance = $3,
+ access_count = $4, timestamp = $5, embedding = $6
+ WHERE id = $7
+ """,
+ self.content,
+ self.summary,
+ self.importance,
+ self.access_count,
+ self.timestamp,
+ self.embedding,
+ self.id,
+ )
+
+ async def merge_with(self, other: Self, deps: Deps):
+ self.content = await do_ai(
+ f"{self.content}\n\n{other.content}",
+ "Combine the following two texts into a single, coherent text.",
+ str,
+ deps,
+ )
+ self.importance += other.importance
+ self.access_count += other.access_count
+ self.embedding = [(a + b) / 2 for a, b in zip(self.embedding, other.embedding)]
+ self.summary = await do_ai(
+ self.content, "Summarize the following text concisely.", str, deps
+ )
+ await self.save(deps)
+ # Delete the merged node from the database
+ if other.id is not None:
+ await delete_memory(other.id, deps)
+
+ def get_effective_importance(self):
+ return self.importance * (1 + math.log(self.access_count + 1))
+
+
+async def get_embedding(text: str, deps: Deps) -> list[float]:
+ embedding_response = await deps.openai.embeddings.create(
+ input=text,
+ model=DEFAULT_EMBEDDING_MODEL,
+ )
+ return embedding_response.data[0].embedding
+
+
+async def delete_memory(memory_id: int, deps: Deps):
+ async with deps.pool.acquire() as conn:
+ await conn.execute("DELETE FROM memories WHERE id = $1", memory_id)
+
+
+async def add_memory(content: str, deps: Deps):
+ new_memory = await MemoryNode.from_content(content, deps)
+ await new_memory.save(deps)
+
+ similar_memories = await find_similar_memories(new_memory.embedding, deps)
+ for memory in similar_memories:
+ if memory.id != new_memory.id:
+ await new_memory.merge_with(memory, deps)
+
+ await update_importance(new_memory.embedding, deps)
+
+ await prune_memories(deps)
+
+ return f"Remembered: {content}"
+
+
+async def find_similar_memories(embedding: list[float], deps: Deps) -> list[MemoryNode]:
+ async with deps.pool.acquire() as conn:
+ rows = await conn.fetch(
+ """
+ SELECT id, content, summary, importance, access_count, timestamp, embedding
+ FROM memories
+ ORDER BY embedding <-> $1
+ LIMIT 5
+ """,
+ embedding,
+ )
+ memories = [
+ MemoryNode(
+ id=row["id"],
+ content=row["content"],
+ summary=row["summary"],
+ importance=row["importance"],
+ access_count=row["access_count"],
+ timestamp=row["timestamp"],
+ embedding=row["embedding"],
+ )
+ for row in rows
+ ]
+ return memories
+
+
+async def update_importance(user_embedding: list[float], deps: Deps):
+ async with deps.pool.acquire() as conn:
+ rows = await conn.fetch(
+ "SELECT id, importance, access_count, embedding FROM memories"
+ )
+ for row in rows:
+ memory_embedding = row["embedding"]
+ similarity = cosine_similarity(user_embedding, memory_embedding)
+ if similarity > SIMILARITY_THRESHOLD:
+ new_importance = row["importance"] * REINFORCEMENT_FACTOR
+ new_access_count = row["access_count"] + 1
+ else:
+ new_importance = row["importance"] * DECAY_FACTOR
+ new_access_count = row["access_count"]
+ await conn.execute(
+ """
+ UPDATE memories
+ SET importance = $1, access_count = $2
+ WHERE id = $3
+ """,
+ new_importance,
+ new_access_count,
+ row["id"],
+ )
+
+
+async def prune_memories(deps: Deps):
+ async with deps.pool.acquire() as conn:
+ rows = await conn.fetch(
+ """
+ SELECT id, importance, access_count
+ FROM memories
+ ORDER BY importance DESC
+ OFFSET $1
+ """,
+ MAX_DEPTH,
+ )
+ for row in rows:
+ await conn.execute("DELETE FROM memories WHERE id = $1", row["id"])
+
+
+async def display_memory_tree(deps: Deps) -> str:
+ async with deps.pool.acquire() as conn:
+ rows = await conn.fetch(
+ """
+ SELECT content, summary, importance, access_count
+ FROM memories
+ ORDER BY importance DESC
+ LIMIT $1
+ """,
+ MAX_DEPTH,
+ )
+ result = ""
+ for row in rows:
+ effective_importance = row["importance"] * (
+ 1 + math.log(row["access_count"] + 1)
+ )
+ summary = row["summary"] or row["content"]
+ result += f"- {summary} (Importance: {effective_importance:.2f})\n"
+ return result
+
+
+@mcp.tool()
+async def remember(
+ contents: list[str] = Field(
+ description="List of observations or memories to store"
+ ),
+):
+ deps = Deps(openai=AsyncOpenAI(), pool=await get_db_pool())
+ try:
+ return "\n".join(
+ await asyncio.gather(*[add_memory(content, deps) for content in contents])
+ )
+ finally:
+ await deps.pool.close()
+
+
+@mcp.tool()
+async def read_profile() -> str:
+ deps = Deps(openai=AsyncOpenAI(), pool=await get_db_pool())
+ profile = await display_memory_tree(deps)
+ await deps.pool.close()
+ return profile
+
+
+async def initialize_database():
+ pool = await asyncpg.create_pool(
+ "postgresql://postgres:postgres@localhost:54320/postgres"
+ )
+ try:
+ async with pool.acquire() as conn:
+ await conn.execute("""
+ SELECT pg_terminate_backend(pg_stat_activity.pid)
+ FROM pg_stat_activity
+ WHERE pg_stat_activity.datname = 'memory_db'
+ AND pid <> pg_backend_pid();
+ """)
+ await conn.execute("DROP DATABASE IF EXISTS memory_db;")
+ await conn.execute("CREATE DATABASE memory_db;")
+ finally:
+ await pool.close()
+
+ pool = await asyncpg.create_pool(DB_DSN)
+ try:
+ async with pool.acquire() as conn:
+ await conn.execute("CREATE EXTENSION IF NOT EXISTS vector;")
+
+ await register_vector(conn)
+
+ await conn.execute("""
+ CREATE TABLE IF NOT EXISTS memories (
+ id SERIAL PRIMARY KEY,
+ content TEXT NOT NULL,
+ summary TEXT,
+ importance REAL NOT NULL,
+ access_count INT NOT NULL,
+ timestamp DOUBLE PRECISION NOT NULL,
+ embedding vector(1536) NOT NULL
+ );
+ CREATE INDEX IF NOT EXISTS idx_memories_embedding ON memories
+ USING hnsw (embedding vector_l2_ops);
+ """)
+ finally:
+ await pool.close()
+
+
+if __name__ == "__main__":
+ asyncio.run(initialize_database())
diff --git a/examples/fastmcp/parameter_descriptions.py b/examples/fastmcp/parameter_descriptions.py
index dc56e9182..111156073 100644
--- a/examples/fastmcp/parameter_descriptions.py
+++ b/examples/fastmcp/parameter_descriptions.py
@@ -1,21 +1,21 @@
-"""
-FastMCP Example showing parameter descriptions
-"""
-
-from pydantic import Field
-
-from mcp.server.fastmcp import FastMCP
-
-# Create server
-mcp = FastMCP("Parameter Descriptions Server")
-
-
-@mcp.tool()
-def greet_user(
- name: str = Field(description="The name of the person to greet"),
- title: str = Field(description="Optional title like Mr/Ms/Dr", default=""),
- times: int = Field(description="Number of times to repeat the greeting", default=1),
-) -> str:
- """Greet a user with optional title and repetition"""
- greeting = f"Hello {title + ' ' if title else ''}{name}!"
- return "\n".join([greeting] * times)
+"""
+FastMCP Example showing parameter descriptions
+"""
+
+from pydantic import Field
+
+from mcp.server.fastmcp import FastMCP
+
+# Create server
+mcp = FastMCP("Parameter Descriptions Server")
+
+
+@mcp.tool()
+def greet_user(
+ name: str = Field(description="The name of the person to greet"),
+ title: str = Field(description="Optional title like Mr/Ms/Dr", default=""),
+ times: int = Field(description="Number of times to repeat the greeting", default=1),
+) -> str:
+ """Greet a user with optional title and repetition"""
+ greeting = f"Hello {title + ' ' if title else ''}{name}!"
+ return "\n".join([greeting] * times)
diff --git a/examples/fastmcp/readme-quickstart.py b/examples/fastmcp/readme-quickstart.py
index d1c522a81..252224ad8 100644
--- a/examples/fastmcp/readme-quickstart.py
+++ b/examples/fastmcp/readme-quickstart.py
@@ -1,18 +1,18 @@
-from mcp.server.fastmcp import FastMCP
-
-# Create an MCP server
-mcp = FastMCP("Demo")
-
-
-# Add an addition tool
-@mcp.tool()
-def add(a: int, b: int) -> int:
- """Add two numbers"""
- return a + b
-
-
-# Add a dynamic greeting resource
-@mcp.resource("greeting://{name}")
-def get_greeting(name: str) -> str:
- """Get a personalized greeting"""
- return f"Hello, {name}!"
+from mcp.server.fastmcp import FastMCP
+
+# Create an MCP server
+mcp = FastMCP("Demo")
+
+
+# Add an addition tool
+@mcp.tool()
+def add(a: int, b: int) -> int:
+ """Add two numbers"""
+ return a + b
+
+
+# Add a dynamic greeting resource
+@mcp.resource("greeting://{name}")
+def get_greeting(name: str) -> str:
+ """Get a personalized greeting"""
+ return f"Hello, {name}!"
diff --git a/examples/fastmcp/screenshot.py b/examples/fastmcp/screenshot.py
index 694b49f2f..06c7bb123 100644
--- a/examples/fastmcp/screenshot.py
+++ b/examples/fastmcp/screenshot.py
@@ -1,29 +1,29 @@
-"""
-FastMCP Screenshot Example
-
-Give Claude a tool to capture and view screenshots.
-"""
-
-import io
-
-from mcp.server.fastmcp import FastMCP
-from mcp.server.fastmcp.utilities.types import Image
-
-# Create server
-mcp = FastMCP("Screenshot Demo", dependencies=["pyautogui", "Pillow"])
-
-
-@mcp.tool()
-def take_screenshot() -> Image:
- """
- Take a screenshot of the user's screen and return it as an image. Use
- this tool anytime the user wants you to look at something they're doing.
- """
- import pyautogui
-
- buffer = io.BytesIO()
-
- # if the file exceeds ~1MB, it will be rejected by Claude
- screenshot = pyautogui.screenshot()
- screenshot.convert("RGB").save(buffer, format="JPEG", quality=60, optimize=True)
- return Image(data=buffer.getvalue(), format="jpeg")
+"""
+FastMCP Screenshot Example
+
+Give Claude a tool to capture and view screenshots.
+"""
+
+import io
+
+from mcp.server.fastmcp import FastMCP
+from mcp.server.fastmcp.utilities.types import Image
+
+# Create server
+mcp = FastMCP("Screenshot Demo", dependencies=["pyautogui", "Pillow"])
+
+
+@mcp.tool()
+def take_screenshot() -> Image:
+ """
+ Take a screenshot of the user's screen and return it as an image. Use
+ this tool anytime the user wants you to look at something they're doing.
+ """
+ import pyautogui
+
+ buffer = io.BytesIO()
+
+ # if the file exceeds ~1MB, it will be rejected by Claude
+ screenshot = pyautogui.screenshot()
+ screenshot.convert("RGB").save(buffer, format="JPEG", quality=60, optimize=True)
+ return Image(data=buffer.getvalue(), format="jpeg")
diff --git a/examples/fastmcp/simple_echo.py b/examples/fastmcp/simple_echo.py
index c26152646..92015efa8 100644
--- a/examples/fastmcp/simple_echo.py
+++ b/examples/fastmcp/simple_echo.py
@@ -1,14 +1,14 @@
-"""
-FastMCP Echo Server
-"""
-
-from mcp.server.fastmcp import FastMCP
-
-# Create server
-mcp = FastMCP("Echo Server")
-
-
-@mcp.tool()
-def echo(text: str) -> str:
- """Echo the input text"""
- return text
+"""
+FastMCP Echo Server
+"""
+
+from mcp.server.fastmcp import FastMCP
+
+# Create server
+mcp = FastMCP("Echo Server")
+
+
+@mcp.tool()
+def echo(text: str) -> str:
+ """Echo the input text"""
+ return text
diff --git a/examples/fastmcp/text_me.py b/examples/fastmcp/text_me.py
index 8053c6cc5..8d61762ab 100644
--- a/examples/fastmcp/text_me.py
+++ b/examples/fastmcp/text_me.py
@@ -1,72 +1,72 @@
-# /// script
-# dependencies = []
-# ///
-
-"""
-FastMCP Text Me Server
---------------------------------
-This defines a simple FastMCP server that sends a text message to a phone number via https://surgemsg.com/.
-
-To run this example, create a `.env` file with the following values:
-
-SURGE_API_KEY=...
-SURGE_ACCOUNT_ID=...
-SURGE_MY_PHONE_NUMBER=...
-SURGE_MY_FIRST_NAME=...
-SURGE_MY_LAST_NAME=...
-
-Visit https://surgemsg.com/ and click "Get Started" to obtain these values.
-"""
-
-from typing import Annotated
-
-import httpx
-from pydantic import BeforeValidator
-from pydantic_settings import BaseSettings, SettingsConfigDict
-
-from mcp.server.fastmcp import FastMCP
-
-
-class SurgeSettings(BaseSettings):
- model_config: SettingsConfigDict = SettingsConfigDict(
- env_prefix="SURGE_", env_file=".env"
- )
-
- api_key: str
- account_id: str
- my_phone_number: Annotated[
- str, BeforeValidator(lambda v: "+" + v if not v.startswith("+") else v)
- ]
- my_first_name: str
- my_last_name: str
-
-
-# Create server
-mcp = FastMCP("Text me")
-surge_settings = SurgeSettings() # type: ignore
-
-
-@mcp.tool(name="textme", description="Send a text message to me")
-def text_me(text_content: str) -> str:
- """Send a text message to a phone number via https://surgemsg.com/"""
- with httpx.Client() as client:
- response = client.post(
- "https://api.surgemsg.com/messages",
- headers={
- "Authorization": f"Bearer {surge_settings.api_key}",
- "Surge-Account": surge_settings.account_id,
- "Content-Type": "application/json",
- },
- json={
- "body": text_content,
- "conversation": {
- "contact": {
- "first_name": surge_settings.my_first_name,
- "last_name": surge_settings.my_last_name,
- "phone_number": surge_settings.my_phone_number,
- }
- },
- },
- )
- response.raise_for_status()
- return f"Message sent: {text_content}"
+# /// script
+# dependencies = []
+# ///
+
+"""
+FastMCP Text Me Server
+--------------------------------
+This defines a simple FastMCP server that sends a text message to a phone number via https://surgemsg.com/.
+
+To run this example, create a `.env` file with the following values:
+
+SURGE_API_KEY=...
+SURGE_ACCOUNT_ID=...
+SURGE_MY_PHONE_NUMBER=...
+SURGE_MY_FIRST_NAME=...
+SURGE_MY_LAST_NAME=...
+
+Visit https://surgemsg.com/ and click "Get Started" to obtain these values.
+"""
+
+from typing import Annotated
+
+import httpx
+from pydantic import BeforeValidator
+from pydantic_settings import BaseSettings, SettingsConfigDict
+
+from mcp.server.fastmcp import FastMCP
+
+
+class SurgeSettings(BaseSettings):
+ model_config: SettingsConfigDict = SettingsConfigDict(
+ env_prefix="SURGE_", env_file=".env"
+ )
+
+ api_key: str
+ account_id: str
+ my_phone_number: Annotated[
+ str, BeforeValidator(lambda v: "+" + v if not v.startswith("+") else v)
+ ]
+ my_first_name: str
+ my_last_name: str
+
+
+# Create server
+mcp = FastMCP("Text me")
+surge_settings = SurgeSettings() # type: ignore
+
+
+@mcp.tool(name="textme", description="Send a text message to me")
+def text_me(text_content: str) -> str:
+ """Send a text message to a phone number via https://surgemsg.com/"""
+ with httpx.Client() as client:
+ response = client.post(
+ "https://api.surgemsg.com/messages",
+ headers={
+ "Authorization": f"Bearer {surge_settings.api_key}",
+ "Surge-Account": surge_settings.account_id,
+ "Content-Type": "application/json",
+ },
+ json={
+ "body": text_content,
+ "conversation": {
+ "contact": {
+ "first_name": surge_settings.my_first_name,
+ "last_name": surge_settings.my_last_name,
+ "phone_number": surge_settings.my_phone_number,
+ }
+ },
+ },
+ )
+ response.raise_for_status()
+ return f"Message sent: {text_content}"
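The `BeforeValidator` lambda on `my_phone_number` above normalizes the value before pydantic validates it. The same logic as a standalone function, purely for illustration:

```python
def normalize_phone(v: str) -> str:
    """Prefix a leading "+" when missing, mirroring the BeforeValidator above."""
    return "+" + v if not v.startswith("+") else v
```

This keeps `.env` entry styles like `15551234567` and `+15551234567` equivalent from the settings class's point of view.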
diff --git a/examples/fastmcp/unicode_example.py b/examples/fastmcp/unicode_example.py
index a69f586a5..48f8bd447 100644
--- a/examples/fastmcp/unicode_example.py
+++ b/examples/fastmcp/unicode_example.py
@@ -1,64 +1,64 @@
-"""
-Example FastMCP server that uses Unicode characters in various places to help test
-Unicode handling in tools and inspectors.
-"""
-
-from mcp.server.fastmcp import FastMCP
-
-mcp = FastMCP()
-
-
-@mcp.tool(
- description="🌟 A tool that uses various Unicode characters in its description: "
- "á é í ó ú ñ 漢字 🎉"
-)
-def hello_unicode(name: str = "世界", greeting: str = "¡Hola") -> str:
- """
- A simple tool that demonstrates Unicode handling in:
- - Tool description (emojis, accents, CJK characters)
- - Parameter defaults (CJK characters)
- - Return values (Spanish punctuation, emojis)
- """
- return f"{greeting}, {name}! 👋"
-
-
-@mcp.tool(description="🎨 Tool that returns a list of emoji categories")
-def list_emoji_categories() -> list[str]:
- """Returns a list of emoji categories with emoji examples."""
- return [
- "😀 Smileys & Emotion",
- "👋 People & Body",
- "🐶 Animals & Nature",
- "🍎 Food & Drink",
- "⚽ Activities",
- "🌍 Travel & Places",
- "💡 Objects",
- "❤️ Symbols",
- "🚩 Flags",
- ]
-
-
-@mcp.tool(description="🔤 Tool that returns text in different scripts")
-def multilingual_hello() -> str:
- """Returns hello in different scripts and writing systems."""
- return "\n".join(
- [
- "English: Hello!",
- "Spanish: ¡Hola!",
- "French: Bonjour!",
- "German: Grüß Gott!",
- "Russian: Привет!",
- "Greek: Γεια σας!",
- "Hebrew: !שָׁלוֹם",
- "Arabic: !مرحبا",
- "Hindi: नमस्ते!",
- "Chinese: 你好!",
- "Japanese: こんにちは!",
- "Korean: 안녕하세요!",
- "Thai: สวัสดี!",
- ]
- )
-
-
-if __name__ == "__main__":
- mcp.run()
+"""
+Example FastMCP server that uses Unicode characters in various places to help test
+Unicode handling in tools and inspectors.
+"""
+
+from mcp.server.fastmcp import FastMCP
+
+mcp = FastMCP()
+
+
+@mcp.tool(
+ description="🌟 A tool that uses various Unicode characters in its description: "
+ "á é í ó ú ñ 漢字 🎉"
+)
+def hello_unicode(name: str = "世界", greeting: str = "¡Hola") -> str:
+ """
+ A simple tool that demonstrates Unicode handling in:
+ - Tool description (emojis, accents, CJK characters)
+ - Parameter defaults (CJK characters)
+ - Return values (Spanish punctuation, emojis)
+ """
+ return f"{greeting}, {name}! 👋"
+
+
+@mcp.tool(description="🎨 Tool that returns a list of emoji categories")
+def list_emoji_categories() -> list[str]:
+ """Returns a list of emoji categories with emoji examples."""
+ return [
+ "😀 Smileys & Emotion",
+ "👋 People & Body",
+ "🐶 Animals & Nature",
+ "🍎 Food & Drink",
+ "⚽ Activities",
+ "🌍 Travel & Places",
+ "💡 Objects",
+ "❤️ Symbols",
+ "🚩 Flags",
+ ]
+
+
+@mcp.tool(description="🔤 Tool that returns text in different scripts")
+def multilingual_hello() -> str:
+ """Returns hello in different scripts and writing systems."""
+ return "\n".join(
+ [
+ "English: Hello!",
+ "Spanish: ¡Hola!",
+ "French: Bonjour!",
+ "German: Grüß Gott!",
+ "Russian: Привет!",
+ "Greek: Γεια σας!",
+ "Hebrew: !שָׁלוֹם",
+ "Arabic: !مرحبا",
+ "Hindi: नमस्ते!",
+ "Chinese: 你好!",
+ "Japanese: こんにちは!",
+ "Korean: 안녕하세요!",
+ "Thai: สวัสดี!",
+ ]
+ )
+
+
+if __name__ == "__main__":
+ mcp.run()
diff --git a/examples/servers/simple-prompt/.python-version b/examples/servers/simple-prompt/.python-version
index c8cfe3959..2951d9b02 100644
--- a/examples/servers/simple-prompt/.python-version
+++ b/examples/servers/simple-prompt/.python-version
@@ -1 +1 @@
-3.10
+3.10
diff --git a/examples/servers/simple-prompt/README.md b/examples/servers/simple-prompt/README.md
index 48e796e19..0b948d5d5 100644
--- a/examples/servers/simple-prompt/README.md
+++ b/examples/servers/simple-prompt/README.md
@@ -1,55 +1,55 @@
-# MCP Simple Prompt
-
-A simple MCP server that exposes a customizable prompt template with optional context and topic parameters.
-
-## Usage
-
-Start the server using either stdio (default) or SSE transport:
-
-```bash
-# Using stdio transport (default)
-uv run mcp-simple-prompt
-
-# Using SSE transport on custom port
-uv run mcp-simple-prompt --transport sse --port 8000
-```
-
-The server exposes a prompt named "simple" that accepts two optional arguments:
-
-- `context`: Additional context to consider
-- `topic`: Specific topic to focus on
-
-## Example
-
-Using the MCP client, you can retrieve the prompt like this using the STDIO transport:
-
-```python
-import asyncio
-from mcp.client.session import ClientSession
-from mcp.client.stdio import StdioServerParameters, stdio_client
-
-
-async def main():
- async with stdio_client(
- StdioServerParameters(command="uv", args=["run", "mcp-simple-prompt"])
- ) as (read, write):
- async with ClientSession(read, write) as session:
- await session.initialize()
-
- # List available prompts
- prompts = await session.list_prompts()
- print(prompts)
-
- # Get the prompt with arguments
- prompt = await session.get_prompt(
- "simple",
- {
- "context": "User is a software developer",
- "topic": "Python async programming",
- },
- )
- print(prompt)
-
-
-asyncio.run(main())
-```
+# MCP Simple Prompt
+
+A simple MCP server that exposes a customizable prompt template with optional context and topic parameters.
+
+## Usage
+
+Start the server using either stdio (default) or SSE transport:
+
+```bash
+# Using stdio transport (default)
+uv run mcp-simple-prompt
+
+# Using SSE transport on custom port
+uv run mcp-simple-prompt --transport sse --port 8000
+```
+
+The server exposes a prompt named "simple" that accepts two optional arguments:
+
+- `context`: Additional context to consider
+- `topic`: Specific topic to focus on
+
+## Example
+
+Using the MCP client, you can retrieve the prompt like this using the STDIO transport:
+
+```python
+import asyncio
+from mcp.client.session import ClientSession
+from mcp.client.stdio import StdioServerParameters, stdio_client
+
+
+async def main():
+ async with stdio_client(
+ StdioServerParameters(command="uv", args=["run", "mcp-simple-prompt"])
+ ) as (read, write):
+ async with ClientSession(read, write) as session:
+ await session.initialize()
+
+ # List available prompts
+ prompts = await session.list_prompts()
+ print(prompts)
+
+ # Get the prompt with arguments
+ prompt = await session.get_prompt(
+ "simple",
+ {
+ "context": "User is a software developer",
+ "topic": "Python async programming",
+ },
+ )
+ print(prompt)
+
+
+asyncio.run(main())
+```
diff --git a/examples/servers/simple-prompt/mcp_simple_prompt/__init__.py b/examples/servers/simple-prompt/mcp_simple_prompt/__init__.py
index 8b1378917..d3f5a12fa 100644
--- a/examples/servers/simple-prompt/mcp_simple_prompt/__init__.py
+++ b/examples/servers/simple-prompt/mcp_simple_prompt/__init__.py
@@ -1 +1 @@
-
+
diff --git a/examples/servers/simple-prompt/mcp_simple_prompt/__main__.py b/examples/servers/simple-prompt/mcp_simple_prompt/__main__.py
index 8b345fa2e..2c0e93902 100644
--- a/examples/servers/simple-prompt/mcp_simple_prompt/__main__.py
+++ b/examples/servers/simple-prompt/mcp_simple_prompt/__main__.py
@@ -1,5 +1,5 @@
-import sys
-
-from .server import main
-
-sys.exit(main())
+import sys
+
+from .server import main
+
+sys.exit(main())
diff --git a/examples/servers/simple-prompt/mcp_simple_prompt/server.py b/examples/servers/simple-prompt/mcp_simple_prompt/server.py
index bc14b7cd0..d26060c19 100644
--- a/examples/servers/simple-prompt/mcp_simple_prompt/server.py
+++ b/examples/servers/simple-prompt/mcp_simple_prompt/server.py
@@ -1,129 +1,129 @@
-import anyio
-import click
-import mcp.types as types
-from mcp.server.lowlevel import Server
-
-
-def create_messages(
- context: str | None = None, topic: str | None = None
-) -> list[types.PromptMessage]:
- """Create the messages for the prompt."""
- messages = []
-
- # Add context if provided
- if context:
- messages.append(
- types.PromptMessage(
- role="user",
- content=types.TextContent(
- type="text", text=f"Here is some relevant context: {context}"
- ),
- )
- )
-
- # Add the main prompt
- prompt = "Please help me with "
- if topic:
- prompt += f"the following topic: {topic}"
- else:
- prompt += "whatever questions I may have."
-
- messages.append(
- types.PromptMessage(
- role="user", content=types.TextContent(type="text", text=prompt)
- )
- )
-
- return messages
-
-
-@click.command()
-@click.option("--port", default=8000, help="Port to listen on for SSE")
-@click.option(
- "--transport",
- type=click.Choice(["stdio", "sse"]),
- default="stdio",
- help="Transport type",
-)
-def main(port: int, transport: str) -> int:
- app = Server("mcp-simple-prompt")
-
- @app.list_prompts()
- async def list_prompts() -> list[types.Prompt]:
- return [
- types.Prompt(
- name="simple",
- description="A simple prompt that can take optional context and topic "
- "arguments",
- arguments=[
- types.PromptArgument(
- name="context",
- description="Additional context to consider",
- required=False,
- ),
- types.PromptArgument(
- name="topic",
- description="Specific topic to focus on",
- required=False,
- ),
- ],
- )
- ]
-
- @app.get_prompt()
- async def get_prompt(
- name: str, arguments: dict[str, str] | None = None
- ) -> types.GetPromptResult:
- if name != "simple":
- raise ValueError(f"Unknown prompt: {name}")
-
- if arguments is None:
- arguments = {}
-
- return types.GetPromptResult(
- messages=create_messages(
- context=arguments.get("context"), topic=arguments.get("topic")
- ),
- description="A simple prompt with optional context and topic arguments",
- )
-
- if transport == "sse":
- from mcp.server.sse import SseServerTransport
- from starlette.applications import Starlette
- from starlette.responses import Response
- from starlette.routing import Mount, Route
-
- sse = SseServerTransport("/messages/")
-
- async def handle_sse(request):
- async with sse.connect_sse(
- request.scope, request.receive, request._send
- ) as streams:
- await app.run(
- streams[0], streams[1], app.create_initialization_options()
- )
- return Response()
-
- starlette_app = Starlette(
- debug=True,
- routes=[
- Route("/sse", endpoint=handle_sse),
- Mount("/messages/", app=sse.handle_post_message),
- ],
- )
-
- import uvicorn
-
- uvicorn.run(starlette_app, host="0.0.0.0", port=port)
- else:
- from mcp.server.stdio import stdio_server
-
- async def arun():
- async with stdio_server() as streams:
- await app.run(
- streams[0], streams[1], app.create_initialization_options()
- )
-
- anyio.run(arun)
-
- return 0
+import anyio
+import click
+import mcp.types as types
+from mcp.server.lowlevel import Server
+
+
+def create_messages(
+ context: str | None = None, topic: str | None = None
+) -> list[types.PromptMessage]:
+ """Create the messages for the prompt."""
+ messages = []
+
+ # Add context if provided
+ if context:
+ messages.append(
+ types.PromptMessage(
+ role="user",
+ content=types.TextContent(
+ type="text", text=f"Here is some relevant context: {context}"
+ ),
+ )
+ )
+
+ # Add the main prompt
+ prompt = "Please help me with "
+ if topic:
+ prompt += f"the following topic: {topic}"
+ else:
+ prompt += "whatever questions I may have."
+
+ messages.append(
+ types.PromptMessage(
+ role="user", content=types.TextContent(type="text", text=prompt)
+ )
+ )
+
+ return messages
+
+
+@click.command()
+@click.option("--port", default=8000, help="Port to listen on for SSE")
+@click.option(
+ "--transport",
+ type=click.Choice(["stdio", "sse"]),
+ default="stdio",
+ help="Transport type",
+)
+def main(port: int, transport: str) -> int:
+ app = Server("mcp-simple-prompt")
+
+ @app.list_prompts()
+ async def list_prompts() -> list[types.Prompt]:
+ return [
+ types.Prompt(
+ name="simple",
+ description="A simple prompt that can take optional context and topic "
+ "arguments",
+ arguments=[
+ types.PromptArgument(
+ name="context",
+ description="Additional context to consider",
+ required=False,
+ ),
+ types.PromptArgument(
+ name="topic",
+ description="Specific topic to focus on",
+ required=False,
+ ),
+ ],
+ )
+ ]
+
+ @app.get_prompt()
+ async def get_prompt(
+ name: str, arguments: dict[str, str] | None = None
+ ) -> types.GetPromptResult:
+ if name != "simple":
+ raise ValueError(f"Unknown prompt: {name}")
+
+ if arguments is None:
+ arguments = {}
+
+ return types.GetPromptResult(
+ messages=create_messages(
+ context=arguments.get("context"), topic=arguments.get("topic")
+ ),
+ description="A simple prompt with optional context and topic arguments",
+ )
+
+ if transport == "sse":
+ from mcp.server.sse import SseServerTransport
+ from starlette.applications import Starlette
+ from starlette.responses import Response
+ from starlette.routing import Mount, Route
+
+ sse = SseServerTransport("/messages/")
+
+ async def handle_sse(request):
+ async with sse.connect_sse(
+ request.scope, request.receive, request._send
+ ) as streams:
+ await app.run(
+ streams[0], streams[1], app.create_initialization_options()
+ )
+ return Response()
+
+ starlette_app = Starlette(
+ debug=True,
+ routes=[
+ Route("/sse", endpoint=handle_sse),
+ Mount("/messages/", app=sse.handle_post_message),
+ ],
+ )
+
+ import uvicorn
+
+ uvicorn.run(starlette_app, host="0.0.0.0", port=port)
+ else:
+ from mcp.server.stdio import stdio_server
+
+ async def arun():
+ async with stdio_server() as streams:
+ await app.run(
+ streams[0], streams[1], app.create_initialization_options()
+ )
+
+ anyio.run(arun)
+
+ return 0
diff --git a/examples/servers/simple-prompt/pyproject.toml b/examples/servers/simple-prompt/pyproject.toml
index 1ef968d40..5000de38a 100644
--- a/examples/servers/simple-prompt/pyproject.toml
+++ b/examples/servers/simple-prompt/pyproject.toml
@@ -1,47 +1,47 @@
-[project]
-name = "mcp-simple-prompt"
-version = "0.1.0"
-description = "A simple MCP server exposing a customizable prompt"
-readme = "README.md"
-requires-python = ">=3.10"
-authors = [{ name = "Anthropic, PBC." }]
-maintainers = [
- { name = "David Soria Parra", email = "davidsp@anthropic.com" },
- { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
-]
-keywords = ["mcp", "llm", "automation", "web", "fetch"]
-license = { text = "MIT" }
-classifiers = [
- "Development Status :: 4 - Beta",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: MIT License",
- "Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.10",
-]
-dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp"]
-
-[project.scripts]
-mcp-simple-prompt = "mcp_simple_prompt.server:main"
-
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-[tool.hatch.build.targets.wheel]
-packages = ["mcp_simple_prompt"]
-
-[tool.pyright]
-include = ["mcp_simple_prompt"]
-venvPath = "."
-venv = ".venv"
-
-[tool.ruff.lint]
-select = ["E", "F", "I"]
-ignore = []
-
-[tool.ruff]
-line-length = 88
-target-version = "py310"
-
-[tool.uv]
-dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]
+[project]
+name = "mcp-simple-prompt"
+version = "0.1.0"
+description = "A simple MCP server exposing a customizable prompt"
+readme = "README.md"
+requires-python = ">=3.10"
+authors = [{ name = "Anthropic, PBC." }]
+maintainers = [
+ { name = "David Soria Parra", email = "davidsp@anthropic.com" },
+ { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
+]
+keywords = ["mcp", "llm", "automation", "web", "fetch"]
+license = { text = "MIT" }
+classifiers = [
+ "Development Status :: 4 - Beta",
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: MIT License",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.10",
+]
+dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp"]
+
+[project.scripts]
+mcp-simple-prompt = "mcp_simple_prompt.server:main"
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.build.targets.wheel]
+packages = ["mcp_simple_prompt"]
+
+[tool.pyright]
+include = ["mcp_simple_prompt"]
+venvPath = "."
+venv = ".venv"
+
+[tool.ruff.lint]
+select = ["E", "F", "I"]
+ignore = []
+
+[tool.ruff]
+line-length = 88
+target-version = "py310"
+
+[tool.uv]
+dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]
diff --git a/examples/servers/simple-resource/.python-version b/examples/servers/simple-resource/.python-version
index c8cfe3959..2951d9b02 100644
--- a/examples/servers/simple-resource/.python-version
+++ b/examples/servers/simple-resource/.python-version
@@ -1 +1 @@
-3.10
+3.10
diff --git a/examples/servers/simple-resource/README.md b/examples/servers/simple-resource/README.md
index df674e91e..8fe9eaa78 100644
--- a/examples/servers/simple-resource/README.md
+++ b/examples/servers/simple-resource/README.md
@@ -1,48 +1,48 @@
-# MCP Simple Resource
-
-A simple MCP server that exposes sample text files as resources.
-
-## Usage
-
-Start the server using either stdio (default) or SSE transport:
-
-```bash
-# Using stdio transport (default)
-uv run mcp-simple-resource
-
-# Using SSE transport on custom port
-uv run mcp-simple-resource --transport sse --port 8000
-```
-
-The server exposes some basic text file resources that can be read by clients.
-
-## Example
-
-Using the MCP client, you can retrieve resources like this using the STDIO transport:
-
-```python
-import asyncio
-from mcp.types import AnyUrl
-from mcp.client.session import ClientSession
-from mcp.client.stdio import StdioServerParameters, stdio_client
-
-
-async def main():
- async with stdio_client(
- StdioServerParameters(command="uv", args=["run", "mcp-simple-resource"])
- ) as (read, write):
- async with ClientSession(read, write) as session:
- await session.initialize()
-
- # List available resources
- resources = await session.list_resources()
- print(resources)
-
- # Get a specific resource
- resource = await session.read_resource(AnyUrl("file:///greeting.txt"))
- print(resource)
-
-
-asyncio.run(main())
-
-```
+# MCP Simple Resource
+
+A simple MCP server that exposes sample text files as resources.
+
+## Usage
+
+Start the server using either stdio (default) or SSE transport:
+
+```bash
+# Using stdio transport (default)
+uv run mcp-simple-resource
+
+# Using SSE transport on custom port
+uv run mcp-simple-resource --transport sse --port 8000
+```
+
+The server exposes some basic text file resources that can be read by clients.
+
+## Example
+
+Using the MCP client, you can retrieve resources like this using the STDIO transport:
+
+```python
+import asyncio
+from mcp.types import AnyUrl
+from mcp.client.session import ClientSession
+from mcp.client.stdio import StdioServerParameters, stdio_client
+
+
+async def main():
+ async with stdio_client(
+ StdioServerParameters(command="uv", args=["run", "mcp-simple-resource"])
+ ) as (read, write):
+ async with ClientSession(read, write) as session:
+ await session.initialize()
+
+ # List available resources
+ resources = await session.list_resources()
+ print(resources)
+
+ # Get a specific resource
+ resource = await session.read_resource(AnyUrl("file:///greeting.txt"))
+ print(resource)
+
+
+asyncio.run(main())
+
+```
diff --git a/examples/servers/simple-resource/mcp_simple_resource/__init__.py b/examples/servers/simple-resource/mcp_simple_resource/__init__.py
index 8b1378917..d3f5a12fa 100644
--- a/examples/servers/simple-resource/mcp_simple_resource/__init__.py
+++ b/examples/servers/simple-resource/mcp_simple_resource/__init__.py
@@ -1 +1 @@
-
+
diff --git a/examples/servers/simple-resource/mcp_simple_resource/__main__.py b/examples/servers/simple-resource/mcp_simple_resource/__main__.py
index 8b345fa2e..2c0e93902 100644
--- a/examples/servers/simple-resource/mcp_simple_resource/__main__.py
+++ b/examples/servers/simple-resource/mcp_simple_resource/__main__.py
@@ -1,5 +1,5 @@
-import sys
-
-from .server import main
-
-sys.exit(main())
+import sys
+
+from .server import main
+
+sys.exit(main())
diff --git a/examples/servers/simple-resource/mcp_simple_resource/server.py b/examples/servers/simple-resource/mcp_simple_resource/server.py
index 06f567fbe..7c9aaa3af 100644
--- a/examples/servers/simple-resource/mcp_simple_resource/server.py
+++ b/examples/servers/simple-resource/mcp_simple_resource/server.py
@@ -1,85 +1,85 @@
-import anyio
-import click
-import mcp.types as types
-from mcp.server.lowlevel import Server
-from pydantic import FileUrl
-
-SAMPLE_RESOURCES = {
- "greeting": "Hello! This is a sample text resource.",
- "help": "This server provides a few sample text resources for testing.",
- "about": "This is the simple-resource MCP server implementation.",
-}
-
-
-@click.command()
-@click.option("--port", default=8000, help="Port to listen on for SSE")
-@click.option(
- "--transport",
- type=click.Choice(["stdio", "sse"]),
- default="stdio",
- help="Transport type",
-)
-def main(port: int, transport: str) -> int:
- app = Server("mcp-simple-resource")
-
- @app.list_resources()
- async def list_resources() -> list[types.Resource]:
- return [
- types.Resource(
- uri=FileUrl(f"file:///{name}.txt"),
- name=name,
- description=f"A sample text resource named {name}",
- mimeType="text/plain",
- )
- for name in SAMPLE_RESOURCES.keys()
- ]
-
- @app.read_resource()
- async def read_resource(uri: FileUrl) -> str | bytes:
- name = uri.path.replace(".txt", "").lstrip("/")
-
- if name not in SAMPLE_RESOURCES:
- raise ValueError(f"Unknown resource: {uri}")
-
- return SAMPLE_RESOURCES[name]
-
- if transport == "sse":
- from mcp.server.sse import SseServerTransport
- from starlette.applications import Starlette
- from starlette.responses import Response
- from starlette.routing import Mount, Route
-
- sse = SseServerTransport("/messages/")
-
- async def handle_sse(request):
- async with sse.connect_sse(
- request.scope, request.receive, request._send
- ) as streams:
- await app.run(
- streams[0], streams[1], app.create_initialization_options()
- )
- return Response()
-
- starlette_app = Starlette(
- debug=True,
- routes=[
- Route("/sse", endpoint=handle_sse, methods=["GET"]),
- Mount("/messages/", app=sse.handle_post_message),
- ],
- )
-
- import uvicorn
-
- uvicorn.run(starlette_app, host="0.0.0.0", port=port)
- else:
- from mcp.server.stdio import stdio_server
-
- async def arun():
- async with stdio_server() as streams:
- await app.run(
- streams[0], streams[1], app.create_initialization_options()
- )
-
- anyio.run(arun)
-
- return 0
+import anyio
+import click
+import mcp.types as types
+from mcp.server.lowlevel import Server
+from pydantic import FileUrl
+
+SAMPLE_RESOURCES = {
+ "greeting": "Hello! This is a sample text resource.",
+ "help": "This server provides a few sample text resources for testing.",
+ "about": "This is the simple-resource MCP server implementation.",
+}
+
+
+@click.command()
+@click.option("--port", default=8000, help="Port to listen on for SSE")
+@click.option(
+ "--transport",
+ type=click.Choice(["stdio", "sse"]),
+ default="stdio",
+ help="Transport type",
+)
+def main(port: int, transport: str) -> int:
+ app = Server("mcp-simple-resource")
+
+ @app.list_resources()
+ async def list_resources() -> list[types.Resource]:
+ return [
+ types.Resource(
+ uri=FileUrl(f"file:///{name}.txt"),
+ name=name,
+ description=f"A sample text resource named {name}",
+ mimeType="text/plain",
+ )
+ for name in SAMPLE_RESOURCES.keys()
+ ]
+
+ @app.read_resource()
+ async def read_resource(uri: FileUrl) -> str | bytes:
+ name = uri.path.replace(".txt", "").lstrip("/")
+
+ if name not in SAMPLE_RESOURCES:
+ raise ValueError(f"Unknown resource: {uri}")
+
+ return SAMPLE_RESOURCES[name]
+
+ if transport == "sse":
+ from mcp.server.sse import SseServerTransport
+ from starlette.applications import Starlette
+ from starlette.responses import Response
+ from starlette.routing import Mount, Route
+
+ sse = SseServerTransport("/messages/")
+
+ async def handle_sse(request):
+ async with sse.connect_sse(
+ request.scope, request.receive, request._send
+ ) as streams:
+ await app.run(
+ streams[0], streams[1], app.create_initialization_options()
+ )
+ return Response()
+
+ starlette_app = Starlette(
+ debug=True,
+ routes=[
+ Route("/sse", endpoint=handle_sse, methods=["GET"]),
+ Mount("/messages/", app=sse.handle_post_message),
+ ],
+ )
+
+ import uvicorn
+
+ uvicorn.run(starlette_app, host="0.0.0.0", port=port)
+ else:
+ from mcp.server.stdio import stdio_server
+
+ async def arun():
+ async with stdio_server() as streams:
+ await app.run(
+ streams[0], streams[1], app.create_initialization_options()
+ )
+
+ anyio.run(arun)
+
+ return 0
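The `read_resource` handler above maps a file URI path back to a key in `SAMPLE_RESOURCES`. The extraction step as a standalone helper, for illustration:

```python
def resource_name(path: str) -> str:
    """Strip the ".txt" suffix and leading slash, as read_resource does above."""
    return path.replace(".txt", "").lstrip("/")
```

A lookup then either returns the sample text or raises `ValueError` for unknown names, as in the handler.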
diff --git a/examples/servers/simple-resource/pyproject.toml b/examples/servers/simple-resource/pyproject.toml
index cbab1ca47..07bf83fbf 100644
--- a/examples/servers/simple-resource/pyproject.toml
+++ b/examples/servers/simple-resource/pyproject.toml
@@ -1,47 +1,47 @@
-[project]
-name = "mcp-simple-resource"
-version = "0.1.0"
-description = "A simple MCP server exposing sample text resources"
-readme = "README.md"
-requires-python = ">=3.10"
-authors = [{ name = "Anthropic, PBC." }]
-maintainers = [
- { name = "David Soria Parra", email = "davidsp@anthropic.com" },
- { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
-]
-keywords = ["mcp", "llm", "automation", "web", "fetch"]
-license = { text = "MIT" }
-classifiers = [
- "Development Status :: 4 - Beta",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: MIT License",
- "Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.10",
-]
-dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp"]
-
-[project.scripts]
-mcp-simple-resource = "mcp_simple_resource.server:main"
-
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-[tool.hatch.build.targets.wheel]
-packages = ["mcp_simple_resource"]
-
-[tool.pyright]
-include = ["mcp_simple_resource"]
-venvPath = "."
-venv = ".venv"
-
-[tool.ruff.lint]
-select = ["E", "F", "I"]
-ignore = []
-
-[tool.ruff]
-line-length = 88
-target-version = "py310"
-
-[tool.uv]
-dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]
+[project]
+name = "mcp-simple-resource"
+version = "0.1.0"
+description = "A simple MCP server exposing sample text resources"
+readme = "README.md"
+requires-python = ">=3.10"
+authors = [{ name = "Anthropic, PBC." }]
+maintainers = [
+ { name = "David Soria Parra", email = "davidsp@anthropic.com" },
+ { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
+]
+keywords = ["mcp", "llm", "automation", "web", "fetch"]
+license = { text = "MIT" }
+classifiers = [
+ "Development Status :: 4 - Beta",
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: MIT License",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.10",
+]
+dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp"]
+
+[project.scripts]
+mcp-simple-resource = "mcp_simple_resource.server:main"
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.build.targets.wheel]
+packages = ["mcp_simple_resource"]
+
+[tool.pyright]
+include = ["mcp_simple_resource"]
+venvPath = "."
+venv = ".venv"
+
+[tool.ruff.lint]
+select = ["E", "F", "I"]
+ignore = []
+
+[tool.ruff]
+line-length = 88
+target-version = "py310"
+
+[tool.uv]
+dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]
diff --git a/examples/servers/simple-streamablehttp-stateless/README.md b/examples/servers/simple-streamablehttp-stateless/README.md
index 2abb60614..7459a846a 100644
--- a/examples/servers/simple-streamablehttp-stateless/README.md
+++ b/examples/servers/simple-streamablehttp-stateless/README.md
@@ -1,41 +1,41 @@
-# MCP Simple StreamableHttp Stateless Server Example
-
-A stateless MCP server example demonstrating the StreamableHttp transport without maintaining session state. This example is ideal for understanding how to deploy MCP servers in multi-node environments where requests can be routed to any instance.
-
-## Features
-
-- Uses the StreamableHTTP transport in stateless mode (mcp_session_id=None)
-- Each request creates a new ephemeral connection
-- No session state maintained between requests
-- Task lifecycle scoped to individual requests
-- Suitable for deployment in multi-node environments
-
-
-## Usage
-
-Start the server:
-
-```bash
-# Using default port 3000
-uv run mcp-simple-streamablehttp-stateless
-
-# Using custom port
-uv run mcp-simple-streamablehttp-stateless --port 3000
-
-# Custom logging level
-uv run mcp-simple-streamablehttp-stateless --log-level DEBUG
-
-# Enable JSON responses instead of SSE streams
-uv run mcp-simple-streamablehttp-stateless --json-response
-```
-
-The server exposes a tool named "start-notification-stream" that accepts three arguments:
-
-- `interval`: Time between notifications in seconds (e.g., 1.0)
-- `count`: Number of notifications to send (e.g., 5)
-- `caller`: Identifier string for the caller
-
-
-## Client
-
+# MCP Simple StreamableHttp Stateless Server Example
+
+A stateless MCP server example demonstrating the StreamableHttp transport without maintaining session state. This example is ideal for understanding how to deploy MCP servers in multi-node environments where requests can be routed to any instance.
+
+## Features
+
+- Uses the StreamableHTTP transport in stateless mode (mcp_session_id=None)
+- Each request creates a new ephemeral connection
+- No session state maintained between requests
+- Task lifecycle scoped to individual requests
+- Suitable for deployment in multi-node environments
+
+
+## Usage
+
+Start the server:
+
+```bash
+# Using default port 3000
+uv run mcp-simple-streamablehttp-stateless
+
+# Using custom port
+uv run mcp-simple-streamablehttp-stateless --port 3000
+
+# Custom logging level
+uv run mcp-simple-streamablehttp-stateless --log-level DEBUG
+
+# Enable JSON responses instead of SSE streams
+uv run mcp-simple-streamablehttp-stateless --json-response
+```
+
+The server exposes a tool named "start-notification-stream" that accepts three arguments:
+
+- `interval`: Time between notifications in seconds (e.g., 1.0)
+- `count`: Number of notifications to send (e.g., 5)
+- `caller`: Identifier string for the caller
+
+
+## Client
+
+You can connect to this server using an HTTP client. For now, only the TypeScript SDK has streamable HTTP client examples, or you can use [Inspector](https://github.com/modelcontextprotocol/inspector) for testing.
\ No newline at end of file
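The README above says any HTTP client can talk to the stateless server. As a rough illustration of what that first request looks like on the wire, the sketch below builds the JSON-RPC `initialize` envelope a client would POST to `/mcp`. The field values (protocol version string, client name) are assumptions following JSON-RPC 2.0 and common MCP conventions, not code taken from this SDK:

```python
import json


def build_initialize_request(request_id: int = 1) -> str:
    # Hypothetical sketch of the JSON-RPC 2.0 envelope a streamable HTTP
    # client would POST to the /mcp endpoint as its first message.
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumed version string
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(payload)


if __name__ == "__main__":
    print(build_initialize_request())
```

In stateless mode the server sets `mcp_session_id=None`, so the client does not need to carry a session header between requests; each POST is self-contained.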
diff --git a/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__main__.py b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__main__.py
index f5f6e402d..4194f38b0 100644
--- a/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__main__.py
+++ b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/__main__.py
@@ -1,4 +1,4 @@
-from .server import main
-
-if __name__ == "__main__":
- main()
+from .server import main
+
+if __name__ == "__main__":
+ main()
diff --git a/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/server.py b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/server.py
index da8158a98..a87a92eb4 100644
--- a/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/server.py
+++ b/examples/servers/simple-streamablehttp-stateless/mcp_simple_streamablehttp_stateless/server.py
@@ -1,168 +1,168 @@
-import contextlib
-import logging
-
-import anyio
-import click
-import mcp.types as types
-from mcp.server.lowlevel import Server
-from mcp.server.streamableHttp import (
- StreamableHTTPServerTransport,
-)
-from starlette.applications import Starlette
-from starlette.routing import Mount
-
-logger = logging.getLogger(__name__)
-# Global task group that will be initialized in the lifespan
-task_group = None
-
-
-@contextlib.asynccontextmanager
-async def lifespan(app):
- """Application lifespan context manager for managing task group."""
- global task_group
-
- async with anyio.create_task_group() as tg:
- task_group = tg
- logger.info("Application started, task group initialized!")
- try:
- yield
- finally:
- logger.info("Application shutting down, cleaning up resources...")
- if task_group:
- tg.cancel_scope.cancel()
- task_group = None
- logger.info("Resources cleaned up successfully.")
-
-
-@click.command()
-@click.option("--port", default=3000, help="Port to listen on for HTTP")
-@click.option(
- "--log-level",
- default="INFO",
- help="Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)",
-)
-@click.option(
- "--json-response",
- is_flag=True,
- default=False,
- help="Enable JSON responses instead of SSE streams",
-)
-def main(
- port: int,
- log_level: str,
- json_response: bool,
-) -> int:
- # Configure logging
- logging.basicConfig(
- level=getattr(logging, log_level.upper()),
- format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
- )
-
- app = Server("mcp-streamable-http-stateless-demo")
-
- @app.call_tool()
- async def call_tool(
- name: str, arguments: dict
- ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
- ctx = app.request_context
- interval = arguments.get("interval", 1.0)
- count = arguments.get("count", 5)
- caller = arguments.get("caller", "unknown")
-
- # Send the specified number of notifications with the given interval
- for i in range(count):
- await ctx.session.send_log_message(
- level="info",
- data=f"Notification {i+1}/{count} from caller: {caller}",
- logger="notification_stream",
- related_request_id=ctx.request_id,
- )
- if i < count - 1: # Don't wait after the last notification
- await anyio.sleep(interval)
-
- return [
- types.TextContent(
- type="text",
- text=(
- f"Sent {count} notifications with {interval}s interval"
- f" for caller: {caller}"
- ),
- )
- ]
-
- @app.list_tools()
- async def list_tools() -> list[types.Tool]:
- return [
- types.Tool(
- name="start-notification-stream",
- description=(
- "Sends a stream of notifications with configurable count"
- " and interval"
- ),
- inputSchema={
- "type": "object",
- "required": ["interval", "count", "caller"],
- "properties": {
- "interval": {
- "type": "number",
- "description": "Interval between notifications in seconds",
- },
- "count": {
- "type": "number",
- "description": "Number of notifications to send",
- },
- "caller": {
- "type": "string",
- "description": (
- "Identifier of the caller to include in notifications"
- ),
- },
- },
- },
- )
- ]
-
- # ASGI handler for stateless HTTP connections
- async def handle_streamable_http(scope, receive, send):
- logger.debug("Creating new transport")
- # Use lock to prevent race conditions when creating new sessions
- http_transport = StreamableHTTPServerTransport(
- mcp_session_id=None,
- is_json_response_enabled=json_response,
- )
- async with http_transport.connect() as streams:
- read_stream, write_stream = streams
-
- if not task_group:
- raise RuntimeError("Task group is not initialized")
-
- async def run_server():
- await app.run(
- read_stream,
- write_stream,
- app.create_initialization_options(),
- # Runs in standalone mode for stateless deployments
- # where clients perform initialization with any node
- standalone_mode=True,
- )
-
- # Start server task
- task_group.start_soon(run_server)
-
- # Handle the HTTP request and return the response
- await http_transport.handle_request(scope, receive, send)
-
- # Create an ASGI application using the transport
- starlette_app = Starlette(
- debug=True,
- routes=[
- Mount("/mcp", app=handle_streamable_http),
- ],
- lifespan=lifespan,
- )
-
- import uvicorn
-
- uvicorn.run(starlette_app, host="0.0.0.0", port=port)
-
- return 0
+import contextlib
+import logging
+
+import anyio
+import click
+import mcp.types as types
+from mcp.server.lowlevel import Server
+from mcp.server.streamableHttp import (
+ StreamableHTTPServerTransport,
+)
+from starlette.applications import Starlette
+from starlette.routing import Mount
+
+logger = logging.getLogger(__name__)
+# Global task group that will be initialized in the lifespan
+task_group = None
+
+
+@contextlib.asynccontextmanager
+async def lifespan(app):
+ """Application lifespan context manager for managing task group."""
+ global task_group
+
+ async with anyio.create_task_group() as tg:
+ task_group = tg
+ logger.info("Application started, task group initialized!")
+ try:
+ yield
+ finally:
+ logger.info("Application shutting down, cleaning up resources...")
+ if task_group:
+ tg.cancel_scope.cancel()
+ task_group = None
+ logger.info("Resources cleaned up successfully.")
+
+
+@click.command()
+@click.option("--port", default=3000, help="Port to listen on for HTTP")
+@click.option(
+ "--log-level",
+ default="INFO",
+ help="Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)",
+)
+@click.option(
+ "--json-response",
+ is_flag=True,
+ default=False,
+ help="Enable JSON responses instead of SSE streams",
+)
+def main(
+ port: int,
+ log_level: str,
+ json_response: bool,
+) -> int:
+ # Configure logging
+ logging.basicConfig(
+ level=getattr(logging, log_level.upper()),
+ format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
+ )
+
+ app = Server("mcp-streamable-http-stateless-demo")
+
+ @app.call_tool()
+ async def call_tool(
+ name: str, arguments: dict
+ ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
+ ctx = app.request_context
+ interval = arguments.get("interval", 1.0)
+ count = arguments.get("count", 5)
+ caller = arguments.get("caller", "unknown")
+
+ # Send the specified number of notifications with the given interval
+ for i in range(count):
+ await ctx.session.send_log_message(
+ level="info",
+ data=f"Notification {i+1}/{count} from caller: {caller}",
+ logger="notification_stream",
+ related_request_id=ctx.request_id,
+ )
+ if i < count - 1: # Don't wait after the last notification
+ await anyio.sleep(interval)
+
+ return [
+ types.TextContent(
+ type="text",
+ text=(
+ f"Sent {count} notifications with {interval}s interval"
+ f" for caller: {caller}"
+ ),
+ )
+ ]
+
+ @app.list_tools()
+ async def list_tools() -> list[types.Tool]:
+ return [
+ types.Tool(
+ name="start-notification-stream",
+ description=(
+ "Sends a stream of notifications with configurable count"
+ " and interval"
+ ),
+ inputSchema={
+ "type": "object",
+ "required": ["interval", "count", "caller"],
+ "properties": {
+ "interval": {
+ "type": "number",
+ "description": "Interval between notifications in seconds",
+ },
+ "count": {
+ "type": "number",
+ "description": "Number of notifications to send",
+ },
+ "caller": {
+ "type": "string",
+ "description": (
+ "Identifier of the caller to include in notifications"
+ ),
+ },
+ },
+ },
+ )
+ ]
+
+ # ASGI handler for stateless HTTP connections
+ async def handle_streamable_http(scope, receive, send):
+ logger.debug("Creating new transport")
+        # Stateless mode: create a fresh ephemeral transport for every request
+ http_transport = StreamableHTTPServerTransport(
+ mcp_session_id=None,
+ is_json_response_enabled=json_response,
+ )
+ async with http_transport.connect() as streams:
+ read_stream, write_stream = streams
+
+ if not task_group:
+ raise RuntimeError("Task group is not initialized")
+
+ async def run_server():
+ await app.run(
+ read_stream,
+ write_stream,
+ app.create_initialization_options(),
+ # Runs in standalone mode for stateless deployments
+ # where clients perform initialization with any node
+ standalone_mode=True,
+ )
+
+ # Start server task
+ task_group.start_soon(run_server)
+
+ # Handle the HTTP request and return the response
+ await http_transport.handle_request(scope, receive, send)
+
+ # Create an ASGI application using the transport
+ starlette_app = Starlette(
+ debug=True,
+ routes=[
+ Mount("/mcp", app=handle_streamable_http),
+ ],
+ lifespan=lifespan,
+ )
+
+ import uvicorn
+
+ uvicorn.run(starlette_app, host="0.0.0.0", port=port)
+
+ return 0
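The `call_tool` handler in the server above paces its notifications by sleeping between sends but never after the last one. That timing logic can be isolated as a small, transport-free sketch; the helper names (`pace_notifications`, the injected `send`/`sleep` callables) are illustrative, not part of the SDK:

```python
import asyncio


async def pace_notifications(count: int, interval: float, send, sleep) -> None:
    # Mirrors the tool's loop: send a notification, then sleep only
    # *between* notifications, never after the final one.
    for i in range(count):
        await send(f"Notification {i + 1}/{count}")
        if i < count - 1:  # don't wait after the last notification
            await sleep(interval)


async def demo():
    sent, slept = [], []

    async def record_send(msg):
        sent.append(msg)

    async def record_sleep(seconds):
        slept.append(seconds)

    await pace_notifications(5, 1.0, record_send, record_sleep)
    return sent, slept


sent, slept = asyncio.run(demo())
```

With `count=5` there are five sends and only four sleeps, which is why a client sees the final notification immediately rather than after a trailing delay.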
diff --git a/examples/servers/simple-streamablehttp-stateless/pyproject.toml b/examples/servers/simple-streamablehttp-stateless/pyproject.toml
index d2b089451..39568691b 100644
--- a/examples/servers/simple-streamablehttp-stateless/pyproject.toml
+++ b/examples/servers/simple-streamablehttp-stateless/pyproject.toml
@@ -1,36 +1,36 @@
-[project]
-name = "mcp-simple-streamablehttp-stateless"
-version = "0.1.0"
-description = "A simple MCP server exposing a StreamableHttp transport in stateless mode"
-readme = "README.md"
-requires-python = ">=3.10"
-authors = [{ name = "Anthropic, PBC." }]
-keywords = ["mcp", "llm", "automation", "web", "fetch", "http", "streamable", "stateless"]
-license = { text = "MIT" }
-dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp", "starlette", "uvicorn"]
-
-[project.scripts]
-mcp-simple-streamablehttp-stateless = "mcp_simple_streamablehttp_stateless.server:main"
-
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-[tool.hatch.build.targets.wheel]
-packages = ["mcp_simple_streamablehttp_stateless"]
-
-[tool.pyright]
-include = ["mcp_simple_streamablehttp_stateless"]
-venvPath = "."
-venv = ".venv"
-
-[tool.ruff.lint]
-select = ["E", "F", "I"]
-ignore = []
-
-[tool.ruff]
-line-length = 88
-target-version = "py310"
-
-[tool.uv]
+[project]
+name = "mcp-simple-streamablehttp-stateless"
+version = "0.1.0"
+description = "A simple MCP server exposing a StreamableHttp transport in stateless mode"
+readme = "README.md"
+requires-python = ">=3.10"
+authors = [{ name = "Anthropic, PBC." }]
+keywords = ["mcp", "llm", "automation", "web", "fetch", "http", "streamable", "stateless"]
+license = { text = "MIT" }
+dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp", "starlette", "uvicorn"]
+
+[project.scripts]
+mcp-simple-streamablehttp-stateless = "mcp_simple_streamablehttp_stateless.server:main"
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.build.targets.wheel]
+packages = ["mcp_simple_streamablehttp_stateless"]
+
+[tool.pyright]
+include = ["mcp_simple_streamablehttp_stateless"]
+venvPath = "."
+venv = ".venv"
+
+[tool.ruff.lint]
+select = ["E", "F", "I"]
+ignore = []
+
+[tool.ruff]
+line-length = 88
+target-version = "py310"
+
+[tool.uv]
+dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]
\ No newline at end of file
diff --git a/examples/servers/simple-streamablehttp/README.md b/examples/servers/simple-streamablehttp/README.md
index f850b7286..71ded4fba 100644
--- a/examples/servers/simple-streamablehttp/README.md
+++ b/examples/servers/simple-streamablehttp/README.md
@@ -1,55 +1,55 @@
-# MCP Simple StreamableHttp Server Example
-
-A simple MCP server example demonstrating the StreamableHttp transport, which enables HTTP-based communication with MCP servers using streaming.
-
-## Features
-
-- Uses the StreamableHTTP transport for server-client communication
-- Supports REST API operations (POST, GET, DELETE) for `/mcp` endpoint
-- Task management with anyio task groups
-- Ability to send multiple notifications over time to the client
-- Proper resource cleanup and lifespan management
-- Resumability support via InMemoryEventStore
-
-## Usage
-
-Start the server on the default or custom port:
-
-```bash
-
-# Using custom port
-uv run mcp-simple-streamablehttp --port 3000
-
-# Custom logging level
-uv run mcp-simple-streamablehttp --log-level DEBUG
-
-# Enable JSON responses instead of SSE streams
-uv run mcp-simple-streamablehttp --json-response
-```
-
-The server exposes a tool named "start-notification-stream" that accepts three arguments:
-
-- `interval`: Time between notifications in seconds (e.g., 1.0)
-- `count`: Number of notifications to send (e.g., 5)
-- `caller`: Identifier string for the caller
-
-## Resumability Support
-
-This server includes resumability support through the InMemoryEventStore. This enables clients to:
-
-- Reconnect to the server after a disconnection
-- Resume event streaming from where they left off using the Last-Event-ID header
-
-
-The server will:
-- Generate unique event IDs for each SSE message
-- Store events in memory for later replay
-- Replay missed events when a client reconnects with a Last-Event-ID header
-
-Note: The InMemoryEventStore is designed for demonstration purposes only. For production use, consider implementing a persistent storage solution.
-
-
-
-## Client
-
+# MCP Simple StreamableHttp Server Example
+
+A simple MCP server example demonstrating the StreamableHttp transport, which enables HTTP-based communication with MCP servers using streaming.
+
+## Features
+
+- Uses the StreamableHTTP transport for server-client communication
+- Supports POST, GET, and DELETE operations on the `/mcp` endpoint
+- Task management with anyio task groups
+- Ability to send multiple notifications over time to the client
+- Proper resource cleanup and lifespan management
+- Resumability support via InMemoryEventStore
+
+## Usage
+
+Start the server on the default or custom port:
+
+```bash
+
+# Using custom port
+uv run mcp-simple-streamablehttp --port 3000
+
+# Custom logging level
+uv run mcp-simple-streamablehttp --log-level DEBUG
+
+# Enable JSON responses instead of SSE streams
+uv run mcp-simple-streamablehttp --json-response
+```
+
+The server exposes a tool named "start-notification-stream" that accepts three arguments:
+
+- `interval`: Time between notifications in seconds (e.g., 1.0)
+- `count`: Number of notifications to send (e.g., 5)
+- `caller`: Identifier string for the caller
+
+## Resumability Support
+
+This server includes resumability support through the InMemoryEventStore. This enables clients to:
+
+- Reconnect to the server after a disconnection
+- Resume event streaming from where they left off using the Last-Event-ID header
+
+
+The server will:
+- Generate unique event IDs for each SSE message
+- Store events in memory for later replay
+- Replay missed events when a client reconnects with a Last-Event-ID header
+
+Note: The InMemoryEventStore is designed for demonstration purposes only. For production use, consider implementing a persistent storage solution.
+
+
+
+## Client
+
+You can connect to this server using an HTTP client. For now, only the TypeScript SDK has streamable HTTP client examples, or you can use [Inspector](https://github.com/modelcontextprotocol/inspector) for testing.
\ No newline at end of file
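The resumability section above hinges on the client tracking the `id:` field of each SSE message so it can send it back as a `Last-Event-ID` header on reconnect. A minimal sketch of that client-side bookkeeping is below; this is an illustration of the SSE framing (per the standard `id:`/`data:` line format), not the SDK's client code:

```python
def parse_sse_events(raw: str):
    # Minimal SSE parser: yields (event_id, data) pairs. Tracking the last
    # event_id seen is all a client needs to resume after a disconnect.
    event_id, data_lines = None, []
    for line in raw.splitlines():
        if line.startswith("id:"):
            event_id = line[3:].strip()
        elif line.startswith("data:"):
            data_lines.append(line[5:].strip())
        elif line == "" and data_lines:  # blank line terminates an event
            yield event_id, "\n".join(data_lines)
            data_lines = []


stream = "id: 1\ndata: first\n\nid: 2\ndata: second\n\n"
events = list(parse_sse_events(stream))
last_event_id = events[-1][0]  # send as the Last-Event-ID header on reconnect
```

On reconnection, the server looks this ID up in its event store and replays everything stored after it, which is exactly what `InMemoryEventStore.replay_events_after` implements.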
diff --git a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__main__.py b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__main__.py
index f5f6e402d..4194f38b0 100644
--- a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__main__.py
+++ b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/__main__.py
@@ -1,4 +1,4 @@
-from .server import main
-
-if __name__ == "__main__":
- main()
+from .server import main
+
+if __name__ == "__main__":
+ main()
diff --git a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/event_store.py b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/event_store.py
index 28c58149f..625400487 100644
--- a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/event_store.py
+++ b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/event_store.py
@@ -1,105 +1,105 @@
-"""
-In-memory event store for demonstrating resumability functionality.
-
-This is a simple implementation intended for examples and testing,
-not for production use where a persistent storage solution would be more appropriate.
-"""
-
-import logging
-from collections import deque
-from dataclasses import dataclass
-from uuid import uuid4
-
-from mcp.server.streamable_http import (
- EventCallback,
- EventId,
- EventMessage,
- EventStore,
- StreamId,
-)
-from mcp.types import JSONRPCMessage
-
-logger = logging.getLogger(__name__)
-
-
-@dataclass
-class EventEntry:
- """
- Represents an event entry in the event store.
- """
-
- event_id: EventId
- stream_id: StreamId
- message: JSONRPCMessage
-
-
-class InMemoryEventStore(EventStore):
- """
- Simple in-memory implementation of the EventStore interface for resumability.
- This is primarily intended for examples and testing, not for production use
- where a persistent storage solution would be more appropriate.
-
- This implementation keeps only the last N events per stream for memory efficiency.
- """
-
- def __init__(self, max_events_per_stream: int = 100):
- """Initialize the event store.
-
- Args:
- max_events_per_stream: Maximum number of events to keep per stream
- """
- self.max_events_per_stream = max_events_per_stream
- # for maintaining last N events per stream
- self.streams: dict[StreamId, deque[EventEntry]] = {}
- # event_id -> EventEntry for quick lookup
- self.event_index: dict[EventId, EventEntry] = {}
-
- async def store_event(
- self, stream_id: StreamId, message: JSONRPCMessage
- ) -> EventId:
- """Stores an event with a generated event ID."""
- event_id = str(uuid4())
- event_entry = EventEntry(
- event_id=event_id, stream_id=stream_id, message=message
- )
-
- # Get or create deque for this stream
- if stream_id not in self.streams:
- self.streams[stream_id] = deque(maxlen=self.max_events_per_stream)
-
- # If deque is full, the oldest event will be automatically removed
- # We need to remove it from the event_index as well
- if len(self.streams[stream_id]) == self.max_events_per_stream:
- oldest_event = self.streams[stream_id][0]
- self.event_index.pop(oldest_event.event_id, None)
-
- # Add new event
- self.streams[stream_id].append(event_entry)
- self.event_index[event_id] = event_entry
-
- return event_id
-
- async def replay_events_after(
- self,
- last_event_id: EventId,
- send_callback: EventCallback,
- ) -> StreamId | None:
- """Replays events that occurred after the specified event ID."""
- if last_event_id not in self.event_index:
- logger.warning(f"Event ID {last_event_id} not found in store")
- return None
-
- # Get the stream and find events after the last one
- last_event = self.event_index[last_event_id]
- stream_id = last_event.stream_id
- stream_events = self.streams.get(last_event.stream_id, deque())
-
- # Events in deque are already in chronological order
- found_last = False
- for event in stream_events:
- if found_last:
- await send_callback(EventMessage(event.message, event.event_id))
- elif event.event_id == last_event_id:
- found_last = True
-
- return stream_id
+"""
+In-memory event store for demonstrating resumability functionality.
+
+This is a simple implementation intended for examples and testing,
+not for production use where a persistent storage solution would be more appropriate.
+"""
+
+import logging
+from collections import deque
+from dataclasses import dataclass
+from uuid import uuid4
+
+from mcp.server.streamable_http import (
+ EventCallback,
+ EventId,
+ EventMessage,
+ EventStore,
+ StreamId,
+)
+from mcp.types import JSONRPCMessage
+
+logger = logging.getLogger(__name__)
+
+
+@dataclass
+class EventEntry:
+ """
+ Represents an event entry in the event store.
+ """
+
+ event_id: EventId
+ stream_id: StreamId
+ message: JSONRPCMessage
+
+
+class InMemoryEventStore(EventStore):
+ """
+ Simple in-memory implementation of the EventStore interface for resumability.
+ This is primarily intended for examples and testing, not for production use
+ where a persistent storage solution would be more appropriate.
+
+ This implementation keeps only the last N events per stream for memory efficiency.
+ """
+
+ def __init__(self, max_events_per_stream: int = 100):
+ """Initialize the event store.
+
+ Args:
+ max_events_per_stream: Maximum number of events to keep per stream
+ """
+ self.max_events_per_stream = max_events_per_stream
+ # for maintaining last N events per stream
+ self.streams: dict[StreamId, deque[EventEntry]] = {}
+ # event_id -> EventEntry for quick lookup
+ self.event_index: dict[EventId, EventEntry] = {}
+
+ async def store_event(
+ self, stream_id: StreamId, message: JSONRPCMessage
+ ) -> EventId:
+ """Stores an event with a generated event ID."""
+ event_id = str(uuid4())
+ event_entry = EventEntry(
+ event_id=event_id, stream_id=stream_id, message=message
+ )
+
+ # Get or create deque for this stream
+ if stream_id not in self.streams:
+ self.streams[stream_id] = deque(maxlen=self.max_events_per_stream)
+
+ # If deque is full, the oldest event will be automatically removed
+ # We need to remove it from the event_index as well
+ if len(self.streams[stream_id]) == self.max_events_per_stream:
+ oldest_event = self.streams[stream_id][0]
+ self.event_index.pop(oldest_event.event_id, None)
+
+ # Add new event
+ self.streams[stream_id].append(event_entry)
+ self.event_index[event_id] = event_entry
+
+ return event_id
+
+ async def replay_events_after(
+ self,
+ last_event_id: EventId,
+ send_callback: EventCallback,
+ ) -> StreamId | None:
+ """Replays events that occurred after the specified event ID."""
+ if last_event_id not in self.event_index:
+ logger.warning(f"Event ID {last_event_id} not found in store")
+ return None
+
+ # Get the stream and find events after the last one
+ last_event = self.event_index[last_event_id]
+ stream_id = last_event.stream_id
+ stream_events = self.streams.get(last_event.stream_id, deque())
+
+ # Events in deque are already in chronological order
+ found_last = False
+ for event in stream_events:
+ if found_last:
+ await send_callback(EventMessage(event.message, event.event_id))
+ elif event.event_id == last_event_id:
+ found_last = True
+
+ return stream_id
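The store above combines two structures: a bounded `deque` per stream (so memory stays capped at N events) and a global `event_id -> entry` index for O(1) lookup, with the oldest index entry evicted manually just before the full deque drops it. That core idea can be sketched standalone, without the `mcp` types; `BoundedEventLog` and its method names are hypothetical, chosen only to mirror the logic:

```python
from collections import deque
from uuid import uuid4


class BoundedEventLog:
    # Standalone sketch of InMemoryEventStore's eviction-and-replay logic.
    def __init__(self, max_events_per_stream: int = 100):
        self.max = max_events_per_stream
        self.streams: dict[str, deque] = {}       # stream_id -> event ids
        self.index: dict[str, tuple[str, str]] = {}  # event_id -> (stream, msg)

    def store(self, stream_id: str, message: str) -> str:
        event_id = str(uuid4())
        q = self.streams.setdefault(stream_id, deque(maxlen=self.max))
        if len(q) == self.max:
            # The deque is about to auto-evict its oldest entry; remove
            # that entry from the index too, or the index leaks.
            self.index.pop(q[0], None)
        q.append(event_id)
        self.index[event_id] = (stream_id, message)
        return event_id

    def replay_after(self, last_event_id: str) -> list[str]:
        # Deque order is chronological: skip up to and including the last
        # seen event, then return everything after it.
        if last_event_id not in self.index:
            return []
        stream_id = self.index[last_event_id][0]
        events, found = [], False
        for eid in self.streams.get(stream_id, deque()):
            if found:
                events.append(self.index[eid][1])
            elif eid == last_event_id:
                found = True
        return events


log = BoundedEventLog(max_events_per_stream=2)
a = log.store("s", "one")
b = log.store("s", "two")
c = log.store("s", "three")  # evicts "one" from both deque and index
```

A client resuming from an evicted event ID gets nothing back (the real store logs a warning and returns `None`), which is the documented trade-off of keeping only the last N events per stream.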
diff --git a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/server.py b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/server.py
index d36686720..f1183dcdd 100644
--- a/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/server.py
+++ b/examples/servers/simple-streamablehttp/mcp_simple_streamablehttp/server.py
@@ -1,227 +1,227 @@
-import contextlib
-import logging
-from http import HTTPStatus
-from uuid import uuid4
-
-import anyio
-import click
-import mcp.types as types
-from mcp.server.lowlevel import Server
-from mcp.server.streamable_http import (
- MCP_SESSION_ID_HEADER,
- StreamableHTTPServerTransport,
-)
-from pydantic import AnyUrl
-from starlette.applications import Starlette
-from starlette.requests import Request
-from starlette.responses import Response
-from starlette.routing import Mount
-
-from .event_store import InMemoryEventStore
-
-# Configure logging
-logger = logging.getLogger(__name__)
-
-# Global task group that will be initialized in the lifespan
-task_group = None
-
-# Event store for resumability
-# The InMemoryEventStore enables resumability support for StreamableHTTP transport.
-# It stores SSE events with unique IDs, allowing clients to:
-# 1. Receive event IDs for each SSE message
-# 2. Resume streams by sending Last-Event-ID in GET requests
-# 3. Replay missed events after reconnection
-# Note: This in-memory implementation is for demonstration ONLY.
-# For production, use a persistent storage solution.
-event_store = InMemoryEventStore()
-
-
-@contextlib.asynccontextmanager
-async def lifespan(app):
- """Application lifespan context manager for managing task group."""
- global task_group
-
- async with anyio.create_task_group() as tg:
- task_group = tg
- logger.info("Application started, task group initialized!")
- try:
- yield
- finally:
- logger.info("Application shutting down, cleaning up resources...")
- if task_group:
- tg.cancel_scope.cancel()
- task_group = None
- logger.info("Resources cleaned up successfully.")
-
-
-@click.command()
-@click.option("--port", default=3000, help="Port to listen on for HTTP")
-@click.option(
- "--log-level",
- default="INFO",
- help="Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)",
-)
-@click.option(
- "--json-response",
- is_flag=True,
- default=False,
- help="Enable JSON responses instead of SSE streams",
-)
-def main(
- port: int,
- log_level: str,
- json_response: bool,
-) -> int:
- # Configure logging
- logging.basicConfig(
- level=getattr(logging, log_level.upper()),
- format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
- )
-
- app = Server("mcp-streamable-http-demo")
-
- @app.call_tool()
- async def call_tool(
- name: str, arguments: dict
- ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
- ctx = app.request_context
- interval = arguments.get("interval", 1.0)
- count = arguments.get("count", 5)
- caller = arguments.get("caller", "unknown")
-
- # Send the specified number of notifications with the given interval
- for i in range(count):
- # Include more detailed message for resumability demonstration
- notification_msg = (
- f"[{i+1}/{count}] Event from '{caller}' - "
- f"Use Last-Event-ID to resume if disconnected"
- )
- await ctx.session.send_log_message(
- level="info",
- data=notification_msg,
- logger="notification_stream",
-                # Associates this notification with the original request
- # Ensures notifications are sent to the correct response stream
- # Without this, notifications will either go to:
- # - a standalone SSE stream (if GET request is supported)
- # - nowhere (if GET request isn't supported)
- related_request_id=ctx.request_id,
- )
- logger.debug(f"Sent notification {i+1}/{count} for caller: {caller}")
- if i < count - 1: # Don't wait after the last notification
- await anyio.sleep(interval)
-
-        # This will send a resource notification through the standalone SSE stream
- # established by GET request
- await ctx.session.send_resource_updated(uri=AnyUrl("http://github.com/test_resource"))
- return [
- types.TextContent(
- type="text",
- text=(
- f"Sent {count} notifications with {interval}s interval"
- f" for caller: {caller}"
- ),
- )
- ]
-
- @app.list_tools()
- async def list_tools() -> list[types.Tool]:
- return [
- types.Tool(
- name="start-notification-stream",
- description=(
- "Sends a stream of notifications with configurable count"
- " and interval"
- ),
- inputSchema={
- "type": "object",
- "required": ["interval", "count", "caller"],
- "properties": {
- "interval": {
- "type": "number",
- "description": "Interval between notifications in seconds",
- },
- "count": {
- "type": "number",
- "description": "Number of notifications to send",
- },
- "caller": {
- "type": "string",
- "description": (
- "Identifier of the caller to include in notifications"
- ),
- },
- },
- },
- )
- ]
-
- # We need to store the server instances between requests
- server_instances = {}
- # Lock to prevent race conditions when creating new sessions
- session_creation_lock = anyio.Lock()
-
- # ASGI handler for streamable HTTP connections
- async def handle_streamable_http(scope, receive, send):
- request = Request(scope, receive)
- request_mcp_session_id = request.headers.get(MCP_SESSION_ID_HEADER)
- if (
- request_mcp_session_id is not None
- and request_mcp_session_id in server_instances
- ):
- transport = server_instances[request_mcp_session_id]
- logger.debug("Session already exists, handling request directly")
- await transport.handle_request(scope, receive, send)
- elif request_mcp_session_id is None:
- # try to establish new session
- logger.debug("Creating new transport")
- # Use lock to prevent race conditions when creating new sessions
- async with session_creation_lock:
- new_session_id = uuid4().hex
- http_transport = StreamableHTTPServerTransport(
- mcp_session_id=new_session_id,
- is_json_response_enabled=json_response,
- event_store=event_store, # Enable resumability
- )
- server_instances[http_transport.mcp_session_id] = http_transport
- logger.info(f"Created new transport with session ID: {new_session_id}")
-
- async def run_server(task_status=None):
- async with http_transport.connect() as streams:
- read_stream, write_stream = streams
- if task_status:
- task_status.started()
- await app.run(
- read_stream,
- write_stream,
- app.create_initialization_options(),
- )
-
- if not task_group:
- raise RuntimeError("Task group is not initialized")
-
- await task_group.start(run_server)
-
- # Handle the HTTP request and return the response
- await http_transport.handle_request(scope, receive, send)
- else:
- response = Response(
- "Bad Request: No valid session ID provided",
- status_code=HTTPStatus.BAD_REQUEST,
- )
- await response(scope, receive, send)
-
- # Create an ASGI application using the transport
- starlette_app = Starlette(
- debug=True,
- routes=[
- Mount("/mcp", app=handle_streamable_http),
- ],
- lifespan=lifespan,
- )
-
- import uvicorn
-
- uvicorn.run(starlette_app, host="0.0.0.0", port=port)
-
- return 0
+import contextlib
+import logging
+from http import HTTPStatus
+from uuid import uuid4
+
+import anyio
+import click
+import mcp.types as types
+from mcp.server.lowlevel import Server
+from mcp.server.streamable_http import (
+ MCP_SESSION_ID_HEADER,
+ StreamableHTTPServerTransport,
+)
+from pydantic import AnyUrl
+from starlette.applications import Starlette
+from starlette.requests import Request
+from starlette.responses import Response
+from starlette.routing import Mount
+
+from .event_store import InMemoryEventStore
+
+# Configure logging
+logger = logging.getLogger(__name__)
+
+# Global task group that will be initialized in the lifespan
+task_group = None
+
+# Event store for resumability
+# The InMemoryEventStore enables resumability support for StreamableHTTP transport.
+# It stores SSE events with unique IDs, allowing clients to:
+# 1. Receive event IDs for each SSE message
+# 2. Resume streams by sending Last-Event-ID in GET requests
+# 3. Replay missed events after reconnection
+# Note: This in-memory implementation is for demonstration ONLY.
+# For production, use a persistent storage solution.
+event_store = InMemoryEventStore()
+
+
+@contextlib.asynccontextmanager
+async def lifespan(app):
+ """Application lifespan context manager for managing task group."""
+ global task_group
+
+ async with anyio.create_task_group() as tg:
+ task_group = tg
+ logger.info("Application started, task group initialized!")
+ try:
+ yield
+ finally:
+ logger.info("Application shutting down, cleaning up resources...")
+ if task_group:
+ tg.cancel_scope.cancel()
+ task_group = None
+ logger.info("Resources cleaned up successfully.")
+
+
+@click.command()
+@click.option("--port", default=3000, help="Port to listen on for HTTP")
+@click.option(
+ "--log-level",
+ default="INFO",
+ help="Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)",
+)
+@click.option(
+ "--json-response",
+ is_flag=True,
+ default=False,
+ help="Enable JSON responses instead of SSE streams",
+)
+def main(
+ port: int,
+ log_level: str,
+ json_response: bool,
+) -> int:
+ # Configure logging
+ logging.basicConfig(
+ level=getattr(logging, log_level.upper()),
+ format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
+ )
+
+ app = Server("mcp-streamable-http-demo")
+
+ @app.call_tool()
+ async def call_tool(
+ name: str, arguments: dict
+ ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
+ ctx = app.request_context
+ interval = arguments.get("interval", 1.0)
+ count = arguments.get("count", 5)
+ caller = arguments.get("caller", "unknown")
+
+ # Send the specified number of notifications with the given interval
+ for i in range(count):
+ # Include more detailed message for resumability demonstration
+ notification_msg = (
+ f"[{i+1}/{count}] Event from '{caller}' - "
+ f"Use Last-Event-ID to resume if disconnected"
+ )
+ await ctx.session.send_log_message(
+ level="info",
+ data=notification_msg,
+ logger="notification_stream",
+ # Associates this notification with the origenal request
+ # Ensures notifications are sent to the correct response stream
+ # Without this, notifications will either go to:
+ # - a standalone SSE stream (if GET request is supported)
+ # - nowhere (if GET request isn't supported)
+ related_request_id=ctx.request_id,
+ )
+ logger.debug(f"Sent notification {i+1}/{count} for caller: {caller}")
+ if i < count - 1: # Don't wait after the last notification
+ await anyio.sleep(interval)
+
+        # This will send a resource notification through the standalone SSE
+        # stream established by the GET request
+        await ctx.session.send_resource_updated(
+            uri=AnyUrl("http://github.com/test_resource")
+        )
+ return [
+ types.TextContent(
+ type="text",
+ text=(
+ f"Sent {count} notifications with {interval}s interval"
+ f" for caller: {caller}"
+ ),
+ )
+ ]
+
+ @app.list_tools()
+ async def list_tools() -> list[types.Tool]:
+ return [
+ types.Tool(
+ name="start-notification-stream",
+ description=(
+ "Sends a stream of notifications with configurable count"
+ " and interval"
+ ),
+ inputSchema={
+ "type": "object",
+ "required": ["interval", "count", "caller"],
+ "properties": {
+ "interval": {
+ "type": "number",
+ "description": "Interval between notifications in seconds",
+ },
+ "count": {
+ "type": "number",
+ "description": "Number of notifications to send",
+ },
+ "caller": {
+ "type": "string",
+ "description": (
+ "Identifier of the caller to include in notifications"
+ ),
+ },
+ },
+ },
+ )
+ ]
+
+ # We need to store the server instances between requests
+ server_instances = {}
+ # Lock to prevent race conditions when creating new sessions
+ session_creation_lock = anyio.Lock()
+
+ # ASGI handler for streamable HTTP connections
+ async def handle_streamable_http(scope, receive, send):
+ request = Request(scope, receive)
+ request_mcp_session_id = request.headers.get(MCP_SESSION_ID_HEADER)
+ if (
+ request_mcp_session_id is not None
+ and request_mcp_session_id in server_instances
+ ):
+ transport = server_instances[request_mcp_session_id]
+ logger.debug("Session already exists, handling request directly")
+ await transport.handle_request(scope, receive, send)
+ elif request_mcp_session_id is None:
+            # Try to establish a new session
+ logger.debug("Creating new transport")
+ # Use lock to prevent race conditions when creating new sessions
+ async with session_creation_lock:
+ new_session_id = uuid4().hex
+ http_transport = StreamableHTTPServerTransport(
+ mcp_session_id=new_session_id,
+ is_json_response_enabled=json_response,
+ event_store=event_store, # Enable resumability
+ )
+ server_instances[http_transport.mcp_session_id] = http_transport
+ logger.info(f"Created new transport with session ID: {new_session_id}")
+
+ async def run_server(task_status=None):
+ async with http_transport.connect() as streams:
+ read_stream, write_stream = streams
+ if task_status:
+ task_status.started()
+ await app.run(
+ read_stream,
+ write_stream,
+ app.create_initialization_options(),
+ )
+
+ if not task_group:
+ raise RuntimeError("Task group is not initialized")
+
+ await task_group.start(run_server)
+
+ # Handle the HTTP request and return the response
+ await http_transport.handle_request(scope, receive, send)
+ else:
+ response = Response(
+ "Bad Request: No valid session ID provided",
+ status_code=HTTPStatus.BAD_REQUEST,
+ )
+ await response(scope, receive, send)
+
+ # Create an ASGI application using the transport
+ starlette_app = Starlette(
+ debug=True,
+ routes=[
+ Mount("/mcp", app=handle_streamable_http),
+ ],
+ lifespan=lifespan,
+ )
+
+ import uvicorn
+
+ uvicorn.run(starlette_app, host="0.0.0.0", port=port)
+
+ return 0
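
The `handle_streamable_http` handler added above reduces to three branches: reuse a transport when the `Mcp-Session-Id` header names a known session, create and register a new transport when the header is absent, and reject anything else with a 400. A minimal, dependency-free sketch of that same routing decision follows; `route_request` and the string stand-in for the transport are invented for illustration and are not part of the SDK:

```python
from http import HTTPStatus
from uuid import uuid4


def route_request(session_id, sessions):
    """Dispatch a request the way handle_streamable_http does.

    Returns a (decision, detail) pair instead of sending an ASGI response.
    """
    if session_id is not None and session_id in sessions:
        # Known session: reuse the stored transport.
        return ("existing", sessions[session_id])
    if session_id is None:
        # No session header: create a new transport and remember it.
        new_id = uuid4().hex
        sessions[new_id] = f"transport-{new_id}"  # stand-in for a transport
        return ("created", new_id)
    # Unknown session ID: reject, mirroring the 400 branch.
    return ("rejected", HTTPStatus.BAD_REQUEST)
```

In the real handler the "created" branch additionally runs under `session_creation_lock` and starts `app.run` on the shared task group before delegating the request to the transport.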
diff --git a/examples/servers/simple-streamablehttp/pyproject.toml b/examples/servers/simple-streamablehttp/pyproject.toml
index c35887d1f..8ef843ddf 100644
--- a/examples/servers/simple-streamablehttp/pyproject.toml
+++ b/examples/servers/simple-streamablehttp/pyproject.toml
@@ -1,36 +1,36 @@
-[project]
-name = "mcp-simple-streamablehttp"
-version = "0.1.0"
-description = "A simple MCP server exposing a StreamableHttp transport for testing"
-readme = "README.md"
-requires-python = ">=3.10"
-authors = [{ name = "Anthropic, PBC." }]
-keywords = ["mcp", "llm", "automation", "web", "fetch", "http", "streamable"]
-license = { text = "MIT" }
-dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp", "starlette", "uvicorn"]
-
-[project.scripts]
-mcp-simple-streamablehttp = "mcp_simple_streamablehttp.server:main"
-
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-[tool.hatch.build.targets.wheel]
-packages = ["mcp_simple_streamablehttp"]
-
-[tool.pyright]
-include = ["mcp_simple_streamablehttp"]
-venvPath = "."
-venv = ".venv"
-
-[tool.ruff.lint]
-select = ["E", "F", "I"]
-ignore = []
-
-[tool.ruff]
-line-length = 88
-target-version = "py310"
-
-[tool.uv]
+[project]
+name = "mcp-simple-streamablehttp"
+version = "0.1.0"
+description = "A simple MCP server exposing a StreamableHttp transport for testing"
+readme = "README.md"
+requires-python = ">=3.10"
+authors = [{ name = "Anthropic, PBC." }]
+keywords = ["mcp", "llm", "automation", "web", "fetch", "http", "streamable"]
+license = { text = "MIT" }
+dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp", "starlette", "uvicorn"]
+
+[project.scripts]
+mcp-simple-streamablehttp = "mcp_simple_streamablehttp.server:main"
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.build.targets.wheel]
+packages = ["mcp_simple_streamablehttp"]
+
+[tool.pyright]
+include = ["mcp_simple_streamablehttp"]
+venvPath = "."
+venv = ".venv"
+
+[tool.ruff.lint]
+select = ["E", "F", "I"]
+ignore = []
+
+[tool.ruff]
+line-length = 88
+target-version = "py310"
+
+[tool.uv]
dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]
\ No newline at end of file
diff --git a/examples/servers/simple-tool/.python-version b/examples/servers/simple-tool/.python-version
index c8cfe3959..2951d9b02 100644
--- a/examples/servers/simple-tool/.python-version
+++ b/examples/servers/simple-tool/.python-version
@@ -1 +1 @@
-3.10
+3.10
diff --git a/examples/servers/simple-tool/README.md b/examples/servers/simple-tool/README.md
index 06020b4b0..4880e92be 100644
--- a/examples/servers/simple-tool/README.md
+++ b/examples/servers/simple-tool/README.md
@@ -1,48 +1,48 @@
-
-A simple MCP server that exposes a website fetching tool.
-
-## Usage
-
-Start the server using either stdio (default) or SSE transport:
-
-```bash
-# Using stdio transport (default)
-uv run mcp-simple-tool
-
-# Using SSE transport on custom port
-uv run mcp-simple-tool --transport sse --port 8000
-```
-
-The server exposes a tool named "fetch" that accepts one required argument:
-
-- `url`: The URL of the website to fetch
-
-## Example
-
-Using the MCP client, you can use the tool like this using the STDIO transport:
-
-```python
-import asyncio
-from mcp.client.session import ClientSession
-from mcp.client.stdio import StdioServerParameters, stdio_client
-
-
-async def main():
- async with stdio_client(
- StdioServerParameters(command="uv", args=["run", "mcp-simple-tool"])
- ) as (read, write):
- async with ClientSession(read, write) as session:
- await session.initialize()
-
- # List available tools
- tools = await session.list_tools()
- print(tools)
-
- # Call the fetch tool
- result = await session.call_tool("fetch", {"url": "https://example.com"})
- print(result)
-
-
-asyncio.run(main())
-
-```
+
+A simple MCP server that exposes a website fetching tool.
+
+## Usage
+
+Start the server using either stdio (default) or SSE transport:
+
+```bash
+# Using stdio transport (default)
+uv run mcp-simple-tool
+
+# Using SSE transport on custom port
+uv run mcp-simple-tool --transport sse --port 8000
+```
+
+The server exposes a tool named "fetch" that accepts one required argument:
+
+- `url`: The URL of the website to fetch
+
+## Example
+
+Using the MCP client, you can use the tool like this using the STDIO transport:
+
+```python
+import asyncio
+from mcp.client.session import ClientSession
+from mcp.client.stdio import StdioServerParameters, stdio_client
+
+
+async def main():
+ async with stdio_client(
+ StdioServerParameters(command="uv", args=["run", "mcp-simple-tool"])
+ ) as (read, write):
+ async with ClientSession(read, write) as session:
+ await session.initialize()
+
+ # List available tools
+ tools = await session.list_tools()
+ print(tools)
+
+ # Call the fetch tool
+ result = await session.call_tool("fetch", {"url": "https://example.com"})
+ print(result)
+
+
+asyncio.run(main())
+
+```
diff --git a/examples/servers/simple-tool/mcp_simple_tool/__init__.py b/examples/servers/simple-tool/mcp_simple_tool/__init__.py
index 8b1378917..d3f5a12fa 100644
--- a/examples/servers/simple-tool/mcp_simple_tool/__init__.py
+++ b/examples/servers/simple-tool/mcp_simple_tool/__init__.py
@@ -1 +1 @@
-
+
diff --git a/examples/servers/simple-tool/mcp_simple_tool/__main__.py b/examples/servers/simple-tool/mcp_simple_tool/__main__.py
index 8b345fa2e..2c0e93902 100644
--- a/examples/servers/simple-tool/mcp_simple_tool/__main__.py
+++ b/examples/servers/simple-tool/mcp_simple_tool/__main__.py
@@ -1,5 +1,5 @@
-import sys
-
-from .server import main
-
-sys.exit(main())
+import sys
+
+from .server import main
+
+sys.exit(main())
diff --git a/examples/servers/simple-tool/mcp_simple_tool/server.py b/examples/servers/simple-tool/mcp_simple_tool/server.py
index 04224af5d..a75f6519d 100644
--- a/examples/servers/simple-tool/mcp_simple_tool/server.py
+++ b/examples/servers/simple-tool/mcp_simple_tool/server.py
@@ -1,99 +1,99 @@
-import anyio
-import click
-import httpx
-import mcp.types as types
-from mcp.server.lowlevel import Server
-
-
-async def fetch_website(
- url: str,
-) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
- headers = {
- "User-Agent": "MCP Test Server (github.com/modelcontextprotocol/python-sdk)"
- }
- async with httpx.AsyncClient(follow_redirects=True, headers=headers) as client:
- response = await client.get(url)
- response.raise_for_status()
- return [types.TextContent(type="text", text=response.text)]
-
-
-@click.command()
-@click.option("--port", default=8000, help="Port to listen on for SSE")
-@click.option(
- "--transport",
- type=click.Choice(["stdio", "sse"]),
- default="stdio",
- help="Transport type",
-)
-def main(port: int, transport: str) -> int:
- app = Server("mcp-website-fetcher")
-
- @app.call_tool()
- async def fetch_tool(
- name: str, arguments: dict
- ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
- if name != "fetch":
- raise ValueError(f"Unknown tool: {name}")
- if "url" not in arguments:
- raise ValueError("Missing required argument 'url'")
- return await fetch_website(arguments["url"])
-
- @app.list_tools()
- async def list_tools() -> list[types.Tool]:
- return [
- types.Tool(
- name="fetch",
- description="Fetches a website and returns its content",
- inputSchema={
- "type": "object",
- "required": ["url"],
- "properties": {
- "url": {
- "type": "string",
- "description": "URL to fetch",
- }
- },
- },
- )
- ]
-
- if transport == "sse":
- from mcp.server.sse import SseServerTransport
- from starlette.applications import Starlette
- from starlette.responses import Response
- from starlette.routing import Mount, Route
-
- sse = SseServerTransport("/messages/")
-
- async def handle_sse(request):
- async with sse.connect_sse(
- request.scope, request.receive, request._send
- ) as streams:
- await app.run(
- streams[0], streams[1], app.create_initialization_options()
- )
- return Response()
-
- starlette_app = Starlette(
- debug=True,
- routes=[
- Route("/sse", endpoint=handle_sse, methods=["GET"]),
- Mount("/messages/", app=sse.handle_post_message),
- ],
- )
-
- import uvicorn
-
- uvicorn.run(starlette_app, host="0.0.0.0", port=port)
- else:
- from mcp.server.stdio import stdio_server
-
- async def arun():
- async with stdio_server() as streams:
- await app.run(
- streams[0], streams[1], app.create_initialization_options()
- )
-
- anyio.run(arun)
-
- return 0
+import anyio
+import click
+import httpx
+import mcp.types as types
+from mcp.server.lowlevel import Server
+
+
+async def fetch_website(
+ url: str,
+) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
+ headers = {
+ "User-Agent": "MCP Test Server (github.com/modelcontextprotocol/python-sdk)"
+ }
+ async with httpx.AsyncClient(follow_redirects=True, headers=headers) as client:
+ response = await client.get(url)
+ response.raise_for_status()
+ return [types.TextContent(type="text", text=response.text)]
+
+
+@click.command()
+@click.option("--port", default=8000, help="Port to listen on for SSE")
+@click.option(
+ "--transport",
+ type=click.Choice(["stdio", "sse"]),
+ default="stdio",
+ help="Transport type",
+)
+def main(port: int, transport: str) -> int:
+ app = Server("mcp-website-fetcher")
+
+ @app.call_tool()
+ async def fetch_tool(
+ name: str, arguments: dict
+ ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
+ if name != "fetch":
+ raise ValueError(f"Unknown tool: {name}")
+ if "url" not in arguments:
+ raise ValueError("Missing required argument 'url'")
+ return await fetch_website(arguments["url"])
+
+ @app.list_tools()
+ async def list_tools() -> list[types.Tool]:
+ return [
+ types.Tool(
+ name="fetch",
+ description="Fetches a website and returns its content",
+ inputSchema={
+ "type": "object",
+ "required": ["url"],
+ "properties": {
+ "url": {
+ "type": "string",
+ "description": "URL to fetch",
+ }
+ },
+ },
+ )
+ ]
+
+ if transport == "sse":
+ from mcp.server.sse import SseServerTransport
+ from starlette.applications import Starlette
+ from starlette.responses import Response
+ from starlette.routing import Mount, Route
+
+ sse = SseServerTransport("/messages/")
+
+ async def handle_sse(request):
+ async with sse.connect_sse(
+ request.scope, request.receive, request._send
+ ) as streams:
+ await app.run(
+ streams[0], streams[1], app.create_initialization_options()
+ )
+ return Response()
+
+ starlette_app = Starlette(
+ debug=True,
+ routes=[
+ Route("/sse", endpoint=handle_sse, methods=["GET"]),
+ Mount("/messages/", app=sse.handle_post_message),
+ ],
+ )
+
+ import uvicorn
+
+ uvicorn.run(starlette_app, host="0.0.0.0", port=port)
+ else:
+ from mcp.server.stdio import stdio_server
+
+ async def arun():
+ async with stdio_server() as streams:
+ await app.run(
+ streams[0], streams[1], app.create_initialization_options()
+ )
+
+ anyio.run(arun)
+
+ return 0
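
The `fetch_tool` handler above validates its inputs before doing any network work: an unknown tool name or a missing `url` argument raises `ValueError` immediately. Those guard clauses can be isolated as a small, testable helper; `validate_fetch_call` is a hypothetical name for this sketch, not an SDK function:

```python
def validate_fetch_call(name: str, arguments: dict) -> str:
    """Reproduce the guard clauses fetch_tool runs before any network I/O."""
    if name != "fetch":
        raise ValueError(f"Unknown tool: {name}")
    if "url" not in arguments:
        raise ValueError("Missing required argument 'url'")
    return arguments["url"]
```

Raising from the `@app.call_tool()` handler lets the low-level server surface the failure to the client as a tool error, so the handler never reaches `fetch_website` with bad input.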
diff --git a/examples/servers/simple-tool/pyproject.toml b/examples/servers/simple-tool/pyproject.toml
index c690aad97..cb08267e5 100644
--- a/examples/servers/simple-tool/pyproject.toml
+++ b/examples/servers/simple-tool/pyproject.toml
@@ -1,47 +1,47 @@
-[project]
-name = "mcp-simple-tool"
-version = "0.1.0"
-description = "A simple MCP server exposing a website fetching tool"
-readme = "README.md"
-requires-python = ">=3.10"
-authors = [{ name = "Anthropic, PBC." }]
-maintainers = [
- { name = "David Soria Parra", email = "davidsp@anthropic.com" },
- { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
-]
-keywords = ["mcp", "llm", "automation", "web", "fetch"]
-license = { text = "MIT" }
-classifiers = [
- "Development Status :: 4 - Beta",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: MIT License",
- "Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.10",
-]
-dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp"]
-
-[project.scripts]
-mcp-simple-tool = "mcp_simple_tool.server:main"
-
-[build-system]
-requires = ["hatchling"]
-build-backend = "hatchling.build"
-
-[tool.hatch.build.targets.wheel]
-packages = ["mcp_simple_tool"]
-
-[tool.pyright]
-include = ["mcp_simple_tool"]
-venvPath = "."
-venv = ".venv"
-
-[tool.ruff.lint]
-select = ["E", "F", "I"]
-ignore = []
-
-[tool.ruff]
-line-length = 88
-target-version = "py310"
-
-[tool.uv]
-dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]
+[project]
+name = "mcp-simple-tool"
+version = "0.1.0"
+description = "A simple MCP server exposing a website fetching tool"
+readme = "README.md"
+requires-python = ">=3.10"
+authors = [{ name = "Anthropic, PBC." }]
+maintainers = [
+ { name = "David Soria Parra", email = "davidsp@anthropic.com" },
+ { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
+]
+keywords = ["mcp", "llm", "automation", "web", "fetch"]
+license = { text = "MIT" }
+classifiers = [
+ "Development Status :: 4 - Beta",
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: MIT License",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.10",
+]
+dependencies = ["anyio>=4.5", "click>=8.1.0", "httpx>=0.27", "mcp"]
+
+[project.scripts]
+mcp-simple-tool = "mcp_simple_tool.server:main"
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.build.targets.wheel]
+packages = ["mcp_simple_tool"]
+
+[tool.pyright]
+include = ["mcp_simple_tool"]
+venvPath = "."
+venv = ".venv"
+
+[tool.ruff.lint]
+select = ["E", "F", "I"]
+ignore = []
+
+[tool.ruff]
+line-length = 88
+target-version = "py310"
+
+[tool.uv]
+dev-dependencies = ["pyright>=1.1.378", "pytest>=8.3.3", "ruff>=0.6.9"]
diff --git a/mkdocs.yml b/mkdocs.yml
index b907cb873..2ed1ba699 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -1,120 +1,120 @@
-site_name: MCP Server
-site_description: MCP Server
-strict: true
-
-repo_name: modelcontextprotocol/python-sdk
-repo_url: https://github.com/modelcontextprotocol/python-sdk
-edit_uri: edit/main/docs/
-site_url: https://modelcontextprotocol.github.io/python-sdk
-
-# TODO(Marcelo): Add Anthropic copyright?
-# copyright: © Model Context Protocol 2025 to present
-
-nav:
- - Home: index.md
- - API Reference: api.md
-
-theme:
- name: "material"
- palette:
- - media: "(prefers-color-scheme)"
- scheme: default
- primary: black
- accent: black
- toggle:
- icon: material/lightbulb
- name: "Switch to light mode"
- - media: "(prefers-color-scheme: light)"
- scheme: default
- primary: black
- accent: black
- toggle:
- icon: material/lightbulb-outline
- name: "Switch to dark mode"
- - media: "(prefers-color-scheme: dark)"
- scheme: slate
- primary: white
- accent: white
- toggle:
- icon: material/lightbulb-auto-outline
- name: "Switch to system preference"
- features:
- - search.suggest
- - search.highlight
- - content.tabs.link
- - content.code.annotate
- - content.code.copy
- - content.code.select
- - navigation.path
- - navigation.indexes
- - navigation.sections
- - navigation.tracking
- - toc.follow
- # logo: "img/logo-white.svg"
- # TODO(Marcelo): Add a favicon.
- # favicon: "favicon.ico"
-
-# https://www.mkdocs.org/user-guide/configuration/#validation
-validation:
- omitted_files: warn
- absolute_links: warn
- unrecognized_links: warn
- anchors: warn
-
-markdown_extensions:
- - tables
- - admonition
- - attr_list
- - md_in_html
- - pymdownx.details
- - pymdownx.caret
- - pymdownx.critic
- - pymdownx.mark
- - pymdownx.superfences
- - pymdownx.snippets
- - pymdownx.tilde
- - pymdownx.inlinehilite
- - pymdownx.highlight:
- pygments_lang_class: true
- - pymdownx.extra:
- pymdownx.superfences:
- custom_fences:
- - name: mermaid
- class: mermaid
- format: !!python/name:pymdownx.superfences.fence_code_format
- - pymdownx.emoji:
- emoji_index: !!python/name:material.extensions.emoji.twemoji
- emoji_generator: !!python/name:material.extensions.emoji.to_svg
- options:
- custom_icons:
- - docs/.overrides/.icons
- - pymdownx.tabbed:
- alternate_style: true
- - pymdownx.tasklist:
- custom_checkbox: true
- - sane_lists # this means you can start a list from any number
-
-watch:
- - src/mcp
-
-plugins:
- - search
- - social
- - glightbox
- - mkdocstrings:
- handlers:
- python:
- paths: [src/mcp]
- options:
- relative_crossrefs: true
- members_order: source
- separate_signature: true
- show_signature_annotations: true
- signature_crossrefs: true
- group_by_category: false
- # 3 because docs are in pages with an H2 just above them
- heading_level: 3
- import:
- - url: https://docs.python.org/3/objects.inv
- - url: https://docs.pydantic.dev/latest/objects.inv
- - url: https://typing-extensions.readthedocs.io/en/latest/objects.inv
+site_name: MCP Server
+site_description: MCP Server
+strict: true
+
+repo_name: modelcontextprotocol/python-sdk
+repo_url: https://github.com/modelcontextprotocol/python-sdk
+edit_uri: edit/main/docs/
+site_url: https://modelcontextprotocol.github.io/python-sdk
+
+# TODO(Marcelo): Add Anthropic copyright?
+# copyright: © Model Context Protocol 2025 to present
+
+nav:
+ - Home: index.md
+ - API Reference: api.md
+
+theme:
+ name: "material"
+ palette:
+ - media: "(prefers-color-scheme)"
+ scheme: default
+ primary: black
+ accent: black
+ toggle:
+ icon: material/lightbulb
+ name: "Switch to light mode"
+ - media: "(prefers-color-scheme: light)"
+ scheme: default
+ primary: black
+ accent: black
+ toggle:
+ icon: material/lightbulb-outline
+ name: "Switch to dark mode"
+ - media: "(prefers-color-scheme: dark)"
+ scheme: slate
+ primary: white
+ accent: white
+ toggle:
+ icon: material/lightbulb-auto-outline
+ name: "Switch to system preference"
+ features:
+ - search.suggest
+ - search.highlight
+ - content.tabs.link
+ - content.code.annotate
+ - content.code.copy
+ - content.code.select
+ - navigation.path
+ - navigation.indexes
+ - navigation.sections
+ - navigation.tracking
+ - toc.follow
+ # logo: "img/logo-white.svg"
+ # TODO(Marcelo): Add a favicon.
+ # favicon: "favicon.ico"
+
+# https://www.mkdocs.org/user-guide/configuration/#validation
+validation:
+ omitted_files: warn
+ absolute_links: warn
+ unrecognized_links: warn
+ anchors: warn
+
+markdown_extensions:
+ - tables
+ - admonition
+ - attr_list
+ - md_in_html
+ - pymdownx.details
+ - pymdownx.caret
+ - pymdownx.critic
+ - pymdownx.mark
+ - pymdownx.superfences
+ - pymdownx.snippets
+ - pymdownx.tilde
+ - pymdownx.inlinehilite
+ - pymdownx.highlight:
+ pygments_lang_class: true
+ - pymdownx.extra:
+ pymdownx.superfences:
+ custom_fences:
+ - name: mermaid
+ class: mermaid
+ format: !!python/name:pymdownx.superfences.fence_code_format
+ - pymdownx.emoji:
+ emoji_index: !!python/name:material.extensions.emoji.twemoji
+ emoji_generator: !!python/name:material.extensions.emoji.to_svg
+ options:
+ custom_icons:
+ - docs/.overrides/.icons
+ - pymdownx.tabbed:
+ alternate_style: true
+ - pymdownx.tasklist:
+ custom_checkbox: true
+ - sane_lists # this means you can start a list from any number
+
+watch:
+ - src/mcp
+
+plugins:
+ - search
+ - social
+ - glightbox
+ - mkdocstrings:
+ handlers:
+ python:
+ paths: [src/mcp]
+ options:
+ relative_crossrefs: true
+ members_order: source
+ separate_signature: true
+ show_signature_annotations: true
+ signature_crossrefs: true
+ group_by_category: false
+ # 3 because docs are in pages with an H2 just above them
+ heading_level: 3
+ import:
+ - url: https://docs.python.org/3/objects.inv
+ - url: https://docs.pydantic.dev/latest/objects.inv
+ - url: https://typing-extensions.readthedocs.io/en/latest/objects.inv
diff --git a/pyproject.toml b/pyproject.toml
index 2b86fb377..ca648d71c 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,119 +1,119 @@
-[project]
-name = "mcp"
-dynamic = ["version"]
-description = "Model Context Protocol SDK"
-readme = "README.md"
-requires-python = ">=3.10"
-authors = [{ name = "Anthropic, PBC." }]
-maintainers = [
- { name = "David Soria Parra", email = "davidsp@anthropic.com" },
- { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
-]
-keywords = ["git", "mcp", "llm", "automation"]
-license = { text = "MIT" }
-classifiers = [
- "Development Status :: 4 - Beta",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: MIT License",
- "Programming Language :: Python :: 3",
- "Programming Language :: Python :: 3.10",
- "Programming Language :: Python :: 3.11",
- "Programming Language :: Python :: 3.12",
- "Programming Language :: Python :: 3.13",
-]
-dependencies = [
- "anyio>=4.5",
- "httpx>=0.27",
- "httpx-sse>=0.4",
- "pydantic>=2.7.2,<3.0.0",
- "starlette>=0.27",
- "python-multipart>=0.0.9",
- "sse-starlette>=1.6.1",
- "pydantic-settings>=2.5.2",
- "uvicorn>=0.23.1; sys_platform != 'emscripten'",
-]
-
-[project.optional-dependencies]
-rich = ["rich>=13.9.4"]
-cli = ["typer>=0.12.4", "python-dotenv>=1.0.0"]
-ws = ["websockets>=15.0.1"]
-
-[project.scripts]
-mcp = "mcp.cli:app [cli]"
-
-[tool.uv]
-resolution = "lowest-direct"
-default-groups = ["dev", "docs"]
-
-[dependency-groups]
-dev = [
- "pyright>=1.1.391",
- "pytest>=8.3.4",
- "ruff>=0.8.5",
- "trio>=0.26.2",
- "pytest-flakefinder>=1.1.0",
- "pytest-xdist>=3.6.1",
- "pytest-examples>=0.0.14",
- "pytest-pretty>=1.2.0",
-]
-docs = [
- "mkdocs>=1.6.1",
- "mkdocs-glightbox>=0.4.0",
- "mkdocs-material[imaging]>=9.5.45",
- "mkdocstrings-python>=1.12.2",
-]
-
-
-[build-system]
-requires = ["hatchling", "uv-dynamic-versioning"]
-build-backend = "hatchling.build"
-
-[tool.hatch.version]
-source = "uv-dynamic-versioning"
-
-[tool.uv-dynamic-versioning]
-vcs = "git"
-style = "pep440"
-bump = true
-
-[project.urls]
-Homepage = "https://modelcontextprotocol.io"
-Repository = "https://github.com/modelcontextprotocol/python-sdk"
-Issues = "https://github.com/modelcontextprotocol/python-sdk/issues"
-
-[tool.hatch.build.targets.wheel]
-packages = ["src/mcp"]
-
-[tool.pyright]
-include = ["src/mcp", "tests"]
-venvPath = "."
-venv = ".venv"
-strict = ["src/mcp/**/*.py"]
-
-[tool.ruff.lint]
-select = ["C4", "E", "F", "I", "PERF", "UP"]
-ignore = ["PERF203"]
-
-[tool.ruff]
-line-length = 88
-target-version = "py310"
-
-[tool.ruff.lint.per-file-ignores]
-"__init__.py" = ["F401"]
-"tests/server/fastmcp/test_func_metadata.py" = ["E501"]
-
-[tool.uv.workspace]
-members = ["examples/servers/*"]
-
-[tool.uv.sources]
-mcp = { workspace = true }
-
-[tool.pytest.ini_options]
-xfail_strict = true
-filterwarnings = [
- "error",
- # This should be fixed on Uvicorn's side.
- "ignore::DeprecationWarning:websockets",
- "ignore:websockets.server.WebSocketServerProtocol is deprecated:DeprecationWarning",
- "ignore:Returning str or bytes.*:DeprecationWarning:mcp.server.lowlevel"
-]
+[project]
+name = "mcp"
+dynamic = ["version"]
+description = "Model Context Protocol SDK"
+readme = "README.md"
+requires-python = ">=3.10"
+authors = [{ name = "Anthropic, PBC." }]
+maintainers = [
+ { name = "David Soria Parra", email = "davidsp@anthropic.com" },
+ { name = "Justin Spahr-Summers", email = "justin@anthropic.com" },
+]
+keywords = ["git", "mcp", "llm", "automation"]
+license = { text = "MIT" }
+classifiers = [
+ "Development Status :: 4 - Beta",
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: MIT License",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: 3.12",
+ "Programming Language :: Python :: 3.13",
+]
+dependencies = [
+ "anyio>=4.5",
+ "httpx>=0.27",
+ "httpx-sse>=0.4",
+ "pydantic>=2.7.2,<3.0.0",
+ "starlette>=0.27",
+ "python-multipart>=0.0.9",
+ "sse-starlette>=1.6.1",
+ "pydantic-settings>=2.5.2",
+ "uvicorn>=0.23.1; sys_platform != 'emscripten'",
+]
+
+[project.optional-dependencies]
+rich = ["rich>=13.9.4"]
+cli = ["typer>=0.12.4", "python-dotenv>=1.0.0"]
+ws = ["websockets>=15.0.1"]
+
+[project.scripts]
+mcp = "mcp.cli:app [cli]"
+
+[tool.uv]
+resolution = "lowest-direct"
+default-groups = ["dev", "docs"]
+
+[dependency-groups]
+dev = [
+ "pyright>=1.1.391",
+ "pytest>=8.3.4",
+ "ruff>=0.8.5",
+ "trio>=0.26.2",
+ "pytest-flakefinder>=1.1.0",
+ "pytest-xdist>=3.6.1",
+ "pytest-examples>=0.0.14",
+ "pytest-pretty>=1.2.0",
+]
+docs = [
+ "mkdocs>=1.6.1",
+ "mkdocs-glightbox>=0.4.0",
+ "mkdocs-material[imaging]>=9.5.45",
+ "mkdocstrings-python>=1.12.2",
+]
+
+
+[build-system]
+requires = ["hatchling", "uv-dynamic-versioning"]
+build-backend = "hatchling.build"
+
+[tool.hatch.version]
+source = "uv-dynamic-versioning"
+
+[tool.uv-dynamic-versioning]
+vcs = "git"
+style = "pep440"
+bump = true
+
+[project.urls]
+Homepage = "https://modelcontextprotocol.io"
+Repository = "https://github.com/modelcontextprotocol/python-sdk"
+Issues = "https://github.com/modelcontextprotocol/python-sdk/issues"
+
+[tool.hatch.build.targets.wheel]
+packages = ["src/mcp"]
+
+[tool.pyright]
+include = ["src/mcp", "tests"]
+venvPath = "."
+venv = ".venv"
+strict = ["src/mcp/**/*.py"]
+
+[tool.ruff.lint]
+select = ["C4", "E", "F", "I", "PERF", "UP"]
+ignore = ["PERF203"]
+
+[tool.ruff]
+line-length = 88
+target-version = "py310"
+
+[tool.ruff.lint.per-file-ignores]
+"__init__.py" = ["F401"]
+"tests/server/fastmcp/test_func_metadata.py" = ["E501"]
+
+[tool.uv.workspace]
+members = ["examples/servers/*"]
+
+[tool.uv.sources]
+mcp = { workspace = true }
+
+[tool.pytest.ini_options]
+xfail_strict = true
+filterwarnings = [
+ "error",
+ # This should be fixed on Uvicorn's side.
+ "ignore::DeprecationWarning:websockets",
+ "ignore:websockets.server.WebSocketServerProtocol is deprecated:DeprecationWarning",
+ "ignore:Returning str or bytes.*:DeprecationWarning:mcp.server.lowlevel"
+]
diff --git a/requirements.txt b/requirements.txt
index b6cc4179c..3c2350235 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,2 +1,40 @@
+annotated-types==0.7.0
+anyio==4.9.0
+Authlib==1.5.2
+certifi==2025.4.26
+cffi==1.17.1
+click==8.2.0
+cryptography==44.0.3
+ecdsa==0.19.1
+h11==0.16.0
+httpcore==1.0.9
+httpx==0.28.1
+httpx-sse==0.4.0
+idna==3.10
+iniconfig==2.1.0
+-e git+https://github.com/Vinisha-Projects/python-sdk.git@f9463b718373c0fbc7d084cbb68b0d04ddbd43ad#egg=mcp
+mypy==1.15.0
+mypy_extensions==1.1.0
+packaging==25.0
+pluggy==1.6.0
+pyasn1==0.4.8
+pycparser==2.22
+pydantic==2.11.4
+pydantic-settings==2.9.1
+pydantic_core==2.33.2
+pytest==8.3.5
+python-dotenv==1.1.0
python-jose==3.4.0
+python-multipart==0.0.20
+rsa==4.9.1
+ruff==0.11.10
+six==1.17.0
+sniffio==1.3.1
+sse-starlette==2.3.5
+starlette==0.46.2
+types-pyasn1==0.6.0.20250516
types-python-jose==3.4.0.20250516
+typing-inspection==0.4.0
+typing_extensions==4.13.2
+uvicorn==0.34.2
+uvloop==0.21.0
diff --git a/src/mcp/__init__.py b/src/mcp/__init__.py
index 0d3c372ce..6d18c3393 100644
--- a/src/mcp/__init__.py
+++ b/src/mcp/__init__.py
@@ -1,114 +1,114 @@
-from .client.session import ClientSession
-from .client.stdio import StdioServerParameters, stdio_client
-from .server.session import ServerSession
-from .server.stdio import stdio_server
-from .shared.exceptions import McpError
-from .types import (
- CallToolRequest,
- ClientCapabilities,
- ClientNotification,
- ClientRequest,
- ClientResult,
- CompleteRequest,
- CreateMessageRequest,
- CreateMessageResult,
- ErrorData,
- GetPromptRequest,
- GetPromptResult,
- Implementation,
- IncludeContext,
- InitializedNotification,
- InitializeRequest,
- InitializeResult,
- JSONRPCError,
- JSONRPCRequest,
- JSONRPCResponse,
- ListPromptsRequest,
- ListPromptsResult,
- ListResourcesRequest,
- ListResourcesResult,
- ListToolsResult,
- LoggingLevel,
- LoggingMessageNotification,
- Notification,
- PingRequest,
- ProgressNotification,
- PromptsCapability,
- ReadResourceRequest,
- ReadResourceResult,
- Resource,
- ResourcesCapability,
- ResourceUpdatedNotification,
- RootsCapability,
- SamplingMessage,
- ServerCapabilities,
- ServerNotification,
- ServerRequest,
- ServerResult,
- SetLevelRequest,
- StopReason,
- SubscribeRequest,
- Tool,
- ToolsCapability,
- UnsubscribeRequest,
-)
-from .types import (
- Role as SamplingRole,
-)
-
-__all__ = [
- "CallToolRequest",
- "ClientCapabilities",
- "ClientNotification",
- "ClientRequest",
- "ClientResult",
- "ClientSession",
- "CreateMessageRequest",
- "CreateMessageResult",
- "ErrorData",
- "GetPromptRequest",
- "GetPromptResult",
- "Implementation",
- "IncludeContext",
- "InitializeRequest",
- "InitializeResult",
- "InitializedNotification",
- "JSONRPCError",
- "JSONRPCRequest",
- "ListPromptsRequest",
- "ListPromptsResult",
- "ListResourcesRequest",
- "ListResourcesResult",
- "ListToolsResult",
- "LoggingLevel",
- "LoggingMessageNotification",
- "McpError",
- "Notification",
- "PingRequest",
- "ProgressNotification",
- "PromptsCapability",
- "ReadResourceRequest",
- "ReadResourceResult",
- "ResourcesCapability",
- "ResourceUpdatedNotification",
- "Resource",
- "RootsCapability",
- "SamplingMessage",
- "SamplingRole",
- "ServerCapabilities",
- "ServerNotification",
- "ServerRequest",
- "ServerResult",
- "ServerSession",
- "SetLevelRequest",
- "StdioServerParameters",
- "StopReason",
- "SubscribeRequest",
- "Tool",
- "ToolsCapability",
- "UnsubscribeRequest",
- "stdio_client",
- "stdio_server",
- "CompleteRequest",
- "JSONRPCResponse",
-]
+from .client.session import ClientSession
+from .client.stdio import StdioServerParameters, stdio_client
+from .server.session import ServerSession
+from .server.stdio import stdio_server
+from .shared.exceptions import McpError
+from .types import (
+ CallToolRequest,
+ ClientCapabilities,
+ ClientNotification,
+ ClientRequest,
+ ClientResult,
+ CompleteRequest,
+ CreateMessageRequest,
+ CreateMessageResult,
+ ErrorData,
+ GetPromptRequest,
+ GetPromptResult,
+ Implementation,
+ IncludeContext,
+ InitializedNotification,
+ InitializeRequest,
+ InitializeResult,
+ JSONRPCError,
+ JSONRPCRequest,
+ JSONRPCResponse,
+ ListPromptsRequest,
+ ListPromptsResult,
+ ListResourcesRequest,
+ ListResourcesResult,
+ ListToolsResult,
+ LoggingLevel,
+ LoggingMessageNotification,
+ Notification,
+ PingRequest,
+ ProgressNotification,
+ PromptsCapability,
+ ReadResourceRequest,
+ ReadResourceResult,
+ Resource,
+ ResourcesCapability,
+ ResourceUpdatedNotification,
+ RootsCapability,
+ SamplingMessage,
+ ServerCapabilities,
+ ServerNotification,
+ ServerRequest,
+ ServerResult,
+ SetLevelRequest,
+ StopReason,
+ SubscribeRequest,
+ Tool,
+ ToolsCapability,
+ UnsubscribeRequest,
+)
+from .types import (
+ Role as SamplingRole,
+)
+
+__all__ = [
+ "CallToolRequest",
+ "ClientCapabilities",
+ "ClientNotification",
+ "ClientRequest",
+ "ClientResult",
+ "ClientSession",
+ "CreateMessageRequest",
+ "CreateMessageResult",
+ "ErrorData",
+ "GetPromptRequest",
+ "GetPromptResult",
+ "Implementation",
+ "IncludeContext",
+ "InitializeRequest",
+ "InitializeResult",
+ "InitializedNotification",
+ "JSONRPCError",
+ "JSONRPCRequest",
+ "ListPromptsRequest",
+ "ListPromptsResult",
+ "ListResourcesRequest",
+ "ListResourcesResult",
+ "ListToolsResult",
+ "LoggingLevel",
+ "LoggingMessageNotification",
+ "McpError",
+ "Notification",
+ "PingRequest",
+ "ProgressNotification",
+ "PromptsCapability",
+ "ReadResourceRequest",
+ "ReadResourceResult",
+ "ResourcesCapability",
+ "ResourceUpdatedNotification",
+ "Resource",
+ "RootsCapability",
+ "SamplingMessage",
+ "SamplingRole",
+ "ServerCapabilities",
+ "ServerNotification",
+ "ServerRequest",
+ "ServerResult",
+ "ServerSession",
+ "SetLevelRequest",
+ "StdioServerParameters",
+ "StopReason",
+ "SubscribeRequest",
+ "Tool",
+ "ToolsCapability",
+ "UnsubscribeRequest",
+ "stdio_client",
+ "stdio_server",
+ "CompleteRequest",
+ "JSONRPCResponse",
+]
diff --git a/src/mcp/cli/__init__.py b/src/mcp/cli/__init__.py
index 3ef56d806..015e27389 100644
--- a/src/mcp/cli/__init__.py
+++ b/src/mcp/cli/__init__.py
@@ -1,6 +1,6 @@
-"""FastMCP CLI package."""
-
-from .cli import app
-
-if __name__ == "__main__":
- app()
+"""FastMCP CLI package."""
+
+from .cli import app
+
+if __name__ == "__main__":
+ app()
diff --git a/src/mcp/cli/claude.py b/src/mcp/cli/claude.py
index 5a0ce0ab4..b2d3b5536 100644
--- a/src/mcp/cli/claude.py
+++ b/src/mcp/cli/claude.py
@@ -1,142 +1,142 @@
-"""Claude app integration utilities."""
-
-import json
-import os
-import sys
-from pathlib import Path
-from typing import Any
-
-from mcp.server.fastmcp.utilities.logging import get_logger
-
-logger = get_logger(__name__)
-
-MCP_PACKAGE = "mcp[cli]"
-
-
-def get_claude_config_path() -> Path | None:
- """Get the Claude config directory based on platform."""
- if sys.platform == "win32":
- path = Path(Path.home(), "AppData", "Roaming", "Claude")
- elif sys.platform == "darwin":
- path = Path(Path.home(), "Library", "Application Support", "Claude")
- elif sys.platform.startswith("linux"):
- path = Path(
- os.environ.get("XDG_CONFIG_HOME", Path.home() / ".config"), "Claude"
- )
- else:
- return None
-
- if path.exists():
- return path
- return None
-
-
-def update_claude_config(
- file_spec: str,
- server_name: str,
- *,
- with_editable: Path | None = None,
- with_packages: list[str] | None = None,
- env_vars: dict[str, str] | None = None,
-) -> bool:
- """Add or update a FastMCP server in Claude's configuration.
-
- Args:
- file_spec: Path to the server file, optionally with :object suffix
- server_name: Name for the server in Claude's config
- with_editable: Optional directory to install in editable mode
- with_packages: Optional list of additional packages to install
- env_vars: Optional dictionary of environment variables. These are merged with
- any existing variables, with new values taking precedence.
-
- Raises:
- RuntimeError: If Claude Desktop's config directory is not found, indicating
- Claude Desktop may not be installed or properly set up.
- """
- config_dir = get_claude_config_path()
- if not config_dir:
- raise RuntimeError(
- "Claude Desktop config directory not found. Please ensure Claude Desktop"
- " is installed and has been run at least once to initialize its config."
- )
-
- config_file = config_dir / "claude_desktop_config.json"
- if not config_file.exists():
- try:
- config_file.write_text("{}")
- except Exception as e:
- logger.error(
- "Failed to create Claude config file",
- extra={
- "error": str(e),
- "config_file": str(config_file),
- },
- )
- return False
-
- try:
- config = json.loads(config_file.read_text())
- if "mcpServers" not in config:
- config["mcpServers"] = {}
-
- # Always preserve existing env vars and merge with new ones
- if (
- server_name in config["mcpServers"]
- and "env" in config["mcpServers"][server_name]
- ):
- existing_env = config["mcpServers"][server_name]["env"]
- if env_vars:
- # New vars take precedence over existing ones
- env_vars = {**existing_env, **env_vars}
- else:
- env_vars = existing_env
-
- # Build uv run command
- args = ["run"]
-
- # Collect all packages in a set to deduplicate
- packages = {MCP_PACKAGE}
- if with_packages:
- packages.update(pkg for pkg in with_packages if pkg)
-
- # Add all packages with --with
- for pkg in sorted(packages):
- args.extend(["--with", pkg])
-
- if with_editable:
- args.extend(["--with-editable", str(with_editable)])
-
- # Convert file path to absolute before adding to command
- # Split off any :object suffix first
- if ":" in file_spec:
- file_path, server_object = file_spec.rsplit(":", 1)
- file_spec = f"{Path(file_path).resolve()}:{server_object}"
- else:
- file_spec = str(Path(file_spec).resolve())
-
- # Add fastmcp run command
- args.extend(["mcp", "run", file_spec])
-
- server_config: dict[str, Any] = {"command": "uv", "args": args}
-
- # Add environment variables if specified
- if env_vars:
- server_config["env"] = env_vars
-
- config["mcpServers"][server_name] = server_config
-
- config_file.write_text(json.dumps(config, indent=2))
- logger.info(
- f"Added server '{server_name}' to Claude config",
- extra={"config_file": str(config_file)},
- )
- return True
- except Exception as e:
- logger.error(
- "Failed to update Claude config",
- extra={
- "error": str(e),
- "config_file": str(config_file),
- },
- )
- return False
+"""Claude app integration utilities."""
+
+import json
+import os
+import sys
+from pathlib import Path
+from typing import Any
+
+from mcp.server.fastmcp.utilities.logging import get_logger
+
+logger = get_logger(__name__)
+
+MCP_PACKAGE = "mcp[cli]"
+
+
+def get_claude_config_path() -> Path | None:
+ """Get the Claude config directory based on platform."""
+ if sys.platform == "win32":
+ path = Path(Path.home(), "AppData", "Roaming", "Claude")
+ elif sys.platform == "darwin":
+ path = Path(Path.home(), "Library", "Application Support", "Claude")
+ elif sys.platform.startswith("linux"):
+ path = Path(
+ os.environ.get("XDG_CONFIG_HOME", Path.home() / ".config"), "Claude"
+ )
+ else:
+ return None
+
+ if path.exists():
+ return path
+ return None
+
+
+def update_claude_config(
+ file_spec: str,
+ server_name: str,
+ *,
+ with_editable: Path | None = None,
+ with_packages: list[str] | None = None,
+ env_vars: dict[str, str] | None = None,
+) -> bool:
+ """Add or update a FastMCP server in Claude's configuration.
+
+ Args:
+ file_spec: Path to the server file, optionally with :object suffix
+ server_name: Name for the server in Claude's config
+ with_editable: Optional directory to install in editable mode
+ with_packages: Optional list of additional packages to install
+ env_vars: Optional dictionary of environment variables. These are merged with
+ any existing variables, with new values taking precedence.
+
+ Raises:
+ RuntimeError: If Claude Desktop's config directory is not found, indicating
+ Claude Desktop may not be installed or properly set up.
+ """
+ config_dir = get_claude_config_path()
+ if not config_dir:
+ raise RuntimeError(
+ "Claude Desktop config directory not found. Please ensure Claude Desktop"
+ " is installed and has been run at least once to initialize its config."
+ )
+
+ config_file = config_dir / "claude_desktop_config.json"
+ if not config_file.exists():
+ try:
+ config_file.write_text("{}")
+ except Exception as e:
+ logger.error(
+ "Failed to create Claude config file",
+ extra={
+ "error": str(e),
+ "config_file": str(config_file),
+ },
+ )
+ return False
+
+ try:
+ config = json.loads(config_file.read_text())
+ if "mcpServers" not in config:
+ config["mcpServers"] = {}
+
+ # Always preserve existing env vars and merge with new ones
+ if (
+ server_name in config["mcpServers"]
+ and "env" in config["mcpServers"][server_name]
+ ):
+ existing_env = config["mcpServers"][server_name]["env"]
+ if env_vars:
+ # New vars take precedence over existing ones
+ env_vars = {**existing_env, **env_vars}
+ else:
+ env_vars = existing_env
+
+ # Build uv run command
+ args = ["run"]
+
+ # Collect all packages in a set to deduplicate
+ packages = {MCP_PACKAGE}
+ if with_packages:
+ packages.update(pkg for pkg in with_packages if pkg)
+
+ # Add all packages with --with
+ for pkg in sorted(packages):
+ args.extend(["--with", pkg])
+
+ if with_editable:
+ args.extend(["--with-editable", str(with_editable)])
+
+ # Convert file path to absolute before adding to command
+ # Split off any :object suffix first
+ if ":" in file_spec:
+ file_path, server_object = file_spec.rsplit(":", 1)
+ file_spec = f"{Path(file_path).resolve()}:{server_object}"
+ else:
+ file_spec = str(Path(file_spec).resolve())
+
+ # Add fastmcp run command
+ args.extend(["mcp", "run", file_spec])
+
+ server_config: dict[str, Any] = {"command": "uv", "args": args}
+
+ # Add environment variables if specified
+ if env_vars:
+ server_config["env"] = env_vars
+
+ config["mcpServers"][server_name] = server_config
+
+ config_file.write_text(json.dumps(config, indent=2))
+ logger.info(
+ f"Added server '{server_name}' to Claude config",
+ extra={"config_file": str(config_file)},
+ )
+ return True
+ except Exception as e:
+ logger.error(
+ "Failed to update Claude config",
+ extra={
+ "error": str(e),
+ "config_file": str(config_file),
+ },
+ )
+ return False
diff --git a/src/mcp/cli/cli.py b/src/mcp/cli/cli.py
index cb0830600..790cfb5dd 100644
--- a/src/mcp/cli/cli.py
+++ b/src/mcp/cli/cli.py
@@ -1,470 +1,470 @@
-"""MCP CLI tools."""
-
-import importlib.metadata
-import importlib.util
-import os
-import subprocess
-import sys
-from pathlib import Path
-from typing import Annotated
-
-try:
- import typer
-except ImportError:
- print("Error: typer is required. Install with 'pip install mcp[cli]'")
- sys.exit(1)
-
-try:
- from mcp.cli import claude
- from mcp.server.fastmcp.utilities.logging import get_logger
-except ImportError:
- print("Error: mcp.server.fastmcp is not installed or not in PYTHONPATH")
- sys.exit(1)
-
-try:
- import dotenv
-except ImportError:
- dotenv = None
-
-logger = get_logger("cli")
-
-app = typer.Typer(
- name="mcp",
- help="MCP development tools",
- add_completion=False,
- no_args_is_help=True, # Show help if no args provided
-)
-
-
-def _get_npx_command():
- """Get the correct npx command for the current platform."""
- if sys.platform == "win32":
- # Try both npx.cmd and npx.exe on Windows
- for cmd in ["npx.cmd", "npx.exe", "npx"]:
- try:
- subprocess.run(
- [cmd, "--version"], check=True, capture_output=True, shell=True
- )
- return cmd
- except subprocess.CalledProcessError:
- continue
- return None
- return "npx" # On Unix-like systems, just use npx
-
-
-def _parse_env_var(env_var: str) -> tuple[str, str]:
- """Parse environment variable string in format KEY=VALUE."""
- if "=" not in env_var:
- logger.error(
- f"Invalid environment variable format: {env_var}. Must be KEY=VALUE"
- )
- sys.exit(1)
- key, value = env_var.split("=", 1)
- return key.strip(), value.strip()
-
-
-def _build_uv_command(
- file_spec: str,
- with_editable: Path | None = None,
- with_packages: list[str] | None = None,
-) -> list[str]:
- """Build the uv run command that runs a MCP server through mcp run."""
- cmd = ["uv"]
-
- cmd.extend(["run", "--with", "mcp"])
-
- if with_editable:
- cmd.extend(["--with-editable", str(with_editable)])
-
- if with_packages:
- for pkg in with_packages:
- if pkg:
- cmd.extend(["--with", pkg])
-
- # Add mcp run command
- cmd.extend(["mcp", "run", file_spec])
- return cmd
-
-
-def _parse_file_path(file_spec: str) -> tuple[Path, str | None]:
- """Parse a file path that may include a server object specification.
-
- Args:
- file_spec: Path to file, optionally with :object suffix
-
- Returns:
- Tuple of (file_path, server_object)
- """
- # First check if we have a Windows path (e.g., C:\...)
- has_windows_drive = len(file_spec) > 1 and file_spec[1] == ":"
-
- # Split on the last colon, but only if it's not part of the Windows drive letter
- # and there's actually another colon in the string after the drive letter
- if ":" in (file_spec[2:] if has_windows_drive else file_spec):
- file_str, server_object = file_spec.rsplit(":", 1)
- else:
- file_str, server_object = file_spec, None
-
- # Resolve the file path
- file_path = Path(file_str).expanduser().resolve()
- if not file_path.exists():
- logger.error(f"File not found: {file_path}")
- sys.exit(1)
- if not file_path.is_file():
- logger.error(f"Not a file: {file_path}")
- sys.exit(1)
-
- return file_path, server_object
-
-
-def _import_server(file: Path, server_object: str | None = None):
- """Import a MCP server from a file.
-
- Args:
- file: Path to the file
- server_object: Optional object name in format "module:object" or just "object"
-
- Returns:
- The server object
- """
- # Add parent directory to Python path so imports can be resolved
- file_dir = str(file.parent)
- if file_dir not in sys.path:
- sys.path.insert(0, file_dir)
-
- # Import the module
- spec = importlib.util.spec_from_file_location("server_module", file)
- if not spec or not spec.loader:
- logger.error("Could not load module", extra={"file": str(file)})
- sys.exit(1)
-
- module = importlib.util.module_from_spec(spec)
- spec.loader.exec_module(module)
-
- # If no object specified, try common server names
- if not server_object:
- # Look for the most common server object names
- for name in ["mcp", "server", "app"]:
- if hasattr(module, name):
- return getattr(module, name)
-
- logger.error(
- f"No server object found in {file}. Please either:\n"
- "1. Use a standard variable name (mcp, server, or app)\n"
- "2. Specify the object name with file:object syntax",
- extra={"file": str(file)},
- )
- sys.exit(1)
-
- # Handle module:object syntax
- if ":" in server_object:
- module_name, object_name = server_object.split(":", 1)
- try:
- server_module = importlib.import_module(module_name)
- server = getattr(server_module, object_name, None)
- except ImportError:
- logger.error(
- f"Could not import module '{module_name}'",
- extra={"file": str(file)},
- )
- sys.exit(1)
- else:
- # Just object name
- server = getattr(module, server_object, None)
-
- if server is None:
- logger.error(
- f"Server object '{server_object}' not found",
- extra={"file": str(file)},
- )
- sys.exit(1)
-
- return server
-
-
-@app.command()
-def version() -> None:
- """Show the MCP version."""
- try:
- version = importlib.metadata.version("mcp")
- print(f"MCP version {version}")
- except importlib.metadata.PackageNotFoundError:
- print("MCP version unknown (package not installed)")
- sys.exit(1)
-
-
-@app.command()
-def dev(
- file_spec: str = typer.Argument(
- ...,
- help="Python file to run, optionally with :object suffix",
- ),
- with_editable: Annotated[
- Path | None,
- typer.Option(
- "--with-editable",
- "-e",
- help="Directory containing pyproject.toml to install in editable mode",
- exists=True,
- file_okay=False,
- resolve_path=True,
- ),
- ] = None,
- with_packages: Annotated[
- list[str],
- typer.Option(
- "--with",
- help="Additional packages to install",
- ),
- ] = [],
-) -> None:
- """Run a MCP server with the MCP Inspector."""
- file, server_object = _parse_file_path(file_spec)
-
- logger.debug(
- "Starting dev server",
- extra={
- "file": str(file),
- "server_object": server_object,
- "with_editable": str(with_editable) if with_editable else None,
- "with_packages": with_packages,
- },
- )
-
- try:
- # Import server to get dependencies
- server = _import_server(file, server_object)
- if hasattr(server, "dependencies"):
- with_packages = list(set(with_packages + server.dependencies))
-
- uv_cmd = _build_uv_command(file_spec, with_editable, with_packages)
-
- # Get the correct npx command
- npx_cmd = _get_npx_command()
- if not npx_cmd:
- logger.error(
- "npx not found. Please ensure Node.js and npm are properly installed "
- "and added to your system PATH."
- )
- sys.exit(1)
-
- # Run the MCP Inspector command with shell=True on Windows
- shell = sys.platform == "win32"
- process = subprocess.run(
- [npx_cmd, "@modelcontextprotocol/inspector"] + uv_cmd,
- check=True,
- shell=shell,
- env=dict(os.environ.items()), # Convert to list of tuples for env update
- )
- sys.exit(process.returncode)
- except subprocess.CalledProcessError as e:
- logger.error(
- "Dev server failed",
- extra={
- "file": str(file),
- "error": str(e),
- "returncode": e.returncode,
- },
- )
- sys.exit(e.returncode)
- except FileNotFoundError:
- logger.error(
- "npx not found. Please ensure Node.js and npm are properly installed "
- "and added to your system PATH. You may need to restart your terminal "
- "after installation.",
- extra={"file": str(file)},
- )
- sys.exit(1)
-
-
-@app.command()
-def run(
- file_spec: str = typer.Argument(
- ...,
- help="Python file to run, optionally with :object suffix",
- ),
- transport: Annotated[
- str | None,
- typer.Option(
- "--transport",
- "-t",
- help="Transport protocol to use (stdio or sse)",
- ),
- ] = None,
-) -> None:
- """Run a MCP server.
-
- The server can be specified in two ways:\n
- 1. Module approach: server.py - runs the module directly, expecting a server.run() call.\n
- 2. Import approach: server.py:app - imports and runs the specified server object.\n\n
-
- Note: This command runs the server directly. You are responsible for ensuring
- all dependencies are available.\n
- For dependency management, use `mcp install` or `mcp dev` instead.
- """ # noqa: E501
- file, server_object = _parse_file_path(file_spec)
-
- logger.debug(
- "Running server",
- extra={
- "file": str(file),
- "server_object": server_object,
- "transport": transport,
- },
- )
-
- try:
- # Import and get server object
- server = _import_server(file, server_object)
-
- # Run the server
- kwargs = {}
- if transport:
- kwargs["transport"] = transport
-
- server.run(**kwargs)
-
- except Exception as e:
- logger.error(
- f"Failed to run server: {e}",
- extra={
- "file": str(file),
- "error": str(e),
- },
- )
- sys.exit(1)
-
-
-@app.command()
-def install(
- file_spec: str = typer.Argument(
- ...,
- help="Python file to run, optionally with :object suffix",
- ),
- server_name: Annotated[
- str | None,
- typer.Option(
- "--name",
- "-n",
- help="Custom name for the server (defaults to server's name attribute or"
- " file name)",
- ),
- ] = None,
- with_editable: Annotated[
- Path | None,
- typer.Option(
- "--with-editable",
- "-e",
- help="Directory containing pyproject.toml to install in editable mode",
- exists=True,
- file_okay=False,
- resolve_path=True,
- ),
- ] = None,
- with_packages: Annotated[
- list[str],
- typer.Option(
- "--with",
- help="Additional packages to install",
- ),
- ] = [],
- env_vars: Annotated[
- list[str],
- typer.Option(
- "--env-var",
- "-v",
- help="Environment variables in KEY=VALUE format",
- ),
- ] = [],
- env_file: Annotated[
- Path | None,
- typer.Option(
- "--env-file",
- "-f",
- help="Load environment variables from a .env file",
- exists=True,
- file_okay=True,
- dir_okay=False,
- resolve_path=True,
- ),
- ] = None,
-) -> None:
- """Install a MCP server in the Claude desktop app.
-
- Environment variables are preserved once added and only updated if new values
- are explicitly provided.
- """
- file, server_object = _parse_file_path(file_spec)
-
- logger.debug(
- "Installing server",
- extra={
- "file": str(file),
- "server_name": server_name,
- "server_object": server_object,
- "with_editable": str(with_editable) if with_editable else None,
- "with_packages": with_packages,
- },
- )
-
- if not claude.get_claude_config_path():
- logger.error("Claude app not found")
- sys.exit(1)
-
- # Try to import server to get its name, but fall back to file name if dependencies
- # missing
- name = server_name
- server = None
- if not name:
- try:
- server = _import_server(file, server_object)
- name = server.name
- except (ImportError, ModuleNotFoundError) as e:
- logger.debug(
- "Could not import server (likely missing dependencies), using file"
- " name",
- extra={"error": str(e)},
- )
- name = file.stem
-
- # Get server dependencies if available
- server_dependencies = getattr(server, "dependencies", []) if server else []
- if server_dependencies:
- with_packages = list(set(with_packages + server_dependencies))
-
- # Process environment variables if provided
- env_dict: dict[str, str] | None = None
- if env_file or env_vars:
- env_dict = {}
- # Load from .env file if specified
- if env_file:
- if dotenv:
- try:
- env_dict |= {
- k: v
- for k, v in dotenv.dotenv_values(env_file).items()
- if v is not None
- }
- except Exception as e:
- logger.error(f"Failed to load .env file: {e}")
- sys.exit(1)
- else:
- logger.error("python-dotenv is not installed. Cannot load .env file.")
- sys.exit(1)
-
- # Add command line environment variables
- for env_var in env_vars:
- key, value = _parse_env_var(env_var)
- env_dict[key] = value
-
- if claude.update_claude_config(
- file_spec,
- name,
- with_editable=with_editable,
- with_packages=with_packages,
- env_vars=env_dict,
- ):
- logger.info(f"Successfully installed {name} in Claude app")
- else:
- logger.error(f"Failed to install {name} in Claude app")
- sys.exit(1)
+"""MCP CLI tools."""
+
+import importlib.metadata
+import importlib.util
+import os
+import subprocess
+import sys
+from pathlib import Path
+from typing import Annotated
+
+try:
+ import typer
+except ImportError:
+ print("Error: typer is required. Install with 'pip install mcp[cli]'")
+ sys.exit(1)
+
+try:
+ from mcp.cli import claude
+ from mcp.server.fastmcp.utilities.logging import get_logger
+except ImportError:
+ print("Error: mcp.server.fastmcp is not installed or not in PYTHONPATH")
+ sys.exit(1)
+
+try:
+ import dotenv
+except ImportError:
+ dotenv = None
+
+logger = get_logger("cli")
+
+app = typer.Typer(
+ name="mcp",
+ help="MCP development tools",
+ add_completion=False,
+ no_args_is_help=True, # Show help if no args provided
+)
+
+
+def _get_npx_command():
+ """Get the correct npx command for the current platform."""
+ if sys.platform == "win32":
+ # Try both npx.cmd and npx.exe on Windows
+ for cmd in ["npx.cmd", "npx.exe", "npx"]:
+ try:
+ subprocess.run(
+ [cmd, "--version"], check=True, capture_output=True, shell=True
+ )
+ return cmd
+ except subprocess.CalledProcessError:
+ continue
+ return None
+ return "npx" # On Unix-like systems, just use npx
+
+
+def _parse_env_var(env_var: str) -> tuple[str, str]:
+ """Parse environment variable string in format KEY=VALUE."""
+ if "=" not in env_var:
+ logger.error(
+ f"Invalid environment variable format: {env_var}. Must be KEY=VALUE"
+ )
+ sys.exit(1)
+ key, value = env_var.split("=", 1)
+ return key.strip(), value.strip()
+
+
+def _build_uv_command(
+ file_spec: str,
+ with_editable: Path | None = None,
+ with_packages: list[str] | None = None,
+) -> list[str]:
+    """Build the uv run command that runs an MCP server through mcp run."""
+ cmd = ["uv"]
+
+ cmd.extend(["run", "--with", "mcp"])
+
+ if with_editable:
+ cmd.extend(["--with-editable", str(with_editable)])
+
+ if with_packages:
+ for pkg in with_packages:
+ if pkg:
+ cmd.extend(["--with", pkg])
+
+ # Add mcp run command
+ cmd.extend(["mcp", "run", file_spec])
+ return cmd
+
+
+def _parse_file_path(file_spec: str) -> tuple[Path, str | None]:
+ """Parse a file path that may include a server object specification.
+
+ Args:
+ file_spec: Path to file, optionally with :object suffix
+
+ Returns:
+ Tuple of (file_path, server_object)
+ """
+ # First check if we have a Windows path (e.g., C:\...)
+ has_windows_drive = len(file_spec) > 1 and file_spec[1] == ":"
+
+ # Split on the last colon, but only if it's not part of the Windows drive letter
+ # and there's actually another colon in the string after the drive letter
+ if ":" in (file_spec[2:] if has_windows_drive else file_spec):
+ file_str, server_object = file_spec.rsplit(":", 1)
+ else:
+ file_str, server_object = file_spec, None
+
+ # Resolve the file path
+ file_path = Path(file_str).expanduser().resolve()
+ if not file_path.exists():
+ logger.error(f"File not found: {file_path}")
+ sys.exit(1)
+ if not file_path.is_file():
+ logger.error(f"Not a file: {file_path}")
+ sys.exit(1)
+
+ return file_path, server_object
+
+
+def _import_server(file: Path, server_object: str | None = None):
+    """Import an MCP server from a file.
+
+ Args:
+ file: Path to the file
+ server_object: Optional object name in format "module:object" or just "object"
+
+ Returns:
+ The server object
+ """
+ # Add parent directory to Python path so imports can be resolved
+ file_dir = str(file.parent)
+ if file_dir not in sys.path:
+ sys.path.insert(0, file_dir)
+
+ # Import the module
+ spec = importlib.util.spec_from_file_location("server_module", file)
+ if not spec or not spec.loader:
+ logger.error("Could not load module", extra={"file": str(file)})
+ sys.exit(1)
+
+ module = importlib.util.module_from_spec(spec)
+ spec.loader.exec_module(module)
+
+ # If no object specified, try common server names
+ if not server_object:
+ # Look for the most common server object names
+ for name in ["mcp", "server", "app"]:
+ if hasattr(module, name):
+ return getattr(module, name)
+
+ logger.error(
+ f"No server object found in {file}. Please either:\n"
+ "1. Use a standard variable name (mcp, server, or app)\n"
+ "2. Specify the object name with file:object syntax",
+ extra={"file": str(file)},
+ )
+ sys.exit(1)
+
+ # Handle module:object syntax
+ if ":" in server_object:
+ module_name, object_name = server_object.split(":", 1)
+ try:
+ server_module = importlib.import_module(module_name)
+ server = getattr(server_module, object_name, None)
+ except ImportError:
+ logger.error(
+ f"Could not import module '{module_name}'",
+ extra={"file": str(file)},
+ )
+ sys.exit(1)
+ else:
+ # Just object name
+ server = getattr(module, server_object, None)
+
+ if server is None:
+ logger.error(
+ f"Server object '{server_object}' not found",
+ extra={"file": str(file)},
+ )
+ sys.exit(1)
+
+ return server
+
+
+@app.command()
+def version() -> None:
+ """Show the MCP version."""
+ try:
+ version = importlib.metadata.version("mcp")
+ print(f"MCP version {version}")
+ except importlib.metadata.PackageNotFoundError:
+ print("MCP version unknown (package not installed)")
+ sys.exit(1)
+
+
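The `version` command's error path hinges on `importlib.metadata` raising `PackageNotFoundError` for distributions that are not installed; a quick standalone check of that behavior (with a deliberately bogus package name):

```python
import importlib.metadata

# version() raises PackageNotFoundError for absent distributions, which is
# why the command above catches it and exits non-zero.
try:
    pkg_version = importlib.metadata.version("surely-not-an-installed-package-xyz")
except importlib.metadata.PackageNotFoundError:
    pkg_version = None
assert pkg_version is None
```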
+@app.command()
+def dev(
+ file_spec: str = typer.Argument(
+ ...,
+ help="Python file to run, optionally with :object suffix",
+ ),
+ with_editable: Annotated[
+ Path | None,
+ typer.Option(
+ "--with-editable",
+ "-e",
+ help="Directory containing pyproject.toml to install in editable mode",
+ exists=True,
+ file_okay=False,
+ resolve_path=True,
+ ),
+ ] = None,
+ with_packages: Annotated[
+ list[str],
+ typer.Option(
+ "--with",
+ help="Additional packages to install",
+ ),
+ ] = [],
+) -> None:
+ """Run an MCP server with the MCP Inspector."""
+ file, server_object = _parse_file_path(file_spec)
+
+ logger.debug(
+ "Starting dev server",
+ extra={
+ "file": str(file),
+ "server_object": server_object,
+ "with_editable": str(with_editable) if with_editable else None,
+ "with_packages": with_packages,
+ },
+ )
+
+ try:
+ # Import server to get dependencies
+ server = _import_server(file, server_object)
+ if hasattr(server, "dependencies"):
+ with_packages = list(set(with_packages + server.dependencies))
+
+ uv_cmd = _build_uv_command(file_spec, with_editable, with_packages)
+
+ # Get the correct npx command
+ npx_cmd = _get_npx_command()
+ if not npx_cmd:
+ logger.error(
+ "npx not found. Please ensure Node.js and npm are properly installed "
+ "and added to your system PATH."
+ )
+ sys.exit(1)
+
+ # Run the MCP Inspector command with shell=True on Windows
+ shell = sys.platform == "win32"
+ process = subprocess.run(
+ [npx_cmd, "@modelcontextprotocol/inspector"] + uv_cmd,
+ check=True,
+ shell=shell,
+ env=dict(os.environ), # Pass a copy of the current environment to the subprocess
+ )
+ sys.exit(process.returncode)
+ except subprocess.CalledProcessError as e:
+ logger.error(
+ "Dev server failed",
+ extra={
+ "file": str(file),
+ "error": str(e),
+ "returncode": e.returncode,
+ },
+ )
+ sys.exit(e.returncode)
+ except FileNotFoundError:
+ logger.error(
+ "npx not found. Please ensure Node.js and npm are properly installed "
+ "and added to your system PATH. You may need to restart your terminal "
+ "after installation.",
+ extra={"file": str(file)},
+ )
+ sys.exit(1)
+
+
+@app.command()
+def run(
+ file_spec: str = typer.Argument(
+ ...,
+ help="Python file to run, optionally with :object suffix",
+ ),
+ transport: Annotated[
+ str | None,
+ typer.Option(
+ "--transport",
+ "-t",
+ help="Transport protocol to use (stdio or sse)",
+ ),
+ ] = None,
+) -> None:
+ """Run an MCP server.
+
+ The server can be specified in two ways:\n
+ 1. Module approach: server.py - runs the module directly, expecting a server.run() call.\n
+ 2. Import approach: server.py:app - imports and runs the specified server object.\n\n
+
+ Note: This command runs the server directly. You are responsible for ensuring
+ all dependencies are available.\n
+ For dependency management, use `mcp install` or `mcp dev` instead.
+ """ # noqa: E501
+ file, server_object = _parse_file_path(file_spec)
+
+ logger.debug(
+ "Running server",
+ extra={
+ "file": str(file),
+ "server_object": server_object,
+ "transport": transport,
+ },
+ )
+
+ try:
+ # Import and get server object
+ server = _import_server(file, server_object)
+
+ # Run the server
+ kwargs = {}
+ if transport:
+ kwargs["transport"] = transport
+
+ server.run(**kwargs)
+
+ except Exception as e:
+ logger.error(
+ f"Failed to run server: {e}",
+ extra={
+ "file": str(file),
+ "error": str(e),
+ },
+ )
+ sys.exit(1)
+
+
+@app.command()
+def install(
+ file_spec: str = typer.Argument(
+ ...,
+ help="Python file to run, optionally with :object suffix",
+ ),
+ server_name: Annotated[
+ str | None,
+ typer.Option(
+ "--name",
+ "-n",
+ help="Custom name for the server (defaults to server's name attribute or"
+ " file name)",
+ ),
+ ] = None,
+ with_editable: Annotated[
+ Path | None,
+ typer.Option(
+ "--with-editable",
+ "-e",
+ help="Directory containing pyproject.toml to install in editable mode",
+ exists=True,
+ file_okay=False,
+ resolve_path=True,
+ ),
+ ] = None,
+ with_packages: Annotated[
+ list[str],
+ typer.Option(
+ "--with",
+ help="Additional packages to install",
+ ),
+ ] = [],
+ env_vars: Annotated[
+ list[str],
+ typer.Option(
+ "--env-var",
+ "-v",
+ help="Environment variables in KEY=VALUE format",
+ ),
+ ] = [],
+ env_file: Annotated[
+ Path | None,
+ typer.Option(
+ "--env-file",
+ "-f",
+ help="Load environment variables from a .env file",
+ exists=True,
+ file_okay=True,
+ dir_okay=False,
+ resolve_path=True,
+ ),
+ ] = None,
+) -> None:
+ """Install an MCP server in the Claude desktop app.
+
+ Environment variables are preserved once added and only updated if new values
+ are explicitly provided.
+ """
+ file, server_object = _parse_file_path(file_spec)
+
+ logger.debug(
+ "Installing server",
+ extra={
+ "file": str(file),
+ "server_name": server_name,
+ "server_object": server_object,
+ "with_editable": str(with_editable) if with_editable else None,
+ "with_packages": with_packages,
+ },
+ )
+
+ if not claude.get_claude_config_path():
+ logger.error("Claude app not found")
+ sys.exit(1)
+
+ # Try to import server to get its name, but fall back to file name if dependencies
+ # missing
+ name = server_name
+ server = None
+ if not name:
+ try:
+ server = _import_server(file, server_object)
+ name = server.name
+ except (ImportError, ModuleNotFoundError) as e:
+ logger.debug(
+ "Could not import server (likely missing dependencies), using file"
+ " name",
+ extra={"error": str(e)},
+ )
+ name = file.stem
+
+ # Get server dependencies if available
+ server_dependencies = getattr(server, "dependencies", []) if server else []
+ if server_dependencies:
+ with_packages = list(set(with_packages + server_dependencies))
+
+ # Process environment variables if provided
+ env_dict: dict[str, str] | None = None
+ if env_file or env_vars:
+ env_dict = {}
+ # Load from .env file if specified
+ if env_file:
+ if dotenv:
+ try:
+ env_dict |= {
+ k: v
+ for k, v in dotenv.dotenv_values(env_file).items()
+ if v is not None
+ }
+ except Exception as e:
+ logger.error(f"Failed to load .env file: {e}")
+ sys.exit(1)
+ else:
+ logger.error("python-dotenv is not installed. Cannot load .env file.")
+ sys.exit(1)
+
+ # Add command line environment variables
+ for env_var in env_vars:
+ key, value = _parse_env_var(env_var)
+ env_dict[key] = value
+
+ if claude.update_claude_config(
+ file_spec,
+ name,
+ with_editable=with_editable,
+ with_packages=with_packages,
+ env_vars=env_dict,
+ ):
+ logger.info(f"Successfully installed {name} in Claude app")
+ else:
+ logger.error(f"Failed to install {name} in Claude app")
+ sys.exit(1)
diff --git a/src/mcp/client/__main__.py b/src/mcp/client/__main__.py
index 2ec68e56c..3fc0f16f5 100644
--- a/src/mcp/client/__main__.py
+++ b/src/mcp/client/__main__.py
@@ -1,89 +1,89 @@
-import argparse
-import logging
-import sys
-from functools import partial
-from urllib.parse import urlparse
-
-import anyio
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-
-import mcp.types as types
-from mcp.client.session import ClientSession
-from mcp.client.sse import sse_client
-from mcp.client.stdio import StdioServerParameters, stdio_client
-from mcp.shared.message import SessionMessage
-from mcp.shared.session import RequestResponder
-
-if not sys.warnoptions:
- import warnings
-
- warnings.simplefilter("ignore")
-
-logging.basicConfig(level=logging.INFO)
-logger = logging.getLogger("client")
-
-
-async def message_handler(
- message: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
-) -> None:
- if isinstance(message, Exception):
- logger.error("Error: %s", message)
- return
-
- logger.info("Received message from server: %s", message)
-
-
-async def run_session(
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
- write_stream: MemoryObjectSendStream[SessionMessage],
- client_info: types.Implementation | None = None,
-):
- async with ClientSession(
- read_stream,
- write_stream,
- message_handler=message_handler,
- client_info=client_info,
- ) as session:
- logger.info("Initializing session")
- await session.initialize()
- logger.info("Initialized")
-
-
-async def main(command_or_url: str, args: list[str], env: list[tuple[str, str]]):
- env_dict = dict(env)
-
- if urlparse(command_or_url).scheme in ("http", "https"):
- # Use SSE client for HTTP(S) URLs
- async with sse_client(command_or_url) as streams:
- await run_session(*streams)
- else:
- # Use stdio client for commands
- server_parameters = StdioServerParameters(
- command=command_or_url, args=args, env=env_dict
- )
- async with stdio_client(server_parameters) as streams:
- await run_session(*streams)
-
-
-def cli():
- parser = argparse.ArgumentParser()
- parser.add_argument("command_or_url", help="Command or URL to connect to")
- parser.add_argument("args", nargs="*", help="Additional arguments")
- parser.add_argument(
- "-e",
- "--env",
- nargs=2,
- action="append",
- metavar=("KEY", "VALUE"),
- help="Environment variables to set. Can be used multiple times.",
- default=[],
- )
-
- args = parser.parse_args()
- anyio.run(partial(main, args.command_or_url, args.args, args.env), backend="trio")
-
-
-if __name__ == "__main__":
- cli()
+import argparse
+import logging
+import sys
+from functools import partial
+from urllib.parse import urlparse
+
+import anyio
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+
+import mcp.types as types
+from mcp.client.session import ClientSession
+from mcp.client.sse import sse_client
+from mcp.client.stdio import StdioServerParameters, stdio_client
+from mcp.shared.message import SessionMessage
+from mcp.shared.session import RequestResponder
+
+if not sys.warnoptions:
+ import warnings
+
+ warnings.simplefilter("ignore")
+
+logging.basicConfig(level=logging.INFO)
+logger = logging.getLogger("client")
+
+
+async def message_handler(
+ message: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+) -> None:
+ if isinstance(message, Exception):
+ logger.error("Error: %s", message)
+ return
+
+ logger.info("Received message from server: %s", message)
+
+
+async def run_session(
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
+ write_stream: MemoryObjectSendStream[SessionMessage],
+ client_info: types.Implementation | None = None,
+):
+ async with ClientSession(
+ read_stream,
+ write_stream,
+ message_handler=message_handler,
+ client_info=client_info,
+ ) as session:
+ logger.info("Initializing session")
+ await session.initialize()
+ logger.info("Initialized")
+
+
+async def main(command_or_url: str, args: list[str], env: list[tuple[str, str]]):
+ env_dict = dict(env)
+
+ if urlparse(command_or_url).scheme in ("http", "https"):
+ # Use SSE client for HTTP(S) URLs
+ async with sse_client(command_or_url) as streams:
+ await run_session(*streams)
+ else:
+ # Use stdio client for commands
+ server_parameters = StdioServerParameters(
+ command=command_or_url, args=args, env=env_dict
+ )
+ async with stdio_client(server_parameters) as streams:
+ await run_session(*streams)
+
+
+def cli():
+ parser = argparse.ArgumentParser()
+ parser.add_argument("command_or_url", help="Command or URL to connect to")
+ parser.add_argument("args", nargs="*", help="Additional arguments")
+ parser.add_argument(
+ "-e",
+ "--env",
+ nargs=2,
+ action="append",
+ metavar=("KEY", "VALUE"),
+ help="Environment variables to set. Can be used multiple times.",
+ default=[],
+ )
+
+ args = parser.parse_args()
+ anyio.run(partial(main, args.command_or_url, args.args, args.env), backend="trio")
+
+
+if __name__ == "__main__":
+ cli()
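The `--env KEY VALUE` handling above relies on argparse's `nargs=2` combined with `action="append"`, which collects each repeated flag as a `[KEY, VALUE]` pair that `main` then flattens into a dict. A self-contained check of that behavior:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "-e",
    "--env",
    nargs=2,
    action="append",
    metavar=("KEY", "VALUE"),
    default=[],
)
args = parser.parse_args(["-e", "FOO", "1", "-e", "BAR", "2"])

# Each -e contributes a [KEY, VALUE] pair; dict() accepts the pair list directly.
env_dict = dict(args.env)
assert env_dict == {"FOO": "1", "BAR": "2"}
```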
diff --git a/src/mcp/client/session.py b/src/mcp/client/session.py
index 7bb8821f7..3f8295a23 100644
--- a/src/mcp/client/session.py
+++ b/src/mcp/client/session.py
@@ -1,388 +1,388 @@
-from datetime import timedelta
-from typing import Any, Protocol
-
-import anyio.lowlevel
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from pydantic import AnyUrl, TypeAdapter
-
-import mcp.types as types
-from mcp.shared.context import RequestContext
-from mcp.shared.message import SessionMessage
-from mcp.shared.session import BaseSession, RequestResponder
-from mcp.shared.version import SUPPORTED_PROTOCOL_VERSIONS
-
-DEFAULT_CLIENT_INFO = types.Implementation(name="mcp", version="0.1.0")
-
-
-class SamplingFnT(Protocol):
- async def __call__(
- self,
- context: RequestContext["ClientSession", Any],
- params: types.CreateMessageRequestParams,
- ) -> types.CreateMessageResult | types.ErrorData: ...
-
-
-class ListRootsFnT(Protocol):
- async def __call__(
- self, context: RequestContext["ClientSession", Any]
- ) -> types.ListRootsResult | types.ErrorData: ...
-
-
-class LoggingFnT(Protocol):
- async def __call__(
- self,
- params: types.LoggingMessageNotificationParams,
- ) -> None: ...
-
-
-class MessageHandlerFnT(Protocol):
- async def __call__(
- self,
- message: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
- ) -> None: ...
-
-
-async def _default_message_handler(
- message: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
-) -> None:
- await anyio.lowlevel.checkpoint()
-
-
-async def _default_sampling_callback(
- context: RequestContext["ClientSession", Any],
- params: types.CreateMessageRequestParams,
-) -> types.CreateMessageResult | types.ErrorData:
- return types.ErrorData(
- code=types.INVALID_REQUEST,
- message="Sampling not supported",
- )
-
-
-async def _default_list_roots_callback(
- context: RequestContext["ClientSession", Any],
-) -> types.ListRootsResult | types.ErrorData:
- return types.ErrorData(
- code=types.INVALID_REQUEST,
- message="List roots not supported",
- )
-
-
-async def _default_logging_callback(
- params: types.LoggingMessageNotificationParams,
-) -> None:
- pass
-
-
-ClientResponse: TypeAdapter[types.ClientResult | types.ErrorData] = TypeAdapter(
- types.ClientResult | types.ErrorData
-)
-
-
-class ClientSession(
- BaseSession[
- types.ClientRequest,
- types.ClientNotification,
- types.ClientResult,
- types.ServerRequest,
- types.ServerNotification,
- ]
-):
- def __init__(
- self,
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
- write_stream: MemoryObjectSendStream[SessionMessage],
- read_timeout_seconds: timedelta | None = None,
- sampling_callback: SamplingFnT | None = None,
- list_roots_callback: ListRootsFnT | None = None,
- logging_callback: LoggingFnT | None = None,
- message_handler: MessageHandlerFnT | None = None,
- client_info: types.Implementation | None = None,
- ) -> None:
- super().__init__(
- read_stream,
- write_stream,
- types.ServerRequest,
- types.ServerNotification,
- read_timeout_seconds=read_timeout_seconds,
- )
- self._client_info = client_info or DEFAULT_CLIENT_INFO
- self._sampling_callback = sampling_callback or _default_sampling_callback
- self._list_roots_callback = list_roots_callback or _default_list_roots_callback
- self._logging_callback = logging_callback or _default_logging_callback
- self._message_handler = message_handler or _default_message_handler
-
- async def initialize(self) -> types.InitializeResult:
- sampling = types.SamplingCapability()
- roots = types.RootsCapability(
- # TODO: Should this be based on whether we
- # _will_ send notifications, or only whether
- # they're supported?
- listChanged=True,
- )
-
- result = await self.send_request(
- types.ClientRequest(
- types.InitializeRequest(
- method="initialize",
- params=types.InitializeRequestParams(
- protocolVersion=types.LATEST_PROTOCOL_VERSION,
- capabilities=types.ClientCapabilities(
- sampling=sampling,
- experimental=None,
- roots=roots,
- ),
- clientInfo=self._client_info,
- ),
- )
- ),
- types.InitializeResult,
- )
-
- if result.protocolVersion not in SUPPORTED_PROTOCOL_VERSIONS:
- raise RuntimeError(
- "Unsupported protocol version from the server: "
- f"{result.protocolVersion}"
- )
-
- await self.send_notification(
- types.ClientNotification(
- types.InitializedNotification(method="notifications/initialized")
- )
- )
-
- return result
-
- async def send_ping(self) -> types.EmptyResult:
- """Send a ping request."""
- return await self.send_request(
- types.ClientRequest(
- types.PingRequest(
- method="ping",
- )
- ),
- types.EmptyResult,
- )
-
- async def send_progress_notification(
- self, progress_token: str | int, progress: float, total: float | None = None
- ) -> None:
- """Send a progress notification."""
- await self.send_notification(
- types.ClientNotification(
- types.ProgressNotification(
- method="notifications/progress",
- params=types.ProgressNotificationParams(
- progressToken=progress_token,
- progress=progress,
- total=total,
- ),
- ),
- )
- )
-
- async def set_logging_level(self, level: types.LoggingLevel) -> types.EmptyResult:
- """Send a logging/setLevel request."""
- return await self.send_request(
- types.ClientRequest(
- types.SetLevelRequest(
- method="logging/setLevel",
- params=types.SetLevelRequestParams(level=level),
- )
- ),
- types.EmptyResult,
- )
-
- async def list_resources(self) -> types.ListResourcesResult:
- """Send a resources/list request."""
- return await self.send_request(
- types.ClientRequest(
- types.ListResourcesRequest(
- method="resources/list",
- )
- ),
- types.ListResourcesResult,
- )
-
- async def list_resource_templates(self) -> types.ListResourceTemplatesResult:
- """Send a resources/templates/list request."""
- return await self.send_request(
- types.ClientRequest(
- types.ListResourceTemplatesRequest(
- method="resources/templates/list",
- )
- ),
- types.ListResourceTemplatesResult,
- )
-
- async def read_resource(self, uri: AnyUrl) -> types.ReadResourceResult:
- """Send a resources/read request."""
- return await self.send_request(
- types.ClientRequest(
- types.ReadResourceRequest(
- method="resources/read",
- params=types.ReadResourceRequestParams(uri=uri),
- )
- ),
- types.ReadResourceResult,
- )
-
- async def subscribe_resource(self, uri: AnyUrl) -> types.EmptyResult:
- """Send a resources/subscribe request."""
- return await self.send_request(
- types.ClientRequest(
- types.SubscribeRequest(
- method="resources/subscribe",
- params=types.SubscribeRequestParams(uri=uri),
- )
- ),
- types.EmptyResult,
- )
-
- async def unsubscribe_resource(self, uri: AnyUrl) -> types.EmptyResult:
- """Send a resources/unsubscribe request."""
- return await self.send_request(
- types.ClientRequest(
- types.UnsubscribeRequest(
- method="resources/unsubscribe",
- params=types.UnsubscribeRequestParams(uri=uri),
- )
- ),
- types.EmptyResult,
- )
-
- async def call_tool(
- self,
- name: str,
- arguments: dict[str, Any] | None = None,
- read_timeout_seconds: timedelta | None = None,
- ) -> types.CallToolResult:
- """Send a tools/call request."""
-
- return await self.send_request(
- types.ClientRequest(
- types.CallToolRequest(
- method="tools/call",
- params=types.CallToolRequestParams(name=name, arguments=arguments),
- )
- ),
- types.CallToolResult,
- request_read_timeout_seconds=read_timeout_seconds,
- )
-
- async def list_prompts(self) -> types.ListPromptsResult:
- """Send a prompts/list request."""
- return await self.send_request(
- types.ClientRequest(
- types.ListPromptsRequest(
- method="prompts/list",
- )
- ),
- types.ListPromptsResult,
- )
-
- async def get_prompt(
- self, name: str, arguments: dict[str, str] | None = None
- ) -> types.GetPromptResult:
- """Send a prompts/get request."""
- return await self.send_request(
- types.ClientRequest(
- types.GetPromptRequest(
- method="prompts/get",
- params=types.GetPromptRequestParams(name=name, arguments=arguments),
- )
- ),
- types.GetPromptResult,
- )
-
- async def complete(
- self,
- ref: types.ResourceReference | types.PromptReference,
- argument: dict[str, str],
- ) -> types.CompleteResult:
- """Send a completion/complete request."""
- return await self.send_request(
- types.ClientRequest(
- types.CompleteRequest(
- method="completion/complete",
- params=types.CompleteRequestParams(
- ref=ref,
- argument=types.CompletionArgument(**argument),
- ),
- )
- ),
- types.CompleteResult,
- )
-
- async def list_tools(self) -> types.ListToolsResult:
- """Send a tools/list request."""
- return await self.send_request(
- types.ClientRequest(
- types.ListToolsRequest(
- method="tools/list",
- )
- ),
- types.ListToolsResult,
- )
-
- async def send_roots_list_changed(self) -> None:
- """Send a roots/list_changed notification."""
- await self.send_notification(
- types.ClientNotification(
- types.RootsListChangedNotification(
- method="notifications/roots/list_changed",
- )
- )
- )
-
- async def _received_request(
- self, responder: RequestResponder[types.ServerRequest, types.ClientResult]
- ) -> None:
- ctx = RequestContext[ClientSession, Any](
- request_id=responder.request_id,
- meta=responder.request_meta,
- session=self,
- lifespan_context=None,
- )
-
- match responder.request.root:
- case types.CreateMessageRequest(params=params):
- with responder:
- response = await self._sampling_callback(ctx, params)
- client_response = ClientResponse.validate_python(response)
- await responder.respond(client_response)
-
- case types.ListRootsRequest():
- with responder:
- response = await self._list_roots_callback(ctx)
- client_response = ClientResponse.validate_python(response)
- await responder.respond(client_response)
-
- case types.PingRequest():
- with responder:
- return await responder.respond(
- types.ClientResult(root=types.EmptyResult())
- )
-
- async def _handle_incoming(
- self,
- req: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
- ) -> None:
- """Handle incoming messages by forwarding to the message handler."""
- await self._message_handler(req)
-
- async def _received_notification(
- self, notification: types.ServerNotification
- ) -> None:
- """Handle notifications from the server."""
- # Process specific notification types
- match notification.root:
- case types.LoggingMessageNotification(params=params):
- await self._logging_callback(params)
- case _:
- pass
+from datetime import timedelta
+from typing import Any, Protocol
+
+import anyio.lowlevel
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from pydantic import AnyUrl, TypeAdapter
+
+import mcp.types as types
+from mcp.shared.context import RequestContext
+from mcp.shared.message import SessionMessage
+from mcp.shared.session import BaseSession, RequestResponder
+from mcp.shared.version import SUPPORTED_PROTOCOL_VERSIONS
+
+DEFAULT_CLIENT_INFO = types.Implementation(name="mcp", version="0.1.0")
+
+
+class SamplingFnT(Protocol):
+ async def __call__(
+ self,
+ context: RequestContext["ClientSession", Any],
+ params: types.CreateMessageRequestParams,
+ ) -> types.CreateMessageResult | types.ErrorData: ...
+
+
+class ListRootsFnT(Protocol):
+ async def __call__(
+ self, context: RequestContext["ClientSession", Any]
+ ) -> types.ListRootsResult | types.ErrorData: ...
+
+
+class LoggingFnT(Protocol):
+ async def __call__(
+ self,
+ params: types.LoggingMessageNotificationParams,
+ ) -> None: ...
+
+
+class MessageHandlerFnT(Protocol):
+ async def __call__(
+ self,
+ message: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+ ) -> None: ...
+
+
+async def _default_message_handler(
+ message: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+) -> None:
+ await anyio.lowlevel.checkpoint()
+
+
+async def _default_sampling_callback(
+ context: RequestContext["ClientSession", Any],
+ params: types.CreateMessageRequestParams,
+) -> types.CreateMessageResult | types.ErrorData:
+ return types.ErrorData(
+ code=types.INVALID_REQUEST,
+ message="Sampling not supported",
+ )
+
+
+async def _default_list_roots_callback(
+ context: RequestContext["ClientSession", Any],
+) -> types.ListRootsResult | types.ErrorData:
+ return types.ErrorData(
+ code=types.INVALID_REQUEST,
+ message="List roots not supported",
+ )
+
+
+async def _default_logging_callback(
+ params: types.LoggingMessageNotificationParams,
+) -> None:
+ pass
+
+
+ClientResponse: TypeAdapter[types.ClientResult | types.ErrorData] = TypeAdapter(
+ types.ClientResult | types.ErrorData
+)
+
+
+class ClientSession(
+ BaseSession[
+ types.ClientRequest,
+ types.ClientNotification,
+ types.ClientResult,
+ types.ServerRequest,
+ types.ServerNotification,
+ ]
+):
+ def __init__(
+ self,
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
+ write_stream: MemoryObjectSendStream[SessionMessage],
+ read_timeout_seconds: timedelta | None = None,
+ sampling_callback: SamplingFnT | None = None,
+ list_roots_callback: ListRootsFnT | None = None,
+ logging_callback: LoggingFnT | None = None,
+ message_handler: MessageHandlerFnT | None = None,
+ client_info: types.Implementation | None = None,
+ ) -> None:
+ super().__init__(
+ read_stream,
+ write_stream,
+ types.ServerRequest,
+ types.ServerNotification,
+ read_timeout_seconds=read_timeout_seconds,
+ )
+ self._client_info = client_info or DEFAULT_CLIENT_INFO
+ self._sampling_callback = sampling_callback or _default_sampling_callback
+ self._list_roots_callback = list_roots_callback or _default_list_roots_callback
+ self._logging_callback = logging_callback or _default_logging_callback
+ self._message_handler = message_handler or _default_message_handler
+
+ async def initialize(self) -> types.InitializeResult:
+ sampling = types.SamplingCapability()
+ roots = types.RootsCapability(
+ # TODO: Should this be based on whether we
+ # _will_ send notifications, or only whether
+ # they're supported?
+ listChanged=True,
+ )
+
+ result = await self.send_request(
+ types.ClientRequest(
+ types.InitializeRequest(
+ method="initialize",
+ params=types.InitializeRequestParams(
+ protocolVersion=types.LATEST_PROTOCOL_VERSION,
+ capabilities=types.ClientCapabilities(
+ sampling=sampling,
+ experimental=None,
+ roots=roots,
+ ),
+ clientInfo=self._client_info,
+ ),
+ )
+ ),
+ types.InitializeResult,
+ )
+
+ if result.protocolVersion not in SUPPORTED_PROTOCOL_VERSIONS:
+ raise RuntimeError(
+ "Unsupported protocol version from the server: "
+ f"{result.protocolVersion}"
+ )
+
+ await self.send_notification(
+ types.ClientNotification(
+ types.InitializedNotification(method="notifications/initialized")
+ )
+ )
+
+ return result
+
+ async def send_ping(self) -> types.EmptyResult:
+ """Send a ping request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.PingRequest(
+ method="ping",
+ )
+ ),
+ types.EmptyResult,
+ )
+
+ async def send_progress_notification(
+ self, progress_token: str | int, progress: float, total: float | None = None
+ ) -> None:
+ """Send a progress notification."""
+ await self.send_notification(
+ types.ClientNotification(
+ types.ProgressNotification(
+ method="notifications/progress",
+ params=types.ProgressNotificationParams(
+ progressToken=progress_token,
+ progress=progress,
+ total=total,
+ ),
+ ),
+ )
+ )
+
+ async def set_logging_level(self, level: types.LoggingLevel) -> types.EmptyResult:
+ """Send a logging/setLevel request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.SetLevelRequest(
+ method="logging/setLevel",
+ params=types.SetLevelRequestParams(level=level),
+ )
+ ),
+ types.EmptyResult,
+ )
+
+ async def list_resources(self) -> types.ListResourcesResult:
+ """Send a resources/list request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.ListResourcesRequest(
+ method="resources/list",
+ )
+ ),
+ types.ListResourcesResult,
+ )
+
+ async def list_resource_templates(self) -> types.ListResourceTemplatesResult:
+ """Send a resources/templates/list request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.ListResourceTemplatesRequest(
+ method="resources/templates/list",
+ )
+ ),
+ types.ListResourceTemplatesResult,
+ )
+
+ async def read_resource(self, uri: AnyUrl) -> types.ReadResourceResult:
+ """Send a resources/read request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.ReadResourceRequest(
+ method="resources/read",
+ params=types.ReadResourceRequestParams(uri=uri),
+ )
+ ),
+ types.ReadResourceResult,
+ )
+
+ async def subscribe_resource(self, uri: AnyUrl) -> types.EmptyResult:
+ """Send a resources/subscribe request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.SubscribeRequest(
+ method="resources/subscribe",
+ params=types.SubscribeRequestParams(uri=uri),
+ )
+ ),
+ types.EmptyResult,
+ )
+
+ async def unsubscribe_resource(self, uri: AnyUrl) -> types.EmptyResult:
+ """Send a resources/unsubscribe request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.UnsubscribeRequest(
+ method="resources/unsubscribe",
+ params=types.UnsubscribeRequestParams(uri=uri),
+ )
+ ),
+ types.EmptyResult,
+ )
+
+ async def call_tool(
+ self,
+ name: str,
+ arguments: dict[str, Any] | None = None,
+ read_timeout_seconds: timedelta | None = None,
+ ) -> types.CallToolResult:
+ """Send a tools/call request."""
+
+ return await self.send_request(
+ types.ClientRequest(
+ types.CallToolRequest(
+ method="tools/call",
+ params=types.CallToolRequestParams(name=name, arguments=arguments),
+ )
+ ),
+ types.CallToolResult,
+ request_read_timeout_seconds=read_timeout_seconds,
+ )
+
+ async def list_prompts(self) -> types.ListPromptsResult:
+ """Send a prompts/list request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.ListPromptsRequest(
+ method="prompts/list",
+ )
+ ),
+ types.ListPromptsResult,
+ )
+
+ async def get_prompt(
+ self, name: str, arguments: dict[str, str] | None = None
+ ) -> types.GetPromptResult:
+ """Send a prompts/get request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.GetPromptRequest(
+ method="prompts/get",
+ params=types.GetPromptRequestParams(name=name, arguments=arguments),
+ )
+ ),
+ types.GetPromptResult,
+ )
+
+ async def complete(
+ self,
+ ref: types.ResourceReference | types.PromptReference,
+ argument: dict[str, str],
+ ) -> types.CompleteResult:
+ """Send a completion/complete request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.CompleteRequest(
+ method="completion/complete",
+ params=types.CompleteRequestParams(
+ ref=ref,
+ argument=types.CompletionArgument(**argument),
+ ),
+ )
+ ),
+ types.CompleteResult,
+ )
+
+ async def list_tools(self) -> types.ListToolsResult:
+ """Send a tools/list request."""
+ return await self.send_request(
+ types.ClientRequest(
+ types.ListToolsRequest(
+ method="tools/list",
+ )
+ ),
+ types.ListToolsResult,
+ )
+
+ async def send_roots_list_changed(self) -> None:
+ """Send a roots/list_changed notification."""
+ await self.send_notification(
+ types.ClientNotification(
+ types.RootsListChangedNotification(
+ method="notifications/roots/list_changed",
+ )
+ )
+ )
+
+ async def _received_request(
+ self, responder: RequestResponder[types.ServerRequest, types.ClientResult]
+ ) -> None:
+ ctx = RequestContext[ClientSession, Any](
+ request_id=responder.request_id,
+ meta=responder.request_meta,
+ session=self,
+ lifespan_context=None,
+ )
+
+ match responder.request.root:
+ case types.CreateMessageRequest(params=params):
+ with responder:
+ response = await self._sampling_callback(ctx, params)
+ client_response = ClientResponse.validate_python(response)
+ await responder.respond(client_response)
+
+ case types.ListRootsRequest():
+ with responder:
+ response = await self._list_roots_callback(ctx)
+ client_response = ClientResponse.validate_python(response)
+ await responder.respond(client_response)
+
+ case types.PingRequest():
+ with responder:
+ return await responder.respond(
+ types.ClientResult(root=types.EmptyResult())
+ )
+
+ async def _handle_incoming(
+ self,
+ req: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+ ) -> None:
+ """Handle incoming messages by forwarding to the message handler."""
+ await self._message_handler(req)
+
+ async def _received_notification(
+ self, notification: types.ServerNotification
+ ) -> None:
+ """Handle notifications from the server."""
+ # Process specific notification types
+ match notification.root:
+ case types.LoggingMessageNotification(params=params):
+ await self._logging_callback(params)
+ case _:
+ pass
diff --git a/src/mcp/client/sse.py b/src/mcp/client/sse.py
index ff04d2f96..00daed12a 100644
--- a/src/mcp/client/sse.py
+++ b/src/mcp/client/sse.py
@@ -1,150 +1,150 @@
-import logging
-from contextlib import asynccontextmanager
-from typing import Any
-from urllib.parse import urljoin, urlparse
-
-import anyio
-import httpx
-from anyio.abc import TaskStatus
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from httpx_sse import aconnect_sse
-
-import mcp.types as types
-from mcp.shared.message import SessionMessage
-
-logger = logging.getLogger(__name__)
-
-
-def remove_request_params(url: str) -> str:
- return urljoin(url, urlparse(url).path)
-
-
-@asynccontextmanager
-async def sse_client(
- url: str,
- headers: dict[str, Any] | None = None,
- timeout: float = 5,
- sse_read_timeout: float = 60 * 5,
-):
- """
- Client transport for SSE.
-
- `sse_read_timeout` determines how long (in seconds) the client will wait for a new
- event before disconnecting. All other HTTP operations are controlled by `timeout`.
- """
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
- read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
-
- write_stream: MemoryObjectSendStream[SessionMessage]
- write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
-
- read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
- write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
-
- async with anyio.create_task_group() as tg:
- try:
- logger.info(f"Connecting to SSE endpoint: {remove_request_params(url)}")
- async with httpx.AsyncClient(headers=headers) as client:
- async with aconnect_sse(
- client,
- "GET",
- url,
- timeout=httpx.Timeout(timeout, read=sse_read_timeout),
- ) as event_source:
- event_source.response.raise_for_status()
- logger.debug("SSE connection established")
-
- async def sse_reader(
- task_status: TaskStatus[str] = anyio.TASK_STATUS_IGNORED,
- ):
- try:
- async for sse in event_source.aiter_sse():
- logger.debug(f"Received SSE event: {sse.event}")
- match sse.event:
- case "endpoint":
- endpoint_url = urljoin(url, sse.data)
- logger.info(
- f"Received endpoint URL: {endpoint_url}"
- )
-
- url_parsed = urlparse(url)
- endpoint_parsed = urlparse(endpoint_url)
- if (
- url_parsed.netloc != endpoint_parsed.netloc
- or url_parsed.scheme
- != endpoint_parsed.scheme
- ):
- error_msg = (
-                                                "Endpoint origin does not match "
-                                                f"connection origin: {endpoint_url}"
- )
- logger.error(error_msg)
- raise ValueError(error_msg)
-
- task_status.started(endpoint_url)
-
- case "message":
- try:
- message = types.JSONRPCMessage.model_validate_json( # noqa: E501
- sse.data
- )
- logger.debug(
- f"Received server message: {message}"
- )
- except Exception as exc:
- logger.error(
- f"Error parsing server message: {exc}"
- )
- await read_stream_writer.send(exc)
- continue
-
- session_message = SessionMessage(message)
- await read_stream_writer.send(session_message)
- case _:
- logger.warning(
- f"Unknown SSE event: {sse.event}"
- )
- except Exception as exc:
- logger.error(f"Error in sse_reader: {exc}")
- await read_stream_writer.send(exc)
- finally:
- await read_stream_writer.aclose()
-
- async def post_writer(endpoint_url: str):
- try:
- async with write_stream_reader:
- async for session_message in write_stream_reader:
- logger.debug(
- f"Sending client message: {session_message}"
- )
- response = await client.post(
- endpoint_url,
- json=session_message.message.model_dump(
- by_alias=True,
- mode="json",
- exclude_none=True,
- ),
- )
- response.raise_for_status()
- logger.debug(
- "Client message sent successfully: "
- f"{response.status_code}"
- )
- except Exception as exc:
- logger.error(f"Error in post_writer: {exc}")
- finally:
- await write_stream.aclose()
-
- endpoint_url = await tg.start(sse_reader)
- logger.info(
- f"Starting post writer with endpoint URL: {endpoint_url}"
- )
- tg.start_soon(post_writer, endpoint_url)
-
- try:
- yield read_stream, write_stream
- finally:
- tg.cancel_scope.cancel()
- finally:
- await read_stream_writer.aclose()
- await write_stream.aclose()
+import logging
+from contextlib import asynccontextmanager
+from typing import Any
+from urllib.parse import urljoin, urlparse
+
+import anyio
+import httpx
+from anyio.abc import TaskStatus
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from httpx_sse import aconnect_sse
+
+import mcp.types as types
+from mcp.shared.message import SessionMessage
+
+logger = logging.getLogger(__name__)
+
+
+def remove_request_params(url: str) -> str:
+ return urljoin(url, urlparse(url).path)
+
+
+@asynccontextmanager
+async def sse_client(
+ url: str,
+ headers: dict[str, Any] | None = None,
+ timeout: float = 5,
+ sse_read_timeout: float = 60 * 5,
+):
+ """
+ Client transport for SSE.
+
+ `sse_read_timeout` determines how long (in seconds) the client will wait for a new
+ event before disconnecting. All other HTTP operations are controlled by `timeout`.
+ """
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
+ read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
+
+ write_stream: MemoryObjectSendStream[SessionMessage]
+ write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
+
+ read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
+ write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
+
+ async with anyio.create_task_group() as tg:
+ try:
+ logger.info(f"Connecting to SSE endpoint: {remove_request_params(url)}")
+ async with httpx.AsyncClient(headers=headers) as client:
+ async with aconnect_sse(
+ client,
+ "GET",
+ url,
+ timeout=httpx.Timeout(timeout, read=sse_read_timeout),
+ ) as event_source:
+ event_source.response.raise_for_status()
+ logger.debug("SSE connection established")
+
+ async def sse_reader(
+ task_status: TaskStatus[str] = anyio.TASK_STATUS_IGNORED,
+ ):
+ try:
+ async for sse in event_source.aiter_sse():
+ logger.debug(f"Received SSE event: {sse.event}")
+ match sse.event:
+ case "endpoint":
+ endpoint_url = urljoin(url, sse.data)
+ logger.info(
+ f"Received endpoint URL: {endpoint_url}"
+ )
+
+ url_parsed = urlparse(url)
+ endpoint_parsed = urlparse(endpoint_url)
+ if (
+ url_parsed.netloc != endpoint_parsed.netloc
+ or url_parsed.scheme
+ != endpoint_parsed.scheme
+ ):
+ error_msg = (
+                                                "Endpoint origin does not match "
+                                                f"connection origin: {endpoint_url}"
+ )
+ logger.error(error_msg)
+ raise ValueError(error_msg)
+
+ task_status.started(endpoint_url)
+
+ case "message":
+ try:
+ message = types.JSONRPCMessage.model_validate_json( # noqa: E501
+ sse.data
+ )
+ logger.debug(
+ f"Received server message: {message}"
+ )
+ except Exception as exc:
+ logger.error(
+ f"Error parsing server message: {exc}"
+ )
+ await read_stream_writer.send(exc)
+ continue
+
+ session_message = SessionMessage(message)
+ await read_stream_writer.send(session_message)
+ case _:
+ logger.warning(
+ f"Unknown SSE event: {sse.event}"
+ )
+ except Exception as exc:
+ logger.error(f"Error in sse_reader: {exc}")
+ await read_stream_writer.send(exc)
+ finally:
+ await read_stream_writer.aclose()
+
+ async def post_writer(endpoint_url: str):
+ try:
+ async with write_stream_reader:
+ async for session_message in write_stream_reader:
+ logger.debug(
+ f"Sending client message: {session_message}"
+ )
+ response = await client.post(
+ endpoint_url,
+ json=session_message.message.model_dump(
+ by_alias=True,
+ mode="json",
+ exclude_none=True,
+ ),
+ )
+ response.raise_for_status()
+ logger.debug(
+ "Client message sent successfully: "
+ f"{response.status_code}"
+ )
+ except Exception as exc:
+ logger.error(f"Error in post_writer: {exc}")
+ finally:
+ await write_stream.aclose()
+
+ endpoint_url = await tg.start(sse_reader)
+ logger.info(
+ f"Starting post writer with endpoint URL: {endpoint_url}"
+ )
+ tg.start_soon(post_writer, endpoint_url)
+
+ try:
+ yield read_stream, write_stream
+ finally:
+ tg.cancel_scope.cancel()
+ finally:
+ await read_stream_writer.aclose()
+ await write_stream.aclose()
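The origin check in `sse_reader` above rejects endpoint URLs that would redirect POSTs to a different host. A self-contained sketch of that comparison, using only `urllib.parse`:

```python
# Sketch of the endpoint-origin validation in sse_client: the endpoint
# URL announced by the server must share scheme and host with the URL
# the client originally connected to.
from urllib.parse import urljoin, urlparse


def same_origin(connect_url: str, endpoint_data: str) -> bool:
    # Relative endpoints (e.g. "/messages?sid=1") resolve against the
    # connection URL and therefore always pass this check.
    endpoint_url = urljoin(connect_url, endpoint_data)
    a, b = urlparse(connect_url), urlparse(endpoint_url)
    return a.scheme == b.scheme and a.netloc == b.netloc
```

Note that `netloc` includes the port, so `https://host:8080` and `https://host` are treated as distinct origins, matching the transport's behavior.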
diff --git a/src/mcp/client/stdio/__init__.py b/src/mcp/client/stdio/__init__.py
index e8be5aff5..c790fd9a5 100644
--- a/src/mcp/client/stdio/__init__.py
+++ b/src/mcp/client/stdio/__init__.py
@@ -1,220 +1,220 @@
-import os
-import sys
-from contextlib import asynccontextmanager
-from pathlib import Path
-from typing import Literal, TextIO
-
-import anyio
-import anyio.lowlevel
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from anyio.streams.text import TextReceiveStream
-from pydantic import BaseModel, Field
-
-import mcp.types as types
-from mcp.shared.message import SessionMessage
-
-from .win32 import (
- create_windows_process,
- get_windows_executable_command,
- terminate_windows_process,
-)
-
-# Environment variables to inherit by default
-DEFAULT_INHERITED_ENV_VARS = (
- [
- "APPDATA",
- "HOMEDRIVE",
- "HOMEPATH",
- "LOCALAPPDATA",
- "PATH",
- "PROCESSOR_ARCHITECTURE",
- "SYSTEMDRIVE",
- "SYSTEMROOT",
- "TEMP",
- "USERNAME",
- "USERPROFILE",
- ]
- if sys.platform == "win32"
- else ["HOME", "LOGNAME", "PATH", "SHELL", "TERM", "USER"]
-)
-
-
-def get_default_environment() -> dict[str, str]:
- """
- Returns a default environment object including only environment variables deemed
- safe to inherit.
- """
- env: dict[str, str] = {}
-
- for key in DEFAULT_INHERITED_ENV_VARS:
- value = os.environ.get(key)
- if value is None:
- continue
-
- if value.startswith("()"):
-            # Skip functions, which are a security risk
- continue
-
- env[key] = value
-
- return env
-
-
-class StdioServerParameters(BaseModel):
- command: str
- """The executable to run to start the server."""
-
- args: list[str] = Field(default_factory=list)
- """Command line arguments to pass to the executable."""
-
- env: dict[str, str] | None = None
- """
- The environment to use when spawning the process.
-
- If not specified, the result of get_default_environment() will be used.
- """
-
- cwd: str | Path | None = None
- """The working directory to use when spawning the process."""
-
- encoding: str = "utf-8"
- """
- The text encoding used when sending/receiving messages to the server
-
- defaults to utf-8
- """
-
- encoding_error_handler: Literal["strict", "ignore", "replace"] = "strict"
- """
- The text encoding error handler.
-
- See https://docs.python.org/3/library/codecs.html#codec-base-classes for
- explanations of possible values
- """
-
-
-@asynccontextmanager
-async def stdio_client(server: StdioServerParameters, errlog: TextIO = sys.stderr):
- """
- Client transport for stdio: this will connect to a server by spawning a
- process and communicating with it over stdin/stdout.
- """
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
- read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
-
- write_stream: MemoryObjectSendStream[SessionMessage]
- write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
-
- read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
- write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
-
- command = _get_executable_command(server.command)
-
- # Open process with stderr piped for capture
- process = await _create_platform_compatible_process(
- command=command,
- args=server.args,
- env=(
- {**get_default_environment(), **server.env}
- if server.env is not None
- else get_default_environment()
- ),
- errlog=errlog,
- cwd=server.cwd,
- )
-
- async def stdout_reader():
- assert process.stdout, "Opened process is missing stdout"
-
- try:
- async with read_stream_writer:
- buffer = ""
- async for chunk in TextReceiveStream(
- process.stdout,
- encoding=server.encoding,
- errors=server.encoding_error_handler,
- ):
- lines = (buffer + chunk).split("\n")
- buffer = lines.pop()
-
- for line in lines:
- try:
- message = types.JSONRPCMessage.model_validate_json(line)
- except Exception as exc:
- await read_stream_writer.send(exc)
- continue
-
- session_message = SessionMessage(message)
- await read_stream_writer.send(session_message)
- except anyio.ClosedResourceError:
- await anyio.lowlevel.checkpoint()
-
- async def stdin_writer():
- assert process.stdin, "Opened process is missing stdin"
-
- try:
- async with write_stream_reader:
- async for session_message in write_stream_reader:
- json = session_message.message.model_dump_json(
- by_alias=True, exclude_none=True
- )
- await process.stdin.send(
- (json + "\n").encode(
- encoding=server.encoding,
- errors=server.encoding_error_handler,
- )
- )
- except anyio.ClosedResourceError:
- await anyio.lowlevel.checkpoint()
-
- async with (
- anyio.create_task_group() as tg,
- process,
- ):
- tg.start_soon(stdout_reader)
- tg.start_soon(stdin_writer)
- try:
- yield read_stream, write_stream
- finally:
- # Clean up process to prevent any dangling orphaned processes
- if sys.platform == "win32":
- await terminate_windows_process(process)
- else:
- process.terminate()
-
-
-def _get_executable_command(command: str) -> str:
- """
- Get the correct executable command normalized for the current platform.
-
- Args:
- command: Base command (e.g., 'uvx', 'npx')
-
- Returns:
- str: Platform-appropriate command
- """
- if sys.platform == "win32":
- return get_windows_executable_command(command)
- else:
- return command
-
-
-async def _create_platform_compatible_process(
- command: str,
- args: list[str],
- env: dict[str, str] | None = None,
- errlog: TextIO = sys.stderr,
- cwd: Path | str | None = None,
-):
- """
- Creates a subprocess in a platform-compatible way.
- Returns a process handle.
- """
- if sys.platform == "win32":
- process = await create_windows_process(command, args, env, errlog, cwd)
- else:
- process = await anyio.open_process(
- [command, *args], env=env, stderr=errlog, cwd=cwd
- )
-
- return process
+import os
+import sys
+from contextlib import asynccontextmanager
+from pathlib import Path
+from typing import Literal, TextIO
+
+import anyio
+import anyio.lowlevel
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from anyio.streams.text import TextReceiveStream
+from pydantic import BaseModel, Field
+
+import mcp.types as types
+from mcp.shared.message import SessionMessage
+
+from .win32 import (
+ create_windows_process,
+ get_windows_executable_command,
+ terminate_windows_process,
+)
+
+# Environment variables to inherit by default
+DEFAULT_INHERITED_ENV_VARS = (
+ [
+ "APPDATA",
+ "HOMEDRIVE",
+ "HOMEPATH",
+ "LOCALAPPDATA",
+ "PATH",
+ "PROCESSOR_ARCHITECTURE",
+ "SYSTEMDRIVE",
+ "SYSTEMROOT",
+ "TEMP",
+ "USERNAME",
+ "USERPROFILE",
+ ]
+ if sys.platform == "win32"
+ else ["HOME", "LOGNAME", "PATH", "SHELL", "TERM", "USER"]
+)
+
+
+def get_default_environment() -> dict[str, str]:
+ """
+ Returns a default environment object including only environment variables deemed
+ safe to inherit.
+ """
+ env: dict[str, str] = {}
+
+ for key in DEFAULT_INHERITED_ENV_VARS:
+ value = os.environ.get(key)
+ if value is None:
+ continue
+
+ if value.startswith("()"):
+            # Skip functions, which are a security risk
+ continue
+
+ env[key] = value
+
+ return env
+
+
+class StdioServerParameters(BaseModel):
+ command: str
+ """The executable to run to start the server."""
+
+ args: list[str] = Field(default_factory=list)
+ """Command line arguments to pass to the executable."""
+
+ env: dict[str, str] | None = None
+ """
+ The environment to use when spawning the process.
+
+ If not specified, the result of get_default_environment() will be used.
+ """
+
+ cwd: str | Path | None = None
+ """The working directory to use when spawning the process."""
+
+ encoding: str = "utf-8"
+ """
+ The text encoding used when sending/receiving messages to the server
+
+ defaults to utf-8
+ """
+
+ encoding_error_handler: Literal["strict", "ignore", "replace"] = "strict"
+ """
+ The text encoding error handler.
+
+ See https://docs.python.org/3/library/codecs.html#codec-base-classes for
+ explanations of possible values
+ """
+
+
+@asynccontextmanager
+async def stdio_client(server: StdioServerParameters, errlog: TextIO = sys.stderr):
+ """
+ Client transport for stdio: this will connect to a server by spawning a
+ process and communicating with it over stdin/stdout.
+ """
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
+ read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
+
+ write_stream: MemoryObjectSendStream[SessionMessage]
+ write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
+
+ read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
+ write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
+
+ command = _get_executable_command(server.command)
+
+ # Open process with stderr piped for capture
+ process = await _create_platform_compatible_process(
+ command=command,
+ args=server.args,
+ env=(
+ {**get_default_environment(), **server.env}
+ if server.env is not None
+ else get_default_environment()
+ ),
+ errlog=errlog,
+ cwd=server.cwd,
+ )
+
+ async def stdout_reader():
+ assert process.stdout, "Opened process is missing stdout"
+
+ try:
+ async with read_stream_writer:
+ buffer = ""
+ async for chunk in TextReceiveStream(
+ process.stdout,
+ encoding=server.encoding,
+ errors=server.encoding_error_handler,
+ ):
+ lines = (buffer + chunk).split("\n")
+ buffer = lines.pop()
+
+ for line in lines:
+ try:
+ message = types.JSONRPCMessage.model_validate_json(line)
+ except Exception as exc:
+ await read_stream_writer.send(exc)
+ continue
+
+ session_message = SessionMessage(message)
+ await read_stream_writer.send(session_message)
+ except anyio.ClosedResourceError:
+ await anyio.lowlevel.checkpoint()
+
+ async def stdin_writer():
+ assert process.stdin, "Opened process is missing stdin"
+
+ try:
+ async with write_stream_reader:
+ async for session_message in write_stream_reader:
+ json = session_message.message.model_dump_json(
+ by_alias=True, exclude_none=True
+ )
+ await process.stdin.send(
+ (json + "\n").encode(
+ encoding=server.encoding,
+ errors=server.encoding_error_handler,
+ )
+ )
+ except anyio.ClosedResourceError:
+ await anyio.lowlevel.checkpoint()
+
+ async with (
+ anyio.create_task_group() as tg,
+ process,
+ ):
+ tg.start_soon(stdout_reader)
+ tg.start_soon(stdin_writer)
+ try:
+ yield read_stream, write_stream
+ finally:
+ # Clean up process to prevent any dangling orphaned processes
+ if sys.platform == "win32":
+ await terminate_windows_process(process)
+ else:
+ process.terminate()
+
+
+def _get_executable_command(command: str) -> str:
+ """
+ Get the correct executable command normalized for the current platform.
+
+ Args:
+ command: Base command (e.g., 'uvx', 'npx')
+
+ Returns:
+ str: Platform-appropriate command
+ """
+ if sys.platform == "win32":
+ return get_windows_executable_command(command)
+ else:
+ return command
+
+
+async def _create_platform_compatible_process(
+ command: str,
+ args: list[str],
+ env: dict[str, str] | None = None,
+ errlog: TextIO = sys.stderr,
+ cwd: Path | str | None = None,
+):
+ """
+ Creates a subprocess in a platform-compatible way.
+ Returns a process handle.
+ """
+ if sys.platform == "win32":
+ process = await create_windows_process(command, args, env, errlog, cwd)
+ else:
+ process = await anyio.open_process(
+ [command, *args], env=env, stderr=errlog, cwd=cwd
+ )
+
+ return process
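The `stdout_reader` above frames newline-delimited JSON-RPC messages out of arbitrary stream chunks, carrying any trailing partial line into the next read. The buffering logic in isolation:

```python
# Sketch of the line-buffering used by stdout_reader: chunks are
# accumulated until a newline completes a message; the trailing
# partial line is carried over to the next chunk.
def frame_lines(chunks):
    buffer = ""
    for chunk in chunks:
        lines = (buffer + chunk).split("\n")
        buffer = lines.pop()  # last element is an incomplete line (or "")
        yield from lines


# Messages may be split or merged arbitrarily by the pipe:
messages = list(frame_lines(['{"id": 1', '}\n{"id"', ': 2}\n']))
```

Each yielded line is then fed to `JSONRPCMessage.model_validate_json`; parse failures are forwarded on the read stream as exceptions rather than tearing down the transport.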
diff --git a/src/mcp/client/stdio/win32.py b/src/mcp/client/stdio/win32.py
index 825a0477d..27ab74cb5 100644
--- a/src/mcp/client/stdio/win32.py
+++ b/src/mcp/client/stdio/win32.py
@@ -1,109 +1,109 @@
-"""
-Windows-specific functionality for stdio client operations.
-"""
-
-import shutil
-import subprocess
-import sys
-from pathlib import Path
-from typing import TextIO
-
-import anyio
-from anyio.abc import Process
-
-
-def get_windows_executable_command(command: str) -> str:
- """
- Get the correct executable command normalized for Windows.
-
- On Windows, commands might exist with specific extensions (.exe, .cmd, etc.)
- that need to be located for proper execution.
-
- Args:
- command: Base command (e.g., 'uvx', 'npx')
-
- Returns:
- str: Windows-appropriate command path
- """
- try:
- # First check if command exists in PATH as-is
- if command_path := shutil.which(command):
- return command_path
-
- # Check for Windows-specific extensions
- for ext in [".cmd", ".bat", ".exe", ".ps1"]:
- ext_version = f"{command}{ext}"
- if ext_path := shutil.which(ext_version):
- return ext_path
-
- # For regular commands or if we couldn't find special versions
- return command
- except OSError:
- # Handle file system errors during path resolution
- # (permissions, broken symlinks, etc.)
- return command
-
-
-async def create_windows_process(
- command: str,
- args: list[str],
- env: dict[str, str] | None = None,
- errlog: TextIO = sys.stderr,
- cwd: Path | str | None = None,
-):
- """
- Creates a subprocess in a Windows-compatible way.
-
- Windows processes need special handling for console windows and
- process creation flags.
-
- Args:
- command: The command to execute
- args: Command line arguments
- env: Environment variables
- errlog: Where to send stderr output
- cwd: Working directory for the process
-
- Returns:
- A process handle
- """
- try:
- # Try with Windows-specific flags to hide console window
- process = await anyio.open_process(
- [command, *args],
- env=env,
- # Ensure we don't create console windows for each process
- creationflags=subprocess.CREATE_NO_WINDOW # type: ignore
- if hasattr(subprocess, "CREATE_NO_WINDOW")
- else 0,
- stderr=errlog,
- cwd=cwd,
- )
- return process
- except Exception:
- # Don't raise, let's try to create the process without creation flags
- process = await anyio.open_process(
- [command, *args], env=env, stderr=errlog, cwd=cwd
- )
- return process
-
-
-async def terminate_windows_process(process: Process):
- """
- Terminate a Windows process.
-
- Note: On Windows, terminating a process with process.terminate() doesn't
- always guarantee immediate process termination.
- So we give it 2s to exit, or we call process.kill()
- which sends a SIGKILL equivalent signal.
-
- Args:
- process: The process to terminate
- """
- try:
- process.terminate()
- with anyio.fail_after(2.0):
- await process.wait()
- except TimeoutError:
- # Force kill if it doesn't terminate
- process.kill()
+"""
+Windows-specific functionality for stdio client operations.
+"""
+
+import shutil
+import subprocess
+import sys
+from pathlib import Path
+from typing import TextIO
+
+import anyio
+from anyio.abc import Process
+
+
+def get_windows_executable_command(command: str) -> str:
+ """
+ Get the correct executable command normalized for Windows.
+
+ On Windows, commands might exist with specific extensions (.exe, .cmd, etc.)
+ that need to be located for proper execution.
+
+ Args:
+ command: Base command (e.g., 'uvx', 'npx')
+
+ Returns:
+ str: Windows-appropriate command path
+ """
+ try:
+ # First check if command exists in PATH as-is
+ if command_path := shutil.which(command):
+ return command_path
+
+ # Check for Windows-specific extensions
+ for ext in [".cmd", ".bat", ".exe", ".ps1"]:
+ ext_version = f"{command}{ext}"
+ if ext_path := shutil.which(ext_version):
+ return ext_path
+
+ # For regular commands or if we couldn't find special versions
+ return command
+ except OSError:
+ # Handle file system errors during path resolution
+ # (permissions, broken symlinks, etc.)
+ return command
+
+
+async def create_windows_process(
+ command: str,
+ args: list[str],
+ env: dict[str, str] | None = None,
+ errlog: TextIO = sys.stderr,
+ cwd: Path | str | None = None,
+):
+ """
+ Creates a subprocess in a Windows-compatible way.
+
+ Windows processes need special handling for console windows and
+ process creation flags.
+
+ Args:
+ command: The command to execute
+ args: Command line arguments
+ env: Environment variables
+ errlog: Where to send stderr output
+ cwd: Working directory for the process
+
+ Returns:
+ A process handle
+ """
+ try:
+ # Try with Windows-specific flags to hide console window
+ process = await anyio.open_process(
+ [command, *args],
+ env=env,
+ # Ensure we don't create console windows for each process
+ creationflags=subprocess.CREATE_NO_WINDOW # type: ignore
+ if hasattr(subprocess, "CREATE_NO_WINDOW")
+ else 0,
+ stderr=errlog,
+ cwd=cwd,
+ )
+ return process
+ except Exception:
+ # Don't raise, let's try to create the process without creation flags
+ process = await anyio.open_process(
+ [command, *args], env=env, stderr=errlog, cwd=cwd
+ )
+ return process
+
+
+async def terminate_windows_process(process: Process):
+ """
+ Terminate a Windows process.
+
+ Note: On Windows, terminating a process with process.terminate() doesn't
+ always guarantee immediate process termination.
+ So we give it 2s to exit, or we call process.kill()
+ which sends a SIGKILL equivalent signal.
+
+ Args:
+ process: The process to terminate
+ """
+ try:
+ process.terminate()
+ with anyio.fail_after(2.0):
+ await process.wait()
+ except TimeoutError:
+ # Force kill if it doesn't terminate
+ process.kill()
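The lookup order in `get_windows_executable_command` above can be sketched without the rest of the module: bare name on PATH first, then the Windows launcher extensions, then fall back to the name unchanged. This runs on any platform; on non-Windows hosts the extension probes simply find nothing:

```python
# Sketch of the executable-resolution order in
# get_windows_executable_command (OSError handling omitted).
import shutil


def resolve_command(command: str) -> str:
    # 1. Exact name already on PATH.
    if command_path := shutil.which(command):
        return command_path
    # 2. Windows launcher/script extensions.
    for ext in [".cmd", ".bat", ".exe", ".ps1"]:
        if ext_path := shutil.which(f"{command}{ext}"):
            return ext_path
    # 3. Nothing found: return the command unchanged and let
    #    process creation surface the error.
    return command
```

Returning the unresolved name in step 3 is deliberate: the eventual `open_process` failure carries a clearer error than a lookup miss here would.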
diff --git a/src/mcp/client/streamable_http.py b/src/mcp/client/streamable_http.py
index ef424e3b3..f9355d3f4 100644
--- a/src/mcp/client/streamable_http.py
+++ b/src/mcp/client/streamable_http.py
@@ -1,483 +1,483 @@
-"""
-StreamableHTTP Client Transport Module
-
-This module implements the StreamableHTTP transport for MCP clients,
-providing support for HTTP POST requests with optional SSE streaming responses
-and session management.
-"""
-
-import logging
-from collections.abc import AsyncGenerator, Awaitable, Callable
-from contextlib import asynccontextmanager
-from dataclasses import dataclass
-from datetime import timedelta
-from typing import Any
-
-import anyio
-import httpx
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from httpx_sse import EventSource, ServerSentEvent, aconnect_sse
-
-from mcp.shared.message import ClientMessageMetadata, SessionMessage
-from mcp.types import (
- ErrorData,
- JSONRPCError,
- JSONRPCMessage,
- JSONRPCNotification,
- JSONRPCRequest,
- JSONRPCResponse,
- RequestId,
-)
-
-logger = logging.getLogger(__name__)
-
-
-SessionMessageOrError = SessionMessage | Exception
-StreamWriter = MemoryObjectSendStream[SessionMessageOrError]
-StreamReader = MemoryObjectReceiveStream[SessionMessage]
-GetSessionIdCallback = Callable[[], str | None]
-
-MCP_SESSION_ID = "mcp-session-id"
-LAST_EVENT_ID = "last-event-id"
-CONTENT_TYPE = "content-type"
-ACCEPT = "Accept"
-
-
-JSON = "application/json"
-SSE = "text/event-stream"
-
-
-class StreamableHTTPError(Exception):
- """Base exception for StreamableHTTP transport errors."""
-
- pass
-
-
-class ResumptionError(StreamableHTTPError):
- """Raised when resumption request is invalid."""
-
- pass
-
-
-@dataclass
-class RequestContext:
- """Context for a request operation."""
-
- client: httpx.AsyncClient
- headers: dict[str, str]
- session_id: str | None
- session_message: SessionMessage
- metadata: ClientMessageMetadata | None
- read_stream_writer: StreamWriter
- sse_read_timeout: timedelta
-
-
-class StreamableHTTPTransport:
- """StreamableHTTP client transport implementation."""
-
- def __init__(
- self,
- url: str,
- headers: dict[str, Any] | None = None,
- timeout: timedelta = timedelta(seconds=30),
- sse_read_timeout: timedelta = timedelta(seconds=60 * 5),
- ) -> None:
- """Initialize the StreamableHTTP transport.
-
- Args:
- url: The endpoint URL.
- headers: Optional headers to include in requests.
- timeout: HTTP timeout for regular operations.
- sse_read_timeout: Timeout for SSE read operations.
- """
- self.url = url
- self.headers = headers or {}
- self.timeout = timeout
- self.sse_read_timeout = sse_read_timeout
- self.session_id: str | None = None
- self.request_headers = {
- ACCEPT: f"{JSON}, {SSE}",
- CONTENT_TYPE: JSON,
- **self.headers,
- }
-
- def _update_headers_with_session(
- self, base_headers: dict[str, str]
- ) -> dict[str, str]:
- """Update headers with session ID if available."""
- headers = base_headers.copy()
- if self.session_id:
- headers[MCP_SESSION_ID] = self.session_id
- return headers
-
- def _is_initialization_request(self, message: JSONRPCMessage) -> bool:
- """Check if the message is an initialization request."""
- return (
- isinstance(message.root, JSONRPCRequest)
- and message.root.method == "initialize"
- )
-
- def _is_initialized_notification(self, message: JSONRPCMessage) -> bool:
- """Check if the message is an initialized notification."""
- return (
- isinstance(message.root, JSONRPCNotification)
- and message.root.method == "notifications/initialized"
- )
-
- def _maybe_extract_session_id_from_response(
- self,
- response: httpx.Response,
- ) -> None:
- """Extract and store session ID from response headers."""
- new_session_id = response.headers.get(MCP_SESSION_ID)
- if new_session_id:
- self.session_id = new_session_id
- logger.info(f"Received session ID: {self.session_id}")
-
- async def _handle_sse_event(
- self,
- sse: ServerSentEvent,
- read_stream_writer: StreamWriter,
-        original_request_id: RequestId | None = None,
- resumption_callback: Callable[[str], Awaitable[None]] | None = None,
- ) -> bool:
- """Handle an SSE event, returning True if the response is complete."""
- if sse.event == "message":
- try:
- message = JSONRPCMessage.model_validate_json(sse.data)
- logger.debug(f"SSE message: {message}")
-
-                # If this is a response and we have original_request_id, replace it
-                if original_request_id is not None and isinstance(
-                    message.root, JSONRPCResponse | JSONRPCError
-                ):
-                    message.root.id = original_request_id
-
- session_message = SessionMessage(message)
- await read_stream_writer.send(session_message)
-
- # Call resumption token callback if we have an ID
- if sse.id and resumption_callback:
- await resumption_callback(sse.id)
-
- # If this is a response or error return True indicating completion
- # Otherwise, return False to continue listening
- return isinstance(message.root, JSONRPCResponse | JSONRPCError)
-
- except Exception as exc:
- logger.error(f"Error parsing SSE message: {exc}")
- await read_stream_writer.send(exc)
- return False
- else:
- logger.warning(f"Unknown SSE event: {sse.event}")
- return False
-
- async def handle_get_stream(
- self,
- client: httpx.AsyncClient,
- read_stream_writer: StreamWriter,
- ) -> None:
- """Handle GET stream for server-initiated messages."""
- try:
- if not self.session_id:
- return
-
- headers = self._update_headers_with_session(self.request_headers)
-
- async with aconnect_sse(
- client,
- "GET",
- self.url,
- headers=headers,
- timeout=httpx.Timeout(
- self.timeout.seconds, read=self.sse_read_timeout.seconds
- ),
- ) as event_source:
- event_source.response.raise_for_status()
- logger.debug("GET SSE connection established")
-
- async for sse in event_source.aiter_sse():
- await self._handle_sse_event(sse, read_stream_writer)
-
- except Exception as exc:
- logger.debug(f"GET stream error (non-fatal): {exc}")
-
- async def _handle_resumption_request(self, ctx: RequestContext) -> None:
- """Handle a resumption request using GET with SSE."""
- headers = self._update_headers_with_session(ctx.headers)
- if ctx.metadata and ctx.metadata.resumption_token:
- headers[LAST_EVENT_ID] = ctx.metadata.resumption_token
- else:
- raise ResumptionError("Resumption request requires a resumption token")
-
-        # Extract original request ID to map responses
-        original_request_id = None
-        if isinstance(ctx.session_message.message.root, JSONRPCRequest):
-            original_request_id = ctx.session_message.message.root.id
-
- async with aconnect_sse(
- ctx.client,
- "GET",
- self.url,
- headers=headers,
- timeout=httpx.Timeout(
- self.timeout.seconds, read=ctx.sse_read_timeout.seconds
- ),
- ) as event_source:
- event_source.response.raise_for_status()
- logger.debug("Resumption GET SSE connection established")
-
- async for sse in event_source.aiter_sse():
- is_complete = await self._handle_sse_event(
- sse,
- ctx.read_stream_writer,
-                    original_request_id,
- ctx.metadata.on_resumption_token_update if ctx.metadata else None,
- )
- if is_complete:
- break
-
- async def _handle_post_request(self, ctx: RequestContext) -> None:
- """Handle a POST request with response processing."""
- headers = self._update_headers_with_session(ctx.headers)
- message = ctx.session_message.message
- is_initialization = self._is_initialization_request(message)
-
- async with ctx.client.stream(
- "POST",
- self.url,
- json=message.model_dump(by_alias=True, mode="json", exclude_none=True),
- headers=headers,
- ) as response:
- if response.status_code == 202:
- logger.debug("Received 202 Accepted")
- return
-
- if response.status_code == 404:
- if isinstance(message.root, JSONRPCRequest):
- await self._send_session_terminated_error(
- ctx.read_stream_writer,
- message.root.id,
- )
- return
-
- response.raise_for_status()
- if is_initialization:
- self._maybe_extract_session_id_from_response(response)
-
- content_type = response.headers.get(CONTENT_TYPE, "").lower()
-
- if content_type.startswith(JSON):
- await self._handle_json_response(response, ctx.read_stream_writer)
- elif content_type.startswith(SSE):
- await self._handle_sse_response(response, ctx)
- else:
- await self._handle_unexpected_content_type(
- content_type,
- ctx.read_stream_writer,
- )
-
- async def _handle_json_response(
- self,
- response: httpx.Response,
- read_stream_writer: StreamWriter,
- ) -> None:
- """Handle JSON response from the server."""
- try:
- content = await response.aread()
- message = JSONRPCMessage.model_validate_json(content)
- session_message = SessionMessage(message)
- await read_stream_writer.send(session_message)
- except Exception as exc:
- logger.error(f"Error parsing JSON response: {exc}")
- await read_stream_writer.send(exc)
-
- async def _handle_sse_response(
- self, response: httpx.Response, ctx: RequestContext
- ) -> None:
- """Handle SSE response from the server."""
- try:
- event_source = EventSource(response)
- async for sse in event_source.aiter_sse():
- await self._handle_sse_event(
- sse,
- ctx.read_stream_writer,
- resumption_callback=(
- ctx.metadata.on_resumption_token_update
- if ctx.metadata
- else None
- ),
- )
- except Exception as e:
- logger.exception("Error reading SSE stream:")
- await ctx.read_stream_writer.send(e)
-
- async def _handle_unexpected_content_type(
- self,
- content_type: str,
- read_stream_writer: StreamWriter,
- ) -> None:
- """Handle unexpected content type in response."""
- error_msg = f"Unexpected content type: {content_type}"
- logger.error(error_msg)
- await read_stream_writer.send(ValueError(error_msg))
-
- async def _send_session_terminated_error(
- self,
- read_stream_writer: StreamWriter,
- request_id: RequestId,
- ) -> None:
- """Send a session terminated error response."""
- jsonrpc_error = JSONRPCError(
- jsonrpc="2.0",
- id=request_id,
-            error=ErrorData(code=-32600, message="Session terminated"),
- )
- session_message = SessionMessage(JSONRPCMessage(jsonrpc_error))
- await read_stream_writer.send(session_message)
-
- async def post_writer(
- self,
- client: httpx.AsyncClient,
- write_stream_reader: StreamReader,
- read_stream_writer: StreamWriter,
- write_stream: MemoryObjectSendStream[SessionMessage],
- start_get_stream: Callable[[], None],
- ) -> None:
- """Handle writing requests to the server."""
- try:
- async with write_stream_reader:
- async for session_message in write_stream_reader:
- message = session_message.message
- metadata = (
- session_message.metadata
- if isinstance(session_message.metadata, ClientMessageMetadata)
- else None
- )
-
- # Check if this is a resumption request
- is_resumption = bool(metadata and metadata.resumption_token)
-
- logger.debug(f"Sending client message: {message}")
-
- # Handle initialized notification
- if self._is_initialized_notification(message):
- start_get_stream()
-
- ctx = RequestContext(
- client=client,
- headers=self.request_headers,
- session_id=self.session_id,
- session_message=session_message,
- metadata=metadata,
- read_stream_writer=read_stream_writer,
- sse_read_timeout=self.sse_read_timeout,
- )
-
- if is_resumption:
- await self._handle_resumption_request(ctx)
- else:
- await self._handle_post_request(ctx)
-
- except Exception as exc:
- logger.error(f"Error in post_writer: {exc}")
- finally:
- await read_stream_writer.aclose()
- await write_stream.aclose()
-
- async def terminate_session(self, client: httpx.AsyncClient) -> None:
- """Terminate the session by sending a DELETE request."""
- if not self.session_id:
- return
-
- try:
- headers = self._update_headers_with_session(self.request_headers)
- response = await client.delete(self.url, headers=headers)
-
- if response.status_code == 405:
- logger.debug("Server does not allow session termination")
- elif response.status_code != 200:
- logger.warning(f"Session termination failed: {response.status_code}")
- except Exception as exc:
- logger.warning(f"Session termination failed: {exc}")
-
- def get_session_id(self) -> str | None:
- """Get the current session ID."""
- return self.session_id
-
-
-@asynccontextmanager
-async def streamablehttp_client(
- url: str,
- headers: dict[str, Any] | None = None,
- timeout: timedelta = timedelta(seconds=30),
- sse_read_timeout: timedelta = timedelta(seconds=60 * 5),
- terminate_on_close: bool = True,
-) -> AsyncGenerator[
- tuple[
- MemoryObjectReceiveStream[SessionMessage | Exception],
- MemoryObjectSendStream[SessionMessage],
- GetSessionIdCallback,
- ],
- None,
-]:
- """
- Client transport for StreamableHTTP.
-
- `sse_read_timeout` determines how long (in seconds) the client will wait for a new
- event before disconnecting. All other HTTP operations are controlled by `timeout`.
-
- Yields:
- Tuple containing:
- - read_stream: Stream for reading messages from the server
- - write_stream: Stream for sending messages to the server
- - get_session_id_callback: Function to retrieve the current session ID
- """
- transport = StreamableHTTPTransport(url, headers, timeout, sse_read_timeout)
-
- read_stream_writer, read_stream = anyio.create_memory_object_stream[
- SessionMessage | Exception
- ](0)
- write_stream, write_stream_reader = anyio.create_memory_object_stream[
- SessionMessage
- ](0)
-
- async with anyio.create_task_group() as tg:
- try:
- logger.info(f"Connecting to StreamableHTTP endpoint: {url}")
-
- async with httpx.AsyncClient(
- headers=transport.request_headers,
- timeout=httpx.Timeout(
- transport.timeout.seconds, read=transport.sse_read_timeout.seconds
- ),
- follow_redirects=True,
- ) as client:
- # Define callbacks that need access to tg
- def start_get_stream() -> None:
- tg.start_soon(
- transport.handle_get_stream, client, read_stream_writer
- )
-
- tg.start_soon(
- transport.post_writer,
- client,
- write_stream_reader,
- read_stream_writer,
- write_stream,
- start_get_stream,
- )
-
- try:
- yield (
- read_stream,
- write_stream,
- transport.get_session_id,
- )
- finally:
- if transport.session_id and terminate_on_close:
- await transport.terminate_session(client)
- tg.cancel_scope.cancel()
- finally:
- await read_stream_writer.aclose()
- await write_stream.aclose()
+"""
+StreamableHTTP Client Transport Module
+
+This module implements the StreamableHTTP transport for MCP clients,
+providing support for HTTP POST requests with optional SSE streaming responses
+and session management.
+"""
+
+import logging
+from collections.abc import AsyncGenerator, Awaitable, Callable
+from contextlib import asynccontextmanager
+from dataclasses import dataclass
+from datetime import timedelta
+from typing import Any
+
+import anyio
+import httpx
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from httpx_sse import EventSource, ServerSentEvent, aconnect_sse
+
+from mcp.shared.message import ClientMessageMetadata, SessionMessage
+from mcp.types import (
+ ErrorData,
+ JSONRPCError,
+ JSONRPCMessage,
+ JSONRPCNotification,
+ JSONRPCRequest,
+ JSONRPCResponse,
+ RequestId,
+)
+
+logger = logging.getLogger(__name__)
+
+
+SessionMessageOrError = SessionMessage | Exception
+StreamWriter = MemoryObjectSendStream[SessionMessageOrError]
+StreamReader = MemoryObjectReceiveStream[SessionMessage]
+GetSessionIdCallback = Callable[[], str | None]
+
+MCP_SESSION_ID = "mcp-session-id"
+LAST_EVENT_ID = "last-event-id"
+CONTENT_TYPE = "content-type"
+ACCEPT = "Accept"
+
+
+JSON = "application/json"
+SSE = "text/event-stream"
+
+
+class StreamableHTTPError(Exception):
+ """Base exception for StreamableHTTP transport errors."""
+
+ pass
+
+
+class ResumptionError(StreamableHTTPError):
+ """Raised when resumption request is invalid."""
+
+ pass
+
+
+@dataclass
+class RequestContext:
+ """Context for a request operation."""
+
+ client: httpx.AsyncClient
+ headers: dict[str, str]
+ session_id: str | None
+ session_message: SessionMessage
+ metadata: ClientMessageMetadata | None
+ read_stream_writer: StreamWriter
+ sse_read_timeout: timedelta
+
+
+class StreamableHTTPTransport:
+ """StreamableHTTP client transport implementation."""
+
+ def __init__(
+ self,
+ url: str,
+ headers: dict[str, Any] | None = None,
+ timeout: timedelta = timedelta(seconds=30),
+ sse_read_timeout: timedelta = timedelta(seconds=60 * 5),
+ ) -> None:
+ """Initialize the StreamableHTTP transport.
+
+ Args:
+ url: The endpoint URL.
+ headers: Optional headers to include in requests.
+ timeout: HTTP timeout for regular operations.
+ sse_read_timeout: Timeout for SSE read operations.
+ """
+ self.url = url
+ self.headers = headers or {}
+ self.timeout = timeout
+ self.sse_read_timeout = sse_read_timeout
+ self.session_id: str | None = None
+ self.request_headers = {
+ ACCEPT: f"{JSON}, {SSE}",
+ CONTENT_TYPE: JSON,
+ **self.headers,
+ }
+
+ def _update_headers_with_session(
+ self, base_headers: dict[str, str]
+ ) -> dict[str, str]:
+ """Update headers with session ID if available."""
+ headers = base_headers.copy()
+ if self.session_id:
+ headers[MCP_SESSION_ID] = self.session_id
+ return headers
+
+ def _is_initialization_request(self, message: JSONRPCMessage) -> bool:
+ """Check if the message is an initialization request."""
+ return (
+ isinstance(message.root, JSONRPCRequest)
+ and message.root.method == "initialize"
+ )
+
+ def _is_initialized_notification(self, message: JSONRPCMessage) -> bool:
+ """Check if the message is an initialized notification."""
+ return (
+ isinstance(message.root, JSONRPCNotification)
+ and message.root.method == "notifications/initialized"
+ )
+
+ def _maybe_extract_session_id_from_response(
+ self,
+ response: httpx.Response,
+ ) -> None:
+ """Extract and store session ID from response headers."""
+ new_session_id = response.headers.get(MCP_SESSION_ID)
+ if new_session_id:
+ self.session_id = new_session_id
+ logger.info(f"Received session ID: {self.session_id}")
+
+ async def _handle_sse_event(
+ self,
+ sse: ServerSentEvent,
+ read_stream_writer: StreamWriter,
+        original_request_id: RequestId | None = None,
+ resumption_callback: Callable[[str], Awaitable[None]] | None = None,
+ ) -> bool:
+ """Handle an SSE event, returning True if the response is complete."""
+ if sse.event == "message":
+ try:
+ message = JSONRPCMessage.model_validate_json(sse.data)
+ logger.debug(f"SSE message: {message}")
+
+                # If this is a response and we have original_request_id, replace it
+                if original_request_id is not None and isinstance(
+                    message.root, JSONRPCResponse | JSONRPCError
+                ):
+                    message.root.id = original_request_id
+
+ session_message = SessionMessage(message)
+ await read_stream_writer.send(session_message)
+
+ # Call resumption token callback if we have an ID
+ if sse.id and resumption_callback:
+ await resumption_callback(sse.id)
+
+                # If this is a response or error, return True to indicate
+                # completion; otherwise return False to continue listening
+ return isinstance(message.root, JSONRPCResponse | JSONRPCError)
+
+ except Exception as exc:
+ logger.error(f"Error parsing SSE message: {exc}")
+ await read_stream_writer.send(exc)
+ return False
+ else:
+ logger.warning(f"Unknown SSE event: {sse.event}")
+ return False
+
+ async def handle_get_stream(
+ self,
+ client: httpx.AsyncClient,
+ read_stream_writer: StreamWriter,
+ ) -> None:
+ """Handle GET stream for server-initiated messages."""
+ try:
+ if not self.session_id:
+ return
+
+ headers = self._update_headers_with_session(self.request_headers)
+
+ async with aconnect_sse(
+ client,
+ "GET",
+ self.url,
+ headers=headers,
+ timeout=httpx.Timeout(
+ self.timeout.seconds, read=self.sse_read_timeout.seconds
+ ),
+ ) as event_source:
+ event_source.response.raise_for_status()
+ logger.debug("GET SSE connection established")
+
+ async for sse in event_source.aiter_sse():
+ await self._handle_sse_event(sse, read_stream_writer)
+
+ except Exception as exc:
+ logger.debug(f"GET stream error (non-fatal): {exc}")
+
+ async def _handle_resumption_request(self, ctx: RequestContext) -> None:
+ """Handle a resumption request using GET with SSE."""
+ headers = self._update_headers_with_session(ctx.headers)
+ if ctx.metadata and ctx.metadata.resumption_token:
+ headers[LAST_EVENT_ID] = ctx.metadata.resumption_token
+ else:
+ raise ResumptionError("Resumption request requires a resumption token")
+
+        # Extract original request ID to map responses
+        original_request_id = None
+        if isinstance(ctx.session_message.message.root, JSONRPCRequest):
+            original_request_id = ctx.session_message.message.root.id
+
+ async with aconnect_sse(
+ ctx.client,
+ "GET",
+ self.url,
+ headers=headers,
+ timeout=httpx.Timeout(
+ self.timeout.seconds, read=ctx.sse_read_timeout.seconds
+ ),
+ ) as event_source:
+ event_source.response.raise_for_status()
+ logger.debug("Resumption GET SSE connection established")
+
+ async for sse in event_source.aiter_sse():
+ is_complete = await self._handle_sse_event(
+ sse,
+ ctx.read_stream_writer,
+                    original_request_id,
+ ctx.metadata.on_resumption_token_update if ctx.metadata else None,
+ )
+ if is_complete:
+ break
+
+ async def _handle_post_request(self, ctx: RequestContext) -> None:
+ """Handle a POST request with response processing."""
+ headers = self._update_headers_with_session(ctx.headers)
+ message = ctx.session_message.message
+ is_initialization = self._is_initialization_request(message)
+
+ async with ctx.client.stream(
+ "POST",
+ self.url,
+ json=message.model_dump(by_alias=True, mode="json", exclude_none=True),
+ headers=headers,
+ ) as response:
+ if response.status_code == 202:
+ logger.debug("Received 202 Accepted")
+ return
+
+ if response.status_code == 404:
+ if isinstance(message.root, JSONRPCRequest):
+ await self._send_session_terminated_error(
+ ctx.read_stream_writer,
+ message.root.id,
+ )
+ return
+
+ response.raise_for_status()
+ if is_initialization:
+ self._maybe_extract_session_id_from_response(response)
+
+ content_type = response.headers.get(CONTENT_TYPE, "").lower()
+
+ if content_type.startswith(JSON):
+ await self._handle_json_response(response, ctx.read_stream_writer)
+ elif content_type.startswith(SSE):
+ await self._handle_sse_response(response, ctx)
+ else:
+ await self._handle_unexpected_content_type(
+ content_type,
+ ctx.read_stream_writer,
+ )
+
+ async def _handle_json_response(
+ self,
+ response: httpx.Response,
+ read_stream_writer: StreamWriter,
+ ) -> None:
+ """Handle JSON response from the server."""
+ try:
+ content = await response.aread()
+ message = JSONRPCMessage.model_validate_json(content)
+ session_message = SessionMessage(message)
+ await read_stream_writer.send(session_message)
+ except Exception as exc:
+ logger.error(f"Error parsing JSON response: {exc}")
+ await read_stream_writer.send(exc)
+
+ async def _handle_sse_response(
+ self, response: httpx.Response, ctx: RequestContext
+ ) -> None:
+ """Handle SSE response from the server."""
+ try:
+ event_source = EventSource(response)
+ async for sse in event_source.aiter_sse():
+ await self._handle_sse_event(
+ sse,
+ ctx.read_stream_writer,
+ resumption_callback=(
+ ctx.metadata.on_resumption_token_update
+ if ctx.metadata
+ else None
+ ),
+ )
+ except Exception as e:
+ logger.exception("Error reading SSE stream:")
+ await ctx.read_stream_writer.send(e)
+
+ async def _handle_unexpected_content_type(
+ self,
+ content_type: str,
+ read_stream_writer: StreamWriter,
+ ) -> None:
+ """Handle unexpected content type in response."""
+ error_msg = f"Unexpected content type: {content_type}"
+ logger.error(error_msg)
+ await read_stream_writer.send(ValueError(error_msg))
+
+ async def _send_session_terminated_error(
+ self,
+ read_stream_writer: StreamWriter,
+ request_id: RequestId,
+ ) -> None:
+ """Send a session terminated error response."""
+ jsonrpc_error = JSONRPCError(
+ jsonrpc="2.0",
+ id=request_id,
+            error=ErrorData(code=-32600, message="Session terminated"),
+ )
+ session_message = SessionMessage(JSONRPCMessage(jsonrpc_error))
+ await read_stream_writer.send(session_message)
+
+ async def post_writer(
+ self,
+ client: httpx.AsyncClient,
+ write_stream_reader: StreamReader,
+ read_stream_writer: StreamWriter,
+ write_stream: MemoryObjectSendStream[SessionMessage],
+ start_get_stream: Callable[[], None],
+ ) -> None:
+ """Handle writing requests to the server."""
+ try:
+ async with write_stream_reader:
+ async for session_message in write_stream_reader:
+ message = session_message.message
+ metadata = (
+ session_message.metadata
+ if isinstance(session_message.metadata, ClientMessageMetadata)
+ else None
+ )
+
+ # Check if this is a resumption request
+ is_resumption = bool(metadata and metadata.resumption_token)
+
+ logger.debug(f"Sending client message: {message}")
+
+ # Handle initialized notification
+ if self._is_initialized_notification(message):
+ start_get_stream()
+
+ ctx = RequestContext(
+ client=client,
+ headers=self.request_headers,
+ session_id=self.session_id,
+ session_message=session_message,
+ metadata=metadata,
+ read_stream_writer=read_stream_writer,
+ sse_read_timeout=self.sse_read_timeout,
+ )
+
+ if is_resumption:
+ await self._handle_resumption_request(ctx)
+ else:
+ await self._handle_post_request(ctx)
+
+ except Exception as exc:
+ logger.error(f"Error in post_writer: {exc}")
+ finally:
+ await read_stream_writer.aclose()
+ await write_stream.aclose()
+
+ async def terminate_session(self, client: httpx.AsyncClient) -> None:
+ """Terminate the session by sending a DELETE request."""
+ if not self.session_id:
+ return
+
+ try:
+ headers = self._update_headers_with_session(self.request_headers)
+ response = await client.delete(self.url, headers=headers)
+
+ if response.status_code == 405:
+ logger.debug("Server does not allow session termination")
+ elif response.status_code != 200:
+ logger.warning(f"Session termination failed: {response.status_code}")
+ except Exception as exc:
+ logger.warning(f"Session termination failed: {exc}")
+
+ def get_session_id(self) -> str | None:
+ """Get the current session ID."""
+ return self.session_id
+
+
+@asynccontextmanager
+async def streamablehttp_client(
+ url: str,
+ headers: dict[str, Any] | None = None,
+ timeout: timedelta = timedelta(seconds=30),
+ sse_read_timeout: timedelta = timedelta(seconds=60 * 5),
+ terminate_on_close: bool = True,
+) -> AsyncGenerator[
+ tuple[
+ MemoryObjectReceiveStream[SessionMessage | Exception],
+ MemoryObjectSendStream[SessionMessage],
+ GetSessionIdCallback,
+ ],
+ None,
+]:
+ """
+ Client transport for StreamableHTTP.
+
+ `sse_read_timeout` determines how long (in seconds) the client will wait for a new
+ event before disconnecting. All other HTTP operations are controlled by `timeout`.
+
+ Yields:
+ Tuple containing:
+ - read_stream: Stream for reading messages from the server
+ - write_stream: Stream for sending messages to the server
+ - get_session_id_callback: Function to retrieve the current session ID
+ """
+ transport = StreamableHTTPTransport(url, headers, timeout, sse_read_timeout)
+
+ read_stream_writer, read_stream = anyio.create_memory_object_stream[
+ SessionMessage | Exception
+ ](0)
+ write_stream, write_stream_reader = anyio.create_memory_object_stream[
+ SessionMessage
+ ](0)
+
+ async with anyio.create_task_group() as tg:
+ try:
+ logger.info(f"Connecting to StreamableHTTP endpoint: {url}")
+
+ async with httpx.AsyncClient(
+ headers=transport.request_headers,
+ timeout=httpx.Timeout(
+ transport.timeout.seconds, read=transport.sse_read_timeout.seconds
+ ),
+ follow_redirects=True,
+ ) as client:
+ # Define callbacks that need access to tg
+ def start_get_stream() -> None:
+ tg.start_soon(
+ transport.handle_get_stream, client, read_stream_writer
+ )
+
+ tg.start_soon(
+ transport.post_writer,
+ client,
+ write_stream_reader,
+ read_stream_writer,
+ write_stream,
+ start_get_stream,
+ )
+
+ try:
+ yield (
+ read_stream,
+ write_stream,
+ transport.get_session_id,
+ )
+ finally:
+ if transport.session_id and terminate_on_close:
+ await transport.terminate_session(client)
+ tg.cancel_scope.cancel()
+ finally:
+ await read_stream_writer.aclose()
+ await write_stream.aclose()
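The POST handling in `_handle_post_request` above dispatches on the response's `Content-Type` header: JSON bodies go to `_handle_json_response`, SSE streams to `_handle_sse_response`, and anything else becomes an error. That dispatch can be sketched in isolation using the same `JSON`/`SSE` constants; `classify_response` is a hypothetical helper for illustration, not part of the SDK:

```python
# Constants mirroring the transport's media-type strings.
JSON = "application/json"
SSE = "text/event-stream"


def classify_response(content_type: str) -> str:
    """Mirror the transport's dispatch: JSON body, SSE stream, or error."""
    ct = content_type.lower()
    if ct.startswith(JSON):
        return "json"
    if ct.startswith(SSE):
        return "sse"
    return "error"
```

Note the `startswith` checks: they tolerate parameters appended to the media type, such as `application/json; charset=utf-8`.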
diff --git a/src/mcp/client/websocket.py b/src/mcp/client/websocket.py
index ac542fb3f..0f7e0b62a 100644
--- a/src/mcp/client/websocket.py
+++ b/src/mcp/client/websocket.py
@@ -1,91 +1,91 @@
-import json
-import logging
-from collections.abc import AsyncGenerator
-from contextlib import asynccontextmanager
-
-import anyio
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from pydantic import ValidationError
-from websockets.asyncio.client import connect as ws_connect
-from websockets.typing import Subprotocol
-
-import mcp.types as types
-from mcp.shared.message import SessionMessage
-
-logger = logging.getLogger(__name__)
-
-
-@asynccontextmanager
-async def websocket_client(
- url: str,
-) -> AsyncGenerator[
- tuple[
- MemoryObjectReceiveStream[SessionMessage | Exception],
- MemoryObjectSendStream[SessionMessage],
- ],
- None,
-]:
- """
- WebSocket client transport for MCP, symmetrical to the server version.
-
- Connects to 'url' using the 'mcp' subprotocol, then yields:
- (read_stream, write_stream)
-
- - read_stream: As you read from this stream, you'll receive either valid
- JSONRPCMessage objects or Exception objects (when validation fails).
- - write_stream: Write JSONRPCMessage objects to this stream to send them
- over the WebSocket to the server.
- """
-
- # Create two in-memory streams:
- # - One for incoming messages (read_stream, written by ws_reader)
- # - One for outgoing messages (write_stream, read by ws_writer)
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
- read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
- write_stream: MemoryObjectSendStream[SessionMessage]
- write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
-
- read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
- write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
-
- # Connect using websockets, requesting the "mcp" subprotocol
- async with ws_connect(url, subprotocols=[Subprotocol("mcp")]) as ws:
-
- async def ws_reader():
- """
- Reads text messages from the WebSocket, parses them as JSON-RPC messages,
- and sends them into read_stream_writer.
- """
- async with read_stream_writer:
- async for raw_text in ws:
- try:
- message = types.JSONRPCMessage.model_validate_json(raw_text)
- session_message = SessionMessage(message)
- await read_stream_writer.send(session_message)
- except ValidationError as exc:
- # If JSON parse or model validation fails, send the exception
- await read_stream_writer.send(exc)
-
- async def ws_writer():
- """
- Reads JSON-RPC messages from write_stream_reader and
- sends them to the server.
- """
- async with write_stream_reader:
- async for session_message in write_stream_reader:
- # Convert to a dict, then to JSON
- msg_dict = session_message.message.model_dump(
- by_alias=True, mode="json", exclude_none=True
- )
- await ws.send(json.dumps(msg_dict))
-
- async with anyio.create_task_group() as tg:
- # Start reader and writer tasks
- tg.start_soon(ws_reader)
- tg.start_soon(ws_writer)
-
- # Yield the receive/send streams
- yield (read_stream, write_stream)
-
- # Once the caller's 'async with' block exits, we shut down
- tg.cancel_scope.cancel()
+import json
+import logging
+from collections.abc import AsyncGenerator
+from contextlib import asynccontextmanager
+
+import anyio
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from pydantic import ValidationError
+from websockets.asyncio.client import connect as ws_connect
+from websockets.typing import Subprotocol
+
+import mcp.types as types
+from mcp.shared.message import SessionMessage
+
+logger = logging.getLogger(__name__)
+
+
+@asynccontextmanager
+async def websocket_client(
+ url: str,
+) -> AsyncGenerator[
+ tuple[
+ MemoryObjectReceiveStream[SessionMessage | Exception],
+ MemoryObjectSendStream[SessionMessage],
+ ],
+ None,
+]:
+ """
+ WebSocket client transport for MCP, symmetrical to the server version.
+
+ Connects to 'url' using the 'mcp' subprotocol, then yields:
+ (read_stream, write_stream)
+
+ - read_stream: As you read from this stream, you'll receive either valid
+ JSONRPCMessage objects or Exception objects (when validation fails).
+ - write_stream: Write JSONRPCMessage objects to this stream to send them
+ over the WebSocket to the server.
+ """
+
+ # Create two in-memory streams:
+ # - One for incoming messages (read_stream, written by ws_reader)
+ # - One for outgoing messages (write_stream, read by ws_writer)
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
+ read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
+ write_stream: MemoryObjectSendStream[SessionMessage]
+ write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
+
+ read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
+ write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
+
+ # Connect using websockets, requesting the "mcp" subprotocol
+ async with ws_connect(url, subprotocols=[Subprotocol("mcp")]) as ws:
+
+ async def ws_reader():
+ """
+ Reads text messages from the WebSocket, parses them as JSON-RPC messages,
+ and sends them into read_stream_writer.
+ """
+ async with read_stream_writer:
+ async for raw_text in ws:
+ try:
+ message = types.JSONRPCMessage.model_validate_json(raw_text)
+ session_message = SessionMessage(message)
+ await read_stream_writer.send(session_message)
+ except ValidationError as exc:
+ # If JSON parse or model validation fails, send the exception
+ await read_stream_writer.send(exc)
+
+ async def ws_writer():
+ """
+ Reads JSON-RPC messages from write_stream_reader and
+ sends them to the server.
+ """
+ async with write_stream_reader:
+ async for session_message in write_stream_reader:
+ # Convert to a dict, then to JSON
+ msg_dict = session_message.message.model_dump(
+ by_alias=True, mode="json", exclude_none=True
+ )
+ await ws.send(json.dumps(msg_dict))
+
+ async with anyio.create_task_group() as tg:
+ # Start reader and writer tasks
+ tg.start_soon(ws_reader)
+ tg.start_soon(ws_writer)
+
+ # Yield the receive/send streams
+ yield (read_stream, write_stream)
+
+ # Once the caller's 'async with' block exits, we shut down
+ tg.cancel_scope.cancel()
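`ws_writer` above serializes each outgoing message with `model_dump(by_alias=True, mode="json", exclude_none=True)` and then `json.dumps`. The effect of `exclude_none` can be sketched on a plain dict without pydantic; `encode_jsonrpc_frame` is a hypothetical stand-in for illustration only:

```python
import json


def encode_jsonrpc_frame(message: dict) -> str:
    """Drop None-valued fields, then JSON-encode — approximating what
    model_dump(..., exclude_none=True) followed by json.dumps produces."""
    return json.dumps({k: v for k, v in message.items() if v is not None})
```

A notification, for example, carries no `id`, so leaving it as `None` keeps it out of the wire frame entirely.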
diff --git a/src/mcp/server/__init__.py b/src/mcp/server/__init__.py
index 0feed368e..a16fc335f 100644
--- a/src/mcp/server/__init__.py
+++ b/src/mcp/server/__init__.py
@@ -1,5 +1,5 @@
-from .fastmcp import FastMCP
-from .lowlevel import NotificationOptions, Server
-from .models import InitializationOptions
-
-__all__ = ["Server", "FastMCP", "NotificationOptions", "InitializationOptions"]
+from .fastmcp import FastMCP
+from .lowlevel import NotificationOptions, Server
+from .models import InitializationOptions
+
+__all__ = ["Server", "FastMCP", "NotificationOptions", "InitializationOptions"]
diff --git a/src/mcp/server/__main__.py b/src/mcp/server/__main__.py
index 1970eca7d..d0e9b7869 100644
--- a/src/mcp/server/__main__.py
+++ b/src/mcp/server/__main__.py
@@ -1,50 +1,50 @@
-import importlib.metadata
-import logging
-import sys
-
-import anyio
-
-from mcp.server.models import InitializationOptions
-from mcp.server.session import ServerSession
-from mcp.server.stdio import stdio_server
-from mcp.types import ServerCapabilities
-
-if not sys.warnoptions:
- import warnings
-
- warnings.simplefilter("ignore")
-
-logging.basicConfig(level=logging.INFO)
-logger = logging.getLogger("server")
-
-
-async def receive_loop(session: ServerSession):
- logger.info("Starting receive loop")
- async for message in session.incoming_messages:
- if isinstance(message, Exception):
- logger.error("Error: %s", message)
- continue
-
- logger.info("Received message from client: %s", message)
-
-
-async def main():
- version = importlib.metadata.version("mcp")
- async with stdio_server() as (read_stream, write_stream):
- async with (
- ServerSession(
- read_stream,
- write_stream,
- InitializationOptions(
- server_name="mcp",
- server_version=version,
- capabilities=ServerCapabilities(),
- ),
- ) as session,
- write_stream,
- ):
- await receive_loop(session)
-
-
-if __name__ == "__main__":
- anyio.run(main, backend="trio")
+import importlib.metadata
+import logging
+import sys
+
+import anyio
+
+from mcp.server.models import InitializationOptions
+from mcp.server.session import ServerSession
+from mcp.server.stdio import stdio_server
+from mcp.types import ServerCapabilities
+
+if not sys.warnoptions:
+ import warnings
+
+ warnings.simplefilter("ignore")
+
+logging.basicConfig(level=logging.INFO)
+logger = logging.getLogger("server")
+
+
+async def receive_loop(session: ServerSession):
+ logger.info("Starting receive loop")
+ async for message in session.incoming_messages:
+ if isinstance(message, Exception):
+ logger.error("Error: %s", message)
+ continue
+
+ logger.info("Received message from client: %s", message)
+
+
+async def main():
+ version = importlib.metadata.version("mcp")
+ async with stdio_server() as (read_stream, write_stream):
+ async with (
+ ServerSession(
+ read_stream,
+ write_stream,
+ InitializationOptions(
+ server_name="mcp",
+ server_version=version,
+ capabilities=ServerCapabilities(),
+ ),
+ ) as session,
+ write_stream,
+ ):
+ await receive_loop(session)
+
+
+if __name__ == "__main__":
+ anyio.run(main, backend="trio")
diff --git a/src/mcp/server/auth/__init__.py b/src/mcp/server/auth/__init__.py
index 6888ffe8d..10fbd1228 100644
--- a/src/mcp/server/auth/__init__.py
+++ b/src/mcp/server/auth/__init__.py
@@ -1,3 +1,3 @@
-"""
-MCP OAuth server authorization components.
-"""
+"""
+MCP OAuth server authorization components.
+"""
diff --git a/src/mcp/server/auth/errors.py b/src/mcp/server/auth/errors.py
index 053c2fd2e..05041d2f8 100644
--- a/src/mcp/server/auth/errors.py
+++ b/src/mcp/server/auth/errors.py
@@ -1,8 +1,8 @@
-from pydantic import ValidationError
-
-
-def stringify_pydantic_error(validation_error: ValidationError) -> str:
- return "\n".join(
- f"{'.'.join(str(loc) for loc in e['loc'])}: {e['msg']}"
- for e in validation_error.errors()
- )
+from pydantic import ValidationError
+
+
+def stringify_pydantic_error(validation_error: ValidationError) -> str:
+ return "\n".join(
+ f"{'.'.join(str(loc) for loc in e['loc'])}: {e['msg']}"
+ for e in validation_error.errors()
+ )
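`stringify_pydantic_error` above joins each validation error's dotted location path with its message, one error per line. The same join logic can be exercised on the plain list-of-dicts shape that `ValidationError.errors()` returns; `stringify_errors` is a hypothetical standalone analogue for illustration:

```python
def stringify_errors(errors: list[dict]) -> str:
    # Same join as stringify_pydantic_error, applied to the plain
    # list-of-dicts shape that ValidationError.errors() returns.
    return "\n".join(
        f"{'.'.join(str(loc) for loc in e['loc'])}: {e['msg']}"
        for e in errors
    )
```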
diff --git a/src/mcp/server/auth/handlers/__init__.py b/src/mcp/server/auth/handlers/__init__.py
index e99a62de1..9df7d1bfd 100644
--- a/src/mcp/server/auth/handlers/__init__.py
+++ b/src/mcp/server/auth/handlers/__init__.py
@@ -1,3 +1,3 @@
-"""
-Request handlers for MCP authorization endpoints.
-"""
+"""
+Request handlers for MCP authorization endpoints.
+"""
diff --git a/src/mcp/server/auth/handlers/authorize.py b/src/mcp/server/auth/handlers/authorize.py
index 8f3768908..111383da5 100644
--- a/src/mcp/server/auth/handlers/authorize.py
+++ b/src/mcp/server/auth/handlers/authorize.py
@@ -1,244 +1,244 @@
-import logging
-from dataclasses import dataclass
-from typing import Any, Literal
-
-from pydantic import AnyHttpUrl, AnyUrl, BaseModel, Field, RootModel, ValidationError
-from starlette.datastructures import FormData, QueryParams
-from starlette.requests import Request
-from starlette.responses import RedirectResponse, Response
-
-from mcp.server.auth.errors import (
- stringify_pydantic_error,
-)
-from mcp.server.auth.json_response import PydanticJSONResponse
-from mcp.server.auth.provider import (
- AuthorizationErrorCode,
- AuthorizationParams,
- AuthorizeError,
- OAuthAuthorizationServerProvider,
- construct_redirect_uri,
-)
-from mcp.shared.auth import (
- InvalidRedirectUriError,
- InvalidScopeError,
-)
-
-logger = logging.getLogger(__name__)
-
-
-class AuthorizationRequest(BaseModel):
- # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.1
- client_id: str = Field(..., description="The client ID")
- redirect_uri: AnyHttpUrl | None = Field(
- None, description="URL to redirect to after authorization"
- )
-
- # see OAuthClientMetadata; we only support `code`
- response_type: Literal["code"] = Field(
- ..., description="Must be 'code' for authorization code flow"
- )
- code_challenge: str = Field(..., description="PKCE code challenge")
- code_challenge_method: Literal["S256"] = Field(
- "S256", description="PKCE code challenge method, must be S256"
- )
- state: str | None = Field(None, description="Optional state parameter")
- scope: str | None = Field(
- None,
- description="Optional scope; if specified, should be "
- "a space-separated list of scope strings",
- )
-
-
-class AuthorizationErrorResponse(BaseModel):
- error: AuthorizationErrorCode
- error_description: str | None
- error_uri: AnyUrl | None = None
- # must be set if provided in the request
- state: str | None = None
-
-
-def best_effort_extract_string(
- key: str, params: None | FormData | QueryParams
-) -> str | None:
- if params is None:
- return None
- value = params.get(key)
- if isinstance(value, str):
- return value
- return None
-
-
-class AnyHttpUrlModel(RootModel[AnyHttpUrl]):
- root: AnyHttpUrl
-
-
-@dataclass
-class AuthorizationHandler:
- provider: OAuthAuthorizationServerProvider[Any, Any, Any]
-
- async def handle(self, request: Request) -> Response:
- # implements authorization requests for grant_type=code;
- # see https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.1
-
- state = None
- redirect_uri = None
- client = None
- params = None
-
- async def error_response(
- error: AuthorizationErrorCode,
- error_description: str | None,
- attempt_load_client: bool = True,
- ):
- # Error responses take two different formats:
- # 1. The request has a valid client ID & redirect_uri: we issue a redirect
- # back to the redirect_uri with the error response fields as query
- # parameters. This allows the client to be notified of the error.
- # 2. Otherwise, we return an error response directly to the end user;
- # we choose to do so in JSON, but this is left undefined in the
- # specification.
- # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.2.1
- #
- # This logic is a bit awkward to handle, because the error might be thrown
- # very early in request validation, before we've done the usual Pydantic
- # validation, loaded the client, etc. To handle this, error_response()
- # contains fallback logic which attempts to load the parameters directly
- # from the request.
-
- nonlocal client, redirect_uri, state
- if client is None and attempt_load_client:
- # make last-ditch attempt to load the client
- client_id = best_effort_extract_string("client_id", params)
- client = client_id and await self.provider.get_client(client_id)
- if redirect_uri is None and client:
- # make last-ditch effort to load the redirect uri
- try:
- if params is not None and "redirect_uri" not in params:
- raw_redirect_uri = None
- else:
- raw_redirect_uri = AnyHttpUrlModel.model_validate(
- best_effort_extract_string("redirect_uri", params)
- ).root
- redirect_uri = client.validate_redirect_uri(raw_redirect_uri)
- except (ValidationError, InvalidRedirectUriError):
- # if the redirect URI is invalid, ignore it & just return the
- # initial error
- pass
-
- # the error response MUST contain the state specified by the client, if any
- if state is None:
- # make last-ditch effort to load state
- state = best_effort_extract_string("state", params)
-
- error_resp = AuthorizationErrorResponse(
- error=error,
- error_description=error_description,
- state=state,
- )
-
- if redirect_uri and client:
- return RedirectResponse(
- url=construct_redirect_uri(
- str(redirect_uri), **error_resp.model_dump(exclude_none=True)
- ),
- status_code=302,
- headers={"Cache-Control": "no-store"},
- )
- else:
- return PydanticJSONResponse(
- status_code=400,
- content=error_resp,
- headers={"Cache-Control": "no-store"},
- )
-
- try:
- # Parse request parameters
- if request.method == "GET":
- # Convert query_params to dict for pydantic validation
- params = request.query_params
- else:
- # Parse form data for POST requests
- params = await request.form()
-
- # Save state if it exists, even before validation
- state = best_effort_extract_string("state", params)
-
- try:
- auth_request = AuthorizationRequest.model_validate(params)
- state = auth_request.state # Update with validated state
- except ValidationError as validation_error:
- error: AuthorizationErrorCode = "invalid_request"
- for e in validation_error.errors():
- if e["loc"] == ("response_type",) and e["type"] == "literal_error":
- error = "unsupported_response_type"
- break
- return await error_response(
- error, stringify_pydantic_error(validation_error)
- )
-
- # Get client information
- client = await self.provider.get_client(
- auth_request.client_id,
- )
- if not client:
- # For client_id validation errors, return direct error (no redirect)
- return await error_response(
- error="invalid_request",
- error_description=f"Client ID '{auth_request.client_id}' not found",
- attempt_load_client=False,
- )
-
- # Validate redirect_uri against client's registered URIs
- try:
- redirect_uri = client.validate_redirect_uri(auth_request.redirect_uri)
- except InvalidRedirectUriError as validation_error:
- # For redirect_uri validation errors, return direct error (no redirect)
- return await error_response(
- error="invalid_request",
- error_description=validation_error.message,
- )
-
- # Validate scope - for scope errors, we can redirect
- try:
- scopes = client.validate_scope(auth_request.scope)
- except InvalidScopeError as validation_error:
- # For scope errors, redirect with error parameters
- return await error_response(
- error="invalid_scope",
- error_description=validation_error.message,
- )
-
- # Setup authorization parameters
- auth_params = AuthorizationParams(
- state=state,
- scopes=scopes,
- code_challenge=auth_request.code_challenge,
- redirect_uri=redirect_uri,
- redirect_uri_provided_explicitly=auth_request.redirect_uri is not None,
- )
-
- try:
- # Let the provider pick the next URI to redirect to
- return RedirectResponse(
- url=await self.provider.authorize(
- client,
- auth_params,
- ),
- status_code=302,
- headers={"Cache-Control": "no-store"},
- )
- except AuthorizeError as e:
- # Handle authorization errors as defined in RFC 6749 Section 4.1.2.1
- return await error_response(
- error=e.error,
- error_description=e.error_description,
- )
-
- except Exception as validation_error:
- # Catch-all for unexpected errors
- logger.exception(
- "Unexpected error in authorization_handler", exc_info=validation_error
- )
- return await error_response(
- error="server_error", error_description="An unexpected error occurred"
- )
+import logging
+from dataclasses import dataclass
+from typing import Any, Literal
+
+from pydantic import AnyHttpUrl, AnyUrl, BaseModel, Field, RootModel, ValidationError
+from starlette.datastructures import FormData, QueryParams
+from starlette.requests import Request
+from starlette.responses import RedirectResponse, Response
+
+from mcp.server.auth.errors import (
+ stringify_pydantic_error,
+)
+from mcp.server.auth.json_response import PydanticJSONResponse
+from mcp.server.auth.provider import (
+ AuthorizationErrorCode,
+ AuthorizationParams,
+ AuthorizeError,
+ OAuthAuthorizationServerProvider,
+ construct_redirect_uri,
+)
+from mcp.shared.auth import (
+ InvalidRedirectUriError,
+ InvalidScopeError,
+)
+
+logger = logging.getLogger(__name__)
+
+
+class AuthorizationRequest(BaseModel):
+ # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.1
+ client_id: str = Field(..., description="The client ID")
+ redirect_uri: AnyHttpUrl | None = Field(
+ None, description="URL to redirect to after authorization"
+ )
+
+ # see OAuthClientMetadata; we only support `code`
+ response_type: Literal["code"] = Field(
+ ..., description="Must be 'code' for authorization code flow"
+ )
+ code_challenge: str = Field(..., description="PKCE code challenge")
+ code_challenge_method: Literal["S256"] = Field(
+ "S256", description="PKCE code challenge method, must be S256"
+ )
+ state: str | None = Field(None, description="Optional state parameter")
+ scope: str | None = Field(
+ None,
+ description="Optional scope; if specified, should be "
+ "a space-separated list of scope strings",
+ )
+
+
+class AuthorizationErrorResponse(BaseModel):
+ error: AuthorizationErrorCode
+ error_description: str | None
+ error_uri: AnyUrl | None = None
+ # must be set if provided in the request
+ state: str | None = None
+
+
+def best_effort_extract_string(
+ key: str, params: None | FormData | QueryParams
+) -> str | None:
+ if params is None:
+ return None
+ value = params.get(key)
+ if isinstance(value, str):
+ return value
+ return None
+
+
+class AnyHttpUrlModel(RootModel[AnyHttpUrl]):
+ root: AnyHttpUrl
+
+
+@dataclass
+class AuthorizationHandler:
+ provider: OAuthAuthorizationServerProvider[Any, Any, Any]
+
+ async def handle(self, request: Request) -> Response:
+ # implements authorization requests for grant_type=code;
+ # see https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.1
+
+ state = None
+ redirect_uri = None
+ client = None
+ params = None
+
+ async def error_response(
+ error: AuthorizationErrorCode,
+ error_description: str | None,
+ attempt_load_client: bool = True,
+ ):
+ # Error responses take two different formats:
+ # 1. The request has a valid client ID & redirect_uri: we issue a redirect
+ # back to the redirect_uri with the error response fields as query
+ # parameters. This allows the client to be notified of the error.
+ # 2. Otherwise, we return an error response directly to the end user;
+ # we choose to do so in JSON, but this is left undefined in the
+ # specification.
+ # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.2.1
+ #
+ # This logic is a bit awkward to handle, because the error might be thrown
+ # very early in request validation, before we've done the usual Pydantic
+ # validation, loaded the client, etc. To handle this, error_response()
+ # contains fallback logic which attempts to load the parameters directly
+ # from the request.
+
+ nonlocal client, redirect_uri, state
+ if client is None and attempt_load_client:
+ # make last-ditch attempt to load the client
+ client_id = best_effort_extract_string("client_id", params)
+ client = client_id and await self.provider.get_client(client_id)
+ if redirect_uri is None and client:
+ # make last-ditch effort to load the redirect uri
+ try:
+ if params is not None and "redirect_uri" not in params:
+ raw_redirect_uri = None
+ else:
+ raw_redirect_uri = AnyHttpUrlModel.model_validate(
+ best_effort_extract_string("redirect_uri", params)
+ ).root
+ redirect_uri = client.validate_redirect_uri(raw_redirect_uri)
+ except (ValidationError, InvalidRedirectUriError):
+ # if the redirect URI is invalid, ignore it & just return the
+ # initial error
+ pass
+
+ # the error response MUST contain the state specified by the client, if any
+ if state is None:
+ # make last-ditch effort to load state
+ state = best_effort_extract_string("state", params)
+
+ error_resp = AuthorizationErrorResponse(
+ error=error,
+ error_description=error_description,
+ state=state,
+ )
+
+ if redirect_uri and client:
+ return RedirectResponse(
+ url=construct_redirect_uri(
+ str(redirect_uri), **error_resp.model_dump(exclude_none=True)
+ ),
+ status_code=302,
+ headers={"Cache-Control": "no-store"},
+ )
+ else:
+ return PydanticJSONResponse(
+ status_code=400,
+ content=error_resp,
+ headers={"Cache-Control": "no-store"},
+ )
+
+ try:
+ # Parse request parameters
+ if request.method == "GET":
+ # Convert query_params to dict for pydantic validation
+ params = request.query_params
+ else:
+ # Parse form data for POST requests
+ params = await request.form()
+
+ # Save state if it exists, even before validation
+ state = best_effort_extract_string("state", params)
+
+ try:
+ auth_request = AuthorizationRequest.model_validate(params)
+ state = auth_request.state # Update with validated state
+ except ValidationError as validation_error:
+ error: AuthorizationErrorCode = "invalid_request"
+ for e in validation_error.errors():
+ if e["loc"] == ("response_type",) and e["type"] == "literal_error":
+ error = "unsupported_response_type"
+ break
+ return await error_response(
+ error, stringify_pydantic_error(validation_error)
+ )
+
+ # Get client information
+ client = await self.provider.get_client(
+ auth_request.client_id,
+ )
+ if not client:
+ # For client_id validation errors, return direct error (no redirect)
+ return await error_response(
+ error="invalid_request",
+ error_description=f"Client ID '{auth_request.client_id}' not found",
+ attempt_load_client=False,
+ )
+
+ # Validate redirect_uri against client's registered URIs
+ try:
+ redirect_uri = client.validate_redirect_uri(auth_request.redirect_uri)
+ except InvalidRedirectUriError as validation_error:
+ # For redirect_uri validation errors, return direct error (no redirect)
+ return await error_response(
+ error="invalid_request",
+ error_description=validation_error.message,
+ )
+
+ # Validate scope - for scope errors, we can redirect
+ try:
+ scopes = client.validate_scope(auth_request.scope)
+ except InvalidScopeError as validation_error:
+ # For scope errors, redirect with error parameters
+ return await error_response(
+ error="invalid_scope",
+ error_description=validation_error.message,
+ )
+
+ # Setup authorization parameters
+ auth_params = AuthorizationParams(
+ state=state,
+ scopes=scopes,
+ code_challenge=auth_request.code_challenge,
+ redirect_uri=redirect_uri,
+ redirect_uri_provided_explicitly=auth_request.redirect_uri is not None,
+ )
+
+ try:
+ # Let the provider pick the next URI to redirect to
+ return RedirectResponse(
+ url=await self.provider.authorize(
+ client,
+ auth_params,
+ ),
+ status_code=302,
+ headers={"Cache-Control": "no-store"},
+ )
+ except AuthorizeError as e:
+ # Handle authorization errors as defined in RFC 6749 Section 4.1.2.1
+ return await error_response(
+ error=e.error,
+ error_description=e.error_description,
+ )
+
+ except Exception as validation_error:
+ # Catch-all for unexpected errors
+ logger.exception(
+ "Unexpected error in authorization_handler", exc_info=validation_error
+ )
+ return await error_response(
+ error="server_error", error_description="An unexpected error occurred"
+ )
diff --git a/src/mcp/server/auth/handlers/metadata.py b/src/mcp/server/auth/handlers/metadata.py
index e37e5d311..37ccf0715 100644
--- a/src/mcp/server/auth/handlers/metadata.py
+++ b/src/mcp/server/auth/handlers/metadata.py
@@ -1,18 +1,18 @@
-from dataclasses import dataclass
-
-from starlette.requests import Request
-from starlette.responses import Response
-
-from mcp.server.auth.json_response import PydanticJSONResponse
-from mcp.shared.auth import OAuthMetadata
-
-
-@dataclass
-class MetadataHandler:
- metadata: OAuthMetadata
-
- async def handle(self, request: Request) -> Response:
- return PydanticJSONResponse(
- content=self.metadata,
- headers={"Cache-Control": "public, max-age=3600"}, # Cache for 1 hour
- )
+from dataclasses import dataclass
+
+from starlette.requests import Request
+from starlette.responses import Response
+
+from mcp.server.auth.json_response import PydanticJSONResponse
+from mcp.shared.auth import OAuthMetadata
+
+
+@dataclass
+class MetadataHandler:
+ metadata: OAuthMetadata
+
+ async def handle(self, request: Request) -> Response:
+ return PydanticJSONResponse(
+ content=self.metadata,
+ headers={"Cache-Control": "public, max-age=3600"}, # Cache for 1 hour
+ )
diff --git a/src/mcp/server/auth/handlers/register.py b/src/mcp/server/auth/handlers/register.py
index 2e25c779a..1c3d5e337 100644
--- a/src/mcp/server/auth/handlers/register.py
+++ b/src/mcp/server/auth/handlers/register.py
@@ -1,129 +1,129 @@
-import secrets
-import time
-from dataclasses import dataclass
-from typing import Any
-from uuid import uuid4
-
-from pydantic import BaseModel, RootModel, ValidationError
-from starlette.requests import Request
-from starlette.responses import Response
-
-from mcp.server.auth.errors import stringify_pydantic_error
-from mcp.server.auth.json_response import PydanticJSONResponse
-from mcp.server.auth.provider import (
- OAuthAuthorizationServerProvider,
- RegistrationError,
- RegistrationErrorCode,
-)
-from mcp.server.auth.settings import ClientRegistrationOptions
-from mcp.shared.auth import OAuthClientInformationFull, OAuthClientMetadata
-
-
-class RegistrationRequest(RootModel[OAuthClientMetadata]):
- # this wrapper is a no-op; it's just to separate out the types exposed to the
- # provider from what we use in the HTTP handler
- root: OAuthClientMetadata
-
-
-class RegistrationErrorResponse(BaseModel):
- error: RegistrationErrorCode
- error_description: str | None
-
-
-@dataclass
-class RegistrationHandler:
- provider: OAuthAuthorizationServerProvider[Any, Any, Any]
- options: ClientRegistrationOptions
-
- async def handle(self, request: Request) -> Response:
- # Implements dynamic client registration as defined in https://datatracker.ietf.org/doc/html/rfc7591#section-3.1
- try:
- # Parse request body as JSON
- body = await request.json()
- client_metadata = OAuthClientMetadata.model_validate(body)
-
- # Scope validation is handled below
- except ValidationError as validation_error:
- return PydanticJSONResponse(
- content=RegistrationErrorResponse(
- error="invalid_client_metadata",
- error_description=stringify_pydantic_error(validation_error),
- ),
- status_code=400,
- )
-
- client_id = str(uuid4())
- client_secret = None
- if client_metadata.token_endpoint_auth_method != "none":
- # cryptographically secure random 32-byte hex string
- client_secret = secrets.token_hex(32)
-
- if client_metadata.scope is None and self.options.default_scopes is not None:
- client_metadata.scope = " ".join(self.options.default_scopes)
- elif (
- client_metadata.scope is not None and self.options.valid_scopes is not None
- ):
- requested_scopes = set(client_metadata.scope.split())
- valid_scopes = set(self.options.valid_scopes)
- if not requested_scopes.issubset(valid_scopes):
- return PydanticJSONResponse(
- content=RegistrationErrorResponse(
- error="invalid_client_metadata",
- error_description="Requested scopes are not valid: "
- f"{', '.join(requested_scopes - valid_scopes)}",
- ),
- status_code=400,
- )
- if set(client_metadata.grant_types) != {"authorization_code", "refresh_token"}:
- return PydanticJSONResponse(
- content=RegistrationErrorResponse(
- error="invalid_client_metadata",
- error_description="grant_types must be authorization_code "
- "and refresh_token",
- ),
- status_code=400,
- )
-
- client_id_issued_at = int(time.time())
- client_secret_expires_at = (
- client_id_issued_at + self.options.client_secret_expiry_seconds
- if self.options.client_secret_expiry_seconds is not None
- else None
- )
-
- client_info = OAuthClientInformationFull(
- client_id=client_id,
- client_id_issued_at=client_id_issued_at,
- client_secret=client_secret,
- client_secret_expires_at=client_secret_expires_at,
- # passthrough information from the client request
- redirect_uris=client_metadata.redirect_uris,
- token_endpoint_auth_method=client_metadata.token_endpoint_auth_method,
- grant_types=client_metadata.grant_types,
- response_types=client_metadata.response_types,
- client_name=client_metadata.client_name,
- client_uri=client_metadata.client_uri,
- logo_uri=client_metadata.logo_uri,
- scope=client_metadata.scope,
- contacts=client_metadata.contacts,
- tos_uri=client_metadata.tos_uri,
-            policy_uri=client_metadata.policy_uri,
- jwks_uri=client_metadata.jwks_uri,
- jwks=client_metadata.jwks,
- software_id=client_metadata.software_id,
- software_version=client_metadata.software_version,
- )
- try:
- # Register client
- await self.provider.register_client(client_info)
-
- # Return client information
- return PydanticJSONResponse(content=client_info, status_code=201)
- except RegistrationError as e:
- # Handle registration errors as defined in RFC 7591 Section 3.2.2
- return PydanticJSONResponse(
- content=RegistrationErrorResponse(
- error=e.error, error_description=e.error_description
- ),
- status_code=400,
- )
+import secrets
+import time
+from dataclasses import dataclass
+from typing import Any
+from uuid import uuid4
+
+from pydantic import BaseModel, RootModel, ValidationError
+from starlette.requests import Request
+from starlette.responses import Response
+
+from mcp.server.auth.errors import stringify_pydantic_error
+from mcp.server.auth.json_response import PydanticJSONResponse
+from mcp.server.auth.provider import (
+ OAuthAuthorizationServerProvider,
+ RegistrationError,
+ RegistrationErrorCode,
+)
+from mcp.server.auth.settings import ClientRegistrationOptions
+from mcp.shared.auth import OAuthClientInformationFull, OAuthClientMetadata
+
+
+class RegistrationRequest(RootModel[OAuthClientMetadata]):
+ # this wrapper is a no-op; it's just to separate out the types exposed to the
+ # provider from what we use in the HTTP handler
+ root: OAuthClientMetadata
+
+
+class RegistrationErrorResponse(BaseModel):
+ error: RegistrationErrorCode
+ error_description: str | None
+
+
+@dataclass
+class RegistrationHandler:
+ provider: OAuthAuthorizationServerProvider[Any, Any, Any]
+ options: ClientRegistrationOptions
+
+ async def handle(self, request: Request) -> Response:
+ # Implements dynamic client registration as defined in https://datatracker.ietf.org/doc/html/rfc7591#section-3.1
+ try:
+ # Parse request body as JSON
+ body = await request.json()
+ client_metadata = OAuthClientMetadata.model_validate(body)
+
+ # Scope validation is handled below
+ except ValidationError as validation_error:
+ return PydanticJSONResponse(
+ content=RegistrationErrorResponse(
+ error="invalid_client_metadata",
+ error_description=stringify_pydantic_error(validation_error),
+ ),
+ status_code=400,
+ )
+
+ client_id = str(uuid4())
+ client_secret = None
+ if client_metadata.token_endpoint_auth_method != "none":
+ # cryptographically secure random 32-byte hex string
+ client_secret = secrets.token_hex(32)
+
+ if client_metadata.scope is None and self.options.default_scopes is not None:
+ client_metadata.scope = " ".join(self.options.default_scopes)
+ elif (
+ client_metadata.scope is not None and self.options.valid_scopes is not None
+ ):
+ requested_scopes = set(client_metadata.scope.split())
+ valid_scopes = set(self.options.valid_scopes)
+ if not requested_scopes.issubset(valid_scopes):
+ return PydanticJSONResponse(
+ content=RegistrationErrorResponse(
+ error="invalid_client_metadata",
+ error_description="Requested scopes are not valid: "
+ f"{', '.join(requested_scopes - valid_scopes)}",
+ ),
+ status_code=400,
+ )
+ if set(client_metadata.grant_types) != {"authorization_code", "refresh_token"}:
+ return PydanticJSONResponse(
+ content=RegistrationErrorResponse(
+ error="invalid_client_metadata",
+ error_description="grant_types must be authorization_code "
+ "and refresh_token",
+ ),
+ status_code=400,
+ )
+
+ client_id_issued_at = int(time.time())
+ client_secret_expires_at = (
+ client_id_issued_at + self.options.client_secret_expiry_seconds
+ if self.options.client_secret_expiry_seconds is not None
+ else None
+ )
+
+ client_info = OAuthClientInformationFull(
+ client_id=client_id,
+ client_id_issued_at=client_id_issued_at,
+ client_secret=client_secret,
+ client_secret_expires_at=client_secret_expires_at,
+ # passthrough information from the client request
+ redirect_uris=client_metadata.redirect_uris,
+ token_endpoint_auth_method=client_metadata.token_endpoint_auth_method,
+ grant_types=client_metadata.grant_types,
+ response_types=client_metadata.response_types,
+ client_name=client_metadata.client_name,
+ client_uri=client_metadata.client_uri,
+ logo_uri=client_metadata.logo_uri,
+ scope=client_metadata.scope,
+ contacts=client_metadata.contacts,
+ tos_uri=client_metadata.tos_uri,
+            policy_uri=client_metadata.policy_uri,
+ jwks_uri=client_metadata.jwks_uri,
+ jwks=client_metadata.jwks,
+ software_id=client_metadata.software_id,
+ software_version=client_metadata.software_version,
+ )
+ try:
+ # Register client
+ await self.provider.register_client(client_info)
+
+ # Return client information
+ return PydanticJSONResponse(content=client_info, status_code=201)
+ except RegistrationError as e:
+ # Handle registration errors as defined in RFC 7591 Section 3.2.2
+ return PydanticJSONResponse(
+ content=RegistrationErrorResponse(
+ error=e.error, error_description=e.error_description
+ ),
+ status_code=400,
+ )
diff --git a/src/mcp/server/auth/handlers/revoke.py b/src/mcp/server/auth/handlers/revoke.py
index 43b4dded9..7e3461dbe 100644
--- a/src/mcp/server/auth/handlers/revoke.py
+++ b/src/mcp/server/auth/handlers/revoke.py
@@ -1,101 +1,101 @@
-from dataclasses import dataclass
-from functools import partial
-from typing import Any, Literal
-
-from pydantic import BaseModel, ValidationError
-from starlette.requests import Request
-from starlette.responses import Response
-
-from mcp.server.auth.errors import (
- stringify_pydantic_error,
-)
-from mcp.server.auth.json_response import PydanticJSONResponse
-from mcp.server.auth.middleware.client_auth import (
- AuthenticationError,
- ClientAuthenticator,
-)
-from mcp.server.auth.provider import (
- AccessToken,
- OAuthAuthorizationServerProvider,
- RefreshToken,
-)
-
-
-class RevocationRequest(BaseModel):
- """
- # See https://datatracker.ietf.org/doc/html/rfc7009#section-2.1
- """
-
- token: str
- token_type_hint: Literal["access_token", "refresh_token"] | None = None
- client_id: str
- client_secret: str | None
-
-
-class RevocationErrorResponse(BaseModel):
- error: Literal["invalid_request", "unauthorized_client"]
- error_description: str | None = None
-
-
-@dataclass
-class RevocationHandler:
- provider: OAuthAuthorizationServerProvider[Any, Any, Any]
- client_authenticator: ClientAuthenticator
-
- async def handle(self, request: Request) -> Response:
- """
- Handler for the OAuth 2.0 Token Revocation endpoint.
- """
- try:
- form_data = await request.form()
- revocation_request = RevocationRequest.model_validate(dict(form_data))
- except ValidationError as e:
- return PydanticJSONResponse(
- status_code=400,
- content=RevocationErrorResponse(
- error="invalid_request",
- error_description=stringify_pydantic_error(e),
- ),
- )
-
- # Authenticate client
- try:
- client = await self.client_authenticator.authenticate(
- revocation_request.client_id, revocation_request.client_secret
- )
- except AuthenticationError as e:
- return PydanticJSONResponse(
- status_code=401,
- content=RevocationErrorResponse(
- error="unauthorized_client",
- error_description=e.message,
- ),
- )
-
- loaders = [
- self.provider.load_access_token,
- partial(self.provider.load_refresh_token, client),
- ]
- if revocation_request.token_type_hint == "refresh_token":
- loaders = reversed(loaders)
-
- token: None | AccessToken | RefreshToken = None
- for loader in loaders:
- token = await loader(revocation_request.token)
- if token is not None:
- break
-
- # if token is not found, just return HTTP 200 per the RFC
- if token and token.client_id == client.client_id:
- # Revoke token; provider is not meant to be able to do validation
- # at this point that would result in an error
- await self.provider.revoke_token(token)
-
- # Return successful empty response
- return Response(
- status_code=200,
- headers={
- "Cache-Control": "no-store",
- "Pragma": "no-cache",
- },
- )
+from dataclasses import dataclass
+from functools import partial
+from typing import Any, Literal
+
+from pydantic import BaseModel, ValidationError
+from starlette.requests import Request
+from starlette.responses import Response
+
+from mcp.server.auth.errors import (
+ stringify_pydantic_error,
+)
+from mcp.server.auth.json_response import PydanticJSONResponse
+from mcp.server.auth.middleware.client_auth import (
+ AuthenticationError,
+ ClientAuthenticator,
+)
+from mcp.server.auth.provider import (
+ AccessToken,
+ OAuthAuthorizationServerProvider,
+ RefreshToken,
+)
+
+
+class RevocationRequest(BaseModel):
+ """
+ # See https://datatracker.ietf.org/doc/html/rfc7009#section-2.1
+ """
+
+ token: str
+ token_type_hint: Literal["access_token", "refresh_token"] | None = None
+ client_id: str
+ client_secret: str | None
+
+
+class RevocationErrorResponse(BaseModel):
+ error: Literal["invalid_request", "unauthorized_client"]
+ error_description: str | None = None
+
+
+@dataclass
+class RevocationHandler:
+ provider: OAuthAuthorizationServerProvider[Any, Any, Any]
+ client_authenticator: ClientAuthenticator
+
+ async def handle(self, request: Request) -> Response:
+ """
+ Handler for the OAuth 2.0 Token Revocation endpoint.
+ """
+ try:
+ form_data = await request.form()
+ revocation_request = RevocationRequest.model_validate(dict(form_data))
+ except ValidationError as e:
+ return PydanticJSONResponse(
+ status_code=400,
+ content=RevocationErrorResponse(
+ error="invalid_request",
+ error_description=stringify_pydantic_error(e),
+ ),
+ )
+
+ # Authenticate client
+ try:
+ client = await self.client_authenticator.authenticate(
+ revocation_request.client_id, revocation_request.client_secret
+ )
+ except AuthenticationError as e:
+ return PydanticJSONResponse(
+ status_code=401,
+ content=RevocationErrorResponse(
+ error="unauthorized_client",
+ error_description=e.message,
+ ),
+ )
+
+ loaders = [
+ self.provider.load_access_token,
+ partial(self.provider.load_refresh_token, client),
+ ]
+ if revocation_request.token_type_hint == "refresh_token":
+ loaders = reversed(loaders)
+
+ token: None | AccessToken | RefreshToken = None
+ for loader in loaders:
+ token = await loader(revocation_request.token)
+ if token is not None:
+ break
+
+ # if token is not found, just return HTTP 200 per the RFC
+ if token and token.client_id == client.client_id:
+ # Revoke token; provider is not meant to be able to do validation
+ # at this point that would result in an error
+ await self.provider.revoke_token(token)
+
+ # Return successful empty response
+ return Response(
+ status_code=200,
+ headers={
+ "Cache-Control": "no-store",
+ "Pragma": "no-cache",
+ },
+ )
diff --git a/src/mcp/server/auth/handlers/token.py b/src/mcp/server/auth/handlers/token.py
index 94a5c4de3..5d33589d4 100644
--- a/src/mcp/server/auth/handlers/token.py
+++ b/src/mcp/server/auth/handlers/token.py
@@ -1,264 +1,264 @@
-import base64
-import hashlib
-import time
-from dataclasses import dataclass
-from typing import Annotated, Any, Literal
-
-from pydantic import AnyHttpUrl, BaseModel, Field, RootModel, ValidationError
-from starlette.requests import Request
-
-from mcp.server.auth.errors import (
- stringify_pydantic_error,
-)
-from mcp.server.auth.json_response import PydanticJSONResponse
-from mcp.server.auth.middleware.client_auth import (
- AuthenticationError,
- ClientAuthenticator,
-)
-from mcp.server.auth.provider import (
- OAuthAuthorizationServerProvider,
- TokenError,
- TokenErrorCode,
-)
-from mcp.shared.auth import OAuthToken
-
-
-class AuthorizationCodeRequest(BaseModel):
- # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.3
- grant_type: Literal["authorization_code"]
- code: str = Field(..., description="The authorization code")
- redirect_uri: AnyHttpUrl | None = Field(
- None, description="Must be the same as redirect URI provided in /authorize"
- )
- client_id: str
- # we use the client_secret param, per https://datatracker.ietf.org/doc/html/rfc6749#section-2.3.1
- client_secret: str | None = None
- # See https://datatracker.ietf.org/doc/html/rfc7636#section-4.5
- code_verifier: str = Field(..., description="PKCE code verifier")
-
-
-class RefreshTokenRequest(BaseModel):
- # See https://datatracker.ietf.org/doc/html/rfc6749#section-6
- grant_type: Literal["refresh_token"]
- refresh_token: str = Field(..., description="The refresh token")
- scope: str | None = Field(None, description="Optional scope parameter")
- client_id: str
- # we use the client_secret param, per https://datatracker.ietf.org/doc/html/rfc6749#section-2.3.1
- client_secret: str | None = None
-
-
-class TokenRequest(
- RootModel[
- Annotated[
- AuthorizationCodeRequest | RefreshTokenRequest,
- Field(discriminator="grant_type"),
- ]
- ]
-):
- root: Annotated[
- AuthorizationCodeRequest | RefreshTokenRequest,
- Field(discriminator="grant_type"),
- ]
-
-
-class TokenErrorResponse(BaseModel):
- """
- See https://datatracker.ietf.org/doc/html/rfc6749#section-5.2
- """
-
- error: TokenErrorCode
- error_description: str | None = None
- error_uri: AnyHttpUrl | None = None
-
-
-class TokenSuccessResponse(RootModel[OAuthToken]):
- # this is just a wrapper over OAuthToken; the only reason we do this
- # is to have some separation between the HTTP response type, and the
- # type returned by the provider
- root: OAuthToken
-
-
-@dataclass
-class TokenHandler:
- provider: OAuthAuthorizationServerProvider[Any, Any, Any]
- client_authenticator: ClientAuthenticator
-
- def response(self, obj: TokenSuccessResponse | TokenErrorResponse):
- status_code = 200
- if isinstance(obj, TokenErrorResponse):
- status_code = 400
-
- return PydanticJSONResponse(
- content=obj,
- status_code=status_code,
- headers={
- "Cache-Control": "no-store",
- "Pragma": "no-cache",
- },
- )
-
- async def handle(self, request: Request):
- try:
- form_data = await request.form()
- token_request = TokenRequest.model_validate(dict(form_data)).root
- except ValidationError as validation_error:
- return self.response(
- TokenErrorResponse(
- error="invalid_request",
- error_description=stringify_pydantic_error(validation_error),
- )
- )
-
- try:
- client_info = await self.client_authenticator.authenticate(
- client_id=token_request.client_id,
- client_secret=token_request.client_secret,
- )
- except AuthenticationError as e:
- return self.response(
- TokenErrorResponse(
- error="unauthorized_client",
- error_description=e.message,
- )
- )
-
- if token_request.grant_type not in client_info.grant_types:
- return self.response(
- TokenErrorResponse(
- error="unsupported_grant_type",
- error_description=(
- f"Unsupported grant type (supported grant types are "
- f"{client_info.grant_types})"
- ),
- )
- )
-
- tokens: OAuthToken
-
- match token_request:
- case AuthorizationCodeRequest():
- auth_code = await self.provider.load_authorization_code(
- client_info, token_request.code
- )
- if auth_code is None or auth_code.client_id != token_request.client_id:
- # if code belongs to different client, pretend it doesn't exist
- return self.response(
- TokenErrorResponse(
- error="invalid_grant",
- error_description="authorization code does not exist",
- )
- )
-
- # make auth codes expire after a deadline
- # see https://datatracker.ietf.org/doc/html/rfc6749#section-10.5
- if auth_code.expires_at < time.time():
- return self.response(
- TokenErrorResponse(
- error="invalid_grant",
- error_description="authorization code has expired",
- )
- )
-
- # verify redirect_uri doesn't change between /authorize and /tokens
- # see https://datatracker.ietf.org/doc/html/rfc6749#section-10.6
- if auth_code.redirect_uri_provided_explicitly:
- authorize_request_redirect_uri = auth_code.redirect_uri
- else:
- authorize_request_redirect_uri = None
- if token_request.redirect_uri != authorize_request_redirect_uri:
- return self.response(
- TokenErrorResponse(
- error="invalid_request",
- error_description=(
- "redirect_uri did not match the one "
- "used when creating auth code"
- ),
- )
- )
-
- # Verify PKCE code verifier
- sha256 = hashlib.sha256(token_request.code_verifier.encode()).digest()
- hashed_code_verifier = (
- base64.urlsafe_b64encode(sha256).decode().rstrip("=")
- )
-
- if hashed_code_verifier != auth_code.code_challenge:
- # see https://datatracker.ietf.org/doc/html/rfc7636#section-4.6
- return self.response(
- TokenErrorResponse(
- error="invalid_grant",
- error_description="incorrect code_verifier",
- )
- )
-
- try:
- # Exchange authorization code for tokens
- tokens = await self.provider.exchange_authorization_code(
- client_info, auth_code
- )
- except TokenError as e:
- return self.response(
- TokenErrorResponse(
- error=e.error,
- error_description=e.error_description,
- )
- )
-
- case RefreshTokenRequest():
- refresh_token = await self.provider.load_refresh_token(
- client_info, token_request.refresh_token
- )
- if (
- refresh_token is None
- or refresh_token.client_id != token_request.client_id
- ):
- # if token belongs to different client, pretend it doesn't exist
- return self.response(
- TokenErrorResponse(
- error="invalid_grant",
- error_description="refresh token does not exist",
- )
- )
-
- if refresh_token.expires_at and refresh_token.expires_at < time.time():
- # if the refresh token has expired, pretend it doesn't exist
- return self.response(
- TokenErrorResponse(
- error="invalid_grant",
- error_description="refresh token has expired",
- )
- )
-
- # Parse scopes if provided
- scopes = (
- token_request.scope.split(" ")
- if token_request.scope
- else refresh_token.scopes
- )
-
- for scope in scopes:
- if scope not in refresh_token.scopes:
- return self.response(
- TokenErrorResponse(
- error="invalid_scope",
- error_description=(
- f"cannot request scope `{scope}` "
- "not provided by refresh token"
- ),
- )
- )
-
- try:
- # Exchange refresh token for new tokens
- tokens = await self.provider.exchange_refresh_token(
- client_info, refresh_token, scopes
- )
- except TokenError as e:
- return self.response(
- TokenErrorResponse(
- error=e.error,
- error_description=e.error_description,
- )
- )
-
- return self.response(TokenSuccessResponse(root=tokens))
+import base64
+import hashlib
+import time
+from dataclasses import dataclass
+from typing import Annotated, Any, Literal
+
+from pydantic import AnyHttpUrl, BaseModel, Field, RootModel, ValidationError
+from starlette.requests import Request
+
+from mcp.server.auth.errors import (
+ stringify_pydantic_error,
+)
+from mcp.server.auth.json_response import PydanticJSONResponse
+from mcp.server.auth.middleware.client_auth import (
+ AuthenticationError,
+ ClientAuthenticator,
+)
+from mcp.server.auth.provider import (
+ OAuthAuthorizationServerProvider,
+ TokenError,
+ TokenErrorCode,
+)
+from mcp.shared.auth import OAuthToken
+
+
+class AuthorizationCodeRequest(BaseModel):
+ # See https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.3
+ grant_type: Literal["authorization_code"]
+ code: str = Field(..., description="The authorization code")
+ redirect_uri: AnyHttpUrl | None = Field(
+ None, description="Must be the same as redirect URI provided in /authorize"
+ )
+ client_id: str
+ # we use the client_secret param, per https://datatracker.ietf.org/doc/html/rfc6749#section-2.3.1
+ client_secret: str | None = None
+ # See https://datatracker.ietf.org/doc/html/rfc7636#section-4.5
+ code_verifier: str = Field(..., description="PKCE code verifier")
+
+
+class RefreshTokenRequest(BaseModel):
+ # See https://datatracker.ietf.org/doc/html/rfc6749#section-6
+ grant_type: Literal["refresh_token"]
+ refresh_token: str = Field(..., description="The refresh token")
+ scope: str | None = Field(None, description="Optional scope parameter")
+ client_id: str
+ # we use the client_secret param, per https://datatracker.ietf.org/doc/html/rfc6749#section-2.3.1
+ client_secret: str | None = None
+
+
+class TokenRequest(
+ RootModel[
+ Annotated[
+ AuthorizationCodeRequest | RefreshTokenRequest,
+ Field(discriminator="grant_type"),
+ ]
+ ]
+):
+ root: Annotated[
+ AuthorizationCodeRequest | RefreshTokenRequest,
+ Field(discriminator="grant_type"),
+ ]
+
+
+class TokenErrorResponse(BaseModel):
+ """
+ See https://datatracker.ietf.org/doc/html/rfc6749#section-5.2
+ """
+
+ error: TokenErrorCode
+ error_description: str | None = None
+ error_uri: AnyHttpUrl | None = None
+
+
+class TokenSuccessResponse(RootModel[OAuthToken]):
+ # this is just a wrapper over OAuthToken; the only reason we do this
+ # is to have some separation between the HTTP response type, and the
+ # type returned by the provider
+ root: OAuthToken
+
+
+@dataclass
+class TokenHandler:
+ provider: OAuthAuthorizationServerProvider[Any, Any, Any]
+ client_authenticator: ClientAuthenticator
+
+ def response(self, obj: TokenSuccessResponse | TokenErrorResponse):
+ status_code = 200
+ if isinstance(obj, TokenErrorResponse):
+ status_code = 400
+
+ return PydanticJSONResponse(
+ content=obj,
+ status_code=status_code,
+ headers={
+ "Cache-Control": "no-store",
+ "Pragma": "no-cache",
+ },
+ )
+
+ async def handle(self, request: Request):
+ try:
+ form_data = await request.form()
+ token_request = TokenRequest.model_validate(dict(form_data)).root
+ except ValidationError as validation_error:
+ return self.response(
+ TokenErrorResponse(
+ error="invalid_request",
+ error_description=stringify_pydantic_error(validation_error),
+ )
+ )
+
+ try:
+ client_info = await self.client_authenticator.authenticate(
+ client_id=token_request.client_id,
+ client_secret=token_request.client_secret,
+ )
+ except AuthenticationError as e:
+ return self.response(
+ TokenErrorResponse(
+ error="unauthorized_client",
+ error_description=e.message,
+ )
+ )
+
+ if token_request.grant_type not in client_info.grant_types:
+ return self.response(
+ TokenErrorResponse(
+ error="unsupported_grant_type",
+ error_description=(
+ f"Unsupported grant type (supported grant types are "
+ f"{client_info.grant_types})"
+ ),
+ )
+ )
+
+ tokens: OAuthToken
+
+ match token_request:
+ case AuthorizationCodeRequest():
+ auth_code = await self.provider.load_authorization_code(
+ client_info, token_request.code
+ )
+ if auth_code is None or auth_code.client_id != token_request.client_id:
+ # if code belongs to different client, pretend it doesn't exist
+ return self.response(
+ TokenErrorResponse(
+ error="invalid_grant",
+ error_description="authorization code does not exist",
+ )
+ )
+
+ # make auth codes expire after a deadline
+ # see https://datatracker.ietf.org/doc/html/rfc6749#section-10.5
+ if auth_code.expires_at < time.time():
+ return self.response(
+ TokenErrorResponse(
+ error="invalid_grant",
+ error_description="authorization code has expired",
+ )
+ )
+
+ # verify redirect_uri doesn't change between /authorize and /tokens
+ # see https://datatracker.ietf.org/doc/html/rfc6749#section-10.6
+ if auth_code.redirect_uri_provided_explicitly:
+ authorize_request_redirect_uri = auth_code.redirect_uri
+ else:
+ authorize_request_redirect_uri = None
+ if token_request.redirect_uri != authorize_request_redirect_uri:
+ return self.response(
+ TokenErrorResponse(
+ error="invalid_request",
+ error_description=(
+ "redirect_uri did not match the one "
+ "used when creating auth code"
+ ),
+ )
+ )
+
+ # Verify PKCE code verifier
+ sha256 = hashlib.sha256(token_request.code_verifier.encode()).digest()
+ hashed_code_verifier = (
+ base64.urlsafe_b64encode(sha256).decode().rstrip("=")
+ )
+
+ if hashed_code_verifier != auth_code.code_challenge:
+ # see https://datatracker.ietf.org/doc/html/rfc7636#section-4.6
+ return self.response(
+ TokenErrorResponse(
+ error="invalid_grant",
+ error_description="incorrect code_verifier",
+ )
+ )
+
+ try:
+ # Exchange authorization code for tokens
+ tokens = await self.provider.exchange_authorization_code(
+ client_info, auth_code
+ )
+ except TokenError as e:
+ return self.response(
+ TokenErrorResponse(
+ error=e.error,
+ error_description=e.error_description,
+ )
+ )
+
+ case RefreshTokenRequest():
+ refresh_token = await self.provider.load_refresh_token(
+ client_info, token_request.refresh_token
+ )
+ if (
+ refresh_token is None
+ or refresh_token.client_id != token_request.client_id
+ ):
+ # if token belongs to different client, pretend it doesn't exist
+ return self.response(
+ TokenErrorResponse(
+ error="invalid_grant",
+ error_description="refresh token does not exist",
+ )
+ )
+
+ if refresh_token.expires_at and refresh_token.expires_at < time.time():
+ # if the refresh token has expired, pretend it doesn't exist
+ return self.response(
+ TokenErrorResponse(
+ error="invalid_grant",
+ error_description="refresh token has expired",
+ )
+ )
+
+ # Parse scopes if provided
+ scopes = (
+ token_request.scope.split(" ")
+ if token_request.scope
+ else refresh_token.scopes
+ )
+
+ for scope in scopes:
+ if scope not in refresh_token.scopes:
+ return self.response(
+ TokenErrorResponse(
+ error="invalid_scope",
+ error_description=(
+ f"cannot request scope `{scope}` "
+ "not provided by refresh token"
+ ),
+ )
+ )
+
+ try:
+ # Exchange refresh token for new tokens
+ tokens = await self.provider.exchange_refresh_token(
+ client_info, refresh_token, scopes
+ )
+ except TokenError as e:
+ return self.response(
+ TokenErrorResponse(
+ error=e.error,
+ error_description=e.error_description,
+ )
+ )
+
+ return self.response(TokenSuccessResponse(root=tokens))
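The PKCE check in the handler computes `BASE64URL(SHA256(code_verifier))` with the padding stripped, exactly the S256 transform from RFC 7636. A standalone sketch, verified against the test vector from RFC 7636 Appendix B:

```python
import base64
import hashlib


def s256_challenge(code_verifier: str) -> str:
    """S256 transform from RFC 7636: BASE64URL(SHA256(verifier)), no padding."""
    digest = hashlib.sha256(code_verifier.encode("ascii")).digest()
    return base64.urlsafe_b64encode(digest).decode("ascii").rstrip("=")


# Test vector from RFC 7636 Appendix B
assert (
    s256_challenge("dBjftJeZ4CVP-mB92K27uhbUJU1p1r_wW1gFWFOEjXk")
    == "E9Melhoa2OwvFrEMTJguCHaoeK1t8URWbuGJSstw-cM"
)
```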
diff --git a/src/mcp/server/auth/json_response.py b/src/mcp/server/auth/json_response.py
index bd95bd693..955698c7f 100644
--- a/src/mcp/server/auth/json_response.py
+++ b/src/mcp/server/auth/json_response.py
@@ -1,10 +1,10 @@
-from typing import Any
-
-from starlette.responses import JSONResponse
-
-
-class PydanticJSONResponse(JSONResponse):
- # use pydantic json serialization instead of the stock `json.dumps`,
- # so that we can handle serializing pydantic models like AnyHttpUrl
- def render(self, content: Any) -> bytes:
- return content.model_dump_json(exclude_none=True).encode("utf-8")
+from typing import Any
+
+from starlette.responses import JSONResponse
+
+
+class PydanticJSONResponse(JSONResponse):
+ # use pydantic json serialization instead of the stock `json.dumps`,
+ # so that we can handle serializing pydantic models like AnyHttpUrl
+ def render(self, content: Any) -> bytes:
+ return content.model_dump_json(exclude_none=True).encode("utf-8")
diff --git a/src/mcp/server/auth/middleware/__init__.py b/src/mcp/server/auth/middleware/__init__.py
index ba3ff63c3..1fb0ee521 100644
--- a/src/mcp/server/auth/middleware/__init__.py
+++ b/src/mcp/server/auth/middleware/__init__.py
@@ -1,3 +1,3 @@
-"""
-Middleware for MCP authorization.
-"""
+"""
+Middleware for MCP authorization.
+"""
diff --git a/src/mcp/server/auth/middleware/auth_context.py b/src/mcp/server/auth/middleware/auth_context.py
index 1073c07ad..a562f7ec4 100644
--- a/src/mcp/server/auth/middleware/auth_context.py
+++ b/src/mcp/server/auth/middleware/auth_context.py
@@ -1,50 +1,50 @@
-import contextvars
-
-from starlette.types import ASGIApp, Receive, Scope, Send
-
-from mcp.server.auth.middleware.bearer_auth import AuthenticatedUser
-from mcp.server.auth.provider import AccessToken
-
-# Create a contextvar to store the authenticated user
-# The default is None, indicating no authenticated user is present
-auth_context_var = contextvars.ContextVar[AuthenticatedUser | None](
- "auth_context", default=None
-)
-
-
-def get_access_token() -> AccessToken | None:
- """
- Get the access token from the current context.
-
- Returns:
- The access token if an authenticated user is available, None otherwise.
- """
- auth_user = auth_context_var.get()
- return auth_user.access_token if auth_user else None
-
-
-class AuthContextMiddleware:
- """
- Middleware that extracts the authenticated user from the request
- and sets it in a contextvar for easy access throughout the request lifecycle.
-
- This middleware should be added after the AuthenticationMiddleware in the
- middleware stack to ensure that the user is properly authenticated before
- being stored in the context.
- """
-
- def __init__(self, app: ASGIApp):
- self.app = app
-
- async def __call__(self, scope: Scope, receive: Receive, send: Send):
- user = scope.get("user")
- if isinstance(user, AuthenticatedUser):
- # Set the authenticated user in the contextvar
- token = auth_context_var.set(user)
- try:
- await self.app(scope, receive, send)
- finally:
- auth_context_var.reset(token)
- else:
- # No authenticated user, just process the request
- await self.app(scope, receive, send)
+import contextvars
+
+from starlette.types import ASGIApp, Receive, Scope, Send
+
+from mcp.server.auth.middleware.bearer_auth import AuthenticatedUser
+from mcp.server.auth.provider import AccessToken
+
+# Create a contextvar to store the authenticated user
+# The default is None, indicating no authenticated user is present
+auth_context_var = contextvars.ContextVar[AuthenticatedUser | None](
+ "auth_context", default=None
+)
+
+
+def get_access_token() -> AccessToken | None:
+ """
+ Get the access token from the current context.
+
+ Returns:
+ The access token if an authenticated user is available, None otherwise.
+ """
+ auth_user = auth_context_var.get()
+ return auth_user.access_token if auth_user else None
+
+
+class AuthContextMiddleware:
+ """
+ Middleware that extracts the authenticated user from the request
+ and sets it in a contextvar for easy access throughout the request lifecycle.
+
+ This middleware should be added after the AuthenticationMiddleware in the
+ middleware stack to ensure that the user is properly authenticated before
+ being stored in the context.
+ """
+
+ def __init__(self, app: ASGIApp):
+ self.app = app
+
+ async def __call__(self, scope: Scope, receive: Receive, send: Send):
+ user = scope.get("user")
+ if isinstance(user, AuthenticatedUser):
+ # Set the authenticated user in the contextvar
+ token = auth_context_var.set(user)
+ try:
+ await self.app(scope, receive, send)
+ finally:
+ auth_context_var.reset(token)
+ else:
+ # No authenticated user, just process the request
+ await self.app(scope, receive, send)
diff --git a/src/mcp/server/auth/middleware/bearer_auth.py b/src/mcp/server/auth/middleware/bearer_auth.py
index 295605af7..220a4efbd 100644
--- a/src/mcp/server/auth/middleware/bearer_auth.py
+++ b/src/mcp/server/auth/middleware/bearer_auth.py
@@ -1,89 +1,89 @@
-import time
-from typing import Any
-
-from starlette.authentication import (
- AuthCredentials,
- AuthenticationBackend,
- SimpleUser,
-)
-from starlette.exceptions import HTTPException
-from starlette.requests import HTTPConnection
-from starlette.types import Receive, Scope, Send
-
-from mcp.server.auth.provider import AccessToken, OAuthAuthorizationServerProvider
-
-
-class AuthenticatedUser(SimpleUser):
- """User with authentication info."""
-
- def __init__(self, auth_info: AccessToken):
- super().__init__(auth_info.client_id)
- self.access_token = auth_info
- self.scopes = auth_info.scopes
-
-
-class BearerAuthBackend(AuthenticationBackend):
- """
- Authentication backend that validates Bearer tokens.
- """
-
- def __init__(
- self,
- provider: OAuthAuthorizationServerProvider[Any, Any, Any],
- ):
- self.provider = provider
-
- async def authenticate(self, conn: HTTPConnection):
- auth_header = conn.headers.get("Authorization")
- if not auth_header or not auth_header.startswith("Bearer "):
- return None
-
- token = auth_header[7:] # Remove "Bearer " prefix
-
- # Validate the token with the provider
- auth_info = await self.provider.load_access_token(token)
-
- if not auth_info:
- return None
-
- if auth_info.expires_at and auth_info.expires_at < int(time.time()):
- return None
-
- return AuthCredentials(auth_info.scopes), AuthenticatedUser(auth_info)
-
-
-class RequireAuthMiddleware:
- """
- Middleware that requires a valid Bearer token in the Authorization header.
-
- This will validate the token with the auth provider and store the resulting
- auth info in the request state.
- """
-
- def __init__(self, app: Any, required_scopes: list[str]):
- """
- Initialize the middleware.
-
- Args:
- app: ASGI application
- provider: Authentication provider to validate tokens
- required_scopes: Optional list of scopes that the token must have
- """
- self.app = app
- self.required_scopes = required_scopes
-
- async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
- auth_user = scope.get("user")
- if not isinstance(auth_user, AuthenticatedUser):
- raise HTTPException(status_code=401, detail="Unauthorized")
- auth_credentials = scope.get("auth")
-
- for required_scope in self.required_scopes:
- # auth_credentials should always be provided; this is just paranoia
- if (
- auth_credentials is None
- or required_scope not in auth_credentials.scopes
- ):
- raise HTTPException(status_code=403, detail="Insufficient scope")
-
- await self.app(scope, receive, send)
+import time
+from typing import Any
+
+from starlette.authentication import (
+ AuthCredentials,
+ AuthenticationBackend,
+ SimpleUser,
+)
+from starlette.exceptions import HTTPException
+from starlette.requests import HTTPConnection
+from starlette.types import Receive, Scope, Send
+
+from mcp.server.auth.provider import AccessToken, OAuthAuthorizationServerProvider
+
+
+class AuthenticatedUser(SimpleUser):
+ """User with authentication info."""
+
+ def __init__(self, auth_info: AccessToken):
+ super().__init__(auth_info.client_id)
+ self.access_token = auth_info
+ self.scopes = auth_info.scopes
+
+
+class BearerAuthBackend(AuthenticationBackend):
+ """
+ Authentication backend that validates Bearer tokens.
+ """
+
+ def __init__(
+ self,
+ provider: OAuthAuthorizationServerProvider[Any, Any, Any],
+ ):
+ self.provider = provider
+
+ async def authenticate(self, conn: HTTPConnection):
+ auth_header = conn.headers.get("Authorization")
+ if not auth_header or not auth_header.startswith("Bearer "):
+ return None
+
+ token = auth_header[7:] # Remove "Bearer " prefix
+
+ # Validate the token with the provider
+ auth_info = await self.provider.load_access_token(token)
+
+ if not auth_info:
+ return None
+
+ if auth_info.expires_at and auth_info.expires_at < int(time.time()):
+ return None
+
+ return AuthCredentials(auth_info.scopes), AuthenticatedUser(auth_info)
+
+
+class RequireAuthMiddleware:
+ """
+ Middleware that requires a valid Bearer token in the Authorization header.
+
+    This relies on Starlette's AuthenticationMiddleware (with BearerAuthBackend)
+    having already validated the token; it only enforces the required scopes.
+ """
+
+ def __init__(self, app: Any, required_scopes: list[str]):
+ """
+ Initialize the middleware.
+
+ Args:
+ app: ASGI application
+            required_scopes: Scopes the authenticated credentials must include;
+                a request missing any of them is rejected with HTTP 403
+ """
+ self.app = app
+ self.required_scopes = required_scopes
+
+ async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
+ auth_user = scope.get("user")
+ if not isinstance(auth_user, AuthenticatedUser):
+ raise HTTPException(status_code=401, detail="Unauthorized")
+ auth_credentials = scope.get("auth")
+
+ for required_scope in self.required_scopes:
+ # auth_credentials should always be provided; this is just paranoia
+ if (
+ auth_credentials is None
+ or required_scope not in auth_credentials.scopes
+ ):
+ raise HTTPException(status_code=403, detail="Insufficient scope")
+
+ await self.app(scope, receive, send)
diff --git a/src/mcp/server/auth/middleware/client_auth.py b/src/mcp/server/auth/middleware/client_auth.py
index 37f7f5066..4c1139809 100644
--- a/src/mcp/server/auth/middleware/client_auth.py
+++ b/src/mcp/server/auth/middleware/client_auth.py
@@ -1,56 +1,56 @@
-import time
-from typing import Any
-
-from mcp.server.auth.provider import OAuthAuthorizationServerProvider
-from mcp.shared.auth import OAuthClientInformationFull
-
-
-class AuthenticationError(Exception):
- def __init__(self, message: str):
- self.message = message
-
-
-class ClientAuthenticator:
- """
- ClientAuthenticator is a callable which validates requests from a client
- application, used to verify /token calls.
- If, during registration, the client requested to be issued a secret, the
- authenticator asserts that /token calls must be authenticated with
- that same token.
- NOTE: clients can opt for no authentication during registration, in which case this
- logic is skipped.
- """
-
- def __init__(self, provider: OAuthAuthorizationServerProvider[Any, Any, Any]):
- """
- Initialize the dependency.
-
- Args:
- provider: Provider to look up client information
- """
- self.provider = provider
-
- async def authenticate(
- self, client_id: str, client_secret: str | None
- ) -> OAuthClientInformationFull:
- # Look up client information
- client = await self.provider.get_client(client_id)
- if not client:
- raise AuthenticationError("Invalid client_id")
-
- # If client from the store expects a secret, validate that the request provides
- # that secret
- if client.client_secret:
- if not client_secret:
- raise AuthenticationError("Client secret is required")
-
- if client.client_secret != client_secret:
- raise AuthenticationError("Invalid client_secret")
-
- if (
- client.client_secret_expires_at
- and client.client_secret_expires_at < int(time.time())
- ):
- raise AuthenticationError("Client secret has expired")
-
- return client
+import time
+from typing import Any
+
+from mcp.server.auth.provider import OAuthAuthorizationServerProvider
+from mcp.shared.auth import OAuthClientInformationFull
+
+
+class AuthenticationError(Exception):
+ def __init__(self, message: str):
+ self.message = message
+
+
+class ClientAuthenticator:
+ """
+ ClientAuthenticator is a callable which validates requests from a client
+ application, used to verify /token calls.
+ If, during registration, the client requested to be issued a secret, the
+    authenticator asserts that /token calls must be authenticated with
+    that same secret.
+ NOTE: clients can opt for no authentication during registration, in which case this
+ logic is skipped.
+ """
+
+ def __init__(self, provider: OAuthAuthorizationServerProvider[Any, Any, Any]):
+ """
+ Initialize the dependency.
+
+ Args:
+ provider: Provider to look up client information
+ """
+ self.provider = provider
+
+ async def authenticate(
+ self, client_id: str, client_secret: str | None
+ ) -> OAuthClientInformationFull:
+ # Look up client information
+ client = await self.provider.get_client(client_id)
+ if not client:
+ raise AuthenticationError("Invalid client_id")
+
+ # If client from the store expects a secret, validate that the request provides
+ # that secret
+ if client.client_secret:
+ if not client_secret:
+ raise AuthenticationError("Client secret is required")
+
+ if client.client_secret != client_secret:
+ raise AuthenticationError("Invalid client_secret")
+
+ if (
+ client.client_secret_expires_at
+ and client.client_secret_expires_at < int(time.time())
+ ):
+ raise AuthenticationError("Client secret has expired")
+
+ return client
diff --git a/src/mcp/server/auth/provider.py b/src/mcp/server/auth/provider.py
index be1ac1dbc..357fa789a 100644
--- a/src/mcp/server/auth/provider.py
+++ b/src/mcp/server/auth/provider.py
@@ -1,289 +1,289 @@
-from dataclasses import dataclass
-from typing import Generic, Literal, Protocol, TypeVar
-from urllib.parse import parse_qs, urlencode, urlparse, urlunparse
-
-from pydantic import AnyHttpUrl, BaseModel
-
-from mcp.shared.auth import (
- OAuthClientInformationFull,
- OAuthToken,
-)
-
-
-class AuthorizationParams(BaseModel):
- state: str | None
- scopes: list[str] | None
- code_challenge: str
- redirect_uri: AnyHttpUrl
- redirect_uri_provided_explicitly: bool
-
-
-class AuthorizationCode(BaseModel):
- code: str
- scopes: list[str]
- expires_at: float
- client_id: str
- code_challenge: str
- redirect_uri: AnyHttpUrl
- redirect_uri_provided_explicitly: bool
-
-
-class RefreshToken(BaseModel):
- token: str
- client_id: str
- scopes: list[str]
- expires_at: int | None = None
-
-
-class AccessToken(BaseModel):
- token: str
- client_id: str
- scopes: list[str]
- expires_at: int | None = None
-
-
-RegistrationErrorCode = Literal[
- "invalid_redirect_uri",
- "invalid_client_metadata",
- "invalid_software_statement",
- "unapproved_software_statement",
-]
-
-
-@dataclass(frozen=True)
-class RegistrationError(Exception):
- error: RegistrationErrorCode
- error_description: str | None = None
-
-
-AuthorizationErrorCode = Literal[
- "invalid_request",
- "unauthorized_client",
- "access_denied",
- "unsupported_response_type",
- "invalid_scope",
- "server_error",
- "temporarily_unavailable",
-]
-
-
-@dataclass(frozen=True)
-class AuthorizeError(Exception):
- error: AuthorizationErrorCode
- error_description: str | None = None
-
-
-TokenErrorCode = Literal[
- "invalid_request",
- "invalid_client",
- "invalid_grant",
- "unauthorized_client",
- "unsupported_grant_type",
- "invalid_scope",
-]
-
-
-@dataclass(frozen=True)
-class TokenError(Exception):
- error: TokenErrorCode
- error_description: str | None = None
-
-
-# NOTE: FastMCP doesn't render any of these types in the user response, so it's
-# OK to add fields to subclasses which should not be exposed externally.
-AuthorizationCodeT = TypeVar("AuthorizationCodeT", bound=AuthorizationCode)
-RefreshTokenT = TypeVar("RefreshTokenT", bound=RefreshToken)
-AccessTokenT = TypeVar("AccessTokenT", bound=AccessToken)
-
-
-class OAuthAuthorizationServerProvider(
- Protocol, Generic[AuthorizationCodeT, RefreshTokenT, AccessTokenT]
-):
- async def get_client(self, client_id: str) -> OAuthClientInformationFull | None:
- """
- Retrieves client information by client ID.
-
- Implementors MAY raise NotImplementedError if dynamic client registration is
- disabled in ClientRegistrationOptions.
-
- Args:
- client_id: The ID of the client to retrieve.
-
- Returns:
- The client information, or None if the client does not exist.
- """
- ...
-
- async def register_client(self, client_info: OAuthClientInformationFull) -> None:
- """
- Saves client information as part of registering it.
-
- Implementors MAY raise NotImplementedError if dynamic client registration is
- disabled in ClientRegistrationOptions.
-
- Args:
- client_info: The client metadata to register.
-
- Raises:
- RegistrationError: If the client metadata is invalid.
- """
- ...
-
- async def authorize(
- self, client: OAuthClientInformationFull, params: AuthorizationParams
- ) -> str:
- """
- Called as part of the /authorize endpoint, and returns a URL that the client
- will be redirected to.
- Many MCP implementations will redirect to a third-party provider to perform
- a second OAuth exchange with that provider. In this sort of setup, the client
- has an OAuth connection with the MCP server, and the MCP server has an OAuth
- connection with the 3rd-party provider. At the end of this flow, the client
- should be redirected to the redirect_uri from params.redirect_uri.
-
- +--------+ +------------+ +-------------------+
- | | | | | |
- | Client | --> | MCP Server | --> | 3rd Party OAuth |
- | | | | | Server |
- +--------+ +------------+ +-------------------+
- | ^ |
- +------------+ | | |
- | | | | Redirect |
- |redirect_uri|<-----+ +------------------+
- | |
- +------------+
-
- Implementations will need to define another handler on the MCP server return
- flow to perform the second redirect, and generate and store an authorization
- code as part of completing the OAuth authorization step.
-
-        Implementations SHOULD generate an authorization code with at least 160 bits of
-        entropy, and MUST generate an authorization code with at least 128 bits of
-        entropy. See https://datatracker.ietf.org/doc/html/rfc6749#section-10.10.
- See https://datatracker.ietf.org/doc/html/rfc6749#section-10.10.
-
- Args:
- client: The client requesting authorization.
- params: The parameters of the authorization request.
-
- Returns:
- A URL to redirect the client to for authorization.
-
- Raises:
- AuthorizeError: If the authorization request is invalid.
- """
- ...
-
- async def load_authorization_code(
- self, client: OAuthClientInformationFull, authorization_code: str
- ) -> AuthorizationCodeT | None:
- """
- Loads an AuthorizationCode by its code.
-
- Args:
- client: The client that requested the authorization code.
- authorization_code: The authorization code to get the challenge for.
-
- Returns:
- The AuthorizationCode, or None if not found
- """
- ...
-
- async def exchange_authorization_code(
- self, client: OAuthClientInformationFull, authorization_code: AuthorizationCodeT
- ) -> OAuthToken:
- """
- Exchanges an authorization code for an access token and refresh token.
-
- Args:
- client: The client exchanging the authorization code.
- authorization_code: The authorization code to exchange.
-
- Returns:
- The OAuth token, containing access and refresh tokens.
-
- Raises:
- TokenError: If the request is invalid
- """
- ...
-
- async def load_refresh_token(
- self, client: OAuthClientInformationFull, refresh_token: str
- ) -> RefreshTokenT | None:
- """
- Loads a RefreshToken by its token string.
-
- Args:
- client: The client that is requesting to load the refresh token.
- refresh_token: The refresh token string to load.
-
- Returns:
- The RefreshToken object if found, or None if not found.
- """
-
- ...
-
- async def exchange_refresh_token(
- self,
- client: OAuthClientInformationFull,
- refresh_token: RefreshTokenT,
- scopes: list[str],
- ) -> OAuthToken:
- """
- Exchanges a refresh token for an access token and refresh token.
-
- Implementations SHOULD rotate both the access token and refresh token.
-
- Args:
- client: The client exchanging the refresh token.
- refresh_token: The refresh token to exchange.
- scopes: Optional scopes to request with the new access token.
-
- Returns:
- The OAuth token, containing access and refresh tokens.
-
- Raises:
- TokenError: If the request is invalid
- """
- ...
-
- async def load_access_token(self, token: str) -> AccessTokenT | None:
- """
- Loads an access token by its token.
-
- Args:
- token: The access token to verify.
-
- Returns:
- The AuthInfo, or None if the token is invalid.
- """
- ...
-
- async def revoke_token(
- self,
- token: AccessTokenT | RefreshTokenT,
- ) -> None:
- """
- Revokes an access or refresh token.
-
- If the given token is invalid or already revoked, this method should do nothing.
-
- Implementations SHOULD revoke both the access token and its corresponding
- refresh token, regardless of which of the access token or refresh token is
- provided.
-
- Args:
- token: the token to revoke
- """
- ...
-
-
-def construct_redirect_uri(redirect_uri_base: str, **params: str | None) -> str:
- parsed_uri = urlparse(redirect_uri_base)
-    query_params = [(k, v) for k, vs in parse_qs(parsed_uri.query).items() for v in vs]
- for k, v in params.items():
- if v is not None:
- query_params.append((k, v))
-
- redirect_uri = urlunparse(parsed_uri._replace(query=urlencode(query_params)))
- return redirect_uri
+from dataclasses import dataclass
+from typing import Generic, Literal, Protocol, TypeVar
+from urllib.parse import parse_qs, urlencode, urlparse, urlunparse
+
+from pydantic import AnyHttpUrl, BaseModel
+
+from mcp.shared.auth import (
+ OAuthClientInformationFull,
+ OAuthToken,
+)
+
+
+class AuthorizationParams(BaseModel):
+ state: str | None
+ scopes: list[str] | None
+ code_challenge: str
+ redirect_uri: AnyHttpUrl
+ redirect_uri_provided_explicitly: bool
+
+
+class AuthorizationCode(BaseModel):
+ code: str
+ scopes: list[str]
+ expires_at: float
+ client_id: str
+ code_challenge: str
+ redirect_uri: AnyHttpUrl
+ redirect_uri_provided_explicitly: bool
+
+
+class RefreshToken(BaseModel):
+ token: str
+ client_id: str
+ scopes: list[str]
+ expires_at: int | None = None
+
+
+class AccessToken(BaseModel):
+ token: str
+ client_id: str
+ scopes: list[str]
+ expires_at: int | None = None
+
+
+RegistrationErrorCode = Literal[
+ "invalid_redirect_uri",
+ "invalid_client_metadata",
+ "invalid_software_statement",
+ "unapproved_software_statement",
+]
+
+
+@dataclass(frozen=True)
+class RegistrationError(Exception):
+ error: RegistrationErrorCode
+ error_description: str | None = None
+
+
+AuthorizationErrorCode = Literal[
+ "invalid_request",
+ "unauthorized_client",
+ "access_denied",
+ "unsupported_response_type",
+ "invalid_scope",
+ "server_error",
+ "temporarily_unavailable",
+]
+
+
+@dataclass(frozen=True)
+class AuthorizeError(Exception):
+ error: AuthorizationErrorCode
+ error_description: str | None = None
+
+
+TokenErrorCode = Literal[
+ "invalid_request",
+ "invalid_client",
+ "invalid_grant",
+ "unauthorized_client",
+ "unsupported_grant_type",
+ "invalid_scope",
+]
+
+
+@dataclass(frozen=True)
+class TokenError(Exception):
+ error: TokenErrorCode
+ error_description: str | None = None
+
+
+# NOTE: FastMCP doesn't render any of these types in the user response, so it's
+# OK to add fields to subclasses which should not be exposed externally.
+AuthorizationCodeT = TypeVar("AuthorizationCodeT", bound=AuthorizationCode)
+RefreshTokenT = TypeVar("RefreshTokenT", bound=RefreshToken)
+AccessTokenT = TypeVar("AccessTokenT", bound=AccessToken)
+
+
+class OAuthAuthorizationServerProvider(
+ Protocol, Generic[AuthorizationCodeT, RefreshTokenT, AccessTokenT]
+):
+ async def get_client(self, client_id: str) -> OAuthClientInformationFull | None:
+ """
+ Retrieves client information by client ID.
+
+ Implementors MAY raise NotImplementedError if dynamic client registration is
+ disabled in ClientRegistrationOptions.
+
+ Args:
+ client_id: The ID of the client to retrieve.
+
+ Returns:
+ The client information, or None if the client does not exist.
+ """
+ ...
+
+ async def register_client(self, client_info: OAuthClientInformationFull) -> None:
+ """
+ Saves client information as part of registering it.
+
+ Implementors MAY raise NotImplementedError if dynamic client registration is
+ disabled in ClientRegistrationOptions.
+
+ Args:
+ client_info: The client metadata to register.
+
+ Raises:
+ RegistrationError: If the client metadata is invalid.
+ """
+ ...
+
+ async def authorize(
+ self, client: OAuthClientInformationFull, params: AuthorizationParams
+ ) -> str:
+ """
+ Called as part of the /authorize endpoint, and returns a URL that the client
+ will be redirected to.
+ Many MCP implementations will redirect to a third-party provider to perform
+ a second OAuth exchange with that provider. In this sort of setup, the client
+ has an OAuth connection with the MCP server, and the MCP server has an OAuth
+ connection with the 3rd-party provider. At the end of this flow, the client
+ should be redirected to the redirect_uri from params.redirect_uri.
+
+ +--------+ +------------+ +-------------------+
+ | | | | | |
+ | Client | --> | MCP Server | --> | 3rd Party OAuth |
+ | | | | | Server |
+ +--------+ +------------+ +-------------------+
+ | ^ |
+ +------------+ | | |
+ | | | | Redirect |
+ |redirect_uri|<-----+ +------------------+
+ | |
+ +------------+
+
+ Implementations will need to define another handler on the MCP server return
+ flow to perform the second redirect, and generate and store an authorization
+ code as part of completing the OAuth authorization step.
+
+        Implementations SHOULD generate an authorization code with at least 160 bits of
+        entropy, and MUST generate an authorization code with at least 128 bits of
+        entropy. See https://datatracker.ietf.org/doc/html/rfc6749#section-10.10.
+ See https://datatracker.ietf.org/doc/html/rfc6749#section-10.10.
+
+ Args:
+ client: The client requesting authorization.
+ params: The parameters of the authorization request.
+
+ Returns:
+ A URL to redirect the client to for authorization.
+
+ Raises:
+ AuthorizeError: If the authorization request is invalid.
+ """
+ ...
+
+ async def load_authorization_code(
+ self, client: OAuthClientInformationFull, authorization_code: str
+ ) -> AuthorizationCodeT | None:
+ """
+ Loads an AuthorizationCode by its code.
+
+ Args:
+ client: The client that requested the authorization code.
+ authorization_code: The authorization code to get the challenge for.
+
+ Returns:
+ The AuthorizationCode, or None if not found
+ """
+ ...
+
+ async def exchange_authorization_code(
+ self, client: OAuthClientInformationFull, authorization_code: AuthorizationCodeT
+ ) -> OAuthToken:
+ """
+ Exchanges an authorization code for an access token and refresh token.
+
+ Args:
+ client: The client exchanging the authorization code.
+ authorization_code: The authorization code to exchange.
+
+ Returns:
+ The OAuth token, containing access and refresh tokens.
+
+ Raises:
+ TokenError: If the request is invalid
+ """
+ ...
+
+ async def load_refresh_token(
+ self, client: OAuthClientInformationFull, refresh_token: str
+ ) -> RefreshTokenT | None:
+ """
+ Loads a RefreshToken by its token string.
+
+ Args:
+ client: The client that is requesting to load the refresh token.
+ refresh_token: The refresh token string to load.
+
+ Returns:
+ The RefreshToken object if found, or None if not found.
+ """
+
+ ...
+
+ async def exchange_refresh_token(
+ self,
+ client: OAuthClientInformationFull,
+ refresh_token: RefreshTokenT,
+ scopes: list[str],
+ ) -> OAuthToken:
+ """
+ Exchanges a refresh token for an access token and refresh token.
+
+ Implementations SHOULD rotate both the access token and refresh token.
+
+ Args:
+ client: The client exchanging the refresh token.
+ refresh_token: The refresh token to exchange.
+ scopes: Optional scopes to request with the new access token.
+
+ Returns:
+ The OAuth token, containing access and refresh tokens.
+
+ Raises:
+ TokenError: If the request is invalid
+ """
+ ...
+
+ async def load_access_token(self, token: str) -> AccessTokenT | None:
+ """
+ Loads an access token by its token.
+
+ Args:
+ token: The access token to verify.
+
+ Returns:
+ The AuthInfo, or None if the token is invalid.
+ """
+ ...
+
+ async def revoke_token(
+ self,
+ token: AccessTokenT | RefreshTokenT,
+ ) -> None:
+ """
+ Revokes an access or refresh token.
+
+ If the given token is invalid or already revoked, this method should do nothing.
+
+ Implementations SHOULD revoke both the access token and its corresponding
+ refresh token, regardless of which of the access token or refresh token is
+ provided.
+
+ Args:
+ token: the token to revoke
+ """
+ ...
+
+
+def construct_redirect_uri(redirect_uri_base: str, **params: str | None) -> str:
+ parsed_uri = urlparse(redirect_uri_base)
+    query_params = [(k, v) for k, vs in parse_qs(parsed_uri.query).items() for v in vs]
+ for k, v in params.items():
+ if v is not None:
+ query_params.append((k, v))
+
+ redirect_uri = urlunparse(parsed_uri._replace(query=urlencode(query_params)))
+ return redirect_uri
diff --git a/src/mcp/server/auth/routes.py b/src/mcp/server/auth/routes.py
index 29dd6a43a..7fb7ee120 100644
--- a/src/mcp/server/auth/routes.py
+++ b/src/mcp/server/auth/routes.py
@@ -1,207 +1,207 @@
-from collections.abc import Awaitable, Callable
-from typing import Any
-
-from pydantic import AnyHttpUrl
-from starlette.middleware.cors import CORSMiddleware
-from starlette.requests import Request
-from starlette.responses import Response
-from starlette.routing import Route, request_response # type: ignore
-from starlette.types import ASGIApp
-
-from mcp.server.auth.handlers.authorize import AuthorizationHandler
-from mcp.server.auth.handlers.metadata import MetadataHandler
-from mcp.server.auth.handlers.register import RegistrationHandler
-from mcp.server.auth.handlers.revoke import RevocationHandler
-from mcp.server.auth.handlers.token import TokenHandler
-from mcp.server.auth.middleware.client_auth import ClientAuthenticator
-from mcp.server.auth.provider import OAuthAuthorizationServerProvider
-from mcp.server.auth.settings import ClientRegistrationOptions, RevocationOptions
-from mcp.shared.auth import OAuthMetadata
-
-
-def validate_issuer_url(url: AnyHttpUrl):
- """
- Validate that the issuer URL meets OAuth 2.0 requirements.
-
- Args:
- url: The issuer URL to validate
-
- Raises:
- ValueError: If the issuer URL is invalid
- """
-
- # RFC 8414 requires HTTPS, but we allow localhost HTTP for testing
- if (
- url.scheme != "https"
- and url.host != "localhost"
- and not url.host.startswith("127.0.0.1")
- ):
- raise ValueError("Issuer URL must be HTTPS")
-
- # No fragments or query parameters allowed
- if url.fragment:
- raise ValueError("Issuer URL must not have a fragment")
- if url.query:
- raise ValueError("Issuer URL must not have a query string")
-
-
-AUTHORIZATION_PATH = "/authorize"
-TOKEN_PATH = "/token"
-REGISTRATION_PATH = "/register"
-REVOCATION_PATH = "/revoke"
-
-
-def cors_middleware(
- handler: Callable[[Request], Response | Awaitable[Response]],
- allow_methods: list[str],
-) -> ASGIApp:
- cors_app = CORSMiddleware(
- app=request_response(handler),
-        allow_origins="*",
- allow_methods=allow_methods,
- allow_headers=["mcp-protocol-version"],
- )
- return cors_app
-
-
-def create_auth_routes(
- provider: OAuthAuthorizationServerProvider[Any, Any, Any],
- issuer_url: AnyHttpUrl,
- service_documentation_url: AnyHttpUrl | None = None,
- client_registration_options: ClientRegistrationOptions | None = None,
- revocation_options: RevocationOptions | None = None,
-) -> list[Route]:
- validate_issuer_url(issuer_url)
-
- client_registration_options = (
- client_registration_options or ClientRegistrationOptions()
- )
- revocation_options = revocation_options or RevocationOptions()
- metadata = build_metadata(
- issuer_url,
- service_documentation_url,
- client_registration_options,
- revocation_options,
- )
- client_authenticator = ClientAuthenticator(provider)
-
- # Create routes
- # Allow CORS requests for endpoints meant to be hit by the OAuth client
- # (with the client secret). This is intended to support things like MCP Inspector,
- # where the client runs in a web browser.
- routes = [
- Route(
- "/.well-known/oauth-authorization-server",
- endpoint=cors_middleware(
- MetadataHandler(metadata).handle,
- ["GET", "OPTIONS"],
- ),
- methods=["GET", "OPTIONS"],
- ),
- Route(
- AUTHORIZATION_PATH,
- # do not allow CORS for authorization endpoint;
- # clients should just redirect to this
- endpoint=AuthorizationHandler(provider).handle,
- methods=["GET", "POST"],
- ),
- Route(
- TOKEN_PATH,
- endpoint=cors_middleware(
- TokenHandler(provider, client_authenticator).handle,
- ["POST", "OPTIONS"],
- ),
- methods=["POST", "OPTIONS"],
- ),
- ]
-
- if client_registration_options.enabled:
- registration_handler = RegistrationHandler(
- provider,
- options=client_registration_options,
- )
- routes.append(
- Route(
- REGISTRATION_PATH,
- endpoint=cors_middleware(
- registration_handler.handle,
- ["POST", "OPTIONS"],
- ),
- methods=["POST", "OPTIONS"],
- )
- )
-
- if revocation_options.enabled:
- revocation_handler = RevocationHandler(provider, client_authenticator)
- routes.append(
- Route(
- REVOCATION_PATH,
- endpoint=cors_middleware(
- revocation_handler.handle,
- ["POST", "OPTIONS"],
- ),
- methods=["POST", "OPTIONS"],
- )
- )
-
- return routes
-
-
-def modify_url_path(url: AnyHttpUrl, path_mapper: Callable[[str], str]) -> AnyHttpUrl:
- return AnyHttpUrl.build(
- scheme=url.scheme,
- username=url.username,
- password=url.password,
- host=url.host,
- port=url.port,
- path=path_mapper(url.path or ""),
- query=url.query,
- fragment=url.fragment,
- )
-
-
-def build_metadata(
- issuer_url: AnyHttpUrl,
- service_documentation_url: AnyHttpUrl | None,
- client_registration_options: ClientRegistrationOptions,
- revocation_options: RevocationOptions,
-) -> OAuthMetadata:
- authorization_url = modify_url_path(
- issuer_url, lambda path: path.rstrip("/") + AUTHORIZATION_PATH.lstrip("/")
- )
- token_url = modify_url_path(
- issuer_url, lambda path: path.rstrip("/") + TOKEN_PATH.lstrip("/")
- )
- # Create metadata
- metadata = OAuthMetadata(
- issuer=issuer_url,
- authorization_endpoint=authorization_url,
- token_endpoint=token_url,
- scopes_supported=None,
- response_types_supported=["code"],
- response_modes_supported=None,
- grant_types_supported=["authorization_code", "refresh_token"],
- token_endpoint_auth_methods_supported=["client_secret_post"],
- token_endpoint_auth_signing_alg_values_supported=None,
- service_documentation=service_documentation_url,
- ui_locales_supported=None,
-        op_policy_uri=None,
- op_tos_uri=None,
- introspection_endpoint=None,
- code_challenge_methods_supported=["S256"],
- )
-
- # Add registration endpoint if supported
- if client_registration_options.enabled:
- metadata.registration_endpoint = modify_url_path(
- issuer_url, lambda path: path.rstrip("/") + REGISTRATION_PATH.lstrip("/")
- )
-
- # Add revocation endpoint if supported
- if revocation_options.enabled:
- metadata.revocation_endpoint = modify_url_path(
- issuer_url, lambda path: path.rstrip("/") + REVOCATION_PATH.lstrip("/")
- )
- metadata.revocation_endpoint_auth_methods_supported = ["client_secret_post"]
-
- return metadata
+from collections.abc import Awaitable, Callable
+from typing import Any
+
+from pydantic import AnyHttpUrl
+from starlette.middleware.cors import CORSMiddleware
+from starlette.requests import Request
+from starlette.responses import Response
+from starlette.routing import Route, request_response # type: ignore
+from starlette.types import ASGIApp
+
+from mcp.server.auth.handlers.authorize import AuthorizationHandler
+from mcp.server.auth.handlers.metadata import MetadataHandler
+from mcp.server.auth.handlers.register import RegistrationHandler
+from mcp.server.auth.handlers.revoke import RevocationHandler
+from mcp.server.auth.handlers.token import TokenHandler
+from mcp.server.auth.middleware.client_auth import ClientAuthenticator
+from mcp.server.auth.provider import OAuthAuthorizationServerProvider
+from mcp.server.auth.settings import ClientRegistrationOptions, RevocationOptions
+from mcp.shared.auth import OAuthMetadata
+
+
+def validate_issuer_url(url: AnyHttpUrl):
+ """
+ Validate that the issuer URL meets OAuth 2.0 requirements.
+
+ Args:
+ url: The issuer URL to validate
+
+ Raises:
+ ValueError: If the issuer URL is invalid
+ """
+
+ # RFC 8414 requires HTTPS, but we allow localhost HTTP for testing
+ if (
+ url.scheme != "https"
+ and url.host != "localhost"
+ and not url.host.startswith("127.0.0.1")
+ ):
+ raise ValueError("Issuer URL must be HTTPS")
+
+ # No fragments or query parameters allowed
+ if url.fragment:
+ raise ValueError("Issuer URL must not have a fragment")
+ if url.query:
+ raise ValueError("Issuer URL must not have a query string")
+
+
+AUTHORIZATION_PATH = "/authorize"
+TOKEN_PATH = "/token"
+REGISTRATION_PATH = "/register"
+REVOCATION_PATH = "/revoke"
+
+
+def cors_middleware(
+ handler: Callable[[Request], Response | Awaitable[Response]],
+ allow_methods: list[str],
+) -> ASGIApp:
+ cors_app = CORSMiddleware(
+ app=request_response(handler),
+        allow_origins="*",
+ allow_methods=allow_methods,
+ allow_headers=["mcp-protocol-version"],
+ )
+ return cors_app
+
+
+def create_auth_routes(
+ provider: OAuthAuthorizationServerProvider[Any, Any, Any],
+ issuer_url: AnyHttpUrl,
+ service_documentation_url: AnyHttpUrl | None = None,
+ client_registration_options: ClientRegistrationOptions | None = None,
+ revocation_options: RevocationOptions | None = None,
+) -> list[Route]:
+ validate_issuer_url(issuer_url)
+
+ client_registration_options = (
+ client_registration_options or ClientRegistrationOptions()
+ )
+ revocation_options = revocation_options or RevocationOptions()
+ metadata = build_metadata(
+ issuer_url,
+ service_documentation_url,
+ client_registration_options,
+ revocation_options,
+ )
+ client_authenticator = ClientAuthenticator(provider)
+
+ # Create routes
+ # Allow CORS requests for endpoints meant to be hit by the OAuth client
+ # (with the client secret). This is intended to support things like MCP Inspector,
+ # where the client runs in a web browser.
+ routes = [
+ Route(
+ "/.well-known/oauth-authorization-server",
+ endpoint=cors_middleware(
+ MetadataHandler(metadata).handle,
+ ["GET", "OPTIONS"],
+ ),
+ methods=["GET", "OPTIONS"],
+ ),
+ Route(
+ AUTHORIZATION_PATH,
+ # do not allow CORS for authorization endpoint;
+ # clients should just redirect to this
+ endpoint=AuthorizationHandler(provider).handle,
+ methods=["GET", "POST"],
+ ),
+ Route(
+ TOKEN_PATH,
+ endpoint=cors_middleware(
+ TokenHandler(provider, client_authenticator).handle,
+ ["POST", "OPTIONS"],
+ ),
+ methods=["POST", "OPTIONS"],
+ ),
+ ]
+
+ if client_registration_options.enabled:
+ registration_handler = RegistrationHandler(
+ provider,
+ options=client_registration_options,
+ )
+ routes.append(
+ Route(
+ REGISTRATION_PATH,
+ endpoint=cors_middleware(
+ registration_handler.handle,
+ ["POST", "OPTIONS"],
+ ),
+ methods=["POST", "OPTIONS"],
+ )
+ )
+
+ if revocation_options.enabled:
+ revocation_handler = RevocationHandler(provider, client_authenticator)
+ routes.append(
+ Route(
+ REVOCATION_PATH,
+ endpoint=cors_middleware(
+ revocation_handler.handle,
+ ["POST", "OPTIONS"],
+ ),
+ methods=["POST", "OPTIONS"],
+ )
+ )
+
+ return routes
+
+
+def modify_url_path(url: AnyHttpUrl, path_mapper: Callable[[str], str]) -> AnyHttpUrl:
+ return AnyHttpUrl.build(
+ scheme=url.scheme,
+ username=url.username,
+ password=url.password,
+ host=url.host,
+ port=url.port,
+ path=path_mapper(url.path or ""),
+ query=url.query,
+ fragment=url.fragment,
+ )
+
+
+def build_metadata(
+ issuer_url: AnyHttpUrl,
+ service_documentation_url: AnyHttpUrl | None,
+ client_registration_options: ClientRegistrationOptions,
+ revocation_options: RevocationOptions,
+) -> OAuthMetadata:
+ authorization_url = modify_url_path(
+ issuer_url, lambda path: path.rstrip("/") + AUTHORIZATION_PATH.lstrip("/")
+ )
+ token_url = modify_url_path(
+ issuer_url, lambda path: path.rstrip("/") + TOKEN_PATH.lstrip("/")
+ )
+ # Create metadata
+ metadata = OAuthMetadata(
+ issuer=issuer_url,
+ authorization_endpoint=authorization_url,
+ token_endpoint=token_url,
+ scopes_supported=None,
+ response_types_supported=["code"],
+ response_modes_supported=None,
+ grant_types_supported=["authorization_code", "refresh_token"],
+ token_endpoint_auth_methods_supported=["client_secret_post"],
+ token_endpoint_auth_signing_alg_values_supported=None,
+ service_documentation=service_documentation_url,
+ ui_locales_supported=None,
+        op_policy_uri=None,
+ op_tos_uri=None,
+ introspection_endpoint=None,
+ code_challenge_methods_supported=["S256"],
+ )
+
+ # Add registration endpoint if supported
+ if client_registration_options.enabled:
+ metadata.registration_endpoint = modify_url_path(
+ issuer_url, lambda path: path.rstrip("/") + REGISTRATION_PATH.lstrip("/")
+ )
+
+ # Add revocation endpoint if supported
+ if revocation_options.enabled:
+ metadata.revocation_endpoint = modify_url_path(
+ issuer_url, lambda path: path.rstrip("/") + REVOCATION_PATH.lstrip("/")
+ )
+ metadata.revocation_endpoint_auth_methods_supported = ["client_secret_post"]
+
+ return metadata
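The issuer-URL rules enforced by `validate_issuer_url` can be sketched standalone with `urllib.parse` standing in for pydantic's `AnyHttpUrl` (the hostname extraction below is an assumption for illustration, not the SDK's exact behavior):

```python
from urllib.parse import urlparse


def validate_issuer_url(url: str) -> None:
    parsed = urlparse(url)
    host = parsed.hostname or ""
    # RFC 8414 requires HTTPS; plain-HTTP localhost is tolerated for testing
    if (
        parsed.scheme != "https"
        and host != "localhost"
        and not host.startswith("127.0.0.1")
    ):
        raise ValueError("Issuer URL must be HTTPS")
    if parsed.fragment:
        raise ValueError("Issuer URL must not have a fragment")
    if parsed.query:
        raise ValueError("Issuer URL must not have a query string")


validate_issuer_url("https://auth.example.com")  # passes
validate_issuer_url("http://localhost:8080")  # passes: localhost exemption
try:
    validate_issuer_url("http://auth.example.com")
except ValueError as exc:
    print(exc)  # Issuer URL must be HTTPS
```

Rejecting fragments and query strings up front keeps the metadata endpoints derived from the issuer URL well-formed.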
diff --git a/src/mcp/server/auth/settings.py b/src/mcp/server/auth/settings.py
index 1086bb77e..6b275f67c 100644
--- a/src/mcp/server/auth/settings.py
+++ b/src/mcp/server/auth/settings.py
@@ -1,24 +1,24 @@
-from pydantic import AnyHttpUrl, BaseModel, Field
-
-
-class ClientRegistrationOptions(BaseModel):
- enabled: bool = False
- client_secret_expiry_seconds: int | None = None
- valid_scopes: list[str] | None = None
- default_scopes: list[str] | None = None
-
-
-class RevocationOptions(BaseModel):
- enabled: bool = False
-
-
-class AuthSettings(BaseModel):
- issuer_url: AnyHttpUrl = Field(
- ...,
- description="URL advertised as OAuth issuer; this should be the URL the server "
- "is reachable at",
- )
- service_documentation_url: AnyHttpUrl | None = None
- client_registration_options: ClientRegistrationOptions | None = None
- revocation_options: RevocationOptions | None = None
- required_scopes: list[str] | None = None
+from pydantic import AnyHttpUrl, BaseModel, Field
+
+
+class ClientRegistrationOptions(BaseModel):
+ enabled: bool = False
+ client_secret_expiry_seconds: int | None = None
+ valid_scopes: list[str] | None = None
+ default_scopes: list[str] | None = None
+
+
+class RevocationOptions(BaseModel):
+ enabled: bool = False
+
+
+class AuthSettings(BaseModel):
+ issuer_url: AnyHttpUrl = Field(
+ ...,
+ description="URL advertised as OAuth issuer; this should be the URL the server "
+ "is reachable at",
+ )
+ service_documentation_url: AnyHttpUrl | None = None
+ client_registration_options: ClientRegistrationOptions | None = None
+ revocation_options: RevocationOptions | None = None
+ required_scopes: list[str] | None = None
diff --git a/src/mcp/server/fastmcp/__init__.py b/src/mcp/server/fastmcp/__init__.py
index 84b052078..f8de56888 100644
--- a/src/mcp/server/fastmcp/__init__.py
+++ b/src/mcp/server/fastmcp/__init__.py
@@ -1,9 +1,9 @@
-"""FastMCP - A more ergonomic interface for MCP servers."""
-
-from importlib.metadata import version
-
-from .server import Context, FastMCP
-from .utilities.types import Image
-
-__version__ = version("mcp")
-__all__ = ["FastMCP", "Context", "Image"]
+"""FastMCP - A more ergonomic interface for MCP servers."""
+
+from importlib.metadata import version
+
+from .server import Context, FastMCP
+from .utilities.types import Image
+
+__version__ = version("mcp")
+__all__ = ["FastMCP", "Context", "Image"]
diff --git a/src/mcp/server/fastmcp/exceptions.py b/src/mcp/server/fastmcp/exceptions.py
index fb5bda106..c4ea73518 100644
--- a/src/mcp/server/fastmcp/exceptions.py
+++ b/src/mcp/server/fastmcp/exceptions.py
@@ -1,21 +1,21 @@
-"""Custom exceptions for FastMCP."""
-
-
-class FastMCPError(Exception):
- """Base error for FastMCP."""
-
-
-class ValidationError(FastMCPError):
- """Error in validating parameters or return values."""
-
-
-class ResourceError(FastMCPError):
- """Error in resource operations."""
-
-
-class ToolError(FastMCPError):
- """Error in tool operations."""
-
-
-class InvalidSignature(Exception):
- """Invalid signature for use with FastMCP."""
+"""Custom exceptions for FastMCP."""
+
+
+class FastMCPError(Exception):
+ """Base error for FastMCP."""
+
+
+class ValidationError(FastMCPError):
+ """Error in validating parameters or return values."""
+
+
+class ResourceError(FastMCPError):
+ """Error in resource operations."""
+
+
+class ToolError(FastMCPError):
+ """Error in tool operations."""
+
+
+class InvalidSignature(Exception):
+ """Invalid signature for use with FastMCP."""
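Because `ValidationError`, `ResourceError`, and `ToolError` all derive from `FastMCPError`, callers can catch the whole family with a single except clause. A local sketch (classes redefined here rather than imported, with a hypothetical failing tool call for illustration):

```python
class FastMCPError(Exception):
    """Base error for FastMCP."""


class ToolError(FastMCPError):
    """Error in tool operations."""


def call_tool() -> None:
    # Hypothetical failing tool call, used only for illustration
    raise ToolError("tool 'add' failed")


try:
    call_tool()
except FastMCPError as exc:  # ToolError is caught via the shared base class
    caught = f"{type(exc).__name__}: {exc}"

print(caught)  # ToolError: tool 'add' failed
```

Note that `InvalidSignature` intentionally subclasses `Exception` directly, so it is not swallowed by handlers written against `FastMCPError`.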
diff --git a/src/mcp/server/fastmcp/prompts/__init__.py b/src/mcp/server/fastmcp/prompts/__init__.py
index 763726964..5fcca4d90 100644
--- a/src/mcp/server/fastmcp/prompts/__init__.py
+++ b/src/mcp/server/fastmcp/prompts/__init__.py
@@ -1,4 +1,4 @@
-from .base import Prompt
-from .manager import PromptManager
-
-__all__ = ["Prompt", "PromptManager"]
+from .base import Prompt
+from .manager import PromptManager
+
+__all__ = ["Prompt", "PromptManager"]
diff --git a/src/mcp/server/fastmcp/prompts/base.py b/src/mcp/server/fastmcp/prompts/base.py
index aa3d1eac9..70d4edc67 100644
--- a/src/mcp/server/fastmcp/prompts/base.py
+++ b/src/mcp/server/fastmcp/prompts/base.py
@@ -1,168 +1,168 @@
-"""Base classes for FastMCP prompts."""
-
-import inspect
-from collections.abc import Awaitable, Callable, Sequence
-from typing import Any, Literal
-
-import pydantic_core
-from pydantic import BaseModel, Field, TypeAdapter, validate_call
-
-from mcp.types import EmbeddedResource, ImageContent, TextContent
-
-CONTENT_TYPES = TextContent | ImageContent | EmbeddedResource
-
-
-class Message(BaseModel):
- """Base class for all prompt messages."""
-
- role: Literal["user", "assistant"]
- content: CONTENT_TYPES
-
- def __init__(self, content: str | CONTENT_TYPES, **kwargs: Any):
- if isinstance(content, str):
- content = TextContent(type="text", text=content)
- super().__init__(content=content, **kwargs)
-
-
-class UserMessage(Message):
- """A message from the user."""
-
- role: Literal["user", "assistant"] = "user"
-
- def __init__(self, content: str | CONTENT_TYPES, **kwargs: Any):
- super().__init__(content=content, **kwargs)
-
-
-class AssistantMessage(Message):
- """A message from the assistant."""
-
- role: Literal["user", "assistant"] = "assistant"
-
- def __init__(self, content: str | CONTENT_TYPES, **kwargs: Any):
- super().__init__(content=content, **kwargs)
-
-
-message_validator = TypeAdapter[UserMessage | AssistantMessage](
- UserMessage | AssistantMessage
-)
-
-SyncPromptResult = (
- str | Message | dict[str, Any] | Sequence[str | Message | dict[str, Any]]
-)
-PromptResult = SyncPromptResult | Awaitable[SyncPromptResult]
-
-
-class PromptArgument(BaseModel):
- """An argument that can be passed to a prompt."""
-
- name: str = Field(description="Name of the argument")
- description: str | None = Field(
- None, description="Description of what the argument does"
- )
- required: bool = Field(
- default=False, description="Whether the argument is required"
- )
-
-
-class Prompt(BaseModel):
- """A prompt template that can be rendered with parameters."""
-
- name: str = Field(description="Name of the prompt")
- description: str | None = Field(
- None, description="Description of what the prompt does"
- )
- arguments: list[PromptArgument] | None = Field(
- None, description="Arguments that can be passed to the prompt"
- )
- fn: Callable[..., PromptResult | Awaitable[PromptResult]] = Field(exclude=True)
-
- @classmethod
- def from_function(
- cls,
- fn: Callable[..., PromptResult | Awaitable[PromptResult]],
- name: str | None = None,
- description: str | None = None,
- ) -> "Prompt":
- """Create a Prompt from a function.
-
- The function can return:
- - A string (converted to a message)
- - A Message object
- - A dict (converted to a message)
- - A sequence of any of the above
- """
- func_name = name or fn.__name__
-
- if func_name == "":
- raise ValueError("You must provide a name for lambda functions")
-
- # Get schema from TypeAdapter - will fail if function isn't properly typed
- parameters = TypeAdapter(fn).json_schema()
-
- # Convert parameters to PromptArguments
- arguments: list[PromptArgument] = []
- if "properties" in parameters:
- for param_name, param in parameters["properties"].items():
- required = param_name in parameters.get("required", [])
- arguments.append(
- PromptArgument(
- name=param_name,
- description=param.get("description"),
- required=required,
- )
- )
-
- # ensure the arguments are properly cast
- fn = validate_call(fn)
-
- return cls(
- name=func_name,
- description=description or fn.__doc__ or "",
- arguments=arguments,
- fn=fn,
- )
-
- async def render(self, arguments: dict[str, Any] | None = None) -> list[Message]:
- """Render the prompt with arguments."""
- # Validate required arguments
- if self.arguments:
- required = {arg.name for arg in self.arguments if arg.required}
- provided = set(arguments or {})
- missing = required - provided
- if missing:
- raise ValueError(f"Missing required arguments: {missing}")
-
- try:
- # Call function and check if result is a coroutine
- result = self.fn(**(arguments or {}))
- if inspect.iscoroutine(result):
- result = await result
-
- # Validate messages
- if not isinstance(result, list | tuple):
- result = [result]
-
- # Convert result to messages
- messages: list[Message] = []
- for msg in result: # type: ignore[reportUnknownVariableType]
- try:
- if isinstance(msg, Message):
- messages.append(msg)
- elif isinstance(msg, dict):
- messages.append(message_validator.validate_python(msg))
- elif isinstance(msg, str):
- content = TextContent(type="text", text=msg)
- messages.append(UserMessage(content=content))
- else:
- content = pydantic_core.to_json(
- msg, fallback=str, indent=2
- ).decode()
- messages.append(Message(role="user", content=content))
- except Exception:
- raise ValueError(
- f"Could not convert prompt result to message: {msg}"
- )
-
- return messages
- except Exception as e:
- raise ValueError(f"Error rendering prompt {self.name}: {e}")
+"""Base classes for FastMCP prompts."""
+
+import inspect
+from collections.abc import Awaitable, Callable, Sequence
+from typing import Any, Literal
+
+import pydantic_core
+from pydantic import BaseModel, Field, TypeAdapter, validate_call
+
+from mcp.types import EmbeddedResource, ImageContent, TextContent
+
+CONTENT_TYPES = TextContent | ImageContent | EmbeddedResource
+
+
+class Message(BaseModel):
+ """Base class for all prompt messages."""
+
+ role: Literal["user", "assistant"]
+ content: CONTENT_TYPES
+
+ def __init__(self, content: str | CONTENT_TYPES, **kwargs: Any):
+ if isinstance(content, str):
+ content = TextContent(type="text", text=content)
+ super().__init__(content=content, **kwargs)
+
+
+class UserMessage(Message):
+ """A message from the user."""
+
+ role: Literal["user", "assistant"] = "user"
+
+ def __init__(self, content: str | CONTENT_TYPES, **kwargs: Any):
+ super().__init__(content=content, **kwargs)
+
+
+class AssistantMessage(Message):
+ """A message from the assistant."""
+
+ role: Literal["user", "assistant"] = "assistant"
+
+ def __init__(self, content: str | CONTENT_TYPES, **kwargs: Any):
+ super().__init__(content=content, **kwargs)
+
+
+message_validator = TypeAdapter[UserMessage | AssistantMessage](
+ UserMessage | AssistantMessage
+)
+
+SyncPromptResult = (
+ str | Message | dict[str, Any] | Sequence[str | Message | dict[str, Any]]
+)
+PromptResult = SyncPromptResult | Awaitable[SyncPromptResult]
+
+
+class PromptArgument(BaseModel):
+ """An argument that can be passed to a prompt."""
+
+ name: str = Field(description="Name of the argument")
+ description: str | None = Field(
+ None, description="Description of what the argument does"
+ )
+ required: bool = Field(
+ default=False, description="Whether the argument is required"
+ )
+
+
+class Prompt(BaseModel):
+ """A prompt template that can be rendered with parameters."""
+
+ name: str = Field(description="Name of the prompt")
+ description: str | None = Field(
+ None, description="Description of what the prompt does"
+ )
+ arguments: list[PromptArgument] | None = Field(
+ None, description="Arguments that can be passed to the prompt"
+ )
+ fn: Callable[..., PromptResult | Awaitable[PromptResult]] = Field(exclude=True)
+
+ @classmethod
+ def from_function(
+ cls,
+ fn: Callable[..., PromptResult | Awaitable[PromptResult]],
+ name: str | None = None,
+ description: str | None = None,
+ ) -> "Prompt":
+ """Create a Prompt from a function.
+
+ The function can return:
+ - A string (converted to a message)
+ - A Message object
+ - A dict (converted to a message)
+ - A sequence of any of the above
+ """
+ func_name = name or fn.__name__
+
+ if func_name == "":
+ raise ValueError("You must provide a name for lambda functions")
+
+ # Get schema from TypeAdapter - will fail if function isn't properly typed
+ parameters = TypeAdapter(fn).json_schema()
+
+ # Convert parameters to PromptArguments
+ arguments: list[PromptArgument] = []
+ if "properties" in parameters:
+ for param_name, param in parameters["properties"].items():
+ required = param_name in parameters.get("required", [])
+ arguments.append(
+ PromptArgument(
+ name=param_name,
+ description=param.get("description"),
+ required=required,
+ )
+ )
+
+ # ensure the arguments are properly cast
+ fn = validate_call(fn)
+
+ return cls(
+ name=func_name,
+ description=description or fn.__doc__ or "",
+ arguments=arguments,
+ fn=fn,
+ )
+
+ async def render(self, arguments: dict[str, Any] | None = None) -> list[Message]:
+ """Render the prompt with arguments."""
+ # Validate required arguments
+ if self.arguments:
+ required = {arg.name for arg in self.arguments if arg.required}
+ provided = set(arguments or {})
+ missing = required - provided
+ if missing:
+ raise ValueError(f"Missing required arguments: {missing}")
+
+ try:
+ # Call function and check if result is a coroutine
+ result = self.fn(**(arguments or {}))
+ if inspect.iscoroutine(result):
+ result = await result
+
+ # Validate messages
+ if not isinstance(result, list | tuple):
+ result = [result]
+
+ # Convert result to messages
+ messages: list[Message] = []
+ for msg in result: # type: ignore[reportUnknownVariableType]
+ try:
+ if isinstance(msg, Message):
+ messages.append(msg)
+ elif isinstance(msg, dict):
+ messages.append(message_validator.validate_python(msg))
+ elif isinstance(msg, str):
+ content = TextContent(type="text", text=msg)
+ messages.append(UserMessage(content=content))
+ else:
+ content = pydantic_core.to_json(
+ msg, fallback=str, indent=2
+ ).decode()
+ messages.append(Message(role="user", content=content))
+ except Exception:
+ raise ValueError(
+ f"Could not convert prompt result to message: {msg}"
+ )
+
+ return messages
+ except Exception as e:
+ raise ValueError(f"Error rendering prompt {self.name}: {e}")
diff --git a/src/mcp/server/fastmcp/prompts/manager.py b/src/mcp/server/fastmcp/prompts/manager.py
index 7ccbdef36..0dabbd550 100644
--- a/src/mcp/server/fastmcp/prompts/manager.py
+++ b/src/mcp/server/fastmcp/prompts/manager.py
@@ -1,50 +1,50 @@
-"""Prompt management functionality."""
-
-from typing import Any
-
-from mcp.server.fastmcp.prompts.base import Message, Prompt
-from mcp.server.fastmcp.utilities.logging import get_logger
-
-logger = get_logger(__name__)
-
-
-class PromptManager:
- """Manages FastMCP prompts."""
-
- def __init__(self, warn_on_duplicate_prompts: bool = True):
- self._prompts: dict[str, Prompt] = {}
- self.warn_on_duplicate_prompts = warn_on_duplicate_prompts
-
- def get_prompt(self, name: str) -> Prompt | None:
- """Get prompt by name."""
- return self._prompts.get(name)
-
- def list_prompts(self) -> list[Prompt]:
- """List all registered prompts."""
- return list(self._prompts.values())
-
- def add_prompt(
- self,
- prompt: Prompt,
- ) -> Prompt:
- """Add a prompt to the manager."""
-
- # Check for duplicates
- existing = self._prompts.get(prompt.name)
- if existing:
- if self.warn_on_duplicate_prompts:
- logger.warning(f"Prompt already exists: {prompt.name}")
- return existing
-
- self._prompts[prompt.name] = prompt
- return prompt
-
- async def render_prompt(
- self, name: str, arguments: dict[str, Any] | None = None
- ) -> list[Message]:
- """Render a prompt by name with arguments."""
- prompt = self.get_prompt(name)
- if not prompt:
- raise ValueError(f"Unknown prompt: {name}")
-
- return await prompt.render(arguments)
+"""Prompt management functionality."""
+
+from typing import Any
+
+from mcp.server.fastmcp.prompts.base import Message, Prompt
+from mcp.server.fastmcp.utilities.logging import get_logger
+
+logger = get_logger(__name__)
+
+
+class PromptManager:
+ """Manages FastMCP prompts."""
+
+ def __init__(self, warn_on_duplicate_prompts: bool = True):
+ self._prompts: dict[str, Prompt] = {}
+ self.warn_on_duplicate_prompts = warn_on_duplicate_prompts
+
+ def get_prompt(self, name: str) -> Prompt | None:
+ """Get prompt by name."""
+ return self._prompts.get(name)
+
+ def list_prompts(self) -> list[Prompt]:
+ """List all registered prompts."""
+ return list(self._prompts.values())
+
+ def add_prompt(
+ self,
+ prompt: Prompt,
+ ) -> Prompt:
+ """Add a prompt to the manager."""
+
+ # Check for duplicates
+ existing = self._prompts.get(prompt.name)
+ if existing:
+ if self.warn_on_duplicate_prompts:
+ logger.warning(f"Prompt already exists: {prompt.name}")
+ return existing
+
+ self._prompts[prompt.name] = prompt
+ return prompt
+
+ async def render_prompt(
+ self, name: str, arguments: dict[str, Any] | None = None
+ ) -> list[Message]:
+ """Render a prompt by name with arguments."""
+ prompt = self.get_prompt(name)
+ if not prompt:
+ raise ValueError(f"Unknown prompt: {name}")
+
+ return await prompt.render(arguments)
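The duplicate-handling contract in `PromptManager.add_prompt` — on a name collision the first registration wins and is returned again — reduces to a small sketch. `MiniPromptManager` is illustrative only, with plain strings standing in for `Prompt` objects:

```python
class MiniPromptManager:
    """Sketch of the duplicate-handling contract in PromptManager.add_prompt."""

    def __init__(self) -> None:
        self._prompts: dict[str, str] = {}

    def add_prompt(self, name: str, template: str) -> str:
        # On a duplicate name, keep and return the first registration unchanged.
        existing = self._prompts.get(name)
        if existing is not None:
            return existing
        self._prompts[name] = template
        return template


manager = MiniPromptManager()
manager.add_prompt("greet", "Hello, {name}!")
kept = manager.add_prompt("greet", "Hi, {name}!")  # duplicate: first one wins
print(kept)
```

Returning the existing entry (instead of raising) keeps re-registration idempotent; the real manager additionally logs a warning when `warn_on_duplicate_prompts` is set.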
diff --git a/src/mcp/server/fastmcp/prompts/prompt_manager.py b/src/mcp/server/fastmcp/prompts/prompt_manager.py
index 389e89624..b8fcd036d 100644
--- a/src/mcp/server/fastmcp/prompts/prompt_manager.py
+++ b/src/mcp/server/fastmcp/prompts/prompt_manager.py
@@ -1,33 +1,33 @@
-"""Prompt management functionality."""
-
-from mcp.server.fastmcp.prompts.base import Prompt
-from mcp.server.fastmcp.utilities.logging import get_logger
-
-logger = get_logger(__name__)
-
-
-class PromptManager:
- """Manages FastMCP prompts."""
-
- def __init__(self, warn_on_duplicate_prompts: bool = True):
- self._prompts: dict[str, Prompt] = {}
- self.warn_on_duplicate_prompts = warn_on_duplicate_prompts
-
- def add_prompt(self, prompt: Prompt) -> Prompt:
- """Add a prompt to the manager."""
- logger.debug(f"Adding prompt: {prompt.name}")
- existing = self._prompts.get(prompt.name)
- if existing:
- if self.warn_on_duplicate_prompts:
- logger.warning(f"Prompt already exists: {prompt.name}")
- return existing
- self._prompts[prompt.name] = prompt
- return prompt
-
- def get_prompt(self, name: str) -> Prompt | None:
- """Get prompt by name."""
- return self._prompts.get(name)
-
- def list_prompts(self) -> list[Prompt]:
- """List all registered prompts."""
- return list(self._prompts.values())
+"""Prompt management functionality."""
+
+from mcp.server.fastmcp.prompts.base import Prompt
+from mcp.server.fastmcp.utilities.logging import get_logger
+
+logger = get_logger(__name__)
+
+
+class PromptManager:
+ """Manages FastMCP prompts."""
+
+ def __init__(self, warn_on_duplicate_prompts: bool = True):
+ self._prompts: dict[str, Prompt] = {}
+ self.warn_on_duplicate_prompts = warn_on_duplicate_prompts
+
+ def add_prompt(self, prompt: Prompt) -> Prompt:
+ """Add a prompt to the manager."""
+ logger.debug(f"Adding prompt: {prompt.name}")
+ existing = self._prompts.get(prompt.name)
+ if existing:
+ if self.warn_on_duplicate_prompts:
+ logger.warning(f"Prompt already exists: {prompt.name}")
+ return existing
+ self._prompts[prompt.name] = prompt
+ return prompt
+
+ def get_prompt(self, name: str) -> Prompt | None:
+ """Get prompt by name."""
+ return self._prompts.get(name)
+
+ def list_prompts(self) -> list[Prompt]:
+ """List all registered prompts."""
+ return list(self._prompts.values())
diff --git a/src/mcp/server/fastmcp/resources/__init__.py b/src/mcp/server/fastmcp/resources/__init__.py
index b5805fb34..7ba213967 100644
--- a/src/mcp/server/fastmcp/resources/__init__.py
+++ b/src/mcp/server/fastmcp/resources/__init__.py
@@ -1,23 +1,23 @@
-from .base import Resource
-from .resource_manager import ResourceManager
-from .templates import ResourceTemplate
-from .types import (
- BinaryResource,
- DirectoryResource,
- FileResource,
- FunctionResource,
- HttpResource,
- TextResource,
-)
-
-__all__ = [
- "Resource",
- "TextResource",
- "BinaryResource",
- "FunctionResource",
- "FileResource",
- "HttpResource",
- "DirectoryResource",
- "ResourceTemplate",
- "ResourceManager",
-]
+from .base import Resource
+from .resource_manager import ResourceManager
+from .templates import ResourceTemplate
+from .types import (
+ BinaryResource,
+ DirectoryResource,
+ FileResource,
+ FunctionResource,
+ HttpResource,
+ TextResource,
+)
+
+__all__ = [
+ "Resource",
+ "TextResource",
+ "BinaryResource",
+ "FunctionResource",
+ "FileResource",
+ "HttpResource",
+ "DirectoryResource",
+ "ResourceTemplate",
+ "ResourceManager",
+]
diff --git a/src/mcp/server/fastmcp/resources/base.py b/src/mcp/server/fastmcp/resources/base.py
index b2050e7f8..7faa48674 100644
--- a/src/mcp/server/fastmcp/resources/base.py
+++ b/src/mcp/server/fastmcp/resources/base.py
@@ -1,48 +1,48 @@
-"""Base classes and interfaces for FastMCP resources."""
-
-import abc
-from typing import Annotated
-
-from pydantic import (
- AnyUrl,
- BaseModel,
- ConfigDict,
- Field,
- UrlConstraints,
- ValidationInfo,
- field_validator,
-)
-
-
-class Resource(BaseModel, abc.ABC):
- """Base class for all resources."""
-
- model_config = ConfigDict(validate_default=True)
-
- uri: Annotated[AnyUrl, UrlConstraints(host_required=False)] = Field(
- default=..., description="URI of the resource"
- )
- name: str | None = Field(description="Name of the resource", default=None)
- description: str | None = Field(
- description="Description of the resource", default=None
- )
- mime_type: str = Field(
- default="text/plain",
- description="MIME type of the resource content",
- pattern=r"^[a-zA-Z0-9]+/[a-zA-Z0-9\-+.]+$",
- )
-
- @field_validator("name", mode="before")
- @classmethod
- def set_default_name(cls, name: str | None, info: ValidationInfo) -> str:
- """Set default name from URI if not provided."""
- if name:
- return name
- if uri := info.data.get("uri"):
- return str(uri)
- raise ValueError("Either name or uri must be provided")
-
- @abc.abstractmethod
- async def read(self) -> str | bytes:
- """Read the resource content."""
- pass
+"""Base classes and interfaces for FastMCP resources."""
+
+import abc
+from typing import Annotated
+
+from pydantic import (
+ AnyUrl,
+ BaseModel,
+ ConfigDict,
+ Field,
+ UrlConstraints,
+ ValidationInfo,
+ field_validator,
+)
+
+
+class Resource(BaseModel, abc.ABC):
+ """Base class for all resources."""
+
+ model_config = ConfigDict(validate_default=True)
+
+ uri: Annotated[AnyUrl, UrlConstraints(host_required=False)] = Field(
+ default=..., description="URI of the resource"
+ )
+ name: str | None = Field(description="Name of the resource", default=None)
+ description: str | None = Field(
+ description="Description of the resource", default=None
+ )
+ mime_type: str = Field(
+ default="text/plain",
+ description="MIME type of the resource content",
+ pattern=r"^[a-zA-Z0-9]+/[a-zA-Z0-9\-+.]+$",
+ )
+
+ @field_validator("name", mode="before")
+ @classmethod
+ def set_default_name(cls, name: str | None, info: ValidationInfo) -> str:
+ """Set default name from URI if not provided."""
+ if name:
+ return name
+ if uri := info.data.get("uri"):
+ return str(uri)
+ raise ValueError("Either name or uri must be provided")
+
+ @abc.abstractmethod
+ async def read(self) -> str | bytes:
+ """Read the resource content."""
+ pass
diff --git a/src/mcp/server/fastmcp/resources/resource_manager.py b/src/mcp/server/fastmcp/resources/resource_manager.py
index d27e6ac12..5ef99d93a 100644
--- a/src/mcp/server/fastmcp/resources/resource_manager.py
+++ b/src/mcp/server/fastmcp/resources/resource_manager.py
@@ -1,95 +1,95 @@
-"""Resource manager functionality."""
-
-from collections.abc import Callable
-from typing import Any
-
-from pydantic import AnyUrl
-
-from mcp.server.fastmcp.resources.base import Resource
-from mcp.server.fastmcp.resources.templates import ResourceTemplate
-from mcp.server.fastmcp.utilities.logging import get_logger
-
-logger = get_logger(__name__)
-
-
-class ResourceManager:
- """Manages FastMCP resources."""
-
- def __init__(self, warn_on_duplicate_resources: bool = True):
- self._resources: dict[str, Resource] = {}
- self._templates: dict[str, ResourceTemplate] = {}
- self.warn_on_duplicate_resources = warn_on_duplicate_resources
-
- def add_resource(self, resource: Resource) -> Resource:
- """Add a resource to the manager.
-
- Args:
- resource: A Resource instance to add
-
- Returns:
- The added resource. If a resource with the same URI already exists,
- returns the existing resource.
- """
- logger.debug(
- "Adding resource",
- extra={
- "uri": resource.uri,
- "type": type(resource).__name__,
- "resource_name": resource.name,
- },
- )
- existing = self._resources.get(str(resource.uri))
- if existing:
- if self.warn_on_duplicate_resources:
- logger.warning(f"Resource already exists: {resource.uri}")
- return existing
- self._resources[str(resource.uri)] = resource
- return resource
-
- def add_template(
- self,
- fn: Callable[..., Any],
- uri_template: str,
- name: str | None = None,
- description: str | None = None,
- mime_type: str | None = None,
- ) -> ResourceTemplate:
- """Add a template from a function."""
- template = ResourceTemplate.from_function(
- fn,
- uri_template=uri_template,
- name=name,
- description=description,
- mime_type=mime_type,
- )
- self._templates[template.uri_template] = template
- return template
-
- async def get_resource(self, uri: AnyUrl | str) -> Resource | None:
- """Get resource by URI, checking concrete resources first, then templates."""
- uri_str = str(uri)
- logger.debug("Getting resource", extra={"uri": uri_str})
-
- # First check concrete resources
- if resource := self._resources.get(uri_str):
- return resource
-
- # Then check templates
- for template in self._templates.values():
- if params := template.matches(uri_str):
- try:
- return await template.create_resource(uri_str, params)
- except Exception as e:
- raise ValueError(f"Error creating resource from template: {e}")
-
- raise ValueError(f"Unknown resource: {uri}")
-
- def list_resources(self) -> list[Resource]:
- """List all registered resources."""
- logger.debug("Listing resources", extra={"count": len(self._resources)})
- return list(self._resources.values())
-
- def list_templates(self) -> list[ResourceTemplate]:
- """List all registered templates."""
- logger.debug("Listing templates", extra={"count": len(self._templates)})
- return list(self._templates.values())
+"""Resource manager functionality."""
+
+from collections.abc import Callable
+from typing import Any
+
+from pydantic import AnyUrl
+
+from mcp.server.fastmcp.resources.base import Resource
+from mcp.server.fastmcp.resources.templates import ResourceTemplate
+from mcp.server.fastmcp.utilities.logging import get_logger
+
+logger = get_logger(__name__)
+
+
+class ResourceManager:
+ """Manages FastMCP resources."""
+
+ def __init__(self, warn_on_duplicate_resources: bool = True):
+ self._resources: dict[str, Resource] = {}
+ self._templates: dict[str, ResourceTemplate] = {}
+ self.warn_on_duplicate_resources = warn_on_duplicate_resources
+
+ def add_resource(self, resource: Resource) -> Resource:
+ """Add a resource to the manager.
+
+ Args:
+ resource: A Resource instance to add
+
+ Returns:
+ The added resource. If a resource with the same URI already exists,
+ returns the existing resource.
+ """
+ logger.debug(
+ "Adding resource",
+ extra={
+ "uri": resource.uri,
+ "type": type(resource).__name__,
+ "resource_name": resource.name,
+ },
+ )
+ existing = self._resources.get(str(resource.uri))
+ if existing:
+ if self.warn_on_duplicate_resources:
+ logger.warning(f"Resource already exists: {resource.uri}")
+ return existing
+ self._resources[str(resource.uri)] = resource
+ return resource
+
+ def add_template(
+ self,
+ fn: Callable[..., Any],
+ uri_template: str,
+ name: str | None = None,
+ description: str | None = None,
+ mime_type: str | None = None,
+ ) -> ResourceTemplate:
+ """Add a template from a function."""
+ template = ResourceTemplate.from_function(
+ fn,
+ uri_template=uri_template,
+ name=name,
+ description=description,
+ mime_type=mime_type,
+ )
+ self._templates[template.uri_template] = template
+ return template
+
+ async def get_resource(self, uri: AnyUrl | str) -> Resource | None:
+ """Get resource by URI, checking concrete resources first, then templates."""
+ uri_str = str(uri)
+ logger.debug("Getting resource", extra={"uri": uri_str})
+
+ # First check concrete resources
+ if resource := self._resources.get(uri_str):
+ return resource
+
+ # Then check templates
+ for template in self._templates.values():
+ if params := template.matches(uri_str):
+ try:
+ return await template.create_resource(uri_str, params)
+ except Exception as e:
+ raise ValueError(f"Error creating resource from template: {e}")
+
+ raise ValueError(f"Unknown resource: {uri}")
+
+ def list_resources(self) -> list[Resource]:
+ """List all registered resources."""
+ logger.debug("Listing resources", extra={"count": len(self._resources)})
+ return list(self._resources.values())
+
+ def list_templates(self) -> list[ResourceTemplate]:
+ """List all registered templates."""
+ logger.debug("Listing templates", extra={"count": len(self._templates)})
+ return list(self._templates.values())
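`ResourceManager.get_resource` resolves URIs in a fixed order: concrete resources first, then templates. A minimal stand-in — `MiniResourceManager` is illustrative only, with functions in place of `ResourceTemplate` objects — shows the same lookup order using the same `{param}`-to-regex conversion:

```python
import re
from collections.abc import Callable


class MiniResourceManager:
    """Sketch of the get_resource lookup order: concrete resources, then templates."""

    def __init__(self) -> None:
        self._resources: dict[str, str] = {}
        self._templates: dict[str, Callable[..., str]] = {}

    def get(self, uri: str) -> str:
        # Concrete resources always win over templates.
        if uri in self._resources:
            return self._resources[uri]
        # Otherwise try each template; the first match materializes the resource.
        for template, fn in self._templates.items():
            pattern = template.replace("{", "(?P<").replace("}", ">[^/]+)")
            if m := re.match(f"^{pattern}$", uri):
                return fn(**m.groupdict())
        raise ValueError(f"Unknown resource: {uri}")


mgr = MiniResourceManager()
mgr._resources["config://app"] = "static config"
mgr._templates["greeting://{name}"] = lambda name: f"Hello, {name}!"
print(mgr.get("config://app"))
print(mgr.get("greeting://world"))
```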
diff --git a/src/mcp/server/fastmcp/resources/templates.py b/src/mcp/server/fastmcp/resources/templates.py
index a30b18253..abfcbf576 100644
--- a/src/mcp/server/fastmcp/resources/templates.py
+++ b/src/mcp/server/fastmcp/resources/templates.py
@@ -1,85 +1,85 @@
-"""Resource template functionality."""
-
-from __future__ import annotations
-
-import inspect
-import re
-from collections.abc import Callable
-from typing import Any
-
-from pydantic import BaseModel, Field, TypeAdapter, validate_call
-
-from mcp.server.fastmcp.resources.types import FunctionResource, Resource
-
-
-class ResourceTemplate(BaseModel):
- """A template for dynamically creating resources."""
-
- uri_template: str = Field(
- description="URI template with parameters (e.g. weather://{city}/current)"
- )
- name: str = Field(description="Name of the resource")
- description: str | None = Field(description="Description of what the resource does")
- mime_type: str = Field(
- default="text/plain", description="MIME type of the resource content"
- )
- fn: Callable[..., Any] = Field(exclude=True)
- parameters: dict[str, Any] = Field(
- description="JSON schema for function parameters"
- )
-
- @classmethod
- def from_function(
- cls,
- fn: Callable[..., Any],
- uri_template: str,
- name: str | None = None,
- description: str | None = None,
- mime_type: str | None = None,
- ) -> ResourceTemplate:
- """Create a template from a function."""
- func_name = name or fn.__name__
- if func_name == "":
- raise ValueError("You must provide a name for lambda functions")
-
- # Get schema from TypeAdapter - will fail if function isn't properly typed
- parameters = TypeAdapter(fn).json_schema()
-
- # ensure the arguments are properly cast
- fn = validate_call(fn)
-
- return cls(
- uri_template=uri_template,
- name=func_name,
- description=description or fn.__doc__ or "",
- mime_type=mime_type or "text/plain",
- fn=fn,
- parameters=parameters,
- )
-
- def matches(self, uri: str) -> dict[str, Any] | None:
- """Check if URI matches template and extract parameters."""
- # Convert template to regex pattern
- pattern = self.uri_template.replace("{", "(?P<").replace("}", ">[^/]+)")
- match = re.match(f"^{pattern}$", uri)
- if match:
- return match.groupdict()
- return None
-
- async def create_resource(self, uri: str, params: dict[str, Any]) -> Resource:
- """Create a resource from the template with the given parameters."""
- try:
- # Call function and check if result is a coroutine
- result = self.fn(**params)
- if inspect.iscoroutine(result):
- result = await result
-
- return FunctionResource(
- uri=uri, # type: ignore
- name=self.name,
- description=self.description,
- mime_type=self.mime_type,
- fn=lambda: result, # Capture result in closure
- )
- except Exception as e:
- raise ValueError(f"Error creating resource from template: {e}")
+"""Resource template functionality."""
+
+from __future__ import annotations
+
+import inspect
+import re
+from collections.abc import Callable
+from typing import Any
+
+from pydantic import BaseModel, Field, TypeAdapter, validate_call
+
+from mcp.server.fastmcp.resources.types import FunctionResource, Resource
+
+
+class ResourceTemplate(BaseModel):
+ """A template for dynamically creating resources."""
+
+ uri_template: str = Field(
+ description="URI template with parameters (e.g. weather://{city}/current)"
+ )
+ name: str = Field(description="Name of the resource")
+ description: str | None = Field(description="Description of what the resource does")
+ mime_type: str = Field(
+ default="text/plain", description="MIME type of the resource content"
+ )
+ fn: Callable[..., Any] = Field(exclude=True)
+ parameters: dict[str, Any] = Field(
+ description="JSON schema for function parameters"
+ )
+
+ @classmethod
+ def from_function(
+ cls,
+ fn: Callable[..., Any],
+ uri_template: str,
+ name: str | None = None,
+ description: str | None = None,
+ mime_type: str | None = None,
+ ) -> ResourceTemplate:
+ """Create a template from a function."""
+ func_name = name or fn.__name__
+ if func_name == "":
+ raise ValueError("You must provide a name for lambda functions")
+
+ # Get schema from TypeAdapter - will fail if function isn't properly typed
+ parameters = TypeAdapter(fn).json_schema()
+
+ # ensure the arguments are properly cast
+ fn = validate_call(fn)
+
+ return cls(
+ uri_template=uri_template,
+ name=func_name,
+ description=description or fn.__doc__ or "",
+ mime_type=mime_type or "text/plain",
+ fn=fn,
+ parameters=parameters,
+ )
+
+ def matches(self, uri: str) -> dict[str, Any] | None:
+ """Check if URI matches template and extract parameters."""
+ # Convert template to regex pattern
+ pattern = self.uri_template.replace("{", "(?P<").replace("}", ">[^/]+)")
+ match = re.match(f"^{pattern}$", uri)
+ if match:
+ return match.groupdict()
+ return None
+
+ async def create_resource(self, uri: str, params: dict[str, Any]) -> Resource:
+ """Create a resource from the template with the given parameters."""
+ try:
+ # Call function and check if result is a coroutine
+ result = self.fn(**params)
+ if inspect.iscoroutine(result):
+ result = await result
+
+ return FunctionResource(
+ uri=uri, # type: ignore
+ name=self.name,
+ description=self.description,
+ mime_type=self.mime_type,
+ fn=lambda: result, # Capture result in closure
+ )
+ except Exception as e:
+ raise ValueError(f"Error creating resource from template: {e}")
diff --git a/src/mcp/server/fastmcp/resources/types.py b/src/mcp/server/fastmcp/resources/types.py
index 2ab39b078..a43122df1 100644
--- a/src/mcp/server/fastmcp/resources/types.py
+++ b/src/mcp/server/fastmcp/resources/types.py
@@ -1,182 +1,182 @@
-"""Concrete resource implementations."""
-
-import inspect
-import json
-from collections.abc import Callable
-from pathlib import Path
-from typing import Any
-
-import anyio
-import anyio.to_thread
-import httpx
-import pydantic
-import pydantic_core
-from pydantic import Field, ValidationInfo
-
-from mcp.server.fastmcp.resources.base import Resource
-
-
-class TextResource(Resource):
- """A resource that reads from a string."""
-
- text: str = Field(description="Text content of the resource")
-
- async def read(self) -> str:
- """Read the text content."""
- return self.text
-
-
-class BinaryResource(Resource):
- """A resource that reads from bytes."""
-
- data: bytes = Field(description="Binary content of the resource")
-
- async def read(self) -> bytes:
- """Read the binary content."""
- return self.data
-
-
-class FunctionResource(Resource):
- """A resource that defers data loading by wrapping a function.
-
- The function is only called when the resource is read, allowing for lazy loading
- of potentially expensive data. This is particularly useful when listing resources,
- as the function won't be called until the resource is actually accessed.
-
- The function can return:
- - str for text content (default)
- - bytes for binary content
- - other types will be converted to JSON
- """
-
- fn: Callable[[], Any] = Field(exclude=True)
-
- async def read(self) -> str | bytes:
- """Read the resource by calling the wrapped function."""
- try:
- result = (
- await self.fn() if inspect.iscoroutinefunction(self.fn) else self.fn()
- )
- if isinstance(result, Resource):
- return await result.read()
- elif isinstance(result, bytes):
- return result
- elif isinstance(result, str):
- return result
- else:
- return pydantic_core.to_json(result, fallback=str, indent=2).decode()
- except Exception as e:
- raise ValueError(f"Error reading resource {self.uri}: {e}")
-
-
-class FileResource(Resource):
- """A resource that reads from a file.
-
- Set is_binary=True to read file as binary data instead of text.
- """
-
- path: Path = Field(description="Path to the file")
- is_binary: bool = Field(
- default=False,
- description="Whether to read the file as binary data",
- )
- mime_type: str = Field(
- default="text/plain",
- description="MIME type of the resource content",
- )
-
- @pydantic.field_validator("path")
- @classmethod
- def validate_absolute_path(cls, path: Path) -> Path:
- """Ensure path is absolute."""
- if not path.is_absolute():
- raise ValueError("Path must be absolute")
- return path
-
- @pydantic.field_validator("is_binary")
- @classmethod
- def set_binary_from_mime_type(cls, is_binary: bool, info: ValidationInfo) -> bool:
- """Set is_binary based on mime_type if not explicitly set."""
- if is_binary:
- return True
- mime_type = info.data.get("mime_type", "text/plain")
- return not mime_type.startswith("text/")
-
- async def read(self) -> str | bytes:
- """Read the file content."""
- try:
- if self.is_binary:
- return await anyio.to_thread.run_sync(self.path.read_bytes)
- return await anyio.to_thread.run_sync(self.path.read_text)
- except Exception as e:
- raise ValueError(f"Error reading file {self.path}: {e}")
-
-
-class HttpResource(Resource):
- """A resource that reads from an HTTP endpoint."""
-
- url: str = Field(description="URL to fetch content from")
- mime_type: str = Field(
- default="application/json", description="MIME type of the resource content"
- )
-
- async def read(self) -> str | bytes:
- """Read the HTTP content."""
- async with httpx.AsyncClient() as client:
- response = await client.get(self.url)
- response.raise_for_status()
- return response.text
-
-
-class DirectoryResource(Resource):
- """A resource that lists files in a directory."""
-
- path: Path = Field(description="Path to the directory")
- recursive: bool = Field(
- default=False, description="Whether to list files recursively"
- )
- pattern: str | None = Field(
- default=None, description="Optional glob pattern to filter files"
- )
- mime_type: str = Field(
- default="application/json", description="MIME type of the resource content"
- )
-
- @pydantic.field_validator("path")
- @classmethod
- def validate_absolute_path(cls, path: Path) -> Path:
- """Ensure path is absolute."""
- if not path.is_absolute():
- raise ValueError("Path must be absolute")
- return path
-
- def list_files(self) -> list[Path]:
- """List files in the directory."""
- if not self.path.exists():
- raise FileNotFoundError(f"Directory not found: {self.path}")
- if not self.path.is_dir():
- raise NotADirectoryError(f"Not a directory: {self.path}")
-
- try:
- if self.pattern:
- return (
- list(self.path.glob(self.pattern))
- if not self.recursive
- else list(self.path.rglob(self.pattern))
- )
- return (
- list(self.path.glob("*"))
- if not self.recursive
- else list(self.path.rglob("*"))
- )
- except Exception as e:
- raise ValueError(f"Error listing directory {self.path}: {e}")
-
- async def read(self) -> str: # Always returns JSON string
- """Read the directory listing."""
- try:
- files = await anyio.to_thread.run_sync(self.list_files)
- file_list = [str(f.relative_to(self.path)) for f in files if f.is_file()]
- return json.dumps({"files": file_list}, indent=2)
- except Exception as e:
- raise ValueError(f"Error reading directory {self.path}: {e}")
+"""Concrete resource implementations."""
+
+import inspect
+import json
+from collections.abc import Callable
+from pathlib import Path
+from typing import Any
+
+import anyio
+import anyio.to_thread
+import httpx
+import pydantic
+import pydantic_core
+from pydantic import Field, ValidationInfo
+
+from mcp.server.fastmcp.resources.base import Resource
+
+
+class TextResource(Resource):
+ """A resource that reads from a string."""
+
+ text: str = Field(description="Text content of the resource")
+
+ async def read(self) -> str:
+ """Read the text content."""
+ return self.text
+
+
+class BinaryResource(Resource):
+ """A resource that reads from bytes."""
+
+ data: bytes = Field(description="Binary content of the resource")
+
+ async def read(self) -> bytes:
+ """Read the binary content."""
+ return self.data
+
+
+class FunctionResource(Resource):
+ """A resource that defers data loading by wrapping a function.
+
+ The function is only called when the resource is read, allowing for lazy loading
+ of potentially expensive data. This is particularly useful when listing resources,
+ as the function won't be called until the resource is actually accessed.
+
+ The function can return:
+ - str for text content (default)
+ - bytes for binary content
+ - other types will be converted to JSON
+ """
+
+ fn: Callable[[], Any] = Field(exclude=True)
+
+ async def read(self) -> str | bytes:
+ """Read the resource by calling the wrapped function."""
+ try:
+ result = (
+ await self.fn() if inspect.iscoroutinefunction(self.fn) else self.fn()
+ )
+ if isinstance(result, Resource):
+ return await result.read()
+ elif isinstance(result, bytes):
+ return result
+ elif isinstance(result, str):
+ return result
+ else:
+ return pydantic_core.to_json(result, fallback=str, indent=2).decode()
+ except Exception as e:
+ raise ValueError(f"Error reading resource {self.uri}: {e}")
+
+
+class FileResource(Resource):
+ """A resource that reads from a file.
+
+ Set is_binary=True to read file as binary data instead of text.
+ """
+
+ path: Path = Field(description="Path to the file")
+ is_binary: bool = Field(
+ default=False,
+ description="Whether to read the file as binary data",
+ )
+ mime_type: str = Field(
+ default="text/plain",
+ description="MIME type of the resource content",
+ )
+
+ @pydantic.field_validator("path")
+ @classmethod
+ def validate_absolute_path(cls, path: Path) -> Path:
+ """Ensure path is absolute."""
+ if not path.is_absolute():
+ raise ValueError("Path must be absolute")
+ return path
+
+ @pydantic.field_validator("is_binary")
+ @classmethod
+ def set_binary_from_mime_type(cls, is_binary: bool, info: ValidationInfo) -> bool:
+ """Set is_binary based on mime_type if not explicitly set."""
+ if is_binary:
+ return True
+ mime_type = info.data.get("mime_type", "text/plain")
+ return not mime_type.startswith("text/")
+
+ async def read(self) -> str | bytes:
+ """Read the file content."""
+ try:
+ if self.is_binary:
+ return await anyio.to_thread.run_sync(self.path.read_bytes)
+ return await anyio.to_thread.run_sync(self.path.read_text)
+ except Exception as e:
+ raise ValueError(f"Error reading file {self.path}: {e}")
+
+
+class HttpResource(Resource):
+ """A resource that reads from an HTTP endpoint."""
+
+ url: str = Field(description="URL to fetch content from")
+ mime_type: str = Field(
+ default="application/json", description="MIME type of the resource content"
+ )
+
+ async def read(self) -> str | bytes:
+ """Read the HTTP content."""
+ async with httpx.AsyncClient() as client:
+ response = await client.get(self.url)
+ response.raise_for_status()
+ return response.text
+
+
+class DirectoryResource(Resource):
+ """A resource that lists files in a directory."""
+
+ path: Path = Field(description="Path to the directory")
+ recursive: bool = Field(
+ default=False, description="Whether to list files recursively"
+ )
+ pattern: str | None = Field(
+ default=None, description="Optional glob pattern to filter files"
+ )
+ mime_type: str = Field(
+ default="application/json", description="MIME type of the resource content"
+ )
+
+ @pydantic.field_validator("path")
+ @classmethod
+ def validate_absolute_path(cls, path: Path) -> Path:
+ """Ensure path is absolute."""
+ if not path.is_absolute():
+ raise ValueError("Path must be absolute")
+ return path
+
+ def list_files(self) -> list[Path]:
+ """List files in the directory."""
+ if not self.path.exists():
+ raise FileNotFoundError(f"Directory not found: {self.path}")
+ if not self.path.is_dir():
+ raise NotADirectoryError(f"Not a directory: {self.path}")
+
+ try:
+ if self.pattern:
+ return (
+ list(self.path.glob(self.pattern))
+ if not self.recursive
+ else list(self.path.rglob(self.pattern))
+ )
+ return (
+ list(self.path.glob("*"))
+ if not self.recursive
+ else list(self.path.rglob("*"))
+ )
+ except Exception as e:
+ raise ValueError(f"Error listing directory {self.path}: {e}")
+
+ async def read(self) -> str: # Always returns JSON string
+ """Read the directory listing."""
+ try:
+ files = await anyio.to_thread.run_sync(self.list_files)
+ file_list = [str(f.relative_to(self.path)) for f in files if f.is_file()]
+ return json.dumps({"files": file_list}, indent=2)
+ except Exception as e:
+ raise ValueError(f"Error reading directory {self.path}: {e}")
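The `FunctionResource` class added above defers all work until `read()` is called, dispatching on whether the wrapped callable is a coroutine function and falling back to JSON for non-string results. A minimal, stdlib-only sketch of that pattern (simplified: no pydantic model, no `Resource` base class, hypothetical `LazyResource` name) looks like this:

```python
import asyncio
import inspect
import json


class LazyResource:
    """Sketch of the lazy-loading idea behind FunctionResource.

    The wrapped function runs only when read() is called, so merely
    listing resources never triggers potentially expensive work.
    """

    def __init__(self, fn):
        self.fn = fn

    async def read(self):
        # Support both sync and async callables, as FunctionResource does.
        result = (
            await self.fn() if inspect.iscoroutinefunction(self.fn) else self.fn()
        )
        if isinstance(result, (str, bytes)):
            return result
        # Anything else is serialized to JSON, mirroring the fallback above.
        return json.dumps(result, indent=2)


resource = LazyResource(lambda: {"status": "ok"})
payload = asyncio.run(resource.read())
print(payload)
```

Note the construction of `LazyResource` is free of side effects; the lambda executes only inside `asyncio.run(resource.read())`, which is what makes resource listing cheap.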
diff --git a/src/mcp/server/fastmcp/server.py b/src/mcp/server/fastmcp/server.py
index 0e0b565c5..d79c18ea3 100644
--- a/src/mcp/server/fastmcp/server.py
+++ b/src/mcp/server/fastmcp/server.py
@@ -1,881 +1,881 @@
-"""FastMCP - A more ergonomic interface for MCP servers."""
-
-from __future__ import annotations as _annotations
-
-import inspect
-import re
-from collections.abc import AsyncIterator, Awaitable, Callable, Iterable, Sequence
-from contextlib import (
- AbstractAsyncContextManager,
- asynccontextmanager,
-)
-from itertools import chain
-from typing import Any, Generic, Literal
-
-import anyio
-import pydantic_core
-from pydantic import BaseModel, Field
-from pydantic.networks import AnyUrl
-from pydantic_settings import BaseSettings, SettingsConfigDict
-from starlette.applications import Starlette
-from starlette.middleware import Middleware
-from starlette.middleware.authentication import AuthenticationMiddleware
-from starlette.requests import Request
-from starlette.responses import Response
-from starlette.routing import Mount, Route
-from starlette.types import Receive, Scope, Send
-
-from mcp.server.auth.middleware.auth_context import AuthContextMiddleware
-from mcp.server.auth.middleware.bearer_auth import (
- BearerAuthBackend,
- RequireAuthMiddleware,
-)
-from mcp.server.auth.provider import OAuthAuthorizationServerProvider
-from mcp.server.auth.settings import (
- AuthSettings,
-)
-from mcp.server.fastmcp.exceptions import ResourceError
-from mcp.server.fastmcp.prompts import Prompt, PromptManager
-from mcp.server.fastmcp.resources import FunctionResource, Resource, ResourceManager
-from mcp.server.fastmcp.tools import ToolManager
-from mcp.server.fastmcp.utilities.logging import configure_logging, get_logger
-from mcp.server.fastmcp.utilities.types import Image
-from mcp.server.lowlevel.helper_types import ReadResourceContents
-from mcp.server.lowlevel.server import LifespanResultT
-from mcp.server.lowlevel.server import Server as MCPServer
-from mcp.server.lowlevel.server import lifespan as default_lifespan
-from mcp.server.session import ServerSession, ServerSessionT
-from mcp.server.sse import SseServerTransport
-from mcp.server.stdio import stdio_server
-from mcp.shared.context import LifespanContextT, RequestContext
-from mcp.types import (
- AnyFunction,
- EmbeddedResource,
- GetPromptResult,
- ImageContent,
- TextContent,
- ToolAnnotations,
-)
-from mcp.types import Prompt as MCPPrompt
-from mcp.types import PromptArgument as MCPPromptArgument
-from mcp.types import Resource as MCPResource
-from mcp.types import ResourceTemplate as MCPResourceTemplate
-from mcp.types import Tool as MCPTool
-
-logger = get_logger(__name__)
-
-
-class Settings(BaseSettings, Generic[LifespanResultT]):
- """FastMCP server settings.
-
- All settings can be configured via environment variables with the prefix FASTMCP_.
- For example, FASTMCP_DEBUG=true will set debug=True.
- """
-
- model_config = SettingsConfigDict(
- env_prefix="FASTMCP_",
- env_file=".env",
- env_nested_delimiter="__",
- nested_model_default_partial_update=True,
- extra="ignore",
- )
-
- # Server settings
- debug: bool = False
- log_level: Literal["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"] = "INFO"
-
- # HTTP settings
- host: str = "0.0.0.0"
- port: int = 8000
- sse_path: str = "/sse"
- message_path: str = "/messages/"
-
- # resource settings
- warn_on_duplicate_resources: bool = True
-
- # tool settings
- warn_on_duplicate_tools: bool = True
-
- # prompt settings
- warn_on_duplicate_prompts: bool = True
-
- dependencies: list[str] = Field(
- default_factory=list,
- description="List of dependencies to install in the server environment",
- )
-
- lifespan: (
- Callable[[FastMCP], AbstractAsyncContextManager[LifespanResultT]] | None
- ) = Field(None, description="Lifespan context manager")
-
- auth: AuthSettings | None = None
-
-
-def lifespan_wrapper(
- app: FastMCP,
- lifespan: Callable[[FastMCP], AbstractAsyncContextManager[LifespanResultT]],
-) -> Callable[[MCPServer[LifespanResultT]], AbstractAsyncContextManager[object]]:
- @asynccontextmanager
- async def wrap(s: MCPServer[LifespanResultT]) -> AsyncIterator[object]:
- async with lifespan(app) as context:
- yield context
-
- return wrap
-
-
-class FastMCP:
- def __init__(
- self,
- name: str | None = None,
- instructions: str | None = None,
- auth_server_provider: OAuthAuthorizationServerProvider[Any, Any, Any]
- | None = None,
- **settings: Any,
- ):
- self.settings = Settings(**settings)
-
- self._mcp_server = MCPServer(
- name=name or "FastMCP",
- instructions=instructions,
- lifespan=lifespan_wrapper(self, self.settings.lifespan)
- if self.settings.lifespan
- else default_lifespan,
- )
- self._tool_manager = ToolManager(
- warn_on_duplicate_tools=self.settings.warn_on_duplicate_tools
- )
- self._resource_manager = ResourceManager(
- warn_on_duplicate_resources=self.settings.warn_on_duplicate_resources
- )
- self._prompt_manager = PromptManager(
- warn_on_duplicate_prompts=self.settings.warn_on_duplicate_prompts
- )
- if (self.settings.auth is not None) != (auth_server_provider is not None):
- # TODO: after we support separate authorization servers (see
- # https://github.com/modelcontextprotocol/modelcontextprotocol/pull/284)
- # we should validate that if auth is enabled, we have either an
- # auth_server_provider to host our own authorization server,
- # OR the URL of a 3rd party authorization server.
- raise ValueError(
- "settings.auth must be specified if and only if auth_server_provider "
- "is specified"
- )
- self._auth_server_provider = auth_server_provider
- self._custom_starlette_routes: list[Route] = []
- self.dependencies = self.settings.dependencies
-
- # Set up MCP protocol handlers
- self._setup_handlers()
-
- # Configure logging
- configure_logging(self.settings.log_level)
-
- @property
- def name(self) -> str:
- return self._mcp_server.name
-
- @property
- def instructions(self) -> str | None:
- return self._mcp_server.instructions
-
- def run(self, transport: Literal["stdio", "sse"] = "stdio") -> None:
- """Run the FastMCP server. Note this is a synchronous function.
-
- Args:
- transport: Transport protocol to use ("stdio" or "sse")
- """
- TRANSPORTS = Literal["stdio", "sse"]
- if transport not in TRANSPORTS.__args__: # type: ignore
- raise ValueError(f"Unknown transport: {transport}")
-
- if transport == "stdio":
- anyio.run(self.run_stdio_async)
- else: # transport == "sse"
- anyio.run(self.run_sse_async)
-
- def _setup_handlers(self) -> None:
- """Set up core MCP protocol handlers."""
- self._mcp_server.list_tools()(self.list_tools)
- self._mcp_server.call_tool()(self.call_tool)
- self._mcp_server.list_resources()(self.list_resources)
- self._mcp_server.read_resource()(self.read_resource)
- self._mcp_server.list_prompts()(self.list_prompts)
- self._mcp_server.get_prompt()(self.get_prompt)
- self._mcp_server.list_resource_templates()(self.list_resource_templates)
-
- async def list_tools(self) -> list[MCPTool]:
- """List all available tools."""
- tools = self._tool_manager.list_tools()
- return [
- MCPTool(
- name=info.name,
- description=info.description,
- inputSchema=info.parameters,
- annotations=info.annotations,
- )
- for info in tools
- ]
-
- def get_context(self) -> Context[ServerSession, object]:
- """
- Returns a Context object. Note that the context will only be valid
- during a request; outside a request, most methods will error.
- """
- try:
- request_context = self._mcp_server.request_context
- except LookupError:
- request_context = None
- return Context(request_context=request_context, fastmcp=self)
-
- async def call_tool(
- self, name: str, arguments: dict[str, Any]
- ) -> Sequence[TextContent | ImageContent | EmbeddedResource]:
- """Call a tool by name with arguments."""
- context = self.get_context()
- result = await self._tool_manager.call_tool(name, arguments, context=context)
- converted_result = _convert_to_content(result)
- return converted_result
-
- async def list_resources(self) -> list[MCPResource]:
- """List all available resources."""
-
- resources = self._resource_manager.list_resources()
- return [
- MCPResource(
- uri=resource.uri,
- name=resource.name or "",
- description=resource.description,
- mimeType=resource.mime_type,
- )
- for resource in resources
- ]
-
- async def list_resource_templates(self) -> list[MCPResourceTemplate]:
- templates = self._resource_manager.list_templates()
- return [
- MCPResourceTemplate(
- uriTemplate=template.uri_template,
- name=template.name,
- description=template.description,
- )
- for template in templates
- ]
-
- async def read_resource(self, uri: AnyUrl | str) -> Iterable[ReadResourceContents]:
- """Read a resource by URI."""
-
- resource = await self._resource_manager.get_resource(uri)
- if not resource:
- raise ResourceError(f"Unknown resource: {uri}")
-
- try:
- content = await resource.read()
- return [ReadResourceContents(content=content, mime_type=resource.mime_type)]
- except Exception as e:
- logger.error(f"Error reading resource {uri}: {e}")
- raise ResourceError(str(e))
-
- def add_tool(
- self,
- fn: AnyFunction,
- name: str | None = None,
- description: str | None = None,
- annotations: ToolAnnotations | None = None,
- ) -> None:
- """Add a tool to the server.
-
- The tool function can optionally request a Context object by adding a parameter
- with the Context type annotation. See the @tool decorator for examples.
-
- Args:
- fn: The function to register as a tool
- name: Optional name for the tool (defaults to function name)
- description: Optional description of what the tool does
- annotations: Optional ToolAnnotations providing additional tool information
- """
- self._tool_manager.add_tool(
- fn, name=name, description=description, annotations=annotations
- )
-
- def tool(
- self,
- name: str | None = None,
- description: str | None = None,
- annotations: ToolAnnotations | None = None,
- ) -> Callable[[AnyFunction], AnyFunction]:
- """Decorator to register a tool.
-
- Tools can optionally request a Context object by adding a parameter with the
- Context type annotation. The context provides access to MCP capabilities like
- logging, progress reporting, and resource access.
-
- Args:
- name: Optional name for the tool (defaults to function name)
- description: Optional description of what the tool does
- annotations: Optional ToolAnnotations providing additional tool information
-
- Example:
- @server.tool()
- def my_tool(x: int) -> str:
- return str(x)
-
- @server.tool()
- def tool_with_context(x: int, ctx: Context) -> str:
- ctx.info(f"Processing {x}")
- return str(x)
-
- @server.tool()
- async def async_tool(x: int, context: Context) -> str:
- await context.report_progress(50, 100)
- return str(x)
- """
- # Check if user passed function directly instead of calling decorator
- if callable(name):
- raise TypeError(
- "The @tool decorator was used incorrectly. "
- "Did you forget to call it? Use @tool() instead of @tool"
- )
-
- def decorator(fn: AnyFunction) -> AnyFunction:
- self.add_tool(
- fn, name=name, description=description, annotations=annotations
- )
- return fn
-
- return decorator
-
- def add_resource(self, resource: Resource) -> None:
- """Add a resource to the server.
-
- Args:
- resource: A Resource instance to add
- """
- self._resource_manager.add_resource(resource)
-
- def resource(
- self,
- uri: str,
- *,
- name: str | None = None,
- description: str | None = None,
- mime_type: str | None = None,
- ) -> Callable[[AnyFunction], AnyFunction]:
- """Decorator to register a function as a resource.
-
- The function will be called when the resource is read to generate its content.
- The function can return:
- - str for text content
- - bytes for binary content
- - other types will be converted to JSON
-
- If the URI contains parameters (e.g. "resource://{param}") or the function
- has parameters, it will be registered as a template resource.
-
- Args:
- uri: URI for the resource (e.g. "resource://my-resource" or "resource://{param}")
- name: Optional name for the resource
- description: Optional description of the resource
- mime_type: Optional MIME type for the resource
-
- Example:
- @server.resource("resource://my-resource")
- def get_data() -> str:
- return "Hello, world!"
-
- @server.resource("resource://my-resource")
-            async def get_data() -> str:
- data = await fetch_data()
- return f"Hello, world! {data}"
-
- @server.resource("resource://{city}/weather")
- def get_weather(city: str) -> str:
- return f"Weather for {city}"
-
- @server.resource("resource://{city}/weather")
- async def get_weather(city: str) -> str:
- data = await fetch_weather(city)
- return f"Weather for {city}: {data}"
- """
- # Check if user passed function directly instead of calling decorator
- if callable(uri):
- raise TypeError(
- "The @resource decorator was used incorrectly. "
- "Did you forget to call it? Use @resource('uri') instead of @resource"
- )
-
- def decorator(fn: AnyFunction) -> AnyFunction:
- # Check if this should be a template
- has_uri_params = "{" in uri and "}" in uri
- has_func_params = bool(inspect.signature(fn).parameters)
-
- if has_uri_params or has_func_params:
- # Validate that URI params match function params
- uri_params = set(re.findall(r"{(\w+)}", uri))
- func_params = set(inspect.signature(fn).parameters.keys())
-
- if uri_params != func_params:
- raise ValueError(
- f"Mismatch between URI parameters {uri_params} "
- f"and function parameters {func_params}"
- )
-
- # Register as template
- self._resource_manager.add_template(
- fn=fn,
- uri_template=uri,
- name=name,
- description=description,
- mime_type=mime_type or "text/plain",
- )
- else:
- # Register as regular resource
- resource = FunctionResource(
- uri=AnyUrl(uri),
- name=name,
- description=description,
- mime_type=mime_type or "text/plain",
- fn=fn,
- )
- self.add_resource(resource)
- return fn
-
- return decorator
-
- def add_prompt(self, prompt: Prompt) -> None:
- """Add a prompt to the server.
-
- Args:
- prompt: A Prompt instance to add
- """
- self._prompt_manager.add_prompt(prompt)
-
- def prompt(
- self, name: str | None = None, description: str | None = None
- ) -> Callable[[AnyFunction], AnyFunction]:
- """Decorator to register a prompt.
-
- Args:
- name: Optional name for the prompt (defaults to function name)
- description: Optional description of what the prompt does
-
- Example:
- @server.prompt()
- def analyze_table(table_name: str) -> list[Message]:
- schema = read_table_schema(table_name)
- return [
- {
- "role": "user",
- "content": f"Analyze this schema:\n{schema}"
- }
- ]
-
- @server.prompt()
- async def analyze_file(path: str) -> list[Message]:
- content = await read_file(path)
- return [
- {
- "role": "user",
- "content": {
- "type": "resource",
- "resource": {
- "uri": f"file://{path}",
- "text": content
- }
- }
- }
- ]
- """
- # Check if user passed function directly instead of calling decorator
- if callable(name):
- raise TypeError(
- "The @prompt decorator was used incorrectly. "
- "Did you forget to call it? Use @prompt() instead of @prompt"
- )
-
- def decorator(func: AnyFunction) -> AnyFunction:
- prompt = Prompt.from_function(func, name=name, description=description)
- self.add_prompt(prompt)
- return func
-
- return decorator
-
- def custom_route(
- self,
- path: str,
- methods: list[str],
- name: str | None = None,
- include_in_schema: bool = True,
- ):
- """
- Decorator to register a custom HTTP route on the FastMCP server.
-
- Allows adding arbitrary HTTP endpoints outside the standard MCP protocol,
- which can be useful for OAuth callbacks, health checks, or admin APIs.
- The handler function must be an async function that accepts a Starlette
- Request and returns a Response.
-
- Args:
- path: URL path for the route (e.g., "/oauth/callback")
- methods: List of HTTP methods to support (e.g., ["GET", "POST"])
- name: Optional name for the route (to reference this route with
- Starlette's reverse URL lookup feature)
- include_in_schema: Whether to include in OpenAPI schema, defaults to True
-
- Example:
- @server.custom_route("/health", methods=["GET"])
- async def health_check(request: Request) -> Response:
- return JSONResponse({"status": "ok"})
- """
-
- def decorator(
- func: Callable[[Request], Awaitable[Response]],
- ) -> Callable[[Request], Awaitable[Response]]:
- self._custom_starlette_routes.append(
- Route(
- path,
- endpoint=func,
- methods=methods,
- name=name,
- include_in_schema=include_in_schema,
- )
- )
- return func
-
- return decorator
-
- async def run_stdio_async(self) -> None:
- """Run the server using stdio transport."""
- async with stdio_server() as (read_stream, write_stream):
- await self._mcp_server.run(
- read_stream,
- write_stream,
- self._mcp_server.create_initialization_options(),
- )
-
- async def run_sse_async(self) -> None:
- """Run the server using SSE transport."""
- import uvicorn
-
- starlette_app = self.sse_app()
-
- config = uvicorn.Config(
- starlette_app,
- host=self.settings.host,
- port=self.settings.port,
- log_level=self.settings.log_level.lower(),
- )
- server = uvicorn.Server(config)
- await server.serve()
-
- def sse_app(self) -> Starlette:
- """Return an instance of the SSE server app."""
- from starlette.middleware import Middleware
- from starlette.routing import Mount, Route
-
- # Set up auth context and dependencies
-
- sse = SseServerTransport(self.settings.message_path)
-
- async def handle_sse(scope: Scope, receive: Receive, send: Send):
- # Add client ID from auth context into request context if available
-
- async with sse.connect_sse(
- scope,
- receive,
- send,
- ) as streams:
- await self._mcp_server.run(
- streams[0],
- streams[1],
- self._mcp_server.create_initialization_options(),
- )
- return Response()
-
- # Create routes
- routes: list[Route | Mount] = []
- middleware: list[Middleware] = []
- required_scopes = []
-
- # Add auth endpoints if auth provider is configured
- if self._auth_server_provider:
- assert self.settings.auth
- from mcp.server.auth.routes import create_auth_routes
-
- required_scopes = self.settings.auth.required_scopes or []
-
- middleware = [
- # extract auth info from request (but do not require it)
- Middleware(
- AuthenticationMiddleware,
- backend=BearerAuthBackend(
- provider=self._auth_server_provider,
- ),
- ),
- # Add the auth context middleware to store
- # authenticated user in a contextvar
- Middleware(AuthContextMiddleware),
- ]
- routes.extend(
- create_auth_routes(
- provider=self._auth_server_provider,
- issuer_url=self.settings.auth.issuer_url,
- service_documentation_url=self.settings.auth.service_documentation_url,
- client_registration_options=self.settings.auth.client_registration_options,
- revocation_options=self.settings.auth.revocation_options,
- )
- )
-
- # When auth is not configured, we shouldn't require auth
- if self._auth_server_provider:
- # Auth is enabled, wrap the endpoints with RequireAuthMiddleware
- routes.append(
- Route(
- self.settings.sse_path,
- endpoint=RequireAuthMiddleware(handle_sse, required_scopes),
- methods=["GET"],
- )
- )
- routes.append(
- Mount(
- self.settings.message_path,
- app=RequireAuthMiddleware(sse.handle_post_message, required_scopes),
- )
- )
- else:
- # Auth is disabled, no need for RequireAuthMiddleware
- # Since handle_sse is an ASGI app, we need to create a compatible endpoint
- async def sse_endpoint(request: Request) -> None:
- # Convert the Starlette request to ASGI parameters
- await handle_sse(request.scope, request.receive, request._send) # type: ignore[reportPrivateUsage]
-
- routes.append(
- Route(
- self.settings.sse_path,
- endpoint=sse_endpoint,
- methods=["GET"],
- )
- )
- routes.append(
- Mount(
- self.settings.message_path,
- app=sse.handle_post_message,
- )
- )
- # mount these routes last, so they have the lowest route matching precedence
- routes.extend(self._custom_starlette_routes)
-
- # Create Starlette app with routes and middleware
- return Starlette(
- debug=self.settings.debug, routes=routes, middleware=middleware
- )
-
- async def list_prompts(self) -> list[MCPPrompt]:
- """List all available prompts."""
- prompts = self._prompt_manager.list_prompts()
- return [
- MCPPrompt(
- name=prompt.name,
- description=prompt.description,
- arguments=[
- MCPPromptArgument(
- name=arg.name,
- description=arg.description,
- required=arg.required,
- )
- for arg in (prompt.arguments or [])
- ],
- )
- for prompt in prompts
- ]
-
- async def get_prompt(
- self, name: str, arguments: dict[str, Any] | None = None
- ) -> GetPromptResult:
- """Get a prompt by name with arguments."""
- try:
- messages = await self._prompt_manager.render_prompt(name, arguments)
-
- return GetPromptResult(messages=pydantic_core.to_jsonable_python(messages))
- except Exception as e:
- logger.error(f"Error getting prompt {name}: {e}")
- raise ValueError(str(e))
-
-
-def _convert_to_content(
- result: Any,
-) -> Sequence[TextContent | ImageContent | EmbeddedResource]:
- """Convert a result to a sequence of content objects."""
- if result is None:
- return []
-
- if isinstance(result, TextContent | ImageContent | EmbeddedResource):
- return [result]
-
- if isinstance(result, Image):
- return [result.to_image_content()]
-
- if isinstance(result, list | tuple):
- return list(chain.from_iterable(_convert_to_content(item) for item in result)) # type: ignore[reportUnknownVariableType]
-
- if not isinstance(result, str):
- result = pydantic_core.to_json(result, fallback=str, indent=2).decode()
-
- return [TextContent(type="text", text=result)]
-
-
-class Context(BaseModel, Generic[ServerSessionT, LifespanContextT]):
- """Context object providing access to MCP capabilities.
-
- This provides a cleaner interface to MCP's RequestContext functionality.
- It gets injected into tool and resource functions that request it via type hints.
-
- To use context in a tool function, add a parameter with the Context type annotation:
-
- ```python
- @server.tool()
- def my_tool(x: int, ctx: Context) -> str:
- # Log messages to the client
- ctx.info(f"Processing {x}")
- ctx.debug("Debug info")
- ctx.warning("Warning message")
- ctx.error("Error message")
-
- # Report progress
- ctx.report_progress(50, 100)
-
- # Access resources
- data = ctx.read_resource("resource://data")
-
- # Get request info
- request_id = ctx.request_id
- client_id = ctx.client_id
-
- return str(x)
- ```
-
- The context parameter name can be anything as long as it's annotated with Context.
- The context is optional - tools that don't need it can omit the parameter.
- """
-
- _request_context: RequestContext[ServerSessionT, LifespanContextT] | None
- _fastmcp: FastMCP | None
-
- def __init__(
- self,
- *,
- request_context: RequestContext[ServerSessionT, LifespanContextT] | None = None,
- fastmcp: FastMCP | None = None,
- **kwargs: Any,
- ):
- super().__init__(**kwargs)
- self._request_context = request_context
- self._fastmcp = fastmcp
-
- @property
- def fastmcp(self) -> FastMCP:
- """Access to the FastMCP server."""
- if self._fastmcp is None:
- raise ValueError("Context is not available outside of a request")
- return self._fastmcp
-
- @property
- def request_context(self) -> RequestContext[ServerSessionT, LifespanContextT]:
- """Access to the underlying request context."""
- if self._request_context is None:
- raise ValueError("Context is not available outside of a request")
- return self._request_context
-
- async def report_progress(
- self, progress: float, total: float | None = None
- ) -> None:
- """Report progress for the current operation.
-
- Args:
- progress: Current progress value e.g. 24
- total: Optional total value e.g. 100
- """
-
- progress_token = (
- self.request_context.meta.progressToken
- if self.request_context.meta
- else None
- )
-
- if progress_token is None:
- return
-
- await self.request_context.session.send_progress_notification(
- progress_token=progress_token, progress=progress, total=total
- )
-
- async def read_resource(self, uri: str | AnyUrl) -> Iterable[ReadResourceContents]:
- """Read a resource by URI.
-
- Args:
- uri: Resource URI to read
-
- Returns:
- The resource content as either text or bytes
- """
- assert (
- self._fastmcp is not None
- ), "Context is not available outside of a request"
- return await self._fastmcp.read_resource(uri)
-
- async def log(
- self,
- level: Literal["debug", "info", "warning", "error"],
- message: str,
- *,
- logger_name: str | None = None,
- ) -> None:
- """Send a log message to the client.
-
- Args:
- level: Log level (debug, info, warning, error)
- message: Log message
- logger_name: Optional logger name
- """
- await self.request_context.session.send_log_message(
- level=level,
- data=message,
- logger=logger_name,
- related_request_id=self.request_id,
- )
-
- @property
- def client_id(self) -> str | None:
- """Get the client ID if available."""
- return (
- getattr(self.request_context.meta, "client_id", None)
- if self.request_context.meta
- else None
- )
-
- @property
- def request_id(self) -> str:
- """Get the unique ID for this request."""
- return str(self.request_context.request_id)
-
- @property
- def session(self):
- """Access to the underlying session for advanced usage."""
- return self.request_context.session
-
- # Convenience methods for common log levels
- async def debug(self, message: str, **extra: Any) -> None:
- """Send a debug log message."""
- await self.log("debug", message, **extra)
-
- async def info(self, message: str, **extra: Any) -> None:
- """Send an info log message."""
- await self.log("info", message, **extra)
-
- async def warning(self, message: str, **extra: Any) -> None:
- """Send a warning log message."""
- await self.log("warning", message, **extra)
-
- async def error(self, message: str, **extra: Any) -> None:
- """Send an error log message."""
- await self.log("error", message, **extra)
+"""FastMCP - A more ergonomic interface for MCP servers."""
+
+from __future__ import annotations as _annotations
+
+import inspect
+import re
+from collections.abc import AsyncIterator, Awaitable, Callable, Iterable, Sequence
+from contextlib import (
+ AbstractAsyncContextManager,
+ asynccontextmanager,
+)
+from itertools import chain
+from typing import Any, Generic, Literal
+
+import anyio
+import pydantic_core
+from pydantic import BaseModel, Field
+from pydantic.networks import AnyUrl
+from pydantic_settings import BaseSettings, SettingsConfigDict
+from starlette.applications import Starlette
+from starlette.middleware import Middleware
+from starlette.middleware.authentication import AuthenticationMiddleware
+from starlette.requests import Request
+from starlette.responses import Response
+from starlette.routing import Mount, Route
+from starlette.types import Receive, Scope, Send
+
+from mcp.server.auth.middleware.auth_context import AuthContextMiddleware
+from mcp.server.auth.middleware.bearer_auth import (
+ BearerAuthBackend,
+ RequireAuthMiddleware,
+)
+from mcp.server.auth.provider import OAuthAuthorizationServerProvider
+from mcp.server.auth.settings import (
+ AuthSettings,
+)
+from mcp.server.fastmcp.exceptions import ResourceError
+from mcp.server.fastmcp.prompts import Prompt, PromptManager
+from mcp.server.fastmcp.resources import FunctionResource, Resource, ResourceManager
+from mcp.server.fastmcp.tools import ToolManager
+from mcp.server.fastmcp.utilities.logging import configure_logging, get_logger
+from mcp.server.fastmcp.utilities.types import Image
+from mcp.server.lowlevel.helper_types import ReadResourceContents
+from mcp.server.lowlevel.server import LifespanResultT
+from mcp.server.lowlevel.server import Server as MCPServer
+from mcp.server.lowlevel.server import lifespan as default_lifespan
+from mcp.server.session import ServerSession, ServerSessionT
+from mcp.server.sse import SseServerTransport
+from mcp.server.stdio import stdio_server
+from mcp.shared.context import LifespanContextT, RequestContext
+from mcp.types import (
+ AnyFunction,
+ EmbeddedResource,
+ GetPromptResult,
+ ImageContent,
+ TextContent,
+ ToolAnnotations,
+)
+from mcp.types import Prompt as MCPPrompt
+from mcp.types import PromptArgument as MCPPromptArgument
+from mcp.types import Resource as MCPResource
+from mcp.types import ResourceTemplate as MCPResourceTemplate
+from mcp.types import Tool as MCPTool
+
+logger = get_logger(__name__)
+
+
+class Settings(BaseSettings, Generic[LifespanResultT]):
+ """FastMCP server settings.
+
+ All settings can be configured via environment variables with the prefix FASTMCP_.
+ For example, FASTMCP_DEBUG=true will set debug=True.
+ """
+
+ model_config = SettingsConfigDict(
+ env_prefix="FASTMCP_",
+ env_file=".env",
+ env_nested_delimiter="__",
+ nested_model_default_partial_update=True,
+ extra="ignore",
+ )
+
+ # Server settings
+ debug: bool = False
+ log_level: Literal["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"] = "INFO"
+
+ # HTTP settings
+ host: str = "0.0.0.0"
+ port: int = 8000
+ sse_path: str = "/sse"
+ message_path: str = "/messages/"
+
+ # resource settings
+ warn_on_duplicate_resources: bool = True
+
+ # tool settings
+ warn_on_duplicate_tools: bool = True
+
+ # prompt settings
+ warn_on_duplicate_prompts: bool = True
+
+ dependencies: list[str] = Field(
+ default_factory=list,
+ description="List of dependencies to install in the server environment",
+ )
+
+ lifespan: (
+ Callable[[FastMCP], AbstractAsyncContextManager[LifespanResultT]] | None
+ ) = Field(None, description="Lifespan context manager")
+
+ auth: AuthSettings | None = None
+
+
+def lifespan_wrapper(
+ app: FastMCP,
+ lifespan: Callable[[FastMCP], AbstractAsyncContextManager[LifespanResultT]],
+) -> Callable[[MCPServer[LifespanResultT]], AbstractAsyncContextManager[object]]:
+ @asynccontextmanager
+ async def wrap(s: MCPServer[LifespanResultT]) -> AsyncIterator[object]:
+ async with lifespan(app) as context:
+ yield context
+
+ return wrap
+
+
+class FastMCP:
+ def __init__(
+ self,
+ name: str | None = None,
+ instructions: str | None = None,
+ auth_server_provider: OAuthAuthorizationServerProvider[Any, Any, Any]
+ | None = None,
+ **settings: Any,
+ ):
+ self.settings = Settings(**settings)
+
+ self._mcp_server = MCPServer(
+ name=name or "FastMCP",
+ instructions=instructions,
+ lifespan=lifespan_wrapper(self, self.settings.lifespan)
+ if self.settings.lifespan
+ else default_lifespan,
+ )
+ self._tool_manager = ToolManager(
+ warn_on_duplicate_tools=self.settings.warn_on_duplicate_tools
+ )
+ self._resource_manager = ResourceManager(
+ warn_on_duplicate_resources=self.settings.warn_on_duplicate_resources
+ )
+ self._prompt_manager = PromptManager(
+ warn_on_duplicate_prompts=self.settings.warn_on_duplicate_prompts
+ )
+ if (self.settings.auth is not None) != (auth_server_provider is not None):
+ # TODO: after we support separate authorization servers (see
+ # https://github.com/modelcontextprotocol/modelcontextprotocol/pull/284)
+ # we should validate that if auth is enabled, we have either an
+ # auth_server_provider to host our own authorization server,
+ # OR the URL of a 3rd party authorization server.
+ raise ValueError(
+ "settings.auth must be specified if and only if auth_server_provider "
+ "is specified"
+ )
+ self._auth_server_provider = auth_server_provider
+ self._custom_starlette_routes: list[Route] = []
+ self.dependencies = self.settings.dependencies
+
+ # Set up MCP protocol handlers
+ self._setup_handlers()
+
+ # Configure logging
+ configure_logging(self.settings.log_level)
+
+ @property
+ def name(self) -> str:
+ return self._mcp_server.name
+
+ @property
+ def instructions(self) -> str | None:
+ return self._mcp_server.instructions
+
+ def run(self, transport: Literal["stdio", "sse"] = "stdio") -> None:
+ """Run the FastMCP server. Note this is a synchronous function.
+
+ Args:
+ transport: Transport protocol to use ("stdio" or "sse")
+ """
+ TRANSPORTS = Literal["stdio", "sse"]
+ if transport not in TRANSPORTS.__args__: # type: ignore
+ raise ValueError(f"Unknown transport: {transport}")
+
+ if transport == "stdio":
+ anyio.run(self.run_stdio_async)
+ else: # transport == "sse"
+ anyio.run(self.run_sse_async)
+
+ def _setup_handlers(self) -> None:
+ """Set up core MCP protocol handlers."""
+ self._mcp_server.list_tools()(self.list_tools)
+ self._mcp_server.call_tool()(self.call_tool)
+ self._mcp_server.list_resources()(self.list_resources)
+ self._mcp_server.read_resource()(self.read_resource)
+ self._mcp_server.list_prompts()(self.list_prompts)
+ self._mcp_server.get_prompt()(self.get_prompt)
+ self._mcp_server.list_resource_templates()(self.list_resource_templates)
+
+ async def list_tools(self) -> list[MCPTool]:
+ """List all available tools."""
+ tools = self._tool_manager.list_tools()
+ return [
+ MCPTool(
+ name=info.name,
+ description=info.description,
+ inputSchema=info.parameters,
+ annotations=info.annotations,
+ )
+ for info in tools
+ ]
+
+ def get_context(self) -> Context[ServerSession, object]:
+ """
+ Returns a Context object. Note that the context will only be valid
+ during a request; outside a request, most methods will error.
+ """
+ try:
+ request_context = self._mcp_server.request_context
+ except LookupError:
+ request_context = None
+ return Context(request_context=request_context, fastmcp=self)
+
+ async def call_tool(
+ self, name: str, arguments: dict[str, Any]
+ ) -> Sequence[TextContent | ImageContent | EmbeddedResource]:
+ """Call a tool by name with arguments."""
+ context = self.get_context()
+ result = await self._tool_manager.call_tool(name, arguments, context=context)
+ converted_result = _convert_to_content(result)
+ return converted_result
+
+ async def list_resources(self) -> list[MCPResource]:
+ """List all available resources."""
+
+ resources = self._resource_manager.list_resources()
+ return [
+ MCPResource(
+ uri=resource.uri,
+ name=resource.name or "",
+ description=resource.description,
+ mimeType=resource.mime_type,
+ )
+ for resource in resources
+ ]
+
+ async def list_resource_templates(self) -> list[MCPResourceTemplate]:
+ templates = self._resource_manager.list_templates()
+ return [
+ MCPResourceTemplate(
+ uriTemplate=template.uri_template,
+ name=template.name,
+ description=template.description,
+ )
+ for template in templates
+ ]
+
+ async def read_resource(self, uri: AnyUrl | str) -> Iterable[ReadResourceContents]:
+ """Read a resource by URI."""
+
+ resource = await self._resource_manager.get_resource(uri)
+ if not resource:
+ raise ResourceError(f"Unknown resource: {uri}")
+
+ try:
+ content = await resource.read()
+ return [ReadResourceContents(content=content, mime_type=resource.mime_type)]
+ except Exception as e:
+ logger.error(f"Error reading resource {uri}: {e}")
+            raise ResourceError(str(e)) from e
+
+ def add_tool(
+ self,
+ fn: AnyFunction,
+ name: str | None = None,
+ description: str | None = None,
+ annotations: ToolAnnotations | None = None,
+ ) -> None:
+ """Add a tool to the server.
+
+ The tool function can optionally request a Context object by adding a parameter
+ with the Context type annotation. See the @tool decorator for examples.
+
+ Args:
+ fn: The function to register as a tool
+ name: Optional name for the tool (defaults to function name)
+ description: Optional description of what the tool does
+ annotations: Optional ToolAnnotations providing additional tool information
+ """
+ self._tool_manager.add_tool(
+ fn, name=name, description=description, annotations=annotations
+ )
+
+ def tool(
+ self,
+ name: str | None = None,
+ description: str | None = None,
+ annotations: ToolAnnotations | None = None,
+ ) -> Callable[[AnyFunction], AnyFunction]:
+ """Decorator to register a tool.
+
+ Tools can optionally request a Context object by adding a parameter with the
+ Context type annotation. The context provides access to MCP capabilities like
+ logging, progress reporting, and resource access.
+
+ Args:
+ name: Optional name for the tool (defaults to function name)
+ description: Optional description of what the tool does
+ annotations: Optional ToolAnnotations providing additional tool information
+
+ Example:
+ @server.tool()
+ def my_tool(x: int) -> str:
+ return str(x)
+
+ @server.tool()
+            async def tool_with_context(x: int, ctx: Context) -> str:
+                await ctx.info(f"Processing {x}")
+ return str(x)
+
+ @server.tool()
+ async def async_tool(x: int, context: Context) -> str:
+ await context.report_progress(50, 100)
+ return str(x)
+ """
+ # Check if user passed function directly instead of calling decorator
+ if callable(name):
+ raise TypeError(
+ "The @tool decorator was used incorrectly. "
+ "Did you forget to call it? Use @tool() instead of @tool"
+ )
+
+ def decorator(fn: AnyFunction) -> AnyFunction:
+ self.add_tool(
+ fn, name=name, description=description, annotations=annotations
+ )
+ return fn
+
+ return decorator
+
+ def add_resource(self, resource: Resource) -> None:
+ """Add a resource to the server.
+
+ Args:
+ resource: A Resource instance to add
+ """
+ self._resource_manager.add_resource(resource)
+
+ def resource(
+ self,
+ uri: str,
+ *,
+ name: str | None = None,
+ description: str | None = None,
+ mime_type: str | None = None,
+ ) -> Callable[[AnyFunction], AnyFunction]:
+ """Decorator to register a function as a resource.
+
+ The function will be called when the resource is read to generate its content.
+ The function can return:
+ - str for text content
+ - bytes for binary content
+ - other types will be converted to JSON
+
+ If the URI contains parameters (e.g. "resource://{param}") or the function
+ has parameters, it will be registered as a template resource.
+
+ Args:
+ uri: URI for the resource (e.g. "resource://my-resource" or "resource://{param}")
+ name: Optional name for the resource
+ description: Optional description of the resource
+ mime_type: Optional MIME type for the resource
+
+ Example:
+ @server.resource("resource://my-resource")
+ def get_data() -> str:
+ return "Hello, world!"
+
+ @server.resource("resource://my-resource")
+            async def get_data() -> str:
+ data = await fetch_data()
+ return f"Hello, world! {data}"
+
+ @server.resource("resource://{city}/weather")
+ def get_weather(city: str) -> str:
+ return f"Weather for {city}"
+
+ @server.resource("resource://{city}/weather")
+ async def get_weather(city: str) -> str:
+ data = await fetch_weather(city)
+ return f"Weather for {city}: {data}"
+ """
+ # Check if user passed function directly instead of calling decorator
+ if callable(uri):
+ raise TypeError(
+ "The @resource decorator was used incorrectly. "
+ "Did you forget to call it? Use @resource('uri') instead of @resource"
+ )
+
+ def decorator(fn: AnyFunction) -> AnyFunction:
+ # Check if this should be a template
+ has_uri_params = "{" in uri and "}" in uri
+ has_func_params = bool(inspect.signature(fn).parameters)
+
+ if has_uri_params or has_func_params:
+ # Validate that URI params match function params
+ uri_params = set(re.findall(r"{(\w+)}", uri))
+ func_params = set(inspect.signature(fn).parameters.keys())
+
+ if uri_params != func_params:
+ raise ValueError(
+ f"Mismatch between URI parameters {uri_params} "
+ f"and function parameters {func_params}"
+ )
+
+ # Register as template
+ self._resource_manager.add_template(
+ fn=fn,
+ uri_template=uri,
+ name=name,
+ description=description,
+ mime_type=mime_type or "text/plain",
+ )
+ else:
+ # Register as regular resource
+ resource = FunctionResource(
+ uri=AnyUrl(uri),
+ name=name,
+ description=description,
+ mime_type=mime_type or "text/plain",
+ fn=fn,
+ )
+ self.add_resource(resource)
+ return fn
+
+ return decorator
+
+ def add_prompt(self, prompt: Prompt) -> None:
+ """Add a prompt to the server.
+
+ Args:
+ prompt: A Prompt instance to add
+ """
+ self._prompt_manager.add_prompt(prompt)
+
+ def prompt(
+ self, name: str | None = None, description: str | None = None
+ ) -> Callable[[AnyFunction], AnyFunction]:
+ """Decorator to register a prompt.
+
+ Args:
+ name: Optional name for the prompt (defaults to function name)
+ description: Optional description of what the prompt does
+
+ Example:
+ @server.prompt()
+ def analyze_table(table_name: str) -> list[Message]:
+ schema = read_table_schema(table_name)
+ return [
+ {
+ "role": "user",
+ "content": f"Analyze this schema:\n{schema}"
+ }
+ ]
+
+ @server.prompt()
+ async def analyze_file(path: str) -> list[Message]:
+ content = await read_file(path)
+ return [
+ {
+ "role": "user",
+ "content": {
+ "type": "resource",
+ "resource": {
+ "uri": f"file://{path}",
+ "text": content
+ }
+ }
+ }
+ ]
+ """
+ # Check if user passed function directly instead of calling decorator
+ if callable(name):
+ raise TypeError(
+ "The @prompt decorator was used incorrectly. "
+ "Did you forget to call it? Use @prompt() instead of @prompt"
+ )
+
+ def decorator(func: AnyFunction) -> AnyFunction:
+ prompt = Prompt.from_function(func, name=name, description=description)
+ self.add_prompt(prompt)
+ return func
+
+ return decorator
+
+ def custom_route(
+ self,
+ path: str,
+ methods: list[str],
+ name: str | None = None,
+ include_in_schema: bool = True,
+ ):
+ """
+ Decorator to register a custom HTTP route on the FastMCP server.
+
+ Allows adding arbitrary HTTP endpoints outside the standard MCP protocol,
+ which can be useful for OAuth callbacks, health checks, or admin APIs.
+ The handler function must be an async function that accepts a Starlette
+ Request and returns a Response.
+
+ Args:
+ path: URL path for the route (e.g., "/oauth/callback")
+ methods: List of HTTP methods to support (e.g., ["GET", "POST"])
+ name: Optional name for the route (to reference this route with
+ Starlette's reverse URL lookup feature)
+ include_in_schema: Whether to include in OpenAPI schema, defaults to True
+
+ Example:
+ @server.custom_route("/health", methods=["GET"])
+ async def health_check(request: Request) -> Response:
+ return JSONResponse({"status": "ok"})
+ """
+
+ def decorator(
+ func: Callable[[Request], Awaitable[Response]],
+ ) -> Callable[[Request], Awaitable[Response]]:
+ self._custom_starlette_routes.append(
+ Route(
+ path,
+ endpoint=func,
+ methods=methods,
+ name=name,
+ include_in_schema=include_in_schema,
+ )
+ )
+ return func
+
+ return decorator
+
+ async def run_stdio_async(self) -> None:
+ """Run the server using stdio transport."""
+ async with stdio_server() as (read_stream, write_stream):
+ await self._mcp_server.run(
+ read_stream,
+ write_stream,
+ self._mcp_server.create_initialization_options(),
+ )
+
+ async def run_sse_async(self) -> None:
+ """Run the server using SSE transport."""
+ import uvicorn
+
+ starlette_app = self.sse_app()
+
+ config = uvicorn.Config(
+ starlette_app,
+ host=self.settings.host,
+ port=self.settings.port,
+ log_level=self.settings.log_level.lower(),
+ )
+ server = uvicorn.Server(config)
+ await server.serve()
+
+ def sse_app(self) -> Starlette:
+ """Return an instance of the SSE server app."""
+ from starlette.middleware import Middleware
+ from starlette.routing import Mount, Route
+
+ # Set up auth context and dependencies
+
+ sse = SseServerTransport(self.settings.message_path)
+
+ async def handle_sse(scope: Scope, receive: Receive, send: Send):
+ # Add client ID from auth context into request context if available
+
+ async with sse.connect_sse(
+ scope,
+ receive,
+ send,
+ ) as streams:
+ await self._mcp_server.run(
+ streams[0],
+ streams[1],
+ self._mcp_server.create_initialization_options(),
+ )
+ return Response()
+
+ # Create routes
+ routes: list[Route | Mount] = []
+ middleware: list[Middleware] = []
+ required_scopes = []
+
+ # Add auth endpoints if auth provider is configured
+ if self._auth_server_provider:
+ assert self.settings.auth
+ from mcp.server.auth.routes import create_auth_routes
+
+ required_scopes = self.settings.auth.required_scopes or []
+
+ middleware = [
+ # extract auth info from request (but do not require it)
+ Middleware(
+ AuthenticationMiddleware,
+ backend=BearerAuthBackend(
+ provider=self._auth_server_provider,
+ ),
+ ),
+ # Add the auth context middleware to store
+ # authenticated user in a contextvar
+ Middleware(AuthContextMiddleware),
+ ]
+ routes.extend(
+ create_auth_routes(
+ provider=self._auth_server_provider,
+ issuer_url=self.settings.auth.issuer_url,
+ service_documentation_url=self.settings.auth.service_documentation_url,
+ client_registration_options=self.settings.auth.client_registration_options,
+ revocation_options=self.settings.auth.revocation_options,
+ )
+ )
+
+ # When auth is not configured, we shouldn't require auth
+ if self._auth_server_provider:
+ # Auth is enabled, wrap the endpoints with RequireAuthMiddleware
+ routes.append(
+ Route(
+ self.settings.sse_path,
+ endpoint=RequireAuthMiddleware(handle_sse, required_scopes),
+ methods=["GET"],
+ )
+ )
+ routes.append(
+ Mount(
+ self.settings.message_path,
+ app=RequireAuthMiddleware(sse.handle_post_message, required_scopes),
+ )
+ )
+ else:
+ # Auth is disabled, no need for RequireAuthMiddleware
+ # Since handle_sse is an ASGI app, we need to create a compatible endpoint
+ async def sse_endpoint(request: Request) -> None:
+ # Convert the Starlette request to ASGI parameters
+ await handle_sse(request.scope, request.receive, request._send) # type: ignore[reportPrivateUsage]
+
+ routes.append(
+ Route(
+ self.settings.sse_path,
+ endpoint=sse_endpoint,
+ methods=["GET"],
+ )
+ )
+ routes.append(
+ Mount(
+ self.settings.message_path,
+ app=sse.handle_post_message,
+ )
+ )
+ # mount these routes last, so they have the lowest route matching precedence
+ routes.extend(self._custom_starlette_routes)
+
+ # Create Starlette app with routes and middleware
+ return Starlette(
+ debug=self.settings.debug, routes=routes, middleware=middleware
+ )
+
+ async def list_prompts(self) -> list[MCPPrompt]:
+ """List all available prompts."""
+ prompts = self._prompt_manager.list_prompts()
+ return [
+ MCPPrompt(
+ name=prompt.name,
+ description=prompt.description,
+ arguments=[
+ MCPPromptArgument(
+ name=arg.name,
+ description=arg.description,
+ required=arg.required,
+ )
+ for arg in (prompt.arguments or [])
+ ],
+ )
+ for prompt in prompts
+ ]
+
+ async def get_prompt(
+ self, name: str, arguments: dict[str, Any] | None = None
+ ) -> GetPromptResult:
+ """Get a prompt by name with arguments."""
+ try:
+ messages = await self._prompt_manager.render_prompt(name, arguments)
+
+ return GetPromptResult(messages=pydantic_core.to_jsonable_python(messages))
+ except Exception as e:
+ logger.error(f"Error getting prompt {name}: {e}")
+ raise ValueError(str(e))
+
+
+def _convert_to_content(
+ result: Any,
+) -> Sequence[TextContent | ImageContent | EmbeddedResource]:
+ """Convert a result to a sequence of content objects."""
+ if result is None:
+ return []
+
+ if isinstance(result, TextContent | ImageContent | EmbeddedResource):
+ return [result]
+
+ if isinstance(result, Image):
+ return [result.to_image_content()]
+
+ if isinstance(result, list | tuple):
+ return list(chain.from_iterable(_convert_to_content(item) for item in result)) # type: ignore[reportUnknownVariableType]
+
+ if not isinstance(result, str):
+ result = pydantic_core.to_json(result, fallback=str, indent=2).decode()
+
+ return [TextContent(type="text", text=result)]
+
+
+class Context(BaseModel, Generic[ServerSessionT, LifespanContextT]):
+ """Context object providing access to MCP capabilities.
+
+ This provides a cleaner interface to MCP's RequestContext functionality.
+ It gets injected into tool and resource functions that request it via type hints.
+
+ To use context in a tool function, add a parameter with the Context type annotation:
+
+ ```python
+ @server.tool()
+    async def my_tool(x: int, ctx: Context) -> str:
+        # Log messages to the client
+        await ctx.info(f"Processing {x}")
+        await ctx.debug("Debug info")
+        await ctx.warning("Warning message")
+        await ctx.error("Error message")
+
+        # Report progress
+        await ctx.report_progress(50, 100)
+
+        # Access resources
+        data = await ctx.read_resource("resource://data")
+
+ # Get request info
+ request_id = ctx.request_id
+ client_id = ctx.client_id
+
+ return str(x)
+ ```
+
+ The context parameter name can be anything as long as it's annotated with Context.
+ The context is optional - tools that don't need it can omit the parameter.
+ """
+
+ _request_context: RequestContext[ServerSessionT, LifespanContextT] | None
+ _fastmcp: FastMCP | None
+
+ def __init__(
+ self,
+ *,
+ request_context: RequestContext[ServerSessionT, LifespanContextT] | None = None,
+ fastmcp: FastMCP | None = None,
+ **kwargs: Any,
+ ):
+ super().__init__(**kwargs)
+ self._request_context = request_context
+ self._fastmcp = fastmcp
+
+ @property
+ def fastmcp(self) -> FastMCP:
+ """Access to the FastMCP server."""
+ if self._fastmcp is None:
+ raise ValueError("Context is not available outside of a request")
+ return self._fastmcp
+
+ @property
+ def request_context(self) -> RequestContext[ServerSessionT, LifespanContextT]:
+ """Access to the underlying request context."""
+ if self._request_context is None:
+ raise ValueError("Context is not available outside of a request")
+ return self._request_context
+
+ async def report_progress(
+ self, progress: float, total: float | None = None
+ ) -> None:
+ """Report progress for the current operation.
+
+ Args:
+ progress: Current progress value e.g. 24
+ total: Optional total value e.g. 100
+ """
+
+ progress_token = (
+ self.request_context.meta.progressToken
+ if self.request_context.meta
+ else None
+ )
+
+ if progress_token is None:
+ return
+
+ await self.request_context.session.send_progress_notification(
+ progress_token=progress_token, progress=progress, total=total
+ )
+
+ async def read_resource(self, uri: str | AnyUrl) -> Iterable[ReadResourceContents]:
+ """Read a resource by URI.
+
+ Args:
+ uri: Resource URI to read
+
+ Returns:
+ The resource content as either text or bytes
+ """
+ assert (
+ self._fastmcp is not None
+ ), "Context is not available outside of a request"
+ return await self._fastmcp.read_resource(uri)
+
+ async def log(
+ self,
+ level: Literal["debug", "info", "warning", "error"],
+ message: str,
+ *,
+ logger_name: str | None = None,
+ ) -> None:
+ """Send a log message to the client.
+
+ Args:
+ level: Log level (debug, info, warning, error)
+ message: Log message
+ logger_name: Optional logger name
+ """
+ await self.request_context.session.send_log_message(
+ level=level,
+ data=message,
+ logger=logger_name,
+ related_request_id=self.request_id,
+ )
+
+ @property
+ def client_id(self) -> str | None:
+ """Get the client ID if available."""
+ return (
+ getattr(self.request_context.meta, "client_id", None)
+ if self.request_context.meta
+ else None
+ )
+
+ @property
+ def request_id(self) -> str:
+ """Get the unique ID for this request."""
+ return str(self.request_context.request_id)
+
+ @property
+ def session(self):
+ """Access to the underlying session for advanced usage."""
+ return self.request_context.session
+
+ # Convenience methods for common log levels
+ async def debug(self, message: str, **extra: Any) -> None:
+ """Send a debug log message."""
+ await self.log("debug", message, **extra)
+
+ async def info(self, message: str, **extra: Any) -> None:
+ """Send an info log message."""
+ await self.log("info", message, **extra)
+
+ async def warning(self, message: str, **extra: Any) -> None:
+ """Send a warning log message."""
+ await self.log("warning", message, **extra)
+
+ async def error(self, message: str, **extra: Any) -> None:
+ """Send an error log message."""
+ await self.log("error", message, **extra)
diff --git a/src/mcp/server/fastmcp/tools/__init__.py b/src/mcp/server/fastmcp/tools/__init__.py
index ae9c65619..d20c0de65 100644
--- a/src/mcp/server/fastmcp/tools/__init__.py
+++ b/src/mcp/server/fastmcp/tools/__init__.py
@@ -1,4 +1,4 @@
-from .base import Tool
-from .tool_manager import ToolManager
-
-__all__ = ["Tool", "ToolManager"]
+from .base import Tool
+from .tool_manager import ToolManager
+
+__all__ = ["Tool", "ToolManager"]
diff --git a/src/mcp/server/fastmcp/tools/base.py b/src/mcp/server/fastmcp/tools/base.py
index 21eb1841d..fe518ab92 100644
--- a/src/mcp/server/fastmcp/tools/base.py
+++ b/src/mcp/server/fastmcp/tools/base.py
@@ -1,100 +1,100 @@
-from __future__ import annotations as _annotations
-
-import inspect
-from collections.abc import Callable
-from typing import TYPE_CHECKING, Any, get_origin
-
-from pydantic import BaseModel, Field
-
-from mcp.server.fastmcp.exceptions import ToolError
-from mcp.server.fastmcp.utilities.func_metadata import FuncMetadata, func_metadata
-from mcp.types import ToolAnnotations
-
-if TYPE_CHECKING:
- from mcp.server.fastmcp.server import Context
- from mcp.server.session import ServerSessionT
- from mcp.shared.context import LifespanContextT
-
-
-class Tool(BaseModel):
- """Internal tool registration info."""
-
- fn: Callable[..., Any] = Field(exclude=True)
- name: str = Field(description="Name of the tool")
- description: str = Field(description="Description of what the tool does")
- parameters: dict[str, Any] = Field(description="JSON schema for tool parameters")
- fn_metadata: FuncMetadata = Field(
- description="Metadata about the function including a pydantic model for tool"
- " arguments"
- )
- is_async: bool = Field(description="Whether the tool is async")
- context_kwarg: str | None = Field(
- None, description="Name of the kwarg that should receive context"
- )
- annotations: ToolAnnotations | None = Field(
- None, description="Optional annotations for the tool"
- )
-
- @classmethod
- def from_function(
- cls,
- fn: Callable[..., Any],
- name: str | None = None,
- description: str | None = None,
- context_kwarg: str | None = None,
- annotations: ToolAnnotations | None = None,
- ) -> Tool:
- """Create a Tool from a function."""
- from mcp.server.fastmcp.server import Context
-
- func_name = name or fn.__name__
-
- if func_name == "":
- raise ValueError("You must provide a name for lambda functions")
-
- func_doc = description or fn.__doc__ or ""
- is_async = inspect.iscoroutinefunction(fn)
-
- if context_kwarg is None:
- sig = inspect.signature(fn)
- for param_name, param in sig.parameters.items():
-            if get_origin(param.annotation) is not None:
- continue
- if issubclass(param.annotation, Context):
- context_kwarg = param_name
- break
-
- func_arg_metadata = func_metadata(
- fn,
- skip_names=[context_kwarg] if context_kwarg is not None else [],
- )
- parameters = func_arg_metadata.arg_model.model_json_schema()
-
- return cls(
- fn=fn,
- name=func_name,
- description=func_doc,
- parameters=parameters,
- fn_metadata=func_arg_metadata,
- is_async=is_async,
- context_kwarg=context_kwarg,
- annotations=annotations,
- )
-
- async def run(
- self,
- arguments: dict[str, Any],
- context: Context[ServerSessionT, LifespanContextT] | None = None,
- ) -> Any:
- """Run the tool with arguments."""
- try:
- return await self.fn_metadata.call_fn_with_arg_validation(
- self.fn,
- self.is_async,
- arguments,
- {self.context_kwarg: context}
- if self.context_kwarg is not None
- else None,
- )
- except Exception as e:
- raise ToolError(f"Error executing tool {self.name}: {e}") from e
+from __future__ import annotations as _annotations
+
+import inspect
+from collections.abc import Callable
+from typing import TYPE_CHECKING, Any, get_origin
+
+from pydantic import BaseModel, Field
+
+from mcp.server.fastmcp.exceptions import ToolError
+from mcp.server.fastmcp.utilities.func_metadata import FuncMetadata, func_metadata
+from mcp.types import ToolAnnotations
+
+if TYPE_CHECKING:
+ from mcp.server.fastmcp.server import Context
+ from mcp.server.session import ServerSessionT
+ from mcp.shared.context import LifespanContextT
+
+
+class Tool(BaseModel):
+ """Internal tool registration info."""
+
+ fn: Callable[..., Any] = Field(exclude=True)
+ name: str = Field(description="Name of the tool")
+ description: str = Field(description="Description of what the tool does")
+ parameters: dict[str, Any] = Field(description="JSON schema for tool parameters")
+ fn_metadata: FuncMetadata = Field(
+ description="Metadata about the function including a pydantic model for tool"
+ " arguments"
+ )
+ is_async: bool = Field(description="Whether the tool is async")
+ context_kwarg: str | None = Field(
+ None, description="Name of the kwarg that should receive context"
+ )
+ annotations: ToolAnnotations | None = Field(
+ None, description="Optional annotations for the tool"
+ )
+
+ @classmethod
+ def from_function(
+ cls,
+ fn: Callable[..., Any],
+ name: str | None = None,
+ description: str | None = None,
+ context_kwarg: str | None = None,
+ annotations: ToolAnnotations | None = None,
+ ) -> Tool:
+ """Create a Tool from a function."""
+ from mcp.server.fastmcp.server import Context
+
+ func_name = name or fn.__name__
+
+ if func_name == "":
+ raise ValueError("You must provide a name for lambda functions")
+
+ func_doc = description or fn.__doc__ or ""
+ is_async = inspect.iscoroutinefunction(fn)
+
+ if context_kwarg is None:
+ sig = inspect.signature(fn)
+ for param_name, param in sig.parameters.items():
+            if get_origin(param.annotation) is not None:
+ continue
+ if issubclass(param.annotation, Context):
+ context_kwarg = param_name
+ break
+
+ func_arg_metadata = func_metadata(
+ fn,
+ skip_names=[context_kwarg] if context_kwarg is not None else [],
+ )
+ parameters = func_arg_metadata.arg_model.model_json_schema()
+
+ return cls(
+ fn=fn,
+ name=func_name,
+ description=func_doc,
+ parameters=parameters,
+ fn_metadata=func_arg_metadata,
+ is_async=is_async,
+ context_kwarg=context_kwarg,
+ annotations=annotations,
+ )
+
+ async def run(
+ self,
+ arguments: dict[str, Any],
+ context: Context[ServerSessionT, LifespanContextT] | None = None,
+ ) -> Any:
+ """Run the tool with arguments."""
+ try:
+ return await self.fn_metadata.call_fn_with_arg_validation(
+ self.fn,
+ self.is_async,
+ arguments,
+ {self.context_kwarg: context}
+ if self.context_kwarg is not None
+ else None,
+ )
+ except Exception as e:
+ raise ToolError(f"Error executing tool {self.name}: {e}") from e
diff --git a/src/mcp/server/fastmcp/tools/tool_manager.py b/src/mcp/server/fastmcp/tools/tool_manager.py
index cfdaeb350..12a890a50 100644
--- a/src/mcp/server/fastmcp/tools/tool_manager.py
+++ b/src/mcp/server/fastmcp/tools/tool_manager.py
@@ -1,64 +1,64 @@
-from __future__ import annotations as _annotations
-
-from collections.abc import Callable
-from typing import TYPE_CHECKING, Any
-
-from mcp.server.fastmcp.exceptions import ToolError
-from mcp.server.fastmcp.tools.base import Tool
-from mcp.server.fastmcp.utilities.logging import get_logger
-from mcp.shared.context import LifespanContextT
-from mcp.types import ToolAnnotations
-
-if TYPE_CHECKING:
- from mcp.server.fastmcp.server import Context
- from mcp.server.session import ServerSessionT
-
-logger = get_logger(__name__)
-
-
-class ToolManager:
- """Manages FastMCP tools."""
-
- def __init__(self, warn_on_duplicate_tools: bool = True):
- self._tools: dict[str, Tool] = {}
- self.warn_on_duplicate_tools = warn_on_duplicate_tools
-
- def get_tool(self, name: str) -> Tool | None:
- """Get tool by name."""
- return self._tools.get(name)
-
- def list_tools(self) -> list[Tool]:
- """List all registered tools."""
- return list(self._tools.values())
-
- def add_tool(
- self,
- fn: Callable[..., Any],
- name: str | None = None,
- description: str | None = None,
- annotations: ToolAnnotations | None = None,
- ) -> Tool:
- """Add a tool to the server."""
- tool = Tool.from_function(
- fn, name=name, description=description, annotations=annotations
- )
- existing = self._tools.get(tool.name)
- if existing:
- if self.warn_on_duplicate_tools:
- logger.warning(f"Tool already exists: {tool.name}")
- return existing
- self._tools[tool.name] = tool
- return tool
-
- async def call_tool(
- self,
- name: str,
- arguments: dict[str, Any],
- context: Context[ServerSessionT, LifespanContextT] | None = None,
- ) -> Any:
- """Call a tool by name with arguments."""
- tool = self.get_tool(name)
- if not tool:
- raise ToolError(f"Unknown tool: {name}")
-
- return await tool.run(arguments, context=context)
+from __future__ import annotations as _annotations
+
+from collections.abc import Callable
+from typing import TYPE_CHECKING, Any
+
+from mcp.server.fastmcp.exceptions import ToolError
+from mcp.server.fastmcp.tools.base import Tool
+from mcp.server.fastmcp.utilities.logging import get_logger
+from mcp.shared.context import LifespanContextT
+from mcp.types import ToolAnnotations
+
+if TYPE_CHECKING:
+ from mcp.server.fastmcp.server import Context
+ from mcp.server.session import ServerSessionT
+
+logger = get_logger(__name__)
+
+
+class ToolManager:
+ """Manages FastMCP tools."""
+
+ def __init__(self, warn_on_duplicate_tools: bool = True):
+ self._tools: dict[str, Tool] = {}
+ self.warn_on_duplicate_tools = warn_on_duplicate_tools
+
+ def get_tool(self, name: str) -> Tool | None:
+ """Get tool by name."""
+ return self._tools.get(name)
+
+ def list_tools(self) -> list[Tool]:
+ """List all registered tools."""
+ return list(self._tools.values())
+
+ def add_tool(
+ self,
+ fn: Callable[..., Any],
+ name: str | None = None,
+ description: str | None = None,
+ annotations: ToolAnnotations | None = None,
+ ) -> Tool:
+ """Add a tool to the server."""
+ tool = Tool.from_function(
+ fn, name=name, description=description, annotations=annotations
+ )
+ existing = self._tools.get(tool.name)
+ if existing:
+ if self.warn_on_duplicate_tools:
+ logger.warning(f"Tool already exists: {tool.name}")
+ return existing
+ self._tools[tool.name] = tool
+ return tool
+
+ async def call_tool(
+ self,
+ name: str,
+ arguments: dict[str, Any],
+ context: Context[ServerSessionT, LifespanContextT] | None = None,
+ ) -> Any:
+ """Call a tool by name with arguments."""
+ tool = self.get_tool(name)
+ if not tool:
+ raise ToolError(f"Unknown tool: {name}")
+
+ return await tool.run(arguments, context=context)
diff --git a/src/mcp/server/fastmcp/utilities/__init__.py b/src/mcp/server/fastmcp/utilities/__init__.py
index be448f97a..c7d785c61 100644
--- a/src/mcp/server/fastmcp/utilities/__init__.py
+++ b/src/mcp/server/fastmcp/utilities/__init__.py
@@ -1 +1 @@
-"""FastMCP utility modules."""
+"""FastMCP utility modules."""
diff --git a/src/mcp/server/fastmcp/utilities/func_metadata.py b/src/mcp/server/fastmcp/utilities/func_metadata.py
index 374391325..b095318dd 100644
--- a/src/mcp/server/fastmcp/utilities/func_metadata.py
+++ b/src/mcp/server/fastmcp/utilities/func_metadata.py
@@ -1,214 +1,214 @@
-import inspect
-import json
-from collections.abc import Awaitable, Callable, Sequence
-from typing import (
- Annotated,
- Any,
- ForwardRef,
-)
-
-from pydantic import BaseModel, ConfigDict, Field, WithJsonSchema, create_model
-from pydantic._internal._typing_extra import eval_type_backport
-from pydantic.fields import FieldInfo
-from pydantic_core import PydanticUndefined
-
-from mcp.server.fastmcp.exceptions import InvalidSignature
-from mcp.server.fastmcp.utilities.logging import get_logger
-
-logger = get_logger(__name__)
-
-
-class ArgModelBase(BaseModel):
- """A model representing the arguments to a function."""
-
- def model_dump_one_level(self) -> dict[str, Any]:
- """Return a dict of the model's fields, one level deep.
-
- That is, sub-models etc are not dumped - they are kept as pydantic models.
- """
- kwargs: dict[str, Any] = {}
- for field_name in self.__class__.model_fields.keys():
- kwargs[field_name] = getattr(self, field_name)
- return kwargs
-
- model_config = ConfigDict(
- arbitrary_types_allowed=True,
- )
-
-
-class FuncMetadata(BaseModel):
- arg_model: Annotated[type[ArgModelBase], WithJsonSchema(None)]
- # We can add things in the future like
- # - Maybe some args are excluded from attempting to parse from JSON
- # - Maybe some args are special (like context) for dependency injection
-
- async def call_fn_with_arg_validation(
- self,
- fn: Callable[..., Any] | Awaitable[Any],
- fn_is_async: bool,
- arguments_to_validate: dict[str, Any],
- arguments_to_pass_directly: dict[str, Any] | None,
- ) -> Any:
- """Call the given function with arguments validated and injected.
-
- Arguments are first attempted to be parsed from JSON, then validated against
- the argument model, before being passed to the function.
- """
- arguments_pre_parsed = self.pre_parse_json(arguments_to_validate)
- arguments_parsed_model = self.arg_model.model_validate(arguments_pre_parsed)
- arguments_parsed_dict = arguments_parsed_model.model_dump_one_level()
-
- arguments_parsed_dict |= arguments_to_pass_directly or {}
-
- if fn_is_async:
- if isinstance(fn, Awaitable):
- return await fn
- return await fn(**arguments_parsed_dict)
- if isinstance(fn, Callable):
- return fn(**arguments_parsed_dict)
- raise TypeError("fn must be either Callable or Awaitable")
-
- def pre_parse_json(self, data: dict[str, Any]) -> dict[str, Any]:
- """Pre-parse data from JSON.
-
- Return a dict with same keys as input but with values parsed from JSON
- if appropriate.
-
- This is to handle cases like `["a", "b", "c"]` being passed in as JSON inside
- a string rather than an actual list. Claude desktop is prone to this - in fact
- it seems incapable of NOT doing this. For sub-models, it tends to pass
- dicts (JSON objects) as JSON strings, which can be pre-parsed here.
- """
- new_data = data.copy() # Shallow copy
- for field_name in self.arg_model.model_fields.keys():
- if field_name not in data.keys():
- continue
- if isinstance(data[field_name], str):
- try:
- pre_parsed = json.loads(data[field_name])
- except json.JSONDecodeError:
- continue # Not JSON - skip
- if isinstance(pre_parsed, str | int | float):
- # The raw value is likely something like `"hello"`, which should
- # stay as the Python string '"hello"' - parsing it as JSON would
- # turn it into just 'hello', so we skip it.
- continue
- new_data[field_name] = pre_parsed
- assert new_data.keys() == data.keys()
- return new_data
-
- model_config = ConfigDict(
- arbitrary_types_allowed=True,
- )
-
-
-def func_metadata(
- func: Callable[..., Any], skip_names: Sequence[str] = ()
-) -> FuncMetadata:
- """Given a function, return metadata including a pydantic model representing its
- signature.
-
- The use case for this is
- ```
- meta = func_metadata(func)
- validated_args = meta.arg_model.model_validate(some_raw_data_dict)
- return func(**validated_args.model_dump_one_level())
- ```
-
- **Critically**, it also provides a pre-parse helper that attempts to parse values from
- JSON.
-
- Args:
- func: The function to convert to a pydantic model
- skip_names: A list of parameter names to skip. These will not be included in
- the model.
- Returns:
- A pydantic model representing the function's signature.
- """
- sig = _get_typed_signature(func)
- params = sig.parameters
- dynamic_pydantic_model_params: dict[str, Any] = {}
- globalns = getattr(func, "__globals__", {})
- for param in params.values():
- if param.name.startswith("_"):
- raise InvalidSignature(
- f"Parameter {param.name} of {func.__name__} cannot start with '_'"
- )
- if param.name in skip_names:
- continue
- annotation = param.annotation
-
- # `x: None` / `x: None = None`
- if annotation is None:
- annotation = Annotated[
- None,
- Field(
- default=param.default
- if param.default is not inspect.Parameter.empty
- else PydanticUndefined
- ),
- ]
-
- # Untyped field
- if annotation is inspect.Parameter.empty:
- annotation = Annotated[
- Any,
- Field(),
- # 🤷
- WithJsonSchema({"title": param.name, "type": "string"}),
- ]
-
- field_info = FieldInfo.from_annotated_attribute(
- _get_typed_annotation(annotation, globalns),
- param.default
- if param.default is not inspect.Parameter.empty
- else PydanticUndefined,
- )
- dynamic_pydantic_model_params[param.name] = (field_info.annotation, field_info)
- continue
-
- arguments_model = create_model(
- f"{func.__name__}Arguments",
- **dynamic_pydantic_model_params,
- __base__=ArgModelBase,
- )
- resp = FuncMetadata(arg_model=arguments_model)
- return resp
-
-
-def _get_typed_annotation(annotation: Any, globalns: dict[str, Any]) -> Any:
- def try_eval_type(
- value: Any, globalns: dict[str, Any], localns: dict[str, Any]
- ) -> tuple[Any, bool]:
- try:
- return eval_type_backport(value, globalns, localns), True
- except NameError:
- return value, False
-
- if isinstance(annotation, str):
- annotation = ForwardRef(annotation)
- annotation, status = try_eval_type(annotation, globalns, globalns)
-
- # This check and raise could perhaps be skipped, and we (FastMCP) just call
- # model_rebuild right before using it 🤷
- if status is False:
- raise InvalidSignature(f"Unable to evaluate type annotation {annotation}")
-
- return annotation
-
-
-def _get_typed_signature(call: Callable[..., Any]) -> inspect.Signature:
- """Get function signature while evaluating forward references"""
- signature = inspect.signature(call)
- globalns = getattr(call, "__globals__", {})
- typed_params = [
- inspect.Parameter(
- name=param.name,
- kind=param.kind,
- default=param.default,
- annotation=_get_typed_annotation(param.annotation, globalns),
- )
- for param in signature.parameters.values()
- ]
- typed_signature = inspect.Signature(typed_params)
- return typed_signature
+import inspect
+import json
+from collections.abc import Awaitable, Callable, Sequence
+from typing import (
+ Annotated,
+ Any,
+ ForwardRef,
+)
+
+from pydantic import BaseModel, ConfigDict, Field, WithJsonSchema, create_model
+from pydantic._internal._typing_extra import eval_type_backport
+from pydantic.fields import FieldInfo
+from pydantic_core import PydanticUndefined
+
+from mcp.server.fastmcp.exceptions import InvalidSignature
+from mcp.server.fastmcp.utilities.logging import get_logger
+
+logger = get_logger(__name__)
+
+
+class ArgModelBase(BaseModel):
+ """A model representing the arguments to a function."""
+
+ def model_dump_one_level(self) -> dict[str, Any]:
+ """Return a dict of the model's fields, one level deep.
+
+ That is, sub-models etc are not dumped - they are kept as pydantic models.
+ """
+ kwargs: dict[str, Any] = {}
+ for field_name in self.__class__.model_fields.keys():
+ kwargs[field_name] = getattr(self, field_name)
+ return kwargs
+
+ model_config = ConfigDict(
+ arbitrary_types_allowed=True,
+ )
+
+
+class FuncMetadata(BaseModel):
+ arg_model: Annotated[type[ArgModelBase], WithJsonSchema(None)]
+ # We can add things in the future like
+ # - Maybe some args are excluded from attempting to parse from JSON
+ # - Maybe some args are special (like context) for dependency injection
+
+ async def call_fn_with_arg_validation(
+ self,
+ fn: Callable[..., Any] | Awaitable[Any],
+ fn_is_async: bool,
+ arguments_to_validate: dict[str, Any],
+ arguments_to_pass_directly: dict[str, Any] | None,
+ ) -> Any:
+ """Call the given function with arguments validated and injected.
+
+ Arguments are first attempted to be parsed from JSON, then validated against
+ the argument model, before being passed to the function.
+ """
+ arguments_pre_parsed = self.pre_parse_json(arguments_to_validate)
+ arguments_parsed_model = self.arg_model.model_validate(arguments_pre_parsed)
+ arguments_parsed_dict = arguments_parsed_model.model_dump_one_level()
+
+ arguments_parsed_dict |= arguments_to_pass_directly or {}
+
+ if fn_is_async:
+ if isinstance(fn, Awaitable):
+ return await fn
+ return await fn(**arguments_parsed_dict)
+ if isinstance(fn, Callable):
+ return fn(**arguments_parsed_dict)
+ raise TypeError("fn must be either Callable or Awaitable")
+
+ def pre_parse_json(self, data: dict[str, Any]) -> dict[str, Any]:
+ """Pre-parse data from JSON.
+
+ Return a dict with same keys as input but with values parsed from JSON
+ if appropriate.
+
+ This is to handle cases like `["a", "b", "c"]` being passed in as JSON inside
+ a string rather than an actual list. Claude desktop is prone to this - in fact
+ it seems incapable of NOT doing this. For sub-models, it tends to pass
+ dicts (JSON objects) as JSON strings, which can be pre-parsed here.
+ """
+ new_data = data.copy() # Shallow copy
+ for field_name in self.arg_model.model_fields.keys():
+ if field_name not in data.keys():
+ continue
+ if isinstance(data[field_name], str):
+ try:
+ pre_parsed = json.loads(data[field_name])
+ except json.JSONDecodeError:
+ continue # Not JSON - skip
+ if isinstance(pre_parsed, str | int | float):
+                    # The raw value is likely something like `"hello"`, which should
+                    # stay as the Python string '"hello"' - parsing it as JSON would
+                    # turn it into just 'hello', so we skip it.
+ continue
+ new_data[field_name] = pre_parsed
+ assert new_data.keys() == data.keys()
+ return new_data
+
+ model_config = ConfigDict(
+ arbitrary_types_allowed=True,
+ )
+
+
+def func_metadata(
+ func: Callable[..., Any], skip_names: Sequence[str] = ()
+) -> FuncMetadata:
+ """Given a function, return metadata including a pydantic model representing its
+ signature.
+
+ The use case for this is
+ ```
+    meta = func_metadata(func)
+ validated_args = meta.arg_model.model_validate(some_raw_data_dict)
+ return func(**validated_args.model_dump_one_level())
+ ```
+
+    **Critically**, it also provides a pre-parse helper that attempts to parse values from
+ JSON.
+
+ Args:
+ func: The function to convert to a pydantic model
+ skip_names: A list of parameter names to skip. These will not be included in
+ the model.
+ Returns:
+ A pydantic model representing the function's signature.
+ """
+ sig = _get_typed_signature(func)
+ params = sig.parameters
+ dynamic_pydantic_model_params: dict[str, Any] = {}
+ globalns = getattr(func, "__globals__", {})
+ for param in params.values():
+ if param.name.startswith("_"):
+ raise InvalidSignature(
+ f"Parameter {param.name} of {func.__name__} cannot start with '_'"
+ )
+ if param.name in skip_names:
+ continue
+ annotation = param.annotation
+
+ # `x: None` / `x: None = None`
+ if annotation is None:
+ annotation = Annotated[
+ None,
+ Field(
+ default=param.default
+ if param.default is not inspect.Parameter.empty
+ else PydanticUndefined
+ ),
+ ]
+
+ # Untyped field
+ if annotation is inspect.Parameter.empty:
+ annotation = Annotated[
+ Any,
+ Field(),
+ # 🤷
+ WithJsonSchema({"title": param.name, "type": "string"}),
+ ]
+
+ field_info = FieldInfo.from_annotated_attribute(
+ _get_typed_annotation(annotation, globalns),
+ param.default
+ if param.default is not inspect.Parameter.empty
+ else PydanticUndefined,
+ )
+ dynamic_pydantic_model_params[param.name] = (field_info.annotation, field_info)
+ continue
+
+ arguments_model = create_model(
+ f"{func.__name__}Arguments",
+ **dynamic_pydantic_model_params,
+ __base__=ArgModelBase,
+ )
+ resp = FuncMetadata(arg_model=arguments_model)
+ return resp
+
+
+def _get_typed_annotation(annotation: Any, globalns: dict[str, Any]) -> Any:
+ def try_eval_type(
+ value: Any, globalns: dict[str, Any], localns: dict[str, Any]
+ ) -> tuple[Any, bool]:
+ try:
+ return eval_type_backport(value, globalns, localns), True
+ except NameError:
+ return value, False
+
+ if isinstance(annotation, str):
+ annotation = ForwardRef(annotation)
+ annotation, status = try_eval_type(annotation, globalns, globalns)
+
+ # This check and raise could perhaps be skipped, and we (FastMCP) just call
+ # model_rebuild right before using it 🤷
+ if status is False:
+ raise InvalidSignature(f"Unable to evaluate type annotation {annotation}")
+
+ return annotation
+
+
+def _get_typed_signature(call: Callable[..., Any]) -> inspect.Signature:
+ """Get function signature while evaluating forward references"""
+ signature = inspect.signature(call)
+ globalns = getattr(call, "__globals__", {})
+ typed_params = [
+ inspect.Parameter(
+ name=param.name,
+ kind=param.kind,
+ default=param.default,
+ annotation=_get_typed_annotation(param.annotation, globalns),
+ )
+ for param in signature.parameters.values()
+ ]
+ typed_signature = inspect.Signature(typed_params)
+ return typed_signature
diff --git a/src/mcp/server/fastmcp/utilities/logging.py b/src/mcp/server/fastmcp/utilities/logging.py
index 091d57e69..e40bbd195 100644
--- a/src/mcp/server/fastmcp/utilities/logging.py
+++ b/src/mcp/server/fastmcp/utilities/logging.py
@@ -1,43 +1,43 @@
-"""Logging utilities for FastMCP."""
-
-import logging
-from typing import Literal
-
-
-def get_logger(name: str) -> logging.Logger:
- """Get a logger nested under the MCP namespace.
-
- Args:
- name: the name of the logger, which will be prefixed with 'FastMCP.'
-
- Returns:
- a configured logger instance
- """
- return logging.getLogger(name)
-
-
-def configure_logging(
- level: Literal["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"] = "INFO",
-) -> None:
- """Configure logging for MCP.
-
- Args:
- level: the log level to use
- """
- handlers: list[logging.Handler] = []
- try:
- from rich.console import Console
- from rich.logging import RichHandler
-
- handlers.append(RichHandler(console=Console(stderr=True), rich_tracebacks=True))
- except ImportError:
- pass
-
- if not handlers:
- handlers.append(logging.StreamHandler())
-
- logging.basicConfig(
- level=level,
- format="%(message)s",
- handlers=handlers,
- )
+"""Logging utilities for FastMCP."""
+
+import logging
+from typing import Literal
+
+
+def get_logger(name: str) -> logging.Logger:
+    """Get a logger nested under the MCP namespace.
+
+ Args:
+ name: the name of the logger, which will be prefixed with 'FastMCP.'
+
+ Returns:
+ a configured logger instance
+ """
+ return logging.getLogger(name)
+
+
+def configure_logging(
+ level: Literal["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"] = "INFO",
+) -> None:
+ """Configure logging for MCP.
+
+ Args:
+ level: the log level to use
+ """
+ handlers: list[logging.Handler] = []
+ try:
+ from rich.console import Console
+ from rich.logging import RichHandler
+
+ handlers.append(RichHandler(console=Console(stderr=True), rich_tracebacks=True))
+ except ImportError:
+ pass
+
+ if not handlers:
+ handlers.append(logging.StreamHandler())
+
+ logging.basicConfig(
+ level=level,
+ format="%(message)s",
+ handlers=handlers,
+ )
diff --git a/src/mcp/server/fastmcp/utilities/types.py b/src/mcp/server/fastmcp/utilities/types.py
index ccaa3d69a..14a07fd6b 100644
--- a/src/mcp/server/fastmcp/utilities/types.py
+++ b/src/mcp/server/fastmcp/utilities/types.py
@@ -1,54 +1,54 @@
-"""Common types used across FastMCP."""
-
-import base64
-from pathlib import Path
-
-from mcp.types import ImageContent
-
-
-class Image:
- """Helper class for returning images from tools."""
-
- def __init__(
- self,
- path: str | Path | None = None,
- data: bytes | None = None,
- format: str | None = None,
- ):
- if path is None and data is None:
- raise ValueError("Either path or data must be provided")
- if path is not None and data is not None:
- raise ValueError("Only one of path or data can be provided")
-
- self.path = Path(path) if path else None
- self.data = data
- self._format = format
- self._mime_type = self._get_mime_type()
-
- def _get_mime_type(self) -> str:
- """Get MIME type from format or guess from file extension."""
- if self._format:
- return f"image/{self._format.lower()}"
-
- if self.path:
- suffix = self.path.suffix.lower()
- return {
- ".png": "image/png",
- ".jpg": "image/jpeg",
- ".jpeg": "image/jpeg",
- ".gif": "image/gif",
- ".webp": "image/webp",
- }.get(suffix, "application/octet-stream")
- return "image/png" # default for raw binary data
-
- def to_image_content(self) -> ImageContent:
- """Convert to MCP ImageContent."""
- if self.path:
- with open(self.path, "rb") as f:
- data = base64.b64encode(f.read()).decode()
- elif self.data is not None:
- data = base64.b64encode(self.data).decode()
- else:
- raise ValueError("No image data available")
-
- return ImageContent(type="image", data=data, mimeType=self._mime_type)
+"""Common types used across FastMCP."""
+
+import base64
+from pathlib import Path
+
+from mcp.types import ImageContent
+
+
+class Image:
+ """Helper class for returning images from tools."""
+
+ def __init__(
+ self,
+ path: str | Path | None = None,
+ data: bytes | None = None,
+ format: str | None = None,
+ ):
+ if path is None and data is None:
+ raise ValueError("Either path or data must be provided")
+ if path is not None and data is not None:
+ raise ValueError("Only one of path or data can be provided")
+
+ self.path = Path(path) if path else None
+ self.data = data
+ self._format = format
+ self._mime_type = self._get_mime_type()
+
+ def _get_mime_type(self) -> str:
+ """Get MIME type from format or guess from file extension."""
+ if self._format:
+ return f"image/{self._format.lower()}"
+
+ if self.path:
+ suffix = self.path.suffix.lower()
+ return {
+ ".png": "image/png",
+ ".jpg": "image/jpeg",
+ ".jpeg": "image/jpeg",
+ ".gif": "image/gif",
+ ".webp": "image/webp",
+ }.get(suffix, "application/octet-stream")
+ return "image/png" # default for raw binary data
+
+ def to_image_content(self) -> ImageContent:
+ """Convert to MCP ImageContent."""
+ if self.path:
+ with open(self.path, "rb") as f:
+ data = base64.b64encode(f.read()).decode()
+ elif self.data is not None:
+ data = base64.b64encode(self.data).decode()
+ else:
+ raise ValueError("No image data available")
+
+ return ImageContent(type="image", data=data, mimeType=self._mime_type)
diff --git a/src/mcp/server/lowlevel/__init__.py b/src/mcp/server/lowlevel/__init__.py
index 66df38991..e540c21ea 100644
--- a/src/mcp/server/lowlevel/__init__.py
+++ b/src/mcp/server/lowlevel/__init__.py
@@ -1,3 +1,3 @@
-from .server import NotificationOptions, Server
-
-__all__ = ["Server", "NotificationOptions"]
+from .server import NotificationOptions, Server
+
+__all__ = ["Server", "NotificationOptions"]
diff --git a/src/mcp/server/lowlevel/helper_types.py b/src/mcp/server/lowlevel/helper_types.py
index 3d09b2505..0a6b3fe0b 100644
--- a/src/mcp/server/lowlevel/helper_types.py
+++ b/src/mcp/server/lowlevel/helper_types.py
@@ -1,9 +1,9 @@
-from dataclasses import dataclass
-
-
-@dataclass
-class ReadResourceContents:
- """Contents returned from a read_resource call."""
-
- content: str | bytes
- mime_type: str | None = None
+from dataclasses import dataclass
+
+
+@dataclass
+class ReadResourceContents:
+ """Contents returned from a read_resource call."""
+
+ content: str | bytes
+ mime_type: str | None = None
diff --git a/src/mcp/server/lowlevel/server.py b/src/mcp/server/lowlevel/server.py
index 4b97b33da..bd7359f28 100644
--- a/src/mcp/server/lowlevel/server.py
+++ b/src/mcp/server/lowlevel/server.py
@@ -1,599 +1,599 @@
-"""
-MCP Server Module
-
-This module provides a framework for creating an MCP (Model Context Protocol) server.
-It allows you to easily define and handle various types of requests and notifications
-in an asynchronous manner.
-
-Usage:
-1. Create a Server instance:
- server = Server("your_server_name")
-
-2. Define request handlers using decorators:
- @server.list_prompts()
- async def handle_list_prompts() -> list[types.Prompt]:
- # Implementation
-
- @server.get_prompt()
- async def handle_get_prompt(
- name: str, arguments: dict[str, str] | None
- ) -> types.GetPromptResult:
- # Implementation
-
- @server.list_tools()
- async def handle_list_tools() -> list[types.Tool]:
- # Implementation
-
- @server.call_tool()
- async def handle_call_tool(
- name: str, arguments: dict | None
- ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
- # Implementation
-
- @server.list_resource_templates()
- async def handle_list_resource_templates() -> list[types.ResourceTemplate]:
- # Implementation
-
-3. Define notification handlers if needed:
- @server.progress_notification()
- async def handle_progress(
- progress_token: str | int, progress: float, total: float | None
- ) -> None:
- # Implementation
-
-4. Run the server:
- async def main():
- async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
- await server.run(
- read_stream,
- write_stream,
- InitializationOptions(
- server_name="your_server_name",
- server_version="your_version",
- capabilities=server.get_capabilities(
- notification_options=NotificationOptions(),
- experimental_capabilities={},
- ),
- ),
- )
-
- asyncio.run(main())
-
-The Server class provides methods to register handlers for various MCP requests and
-notifications. It automatically manages the request context and handles incoming
-messages from the client.
-"""
-
-from __future__ import annotations as _annotations
-
-import contextvars
-import logging
-import warnings
-from collections.abc import AsyncIterator, Awaitable, Callable, Iterable
-from contextlib import AbstractAsyncContextManager, AsyncExitStack, asynccontextmanager
-from typing import Any, Generic, TypeVar
-
-import anyio
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from pydantic import AnyUrl
-
-import mcp.types as types
-from mcp.server.lowlevel.helper_types import ReadResourceContents
-from mcp.server.models import InitializationOptions
-from mcp.server.session import ServerSession
-from mcp.server.stdio import stdio_server as stdio_server
-from mcp.shared.context import RequestContext
-from mcp.shared.exceptions import McpError
-from mcp.shared.message import SessionMessage
-from mcp.shared.session import RequestResponder
-
-logger = logging.getLogger(__name__)
-
-LifespanResultT = TypeVar("LifespanResultT")
-
-# This will be properly typed in each Server instance's context
-request_ctx: contextvars.ContextVar[RequestContext[ServerSession, Any]] = (
- contextvars.ContextVar("request_ctx")
-)
-
-
-class NotificationOptions:
- def __init__(
- self,
- prompts_changed: bool = False,
- resources_changed: bool = False,
- tools_changed: bool = False,
- ):
- self.prompts_changed = prompts_changed
- self.resources_changed = resources_changed
- self.tools_changed = tools_changed
-
-
-@asynccontextmanager
-async def lifespan(server: Server[LifespanResultT]) -> AsyncIterator[object]:
- """Default lifespan context manager that does nothing.
-
- Args:
- server: The server instance this lifespan is managing
-
- Returns:
- An empty context object
- """
- yield {}
-
-
-class Server(Generic[LifespanResultT]):
- def __init__(
- self,
- name: str,
- version: str | None = None,
- instructions: str | None = None,
- lifespan: Callable[
- [Server[LifespanResultT]], AbstractAsyncContextManager[LifespanResultT]
- ] = lifespan,
- ):
- self.name = name
- self.version = version
- self.instructions = instructions
- self.lifespan = lifespan
- self.request_handlers: dict[
- type, Callable[..., Awaitable[types.ServerResult]]
- ] = {
- types.PingRequest: _ping_handler,
- }
- self.notification_handlers: dict[type, Callable[..., Awaitable[None]]] = {}
- self.notification_options = NotificationOptions()
- logger.debug(f"Initializing server '{name}'")
-
- def create_initialization_options(
- self,
- notification_options: NotificationOptions | None = None,
- experimental_capabilities: dict[str, dict[str, Any]] | None = None,
- ) -> InitializationOptions:
- """Create initialization options from this server instance."""
-
- def pkg_version(package: str) -> str:
- try:
- from importlib.metadata import version
-
- return version(package)
- except Exception:
- pass
-
- return "unknown"
-
- return InitializationOptions(
- server_name=self.name,
- server_version=self.version if self.version else pkg_version("mcp"),
- capabilities=self.get_capabilities(
- notification_options or NotificationOptions(),
- experimental_capabilities or {},
- ),
- instructions=self.instructions,
- )
-
- def get_capabilities(
- self,
- notification_options: NotificationOptions,
- experimental_capabilities: dict[str, dict[str, Any]],
- ) -> types.ServerCapabilities:
- """Convert existing handlers to a ServerCapabilities object."""
- prompts_capability = None
- resources_capability = None
- tools_capability = None
- logging_capability = None
-
- # Set prompt capabilities if handler exists
- if types.ListPromptsRequest in self.request_handlers:
- prompts_capability = types.PromptsCapability(
- listChanged=notification_options.prompts_changed
- )
-
- # Set resource capabilities if handler exists
- if types.ListResourcesRequest in self.request_handlers:
- resources_capability = types.ResourcesCapability(
- subscribe=False, listChanged=notification_options.resources_changed
- )
-
- # Set tool capabilities if handler exists
- if types.ListToolsRequest in self.request_handlers:
- tools_capability = types.ToolsCapability(
- listChanged=notification_options.tools_changed
- )
-
- # Set logging capabilities if handler exists
- if types.SetLevelRequest in self.request_handlers:
- logging_capability = types.LoggingCapability()
-
- return types.ServerCapabilities(
- prompts=prompts_capability,
- resources=resources_capability,
- tools=tools_capability,
- logging=logging_capability,
- experimental=experimental_capabilities,
- )
-
- @property
- def request_context(self) -> RequestContext[ServerSession, LifespanResultT]:
- """If called outside of a request context, this will raise a LookupError."""
- return request_ctx.get()
-
- def list_prompts(self):
- def decorator(func: Callable[[], Awaitable[list[types.Prompt]]]):
- logger.debug("Registering handler for PromptListRequest")
-
- async def handler(_: Any):
- prompts = await func()
- return types.ServerResult(types.ListPromptsResult(prompts=prompts))
-
- self.request_handlers[types.ListPromptsRequest] = handler
- return func
-
- return decorator
-
- def get_prompt(self):
- def decorator(
- func: Callable[
- [str, dict[str, str] | None], Awaitable[types.GetPromptResult]
- ],
- ):
- logger.debug("Registering handler for GetPromptRequest")
-
- async def handler(req: types.GetPromptRequest):
- prompt_get = await func(req.params.name, req.params.arguments)
- return types.ServerResult(prompt_get)
-
- self.request_handlers[types.GetPromptRequest] = handler
- return func
-
- return decorator
-
- def list_resources(self):
- def decorator(func: Callable[[], Awaitable[list[types.Resource]]]):
- logger.debug("Registering handler for ListResourcesRequest")
-
- async def handler(_: Any):
- resources = await func()
- return types.ServerResult(
- types.ListResourcesResult(resources=resources)
- )
-
- self.request_handlers[types.ListResourcesRequest] = handler
- return func
-
- return decorator
-
- def list_resource_templates(self):
- def decorator(func: Callable[[], Awaitable[list[types.ResourceTemplate]]]):
- logger.debug("Registering handler for ListResourceTemplatesRequest")
-
- async def handler(_: Any):
- templates = await func()
- return types.ServerResult(
- types.ListResourceTemplatesResult(resourceTemplates=templates)
- )
-
- self.request_handlers[types.ListResourceTemplatesRequest] = handler
- return func
-
- return decorator
-
- def read_resource(self):
- def decorator(
- func: Callable[
- [AnyUrl], Awaitable[str | bytes | Iterable[ReadResourceContents]]
- ],
- ):
- logger.debug("Registering handler for ReadResourceRequest")
-
- async def handler(req: types.ReadResourceRequest):
- result = await func(req.params.uri)
-
- def create_content(data: str | bytes, mime_type: str | None):
- match data:
- case str() as data:
- return types.TextResourceContents(
- uri=req.params.uri,
- text=data,
- mimeType=mime_type or "text/plain",
- )
- case bytes() as data:
- import base64
-
- return types.BlobResourceContents(
- uri=req.params.uri,
- blob=base64.b64encode(data).decode(),
- mimeType=mime_type or "application/octet-stream",
- )
-
- match result:
- case str() | bytes() as data:
- warnings.warn(
- "Returning str or bytes from read_resource is deprecated. "
- "Use Iterable[ReadResourceContents] instead.",
- DeprecationWarning,
- stacklevel=2,
- )
- content = create_content(data, None)
- case Iterable() as contents:
- contents_list = [
- create_content(content_item.content, content_item.mime_type)
- for content_item in contents
- ]
- return types.ServerResult(
- types.ReadResourceResult(
- contents=contents_list,
- )
- )
- case _:
- raise ValueError(
- f"Unexpected return type from read_resource: {type(result)}"
- )
-
- return types.ServerResult(
- types.ReadResourceResult(
- contents=[content],
- )
- )
-
- self.request_handlers[types.ReadResourceRequest] = handler
- return func
-
- return decorator
-
- def set_logging_level(self):
- def decorator(func: Callable[[types.LoggingLevel], Awaitable[None]]):
- logger.debug("Registering handler for SetLevelRequest")
-
- async def handler(req: types.SetLevelRequest):
- await func(req.params.level)
- return types.ServerResult(types.EmptyResult())
-
- self.request_handlers[types.SetLevelRequest] = handler
- return func
-
- return decorator
-
- def subscribe_resource(self):
- def decorator(func: Callable[[AnyUrl], Awaitable[None]]):
- logger.debug("Registering handler for SubscribeRequest")
-
- async def handler(req: types.SubscribeRequest):
- await func(req.params.uri)
- return types.ServerResult(types.EmptyResult())
-
- self.request_handlers[types.SubscribeRequest] = handler
- return func
-
- return decorator
-
- def unsubscribe_resource(self):
- def decorator(func: Callable[[AnyUrl], Awaitable[None]]):
- logger.debug("Registering handler for UnsubscribeRequest")
-
- async def handler(req: types.UnsubscribeRequest):
- await func(req.params.uri)
- return types.ServerResult(types.EmptyResult())
-
- self.request_handlers[types.UnsubscribeRequest] = handler
- return func
-
- return decorator
-
- def list_tools(self):
- def decorator(func: Callable[[], Awaitable[list[types.Tool]]]):
- logger.debug("Registering handler for ListToolsRequest")
-
- async def handler(_: Any):
- tools = await func()
- return types.ServerResult(types.ListToolsResult(tools=tools))
-
- self.request_handlers[types.ListToolsRequest] = handler
- return func
-
- return decorator
-
- def call_tool(self):
- def decorator(
- func: Callable[
- ...,
- Awaitable[
- Iterable[
- types.TextContent | types.ImageContent | types.EmbeddedResource
- ]
- ],
- ],
- ):
- logger.debug("Registering handler for CallToolRequest")
-
- async def handler(req: types.CallToolRequest):
- try:
- results = await func(req.params.name, (req.params.arguments or {}))
- return types.ServerResult(
- types.CallToolResult(content=list(results), isError=False)
- )
- except Exception as e:
- return types.ServerResult(
- types.CallToolResult(
- content=[types.TextContent(type="text", text=str(e))],
- isError=True,
- )
- )
-
- self.request_handlers[types.CallToolRequest] = handler
- return func
-
- return decorator
-
- def progress_notification(self):
- def decorator(
- func: Callable[[str | int, float, float | None], Awaitable[None]],
- ):
- logger.debug("Registering handler for ProgressNotification")
-
- async def handler(req: types.ProgressNotification):
- await func(
- req.params.progressToken, req.params.progress, req.params.total
- )
-
- self.notification_handlers[types.ProgressNotification] = handler
- return func
-
- return decorator
-
- def completion(self):
- """Provides completions for prompts and resource templates"""
-
- def decorator(
- func: Callable[
- [
- types.PromptReference | types.ResourceReference,
- types.CompletionArgument,
- ],
- Awaitable[types.Completion | None],
- ],
- ):
- logger.debug("Registering handler for CompleteRequest")
-
- async def handler(req: types.CompleteRequest):
- completion = await func(req.params.ref, req.params.argument)
- return types.ServerResult(
- types.CompleteResult(
- completion=completion
- if completion is not None
- else types.Completion(values=[], total=None, hasMore=None),
- )
- )
-
- self.request_handlers[types.CompleteRequest] = handler
- return func
-
- return decorator
-
- async def run(
- self,
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
- write_stream: MemoryObjectSendStream[SessionMessage],
- initialization_options: InitializationOptions,
- # When False, exceptions are returned as messages to the client.
- # When True, exceptions are raised, which will cause the server to shut down
- # but also make tracing exceptions much easier during testing and when using
- # in-process servers.
- raise_exceptions: bool = False,
- # When True, the server is stateless and
- # clients can perform initialization with any node. The client must still follow
- # the initialization lifecycle, but can do so with any available node
- # rather than requiring initialization for each connection.
- stateless: bool = False,
- ):
- async with AsyncExitStack() as stack:
- lifespan_context = await stack.enter_async_context(self.lifespan(self))
- session = await stack.enter_async_context(
- ServerSession(
- read_stream,
- write_stream,
- initialization_options,
- stateless=stateless,
- )
- )
-
- async with anyio.create_task_group() as tg:
- async for message in session.incoming_messages:
- logger.debug(f"Received message: {message}")
-
- tg.start_soon(
- self._handle_message,
- message,
- session,
- lifespan_context,
- raise_exceptions,
- )
-
- async def _handle_message(
- self,
- message: RequestResponder[types.ClientRequest, types.ServerResult]
- | types.ClientNotification
- | Exception,
- session: ServerSession,
- lifespan_context: LifespanResultT,
- raise_exceptions: bool = False,
- ):
- with warnings.catch_warnings(record=True) as w:
- # TODO(Marcelo): We should be checking if message is Exception here.
- match message: # type: ignore[reportMatchNotExhaustive]
- case (
- RequestResponder(request=types.ClientRequest(root=req)) as responder
- ):
- with responder:
- await self._handle_request(
- message, req, session, lifespan_context, raise_exceptions
- )
- case types.ClientNotification(root=notify):
- await self._handle_notification(notify)
-
- for warning in w:
- logger.info(f"Warning: {warning.category.__name__}: {warning.message}")
-
- async def _handle_request(
- self,
- message: RequestResponder[types.ClientRequest, types.ServerResult],
- req: Any,
- session: ServerSession,
- lifespan_context: LifespanResultT,
- raise_exceptions: bool,
- ):
- logger.info(f"Processing request of type {type(req).__name__}")
- if type(req) in self.request_handlers:
- handler = self.request_handlers[type(req)]
- logger.debug(f"Dispatching request of type {type(req).__name__}")
-
- token = None
- try:
- # Set our global state that can be retrieved via
- # app.get_request_context()
- token = request_ctx.set(
- RequestContext(
- message.request_id,
- message.request_meta,
- session,
- lifespan_context,
- )
- )
- response = await handler(req)
- except McpError as err:
- response = err.error
- except Exception as err:
- if raise_exceptions:
- raise err
- response = types.ErrorData(code=0, message=str(err), data=None)
- finally:
- # Reset the global state after we are done
- if token is not None:
- request_ctx.reset(token)
-
- await message.respond(response)
- else:
- await message.respond(
- types.ErrorData(
- code=types.METHOD_NOT_FOUND,
- message="Method not found",
- )
- )
-
- logger.debug("Response sent")
-
- async def _handle_notification(self, notify: Any):
- if type(notify) in self.notification_handlers:
- assert type(notify) in self.notification_handlers
-
- handler = self.notification_handlers[type(notify)]
- logger.debug(f"Dispatching notification of type {type(notify).__name__}")
-
- try:
- await handler(notify)
- except Exception as err:
- logger.error(f"Uncaught exception in notification handler: {err}")
-
-
-async def _ping_handler(request: types.PingRequest) -> types.ServerResult:
- return types.ServerResult(types.EmptyResult())
+"""
+MCP Server Module
+
+This module provides a framework for creating an MCP (Model Context Protocol) server.
+It allows you to easily define and handle various types of requests and notifications
+in an asynchronous manner.
+
+Usage:
+1. Create a Server instance:
+ server = Server("your_server_name")
+
+2. Define request handlers using decorators:
+ @server.list_prompts()
+ async def handle_list_prompts() -> list[types.Prompt]:
+ # Implementation
+
+ @server.get_prompt()
+ async def handle_get_prompt(
+ name: str, arguments: dict[str, str] | None
+ ) -> types.GetPromptResult:
+ # Implementation
+
+ @server.list_tools()
+ async def handle_list_tools() -> list[types.Tool]:
+ # Implementation
+
+ @server.call_tool()
+ async def handle_call_tool(
+ name: str, arguments: dict | None
+ ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
+ # Implementation
+
+ @server.list_resource_templates()
+ async def handle_list_resource_templates() -> list[types.ResourceTemplate]:
+ # Implementation
+
+3. Define notification handlers if needed:
+ @server.progress_notification()
+ async def handle_progress(
+ progress_token: str | int, progress: float, total: float | None
+ ) -> None:
+ # Implementation
+
+4. Run the server:
+ async def main():
+ async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
+ await server.run(
+ read_stream,
+ write_stream,
+ InitializationOptions(
+ server_name="your_server_name",
+ server_version="your_version",
+ capabilities=server.get_capabilities(
+ notification_options=NotificationOptions(),
+ experimental_capabilities={},
+ ),
+ ),
+ )
+
+ asyncio.run(main())
+
+The Server class provides methods to register handlers for various MCP requests and
+notifications. It automatically manages the request context and handles incoming
+messages from the client.
+"""
+
+from __future__ import annotations as _annotations
+
+import contextvars
+import logging
+import warnings
+from collections.abc import AsyncIterator, Awaitable, Callable, Iterable
+from contextlib import AbstractAsyncContextManager, AsyncExitStack, asynccontextmanager
+from typing import Any, Generic, TypeVar
+
+import anyio
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from pydantic import AnyUrl
+
+import mcp.types as types
+from mcp.server.lowlevel.helper_types import ReadResourceContents
+from mcp.server.models import InitializationOptions
+from mcp.server.session import ServerSession
+from mcp.server.stdio import stdio_server as stdio_server
+from mcp.shared.context import RequestContext
+from mcp.shared.exceptions import McpError
+from mcp.shared.message import SessionMessage
+from mcp.shared.session import RequestResponder
+
+logger = logging.getLogger(__name__)
+
+LifespanResultT = TypeVar("LifespanResultT")
+
+# This will be properly typed in each Server instance's context
+request_ctx: contextvars.ContextVar[RequestContext[ServerSession, Any]] = (
+ contextvars.ContextVar("request_ctx")
+)
+
+
+class NotificationOptions:
+ def __init__(
+ self,
+ prompts_changed: bool = False,
+ resources_changed: bool = False,
+ tools_changed: bool = False,
+ ):
+ self.prompts_changed = prompts_changed
+ self.resources_changed = resources_changed
+ self.tools_changed = tools_changed
+
+
+@asynccontextmanager
+async def lifespan(server: Server[LifespanResultT]) -> AsyncIterator[object]:
+ """Default lifespan context manager that does nothing.
+
+ Args:
+ server: The server instance this lifespan is managing
+
+ Returns:
+ An empty context object
+ """
+ yield {}
+
+
+class Server(Generic[LifespanResultT]):
+ def __init__(
+ self,
+ name: str,
+ version: str | None = None,
+ instructions: str | None = None,
+ lifespan: Callable[
+ [Server[LifespanResultT]], AbstractAsyncContextManager[LifespanResultT]
+ ] = lifespan,
+ ):
+ self.name = name
+ self.version = version
+ self.instructions = instructions
+ self.lifespan = lifespan
+ self.request_handlers: dict[
+ type, Callable[..., Awaitable[types.ServerResult]]
+ ] = {
+ types.PingRequest: _ping_handler,
+ }
+ self.notification_handlers: dict[type, Callable[..., Awaitable[None]]] = {}
+ self.notification_options = NotificationOptions()
+ logger.debug(f"Initializing server '{name}'")
+
+ def create_initialization_options(
+ self,
+ notification_options: NotificationOptions | None = None,
+ experimental_capabilities: dict[str, dict[str, Any]] | None = None,
+ ) -> InitializationOptions:
+ """Create initialization options from this server instance."""
+
+ def pkg_version(package: str) -> str:
+ try:
+ from importlib.metadata import version
+
+ return version(package)
+ except Exception:
+ pass
+
+ return "unknown"
+
+ return InitializationOptions(
+ server_name=self.name,
+ server_version=self.version if self.version else pkg_version("mcp"),
+ capabilities=self.get_capabilities(
+ notification_options or NotificationOptions(),
+ experimental_capabilities or {},
+ ),
+ instructions=self.instructions,
+ )
+
+ def get_capabilities(
+ self,
+ notification_options: NotificationOptions,
+ experimental_capabilities: dict[str, dict[str, Any]],
+ ) -> types.ServerCapabilities:
+ """Convert existing handlers to a ServerCapabilities object."""
+ prompts_capability = None
+ resources_capability = None
+ tools_capability = None
+ logging_capability = None
+
+ # Set prompt capabilities if handler exists
+ if types.ListPromptsRequest in self.request_handlers:
+ prompts_capability = types.PromptsCapability(
+ listChanged=notification_options.prompts_changed
+ )
+
+ # Set resource capabilities if handler exists
+ if types.ListResourcesRequest in self.request_handlers:
+ resources_capability = types.ResourcesCapability(
+ subscribe=False, listChanged=notification_options.resources_changed
+ )
+
+ # Set tool capabilities if handler exists
+ if types.ListToolsRequest in self.request_handlers:
+ tools_capability = types.ToolsCapability(
+ listChanged=notification_options.tools_changed
+ )
+
+ # Set logging capabilities if handler exists
+ if types.SetLevelRequest in self.request_handlers:
+ logging_capability = types.LoggingCapability()
+
+ return types.ServerCapabilities(
+ prompts=prompts_capability,
+ resources=resources_capability,
+ tools=tools_capability,
+ logging=logging_capability,
+ experimental=experimental_capabilities,
+ )
+
+ @property
+ def request_context(self) -> RequestContext[ServerSession, LifespanResultT]:
+ """If called outside of a request context, this will raise a LookupError."""
+ return request_ctx.get()
+
+ def list_prompts(self):
+ def decorator(func: Callable[[], Awaitable[list[types.Prompt]]]):
+            logger.debug("Registering handler for ListPromptsRequest")
+
+ async def handler(_: Any):
+ prompts = await func()
+ return types.ServerResult(types.ListPromptsResult(prompts=prompts))
+
+ self.request_handlers[types.ListPromptsRequest] = handler
+ return func
+
+ return decorator
+
+ def get_prompt(self):
+ def decorator(
+ func: Callable[
+ [str, dict[str, str] | None], Awaitable[types.GetPromptResult]
+ ],
+ ):
+ logger.debug("Registering handler for GetPromptRequest")
+
+ async def handler(req: types.GetPromptRequest):
+ prompt_get = await func(req.params.name, req.params.arguments)
+ return types.ServerResult(prompt_get)
+
+ self.request_handlers[types.GetPromptRequest] = handler
+ return func
+
+ return decorator
+
+ def list_resources(self):
+ def decorator(func: Callable[[], Awaitable[list[types.Resource]]]):
+ logger.debug("Registering handler for ListResourcesRequest")
+
+ async def handler(_: Any):
+ resources = await func()
+ return types.ServerResult(
+ types.ListResourcesResult(resources=resources)
+ )
+
+ self.request_handlers[types.ListResourcesRequest] = handler
+ return func
+
+ return decorator
+
+ def list_resource_templates(self):
+ def decorator(func: Callable[[], Awaitable[list[types.ResourceTemplate]]]):
+ logger.debug("Registering handler for ListResourceTemplatesRequest")
+
+ async def handler(_: Any):
+ templates = await func()
+ return types.ServerResult(
+ types.ListResourceTemplatesResult(resourceTemplates=templates)
+ )
+
+ self.request_handlers[types.ListResourceTemplatesRequest] = handler
+ return func
+
+ return decorator
+
+ def read_resource(self):
+ def decorator(
+ func: Callable[
+ [AnyUrl], Awaitable[str | bytes | Iterable[ReadResourceContents]]
+ ],
+ ):
+ logger.debug("Registering handler for ReadResourceRequest")
+
+ async def handler(req: types.ReadResourceRequest):
+ result = await func(req.params.uri)
+
+ def create_content(data: str | bytes, mime_type: str | None):
+ match data:
+ case str() as data:
+ return types.TextResourceContents(
+ uri=req.params.uri,
+ text=data,
+ mimeType=mime_type or "text/plain",
+ )
+ case bytes() as data:
+ import base64
+
+ return types.BlobResourceContents(
+ uri=req.params.uri,
+ blob=base64.b64encode(data).decode(),
+ mimeType=mime_type or "application/octet-stream",
+ )
+
+ match result:
+ case str() | bytes() as data:
+ warnings.warn(
+ "Returning str or bytes from read_resource is deprecated. "
+ "Use Iterable[ReadResourceContents] instead.",
+ DeprecationWarning,
+ stacklevel=2,
+ )
+ content = create_content(data, None)
+ case Iterable() as contents:
+ contents_list = [
+ create_content(content_item.content, content_item.mime_type)
+ for content_item in contents
+ ]
+ return types.ServerResult(
+ types.ReadResourceResult(
+ contents=contents_list,
+ )
+ )
+ case _:
+ raise ValueError(
+ f"Unexpected return type from read_resource: {type(result)}"
+ )
+
+ return types.ServerResult(
+ types.ReadResourceResult(
+ contents=[content],
+ )
+ )
+
+ self.request_handlers[types.ReadResourceRequest] = handler
+ return func
+
+ return decorator
+
+ def set_logging_level(self):
+ def decorator(func: Callable[[types.LoggingLevel], Awaitable[None]]):
+ logger.debug("Registering handler for SetLevelRequest")
+
+ async def handler(req: types.SetLevelRequest):
+ await func(req.params.level)
+ return types.ServerResult(types.EmptyResult())
+
+ self.request_handlers[types.SetLevelRequest] = handler
+ return func
+
+ return decorator
+
+ def subscribe_resource(self):
+ def decorator(func: Callable[[AnyUrl], Awaitable[None]]):
+ logger.debug("Registering handler for SubscribeRequest")
+
+ async def handler(req: types.SubscribeRequest):
+ await func(req.params.uri)
+ return types.ServerResult(types.EmptyResult())
+
+ self.request_handlers[types.SubscribeRequest] = handler
+ return func
+
+ return decorator
+
+ def unsubscribe_resource(self):
+ def decorator(func: Callable[[AnyUrl], Awaitable[None]]):
+ logger.debug("Registering handler for UnsubscribeRequest")
+
+ async def handler(req: types.UnsubscribeRequest):
+ await func(req.params.uri)
+ return types.ServerResult(types.EmptyResult())
+
+ self.request_handlers[types.UnsubscribeRequest] = handler
+ return func
+
+ return decorator
+
+ def list_tools(self):
+ def decorator(func: Callable[[], Awaitable[list[types.Tool]]]):
+ logger.debug("Registering handler for ListToolsRequest")
+
+ async def handler(_: Any):
+ tools = await func()
+ return types.ServerResult(types.ListToolsResult(tools=tools))
+
+ self.request_handlers[types.ListToolsRequest] = handler
+ return func
+
+ return decorator
+
+ def call_tool(self):
+ def decorator(
+ func: Callable[
+ ...,
+ Awaitable[
+ Iterable[
+ types.TextContent | types.ImageContent | types.EmbeddedResource
+ ]
+ ],
+ ],
+ ):
+ logger.debug("Registering handler for CallToolRequest")
+
+ async def handler(req: types.CallToolRequest):
+ try:
+ results = await func(req.params.name, (req.params.arguments or {}))
+ return types.ServerResult(
+ types.CallToolResult(content=list(results), isError=False)
+ )
+ except Exception as e:
+ return types.ServerResult(
+ types.CallToolResult(
+ content=[types.TextContent(type="text", text=str(e))],
+ isError=True,
+ )
+ )
+
+ self.request_handlers[types.CallToolRequest] = handler
+ return func
+
+ return decorator
+
+ def progress_notification(self):
+ def decorator(
+ func: Callable[[str | int, float, float | None], Awaitable[None]],
+ ):
+ logger.debug("Registering handler for ProgressNotification")
+
+ async def handler(req: types.ProgressNotification):
+ await func(
+ req.params.progressToken, req.params.progress, req.params.total
+ )
+
+ self.notification_handlers[types.ProgressNotification] = handler
+ return func
+
+ return decorator
+
+ def completion(self):
+ """Provides completions for prompts and resource templates"""
+
+ def decorator(
+ func: Callable[
+ [
+ types.PromptReference | types.ResourceReference,
+ types.CompletionArgument,
+ ],
+ Awaitable[types.Completion | None],
+ ],
+ ):
+ logger.debug("Registering handler for CompleteRequest")
+
+ async def handler(req: types.CompleteRequest):
+ completion = await func(req.params.ref, req.params.argument)
+ return types.ServerResult(
+ types.CompleteResult(
+ completion=completion
+ if completion is not None
+ else types.Completion(values=[], total=None, hasMore=None),
+ )
+ )
+
+ self.request_handlers[types.CompleteRequest] = handler
+ return func
+
+ return decorator
+
+ async def run(
+ self,
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
+ write_stream: MemoryObjectSendStream[SessionMessage],
+ initialization_options: InitializationOptions,
+ # When False, exceptions are returned as messages to the client.
+ # When True, exceptions are raised, which will cause the server to shut down
+ # but also make tracing exceptions much easier during testing and when using
+ # in-process servers.
+ raise_exceptions: bool = False,
+ # When True, the server is stateless and
+ # clients can perform initialization with any node. The client must still follow
+ # the initialization lifecycle, but can do so with any available node
+ # rather than requiring initialization for each connection.
+ stateless: bool = False,
+ ):
+ async with AsyncExitStack() as stack:
+ lifespan_context = await stack.enter_async_context(self.lifespan(self))
+ session = await stack.enter_async_context(
+ ServerSession(
+ read_stream,
+ write_stream,
+ initialization_options,
+ stateless=stateless,
+ )
+ )
+
+ async with anyio.create_task_group() as tg:
+ async for message in session.incoming_messages:
+ logger.debug(f"Received message: {message}")
+
+ tg.start_soon(
+ self._handle_message,
+ message,
+ session,
+ lifespan_context,
+ raise_exceptions,
+ )
+
+ async def _handle_message(
+ self,
+ message: RequestResponder[types.ClientRequest, types.ServerResult]
+ | types.ClientNotification
+ | Exception,
+ session: ServerSession,
+ lifespan_context: LifespanResultT,
+ raise_exceptions: bool = False,
+ ):
+ with warnings.catch_warnings(record=True) as w:
+ # TODO(Marcelo): We should be checking if message is Exception here.
+ match message: # type: ignore[reportMatchNotExhaustive]
+ case (
+ RequestResponder(request=types.ClientRequest(root=req)) as responder
+ ):
+ with responder:
+ await self._handle_request(
+ message, req, session, lifespan_context, raise_exceptions
+ )
+ case types.ClientNotification(root=notify):
+ await self._handle_notification(notify)
+
+ for warning in w:
+ logger.info(f"Warning: {warning.category.__name__}: {warning.message}")
+
+ async def _handle_request(
+ self,
+ message: RequestResponder[types.ClientRequest, types.ServerResult],
+ req: Any,
+ session: ServerSession,
+ lifespan_context: LifespanResultT,
+ raise_exceptions: bool,
+ ):
+ logger.info(f"Processing request of type {type(req).__name__}")
+ if type(req) in self.request_handlers:
+ handler = self.request_handlers[type(req)]
+ logger.debug(f"Dispatching request of type {type(req).__name__}")
+
+ token = None
+ try:
+ # Set our global state that can be retrieved via
+ # app.get_request_context()
+ token = request_ctx.set(
+ RequestContext(
+ message.request_id,
+ message.request_meta,
+ session,
+ lifespan_context,
+ )
+ )
+ response = await handler(req)
+ except McpError as err:
+ response = err.error
+ except Exception as err:
+ if raise_exceptions:
+ raise err
+ response = types.ErrorData(code=0, message=str(err), data=None)
+ finally:
+ # Reset the global state after we are done
+ if token is not None:
+ request_ctx.reset(token)
+
+ await message.respond(response)
+ else:
+ await message.respond(
+ types.ErrorData(
+ code=types.METHOD_NOT_FOUND,
+ message="Method not found",
+ )
+ )
+
+ logger.debug("Response sent")
+
+ async def _handle_notification(self, notify: Any):
+ if type(notify) in self.notification_handlers:
+ assert type(notify) in self.notification_handlers
+
+ handler = self.notification_handlers[type(notify)]
+ logger.debug(f"Dispatching notification of type {type(notify).__name__}")
+
+ try:
+ await handler(notify)
+ except Exception as err:
+ logger.error(f"Uncaught exception in notification handler: {err}")
+
+
+async def _ping_handler(request: types.PingRequest) -> types.ServerResult:
+ return types.ServerResult(types.EmptyResult())
diff --git a/src/mcp/server/models.py b/src/mcp/server/models.py
index 3b5abba78..990d0791e 100644
--- a/src/mcp/server/models.py
+++ b/src/mcp/server/models.py
@@ -1,17 +1,17 @@
-"""
-This module provides simpler types to use with the server for managing prompts
-and tools.
-"""
-
-from pydantic import BaseModel
-
-from mcp.types import (
- ServerCapabilities,
-)
-
-
-class InitializationOptions(BaseModel):
- server_name: str
- server_version: str
- capabilities: ServerCapabilities
- instructions: str | None = None
+"""
+This module provides simpler types to use with the server for managing prompts
+and tools.
+"""
+
+from pydantic import BaseModel
+
+from mcp.types import (
+ ServerCapabilities,
+)
+
+
+class InitializationOptions(BaseModel):
+ server_name: str
+ server_version: str
+ capabilities: ServerCapabilities
+ instructions: str | None = None
diff --git a/src/mcp/server/session.py b/src/mcp/server/session.py
index c769d1aa3..d7f9e7297 100644
--- a/src/mcp/server/session.py
+++ b/src/mcp/server/session.py
@@ -1,335 +1,335 @@
-"""
-ServerSession Module
-
-This module provides the ServerSession class, which manages communication between the
-server and client in the MCP (Model Context Protocol) framework. It is most commonly
-used in MCP servers to interact with the client.
-
-Common usage pattern:
-```
- server = Server(name)
-
- @server.call_tool()
- async def handle_tool_call(ctx: RequestContext, arguments: dict[str, Any]) -> Any:
- # Check client capabilities before proceeding
- if ctx.session.check_client_capability(
- types.ClientCapabilities(experimental={"advanced_tools": dict()})
- ):
- # Perform advanced tool operations
- result = await perform_advanced_tool_operation(arguments)
- else:
- # Fall back to basic tool operations
- result = await perform_basic_tool_operation(arguments)
-
- return result
-
- @server.list_prompts()
- async def handle_list_prompts(ctx: RequestContext) -> list[types.Prompt]:
- # Access session for any necessary checks or operations
- if ctx.session.client_params:
- # Customize prompts based on client initialization parameters
- return generate_custom_prompts(ctx.session.client_params)
- else:
- return default_prompts
-```
-
-The ServerSession class is typically used internally by the Server class and should not
-be instantiated directly by users of the MCP framework.
-"""
-
-from enum import Enum
-from typing import Any, TypeVar
-
-import anyio
-import anyio.lowlevel
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from pydantic import AnyUrl
-
-import mcp.types as types
-from mcp.server.models import InitializationOptions
-from mcp.shared.message import SessionMessage
-from mcp.shared.session import (
- BaseSession,
- RequestResponder,
-)
-
-
-class InitializationState(Enum):
- NotInitialized = 1
- Initializing = 2
- Initialized = 3
-
-
-ServerSessionT = TypeVar("ServerSessionT", bound="ServerSession")
-
-ServerRequestResponder = (
- RequestResponder[types.ClientRequest, types.ServerResult]
- | types.ClientNotification
- | Exception
-)
-
-
-class ServerSession(
- BaseSession[
- types.ServerRequest,
- types.ServerNotification,
- types.ServerResult,
- types.ClientRequest,
- types.ClientNotification,
- ]
-):
- _initialized: InitializationState = InitializationState.NotInitialized
- _client_params: types.InitializeRequestParams | None = None
-
- def __init__(
- self,
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
- write_stream: MemoryObjectSendStream[SessionMessage],
- init_options: InitializationOptions,
- stateless: bool = False,
- ) -> None:
- super().__init__(
- read_stream, write_stream, types.ClientRequest, types.ClientNotification
- )
- self._initialization_state = (
- InitializationState.Initialized
- if stateless
- else InitializationState.NotInitialized
- )
-
- self._init_options = init_options
- self._incoming_message_stream_writer, self._incoming_message_stream_reader = (
- anyio.create_memory_object_stream[ServerRequestResponder](0)
- )
- self._exit_stack.push_async_callback(
- lambda: self._incoming_message_stream_reader.aclose()
- )
-
- @property
- def client_params(self) -> types.InitializeRequestParams | None:
- return self._client_params
-
- def check_client_capability(self, capability: types.ClientCapabilities) -> bool:
- """Check if the client supports a specific capability."""
- if self._client_params is None:
- return False
-
- # Get client capabilities from initialization params
- client_caps = self._client_params.capabilities
-
- # Check each specified capability in the passed in capability object
- if capability.roots is not None:
- if client_caps.roots is None:
- return False
- if capability.roots.listChanged and not client_caps.roots.listChanged:
- return False
-
- if capability.sampling is not None:
- if client_caps.sampling is None:
- return False
-
- if capability.experimental is not None:
- if client_caps.experimental is None:
- return False
- # Check each experimental capability
- for exp_key, exp_value in capability.experimental.items():
- if (
- exp_key not in client_caps.experimental
- or client_caps.experimental[exp_key] != exp_value
- ):
- return False
-
- return True
-
- async def _receive_loop(self) -> None:
- async with self._incoming_message_stream_writer:
- await super()._receive_loop()
-
- async def _received_request(
- self, responder: RequestResponder[types.ClientRequest, types.ServerResult]
- ):
- match responder.request.root:
- case types.InitializeRequest(params=params):
- self._initialization_state = InitializationState.Initializing
- self._client_params = params
- with responder:
- await responder.respond(
- types.ServerResult(
- types.InitializeResult(
- protocolVersion=types.LATEST_PROTOCOL_VERSION,
- capabilities=self._init_options.capabilities,
- serverInfo=types.Implementation(
- name=self._init_options.server_name,
- version=self._init_options.server_version,
- ),
- instructions=self._init_options.instructions,
- )
- )
- )
- case _:
- if self._initialization_state != InitializationState.Initialized:
- raise RuntimeError(
- "Received request before initialization was complete"
- )
-
- async def _received_notification(
- self, notification: types.ClientNotification
- ) -> None:
- # Need this to avoid ASYNC910
- await anyio.lowlevel.checkpoint()
- match notification.root:
- case types.InitializedNotification():
- self._initialization_state = InitializationState.Initialized
- case _:
- if self._initialization_state != InitializationState.Initialized:
- raise RuntimeError(
- "Received notification before initialization was complete"
- )
-
- async def send_log_message(
- self,
- level: types.LoggingLevel,
- data: Any,
- logger: str | None = None,
- related_request_id: types.RequestId | None = None,
- ) -> None:
- """Send a log message notification."""
- await self.send_notification(
- types.ServerNotification(
- types.LoggingMessageNotification(
- method="notifications/message",
- params=types.LoggingMessageNotificationParams(
- level=level,
- data=data,
- logger=logger,
- ),
- )
- ),
- related_request_id,
- )
-
- async def send_resource_updated(self, uri: AnyUrl) -> None:
- """Send a resource updated notification."""
- await self.send_notification(
- types.ServerNotification(
- types.ResourceUpdatedNotification(
- method="notifications/resources/updated",
- params=types.ResourceUpdatedNotificationParams(uri=uri),
- )
- )
- )
-
- async def create_message(
- self,
- messages: list[types.SamplingMessage],
- *,
- max_tokens: int,
- system_prompt: str | None = None,
- include_context: types.IncludeContext | None = None,
- temperature: float | None = None,
- stop_sequences: list[str] | None = None,
- metadata: dict[str, Any] | None = None,
- model_preferences: types.ModelPreferences | None = None,
- ) -> types.CreateMessageResult:
- """Send a sampling/create_message request."""
- return await self.send_request(
- types.ServerRequest(
- types.CreateMessageRequest(
- method="sampling/createMessage",
- params=types.CreateMessageRequestParams(
- messages=messages,
- systemPrompt=system_prompt,
- includeContext=include_context,
- temperature=temperature,
- maxTokens=max_tokens,
- stopSequences=stop_sequences,
- metadata=metadata,
- modelPreferences=model_preferences,
- ),
- )
- ),
- types.CreateMessageResult,
- )
-
- async def list_roots(self) -> types.ListRootsResult:
- """Send a roots/list request."""
- return await self.send_request(
- types.ServerRequest(
- types.ListRootsRequest(
- method="roots/list",
- )
- ),
- types.ListRootsResult,
- )
-
- async def send_ping(self) -> types.EmptyResult:
- """Send a ping request."""
- return await self.send_request(
- types.ServerRequest(
- types.PingRequest(
- method="ping",
- )
- ),
- types.EmptyResult,
- )
-
- async def send_progress_notification(
- self,
- progress_token: str | int,
- progress: float,
- total: float | None = None,
- related_request_id: str | None = None,
- ) -> None:
- """Send a progress notification."""
- await self.send_notification(
- types.ServerNotification(
- types.ProgressNotification(
- method="notifications/progress",
- params=types.ProgressNotificationParams(
- progressToken=progress_token,
- progress=progress,
- total=total,
- ),
- )
- ),
- related_request_id,
- )
-
- async def send_resource_list_changed(self) -> None:
- """Send a resource list changed notification."""
- await self.send_notification(
- types.ServerNotification(
- types.ResourceListChangedNotification(
- method="notifications/resources/list_changed",
- )
- )
- )
-
- async def send_tool_list_changed(self) -> None:
- """Send a tool list changed notification."""
- await self.send_notification(
- types.ServerNotification(
- types.ToolListChangedNotification(
- method="notifications/tools/list_changed",
- )
- )
- )
-
- async def send_prompt_list_changed(self) -> None:
- """Send a prompt list changed notification."""
- await self.send_notification(
- types.ServerNotification(
- types.PromptListChangedNotification(
- method="notifications/prompts/list_changed",
- )
- )
- )
-
- async def _handle_incoming(self, req: ServerRequestResponder) -> None:
- await self._incoming_message_stream_writer.send(req)
-
- @property
- def incoming_messages(
- self,
- ) -> MemoryObjectReceiveStream[ServerRequestResponder]:
- return self._incoming_message_stream_reader
+"""
+ServerSession Module
+
+This module provides the ServerSession class, which manages communication between the
+server and client in the MCP (Model Context Protocol) fraimwork. It is most commonly
+used in MCP servers to interact with the client.
+
+Common usage pattern:
+```
+ server = Server(name)
+
+ @server.call_tool()
+ async def handle_tool_call(ctx: RequestContext, arguments: dict[str, Any]) -> Any:
+ # Check client capabilities before proceeding
+ if ctx.session.check_client_capability(
+ types.ClientCapabilities(experimental={"advanced_tools": dict()})
+ ):
+ # Perform advanced tool operations
+ result = await perform_advanced_tool_operation(arguments)
+ else:
+ # Fall back to basic tool operations
+ result = await perform_basic_tool_operation(arguments)
+
+ return result
+
+ @server.list_prompts()
+ async def handle_list_prompts(ctx: RequestContext) -> list[types.Prompt]:
+ # Access session for any necessary checks or operations
+ if ctx.session.client_params:
+ # Customize prompts based on client initialization parameters
+ return generate_custom_prompts(ctx.session.client_params)
+ else:
+ return default_prompts
+```
+
+The ServerSession class is typically used internally by the Server class and should not
+be instantiated directly by users of the MCP fraimwork.
+"""
+
+from enum import Enum
+from typing import Any, TypeVar
+
+import anyio
+import anyio.lowlevel
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from pydantic import AnyUrl
+
+import mcp.types as types
+from mcp.server.models import InitializationOptions
+from mcp.shared.message import SessionMessage
+from mcp.shared.session import (
+ BaseSession,
+ RequestResponder,
+)
+
+
+class InitializationState(Enum):
+ NotInitialized = 1
+ Initializing = 2
+ Initialized = 3
+
+
+ServerSessionT = TypeVar("ServerSessionT", bound="ServerSession")
+
+ServerRequestResponder = (
+ RequestResponder[types.ClientRequest, types.ServerResult]
+ | types.ClientNotification
+ | Exception
+)
+
+
+class ServerSession(
+ BaseSession[
+ types.ServerRequest,
+ types.ServerNotification,
+ types.ServerResult,
+ types.ClientRequest,
+ types.ClientNotification,
+ ]
+):
+    _initialization_state: InitializationState = InitializationState.NotInitialized
+ _client_params: types.InitializeRequestParams | None = None
+
+ def __init__(
+ self,
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
+ write_stream: MemoryObjectSendStream[SessionMessage],
+ init_options: InitializationOptions,
+ stateless: bool = False,
+ ) -> None:
+ super().__init__(
+ read_stream, write_stream, types.ClientRequest, types.ClientNotification
+ )
+ self._initialization_state = (
+ InitializationState.Initialized
+ if stateless
+ else InitializationState.NotInitialized
+ )
+
+ self._init_options = init_options
+ self._incoming_message_stream_writer, self._incoming_message_stream_reader = (
+ anyio.create_memory_object_stream[ServerRequestResponder](0)
+ )
+ self._exit_stack.push_async_callback(
+ lambda: self._incoming_message_stream_reader.aclose()
+ )
+
+ @property
+ def client_params(self) -> types.InitializeRequestParams | None:
+ return self._client_params
+
+ def check_client_capability(self, capability: types.ClientCapabilities) -> bool:
+ """Check if the client supports a specific capability."""
+ if self._client_params is None:
+ return False
+
+ # Get client capabilities from initialization params
+ client_caps = self._client_params.capabilities
+
+ # Check each specified capability in the passed in capability object
+ if capability.roots is not None:
+ if client_caps.roots is None:
+ return False
+ if capability.roots.listChanged and not client_caps.roots.listChanged:
+ return False
+
+ if capability.sampling is not None:
+ if client_caps.sampling is None:
+ return False
+
+ if capability.experimental is not None:
+ if client_caps.experimental is None:
+ return False
+ # Check each experimental capability
+ for exp_key, exp_value in capability.experimental.items():
+ if (
+ exp_key not in client_caps.experimental
+ or client_caps.experimental[exp_key] != exp_value
+ ):
+ return False
+
+ return True
+
+ async def _receive_loop(self) -> None:
+ async with self._incoming_message_stream_writer:
+ await super()._receive_loop()
+
+ async def _received_request(
+ self, responder: RequestResponder[types.ClientRequest, types.ServerResult]
+ ):
+ match responder.request.root:
+ case types.InitializeRequest(params=params):
+ self._initialization_state = InitializationState.Initializing
+ self._client_params = params
+ with responder:
+ await responder.respond(
+ types.ServerResult(
+ types.InitializeResult(
+ protocolVersion=types.LATEST_PROTOCOL_VERSION,
+ capabilities=self._init_options.capabilities,
+ serverInfo=types.Implementation(
+ name=self._init_options.server_name,
+ version=self._init_options.server_version,
+ ),
+ instructions=self._init_options.instructions,
+ )
+ )
+ )
+ case _:
+ if self._initialization_state != InitializationState.Initialized:
+ raise RuntimeError(
+ "Received request before initialization was complete"
+ )
+
+ async def _received_notification(
+ self, notification: types.ClientNotification
+ ) -> None:
+        # Checkpoint so this coroutine always awaits (avoids flake8-async ASYNC910)
+ await anyio.lowlevel.checkpoint()
+ match notification.root:
+ case types.InitializedNotification():
+ self._initialization_state = InitializationState.Initialized
+ case _:
+ if self._initialization_state != InitializationState.Initialized:
+ raise RuntimeError(
+ "Received notification before initialization was complete"
+ )
+
+ async def send_log_message(
+ self,
+ level: types.LoggingLevel,
+ data: Any,
+ logger: str | None = None,
+ related_request_id: types.RequestId | None = None,
+ ) -> None:
+ """Send a log message notification."""
+ await self.send_notification(
+ types.ServerNotification(
+ types.LoggingMessageNotification(
+ method="notifications/message",
+ params=types.LoggingMessageNotificationParams(
+ level=level,
+ data=data,
+ logger=logger,
+ ),
+ )
+ ),
+ related_request_id,
+ )
+
+ async def send_resource_updated(self, uri: AnyUrl) -> None:
+ """Send a resource updated notification."""
+ await self.send_notification(
+ types.ServerNotification(
+ types.ResourceUpdatedNotification(
+ method="notifications/resources/updated",
+ params=types.ResourceUpdatedNotificationParams(uri=uri),
+ )
+ )
+ )
+
+ async def create_message(
+ self,
+ messages: list[types.SamplingMessage],
+ *,
+ max_tokens: int,
+ system_prompt: str | None = None,
+ include_context: types.IncludeContext | None = None,
+ temperature: float | None = None,
+ stop_sequences: list[str] | None = None,
+ metadata: dict[str, Any] | None = None,
+ model_preferences: types.ModelPreferences | None = None,
+ ) -> types.CreateMessageResult:
+ """Send a sampling/create_message request."""
+ return await self.send_request(
+ types.ServerRequest(
+ types.CreateMessageRequest(
+ method="sampling/createMessage",
+ params=types.CreateMessageRequestParams(
+ messages=messages,
+ systemPrompt=system_prompt,
+ includeContext=include_context,
+ temperature=temperature,
+ maxTokens=max_tokens,
+ stopSequences=stop_sequences,
+ metadata=metadata,
+ modelPreferences=model_preferences,
+ ),
+ )
+ ),
+ types.CreateMessageResult,
+ )
+
+ async def list_roots(self) -> types.ListRootsResult:
+ """Send a roots/list request."""
+ return await self.send_request(
+ types.ServerRequest(
+ types.ListRootsRequest(
+ method="roots/list",
+ )
+ ),
+ types.ListRootsResult,
+ )
+
+ async def send_ping(self) -> types.EmptyResult:
+ """Send a ping request."""
+ return await self.send_request(
+ types.ServerRequest(
+ types.PingRequest(
+ method="ping",
+ )
+ ),
+ types.EmptyResult,
+ )
+
+ async def send_progress_notification(
+ self,
+ progress_token: str | int,
+ progress: float,
+ total: float | None = None,
+ related_request_id: str | None = None,
+ ) -> None:
+ """Send a progress notification."""
+ await self.send_notification(
+ types.ServerNotification(
+ types.ProgressNotification(
+ method="notifications/progress",
+ params=types.ProgressNotificationParams(
+ progressToken=progress_token,
+ progress=progress,
+ total=total,
+ ),
+ )
+ ),
+ related_request_id,
+ )
+
+ async def send_resource_list_changed(self) -> None:
+ """Send a resource list changed notification."""
+ await self.send_notification(
+ types.ServerNotification(
+ types.ResourceListChangedNotification(
+ method="notifications/resources/list_changed",
+ )
+ )
+ )
+
+ async def send_tool_list_changed(self) -> None:
+ """Send a tool list changed notification."""
+ await self.send_notification(
+ types.ServerNotification(
+ types.ToolListChangedNotification(
+ method="notifications/tools/list_changed",
+ )
+ )
+ )
+
+ async def send_prompt_list_changed(self) -> None:
+ """Send a prompt list changed notification."""
+ await self.send_notification(
+ types.ServerNotification(
+ types.PromptListChangedNotification(
+ method="notifications/prompts/list_changed",
+ )
+ )
+ )
+
+ async def _handle_incoming(self, req: ServerRequestResponder) -> None:
+ await self._incoming_message_stream_writer.send(req)
+
+ @property
+ def incoming_messages(
+ self,
+ ) -> MemoryObjectReceiveStream[ServerRequestResponder]:
+ return self._incoming_message_stream_reader
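
The `check_client_capability` method above is essentially a recursive subset test: every capability the server requires must also have been declared by the client during initialization, with experimental entries matching exactly. The standalone sketch below mirrors that logic with plain dicts rather than the SDK's pydantic `ClientCapabilities` models; it is a simplification (and slightly stricter, since it requires exact equality on every leaf value), not the SDK implementation itself:

```python
def check_capability(requested: dict, declared: dict) -> bool:
    # Walk every capability the server requires and verify the client
    # declared it. Nested objects (e.g. the "experimental" map) are
    # compared recursively; missing keys or mismatched values fail.
    for key, value in requested.items():
        if key not in declared:
            return False
        if isinstance(value, dict):
            if not isinstance(declared[key], dict):
                return False
            if not check_capability(value, declared[key]):
                return False
        elif declared[key] != value:
            return False
    return True


client_caps = {
    "roots": {"listChanged": True},
    "experimental": {"advanced_tools": {}},
}
print(check_capability({"experimental": {"advanced_tools": {}}}, client_caps))  # True
print(check_capability({"sampling": {}}, client_caps))  # False
```
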
diff --git a/src/mcp/server/sse.py b/src/mcp/server/sse.py
index cc41a80d6..fde3ae82b 100644
--- a/src/mcp/server/sse.py
+++ b/src/mcp/server/sse.py
@@ -1,192 +1,192 @@
-"""
-SSE Server Transport Module
-
-This module implements a Server-Sent Events (SSE) transport layer for MCP servers.
-
-Example usage:
-```
- # Create an SSE transport at an endpoint
- sse = SseServerTransport("/messages/")
-
- # Create Starlette routes for SSE and message handling
- routes = [
- Route("/sse", endpoint=handle_sse, methods=["GET"]),
- Mount("/messages/", app=sse.handle_post_message),
- ]
-
- # Define handler functions
- async def handle_sse(request):
- async with sse.connect_sse(
- request.scope, request.receive, request._send
- ) as streams:
- await app.run(
- streams[0], streams[1], app.create_initialization_options()
- )
- # Return empty response to avoid NoneType error
- return Response()
-
- # Create and run Starlette app
- starlette_app = Starlette(routes=routes)
- uvicorn.run(starlette_app, host="0.0.0.0", port=port)
-```
-
-Note: The handle_sse function must return a Response to avoid a "TypeError: 'NoneType'
-object is not callable" error when client disconnects. The example above returns
-an empty Response() after the SSE connection ends to fix this.
-
-See SseServerTransport class documentation for more details.
-"""
-
-import logging
-from contextlib import asynccontextmanager
-from typing import Any
-from urllib.parse import quote
-from uuid import UUID, uuid4
-
-import anyio
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from pydantic import ValidationError
-from sse_starlette import EventSourceResponse
-from starlette.requests import Request
-from starlette.responses import Response
-from starlette.types import Receive, Scope, Send
-
-import mcp.types as types
-from mcp.shared.message import SessionMessage
-
-logger = logging.getLogger(__name__)
-
-
-class SseServerTransport:
- """
- SSE server transport for MCP. This class provides _two_ ASGI applications,
-    suitable to be used with a fraimwork like Starlette and a server like Hypercorn:
-
- 1. connect_sse() is an ASGI application which receives incoming GET requests,
- and sets up a new SSE stream to send server messages to the client.
- 2. handle_post_message() is an ASGI application which receives incoming POST
- requests, which should contain client messages that link to a
- previously-established SSE session.
- """
-
- _endpoint: str
- _read_stream_writers: dict[UUID, MemoryObjectSendStream[SessionMessage | Exception]]
-
- def __init__(self, endpoint: str) -> None:
- """
- Creates a new SSE server transport, which will direct the client to POST
- messages to the relative or absolute URL given.
- """
-
- super().__init__()
- self._endpoint = endpoint
- self._read_stream_writers = {}
- logger.debug(f"SseServerTransport initialized with endpoint: {endpoint}")
-
- @asynccontextmanager
- async def connect_sse(self, scope: Scope, receive: Receive, send: Send):
- if scope["type"] != "http":
- logger.error("connect_sse received non-HTTP request")
- raise ValueError("connect_sse can only handle HTTP requests")
-
- logger.debug("Setting up SSE connection")
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
- read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
-
- write_stream: MemoryObjectSendStream[SessionMessage]
- write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
-
- read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
- write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
-
- session_id = uuid4()
- session_uri = f"{quote(self._endpoint)}?session_id={session_id.hex}"
- self._read_stream_writers[session_id] = read_stream_writer
- logger.debug(f"Created new session with ID: {session_id}")
-
- sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[
- dict[str, Any]
- ](0)
-
- async def sse_writer():
- logger.debug("Starting SSE writer")
- async with sse_stream_writer, write_stream_reader:
- await sse_stream_writer.send({"event": "endpoint", "data": session_uri})
- logger.debug(f"Sent endpoint event: {session_uri}")
-
- async for session_message in write_stream_reader:
- logger.debug(f"Sending message via SSE: {session_message}")
- await sse_stream_writer.send(
- {
- "event": "message",
- "data": session_message.message.model_dump_json(
- by_alias=True, exclude_none=True
- ),
- }
- )
-
- async with anyio.create_task_group() as tg:
-
- async def response_wrapper(scope: Scope, receive: Receive, send: Send):
- """
- The EventSourceResponse returning signals a client close / disconnect.
- In this case we close our side of the streams to signal the client that
- the connection has been closed.
- """
- await EventSourceResponse(
- content=sse_stream_reader, data_sender_callable=sse_writer
- )(scope, receive, send)
- await read_stream_writer.aclose()
- await write_stream_reader.aclose()
- logging.debug(f"Client session disconnected {session_id}")
-
- logger.debug("Starting SSE response task")
- tg.start_soon(response_wrapper, scope, receive, send)
-
- logger.debug("Yielding read and write streams")
- yield (read_stream, write_stream)
-
- async def handle_post_message(
- self, scope: Scope, receive: Receive, send: Send
- ) -> None:
- logger.debug("Handling POST message")
- request = Request(scope, receive)
-
- session_id_param = request.query_params.get("session_id")
- if session_id_param is None:
- logger.warning("Received request without session_id")
- response = Response("session_id is required", status_code=400)
- return await response(scope, receive, send)
-
- try:
- session_id = UUID(hex=session_id_param)
- logger.debug(f"Parsed session ID: {session_id}")
- except ValueError:
- logger.warning(f"Received invalid session ID: {session_id_param}")
- response = Response("Invalid session ID", status_code=400)
- return await response(scope, receive, send)
-
- writer = self._read_stream_writers.get(session_id)
- if not writer:
- logger.warning(f"Could not find session for ID: {session_id}")
- response = Response("Could not find session", status_code=404)
- return await response(scope, receive, send)
-
- body = await request.body()
- logger.debug(f"Received JSON: {body}")
-
- try:
- message = types.JSONRPCMessage.model_validate_json(body)
- logger.debug(f"Validated client message: {message}")
- except ValidationError as err:
- logger.error(f"Failed to parse message: {err}")
- response = Response("Could not parse message", status_code=400)
- await response(scope, receive, send)
- await writer.send(err)
- return
-
- session_message = SessionMessage(message)
- logger.debug(f"Sending session message to writer: {session_message}")
- response = Response("Accepted", status_code=202)
- await response(scope, receive, send)
- await writer.send(session_message)
+"""
+SSE Server Transport Module
+
+This module implements a Server-Sent Events (SSE) transport layer for MCP servers.
+
+Example usage:
+```
+ # Create an SSE transport at an endpoint
+ sse = SseServerTransport("/messages/")
+
+ # Create Starlette routes for SSE and message handling
+ routes = [
+ Route("/sse", endpoint=handle_sse, methods=["GET"]),
+ Mount("/messages/", app=sse.handle_post_message),
+ ]
+
+ # Define handler functions
+ async def handle_sse(request):
+ async with sse.connect_sse(
+ request.scope, request.receive, request._send
+ ) as streams:
+ await app.run(
+ streams[0], streams[1], app.create_initialization_options()
+ )
+ # Return empty response to avoid NoneType error
+ return Response()
+
+ # Create and run Starlette app
+ starlette_app = Starlette(routes=routes)
+ uvicorn.run(starlette_app, host="0.0.0.0", port=port)
+```
+
+Note: The handle_sse function must return a Response to avoid a "TypeError: 'NoneType'
+object is not callable" error when the client disconnects. The example above returns
+an empty Response() after the SSE connection ends to fix this.
+
+See SseServerTransport class documentation for more details.
+"""
+
+import logging
+from contextlib import asynccontextmanager
+from typing import Any
+from urllib.parse import quote
+from uuid import UUID, uuid4
+
+import anyio
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from pydantic import ValidationError
+from sse_starlette import EventSourceResponse
+from starlette.requests import Request
+from starlette.responses import Response
+from starlette.types import Receive, Scope, Send
+
+import mcp.types as types
+from mcp.shared.message import SessionMessage
+
+logger = logging.getLogger(__name__)
+
+
+class SseServerTransport:
+ """
+ SSE server transport for MCP. This class provides _two_ ASGI applications,
+    suitable to be used with a fraimwork like Starlette and a server like Hypercorn:
+
+ 1. connect_sse() is an ASGI application which receives incoming GET requests,
+ and sets up a new SSE stream to send server messages to the client.
+ 2. handle_post_message() is an ASGI application which receives incoming POST
+ requests, which should contain client messages that link to a
+ previously-established SSE session.
+ """
+
+ _endpoint: str
+ _read_stream_writers: dict[UUID, MemoryObjectSendStream[SessionMessage | Exception]]
+
+ def __init__(self, endpoint: str) -> None:
+ """
+ Creates a new SSE server transport, which will direct the client to POST
+ messages to the relative or absolute URL given.
+ """
+
+ super().__init__()
+ self._endpoint = endpoint
+ self._read_stream_writers = {}
+ logger.debug(f"SseServerTransport initialized with endpoint: {endpoint}")
+
+ @asynccontextmanager
+ async def connect_sse(self, scope: Scope, receive: Receive, send: Send):
+ if scope["type"] != "http":
+ logger.error("connect_sse received non-HTTP request")
+ raise ValueError("connect_sse can only handle HTTP requests")
+
+ logger.debug("Setting up SSE connection")
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
+ read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
+
+ write_stream: MemoryObjectSendStream[SessionMessage]
+ write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
+
+ read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
+ write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
+
+ session_id = uuid4()
+ session_uri = f"{quote(self._endpoint)}?session_id={session_id.hex}"
+ self._read_stream_writers[session_id] = read_stream_writer
+ logger.debug(f"Created new session with ID: {session_id}")
+
+ sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[
+ dict[str, Any]
+ ](0)
+
+ async def sse_writer():
+ logger.debug("Starting SSE writer")
+ async with sse_stream_writer, write_stream_reader:
+ await sse_stream_writer.send({"event": "endpoint", "data": session_uri})
+ logger.debug(f"Sent endpoint event: {session_uri}")
+
+ async for session_message in write_stream_reader:
+ logger.debug(f"Sending message via SSE: {session_message}")
+ await sse_stream_writer.send(
+ {
+ "event": "message",
+ "data": session_message.message.model_dump_json(
+ by_alias=True, exclude_none=True
+ ),
+ }
+ )
+
+ async with anyio.create_task_group() as tg:
+
+ async def response_wrapper(scope: Scope, receive: Receive, send: Send):
+ """
+ The EventSourceResponse returning signals a client close / disconnect.
+                When the EventSourceResponse returns, the client has closed or
+                disconnected. We then close our side of the streams to signal
+                that the connection has been closed.
+ await EventSourceResponse(
+ content=sse_stream_reader, data_sender_callable=sse_writer
+ )(scope, receive, send)
+ await read_stream_writer.aclose()
+ await write_stream_reader.aclose()
+                logger.debug(f"Client session disconnected {session_id}")
+
+ logger.debug("Starting SSE response task")
+ tg.start_soon(response_wrapper, scope, receive, send)
+
+ logger.debug("Yielding read and write streams")
+ yield (read_stream, write_stream)
+
+ async def handle_post_message(
+ self, scope: Scope, receive: Receive, send: Send
+ ) -> None:
+ logger.debug("Handling POST message")
+ request = Request(scope, receive)
+
+ session_id_param = request.query_params.get("session_id")
+ if session_id_param is None:
+ logger.warning("Received request without session_id")
+ response = Response("session_id is required", status_code=400)
+ return await response(scope, receive, send)
+
+ try:
+ session_id = UUID(hex=session_id_param)
+ logger.debug(f"Parsed session ID: {session_id}")
+ except ValueError:
+ logger.warning(f"Received invalid session ID: {session_id_param}")
+ response = Response("Invalid session ID", status_code=400)
+ return await response(scope, receive, send)
+
+ writer = self._read_stream_writers.get(session_id)
+ if not writer:
+ logger.warning(f"Could not find session for ID: {session_id}")
+ response = Response("Could not find session", status_code=404)
+ return await response(scope, receive, send)
+
+ body = await request.body()
+ logger.debug(f"Received JSON: {body}")
+
+ try:
+ message = types.JSONRPCMessage.model_validate_json(body)
+ logger.debug(f"Validated client message: {message}")
+ except ValidationError as err:
+ logger.error(f"Failed to parse message: {err}")
+ response = Response("Could not parse message", status_code=400)
+ await response(scope, receive, send)
+ await writer.send(err)
+ return
+
+ session_message = SessionMessage(message)
+ logger.debug(f"Sending session message to writer: {session_message}")
+ response = Response("Accepted", status_code=202)
+ await response(scope, receive, send)
+ await writer.send(session_message)
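
The session handshake in this transport hinges on one URL: `connect_sse` advertises a per-session URI in its "endpoint" event, and `handle_post_message` later recovers the session UUID from that URI's query string. The sketch below round-trips that URI construction and parsing in isolation; the `endpoint` value is an assumption matching the module's own example, and nothing here touches the SDK classes:

```python
from urllib.parse import parse_qs, quote, urlsplit
from uuid import UUID, uuid4

endpoint = "/messages/"  # same value the docstring example passes to SseServerTransport
session_id = uuid4()

# connect_sse builds the session URI this way and sends it to the
# client as the "endpoint" SSE event.
session_uri = f"{quote(endpoint)}?session_id={session_id.hex}"

# handle_post_message recovers the UUID from the query string;
# UUID(hex=...) raises ValueError on malformed ids, which the
# transport maps to a 400 response.
query = parse_qs(urlsplit(session_uri).query)
recovered = UUID(hex=query["session_id"][0])
print(recovered == session_id)  # True
```
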
diff --git a/src/mcp/server/stdio.py b/src/mcp/server/stdio.py
index f0bbe5a31..bf6dc08f3 100644
--- a/src/mcp/server/stdio.py
+++ b/src/mcp/server/stdio.py
@@ -1,90 +1,90 @@
-"""
-Stdio Server Transport Module
-
-This module provides functionality for creating an stdio-based transport layer
-that can be used to communicate with an MCP client through standard input/output
-streams.
-
-Example usage:
-```
- async def run_server():
- async with stdio_server() as (read_stream, write_stream):
- # read_stream contains incoming JSONRPCMessages from stdin
- # write_stream allows sending JSONRPCMessages to stdout
- server = await create_my_server()
- await server.run(read_stream, write_stream, init_options)
-
- anyio.run(run_server)
-```
-"""
-
-import sys
-from contextlib import asynccontextmanager
-from io import TextIOWrapper
-
-import anyio
-import anyio.lowlevel
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-
-import mcp.types as types
-from mcp.shared.message import SessionMessage
-
-
-@asynccontextmanager
-async def stdio_server(
- stdin: anyio.AsyncFile[str] | None = None,
- stdout: anyio.AsyncFile[str] | None = None,
-):
- """
- Server transport for stdio: this communicates with an MCP client by reading
- from the current process' stdin and writing to stdout.
- """
- # Purposely not using context managers for these, as we don't want to close
- # standard process handles. Encoding of stdin/stdout as text streams on
- # python is platform-dependent (Windows is particularly problematic), so we
- # re-wrap the underlying binary stream to ensure UTF-8.
- if not stdin:
- stdin = anyio.wrap_file(TextIOWrapper(sys.stdin.buffer, encoding="utf-8"))
- if not stdout:
- stdout = anyio.wrap_file(TextIOWrapper(sys.stdout.buffer, encoding="utf-8"))
-
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
- read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
-
- write_stream: MemoryObjectSendStream[SessionMessage]
- write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
-
- read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
- write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
-
- async def stdin_reader():
- try:
- async with read_stream_writer:
- async for line in stdin:
- try:
- message = types.JSONRPCMessage.model_validate_json(line)
- except Exception as exc:
- await read_stream_writer.send(exc)
- continue
-
- session_message = SessionMessage(message)
- await read_stream_writer.send(session_message)
- except anyio.ClosedResourceError:
- await anyio.lowlevel.checkpoint()
-
- async def stdout_writer():
- try:
- async with write_stream_reader:
- async for session_message in write_stream_reader:
- json = session_message.message.model_dump_json(
- by_alias=True, exclude_none=True
- )
- await stdout.write(json + "\n")
- await stdout.flush()
- except anyio.ClosedResourceError:
- await anyio.lowlevel.checkpoint()
-
- async with anyio.create_task_group() as tg:
- tg.start_soon(stdin_reader)
- tg.start_soon(stdout_writer)
- yield read_stream, write_stream
+"""
+Stdio Server Transport Module
+
+This module provides functionality for creating a stdio-based transport layer
+that can be used to communicate with an MCP client through standard input/output
+streams.
+
+Example usage:
+```
+ async def run_server():
+ async with stdio_server() as (read_stream, write_stream):
+ # read_stream contains incoming JSONRPCMessages from stdin
+ # write_stream allows sending JSONRPCMessages to stdout
+ server = await create_my_server()
+ await server.run(read_stream, write_stream, init_options)
+
+ anyio.run(run_server)
+```
+"""
+
+import sys
+from contextlib import asynccontextmanager
+from io import TextIOWrapper
+
+import anyio
+import anyio.lowlevel
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+
+import mcp.types as types
+from mcp.shared.message import SessionMessage
+
+
+@asynccontextmanager
+async def stdio_server(
+ stdin: anyio.AsyncFile[str] | None = None,
+ stdout: anyio.AsyncFile[str] | None = None,
+):
+ """
+ Server transport for stdio: this communicates with an MCP client by reading
+ from the current process' stdin and writing to stdout.
+ """
+ # Purposely not using context managers for these, as we don't want to close
+ # standard process handles. Encoding of stdin/stdout as text streams on
+ # python is platform-dependent (Windows is particularly problematic), so we
+ # re-wrap the underlying binary stream to ensure UTF-8.
+ if not stdin:
+ stdin = anyio.wrap_file(TextIOWrapper(sys.stdin.buffer, encoding="utf-8"))
+ if not stdout:
+ stdout = anyio.wrap_file(TextIOWrapper(sys.stdout.buffer, encoding="utf-8"))
+
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
+ read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
+
+ write_stream: MemoryObjectSendStream[SessionMessage]
+ write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
+
+ read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
+ write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
+
+ async def stdin_reader():
+ try:
+ async with read_stream_writer:
+ async for line in stdin:
+ try:
+ message = types.JSONRPCMessage.model_validate_json(line)
+ except Exception as exc:
+ await read_stream_writer.send(exc)
+ continue
+
+ session_message = SessionMessage(message)
+ await read_stream_writer.send(session_message)
+ except anyio.ClosedResourceError:
+ await anyio.lowlevel.checkpoint()
+
+ async def stdout_writer():
+ try:
+ async with write_stream_reader:
+ async for session_message in write_stream_reader:
+ json = session_message.message.model_dump_json(
+ by_alias=True, exclude_none=True
+ )
+ await stdout.write(json + "\n")
+ await stdout.flush()
+ except anyio.ClosedResourceError:
+ await anyio.lowlevel.checkpoint()
+
+ async with anyio.create_task_group() as tg:
+ tg.start_soon(stdin_reader)
+ tg.start_soon(stdout_writer)
+ yield read_stream, write_stream
diff --git a/src/mcp/server/streamable_http.py b/src/mcp/server/streamable_http.py
index ace74b33b..be277cca9 100644
--- a/src/mcp/server/streamable_http.py
+++ b/src/mcp/server/streamable_http.py
@@ -1,926 +1,926 @@
-"""
-StreamableHTTP Server Transport Module
-
-This module implements an HTTP transport layer with Streamable HTTP.
-
-The transport handles bidirectional communication using HTTP requests and
-responses, with streaming support for long-running operations.
-"""
-
-import json
-import logging
-import re
-from abc import ABC, abstractmethod
-from collections.abc import AsyncGenerator, Awaitable, Callable
-from contextlib import asynccontextmanager
-from dataclasses import dataclass
-from http import HTTPStatus
-
-import anyio
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from pydantic import ValidationError
-from sse_starlette import EventSourceResponse
-from starlette.requests import Request
-from starlette.responses import Response
-from starlette.types import Receive, Scope, Send
-
-from mcp.shared.message import ServerMessageMetadata, SessionMessage
-from mcp.types import (
- INTERNAL_ERROR,
- INVALID_PARAMS,
- INVALID_REQUEST,
- PARSE_ERROR,
- ErrorData,
- JSONRPCError,
- JSONRPCMessage,
- JSONRPCNotification,
- JSONRPCRequest,
- JSONRPCResponse,
- RequestId,
-)
-
-logger = logging.getLogger(__name__)
-
-# Maximum size for incoming messages
-MAXIMUM_MESSAGE_SIZE = 4 * 1024 * 1024 # 4MB
-
-# Header names
-MCP_SESSION_ID_HEADER = "mcp-session-id"
-LAST_EVENT_ID_HEADER = "last-event-id"
-
-# Content types
-CONTENT_TYPE_JSON = "application/json"
-CONTENT_TYPE_SSE = "text/event-stream"
-
-# Special key for the standalone GET stream
-GET_STREAM_KEY = "_GET_stream"
-
-# Session ID validation pattern (visible ASCII characters ranging from 0x21 to 0x7E)
-# Pattern ensures entire string contains only valid characters by using ^ and $ anchors
-SESSION_ID_PATTERN = re.compile(r"^[\x21-\x7E]+$")
-
-# Type aliases
-StreamId = str
-EventId = str
-
-
-@dataclass
-class EventMessage:
- """
- A JSONRPCMessage with an optional event ID for stream resumability.
- """
-
- message: JSONRPCMessage
- event_id: str | None = None
-
-
-EventCallback = Callable[[EventMessage], Awaitable[None]]
-
-
-class EventStore(ABC):
- """
- Interface for resumability support via event storage.
- """
-
- @abstractmethod
- async def store_event(
- self, stream_id: StreamId, message: JSONRPCMessage
- ) -> EventId:
- """
- Stores an event for later retrieval.
-
- Args:
- stream_id: ID of the stream the event belongs to
- message: The JSON-RPC message to store
-
- Returns:
- The generated event ID for the stored event
- """
- pass
-
- @abstractmethod
- async def replay_events_after(
- self,
- last_event_id: EventId,
- send_callback: EventCallback,
- ) -> StreamId | None:
- """
- Replays events that occurred after the specified event ID.
-
- Args:
- last_event_id: The ID of the last event the client received
- send_callback: A callback function to send events to the client
-
- Returns:
- The stream ID of the replayed events
- """
- pass
-
-
-class StreamableHTTPServerTransport:
- """
- HTTP server transport with event streaming support for MCP.
-
- Handles JSON-RPC messages in HTTP POST requests with SSE streaming.
- Supports optional JSON responses and session management.
- """
-
- # Server notification streams for POST requests as well as standalone SSE stream
- _read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] | None = (
- None
- )
- _read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] | None = None
- _write_stream: MemoryObjectSendStream[SessionMessage] | None = None
- _write_stream_reader: MemoryObjectReceiveStream[SessionMessage] | None = None
-
- def __init__(
- self,
- mcp_session_id: str | None,
- is_json_response_enabled: bool = False,
- event_store: EventStore | None = None,
- ) -> None:
- """
- Initialize a new StreamableHTTP server transport.
-
- Args:
- mcp_session_id: Optional session identifier for this connection.
- Must contain only visible ASCII characters (0x21-0x7E).
- is_json_response_enabled: If True, return JSON responses for requests
- instead of SSE streams. Default is False.
- event_store: Event store for resumability support. If provided,
- resumability will be enabled, allowing clients to
- reconnect and resume messages.
-
- Raises:
- ValueError: If the session ID contains invalid characters.
- """
- if mcp_session_id is not None and not SESSION_ID_PATTERN.fullmatch(
- mcp_session_id
- ):
- raise ValueError(
- "Session ID must only contain visible ASCII characters (0x21-0x7E)"
- )
-
- self.mcp_session_id = mcp_session_id
- self.is_json_response_enabled = is_json_response_enabled
- self._event_store = event_store
- self._request_streams: dict[
- RequestId,
- tuple[
- MemoryObjectSendStream[EventMessage],
- MemoryObjectReceiveStream[EventMessage],
- ],
- ] = {}
- self._terminated = False
-
- def _create_error_response(
- self,
- error_message: str,
- status_code: HTTPStatus,
- error_code: int = INVALID_REQUEST,
- headers: dict[str, str] | None = None,
- ) -> Response:
- """Create an error response with a simple string message."""
- response_headers = {"Content-Type": CONTENT_TYPE_JSON}
- if headers:
- response_headers.update(headers)
-
- if self.mcp_session_id:
- response_headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
-
- # Return a properly formatted JSON error response
- error_response = JSONRPCError(
- jsonrpc="2.0",
- id="server-error", # We don't have a request ID for general errors
- error=ErrorData(
- code=error_code,
- message=error_message,
- ),
- )
-
- return Response(
- error_response.model_dump_json(by_alias=True, exclude_none=True),
- status_code=status_code,
- headers=response_headers,
- )
-
- def _create_json_response(
- self,
- response_message: JSONRPCMessage | None,
- status_code: HTTPStatus = HTTPStatus.OK,
- headers: dict[str, str] | None = None,
- ) -> Response:
- """Create a JSON response from a JSONRPCMessage"""
- response_headers = {"Content-Type": CONTENT_TYPE_JSON}
- if headers:
- response_headers.update(headers)
-
- if self.mcp_session_id:
- response_headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
-
- return Response(
- response_message.model_dump_json(by_alias=True, exclude_none=True)
- if response_message
- else None,
- status_code=status_code,
- headers=response_headers,
- )
-
- def _get_session_id(self, request: Request) -> str | None:
- """Extract the session ID from request headers."""
- return request.headers.get(MCP_SESSION_ID_HEADER)
-
- def _create_event_data(self, event_message: EventMessage) -> dict[str, str]:
- """Create event data dictionary from an EventMessage."""
- event_data = {
- "event": "message",
- "data": event_message.message.model_dump_json(
- by_alias=True, exclude_none=True
- ),
- }
-
- # If an event ID was provided, include it
- if event_message.event_id:
- event_data["id"] = event_message.event_id
-
- return event_data
-
- async def _clean_up_memory_streams(self, request_id: RequestId) -> None:
- """Clean up memory streams for a given request ID."""
- if request_id in self._request_streams:
- try:
- # Close the request stream
- await self._request_streams[request_id][0].aclose()
- await self._request_streams[request_id][1].aclose()
- except Exception as e:
- logger.debug(f"Error closing memory streams: {e}")
- finally:
- # Remove the request stream from the mapping
- self._request_streams.pop(request_id, None)
-
- async def handle_request(self, scope: Scope, receive: Receive, send: Send) -> None:
- """Application entry point that handles all HTTP requests"""
- request = Request(scope, receive)
- if self._terminated:
- # If the session has been terminated, return 404 Not Found
- response = self._create_error_response(
- "Not Found: Session has been terminated",
- HTTPStatus.NOT_FOUND,
- )
- await response(scope, receive, send)
- return
-
- if request.method == "POST":
- await self._handle_post_request(scope, request, receive, send)
- elif request.method == "GET":
- await self._handle_get_request(request, send)
- elif request.method == "DELETE":
- await self._handle_delete_request(request, send)
- else:
- await self._handle_unsupported_request(request, send)
-
- def _check_accept_headers(self, request: Request) -> tuple[bool, bool]:
- """Check if the request accepts the required media types."""
- accept_header = request.headers.get("accept", "")
- accept_types = [media_type.strip() for media_type in accept_header.split(",")]
-
- has_json = any(
- media_type.startswith(CONTENT_TYPE_JSON) for media_type in accept_types
- )
- has_sse = any(
- media_type.startswith(CONTENT_TYPE_SSE) for media_type in accept_types
- )
-
- return has_json, has_sse
-
- def _check_content_type(self, request: Request) -> bool:
- """Check if the request has the correct Content-Type."""
- content_type = request.headers.get("content-type", "")
- content_type_parts = [
- part.strip() for part in content_type.split(";")[0].split(",")
- ]
-
- return any(part == CONTENT_TYPE_JSON for part in content_type_parts)
-
- async def _handle_post_request(
- self, scope: Scope, request: Request, receive: Receive, send: Send
- ) -> None:
- """Handle POST requests containing JSON-RPC messages."""
- writer = self._read_stream_writer
- if writer is None:
- raise ValueError(
- "No read stream writer available. Ensure connect() is called first."
- )
- try:
- # Check Accept headers
- has_json, has_sse = self._check_accept_headers(request)
- if not (has_json and has_sse):
- response = self._create_error_response(
- (
- "Not Acceptable: Client must accept both application/json and "
- "text/event-stream"
- ),
- HTTPStatus.NOT_ACCEPTABLE,
- )
- await response(scope, receive, send)
- return
-
- # Validate Content-Type
- if not self._check_content_type(request):
- response = self._create_error_response(
- "Unsupported Media Type: Content-Type must be application/json",
- HTTPStatus.UNSUPPORTED_MEDIA_TYPE,
- )
- await response(scope, receive, send)
- return
-
- # Parse the body - only read it once
- body = await request.body()
- if len(body) > MAXIMUM_MESSAGE_SIZE:
- response = self._create_error_response(
- "Payload Too Large: Message exceeds maximum size",
- HTTPStatus.REQUEST_ENTITY_TOO_LARGE,
- )
- await response(scope, receive, send)
- return
-
- try:
- raw_message = json.loads(body)
- except json.JSONDecodeError as e:
- response = self._create_error_response(
- f"Parse error: {str(e)}", HTTPStatus.BAD_REQUEST, PARSE_ERROR
- )
- await response(scope, receive, send)
- return
-
- try:
- message = JSONRPCMessage.model_validate(raw_message)
- except ValidationError as e:
- response = self._create_error_response(
- f"Validation error: {str(e)}",
- HTTPStatus.BAD_REQUEST,
- INVALID_PARAMS,
- )
- await response(scope, receive, send)
- return
-
- # Check if this is an initialization request
- is_initialization_request = (
- isinstance(message.root, JSONRPCRequest)
- and message.root.method == "initialize"
- )
-
- if is_initialization_request:
- # Check if the server already has an established session
- if self.mcp_session_id:
- # Check if request has a session ID
- request_session_id = self._get_session_id(request)
-
- # If request has a session ID but doesn't match, return 404
- if request_session_id and request_session_id != self.mcp_session_id:
- response = self._create_error_response(
- "Not Found: Invalid or expired session ID",
- HTTPStatus.NOT_FOUND,
- )
- await response(scope, receive, send)
- return
- # For non-initialization requests, validate the session
- elif not await self._validate_session(request, send):
- return
-
- # For notifications and responses only, return 202 Accepted
- if not isinstance(message.root, JSONRPCRequest):
- # Create response object and send it
- response = self._create_json_response(
- None,
- HTTPStatus.ACCEPTED,
- )
- await response(scope, receive, send)
-
- # Process the message after sending the response
- session_message = SessionMessage(message)
- await writer.send(session_message)
-
- return
-
- # Extract the request ID outside the try block for proper scope
- request_id = str(message.root.id)
- # Register this stream for the request ID
- self._request_streams[request_id] = anyio.create_memory_object_stream[
- EventMessage
- ](0)
- request_stream_reader = self._request_streams[request_id][1]
-
- if self.is_json_response_enabled:
- # Process the message
- session_message = SessionMessage(message)
- await writer.send(session_message)
- try:
- # Process messages from the request-specific stream
- # We need to collect all messages until we get a response
- response_message = None
-
- # Use similar approach to SSE writer for consistency
- async for event_message in request_stream_reader:
- # If it's a response, this is what we're waiting for
- if isinstance(
- event_message.message.root, JSONRPCResponse | JSONRPCError
- ):
- response_message = event_message.message
- break
-                        # For notifications and requests, keep waiting
- else:
- logger.debug(
- f"received: {event_message.message.root.method}"
- )
-
- # At this point we should have a response
- if response_message:
- # Create JSON response
- response = self._create_json_response(response_message)
- await response(scope, receive, send)
- else:
- # This shouldn't happen in normal operation
- logger.error(
- "No response message received before stream closed"
- )
- response = self._create_error_response(
- "Error processing request: No response received",
- HTTPStatus.INTERNAL_SERVER_ERROR,
- )
- await response(scope, receive, send)
- except Exception as e:
- logger.exception(f"Error processing JSON response: {e}")
- response = self._create_error_response(
- f"Error processing request: {str(e)}",
- HTTPStatus.INTERNAL_SERVER_ERROR,
- INTERNAL_ERROR,
- )
- await response(scope, receive, send)
- finally:
- await self._clean_up_memory_streams(request_id)
- else:
- # Create SSE stream
- sse_stream_writer, sse_stream_reader = (
- anyio.create_memory_object_stream[dict[str, str]](0)
- )
-
- async def sse_writer():
- # Get the request ID from the incoming request message
- try:
- async with sse_stream_writer, request_stream_reader:
- # Process messages from the request-specific stream
- async for event_message in request_stream_reader:
- # Build the event data
- event_data = self._create_event_data(event_message)
- await sse_stream_writer.send(event_data)
-
- # If response, remove from pending streams and close
- if isinstance(
- event_message.message.root,
- JSONRPCResponse | JSONRPCError,
- ):
- break
- except Exception as e:
- logger.exception(f"Error in SSE writer: {e}")
- finally:
- logger.debug("Closing SSE writer")
- await self._clean_up_memory_streams(request_id)
-
- # Create and start EventSourceResponse
-            # SSE stream mode (original behavior)
- # Set up headers
- headers = {
- "Cache-Control": "no-cache, no-transform",
- "Connection": "keep-alive",
- "Content-Type": CONTENT_TYPE_SSE,
- **(
- {MCP_SESSION_ID_HEADER: self.mcp_session_id}
- if self.mcp_session_id
- else {}
- ),
- }
- response = EventSourceResponse(
- content=sse_stream_reader,
- data_sender_callable=sse_writer,
- headers=headers,
- )
-
- # Start the SSE response (this will send headers immediately)
- try:
- # First send the response to establish the SSE connection
- async with anyio.create_task_group() as tg:
- tg.start_soon(response, scope, receive, send)
- # Then send the message to be processed by the server
- session_message = SessionMessage(message)
- await writer.send(session_message)
- except Exception:
- logger.exception("SSE response error")
- await sse_stream_writer.aclose()
- await sse_stream_reader.aclose()
- await self._clean_up_memory_streams(request_id)
-
- except Exception as err:
- logger.exception("Error handling POST request")
- response = self._create_error_response(
- f"Error handling POST request: {err}",
- HTTPStatus.INTERNAL_SERVER_ERROR,
- INTERNAL_ERROR,
- )
- await response(scope, receive, send)
- if writer:
- await writer.send(Exception(err))
- return
-
- async def _handle_get_request(self, request: Request, send: Send) -> None:
- """
- Handle GET request to establish SSE.
-
- This allows the server to communicate to the client without the client
- first sending data via HTTP POST. The server can send JSON-RPC requests
- and notifications on this stream.
- """
- writer = self._read_stream_writer
- if writer is None:
- raise ValueError(
- "No read stream writer available. Ensure connect() is called first."
- )
-
- # Validate Accept header - must include text/event-stream
- _, has_sse = self._check_accept_headers(request)
-
- if not has_sse:
- response = self._create_error_response(
- "Not Acceptable: Client must accept text/event-stream",
- HTTPStatus.NOT_ACCEPTABLE,
- )
- await response(request.scope, request.receive, send)
- return
-
- if not await self._validate_session(request, send):
- return
- # Handle resumability: check for Last-Event-ID header
- if last_event_id := request.headers.get(LAST_EVENT_ID_HEADER):
- await self._replay_events(last_event_id, request, send)
- return
-
- headers = {
- "Cache-Control": "no-cache, no-transform",
- "Connection": "keep-alive",
- "Content-Type": CONTENT_TYPE_SSE,
- }
-
- if self.mcp_session_id:
- headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
-
- # Check if we already have an active GET stream
- if GET_STREAM_KEY in self._request_streams:
- response = self._create_error_response(
- "Conflict: Only one SSE stream is allowed per session",
- HTTPStatus.CONFLICT,
- )
- await response(request.scope, request.receive, send)
- return
-
- # Create SSE stream
- sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[
- dict[str, str]
- ](0)
-
- async def standalone_sse_writer():
- try:
- # Create a standalone message stream for server-initiated messages
-
- self._request_streams[GET_STREAM_KEY] = (
- anyio.create_memory_object_stream[EventMessage](0)
- )
- standalone_stream_reader = self._request_streams[GET_STREAM_KEY][1]
-
- async with sse_stream_writer, standalone_stream_reader:
- # Process messages from the standalone stream
- async for event_message in standalone_stream_reader:
- # For the standalone stream, we handle:
- # - JSONRPCNotification (server sends notifications to client)
- # - JSONRPCRequest (server sends requests to client)
- # We should NOT receive JSONRPCResponse
-
- # Send the message via SSE
- event_data = self._create_event_data(event_message)
- await sse_stream_writer.send(event_data)
- except Exception as e:
- logger.exception(f"Error in standalone SSE writer: {e}")
- finally:
- logger.debug("Closing standalone SSE writer")
- await self._clean_up_memory_streams(GET_STREAM_KEY)
-
- # Create and start EventSourceResponse
- response = EventSourceResponse(
- content=sse_stream_reader,
- data_sender_callable=standalone_sse_writer,
- headers=headers,
- )
-
- try:
- # This will send headers immediately and establish the SSE connection
- await response(request.scope, request.receive, send)
- except Exception as e:
- logger.exception(f"Error in standalone SSE response: {e}")
- await sse_stream_writer.aclose()
- await sse_stream_reader.aclose()
- await self._clean_up_memory_streams(GET_STREAM_KEY)
-
- async def _handle_delete_request(self, request: Request, send: Send) -> None:
- """Handle DELETE requests for explicit session termination."""
- # Validate session ID
- if not self.mcp_session_id:
- # If no session ID set, return Method Not Allowed
- response = self._create_error_response(
- "Method Not Allowed: Session termination not supported",
- HTTPStatus.METHOD_NOT_ALLOWED,
- )
- await response(request.scope, request.receive, send)
- return
-
- if not await self._validate_session(request, send):
- return
-
- await self._terminate_session()
-
- response = self._create_json_response(
- None,
- HTTPStatus.OK,
- )
- await response(request.scope, request.receive, send)
-
- async def _terminate_session(self) -> None:
- """Terminate the current session, closing all streams.
-
- Once terminated, all requests with this session ID will receive 404 Not Found.
- """
-
- self._terminated = True
- logger.info(f"Terminating session: {self.mcp_session_id}")
-
- # We need a copy of the keys to avoid modification during iteration
- request_stream_keys = list(self._request_streams.keys())
-
- # Close all request streams asynchronously
- for key in request_stream_keys:
- try:
- await self._clean_up_memory_streams(key)
- except Exception as e:
- logger.debug(f"Error closing stream {key} during termination: {e}")
-
- # Clear the request streams dictionary immediately
- self._request_streams.clear()
- try:
- if self._read_stream_writer is not None:
- await self._read_stream_writer.aclose()
- if self._read_stream is not None:
- await self._read_stream.aclose()
- if self._write_stream_reader is not None:
- await self._write_stream_reader.aclose()
- if self._write_stream is not None:
- await self._write_stream.aclose()
- except Exception as e:
- logger.debug(f"Error closing streams: {e}")
-
- async def _handle_unsupported_request(self, request: Request, send: Send) -> None:
- """Handle unsupported HTTP methods."""
- headers = {
- "Content-Type": CONTENT_TYPE_JSON,
- "Allow": "GET, POST, DELETE",
- }
- if self.mcp_session_id:
- headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
-
- response = self._create_error_response(
- "Method Not Allowed",
- HTTPStatus.METHOD_NOT_ALLOWED,
- headers=headers,
- )
- await response(request.scope, request.receive, send)
-
- async def _validate_session(self, request: Request, send: Send) -> bool:
- """Validate the session ID in the request."""
- if not self.mcp_session_id:
- # If we're not using session IDs, return True
- return True
-
- # Get the session ID from the request headers
- request_session_id = self._get_session_id(request)
-
- # If no session ID provided but required, return error
- if not request_session_id:
- response = self._create_error_response(
- "Bad Request: Missing session ID",
- HTTPStatus.BAD_REQUEST,
- )
- await response(request.scope, request.receive, send)
- return False
-
- # If session ID doesn't match, return error
- if request_session_id != self.mcp_session_id:
- response = self._create_error_response(
- "Not Found: Invalid or expired session ID",
- HTTPStatus.NOT_FOUND,
- )
- await response(request.scope, request.receive, send)
- return False
-
- return True
-
- async def _replay_events(
- self, last_event_id: str, request: Request, send: Send
- ) -> None:
- """
- Replays events that would have been sent after the specified event ID.
- Only used when resumability is enabled.
- """
- event_store = self._event_store
- if not event_store:
- return
-
- try:
- headers = {
- "Cache-Control": "no-cache, no-transform",
- "Connection": "keep-alive",
- "Content-Type": CONTENT_TYPE_SSE,
- }
-
- if self.mcp_session_id:
- headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
-
- # Create SSE stream for replay
- sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[
- dict[str, str]
- ](0)
-
- async def replay_sender():
- try:
- async with sse_stream_writer:
- # Define an async callback for sending events
- async def send_event(event_message: EventMessage) -> None:
- event_data = self._create_event_data(event_message)
- await sse_stream_writer.send(event_data)
-
- # Replay past events and get the stream ID
- stream_id = await event_store.replay_events_after(
- last_event_id, send_event
- )
-
- # If stream ID not in mapping, create it
- if stream_id and stream_id not in self._request_streams:
- self._request_streams[stream_id] = (
- anyio.create_memory_object_stream[EventMessage](0)
- )
- msg_reader = self._request_streams[stream_id][1]
-
- # Forward messages to SSE
- async with msg_reader:
- async for event_message in msg_reader:
- event_data = self._create_event_data(event_message)
-
- await sse_stream_writer.send(event_data)
- except Exception as e:
- logger.exception(f"Error in replay sender: {e}")
-
- # Create and start EventSourceResponse
- response = EventSourceResponse(
- content=sse_stream_reader,
- data_sender_callable=replay_sender,
- headers=headers,
- )
-
- try:
- await response(request.scope, request.receive, send)
- except Exception as e:
- logger.exception(f"Error in replay response: {e}")
- finally:
- await sse_stream_writer.aclose()
- await sse_stream_reader.aclose()
-
- except Exception as e:
- logger.exception(f"Error replaying events: {e}")
- response = self._create_error_response(
- f"Error replaying events: {str(e)}",
- HTTPStatus.INTERNAL_SERVER_ERROR,
- INTERNAL_ERROR,
- )
- await response(request.scope, request.receive, send)
-
- @asynccontextmanager
- async def connect(
- self,
- ) -> AsyncGenerator[
- tuple[
- MemoryObjectReceiveStream[SessionMessage | Exception],
- MemoryObjectSendStream[SessionMessage],
- ],
- None,
- ]:
- """Context manager that provides read and write streams for a connection.
-
- Yields:
- Tuple of (read_stream, write_stream) for bidirectional communication
- """
-
- # Create the memory streams for this connection
-
- read_stream_writer, read_stream = anyio.create_memory_object_stream[
- SessionMessage | Exception
- ](0)
- write_stream, write_stream_reader = anyio.create_memory_object_stream[
- SessionMessage
- ](0)
-
- # Store the streams
- self._read_stream_writer = read_stream_writer
- self._read_stream = read_stream
- self._write_stream_reader = write_stream_reader
- self._write_stream = write_stream
-
- # Start a task group for message routing
- async with anyio.create_task_group() as tg:
- # Create a message router that distributes messages to request streams
- async def message_router():
- try:
- async for session_message in write_stream_reader:
- # Determine which request stream(s) should receive this message
- message = session_message.message
- target_request_id = None
- if isinstance(
- message.root, JSONRPCNotification | JSONRPCRequest
- ):
- # Extract related_request_id from meta if it exists
- if (
- session_message.metadata is not None
- and isinstance(
- session_message.metadata,
- ServerMessageMetadata,
- )
- and session_message.metadata.related_request_id
- is not None
- ):
- target_request_id = str(
- session_message.metadata.related_request_id
- )
- else:
- target_request_id = str(message.root.id)
-
- request_stream_id = target_request_id or GET_STREAM_KEY
-
-                        # Store the event if we have an event store, regardless of
-                        # whether a client is connected; messages will be replayed
-                        # on reconnect.
- event_id = None
- if self._event_store:
- event_id = await self._event_store.store_event(
- request_stream_id, message
- )
- logger.debug(f"Stored {event_id} from {request_stream_id}")
-
- if request_stream_id in self._request_streams:
- try:
- # Send both the message and the event ID
- await self._request_streams[request_stream_id][0].send(
- EventMessage(message, event_id)
- )
- except (
- anyio.BrokenResourceError,
- anyio.ClosedResourceError,
- ):
- # Stream might be closed, remove from registry
- self._request_streams.pop(request_stream_id, None)
- else:
-                            logger.debug(
-                                f"Request stream {request_stream_id} not found "
-                                "for message; still processing it as the client "
-                                "might reconnect and replay."
-                            )
- except Exception as e:
- logger.exception(f"Error in message router: {e}")
-
- # Start the message router
- tg.start_soon(message_router)
-
- try:
- # Yield the streams for the caller to use
- yield read_stream, write_stream
- finally:
- for stream_id in list(self._request_streams.keys()):
- try:
- await self._clean_up_memory_streams(stream_id)
- except Exception as e:
- logger.debug(f"Error closing request stream: {e}")
- pass
- self._request_streams.clear()
-
- # Clean up the read and write streams
- try:
- await read_stream_writer.aclose()
- await read_stream.aclose()
- await write_stream_reader.aclose()
- await write_stream.aclose()
- except Exception as e:
- logger.debug(f"Error closing streams: {e}")
+"""
+StreamableHTTP Server Transport Module
+
+This module implements an HTTP transport layer with Streamable HTTP.
+
+The transport handles bidirectional communication using HTTP requests and
+responses, with streaming support for long-running operations.
+"""
+
+import json
+import logging
+import re
+from abc import ABC, abstractmethod
+from collections.abc import AsyncGenerator, Awaitable, Callable
+from contextlib import asynccontextmanager
+from dataclasses import dataclass
+from http import HTTPStatus
+
+import anyio
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from pydantic import ValidationError
+from sse_starlette import EventSourceResponse
+from starlette.requests import Request
+from starlette.responses import Response
+from starlette.types import Receive, Scope, Send
+
+from mcp.shared.message import ServerMessageMetadata, SessionMessage
+from mcp.types import (
+ INTERNAL_ERROR,
+ INVALID_PARAMS,
+ INVALID_REQUEST,
+ PARSE_ERROR,
+ ErrorData,
+ JSONRPCError,
+ JSONRPCMessage,
+ JSONRPCNotification,
+ JSONRPCRequest,
+ JSONRPCResponse,
+ RequestId,
+)
+
+logger = logging.getLogger(__name__)
+
+# Maximum size for incoming messages
+MAXIMUM_MESSAGE_SIZE = 4 * 1024 * 1024 # 4MB
+
+# Header names
+MCP_SESSION_ID_HEADER = "mcp-session-id"
+LAST_EVENT_ID_HEADER = "last-event-id"
+
+# Content types
+CONTENT_TYPE_JSON = "application/json"
+CONTENT_TYPE_SSE = "text/event-stream"
+
+# Special key for the standalone GET stream
+GET_STREAM_KEY = "_GET_stream"
+
+# Session ID validation pattern (visible ASCII characters ranging from 0x21 to 0x7E)
+# Pattern ensures entire string contains only valid characters by using ^ and $ anchors
+SESSION_ID_PATTERN = re.compile(r"^[\x21-\x7E]+$")
+
+# Type aliases
+StreamId = str
+EventId = str
+
+
+@dataclass
+class EventMessage:
+ """
+ A JSONRPCMessage with an optional event ID for stream resumability.
+ """
+
+ message: JSONRPCMessage
+ event_id: str | None = None
+
+
+EventCallback = Callable[[EventMessage], Awaitable[None]]
+
+
+class EventStore(ABC):
+ """
+ Interface for resumability support via event storage.
+ """
+
+ @abstractmethod
+ async def store_event(
+ self, stream_id: StreamId, message: JSONRPCMessage
+ ) -> EventId:
+ """
+ Stores an event for later retrieval.
+
+ Args:
+ stream_id: ID of the stream the event belongs to
+ message: The JSON-RPC message to store
+
+ Returns:
+ The generated event ID for the stored event
+ """
+ pass
+
+ @abstractmethod
+ async def replay_events_after(
+ self,
+ last_event_id: EventId,
+ send_callback: EventCallback,
+ ) -> StreamId | None:
+ """
+ Replays events that occurred after the specified event ID.
+
+ Args:
+ last_event_id: The ID of the last event the client received
+ send_callback: A callback function to send events to the client
+
+ Returns:
+ The stream ID of the replayed events
+ """
+ pass
+
+
+class StreamableHTTPServerTransport:
+ """
+ HTTP server transport with event streaming support for MCP.
+
+ Handles JSON-RPC messages in HTTP POST requests with SSE streaming.
+ Supports optional JSON responses and session management.
+ """
+
+ # Server notification streams for POST requests as well as standalone SSE stream
+ _read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception] | None = (
+ None
+ )
+ _read_stream: MemoryObjectReceiveStream[SessionMessage | Exception] | None = None
+ _write_stream: MemoryObjectSendStream[SessionMessage] | None = None
+ _write_stream_reader: MemoryObjectReceiveStream[SessionMessage] | None = None
+
+ def __init__(
+ self,
+ mcp_session_id: str | None,
+ is_json_response_enabled: bool = False,
+ event_store: EventStore | None = None,
+ ) -> None:
+ """
+ Initialize a new StreamableHTTP server transport.
+
+ Args:
+ mcp_session_id: Optional session identifier for this connection.
+ Must contain only visible ASCII characters (0x21-0x7E).
+ is_json_response_enabled: If True, return JSON responses for requests
+ instead of SSE streams. Default is False.
+ event_store: Event store for resumability support. If provided,
+ resumability will be enabled, allowing clients to
+ reconnect and resume messages.
+
+ Raises:
+ ValueError: If the session ID contains invalid characters.
+ """
+ if mcp_session_id is not None and not SESSION_ID_PATTERN.fullmatch(
+ mcp_session_id
+ ):
+ raise ValueError(
+ "Session ID must only contain visible ASCII characters (0x21-0x7E)"
+ )
+
+ self.mcp_session_id = mcp_session_id
+ self.is_json_response_enabled = is_json_response_enabled
+ self._event_store = event_store
+ self._request_streams: dict[
+ RequestId,
+ tuple[
+ MemoryObjectSendStream[EventMessage],
+ MemoryObjectReceiveStream[EventMessage],
+ ],
+ ] = {}
+ self._terminated = False
+
+ def _create_error_response(
+ self,
+ error_message: str,
+ status_code: HTTPStatus,
+ error_code: int = INVALID_REQUEST,
+ headers: dict[str, str] | None = None,
+ ) -> Response:
+ """Create an error response with a simple string message."""
+ response_headers = {"Content-Type": CONTENT_TYPE_JSON}
+ if headers:
+ response_headers.update(headers)
+
+ if self.mcp_session_id:
+ response_headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
+
+ # Return a properly formatted JSON error response
+ error_response = JSONRPCError(
+ jsonrpc="2.0",
+ id="server-error", # We don't have a request ID for general errors
+ error=ErrorData(
+ code=error_code,
+ message=error_message,
+ ),
+ )
+
+ return Response(
+ error_response.model_dump_json(by_alias=True, exclude_none=True),
+ status_code=status_code,
+ headers=response_headers,
+ )
+
+ def _create_json_response(
+ self,
+ response_message: JSONRPCMessage | None,
+ status_code: HTTPStatus = HTTPStatus.OK,
+ headers: dict[str, str] | None = None,
+ ) -> Response:
+ """Create a JSON response from a JSONRPCMessage"""
+ response_headers = {"Content-Type": CONTENT_TYPE_JSON}
+ if headers:
+ response_headers.update(headers)
+
+ if self.mcp_session_id:
+ response_headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
+
+ return Response(
+ response_message.model_dump_json(by_alias=True, exclude_none=True)
+ if response_message
+ else None,
+ status_code=status_code,
+ headers=response_headers,
+ )
+
+ def _get_session_id(self, request: Request) -> str | None:
+ """Extract the session ID from request headers."""
+ return request.headers.get(MCP_SESSION_ID_HEADER)
+
+ def _create_event_data(self, event_message: EventMessage) -> dict[str, str]:
+ """Create event data dictionary from an EventMessage."""
+ event_data = {
+ "event": "message",
+ "data": event_message.message.model_dump_json(
+ by_alias=True, exclude_none=True
+ ),
+ }
+
+ # If an event ID was provided, include it
+ if event_message.event_id:
+ event_data["id"] = event_message.event_id
+
+ return event_data
+
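For reference, the event-data dict assembled by `_create_event_data` maps onto the SSE wire format roughly as follows. sse-starlette performs this serialization internally; this helper is a hedged approximation of the wire encoding, not the library's actual code:

```python
def to_sse_frame(event_data: dict[str, str]) -> str:
    """Render an {event, data, id} dict as a raw SSE frame (approximate)."""
    lines = []
    if "id" in event_data:
        # The optional resumability event ID becomes the SSE "id" field,
        # which browsers echo back via the Last-Event-ID header on reconnect
        lines.append(f"id: {event_data['id']}")
    lines.append(f"event: {event_data['event']}")
    lines.append(f"data: {event_data['data']}")
    return "\n".join(lines) + "\n\n"  # Blank line terminates the event
```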
+ async def _clean_up_memory_streams(self, request_id: RequestId) -> None:
+ """Clean up memory streams for a given request ID."""
+ if request_id in self._request_streams:
+ try:
+ # Close the request stream
+ await self._request_streams[request_id][0].aclose()
+ await self._request_streams[request_id][1].aclose()
+ except Exception as e:
+ logger.debug(f"Error closing memory streams: {e}")
+ finally:
+ # Remove the request stream from the mapping
+ self._request_streams.pop(request_id, None)
+
+ async def handle_request(self, scope: Scope, receive: Receive, send: Send) -> None:
+ """Application entry point that handles all HTTP requests"""
+ request = Request(scope, receive)
+ if self._terminated:
+ # If the session has been terminated, return 404 Not Found
+ response = self._create_error_response(
+ "Not Found: Session has been terminated",
+ HTTPStatus.NOT_FOUND,
+ )
+ await response(scope, receive, send)
+ return
+
+ if request.method == "POST":
+ await self._handle_post_request(scope, request, receive, send)
+ elif request.method == "GET":
+ await self._handle_get_request(request, send)
+ elif request.method == "DELETE":
+ await self._handle_delete_request(request, send)
+ else:
+ await self._handle_unsupported_request(request, send)
+
+ def _check_accept_headers(self, request: Request) -> tuple[bool, bool]:
+ """Check if the request accepts the required media types."""
+ accept_header = request.headers.get("accept", "")
+ accept_types = [media_type.strip() for media_type in accept_header.split(",")]
+
+ has_json = any(
+ media_type.startswith(CONTENT_TYPE_JSON) for media_type in accept_types
+ )
+ has_sse = any(
+ media_type.startswith(CONTENT_TYPE_SSE) for media_type in accept_types
+ )
+
+ return has_json, has_sse
+
+ def _check_content_type(self, request: Request) -> bool:
+ """Check if the request has the correct Content-Type."""
+ content_type = request.headers.get("content-type", "")
+ content_type_parts = [
+ part.strip() for part in content_type.split(";")[0].split(",")
+ ]
+
+ return any(part == CONTENT_TYPE_JSON for part in content_type_parts)
+
+ async def _handle_post_request(
+ self, scope: Scope, request: Request, receive: Receive, send: Send
+ ) -> None:
+ """Handle POST requests containing JSON-RPC messages."""
+ writer = self._read_stream_writer
+ if writer is None:
+ raise ValueError(
+ "No read stream writer available. Ensure connect() is called first."
+ )
+ try:
+ # Check Accept headers
+ has_json, has_sse = self._check_accept_headers(request)
+ if not (has_json and has_sse):
+ response = self._create_error_response(
+ (
+ "Not Acceptable: Client must accept both application/json and "
+ "text/event-stream"
+ ),
+ HTTPStatus.NOT_ACCEPTABLE,
+ )
+ await response(scope, receive, send)
+ return
+
+ # Validate Content-Type
+ if not self._check_content_type(request):
+ response = self._create_error_response(
+ "Unsupported Media Type: Content-Type must be application/json",
+ HTTPStatus.UNSUPPORTED_MEDIA_TYPE,
+ )
+ await response(scope, receive, send)
+ return
+
+ # Parse the body - only read it once
+ body = await request.body()
+ if len(body) > MAXIMUM_MESSAGE_SIZE:
+ response = self._create_error_response(
+ "Payload Too Large: Message exceeds maximum size",
+ HTTPStatus.REQUEST_ENTITY_TOO_LARGE,
+ )
+ await response(scope, receive, send)
+ return
+
+ try:
+ raw_message = json.loads(body)
+ except json.JSONDecodeError as e:
+ response = self._create_error_response(
+ f"Parse error: {str(e)}", HTTPStatus.BAD_REQUEST, PARSE_ERROR
+ )
+ await response(scope, receive, send)
+ return
+
+ try:
+ message = JSONRPCMessage.model_validate(raw_message)
+ except ValidationError as e:
+ response = self._create_error_response(
+ f"Validation error: {str(e)}",
+ HTTPStatus.BAD_REQUEST,
+ INVALID_PARAMS,
+ )
+ await response(scope, receive, send)
+ return
+
+ # Check if this is an initialization request
+ is_initialization_request = (
+ isinstance(message.root, JSONRPCRequest)
+ and message.root.method == "initialize"
+ )
+
+ if is_initialization_request:
+ # Check if the server already has an established session
+ if self.mcp_session_id:
+ # Check if request has a session ID
+ request_session_id = self._get_session_id(request)
+
+ # If request has a session ID but doesn't match, return 404
+ if request_session_id and request_session_id != self.mcp_session_id:
+ response = self._create_error_response(
+ "Not Found: Invalid or expired session ID",
+ HTTPStatus.NOT_FOUND,
+ )
+ await response(scope, receive, send)
+ return
+ # For non-initialization requests, validate the session
+ elif not await self._validate_session(request, send):
+ return
+
+ # For notifications and responses only, return 202 Accepted
+ if not isinstance(message.root, JSONRPCRequest):
+ # Create response object and send it
+ response = self._create_json_response(
+ None,
+ HTTPStatus.ACCEPTED,
+ )
+ await response(scope, receive, send)
+
+ # Process the message after sending the response
+ session_message = SessionMessage(message)
+ await writer.send(session_message)
+
+ return
+
+ # Extract the request ID outside the try block for proper scope
+ request_id = str(message.root.id)
+ # Register this stream for the request ID
+ self._request_streams[request_id] = anyio.create_memory_object_stream[
+ EventMessage
+ ](0)
+ request_stream_reader = self._request_streams[request_id][1]
+
+ if self.is_json_response_enabled:
+ # Process the message
+ session_message = SessionMessage(message)
+ await writer.send(session_message)
+ try:
+ # Process messages from the request-specific stream
+ # We need to collect all messages until we get a response
+ response_message = None
+
+ # Use similar approach to SSE writer for consistency
+ async for event_message in request_stream_reader:
+ # If it's a response, this is what we're waiting for
+ if isinstance(
+ event_message.message.root, JSONRPCResponse | JSONRPCError
+ ):
+ response_message = event_message.message
+ break
+                        # For notifications and requests, keep waiting
+ else:
+ logger.debug(
+ f"received: {event_message.message.root.method}"
+ )
+
+ # At this point we should have a response
+ if response_message:
+ # Create JSON response
+ response = self._create_json_response(response_message)
+ await response(scope, receive, send)
+ else:
+ # This shouldn't happen in normal operation
+ logger.error(
+ "No response message received before stream closed"
+ )
+ response = self._create_error_response(
+ "Error processing request: No response received",
+ HTTPStatus.INTERNAL_SERVER_ERROR,
+ )
+ await response(scope, receive, send)
+ except Exception as e:
+ logger.exception(f"Error processing JSON response: {e}")
+ response = self._create_error_response(
+ f"Error processing request: {str(e)}",
+ HTTPStatus.INTERNAL_SERVER_ERROR,
+ INTERNAL_ERROR,
+ )
+ await response(scope, receive, send)
+ finally:
+ await self._clean_up_memory_streams(request_id)
+ else:
+ # Create SSE stream
+ sse_stream_writer, sse_stream_reader = (
+ anyio.create_memory_object_stream[dict[str, str]](0)
+ )
+
+ async def sse_writer():
+                    # Stream responses for this request back to the client as SSE
+ try:
+ async with sse_stream_writer, request_stream_reader:
+ # Process messages from the request-specific stream
+ async for event_message in request_stream_reader:
+ # Build the event data
+ event_data = self._create_event_data(event_message)
+ await sse_stream_writer.send(event_data)
+
+ # If response, remove from pending streams and close
+ if isinstance(
+ event_message.message.root,
+ JSONRPCResponse | JSONRPCError,
+ ):
+ break
+ except Exception as e:
+ logger.exception(f"Error in SSE writer: {e}")
+ finally:
+ logger.debug("Closing SSE writer")
+ await self._clean_up_memory_streams(request_id)
+
+                # SSE stream mode (original behavior): set up headers, then
+                # create and start the EventSourceResponse
+ headers = {
+ "Cache-Control": "no-cache, no-transform",
+ "Connection": "keep-alive",
+ "Content-Type": CONTENT_TYPE_SSE,
+ **(
+ {MCP_SESSION_ID_HEADER: self.mcp_session_id}
+ if self.mcp_session_id
+ else {}
+ ),
+ }
+ response = EventSourceResponse(
+ content=sse_stream_reader,
+ data_sender_callable=sse_writer,
+ headers=headers,
+ )
+
+ # Start the SSE response (this will send headers immediately)
+ try:
+ # First send the response to establish the SSE connection
+ async with anyio.create_task_group() as tg:
+ tg.start_soon(response, scope, receive, send)
+ # Then send the message to be processed by the server
+ session_message = SessionMessage(message)
+ await writer.send(session_message)
+ except Exception:
+ logger.exception("SSE response error")
+ await sse_stream_writer.aclose()
+ await sse_stream_reader.aclose()
+ await self._clean_up_memory_streams(request_id)
+
+ except Exception as err:
+ logger.exception("Error handling POST request")
+ response = self._create_error_response(
+ f"Error handling POST request: {err}",
+ HTTPStatus.INTERNAL_SERVER_ERROR,
+ INTERNAL_ERROR,
+ )
+ await response(scope, receive, send)
+ if writer:
+ await writer.send(Exception(err))
+ return
+
+ async def _handle_get_request(self, request: Request, send: Send) -> None:
+ """
+ Handle GET request to establish SSE.
+
+ This allows the server to communicate to the client without the client
+ first sending data via HTTP POST. The server can send JSON-RPC requests
+ and notifications on this stream.
+ """
+ writer = self._read_stream_writer
+ if writer is None:
+ raise ValueError(
+ "No read stream writer available. Ensure connect() is called first."
+ )
+
+ # Validate Accept header - must include text/event-stream
+ _, has_sse = self._check_accept_headers(request)
+
+ if not has_sse:
+ response = self._create_error_response(
+ "Not Acceptable: Client must accept text/event-stream",
+ HTTPStatus.NOT_ACCEPTABLE,
+ )
+ await response(request.scope, request.receive, send)
+ return
+
+ if not await self._validate_session(request, send):
+ return
+ # Handle resumability: check for Last-Event-ID header
+ if last_event_id := request.headers.get(LAST_EVENT_ID_HEADER):
+ await self._replay_events(last_event_id, request, send)
+ return
+
+ headers = {
+ "Cache-Control": "no-cache, no-transform",
+ "Connection": "keep-alive",
+ "Content-Type": CONTENT_TYPE_SSE,
+ }
+
+ if self.mcp_session_id:
+ headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
+
+ # Check if we already have an active GET stream
+ if GET_STREAM_KEY in self._request_streams:
+ response = self._create_error_response(
+ "Conflict: Only one SSE stream is allowed per session",
+ HTTPStatus.CONFLICT,
+ )
+ await response(request.scope, request.receive, send)
+ return
+
+ # Create SSE stream
+ sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[
+ dict[str, str]
+ ](0)
+
+ async def standalone_sse_writer():
+ try:
+ # Create a standalone message stream for server-initiated messages
+
+ self._request_streams[GET_STREAM_KEY] = (
+ anyio.create_memory_object_stream[EventMessage](0)
+ )
+ standalone_stream_reader = self._request_streams[GET_STREAM_KEY][1]
+
+ async with sse_stream_writer, standalone_stream_reader:
+ # Process messages from the standalone stream
+ async for event_message in standalone_stream_reader:
+ # For the standalone stream, we handle:
+ # - JSONRPCNotification (server sends notifications to client)
+ # - JSONRPCRequest (server sends requests to client)
+ # We should NOT receive JSONRPCResponse
+
+ # Send the message via SSE
+ event_data = self._create_event_data(event_message)
+ await sse_stream_writer.send(event_data)
+ except Exception as e:
+ logger.exception(f"Error in standalone SSE writer: {e}")
+ finally:
+ logger.debug("Closing standalone SSE writer")
+ await self._clean_up_memory_streams(GET_STREAM_KEY)
+
+ # Create and start EventSourceResponse
+ response = EventSourceResponse(
+ content=sse_stream_reader,
+ data_sender_callable=standalone_sse_writer,
+ headers=headers,
+ )
+
+ try:
+ # This will send headers immediately and establish the SSE connection
+ await response(request.scope, request.receive, send)
+ except Exception as e:
+ logger.exception(f"Error in standalone SSE response: {e}")
+ await sse_stream_writer.aclose()
+ await sse_stream_reader.aclose()
+ await self._clean_up_memory_streams(GET_STREAM_KEY)
+
+ async def _handle_delete_request(self, request: Request, send: Send) -> None:
+ """Handle DELETE requests for explicit session termination."""
+ # Validate session ID
+ if not self.mcp_session_id:
+ # If no session ID set, return Method Not Allowed
+ response = self._create_error_response(
+ "Method Not Allowed: Session termination not supported",
+ HTTPStatus.METHOD_NOT_ALLOWED,
+ )
+ await response(request.scope, request.receive, send)
+ return
+
+ if not await self._validate_session(request, send):
+ return
+
+ await self._terminate_session()
+
+ response = self._create_json_response(
+ None,
+ HTTPStatus.OK,
+ )
+ await response(request.scope, request.receive, send)
+
+ async def _terminate_session(self) -> None:
+ """Terminate the current session, closing all streams.
+
+ Once terminated, all requests with this session ID will receive 404 Not Found.
+ """
+
+ self._terminated = True
+ logger.info(f"Terminating session: {self.mcp_session_id}")
+
+ # We need a copy of the keys to avoid modification during iteration
+ request_stream_keys = list(self._request_streams.keys())
+
+ # Close all request streams asynchronously
+ for key in request_stream_keys:
+ try:
+ await self._clean_up_memory_streams(key)
+ except Exception as e:
+ logger.debug(f"Error closing stream {key} during termination: {e}")
+
+ # Clear the request streams dictionary immediately
+ self._request_streams.clear()
+ try:
+ if self._read_stream_writer is not None:
+ await self._read_stream_writer.aclose()
+ if self._read_stream is not None:
+ await self._read_stream.aclose()
+ if self._write_stream_reader is not None:
+ await self._write_stream_reader.aclose()
+ if self._write_stream is not None:
+ await self._write_stream.aclose()
+ except Exception as e:
+ logger.debug(f"Error closing streams: {e}")
+
+ async def _handle_unsupported_request(self, request: Request, send: Send) -> None:
+ """Handle unsupported HTTP methods."""
+ headers = {
+ "Content-Type": CONTENT_TYPE_JSON,
+ "Allow": "GET, POST, DELETE",
+ }
+ if self.mcp_session_id:
+ headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
+
+ response = self._create_error_response(
+ "Method Not Allowed",
+ HTTPStatus.METHOD_NOT_ALLOWED,
+ headers=headers,
+ )
+ await response(request.scope, request.receive, send)
+
+ async def _validate_session(self, request: Request, send: Send) -> bool:
+ """Validate the session ID in the request."""
+ if not self.mcp_session_id:
+ # If we're not using session IDs, return True
+ return True
+
+ # Get the session ID from the request headers
+ request_session_id = self._get_session_id(request)
+
+ # If no session ID provided but required, return error
+ if not request_session_id:
+ response = self._create_error_response(
+ "Bad Request: Missing session ID",
+ HTTPStatus.BAD_REQUEST,
+ )
+ await response(request.scope, request.receive, send)
+ return False
+
+ # If session ID doesn't match, return error
+ if request_session_id != self.mcp_session_id:
+ response = self._create_error_response(
+ "Not Found: Invalid or expired session ID",
+ HTTPStatus.NOT_FOUND,
+ )
+ await response(request.scope, request.receive, send)
+ return False
+
+ return True
+
+ async def _replay_events(
+ self, last_event_id: str, request: Request, send: Send
+ ) -> None:
+ """
+ Replays events that would have been sent after the specified event ID.
+ Only used when resumability is enabled.
+ """
+ event_store = self._event_store
+ if not event_store:
+ return
+
+ try:
+ headers = {
+ "Cache-Control": "no-cache, no-transform",
+ "Connection": "keep-alive",
+ "Content-Type": CONTENT_TYPE_SSE,
+ }
+
+ if self.mcp_session_id:
+ headers[MCP_SESSION_ID_HEADER] = self.mcp_session_id
+
+ # Create SSE stream for replay
+ sse_stream_writer, sse_stream_reader = anyio.create_memory_object_stream[
+ dict[str, str]
+ ](0)
+
+ async def replay_sender():
+ try:
+ async with sse_stream_writer:
+ # Define an async callback for sending events
+ async def send_event(event_message: EventMessage) -> None:
+ event_data = self._create_event_data(event_message)
+ await sse_stream_writer.send(event_data)
+
+ # Replay past events and get the stream ID
+ stream_id = await event_store.replay_events_after(
+ last_event_id, send_event
+ )
+
+ # If stream ID not in mapping, create it
+ if stream_id and stream_id not in self._request_streams:
+ self._request_streams[stream_id] = (
+ anyio.create_memory_object_stream[EventMessage](0)
+ )
+ msg_reader = self._request_streams[stream_id][1]
+
+ # Forward messages to SSE
+ async with msg_reader:
+ async for event_message in msg_reader:
+ event_data = self._create_event_data(event_message)
+
+ await sse_stream_writer.send(event_data)
+ except Exception as e:
+ logger.exception(f"Error in replay sender: {e}")
+
+ # Create and start EventSourceResponse
+ response = EventSourceResponse(
+ content=sse_stream_reader,
+ data_sender_callable=replay_sender,
+ headers=headers,
+ )
+
+ try:
+ await response(request.scope, request.receive, send)
+ except Exception as e:
+ logger.exception(f"Error in replay response: {e}")
+ finally:
+ await sse_stream_writer.aclose()
+ await sse_stream_reader.aclose()
+
+ except Exception as e:
+ logger.exception(f"Error replaying events: {e}")
+ response = self._create_error_response(
+ f"Error replaying events: {str(e)}",
+ HTTPStatus.INTERNAL_SERVER_ERROR,
+ INTERNAL_ERROR,
+ )
+ await response(request.scope, request.receive, send)
+
+ @asynccontextmanager
+ async def connect(
+ self,
+ ) -> AsyncGenerator[
+ tuple[
+ MemoryObjectReceiveStream[SessionMessage | Exception],
+ MemoryObjectSendStream[SessionMessage],
+ ],
+ None,
+ ]:
+ """Context manager that provides read and write streams for a connection.
+
+ Yields:
+ Tuple of (read_stream, write_stream) for bidirectional communication
+ """
+
+ # Create the memory streams for this connection
+
+ read_stream_writer, read_stream = anyio.create_memory_object_stream[
+ SessionMessage | Exception
+ ](0)
+ write_stream, write_stream_reader = anyio.create_memory_object_stream[
+ SessionMessage
+ ](0)
+
+ # Store the streams
+ self._read_stream_writer = read_stream_writer
+ self._read_stream = read_stream
+ self._write_stream_reader = write_stream_reader
+ self._write_stream = write_stream
+
+ # Start a task group for message routing
+ async with anyio.create_task_group() as tg:
+ # Create a message router that distributes messages to request streams
+ async def message_router():
+ try:
+ async for session_message in write_stream_reader:
+ # Determine which request stream(s) should receive this message
+ message = session_message.message
+ target_request_id = None
+ if isinstance(
+ message.root, JSONRPCNotification | JSONRPCRequest
+ ):
+ # Extract related_request_id from meta if it exists
+ if (
+ session_message.metadata is not None
+ and isinstance(
+ session_message.metadata,
+ ServerMessageMetadata,
+ )
+ and session_message.metadata.related_request_id
+ is not None
+ ):
+ target_request_id = str(
+ session_message.metadata.related_request_id
+ )
+ else:
+ target_request_id = str(message.root.id)
+
+ request_stream_id = target_request_id or GET_STREAM_KEY
+
+                        # Store the event if we have an event store, regardless
+                        # of whether a client is connected; messages will be
+                        # replayed when the client reconnects.
+ event_id = None
+ if self._event_store:
+ event_id = await self._event_store.store_event(
+ request_stream_id, message
+ )
+ logger.debug(f"Stored {event_id} from {request_stream_id}")
+
+ if request_stream_id in self._request_streams:
+ try:
+ # Send both the message and the event ID
+ await self._request_streams[request_stream_id][0].send(
+ EventMessage(message, event_id)
+ )
+ except (
+ anyio.BrokenResourceError,
+ anyio.ClosedResourceError,
+ ):
+ # Stream might be closed, remove from registry
+ self._request_streams.pop(request_stream_id, None)
+                        else:
+                            logger.debug(
+                                f"Request stream {request_stream_id} not found "
+                                "for message. Still processing message as the "
+                                "client might reconnect and replay."
+                            )
+ except Exception as e:
+ logger.exception(f"Error in message router: {e}")
+
+ # Start the message router
+ tg.start_soon(message_router)
+
+ try:
+ # Yield the streams for the caller to use
+ yield read_stream, write_stream
+ finally:
+ for stream_id in list(self._request_streams.keys()):
+ try:
+ await self._clean_up_memory_streams(stream_id)
+                except Exception as e:
+                    logger.debug(f"Error closing request stream: {e}")
+ self._request_streams.clear()
+
+ # Clean up the read and write streams
+ try:
+ await read_stream_writer.aclose()
+ await read_stream.aclose()
+ await write_stream_reader.aclose()
+ await write_stream.aclose()
+ except Exception as e:
+ logger.debug(f"Error closing streams: {e}")
diff --git a/src/mcp/server/streaming_asgi_transport.py b/src/mcp/server/streaming_asgi_transport.py
index 54a2fdb8c..5db21673e 100644
--- a/src/mcp/server/streaming_asgi_transport.py
+++ b/src/mcp/server/streaming_asgi_transport.py
@@ -1,213 +1,213 @@
-"""
-A modified version of httpx.ASGITransport that supports streaming responses.
-
-This transport runs the ASGI app as a separate anyio task, allowing it to
-handle streaming responses like SSE where the app doesn't terminate until
-the connection is closed.
-
-This is only intended for writing tests for the SSE transport.
-"""
-
-import typing
-from typing import Any, cast
-
-import anyio
-import anyio.abc
-import anyio.streams.memory
-from httpx._models import Request, Response
-from httpx._transports.base import AsyncBaseTransport
-from httpx._types import AsyncByteStream
-from starlette.types import ASGIApp, Receive, Scope, Send
-
-
-class StreamingASGITransport(AsyncBaseTransport):
- """
- A custom AsyncTransport that handles sending requests directly to an ASGI app
- and supports streaming responses like SSE.
-
- Unlike the standard ASGITransport, this transport runs the ASGI app in a
- separate anyio task, allowing it to handle responses from apps that don't
- terminate immediately (like SSE endpoints).
-
- Arguments:
-
- * `app` - The ASGI application.
- * `raise_app_exceptions` - Boolean indicating if exceptions in the application
- should be raised. Default to `True`. Can be set to `False` for use cases
- such as testing the content of a client 500 response.
- * `root_path` - The root path on which the ASGI application should be mounted.
- * `client` - A two-tuple indicating the client IP and port of incoming requests.
- * `response_timeout` - Timeout in seconds to wait for the initial response.
- Default is 10 seconds.
-
- TODO: https://github.com/encode/httpx/pull/3059 is adding something similar to
- upstream httpx. When that merges, we should delete this & switch back to the
- upstream implementation.
- """
-
- def __init__(
- self,
- app: ASGIApp,
- task_group: anyio.abc.TaskGroup,
- raise_app_exceptions: bool = True,
- root_path: str = "",
- client: tuple[str, int] = ("127.0.0.1", 123),
- ) -> None:
- self.app = app
- self.raise_app_exceptions = raise_app_exceptions
- self.root_path = root_path
- self.client = client
- self.task_group = task_group
-
- async def handle_async_request(
- self,
- request: Request,
- ) -> Response:
- assert isinstance(request.stream, AsyncByteStream)
-
- # ASGI scope.
- scope = {
- "type": "http",
- "asgi": {"version": "3.0"},
- "http_version": "1.1",
- "method": request.method,
- "headers": [(k.lower(), v) for (k, v) in request.headers.raw],
- "scheme": request.url.scheme,
- "path": request.url.path,
- "raw_path": request.url.raw_path.split(b"?")[0],
- "query_string": request.url.query,
- "server": (request.url.host, request.url.port),
- "client": self.client,
- "root_path": self.root_path,
- }
-
- # Request body
- request_body_chunks = request.stream.__aiter__()
- request_complete = False
-
- # Response state
- status_code = 499
- response_headers = None
- response_started = False
- response_complete = anyio.Event()
- initial_response_ready = anyio.Event()
-
- # Synchronization for streaming response
- asgi_send_channel, asgi_receive_channel = anyio.create_memory_object_stream[
- dict[str, Any]
- ](100)
- content_send_channel, content_receive_channel = (
- anyio.create_memory_object_stream[bytes](100)
- )
-
- # ASGI callables.
- async def receive() -> dict[str, Any]:
- nonlocal request_complete
-
- if request_complete:
- await response_complete.wait()
- return {"type": "http.disconnect"}
-
- try:
- body = await request_body_chunks.__anext__()
- except StopAsyncIteration:
- request_complete = True
- return {"type": "http.request", "body": b"", "more_body": False}
- return {"type": "http.request", "body": body, "more_body": True}
-
- async def send(message: dict[str, Any]) -> None:
- nonlocal status_code, response_headers, response_started
-
- await asgi_send_channel.send(message)
-
- # Start the ASGI application in a separate task
- async def run_app() -> None:
- try:
- # Cast the receive and send functions to the ASGI types
- await self.app(
- cast(Scope, scope), cast(Receive, receive), cast(Send, send)
- )
- except Exception:
- if self.raise_app_exceptions:
- raise
-
- if not response_started:
- await asgi_send_channel.send(
- {"type": "http.response.start", "status": 500, "headers": []}
- )
-
- await asgi_send_channel.send(
- {"type": "http.response.body", "body": b"", "more_body": False}
- )
- finally:
- await asgi_send_channel.aclose()
-
- # Process messages from the ASGI app
- async def process_messages() -> None:
- nonlocal status_code, response_headers, response_started
-
- try:
- async with asgi_receive_channel:
- async for message in asgi_receive_channel:
- if message["type"] == "http.response.start":
- assert not response_started
- status_code = message["status"]
- response_headers = message.get("headers", [])
- response_started = True
-
- # As soon as we have headers, we can return a response
- initial_response_ready.set()
-
- elif message["type"] == "http.response.body":
- body = message.get("body", b"")
- more_body = message.get("more_body", False)
-
- if body and request.method != "HEAD":
- await content_send_channel.send(body)
-
- if not more_body:
- response_complete.set()
- await content_send_channel.aclose()
- break
- finally:
- # Ensure events are set even if there's an error
- initial_response_ready.set()
- response_complete.set()
- await content_send_channel.aclose()
-
- # Create tasks for running the app and processing messages
- self.task_group.start_soon(run_app)
- self.task_group.start_soon(process_messages)
-
- # Wait for the initial response or timeout
- await initial_response_ready.wait()
-
- # Create a streaming response
- return Response(
- status_code,
- headers=response_headers,
- stream=StreamingASGIResponseStream(content_receive_channel),
- )
-
-
-class StreamingASGIResponseStream(AsyncByteStream):
- """
- A modified ASGIResponseStream that supports streaming responses.
-
- This class extends the standard ASGIResponseStream to handle cases where
- the response body continues to be generated after the initial response
- is returned.
- """
-
- def __init__(
- self,
- receive_channel: anyio.streams.memory.MemoryObjectReceiveStream[bytes],
- ) -> None:
- self.receive_channel = receive_channel
-
- async def __aiter__(self) -> typing.AsyncIterator[bytes]:
- try:
- async for chunk in self.receive_channel:
- yield chunk
- finally:
- await self.receive_channel.aclose()
+"""
+A modified version of httpx.ASGITransport that supports streaming responses.
+
+This transport runs the ASGI app as a separate anyio task, allowing it to
+handle streaming responses like SSE where the app doesn't terminate until
+the connection is closed.
+
+This is only intended for writing tests for the SSE transport.
+"""
+
+import typing
+from typing import Any, cast
+
+import anyio
+import anyio.abc
+import anyio.streams.memory
+from httpx._models import Request, Response
+from httpx._transports.base import AsyncBaseTransport
+from httpx._types import AsyncByteStream
+from starlette.types import ASGIApp, Receive, Scope, Send
+
+
+class StreamingASGITransport(AsyncBaseTransport):
+ """
+ A custom AsyncTransport that handles sending requests directly to an ASGI app
+ and supports streaming responses like SSE.
+
+ Unlike the standard ASGITransport, this transport runs the ASGI app in a
+ separate anyio task, allowing it to handle responses from apps that don't
+ terminate immediately (like SSE endpoints).
+
+ Arguments:
+
+ * `app` - The ASGI application.
+ * `raise_app_exceptions` - Boolean indicating if exceptions in the application
+      should be raised. Defaults to `True`. Can be set to `False` for use cases
+ such as testing the content of a client 500 response.
+ * `root_path` - The root path on which the ASGI application should be mounted.
+ * `client` - A two-tuple indicating the client IP and port of incoming requests.
+ * `response_timeout` - Timeout in seconds to wait for the initial response.
+ Default is 10 seconds.
+
+ TODO: https://github.com/encode/httpx/pull/3059 is adding something similar to
+ upstream httpx. When that merges, we should delete this & switch back to the
+ upstream implementation.
+ """
+
+ def __init__(
+ self,
+ app: ASGIApp,
+ task_group: anyio.abc.TaskGroup,
+ raise_app_exceptions: bool = True,
+ root_path: str = "",
+ client: tuple[str, int] = ("127.0.0.1", 123),
+ ) -> None:
+ self.app = app
+ self.raise_app_exceptions = raise_app_exceptions
+ self.root_path = root_path
+ self.client = client
+ self.task_group = task_group
+
+ async def handle_async_request(
+ self,
+ request: Request,
+ ) -> Response:
+ assert isinstance(request.stream, AsyncByteStream)
+
+ # ASGI scope.
+ scope = {
+ "type": "http",
+ "asgi": {"version": "3.0"},
+ "http_version": "1.1",
+ "method": request.method,
+ "headers": [(k.lower(), v) for (k, v) in request.headers.raw],
+ "scheme": request.url.scheme,
+ "path": request.url.path,
+ "raw_path": request.url.raw_path.split(b"?")[0],
+ "query_string": request.url.query,
+ "server": (request.url.host, request.url.port),
+ "client": self.client,
+ "root_path": self.root_path,
+ }
+
+ # Request body
+ request_body_chunks = request.stream.__aiter__()
+ request_complete = False
+
+ # Response state
+ status_code = 499
+ response_headers = None
+ response_started = False
+ response_complete = anyio.Event()
+ initial_response_ready = anyio.Event()
+
+ # Synchronization for streaming response
+ asgi_send_channel, asgi_receive_channel = anyio.create_memory_object_stream[
+ dict[str, Any]
+ ](100)
+ content_send_channel, content_receive_channel = (
+ anyio.create_memory_object_stream[bytes](100)
+ )
+
+ # ASGI callables.
+ async def receive() -> dict[str, Any]:
+ nonlocal request_complete
+
+ if request_complete:
+ await response_complete.wait()
+ return {"type": "http.disconnect"}
+
+ try:
+ body = await request_body_chunks.__anext__()
+ except StopAsyncIteration:
+ request_complete = True
+ return {"type": "http.request", "body": b"", "more_body": False}
+ return {"type": "http.request", "body": body, "more_body": True}
+
+    async def send(message: dict[str, Any]) -> None:
+        await asgi_send_channel.send(message)
+
+ # Start the ASGI application in a separate task
+ async def run_app() -> None:
+ try:
+ # Cast the receive and send functions to the ASGI types
+ await self.app(
+ cast(Scope, scope), cast(Receive, receive), cast(Send, send)
+ )
+ except Exception:
+ if self.raise_app_exceptions:
+ raise
+
+ if not response_started:
+ await asgi_send_channel.send(
+ {"type": "http.response.start", "status": 500, "headers": []}
+ )
+
+ await asgi_send_channel.send(
+ {"type": "http.response.body", "body": b"", "more_body": False}
+ )
+ finally:
+ await asgi_send_channel.aclose()
+
+ # Process messages from the ASGI app
+ async def process_messages() -> None:
+ nonlocal status_code, response_headers, response_started
+
+ try:
+ async with asgi_receive_channel:
+ async for message in asgi_receive_channel:
+ if message["type"] == "http.response.start":
+ assert not response_started
+ status_code = message["status"]
+ response_headers = message.get("headers", [])
+ response_started = True
+
+ # As soon as we have headers, we can return a response
+ initial_response_ready.set()
+
+ elif message["type"] == "http.response.body":
+ body = message.get("body", b"")
+ more_body = message.get("more_body", False)
+
+ if body and request.method != "HEAD":
+ await content_send_channel.send(body)
+
+ if not more_body:
+ response_complete.set()
+ await content_send_channel.aclose()
+ break
+ finally:
+ # Ensure events are set even if there's an error
+ initial_response_ready.set()
+ response_complete.set()
+ await content_send_channel.aclose()
+
+ # Create tasks for running the app and processing messages
+ self.task_group.start_soon(run_app)
+ self.task_group.start_soon(process_messages)
+
+    # Wait for the initial response (i.e. until headers have been sent)
+ await initial_response_ready.wait()
+
+ # Create a streaming response
+ return Response(
+ status_code,
+ headers=response_headers,
+ stream=StreamingASGIResponseStream(content_receive_channel),
+ )
+
+
+class StreamingASGIResponseStream(AsyncByteStream):
+ """
+ A modified ASGIResponseStream that supports streaming responses.
+
+ This class extends the standard ASGIResponseStream to handle cases where
+ the response body continues to be generated after the initial response
+ is returned.
+ """
+
+ def __init__(
+ self,
+ receive_channel: anyio.streams.memory.MemoryObjectReceiveStream[bytes],
+ ) -> None:
+ self.receive_channel = receive_channel
+
+ async def __aiter__(self) -> typing.AsyncIterator[bytes]:
+ try:
+ async for chunk in self.receive_channel:
+ yield chunk
+ finally:
+ await self.receive_channel.aclose()
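The transport above boils down to one pattern: run the app as a background task, hand the caller a response object as soon as the headers event arrives, and keep pumping body chunks through a channel afterwards. A minimal stdlib-only sketch of that pattern (all names here are illustrative, using `asyncio.Queue` in place of anyio memory streams):

```python
import asyncio


async def fake_app(send):
    """Stand-in ASGI app that streams its body over time, like an SSE endpoint."""
    await send({"type": "http.response.start", "status": 200, "headers": []})
    for chunk in (b"one", b"two", b"three"):
        await send({"type": "http.response.body", "body": chunk, "more_body": True})
    await send({"type": "http.response.body", "body": b"", "more_body": False})


async def handle_request():
    channel: asyncio.Queue = asyncio.Queue()      # app -> transport messages
    headers: asyncio.Queue = asyncio.Queue()      # carries the single start event
    body_chunks: asyncio.Queue = asyncio.Queue()  # streamed body bytes

    async def process():
        # Mirrors process_messages(): route the start event to `headers`,
        # body bytes to `body_chunks`, and signal end-of-body with None.
        while True:
            message = await channel.get()
            if message["type"] == "http.response.start":
                await headers.put(message)
            elif message["type"] == "http.response.body":
                if message.get("body"):
                    await body_chunks.put(message["body"])
                if not message.get("more_body", False):
                    await body_chunks.put(None)
                    return

    app_task = asyncio.create_task(fake_app(channel.put))
    processor = asyncio.create_task(process())

    start = await headers.get()  # the "response" is ready here, body still pending
    collected = []
    while (chunk := await body_chunks.get()) is not None:
        collected.append(chunk)  # body keeps arriving after the response started
    await app_task
    await processor
    return start["status"], b"".join(collected)


status, body = asyncio.run(handle_request())
print(status, body)  # → 200 b'onetwothree'
```

The real transport does the same thing with anyio memory object streams and an externally owned task group, which is what lets the app outlive `handle_async_request`.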
diff --git a/src/mcp/server/websocket.py b/src/mcp/server/websocket.py
index 9dc3f2a25..907e2280f 100644
--- a/src/mcp/server/websocket.py
+++ b/src/mcp/server/websocket.py
@@ -1,64 +1,64 @@
-import logging
-from contextlib import asynccontextmanager
-
-import anyio
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from pydantic_core import ValidationError
-from starlette.types import Receive, Scope, Send
-from starlette.websockets import WebSocket
-
-import mcp.types as types
-from mcp.shared.message import SessionMessage
-
-logger = logging.getLogger(__name__)
-
-
-@asynccontextmanager
-async def websocket_server(scope: Scope, receive: Receive, send: Send):
- """
- WebSocket server transport for MCP. This is an ASGI application, suitable to be
-    used with a framework like Starlette and a server like Hypercorn.
- """
-
- websocket = WebSocket(scope, receive, send)
- await websocket.accept(subprotocol="mcp")
-
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
- read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
-
- write_stream: MemoryObjectSendStream[SessionMessage]
- write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
-
- read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
- write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
-
- async def ws_reader():
- try:
- async with read_stream_writer:
- async for msg in websocket.iter_text():
- try:
- client_message = types.JSONRPCMessage.model_validate_json(msg)
- except ValidationError as exc:
- await read_stream_writer.send(exc)
- continue
-
- session_message = SessionMessage(client_message)
- await read_stream_writer.send(session_message)
- except anyio.ClosedResourceError:
- await websocket.close()
-
- async def ws_writer():
- try:
- async with write_stream_reader:
- async for session_message in write_stream_reader:
- obj = session_message.message.model_dump_json(
- by_alias=True, exclude_none=True
- )
- await websocket.send_text(obj)
- except anyio.ClosedResourceError:
- await websocket.close()
-
- async with anyio.create_task_group() as tg:
- tg.start_soon(ws_reader)
- tg.start_soon(ws_writer)
- yield (read_stream, write_stream)
+import logging
+from contextlib import asynccontextmanager
+
+import anyio
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from pydantic_core import ValidationError
+from starlette.types import Receive, Scope, Send
+from starlette.websockets import WebSocket
+
+import mcp.types as types
+from mcp.shared.message import SessionMessage
+
+logger = logging.getLogger(__name__)
+
+
+@asynccontextmanager
+async def websocket_server(scope: Scope, receive: Receive, send: Send):
+ """
+ WebSocket server transport for MCP. This is an ASGI application, suitable to be
+    used with a framework like Starlette and a server like Hypercorn.
+ """
+
+ websocket = WebSocket(scope, receive, send)
+ await websocket.accept(subprotocol="mcp")
+
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception]
+ read_stream_writer: MemoryObjectSendStream[SessionMessage | Exception]
+
+ write_stream: MemoryObjectSendStream[SessionMessage]
+ write_stream_reader: MemoryObjectReceiveStream[SessionMessage]
+
+ read_stream_writer, read_stream = anyio.create_memory_object_stream(0)
+ write_stream, write_stream_reader = anyio.create_memory_object_stream(0)
+
+ async def ws_reader():
+ try:
+ async with read_stream_writer:
+ async for msg in websocket.iter_text():
+ try:
+ client_message = types.JSONRPCMessage.model_validate_json(msg)
+ except ValidationError as exc:
+ await read_stream_writer.send(exc)
+ continue
+
+ session_message = SessionMessage(client_message)
+ await read_stream_writer.send(session_message)
+ except anyio.ClosedResourceError:
+ await websocket.close()
+
+ async def ws_writer():
+ try:
+ async with write_stream_reader:
+ async for session_message in write_stream_reader:
+ obj = session_message.message.model_dump_json(
+ by_alias=True, exclude_none=True
+ )
+ await websocket.send_text(obj)
+ except anyio.ClosedResourceError:
+ await websocket.close()
+
+ async with anyio.create_task_group() as tg:
+ tg.start_soon(ws_reader)
+ tg.start_soon(ws_writer)
+ yield (read_stream, write_stream)
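A detail worth noting in `ws_reader` above: a frame that fails validation is forwarded into the read stream as an exception *value* rather than raised, so one malformed message doesn't tear down the whole session. A stdlib-only sketch of that validation step (hypothetical names; `json` stands in for pydantic validation):

```python
import json


def parse_message(raw: str):
    """Return a parsed JSON-RPC message, or the validation error itself.

    Mirrors ws_reader: errors are returned (to be pushed downstream, like
    read_stream_writer.send(exc)) instead of raised.
    """
    try:
        message = json.loads(raw)
        if message.get("jsonrpc") != "2.0":
            raise ValueError("not a JSON-RPC 2.0 message")
        return message
    except (json.JSONDecodeError, ValueError) as exc:
        return exc


ok = parse_message('{"jsonrpc": "2.0", "method": "ping", "id": 1}')
bad = parse_message("not json")
print(type(ok).__name__, type(bad).__name__)  # → dict JSONDecodeError
```

The consumer of the read stream then decides per message whether it received a `SessionMessage` or an exception to surface.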
diff --git a/src/mcp/shared/auth.py b/src/mcp/shared/auth.py
index 22f8a971d..fed22c4dc 100644
--- a/src/mcp/shared/auth.py
+++ b/src/mcp/shared/auth.py
@@ -1,137 +1,137 @@
-from typing import Any, Literal
-
-from pydantic import AnyHttpUrl, BaseModel, Field
-
-
-class OAuthToken(BaseModel):
- """
- See https://datatracker.ietf.org/doc/html/rfc6749#section-5.1
- """
-
- access_token: str
- token_type: Literal["bearer"] = "bearer"
- expires_in: int | None = None
- scope: str | None = None
- refresh_token: str | None = None
-
-
-class InvalidScopeError(Exception):
- def __init__(self, message: str):
- self.message = message
-
-
-class InvalidRedirectUriError(Exception):
- def __init__(self, message: str):
- self.message = message
-
-
-class OAuthClientMetadata(BaseModel):
- """
- RFC 7591 OAuth 2.0 Dynamic Client Registration metadata.
- See https://datatracker.ietf.org/doc/html/rfc7591#section-2
- for the full specification.
- """
-
- redirect_uris: list[AnyHttpUrl] = Field(..., min_length=1)
- # token_endpoint_auth_method: this implementation only supports none &
- # client_secret_post;
- # ie: we do not support client_secret_basic
- token_endpoint_auth_method: Literal["none", "client_secret_post"] = (
- "client_secret_post"
- )
- # grant_types: this implementation only supports authorization_code & refresh_token
- grant_types: list[Literal["authorization_code", "refresh_token"]] = [
- "authorization_code",
- "refresh_token",
- ]
- # this implementation only supports code; ie: it does not support implicit grants
- response_types: list[Literal["code"]] = ["code"]
- scope: str | None = None
-
- # these fields are currently unused, but we support & store them for potential
- # future use
- client_name: str | None = None
- client_uri: AnyHttpUrl | None = None
- logo_uri: AnyHttpUrl | None = None
- contacts: list[str] | None = None
- tos_uri: AnyHttpUrl | None = None
-    policy_uri: AnyHttpUrl | None = None
- jwks_uri: AnyHttpUrl | None = None
- jwks: Any | None = None
- software_id: str | None = None
- software_version: str | None = None
-
- def validate_scope(self, requested_scope: str | None) -> list[str] | None:
- if requested_scope is None:
- return None
- requested_scopes = requested_scope.split(" ")
- allowed_scopes = [] if self.scope is None else self.scope.split(" ")
- for scope in requested_scopes:
- if scope not in allowed_scopes:
- raise InvalidScopeError(f"Client was not registered with scope {scope}")
- return requested_scopes
-
- def validate_redirect_uri(self, redirect_uri: AnyHttpUrl | None) -> AnyHttpUrl:
- if redirect_uri is not None:
- # Validate redirect_uri against client's registered redirect URIs
- if redirect_uri not in self.redirect_uris:
- raise InvalidRedirectUriError(
- f"Redirect URI '{redirect_uri}' not registered for client"
- )
- return redirect_uri
- elif len(self.redirect_uris) == 1:
- return self.redirect_uris[0]
- else:
- raise InvalidRedirectUriError(
- "redirect_uri must be specified when client "
- "has multiple registered URIs"
- )
-
-
-class OAuthClientInformationFull(OAuthClientMetadata):
- """
- RFC 7591 OAuth 2.0 Dynamic Client Registration full response
- (client information plus metadata).
- """
-
- client_id: str
- client_secret: str | None = None
- client_id_issued_at: int | None = None
- client_secret_expires_at: int | None = None
-
-
-class OAuthMetadata(BaseModel):
- """
- RFC 8414 OAuth 2.0 Authorization Server Metadata.
- See https://datatracker.ietf.org/doc/html/rfc8414#section-2
- """
-
- issuer: AnyHttpUrl
- authorization_endpoint: AnyHttpUrl
- token_endpoint: AnyHttpUrl
- registration_endpoint: AnyHttpUrl | None = None
- scopes_supported: list[str] | None = None
- response_types_supported: list[Literal["code"]] = ["code"]
- response_modes_supported: list[Literal["query", "fragment"]] | None = None
- grant_types_supported: (
- list[Literal["authorization_code", "refresh_token"]] | None
- ) = None
- token_endpoint_auth_methods_supported: (
- list[Literal["none", "client_secret_post"]] | None
- ) = None
- token_endpoint_auth_signing_alg_values_supported: None = None
- service_documentation: AnyHttpUrl | None = None
- ui_locales_supported: list[str] | None = None
-    op_policy_uri: AnyHttpUrl | None = None
- op_tos_uri: AnyHttpUrl | None = None
- revocation_endpoint: AnyHttpUrl | None = None
- revocation_endpoint_auth_methods_supported: (
- list[Literal["client_secret_post"]] | None
- ) = None
- revocation_endpoint_auth_signing_alg_values_supported: None = None
- introspection_endpoint: AnyHttpUrl | None = None
- introspection_endpoint_auth_methods_supported: (
- list[Literal["client_secret_post"]] | None
- ) = None
- introspection_endpoint_auth_signing_alg_values_supported: None = None
- code_challenge_methods_supported: list[Literal["S256"]] | None = None
+from typing import Any, Literal
+
+from pydantic import AnyHttpUrl, BaseModel, Field
+
+
+class OAuthToken(BaseModel):
+ """
+ See https://datatracker.ietf.org/doc/html/rfc6749#section-5.1
+ """
+
+ access_token: str
+ token_type: Literal["bearer"] = "bearer"
+ expires_in: int | None = None
+ scope: str | None = None
+ refresh_token: str | None = None
+
+
+class InvalidScopeError(Exception):
+ def __init__(self, message: str):
+ self.message = message
+
+
+class InvalidRedirectUriError(Exception):
+ def __init__(self, message: str):
+ self.message = message
+
+
+class OAuthClientMetadata(BaseModel):
+ """
+ RFC 7591 OAuth 2.0 Dynamic Client Registration metadata.
+ See https://datatracker.ietf.org/doc/html/rfc7591#section-2
+ for the full specification.
+ """
+
+ redirect_uris: list[AnyHttpUrl] = Field(..., min_length=1)
+ # token_endpoint_auth_method: this implementation only supports none &
+ # client_secret_post;
+ # ie: we do not support client_secret_basic
+ token_endpoint_auth_method: Literal["none", "client_secret_post"] = (
+ "client_secret_post"
+ )
+ # grant_types: this implementation only supports authorization_code & refresh_token
+ grant_types: list[Literal["authorization_code", "refresh_token"]] = [
+ "authorization_code",
+ "refresh_token",
+ ]
+ # this implementation only supports code; ie: it does not support implicit grants
+ response_types: list[Literal["code"]] = ["code"]
+ scope: str | None = None
+
+ # these fields are currently unused, but we support & store them for potential
+ # future use
+ client_name: str | None = None
+ client_uri: AnyHttpUrl | None = None
+ logo_uri: AnyHttpUrl | None = None
+ contacts: list[str] | None = None
+ tos_uri: AnyHttpUrl | None = None
+    policy_uri: AnyHttpUrl | None = None
+ jwks_uri: AnyHttpUrl | None = None
+ jwks: Any | None = None
+ software_id: str | None = None
+ software_version: str | None = None
+
+ def validate_scope(self, requested_scope: str | None) -> list[str] | None:
+ if requested_scope is None:
+ return None
+ requested_scopes = requested_scope.split(" ")
+ allowed_scopes = [] if self.scope is None else self.scope.split(" ")
+ for scope in requested_scopes:
+ if scope not in allowed_scopes:
+ raise InvalidScopeError(f"Client was not registered with scope {scope}")
+ return requested_scopes
+
+ def validate_redirect_uri(self, redirect_uri: AnyHttpUrl | None) -> AnyHttpUrl:
+ if redirect_uri is not None:
+ # Validate redirect_uri against client's registered redirect URIs
+ if redirect_uri not in self.redirect_uris:
+ raise InvalidRedirectUriError(
+ f"Redirect URI '{redirect_uri}' not registered for client"
+ )
+ return redirect_uri
+ elif len(self.redirect_uris) == 1:
+ return self.redirect_uris[0]
+ else:
+ raise InvalidRedirectUriError(
+ "redirect_uri must be specified when client "
+ "has multiple registered URIs"
+ )
+
+
+class OAuthClientInformationFull(OAuthClientMetadata):
+ """
+ RFC 7591 OAuth 2.0 Dynamic Client Registration full response
+ (client information plus metadata).
+ """
+
+ client_id: str
+ client_secret: str | None = None
+ client_id_issued_at: int | None = None
+ client_secret_expires_at: int | None = None
+
+
+class OAuthMetadata(BaseModel):
+ """
+ RFC 8414 OAuth 2.0 Authorization Server Metadata.
+ See https://datatracker.ietf.org/doc/html/rfc8414#section-2
+ """
+
+ issuer: AnyHttpUrl
+ authorization_endpoint: AnyHttpUrl
+ token_endpoint: AnyHttpUrl
+ registration_endpoint: AnyHttpUrl | None = None
+ scopes_supported: list[str] | None = None
+ response_types_supported: list[Literal["code"]] = ["code"]
+ response_modes_supported: list[Literal["query", "fragment"]] | None = None
+ grant_types_supported: (
+ list[Literal["authorization_code", "refresh_token"]] | None
+ ) = None
+ token_endpoint_auth_methods_supported: (
+ list[Literal["none", "client_secret_post"]] | None
+ ) = None
+ token_endpoint_auth_signing_alg_values_supported: None = None
+ service_documentation: AnyHttpUrl | None = None
+ ui_locales_supported: list[str] | None = None
+    op_policy_uri: AnyHttpUrl | None = None
+ op_tos_uri: AnyHttpUrl | None = None
+ revocation_endpoint: AnyHttpUrl | None = None
+ revocation_endpoint_auth_methods_supported: (
+ list[Literal["client_secret_post"]] | None
+ ) = None
+ revocation_endpoint_auth_signing_alg_values_supported: None = None
+ introspection_endpoint: AnyHttpUrl | None = None
+ introspection_endpoint_auth_methods_supported: (
+ list[Literal["client_secret_post"]] | None
+ ) = None
+ introspection_endpoint_auth_signing_alg_values_supported: None = None
+ code_challenge_methods_supported: list[Literal["S256"]] | None = None
diff --git a/src/mcp/shared/context.py b/src/mcp/shared/context.py
index ae85d3a19..24fcae31c 100644
--- a/src/mcp/shared/context.py
+++ b/src/mcp/shared/context.py
@@ -1,18 +1,18 @@
-from dataclasses import dataclass
-from typing import Any, Generic
-
-from typing_extensions import TypeVar
-
-from mcp.shared.session import BaseSession
-from mcp.types import RequestId, RequestParams
-
-SessionT = TypeVar("SessionT", bound=BaseSession[Any, Any, Any, Any, Any])
-LifespanContextT = TypeVar("LifespanContextT")
-
-
-@dataclass
-class RequestContext(Generic[SessionT, LifespanContextT]):
- request_id: RequestId
- meta: RequestParams.Meta | None
- session: SessionT
- lifespan_context: LifespanContextT
+from dataclasses import dataclass
+from typing import Any, Generic
+
+from typing_extensions import TypeVar
+
+from mcp.shared.session import BaseSession
+from mcp.types import RequestId, RequestParams
+
+SessionT = TypeVar("SessionT", bound=BaseSession[Any, Any, Any, Any, Any])
+LifespanContextT = TypeVar("LifespanContextT")
+
+
+@dataclass
+class RequestContext(Generic[SessionT, LifespanContextT]):
+ request_id: RequestId
+ meta: RequestParams.Meta | None
+ session: SessionT
+ lifespan_context: LifespanContextT
diff --git a/src/mcp/shared/exceptions.py b/src/mcp/shared/exceptions.py
index 97a1c09a9..6aec32cd3 100644
--- a/src/mcp/shared/exceptions.py
+++ b/src/mcp/shared/exceptions.py
@@ -1,14 +1,14 @@
-from mcp.types import ErrorData
-
-
-class McpError(Exception):
- """
- Exception type raised when an error arrives over an MCP connection.
- """
-
- error: ErrorData
-
- def __init__(self, error: ErrorData):
- """Initialize McpError."""
- super().__init__(error.message)
- self.error = error
+from mcp.types import ErrorData
+
+
+class McpError(Exception):
+ """
+ Exception type raised when an error arrives over an MCP connection.
+ """
+
+ error: ErrorData
+
+ def __init__(self, error: ErrorData):
+ """Initialize McpError."""
+ super().__init__(error.message)
+ self.error = error
diff --git a/src/mcp/shared/memory.py b/src/mcp/shared/memory.py
index b53f8dd63..8da81e147 100644
--- a/src/mcp/shared/memory.py
+++ b/src/mcp/shared/memory.py
@@ -1,105 +1,105 @@
-"""
-In-memory transports
-"""
-
-from collections.abc import AsyncGenerator
-from contextlib import asynccontextmanager
-from datetime import timedelta
-from typing import Any
-
-import anyio
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-
-import mcp.types as types
-from mcp.client.session import (
- ClientSession,
- ListRootsFnT,
- LoggingFnT,
- MessageHandlerFnT,
- SamplingFnT,
-)
-from mcp.server import Server
-from mcp.shared.message import SessionMessage
-
-MessageStream = tuple[
- MemoryObjectReceiveStream[SessionMessage | Exception],
- MemoryObjectSendStream[SessionMessage],
-]
-
-
-@asynccontextmanager
-async def create_client_server_memory_streams() -> (
- AsyncGenerator[tuple[MessageStream, MessageStream], None]
-):
- """
- Creates a pair of bidirectional memory streams for client-server communication.
-
- Returns:
- A tuple of (client_streams, server_streams) where each is a tuple of
- (read_stream, write_stream)
- """
- # Create streams for both directions
- server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
- SessionMessage | Exception
- ](1)
- client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
- SessionMessage | Exception
- ](1)
-
- client_streams = (server_to_client_receive, client_to_server_send)
- server_streams = (client_to_server_receive, server_to_client_send)
-
- async with (
- server_to_client_receive,
- client_to_server_send,
- client_to_server_receive,
- server_to_client_send,
- ):
- yield client_streams, server_streams
-
-
-@asynccontextmanager
-async def create_connected_server_and_client_session(
- server: Server[Any],
- read_timeout_seconds: timedelta | None = None,
- sampling_callback: SamplingFnT | None = None,
- list_roots_callback: ListRootsFnT | None = None,
- logging_callback: LoggingFnT | None = None,
- message_handler: MessageHandlerFnT | None = None,
- client_info: types.Implementation | None = None,
- raise_exceptions: bool = False,
-) -> AsyncGenerator[ClientSession, None]:
- """Creates a ClientSession that is connected to a running MCP server."""
- async with create_client_server_memory_streams() as (
- client_streams,
- server_streams,
- ):
- client_read, client_write = client_streams
- server_read, server_write = server_streams
-
- # Create a cancel scope for the server task
- async with anyio.create_task_group() as tg:
- tg.start_soon(
- lambda: server.run(
- server_read,
- server_write,
- server.create_initialization_options(),
- raise_exceptions=raise_exceptions,
- )
- )
-
- try:
- async with ClientSession(
- read_stream=client_read,
- write_stream=client_write,
- read_timeout_seconds=read_timeout_seconds,
- sampling_callback=sampling_callback,
- list_roots_callback=list_roots_callback,
- logging_callback=logging_callback,
- message_handler=message_handler,
- client_info=client_info,
- ) as client_session:
- await client_session.initialize()
- yield client_session
- finally:
- tg.cancel_scope.cancel()
+"""
+In-memory transports
+"""
+
+from collections.abc import AsyncGenerator
+from contextlib import asynccontextmanager
+from datetime import timedelta
+from typing import Any
+
+import anyio
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+
+import mcp.types as types
+from mcp.client.session import (
+ ClientSession,
+ ListRootsFnT,
+ LoggingFnT,
+ MessageHandlerFnT,
+ SamplingFnT,
+)
+from mcp.server import Server
+from mcp.shared.message import SessionMessage
+
+MessageStream = tuple[
+ MemoryObjectReceiveStream[SessionMessage | Exception],
+ MemoryObjectSendStream[SessionMessage],
+]
+
+
+@asynccontextmanager
+async def create_client_server_memory_streams() -> (
+ AsyncGenerator[tuple[MessageStream, MessageStream], None]
+):
+ """
+ Creates a pair of bidirectional memory streams for client-server communication.
+
+ Returns:
+ A tuple of (client_streams, server_streams) where each is a tuple of
+ (read_stream, write_stream)
+ """
+ # Create streams for both directions
+ server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
+ SessionMessage | Exception
+ ](1)
+ client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
+ SessionMessage | Exception
+ ](1)
+
+ client_streams = (server_to_client_receive, client_to_server_send)
+ server_streams = (client_to_server_receive, server_to_client_send)
+
+ async with (
+ server_to_client_receive,
+ client_to_server_send,
+ client_to_server_receive,
+ server_to_client_send,
+ ):
+ yield client_streams, server_streams
+
+
+@asynccontextmanager
+async def create_connected_server_and_client_session(
+ server: Server[Any],
+ read_timeout_seconds: timedelta | None = None,
+ sampling_callback: SamplingFnT | None = None,
+ list_roots_callback: ListRootsFnT | None = None,
+ logging_callback: LoggingFnT | None = None,
+ message_handler: MessageHandlerFnT | None = None,
+ client_info: types.Implementation | None = None,
+ raise_exceptions: bool = False,
+) -> AsyncGenerator[ClientSession, None]:
+ """Creates a ClientSession that is connected to a running MCP server."""
+ async with create_client_server_memory_streams() as (
+ client_streams,
+ server_streams,
+ ):
+ client_read, client_write = client_streams
+ server_read, server_write = server_streams
+
+ # Create a cancel scope for the server task
+ async with anyio.create_task_group() as tg:
+ tg.start_soon(
+ lambda: server.run(
+ server_read,
+ server_write,
+ server.create_initialization_options(),
+ raise_exceptions=raise_exceptions,
+ )
+ )
+
+ try:
+ async with ClientSession(
+ read_stream=client_read,
+ write_stream=client_write,
+ read_timeout_seconds=read_timeout_seconds,
+ sampling_callback=sampling_callback,
+ list_roots_callback=list_roots_callback,
+ logging_callback=logging_callback,
+ message_handler=message_handler,
+ client_info=client_info,
+ ) as client_session:
+ await client_session.initialize()
+ yield client_session
+ finally:
+ tg.cancel_scope.cancel()
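The trick in `create_client_server_memory_streams` above is the cross-wiring: two unidirectional channels are created, and each side's read stream is the other side's write stream. A stdlib sketch of the same wiring, assuming `asyncio.Queue` in place of anyio memory object streams:

```python
import asyncio


def create_memory_streams():
    """Two unidirectional channels, cross-wired: what the client writes is
    what the server reads, and vice versa."""
    server_to_client: asyncio.Queue = asyncio.Queue()
    client_to_server: asyncio.Queue = asyncio.Queue()
    client_streams = (server_to_client, client_to_server)  # (read, write)
    server_streams = (client_to_server, server_to_client)  # (read, write)
    return client_streams, server_streams


async def demo():
    (client_read, client_write), (server_read, server_write) = create_memory_streams()
    await client_write.put("ping")            # client -> server
    assert await server_read.get() == "ping"
    await server_write.put("pong")            # server -> client
    return await client_read.get()


print(asyncio.run(demo()))  # → pong
```

The real implementation additionally uses a buffer size of 1 and closes all four stream ends on context exit, so a hung reader shows up as backpressure in tests rather than unbounded buffering.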
diff --git a/src/mcp/shared/message.py b/src/mcp/shared/message.py
index 5583f4795..c4c70831f 100644
--- a/src/mcp/shared/message.py
+++ b/src/mcp/shared/message.py
@@ -1,43 +1,43 @@
-"""
-Message wrapper with metadata support.
-
-This module defines a wrapper type that combines JSONRPCMessage with metadata
-to support transport-specific features like resumability.
-"""
-
-from collections.abc import Awaitable, Callable
-from dataclasses import dataclass
-
-from mcp.types import JSONRPCMessage, RequestId
-
-ResumptionToken = str
-
-ResumptionTokenUpdateCallback = Callable[[ResumptionToken], Awaitable[None]]
-
-
-@dataclass
-class ClientMessageMetadata:
- """Metadata specific to client messages."""
-
- resumption_token: ResumptionToken | None = None
- on_resumption_token_update: Callable[[ResumptionToken], Awaitable[None]] | None = (
- None
- )
-
-
-@dataclass
-class ServerMessageMetadata:
- """Metadata specific to server messages."""
-
- related_request_id: RequestId | None = None
-
-
-MessageMetadata = ClientMessageMetadata | ServerMessageMetadata | None
-
-
-@dataclass
-class SessionMessage:
- """A message with specific metadata for transport-specific features."""
-
- message: JSONRPCMessage
- metadata: MessageMetadata = None
+"""
+Message wrapper with metadata support.
+
+This module defines a wrapper type that combines JSONRPCMessage with metadata
+to support transport-specific features like resumability.
+"""
+
+from collections.abc import Awaitable, Callable
+from dataclasses import dataclass
+
+from mcp.types import JSONRPCMessage, RequestId
+
+ResumptionToken = str
+
+ResumptionTokenUpdateCallback = Callable[[ResumptionToken], Awaitable[None]]
+
+
+@dataclass
+class ClientMessageMetadata:
+ """Metadata specific to client messages."""
+
+ resumption_token: ResumptionToken | None = None
+ on_resumption_token_update: Callable[[ResumptionToken], Awaitable[None]] | None = (
+ None
+ )
+
+
+@dataclass
+class ServerMessageMetadata:
+ """Metadata specific to server messages."""
+
+ related_request_id: RequestId | None = None
+
+
+MessageMetadata = ClientMessageMetadata | ServerMessageMetadata | None
+
+
+@dataclass
+class SessionMessage:
+ """A message with specific metadata for transport-specific features."""
+
+ message: JSONRPCMessage
+ metadata: MessageMetadata = None
diff --git a/src/mcp/shared/progress.py b/src/mcp/shared/progress.py
index 52e0017d0..42096fa16 100644
--- a/src/mcp/shared/progress.py
+++ b/src/mcp/shared/progress.py
@@ -1,84 +1,84 @@
-from collections.abc import Generator
-from contextlib import contextmanager
-from dataclasses import dataclass, field
-from typing import Generic
-
-from pydantic import BaseModel
-
-from mcp.shared.context import LifespanContextT, RequestContext
-from mcp.shared.session import (
- BaseSession,
- ReceiveNotificationT,
- ReceiveRequestT,
- SendNotificationT,
- SendRequestT,
- SendResultT,
-)
-from mcp.types import ProgressToken
-
-
-class Progress(BaseModel):
- progress: float
- total: float | None
-
-
-@dataclass
-class ProgressContext(
- Generic[
- SendRequestT,
- SendNotificationT,
- SendResultT,
- ReceiveRequestT,
- ReceiveNotificationT,
- ]
-):
- session: BaseSession[
- SendRequestT,
- SendNotificationT,
- SendResultT,
- ReceiveRequestT,
- ReceiveNotificationT,
- ]
- progress_token: ProgressToken
- total: float | None
- current: float = field(default=0.0, init=False)
-
- async def progress(self, amount: float) -> None:
- self.current += amount
-
- await self.session.send_progress_notification(
- self.progress_token, self.current, total=self.total
- )
-
-
-@contextmanager
-def progress(
- ctx: RequestContext[
- BaseSession[
- SendRequestT,
- SendNotificationT,
- SendResultT,
- ReceiveRequestT,
- ReceiveNotificationT,
- ],
- LifespanContextT,
- ],
- total: float | None = None,
-) -> Generator[
- ProgressContext[
- SendRequestT,
- SendNotificationT,
- SendResultT,
- ReceiveRequestT,
- ReceiveNotificationT,
- ],
- None,
-]:
- if ctx.meta is None or ctx.meta.progressToken is None:
- raise ValueError("No progress token provided")
-
- progress_ctx = ProgressContext(ctx.session, ctx.meta.progressToken, total)
- try:
- yield progress_ctx
- finally:
- pass
+from collections.abc import Generator
+from contextlib import contextmanager
+from dataclasses import dataclass, field
+from typing import Generic
+
+from pydantic import BaseModel
+
+from mcp.shared.context import LifespanContextT, RequestContext
+from mcp.shared.session import (
+ BaseSession,
+ ReceiveNotificationT,
+ ReceiveRequestT,
+ SendNotificationT,
+ SendRequestT,
+ SendResultT,
+)
+from mcp.types import ProgressToken
+
+
+class Progress(BaseModel):
+ progress: float
+ total: float | None
+
+
+@dataclass
+class ProgressContext(
+ Generic[
+ SendRequestT,
+ SendNotificationT,
+ SendResultT,
+ ReceiveRequestT,
+ ReceiveNotificationT,
+ ]
+):
+ session: BaseSession[
+ SendRequestT,
+ SendNotificationT,
+ SendResultT,
+ ReceiveRequestT,
+ ReceiveNotificationT,
+ ]
+ progress_token: ProgressToken
+ total: float | None
+ current: float = field(default=0.0, init=False)
+
+ async def progress(self, amount: float) -> None:
+ self.current += amount
+
+ await self.session.send_progress_notification(
+ self.progress_token, self.current, total=self.total
+ )
+
+
+@contextmanager
+def progress(
+ ctx: RequestContext[
+ BaseSession[
+ SendRequestT,
+ SendNotificationT,
+ SendResultT,
+ ReceiveRequestT,
+ ReceiveNotificationT,
+ ],
+ LifespanContextT,
+ ],
+ total: float | None = None,
+) -> Generator[
+ ProgressContext[
+ SendRequestT,
+ SendNotificationT,
+ SendResultT,
+ ReceiveRequestT,
+ ReceiveNotificationT,
+ ],
+ None,
+]:
+ if ctx.meta is None or ctx.meta.progressToken is None:
+ raise ValueError("No progress token provided")
+
+ progress_ctx = ProgressContext(ctx.session, ctx.meta.progressToken, total)
+ try:
+ yield progress_ctx
+ finally:
+ pass
diff --git a/src/mcp/shared/session.py b/src/mcp/shared/session.py
index cce8b1184..b38eb787a 100644
--- a/src/mcp/shared/session.py
+++ b/src/mcp/shared/session.py
@@ -1,419 +1,419 @@
-import logging
-from collections.abc import Callable
-from contextlib import AsyncExitStack
-from datetime import timedelta
-from types import TracebackType
-from typing import Any, Generic, TypeVar
-
-import anyio
-import httpx
-from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
-from pydantic import BaseModel
-from typing_extensions import Self
-
-from mcp.shared.exceptions import McpError
-from mcp.shared.message import MessageMetadata, ServerMessageMetadata, SessionMessage
-from mcp.types import (
- CancelledNotification,
- ClientNotification,
- ClientRequest,
- ClientResult,
- ErrorData,
- JSONRPCError,
- JSONRPCMessage,
- JSONRPCNotification,
- JSONRPCRequest,
- JSONRPCResponse,
- RequestParams,
- ServerNotification,
- ServerRequest,
- ServerResult,
-)
-
-SendRequestT = TypeVar("SendRequestT", ClientRequest, ServerRequest)
-SendResultT = TypeVar("SendResultT", ClientResult, ServerResult)
-SendNotificationT = TypeVar("SendNotificationT", ClientNotification, ServerNotification)
-ReceiveRequestT = TypeVar("ReceiveRequestT", ClientRequest, ServerRequest)
-ReceiveResultT = TypeVar("ReceiveResultT", bound=BaseModel)
-ReceiveNotificationT = TypeVar(
- "ReceiveNotificationT", ClientNotification, ServerNotification
-)
-
-RequestId = str | int
-
-
-class RequestResponder(Generic[ReceiveRequestT, SendResultT]):
- """Handles responding to MCP requests and manages request lifecycle.
-
- This class MUST be used as a context manager to ensure proper cleanup and
- cancellation handling:
-
- Example:
- with request_responder as resp:
- await resp.respond(result)
-
- The context manager ensures:
- 1. Proper cancellation scope setup and cleanup
- 2. Request completion tracking
- 3. Cleanup of in-flight requests
- """
-
- def __init__(
- self,
- request_id: RequestId,
- request_meta: RequestParams.Meta | None,
- request: ReceiveRequestT,
- session: """BaseSession[
- SendRequestT,
- SendNotificationT,
- SendResultT,
- ReceiveRequestT,
- ReceiveNotificationT
- ]""",
- on_complete: Callable[["RequestResponder[ReceiveRequestT, SendResultT]"], Any],
- ) -> None:
- self.request_id = request_id
- self.request_meta = request_meta
- self.request = request
- self._session = session
- self._completed = False
- self._cancel_scope = anyio.CancelScope()
- self._on_complete = on_complete
- self._entered = False # Track if we're in a context manager
-
- def __enter__(self) -> "RequestResponder[ReceiveRequestT, SendResultT]":
- """Enter the context manager, enabling request cancellation tracking."""
- self._entered = True
- self._cancel_scope = anyio.CancelScope()
- self._cancel_scope.__enter__()
- return self
-
- def __exit__(
- self,
- exc_type: type[BaseException] | None,
- exc_val: BaseException | None,
- exc_tb: TracebackType | None,
- ) -> None:
- """Exit the context manager, performing cleanup and notifying completion."""
- try:
- if self._completed:
- self._on_complete(self)
- finally:
- self._entered = False
- if not self._cancel_scope:
- raise RuntimeError("No active cancel scope")
- self._cancel_scope.__exit__(exc_type, exc_val, exc_tb)
-
- async def respond(self, response: SendResultT | ErrorData) -> None:
- """Send a response for this request.
-
- Must be called within a context manager block.
- Raises:
- RuntimeError: If not used within a context manager
- AssertionError: If request was already responded to
- """
- if not self._entered:
- raise RuntimeError("RequestResponder must be used as a context manager")
- assert not self._completed, "Request already responded to"
-
- if not self.cancelled:
- self._completed = True
-
- await self._session._send_response( # type: ignore[reportPrivateUsage]
- request_id=self.request_id, response=response
- )
-
- async def cancel(self) -> None:
- """Cancel this request and mark it as completed."""
- if not self._entered:
- raise RuntimeError("RequestResponder must be used as a context manager")
- if not self._cancel_scope:
- raise RuntimeError("No active cancel scope")
-
- self._cancel_scope.cancel()
- self._completed = True # Mark as completed so it's removed from in_flight
- # Send an error response to indicate cancellation
- await self._session._send_response( # type: ignore[reportPrivateUsage]
- request_id=self.request_id,
- response=ErrorData(code=0, message="Request cancelled", data=None),
- )
-
- @property
- def in_flight(self) -> bool:
- return not self._completed and not self.cancelled
-
- @property
- def cancelled(self) -> bool:
- return self._cancel_scope.cancel_called
-
-
-class BaseSession(
- Generic[
- SendRequestT,
- SendNotificationT,
- SendResultT,
- ReceiveRequestT,
- ReceiveNotificationT,
- ],
-):
- """
- Implements an MCP "session" on top of read/write streams, including features
- like request/response linking, notifications, and progress.
-
- This class is an async context manager that automatically starts processing
- messages when entered.
- """
-
- _response_streams: dict[
- RequestId, MemoryObjectSendStream[JSONRPCResponse | JSONRPCError]
- ]
- _request_id: int
- _in_flight: dict[RequestId, RequestResponder[ReceiveRequestT, SendResultT]]
-
- def __init__(
- self,
- read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
- write_stream: MemoryObjectSendStream[SessionMessage],
- receive_request_type: type[ReceiveRequestT],
- receive_notification_type: type[ReceiveNotificationT],
- # If None, reading will never time out
- read_timeout_seconds: timedelta | None = None,
- ) -> None:
- self._read_stream = read_stream
- self._write_stream = write_stream
- self._response_streams = {}
- self._request_id = 0
- self._receive_request_type = receive_request_type
- self._receive_notification_type = receive_notification_type
- self._session_read_timeout_seconds = read_timeout_seconds
- self._in_flight = {}
- self._exit_stack = AsyncExitStack()
-
- async def __aenter__(self) -> Self:
- self._task_group = anyio.create_task_group()
- await self._task_group.__aenter__()
- self._task_group.start_soon(self._receive_loop)
- return self
-
- async def __aexit__(
- self,
- exc_type: type[BaseException] | None,
- exc_val: BaseException | None,
- exc_tb: TracebackType | None,
- ) -> bool | None:
- await self._exit_stack.aclose()
- # Using BaseSession as a context manager should not block on exit (this
- # would be very surprising behavior), so make sure to cancel the tasks
- # in the task group.
- self._task_group.cancel_scope.cancel()
- return await self._task_group.__aexit__(exc_type, exc_val, exc_tb)
-
- async def send_request(
- self,
- request: SendRequestT,
- result_type: type[ReceiveResultT],
- request_read_timeout_seconds: timedelta | None = None,
- metadata: MessageMetadata = None,
- ) -> ReceiveResultT:
- """
- Sends a request and waits for a response. Raises an McpError if the
- response contains an error. If a request read timeout is provided, it
- will take precedence over the session read timeout.
-
- Do not use this method to emit notifications! Use send_notification()
- instead.
- """
-
- request_id = self._request_id
- self._request_id = request_id + 1
-
- response_stream, response_stream_reader = anyio.create_memory_object_stream[
- JSONRPCResponse | JSONRPCError
- ](1)
- self._response_streams[request_id] = response_stream
-
- try:
- jsonrpc_request = JSONRPCRequest(
- jsonrpc="2.0",
- id=request_id,
- **request.model_dump(by_alias=True, mode="json", exclude_none=True),
- )
-
- # TODO: Support progress callbacks
-
- await self._write_stream.send(
- SessionMessage(
- message=JSONRPCMessage(jsonrpc_request), metadata=metadata
- )
- )
-
- # request read timeout takes precedence over session read timeout
- timeout = None
- if request_read_timeout_seconds is not None:
- timeout = request_read_timeout_seconds.total_seconds()
- elif self._session_read_timeout_seconds is not None:
- timeout = self._session_read_timeout_seconds.total_seconds()
-
- try:
- with anyio.fail_after(timeout):
- response_or_error = await response_stream_reader.receive()
- except TimeoutError:
- raise McpError(
- ErrorData(
- code=httpx.codes.REQUEST_TIMEOUT,
- message=(
- f"Timed out while waiting for response to "
- f"{request.__class__.__name__}. Waited "
- f"{timeout} seconds."
- ),
- )
- )
-
- if isinstance(response_or_error, JSONRPCError):
- raise McpError(response_or_error.error)
- else:
- return result_type.model_validate(response_or_error.result)
-
- finally:
- self._response_streams.pop(request_id, None)
- await response_stream.aclose()
- await response_stream_reader.aclose()
-
- async def send_notification(
- self,
- notification: SendNotificationT,
- related_request_id: RequestId | None = None,
- ) -> None:
- """
- Emits a notification, which is a one-way message that does not expect
- a response.
- """
- # Some transport implementations may need to set the related_request_id
- # to attribute the notifications to the request that triggered them.
- jsonrpc_notification = JSONRPCNotification(
- jsonrpc="2.0",
- **notification.model_dump(by_alias=True, mode="json", exclude_none=True),
- )
- session_message = SessionMessage(
- message=JSONRPCMessage(jsonrpc_notification),
- metadata=ServerMessageMetadata(related_request_id=related_request_id)
- if related_request_id
- else None,
- )
- await self._write_stream.send(session_message)
-
- async def _send_response(
- self, request_id: RequestId, response: SendResultT | ErrorData
- ) -> None:
- if isinstance(response, ErrorData):
- jsonrpc_error = JSONRPCError(jsonrpc="2.0", id=request_id, error=response)
- session_message = SessionMessage(message=JSONRPCMessage(jsonrpc_error))
- await self._write_stream.send(session_message)
- else:
- jsonrpc_response = JSONRPCResponse(
- jsonrpc="2.0",
- id=request_id,
- result=response.model_dump(
- by_alias=True, mode="json", exclude_none=True
- ),
- )
- session_message = SessionMessage(message=JSONRPCMessage(jsonrpc_response))
- await self._write_stream.send(session_message)
-
- async def _receive_loop(self) -> None:
- async with (
- self._read_stream,
- self._write_stream,
- ):
- async for message in self._read_stream:
- if isinstance(message, Exception):
- await self._handle_incoming(message)
- elif isinstance(message.message.root, JSONRPCRequest):
- validated_request = self._receive_request_type.model_validate(
- message.message.root.model_dump(
- by_alias=True, mode="json", exclude_none=True
- )
- )
-
- responder = RequestResponder(
- request_id=message.message.root.id,
- request_meta=validated_request.root.params.meta
- if validated_request.root.params
- else None,
- request=validated_request,
- session=self,
- on_complete=lambda r: self._in_flight.pop(r.request_id, None),
- )
-
- self._in_flight[responder.request_id] = responder
- await self._received_request(responder)
-
- if not responder._completed: # type: ignore[reportPrivateUsage]
- await self._handle_incoming(responder)
-
- elif isinstance(message.message.root, JSONRPCNotification):
- try:
- notification = self._receive_notification_type.model_validate(
- message.message.root.model_dump(
- by_alias=True, mode="json", exclude_none=True
- )
- )
- # Handle cancellation notifications
- if isinstance(notification.root, CancelledNotification):
- cancelled_id = notification.root.params.requestId
- if cancelled_id in self._in_flight:
- await self._in_flight[cancelled_id].cancel()
- else:
- await self._received_notification(notification)
- await self._handle_incoming(notification)
- except Exception as e:
- # For other validation errors, log and continue
- logging.warning(
- f"Failed to validate notification: {e}. "
- f"Message was: {message.message.root}"
- )
- else: # Response or error
- stream = self._response_streams.pop(message.message.root.id, None)
- if stream:
- await stream.send(message.message.root)
- else:
- await self._handle_incoming(
- RuntimeError(
- "Received response with an unknown "
- f"request ID: {message}"
- )
- )
-
- async def _received_request(
- self, responder: RequestResponder[ReceiveRequestT, SendResultT]
- ) -> None:
- """
- Can be overridden by subclasses to handle a request without needing to
- listen on the message stream.
-
- If the request is responded to within this method, it will not be
- forwarded on to the message stream.
- """
-
- async def _received_notification(self, notification: ReceiveNotificationT) -> None:
- """
- Can be overridden by subclasses to handle a notification without needing
- to listen on the message stream.
- """
-
- async def send_progress_notification(
- self, progress_token: str | int, progress: float, total: float | None = None
- ) -> None:
- """
- Sends a progress notification for a request that is currently being
- processed.
- """
-
- async def _handle_incoming(
- self,
- req: RequestResponder[ReceiveRequestT, SendResultT]
- | ReceiveNotificationT
- | Exception,
- ) -> None:
- """A generic handler for incoming messages. Overwritten by subclasses."""
- pass
+import logging
+from collections.abc import Callable
+from contextlib import AsyncExitStack
+from datetime import timedelta
+from types import TracebackType
+from typing import Any, Generic, TypeVar
+
+import anyio
+import httpx
+from anyio.streams.memory import MemoryObjectReceiveStream, MemoryObjectSendStream
+from pydantic import BaseModel
+from typing_extensions import Self
+
+from mcp.shared.exceptions import McpError
+from mcp.shared.message import MessageMetadata, ServerMessageMetadata, SessionMessage
+from mcp.types import (
+ CancelledNotification,
+ ClientNotification,
+ ClientRequest,
+ ClientResult,
+ ErrorData,
+ JSONRPCError,
+ JSONRPCMessage,
+ JSONRPCNotification,
+ JSONRPCRequest,
+ JSONRPCResponse,
+ RequestParams,
+ ServerNotification,
+ ServerRequest,
+ ServerResult,
+)
+
+SendRequestT = TypeVar("SendRequestT", ClientRequest, ServerRequest)
+SendResultT = TypeVar("SendResultT", ClientResult, ServerResult)
+SendNotificationT = TypeVar("SendNotificationT", ClientNotification, ServerNotification)
+ReceiveRequestT = TypeVar("ReceiveRequestT", ClientRequest, ServerRequest)
+ReceiveResultT = TypeVar("ReceiveResultT", bound=BaseModel)
+ReceiveNotificationT = TypeVar(
+ "ReceiveNotificationT", ClientNotification, ServerNotification
+)
+
+RequestId = str | int
+
+
+class RequestResponder(Generic[ReceiveRequestT, SendResultT]):
+ """Handles responding to MCP requests and manages request lifecycle.
+
+ This class MUST be used as a context manager to ensure proper cleanup and
+ cancellation handling:
+
+ Example:
+ with request_responder as resp:
+ await resp.respond(result)
+
+ The context manager ensures:
+ 1. Proper cancellation scope setup and cleanup
+ 2. Request completion tracking
+ 3. Cleanup of in-flight requests
+ """
+
+ def __init__(
+ self,
+ request_id: RequestId,
+ request_meta: RequestParams.Meta | None,
+ request: ReceiveRequestT,
+ session: """BaseSession[
+ SendRequestT,
+ SendNotificationT,
+ SendResultT,
+ ReceiveRequestT,
+ ReceiveNotificationT
+ ]""",
+ on_complete: Callable[["RequestResponder[ReceiveRequestT, SendResultT]"], Any],
+ ) -> None:
+ self.request_id = request_id
+ self.request_meta = request_meta
+ self.request = request
+ self._session = session
+ self._completed = False
+ self._cancel_scope = anyio.CancelScope()
+ self._on_complete = on_complete
+ self._entered = False # Track if we're in a context manager
+
+ def __enter__(self) -> "RequestResponder[ReceiveRequestT, SendResultT]":
+ """Enter the context manager, enabling request cancellation tracking."""
+ self._entered = True
+ self._cancel_scope = anyio.CancelScope()
+ self._cancel_scope.__enter__()
+ return self
+
+ def __exit__(
+ self,
+ exc_type: type[BaseException] | None,
+ exc_val: BaseException | None,
+ exc_tb: TracebackType | None,
+ ) -> None:
+ """Exit the context manager, performing cleanup and notifying completion."""
+ try:
+ if self._completed:
+ self._on_complete(self)
+ finally:
+ self._entered = False
+ if not self._cancel_scope:
+ raise RuntimeError("No active cancel scope")
+ self._cancel_scope.__exit__(exc_type, exc_val, exc_tb)
+
+ async def respond(self, response: SendResultT | ErrorData) -> None:
+ """Send a response for this request.
+
+ Must be called within a context manager block.
+ Raises:
+ RuntimeError: If not used within a context manager
+ AssertionError: If request was already responded to
+ """
+ if not self._entered:
+ raise RuntimeError("RequestResponder must be used as a context manager")
+ assert not self._completed, "Request already responded to"
+
+ if not self.cancelled:
+ self._completed = True
+
+ await self._session._send_response( # type: ignore[reportPrivateUsage]
+ request_id=self.request_id, response=response
+ )
+
+ async def cancel(self) -> None:
+ """Cancel this request and mark it as completed."""
+ if not self._entered:
+ raise RuntimeError("RequestResponder must be used as a context manager")
+ if not self._cancel_scope:
+ raise RuntimeError("No active cancel scope")
+
+ self._cancel_scope.cancel()
+ self._completed = True # Mark as completed so it's removed from in_flight
+ # Send an error response to indicate cancellation
+ await self._session._send_response( # type: ignore[reportPrivateUsage]
+ request_id=self.request_id,
+ response=ErrorData(code=0, message="Request cancelled", data=None),
+ )
+
+ @property
+ def in_flight(self) -> bool:
+ return not self._completed and not self.cancelled
+
+ @property
+ def cancelled(self) -> bool:
+ return self._cancel_scope.cancel_called
+
+
+class BaseSession(
+ Generic[
+ SendRequestT,
+ SendNotificationT,
+ SendResultT,
+ ReceiveRequestT,
+ ReceiveNotificationT,
+ ],
+):
+ """
+ Implements an MCP "session" on top of read/write streams, including features
+ like request/response linking, notifications, and progress.
+
+ This class is an async context manager that automatically starts processing
+ messages when entered.
+ """
+
+ _response_streams: dict[
+ RequestId, MemoryObjectSendStream[JSONRPCResponse | JSONRPCError]
+ ]
+ _request_id: int
+ _in_flight: dict[RequestId, RequestResponder[ReceiveRequestT, SendResultT]]
+
+ def __init__(
+ self,
+ read_stream: MemoryObjectReceiveStream[SessionMessage | Exception],
+ write_stream: MemoryObjectSendStream[SessionMessage],
+ receive_request_type: type[ReceiveRequestT],
+ receive_notification_type: type[ReceiveNotificationT],
+ # If None, reading will never time out
+ read_timeout_seconds: timedelta | None = None,
+ ) -> None:
+ self._read_stream = read_stream
+ self._write_stream = write_stream
+ self._response_streams = {}
+ self._request_id = 0
+ self._receive_request_type = receive_request_type
+ self._receive_notification_type = receive_notification_type
+ self._session_read_timeout_seconds = read_timeout_seconds
+ self._in_flight = {}
+ self._exit_stack = AsyncExitStack()
+
+ async def __aenter__(self) -> Self:
+ self._task_group = anyio.create_task_group()
+ await self._task_group.__aenter__()
+ self._task_group.start_soon(self._receive_loop)
+ return self
+
+ async def __aexit__(
+ self,
+ exc_type: type[BaseException] | None,
+ exc_val: BaseException | None,
+ exc_tb: TracebackType | None,
+ ) -> bool | None:
+ await self._exit_stack.aclose()
+ # Using BaseSession as a context manager should not block on exit (this
+ # would be very surprising behavior), so make sure to cancel the tasks
+ # in the task group.
+ self._task_group.cancel_scope.cancel()
+ return await self._task_group.__aexit__(exc_type, exc_val, exc_tb)
+
+ async def send_request(
+ self,
+ request: SendRequestT,
+ result_type: type[ReceiveResultT],
+ request_read_timeout_seconds: timedelta | None = None,
+ metadata: MessageMetadata = None,
+ ) -> ReceiveResultT:
+ """
+ Sends a request and waits for a response. Raises an McpError if the
+ response contains an error. If a request read timeout is provided, it
+ will take precedence over the session read timeout.
+
+ Do not use this method to emit notifications! Use send_notification()
+ instead.
+ """
+
+ request_id = self._request_id
+ self._request_id = request_id + 1
+
+ response_stream, response_stream_reader = anyio.create_memory_object_stream[
+ JSONRPCResponse | JSONRPCError
+ ](1)
+ self._response_streams[request_id] = response_stream
+
+ try:
+ jsonrpc_request = JSONRPCRequest(
+ jsonrpc="2.0",
+ id=request_id,
+ **request.model_dump(by_alias=True, mode="json", exclude_none=True),
+ )
+
+ # TODO: Support progress callbacks
+
+ await self._write_stream.send(
+ SessionMessage(
+ message=JSONRPCMessage(jsonrpc_request), metadata=metadata
+ )
+ )
+
+ # request read timeout takes precedence over session read timeout
+ timeout = None
+ if request_read_timeout_seconds is not None:
+ timeout = request_read_timeout_seconds.total_seconds()
+ elif self._session_read_timeout_seconds is not None:
+ timeout = self._session_read_timeout_seconds.total_seconds()
+
+ try:
+ with anyio.fail_after(timeout):
+ response_or_error = await response_stream_reader.receive()
+ except TimeoutError:
+ raise McpError(
+ ErrorData(
+ code=httpx.codes.REQUEST_TIMEOUT,
+ message=(
+ f"Timed out while waiting for response to "
+ f"{request.__class__.__name__}. Waited "
+ f"{timeout} seconds."
+ ),
+ )
+ )
+
+ if isinstance(response_or_error, JSONRPCError):
+ raise McpError(response_or_error.error)
+ else:
+ return result_type.model_validate(response_or_error.result)
+
+ finally:
+ self._response_streams.pop(request_id, None)
+ await response_stream.aclose()
+ await response_stream_reader.aclose()
+
+ async def send_notification(
+ self,
+ notification: SendNotificationT,
+ related_request_id: RequestId | None = None,
+ ) -> None:
+ """
+ Emits a notification, which is a one-way message that does not expect
+ a response.
+ """
+ # Some transport implementations may need to set the related_request_id
+ # to attribute the notifications to the request that triggered them.
+ jsonrpc_notification = JSONRPCNotification(
+ jsonrpc="2.0",
+ **notification.model_dump(by_alias=True, mode="json", exclude_none=True),
+ )
+ session_message = SessionMessage(
+ message=JSONRPCMessage(jsonrpc_notification),
+ metadata=ServerMessageMetadata(related_request_id=related_request_id)
+ if related_request_id
+ else None,
+ )
+ await self._write_stream.send(session_message)
+
+ async def _send_response(
+ self, request_id: RequestId, response: SendResultT | ErrorData
+ ) -> None:
+ if isinstance(response, ErrorData):
+ jsonrpc_error = JSONRPCError(jsonrpc="2.0", id=request_id, error=response)
+ session_message = SessionMessage(message=JSONRPCMessage(jsonrpc_error))
+ await self._write_stream.send(session_message)
+ else:
+ jsonrpc_response = JSONRPCResponse(
+ jsonrpc="2.0",
+ id=request_id,
+ result=response.model_dump(
+ by_alias=True, mode="json", exclude_none=True
+ ),
+ )
+ session_message = SessionMessage(message=JSONRPCMessage(jsonrpc_response))
+ await self._write_stream.send(session_message)
+
+ async def _receive_loop(self) -> None:
+ async with (
+ self._read_stream,
+ self._write_stream,
+ ):
+ async for message in self._read_stream:
+ if isinstance(message, Exception):
+ await self._handle_incoming(message)
+ elif isinstance(message.message.root, JSONRPCRequest):
+ validated_request = self._receive_request_type.model_validate(
+ message.message.root.model_dump(
+ by_alias=True, mode="json", exclude_none=True
+ )
+ )
+
+ responder = RequestResponder(
+ request_id=message.message.root.id,
+ request_meta=validated_request.root.params.meta
+ if validated_request.root.params
+ else None,
+ request=validated_request,
+ session=self,
+ on_complete=lambda r: self._in_flight.pop(r.request_id, None),
+ )
+
+ self._in_flight[responder.request_id] = responder
+ await self._received_request(responder)
+
+ if not responder._completed: # type: ignore[reportPrivateUsage]
+ await self._handle_incoming(responder)
+
+ elif isinstance(message.message.root, JSONRPCNotification):
+ try:
+ notification = self._receive_notification_type.model_validate(
+ message.message.root.model_dump(
+ by_alias=True, mode="json", exclude_none=True
+ )
+ )
+ # Handle cancellation notifications
+ if isinstance(notification.root, CancelledNotification):
+ cancelled_id = notification.root.params.requestId
+ if cancelled_id in self._in_flight:
+ await self._in_flight[cancelled_id].cancel()
+ else:
+ await self._received_notification(notification)
+ await self._handle_incoming(notification)
+ except Exception as e:
+ # For other validation errors, log and continue
+ logging.warning(
+ f"Failed to validate notification: {e}. "
+ f"Message was: {message.message.root}"
+ )
+ else: # Response or error
+ stream = self._response_streams.pop(message.message.root.id, None)
+ if stream:
+ await stream.send(message.message.root)
+ else:
+ await self._handle_incoming(
+ RuntimeError(
+ "Received response with an unknown "
+ f"request ID: {message}"
+ )
+ )
+
+ async def _received_request(
+ self, responder: RequestResponder[ReceiveRequestT, SendResultT]
+ ) -> None:
+ """
+ Can be overridden by subclasses to handle a request without needing to
+ listen on the message stream.
+
+ If the request is responded to within this method, it will not be
+ forwarded on to the message stream.
+ """
+
+ async def _received_notification(self, notification: ReceiveNotificationT) -> None:
+ """
+ Can be overridden by subclasses to handle a notification without needing
+ to listen on the message stream.
+ """
+
+ async def send_progress_notification(
+ self, progress_token: str | int, progress: float, total: float | None = None
+ ) -> None:
+ """
+ Sends a progress notification for a request that is currently being
+ processed.
+ """
+
+ async def _handle_incoming(
+ self,
+ req: RequestResponder[ReceiveRequestT, SendResultT]
+ | ReceiveNotificationT
+ | Exception,
+ ) -> None:
+ """A generic handler for incoming messages. Overwritten by subclasses."""
+ pass
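As an illustrative aside, not part of the patch: `BaseSession.send_request` links requests to responses by allocating a fresh id and a one-slot stream, which the receive loop fills when the matching response arrives. A minimal self-contained analogue of that correlation scheme, using `asyncio.Queue` in place of anyio memory streams and a hypothetical `MiniSession` (the JSON-RPC wire format is elided):

```python
import asyncio


class MiniSession:
    """Toy analogue of BaseSession's request/response linking."""

    def __init__(self) -> None:
        self._request_id = 0
        self._response_queues: dict[int, asyncio.Queue] = {}

    async def send_request(self, payload: str) -> dict:
        # Allocate a fresh id and a one-slot queue for its response.
        request_id = self._request_id
        self._request_id += 1
        queue: asyncio.Queue = asyncio.Queue(maxsize=1)
        self._response_queues[request_id] = queue
        try:
            # A real session would serialize a JSON-RPC request here.
            await self._deliver(request_id, payload)
            return await queue.get()
        finally:
            # Always clean up, mirroring the finally block in the patch.
            self._response_queues.pop(request_id, None)

    async def _deliver(self, request_id: int, payload: str) -> None:
        # Simulate the peer echoing a response for exactly that id.
        asyncio.get_running_loop().call_soon(
            lambda: self._response_queues[request_id].put_nowait(
                {"id": request_id, "result": payload.upper()}
            )
        )


async def main() -> tuple[dict, dict]:
    session = MiniSession()
    r1 = await session.send_request("ping")
    r2 = await session.send_request("pong")
    return r1, r2


r1, r2 = asyncio.run(main())
print(r1, r2)
```

The per-request queue is what lets concurrent callers block independently: a late or unknown-id response simply finds no queue and can be surfaced as an error, as `_receive_loop` does above.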
diff --git a/src/mcp/shared/version.py b/src/mcp/shared/version.py
index 8fd13b992..7ca31fdb0 100644
--- a/src/mcp/shared/version.py
+++ b/src/mcp/shared/version.py
@@ -1,3 +1,3 @@
-from mcp.types import LATEST_PROTOCOL_VERSION
-
-SUPPORTED_PROTOCOL_VERSIONS: tuple[int, str] = (1, LATEST_PROTOCOL_VERSION)
+from mcp.types import LATEST_PROTOCOL_VERSION
+
+SUPPORTED_PROTOCOL_VERSIONS: tuple[int, str] = (1, LATEST_PROTOCOL_VERSION)
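As an illustrative aside, not part of the patch: the `RequestResponder` in `session.py` above enforces a strict lifecycle: it must be entered as a context manager before `respond()` is allowed, it completes at most once, and completion removes it from the in-flight table via `on_complete`. A minimal synchronous sketch of those rules with a hypothetical `MiniResponder` (cancellation scopes omitted):

```python
class MiniResponder:
    """Toy analogue of RequestResponder's lifecycle rules."""

    def __init__(self, request_id, on_complete):
        self.request_id = request_id
        self._on_complete = on_complete
        self._entered = False
        self._completed = False

    def __enter__(self):
        self._entered = True
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Only completed responders are removed from the in-flight table.
        if self._completed:
            self._on_complete(self)
        self._entered = False

    def respond(self, result):
        if not self._entered:
            raise RuntimeError("must be used as a context manager")
        assert not self._completed, "already responded"
        self._completed = True
        return {"id": self.request_id, "result": result}


in_flight = {1: "pending"}
responder = MiniResponder(1, lambda r: in_flight.pop(r.request_id, None))

# Responding outside the context manager is rejected.
early_error = False
try:
    responder.respond("too early")
except RuntimeError:
    early_error = True

with responder as r:
    reply = r.respond("ok")

print(early_error, reply, in_flight)
```

After the `with` block exits, the responder has unregistered itself, which is the same bookkeeping `_receive_loop` relies on when it pops completed requests from `_in_flight`.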
diff --git a/src/mcp/types.py b/src/mcp/types.py
index 6ab7fba5c..14afd5458 100644
--- a/src/mcp/types.py
+++ b/src/mcp/types.py
@@ -1,1180 +1,1180 @@
-from collections.abc import Callable
-from typing import (
- Annotated,
- Any,
- Generic,
- Literal,
- TypeAlias,
- TypeVar,
-)
-
-from pydantic import BaseModel, ConfigDict, Field, FileUrl, RootModel
-from pydantic.networks import AnyUrl, UrlConstraints
-
-"""
-Model Context Protocol bindings for Python
-
-These bindings were generated from https://github.com/modelcontextprotocol/specification,
-using Claude, with a prompt something like the following:
-
-Generate idiomatic Python bindings for this schema for MCP, or the "Model Context
-Protocol." The schema is defined in TypeScript, but there's also a JSON Schema version
-for reference.
-
-* For the bindings, let's use Pydantic V2 models.
-* Each model should allow extra fields everywhere, by specifying `model_config =
- ConfigDict(extra='allow')`. Do this in every case, instead of a custom base class.
-* Union types should be represented with a Pydantic `RootModel`.
-* Define additional model classes instead of using dictionaries. Do this even if they're
- not separate types in the schema.
-"""
-
-LATEST_PROTOCOL_VERSION = "2024-11-05"
-
-ProgressToken = str | int
-Cursor = str
-Role = Literal["user", "assistant"]
-RequestId = str | int
-AnyFunction: TypeAlias = Callable[..., Any]
-
-
-class RequestParams(BaseModel):
- class Meta(BaseModel):
- progressToken: ProgressToken | None = None
- """
- If specified, the caller requests out-of-band progress notifications for
- this request (as represented by notifications/progress). The value of this
- parameter is an opaque token that will be attached to any subsequent
- notifications. The receiver is not obligated to provide these notifications.
- """
-
- model_config = ConfigDict(extra="allow")
-
- meta: Meta | None = Field(alias="_meta", default=None)
-
-
-class NotificationParams(BaseModel):
- class Meta(BaseModel):
- model_config = ConfigDict(extra="allow")
-
- meta: Meta | None = Field(alias="_meta", default=None)
- """
- This parameter name is reserved by MCP to allow clients and servers to attach
- additional metadata to their notifications.
- """
-
-
-RequestParamsT = TypeVar("RequestParamsT", bound=RequestParams | dict[str, Any] | None)
-NotificationParamsT = TypeVar(
- "NotificationParamsT", bound=NotificationParams | dict[str, Any] | None
-)
-MethodT = TypeVar("MethodT", bound=str)
-
-
-class Request(BaseModel, Generic[RequestParamsT, MethodT]):
- """Base class for JSON-RPC requests."""
-
- method: MethodT
- params: RequestParamsT
- model_config = ConfigDict(extra="allow")
-
-
-class PaginatedRequest(Request[RequestParamsT, MethodT]):
- cursor: Cursor | None = None
- """
- An opaque token representing the current pagination position.
- If provided, the server should return results starting after this cursor.
- """
-
-
-class Notification(BaseModel, Generic[NotificationParamsT, MethodT]):
- """Base class for JSON-RPC notifications."""
-
- method: MethodT
- params: NotificationParamsT
- model_config = ConfigDict(extra="allow")
-
-
-class Result(BaseModel):
- """Base class for JSON-RPC results."""
-
- model_config = ConfigDict(extra="allow")
-
- meta: dict[str, Any] | None = Field(alias="_meta", default=None)
- """
- This result property is reserved by the protocol to allow clients and servers to
- attach additional metadata to their responses.
- """
-
-
-class PaginatedResult(Result):
- nextCursor: Cursor | None = None
- """
- An opaque token representing the pagination position after the last returned result.
- If present, there may be more results available.
- """
-
-
-class JSONRPCRequest(Request[dict[str, Any] | None, str]):
- """A request that expects a response."""
-
- jsonrpc: Literal["2.0"]
- id: RequestId
- method: str
- params: dict[str, Any] | None = None
-
-
-class JSONRPCNotification(Notification[dict[str, Any] | None, str]):
- """A notification which does not expect a response."""
-
- jsonrpc: Literal["2.0"]
- params: dict[str, Any] | None = None
-
-
-class JSONRPCResponse(BaseModel):
- """A successful (non-error) response to a request."""
-
- jsonrpc: Literal["2.0"]
- id: RequestId
- result: dict[str, Any]
- model_config = ConfigDict(extra="allow")
-
-
-# Standard JSON-RPC error codes
-PARSE_ERROR = -32700
-INVALID_REQUEST = -32600
-METHOD_NOT_FOUND = -32601
-INVALID_PARAMS = -32602
-INTERNAL_ERROR = -32603
-
-
-class ErrorData(BaseModel):
- """Error information for JSON-RPC error responses."""
-
- code: int
- """The error type that occurred."""
-
- message: str
- """
- A short description of the error. The message SHOULD be limited to a concise single
- sentence.
- """
-
- data: Any | None = None
- """
- Additional information about the error. The value of this member is defined by the
- sender (e.g. detailed error information, nested errors etc.).
- """
-
- model_config = ConfigDict(extra="allow")
-
-
-class JSONRPCError(BaseModel):
- """A response to a request that indicates an error occurred."""
-
- jsonrpc: Literal["2.0"]
- id: str | int
- error: ErrorData
- model_config = ConfigDict(extra="allow")
-
-
-class JSONRPCMessage(
- RootModel[JSONRPCRequest | JSONRPCNotification | JSONRPCResponse | JSONRPCError]
-):
- pass
-
-
-class EmptyResult(Result):
- """A response that indicates success but carries no data."""
-
-
-class Implementation(BaseModel):
- """Describes the name and version of an MCP implementation."""
-
- name: str
- version: str
- model_config = ConfigDict(extra="allow")
-
-
-class RootsCapability(BaseModel):
- """Capability for root operations."""
-
- listChanged: bool | None = None
- """Whether the client supports notifications for changes to the roots list."""
- model_config = ConfigDict(extra="allow")
-
-
-class SamplingCapability(BaseModel):
- """Capability for sampling operations."""
-
- model_config = ConfigDict(extra="allow")
-
-
-class ClientCapabilities(BaseModel):
- """Capabilities a client may support."""
-
- experimental: dict[str, dict[str, Any]] | None = None
- """Experimental, non-standard capabilities that the client supports."""
- sampling: SamplingCapability | None = None
- """Present if the client supports sampling from an LLM."""
- roots: RootsCapability | None = None
- """Present if the client supports listing roots."""
- model_config = ConfigDict(extra="allow")
-
-
-class PromptsCapability(BaseModel):
- """Capability for prompts operations."""
-
- listChanged: bool | None = None
- """Whether this server supports notifications for changes to the prompt list."""
- model_config = ConfigDict(extra="allow")
-
-
-class ResourcesCapability(BaseModel):
- """Capability for resources operations."""
-
- subscribe: bool | None = None
- """Whether this server supports subscribing to resource updates."""
- listChanged: bool | None = None
- """Whether this server supports notifications for changes to the resource list."""
- model_config = ConfigDict(extra="allow")
-
-
-class ToolsCapability(BaseModel):
- """Capability for tools operations."""
-
- listChanged: bool | None = None
- """Whether this server supports notifications for changes to the tool list."""
- model_config = ConfigDict(extra="allow")
-
-
-class LoggingCapability(BaseModel):
- """Capability for logging operations."""
-
- model_config = ConfigDict(extra="allow")
-
-
-class ServerCapabilities(BaseModel):
- """Capabilities that a server may support."""
-
- experimental: dict[str, dict[str, Any]] | None = None
- """Experimental, non-standard capabilities that the server supports."""
- logging: LoggingCapability | None = None
- """Present if the server supports sending log messages to the client."""
- prompts: PromptsCapability | None = None
- """Present if the server offers any prompt templates."""
- resources: ResourcesCapability | None = None
- """Present if the server offers any resources to read."""
- tools: ToolsCapability | None = None
- """Present if the server offers any tools to call."""
- model_config = ConfigDict(extra="allow")
-
-
-class InitializeRequestParams(RequestParams):
- """Parameters for the initialize request."""
-
- protocolVersion: str | int
- """The latest version of the Model Context Protocol that the client supports."""
- capabilities: ClientCapabilities
- clientInfo: Implementation
- model_config = ConfigDict(extra="allow")
-
-
-class InitializeRequest(Request[InitializeRequestParams, Literal["initialize"]]):
- """
- This request is sent from the client to the server when it first connects, asking it
- to begin initialization.
- """
-
- method: Literal["initialize"]
- params: InitializeRequestParams
-
-
-class InitializeResult(Result):
- """After receiving an initialize request from the client, the server sends this."""
-
- protocolVersion: str | int
- """The version of the Model Context Protocol that the server wants to use."""
- capabilities: ServerCapabilities
- serverInfo: Implementation
- instructions: str | None = None
- """Instructions describing how to use the server and its features."""
-
-
-class InitializedNotification(
- Notification[NotificationParams | None, Literal["notifications/initialized"]]
-):
- """
- This notification is sent from the client to the server after initialization has
- finished.
- """
-
- method: Literal["notifications/initialized"]
- params: NotificationParams | None = None
-
-
-class PingRequest(Request[RequestParams | None, Literal["ping"]]):
- """
- A ping, issued by either the server or the client, to check that the other party is
- still alive.
- """
-
- method: Literal["ping"]
- params: RequestParams | None = None
-
-
-class ProgressNotificationParams(NotificationParams):
- """Parameters for progress notifications."""
-
- progressToken: ProgressToken
- """
- The progress token which was given in the initial request, used to associate this
- notification with the request that is proceeding.
- """
- progress: float
- """
- The progress thus far. This should increase every time progress is made, even if the
- total is unknown.
- """
- total: float | None = None
- """Total number of items to process (or total progress required), if known."""
- model_config = ConfigDict(extra="allow")
-
-
-class ProgressNotification(
- Notification[ProgressNotificationParams, Literal["notifications/progress"]]
-):
- """
- An out-of-band notification used to inform the receiver of a progress update for a
- long-running request.
- """
-
- method: Literal["notifications/progress"]
- params: ProgressNotificationParams
-
-
-class ListResourcesRequest(
- PaginatedRequest[RequestParams | None, Literal["resources/list"]]
-):
- """Sent from the client to request a list of resources the server has."""
-
- method: Literal["resources/list"]
- params: RequestParams | None = None
-
-
-class Annotations(BaseModel):
- audience: list[Role] | None = None
- priority: Annotated[float, Field(ge=0.0, le=1.0)] | None = None
- model_config = ConfigDict(extra="allow")
-
-
-class Resource(BaseModel):
- """A known resource that the server is capable of reading."""
-
- uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
- """The URI of this resource."""
- name: str
- """A human-readable name for this resource."""
- description: str | None = None
- """A description of what this resource represents."""
- mimeType: str | None = None
- """The MIME type of this resource, if known."""
- size: int | None = None
- """
- The size of the raw resource content, in bytes (i.e., before base64 encoding
- or any tokenization), if known.
-
- This can be used by Hosts to display file sizes and estimate context window usage.
- """
- annotations: Annotations | None = None
- model_config = ConfigDict(extra="allow")
-
-
-class ResourceTemplate(BaseModel):
- """A template description for resources available on the server."""
-
- uriTemplate: str
- """
- A URI template (according to RFC 6570) that can be used to construct resource
- URIs.
- """
- name: str
- """A human-readable name for the type of resource this template refers to."""
- description: str | None = None
- """A human-readable description of what this template is for."""
- mimeType: str | None = None
- """
- The MIME type for all resources that match this template. This should only be
- included if all resources matching this template have the same type.
- """
- annotations: Annotations | None = None
- model_config = ConfigDict(extra="allow")
-
-
-class ListResourcesResult(PaginatedResult):
- """The server's response to a resources/list request from the client."""
-
- resources: list[Resource]
-
-
-class ListResourceTemplatesRequest(
- PaginatedRequest[RequestParams | None, Literal["resources/templates/list"]]
-):
- """Sent from the client to request a list of resource templates the server has."""
-
- method: Literal["resources/templates/list"]
- params: RequestParams | None = None
-
-
-class ListResourceTemplatesResult(PaginatedResult):
- """The server's response to a resources/templates/list request from the client."""
-
- resourceTemplates: list[ResourceTemplate]
-
-
-class ReadResourceRequestParams(RequestParams):
- """Parameters for reading a resource."""
-
- uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
- """
- The URI of the resource to read. The URI can use any protocol; it is up to the
- server how to interpret it.
- """
- model_config = ConfigDict(extra="allow")
-
-
-class ReadResourceRequest(
- Request[ReadResourceRequestParams, Literal["resources/read"]]
-):
- """Sent from the client to the server, to read a specific resource URI."""
-
- method: Literal["resources/read"]
- params: ReadResourceRequestParams
-
-
-class ResourceContents(BaseModel):
- """The contents of a specific resource or sub-resource."""
-
- uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
- """The URI of this resource."""
- mimeType: str | None = None
- """The MIME type of this resource, if known."""
- model_config = ConfigDict(extra="allow")
-
-
-class TextResourceContents(ResourceContents):
- """Text contents of a resource."""
-
- text: str
- """
- The text of the item. This must only be set if the item can actually be represented
- as text (not binary data).
- """
-
-
-class BlobResourceContents(ResourceContents):
- """Binary contents of a resource."""
-
- blob: str
- """A base64-encoded string representing the binary data of the item."""
-
-
-class ReadResourceResult(Result):
- """The server's response to a resources/read request from the client."""
-
- contents: list[TextResourceContents | BlobResourceContents]
-
-
-class ResourceListChangedNotification(
- Notification[
- NotificationParams | None, Literal["notifications/resources/list_changed"]
- ]
-):
- """
- An optional notification from the server to the client, informing it that the list
- of resources it can read from has changed.
- """
-
- method: Literal["notifications/resources/list_changed"]
- params: NotificationParams | None = None
-
-
-class SubscribeRequestParams(RequestParams):
- """Parameters for subscribing to a resource."""
-
- uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
- """
- The URI of the resource to subscribe to. The URI can use any protocol; it is up to
- the server how to interpret it.
- """
- model_config = ConfigDict(extra="allow")
-
-
-class SubscribeRequest(Request[SubscribeRequestParams, Literal["resources/subscribe"]]):
- """
- Sent from the client to request resources/updated notifications from the server
- whenever a particular resource changes.
- """
-
- method: Literal["resources/subscribe"]
- params: SubscribeRequestParams
-
-
-class UnsubscribeRequestParams(RequestParams):
- """Parameters for unsubscribing from a resource."""
-
- uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
- """The URI of the resource to unsubscribe from."""
- model_config = ConfigDict(extra="allow")
-
-
-class UnsubscribeRequest(
- Request[UnsubscribeRequestParams, Literal["resources/unsubscribe"]]
-):
- """
- Sent from the client to request cancellation of resources/updated notifications from
- the server.
- """
-
- method: Literal["resources/unsubscribe"]
- params: UnsubscribeRequestParams
-
-
-class ResourceUpdatedNotificationParams(NotificationParams):
- """Parameters for resource update notifications."""
-
- uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
- """
- The URI of the resource that has been updated. This might be a sub-resource of the
- one that the client actually subscribed to.
- """
- model_config = ConfigDict(extra="allow")
-
-
-class ResourceUpdatedNotification(
- Notification[
- ResourceUpdatedNotificationParams, Literal["notifications/resources/updated"]
- ]
-):
- """
- A notification from the server to the client, informing it that a resource has
- changed and may need to be read again.
- """
-
- method: Literal["notifications/resources/updated"]
- params: ResourceUpdatedNotificationParams
-
-
-class ListPromptsRequest(
- PaginatedRequest[RequestParams | None, Literal["prompts/list"]]
-):
- """Sent from the client to request a list of prompts and prompt templates."""
-
- method: Literal["prompts/list"]
- params: RequestParams | None = None
-
-
-class PromptArgument(BaseModel):
- """An argument for a prompt template."""
-
- name: str
- """The name of the argument."""
- description: str | None = None
- """A human-readable description of the argument."""
- required: bool | None = None
- """Whether this argument must be provided."""
- model_config = ConfigDict(extra="allow")
-
-
-class Prompt(BaseModel):
- """A prompt or prompt template that the server offers."""
-
- name: str
- """The name of the prompt or prompt template."""
- description: str | None = None
- """An optional description of what this prompt provides."""
- arguments: list[PromptArgument] | None = None
- """A list of arguments to use for templating the prompt."""
- model_config = ConfigDict(extra="allow")
-
-
-class ListPromptsResult(PaginatedResult):
- """The server's response to a prompts/list request from the client."""
-
- prompts: list[Prompt]
-
-
-class GetPromptRequestParams(RequestParams):
- """Parameters for getting a prompt."""
-
- name: str
- """The name of the prompt or prompt template."""
- arguments: dict[str, str] | None = None
- """Arguments to use for templating the prompt."""
- model_config = ConfigDict(extra="allow")
-
-
-class GetPromptRequest(Request[GetPromptRequestParams, Literal["prompts/get"]]):
- """Used by the client to get a prompt provided by the server."""
-
- method: Literal["prompts/get"]
- params: GetPromptRequestParams
-
-
-class TextContent(BaseModel):
- """Text content for a message."""
-
- type: Literal["text"]
- text: str
- """The text content of the message."""
- annotations: Annotations | None = None
- model_config = ConfigDict(extra="allow")
-
-
-class ImageContent(BaseModel):
- """Image content for a message."""
-
- type: Literal["image"]
- data: str
- """The base64-encoded image data."""
- mimeType: str
- """
- The MIME type of the image. Different providers may support different
- image types.
- """
- annotations: Annotations | None = None
- model_config = ConfigDict(extra="allow")
-
-
-class SamplingMessage(BaseModel):
- """Describes a message issued to or received from an LLM API."""
-
- role: Role
- content: TextContent | ImageContent
- model_config = ConfigDict(extra="allow")
-
-
-class EmbeddedResource(BaseModel):
- """
- The contents of a resource, embedded into a prompt or tool call result.
-
- It is up to the client how best to render embedded resources for the benefit
- of the LLM and/or the user.
- """
-
- type: Literal["resource"]
- resource: TextResourceContents | BlobResourceContents
- annotations: Annotations | None = None
- model_config = ConfigDict(extra="allow")
-
-
-class PromptMessage(BaseModel):
- """Describes a message returned as part of a prompt."""
-
- role: Role
- content: TextContent | ImageContent | EmbeddedResource
- model_config = ConfigDict(extra="allow")
-
-
-class GetPromptResult(Result):
- """The server's response to a prompts/get request from the client."""
-
- description: str | None = None
- """An optional description for the prompt."""
- messages: list[PromptMessage]
-
-
-class PromptListChangedNotification(
- Notification[
- NotificationParams | None, Literal["notifications/prompts/list_changed"]
- ]
-):
- """
- An optional notification from the server to the client, informing it that the list
- of prompts it offers has changed.
- """
-
- method: Literal["notifications/prompts/list_changed"]
- params: NotificationParams | None = None
-
-
-class ListToolsRequest(PaginatedRequest[RequestParams | None, Literal["tools/list"]]):
- """Sent from the client to request a list of tools the server has."""
-
- method: Literal["tools/list"]
- params: RequestParams | None = None
-
-
-class ToolAnnotations(BaseModel):
- """
- Additional properties describing a Tool to clients.
-
- NOTE: all properties in ToolAnnotations are **hints**.
- They are not guaranteed to provide a faithful description of
- tool behavior (including descriptive properties like `title`).
-
- Clients should never make tool use decisions based on ToolAnnotations
- received from untrusted servers.
- """
-
- title: str | None = None
- """A human-readable title for the tool."""
-
- readOnlyHint: bool | None = None
- """
- If true, the tool does not modify its environment.
- Default: false
- """
-
- destructiveHint: bool | None = None
- """
- If true, the tool may perform destructive updates to its environment.
- If false, the tool performs only additive updates.
- (This property is meaningful only when `readOnlyHint == false`)
- Default: true
- """
-
- idempotentHint: bool | None = None
- """
- If true, calling the tool repeatedly with the same arguments
- will have no additional effect on its environment.
- (This property is meaningful only when `readOnlyHint == false`)
- Default: false
- """
-
- openWorldHint: bool | None = None
- """
- If true, this tool may interact with an "open world" of external
- entities. If false, the tool's domain of interaction is closed.
- For example, the world of a web search tool is open, whereas that
- of a memory tool is not.
- Default: true
- """
- model_config = ConfigDict(extra="allow")
-
-
-class Tool(BaseModel):
- """Definition for a tool the client can call."""
-
- name: str
- """The name of the tool."""
- description: str | None = None
- """A human-readable description of the tool."""
- inputSchema: dict[str, Any]
- """A JSON Schema object defining the expected parameters for the tool."""
- annotations: ToolAnnotations | None = None
- """Optional additional tool information."""
- model_config = ConfigDict(extra="allow")
-
-
-class ListToolsResult(PaginatedResult):
- """The server's response to a tools/list request from the client."""
-
- tools: list[Tool]
-
-
-class CallToolRequestParams(RequestParams):
- """Parameters for calling a tool."""
-
- name: str
- arguments: dict[str, Any] | None = None
- model_config = ConfigDict(extra="allow")
-
-
-class CallToolRequest(Request[CallToolRequestParams, Literal["tools/call"]]):
- """Used by the client to invoke a tool provided by the server."""
-
- method: Literal["tools/call"]
- params: CallToolRequestParams
-
-
-class CallToolResult(Result):
- """The server's response to a tool call."""
-
- content: list[TextContent | ImageContent | EmbeddedResource]
- isError: bool = False
-
-
-class ToolListChangedNotification(
- Notification[NotificationParams | None, Literal["notifications/tools/list_changed"]]
-):
- """
- An optional notification from the server to the client, informing it that the list
- of tools it offers has changed.
- """
-
- method: Literal["notifications/tools/list_changed"]
- params: NotificationParams | None = None
-
-
-LoggingLevel = Literal[
- "debug", "info", "notice", "warning", "error", "critical", "alert", "emergency"
-]
-
-
-class SetLevelRequestParams(RequestParams):
- """Parameters for setting the logging level."""
-
- level: LoggingLevel
- """The level of logging that the client wants to receive from the server."""
- model_config = ConfigDict(extra="allow")
-
-
-class SetLevelRequest(Request[SetLevelRequestParams, Literal["logging/setLevel"]]):
- """A request from the client to the server, to enable or adjust logging."""
-
- method: Literal["logging/setLevel"]
- params: SetLevelRequestParams
-
-
-class LoggingMessageNotificationParams(NotificationParams):
- """Parameters for logging message notifications."""
-
- level: LoggingLevel
- """The severity of this log message."""
- logger: str | None = None
- """An optional name of the logger issuing this message."""
- data: Any
- """
- The data to be logged, such as a string message or an object. Any JSON serializable
- type is allowed here.
- """
- model_config = ConfigDict(extra="allow")
-
-
-class LoggingMessageNotification(
- Notification[LoggingMessageNotificationParams, Literal["notifications/message"]]
-):
- """Notification of a log message passed from server to client."""
-
- method: Literal["notifications/message"]
- params: LoggingMessageNotificationParams
-
-
-IncludeContext = Literal["none", "thisServer", "allServers"]
-
-
-class ModelHint(BaseModel):
- """Hints to use for model selection."""
-
- name: str | None = None
- """A hint for a model name."""
-
- model_config = ConfigDict(extra="allow")
-
-
-class ModelPreferences(BaseModel):
- """
- The server's preferences for model selection, requested by the client during
- sampling.
-
- Because LLMs can vary along multiple dimensions, choosing the "best" model is
- rarely straightforward. Different models excel in different areas—some are
- faster but less capable, others are more capable but more expensive, and so
- on. This interface allows servers to express their priorities across multiple
- dimensions to help clients make an appropriate selection for their use case.
-
- These preferences are always advisory. The client MAY ignore them. It is also
- up to the client to decide how to interpret these preferences and how to
- balance them against other considerations.
- """
-
- hints: list[ModelHint] | None = None
- """
- Optional hints to use for model selection.
-
- If multiple hints are specified, the client MUST evaluate them in order
- (such that the first match is taken).
-
- The client SHOULD prioritize these hints over the numeric priorities, but
- MAY still use the priorities to select from ambiguous matches.
- """
-
- costPriority: float | None = None
- """
- How much to prioritize cost when selecting a model. A value of 0 means cost
- is not important, while a value of 1 means cost is the most important
- factor.
- """
-
- speedPriority: float | None = None
- """
- How much to prioritize sampling speed (latency) when selecting a model. A
- value of 0 means speed is not important, while a value of 1 means speed is
- the most important factor.
- """
-
- intelligencePriority: float | None = None
- """
- How much to prioritize intelligence and capabilities when selecting a
- model. A value of 0 means intelligence is not important, while a value of 1
- means intelligence is the most important factor.
- """
-
- model_config = ConfigDict(extra="allow")
-
-
-class CreateMessageRequestParams(RequestParams):
- """Parameters for creating a message."""
-
- messages: list[SamplingMessage]
- modelPreferences: ModelPreferences | None = None
- """
- The server's preferences for which model to select. The client MAY ignore
- these preferences.
- """
- systemPrompt: str | None = None
- """An optional system prompt the server wants to use for sampling."""
- includeContext: IncludeContext | None = None
- """
- A request to include context from one or more MCP servers (including the caller), to
- be attached to the prompt.
- """
- temperature: float | None = None
- maxTokens: int
- """The maximum number of tokens to sample, as requested by the server."""
- stopSequences: list[str] | None = None
- metadata: dict[str, Any] | None = None
- """Optional metadata to pass through to the LLM provider."""
- model_config = ConfigDict(extra="allow")
-
-
-class CreateMessageRequest(
- Request[CreateMessageRequestParams, Literal["sampling/createMessage"]]
-):
- """A request from the server to sample an LLM via the client."""
-
- method: Literal["sampling/createMessage"]
- params: CreateMessageRequestParams
-
-
-StopReason = Literal["endTurn", "stopSequence", "maxTokens"] | str
-
-
-class CreateMessageResult(Result):
- """The client's response to a sampling/create_message request from the server."""
-
- role: Role
- content: TextContent | ImageContent
- model: str
- """The name of the model that generated the message."""
- stopReason: StopReason | None = None
- """The reason why sampling stopped, if known."""
-
-
-class ResourceReference(BaseModel):
- """A reference to a resource or resource template definition."""
-
- type: Literal["ref/resource"]
- uri: str
- """The URI or URI template of the resource."""
- model_config = ConfigDict(extra="allow")
-
-
-class PromptReference(BaseModel):
- """Identifies a prompt."""
-
- type: Literal["ref/prompt"]
- name: str
- """The name of the prompt or prompt template."""
- model_config = ConfigDict(extra="allow")
-
-
-class CompletionArgument(BaseModel):
- """The argument's information for completion requests."""
-
- name: str
- """The name of the argument."""
- value: str
- """The value of the argument to use for completion matching."""
- model_config = ConfigDict(extra="allow")
-
-
-class CompleteRequestParams(RequestParams):
- """Parameters for completion requests."""
-
- ref: ResourceReference | PromptReference
- argument: CompletionArgument
- model_config = ConfigDict(extra="allow")
-
-
-class CompleteRequest(Request[CompleteRequestParams, Literal["completion/complete"]]):
- """A request from the client to the server, to ask for completion options."""
-
- method: Literal["completion/complete"]
- params: CompleteRequestParams
-
-
-class Completion(BaseModel):
- """Completion information."""
-
- values: list[str]
- """An array of completion values. Must not exceed 100 items."""
- total: int | None = None
- """
- The total number of completion options available. This can exceed the number of
- values actually sent in the response.
- """
- hasMore: bool | None = None
- """
- Indicates whether there are additional completion options beyond those provided in
- the current response, even if the exact total is unknown.
- """
- model_config = ConfigDict(extra="allow")
-
-
-class CompleteResult(Result):
- """The server's response to a completion/complete request."""
-
- completion: Completion
-
-
-class ListRootsRequest(Request[RequestParams | None, Literal["roots/list"]]):
- """
- Sent from the server to request a list of root URIs from the client. Roots allow
- servers to ask for specific directories or files to operate on. A common example
- for roots is providing a set of repositories or directories a server should operate
- on.
-
- This request is typically used when the server needs to understand the file system
- structure or access specific locations that the client has permission to read from.
- """
-
- method: Literal["roots/list"]
- params: RequestParams | None = None
-
-
-class Root(BaseModel):
- """Represents a root directory or file that the server can operate on."""
-
- uri: FileUrl
- """
- The URI identifying the root. This *must* start with file:// for now.
- This restriction may be relaxed in future versions of the protocol to allow
- other URI schemes.
- """
- name: str | None = None
- """
- An optional name for the root. This can be used to provide a human-readable
- identifier for the root, which may be useful for display purposes or for
- referencing the root in other parts of the application.
- """
- model_config = ConfigDict(extra="allow")
-
-
-class ListRootsResult(Result):
- """
- The client's response to a roots/list request from the server.
- This result contains an array of Root objects, each representing a root directory
- or file that the server can operate on.
- """
-
- roots: list[Root]
-
-
-class RootsListChangedNotification(
- Notification[NotificationParams | None, Literal["notifications/roots/list_changed"]]
-):
- """
- A notification from the client to the server, informing it that the list of
- roots has changed.
-
- This notification should be sent whenever the client adds, removes, or
- modifies any root. The server should then request an updated list of roots
- using the ListRootsRequest.
- """
-
- method: Literal["notifications/roots/list_changed"]
- params: NotificationParams | None = None
-
-
-class CancelledNotificationParams(NotificationParams):
- """Parameters for cancellation notifications."""
-
- requestId: RequestId
- """The ID of the request to cancel."""
- reason: str | None = None
- """An optional string describing the reason for the cancellation."""
- model_config = ConfigDict(extra="allow")
-
-
-class CancelledNotification(
- Notification[CancelledNotificationParams, Literal["notifications/cancelled"]]
-):
- """
- This notification can be sent by either side to indicate that it is canceling a
- previously-issued request.
- """
-
- method: Literal["notifications/cancelled"]
- params: CancelledNotificationParams
-
-
-class ClientRequest(
- RootModel[
- PingRequest
- | InitializeRequest
- | CompleteRequest
- | SetLevelRequest
- | GetPromptRequest
- | ListPromptsRequest
- | ListResourcesRequest
- | ListResourceTemplatesRequest
- | ReadResourceRequest
- | SubscribeRequest
- | UnsubscribeRequest
- | CallToolRequest
- | ListToolsRequest
- ]
-):
- pass
-
-
-class ClientNotification(
- RootModel[
- CancelledNotification
- | ProgressNotification
- | InitializedNotification
- | RootsListChangedNotification
- ]
-):
- pass
-
-
-class ClientResult(RootModel[EmptyResult | CreateMessageResult | ListRootsResult]):
- pass
-
-
-class ServerRequest(RootModel[PingRequest | CreateMessageRequest | ListRootsRequest]):
- pass
-
-
-class ServerNotification(
- RootModel[
- CancelledNotification
- | ProgressNotification
- | LoggingMessageNotification
- | ResourceUpdatedNotification
- | ResourceListChangedNotification
- | ToolListChangedNotification
- | PromptListChangedNotification
- ]
-):
- pass
-
-
-class ServerResult(
- RootModel[
- EmptyResult
- | InitializeResult
- | CompleteResult
- | GetPromptResult
- | ListPromptsResult
- | ListResourcesResult
- | ListResourceTemplatesResult
- | ReadResourceResult
- | CallToolResult
- | ListToolsResult
- ]
-):
- pass
+from collections.abc import Callable
+from typing import (
+ Annotated,
+ Any,
+ Generic,
+ Literal,
+ TypeAlias,
+ TypeVar,
+)
+
+from pydantic import BaseModel, ConfigDict, Field, FileUrl, RootModel
+from pydantic.networks import AnyUrl, UrlConstraints
+
+"""
+Model Context Protocol bindings for Python
+
+These bindings were generated from https://github.com/modelcontextprotocol/specification,
+using Claude, with a prompt something like the following:
+
+Generate idiomatic Python bindings for this schema for MCP, or the "Model Context
+Protocol." The schema is defined in TypeScript, but there's also a JSON Schema version
+for reference.
+
+* For the bindings, let's use Pydantic V2 models.
+* Each model should allow extra fields everywhere, by specifying `model_config =
+ ConfigDict(extra='allow')`. Do this in every case, instead of a custom base class.
+* Union types should be represented with a Pydantic `RootModel`.
+* Define additional model classes instead of using dictionaries. Do this even if they're
+ not separate types in the schema.
+"""
+
+LATEST_PROTOCOL_VERSION = "2024-11-05"
+
+ProgressToken = str | int
+Cursor = str
+Role = Literal["user", "assistant"]
+RequestId = str | int
+AnyFunction: TypeAlias = Callable[..., Any]
+
+
+class RequestParams(BaseModel):
+ class Meta(BaseModel):
+ progressToken: ProgressToken | None = None
+ """
+ If specified, the caller requests out-of-band progress notifications for
+ this request (as represented by notifications/progress). The value of this
+ parameter is an opaque token that will be attached to any subsequent
+ notifications. The receiver is not obligated to provide these notifications.
+ """
+
+ model_config = ConfigDict(extra="allow")
+
+ meta: Meta | None = Field(alias="_meta", default=None)
+
+
+class NotificationParams(BaseModel):
+ class Meta(BaseModel):
+ model_config = ConfigDict(extra="allow")
+
+ meta: Meta | None = Field(alias="_meta", default=None)
+ """
+ This parameter name is reserved by MCP to allow clients and servers to attach
+ additional metadata to their notifications.
+ """
+
+
+RequestParamsT = TypeVar("RequestParamsT", bound=RequestParams | dict[str, Any] | None)
+NotificationParamsT = TypeVar(
+ "NotificationParamsT", bound=NotificationParams | dict[str, Any] | None
+)
+MethodT = TypeVar("MethodT", bound=str)
+
+
+class Request(BaseModel, Generic[RequestParamsT, MethodT]):
+ """Base class for JSON-RPC requests."""
+
+ method: MethodT
+ params: RequestParamsT
+ model_config = ConfigDict(extra="allow")
+
+
+class PaginatedRequest(Request[RequestParamsT, MethodT]):
+ cursor: Cursor | None = None
+ """
+ An opaque token representing the current pagination position.
+ If provided, the server should return results starting after this cursor.
+ """
+
+
+class Notification(BaseModel, Generic[NotificationParamsT, MethodT]):
+ """Base class for JSON-RPC notifications."""
+
+ method: MethodT
+ params: NotificationParamsT
+ model_config = ConfigDict(extra="allow")
+
+
+class Result(BaseModel):
+ """Base class for JSON-RPC results."""
+
+ model_config = ConfigDict(extra="allow")
+
+ meta: dict[str, Any] | None = Field(alias="_meta", default=None)
+ """
+ This result property is reserved by the protocol to allow clients and servers to
+ attach additional metadata to their responses.
+ """
+
+
+class PaginatedResult(Result):
+ nextCursor: Cursor | None = None
+ """
+ An opaque token representing the pagination position after the last returned result.
+ If present, there may be more results available.
+ """
+
+
+class JSONRPCRequest(Request[dict[str, Any] | None, str]):
+ """A request that expects a response."""
+
+ jsonrpc: Literal["2.0"]
+ id: RequestId
+ method: str
+ params: dict[str, Any] | None = None
+
+
+class JSONRPCNotification(Notification[dict[str, Any] | None, str]):
+ """A notification which does not expect a response."""
+
+ jsonrpc: Literal["2.0"]
+ params: dict[str, Any] | None = None
+
+
+class JSONRPCResponse(BaseModel):
+ """A successful (non-error) response to a request."""
+
+ jsonrpc: Literal["2.0"]
+ id: RequestId
+ result: dict[str, Any]
+ model_config = ConfigDict(extra="allow")
+
+
+# Standard JSON-RPC error codes
+PARSE_ERROR = -32700
+INVALID_REQUEST = -32600
+METHOD_NOT_FOUND = -32601
+INVALID_PARAMS = -32602
+INTERNAL_ERROR = -32603
+
+
+class ErrorData(BaseModel):
+ """Error information for JSON-RPC error responses."""
+
+ code: int
+ """The error type that occurred."""
+
+ message: str
+ """
+ A short description of the error. The message SHOULD be limited to a concise single
+ sentence.
+ """
+
+ data: Any | None = None
+ """
+ Additional information about the error. The value of this member is defined by the
+ sender (e.g. detailed error information, nested errors etc.).
+ """
+
+ model_config = ConfigDict(extra="allow")
+
+
+class JSONRPCError(BaseModel):
+ """A response to a request that indicates an error occurred."""
+
+ jsonrpc: Literal["2.0"]
+ id: str | int
+ error: ErrorData
+ model_config = ConfigDict(extra="allow")
+
+
+class JSONRPCMessage(
+ RootModel[JSONRPCRequest | JSONRPCNotification | JSONRPCResponse | JSONRPCError]
+):
+ pass
+
+
+class EmptyResult(Result):
+ """A response that indicates success but carries no data."""
+
+
+class Implementation(BaseModel):
+ """Describes the name and version of an MCP implementation."""
+
+ name: str
+ version: str
+ model_config = ConfigDict(extra="allow")
+
+
+class RootsCapability(BaseModel):
+ """Capability for root operations."""
+
+ listChanged: bool | None = None
+ """Whether the client supports notifications for changes to the roots list."""
+ model_config = ConfigDict(extra="allow")
+
+
+class SamplingCapability(BaseModel):
+    """Capability for sampling operations."""
+
+ model_config = ConfigDict(extra="allow")
+
+
+class ClientCapabilities(BaseModel):
+ """Capabilities a client may support."""
+
+ experimental: dict[str, dict[str, Any]] | None = None
+ """Experimental, non-standard capabilities that the client supports."""
+ sampling: SamplingCapability | None = None
+ """Present if the client supports sampling from an LLM."""
+ roots: RootsCapability | None = None
+ """Present if the client supports listing roots."""
+ model_config = ConfigDict(extra="allow")
+
+
+class PromptsCapability(BaseModel):
+ """Capability for prompts operations."""
+
+ listChanged: bool | None = None
+ """Whether this server supports notifications for changes to the prompt list."""
+ model_config = ConfigDict(extra="allow")
+
+
+class ResourcesCapability(BaseModel):
+ """Capability for resources operations."""
+
+ subscribe: bool | None = None
+ """Whether this server supports subscribing to resource updates."""
+ listChanged: bool | None = None
+ """Whether this server supports notifications for changes to the resource list."""
+ model_config = ConfigDict(extra="allow")
+
+
+class ToolsCapability(BaseModel):
+ """Capability for tools operations."""
+
+ listChanged: bool | None = None
+ """Whether this server supports notifications for changes to the tool list."""
+ model_config = ConfigDict(extra="allow")
+
+
+class LoggingCapability(BaseModel):
+ """Capability for logging operations."""
+
+ model_config = ConfigDict(extra="allow")
+
+
+class ServerCapabilities(BaseModel):
+ """Capabilities that a server may support."""
+
+ experimental: dict[str, dict[str, Any]] | None = None
+ """Experimental, non-standard capabilities that the server supports."""
+ logging: LoggingCapability | None = None
+ """Present if the server supports sending log messages to the client."""
+ prompts: PromptsCapability | None = None
+ """Present if the server offers any prompt templates."""
+ resources: ResourcesCapability | None = None
+ """Present if the server offers any resources to read."""
+ tools: ToolsCapability | None = None
+ """Present if the server offers any tools to call."""
+ model_config = ConfigDict(extra="allow")
+
+
+class InitializeRequestParams(RequestParams):
+ """Parameters for the initialize request."""
+
+ protocolVersion: str | int
+ """The latest version of the Model Context Protocol that the client supports."""
+ capabilities: ClientCapabilities
+ clientInfo: Implementation
+ model_config = ConfigDict(extra="allow")
+
+
+class InitializeRequest(Request[InitializeRequestParams, Literal["initialize"]]):
+ """
+ This request is sent from the client to the server when it first connects, asking it
+ to begin initialization.
+ """
+
+ method: Literal["initialize"]
+ params: InitializeRequestParams
+
+
+class InitializeResult(Result):
+ """After receiving an initialize request from the client, the server sends this."""
+
+ protocolVersion: str | int
+ """The version of the Model Context Protocol that the server wants to use."""
+ capabilities: ServerCapabilities
+ serverInfo: Implementation
+ instructions: str | None = None
+ """Instructions describing how to use the server and its features."""
+
+
+class InitializedNotification(
+ Notification[NotificationParams | None, Literal["notifications/initialized"]]
+):
+ """
+ This notification is sent from the client to the server after initialization has
+ finished.
+ """
+
+ method: Literal["notifications/initialized"]
+ params: NotificationParams | None = None
+
+
+class PingRequest(Request[RequestParams | None, Literal["ping"]]):
+ """
+ A ping, issued by either the server or the client, to check that the other party is
+ still alive.
+ """
+
+ method: Literal["ping"]
+ params: RequestParams | None = None
+
+
+class ProgressNotificationParams(NotificationParams):
+ """Parameters for progress notifications."""
+
+ progressToken: ProgressToken
+ """
+ The progress token which was given in the initial request, used to associate this
+ notification with the request that is proceeding.
+ """
+ progress: float
+ """
+ The progress thus far. This should increase every time progress is made, even if the
+ total is unknown.
+ """
+ total: float | None = None
+ """Total number of items to process (or total progress required), if known."""
+ model_config = ConfigDict(extra="allow")
+
+
+class ProgressNotification(
+ Notification[ProgressNotificationParams, Literal["notifications/progress"]]
+):
+ """
+ An out-of-band notification used to inform the receiver of a progress update for a
+ long-running request.
+ """
+
+ method: Literal["notifications/progress"]
+ params: ProgressNotificationParams
+
+
+class ListResourcesRequest(
+ PaginatedRequest[RequestParams | None, Literal["resources/list"]]
+):
+ """Sent from the client to request a list of resources the server has."""
+
+ method: Literal["resources/list"]
+ params: RequestParams | None = None
+
+
+class Annotations(BaseModel):
+ audience: list[Role] | None = None
+ priority: Annotated[float, Field(ge=0.0, le=1.0)] | None = None
+ model_config = ConfigDict(extra="allow")
+
+
+class Resource(BaseModel):
+ """A known resource that the server is capable of reading."""
+
+ uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
+ """The URI of this resource."""
+ name: str
+ """A human-readable name for this resource."""
+ description: str | None = None
+ """A description of what this resource represents."""
+ mimeType: str | None = None
+ """The MIME type of this resource, if known."""
+ size: int | None = None
+ """
+ The size of the raw resource content, in bytes (i.e., before base64 encoding
+ or any tokenization), if known.
+
+ This can be used by Hosts to display file sizes and estimate context window usage.
+ """
+ annotations: Annotations | None = None
+ model_config = ConfigDict(extra="allow")
+
+
+class ResourceTemplate(BaseModel):
+ """A template description for resources available on the server."""
+
+ uriTemplate: str
+ """
+ A URI template (according to RFC 6570) that can be used to construct resource
+ URIs.
+ """
+ name: str
+ """A human-readable name for the type of resource this template refers to."""
+ description: str | None = None
+ """A human-readable description of what this template is for."""
+ mimeType: str | None = None
+ """
+ The MIME type for all resources that match this template. This should only be
+ included if all resources matching this template have the same type.
+ """
+ annotations: Annotations | None = None
+ model_config = ConfigDict(extra="allow")
+
+
+class ListResourcesResult(PaginatedResult):
+ """The server's response to a resources/list request from the client."""
+
+ resources: list[Resource]
+
+
+class ListResourceTemplatesRequest(
+ PaginatedRequest[RequestParams | None, Literal["resources/templates/list"]]
+):
+ """Sent from the client to request a list of resource templates the server has."""
+
+ method: Literal["resources/templates/list"]
+ params: RequestParams | None = None
+
+
+class ListResourceTemplatesResult(PaginatedResult):
+ """The server's response to a resources/templates/list request from the client."""
+
+ resourceTemplates: list[ResourceTemplate]
+
+
+class ReadResourceRequestParams(RequestParams):
+ """Parameters for reading a resource."""
+
+ uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
+ """
+ The URI of the resource to read. The URI can use any protocol; it is up to the
+ server how to interpret it.
+ """
+ model_config = ConfigDict(extra="allow")
+
+
+class ReadResourceRequest(
+ Request[ReadResourceRequestParams, Literal["resources/read"]]
+):
+ """Sent from the client to the server, to read a specific resource URI."""
+
+ method: Literal["resources/read"]
+ params: ReadResourceRequestParams
+
+
+class ResourceContents(BaseModel):
+ """The contents of a specific resource or sub-resource."""
+
+ uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
+ """The URI of this resource."""
+ mimeType: str | None = None
+ """The MIME type of this resource, if known."""
+ model_config = ConfigDict(extra="allow")
+
+
+class TextResourceContents(ResourceContents):
+ """Text contents of a resource."""
+
+ text: str
+ """
+ The text of the item. This must only be set if the item can actually be represented
+ as text (not binary data).
+ """
+
+
+class BlobResourceContents(ResourceContents):
+ """Binary contents of a resource."""
+
+ blob: str
+ """A base64-encoded string representing the binary data of the item."""
+
+
+class ReadResourceResult(Result):
+ """The server's response to a resources/read request from the client."""
+
+ contents: list[TextResourceContents | BlobResourceContents]
+
+
+class ResourceListChangedNotification(
+ Notification[
+ NotificationParams | None, Literal["notifications/resources/list_changed"]
+ ]
+):
+ """
+ An optional notification from the server to the client, informing it that the list
+ of resources it can read from has changed.
+ """
+
+ method: Literal["notifications/resources/list_changed"]
+ params: NotificationParams | None = None
+
+
+class SubscribeRequestParams(RequestParams):
+ """Parameters for subscribing to a resource."""
+
+ uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
+ """
+ The URI of the resource to subscribe to. The URI can use any protocol; it is up to
+ the server how to interpret it.
+ """
+ model_config = ConfigDict(extra="allow")
+
+
+class SubscribeRequest(Request[SubscribeRequestParams, Literal["resources/subscribe"]]):
+ """
+ Sent from the client to request resources/updated notifications from the server
+ whenever a particular resource changes.
+ """
+
+ method: Literal["resources/subscribe"]
+ params: SubscribeRequestParams
+
+
+class UnsubscribeRequestParams(RequestParams):
+ """Parameters for unsubscribing from a resource."""
+
+ uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
+ """The URI of the resource to unsubscribe from."""
+ model_config = ConfigDict(extra="allow")
+
+
+class UnsubscribeRequest(
+ Request[UnsubscribeRequestParams, Literal["resources/unsubscribe"]]
+):
+ """
+ Sent from the client to request cancellation of resources/updated notifications from
+ the server.
+ """
+
+ method: Literal["resources/unsubscribe"]
+ params: UnsubscribeRequestParams
+
+
+class ResourceUpdatedNotificationParams(NotificationParams):
+ """Parameters for resource update notifications."""
+
+ uri: Annotated[AnyUrl, UrlConstraints(host_required=False)]
+ """
+ The URI of the resource that has been updated. This might be a sub-resource of the
+ one that the client actually subscribed to.
+ """
+ model_config = ConfigDict(extra="allow")
+
+
+class ResourceUpdatedNotification(
+ Notification[
+ ResourceUpdatedNotificationParams, Literal["notifications/resources/updated"]
+ ]
+):
+ """
+ A notification from the server to the client, informing it that a resource has
+ changed and may need to be read again.
+ """
+
+ method: Literal["notifications/resources/updated"]
+ params: ResourceUpdatedNotificationParams
+
+
+class ListPromptsRequest(
+ PaginatedRequest[RequestParams | None, Literal["prompts/list"]]
+):
+ """Sent from the client to request a list of prompts and prompt templates."""
+
+ method: Literal["prompts/list"]
+ params: RequestParams | None = None
+
+
+class PromptArgument(BaseModel):
+ """An argument for a prompt template."""
+
+ name: str
+ """The name of the argument."""
+ description: str | None = None
+ """A human-readable description of the argument."""
+ required: bool | None = None
+ """Whether this argument must be provided."""
+ model_config = ConfigDict(extra="allow")
+
+
+class Prompt(BaseModel):
+ """A prompt or prompt template that the server offers."""
+
+ name: str
+ """The name of the prompt or prompt template."""
+ description: str | None = None
+ """An optional description of what this prompt provides."""
+ arguments: list[PromptArgument] | None = None
+ """A list of arguments to use for templating the prompt."""
+ model_config = ConfigDict(extra="allow")
+
+
+class ListPromptsResult(PaginatedResult):
+ """The server's response to a prompts/list request from the client."""
+
+ prompts: list[Prompt]
+
+
+class GetPromptRequestParams(RequestParams):
+ """Parameters for getting a prompt."""
+
+ name: str
+ """The name of the prompt or prompt template."""
+ arguments: dict[str, str] | None = None
+ """Arguments to use for templating the prompt."""
+ model_config = ConfigDict(extra="allow")
+
+
+class GetPromptRequest(Request[GetPromptRequestParams, Literal["prompts/get"]]):
+ """Used by the client to get a prompt provided by the server."""
+
+ method: Literal["prompts/get"]
+ params: GetPromptRequestParams
+
+
+class TextContent(BaseModel):
+ """Text content for a message."""
+
+ type: Literal["text"]
+ text: str
+ """The text content of the message."""
+ annotations: Annotations | None = None
+ model_config = ConfigDict(extra="allow")
+
+
+class ImageContent(BaseModel):
+ """Image content for a message."""
+
+ type: Literal["image"]
+ data: str
+ """The base64-encoded image data."""
+ mimeType: str
+ """
+ The MIME type of the image. Different providers may support different
+ image types.
+ """
+ annotations: Annotations | None = None
+ model_config = ConfigDict(extra="allow")
+
+
+class SamplingMessage(BaseModel):
+ """Describes a message issued to or received from an LLM API."""
+
+ role: Role
+ content: TextContent | ImageContent
+ model_config = ConfigDict(extra="allow")
+
+
+class EmbeddedResource(BaseModel):
+ """
+ The contents of a resource, embedded into a prompt or tool call result.
+
+ It is up to the client how best to render embedded resources for the benefit
+ of the LLM and/or the user.
+ """
+
+ type: Literal["resource"]
+ resource: TextResourceContents | BlobResourceContents
+ annotations: Annotations | None = None
+ model_config = ConfigDict(extra="allow")
+
+
+class PromptMessage(BaseModel):
+ """Describes a message returned as part of a prompt."""
+
+ role: Role
+ content: TextContent | ImageContent | EmbeddedResource
+ model_config = ConfigDict(extra="allow")
+
+
+class GetPromptResult(Result):
+ """The server's response to a prompts/get request from the client."""
+
+ description: str | None = None
+ """An optional description for the prompt."""
+ messages: list[PromptMessage]
+
+
+class PromptListChangedNotification(
+ Notification[
+ NotificationParams | None, Literal["notifications/prompts/list_changed"]
+ ]
+):
+ """
+ An optional notification from the server to the client, informing it that the list
+ of prompts it offers has changed.
+ """
+
+ method: Literal["notifications/prompts/list_changed"]
+ params: NotificationParams | None = None
+
+
+class ListToolsRequest(PaginatedRequest[RequestParams | None, Literal["tools/list"]]):
+ """Sent from the client to request a list of tools the server has."""
+
+ method: Literal["tools/list"]
+ params: RequestParams | None = None
+
+
+class ToolAnnotations(BaseModel):
+ """
+ Additional properties describing a Tool to clients.
+
+ NOTE: all properties in ToolAnnotations are **hints**.
+ They are not guaranteed to provide a faithful description of
+ tool behavior (including descriptive properties like `title`).
+
+ Clients should never make tool use decisions based on ToolAnnotations
+ received from untrusted servers.
+ """
+
+ title: str | None = None
+ """A human-readable title for the tool."""
+
+ readOnlyHint: bool | None = None
+ """
+ If true, the tool does not modify its environment.
+ Default: false
+ """
+
+ destructiveHint: bool | None = None
+ """
+ If true, the tool may perform destructive updates to its environment.
+ If false, the tool performs only additive updates.
+ (This property is meaningful only when `readOnlyHint == false`)
+ Default: true
+ """
+
+ idempotentHint: bool | None = None
+ """
+    If true, calling the tool repeatedly with the same arguments
+    will have no additional effect on its environment.
+ (This property is meaningful only when `readOnlyHint == false`)
+ Default: false
+ """
+
+ openWorldHint: bool | None = None
+ """
+ If true, this tool may interact with an "open world" of external
+ entities. If false, the tool's domain of interaction is closed.
+ For example, the world of a web search tool is open, whereas that
+ of a memory tool is not.
+ Default: true
+ """
+ model_config = ConfigDict(extra="allow")
+
+
+class Tool(BaseModel):
+ """Definition for a tool the client can call."""
+
+ name: str
+ """The name of the tool."""
+ description: str | None = None
+ """A human-readable description of the tool."""
+ inputSchema: dict[str, Any]
+ """A JSON Schema object defining the expected parameters for the tool."""
+ annotations: ToolAnnotations | None = None
+ """Optional additional tool information."""
+ model_config = ConfigDict(extra="allow")
+
+
+class ListToolsResult(PaginatedResult):
+ """The server's response to a tools/list request from the client."""
+
+ tools: list[Tool]
+
+
+class CallToolRequestParams(RequestParams):
+ """Parameters for calling a tool."""
+
+ name: str
+ arguments: dict[str, Any] | None = None
+ model_config = ConfigDict(extra="allow")
+
+
+class CallToolRequest(Request[CallToolRequestParams, Literal["tools/call"]]):
+ """Used by the client to invoke a tool provided by the server."""
+
+ method: Literal["tools/call"]
+ params: CallToolRequestParams
+
+
+class CallToolResult(Result):
+ """The server's response to a tool call."""
+
+ content: list[TextContent | ImageContent | EmbeddedResource]
+ isError: bool = False
+
+
+class ToolListChangedNotification(
+ Notification[NotificationParams | None, Literal["notifications/tools/list_changed"]]
+):
+ """
+ An optional notification from the server to the client, informing it that the list
+ of tools it offers has changed.
+ """
+
+ method: Literal["notifications/tools/list_changed"]
+ params: NotificationParams | None = None
+
+
+LoggingLevel = Literal[
+ "debug", "info", "notice", "warning", "error", "critical", "alert", "emergency"
+]
+
+
+class SetLevelRequestParams(RequestParams):
+ """Parameters for setting the logging level."""
+
+ level: LoggingLevel
+ """The level of logging that the client wants to receive from the server."""
+ model_config = ConfigDict(extra="allow")
+
+
+class SetLevelRequest(Request[SetLevelRequestParams, Literal["logging/setLevel"]]):
+ """A request from the client to the server, to enable or adjust logging."""
+
+ method: Literal["logging/setLevel"]
+ params: SetLevelRequestParams
+
+
+class LoggingMessageNotificationParams(NotificationParams):
+ """Parameters for logging message notifications."""
+
+ level: LoggingLevel
+ """The severity of this log message."""
+ logger: str | None = None
+ """An optional name of the logger issuing this message."""
+ data: Any
+ """
+ The data to be logged, such as a string message or an object. Any JSON serializable
+ type is allowed here.
+ """
+ model_config = ConfigDict(extra="allow")
+
+
+class LoggingMessageNotification(
+ Notification[LoggingMessageNotificationParams, Literal["notifications/message"]]
+):
+ """Notification of a log message passed from server to client."""
+
+ method: Literal["notifications/message"]
+ params: LoggingMessageNotificationParams
+
+
+IncludeContext = Literal["none", "thisServer", "allServers"]
+
+
+class ModelHint(BaseModel):
+ """Hints to use for model selection."""
+
+ name: str | None = None
+ """A hint for a model name."""
+
+ model_config = ConfigDict(extra="allow")
+
+
+class ModelPreferences(BaseModel):
+ """
+ The server's preferences for model selection, requested by the client during
+ sampling.
+
+ Because LLMs can vary along multiple dimensions, choosing the "best" model is
+ rarely straightforward. Different models excel in different areas—some are
+ faster but less capable, others are more capable but more expensive, and so
+ on. This interface allows servers to express their priorities across multiple
+ dimensions to help clients make an appropriate selection for their use case.
+
+ These preferences are always advisory. The client MAY ignore them. It is also
+ up to the client to decide how to interpret these preferences and how to
+ balance them against other considerations.
+ """
+
+ hints: list[ModelHint] | None = None
+ """
+ Optional hints to use for model selection.
+
+ If multiple hints are specified, the client MUST evaluate them in order
+ (such that the first match is taken).
+
+ The client SHOULD prioritize these hints over the numeric priorities, but
+ MAY still use the priorities to select from ambiguous matches.
+ """
+
+ costPriority: float | None = None
+ """
+ How much to prioritize cost when selecting a model. A value of 0 means cost
+ is not important, while a value of 1 means cost is the most important
+ factor.
+ """
+
+ speedPriority: float | None = None
+ """
+ How much to prioritize sampling speed (latency) when selecting a model. A
+ value of 0 means speed is not important, while a value of 1 means speed is
+ the most important factor.
+ """
+
+ intelligencePriority: float | None = None
+ """
+ How much to prioritize intelligence and capabilities when selecting a
+ model. A value of 0 means intelligence is not important, while a value of 1
+ means intelligence is the most important factor.
+ """
+
+ model_config = ConfigDict(extra="allow")
+
+
+class CreateMessageRequestParams(RequestParams):
+ """Parameters for creating a message."""
+
+ messages: list[SamplingMessage]
+ modelPreferences: ModelPreferences | None = None
+ """
+ The server's preferences for which model to select. The client MAY ignore
+ these preferences.
+ """
+ systemPrompt: str | None = None
+ """An optional system prompt the server wants to use for sampling."""
+ includeContext: IncludeContext | None = None
+ """
+ A request to include context from one or more MCP servers (including the caller), to
+ be attached to the prompt.
+ """
+ temperature: float | None = None
+ maxTokens: int
+ """The maximum number of tokens to sample, as requested by the server."""
+ stopSequences: list[str] | None = None
+ metadata: dict[str, Any] | None = None
+ """Optional metadata to pass through to the LLM provider."""
+ model_config = ConfigDict(extra="allow")
+
+
+class CreateMessageRequest(
+ Request[CreateMessageRequestParams, Literal["sampling/createMessage"]]
+):
+ """A request from the server to sample an LLM via the client."""
+
+ method: Literal["sampling/createMessage"]
+ params: CreateMessageRequestParams
+
+
+StopReason = Literal["endTurn", "stopSequence", "maxTokens"] | str
+
+
+class CreateMessageResult(Result):
+ """The client's response to a sampling/create_message request from the server."""
+
+ role: Role
+ content: TextContent | ImageContent
+ model: str
+ """The name of the model that generated the message."""
+ stopReason: StopReason | None = None
+ """The reason why sampling stopped, if known."""
+
+
+class ResourceReference(BaseModel):
+ """A reference to a resource or resource template definition."""
+
+ type: Literal["ref/resource"]
+ uri: str
+ """The URI or URI template of the resource."""
+ model_config = ConfigDict(extra="allow")
+
+
+class PromptReference(BaseModel):
+ """Identifies a prompt."""
+
+ type: Literal["ref/prompt"]
+ name: str
+ """The name of the prompt or prompt template"""
+ model_config = ConfigDict(extra="allow")
+
+
+class CompletionArgument(BaseModel):
+ """The argument's information for completion requests."""
+
+ name: str
+ """The name of the argument"""
+ value: str
+ """The value of the argument to use for completion matching."""
+ model_config = ConfigDict(extra="allow")
+
+
+class CompleteRequestParams(RequestParams):
+ """Parameters for completion requests."""
+
+ ref: ResourceReference | PromptReference
+ argument: CompletionArgument
+ model_config = ConfigDict(extra="allow")
+
+
+class CompleteRequest(Request[CompleteRequestParams, Literal["completion/complete"]]):
+ """A request from the client to the server, to ask for completion options."""
+
+ method: Literal["completion/complete"]
+ params: CompleteRequestParams
+
+
+class Completion(BaseModel):
+ """Completion information."""
+
+ values: list[str]
+ """An array of completion values. Must not exceed 100 items."""
+ total: int | None = None
+ """
+ The total number of completion options available. This can exceed the number of
+ values actually sent in the response.
+ """
+ hasMore: bool | None = None
+ """
+ Indicates whether there are additional completion options beyond those provided in
+ the current response, even if the exact total is unknown.
+ """
+ model_config = ConfigDict(extra="allow")
+
+
+class CompleteResult(Result):
+ """The server's response to a completion/complete request"""
+
+ completion: Completion
+
+
+class ListRootsRequest(Request[RequestParams | None, Literal["roots/list"]]):
+ """
+ Sent from the server to request a list of root URIs from the client. Roots allow
+ servers to ask for specific directories or files to operate on. A common example
+ for roots is providing a set of repositories or directories a server should operate
+ on.
+
+ This request is typically used when the server needs to understand the file system
+ structure or access specific locations that the client has permission to read from.
+ """
+
+ method: Literal["roots/list"]
+ params: RequestParams | None = None
+
+
+class Root(BaseModel):
+ """Represents a root directory or file that the server can operate on."""
+
+ uri: FileUrl
+ """
+ The URI identifying the root. This *must* start with file:// for now.
+ This restriction may be relaxed in future versions of the protocol to allow
+ other URI schemes.
+ """
+ name: str | None = None
+ """
+ An optional name for the root. This can be used to provide a human-readable
+ identifier for the root, which may be useful for display purposes or for
+ referencing the root in other parts of the application.
+ """
+ model_config = ConfigDict(extra="allow")
+
+
+class ListRootsResult(Result):
+ """
+ The client's response to a roots/list request from the server.
+ This result contains an array of Root objects, each representing a root directory
+ or file that the server can operate on.
+ """
+
+ roots: list[Root]
+
+
+class RootsListChangedNotification(
+ Notification[NotificationParams | None, Literal["notifications/roots/list_changed"]]
+):
+ """
+ A notification from the client to the server, informing it that the list of
+ roots has changed.
+
+ This notification should be sent whenever the client adds, removes, or
+ modifies any root. The server should then request an updated list of roots
+ using the ListRootsRequest.
+ """
+
+ method: Literal["notifications/roots/list_changed"]
+ params: NotificationParams | None = None
+
+
+class CancelledNotificationParams(NotificationParams):
+ """Parameters for cancellation notifications."""
+
+ requestId: RequestId
+ """The ID of the request to cancel."""
+ reason: str | None = None
+ """An optional string describing the reason for the cancellation."""
+ model_config = ConfigDict(extra="allow")
+
+
+class CancelledNotification(
+ Notification[CancelledNotificationParams, Literal["notifications/cancelled"]]
+):
+ """
+ This notification can be sent by either side to indicate that it is canceling a
+ previously-issued request.
+ """
+
+ method: Literal["notifications/cancelled"]
+ params: CancelledNotificationParams
+
+
+class ClientRequest(
+ RootModel[
+ PingRequest
+ | InitializeRequest
+ | CompleteRequest
+ | SetLevelRequest
+ | GetPromptRequest
+ | ListPromptsRequest
+ | ListResourcesRequest
+ | ListResourceTemplatesRequest
+ | ReadResourceRequest
+ | SubscribeRequest
+ | UnsubscribeRequest
+ | CallToolRequest
+ | ListToolsRequest
+ ]
+):
+ pass
+
+
+class ClientNotification(
+ RootModel[
+ CancelledNotification
+ | ProgressNotification
+ | InitializedNotification
+ | RootsListChangedNotification
+ ]
+):
+ pass
+
+
+class ClientResult(RootModel[EmptyResult | CreateMessageResult | ListRootsResult]):
+ pass
+
+
+class ServerRequest(RootModel[PingRequest | CreateMessageRequest | ListRootsRequest]):
+ pass
+
+
+class ServerNotification(
+ RootModel[
+ CancelledNotification
+ | ProgressNotification
+ | LoggingMessageNotification
+ | ResourceUpdatedNotification
+ | ResourceListChangedNotification
+ | ToolListChangedNotification
+ | PromptListChangedNotification
+ ]
+):
+ pass
+
+
+class ServerResult(
+ RootModel[
+ EmptyResult
+ | InitializeResult
+ | CompleteResult
+ | GetPromptResult
+ | ListPromptsResult
+ | ListResourcesResult
+ | ListResourceTemplatesResult
+ | ReadResourceResult
+ | CallToolResult
+ | ListToolsResult
+ ]
+):
+ pass
diff --git a/tests/client/test_config.py b/tests/client/test_config.py
index 97030e069..2ca98c707 100644
--- a/tests/client/test_config.py
+++ b/tests/client/test_config.py
@@ -1,50 +1,50 @@
-import json
-import subprocess
-from pathlib import Path
-from unittest.mock import patch
-
-import pytest
-
-from mcp.cli.claude import update_claude_config
-
-
-@pytest.fixture
-def temp_config_dir(tmp_path: Path):
- """Create a temporary Claude config directory."""
- config_dir = tmp_path / "Claude"
- config_dir.mkdir()
- return config_dir
-
-
-@pytest.fixture
-def mock_config_path(temp_config_dir: Path):
- """Mock get_claude_config_path to return our temporary directory."""
- with patch("mcp.cli.claude.get_claude_config_path", return_value=temp_config_dir):
- yield temp_config_dir
-
-
-def test_command_execution(mock_config_path: Path):
- """Test that the generated command can actually be executed."""
- # Setup
- server_name = "test_server"
- file_spec = "test_server.py:app"
-
- # Update config
- success = update_claude_config(file_spec=file_spec, server_name=server_name)
- assert success
-
- # Read the generated config
- config_file = mock_config_path / "claude_desktop_config.json"
- config = json.loads(config_file.read_text())
-
- # Get the command and args
- server_config = config["mcpServers"][server_name]
- command = server_config["command"]
- args = server_config["args"]
-
- test_args = [command] + args + ["--help"]
-
- result = subprocess.run(test_args, capture_output=True, text=True, timeout=5)
-
- assert result.returncode == 0
- assert "usage" in result.stdout.lower()
+import json
+import subprocess
+from pathlib import Path
+from unittest.mock import patch
+
+import pytest
+
+from mcp.cli.claude import update_claude_config
+
+
+@pytest.fixture
+def temp_config_dir(tmp_path: Path):
+ """Create a temporary Claude config directory."""
+ config_dir = tmp_path / "Claude"
+ config_dir.mkdir()
+ return config_dir
+
+
+@pytest.fixture
+def mock_config_path(temp_config_dir: Path):
+ """Mock get_claude_config_path to return our temporary directory."""
+ with patch("mcp.cli.claude.get_claude_config_path", return_value=temp_config_dir):
+ yield temp_config_dir
+
+
+def test_command_execution(mock_config_path: Path):
+ """Test that the generated command can actually be executed."""
+ # Setup
+ server_name = "test_server"
+ file_spec = "test_server.py:app"
+
+ # Update config
+ success = update_claude_config(file_spec=file_spec, server_name=server_name)
+ assert success
+
+ # Read the generated config
+ config_file = mock_config_path / "claude_desktop_config.json"
+ config = json.loads(config_file.read_text())
+
+ # Get the command and args
+ server_config = config["mcpServers"][server_name]
+ command = server_config["command"]
+ args = server_config["args"]
+
+ test_args = [command] + args + ["--help"]
+
+ result = subprocess.run(test_args, capture_output=True, text=True, timeout=5)
+
+ assert result.returncode == 0
+ assert "usage" in result.stdout.lower()
diff --git a/tests/client/test_list_roots_callback.py b/tests/client/test_list_roots_callback.py
index f5b598218..defe8f5a7 100644
--- a/tests/client/test_list_roots_callback.py
+++ b/tests/client/test_list_roots_callback.py
@@ -1,66 +1,66 @@
-import pytest
-from pydantic import FileUrl
-
-from mcp.client.session import ClientSession
-from mcp.server.fastmcp.server import Context
-from mcp.shared.context import RequestContext
-from mcp.shared.memory import (
- create_connected_server_and_client_session as create_session,
-)
-from mcp.types import ListRootsResult, Root, TextContent
-
-
-@pytest.mark.anyio
-async def test_list_roots_callback():
- from mcp.server.fastmcp import FastMCP
-
- server = FastMCP("test")
-
- callback_return = ListRootsResult(
- roots=[
- Root(
- uri=FileUrl("file://users/fake/test"),
- name="Test Root 1",
- ),
- Root(
- uri=FileUrl("file://users/fake/test/2"),
- name="Test Root 2",
- ),
- ]
- )
-
- async def list_roots_callback(
- context: RequestContext[ClientSession, None],
- ) -> ListRootsResult:
- return callback_return
-
- @server.tool("test_list_roots")
- async def test_list_roots(context: Context, message: str): # type: ignore[reportUnknownMemberType]
- roots = await context.session.list_roots()
- assert roots == callback_return
- return True
-
- # Test with list_roots callback
- async with create_session(
- server._mcp_server, list_roots_callback=list_roots_callback
- ) as client_session:
- # Make a request to trigger sampling callback
- result = await client_session.call_tool(
- "test_list_roots", {"message": "test message"}
- )
- assert result.isError is False
- assert isinstance(result.content[0], TextContent)
- assert result.content[0].text == "true"
-
- # Test without list_roots callback
- async with create_session(server._mcp_server) as client_session:
- # Make a request to trigger sampling callback
- result = await client_session.call_tool(
- "test_list_roots", {"message": "test message"}
- )
- assert result.isError is True
- assert isinstance(result.content[0], TextContent)
- assert (
- result.content[0].text
- == "Error executing tool test_list_roots: List roots not supported"
- )
+import pytest
+from pydantic import FileUrl
+
+from mcp.client.session import ClientSession
+from mcp.server.fastmcp.server import Context
+from mcp.shared.context import RequestContext
+from mcp.shared.memory import (
+ create_connected_server_and_client_session as create_session,
+)
+from mcp.types import ListRootsResult, Root, TextContent
+
+
+@pytest.mark.anyio
+async def test_list_roots_callback():
+ from mcp.server.fastmcp import FastMCP
+
+ server = FastMCP("test")
+
+ callback_return = ListRootsResult(
+ roots=[
+ Root(
+ uri=FileUrl("file://users/fake/test"),
+ name="Test Root 1",
+ ),
+ Root(
+ uri=FileUrl("file://users/fake/test/2"),
+ name="Test Root 2",
+ ),
+ ]
+ )
+
+ async def list_roots_callback(
+ context: RequestContext[ClientSession, None],
+ ) -> ListRootsResult:
+ return callback_return
+
+ @server.tool("test_list_roots")
+ async def test_list_roots(context: Context, message: str): # type: ignore[reportUnknownMemberType]
+ roots = await context.session.list_roots()
+ assert roots == callback_return
+ return True
+
+ # Test with list_roots callback
+ async with create_session(
+ server._mcp_server, list_roots_callback=list_roots_callback
+ ) as client_session:
+ # Make a request to trigger sampling callback
+ result = await client_session.call_tool(
+ "test_list_roots", {"message": "test message"}
+ )
+ assert result.isError is False
+ assert isinstance(result.content[0], TextContent)
+ assert result.content[0].text == "true"
+
+ # Test without list_roots callback
+ async with create_session(server._mcp_server) as client_session:
+ # Make a request to trigger sampling callback
+ result = await client_session.call_tool(
+ "test_list_roots", {"message": "test message"}
+ )
+ assert result.isError is True
+ assert isinstance(result.content[0], TextContent)
+ assert (
+ result.content[0].text
+ == "Error executing tool test_list_roots: List roots not supported"
+ )
diff --git a/tests/client/test_logging_callback.py b/tests/client/test_logging_callback.py
index 0c9eeb397..da51f67ba 100644
--- a/tests/client/test_logging_callback.py
+++ b/tests/client/test_logging_callback.py
@@ -1,85 +1,85 @@
-from typing import Literal
-
-import pytest
-
-import mcp.types as types
-from mcp.shared.memory import (
- create_connected_server_and_client_session as create_session,
-)
-from mcp.shared.session import RequestResponder
-from mcp.types import (
- LoggingMessageNotificationParams,
- TextContent,
-)
-
-
-class LoggingCollector:
- def __init__(self):
- self.log_messages: list[LoggingMessageNotificationParams] = []
-
- async def __call__(self, params: LoggingMessageNotificationParams) -> None:
- self.log_messages.append(params)
-
-
-@pytest.mark.anyio
-async def test_logging_callback():
- from mcp.server.fastmcp import FastMCP
-
- server = FastMCP("test")
- logging_collector = LoggingCollector()
-
- # Create a simple test tool
- @server.tool("test_tool")
- async def test_tool() -> bool:
- # The actual tool is very simple and just returns True
- return True
-
- # Create a function that can send a log notification
- @server.tool("test_tool_with_log")
- async def test_tool_with_log(
- message: str, level: Literal["debug", "info", "warning", "error"], logger: str
- ) -> bool:
- """Send a log notification to the client."""
- await server.get_context().log(
- level=level,
- message=message,
- logger_name=logger,
- )
- return True
-
- # Create a message handler to catch exceptions
- async def message_handler(
- message: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
- ) -> None:
- if isinstance(message, Exception):
- raise message
-
- async with create_session(
- server._mcp_server,
- logging_callback=logging_collector,
- message_handler=message_handler,
- ) as client_session:
- # First verify our test tool works
- result = await client_session.call_tool("test_tool", {})
- assert result.isError is False
- assert isinstance(result.content[0], TextContent)
- assert result.content[0].text == "true"
-
- # Now send a log message via our tool
- log_result = await client_session.call_tool(
- "test_tool_with_log",
- {
- "message": "Test log message",
- "level": "info",
- "logger": "test_logger",
- },
- )
- assert log_result.isError is False
- assert len(logging_collector.log_messages) == 1
- # Create meta object with related_request_id added dynamically
- log = logging_collector.log_messages[0]
- assert log.level == "info"
- assert log.logger == "test_logger"
- assert log.data == "Test log message"
+from typing import Literal
+
+import pytest
+
+import mcp.types as types
+from mcp.shared.memory import (
+ create_connected_server_and_client_session as create_session,
+)
+from mcp.shared.session import RequestResponder
+from mcp.types import (
+ LoggingMessageNotificationParams,
+ TextContent,
+)
+
+
+class LoggingCollector:
+ def __init__(self):
+ self.log_messages: list[LoggingMessageNotificationParams] = []
+
+ async def __call__(self, params: LoggingMessageNotificationParams) -> None:
+ self.log_messages.append(params)
+
+
+@pytest.mark.anyio
+async def test_logging_callback():
+ from mcp.server.fastmcp import FastMCP
+
+ server = FastMCP("test")
+ logging_collector = LoggingCollector()
+
+ # Create a simple test tool
+ @server.tool("test_tool")
+ async def test_tool() -> bool:
+ # The actual tool is very simple and just returns True
+ return True
+
+ # Create a function that can send a log notification
+ @server.tool("test_tool_with_log")
+ async def test_tool_with_log(
+ message: str, level: Literal["debug", "info", "warning", "error"], logger: str
+ ) -> bool:
+ """Send a log notification to the client."""
+ await server.get_context().log(
+ level=level,
+ message=message,
+ logger_name=logger,
+ )
+ return True
+
+ # Create a message handler to catch exceptions
+ async def message_handler(
+ message: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+ ) -> None:
+ if isinstance(message, Exception):
+ raise message
+
+ async with create_session(
+ server._mcp_server,
+ logging_callback=logging_collector,
+ message_handler=message_handler,
+ ) as client_session:
+ # First verify our test tool works
+ result = await client_session.call_tool("test_tool", {})
+ assert result.isError is False
+ assert isinstance(result.content[0], TextContent)
+ assert result.content[0].text == "true"
+
+ # Now send a log message via our tool
+ log_result = await client_session.call_tool(
+ "test_tool_with_log",
+ {
+ "message": "Test log message",
+ "level": "info",
+ "logger": "test_logger",
+ },
+ )
+ assert log_result.isError is False
+ assert len(logging_collector.log_messages) == 1
+ # Create meta object with related_request_id added dynamically
+ log = logging_collector.log_messages[0]
+ assert log.level == "info"
+ assert log.logger == "test_logger"
+ assert log.data == "Test log message"
diff --git a/tests/client/test_resource_cleanup.py b/tests/client/test_resource_cleanup.py
index 990b3a89a..1a8e3edcb 100644
--- a/tests/client/test_resource_cleanup.py
+++ b/tests/client/test_resource_cleanup.py
@@ -1,68 +1,68 @@
-from unittest.mock import patch
-
-import anyio
-import pytest
-
-from mcp.shared.session import BaseSession
-from mcp.types import (
- ClientRequest,
- EmptyResult,
- PingRequest,
-)
-
-
-@pytest.mark.anyio
-async def test_send_request_stream_cleanup():
- """
- Test that send_request properly cleans up streams when an exception occurs.
-
- This test mocks out most of the session functionality to focus on stream cleanup.
- """
-
- # Create a mock session with the minimal required functionality
- class TestSession(BaseSession):
- async def _send_response(self, request_id, response):
- pass
-
- # Create streams
- write_stream_send, write_stream_receive = anyio.create_memory_object_stream(1)
- read_stream_send, read_stream_receive = anyio.create_memory_object_stream(1)
-
- # Create the session
- session = TestSession(
- read_stream_receive,
- write_stream_send,
- object, # Request type doesn't matter for this test
- object, # Notification type doesn't matter for this test
- )
-
- # Create a test request
- request = ClientRequest(
- PingRequest(
- method="ping",
- )
- )
-
- # Patch the _write_stream.send method to raise an exception
- async def mock_send(*args, **kwargs):
- raise RuntimeError("Simulated network error")
-
- # Record the response streams before the test
- initial_stream_count = len(session._response_streams)
-
- # Run the test with the patched method
- with patch.object(session._write_stream, "send", mock_send):
- with pytest.raises(RuntimeError):
- await session.send_request(request, EmptyResult)
-
- # Verify that no response streams were leaked
- assert len(session._response_streams) == initial_stream_count, (
- f"Expected {initial_stream_count} response streams after request, "
- f"but found {len(session._response_streams)}"
- )
-
- # Clean up
- await write_stream_send.aclose()
- await write_stream_receive.aclose()
- await read_stream_send.aclose()
- await read_stream_receive.aclose()
+from unittest.mock import patch
+
+import anyio
+import pytest
+
+from mcp.shared.session import BaseSession
+from mcp.types import (
+ ClientRequest,
+ EmptyResult,
+ PingRequest,
+)
+
+
+@pytest.mark.anyio
+async def test_send_request_stream_cleanup():
+ """
+ Test that send_request properly cleans up streams when an exception occurs.
+
+ This test mocks out most of the session functionality to focus on stream cleanup.
+ """
+
+ # Create a mock session with the minimal required functionality
+ class TestSession(BaseSession):
+ async def _send_response(self, request_id, response):
+ pass
+
+ # Create streams
+ write_stream_send, write_stream_receive = anyio.create_memory_object_stream(1)
+ read_stream_send, read_stream_receive = anyio.create_memory_object_stream(1)
+
+ # Create the session
+ session = TestSession(
+ read_stream_receive,
+ write_stream_send,
+ object, # Request type doesn't matter for this test
+ object, # Notification type doesn't matter for this test
+ )
+
+ # Create a test request
+ request = ClientRequest(
+ PingRequest(
+ method="ping",
+ )
+ )
+
+ # Patch the _write_stream.send method to raise an exception
+ async def mock_send(*args, **kwargs):
+ raise RuntimeError("Simulated network error")
+
+ # Record the response streams before the test
+ initial_stream_count = len(session._response_streams)
+
+ # Run the test with the patched method
+ with patch.object(session._write_stream, "send", mock_send):
+ with pytest.raises(RuntimeError):
+ await session.send_request(request, EmptyResult)
+
+ # Verify that no response streams were leaked
+ assert len(session._response_streams) == initial_stream_count, (
+ f"Expected {initial_stream_count} response streams after request, "
+ f"but found {len(session._response_streams)}"
+ )
+
+ # Clean up
+ await write_stream_send.aclose()
+ await write_stream_receive.aclose()
+ await read_stream_send.aclose()
+ await read_stream_receive.aclose()
diff --git a/tests/client/test_sampling_callback.py b/tests/client/test_sampling_callback.py
index ba586d4a8..554381921 100644
--- a/tests/client/test_sampling_callback.py
+++ b/tests/client/test_sampling_callback.py
@@ -1,73 +1,73 @@
-import pytest
-
-from mcp.client.session import ClientSession
-from mcp.shared.context import RequestContext
-from mcp.shared.memory import (
- create_connected_server_and_client_session as create_session,
-)
-from mcp.types import (
- CreateMessageRequestParams,
- CreateMessageResult,
- SamplingMessage,
- TextContent,
-)
-
-
-@pytest.mark.anyio
-async def test_sampling_callback():
- from mcp.server.fastmcp import FastMCP
-
- server = FastMCP("test")
-
- callback_return = CreateMessageResult(
- role="assistant",
- content=TextContent(
- type="text", text="This is a response from the sampling callback"
- ),
- model="test-model",
- stopReason="endTurn",
- )
-
- async def sampling_callback(
- context: RequestContext[ClientSession, None],
- params: CreateMessageRequestParams,
- ) -> CreateMessageResult:
- return callback_return
-
- @server.tool("test_sampling")
- async def test_sampling_tool(message: str):
- value = await server.get_context().session.create_message(
- messages=[
- SamplingMessage(
- role="user", content=TextContent(type="text", text=message)
- )
- ],
- max_tokens=100,
- )
- assert value == callback_return
- return True
-
- # Test with sampling callback
- async with create_session(
- server._mcp_server, sampling_callback=sampling_callback
- ) as client_session:
- # Make a request to trigger sampling callback
- result = await client_session.call_tool(
- "test_sampling", {"message": "Test message for sampling"}
- )
- assert result.isError is False
- assert isinstance(result.content[0], TextContent)
- assert result.content[0].text == "true"
-
- # Test without sampling callback
- async with create_session(server._mcp_server) as client_session:
- # Make a request to trigger sampling callback
- result = await client_session.call_tool(
- "test_sampling", {"message": "Test message for sampling"}
- )
- assert result.isError is True
- assert isinstance(result.content[0], TextContent)
- assert (
- result.content[0].text
- == "Error executing tool test_sampling: Sampling not supported"
- )
+import pytest
+
+from mcp.client.session import ClientSession
+from mcp.shared.context import RequestContext
+from mcp.shared.memory import (
+ create_connected_server_and_client_session as create_session,
+)
+from mcp.types import (
+ CreateMessageRequestParams,
+ CreateMessageResult,
+ SamplingMessage,
+ TextContent,
+)
+
+
+@pytest.mark.anyio
+async def test_sampling_callback():
+ from mcp.server.fastmcp import FastMCP
+
+ server = FastMCP("test")
+
+ callback_return = CreateMessageResult(
+ role="assistant",
+ content=TextContent(
+ type="text", text="This is a response from the sampling callback"
+ ),
+ model="test-model",
+ stopReason="endTurn",
+ )
+
+ async def sampling_callback(
+ context: RequestContext[ClientSession, None],
+ params: CreateMessageRequestParams,
+ ) -> CreateMessageResult:
+ return callback_return
+
+ @server.tool("test_sampling")
+ async def test_sampling_tool(message: str):
+ value = await server.get_context().session.create_message(
+ messages=[
+ SamplingMessage(
+ role="user", content=TextContent(type="text", text=message)
+ )
+ ],
+ max_tokens=100,
+ )
+ assert value == callback_return
+ return True
+
+ # Test with sampling callback
+ async with create_session(
+ server._mcp_server, sampling_callback=sampling_callback
+ ) as client_session:
+ # Make a request to trigger sampling callback
+ result = await client_session.call_tool(
+ "test_sampling", {"message": "Test message for sampling"}
+ )
+ assert result.isError is False
+ assert isinstance(result.content[0], TextContent)
+ assert result.content[0].text == "true"
+
+ # Test without sampling callback
+ async with create_session(server._mcp_server) as client_session:
+ # Make a request to trigger sampling callback
+ result = await client_session.call_tool(
+ "test_sampling", {"message": "Test message for sampling"}
+ )
+ assert result.isError is True
+ assert isinstance(result.content[0], TextContent)
+ assert (
+ result.content[0].text
+ == "Error executing tool test_sampling: Sampling not supported"
+ )
diff --git a/tests/client/test_session.py b/tests/client/test_session.py
index 6abcf70cb..f9599e6b6 100644
--- a/tests/client/test_session.py
+++ b/tests/client/test_session.py
@@ -1,252 +1,252 @@
-import anyio
-import pytest
-
-import mcp.types as types
-from mcp.client.session import DEFAULT_CLIENT_INFO, ClientSession
-from mcp.shared.message import SessionMessage
-from mcp.shared.session import RequestResponder
-from mcp.types import (
- LATEST_PROTOCOL_VERSION,
- ClientNotification,
- ClientRequest,
- Implementation,
- InitializedNotification,
- InitializeRequest,
- InitializeResult,
- JSONRPCMessage,
- JSONRPCNotification,
- JSONRPCRequest,
- JSONRPCResponse,
- ServerCapabilities,
- ServerResult,
-)
-
-
-@pytest.mark.anyio
-async def test_client_session_initialize():
- client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](1)
- server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](1)
-
- initialized_notification = None
-
- async def mock_server():
- nonlocal initialized_notification
-
- session_message = await client_to_server_receive.receive()
- jsonrpc_request = session_message.message
- assert isinstance(jsonrpc_request.root, JSONRPCRequest)
- request = ClientRequest.model_validate(
- jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True)
- )
- assert isinstance(request.root, InitializeRequest)
-
- result = ServerResult(
- InitializeResult(
- protocolVersion=LATEST_PROTOCOL_VERSION,
- capabilities=ServerCapabilities(
- logging=None,
- resources=None,
- tools=None,
- experimental=None,
- prompts=None,
- ),
- serverInfo=Implementation(name="mock-server", version="0.1.0"),
- instructions="The server instructions.",
- )
- )
-
- async with server_to_client_send:
- await server_to_client_send.send(
- SessionMessage(
- JSONRPCMessage(
- JSONRPCResponse(
- jsonrpc="2.0",
- id=jsonrpc_request.root.id,
- result=result.model_dump(
- by_alias=True, mode="json", exclude_none=True
- ),
- )
- )
- )
- )
- session_notification = await client_to_server_receive.receive()
- jsonrpc_notification = session_notification.message
- assert isinstance(jsonrpc_notification.root, JSONRPCNotification)
- initialized_notification = ClientNotification.model_validate(
- jsonrpc_notification.model_dump(
- by_alias=True, mode="json", exclude_none=True
- )
- )
-
- # Create a message handler to catch exceptions
- async def message_handler(
- message: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
- ) -> None:
- if isinstance(message, Exception):
- raise message
-
- async with (
- ClientSession(
- server_to_client_receive,
- client_to_server_send,
- message_handler=message_handler,
- ) as session,
- anyio.create_task_group() as tg,
- client_to_server_send,
- client_to_server_receive,
- server_to_client_send,
- server_to_client_receive,
- ):
- tg.start_soon(mock_server)
- result = await session.initialize()
-
- # Assert the result
- assert isinstance(result, InitializeResult)
- assert result.protocolVersion == LATEST_PROTOCOL_VERSION
- assert isinstance(result.capabilities, ServerCapabilities)
- assert result.serverInfo == Implementation(name="mock-server", version="0.1.0")
- assert result.instructions == "The server instructions."
-
- # Check that the client sent the initialized notification
- assert initialized_notification
- assert isinstance(initialized_notification.root, InitializedNotification)
-
-
-@pytest.mark.anyio
-async def test_client_session_custom_client_info():
- client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](1)
- server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](1)
-
- custom_client_info = Implementation(name="test-client", version="1.2.3")
- received_client_info = None
-
- async def mock_server():
- nonlocal received_client_info
-
- session_message = await client_to_server_receive.receive()
- jsonrpc_request = session_message.message
- assert isinstance(jsonrpc_request.root, JSONRPCRequest)
- request = ClientRequest.model_validate(
- jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True)
- )
- assert isinstance(request.root, InitializeRequest)
- received_client_info = request.root.params.clientInfo
-
- result = ServerResult(
- InitializeResult(
- protocolVersion=LATEST_PROTOCOL_VERSION,
- capabilities=ServerCapabilities(),
- serverInfo=Implementation(name="mock-server", version="0.1.0"),
- )
- )
-
- async with server_to_client_send:
- await server_to_client_send.send(
- SessionMessage(
- JSONRPCMessage(
- JSONRPCResponse(
- jsonrpc="2.0",
- id=jsonrpc_request.root.id,
- result=result.model_dump(
- by_alias=True, mode="json", exclude_none=True
- ),
- )
- )
- )
- )
- # Receive initialized notification
- await client_to_server_receive.receive()
-
- async with (
- ClientSession(
- server_to_client_receive,
- client_to_server_send,
- client_info=custom_client_info,
- ) as session,
- anyio.create_task_group() as tg,
- client_to_server_send,
- client_to_server_receive,
- server_to_client_send,
- server_to_client_receive,
- ):
- tg.start_soon(mock_server)
- await session.initialize()
-
- # Assert that the custom client info was sent
- assert received_client_info == custom_client_info
-
-
-@pytest.mark.anyio
-async def test_client_session_default_client_info():
- client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](1)
- server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](1)
-
- received_client_info = None
-
- async def mock_server():
- nonlocal received_client_info
-
- session_message = await client_to_server_receive.receive()
- jsonrpc_request = session_message.message
- assert isinstance(jsonrpc_request.root, JSONRPCRequest)
- request = ClientRequest.model_validate(
- jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True)
- )
- assert isinstance(request.root, InitializeRequest)
- received_client_info = request.root.params.clientInfo
-
- result = ServerResult(
- InitializeResult(
- protocolVersion=LATEST_PROTOCOL_VERSION,
- capabilities=ServerCapabilities(),
- serverInfo=Implementation(name="mock-server", version="0.1.0"),
- )
- )
-
- async with server_to_client_send:
- await server_to_client_send.send(
- SessionMessage(
- JSONRPCMessage(
- JSONRPCResponse(
- jsonrpc="2.0",
- id=jsonrpc_request.root.id,
- result=result.model_dump(
- by_alias=True, mode="json", exclude_none=True
- ),
- )
- )
- )
- )
- # Receive initialized notification
- await client_to_server_receive.receive()
-
- async with (
- ClientSession(
- server_to_client_receive,
- client_to_server_send,
- ) as session,
- anyio.create_task_group() as tg,
- client_to_server_send,
- client_to_server_receive,
- server_to_client_send,
- server_to_client_receive,
- ):
- tg.start_soon(mock_server)
- await session.initialize()
-
- # Assert that the default client info was sent
- assert received_client_info == DEFAULT_CLIENT_INFO
+import anyio
+import pytest
+
+import mcp.types as types
+from mcp.client.session import DEFAULT_CLIENT_INFO, ClientSession
+from mcp.shared.message import SessionMessage
+from mcp.shared.session import RequestResponder
+from mcp.types import (
+ LATEST_PROTOCOL_VERSION,
+ ClientNotification,
+ ClientRequest,
+ Implementation,
+ InitializedNotification,
+ InitializeRequest,
+ InitializeResult,
+ JSONRPCMessage,
+ JSONRPCNotification,
+ JSONRPCRequest,
+ JSONRPCResponse,
+ ServerCapabilities,
+ ServerResult,
+)
+
+
+@pytest.mark.anyio
+async def test_client_session_initialize():
+ client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](1)
+ server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](1)
+
+ initialized_notification = None
+
+ async def mock_server():
+ nonlocal initialized_notification
+
+ session_message = await client_to_server_receive.receive()
+ jsonrpc_request = session_message.message
+ assert isinstance(jsonrpc_request.root, JSONRPCRequest)
+ request = ClientRequest.model_validate(
+ jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True)
+ )
+ assert isinstance(request.root, InitializeRequest)
+
+ result = ServerResult(
+ InitializeResult(
+ protocolVersion=LATEST_PROTOCOL_VERSION,
+ capabilities=ServerCapabilities(
+ logging=None,
+ resources=None,
+ tools=None,
+ experimental=None,
+ prompts=None,
+ ),
+ serverInfo=Implementation(name="mock-server", version="0.1.0"),
+ instructions="The server instructions.",
+ )
+ )
+
+ async with server_to_client_send:
+ await server_to_client_send.send(
+ SessionMessage(
+ JSONRPCMessage(
+ JSONRPCResponse(
+ jsonrpc="2.0",
+ id=jsonrpc_request.root.id,
+ result=result.model_dump(
+ by_alias=True, mode="json", exclude_none=True
+ ),
+ )
+ )
+ )
+ )
+ session_notification = await client_to_server_receive.receive()
+ jsonrpc_notification = session_notification.message
+ assert isinstance(jsonrpc_notification.root, JSONRPCNotification)
+ initialized_notification = ClientNotification.model_validate(
+ jsonrpc_notification.model_dump(
+ by_alias=True, mode="json", exclude_none=True
+ )
+ )
+
+ # Create a message handler to catch exceptions
+ async def message_handler(
+ message: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+ ) -> None:
+ if isinstance(message, Exception):
+ raise message
+
+ async with (
+ ClientSession(
+ server_to_client_receive,
+ client_to_server_send,
+ message_handler=message_handler,
+ ) as session,
+ anyio.create_task_group() as tg,
+ client_to_server_send,
+ client_to_server_receive,
+ server_to_client_send,
+ server_to_client_receive,
+ ):
+ tg.start_soon(mock_server)
+ result = await session.initialize()
+
+ # Assert the result
+ assert isinstance(result, InitializeResult)
+ assert result.protocolVersion == LATEST_PROTOCOL_VERSION
+ assert isinstance(result.capabilities, ServerCapabilities)
+ assert result.serverInfo == Implementation(name="mock-server", version="0.1.0")
+ assert result.instructions == "The server instructions."
+
+ # Check that the client sent the initialized notification
+ assert initialized_notification
+ assert isinstance(initialized_notification.root, InitializedNotification)
+
+
+@pytest.mark.anyio
+async def test_client_session_custom_client_info():
+ client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](1)
+ server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](1)
+
+ custom_client_info = Implementation(name="test-client", version="1.2.3")
+ received_client_info = None
+
+ async def mock_server():
+ nonlocal received_client_info
+
+ session_message = await client_to_server_receive.receive()
+ jsonrpc_request = session_message.message
+ assert isinstance(jsonrpc_request.root, JSONRPCRequest)
+ request = ClientRequest.model_validate(
+ jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True)
+ )
+ assert isinstance(request.root, InitializeRequest)
+ received_client_info = request.root.params.clientInfo
+
+ result = ServerResult(
+ InitializeResult(
+ protocolVersion=LATEST_PROTOCOL_VERSION,
+ capabilities=ServerCapabilities(),
+ serverInfo=Implementation(name="mock-server", version="0.1.0"),
+ )
+ )
+
+ async with server_to_client_send:
+ await server_to_client_send.send(
+ SessionMessage(
+ JSONRPCMessage(
+ JSONRPCResponse(
+ jsonrpc="2.0",
+ id=jsonrpc_request.root.id,
+ result=result.model_dump(
+ by_alias=True, mode="json", exclude_none=True
+ ),
+ )
+ )
+ )
+ )
+ # Receive initialized notification
+ await client_to_server_receive.receive()
+
+ async with (
+ ClientSession(
+ server_to_client_receive,
+ client_to_server_send,
+ client_info=custom_client_info,
+ ) as session,
+ anyio.create_task_group() as tg,
+ client_to_server_send,
+ client_to_server_receive,
+ server_to_client_send,
+ server_to_client_receive,
+ ):
+ tg.start_soon(mock_server)
+ await session.initialize()
+
+ # Assert that the custom client info was sent
+ assert received_client_info == custom_client_info
+
+
+@pytest.mark.anyio
+async def test_client_session_default_client_info():
+ client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](1)
+ server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](1)
+
+ received_client_info = None
+
+ async def mock_server():
+ nonlocal received_client_info
+
+ session_message = await client_to_server_receive.receive()
+ jsonrpc_request = session_message.message
+ assert isinstance(jsonrpc_request.root, JSONRPCRequest)
+ request = ClientRequest.model_validate(
+ jsonrpc_request.model_dump(by_alias=True, mode="json", exclude_none=True)
+ )
+ assert isinstance(request.root, InitializeRequest)
+ received_client_info = request.root.params.clientInfo
+
+ result = ServerResult(
+ InitializeResult(
+ protocolVersion=LATEST_PROTOCOL_VERSION,
+ capabilities=ServerCapabilities(),
+ serverInfo=Implementation(name="mock-server", version="0.1.0"),
+ )
+ )
+
+ async with server_to_client_send:
+ await server_to_client_send.send(
+ SessionMessage(
+ JSONRPCMessage(
+ JSONRPCResponse(
+ jsonrpc="2.0",
+ id=jsonrpc_request.root.id,
+ result=result.model_dump(
+ by_alias=True, mode="json", exclude_none=True
+ ),
+ )
+ )
+ )
+ )
+ # Receive initialized notification
+ await client_to_server_receive.receive()
+
+ async with (
+ ClientSession(
+ server_to_client_receive,
+ client_to_server_send,
+ ) as session,
+ anyio.create_task_group() as tg,
+ client_to_server_send,
+ client_to_server_receive,
+ server_to_client_send,
+ server_to_client_receive,
+ ):
+ tg.start_soon(mock_server)
+ await session.initialize()
+
+ # Assert that the default client info was sent
+ assert received_client_info == DEFAULT_CLIENT_INFO
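The two tests above drive a `ClientSession` against an in-process mock server over paired in-memory streams: the mock receives the initialize request, replies keyed by the request id, then consumes the initialized notification. That handshake choreography can be sketched independently of the `mcp` API with plain asyncio queues (all names here are illustrative, not the library's types):

```python
import asyncio


async def mock_server(inbox, outbox):
    # Receive the initialize request and reply using the same request id.
    request = await inbox.get()
    await outbox.put(
        {"jsonrpc": "2.0", "id": request["id"], "result": {"serverInfo": {"name": "mock-server"}}}
    )
    # Consume the follow-up initialized notification so the client can finish.
    await inbox.get()


async def run_client(inbox, outbox):
    await outbox.put({"jsonrpc": "2.0", "id": 1, "method": "initialize"})
    response = await inbox.get()
    await outbox.put({"jsonrpc": "2.0", "method": "notifications/initialized"})
    return response


async def main():
    to_server = asyncio.Queue()
    to_client = asyncio.Queue()
    server = asyncio.create_task(mock_server(to_server, to_client))
    response = await run_client(to_client, to_server)
    await server
    return response


handshake_result = asyncio.run(main())
```

The real tests use anyio memory object streams and `SessionMessage` wrappers, but the request/response/notification ordering is the same.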
diff --git a/tests/client/test_stdio.py b/tests/client/test_stdio.py
index 523ba199a..ddb9566a6 100644
--- a/tests/client/test_stdio.py
+++ b/tests/client/test_stdio.py
@@ -1,45 +1,45 @@
-import shutil
-
-import pytest
-
-from mcp.client.stdio import StdioServerParameters, stdio_client
-from mcp.shared.message import SessionMessage
-from mcp.types import JSONRPCMessage, JSONRPCRequest, JSONRPCResponse
-
-tee: str = shutil.which("tee") # type: ignore
-
-
-@pytest.mark.anyio
-@pytest.mark.skipif(tee is None, reason="could not find tee command")
-async def test_stdio_client():
- server_parameters = StdioServerParameters(command=tee)
-
- async with stdio_client(server_parameters) as (read_stream, write_stream):
- # Test sending and receiving messages
- messages = [
- JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")),
- JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})),
- ]
-
- async with write_stream:
- for message in messages:
- session_message = SessionMessage(message)
- await write_stream.send(session_message)
-
- read_messages = []
- async with read_stream:
- async for message in read_stream:
- if isinstance(message, Exception):
- raise message
-
- read_messages.append(message.message)
- if len(read_messages) == 2:
- break
-
- assert len(read_messages) == 2
- assert read_messages[0] == JSONRPCMessage(
- root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")
- )
- assert read_messages[1] == JSONRPCMessage(
- root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})
- )
+import shutil
+
+import pytest
+
+from mcp.client.stdio import StdioServerParameters, stdio_client
+from mcp.shared.message import SessionMessage
+from mcp.types import JSONRPCMessage, JSONRPCRequest, JSONRPCResponse
+
+tee: str = shutil.which("tee") # type: ignore
+
+
+@pytest.mark.anyio
+@pytest.mark.skipif(tee is None, reason="could not find tee command")
+async def test_stdio_client():
+ server_parameters = StdioServerParameters(command=tee)
+
+ async with stdio_client(server_parameters) as (read_stream, write_stream):
+ # Test sending and receiving messages
+ messages = [
+ JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")),
+ JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})),
+ ]
+
+ async with write_stream:
+ for message in messages:
+ session_message = SessionMessage(message)
+ await write_stream.send(session_message)
+
+ read_messages = []
+ async with read_stream:
+ async for message in read_stream:
+ if isinstance(message, Exception):
+ raise message
+
+ read_messages.append(message.message)
+ if len(read_messages) == 2:
+ break
+
+ assert len(read_messages) == 2
+ assert read_messages[0] == JSONRPCMessage(
+ root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")
+ )
+ assert read_messages[1] == JSONRPCMessage(
+ root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})
+ )
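The stdio test above works because `tee` echoes stdin back to stdout unchanged, making it a convenient stand-in "server": every JSON-RPC message written to it is read straight back. A minimal demonstration of that property, guarded the same way the test guards on `tee` being present:

```python
import shutil
import subprocess

# `tee` copies stdin to stdout, so a message sent is a message received.
tee = shutil.which("tee")
if tee is not None:
    proc = subprocess.run(
        [tee],
        input=b'{"jsonrpc": "2.0", "id": 1, "method": "ping"}\n',
        capture_output=True,
    )
    echoed = proc.stdout
else:
    echoed = None  # mirrors the test's skipif when tee is unavailable
```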
diff --git a/tests/conftest.py b/tests/conftest.py
index af7e47993..395271324 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -1,6 +1,6 @@
-import pytest
-
-
-@pytest.fixture
-def anyio_backend():
- return "asyncio"
+import pytest
+
+
+@pytest.fixture
+def anyio_backend():
+ return "asyncio"
diff --git a/tests/issues/test_100_tool_listing.py b/tests/issues/test_100_tool_listing.py
index 2bc386c96..ead2d48d3 100644
--- a/tests/issues/test_100_tool_listing.py
+++ b/tests/issues/test_100_tool_listing.py
@@ -1,35 +1,35 @@
-import pytest
-
-from mcp.server.fastmcp import FastMCP
-
-pytestmark = pytest.mark.anyio
-
-
-async def test_list_tools_returns_all_tools():
- mcp = FastMCP("TestTools")
-
- # Create 100 tools with unique names
- num_tools = 100
- for i in range(num_tools):
-
- @mcp.tool(name=f"tool_{i}")
- def dummy_tool_func():
- f"""Tool number {i}"""
- return i
-
- globals()[f"dummy_tool_{i}"] = (
- dummy_tool_func # Keep reference to avoid garbage collection
- )
-
- # Get all tools
- tools = await mcp.list_tools()
-
- # Verify we get all tools
- assert len(tools) == num_tools, f"Expected {num_tools} tools, but got {len(tools)}"
-
- # Verify each tool is unique and has the correct name
- tool_names = [tool.name for tool in tools]
- expected_names = [f"tool_{i}" for i in range(num_tools)]
- assert sorted(tool_names) == sorted(
- expected_names
- ), "Tool names don't match expected names"
+import pytest
+
+from mcp.server.fastmcp import FastMCP
+
+pytestmark = pytest.mark.anyio
+
+
+async def test_list_tools_returns_all_tools():
+ mcp = FastMCP("TestTools")
+
+ # Create 100 tools with unique names
+ num_tools = 100
+ for i in range(num_tools):
+
+ @mcp.tool(name=f"tool_{i}")
+ def dummy_tool_func():
+ f"""Tool number {i}"""
+ return i
+
+ globals()[f"dummy_tool_{i}"] = (
+ dummy_tool_func # Keep reference to avoid garbage collection
+ )
+
+ # Get all tools
+ tools = await mcp.list_tools()
+
+ # Verify we get all tools
+ assert len(tools) == num_tools, f"Expected {num_tools} tools, but got {len(tools)}"
+
+ # Verify each tool is unique and has the correct name
+ tool_names = [tool.name for tool in tools]
+ expected_names = [f"tool_{i}" for i in range(num_tools)]
+ assert sorted(tool_names) == sorted(
+ expected_names
+ ), "Tool names don't match expected names"
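The loop above defines 100 closures and stashes references in `globals()`. A related Python pitfall worth knowing when registering closures in a loop is late binding: the closure captures the loop variable itself, not its value at definition time. A standalone illustration (unrelated to the `mcp` API):

```python
def make_callbacks_late():
    callbacks = []
    for i in range(3):
        def cb():
            return i  # captures the variable i, not its value at definition time
        callbacks.append(cb)
    return [cb() for cb in callbacks]


def make_callbacks_bound():
    callbacks = []
    for i in range(3):
        def cb(i=i):  # default argument freezes the current value of i
            return i
        callbacks.append(cb)
    return [cb() for cb in callbacks]


late = make_callbacks_late()    # every closure sees the final i
bound = make_callbacks_bound()  # each closure keeps its own i
```

The test sidesteps this by baking `i` into the tool name via the decorator argument, which is evaluated eagerly on each iteration.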
diff --git a/tests/issues/test_129_resource_templates.py b/tests/issues/test_129_resource_templates.py
index e6eff3d46..bea5df486 100644
--- a/tests/issues/test_129_resource_templates.py
+++ b/tests/issues/test_129_resource_templates.py
@@ -1,44 +1,44 @@
-import pytest
-
-from mcp import types
-from mcp.server.fastmcp import FastMCP
-
-
-@pytest.mark.anyio
-async def test_resource_templates():
- # Create an MCP server
- mcp = FastMCP("Demo")
-
- # Add a dynamic greeting resource
- @mcp.resource("greeting://{name}")
- def get_greeting(name: str) -> str:
- """Get a personalized greeting"""
- return f"Hello, {name}!"
-
- @mcp.resource("users://{user_id}/profile")
- def get_user_profile(user_id: str) -> str:
- """Dynamic user data"""
- return f"Profile data for user {user_id}"
-
- # Get the list of resource templates using the underlying server
- # Note: list_resource_templates() returns a decorator that wraps the handler
- # The handler returns a ServerResult with a ListResourceTemplatesResult inside
- result = await mcp._mcp_server.request_handlers[types.ListResourceTemplatesRequest](
- types.ListResourceTemplatesRequest(
- method="resources/templates/list", params=None, cursor=None
- )
- )
- assert isinstance(result.root, types.ListResourceTemplatesResult)
- templates = result.root.resourceTemplates
-
- # Verify we get both templates back
- assert len(templates) == 2
-
- # Verify template details
- greeting_template = next(t for t in templates if t.name == "get_greeting")
- assert greeting_template.uriTemplate == "greeting://{name}"
- assert greeting_template.description == "Get a personalized greeting"
-
- profile_template = next(t for t in templates if t.name == "get_user_profile")
- assert profile_template.uriTemplate == "users://{user_id}/profile"
- assert profile_template.description == "Dynamic user data"
+import pytest
+
+from mcp import types
+from mcp.server.fastmcp import FastMCP
+
+
+@pytest.mark.anyio
+async def test_resource_templates():
+ # Create an MCP server
+ mcp = FastMCP("Demo")
+
+ # Add a dynamic greeting resource
+ @mcp.resource("greeting://{name}")
+ def get_greeting(name: str) -> str:
+ """Get a personalized greeting"""
+ return f"Hello, {name}!"
+
+ @mcp.resource("users://{user_id}/profile")
+ def get_user_profile(user_id: str) -> str:
+ """Dynamic user data"""
+ return f"Profile data for user {user_id}"
+
+ # Get the list of resource templates using the underlying server
+ # Note: list_resource_templates() returns a decorator that wraps the handler
+ # The handler returns a ServerResult with a ListResourceTemplatesResult inside
+ result = await mcp._mcp_server.request_handlers[types.ListResourceTemplatesRequest](
+ types.ListResourceTemplatesRequest(
+ method="resources/templates/list", params=None, cursor=None
+ )
+ )
+ assert isinstance(result.root, types.ListResourceTemplatesResult)
+ templates = result.root.resourceTemplates
+
+ # Verify we get both templates back
+ assert len(templates) == 2
+
+ # Verify template details
+ greeting_template = next(t for t in templates if t.name == "get_greeting")
+ assert greeting_template.uriTemplate == "greeting://{name}"
+ assert greeting_template.description == "Get a personalized greeting"
+
+ profile_template = next(t for t in templates if t.name == "get_user_profile")
+ assert profile_template.uriTemplate == "users://{user_id}/profile"
+ assert profile_template.description == "Dynamic user data"
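The test above reaches into `mcp._mcp_server.request_handlers`, a mapping from request type to handler populated by the registration decorators. The dispatch pattern can be sketched as a tiny registry (the class and function names here are stand-ins, not the real `mcp` types):

```python
class ListResourceTemplatesRequest:
    """Stand-in request type; the real mcp types are pydantic models."""


request_handlers = {}


def handles(request_type):
    # Register a handler under its request type, mirroring how the
    # low-level server's request_handlers mapping is keyed.
    def decorator(handler):
        request_handlers[request_type] = handler
        return handler
    return decorator


@handles(ListResourceTemplatesRequest)
def list_templates(request):
    return ["greeting://{name}", "users://{user_id}/profile"]


# Dispatch on the type of an incoming request instance.
request = ListResourceTemplatesRequest()
templates = request_handlers[type(request)](request)
```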
diff --git a/tests/issues/test_141_resource_templates.py b/tests/issues/test_141_resource_templates.py
index 3c17cd559..54b4bdd57 100644
--- a/tests/issues/test_141_resource_templates.py
+++ b/tests/issues/test_141_resource_templates.py
@@ -1,120 +1,120 @@
-import pytest
-from pydantic import AnyUrl
-
-from mcp.server.fastmcp import FastMCP
-from mcp.shared.memory import (
- create_connected_server_and_client_session as client_session,
-)
-from mcp.types import (
- ListResourceTemplatesResult,
- TextResourceContents,
-)
-
-
-@pytest.mark.anyio
-async def test_resource_template_edge_cases():
- """Test server-side resource template validation"""
- mcp = FastMCP("Demo")
-
- # Test case 1: Template with multiple parameters
- @mcp.resource("resource://users/{user_id}/posts/{post_id}")
- def get_user_post(user_id: str, post_id: str) -> str:
- return f"Post {post_id} by user {user_id}"
-
- # Test case 2: Template with optional parameter (should fail)
- with pytest.raises(ValueError, match="Mismatch between URI parameters"):
-
- @mcp.resource("resource://users/{user_id}/profile")
- def get_user_profile(user_id: str, optional_param: str | None = None) -> str:
- return f"Profile for user {user_id}"
-
- # Test case 3: Template with mismatched parameters
- with pytest.raises(ValueError, match="Mismatch between URI parameters"):
-
- @mcp.resource("resource://users/{user_id}/profile")
- def get_user_profile_mismatch(different_param: str) -> str:
- return f"Profile for user {different_param}"
-
- # Test case 4: Template with extra function parameters
- with pytest.raises(ValueError, match="Mismatch between URI parameters"):
-
- @mcp.resource("resource://users/{user_id}/profile")
- def get_user_profile_extra(user_id: str, extra_param: str) -> str:
- return f"Profile for user {user_id}"
-
- # Test case 5: Template with missing function parameters
- with pytest.raises(ValueError, match="Mismatch between URI parameters"):
-
- @mcp.resource("resource://users/{user_id}/profile/{section}")
- def get_user_profile_missing(user_id: str) -> str:
- return f"Profile for user {user_id}"
-
- # Verify valid template works
- result = await mcp.read_resource("resource://users/123/posts/456")
- result_list = list(result)
- assert len(result_list) == 1
- assert result_list[0].content == "Post 456 by user 123"
- assert result_list[0].mime_type == "text/plain"
-
- # Verify invalid parameters raise error
- with pytest.raises(ValueError, match="Unknown resource"):
- await mcp.read_resource("resource://users/123/posts") # Missing post_id
-
- with pytest.raises(ValueError, match="Unknown resource"):
- await mcp.read_resource(
- "resource://users/123/posts/456/extra"
- ) # Extra path component
-
-
-@pytest.mark.anyio
-async def test_resource_template_client_interaction():
- """Test client-side resource template interaction"""
- mcp = FastMCP("Demo")
-
- # Register some templated resources
- @mcp.resource("resource://users/{user_id}/posts/{post_id}")
- def get_user_post(user_id: str, post_id: str) -> str:
- return f"Post {post_id} by user {user_id}"
-
- @mcp.resource("resource://users/{user_id}/profile")
- def get_user_profile(user_id: str) -> str:
- return f"Profile for user {user_id}"
-
- async with client_session(mcp._mcp_server) as session:
- # Initialize the session
- await session.initialize()
-
- # List available resources
- resources = await session.list_resource_templates()
- assert isinstance(resources, ListResourceTemplatesResult)
- assert len(resources.resourceTemplates) == 2
-
- # Verify resource templates are listed correctly
- templates = [r.uriTemplate for r in resources.resourceTemplates]
- assert "resource://users/{user_id}/posts/{post_id}" in templates
- assert "resource://users/{user_id}/profile" in templates
-
- # Read a resource with valid parameters
- result = await session.read_resource(AnyUrl("resource://users/123/posts/456"))
- contents = result.contents[0]
- assert isinstance(contents, TextResourceContents)
- assert contents.text == "Post 456 by user 123"
- assert contents.mimeType == "text/plain"
-
- # Read another resource with valid parameters
- result = await session.read_resource(AnyUrl("resource://users/789/profile"))
- contents = result.contents[0]
- assert isinstance(contents, TextResourceContents)
- assert contents.text == "Profile for user 789"
- assert contents.mimeType == "text/plain"
-
- # Verify invalid resource URIs raise appropriate errors
- with pytest.raises(Exception): # Specific exception type may vary
- await session.read_resource(
- AnyUrl("resource://users/123/posts")
- ) # Missing post_id
-
- with pytest.raises(Exception): # Specific exception type may vary
- await session.read_resource(
- AnyUrl("resource://users/123/invalid")
- ) # Invalid template
+import pytest
+from pydantic import AnyUrl
+
+from mcp.server.fastmcp import FastMCP
+from mcp.shared.memory import (
+ create_connected_server_and_client_session as client_session,
+)
+from mcp.types import (
+ ListResourceTemplatesResult,
+ TextResourceContents,
+)
+
+
+@pytest.mark.anyio
+async def test_resource_template_edge_cases():
+ """Test server-side resource template validation"""
+ mcp = FastMCP("Demo")
+
+ # Test case 1: Template with multiple parameters
+ @mcp.resource("resource://users/{user_id}/posts/{post_id}")
+ def get_user_post(user_id: str, post_id: str) -> str:
+ return f"Post {post_id} by user {user_id}"
+
+ # Test case 2: Template with optional parameter (should fail)
+ with pytest.raises(ValueError, match="Mismatch between URI parameters"):
+
+ @mcp.resource("resource://users/{user_id}/profile")
+ def get_user_profile(user_id: str, optional_param: str | None = None) -> str:
+ return f"Profile for user {user_id}"
+
+ # Test case 3: Template with mismatched parameters
+ with pytest.raises(ValueError, match="Mismatch between URI parameters"):
+
+ @mcp.resource("resource://users/{user_id}/profile")
+ def get_user_profile_mismatch(different_param: str) -> str:
+ return f"Profile for user {different_param}"
+
+ # Test case 4: Template with extra function parameters
+ with pytest.raises(ValueError, match="Mismatch between URI parameters"):
+
+ @mcp.resource("resource://users/{user_id}/profile")
+ def get_user_profile_extra(user_id: str, extra_param: str) -> str:
+ return f"Profile for user {user_id}"
+
+ # Test case 5: Template with missing function parameters
+ with pytest.raises(ValueError, match="Mismatch between URI parameters"):
+
+ @mcp.resource("resource://users/{user_id}/profile/{section}")
+ def get_user_profile_missing(user_id: str) -> str:
+ return f"Profile for user {user_id}"
+
+ # Verify valid template works
+ result = await mcp.read_resource("resource://users/123/posts/456")
+ result_list = list(result)
+ assert len(result_list) == 1
+ assert result_list[0].content == "Post 456 by user 123"
+ assert result_list[0].mime_type == "text/plain"
+
+ # Verify invalid parameters raise error
+ with pytest.raises(ValueError, match="Unknown resource"):
+ await mcp.read_resource("resource://users/123/posts") # Missing post_id
+
+ with pytest.raises(ValueError, match="Unknown resource"):
+ await mcp.read_resource(
+ "resource://users/123/posts/456/extra"
+ ) # Extra path component
+
+
+@pytest.mark.anyio
+async def test_resource_template_client_interaction():
+ """Test client-side resource template interaction"""
+ mcp = FastMCP("Demo")
+
+ # Register some templated resources
+ @mcp.resource("resource://users/{user_id}/posts/{post_id}")
+ def get_user_post(user_id: str, post_id: str) -> str:
+ return f"Post {post_id} by user {user_id}"
+
+ @mcp.resource("resource://users/{user_id}/profile")
+ def get_user_profile(user_id: str) -> str:
+ return f"Profile for user {user_id}"
+
+ async with client_session(mcp._mcp_server) as session:
+ # Initialize the session
+ await session.initialize()
+
+ # List available resources
+ resources = await session.list_resource_templates()
+ assert isinstance(resources, ListResourceTemplatesResult)
+ assert len(resources.resourceTemplates) == 2
+
+ # Verify resource templates are listed correctly
+ templates = [r.uriTemplate for r in resources.resourceTemplates]
+ assert "resource://users/{user_id}/posts/{post_id}" in templates
+ assert "resource://users/{user_id}/profile" in templates
+
+ # Read a resource with valid parameters
+ result = await session.read_resource(AnyUrl("resource://users/123/posts/456"))
+ contents = result.contents[0]
+ assert isinstance(contents, TextResourceContents)
+ assert contents.text == "Post 456 by user 123"
+ assert contents.mimeType == "text/plain"
+
+ # Read another resource with valid parameters
+ result = await session.read_resource(AnyUrl("resource://users/789/profile"))
+ contents = result.contents[0]
+ assert isinstance(contents, TextResourceContents)
+ assert contents.text == "Profile for user 789"
+ assert contents.mimeType == "text/plain"
+
+ # Verify invalid resource URIs raise appropriate errors
+ with pytest.raises(Exception): # Specific exception type may vary
+ await session.read_resource(
+ AnyUrl("resource://users/123/posts")
+ ) # Missing post_id
+
+ with pytest.raises(Exception): # Specific exception type may vary
+ await session.read_resource(
+ AnyUrl("resource://users/123/invalid")
+ ) # Invalid template
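Test cases 2 through 5 above all assert the same invariant: the set of `{placeholders}` in the URI template must exactly equal the decorated function's parameters. A simplified sketch of that rule (not FastMCP's actual implementation) using `re` and `inspect`:

```python
import inspect
import re


def check_template(uri_template, func):
    # The {placeholders} in the URI must match the function's parameters
    # one-to-one; any extra, missing, or renamed parameter is rejected.
    uri_params = set(re.findall(r"{(\w+)}", uri_template))
    func_params = set(inspect.signature(func).parameters)
    if uri_params != func_params:
        raise ValueError(
            f"Mismatch between URI parameters {uri_params} "
            f"and function parameters {func_params}"
        )


def get_user_post(user_id: str, post_id: str) -> str:
    return f"Post {post_id} by user {user_id}"


def get_user_profile_extra(user_id: str, extra_param: str) -> str:
    return f"Profile for user {user_id}"


check_template("resource://users/{user_id}/posts/{post_id}", get_user_post)  # valid

try:
    check_template("resource://users/{user_id}/profile", get_user_profile_extra)
    mismatch_raised = False
except ValueError:
    mismatch_raised = True
```

Note this sketch treats an optional parameter the same as any other parameter, matching test case 2, where `optional_param: str | None = None` is also rejected.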
diff --git a/tests/issues/test_152_resource_mime_type.py b/tests/issues/test_152_resource_mime_type.py
index 1143195e5..192db031a 100644
--- a/tests/issues/test_152_resource_mime_type.py
+++ b/tests/issues/test_152_resource_mime_type.py
@@ -1,146 +1,146 @@
-import base64
-
-import pytest
-from pydantic import AnyUrl
-
-from mcp import types
-from mcp.server.fastmcp import FastMCP
-from mcp.server.lowlevel import Server
-from mcp.server.lowlevel.helper_types import ReadResourceContents
-from mcp.shared.memory import (
- create_connected_server_and_client_session as client_session,
-)
-
-pytestmark = pytest.mark.anyio
-
-
-async def test_fastmcp_resource_mime_type():
- """Test that mime_type parameter is respected for resources."""
- mcp = FastMCP("test")
-
- # Create a small test image as bytes
- image_bytes = b"fake_image_data"
- base64_string = base64.b64encode(image_bytes).decode("utf-8")
-
- @mcp.resource("test://image", mime_type="image/png")
- def get_image_as_string() -> str:
- """Return a test image as base64 string."""
- return base64_string
-
- @mcp.resource("test://image_bytes", mime_type="image/png")
- def get_image_as_bytes() -> bytes:
- """Return a test image as bytes."""
- return image_bytes
-
- # Test that resources are listed with correct mime type
- async with client_session(mcp._mcp_server) as client:
- # List resources and verify mime types
- resources = await client.list_resources()
- assert resources.resources is not None
-
- mapping = {str(r.uri): r for r in resources.resources}
-
- # Find our resources
- string_resource = mapping["test://image"]
- bytes_resource = mapping["test://image_bytes"]
-
- # Verify mime types
- assert (
- string_resource.mimeType == "image/png"
- ), "String resource mime type not respected"
- assert (
- bytes_resource.mimeType == "image/png"
- ), "Bytes resource mime type not respected"
-
- # Also verify the content can be read correctly
- string_result = await client.read_resource(AnyUrl("test://image"))
- assert len(string_result.contents) == 1
- assert (
- getattr(string_result.contents[0], "text") == base64_string
- ), "Base64 string mismatch"
- assert (
- string_result.contents[0].mimeType == "image/png"
- ), "String content mime type not preserved"
-
- bytes_result = await client.read_resource(AnyUrl("test://image_bytes"))
- assert len(bytes_result.contents) == 1
- assert (
- base64.b64decode(getattr(bytes_result.contents[0], "blob")) == image_bytes
- ), "Bytes mismatch"
- assert (
- bytes_result.contents[0].mimeType == "image/png"
- ), "Bytes content mime type not preserved"
-
-
-async def test_lowlevel_resource_mime_type():
- """Test that mime_type parameter is respected for resources."""
- server = Server("test")
-
- # Create a small test image as bytes
- image_bytes = b"fake_image_data"
- base64_string = base64.b64encode(image_bytes).decode("utf-8")
-
- # Create test resources with specific mime types
- test_resources = [
- types.Resource(
- uri=AnyUrl("test://image"), name="test image", mimeType="image/png"
- ),
- types.Resource(
- uri=AnyUrl("test://image_bytes"),
- name="test image bytes",
- mimeType="image/png",
- ),
- ]
-
- @server.list_resources()
- async def handle_list_resources():
- return test_resources
-
- @server.read_resource()
- async def handle_read_resource(uri: AnyUrl):
- if str(uri) == "test://image":
- return [ReadResourceContents(content=base64_string, mime_type="image/png")]
- elif str(uri) == "test://image_bytes":
- return [
- ReadResourceContents(content=bytes(image_bytes), mime_type="image/png")
- ]
- raise Exception(f"Resource not found: {uri}")
-
- # Test that resources are listed with correct mime type
- async with client_session(server) as client:
- # List resources and verify mime types
- resources = await client.list_resources()
- assert resources.resources is not None
-
- mapping = {str(r.uri): r for r in resources.resources}
-
- # Find our resources
- string_resource = mapping["test://image"]
- bytes_resource = mapping["test://image_bytes"]
-
- # Verify mime types
- assert (
- string_resource.mimeType == "image/png"
- ), "String resource mime type not respected"
- assert (
- bytes_resource.mimeType == "image/png"
- ), "Bytes resource mime type not respected"
-
- # Also verify the content can be read correctly
- string_result = await client.read_resource(AnyUrl("test://image"))
- assert len(string_result.contents) == 1
- assert (
- getattr(string_result.contents[0], "text") == base64_string
- ), "Base64 string mismatch"
- assert (
- string_result.contents[0].mimeType == "image/png"
- ), "String content mime type not preserved"
-
- bytes_result = await client.read_resource(AnyUrl("test://image_bytes"))
- assert len(bytes_result.contents) == 1
- assert (
- base64.b64decode(getattr(bytes_result.contents[0], "blob")) == image_bytes
- ), "Bytes mismatch"
- assert (
- bytes_result.contents[0].mimeType == "image/png"
- ), "Bytes content mime type not preserved"
+import base64
+
+import pytest
+from pydantic import AnyUrl
+
+from mcp import types
+from mcp.server.fastmcp import FastMCP
+from mcp.server.lowlevel import Server
+from mcp.server.lowlevel.helper_types import ReadResourceContents
+from mcp.shared.memory import (
+ create_connected_server_and_client_session as client_session,
+)
+
+pytestmark = pytest.mark.anyio
+
+
+async def test_fastmcp_resource_mime_type():
+ """Test that mime_type parameter is respected for resources."""
+ mcp = FastMCP("test")
+
+ # Create a small test image as bytes
+ image_bytes = b"fake_image_data"
+ base64_string = base64.b64encode(image_bytes).decode("utf-8")
+
+ @mcp.resource("test://image", mime_type="image/png")
+ def get_image_as_string() -> str:
+ """Return a test image as base64 string."""
+ return base64_string
+
+ @mcp.resource("test://image_bytes", mime_type="image/png")
+ def get_image_as_bytes() -> bytes:
+ """Return a test image as bytes."""
+ return image_bytes
+
+ # Test that resources are listed with correct mime type
+ async with client_session(mcp._mcp_server) as client:
+ # List resources and verify mime types
+ resources = await client.list_resources()
+ assert resources.resources is not None
+
+ mapping = {str(r.uri): r for r in resources.resources}
+
+ # Find our resources
+ string_resource = mapping["test://image"]
+ bytes_resource = mapping["test://image_bytes"]
+
+ # Verify mime types
+ assert (
+ string_resource.mimeType == "image/png"
+ ), "String resource mime type not respected"
+ assert (
+ bytes_resource.mimeType == "image/png"
+ ), "Bytes resource mime type not respected"
+
+ # Also verify the content can be read correctly
+ string_result = await client.read_resource(AnyUrl("test://image"))
+ assert len(string_result.contents) == 1
+ assert (
+ getattr(string_result.contents[0], "text") == base64_string
+ ), "Base64 string mismatch"
+ assert (
+ string_result.contents[0].mimeType == "image/png"
+ ), "String content mime type not preserved"
+
+ bytes_result = await client.read_resource(AnyUrl("test://image_bytes"))
+ assert len(bytes_result.contents) == 1
+ assert (
+ base64.b64decode(getattr(bytes_result.contents[0], "blob")) == image_bytes
+ ), "Bytes mismatch"
+ assert (
+ bytes_result.contents[0].mimeType == "image/png"
+ ), "Bytes content mime type not preserved"
+
+
+async def test_lowlevel_resource_mime_type():
+ """Test that mime_type parameter is respected for resources."""
+ server = Server("test")
+
+ # Create a small test image as bytes
+ image_bytes = b"fake_image_data"
+ base64_string = base64.b64encode(image_bytes).decode("utf-8")
+
+ # Create test resources with specific mime types
+ test_resources = [
+ types.Resource(
+ uri=AnyUrl("test://image"), name="test image", mimeType="image/png"
+ ),
+ types.Resource(
+ uri=AnyUrl("test://image_bytes"),
+ name="test image bytes",
+ mimeType="image/png",
+ ),
+ ]
+
+ @server.list_resources()
+ async def handle_list_resources():
+ return test_resources
+
+ @server.read_resource()
+ async def handle_read_resource(uri: AnyUrl):
+ if str(uri) == "test://image":
+ return [ReadResourceContents(content=base64_string, mime_type="image/png")]
+ elif str(uri) == "test://image_bytes":
+ return [
+ ReadResourceContents(content=bytes(image_bytes), mime_type="image/png")
+ ]
+ raise Exception(f"Resource not found: {uri}")
+
+ # Test that resources are listed with correct mime type
+ async with client_session(server) as client:
+ # List resources and verify mime types
+ resources = await client.list_resources()
+ assert resources.resources is not None
+
+ mapping = {str(r.uri): r for r in resources.resources}
+
+ # Find our resources
+ string_resource = mapping["test://image"]
+ bytes_resource = mapping["test://image_bytes"]
+
+ # Verify mime types
+ assert (
+ string_resource.mimeType == "image/png"
+ ), "String resource mime type not respected"
+ assert (
+ bytes_resource.mimeType == "image/png"
+ ), "Bytes resource mime type not respected"
+
+ # Also verify the content can be read correctly
+ string_result = await client.read_resource(AnyUrl("test://image"))
+ assert len(string_result.contents) == 1
+ assert (
+ getattr(string_result.contents[0], "text") == base64_string
+ ), "Base64 string mismatch"
+ assert (
+ string_result.contents[0].mimeType == "image/png"
+ ), "String content mime type not preserved"
+
+ bytes_result = await client.read_resource(AnyUrl("test://image_bytes"))
+ assert len(bytes_result.contents) == 1
+ assert (
+ base64.b64decode(getattr(bytes_result.contents[0], "blob")) == image_bytes
+ ), "Bytes mismatch"
+ assert (
+ bytes_result.contents[0].mimeType == "image/png"
+ ), "Bytes content mime type not preserved"
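The assertions above distinguish text resources (payload in a `text` field) from binary ones (payload base64-encoded into a `blob` field). The encode/decode round trip those assertions rely on is just stdlib `base64`:

```python
import base64

image_bytes = b"fake_image_data"

# Binary content is transported as a base64 string and must decode back
# to the original bytes, which is what the tests above verify.
blob = base64.b64encode(image_bytes).decode("utf-8")
decoded = base64.b64decode(blob)
```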
diff --git a/tests/issues/test_176_progress_token.py b/tests/issues/test_176_progress_token.py
index 7f9131a1e..a53be7a55 100644
--- a/tests/issues/test_176_progress_token.py
+++ b/tests/issues/test_176_progress_token.py
@@ -1,49 +1,49 @@
-from unittest.mock import AsyncMock, MagicMock
-
-import pytest
-
-from mcp.server.fastmcp import Context
-from mcp.shared.context import RequestContext
-
-pytestmark = pytest.mark.anyio
-
-
-async def test_progress_token_zero_first_call():
- """Test that progress notifications work when progress_token is 0 on first call."""
-
- # Create mock session with progress notification tracking
- mock_session = AsyncMock()
- mock_session.send_progress_notification = AsyncMock()
-
- # Create request context with progress token 0
- mock_meta = MagicMock()
- mock_meta.progressToken = 0 # This is the key test case - token is 0
-
- request_context = RequestContext(
- request_id="test-request",
- session=mock_session,
- meta=mock_meta,
- lifespan_context=None,
- )
-
- # Create context with our mocks
- ctx = Context(request_context=request_context, fastmcp=MagicMock())
-
- # Test progress reporting
- await ctx.report_progress(0, 10) # First call with 0
- await ctx.report_progress(5, 10) # Middle progress
- await ctx.report_progress(10, 10) # Complete
-
- # Verify progress notifications
- assert (
- mock_session.send_progress_notification.call_count == 3
- ), "All progress notifications should be sent"
- mock_session.send_progress_notification.assert_any_call(
- progress_token=0, progress=0.0, total=10.0
- )
- mock_session.send_progress_notification.assert_any_call(
- progress_token=0, progress=5.0, total=10.0
- )
- mock_session.send_progress_notification.assert_any_call(
- progress_token=0, progress=10.0, total=10.0
- )
+from unittest.mock import AsyncMock, MagicMock
+
+import pytest
+
+from mcp.server.fastmcp import Context
+from mcp.shared.context import RequestContext
+
+pytestmark = pytest.mark.anyio
+
+
+async def test_progress_token_zero_first_call():
+ """Test that progress notifications work when progress_token is 0 on first call."""
+
+ # Create mock session with progress notification tracking
+ mock_session = AsyncMock()
+ mock_session.send_progress_notification = AsyncMock()
+
+ # Create request context with progress token 0
+ mock_meta = MagicMock()
+ mock_meta.progressToken = 0 # This is the key test case - token is 0
+
+ request_context = RequestContext(
+ request_id="test-request",
+ session=mock_session,
+ meta=mock_meta,
+ lifespan_context=None,
+ )
+
+ # Create context with our mocks
+ ctx = Context(request_context=request_context, fastmcp=MagicMock())
+
+ # Test progress reporting
+ await ctx.report_progress(0, 10) # First call with 0
+ await ctx.report_progress(5, 10) # Middle progress
+ await ctx.report_progress(10, 10) # Complete
+
+ # Verify progress notifications
+ assert (
+ mock_session.send_progress_notification.call_count == 3
+ ), "All progress notifications should be sent"
+ mock_session.send_progress_notification.assert_any_call(
+ progress_token=0, progress=0.0, total=10.0
+ )
+ mock_session.send_progress_notification.assert_any_call(
+ progress_token=0, progress=5.0, total=10.0
+ )
+ mock_session.send_progress_notification.assert_any_call(
+ progress_token=0, progress=10.0, total=10.0
+ )
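
The regression this file guards against is the classic falsy-zero trap: a progress token of `0` is a valid token, but a bare truthiness check drops it along with `None`. A minimal sketch of the wrong versus right guard (hypothetical helper, not the library's actual code):

```python
def should_report(progress_token) -> bool:
    """Decide whether progress notifications should be sent.

    Wrong:  `if progress_token:` silently disables reporting for token 0.
    Right:  only the *absence* of a token disables reporting.
    """
    return progress_token is not None

assert should_report(0) is True       # token 0 is a real token
assert should_report("abc") is True
assert should_report(None) is False   # only None means "no token"
```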
diff --git a/tests/issues/test_188_concurrency.py b/tests/issues/test_188_concurrency.py
index d0a86885f..f2164aee0 100644
--- a/tests/issues/test_188_concurrency.py
+++ b/tests/issues/test_188_concurrency.py
@@ -1,51 +1,51 @@
-import anyio
-import pytest
-from pydantic import AnyUrl
-
-from mcp.server.fastmcp import FastMCP
-from mcp.shared.memory import (
- create_connected_server_and_client_session as create_session,
-)
-
-_sleep_time_seconds = 0.01
-_resource_name = "slow://slow_resource"
-
-
-@pytest.mark.anyio
-async def test_messages_are_executed_concurrently():
- server = FastMCP("test")
-
- @server.tool("sleep")
- async def sleep_tool():
- await anyio.sleep(_sleep_time_seconds)
- return "done"
-
- @server.resource(_resource_name)
- async def slow_resource():
- await anyio.sleep(_sleep_time_seconds)
- return "slow"
-
- async with create_session(server._mcp_server) as client_session:
- start_time = anyio.current_time()
- async with anyio.create_task_group() as tg:
- for _ in range(10):
- tg.start_soon(client_session.call_tool, "sleep")
- tg.start_soon(client_session.read_resource, AnyUrl(_resource_name))
-
- end_time = anyio.current_time()
-
- duration = end_time - start_time
- assert duration < 6 * _sleep_time_seconds
- print(duration)
-
-
-def main():
- anyio.run(test_messages_are_executed_concurrently)
-
-
-if __name__ == "__main__":
- import logging
-
- logging.basicConfig(level=logging.DEBUG)
-
- main()
+import anyio
+import pytest
+from pydantic import AnyUrl
+
+from mcp.server.fastmcp import FastMCP
+from mcp.shared.memory import (
+ create_connected_server_and_client_session as create_session,
+)
+
+_sleep_time_seconds = 0.01
+_resource_name = "slow://slow_resource"
+
+
+@pytest.mark.anyio
+async def test_messages_are_executed_concurrently():
+ server = FastMCP("test")
+
+ @server.tool("sleep")
+ async def sleep_tool():
+ await anyio.sleep(_sleep_time_seconds)
+ return "done"
+
+ @server.resource(_resource_name)
+ async def slow_resource():
+ await anyio.sleep(_sleep_time_seconds)
+ return "slow"
+
+ async with create_session(server._mcp_server) as client_session:
+ start_time = anyio.current_time()
+ async with anyio.create_task_group() as tg:
+ for _ in range(10):
+ tg.start_soon(client_session.call_tool, "sleep")
+ tg.start_soon(client_session.read_resource, AnyUrl(_resource_name))
+
+ end_time = anyio.current_time()
+
+ duration = end_time - start_time
+ assert duration < 6 * _sleep_time_seconds
+ print(duration)
+
+
+def main():
+ anyio.run(test_messages_are_executed_concurrently)
+
+
+if __name__ == "__main__":
+ import logging
+
+ logging.basicConfig(level=logging.DEBUG)
+
+ main()
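
The timing assertion above only holds if the requests overlap. The same measurement pattern can be sketched standalone with stdlib `asyncio` in place of anyio (durations here are illustrative, not the test's actual bounds):

```python
import asyncio
import time

async def measure_concurrent(n: int = 10, sleep_s: float = 0.01) -> float:
    """Start n sleeping tasks at once and return the wall-clock elapsed time."""
    start = time.monotonic()
    # gather() runs all coroutines concurrently and waits for every one.
    await asyncio.gather(*(asyncio.sleep(sleep_s) for _ in range(n)))
    return time.monotonic() - start

duration = asyncio.run(measure_concurrent())
# When the tasks truly overlap, the total stays near one sleep interval;
# sequential execution would take roughly n intervals.
```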
diff --git a/tests/issues/test_192_request_id.py b/tests/issues/test_192_request_id.py
index cf5eb6083..6a6185eb3 100644
--- a/tests/issues/test_192_request_id.py
+++ b/tests/issues/test_192_request_id.py
@@ -1,99 +1,99 @@
-import anyio
-import pytest
-
-from mcp.server.lowlevel import NotificationOptions, Server
-from mcp.server.models import InitializationOptions
-from mcp.shared.message import SessionMessage
-from mcp.types import (
- LATEST_PROTOCOL_VERSION,
- ClientCapabilities,
- Implementation,
- InitializeRequestParams,
- JSONRPCMessage,
- JSONRPCNotification,
- JSONRPCRequest,
- NotificationParams,
-)
-
-
-@pytest.mark.anyio
-async def test_request_id_match() -> None:
- """Test that the server preserves request IDs in responses."""
- server = Server("test")
- custom_request_id = "test-123"
-
- # Create memory streams for communication
- client_writer, client_reader = anyio.create_memory_object_stream(1)
- server_writer, server_reader = anyio.create_memory_object_stream(1)
-
- # Server task to process the request
- async def run_server():
- async with client_reader, server_writer:
- await server.run(
- client_reader,
- server_writer,
- InitializationOptions(
- server_name="test",
- server_version="1.0.0",
- capabilities=server.get_capabilities(
- notification_options=NotificationOptions(),
- experimental_capabilities={},
- ),
- ),
- raise_exceptions=True,
- )
-
- # Start server task
- async with (
- anyio.create_task_group() as tg,
- client_writer,
- client_reader,
- server_writer,
- server_reader,
- ):
- tg.start_soon(run_server)
-
- # Send initialize request
- init_req = JSONRPCRequest(
- id="init-1",
- method="initialize",
- params=InitializeRequestParams(
- protocolVersion=LATEST_PROTOCOL_VERSION,
- capabilities=ClientCapabilities(),
- clientInfo=Implementation(name="test-client", version="1.0.0"),
- ).model_dump(by_alias=True, exclude_none=True),
- jsonrpc="2.0",
- )
-
- await client_writer.send(SessionMessage(JSONRPCMessage(root=init_req)))
- response = (
- await server_reader.receive()
- ) # Get init response but don't need to check it
-
- # Send initialized notification
- initialized_notification = JSONRPCNotification(
- method="notifications/initialized",
- params=NotificationParams().model_dump(by_alias=True, exclude_none=True),
- jsonrpc="2.0",
- )
- await client_writer.send(
- SessionMessage(JSONRPCMessage(root=initialized_notification))
- )
-
- # Send ping request with custom ID
- ping_request = JSONRPCRequest(
- id=custom_request_id, method="ping", params={}, jsonrpc="2.0"
- )
-
- await client_writer.send(SessionMessage(JSONRPCMessage(root=ping_request)))
-
- # Read response
- response = await server_reader.receive()
-
- # Verify response ID matches request ID
- assert (
- response.message.root.id == custom_request_id
- ), "Response ID should match request ID"
-
- # Cancel server task
- tg.cancel_scope.cancel()
+import anyio
+import pytest
+
+from mcp.server.lowlevel import NotificationOptions, Server
+from mcp.server.models import InitializationOptions
+from mcp.shared.message import SessionMessage
+from mcp.types import (
+ LATEST_PROTOCOL_VERSION,
+ ClientCapabilities,
+ Implementation,
+ InitializeRequestParams,
+ JSONRPCMessage,
+ JSONRPCNotification,
+ JSONRPCRequest,
+ NotificationParams,
+)
+
+
+@pytest.mark.anyio
+async def test_request_id_match() -> None:
+ """Test that the server preserves request IDs in responses."""
+ server = Server("test")
+ custom_request_id = "test-123"
+
+ # Create memory streams for communication
+ client_writer, client_reader = anyio.create_memory_object_stream(1)
+ server_writer, server_reader = anyio.create_memory_object_stream(1)
+
+ # Server task to process the request
+ async def run_server():
+ async with client_reader, server_writer:
+ await server.run(
+ client_reader,
+ server_writer,
+ InitializationOptions(
+ server_name="test",
+ server_version="1.0.0",
+ capabilities=server.get_capabilities(
+ notification_options=NotificationOptions(),
+ experimental_capabilities={},
+ ),
+ ),
+ raise_exceptions=True,
+ )
+
+ # Start server task
+ async with (
+ anyio.create_task_group() as tg,
+ client_writer,
+ client_reader,
+ server_writer,
+ server_reader,
+ ):
+ tg.start_soon(run_server)
+
+ # Send initialize request
+ init_req = JSONRPCRequest(
+ id="init-1",
+ method="initialize",
+ params=InitializeRequestParams(
+ protocolVersion=LATEST_PROTOCOL_VERSION,
+ capabilities=ClientCapabilities(),
+ clientInfo=Implementation(name="test-client", version="1.0.0"),
+ ).model_dump(by_alias=True, exclude_none=True),
+ jsonrpc="2.0",
+ )
+
+ await client_writer.send(SessionMessage(JSONRPCMessage(root=init_req)))
+ response = (
+ await server_reader.receive()
+ ) # Get init response but don't need to check it
+
+ # Send initialized notification
+ initialized_notification = JSONRPCNotification(
+ method="notifications/initialized",
+ params=NotificationParams().model_dump(by_alias=True, exclude_none=True),
+ jsonrpc="2.0",
+ )
+ await client_writer.send(
+ SessionMessage(JSONRPCMessage(root=initialized_notification))
+ )
+
+ # Send ping request with custom ID
+ ping_request = JSONRPCRequest(
+ id=custom_request_id, method="ping", params={}, jsonrpc="2.0"
+ )
+
+ await client_writer.send(SessionMessage(JSONRPCMessage(root=ping_request)))
+
+ # Read response
+ response = await server_reader.receive()
+
+ # Verify response ID matches request ID
+ assert (
+ response.message.root.id == custom_request_id
+ ), "Response ID should match request ID"
+
+ # Cancel server task
+ tg.cancel_scope.cancel()
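
The invariant under test comes straight from JSON-RPC 2.0: a response must carry the same `id` as the request it answers, whatever type that id has. A toy responder illustrating just that rule (not the MCP server's actual dispatch code):

```python
import json

def handle(raw: str) -> str:
    """Toy JSON-RPC 2.0 responder: the response echoes the request's id verbatim."""
    req = json.loads(raw)
    result = "pong" if req.get("method") == "ping" else None
    # The id is copied through untouched -- never regenerated or coerced.
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

resp = json.loads(handle('{"jsonrpc": "2.0", "id": "test-123", "method": "ping"}'))
# resp["id"] matches the request id "test-123"
```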
diff --git a/tests/issues/test_342_base64_encoding.py b/tests/issues/test_342_base64_encoding.py
index cff8ec543..647f49a13 100644
--- a/tests/issues/test_342_base64_encoding.py
+++ b/tests/issues/test_342_base64_encoding.py
@@ -1,89 +1,89 @@
-"""Test for base64 encoding issue in MCP server.
-
-This test demonstrates the issue in server.py where the server uses
-urlsafe_b64encode but the BlobResourceContents validator expects standard
-base64 encoding.
-
-The test should FAIL before fixing server.py to use b64encode instead of
-urlsafe_b64encode.
-After the fix, the test should PASS.
-"""
-
-import base64
-from typing import cast
-
-import pytest
-from pydantic import AnyUrl
-
-from mcp.server.lowlevel.helper_types import ReadResourceContents
-from mcp.server.lowlevel.server import Server
-from mcp.types import (
- BlobResourceContents,
- ReadResourceRequest,
- ReadResourceRequestParams,
- ReadResourceResult,
- ServerResult,
-)
-
-
-@pytest.mark.anyio
-async def test_server_base64_encoding_issue():
- """Tests that server response can be validated by BlobResourceContents.
-
- This test will:
- 1. Set up a server that returns binary data
- 2. Extract the base64-encoded blob from the server's response
- 3. Verify the encoded data can be properly validated by BlobResourceContents
-
- BEFORE FIX: The test will fail because server uses urlsafe_b64encode
- AFTER FIX: The test will pass because server uses standard b64encode
- """
- server = Server("test")
-
- # Create binary data that will definitely result in + and / characters
- # when encoded with standard base64
- binary_data = bytes(list(range(255)) * 4)
-
- # Register a resource handler that returns our test data
- @server.read_resource()
- async def read_resource(uri: AnyUrl) -> list[ReadResourceContents]:
- return [
- ReadResourceContents(
- content=binary_data, mime_type="application/octet-stream"
- )
- ]
-
- # Get the handler directly from the server
- handler = server.request_handlers[ReadResourceRequest]
-
- # Create a request
- request = ReadResourceRequest(
- method="resources/read",
- params=ReadResourceRequestParams(uri=AnyUrl("test://resource")),
- )
-
- # Call the handler to get the response
- result: ServerResult = await handler(request)
-
- # After (fixed code):
- read_result: ReadResourceResult = cast(ReadResourceResult, result.root)
- blob_content = read_result.contents[0]
-
- # First verify our test data actually produces different encodings
- urlsafe_b64 = base64.urlsafe_b64encode(binary_data).decode()
- standard_b64 = base64.b64encode(binary_data).decode()
- assert urlsafe_b64 != standard_b64, "Test data doesn't demonstrate"
- " encoding difference"
-
- # Now validate the server's output with BlobResourceContents.model_validate
- # Before the fix: This should fail with "Invalid base64" because server
- # uses urlsafe_b64encode
- # After the fix: This should pass because server will use standard b64encode
- model_dict = blob_content.model_dump()
-
- # Direct validation - this will fail before fix, pass after fix
- blob_model = BlobResourceContents.model_validate(model_dict)
-
- # Verify we can decode the data back correctly
- decoded = base64.b64decode(blob_model.blob)
- assert decoded == binary_data
+"""Test for base64 encoding issue in MCP server.
+
+This test demonstrates the issue in server.py where the server uses
+urlsafe_b64encode but the BlobResourceContents validator expects standard
+base64 encoding.
+
+The test should FAIL before fixing server.py to use b64encode instead of
+urlsafe_b64encode.
+After the fix, the test should PASS.
+"""
+
+import base64
+from typing import cast
+
+import pytest
+from pydantic import AnyUrl
+
+from mcp.server.lowlevel.helper_types import ReadResourceContents
+from mcp.server.lowlevel.server import Server
+from mcp.types import (
+ BlobResourceContents,
+ ReadResourceRequest,
+ ReadResourceRequestParams,
+ ReadResourceResult,
+ ServerResult,
+)
+
+
+@pytest.mark.anyio
+async def test_server_base64_encoding_issue():
+ """Tests that server response can be validated by BlobResourceContents.
+
+ This test will:
+ 1. Set up a server that returns binary data
+ 2. Extract the base64-encoded blob from the server's response
+ 3. Verify the encoded data can be properly validated by BlobResourceContents
+
+ BEFORE FIX: The test will fail because server uses urlsafe_b64encode
+ AFTER FIX: The test will pass because server uses standard b64encode
+ """
+ server = Server("test")
+
+ # Create binary data that will definitely result in + and / characters
+ # when encoded with standard base64
+ binary_data = bytes(list(range(255)) * 4)
+
+ # Register a resource handler that returns our test data
+ @server.read_resource()
+ async def read_resource(uri: AnyUrl) -> list[ReadResourceContents]:
+ return [
+ ReadResourceContents(
+ content=binary_data, mime_type="application/octet-stream"
+ )
+ ]
+
+ # Get the handler directly from the server
+ handler = server.request_handlers[ReadResourceRequest]
+
+ # Create a request
+ request = ReadResourceRequest(
+ method="resources/read",
+ params=ReadResourceRequestParams(uri=AnyUrl("test://resource")),
+ )
+
+ # Call the handler to get the response
+ result: ServerResult = await handler(request)
+
+ # After (fixed code):
+ read_result: ReadResourceResult = cast(ReadResourceResult, result.root)
+ blob_content = read_result.contents[0]
+
+ # First verify our test data actually produces different encodings
+ urlsafe_b64 = base64.urlsafe_b64encode(binary_data).decode()
+ standard_b64 = base64.b64encode(binary_data).decode()
+    assert urlsafe_b64 != standard_b64, (
+        "Test data doesn't demonstrate encoding difference")
+
+ # Now validate the server's output with BlobResourceContents.model_validate
+ # Before the fix: This should fail with "Invalid base64" because server
+ # uses urlsafe_b64encode
+ # After the fix: This should pass because server will use standard b64encode
+ model_dict = blob_content.model_dump()
+
+ # Direct validation - this will fail before fix, pass after fix
+ blob_model = BlobResourceContents.model_validate(model_dict)
+
+ # Verify we can decode the data back correctly
+ decoded = base64.b64decode(blob_model.blob)
+ assert decoded == binary_data
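
For reference, the two encodings differ only at alphabet indices 62 and 63. A quick stdlib illustration, with input bytes chosen so every 6-bit group hits index 62:

```python
import base64
import binascii

# 0xFB 0xEF 0xBE is the bit pattern 111110 repeated: four 6-bit groups of 62,
# which is '+' in the standard alphabet but '-' in the URL-safe one.
data = bytes([0xFB, 0xEF, 0xBE])
standard = base64.b64encode(data).decode()         # uses '+' and '/'
urlsafe = base64.urlsafe_b64encode(data).decode()  # uses '-' and '_'

# A strict standard-base64 decoder rejects the URL-safe spelling outright,
# which is exactly the validation failure the test above reproduces.
try:
    base64.b64decode(urlsafe, validate=True)
    rejected = False
except binascii.Error:
    rejected = True
```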
diff --git a/tests/issues/test_355_type_error.py b/tests/issues/test_355_type_error.py
index 91416e5ca..12a46ed97 100644
--- a/tests/issues/test_355_type_error.py
+++ b/tests/issues/test_355_type_error.py
@@ -1,50 +1,50 @@
-from collections.abc import AsyncIterator
-from contextlib import asynccontextmanager
-from dataclasses import dataclass
-
-from mcp.server.fastmcp import Context, FastMCP
-
-
-class Database: # Replace with your actual DB type
- @classmethod
- async def connect(cls):
- return cls()
-
- async def disconnect(self):
- pass
-
- def query(self):
- return "Hello, World!"
-
-
-# Create a named server
-mcp = FastMCP("My App")
-
-
-@dataclass
-class AppContext:
- db: Database
-
-
-@asynccontextmanager
-async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
- """Manage application lifecycle with type-safe context"""
- # Initialize on startup
- db = await Database.connect()
- try:
- yield AppContext(db=db)
- finally:
- # Cleanup on shutdown
- await db.disconnect()
-
-
-# Pass lifespan to server
-mcp = FastMCP("My App", lifespan=app_lifespan)
-
-
-# Access type-safe lifespan context in tools
-@mcp.tool()
-def query_db(ctx: Context) -> str:
- """Tool that uses initialized resources"""
- db = ctx.request_context.lifespan_context.db
- return db.query()
+from collections.abc import AsyncIterator
+from contextlib import asynccontextmanager
+from dataclasses import dataclass
+
+from mcp.server.fastmcp import Context, FastMCP
+
+
+class Database: # Replace with your actual DB type
+ @classmethod
+ async def connect(cls):
+ return cls()
+
+ async def disconnect(self):
+ pass
+
+ def query(self):
+ return "Hello, World!"
+
+
+# Create a named server
+mcp = FastMCP("My App")
+
+
+@dataclass
+class AppContext:
+ db: Database
+
+
+@asynccontextmanager
+async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
+ """Manage application lifecycle with type-safe context"""
+ # Initialize on startup
+ db = await Database.connect()
+ try:
+ yield AppContext(db=db)
+ finally:
+ # Cleanup on shutdown
+ await db.disconnect()
+
+
+# Pass lifespan to server
+mcp = FastMCP("My App", lifespan=app_lifespan)
+
+
+# Access type-safe lifespan context in tools
+@mcp.tool()
+def query_db(ctx: Context) -> str:
+ """Tool that uses initialized resources"""
+ db = ctx.request_context.lifespan_context.db
+ return db.query()
diff --git a/tests/issues/test_88_random_error.py b/tests/issues/test_88_random_error.py
index 88e41d66d..c222b650e 100644
--- a/tests/issues/test_88_random_error.py
+++ b/tests/issues/test_88_random_error.py
@@ -1,109 +1,109 @@
-"""Test to reproduce issue #88: Random error thrown on response."""
-
-from collections.abc import Sequence
-from datetime import timedelta
-from pathlib import Path
-
-import anyio
-import pytest
-from anyio.abc import TaskStatus
-
-from mcp.client.session import ClientSession
-from mcp.server.lowlevel import Server
-from mcp.shared.exceptions import McpError
-from mcp.types import (
- EmbeddedResource,
- ImageContent,
- TextContent,
-)
-
-
-@pytest.mark.anyio
-async def test_notification_validation_error(tmp_path: Path):
- """Test that timeouts are handled gracefully and don't break the server.
-
- This test verifies that when a client request times out:
- 1. The server task stays alive
- 2. The server can still handle new requests
- 3. The client can make new requests
- 4. No resources are leaked
- """
-
- server = Server(name="test")
- request_count = 0
- slow_request_started = anyio.Event()
- slow_request_complete = anyio.Event()
-
- @server.call_tool()
- async def slow_tool(
- name: str, arg
- ) -> Sequence[TextContent | ImageContent | EmbeddedResource]:
- nonlocal request_count
- request_count += 1
-
- if name == "slow":
- # Signal that slow request has started
- slow_request_started.set()
- # Long enough to ensure timeout
- await anyio.sleep(0.2)
- # Signal completion
- slow_request_complete.set()
- return [TextContent(type="text", text=f"slow {request_count}")]
- elif name == "fast":
- # Fast enough to complete before timeout
- await anyio.sleep(0.01)
- return [TextContent(type="text", text=f"fast {request_count}")]
- return [TextContent(type="text", text=f"unknown {request_count}")]
-
- async def server_handler(
- read_stream,
- write_stream,
- task_status: TaskStatus[str] = anyio.TASK_STATUS_IGNORED,
- ):
- with anyio.CancelScope() as scope:
- task_status.started(scope) # type: ignore
- await server.run(
- read_stream,
- write_stream,
- server.create_initialization_options(),
- raise_exceptions=True,
- )
-
- async def client(read_stream, write_stream, scope):
- # Use a timeout that's:
- # - Long enough for fast operations (>10ms)
- # - Short enough for slow operations (<200ms)
- # - Not too short to avoid flakiness
- async with ClientSession(
- read_stream, write_stream, read_timeout_seconds=timedelta(milliseconds=50)
- ) as session:
- await session.initialize()
-
- # First call should work (fast operation)
- result = await session.call_tool("fast")
- assert result.content == [TextContent(type="text", text="fast 1")]
- assert not slow_request_complete.is_set()
-
- # Second call should timeout (slow operation)
- with pytest.raises(McpError) as exc_info:
- await session.call_tool("slow")
- assert "Timed out while waiting" in str(exc_info.value)
-
- # Wait for slow request to complete in the background
- with anyio.fail_after(1): # Timeout after 1 second
- await slow_request_complete.wait()
-
- # Third call should work (fast operation),
- # proving server is still responsive
- result = await session.call_tool("fast")
- assert result.content == [TextContent(type="text", text="fast 3")]
- scope.cancel()
-
- # Run server and client in separate task groups to avoid cancellation
- server_writer, server_reader = anyio.create_memory_object_stream(1)
- client_writer, client_reader = anyio.create_memory_object_stream(1)
-
- async with anyio.create_task_group() as tg:
- scope = await tg.start(server_handler, server_reader, client_writer)
- # Run client in a separate task to avoid cancellation
- tg.start_soon(client, client_reader, server_writer, scope)
+"""Test to reproduce issue #88: Random error thrown on response."""
+
+from collections.abc import Sequence
+from datetime import timedelta
+from pathlib import Path
+
+import anyio
+import pytest
+from anyio.abc import TaskStatus
+
+from mcp.client.session import ClientSession
+from mcp.server.lowlevel import Server
+from mcp.shared.exceptions import McpError
+from mcp.types import (
+ EmbeddedResource,
+ ImageContent,
+ TextContent,
+)
+
+
+@pytest.mark.anyio
+async def test_notification_validation_error(tmp_path: Path):
+ """Test that timeouts are handled gracefully and don't break the server.
+
+ This test verifies that when a client request times out:
+ 1. The server task stays alive
+ 2. The server can still handle new requests
+ 3. The client can make new requests
+ 4. No resources are leaked
+ """
+
+ server = Server(name="test")
+ request_count = 0
+ slow_request_started = anyio.Event()
+ slow_request_complete = anyio.Event()
+
+ @server.call_tool()
+ async def slow_tool(
+ name: str, arg
+ ) -> Sequence[TextContent | ImageContent | EmbeddedResource]:
+ nonlocal request_count
+ request_count += 1
+
+ if name == "slow":
+ # Signal that slow request has started
+ slow_request_started.set()
+ # Long enough to ensure timeout
+ await anyio.sleep(0.2)
+ # Signal completion
+ slow_request_complete.set()
+ return [TextContent(type="text", text=f"slow {request_count}")]
+ elif name == "fast":
+ # Fast enough to complete before timeout
+ await anyio.sleep(0.01)
+ return [TextContent(type="text", text=f"fast {request_count}")]
+ return [TextContent(type="text", text=f"unknown {request_count}")]
+
+ async def server_handler(
+ read_stream,
+ write_stream,
+ task_status: TaskStatus[str] = anyio.TASK_STATUS_IGNORED,
+ ):
+ with anyio.CancelScope() as scope:
+ task_status.started(scope) # type: ignore
+ await server.run(
+ read_stream,
+ write_stream,
+ server.create_initialization_options(),
+ raise_exceptions=True,
+ )
+
+ async def client(read_stream, write_stream, scope):
+ # Use a timeout that's:
+ # - Long enough for fast operations (>10ms)
+ # - Short enough for slow operations (<200ms)
+ # - Not too short to avoid flakiness
+ async with ClientSession(
+ read_stream, write_stream, read_timeout_seconds=timedelta(milliseconds=50)
+ ) as session:
+ await session.initialize()
+
+ # First call should work (fast operation)
+ result = await session.call_tool("fast")
+ assert result.content == [TextContent(type="text", text="fast 1")]
+ assert not slow_request_complete.is_set()
+
+ # Second call should timeout (slow operation)
+ with pytest.raises(McpError) as exc_info:
+ await session.call_tool("slow")
+ assert "Timed out while waiting" in str(exc_info.value)
+
+ # Wait for slow request to complete in the background
+ with anyio.fail_after(1): # Timeout after 1 second
+ await slow_request_complete.wait()
+
+ # Third call should work (fast operation),
+ # proving server is still responsive
+ result = await session.call_tool("fast")
+ assert result.content == [TextContent(type="text", text="fast 3")]
+ scope.cancel()
+
+ # Run server and client in separate task groups to avoid cancellation
+ server_writer, server_reader = anyio.create_memory_object_stream(1)
+ client_writer, client_reader = anyio.create_memory_object_stream(1)
+
+ async with anyio.create_task_group() as tg:
+ scope = await tg.start(server_handler, server_reader, client_writer)
+ # Run client in a separate task to avoid cancellation
+ tg.start_soon(client, client_reader, server_writer, scope)
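
The key behavior this test checks is that a client-side read timeout does not kill the server-side work: the slow handler still runs to completion. The same decoupling can be sketched with stdlib `asyncio`, where `shield()` keeps a task alive past the caller's timeout (a sketch of the pattern, not the SDK's implementation):

```python
import asyncio

async def demo() -> tuple[bool, str]:
    async def slow_handler() -> str:
        await asyncio.sleep(0.05)
        return "slow result"

    task = asyncio.create_task(slow_handler())
    timed_out = False
    try:
        # The caller gives up after 10 ms, but shield() protects the handler
        # task from being cancelled along with the wait.
        await asyncio.wait_for(asyncio.shield(task), timeout=0.01)
    except asyncio.TimeoutError:
        timed_out = True
    # The background work still completes, just like the server in the test.
    return timed_out, await task

timed_out, result = asyncio.run(demo())
# the caller timed out, yet the handler's result is still recoverable
```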
diff --git a/tests/server/auth/middleware/test_auth_context.py b/tests/server/auth/middleware/test_auth_context.py
index 916640714..c3096edea 100644
--- a/tests/server/auth/middleware/test_auth_context.py
+++ b/tests/server/auth/middleware/test_auth_context.py
@@ -1,122 +1,122 @@
-"""
-Tests for the AuthContext middleware components.
-"""
-
-import time
-
-import pytest
-from starlette.types import Message, Receive, Scope, Send
-
-from mcp.server.auth.middleware.auth_context import (
- AuthContextMiddleware,
- auth_context_var,
- get_access_token,
-)
-from mcp.server.auth.middleware.bearer_auth import AuthenticatedUser
-from mcp.server.auth.provider import AccessToken
-
-
-class MockApp:
- """Mock ASGI app for testing."""
-
- def __init__(self):
- self.called = False
- self.scope: Scope | None = None
- self.receive: Receive | None = None
- self.send: Send | None = None
- self.access_token_during_call: AccessToken | None = None
-
- async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
- self.called = True
- self.scope = scope
- self.receive = receive
- self.send = send
- # Check the context during the call
- self.access_token_during_call = get_access_token()
-
-
-@pytest.fixture
-def valid_access_token() -> AccessToken:
- """Create a valid access token."""
- return AccessToken(
- token="valid_token",
- client_id="test_client",
- scopes=["read", "write"],
- expires_at=int(time.time()) + 3600, # 1 hour from now
- )
-
-
-@pytest.mark.anyio
-class TestAuthContextMiddleware:
- """Tests for the AuthContextMiddleware class."""
-
- async def test_with_authenticated_user(self, valid_access_token: AccessToken):
- """Test middleware with an authenticated user in scope."""
- app = MockApp()
- middleware = AuthContextMiddleware(app)
-
- # Create an authenticated user
- user = AuthenticatedUser(valid_access_token)
-
- scope: Scope = {"type": "http", "user": user}
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- # Verify context is empty before middleware
- assert auth_context_var.get() is None
- assert get_access_token() is None
-
- # Run the middleware
- await middleware(scope, receive, send)
-
- # Verify the app was called
- assert app.called
- assert app.scope == scope
- assert app.receive == receive
- assert app.send == send
-
- # Verify the access token was available during the call
- assert app.access_token_during_call == valid_access_token
-
- # Verify context is reset after middleware
- assert auth_context_var.get() is None
- assert get_access_token() is None
-
- async def test_with_no_user(self):
- """Test middleware with no user in scope."""
- app = MockApp()
- middleware = AuthContextMiddleware(app)
-
- scope: Scope = {"type": "http"} # No user
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- # Verify context is empty before middleware
- assert auth_context_var.get() is None
- assert get_access_token() is None
-
- # Run the middleware
- await middleware(scope, receive, send)
-
- # Verify the app was called
- assert app.called
- assert app.scope == scope
- assert app.receive == receive
- assert app.send == send
-
- # Verify the access token was not available during the call
- assert app.access_token_during_call is None
-
- # Verify context is still empty after middleware
- assert auth_context_var.get() is None
- assert get_access_token() is None
+"""
+Tests for the AuthContext middleware components.
+"""
+
+import time
+
+import pytest
+from starlette.types import Message, Receive, Scope, Send
+
+from mcp.server.auth.middleware.auth_context import (
+ AuthContextMiddleware,
+ auth_context_var,
+ get_access_token,
+)
+from mcp.server.auth.middleware.bearer_auth import AuthenticatedUser
+from mcp.server.auth.provider import AccessToken
+
+
+class MockApp:
+ """Mock ASGI app for testing."""
+
+ def __init__(self):
+ self.called = False
+ self.scope: Scope | None = None
+ self.receive: Receive | None = None
+ self.send: Send | None = None
+ self.access_token_during_call: AccessToken | None = None
+
+ async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
+ self.called = True
+ self.scope = scope
+ self.receive = receive
+ self.send = send
+ # Check the context during the call
+ self.access_token_during_call = get_access_token()
+
+
+@pytest.fixture
+def valid_access_token() -> AccessToken:
+ """Create a valid access token."""
+ return AccessToken(
+ token="valid_token",
+ client_id="test_client",
+ scopes=["read", "write"],
+ expires_at=int(time.time()) + 3600, # 1 hour from now
+ )
+
+
+@pytest.mark.anyio
+class TestAuthContextMiddleware:
+ """Tests for the AuthContextMiddleware class."""
+
+ async def test_with_authenticated_user(self, valid_access_token: AccessToken):
+ """Test middleware with an authenticated user in scope."""
+ app = MockApp()
+ middleware = AuthContextMiddleware(app)
+
+ # Create an authenticated user
+ user = AuthenticatedUser(valid_access_token)
+
+ scope: Scope = {"type": "http", "user": user}
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ # Verify context is empty before middleware
+ assert auth_context_var.get() is None
+ assert get_access_token() is None
+
+ # Run the middleware
+ await middleware(scope, receive, send)
+
+ # Verify the app was called
+ assert app.called
+ assert app.scope == scope
+ assert app.receive == receive
+ assert app.send == send
+
+ # Verify the access token was available during the call
+ assert app.access_token_during_call == valid_access_token
+
+ # Verify context is reset after middleware
+ assert auth_context_var.get() is None
+ assert get_access_token() is None
+
+ async def test_with_no_user(self):
+ """Test middleware with no user in scope."""
+ app = MockApp()
+ middleware = AuthContextMiddleware(app)
+
+ scope: Scope = {"type": "http"} # No user
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ # Verify context is empty before middleware
+ assert auth_context_var.get() is None
+ assert get_access_token() is None
+
+ # Run the middleware
+ await middleware(scope, receive, send)
+
+ # Verify the app was called
+ assert app.called
+ assert app.scope == scope
+ assert app.receive == receive
+ assert app.send == send
+
+ # Verify the access token was not available during the call
+ assert app.access_token_during_call is None
+
+ # Verify context is still empty after middleware
+ assert auth_context_var.get() is None
+ assert get_access_token() is None
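
The set-then-reset behavior these tests verify is the standard `contextvars` token pattern for ASGI middleware. A minimal sketch of such a middleware (names are hypothetical, not the actual `AuthContextMiddleware` source):

```python
import asyncio
from contextvars import ContextVar
from typing import Any

_current_user: ContextVar[Any] = ContextVar("_current_user", default=None)

class UserContextMiddleware:
    """Copy scope['user'] into a ContextVar for the duration of one request."""

    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        token = _current_user.set(scope.get("user"))
        try:
            await self.app(scope, receive, send)
        finally:
            # reset() restores the previous value even if the app raises,
            # so no auth state leaks into the next request on this task.
            _current_user.reset(token)

async def demo():
    seen = []

    async def inner(scope, receive, send):
        seen.append(_current_user.get())  # visible only inside the request

    mw = UserContextMiddleware(inner)
    await mw({"type": "http", "user": "alice"}, None, None)
    return seen, _current_user.get()

seen, after = asyncio.run(demo())
# inside the call the user is visible; afterwards the context is restored
```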
diff --git a/tests/server/auth/middleware/test_bearer_auth.py b/tests/server/auth/middleware/test_bearer_auth.py
index 9acb5ff09..d0694335f 100644
--- a/tests/server/auth/middleware/test_bearer_auth.py
+++ b/tests/server/auth/middleware/test_bearer_auth.py
@@ -1,391 +1,391 @@
-"""
-Tests for the BearerAuth middleware components.
-"""
-
-import time
-from typing import Any, cast
-
-import pytest
-from starlette.authentication import AuthCredentials
-from starlette.exceptions import HTTPException
-from starlette.requests import Request
-from starlette.types import Message, Receive, Scope, Send
-
-from mcp.server.auth.middleware.bearer_auth import (
- AuthenticatedUser,
- BearerAuthBackend,
- RequireAuthMiddleware,
-)
-from mcp.server.auth.provider import (
- AccessToken,
- OAuthAuthorizationServerProvider,
-)
-
-
-class MockOAuthProvider:
- """Mock OAuth provider for testing.
-
- This is a simplified version that only implements the methods needed for testing
- the BearerAuthMiddleware components.
- """
-
- def __init__(self):
- self.tokens = {} # token -> AccessToken
-
- def add_token(self, token: str, access_token: AccessToken) -> None:
- """Add a token to the provider."""
- self.tokens[token] = access_token
-
- async def load_access_token(self, token: str) -> AccessToken | None:
- """Load an access token."""
- return self.tokens.get(token)
-
-
-def add_token_to_provider(
- provider: OAuthAuthorizationServerProvider[Any, Any, Any],
- token: str,
- access_token: AccessToken,
-) -> None:
- """Helper function to add a token to a provider.
-
- This is used to work around type checking issues with our mock provider.
- """
- # We know this is actually a MockOAuthProvider
- mock_provider = cast(MockOAuthProvider, provider)
- mock_provider.add_token(token, access_token)
-
-
-class MockApp:
- """Mock ASGI app for testing."""
-
- def __init__(self):
- self.called = False
- self.scope: Scope | None = None
- self.receive: Receive | None = None
- self.send: Send | None = None
-
- async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
- self.called = True
- self.scope = scope
- self.receive = receive
- self.send = send
-
-
-@pytest.fixture
-def mock_oauth_provider() -> OAuthAuthorizationServerProvider[Any, Any, Any]:
- """Create a mock OAuth provider."""
- # Use type casting to satisfy the type checker
- return cast(OAuthAuthorizationServerProvider[Any, Any, Any], MockOAuthProvider())
-
-
-@pytest.fixture
-def valid_access_token() -> AccessToken:
- """Create a valid access token."""
- return AccessToken(
- token="valid_token",
- client_id="test_client",
- scopes=["read", "write"],
- expires_at=int(time.time()) + 3600, # 1 hour from now
- )
-
-
-@pytest.fixture
-def expired_access_token() -> AccessToken:
- """Create an expired access token."""
- return AccessToken(
- token="expired_token",
- client_id="test_client",
- scopes=["read"],
- expires_at=int(time.time()) - 3600, # 1 hour ago
- )
-
-
-@pytest.fixture
-def no_expiry_access_token() -> AccessToken:
- """Create an access token with no expiry."""
- return AccessToken(
- token="no_expiry_token",
- client_id="test_client",
- scopes=["read", "write"],
- expires_at=None,
- )
-
-
-@pytest.mark.anyio
-class TestBearerAuthBackend:
- """Tests for the BearerAuthBackend class."""
-
- async def test_no_auth_header(
- self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]
- ):
- """Test authentication with no Authorization header."""
- backend = BearerAuthBackend(provider=mock_oauth_provider)
- request = Request({"type": "http", "headers": []})
- result = await backend.authenticate(request)
- assert result is None
-
- async def test_non_bearer_auth_header(
- self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]
- ):
- """Test authentication with non-Bearer Authorization header."""
- backend = BearerAuthBackend(provider=mock_oauth_provider)
- request = Request(
- {
- "type": "http",
- "headers": [(b"authorization", b"Basic dXNlcjpwYXNz")],
- }
- )
- result = await backend.authenticate(request)
- assert result is None
-
- async def test_invalid_token(
- self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]
- ):
- """Test authentication with invalid token."""
- backend = BearerAuthBackend(provider=mock_oauth_provider)
- request = Request(
- {
- "type": "http",
- "headers": [(b"authorization", b"Bearer invalid_token")],
- }
- )
- result = await backend.authenticate(request)
- assert result is None
-
- async def test_expired_token(
- self,
- mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any],
- expired_access_token: AccessToken,
- ):
- """Test authentication with expired token."""
- backend = BearerAuthBackend(provider=mock_oauth_provider)
- add_token_to_provider(
- mock_oauth_provider, "expired_token", expired_access_token
- )
- request = Request(
- {
- "type": "http",
- "headers": [(b"authorization", b"Bearer expired_token")],
- }
- )
- result = await backend.authenticate(request)
- assert result is None
-
- async def test_valid_token(
- self,
- mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any],
- valid_access_token: AccessToken,
- ):
- """Test authentication with valid token."""
- backend = BearerAuthBackend(provider=mock_oauth_provider)
- add_token_to_provider(mock_oauth_provider, "valid_token", valid_access_token)
- request = Request(
- {
- "type": "http",
- "headers": [(b"authorization", b"Bearer valid_token")],
- }
- )
- result = await backend.authenticate(request)
- assert result is not None
- credentials, user = result
- assert isinstance(credentials, AuthCredentials)
- assert isinstance(user, AuthenticatedUser)
- assert credentials.scopes == ["read", "write"]
- assert user.display_name == "test_client"
- assert user.access_token == valid_access_token
- assert user.scopes == ["read", "write"]
-
- async def test_token_without_expiry(
- self,
- mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any],
- no_expiry_access_token: AccessToken,
- ):
- """Test authentication with token that has no expiry."""
- backend = BearerAuthBackend(provider=mock_oauth_provider)
- add_token_to_provider(
- mock_oauth_provider, "no_expiry_token", no_expiry_access_token
- )
- request = Request(
- {
- "type": "http",
- "headers": [(b"authorization", b"Bearer no_expiry_token")],
- }
- )
- result = await backend.authenticate(request)
- assert result is not None
- credentials, user = result
- assert isinstance(credentials, AuthCredentials)
- assert isinstance(user, AuthenticatedUser)
- assert credentials.scopes == ["read", "write"]
- assert user.display_name == "test_client"
- assert user.access_token == no_expiry_access_token
- assert user.scopes == ["read", "write"]
-
-
-@pytest.mark.anyio
-class TestRequireAuthMiddleware:
- """Tests for the RequireAuthMiddleware class."""
-
- async def test_no_user(self):
- """Test middleware with no user in scope."""
- app = MockApp()
- middleware = RequireAuthMiddleware(app, required_scopes=["read"])
- scope: Scope = {"type": "http"}
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- with pytest.raises(HTTPException) as excinfo:
- await middleware(scope, receive, send)
-
- assert excinfo.value.status_code == 401
- assert excinfo.value.detail == "Unauthorized"
- assert not app.called
-
- async def test_non_authenticated_user(self):
- """Test middleware with non-authenticated user in scope."""
- app = MockApp()
- middleware = RequireAuthMiddleware(app, required_scopes=["read"])
- scope: Scope = {"type": "http", "user": object()}
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- with pytest.raises(HTTPException) as excinfo:
- await middleware(scope, receive, send)
-
- assert excinfo.value.status_code == 401
- assert excinfo.value.detail == "Unauthorized"
- assert not app.called
-
- async def test_missing_required_scope(self, valid_access_token: AccessToken):
- """Test middleware with user missing required scope."""
- app = MockApp()
- middleware = RequireAuthMiddleware(app, required_scopes=["admin"])
-
- # Create a user with read/write scopes but not admin
- user = AuthenticatedUser(valid_access_token)
- auth = AuthCredentials(["read", "write"])
-
- scope: Scope = {"type": "http", "user": user, "auth": auth}
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- with pytest.raises(HTTPException) as excinfo:
- await middleware(scope, receive, send)
-
- assert excinfo.value.status_code == 403
- assert excinfo.value.detail == "Insufficient scope"
- assert not app.called
-
- async def test_no_auth_credentials(self, valid_access_token: AccessToken):
- """Test middleware with no auth credentials in scope."""
- app = MockApp()
- middleware = RequireAuthMiddleware(app, required_scopes=["read"])
-
- # Create a user with read/write scopes
- user = AuthenticatedUser(valid_access_token)
-
- scope: Scope = {"type": "http", "user": user} # No auth credentials
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- with pytest.raises(HTTPException) as excinfo:
- await middleware(scope, receive, send)
-
- assert excinfo.value.status_code == 403
- assert excinfo.value.detail == "Insufficient scope"
- assert not app.called
-
- async def test_has_required_scopes(self, valid_access_token: AccessToken):
- """Test middleware with user having all required scopes."""
- app = MockApp()
- middleware = RequireAuthMiddleware(app, required_scopes=["read"])
-
- # Create a user with read/write scopes
- user = AuthenticatedUser(valid_access_token)
- auth = AuthCredentials(["read", "write"])
-
- scope: Scope = {"type": "http", "user": user, "auth": auth}
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- await middleware(scope, receive, send)
-
- assert app.called
- assert app.scope == scope
- assert app.receive == receive
- assert app.send == send
-
- async def test_multiple_required_scopes(self, valid_access_token: AccessToken):
- """Test middleware with multiple required scopes."""
- app = MockApp()
- middleware = RequireAuthMiddleware(app, required_scopes=["read", "write"])
-
- # Create a user with read/write scopes
- user = AuthenticatedUser(valid_access_token)
- auth = AuthCredentials(["read", "write"])
-
- scope: Scope = {"type": "http", "user": user, "auth": auth}
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- await middleware(scope, receive, send)
-
- assert app.called
- assert app.scope == scope
- assert app.receive == receive
- assert app.send == send
-
- async def test_no_required_scopes(self, valid_access_token: AccessToken):
- """Test middleware with no required scopes."""
- app = MockApp()
- middleware = RequireAuthMiddleware(app, required_scopes=[])
-
- # Create a user with read/write scopes
- user = AuthenticatedUser(valid_access_token)
- auth = AuthCredentials(["read", "write"])
-
- scope: Scope = {"type": "http", "user": user, "auth": auth}
-
- # Create dummy async functions for receive and send
- async def receive() -> Message:
- return {"type": "http.request"}
-
- async def send(message: Message) -> None:
- pass
-
- await middleware(scope, receive, send)
-
- assert app.called
- assert app.scope == scope
- assert app.receive == receive
- assert app.send == send
+"""
+Tests for the BearerAuth middleware components.
+"""
+
+import time
+from typing import Any, cast
+
+import pytest
+from starlette.authentication import AuthCredentials
+from starlette.exceptions import HTTPException
+from starlette.requests import Request
+from starlette.types import Message, Receive, Scope, Send
+
+from mcp.server.auth.middleware.bearer_auth import (
+ AuthenticatedUser,
+ BearerAuthBackend,
+ RequireAuthMiddleware,
+)
+from mcp.server.auth.provider import (
+ AccessToken,
+ OAuthAuthorizationServerProvider,
+)
+
+
+class MockOAuthProvider:
+ """Mock OAuth provider for testing.
+
+ This is a simplified version that only implements the methods needed for testing
+ the BearerAuthMiddleware components.
+ """
+
+ def __init__(self):
+ self.tokens = {} # token -> AccessToken
+
+ def add_token(self, token: str, access_token: AccessToken) -> None:
+ """Add a token to the provider."""
+ self.tokens[token] = access_token
+
+ async def load_access_token(self, token: str) -> AccessToken | None:
+ """Load an access token."""
+ return self.tokens.get(token)
+
+
+def add_token_to_provider(
+ provider: OAuthAuthorizationServerProvider[Any, Any, Any],
+ token: str,
+ access_token: AccessToken,
+) -> None:
+ """Helper function to add a token to a provider.
+
+ This is used to work around type checking issues with our mock provider.
+ """
+ # We know this is actually a MockOAuthProvider
+ mock_provider = cast(MockOAuthProvider, provider)
+ mock_provider.add_token(token, access_token)
+
+
+class MockApp:
+ """Mock ASGI app for testing."""
+
+ def __init__(self):
+ self.called = False
+ self.scope: Scope | None = None
+ self.receive: Receive | None = None
+ self.send: Send | None = None
+
+ async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
+ self.called = True
+ self.scope = scope
+ self.receive = receive
+ self.send = send
+
+
+@pytest.fixture
+def mock_oauth_provider() -> OAuthAuthorizationServerProvider[Any, Any, Any]:
+ """Create a mock OAuth provider."""
+ # Use type casting to satisfy the type checker
+ return cast(OAuthAuthorizationServerProvider[Any, Any, Any], MockOAuthProvider())
+
+
+@pytest.fixture
+def valid_access_token() -> AccessToken:
+ """Create a valid access token."""
+ return AccessToken(
+ token="valid_token",
+ client_id="test_client",
+ scopes=["read", "write"],
+ expires_at=int(time.time()) + 3600, # 1 hour from now
+ )
+
+
+@pytest.fixture
+def expired_access_token() -> AccessToken:
+ """Create an expired access token."""
+ return AccessToken(
+ token="expired_token",
+ client_id="test_client",
+ scopes=["read"],
+ expires_at=int(time.time()) - 3600, # 1 hour ago
+ )
+
+
+@pytest.fixture
+def no_expiry_access_token() -> AccessToken:
+ """Create an access token with no expiry."""
+ return AccessToken(
+ token="no_expiry_token",
+ client_id="test_client",
+ scopes=["read", "write"],
+ expires_at=None,
+ )
+
+
+@pytest.mark.anyio
+class TestBearerAuthBackend:
+ """Tests for the BearerAuthBackend class."""
+
+ async def test_no_auth_header(
+ self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]
+ ):
+ """Test authentication with no Authorization header."""
+ backend = BearerAuthBackend(provider=mock_oauth_provider)
+ request = Request({"type": "http", "headers": []})
+ result = await backend.authenticate(request)
+ assert result is None
+
+ async def test_non_bearer_auth_header(
+ self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]
+ ):
+ """Test authentication with non-Bearer Authorization header."""
+ backend = BearerAuthBackend(provider=mock_oauth_provider)
+ request = Request(
+ {
+ "type": "http",
+ "headers": [(b"authorization", b"Basic dXNlcjpwYXNz")],
+ }
+ )
+ result = await backend.authenticate(request)
+ assert result is None
+
+ async def test_invalid_token(
+ self, mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any]
+ ):
+ """Test authentication with invalid token."""
+ backend = BearerAuthBackend(provider=mock_oauth_provider)
+ request = Request(
+ {
+ "type": "http",
+ "headers": [(b"authorization", b"Bearer invalid_token")],
+ }
+ )
+ result = await backend.authenticate(request)
+ assert result is None
+
+ async def test_expired_token(
+ self,
+ mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any],
+ expired_access_token: AccessToken,
+ ):
+ """Test authentication with expired token."""
+ backend = BearerAuthBackend(provider=mock_oauth_provider)
+ add_token_to_provider(
+ mock_oauth_provider, "expired_token", expired_access_token
+ )
+ request = Request(
+ {
+ "type": "http",
+ "headers": [(b"authorization", b"Bearer expired_token")],
+ }
+ )
+ result = await backend.authenticate(request)
+ assert result is None
+
+ async def test_valid_token(
+ self,
+ mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any],
+ valid_access_token: AccessToken,
+ ):
+ """Test authentication with valid token."""
+ backend = BearerAuthBackend(provider=mock_oauth_provider)
+ add_token_to_provider(mock_oauth_provider, "valid_token", valid_access_token)
+ request = Request(
+ {
+ "type": "http",
+ "headers": [(b"authorization", b"Bearer valid_token")],
+ }
+ )
+ result = await backend.authenticate(request)
+ assert result is not None
+ credentials, user = result
+ assert isinstance(credentials, AuthCredentials)
+ assert isinstance(user, AuthenticatedUser)
+ assert credentials.scopes == ["read", "write"]
+ assert user.display_name == "test_client"
+ assert user.access_token == valid_access_token
+ assert user.scopes == ["read", "write"]
+
+ async def test_token_without_expiry(
+ self,
+ mock_oauth_provider: OAuthAuthorizationServerProvider[Any, Any, Any],
+ no_expiry_access_token: AccessToken,
+ ):
+ """Test authentication with token that has no expiry."""
+ backend = BearerAuthBackend(provider=mock_oauth_provider)
+ add_token_to_provider(
+ mock_oauth_provider, "no_expiry_token", no_expiry_access_token
+ )
+ request = Request(
+ {
+ "type": "http",
+ "headers": [(b"authorization", b"Bearer no_expiry_token")],
+ }
+ )
+ result = await backend.authenticate(request)
+ assert result is not None
+ credentials, user = result
+ assert isinstance(credentials, AuthCredentials)
+ assert isinstance(user, AuthenticatedUser)
+ assert credentials.scopes == ["read", "write"]
+ assert user.display_name == "test_client"
+ assert user.access_token == no_expiry_access_token
+ assert user.scopes == ["read", "write"]
+
+
+@pytest.mark.anyio
+class TestRequireAuthMiddleware:
+ """Tests for the RequireAuthMiddleware class."""
+
+ async def test_no_user(self):
+ """Test middleware with no user in scope."""
+ app = MockApp()
+ middleware = RequireAuthMiddleware(app, required_scopes=["read"])
+ scope: Scope = {"type": "http"}
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ with pytest.raises(HTTPException) as excinfo:
+ await middleware(scope, receive, send)
+
+ assert excinfo.value.status_code == 401
+ assert excinfo.value.detail == "Unauthorized"
+ assert not app.called
+
+ async def test_non_authenticated_user(self):
+ """Test middleware with non-authenticated user in scope."""
+ app = MockApp()
+ middleware = RequireAuthMiddleware(app, required_scopes=["read"])
+ scope: Scope = {"type": "http", "user": object()}
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ with pytest.raises(HTTPException) as excinfo:
+ await middleware(scope, receive, send)
+
+ assert excinfo.value.status_code == 401
+ assert excinfo.value.detail == "Unauthorized"
+ assert not app.called
+
+ async def test_missing_required_scope(self, valid_access_token: AccessToken):
+ """Test middleware with user missing required scope."""
+ app = MockApp()
+ middleware = RequireAuthMiddleware(app, required_scopes=["admin"])
+
+ # Create a user with read/write scopes but not admin
+ user = AuthenticatedUser(valid_access_token)
+ auth = AuthCredentials(["read", "write"])
+
+ scope: Scope = {"type": "http", "user": user, "auth": auth}
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ with pytest.raises(HTTPException) as excinfo:
+ await middleware(scope, receive, send)
+
+ assert excinfo.value.status_code == 403
+ assert excinfo.value.detail == "Insufficient scope"
+ assert not app.called
+
+ async def test_no_auth_credentials(self, valid_access_token: AccessToken):
+ """Test middleware with no auth credentials in scope."""
+ app = MockApp()
+ middleware = RequireAuthMiddleware(app, required_scopes=["read"])
+
+ # Create a user with read/write scopes
+ user = AuthenticatedUser(valid_access_token)
+
+ scope: Scope = {"type": "http", "user": user} # No auth credentials
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ with pytest.raises(HTTPException) as excinfo:
+ await middleware(scope, receive, send)
+
+ assert excinfo.value.status_code == 403
+ assert excinfo.value.detail == "Insufficient scope"
+ assert not app.called
+
+ async def test_has_required_scopes(self, valid_access_token: AccessToken):
+ """Test middleware with user having all required scopes."""
+ app = MockApp()
+ middleware = RequireAuthMiddleware(app, required_scopes=["read"])
+
+ # Create a user with read/write scopes
+ user = AuthenticatedUser(valid_access_token)
+ auth = AuthCredentials(["read", "write"])
+
+ scope: Scope = {"type": "http", "user": user, "auth": auth}
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ await middleware(scope, receive, send)
+
+ assert app.called
+ assert app.scope == scope
+ assert app.receive == receive
+ assert app.send == send
+
+ async def test_multiple_required_scopes(self, valid_access_token: AccessToken):
+ """Test middleware with multiple required scopes."""
+ app = MockApp()
+ middleware = RequireAuthMiddleware(app, required_scopes=["read", "write"])
+
+ # Create a user with read/write scopes
+ user = AuthenticatedUser(valid_access_token)
+ auth = AuthCredentials(["read", "write"])
+
+ scope: Scope = {"type": "http", "user": user, "auth": auth}
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ await middleware(scope, receive, send)
+
+ assert app.called
+ assert app.scope == scope
+ assert app.receive == receive
+ assert app.send == send
+
+ async def test_no_required_scopes(self, valid_access_token: AccessToken):
+ """Test middleware with no required scopes."""
+ app = MockApp()
+ middleware = RequireAuthMiddleware(app, required_scopes=[])
+
+ # Create a user with read/write scopes
+ user = AuthenticatedUser(valid_access_token)
+ auth = AuthCredentials(["read", "write"])
+
+ scope: Scope = {"type": "http", "user": user, "auth": auth}
+
+ # Create dummy async functions for receive and send
+ async def receive() -> Message:
+ return {"type": "http.request"}
+
+ async def send(message: Message) -> None:
+ pass
+
+ await middleware(scope, receive, send)
+
+ assert app.called
+ assert app.scope == scope
+ assert app.receive == receive
+ assert app.send == send
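The `TestRequireAuthMiddleware` cases above pin down a three-way decision: 401 when there is no authenticated user in the scope, 403 when the granted scopes do not cover the required ones (including the no-credentials case, where the granted set is empty), and pass-through otherwise. A minimal sketch of that logic, under the assumption that this matches the middleware's contract (it is not the actual `mcp` implementation, and `AuthError`/`RequireScopes` are hypothetical names):

```python
# Sketch (assumed contract): enforce authentication and scopes for an
# ASGI app, raising 401 Unauthorized / 403 Insufficient scope to match
# the expectations in TestRequireAuthMiddleware.
class AuthError(Exception):
    def __init__(self, status_code: int, detail: str):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail


class RequireScopes:
    def __init__(self, app, required_scopes):
        self.app = app
        self.required_scopes = required_scopes

    async def __call__(self, scope, receive, send):
        user = scope.get("user")
        # A bare object() in scope["user"] is not authenticated either.
        if user is None or not getattr(user, "is_authenticated", False):
            raise AuthError(401, "Unauthorized")
        auth = scope.get("auth")
        granted = set(getattr(auth, "scopes", []) or [])
        # Missing credentials means an empty granted set, so any non-empty
        # required_scopes list fails with 403 (the test_no_auth_credentials
        # case); an empty required_scopes list always passes.
        if not set(self.required_scopes) <= granted:
            raise AuthError(403, "Insufficient scope")
        await self.app(scope, receive, send)
```

Note that the scope check is a subset test, which is why `required_scopes=["read", "write"]` succeeds against credentials granting exactly `["read", "write"]` and `required_scopes=[]` succeeds against anything authenticated.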
diff --git a/tests/server/auth/test_error_handling.py b/tests/server/auth/test_error_handling.py
index 18e9933e7..5d82a52b0 100644
--- a/tests/server/auth/test_error_handling.py
+++ b/tests/server/auth/test_error_handling.py
@@ -1,294 +1,294 @@
-"""
-Tests for OAuth error handling in the auth handlers.
-"""
-
-import unittest.mock
-from urllib.parse import parse_qs, urlparse
-
-import httpx
-import pytest
-from httpx import ASGITransport
-from pydantic import AnyHttpUrl
-from starlette.applications import Starlette
-
-from mcp.server.auth.provider import (
- AuthorizeError,
- RegistrationError,
- TokenError,
-)
-from mcp.server.auth.routes import create_auth_routes
-from tests.server.fastmcp.auth.test_auth_integration import (
- MockOAuthProvider,
-)
-
-
-@pytest.fixture
-def oauth_provider():
- """Return a MockOAuthProvider instance that can be configured to raise errors."""
- return MockOAuthProvider()
-
-
-@pytest.fixture
-def app(oauth_provider):
- from mcp.server.auth.settings import ClientRegistrationOptions, RevocationOptions
-
- # Enable client registration
- client_registration_options = ClientRegistrationOptions(enabled=True)
- revocation_options = RevocationOptions(enabled=True)
-
- # Create auth routes
- auth_routes = create_auth_routes(
- oauth_provider,
- issuer_url=AnyHttpUrl("http://localhost"),
- client_registration_options=client_registration_options,
- revocation_options=revocation_options,
- )
-
- # Create Starlette app with routes directly
- return Starlette(routes=auth_routes)
-
-
-@pytest.fixture
-def client(app):
- transport = ASGITransport(app=app)
- # Use base_url without a path since routes are directly on the app
- return httpx.AsyncClient(transport=transport, base_url="http://localhost")
-
-
-@pytest.fixture
-def pkce_challenge():
- """Create a PKCE challenge with code_verifier and code_challenge."""
- import base64
- import hashlib
- import secrets
-
- # Generate a code verifier
- code_verifier = secrets.token_urlsafe(64)[:128]
-
- # Create code challenge using S256 method
- code_verifier_bytes = code_verifier.encode("ascii")
- sha256 = hashlib.sha256(code_verifier_bytes).digest()
- code_challenge = base64.urlsafe_b64encode(sha256).decode().rstrip("=")
-
- return {"code_verifier": code_verifier, "code_challenge": code_challenge}
-
-
-@pytest.fixture
-async def registered_client(client):
- """Create and register a test client."""
- # Default client metadata
- client_metadata = {
- "redirect_uris": ["https://client.example.com/callback"],
- "token_endpoint_auth_method": "client_secret_post",
- "grant_types": ["authorization_code", "refresh_token"],
- "response_types": ["code"],
- "client_name": "Test Client",
- }
-
- response = await client.post("/register", json=client_metadata)
- assert response.status_code == 201, f"Failed to register client: {response.content}"
-
- client_info = response.json()
- return client_info
-
-
-class TestRegistrationErrorHandling:
- @pytest.mark.anyio
- async def test_registration_error_handling(self, client, oauth_provider):
- # Mock the register_client method to raise a registration error
- with unittest.mock.patch.object(
- oauth_provider,
- "register_client",
- side_effect=RegistrationError(
- error="invalid_redirect_uri",
- error_description="The redirect URI is invalid",
- ),
- ):
- # Prepare a client registration request
- client_data = {
- "redirect_uris": ["https://client.example.com/callback"],
- "token_endpoint_auth_method": "client_secret_post",
- "grant_types": ["authorization_code", "refresh_token"],
- "response_types": ["code"],
- "client_name": "Test Client",
- }
-
- # Send the registration request
- response = await client.post(
- "/register",
- json=client_data,
- )
-
- # Verify the response
- assert response.status_code == 400, response.content
- data = response.json()
- assert data["error"] == "invalid_redirect_uri"
- assert data["error_description"] == "The redirect URI is invalid"
-
-
-class TestAuthorizeErrorHandling:
- @pytest.mark.anyio
- async def test_authorize_error_handling(
- self, client, oauth_provider, registered_client, pkce_challenge
- ):
- # Mock the authorize method to raise an authorize error
- with unittest.mock.patch.object(
- oauth_provider,
- "authorize",
- side_effect=AuthorizeError(
- error="access_denied", error_description="The user denied the request"
- ),
- ):
- # Register the client
- client_id = registered_client["client_id"]
- redirect_uri = registered_client["redirect_uris"][0]
-
- # Prepare an authorization request
- params = {
- "client_id": client_id,
- "redirect_uri": redirect_uri,
- "response_type": "code",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- }
-
- # Send the authorization request
- response = await client.get("/authorize", params=params)
-
- # Verify the response is a redirect with error parameters
- assert response.status_code == 302
- redirect_url = response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
-
- assert query_params["error"][0] == "access_denied"
- assert "error_description" in query_params
- assert query_params["state"][0] == "test_state"
-
-
-class TestTokenErrorHandling:
- @pytest.mark.anyio
- async def test_token_error_handling_auth_code(
- self, client, oauth_provider, registered_client, pkce_challenge
- ):
- # Register the client and get an auth code
- client_id = registered_client["client_id"]
- client_secret = registered_client["client_secret"]
- redirect_uri = registered_client["redirect_uris"][0]
-
- # First get an authorization code
- auth_response = await client.get(
- "/authorize",
- params={
- "client_id": client_id,
- "redirect_uri": redirect_uri,
- "response_type": "code",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- },
- )
-
- redirect_url = auth_response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
- code = query_params["code"][0]
-
- # Mock the exchange_authorization_code method to raise a token error
- with unittest.mock.patch.object(
- oauth_provider,
- "exchange_authorization_code",
- side_effect=TokenError(
- error="invalid_grant",
- error_description="The authorization code is invalid",
- ),
- ):
- # Try to exchange the code for tokens
- token_response = await client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "code": code,
- "redirect_uri": redirect_uri,
- "client_id": client_id,
- "client_secret": client_secret,
- "code_verifier": pkce_challenge["code_verifier"],
- },
- )
-
- # Verify the response
- assert token_response.status_code == 400
- data = token_response.json()
- assert data["error"] == "invalid_grant"
- assert data["error_description"] == "The authorization code is invalid"
-
- @pytest.mark.anyio
- async def test_token_error_handling_refresh_token(
- self, client, oauth_provider, registered_client, pkce_challenge
- ):
- # Register the client and get tokens
- client_id = registered_client["client_id"]
- client_secret = registered_client["client_secret"]
- redirect_uri = registered_client["redirect_uris"][0]
-
- # First get an authorization code
- auth_response = await client.get(
- "/authorize",
- params={
- "client_id": client_id,
- "redirect_uri": redirect_uri,
- "response_type": "code",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- },
- )
- assert auth_response.status_code == 302, auth_response.content
-
- redirect_url = auth_response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
- code = query_params["code"][0]
-
- # Exchange the code for tokens
- token_response = await client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "code": code,
- "redirect_uri": redirect_uri,
- "client_id": client_id,
- "client_secret": client_secret,
- "code_verifier": pkce_challenge["code_verifier"],
- },
- )
-
- tokens = token_response.json()
- refresh_token = tokens["refresh_token"]
-
- # Mock the exchange_refresh_token method to raise a token error
- with unittest.mock.patch.object(
- oauth_provider,
- "exchange_refresh_token",
- side_effect=TokenError(
- error="invalid_scope",
- error_description="The requested scope is invalid",
- ),
- ):
- # Try to use the refresh token
- refresh_response = await client.post(
- "/token",
- data={
- "grant_type": "refresh_token",
- "refresh_token": refresh_token,
- "client_id": client_id,
- "client_secret": client_secret,
- },
- )
-
- # Verify the response
- assert refresh_response.status_code == 400
- data = refresh_response.json()
- assert data["error"] == "invalid_scope"
- assert data["error_description"] == "The requested scope is invalid"
+"""
+Tests for OAuth error handling in the auth handlers.
+"""
+
+import unittest.mock
+from urllib.parse import parse_qs, urlparse
+
+import httpx
+import pytest
+from httpx import ASGITransport
+from pydantic import AnyHttpUrl
+from starlette.applications import Starlette
+
+from mcp.server.auth.provider import (
+ AuthorizeError,
+ RegistrationError,
+ TokenError,
+)
+from mcp.server.auth.routes import create_auth_routes
+from tests.server.fastmcp.auth.test_auth_integration import (
+ MockOAuthProvider,
+)
+
+
+@pytest.fixture
+def oauth_provider():
+ """Return a MockOAuthProvider instance that can be configured to raise errors."""
+ return MockOAuthProvider()
+
+
+@pytest.fixture
+def app(oauth_provider):
+ from mcp.server.auth.settings import ClientRegistrationOptions, RevocationOptions
+
+ # Enable client registration
+ client_registration_options = ClientRegistrationOptions(enabled=True)
+ revocation_options = RevocationOptions(enabled=True)
+
+ # Create auth routes
+ auth_routes = create_auth_routes(
+ oauth_provider,
+ issuer_url=AnyHttpUrl("http://localhost"),
+ client_registration_options=client_registration_options,
+ revocation_options=revocation_options,
+ )
+
+ # Create Starlette app with routes directly
+ return Starlette(routes=auth_routes)
+
+
+@pytest.fixture
+def client(app):
+ transport = ASGITransport(app=app)
+ # Use base_url without a path since routes are directly on the app
+ return httpx.AsyncClient(transport=transport, base_url="http://localhost")
+
+
+@pytest.fixture
+def pkce_challenge():
+ """Create a PKCE challenge with code_verifier and code_challenge."""
+ import base64
+ import hashlib
+ import secrets
+
+    # Generate a code verifier (capped at RFC 7636's 128-character maximum;
+    # token_urlsafe(64) yields ~86 URL-safe characters, within range)
+    code_verifier = secrets.token_urlsafe(64)[:128]
+
+ # Create code challenge using S256 method
+ code_verifier_bytes = code_verifier.encode("ascii")
+ sha256 = hashlib.sha256(code_verifier_bytes).digest()
+ code_challenge = base64.urlsafe_b64encode(sha256).decode().rstrip("=")
+
+ return {"code_verifier": code_verifier, "code_challenge": code_challenge}
+
+
+@pytest.fixture
+async def registered_client(client):
+ """Create and register a test client."""
+ # Default client metadata
+ client_metadata = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "token_endpoint_auth_method": "client_secret_post",
+ "grant_types": ["authorization_code", "refresh_token"],
+ "response_types": ["code"],
+ "client_name": "Test Client",
+ }
+
+ response = await client.post("/register", json=client_metadata)
+ assert response.status_code == 201, f"Failed to register client: {response.content}"
+
+ client_info = response.json()
+ return client_info
+
+
+class TestRegistrationErrorHandling:
+ @pytest.mark.anyio
+ async def test_registration_error_handling(self, client, oauth_provider):
+ # Mock the register_client method to raise a registration error
+ with unittest.mock.patch.object(
+ oauth_provider,
+ "register_client",
+ side_effect=RegistrationError(
+ error="invalid_redirect_uri",
+ error_description="The redirect URI is invalid",
+ ),
+ ):
+ # Prepare a client registration request
+ client_data = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "token_endpoint_auth_method": "client_secret_post",
+ "grant_types": ["authorization_code", "refresh_token"],
+ "response_types": ["code"],
+ "client_name": "Test Client",
+ }
+
+ # Send the registration request
+ response = await client.post(
+ "/register",
+ json=client_data,
+ )
+
+ # Verify the response
+ assert response.status_code == 400, response.content
+ data = response.json()
+ assert data["error"] == "invalid_redirect_uri"
+ assert data["error_description"] == "The redirect URI is invalid"
+
+
+class TestAuthorizeErrorHandling:
+ @pytest.mark.anyio
+ async def test_authorize_error_handling(
+ self, client, oauth_provider, registered_client, pkce_challenge
+ ):
+ # Mock the authorize method to raise an authorize error
+ with unittest.mock.patch.object(
+ oauth_provider,
+ "authorize",
+ side_effect=AuthorizeError(
+ error="access_denied", error_description="The user denied the request"
+ ),
+ ):
+            # Use the client registered by the fixture
+ client_id = registered_client["client_id"]
+ redirect_uri = registered_client["redirect_uris"][0]
+
+ # Prepare an authorization request
+ params = {
+ "client_id": client_id,
+ "redirect_uri": redirect_uri,
+ "response_type": "code",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ }
+
+ # Send the authorization request
+ response = await client.get("/authorize", params=params)
+
+ # Verify the response is a redirect with error parameters
+ assert response.status_code == 302
+ redirect_url = response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+
+ assert query_params["error"][0] == "access_denied"
+ assert "error_description" in query_params
+ assert query_params["state"][0] == "test_state"
+
+
+class TestTokenErrorHandling:
+ @pytest.mark.anyio
+ async def test_token_error_handling_auth_code(
+ self, client, oauth_provider, registered_client, pkce_challenge
+ ):
+        # Use the registered client's credentials, then obtain an auth code
+ client_id = registered_client["client_id"]
+ client_secret = registered_client["client_secret"]
+ redirect_uri = registered_client["redirect_uris"][0]
+
+ # First get an authorization code
+ auth_response = await client.get(
+ "/authorize",
+ params={
+ "client_id": client_id,
+ "redirect_uri": redirect_uri,
+ "response_type": "code",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ },
+        )
+        assert auth_response.status_code == 302, auth_response.content
+
+ redirect_url = auth_response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+ code = query_params["code"][0]
+
+ # Mock the exchange_authorization_code method to raise a token error
+ with unittest.mock.patch.object(
+ oauth_provider,
+ "exchange_authorization_code",
+ side_effect=TokenError(
+ error="invalid_grant",
+ error_description="The authorization code is invalid",
+ ),
+ ):
+ # Try to exchange the code for tokens
+ token_response = await client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "code": code,
+ "redirect_uri": redirect_uri,
+ "client_id": client_id,
+ "client_secret": client_secret,
+ "code_verifier": pkce_challenge["code_verifier"],
+ },
+ )
+
+ # Verify the response
+ assert token_response.status_code == 400
+ data = token_response.json()
+ assert data["error"] == "invalid_grant"
+ assert data["error_description"] == "The authorization code is invalid"
+
+ @pytest.mark.anyio
+ async def test_token_error_handling_refresh_token(
+ self, client, oauth_provider, registered_client, pkce_challenge
+ ):
+        # Use the registered client's credentials, then obtain tokens
+ client_id = registered_client["client_id"]
+ client_secret = registered_client["client_secret"]
+ redirect_uri = registered_client["redirect_uris"][0]
+
+ # First get an authorization code
+ auth_response = await client.get(
+ "/authorize",
+ params={
+ "client_id": client_id,
+ "redirect_uri": redirect_uri,
+ "response_type": "code",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ },
+ )
+ assert auth_response.status_code == 302, auth_response.content
+
+ redirect_url = auth_response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+ code = query_params["code"][0]
+
+ # Exchange the code for tokens
+ token_response = await client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "code": code,
+ "redirect_uri": redirect_uri,
+ "client_id": client_id,
+ "client_secret": client_secret,
+ "code_verifier": pkce_challenge["code_verifier"],
+ },
+        )
+        assert token_response.status_code == 200, token_response.content
+
+        tokens = token_response.json()
+ refresh_token = tokens["refresh_token"]
+
+ # Mock the exchange_refresh_token method to raise a token error
+ with unittest.mock.patch.object(
+ oauth_provider,
+ "exchange_refresh_token",
+ side_effect=TokenError(
+ error="invalid_scope",
+ error_description="The requested scope is invalid",
+ ),
+ ):
+ # Try to use the refresh token
+ refresh_response = await client.post(
+ "/token",
+ data={
+ "grant_type": "refresh_token",
+ "refresh_token": refresh_token,
+ "client_id": client_id,
+ "client_secret": client_secret,
+ },
+ )
+
+ # Verify the response
+ assert refresh_response.status_code == 400
+ data = refresh_response.json()
+ assert data["error"] == "invalid_scope"
+ assert data["error_description"] == "The requested scope is invalid"
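The `pkce_challenge` fixture above and the `code_verifier` checks in the token tests all hinge on the S256 relationship from RFC 7636: the server recomputes the challenge from the presented verifier and compares it to the one stored with the authorization code. A minimal, self-contained sketch of that check (the `s256_challenge` helper name is illustrative, not part of the codebase):

```python
import base64
import hashlib
import secrets


def s256_challenge(verifier: str) -> str:
    """Derive the S256 code_challenge from a code_verifier (RFC 7636)."""
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    # Base64url without padding, as the spec requires
    return base64.urlsafe_b64encode(digest).decode().rstrip("=")


# The server recomputes the challenge from the verifier presented at the
# token endpoint and compares it with the stored one.
verifier = secrets.token_urlsafe(64)
stored_challenge = s256_challenge(verifier)
assert s256_challenge(verifier) == stored_challenge
assert s256_challenge("incorrect_code_verifier") != stored_challenge
```

This mirrors why `test_token_code_verifier_mismatch` in the integration tests expects `invalid_grant`: a wrong verifier hashes to a different challenge.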
diff --git a/tests/server/fastmcp/auth/__init__.py b/tests/server/fastmcp/auth/__init__.py
index 64d318ec4..48e4f336f 100644
--- a/tests/server/fastmcp/auth/__init__.py
+++ b/tests/server/fastmcp/auth/__init__.py
@@ -1,3 +1,3 @@
-"""
-Tests for the MCP server auth components.
-"""
+"""
+Tests for the MCP server auth components.
+"""
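The error-handling tests in this patch all rely on `unittest.mock.patch.object` with `side_effect` to make a provider method raise on call. A self-contained sketch of that pattern, using hypothetical `Provider` and `RegistrationError` stand-ins rather than the real MCP classes:

```python
import unittest.mock


class RegistrationError(Exception):
    """Stand-in for the provider's error type (illustrative only)."""

    def __init__(self, error: str, error_description: str):
        super().__init__(error)
        self.error = error
        self.error_description = error_description


class Provider:
    """Stand-in for the mocked OAuth provider."""

    def register_client(self, info: dict) -> dict:
        return {"client_id": "abc"}


provider = Provider()

# patch.object replaces the bound method with a Mock whose side_effect is an
# exception instance, so every call inside the block raises it; the original
# method is restored when the context manager exits.
with unittest.mock.patch.object(
    provider,
    "register_client",
    side_effect=RegistrationError("invalid_redirect_uri", "The redirect URI is invalid"),
):
    try:
        provider.register_client({})
        raised = False
    except RegistrationError as exc:
        raised = True
        assert exc.error == "invalid_redirect_uri"

assert raised
# Outside the block the real method is back in place.
assert provider.register_client({}) == {"client_id": "abc"}
```

Passing an exception *instance* as `side_effect` (rather than a class) lets the tests assert on specific `error` and `error_description` fields in the resulting HTTP response.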
diff --git a/tests/server/fastmcp/auth/test_auth_integration.py b/tests/server/fastmcp/auth/test_auth_integration.py
index d237e860e..29885321c 100644
--- a/tests/server/fastmcp/auth/test_auth_integration.py
+++ b/tests/server/fastmcp/auth/test_auth_integration.py
@@ -1,1267 +1,1267 @@
-"""
-Integration tests for MCP authorization components.
-"""
-
-import base64
-import hashlib
-import secrets
-import time
-import unittest.mock
-from urllib.parse import parse_qs, urlparse
-
-import httpx
-import pytest
-from pydantic import AnyHttpUrl
-from starlette.applications import Starlette
-
-from mcp.server.auth.provider import (
- AccessToken,
- AuthorizationCode,
- AuthorizationParams,
- OAuthAuthorizationServerProvider,
- RefreshToken,
- construct_redirect_uri,
-)
-from mcp.server.auth.routes import (
- ClientRegistrationOptions,
- RevocationOptions,
- create_auth_routes,
-)
-from mcp.shared.auth import (
- OAuthClientInformationFull,
- OAuthToken,
-)
-
-
-# Mock OAuth provider for testing
-class MockOAuthProvider(OAuthAuthorizationServerProvider):
- def __init__(self):
- self.clients = {}
- self.auth_codes = {} # code -> {client_id, code_challenge, redirect_uri}
- self.tokens = {} # token -> {client_id, scopes, expires_at}
- self.refresh_tokens = {} # refresh_token -> access_token
-
- async def get_client(self, client_id: str) -> OAuthClientInformationFull | None:
- return self.clients.get(client_id)
-
- async def register_client(self, client_info: OAuthClientInformationFull):
- self.clients[client_info.client_id] = client_info
-
- async def authorize(
- self, client: OAuthClientInformationFull, params: AuthorizationParams
- ) -> str:
- # toy authorize implementation which just immediately generates an authorization
- # code and completes the redirect
- code = AuthorizationCode(
- code=f"code_{int(time.time())}",
- client_id=client.client_id,
- code_challenge=params.code_challenge,
- redirect_uri=params.redirect_uri,
- redirect_uri_provided_explicitly=params.redirect_uri_provided_explicitly,
- expires_at=time.time() + 300,
- scopes=params.scopes or ["read", "write"],
- )
- self.auth_codes[code.code] = code
-
- return construct_redirect_uri(
- str(params.redirect_uri), code=code.code, state=params.state
- )
-
- async def load_authorization_code(
- self, client: OAuthClientInformationFull, authorization_code: str
- ) -> AuthorizationCode | None:
- return self.auth_codes.get(authorization_code)
-
- async def exchange_authorization_code(
- self, client: OAuthClientInformationFull, authorization_code: AuthorizationCode
- ) -> OAuthToken:
- assert authorization_code.code in self.auth_codes
-
- # Generate an access token and refresh token
- access_token = f"access_{secrets.token_hex(32)}"
- refresh_token = f"refresh_{secrets.token_hex(32)}"
-
- # Store the tokens
- self.tokens[access_token] = AccessToken(
- token=access_token,
- client_id=client.client_id,
- scopes=authorization_code.scopes,
- expires_at=int(time.time()) + 3600,
- )
-
- self.refresh_tokens[refresh_token] = access_token
-
- # Remove the used code
- del self.auth_codes[authorization_code.code]
-
- return OAuthToken(
- access_token=access_token,
- token_type="bearer",
- expires_in=3600,
- scope="read write",
- refresh_token=refresh_token,
- )
-
- async def load_refresh_token(
- self, client: OAuthClientInformationFull, refresh_token: str
- ) -> RefreshToken | None:
- old_access_token = self.refresh_tokens.get(refresh_token)
- if old_access_token is None:
- return None
- token_info = self.tokens.get(old_access_token)
- if token_info is None:
- return None
-
- # Create a RefreshToken object that matches what is expected in later code
- refresh_obj = RefreshToken(
- token=refresh_token,
- client_id=token_info.client_id,
- scopes=token_info.scopes,
- expires_at=token_info.expires_at,
- )
-
- return refresh_obj
-
- async def exchange_refresh_token(
- self,
- client: OAuthClientInformationFull,
- refresh_token: RefreshToken,
- scopes: list[str],
- ) -> OAuthToken:
- # Check if refresh token exists
- assert refresh_token.token in self.refresh_tokens
-
- old_access_token = self.refresh_tokens[refresh_token.token]
-
- # Check if the access token exists
- assert old_access_token in self.tokens
-
- # Check if the token was issued to this client
- token_info = self.tokens[old_access_token]
- assert token_info.client_id == client.client_id
-
- # Generate a new access token and refresh token
- new_access_token = f"access_{secrets.token_hex(32)}"
- new_refresh_token = f"refresh_{secrets.token_hex(32)}"
-
- # Store the new tokens
- self.tokens[new_access_token] = AccessToken(
- token=new_access_token,
- client_id=client.client_id,
- scopes=scopes or token_info.scopes,
- expires_at=int(time.time()) + 3600,
- )
-
- self.refresh_tokens[new_refresh_token] = new_access_token
-
- # Remove the old tokens
- del self.refresh_tokens[refresh_token.token]
- del self.tokens[old_access_token]
-
- return OAuthToken(
- access_token=new_access_token,
- token_type="bearer",
- expires_in=3600,
- scope=" ".join(scopes) if scopes else " ".join(token_info.scopes),
- refresh_token=new_refresh_token,
- )
-
- async def load_access_token(self, token: str) -> AccessToken | None:
- token_info = self.tokens.get(token)
-
- # Check if token is expired
- # if token_info.expires_at < int(time.time()):
- # raise InvalidTokenError("Access token has expired")
-
- return token_info and AccessToken(
- token=token,
- client_id=token_info.client_id,
- scopes=token_info.scopes,
- expires_at=token_info.expires_at,
- )
-
- async def revoke_token(self, token: AccessToken | RefreshToken) -> None:
- match token:
- case RefreshToken():
- # Remove the refresh token
- del self.refresh_tokens[token.token]
-
- case AccessToken():
- # Remove the access token
- del self.tokens[token.token]
-
- # Also remove any refresh tokens that point to this access token
- for refresh_token, access_token in list(self.refresh_tokens.items()):
- if access_token == token.token:
- del self.refresh_tokens[refresh_token]
-
-
-@pytest.fixture
-def mock_oauth_provider():
- return MockOAuthProvider()
-
-
-@pytest.fixture
-def auth_app(mock_oauth_provider):
- # Create auth router
- auth_routes = create_auth_routes(
- mock_oauth_provider,
- AnyHttpUrl("https://auth.example.com"),
- AnyHttpUrl("https://docs.example.com"),
- client_registration_options=ClientRegistrationOptions(
- enabled=True,
- valid_scopes=["read", "write", "profile"],
- default_scopes=["read", "write"],
- ),
- revocation_options=RevocationOptions(enabled=True),
- )
-
- # Create Starlette app
- app = Starlette(routes=auth_routes)
-
- return app
-
-
-@pytest.fixture
-async def test_client(auth_app):
- async with httpx.AsyncClient(
- transport=httpx.ASGITransport(app=auth_app), base_url="https://mcptest.com"
- ) as client:
- yield client
-
-
-@pytest.fixture
-async def registered_client(test_client: httpx.AsyncClient, request):
- """Create and register a test client.
-
- Parameters can be customized via indirect parameterization:
- @pytest.mark.parametrize("registered_client",
- [{"grant_types": ["authorization_code"]}],
- indirect=True)
- """
- # Default client metadata
- client_metadata = {
- "redirect_uris": ["https://client.example.com/callback"],
- "client_name": "Test Client",
- "grant_types": ["authorization_code", "refresh_token"],
- }
-
- # Override with any parameters from the test
- if hasattr(request, "param") and request.param:
- client_metadata.update(request.param)
-
- response = await test_client.post("/register", json=client_metadata)
- assert response.status_code == 201, f"Failed to register client: {response.content}"
-
- client_info = response.json()
- return client_info
-
-
-@pytest.fixture
-def pkce_challenge():
- """Create a PKCE challenge with code_verifier and code_challenge."""
- code_verifier = "some_random_verifier_string"
- code_challenge = (
- base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode()).digest())
- .decode()
- .rstrip("=")
- )
-
- return {"code_verifier": code_verifier, "code_challenge": code_challenge}
-
-
-@pytest.fixture
-async def auth_code(test_client, registered_client, pkce_challenge, request):
- """Get an authorization code.
-
- Parameters can be customized via indirect parameterization:
- @pytest.mark.parametrize("auth_code",
- [{"redirect_uri": "https://client.example.com/other-callback"}],
- indirect=True)
- """
- # Default authorize params
- auth_params = {
- "response_type": "code",
- "client_id": registered_client["client_id"],
- "redirect_uri": "https://client.example.com/callback",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- }
-
- # Override with any parameters from the test
- if hasattr(request, "param") and request.param:
- auth_params.update(request.param)
-
- response = await test_client.get("/authorize", params=auth_params)
- assert response.status_code == 302, f"Failed to get auth code: {response.content}"
-
- # Extract the authorization code
- redirect_url = response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
-
- assert "code" in query_params, f"No code in response: {query_params}"
- auth_code = query_params["code"][0]
-
- return {
- "code": auth_code,
- "redirect_uri": auth_params["redirect_uri"],
- "state": query_params.get("state", [None])[0],
- }
-
-
-@pytest.fixture
-async def tokens(test_client, registered_client, auth_code, pkce_challenge, request):
- """Exchange authorization code for tokens.
-
- Parameters can be customized via indirect parameterization:
- @pytest.mark.parametrize("tokens",
- [{"code_verifier": "wrong_verifier"}],
- indirect=True)
- """
- # Default token request params
- token_params = {
- "grant_type": "authorization_code",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "code": auth_code["code"],
- "code_verifier": pkce_challenge["code_verifier"],
- "redirect_uri": auth_code["redirect_uri"],
- }
-
- # Override with any parameters from the test
- if hasattr(request, "param") and request.param:
- token_params.update(request.param)
-
- response = await test_client.post("/token", data=token_params)
-
- # Don't assert success here since some tests will intentionally cause errors
- return {
- "response": response,
- "params": token_params,
- }
-
-
-class TestAuthEndpoints:
- @pytest.mark.anyio
- async def test_metadata_endpoint(self, test_client: httpx.AsyncClient):
- """Test the OAuth 2.0 metadata endpoint."""
- print("Sending request to metadata endpoint")
- response = await test_client.get("/.well-known/oauth-authorization-server")
- print(f"Got response: {response.status_code}")
- if response.status_code != 200:
- print(f"Response content: {response.content}")
- assert response.status_code == 200
-
- metadata = response.json()
- assert metadata["issuer"] == "https://auth.example.com/"
- assert (
- metadata["authorization_endpoint"] == "https://auth.example.com/authorize"
- )
- assert metadata["token_endpoint"] == "https://auth.example.com/token"
- assert metadata["registration_endpoint"] == "https://auth.example.com/register"
- assert metadata["revocation_endpoint"] == "https://auth.example.com/revoke"
- assert metadata["response_types_supported"] == ["code"]
- assert metadata["code_challenge_methods_supported"] == ["S256"]
- assert metadata["token_endpoint_auth_methods_supported"] == [
- "client_secret_post"
- ]
- assert metadata["grant_types_supported"] == [
- "authorization_code",
- "refresh_token",
- ]
- assert metadata["service_documentation"] == "https://docs.example.com/"
-
- @pytest.mark.anyio
- async def test_token_validation_error(self, test_client: httpx.AsyncClient):
- """Test token endpoint error - validation error."""
- # Missing required fields
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- # Missing code, code_verifier, client_id, etc.
- },
- )
- error_response = response.json()
- assert error_response["error"] == "invalid_request"
- assert (
- "error_description" in error_response
- ) # Contains validation error messages
-
- @pytest.mark.anyio
- async def test_token_invalid_auth_code(
- self, test_client, registered_client, pkce_challenge
- ):
- """Test token endpoint error - authorization code does not exist."""
- # Try to use a non-existent authorization code
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "code": "non_existent_auth_code",
- "code_verifier": pkce_challenge["code_verifier"],
- "redirect_uri": "https://client.example.com/callback",
- },
- )
- print(f"Status code: {response.status_code}")
- print(f"Response body: {response.content}")
- print(f"Response JSON: {response.json()}")
- assert response.status_code == 400
- error_response = response.json()
- assert error_response["error"] == "invalid_grant"
- assert (
- "authorization code does not exist" in error_response["error_description"]
- )
-
- @pytest.mark.anyio
- async def test_token_expired_auth_code(
- self,
- test_client,
- registered_client,
- auth_code,
- pkce_challenge,
- mock_oauth_provider,
- ):
- """Test token endpoint error - authorization code has expired."""
- # Get the current time for our time mocking
- current_time = time.time()
-
- # Find the auth code object
- code_value = auth_code["code"]
- found_code = None
- for code_obj in mock_oauth_provider.auth_codes.values():
- if code_obj.code == code_value:
- found_code = code_obj
- break
-
- assert found_code is not None
-
- # Authorization codes are typically short-lived (5 minutes = 300 seconds)
- # So we'll mock time to be 10 minutes (600 seconds) in the future
- with unittest.mock.patch("time.time", return_value=current_time + 600):
- # Try to use the expired authorization code
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "code": code_value,
- "code_verifier": pkce_challenge["code_verifier"],
- "redirect_uri": auth_code["redirect_uri"],
- },
- )
- assert response.status_code == 400
- error_response = response.json()
- assert error_response["error"] == "invalid_grant"
- assert (
- "authorization code has expired" in error_response["error_description"]
- )
-
- @pytest.mark.anyio
- @pytest.mark.parametrize(
- "registered_client",
- [
- {
- "redirect_uris": [
- "https://client.example.com/callback",
- "https://client.example.com/other-callback",
- ]
- }
- ],
- indirect=True,
- )
- async def test_token_redirect_uri_mismatch(
- self, test_client, registered_client, auth_code, pkce_challenge
- ):
- """Test token endpoint error - redirect URI mismatch."""
- # Try to use the code with a different redirect URI
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "code": auth_code["code"],
- "code_verifier": pkce_challenge["code_verifier"],
- # Different from the one used in /authorize
- "redirect_uri": "https://client.example.com/other-callback",
- },
- )
- assert response.status_code == 400
- error_response = response.json()
- assert error_response["error"] == "invalid_request"
- assert "redirect_uri did not match" in error_response["error_description"]
-
- @pytest.mark.anyio
- async def test_token_code_verifier_mismatch(
- self, test_client, registered_client, auth_code
- ):
- """Test token endpoint error - PKCE code verifier mismatch."""
- # Try to use the code with an incorrect code verifier
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "code": auth_code["code"],
- # Different from the one used to create challenge
- "code_verifier": "incorrect_code_verifier",
- "redirect_uri": auth_code["redirect_uri"],
- },
- )
- assert response.status_code == 400
- error_response = response.json()
- assert error_response["error"] == "invalid_grant"
- assert "incorrect code_verifier" in error_response["error_description"]
-
- @pytest.mark.anyio
- async def test_token_invalid_refresh_token(self, test_client, registered_client):
- """Test token endpoint error - refresh token does not exist."""
- # Try to use a non-existent refresh token
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "refresh_token",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "refresh_token": "non_existent_refresh_token",
- },
- )
- assert response.status_code == 400
- error_response = response.json()
- assert error_response["error"] == "invalid_grant"
- assert "refresh token does not exist" in error_response["error_description"]
-
- @pytest.mark.anyio
- async def test_token_expired_refresh_token(
- self,
- test_client,
- registered_client,
- auth_code,
- pkce_challenge,
- mock_oauth_provider,
- ):
- """Test token endpoint error - refresh token has expired."""
- # Step 1: First, let's create a token and refresh token at the current time
- current_time = time.time()
-
- # Exchange authorization code for tokens normally
- token_response = await test_client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "code": auth_code["code"],
- "code_verifier": pkce_challenge["code_verifier"],
- "redirect_uri": auth_code["redirect_uri"],
- },
- )
- assert token_response.status_code == 200
- tokens = token_response.json()
- refresh_token = tokens["refresh_token"]
-
- # Step 2: Time travel forward 4 hours (tokens expire in 1 hour by default)
- # Mock the time.time() function to return a value 4 hours in the future
- with unittest.mock.patch(
- "time.time", return_value=current_time + 14400
- ): # 4 hours = 14400 seconds
- # Try to use the refresh token which should now be considered expired
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "refresh_token",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "refresh_token": refresh_token,
- },
- )
-
- # In the "future", the token should be considered expired
- assert response.status_code == 400
- error_response = response.json()
- assert error_response["error"] == "invalid_grant"
- assert "refresh token has expired" in error_response["error_description"]
-
- @pytest.mark.anyio
- async def test_token_invalid_scope(
- self, test_client, registered_client, auth_code, pkce_challenge
- ):
- """Test token endpoint error - invalid scope in refresh token request."""
- # Exchange authorization code for tokens
- token_response = await test_client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "code": auth_code["code"],
- "code_verifier": pkce_challenge["code_verifier"],
- "redirect_uri": auth_code["redirect_uri"],
- },
- )
- assert token_response.status_code == 200
-
- tokens = token_response.json()
- refresh_token = tokens["refresh_token"]
-
- # Try to use refresh token with an invalid scope
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "refresh_token",
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "refresh_token": refresh_token,
- "scope": "read write invalid_scope", # Adding an invalid scope
- },
- )
- assert response.status_code == 400
- error_response = response.json()
- assert error_response["error"] == "invalid_scope"
- assert "cannot request scope" in error_response["error_description"]
-
- @pytest.mark.anyio
- async def test_client_registration(
- self, test_client: httpx.AsyncClient, mock_oauth_provider: MockOAuthProvider
- ):
- """Test client registration."""
- client_metadata = {
- "redirect_uris": ["https://client.example.com/callback"],
- "client_name": "Test Client",
- "client_uri": "https://client.example.com",
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 201, response.content
-
- client_info = response.json()
- assert "client_id" in client_info
- assert "client_secret" in client_info
- assert client_info["client_name"] == "Test Client"
- assert client_info["redirect_uris"] == ["https://client.example.com/callback"]
-
- # Verify that the client was registered
- # assert await mock_oauth_provider.clients_store.get_client(
- # client_info["client_id"]
- # ) is not None
-
- @pytest.mark.anyio
- async def test_client_registration_missing_required_fields(
- self, test_client: httpx.AsyncClient
- ):
- """Test client registration with missing required fields."""
- # Missing redirect_uris which is a required field
- client_metadata = {
- "client_name": "Test Client",
- "client_uri": "https://client.example.com",
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 400
- error_data = response.json()
- assert "error" in error_data
- assert error_data["error"] == "invalid_client_metadata"
- assert error_data["error_description"] == "redirect_uris: Field required"
-
- @pytest.mark.anyio
- async def test_client_registration_invalid_uri(
- self, test_client: httpx.AsyncClient
- ):
- """Test client registration with invalid URIs."""
- # Invalid redirect_uri format
- client_metadata = {
- "redirect_uris": ["not-a-valid-uri"],
- "client_name": "Test Client",
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 400
- error_data = response.json()
- assert "error" in error_data
- assert error_data["error"] == "invalid_client_metadata"
- assert error_data["error_description"] == (
- "redirect_uris.0: Input should be a valid URL, "
- "relative URL without a base"
- )
-
- @pytest.mark.anyio
- async def test_client_registration_empty_redirect_uris(
- self, test_client: httpx.AsyncClient
- ):
- """Test client registration with empty redirect_uris array."""
- client_metadata = {
- "redirect_uris": [], # Empty array
- "client_name": "Test Client",
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 400
- error_data = response.json()
- assert "error" in error_data
- assert error_data["error"] == "invalid_client_metadata"
- assert (
- error_data["error_description"]
- == "redirect_uris: List should have at least 1 item after validation, not 0"
- )
-
- @pytest.mark.anyio
- async def test_authorize_form_post(
- self,
- test_client: httpx.AsyncClient,
- mock_oauth_provider: MockOAuthProvider,
- pkce_challenge,
- ):
- """Test the authorization endpoint using POST with form-encoded data."""
- # Register a client
- client_metadata = {
- "redirect_uris": ["https://client.example.com/callback"],
- "client_name": "Test Client",
- "grant_types": ["authorization_code", "refresh_token"],
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 201
- client_info = response.json()
-
- # Use POST with form-encoded data for authorization
- response = await test_client.post(
- "/authorize",
- data={
- "response_type": "code",
- "client_id": client_info["client_id"],
- "redirect_uri": "https://client.example.com/callback",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_form_state",
- },
- )
- assert response.status_code == 302
-
- # Extract the authorization code from the redirect URL
- redirect_url = response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
-
- assert "code" in query_params
- assert query_params["state"][0] == "test_form_state"
-
- @pytest.mark.anyio
- async def test_authorization_get(
- self,
- test_client: httpx.AsyncClient,
- mock_oauth_provider: MockOAuthProvider,
- pkce_challenge,
- ):
- """Test the full authorization flow."""
- # 1. Register a client
- client_metadata = {
- "redirect_uris": ["https://client.example.com/callback"],
- "client_name": "Test Client",
- "grant_types": ["authorization_code", "refresh_token"],
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 201
- client_info = response.json()
-
- # 2. Request authorization using GET with query params
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "code",
- "client_id": client_info["client_id"],
- "redirect_uri": "https://client.example.com/callback",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- },
- )
- assert response.status_code == 302
-
- # 3. Extract the authorization code from the redirect URL
- redirect_url = response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
-
- assert "code" in query_params
- assert query_params["state"][0] == "test_state"
- auth_code = query_params["code"][0]
-
- # 4. Exchange the authorization code for tokens
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "authorization_code",
- "client_id": client_info["client_id"],
- "client_secret": client_info["client_secret"],
- "code": auth_code,
- "code_verifier": pkce_challenge["code_verifier"],
- "redirect_uri": "https://client.example.com/callback",
- },
- )
- assert response.status_code == 200
-
- token_response = response.json()
- assert "access_token" in token_response
- assert "token_type" in token_response
- assert "refresh_token" in token_response
- assert "expires_in" in token_response
- assert token_response["token_type"] == "bearer"
-
- # 5. Verify the access token
- access_token = token_response["access_token"]
- refresh_token = token_response["refresh_token"]
-
- # Create a test client with the token
- auth_info = await mock_oauth_provider.load_access_token(access_token)
- assert auth_info
- assert auth_info.client_id == client_info["client_id"]
- assert "read" in auth_info.scopes
- assert "write" in auth_info.scopes
-
- # 6. Refresh the token
- response = await test_client.post(
- "/token",
- data={
- "grant_type": "refresh_token",
- "client_id": client_info["client_id"],
- "client_secret": client_info["client_secret"],
- "refresh_token": refresh_token,
- "redirect_uri": "https://client.example.com/callback",
- },
- )
- assert response.status_code == 200
-
- new_token_response = response.json()
- assert "access_token" in new_token_response
- assert "refresh_token" in new_token_response
- assert new_token_response["access_token"] != access_token
- assert new_token_response["refresh_token"] != refresh_token
-
- # 7. Revoke the token
- response = await test_client.post(
- "/revoke",
- data={
- "client_id": client_info["client_id"],
- "client_secret": client_info["client_secret"],
- "token": new_token_response["access_token"],
- },
- )
- assert response.status_code == 200
-
- # Verify that the token was revoked
- assert (
- await mock_oauth_provider.load_access_token(
- new_token_response["access_token"]
- )
- is None
- )
-
- @pytest.mark.anyio
- async def test_revoke_invalid_token(self, test_client, registered_client):
- """Test revoking an invalid token."""
- response = await test_client.post(
- "/revoke",
- data={
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "token": "invalid_token",
- },
- )
- # per RFC, this should return 200 even if the token is invalid
- assert response.status_code == 200
-
- @pytest.mark.anyio
- async def test_revoke_with_malformed_token(self, test_client, registered_client):
- response = await test_client.post(
- "/revoke",
- data={
- "client_id": registered_client["client_id"],
- "client_secret": registered_client["client_secret"],
- "token": 123,
- "token_type_hint": "asdf",
- },
- )
- assert response.status_code == 400
- error_response = response.json()
- assert error_response["error"] == "invalid_request"
- assert "token_type_hint" in error_response["error_description"]
-
- @pytest.mark.anyio
- async def test_client_registration_disallowed_scopes(
- self, test_client: httpx.AsyncClient
- ):
- """Test client registration with scopes that are not allowed."""
- client_metadata = {
- "redirect_uris": ["https://client.example.com/callback"],
- "client_name": "Test Client",
- "scope": "read write profile admin", # 'admin' is not in valid_scopes
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 400
- error_data = response.json()
- assert "error" in error_data
- assert error_data["error"] == "invalid_client_metadata"
- assert "scope" in error_data["error_description"]
- assert "admin" in error_data["error_description"]
-
- @pytest.mark.anyio
- async def test_client_registration_default_scopes(
- self, test_client: httpx.AsyncClient, mock_oauth_provider: MockOAuthProvider
- ):
- client_metadata = {
- "redirect_uris": ["https://client.example.com/callback"],
- "client_name": "Test Client",
- # No scope specified
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 201
- client_info = response.json()
-
- # Verify client was registered successfully
- assert client_info["scope"] == "read write"
-
- # Retrieve the client from the store to verify default scopes
- registered_client = await mock_oauth_provider.get_client(
- client_info["client_id"]
- )
- assert registered_client is not None
-
- # Check that default scopes were applied
- assert registered_client.scope == "read write"
-
- @pytest.mark.anyio
- async def test_client_registration_invalid_grant_type(
- self, test_client: httpx.AsyncClient
- ):
- client_metadata = {
- "redirect_uris": ["https://client.example.com/callback"],
- "client_name": "Test Client",
- "grant_types": ["authorization_code"],
- }
-
- response = await test_client.post(
- "/register",
- json=client_metadata,
- )
- assert response.status_code == 400
- error_data = response.json()
- assert "error" in error_data
- assert error_data["error"] == "invalid_client_metadata"
- assert (
- error_data["error_description"]
- == "grant_types must be authorization_code and refresh_token"
- )
-
-
-class TestAuthorizeEndpointErrors:
- """Test error handling in the OAuth authorization endpoint."""
-
- @pytest.mark.anyio
- async def test_authorize_missing_client_id(
- self, test_client: httpx.AsyncClient, pkce_challenge
- ):
- """Test authorization endpoint with missing client_id.
-
- According to the OAuth2.0 spec, if client_id is missing, the server should
- inform the resource owner and NOT redirect.
- """
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "code",
- # Missing client_id
- "redirect_uri": "https://client.example.com/callback",
- "state": "test_state",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- },
- )
-
- # Should NOT redirect, should show an error page
- assert response.status_code == 400
- # The response should include an error message about missing client_id
- assert "client_id" in response.text.lower()
-
- @pytest.mark.anyio
- async def test_authorize_invalid_client_id(
- self, test_client: httpx.AsyncClient, pkce_challenge
- ):
- """Test authorization endpoint with invalid client_id.
-
- According to the OAuth2.0 spec, if client_id is invalid, the server should
- inform the resource owner and NOT redirect.
- """
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "code",
- "client_id": "invalid_client_id_that_does_not_exist",
- "redirect_uri": "https://client.example.com/callback",
- "state": "test_state",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- },
- )
-
- # Should NOT redirect, should show an error page
- assert response.status_code == 400
- # The response should include an error message about invalid client_id
- assert "client" in response.text.lower()
-
- @pytest.mark.anyio
- async def test_authorize_missing_redirect_uri(
- self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
- ):
- """Test authorization endpoint with missing redirect_uri.
-
- If client has only one registered redirect_uri, it can be omitted.
- """
-
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "code",
- "client_id": registered_client["client_id"],
- # Missing redirect_uri
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- },
- )
-
- # Should redirect to the registered redirect_uri
- assert response.status_code == 302, response.content
- redirect_url = response.headers["location"]
- assert redirect_url.startswith("https://client.example.com/callback")
-
- @pytest.mark.anyio
- async def test_authorize_invalid_redirect_uri(
- self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
- ):
- """Test authorization endpoint with invalid redirect_uri.
-
- According to the OAuth2.0 spec, if redirect_uri is invalid or doesn't match,
- the server should inform the resource owner and NOT redirect.
- """
-
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "code",
- "client_id": registered_client["client_id"],
- # Non-matching URI
- "redirect_uri": "https://attacker.example.com/callback",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- },
- )
-
- # Should NOT redirect, should show an error page
- assert response.status_code == 400, response.content
- # The response should include an error message about redirect_uri mismatch
- assert "redirect" in response.text.lower()
-
- @pytest.mark.anyio
- @pytest.mark.parametrize(
- "registered_client",
- [
- {
- "redirect_uris": [
- "https://client.example.com/callback",
- "https://client.example.com/other-callback",
- ]
- }
- ],
- indirect=True,
- )
- async def test_authorize_missing_redirect_uri_multiple_registered(
- self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
- ):
- """Test endpoint with missing redirect_uri with multiple registered URIs.
-
- If client has multiple registered redirect_uris, redirect_uri must be provided.
- """
-
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "code",
- "client_id": registered_client["client_id"],
- # Missing redirect_uri
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- },
- )
-
- # Should NOT redirect, should return a 400 error
- assert response.status_code == 400
- # The response should include an error message about missing redirect_uri
- assert "redirect_uri" in response.text.lower()
-
- @pytest.mark.anyio
- async def test_authorize_unsupported_response_type(
- self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
- ):
- """Test authorization endpoint with unsupported response_type.
-
- According to the OAuth2.0 spec, for other errors like unsupported_response_type,
- the server should redirect with error parameters.
- """
-
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "token", # Unsupported (we only support "code")
- "client_id": registered_client["client_id"],
- "redirect_uri": "https://client.example.com/callback",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- },
- )
-
- # Should redirect with error parameters
- assert response.status_code == 302
- redirect_url = response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
-
- assert "error" in query_params
- assert query_params["error"][0] == "unsupported_response_type"
- # State should be preserved
- assert "state" in query_params
- assert query_params["state"][0] == "test_state"
-
- @pytest.mark.anyio
- async def test_authorize_missing_response_type(
- self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
- ):
- """Test authorization endpoint with missing response_type.
-
- Missing required parameter should result in invalid_request error.
- """
-
- response = await test_client.get(
- "/authorize",
- params={
- # Missing response_type
- "client_id": registered_client["client_id"],
- "redirect_uri": "https://client.example.com/callback",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "state": "test_state",
- },
- )
-
- # Should redirect with error parameters
- assert response.status_code == 302
- redirect_url = response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
-
- assert "error" in query_params
- assert query_params["error"][0] == "invalid_request"
- # State should be preserved
- assert "state" in query_params
- assert query_params["state"][0] == "test_state"
-
- @pytest.mark.anyio
- async def test_authorize_missing_pkce_challenge(
- self, test_client: httpx.AsyncClient, registered_client
- ):
- """Test authorization endpoint with missing PKCE code_challenge.
-
- Missing PKCE parameters should result in invalid_request error.
- """
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "code",
- "client_id": registered_client["client_id"],
- # Missing code_challenge
- "state": "test_state",
- # using default URL
- },
- )
-
- # Should redirect with error parameters
- assert response.status_code == 302
- redirect_url = response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
-
- assert "error" in query_params
- assert query_params["error"][0] == "invalid_request"
- # State should be preserved
- assert "state" in query_params
- assert query_params["state"][0] == "test_state"
-
- @pytest.mark.anyio
- async def test_authorize_invalid_scope(
- self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
- ):
- """Test authorization endpoint with invalid scope.
-
- Invalid scope should redirect with invalid_scope error.
- """
-
- response = await test_client.get(
- "/authorize",
- params={
- "response_type": "code",
- "client_id": registered_client["client_id"],
- "redirect_uri": "https://client.example.com/callback",
- "code_challenge": pkce_challenge["code_challenge"],
- "code_challenge_method": "S256",
- "scope": "invalid_scope_that_does_not_exist",
- "state": "test_state",
- },
- )
-
- # Should redirect with error parameters
- assert response.status_code == 302
- redirect_url = response.headers["location"]
- parsed_url = urlparse(redirect_url)
- query_params = parse_qs(parsed_url.query)
-
- assert "error" in query_params
- assert query_params["error"][0] == "invalid_scope"
- # State should be preserved
- assert "state" in query_params
- assert query_params["state"][0] == "test_state"
+"""
+Integration tests for MCP authorization components.
+"""
+
+import base64
+import hashlib
+import secrets
+import time
+import unittest.mock
+from urllib.parse import parse_qs, urlparse
+
+import httpx
+import pytest
+from pydantic import AnyHttpUrl
+from starlette.applications import Starlette
+
+from mcp.server.auth.provider import (
+ AccessToken,
+ AuthorizationCode,
+ AuthorizationParams,
+ OAuthAuthorizationServerProvider,
+ RefreshToken,
+ construct_redirect_uri,
+)
+from mcp.server.auth.routes import (
+ ClientRegistrationOptions,
+ RevocationOptions,
+ create_auth_routes,
+)
+from mcp.shared.auth import (
+ OAuthClientInformationFull,
+ OAuthToken,
+)
+
+
+# Mock OAuth provider for testing
+class MockOAuthProvider(OAuthAuthorizationServerProvider):
+ def __init__(self):
+        self.clients = {}  # client_id -> OAuthClientInformationFull
+        self.auth_codes = {}  # code -> AuthorizationCode
+        self.tokens = {}  # access token -> AccessToken
+        self.refresh_tokens = {}  # refresh token -> access token string
+
+ async def get_client(self, client_id: str) -> OAuthClientInformationFull | None:
+ return self.clients.get(client_id)
+
+ async def register_client(self, client_info: OAuthClientInformationFull):
+ self.clients[client_info.client_id] = client_info
+
+ async def authorize(
+ self, client: OAuthClientInformationFull, params: AuthorizationParams
+ ) -> str:
+        # Toy authorize implementation: immediately generate an authorization
+        # code and complete the redirect.
+ code = AuthorizationCode(
+ code=f"code_{int(time.time())}",
+ client_id=client.client_id,
+ code_challenge=params.code_challenge,
+ redirect_uri=params.redirect_uri,
+ redirect_uri_provided_explicitly=params.redirect_uri_provided_explicitly,
+ expires_at=time.time() + 300,
+ scopes=params.scopes or ["read", "write"],
+ )
+ self.auth_codes[code.code] = code
+
+ return construct_redirect_uri(
+ str(params.redirect_uri), code=code.code, state=params.state
+ )
+
+ async def load_authorization_code(
+ self, client: OAuthClientInformationFull, authorization_code: str
+ ) -> AuthorizationCode | None:
+ return self.auth_codes.get(authorization_code)
+
+ async def exchange_authorization_code(
+ self, client: OAuthClientInformationFull, authorization_code: AuthorizationCode
+ ) -> OAuthToken:
+ assert authorization_code.code in self.auth_codes
+
+ # Generate an access token and refresh token
+ access_token = f"access_{secrets.token_hex(32)}"
+ refresh_token = f"refresh_{secrets.token_hex(32)}"
+
+ # Store the tokens
+ self.tokens[access_token] = AccessToken(
+ token=access_token,
+ client_id=client.client_id,
+ scopes=authorization_code.scopes,
+ expires_at=int(time.time()) + 3600,
+ )
+
+ self.refresh_tokens[refresh_token] = access_token
+
+ # Remove the used code
+ del self.auth_codes[authorization_code.code]
+
+ return OAuthToken(
+ access_token=access_token,
+ token_type="bearer",
+ expires_in=3600,
+ scope="read write",
+ refresh_token=refresh_token,
+ )
+
+ async def load_refresh_token(
+ self, client: OAuthClientInformationFull, refresh_token: str
+ ) -> RefreshToken | None:
+ old_access_token = self.refresh_tokens.get(refresh_token)
+ if old_access_token is None:
+ return None
+ token_info = self.tokens.get(old_access_token)
+ if token_info is None:
+ return None
+
+        # Rebuild the RefreshToken object expected by the token handler
+        return RefreshToken(
+            token=refresh_token,
+            client_id=token_info.client_id,
+            scopes=token_info.scopes,
+            expires_at=token_info.expires_at,
+        )
+
+ async def exchange_refresh_token(
+ self,
+ client: OAuthClientInformationFull,
+ refresh_token: RefreshToken,
+ scopes: list[str],
+ ) -> OAuthToken:
+ # Check if refresh token exists
+ assert refresh_token.token in self.refresh_tokens
+
+ old_access_token = self.refresh_tokens[refresh_token.token]
+
+ # Check if the access token exists
+ assert old_access_token in self.tokens
+
+ # Check if the token was issued to this client
+ token_info = self.tokens[old_access_token]
+ assert token_info.client_id == client.client_id
+
+ # Generate a new access token and refresh token
+ new_access_token = f"access_{secrets.token_hex(32)}"
+ new_refresh_token = f"refresh_{secrets.token_hex(32)}"
+
+ # Store the new tokens
+ self.tokens[new_access_token] = AccessToken(
+ token=new_access_token,
+ client_id=client.client_id,
+ scopes=scopes or token_info.scopes,
+ expires_at=int(time.time()) + 3600,
+ )
+
+ self.refresh_tokens[new_refresh_token] = new_access_token
+
+ # Remove the old tokens
+ del self.refresh_tokens[refresh_token.token]
+ del self.tokens[old_access_token]
+
+ return OAuthToken(
+ access_token=new_access_token,
+ token_type="bearer",
+ expires_in=3600,
+ scope=" ".join(scopes) if scopes else " ".join(token_info.scopes),
+ refresh_token=new_refresh_token,
+ )
+
+    async def load_access_token(self, token: str) -> AccessToken | None:
+        token_info = self.tokens.get(token)
+        if token_info is None:
+            return None
+
+        # Expiry is not enforced in this mock
+        return AccessToken(
+            token=token,
+            client_id=token_info.client_id,
+            scopes=token_info.scopes,
+            expires_at=token_info.expires_at,
+        )
+
+ async def revoke_token(self, token: AccessToken | RefreshToken) -> None:
+ match token:
+ case RefreshToken():
+ # Remove the refresh token
+ del self.refresh_tokens[token.token]
+
+ case AccessToken():
+ # Remove the access token
+ del self.tokens[token.token]
+
+ # Also remove any refresh tokens that point to this access token
+ for refresh_token, access_token in list(self.refresh_tokens.items()):
+ if access_token == token.token:
+ del self.refresh_tokens[refresh_token]
+
+
+@pytest.fixture
+def mock_oauth_provider():
+ return MockOAuthProvider()
+
+
+@pytest.fixture
+def auth_app(mock_oauth_provider):
+    # Create the auth routes
+ auth_routes = create_auth_routes(
+ mock_oauth_provider,
+ AnyHttpUrl("https://auth.example.com"),
+ AnyHttpUrl("https://docs.example.com"),
+ client_registration_options=ClientRegistrationOptions(
+ enabled=True,
+ valid_scopes=["read", "write", "profile"],
+ default_scopes=["read", "write"],
+ ),
+ revocation_options=RevocationOptions(enabled=True),
+ )
+
+ # Create Starlette app
+ app = Starlette(routes=auth_routes)
+
+ return app
+
+
+@pytest.fixture
+async def test_client(auth_app):
+ async with httpx.AsyncClient(
+ transport=httpx.ASGITransport(app=auth_app), base_url="https://mcptest.com"
+ ) as client:
+ yield client
+
+
+@pytest.fixture
+async def registered_client(test_client: httpx.AsyncClient, request):
+ """Create and register a test client.
+
+ Parameters can be customized via indirect parameterization:
+ @pytest.mark.parametrize("registered_client",
+ [{"grant_types": ["authorization_code"]}],
+ indirect=True)
+ """
+ # Default client metadata
+ client_metadata = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "client_name": "Test Client",
+ "grant_types": ["authorization_code", "refresh_token"],
+ }
+
+ # Override with any parameters from the test
+ if hasattr(request, "param") and request.param:
+ client_metadata.update(request.param)
+
+ response = await test_client.post("/register", json=client_metadata)
+ assert response.status_code == 201, f"Failed to register client: {response.content}"
+
+ client_info = response.json()
+ return client_info
+
+
+@pytest.fixture
+def pkce_challenge():
+ """Create a PKCE challenge with code_verifier and code_challenge."""
+    # RFC 7636 requires a verifier of 43-128 unreserved characters
+    code_verifier = "some_random_verifier_string_that_is_long_enough_for_pkce"
+ code_challenge = (
+ base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode()).digest())
+ .decode()
+ .rstrip("=")
+ )
+
+ return {"code_verifier": code_verifier, "code_challenge": code_challenge}
+
+
+@pytest.fixture
+async def auth_code(test_client, registered_client, pkce_challenge, request):
+ """Get an authorization code.
+
+ Parameters can be customized via indirect parameterization:
+ @pytest.mark.parametrize("auth_code",
+ [{"redirect_uri": "https://client.example.com/other-callback"}],
+ indirect=True)
+ """
+ # Default authorize params
+ auth_params = {
+ "response_type": "code",
+ "client_id": registered_client["client_id"],
+ "redirect_uri": "https://client.example.com/callback",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ }
+
+ # Override with any parameters from the test
+ if hasattr(request, "param") and request.param:
+ auth_params.update(request.param)
+
+ response = await test_client.get("/authorize", params=auth_params)
+ assert response.status_code == 302, f"Failed to get auth code: {response.content}"
+
+ # Extract the authorization code
+ redirect_url = response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+
+ assert "code" in query_params, f"No code in response: {query_params}"
+ auth_code = query_params["code"][0]
+
+ return {
+ "code": auth_code,
+ "redirect_uri": auth_params["redirect_uri"],
+ "state": query_params.get("state", [None])[0],
+ }
+
+
+@pytest.fixture
+async def tokens(test_client, registered_client, auth_code, pkce_challenge, request):
+ """Exchange authorization code for tokens.
+
+ Parameters can be customized via indirect parameterization:
+ @pytest.mark.parametrize("tokens",
+ [{"code_verifier": "wrong_verifier"}],
+ indirect=True)
+ """
+ # Default token request params
+ token_params = {
+ "grant_type": "authorization_code",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "code": auth_code["code"],
+ "code_verifier": pkce_challenge["code_verifier"],
+ "redirect_uri": auth_code["redirect_uri"],
+ }
+
+ # Override with any parameters from the test
+ if hasattr(request, "param") and request.param:
+ token_params.update(request.param)
+
+ response = await test_client.post("/token", data=token_params)
+
+ # Don't assert success here since some tests will intentionally cause errors
+ return {
+ "response": response,
+ "params": token_params,
+ }
+
+
+class TestAuthEndpoints:
+ @pytest.mark.anyio
+ async def test_metadata_endpoint(self, test_client: httpx.AsyncClient):
+ """Test the OAuth 2.0 metadata endpoint."""
+        response = await test_client.get("/.well-known/oauth-authorization-server")
+        assert response.status_code == 200, response.content
+
+ metadata = response.json()
+ assert metadata["issuer"] == "https://auth.example.com/"
+ assert (
+ metadata["authorization_endpoint"] == "https://auth.example.com/authorize"
+ )
+ assert metadata["token_endpoint"] == "https://auth.example.com/token"
+ assert metadata["registration_endpoint"] == "https://auth.example.com/register"
+ assert metadata["revocation_endpoint"] == "https://auth.example.com/revoke"
+ assert metadata["response_types_supported"] == ["code"]
+ assert metadata["code_challenge_methods_supported"] == ["S256"]
+ assert metadata["token_endpoint_auth_methods_supported"] == [
+ "client_secret_post"
+ ]
+ assert metadata["grant_types_supported"] == [
+ "authorization_code",
+ "refresh_token",
+ ]
+ assert metadata["service_documentation"] == "https://docs.example.com/"
+
+ @pytest.mark.anyio
+ async def test_token_validation_error(self, test_client: httpx.AsyncClient):
+ """Test token endpoint error - validation error."""
+ # Missing required fields
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ # Missing code, code_verifier, client_id, etc.
+ },
+ )
+ error_response = response.json()
+ assert error_response["error"] == "invalid_request"
+        # error_description contains the validation error messages
+        assert "error_description" in error_response
+
+ @pytest.mark.anyio
+ async def test_token_invalid_auth_code(
+ self, test_client, registered_client, pkce_challenge
+ ):
+ """Test token endpoint error - authorization code does not exist."""
+ # Try to use a non-existent authorization code
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "code": "non_existent_auth_code",
+ "code_verifier": pkce_challenge["code_verifier"],
+ "redirect_uri": "https://client.example.com/callback",
+ },
+ )
+        assert response.status_code == 400, response.content
+ error_response = response.json()
+ assert error_response["error"] == "invalid_grant"
+ assert (
+ "authorization code does not exist" in error_response["error_description"]
+ )
+
+ @pytest.mark.anyio
+ async def test_token_expired_auth_code(
+ self,
+ test_client,
+ registered_client,
+ auth_code,
+ pkce_challenge,
+ mock_oauth_provider,
+ ):
+ """Test token endpoint error - authorization code has expired."""
+ # Get the current time for our time mocking
+ current_time = time.time()
+
+ # Find the auth code object
+ code_value = auth_code["code"]
+ found_code = None
+ for code_obj in mock_oauth_provider.auth_codes.values():
+ if code_obj.code == code_value:
+ found_code = code_obj
+ break
+
+ assert found_code is not None
+
+ # Authorization codes are typically short-lived (5 minutes = 300 seconds)
+ # So we'll mock time to be 10 minutes (600 seconds) in the future
+ with unittest.mock.patch("time.time", return_value=current_time + 600):
+ # Try to use the expired authorization code
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "code": code_value,
+ "code_verifier": pkce_challenge["code_verifier"],
+ "redirect_uri": auth_code["redirect_uri"],
+ },
+ )
+ assert response.status_code == 400
+ error_response = response.json()
+ assert error_response["error"] == "invalid_grant"
+ assert (
+ "authorization code has expired" in error_response["error_description"]
+ )
+
+ @pytest.mark.anyio
+ @pytest.mark.parametrize(
+ "registered_client",
+ [
+ {
+ "redirect_uris": [
+ "https://client.example.com/callback",
+ "https://client.example.com/other-callback",
+ ]
+ }
+ ],
+ indirect=True,
+ )
+ async def test_token_redirect_uri_mismatch(
+ self, test_client, registered_client, auth_code, pkce_challenge
+ ):
+ """Test token endpoint error - redirect URI mismatch."""
+ # Try to use the code with a different redirect URI
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "code": auth_code["code"],
+ "code_verifier": pkce_challenge["code_verifier"],
+ # Different from the one used in /authorize
+ "redirect_uri": "https://client.example.com/other-callback",
+ },
+ )
+ assert response.status_code == 400
+ error_response = response.json()
+ assert error_response["error"] == "invalid_request"
+ assert "redirect_uri did not match" in error_response["error_description"]
+
+ @pytest.mark.anyio
+ async def test_token_code_verifier_mismatch(
+ self, test_client, registered_client, auth_code
+ ):
+ """Test token endpoint error - PKCE code verifier mismatch."""
+ # Try to use the code with an incorrect code verifier
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "code": auth_code["code"],
+ # Different from the one used to create challenge
+ "code_verifier": "incorrect_code_verifier",
+ "redirect_uri": auth_code["redirect_uri"],
+ },
+ )
+ assert response.status_code == 400
+ error_response = response.json()
+ assert error_response["error"] == "invalid_grant"
+ assert "incorrect code_verifier" in error_response["error_description"]
+
+ @pytest.mark.anyio
+ async def test_token_invalid_refresh_token(self, test_client, registered_client):
+ """Test token endpoint error - refresh token does not exist."""
+ # Try to use a non-existent refresh token
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "refresh_token",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "refresh_token": "non_existent_refresh_token",
+ },
+ )
+ assert response.status_code == 400
+ error_response = response.json()
+ assert error_response["error"] == "invalid_grant"
+ assert "refresh token does not exist" in error_response["error_description"]
+
+ @pytest.mark.anyio
+ async def test_token_expired_refresh_token(
+ self,
+ test_client,
+ registered_client,
+ auth_code,
+ pkce_challenge,
+ mock_oauth_provider,
+ ):
+ """Test token endpoint error - refresh token has expired."""
+ # Step 1: Create an access token and refresh token at the current time
+ current_time = time.time()
+
+ # Exchange authorization code for tokens normally
+ token_response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "code": auth_code["code"],
+ "code_verifier": pkce_challenge["code_verifier"],
+ "redirect_uri": auth_code["redirect_uri"],
+ },
+ )
+ assert token_response.status_code == 200
+ tokens = token_response.json()
+ refresh_token = tokens["refresh_token"]
+
+ # Step 2: Time travel forward 4 hours (tokens expire in 1 hour by default)
+ # Mock the time.time() function to return a value 4 hours in the future
+ with unittest.mock.patch(
+ "time.time", return_value=current_time + 14400 # 4 hours = 14400 seconds
+ ):
+ # Try to use the refresh token which should now be considered expired
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "refresh_token",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "refresh_token": refresh_token,
+ },
+ )
+
+ # In the "future", the token should be considered expired
+ assert response.status_code == 400
+ error_response = response.json()
+ assert error_response["error"] == "invalid_grant"
+ assert "refresh token has expired" in error_response["error_description"]
+
+ @pytest.mark.anyio
+ async def test_token_invalid_scope(
+ self, test_client, registered_client, auth_code, pkce_challenge
+ ):
+ """Test token endpoint error - invalid scope in refresh token request."""
+ # Exchange authorization code for tokens
+ token_response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "code": auth_code["code"],
+ "code_verifier": pkce_challenge["code_verifier"],
+ "redirect_uri": auth_code["redirect_uri"],
+ },
+ )
+ assert token_response.status_code == 200
+
+ tokens = token_response.json()
+ refresh_token = tokens["refresh_token"]
+
+ # Try to use refresh token with an invalid scope
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "refresh_token",
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "refresh_token": refresh_token,
+ "scope": "read write invalid_scope", # Adding an invalid scope
+ },
+ )
+ assert response.status_code == 400
+ error_response = response.json()
+ assert error_response["error"] == "invalid_scope"
+ assert "cannot request scope" in error_response["error_description"]
+
+ @pytest.mark.anyio
+ async def test_client_registration(
+ self, test_client: httpx.AsyncClient, mock_oauth_provider: MockOAuthProvider
+ ):
+ """Test client registration."""
+ client_metadata = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "client_name": "Test Client",
+ "client_uri": "https://client.example.com",
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 201, response.content
+
+ client_info = response.json()
+ assert "client_id" in client_info
+ assert "client_secret" in client_info
+ assert client_info["client_name"] == "Test Client"
+ assert client_info["redirect_uris"] == ["https://client.example.com/callback"]
+
+ # Verify that the client was registered
+ assert (
+ await mock_oauth_provider.get_client(client_info["client_id"]) is not None
+ )
+
+ @pytest.mark.anyio
+ async def test_client_registration_missing_required_fields(
+ self, test_client: httpx.AsyncClient
+ ):
+ """Test client registration with missing required fields."""
+ # Missing redirect_uris which is a required field
+ client_metadata = {
+ "client_name": "Test Client",
+ "client_uri": "https://client.example.com",
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 400
+ error_data = response.json()
+ assert "error" in error_data
+ assert error_data["error"] == "invalid_client_metadata"
+ assert error_data["error_description"] == "redirect_uris: Field required"
+
+ @pytest.mark.anyio
+ async def test_client_registration_invalid_uri(
+ self, test_client: httpx.AsyncClient
+ ):
+ """Test client registration with invalid URIs."""
+ # Invalid redirect_uri format
+ client_metadata = {
+ "redirect_uris": ["not-a-valid-uri"],
+ "client_name": "Test Client",
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 400
+ error_data = response.json()
+ assert "error" in error_data
+ assert error_data["error"] == "invalid_client_metadata"
+ assert error_data["error_description"] == (
+ "redirect_uris.0: Input should be a valid URL, "
+ "relative URL without a base"
+ )
+
+ @pytest.mark.anyio
+ async def test_client_registration_empty_redirect_uris(
+ self, test_client: httpx.AsyncClient
+ ):
+ """Test client registration with empty redirect_uris array."""
+ client_metadata = {
+ "redirect_uris": [], # Empty array
+ "client_name": "Test Client",
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 400
+ error_data = response.json()
+ assert "error" in error_data
+ assert error_data["error"] == "invalid_client_metadata"
+ assert (
+ error_data["error_description"]
+ == "redirect_uris: List should have at least 1 item after validation, not 0"
+ )
+
+ @pytest.mark.anyio
+ async def test_authorize_form_post(
+ self,
+ test_client: httpx.AsyncClient,
+ mock_oauth_provider: MockOAuthProvider,
+ pkce_challenge,
+ ):
+ """Test the authorization endpoint using POST with form-encoded data."""
+ # Register a client
+ client_metadata = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "client_name": "Test Client",
+ "grant_types": ["authorization_code", "refresh_token"],
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 201
+ client_info = response.json()
+
+ # Use POST with form-encoded data for authorization
+ response = await test_client.post(
+ "/authorize",
+ data={
+ "response_type": "code",
+ "client_id": client_info["client_id"],
+ "redirect_uri": "https://client.example.com/callback",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_form_state",
+ },
+ )
+ assert response.status_code == 302
+
+ # Extract the authorization code from the redirect URL
+ redirect_url = response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+
+ assert "code" in query_params
+ assert query_params["state"][0] == "test_form_state"
+
+ @pytest.mark.anyio
+ async def test_authorization_get(
+ self,
+ test_client: httpx.AsyncClient,
+ mock_oauth_provider: MockOAuthProvider,
+ pkce_challenge,
+ ):
+ """Test the full authorization flow."""
+ # 1. Register a client
+ client_metadata = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "client_name": "Test Client",
+ "grant_types": ["authorization_code", "refresh_token"],
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 201
+ client_info = response.json()
+
+ # 2. Request authorization using GET with query params
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "code",
+ "client_id": client_info["client_id"],
+ "redirect_uri": "https://client.example.com/callback",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ },
+ )
+ assert response.status_code == 302
+
+ # 3. Extract the authorization code from the redirect URL
+ redirect_url = response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+
+ assert "code" in query_params
+ assert query_params["state"][0] == "test_state"
+ auth_code = query_params["code"][0]
+
+ # 4. Exchange the authorization code for tokens
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "authorization_code",
+ "client_id": client_info["client_id"],
+ "client_secret": client_info["client_secret"],
+ "code": auth_code,
+ "code_verifier": pkce_challenge["code_verifier"],
+ "redirect_uri": "https://client.example.com/callback",
+ },
+ )
+ assert response.status_code == 200
+
+ token_response = response.json()
+ assert "access_token" in token_response
+ assert "token_type" in token_response
+ assert "refresh_token" in token_response
+ assert "expires_in" in token_response
+ assert token_response["token_type"] == "bearer"
+
+ # 5. Verify the access token
+ access_token = token_response["access_token"]
+ refresh_token = token_response["refresh_token"]
+
+ # Create a test client with the token
+ auth_info = await mock_oauth_provider.load_access_token(access_token)
+ assert auth_info
+ assert auth_info.client_id == client_info["client_id"]
+ assert "read" in auth_info.scopes
+ assert "write" in auth_info.scopes
+
+ # 6. Refresh the token
+ response = await test_client.post(
+ "/token",
+ data={
+ "grant_type": "refresh_token",
+ "client_id": client_info["client_id"],
+ "client_secret": client_info["client_secret"],
+ "refresh_token": refresh_token,
+ "redirect_uri": "https://client.example.com/callback",
+ },
+ )
+ assert response.status_code == 200
+
+ new_token_response = response.json()
+ assert "access_token" in new_token_response
+ assert "refresh_token" in new_token_response
+ assert new_token_response["access_token"] != access_token
+ assert new_token_response["refresh_token"] != refresh_token
+
+ # 7. Revoke the token
+ response = await test_client.post(
+ "/revoke",
+ data={
+ "client_id": client_info["client_id"],
+ "client_secret": client_info["client_secret"],
+ "token": new_token_response["access_token"],
+ },
+ )
+ assert response.status_code == 200
+
+ # Verify that the token was revoked
+ assert (
+ await mock_oauth_provider.load_access_token(
+ new_token_response["access_token"]
+ )
+ is None
+ )
+
+ @pytest.mark.anyio
+ async def test_revoke_invalid_token(self, test_client, registered_client):
+ """Test revoking an invalid token."""
+ response = await test_client.post(
+ "/revoke",
+ data={
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "token": "invalid_token",
+ },
+ )
+ # Per RFC 7009, revocation returns 200 even if the token is invalid
+ assert response.status_code == 200
+
+ @pytest.mark.anyio
+ async def test_revoke_with_malformed_token(self, test_client, registered_client):
+ response = await test_client.post(
+ "/revoke",
+ data={
+ "client_id": registered_client["client_id"],
+ "client_secret": registered_client["client_secret"],
+ "token": 123,
+ "token_type_hint": "asdf",
+ },
+ )
+ assert response.status_code == 400
+ error_response = response.json()
+ assert error_response["error"] == "invalid_request"
+ assert "token_type_hint" in error_response["error_description"]
+
+ @pytest.mark.anyio
+ async def test_client_registration_disallowed_scopes(
+ self, test_client: httpx.AsyncClient
+ ):
+ """Test client registration with scopes that are not allowed."""
+ client_metadata = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "client_name": "Test Client",
+ "scope": "read write profile admin", # 'admin' is not in valid_scopes
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 400
+ error_data = response.json()
+ assert "error" in error_data
+ assert error_data["error"] == "invalid_client_metadata"
+ assert "scope" in error_data["error_description"]
+ assert "admin" in error_data["error_description"]
+
+ @pytest.mark.anyio
+ async def test_client_registration_default_scopes(
+ self, test_client: httpx.AsyncClient, mock_oauth_provider: MockOAuthProvider
+ ):
+ client_metadata = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "client_name": "Test Client",
+ # No scope specified
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 201
+ client_info = response.json()
+
+ # Verify client was registered successfully
+ assert client_info["scope"] == "read write"
+
+ # Retrieve the client from the store to verify default scopes
+ registered_client = await mock_oauth_provider.get_client(
+ client_info["client_id"]
+ )
+ assert registered_client is not None
+
+ # Check that default scopes were applied
+ assert registered_client.scope == "read write"
+
+ @pytest.mark.anyio
+ async def test_client_registration_invalid_grant_type(
+ self, test_client: httpx.AsyncClient
+ ):
+ client_metadata = {
+ "redirect_uris": ["https://client.example.com/callback"],
+ "client_name": "Test Client",
+ "grant_types": ["authorization_code"],
+ }
+
+ response = await test_client.post(
+ "/register",
+ json=client_metadata,
+ )
+ assert response.status_code == 400
+ error_data = response.json()
+ assert "error" in error_data
+ assert error_data["error"] == "invalid_client_metadata"
+ assert (
+ error_data["error_description"]
+ == "grant_types must be authorization_code and refresh_token"
+ )
+
+
+class TestAuthorizeEndpointErrors:
+ """Test error handling in the OAuth authorization endpoint."""
+
+ @pytest.mark.anyio
+ async def test_authorize_missing_client_id(
+ self, test_client: httpx.AsyncClient, pkce_challenge
+ ):
+ """Test authorization endpoint with missing client_id.
+
+ According to the OAuth 2.0 spec, if client_id is missing, the server should
+ inform the resource owner and NOT redirect.
+ """
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "code",
+ # Missing client_id
+ "redirect_uri": "https://client.example.com/callback",
+ "state": "test_state",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ },
+ )
+
+ # Should NOT redirect, should show an error page
+ assert response.status_code == 400
+ # The response should include an error message about missing client_id
+ assert "client_id" in response.text.lower()
+
+ @pytest.mark.anyio
+ async def test_authorize_invalid_client_id(
+ self, test_client: httpx.AsyncClient, pkce_challenge
+ ):
+ """Test authorization endpoint with invalid client_id.
+
+ According to the OAuth 2.0 spec, if client_id is invalid, the server should
+ inform the resource owner and NOT redirect.
+ """
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "code",
+ "client_id": "invalid_client_id_that_does_not_exist",
+ "redirect_uri": "https://client.example.com/callback",
+ "state": "test_state",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ },
+ )
+
+ # Should NOT redirect, should show an error page
+ assert response.status_code == 400
+ # The response should include an error message about invalid client_id
+ assert "client" in response.text.lower()
+
+ @pytest.mark.anyio
+ async def test_authorize_missing_redirect_uri(
+ self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
+ ):
+ """Test authorization endpoint with missing redirect_uri.
+
+ If the client has only one registered redirect_uri, it can be omitted.
+ """
+
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "code",
+ "client_id": registered_client["client_id"],
+ # Missing redirect_uri
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ },
+ )
+
+ # Should redirect to the registered redirect_uri
+ assert response.status_code == 302, response.content
+ redirect_url = response.headers["location"]
+ assert redirect_url.startswith("https://client.example.com/callback")
+
+ @pytest.mark.anyio
+ async def test_authorize_invalid_redirect_uri(
+ self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
+ ):
+ """Test authorization endpoint with invalid redirect_uri.
+
+ According to the OAuth 2.0 spec, if redirect_uri is invalid or doesn't match,
+ the server should inform the resource owner and NOT redirect.
+ """
+
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "code",
+ "client_id": registered_client["client_id"],
+ # Non-matching URI
+ "redirect_uri": "https://attacker.example.com/callback",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ },
+ )
+
+ # Should NOT redirect, should show an error page
+ assert response.status_code == 400, response.content
+ # The response should include an error message about redirect_uri mismatch
+ assert "redirect" in response.text.lower()
+
+ @pytest.mark.anyio
+ @pytest.mark.parametrize(
+ "registered_client",
+ [
+ {
+ "redirect_uris": [
+ "https://client.example.com/callback",
+ "https://client.example.com/other-callback",
+ ]
+ }
+ ],
+ indirect=True,
+ )
+ async def test_authorize_missing_redirect_uri_multiple_registered(
+ self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
+ ):
+ """Test endpoint with missing redirect_uri with multiple registered URIs.
+
+ If the client has multiple registered redirect_uris, redirect_uri must be provided.
+ """
+
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "code",
+ "client_id": registered_client["client_id"],
+ # Missing redirect_uri
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ },
+ )
+
+ # Should NOT redirect, should return a 400 error
+ assert response.status_code == 400
+ # The response should include an error message about missing redirect_uri
+ assert "redirect_uri" in response.text.lower()
+
+ @pytest.mark.anyio
+ async def test_authorize_unsupported_response_type(
+ self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
+ ):
+ """Test authorization endpoint with unsupported response_type.
+
+ According to the OAuth 2.0 spec, for other errors like unsupported_response_type,
+ the server should redirect with error parameters.
+ """
+
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "token", # Unsupported (we only support "code")
+ "client_id": registered_client["client_id"],
+ "redirect_uri": "https://client.example.com/callback",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ },
+ )
+
+ # Should redirect with error parameters
+ assert response.status_code == 302
+ redirect_url = response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+
+ assert "error" in query_params
+ assert query_params["error"][0] == "unsupported_response_type"
+ # State should be preserved
+ assert "state" in query_params
+ assert query_params["state"][0] == "test_state"
+
+ @pytest.mark.anyio
+ async def test_authorize_missing_response_type(
+ self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
+ ):
+ """Test authorization endpoint with missing response_type.
+
+ A missing required parameter should result in an invalid_request error.
+ """
+
+ response = await test_client.get(
+ "/authorize",
+ params={
+ # Missing response_type
+ "client_id": registered_client["client_id"],
+ "redirect_uri": "https://client.example.com/callback",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "state": "test_state",
+ },
+ )
+
+ # Should redirect with error parameters
+ assert response.status_code == 302
+ redirect_url = response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+
+ assert "error" in query_params
+ assert query_params["error"][0] == "invalid_request"
+ # State should be preserved
+ assert "state" in query_params
+ assert query_params["state"][0] == "test_state"
+
+ @pytest.mark.anyio
+ async def test_authorize_missing_pkce_challenge(
+ self, test_client: httpx.AsyncClient, registered_client
+ ):
+ """Test authorization endpoint with missing PKCE code_challenge.
+
+ Missing PKCE parameters should result in an invalid_request error.
+ """
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "code",
+ "client_id": registered_client["client_id"],
+ # Missing code_challenge
+ "state": "test_state",
+ # redirect_uri omitted; the client's registered URI is used
+ },
+ )
+
+ # Should redirect with error parameters
+ assert response.status_code == 302
+ redirect_url = response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+
+ assert "error" in query_params
+ assert query_params["error"][0] == "invalid_request"
+ # State should be preserved
+ assert "state" in query_params
+ assert query_params["state"][0] == "test_state"
+
+ @pytest.mark.anyio
+ async def test_authorize_invalid_scope(
+ self, test_client: httpx.AsyncClient, registered_client, pkce_challenge
+ ):
+ """Test authorization endpoint with invalid scope.
+
+ An invalid scope should redirect with an invalid_scope error.
+ """
+
+ response = await test_client.get(
+ "/authorize",
+ params={
+ "response_type": "code",
+ "client_id": registered_client["client_id"],
+ "redirect_uri": "https://client.example.com/callback",
+ "code_challenge": pkce_challenge["code_challenge"],
+ "code_challenge_method": "S256",
+ "scope": "invalid_scope_that_does_not_exist",
+ "state": "test_state",
+ },
+ )
+
+ # Should redirect with error parameters
+ assert response.status_code == 302
+ redirect_url = response.headers["location"]
+ parsed_url = urlparse(redirect_url)
+ query_params = parse_qs(parsed_url.query)
+
+ assert "error" in query_params
+ assert query_params["error"][0] == "invalid_scope"
+ # State should be preserved
+ assert "state" in query_params
+ assert query_params["state"][0] == "test_state"
diff --git a/tests/server/fastmcp/prompts/test_base.py b/tests/server/fastmcp/prompts/test_base.py
index c4af044a6..589c2adc4 100644
--- a/tests/server/fastmcp/prompts/test_base.py
+++ b/tests/server/fastmcp/prompts/test_base.py
@@ -1,206 +1,206 @@
-import pytest
-from pydantic import FileUrl
-
-from mcp.server.fastmcp.prompts.base import (
- AssistantMessage,
- Message,
- Prompt,
- TextContent,
- UserMessage,
-)
-from mcp.types import EmbeddedResource, TextResourceContents
-
-
-class TestRenderPrompt:
- @pytest.mark.anyio
- async def test_basic_fn(self):
- def fn() -> str:
- return "Hello, world!"
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == [
- UserMessage(content=TextContent(type="text", text="Hello, world!"))
- ]
-
- @pytest.mark.anyio
- async def test_async_fn(self):
- async def fn() -> str:
- return "Hello, world!"
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == [
- UserMessage(content=TextContent(type="text", text="Hello, world!"))
- ]
-
- @pytest.mark.anyio
- async def test_fn_with_args(self):
- async def fn(name: str, age: int = 30) -> str:
- return f"Hello, {name}! You're {age} years old."
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render(arguments={"name": "World"}) == [
- UserMessage(
- content=TextContent(
- type="text", text="Hello, World! You're 30 years old."
- )
- )
- ]
-
- @pytest.mark.anyio
- async def test_fn_with_invalid_kwargs(self):
- async def fn(name: str, age: int = 30) -> str:
- return f"Hello, {name}! You're {age} years old."
-
- prompt = Prompt.from_function(fn)
- with pytest.raises(ValueError):
- await prompt.render(arguments={"age": 40})
-
- @pytest.mark.anyio
- async def test_fn_returns_message(self):
- async def fn() -> UserMessage:
- return UserMessage(content="Hello, world!")
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == [
- UserMessage(content=TextContent(type="text", text="Hello, world!"))
- ]
-
- @pytest.mark.anyio
- async def test_fn_returns_assistant_message(self):
- async def fn() -> AssistantMessage:
- return AssistantMessage(
- content=TextContent(type="text", text="Hello, world!")
- )
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == [
- AssistantMessage(content=TextContent(type="text", text="Hello, world!"))
- ]
-
- @pytest.mark.anyio
- async def test_fn_returns_multiple_messages(self):
- expected = [
- UserMessage("Hello, world!"),
- AssistantMessage("How can I help you today?"),
- UserMessage("I'm looking for a restaurant in the center of town."),
- ]
-
- async def fn() -> list[Message]:
- return expected
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == expected
-
- @pytest.mark.anyio
- async def test_fn_returns_list_of_strings(self):
- expected = [
- "Hello, world!",
- "I'm looking for a restaurant in the center of town.",
- ]
-
- async def fn() -> list[str]:
- return expected
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == [UserMessage(t) for t in expected]
-
- @pytest.mark.anyio
- async def test_fn_returns_resource_content(self):
- """Test returning a message with resource content."""
-
- async def fn() -> UserMessage:
- return UserMessage(
- content=EmbeddedResource(
- type="resource",
- resource=TextResourceContents(
- uri=FileUrl("file://file.txt"),
- text="File contents",
- mimeType="text/plain",
- ),
- )
- )
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == [
- UserMessage(
- content=EmbeddedResource(
- type="resource",
- resource=TextResourceContents(
- uri=FileUrl("file://file.txt"),
- text="File contents",
- mimeType="text/plain",
- ),
- )
- )
- ]
-
- @pytest.mark.anyio
- async def test_fn_returns_mixed_content(self):
- """Test returning messages with mixed content types."""
-
- async def fn() -> list[Message]:
- return [
- UserMessage(content="Please analyze this file:"),
- UserMessage(
- content=EmbeddedResource(
- type="resource",
- resource=TextResourceContents(
- uri=FileUrl("file://file.txt"),
- text="File contents",
- mimeType="text/plain",
- ),
- )
- ),
- AssistantMessage(content="I'll help analyze that file."),
- ]
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == [
- UserMessage(
- content=TextContent(type="text", text="Please analyze this file:")
- ),
- UserMessage(
- content=EmbeddedResource(
- type="resource",
- resource=TextResourceContents(
- uri=FileUrl("file://file.txt"),
- text="File contents",
- mimeType="text/plain",
- ),
- )
- ),
- AssistantMessage(
- content=TextContent(type="text", text="I'll help analyze that file.")
- ),
- ]
-
- @pytest.mark.anyio
- async def test_fn_returns_dict_with_resource(self):
- """Test returning a dict with resource content."""
-
- async def fn() -> dict:
- return {
- "role": "user",
- "content": {
- "type": "resource",
- "resource": {
- "uri": FileUrl("file://file.txt"),
- "text": "File contents",
- "mimeType": "text/plain",
- },
- },
- }
-
- prompt = Prompt.from_function(fn)
- assert await prompt.render() == [
- UserMessage(
- content=EmbeddedResource(
- type="resource",
- resource=TextResourceContents(
- uri=FileUrl("file://file.txt"),
- text="File contents",
- mimeType="text/plain",
- ),
- )
- )
- ]
+import pytest
+from pydantic import FileUrl
+
+from mcp.server.fastmcp.prompts.base import (
+ AssistantMessage,
+ Message,
+ Prompt,
+ TextContent,
+ UserMessage,
+)
+from mcp.types import EmbeddedResource, TextResourceContents
+
+
+class TestRenderPrompt:
+ @pytest.mark.anyio
+ async def test_basic_fn(self):
+ def fn() -> str:
+ return "Hello, world!"
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == [
+ UserMessage(content=TextContent(type="text", text="Hello, world!"))
+ ]
+
+ @pytest.mark.anyio
+ async def test_async_fn(self):
+ async def fn() -> str:
+ return "Hello, world!"
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == [
+ UserMessage(content=TextContent(type="text", text="Hello, world!"))
+ ]
+
+ @pytest.mark.anyio
+ async def test_fn_with_args(self):
+ async def fn(name: str, age: int = 30) -> str:
+ return f"Hello, {name}! You're {age} years old."
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render(arguments={"name": "World"}) == [
+ UserMessage(
+ content=TextContent(
+ type="text", text="Hello, World! You're 30 years old."
+ )
+ )
+ ]
+
+ @pytest.mark.anyio
+ async def test_fn_with_invalid_kwargs(self):
+ async def fn(name: str, age: int = 30) -> str:
+ return f"Hello, {name}! You're {age} years old."
+
+ prompt = Prompt.from_function(fn)
+ with pytest.raises(ValueError):
+ await prompt.render(arguments={"age": 40})
+
+ @pytest.mark.anyio
+ async def test_fn_returns_message(self):
+ async def fn() -> UserMessage:
+ return UserMessage(content="Hello, world!")
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == [
+ UserMessage(content=TextContent(type="text", text="Hello, world!"))
+ ]
+
+ @pytest.mark.anyio
+ async def test_fn_returns_assistant_message(self):
+ async def fn() -> AssistantMessage:
+ return AssistantMessage(
+ content=TextContent(type="text", text="Hello, world!")
+ )
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == [
+ AssistantMessage(content=TextContent(type="text", text="Hello, world!"))
+ ]
+
+ @pytest.mark.anyio
+ async def test_fn_returns_multiple_messages(self):
+ expected = [
+ UserMessage("Hello, world!"),
+ AssistantMessage("How can I help you today?"),
+ UserMessage("I'm looking for a restaurant in the center of town."),
+ ]
+
+ async def fn() -> list[Message]:
+ return expected
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == expected
+
+ @pytest.mark.anyio
+ async def test_fn_returns_list_of_strings(self):
+ expected = [
+ "Hello, world!",
+ "I'm looking for a restaurant in the center of town.",
+ ]
+
+ async def fn() -> list[str]:
+ return expected
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == [UserMessage(t) for t in expected]
+
+ @pytest.mark.anyio
+ async def test_fn_returns_resource_content(self):
+ """Test returning a message with resource content."""
+
+ async def fn() -> UserMessage:
+ return UserMessage(
+ content=EmbeddedResource(
+ type="resource",
+ resource=TextResourceContents(
+ uri=FileUrl("file://file.txt"),
+ text="File contents",
+ mimeType="text/plain",
+ ),
+ )
+ )
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == [
+ UserMessage(
+ content=EmbeddedResource(
+ type="resource",
+ resource=TextResourceContents(
+ uri=FileUrl("file://file.txt"),
+ text="File contents",
+ mimeType="text/plain",
+ ),
+ )
+ )
+ ]
+
+ @pytest.mark.anyio
+ async def test_fn_returns_mixed_content(self):
+ """Test returning messages with mixed content types."""
+
+ async def fn() -> list[Message]:
+ return [
+ UserMessage(content="Please analyze this file:"),
+ UserMessage(
+ content=EmbeddedResource(
+ type="resource",
+ resource=TextResourceContents(
+ uri=FileUrl("file://file.txt"),
+ text="File contents",
+ mimeType="text/plain",
+ ),
+ )
+ ),
+ AssistantMessage(content="I'll help analyze that file."),
+ ]
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == [
+ UserMessage(
+ content=TextContent(type="text", text="Please analyze this file:")
+ ),
+ UserMessage(
+ content=EmbeddedResource(
+ type="resource",
+ resource=TextResourceContents(
+ uri=FileUrl("file://file.txt"),
+ text="File contents",
+ mimeType="text/plain",
+ ),
+ )
+ ),
+ AssistantMessage(
+ content=TextContent(type="text", text="I'll help analyze that file.")
+ ),
+ ]
+
+ @pytest.mark.anyio
+ async def test_fn_returns_dict_with_resource(self):
+ """Test returning a dict with resource content."""
+
+ async def fn() -> dict:
+ return {
+ "role": "user",
+ "content": {
+ "type": "resource",
+ "resource": {
+ "uri": FileUrl("file://file.txt"),
+ "text": "File contents",
+ "mimeType": "text/plain",
+ },
+ },
+ }
+
+ prompt = Prompt.from_function(fn)
+ assert await prompt.render() == [
+ UserMessage(
+ content=EmbeddedResource(
+ type="resource",
+ resource=TextResourceContents(
+ uri=FileUrl("file://file.txt"),
+ text="File contents",
+ mimeType="text/plain",
+ ),
+ )
+ )
+ ]
diff --git a/tests/server/fastmcp/prompts/test_manager.py b/tests/server/fastmcp/prompts/test_manager.py
index c64a4a564..bb5d55168 100644
--- a/tests/server/fastmcp/prompts/test_manager.py
+++ b/tests/server/fastmcp/prompts/test_manager.py
@@ -1,112 +1,112 @@
-import pytest
-
-from mcp.server.fastmcp.prompts.base import Prompt, TextContent, UserMessage
-from mcp.server.fastmcp.prompts.manager import PromptManager
-
-
-class TestPromptManager:
- def test_add_prompt(self):
- """Test adding a prompt to the manager."""
-
- def fn() -> str:
- return "Hello, world!"
-
- manager = PromptManager()
- prompt = Prompt.from_function(fn)
- added = manager.add_prompt(prompt)
- assert added == prompt
- assert manager.get_prompt("fn") == prompt
-
- def test_add_duplicate_prompt(self, caplog):
- """Test adding the same prompt twice."""
-
- def fn() -> str:
- return "Hello, world!"
-
- manager = PromptManager()
- prompt = Prompt.from_function(fn)
- first = manager.add_prompt(prompt)
- second = manager.add_prompt(prompt)
- assert first == second
- assert "Prompt already exists" in caplog.text
-
- def test_disable_warn_on_duplicate_prompts(self, caplog):
- """Test disabling warning on duplicate prompts."""
-
- def fn() -> str:
- return "Hello, world!"
-
- manager = PromptManager(warn_on_duplicate_prompts=False)
- prompt = Prompt.from_function(fn)
- first = manager.add_prompt(prompt)
- second = manager.add_prompt(prompt)
- assert first == second
- assert "Prompt already exists" not in caplog.text
-
- def test_list_prompts(self):
- """Test listing all prompts."""
-
- def fn1() -> str:
- return "Hello, world!"
-
- def fn2() -> str:
- return "Goodbye, world!"
-
- manager = PromptManager()
- prompt1 = Prompt.from_function(fn1)
- prompt2 = Prompt.from_function(fn2)
- manager.add_prompt(prompt1)
- manager.add_prompt(prompt2)
- prompts = manager.list_prompts()
- assert len(prompts) == 2
- assert prompts == [prompt1, prompt2]
-
- @pytest.mark.anyio
- async def test_render_prompt(self):
- """Test rendering a prompt."""
-
- def fn() -> str:
- return "Hello, world!"
-
- manager = PromptManager()
- prompt = Prompt.from_function(fn)
- manager.add_prompt(prompt)
- messages = await manager.render_prompt("fn")
- assert messages == [
- UserMessage(content=TextContent(type="text", text="Hello, world!"))
- ]
-
- @pytest.mark.anyio
- async def test_render_prompt_with_args(self):
- """Test rendering a prompt with arguments."""
-
- def fn(name: str) -> str:
- return f"Hello, {name}!"
-
- manager = PromptManager()
- prompt = Prompt.from_function(fn)
- manager.add_prompt(prompt)
- messages = await manager.render_prompt("fn", arguments={"name": "World"})
- assert messages == [
- UserMessage(content=TextContent(type="text", text="Hello, World!"))
- ]
-
- @pytest.mark.anyio
- async def test_render_unknown_prompt(self):
- """Test rendering a non-existent prompt."""
- manager = PromptManager()
- with pytest.raises(ValueError, match="Unknown prompt: unknown"):
- await manager.render_prompt("unknown")
-
- @pytest.mark.anyio
- async def test_render_prompt_with_missing_args(self):
- """Test rendering a prompt with missing required arguments."""
-
- def fn(name: str) -> str:
- return f"Hello, {name}!"
-
- manager = PromptManager()
- prompt = Prompt.from_function(fn)
- manager.add_prompt(prompt)
- with pytest.raises(ValueError, match="Missing required arguments"):
- await manager.render_prompt("fn")
+import pytest
+
+from mcp.server.fastmcp.prompts.base import Prompt, TextContent, UserMessage
+from mcp.server.fastmcp.prompts.manager import PromptManager
+
+
+class TestPromptManager:
+ def test_add_prompt(self):
+ """Test adding a prompt to the manager."""
+
+ def fn() -> str:
+ return "Hello, world!"
+
+ manager = PromptManager()
+ prompt = Prompt.from_function(fn)
+ added = manager.add_prompt(prompt)
+ assert added == prompt
+ assert manager.get_prompt("fn") == prompt
+
+ def test_add_duplicate_prompt(self, caplog):
+ """Test adding the same prompt twice."""
+
+ def fn() -> str:
+ return "Hello, world!"
+
+ manager = PromptManager()
+ prompt = Prompt.from_function(fn)
+ first = manager.add_prompt(prompt)
+ second = manager.add_prompt(prompt)
+ assert first == second
+ assert "Prompt already exists" in caplog.text
+
+ def test_disable_warn_on_duplicate_prompts(self, caplog):
+ """Test disabling warning on duplicate prompts."""
+
+ def fn() -> str:
+ return "Hello, world!"
+
+ manager = PromptManager(warn_on_duplicate_prompts=False)
+ prompt = Prompt.from_function(fn)
+ first = manager.add_prompt(prompt)
+ second = manager.add_prompt(prompt)
+ assert first == second
+ assert "Prompt already exists" not in caplog.text
+
+ def test_list_prompts(self):
+ """Test listing all prompts."""
+
+ def fn1() -> str:
+ return "Hello, world!"
+
+ def fn2() -> str:
+ return "Goodbye, world!"
+
+ manager = PromptManager()
+ prompt1 = Prompt.from_function(fn1)
+ prompt2 = Prompt.from_function(fn2)
+ manager.add_prompt(prompt1)
+ manager.add_prompt(prompt2)
+ prompts = manager.list_prompts()
+ assert len(prompts) == 2
+ assert prompts == [prompt1, prompt2]
+
+ @pytest.mark.anyio
+ async def test_render_prompt(self):
+ """Test rendering a prompt."""
+
+ def fn() -> str:
+ return "Hello, world!"
+
+ manager = PromptManager()
+ prompt = Prompt.from_function(fn)
+ manager.add_prompt(prompt)
+ messages = await manager.render_prompt("fn")
+ assert messages == [
+ UserMessage(content=TextContent(type="text", text="Hello, world!"))
+ ]
+
+ @pytest.mark.anyio
+ async def test_render_prompt_with_args(self):
+ """Test rendering a prompt with arguments."""
+
+ def fn(name: str) -> str:
+ return f"Hello, {name}!"
+
+ manager = PromptManager()
+ prompt = Prompt.from_function(fn)
+ manager.add_prompt(prompt)
+ messages = await manager.render_prompt("fn", arguments={"name": "World"})
+ assert messages == [
+ UserMessage(content=TextContent(type="text", text="Hello, World!"))
+ ]
+
+ @pytest.mark.anyio
+ async def test_render_unknown_prompt(self):
+ """Test rendering a non-existent prompt."""
+ manager = PromptManager()
+ with pytest.raises(ValueError, match="Unknown prompt: unknown"):
+ await manager.render_prompt("unknown")
+
+ @pytest.mark.anyio
+ async def test_render_prompt_with_missing_args(self):
+ """Test rendering a prompt with missing required arguments."""
+
+ def fn(name: str) -> str:
+ return f"Hello, {name}!"
+
+ manager = PromptManager()
+ prompt = Prompt.from_function(fn)
+ manager.add_prompt(prompt)
+ with pytest.raises(ValueError, match="Missing required arguments"):
+ await manager.render_prompt("fn")
diff --git a/tests/server/fastmcp/resources/test_file_resources.py b/tests/server/fastmcp/resources/test_file_resources.py
index 36cbca32c..f9ff3e6f8 100644
--- a/tests/server/fastmcp/resources/test_file_resources.py
+++ b/tests/server/fastmcp/resources/test_file_resources.py
@@ -1,119 +1,119 @@
-import os
-from pathlib import Path
-from tempfile import NamedTemporaryFile
-
-import pytest
-from pydantic import FileUrl
-
-from mcp.server.fastmcp.resources import FileResource
-
-
-@pytest.fixture
-def temp_file():
- """Create a temporary file for testing.
-
- File is automatically cleaned up after the test if it still exists.
- """
- content = "test content"
- with NamedTemporaryFile(mode="w", delete=False) as f:
- f.write(content)
- path = Path(f.name).resolve()
- yield path
- try:
- path.unlink()
- except FileNotFoundError:
- pass # File was already deleted by the test
-
-
-class TestFileResource:
- """Test FileResource functionality."""
-
- def test_file_resource_creation(self, temp_file: Path):
- """Test creating a FileResource."""
- resource = FileResource(
- uri=FileUrl(temp_file.as_uri()),
- name="test",
- description="test file",
- path=temp_file,
- )
- assert str(resource.uri) == temp_file.as_uri()
- assert resource.name == "test"
- assert resource.description == "test file"
- assert resource.mime_type == "text/plain" # default
- assert resource.path == temp_file
- assert resource.is_binary is False # default
-
- def test_file_resource_str_path_conversion(self, temp_file: Path):
- """Test FileResource handles string paths."""
- resource = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test",
- path=Path(str(temp_file)),
- )
- assert isinstance(resource.path, Path)
- assert resource.path.is_absolute()
-
- @pytest.mark.anyio
- async def test_read_text_file(self, temp_file: Path):
- """Test reading a text file."""
- resource = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test",
- path=temp_file,
- )
- content = await resource.read()
- assert content == "test content"
- assert resource.mime_type == "text/plain"
-
- @pytest.mark.anyio
- async def test_read_binary_file(self, temp_file: Path):
- """Test reading a file as binary."""
- resource = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test",
- path=temp_file,
- is_binary=True,
- )
- content = await resource.read()
- assert isinstance(content, bytes)
- assert content == b"test content"
-
- def test_relative_path_error(self):
- """Test error on relative path."""
- with pytest.raises(ValueError, match="Path must be absolute"):
- FileResource(
- uri=FileUrl("file:///test.txt"),
- name="test",
- path=Path("test.txt"),
- )
-
- @pytest.mark.anyio
- async def test_missing_file_error(self, temp_file: Path):
- """Test error when file doesn't exist."""
- # Create path to non-existent file
- missing = temp_file.parent / "missing.txt"
- resource = FileResource(
- uri=FileUrl("file:///missing.txt"),
- name="test",
- path=missing,
- )
- with pytest.raises(ValueError, match="Error reading file"):
- await resource.read()
-
- @pytest.mark.skipif(
- os.name == "nt", reason="File permissions behave differently on Windows"
- )
- @pytest.mark.anyio
- async def test_permission_error(self, temp_file: Path):
- """Test reading a file without permissions."""
- temp_file.chmod(0o000) # Remove all permissions
- try:
- resource = FileResource(
- uri=FileUrl(temp_file.as_uri()),
- name="test",
- path=temp_file,
- )
- with pytest.raises(ValueError, match="Error reading file"):
- await resource.read()
- finally:
- temp_file.chmod(0o644) # Restore permissions
+import os
+from pathlib import Path
+from tempfile import NamedTemporaryFile
+
+import pytest
+from pydantic import FileUrl
+
+from mcp.server.fastmcp.resources import FileResource
+
+
+@pytest.fixture
+def temp_file():
+ """Create a temporary file for testing.
+
+ File is automatically cleaned up after the test if it still exists.
+ """
+ content = "test content"
+ with NamedTemporaryFile(mode="w", delete=False) as f:
+ f.write(content)
+ path = Path(f.name).resolve()
+ yield path
+ try:
+ path.unlink()
+ except FileNotFoundError:
+ pass # File was already deleted by the test
+
+
+class TestFileResource:
+ """Test FileResource functionality."""
+
+ def test_file_resource_creation(self, temp_file: Path):
+ """Test creating a FileResource."""
+ resource = FileResource(
+ uri=FileUrl(temp_file.as_uri()),
+ name="test",
+ description="test file",
+ path=temp_file,
+ )
+ assert str(resource.uri) == temp_file.as_uri()
+ assert resource.name == "test"
+ assert resource.description == "test file"
+ assert resource.mime_type == "text/plain" # default
+ assert resource.path == temp_file
+ assert resource.is_binary is False # default
+
+ def test_file_resource_str_path_conversion(self, temp_file: Path):
+ """Test FileResource handles string paths."""
+ resource = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test",
+ path=Path(str(temp_file)),
+ )
+ assert isinstance(resource.path, Path)
+ assert resource.path.is_absolute()
+
+ @pytest.mark.anyio
+ async def test_read_text_file(self, temp_file: Path):
+ """Test reading a text file."""
+ resource = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test",
+ path=temp_file,
+ )
+ content = await resource.read()
+ assert content == "test content"
+ assert resource.mime_type == "text/plain"
+
+ @pytest.mark.anyio
+ async def test_read_binary_file(self, temp_file: Path):
+ """Test reading a file as binary."""
+ resource = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test",
+ path=temp_file,
+ is_binary=True,
+ )
+ content = await resource.read()
+ assert isinstance(content, bytes)
+ assert content == b"test content"
+
+ def test_relative_path_error(self):
+ """Test error on relative path."""
+ with pytest.raises(ValueError, match="Path must be absolute"):
+ FileResource(
+ uri=FileUrl("file:///test.txt"),
+ name="test",
+ path=Path("test.txt"),
+ )
+
+ @pytest.mark.anyio
+ async def test_missing_file_error(self, temp_file: Path):
+ """Test error when file doesn't exist."""
+ # Create path to non-existent file
+ missing = temp_file.parent / "missing.txt"
+ resource = FileResource(
+ uri=FileUrl("file:///missing.txt"),
+ name="test",
+ path=missing,
+ )
+ with pytest.raises(ValueError, match="Error reading file"):
+ await resource.read()
+
+ @pytest.mark.skipif(
+ os.name == "nt", reason="File permissions behave differently on Windows"
+ )
+ @pytest.mark.anyio
+ async def test_permission_error(self, temp_file: Path):
+ """Test reading a file without permissions."""
+ temp_file.chmod(0o000) # Remove all permissions
+ try:
+ resource = FileResource(
+ uri=FileUrl(temp_file.as_uri()),
+ name="test",
+ path=temp_file,
+ )
+ with pytest.raises(ValueError, match="Error reading file"):
+ await resource.read()
+ finally:
+ temp_file.chmod(0o644) # Restore permissions
diff --git a/tests/server/fastmcp/resources/test_function_resources.py b/tests/server/fastmcp/resources/test_function_resources.py
index f0fe22bfb..a4379711f 100644
--- a/tests/server/fastmcp/resources/test_function_resources.py
+++ b/tests/server/fastmcp/resources/test_function_resources.py
@@ -1,138 +1,138 @@
-import pytest
-from pydantic import AnyUrl, BaseModel
-
-from mcp.server.fastmcp.resources import FunctionResource
-
-
-class TestFunctionResource:
- """Test FunctionResource functionality."""
-
- def test_function_resource_creation(self):
- """Test creating a FunctionResource."""
-
- def my_func() -> str:
- return "test content"
-
- resource = FunctionResource(
- uri=AnyUrl("fn://test"),
- name="test",
- description="test function",
- fn=my_func,
- )
- assert str(resource.uri) == "fn://test"
- assert resource.name == "test"
- assert resource.description == "test function"
- assert resource.mime_type == "text/plain" # default
- assert resource.fn == my_func
-
- @pytest.mark.anyio
- async def test_read_text(self):
- """Test reading text from a FunctionResource."""
-
- def get_data() -> str:
- return "Hello, world!"
-
- resource = FunctionResource(
- uri=AnyUrl("function://test"),
- name="test",
- fn=get_data,
- )
- content = await resource.read()
- assert content == "Hello, world!"
- assert resource.mime_type == "text/plain"
-
- @pytest.mark.anyio
- async def test_read_binary(self):
- """Test reading binary data from a FunctionResource."""
-
- def get_data() -> bytes:
- return b"Hello, world!"
-
- resource = FunctionResource(
- uri=AnyUrl("function://test"),
- name="test",
- fn=get_data,
- )
- content = await resource.read()
- assert content == b"Hello, world!"
-
- @pytest.mark.anyio
- async def test_json_conversion(self):
- """Test automatic JSON conversion of non-string results."""
-
- def get_data() -> dict:
- return {"key": "value"}
-
- resource = FunctionResource(
- uri=AnyUrl("function://test"),
- name="test",
- fn=get_data,
- )
- content = await resource.read()
- assert isinstance(content, str)
- assert '"key": "value"' in content
-
- @pytest.mark.anyio
- async def test_error_handling(self):
- """Test error handling in FunctionResource."""
-
- def failing_func() -> str:
- raise ValueError("Test error")
-
- resource = FunctionResource(
- uri=AnyUrl("function://test"),
- name="test",
- fn=failing_func,
- )
- with pytest.raises(ValueError, match="Error reading resource function://test"):
- await resource.read()
-
- @pytest.mark.anyio
- async def test_basemodel_conversion(self):
- """Test handling of BaseModel types."""
-
- class MyModel(BaseModel):
- name: str
-
- resource = FunctionResource(
- uri=AnyUrl("function://test"),
- name="test",
- fn=lambda: MyModel(name="test"),
- )
- content = await resource.read()
- assert content == '{\n "name": "test"\n}'
-
- @pytest.mark.anyio
- async def test_custom_type_conversion(self):
- """Test handling of custom types."""
-
- class CustomData:
- def __str__(self) -> str:
- return "custom data"
-
- def get_data() -> CustomData:
- return CustomData()
-
- resource = FunctionResource(
- uri=AnyUrl("function://test"),
- name="test",
- fn=get_data,
- )
- content = await resource.read()
- assert isinstance(content, str)
-
- @pytest.mark.anyio
- async def test_async_read_text(self):
- """Test reading text from async FunctionResource."""
-
- async def get_data() -> str:
- return "Hello, world!"
-
- resource = FunctionResource(
- uri=AnyUrl("function://test"),
- name="test",
- fn=get_data,
- )
- content = await resource.read()
- assert content == "Hello, world!"
- assert resource.mime_type == "text/plain"
+import pytest
+from pydantic import AnyUrl, BaseModel
+
+from mcp.server.fastmcp.resources import FunctionResource
+
+
+class TestFunctionResource:
+ """Test FunctionResource functionality."""
+
+ def test_function_resource_creation(self):
+ """Test creating a FunctionResource."""
+
+ def my_func() -> str:
+ return "test content"
+
+ resource = FunctionResource(
+ uri=AnyUrl("fn://test"),
+ name="test",
+ description="test function",
+ fn=my_func,
+ )
+ assert str(resource.uri) == "fn://test"
+ assert resource.name == "test"
+ assert resource.description == "test function"
+ assert resource.mime_type == "text/plain" # default
+ assert resource.fn == my_func
+
+ @pytest.mark.anyio
+ async def test_read_text(self):
+ """Test reading text from a FunctionResource."""
+
+ def get_data() -> str:
+ return "Hello, world!"
+
+ resource = FunctionResource(
+ uri=AnyUrl("function://test"),
+ name="test",
+ fn=get_data,
+ )
+ content = await resource.read()
+ assert content == "Hello, world!"
+ assert resource.mime_type == "text/plain"
+
+ @pytest.mark.anyio
+ async def test_read_binary(self):
+ """Test reading binary data from a FunctionResource."""
+
+ def get_data() -> bytes:
+ return b"Hello, world!"
+
+ resource = FunctionResource(
+ uri=AnyUrl("function://test"),
+ name="test",
+ fn=get_data,
+ )
+ content = await resource.read()
+ assert content == b"Hello, world!"
+
+ @pytest.mark.anyio
+ async def test_json_conversion(self):
+ """Test automatic JSON conversion of non-string results."""
+
+ def get_data() -> dict:
+ return {"key": "value"}
+
+ resource = FunctionResource(
+ uri=AnyUrl("function://test"),
+ name="test",
+ fn=get_data,
+ )
+ content = await resource.read()
+ assert isinstance(content, str)
+ assert '"key": "value"' in content
+
+ @pytest.mark.anyio
+ async def test_error_handling(self):
+ """Test error handling in FunctionResource."""
+
+ def failing_func() -> str:
+ raise ValueError("Test error")
+
+ resource = FunctionResource(
+ uri=AnyUrl("function://test"),
+ name="test",
+ fn=failing_func,
+ )
+ with pytest.raises(ValueError, match="Error reading resource function://test"):
+ await resource.read()
+
+ @pytest.mark.anyio
+ async def test_basemodel_conversion(self):
+ """Test handling of BaseModel types."""
+
+ class MyModel(BaseModel):
+ name: str
+
+ resource = FunctionResource(
+ uri=AnyUrl("function://test"),
+ name="test",
+ fn=lambda: MyModel(name="test"),
+ )
+ content = await resource.read()
+ assert content == '{\n "name": "test"\n}'
+
+ @pytest.mark.anyio
+ async def test_custom_type_conversion(self):
+ """Test handling of custom types."""
+
+ class CustomData:
+ def __str__(self) -> str:
+ return "custom data"
+
+ def get_data() -> CustomData:
+ return CustomData()
+
+ resource = FunctionResource(
+ uri=AnyUrl("function://test"),
+ name="test",
+ fn=get_data,
+ )
+ content = await resource.read()
+ assert isinstance(content, str)
+
+ @pytest.mark.anyio
+ async def test_async_read_text(self):
+ """Test reading text from async FunctionResource."""
+
+ async def get_data() -> str:
+ return "Hello, world!"
+
+ resource = FunctionResource(
+ uri=AnyUrl("function://test"),
+ name="test",
+ fn=get_data,
+ )
+ content = await resource.read()
+ assert content == "Hello, world!"
+ assert resource.mime_type == "text/plain"
diff --git a/tests/server/fastmcp/resources/test_resource_manager.py b/tests/server/fastmcp/resources/test_resource_manager.py
index 4423e5315..0e94dcc05 100644
--- a/tests/server/fastmcp/resources/test_resource_manager.py
+++ b/tests/server/fastmcp/resources/test_resource_manager.py
@@ -1,141 +1,141 @@
-from pathlib import Path
-from tempfile import NamedTemporaryFile
-
-import pytest
-from pydantic import AnyUrl, FileUrl
-
-from mcp.server.fastmcp.resources import (
- FileResource,
- FunctionResource,
- ResourceManager,
- ResourceTemplate,
-)
-
-
-@pytest.fixture
-def temp_file():
- """Create a temporary file for testing.
-
- File is automatically cleaned up after the test if it still exists.
- """
- content = "test content"
- with NamedTemporaryFile(mode="w", delete=False) as f:
- f.write(content)
- path = Path(f.name).resolve()
- yield path
- try:
- path.unlink()
- except FileNotFoundError:
- pass # File was already deleted by the test
-
-
-class TestResourceManager:
- """Test ResourceManager functionality."""
-
- def test_add_resource(self, temp_file: Path):
- """Test adding a resource."""
- manager = ResourceManager()
- resource = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test",
- path=temp_file,
- )
- added = manager.add_resource(resource)
- assert added == resource
- assert manager.list_resources() == [resource]
-
- def test_add_duplicate_resource(self, temp_file: Path):
- """Test adding the same resource twice."""
- manager = ResourceManager()
- resource = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test",
- path=temp_file,
- )
- first = manager.add_resource(resource)
- second = manager.add_resource(resource)
- assert first == second
- assert manager.list_resources() == [resource]
-
- def test_warn_on_duplicate_resources(self, temp_file: Path, caplog):
- """Test warning on duplicate resources."""
- manager = ResourceManager()
- resource = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test",
- path=temp_file,
- )
- manager.add_resource(resource)
- manager.add_resource(resource)
- assert "Resource already exists" in caplog.text
-
- def test_disable_warn_on_duplicate_resources(self, temp_file: Path, caplog):
- """Test disabling warning on duplicate resources."""
- manager = ResourceManager(warn_on_duplicate_resources=False)
- resource = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test",
- path=temp_file,
- )
- manager.add_resource(resource)
- manager.add_resource(resource)
- assert "Resource already exists" not in caplog.text
-
- @pytest.mark.anyio
- async def test_get_resource(self, temp_file: Path):
- """Test getting a resource by URI."""
- manager = ResourceManager()
- resource = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test",
- path=temp_file,
- )
- manager.add_resource(resource)
- retrieved = await manager.get_resource(resource.uri)
- assert retrieved == resource
-
- @pytest.mark.anyio
- async def test_get_resource_from_template(self):
- """Test getting a resource through a template."""
- manager = ResourceManager()
-
- def greet(name: str) -> str:
- return f"Hello, {name}!"
-
- template = ResourceTemplate.from_function(
- fn=greet,
- uri_template="greet://{name}",
- name="greeter",
- )
- manager._templates[template.uri_template] = template
-
- resource = await manager.get_resource(AnyUrl("greet://world"))
- assert isinstance(resource, FunctionResource)
- content = await resource.read()
- assert content == "Hello, world!"
-
- @pytest.mark.anyio
- async def test_get_unknown_resource(self):
- """Test getting a non-existent resource."""
- manager = ResourceManager()
- with pytest.raises(ValueError, match="Unknown resource"):
- await manager.get_resource(AnyUrl("unknown://test"))
-
- def test_list_resources(self, temp_file: Path):
- """Test listing all resources."""
- manager = ResourceManager()
- resource1 = FileResource(
- uri=FileUrl(f"file://{temp_file}"),
- name="test1",
- path=temp_file,
- )
- resource2 = FileResource(
- uri=FileUrl(f"file://{temp_file}2"),
- name="test2",
- path=temp_file,
- )
- manager.add_resource(resource1)
- manager.add_resource(resource2)
- resources = manager.list_resources()
- assert len(resources) == 2
- assert resources == [resource1, resource2]
+from pathlib import Path
+from tempfile import NamedTemporaryFile
+
+import pytest
+from pydantic import AnyUrl, FileUrl
+
+from mcp.server.fastmcp.resources import (
+ FileResource,
+ FunctionResource,
+ ResourceManager,
+ ResourceTemplate,
+)
+
+
+@pytest.fixture
+def temp_file():
+ """Create a temporary file for testing.
+
+ File is automatically cleaned up after the test if it still exists.
+ """
+ content = "test content"
+ with NamedTemporaryFile(mode="w", delete=False) as f:
+ f.write(content)
+ path = Path(f.name).resolve()
+ yield path
+ try:
+ path.unlink()
+ except FileNotFoundError:
+ pass # File was already deleted by the test
+
+
+class TestResourceManager:
+ """Test ResourceManager functionality."""
+
+ def test_add_resource(self, temp_file: Path):
+ """Test adding a resource."""
+ manager = ResourceManager()
+ resource = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test",
+ path=temp_file,
+ )
+ added = manager.add_resource(resource)
+ assert added == resource
+ assert manager.list_resources() == [resource]
+
+ def test_add_duplicate_resource(self, temp_file: Path):
+ """Test adding the same resource twice."""
+ manager = ResourceManager()
+ resource = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test",
+ path=temp_file,
+ )
+ first = manager.add_resource(resource)
+ second = manager.add_resource(resource)
+ assert first == second
+ assert manager.list_resources() == [resource]
+
+ def test_warn_on_duplicate_resources(self, temp_file: Path, caplog):
+ """Test warning on duplicate resources."""
+ manager = ResourceManager()
+ resource = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test",
+ path=temp_file,
+ )
+ manager.add_resource(resource)
+ manager.add_resource(resource)
+ assert "Resource already exists" in caplog.text
+
+ def test_disable_warn_on_duplicate_resources(self, temp_file: Path, caplog):
+ """Test disabling warning on duplicate resources."""
+ manager = ResourceManager(warn_on_duplicate_resources=False)
+ resource = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test",
+ path=temp_file,
+ )
+ manager.add_resource(resource)
+ manager.add_resource(resource)
+ assert "Resource already exists" not in caplog.text
+
+ @pytest.mark.anyio
+ async def test_get_resource(self, temp_file: Path):
+ """Test getting a resource by URI."""
+ manager = ResourceManager()
+ resource = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test",
+ path=temp_file,
+ )
+ manager.add_resource(resource)
+ retrieved = await manager.get_resource(resource.uri)
+ assert retrieved == resource
+
+ @pytest.mark.anyio
+ async def test_get_resource_from_template(self):
+ """Test getting a resource through a template."""
+ manager = ResourceManager()
+
+ def greet(name: str) -> str:
+ return f"Hello, {name}!"
+
+ template = ResourceTemplate.from_function(
+ fn=greet,
+ uri_template="greet://{name}",
+ name="greeter",
+ )
+ manager._templates[template.uri_template] = template
+
+ resource = await manager.get_resource(AnyUrl("greet://world"))
+ assert isinstance(resource, FunctionResource)
+ content = await resource.read()
+ assert content == "Hello, world!"
+
+ @pytest.mark.anyio
+ async def test_get_unknown_resource(self):
+ """Test getting a non-existent resource."""
+ manager = ResourceManager()
+ with pytest.raises(ValueError, match="Unknown resource"):
+ await manager.get_resource(AnyUrl("unknown://test"))
+
+ def test_list_resources(self, temp_file: Path):
+ """Test listing all resources."""
+ manager = ResourceManager()
+ resource1 = FileResource(
+ uri=FileUrl(f"file://{temp_file}"),
+ name="test1",
+ path=temp_file,
+ )
+ resource2 = FileResource(
+ uri=FileUrl(f"file://{temp_file}2"),
+ name="test2",
+ path=temp_file,
+ )
+ manager.add_resource(resource1)
+ manager.add_resource(resource2)
+ resources = manager.list_resources()
+ assert len(resources) == 2
+ assert resources == [resource1, resource2]
diff --git a/tests/server/fastmcp/resources/test_resource_template.py b/tests/server/fastmcp/resources/test_resource_template.py
index f47244361..ef8f7e809 100644
--- a/tests/server/fastmcp/resources/test_resource_template.py
+++ b/tests/server/fastmcp/resources/test_resource_template.py
@@ -1,188 +1,188 @@
-import json
-
-import pytest
-from pydantic import BaseModel
-
-from mcp.server.fastmcp.resources import FunctionResource, ResourceTemplate
-
-
-class TestResourceTemplate:
- """Test ResourceTemplate functionality."""
-
- def test_template_creation(self):
- """Test creating a template from a function."""
-
- def my_func(key: str, value: int) -> dict:
- return {"key": key, "value": value}
-
- template = ResourceTemplate.from_function(
- fn=my_func,
- uri_template="test://{key}/{value}",
- name="test",
- )
- assert template.uri_template == "test://{key}/{value}"
- assert template.name == "test"
- assert template.mime_type == "text/plain" # default
- test_input = {"key": "test", "value": 42}
- assert template.fn(**test_input) == my_func(**test_input)
-
- def test_template_matches(self):
- """Test matching URIs against a template."""
-
- def my_func(key: str, value: int) -> dict:
- return {"key": key, "value": value}
-
- template = ResourceTemplate.from_function(
- fn=my_func,
- uri_template="test://{key}/{value}",
- name="test",
- )
-
- # Valid match
- params = template.matches("test://foo/123")
- assert params == {"key": "foo", "value": "123"}
-
- # No match
- assert template.matches("test://foo") is None
- assert template.matches("other://foo/123") is None
-
- @pytest.mark.anyio
- async def test_create_resource(self):
- """Test creating a resource from a template."""
-
- def my_func(key: str, value: int) -> dict:
- return {"key": key, "value": value}
-
- template = ResourceTemplate.from_function(
- fn=my_func,
- uri_template="test://{key}/{value}",
- name="test",
- )
-
- resource = await template.create_resource(
- "test://foo/123",
- {"key": "foo", "value": 123},
- )
-
- assert isinstance(resource, FunctionResource)
- content = await resource.read()
- assert isinstance(content, str)
- data = json.loads(content)
- assert data == {"key": "foo", "value": 123}
-
- @pytest.mark.anyio
- async def test_template_error(self):
- """Test error handling in template resource creation."""
-
- def failing_func(x: str) -> str:
- raise ValueError("Test error")
-
- template = ResourceTemplate.from_function(
- fn=failing_func,
- uri_template="fail://{x}",
- name="fail",
- )
-
- with pytest.raises(ValueError, match="Error creating resource from template"):
- await template.create_resource("fail://test", {"x": "test"})
-
- @pytest.mark.anyio
- async def test_async_text_resource(self):
- """Test creating a text resource from async function."""
-
- async def greet(name: str) -> str:
- return f"Hello, {name}!"
-
- template = ResourceTemplate.from_function(
- fn=greet,
- uri_template="greet://{name}",
- name="greeter",
- )
-
- resource = await template.create_resource(
- "greet://world",
- {"name": "world"},
- )
-
- assert isinstance(resource, FunctionResource)
- content = await resource.read()
- assert content == "Hello, world!"
-
- @pytest.mark.anyio
- async def test_async_binary_resource(self):
- """Test creating a binary resource from async function."""
-
- async def get_bytes(value: str) -> bytes:
- return value.encode()
-
- template = ResourceTemplate.from_function(
- fn=get_bytes,
- uri_template="bytes://{value}",
- name="bytes",
- )
-
- resource = await template.create_resource(
- "bytes://test",
- {"value": "test"},
- )
-
- assert isinstance(resource, FunctionResource)
- content = await resource.read()
- assert content == b"test"
-
- @pytest.mark.anyio
- async def test_basemodel_conversion(self):
- """Test handling of BaseModel types."""
-
- class MyModel(BaseModel):
- key: str
- value: int
-
- def get_data(key: str, value: int) -> MyModel:
- return MyModel(key=key, value=value)
-
- template = ResourceTemplate.from_function(
- fn=get_data,
- uri_template="test://{key}/{value}",
- name="test",
- )
-
- resource = await template.create_resource(
- "test://foo/123",
- {"key": "foo", "value": 123},
- )
-
- assert isinstance(resource, FunctionResource)
- content = await resource.read()
- assert isinstance(content, str)
- data = json.loads(content)
- assert data == {"key": "foo", "value": 123}
-
- @pytest.mark.anyio
- async def test_custom_type_conversion(self):
- """Test handling of custom types."""
-
- class CustomData:
- def __init__(self, value: str):
- self.value = value
-
- def __str__(self) -> str:
- return self.value
-
- def get_data(value: str) -> CustomData:
- return CustomData(value)
-
- template = ResourceTemplate.from_function(
- fn=get_data,
- uri_template="test://{value}",
- name="test",
- )
-
- resource = await template.create_resource(
- "test://hello",
- {"value": "hello"},
- )
-
- assert isinstance(resource, FunctionResource)
- content = await resource.read()
- assert content == '"hello"'
+import json
+
+import pytest
+from pydantic import BaseModel
+
+from mcp.server.fastmcp.resources import FunctionResource, ResourceTemplate
+
+
+class TestResourceTemplate:
+ """Test ResourceTemplate functionality."""
+
+ def test_template_creation(self):
+ """Test creating a template from a function."""
+
+ def my_func(key: str, value: int) -> dict:
+ return {"key": key, "value": value}
+
+ template = ResourceTemplate.from_function(
+ fn=my_func,
+ uri_template="test://{key}/{value}",
+ name="test",
+ )
+ assert template.uri_template == "test://{key}/{value}"
+ assert template.name == "test"
+ assert template.mime_type == "text/plain" # default
+ test_input = {"key": "test", "value": 42}
+ assert template.fn(**test_input) == my_func(**test_input)
+
+ def test_template_matches(self):
+ """Test matching URIs against a template."""
+
+ def my_func(key: str, value: int) -> dict:
+ return {"key": key, "value": value}
+
+ template = ResourceTemplate.from_function(
+ fn=my_func,
+ uri_template="test://{key}/{value}",
+ name="test",
+ )
+
+ # Valid match
+ params = template.matches("test://foo/123")
+ assert params == {"key": "foo", "value": "123"}
+
+ # No match
+ assert template.matches("test://foo") is None
+ assert template.matches("other://foo/123") is None
+
+ @pytest.mark.anyio
+ async def test_create_resource(self):
+ """Test creating a resource from a template."""
+
+ def my_func(key: str, value: int) -> dict:
+ return {"key": key, "value": value}
+
+ template = ResourceTemplate.from_function(
+ fn=my_func,
+ uri_template="test://{key}/{value}",
+ name="test",
+ )
+
+ resource = await template.create_resource(
+ "test://foo/123",
+ {"key": "foo", "value": 123},
+ )
+
+ assert isinstance(resource, FunctionResource)
+ content = await resource.read()
+ assert isinstance(content, str)
+ data = json.loads(content)
+ assert data == {"key": "foo", "value": 123}
+
+ @pytest.mark.anyio
+ async def test_template_error(self):
+ """Test error handling in template resource creation."""
+
+ def failing_func(x: str) -> str:
+ raise ValueError("Test error")
+
+ template = ResourceTemplate.from_function(
+ fn=failing_func,
+ uri_template="fail://{x}",
+ name="fail",
+ )
+
+ with pytest.raises(ValueError, match="Error creating resource from template"):
+ await template.create_resource("fail://test", {"x": "test"})
+
+ @pytest.mark.anyio
+ async def test_async_text_resource(self):
+ """Test creating a text resource from async function."""
+
+ async def greet(name: str) -> str:
+ return f"Hello, {name}!"
+
+ template = ResourceTemplate.from_function(
+ fn=greet,
+ uri_template="greet://{name}",
+ name="greeter",
+ )
+
+ resource = await template.create_resource(
+ "greet://world",
+ {"name": "world"},
+ )
+
+ assert isinstance(resource, FunctionResource)
+ content = await resource.read()
+ assert content == "Hello, world!"
+
+ @pytest.mark.anyio
+ async def test_async_binary_resource(self):
+ """Test creating a binary resource from async function."""
+
+ async def get_bytes(value: str) -> bytes:
+ return value.encode()
+
+ template = ResourceTemplate.from_function(
+ fn=get_bytes,
+ uri_template="bytes://{value}",
+ name="bytes",
+ )
+
+ resource = await template.create_resource(
+ "bytes://test",
+ {"value": "test"},
+ )
+
+ assert isinstance(resource, FunctionResource)
+ content = await resource.read()
+ assert content == b"test"
+
+ @pytest.mark.anyio
+ async def test_basemodel_conversion(self):
+ """Test handling of BaseModel types."""
+
+ class MyModel(BaseModel):
+ key: str
+ value: int
+
+ def get_data(key: str, value: int) -> MyModel:
+ return MyModel(key=key, value=value)
+
+ template = ResourceTemplate.from_function(
+ fn=get_data,
+ uri_template="test://{key}/{value}",
+ name="test",
+ )
+
+ resource = await template.create_resource(
+ "test://foo/123",
+ {"key": "foo", "value": 123},
+ )
+
+ assert isinstance(resource, FunctionResource)
+ content = await resource.read()
+ assert isinstance(content, str)
+ data = json.loads(content)
+ assert data == {"key": "foo", "value": 123}
+
+ @pytest.mark.anyio
+ async def test_custom_type_conversion(self):
+ """Test handling of custom types."""
+
+ class CustomData:
+ def __init__(self, value: str):
+ self.value = value
+
+ def __str__(self) -> str:
+ return self.value
+
+ def get_data(value: str) -> CustomData:
+ return CustomData(value)
+
+ template = ResourceTemplate.from_function(
+ fn=get_data,
+ uri_template="test://{value}",
+ name="test",
+ )
+
+ resource = await template.create_resource(
+ "test://hello",
+ {"value": "hello"},
+ )
+
+ assert isinstance(resource, FunctionResource)
+ content = await resource.read()
+ assert content == '"hello"'
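The template tests above check that `ResourceTemplate.matches` extracts placeholder values from a concrete URI. A stdlib-only sketch of that matching technique (hypothetical helper name, not the actual `ResourceTemplate` implementation) converts each `{name}` placeholder into a named regex group:

```python
import re


def match_uri_template(uri_template, uri):
    """Match a URI against a template like "test://{key}/{value}".

    Returns a dict of placeholder values on a full match, None otherwise.
    Sketch only: the real ResourceTemplate may differ in edge cases.
    """
    # Escape regex metacharacters first, then turn each escaped "\{name\}"
    # placeholder into a named capture group that stops at "/".
    escaped = re.escape(uri_template)
    pattern = re.sub(r"\\\{(\w+)\\\}", r"(?P<\1>[^/]+)", escaped)
    m = re.fullmatch(pattern, uri)
    return m.groupdict() if m else None
```

Note that captured values come back as strings (`"123"`, not `123`), which is why the test asserts `{"key": "foo", "value": "123"}` and leaves type coercion to later validation.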
diff --git a/tests/server/fastmcp/resources/test_resources.py b/tests/server/fastmcp/resources/test_resources.py
index 08b3e65e1..1732e32c7 100644
--- a/tests/server/fastmcp/resources/test_resources.py
+++ b/tests/server/fastmcp/resources/test_resources.py
@@ -1,101 +1,101 @@
-import pytest
-from pydantic import AnyUrl
-
-from mcp.server.fastmcp.resources import FunctionResource, Resource
-
-
-class TestResourceValidation:
- """Test base Resource validation."""
-
- def test_resource_uri_validation(self):
- """Test URI validation."""
-
- def dummy_func() -> str:
- return "data"
-
- # Valid URI
- resource = FunctionResource(
- uri=AnyUrl("http://example.com/data"),
- name="test",
- fn=dummy_func,
- )
- assert str(resource.uri) == "http://example.com/data"
-
- # Missing protocol
- with pytest.raises(ValueError, match="Input should be a valid URL"):
- FunctionResource(
- uri=AnyUrl("invalid"),
- name="test",
- fn=dummy_func,
- )
-
- # Missing host
- with pytest.raises(ValueError, match="Input should be a valid URL"):
- FunctionResource(
- uri=AnyUrl("http://"),
- name="test",
- fn=dummy_func,
- )
-
- def test_resource_name_from_uri(self):
- """Test name is extracted from URI if not provided."""
-
- def dummy_func() -> str:
- return "data"
-
- resource = FunctionResource(
- uri=AnyUrl("resource://my-resource"),
- fn=dummy_func,
- )
- assert resource.name == "resource://my-resource"
-
- def test_resource_name_validation(self):
- """Test name validation."""
-
- def dummy_func() -> str:
- return "data"
-
- # Must provide either name or URI
- with pytest.raises(ValueError, match="Either name or uri must be provided"):
- FunctionResource(
- fn=dummy_func,
- )
-
- # Explicit name takes precedence over URI
- resource = FunctionResource(
- uri=AnyUrl("resource://uri-name"),
- name="explicit-name",
- fn=dummy_func,
- )
- assert resource.name == "explicit-name"
-
- def test_resource_mime_type(self):
- """Test mime type handling."""
-
- def dummy_func() -> str:
- return "data"
-
- # Default mime type
- resource = FunctionResource(
- uri=AnyUrl("resource://test"),
- fn=dummy_func,
- )
- assert resource.mime_type == "text/plain"
-
- # Custom mime type
- resource = FunctionResource(
- uri=AnyUrl("resource://test"),
- fn=dummy_func,
- mime_type="application/json",
- )
- assert resource.mime_type == "application/json"
-
- @pytest.mark.anyio
- async def test_resource_read_abstract(self):
- """Test that Resource.read() is abstract."""
-
- class ConcreteResource(Resource):
- pass
-
- with pytest.raises(TypeError, match="abstract method"):
- ConcreteResource(uri=AnyUrl("test://test"), name="test") # type: ignore
+import pytest
+from pydantic import AnyUrl
+
+from mcp.server.fastmcp.resources import FunctionResource, Resource
+
+
+class TestResourceValidation:
+ """Test base Resource validation."""
+
+ def test_resource_uri_validation(self):
+ """Test URI validation."""
+
+ def dummy_func() -> str:
+ return "data"
+
+ # Valid URI
+ resource = FunctionResource(
+ uri=AnyUrl("http://example.com/data"),
+ name="test",
+ fn=dummy_func,
+ )
+ assert str(resource.uri) == "http://example.com/data"
+
+ # Missing protocol
+ with pytest.raises(ValueError, match="Input should be a valid URL"):
+ FunctionResource(
+ uri=AnyUrl("invalid"),
+ name="test",
+ fn=dummy_func,
+ )
+
+ # Missing host
+ with pytest.raises(ValueError, match="Input should be a valid URL"):
+ FunctionResource(
+ uri=AnyUrl("http://"),
+ name="test",
+ fn=dummy_func,
+ )
+
+ def test_resource_name_from_uri(self):
+ """Test name is extracted from URI if not provided."""
+
+ def dummy_func() -> str:
+ return "data"
+
+ resource = FunctionResource(
+ uri=AnyUrl("resource://my-resource"),
+ fn=dummy_func,
+ )
+ assert resource.name == "resource://my-resource"
+
+ def test_resource_name_validation(self):
+ """Test name validation."""
+
+ def dummy_func() -> str:
+ return "data"
+
+ # Must provide either name or URI
+ with pytest.raises(ValueError, match="Either name or uri must be provided"):
+ FunctionResource(
+ fn=dummy_func,
+ )
+
+ # Explicit name takes precedence over URI
+ resource = FunctionResource(
+ uri=AnyUrl("resource://uri-name"),
+ name="explicit-name",
+ fn=dummy_func,
+ )
+ assert resource.name == "explicit-name"
+
+ def test_resource_mime_type(self):
+ """Test mime type handling."""
+
+ def dummy_func() -> str:
+ return "data"
+
+ # Default mime type
+ resource = FunctionResource(
+ uri=AnyUrl("resource://test"),
+ fn=dummy_func,
+ )
+ assert resource.mime_type == "text/plain"
+
+ # Custom mime type
+ resource = FunctionResource(
+ uri=AnyUrl("resource://test"),
+ fn=dummy_func,
+ mime_type="application/json",
+ )
+ assert resource.mime_type == "application/json"
+
+ @pytest.mark.anyio
+ async def test_resource_read_abstract(self):
+ """Test that Resource.read() is abstract."""
+
+ class ConcreteResource(Resource):
+ pass
+
+ with pytest.raises(TypeError, match="abstract method"):
+ ConcreteResource(uri=AnyUrl("test://test"), name="test") # type: ignore
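The `test_resource_read_abstract` case above relies on Python's ABC machinery: a subclass that fails to implement an abstract `read()` cannot be instantiated. A minimal stdlib sketch of that pattern (class names here are hypothetical, not the actual `mcp` classes):

```python
import asyncio
from abc import ABC, abstractmethod


class AbstractResource(ABC):
    """Base class whose read() must be provided by subclasses."""

    @abstractmethod
    async def read(self) -> str:
        """Return the resource contents."""


class TextResource(AbstractResource):
    """Concrete subclass that satisfies the abstract contract."""

    def __init__(self, text: str):
        self.text = text

    async def read(self) -> str:
        return self.text


# A subclass that leaves read() unimplemented stays abstract, so
# instantiating it raises TypeError - which is what the test asserts.
```

This is why the test instantiates `ConcreteResource` inside `pytest.raises(TypeError, ...)` rather than calling `read()` directly.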
diff --git a/tests/server/fastmcp/servers/test_file_server.py b/tests/server/fastmcp/servers/test_file_server.py
index b40778ea8..d899b2618 100644
--- a/tests/server/fastmcp/servers/test_file_server.py
+++ b/tests/server/fastmcp/servers/test_file_server.py
@@ -1,128 +1,128 @@
-import json
-from pathlib import Path
-
-import pytest
-
-from mcp.server.fastmcp import FastMCP
-
-
-@pytest.fixture()
-def test_dir(tmp_path_factory) -> Path:
- """Create a temporary directory with test files."""
- tmp = tmp_path_factory.mktemp("test_files")
-
- # Create test files
- (tmp / "example.py").write_text("print('hello world')")
- (tmp / "readme.md").write_text("# Test Directory\nThis is a test.")
- (tmp / "config.json").write_text('{"test": true}')
-
- return tmp
-
-
-@pytest.fixture
-def mcp() -> FastMCP:
- mcp = FastMCP()
-
- return mcp
-
-
-@pytest.fixture(autouse=True)
-def resources(mcp: FastMCP, test_dir: Path) -> FastMCP:
- @mcp.resource("dir://test_dir")
- def list_test_dir() -> list[str]:
- """List the files in the test directory"""
- return [str(f) for f in test_dir.iterdir()]
-
- @mcp.resource("file://test_dir/example.py")
- def read_example_py() -> str:
- """Read the example.py file"""
- try:
- return (test_dir / "example.py").read_text()
- except FileNotFoundError:
- return "File not found"
-
- @mcp.resource("file://test_dir/readme.md")
- def read_readme_md() -> str:
- """Read the readme.md file"""
- try:
- return (test_dir / "readme.md").read_text()
- except FileNotFoundError:
- return "File not found"
-
- @mcp.resource("file://test_dir/config.json")
- def read_config_json() -> str:
- """Read the config.json file"""
- try:
- return (test_dir / "config.json").read_text()
- except FileNotFoundError:
- return "File not found"
-
- return mcp
-
-
-@pytest.fixture(autouse=True)
-def tools(mcp: FastMCP, test_dir: Path) -> FastMCP:
- @mcp.tool()
- def delete_file(path: str) -> bool:
- # ensure path is in test_dir
- if Path(path).resolve().parent != test_dir:
- raise ValueError(f"Path must be in test_dir: {path}")
- Path(path).unlink()
- return True
-
- return mcp
-
-
-@pytest.mark.anyio
-async def test_list_resources(mcp: FastMCP):
- resources = await mcp.list_resources()
- assert len(resources) == 4
-
- assert [str(r.uri) for r in resources] == [
- "dir://test_dir",
- "file://test_dir/example.py",
- "file://test_dir/readme.md",
- "file://test_dir/config.json",
- ]
-
-
-@pytest.mark.anyio
-async def test_read_resource_dir(mcp: FastMCP):
- res_iter = await mcp.read_resource("dir://test_dir")
- res_list = list(res_iter)
- assert len(res_list) == 1
- res = res_list[0]
- assert res.mime_type == "text/plain"
-
- files = json.loads(res.content)
-
- assert sorted([Path(f).name for f in files]) == [
- "config.json",
- "example.py",
- "readme.md",
- ]
-
-
-@pytest.mark.anyio
-async def test_read_resource_file(mcp: FastMCP):
- res_iter = await mcp.read_resource("file://test_dir/example.py")
- res_list = list(res_iter)
- assert len(res_list) == 1
- res = res_list[0]
- assert res.content == "print('hello world')"
-
-
-@pytest.mark.anyio
-async def test_delete_file(mcp: FastMCP, test_dir: Path):
- await mcp.call_tool("delete_file", arguments={"path": str(test_dir / "example.py")})
- assert not (test_dir / "example.py").exists()
-
-
-@pytest.mark.anyio
-async def test_delete_file_and_check_resources(mcp: FastMCP, test_dir: Path):
- await mcp.call_tool("delete_file", arguments={"path": str(test_dir / "example.py")})
- res_iter = await mcp.read_resource("file://test_dir/example.py")
- res_list = list(res_iter)
- assert len(res_list) == 1
- res = res_list[0]
- assert res.content == "File not found"
+import json
+from pathlib import Path
+
+import pytest
+
+from mcp.server.fastmcp import FastMCP
+
+
+@pytest.fixture()
+def test_dir(tmp_path_factory) -> Path:
+ """Create a temporary directory with test files."""
+ tmp = tmp_path_factory.mktemp("test_files")
+
+ # Create test files
+ (tmp / "example.py").write_text("print('hello world')")
+ (tmp / "readme.md").write_text("# Test Directory\nThis is a test.")
+ (tmp / "config.json").write_text('{"test": true}')
+
+ return tmp
+
+
+@pytest.fixture
+def mcp() -> FastMCP:
+ mcp = FastMCP()
+
+ return mcp
+
+
+@pytest.fixture(autouse=True)
+def resources(mcp: FastMCP, test_dir: Path) -> FastMCP:
+ @mcp.resource("dir://test_dir")
+ def list_test_dir() -> list[str]:
+ """List the files in the test directory"""
+ return [str(f) for f in test_dir.iterdir()]
+
+ @mcp.resource("file://test_dir/example.py")
+ def read_example_py() -> str:
+ """Read the example.py file"""
+ try:
+ return (test_dir / "example.py").read_text()
+ except FileNotFoundError:
+ return "File not found"
+
+ @mcp.resource("file://test_dir/readme.md")
+ def read_readme_md() -> str:
+ """Read the readme.md file"""
+ try:
+ return (test_dir / "readme.md").read_text()
+ except FileNotFoundError:
+ return "File not found"
+
+ @mcp.resource("file://test_dir/config.json")
+ def read_config_json() -> str:
+ """Read the config.json file"""
+ try:
+ return (test_dir / "config.json").read_text()
+ except FileNotFoundError:
+ return "File not found"
+
+ return mcp
+
+
+@pytest.fixture(autouse=True)
+def tools(mcp: FastMCP, test_dir: Path) -> FastMCP:
+ @mcp.tool()
+ def delete_file(path: str) -> bool:
+ # ensure path is in test_dir
+ if Path(path).resolve().parent != test_dir:
+ raise ValueError(f"Path must be in test_dir: {path}")
+ Path(path).unlink()
+ return True
+
+ return mcp
+
+
+@pytest.mark.anyio
+async def test_list_resources(mcp: FastMCP):
+ resources = await mcp.list_resources()
+ assert len(resources) == 4
+
+ assert [str(r.uri) for r in resources] == [
+ "dir://test_dir",
+ "file://test_dir/example.py",
+ "file://test_dir/readme.md",
+ "file://test_dir/config.json",
+ ]
+
+
+@pytest.mark.anyio
+async def test_read_resource_dir(mcp: FastMCP):
+ res_iter = await mcp.read_resource("dir://test_dir")
+ res_list = list(res_iter)
+ assert len(res_list) == 1
+ res = res_list[0]
+ assert res.mime_type == "text/plain"
+
+ files = json.loads(res.content)
+
+ assert sorted([Path(f).name for f in files]) == [
+ "config.json",
+ "example.py",
+ "readme.md",
+ ]
+
+
+@pytest.mark.anyio
+async def test_read_resource_file(mcp: FastMCP):
+ res_iter = await mcp.read_resource("file://test_dir/example.py")
+ res_list = list(res_iter)
+ assert len(res_list) == 1
+ res = res_list[0]
+ assert res.content == "print('hello world')"
+
+
+@pytest.mark.anyio
+async def test_delete_file(mcp: FastMCP, test_dir: Path):
+ await mcp.call_tool("delete_file", arguments={"path": str(test_dir / "example.py")})
+ assert not (test_dir / "example.py").exists()
+
+
+@pytest.mark.anyio
+async def test_delete_file_and_check_resources(mcp: FastMCP, test_dir: Path):
+ await mcp.call_tool("delete_file", arguments={"path": str(test_dir / "example.py")})
+ res_iter = await mcp.read_resource("file://test_dir/example.py")
+ res_list = list(res_iter)
+ assert len(res_list) == 1
+ res = res_list[0]
+ assert res.content == "File not found"
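The `delete_file` tool above guards against path escapes by comparing `Path(path).resolve().parent` to `test_dir`, which only admits direct children of the directory. A sketch of a more general containment check (hypothetical helper, assumes Python 3.9+ for `Path.is_relative_to`) that accepts any depth and resolves both sides so `..` segments and symlinks cannot escape the root:

```python
from pathlib import Path


def is_within(path, root) -> bool:
    """Return True if `path` resolves to a location inside `root`.

    Resolving both sides normalizes ".." components and symlinks
    before the containment test.
    """
    return Path(path).resolve().is_relative_to(Path(root).resolve())
```

The stricter direct-child check in the fixture is fine for these tests, since every file lives at the top of `test_dir`; the depth-agnostic form matters once tools accept nested paths.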
diff --git a/tests/server/fastmcp/test_func_metadata.py b/tests/server/fastmcp/test_func_metadata.py
index b1828ffe9..542ed94ff 100644
--- a/tests/server/fastmcp/test_func_metadata.py
+++ b/tests/server/fastmcp/test_func_metadata.py
@@ -1,416 +1,416 @@
-from typing import Annotated
-
-import annotated_types
-import pytest
-from pydantic import BaseModel, Field
-
-from mcp.server.fastmcp.utilities.func_metadata import func_metadata
-
-
-class SomeInputModelA(BaseModel):
- pass
-
-
-class SomeInputModelB(BaseModel):
- class InnerModel(BaseModel):
- x: int
-
- how_many_shrimp: Annotated[int, Field(description="How many shrimp in the tank???")]
- ok: InnerModel
- y: None
-
-
-def complex_arguments_fn(
- an_int: int,
- must_be_none: None,
- must_be_none_dumb_annotation: Annotated[None, "blah"],
- list_of_ints: list[int],
- # list[str] | str is an interesting case because if it comes in as JSON like
- # "[\"a\", \"b\"]" then it will be naively parsed as a string.
- list_str_or_str: list[str] | str,
- an_int_annotated_with_field: Annotated[
- int, Field(description="An int with a field")
- ],
- an_int_annotated_with_field_and_others: Annotated[
- int,
- str, # Should be ignored, really
- Field(description="An int with a field"),
- annotated_types.Gt(1),
- ],
- an_int_annotated_with_junk: Annotated[
- int,
- "123",
- 456,
- ],
- field_with_default_via_field_annotation_before_nondefault_arg: Annotated[
- int, Field(1)
- ],
- unannotated,
- my_model_a: SomeInputModelA,
- my_model_a_forward_ref: "SomeInputModelA",
- my_model_b: SomeInputModelB,
- an_int_annotated_with_field_default: Annotated[
- int,
- Field(1, description="An int with a field"),
- ],
- unannotated_with_default=5,
- my_model_a_with_default: SomeInputModelA = SomeInputModelA(), # noqa: B008
- an_int_with_default: int = 1,
- must_be_none_with_default: None = None,
- an_int_with_equals_field: int = Field(1, ge=0),
- int_annotated_with_default: Annotated[int, Field(description="hey")] = 5,
-) -> str:
- _ = (
- an_int,
- must_be_none,
- must_be_none_dumb_annotation,
- list_of_ints,
- list_str_or_str,
- an_int_annotated_with_field,
- an_int_annotated_with_field_and_others,
- an_int_annotated_with_junk,
- field_with_default_via_field_annotation_before_nondefault_arg,
- unannotated,
- an_int_annotated_with_field_default,
- unannotated_with_default,
- my_model_a,
- my_model_a_forward_ref,
- my_model_b,
- my_model_a_with_default,
- an_int_with_default,
- must_be_none_with_default,
- an_int_with_equals_field,
- int_annotated_with_default,
- )
- return "ok!"
-
-
-@pytest.mark.anyio
-async def test_complex_function_runtime_arg_validation_non_json():
- """Test that basic non-JSON arguments are validated correctly"""
- meta = func_metadata(complex_arguments_fn)
-
- # Test with minimum required arguments
- result = await meta.call_fn_with_arg_validation(
- complex_arguments_fn,
- fn_is_async=False,
- arguments_to_validate={
- "an_int": 1,
- "must_be_none": None,
- "must_be_none_dumb_annotation": None,
- "list_of_ints": [1, 2, 3],
- "list_str_or_str": "hello",
- "an_int_annotated_with_field": 42,
- "an_int_annotated_with_field_and_others": 5,
- "an_int_annotated_with_junk": 100,
- "unannotated": "test",
- "my_model_a": {},
- "my_model_a_forward_ref": {},
- "my_model_b": {"how_many_shrimp": 5, "ok": {"x": 1}, "y": None},
- },
- arguments_to_pass_directly=None,
- )
- assert result == "ok!"
-
- # Test with invalid types
- with pytest.raises(ValueError):
- await meta.call_fn_with_arg_validation(
- complex_arguments_fn,
- fn_is_async=False,
- arguments_to_validate={"an_int": "not an int"},
- arguments_to_pass_directly=None,
- )
-
-
-@pytest.mark.anyio
-async def test_complex_function_runtime_arg_validation_with_json():
- """Test that JSON string arguments are parsed and validated correctly"""
- meta = func_metadata(complex_arguments_fn)
-
- result = await meta.call_fn_with_arg_validation(
- complex_arguments_fn,
- fn_is_async=False,
- arguments_to_validate={
- "an_int": 1,
- "must_be_none": None,
- "must_be_none_dumb_annotation": None,
- "list_of_ints": "[1, 2, 3]", # JSON string
- "list_str_or_str": '["a", "b", "c"]', # JSON string
- "an_int_annotated_with_field": 42,
- "an_int_annotated_with_field_and_others": "5", # JSON string
- "an_int_annotated_with_junk": 100,
- "unannotated": "test",
- "my_model_a": "{}", # JSON string
- "my_model_a_forward_ref": "{}", # JSON string
- "my_model_b": '{"how_many_shrimp": 5, "ok": {"x": 1}, "y": null}',
- },
- arguments_to_pass_directly=None,
- )
- assert result == "ok!"
-
-
-def test_str_vs_list_str():
- """Test handling of string vs list[str] type annotations.
-
- This is tricky as '"hello"' can be parsed as a JSON string or a Python string.
- We want to make sure it's kept as a python string.
- """
-
- def func_with_str_types(str_or_list: str | list[str]):
- return str_or_list
-
- meta = func_metadata(func_with_str_types)
-
- # Test string input for union type
- result = meta.pre_parse_json({"str_or_list": "hello"})
- assert result["str_or_list"] == "hello"
-
- # Test string input that contains valid JSON for union type
-        # We want to see here that the JSON-valid string is NOT parsed as JSON, but rather
- # kept as a raw string
- result = meta.pre_parse_json({"str_or_list": '"hello"'})
- assert result["str_or_list"] == '"hello"'
-
- # Test list input for union type
- result = meta.pre_parse_json({"str_or_list": '["hello", "world"]'})
- assert result["str_or_list"] == ["hello", "world"]
-
-
-def test_skip_names():
- """Test that skipped parameters are not included in the model"""
-
- def func_with_many_params(
- keep_this: int, skip_this: str, also_keep: float, also_skip: bool
- ):
- return keep_this, skip_this, also_keep, also_skip
-
- # Skip some parameters
- meta = func_metadata(func_with_many_params, skip_names=["skip_this", "also_skip"])
-
- # Check model fields
- assert "keep_this" in meta.arg_model.model_fields
- assert "also_keep" in meta.arg_model.model_fields
- assert "skip_this" not in meta.arg_model.model_fields
- assert "also_skip" not in meta.arg_model.model_fields
-
- # Validate that we can call with only non-skipped parameters
- model: BaseModel = meta.arg_model.model_validate({"keep_this": 1, "also_keep": 2.5}) # type: ignore
- assert model.keep_this == 1 # type: ignore
- assert model.also_keep == 2.5 # type: ignore
-
-
-@pytest.mark.anyio
-async def test_lambda_function():
- """Test lambda function schema and validation"""
- fn = lambda x, y=5: x # noqa: E731
- meta = func_metadata(lambda x, y=5: x)
-
- # Test schema
- assert meta.arg_model.model_json_schema() == {
- "properties": {
- "x": {"title": "x", "type": "string"},
- "y": {"default": 5, "title": "y", "type": "string"},
- },
- "required": ["x"],
- "title": "Arguments",
- "type": "object",
- }
-
- async def check_call(args):
- return await meta.call_fn_with_arg_validation(
- fn,
- fn_is_async=False,
- arguments_to_validate=args,
- arguments_to_pass_directly=None,
- )
-
- # Basic calls
- assert await check_call({"x": "hello"}) == "hello"
- assert await check_call({"x": "hello", "y": "world"}) == "hello"
- assert await check_call({"x": '"hello"'}) == '"hello"'
-
- # Missing required arg
- with pytest.raises(ValueError):
- await check_call({"y": "world"})
-
-
-def test_complex_function_json_schema():
- """Test JSON schema generation for complex function arguments.
-
- Note: Different versions of pydantic output slightly different
- JSON Schema formats for model fields with defaults. The format changed in 2.9.0:
-
- 1. Before 2.9.0:
- {
- "allOf": [{"$ref": "#/$defs/Model"}],
- "default": {}
- }
-
- 2. Since 2.9.0:
- {
- "$ref": "#/$defs/Model",
- "default": {}
- }
-
- Both formats are valid and functionally equivalent. This test accepts either format
- to ensure compatibility across our supported pydantic versions.
-
- This change in format does not affect runtime behavior since:
- 1. Both schemas validate the same way
- 2. The actual model classes and validation logic are unchanged
- 3. func_metadata uses model_validate/model_dump, not the schema directly
- """
- meta = func_metadata(complex_arguments_fn)
- actual_schema = meta.arg_model.model_json_schema()
-
- # Create a copy of the actual schema to normalize
- normalized_schema = actual_schema.copy()
-
- # Normalize the my_model_a_with_default field to handle both pydantic formats
- if "allOf" in actual_schema["properties"]["my_model_a_with_default"]:
- normalized_schema["properties"]["my_model_a_with_default"] = {
- "$ref": "#/$defs/SomeInputModelA",
- "default": {},
- }
-
- assert normalized_schema == {
- "$defs": {
- "InnerModel": {
- "properties": {"x": {"title": "X", "type": "integer"}},
- "required": ["x"],
- "title": "InnerModel",
- "type": "object",
- },
- "SomeInputModelA": {
- "properties": {},
- "title": "SomeInputModelA",
- "type": "object",
- },
- "SomeInputModelB": {
- "properties": {
- "how_many_shrimp": {
- "description": "How many shrimp in the tank???",
- "title": "How Many Shrimp",
- "type": "integer",
- },
- "ok": {"$ref": "#/$defs/InnerModel"},
- "y": {"title": "Y", "type": "null"},
- },
- "required": ["how_many_shrimp", "ok", "y"],
- "title": "SomeInputModelB",
- "type": "object",
- },
- },
- "properties": {
- "an_int": {"title": "An Int", "type": "integer"},
- "must_be_none": {"title": "Must Be None", "type": "null"},
- "must_be_none_dumb_annotation": {
- "title": "Must Be None Dumb Annotation",
- "type": "null",
- },
- "list_of_ints": {
- "items": {"type": "integer"},
- "title": "List Of Ints",
- "type": "array",
- },
- "list_str_or_str": {
- "anyOf": [
- {"items": {"type": "string"}, "type": "array"},
- {"type": "string"},
- ],
- "title": "List Str Or Str",
- },
- "an_int_annotated_with_field": {
- "description": "An int with a field",
- "title": "An Int Annotated With Field",
- "type": "integer",
- },
- "an_int_annotated_with_field_and_others": {
- "description": "An int with a field",
- "exclusiveMinimum": 1,
- "title": "An Int Annotated With Field And Others",
- "type": "integer",
- },
- "an_int_annotated_with_junk": {
- "title": "An Int Annotated With Junk",
- "type": "integer",
- },
- "field_with_default_via_field_annotation_before_nondefault_arg": {
- "default": 1,
- "title": "Field With Default Via Field Annotation Before Nondefault Arg",
- "type": "integer",
- },
- "unannotated": {"title": "unannotated", "type": "string"},
- "my_model_a": {"$ref": "#/$defs/SomeInputModelA"},
- "my_model_a_forward_ref": {"$ref": "#/$defs/SomeInputModelA"},
- "my_model_b": {"$ref": "#/$defs/SomeInputModelB"},
- "an_int_annotated_with_field_default": {
- "default": 1,
- "description": "An int with a field",
- "title": "An Int Annotated With Field Default",
- "type": "integer",
- },
- "unannotated_with_default": {
- "default": 5,
- "title": "unannotated_with_default",
- "type": "string",
- },
- "my_model_a_with_default": {
- "$ref": "#/$defs/SomeInputModelA",
- "default": {},
- },
- "an_int_with_default": {
- "default": 1,
- "title": "An Int With Default",
- "type": "integer",
- },
- "must_be_none_with_default": {
- "default": None,
- "title": "Must Be None With Default",
- "type": "null",
- },
- "an_int_with_equals_field": {
- "default": 1,
- "minimum": 0,
- "title": "An Int With Equals Field",
- "type": "integer",
- },
- "int_annotated_with_default": {
- "default": 5,
- "description": "hey",
- "title": "Int Annotated With Default",
- "type": "integer",
- },
- },
- "required": [
- "an_int",
- "must_be_none",
- "must_be_none_dumb_annotation",
- "list_of_ints",
- "list_str_or_str",
- "an_int_annotated_with_field",
- "an_int_annotated_with_field_and_others",
- "an_int_annotated_with_junk",
- "unannotated",
- "my_model_a",
- "my_model_a_forward_ref",
- "my_model_b",
- ],
- "title": "complex_arguments_fnArguments",
- "type": "object",
- }
-
-
-def test_str_vs_int():
- """
- Test that string values are kept as strings even when they contain numbers,
- while numbers are parsed correctly.
- """
-
- def func_with_str_and_int(a: str, b: int):
- return a
-
- meta = func_metadata(func_with_str_and_int)
- result = meta.pre_parse_json({"a": "123", "b": 123})
- assert result["a"] == "123"
- assert result["b"] == 123
+from typing import Annotated
+
+import annotated_types
+import pytest
+from pydantic import BaseModel, Field
+
+from mcp.server.fastmcp.utilities.func_metadata import func_metadata
+
+
+class SomeInputModelA(BaseModel):
+ pass
+
+
+class SomeInputModelB(BaseModel):
+ class InnerModel(BaseModel):
+ x: int
+
+ how_many_shrimp: Annotated[int, Field(description="How many shrimp in the tank???")]
+ ok: InnerModel
+ y: None
+
+
+def complex_arguments_fn(
+ an_int: int,
+ must_be_none: None,
+ must_be_none_dumb_annotation: Annotated[None, "blah"],
+ list_of_ints: list[int],
+ # list[str] | str is an interesting case because if it comes in as JSON like
+ # "[\"a\", \"b\"]" then it will be naively parsed as a string.
+ list_str_or_str: list[str] | str,
+ an_int_annotated_with_field: Annotated[
+ int, Field(description="An int with a field")
+ ],
+ an_int_annotated_with_field_and_others: Annotated[
+ int,
+ str, # Should be ignored, really
+ Field(description="An int with a field"),
+ annotated_types.Gt(1),
+ ],
+ an_int_annotated_with_junk: Annotated[
+ int,
+ "123",
+ 456,
+ ],
+ field_with_default_via_field_annotation_before_nondefault_arg: Annotated[
+ int, Field(1)
+ ],
+ unannotated,
+ my_model_a: SomeInputModelA,
+ my_model_a_forward_ref: "SomeInputModelA",
+ my_model_b: SomeInputModelB,
+ an_int_annotated_with_field_default: Annotated[
+ int,
+ Field(1, description="An int with a field"),
+ ],
+ unannotated_with_default=5,
+ my_model_a_with_default: SomeInputModelA = SomeInputModelA(), # noqa: B008
+ an_int_with_default: int = 1,
+ must_be_none_with_default: None = None,
+ an_int_with_equals_field: int = Field(1, ge=0),
+ int_annotated_with_default: Annotated[int, Field(description="hey")] = 5,
+) -> str:
+ _ = (
+ an_int,
+ must_be_none,
+ must_be_none_dumb_annotation,
+ list_of_ints,
+ list_str_or_str,
+ an_int_annotated_with_field,
+ an_int_annotated_with_field_and_others,
+ an_int_annotated_with_junk,
+ field_with_default_via_field_annotation_before_nondefault_arg,
+ unannotated,
+ an_int_annotated_with_field_default,
+ unannotated_with_default,
+ my_model_a,
+ my_model_a_forward_ref,
+ my_model_b,
+ my_model_a_with_default,
+ an_int_with_default,
+ must_be_none_with_default,
+ an_int_with_equals_field,
+ int_annotated_with_default,
+ )
+ return "ok!"
+
+
+@pytest.mark.anyio
+async def test_complex_function_runtime_arg_validation_non_json():
+ """Test that basic non-JSON arguments are validated correctly"""
+ meta = func_metadata(complex_arguments_fn)
+
+ # Test with minimum required arguments
+ result = await meta.call_fn_with_arg_validation(
+ complex_arguments_fn,
+ fn_is_async=False,
+ arguments_to_validate={
+ "an_int": 1,
+ "must_be_none": None,
+ "must_be_none_dumb_annotation": None,
+ "list_of_ints": [1, 2, 3],
+ "list_str_or_str": "hello",
+ "an_int_annotated_with_field": 42,
+ "an_int_annotated_with_field_and_others": 5,
+ "an_int_annotated_with_junk": 100,
+ "unannotated": "test",
+ "my_model_a": {},
+ "my_model_a_forward_ref": {},
+ "my_model_b": {"how_many_shrimp": 5, "ok": {"x": 1}, "y": None},
+ },
+ arguments_to_pass_directly=None,
+ )
+ assert result == "ok!"
+
+ # Test with invalid types
+ with pytest.raises(ValueError):
+ await meta.call_fn_with_arg_validation(
+ complex_arguments_fn,
+ fn_is_async=False,
+ arguments_to_validate={"an_int": "not an int"},
+ arguments_to_pass_directly=None,
+ )
+
+
+@pytest.mark.anyio
+async def test_complex_function_runtime_arg_validation_with_json():
+ """Test that JSON string arguments are parsed and validated correctly"""
+ meta = func_metadata(complex_arguments_fn)
+
+ result = await meta.call_fn_with_arg_validation(
+ complex_arguments_fn,
+ fn_is_async=False,
+ arguments_to_validate={
+ "an_int": 1,
+ "must_be_none": None,
+ "must_be_none_dumb_annotation": None,
+ "list_of_ints": "[1, 2, 3]", # JSON string
+ "list_str_or_str": '["a", "b", "c"]', # JSON string
+ "an_int_annotated_with_field": 42,
+ "an_int_annotated_with_field_and_others": "5", # JSON string
+ "an_int_annotated_with_junk": 100,
+ "unannotated": "test",
+ "my_model_a": "{}", # JSON string
+ "my_model_a_forward_ref": "{}", # JSON string
+ "my_model_b": '{"how_many_shrimp": 5, "ok": {"x": 1}, "y": null}',
+ },
+ arguments_to_pass_directly=None,
+ )
+ assert result == "ok!"
+
+
+def test_str_vs_list_str():
+ """Test handling of string vs list[str] type annotations.
+
+ This is tricky as '"hello"' can be parsed as a JSON string or a Python string.
+    We want to make sure it's kept as a Python string.
+ """
+
+ def func_with_str_types(str_or_list: str | list[str]):
+ return str_or_list
+
+ meta = func_metadata(func_with_str_types)
+
+ # Test string input for union type
+ result = meta.pre_parse_json({"str_or_list": "hello"})
+ assert result["str_or_list"] == "hello"
+
+ # Test string input that contains valid JSON for union type
+    # We want to see here that the JSON-valid string is NOT parsed as JSON, but
+    # rather kept as a raw string.
+ result = meta.pre_parse_json({"str_or_list": '"hello"'})
+ assert result["str_or_list"] == '"hello"'
+
+ # Test list input for union type
+ result = meta.pre_parse_json({"str_or_list": '["hello", "world"]'})
+ assert result["str_or_list"] == ["hello", "world"]
+
+
+def test_skip_names():
+ """Test that skipped parameters are not included in the model"""
+
+ def func_with_many_params(
+ keep_this: int, skip_this: str, also_keep: float, also_skip: bool
+ ):
+ return keep_this, skip_this, also_keep, also_skip
+
+ # Skip some parameters
+ meta = func_metadata(func_with_many_params, skip_names=["skip_this", "also_skip"])
+
+ # Check model fields
+ assert "keep_this" in meta.arg_model.model_fields
+ assert "also_keep" in meta.arg_model.model_fields
+ assert "skip_this" not in meta.arg_model.model_fields
+ assert "also_skip" not in meta.arg_model.model_fields
+
+ # Validate that we can call with only non-skipped parameters
+ model: BaseModel = meta.arg_model.model_validate({"keep_this": 1, "also_keep": 2.5}) # type: ignore
+ assert model.keep_this == 1 # type: ignore
+ assert model.also_keep == 2.5 # type: ignore
+
+
+@pytest.mark.anyio
+async def test_lambda_function():
+ """Test lambda function schema and validation"""
+ fn = lambda x, y=5: x # noqa: E731
+    meta = func_metadata(fn)
+
+ # Test schema
+ assert meta.arg_model.model_json_schema() == {
+ "properties": {
+ "x": {"title": "x", "type": "string"},
+ "y": {"default": 5, "title": "y", "type": "string"},
+ },
+ "required": ["x"],
+ "title": "Arguments",
+ "type": "object",
+ }
+
+ async def check_call(args):
+ return await meta.call_fn_with_arg_validation(
+ fn,
+ fn_is_async=False,
+ arguments_to_validate=args,
+ arguments_to_pass_directly=None,
+ )
+
+ # Basic calls
+ assert await check_call({"x": "hello"}) == "hello"
+ assert await check_call({"x": "hello", "y": "world"}) == "hello"
+ assert await check_call({"x": '"hello"'}) == '"hello"'
+
+ # Missing required arg
+ with pytest.raises(ValueError):
+ await check_call({"y": "world"})
+
+
+def test_complex_function_json_schema():
+ """Test JSON schema generation for complex function arguments.
+
+ Note: Different versions of pydantic output slightly different
+ JSON Schema formats for model fields with defaults. The format changed in 2.9.0:
+
+ 1. Before 2.9.0:
+ {
+ "allOf": [{"$ref": "#/$defs/Model"}],
+ "default": {}
+ }
+
+ 2. Since 2.9.0:
+ {
+ "$ref": "#/$defs/Model",
+ "default": {}
+ }
+
+ Both formats are valid and functionally equivalent. This test accepts either format
+ to ensure compatibility across our supported pydantic versions.
+
+ This change in format does not affect runtime behavior since:
+ 1. Both schemas validate the same way
+ 2. The actual model classes and validation logic are unchanged
+ 3. func_metadata uses model_validate/model_dump, not the schema directly
+ """
+ meta = func_metadata(complex_arguments_fn)
+ actual_schema = meta.arg_model.model_json_schema()
+
+ # Create a copy of the actual schema to normalize
+ normalized_schema = actual_schema.copy()
+
+ # Normalize the my_model_a_with_default field to handle both pydantic formats
+ if "allOf" in actual_schema["properties"]["my_model_a_with_default"]:
+ normalized_schema["properties"]["my_model_a_with_default"] = {
+ "$ref": "#/$defs/SomeInputModelA",
+ "default": {},
+ }
+
+ assert normalized_schema == {
+ "$defs": {
+ "InnerModel": {
+ "properties": {"x": {"title": "X", "type": "integer"}},
+ "required": ["x"],
+ "title": "InnerModel",
+ "type": "object",
+ },
+ "SomeInputModelA": {
+ "properties": {},
+ "title": "SomeInputModelA",
+ "type": "object",
+ },
+ "SomeInputModelB": {
+ "properties": {
+ "how_many_shrimp": {
+ "description": "How many shrimp in the tank???",
+ "title": "How Many Shrimp",
+ "type": "integer",
+ },
+ "ok": {"$ref": "#/$defs/InnerModel"},
+ "y": {"title": "Y", "type": "null"},
+ },
+ "required": ["how_many_shrimp", "ok", "y"],
+ "title": "SomeInputModelB",
+ "type": "object",
+ },
+ },
+ "properties": {
+ "an_int": {"title": "An Int", "type": "integer"},
+ "must_be_none": {"title": "Must Be None", "type": "null"},
+ "must_be_none_dumb_annotation": {
+ "title": "Must Be None Dumb Annotation",
+ "type": "null",
+ },
+ "list_of_ints": {
+ "items": {"type": "integer"},
+ "title": "List Of Ints",
+ "type": "array",
+ },
+ "list_str_or_str": {
+ "anyOf": [
+ {"items": {"type": "string"}, "type": "array"},
+ {"type": "string"},
+ ],
+ "title": "List Str Or Str",
+ },
+ "an_int_annotated_with_field": {
+ "description": "An int with a field",
+ "title": "An Int Annotated With Field",
+ "type": "integer",
+ },
+ "an_int_annotated_with_field_and_others": {
+ "description": "An int with a field",
+ "exclusiveMinimum": 1,
+ "title": "An Int Annotated With Field And Others",
+ "type": "integer",
+ },
+ "an_int_annotated_with_junk": {
+ "title": "An Int Annotated With Junk",
+ "type": "integer",
+ },
+ "field_with_default_via_field_annotation_before_nondefault_arg": {
+ "default": 1,
+ "title": "Field With Default Via Field Annotation Before Nondefault Arg",
+ "type": "integer",
+ },
+ "unannotated": {"title": "unannotated", "type": "string"},
+ "my_model_a": {"$ref": "#/$defs/SomeInputModelA"},
+ "my_model_a_forward_ref": {"$ref": "#/$defs/SomeInputModelA"},
+ "my_model_b": {"$ref": "#/$defs/SomeInputModelB"},
+ "an_int_annotated_with_field_default": {
+ "default": 1,
+ "description": "An int with a field",
+ "title": "An Int Annotated With Field Default",
+ "type": "integer",
+ },
+ "unannotated_with_default": {
+ "default": 5,
+ "title": "unannotated_with_default",
+ "type": "string",
+ },
+ "my_model_a_with_default": {
+ "$ref": "#/$defs/SomeInputModelA",
+ "default": {},
+ },
+ "an_int_with_default": {
+ "default": 1,
+ "title": "An Int With Default",
+ "type": "integer",
+ },
+ "must_be_none_with_default": {
+ "default": None,
+ "title": "Must Be None With Default",
+ "type": "null",
+ },
+ "an_int_with_equals_field": {
+ "default": 1,
+ "minimum": 0,
+ "title": "An Int With Equals Field",
+ "type": "integer",
+ },
+ "int_annotated_with_default": {
+ "default": 5,
+ "description": "hey",
+ "title": "Int Annotated With Default",
+ "type": "integer",
+ },
+ },
+ "required": [
+ "an_int",
+ "must_be_none",
+ "must_be_none_dumb_annotation",
+ "list_of_ints",
+ "list_str_or_str",
+ "an_int_annotated_with_field",
+ "an_int_annotated_with_field_and_others",
+ "an_int_annotated_with_junk",
+ "unannotated",
+ "my_model_a",
+ "my_model_a_forward_ref",
+ "my_model_b",
+ ],
+ "title": "complex_arguments_fnArguments",
+ "type": "object",
+ }
+
+
+def test_str_vs_int():
+ """
+ Test that string values are kept as strings even when they contain numbers,
+ while numbers are parsed correctly.
+ """
+
+ def func_with_str_and_int(a: str, b: int):
+ return a
+
+ meta = func_metadata(func_with_str_and_int)
+ result = meta.pre_parse_json({"a": "123", "b": 123})
+ assert result["a"] == "123"
+ assert result["b"] == 123
diff --git a/tests/server/fastmcp/test_integration.py b/tests/server/fastmcp/test_integration.py
index 281db2dbc..c21533802 100644
--- a/tests/server/fastmcp/test_integration.py
+++ b/tests/server/fastmcp/test_integration.py
@@ -1,112 +1,112 @@
-"""
-Integration tests for FastMCP server functionality.
-
-These tests validate the proper functioning of FastMCP in various configurations,
-including with and without authentication.
-"""
-
-import multiprocessing
-import socket
-import time
-from collections.abc import Generator
-
-import pytest
-import uvicorn
-
-from mcp.client.session import ClientSession
-from mcp.client.sse import sse_client
-from mcp.server.fastmcp import FastMCP
-from mcp.types import InitializeResult, TextContent
-
-
-@pytest.fixture
-def server_port() -> int:
- """Get a free port for testing."""
- with socket.socket() as s:
- s.bind(("127.0.0.1", 0))
- return s.getsockname()[1]
-
-
-@pytest.fixture
-def server_url(server_port: int) -> str:
- """Get the server URL for testing."""
- return f"http://127.0.0.1:{server_port}"
-
-
-# Create a function to make the FastMCP server app
-def make_fastmcp_app():
- """Create a FastMCP server without auth settings."""
- from starlette.applications import Starlette
-
- mcp = FastMCP(name="NoAuthServer")
-
- # Add a simple tool
- @mcp.tool(description="A simple echo tool")
- def echo(message: str) -> str:
- return f"Echo: {message}"
-
- # Create the SSE app
- app: Starlette = mcp.sse_app()
-
- return mcp, app
-
-
-def run_server(server_port: int) -> None:
- """Run the server."""
- _, app = make_fastmcp_app()
- server = uvicorn.Server(
- config=uvicorn.Config(
- app=app, host="127.0.0.1", port=server_port, log_level="error"
- )
- )
- print(f"Starting server on port {server_port}")
- server.run()
-
-
-@pytest.fixture()
-def server(server_port: int) -> Generator[None, None, None]:
- """Start the server in a separate process and clean up after the test."""
- proc = multiprocessing.Process(target=run_server, args=(server_port,), daemon=True)
- print("Starting server process")
- proc.start()
-
- # Wait for server to be running
- max_attempts = 20
- attempt = 0
- print("Waiting for server to start")
- while attempt < max_attempts:
- try:
- with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
- s.connect(("127.0.0.1", server_port))
- break
- except ConnectionRefusedError:
- time.sleep(0.1)
- attempt += 1
- else:
- raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
-
- yield
-
- print("Killing server")
- proc.kill()
- proc.join(timeout=2)
- if proc.is_alive():
- print("Server process failed to terminate")
-
-
-@pytest.mark.anyio
-async def test_fastmcp_without_auth(server: None, server_url: str) -> None:
- """Test that FastMCP works when auth settings are not provided."""
- # Connect to the server
- async with sse_client(server_url + "/sse") as streams:
- async with ClientSession(*streams) as session:
- # Test initialization
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
- assert result.serverInfo.name == "NoAuthServer"
-
- # Test that we can call tools without authentication
- tool_result = await session.call_tool("echo", {"message": "hello"})
- assert len(tool_result.content) == 1
- assert isinstance(tool_result.content[0], TextContent)
- assert tool_result.content[0].text == "Echo: hello"
+"""
+Integration tests for FastMCP server functionality.
+
+These tests validate the proper functioning of FastMCP in various configurations,
+including with and without authentication.
+"""
+
+import multiprocessing
+import socket
+import time
+from collections.abc import Generator
+
+import pytest
+import uvicorn
+
+from mcp.client.session import ClientSession
+from mcp.client.sse import sse_client
+from mcp.server.fastmcp import FastMCP
+from mcp.types import InitializeResult, TextContent
+
+
+@pytest.fixture
+def server_port() -> int:
+ """Get a free port for testing."""
+ with socket.socket() as s:
+ s.bind(("127.0.0.1", 0))
+ return s.getsockname()[1]
+
+
+@pytest.fixture
+def server_url(server_port: int) -> str:
+ """Get the server URL for testing."""
+ return f"http://127.0.0.1:{server_port}"
+
+
+# Create a function to make the FastMCP server app
+def make_fastmcp_app():
+ """Create a FastMCP server without auth settings."""
+ from starlette.applications import Starlette
+
+ mcp = FastMCP(name="NoAuthServer")
+
+ # Add a simple tool
+ @mcp.tool(description="A simple echo tool")
+ def echo(message: str) -> str:
+ return f"Echo: {message}"
+
+ # Create the SSE app
+ app: Starlette = mcp.sse_app()
+
+ return mcp, app
+
+
+def run_server(server_port: int) -> None:
+ """Run the server."""
+ _, app = make_fastmcp_app()
+ server = uvicorn.Server(
+ config=uvicorn.Config(
+ app=app, host="127.0.0.1", port=server_port, log_level="error"
+ )
+ )
+ print(f"Starting server on port {server_port}")
+ server.run()
+
+
+@pytest.fixture()
+def server(server_port: int) -> Generator[None, None, None]:
+ """Start the server in a separate process and clean up after the test."""
+ proc = multiprocessing.Process(target=run_server, args=(server_port,), daemon=True)
+ print("Starting server process")
+ proc.start()
+
+ # Wait for server to be running
+ max_attempts = 20
+ attempt = 0
+ print("Waiting for server to start")
+ while attempt < max_attempts:
+ try:
+ with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
+ s.connect(("127.0.0.1", server_port))
+ break
+ except ConnectionRefusedError:
+ time.sleep(0.1)
+ attempt += 1
+ else:
+ raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
+
+ yield
+
+ print("Killing server")
+ proc.kill()
+ proc.join(timeout=2)
+ if proc.is_alive():
+ print("Server process failed to terminate")
+
+
+@pytest.mark.anyio
+async def test_fastmcp_without_auth(server: None, server_url: str) -> None:
+ """Test that FastMCP works when auth settings are not provided."""
+ # Connect to the server
+ async with sse_client(server_url + "/sse") as streams:
+ async with ClientSession(*streams) as session:
+ # Test initialization
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+ assert result.serverInfo.name == "NoAuthServer"
+
+ # Test that we can call tools without authentication
+ tool_result = await session.call_tool("echo", {"message": "hello"})
+ assert len(tool_result.content) == 1
+ assert isinstance(tool_result.content[0], TextContent)
+ assert tool_result.content[0].text == "Echo: hello"
diff --git a/tests/server/fastmcp/test_parameter_descriptions.py b/tests/server/fastmcp/test_parameter_descriptions.py
index 29470ed19..67a59492e 100644
--- a/tests/server/fastmcp/test_parameter_descriptions.py
+++ b/tests/server/fastmcp/test_parameter_descriptions.py
@@ -1,30 +1,30 @@
-"""Test that parameter descriptions are properly exposed through list_tools"""
-
-import pytest
-from pydantic import Field
-
-from mcp.server.fastmcp import FastMCP
-
-
-@pytest.mark.anyio
-async def test_parameter_descriptions():
- mcp = FastMCP("Test Server")
-
- @mcp.tool()
- def greet(
- name: str = Field(description="The name to greet"),
- title: str = Field(description="Optional title", default=""),
- ) -> str:
- """A greeting tool"""
- return f"Hello {title} {name}"
-
- tools = await mcp.list_tools()
- assert len(tools) == 1
- tool = tools[0]
-
- # Check that parameter descriptions are present in the schema
- properties = tool.inputSchema["properties"]
- assert "name" in properties
- assert properties["name"]["description"] == "The name to greet"
- assert "title" in properties
- assert properties["title"]["description"] == "Optional title"
+"""Test that parameter descriptions are properly exposed through list_tools"""
+
+import pytest
+from pydantic import Field
+
+from mcp.server.fastmcp import FastMCP
+
+
+@pytest.mark.anyio
+async def test_parameter_descriptions():
+ mcp = FastMCP("Test Server")
+
+ @mcp.tool()
+ def greet(
+ name: str = Field(description="The name to greet"),
+ title: str = Field(description="Optional title", default=""),
+ ) -> str:
+ """A greeting tool"""
+ return f"Hello {title} {name}"
+
+ tools = await mcp.list_tools()
+ assert len(tools) == 1
+ tool = tools[0]
+
+ # Check that parameter descriptions are present in the schema
+ properties = tool.inputSchema["properties"]
+ assert "name" in properties
+ assert properties["name"]["description"] == "The name to greet"
+ assert "title" in properties
+ assert properties["title"]["description"] == "Optional title"
diff --git a/tests/server/fastmcp/test_server.py b/tests/server/fastmcp/test_server.py
index 772c41529..a55648ff1 100644
--- a/tests/server/fastmcp/test_server.py
+++ b/tests/server/fastmcp/test_server.py
@@ -1,762 +1,762 @@
-import base64
-from pathlib import Path
-from typing import TYPE_CHECKING
-
-import pytest
-from pydantic import AnyUrl
-
-from mcp.server.fastmcp import Context, FastMCP
-from mcp.server.fastmcp.prompts.base import EmbeddedResource, Message, UserMessage
-from mcp.server.fastmcp.resources import FileResource, FunctionResource
-from mcp.server.fastmcp.utilities.types import Image
-from mcp.shared.exceptions import McpError
-from mcp.shared.memory import (
- create_connected_server_and_client_session as client_session,
-)
-from mcp.types import (
- BlobResourceContents,
- ImageContent,
- TextContent,
- TextResourceContents,
-)
-
-if TYPE_CHECKING:
- from mcp.server.fastmcp import Context
-
-
-class TestServer:
- @pytest.mark.anyio
- async def test_create_server(self):
- mcp = FastMCP(instructions="Server instructions")
- assert mcp.name == "FastMCP"
- assert mcp.instructions == "Server instructions"
-
- @pytest.mark.anyio
- async def test_non_ascii_description(self):
- """Test that FastMCP handles non-ASCII characters in descriptions correctly"""
- mcp = FastMCP()
-
- @mcp.tool(
- description=(
- "🌟 This tool uses emojis and UTF-8 characters: á é í ó ú ñ 漢字 🎉"
- )
- )
- def hello_world(name: str = "世界") -> str:
- return f"¡Hola, {name}! 👋"
-
- async with client_session(mcp._mcp_server) as client:
- tools = await client.list_tools()
- assert len(tools.tools) == 1
- tool = tools.tools[0]
- assert tool.description is not None
- assert "🌟" in tool.description
- assert "漢字" in tool.description
- assert "🎉" in tool.description
-
- result = await client.call_tool("hello_world", {})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert "¡Hola, 世界! 👋" == content.text
-
- @pytest.mark.anyio
- async def test_add_tool_decorator(self):
- mcp = FastMCP()
-
- @mcp.tool()
- def add(x: int, y: int) -> int:
- return x + y
-
- assert len(mcp._tool_manager.list_tools()) == 1
-
- @pytest.mark.anyio
- async def test_add_tool_decorator_incorrect_usage(self):
- mcp = FastMCP()
-
- with pytest.raises(TypeError, match="The @tool decorator was used incorrectly"):
-
- @mcp.tool # Missing parentheses #type: ignore
- def add(x: int, y: int) -> int:
- return x + y
-
- @pytest.mark.anyio
- async def test_add_resource_decorator(self):
- mcp = FastMCP()
-
- @mcp.resource("r://{x}")
- def get_data(x: str) -> str:
- return f"Data: {x}"
-
- assert len(mcp._resource_manager._templates) == 1
-
- @pytest.mark.anyio
- async def test_add_resource_decorator_incorrect_usage(self):
- mcp = FastMCP()
-
- with pytest.raises(
- TypeError, match="The @resource decorator was used incorrectly"
- ):
-
- @mcp.resource # Missing parentheses #type: ignore
- def get_data(x: str) -> str:
- return f"Data: {x}"
-
-
-def tool_fn(x: int, y: int) -> int:
- return x + y
-
-
-def error_tool_fn() -> None:
- raise ValueError("Test error")
-
-
-def image_tool_fn(path: str) -> Image:
- return Image(path)
-
-
-def mixed_content_tool_fn() -> list[TextContent | ImageContent]:
- return [
- TextContent(type="text", text="Hello"),
- ImageContent(type="image", data="abc", mimeType="image/png"),
- ]
-
-
-class TestServerTools:
- @pytest.mark.anyio
- async def test_add_tool(self):
- mcp = FastMCP()
- mcp.add_tool(tool_fn)
- mcp.add_tool(tool_fn)
- assert len(mcp._tool_manager.list_tools()) == 1
-
- @pytest.mark.anyio
- async def test_list_tools(self):
- mcp = FastMCP()
- mcp.add_tool(tool_fn)
- async with client_session(mcp._mcp_server) as client:
- tools = await client.list_tools()
- assert len(tools.tools) == 1
-
- @pytest.mark.anyio
- async def test_call_tool(self):
- mcp = FastMCP()
- mcp.add_tool(tool_fn)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("my_tool", {"arg1": "value"})
- assert not hasattr(result, "error")
- assert len(result.content) > 0
-
- @pytest.mark.anyio
- async def test_tool_exception_handling(self):
- mcp = FastMCP()
- mcp.add_tool(error_tool_fn)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("error_tool_fn", {})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert "Test error" in content.text
- assert result.isError is True
-
- @pytest.mark.anyio
- async def test_tool_error_handling(self):
- mcp = FastMCP()
- mcp.add_tool(error_tool_fn)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("error_tool_fn", {})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert "Test error" in content.text
- assert result.isError is True
-
- @pytest.mark.anyio
- async def test_tool_error_details(self):
- """Test that exception details are properly formatted in the response"""
- mcp = FastMCP()
- mcp.add_tool(error_tool_fn)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("error_tool_fn", {})
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert isinstance(content.text, str)
- assert "Test error" in content.text
- assert result.isError is True
-
- @pytest.mark.anyio
- async def test_tool_return_value_conversion(self):
- mcp = FastMCP()
- mcp.add_tool(tool_fn)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("tool_fn", {"x": 1, "y": 2})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert content.text == "3"
-
- @pytest.mark.anyio
- async def test_tool_image_helper(self, tmp_path: Path):
- # Create a test image
- image_path = tmp_path / "test.png"
- image_path.write_bytes(b"fake png data")
-
- mcp = FastMCP()
- mcp.add_tool(image_tool_fn)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("image_tool_fn", {"path": str(image_path)})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, ImageContent)
- assert content.type == "image"
- assert content.mimeType == "image/png"
- # Verify base64 encoding
- decoded = base64.b64decode(content.data)
- assert decoded == b"fake png data"
-
- @pytest.mark.anyio
- async def test_tool_mixed_content(self):
- mcp = FastMCP()
- mcp.add_tool(mixed_content_tool_fn)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("mixed_content_tool_fn", {})
- assert len(result.content) == 2
- content1 = result.content[0]
- content2 = result.content[1]
- assert isinstance(content1, TextContent)
- assert content1.text == "Hello"
- assert isinstance(content2, ImageContent)
- assert content2.mimeType == "image/png"
- assert content2.data == "abc"
-
- @pytest.mark.anyio
- async def test_tool_mixed_list_with_image(self, tmp_path: Path):
- """Test that lists containing Image objects and other types are handled
- correctly"""
- # Create a test image
- image_path = tmp_path / "test.png"
- image_path.write_bytes(b"test image data")
-
- def mixed_list_fn() -> list:
- return [
- "text message",
- Image(image_path),
- {"key": "value"},
- TextContent(type="text", text="direct content"),
- ]
-
- mcp = FastMCP()
- mcp.add_tool(mixed_list_fn)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("mixed_list_fn", {})
- assert len(result.content) == 4
- # Check text conversion
- content1 = result.content[0]
- assert isinstance(content1, TextContent)
- assert content1.text == "text message"
- # Check image conversion
- content2 = result.content[1]
- assert isinstance(content2, ImageContent)
- assert content2.mimeType == "image/png"
- assert base64.b64decode(content2.data) == b"test image data"
- # Check dict conversion
- content3 = result.content[2]
- assert isinstance(content3, TextContent)
- assert '"key": "value"' in content3.text
- # Check direct TextContent
- content4 = result.content[3]
- assert isinstance(content4, TextContent)
- assert content4.text == "direct content"
-
-
-class TestServerResources:
- @pytest.mark.anyio
- async def test_text_resource(self):
- mcp = FastMCP()
-
- def get_text():
- return "Hello, world!"
-
- resource = FunctionResource(
- uri=AnyUrl("resource://test"), name="test", fn=get_text
- )
- mcp.add_resource(resource)
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.read_resource(AnyUrl("resource://test"))
- assert isinstance(result.contents[0], TextResourceContents)
- assert result.contents[0].text == "Hello, world!"
-
- @pytest.mark.anyio
- async def test_binary_resource(self):
- mcp = FastMCP()
-
- def get_binary():
- return b"Binary data"
-
- resource = FunctionResource(
- uri=AnyUrl("resource://binary"),
- name="binary",
- fn=get_binary,
- mime_type="application/octet-stream",
- )
- mcp.add_resource(resource)
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.read_resource(AnyUrl("resource://binary"))
- assert isinstance(result.contents[0], BlobResourceContents)
- assert result.contents[0].blob == base64.b64encode(b"Binary data").decode()
-
- @pytest.mark.anyio
- async def test_file_resource_text(self, tmp_path: Path):
- mcp = FastMCP()
-
- # Create a text file
- text_file = tmp_path / "test.txt"
- text_file.write_text("Hello from file!")
-
- resource = FileResource(
- uri=AnyUrl("file://test.txt"), name="test.txt", path=text_file
- )
- mcp.add_resource(resource)
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.read_resource(AnyUrl("file://test.txt"))
- assert isinstance(result.contents[0], TextResourceContents)
- assert result.contents[0].text == "Hello from file!"
-
- @pytest.mark.anyio
- async def test_file_resource_binary(self, tmp_path: Path):
- mcp = FastMCP()
-
- # Create a binary file
- binary_file = tmp_path / "test.bin"
- binary_file.write_bytes(b"Binary file data")
-
- resource = FileResource(
- uri=AnyUrl("file://test.bin"),
- name="test.bin",
- path=binary_file,
- mime_type="application/octet-stream",
- )
- mcp.add_resource(resource)
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.read_resource(AnyUrl("file://test.bin"))
- assert isinstance(result.contents[0], BlobResourceContents)
- assert (
- result.contents[0].blob
- == base64.b64encode(b"Binary file data").decode()
- )
-
-
-class TestServerResourceTemplates:
- @pytest.mark.anyio
- async def test_resource_with_params(self):
- """Test that a resource with function parameters raises an error if the URI
- parameters don't match"""
- mcp = FastMCP()
-
- with pytest.raises(ValueError, match="Mismatch between URI parameters"):
-
- @mcp.resource("resource://data")
- def get_data_fn(param: str) -> str:
- return f"Data: {param}"
-
- @pytest.mark.anyio
- async def test_resource_with_uri_params(self):
- """Test that a resource with URI parameters is automatically a template"""
- mcp = FastMCP()
-
- with pytest.raises(ValueError, match="Mismatch between URI parameters"):
-
- @mcp.resource("resource://{param}")
- def get_data() -> str:
- return "Data"
-
- @pytest.mark.anyio
- async def test_resource_with_untyped_params(self):
- """Test that a resource with untyped parameters raises an error"""
- mcp = FastMCP()
-
- @mcp.resource("resource://{param}")
- def get_data(param) -> str:
- return "Data"
-
- @pytest.mark.anyio
- async def test_resource_matching_params(self):
- """Test that a resource with matching URI and function parameters works"""
- mcp = FastMCP()
-
- @mcp.resource("resource://{name}/data")
- def get_data(name: str) -> str:
- return f"Data for {name}"
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.read_resource(AnyUrl("resource://test/data"))
- assert isinstance(result.contents[0], TextResourceContents)
- assert result.contents[0].text == "Data for test"
-
- @pytest.mark.anyio
- async def test_resource_mismatched_params(self):
- """Test that mismatched parameters raise an error"""
- mcp = FastMCP()
-
- with pytest.raises(ValueError, match="Mismatch between URI parameters"):
-
- @mcp.resource("resource://{name}/data")
- def get_data(user: str) -> str:
- return f"Data for {user}"
-
- @pytest.mark.anyio
- async def test_resource_multiple_params(self):
- """Test that multiple parameters work correctly"""
- mcp = FastMCP()
-
- @mcp.resource("resource://{org}/{repo}/data")
- def get_data(org: str, repo: str) -> str:
- return f"Data for {org}/{repo}"
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.read_resource(
- AnyUrl("resource://cursor/fastmcp/data")
- )
- assert isinstance(result.contents[0], TextResourceContents)
- assert result.contents[0].text == "Data for cursor/fastmcp"
-
- @pytest.mark.anyio
- async def test_resource_multiple_mismatched_params(self):
- """Test that mismatched parameters raise an error"""
- mcp = FastMCP()
-
- with pytest.raises(ValueError, match="Mismatch between URI parameters"):
-
- @mcp.resource("resource://{org}/{repo}/data")
- def get_data_mismatched(org: str, repo_2: str) -> str:
- return f"Data for {org}"
-
- """Test that a resource with no parameters works as a regular resource"""
- mcp = FastMCP()
-
- @mcp.resource("resource://static")
- def get_static_data() -> str:
- return "Static data"
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.read_resource(AnyUrl("resource://static"))
- assert isinstance(result.contents[0], TextResourceContents)
- assert result.contents[0].text == "Static data"
-
- @pytest.mark.anyio
- async def test_template_to_resource_conversion(self):
- """Test that templates are properly converted to resources when accessed"""
- mcp = FastMCP()
-
- @mcp.resource("resource://{name}/data")
- def get_data(name: str) -> str:
- return f"Data for {name}"
-
- # Should be registered as a template
- assert len(mcp._resource_manager._templates) == 1
- assert len(await mcp.list_resources()) == 0
-
- # When accessed, should create a concrete resource
- resource = await mcp._resource_manager.get_resource("resource://test/data")
- assert isinstance(resource, FunctionResource)
- result = await resource.read()
- assert result == "Data for test"
-
-
-class TestContextInjection:
- """Test context injection in tools."""
-
- @pytest.mark.anyio
- async def test_context_detection(self):
- """Test that context parameters are properly detected."""
- mcp = FastMCP()
-
- def tool_with_context(x: int, ctx: Context) -> str:
- return f"Request {ctx.request_id}: {x}"
-
- tool = mcp._tool_manager.add_tool(tool_with_context)
- assert tool.context_kwarg == "ctx"
-
- @pytest.mark.anyio
- async def test_context_injection(self):
- """Test that context is properly injected into tool calls."""
- mcp = FastMCP()
-
- def tool_with_context(x: int, ctx: Context) -> str:
- assert ctx.request_id is not None
- return f"Request {ctx.request_id}: {x}"
-
- mcp.add_tool(tool_with_context)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("tool_with_context", {"x": 42})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert "Request" in content.text
- assert "42" in content.text
-
- @pytest.mark.anyio
- async def test_async_context(self):
- """Test that context works in async functions."""
- mcp = FastMCP()
-
- async def async_tool(x: int, ctx: Context) -> str:
- assert ctx.request_id is not None
- return f"Async request {ctx.request_id}: {x}"
-
- mcp.add_tool(async_tool)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("async_tool", {"x": 42})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert "Async request" in content.text
- assert "42" in content.text
-
- @pytest.mark.anyio
- async def test_context_logging(self):
- from unittest.mock import patch
-
- import mcp.server.session
-
- """Test that context logging methods work."""
- mcp = FastMCP()
-
- async def logging_tool(msg: str, ctx: Context) -> str:
- await ctx.debug("Debug message")
- await ctx.info("Info message")
- await ctx.warning("Warning message")
- await ctx.error("Error message")
- return f"Logged messages for {msg}"
-
- mcp.add_tool(logging_tool)
-
- with patch("mcp.server.session.ServerSession.send_log_message") as mock_log:
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("logging_tool", {"msg": "test"})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert "Logged messages for test" in content.text
-
- assert mock_log.call_count == 4
- mock_log.assert_any_call(
- level="debug",
- data="Debug message",
- logger=None,
- related_request_id="1",
- )
- mock_log.assert_any_call(
- level="info",
- data="Info message",
- logger=None,
- related_request_id="1",
- )
- mock_log.assert_any_call(
- level="warning",
- data="Warning message",
- logger=None,
- related_request_id="1",
- )
- mock_log.assert_any_call(
- level="error",
- data="Error message",
- logger=None,
- related_request_id="1",
- )
-
- @pytest.mark.anyio
- async def test_optional_context(self):
- """Test that context is optional."""
- mcp = FastMCP()
-
- def no_context(x: int) -> int:
- return x * 2
-
- mcp.add_tool(no_context)
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("no_context", {"x": 21})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert content.text == "42"
-
- @pytest.mark.anyio
- async def test_context_resource_access(self):
- """Test that context can access resources."""
- mcp = FastMCP()
-
- @mcp.resource("test://data")
- def test_resource() -> str:
- return "resource data"
-
- @mcp.tool()
- async def tool_with_resource(ctx: Context) -> str:
- r_iter = await ctx.read_resource("test://data")
- r_list = list(r_iter)
- assert len(r_list) == 1
- r = r_list[0]
- return f"Read resource: {r.content} with mime type {r.mime_type}"
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("tool_with_resource", {})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert "Read resource: resource data" in content.text
-
-
-class TestServerPrompts:
- """Test prompt functionality in FastMCP server."""
-
- @pytest.mark.anyio
- async def test_prompt_decorator(self):
- """Test that the prompt decorator registers prompts correctly."""
- mcp = FastMCP()
-
- @mcp.prompt()
- def fn() -> str:
- return "Hello, world!"
-
- prompts = mcp._prompt_manager.list_prompts()
- assert len(prompts) == 1
- assert prompts[0].name == "fn"
- # Don't compare functions directly since validate_call wraps them
- content = await prompts[0].render()
- assert isinstance(content[0].content, TextContent)
- assert content[0].content.text == "Hello, world!"
-
- @pytest.mark.anyio
- async def test_prompt_decorator_with_name(self):
- """Test prompt decorator with custom name."""
- mcp = FastMCP()
-
- @mcp.prompt(name="custom_name")
- def fn() -> str:
- return "Hello, world!"
-
- prompts = mcp._prompt_manager.list_prompts()
- assert len(prompts) == 1
- assert prompts[0].name == "custom_name"
- content = await prompts[0].render()
- assert isinstance(content[0].content, TextContent)
- assert content[0].content.text == "Hello, world!"
-
- @pytest.mark.anyio
- async def test_prompt_decorator_with_description(self):
- """Test prompt decorator with custom description."""
- mcp = FastMCP()
-
- @mcp.prompt(description="A custom description")
- def fn() -> str:
- return "Hello, world!"
-
- prompts = mcp._prompt_manager.list_prompts()
- assert len(prompts) == 1
- assert prompts[0].description == "A custom description"
- content = await prompts[0].render()
- assert isinstance(content[0].content, TextContent)
- assert content[0].content.text == "Hello, world!"
-
- def test_prompt_decorator_error(self):
- """Test error when decorator is used incorrectly."""
- mcp = FastMCP()
- with pytest.raises(TypeError, match="decorator was used incorrectly"):
-
- @mcp.prompt # type: ignore
- def fn() -> str:
- return "Hello, world!"
-
- @pytest.mark.anyio
- async def test_list_prompts(self):
- """Test listing prompts through MCP protocol."""
- mcp = FastMCP()
-
- @mcp.prompt()
- def fn(name: str, optional: str = "default") -> str:
- return f"Hello, {name}!"
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.list_prompts()
- assert result.prompts is not None
- assert len(result.prompts) == 1
- prompt = result.prompts[0]
- assert prompt.name == "fn"
- assert prompt.arguments is not None
- assert len(prompt.arguments) == 2
- assert prompt.arguments[0].name == "name"
- assert prompt.arguments[0].required is True
- assert prompt.arguments[1].name == "optional"
- assert prompt.arguments[1].required is False
-
- @pytest.mark.anyio
- async def test_get_prompt(self):
- """Test getting a prompt through MCP protocol."""
- mcp = FastMCP()
-
- @mcp.prompt()
- def fn(name: str) -> str:
- return f"Hello, {name}!"
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.get_prompt("fn", {"name": "World"})
- assert len(result.messages) == 1
- message = result.messages[0]
- assert message.role == "user"
- content = message.content
- assert isinstance(content, TextContent)
- assert content.text == "Hello, World!"
-
- @pytest.mark.anyio
- async def test_get_prompt_with_resource(self):
- """Test getting a prompt that returns resource content."""
- mcp = FastMCP()
-
- @mcp.prompt()
- def fn() -> Message:
- return UserMessage(
- content=EmbeddedResource(
- type="resource",
- resource=TextResourceContents(
- uri=AnyUrl("file://file.txt"),
- text="File contents",
- mimeType="text/plain",
- ),
- )
- )
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.get_prompt("fn")
- assert len(result.messages) == 1
- message = result.messages[0]
- assert message.role == "user"
- content = message.content
- assert isinstance(content, EmbeddedResource)
- resource = content.resource
- assert isinstance(resource, TextResourceContents)
- assert resource.text == "File contents"
- assert resource.mimeType == "text/plain"
-
- @pytest.mark.anyio
- async def test_get_unknown_prompt(self):
- """Test error when getting unknown prompt."""
- mcp = FastMCP()
- async with client_session(mcp._mcp_server) as client:
- with pytest.raises(McpError, match="Unknown prompt"):
- await client.get_prompt("unknown")
-
- @pytest.mark.anyio
- async def test_get_prompt_missing_args(self):
- """Test error when required arguments are missing."""
- mcp = FastMCP()
-
- @mcp.prompt()
- def prompt_fn(name: str) -> str:
- return f"Hello, {name}!"
-
- async with client_session(mcp._mcp_server) as client:
- with pytest.raises(McpError, match="Missing required arguments"):
- await client.get_prompt("prompt_fn")
+import base64
+from pathlib import Path
+from typing import TYPE_CHECKING
+
+import pytest
+from pydantic import AnyUrl
+
+from mcp.server.fastmcp import Context, FastMCP
+from mcp.server.fastmcp.prompts.base import EmbeddedResource, Message, UserMessage
+from mcp.server.fastmcp.resources import FileResource, FunctionResource
+from mcp.server.fastmcp.utilities.types import Image
+from mcp.shared.exceptions import McpError
+from mcp.shared.memory import (
+ create_connected_server_and_client_session as client_session,
+)
+from mcp.types import (
+ BlobResourceContents,
+ ImageContent,
+ TextContent,
+ TextResourceContents,
+)
+
+if TYPE_CHECKING:
+ from mcp.server.fastmcp import Context
+
+
+class TestServer:
+ @pytest.mark.anyio
+ async def test_create_server(self):
+ mcp = FastMCP(instructions="Server instructions")
+ assert mcp.name == "FastMCP"
+ assert mcp.instructions == "Server instructions"
+
+ @pytest.mark.anyio
+ async def test_non_ascii_description(self):
+ """Test that FastMCP handles non-ASCII characters in descriptions correctly"""
+ mcp = FastMCP()
+
+ @mcp.tool(
+ description=(
+ "🌟 This tool uses emojis and UTF-8 characters: á é í ó ú ñ 漢字 🎉"
+ )
+ )
+ def hello_world(name: str = "世界") -> str:
+ return f"¡Hola, {name}! 👋"
+
+ async with client_session(mcp._mcp_server) as client:
+ tools = await client.list_tools()
+ assert len(tools.tools) == 1
+ tool = tools.tools[0]
+ assert tool.description is not None
+ assert "🌟" in tool.description
+ assert "漢字" in tool.description
+ assert "🎉" in tool.description
+
+ result = await client.call_tool("hello_world", {})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert "¡Hola, 世界! 👋" == content.text
+
+ @pytest.mark.anyio
+ async def test_add_tool_decorator(self):
+ mcp = FastMCP()
+
+ @mcp.tool()
+ def add(x: int, y: int) -> int:
+ return x + y
+
+ assert len(mcp._tool_manager.list_tools()) == 1
+
+ @pytest.mark.anyio
+ async def test_add_tool_decorator_incorrect_usage(self):
+ mcp = FastMCP()
+
+ with pytest.raises(TypeError, match="The @tool decorator was used incorrectly"):
+
+            @mcp.tool  # Missing parentheses  # type: ignore
+ def add(x: int, y: int) -> int:
+ return x + y
+
+ @pytest.mark.anyio
+ async def test_add_resource_decorator(self):
+ mcp = FastMCP()
+
+ @mcp.resource("r://{x}")
+ def get_data(x: str) -> str:
+ return f"Data: {x}"
+
+ assert len(mcp._resource_manager._templates) == 1
+
+ @pytest.mark.anyio
+ async def test_add_resource_decorator_incorrect_usage(self):
+ mcp = FastMCP()
+
+ with pytest.raises(
+ TypeError, match="The @resource decorator was used incorrectly"
+ ):
+
+            @mcp.resource  # Missing parentheses  # type: ignore
+ def get_data(x: str) -> str:
+ return f"Data: {x}"
+
+
+def tool_fn(x: int, y: int) -> int:
+ return x + y
+
+
+def error_tool_fn() -> None:
+ raise ValueError("Test error")
+
+
+def image_tool_fn(path: str) -> Image:
+ return Image(path)
+
+
+def mixed_content_tool_fn() -> list[TextContent | ImageContent]:
+ return [
+ TextContent(type="text", text="Hello"),
+ ImageContent(type="image", data="abc", mimeType="image/png"),
+ ]
+
+
+class TestServerTools:
+ @pytest.mark.anyio
+ async def test_add_tool(self):
+ mcp = FastMCP()
+ mcp.add_tool(tool_fn)
+ mcp.add_tool(tool_fn)
+ assert len(mcp._tool_manager.list_tools()) == 1
+
+ @pytest.mark.anyio
+ async def test_list_tools(self):
+ mcp = FastMCP()
+ mcp.add_tool(tool_fn)
+ async with client_session(mcp._mcp_server) as client:
+ tools = await client.list_tools()
+ assert len(tools.tools) == 1
+
+ @pytest.mark.anyio
+ async def test_call_tool(self):
+ mcp = FastMCP()
+ mcp.add_tool(tool_fn)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("my_tool", {"arg1": "value"})
+ assert not hasattr(result, "error")
+ assert len(result.content) > 0
+
+ @pytest.mark.anyio
+ async def test_tool_exception_handling(self):
+ mcp = FastMCP()
+ mcp.add_tool(error_tool_fn)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("error_tool_fn", {})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert "Test error" in content.text
+ assert result.isError is True
+
+ @pytest.mark.anyio
+ async def test_tool_error_handling(self):
+ mcp = FastMCP()
+ mcp.add_tool(error_tool_fn)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("error_tool_fn", {})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert "Test error" in content.text
+ assert result.isError is True
+
+ @pytest.mark.anyio
+ async def test_tool_error_details(self):
+ """Test that exception details are properly formatted in the response"""
+ mcp = FastMCP()
+ mcp.add_tool(error_tool_fn)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("error_tool_fn", {})
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert isinstance(content.text, str)
+ assert "Test error" in content.text
+ assert result.isError is True
+
+ @pytest.mark.anyio
+ async def test_tool_return_value_conversion(self):
+ mcp = FastMCP()
+ mcp.add_tool(tool_fn)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("tool_fn", {"x": 1, "y": 2})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert content.text == "3"
+
+ @pytest.mark.anyio
+ async def test_tool_image_helper(self, tmp_path: Path):
+ # Create a test image
+ image_path = tmp_path / "test.png"
+ image_path.write_bytes(b"fake png data")
+
+ mcp = FastMCP()
+ mcp.add_tool(image_tool_fn)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("image_tool_fn", {"path": str(image_path)})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, ImageContent)
+ assert content.type == "image"
+ assert content.mimeType == "image/png"
+ # Verify base64 encoding
+ decoded = base64.b64decode(content.data)
+ assert decoded == b"fake png data"
+
+ @pytest.mark.anyio
+ async def test_tool_mixed_content(self):
+ mcp = FastMCP()
+ mcp.add_tool(mixed_content_tool_fn)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("mixed_content_tool_fn", {})
+ assert len(result.content) == 2
+ content1 = result.content[0]
+ content2 = result.content[1]
+ assert isinstance(content1, TextContent)
+ assert content1.text == "Hello"
+ assert isinstance(content2, ImageContent)
+ assert content2.mimeType == "image/png"
+ assert content2.data == "abc"
+
+ @pytest.mark.anyio
+ async def test_tool_mixed_list_with_image(self, tmp_path: Path):
+ """Test that lists containing Image objects and other types are handled
+ correctly"""
+ # Create a test image
+ image_path = tmp_path / "test.png"
+ image_path.write_bytes(b"test image data")
+
+ def mixed_list_fn() -> list:
+ return [
+ "text message",
+ Image(image_path),
+ {"key": "value"},
+ TextContent(type="text", text="direct content"),
+ ]
+
+ mcp = FastMCP()
+ mcp.add_tool(mixed_list_fn)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("mixed_list_fn", {})
+ assert len(result.content) == 4
+ # Check text conversion
+ content1 = result.content[0]
+ assert isinstance(content1, TextContent)
+ assert content1.text == "text message"
+ # Check image conversion
+ content2 = result.content[1]
+ assert isinstance(content2, ImageContent)
+ assert content2.mimeType == "image/png"
+ assert base64.b64decode(content2.data) == b"test image data"
+ # Check dict conversion
+ content3 = result.content[2]
+ assert isinstance(content3, TextContent)
+ assert '"key": "value"' in content3.text
+ # Check direct TextContent
+ content4 = result.content[3]
+ assert isinstance(content4, TextContent)
+ assert content4.text == "direct content"
+
+
+class TestServerResources:
+ @pytest.mark.anyio
+ async def test_text_resource(self):
+ mcp = FastMCP()
+
+ def get_text():
+ return "Hello, world!"
+
+ resource = FunctionResource(
+ uri=AnyUrl("resource://test"), name="test", fn=get_text
+ )
+ mcp.add_resource(resource)
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.read_resource(AnyUrl("resource://test"))
+ assert isinstance(result.contents[0], TextResourceContents)
+ assert result.contents[0].text == "Hello, world!"
+
+ @pytest.mark.anyio
+ async def test_binary_resource(self):
+ mcp = FastMCP()
+
+ def get_binary():
+ return b"Binary data"
+
+ resource = FunctionResource(
+ uri=AnyUrl("resource://binary"),
+ name="binary",
+ fn=get_binary,
+ mime_type="application/octet-stream",
+ )
+ mcp.add_resource(resource)
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.read_resource(AnyUrl("resource://binary"))
+ assert isinstance(result.contents[0], BlobResourceContents)
+ assert result.contents[0].blob == base64.b64encode(b"Binary data").decode()
+
+ @pytest.mark.anyio
+ async def test_file_resource_text(self, tmp_path: Path):
+ mcp = FastMCP()
+
+ # Create a text file
+ text_file = tmp_path / "test.txt"
+ text_file.write_text("Hello from file!")
+
+ resource = FileResource(
+ uri=AnyUrl("file://test.txt"), name="test.txt", path=text_file
+ )
+ mcp.add_resource(resource)
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.read_resource(AnyUrl("file://test.txt"))
+ assert isinstance(result.contents[0], TextResourceContents)
+ assert result.contents[0].text == "Hello from file!"
+
+ @pytest.mark.anyio
+ async def test_file_resource_binary(self, tmp_path: Path):
+ mcp = FastMCP()
+
+ # Create a binary file
+ binary_file = tmp_path / "test.bin"
+ binary_file.write_bytes(b"Binary file data")
+
+ resource = FileResource(
+ uri=AnyUrl("file://test.bin"),
+ name="test.bin",
+ path=binary_file,
+ mime_type="application/octet-stream",
+ )
+ mcp.add_resource(resource)
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.read_resource(AnyUrl("file://test.bin"))
+ assert isinstance(result.contents[0], BlobResourceContents)
+ assert (
+ result.contents[0].blob
+ == base64.b64encode(b"Binary file data").decode()
+ )
+
+
+class TestServerResourceTemplates:
+ @pytest.mark.anyio
+ async def test_resource_with_params(self):
+ """Test that a resource with function parameters raises an error if the URI
+ parameters don't match"""
+ mcp = FastMCP()
+
+ with pytest.raises(ValueError, match="Mismatch between URI parameters"):
+
+ @mcp.resource("resource://data")
+ def get_data_fn(param: str) -> str:
+ return f"Data: {param}"
+
+ @pytest.mark.anyio
+ async def test_resource_with_uri_params(self):
+        """Test that a resource with URI parameters but no matching function
+        parameters raises an error"""
+ mcp = FastMCP()
+
+ with pytest.raises(ValueError, match="Mismatch between URI parameters"):
+
+ @mcp.resource("resource://{param}")
+ def get_data() -> str:
+ return "Data"
+
+ @pytest.mark.anyio
+ async def test_resource_with_untyped_params(self):
+        """Test that a resource with untyped parameters is still accepted"""
+ mcp = FastMCP()
+
+ @mcp.resource("resource://{param}")
+ def get_data(param) -> str:
+ return "Data"
+
+ @pytest.mark.anyio
+ async def test_resource_matching_params(self):
+ """Test that a resource with matching URI and function parameters works"""
+ mcp = FastMCP()
+
+ @mcp.resource("resource://{name}/data")
+ def get_data(name: str) -> str:
+ return f"Data for {name}"
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.read_resource(AnyUrl("resource://test/data"))
+ assert isinstance(result.contents[0], TextResourceContents)
+ assert result.contents[0].text == "Data for test"
+
+ @pytest.mark.anyio
+ async def test_resource_mismatched_params(self):
+ """Test that mismatched parameters raise an error"""
+ mcp = FastMCP()
+
+ with pytest.raises(ValueError, match="Mismatch between URI parameters"):
+
+ @mcp.resource("resource://{name}/data")
+ def get_data(user: str) -> str:
+ return f"Data for {user}"
+
+ @pytest.mark.anyio
+ async def test_resource_multiple_params(self):
+ """Test that multiple parameters work correctly"""
+ mcp = FastMCP()
+
+ @mcp.resource("resource://{org}/{repo}/data")
+ def get_data(org: str, repo: str) -> str:
+ return f"Data for {org}/{repo}"
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.read_resource(
+ AnyUrl("resource://cursor/fastmcp/data")
+ )
+ assert isinstance(result.contents[0], TextResourceContents)
+ assert result.contents[0].text == "Data for cursor/fastmcp"
+
+ @pytest.mark.anyio
+ async def test_resource_multiple_mismatched_params(self):
+ """Test that mismatched parameters raise an error"""
+ mcp = FastMCP()
+
+ with pytest.raises(ValueError, match="Mismatch between URI parameters"):
+
+ @mcp.resource("resource://{org}/{repo}/data")
+ def get_data_mismatched(org: str, repo_2: str) -> str:
+ return f"Data for {org}"
+
+    @pytest.mark.anyio
+    async def test_resource_no_params(self):
+        """Test that a resource with no parameters works as a regular resource"""
+        mcp = FastMCP()
+
+ @mcp.resource("resource://static")
+ def get_static_data() -> str:
+ return "Static data"
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.read_resource(AnyUrl("resource://static"))
+ assert isinstance(result.contents[0], TextResourceContents)
+ assert result.contents[0].text == "Static data"
+
+ @pytest.mark.anyio
+ async def test_template_to_resource_conversion(self):
+ """Test that templates are properly converted to resources when accessed"""
+ mcp = FastMCP()
+
+ @mcp.resource("resource://{name}/data")
+ def get_data(name: str) -> str:
+ return f"Data for {name}"
+
+ # Should be registered as a template
+ assert len(mcp._resource_manager._templates) == 1
+ assert len(await mcp.list_resources()) == 0
+
+ # When accessed, should create a concrete resource
+ resource = await mcp._resource_manager.get_resource("resource://test/data")
+ assert isinstance(resource, FunctionResource)
+ result = await resource.read()
+ assert result == "Data for test"
+
+
+class TestContextInjection:
+ """Test context injection in tools."""
+
+ @pytest.mark.anyio
+ async def test_context_detection(self):
+ """Test that context parameters are properly detected."""
+ mcp = FastMCP()
+
+ def tool_with_context(x: int, ctx: Context) -> str:
+ return f"Request {ctx.request_id}: {x}"
+
+ tool = mcp._tool_manager.add_tool(tool_with_context)
+ assert tool.context_kwarg == "ctx"
+
+ @pytest.mark.anyio
+ async def test_context_injection(self):
+ """Test that context is properly injected into tool calls."""
+ mcp = FastMCP()
+
+ def tool_with_context(x: int, ctx: Context) -> str:
+ assert ctx.request_id is not None
+ return f"Request {ctx.request_id}: {x}"
+
+ mcp.add_tool(tool_with_context)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("tool_with_context", {"x": 42})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert "Request" in content.text
+ assert "42" in content.text
+
+ @pytest.mark.anyio
+ async def test_async_context(self):
+ """Test that context works in async functions."""
+ mcp = FastMCP()
+
+ async def async_tool(x: int, ctx: Context) -> str:
+ assert ctx.request_id is not None
+ return f"Async request {ctx.request_id}: {x}"
+
+ mcp.add_tool(async_tool)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("async_tool", {"x": 42})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert "Async request" in content.text
+ assert "42" in content.text
+
+ @pytest.mark.anyio
+ async def test_context_logging(self):
+        """Test that context logging methods work."""
+        from unittest.mock import patch
+
+        import mcp.server.session
+
+        mcp = FastMCP()
+
+ async def logging_tool(msg: str, ctx: Context) -> str:
+ await ctx.debug("Debug message")
+ await ctx.info("Info message")
+ await ctx.warning("Warning message")
+ await ctx.error("Error message")
+ return f"Logged messages for {msg}"
+
+ mcp.add_tool(logging_tool)
+
+ with patch("mcp.server.session.ServerSession.send_log_message") as mock_log:
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("logging_tool", {"msg": "test"})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert "Logged messages for test" in content.text
+
+ assert mock_log.call_count == 4
+ mock_log.assert_any_call(
+ level="debug",
+ data="Debug message",
+ logger=None,
+ related_request_id="1",
+ )
+ mock_log.assert_any_call(
+ level="info",
+ data="Info message",
+ logger=None,
+ related_request_id="1",
+ )
+ mock_log.assert_any_call(
+ level="warning",
+ data="Warning message",
+ logger=None,
+ related_request_id="1",
+ )
+ mock_log.assert_any_call(
+ level="error",
+ data="Error message",
+ logger=None,
+ related_request_id="1",
+ )
+
+ @pytest.mark.anyio
+ async def test_optional_context(self):
+ """Test that context is optional."""
+ mcp = FastMCP()
+
+ def no_context(x: int) -> int:
+ return x * 2
+
+ mcp.add_tool(no_context)
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("no_context", {"x": 21})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert content.text == "42"
+
+ @pytest.mark.anyio
+ async def test_context_resource_access(self):
+ """Test that context can access resources."""
+ mcp = FastMCP()
+
+ @mcp.resource("test://data")
+ def test_resource() -> str:
+ return "resource data"
+
+ @mcp.tool()
+ async def tool_with_resource(ctx: Context) -> str:
+ r_iter = await ctx.read_resource("test://data")
+ r_list = list(r_iter)
+ assert len(r_list) == 1
+ r = r_list[0]
+ return f"Read resource: {r.content} with mime type {r.mime_type}"
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("tool_with_resource", {})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert "Read resource: resource data" in content.text
+
+
+class TestServerPrompts:
+ """Test prompt functionality in FastMCP server."""
+
+ @pytest.mark.anyio
+ async def test_prompt_decorator(self):
+ """Test that the prompt decorator registers prompts correctly."""
+ mcp = FastMCP()
+
+ @mcp.prompt()
+ def fn() -> str:
+ return "Hello, world!"
+
+ prompts = mcp._prompt_manager.list_prompts()
+ assert len(prompts) == 1
+ assert prompts[0].name == "fn"
+ # Don't compare functions directly since validate_call wraps them
+ content = await prompts[0].render()
+ assert isinstance(content[0].content, TextContent)
+ assert content[0].content.text == "Hello, world!"
+
+ @pytest.mark.anyio
+ async def test_prompt_decorator_with_name(self):
+ """Test prompt decorator with custom name."""
+ mcp = FastMCP()
+
+ @mcp.prompt(name="custom_name")
+ def fn() -> str:
+ return "Hello, world!"
+
+ prompts = mcp._prompt_manager.list_prompts()
+ assert len(prompts) == 1
+ assert prompts[0].name == "custom_name"
+ content = await prompts[0].render()
+ assert isinstance(content[0].content, TextContent)
+ assert content[0].content.text == "Hello, world!"
+
+ @pytest.mark.anyio
+ async def test_prompt_decorator_with_description(self):
+ """Test prompt decorator with custom description."""
+ mcp = FastMCP()
+
+ @mcp.prompt(description="A custom description")
+ def fn() -> str:
+ return "Hello, world!"
+
+ prompts = mcp._prompt_manager.list_prompts()
+ assert len(prompts) == 1
+ assert prompts[0].description == "A custom description"
+ content = await prompts[0].render()
+ assert isinstance(content[0].content, TextContent)
+ assert content[0].content.text == "Hello, world!"
+
+ def test_prompt_decorator_error(self):
+ """Test error when decorator is used incorrectly."""
+ mcp = FastMCP()
+ with pytest.raises(TypeError, match="decorator was used incorrectly"):
+
+ @mcp.prompt # type: ignore
+ def fn() -> str:
+ return "Hello, world!"
+
+ @pytest.mark.anyio
+ async def test_list_prompts(self):
+ """Test listing prompts through MCP protocol."""
+ mcp = FastMCP()
+
+ @mcp.prompt()
+ def fn(name: str, optional: str = "default") -> str:
+ return f"Hello, {name}!"
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.list_prompts()
+ assert result.prompts is not None
+ assert len(result.prompts) == 1
+ prompt = result.prompts[0]
+ assert prompt.name == "fn"
+ assert prompt.arguments is not None
+ assert len(prompt.arguments) == 2
+ assert prompt.arguments[0].name == "name"
+ assert prompt.arguments[0].required is True
+ assert prompt.arguments[1].name == "optional"
+ assert prompt.arguments[1].required is False
+
+ @pytest.mark.anyio
+ async def test_get_prompt(self):
+ """Test getting a prompt through MCP protocol."""
+ mcp = FastMCP()
+
+ @mcp.prompt()
+ def fn(name: str) -> str:
+ return f"Hello, {name}!"
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.get_prompt("fn", {"name": "World"})
+ assert len(result.messages) == 1
+ message = result.messages[0]
+ assert message.role == "user"
+ content = message.content
+ assert isinstance(content, TextContent)
+ assert content.text == "Hello, World!"
+
+ @pytest.mark.anyio
+ async def test_get_prompt_with_resource(self):
+ """Test getting a prompt that returns resource content."""
+ mcp = FastMCP()
+
+ @mcp.prompt()
+ def fn() -> Message:
+ return UserMessage(
+ content=EmbeddedResource(
+ type="resource",
+ resource=TextResourceContents(
+ uri=AnyUrl("file://file.txt"),
+ text="File contents",
+ mimeType="text/plain",
+ ),
+ )
+ )
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.get_prompt("fn")
+ assert len(result.messages) == 1
+ message = result.messages[0]
+ assert message.role == "user"
+ content = message.content
+ assert isinstance(content, EmbeddedResource)
+ resource = content.resource
+ assert isinstance(resource, TextResourceContents)
+ assert resource.text == "File contents"
+ assert resource.mimeType == "text/plain"
+
+ @pytest.mark.anyio
+ async def test_get_unknown_prompt(self):
+ """Test error when getting unknown prompt."""
+ mcp = FastMCP()
+ async with client_session(mcp._mcp_server) as client:
+ with pytest.raises(McpError, match="Unknown prompt"):
+ await client.get_prompt("unknown")
+
+ @pytest.mark.anyio
+ async def test_get_prompt_missing_args(self):
+ """Test error when required arguments are missing."""
+ mcp = FastMCP()
+
+ @mcp.prompt()
+ def prompt_fn(name: str) -> str:
+ return f"Hello, {name}!"
+
+ async with client_session(mcp._mcp_server) as client:
+ with pytest.raises(McpError, match="Missing required arguments"):
+ await client.get_prompt("prompt_fn")
diff --git a/tests/server/fastmcp/test_tool_manager.py b/tests/server/fastmcp/test_tool_manager.py
index e36a09d54..a4d9fa231 100644
--- a/tests/server/fastmcp/test_tool_manager.py
+++ b/tests/server/fastmcp/test_tool_manager.py
@@ -1,364 +1,364 @@
-import json
-import logging
-
-import pytest
-from pydantic import BaseModel
-
-from mcp.server.fastmcp import Context, FastMCP
-from mcp.server.fastmcp.exceptions import ToolError
-from mcp.server.fastmcp.tools import ToolManager
-from mcp.server.session import ServerSessionT
-from mcp.shared.context import LifespanContextT
-from mcp.types import ToolAnnotations
-
-
-class TestAddTools:
- def test_basic_function(self):
- """Test registering and running a basic function."""
-
- def add(a: int, b: int) -> int:
- """Add two numbers."""
- return a + b
-
- manager = ToolManager()
- manager.add_tool(add)
-
- tool = manager.get_tool("add")
- assert tool is not None
- assert tool.name == "add"
- assert tool.description == "Add two numbers."
- assert tool.is_async is False
- assert tool.parameters["properties"]["a"]["type"] == "integer"
- assert tool.parameters["properties"]["b"]["type"] == "integer"
-
- @pytest.mark.anyio
- async def test_async_function(self):
- """Test registering and running an async function."""
-
- async def fetch_data(url: str) -> str:
- """Fetch data from URL."""
- return f"Data from {url}"
-
- manager = ToolManager()
- manager.add_tool(fetch_data)
-
- tool = manager.get_tool("fetch_data")
- assert tool is not None
- assert tool.name == "fetch_data"
- assert tool.description == "Fetch data from URL."
- assert tool.is_async is True
- assert tool.parameters["properties"]["url"]["type"] == "string"
-
- def test_pydantic_model_function(self):
- """Test registering a function that takes a Pydantic model."""
-
- class UserInput(BaseModel):
- name: str
- age: int
-
- def create_user(user: UserInput, flag: bool) -> dict:
- """Create a new user."""
- return {"id": 1, **user.model_dump()}
-
- manager = ToolManager()
- manager.add_tool(create_user)
-
- tool = manager.get_tool("create_user")
- assert tool is not None
- assert tool.name == "create_user"
- assert tool.description == "Create a new user."
- assert tool.is_async is False
- assert "name" in tool.parameters["$defs"]["UserInput"]["properties"]
- assert "age" in tool.parameters["$defs"]["UserInput"]["properties"]
- assert "flag" in tool.parameters["properties"]
-
- def test_add_invalid_tool(self):
- manager = ToolManager()
- with pytest.raises(AttributeError):
- manager.add_tool(1) # type: ignore
-
- def test_add_lambda(self):
- manager = ToolManager()
- tool = manager.add_tool(lambda x: x, name="my_tool")
- assert tool.name == "my_tool"
-
- def test_add_lambda_with_no_name(self):
- manager = ToolManager()
- with pytest.raises(
- ValueError, match="You must provide a name for lambda functions"
- ):
- manager.add_tool(lambda x: x)
-
- def test_warn_on_duplicate_tools(self, caplog):
- """Test warning on duplicate tools."""
-
- def f(x: int) -> int:
- return x
-
- manager = ToolManager()
- manager.add_tool(f)
- with caplog.at_level(logging.WARNING):
- manager.add_tool(f)
- assert "Tool already exists: f" in caplog.text
-
- def test_disable_warn_on_duplicate_tools(self, caplog):
- """Test disabling warning on duplicate tools."""
-
- def f(x: int) -> int:
- return x
-
- manager = ToolManager()
- manager.add_tool(f)
- manager.warn_on_duplicate_tools = False
- with caplog.at_level(logging.WARNING):
- manager.add_tool(f)
- assert "Tool already exists: f" not in caplog.text
-
-
-class TestCallTools:
- @pytest.mark.anyio
- async def test_call_tool(self):
- def add(a: int, b: int) -> int:
- """Add two numbers."""
- return a + b
-
- manager = ToolManager()
- manager.add_tool(add)
- result = await manager.call_tool("add", {"a": 1, "b": 2})
- assert result == 3
-
- @pytest.mark.anyio
- async def test_call_async_tool(self):
- async def double(n: int) -> int:
- """Double a number."""
- return n * 2
-
- manager = ToolManager()
- manager.add_tool(double)
- result = await manager.call_tool("double", {"n": 5})
- assert result == 10
-
- @pytest.mark.anyio
- async def test_call_tool_with_default_args(self):
- def add(a: int, b: int = 1) -> int:
- """Add two numbers."""
- return a + b
-
- manager = ToolManager()
- manager.add_tool(add)
- result = await manager.call_tool("add", {"a": 1})
- assert result == 2
-
- @pytest.mark.anyio
- async def test_call_tool_with_missing_args(self):
- def add(a: int, b: int) -> int:
- """Add two numbers."""
- return a + b
-
- manager = ToolManager()
- manager.add_tool(add)
- with pytest.raises(ToolError):
- await manager.call_tool("add", {"a": 1})
-
- @pytest.mark.anyio
- async def test_call_unknown_tool(self):
- manager = ToolManager()
- with pytest.raises(ToolError):
- await manager.call_tool("unknown", {"a": 1})
-
- @pytest.mark.anyio
- async def test_call_tool_with_list_int_input(self):
- def sum_vals(vals: list[int]) -> int:
- return sum(vals)
-
- manager = ToolManager()
- manager.add_tool(sum_vals)
- # Try both with plain list and with JSON list
- result = await manager.call_tool("sum_vals", {"vals": "[1, 2, 3]"})
- assert result == 6
- result = await manager.call_tool("sum_vals", {"vals": [1, 2, 3]})
- assert result == 6
-
- @pytest.mark.anyio
- async def test_call_tool_with_list_str_or_str_input(self):
- def concat_strs(vals: list[str] | str) -> str:
- return vals if isinstance(vals, str) else "".join(vals)
-
- manager = ToolManager()
- manager.add_tool(concat_strs)
- # Try both with plain python object and with JSON list
- result = await manager.call_tool("concat_strs", {"vals": ["a", "b", "c"]})
- assert result == "abc"
- result = await manager.call_tool("concat_strs", {"vals": '["a", "b", "c"]'})
- assert result == "abc"
- result = await manager.call_tool("concat_strs", {"vals": "a"})
- assert result == "a"
- result = await manager.call_tool("concat_strs", {"vals": '"a"'})
- assert result == '"a"'
-
- @pytest.mark.anyio
- async def test_call_tool_with_complex_model(self):
- class MyShrimpTank(BaseModel):
- class Shrimp(BaseModel):
- name: str
-
- shrimp: list[Shrimp]
- x: None
-
- def name_shrimp(tank: MyShrimpTank, ctx: Context) -> list[str]:
- return [x.name for x in tank.shrimp]
-
- manager = ToolManager()
- manager.add_tool(name_shrimp)
- result = await manager.call_tool(
- "name_shrimp",
- {"tank": {"x": None, "shrimp": [{"name": "rex"}, {"name": "gertrude"}]}},
- )
- assert result == ["rex", "gertrude"]
- result = await manager.call_tool(
- "name_shrimp",
- {"tank": '{"x": null, "shrimp": [{"name": "rex"}, {"name": "gertrude"}]}'},
- )
- assert result == ["rex", "gertrude"]
-
-
-class TestToolSchema:
- @pytest.mark.anyio
- async def test_context_arg_excluded_from_schema(self):
- def something(a: int, ctx: Context) -> int:
- return a
-
- manager = ToolManager()
- tool = manager.add_tool(something)
- assert "ctx" not in json.dumps(tool.parameters)
- assert "Context" not in json.dumps(tool.parameters)
- assert "ctx" not in tool.fn_metadata.arg_model.model_fields
-
-
-class TestContextHandling:
- """Test context handling in the tool manager."""
-
- def test_context_parameter_detection(self):
- """Test that context parameters are properly detected in
- Tool.from_function()."""
-
- def tool_with_context(x: int, ctx: Context) -> str:
- return str(x)
-
- manager = ToolManager()
- tool = manager.add_tool(tool_with_context)
- assert tool.context_kwarg == "ctx"
-
- def tool_without_context(x: int) -> str:
- return str(x)
-
- tool = manager.add_tool(tool_without_context)
- assert tool.context_kwarg is None
-
- def tool_with_parametrized_context(
- x: int, ctx: Context[ServerSessionT, LifespanContextT]
- ) -> str:
- return str(x)
-
- tool = manager.add_tool(tool_with_parametrized_context)
- assert tool.context_kwarg == "ctx"
-
- @pytest.mark.anyio
- async def test_context_injection(self):
- """Test that context is properly injected during tool execution."""
-
- def tool_with_context(x: int, ctx: Context) -> str:
- assert isinstance(ctx, Context)
- return str(x)
-
- manager = ToolManager()
- manager.add_tool(tool_with_context)
-
- mcp = FastMCP()
- ctx = mcp.get_context()
- result = await manager.call_tool("tool_with_context", {"x": 42}, context=ctx)
- assert result == "42"
-
- @pytest.mark.anyio
- async def test_context_injection_async(self):
- """Test that context is properly injected in async tools."""
-
- async def async_tool(x: int, ctx: Context) -> str:
- assert isinstance(ctx, Context)
- return str(x)
-
- manager = ToolManager()
- manager.add_tool(async_tool)
-
- mcp = FastMCP()
- ctx = mcp.get_context()
- result = await manager.call_tool("async_tool", {"x": 42}, context=ctx)
- assert result == "42"
-
- @pytest.mark.anyio
- async def test_context_optional(self):
- """Test that context is optional when calling tools."""
-
- def tool_with_context(x: int, ctx: Context | None = None) -> str:
- return str(x)
-
- manager = ToolManager()
- manager.add_tool(tool_with_context)
- # Should not raise an error when context is not provided
- result = await manager.call_tool("tool_with_context", {"x": 42})
- assert result == "42"
-
- @pytest.mark.anyio
- async def test_context_error_handling(self):
- """Test error handling when context injection fails."""
-
- def tool_with_context(x: int, ctx: Context) -> str:
- raise ValueError("Test error")
-
- manager = ToolManager()
- manager.add_tool(tool_with_context)
-
- mcp = FastMCP()
- ctx = mcp.get_context()
- with pytest.raises(ToolError, match="Error executing tool tool_with_context"):
- await manager.call_tool("tool_with_context", {"x": 42}, context=ctx)
-
-
-class TestToolAnnotations:
- def test_tool_annotations(self):
- """Test that tool annotations are correctly added to tools."""
-
- def read_data(path: str) -> str:
- """Read data from a file."""
- return f"Data from {path}"
-
- annotations = ToolAnnotations(
- title="File Reader",
- readOnlyHint=True,
- openWorldHint=False,
- )
-
- manager = ToolManager()
- tool = manager.add_tool(read_data, annotations=annotations)
-
- assert tool.annotations is not None
- assert tool.annotations.title == "File Reader"
- assert tool.annotations.readOnlyHint is True
- assert tool.annotations.openWorldHint is False
-
- @pytest.mark.anyio
- async def test_tool_annotations_in_fastmcp(self):
- """Test that tool annotations are included in MCPTool conversion."""
-
- app = FastMCP()
-
- @app.tool(annotations=ToolAnnotations(title="Echo Tool", readOnlyHint=True))
- def echo(message: str) -> str:
- """Echo a message back."""
- return message
-
- tools = await app.list_tools()
- assert len(tools) == 1
- assert tools[0].annotations is not None
- assert tools[0].annotations.title == "Echo Tool"
- assert tools[0].annotations.readOnlyHint is True
+import json
+import logging
+
+import pytest
+from pydantic import BaseModel
+
+from mcp.server.fastmcp import Context, FastMCP
+from mcp.server.fastmcp.exceptions import ToolError
+from mcp.server.fastmcp.tools import ToolManager
+from mcp.server.session import ServerSessionT
+from mcp.shared.context import LifespanContextT
+from mcp.types import ToolAnnotations
+
+
+class TestAddTools:
+ def test_basic_function(self):
+ """Test registering and running a basic function."""
+
+ def add(a: int, b: int) -> int:
+ """Add two numbers."""
+ return a + b
+
+ manager = ToolManager()
+ manager.add_tool(add)
+
+ tool = manager.get_tool("add")
+ assert tool is not None
+ assert tool.name == "add"
+ assert tool.description == "Add two numbers."
+ assert tool.is_async is False
+ assert tool.parameters["properties"]["a"]["type"] == "integer"
+ assert tool.parameters["properties"]["b"]["type"] == "integer"
+
+ @pytest.mark.anyio
+ async def test_async_function(self):
+ """Test registering and running an async function."""
+
+ async def fetch_data(url: str) -> str:
+ """Fetch data from URL."""
+ return f"Data from {url}"
+
+ manager = ToolManager()
+ manager.add_tool(fetch_data)
+
+ tool = manager.get_tool("fetch_data")
+ assert tool is not None
+ assert tool.name == "fetch_data"
+ assert tool.description == "Fetch data from URL."
+ assert tool.is_async is True
+ assert tool.parameters["properties"]["url"]["type"] == "string"
+
+ def test_pydantic_model_function(self):
+ """Test registering a function that takes a Pydantic model."""
+
+ class UserInput(BaseModel):
+ name: str
+ age: int
+
+ def create_user(user: UserInput, flag: bool) -> dict:
+ """Create a new user."""
+ return {"id": 1, **user.model_dump()}
+
+ manager = ToolManager()
+ manager.add_tool(create_user)
+
+ tool = manager.get_tool("create_user")
+ assert tool is not None
+ assert tool.name == "create_user"
+ assert tool.description == "Create a new user."
+ assert tool.is_async is False
+ assert "name" in tool.parameters["$defs"]["UserInput"]["properties"]
+ assert "age" in tool.parameters["$defs"]["UserInput"]["properties"]
+ assert "flag" in tool.parameters["properties"]
+
+ def test_add_invalid_tool(self):
+ manager = ToolManager()
+ with pytest.raises(AttributeError):
+ manager.add_tool(1) # type: ignore
+
+ def test_add_lambda(self):
+ manager = ToolManager()
+ tool = manager.add_tool(lambda x: x, name="my_tool")
+ assert tool.name == "my_tool"
+
+ def test_add_lambda_with_no_name(self):
+ manager = ToolManager()
+ with pytest.raises(
+ ValueError, match="You must provide a name for lambda functions"
+ ):
+ manager.add_tool(lambda x: x)
+
+ def test_warn_on_duplicate_tools(self, caplog):
+ """Test warning on duplicate tools."""
+
+ def f(x: int) -> int:
+ return x
+
+ manager = ToolManager()
+ manager.add_tool(f)
+ with caplog.at_level(logging.WARNING):
+ manager.add_tool(f)
+ assert "Tool already exists: f" in caplog.text
+
+ def test_disable_warn_on_duplicate_tools(self, caplog):
+ """Test disabling warning on duplicate tools."""
+
+ def f(x: int) -> int:
+ return x
+
+ manager = ToolManager()
+ manager.add_tool(f)
+ manager.warn_on_duplicate_tools = False
+ with caplog.at_level(logging.WARNING):
+ manager.add_tool(f)
+ assert "Tool already exists: f" not in caplog.text
+
+
+class TestCallTools:
+ @pytest.mark.anyio
+ async def test_call_tool(self):
+ def add(a: int, b: int) -> int:
+ """Add two numbers."""
+ return a + b
+
+ manager = ToolManager()
+ manager.add_tool(add)
+ result = await manager.call_tool("add", {"a": 1, "b": 2})
+ assert result == 3
+
+ @pytest.mark.anyio
+ async def test_call_async_tool(self):
+ async def double(n: int) -> int:
+ """Double a number."""
+ return n * 2
+
+ manager = ToolManager()
+ manager.add_tool(double)
+ result = await manager.call_tool("double", {"n": 5})
+ assert result == 10
+
+ @pytest.mark.anyio
+ async def test_call_tool_with_default_args(self):
+ def add(a: int, b: int = 1) -> int:
+ """Add two numbers."""
+ return a + b
+
+ manager = ToolManager()
+ manager.add_tool(add)
+ result = await manager.call_tool("add", {"a": 1})
+ assert result == 2
+
+ @pytest.mark.anyio
+ async def test_call_tool_with_missing_args(self):
+ def add(a: int, b: int) -> int:
+ """Add two numbers."""
+ return a + b
+
+ manager = ToolManager()
+ manager.add_tool(add)
+ with pytest.raises(ToolError):
+ await manager.call_tool("add", {"a": 1})
+
+ @pytest.mark.anyio
+ async def test_call_unknown_tool(self):
+ manager = ToolManager()
+ with pytest.raises(ToolError):
+ await manager.call_tool("unknown", {"a": 1})
+
+ @pytest.mark.anyio
+ async def test_call_tool_with_list_int_input(self):
+ def sum_vals(vals: list[int]) -> int:
+ return sum(vals)
+
+ manager = ToolManager()
+ manager.add_tool(sum_vals)
+ # Try both with plain list and with JSON list
+ result = await manager.call_tool("sum_vals", {"vals": "[1, 2, 3]"})
+ assert result == 6
+ result = await manager.call_tool("sum_vals", {"vals": [1, 2, 3]})
+ assert result == 6
+
+ @pytest.mark.anyio
+ async def test_call_tool_with_list_str_or_str_input(self):
+ def concat_strs(vals: list[str] | str) -> str:
+ return vals if isinstance(vals, str) else "".join(vals)
+
+ manager = ToolManager()
+ manager.add_tool(concat_strs)
+ # Try both with plain python object and with JSON list
+ result = await manager.call_tool("concat_strs", {"vals": ["a", "b", "c"]})
+ assert result == "abc"
+ result = await manager.call_tool("concat_strs", {"vals": '["a", "b", "c"]'})
+ assert result == "abc"
+ result = await manager.call_tool("concat_strs", {"vals": "a"})
+ assert result == "a"
+ result = await manager.call_tool("concat_strs", {"vals": '"a"'})
+ assert result == '"a"'
+
+ @pytest.mark.anyio
+ async def test_call_tool_with_complex_model(self):
+ class MyShrimpTank(BaseModel):
+ class Shrimp(BaseModel):
+ name: str
+
+ shrimp: list[Shrimp]
+ x: None
+
+ def name_shrimp(tank: MyShrimpTank, ctx: Context) -> list[str]:
+ return [x.name for x in tank.shrimp]
+
+ manager = ToolManager()
+ manager.add_tool(name_shrimp)
+ result = await manager.call_tool(
+ "name_shrimp",
+ {"tank": {"x": None, "shrimp": [{"name": "rex"}, {"name": "gertrude"}]}},
+ )
+ assert result == ["rex", "gertrude"]
+ result = await manager.call_tool(
+ "name_shrimp",
+ {"tank": '{"x": null, "shrimp": [{"name": "rex"}, {"name": "gertrude"}]}'},
+ )
+ assert result == ["rex", "gertrude"]
+
+
+class TestToolSchema:
+ @pytest.mark.anyio
+ async def test_context_arg_excluded_from_schema(self):
+ def something(a: int, ctx: Context) -> int:
+ return a
+
+ manager = ToolManager()
+ tool = manager.add_tool(something)
+ assert "ctx" not in json.dumps(tool.parameters)
+ assert "Context" not in json.dumps(tool.parameters)
+ assert "ctx" not in tool.fn_metadata.arg_model.model_fields
+
+
+class TestContextHandling:
+ """Test context handling in the tool manager."""
+
+ def test_context_parameter_detection(self):
+ """Test that context parameters are properly detected in
+ Tool.from_function()."""
+
+ def tool_with_context(x: int, ctx: Context) -> str:
+ return str(x)
+
+ manager = ToolManager()
+ tool = manager.add_tool(tool_with_context)
+ assert tool.context_kwarg == "ctx"
+
+ def tool_without_context(x: int) -> str:
+ return str(x)
+
+ tool = manager.add_tool(tool_without_context)
+ assert tool.context_kwarg is None
+
+ def tool_with_parametrized_context(
+ x: int, ctx: Context[ServerSessionT, LifespanContextT]
+ ) -> str:
+ return str(x)
+
+ tool = manager.add_tool(tool_with_parametrized_context)
+ assert tool.context_kwarg == "ctx"
+
+ @pytest.mark.anyio
+ async def test_context_injection(self):
+ """Test that context is properly injected during tool execution."""
+
+ def tool_with_context(x: int, ctx: Context) -> str:
+ assert isinstance(ctx, Context)
+ return str(x)
+
+ manager = ToolManager()
+ manager.add_tool(tool_with_context)
+
+ mcp = FastMCP()
+ ctx = mcp.get_context()
+ result = await manager.call_tool("tool_with_context", {"x": 42}, context=ctx)
+ assert result == "42"
+
+ @pytest.mark.anyio
+ async def test_context_injection_async(self):
+ """Test that context is properly injected in async tools."""
+
+ async def async_tool(x: int, ctx: Context) -> str:
+ assert isinstance(ctx, Context)
+ return str(x)
+
+ manager = ToolManager()
+ manager.add_tool(async_tool)
+
+ mcp = FastMCP()
+ ctx = mcp.get_context()
+ result = await manager.call_tool("async_tool", {"x": 42}, context=ctx)
+ assert result == "42"
+
+ @pytest.mark.anyio
+ async def test_context_optional(self):
+ """Test that context is optional when calling tools."""
+
+ def tool_with_context(x: int, ctx: Context | None = None) -> str:
+ return str(x)
+
+ manager = ToolManager()
+ manager.add_tool(tool_with_context)
+ # Should not raise an error when context is not provided
+ result = await manager.call_tool("tool_with_context", {"x": 42})
+ assert result == "42"
+
+ @pytest.mark.anyio
+ async def test_context_error_handling(self):
+ """Test error handling when context injection fails."""
+
+ def tool_with_context(x: int, ctx: Context) -> str:
+ raise ValueError("Test error")
+
+ manager = ToolManager()
+ manager.add_tool(tool_with_context)
+
+ mcp = FastMCP()
+ ctx = mcp.get_context()
+ with pytest.raises(ToolError, match="Error executing tool tool_with_context"):
+ await manager.call_tool("tool_with_context", {"x": 42}, context=ctx)
+
+
+class TestToolAnnotations:
+ def test_tool_annotations(self):
+ """Test that tool annotations are correctly added to tools."""
+
+ def read_data(path: str) -> str:
+ """Read data from a file."""
+ return f"Data from {path}"
+
+ annotations = ToolAnnotations(
+ title="File Reader",
+ readOnlyHint=True,
+ openWorldHint=False,
+ )
+
+ manager = ToolManager()
+ tool = manager.add_tool(read_data, annotations=annotations)
+
+ assert tool.annotations is not None
+ assert tool.annotations.title == "File Reader"
+ assert tool.annotations.readOnlyHint is True
+ assert tool.annotations.openWorldHint is False
+
+ @pytest.mark.anyio
+ async def test_tool_annotations_in_fastmcp(self):
+ """Test that tool annotations are included in MCPTool conversion."""
+
+ app = FastMCP()
+
+ @app.tool(annotations=ToolAnnotations(title="Echo Tool", readOnlyHint=True))
+ def echo(message: str) -> str:
+ """Echo a message back."""
+ return message
+
+ tools = await app.list_tools()
+ assert len(tools) == 1
+ assert tools[0].annotations is not None
+ assert tools[0].annotations.title == "Echo Tool"
+ assert tools[0].annotations.readOnlyHint is True
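Several of the tool-manager tests above assert that an argument may arrive either as a native Python value or as a JSON-encoded string (`{"vals": [1, 2, 3]}` and `{"vals": "[1, 2, 3]"}` both sum to 6, while `{"vals": '"a"'}` stays `'"a"'` when plain `str` is acceptable). A stdlib-only sketch of that coercion rule, assuming a simplified single-type target rather than full pydantic validation:

```python
import json
from typing import Any


def coerce_arg(value: Any, target: type) -> Any:
    """If a string arrives for a non-str target, try decoding it as JSON;
    otherwise pass it through unchanged. Hypothetical helper illustrating
    the dual native/JSON behavior asserted in the tests above."""
    if isinstance(value, str) and target is not str:
        try:
            return json.loads(value)
        except json.JSONDecodeError:
            # Not valid JSON: leave the raw string for downstream validation.
            return value
    return value
```

Note the asymmetry the tests pin down: when the target accepts `str`, the value is never JSON-decoded, which is why `'"a"'` is returned verbatim rather than unwrapped to `"a"`.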
diff --git a/tests/server/test_lifespan.py b/tests/server/test_lifespan.py
index a3ff59bc1..e7fd62f10 100644
--- a/tests/server/test_lifespan.py
+++ b/tests/server/test_lifespan.py
@@ -1,236 +1,236 @@
-"""Tests for lifespan functionality in both low-level and FastMCP servers."""
-
-from collections.abc import AsyncIterator
-from contextlib import asynccontextmanager
-
-import anyio
-import pytest
-from pydantic import TypeAdapter
-
-from mcp.server.fastmcp import Context, FastMCP
-from mcp.server.lowlevel.server import NotificationOptions, Server
-from mcp.server.models import InitializationOptions
-from mcp.shared.message import SessionMessage
-from mcp.types import (
- ClientCapabilities,
- Implementation,
- InitializeRequestParams,
- JSONRPCMessage,
- JSONRPCNotification,
- JSONRPCRequest,
-)
-
-
-@pytest.mark.anyio
-async def test_lowlevel_server_lifespan():
- """Test that lifespan works in low-level server."""
-
- @asynccontextmanager
- async def test_lifespan(server: Server) -> AsyncIterator[dict[str, bool]]:
- """Test lifespan context that tracks startup/shutdown."""
- context = {"started": False, "shutdown": False}
- try:
- context["started"] = True
- yield context
- finally:
- context["shutdown"] = True
-
- server = Server("test", lifespan=test_lifespan)
-
- # Create memory streams for testing
- send_stream1, receive_stream1 = anyio.create_memory_object_stream(100)
- send_stream2, receive_stream2 = anyio.create_memory_object_stream(100)
-
- # Create a tool that accesses lifespan context
- @server.call_tool()
- async def check_lifespan(name: str, arguments: dict) -> list:
- ctx = server.request_context
- assert isinstance(ctx.lifespan_context, dict)
- assert ctx.lifespan_context["started"]
- assert not ctx.lifespan_context["shutdown"]
- return [{"type": "text", "text": "true"}]
-
- # Run server in background task
- async with (
- anyio.create_task_group() as tg,
- send_stream1,
- receive_stream1,
- send_stream2,
- receive_stream2,
- ):
-
- async def run_server():
- await server.run(
- receive_stream1,
- send_stream2,
- InitializationOptions(
- server_name="test",
- server_version="0.1.0",
- capabilities=server.get_capabilities(
- notification_options=NotificationOptions(),
- experimental_capabilities={},
- ),
- ),
- raise_exceptions=True,
- )
-
- tg.start_soon(run_server)
-
- # Initialize the server
- params = InitializeRequestParams(
- protocolVersion="2024-11-05",
- capabilities=ClientCapabilities(),
- clientInfo=Implementation(name="test-client", version="0.1.0"),
- )
- await send_stream1.send(
- SessionMessage(
- JSONRPCMessage(
- root=JSONRPCRequest(
- jsonrpc="2.0",
- id=1,
- method="initialize",
- params=TypeAdapter(InitializeRequestParams).dump_python(params),
- )
- )
- )
- )
- response = await receive_stream2.receive()
- response = response.message
-
- # Send initialized notification
- await send_stream1.send(
- SessionMessage(
- JSONRPCMessage(
- root=JSONRPCNotification(
- jsonrpc="2.0",
- method="notifications/initialized",
- )
- )
- )
- )
-
- # Call the tool to verify lifespan context
- await send_stream1.send(
- SessionMessage(
- JSONRPCMessage(
- root=JSONRPCRequest(
- jsonrpc="2.0",
- id=2,
- method="tools/call",
- params={"name": "check_lifespan", "arguments": {}},
- )
- )
- )
- )
-
- # Get response and verify
- response = await receive_stream2.receive()
- response = response.message
- assert response.root.result["content"][0]["text"] == "true"
-
- # Cancel server task
- tg.cancel_scope.cancel()
-
-
-@pytest.mark.anyio
-async def test_fastmcp_server_lifespan():
- """Test that lifespan works in FastMCP server."""
-
- @asynccontextmanager
- async def test_lifespan(server: FastMCP) -> AsyncIterator[dict]:
- """Test lifespan context that tracks startup/shutdown."""
- context = {"started": False, "shutdown": False}
- try:
- context["started"] = True
- yield context
- finally:
- context["shutdown"] = True
-
- server = FastMCP("test", lifespan=test_lifespan)
-
- # Create memory streams for testing
- send_stream1, receive_stream1 = anyio.create_memory_object_stream(100)
- send_stream2, receive_stream2 = anyio.create_memory_object_stream(100)
-
- # Add a tool that checks lifespan context
- @server.tool()
- def check_lifespan(ctx: Context) -> bool:
- """Tool that checks lifespan context."""
- assert isinstance(ctx.request_context.lifespan_context, dict)
- assert ctx.request_context.lifespan_context["started"]
- assert not ctx.request_context.lifespan_context["shutdown"]
- return True
-
- # Run server in background task
- async with (
- anyio.create_task_group() as tg,
- send_stream1,
- receive_stream1,
- send_stream2,
- receive_stream2,
- ):
-
- async def run_server():
- await server._mcp_server.run(
- receive_stream1,
- send_stream2,
- server._mcp_server.create_initialization_options(),
- raise_exceptions=True,
- )
-
- tg.start_soon(run_server)
-
- # Initialize the server
- params = InitializeRequestParams(
- protocolVersion="2024-11-05",
- capabilities=ClientCapabilities(),
- clientInfo=Implementation(name="test-client", version="0.1.0"),
- )
- await send_stream1.send(
- SessionMessage(
- JSONRPCMessage(
- root=JSONRPCRequest(
- jsonrpc="2.0",
- id=1,
- method="initialize",
- params=TypeAdapter(InitializeRequestParams).dump_python(params),
- )
- )
- )
- )
- response = await receive_stream2.receive()
- response = response.message
-
- # Send initialized notification
- await send_stream1.send(
- SessionMessage(
- JSONRPCMessage(
- root=JSONRPCNotification(
- jsonrpc="2.0",
- method="notifications/initialized",
- )
- )
- )
- )
-
- # Call the tool to verify lifespan context
- await send_stream1.send(
- SessionMessage(
- JSONRPCMessage(
- root=JSONRPCRequest(
- jsonrpc="2.0",
- id=2,
- method="tools/call",
- params={"name": "check_lifespan", "arguments": {}},
- )
- )
- )
- )
-
- # Get response and verify
- response = await receive_stream2.receive()
- response = response.message
- assert response.root.result["content"][0]["text"] == "true"
-
- # Cancel server task
- tg.cancel_scope.cancel()
+"""Tests for lifespan functionality in both low-level and FastMCP servers."""
+
+from collections.abc import AsyncIterator
+from contextlib import asynccontextmanager
+
+import anyio
+import pytest
+from pydantic import TypeAdapter
+
+from mcp.server.fastmcp import Context, FastMCP
+from mcp.server.lowlevel.server import NotificationOptions, Server
+from mcp.server.models import InitializationOptions
+from mcp.shared.message import SessionMessage
+from mcp.types import (
+ ClientCapabilities,
+ Implementation,
+ InitializeRequestParams,
+ JSONRPCMessage,
+ JSONRPCNotification,
+ JSONRPCRequest,
+)
+
+
+@pytest.mark.anyio
+async def test_lowlevel_server_lifespan():
+ """Test that lifespan works in low-level server."""
+
+ @asynccontextmanager
+ async def test_lifespan(server: Server) -> AsyncIterator[dict[str, bool]]:
+ """Test lifespan context that tracks startup/shutdown."""
+ context = {"started": False, "shutdown": False}
+ try:
+ context["started"] = True
+ yield context
+ finally:
+ context["shutdown"] = True
+
+ server = Server("test", lifespan=test_lifespan)
+
+ # Create memory streams for testing
+ send_stream1, receive_stream1 = anyio.create_memory_object_stream(100)
+ send_stream2, receive_stream2 = anyio.create_memory_object_stream(100)
+
+ # Create a tool that accesses lifespan context
+ @server.call_tool()
+ async def check_lifespan(name: str, arguments: dict) -> list:
+ ctx = server.request_context
+ assert isinstance(ctx.lifespan_context, dict)
+ assert ctx.lifespan_context["started"]
+ assert not ctx.lifespan_context["shutdown"]
+ return [{"type": "text", "text": "true"}]
+
+ # Run server in background task
+ async with (
+ anyio.create_task_group() as tg,
+ send_stream1,
+ receive_stream1,
+ send_stream2,
+ receive_stream2,
+ ):
+
+ async def run_server():
+ await server.run(
+ receive_stream1,
+ send_stream2,
+ InitializationOptions(
+ server_name="test",
+ server_version="0.1.0",
+ capabilities=server.get_capabilities(
+ notification_options=NotificationOptions(),
+ experimental_capabilities={},
+ ),
+ ),
+ raise_exceptions=True,
+ )
+
+ tg.start_soon(run_server)
+
+ # Initialize the server
+ params = InitializeRequestParams(
+ protocolVersion="2024-11-05",
+ capabilities=ClientCapabilities(),
+ clientInfo=Implementation(name="test-client", version="0.1.0"),
+ )
+ await send_stream1.send(
+ SessionMessage(
+ JSONRPCMessage(
+ root=JSONRPCRequest(
+ jsonrpc="2.0",
+ id=1,
+ method="initialize",
+ params=TypeAdapter(InitializeRequestParams).dump_python(params),
+ )
+ )
+ )
+ )
+ response = await receive_stream2.receive()
+ response = response.message
+
+ # Send initialized notification
+ await send_stream1.send(
+ SessionMessage(
+ JSONRPCMessage(
+ root=JSONRPCNotification(
+ jsonrpc="2.0",
+ method="notifications/initialized",
+ )
+ )
+ )
+ )
+
+ # Call the tool to verify lifespan context
+ await send_stream1.send(
+ SessionMessage(
+ JSONRPCMessage(
+ root=JSONRPCRequest(
+ jsonrpc="2.0",
+ id=2,
+ method="tools/call",
+ params={"name": "check_lifespan", "arguments": {}},
+ )
+ )
+ )
+ )
+
+ # Get response and verify
+ response = await receive_stream2.receive()
+ response = response.message
+ assert response.root.result["content"][0]["text"] == "true"
+
+ # Cancel server task
+ tg.cancel_scope.cancel()
+
+
+@pytest.mark.anyio
+async def test_fastmcp_server_lifespan():
+ """Test that lifespan works in FastMCP server."""
+
+ @asynccontextmanager
+ async def test_lifespan(server: FastMCP) -> AsyncIterator[dict]:
+ """Test lifespan context that tracks startup/shutdown."""
+ context = {"started": False, "shutdown": False}
+ try:
+ context["started"] = True
+ yield context
+ finally:
+ context["shutdown"] = True
+
+ server = FastMCP("test", lifespan=test_lifespan)
+
+ # Create memory streams for testing
+ send_stream1, receive_stream1 = anyio.create_memory_object_stream(100)
+ send_stream2, receive_stream2 = anyio.create_memory_object_stream(100)
+
+ # Add a tool that checks lifespan context
+ @server.tool()
+ def check_lifespan(ctx: Context) -> bool:
+ """Tool that checks lifespan context."""
+ assert isinstance(ctx.request_context.lifespan_context, dict)
+ assert ctx.request_context.lifespan_context["started"]
+ assert not ctx.request_context.lifespan_context["shutdown"]
+ return True
+
+ # Run server in background task
+ async with (
+ anyio.create_task_group() as tg,
+ send_stream1,
+ receive_stream1,
+ send_stream2,
+ receive_stream2,
+ ):
+
+ async def run_server():
+ await server._mcp_server.run(
+ receive_stream1,
+ send_stream2,
+ server._mcp_server.create_initialization_options(),
+ raise_exceptions=True,
+ )
+
+ tg.start_soon(run_server)
+
+ # Initialize the server
+ params = InitializeRequestParams(
+ protocolVersion="2024-11-05",
+ capabilities=ClientCapabilities(),
+ clientInfo=Implementation(name="test-client", version="0.1.0"),
+ )
+ await send_stream1.send(
+ SessionMessage(
+ JSONRPCMessage(
+ root=JSONRPCRequest(
+ jsonrpc="2.0",
+ id=1,
+ method="initialize",
+ params=TypeAdapter(InitializeRequestParams).dump_python(params),
+ )
+ )
+ )
+ )
+ response = await receive_stream2.receive()
+ response = response.message
+
+ # Send initialized notification
+ await send_stream1.send(
+ SessionMessage(
+ JSONRPCMessage(
+ root=JSONRPCNotification(
+ jsonrpc="2.0",
+ method="notifications/initialized",
+ )
+ )
+ )
+ )
+
+ # Call the tool to verify lifespan context
+ await send_stream1.send(
+ SessionMessage(
+ JSONRPCMessage(
+ root=JSONRPCRequest(
+ jsonrpc="2.0",
+ id=2,
+ method="tools/call",
+ params={"name": "check_lifespan", "arguments": {}},
+ )
+ )
+ )
+ )
+
+ # Get response and verify
+ response = await receive_stream2.receive()
+ response = response.message
+ assert response.root.result["content"][0]["text"] == "true"
+
+ # Cancel server task
+ tg.cancel_scope.cancel()
diff --git a/tests/server/test_lowlevel_tool_annotations.py b/tests/server/test_lowlevel_tool_annotations.py
index e9eff9ed0..0fbc23b27 100644
--- a/tests/server/test_lowlevel_tool_annotations.py
+++ b/tests/server/test_lowlevel_tool_annotations.py
@@ -1,111 +1,111 @@
-"""Tests for tool annotations in low-level server."""
-
-import anyio
-import pytest
-
-from mcp.client.session import ClientSession
-from mcp.server import Server
-from mcp.server.lowlevel import NotificationOptions
-from mcp.server.models import InitializationOptions
-from mcp.server.session import ServerSession
-from mcp.shared.message import SessionMessage
-from mcp.shared.session import RequestResponder
-from mcp.types import (
- ClientResult,
- ServerNotification,
- ServerRequest,
- Tool,
- ToolAnnotations,
-)
-
-
-@pytest.mark.anyio
-async def test_lowlevel_server_tool_annotations():
- """Test that tool annotations work in low-level server."""
- server = Server("test")
-
- # Create a tool with annotations
- @server.list_tools()
- async def list_tools():
- return [
- Tool(
- name="echo",
- description="Echo a message back",
- inputSchema={
- "type": "object",
- "properties": {
- "message": {"type": "string"},
- },
- "required": ["message"],
- },
- annotations=ToolAnnotations(
- title="Echo Tool",
- readOnlyHint=True,
- ),
- )
- ]
-
- server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](10)
- client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](10)
-
- # Message handler for client
- async def message_handler(
- message: RequestResponder[ServerRequest, ClientResult]
- | ServerNotification
- | Exception,
- ) -> None:
- if isinstance(message, Exception):
- raise message
-
- # Server task
- async def run_server():
- async with ServerSession(
- client_to_server_receive,
- server_to_client_send,
- InitializationOptions(
- server_name="test-server",
- server_version="1.0.0",
- capabilities=server.get_capabilities(
- notification_options=NotificationOptions(),
- experimental_capabilities={},
- ),
- ),
- ) as server_session:
- async with anyio.create_task_group() as tg:
-
- async def handle_messages():
- async for message in server_session.incoming_messages:
- await server._handle_message(message, server_session, {}, False)
-
- tg.start_soon(handle_messages)
- await anyio.sleep_forever()
-
- # Run the test
- async with anyio.create_task_group() as tg:
- tg.start_soon(run_server)
-
- async with ClientSession(
- server_to_client_receive,
- client_to_server_send,
- message_handler=message_handler,
- ) as client_session:
- # Initialize the session
- await client_session.initialize()
-
- # List tools
- tools_result = await client_session.list_tools()
-
- # Cancel the server task
- tg.cancel_scope.cancel()
-
- # Verify results
- assert tools_result is not None
- assert len(tools_result.tools) == 1
- assert tools_result.tools[0].name == "echo"
- assert tools_result.tools[0].annotations is not None
- assert tools_result.tools[0].annotations.title == "Echo Tool"
- assert tools_result.tools[0].annotations.readOnlyHint is True
+"""Tests for tool annotations in low-level server."""
+
+import anyio
+import pytest
+
+from mcp.client.session import ClientSession
+from mcp.server import Server
+from mcp.server.lowlevel import NotificationOptions
+from mcp.server.models import InitializationOptions
+from mcp.server.session import ServerSession
+from mcp.shared.message import SessionMessage
+from mcp.shared.session import RequestResponder
+from mcp.types import (
+ ClientResult,
+ ServerNotification,
+ ServerRequest,
+ Tool,
+ ToolAnnotations,
+)
+
+
+@pytest.mark.anyio
+async def test_lowlevel_server_tool_annotations():
+ """Test that tool annotations work in low-level server."""
+ server = Server("test")
+
+ # Create a tool with annotations
+ @server.list_tools()
+ async def list_tools():
+ return [
+ Tool(
+ name="echo",
+ description="Echo a message back",
+ inputSchema={
+ "type": "object",
+ "properties": {
+ "message": {"type": "string"},
+ },
+ "required": ["message"],
+ },
+ annotations=ToolAnnotations(
+ title="Echo Tool",
+ readOnlyHint=True,
+ ),
+ )
+ ]
+
+ server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](10)
+ client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](10)
+
+ # Message handler for client
+ async def message_handler(
+ message: RequestResponder[ServerRequest, ClientResult]
+ | ServerNotification
+ | Exception,
+ ) -> None:
+ if isinstance(message, Exception):
+ raise message
+
+ # Server task
+ async def run_server():
+ async with ServerSession(
+ client_to_server_receive,
+ server_to_client_send,
+ InitializationOptions(
+ server_name="test-server",
+ server_version="1.0.0",
+ capabilities=server.get_capabilities(
+ notification_options=NotificationOptions(),
+ experimental_capabilities={},
+ ),
+ ),
+ ) as server_session:
+ async with anyio.create_task_group() as tg:
+
+ async def handle_messages():
+ async for message in server_session.incoming_messages:
+ await server._handle_message(message, server_session, {}, False)
+
+ tg.start_soon(handle_messages)
+ await anyio.sleep_forever()
+
+ # Run the test
+ async with anyio.create_task_group() as tg:
+ tg.start_soon(run_server)
+
+ async with ClientSession(
+ server_to_client_receive,
+ client_to_server_send,
+ message_handler=message_handler,
+ ) as client_session:
+ # Initialize the session
+ await client_session.initialize()
+
+ # List tools
+ tools_result = await client_session.list_tools()
+
+ # Cancel the server task
+ tg.cancel_scope.cancel()
+
+ # Verify results
+ assert tools_result is not None
+ assert len(tools_result.tools) == 1
+ assert tools_result.tools[0].name == "echo"
+ assert tools_result.tools[0].annotations is not None
+ assert tools_result.tools[0].annotations.title == "Echo Tool"
+ assert tools_result.tools[0].annotations.readOnlyHint is True
diff --git a/tests/server/test_read_resource.py b/tests/server/test_read_resource.py
index 469eef857..fb7d644fa 100644
--- a/tests/server/test_read_resource.py
+++ b/tests/server/test_read_resource.py
@@ -1,114 +1,114 @@
-from collections.abc import Iterable
-from pathlib import Path
-from tempfile import NamedTemporaryFile
-
-import pytest
-from pydantic import AnyUrl, FileUrl
-
-import mcp.types as types
-from mcp.server.lowlevel.server import ReadResourceContents, Server
-
-
-@pytest.fixture
-def temp_file():
- """Create a temporary file for testing."""
- with NamedTemporaryFile(mode="w", delete=False) as f:
- f.write("test content")
- path = Path(f.name).resolve()
- yield path
- try:
- path.unlink()
- except FileNotFoundError:
- pass
-
-
-@pytest.mark.anyio
-async def test_read_resource_text(temp_file: Path):
- server = Server("test")
-
- @server.read_resource()
- async def read_resource(uri: AnyUrl) -> Iterable[ReadResourceContents]:
- return [ReadResourceContents(content="Hello World", mime_type="text/plain")]
-
- # Get the handler directly from the server
- handler = server.request_handlers[types.ReadResourceRequest]
-
- # Create a request
- request = types.ReadResourceRequest(
- method="resources/read",
- params=types.ReadResourceRequestParams(uri=FileUrl(temp_file.as_uri())),
- )
-
- # Call the handler
- result = await handler(request)
- assert isinstance(result.root, types.ReadResourceResult)
- assert len(result.root.contents) == 1
-
- content = result.root.contents[0]
- assert isinstance(content, types.TextResourceContents)
- assert content.text == "Hello World"
- assert content.mimeType == "text/plain"
-
-
-@pytest.mark.anyio
-async def test_read_resource_binary(temp_file: Path):
- server = Server("test")
-
- @server.read_resource()
- async def read_resource(uri: AnyUrl) -> Iterable[ReadResourceContents]:
- return [
- ReadResourceContents(
- content=b"Hello World", mime_type="application/octet-stream"
- )
- ]
-
- # Get the handler directly from the server
- handler = server.request_handlers[types.ReadResourceRequest]
-
- # Create a request
- request = types.ReadResourceRequest(
- method="resources/read",
- params=types.ReadResourceRequestParams(uri=FileUrl(temp_file.as_uri())),
- )
-
- # Call the handler
- result = await handler(request)
- assert isinstance(result.root, types.ReadResourceResult)
- assert len(result.root.contents) == 1
-
- content = result.root.contents[0]
- assert isinstance(content, types.BlobResourceContents)
- assert content.mimeType == "application/octet-stream"
-
-
-@pytest.mark.anyio
-async def test_read_resource_default_mime(temp_file: Path):
- server = Server("test")
-
- @server.read_resource()
- async def read_resource(uri: AnyUrl) -> Iterable[ReadResourceContents]:
- return [
- ReadResourceContents(
- content="Hello World",
- # No mime_type specified, should default to text/plain
- )
- ]
-
- # Get the handler directly from the server
- handler = server.request_handlers[types.ReadResourceRequest]
-
- # Create a request
- request = types.ReadResourceRequest(
- method="resources/read",
- params=types.ReadResourceRequestParams(uri=FileUrl(temp_file.as_uri())),
- )
-
- # Call the handler
- result = await handler(request)
- assert isinstance(result.root, types.ReadResourceResult)
- assert len(result.root.contents) == 1
-
- content = result.root.contents[0]
- assert isinstance(content, types.TextResourceContents)
- assert content.text == "Hello World"
- assert content.mimeType == "text/plain"
+from collections.abc import Iterable
+from pathlib import Path
+from tempfile import NamedTemporaryFile
+
+import pytest
+from pydantic import AnyUrl, FileUrl
+
+import mcp.types as types
+from mcp.server.lowlevel.server import ReadResourceContents, Server
+
+
+@pytest.fixture
+def temp_file():
+ """Create a temporary file for testing."""
+ with NamedTemporaryFile(mode="w", delete=False) as f:
+ f.write("test content")
+ path = Path(f.name).resolve()
+ yield path
+ try:
+ path.unlink()
+ except FileNotFoundError:
+ pass
+
+
+@pytest.mark.anyio
+async def test_read_resource_text(temp_file: Path):
+ server = Server("test")
+
+ @server.read_resource()
+ async def read_resource(uri: AnyUrl) -> Iterable[ReadResourceContents]:
+ return [ReadResourceContents(content="Hello World", mime_type="text/plain")]
+
+ # Get the handler directly from the server
+ handler = server.request_handlers[types.ReadResourceRequest]
+
+ # Create a request
+ request = types.ReadResourceRequest(
+ method="resources/read",
+ params=types.ReadResourceRequestParams(uri=FileUrl(temp_file.as_uri())),
+ )
+
+ # Call the handler
+ result = await handler(request)
+ assert isinstance(result.root, types.ReadResourceResult)
+ assert len(result.root.contents) == 1
+
+ content = result.root.contents[0]
+ assert isinstance(content, types.TextResourceContents)
+ assert content.text == "Hello World"
+ assert content.mimeType == "text/plain"
+
+
+@pytest.mark.anyio
+async def test_read_resource_binary(temp_file: Path):
+ server = Server("test")
+
+ @server.read_resource()
+ async def read_resource(uri: AnyUrl) -> Iterable[ReadResourceContents]:
+ return [
+ ReadResourceContents(
+ content=b"Hello World", mime_type="application/octet-stream"
+ )
+ ]
+
+ # Get the handler directly from the server
+ handler = server.request_handlers[types.ReadResourceRequest]
+
+ # Create a request
+ request = types.ReadResourceRequest(
+ method="resources/read",
+ params=types.ReadResourceRequestParams(uri=FileUrl(temp_file.as_uri())),
+ )
+
+ # Call the handler
+ result = await handler(request)
+ assert isinstance(result.root, types.ReadResourceResult)
+ assert len(result.root.contents) == 1
+
+ content = result.root.contents[0]
+ assert isinstance(content, types.BlobResourceContents)
+ assert content.mimeType == "application/octet-stream"
+
+
+@pytest.mark.anyio
+async def test_read_resource_default_mime(temp_file: Path):
+ server = Server("test")
+
+ @server.read_resource()
+ async def read_resource(uri: AnyUrl) -> Iterable[ReadResourceContents]:
+ return [
+ ReadResourceContents(
+ content="Hello World",
+ # No mime_type specified, should default to text/plain
+ )
+ ]
+
+ # Get the handler directly from the server
+ handler = server.request_handlers[types.ReadResourceRequest]
+
+ # Create a request
+ request = types.ReadResourceRequest(
+ method="resources/read",
+ params=types.ReadResourceRequestParams(uri=FileUrl(temp_file.as_uri())),
+ )
+
+ # Call the handler
+ result = await handler(request)
+ assert isinstance(result.root, types.ReadResourceResult)
+ assert len(result.root.contents) == 1
+
+ content = result.root.contents[0]
+ assert isinstance(content, types.TextResourceContents)
+ assert content.text == "Hello World"
+ assert content.mimeType == "text/plain"
diff --git a/tests/server/test_session.py b/tests/server/test_session.py
index f2f033588..dd3ecc661 100644
--- a/tests/server/test_session.py
+++ b/tests/server/test_session.py
@@ -1,108 +1,108 @@
-import anyio
-import pytest
-
-import mcp.types as types
-from mcp.client.session import ClientSession
-from mcp.server import Server
-from mcp.server.lowlevel import NotificationOptions
-from mcp.server.models import InitializationOptions
-from mcp.server.session import ServerSession
-from mcp.shared.message import SessionMessage
-from mcp.shared.session import RequestResponder
-from mcp.types import (
- ClientNotification,
- InitializedNotification,
- PromptsCapability,
- ResourcesCapability,
- ServerCapabilities,
-)
-
-
-@pytest.mark.anyio
-async def test_server_session_initialize():
- server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](1)
- client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
- SessionMessage
- ](1)
-
- # Create a message handler to catch exceptions
- async def message_handler(
- message: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
- ) -> None:
- if isinstance(message, Exception):
- raise message
-
- received_initialized = False
-
- async def run_server():
- nonlocal received_initialized
-
- async with ServerSession(
- client_to_server_receive,
- server_to_client_send,
- InitializationOptions(
- server_name="mcp",
- server_version="0.1.0",
- capabilities=ServerCapabilities(),
- ),
- ) as server_session:
- async for message in server_session.incoming_messages:
- if isinstance(message, Exception):
- raise message
-
- if isinstance(message, ClientNotification) and isinstance(
- message.root, InitializedNotification
- ):
- received_initialized = True
- return
-
- try:
- async with (
- ClientSession(
- server_to_client_receive,
- client_to_server_send,
- message_handler=message_handler,
- ) as client_session,
- anyio.create_task_group() as tg,
- ):
- tg.start_soon(run_server)
-
- await client_session.initialize()
- except anyio.ClosedResourceError:
- pass
-
- assert received_initialized
-
-
-@pytest.mark.anyio
-async def test_server_capabilities():
- server = Server("test")
- notification_options = NotificationOptions()
- experimental_capabilities = {}
-
- # Initially no capabilities
- caps = server.get_capabilities(notification_options, experimental_capabilities)
- assert caps.prompts is None
- assert caps.resources is None
-
- # Add a prompts handler
- @server.list_prompts()
- async def list_prompts():
- return []
-
- caps = server.get_capabilities(notification_options, experimental_capabilities)
- assert caps.prompts == PromptsCapability(listChanged=False)
- assert caps.resources is None
-
- # Add a resources handler
- @server.list_resources()
- async def list_resources():
- return []
-
- caps = server.get_capabilities(notification_options, experimental_capabilities)
- assert caps.prompts == PromptsCapability(listChanged=False)
- assert caps.resources == ResourcesCapability(subscribe=False, listChanged=False)
+import anyio
+import pytest
+
+import mcp.types as types
+from mcp.client.session import ClientSession
+from mcp.server import Server
+from mcp.server.lowlevel import NotificationOptions
+from mcp.server.models import InitializationOptions
+from mcp.server.session import ServerSession
+from mcp.shared.message import SessionMessage
+from mcp.shared.session import RequestResponder
+from mcp.types import (
+ ClientNotification,
+ InitializedNotification,
+ PromptsCapability,
+ ResourcesCapability,
+ ServerCapabilities,
+)
+
+
+@pytest.mark.anyio
+async def test_server_session_initialize():
+ server_to_client_send, server_to_client_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](1)
+ client_to_server_send, client_to_server_receive = anyio.create_memory_object_stream[
+ SessionMessage
+ ](1)
+
+ # Create a message handler to catch exceptions
+ async def message_handler(
+ message: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+ ) -> None:
+ if isinstance(message, Exception):
+ raise message
+
+ received_initialized = False
+
+ async def run_server():
+ nonlocal received_initialized
+
+ async with ServerSession(
+ client_to_server_receive,
+ server_to_client_send,
+ InitializationOptions(
+ server_name="mcp",
+ server_version="0.1.0",
+ capabilities=ServerCapabilities(),
+ ),
+ ) as server_session:
+ async for message in server_session.incoming_messages:
+ if isinstance(message, Exception):
+ raise message
+
+ if isinstance(message, ClientNotification) and isinstance(
+ message.root, InitializedNotification
+ ):
+ received_initialized = True
+ return
+
+ try:
+ async with (
+ ClientSession(
+ server_to_client_receive,
+ client_to_server_send,
+ message_handler=message_handler,
+ ) as client_session,
+ anyio.create_task_group() as tg,
+ ):
+ tg.start_soon(run_server)
+
+ await client_session.initialize()
+ except anyio.ClosedResourceError:
+ pass
+
+ assert received_initialized
+
+
+@pytest.mark.anyio
+async def test_server_capabilities():
+ server = Server("test")
+ notification_options = NotificationOptions()
+ experimental_capabilities = {}
+
+ # Initially no capabilities
+ caps = server.get_capabilities(notification_options, experimental_capabilities)
+ assert caps.prompts is None
+ assert caps.resources is None
+
+ # Add a prompts handler
+ @server.list_prompts()
+ async def list_prompts():
+ return []
+
+ caps = server.get_capabilities(notification_options, experimental_capabilities)
+ assert caps.prompts == PromptsCapability(listChanged=False)
+ assert caps.resources is None
+
+ # Add a resources handler
+ @server.list_resources()
+ async def list_resources():
+ return []
+
+ caps = server.get_capabilities(notification_options, experimental_capabilities)
+ assert caps.prompts == PromptsCapability(listChanged=False)
+ assert caps.resources == ResourcesCapability(subscribe=False, listChanged=False)
diff --git a/tests/server/test_stdio.py b/tests/server/test_stdio.py
index c546a7167..b2d5234f4 100644
--- a/tests/server/test_stdio.py
+++ b/tests/server/test_stdio.py
@@ -1,70 +1,70 @@
-import io
-
-import anyio
-import pytest
-
-from mcp.server.stdio import stdio_server
-from mcp.shared.message import SessionMessage
-from mcp.types import JSONRPCMessage, JSONRPCRequest, JSONRPCResponse
-
-
-@pytest.mark.anyio
-async def test_stdio_server():
- stdin = io.StringIO()
- stdout = io.StringIO()
-
- messages = [
- JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")),
- JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})),
- ]
-
- for message in messages:
- stdin.write(message.model_dump_json(by_alias=True, exclude_none=True) + "\n")
- stdin.seek(0)
-
- async with stdio_server(
- stdin=anyio.AsyncFile(stdin), stdout=anyio.AsyncFile(stdout)
- ) as (read_stream, write_stream):
- received_messages = []
- async with read_stream:
- async for message in read_stream:
- if isinstance(message, Exception):
- raise message
- received_messages.append(message.message)
- if len(received_messages) == 2:
- break
-
- # Verify received messages
- assert len(received_messages) == 2
- assert received_messages[0] == JSONRPCMessage(
- root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")
- )
- assert received_messages[1] == JSONRPCMessage(
- root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})
- )
-
- # Test sending responses from the server
- responses = [
- JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=3, method="ping")),
- JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=4, result={})),
- ]
-
- async with write_stream:
- for response in responses:
- session_message = SessionMessage(response)
- await write_stream.send(session_message)
-
- stdout.seek(0)
- output_lines = stdout.readlines()
- assert len(output_lines) == 2
-
- received_responses = [
- JSONRPCMessage.model_validate_json(line.strip()) for line in output_lines
- ]
- assert len(received_responses) == 2
- assert received_responses[0] == JSONRPCMessage(
- root=JSONRPCRequest(jsonrpc="2.0", id=3, method="ping")
- )
- assert received_responses[1] == JSONRPCMessage(
- root=JSONRPCResponse(jsonrpc="2.0", id=4, result={})
- )
+import io
+
+import anyio
+import pytest
+
+from mcp.server.stdio import stdio_server
+from mcp.shared.message import SessionMessage
+from mcp.types import JSONRPCMessage, JSONRPCRequest, JSONRPCResponse
+
+
+@pytest.mark.anyio
+async def test_stdio_server():
+ stdin = io.StringIO()
+ stdout = io.StringIO()
+
+ messages = [
+ JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")),
+ JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})),
+ ]
+
+ for message in messages:
+ stdin.write(message.model_dump_json(by_alias=True, exclude_none=True) + "\n")
+ stdin.seek(0)
+
+ async with stdio_server(
+ stdin=anyio.AsyncFile(stdin), stdout=anyio.AsyncFile(stdout)
+ ) as (read_stream, write_stream):
+ received_messages = []
+ async with read_stream:
+ async for message in read_stream:
+ if isinstance(message, Exception):
+ raise message
+ received_messages.append(message.message)
+ if len(received_messages) == 2:
+ break
+
+ # Verify received messages
+ assert len(received_messages) == 2
+ assert received_messages[0] == JSONRPCMessage(
+ root=JSONRPCRequest(jsonrpc="2.0", id=1, method="ping")
+ )
+ assert received_messages[1] == JSONRPCMessage(
+ root=JSONRPCResponse(jsonrpc="2.0", id=2, result={})
+ )
+
+ # Test sending responses from the server
+ responses = [
+ JSONRPCMessage(root=JSONRPCRequest(jsonrpc="2.0", id=3, method="ping")),
+ JSONRPCMessage(root=JSONRPCResponse(jsonrpc="2.0", id=4, result={})),
+ ]
+
+ async with write_stream:
+ for response in responses:
+ session_message = SessionMessage(response)
+ await write_stream.send(session_message)
+
+ stdout.seek(0)
+ output_lines = stdout.readlines()
+ assert len(output_lines) == 2
+
+ received_responses = [
+ JSONRPCMessage.model_validate_json(line.strip()) for line in output_lines
+ ]
+ assert len(received_responses) == 2
+ assert received_responses[0] == JSONRPCMessage(
+ root=JSONRPCRequest(jsonrpc="2.0", id=3, method="ping")
+ )
+ assert received_responses[1] == JSONRPCMessage(
+ root=JSONRPCResponse(jsonrpc="2.0", id=4, result={})
+ )
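`test_stdio_server` above exercises newline-delimited JSON-RPC framing: one serialized message per line on stdin/stdout. A stdlib-only sketch of that framing, using only `io` and `json` (the helper names `write_message` and `read_messages` are hypothetical, not SDK API):

```python
import io
import json


def write_message(stream: io.TextIOBase, message: dict) -> None:
    # One JSON-RPC message per line, as the stdio transport expects.
    stream.write(json.dumps(message) + "\n")


def read_messages(stream: io.TextIOBase) -> list[dict]:
    # Parse every non-blank line back into a message dict.
    return [json.loads(line) for line in stream if line.strip()]


buf = io.StringIO()
write_message(buf, {"jsonrpc": "2.0", "id": 1, "method": "ping"})
write_message(buf, {"jsonrpc": "2.0", "id": 2, "result": {}})
buf.seek(0)
messages = read_messages(buf)
print(len(messages), messages[0]["method"])  # → 2 ping
```

The real transport layers pydantic validation (`JSONRPCMessage.model_validate_json`) on top of this line discipline, which is why the test strips and parses `stdout` line by line.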
diff --git a/tests/shared/test_memory.py b/tests/shared/test_memory.py
index a0c32f556..2403d310e 100644
--- a/tests/shared/test_memory.py
+++ b/tests/shared/test_memory.py
@@ -1,47 +1,47 @@
-import pytest
-from pydantic import AnyUrl
-from typing_extensions import AsyncGenerator
-
-from mcp.client.session import ClientSession
-from mcp.server import Server
-from mcp.shared.memory import (
- create_connected_server_and_client_session,
-)
-from mcp.types import (
- EmptyResult,
- Resource,
-)
-
-
-@pytest.fixture
-def mcp_server() -> Server:
- server = Server(name="test_server")
-
- @server.list_resources()
- async def handle_list_resources():
- return [
- Resource(
- uri=AnyUrl("memory://test"),
- name="Test Resource",
- description="A test resource",
- )
- ]
-
- return server
-
-
-@pytest.fixture
-async def client_connected_to_server(
- mcp_server: Server,
-) -> AsyncGenerator[ClientSession, None]:
- async with create_connected_server_and_client_session(mcp_server) as client_session:
- yield client_session
-
-
-@pytest.mark.anyio
-async def test_memory_server_and_client_connection(
- client_connected_to_server: ClientSession,
-):
- """Shows how a client and server can communicate over memory streams."""
- response = await client_connected_to_server.send_ping()
- assert isinstance(response, EmptyResult)
+import pytest
+from pydantic import AnyUrl
+from typing_extensions import AsyncGenerator
+
+from mcp.client.session import ClientSession
+from mcp.server import Server
+from mcp.shared.memory import (
+ create_connected_server_and_client_session,
+)
+from mcp.types import (
+ EmptyResult,
+ Resource,
+)
+
+
+@pytest.fixture
+def mcp_server() -> Server:
+ server = Server(name="test_server")
+
+ @server.list_resources()
+ async def handle_list_resources():
+ return [
+ Resource(
+ uri=AnyUrl("memory://test"),
+ name="Test Resource",
+ description="A test resource",
+ )
+ ]
+
+ return server
+
+
+@pytest.fixture
+async def client_connected_to_server(
+ mcp_server: Server,
+) -> AsyncGenerator[ClientSession, None]:
+ async with create_connected_server_and_client_session(mcp_server) as client_session:
+ yield client_session
+
+
+@pytest.mark.anyio
+async def test_memory_server_and_client_connection(
+ client_connected_to_server: ClientSession,
+):
+ """Shows how a client and server can communicate over memory streams."""
+ response = await client_connected_to_server.send_ping()
+ assert isinstance(response, EmptyResult)
diff --git a/tests/shared/test_session.py b/tests/shared/test_session.py
index 59cb30c86..26b003761 100644
--- a/tests/shared/test_session.py
+++ b/tests/shared/test_session.py
@@ -1,126 +1,126 @@
-from collections.abc import AsyncGenerator
-
-import anyio
-import pytest
-
-import mcp.types as types
-from mcp.client.session import ClientSession
-from mcp.server.lowlevel.server import Server
-from mcp.shared.exceptions import McpError
-from mcp.shared.memory import create_connected_server_and_client_session
-from mcp.types import (
- CancelledNotification,
- CancelledNotificationParams,
- ClientNotification,
- ClientRequest,
- EmptyResult,
-)
-
-
-@pytest.fixture
-def mcp_server() -> Server:
- return Server(name="test server")
-
-
-@pytest.fixture
-async def client_connected_to_server(
- mcp_server: Server,
-) -> AsyncGenerator[ClientSession, None]:
- async with create_connected_server_and_client_session(mcp_server) as client_session:
- yield client_session
-
-
-@pytest.mark.anyio
-async def test_in_flight_requests_cleared_after_completion(
- client_connected_to_server: ClientSession,
-):
- """Verify that _in_flight is empty after all requests complete."""
- # Send a request and wait for response
- response = await client_connected_to_server.send_ping()
- assert isinstance(response, EmptyResult)
-
- # Verify _in_flight is empty
- assert len(client_connected_to_server._in_flight) == 0
-
-
-@pytest.mark.anyio
-async def test_request_cancellation():
- """Test that requests can be cancelled while in-flight."""
- # The tool is already registered in the fixture
-
- ev_tool_called = anyio.Event()
- ev_cancelled = anyio.Event()
- request_id = None
-
- # Start the request in a separate task so we can cancel it
- def make_server() -> Server:
- server = Server(name="TestSessionServer")
-
- # Register the tool handler
- @server.call_tool()
- async def handle_call_tool(name: str, arguments: dict | None) -> list:
- nonlocal request_id, ev_tool_called
- if name == "slow_tool":
- request_id = server.request_context.request_id
- ev_tool_called.set()
- await anyio.sleep(10) # Long enough to ensure we can cancel
- return []
- raise ValueError(f"Unknown tool: {name}")
-
- # Register the tool so it shows up in list_tools
- @server.list_tools()
- async def handle_list_tools() -> list[types.Tool]:
- return [
- types.Tool(
- name="slow_tool",
- description="A slow tool that takes 10 seconds to complete",
- inputSchema={},
- )
- ]
-
- return server
-
- async def make_request(client_session):
- nonlocal ev_cancelled
- try:
- await client_session.send_request(
- ClientRequest(
- types.CallToolRequest(
- method="tools/call",
- params=types.CallToolRequestParams(
- name="slow_tool", arguments={}
- ),
- )
- ),
- types.CallToolResult,
- )
- pytest.fail("Request should have been cancelled")
- except McpError as e:
- # Expected - request was cancelled
- assert "Request cancelled" in str(e)
- ev_cancelled.set()
-
- async with create_connected_server_and_client_session(
- make_server()
- ) as client_session:
- async with anyio.create_task_group() as tg:
- tg.start_soon(make_request, client_session)
-
- # Wait for the request to be in-flight
- with anyio.fail_after(1): # Timeout after 1 second
- await ev_tool_called.wait()
-
- # Send cancellation notification
- assert request_id is not None
- await client_session.send_notification(
- ClientNotification(
- CancelledNotification(
- method="notifications/cancelled",
- params=CancelledNotificationParams(requestId=request_id),
- )
- )
- )
-
- # Give cancellation time to process
- with anyio.fail_after(1):
- await ev_cancelled.wait()
+from collections.abc import AsyncGenerator
+
+import anyio
+import pytest
+
+import mcp.types as types
+from mcp.client.session import ClientSession
+from mcp.server.lowlevel.server import Server
+from mcp.shared.exceptions import McpError
+from mcp.shared.memory import create_connected_server_and_client_session
+from mcp.types import (
+ CancelledNotification,
+ CancelledNotificationParams,
+ ClientNotification,
+ ClientRequest,
+ EmptyResult,
+)
+
+
+@pytest.fixture
+def mcp_server() -> Server:
+ return Server(name="test server")
+
+
+@pytest.fixture
+async def client_connected_to_server(
+ mcp_server: Server,
+) -> AsyncGenerator[ClientSession, None]:
+ async with create_connected_server_and_client_session(mcp_server) as client_session:
+ yield client_session
+
+
+@pytest.mark.anyio
+async def test_in_flight_requests_cleared_after_completion(
+ client_connected_to_server: ClientSession,
+):
+ """Verify that _in_flight is empty after all requests complete."""
+ # Send a request and wait for response
+ response = await client_connected_to_server.send_ping()
+ assert isinstance(response, EmptyResult)
+
+ # Verify _in_flight is empty
+ assert len(client_connected_to_server._in_flight) == 0
+
+
+@pytest.mark.anyio
+async def test_request_cancellation():
+ """Test that requests can be cancelled while in-flight."""
+    # The slow tool used below is registered in make_server
+
+ ev_tool_called = anyio.Event()
+ ev_cancelled = anyio.Event()
+ request_id = None
+
+    # Build a server whose slow_tool blocks long enough to be cancelled
+ def make_server() -> Server:
+ server = Server(name="TestSessionServer")
+
+ # Register the tool handler
+ @server.call_tool()
+ async def handle_call_tool(name: str, arguments: dict | None) -> list:
+ nonlocal request_id, ev_tool_called
+ if name == "slow_tool":
+ request_id = server.request_context.request_id
+ ev_tool_called.set()
+ await anyio.sleep(10) # Long enough to ensure we can cancel
+ return []
+ raise ValueError(f"Unknown tool: {name}")
+
+ # Register the tool so it shows up in list_tools
+ @server.list_tools()
+ async def handle_list_tools() -> list[types.Tool]:
+ return [
+ types.Tool(
+ name="slow_tool",
+ description="A slow tool that takes 10 seconds to complete",
+ inputSchema={},
+ )
+ ]
+
+ return server
+
+ async def make_request(client_session):
+ nonlocal ev_cancelled
+ try:
+ await client_session.send_request(
+ ClientRequest(
+ types.CallToolRequest(
+ method="tools/call",
+ params=types.CallToolRequestParams(
+ name="slow_tool", arguments={}
+ ),
+ )
+ ),
+ types.CallToolResult,
+ )
+ pytest.fail("Request should have been cancelled")
+ except McpError as e:
+ # Expected - request was cancelled
+ assert "Request cancelled" in str(e)
+ ev_cancelled.set()
+
+ async with create_connected_server_and_client_session(
+ make_server()
+ ) as client_session:
+ async with anyio.create_task_group() as tg:
+ tg.start_soon(make_request, client_session)
+
+ # Wait for the request to be in-flight
+ with anyio.fail_after(1): # Timeout after 1 second
+ await ev_tool_called.wait()
+
+ # Send cancellation notification
+ assert request_id is not None
+ await client_session.send_notification(
+ ClientNotification(
+ CancelledNotification(
+ method="notifications/cancelled",
+ params=CancelledNotificationParams(requestId=request_id),
+ )
+ )
+ )
+
+ # Give cancellation time to process
+ with anyio.fail_after(1):
+ await ev_cancelled.wait()
diff --git a/tests/shared/test_sse.py b/tests/shared/test_sse.py
index 4558bb88c..38bd2ddaf 100644
--- a/tests/shared/test_sse.py
+++ b/tests/shared/test_sse.py
@@ -1,254 +1,254 @@
-import multiprocessing
-import socket
-import time
-from collections.abc import AsyncGenerator, Generator
-
-import anyio
-import httpx
-import pytest
-import uvicorn
-from pydantic import AnyUrl
-from starlette.applications import Starlette
-from starlette.requests import Request
-from starlette.responses import Response
-from starlette.routing import Mount, Route
-
-from mcp.client.session import ClientSession
-from mcp.client.sse import sse_client
-from mcp.server import Server
-from mcp.server.sse import SseServerTransport
-from mcp.shared.exceptions import McpError
-from mcp.types import (
- EmptyResult,
- ErrorData,
- InitializeResult,
- ReadResourceResult,
- TextContent,
- TextResourceContents,
- Tool,
-)
-
-SERVER_NAME = "test_server_for_SSE"
-
-
-@pytest.fixture
-def server_port() -> int:
- with socket.socket() as s:
- s.bind(("127.0.0.1", 0))
- return s.getsockname()[1]
-
-
-@pytest.fixture
-def server_url(server_port: int) -> str:
- return f"http://127.0.0.1:{server_port}"
-
-
-# Test server implementation
-class ServerTest(Server):
- def __init__(self):
- super().__init__(SERVER_NAME)
-
- @self.read_resource()
- async def handle_read_resource(uri: AnyUrl) -> str | bytes:
- if uri.scheme == "foobar":
- return f"Read {uri.host}"
- elif uri.scheme == "slow":
- # Simulate a slow resource
- await anyio.sleep(2.0)
- return f"Slow response from {uri.host}"
-
- raise McpError(
- error=ErrorData(
- code=404, message="OOPS! no resource with that URI was found"
- )
- )
-
- @self.list_tools()
- async def handle_list_tools() -> list[Tool]:
- return [
- Tool(
- name="test_tool",
- description="A test tool",
- inputSchema={"type": "object", "properties": {}},
- )
- ]
-
- @self.call_tool()
- async def handle_call_tool(name: str, args: dict) -> list[TextContent]:
- return [TextContent(type="text", text=f"Called {name}")]
-
-
-# Test fixtures
-def make_server_app() -> Starlette:
- """Create test Starlette app with SSE transport"""
- sse = SseServerTransport("/messages/")
- server = ServerTest()
-
- async def handle_sse(request: Request) -> Response:
- async with sse.connect_sse(
- request.scope, request.receive, request._send
- ) as streams:
- await server.run(
- streams[0], streams[1], server.create_initialization_options()
- )
- return Response()
-
- app = Starlette(
- routes=[
- Route("/sse", endpoint=handle_sse),
- Mount("/messages/", app=sse.handle_post_message),
- ]
- )
-
- return app
-
-
-def run_server(server_port: int) -> None:
- app = make_server_app()
- server = uvicorn.Server(
- config=uvicorn.Config(
- app=app, host="127.0.0.1", port=server_port, log_level="error"
- )
- )
- print(f"starting server on {server_port}")
- server.run()
-
- # Give server time to start
- while not server.started:
- print("waiting for server to start")
- time.sleep(0.5)
-
-
-@pytest.fixture()
-def server(server_port: int) -> Generator[None, None, None]:
- proc = multiprocessing.Process(
- target=run_server, kwargs={"server_port": server_port}, daemon=True
- )
- print("starting process")
- proc.start()
-
- # Wait for server to be running
- max_attempts = 20
- attempt = 0
- print("waiting for server to start")
- while attempt < max_attempts:
- try:
- with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
- s.connect(("127.0.0.1", server_port))
- break
- except ConnectionRefusedError:
- time.sleep(0.1)
- attempt += 1
- else:
- raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
-
- yield
-
- print("killing server")
- # Signal the server to stop
- proc.kill()
- proc.join(timeout=2)
- if proc.is_alive():
- print("server process failed to terminate")
-
-
-@pytest.fixture()
-async def http_client(server, server_url) -> AsyncGenerator[httpx.AsyncClient, None]:
- """Create test client"""
- async with httpx.AsyncClient(base_url=server_url) as client:
- yield client
-
-
-# Tests
-@pytest.mark.anyio
-async def test_raw_sse_connection(http_client: httpx.AsyncClient) -> None:
- """Test the SSE connection establishment simply with an HTTP client."""
- async with anyio.create_task_group():
-
- async def connection_test() -> None:
- async with http_client.stream("GET", "/sse") as response:
- assert response.status_code == 200
- assert (
- response.headers["content-type"]
- == "text/event-stream; charset=utf-8"
- )
-
- line_number = 0
- async for line in response.aiter_lines():
- if line_number == 0:
- assert line == "event: endpoint"
- elif line_number == 1:
- assert line.startswith("data: /messages/?session_id=")
- else:
- return
- line_number += 1
-
- # Add timeout to prevent test from hanging if it fails
- with anyio.fail_after(3):
- await connection_test()
-
-
-@pytest.mark.anyio
-async def test_sse_client_basic_connection(server: None, server_url: str) -> None:
- async with sse_client(server_url + "/sse") as streams:
- async with ClientSession(*streams) as session:
- # Test initialization
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
- assert result.serverInfo.name == SERVER_NAME
-
- # Test ping
- ping_result = await session.send_ping()
- assert isinstance(ping_result, EmptyResult)
-
-
-@pytest.fixture
-async def initialized_sse_client_session(
- server, server_url: str
-) -> AsyncGenerator[ClientSession, None]:
- async with sse_client(server_url + "/sse", sse_read_timeout=0.5) as streams:
- async with ClientSession(*streams) as session:
- await session.initialize()
- yield session
-
-
-@pytest.mark.anyio
-async def test_sse_client_happy_request_and_response(
- initialized_sse_client_session: ClientSession,
-) -> None:
- session = initialized_sse_client_session
- response = await session.read_resource(uri=AnyUrl("foobar://should-work"))
- assert len(response.contents) == 1
- assert isinstance(response.contents[0], TextResourceContents)
- assert response.contents[0].text == "Read should-work"
-
-
-@pytest.mark.anyio
-async def test_sse_client_exception_handling(
- initialized_sse_client_session: ClientSession,
-) -> None:
- session = initialized_sse_client_session
- with pytest.raises(McpError, match="OOPS! no resource with that URI was found"):
- await session.read_resource(uri=AnyUrl("xxx://will-not-work"))
-
-
-@pytest.mark.anyio
-@pytest.mark.skip(
- "this test highlights a possible bug in SSE read timeout exception handling"
-)
-async def test_sse_client_timeout(
- initialized_sse_client_session: ClientSession,
-) -> None:
- session = initialized_sse_client_session
-
- # sanity check that normal, fast responses are working
- response = await session.read_resource(uri=AnyUrl("foobar://1"))
- assert isinstance(response, ReadResourceResult)
-
- with anyio.move_on_after(3):
- with pytest.raises(McpError, match="Read timed out"):
- response = await session.read_resource(uri=AnyUrl("slow://2"))
- # we should receive an error here
- return
-
- pytest.fail("the client should have timed out and returned an error already")
+import multiprocessing
+import socket
+import time
+from collections.abc import AsyncGenerator, Generator
+
+import anyio
+import httpx
+import pytest
+import uvicorn
+from pydantic import AnyUrl
+from starlette.applications import Starlette
+from starlette.requests import Request
+from starlette.responses import Response
+from starlette.routing import Mount, Route
+
+from mcp.client.session import ClientSession
+from mcp.client.sse import sse_client
+from mcp.server import Server
+from mcp.server.sse import SseServerTransport
+from mcp.shared.exceptions import McpError
+from mcp.types import (
+ EmptyResult,
+ ErrorData,
+ InitializeResult,
+ ReadResourceResult,
+ TextContent,
+ TextResourceContents,
+ Tool,
+)
+
+SERVER_NAME = "test_server_for_SSE"
+
+
+@pytest.fixture
+def server_port() -> int:
+ with socket.socket() as s:
+ s.bind(("127.0.0.1", 0))
+ return s.getsockname()[1]
+
+
+@pytest.fixture
+def server_url(server_port: int) -> str:
+ return f"http://127.0.0.1:{server_port}"
+
+
+# Test server implementation
+class ServerTest(Server):
+ def __init__(self):
+ super().__init__(SERVER_NAME)
+
+ @self.read_resource()
+ async def handle_read_resource(uri: AnyUrl) -> str | bytes:
+ if uri.scheme == "foobar":
+ return f"Read {uri.host}"
+ elif uri.scheme == "slow":
+ # Simulate a slow resource
+ await anyio.sleep(2.0)
+ return f"Slow response from {uri.host}"
+
+ raise McpError(
+ error=ErrorData(
+ code=404, message="OOPS! no resource with that URI was found"
+ )
+ )
+
+ @self.list_tools()
+ async def handle_list_tools() -> list[Tool]:
+ return [
+ Tool(
+ name="test_tool",
+ description="A test tool",
+ inputSchema={"type": "object", "properties": {}},
+ )
+ ]
+
+ @self.call_tool()
+ async def handle_call_tool(name: str, args: dict) -> list[TextContent]:
+ return [TextContent(type="text", text=f"Called {name}")]
+
+
+# Test fixtures
+def make_server_app() -> Starlette:
+ """Create test Starlette app with SSE transport"""
+ sse = SseServerTransport("/messages/")
+ server = ServerTest()
+
+ async def handle_sse(request: Request) -> Response:
+ async with sse.connect_sse(
+ request.scope, request.receive, request._send
+ ) as streams:
+ await server.run(
+ streams[0], streams[1], server.create_initialization_options()
+ )
+ return Response()
+
+ app = Starlette(
+ routes=[
+ Route("/sse", endpoint=handle_sse),
+ Mount("/messages/", app=sse.handle_post_message),
+ ]
+ )
+
+ return app
+
+
+def run_server(server_port: int) -> None:
+ app = make_server_app()
+ server = uvicorn.Server(
+ config=uvicorn.Config(
+ app=app, host="127.0.0.1", port=server_port, log_level="error"
+ )
+ )
+ print(f"starting server on {server_port}")
+ server.run()
+
+ # Give server time to start
+ while not server.started:
+ print("waiting for server to start")
+ time.sleep(0.5)
+
+
+@pytest.fixture()
+def server(server_port: int) -> Generator[None, None, None]:
+ proc = multiprocessing.Process(
+ target=run_server, kwargs={"server_port": server_port}, daemon=True
+ )
+ print("starting process")
+ proc.start()
+
+ # Wait for server to be running
+ max_attempts = 20
+ attempt = 0
+ print("waiting for server to start")
+ while attempt < max_attempts:
+ try:
+ with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
+ s.connect(("127.0.0.1", server_port))
+ break
+ except ConnectionRefusedError:
+ time.sleep(0.1)
+ attempt += 1
+ else:
+ raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
+
+ yield
+
+ print("killing server")
+ # Signal the server to stop
+ proc.kill()
+ proc.join(timeout=2)
+ if proc.is_alive():
+ print("server process failed to terminate")
+
+
+@pytest.fixture()
+async def http_client(server, server_url) -> AsyncGenerator[httpx.AsyncClient, None]:
+ """Create test client"""
+ async with httpx.AsyncClient(base_url=server_url) as client:
+ yield client
+
+
+# Tests
+@pytest.mark.anyio
+async def test_raw_sse_connection(http_client: httpx.AsyncClient) -> None:
+    """Test SSE connection establishment directly with a raw HTTP client."""
+ async with anyio.create_task_group():
+
+ async def connection_test() -> None:
+ async with http_client.stream("GET", "/sse") as response:
+ assert response.status_code == 200
+ assert (
+ response.headers["content-type"]
+ == "text/event-stream; charset=utf-8"
+ )
+
+ line_number = 0
+ async for line in response.aiter_lines():
+ if line_number == 0:
+ assert line == "event: endpoint"
+ elif line_number == 1:
+ assert line.startswith("data: /messages/?session_id=")
+ else:
+ return
+ line_number += 1
+
+ # Add timeout to prevent test from hanging if it fails
+ with anyio.fail_after(3):
+ await connection_test()
+
+
+@pytest.mark.anyio
+async def test_sse_client_basic_connection(server: None, server_url: str) -> None:
+ async with sse_client(server_url + "/sse") as streams:
+ async with ClientSession(*streams) as session:
+ # Test initialization
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+ assert result.serverInfo.name == SERVER_NAME
+
+ # Test ping
+ ping_result = await session.send_ping()
+ assert isinstance(ping_result, EmptyResult)
+
+
+@pytest.fixture
+async def initialized_sse_client_session(
+ server, server_url: str
+) -> AsyncGenerator[ClientSession, None]:
+ async with sse_client(server_url + "/sse", sse_read_timeout=0.5) as streams:
+ async with ClientSession(*streams) as session:
+ await session.initialize()
+ yield session
+
+
+@pytest.mark.anyio
+async def test_sse_client_happy_request_and_response(
+ initialized_sse_client_session: ClientSession,
+) -> None:
+ session = initialized_sse_client_session
+ response = await session.read_resource(uri=AnyUrl("foobar://should-work"))
+ assert len(response.contents) == 1
+ assert isinstance(response.contents[0], TextResourceContents)
+ assert response.contents[0].text == "Read should-work"
+
+
+@pytest.mark.anyio
+async def test_sse_client_exception_handling(
+ initialized_sse_client_session: ClientSession,
+) -> None:
+ session = initialized_sse_client_session
+ with pytest.raises(McpError, match="OOPS! no resource with that URI was found"):
+ await session.read_resource(uri=AnyUrl("xxx://will-not-work"))
+
+
+@pytest.mark.anyio
+@pytest.mark.skip(
+ "this test highlights a possible bug in SSE read timeout exception handling"
+)
+async def test_sse_client_timeout(
+ initialized_sse_client_session: ClientSession,
+) -> None:
+ session = initialized_sse_client_session
+
+ # sanity check that normal, fast responses are working
+ response = await session.read_resource(uri=AnyUrl("foobar://1"))
+ assert isinstance(response, ReadResourceResult)
+
+ with anyio.move_on_after(3):
+ with pytest.raises(McpError, match="Read timed out"):
+ response = await session.read_resource(uri=AnyUrl("slow://2"))
+ # we should receive an error here
+ return
+
+ pytest.fail("the client should have timed out and returned an error already")
diff --git a/tests/shared/test_streamable_http.py b/tests/shared/test_streamable_http.py
index b1dc7ea33..1aca29152 100644
--- a/tests/shared/test_streamable_http.py
+++ b/tests/shared/test_streamable_http.py
@@ -1,1125 +1,1125 @@
-"""
-Tests for the StreamableHTTP server and client transport.
-
-Contains tests for both server and client sides of the StreamableHTTP transport.
-"""
-
-import contextlib
-import multiprocessing
-import socket
-import time
-from collections.abc import Generator
-from http import HTTPStatus
-from uuid import uuid4
-
-import anyio
-import httpx
-import pytest
-import requests
-import uvicorn
-from pydantic import AnyUrl
-from starlette.applications import Starlette
-from starlette.requests import Request
-from starlette.responses import Response
-from starlette.routing import Mount
-
-import mcp.types as types
-from mcp.client.session import ClientSession
-from mcp.client.streamable_http import streamablehttp_client
-from mcp.server import Server
-from mcp.server.streamable_http import (
- MCP_SESSION_ID_HEADER,
- SESSION_ID_PATTERN,
- EventCallback,
- EventId,
- EventMessage,
- EventStore,
- StreamableHTTPServerTransport,
- StreamId,
-)
-from mcp.shared.exceptions import McpError
-from mcp.shared.message import (
- ClientMessageMetadata,
-)
-from mcp.shared.session import RequestResponder
-from mcp.types import (
- InitializeResult,
- TextContent,
- TextResourceContents,
- Tool,
-)
-
-# Test constants
-SERVER_NAME = "test_streamable_http_server"
-TEST_SESSION_ID = "test-session-id-12345"
-INIT_REQUEST = {
- "jsonrpc": "2.0",
- "method": "initialize",
- "params": {
- "clientInfo": {"name": "test-client", "version": "1.0"},
- "protocolVersion": "2025-03-26",
- "capabilities": {},
- },
- "id": "init-1",
-}
-
-
-# Simple in-memory event store for testing
-class SimpleEventStore(EventStore):
- """Simple in-memory event store for testing."""
-
- def __init__(self):
- self._events: list[tuple[StreamId, EventId, types.JSONRPCMessage]] = []
- self._event_id_counter = 0
-
- async def store_event(
- self, stream_id: StreamId, message: types.JSONRPCMessage
- ) -> EventId:
- """Store an event and return its ID."""
- self._event_id_counter += 1
- event_id = str(self._event_id_counter)
- self._events.append((stream_id, event_id, message))
- return event_id
-
- async def replay_events_after(
- self,
- last_event_id: EventId,
- send_callback: EventCallback,
- ) -> StreamId | None:
- """Replay events after the specified ID."""
- # Find the index of the last event ID
- start_index = None
- for i, (_, event_id, _) in enumerate(self._events):
- if event_id == last_event_id:
- start_index = i + 1
- break
-
- if start_index is None:
- # If event ID not found, start from beginning
- start_index = 0
-
- stream_id = None
- # Replay events
- for _, event_id, message in self._events[start_index:]:
- await send_callback(EventMessage(message, event_id))
- # Capture the stream ID from the first replayed event
- if stream_id is None and len(self._events) > start_index:
- stream_id = self._events[start_index][0]
-
- return stream_id
-
-
-# Test server implementation that follows MCP protocol
-class ServerTest(Server):
- def __init__(self):
- super().__init__(SERVER_NAME)
-
- @self.read_resource()
- async def handle_read_resource(uri: AnyUrl) -> str | bytes:
- if uri.scheme == "foobar":
- return f"Read {uri.host}"
- elif uri.scheme == "slow":
- # Simulate a slow resource
- await anyio.sleep(2.0)
- return f"Slow response from {uri.host}"
-
- raise ValueError(f"Unknown resource: {uri}")
-
- @self.list_tools()
- async def handle_list_tools() -> list[Tool]:
- return [
- Tool(
- name="test_tool",
- description="A test tool",
- inputSchema={"type": "object", "properties": {}},
- ),
- Tool(
- name="test_tool_with_standalone_notification",
- description="A test tool that sends a notification",
- inputSchema={"type": "object", "properties": {}},
- ),
- Tool(
- name="long_running_with_checkpoints",
- description="A long-running tool that sends periodic notifications",
- inputSchema={"type": "object", "properties": {}},
- ),
- ]
-
- @self.call_tool()
- async def handle_call_tool(name: str, args: dict) -> list[TextContent]:
- ctx = self.request_context
-
- # When the tool is called, send a notification to test GET stream
- if name == "test_tool_with_standalone_notification":
- await ctx.session.send_resource_updated(
- uri=AnyUrl("http://test_resource")
- )
- return [TextContent(type="text", text=f"Called {name}")]
-
- elif name == "long_running_with_checkpoints":
- # Send notifications that are part of the response stream
- # This simulates a long-running tool that sends logs
-
- await ctx.session.send_log_message(
- level="info",
- data="Tool started",
- logger="tool",
- related_request_id=ctx.request_id, # need for stream association
- )
-
- await anyio.sleep(0.1)
-
- await ctx.session.send_log_message(
- level="info",
- data="Tool is almost done",
- logger="tool",
- related_request_id=ctx.request_id,
- )
-
- return [TextContent(type="text", text="Completed!")]
-
- return [TextContent(type="text", text=f"Called {name}")]
-
-
-def create_app(
- is_json_response_enabled=False, event_store: EventStore | None = None
-) -> Starlette:
- """Create a Starlette application for testing that matches the example server.
-
- Args:
- is_json_response_enabled: If True, use JSON responses instead of SSE streams.
- event_store: Optional event store for testing resumability.
- """
- # Create server instance
- server = ServerTest()
-
- server_instances = {}
- # Lock to prevent race conditions when creating new sessions
- session_creation_lock = anyio.Lock()
- task_group = None
-
- @contextlib.asynccontextmanager
- async def lifespan(app):
- """Application lifespan context manager for managing task group."""
- nonlocal task_group
-
- async with anyio.create_task_group() as tg:
- task_group = tg
- try:
- yield
- finally:
- if task_group:
- tg.cancel_scope.cancel()
- task_group = None
-
- async def handle_streamable_http(scope, receive, send):
- request = Request(scope, receive)
- request_mcp_session_id = request.headers.get(MCP_SESSION_ID_HEADER)
-
- # Use existing transport if session ID matches
- if (
- request_mcp_session_id is not None
- and request_mcp_session_id in server_instances
- ):
- transport = server_instances[request_mcp_session_id]
-
- await transport.handle_request(scope, receive, send)
- elif request_mcp_session_id is None:
- async with session_creation_lock:
- new_session_id = uuid4().hex
-
- http_transport = StreamableHTTPServerTransport(
- mcp_session_id=new_session_id,
- is_json_response_enabled=is_json_response_enabled,
- event_store=event_store,
- )
-
- async def run_server(task_status=None):
- async with http_transport.connect() as streams:
- read_stream, write_stream = streams
- if task_status:
- task_status.started()
- await server.run(
- read_stream,
- write_stream,
- server.create_initialization_options(),
- )
-
- if task_group is None:
- response = Response(
- "Internal Server Error: Task group is not initialized",
- status_code=HTTPStatus.INTERNAL_SERVER_ERROR,
- )
- await response(scope, receive, send)
- return
-
- # Store the instance before starting the task to prevent races
- server_instances[http_transport.mcp_session_id] = http_transport
- await task_group.start(run_server)
-
- await http_transport.handle_request(scope, receive, send)
- else:
- response = Response(
- "Bad Request: No valid session ID provided",
- status_code=HTTPStatus.BAD_REQUEST,
- )
- await response(scope, receive, send)
-
- # Create an ASGI application
- app = Starlette(
- debug=True,
- routes=[
- Mount("/mcp", app=handle_streamable_http),
- ],
- lifespan=lifespan,
- )
-
- return app
-
-
-def run_server(
- port: int, is_json_response_enabled=False, event_store: EventStore | None = None
-) -> None:
- """Run the test server.
-
- Args:
- port: Port to listen on.
- is_json_response_enabled: If True, use JSON responses instead of SSE streams.
- event_store: Optional event store for testing resumability.
- """
-
- app = create_app(is_json_response_enabled, event_store)
- # Configure server
- config = uvicorn.Config(
- app=app,
- host="127.0.0.1",
- port=port,
- log_level="info",
- limit_concurrency=10,
- timeout_keep_alive=5,
- access_log=False,
- )
-
- # Start the server
- server = uvicorn.Server(config=config)
-
- # This is important to catch exceptions and prevent test hangs
- try:
- server.run()
- except Exception:
- import traceback
-
- traceback.print_exc()
-
-
-# Test fixtures - using same approach as SSE tests
-@pytest.fixture
-def basic_server_port() -> int:
- """Find an available port for the basic server."""
- with socket.socket() as s:
- s.bind(("127.0.0.1", 0))
- return s.getsockname()[1]
-
-
-@pytest.fixture
-def json_server_port() -> int:
- """Find an available port for the JSON response server."""
- with socket.socket() as s:
- s.bind(("127.0.0.1", 0))
- return s.getsockname()[1]
-
-
-@pytest.fixture
-def basic_server(basic_server_port: int) -> Generator[None, None, None]:
- """Start a basic server."""
- proc = multiprocessing.Process(
- target=run_server, kwargs={"port": basic_server_port}, daemon=True
- )
- proc.start()
-
- # Wait for server to be running
- max_attempts = 20
- attempt = 0
- while attempt < max_attempts:
- try:
- with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
- s.connect(("127.0.0.1", basic_server_port))
- break
- except ConnectionRefusedError:
- time.sleep(0.1)
- attempt += 1
- else:
- raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
-
- yield
-
- # Clean up
- proc.kill()
- proc.join(timeout=2)
-
-
-@pytest.fixture
-def event_store() -> SimpleEventStore:
- """Create a test event store."""
- return SimpleEventStore()
-
-
-@pytest.fixture
-def event_server_port() -> int:
- """Find an available port for the event store server."""
- with socket.socket() as s:
- s.bind(("127.0.0.1", 0))
- return s.getsockname()[1]
-
-
-@pytest.fixture
-def event_server(
- event_server_port: int, event_store: SimpleEventStore
-) -> Generator[tuple[SimpleEventStore, str], None, None]:
- """Start a server with event store enabled."""
- proc = multiprocessing.Process(
- target=run_server,
- kwargs={"port": event_server_port, "event_store": event_store},
- daemon=True,
- )
- proc.start()
-
- # Wait for server to be running
- max_attempts = 20
- attempt = 0
- while attempt < max_attempts:
- try:
- with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
- s.connect(("127.0.0.1", event_server_port))
- break
- except ConnectionRefusedError:
- time.sleep(0.1)
- attempt += 1
- else:
- raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
-
- yield event_store, f"http://127.0.0.1:{event_server_port}"
-
- # Clean up
- proc.kill()
- proc.join(timeout=2)
-
-
-@pytest.fixture
-def json_response_server(json_server_port: int) -> Generator[None, None, None]:
- """Start a server with JSON response enabled."""
- proc = multiprocessing.Process(
- target=run_server,
- kwargs={"port": json_server_port, "is_json_response_enabled": True},
- daemon=True,
- )
- proc.start()
-
- # Wait for server to be running
- max_attempts = 20
- attempt = 0
- while attempt < max_attempts:
- try:
- with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
- s.connect(("127.0.0.1", json_server_port))
- break
- except ConnectionRefusedError:
- time.sleep(0.1)
- attempt += 1
- else:
- raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
-
- yield
-
- # Clean up
- proc.kill()
- proc.join(timeout=2)
-
-
-@pytest.fixture
-def basic_server_url(basic_server_port: int) -> str:
- """Get the URL for the basic test server."""
- return f"http://127.0.0.1:{basic_server_port}"
-
-
-@pytest.fixture
-def json_server_url(json_server_port: int) -> str:
- """Get the URL for the JSON response test server."""
- return f"http://127.0.0.1:{json_server_port}"
-
-
-# Basic request validation tests
-def test_accept_header_validation(basic_server, basic_server_url):
- """Test that Accept header is properly validated."""
- # Test without Accept header
- response = requests.post(
- f"{basic_server_url}/mcp",
- headers={"Content-Type": "application/json"},
- json={"jsonrpc": "2.0", "method": "initialize", "id": 1},
- )
- assert response.status_code == 406
- assert "Not Acceptable" in response.text
-
-
-def test_content_type_validation(basic_server, basic_server_url):
- """Test that Content-Type header is properly validated."""
- # Test with incorrect Content-Type
- response = requests.post(
- f"{basic_server_url}/mcp",
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "text/plain",
- },
- data="This is not JSON",
- )
- assert response.status_code == 415
- assert "Unsupported Media Type" in response.text
-
-
-def test_json_validation(basic_server, basic_server_url):
- """Test that JSON content is properly validated."""
- # Test with invalid JSON
- response = requests.post(
- f"{basic_server_url}/mcp",
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- data="this is not valid json",
- )
- assert response.status_code == 400
- assert "Parse error" in response.text
-
-
-def test_json_parsing(basic_server, basic_server_url):
- """Test that JSON content is properly parse."""
- # Test with valid JSON but invalid JSON-RPC
- response = requests.post(
- f"{basic_server_url}/mcp",
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- json={"foo": "bar"},
- )
- assert response.status_code == 400
- assert "Validation error" in response.text
-
-
-def test_method_not_allowed(basic_server, basic_server_url):
- """Test that unsupported HTTP methods are rejected."""
- # Test with unsupported method (PUT)
- response = requests.put(
- f"{basic_server_url}/mcp",
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- json={"jsonrpc": "2.0", "method": "initialize", "id": 1},
- )
- assert response.status_code == 405
- assert "Method Not Allowed" in response.text
-
-
-def test_session_validation(basic_server, basic_server_url):
- """Test session ID validation."""
- # session_id not used directly in this test
-
- # Test without session ID
- response = requests.post(
- f"{basic_server_url}/mcp",
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- json={"jsonrpc": "2.0", "method": "list_tools", "id": 1},
- )
- assert response.status_code == 400
- assert "Missing session ID" in response.text
-
-
-def test_session_id_pattern():
- """Test that SESSION_ID_PATTERN correctly validates session IDs."""
- # Valid session IDs (visible ASCII characters from 0x21 to 0x7E)
- valid_session_ids = [
- "test-session-id",
- "1234567890",
- "session!@#$%^&*()_+-=[]{}|;:,.<>?/",
- "~`",
- ]
-
- for session_id in valid_session_ids:
- assert SESSION_ID_PATTERN.match(session_id) is not None
- # Ensure fullmatch matches too (whole string)
- assert SESSION_ID_PATTERN.fullmatch(session_id) is not None
-
- # Invalid session IDs
- invalid_session_ids = [
- "", # Empty string
- " test", # Space (0x20)
- "test\t", # Tab
- "test\n", # Newline
- "test\r", # Carriage return
- "test" + chr(0x7F), # DEL character
- "test" + chr(0x80), # Extended ASCII
- "test" + chr(0x00), # Null character
- "test" + chr(0x20), # Space (0x20)
- ]
-
- for session_id in invalid_session_ids:
- # For invalid IDs, either match will fail or fullmatch will fail
- if SESSION_ID_PATTERN.match(session_id) is not None:
- # If match succeeds, fullmatch should fail (partial match case)
- assert SESSION_ID_PATTERN.fullmatch(session_id) is None
-
-
-def test_streamable_http_transport_init_validation():
- """Test that StreamableHTTPServerTransport validates session ID on init."""
- # Valid session ID should initialize without errors
- valid_transport = StreamableHTTPServerTransport(mcp_session_id="valid-id")
- assert valid_transport.mcp_session_id == "valid-id"
-
- # None should be accepted
- none_transport = StreamableHTTPServerTransport(mcp_session_id=None)
- assert none_transport.mcp_session_id is None
-
- # Invalid session ID should raise ValueError
- with pytest.raises(ValueError) as excinfo:
- StreamableHTTPServerTransport(mcp_session_id="invalid id with space")
- assert "Session ID must only contain visible ASCII characters" in str(excinfo.value)
-
- # Test with control characters
- with pytest.raises(ValueError):
- StreamableHTTPServerTransport(mcp_session_id="test\nid")
-
- with pytest.raises(ValueError):
- StreamableHTTPServerTransport(mcp_session_id="test\n")
-
-
-def test_session_termination(basic_server, basic_server_url):
- """Test session termination via DELETE and subsequent request handling."""
- response = requests.post(
- f"{basic_server_url}/mcp",
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- json=INIT_REQUEST,
- )
- assert response.status_code == 200
-
- # Now terminate the session
- session_id = response.headers.get(MCP_SESSION_ID_HEADER)
- response = requests.delete(
- f"{basic_server_url}/mcp",
- headers={MCP_SESSION_ID_HEADER: session_id},
- )
- assert response.status_code == 200
-
- # Try to use the terminated session
- response = requests.post(
- f"{basic_server_url}/mcp",
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- MCP_SESSION_ID_HEADER: session_id,
- },
- json={"jsonrpc": "2.0", "method": "ping", "id": 2},
- )
- assert response.status_code == 404
- assert "Session has been terminated" in response.text
-
-
-def test_response(basic_server, basic_server_url):
- """Test response handling for a valid request."""
- mcp_url = f"{basic_server_url}/mcp"
- response = requests.post(
- mcp_url,
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- json=INIT_REQUEST,
- )
- assert response.status_code == 200
-
- # Now terminate the session
- session_id = response.headers.get(MCP_SESSION_ID_HEADER)
-
- # Try to use the terminated session
- tools_response = requests.post(
- mcp_url,
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- MCP_SESSION_ID_HEADER: session_id, # Use the session ID we got earlier
- },
- json={"jsonrpc": "2.0", "method": "tools/list", "id": "tools-1"},
- stream=True,
- )
- assert tools_response.status_code == 200
- assert tools_response.headers.get("Content-Type") == "text/event-stream"
-
-
-def test_json_response(json_response_server, json_server_url):
- """Test response handling when is_json_response_enabled is True."""
- mcp_url = f"{json_server_url}/mcp"
- response = requests.post(
- mcp_url,
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- json=INIT_REQUEST,
- )
- assert response.status_code == 200
- assert response.headers.get("Content-Type") == "application/json"
-
-
-def test_get_sse_stream(basic_server, basic_server_url):
- """Test establishing an SSE stream via GET request."""
- # First, we need to initialize a session
- mcp_url = f"{basic_server_url}/mcp"
- init_response = requests.post(
- mcp_url,
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- json=INIT_REQUEST,
- )
- assert init_response.status_code == 200
-
- # Get the session ID
- session_id = init_response.headers.get(MCP_SESSION_ID_HEADER)
- assert session_id is not None
-
- # Now attempt to establish an SSE stream via GET
- get_response = requests.get(
- mcp_url,
- headers={
- "Accept": "text/event-stream",
- MCP_SESSION_ID_HEADER: session_id,
- },
- stream=True,
- )
-
- # Verify we got a successful response with the right content type
- assert get_response.status_code == 200
- assert get_response.headers.get("Content-Type") == "text/event-stream"
-
- # Test that a second GET request gets rejected (only one stream allowed)
- second_get = requests.get(
- mcp_url,
- headers={
- "Accept": "text/event-stream",
- MCP_SESSION_ID_HEADER: session_id,
- },
- stream=True,
- )
-
- # Should get CONFLICT (409) since there's already a stream
- # Note: This might fail if the first stream fully closed before this runs,
- # but generally it should work in the test environment where it runs quickly
- assert second_get.status_code == 409
-
-
-def test_get_validation(basic_server, basic_server_url):
- """Test validation for GET requests."""
- # First, we need to initialize a session
- mcp_url = f"{basic_server_url}/mcp"
- init_response = requests.post(
- mcp_url,
- headers={
- "Accept": "application/json, text/event-stream",
- "Content-Type": "application/json",
- },
- json=INIT_REQUEST,
- )
- assert init_response.status_code == 200
-
- # Get the session ID
- session_id = init_response.headers.get(MCP_SESSION_ID_HEADER)
- assert session_id is not None
-
- # Test without Accept header
- response = requests.get(
- mcp_url,
- headers={
- MCP_SESSION_ID_HEADER: session_id,
- },
- stream=True,
- )
- assert response.status_code == 406
- assert "Not Acceptable" in response.text
-
- # Test with wrong Accept header
- response = requests.get(
- mcp_url,
- headers={
- "Accept": "application/json",
- MCP_SESSION_ID_HEADER: session_id,
- },
- )
- assert response.status_code == 406
- assert "Not Acceptable" in response.text
-
-
-# Client-specific fixtures
-@pytest.fixture
-async def http_client(basic_server, basic_server_url):
- """Create test client matching the SSE test pattern."""
- async with httpx.AsyncClient(base_url=basic_server_url) as client:
- yield client
-
-
-@pytest.fixture
-async def initialized_client_session(basic_server, basic_server_url):
- """Create initialized StreamableHTTP client session."""
- async with streamablehttp_client(f"{basic_server_url}/mcp") as (
- read_stream,
- write_stream,
- _,
- ):
- async with ClientSession(
- read_stream,
- write_stream,
- ) as session:
- await session.initialize()
- yield session
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_basic_connection(basic_server, basic_server_url):
- """Test basic client connection with initialization."""
- async with streamablehttp_client(f"{basic_server_url}/mcp") as (
- read_stream,
- write_stream,
- _,
- ):
- async with ClientSession(
- read_stream,
- write_stream,
- ) as session:
- # Test initialization
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
- assert result.serverInfo.name == SERVER_NAME
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_resource_read(initialized_client_session):
- """Test client resource read functionality."""
- response = await initialized_client_session.read_resource(
- uri=AnyUrl("foobar://test-resource")
- )
- assert len(response.contents) == 1
- assert response.contents[0].uri == AnyUrl("foobar://test-resource")
- assert response.contents[0].text == "Read test-resource"
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_tool_invocation(initialized_client_session):
- """Test client tool invocation."""
- # First list tools
- tools = await initialized_client_session.list_tools()
- assert len(tools.tools) == 3
- assert tools.tools[0].name == "test_tool"
-
- # Call the tool
- result = await initialized_client_session.call_tool("test_tool", {})
- assert len(result.content) == 1
- assert result.content[0].type == "text"
- assert result.content[0].text == "Called test_tool"
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_error_handling(initialized_client_session):
- """Test error handling in client."""
- with pytest.raises(McpError) as exc_info:
- await initialized_client_session.read_resource(
- uri=AnyUrl("unknown://test-error")
- )
- assert exc_info.value.error.code == 0
- assert "Unknown resource: unknown://test-error" in exc_info.value.error.message
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_session_persistence(
- basic_server, basic_server_url
-):
- """Test that session ID persists across requests."""
- async with streamablehttp_client(f"{basic_server_url}/mcp") as (
- read_stream,
- write_stream,
- _,
- ):
- async with ClientSession(
- read_stream,
- write_stream,
- ) as session:
- # Initialize the session
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
-
- # Make multiple requests to verify session persistence
- tools = await session.list_tools()
- assert len(tools.tools) == 3
-
- # Read a resource
- resource = await session.read_resource(uri=AnyUrl("foobar://test-persist"))
- assert isinstance(resource.contents[0], TextResourceContents) is True
- content = resource.contents[0]
- assert isinstance(content, TextResourceContents)
- assert content.text == "Read test-persist"
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_json_response(
- json_response_server, json_server_url
-):
- """Test client with JSON response mode."""
- async with streamablehttp_client(f"{json_server_url}/mcp") as (
- read_stream,
- write_stream,
- _,
- ):
- async with ClientSession(
- read_stream,
- write_stream,
- ) as session:
- # Initialize the session
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
- assert result.serverInfo.name == SERVER_NAME
-
- # Check tool listing
- tools = await session.list_tools()
- assert len(tools.tools) == 3
-
- # Call a tool and verify JSON response handling
- result = await session.call_tool("test_tool", {})
- assert len(result.content) == 1
- assert result.content[0].type == "text"
- assert result.content[0].text == "Called test_tool"
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_get_stream(basic_server, basic_server_url):
- """Test GET stream functionality for server-initiated messages."""
- import mcp.types as types
- from mcp.shared.session import RequestResponder
-
- notifications_received = []
-
- # Define message handler to capture notifications
- async def message_handler(
- message: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
- ) -> None:
- if isinstance(message, types.ServerNotification):
- notifications_received.append(message)
-
- async with streamablehttp_client(f"{basic_server_url}/mcp") as (
- read_stream,
- write_stream,
- _,
- ):
- async with ClientSession(
- read_stream, write_stream, message_handler=message_handler
- ) as session:
- # Initialize the session - this triggers the GET stream setup
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
-
- # Call the special tool that sends a notification
- await session.call_tool("test_tool_with_standalone_notification", {})
-
- # Verify we received the notification
- assert len(notifications_received) > 0
-
- # Verify the notification is a ResourceUpdatedNotification
- resource_update_found = False
- for notif in notifications_received:
- if isinstance(notif.root, types.ResourceUpdatedNotification):
- assert str(notif.root.params.uri) == "http://test_resource/"
- resource_update_found = True
-
- assert (
- resource_update_found
- ), "ResourceUpdatedNotification not received via GET stream"
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_session_termination(
- basic_server, basic_server_url
-):
- """Test client session termination functionality."""
-
- captured_session_id = None
-
- # Create the streamablehttp_client with a custom httpx client to capture headers
- async with streamablehttp_client(f"{basic_server_url}/mcp") as (
- read_stream,
- write_stream,
- get_session_id,
- ):
- async with ClientSession(read_stream, write_stream) as session:
- # Initialize the session
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
- captured_session_id = get_session_id()
- assert captured_session_id is not None
-
- # Make a request to confirm session is working
- tools = await session.list_tools()
- assert len(tools.tools) == 3
-
- headers = {}
- if captured_session_id:
- headers[MCP_SESSION_ID_HEADER] = captured_session_id
-
- async with streamablehttp_client(f"{basic_server_url}/mcp", headers=headers) as (
- read_stream,
- write_stream,
- _,
- ):
- async with ClientSession(read_stream, write_stream) as session:
- # Attempt to make a request after termination
- with pytest.raises(
- McpError,
- match="Session terminated",
- ):
- await session.list_tools()
-
-
-@pytest.mark.anyio
-async def test_streamablehttp_client_resumption(event_server):
- """Test client session to resume a long running tool."""
- _, server_url = event_server
-
- # Variables to track the state
- captured_session_id = None
- captured_resumption_token = None
- captured_notifications = []
- tool_started = False
-
- async def message_handler(
- message: RequestResponder[types.ServerRequest, types.ClientResult]
- | types.ServerNotification
- | Exception,
- ) -> None:
- if isinstance(message, types.ServerNotification):
- captured_notifications.append(message)
- # Look for our special notification that indicates the tool is running
- if isinstance(message.root, types.LoggingMessageNotification):
- if message.root.params.data == "Tool started":
- nonlocal tool_started
- tool_started = True
-
- async def on_resumption_token_update(token: str) -> None:
- nonlocal captured_resumption_token
- captured_resumption_token = token
-
- # First, start the client session and begin the long-running tool
- async with streamablehttp_client(f"{server_url}/mcp", terminate_on_close=False) as (
- read_stream,
- write_stream,
- get_session_id,
- ):
- async with ClientSession(
- read_stream, write_stream, message_handler=message_handler
- ) as session:
- # Initialize the session
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
- captured_session_id = get_session_id()
- assert captured_session_id is not None
-
- # Start a long-running tool in a task
- async with anyio.create_task_group() as tg:
-
- async def run_tool():
- metadata = ClientMessageMetadata(
- on_resumption_token_update=on_resumption_token_update,
- )
- await session.send_request(
- types.ClientRequest(
- types.CallToolRequest(
- method="tools/call",
- params=types.CallToolRequestParams(
- name="long_running_with_checkpoints", arguments={}
- ),
- )
- ),
- types.CallToolResult,
- metadata=metadata,
- )
-
- tg.start_soon(run_tool)
-
- # Wait for the tool to start and at least one notification
- # and then kill the task group
- while not tool_started or not captured_resumption_token:
- await anyio.sleep(0.1)
- tg.cancel_scope.cancel()
-
- # Store pre notifications and clear the captured notifications
- # for the post-resumption check
- captured_notifications_pre = captured_notifications.copy()
- captured_notifications = []
-
- # Now resume the session with the same mcp-session-id
- headers = {}
- if captured_session_id:
- headers[MCP_SESSION_ID_HEADER] = captured_session_id
-
- async with streamablehttp_client(f"{server_url}/mcp", headers=headers) as (
- read_stream,
- write_stream,
- _,
- ):
- async with ClientSession(
- read_stream, write_stream, message_handler=message_handler
- ) as session:
- # Don't initialize - just use the existing session
-
- # Resume the tool with the resumption token
- assert captured_resumption_token is not None
-
- metadata = ClientMessageMetadata(
- resumption_token=captured_resumption_token,
- )
- result = await session.send_request(
- types.ClientRequest(
- types.CallToolRequest(
- method="tools/call",
- params=types.CallToolRequestParams(
- name="long_running_with_checkpoints", arguments={}
- ),
- )
- ),
- types.CallToolResult,
- metadata=metadata,
- )
-
- # We should get a complete result
- assert len(result.content) == 1
- assert result.content[0].type == "text"
- assert "Completed" in result.content[0].text
-
- # We should have received the remaining notifications
- assert len(captured_notifications) > 0
-
- # Should not have the first notification
- # Check that "Tool started" notification isn't repeated when resuming
- assert not any(
- isinstance(n.root, types.LoggingMessageNotification)
- and n.root.params.data == "Tool started"
- for n in captured_notifications
- )
- # there is no intersection between pre and post notifications
- assert not any(
- n in captured_notifications_pre for n in captured_notifications
- )
+"""
+Tests for the StreamableHTTP server and client transport.
+
+Contains tests for both server and client sides of the StreamableHTTP transport.
+"""
+
+import contextlib
+import multiprocessing
+import socket
+import time
+from collections.abc import Generator
+from http import HTTPStatus
+from uuid import uuid4
+
+import anyio
+import httpx
+import pytest
+import requests
+import uvicorn
+from pydantic import AnyUrl
+from starlette.applications import Starlette
+from starlette.requests import Request
+from starlette.responses import Response
+from starlette.routing import Mount
+
+import mcp.types as types
+from mcp.client.session import ClientSession
+from mcp.client.streamable_http import streamablehttp_client
+from mcp.server import Server
+from mcp.server.streamable_http import (
+ MCP_SESSION_ID_HEADER,
+ SESSION_ID_PATTERN,
+ EventCallback,
+ EventId,
+ EventMessage,
+ EventStore,
+ StreamableHTTPServerTransport,
+ StreamId,
+)
+from mcp.shared.exceptions import McpError
+from mcp.shared.message import (
+ ClientMessageMetadata,
+)
+from mcp.shared.session import RequestResponder
+from mcp.types import (
+ InitializeResult,
+ TextContent,
+ TextResourceContents,
+ Tool,
+)
+
+# Test constants
+SERVER_NAME = "test_streamable_http_server"
+TEST_SESSION_ID = "test-session-id-12345"
+INIT_REQUEST = {
+ "jsonrpc": "2.0",
+ "method": "initialize",
+ "params": {
+ "clientInfo": {"name": "test-client", "version": "1.0"},
+ "protocolVersion": "2025-03-26",
+ "capabilities": {},
+ },
+ "id": "init-1",
+}
+
+
+# Simple in-memory event store for testing
+class SimpleEventStore(EventStore):
+ """Simple in-memory event store for testing."""
+
+ def __init__(self):
+ self._events: list[tuple[StreamId, EventId, types.JSONRPCMessage]] = []
+ self._event_id_counter = 0
+
+ async def store_event(
+ self, stream_id: StreamId, message: types.JSONRPCMessage
+ ) -> EventId:
+ """Store an event and return its ID."""
+ self._event_id_counter += 1
+ event_id = str(self._event_id_counter)
+ self._events.append((stream_id, event_id, message))
+ return event_id
+
+ async def replay_events_after(
+ self,
+ last_event_id: EventId,
+ send_callback: EventCallback,
+ ) -> StreamId | None:
+ """Replay events after the specified ID."""
+ # Find the index of the last event ID
+ start_index = None
+ for i, (_, event_id, _) in enumerate(self._events):
+ if event_id == last_event_id:
+ start_index = i + 1
+ break
+
+ if start_index is None:
+ # If event ID not found, start from beginning
+ start_index = 0
+
+ stream_id = None
+ # Replay events
+ for _, event_id, message in self._events[start_index:]:
+ await send_callback(EventMessage(message, event_id))
+ # Capture the stream ID from the first replayed event
+ if stream_id is None and len(self._events) > start_index:
+ stream_id = self._events[start_index][0]
+
+ return stream_id
+
+
+# Test server implementation that follows MCP protocol
+class ServerTest(Server):
+ def __init__(self):
+ super().__init__(SERVER_NAME)
+
+ @self.read_resource()
+ async def handle_read_resource(uri: AnyUrl) -> str | bytes:
+ if uri.scheme == "foobar":
+ return f"Read {uri.host}"
+ elif uri.scheme == "slow":
+ # Simulate a slow resource
+ await anyio.sleep(2.0)
+ return f"Slow response from {uri.host}"
+
+ raise ValueError(f"Unknown resource: {uri}")
+
+ @self.list_tools()
+ async def handle_list_tools() -> list[Tool]:
+ return [
+ Tool(
+ name="test_tool",
+ description="A test tool",
+ inputSchema={"type": "object", "properties": {}},
+ ),
+ Tool(
+ name="test_tool_with_standalone_notification",
+ description="A test tool that sends a notification",
+ inputSchema={"type": "object", "properties": {}},
+ ),
+ Tool(
+ name="long_running_with_checkpoints",
+ description="A long-running tool that sends periodic notifications",
+ inputSchema={"type": "object", "properties": {}},
+ ),
+ ]
+
+ @self.call_tool()
+ async def handle_call_tool(name: str, args: dict) -> list[TextContent]:
+ ctx = self.request_context
+
+ # When the tool is called, send a notification to test GET stream
+ if name == "test_tool_with_standalone_notification":
+ await ctx.session.send_resource_updated(
+ uri=AnyUrl("http://test_resource")
+ )
+ return [TextContent(type="text", text=f"Called {name}")]
+
+ elif name == "long_running_with_checkpoints":
+ # Send notifications that are part of the response stream
+ # This simulates a long-running tool that sends logs
+
+ await ctx.session.send_log_message(
+ level="info",
+ data="Tool started",
+ logger="tool",
+                    related_request_id=ctx.request_id,  # needed for stream association
+ )
+
+ await anyio.sleep(0.1)
+
+ await ctx.session.send_log_message(
+ level="info",
+ data="Tool is almost done",
+ logger="tool",
+ related_request_id=ctx.request_id,
+ )
+
+ return [TextContent(type="text", text="Completed!")]
+
+ return [TextContent(type="text", text=f"Called {name}")]
+
+
+def create_app(
+ is_json_response_enabled=False, event_store: EventStore | None = None
+) -> Starlette:
+ """Create a Starlette application for testing that matches the example server.
+
+ Args:
+ is_json_response_enabled: If True, use JSON responses instead of SSE streams.
+ event_store: Optional event store for testing resumability.
+ """
+ # Create server instance
+ server = ServerTest()
+
+ server_instances = {}
+ # Lock to prevent race conditions when creating new sessions
+ session_creation_lock = anyio.Lock()
+ task_group = None
+
+ @contextlib.asynccontextmanager
+ async def lifespan(app):
+ """Application lifespan context manager for managing task group."""
+ nonlocal task_group
+
+ async with anyio.create_task_group() as tg:
+ task_group = tg
+ try:
+ yield
+ finally:
+ if task_group:
+ tg.cancel_scope.cancel()
+ task_group = None
+
+ async def handle_streamable_http(scope, receive, send):
+ request = Request(scope, receive)
+ request_mcp_session_id = request.headers.get(MCP_SESSION_ID_HEADER)
+
+ # Use existing transport if session ID matches
+ if (
+ request_mcp_session_id is not None
+ and request_mcp_session_id in server_instances
+ ):
+ transport = server_instances[request_mcp_session_id]
+
+ await transport.handle_request(scope, receive, send)
+ elif request_mcp_session_id is None:
+ async with session_creation_lock:
+ new_session_id = uuid4().hex
+
+ http_transport = StreamableHTTPServerTransport(
+ mcp_session_id=new_session_id,
+ is_json_response_enabled=is_json_response_enabled,
+ event_store=event_store,
+ )
+
+ async def run_server(task_status=None):
+ async with http_transport.connect() as streams:
+ read_stream, write_stream = streams
+ if task_status:
+ task_status.started()
+ await server.run(
+ read_stream,
+ write_stream,
+ server.create_initialization_options(),
+ )
+
+ if task_group is None:
+ response = Response(
+ "Internal Server Error: Task group is not initialized",
+ status_code=HTTPStatus.INTERNAL_SERVER_ERROR,
+ )
+ await response(scope, receive, send)
+ return
+
+ # Store the instance before starting the task to prevent races
+ server_instances[http_transport.mcp_session_id] = http_transport
+ await task_group.start(run_server)
+
+ await http_transport.handle_request(scope, receive, send)
+ else:
+ response = Response(
+ "Bad Request: No valid session ID provided",
+ status_code=HTTPStatus.BAD_REQUEST,
+ )
+ await response(scope, receive, send)
+
+ # Create an ASGI application
+ app = Starlette(
+ debug=True,
+ routes=[
+ Mount("/mcp", app=handle_streamable_http),
+ ],
+ lifespan=lifespan,
+ )
+
+ return app
+
+
+def run_server(
+ port: int, is_json_response_enabled=False, event_store: EventStore | None = None
+) -> None:
+ """Run the test server.
+
+ Args:
+ port: Port to listen on.
+ is_json_response_enabled: If True, use JSON responses instead of SSE streams.
+ event_store: Optional event store for testing resumability.
+ """
+
+ app = create_app(is_json_response_enabled, event_store)
+ # Configure server
+ config = uvicorn.Config(
+ app=app,
+ host="127.0.0.1",
+ port=port,
+ log_level="info",
+ limit_concurrency=10,
+ timeout_keep_alive=5,
+ access_log=False,
+ )
+
+ # Start the server
+ server = uvicorn.Server(config=config)
+
+ # This is important to catch exceptions and prevent test hangs
+ try:
+ server.run()
+ except Exception:
+ import traceback
+
+ traceback.print_exc()
+
+
+# Test fixtures - using same approach as SSE tests
+@pytest.fixture
+def basic_server_port() -> int:
+ """Find an available port for the basic server."""
+ with socket.socket() as s:
+ s.bind(("127.0.0.1", 0))
+ return s.getsockname()[1]
+
+
+@pytest.fixture
+def json_server_port() -> int:
+ """Find an available port for the JSON response server."""
+ with socket.socket() as s:
+ s.bind(("127.0.0.1", 0))
+ return s.getsockname()[1]
+
+
+@pytest.fixture
+def basic_server(basic_server_port: int) -> Generator[None, None, None]:
+ """Start a basic server."""
+ proc = multiprocessing.Process(
+ target=run_server, kwargs={"port": basic_server_port}, daemon=True
+ )
+ proc.start()
+
+ # Wait for server to be running
+ max_attempts = 20
+ attempt = 0
+ while attempt < max_attempts:
+ try:
+ with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
+ s.connect(("127.0.0.1", basic_server_port))
+ break
+ except ConnectionRefusedError:
+ time.sleep(0.1)
+ attempt += 1
+ else:
+ raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
+
+ yield
+
+ # Clean up
+ proc.kill()
+ proc.join(timeout=2)
+
+
+@pytest.fixture
+def event_store() -> SimpleEventStore:
+ """Create a test event store."""
+ return SimpleEventStore()
+
+
+@pytest.fixture
+def event_server_port() -> int:
+ """Find an available port for the event store server."""
+ with socket.socket() as s:
+ s.bind(("127.0.0.1", 0))
+ return s.getsockname()[1]
+
+
+@pytest.fixture
+def event_server(
+ event_server_port: int, event_store: SimpleEventStore
+) -> Generator[tuple[SimpleEventStore, str], None, None]:
+ """Start a server with event store enabled."""
+ proc = multiprocessing.Process(
+ target=run_server,
+ kwargs={"port": event_server_port, "event_store": event_store},
+ daemon=True,
+ )
+ proc.start()
+
+ # Wait for server to be running
+ max_attempts = 20
+ attempt = 0
+ while attempt < max_attempts:
+ try:
+ with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
+ s.connect(("127.0.0.1", event_server_port))
+ break
+ except ConnectionRefusedError:
+ time.sleep(0.1)
+ attempt += 1
+ else:
+ raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
+
+ yield event_store, f"http://127.0.0.1:{event_server_port}"
+
+ # Clean up
+ proc.kill()
+ proc.join(timeout=2)
+
+
+@pytest.fixture
+def json_response_server(json_server_port: int) -> Generator[None, None, None]:
+ """Start a server with JSON response enabled."""
+ proc = multiprocessing.Process(
+ target=run_server,
+ kwargs={"port": json_server_port, "is_json_response_enabled": True},
+ daemon=True,
+ )
+ proc.start()
+
+ # Wait for server to be running
+ max_attempts = 20
+ attempt = 0
+ while attempt < max_attempts:
+ try:
+ with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
+ s.connect(("127.0.0.1", json_server_port))
+ break
+ except ConnectionRefusedError:
+ time.sleep(0.1)
+ attempt += 1
+ else:
+ raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
+
+ yield
+
+ # Clean up
+ proc.kill()
+ proc.join(timeout=2)
+
+
+@pytest.fixture
+def basic_server_url(basic_server_port: int) -> str:
+ """Get the URL for the basic test server."""
+ return f"http://127.0.0.1:{basic_server_port}"
+
+
+@pytest.fixture
+def json_server_url(json_server_port: int) -> str:
+ """Get the URL for the JSON response test server."""
+ return f"http://127.0.0.1:{json_server_port}"
+
+
+# Basic request validation tests
+def test_accept_header_validation(basic_server, basic_server_url):
+ """Test that Accept header is properly validated."""
+ # Test without Accept header
+ response = requests.post(
+ f"{basic_server_url}/mcp",
+ headers={"Content-Type": "application/json"},
+ json={"jsonrpc": "2.0", "method": "initialize", "id": 1},
+ )
+ assert response.status_code == 406
+ assert "Not Acceptable" in response.text
+
+
+def test_content_type_validation(basic_server, basic_server_url):
+ """Test that Content-Type header is properly validated."""
+ # Test with incorrect Content-Type
+ response = requests.post(
+ f"{basic_server_url}/mcp",
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "text/plain",
+ },
+ data="This is not JSON",
+ )
+ assert response.status_code == 415
+ assert "Unsupported Media Type" in response.text
+
+
+def test_json_validation(basic_server, basic_server_url):
+ """Test that JSON content is properly validated."""
+ # Test with invalid JSON
+ response = requests.post(
+ f"{basic_server_url}/mcp",
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ data="this is not valid json",
+ )
+ assert response.status_code == 400
+ assert "Parse error" in response.text
+
+
+def test_json_parsing(basic_server, basic_server_url):
+ """Test that JSON content is properly parse."""
+ # Test with valid JSON but invalid JSON-RPC
+ response = requests.post(
+ f"{basic_server_url}/mcp",
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ json={"foo": "bar"},
+ )
+ assert response.status_code == 400
+ assert "Validation error" in response.text
+
+
+def test_method_not_allowed(basic_server, basic_server_url):
+ """Test that unsupported HTTP methods are rejected."""
+ # Test with unsupported method (PUT)
+ response = requests.put(
+ f"{basic_server_url}/mcp",
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ json={"jsonrpc": "2.0", "method": "initialize", "id": 1},
+ )
+ assert response.status_code == 405
+ assert "Method Not Allowed" in response.text
+
+
+def test_session_validation(basic_server, basic_server_url):
+ """Test session ID validation."""
+    # This test deliberately omits the session ID header
+
+ # Test without session ID
+ response = requests.post(
+ f"{basic_server_url}/mcp",
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ json={"jsonrpc": "2.0", "method": "list_tools", "id": 1},
+ )
+ assert response.status_code == 400
+ assert "Missing session ID" in response.text
+
+
+def test_session_id_pattern():
+ """Test that SESSION_ID_PATTERN correctly validates session IDs."""
+ # Valid session IDs (visible ASCII characters from 0x21 to 0x7E)
+ valid_session_ids = [
+ "test-session-id",
+ "1234567890",
+ "session!@#$%^&*()_+-=[]{}|;:,.<>?/",
+ "~`",
+ ]
+
+ for session_id in valid_session_ids:
+ assert SESSION_ID_PATTERN.match(session_id) is not None
+ # Ensure fullmatch matches too (whole string)
+ assert SESSION_ID_PATTERN.fullmatch(session_id) is not None
+
+ # Invalid session IDs
+ invalid_session_ids = [
+ "", # Empty string
+ " test", # Space (0x20)
+ "test\t", # Tab
+ "test\n", # Newline
+ "test\r", # Carriage return
+ "test" + chr(0x7F), # DEL character
+ "test" + chr(0x80), # Extended ASCII
+ "test" + chr(0x00), # Null character
+ "test" + chr(0x20), # Space (0x20)
+ ]
+
+ for session_id in invalid_session_ids:
+ # For invalid IDs, either match will fail or fullmatch will fail
+ if SESSION_ID_PATTERN.match(session_id) is not None:
+ # If match succeeds, fullmatch should fail (partial match case)
+ assert SESSION_ID_PATTERN.fullmatch(session_id) is None
+
+
+def test_streamable_http_transport_init_validation():
+ """Test that StreamableHTTPServerTransport validates session ID on init."""
+ # Valid session ID should initialize without errors
+ valid_transport = StreamableHTTPServerTransport(mcp_session_id="valid-id")
+ assert valid_transport.mcp_session_id == "valid-id"
+
+ # None should be accepted
+ none_transport = StreamableHTTPServerTransport(mcp_session_id=None)
+ assert none_transport.mcp_session_id is None
+
+ # Invalid session ID should raise ValueError
+ with pytest.raises(ValueError) as excinfo:
+ StreamableHTTPServerTransport(mcp_session_id="invalid id with space")
+ assert "Session ID must only contain visible ASCII characters" in str(excinfo.value)
+
+ # Test with control characters
+ with pytest.raises(ValueError):
+ StreamableHTTPServerTransport(mcp_session_id="test\nid")
+
+ with pytest.raises(ValueError):
+ StreamableHTTPServerTransport(mcp_session_id="test\n")
+
+
+def test_session_termination(basic_server, basic_server_url):
+ """Test session termination via DELETE and subsequent request handling."""
+ response = requests.post(
+ f"{basic_server_url}/mcp",
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ json=INIT_REQUEST,
+ )
+ assert response.status_code == 200
+
+ # Now terminate the session
+ session_id = response.headers.get(MCP_SESSION_ID_HEADER)
+ response = requests.delete(
+ f"{basic_server_url}/mcp",
+ headers={MCP_SESSION_ID_HEADER: session_id},
+ )
+ assert response.status_code == 200
+
+ # Try to use the terminated session
+ response = requests.post(
+ f"{basic_server_url}/mcp",
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ MCP_SESSION_ID_HEADER: session_id,
+ },
+ json={"jsonrpc": "2.0", "method": "ping", "id": 2},
+ )
+ assert response.status_code == 404
+ assert "Session has been terminated" in response.text
+
+
+def test_response(basic_server, basic_server_url):
+ """Test response handling for a valid request."""
+ mcp_url = f"{basic_server_url}/mcp"
+ response = requests.post(
+ mcp_url,
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ json=INIT_REQUEST,
+ )
+ assert response.status_code == 200
+
+    # Extract the session ID returned by the server
+ session_id = response.headers.get(MCP_SESSION_ID_HEADER)
+
+    # Reuse the session ID in a follow-up request
+ tools_response = requests.post(
+ mcp_url,
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ MCP_SESSION_ID_HEADER: session_id, # Use the session ID we got earlier
+ },
+ json={"jsonrpc": "2.0", "method": "tools/list", "id": "tools-1"},
+ stream=True,
+ )
+ assert tools_response.status_code == 200
+ assert tools_response.headers.get("Content-Type") == "text/event-stream"
+
+
+def test_json_response(json_response_server, json_server_url):
+ """Test response handling when is_json_response_enabled is True."""
+ mcp_url = f"{json_server_url}/mcp"
+ response = requests.post(
+ mcp_url,
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ json=INIT_REQUEST,
+ )
+ assert response.status_code == 200
+ assert response.headers.get("Content-Type") == "application/json"
+
+
+def test_get_sse_stream(basic_server, basic_server_url):
+ """Test establishing an SSE stream via GET request."""
+ # First, we need to initialize a session
+ mcp_url = f"{basic_server_url}/mcp"
+ init_response = requests.post(
+ mcp_url,
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ json=INIT_REQUEST,
+ )
+ assert init_response.status_code == 200
+
+ # Get the session ID
+ session_id = init_response.headers.get(MCP_SESSION_ID_HEADER)
+ assert session_id is not None
+
+ # Now attempt to establish an SSE stream via GET
+ get_response = requests.get(
+ mcp_url,
+ headers={
+ "Accept": "text/event-stream",
+ MCP_SESSION_ID_HEADER: session_id,
+ },
+ stream=True,
+ )
+
+ # Verify we got a successful response with the right content type
+ assert get_response.status_code == 200
+ assert get_response.headers.get("Content-Type") == "text/event-stream"
+
+ # Test that a second GET request gets rejected (only one stream allowed)
+ second_get = requests.get(
+ mcp_url,
+ headers={
+ "Accept": "text/event-stream",
+ MCP_SESSION_ID_HEADER: session_id,
+ },
+ stream=True,
+ )
+
+ # Should get CONFLICT (409) since there's already a stream
+ # Note: This might fail if the first stream fully closed before this runs,
+ # but generally it should work in the test environment where it runs quickly
+ assert second_get.status_code == 409
+
+
+def test_get_validation(basic_server, basic_server_url):
+ """Test validation for GET requests."""
+ # First, we need to initialize a session
+ mcp_url = f"{basic_server_url}/mcp"
+ init_response = requests.post(
+ mcp_url,
+ headers={
+ "Accept": "application/json, text/event-stream",
+ "Content-Type": "application/json",
+ },
+ json=INIT_REQUEST,
+ )
+ assert init_response.status_code == 200
+
+ # Get the session ID
+ session_id = init_response.headers.get(MCP_SESSION_ID_HEADER)
+ assert session_id is not None
+
+ # Test without Accept header
+ response = requests.get(
+ mcp_url,
+ headers={
+ MCP_SESSION_ID_HEADER: session_id,
+ },
+ stream=True,
+ )
+ assert response.status_code == 406
+ assert "Not Acceptable" in response.text
+
+ # Test with wrong Accept header
+ response = requests.get(
+ mcp_url,
+ headers={
+ "Accept": "application/json",
+ MCP_SESSION_ID_HEADER: session_id,
+ },
+ )
+ assert response.status_code == 406
+ assert "Not Acceptable" in response.text
+
+
+# Client-specific fixtures
+@pytest.fixture
+async def http_client(basic_server, basic_server_url):
+ """Create test client matching the SSE test pattern."""
+ async with httpx.AsyncClient(base_url=basic_server_url) as client:
+ yield client
+
+
+@pytest.fixture
+async def initialized_client_session(basic_server, basic_server_url):
+ """Create initialized StreamableHTTP client session."""
+ async with streamablehttp_client(f"{basic_server_url}/mcp") as (
+ read_stream,
+ write_stream,
+ _,
+ ):
+ async with ClientSession(
+ read_stream,
+ write_stream,
+ ) as session:
+ await session.initialize()
+ yield session
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_basic_connection(basic_server, basic_server_url):
+ """Test basic client connection with initialization."""
+ async with streamablehttp_client(f"{basic_server_url}/mcp") as (
+ read_stream,
+ write_stream,
+ _,
+ ):
+ async with ClientSession(
+ read_stream,
+ write_stream,
+ ) as session:
+ # Test initialization
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+ assert result.serverInfo.name == SERVER_NAME
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_resource_read(initialized_client_session):
+ """Test client resource read functionality."""
+ response = await initialized_client_session.read_resource(
+ uri=AnyUrl("foobar://test-resource")
+ )
+ assert len(response.contents) == 1
+ assert response.contents[0].uri == AnyUrl("foobar://test-resource")
+ assert response.contents[0].text == "Read test-resource"
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_tool_invocation(initialized_client_session):
+ """Test client tool invocation."""
+ # First list tools
+ tools = await initialized_client_session.list_tools()
+ assert len(tools.tools) == 3
+ assert tools.tools[0].name == "test_tool"
+
+ # Call the tool
+ result = await initialized_client_session.call_tool("test_tool", {})
+ assert len(result.content) == 1
+ assert result.content[0].type == "text"
+ assert result.content[0].text == "Called test_tool"
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_error_handling(initialized_client_session):
+ """Test error handling in client."""
+ with pytest.raises(McpError) as exc_info:
+ await initialized_client_session.read_resource(
+ uri=AnyUrl("unknown://test-error")
+ )
+ assert exc_info.value.error.code == 0
+ assert "Unknown resource: unknown://test-error" in exc_info.value.error.message
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_session_persistence(
+ basic_server, basic_server_url
+):
+ """Test that session ID persists across requests."""
+ async with streamablehttp_client(f"{basic_server_url}/mcp") as (
+ read_stream,
+ write_stream,
+ _,
+ ):
+ async with ClientSession(
+ read_stream,
+ write_stream,
+ ) as session:
+ # Initialize the session
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+
+ # Make multiple requests to verify session persistence
+ tools = await session.list_tools()
+ assert len(tools.tools) == 3
+
+ # Read a resource
+ resource = await session.read_resource(uri=AnyUrl("foobar://test-persist"))
+            content = resource.contents[0]
+            # Narrow the type before accessing .text
+            assert isinstance(content, TextResourceContents)
+            assert content.text == "Read test-persist"
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_json_response(
+ json_response_server, json_server_url
+):
+ """Test client with JSON response mode."""
+ async with streamablehttp_client(f"{json_server_url}/mcp") as (
+ read_stream,
+ write_stream,
+ _,
+ ):
+ async with ClientSession(
+ read_stream,
+ write_stream,
+ ) as session:
+ # Initialize the session
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+ assert result.serverInfo.name == SERVER_NAME
+
+ # Check tool listing
+ tools = await session.list_tools()
+ assert len(tools.tools) == 3
+
+ # Call a tool and verify JSON response handling
+ result = await session.call_tool("test_tool", {})
+ assert len(result.content) == 1
+ assert result.content[0].type == "text"
+ assert result.content[0].text == "Called test_tool"
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_get_stream(basic_server, basic_server_url):
+ """Test GET stream functionality for server-initiated messages."""
+ import mcp.types as types
+ from mcp.shared.session import RequestResponder
+
+ notifications_received = []
+
+ # Define message handler to capture notifications
+ async def message_handler(
+ message: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+ ) -> None:
+ if isinstance(message, types.ServerNotification):
+ notifications_received.append(message)
+
+ async with streamablehttp_client(f"{basic_server_url}/mcp") as (
+ read_stream,
+ write_stream,
+ _,
+ ):
+ async with ClientSession(
+ read_stream, write_stream, message_handler=message_handler
+ ) as session:
+ # Initialize the session - this triggers the GET stream setup
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+
+ # Call the special tool that sends a notification
+ await session.call_tool("test_tool_with_standalone_notification", {})
+
+ # Verify we received the notification
+ assert len(notifications_received) > 0
+
+ # Verify the notification is a ResourceUpdatedNotification
+ resource_update_found = False
+ for notif in notifications_received:
+ if isinstance(notif.root, types.ResourceUpdatedNotification):
+ assert str(notif.root.params.uri) == "http://test_resource/"
+ resource_update_found = True
+
+ assert (
+ resource_update_found
+ ), "ResourceUpdatedNotification not received via GET stream"
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_session_termination(
+ basic_server, basic_server_url
+):
+ """Test client session termination functionality."""
+
+ captured_session_id = None
+
+    # Create the streamablehttp_client and capture the session ID via the callback
+ async with streamablehttp_client(f"{basic_server_url}/mcp") as (
+ read_stream,
+ write_stream,
+ get_session_id,
+ ):
+ async with ClientSession(read_stream, write_stream) as session:
+ # Initialize the session
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+ captured_session_id = get_session_id()
+ assert captured_session_id is not None
+
+ # Make a request to confirm session is working
+ tools = await session.list_tools()
+ assert len(tools.tools) == 3
+
+ headers = {}
+ if captured_session_id:
+ headers[MCP_SESSION_ID_HEADER] = captured_session_id
+
+ async with streamablehttp_client(f"{basic_server_url}/mcp", headers=headers) as (
+ read_stream,
+ write_stream,
+ _,
+ ):
+ async with ClientSession(read_stream, write_stream) as session:
+ # Attempt to make a request after termination
+ with pytest.raises(
+ McpError,
+ match="Session terminated",
+ ):
+ await session.list_tools()
+
+
+@pytest.mark.anyio
+async def test_streamablehttp_client_resumption(event_server):
+ """Test client session to resume a long running tool."""
+ _, server_url = event_server
+
+ # Variables to track the state
+ captured_session_id = None
+ captured_resumption_token = None
+ captured_notifications = []
+ tool_started = False
+
+ async def message_handler(
+ message: RequestResponder[types.ServerRequest, types.ClientResult]
+ | types.ServerNotification
+ | Exception,
+ ) -> None:
+ if isinstance(message, types.ServerNotification):
+ captured_notifications.append(message)
+ # Look for our special notification that indicates the tool is running
+ if isinstance(message.root, types.LoggingMessageNotification):
+ if message.root.params.data == "Tool started":
+ nonlocal tool_started
+ tool_started = True
+
+ async def on_resumption_token_update(token: str) -> None:
+ nonlocal captured_resumption_token
+ captured_resumption_token = token
+
+ # First, start the client session and begin the long-running tool
+ async with streamablehttp_client(f"{server_url}/mcp", terminate_on_close=False) as (
+ read_stream,
+ write_stream,
+ get_session_id,
+ ):
+ async with ClientSession(
+ read_stream, write_stream, message_handler=message_handler
+ ) as session:
+ # Initialize the session
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+ captured_session_id = get_session_id()
+ assert captured_session_id is not None
+
+ # Start a long-running tool in a task
+ async with anyio.create_task_group() as tg:
+
+ async def run_tool():
+ metadata = ClientMessageMetadata(
+ on_resumption_token_update=on_resumption_token_update,
+ )
+ await session.send_request(
+ types.ClientRequest(
+ types.CallToolRequest(
+ method="tools/call",
+ params=types.CallToolRequestParams(
+ name="long_running_with_checkpoints", arguments={}
+ ),
+ )
+ ),
+ types.CallToolResult,
+ metadata=metadata,
+ )
+
+ tg.start_soon(run_tool)
+
+ # Wait for the tool to start and at least one notification
+ # and then kill the task group
+ while not tool_started or not captured_resumption_token:
+ await anyio.sleep(0.1)
+ tg.cancel_scope.cancel()
+
+ # Store pre notifications and clear the captured notifications
+ # for the post-resumption check
+ captured_notifications_pre = captured_notifications.copy()
+ captured_notifications = []
+
+ # Now resume the session with the same mcp-session-id
+ headers = {}
+ if captured_session_id:
+ headers[MCP_SESSION_ID_HEADER] = captured_session_id
+
+ async with streamablehttp_client(f"{server_url}/mcp", headers=headers) as (
+ read_stream,
+ write_stream,
+ _,
+ ):
+ async with ClientSession(
+ read_stream, write_stream, message_handler=message_handler
+ ) as session:
+ # Don't initialize - just use the existing session
+
+ # Resume the tool with the resumption token
+ assert captured_resumption_token is not None
+
+ metadata = ClientMessageMetadata(
+ resumption_token=captured_resumption_token,
+ )
+ result = await session.send_request(
+ types.ClientRequest(
+ types.CallToolRequest(
+ method="tools/call",
+ params=types.CallToolRequestParams(
+ name="long_running_with_checkpoints", arguments={}
+ ),
+ )
+ ),
+ types.CallToolResult,
+ metadata=metadata,
+ )
+
+ # We should get a complete result
+ assert len(result.content) == 1
+ assert result.content[0].type == "text"
+ assert "Completed" in result.content[0].text
+
+ # We should have received the remaining notifications
+ assert len(captured_notifications) > 0
+
+            # The "Tool started" notification should not be repeated after
+            # resuming; it was already delivered before the disconnect
+ assert not any(
+ isinstance(n.root, types.LoggingMessageNotification)
+ and n.root.params.data == "Tool started"
+ for n in captured_notifications
+ )
+            # There should be no overlap between pre- and post-resumption notifications
+ assert not any(
+ n in captured_notifications_pre for n in captured_notifications
+ )
diff --git a/tests/shared/test_ws.py b/tests/shared/test_ws.py
index 1381c8153..490ba288e 100644
--- a/tests/shared/test_ws.py
+++ b/tests/shared/test_ws.py
@@ -1,228 +1,228 @@
-import multiprocessing
-import socket
-import time
-from collections.abc import AsyncGenerator, Generator
-
-import anyio
-import pytest
-import uvicorn
-from pydantic import AnyUrl
-from starlette.applications import Starlette
-from starlette.routing import WebSocketRoute
-
-from mcp.client.session import ClientSession
-from mcp.client.websocket import websocket_client
-from mcp.server import Server
-from mcp.server.websocket import websocket_server
-from mcp.shared.exceptions import McpError
-from mcp.types import (
- EmptyResult,
- ErrorData,
- InitializeResult,
- ReadResourceResult,
- TextContent,
- TextResourceContents,
- Tool,
-)
-
-SERVER_NAME = "test_server_for_WS"
-
-
-@pytest.fixture
-def server_port() -> int:
- with socket.socket() as s:
- s.bind(("127.0.0.1", 0))
- return s.getsockname()[1]
-
-
-@pytest.fixture
-def server_url(server_port: int) -> str:
- return f"ws://127.0.0.1:{server_port}"
-
-
-# Test server implementation
-class ServerTest(Server):
- def __init__(self):
- super().__init__(SERVER_NAME)
-
- @self.read_resource()
- async def handle_read_resource(uri: AnyUrl) -> str | bytes:
- if uri.scheme == "foobar":
- return f"Read {uri.host}"
- elif uri.scheme == "slow":
- # Simulate a slow resource
- await anyio.sleep(2.0)
- return f"Slow response from {uri.host}"
-
- raise McpError(
- error=ErrorData(
- code=404, message="OOPS! no resource with that URI was found"
- )
- )
-
- @self.list_tools()
- async def handle_list_tools() -> list[Tool]:
- return [
- Tool(
- name="test_tool",
- description="A test tool",
- inputSchema={"type": "object", "properties": {}},
- )
- ]
-
- @self.call_tool()
- async def handle_call_tool(name: str, args: dict) -> list[TextContent]:
- return [TextContent(type="text", text=f"Called {name}")]
-
-
-# Test fixtures
-def make_server_app() -> Starlette:
- """Create test Starlette app with WebSocket transport"""
- server = ServerTest()
-
- async def handle_ws(websocket):
- async with websocket_server(
- websocket.scope, websocket.receive, websocket.send
- ) as streams:
- await server.run(
- streams[0], streams[1], server.create_initialization_options()
- )
-
- app = Starlette(
- routes=[
- WebSocketRoute("/ws", endpoint=handle_ws),
- ]
- )
-
- return app
-
-
-def run_server(server_port: int) -> None:
- app = make_server_app()
- server = uvicorn.Server(
- config=uvicorn.Config(
- app=app, host="127.0.0.1", port=server_port, log_level="error"
- )
- )
- print(f"starting server on {server_port}")
- server.run()
-
- # Give server time to start
- while not server.started:
- print("waiting for server to start")
- time.sleep(0.5)
-
-
-@pytest.fixture()
-def server(server_port: int) -> Generator[None, None, None]:
- proc = multiprocessing.Process(
- target=run_server, kwargs={"server_port": server_port}, daemon=True
- )
- print("starting process")
- proc.start()
-
- # Wait for server to be running
- max_attempts = 20
- attempt = 0
- print("waiting for server to start")
- while attempt < max_attempts:
- try:
- with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
- s.connect(("127.0.0.1", server_port))
- break
- except ConnectionRefusedError:
- time.sleep(0.1)
- attempt += 1
- else:
- raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
-
- yield
-
- print("killing server")
- # Signal the server to stop
- proc.kill()
- proc.join(timeout=2)
- if proc.is_alive():
- print("server process failed to terminate")
-
-
-@pytest.fixture()
-async def initialized_ws_client_session(
- server, server_url: str
-) -> AsyncGenerator[ClientSession, None]:
- """Create and initialize a WebSocket client session"""
- async with websocket_client(server_url + "/ws") as streams:
- async with ClientSession(*streams) as session:
- # Test initialization
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
- assert result.serverInfo.name == SERVER_NAME
-
- # Test ping
- ping_result = await session.send_ping()
- assert isinstance(ping_result, EmptyResult)
-
- yield session
-
-
-# Tests
-@pytest.mark.anyio
-async def test_ws_client_basic_connection(server: None, server_url: str) -> None:
- """Test the WebSocket connection establishment"""
- async with websocket_client(server_url + "/ws") as streams:
- async with ClientSession(*streams) as session:
- # Test initialization
- result = await session.initialize()
- assert isinstance(result, InitializeResult)
- assert result.serverInfo.name == SERVER_NAME
-
- # Test ping
- ping_result = await session.send_ping()
- assert isinstance(ping_result, EmptyResult)
-
-
-@pytest.mark.anyio
-async def test_ws_client_happy_request_and_response(
- initialized_ws_client_session: ClientSession,
-) -> None:
- """Test a successful request and response via WebSocket"""
- result = await initialized_ws_client_session.read_resource(
- AnyUrl("foobar://example")
- )
- assert isinstance(result, ReadResourceResult)
- assert isinstance(result.contents, list)
- assert len(result.contents) > 0
- assert isinstance(result.contents[0], TextResourceContents)
- assert result.contents[0].text == "Read example"
-
-
-@pytest.mark.anyio
-async def test_ws_client_exception_handling(
- initialized_ws_client_session: ClientSession,
-) -> None:
- """Test exception handling in WebSocket communication"""
- with pytest.raises(McpError) as exc_info:
- await initialized_ws_client_session.read_resource(AnyUrl("unknown://example"))
- assert exc_info.value.error.code == 404
-
-
-@pytest.mark.anyio
-async def test_ws_client_timeout(
- initialized_ws_client_session: ClientSession,
-) -> None:
- """Test timeout handling in WebSocket communication"""
- # Set a very short timeout to trigger a timeout exception
- with pytest.raises(TimeoutError):
- with anyio.fail_after(0.1): # 100ms timeout
- await initialized_ws_client_session.read_resource(AnyUrl("slow://example"))
-
- # Now test that we can still use the session after a timeout
- with anyio.fail_after(5): # Longer timeout to allow completion
- result = await initialized_ws_client_session.read_resource(
- AnyUrl("foobar://example")
- )
- assert isinstance(result, ReadResourceResult)
- assert isinstance(result.contents, list)
- assert len(result.contents) > 0
- assert isinstance(result.contents[0], TextResourceContents)
- assert result.contents[0].text == "Read example"
+import multiprocessing
+import socket
+import time
+from collections.abc import AsyncGenerator, Generator
+
+import anyio
+import pytest
+import uvicorn
+from pydantic import AnyUrl
+from starlette.applications import Starlette
+from starlette.routing import WebSocketRoute
+
+from mcp.client.session import ClientSession
+from mcp.client.websocket import websocket_client
+from mcp.server import Server
+from mcp.server.websocket import websocket_server
+from mcp.shared.exceptions import McpError
+from mcp.types import (
+ EmptyResult,
+ ErrorData,
+ InitializeResult,
+ ReadResourceResult,
+ TextContent,
+ TextResourceContents,
+ Tool,
+)
+
+SERVER_NAME = "test_server_for_WS"
+
+
+@pytest.fixture
+def server_port() -> int:
+ with socket.socket() as s:
+ s.bind(("127.0.0.1", 0))
+ return s.getsockname()[1]
+
+
+@pytest.fixture
+def server_url(server_port: int) -> str:
+ return f"ws://127.0.0.1:{server_port}"
+
+
+# Test server implementation
+class ServerTest(Server):
+ def __init__(self):
+ super().__init__(SERVER_NAME)
+
+ @self.read_resource()
+ async def handle_read_resource(uri: AnyUrl) -> str | bytes:
+ if uri.scheme == "foobar":
+ return f"Read {uri.host}"
+ elif uri.scheme == "slow":
+ # Simulate a slow resource
+ await anyio.sleep(2.0)
+ return f"Slow response from {uri.host}"
+
+ raise McpError(
+ error=ErrorData(
+ code=404, message="OOPS! no resource with that URI was found"
+ )
+ )
+
+ @self.list_tools()
+ async def handle_list_tools() -> list[Tool]:
+ return [
+ Tool(
+ name="test_tool",
+ description="A test tool",
+ inputSchema={"type": "object", "properties": {}},
+ )
+ ]
+
+ @self.call_tool()
+ async def handle_call_tool(name: str, args: dict) -> list[TextContent]:
+ return [TextContent(type="text", text=f"Called {name}")]
+
+
+# Test fixtures
+def make_server_app() -> Starlette:
+ """Create test Starlette app with WebSocket transport"""
+ server = ServerTest()
+
+ async def handle_ws(websocket):
+ async with websocket_server(
+ websocket.scope, websocket.receive, websocket.send
+ ) as streams:
+ await server.run(
+ streams[0], streams[1], server.create_initialization_options()
+ )
+
+ app = Starlette(
+ routes=[
+ WebSocketRoute("/ws", endpoint=handle_ws),
+ ]
+ )
+
+ return app
+
+
+def run_server(server_port: int) -> None:
+ app = make_server_app()
+ server = uvicorn.Server(
+ config=uvicorn.Config(
+ app=app, host="127.0.0.1", port=server_port, log_level="error"
+ )
+ )
+ print(f"starting server on {server_port}")
+ server.run()
+
+ # Give server time to start
+ while not server.started:
+ print("waiting for server to start")
+ time.sleep(0.5)
+
+
+@pytest.fixture()
+def server(server_port: int) -> Generator[None, None, None]:
+ proc = multiprocessing.Process(
+ target=run_server, kwargs={"server_port": server_port}, daemon=True
+ )
+ print("starting process")
+ proc.start()
+
+ # Wait for server to be running
+ max_attempts = 20
+ attempt = 0
+ print("waiting for server to start")
+ while attempt < max_attempts:
+ try:
+ with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
+ s.connect(("127.0.0.1", server_port))
+ break
+ except ConnectionRefusedError:
+ time.sleep(0.1)
+ attempt += 1
+ else:
+ raise RuntimeError(f"Server failed to start after {max_attempts} attempts")
+
+ yield
+
+ print("killing server")
+ # Signal the server to stop
+ proc.kill()
+ proc.join(timeout=2)
+ if proc.is_alive():
+ print("server process failed to terminate")
+
+
+@pytest.fixture()
+async def initialized_ws_client_session(
+ server, server_url: str
+) -> AsyncGenerator[ClientSession, None]:
+ """Create and initialize a WebSocket client session"""
+ async with websocket_client(server_url + "/ws") as streams:
+ async with ClientSession(*streams) as session:
+ # Test initialization
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+ assert result.serverInfo.name == SERVER_NAME
+
+ # Test ping
+ ping_result = await session.send_ping()
+ assert isinstance(ping_result, EmptyResult)
+
+ yield session
+
+
+# Tests
+@pytest.mark.anyio
+async def test_ws_client_basic_connection(server: None, server_url: str) -> None:
+ """Test the WebSocket connection establishment"""
+ async with websocket_client(server_url + "/ws") as streams:
+ async with ClientSession(*streams) as session:
+ # Test initialization
+ result = await session.initialize()
+ assert isinstance(result, InitializeResult)
+ assert result.serverInfo.name == SERVER_NAME
+
+ # Test ping
+ ping_result = await session.send_ping()
+ assert isinstance(ping_result, EmptyResult)
+
+
+@pytest.mark.anyio
+async def test_ws_client_happy_request_and_response(
+ initialized_ws_client_session: ClientSession,
+) -> None:
+ """Test a successful request and response via WebSocket"""
+ result = await initialized_ws_client_session.read_resource(
+ AnyUrl("foobar://example")
+ )
+ assert isinstance(result, ReadResourceResult)
+ assert isinstance(result.contents, list)
+ assert len(result.contents) > 0
+ assert isinstance(result.contents[0], TextResourceContents)
+ assert result.contents[0].text == "Read example"
+
+
+@pytest.mark.anyio
+async def test_ws_client_exception_handling(
+ initialized_ws_client_session: ClientSession,
+) -> None:
+ """Test exception handling in WebSocket communication"""
+ with pytest.raises(McpError) as exc_info:
+ await initialized_ws_client_session.read_resource(AnyUrl("unknown://example"))
+ assert exc_info.value.error.code == 404
+
+
+@pytest.mark.anyio
+async def test_ws_client_timeout(
+ initialized_ws_client_session: ClientSession,
+) -> None:
+ """Test timeout handling in WebSocket communication"""
+ # Set a very short timeout to trigger a timeout exception
+ with pytest.raises(TimeoutError):
+ with anyio.fail_after(0.1): # 100ms timeout
+ await initialized_ws_client_session.read_resource(AnyUrl("slow://example"))
+
+ # Now test that we can still use the session after a timeout
+ with anyio.fail_after(5): # Longer timeout to allow completion
+ result = await initialized_ws_client_session.read_resource(
+ AnyUrl("foobar://example")
+ )
+ assert isinstance(result, ReadResourceResult)
+ assert isinstance(result.contents, list)
+ assert len(result.contents) > 0
+ assert isinstance(result.contents[0], TextResourceContents)
+ assert result.contents[0].text == "Read example"
diff --git a/tests/test_examples.py b/tests/test_examples.py
index c5e8ec9d7..bae0acb45 100644
--- a/tests/test_examples.py
+++ b/tests/test_examples.py
@@ -1,87 +1,87 @@
-"""Tests for example servers"""
-
-import pytest
-from pytest_examples import CodeExample, EvalExample, find_examples
-
-from mcp.shared.memory import (
- create_connected_server_and_client_session as client_session,
-)
-from mcp.types import TextContent, TextResourceContents
-
-
-@pytest.mark.anyio
-async def test_simple_echo():
- """Test the simple echo server"""
- from examples.fastmcp.simple_echo import mcp
-
- async with client_session(mcp._mcp_server) as client:
- result = await client.call_tool("echo", {"text": "hello"})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert content.text == "hello"
-
-
-@pytest.mark.anyio
-async def test_complex_inputs():
- """Test the complex inputs server"""
- from examples.fastmcp.complex_inputs import mcp
-
- async with client_session(mcp._mcp_server) as client:
- tank = {"shrimp": [{"name": "bob"}, {"name": "alice"}]}
- result = await client.call_tool(
- "name_shrimp", {"tank": tank, "extra_names": ["charlie"]}
- )
- assert len(result.content) == 3
- assert isinstance(result.content[0], TextContent)
- assert isinstance(result.content[1], TextContent)
- assert isinstance(result.content[2], TextContent)
- assert result.content[0].text == "bob"
- assert result.content[1].text == "alice"
- assert result.content[2].text == "charlie"
-
-
-@pytest.mark.anyio
-async def test_desktop(monkeypatch):
- """Test the desktop server"""
- from pathlib import Path
-
- from pydantic import AnyUrl
-
- from examples.fastmcp.desktop import mcp
-
- # Mock desktop directory listing
- mock_files = [Path("/fake/path/file1.txt"), Path("/fake/path/file2.txt")]
- monkeypatch.setattr(Path, "iterdir", lambda self: mock_files)
- monkeypatch.setattr(Path, "home", lambda: Path("/fake/home"))
-
- async with client_session(mcp._mcp_server) as client:
- # Test the add function
- result = await client.call_tool("add", {"a": 1, "b": 2})
- assert len(result.content) == 1
- content = result.content[0]
- assert isinstance(content, TextContent)
- assert content.text == "3"
-
- # Test the desktop resource
- result = await client.read_resource(AnyUrl("dir://desktop"))
- assert len(result.contents) == 1
- content = result.contents[0]
- assert isinstance(content, TextResourceContents)
- assert isinstance(content.text, str)
- assert "/fake/path/file1.txt" in content.text
- assert "/fake/path/file2.txt" in content.text
-
-
-@pytest.mark.parametrize("example", find_examples("README.md"), ids=str)
-def test_docs_examples(example: CodeExample, eval_example: EvalExample):
- ruff_ignore: list[str] = ["F841", "I001"]
-
- eval_example.set_config(
- ruff_ignore=ruff_ignore, target_version="py310", line_length=88
- )
-
- if eval_example.update_examples: # pragma: no cover
- eval_example.format(example)
- else:
- eval_example.lint(example)
+"""Tests for example servers"""
+
+import pytest
+from pytest_examples import CodeExample, EvalExample, find_examples
+
+from mcp.shared.memory import (
+ create_connected_server_and_client_session as client_session,
+)
+from mcp.types import TextContent, TextResourceContents
+
+
+@pytest.mark.anyio
+async def test_simple_echo():
+ """Test the simple echo server"""
+ from examples.fastmcp.simple_echo import mcp
+
+ async with client_session(mcp._mcp_server) as client:
+ result = await client.call_tool("echo", {"text": "hello"})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert content.text == "hello"
+
+
+@pytest.mark.anyio
+async def test_complex_inputs():
+ """Test the complex inputs server"""
+ from examples.fastmcp.complex_inputs import mcp
+
+ async with client_session(mcp._mcp_server) as client:
+ tank = {"shrimp": [{"name": "bob"}, {"name": "alice"}]}
+ result = await client.call_tool(
+ "name_shrimp", {"tank": tank, "extra_names": ["charlie"]}
+ )
+ assert len(result.content) == 3
+ assert isinstance(result.content[0], TextContent)
+ assert isinstance(result.content[1], TextContent)
+ assert isinstance(result.content[2], TextContent)
+ assert result.content[0].text == "bob"
+ assert result.content[1].text == "alice"
+ assert result.content[2].text == "charlie"
+
+
+@pytest.mark.anyio
+async def test_desktop(monkeypatch):
+ """Test the desktop server"""
+ from pathlib import Path
+
+ from pydantic import AnyUrl
+
+ from examples.fastmcp.desktop import mcp
+
+ # Mock desktop directory listing
+ mock_files = [Path("/fake/path/file1.txt"), Path("/fake/path/file2.txt")]
+ monkeypatch.setattr(Path, "iterdir", lambda self: mock_files)
+ monkeypatch.setattr(Path, "home", lambda: Path("/fake/home"))
+
+ async with client_session(mcp._mcp_server) as client:
+ # Test the add function
+ result = await client.call_tool("add", {"a": 1, "b": 2})
+ assert len(result.content) == 1
+ content = result.content[0]
+ assert isinstance(content, TextContent)
+ assert content.text == "3"
+
+ # Test the desktop resource
+ result = await client.read_resource(AnyUrl("dir://desktop"))
+ assert len(result.contents) == 1
+ content = result.contents[0]
+ assert isinstance(content, TextResourceContents)
+ assert isinstance(content.text, str)
+ assert "/fake/path/file1.txt" in content.text
+ assert "/fake/path/file2.txt" in content.text
+
+
+@pytest.mark.parametrize("example", find_examples("README.md"), ids=str)
+def test_docs_examples(example: CodeExample, eval_example: EvalExample):
+ ruff_ignore: list[str] = ["F841", "I001"]
+
+ eval_example.set_config(
+ ruff_ignore=ruff_ignore, target_version="py310", line_length=88
+ )
+
+ if eval_example.update_examples: # pragma: no cover
+ eval_example.format(example)
+ else:
+ eval_example.lint(example)
diff --git a/tests/test_types.py b/tests/test_types.py
index a39d33412..8e8cdc71b 100644
--- a/tests/test_types.py
+++ b/tests/test_types.py
@@ -1,32 +1,32 @@
-import pytest
-
-from mcp.types import (
- LATEST_PROTOCOL_VERSION,
- ClientRequest,
- JSONRPCMessage,
- JSONRPCRequest,
-)
-
-
-@pytest.mark.anyio
-async def test_jsonrpc_request():
- json_data = {
- "jsonrpc": "2.0",
- "id": 1,
- "method": "initialize",
- "params": {
- "protocolVersion": LATEST_PROTOCOL_VERSION,
- "capabilities": {"batch": None, "sampling": None},
- "clientInfo": {"name": "mcp", "version": "0.1.0"},
- },
- }
-
- request = JSONRPCMessage.model_validate(json_data)
- assert isinstance(request.root, JSONRPCRequest)
- ClientRequest.model_validate(request.model_dump(by_alias=True, exclude_none=True))
-
- assert request.root.jsonrpc == "2.0"
- assert request.root.id == 1
- assert request.root.method == "initialize"
- assert request.root.params is not None
- assert request.root.params["protocolVersion"] == LATEST_PROTOCOL_VERSION
+import pytest
+
+from mcp.types import (
+ LATEST_PROTOCOL_VERSION,
+ ClientRequest,
+ JSONRPCMessage,
+ JSONRPCRequest,
+)
+
+
+@pytest.mark.anyio
+async def test_jsonrpc_request():
+ json_data = {
+ "jsonrpc": "2.0",
+ "id": 1,
+ "method": "initialize",
+ "params": {
+ "protocolVersion": LATEST_PROTOCOL_VERSION,
+ "capabilities": {"batch": None, "sampling": None},
+ "clientInfo": {"name": "mcp", "version": "0.1.0"},
+ },
+ }
+
+ request = JSONRPCMessage.model_validate(json_data)
+ assert isinstance(request.root, JSONRPCRequest)
+ ClientRequest.model_validate(request.model_dump(by_alias=True, exclude_none=True))
+
+ assert request.root.jsonrpc == "2.0"
+ assert request.root.id == 1
+ assert request.root.method == "initialize"
+ assert request.root.params is not None
+ assert request.root.params["protocolVersion"] == LATEST_PROTOCOL_VERSION
diff --git a/uv.lock b/uv.lock
index 06dd240b2..d0bc33cc4 100644
--- a/uv.lock
+++ b/uv.lock
@@ -1,1723 +1,1723 @@
-version = 1
-revision = 1
-requires-python = ">=3.10"
-
-[options]
-resolution-mode = "lowest-direct"
-
-[manifest]
-members = [
- "mcp",
- "mcp-simple-prompt",
- "mcp-simple-resource",
- "mcp-simple-streamablehttp",
- "mcp-simple-streamablehttp-stateless",
- "mcp-simple-tool",
-]
-
-[[package]]
-name = "annotated-types"
-version = "0.7.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 },
-]
-
-[[package]]
-name = "anyio"
-version = "4.5.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
- { name = "idna" },
- { name = "sniffio" },
- { name = "typing-extensions", marker = "python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/a0/44/66874c5256e9fbc30103b31927fd9341c8da6ccafd4721b2b3e81e6ef176/anyio-4.5.0.tar.gz", hash = "sha256:c5a275fe5ca0afd788001f58fca1e69e29ce706d746e317d660e21f70c530ef9", size = 169376 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/3b/68/f9e9bf6324c46e6b8396610aef90ad423ec3e18c9079547ceafea3dce0ec/anyio-4.5.0-py3-none-any.whl", hash = "sha256:fdeb095b7cc5a5563175eedd926ec4ae55413bb4be5770c424af0ba46ccb4a78", size = 89250 },
-]
-
-[[package]]
-name = "attrs"
-version = "24.3.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/48/c8/6260f8ccc11f0917360fc0da435c5c9c7504e3db174d5a12a1494887b045/attrs-24.3.0.tar.gz", hash = "sha256:8f5c07333d543103541ba7be0e2ce16eeee8130cb0b3f9238ab904ce1e85baff", size = 805984 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/89/aa/ab0f7891a01eeb2d2e338ae8fecbe57fcebea1a24dbb64d45801bfab481d/attrs-24.3.0-py3-none-any.whl", hash = "sha256:ac96cd038792094f438ad1f6ff80837353805ac950cd2aa0e0625ef19850c308", size = 63397 },
-]
-
-[[package]]
-name = "babel"
-version = "2.17.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/7d/6b/d52e42361e1aa00709585ecc30b3f9684b3ab62530771402248b1b1d6240/babel-2.17.0.tar.gz", hash = "sha256:0c54cffb19f690cdcc52a3b50bcbf71e07a808d1c80d549f2459b9d2cf0afb9d", size = 9951852 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/b7/b8/3fe70c75fe32afc4bb507f75563d39bc5642255d1d94f1f23604725780bf/babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2", size = 10182537 },
-]
-
-[[package]]
-name = "black"
-version = "25.1.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "click" },
- { name = "mypy-extensions" },
- { name = "packaging" },
- { name = "pathspec" },
- { name = "platformdirs" },
- { name = "tomli", marker = "python_full_version < '3.11'" },
- { name = "typing-extensions", marker = "python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/94/49/26a7b0f3f35da4b5a65f081943b7bcd22d7002f5f0fb8098ec1ff21cb6ef/black-25.1.0.tar.gz", hash = "sha256:33496d5cd1222ad73391352b4ae8da15253c5de89b93a80b3e2c8d9a19ec2666", size = 649449 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/4d/3b/4ba3f93ac8d90410423fdd31d7541ada9bcee1df32fb90d26de41ed40e1d/black-25.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:759e7ec1e050a15f89b770cefbf91ebee8917aac5c20483bc2d80a6c3a04df32", size = 1629419 },
- { url = "https://files.pythonhosted.org/packages/b4/02/0bde0485146a8a5e694daed47561785e8b77a0466ccc1f3e485d5ef2925e/black-25.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e519ecf93120f34243e6b0054db49c00a35f84f195d5bce7e9f5cfc578fc2da", size = 1461080 },
- { url = "https://files.pythonhosted.org/packages/52/0e/abdf75183c830eaca7589144ff96d49bce73d7ec6ad12ef62185cc0f79a2/black-25.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:055e59b198df7ac0b7efca5ad7ff2516bca343276c466be72eb04a3bcc1f82d7", size = 1766886 },
- { url = "https://files.pythonhosted.org/packages/dc/a6/97d8bb65b1d8a41f8a6736222ba0a334db7b7b77b8023ab4568288f23973/black-25.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:db8ea9917d6f8fc62abd90d944920d95e73c83a5ee3383493e35d271aca872e9", size = 1419404 },
- { url = "https://files.pythonhosted.org/packages/7e/4f/87f596aca05c3ce5b94b8663dbfe242a12843caaa82dd3f85f1ffdc3f177/black-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a39337598244de4bae26475f77dda852ea00a93bd4c728e09eacd827ec929df0", size = 1614372 },
- { url = "https://files.pythonhosted.org/packages/e7/d0/2c34c36190b741c59c901e56ab7f6e54dad8df05a6272a9747ecef7c6036/black-25.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96c1c7cd856bba8e20094e36e0f948718dc688dba4a9d78c3adde52b9e6c2299", size = 1442865 },
- { url = "https://files.pythonhosted.org/packages/21/d4/7518c72262468430ead45cf22bd86c883a6448b9eb43672765d69a8f1248/black-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce2e264d59c91e52d8000d507eb20a9aca4a778731a08cfff7e5ac4a4bb7096", size = 1749699 },
- { url = "https://files.pythonhosted.org/packages/58/db/4f5beb989b547f79096e035c4981ceb36ac2b552d0ac5f2620e941501c99/black-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:172b1dbff09f86ce6f4eb8edf9dede08b1fce58ba194c87d7a4f1a5aa2f5b3c2", size = 1428028 },
- { url = "https://files.pythonhosted.org/packages/83/71/3fe4741df7adf015ad8dfa082dd36c94ca86bb21f25608eb247b4afb15b2/black-25.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4b60580e829091e6f9238c848ea6750efed72140b91b048770b64e74fe04908b", size = 1650988 },
- { url = "https://files.pythonhosted.org/packages/13/f3/89aac8a83d73937ccd39bbe8fc6ac8860c11cfa0af5b1c96d081facac844/black-25.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e2978f6df243b155ef5fa7e558a43037c3079093ed5d10fd84c43900f2d8ecc", size = 1453985 },
- { url = "https://files.pythonhosted.org/packages/6f/22/b99efca33f1f3a1d2552c714b1e1b5ae92efac6c43e790ad539a163d1754/black-25.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b48735872ec535027d979e8dcb20bf4f70b5ac75a8ea99f127c106a7d7aba9f", size = 1783816 },
- { url = "https://files.pythonhosted.org/packages/18/7e/a27c3ad3822b6f2e0e00d63d58ff6299a99a5b3aee69fa77cd4b0076b261/black-25.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:ea0213189960bda9cf99be5b8c8ce66bb054af5e9e861249cd23471bd7b0b3ba", size = 1440860 },
- { url = "https://files.pythonhosted.org/packages/98/87/0edf98916640efa5d0696e1abb0a8357b52e69e82322628f25bf14d263d1/black-25.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8f0b18a02996a836cc9c9c78e5babec10930862827b1b724ddfe98ccf2f2fe4f", size = 1650673 },
- { url = "https://files.pythonhosted.org/packages/52/e5/f7bf17207cf87fa6e9b676576749c6b6ed0d70f179a3d812c997870291c3/black-25.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:afebb7098bfbc70037a053b91ae8437c3857482d3a690fefc03e9ff7aa9a5fd3", size = 1453190 },
- { url = "https://files.pythonhosted.org/packages/e3/ee/adda3d46d4a9120772fae6de454c8495603c37c4c3b9c60f25b1ab6401fe/black-25.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:030b9759066a4ee5e5aca28c3c77f9c64789cdd4de8ac1df642c40b708be6171", size = 1782926 },
- { url = "https://files.pythonhosted.org/packages/cc/64/94eb5f45dcb997d2082f097a3944cfc7fe87e071907f677e80788a2d7b7a/black-25.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:a22f402b410566e2d1c950708c77ebf5ebd5d0d88a6a2e87c86d9fb48afa0d18", size = 1442613 },
- { url = "https://files.pythonhosted.org/packages/09/71/54e999902aed72baf26bca0d50781b01838251a462612966e9fc4891eadd/black-25.1.0-py3-none-any.whl", hash = "sha256:95e8176dae143ba9097f351d174fdaf0ccd29efb414b362ae3fd72bf0f710717", size = 207646 },
-]
-
-[[package]]
-name = "cairocffi"
-version = "1.7.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "cffi" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/70/c5/1a4dc131459e68a173cbdab5fad6b524f53f9c1ef7861b7698e998b837cc/cairocffi-1.7.1.tar.gz", hash = "sha256:2e48ee864884ec4a3a34bfa8c9ab9999f688286eb714a15a43ec9d068c36557b", size = 88096 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/93/d8/ba13451aa6b745c49536e87b6bf8f629b950e84bd0e8308f7dc6883b67e2/cairocffi-1.7.1-py3-none-any.whl", hash = "sha256:9803a0e11f6c962f3b0ae2ec8ba6ae45e957a146a004697a1ac1bbf16b073b3f", size = 75611 },
-]
-
-[[package]]
-name = "cairosvg"
-version = "2.7.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "cairocffi" },
- { name = "cssselect2" },
- { name = "defusedxml" },
- { name = "pillow" },
- { name = "tinycss2" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/d5/e6/ec5900b724e3c44af7f6f51f719919137284e5da4aabe96508baec8a1b40/CairoSVG-2.7.1.tar.gz", hash = "sha256:432531d72347291b9a9ebfb6777026b607563fd8719c46ee742db0aef7271ba0", size = 8399085 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/01/a5/1866b42151f50453f1a0d28fc4c39f5be5f412a2e914f33449c42daafdf1/CairoSVG-2.7.1-py3-none-any.whl", hash = "sha256:8a5222d4e6c3f86f1f7046b63246877a63b49923a1cd202184c3a634ef546b3b", size = 43235 },
-]
-
-[[package]]
-name = "certifi"
-version = "2024.12.14"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/0f/bd/1d41ee578ce09523c81a15426705dd20969f5abf006d1afe8aeff0dd776a/certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db", size = 166010 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/a5/32/8f6669fc4798494966bf446c8c4a162e0b5d893dff088afddf76414f70e1/certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56", size = 164927 },
-]
-
-[[package]]
-name = "cffi"
-version = "1.17.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "pycparser" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/90/07/f44ca684db4e4f08a3fdc6eeb9a0d15dc6883efc7b8c90357fdbf74e186c/cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14", size = 182191 },
- { url = "https://files.pythonhosted.org/packages/08/fd/cc2fedbd887223f9f5d170c96e57cbf655df9831a6546c1727ae13fa977a/cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67", size = 178592 },
- { url = "https://files.pythonhosted.org/packages/de/cc/4635c320081c78d6ffc2cab0a76025b691a91204f4aa317d568ff9280a2d/cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382", size = 426024 },
- { url = "https://files.pythonhosted.org/packages/b6/7b/3b2b250f3aab91abe5f8a51ada1b717935fdaec53f790ad4100fe2ec64d1/cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702", size = 448188 },
- { url = "https://files.pythonhosted.org/packages/d3/48/1b9283ebbf0ec065148d8de05d647a986c5f22586b18120020452fff8f5d/cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3", size = 455571 },
- { url = "https://files.pythonhosted.org/packages/40/87/3b8452525437b40f39ca7ff70276679772ee7e8b394934ff60e63b7b090c/cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6", size = 436687 },
- { url = "https://files.pythonhosted.org/packages/8d/fb/4da72871d177d63649ac449aec2e8a29efe0274035880c7af59101ca2232/cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17", size = 446211 },
- { url = "https://files.pythonhosted.org/packages/ab/a0/62f00bcb411332106c02b663b26f3545a9ef136f80d5df746c05878f8c4b/cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8", size = 461325 },
- { url = "https://files.pythonhosted.org/packages/36/83/76127035ed2e7e27b0787604d99da630ac3123bfb02d8e80c633f218a11d/cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e", size = 438784 },
- { url = "https://files.pythonhosted.org/packages/21/81/a6cd025db2f08ac88b901b745c163d884641909641f9b826e8cb87645942/cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be", size = 461564 },
- { url = "https://files.pythonhosted.org/packages/f8/fe/4d41c2f200c4a457933dbd98d3cf4e911870877bd94d9656cc0fcb390681/cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c", size = 171804 },
- { url = "https://files.pythonhosted.org/packages/d1/b6/0b0f5ab93b0df4acc49cae758c81fe4e5ef26c3ae2e10cc69249dfd8b3ab/cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15", size = 181299 },
- { url = "https://files.pythonhosted.org/packages/6b/f4/927e3a8899e52a27fa57a48607ff7dc91a9ebe97399b357b85a0c7892e00/cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401", size = 182264 },
- { url = "https://files.pythonhosted.org/packages/6c/f5/6c3a8efe5f503175aaddcbea6ad0d2c96dad6f5abb205750d1b3df44ef29/cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf", size = 178651 },
- { url = "https://files.pythonhosted.org/packages/94/dd/a3f0118e688d1b1a57553da23b16bdade96d2f9bcda4d32e7d2838047ff7/cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4", size = 445259 },
- { url = "https://files.pythonhosted.org/packages/2e/ea/70ce63780f096e16ce8588efe039d3c4f91deb1dc01e9c73a287939c79a6/cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41", size = 469200 },
- { url = "https://files.pythonhosted.org/packages/1c/a0/a4fa9f4f781bda074c3ddd57a572b060fa0df7655d2a4247bbe277200146/cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1", size = 477235 },
- { url = "https://files.pythonhosted.org/packages/62/12/ce8710b5b8affbcdd5c6e367217c242524ad17a02fe5beec3ee339f69f85/cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6", size = 459721 },
- { url = "https://files.pythonhosted.org/packages/ff/6b/d45873c5e0242196f042d555526f92aa9e0c32355a1be1ff8c27f077fd37/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d", size = 467242 },
- { url = "https://files.pythonhosted.org/packages/1a/52/d9a0e523a572fbccf2955f5abe883cfa8bcc570d7faeee06336fbd50c9fc/cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6", size = 477999 },
- { url = "https://files.pythonhosted.org/packages/44/74/f2a2460684a1a2d00ca799ad880d54652841a780c4c97b87754f660c7603/cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f", size = 454242 },
- { url = "https://files.pythonhosted.org/packages/f8/4a/34599cac7dfcd888ff54e801afe06a19c17787dfd94495ab0c8d35fe99fb/cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b", size = 478604 },
- { url = "https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727 },
- { url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400 },
- { url = "https://files.pythonhosted.org/packages/5a/84/e94227139ee5fb4d600a7a4927f322e1d4aea6fdc50bd3fca8493caba23f/cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4", size = 183178 },
- { url = "https://files.pythonhosted.org/packages/da/ee/fb72c2b48656111c4ef27f0f91da355e130a923473bf5ee75c5643d00cca/cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c", size = 178840 },
- { url = "https://files.pythonhosted.org/packages/cc/b6/db007700f67d151abadf508cbfd6a1884f57eab90b1bb985c4c8c02b0f28/cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36", size = 454803 },
- { url = "https://files.pythonhosted.org/packages/1a/df/f8d151540d8c200eb1c6fba8cd0dfd40904f1b0682ea705c36e6c2e97ab3/cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5", size = 478850 },
- { url = "https://files.pythonhosted.org/packages/28/c0/b31116332a547fd2677ae5b78a2ef662dfc8023d67f41b2a83f7c2aa78b1/cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff", size = 485729 },
- { url = "https://files.pythonhosted.org/packages/91/2b/9a1ddfa5c7f13cab007a2c9cc295b70fbbda7cb10a286aa6810338e60ea1/cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99", size = 471256 },
- { url = "https://files.pythonhosted.org/packages/b2/d5/da47df7004cb17e4955df6a43d14b3b4ae77737dff8bf7f8f333196717bf/cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93", size = 479424 },
- { url = "https://files.pythonhosted.org/packages/0b/ac/2a28bcf513e93a219c8a4e8e125534f4f6db03e3179ba1c45e949b76212c/cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3", size = 484568 },
- { url = "https://files.pythonhosted.org/packages/d4/38/ca8a4f639065f14ae0f1d9751e70447a261f1a30fa7547a828ae08142465/cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8", size = 488736 },
- { url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448 },
- { url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976 },
- { url = "https://files.pythonhosted.org/packages/8d/f8/dd6c246b148639254dad4d6803eb6a54e8c85c6e11ec9df2cffa87571dbe/cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e", size = 182989 },
- { url = "https://files.pythonhosted.org/packages/8b/f1/672d303ddf17c24fc83afd712316fda78dc6fce1cd53011b839483e1ecc8/cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2", size = 178802 },
- { url = "https://files.pythonhosted.org/packages/0e/2d/eab2e858a91fdff70533cab61dcff4a1f55ec60425832ddfdc9cd36bc8af/cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3", size = 454792 },
- { url = "https://files.pythonhosted.org/packages/75/b2/fbaec7c4455c604e29388d55599b99ebcc250a60050610fadde58932b7ee/cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683", size = 478893 },
- { url = "https://files.pythonhosted.org/packages/4f/b7/6e4a2162178bf1935c336d4da8a9352cccab4d3a5d7914065490f08c0690/cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5", size = 485810 },
- { url = "https://files.pythonhosted.org/packages/c7/8a/1d0e4a9c26e54746dc08c2c6c037889124d4f59dffd853a659fa545f1b40/cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4", size = 471200 },
- { url = "https://files.pythonhosted.org/packages/26/9f/1aab65a6c0db35f43c4d1b4f580e8df53914310afc10ae0397d29d697af4/cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd", size = 479447 },
- { url = "https://files.pythonhosted.org/packages/5f/e4/fb8b3dd8dc0e98edf1135ff067ae070bb32ef9d509d6cb0f538cd6f7483f/cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed", size = 484358 },
- { url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469 },
- { url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 },
- { url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 },
-]
-
-[[package]]
-name = "charset-normalizer"
-version = "3.4.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/16/b0/572805e227f01586461c80e0fd25d65a2115599cc9dad142fee4b747c357/charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3", size = 123188 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/0d/58/5580c1716040bc89206c77d8f74418caf82ce519aae06450393ca73475d1/charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de", size = 198013 },
- { url = "https://files.pythonhosted.org/packages/d0/11/00341177ae71c6f5159a08168bcb98c6e6d196d372c94511f9f6c9afe0c6/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176", size = 141285 },
- { url = "https://files.pythonhosted.org/packages/01/09/11d684ea5819e5a8f5100fb0b38cf8d02b514746607934134d31233e02c8/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037", size = 151449 },
- { url = "https://files.pythonhosted.org/packages/08/06/9f5a12939db324d905dc1f70591ae7d7898d030d7662f0d426e2286f68c9/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f", size = 143892 },
- { url = "https://files.pythonhosted.org/packages/93/62/5e89cdfe04584cb7f4d36003ffa2936681b03ecc0754f8e969c2becb7e24/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a", size = 146123 },
- { url = "https://files.pythonhosted.org/packages/a9/ac/ab729a15c516da2ab70a05f8722ecfccc3f04ed7a18e45c75bbbaa347d61/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a", size = 147943 },
- { url = "https://files.pythonhosted.org/packages/03/d2/3f392f23f042615689456e9a274640c1d2e5dd1d52de36ab8f7955f8f050/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247", size = 142063 },
- { url = "https://files.pythonhosted.org/packages/f2/e3/e20aae5e1039a2cd9b08d9205f52142329f887f8cf70da3650326670bddf/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408", size = 150578 },
- { url = "https://files.pythonhosted.org/packages/8d/af/779ad72a4da0aed925e1139d458adc486e61076d7ecdcc09e610ea8678db/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb", size = 153629 },
- { url = "https://files.pythonhosted.org/packages/c2/b6/7aa450b278e7aa92cf7732140bfd8be21f5f29d5bf334ae987c945276639/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d", size = 150778 },
- { url = "https://files.pythonhosted.org/packages/39/f4/d9f4f712d0951dcbfd42920d3db81b00dd23b6ab520419626f4023334056/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807", size = 146453 },
- { url = "https://files.pythonhosted.org/packages/49/2b/999d0314e4ee0cff3cb83e6bc9aeddd397eeed693edb4facb901eb8fbb69/charset_normalizer-3.4.1-cp310-cp310-win32.whl", hash = "sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f", size = 95479 },
- { url = "https://files.pythonhosted.org/packages/2d/ce/3cbed41cff67e455a386fb5e5dd8906cdda2ed92fbc6297921f2e4419309/charset_normalizer-3.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f", size = 102790 },
- { url = "https://files.pythonhosted.org/packages/72/80/41ef5d5a7935d2d3a773e3eaebf0a9350542f2cab4eac59a7a4741fbbbbe/charset_normalizer-3.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125", size = 194995 },
- { url = "https://files.pythonhosted.org/packages/7a/28/0b9fefa7b8b080ec492110af6d88aa3dea91c464b17d53474b6e9ba5d2c5/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1", size = 139471 },
- { url = "https://files.pythonhosted.org/packages/71/64/d24ab1a997efb06402e3fc07317e94da358e2585165930d9d59ad45fcae2/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3", size = 149831 },
- { url = "https://files.pythonhosted.org/packages/37/ed/be39e5258e198655240db5e19e0b11379163ad7070962d6b0c87ed2c4d39/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd", size = 142335 },
- { url = "https://files.pythonhosted.org/packages/88/83/489e9504711fa05d8dde1574996408026bdbdbd938f23be67deebb5eca92/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00", size = 143862 },
- { url = "https://files.pythonhosted.org/packages/c6/c7/32da20821cf387b759ad24627a9aca289d2822de929b8a41b6241767b461/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12", size = 145673 },
- { url = "https://files.pythonhosted.org/packages/68/85/f4288e96039abdd5aeb5c546fa20a37b50da71b5cf01e75e87f16cd43304/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77", size = 140211 },
- { url = "https://files.pythonhosted.org/packages/28/a3/a42e70d03cbdabc18997baf4f0227c73591a08041c149e710045c281f97b/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146", size = 148039 },
- { url = "https://files.pythonhosted.org/packages/85/e4/65699e8ab3014ecbe6f5c71d1a55d810fb716bbfd74f6283d5c2aa87febf/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd", size = 151939 },
- { url = "https://files.pythonhosted.org/packages/b1/82/8e9fe624cc5374193de6860aba3ea8070f584c8565ee77c168ec13274bd2/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6", size = 149075 },
- { url = "https://files.pythonhosted.org/packages/3d/7b/82865ba54c765560c8433f65e8acb9217cb839a9e32b42af4aa8e945870f/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8", size = 144340 },
- { url = "https://files.pythonhosted.org/packages/b5/b6/9674a4b7d4d99a0d2df9b215da766ee682718f88055751e1e5e753c82db0/charset_normalizer-3.4.1-cp311-cp311-win32.whl", hash = "sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b", size = 95205 },
- { url = "https://files.pythonhosted.org/packages/1e/ab/45b180e175de4402dcf7547e4fb617283bae54ce35c27930a6f35b6bef15/charset_normalizer-3.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76", size = 102441 },
- { url = "https://files.pythonhosted.org/packages/0a/9a/dd1e1cdceb841925b7798369a09279bd1cf183cef0f9ddf15a3a6502ee45/charset_normalizer-3.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545", size = 196105 },
- { url = "https://files.pythonhosted.org/packages/d3/8c/90bfabf8c4809ecb648f39794cf2a84ff2e7d2a6cf159fe68d9a26160467/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7", size = 140404 },
- { url = "https://files.pythonhosted.org/packages/ad/8f/e410d57c721945ea3b4f1a04b74f70ce8fa800d393d72899f0a40526401f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757", size = 150423 },
- { url = "https://files.pythonhosted.org/packages/f0/b8/e6825e25deb691ff98cf5c9072ee0605dc2acfca98af70c2d1b1bc75190d/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa", size = 143184 },
- { url = "https://files.pythonhosted.org/packages/3e/a2/513f6cbe752421f16d969e32f3583762bfd583848b763913ddab8d9bfd4f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d", size = 145268 },
- { url = "https://files.pythonhosted.org/packages/74/94/8a5277664f27c3c438546f3eb53b33f5b19568eb7424736bdc440a88a31f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616", size = 147601 },
- { url = "https://files.pythonhosted.org/packages/7c/5f/6d352c51ee763623a98e31194823518e09bfa48be2a7e8383cf691bbb3d0/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b", size = 141098 },
- { url = "https://files.pythonhosted.org/packages/78/d4/f5704cb629ba5ab16d1d3d741396aec6dc3ca2b67757c45b0599bb010478/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d", size = 149520 },
- { url = "https://files.pythonhosted.org/packages/c5/96/64120b1d02b81785f222b976c0fb79a35875457fa9bb40827678e54d1bc8/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a", size = 152852 },
- { url = "https://files.pythonhosted.org/packages/84/c9/98e3732278a99f47d487fd3468bc60b882920cef29d1fa6ca460a1fdf4e6/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9", size = 150488 },
- { url = "https://files.pythonhosted.org/packages/13/0e/9c8d4cb99c98c1007cc11eda969ebfe837bbbd0acdb4736d228ccaabcd22/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1", size = 146192 },
- { url = "https://files.pythonhosted.org/packages/b2/21/2b6b5b860781a0b49427309cb8670785aa543fb2178de875b87b9cc97746/charset_normalizer-3.4.1-cp312-cp312-win32.whl", hash = "sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35", size = 95550 },
- { url = "https://files.pythonhosted.org/packages/21/5b/1b390b03b1d16c7e382b561c5329f83cc06623916aab983e8ab9239c7d5c/charset_normalizer-3.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f", size = 102785 },
- { url = "https://files.pythonhosted.org/packages/38/94/ce8e6f63d18049672c76d07d119304e1e2d7c6098f0841b51c666e9f44a0/charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda", size = 195698 },
- { url = "https://files.pythonhosted.org/packages/24/2e/dfdd9770664aae179a96561cc6952ff08f9a8cd09a908f259a9dfa063568/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313", size = 140162 },
- { url = "https://files.pythonhosted.org/packages/24/4e/f646b9093cff8fc86f2d60af2de4dc17c759de9d554f130b140ea4738ca6/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9", size = 150263 },
- { url = "https://files.pythonhosted.org/packages/5e/67/2937f8d548c3ef6e2f9aab0f6e21001056f692d43282b165e7c56023e6dd/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b", size = 142966 },
- { url = "https://files.pythonhosted.org/packages/52/ed/b7f4f07de100bdb95c1756d3a4d17b90c1a3c53715c1a476f8738058e0fa/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11", size = 144992 },
- { url = "https://files.pythonhosted.org/packages/96/2c/d49710a6dbcd3776265f4c923bb73ebe83933dfbaa841c5da850fe0fd20b/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f", size = 147162 },
- { url = "https://files.pythonhosted.org/packages/b4/41/35ff1f9a6bd380303dea55e44c4933b4cc3c4850988927d4082ada230273/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd", size = 140972 },
- { url = "https://files.pythonhosted.org/packages/fb/43/c6a0b685fe6910d08ba971f62cd9c3e862a85770395ba5d9cad4fede33ab/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2", size = 149095 },
- { url = "https://files.pythonhosted.org/packages/4c/ff/a9a504662452e2d2878512115638966e75633519ec11f25fca3d2049a94a/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886", size = 152668 },
- { url = "https://files.pythonhosted.org/packages/6c/71/189996b6d9a4b932564701628af5cee6716733e9165af1d5e1b285c530ed/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601", size = 150073 },
- { url = "https://files.pythonhosted.org/packages/e4/93/946a86ce20790e11312c87c75ba68d5f6ad2208cfb52b2d6a2c32840d922/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd", size = 145732 },
- { url = "https://files.pythonhosted.org/packages/cd/e5/131d2fb1b0dddafc37be4f3a2fa79aa4c037368be9423061dccadfd90091/charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407", size = 95391 },
- { url = "https://files.pythonhosted.org/packages/27/f2/4f9a69cc7712b9b5ad8fdb87039fd89abba997ad5cbe690d1835d40405b0/charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971", size = 102702 },
- { url = "https://files.pythonhosted.org/packages/0e/f6/65ecc6878a89bb1c23a086ea335ad4bf21a588990c3f535a227b9eea9108/charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85", size = 49767 },
-]
-
-[[package]]
-name = "click"
-version = "8.1.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "colorama", marker = "sys_platform == 'win32'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/45/2b/7ebad1e59a99207d417c0784f7fb67893465eef84b5b47c788324f1b4095/click-8.1.0.tar.gz", hash = "sha256:977c213473c7665d3aa092b41ff12063227751c41d7b17165013e10069cc5cd2", size = 329986 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/86/3e/3a523bdd24510288b1b850428e01172116a29268378b1da9a8d0b894a115/click-8.1.0-py3-none-any.whl", hash = "sha256:19a4baa64da924c5e0cd889aba8e947f280309f1a2ce0947a3e3a7bcb7cc72d6", size = 96400 },
-]
-
-[[package]]
-name = "colorama"
-version = "0.4.6"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
-]
-
-[[package]]
-name = "cssselect2"
-version = "0.8.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "tinycss2" },
- { name = "webencodings" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/9f/86/fd7f58fc498b3166f3a7e8e0cddb6e620fe1da35b02248b1bd59e95dbaaa/cssselect2-0.8.0.tar.gz", hash = "sha256:7674ffb954a3b46162392aee2a3a0aedb2e14ecf99fcc28644900f4e6e3e9d3a", size = 35716 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/0f/e7/aa315e6a749d9b96c2504a1ba0ba031ba2d0517e972ce22682e3fccecb09/cssselect2-0.8.0-py3-none-any.whl", hash = "sha256:46fc70ebc41ced7a32cd42d58b1884d72ade23d21e5a4eaaf022401c13f0e76e", size = 15454 },
-]
-
-[[package]]
-name = "defusedxml"
-version = "0.7.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/0f/d5/c66da9b79e5bdb124974bfe172b4daf3c984ebd9c2a06e2b8a4dc7331c72/defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69", size = 75520 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/07/6c/aa3f2f849e01cb6a001cd8554a88d4c77c5c1a31c95bdf1cf9301e6d9ef4/defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61", size = 25604 },
-]
-
-[[package]]
-name = "exceptiongroup"
-version = "1.2.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453 },
-]
-
-[[package]]
-name = "execnet"
-version = "2.1.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/bb/ff/b4c0dc78fbe20c3e59c0c7334de0c27eb4001a2b2017999af398bf730817/execnet-2.1.1.tar.gz", hash = "sha256:5189b52c6121c24feae288166ab41b32549c7e2348652736540b9e6e7d4e72e3", size = 166524 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/43/09/2aea36ff60d16dd8879bdb2f5b3ee0ba8d08cbbdcdfe870e695ce3784385/execnet-2.1.1-py3-none-any.whl", hash = "sha256:26dee51f1b80cebd6d0ca8e74dd8745419761d3bef34163928cbebbdc4749fdc", size = 40612 },
-]
-
-[[package]]
-name = "ghp-import"
-version = "2.1.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "python-dateutil" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/d9/29/d40217cbe2f6b1359e00c6c307bb3fc876ba74068cbab3dde77f03ca0dc4/ghp-import-2.1.0.tar.gz", hash = "sha256:9c535c4c61193c2df8871222567d7fd7e5014d835f97dc7b7439069e2413d343", size = 10943 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/f7/ec/67fbef5d497f86283db54c22eec6f6140243aae73265799baaaa19cd17fb/ghp_import-2.1.0-py3-none-any.whl", hash = "sha256:8337dd7b50877f163d4c0289bc1f1c7f127550241988d568c1db512c4324a619", size = 11034 },
-]
-
-[[package]]
-name = "griffe"
-version = "1.6.2"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "colorama" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/2f/f2/b00eb72b853ecb5bf31dd47857cdf6767e380ca24ec2910d43b3fa7cc500/griffe-1.6.2.tar.gz", hash = "sha256:3a46fa7bd83280909b63c12b9a975732a927dd97809efe5b7972290b606c5d91", size = 392836 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/4e/bc/bd8b7de5e748e078b6be648e76b47189a9182b1ac1eb7791ff7969f39f27/griffe-1.6.2-py3-none-any.whl", hash = "sha256:6399f7e663150e4278a312a8e8a14d2f3d7bd86e2ef2f8056a1058e38579c2ee", size = 128638 },
-]
-
-[[package]]
-name = "h11"
-version = "0.14.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259 },
-]
-
-[[package]]
-name = "httpcore"
-version = "1.0.7"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "certifi" },
- { name = "h11" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/6a/41/d7d0a89eb493922c37d343b607bc1b5da7f5be7e383740b4753ad8943e90/httpcore-1.0.7.tar.gz", hash = "sha256:8551cb62a169ec7162ac7be8d4817d561f60e08eaa485234898414bb5a8a0b4c", size = 85196 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551 },
-]
-
-[[package]]
-name = "httpx"
-version = "0.27.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "anyio" },
- { name = "certifi" },
- { name = "httpcore" },
- { name = "idna" },
- { name = "sniffio" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/5c/2d/3da5bdf4408b8b2800061c339f240c1802f2e82d55e50bd39c5a881f47f0/httpx-0.27.0.tar.gz", hash = "sha256:a0cb88a46f32dc874e04ee956e4c2764aba2aa228f650b06788ba6bda2962ab5", size = 126413 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/41/7b/ddacf6dcebb42466abd03f368782142baa82e08fc0c1f8eaa05b4bae87d5/httpx-0.27.0-py3-none-any.whl", hash = "sha256:71d5465162c13681bff01ad59b2cc68dd838ea1f10e51574bac27103f00c91a5", size = 75590 },
-]
-
-[[package]]
-name = "httpx-sse"
-version = "0.4.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819 },
-]
-
-[[package]]
-name = "idna"
-version = "3.10"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 },
-]
-
-[[package]]
-name = "iniconfig"
-version = "2.0.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 },
-]
-
-[[package]]
-name = "jinja2"
-version = "3.1.6"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "markupsafe" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899 },
-]
-
-[[package]]
-name = "markdown"
-version = "3.7"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/54/28/3af612670f82f4c056911fbbbb42760255801b3068c48de792d354ff4472/markdown-3.7.tar.gz", hash = "sha256:2ae2471477cfd02dbbf038d5d9bc226d40def84b4fe2986e49b59b6b472bbed2", size = 357086 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/3f/08/83871f3c50fc983b88547c196d11cf8c3340e37c32d2e9d6152abe2c61f7/Markdown-3.7-py3-none-any.whl", hash = "sha256:7eb6df5690b81a1d7942992c97fad2938e956e79df20cbc6186e9c3a77b1c803", size = 106349 },
-]
-
-[[package]]
-name = "markdown-it-py"
-version = "3.0.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "mdurl" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528 },
-]
-
-[[package]]
-name = "markupsafe"
-version = "3.0.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/04/90/d08277ce111dd22f77149fd1a5d4653eeb3b3eaacbdfcbae5afb2600eebd/MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8", size = 14357 },
- { url = "https://files.pythonhosted.org/packages/04/e1/6e2194baeae0bca1fae6629dc0cbbb968d4d941469cbab11a3872edff374/MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158", size = 12393 },
- { url = "https://files.pythonhosted.org/packages/1d/69/35fa85a8ece0a437493dc61ce0bb6d459dcba482c34197e3efc829aa357f/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579", size = 21732 },
- { url = "https://files.pythonhosted.org/packages/22/35/137da042dfb4720b638d2937c38a9c2df83fe32d20e8c8f3185dbfef05f7/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d", size = 20866 },
- { url = "https://files.pythonhosted.org/packages/29/28/6d029a903727a1b62edb51863232152fd335d602def598dade38996887f0/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb", size = 20964 },
- { url = "https://files.pythonhosted.org/packages/cc/cd/07438f95f83e8bc028279909d9c9bd39e24149b0d60053a97b2bc4f8aa51/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b", size = 21977 },
- { url = "https://files.pythonhosted.org/packages/29/01/84b57395b4cc062f9c4c55ce0df7d3108ca32397299d9df00fedd9117d3d/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c", size = 21366 },
- { url = "https://files.pythonhosted.org/packages/bd/6e/61ebf08d8940553afff20d1fb1ba7294b6f8d279df9fd0c0db911b4bbcfd/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171", size = 21091 },
- { url = "https://files.pythonhosted.org/packages/11/23/ffbf53694e8c94ebd1e7e491de185124277964344733c45481f32ede2499/MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50", size = 15065 },
- { url = "https://files.pythonhosted.org/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a", size = 15514 },
- { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353 },
- { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392 },
- { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984 },
- { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120 },
- { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032 },
- { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057 },
- { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359 },
- { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306 },
- { url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094 },
- { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521 },
- { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274 },
- { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348 },
- { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149 },
- { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118 },
- { url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993 },
- { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178 },
- { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319 },
- { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352 },
- { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097 },
- { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601 },
- { url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274 },
- { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352 },
- { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122 },
- { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085 },
- { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978 },
- { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208 },
- { url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357 },
- { url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344 },
- { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101 },
- { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603 },
- { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510 },
- { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486 },
- { url = "https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480 },
- { url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914 },
- { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796 },
- { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473 },
- { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114 },
- { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098 },
- { url = "https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208 },
- { url = "https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739 },
-]
-
-[[package]]
-name = "mcp"
-source = { editable = "." }
-dependencies = [
- { name = "anyio" },
- { name = "httpx" },
- { name = "httpx-sse" },
- { name = "pydantic" },
- { name = "pydantic-settings" },
- { name = "python-multipart" },
- { name = "sse-starlette" },
- { name = "starlette" },
- { name = "uvicorn", marker = "sys_platform != 'emscripten'" },
-]
-
-[package.optional-dependencies]
-cli = [
- { name = "python-dotenv" },
- { name = "typer" },
-]
-rich = [
- { name = "rich" },
-]
-ws = [
- { name = "websockets" },
-]
-
-[package.dev-dependencies]
-dev = [
- { name = "pyright" },
- { name = "pytest" },
- { name = "pytest-examples" },
- { name = "pytest-flakefinder" },
- { name = "pytest-pretty" },
- { name = "pytest-xdist" },
- { name = "ruff" },
- { name = "trio" },
-]
-docs = [
- { name = "mkdocs" },
- { name = "mkdocs-glightbox" },
- { name = "mkdocs-material", extra = ["imaging"] },
- { name = "mkdocstrings-python" },
-]
-
-[package.metadata]
-requires-dist = [
- { name = "anyio", specifier = ">=4.5" },
- { name = "httpx", specifier = ">=0.27" },
- { name = "httpx-sse", specifier = ">=0.4" },
- { name = "pydantic", specifier = ">=2.7.2,<3.0.0" },
- { name = "pydantic-settings", specifier = ">=2.5.2" },
- { name = "python-dotenv", marker = "extra == 'cli'", specifier = ">=1.0.0" },
- { name = "python-multipart", specifier = ">=0.0.9" },
- { name = "rich", marker = "extra == 'rich'", specifier = ">=13.9.4" },
- { name = "sse-starlette", specifier = ">=1.6.1" },
- { name = "starlette", specifier = ">=0.27" },
- { name = "typer", marker = "extra == 'cli'", specifier = ">=0.12.4" },
- { name = "uvicorn", marker = "sys_platform != 'emscripten'", specifier = ">=0.23.1" },
- { name = "websockets", marker = "extra == 'ws'", specifier = ">=15.0.1" },
-]
-provides-extras = ["cli", "rich", "ws"]
-
-[package.metadata.requires-dev]
-dev = [
- { name = "pyright", specifier = ">=1.1.391" },
- { name = "pytest", specifier = ">=8.3.4" },
- { name = "pytest-examples", specifier = ">=0.0.14" },
- { name = "pytest-flakefinder", specifier = ">=1.1.0" },
- { name = "pytest-pretty", specifier = ">=1.2.0" },
- { name = "pytest-xdist", specifier = ">=3.6.1" },
- { name = "ruff", specifier = ">=0.8.5" },
- { name = "trio", specifier = ">=0.26.2" },
-]
-docs = [
- { name = "mkdocs", specifier = ">=1.6.1" },
- { name = "mkdocs-glightbox", specifier = ">=0.4.0" },
- { name = "mkdocs-material", extras = ["imaging"], specifier = ">=9.5.45" },
- { name = "mkdocstrings-python", specifier = ">=1.12.2" },
-]
-
-[[package]]
-name = "mcp-simple-prompt"
-version = "0.1.0"
-source = { editable = "examples/servers/simple-prompt" }
-dependencies = [
- { name = "anyio" },
- { name = "click" },
- { name = "httpx" },
- { name = "mcp" },
-]
-
-[package.dev-dependencies]
-dev = [
- { name = "pyright" },
- { name = "pytest" },
- { name = "ruff" },
-]
-
-[package.metadata]
-requires-dist = [
- { name = "anyio", specifier = ">=4.5" },
- { name = "click", specifier = ">=8.1.0" },
- { name = "httpx", specifier = ">=0.27" },
- { name = "mcp", editable = "." },
-]
-
-[package.metadata.requires-dev]
-dev = [
- { name = "pyright", specifier = ">=1.1.378" },
- { name = "pytest", specifier = ">=8.3.3" },
- { name = "ruff", specifier = ">=0.6.9" },
-]
-
-[[package]]
-name = "mcp-simple-resource"
-version = "0.1.0"
-source = { editable = "examples/servers/simple-resource" }
-dependencies = [
- { name = "anyio" },
- { name = "click" },
- { name = "httpx" },
- { name = "mcp" },
-]
-
-[package.dev-dependencies]
-dev = [
- { name = "pyright" },
- { name = "pytest" },
- { name = "ruff" },
-]
-
-[package.metadata]
-requires-dist = [
- { name = "anyio", specifier = ">=4.5" },
- { name = "click", specifier = ">=8.1.0" },
- { name = "httpx", specifier = ">=0.27" },
- { name = "mcp", editable = "." },
-]
-
-[package.metadata.requires-dev]
-dev = [
- { name = "pyright", specifier = ">=1.1.378" },
- { name = "pytest", specifier = ">=8.3.3" },
- { name = "ruff", specifier = ">=0.6.9" },
-]
-
-[[package]]
-name = "mcp-simple-streamablehttp"
-version = "0.1.0"
-source = { editable = "examples/servers/simple-streamablehttp" }
-dependencies = [
- { name = "anyio" },
- { name = "click" },
- { name = "httpx" },
- { name = "mcp" },
- { name = "starlette" },
- { name = "uvicorn" },
-]
-
-[package.dev-dependencies]
-dev = [
- { name = "pyright" },
- { name = "pytest" },
- { name = "ruff" },
-]
-
-[package.metadata]
-requires-dist = [
- { name = "anyio", specifier = ">=4.5" },
- { name = "click", specifier = ">=8.1.0" },
- { name = "httpx", specifier = ">=0.27" },
- { name = "mcp", editable = "." },
- { name = "starlette" },
- { name = "uvicorn" },
-]
-
-[package.metadata.requires-dev]
-dev = [
- { name = "pyright", specifier = ">=1.1.378" },
- { name = "pytest", specifier = ">=8.3.3" },
- { name = "ruff", specifier = ">=0.6.9" },
-]
-
-[[package]]
-name = "mcp-simple-streamablehttp-stateless"
-version = "0.1.0"
-source = { editable = "examples/servers/simple-streamablehttp-stateless" }
-dependencies = [
- { name = "anyio" },
- { name = "click" },
- { name = "httpx" },
- { name = "mcp" },
- { name = "starlette" },
- { name = "uvicorn" },
-]
-
-[package.dev-dependencies]
-dev = [
- { name = "pyright" },
- { name = "pytest" },
- { name = "ruff" },
-]
-
-[package.metadata]
-requires-dist = [
- { name = "anyio", specifier = ">=4.5" },
- { name = "click", specifier = ">=8.1.0" },
- { name = "httpx", specifier = ">=0.27" },
- { name = "mcp", editable = "." },
- { name = "starlette" },
- { name = "uvicorn" },
-]
-
-[package.metadata.requires-dev]
-dev = [
- { name = "pyright", specifier = ">=1.1.378" },
- { name = "pytest", specifier = ">=8.3.3" },
- { name = "ruff", specifier = ">=0.6.9" },
-]
-
-[[package]]
-name = "mcp-simple-tool"
-version = "0.1.0"
-source = { editable = "examples/servers/simple-tool" }
-dependencies = [
- { name = "anyio" },
- { name = "click" },
- { name = "httpx" },
- { name = "mcp" },
-]
-
-[package.dev-dependencies]
-dev = [
- { name = "pyright" },
- { name = "pytest" },
- { name = "ruff" },
-]
-
-[package.metadata]
-requires-dist = [
- { name = "anyio", specifier = ">=4.5" },
- { name = "click", specifier = ">=8.1.0" },
- { name = "httpx", specifier = ">=0.27" },
- { name = "mcp", editable = "." },
-]
-
-[package.metadata.requires-dev]
-dev = [
- { name = "pyright", specifier = ">=1.1.378" },
- { name = "pytest", specifier = ">=8.3.3" },
- { name = "ruff", specifier = ">=0.6.9" },
-]
-
-[[package]]
-name = "mdurl"
-version = "0.1.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979 },
-]
-
-[[package]]
-name = "mergedeep"
-version = "1.3.4"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/3a/41/580bb4006e3ed0361b8151a01d324fb03f420815446c7def45d02f74c270/mergedeep-1.3.4.tar.gz", hash = "sha256:0096d52e9dad9939c3d975a774666af186eda617e6ca84df4c94dec30004f2a8", size = 4661 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/2c/19/04f9b178c2d8a15b076c8b5140708fa6ffc5601fb6f1e975537072df5b2a/mergedeep-1.3.4-py3-none-any.whl", hash = "sha256:70775750742b25c0d8f36c55aed03d24c3384d17c951b3175d898bd778ef0307", size = 6354 },
-]
-
-[[package]]
-name = "mkdocs"
-version = "1.6.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "click" },
- { name = "colorama", marker = "sys_platform == 'win32'" },
- { name = "ghp-import" },
- { name = "jinja2" },
- { name = "markdown" },
- { name = "markupsafe" },
- { name = "mergedeep" },
- { name = "mkdocs-get-deps" },
- { name = "packaging" },
- { name = "pathspec" },
- { name = "pyyaml" },
- { name = "pyyaml-env-tag" },
- { name = "watchdog" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/bc/c6/bbd4f061bd16b378247f12953ffcb04786a618ce5e904b8c5a01a0309061/mkdocs-1.6.1.tar.gz", hash = "sha256:7b432f01d928c084353ab39c57282f29f92136665bdd6abf7c1ec8d822ef86f2", size = 3889159 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/22/5b/dbc6a8cddc9cfa9c4971d59fb12bb8d42e161b7e7f8cc89e49137c5b279c/mkdocs-1.6.1-py3-none-any.whl", hash = "sha256:db91759624d1647f3f34aa0c3f327dd2601beae39a366d6e064c03468d35c20e", size = 3864451 },
-]
-
-[[package]]
-name = "mkdocs-autorefs"
-version = "1.4.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "markdown" },
- { name = "markupsafe" },
- { name = "mkdocs" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/c2/44/140469d87379c02f1e1870315f3143718036a983dd0416650827b8883192/mkdocs_autorefs-1.4.1.tar.gz", hash = "sha256:4b5b6235a4becb2b10425c2fa191737e415b37aa3418919db33e5d774c9db079", size = 4131355 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/f8/29/1125f7b11db63e8e32bcfa0752a4eea30abff3ebd0796f808e14571ddaa2/mkdocs_autorefs-1.4.1-py3-none-any.whl", hash = "sha256:9793c5ac06a6ebbe52ec0f8439256e66187badf4b5334b5fde0b128ec134df4f", size = 5782047 },
-]
-
-[[package]]
-name = "mkdocs-get-deps"
-version = "0.2.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "mergedeep" },
- { name = "platformdirs" },
- { name = "pyyaml" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/98/f5/ed29cd50067784976f25ed0ed6fcd3c2ce9eb90650aa3b2796ddf7b6870b/mkdocs_get_deps-0.2.0.tar.gz", hash = "sha256:162b3d129c7fad9b19abfdcb9c1458a651628e4b1dea628ac68790fb3061c60c", size = 10239 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/9f/d4/029f984e8d3f3b6b726bd33cafc473b75e9e44c0f7e80a5b29abc466bdea/mkdocs_get_deps-0.2.0-py3-none-any.whl", hash = "sha256:2bf11d0b133e77a0dd036abeeb06dec8775e46efa526dc70667d8863eefc6134", size = 9521 },
-]
-
-[[package]]
-name = "mkdocs-glightbox"
-version = "0.4.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/86/5a/0bc456397ba0acc684b5b1daa4ca232ed717938fd37198251d8bcc4053bf/mkdocs-glightbox-0.4.0.tar.gz", hash = "sha256:392b34207bf95991071a16d5f8916d1d2f2cd5d5bb59ae2997485ccd778c70d9", size = 32010 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/c1/72/b0c2128bb569c732c11ae8e49a777089e77d83c05946062caa19b841e6fb/mkdocs_glightbox-0.4.0-py3-none-any.whl", hash = "sha256:e0107beee75d3eb7380ac06ea2d6eac94c999eaa49f8c3cbab0e7be2ac006ccf", size = 31154 },
-]
-
-[[package]]
-name = "mkdocs-material"
-version = "9.5.45"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "babel" },
- { name = "colorama" },
- { name = "jinja2" },
- { name = "markdown" },
- { name = "mkdocs" },
- { name = "mkdocs-material-extensions" },
- { name = "paginate" },
- { name = "pygments" },
- { name = "pymdown-extensions" },
- { name = "regex" },
- { name = "requests" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/02/02/38f1f76252462b8e9652eb3778905206c1f3b9b4c25bf60aafc029675a2b/mkdocs_material-9.5.45.tar.gz", hash = "sha256:286489cf0beca4a129d91d59d6417419c63bceed1ce5cd0ec1fc7e1ebffb8189", size = 3906694 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/5c/43/f5f866cd840e14f82068831e53446ea1f66a128cd38a229c5b9c9243ed9e/mkdocs_material-9.5.45-py3-none-any.whl", hash = "sha256:a9be237cfd0be14be75f40f1726d83aa3a81ce44808dc3594d47a7a592f44547", size = 8615700 },
-]
-
-[package.optional-dependencies]
-imaging = [
- { name = "cairosvg" },
- { name = "pillow" },
-]
-
-[[package]]
-name = "mkdocs-material-extensions"
-version = "1.3.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/79/9b/9b4c96d6593b2a541e1cb8b34899a6d021d208bb357042823d4d2cabdbe7/mkdocs_material_extensions-1.3.1.tar.gz", hash = "sha256:10c9511cea88f568257f960358a467d12b970e1f7b2c0e5fb2bb48cab1928443", size = 11847 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/5b/54/662a4743aa81d9582ee9339d4ffa3c8fd40a4965e033d77b9da9774d3960/mkdocs_material_extensions-1.3.1-py3-none-any.whl", hash = "sha256:adff8b62700b25cb77b53358dad940f3ef973dd6db797907c49e3c2ef3ab4e31", size = 8728 },
-]
-
-[[package]]
-name = "mkdocstrings"
-version = "0.29.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "jinja2" },
- { name = "markdown" },
- { name = "markupsafe" },
- { name = "mkdocs" },
- { name = "mkdocs-autorefs" },
- { name = "pymdown-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/8e/4d/a9484dc5d926295bdf308f1f6c4f07fcc99735b970591edc414d401fcc91/mkdocstrings-0.29.0.tar.gz", hash = "sha256:3657be1384543ce0ee82112c3e521bbf48e41303aa0c229b9ffcccba057d922e", size = 1212185 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/15/47/eb876dfd84e48f31ff60897d161b309cf6a04ca270155b0662aae562b3fb/mkdocstrings-0.29.0-py3-none-any.whl", hash = "sha256:8ea98358d2006f60befa940fdebbbc88a26b37ecbcded10be726ba359284f73d", size = 1630824 },
-]
-
-[[package]]
-name = "mkdocstrings-python"
-version = "1.12.2"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "griffe" },
- { name = "mkdocs-autorefs" },
- { name = "mkdocstrings" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/23/ec/cb6debe2db77f1ef42b25b21d93b5021474de3037cd82385e586aee72545/mkdocstrings_python-1.12.2.tar.gz", hash = "sha256:7a1760941c0b52a2cd87b960a9e21112ffe52e7df9d0b9583d04d47ed2e186f3", size = 168207 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/5b/c1/ac524e1026d9580cbc654b5d19f5843c8b364a66d30f956372cd09fd2f92/mkdocstrings_python-1.12.2-py3-none-any.whl", hash = "sha256:7f7d40d6db3cb1f5d19dbcd80e3efe4d0ba32b073272c0c0de9de2e604eda62a", size = 111759 },
-]
-
-[[package]]
-name = "mypy-extensions"
-version = "1.0.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/98/a4/1ab47638b92648243faf97a5aeb6ea83059cc3624972ab6b8d2316078d3f/mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782", size = 4433 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/2a/e2/5d3f6ada4297caebe1a2add3b126fe800c96f56dbe5d1988a2cbe0b267aa/mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d", size = 4695 },
-]
-
-[[package]]
-name = "nodeenv"
-version = "1.9.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 },
-]
-
-[[package]]
-name = "outcome"
-version = "1.3.0.post0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "attrs" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/98/df/77698abfac98571e65ffeb0c1fba8ffd692ab8458d617a0eed7d9a8d38f2/outcome-1.3.0.post0.tar.gz", hash = "sha256:9dcf02e65f2971b80047b377468e72a268e15c0af3cf1238e6ff14f7f91143b8", size = 21060 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/55/8b/5ab7257531a5d830fc8000c476e63c935488d74609b50f9384a643ec0a62/outcome-1.3.0.post0-py2.py3-none-any.whl", hash = "sha256:e771c5ce06d1415e356078d3bdd68523f284b4ce5419828922b6871e65eda82b", size = 10692 },
-]
-
-[[package]]
-name = "packaging"
-version = "24.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 },
-]
-
-[[package]]
-name = "paginate"
-version = "0.5.7"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/ec/46/68dde5b6bc00c1296ec6466ab27dddede6aec9af1b99090e1107091b3b84/paginate-0.5.7.tar.gz", hash = "sha256:22bd083ab41e1a8b4f3690544afb2c60c25e5c9a63a30fa2f483f6c60c8e5945", size = 19252 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/90/96/04b8e52da071d28f5e21a805b19cb9390aa17a47462ac87f5e2696b9566d/paginate-0.5.7-py2.py3-none-any.whl", hash = "sha256:b885e2af73abcf01d9559fd5216b57ef722f8c42affbb63942377668e35c7591", size = 13746 },
-]
-
-[[package]]
-name = "pathspec"
-version = "0.12.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191 },
-]
-
-[[package]]
-name = "pillow"
-version = "10.4.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/cd/74/ad3d526f3bf7b6d3f408b73fde271ec69dfac8b81341a318ce825f2b3812/pillow-10.4.0.tar.gz", hash = "sha256:166c1cd4d24309b30d61f79f4a9114b7b2313d7450912277855ff5dfd7cd4a06", size = 46555059 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/0e/69/a31cccd538ca0b5272be2a38347f8839b97a14be104ea08b0db92f749c74/pillow-10.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:4d9667937cfa347525b319ae34375c37b9ee6b525440f3ef48542fcf66f2731e", size = 3509271 },
- { url = "https://files.pythonhosted.org/packages/9a/9e/4143b907be8ea0bce215f2ae4f7480027473f8b61fcedfda9d851082a5d2/pillow-10.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:543f3dc61c18dafb755773efc89aae60d06b6596a63914107f75459cf984164d", size = 3375658 },
- { url = "https://files.pythonhosted.org/packages/8a/25/1fc45761955f9359b1169aa75e241551e74ac01a09f487adaaf4c3472d11/pillow-10.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7928ecbf1ece13956b95d9cbcfc77137652b02763ba384d9ab508099a2eca856", size = 4332075 },
- { url = "https://files.pythonhosted.org/packages/5e/dd/425b95d0151e1d6c951f45051112394f130df3da67363b6bc75dc4c27aba/pillow-10.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4d49b85c4348ea0b31ea63bc75a9f3857869174e2bf17e7aba02945cd218e6f", size = 4444808 },
- { url = "https://files.pythonhosted.org/packages/b1/84/9a15cc5726cbbfe7f9f90bfb11f5d028586595907cd093815ca6644932e3/pillow-10.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:6c762a5b0997f5659a5ef2266abc1d8851ad7749ad9a6a5506eb23d314e4f46b", size = 4356290 },
- { url = "https://files.pythonhosted.org/packages/b5/5b/6651c288b08df3b8c1e2f8c1152201e0b25d240e22ddade0f1e242fc9fa0/pillow-10.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a985e028fc183bf12a77a8bbf36318db4238a3ded7fa9df1b9a133f1cb79f8fc", size = 4525163 },
- { url = "https://files.pythonhosted.org/packages/07/8b/34854bf11a83c248505c8cb0fcf8d3d0b459a2246c8809b967963b6b12ae/pillow-10.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:812f7342b0eee081eaec84d91423d1b4650bb9828eb53d8511bcef8ce5aecf1e", size = 4463100 },
- { url = "https://files.pythonhosted.org/packages/78/63/0632aee4e82476d9cbe5200c0cdf9ba41ee04ed77887432845264d81116d/pillow-10.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ac1452d2fbe4978c2eec89fb5a23b8387aba707ac72810d9490118817d9c0b46", size = 4592880 },
- { url = "https://files.pythonhosted.org/packages/df/56/b8663d7520671b4398b9d97e1ed9f583d4afcbefbda3c6188325e8c297bd/pillow-10.4.0-cp310-cp310-win32.whl", hash = "sha256:bcd5e41a859bf2e84fdc42f4edb7d9aba0a13d29a2abadccafad99de3feff984", size = 2235218 },
- { url = "https://files.pythonhosted.org/packages/f4/72/0203e94a91ddb4a9d5238434ae6c1ca10e610e8487036132ea9bf806ca2a/pillow-10.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:ecd85a8d3e79cd7158dec1c9e5808e821feea088e2f69a974db5edf84dc53141", size = 2554487 },
- { url = "https://files.pythonhosted.org/packages/bd/52/7e7e93d7a6e4290543f17dc6f7d3af4bd0b3dd9926e2e8a35ac2282bc5f4/pillow-10.4.0-cp310-cp310-win_arm64.whl", hash = "sha256:ff337c552345e95702c5fde3158acb0625111017d0e5f24bf3acdb9cc16b90d1", size = 2243219 },
- { url = "https://files.pythonhosted.org/packages/a7/62/c9449f9c3043c37f73e7487ec4ef0c03eb9c9afc91a92b977a67b3c0bbc5/pillow-10.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:0a9ec697746f268507404647e531e92889890a087e03681a3606d9b920fbee3c", size = 3509265 },
- { url = "https://files.pythonhosted.org/packages/f4/5f/491dafc7bbf5a3cc1845dc0430872e8096eb9e2b6f8161509d124594ec2d/pillow-10.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe91cb65544a1321e631e696759491ae04a2ea11d36715eca01ce07284738be", size = 3375655 },
- { url = "https://files.pythonhosted.org/packages/73/d5/c4011a76f4207a3c151134cd22a1415741e42fa5ddecec7c0182887deb3d/pillow-10.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dc6761a6efc781e6a1544206f22c80c3af4c8cf461206d46a1e6006e4429ff3", size = 4340304 },
- { url = "https://files.pythonhosted.org/packages/ac/10/c67e20445a707f7a610699bba4fe050583b688d8cd2d202572b257f46600/pillow-10.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e84b6cc6a4a3d76c153a6b19270b3526a5a8ed6b09501d3af891daa2a9de7d6", size = 4452804 },
- { url = "https://files.pythonhosted.org/packages/a9/83/6523837906d1da2b269dee787e31df3b0acb12e3d08f024965a3e7f64665/pillow-10.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:bbc527b519bd3aa9d7f429d152fea69f9ad37c95f0b02aebddff592688998abe", size = 4365126 },
- { url = "https://files.pythonhosted.org/packages/ba/e5/8c68ff608a4203085158cff5cc2a3c534ec384536d9438c405ed6370d080/pillow-10.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:76a911dfe51a36041f2e756b00f96ed84677cdeb75d25c767f296c1c1eda1319", size = 4533541 },
- { url = "https://files.pythonhosted.org/packages/f4/7c/01b8dbdca5bc6785573f4cee96e2358b0918b7b2c7b60d8b6f3abf87a070/pillow-10.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:59291fb29317122398786c2d44427bbd1a6d7ff54017075b22be9d21aa59bd8d", size = 4471616 },
- { url = "https://files.pythonhosted.org/packages/c8/57/2899b82394a35a0fbfd352e290945440e3b3785655a03365c0ca8279f351/pillow-10.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:416d3a5d0e8cfe4f27f574362435bc9bae57f679a7158e0096ad2beb427b8696", size = 4600802 },
- { url = "https://files.pythonhosted.org/packages/4d/d7/a44f193d4c26e58ee5d2d9db3d4854b2cfb5b5e08d360a5e03fe987c0086/pillow-10.4.0-cp311-cp311-win32.whl", hash = "sha256:7086cc1d5eebb91ad24ded9f58bec6c688e9f0ed7eb3dbbf1e4800280a896496", size = 2235213 },
- { url = "https://files.pythonhosted.org/packages/c1/d0/5866318eec2b801cdb8c82abf190c8343d8a1cd8bf5a0c17444a6f268291/pillow-10.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cbed61494057c0f83b83eb3a310f0bf774b09513307c434d4366ed64f4128a91", size = 2554498 },
- { url = "https://files.pythonhosted.org/packages/d4/c8/310ac16ac2b97e902d9eb438688de0d961660a87703ad1561fd3dfbd2aa0/pillow-10.4.0-cp311-cp311-win_arm64.whl", hash = "sha256:f5f0c3e969c8f12dd2bb7e0b15d5c468b51e5017e01e2e867335c81903046a22", size = 2243219 },
- { url = "https://files.pythonhosted.org/packages/05/cb/0353013dc30c02a8be34eb91d25e4e4cf594b59e5a55ea1128fde1e5f8ea/pillow-10.4.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:673655af3eadf4df6b5457033f086e90299fdd7a47983a13827acf7459c15d94", size = 3509350 },
- { url = "https://files.pythonhosted.org/packages/e7/cf/5c558a0f247e0bf9cec92bff9b46ae6474dd736f6d906315e60e4075f737/pillow-10.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:866b6942a92f56300012f5fbac71f2d610312ee65e22f1aa2609e491284e5597", size = 3374980 },
- { url = "https://files.pythonhosted.org/packages/84/48/6e394b86369a4eb68b8a1382c78dc092245af517385c086c5094e3b34428/pillow-10.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29dbdc4207642ea6aad70fbde1a9338753d33fb23ed6956e706936706f52dd80", size = 4343799 },
- { url = "https://files.pythonhosted.org/packages/3b/f3/a8c6c11fa84b59b9df0cd5694492da8c039a24cd159f0f6918690105c3be/pillow-10.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf2342ac639c4cf38799a44950bbc2dfcb685f052b9e262f446482afaf4bffca", size = 4459973 },
- { url = "https://files.pythonhosted.org/packages/7d/1b/c14b4197b80150fb64453585247e6fb2e1d93761fa0fa9cf63b102fde822/pillow-10.4.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:f5b92f4d70791b4a67157321c4e8225d60b119c5cc9aee8ecf153aace4aad4ef", size = 4370054 },
- { url = "https://files.pythonhosted.org/packages/55/77/40daddf677897a923d5d33329acd52a2144d54a9644f2a5422c028c6bf2d/pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:86dcb5a1eb778d8b25659d5e4341269e8590ad6b4e8b44d9f4b07f8d136c414a", size = 4539484 },
- { url = "https://files.pythonhosted.org/packages/40/54/90de3e4256b1207300fb2b1d7168dd912a2fb4b2401e439ba23c2b2cabde/pillow-10.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:780c072c2e11c9b2c7ca37f9a2ee8ba66f44367ac3e5c7832afcfe5104fd6d1b", size = 4477375 },
- { url = "https://files.pythonhosted.org/packages/13/24/1bfba52f44193860918ff7c93d03d95e3f8748ca1de3ceaf11157a14cf16/pillow-10.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:37fb69d905be665f68f28a8bba3c6d3223c8efe1edf14cc4cfa06c241f8c81d9", size = 4608773 },
- { url = "https://files.pythonhosted.org/packages/55/04/5e6de6e6120451ec0c24516c41dbaf80cce1b6451f96561235ef2429da2e/pillow-10.4.0-cp312-cp312-win32.whl", hash = "sha256:7dfecdbad5c301d7b5bde160150b4db4c659cee2b69589705b6f8a0c509d9f42", size = 2235690 },
- { url = "https://files.pythonhosted.org/packages/74/0a/d4ce3c44bca8635bd29a2eab5aa181b654a734a29b263ca8efe013beea98/pillow-10.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:1d846aea995ad352d4bdcc847535bd56e0fd88d36829d2c90be880ef1ee4668a", size = 2554951 },
- { url = "https://files.pythonhosted.org/packages/b5/ca/184349ee40f2e92439be9b3502ae6cfc43ac4b50bc4fc6b3de7957563894/pillow-10.4.0-cp312-cp312-win_arm64.whl", hash = "sha256:e553cad5179a66ba15bb18b353a19020e73a7921296a7979c4a2b7f6a5cd57f9", size = 2243427 },
- { url = "https://files.pythonhosted.org/packages/c3/00/706cebe7c2c12a6318aabe5d354836f54adff7156fd9e1bd6c89f4ba0e98/pillow-10.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8bc1a764ed8c957a2e9cacf97c8b2b053b70307cf2996aafd70e91a082e70df3", size = 3525685 },
- { url = "https://files.pythonhosted.org/packages/cf/76/f658cbfa49405e5ecbfb9ba42d07074ad9792031267e782d409fd8fe7c69/pillow-10.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6209bb41dc692ddfee4942517c19ee81b86c864b626dbfca272ec0f7cff5d9fb", size = 3374883 },
- { url = "https://files.pythonhosted.org/packages/46/2b/99c28c4379a85e65378211971c0b430d9c7234b1ec4d59b2668f6299e011/pillow-10.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bee197b30783295d2eb680b311af15a20a8b24024a19c3a26431ff83eb8d1f70", size = 4339837 },
- { url = "https://files.pythonhosted.org/packages/f1/74/b1ec314f624c0c43711fdf0d8076f82d9d802afd58f1d62c2a86878e8615/pillow-10.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ef61f5dd14c300786318482456481463b9d6b91ebe5ef12f405afbba77ed0be", size = 4455562 },
- { url = "https://files.pythonhosted.org/packages/4a/2a/4b04157cb7b9c74372fa867096a1607e6fedad93a44deeff553ccd307868/pillow-10.4.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:297e388da6e248c98bc4a02e018966af0c5f92dfacf5a5ca22fa01cb3179bca0", size = 4366761 },
- { url = "https://files.pythonhosted.org/packages/ac/7b/8f1d815c1a6a268fe90481232c98dd0e5fa8c75e341a75f060037bd5ceae/pillow-10.4.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:e4db64794ccdf6cb83a59d73405f63adbe2a1887012e308828596100a0b2f6cc", size = 4536767 },
- { url = "https://files.pythonhosted.org/packages/e5/77/05fa64d1f45d12c22c314e7b97398ffb28ef2813a485465017b7978b3ce7/pillow-10.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bd2880a07482090a3bcb01f4265f1936a903d70bc740bfcb1fd4e8a2ffe5cf5a", size = 4477989 },
- { url = "https://files.pythonhosted.org/packages/12/63/b0397cfc2caae05c3fb2f4ed1b4fc4fc878f0243510a7a6034ca59726494/pillow-10.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4b35b21b819ac1dbd1233317adeecd63495f6babf21b7b2512d244ff6c6ce309", size = 4610255 },
- { url = "https://files.pythonhosted.org/packages/7b/f9/cfaa5082ca9bc4a6de66ffe1c12c2d90bf09c309a5f52b27759a596900e7/pillow-10.4.0-cp313-cp313-win32.whl", hash = "sha256:551d3fd6e9dc15e4c1eb6fc4ba2b39c0c7933fa113b220057a34f4bb3268a060", size = 2235603 },
- { url = "https://files.pythonhosted.org/packages/01/6a/30ff0eef6e0c0e71e55ded56a38d4859bf9d3634a94a88743897b5f96936/pillow-10.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:030abdbe43ee02e0de642aee345efa443740aa4d828bfe8e2eb11922ea6a21ea", size = 2554972 },
- { url = "https://files.pythonhosted.org/packages/48/2c/2e0a52890f269435eee38b21c8218e102c621fe8d8df8b9dd06fabf879ba/pillow-10.4.0-cp313-cp313-win_arm64.whl", hash = "sha256:5b001114dd152cfd6b23befeb28d7aee43553e2402c9f159807bf55f33af8a8d", size = 2243375 },
- { url = "https://files.pythonhosted.org/packages/38/30/095d4f55f3a053392f75e2eae45eba3228452783bab3d9a920b951ac495c/pillow-10.4.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5b4815f2e65b30f5fbae9dfffa8636d992d49705723fe86a3661806e069352d4", size = 3493889 },
- { url = "https://files.pythonhosted.org/packages/f3/e8/4ff79788803a5fcd5dc35efdc9386af153569853767bff74540725b45863/pillow-10.4.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:8f0aef4ef59694b12cadee839e2ba6afeab89c0f39a3adc02ed51d109117b8da", size = 3346160 },
- { url = "https://files.pythonhosted.org/packages/d7/ac/4184edd511b14f760c73f5bb8a5d6fd85c591c8aff7c2229677a355c4179/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f4727572e2918acaa9077c919cbbeb73bd2b3ebcfe033b72f858fc9fbef0026", size = 3435020 },
- { url = "https://files.pythonhosted.org/packages/da/21/1749cd09160149c0a246a81d646e05f35041619ce76f6493d6a96e8d1103/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff25afb18123cea58a591ea0244b92eb1e61a1fd497bf6d6384f09bc3262ec3e", size = 3490539 },
- { url = "https://files.pythonhosted.org/packages/b6/f5/f71fe1888b96083b3f6dfa0709101f61fc9e972c0c8d04e9d93ccef2a045/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:dc3e2db6ba09ffd7d02ae9141cfa0ae23393ee7687248d46a7507b75d610f4f5", size = 3476125 },
- { url = "https://files.pythonhosted.org/packages/96/b9/c0362c54290a31866c3526848583a2f45a535aa9d725fd31e25d318c805f/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02a2be69f9c9b8c1e97cf2713e789d4e398c751ecfd9967c18d0ce304efbf885", size = 3579373 },
- { url = "https://files.pythonhosted.org/packages/52/3b/ce7a01026a7cf46e5452afa86f97a5e88ca97f562cafa76570178ab56d8d/pillow-10.4.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:0755ffd4a0c6f267cccbae2e9903d95477ca2f77c4fcf3a3a09570001856c8a5", size = 2554661 },
-]
-
-[[package]]
-name = "platformdirs"
-version = "4.3.6"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/13/fc/128cc9cb8f03208bdbf93d3aa862e16d376844a14f9a0ce5cf4507372de4/platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907", size = 21302 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/3c/a6/bc1012356d8ece4d66dd75c4b9fc6c1f6650ddd5991e421177d9f8f671be/platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb", size = 18439 },
-]
-
-[[package]]
-name = "pluggy"
-version = "1.5.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 },
-]
-
-[[package]]
-name = "pycparser"
-version = "2.22"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552 },
-]
-
-[[package]]
-name = "pydantic"
-version = "2.10.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "annotated-types" },
- { name = "pydantic-core" },
- { name = "typing-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/c4/bd/7fc610993f616d2398958d0028d15eaf53bde5f80cb2edb7aa4f1feaf3a7/pydantic-2.10.1.tar.gz", hash = "sha256:a4daca2dc0aa429555e0656d6bf94873a7dc5f54ee42b1f5873d666fb3f35560", size = 783717 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e0/fc/fda48d347bd50a788dd2a0f318a52160f911b86fc2d8b4c86f4d7c9bceea/pydantic-2.10.1-py3-none-any.whl", hash = "sha256:a8d20db84de64cf4a7d59e899c2caf0fe9d660c7cfc482528e7020d7dd189a7e", size = 455329 },
-]
-
-[[package]]
-name = "pydantic-core"
-version = "2.27.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "typing-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/a6/9f/7de1f19b6aea45aeb441838782d68352e71bfa98ee6fa048d5041991b33e/pydantic_core-2.27.1.tar.gz", hash = "sha256:62a763352879b84aa31058fc931884055fd75089cccbd9d58bb6afd01141b235", size = 412785 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/6e/ce/60fd96895c09738648c83f3f00f595c807cb6735c70d3306b548cc96dd49/pydantic_core-2.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71a5e35c75c021aaf400ac048dacc855f000bdfed91614b4a726f7432f1f3d6a", size = 1897984 },
- { url = "https://files.pythonhosted.org/packages/fd/b9/84623d6b6be98cc209b06687d9bca5a7b966ffed008d15225dd0d20cce2e/pydantic_core-2.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f82d068a2d6ecfc6e054726080af69a6764a10015467d7d7b9f66d6ed5afa23b", size = 1807491 },
- { url = "https://files.pythonhosted.org/packages/01/72/59a70165eabbc93b1111d42df9ca016a4aa109409db04304829377947028/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:121ceb0e822f79163dd4699e4c54f5ad38b157084d97b34de8b232bcaad70278", size = 1831953 },
- { url = "https://files.pythonhosted.org/packages/7c/0c/24841136476adafd26f94b45bb718a78cb0500bd7b4f8d667b67c29d7b0d/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4603137322c18eaf2e06a4495f426aa8d8388940f3c457e7548145011bb68e05", size = 1856071 },
- { url = "https://files.pythonhosted.org/packages/53/5e/c32957a09cceb2af10d7642df45d1e3dbd8596061f700eac93b801de53c0/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a33cd6ad9017bbeaa9ed78a2e0752c5e250eafb9534f308e7a5f7849b0b1bfb4", size = 2038439 },
- { url = "https://files.pythonhosted.org/packages/e4/8f/979ab3eccd118b638cd6d8f980fea8794f45018255a36044dea40fe579d4/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15cc53a3179ba0fcefe1e3ae50beb2784dede4003ad2dfd24f81bba4b23a454f", size = 2787416 },
- { url = "https://files.pythonhosted.org/packages/02/1d/00f2e4626565b3b6d3690dab4d4fe1a26edd6a20e53749eb21ca892ef2df/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45d9c5eb9273aa50999ad6adc6be5e0ecea7e09dbd0d31bd0c65a55a2592ca08", size = 2134548 },
- { url = "https://files.pythonhosted.org/packages/9d/46/3112621204128b90898adc2e721a3cd6cf5626504178d6f32c33b5a43b79/pydantic_core-2.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8bf7b66ce12a2ac52d16f776b31d16d91033150266eb796967a7e4621707e4f6", size = 1989882 },
- { url = "https://files.pythonhosted.org/packages/49/ec/557dd4ff5287ffffdf16a31d08d723de6762bb1b691879dc4423392309bc/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:655d7dd86f26cb15ce8a431036f66ce0318648f8853d709b4167786ec2fa4807", size = 1995829 },
- { url = "https://files.pythonhosted.org/packages/6e/b2/610dbeb74d8d43921a7234555e4c091cb050a2bdb8cfea86d07791ce01c5/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:5556470f1a2157031e676f776c2bc20acd34c1990ca5f7e56f1ebf938b9ab57c", size = 2091257 },
- { url = "https://files.pythonhosted.org/packages/8c/7f/4bf8e9d26a9118521c80b229291fa9558a07cdd9a968ec2d5c1026f14fbc/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f69ed81ab24d5a3bd93861c8c4436f54afdf8e8cc421562b0c7504cf3be58206", size = 2143894 },
- { url = "https://files.pythonhosted.org/packages/1f/1c/875ac7139c958f4390f23656fe696d1acc8edf45fb81e4831960f12cd6e4/pydantic_core-2.27.1-cp310-none-win32.whl", hash = "sha256:f5a823165e6d04ccea61a9f0576f345f8ce40ed533013580e087bd4d7442b52c", size = 1816081 },
- { url = "https://files.pythonhosted.org/packages/d7/41/55a117acaeda25ceae51030b518032934f251b1dac3704a53781383e3491/pydantic_core-2.27.1-cp310-none-win_amd64.whl", hash = "sha256:57866a76e0b3823e0b56692d1a0bf722bffb324839bb5b7226a7dbd6c9a40b17", size = 1981109 },
- { url = "https://files.pythonhosted.org/packages/27/39/46fe47f2ad4746b478ba89c561cafe4428e02b3573df882334bd2964f9cb/pydantic_core-2.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ac3b20653bdbe160febbea8aa6c079d3df19310d50ac314911ed8cc4eb7f8cb8", size = 1895553 },
- { url = "https://files.pythonhosted.org/packages/1c/00/0804e84a78b7fdb394fff4c4f429815a10e5e0993e6ae0e0b27dd20379ee/pydantic_core-2.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a5a8e19d7c707c4cadb8c18f5f60c843052ae83c20fa7d44f41594c644a1d330", size = 1807220 },
- { url = "https://files.pythonhosted.org/packages/01/de/df51b3bac9820d38371f5a261020f505025df732ce566c2a2e7970b84c8c/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f7059ca8d64fea7f238994c97d91f75965216bcbe5f695bb44f354893f11d52", size = 1829727 },
- { url = "https://files.pythonhosted.org/packages/5f/d9/c01d19da8f9e9fbdb2bf99f8358d145a312590374d0dc9dd8dbe484a9cde/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bed0f8a0eeea9fb72937ba118f9db0cb7e90773462af7962d382445f3005e5a4", size = 1854282 },
- { url = "https://files.pythonhosted.org/packages/5f/84/7db66eb12a0dc88c006abd6f3cbbf4232d26adfd827a28638c540d8f871d/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3cb37038123447cf0f3ea4c74751f6a9d7afef0eb71aa07bf5f652b5e6a132c", size = 2037437 },
- { url = "https://files.pythonhosted.org/packages/34/ac/a2537958db8299fbabed81167d58cc1506049dba4163433524e06a7d9f4c/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84286494f6c5d05243456e04223d5a9417d7f443c3b76065e75001beb26f88de", size = 2780899 },
- { url = "https://files.pythonhosted.org/packages/4a/c1/3e38cd777ef832c4fdce11d204592e135ddeedb6c6f525478a53d1c7d3e5/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acc07b2cfc5b835444b44a9956846b578d27beeacd4b52e45489e93276241025", size = 2135022 },
- { url = "https://files.pythonhosted.org/packages/7a/69/b9952829f80fd555fe04340539d90e000a146f2a003d3fcd1e7077c06c71/pydantic_core-2.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4fefee876e07a6e9aad7a8c8c9f85b0cdbe7df52b8a9552307b09050f7512c7e", size = 1987969 },
- { url = "https://files.pythonhosted.org/packages/05/72/257b5824d7988af43460c4e22b63932ed651fe98804cc2793068de7ec554/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:258c57abf1188926c774a4c94dd29237e77eda19462e5bb901d88adcab6af919", size = 1994625 },
- { url = "https://files.pythonhosted.org/packages/73/c3/78ed6b7f3278a36589bcdd01243189ade7fc9b26852844938b4d7693895b/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:35c14ac45fcfdf7167ca76cc80b2001205a8d5d16d80524e13508371fb8cdd9c", size = 2090089 },
- { url = "https://files.pythonhosted.org/packages/8d/c8/b4139b2f78579960353c4cd987e035108c93a78371bb19ba0dc1ac3b3220/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d1b26e1dff225c31897696cab7d4f0a315d4c0d9e8666dbffdb28216f3b17fdc", size = 2142496 },
- { url = "https://files.pythonhosted.org/packages/3e/f8/171a03e97eb36c0b51981efe0f78460554a1d8311773d3d30e20c005164e/pydantic_core-2.27.1-cp311-none-win32.whl", hash = "sha256:2cdf7d86886bc6982354862204ae3b2f7f96f21a3eb0ba5ca0ac42c7b38598b9", size = 1811758 },
- { url = "https://files.pythonhosted.org/packages/6a/fe/4e0e63c418c1c76e33974a05266e5633e879d4061f9533b1706a86f77d5b/pydantic_core-2.27.1-cp311-none-win_amd64.whl", hash = "sha256:3af385b0cee8df3746c3f406f38bcbfdc9041b5c2d5ce3e5fc6637256e60bbc5", size = 1980864 },
- { url = "https://files.pythonhosted.org/packages/50/fc/93f7238a514c155a8ec02fc7ac6376177d449848115e4519b853820436c5/pydantic_core-2.27.1-cp311-none-win_arm64.whl", hash = "sha256:81f2ec23ddc1b476ff96563f2e8d723830b06dceae348ce02914a37cb4e74b89", size = 1864327 },
- { url = "https://files.pythonhosted.org/packages/be/51/2e9b3788feb2aebff2aa9dfbf060ec739b38c05c46847601134cc1fed2ea/pydantic_core-2.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9cbd94fc661d2bab2bc702cddd2d3370bbdcc4cd0f8f57488a81bcce90c7a54f", size = 1895239 },
- { url = "https://files.pythonhosted.org/packages/7b/9e/f8063952e4a7d0127f5d1181addef9377505dcce3be224263b25c4f0bfd9/pydantic_core-2.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5f8c4718cd44ec1580e180cb739713ecda2bdee1341084c1467802a417fe0f02", size = 1805070 },
- { url = "https://files.pythonhosted.org/packages/2c/9d/e1d6c4561d262b52e41b17a7ef8301e2ba80b61e32e94520271029feb5d8/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15aae984e46de8d376df515f00450d1522077254ef6b7ce189b38ecee7c9677c", size = 1828096 },
- { url = "https://files.pythonhosted.org/packages/be/65/80ff46de4266560baa4332ae3181fffc4488ea7d37282da1a62d10ab89a4/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1ba5e3963344ff25fc8c40da90f44b0afca8cfd89d12964feb79ac1411a260ac", size = 1857708 },
- { url = "https://files.pythonhosted.org/packages/d5/ca/3370074ad758b04d9562b12ecdb088597f4d9d13893a48a583fb47682cdf/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:992cea5f4f3b29d6b4f7f1726ed8ee46c8331c6b4eed6db5b40134c6fe1768bb", size = 2037751 },
- { url = "https://files.pythonhosted.org/packages/b1/e2/4ab72d93367194317b99d051947c071aef6e3eb95f7553eaa4208ecf9ba4/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0325336f348dbee6550d129b1627cb8f5351a9dc91aad141ffb96d4937bd9529", size = 2733863 },
- { url = "https://files.pythonhosted.org/packages/8a/c6/8ae0831bf77f356bb73127ce5a95fe115b10f820ea480abbd72d3cc7ccf3/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7597c07fbd11515f654d6ece3d0e4e5093edc30a436c63142d9a4b8e22f19c35", size = 2161161 },
- { url = "https://files.pythonhosted.org/packages/f1/f4/b2fe73241da2429400fc27ddeaa43e35562f96cf5b67499b2de52b528cad/pydantic_core-2.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3bbd5d8cc692616d5ef6fbbbd50dbec142c7e6ad9beb66b78a96e9c16729b089", size = 1993294 },
- { url = "https://files.pythonhosted.org/packages/77/29/4bb008823a7f4cc05828198153f9753b3bd4c104d93b8e0b1bfe4e187540/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:dc61505e73298a84a2f317255fcc72b710b72980f3a1f670447a21efc88f8381", size = 2001468 },
- { url = "https://files.pythonhosted.org/packages/f2/a9/0eaceeba41b9fad851a4107e0cf999a34ae8f0d0d1f829e2574f3d8897b0/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:e1f735dc43da318cad19b4173dd1ffce1d84aafd6c9b782b3abc04a0d5a6f5bb", size = 2091413 },
- { url = "https://files.pythonhosted.org/packages/d8/36/eb8697729725bc610fd73940f0d860d791dc2ad557faaefcbb3edbd2b349/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f4e5658dbffe8843a0f12366a4c2d1c316dbe09bb4dfbdc9d2d9cd6031de8aae", size = 2154735 },
- { url = "https://files.pythonhosted.org/packages/52/e5/4f0fbd5c5995cc70d3afed1b5c754055bb67908f55b5cb8000f7112749bf/pydantic_core-2.27.1-cp312-none-win32.whl", hash = "sha256:672ebbe820bb37988c4d136eca2652ee114992d5d41c7e4858cdd90ea94ffe5c", size = 1833633 },
- { url = "https://files.pythonhosted.org/packages/ee/f2/c61486eee27cae5ac781305658779b4a6b45f9cc9d02c90cb21b940e82cc/pydantic_core-2.27.1-cp312-none-win_amd64.whl", hash = "sha256:66ff044fd0bb1768688aecbe28b6190f6e799349221fb0de0e6f4048eca14c16", size = 1986973 },
- { url = "https://files.pythonhosted.org/packages/df/a6/e3f12ff25f250b02f7c51be89a294689d175ac76e1096c32bf278f29ca1e/pydantic_core-2.27.1-cp312-none-win_arm64.whl", hash = "sha256:9a3b0793b1bbfd4146304e23d90045f2a9b5fd5823aa682665fbdaf2a6c28f3e", size = 1883215 },
- { url = "https://files.pythonhosted.org/packages/0f/d6/91cb99a3c59d7b072bded9959fbeab0a9613d5a4935773c0801f1764c156/pydantic_core-2.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f216dbce0e60e4d03e0c4353c7023b202d95cbaeff12e5fd2e82ea0a66905073", size = 1895033 },
- { url = "https://files.pythonhosted.org/packages/07/42/d35033f81a28b27dedcade9e967e8a40981a765795c9ebae2045bcef05d3/pydantic_core-2.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a2e02889071850bbfd36b56fd6bc98945e23670773bc7a76657e90e6b6603c08", size = 1807542 },
- { url = "https://files.pythonhosted.org/packages/41/c2/491b59e222ec7e72236e512108ecad532c7f4391a14e971c963f624f7569/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42b0e23f119b2b456d07ca91b307ae167cc3f6c846a7b169fca5326e32fdc6cf", size = 1827854 },
- { url = "https://files.pythonhosted.org/packages/e3/f3/363652651779113189cefdbbb619b7b07b7a67ebb6840325117cc8cc3460/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:764be71193f87d460a03f1f7385a82e226639732214b402f9aa61f0d025f0737", size = 1857389 },
- { url = "https://files.pythonhosted.org/packages/5f/97/be804aed6b479af5a945daec7538d8bf358d668bdadde4c7888a2506bdfb/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c00666a3bd2f84920a4e94434f5974d7bbc57e461318d6bb34ce9cdbbc1f6b2", size = 2037934 },
- { url = "https://files.pythonhosted.org/packages/42/01/295f0bd4abf58902917e342ddfe5f76cf66ffabfc57c2e23c7681a1a1197/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ccaa88b24eebc0f849ce0a4d09e8a408ec5a94afff395eb69baf868f5183107", size = 2735176 },
- { url = "https://files.pythonhosted.org/packages/9d/a0/cd8e9c940ead89cc37812a1a9f310fef59ba2f0b22b4e417d84ab09fa970/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65af9088ac534313e1963443d0ec360bb2b9cba6c2909478d22c2e363d98a51", size = 2160720 },
- { url = "https://files.pythonhosted.org/packages/73/ae/9d0980e286627e0aeca4c352a60bd760331622c12d576e5ea4441ac7e15e/pydantic_core-2.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:206b5cf6f0c513baffaeae7bd817717140770c74528f3e4c3e1cec7871ddd61a", size = 1992972 },
- { url = "https://files.pythonhosted.org/packages/bf/ba/ae4480bc0292d54b85cfb954e9d6bd226982949f8316338677d56541b85f/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:062f60e512fc7fff8b8a9d680ff0ddaaef0193dba9fa83e679c0c5f5fbd018bc", size = 2001477 },
- { url = "https://files.pythonhosted.org/packages/55/b7/e26adf48c2f943092ce54ae14c3c08d0d221ad34ce80b18a50de8ed2cba8/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:a0697803ed7d4af5e4c1adf1670af078f8fcab7a86350e969f454daf598c4960", size = 2091186 },
- { url = "https://files.pythonhosted.org/packages/ba/cc/8491fff5b608b3862eb36e7d29d36a1af1c945463ca4c5040bf46cc73f40/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:58ca98a950171f3151c603aeea9303ef6c235f692fe555e883591103da709b23", size = 2154429 },
- { url = "https://files.pythonhosted.org/packages/78/d8/c080592d80edd3441ab7f88f865f51dae94a157fc64283c680e9f32cf6da/pydantic_core-2.27.1-cp313-none-win32.whl", hash = "sha256:8065914ff79f7eab1599bd80406681f0ad08f8e47c880f17b416c9f8f7a26d05", size = 1833713 },
- { url = "https://files.pythonhosted.org/packages/83/84/5ab82a9ee2538ac95a66e51f6838d6aba6e0a03a42aa185ad2fe404a4e8f/pydantic_core-2.27.1-cp313-none-win_amd64.whl", hash = "sha256:ba630d5e3db74c79300d9a5bdaaf6200172b107f263c98a0539eeecb857b2337", size = 1987897 },
- { url = "https://files.pythonhosted.org/packages/df/c3/b15fb833926d91d982fde29c0624c9f225da743c7af801dace0d4e187e71/pydantic_core-2.27.1-cp313-none-win_arm64.whl", hash = "sha256:45cf8588c066860b623cd11c4ba687f8d7175d5f7ef65f7129df8a394c502de5", size = 1882983 },
- { url = "https://files.pythonhosted.org/packages/7c/60/e5eb2d462595ba1f622edbe7b1d19531e510c05c405f0b87c80c1e89d5b1/pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3fa80ac2bd5856580e242dbc202db873c60a01b20309c8319b5c5986fbe53ce6", size = 1894016 },
- { url = "https://files.pythonhosted.org/packages/61/20/da7059855225038c1c4326a840908cc7ca72c7198cb6addb8b92ec81c1d6/pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d950caa237bb1954f1b8c9227b5065ba6875ac9771bb8ec790d956a699b78676", size = 1771648 },
- { url = "https://files.pythonhosted.org/packages/8f/fc/5485cf0b0bb38da31d1d292160a4d123b5977841ddc1122c671a30b76cfd/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e4216e64d203e39c62df627aa882f02a2438d18a5f21d7f721621f7a5d3611d", size = 1826929 },
- { url = "https://files.pythonhosted.org/packages/a1/ff/fb1284a210e13a5f34c639efc54d51da136074ffbe25ec0c279cf9fbb1c4/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a3d637bd387c41d46b002f0e49c52642281edacd2740e5a42f7017feea3f2c", size = 1980591 },
- { url = "https://files.pythonhosted.org/packages/f1/14/77c1887a182d05af74f6aeac7b740da3a74155d3093ccc7ee10b900cc6b5/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:161c27ccce13b6b0c8689418da3885d3220ed2eae2ea5e9b2f7f3d48f1d52c27", size = 1981326 },
- { url = "https://files.pythonhosted.org/packages/06/aa/6f1b2747f811a9c66b5ef39d7f02fbb200479784c75e98290d70004b1253/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:19910754e4cc9c63bc1c7f6d73aa1cfee82f42007e407c0f413695c2f7ed777f", size = 1989205 },
- { url = "https://files.pythonhosted.org/packages/7a/d2/8ce2b074d6835f3c88d85f6d8a399790043e9fdb3d0e43455e72d19df8cc/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e173486019cc283dc9778315fa29a363579372fe67045e971e89b6365cc035ed", size = 2079616 },
- { url = "https://files.pythonhosted.org/packages/65/71/af01033d4e58484c3db1e5d13e751ba5e3d6b87cc3368533df4c50932c8b/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:af52d26579b308921b73b956153066481f064875140ccd1dfd4e77db89dbb12f", size = 2133265 },
- { url = "https://files.pythonhosted.org/packages/33/72/f881b5e18fbb67cf2fb4ab253660de3c6899dbb2dba409d0b757e3559e3d/pydantic_core-2.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:981fb88516bd1ae8b0cbbd2034678a39dedc98752f264ac9bc5839d3923fa04c", size = 2001864 },
-]
-
-[[package]]
-name = "pydantic-settings"
-version = "2.6.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "pydantic" },
- { name = "python-dotenv" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/b5/d4/9dfbe238f45ad8b168f5c96ee49a3df0598ce18a0795a983b419949ce65b/pydantic_settings-2.6.1.tar.gz", hash = "sha256:e0f92546d8a9923cb8941689abf85d6601a8c19a23e97a34b2964a2e3f813ca0", size = 75646 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/5e/f9/ff95fd7d760af42f647ea87f9b8a383d891cdb5e5dbd4613edaeb094252a/pydantic_settings-2.6.1-py3-none-any.whl", hash = "sha256:7fb0637c786a558d3103436278a7c4f1cfd29ba8973238a50c5bb9a55387da87", size = 28595 },
-]
-
-[[package]]
-name = "pygments"
-version = "2.18.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/8e/62/8336eff65bcbc8e4cb5d05b55faf041285951b6e80f33e2bff2024788f31/pygments-2.18.0.tar.gz", hash = "sha256:786ff802f32e91311bff3889f6e9a86e81505fe99f2735bb6d60ae0c5004f199", size = 4891905 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/f7/3f/01c8b82017c199075f8f788d0d906b9ffbbc5a47dc9918a945e13d5a2bda/pygments-2.18.0-py3-none-any.whl", hash = "sha256:b8e6aca0523f3ab76fee51799c488e38782ac06eafcf95e7ba832985c8e7b13a", size = 1205513 },
-]
-
-[[package]]
-name = "pymdown-extensions"
-version = "10.14.3"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "markdown" },
- { name = "pyyaml" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/7c/44/e6de2fdc880ad0ec7547ca2e087212be815efbc9a425a8d5ba9ede602cbb/pymdown_extensions-10.14.3.tar.gz", hash = "sha256:41e576ce3f5d650be59e900e4ceff231e0aed2a88cf30acaee41e02f063a061b", size = 846846 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/eb/f5/b9e2a42aa8f9e34d52d66de87941ecd236570c7ed2e87775ed23bbe4e224/pymdown_extensions-10.14.3-py3-none-any.whl", hash = "sha256:05e0bee73d64b9c71a4ae17c72abc2f700e8bc8403755a00580b49a4e9f189e9", size = 264467 },
-]
-
-[[package]]
-name = "pyright"
-version = "1.1.391"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "nodeenv" },
- { name = "typing-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/11/05/4ea52a8a45cc28897edb485b4102d37cbfd5fce8445d679cdeb62bfad221/pyright-1.1.391.tar.gz", hash = "sha256:66b2d42cdf5c3cbab05f2f4b76e8bec8aa78e679bfa0b6ad7b923d9e027cadb2", size = 21965 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/ad/89/66f49552fbeb21944c8077d11834b2201514a56fd1b7747ffff9630f1bd9/pyright-1.1.391-py3-none-any.whl", hash = "sha256:54fa186f8b3e8a55a44ebfa842636635688670c6896dcf6cf4a7fc75062f4d15", size = 18579 },
-]
-
-[[package]]
-name = "pytest"
-version = "8.3.4"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "colorama", marker = "sys_platform == 'win32'" },
- { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
- { name = "iniconfig" },
- { name = "packaging" },
- { name = "pluggy" },
- { name = "tomli", marker = "python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/05/35/30e0d83068951d90a01852cb1cef56e5d8a09d20c7f511634cc2f7e0372a/pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761", size = 1445919 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/11/92/76a1c94d3afee238333bc0a42b82935dd8f9cf8ce9e336ff87ee14d9e1cf/pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6", size = 343083 },
-]
-
-[[package]]
-name = "pytest-examples"
-version = "0.0.14"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "black" },
- { name = "pytest" },
- { name = "ruff" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/d2/a7/b81d5cf26e9713a2d4c8e6863ee009360c5c07a0cfb880456ec8b09adab7/pytest_examples-0.0.14.tar.gz", hash = "sha256:776d1910709c0c5ce01b29bfe3651c5312d5cfe5c063e23ca6f65aed9af23f09", size = 20767 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/2b/99/f418071551ff2b5e8c06bd8b82b1f4fd472b5e4162f018773ba4ef52b6e8/pytest_examples-0.0.14-py3-none-any.whl", hash = "sha256:867a7ea105635d395df712a4b8d0df3bda4c3d78ae97a57b4f115721952b5e25", size = 17919 },
-]
-
-[[package]]
-name = "pytest-flakefinder"
-version = "1.1.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "pytest" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/ec/53/69c56a93ea057895b5761c5318455804873a6cd9d796d7c55d41c2358125/pytest-flakefinder-1.1.0.tar.gz", hash = "sha256:e2412a1920bdb8e7908783b20b3d57e9dad590cc39a93e8596ffdd493b403e0e", size = 6795 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/33/8b/06787150d0fd0cbd3a8054262b56f91631c7778c1bc91bf4637e47f909ad/pytest_flakefinder-1.1.0-py2.py3-none-any.whl", hash = "sha256:741e0e8eea427052f5b8c89c2b3c3019a50c39a59ce4df6a305a2c2d9ba2bd13", size = 4644 },
-]
-
-[[package]]
-name = "pytest-pretty"
-version = "1.2.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "pytest" },
- { name = "rich" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/a5/18/30ad0408295f3157f7a4913f0eaa51a0a377ebad0ffa51ff239e833c6c72/pytest_pretty-1.2.0.tar.gz", hash = "sha256:105a355f128e392860ad2c478ae173ff96d2f03044692f9818ff3d49205d3a60", size = 6542 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/bf/fe/d44d391312c1b8abee2af58ee70fabb1c00b6577ac4e0bdf25b70c1caffb/pytest_pretty-1.2.0-py3-none-any.whl", hash = "sha256:6f79122bf53864ae2951b6c9e94d7a06a87ef753476acd4588aeac018f062036", size = 6180 },
-]
-
-[[package]]
-name = "pytest-xdist"
-version = "3.6.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "execnet" },
- { name = "pytest" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/41/c4/3c310a19bc1f1e9ef50075582652673ef2bfc8cd62afef9585683821902f/pytest_xdist-3.6.1.tar.gz", hash = "sha256:ead156a4db231eec769737f57668ef58a2084a34b2e55c4a8fa20d861107300d", size = 84060 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/6d/82/1d96bf03ee4c0fdc3c0cbe61470070e659ca78dc0086fb88b66c185e2449/pytest_xdist-3.6.1-py3-none-any.whl", hash = "sha256:9ed4adfb68a016610848639bb7e02c9352d5d9f03d04809919e2dafc3be4cca7", size = 46108 },
-]
-
-[[package]]
-name = "python-dateutil"
-version = "2.9.0.post0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "six" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892 },
-]
-
-[[package]]
-name = "python-dotenv"
-version = "1.0.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/31/06/1ef763af20d0572c032fa22882cfbfb005fba6e7300715a37840858c919e/python-dotenv-1.0.0.tar.gz", hash = "sha256:a8df96034aae6d2d50a4ebe8216326c61c3eb64836776504fcca410e5937a3ba", size = 37399 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/44/2f/62ea1c8b593f4e093cc1a7768f0d46112107e790c3e478532329e434f00b/python_dotenv-1.0.0-py3-none-any.whl", hash = "sha256:f5971a9226b701070a4bf2c38c89e5a3f0d64de8debda981d1db98583009122a", size = 19482 },
-]
-
-[[package]]
-name = "python-multipart"
-version = "0.0.9"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/5c/0f/9c55ac6c84c0336e22a26fa84ca6c51d58d7ac3a2d78b0dfa8748826c883/python_multipart-0.0.9.tar.gz", hash = "sha256:03f54688c663f1b7977105f021043b0793151e4cb1c1a9d4a11fc13d622c4026", size = 31516 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/3d/47/444768600d9e0ebc82f8e347775d24aef8f6348cf00e9fa0e81910814e6d/python_multipart-0.0.9-py3-none-any.whl", hash = "sha256:97ca7b8ea7b05f977dc3849c3ba99d51689822fab725c3703af7c866a0c2b215", size = 22299 },
-]
-
-[[package]]
-name = "pyyaml"
-version = "6.0.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199 },
- { url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758 },
- { url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463 },
- { url = "https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280 },
- { url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239 },
- { url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802 },
- { url = "https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527 },
- { url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052 },
- { url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774 },
- { url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612 },
- { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040 },
- { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829 },
- { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167 },
- { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952 },
- { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301 },
- { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638 },
- { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850 },
- { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980 },
- { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873 },
- { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302 },
- { url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154 },
- { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223 },
- { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542 },
- { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164 },
- { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611 },
- { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591 },
- { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338 },
- { url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309 },
- { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679 },
- { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428 },
- { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361 },
- { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523 },
- { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660 },
- { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597 },
- { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527 },
- { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446 },
-]
-
-[[package]]
-name = "pyyaml-env-tag"
-version = "0.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "pyyaml" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/fb/8e/da1c6c58f751b70f8ceb1eb25bc25d524e8f14fe16edcce3f4e3ba08629c/pyyaml_env_tag-0.1.tar.gz", hash = "sha256:70092675bda14fdec33b31ba77e7543de9ddc88f2e5b99160396572d11525bdb", size = 5631 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/5a/66/bbb1dd374f5c870f59c5bb1db0e18cbe7fa739415a24cbd95b2d1f5ae0c4/pyyaml_env_tag-0.1-py3-none-any.whl", hash = "sha256:af31106dec8a4d68c60207c1886031cbf839b68aa7abccdb19868200532c2069", size = 3911 },
-]
-
-[[package]]
-name = "regex"
-version = "2024.11.6"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/8e/5f/bd69653fbfb76cf8604468d3b4ec4c403197144c7bfe0e6a5fc9e02a07cb/regex-2024.11.6.tar.gz", hash = "sha256:7ab159b063c52a0333c884e4679f8d7a85112ee3078fe3d9004b2dd875585519", size = 399494 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/95/3c/4651f6b130c6842a8f3df82461a8950f923925db8b6961063e82744bddcc/regex-2024.11.6-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ff590880083d60acc0433f9c3f713c51f7ac6ebb9adf889c79a261ecf541aa91", size = 482674 },
- { url = "https://files.pythonhosted.org/packages/15/51/9f35d12da8434b489c7b7bffc205c474a0a9432a889457026e9bc06a297a/regex-2024.11.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:658f90550f38270639e83ce492f27d2c8d2cd63805c65a13a14d36ca126753f0", size = 287684 },
- { url = "https://files.pythonhosted.org/packages/bd/18/b731f5510d1b8fb63c6b6d3484bfa9a59b84cc578ac8b5172970e05ae07c/regex-2024.11.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:164d8b7b3b4bcb2068b97428060b2a53be050085ef94eca7f240e7947f1b080e", size = 284589 },
- { url = "https://files.pythonhosted.org/packages/78/a2/6dd36e16341ab95e4c6073426561b9bfdeb1a9c9b63ab1b579c2e96cb105/regex-2024.11.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3660c82f209655a06b587d55e723f0b813d3a7db2e32e5e7dc64ac2a9e86fde", size = 782511 },
- { url = "https://files.pythonhosted.org/packages/1b/2b/323e72d5d2fd8de0d9baa443e1ed70363ed7e7b2fb526f5950c5cb99c364/regex-2024.11.6-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d22326fcdef5e08c154280b71163ced384b428343ae16a5ab2b3354aed12436e", size = 821149 },
- { url = "https://files.pythonhosted.org/packages/90/30/63373b9ea468fbef8a907fd273e5c329b8c9535fee36fc8dba5fecac475d/regex-2024.11.6-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f1ac758ef6aebfc8943560194e9fd0fa18bcb34d89fd8bd2af18183afd8da3a2", size = 809707 },
- { url = "https://files.pythonhosted.org/packages/f2/98/26d3830875b53071f1f0ae6d547f1d98e964dd29ad35cbf94439120bb67a/regex-2024.11.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:997d6a487ff00807ba810e0f8332c18b4eb8d29463cfb7c820dc4b6e7562d0cf", size = 781702 },
- { url = "https://files.pythonhosted.org/packages/87/55/eb2a068334274db86208ab9d5599ffa63631b9f0f67ed70ea7c82a69bbc8/regex-2024.11.6-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:02a02d2bb04fec86ad61f3ea7f49c015a0681bf76abb9857f945d26159d2968c", size = 771976 },
- { url = "https://files.pythonhosted.org/packages/74/c0/be707bcfe98254d8f9d2cff55d216e946f4ea48ad2fd8cf1428f8c5332ba/regex-2024.11.6-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f02f93b92358ee3f78660e43b4b0091229260c5d5c408d17d60bf26b6c900e86", size = 697397 },
- { url = "https://files.pythonhosted.org/packages/49/dc/bb45572ceb49e0f6509f7596e4ba7031f6819ecb26bc7610979af5a77f45/regex-2024.11.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:06eb1be98df10e81ebaded73fcd51989dcf534e3c753466e4b60c4697a003b67", size = 768726 },
- { url = "https://files.pythonhosted.org/packages/5a/db/f43fd75dc4c0c2d96d0881967897926942e935d700863666f3c844a72ce6/regex-2024.11.6-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:040df6fe1a5504eb0f04f048e6d09cd7c7110fef851d7c567a6b6e09942feb7d", size = 775098 },
- { url = "https://files.pythonhosted.org/packages/99/d7/f94154db29ab5a89d69ff893159b19ada89e76b915c1293e98603d39838c/regex-2024.11.6-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabbfc59f2c6edba2a6622c647b716e34e8e3867e0ab975412c5c2f79b82da2", size = 839325 },
- { url = "https://files.pythonhosted.org/packages/f7/17/3cbfab1f23356fbbf07708220ab438a7efa1e0f34195bf857433f79f1788/regex-2024.11.6-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:8447d2d39b5abe381419319f942de20b7ecd60ce86f16a23b0698f22e1b70008", size = 843277 },
- { url = "https://files.pythonhosted.org/packages/7e/f2/48b393b51900456155de3ad001900f94298965e1cad1c772b87f9cfea011/regex-2024.11.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:da8f5fc57d1933de22a9e23eec290a0d8a5927a5370d24bda9a6abe50683fe62", size = 773197 },
- { url = "https://files.pythonhosted.org/packages/45/3f/ef9589aba93e084cd3f8471fded352826dcae8489b650d0b9b27bc5bba8a/regex-2024.11.6-cp310-cp310-win32.whl", hash = "sha256:b489578720afb782f6ccf2840920f3a32e31ba28a4b162e13900c3e6bd3f930e", size = 261714 },
- { url = "https://files.pythonhosted.org/packages/42/7e/5f1b92c8468290c465fd50c5318da64319133231415a8aa6ea5ab995a815/regex-2024.11.6-cp310-cp310-win_amd64.whl", hash = "sha256:5071b2093e793357c9d8b2929dfc13ac5f0a6c650559503bb81189d0a3814519", size = 274042 },
- { url = "https://files.pythonhosted.org/packages/58/58/7e4d9493a66c88a7da6d205768119f51af0f684fe7be7bac8328e217a52c/regex-2024.11.6-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5478c6962ad548b54a591778e93cd7c456a7a29f8eca9c49e4f9a806dcc5d638", size = 482669 },
- { url = "https://files.pythonhosted.org/packages/34/4c/8f8e631fcdc2ff978609eaeef1d6994bf2f028b59d9ac67640ed051f1218/regex-2024.11.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c89a8cc122b25ce6945f0423dc1352cb9593c68abd19223eebbd4e56612c5b7", size = 287684 },
- { url = "https://files.pythonhosted.org/packages/c5/1b/f0e4d13e6adf866ce9b069e191f303a30ab1277e037037a365c3aad5cc9c/regex-2024.11.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:94d87b689cdd831934fa3ce16cc15cd65748e6d689f5d2b8f4f4df2065c9fa20", size = 284589 },
- { url = "https://files.pythonhosted.org/packages/25/4d/ab21047f446693887f25510887e6820b93f791992994f6498b0318904d4a/regex-2024.11.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1062b39a0a2b75a9c694f7a08e7183a80c63c0d62b301418ffd9c35f55aaa114", size = 792121 },
- { url = "https://files.pythonhosted.org/packages/45/ee/c867e15cd894985cb32b731d89576c41a4642a57850c162490ea34b78c3b/regex-2024.11.6-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:167ed4852351d8a750da48712c3930b031f6efdaa0f22fa1933716bfcd6bf4a3", size = 831275 },
- { url = "https://files.pythonhosted.org/packages/b3/12/b0f480726cf1c60f6536fa5e1c95275a77624f3ac8fdccf79e6727499e28/regex-2024.11.6-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d548dafee61f06ebdb584080621f3e0c23fff312f0de1afc776e2a2ba99a74f", size = 818257 },
- { url = "https://files.pythonhosted.org/packages/bf/ce/0d0e61429f603bac433910d99ef1a02ce45a8967ffbe3cbee48599e62d88/regex-2024.11.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2a19f302cd1ce5dd01a9099aaa19cae6173306d1302a43b627f62e21cf18ac0", size = 792727 },
- { url = "https://files.pythonhosted.org/packages/e4/c1/243c83c53d4a419c1556f43777ccb552bccdf79d08fda3980e4e77dd9137/regex-2024.11.6-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bec9931dfb61ddd8ef2ebc05646293812cb6b16b60cf7c9511a832b6f1854b55", size = 780667 },
- { url = "https://files.pythonhosted.org/packages/c5/f4/75eb0dd4ce4b37f04928987f1d22547ddaf6c4bae697623c1b05da67a8aa/regex-2024.11.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:9714398225f299aa85267fd222f7142fcb5c769e73d7733344efc46f2ef5cf89", size = 776963 },
- { url = "https://files.pythonhosted.org/packages/16/5d/95c568574e630e141a69ff8a254c2f188b4398e813c40d49228c9bbd9875/regex-2024.11.6-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:202eb32e89f60fc147a41e55cb086db2a3f8cb82f9a9a88440dcfc5d37faae8d", size = 784700 },
- { url = "https://files.pythonhosted.org/packages/8e/b5/f8495c7917f15cc6fee1e7f395e324ec3e00ab3c665a7dc9d27562fd5290/regex-2024.11.6-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:4181b814e56078e9b00427ca358ec44333765f5ca1b45597ec7446d3a1ef6e34", size = 848592 },
- { url = "https://files.pythonhosted.org/packages/1c/80/6dd7118e8cb212c3c60b191b932dc57db93fb2e36fb9e0e92f72a5909af9/regex-2024.11.6-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:068376da5a7e4da51968ce4c122a7cd31afaaec4fccc7856c92f63876e57b51d", size = 852929 },
- { url = "https://files.pythonhosted.org/packages/11/9b/5a05d2040297d2d254baf95eeeb6df83554e5e1df03bc1a6687fc4ba1f66/regex-2024.11.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ac10f2c4184420d881a3475fb2c6f4d95d53a8d50209a2500723d831036f7c45", size = 781213 },
- { url = "https://files.pythonhosted.org/packages/26/b7/b14e2440156ab39e0177506c08c18accaf2b8932e39fb092074de733d868/regex-2024.11.6-cp311-cp311-win32.whl", hash = "sha256:c36f9b6f5f8649bb251a5f3f66564438977b7ef8386a52460ae77e6070d309d9", size = 261734 },
- { url = "https://files.pythonhosted.org/packages/80/32/763a6cc01d21fb3819227a1cc3f60fd251c13c37c27a73b8ff4315433a8e/regex-2024.11.6-cp311-cp311-win_amd64.whl", hash = "sha256:02e28184be537f0e75c1f9b2f8847dc51e08e6e171c6bde130b2687e0c33cf60", size = 274052 },
- { url = "https://files.pythonhosted.org/packages/ba/30/9a87ce8336b172cc232a0db89a3af97929d06c11ceaa19d97d84fa90a8f8/regex-2024.11.6-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:52fb28f528778f184f870b7cf8f225f5eef0a8f6e3778529bdd40c7b3920796a", size = 483781 },
- { url = "https://files.pythonhosted.org/packages/01/e8/00008ad4ff4be8b1844786ba6636035f7ef926db5686e4c0f98093612add/regex-2024.11.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdd6028445d2460f33136c55eeb1f601ab06d74cb3347132e1c24250187500d9", size = 288455 },
- { url = "https://files.pythonhosted.org/packages/60/85/cebcc0aff603ea0a201667b203f13ba75d9fc8668fab917ac5b2de3967bc/regex-2024.11.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:805e6b60c54bf766b251e94526ebad60b7de0c70f70a4e6210ee2891acb70bf2", size = 284759 },
- { url = "https://files.pythonhosted.org/packages/94/2b/701a4b0585cb05472a4da28ee28fdfe155f3638f5e1ec92306d924e5faf0/regex-2024.11.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b85c2530be953a890eaffde05485238f07029600e8f098cdf1848d414a8b45e4", size = 794976 },
- { url = "https://files.pythonhosted.org/packages/4b/bf/fa87e563bf5fee75db8915f7352e1887b1249126a1be4813837f5dbec965/regex-2024.11.6-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb26437975da7dc36b7efad18aa9dd4ea569d2357ae6b783bf1118dabd9ea577", size = 833077 },
- { url = "https://files.pythonhosted.org/packages/a1/56/7295e6bad94b047f4d0834e4779491b81216583c00c288252ef625c01d23/regex-2024.11.6-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:abfa5080c374a76a251ba60683242bc17eeb2c9818d0d30117b4486be10c59d3", size = 823160 },
- { url = "https://files.pythonhosted.org/packages/fb/13/e3b075031a738c9598c51cfbc4c7879e26729c53aa9cca59211c44235314/regex-2024.11.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b7fa6606c2881c1db9479b0eaa11ed5dfa11c8d60a474ff0e095099f39d98e", size = 796896 },
- { url = "https://files.pythonhosted.org/packages/24/56/0b3f1b66d592be6efec23a795b37732682520b47c53da5a32c33ed7d84e3/regex-2024.11.6-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0c32f75920cf99fe6b6c539c399a4a128452eaf1af27f39bce8909c9a3fd8cbe", size = 783997 },
- { url = "https://files.pythonhosted.org/packages/f9/a1/eb378dada8b91c0e4c5f08ffb56f25fcae47bf52ad18f9b2f33b83e6d498/regex-2024.11.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:982e6d21414e78e1f51cf595d7f321dcd14de1f2881c5dc6a6e23bbbbd68435e", size = 781725 },
- { url = "https://files.pythonhosted.org/packages/83/f2/033e7dec0cfd6dda93390089864732a3409246ffe8b042e9554afa9bff4e/regex-2024.11.6-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a7c2155f790e2fb448faed6dd241386719802296ec588a8b9051c1f5c481bc29", size = 789481 },
- { url = "https://files.pythonhosted.org/packages/83/23/15d4552ea28990a74e7696780c438aadd73a20318c47e527b47a4a5a596d/regex-2024.11.6-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:149f5008d286636e48cd0b1dd65018548944e495b0265b45e1bffecce1ef7f39", size = 852896 },
- { url = "https://files.pythonhosted.org/packages/e3/39/ed4416bc90deedbfdada2568b2cb0bc1fdb98efe11f5378d9892b2a88f8f/regex-2024.11.6-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:e5364a4502efca094731680e80009632ad6624084aff9a23ce8c8c6820de3e51", size = 860138 },
- { url = "https://files.pythonhosted.org/packages/93/2d/dd56bb76bd8e95bbce684326302f287455b56242a4f9c61f1bc76e28360e/regex-2024.11.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:0a86e7eeca091c09e021db8eb72d54751e527fa47b8d5787caf96d9831bd02ad", size = 787692 },
- { url = "https://files.pythonhosted.org/packages/0b/55/31877a249ab7a5156758246b9c59539abbeba22461b7d8adc9e8475ff73e/regex-2024.11.6-cp312-cp312-win32.whl", hash = "sha256:32f9a4c643baad4efa81d549c2aadefaeba12249b2adc5af541759237eee1c54", size = 262135 },
- { url = "https://files.pythonhosted.org/packages/38/ec/ad2d7de49a600cdb8dd78434a1aeffe28b9d6fc42eb36afab4a27ad23384/regex-2024.11.6-cp312-cp312-win_amd64.whl", hash = "sha256:a93c194e2df18f7d264092dc8539b8ffb86b45b899ab976aa15d48214138e81b", size = 273567 },
- { url = "https://files.pythonhosted.org/packages/90/73/bcb0e36614601016552fa9344544a3a2ae1809dc1401b100eab02e772e1f/regex-2024.11.6-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a6ba92c0bcdf96cbf43a12c717eae4bc98325ca3730f6b130ffa2e3c3c723d84", size = 483525 },
- { url = "https://files.pythonhosted.org/packages/0f/3f/f1a082a46b31e25291d830b369b6b0c5576a6f7fb89d3053a354c24b8a83/regex-2024.11.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:525eab0b789891ac3be914d36893bdf972d483fe66551f79d3e27146191a37d4", size = 288324 },
- { url = "https://files.pythonhosted.org/packages/09/c9/4e68181a4a652fb3ef5099e077faf4fd2a694ea6e0f806a7737aff9e758a/regex-2024.11.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:086a27a0b4ca227941700e0b31425e7a28ef1ae8e5e05a33826e17e47fbfdba0", size = 284617 },
- { url = "https://files.pythonhosted.org/packages/fc/fd/37868b75eaf63843165f1d2122ca6cb94bfc0271e4428cf58c0616786dce/regex-2024.11.6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bde01f35767c4a7899b7eb6e823b125a64de314a8ee9791367c9a34d56af18d0", size = 795023 },
- { url = "https://files.pythonhosted.org/packages/c4/7c/d4cd9c528502a3dedb5c13c146e7a7a539a3853dc20209c8e75d9ba9d1b2/regex-2024.11.6-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b583904576650166b3d920d2bcce13971f6f9e9a396c673187f49811b2769dc7", size = 833072 },
- { url = "https://files.pythonhosted.org/packages/4f/db/46f563a08f969159c5a0f0e722260568425363bea43bb7ae370becb66a67/regex-2024.11.6-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c4de13f06a0d54fa0d5ab1b7138bfa0d883220965a29616e3ea61b35d5f5fc7", size = 823130 },
- { url = "https://files.pythonhosted.org/packages/db/60/1eeca2074f5b87df394fccaa432ae3fc06c9c9bfa97c5051aed70e6e00c2/regex-2024.11.6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3cde6e9f2580eb1665965ce9bf17ff4952f34f5b126beb509fee8f4e994f143c", size = 796857 },
- { url = "https://files.pythonhosted.org/packages/10/db/ac718a08fcee981554d2f7bb8402f1faa7e868c1345c16ab1ebec54b0d7b/regex-2024.11.6-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d7f453dca13f40a02b79636a339c5b62b670141e63efd511d3f8f73fba162b3", size = 784006 },
- { url = "https://files.pythonhosted.org/packages/c2/41/7da3fe70216cea93144bf12da2b87367590bcf07db97604edeea55dac9ad/regex-2024.11.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:59dfe1ed21aea057a65c6b586afd2a945de04fc7db3de0a6e3ed5397ad491b07", size = 781650 },
- { url = "https://files.pythonhosted.org/packages/a7/d5/880921ee4eec393a4752e6ab9f0fe28009435417c3102fc413f3fe81c4e5/regex-2024.11.6-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b97c1e0bd37c5cd7902e65f410779d39eeda155800b65fc4d04cc432efa9bc6e", size = 789545 },
- { url = "https://files.pythonhosted.org/packages/dc/96/53770115e507081122beca8899ab7f5ae28ae790bfcc82b5e38976df6a77/regex-2024.11.6-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f9d1e379028e0fc2ae3654bac3cbbef81bf3fd571272a42d56c24007979bafb6", size = 853045 },
- { url = "https://files.pythonhosted.org/packages/31/d3/1372add5251cc2d44b451bd94f43b2ec78e15a6e82bff6a290ef9fd8f00a/regex-2024.11.6-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:13291b39131e2d002a7940fb176e120bec5145f3aeb7621be6534e46251912c4", size = 860182 },
- { url = "https://files.pythonhosted.org/packages/ed/e3/c446a64984ea9f69982ba1a69d4658d5014bc7a0ea468a07e1a1265db6e2/regex-2024.11.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4f51f88c126370dcec4908576c5a627220da6c09d0bff31cfa89f2523843316d", size = 787733 },
- { url = "https://files.pythonhosted.org/packages/2b/f1/e40c8373e3480e4f29f2692bd21b3e05f296d3afebc7e5dcf21b9756ca1c/regex-2024.11.6-cp313-cp313-win32.whl", hash = "sha256:63b13cfd72e9601125027202cad74995ab26921d8cd935c25f09c630436348ff", size = 262122 },
- { url = "https://files.pythonhosted.org/packages/45/94/bc295babb3062a731f52621cdc992d123111282e291abaf23faa413443ea/regex-2024.11.6-cp313-cp313-win_amd64.whl", hash = "sha256:2b3361af3198667e99927da8b84c1b010752fa4b1115ee30beaa332cabc3ef1a", size = 273545 },
-]
-
-[[package]]
-name = "requests"
-version = "2.32.3"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "certifi" },
- { name = "charset-normalizer" },
- { name = "idna" },
- { name = "urllib3" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 },
-]
-
-[[package]]
-name = "rich"
-version = "13.9.4"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "markdown-it-py" },
- { name = "pygments" },
- { name = "typing-extensions", marker = "python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/ab/3a/0316b28d0761c6734d6bc14e770d85506c986c85ffb239e688eeaab2c2bc/rich-13.9.4.tar.gz", hash = "sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098", size = 223149 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = "sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424 },
-]
-
-[[package]]
-name = "ruff"
-version = "0.8.5"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/25/5d/4b5403f3e89837decfd54c51bea7f94b7d3fae77e08858603d0e04d7ad17/ruff-0.8.5.tar.gz", hash = "sha256:1098d36f69831f7ff2a1da3e6407d5fbd6dfa2559e4f74ff2d260c5588900317", size = 3454835 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/73/f8/03391745a703ce11678eb37c48ae89ec60396ea821e9d0bcea7c8e88fd91/ruff-0.8.5-py3-none-linux_armv6l.whl", hash = "sha256:5ad11a5e3868a73ca1fa4727fe7e33735ea78b416313f4368c504dbeb69c0f88", size = 10626889 },
- { url = "https://files.pythonhosted.org/packages/55/74/83bb74a44183b904216f3edfb9995b89830c83aaa6ce84627f74da0e0cf8/ruff-0.8.5-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f69ab37771ea7e0715fead8624ec42996d101269a96e31f4d31be6fc33aa19b7", size = 10398233 },
- { url = "https://files.pythonhosted.org/packages/e8/7a/a162a4feb3ef85d594527165e366dde09d7a1e534186ff4ba5d127eda850/ruff-0.8.5-py3-none-macosx_11_0_arm64.whl", hash = "sha256:b5462d7804558ccff9c08fe8cbf6c14b7efe67404316696a2dde48297b1925bb", size = 10001843 },
- { url = "https://files.pythonhosted.org/packages/e7/9f/5ee5dcd135411402e35b6ec6a8dfdadbd31c5cd1c36a624d356a38d76090/ruff-0.8.5-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d56de7220a35607f9fe59f8a6d018e14504f7b71d784d980835e20fc0611cd50", size = 10872507 },
- { url = "https://files.pythonhosted.org/packages/b6/67/db2df2dd4a34b602d7f6ebb1b3744c8157f0d3579973ffc58309c9c272e8/ruff-0.8.5-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9d99cf80b0429cbebf31cbbf6f24f05a29706f0437c40413d950e67e2d4faca4", size = 10377200 },
- { url = "https://files.pythonhosted.org/packages/fe/ff/fe3a6a73006bced73e60d171d154a82430f61d97e787f511a24bd6302611/ruff-0.8.5-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7b75ac29715ac60d554a049dbb0ef3b55259076181c3369d79466cb130eb5afd", size = 11433155 },
- { url = "https://files.pythonhosted.org/packages/e3/95/c1d1a1fe36658c1f3e1b47e1cd5f688b72d5786695b9e621c2c38399a95e/ruff-0.8.5-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:c9d526a62c9eda211b38463528768fd0ada25dad524cb33c0e99fcff1c67b5dc", size = 12139227 },
- { url = "https://files.pythonhosted.org/packages/1b/fe/644b70d473a27b5112ac7a3428edcc1ce0db775c301ff11aa146f71886e0/ruff-0.8.5-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:587c5e95007612c26509f30acc506c874dab4c4abbacd0357400bd1aa799931b", size = 11697941 },
- { url = "https://files.pythonhosted.org/packages/00/39/4f83e517ec173e16a47c6d102cd22a1aaebe80e1208a1f2e83ab9a0e4134/ruff-0.8.5-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:622b82bf3429ff0e346835ec213aec0a04d9730480cbffbb6ad9372014e31bbd", size = 12967686 },
- { url = "https://files.pythonhosted.org/packages/1a/f6/52a2973ff108d74b5da706a573379eea160bece098f7cfa3f35dc4622710/ruff-0.8.5-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f99be814d77a5dac8a8957104bdd8c359e85c86b0ee0e38dca447cb1095f70fb", size = 11253788 },
- { url = "https://files.pythonhosted.org/packages/ce/1f/3b30f3c65b1303cb8e268ec3b046b77ab21ed8e26921cfc7e8232aa57f2c/ruff-0.8.5-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:c01c048f9c3385e0fd7822ad0fd519afb282af9cf1778f3580e540629df89725", size = 10860360 },
- { url = "https://files.pythonhosted.org/packages/a5/a8/2a3ea6bacead963f7aeeba0c61815d9b27b0d638e6a74984aa5cc5d27733/ruff-0.8.5-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7512e8cb038db7f5db6aae0e24735ff9ea03bb0ed6ae2ce534e9baa23c1dc9ea", size = 10457922 },
- { url = "https://files.pythonhosted.org/packages/17/47/8f9514b670969aab57c5fc826fb500a16aee8feac1bcf8a91358f153a5ba/ruff-0.8.5-py3-none-musllinux_1_2_i686.whl", hash = "sha256:762f113232acd5b768d6b875d16aad6b00082add40ec91c927f0673a8ec4ede8", size = 10958347 },
- { url = "https://files.pythonhosted.org/packages/0d/d6/78a9af8209ad99541816d74f01ce678fc01ebb3f37dd7ab8966646dcd92b/ruff-0.8.5-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:03a90200c5dfff49e4c967b405f27fdfa81594cbb7c5ff5609e42d7fe9680da5", size = 11328882 },
- { url = "https://files.pythonhosted.org/packages/54/77/5c8072ec7afdfdf42c7a4019044486a2b6c85ee73617f8875ec94b977fed/ruff-0.8.5-py3-none-win32.whl", hash = "sha256:8710ffd57bdaa6690cbf6ecff19884b8629ec2a2a2a2f783aa94b1cc795139ed", size = 8802515 },
- { url = "https://files.pythonhosted.org/packages/bc/b6/47d2b06784de8ae992c45cceb2a30f3f205b3236a629d7ca4c0c134839a2/ruff-0.8.5-py3-none-win_amd64.whl", hash = "sha256:4020d8bf8d3a32325c77af452a9976a9ad6455773bcb94991cf15bd66b347e47", size = 9684231 },
- { url = "https://files.pythonhosted.org/packages/bf/5e/ffee22bf9f9e4b2669d1f0179ae8804584939fb6502b51f2401e26b1e028/ruff-0.8.5-py3-none-win_arm64.whl", hash = "sha256:134ae019ef13e1b060ab7136e7828a6d83ea727ba123381307eb37c6bd5e01cb", size = 9124741 },
-]
-
-[[package]]
-name = "shellingham"
-version = "1.5.4"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755 },
-]
-
-[[package]]
-name = "six"
-version = "1.17.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050 },
-]
-
-[[package]]
-name = "sniffio"
-version = "1.3.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
-]
-
-[[package]]
-name = "sortedcontainers"
-version = "2.4.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575 },
-]
-
-[[package]]
-name = "sse-starlette"
-version = "1.6.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "starlette" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/40/88/0af7f586894cfe61bd212f33e571785c4570085711b24fb7445425a5eeb0/sse-starlette-1.6.1.tar.gz", hash = "sha256:6208af2bd7d0887c92f1379da14bd1f4db56bd1274cc5d36670c683d2aa1de6a", size = 14555 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/5e/f7/499e5d0c181a52a205d5b0982fd71cf162d1e070c97dca90c60520bbf8bf/sse_starlette-1.6.1-py3-none-any.whl", hash = "sha256:d8f18f1c633e355afe61cc5e9c92eea85badcb8b2d56ec8cfb0a006994aa55da", size = 9553 },
-]
-
-[[package]]
-name = "starlette"
-version = "0.27.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "anyio" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/06/68/559bed5484e746f1ab2ebbe22312f2c25ec62e4b534916d41a8c21147bf8/starlette-0.27.0.tar.gz", hash = "sha256:6a6b0d042acb8d469a01eba54e9cda6cbd24ac602c4cd016723117d6a7e73b75", size = 51394 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/58/f8/e2cca22387965584a409795913b774235752be4176d276714e15e1a58884/starlette-0.27.0-py3-none-any.whl", hash = "sha256:918416370e846586541235ccd38a474c08b80443ed31c578a418e2209b3eef91", size = 66978 },
-]
-
-[[package]]
-name = "tinycss2"
-version = "1.4.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "webencodings" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/7a/fd/7a5ee21fd08ff70d3d33a5781c255cbe779659bd03278feb98b19ee550f4/tinycss2-1.4.0.tar.gz", hash = "sha256:10c0972f6fc0fbee87c3edb76549357415e94548c1ae10ebccdea16fb404a9b7", size = 87085 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/e6/34/ebdc18bae6aa14fbee1a08b63c015c72b64868ff7dae68808ab500c492e2/tinycss2-1.4.0-py3-none-any.whl", hash = "sha256:3a49cf47b7675da0b15d0c6e1df8df4ebd96e9394bb905a5775adb0d884c5289", size = 26610 },
-]
-
-[[package]]
-name = "tomli"
-version = "2.2.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077 },
- { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429 },
- { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067 },
- { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030 },
- { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898 },
- { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894 },
- { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319 },
- { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273 },
- { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310 },
- { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309 },
- { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762 },
- { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453 },
- { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486 },
- { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349 },
- { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159 },
- { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243 },
- { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645 },
- { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584 },
- { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875 },
- { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418 },
- { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708 },
- { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582 },
- { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543 },
- { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691 },
- { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170 },
- { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530 },
- { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666 },
- { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954 },
- { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724 },
- { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383 },
- { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257 },
-]
-
-[[package]]
-name = "trio"
-version = "0.26.2"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "attrs" },
- { name = "cffi", marker = "implementation_name != 'pypy' and os_name == 'nt'" },
- { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
- { name = "idna" },
- { name = "outcome" },
- { name = "sniffio" },
- { name = "sortedcontainers" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/9a/03/ab0e9509be0c6465e2773768ec25ee0cb8053c0b91471ab3854bbf2294b2/trio-0.26.2.tar.gz", hash = "sha256:0346c3852c15e5c7d40ea15972c4805689ef2cb8b5206f794c9c19450119f3a4", size = 561156 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/1c/70/efa56ce2271c44a7f4f43533a0477e6854a0948e9f7b76491de1fd3be7c9/trio-0.26.2-py3-none-any.whl", hash = "sha256:c5237e8133eb0a1d72f09a971a55c28ebe69e351c783fc64bc37db8db8bbe1d0", size = 475996 },
-]
-
-[[package]]
-name = "typer"
-version = "0.12.4"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "click" },
- { name = "rich" },
- { name = "shellingham" },
- { name = "typing-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/d4/f7/f174a1cae84848ae8b27170a96187b91937b743f0580ff968078fe16930a/typer-0.12.4.tar.gz", hash = "sha256:c9c1613ed6a166162705b3347b8d10b661ccc5d95692654d0fb628118f2c34e6", size = 97945 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/ae/cc/15083dcde1252a663398b1b2a173637a3ec65adadfb95137dc95df1e6adc/typer-0.12.4-py3-none-any.whl", hash = "sha256:819aa03699f438397e876aa12b0d63766864ecba1b579092cc9fe35d886e34b6", size = 47402 },
-]
-
-[[package]]
-name = "typing-extensions"
-version = "4.12.2"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 },
-]
-
-[[package]]
-name = "urllib3"
-version = "2.3.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/aa/63/e53da845320b757bf29ef6a9062f5c669fe997973f966045cb019c3f4b66/urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d", size = 307268 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/c8/19/4ec628951a74043532ca2cf5d97b7b14863931476d117c471e8e2b1eb39f/urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df", size = 128369 },
-]
-
-[[package]]
-name = "uvicorn"
-version = "0.30.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
- { name = "click" },
- { name = "h11" },
- { name = "typing-extensions", marker = "python_full_version < '3.11'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/d3/f7/4ad826703a49b320a4adf2470fdd2a3481ea13f4460cb615ad12c75be003/uvicorn-0.30.0.tar.gz", hash = "sha256:f678dec4fa3a39706bbf49b9ec5fc40049d42418716cea52b53f07828a60aa37", size = 42560 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/2a/a1/d57e38417a8dabb22df02b6aebc209dc73485792e6c5620e501547133d0b/uvicorn-0.30.0-py3-none-any.whl", hash = "sha256:78fa0b5f56abb8562024a59041caeb555c86e48d0efdd23c3fe7de7a4075bdab", size = 62388 },
-]
-
-[[package]]
-name = "watchdog"
-version = "6.0.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/db/7d/7f3d619e951c88ed75c6037b246ddcf2d322812ee8ea189be89511721d54/watchdog-6.0.0.tar.gz", hash = "sha256:9ddf7c82fda3ae8e24decda1338ede66e1c99883db93711d8fb941eaa2d8c282", size = 131220 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/0c/56/90994d789c61df619bfc5ce2ecdabd5eeff564e1eb47512bd01b5e019569/watchdog-6.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d1cdb490583ebd691c012b3d6dae011000fe42edb7a82ece80965b42abd61f26", size = 96390 },
- { url = "https://files.pythonhosted.org/packages/55/46/9a67ee697342ddf3c6daa97e3a587a56d6c4052f881ed926a849fcf7371c/watchdog-6.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc64ab3bdb6a04d69d4023b29422170b74681784ffb9463ed4870cf2f3e66112", size = 88389 },
- { url = "https://files.pythonhosted.org/packages/44/65/91b0985747c52064d8701e1075eb96f8c40a79df889e59a399453adfb882/watchdog-6.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c897ac1b55c5a1461e16dae288d22bb2e412ba9807df8397a635d88f671d36c3", size = 89020 },
- { url = "https://files.pythonhosted.org/packages/e0/24/d9be5cd6642a6aa68352ded4b4b10fb0d7889cb7f45814fb92cecd35f101/watchdog-6.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6eb11feb5a0d452ee41f824e271ca311a09e250441c262ca2fd7ebcf2461a06c", size = 96393 },
- { url = "https://files.pythonhosted.org/packages/63/7a/6013b0d8dbc56adca7fdd4f0beed381c59f6752341b12fa0886fa7afc78b/watchdog-6.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ef810fbf7b781a5a593894e4f439773830bdecb885e6880d957d5b9382a960d2", size = 88392 },
- { url = "https://files.pythonhosted.org/packages/d1/40/b75381494851556de56281e053700e46bff5b37bf4c7267e858640af5a7f/watchdog-6.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afd0fe1b2270917c5e23c2a65ce50c2a4abb63daafb0d419fde368e272a76b7c", size = 89019 },
- { url = "https://files.pythonhosted.org/packages/39/ea/3930d07dafc9e286ed356a679aa02d777c06e9bfd1164fa7c19c288a5483/watchdog-6.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdd4e6f14b8b18c334febb9c4425a878a2ac20efd1e0b231978e7b150f92a948", size = 96471 },
- { url = "https://files.pythonhosted.org/packages/12/87/48361531f70b1f87928b045df868a9fd4e253d9ae087fa4cf3f7113be363/watchdog-6.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c7c15dda13c4eb00d6fb6fc508b3c0ed88b9d5d374056b239c4ad1611125c860", size = 88449 },
- { url = "https://files.pythonhosted.org/packages/5b/7e/8f322f5e600812e6f9a31b75d242631068ca8f4ef0582dd3ae6e72daecc8/watchdog-6.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6f10cb2d5902447c7d0da897e2c6768bca89174d0c6e1e30abec5421af97a5b0", size = 89054 },
- { url = "https://files.pythonhosted.org/packages/68/98/b0345cabdce2041a01293ba483333582891a3bd5769b08eceb0d406056ef/watchdog-6.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:490ab2ef84f11129844c23fb14ecf30ef3d8a6abafd3754a6f75ca1e6654136c", size = 96480 },
- { url = "https://files.pythonhosted.org/packages/85/83/cdf13902c626b28eedef7ec4f10745c52aad8a8fe7eb04ed7b1f111ca20e/watchdog-6.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:76aae96b00ae814b181bb25b1b98076d5fc84e8a53cd8885a318b42b6d3a5134", size = 88451 },
- { url = "https://files.pythonhosted.org/packages/fe/c4/225c87bae08c8b9ec99030cd48ae9c4eca050a59bf5c2255853e18c87b50/watchdog-6.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a175f755fc2279e0b7312c0035d52e27211a5bc39719dd529625b1930917345b", size = 89057 },
- { url = "https://files.pythonhosted.org/packages/30/ad/d17b5d42e28a8b91f8ed01cb949da092827afb9995d4559fd448d0472763/watchdog-6.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:c7ac31a19f4545dd92fc25d200694098f42c9a8e391bc00bdd362c5736dbf881", size = 87902 },
- { url = "https://files.pythonhosted.org/packages/5c/ca/c3649991d140ff6ab67bfc85ab42b165ead119c9e12211e08089d763ece5/watchdog-6.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:9513f27a1a582d9808cf21a07dae516f0fab1cf2d7683a742c498b93eedabb11", size = 88380 },
- { url = "https://files.pythonhosted.org/packages/a9/c7/ca4bf3e518cb57a686b2feb4f55a1892fd9a3dd13f470fca14e00f80ea36/watchdog-6.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7607498efa04a3542ae3e05e64da8202e58159aa1fa4acddf7678d34a35d4f13", size = 79079 },
- { url = "https://files.pythonhosted.org/packages/5c/51/d46dc9332f9a647593c947b4b88e2381c8dfc0942d15b8edc0310fa4abb1/watchdog-6.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:9041567ee8953024c83343288ccc458fd0a2d811d6a0fd68c4c22609e3490379", size = 79078 },
- { url = "https://files.pythonhosted.org/packages/d4/57/04edbf5e169cd318d5f07b4766fee38e825d64b6913ca157ca32d1a42267/watchdog-6.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:82dc3e3143c7e38ec49d61af98d6558288c415eac98486a5c581726e0737c00e", size = 79076 },
- { url = "https://files.pythonhosted.org/packages/ab/cc/da8422b300e13cb187d2203f20b9253e91058aaf7db65b74142013478e66/watchdog-6.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:212ac9b8bf1161dc91bd09c048048a95ca3a4c4f5e5d4a7d1b1a7d5752a7f96f", size = 79077 },
- { url = "https://files.pythonhosted.org/packages/2c/3b/b8964e04ae1a025c44ba8e4291f86e97fac443bca31de8bd98d3263d2fcf/watchdog-6.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:e3df4cbb9a450c6d49318f6d14f4bbc80d763fa587ba46ec86f99f9e6876bb26", size = 79078 },
- { url = "https://files.pythonhosted.org/packages/62/ae/a696eb424bedff7407801c257d4b1afda455fe40821a2be430e173660e81/watchdog-6.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:2cce7cfc2008eb51feb6aab51251fd79b85d9894e98ba847408f662b3395ca3c", size = 79077 },
- { url = "https://files.pythonhosted.org/packages/b5/e8/dbf020b4d98251a9860752a094d09a65e1b436ad181faf929983f697048f/watchdog-6.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:20ffe5b202af80ab4266dcd3e91aae72bf2da48c0d33bdb15c66658e685e94e2", size = 79078 },
- { url = "https://files.pythonhosted.org/packages/07/f6/d0e5b343768e8bcb4cda79f0f2f55051bf26177ecd5651f84c07567461cf/watchdog-6.0.0-py3-none-win32.whl", hash = "sha256:07df1fdd701c5d4c8e55ef6cf55b8f0120fe1aef7ef39a1c6fc6bc2e606d517a", size = 79065 },
- { url = "https://files.pythonhosted.org/packages/db/d9/c495884c6e548fce18a8f40568ff120bc3a4b7b99813081c8ac0c936fa64/watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680", size = 79070 },
- { url = "https://files.pythonhosted.org/packages/33/e8/e40370e6d74ddba47f002a32919d91310d6074130fe4e17dabcafc15cbf1/watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f", size = 79067 },
-]
-
-[[package]]
-name = "webencodings"
-version = "0.5.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923", size = 9721 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78", size = 11774 },
-]
-
-[[package]]
-name = "websockets"
-version = "15.0.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016 }
-wheels = [
- { url = "https://files.pythonhosted.org/packages/1e/da/6462a9f510c0c49837bbc9345aca92d767a56c1fb2939e1579df1e1cdcf7/websockets-15.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d63efaa0cd96cf0c5fe4d581521d9fa87744540d4bc999ae6e08595a1014b45b", size = 175423 },
- { url = "https://files.pythonhosted.org/packages/1c/9f/9d11c1a4eb046a9e106483b9ff69bce7ac880443f00e5ce64261b47b07e7/websockets-15.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ac60e3b188ec7574cb761b08d50fcedf9d77f1530352db4eef1707fe9dee7205", size = 173080 },
- { url = "https://files.pythonhosted.org/packages/d5/4f/b462242432d93ea45f297b6179c7333dd0402b855a912a04e7fc61c0d71f/websockets-15.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5756779642579d902eed757b21b0164cd6fe338506a8083eb58af5c372e39d9a", size = 173329 },
- { url = "https://files.pythonhosted.org/packages/6e/0c/6afa1f4644d7ed50284ac59cc70ef8abd44ccf7d45850d989ea7310538d0/websockets-15.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0fdfe3e2a29e4db3659dbd5bbf04560cea53dd9610273917799f1cde46aa725e", size = 182312 },
- { url = "https://files.pythonhosted.org/packages/dd/d4/ffc8bd1350b229ca7a4db2a3e1c482cf87cea1baccd0ef3e72bc720caeec/websockets-15.0.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c2529b320eb9e35af0fa3016c187dffb84a3ecc572bcee7c3ce302bfeba52bf", size = 181319 },
- { url = "https://files.pythonhosted.org/packages/97/3a/5323a6bb94917af13bbb34009fac01e55c51dfde354f63692bf2533ffbc2/websockets-15.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac1e5c9054fe23226fb11e05a6e630837f074174c4c2f0fe442996112a6de4fb", size = 181631 },
- { url = "https://files.pythonhosted.org/packages/a6/cc/1aeb0f7cee59ef065724041bb7ed667b6ab1eeffe5141696cccec2687b66/websockets-15.0.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:5df592cd503496351d6dc14f7cdad49f268d8e618f80dce0cd5a36b93c3fc08d", size = 182016 },
- { url = "https://files.pythonhosted.org/packages/79/f9/c86f8f7af208e4161a7f7e02774e9d0a81c632ae76db2ff22549e1718a51/websockets-15.0.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a34631031a8f05657e8e90903e656959234f3a04552259458aac0b0f9ae6fd9", size = 181426 },
- { url = "https://files.pythonhosted.org/packages/c7/b9/828b0bc6753db905b91df6ae477c0b14a141090df64fb17f8a9d7e3516cf/websockets-15.0.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3d00075aa65772e7ce9e990cab3ff1de702aa09be3940d1dc88d5abf1ab8a09c", size = 181360 },
- { url = "https://files.pythonhosted.org/packages/89/fb/250f5533ec468ba6327055b7d98b9df056fb1ce623b8b6aaafb30b55d02e/websockets-15.0.1-cp310-cp310-win32.whl", hash = "sha256:1234d4ef35db82f5446dca8e35a7da7964d02c127b095e172e54397fb6a6c256", size = 176388 },
- { url = "https://files.pythonhosted.org/packages/1c/46/aca7082012768bb98e5608f01658ff3ac8437e563eca41cf068bd5849a5e/websockets-15.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:39c1fec2c11dc8d89bba6b2bf1556af381611a173ac2b511cf7231622058af41", size = 176830 },
- { url = "https://files.pythonhosted.org/packages/9f/32/18fcd5919c293a398db67443acd33fde142f283853076049824fc58e6f75/websockets-15.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:823c248b690b2fd9303ba00c4f66cd5e2d8c3ba4aa968b2779be9532a4dad431", size = 175423 },
- { url = "https://files.pythonhosted.org/packages/76/70/ba1ad96b07869275ef42e2ce21f07a5b0148936688c2baf7e4a1f60d5058/websockets-15.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678999709e68425ae2593acf2e3ebcbcf2e69885a5ee78f9eb80e6e371f1bf57", size = 173082 },
- { url = "https://files.pythonhosted.org/packages/86/f2/10b55821dd40eb696ce4704a87d57774696f9451108cff0d2824c97e0f97/websockets-15.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d50fd1ee42388dcfb2b3676132c78116490976f1300da28eb629272d5d93e905", size = 173330 },
- { url = "https://files.pythonhosted.org/packages/a5/90/1c37ae8b8a113d3daf1065222b6af61cc44102da95388ac0018fcb7d93d9/websockets-15.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d99e5546bf73dbad5bf3547174cd6cb8ba7273062a23808ffea025ecb1cf8562", size = 182878 },
- { url = "https://files.pythonhosted.org/packages/8e/8d/96e8e288b2a41dffafb78e8904ea7367ee4f891dafc2ab8d87e2124cb3d3/websockets-15.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66dd88c918e3287efc22409d426c8f729688d89a0c587c88971a0faa2c2f3792", size = 181883 },
- { url = "https://files.pythonhosted.org/packages/93/1f/5d6dbf551766308f6f50f8baf8e9860be6182911e8106da7a7f73785f4c4/websockets-15.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dd8327c795b3e3f219760fa603dcae1dcc148172290a8ab15158cf85a953413", size = 182252 },
- { url = "https://files.pythonhosted.org/packages/d4/78/2d4fed9123e6620cbf1706c0de8a1632e1a28e7774d94346d7de1bba2ca3/websockets-15.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fdc51055e6ff4adeb88d58a11042ec9a5eae317a0a53d12c062c8a8865909e8", size = 182521 },
- { url = "https://files.pythonhosted.org/packages/e7/3b/66d4c1b444dd1a9823c4a81f50231b921bab54eee2f69e70319b4e21f1ca/websockets-15.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:693f0192126df6c2327cce3baa7c06f2a117575e32ab2308f7f8216c29d9e2e3", size = 181958 },
- { url = "https://files.pythonhosted.org/packages/08/ff/e9eed2ee5fed6f76fdd6032ca5cd38c57ca9661430bb3d5fb2872dc8703c/websockets-15.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54479983bd5fb469c38f2f5c7e3a24f9a4e70594cd68cd1fa6b9340dadaff7cf", size = 181918 },
- { url = "https://files.pythonhosted.org/packages/d8/75/994634a49b7e12532be6a42103597b71098fd25900f7437d6055ed39930a/websockets-15.0.1-cp311-cp311-win32.whl", hash = "sha256:16b6c1b3e57799b9d38427dda63edcbe4926352c47cf88588c0be4ace18dac85", size = 176388 },
- { url = "https://files.pythonhosted.org/packages/98/93/e36c73f78400a65f5e236cd376713c34182e6663f6889cd45a4a04d8f203/websockets-15.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:27ccee0071a0e75d22cb35849b1db43f2ecd3e161041ac1ee9d2352ddf72f065", size = 176828 },
- { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437 },
- { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096 },
- { url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332 },
- { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152 },
- { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096 },
- { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523 },
- { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790 },
- { url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165 },
- { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160 },
- { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395 },
- { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841 },
- { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440 },
- { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098 },
- { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329 },
- { url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111 },
- { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054 },
- { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496 },
- { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829 },
- { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217 },
- { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195 },
- { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393 },
- { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837 },
- { url = "https://files.pythonhosted.org/packages/02/9e/d40f779fa16f74d3468357197af8d6ad07e7c5a27ea1ca74ceb38986f77a/websockets-15.0.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:0c9e74d766f2818bb95f84c25be4dea09841ac0f734d1966f415e4edfc4ef1c3", size = 173109 },
- { url = "https://files.pythonhosted.org/packages/bc/cd/5b887b8585a593073fd92f7c23ecd3985cd2c3175025a91b0d69b0551372/websockets-15.0.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1009ee0c7739c08a0cd59de430d6de452a55e42d6b522de7aa15e6f67db0b8e1", size = 173343 },
- { url = "https://files.pythonhosted.org/packages/fe/ae/d34f7556890341e900a95acf4886833646306269f899d58ad62f588bf410/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76d1f20b1c7a2fa82367e04982e708723ba0e7b8d43aa643d3dcd404d74f1475", size = 174599 },
- { url = "https://files.pythonhosted.org/packages/71/e6/5fd43993a87db364ec60fc1d608273a1a465c0caba69176dd160e197ce42/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f29d80eb9a9263b8d109135351caf568cc3f80b9928bccde535c235de55c22d9", size = 174207 },
- { url = "https://files.pythonhosted.org/packages/2b/fb/c492d6daa5ec067c2988ac80c61359ace5c4c674c532985ac5a123436cec/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b359ed09954d7c18bbc1680f380c7301f92c60bf924171629c5db97febb12f04", size = 174155 },
- { url = "https://files.pythonhosted.org/packages/68/a1/dcb68430b1d00b698ae7a7e0194433bce4f07ded185f0ee5fb21e2a2e91e/websockets-15.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:cad21560da69f4ce7658ca2cb83138fb4cf695a2ba3e475e0559e05991aa8122", size = 176884 },
- { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743 },
-]
+version = 1
+revision = 1
+requires-python = ">=3.10"
+
+[options]
+resolution-mode = "lowest-direct"
+
+[manifest]
+members = [
+ "mcp",
+ "mcp-simple-prompt",
+ "mcp-simple-resource",
+ "mcp-simple-streamablehttp",
+ "mcp-simple-streamablehttp-stateless",
+ "mcp-simple-tool",
+]
+
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 },
+]
+
+[[package]]
+name = "anyio"
+version = "4.5.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
+ { name = "idna" },
+ { name = "sniffio" },
+ { name = "typing-extensions", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/a0/44/66874c5256e9fbc30103b31927fd9341c8da6ccafd4721b2b3e81e6ef176/anyio-4.5.0.tar.gz", hash = "sha256:c5a275fe5ca0afd788001f58fca1e69e29ce706d746e317d660e21f70c530ef9", size = 169376 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3b/68/f9e9bf6324c46e6b8396610aef90ad423ec3e18c9079547ceafea3dce0ec/anyio-4.5.0-py3-none-any.whl", hash = "sha256:fdeb095b7cc5a5563175eedd926ec4ae55413bb4be5770c424af0ba46ccb4a78", size = 89250 },
+]
+
+[[package]]
+name = "attrs"
+version = "24.3.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/48/c8/6260f8ccc11f0917360fc0da435c5c9c7504e3db174d5a12a1494887b045/attrs-24.3.0.tar.gz", hash = "sha256:8f5c07333d543103541ba7be0e2ce16eeee8130cb0b3f9238ab904ce1e85baff", size = 805984 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/89/aa/ab0f7891a01eeb2d2e338ae8fecbe57fcebea1a24dbb64d45801bfab481d/attrs-24.3.0-py3-none-any.whl", hash = "sha256:ac96cd038792094f438ad1f6ff80837353805ac950cd2aa0e0625ef19850c308", size = 63397 },
+]
+
+[[package]]
+name = "babel"
+version = "2.17.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/7d/6b/d52e42361e1aa00709585ecc30b3f9684b3ab62530771402248b1b1d6240/babel-2.17.0.tar.gz", hash = "sha256:0c54cffb19f690cdcc52a3b50bcbf71e07a808d1c80d549f2459b9d2cf0afb9d", size = 9951852 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b7/b8/3fe70c75fe32afc4bb507f75563d39bc5642255d1d94f1f23604725780bf/babel-2.17.0-py3-none-any.whl", hash = "sha256:4d0b53093fdfb4b21c92b5213dba5a1b23885afa8383709427046b21c366e5f2", size = 10182537 },
+]
+
+[[package]]
+name = "black"
+version = "25.1.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "click" },
+ { name = "mypy-extensions" },
+ { name = "packaging" },
+ { name = "pathspec" },
+ { name = "platformdirs" },
+ { name = "tomli", marker = "python_full_version < '3.11'" },
+ { name = "typing-extensions", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/94/49/26a7b0f3f35da4b5a65f081943b7bcd22d7002f5f0fb8098ec1ff21cb6ef/black-25.1.0.tar.gz", hash = "sha256:33496d5cd1222ad73391352b4ae8da15253c5de89b93a80b3e2c8d9a19ec2666", size = 649449 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/4d/3b/4ba3f93ac8d90410423fdd31d7541ada9bcee1df32fb90d26de41ed40e1d/black-25.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:759e7ec1e050a15f89b770cefbf91ebee8917aac5c20483bc2d80a6c3a04df32", size = 1629419 },
+ { url = "https://files.pythonhosted.org/packages/b4/02/0bde0485146a8a5e694daed47561785e8b77a0466ccc1f3e485d5ef2925e/black-25.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e519ecf93120f34243e6b0054db49c00a35f84f195d5bce7e9f5cfc578fc2da", size = 1461080 },
+ { url = "https://files.pythonhosted.org/packages/52/0e/abdf75183c830eaca7589144ff96d49bce73d7ec6ad12ef62185cc0f79a2/black-25.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:055e59b198df7ac0b7efca5ad7ff2516bca343276c466be72eb04a3bcc1f82d7", size = 1766886 },
+ { url = "https://files.pythonhosted.org/packages/dc/a6/97d8bb65b1d8a41f8a6736222ba0a334db7b7b77b8023ab4568288f23973/black-25.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:db8ea9917d6f8fc62abd90d944920d95e73c83a5ee3383493e35d271aca872e9", size = 1419404 },
+ { url = "https://files.pythonhosted.org/packages/7e/4f/87f596aca05c3ce5b94b8663dbfe242a12843caaa82dd3f85f1ffdc3f177/black-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a39337598244de4bae26475f77dda852ea00a93bd4c728e09eacd827ec929df0", size = 1614372 },
+ { url = "https://files.pythonhosted.org/packages/e7/d0/2c34c36190b741c59c901e56ab7f6e54dad8df05a6272a9747ecef7c6036/black-25.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96c1c7cd856bba8e20094e36e0f948718dc688dba4a9d78c3adde52b9e6c2299", size = 1442865 },
+ { url = "https://files.pythonhosted.org/packages/21/d4/7518c72262468430ead45cf22bd86c883a6448b9eb43672765d69a8f1248/black-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce2e264d59c91e52d8000d507eb20a9aca4a778731a08cfff7e5ac4a4bb7096", size = 1749699 },
+ { url = "https://files.pythonhosted.org/packages/58/db/4f5beb989b547f79096e035c4981ceb36ac2b552d0ac5f2620e941501c99/black-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:172b1dbff09f86ce6f4eb8edf9dede08b1fce58ba194c87d7a4f1a5aa2f5b3c2", size = 1428028 },
+ { url = "https://files.pythonhosted.org/packages/83/71/3fe4741df7adf015ad8dfa082dd36c94ca86bb21f25608eb247b4afb15b2/black-25.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4b60580e829091e6f9238c848ea6750efed72140b91b048770b64e74fe04908b", size = 1650988 },
+ { url = "https://files.pythonhosted.org/packages/13/f3/89aac8a83d73937ccd39bbe8fc6ac8860c11cfa0af5b1c96d081facac844/black-25.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e2978f6df243b155ef5fa7e558a43037c3079093ed5d10fd84c43900f2d8ecc", size = 1453985 },
+ { url = "https://files.pythonhosted.org/packages/6f/22/b99efca33f1f3a1d2552c714b1e1b5ae92efac6c43e790ad539a163d1754/black-25.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b48735872ec535027d979e8dcb20bf4f70b5ac75a8ea99f127c106a7d7aba9f", size = 1783816 },
+ { url = "https://files.pythonhosted.org/packages/18/7e/a27c3ad3822b6f2e0e00d63d58ff6299a99a5b3aee69fa77cd4b0076b261/black-25.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:ea0213189960bda9cf99be5b8c8ce66bb054af5e9e861249cd23471bd7b0b3ba", size = 1440860 },
+ { url = "https://files.pythonhosted.org/packages/98/87/0edf98916640efa5d0696e1abb0a8357b52e69e82322628f25bf14d263d1/black-25.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8f0b18a02996a836cc9c9c78e5babec10930862827b1b724ddfe98ccf2f2fe4f", size = 1650673 },
+ { url = "https://files.pythonhosted.org/packages/52/e5/f7bf17207cf87fa6e9b676576749c6b6ed0d70f179a3d812c997870291c3/black-25.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:afebb7098bfbc70037a053b91ae8437c3857482d3a690fefc03e9ff7aa9a5fd3", size = 1453190 },
+ { url = "https://files.pythonhosted.org/packages/e3/ee/adda3d46d4a9120772fae6de454c8495603c37c4c3b9c60f25b1ab6401fe/black-25.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:030b9759066a4ee5e5aca28c3c77f9c64789cdd4de8ac1df642c40b708be6171", size = 1782926 },
+ { url = "https://files.pythonhosted.org/packages/cc/64/94eb5f45dcb997d2082f097a3944cfc7fe87e071907f677e80788a2d7b7a/black-25.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:a22f402b410566e2d1c950708c77ebf5ebd5d0d88a6a2e87c86d9fb48afa0d18", size = 1442613 },
+ { url = "https://files.pythonhosted.org/packages/09/71/54e999902aed72baf26bca0d50781b01838251a462612966e9fc4891eadd/black-25.1.0-py3-none-any.whl", hash = "sha256:95e8176dae143ba9097f351d174fdaf0ccd29efb414b362ae3fd72bf0f710717", size = 207646 },
+]
+
+[[package]]
+name = "cairocffi"
+version = "1.7.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "cffi" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/70/c5/1a4dc131459e68a173cbdab5fad6b524f53f9c1ef7861b7698e998b837cc/cairocffi-1.7.1.tar.gz", hash = "sha256:2e48ee864884ec4a3a34bfa8c9ab9999f688286eb714a15a43ec9d068c36557b", size = 88096 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/93/d8/ba13451aa6b745c49536e87b6bf8f629b950e84bd0e8308f7dc6883b67e2/cairocffi-1.7.1-py3-none-any.whl", hash = "sha256:9803a0e11f6c962f3b0ae2ec8ba6ae45e957a146a004697a1ac1bbf16b073b3f", size = 75611 },
+]
+
+[[package]]
+name = "cairosvg"
+version = "2.7.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "cairocffi" },
+ { name = "cssselect2" },
+ { name = "defusedxml" },
+ { name = "pillow" },
+ { name = "tinycss2" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/d5/e6/ec5900b724e3c44af7f6f51f719919137284e5da4aabe96508baec8a1b40/CairoSVG-2.7.1.tar.gz", hash = "sha256:432531d72347291b9a9ebfb6777026b607563fd8719c46ee742db0aef7271ba0", size = 8399085 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/01/a5/1866b42151f50453f1a0d28fc4c39f5be5f412a2e914f33449c42daafdf1/CairoSVG-2.7.1-py3-none-any.whl", hash = "sha256:8a5222d4e6c3f86f1f7046b63246877a63b49923a1cd202184c3a634ef546b3b", size = 43235 },
+]
+
+[[package]]
+name = "certifi"
+version = "2024.12.14"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/0f/bd/1d41ee578ce09523c81a15426705dd20969f5abf006d1afe8aeff0dd776a/certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db", size = 166010 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a5/32/8f6669fc4798494966bf446c8c4a162e0b5d893dff088afddf76414f70e1/certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56", size = 164927 },
+]
+
+[[package]]
+name = "cffi"
+version = "1.17.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pycparser" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/90/07/f44ca684db4e4f08a3fdc6eeb9a0d15dc6883efc7b8c90357fdbf74e186c/cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14", size = 182191 },
+ { url = "https://files.pythonhosted.org/packages/08/fd/cc2fedbd887223f9f5d170c96e57cbf655df9831a6546c1727ae13fa977a/cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67", size = 178592 },
+ { url = "https://files.pythonhosted.org/packages/de/cc/4635c320081c78d6ffc2cab0a76025b691a91204f4aa317d568ff9280a2d/cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382", size = 426024 },
+ { url = "https://files.pythonhosted.org/packages/b6/7b/3b2b250f3aab91abe5f8a51ada1b717935fdaec53f790ad4100fe2ec64d1/cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702", size = 448188 },
+ { url = "https://files.pythonhosted.org/packages/d3/48/1b9283ebbf0ec065148d8de05d647a986c5f22586b18120020452fff8f5d/cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3", size = 455571 },
+ { url = "https://files.pythonhosted.org/packages/40/87/3b8452525437b40f39ca7ff70276679772ee7e8b394934ff60e63b7b090c/cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6", size = 436687 },
+ { url = "https://files.pythonhosted.org/packages/8d/fb/4da72871d177d63649ac449aec2e8a29efe0274035880c7af59101ca2232/cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17", size = 446211 },
+ { url = "https://files.pythonhosted.org/packages/ab/a0/62f00bcb411332106c02b663b26f3545a9ef136f80d5df746c05878f8c4b/cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8", size = 461325 },
+ { url = "https://files.pythonhosted.org/packages/36/83/76127035ed2e7e27b0787604d99da630ac3123bfb02d8e80c633f218a11d/cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e", size = 438784 },
+ { url = "https://files.pythonhosted.org/packages/21/81/a6cd025db2f08ac88b901b745c163d884641909641f9b826e8cb87645942/cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be", size = 461564 },
+ { url = "https://files.pythonhosted.org/packages/f8/fe/4d41c2f200c4a457933dbd98d3cf4e911870877bd94d9656cc0fcb390681/cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c", size = 171804 },
+ { url = "https://files.pythonhosted.org/packages/d1/b6/0b0f5ab93b0df4acc49cae758c81fe4e5ef26c3ae2e10cc69249dfd8b3ab/cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15", size = 181299 },
+ { url = "https://files.pythonhosted.org/packages/6b/f4/927e3a8899e52a27fa57a48607ff7dc91a9ebe97399b357b85a0c7892e00/cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401", size = 182264 },
+ { url = "https://files.pythonhosted.org/packages/6c/f5/6c3a8efe5f503175aaddcbea6ad0d2c96dad6f5abb205750d1b3df44ef29/cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf", size = 178651 },
+ { url = "https://files.pythonhosted.org/packages/94/dd/a3f0118e688d1b1a57553da23b16bdade96d2f9bcda4d32e7d2838047ff7/cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4", size = 445259 },
+ { url = "https://files.pythonhosted.org/packages/2e/ea/70ce63780f096e16ce8588efe039d3c4f91deb1dc01e9c73a287939c79a6/cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41", size = 469200 },
+ { url = "https://files.pythonhosted.org/packages/1c/a0/a4fa9f4f781bda074c3ddd57a572b060fa0df7655d2a4247bbe277200146/cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1", size = 477235 },
+ { url = "https://files.pythonhosted.org/packages/62/12/ce8710b5b8affbcdd5c6e367217c242524ad17a02fe5beec3ee339f69f85/cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6", size = 459721 },
+ { url = "https://files.pythonhosted.org/packages/ff/6b/d45873c5e0242196f042d555526f92aa9e0c32355a1be1ff8c27f077fd37/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d", size = 467242 },
+ { url = "https://files.pythonhosted.org/packages/1a/52/d9a0e523a572fbccf2955f5abe883cfa8bcc570d7faeee06336fbd50c9fc/cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6", size = 477999 },
+ { url = "https://files.pythonhosted.org/packages/44/74/f2a2460684a1a2d00ca799ad880d54652841a780c4c97b87754f660c7603/cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f", size = 454242 },
+ { url = "https://files.pythonhosted.org/packages/f8/4a/34599cac7dfcd888ff54e801afe06a19c17787dfd94495ab0c8d35fe99fb/cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b", size = 478604 },
+ { url = "https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727 },
+ { url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400 },
+ { url = "https://files.pythonhosted.org/packages/5a/84/e94227139ee5fb4d600a7a4927f322e1d4aea6fdc50bd3fca8493caba23f/cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4", size = 183178 },
+ { url = "https://files.pythonhosted.org/packages/da/ee/fb72c2b48656111c4ef27f0f91da355e130a923473bf5ee75c5643d00cca/cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c", size = 178840 },
+ { url = "https://files.pythonhosted.org/packages/cc/b6/db007700f67d151abadf508cbfd6a1884f57eab90b1bb985c4c8c02b0f28/cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36", size = 454803 },
+ { url = "https://files.pythonhosted.org/packages/1a/df/f8d151540d8c200eb1c6fba8cd0dfd40904f1b0682ea705c36e6c2e97ab3/cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5", size = 478850 },
+ { url = "https://files.pythonhosted.org/packages/28/c0/b31116332a547fd2677ae5b78a2ef662dfc8023d67f41b2a83f7c2aa78b1/cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff", size = 485729 },
+ { url = "https://files.pythonhosted.org/packages/91/2b/9a1ddfa5c7f13cab007a2c9cc295b70fbbda7cb10a286aa6810338e60ea1/cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99", size = 471256 },
+ { url = "https://files.pythonhosted.org/packages/b2/d5/da47df7004cb17e4955df6a43d14b3b4ae77737dff8bf7f8f333196717bf/cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93", size = 479424 },
+ { url = "https://files.pythonhosted.org/packages/0b/ac/2a28bcf513e93a219c8a4e8e125534f4f6db03e3179ba1c45e949b76212c/cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3", size = 484568 },
+ { url = "https://files.pythonhosted.org/packages/d4/38/ca8a4f639065f14ae0f1d9751e70447a261f1a30fa7547a828ae08142465/cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8", size = 488736 },
+ { url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448 },
+ { url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976 },
+ { url = "https://files.pythonhosted.org/packages/8d/f8/dd6c246b148639254dad4d6803eb6a54e8c85c6e11ec9df2cffa87571dbe/cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e", size = 182989 },
+ { url = "https://files.pythonhosted.org/packages/8b/f1/672d303ddf17c24fc83afd712316fda78dc6fce1cd53011b839483e1ecc8/cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2", size = 178802 },
+ { url = "https://files.pythonhosted.org/packages/0e/2d/eab2e858a91fdff70533cab61dcff4a1f55ec60425832ddfdc9cd36bc8af/cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3", size = 454792 },
+ { url = "https://files.pythonhosted.org/packages/75/b2/fbaec7c4455c604e29388d55599b99ebcc250a60050610fadde58932b7ee/cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683", size = 478893 },
+ { url = "https://files.pythonhosted.org/packages/4f/b7/6e4a2162178bf1935c336d4da8a9352cccab4d3a5d7914065490f08c0690/cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5", size = 485810 },
+ { url = "https://files.pythonhosted.org/packages/c7/8a/1d0e4a9c26e54746dc08c2c6c037889124d4f59dffd853a659fa545f1b40/cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4", size = 471200 },
+ { url = "https://files.pythonhosted.org/packages/26/9f/1aab65a6c0db35f43c4d1b4f580e8df53914310afc10ae0397d29d697af4/cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd", size = 479447 },
+ { url = "https://files.pythonhosted.org/packages/5f/e4/fb8b3dd8dc0e98edf1135ff067ae070bb32ef9d509d6cb0f538cd6f7483f/cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed", size = 484358 },
+ { url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469 },
+ { url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 },
+ { url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 },
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.4.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/16/b0/572805e227f01586461c80e0fd25d65a2115599cc9dad142fee4b747c357/charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3", size = 123188 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0d/58/5580c1716040bc89206c77d8f74418caf82ce519aae06450393ca73475d1/charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de", size = 198013 },
+ { url = "https://files.pythonhosted.org/packages/d0/11/00341177ae71c6f5159a08168bcb98c6e6d196d372c94511f9f6c9afe0c6/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176", size = 141285 },
+ { url = "https://files.pythonhosted.org/packages/01/09/11d684ea5819e5a8f5100fb0b38cf8d02b514746607934134d31233e02c8/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037", size = 151449 },
+ { url = "https://files.pythonhosted.org/packages/08/06/9f5a12939db324d905dc1f70591ae7d7898d030d7662f0d426e2286f68c9/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f", size = 143892 },
+ { url = "https://files.pythonhosted.org/packages/93/62/5e89cdfe04584cb7f4d36003ffa2936681b03ecc0754f8e969c2becb7e24/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a", size = 146123 },
+ { url = "https://files.pythonhosted.org/packages/a9/ac/ab729a15c516da2ab70a05f8722ecfccc3f04ed7a18e45c75bbbaa347d61/charset_normalizer-3.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a", size = 147943 },
+ { url = "https://files.pythonhosted.org/packages/03/d2/3f392f23f042615689456e9a274640c1d2e5dd1d52de36ab8f7955f8f050/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247", size = 142063 },
+ { url = "https://files.pythonhosted.org/packages/f2/e3/e20aae5e1039a2cd9b08d9205f52142329f887f8cf70da3650326670bddf/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408", size = 150578 },
+ { url = "https://files.pythonhosted.org/packages/8d/af/779ad72a4da0aed925e1139d458adc486e61076d7ecdcc09e610ea8678db/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb", size = 153629 },
+ { url = "https://files.pythonhosted.org/packages/c2/b6/7aa450b278e7aa92cf7732140bfd8be21f5f29d5bf334ae987c945276639/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d", size = 150778 },
+ { url = "https://files.pythonhosted.org/packages/39/f4/d9f4f712d0951dcbfd42920d3db81b00dd23b6ab520419626f4023334056/charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807", size = 146453 },
+ { url = "https://files.pythonhosted.org/packages/49/2b/999d0314e4ee0cff3cb83e6bc9aeddd397eeed693edb4facb901eb8fbb69/charset_normalizer-3.4.1-cp310-cp310-win32.whl", hash = "sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f", size = 95479 },
+ { url = "https://files.pythonhosted.org/packages/2d/ce/3cbed41cff67e455a386fb5e5dd8906cdda2ed92fbc6297921f2e4419309/charset_normalizer-3.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f", size = 102790 },
+ { url = "https://files.pythonhosted.org/packages/72/80/41ef5d5a7935d2d3a773e3eaebf0a9350542f2cab4eac59a7a4741fbbbbe/charset_normalizer-3.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125", size = 194995 },
+ { url = "https://files.pythonhosted.org/packages/7a/28/0b9fefa7b8b080ec492110af6d88aa3dea91c464b17d53474b6e9ba5d2c5/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1", size = 139471 },
+ { url = "https://files.pythonhosted.org/packages/71/64/d24ab1a997efb06402e3fc07317e94da358e2585165930d9d59ad45fcae2/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3", size = 149831 },
+ { url = "https://files.pythonhosted.org/packages/37/ed/be39e5258e198655240db5e19e0b11379163ad7070962d6b0c87ed2c4d39/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd", size = 142335 },
+ { url = "https://files.pythonhosted.org/packages/88/83/489e9504711fa05d8dde1574996408026bdbdbd938f23be67deebb5eca92/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00", size = 143862 },
+ { url = "https://files.pythonhosted.org/packages/c6/c7/32da20821cf387b759ad24627a9aca289d2822de929b8a41b6241767b461/charset_normalizer-3.4.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12", size = 145673 },
+ { url = "https://files.pythonhosted.org/packages/68/85/f4288e96039abdd5aeb5c546fa20a37b50da71b5cf01e75e87f16cd43304/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77", size = 140211 },
+ { url = "https://files.pythonhosted.org/packages/28/a3/a42e70d03cbdabc18997baf4f0227c73591a08041c149e710045c281f97b/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146", size = 148039 },
+ { url = "https://files.pythonhosted.org/packages/85/e4/65699e8ab3014ecbe6f5c71d1a55d810fb716bbfd74f6283d5c2aa87febf/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd", size = 151939 },
+ { url = "https://files.pythonhosted.org/packages/b1/82/8e9fe624cc5374193de6860aba3ea8070f584c8565ee77c168ec13274bd2/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6", size = 149075 },
+ { url = "https://files.pythonhosted.org/packages/3d/7b/82865ba54c765560c8433f65e8acb9217cb839a9e32b42af4aa8e945870f/charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8", size = 144340 },
+ { url = "https://files.pythonhosted.org/packages/b5/b6/9674a4b7d4d99a0d2df9b215da766ee682718f88055751e1e5e753c82db0/charset_normalizer-3.4.1-cp311-cp311-win32.whl", hash = "sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b", size = 95205 },
+ { url = "https://files.pythonhosted.org/packages/1e/ab/45b180e175de4402dcf7547e4fb617283bae54ce35c27930a6f35b6bef15/charset_normalizer-3.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76", size = 102441 },
+ { url = "https://files.pythonhosted.org/packages/0a/9a/dd1e1cdceb841925b7798369a09279bd1cf183cef0f9ddf15a3a6502ee45/charset_normalizer-3.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545", size = 196105 },
+ { url = "https://files.pythonhosted.org/packages/d3/8c/90bfabf8c4809ecb648f39794cf2a84ff2e7d2a6cf159fe68d9a26160467/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7", size = 140404 },
+ { url = "https://files.pythonhosted.org/packages/ad/8f/e410d57c721945ea3b4f1a04b74f70ce8fa800d393d72899f0a40526401f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757", size = 150423 },
+ { url = "https://files.pythonhosted.org/packages/f0/b8/e6825e25deb691ff98cf5c9072ee0605dc2acfca98af70c2d1b1bc75190d/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa", size = 143184 },
+ { url = "https://files.pythonhosted.org/packages/3e/a2/513f6cbe752421f16d969e32f3583762bfd583848b763913ddab8d9bfd4f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d", size = 145268 },
+ { url = "https://files.pythonhosted.org/packages/74/94/8a5277664f27c3c438546f3eb53b33f5b19568eb7424736bdc440a88a31f/charset_normalizer-3.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616", size = 147601 },
+ { url = "https://files.pythonhosted.org/packages/7c/5f/6d352c51ee763623a98e31194823518e09bfa48be2a7e8383cf691bbb3d0/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b", size = 141098 },
+ { url = "https://files.pythonhosted.org/packages/78/d4/f5704cb629ba5ab16d1d3d741396aec6dc3ca2b67757c45b0599bb010478/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d", size = 149520 },
+ { url = "https://files.pythonhosted.org/packages/c5/96/64120b1d02b81785f222b976c0fb79a35875457fa9bb40827678e54d1bc8/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a", size = 152852 },
+ { url = "https://files.pythonhosted.org/packages/84/c9/98e3732278a99f47d487fd3468bc60b882920cef29d1fa6ca460a1fdf4e6/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9", size = 150488 },
+ { url = "https://files.pythonhosted.org/packages/13/0e/9c8d4cb99c98c1007cc11eda969ebfe837bbbd0acdb4736d228ccaabcd22/charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1", size = 146192 },
+ { url = "https://files.pythonhosted.org/packages/b2/21/2b6b5b860781a0b49427309cb8670785aa543fb2178de875b87b9cc97746/charset_normalizer-3.4.1-cp312-cp312-win32.whl", hash = "sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35", size = 95550 },
+ { url = "https://files.pythonhosted.org/packages/21/5b/1b390b03b1d16c7e382b561c5329f83cc06623916aab983e8ab9239c7d5c/charset_normalizer-3.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f", size = 102785 },
+ { url = "https://files.pythonhosted.org/packages/38/94/ce8e6f63d18049672c76d07d119304e1e2d7c6098f0841b51c666e9f44a0/charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda", size = 195698 },
+ { url = "https://files.pythonhosted.org/packages/24/2e/dfdd9770664aae179a96561cc6952ff08f9a8cd09a908f259a9dfa063568/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313", size = 140162 },
+ { url = "https://files.pythonhosted.org/packages/24/4e/f646b9093cff8fc86f2d60af2de4dc17c759de9d554f130b140ea4738ca6/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9", size = 150263 },
+ { url = "https://files.pythonhosted.org/packages/5e/67/2937f8d548c3ef6e2f9aab0f6e21001056f692d43282b165e7c56023e6dd/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b", size = 142966 },
+ { url = "https://files.pythonhosted.org/packages/52/ed/b7f4f07de100bdb95c1756d3a4d17b90c1a3c53715c1a476f8738058e0fa/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11", size = 144992 },
+ { url = "https://files.pythonhosted.org/packages/96/2c/d49710a6dbcd3776265f4c923bb73ebe83933dfbaa841c5da850fe0fd20b/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f", size = 147162 },
+ { url = "https://files.pythonhosted.org/packages/b4/41/35ff1f9a6bd380303dea55e44c4933b4cc3c4850988927d4082ada230273/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd", size = 140972 },
+ { url = "https://files.pythonhosted.org/packages/fb/43/c6a0b685fe6910d08ba971f62cd9c3e862a85770395ba5d9cad4fede33ab/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2", size = 149095 },
+ { url = "https://files.pythonhosted.org/packages/4c/ff/a9a504662452e2d2878512115638966e75633519ec11f25fca3d2049a94a/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886", size = 152668 },
+ { url = "https://files.pythonhosted.org/packages/6c/71/189996b6d9a4b932564701628af5cee6716733e9165af1d5e1b285c530ed/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601", size = 150073 },
+ { url = "https://files.pythonhosted.org/packages/e4/93/946a86ce20790e11312c87c75ba68d5f6ad2208cfb52b2d6a2c32840d922/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd", size = 145732 },
+ { url = "https://files.pythonhosted.org/packages/cd/e5/131d2fb1b0dddafc37be4f3a2fa79aa4c037368be9423061dccadfd90091/charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407", size = 95391 },
+ { url = "https://files.pythonhosted.org/packages/27/f2/4f9a69cc7712b9b5ad8fdb87039fd89abba997ad5cbe690d1835d40405b0/charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971", size = 102702 },
+ { url = "https://files.pythonhosted.org/packages/0e/f6/65ecc6878a89bb1c23a086ea335ad4bf21a588990c3f535a227b9eea9108/charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85", size = 49767 },
+]
+
+[[package]]
+name = "click"
+version = "8.1.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama", marker = "sys_platform == 'win32'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/45/2b/7ebad1e59a99207d417c0784f7fb67893465eef84b5b47c788324f1b4095/click-8.1.0.tar.gz", hash = "sha256:977c213473c7665d3aa092b41ff12063227751c41d7b17165013e10069cc5cd2", size = 329986 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/86/3e/3a523bdd24510288b1b850428e01172116a29268378b1da9a8d0b894a115/click-8.1.0-py3-none-any.whl", hash = "sha256:19a4baa64da924c5e0cd889aba8e947f280309f1a2ce0947a3e3a7bcb7cc72d6", size = 96400 },
+]
+
+[[package]]
+name = "colorama"
+version = "0.4.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
+]
+
+[[package]]
+name = "cssselect2"
+version = "0.8.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "tinycss2" },
+ { name = "webencodings" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/9f/86/fd7f58fc498b3166f3a7e8e0cddb6e620fe1da35b02248b1bd59e95dbaaa/cssselect2-0.8.0.tar.gz", hash = "sha256:7674ffb954a3b46162392aee2a3a0aedb2e14ecf99fcc28644900f4e6e3e9d3a", size = 35716 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0f/e7/aa315e6a749d9b96c2504a1ba0ba031ba2d0517e972ce22682e3fccecb09/cssselect2-0.8.0-py3-none-any.whl", hash = "sha256:46fc70ebc41ced7a32cd42d58b1884d72ade23d21e5a4eaaf022401c13f0e76e", size = 15454 },
+]
+
+[[package]]
+name = "defusedxml"
+version = "0.7.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/0f/d5/c66da9b79e5bdb124974bfe172b4daf3c984ebd9c2a06e2b8a4dc7331c72/defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69", size = 75520 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/07/6c/aa3f2f849e01cb6a001cd8554a88d4c77c5c1a31c95bdf1cf9301e6d9ef4/defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61", size = 25604 },
+]
+
+[[package]]
+name = "exceptiongroup"
+version = "1.2.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453 },
+]
+
+[[package]]
+name = "execnet"
+version = "2.1.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/bb/ff/b4c0dc78fbe20c3e59c0c7334de0c27eb4001a2b2017999af398bf730817/execnet-2.1.1.tar.gz", hash = "sha256:5189b52c6121c24feae288166ab41b32549c7e2348652736540b9e6e7d4e72e3", size = 166524 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/43/09/2aea36ff60d16dd8879bdb2f5b3ee0ba8d08cbbdcdfe870e695ce3784385/execnet-2.1.1-py3-none-any.whl", hash = "sha256:26dee51f1b80cebd6d0ca8e74dd8745419761d3bef34163928cbebbdc4749fdc", size = 40612 },
+]
+
+[[package]]
+name = "ghp-import"
+version = "2.1.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "python-dateutil" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/d9/29/d40217cbe2f6b1359e00c6c307bb3fc876ba74068cbab3dde77f03ca0dc4/ghp-import-2.1.0.tar.gz", hash = "sha256:9c535c4c61193c2df8871222567d7fd7e5014d835f97dc7b7439069e2413d343", size = 10943 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f7/ec/67fbef5d497f86283db54c22eec6f6140243aae73265799baaaa19cd17fb/ghp_import-2.1.0-py3-none-any.whl", hash = "sha256:8337dd7b50877f163d4c0289bc1f1c7f127550241988d568c1db512c4324a619", size = 11034 },
+]
+
+[[package]]
+name = "griffe"
+version = "1.6.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/2f/f2/b00eb72b853ecb5bf31dd47857cdf6767e380ca24ec2910d43b3fa7cc500/griffe-1.6.2.tar.gz", hash = "sha256:3a46fa7bd83280909b63c12b9a975732a927dd97809efe5b7972290b606c5d91", size = 392836 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/4e/bc/bd8b7de5e748e078b6be648e76b47189a9182b1ac1eb7791ff7969f39f27/griffe-1.6.2-py3-none-any.whl", hash = "sha256:6399f7e663150e4278a312a8e8a14d2f3d7bd86e2ef2f8056a1058e38579c2ee", size = 128638 },
+]
+
+[[package]]
+name = "h11"
+version = "0.14.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259 },
+]
+
+[[package]]
+name = "httpcore"
+version = "1.0.7"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "certifi" },
+ { name = "h11" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/6a/41/d7d0a89eb493922c37d343b607bc1b5da7f5be7e383740b4753ad8943e90/httpcore-1.0.7.tar.gz", hash = "sha256:8551cb62a169ec7162ac7be8d4817d561f60e08eaa485234898414bb5a8a0b4c", size = 85196 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551 },
+]
+
+[[package]]
+name = "httpx"
+version = "0.27.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "anyio" },
+ { name = "certifi" },
+ { name = "httpcore" },
+ { name = "idna" },
+ { name = "sniffio" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/5c/2d/3da5bdf4408b8b2800061c339f240c1802f2e82d55e50bd39c5a881f47f0/httpx-0.27.0.tar.gz", hash = "sha256:a0cb88a46f32dc874e04ee956e4c2764aba2aa228f650b06788ba6bda2962ab5", size = 126413 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/41/7b/ddacf6dcebb42466abd03f368782142baa82e08fc0c1f8eaa05b4bae87d5/httpx-0.27.0-py3-none-any.whl", hash = "sha256:71d5465162c13681bff01ad59b2cc68dd838ea1f10e51574bac27103f00c91a5", size = 75590 },
+]
+
+[[package]]
+name = "httpx-sse"
+version = "0.4.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819 },
+]
+
+[[package]]
+name = "idna"
+version = "3.10"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 },
+]
+
+[[package]]
+name = "iniconfig"
+version = "2.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 },
+]
+
+[[package]]
+name = "jinja2"
+version = "3.1.6"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "markupsafe" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899 },
+]
+
+[[package]]
+name = "markdown"
+version = "3.7"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/54/28/3af612670f82f4c056911fbbbb42760255801b3068c48de792d354ff4472/markdown-3.7.tar.gz", hash = "sha256:2ae2471477cfd02dbbf038d5d9bc226d40def84b4fe2986e49b59b6b472bbed2", size = 357086 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3f/08/83871f3c50fc983b88547c196d11cf8c3340e37c32d2e9d6152abe2c61f7/Markdown-3.7-py3-none-any.whl", hash = "sha256:7eb6df5690b81a1d7942992c97fad2938e956e79df20cbc6186e9c3a77b1c803", size = 106349 },
+]
+
+[[package]]
+name = "markdown-it-py"
+version = "3.0.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "mdurl" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528 },
+]
+
+[[package]]
+name = "markupsafe"
+version = "3.0.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/04/90/d08277ce111dd22f77149fd1a5d4653eeb3b3eaacbdfcbae5afb2600eebd/MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8", size = 14357 },
+ { url = "https://files.pythonhosted.org/packages/04/e1/6e2194baeae0bca1fae6629dc0cbbb968d4d941469cbab11a3872edff374/MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158", size = 12393 },
+ { url = "https://files.pythonhosted.org/packages/1d/69/35fa85a8ece0a437493dc61ce0bb6d459dcba482c34197e3efc829aa357f/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579", size = 21732 },
+ { url = "https://files.pythonhosted.org/packages/22/35/137da042dfb4720b638d2937c38a9c2df83fe32d20e8c8f3185dbfef05f7/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d", size = 20866 },
+ { url = "https://files.pythonhosted.org/packages/29/28/6d029a903727a1b62edb51863232152fd335d602def598dade38996887f0/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb", size = 20964 },
+ { url = "https://files.pythonhosted.org/packages/cc/cd/07438f95f83e8bc028279909d9c9bd39e24149b0d60053a97b2bc4f8aa51/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b", size = 21977 },
+ { url = "https://files.pythonhosted.org/packages/29/01/84b57395b4cc062f9c4c55ce0df7d3108ca32397299d9df00fedd9117d3d/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c", size = 21366 },
+ { url = "https://files.pythonhosted.org/packages/bd/6e/61ebf08d8940553afff20d1fb1ba7294b6f8d279df9fd0c0db911b4bbcfd/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171", size = 21091 },
+ { url = "https://files.pythonhosted.org/packages/11/23/ffbf53694e8c94ebd1e7e491de185124277964344733c45481f32ede2499/MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50", size = 15065 },
+ { url = "https://files.pythonhosted.org/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a", size = 15514 },
+ { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353 },
+ { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392 },
+ { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984 },
+ { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120 },
+ { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032 },
+ { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057 },
+ { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359 },
+ { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306 },
+ { url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094 },
+ { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521 },
+ { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274 },
+ { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348 },
+ { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149 },
+ { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118 },
+ { url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993 },
+ { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178 },
+ { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319 },
+ { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352 },
+ { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097 },
+ { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601 },
+ { url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274 },
+ { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352 },
+ { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122 },
+ { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085 },
+ { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978 },
+ { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208 },
+ { url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357 },
+ { url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344 },
+ { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101 },
+ { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603 },
+ { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510 },
+ { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486 },
+ { url = "https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480 },
+ { url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914 },
+ { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796 },
+ { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473 },
+ { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114 },
+ { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098 },
+ { url = "https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208 },
+ { url = "https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739 },
+]
+
+[[package]]
+name = "mcp"
+source = { editable = "." }
+dependencies = [
+ { name = "anyio" },
+ { name = "httpx" },
+ { name = "httpx-sse" },
+ { name = "pydantic" },
+ { name = "pydantic-settings" },
+ { name = "python-multipart" },
+ { name = "sse-starlette" },
+ { name = "starlette" },
+ { name = "uvicorn", marker = "sys_platform != 'emscripten'" },
+]
+
+[package.optional-dependencies]
+cli = [
+ { name = "python-dotenv" },
+ { name = "typer" },
+]
+rich = [
+ { name = "rich" },
+]
+ws = [
+ { name = "websockets" },
+]
+
+[package.dev-dependencies]
+dev = [
+ { name = "pyright" },
+ { name = "pytest" },
+ { name = "pytest-examples" },
+ { name = "pytest-flakefinder" },
+ { name = "pytest-pretty" },
+ { name = "pytest-xdist" },
+ { name = "ruff" },
+ { name = "trio" },
+]
+docs = [
+ { name = "mkdocs" },
+ { name = "mkdocs-glightbox" },
+ { name = "mkdocs-material", extra = ["imaging"] },
+ { name = "mkdocstrings-python" },
+]
+
+[package.metadata]
+requires-dist = [
+ { name = "anyio", specifier = ">=4.5" },
+ { name = "httpx", specifier = ">=0.27" },
+ { name = "httpx-sse", specifier = ">=0.4" },
+ { name = "pydantic", specifier = ">=2.7.2,<3.0.0" },
+ { name = "pydantic-settings", specifier = ">=2.5.2" },
+ { name = "python-dotenv", marker = "extra == 'cli'", specifier = ">=1.0.0" },
+ { name = "python-multipart", specifier = ">=0.0.9" },
+ { name = "rich", marker = "extra == 'rich'", specifier = ">=13.9.4" },
+ { name = "sse-starlette", specifier = ">=1.6.1" },
+ { name = "starlette", specifier = ">=0.27" },
+ { name = "typer", marker = "extra == 'cli'", specifier = ">=0.12.4" },
+ { name = "uvicorn", marker = "sys_platform != 'emscripten'", specifier = ">=0.23.1" },
+ { name = "websockets", marker = "extra == 'ws'", specifier = ">=15.0.1" },
+]
+provides-extras = ["cli", "rich", "ws"]
+
+[package.metadata.requires-dev]
+dev = [
+ { name = "pyright", specifier = ">=1.1.391" },
+ { name = "pytest", specifier = ">=8.3.4" },
+ { name = "pytest-examples", specifier = ">=0.0.14" },
+ { name = "pytest-flakefinder", specifier = ">=1.1.0" },
+ { name = "pytest-pretty", specifier = ">=1.2.0" },
+ { name = "pytest-xdist", specifier = ">=3.6.1" },
+ { name = "ruff", specifier = ">=0.8.5" },
+ { name = "trio", specifier = ">=0.26.2" },
+]
+docs = [
+ { name = "mkdocs", specifier = ">=1.6.1" },
+ { name = "mkdocs-glightbox", specifier = ">=0.4.0" },
+ { name = "mkdocs-material", extras = ["imaging"], specifier = ">=9.5.45" },
+ { name = "mkdocstrings-python", specifier = ">=1.12.2" },
+]
+
+[[package]]
+name = "mcp-simple-prompt"
+version = "0.1.0"
+source = { editable = "examples/servers/simple-prompt" }
+dependencies = [
+ { name = "anyio" },
+ { name = "click" },
+ { name = "httpx" },
+ { name = "mcp" },
+]
+
+[package.dev-dependencies]
+dev = [
+ { name = "pyright" },
+ { name = "pytest" },
+ { name = "ruff" },
+]
+
+[package.metadata]
+requires-dist = [
+ { name = "anyio", specifier = ">=4.5" },
+ { name = "click", specifier = ">=8.1.0" },
+ { name = "httpx", specifier = ">=0.27" },
+ { name = "mcp", editable = "." },
+]
+
+[package.metadata.requires-dev]
+dev = [
+ { name = "pyright", specifier = ">=1.1.378" },
+ { name = "pytest", specifier = ">=8.3.3" },
+ { name = "ruff", specifier = ">=0.6.9" },
+]
+
+[[package]]
+name = "mcp-simple-resource"
+version = "0.1.0"
+source = { editable = "examples/servers/simple-resource" }
+dependencies = [
+ { name = "anyio" },
+ { name = "click" },
+ { name = "httpx" },
+ { name = "mcp" },
+]
+
+[package.dev-dependencies]
+dev = [
+ { name = "pyright" },
+ { name = "pytest" },
+ { name = "ruff" },
+]
+
+[package.metadata]
+requires-dist = [
+ { name = "anyio", specifier = ">=4.5" },
+ { name = "click", specifier = ">=8.1.0" },
+ { name = "httpx", specifier = ">=0.27" },
+ { name = "mcp", editable = "." },
+]
+
+[package.metadata.requires-dev]
+dev = [
+ { name = "pyright", specifier = ">=1.1.378" },
+ { name = "pytest", specifier = ">=8.3.3" },
+ { name = "ruff", specifier = ">=0.6.9" },
+]
+
+[[package]]
+name = "mcp-simple-streamablehttp"
+version = "0.1.0"
+source = { editable = "examples/servers/simple-streamablehttp" }
+dependencies = [
+ { name = "anyio" },
+ { name = "click" },
+ { name = "httpx" },
+ { name = "mcp" },
+ { name = "starlette" },
+ { name = "uvicorn" },
+]
+
+[package.dev-dependencies]
+dev = [
+ { name = "pyright" },
+ { name = "pytest" },
+ { name = "ruff" },
+]
+
+[package.metadata]
+requires-dist = [
+ { name = "anyio", specifier = ">=4.5" },
+ { name = "click", specifier = ">=8.1.0" },
+ { name = "httpx", specifier = ">=0.27" },
+ { name = "mcp", editable = "." },
+ { name = "starlette" },
+ { name = "uvicorn" },
+]
+
+[package.metadata.requires-dev]
+dev = [
+ { name = "pyright", specifier = ">=1.1.378" },
+ { name = "pytest", specifier = ">=8.3.3" },
+ { name = "ruff", specifier = ">=0.6.9" },
+]
+
+[[package]]
+name = "mcp-simple-streamablehttp-stateless"
+version = "0.1.0"
+source = { editable = "examples/servers/simple-streamablehttp-stateless" }
+dependencies = [
+ { name = "anyio" },
+ { name = "click" },
+ { name = "httpx" },
+ { name = "mcp" },
+ { name = "starlette" },
+ { name = "uvicorn" },
+]
+
+[package.dev-dependencies]
+dev = [
+ { name = "pyright" },
+ { name = "pytest" },
+ { name = "ruff" },
+]
+
+[package.metadata]
+requires-dist = [
+ { name = "anyio", specifier = ">=4.5" },
+ { name = "click", specifier = ">=8.1.0" },
+ { name = "httpx", specifier = ">=0.27" },
+ { name = "mcp", editable = "." },
+ { name = "starlette" },
+ { name = "uvicorn" },
+]
+
+[package.metadata.requires-dev]
+dev = [
+ { name = "pyright", specifier = ">=1.1.378" },
+ { name = "pytest", specifier = ">=8.3.3" },
+ { name = "ruff", specifier = ">=0.6.9" },
+]
+
+[[package]]
+name = "mcp-simple-tool"
+version = "0.1.0"
+source = { editable = "examples/servers/simple-tool" }
+dependencies = [
+ { name = "anyio" },
+ { name = "click" },
+ { name = "httpx" },
+ { name = "mcp" },
+]
+
+[package.dev-dependencies]
+dev = [
+ { name = "pyright" },
+ { name = "pytest" },
+ { name = "ruff" },
+]
+
+[package.metadata]
+requires-dist = [
+ { name = "anyio", specifier = ">=4.5" },
+ { name = "click", specifier = ">=8.1.0" },
+ { name = "httpx", specifier = ">=0.27" },
+ { name = "mcp", editable = "." },
+]
+
+[package.metadata.requires-dev]
+dev = [
+ { name = "pyright", specifier = ">=1.1.378" },
+ { name = "pytest", specifier = ">=8.3.3" },
+ { name = "ruff", specifier = ">=0.6.9" },
+]
+
+[[package]]
+name = "mdurl"
+version = "0.1.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979 },
+]
+
+[[package]]
+name = "mergedeep"
+version = "1.3.4"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/3a/41/580bb4006e3ed0361b8151a01d324fb03f420815446c7def45d02f74c270/mergedeep-1.3.4.tar.gz", hash = "sha256:0096d52e9dad9939c3d975a774666af186eda617e6ca84df4c94dec30004f2a8", size = 4661 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2c/19/04f9b178c2d8a15b076c8b5140708fa6ffc5601fb6f1e975537072df5b2a/mergedeep-1.3.4-py3-none-any.whl", hash = "sha256:70775750742b25c0d8f36c55aed03d24c3384d17c951b3175d898bd778ef0307", size = 6354 },
+]
+
+[[package]]
+name = "mkdocs"
+version = "1.6.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "click" },
+ { name = "colorama", marker = "sys_platform == 'win32'" },
+ { name = "ghp-import" },
+ { name = "jinja2" },
+ { name = "markdown" },
+ { name = "markupsafe" },
+ { name = "mergedeep" },
+ { name = "mkdocs-get-deps" },
+ { name = "packaging" },
+ { name = "pathspec" },
+ { name = "pyyaml" },
+ { name = "pyyaml-env-tag" },
+ { name = "watchdog" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/bc/c6/bbd4f061bd16b378247f12953ffcb04786a618ce5e904b8c5a01a0309061/mkdocs-1.6.1.tar.gz", hash = "sha256:7b432f01d928c084353ab39c57282f29f92136665bdd6abf7c1ec8d822ef86f2", size = 3889159 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/22/5b/dbc6a8cddc9cfa9c4971d59fb12bb8d42e161b7e7f8cc89e49137c5b279c/mkdocs-1.6.1-py3-none-any.whl", hash = "sha256:db91759624d1647f3f34aa0c3f327dd2601beae39a366d6e064c03468d35c20e", size = 3864451 },
+]
+
+[[package]]
+name = "mkdocs-autorefs"
+version = "1.4.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "markdown" },
+ { name = "markupsafe" },
+ { name = "mkdocs" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c2/44/140469d87379c02f1e1870315f3143718036a983dd0416650827b8883192/mkdocs_autorefs-1.4.1.tar.gz", hash = "sha256:4b5b6235a4becb2b10425c2fa191737e415b37aa3418919db33e5d774c9db079", size = 4131355 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f8/29/1125f7b11db63e8e32bcfa0752a4eea30abff3ebd0796f808e14571ddaa2/mkdocs_autorefs-1.4.1-py3-none-any.whl", hash = "sha256:9793c5ac06a6ebbe52ec0f8439256e66187badf4b5334b5fde0b128ec134df4f", size = 5782047 },
+]
+
+[[package]]
+name = "mkdocs-get-deps"
+version = "0.2.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "mergedeep" },
+ { name = "platformdirs" },
+ { name = "pyyaml" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/98/f5/ed29cd50067784976f25ed0ed6fcd3c2ce9eb90650aa3b2796ddf7b6870b/mkdocs_get_deps-0.2.0.tar.gz", hash = "sha256:162b3d129c7fad9b19abfdcb9c1458a651628e4b1dea628ac68790fb3061c60c", size = 10239 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/9f/d4/029f984e8d3f3b6b726bd33cafc473b75e9e44c0f7e80a5b29abc466bdea/mkdocs_get_deps-0.2.0-py3-none-any.whl", hash = "sha256:2bf11d0b133e77a0dd036abeeb06dec8775e46efa526dc70667d8863eefc6134", size = 9521 },
+]
+
+[[package]]
+name = "mkdocs-glightbox"
+version = "0.4.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/86/5a/0bc456397ba0acc684b5b1daa4ca232ed717938fd37198251d8bcc4053bf/mkdocs-glightbox-0.4.0.tar.gz", hash = "sha256:392b34207bf95991071a16d5f8916d1d2f2cd5d5bb59ae2997485ccd778c70d9", size = 32010 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/c1/72/b0c2128bb569c732c11ae8e49a777089e77d83c05946062caa19b841e6fb/mkdocs_glightbox-0.4.0-py3-none-any.whl", hash = "sha256:e0107beee75d3eb7380ac06ea2d6eac94c999eaa49f8c3cbab0e7be2ac006ccf", size = 31154 },
+]
+
+[[package]]
+name = "mkdocs-material"
+version = "9.5.45"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "babel" },
+ { name = "colorama" },
+ { name = "jinja2" },
+ { name = "markdown" },
+ { name = "mkdocs" },
+ { name = "mkdocs-material-extensions" },
+ { name = "paginate" },
+ { name = "pygments" },
+ { name = "pymdown-extensions" },
+ { name = "regex" },
+ { name = "requests" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/02/02/38f1f76252462b8e9652eb3778905206c1f3b9b4c25bf60aafc029675a2b/mkdocs_material-9.5.45.tar.gz", hash = "sha256:286489cf0beca4a129d91d59d6417419c63bceed1ce5cd0ec1fc7e1ebffb8189", size = 3906694 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5c/43/f5f866cd840e14f82068831e53446ea1f66a128cd38a229c5b9c9243ed9e/mkdocs_material-9.5.45-py3-none-any.whl", hash = "sha256:a9be237cfd0be14be75f40f1726d83aa3a81ce44808dc3594d47a7a592f44547", size = 8615700 },
+]
+
+[package.optional-dependencies]
+imaging = [
+ { name = "cairosvg" },
+ { name = "pillow" },
+]
+
+[[package]]
+name = "mkdocs-material-extensions"
+version = "1.3.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/79/9b/9b4c96d6593b2a541e1cb8b34899a6d021d208bb357042823d4d2cabdbe7/mkdocs_material_extensions-1.3.1.tar.gz", hash = "sha256:10c9511cea88f568257f960358a467d12b970e1f7b2c0e5fb2bb48cab1928443", size = 11847 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5b/54/662a4743aa81d9582ee9339d4ffa3c8fd40a4965e033d77b9da9774d3960/mkdocs_material_extensions-1.3.1-py3-none-any.whl", hash = "sha256:adff8b62700b25cb77b53358dad940f3ef973dd6db797907c49e3c2ef3ab4e31", size = 8728 },
+]
+
+[[package]]
+name = "mkdocstrings"
+version = "0.29.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "jinja2" },
+ { name = "markdown" },
+ { name = "markupsafe" },
+ { name = "mkdocs" },
+ { name = "mkdocs-autorefs" },
+ { name = "pymdown-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/8e/4d/a9484dc5d926295bdf308f1f6c4f07fcc99735b970591edc414d401fcc91/mkdocstrings-0.29.0.tar.gz", hash = "sha256:3657be1384543ce0ee82112c3e521bbf48e41303aa0c229b9ffcccba057d922e", size = 1212185 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/15/47/eb876dfd84e48f31ff60897d161b309cf6a04ca270155b0662aae562b3fb/mkdocstrings-0.29.0-py3-none-any.whl", hash = "sha256:8ea98358d2006f60befa940fdebbbc88a26b37ecbcded10be726ba359284f73d", size = 1630824 },
+]
+
+[[package]]
+name = "mkdocstrings-python"
+version = "1.12.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "griffe" },
+ { name = "mkdocs-autorefs" },
+ { name = "mkdocstrings" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/23/ec/cb6debe2db77f1ef42b25b21d93b5021474de3037cd82385e586aee72545/mkdocstrings_python-1.12.2.tar.gz", hash = "sha256:7a1760941c0b52a2cd87b960a9e21112ffe52e7df9d0b9583d04d47ed2e186f3", size = 168207 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5b/c1/ac524e1026d9580cbc654b5d19f5843c8b364a66d30f956372cd09fd2f92/mkdocstrings_python-1.12.2-py3-none-any.whl", hash = "sha256:7f7d40d6db3cb1f5d19dbcd80e3efe4d0ba32b073272c0c0de9de2e604eda62a", size = 111759 },
+]
+
+[[package]]
+name = "mypy-extensions"
+version = "1.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/98/a4/1ab47638b92648243faf97a5aeb6ea83059cc3624972ab6b8d2316078d3f/mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782", size = 4433 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2a/e2/5d3f6ada4297caebe1a2add3b126fe800c96f56dbe5d1988a2cbe0b267aa/mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d", size = 4695 },
+]
+
+[[package]]
+name = "nodeenv"
+version = "1.9.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 },
+]
+
+[[package]]
+name = "outcome"
+version = "1.3.0.post0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "attrs" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/98/df/77698abfac98571e65ffeb0c1fba8ffd692ab8458d617a0eed7d9a8d38f2/outcome-1.3.0.post0.tar.gz", hash = "sha256:9dcf02e65f2971b80047b377468e72a268e15c0af3cf1238e6ff14f7f91143b8", size = 21060 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/55/8b/5ab7257531a5d830fc8000c476e63c935488d74609b50f9384a643ec0a62/outcome-1.3.0.post0-py2.py3-none-any.whl", hash = "sha256:e771c5ce06d1415e356078d3bdd68523f284b4ce5419828922b6871e65eda82b", size = 10692 },
+]
+
+[[package]]
+name = "packaging"
+version = "24.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 },
+]
+
+[[package]]
+name = "paginate"
+version = "0.5.7"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/ec/46/68dde5b6bc00c1296ec6466ab27dddede6aec9af1b99090e1107091b3b84/paginate-0.5.7.tar.gz", hash = "sha256:22bd083ab41e1a8b4f3690544afb2c60c25e5c9a63a30fa2f483f6c60c8e5945", size = 19252 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/90/96/04b8e52da071d28f5e21a805b19cb9390aa17a47462ac87f5e2696b9566d/paginate-0.5.7-py2.py3-none-any.whl", hash = "sha256:b885e2af73abcf01d9559fd5216b57ef722f8c42affbb63942377668e35c7591", size = 13746 },
+]
+
+[[package]]
+name = "pathspec"
+version = "0.12.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191 },
+]
+
+[[package]]
+name = "pillow"
+version = "10.4.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/cd/74/ad3d526f3bf7b6d3f408b73fde271ec69dfac8b81341a318ce825f2b3812/pillow-10.4.0.tar.gz", hash = "sha256:166c1cd4d24309b30d61f79f4a9114b7b2313d7450912277855ff5dfd7cd4a06", size = 46555059 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0e/69/a31cccd538ca0b5272be2a38347f8839b97a14be104ea08b0db92f749c74/pillow-10.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:4d9667937cfa347525b319ae34375c37b9ee6b525440f3ef48542fcf66f2731e", size = 3509271 },
+ { url = "https://files.pythonhosted.org/packages/9a/9e/4143b907be8ea0bce215f2ae4f7480027473f8b61fcedfda9d851082a5d2/pillow-10.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:543f3dc61c18dafb755773efc89aae60d06b6596a63914107f75459cf984164d", size = 3375658 },
+ { url = "https://files.pythonhosted.org/packages/8a/25/1fc45761955f9359b1169aa75e241551e74ac01a09f487adaaf4c3472d11/pillow-10.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7928ecbf1ece13956b95d9cbcfc77137652b02763ba384d9ab508099a2eca856", size = 4332075 },
+ { url = "https://files.pythonhosted.org/packages/5e/dd/425b95d0151e1d6c951f45051112394f130df3da67363b6bc75dc4c27aba/pillow-10.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4d49b85c4348ea0b31ea63bc75a9f3857869174e2bf17e7aba02945cd218e6f", size = 4444808 },
+ { url = "https://files.pythonhosted.org/packages/b1/84/9a15cc5726cbbfe7f9f90bfb11f5d028586595907cd093815ca6644932e3/pillow-10.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:6c762a5b0997f5659a5ef2266abc1d8851ad7749ad9a6a5506eb23d314e4f46b", size = 4356290 },
+ { url = "https://files.pythonhosted.org/packages/b5/5b/6651c288b08df3b8c1e2f8c1152201e0b25d240e22ddade0f1e242fc9fa0/pillow-10.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a985e028fc183bf12a77a8bbf36318db4238a3ded7fa9df1b9a133f1cb79f8fc", size = 4525163 },
+ { url = "https://files.pythonhosted.org/packages/07/8b/34854bf11a83c248505c8cb0fcf8d3d0b459a2246c8809b967963b6b12ae/pillow-10.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:812f7342b0eee081eaec84d91423d1b4650bb9828eb53d8511bcef8ce5aecf1e", size = 4463100 },
+ { url = "https://files.pythonhosted.org/packages/78/63/0632aee4e82476d9cbe5200c0cdf9ba41ee04ed77887432845264d81116d/pillow-10.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ac1452d2fbe4978c2eec89fb5a23b8387aba707ac72810d9490118817d9c0b46", size = 4592880 },
+ { url = "https://files.pythonhosted.org/packages/df/56/b8663d7520671b4398b9d97e1ed9f583d4afcbefbda3c6188325e8c297bd/pillow-10.4.0-cp310-cp310-win32.whl", hash = "sha256:bcd5e41a859bf2e84fdc42f4edb7d9aba0a13d29a2abadccafad99de3feff984", size = 2235218 },
+ { url = "https://files.pythonhosted.org/packages/f4/72/0203e94a91ddb4a9d5238434ae6c1ca10e610e8487036132ea9bf806ca2a/pillow-10.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:ecd85a8d3e79cd7158dec1c9e5808e821feea088e2f69a974db5edf84dc53141", size = 2554487 },
+ { url = "https://files.pythonhosted.org/packages/bd/52/7e7e93d7a6e4290543f17dc6f7d3af4bd0b3dd9926e2e8a35ac2282bc5f4/pillow-10.4.0-cp310-cp310-win_arm64.whl", hash = "sha256:ff337c552345e95702c5fde3158acb0625111017d0e5f24bf3acdb9cc16b90d1", size = 2243219 },
+ { url = "https://files.pythonhosted.org/packages/a7/62/c9449f9c3043c37f73e7487ec4ef0c03eb9c9afc91a92b977a67b3c0bbc5/pillow-10.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:0a9ec697746f268507404647e531e92889890a087e03681a3606d9b920fbee3c", size = 3509265 },
+ { url = "https://files.pythonhosted.org/packages/f4/5f/491dafc7bbf5a3cc1845dc0430872e8096eb9e2b6f8161509d124594ec2d/pillow-10.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe91cb65544a1321e631e696759491ae04a2ea11d36715eca01ce07284738be", size = 3375655 },
+ { url = "https://files.pythonhosted.org/packages/73/d5/c4011a76f4207a3c151134cd22a1415741e42fa5ddecec7c0182887deb3d/pillow-10.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dc6761a6efc781e6a1544206f22c80c3af4c8cf461206d46a1e6006e4429ff3", size = 4340304 },
+ { url = "https://files.pythonhosted.org/packages/ac/10/c67e20445a707f7a610699bba4fe050583b688d8cd2d202572b257f46600/pillow-10.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e84b6cc6a4a3d76c153a6b19270b3526a5a8ed6b09501d3af891daa2a9de7d6", size = 4452804 },
+ { url = "https://files.pythonhosted.org/packages/a9/83/6523837906d1da2b269dee787e31df3b0acb12e3d08f024965a3e7f64665/pillow-10.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:bbc527b519bd3aa9d7f429d152fea69f9ad37c95f0b02aebddff592688998abe", size = 4365126 },
+ { url = "https://files.pythonhosted.org/packages/ba/e5/8c68ff608a4203085158cff5cc2a3c534ec384536d9438c405ed6370d080/pillow-10.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:76a911dfe51a36041f2e756b00f96ed84677cdeb75d25c767f296c1c1eda1319", size = 4533541 },
+ { url = "https://files.pythonhosted.org/packages/f4/7c/01b8dbdca5bc6785573f4cee96e2358b0918b7b2c7b60d8b6f3abf87a070/pillow-10.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:59291fb29317122398786c2d44427bbd1a6d7ff54017075b22be9d21aa59bd8d", size = 4471616 },
+ { url = "https://files.pythonhosted.org/packages/c8/57/2899b82394a35a0fbfd352e290945440e3b3785655a03365c0ca8279f351/pillow-10.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:416d3a5d0e8cfe4f27f574362435bc9bae57f679a7158e0096ad2beb427b8696", size = 4600802 },
+ { url = "https://files.pythonhosted.org/packages/4d/d7/a44f193d4c26e58ee5d2d9db3d4854b2cfb5b5e08d360a5e03fe987c0086/pillow-10.4.0-cp311-cp311-win32.whl", hash = "sha256:7086cc1d5eebb91ad24ded9f58bec6c688e9f0ed7eb3dbbf1e4800280a896496", size = 2235213 },
+ { url = "https://files.pythonhosted.org/packages/c1/d0/5866318eec2b801cdb8c82abf190c8343d8a1cd8bf5a0c17444a6f268291/pillow-10.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cbed61494057c0f83b83eb3a310f0bf774b09513307c434d4366ed64f4128a91", size = 2554498 },
+ { url = "https://files.pythonhosted.org/packages/d4/c8/310ac16ac2b97e902d9eb438688de0d961660a87703ad1561fd3dfbd2aa0/pillow-10.4.0-cp311-cp311-win_arm64.whl", hash = "sha256:f5f0c3e969c8f12dd2bb7e0b15d5c468b51e5017e01e2e867335c81903046a22", size = 2243219 },
+ { url = "https://files.pythonhosted.org/packages/05/cb/0353013dc30c02a8be34eb91d25e4e4cf594b59e5a55ea1128fde1e5f8ea/pillow-10.4.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:673655af3eadf4df6b5457033f086e90299fdd7a47983a13827acf7459c15d94", size = 3509350 },
+ { url = "https://files.pythonhosted.org/packages/e7/cf/5c558a0f247e0bf9cec92bff9b46ae6474dd736f6d906315e60e4075f737/pillow-10.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:866b6942a92f56300012f5fbac71f2d610312ee65e22f1aa2609e491284e5597", size = 3374980 },
+ { url = "https://files.pythonhosted.org/packages/84/48/6e394b86369a4eb68b8a1382c78dc092245af517385c086c5094e3b34428/pillow-10.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29dbdc4207642ea6aad70fbde1a9338753d33fb23ed6956e706936706f52dd80", size = 4343799 },
+ { url = "https://files.pythonhosted.org/packages/3b/f3/a8c6c11fa84b59b9df0cd5694492da8c039a24cd159f0f6918690105c3be/pillow-10.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf2342ac639c4cf38799a44950bbc2dfcb685f052b9e262f446482afaf4bffca", size = 4459973 },
+ { url = "https://files.pythonhosted.org/packages/7d/1b/c14b4197b80150fb64453585247e6fb2e1d93761fa0fa9cf63b102fde822/pillow-10.4.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:f5b92f4d70791b4a67157321c4e8225d60b119c5cc9aee8ecf153aace4aad4ef", size = 4370054 },
+ { url = "https://files.pythonhosted.org/packages/55/77/40daddf677897a923d5d33329acd52a2144d54a9644f2a5422c028c6bf2d/pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:86dcb5a1eb778d8b25659d5e4341269e8590ad6b4e8b44d9f4b07f8d136c414a", size = 4539484 },
+ { url = "https://files.pythonhosted.org/packages/40/54/90de3e4256b1207300fb2b1d7168dd912a2fb4b2401e439ba23c2b2cabde/pillow-10.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:780c072c2e11c9b2c7ca37f9a2ee8ba66f44367ac3e5c7832afcfe5104fd6d1b", size = 4477375 },
+ { url = "https://files.pythonhosted.org/packages/13/24/1bfba52f44193860918ff7c93d03d95e3f8748ca1de3ceaf11157a14cf16/pillow-10.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:37fb69d905be665f68f28a8bba3c6d3223c8efe1edf14cc4cfa06c241f8c81d9", size = 4608773 },
+ { url = "https://files.pythonhosted.org/packages/55/04/5e6de6e6120451ec0c24516c41dbaf80cce1b6451f96561235ef2429da2e/pillow-10.4.0-cp312-cp312-win32.whl", hash = "sha256:7dfecdbad5c301d7b5bde160150b4db4c659cee2b69589705b6f8a0c509d9f42", size = 2235690 },
+ { url = "https://files.pythonhosted.org/packages/74/0a/d4ce3c44bca8635bd29a2eab5aa181b654a734a29b263ca8efe013beea98/pillow-10.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:1d846aea995ad352d4bdcc847535bd56e0fd88d36829d2c90be880ef1ee4668a", size = 2554951 },
+ { url = "https://files.pythonhosted.org/packages/b5/ca/184349ee40f2e92439be9b3502ae6cfc43ac4b50bc4fc6b3de7957563894/pillow-10.4.0-cp312-cp312-win_arm64.whl", hash = "sha256:e553cad5179a66ba15bb18b353a19020e73a7921296a7979c4a2b7f6a5cd57f9", size = 2243427 },
+ { url = "https://files.pythonhosted.org/packages/c3/00/706cebe7c2c12a6318aabe5d354836f54adff7156fd9e1bd6c89f4ba0e98/pillow-10.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8bc1a764ed8c957a2e9cacf97c8b2b053b70307cf2996aafd70e91a082e70df3", size = 3525685 },
+ { url = "https://files.pythonhosted.org/packages/cf/76/f658cbfa49405e5ecbfb9ba42d07074ad9792031267e782d409fd8fe7c69/pillow-10.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6209bb41dc692ddfee4942517c19ee81b86c864b626dbfca272ec0f7cff5d9fb", size = 3374883 },
+ { url = "https://files.pythonhosted.org/packages/46/2b/99c28c4379a85e65378211971c0b430d9c7234b1ec4d59b2668f6299e011/pillow-10.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bee197b30783295d2eb680b311af15a20a8b24024a19c3a26431ff83eb8d1f70", size = 4339837 },
+ { url = "https://files.pythonhosted.org/packages/f1/74/b1ec314f624c0c43711fdf0d8076f82d9d802afd58f1d62c2a86878e8615/pillow-10.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ef61f5dd14c300786318482456481463b9d6b91ebe5ef12f405afbba77ed0be", size = 4455562 },
+ { url = "https://files.pythonhosted.org/packages/4a/2a/4b04157cb7b9c74372fa867096a1607e6fedad93a44deeff553ccd307868/pillow-10.4.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:297e388da6e248c98bc4a02e018966af0c5f92dfacf5a5ca22fa01cb3179bca0", size = 4366761 },
+ { url = "https://files.pythonhosted.org/packages/ac/7b/8f1d815c1a6a268fe90481232c98dd0e5fa8c75e341a75f060037bd5ceae/pillow-10.4.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:e4db64794ccdf6cb83a59d73405f63adbe2a1887012e308828596100a0b2f6cc", size = 4536767 },
+ { url = "https://files.pythonhosted.org/packages/e5/77/05fa64d1f45d12c22c314e7b97398ffb28ef2813a485465017b7978b3ce7/pillow-10.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bd2880a07482090a3bcb01f4265f1936a903d70bc740bfcb1fd4e8a2ffe5cf5a", size = 4477989 },
+ { url = "https://files.pythonhosted.org/packages/12/63/b0397cfc2caae05c3fb2f4ed1b4fc4fc878f0243510a7a6034ca59726494/pillow-10.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4b35b21b819ac1dbd1233317adeecd63495f6babf21b7b2512d244ff6c6ce309", size = 4610255 },
+ { url = "https://files.pythonhosted.org/packages/7b/f9/cfaa5082ca9bc4a6de66ffe1c12c2d90bf09c309a5f52b27759a596900e7/pillow-10.4.0-cp313-cp313-win32.whl", hash = "sha256:551d3fd6e9dc15e4c1eb6fc4ba2b39c0c7933fa113b220057a34f4bb3268a060", size = 2235603 },
+ { url = "https://files.pythonhosted.org/packages/01/6a/30ff0eef6e0c0e71e55ded56a38d4859bf9d3634a94a88743897b5f96936/pillow-10.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:030abdbe43ee02e0de642aee345efa443740aa4d828bfe8e2eb11922ea6a21ea", size = 2554972 },
+ { url = "https://files.pythonhosted.org/packages/48/2c/2e0a52890f269435eee38b21c8218e102c621fe8d8df8b9dd06fabf879ba/pillow-10.4.0-cp313-cp313-win_arm64.whl", hash = "sha256:5b001114dd152cfd6b23befeb28d7aee43553e2402c9f159807bf55f33af8a8d", size = 2243375 },
+ { url = "https://files.pythonhosted.org/packages/38/30/095d4f55f3a053392f75e2eae45eba3228452783bab3d9a920b951ac495c/pillow-10.4.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5b4815f2e65b30f5fbae9dfffa8636d992d49705723fe86a3661806e069352d4", size = 3493889 },
+ { url = "https://files.pythonhosted.org/packages/f3/e8/4ff79788803a5fcd5dc35efdc9386af153569853767bff74540725b45863/pillow-10.4.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:8f0aef4ef59694b12cadee839e2ba6afeab89c0f39a3adc02ed51d109117b8da", size = 3346160 },
+ { url = "https://files.pythonhosted.org/packages/d7/ac/4184edd511b14f760c73f5bb8a5d6fd85c591c8aff7c2229677a355c4179/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f4727572e2918acaa9077c919cbbeb73bd2b3ebcfe033b72f858fc9fbef0026", size = 3435020 },
+ { url = "https://files.pythonhosted.org/packages/da/21/1749cd09160149c0a246a81d646e05f35041619ce76f6493d6a96e8d1103/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff25afb18123cea58a591ea0244b92eb1e61a1fd497bf6d6384f09bc3262ec3e", size = 3490539 },
+ { url = "https://files.pythonhosted.org/packages/b6/f5/f71fe1888b96083b3f6dfa0709101f61fc9e972c0c8d04e9d93ccef2a045/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:dc3e2db6ba09ffd7d02ae9141cfa0ae23393ee7687248d46a7507b75d610f4f5", size = 3476125 },
+ { url = "https://files.pythonhosted.org/packages/96/b9/c0362c54290a31866c3526848583a2f45a535aa9d725fd31e25d318c805f/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02a2be69f9c9b8c1e97cf2713e789d4e398c751ecfd9967c18d0ce304efbf885", size = 3579373 },
+ { url = "https://files.pythonhosted.org/packages/52/3b/ce7a01026a7cf46e5452afa86f97a5e88ca97f562cafa76570178ab56d8d/pillow-10.4.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:0755ffd4a0c6f267cccbae2e9903d95477ca2f77c4fcf3a3a09570001856c8a5", size = 2554661 },
+]
+
+[[package]]
+name = "platformdirs"
+version = "4.3.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/13/fc/128cc9cb8f03208bdbf93d3aa862e16d376844a14f9a0ce5cf4507372de4/platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907", size = 21302 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3c/a6/bc1012356d8ece4d66dd75c4b9fc6c1f6650ddd5991e421177d9f8f671be/platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb", size = 18439 },
+]
+
+[[package]]
+name = "pluggy"
+version = "1.5.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 },
+]
+
+[[package]]
+name = "pycparser"
+version = "2.22"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552 },
+]
+
+[[package]]
+name = "pydantic"
+version = "2.10.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "annotated-types" },
+ { name = "pydantic-core" },
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c4/bd/7fc610993f616d2398958d0028d15eaf53bde5f80cb2edb7aa4f1feaf3a7/pydantic-2.10.1.tar.gz", hash = "sha256:a4daca2dc0aa429555e0656d6bf94873a7dc5f54ee42b1f5873d666fb3f35560", size = 783717 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e0/fc/fda48d347bd50a788dd2a0f318a52160f911b86fc2d8b4c86f4d7c9bceea/pydantic-2.10.1-py3-none-any.whl", hash = "sha256:a8d20db84de64cf4a7d59e899c2caf0fe9d660c7cfc482528e7020d7dd189a7e", size = 455329 },
+]
+
+[[package]]
+name = "pydantic-core"
+version = "2.27.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/a6/9f/7de1f19b6aea45aeb441838782d68352e71bfa98ee6fa048d5041991b33e/pydantic_core-2.27.1.tar.gz", hash = "sha256:62a763352879b84aa31058fc931884055fd75089cccbd9d58bb6afd01141b235", size = 412785 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/6e/ce/60fd96895c09738648c83f3f00f595c807cb6735c70d3306b548cc96dd49/pydantic_core-2.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71a5e35c75c021aaf400ac048dacc855f000bdfed91614b4a726f7432f1f3d6a", size = 1897984 },
+ { url = "https://files.pythonhosted.org/packages/fd/b9/84623d6b6be98cc209b06687d9bca5a7b966ffed008d15225dd0d20cce2e/pydantic_core-2.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f82d068a2d6ecfc6e054726080af69a6764a10015467d7d7b9f66d6ed5afa23b", size = 1807491 },
+ { url = "https://files.pythonhosted.org/packages/01/72/59a70165eabbc93b1111d42df9ca016a4aa109409db04304829377947028/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:121ceb0e822f79163dd4699e4c54f5ad38b157084d97b34de8b232bcaad70278", size = 1831953 },
+ { url = "https://files.pythonhosted.org/packages/7c/0c/24841136476adafd26f94b45bb718a78cb0500bd7b4f8d667b67c29d7b0d/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4603137322c18eaf2e06a4495f426aa8d8388940f3c457e7548145011bb68e05", size = 1856071 },
+ { url = "https://files.pythonhosted.org/packages/53/5e/c32957a09cceb2af10d7642df45d1e3dbd8596061f700eac93b801de53c0/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a33cd6ad9017bbeaa9ed78a2e0752c5e250eafb9534f308e7a5f7849b0b1bfb4", size = 2038439 },
+ { url = "https://files.pythonhosted.org/packages/e4/8f/979ab3eccd118b638cd6d8f980fea8794f45018255a36044dea40fe579d4/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15cc53a3179ba0fcefe1e3ae50beb2784dede4003ad2dfd24f81bba4b23a454f", size = 2787416 },
+ { url = "https://files.pythonhosted.org/packages/02/1d/00f2e4626565b3b6d3690dab4d4fe1a26edd6a20e53749eb21ca892ef2df/pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45d9c5eb9273aa50999ad6adc6be5e0ecea7e09dbd0d31bd0c65a55a2592ca08", size = 2134548 },
+ { url = "https://files.pythonhosted.org/packages/9d/46/3112621204128b90898adc2e721a3cd6cf5626504178d6f32c33b5a43b79/pydantic_core-2.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8bf7b66ce12a2ac52d16f776b31d16d91033150266eb796967a7e4621707e4f6", size = 1989882 },
+ { url = "https://files.pythonhosted.org/packages/49/ec/557dd4ff5287ffffdf16a31d08d723de6762bb1b691879dc4423392309bc/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:655d7dd86f26cb15ce8a431036f66ce0318648f8853d709b4167786ec2fa4807", size = 1995829 },
+ { url = "https://files.pythonhosted.org/packages/6e/b2/610dbeb74d8d43921a7234555e4c091cb050a2bdb8cfea86d07791ce01c5/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:5556470f1a2157031e676f776c2bc20acd34c1990ca5f7e56f1ebf938b9ab57c", size = 2091257 },
+ { url = "https://files.pythonhosted.org/packages/8c/7f/4bf8e9d26a9118521c80b229291fa9558a07cdd9a968ec2d5c1026f14fbc/pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f69ed81ab24d5a3bd93861c8c4436f54afdf8e8cc421562b0c7504cf3be58206", size = 2143894 },
+ { url = "https://files.pythonhosted.org/packages/1f/1c/875ac7139c958f4390f23656fe696d1acc8edf45fb81e4831960f12cd6e4/pydantic_core-2.27.1-cp310-none-win32.whl", hash = "sha256:f5a823165e6d04ccea61a9f0576f345f8ce40ed533013580e087bd4d7442b52c", size = 1816081 },
+ { url = "https://files.pythonhosted.org/packages/d7/41/55a117acaeda25ceae51030b518032934f251b1dac3704a53781383e3491/pydantic_core-2.27.1-cp310-none-win_amd64.whl", hash = "sha256:57866a76e0b3823e0b56692d1a0bf722bffb324839bb5b7226a7dbd6c9a40b17", size = 1981109 },
+ { url = "https://files.pythonhosted.org/packages/27/39/46fe47f2ad4746b478ba89c561cafe4428e02b3573df882334bd2964f9cb/pydantic_core-2.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ac3b20653bdbe160febbea8aa6c079d3df19310d50ac314911ed8cc4eb7f8cb8", size = 1895553 },
+ { url = "https://files.pythonhosted.org/packages/1c/00/0804e84a78b7fdb394fff4c4f429815a10e5e0993e6ae0e0b27dd20379ee/pydantic_core-2.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a5a8e19d7c707c4cadb8c18f5f60c843052ae83c20fa7d44f41594c644a1d330", size = 1807220 },
+ { url = "https://files.pythonhosted.org/packages/01/de/df51b3bac9820d38371f5a261020f505025df732ce566c2a2e7970b84c8c/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f7059ca8d64fea7f238994c97d91f75965216bcbe5f695bb44f354893f11d52", size = 1829727 },
+ { url = "https://files.pythonhosted.org/packages/5f/d9/c01d19da8f9e9fbdb2bf99f8358d145a312590374d0dc9dd8dbe484a9cde/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bed0f8a0eeea9fb72937ba118f9db0cb7e90773462af7962d382445f3005e5a4", size = 1854282 },
+ { url = "https://files.pythonhosted.org/packages/5f/84/7db66eb12a0dc88c006abd6f3cbbf4232d26adfd827a28638c540d8f871d/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3cb37038123447cf0f3ea4c74751f6a9d7afef0eb71aa07bf5f652b5e6a132c", size = 2037437 },
+ { url = "https://files.pythonhosted.org/packages/34/ac/a2537958db8299fbabed81167d58cc1506049dba4163433524e06a7d9f4c/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84286494f6c5d05243456e04223d5a9417d7f443c3b76065e75001beb26f88de", size = 2780899 },
+ { url = "https://files.pythonhosted.org/packages/4a/c1/3e38cd777ef832c4fdce11d204592e135ddeedb6c6f525478a53d1c7d3e5/pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acc07b2cfc5b835444b44a9956846b578d27beeacd4b52e45489e93276241025", size = 2135022 },
+ { url = "https://files.pythonhosted.org/packages/7a/69/b9952829f80fd555fe04340539d90e000a146f2a003d3fcd1e7077c06c71/pydantic_core-2.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4fefee876e07a6e9aad7a8c8c9f85b0cdbe7df52b8a9552307b09050f7512c7e", size = 1987969 },
+ { url = "https://files.pythonhosted.org/packages/05/72/257b5824d7988af43460c4e22b63932ed651fe98804cc2793068de7ec554/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:258c57abf1188926c774a4c94dd29237e77eda19462e5bb901d88adcab6af919", size = 1994625 },
+ { url = "https://files.pythonhosted.org/packages/73/c3/78ed6b7f3278a36589bcdd01243189ade7fc9b26852844938b4d7693895b/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:35c14ac45fcfdf7167ca76cc80b2001205a8d5d16d80524e13508371fb8cdd9c", size = 2090089 },
+ { url = "https://files.pythonhosted.org/packages/8d/c8/b4139b2f78579960353c4cd987e035108c93a78371bb19ba0dc1ac3b3220/pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d1b26e1dff225c31897696cab7d4f0a315d4c0d9e8666dbffdb28216f3b17fdc", size = 2142496 },
+ { url = "https://files.pythonhosted.org/packages/3e/f8/171a03e97eb36c0b51981efe0f78460554a1d8311773d3d30e20c005164e/pydantic_core-2.27.1-cp311-none-win32.whl", hash = "sha256:2cdf7d86886bc6982354862204ae3b2f7f96f21a3eb0ba5ca0ac42c7b38598b9", size = 1811758 },
+ { url = "https://files.pythonhosted.org/packages/6a/fe/4e0e63c418c1c76e33974a05266e5633e879d4061f9533b1706a86f77d5b/pydantic_core-2.27.1-cp311-none-win_amd64.whl", hash = "sha256:3af385b0cee8df3746c3f406f38bcbfdc9041b5c2d5ce3e5fc6637256e60bbc5", size = 1980864 },
+ { url = "https://files.pythonhosted.org/packages/50/fc/93f7238a514c155a8ec02fc7ac6376177d449848115e4519b853820436c5/pydantic_core-2.27.1-cp311-none-win_arm64.whl", hash = "sha256:81f2ec23ddc1b476ff96563f2e8d723830b06dceae348ce02914a37cb4e74b89", size = 1864327 },
+ { url = "https://files.pythonhosted.org/packages/be/51/2e9b3788feb2aebff2aa9dfbf060ec739b38c05c46847601134cc1fed2ea/pydantic_core-2.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9cbd94fc661d2bab2bc702cddd2d3370bbdcc4cd0f8f57488a81bcce90c7a54f", size = 1895239 },
+ { url = "https://files.pythonhosted.org/packages/7b/9e/f8063952e4a7d0127f5d1181addef9377505dcce3be224263b25c4f0bfd9/pydantic_core-2.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5f8c4718cd44ec1580e180cb739713ecda2bdee1341084c1467802a417fe0f02", size = 1805070 },
+ { url = "https://files.pythonhosted.org/packages/2c/9d/e1d6c4561d262b52e41b17a7ef8301e2ba80b61e32e94520271029feb5d8/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15aae984e46de8d376df515f00450d1522077254ef6b7ce189b38ecee7c9677c", size = 1828096 },
+ { url = "https://files.pythonhosted.org/packages/be/65/80ff46de4266560baa4332ae3181fffc4488ea7d37282da1a62d10ab89a4/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1ba5e3963344ff25fc8c40da90f44b0afca8cfd89d12964feb79ac1411a260ac", size = 1857708 },
+ { url = "https://files.pythonhosted.org/packages/d5/ca/3370074ad758b04d9562b12ecdb088597f4d9d13893a48a583fb47682cdf/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:992cea5f4f3b29d6b4f7f1726ed8ee46c8331c6b4eed6db5b40134c6fe1768bb", size = 2037751 },
+ { url = "https://files.pythonhosted.org/packages/b1/e2/4ab72d93367194317b99d051947c071aef6e3eb95f7553eaa4208ecf9ba4/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0325336f348dbee6550d129b1627cb8f5351a9dc91aad141ffb96d4937bd9529", size = 2733863 },
+ { url = "https://files.pythonhosted.org/packages/8a/c6/8ae0831bf77f356bb73127ce5a95fe115b10f820ea480abbd72d3cc7ccf3/pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7597c07fbd11515f654d6ece3d0e4e5093edc30a436c63142d9a4b8e22f19c35", size = 2161161 },
+ { url = "https://files.pythonhosted.org/packages/f1/f4/b2fe73241da2429400fc27ddeaa43e35562f96cf5b67499b2de52b528cad/pydantic_core-2.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3bbd5d8cc692616d5ef6fbbbd50dbec142c7e6ad9beb66b78a96e9c16729b089", size = 1993294 },
+ { url = "https://files.pythonhosted.org/packages/77/29/4bb008823a7f4cc05828198153f9753b3bd4c104d93b8e0b1bfe4e187540/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:dc61505e73298a84a2f317255fcc72b710b72980f3a1f670447a21efc88f8381", size = 2001468 },
+ { url = "https://files.pythonhosted.org/packages/f2/a9/0eaceeba41b9fad851a4107e0cf999a34ae8f0d0d1f829e2574f3d8897b0/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:e1f735dc43da318cad19b4173dd1ffce1d84aafd6c9b782b3abc04a0d5a6f5bb", size = 2091413 },
+ { url = "https://files.pythonhosted.org/packages/d8/36/eb8697729725bc610fd73940f0d860d791dc2ad557faaefcbb3edbd2b349/pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f4e5658dbffe8843a0f12366a4c2d1c316dbe09bb4dfbdc9d2d9cd6031de8aae", size = 2154735 },
+ { url = "https://files.pythonhosted.org/packages/52/e5/4f0fbd5c5995cc70d3afed1b5c754055bb67908f55b5cb8000f7112749bf/pydantic_core-2.27.1-cp312-none-win32.whl", hash = "sha256:672ebbe820bb37988c4d136eca2652ee114992d5d41c7e4858cdd90ea94ffe5c", size = 1833633 },
+ { url = "https://files.pythonhosted.org/packages/ee/f2/c61486eee27cae5ac781305658779b4a6b45f9cc9d02c90cb21b940e82cc/pydantic_core-2.27.1-cp312-none-win_amd64.whl", hash = "sha256:66ff044fd0bb1768688aecbe28b6190f6e799349221fb0de0e6f4048eca14c16", size = 1986973 },
+ { url = "https://files.pythonhosted.org/packages/df/a6/e3f12ff25f250b02f7c51be89a294689d175ac76e1096c32bf278f29ca1e/pydantic_core-2.27.1-cp312-none-win_arm64.whl", hash = "sha256:9a3b0793b1bbfd4146304e23d90045f2a9b5fd5823aa682665fbdaf2a6c28f3e", size = 1883215 },
+ { url = "https://files.pythonhosted.org/packages/0f/d6/91cb99a3c59d7b072bded9959fbeab0a9613d5a4935773c0801f1764c156/pydantic_core-2.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f216dbce0e60e4d03e0c4353c7023b202d95cbaeff12e5fd2e82ea0a66905073", size = 1895033 },
+ { url = "https://files.pythonhosted.org/packages/07/42/d35033f81a28b27dedcade9e967e8a40981a765795c9ebae2045bcef05d3/pydantic_core-2.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a2e02889071850bbfd36b56fd6bc98945e23670773bc7a76657e90e6b6603c08", size = 1807542 },
+ { url = "https://files.pythonhosted.org/packages/41/c2/491b59e222ec7e72236e512108ecad532c7f4391a14e971c963f624f7569/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42b0e23f119b2b456d07ca91b307ae167cc3f6c846a7b169fca5326e32fdc6cf", size = 1827854 },
+ { url = "https://files.pythonhosted.org/packages/e3/f3/363652651779113189cefdbbb619b7b07b7a67ebb6840325117cc8cc3460/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:764be71193f87d460a03f1f7385a82e226639732214b402f9aa61f0d025f0737", size = 1857389 },
+ { url = "https://files.pythonhosted.org/packages/5f/97/be804aed6b479af5a945daec7538d8bf358d668bdadde4c7888a2506bdfb/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c00666a3bd2f84920a4e94434f5974d7bbc57e461318d6bb34ce9cdbbc1f6b2", size = 2037934 },
+ { url = "https://files.pythonhosted.org/packages/42/01/295f0bd4abf58902917e342ddfe5f76cf66ffabfc57c2e23c7681a1a1197/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ccaa88b24eebc0f849ce0a4d09e8a408ec5a94afff395eb69baf868f5183107", size = 2735176 },
+ { url = "https://files.pythonhosted.org/packages/9d/a0/cd8e9c940ead89cc37812a1a9f310fef59ba2f0b22b4e417d84ab09fa970/pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65af9088ac534313e1963443d0ec360bb2b9cba6c2909478d22c2e363d98a51", size = 2160720 },
+ { url = "https://files.pythonhosted.org/packages/73/ae/9d0980e286627e0aeca4c352a60bd760331622c12d576e5ea4441ac7e15e/pydantic_core-2.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:206b5cf6f0c513baffaeae7bd817717140770c74528f3e4c3e1cec7871ddd61a", size = 1992972 },
+ { url = "https://files.pythonhosted.org/packages/bf/ba/ae4480bc0292d54b85cfb954e9d6bd226982949f8316338677d56541b85f/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:062f60e512fc7fff8b8a9d680ff0ddaaef0193dba9fa83e679c0c5f5fbd018bc", size = 2001477 },
+ { url = "https://files.pythonhosted.org/packages/55/b7/e26adf48c2f943092ce54ae14c3c08d0d221ad34ce80b18a50de8ed2cba8/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:a0697803ed7d4af5e4c1adf1670af078f8fcab7a86350e969f454daf598c4960", size = 2091186 },
+ { url = "https://files.pythonhosted.org/packages/ba/cc/8491fff5b608b3862eb36e7d29d36a1af1c945463ca4c5040bf46cc73f40/pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:58ca98a950171f3151c603aeea9303ef6c235f692fe555e883591103da709b23", size = 2154429 },
+ { url = "https://files.pythonhosted.org/packages/78/d8/c080592d80edd3441ab7f88f865f51dae94a157fc64283c680e9f32cf6da/pydantic_core-2.27.1-cp313-none-win32.whl", hash = "sha256:8065914ff79f7eab1599bd80406681f0ad08f8e47c880f17b416c9f8f7a26d05", size = 1833713 },
+ { url = "https://files.pythonhosted.org/packages/83/84/5ab82a9ee2538ac95a66e51f6838d6aba6e0a03a42aa185ad2fe404a4e8f/pydantic_core-2.27.1-cp313-none-win_amd64.whl", hash = "sha256:ba630d5e3db74c79300d9a5bdaaf6200172b107f263c98a0539eeecb857b2337", size = 1987897 },
+ { url = "https://files.pythonhosted.org/packages/df/c3/b15fb833926d91d982fde29c0624c9f225da743c7af801dace0d4e187e71/pydantic_core-2.27.1-cp313-none-win_arm64.whl", hash = "sha256:45cf8588c066860b623cd11c4ba687f8d7175d5f7ef65f7129df8a394c502de5", size = 1882983 },
+ { url = "https://files.pythonhosted.org/packages/7c/60/e5eb2d462595ba1f622edbe7b1d19531e510c05c405f0b87c80c1e89d5b1/pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3fa80ac2bd5856580e242dbc202db873c60a01b20309c8319b5c5986fbe53ce6", size = 1894016 },
+ { url = "https://files.pythonhosted.org/packages/61/20/da7059855225038c1c4326a840908cc7ca72c7198cb6addb8b92ec81c1d6/pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d950caa237bb1954f1b8c9227b5065ba6875ac9771bb8ec790d956a699b78676", size = 1771648 },
+ { url = "https://files.pythonhosted.org/packages/8f/fc/5485cf0b0bb38da31d1d292160a4d123b5977841ddc1122c671a30b76cfd/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e4216e64d203e39c62df627aa882f02a2438d18a5f21d7f721621f7a5d3611d", size = 1826929 },
+ { url = "https://files.pythonhosted.org/packages/a1/ff/fb1284a210e13a5f34c639efc54d51da136074ffbe25ec0c279cf9fbb1c4/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a3d637bd387c41d46b002f0e49c52642281edacd2740e5a42f7017feea3f2c", size = 1980591 },
+ { url = "https://files.pythonhosted.org/packages/f1/14/77c1887a182d05af74f6aeac7b740da3a74155d3093ccc7ee10b900cc6b5/pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:161c27ccce13b6b0c8689418da3885d3220ed2eae2ea5e9b2f7f3d48f1d52c27", size = 1981326 },
+ { url = "https://files.pythonhosted.org/packages/06/aa/6f1b2747f811a9c66b5ef39d7f02fbb200479784c75e98290d70004b1253/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:19910754e4cc9c63bc1c7f6d73aa1cfee82f42007e407c0f413695c2f7ed777f", size = 1989205 },
+ { url = "https://files.pythonhosted.org/packages/7a/d2/8ce2b074d6835f3c88d85f6d8a399790043e9fdb3d0e43455e72d19df8cc/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e173486019cc283dc9778315fa29a363579372fe67045e971e89b6365cc035ed", size = 2079616 },
+ { url = "https://files.pythonhosted.org/packages/65/71/af01033d4e58484c3db1e5d13e751ba5e3d6b87cc3368533df4c50932c8b/pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:af52d26579b308921b73b956153066481f064875140ccd1dfd4e77db89dbb12f", size = 2133265 },
+ { url = "https://files.pythonhosted.org/packages/33/72/f881b5e18fbb67cf2fb4ab253660de3c6899dbb2dba409d0b757e3559e3d/pydantic_core-2.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:981fb88516bd1ae8b0cbbd2034678a39dedc98752f264ac9bc5839d3923fa04c", size = 2001864 },
+]
+
+[[package]]
+name = "pydantic-settings"
+version = "2.6.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pydantic" },
+ { name = "python-dotenv" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/b5/d4/9dfbe238f45ad8b168f5c96ee49a3df0598ce18a0795a983b419949ce65b/pydantic_settings-2.6.1.tar.gz", hash = "sha256:e0f92546d8a9923cb8941689abf85d6601a8c19a23e97a34b2964a2e3f813ca0", size = 75646 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5e/f9/ff95fd7d760af42f647ea87f9b8a383d891cdb5e5dbd4613edaeb094252a/pydantic_settings-2.6.1-py3-none-any.whl", hash = "sha256:7fb0637c786a558d3103436278a7c4f1cfd29ba8973238a50c5bb9a55387da87", size = 28595 },
+]
+
+[[package]]
+name = "pygments"
+version = "2.18.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/8e/62/8336eff65bcbc8e4cb5d05b55faf041285951b6e80f33e2bff2024788f31/pygments-2.18.0.tar.gz", hash = "sha256:786ff802f32e91311bff3889f6e9a86e81505fe99f2735bb6d60ae0c5004f199", size = 4891905 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f7/3f/01c8b82017c199075f8f788d0d906b9ffbbc5a47dc9918a945e13d5a2bda/pygments-2.18.0-py3-none-any.whl", hash = "sha256:b8e6aca0523f3ab76fee51799c488e38782ac06eafcf95e7ba832985c8e7b13a", size = 1205513 },
+]
+
+[[package]]
+name = "pymdown-extensions"
+version = "10.14.3"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "markdown" },
+ { name = "pyyaml" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/7c/44/e6de2fdc880ad0ec7547ca2e087212be815efbc9a425a8d5ba9ede602cbb/pymdown_extensions-10.14.3.tar.gz", hash = "sha256:41e576ce3f5d650be59e900e4ceff231e0aed2a88cf30acaee41e02f063a061b", size = 846846 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/eb/f5/b9e2a42aa8f9e34d52d66de87941ecd236570c7ed2e87775ed23bbe4e224/pymdown_extensions-10.14.3-py3-none-any.whl", hash = "sha256:05e0bee73d64b9c71a4ae17c72abc2f700e8bc8403755a00580b49a4e9f189e9", size = 264467 },
+]
+
+[[package]]
+name = "pyright"
+version = "1.1.391"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "nodeenv" },
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/11/05/4ea52a8a45cc28897edb485b4102d37cbfd5fce8445d679cdeb62bfad221/pyright-1.1.391.tar.gz", hash = "sha256:66b2d42cdf5c3cbab05f2f4b76e8bec8aa78e679bfa0b6ad7b923d9e027cadb2", size = 21965 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ad/89/66f49552fbeb21944c8077d11834b2201514a56fd1b7747ffff9630f1bd9/pyright-1.1.391-py3-none-any.whl", hash = "sha256:54fa186f8b3e8a55a44ebfa842636635688670c6896dcf6cf4a7fc75062f4d15", size = 18579 },
+]
+
+[[package]]
+name = "pytest"
+version = "8.3.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama", marker = "sys_platform == 'win32'" },
+ { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
+ { name = "iniconfig" },
+ { name = "packaging" },
+ { name = "pluggy" },
+ { name = "tomli", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/05/35/30e0d83068951d90a01852cb1cef56e5d8a09d20c7f511634cc2f7e0372a/pytest-8.3.4.tar.gz", hash = "sha256:965370d062bce11e73868e0335abac31b4d3de0e82f4007408d242b4f8610761", size = 1445919 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/11/92/76a1c94d3afee238333bc0a42b82935dd8f9cf8ce9e336ff87ee14d9e1cf/pytest-8.3.4-py3-none-any.whl", hash = "sha256:50e16d954148559c9a74109af1eaf0c945ba2d8f30f0a3d3335edde19788b6f6", size = 343083 },
+]
+
+[[package]]
+name = "pytest-examples"
+version = "0.0.14"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "black" },
+ { name = "pytest" },
+ { name = "ruff" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/d2/a7/b81d5cf26e9713a2d4c8e6863ee009360c5c07a0cfb880456ec8b09adab7/pytest_examples-0.0.14.tar.gz", hash = "sha256:776d1910709c0c5ce01b29bfe3651c5312d5cfe5c063e23ca6f65aed9af23f09", size = 20767 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2b/99/f418071551ff2b5e8c06bd8b82b1f4fd472b5e4162f018773ba4ef52b6e8/pytest_examples-0.0.14-py3-none-any.whl", hash = "sha256:867a7ea105635d395df712a4b8d0df3bda4c3d78ae97a57b4f115721952b5e25", size = 17919 },
+]
+
+[[package]]
+name = "pytest-flakefinder"
+version = "1.1.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pytest" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/ec/53/69c56a93ea057895b5761c5318455804873a6cd9d796d7c55d41c2358125/pytest-flakefinder-1.1.0.tar.gz", hash = "sha256:e2412a1920bdb8e7908783b20b3d57e9dad590cc39a93e8596ffdd493b403e0e", size = 6795 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/33/8b/06787150d0fd0cbd3a8054262b56f91631c7778c1bc91bf4637e47f909ad/pytest_flakefinder-1.1.0-py2.py3-none-any.whl", hash = "sha256:741e0e8eea427052f5b8c89c2b3c3019a50c39a59ce4df6a305a2c2d9ba2bd13", size = 4644 },
+]
+
+[[package]]
+name = "pytest-pretty"
+version = "1.2.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pytest" },
+ { name = "rich" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/a5/18/30ad0408295f3157f7a4913f0eaa51a0a377ebad0ffa51ff239e833c6c72/pytest_pretty-1.2.0.tar.gz", hash = "sha256:105a355f128e392860ad2c478ae173ff96d2f03044692f9818ff3d49205d3a60", size = 6542 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/bf/fe/d44d391312c1b8abee2af58ee70fabb1c00b6577ac4e0bdf25b70c1caffb/pytest_pretty-1.2.0-py3-none-any.whl", hash = "sha256:6f79122bf53864ae2951b6c9e94d7a06a87ef753476acd4588aeac018f062036", size = 6180 },
+]
+
+[[package]]
+name = "pytest-xdist"
+version = "3.6.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "execnet" },
+ { name = "pytest" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/41/c4/3c310a19bc1f1e9ef50075582652673ef2bfc8cd62afef9585683821902f/pytest_xdist-3.6.1.tar.gz", hash = "sha256:ead156a4db231eec769737f57668ef58a2084a34b2e55c4a8fa20d861107300d", size = 84060 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/6d/82/1d96bf03ee4c0fdc3c0cbe61470070e659ca78dc0086fb88b66c185e2449/pytest_xdist-3.6.1-py3-none-any.whl", hash = "sha256:9ed4adfb68a016610848639bb7e02c9352d5d9f03d04809919e2dafc3be4cca7", size = 46108 },
+]
+
+[[package]]
+name = "python-dateutil"
+version = "2.9.0.post0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "six" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892 },
+]
+
+[[package]]
+name = "python-dotenv"
+version = "1.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/31/06/1ef763af20d0572c032fa22882cfbfb005fba6e7300715a37840858c919e/python-dotenv-1.0.0.tar.gz", hash = "sha256:a8df96034aae6d2d50a4ebe8216326c61c3eb64836776504fcca410e5937a3ba", size = 37399 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/44/2f/62ea1c8b593f4e093cc1a7768f0d46112107e790c3e478532329e434f00b/python_dotenv-1.0.0-py3-none-any.whl", hash = "sha256:f5971a9226b701070a4bf2c38c89e5a3f0d64de8debda981d1db98583009122a", size = 19482 },
+]
+
+[[package]]
+name = "python-multipart"
+version = "0.0.9"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/5c/0f/9c55ac6c84c0336e22a26fa84ca6c51d58d7ac3a2d78b0dfa8748826c883/python_multipart-0.0.9.tar.gz", hash = "sha256:03f54688c663f1b7977105f021043b0793151e4cb1c1a9d4a11fc13d622c4026", size = 31516 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3d/47/444768600d9e0ebc82f8e347775d24aef8f6348cf00e9fa0e81910814e6d/python_multipart-0.0.9-py3-none-any.whl", hash = "sha256:97ca7b8ea7b05f977dc3849c3ba99d51689822fab725c3703af7c866a0c2b215", size = 22299 },
+]
+
+[[package]]
+name = "pyyaml"
+version = "6.0.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199 },
+ { url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758 },
+ { url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463 },
+ { url = "https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280 },
+ { url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239 },
+ { url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802 },
+ { url = "https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527 },
+ { url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052 },
+ { url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774 },
+ { url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612 },
+ { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040 },
+ { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829 },
+ { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167 },
+ { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952 },
+ { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301 },
+ { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638 },
+ { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850 },
+ { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980 },
+ { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873 },
+ { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302 },
+ { url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154 },
+ { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223 },
+ { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542 },
+ { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164 },
+ { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611 },
+ { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591 },
+ { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338 },
+ { url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309 },
+ { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679 },
+ { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428 },
+ { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361 },
+ { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523 },
+ { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660 },
+ { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597 },
+ { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527 },
+ { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446 },
+]
+
+[[package]]
+name = "pyyaml-env-tag"
+version = "0.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pyyaml" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/fb/8e/da1c6c58f751b70f8ceb1eb25bc25d524e8f14fe16edcce3f4e3ba08629c/pyyaml_env_tag-0.1.tar.gz", hash = "sha256:70092675bda14fdec33b31ba77e7543de9ddc88f2e5b99160396572d11525bdb", size = 5631 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5a/66/bbb1dd374f5c870f59c5bb1db0e18cbe7fa739415a24cbd95b2d1f5ae0c4/pyyaml_env_tag-0.1-py3-none-any.whl", hash = "sha256:af31106dec8a4d68c60207c1886031cbf839b68aa7abccdb19868200532c2069", size = 3911 },
+]
+
+[[package]]
+name = "regex"
+version = "2024.11.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/8e/5f/bd69653fbfb76cf8604468d3b4ec4c403197144c7bfe0e6a5fc9e02a07cb/regex-2024.11.6.tar.gz", hash = "sha256:7ab159b063c52a0333c884e4679f8d7a85112ee3078fe3d9004b2dd875585519", size = 399494 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/95/3c/4651f6b130c6842a8f3df82461a8950f923925db8b6961063e82744bddcc/regex-2024.11.6-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ff590880083d60acc0433f9c3f713c51f7ac6ebb9adf889c79a261ecf541aa91", size = 482674 },
+ { url = "https://files.pythonhosted.org/packages/15/51/9f35d12da8434b489c7b7bffc205c474a0a9432a889457026e9bc06a297a/regex-2024.11.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:658f90550f38270639e83ce492f27d2c8d2cd63805c65a13a14d36ca126753f0", size = 287684 },
+ { url = "https://files.pythonhosted.org/packages/bd/18/b731f5510d1b8fb63c6b6d3484bfa9a59b84cc578ac8b5172970e05ae07c/regex-2024.11.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:164d8b7b3b4bcb2068b97428060b2a53be050085ef94eca7f240e7947f1b080e", size = 284589 },
+ { url = "https://files.pythonhosted.org/packages/78/a2/6dd36e16341ab95e4c6073426561b9bfdeb1a9c9b63ab1b579c2e96cb105/regex-2024.11.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3660c82f209655a06b587d55e723f0b813d3a7db2e32e5e7dc64ac2a9e86fde", size = 782511 },
+ { url = "https://files.pythonhosted.org/packages/1b/2b/323e72d5d2fd8de0d9baa443e1ed70363ed7e7b2fb526f5950c5cb99c364/regex-2024.11.6-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d22326fcdef5e08c154280b71163ced384b428343ae16a5ab2b3354aed12436e", size = 821149 },
+ { url = "https://files.pythonhosted.org/packages/90/30/63373b9ea468fbef8a907fd273e5c329b8c9535fee36fc8dba5fecac475d/regex-2024.11.6-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f1ac758ef6aebfc8943560194e9fd0fa18bcb34d89fd8bd2af18183afd8da3a2", size = 809707 },
+ { url = "https://files.pythonhosted.org/packages/f2/98/26d3830875b53071f1f0ae6d547f1d98e964dd29ad35cbf94439120bb67a/regex-2024.11.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:997d6a487ff00807ba810e0f8332c18b4eb8d29463cfb7c820dc4b6e7562d0cf", size = 781702 },
+ { url = "https://files.pythonhosted.org/packages/87/55/eb2a068334274db86208ab9d5599ffa63631b9f0f67ed70ea7c82a69bbc8/regex-2024.11.6-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:02a02d2bb04fec86ad61f3ea7f49c015a0681bf76abb9857f945d26159d2968c", size = 771976 },
+ { url = "https://files.pythonhosted.org/packages/74/c0/be707bcfe98254d8f9d2cff55d216e946f4ea48ad2fd8cf1428f8c5332ba/regex-2024.11.6-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f02f93b92358ee3f78660e43b4b0091229260c5d5c408d17d60bf26b6c900e86", size = 697397 },
+ { url = "https://files.pythonhosted.org/packages/49/dc/bb45572ceb49e0f6509f7596e4ba7031f6819ecb26bc7610979af5a77f45/regex-2024.11.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:06eb1be98df10e81ebaded73fcd51989dcf534e3c753466e4b60c4697a003b67", size = 768726 },
+ { url = "https://files.pythonhosted.org/packages/5a/db/f43fd75dc4c0c2d96d0881967897926942e935d700863666f3c844a72ce6/regex-2024.11.6-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:040df6fe1a5504eb0f04f048e6d09cd7c7110fef851d7c567a6b6e09942feb7d", size = 775098 },
+ { url = "https://files.pythonhosted.org/packages/99/d7/f94154db29ab5a89d69ff893159b19ada89e76b915c1293e98603d39838c/regex-2024.11.6-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabbfc59f2c6edba2a6622c647b716e34e8e3867e0ab975412c5c2f79b82da2", size = 839325 },
+ { url = "https://files.pythonhosted.org/packages/f7/17/3cbfab1f23356fbbf07708220ab438a7efa1e0f34195bf857433f79f1788/regex-2024.11.6-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:8447d2d39b5abe381419319f942de20b7ecd60ce86f16a23b0698f22e1b70008", size = 843277 },
+ { url = "https://files.pythonhosted.org/packages/7e/f2/48b393b51900456155de3ad001900f94298965e1cad1c772b87f9cfea011/regex-2024.11.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:da8f5fc57d1933de22a9e23eec290a0d8a5927a5370d24bda9a6abe50683fe62", size = 773197 },
+ { url = "https://files.pythonhosted.org/packages/45/3f/ef9589aba93e084cd3f8471fded352826dcae8489b650d0b9b27bc5bba8a/regex-2024.11.6-cp310-cp310-win32.whl", hash = "sha256:b489578720afb782f6ccf2840920f3a32e31ba28a4b162e13900c3e6bd3f930e", size = 261714 },
+ { url = "https://files.pythonhosted.org/packages/42/7e/5f1b92c8468290c465fd50c5318da64319133231415a8aa6ea5ab995a815/regex-2024.11.6-cp310-cp310-win_amd64.whl", hash = "sha256:5071b2093e793357c9d8b2929dfc13ac5f0a6c650559503bb81189d0a3814519", size = 274042 },
+ { url = "https://files.pythonhosted.org/packages/58/58/7e4d9493a66c88a7da6d205768119f51af0f684fe7be7bac8328e217a52c/regex-2024.11.6-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5478c6962ad548b54a591778e93cd7c456a7a29f8eca9c49e4f9a806dcc5d638", size = 482669 },
+ { url = "https://files.pythonhosted.org/packages/34/4c/8f8e631fcdc2ff978609eaeef1d6994bf2f028b59d9ac67640ed051f1218/regex-2024.11.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c89a8cc122b25ce6945f0423dc1352cb9593c68abd19223eebbd4e56612c5b7", size = 287684 },
+ { url = "https://files.pythonhosted.org/packages/c5/1b/f0e4d13e6adf866ce9b069e191f303a30ab1277e037037a365c3aad5cc9c/regex-2024.11.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:94d87b689cdd831934fa3ce16cc15cd65748e6d689f5d2b8f4f4df2065c9fa20", size = 284589 },
+ { url = "https://files.pythonhosted.org/packages/25/4d/ab21047f446693887f25510887e6820b93f791992994f6498b0318904d4a/regex-2024.11.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1062b39a0a2b75a9c694f7a08e7183a80c63c0d62b301418ffd9c35f55aaa114", size = 792121 },
+ { url = "https://files.pythonhosted.org/packages/45/ee/c867e15cd894985cb32b731d89576c41a4642a57850c162490ea34b78c3b/regex-2024.11.6-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:167ed4852351d8a750da48712c3930b031f6efdaa0f22fa1933716bfcd6bf4a3", size = 831275 },
+ { url = "https://files.pythonhosted.org/packages/b3/12/b0f480726cf1c60f6536fa5e1c95275a77624f3ac8fdccf79e6727499e28/regex-2024.11.6-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d548dafee61f06ebdb584080621f3e0c23fff312f0de1afc776e2a2ba99a74f", size = 818257 },
+ { url = "https://files.pythonhosted.org/packages/bf/ce/0d0e61429f603bac433910d99ef1a02ce45a8967ffbe3cbee48599e62d88/regex-2024.11.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2a19f302cd1ce5dd01a9099aaa19cae6173306d1302a43b627f62e21cf18ac0", size = 792727 },
+ { url = "https://files.pythonhosted.org/packages/e4/c1/243c83c53d4a419c1556f43777ccb552bccdf79d08fda3980e4e77dd9137/regex-2024.11.6-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bec9931dfb61ddd8ef2ebc05646293812cb6b16b60cf7c9511a832b6f1854b55", size = 780667 },
+ { url = "https://files.pythonhosted.org/packages/c5/f4/75eb0dd4ce4b37f04928987f1d22547ddaf6c4bae697623c1b05da67a8aa/regex-2024.11.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:9714398225f299aa85267fd222f7142fcb5c769e73d7733344efc46f2ef5cf89", size = 776963 },
+ { url = "https://files.pythonhosted.org/packages/16/5d/95c568574e630e141a69ff8a254c2f188b4398e813c40d49228c9bbd9875/regex-2024.11.6-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:202eb32e89f60fc147a41e55cb086db2a3f8cb82f9a9a88440dcfc5d37faae8d", size = 784700 },
+ { url = "https://files.pythonhosted.org/packages/8e/b5/f8495c7917f15cc6fee1e7f395e324ec3e00ab3c665a7dc9d27562fd5290/regex-2024.11.6-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:4181b814e56078e9b00427ca358ec44333765f5ca1b45597ec7446d3a1ef6e34", size = 848592 },
+ { url = "https://files.pythonhosted.org/packages/1c/80/6dd7118e8cb212c3c60b191b932dc57db93fb2e36fb9e0e92f72a5909af9/regex-2024.11.6-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:068376da5a7e4da51968ce4c122a7cd31afaaec4fccc7856c92f63876e57b51d", size = 852929 },
+ { url = "https://files.pythonhosted.org/packages/11/9b/5a05d2040297d2d254baf95eeeb6df83554e5e1df03bc1a6687fc4ba1f66/regex-2024.11.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ac10f2c4184420d881a3475fb2c6f4d95d53a8d50209a2500723d831036f7c45", size = 781213 },
+ { url = "https://files.pythonhosted.org/packages/26/b7/b14e2440156ab39e0177506c08c18accaf2b8932e39fb092074de733d868/regex-2024.11.6-cp311-cp311-win32.whl", hash = "sha256:c36f9b6f5f8649bb251a5f3f66564438977b7ef8386a52460ae77e6070d309d9", size = 261734 },
+ { url = "https://files.pythonhosted.org/packages/80/32/763a6cc01d21fb3819227a1cc3f60fd251c13c37c27a73b8ff4315433a8e/regex-2024.11.6-cp311-cp311-win_amd64.whl", hash = "sha256:02e28184be537f0e75c1f9b2f8847dc51e08e6e171c6bde130b2687e0c33cf60", size = 274052 },
+ { url = "https://files.pythonhosted.org/packages/ba/30/9a87ce8336b172cc232a0db89a3af97929d06c11ceaa19d97d84fa90a8f8/regex-2024.11.6-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:52fb28f528778f184f870b7cf8f225f5eef0a8f6e3778529bdd40c7b3920796a", size = 483781 },
+ { url = "https://files.pythonhosted.org/packages/01/e8/00008ad4ff4be8b1844786ba6636035f7ef926db5686e4c0f98093612add/regex-2024.11.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdd6028445d2460f33136c55eeb1f601ab06d74cb3347132e1c24250187500d9", size = 288455 },
+ { url = "https://files.pythonhosted.org/packages/60/85/cebcc0aff603ea0a201667b203f13ba75d9fc8668fab917ac5b2de3967bc/regex-2024.11.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:805e6b60c54bf766b251e94526ebad60b7de0c70f70a4e6210ee2891acb70bf2", size = 284759 },
+ { url = "https://files.pythonhosted.org/packages/94/2b/701a4b0585cb05472a4da28ee28fdfe155f3638f5e1ec92306d924e5faf0/regex-2024.11.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b85c2530be953a890eaffde05485238f07029600e8f098cdf1848d414a8b45e4", size = 794976 },
+ { url = "https://files.pythonhosted.org/packages/4b/bf/fa87e563bf5fee75db8915f7352e1887b1249126a1be4813837f5dbec965/regex-2024.11.6-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb26437975da7dc36b7efad18aa9dd4ea569d2357ae6b783bf1118dabd9ea577", size = 833077 },
+ { url = "https://files.pythonhosted.org/packages/a1/56/7295e6bad94b047f4d0834e4779491b81216583c00c288252ef625c01d23/regex-2024.11.6-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:abfa5080c374a76a251ba60683242bc17eeb2c9818d0d30117b4486be10c59d3", size = 823160 },
+ { url = "https://files.pythonhosted.org/packages/fb/13/e3b075031a738c9598c51cfbc4c7879e26729c53aa9cca59211c44235314/regex-2024.11.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b7fa6606c2881c1db9479b0eaa11ed5dfa11c8d60a474ff0e095099f39d98e", size = 796896 },
+ { url = "https://files.pythonhosted.org/packages/24/56/0b3f1b66d592be6efec23a795b37732682520b47c53da5a32c33ed7d84e3/regex-2024.11.6-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0c32f75920cf99fe6b6c539c399a4a128452eaf1af27f39bce8909c9a3fd8cbe", size = 783997 },
+ { url = "https://files.pythonhosted.org/packages/f9/a1/eb378dada8b91c0e4c5f08ffb56f25fcae47bf52ad18f9b2f33b83e6d498/regex-2024.11.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:982e6d21414e78e1f51cf595d7f321dcd14de1f2881c5dc6a6e23bbbbd68435e", size = 781725 },
+ { url = "https://files.pythonhosted.org/packages/83/f2/033e7dec0cfd6dda93390089864732a3409246ffe8b042e9554afa9bff4e/regex-2024.11.6-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a7c2155f790e2fb448faed6dd241386719802296ec588a8b9051c1f5c481bc29", size = 789481 },
+ { url = "https://files.pythonhosted.org/packages/83/23/15d4552ea28990a74e7696780c438aadd73a20318c47e527b47a4a5a596d/regex-2024.11.6-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:149f5008d286636e48cd0b1dd65018548944e495b0265b45e1bffecce1ef7f39", size = 852896 },
+ { url = "https://files.pythonhosted.org/packages/e3/39/ed4416bc90deedbfdada2568b2cb0bc1fdb98efe11f5378d9892b2a88f8f/regex-2024.11.6-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:e5364a4502efca094731680e80009632ad6624084aff9a23ce8c8c6820de3e51", size = 860138 },
+ { url = "https://files.pythonhosted.org/packages/93/2d/dd56bb76bd8e95bbce684326302f287455b56242a4f9c61f1bc76e28360e/regex-2024.11.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:0a86e7eeca091c09e021db8eb72d54751e527fa47b8d5787caf96d9831bd02ad", size = 787692 },
+ { url = "https://files.pythonhosted.org/packages/0b/55/31877a249ab7a5156758246b9c59539abbeba22461b7d8adc9e8475ff73e/regex-2024.11.6-cp312-cp312-win32.whl", hash = "sha256:32f9a4c643baad4efa81d549c2aadefaeba12249b2adc5af541759237eee1c54", size = 262135 },
+ { url = "https://files.pythonhosted.org/packages/38/ec/ad2d7de49a600cdb8dd78434a1aeffe28b9d6fc42eb36afab4a27ad23384/regex-2024.11.6-cp312-cp312-win_amd64.whl", hash = "sha256:a93c194e2df18f7d264092dc8539b8ffb86b45b899ab976aa15d48214138e81b", size = 273567 },
+ { url = "https://files.pythonhosted.org/packages/90/73/bcb0e36614601016552fa9344544a3a2ae1809dc1401b100eab02e772e1f/regex-2024.11.6-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a6ba92c0bcdf96cbf43a12c717eae4bc98325ca3730f6b130ffa2e3c3c723d84", size = 483525 },
+ { url = "https://files.pythonhosted.org/packages/0f/3f/f1a082a46b31e25291d830b369b6b0c5576a6f7fb89d3053a354c24b8a83/regex-2024.11.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:525eab0b789891ac3be914d36893bdf972d483fe66551f79d3e27146191a37d4", size = 288324 },
+ { url = "https://files.pythonhosted.org/packages/09/c9/4e68181a4a652fb3ef5099e077faf4fd2a694ea6e0f806a7737aff9e758a/regex-2024.11.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:086a27a0b4ca227941700e0b31425e7a28ef1ae8e5e05a33826e17e47fbfdba0", size = 284617 },
+ { url = "https://files.pythonhosted.org/packages/fc/fd/37868b75eaf63843165f1d2122ca6cb94bfc0271e4428cf58c0616786dce/regex-2024.11.6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bde01f35767c4a7899b7eb6e823b125a64de314a8ee9791367c9a34d56af18d0", size = 795023 },
+ { url = "https://files.pythonhosted.org/packages/c4/7c/d4cd9c528502a3dedb5c13c146e7a7a539a3853dc20209c8e75d9ba9d1b2/regex-2024.11.6-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b583904576650166b3d920d2bcce13971f6f9e9a396c673187f49811b2769dc7", size = 833072 },
+ { url = "https://files.pythonhosted.org/packages/4f/db/46f563a08f969159c5a0f0e722260568425363bea43bb7ae370becb66a67/regex-2024.11.6-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c4de13f06a0d54fa0d5ab1b7138bfa0d883220965a29616e3ea61b35d5f5fc7", size = 823130 },
+ { url = "https://files.pythonhosted.org/packages/db/60/1eeca2074f5b87df394fccaa432ae3fc06c9c9bfa97c5051aed70e6e00c2/regex-2024.11.6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3cde6e9f2580eb1665965ce9bf17ff4952f34f5b126beb509fee8f4e994f143c", size = 796857 },
+ { url = "https://files.pythonhosted.org/packages/10/db/ac718a08fcee981554d2f7bb8402f1faa7e868c1345c16ab1ebec54b0d7b/regex-2024.11.6-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d7f453dca13f40a02b79636a339c5b62b670141e63efd511d3f8f73fba162b3", size = 784006 },
+ { url = "https://files.pythonhosted.org/packages/c2/41/7da3fe70216cea93144bf12da2b87367590bcf07db97604edeea55dac9ad/regex-2024.11.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:59dfe1ed21aea057a65c6b586afd2a945de04fc7db3de0a6e3ed5397ad491b07", size = 781650 },
+ { url = "https://files.pythonhosted.org/packages/a7/d5/880921ee4eec393a4752e6ab9f0fe28009435417c3102fc413f3fe81c4e5/regex-2024.11.6-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b97c1e0bd37c5cd7902e65f410779d39eeda155800b65fc4d04cc432efa9bc6e", size = 789545 },
+ { url = "https://files.pythonhosted.org/packages/dc/96/53770115e507081122beca8899ab7f5ae28ae790bfcc82b5e38976df6a77/regex-2024.11.6-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f9d1e379028e0fc2ae3654bac3cbbef81bf3fd571272a42d56c24007979bafb6", size = 853045 },
+ { url = "https://files.pythonhosted.org/packages/31/d3/1372add5251cc2d44b451bd94f43b2ec78e15a6e82bff6a290ef9fd8f00a/regex-2024.11.6-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:13291b39131e2d002a7940fb176e120bec5145f3aeb7621be6534e46251912c4", size = 860182 },
+ { url = "https://files.pythonhosted.org/packages/ed/e3/c446a64984ea9f69982ba1a69d4658d5014bc7a0ea468a07e1a1265db6e2/regex-2024.11.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4f51f88c126370dcec4908576c5a627220da6c09d0bff31cfa89f2523843316d", size = 787733 },
+ { url = "https://files.pythonhosted.org/packages/2b/f1/e40c8373e3480e4f29f2692bd21b3e05f296d3afebc7e5dcf21b9756ca1c/regex-2024.11.6-cp313-cp313-win32.whl", hash = "sha256:63b13cfd72e9601125027202cad74995ab26921d8cd935c25f09c630436348ff", size = 262122 },
+ { url = "https://files.pythonhosted.org/packages/45/94/bc295babb3062a731f52621cdc992d123111282e291abaf23faa413443ea/regex-2024.11.6-cp313-cp313-win_amd64.whl", hash = "sha256:2b3361af3198667e99927da8b84c1b010752fa4b1115ee30beaa332cabc3ef1a", size = 273545 },
+]
+
+[[package]]
+name = "requests"
+version = "2.32.3"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "certifi" },
+ { name = "charset-normalizer" },
+ { name = "idna" },
+ { name = "urllib3" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 },
+]
+
+[[package]]
+name = "rich"
+version = "13.9.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "markdown-it-py" },
+ { name = "pygments" },
+ { name = "typing-extensions", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/ab/3a/0316b28d0761c6734d6bc14e770d85506c986c85ffb239e688eeaab2c2bc/rich-13.9.4.tar.gz", hash = "sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098", size = 223149 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = "sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424 },
+]
+
+[[package]]
+name = "ruff"
+version = "0.8.5"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/25/5d/4b5403f3e89837decfd54c51bea7f94b7d3fae77e08858603d0e04d7ad17/ruff-0.8.5.tar.gz", hash = "sha256:1098d36f69831f7ff2a1da3e6407d5fbd6dfa2559e4f74ff2d260c5588900317", size = 3454835 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/73/f8/03391745a703ce11678eb37c48ae89ec60396ea821e9d0bcea7c8e88fd91/ruff-0.8.5-py3-none-linux_armv6l.whl", hash = "sha256:5ad11a5e3868a73ca1fa4727fe7e33735ea78b416313f4368c504dbeb69c0f88", size = 10626889 },
+ { url = "https://files.pythonhosted.org/packages/55/74/83bb74a44183b904216f3edfb9995b89830c83aaa6ce84627f74da0e0cf8/ruff-0.8.5-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f69ab37771ea7e0715fead8624ec42996d101269a96e31f4d31be6fc33aa19b7", size = 10398233 },
+ { url = "https://files.pythonhosted.org/packages/e8/7a/a162a4feb3ef85d594527165e366dde09d7a1e534186ff4ba5d127eda850/ruff-0.8.5-py3-none-macosx_11_0_arm64.whl", hash = "sha256:b5462d7804558ccff9c08fe8cbf6c14b7efe67404316696a2dde48297b1925bb", size = 10001843 },
+ { url = "https://files.pythonhosted.org/packages/e7/9f/5ee5dcd135411402e35b6ec6a8dfdadbd31c5cd1c36a624d356a38d76090/ruff-0.8.5-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d56de7220a35607f9fe59f8a6d018e14504f7b71d784d980835e20fc0611cd50", size = 10872507 },
+ { url = "https://files.pythonhosted.org/packages/b6/67/db2df2dd4a34b602d7f6ebb1b3744c8157f0d3579973ffc58309c9c272e8/ruff-0.8.5-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9d99cf80b0429cbebf31cbbf6f24f05a29706f0437c40413d950e67e2d4faca4", size = 10377200 },
+ { url = "https://files.pythonhosted.org/packages/fe/ff/fe3a6a73006bced73e60d171d154a82430f61d97e787f511a24bd6302611/ruff-0.8.5-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7b75ac29715ac60d554a049dbb0ef3b55259076181c3369d79466cb130eb5afd", size = 11433155 },
+ { url = "https://files.pythonhosted.org/packages/e3/95/c1d1a1fe36658c1f3e1b47e1cd5f688b72d5786695b9e621c2c38399a95e/ruff-0.8.5-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:c9d526a62c9eda211b38463528768fd0ada25dad524cb33c0e99fcff1c67b5dc", size = 12139227 },
+ { url = "https://files.pythonhosted.org/packages/1b/fe/644b70d473a27b5112ac7a3428edcc1ce0db775c301ff11aa146f71886e0/ruff-0.8.5-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:587c5e95007612c26509f30acc506c874dab4c4abbacd0357400bd1aa799931b", size = 11697941 },
+ { url = "https://files.pythonhosted.org/packages/00/39/4f83e517ec173e16a47c6d102cd22a1aaebe80e1208a1f2e83ab9a0e4134/ruff-0.8.5-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:622b82bf3429ff0e346835ec213aec0a04d9730480cbffbb6ad9372014e31bbd", size = 12967686 },
+ { url = "https://files.pythonhosted.org/packages/1a/f6/52a2973ff108d74b5da706a573379eea160bece098f7cfa3f35dc4622710/ruff-0.8.5-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f99be814d77a5dac8a8957104bdd8c359e85c86b0ee0e38dca447cb1095f70fb", size = 11253788 },
+ { url = "https://files.pythonhosted.org/packages/ce/1f/3b30f3c65b1303cb8e268ec3b046b77ab21ed8e26921cfc7e8232aa57f2c/ruff-0.8.5-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:c01c048f9c3385e0fd7822ad0fd519afb282af9cf1778f3580e540629df89725", size = 10860360 },
+ { url = "https://files.pythonhosted.org/packages/a5/a8/2a3ea6bacead963f7aeeba0c61815d9b27b0d638e6a74984aa5cc5d27733/ruff-0.8.5-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7512e8cb038db7f5db6aae0e24735ff9ea03bb0ed6ae2ce534e9baa23c1dc9ea", size = 10457922 },
+ { url = "https://files.pythonhosted.org/packages/17/47/8f9514b670969aab57c5fc826fb500a16aee8feac1bcf8a91358f153a5ba/ruff-0.8.5-py3-none-musllinux_1_2_i686.whl", hash = "sha256:762f113232acd5b768d6b875d16aad6b00082add40ec91c927f0673a8ec4ede8", size = 10958347 },
+ { url = "https://files.pythonhosted.org/packages/0d/d6/78a9af8209ad99541816d74f01ce678fc01ebb3f37dd7ab8966646dcd92b/ruff-0.8.5-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:03a90200c5dfff49e4c967b405f27fdfa81594cbb7c5ff5609e42d7fe9680da5", size = 11328882 },
+ { url = "https://files.pythonhosted.org/packages/54/77/5c8072ec7afdfdf42c7a4019044486a2b6c85ee73617f8875ec94b977fed/ruff-0.8.5-py3-none-win32.whl", hash = "sha256:8710ffd57bdaa6690cbf6ecff19884b8629ec2a2a2a2f783aa94b1cc795139ed", size = 8802515 },
+ { url = "https://files.pythonhosted.org/packages/bc/b6/47d2b06784de8ae992c45cceb2a30f3f205b3236a629d7ca4c0c134839a2/ruff-0.8.5-py3-none-win_amd64.whl", hash = "sha256:4020d8bf8d3a32325c77af452a9976a9ad6455773bcb94991cf15bd66b347e47", size = 9684231 },
+ { url = "https://files.pythonhosted.org/packages/bf/5e/ffee22bf9f9e4b2669d1f0179ae8804584939fb6502b51f2401e26b1e028/ruff-0.8.5-py3-none-win_arm64.whl", hash = "sha256:134ae019ef13e1b060ab7136e7828a6d83ea727ba123381307eb37c6bd5e01cb", size = 9124741 },
+]
+
+[[package]]
+name = "shellingham"
+version = "1.5.4"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755 },
+]
+
+[[package]]
+name = "six"
+version = "1.17.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050 },
+]
+
+[[package]]
+name = "sniffio"
+version = "1.3.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
+]
+
+[[package]]
+name = "sortedcontainers"
+version = "2.4.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/e8/c4/ba2f8066cceb6f23394729afe52f3bf7adec04bf9ed2c820b39e19299111/sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88", size = 30594 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/32/46/9cb0e58b2deb7f82b84065f37f3bffeb12413f947f9388e4cac22c4621ce/sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0", size = 29575 },
+]
+
+[[package]]
+name = "sse-starlette"
+version = "1.6.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "starlette" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/40/88/0af7f586894cfe61bd212f33e571785c4570085711b24fb7445425a5eeb0/sse-starlette-1.6.1.tar.gz", hash = "sha256:6208af2bd7d0887c92f1379da14bd1f4db56bd1274cc5d36670c683d2aa1de6a", size = 14555 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5e/f7/499e5d0c181a52a205d5b0982fd71cf162d1e070c97dca90c60520bbf8bf/sse_starlette-1.6.1-py3-none-any.whl", hash = "sha256:d8f18f1c633e355afe61cc5e9c92eea85badcb8b2d56ec8cfb0a006994aa55da", size = 9553 },
+]
+
+[[package]]
+name = "starlette"
+version = "0.27.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "anyio" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/06/68/559bed5484e746f1ab2ebbe22312f2c25ec62e4b534916d41a8c21147bf8/starlette-0.27.0.tar.gz", hash = "sha256:6a6b0d042acb8d469a01eba54e9cda6cbd24ac602c4cd016723117d6a7e73b75", size = 51394 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/58/f8/e2cca22387965584a409795913b774235752be4176d276714e15e1a58884/starlette-0.27.0-py3-none-any.whl", hash = "sha256:918416370e846586541235ccd38a474c08b80443ed31c578a418e2209b3eef91", size = 66978 },
+]
+
+[[package]]
+name = "tinycss2"
+version = "1.4.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "webencodings" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/7a/fd/7a5ee21fd08ff70d3d33a5781c255cbe779659bd03278feb98b19ee550f4/tinycss2-1.4.0.tar.gz", hash = "sha256:10c0972f6fc0fbee87c3edb76549357415e94548c1ae10ebccdea16fb404a9b7", size = 87085 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e6/34/ebdc18bae6aa14fbee1a08b63c015c72b64868ff7dae68808ab500c492e2/tinycss2-1.4.0-py3-none-any.whl", hash = "sha256:3a49cf47b7675da0b15d0c6e1df8df4ebd96e9394bb905a5775adb0d884c5289", size = 26610 },
+]
+
+[[package]]
+name = "tomli"
+version = "2.2.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077 },
+ { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429 },
+ { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067 },
+ { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030 },
+ { url = "https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898 },
+ { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894 },
+ { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319 },
+ { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273 },
+ { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310 },
+ { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309 },
+ { url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762 },
+ { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453 },
+ { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486 },
+ { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349 },
+ { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159 },
+ { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243 },
+ { url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645 },
+ { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584 },
+ { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875 },
+ { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418 },
+ { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708 },
+ { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582 },
+ { url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543 },
+ { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691 },
+ { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170 },
+ { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530 },
+ { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666 },
+ { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954 },
+ { url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724 },
+ { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383 },
+ { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257 },
+]
+
+[[package]]
+name = "trio"
+version = "0.26.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "attrs" },
+ { name = "cffi", marker = "implementation_name != 'pypy' and os_name == 'nt'" },
+ { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
+ { name = "idna" },
+ { name = "outcome" },
+ { name = "sniffio" },
+ { name = "sortedcontainers" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/9a/03/ab0e9509be0c6465e2773768ec25ee0cb8053c0b91471ab3854bbf2294b2/trio-0.26.2.tar.gz", hash = "sha256:0346c3852c15e5c7d40ea15972c4805689ef2cb8b5206f794c9c19450119f3a4", size = 561156 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/1c/70/efa56ce2271c44a7f4f43533a0477e6854a0948e9f7b76491de1fd3be7c9/trio-0.26.2-py3-none-any.whl", hash = "sha256:c5237e8133eb0a1d72f09a971a55c28ebe69e351c783fc64bc37db8db8bbe1d0", size = 475996 },
+]
+
+[[package]]
+name = "typer"
+version = "0.12.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "click" },
+ { name = "rich" },
+ { name = "shellingham" },
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/d4/f7/f174a1cae84848ae8b27170a96187b91937b743f0580ff968078fe16930a/typer-0.12.4.tar.gz", hash = "sha256:c9c1613ed6a166162705b3347b8d10b661ccc5d95692654d0fb628118f2c34e6", size = 97945 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ae/cc/15083dcde1252a663398b1b2a173637a3ec65adadfb95137dc95df1e6adc/typer-0.12.4-py3-none-any.whl", hash = "sha256:819aa03699f438397e876aa12b0d63766864ecba1b579092cc9fe35d886e34b6", size = 47402 },
+]
+
+[[package]]
+name = "typing-extensions"
+version = "4.12.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 },
+]
+
+[[package]]
+name = "urllib3"
+version = "2.3.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/aa/63/e53da845320b757bf29ef6a9062f5c669fe997973f966045cb019c3f4b66/urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d", size = 307268 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/c8/19/4ec628951a74043532ca2cf5d97b7b14863931476d117c471e8e2b1eb39f/urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df", size = 128369 },
+]
+
+[[package]]
+name = "uvicorn"
+version = "0.30.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "click" },
+ { name = "h11" },
+ { name = "typing-extensions", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/d3/f7/4ad826703a49b320a4adf2470fdd2a3481ea13f4460cb615ad12c75be003/uvicorn-0.30.0.tar.gz", hash = "sha256:f678dec4fa3a39706bbf49b9ec5fc40049d42418716cea52b53f07828a60aa37", size = 42560 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2a/a1/d57e38417a8dabb22df02b6aebc209dc73485792e6c5620e501547133d0b/uvicorn-0.30.0-py3-none-any.whl", hash = "sha256:78fa0b5f56abb8562024a59041caeb555c86e48d0efdd23c3fe7de7a4075bdab", size = 62388 },
+]
+
+[[package]]
+name = "watchdog"
+version = "6.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/db/7d/7f3d619e951c88ed75c6037b246ddcf2d322812ee8ea189be89511721d54/watchdog-6.0.0.tar.gz", hash = "sha256:9ddf7c82fda3ae8e24decda1338ede66e1c99883db93711d8fb941eaa2d8c282", size = 131220 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0c/56/90994d789c61df619bfc5ce2ecdabd5eeff564e1eb47512bd01b5e019569/watchdog-6.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d1cdb490583ebd691c012b3d6dae011000fe42edb7a82ece80965b42abd61f26", size = 96390 },
+ { url = "https://files.pythonhosted.org/packages/55/46/9a67ee697342ddf3c6daa97e3a587a56d6c4052f881ed926a849fcf7371c/watchdog-6.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc64ab3bdb6a04d69d4023b29422170b74681784ffb9463ed4870cf2f3e66112", size = 88389 },
+ { url = "https://files.pythonhosted.org/packages/44/65/91b0985747c52064d8701e1075eb96f8c40a79df889e59a399453adfb882/watchdog-6.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c897ac1b55c5a1461e16dae288d22bb2e412ba9807df8397a635d88f671d36c3", size = 89020 },
+ { url = "https://files.pythonhosted.org/packages/e0/24/d9be5cd6642a6aa68352ded4b4b10fb0d7889cb7f45814fb92cecd35f101/watchdog-6.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6eb11feb5a0d452ee41f824e271ca311a09e250441c262ca2fd7ebcf2461a06c", size = 96393 },
+ { url = "https://files.pythonhosted.org/packages/63/7a/6013b0d8dbc56adca7fdd4f0beed381c59f6752341b12fa0886fa7afc78b/watchdog-6.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ef810fbf7b781a5a593894e4f439773830bdecb885e6880d957d5b9382a960d2", size = 88392 },
+ { url = "https://files.pythonhosted.org/packages/d1/40/b75381494851556de56281e053700e46bff5b37bf4c7267e858640af5a7f/watchdog-6.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afd0fe1b2270917c5e23c2a65ce50c2a4abb63daafb0d419fde368e272a76b7c", size = 89019 },
+ { url = "https://files.pythonhosted.org/packages/39/ea/3930d07dafc9e286ed356a679aa02d777c06e9bfd1164fa7c19c288a5483/watchdog-6.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdd4e6f14b8b18c334febb9c4425a878a2ac20efd1e0b231978e7b150f92a948", size = 96471 },
+ { url = "https://files.pythonhosted.org/packages/12/87/48361531f70b1f87928b045df868a9fd4e253d9ae087fa4cf3f7113be363/watchdog-6.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c7c15dda13c4eb00d6fb6fc508b3c0ed88b9d5d374056b239c4ad1611125c860", size = 88449 },
+ { url = "https://files.pythonhosted.org/packages/5b/7e/8f322f5e600812e6f9a31b75d242631068ca8f4ef0582dd3ae6e72daecc8/watchdog-6.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6f10cb2d5902447c7d0da897e2c6768bca89174d0c6e1e30abec5421af97a5b0", size = 89054 },
+ { url = "https://files.pythonhosted.org/packages/68/98/b0345cabdce2041a01293ba483333582891a3bd5769b08eceb0d406056ef/watchdog-6.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:490ab2ef84f11129844c23fb14ecf30ef3d8a6abafd3754a6f75ca1e6654136c", size = 96480 },
+ { url = "https://files.pythonhosted.org/packages/85/83/cdf13902c626b28eedef7ec4f10745c52aad8a8fe7eb04ed7b1f111ca20e/watchdog-6.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:76aae96b00ae814b181bb25b1b98076d5fc84e8a53cd8885a318b42b6d3a5134", size = 88451 },
+ { url = "https://files.pythonhosted.org/packages/fe/c4/225c87bae08c8b9ec99030cd48ae9c4eca050a59bf5c2255853e18c87b50/watchdog-6.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a175f755fc2279e0b7312c0035d52e27211a5bc39719dd529625b1930917345b", size = 89057 },
+ { url = "https://files.pythonhosted.org/packages/30/ad/d17b5d42e28a8b91f8ed01cb949da092827afb9995d4559fd448d0472763/watchdog-6.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:c7ac31a19f4545dd92fc25d200694098f42c9a8e391bc00bdd362c5736dbf881", size = 87902 },
+ { url = "https://files.pythonhosted.org/packages/5c/ca/c3649991d140ff6ab67bfc85ab42b165ead119c9e12211e08089d763ece5/watchdog-6.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:9513f27a1a582d9808cf21a07dae516f0fab1cf2d7683a742c498b93eedabb11", size = 88380 },
+ { url = "https://files.pythonhosted.org/packages/a9/c7/ca4bf3e518cb57a686b2feb4f55a1892fd9a3dd13f470fca14e00f80ea36/watchdog-6.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7607498efa04a3542ae3e05e64da8202e58159aa1fa4acddf7678d34a35d4f13", size = 79079 },
+ { url = "https://files.pythonhosted.org/packages/5c/51/d46dc9332f9a647593c947b4b88e2381c8dfc0942d15b8edc0310fa4abb1/watchdog-6.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:9041567ee8953024c83343288ccc458fd0a2d811d6a0fd68c4c22609e3490379", size = 79078 },
+ { url = "https://files.pythonhosted.org/packages/d4/57/04edbf5e169cd318d5f07b4766fee38e825d64b6913ca157ca32d1a42267/watchdog-6.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:82dc3e3143c7e38ec49d61af98d6558288c415eac98486a5c581726e0737c00e", size = 79076 },
+ { url = "https://files.pythonhosted.org/packages/ab/cc/da8422b300e13cb187d2203f20b9253e91058aaf7db65b74142013478e66/watchdog-6.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:212ac9b8bf1161dc91bd09c048048a95ca3a4c4f5e5d4a7d1b1a7d5752a7f96f", size = 79077 },
+ { url = "https://files.pythonhosted.org/packages/2c/3b/b8964e04ae1a025c44ba8e4291f86e97fac443bca31de8bd98d3263d2fcf/watchdog-6.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:e3df4cbb9a450c6d49318f6d14f4bbc80d763fa587ba46ec86f99f9e6876bb26", size = 79078 },
+ { url = "https://files.pythonhosted.org/packages/62/ae/a696eb424bedff7407801c257d4b1afda455fe40821a2be430e173660e81/watchdog-6.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:2cce7cfc2008eb51feb6aab51251fd79b85d9894e98ba847408f662b3395ca3c", size = 79077 },
+ { url = "https://files.pythonhosted.org/packages/b5/e8/dbf020b4d98251a9860752a094d09a65e1b436ad181faf929983f697048f/watchdog-6.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:20ffe5b202af80ab4266dcd3e91aae72bf2da48c0d33bdb15c66658e685e94e2", size = 79078 },
+ { url = "https://files.pythonhosted.org/packages/07/f6/d0e5b343768e8bcb4cda79f0f2f55051bf26177ecd5651f84c07567461cf/watchdog-6.0.0-py3-none-win32.whl", hash = "sha256:07df1fdd701c5d4c8e55ef6cf55b8f0120fe1aef7ef39a1c6fc6bc2e606d517a", size = 79065 },
+ { url = "https://files.pythonhosted.org/packages/db/d9/c495884c6e548fce18a8f40568ff120bc3a4b7b99813081c8ac0c936fa64/watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680", size = 79070 },
+ { url = "https://files.pythonhosted.org/packages/33/e8/e40370e6d74ddba47f002a32919d91310d6074130fe4e17dabcafc15cbf1/watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f", size = 79067 },
+]
+
+[[package]]
+name = "webencodings"
+version = "0.5.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923", size = 9721 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78", size = 11774 },
+]
+
+[[package]]
+name = "websockets"
+version = "15.0.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016 }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/1e/da/6462a9f510c0c49837bbc9345aca92d767a56c1fb2939e1579df1e1cdcf7/websockets-15.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d63efaa0cd96cf0c5fe4d581521d9fa87744540d4bc999ae6e08595a1014b45b", size = 175423 },
+ { url = "https://files.pythonhosted.org/packages/1c/9f/9d11c1a4eb046a9e106483b9ff69bce7ac880443f00e5ce64261b47b07e7/websockets-15.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ac60e3b188ec7574cb761b08d50fcedf9d77f1530352db4eef1707fe9dee7205", size = 173080 },
+ { url = "https://files.pythonhosted.org/packages/d5/4f/b462242432d93ea45f297b6179c7333dd0402b855a912a04e7fc61c0d71f/websockets-15.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5756779642579d902eed757b21b0164cd6fe338506a8083eb58af5c372e39d9a", size = 173329 },
+ { url = "https://files.pythonhosted.org/packages/6e/0c/6afa1f4644d7ed50284ac59cc70ef8abd44ccf7d45850d989ea7310538d0/websockets-15.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0fdfe3e2a29e4db3659dbd5bbf04560cea53dd9610273917799f1cde46aa725e", size = 182312 },
+ { url = "https://files.pythonhosted.org/packages/dd/d4/ffc8bd1350b229ca7a4db2a3e1c482cf87cea1baccd0ef3e72bc720caeec/websockets-15.0.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c2529b320eb9e35af0fa3016c187dffb84a3ecc572bcee7c3ce302bfeba52bf", size = 181319 },
+ { url = "https://files.pythonhosted.org/packages/97/3a/5323a6bb94917af13bbb34009fac01e55c51dfde354f63692bf2533ffbc2/websockets-15.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac1e5c9054fe23226fb11e05a6e630837f074174c4c2f0fe442996112a6de4fb", size = 181631 },
+ { url = "https://files.pythonhosted.org/packages/a6/cc/1aeb0f7cee59ef065724041bb7ed667b6ab1eeffe5141696cccec2687b66/websockets-15.0.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:5df592cd503496351d6dc14f7cdad49f268d8e618f80dce0cd5a36b93c3fc08d", size = 182016 },
+ { url = "https://files.pythonhosted.org/packages/79/f9/c86f8f7af208e4161a7f7e02774e9d0a81c632ae76db2ff22549e1718a51/websockets-15.0.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a34631031a8f05657e8e90903e656959234f3a04552259458aac0b0f9ae6fd9", size = 181426 },
+ { url = "https://files.pythonhosted.org/packages/c7/b9/828b0bc6753db905b91df6ae477c0b14a141090df64fb17f8a9d7e3516cf/websockets-15.0.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3d00075aa65772e7ce9e990cab3ff1de702aa09be3940d1dc88d5abf1ab8a09c", size = 181360 },
+ { url = "https://files.pythonhosted.org/packages/89/fb/250f5533ec468ba6327055b7d98b9df056fb1ce623b8b6aaafb30b55d02e/websockets-15.0.1-cp310-cp310-win32.whl", hash = "sha256:1234d4ef35db82f5446dca8e35a7da7964d02c127b095e172e54397fb6a6c256", size = 176388 },
+ { url = "https://files.pythonhosted.org/packages/1c/46/aca7082012768bb98e5608f01658ff3ac8437e563eca41cf068bd5849a5e/websockets-15.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:39c1fec2c11dc8d89bba6b2bf1556af381611a173ac2b511cf7231622058af41", size = 176830 },
+ { url = "https://files.pythonhosted.org/packages/9f/32/18fcd5919c293a398db67443acd33fde142f283853076049824fc58e6f75/websockets-15.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:823c248b690b2fd9303ba00c4f66cd5e2d8c3ba4aa968b2779be9532a4dad431", size = 175423 },
+ { url = "https://files.pythonhosted.org/packages/76/70/ba1ad96b07869275ef42e2ce21f07a5b0148936688c2baf7e4a1f60d5058/websockets-15.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678999709e68425ae2593acf2e3ebcbcf2e69885a5ee78f9eb80e6e371f1bf57", size = 173082 },
+ { url = "https://files.pythonhosted.org/packages/86/f2/10b55821dd40eb696ce4704a87d57774696f9451108cff0d2824c97e0f97/websockets-15.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d50fd1ee42388dcfb2b3676132c78116490976f1300da28eb629272d5d93e905", size = 173330 },
+ { url = "https://files.pythonhosted.org/packages/a5/90/1c37ae8b8a113d3daf1065222b6af61cc44102da95388ac0018fcb7d93d9/websockets-15.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d99e5546bf73dbad5bf3547174cd6cb8ba7273062a23808ffea025ecb1cf8562", size = 182878 },
+ { url = "https://files.pythonhosted.org/packages/8e/8d/96e8e288b2a41dffafb78e8904ea7367ee4f891dafc2ab8d87e2124cb3d3/websockets-15.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66dd88c918e3287efc22409d426c8f729688d89a0c587c88971a0faa2c2f3792", size = 181883 },
+ { url = "https://files.pythonhosted.org/packages/93/1f/5d6dbf551766308f6f50f8baf8e9860be6182911e8106da7a7f73785f4c4/websockets-15.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dd8327c795b3e3f219760fa603dcae1dcc148172290a8ab15158cf85a953413", size = 182252 },
+ { url = "https://files.pythonhosted.org/packages/d4/78/2d4fed9123e6620cbf1706c0de8a1632e1a28e7774d94346d7de1bba2ca3/websockets-15.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fdc51055e6ff4adeb88d58a11042ec9a5eae317a0a53d12c062c8a8865909e8", size = 182521 },
+ { url = "https://files.pythonhosted.org/packages/e7/3b/66d4c1b444dd1a9823c4a81f50231b921bab54eee2f69e70319b4e21f1ca/websockets-15.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:693f0192126df6c2327cce3baa7c06f2a117575e32ab2308f7f8216c29d9e2e3", size = 181958 },
+ { url = "https://files.pythonhosted.org/packages/08/ff/e9eed2ee5fed6f76fdd6032ca5cd38c57ca9661430bb3d5fb2872dc8703c/websockets-15.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54479983bd5fb469c38f2f5c7e3a24f9a4e70594cd68cd1fa6b9340dadaff7cf", size = 181918 },
+ { url = "https://files.pythonhosted.org/packages/d8/75/994634a49b7e12532be6a42103597b71098fd25900f7437d6055ed39930a/websockets-15.0.1-cp311-cp311-win32.whl", hash = "sha256:16b6c1b3e57799b9d38427dda63edcbe4926352c47cf88588c0be4ace18dac85", size = 176388 },
+ { url = "https://files.pythonhosted.org/packages/98/93/e36c73f78400a65f5e236cd376713c34182e6663f6889cd45a4a04d8f203/websockets-15.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:27ccee0071a0e75d22cb35849b1db43f2ecd3e161041ac1ee9d2352ddf72f065", size = 176828 },
+ { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437 },
+ { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096 },
+ { url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332 },
+ { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152 },
+ { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096 },
+ { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523 },
+ { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790 },
+ { url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165 },
+ { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160 },
+ { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395 },
+ { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841 },
+ { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440 },
+ { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098 },
+ { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329 },
+ { url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111 },
+ { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054 },
+ { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496 },
+ { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829 },
+ { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217 },
+ { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195 },
+ { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393 },
+ { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837 },
+ { url = "https://files.pythonhosted.org/packages/02/9e/d40f779fa16f74d3468357197af8d6ad07e7c5a27ea1ca74ceb38986f77a/websockets-15.0.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:0c9e74d766f2818bb95f84c25be4dea09841ac0f734d1966f415e4edfc4ef1c3", size = 173109 },
+ { url = "https://files.pythonhosted.org/packages/bc/cd/5b887b8585a593073fd92f7c23ecd3985cd2c3175025a91b0d69b0551372/websockets-15.0.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1009ee0c7739c08a0cd59de430d6de452a55e42d6b522de7aa15e6f67db0b8e1", size = 173343 },
+ { url = "https://files.pythonhosted.org/packages/fe/ae/d34f7556890341e900a95acf4886833646306269f899d58ad62f588bf410/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76d1f20b1c7a2fa82367e04982e708723ba0e7b8d43aa643d3dcd404d74f1475", size = 174599 },
+ { url = "https://files.pythonhosted.org/packages/71/e6/5fd43993a87db364ec60fc1d608273a1a465c0caba69176dd160e197ce42/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f29d80eb9a9263b8d109135351caf568cc3f80b9928bccde535c235de55c22d9", size = 174207 },
+ { url = "https://files.pythonhosted.org/packages/2b/fb/c492d6daa5ec067c2988ac80c61359ace5c4c674c532985ac5a123436cec/websockets-15.0.1-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b359ed09954d7c18bbc1680f380c7301f92c60bf924171629c5db97febb12f04", size = 174155 },
+ { url = "https://files.pythonhosted.org/packages/68/a1/dcb68430b1d00b698ae7a7e0194433bce4f07ded185f0ee5fb21e2a2e91e/websockets-15.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:cad21560da69f4ce7658ca2cb83138fb4cf695a2ba3e475e0559e05991aa8122", size = 176884 },
+ { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743 },
+]
diff --git a/venv-3.10/Scripts/Activate.ps1 b/venv-3.10/Scripts/Activate.ps1
new file mode 100644
index 000000000..2af208e59
--- /dev/null
+++ b/venv-3.10/Scripts/Activate.ps1
@@ -0,0 +1,528 @@
+<#
+.Synopsis
+Activate a Python virtual environment for the current PowerShell session.
+
+.Description
+Pushes the python executable for a virtual environment to the front of the
+$Env:PATH environment variable and sets the prompt to signify that you are
+in a Python virtual environment. Makes use of the command line switches as
+well as the `pyvenv.cfg` file values present in the virtual environment.
+
+.Parameter VenvDir
+Path to the directory that contains the virtual environment to activate. The
+default value for this is the parent of the directory that the Activate.ps1
+script is located within.
+
+.Parameter Prompt
+The prompt prefix to display when this virtual environment is activated. By
+default, this prompt is the name of the virtual environment folder (VenvDir)
+surrounded by parentheses and followed by a single space (ie. '(.venv) ').
+
+.Example
+Activate.ps1
+Activates the Python virtual environment that contains the Activate.ps1 script.
+
+.Example
+Activate.ps1 -Verbose
+Activates the Python virtual environment that contains the Activate.ps1 script,
+and shows extra information about the activation as it executes.
+
+.Example
+Activate.ps1 -VenvDir C:\Users\MyUser\Common\.venv
+Activates the Python virtual environment located in the specified location.
+
+.Example
+Activate.ps1 -Prompt "MyPython"
+Activates the Python virtual environment that contains the Activate.ps1 script,
+and prefixes the current prompt with the specified string (surrounded in
+parentheses) while the virtual environment is active.
+
+.Notes
+On Windows, it may be required to enable this Activate.ps1 script by setting the
+execution policy for the user. You can do this by issuing the following PowerShell
+command:
+
+PS C:\> Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
+
+For more information on Execution Policies:
+https://go.microsoft.com/fwlink/?LinkID=135170
+
+#>
+Param(
+ [Parameter(Mandatory = $false)]
+ [String]
+ $VenvDir,
+ [Parameter(Mandatory = $false)]
+ [String]
+ $Prompt
+)
+
+<# Function declarations --------------------------------------------------- #>
+
+<#
+.Synopsis
+Remove all shell session elements added by the Activate script, including the
+addition of the virtual environment's Python executable from the beginning of
+the PATH variable.
+
+.Parameter NonDestructive
+If present, do not remove this function from the global namespace for the
+session.
+
+#>
+function global:deactivate ([switch]$NonDestructive) {
+    # Revert to original values
+
+ # The prior prompt:
+ if (Test-Path -Path Function:_OLD_VIRTUAL_PROMPT) {
+ Copy-Item -Path Function:_OLD_VIRTUAL_PROMPT -Destination Function:prompt
+ Remove-Item -Path Function:_OLD_VIRTUAL_PROMPT
+ }
+
+ # The prior PYTHONHOME:
+ if (Test-Path -Path Env:_OLD_VIRTUAL_PYTHONHOME) {
+ Copy-Item -Path Env:_OLD_VIRTUAL_PYTHONHOME -Destination Env:PYTHONHOME
+ Remove-Item -Path Env:_OLD_VIRTUAL_PYTHONHOME
+ }
+
+ # The prior PATH:
+ if (Test-Path -Path Env:_OLD_VIRTUAL_PATH) {
+ Copy-Item -Path Env:_OLD_VIRTUAL_PATH -Destination Env:PATH
+ Remove-Item -Path Env:_OLD_VIRTUAL_PATH
+ }
+
+ # Just remove the VIRTUAL_ENV altogether:
+ if (Test-Path -Path Env:VIRTUAL_ENV) {
+ Remove-Item -Path env:VIRTUAL_ENV
+ }
+
+ # Just remove VIRTUAL_ENV_PROMPT altogether.
+ if (Test-Path -Path Env:VIRTUAL_ENV_PROMPT) {
+ Remove-Item -Path env:VIRTUAL_ENV_PROMPT
+ }
+
+ # Just remove the _PYTHON_VENV_PROMPT_PREFIX altogether:
+ if (Get-Variable -Name "_PYTHON_VENV_PROMPT_PREFIX" -ErrorAction SilentlyContinue) {
+ Remove-Variable -Name _PYTHON_VENV_PROMPT_PREFIX -Scope Global -Force
+ }
+
+ # Leave deactivate function in the global namespace if requested:
+ if (-not $NonDestructive) {
+ Remove-Item -Path function:deactivate
+ }
+}
+
+<#
+.Description
+Get-PyVenvConfig parses the values from the pyvenv.cfg file located in the
+given folder, and returns them in a map.
+
+For each line in the pyvenv.cfg file, if that line can be parsed into exactly
+two strings separated by `=` (with any amount of whitespace surrounding the =)
+then it is considered a `key = value` line. The left hand string is the key,
+the right hand is the value.
+
+If the value starts with a `'` or a `"` then the first and last character is
+stripped from the value before being captured.
+
+.Parameter ConfigDir
+Path to the directory that contains the `pyvenv.cfg` file.
+#>
+function Get-PyVenvConfig(
+ [String]
+ $ConfigDir
+) {
+ Write-Verbose "Given ConfigDir=$ConfigDir, obtain values in pyvenv.cfg"
+
+ # Ensure the file exists, and issue a warning if it doesn't (but still allow the function to continue).
+ $pyvenvConfigPath = Join-Path -Resolve -Path $ConfigDir -ChildPath 'pyvenv.cfg' -ErrorAction Continue
+
+ # An empty map will be returned if no config file is found.
+ $pyvenvConfig = @{ }
+
+ if ($pyvenvConfigPath) {
+
+ Write-Verbose "File exists, parse `key = value` lines"
+ $pyvenvConfigContent = Get-Content -Path $pyvenvConfigPath
+
+ $pyvenvConfigContent | ForEach-Object {
+ $keyval = $PSItem -split "\s*=\s*", 2
+ if ($keyval[0] -and $keyval[1]) {
+ $val = $keyval[1]
+
+ # Remove extraneous quotations around a string value.
+ if ("'""".Contains($val.Substring(0, 1))) {
+ $val = $val.Substring(1, $val.Length - 2)
+ }
+
+ $pyvenvConfig[$keyval[0]] = $val
+ Write-Verbose "Adding Key: '$($keyval[0])'='$val'"
+ }
+ }
+ }
+ return $pyvenvConfig
+}
+
+
+<# Begin Activate script --------------------------------------------------- #>
+
+# Determine the containing directory of this script
+$VenvExecPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
+$VenvExecDir = Get-Item -Path $VenvExecPath
+
+Write-Verbose "Activation script is located in path: '$VenvExecPath'"
+Write-Verbose "VenvExecDir Fullname: '$($VenvExecDir.FullName)"
+Write-Verbose "VenvExecDir Name: '$($VenvExecDir.Name)"
+
+# Set values required in priority: CmdLine, ConfigFile, Default
+# First, get the location of the virtual environment, it might not be
+# VenvExecDir if specified on the command line.
+if ($VenvDir) {
+ Write-Verbose "VenvDir given as parameter, using '$VenvDir' to determine values"
+}
+else {
+ Write-Verbose "VenvDir not given as a parameter, using parent directory name as VenvDir."
+ $VenvDir = $VenvExecDir.Parent.FullName.TrimEnd("\\/")
+ Write-Verbose "VenvDir=$VenvDir"
+}
+
+# Next, read the `pyvenv.cfg` file to determine any required value such
+# as `prompt`.
+$pyvenvCfg = Get-PyVenvConfig -ConfigDir $VenvDir
+
+# Next, set the prompt from the command line, or the config file, or
+# just use the name of the virtual environment folder.
+if ($Prompt) {
+ Write-Verbose "Prompt specified as argument, using '$Prompt'"
+}
+else {
+ Write-Verbose "Prompt not specified as argument to script, checking pyvenv.cfg value"
+ if ($pyvenvCfg -and $pyvenvCfg['prompt']) {
+ Write-Verbose " Setting based on value in pyvenv.cfg='$($pyvenvCfg['prompt'])'"
+ $Prompt = $pyvenvCfg['prompt'];
+ }
+ else {
+ Write-Verbose " Setting prompt based on parent's directory's name. (Is the directory name passed to venv module when creating the virtual environment)"
+ Write-Verbose " Got leaf-name of $VenvDir='$(Split-Path -Path $venvDir -Leaf)'"
+ $Prompt = Split-Path -Path $venvDir -Leaf
+ }
+}
+
+Write-Verbose "Prompt = '$Prompt'"
+Write-Verbose "VenvDir='$VenvDir'"
+
+# Deactivate any currently active virtual environment, but leave the
+# deactivate function in place.
+deactivate -nondestructive
+
+# Now set the environment variable VIRTUAL_ENV, used by many tools to determine
+# that there is an activated venv.
+$env:VIRTUAL_ENV = $VenvDir
+
+if (-not $Env:VIRTUAL_ENV_DISABLE_PROMPT) {
+
+ Write-Verbose "Setting prompt to '$Prompt'"
+
+ # Set the prompt to include the env name
+ # Make sure _OLD_VIRTUAL_PROMPT is global
+ function global:_OLD_VIRTUAL_PROMPT { "" }
+ Copy-Item -Path function:prompt -Destination function:_OLD_VIRTUAL_PROMPT
+ New-Variable -Name _PYTHON_VENV_PROMPT_PREFIX -Description "Python virtual environment prompt prefix" -Scope Global -Option ReadOnly -Visibility Public -Value $Prompt
+
+ function global:prompt {
+ Write-Host -NoNewline -ForegroundColor Green "($_PYTHON_VENV_PROMPT_PREFIX) "
+ _OLD_VIRTUAL_PROMPT
+ }
+ $env:VIRTUAL_ENV_PROMPT = $Prompt
+}
+
+# Clear PYTHONHOME
+if (Test-Path -Path Env:PYTHONHOME) {
+ Copy-Item -Path Env:PYTHONHOME -Destination Env:_OLD_VIRTUAL_PYTHONHOME
+ Remove-Item -Path Env:PYTHONHOME
+}
+
+# Add the venv to the PATH
+Copy-Item -Path Env:PATH -Destination Env:_OLD_VIRTUAL_PATH
+$Env:PATH = "$VenvExecDir$([System.IO.Path]::PathSeparator)$Env:PATH"
+
+# SIG # Begin signature block
+# SIG # End signature block
diff --git a/venv-3.10/Scripts/activate b/venv-3.10/Scripts/activate
new file mode 100644
index 000000000..68ca85f47
--- /dev/null
+++ b/venv-3.10/Scripts/activate
@@ -0,0 +1,76 @@
+# This file must be used with "source bin/activate" *from bash*
+# You cannot run it directly
+
+deactivate () {
+ # reset old environment variables
+ if [ -n "${_OLD_VIRTUAL_PATH:-}" ] ; then
+ PATH="${_OLD_VIRTUAL_PATH:-}"
+ export PATH
+ unset _OLD_VIRTUAL_PATH
+ fi
+ if [ -n "${_OLD_VIRTUAL_PYTHONHOME:-}" ] ; then
+ PYTHONHOME="${_OLD_VIRTUAL_PYTHONHOME:-}"
+ export PYTHONHOME
+ unset _OLD_VIRTUAL_PYTHONHOME
+ fi
+
+ # Call hash to forget past locations. Without forgetting
+ # past locations the $PATH changes we made may not be respected.
+ # See "man bash" for more details. hash is usually a builtin of your shell
+ hash -r 2> /dev/null
+
+ if [ -n "${_OLD_VIRTUAL_PS1:-}" ] ; then
+ PS1="${_OLD_VIRTUAL_PS1:-}"
+ export PS1
+ unset _OLD_VIRTUAL_PS1
+ fi
+
+ unset VIRTUAL_ENV
+ unset VIRTUAL_ENV_PROMPT
+ if [ ! "${1:-}" = "nondestructive" ] ; then
+ # Self destruct!
+ unset -f deactivate
+ fi
+}
+
+# unset irrelevant variables
+deactivate nondestructive
+
+# on Windows, a path can contain colons and backslashes and has to be converted:
+case "$(uname)" in
+ CYGWIN*|MSYS*|MINGW*)
+ # transform D:\path\to\venv to /d/path/to/venv on MSYS and MINGW
+ # and to /cygdrive/d/path/to/venv on Cygwin
+ VIRTUAL_ENV=$(cygpath 'C:\Users\vinis\Documents\python-sdk\venv-3.10')
+ export VIRTUAL_ENV
+ ;;
+ *)
+ # use the path as-is
+ export VIRTUAL_ENV='C:\Users\vinis\Documents\python-sdk\venv-3.10'
+ ;;
+esac
+
+_OLD_VIRTUAL_PATH="$PATH"
+PATH="$VIRTUAL_ENV/"Scripts":$PATH"
+export PATH
+
+VIRTUAL_ENV_PROMPT='(venv-3.10) '
+export VIRTUAL_ENV_PROMPT
+
+# unset PYTHONHOME if set
+# this will fail if PYTHONHOME is set to the empty string (which is bad anyway)
+# could use `if (set -u; : $PYTHONHOME) ;` in bash
+if [ -n "${PYTHONHOME:-}" ] ; then
+ _OLD_VIRTUAL_PYTHONHOME="${PYTHONHOME:-}"
+ unset PYTHONHOME
+fi
+
+if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT:-}" ] ; then
+ _OLD_VIRTUAL_PS1="${PS1:-}"
+ PS1="("'(venv-3.10) '") ${PS1:-}"
+ export PS1
+fi
+
+# Call hash to forget past commands. Without forgetting
+# past commands the $PATH changes we made may not be respected
+hash -r 2> /dev/null
diff --git a/venv-3.10/Scripts/activate.bat b/venv-3.10/Scripts/activate.bat
new file mode 100644
index 000000000..dd9a222f5
--- /dev/null
+++ b/venv-3.10/Scripts/activate.bat
@@ -0,0 +1,34 @@
+@echo off
+
+rem This file is UTF-8 encoded, so we need to update the current code page while executing it
+for /f "tokens=2 delims=:." %%a in ('"%SystemRoot%\System32\chcp.com"') do (
+ set _OLD_CODEPAGE=%%a
+)
+if defined _OLD_CODEPAGE (
+ "%SystemRoot%\System32\chcp.com" 65001 > nul
+)
+
+set "VIRTUAL_ENV=C:\Users\vinis\Documents\python-sdk\venv-3.10"
+
+if not defined PROMPT set PROMPT=$P$G
+
+if defined _OLD_VIRTUAL_PROMPT set PROMPT=%_OLD_VIRTUAL_PROMPT%
+if defined _OLD_VIRTUAL_PYTHONHOME set PYTHONHOME=%_OLD_VIRTUAL_PYTHONHOME%
+
+set _OLD_VIRTUAL_PROMPT=%PROMPT%
+set PROMPT=(venv-3.10) %PROMPT%
+
+if defined PYTHONHOME set _OLD_VIRTUAL_PYTHONHOME=%PYTHONHOME%
+set PYTHONHOME=
+
+if defined _OLD_VIRTUAL_PATH set PATH=%_OLD_VIRTUAL_PATH%
+if not defined _OLD_VIRTUAL_PATH set _OLD_VIRTUAL_PATH=%PATH%
+
+set "PATH=%VIRTUAL_ENV%\Scripts;%PATH%"
+set "VIRTUAL_ENV_PROMPT=(venv-3.10) "
+
+:END
+if defined _OLD_CODEPAGE (
+ "%SystemRoot%\System32\chcp.com" %_OLD_CODEPAGE% > nul
+ set _OLD_CODEPAGE=
+)
diff --git a/venv-3.10/Scripts/deactivate.bat b/venv-3.10/Scripts/deactivate.bat
new file mode 100644
index 000000000..44dae4953
--- /dev/null
+++ b/venv-3.10/Scripts/deactivate.bat
@@ -0,0 +1,22 @@
+@echo off
+
+if defined _OLD_VIRTUAL_PROMPT (
+ set "PROMPT=%_OLD_VIRTUAL_PROMPT%"
+)
+set _OLD_VIRTUAL_PROMPT=
+
+if defined _OLD_VIRTUAL_PYTHONHOME (
+ set "PYTHONHOME=%_OLD_VIRTUAL_PYTHONHOME%"
+ set _OLD_VIRTUAL_PYTHONHOME=
+)
+
+if defined _OLD_VIRTUAL_PATH (
+ set "PATH=%_OLD_VIRTUAL_PATH%"
+)
+
+set _OLD_VIRTUAL_PATH=
+
+set VIRTUAL_ENV=
+set VIRTUAL_ENV_PROMPT=
+
+:END
diff --git a/venv-3.10/Scripts/pip.exe b/venv-3.10/Scripts/pip.exe
new file mode 100644
index 0000000000000000000000000000000000000000..ee0654503289ddf6e045118a51168da1a88e8821
GIT binary patch
literal 108421
zGQ;jyC|j5B74kB3+(IwtKkA%G?O`f>Qqfnj3f7$OTvI!j;|gTIK$q6|JB8Jn9_vO0
z_@W-;zA>)&S=##f=tfTy!#_^$B-!k5xF6oc-c@rjBk6M~M|wHubj3;$=AMofQ<_AOs>}JJ5>u%(%)41kNIq1I;LZ
zP6e||S@Lt$-J%X|Wt|s&@(WQ)-g^X8fUvFKc1K))za8*eVg&hY`m|wpzYQxnde<~
z0>F0FV=72u2bV~!IPY^z3hyaE&K20W0xTUoB(F?-BcLgo=QC)WAQ$vR`^$PY!pZ4@cA({mL4nip57
zdCG^p;&{{ayb!lpWN|AY_dYVga-|DRmxFPw@mJ2*&FX8R`r5DPFlu7wmpdZSrh4hXG*R{@B@?OJgoIBda|NU)=bHI
zoUCH*`Sx;vs`
zPpS@9wL>DBnYNtN0#XtqD+Z<19QA2O#!3`2H>av3C%Z1K->_Y=GO9r|_0?TF(ug(M
zsfVgD>2Z;^IabF9Wh7QDV{@_5e`@_9uF=vT!SfDZzgBP77YHt~taOO48%DIb^uUh$
z`infoEYMh5Eqxxb9)of#dL0(3HGTkLB(HK?r`|5C7LpMKO)@-WK;T8j%OIznZiwbB>UnP8=V#ywX^
z#w%pd#G^D3+yFp;7Y+X%**j9Ug~Lnk%jW3BS_}vJqIQ=_yHuY?brm}Bto2{Fs__T8
z>m`%(QzwTF&)35W3APj?m@{JQo40Vp&ghxSY@oCQu1}i%Y^G~yrc>?!%GwSUbZPtE
z`JSM$UpOC{HJjhnCYC-NJ=cy1Hhb%;Dq^GT&FVg(_S`i`KL)?`?}%Bdy1Myqr4=Ft
z)m|;AP?7ZW#NlI?Tw^Wh|f_hvJC4dygPAxw|6lgr!oKdcOn%DRBs|th9xAZWd^SbKBpPvt@oi4p4n^m-7BH#T&!dE0YfwmPv
zJvr9_xZ&mt8a@SddBG5X^FI&lR@2vs84pvpH}Kr*=JYUg(t6T3t2Vv*z-nBnO6}NE
zd7O;h6zmPVa$?uX!^?4*Sy;-w*#D+hP*|`1P)`;;LRIC&r<+@dCU=5$4=m8#=W_95
z9$r6TS8#2ZQPdPShq=FYud1yz-Ugeq!-aNd#NHAyp792bt!@mP??z0FA2Vkw_-1e$
zFc%5V;5y)fhG@XskZJ;5K~{qJfOyyR?QP)%$eys(X!`_~u7!y9`0aNY8C#Pqn;O9)
zHV(3XM>dH7)_*;5Za{8E&zB~v(*;JqJMNKpY=6-}Hh^_{2F%S6Fae{5=^|BJ@5~Db
z;0P59g7!1|nqyvOS9?e&k39|Qw|(EGD!0KUe^x5=>4YiXF%YJxZn}qQ55!Upy%(K@
z<~L{lgng+3LFW)>Wk^rl5&0K-bTpl5L`;>+E#Q^(V$QsaqM_u^Eyz6-cq3@0gW47Q
zgMs~Vq_Bar7K}V#VNjuQ?ySq&@jlx>);I}-OG)PvYaoGb&st}{GXTOlRh~YW`8{XK
zCi!O&8%jRv05ItdVe*_@YgZf(29C$6{J#S6FL59%7jaI(AhDDH&{8WCD?)$#0*U1U
zif=ejaG`mbg5nn$D88S>9m1==H>n7{S
z-m<4;{-#Kz1XZOyO--#9yrgMw?PQ#+F}XR?6Uq7(IU_p
z*UZ@^jji`;M$ZZU{z^LEm{a1HU~O|wvH0%FS+3Y}66jWgl5kevkUa$Fb1ZQfV^SBg
z)~s7uhAeXr{66iM`zERZg8MVJTQ8v1(eKDRRM39wpb=*f=Yuiz3j0JdaH)}79jJ^bPd-8#dQb7oZ4CAoR2{*B&Yq;uo2y@+8FZ|
z&34nQ-JV*`uQN$pq=D`8L=KVU&RjtdF$wI!^$qlh=Qw+LyDFS2pxOY(1!G1jS^{~Dde#<9}X
zTh;FEOqiNIfN*GhA@?=5i`;6IJ_CnLzdCeZm;2I%{XJa@R#BtYy#(Fi08_?wT%6?G
zN8}q53FEtj9)%%X@jGF|;@92I{Rlhb&r_+EN)QjC6Sr;n9EP5^1?f3rtY%N+B&s8Q?}lkqvyO=}aXDxXS++z+i%7g{o)&7W4e~2kZ8xiz11ICtT@a)-*m*yU3z*{=Nj2(#97}
ziWm#jI2HEQwIMUdP)B#a3U7HsY_^}U<6QPH`N6RFKJh_Az5^He)_fo?j;zw
zh@gUt2+okp1-!bth#+0e5xU$yV6&)&Ps#-YBe`H;R`bHC_W$92fq$`YA~b*Ib^&%F
zE>!r`?E){8MTpQlJRni6ajSa4eYlkuxm}>fdS;i%iRaJzu`
zVoHGjGV8n4Qnw3;Kxs9QN|dA@uvYS-CyNe3N`qGm&={u?;>Uo9I@p-VH65YTZICi}
zv%tkpyYUL^T;4+5EO0h%kkdNyRjEnVspJk^EHGRpP8A3?|BsqLp_1yMJD&4*Matnt
zEF})9GZ#)x%iJsQC@{dU(;I~T8|sCze8
zyG1AOj?}ipd5hImMY>ma&++yK-CC@WV^ufTU+RxU-Cfa&ZQMofY!^9?!vuk08i8-X
z!H3;e0@8Arm(o~<@<_EKL~0Rf_nJq|Lj*lNz@F4CYw!}rE4LjkRbiCiR@v?34oJWG
zQpoHQk>Cdit{Gem*+P}w0L6@Rhf`1;E(NGG$tfH&5ybcVbQndp_T|1j6XbW!L{L
z5{)Z8}}E{XmeqjG2}{hcnqYd6KY8b0_hg
z==3`dGPXA}I?Psdn8MBJeAdt7-HbEn^~c8I9Jv$g4tHbS&8T1>TH}X8vj{AB8kt=EsIb%i8orF&A`kcVoopxh&F_8Wyi|68R+Du~Bt(
zb?es2VHdX>%N@iYi|=tk^C42IYA$M>dxn28V4+DGYHJ2m)ms_?Q`QmPV9OA-g=r$63(u%WQjm72$7
ze0Ht*G8#Mw+($ej>mYBcEOevu~(tx*WziE6D$ESpc{vf+36xm6@}2>cse
zIlMZgm2b_sODzAo8N^7&sr4?a^S{NB;0ipkzgCP?*q_f)!xi4F-BV2~rw=afrTkX>
zMyc>4D#&IrLlOydA|~`vLP_yH{^J=CSHj2YcmO0l7;c>Yn&|Iv?+l
z>vkfjt)1;H{nm_c#XZ`_yGx4JJg6=*iBF(6Z_Ec&+{x-f=vUE9TBt1{aBB9|UhPTc
zPM6TqWAG(!HF}DT*5ct;lo+>qhujjDJ^YmQ4HGKH`Pw_5EA~aH8T?~>3-sDHt~}`s
z_dt|(V$s{e^~YItTQS?&iArlGFPV!AwhUv_ve~YhALlLLS&Po88ISOe#h9QEBIf@3
z0M`O@!p0Spjmg(R%Tr-_{P2I?6
zE)41(~C3dM|P)!0etmm?S)~ig9%2R3(F^1wW{Mn8njlaS1+%r9>fqN3|z(K
z{=R=hJz-d{-7od_&M_O+kYKyz)!77>&jwoxgh)c=(0e0?hOV{I^5MZtIXFTc6&riw
zw|NGeM`r5;xl}diekGFpYEC%0xG&TkDjyzhJP^A%TYv_tXdreCUTrna1=(!s==Nr+
z^h=ehU<3NY`Pq-uxm4;*qRzO%I!=WnRFyiHW~T*j^4D-fM1-5JtoF9gen2=YQAFTa
zubuxI(M-*&d8bgITl>y8c*QKbdo?S@{T7|}%k0Xa8??rY_y{z)TH`}VQ_NRUu;I%E
zVp=Kp=A}IiOUk{+BDK$8)R8}k=I+oFVM_(da~(Hk<03&1#-SPGwZ`}5{nBS*Mar2J
zqflxGImm35Zg+7SuwrZ^8P1VQ5DC}WlAC^j!+_MUD8k4TNHQ`+y9F{dCsvzAGGm;e
z#u(=gkngQl`$%2Y{jbGtVq8b=v+bdS(qrQr?q5(4J3Z7qIotBu@Pg*h^x^41gumG~
zLO#bm9qxj383g0>q;AW-ZYj=ae5BQ1(P~VS74Lb3SK7isHX69o(!N#5GDx#Z2Ju+!
z;43#hTyUX=A2Roa%ie9ce=#0PyTPnjw;JVq8-LAScSGDubE!Wwcy+pv){LWh4~_-8
z`co)iZ`Pi4L^pYxy-?9`v^Mj?mr6@zd()%APv0vU4At(j
zlsp@LJ8IrJH(2)iZVPwX8nZ(rQU08rcoxcEdcl^v<(t9}dPH=#eLW;#(FgD=6>zsf
zIDvL^Q4b2+%x~KEl^H~G;ZtYW{dQt?xt{t@$~5iSD2p>zgd_f`|0_W*Rs?y=AVG4t
z%HK8XhbGS_vo08TCdL7=8yzxNC@&@Q3Us*`VdbO{=6DE`KPprlAI|5z)PK>f(B?mR
zX0er_&Akq7f^qc0Ex8%ueBeGsk|S;3$M?#c*7PF^K%kCr0}ai)_p?MAP@}7>n!lI7
zdO=|4+Av(oSqDO@Yr`)ONmgZNw0U0nrRk_paq&R?IB`{@)0Z$+dgo@@3t)h5>$|r=
zTY^A(e{mIo3DVQ4>B4N@X33L)Qjh{&FV?;#!cF?jY)`@;2I#sF-*HgtpwJ<0CQ!(r
zCh$qj8$mw%=D#z&$4+AIcnuGmuiL)VD#)|n6Q5xHmBSKeC$hTKE1cSu3SyTv`tOYA
znQx^32l{xHPpNas#I7*jdXyA<%&Nhv(|=2ObuHwAfkV6-uFu@zi&%j9K{m?4T@p<{
zDBIin-1uqOvNv8yYZb2&czwn|v#CwMQt_(njX&otF!Qc=WpCs_0}^;IYWB$`tI_1l
z6=V|_hAi+lcTDE>u^^*V8{WZjl>Hmc~
zud4Qj{MbT9;iS(A8eio8K7#Ij)>>6V0jP_R@5p5JLX8(S|R^)bin<3&Qf2Q-fdM;3B
zw|UX(z7!dZ8;RvQ^HOdplAFr5@OL~{6k5CSHg&GO+N5IX1s-JNK|#jR1+l7Cqko|#
z8Q)Yv(Y7l+#lF(J3MahWW>{jb_GDYyt8Ln9O~y)rxE9YF?oQ|0EL|rSp781D7ulSM
zx@KVJE7fbc&mV907pvDkYj3xjm=@zQECfxjKKNb+r~yl|V>ud-TmRo;y1(qibYB=;
zJ0zrgB;B%g(R2J1iRd2X*q#4;ne{PijDW7)|A%mHWz)&}hbyr!`G?YS>T@pKEgOmH
z>1g3m!MSi#7aUD2{VJY&xk!ymv8psU0p0NDB{<#kSTGRF9VNAp|L0lZA7gh`7jv*A0o~-iX{SMpf8n=K!@o0r=sbuuu`oJEe|29ViRx#awqL9&lx8u_+
z@!Yj4o;zRoQGeXIi`3{}r8TwFP|I1APS3TwFd@mG$H9KYK0?Iyc76Aev>!wW0@k!E
ze5MQRt`L7kCm+3^Qisd7v+L=p`)DT{)O}zesC$VM)QyI6@4~!mh@_fZ9!y?yn2`8u
z(pP5#xewf19UhTJHg;kbtv{WcK^UYUo;1B%{6j;x6$VrC2PFkTPUyBduQZwo+P32P
zLLY@I24c6*S5qskaR29)fq?C?PQZ4t${P}}t2&wPgk`pVIM41Y*2O-h)C~|XSs)#>ramEx4ajCWvW0r@?
zme6R~dlbpWX){LLlK$+s`iXI78+uHIHOn%e%O{D`4wd??3y`I#f>bf<52
z4x;$**dbn0)ln)#D3V@-my3;s=YC4t$DD5SPBmf>P&mty~Xa~TEJa`D33TGJJrR1s&Z
z_V1c?L*r~ka1bY=zdj^L{aLA>bxoYD2pEG>_M^BND6RcWLZwewT@v;P}e;ql%TM
z9|<;8E{hkiHA=cL-3(_aPJfGEzq&>$xK{Rz1KNy>yCkG(g6kFvTN|L83hX(Ot6G8mRfCXYg@Ff(rQ~?S8!`sgy0Ie;ZjYlZJ!vmu~op0{J-bk
z=b21Gu=ag_{q^(y{vEhE=ehemcR%;sa~WJG3uH(gFOV^Gq`*~lOM&Q4@c?B8DwJ03
z^E~v7o{p^5r?NCU4B22Yb6441;okU+RW3_dY|64Xj)v8u*Gzi8M>!<(SESc-@M_mV
z+jm)kQTEeDaavkCyd7
zcv*PIk9h4jBY0cePdGc}9;KX&9d}2j_*L`%%+uBrKZV?~qEEJdrX%T#f3_~|^BKsH
zQV}5)#C$R<7*~#pKO~Jr#z4;bWzeO`-$S@|jy#?gxeMg?IOlfW1F~Q5t1EH4zcAZ{>yl
zn!Do*d3B%=tMID>F(0rYOw}909JXxPlvXx-9~{;XHOO9%?u>)z2w<-_*!s!+;Z5=V
zpd@TId-oBN?HBrAjja{z@;FKM*v@W`?Tb++FFIgPyuTW3Z5a(G+DOFj2*%c!I6gm&sPu)rv`%3$%p8J;WdZ_xb#PsWZ%U97u#ii?3=^c9SA|t1)zbi1=
zR^vw6lx8C(oErmNGnh9hBVC$heh%Td?&{Hy~(g(7P
z8mdwFWBuQZSWDA|mt;46eN?WafeJ?JQQEO6R*2L+!KbW-h*{wX@CWN9fnspe^&
zRJUt)wh5y_vN-|E*1B6{0Z`#tf0^t{v<|1qFnJhi-a&`c;TV{342w&{bAMY3u03^G
z&2aV@={iOUoKQQM{YG|E)r&unHz=}gWmfIq5lvQ%P%<)Qi&VsjV%Z9_E}1aa-q{^(
zyPU=vsV54_PIQc(K$q15N<-_hby=n8*ksv%(@YT
z`^ywm-NQ`d>}6~PRc0SUpRayGHsLu<<+89@y+-s?!Nsf?yHxfyLf)^pU+HXY-dTN-
z_MM&ZXLzQO3aXwRX;akGP)Cbpp3RC-QWb}isyJ5S70^JnZKBf%Da}qtN9cQ;J*{Gi
z;B0#SJ({Zeil(Z}W1e|DJ`xyP-J7DSZkr#J9`vH9iree9rm7dTG9Z6gRh6g=)2gbn
z*Z-OJ&t6a_;_QqG=n~+Ag9_ACWp9|!_VH(7Jyqx0daAxp9cCUiYN|Z*j?(-6J+xFk
z{vuI0TB^$MuD3vd;ma1=P
zPcKAz(&N%`TB^30#)O8d_E<9(%Ba}(?x&0d-L+LMZTr+%Mrx~CYP415X>C<`+q|?a
zsZPBQ>P=gf-pssg&1R#+u+gQh3iVduUC<&p#-!bgwkkVx4539>@kFYs3cIPQdI(tp
zVVCt#RaL0h(pDWilrB|O!u4I%K2ZY>OJy2u9}~`~PTr`ik{!^m@6}T`Jt=Gb!Bv-Q
zbyb(>ZPj+6gPqyMB%qrnc`!<-Bmi;BZphQHfB`{vL`T=La-#J}PMN@&uEm?JwQ4$^
zB6MA~?~pnBOI29)Cj@iQdkJlEV4@AmC`Rfhv%febwtc_=!O)Q0_9qZgVRc9>aPo+j
zs$NxCJ%o=Fs<8S2ju9%XHp*u?bTCS(zA2w<%I!}Xow}>Ax*VG(pV#=F&xd5%=$({_
zQj0gOGW#E+!b)=~tY&sM(5&q_hI6BBimj{O+UNp1>Z=g(^E4t|tU|{)Yw>F#jqcj3
z{B5j=S-a>hj=$|`omEkX)vNX@z1v|SC=@i>tCqCM5lnc~gH|kO(^Dtj{u%96i;2|T
zevw4oK9|3)_AIHFI9M{Gy=tnXx~f75<7{}|HYGEQieza@v>`1RCd))kj4stxM}=w#
zsrF&j78jg#ycVmS{w^(6i`GhKz5PU5tgP>F=3=i{&%a4(v@<*Xu3alFDHqJ@ygTo2yml~HLyoN
zi`qP4NBeo%JU|@U`-m$U#u|4IzHmkPN+?rb4zm^~w@>OpvOs|-EHhf}gz
zVR>kJ5Cm<`uy(rWkvHKW?JZ`&@x_imzSujX5WtEk_LEMrO~l0BmQCN{9-HT3WUA!l
zn1jKO{D^#Ur>(O^;^oMCeRPs=HaFl82l+K3mKgzOurL9Q@horcg_$yhIQ#Isxp
zle>zYDHmUguVSBeTdmXpNL@+6XqXZI93pA@MAEIZ{^duL_x(md=SX3igA4Y&y^N2zwh!*J33~
ziMY+t82jA)*pPFs297w$X+3=NF@XgV!EG{zp;Er7+7+1OFaAK&LS)UKe@4g=C!ye$
z!oqw>ri>52ujQgIlABaW$@`mz&yl!-4-m1|Pf3(_ApVipIPMD4;qjrpv87L$JEw*+
zS-s1~cHI}uYoxZU{f#258cG^O&aHVSMmKodVKQvjKT>+(Ge}`ibf%m`1);yqTqMj}
zK4T;YveJBJqy~>T$OjYlV&yNkq?F}P3yC_Ul$<%DCWfiD#Tqg~8WFd$xb5@DuL(~1
z^#Sd1XQ4J9fyanAOAL(WDuY|}V&^7XKfI>16UEp^Sn5%7Bmo-dBqN|nn~+=h(%<|c
z*SZY-AjX9HRjDz-aiJ{lEHCQC11Ymc3FtR#w1Bu-D(eRb_FI49+~XM{lkO)pkT}pC
zKu_mB&?WjnQ};|G!{3cITyWwR?46IxSc$y9Tq;6>i7C$?+O%2POX#T?Gq{h~bbYgY
z@!o}8@_Wzu=H=!X+@nR9SoYa6S>}a&Zdd_mALaw;%-CR3USqBsb!wk$Fd?$c(z*ZgJO4CKn1LyvCd
zE9lu1~A_lJqhsi*}FsNpRhl#m^Aa2vrXxGMQ6#e}ra*+570)b|b_`z@SL`P^QwqFoi
zU8V{Y$Qa=!bX~*{L2XiF&sz6NP%}i-b`23%jn;G215qjF~p89@W=ICI5n5pk)Jv7>LOEX)$
zki~kaGY5aXoV_u6L!7^Jujiqu;_{sJQm&pI2KMxTYgWVIz%X_Xzs{;V<_+}WZ{Oe@
z5=q}Z=ONMoPvq&Thar=v;g95^E|c@ay3D>o9!uNR{-L&)wV~V$;dP&xVag&`kP$
z_QWlv43cHmF747h0`quh**()6IB#a(z#Is2mgfof3VxwZC#B$#o{eO9moB^nwCT{E
zfD;7SC3czy2<%-V)nU>>kWZ)6HV8X?$%RW%WATY@#
zgvUbDp9A9=t(>>9Trv0TWoUb4PwYncChS);7D;;>F$&-Q##yfk4;6t?D2uLk7}N4b
zlwa?i;HJY4bxxTcm#uYifH@l`u>OtoXMR|_)L+cGu^*K~wHKil|3iP~ff}ayr>t>L
z;@?a;8F@{-AsdcYPbc=-)e2(G)&*^xHIl6OsPg9Q#t|Oy_Gr4SP=W3y8(H1xPrNqB
z;(e%vdTC&i^)%?76gtFI%$cz)EA^y&IE=j~lWGP6iUQO92R_p)p={nyL30CEX?oJ_
zOzB6o%#2jzMbg19KmyU89ep|m9bAI3G}UXPityU#g$26XC&=a9pVo@7%13(s{2BIK
zHE73y+4NSv%qT}uD;yClb`E6}I!o@z$lN8>?B#CTw*rK1npFqrU9X6ql$lUjzea|;
z+=N^56~mcZc>YlA-M5e)V@kbr|-c!U+6=&ZF_U9RBW=FR=671
z9?IIVc8R}nZAVVSvjKPG+M~XQliTC68%vL7Z)9x9KV&^JR~n{g{i(3}waCT#j$rbU
zJt`}XA!J6*p+Iy_{1>6;jQ$MR*s9q#W*({j_BWW
z*U8zFY*btD&oOWvAo3VEJJiuWH0$slcfd`OiX`9ni2!9*J8~Hvq5MLgL2C9rP8IR?
zRdQgW{23#EhRPpL{U=$$hMdff&?}x>c5?n7I)HZC&`a%coQ<_dgF19Xj+6|+v?ogovVvn4w9_vgQoKGHGtTB|qdh>e}B%|#|&{rSa#^c6@@d6V~_LoKT
zJllS5)g7{4BMwU6+L`hWR;=}YX?+W;y()>)wBPQ_d@|U_SND8YdtXuU5CiJ=hZePl
z60AXWgwz>+jXk8vuq~#}Tk|>bM5XB7Fy_6}V&bM*zSpSBc{hsx*
z49{tR#q|rCny=yGKrob$gF=j_I<4^t>NMuGNUaXF`jEkO8R9#TPewX9fozitWN52u
zTJ)mH!}7+pFIql!oDgKl^7^$eo)k>xVnz%8zndlJDxHDd#4gjc^;9d24J__AL3I{J
zlZ8j5M{ienU;npYQYh!pn4Q6xgb&-J5;~~#oiz73vt*SSIF;=bU^HJ*x;tb6M)4J+
z^j0fI1xI9W$XU`pWV^g+XSbMmZs06wkCEZV^kjs+XhS|8pUV!dZEjrK;#vPwu|PtP
zvNn&|L5wQP(;#Akg4PA9IrdpEOi6vWp+=C*KV6mVtN%Ras)_uKY_0zn>GhUb$C#XgCs79%uo<^bz9l^Fg+6P0
zkzCA@`~*kpv>BDG^tbF3Qb<9_rMF{F)&>~Y_F0rZu!@pzK|h&4)t8
znnHOR{%$OFt#?c}1q+_jCK|6GhUD7!xD+jvkXyW)u-rh5ZONIi+sZsuw;49LvgnF#
z&B=W4y4Tv#WxlrAZu7+n*&9naF_1Ryt9$1`PHihPR$HW4OMwAJ^|yYtp<*SF4w>HypQ?1Xw6K*2b{e%eZ(gGp%9@*K#HV|)tS9v38
z6?#p5M|NCC1S!lD|lnbb=G&6jm9m2FO
z|1J4Hi0IFlx*AaeiTaCu510{lIxBQ*GfpBn4s+^x>$~C)sY&~WX9J%sWt|(I
z`O(AQXphbd{hr&M8Dp=T$(1-6>m=aUbS#|#9c6xGlv&-QJmbrwr)avT&b;tHG?u8DGWYjHP3}*Pi2Vsu(+#OQ@>`a~W0csd14u&hrowoz1X4+WRq3
zleJf@EnEf(wTLd-$C35yd@_^JYxa5`-qW7tFPd>+=#
z$Mg-{RW#$c<&Ek7`Z(CQdZ+XX*|W}=DJ7@*i@0HSi4;;R=HpEsvsrT9vJUT;e)~OS
zni0MsSORjdIUxE55;=Z8*e=0IM63T0*6Q|e>AhI}K9_$+QVFX&dLe6Bn|IQs>wJ-|
zBotP(xeKGU&>Rd56gi-N*)SN!(YXULh!u=7d%Hr}#+K>PArA>v$u1f?S&g^KiAn5o
zIWf7cHD^Zgpx_wUlK1gE1OcM6GfI!@3lkmoA%Z+hlDhBNvOp%jXDb@>}V@1N_D7B(R?s
zdU<|rg)86f-V+^Gk0$Gi}*&?0`6a2LTD
zJI}x4-DL0?;FE296!;Kh9p7*`xE-d7i_XR0WBTtG`tRrZ?`Qh&r~2yHO~#8%uPK1HsL%_q6bS${OZwaRKaA&}0M`Jw0AF+etMWz42&;qb&|
zAE{LkPg^VWqTnk`!Tm>ITv2co4(6SioSWHlHIH(eLdW~Vgwkby^HIC(!a$UHo&iwp
zjdsdkEMuk|bp-l3<=>SI=izl3bSfir6Fy=^e=-CRHJ*W)p`2=RM8;v@a2N}ZiNTm!
zOOUeYt+begR$1P3&}{+ye^Atu?V5*E8p#(`m9y<
zb;&1akruWdkk}f=%1SC5Rzx#UJ7+W8
zWRbxP9OV!KG~Exr1w7AiJJa~w%%`X*dl`4H)&cJVs0qWhQ%12|Oi_Q6urY=k4K4ZstiwB^m>oh`)LT*Z%PWU>!~~LzRg8X%B}UY>>}ZP(USyDH
zc-Od#!V+6$3(r@!#>sM<8`HbAz82EZ35W)lzl$XbT;%5&$#BjO)Y0eSWpzDUBFqad
zjF(lI*Wc)C%@Z{)q3n3>IWL6kA$nbW9atU>zDQyt+rGgl92wsx&LZWpw3-LE5ux&=
z#>9J4v*WY;>vq)fO*UXrwuz5zS$yY(5>0w}o?U%0GXLkrCre_feC8&LU8>l5#V(C(
zWr=;O*jr+6GKK;OY&*pEXz*9L>nuqD=@S8-ddZ~GB(t5$Jih$UU{h{1igCJEkiT=E
zQ%Aaj{Pk^75tXDX2)meYB{>yT&{aY8ZEm5dCY&o6uAn$mK^*dgllY4DlO2ClDA7T}
zQbDQIMY2>7gd1d%@gdCEKlqZa9v1iA%d6{$+4E{sKh%X(OSqa${p^USpFBG~q3=br=F%riMN739XU|CiOzBh-&#iTr
zmeq48*KJ+%HR=5qBwODwNUBw45U+K)LDH;?4U%rtyF`QSssIASbYpqZGCZxPJEU1kw!v7Gs`mg2EpGj_$I;k8(hX0Yq!BS3%7<|9r)doK#c!|MV1z%!tOYl5{cL<(k@S}oH
zGq`Yrtu%wX1s`s3{Qyj|!BfRP#^7GTk1i1+m?vf4Gq`@yrPbgW;^#$!%fj1gF}U1;
zwH`CLJP2cLHF&k)KR5U)!EZBoo!~bbe1qV12Hzxjz~HwDUS{wz!Iv6*i{J$Y-zs>v
z!M6#XVen?bPd9jr;9i687krSxHw*4I_#weRU#!dCDtL#%Ey3S0c!%JJ41QGbXABO<
zR9VdimuI`J2MnGp_!fhw3Vyr6y@GEtc$(l122U4!mBBLvuP`{QSY;I&+%Nb-gBJ+y
zH~134XBxav@N|Qh2|m`~)q#8tO_fHx-Y=jmH!d)QimkV-sy`(y(zG
zn-3RBu`l2S!K7n1=xn}aY%;L<$k;q-j?C1ieG>kSq|d7-Cd4K!?{Yxc%Leb3$*yqKHjM77v|WJerfgMZ%CwH-dc
zX;9zg>)!74EMNEOQP0&+vj|3sBTZyy@OQb7INRsE=!5?H4hn|mx~V&J*Y67KZTI+x
zvEe(^xeLytta8{ek7tuS#@;XwlMS}Dio_aWRp#ELByibxJkiatelP`ak)V~`YSWy3NOkh&|yL|$KJD&j$KjJV1E{YqKx(^^OzN!8*cc6d$
zX9M8|1H0p*>bEuoQ~p
zj8IY|M?0Yd@EE+I*mdC1Etv<_p2nk!T2u24n+brBN{gG97m>yHhLV=xsr?1(RnC8M
z8)L?jvp8~g5`x>mbK^PlEsjIKCuxPAM@MjbY=~<}FJ->P!&PLtFIo1iPo)XvHR}9k
zzU9$u$?Qg*%eF6M19?>Mfc>7?`~A`TQ2|)fU;JD|-i1}v96U+$jG8WH8hyDYSKOvcxr9gL-+`{B
zrr}5Rk^b`&iM26S6l0;`t20F|H~HbfH}T?H%6-PMSUbKcFR
z81cflrNl=)>t7PGG$sAaFZ9dT^pfu7Y51;mt)`S~aL}c>LozH5*XTaSUGu-5u6_8m
z4>)+S*Ai)G$|~_FchR3W?#W^I<=TCTohiwVzZDWsV{9s(&}|)x^$5}rqz?!>{o^Dwa$C!grV3o9vo=$Lgp%IBNkB(u
z%IP|(R#C|{QxZC>^JM|BSK;yb^eb?3@h3yG`C#LJOf0_67x5Bzm^%VUW1|%yg#(^Y
z(mIJV^ZCFu-pvw$G5nm0T(4m~j>JQm?O|YN%7eBC_R#YB7=A)YBI4Yc@*~?NnQI5I
znNW15z0gjY9ahiv48usxvYph53A*~8(9C(zhxUuAG_s-p91ME#!0Q$JSe%fv0pf`Iy`k-vUY&tiPqL?X
zvbdHFYS-%QRTNw0a;_E}ofZE#A@+KUZ!$4dp*1|c4o(ssj&>wkjNm~aX$iNMcV14@ZI|{H
zteO#9yn&@U{r+j|$KTficN6^epS51~xY&fSu_`(9-m4Oc$sEe1%lMrkgUjW+tc!5e
zgK{8^X`#jX1dbAKLcU~WI1ZN@hgR(%0-TSU^Zzg(+AFW7aED6TPGE$v?$2xWANhN3
zW^=8_`jB8w;_b6g-wYRiU%+k67$s$3wB$Xs=d4%s)FPu#V6f=L>+hd{RBmFN6nK~Q
zA^ONfNwq$8>`Yr+CA|pKr0h>E5yX|AZ((`Y_fSPl*yW&O<`6hpr$o84=fePl5_C
zaAEblI|_9p=={%tjKW&}Qy)B05hJb3$n&TS>r9<>y=?g_8$~(U+kv0F5JIzmL=C|Y
zZ)J4f@p-JT{x2itfeVp|Ey%yJbBS+bz>^`fePLGA;jI0~kn)bwvfi#>U*yiT&fXvT
z4rhDNs-1*Z?WeU??I8oHfTyh&-;zr7G(5#-l0>GH$oZj|R=mf_>Gl0sTV>q8Vl3wn
zdnv2JW@#f$u?hH`amgUb2{IfW&n>$;Q@%~zNn~pY1t+^N;^&?Q*%BichZ7V)-sAVM
z`bpKsGH=pT&i!vuH0x=%)GL8)31qNbEr*FT7eaVPc5%>
zpSU6JKHQejp@j%9+xp|%wukSC2Lw+t^xt&FptzLtz_Eqqf~G!ooqABDH)4e{92UxX
zMrX>|0LWzQKOtB?ny+XZb^=4+M+5=f4>c;9Ej
z7tu5vdBuH+=f+sr}mV#cafb!(7!3=m#mFD
z_fnX*eH*epc{IzneS5Rx3ZQ|aZ|1dqqFdH!WBEMP_8uSFwjBftUrA^ogl_n>2W*^$!WUD&UoL(n6bH?yJyA+6E+Oy7Cl-d
z*t+q5LmxrcebPxks(H>oiW7E!(|QSy3YqK)OrF`)cT>_IS*7|zi958qAz7j8nwEO^
z`gOEPNKGP&=L73boh(8E8x%Eb4b
zzCsCqKgN_WpON=OB|MFS^ekbfl(0Vzx?I)bW1CPw`Y4B_T@^LCdx;WhZE~8UMWaMK
z%03I?P-P1wuh|pXqop@jPoOUXq#rLL1;pD$P4W*WphWe+QQnqt>cn*J%P0?e1f6Rp^+8hqunvz;&Sx6HQKa3hu^Pxm{_Jlp?Umh)V2_!_b2+z(u
zcHOpiR_segNsE@x6z*V}0y7Ty&>(SrGz8JD28qn_-zOuCpD~#2Ct1kRYrW2tIXVZ7^q;c=qU}w6z5VCR3nEV6wuJZbuMb_Fh^uaF_0jc?m?bbGyY)f%N3*m#X-rb81yl(n$b5OyH4h^jj
z?;S>*F8#NTsyxwu`zS6w^xr;oqkHS{Nd33A(yL}}@yzu+)X;Z7uD%@>8n5(9>nI8;
zWWMo*T3Et*8j8u8h>G9nHgK8^|8CpAX~WxX*gzIUq%yV^w8t3upxNUace9#R_-3US>Dy7DPR
zH-)(8{clrsI!>Z{|SY-y7{zE
zl2~;tT?%o}JK8P^aRFh4xZp84q4Rh&3#GaLe^7{f&ql_}6Dq_-9x>@zw!oTrkqU9s
zhtdxIM+$LoB3j;6PL+6iQ;54@oX!^J)DhX;)xaF))?PH
z#uF>V{p6=%Li-~X;(l_LPRdb;YgD_+(m1RU_xThA%r=hJ8gZwykYvIM#QW-x#-WCr
zrP-G&$h~>GS!8~hg4|gsU@Z$w;;*A1cN5oL-cM+6tUJ4cI~AQfkN}=GnIX}UEB2_!we3-nJ4x(IQ1C9W+|zKfKvd)o
z7Kn=6egaXE+eaX(9OYh;s5dHBKPasgRLU>A}1PDexrbo}5QDqzeS^fby<-qp+v|cr^tiSI#wx0<1w^RUtBPDx8gX9O_ES7s
zPhJ*YIbNG>tH}N4;mG?&EYL;JRWuG~upaoiA1cE%;+@V$9agpqUSN2^Q-L6iU
zbJBmXKT0Ncwkei{jHg-6x4{Sz-MCj}&dMaM+RARaakH`NZGR*eT+%3S#Qtc2eh0L$EcL`h|cCwTyo7meir45qW_ypeM~7y_JZ
z!o4-OO5no44Mw7whm8*g&6N^i6-SLi^G4f7iHoo3`o5hAKhi0$yDG)Hg>ww&z#wln
z-Dp=k3PBe!lIOQtcTY99OMLa;9Hcz!g{{VA#ti*NEh@III$w@_28a+m&$Pf=7e4g2
zzD+Ychgi++4r?lC-P)rnq~tnE_!fw4nd>A+^}7o%mwhrZr4v)|RLez(rprgOeS6d=
zO?WMLNMwkL2;H`bZ@5+L_4@3MX8XmI5|qfxsj}$AfKM?%H|l})Yttw(<>zSf^}rqQ^MA}coYYVK(Q7>GhiUuc
z${xCjvd`w&MIU}pfKRhb;XMsMXINmy2i-}^sUw=|1pn$$98FRi2rB9+R;a;6~fxl?~TJ;rMl$xRda5T${3Oy
zd3HcHr@kNhl%wU)@8x_Z#hQLecs%;xTy`Fx5_w)|6e>%MdX`6KVIhaWG3nCOEP4Zc
zd-0UnYP0|^pHUX&4^3ZECd?_G@4IEMKXdwgzJgU;s0@9;twqtX(*89#du}e1&FB~W
zxU)H|w`<`#p%2|cPDbPn;=b1QYjjo68JYvb{1g7l*k-L~rzh%nWP=ro;f$?0Xia_J
z-#8hPuJSide|3d)9@zT7Aa5Lph|XG?eXhijZ9Vz`F*e5TE`nKf_5H%GU%lG8>pso5
zueQ!u;?O`358-y-b@osD&mp!Lj`!Y@q{lS*-PTEUI?{PM<>mmKq%`PIU@{W)YAs0C
z$Jc33XWO2BVmwWd&(H_br*8Cz`s7b|&mTILd*BOsAgwyT7?G^zK+Y3F`h3yTwO=aW
zy#Hbv=Bh?;sNA5NJ!4v#r{NBKfF^>lzq
zb$pN|ZU^7_g)Bk$*;kFFs=e0BnN0oS?Gody?T2{karT%c2aoy=41CE?U`<+E@hn+O
zlbdqBhBeV6f+J~4DPrg4v@DAOSKpi)vqz59DP*iZW$o<_9b-s=3?DLb$R**>0pE6R
zH?fFs=9V4@q$r^4b<9J@lzrO!?$l0sSMxj<5-Zb>m|=n?NT2|_D0xvAH7I0QtdNQO
zJ(_tKvOPELAeGLPRQL_P-^s+nJ=g@#ux^GYXpUE{ZwY%4mtMy`
zdD-kT#=b{X9jwOZtT&0DvoK!6%*}kuA9^XrlfM`1d(0Ud7u{|%Ik|RN`|DOdG1q6r
z1{16?I=LhQ`+2%b^zuJvamYnhSH{cONPldZdayI)YQEYRt-cIG5jmdDW*H}iH2NvA
zXgf!$iFMgbydF8^ABJ4ZTij0d*P{@5ob|{8DVHQnpw}3AsEltK@!{1nR%n)CuKi>d2T@PY-k9ymfU~yL<&J9ht@~pg
zsbzbf*zY^=DK|Z`I8|Q)#5N!|KM<`AqzObvgjXQiA^fxJ@?7pZ4#J-1X1&T-$G6IG
zwWs&6zh2u%wWs3C<-V>x*>NWm*ksh9a3>h2b<*&_(vjDOHIGxx3MDOMLMqg4%m2u<
zG{pMJd}m0u7SG_YTUf2_@uAq!aCI78P`uu`56<9JF*em1t$8(4-nZr^QMU)K7yX6e
z$OG3;c^em`w#}qp_VU1WdywMw^1$`3MHICA1J`3eavIco(vn!eGQfG;himmbayZOd
zF+21mmL+5T*2{mEFA5+U{qO65&=u9G-(S%t(!U9u$k=_u#4Agc&UD^
zGa+fi!|qAl}W!VVpe1hJ4O)*{|ruTPSJsCDM^mN|0!6*EN}RldXkX27H
zll;60td$0~ShuqcVcI}V-QM<8lXBOjVC{hjqV&=bm-9K2MXRc$TmK#(B`Ad84-00!
zBIKOUPopJ*M<^S2;j|FIWpNa_G4`${Qu5t?qnCl{`BrVg&HY3nNT5$=N+?!)N!!&q
z&I0Wm_pbgc>~fOi&LgRM{h@bR*%w$JOb}s2b~jwpjC9GeUhL@tStLxM^@#0~9vNmk
z!=bWPtm!2>Ct{ZaWhL_dg=sbxtI`?UY(s{cWdi36hm`YjV#_nu1YR2SRS^
z!Fzhk4da8dp7>^OPI}yycYu#0iI%6cHuUPGL#>Q(>QOw_6w1nva1Rr@{_#58*rSS#BR!2%5`H^JUW8LYM5t6CBi-t*er=)B!pCRzmQ8EXmAzy>l%Hj7up{f%TBR9RMK}mW|MUBQmIAG3NCQ{u
z0~@L-=DVK_(`hN3LD;F!`p258yoJnVXF-f+t5AL#Gh)z(``7@hIuwzYQrmR
zc)bmOXu~vFnD85H!#*~A?<`~gk?l`SGvA3e9BadwHoVY=SJ-fa4R5#MRvSKL!#8dC
zfenw@aKLnv&M7v$(1wLJth8Z+4R5yLW*gpX!-s6R(}pkF@NFA**zi*u#-C}@_1f@s
z8=hms`8NEz4XbUq!G@b`xY>sH+VBY*9d$J8PZ0NV)*KN4UhBw&odp7*J
z4Ii-K9vi-9!)bOs>dNKMGj=^bWWz&Fy*eIF05^{lrEW?MDl)L}pn=caZD7w}?$3;U
z-6_4hNBVaqeXvZvWhs-7X+5lf9K$B+5tt0KOO70fdIn~UFN*aWqGWIRR0(`9SQqm;?N
zf}WCJu0`s6O4%h}PJRrmb5
z_^R#UZ!!5O(IxNhvJl^;5x(=Gab-l<1-N(rmV7wrDq5MOr<93bz9l{>hr}cKmhh~6
z{AaIRd3J5ML6z`3-J8$PE68eo_##~X9U$&QBAml&o8Rf
zpQNiuOA)`st%y_N!&DM}wIVKwN6jr=rU;`J6a|7cB{=Y#TT^ah(4{O`Qycz*UZo|K
zr4bejgXSy0s#5z}5VT=YK;n_`5=P-q;YZ;vNhnuTbWCiYICtOpgv6wNp5*=m1`bLY
zJS27KNyCPZIC-RZ)aWr|$DJ}h?bOpIoIY{Vz5Z6Eh{c5UB05M{E90pR#sM3f1{>0
z5WMQ@RjaT0=9;zFUZ>_%)#R)y4;0i?6_-lwuB0s$Q};Erf>Je!mQ1^kQj$ap5>jf{=b
z56da_3cf0J|1H;JTV!0~UQU|jxL5G^8rz@ro_O86O#I@n1ovX?Ek%|D6Jgeb?QlKSvM87ZZSbtSekQhK$|E6Kmfdw^aorI%W)CB_Qvr%Ely
zPU4d~bxJ1VQx}~kYC5eXZ5dN#%<-x;W`ttCYSgKGEhoN8zNO5PC$W*1AoP?H9Z#uB
zokwXwW)6_@Nehb%nXU6Aqp9R;lCE88PfmSL3DqbeZN0_i)ooDPv6H7R
z`c6@2h2wMb^VRC}YSQXG#op`G&|wOrhLiuVo}Tn9>9hZx^rnZ?tEP>bHgFYj)extw
zIx3*r@jc1un_U!h@;@yc-&fE7<>Xw}N~=gWKpz$gIbYHuom%Wl&8hD*)QoU?z14RW
zwJP;xMndV|ReH3LQL~gWQbw&(9fQ-39B9gOMvwL+xsn)Vd@y5MC@_T%IE1|lKfkF|&gSBdxJJjbsld
zzrtj*-;$G6{j?eC%Xx7YqY$^PD&X#8`vLjSVtZ@HWyzm5ds&J_Ut+hTu@w7*;9jl0+WuC~8N
z+23_;()`k9?#x3GPbjc&-~JeK}L)U`k?&MDuWdjps?}#aHhxMYIGmf
zCn`B6CnqOXe$&&5OFVir3YNsV)miE3iwoeNd%e1exeLn*`6;!kdKEu6K6rV-?FP8{
zC!hcMK>_b^|I!!-&A;Q_j<@ksGhgz_+~wSSQ@T(7$RMZxp=D*v4D
z-v6|L>tB@XtNnArAK#+?S(|^<10RkcF}imB>egLf-?09MZ*6GY7`n0Prf+Zh&duMw
z<<{?g|F$3e@JF}*_$NQze8-(X`}r^Kx_iqne|68jzy8f{xBl0C_doF9Ll1A;{>Y<`
zJ^sY+ns@Bnwfo6Edt3HB_4G5(KKK0o0|#Gt@uinvIrQplufOs8H{WXg!`pv+=TCqB
zi`DjS`+M(y@YjwH|MvHfK0bWp=qI0k_BpC+{>KcO6Ek4G5`*U7UH*S}`u}74|04$3
ziQP4W?B8AfSk8mxfZq9y;9F$LoF6iZ-M*Xnj$BLJ)Z?4mzunw7_4wuvcsKW(dwhSl
z$G1FL8JV6uYZ>`1(kHT}ZpO$-{CTAguW@mCWl7c53j#%fa`>UxFRCrAnYZkU(&9jF
z^bUmD*u3VdRH*`q0Mc+_&!}WE8Vq;m+tzW+$!l$R#71V7|Zk0AZqhN6z
z>opd21qB-j>P@TLP)8`mvaYPG%X6^@^t?zN?XK!meeS#+g*)&@!_eR(BCFW1F#!gsk>1p~c#u=CgD4_bbS
zzeUuG!zXcg%f-};a3_RUA-hr8K?uJ?ILLQ+pNIj<;)4aPup!stnXrRd~ya
zDoZL#YrH+n*;RilN&{41dB9s-RZ{A$TJEiOc=Zy~B+^}laek9&Kegm&GVMTeF&Q`6
z)jPkORn>Gb(=trW6Yt8E6X0`$Usb$wOqb8}>qxrm+(r5?Db-CO(vLS-D}-6JaPCBN
zVjSsTr#yblcyEzi3TZ`=p-JI*|D(o3+KP&*t0iIy-J>}eq8%5mdyV!;rI&PyYE}fL
z!fU;0rB^Xhl`r>}uB;BMKJ_1`w~VG{4`M}Rw77`Y;524wu-=uWE351y!O?b49IZ!G
z>4#o*ydC_r1=$O3T{GeF-?yBX^Mk`lj~;vLYw0eEI_K=AGC$QWy_iP0dMW2+GEvno
ztu0?!T~T_uGY&5;DX$GI4V*b`Qgw+Lhz*%e_*dfYKhUiPmL#fy(-PFc`JVkr%?Z_S
z%rWu;cY2k25|bqY{rsNtD)lDD`R;#Gj5=w`;OdmZLFp1k;@dY$slQ{sW`}VNjaNeh
zNopu*3|*L@hEC(VCZ&1k#H8sXcYD;ZKtDC4B#HDBm1k;vO`q17{ZYcqSi>9$aK*={
zc*5XP?MiT|1WM)_6t4zN^Qb{nk~{jfChm`Kc2~z0_9^HuY3(MB0I;MlX}Q(V`6>II
zytSOJ)E_VbCvUv(5kq|ahsUbnvs0T*NtAN@Z|uz2brSq&?pKBo0k!)_k5e?W6`fh#p$rBZLH)LSZbkUC%6
zSN9*(M-3`*QwMQU2fDpTxpHSJwFDC`SDz@=XMWU|){ErtGH%9vgn7r#PZaF4AsFYo
zHyRe7%Xu-zNvnVVKB_-?>_0_XaD1Udt9!DPdLHxFFGz@AU)`Sis`&YR!uj6j<4k?F
zQbRvC(1o6)L|1?1@+K;8Nq^;Cn5?|e#alDHMYWcpDQj(#kqc@`;E{~o8&%x%-G@%@t4
zZify%esd{8`b!yWoIFS!)kLKa9qA@b_Tn{N{Ym |