This example demonstrates three approaches to creating agents and workflows:

- Traditional workflow-based approach with manual agent creation
- Programmatic agent configuration using `AgentConfig`
- Declarative agent configuration using `FastMCPApp` decorators

All three approaches can use `app_server.py` to expose the agents and workflows as an MCP server.
- Using the `Workflow` base class to create custom workflows
- Registering workflows with an `MCPApp`
- Creating and registering agent configurations with both programmatic and declarative approaches
- Exposing workflows and agents as MCP tools using `app_server.py`
- Connecting to a workflow server using `gen_client`
- Lazy instantiation of agents from configurations when their tools are called
- `DataProcessorWorkflow`: A traditional workflow that processes data in three steps:
  - Finding and retrieving content from a source (file or URL)
  - Analyzing the content
  - Formatting the results
- `SummarizationWorkflow`: A traditional workflow that summarizes text content:
  - Generates a concise summary
  - Extracts key points
  - Returns structured data
- Research Team: A parallel workflow created using the agent configuration system:
  - Uses a fan-in/fan-out pattern with multiple specialized agents
  - Demonstrates declarative workflow pattern configuration
- Specialist Router: A router workflow created using `FastMCPApp` decorators:
  - Routes requests to specialized agents based on content
  - Shows how to use the decorator syntax for workflow creation
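The research team's fan-in/fan-out pattern can be sketched in plain asyncio, independent of mcp-agent. The agent functions below are hypothetical stand-ins for the configured LLM agents (`summarizer`, `fact_checker`, `editor`):

```python
import asyncio

# Hypothetical stand-ins for the LLM-backed specialist agents.
async def summarizer(text: str) -> str:
    return f"summary({text})"

async def fact_checker(text: str) -> str:
    return f"facts({text})"

async def editor(drafts: list[str]) -> str:
    # Fan-in: one agent consolidates the parallel outputs.
    return " | ".join(drafts)

async def research_team(text: str) -> str:
    # Fan-out: run the specialists concurrently on the same input.
    drafts = await asyncio.gather(summarizer(text), fact_checker(text))
    return await editor(list(drafts))

print(asyncio.run(research_team("topic")))  # → summary(topic) | facts(topic)
```

In the real example the fan-out and fan-in agents are named in a `ParallelWorkflowConfig` rather than wired together by hand.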
- Copy the example secrets file:

  ```bash
  cp mcp_agent.secrets.yaml.example mcp_agent.secrets.yaml
  ```

- Edit `mcp_agent.secrets.yaml` to add your API keys.
- Run the client, which will automatically start the server:

  ```bash
  uv run client.py
  ```
- `basic_agent_server.py`: Defines the BasicAgentWorkflow and creates an MCP server
- `client.py`: Connects to the server and runs the workflow
- `mcp_agent.config.yaml`: Configuration for MCP servers
- `mcp_agent.secrets.yaml`: Secret API keys (not included in the repository)
Workflows are defined by subclassing the `Workflow` base class and implementing:

- The `run` method containing the main workflow logic
- Optional: `initialize` and `cleanup` methods for setup and teardown
- Optional: a custom `create` class method for specialized instantiation
Workflows are registered with the `MCPApp` using the `@app.workflow` decorator. For example:

```python
from typing import Any, Optional

app = MCPApp(name="workflow_mcp_server")

@app.workflow
class DataProcessorWorkflow(Workflow[str]):
    @classmethod
    async def create(
        cls, executor: Executor, name: str | None = None, **kwargs: Any
    ) -> "DataProcessorWorkflow":
        # Custom instantiation logic
        workflow = cls(executor=executor, name=name, **kwargs)
        await workflow.initialize()
        return workflow

    async def initialize(self):
        # Set up resources like agents and LLMs
        ...

    async def run(
        self,
        source: str,
        analysis_prompt: Optional[str] = None,
        output_format: Optional[str] = None,
    ) -> WorkflowResult[str]:
        # Workflow implementation...
        ...

    async def cleanup(self):
        # Clean up resources
        ...
```
Agent configurations can be created programmatically using Pydantic models:

```python
# Create a basic agent configuration
research_agent_config = AgentConfig(
    name="researcher",
    instruction="You are a helpful research assistant that finds information and presents it clearly.",
    server_names=["fetch", "filesystem"],
    llm_config=AugmentedLLMConfig(
        factory=OpenAIAugmentedLLM,
    ),
)

# Create a parallel workflow configuration
research_team_config = AgentConfig(
    name="research_team",
    instruction="You are a research team that produces high-quality, accurate content.",
    parallel_config=ParallelWorkflowConfig(
        fan_in_agent="editor",
        fan_out_agents=["summarizer", "fact_checker"],
    ),
)

# Register the configurations with the app
app.register_agent_config(research_agent_config)
app.register_agent_config(research_team_config)
```
`FastMCPApp` provides decorators for creating agent configurations in a more declarative style:

```python
fast_app = FastMCPApp(name="fast_workflow_mcp_server")

# Basic agent with OpenAI LLM
@fast_app.agent(
    "assistant",
    "You are a helpful assistant that answers questions concisely.",
    server_names=["calculator"],
)
def assistant_config(config):
    config.llm_config = AugmentedLLMConfig(
        factory=OpenAIAugmentedLLM,
    )
    return config

# Router workflow with specialist agents
@fast_app.router(
    "specialist_router",
    "You route requests to the appropriate specialist.",
    agent_names=["mathematician", "programmer", "writer"],
)
def router_config(config):
    config.llm_config = AugmentedLLMConfig(
        factory=OpenAIAugmentedLLM,
    )
    config.router_config.top_k = 1
    return config
```
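The router's job of picking `top_k` specialists by content can be sketched without an LLM. The keyword scoring below is an illustrative stand-in for the LLM-based routing the real workflow performs; the specialist names match the example, but the keyword sets are hypothetical:

```python
# Illustrative keyword profiles; a real router would use an LLM to score relevance.
SPECIALISTS = {
    "mathematician": {"equation", "integral", "prove", "sum"},
    "programmer": {"code", "bug", "function", "compile"},
    "writer": {"essay", "story", "edit", "draft"},
}

def route(request: str, top_k: int = 1) -> list[str]:
    words = set(request.lower().split())
    # Score each specialist by keyword overlap with the request.
    scores = {name: len(words & keywords) for name, keywords in SPECIALISTS.items()}
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_k]

print(route("please fix this bug in my function"))  # → ['programmer']
```

With `top_k = 1`, as in the configuration above, only the single best-matching specialist receives the request.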
The MCP server automatically exposes both workflows and agent configurations as tools.

Workflow tools:

- Running a workflow: `workflows/{workflow_id}/run`
- Checking status: `workflows/{workflow_id}/get_status`
- Controlling workflow execution: `workflows/resume`, `workflows/cancel`

Agent tools:

- Running an agent: `agents/{agent_name}/generate`
- Getting a string response: `agents/{agent_name}/generate_str`
- Getting a structured response: `agents/{agent_name}/generate_structured`
Agent configurations are lazily instantiated when their tools are called. If the agent is already active, the existing instance is reused.
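The lazy-instantiation behavior can be sketched as a small cache: an agent is built the first time one of its tools is called, and later calls reuse the live instance. The classes here are hypothetical simplifications, not the library's internals:

```python
class AgentConfig:
    # Simplified stand-in for the real Pydantic config model.
    def __init__(self, name: str):
        self.name = name

class Agent:
    def __init__(self, config: AgentConfig):
        self.name = config.name

class AgentRegistry:
    def __init__(self):
        self._configs: dict[str, AgentConfig] = {}
        self._active: dict[str, Agent] = {}

    def register(self, config: AgentConfig) -> None:
        self._configs[config.name] = config

    def get_agent(self, name: str) -> Agent:
        # Lazily instantiate on first use; reuse the live instance afterwards.
        if name not in self._active:
            self._active[name] = Agent(self._configs[name])
        return self._active[name]

registry = AgentRegistry()
registry.register(AgentConfig("researcher"))
first = registry.get_agent("researcher")
second = registry.get_agent("researcher")
print(first is second)  # → True
```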
The client connects to the workflow server using the `gen_client` function:

```python
async with gen_client("workflow_server", context.server_registry) as server:
    # Connect and use the server
    ...
```

You can then call both workflow and agent tools through this client connection.