MCP Server

What is Model Context Protocol (MCP)?

Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). Remote MCP servers let AI assistants connect to external data sources, typically over HTTP, enabling the creation of complex Workflows that extend their knowledge and capabilities.

MCP provides:

  • A growing list of pre-built integrations that your AI assistant can plug into directly.
  • A standardized way to build custom integrations for AI applications.
  • An open protocol that everyone is free to implement and use.
  • The flexibility to switch between different apps (e.g. Claude ↔ ChatGPT) and take your context with you.

Possible Use Cases

  • Easy lookups and searches across multiple, disparate systems.
  • Report generation without complex business rules or manual setup.
  • Summarization of any data in any format without requiring complex integrations.

MCP Tools

MCP Tools are what enable AI assistants to interact with external systems. In our case, a tool directly corresponds to a Flowgear Workflow. Once a Workflow is enabled as an MCP Tool, an AI assistant can invoke it, pass inputs, and use and display the outputs for you. For setup in Flowgear, see Enable a Workflow as an MCP Tool.

Requirements

  • Environment Hostname – MCP server access is based on your Site's Environment hostname, and a unique subdomain must be specified for each Environment. Submit a support ticket with Flowgear to request the required DNS changes. You can confirm the hostname by navigating to the Environments section in Site Settings.
  • Available Active Workflows – Once you enable a tool and provide a prompt, the AI assistant may continue invoking Workflows repeatedly until it is satisfied with the information received or you ask it to stop. This can cause a spike in active Workflows during that period.
  • Flowgear Console sign-in – You must be signed in to the Flowgear Console to connect your account in the AI assistant.

Considerations

Workflows need to be able to handle failures - AI assistants are not always reliable and may send imperfect inputs, so design your MCP Tool Workflows to handle errors in a way that prevents downstream issues.
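As an illustration, a tool's input handling can validate what the assistant sends and return a structured error it can read and correct, rather than letting a malformed call reach downstream systems. The field name and error shape below are assumptions for this sketch, not Flowgear specifics.

```python
import json

def handle_tool_call(raw_input: str) -> dict:
    """Validate AI-supplied input before touching downstream systems.

    The "customer_id" field and the {"ok": ..., "error": ...} shape are
    illustrative only, not part of Flowgear or MCP.
    """
    try:
        payload = json.loads(raw_input)
    except json.JSONDecodeError as exc:
        # A structured error lets the assistant retry with corrected input.
        return {"ok": False, "error": f"Input was not valid JSON: {exc}"}

    customer_id = payload.get("customer_id")
    if not isinstance(customer_id, str) or not customer_id.strip():
        return {"ok": False, "error": "'customer_id' must be a non-empty string"}

    # ... the actual downstream lookup would happen here ...
    return {"ok": True, "customer_id": customer_id.strip()}
```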

Schemas for MCP Tool Workflows are generated from Variable Bar property values - If properties are set but no values are provided, the AI assistant may invoke the tool in strange and incorrect ways, since it has no detailed schema describing what the inputs should look like. Provide explicit values for required inputs. See Enable a Workflow as an MCP Tool for more information on how to set up a Workflow as an MCP Tool.
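To illustrate why explicit sample values matter, here is a minimal sketch of how input types can be inferred from example values; this is not Flowgear's actual schema generator, just the general idea that without values the assistant has nothing to infer types from.

```python
def infer_schema(sample_inputs: dict) -> dict:
    """Infer a minimal JSON Schema from example property values.

    A sketch of the idea only: each sample value yields a typed property,
    so an assistant knows what shape of input to send.
    """
    type_names = {str: "string", bool: "boolean", int: "integer", float: "number"}
    properties = {}
    for name, value in sample_inputs.items():
        # Fall back to "string" for any type not in the mapping.
        properties[name] = {"type": type_names.get(type(value), "string")}
    return {"type": "object", "properties": properties, "required": list(properties)}
```

With no sample values there are no properties at all, leaving the assistant to guess the input shape.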

MCP is not yet widely adopted - MCP is growing quickly but, as of September 2025, is not yet fully supported by all AI assistants. Flowgear's MCP Server setup has been verified to work with Claude and ChatGPT.

MCP uses JSON to communicate - It is best to either use simple types or structured JSON for both inputs and outputs of your MCP Tool Workflow. This maps cleanly to MCP tool calls.
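For context on that mapping, an MCP tool invocation travels as a JSON-RPC "tools/call" request. The envelope below follows the MCP specification; the tool name and arguments are made-up examples, and simple, flat argument types map cleanly onto it.

```python
import json

def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP "tools/call" JSON-RPC request string.

    The jsonrpc/method/params envelope follows the MCP spec; the tool
    name and argument fields used below are illustrative.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Simple types keep the mapping from Workflow inputs to tool arguments clean:
request = build_tool_call(1, "lookup_customer", {"customer_id": "C-100"})
```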

Descriptions can provide extra context - The descriptions on Node Properties and the Workflow itself are surfaced to provide extra context for the AI assistant to better understand how the MCP Tool Workflow can be used.

Tool response size - Some AI assistants have trouble parsing larger responses; beyond a certain size, the response may be truncated. Keep responses to roughly 200–300 KB of JSON or text, as assistants may not behave as expected when this is exceeded.
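A Workflow could guard against oversized responses with a simple size check before returning, paging or summarizing results instead. The 250 KB budget below is an assumption taken from the middle of the guidance above.

```python
import json

MAX_RESPONSE_BYTES = 250_000  # assumed mid-point of the ~200-300 KB guidance

def fits_size_budget(response: dict) -> bool:
    """Check whether a serialized tool response stays within the budget.

    A Workflow could use a check like this to decide whether to page or
    summarize results instead of returning one oversized payload.
    """
    encoded = json.dumps(response).encode("utf-8")
    return len(encoded) <= MAX_RESPONSE_BYTES
```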

See also

Enable a Workflow as an MCP Tool
Connect Flowgear MCP to Claude
Connect Flowgear MCP to ChatGPT

Read more

Model Context Protocol