MCP Server

MCP Server is currently in preview.

What is Model Context Protocol (MCP)?

Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). Remote MCP servers let LLMs connect to external data sources, typically over HTTP, enabling the creation of complex Workflows that extend their knowledge and capabilities.

MCP provides:

  • A growing list of pre-built integrations that your LLM can plug into directly.
  • A standardized way to build custom integrations for AI applications.
  • An open protocol that everyone is free to implement and use.
  • The flexibility to switch between different apps (e.g. Claude ↔ ChatGPT) and take your context with you.

Possible Use Cases

  • Easy lookups and searches across multiple, disparate systems.
  • Report generation without complex business rules or manual setup.
  • Summarization of data in any format without complex integrations.

MCP Tools

MCP Tools are what enable LLMs to interact with external systems. In our case, a tool directly correlates to a Flowgear Workflow. Once a Workflow is enabled as an MCP Tool, an LLM can invoke it, pass inputs, and use and display the outputs for you. To set this up in Flowgear, see Enable a Workflow as an MCP Tool.
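Under the hood, an MCP tool invocation is a JSON-RPC 2.0 request from the LLM client to the server. The sketch below shows the general shape of a `tools/call` request; the tool name and arguments are hypothetical placeholders, not real Flowgear identifiers.

```python
import json

# Minimal sketch of the JSON-RPC 2.0 "tools/call" request an LLM client
# sends when it invokes an MCP tool. "lookup_customer" and "customerId"
# are hypothetical examples of a Workflow-backed tool and its input.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "lookup_customer",
        "arguments": {"customerId": "C-1001"},
    },
}
print(json.dumps(request, indent=2))
```

The server runs the matching Workflow and returns the Workflow's outputs as the tool result, which the LLM can then use or display.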

Requirements

  • MCP Server Support – Disabled by default. To enable MCP Server Support, please submit a support ticket. Note that enabling it requires a Tenant restart and allows your Tenant to act as a remote MCP server.
  • Environment Hostname – MCP server access is based on your Site's Environment hostname. A unique subdomain must be specified for each Environment. Submit a support ticket with Flowgear to request the required DNS changes. You can confirm the hostname by navigating to the Environments section in Site Settings.
  • Available Active Workflows – Once you enable a tool and provide a prompt, the LLM may invoke Workflows repeatedly until it is satisfied with the information received or you ask it to stop. This can cause a spike in active Workflows during that period.
  • Flowgear Console sign-in – You must be signed in to the Flowgear Console to connect your account in the LLM.

Considerations

Workflows need to be able to handle failures - LLMs are not always reliable and may send imperfect inputs, so design your MCP Tool Workflows to handle errors in a way that prevents downstream issues.
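One defensive pattern is to validate inputs at the start of the Workflow and return a structured error instead of failing, so the LLM can correct itself and retry. A hedged sketch, written in Python for illustration (the `customerId` input is a hypothetical example):

```python
# Defensive input handling at the entry point of an MCP Tool Workflow.
# Rejecting imperfect LLM-sent inputs up front keeps bad data out of
# downstream steps.
def handle_request(inputs: dict) -> dict:
    customer_id = inputs.get("customerId")
    if not isinstance(customer_id, str) or not customer_id.strip():
        # Return a structured error rather than raising, so the LLM can
        # see what went wrong and retry with corrected inputs.
        return {"error": "customerId must be a non-empty string"}
    return {"ok": True, "customerId": customer_id.strip()}

print(handle_request({}))
print(handle_request({"customerId": " C-1001 "}))
```

In a real Workflow the same idea applies regardless of language: validate first, and surface failures as data the LLM can act on.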

Schemas for MCP Tool Workflows are generated from Variable Bar property values - If properties are set but no values are provided, the LLM has no detailed schema for what the inputs should look like and may invoke the tool in strange and incorrect ways. Provide explicit values for required inputs. See Enable a Workflow as an MCP Tool for more information on how to set up a Workflow as an MCP Tool.
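To illustrate why explicit values matter: when each property has a sample value, an input schema can be inferred from the value's type; with no values, there is nothing to infer from. A hedged sketch (the property names and the inference logic below are illustrative, not Flowgear's actual mechanism):

```python
import json

# Infers a JSON Schema-style description of tool inputs from sample values,
# mirroring the idea that populated Variable Bar properties give the LLM a
# typed schema to follow.
def infer_schema(sample_inputs: dict) -> dict:
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    return {
        "type": "object",
        "properties": {
            name: {"type": type_map.get(type(value), "string")}
            for name, value in sample_inputs.items()
        },
        "required": list(sample_inputs),
    }

# Hypothetical properties with explicit sample values.
schema = infer_schema({"customerId": "C-1001", "includeOrders": True})
print(json.dumps(schema, indent=2))
```

An empty property would contribute nothing here, which is why the LLM ends up guessing at input shapes.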

MCP is not yet widely adopted - As of September 2025, MCP is growing quickly but is not yet fully supported by all LLMs. Flowgear's MCP Server setup has been verified to work with Claude and ChatGPT.

MCP uses JSON to communicate - Use simple types or structured JSON for both the inputs and outputs of your MCP Tool Workflow; these map cleanly onto MCP tool calls.
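For example, a flat record of primitive values round-trips through JSON losslessly, whereas embedded XML or CSV blobs force the LLM to parse text inside text. A small sketch (the record is a hypothetical Workflow output):

```python
import json

# A Workflow result shaped as simple, structured JSON: flat fields with
# primitive types that map directly onto an MCP tool-call result.
record = {"orderId": 1042, "status": "shipped", "total": 199.95}

payload = json.dumps(record)            # what the tool would return
assert json.loads(payload) == record    # round-trips losslessly
print(payload)
```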

Descriptions can provide extra context - The descriptions on Node Properties and the Workflow itself are surfaced to provide extra context for the LLM to better understand how the MCP Tool Workflow can be used.

Tool response size - We have found that some LLMs have trouble parsing larger responses, and a response exceeding a certain size may be truncated. Keep response sizes to roughly 200–300 KB of JSON/text; beyond that, the LLM may not behave as expected.

See also

Enable a Workflow as an MCP Tool
Connect Flowgear MCP to Claude
Connect Flowgear MCP to ChatGPT

Read more

Model Context Protocol