In Part 1 of this tutorial series, you will set up and test the infrastructure and third-party dependencies required for an agentic AI use case: a chat listener that creates concise tasks in a project management platform. This is a prime example of MCP tool integration: LLMs are strong at summarizing a customer's natural language, but they lack awareness of your organization's project management platform and the context and connectivity needed to integrate with such an external system.
After you finish this tutorial, you will continue in Part 2 of the series by building and evolving a Streaming Agent for this use case.
Log in to your Confluent Cloud account:
confluent login --prompt --save

Install a CLI plugin that streamlines resource creation in Confluent Cloud:
confluent plugin install confluent-quickstart

Run the plugin to create the Confluent Cloud resources needed for this tutorial. Note that you may specify a different cloud provider (gcp or azure) or region. You can list the supported regions for a given cloud provider by running confluent kafka region list --cloud <CLOUD>. The plugin should complete in under a minute.
confluent quickstart \
--environment-name agentic-ai-env \
--kafka-cluster-name agentic-ai-cluster \
--compute-pool-name agentic-ai-pool

Log in to the OpenAI platform. Navigate to the Project API keys page and click Create new secret key. Save this key because we will need it later when creating a remote model in Flink.
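Optionally, sanity-check the key from your terminal before wiring it into Flink by listing the models your key can access (this uses OpenAI's standard models endpoint):

curl -H "Authorization: Bearer <OPENAI_API_KEY>" https://api.openai.com/v1/models

A successful response returns a JSON list of models; an invalid key returns a 401 error.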
Next, log in to Linear.
When prompted, give your workspace a unique name and click through the quick start prompts until you get to your workspace home page.
To create a Linear API key, click the workspace dropdown at the top left, then Settings. Select Security & access in the left-hand navigation, followed by New API key under Personal API keys. Give the key a name. Under Permissions, select Only select permissions... and then only check the boxes for Read and Write. Click Create. Save this API key.
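Optionally, confirm that the key works by querying Linear's GraphQL API for your own user record (the same endpoint and Authorization header format appear again later in this tutorial):

curl \
-X POST \
-H "Content-Type: application/json" \
-H "Authorization: <LINEAR_API_KEY>" \
--data '{"query": "query Me { viewer { id name } }"}' \
https://api.linear.app/graphql

A response containing your user ID and name confirms that the key and its Read permission are working.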
Start a Flink SQL shell:
confluent flink shell --compute-pool \
$(confluent flink compute-pool list -o json | jq -r ".[0].id")

Linear offers both streamable HTTP and SSE-based MCP endpoints. Paste your Linear API key into the following statement to create a connection to Linear's streamable HTTP MCP server:
CREATE CONNECTION `linear-mcp-connection`
WITH (
'type' = 'MCP_SERVER',
'endpoint' = 'https://mcp.linear.app/mcp',
'transport-type' = 'streamable_http',
'token' = '<LINEAR_API_KEY>'
);

Next, paste your OpenAI API key into the following statement to create a connection to OpenAI's Chat Completions API:
CREATE CONNECTION `openai-connection`
WITH (
'type' = 'openai',
'endpoint' = 'https://api.openai.com/v1/chat/completions',
'api-key' = '<OPENAI_API_KEY>'
);

While you can sometimes point a model at an MCP tool directly, it helps to inspect the tool to see which operations are available and what parameters they require.
Run the following command to start an MCP server inspection tool:
npx @modelcontextprotocol/inspector@latest https://mcp.linear.app/mcp

In the form on the left, select the Streamable HTTP Transport Type, enter https://mcp.linear.app/mcp as the URL, and enter your Linear API key as the Bearer token.

Scroll down, click Connect, and then approve the login. Once you're connected, click List Tools. These are the tools at our disposal to build an agentic AI workflow. We're going to focus on issue creation, so note that there is a create_issue tool. Click that to see the fields required to create an issue.

You'll notice a required team field that you must provide when calling the tool. You can get your team ID (a GUID) by running the following command. Be sure to substitute your Linear API key:
curl \
-X POST \
-H "Content-Type: application/json" \
-H "Authorization: <LINEAR_API_KEY>" \
--data '{
"query": "query Teams { teams { nodes { id name } }}"
}' \
https://api.linear.app/graphql
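If you'd rather not pick the GUID out of raw JSON by hand, you can pipe the same query through jq (already used above to start the Flink shell); the .data.teams.nodes path follows the standard GraphQL response envelope for this query:

curl -s \
-X POST \
-H "Content-Type: application/json" \
-H "Authorization: <LINEAR_API_KEY>" \
--data '{"query": "query Teams { teams { nodes { id name } }}"}' \
https://api.linear.app/graphql | jq -r '.data.teams.nodes[] | "\(.id)  \(.name)"'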
In the Flink SQL shell, create a gpt-4o model using the OpenAI connection created earlier:

CREATE MODEL chat_listener
INPUT(prompt STRING)
OUTPUT(response STRING)
WITH (
'provider' = 'openai',
'task' = 'text_generation',
'openai.model_version' = 'gpt-4o',
'openai.connection' = 'openai-connection'
);

Next, create a similar model, but this time also provide the MCP server connection:
CREATE MODEL linear_mcp_model
INPUT(prompt STRING)
OUTPUT(response STRING)
WITH (
'provider' = 'openai',
'task' = 'text_generation',
'openai.model_version' = 'gpt-4o',
'openai.connection' = 'openai-connection',
'mcp.connection' = 'linear-mcp-connection'
);

First, test the base LLM that doesn't call any tools:
SELECT
prompt,
response
FROM
(SELECT 'What is a good family friendly dog breed? Answer concisely with only the most recommended breed.' AS prompt) t,
LATERAL TABLE(AI_COMPLETE('chat_listener', prompt)) AS r(response);

You should see output like the following. Your output may be different because the underlying model is nondeterministic.
prompt                                             response
what is a good family friendly dog breed? ...      Labrador Retriever

Next, test MCP tool invocation with the following command. Substitute your Linear team ID.
SELECT
AI_TOOL_INVOKE(
'linear_mcp_model',
'Create an issue from the following text using <LINEAR_TEAM_ID> as the team ID. I can''t log in to the online store. It says that my account has been locked out. When I try the forgot password route, I don''t get an email to reset it. Please help!',
MAP[],
MAP['create_issue', 'Create a new issue'],
MAP[]
) AS response;

You should see a JSON response indicating the status as well as the action taken. In the Linear web app, click All issues and you will see a new ticket in the backlog summarizing the issue:

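If you prefer to verify from the terminal instead of the web app, you can also list recent issues through Linear's GraphQL API; the newly created ticket should appear among the results:

curl \
-X POST \
-H "Content-Type: application/json" \
-H "Authorization: <LINEAR_API_KEY>" \
--data '{"query": "query Issues { issues(first: 5) { nodes { identifier title } } }"}' \
https://api.linear.app/graphql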
Now that we have created the models, connected them to tools, and verified that they work as expected, proceed to Part 2 of this tutorial series.
If you aren't continuing, delete the agentic-ai-env environment to clean up the Confluent Cloud infrastructure created for this tutorial. Run the following command in your terminal to get the environment ID of the form env-123456 corresponding to the environment named agentic-ai-env:
confluent environment list
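If you'd rather not copy the ID by eye, the same JSON-plus-jq pattern used earlier can select it by name (this assumes the JSON output includes id and name fields; adjust if your CLI version differs):

confluent environment list -o json | jq -r '.[] | select(.name == "agentic-ai-env") | .id'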
Delete the environment:

confluent environment delete <ENVIRONMENT_ID>