
Building trustworthy AI: Integrating Human Expertise with OpenAI's Responses API and the Pearl MCP Server

  • Writer: Pearl Team
  • May 28, 2025
  • 5 min read

Updated: Jun 24, 2025


Abstract


OpenAI’s support for remote MCP servers is a leap forward for agentic AI. Services like the Pearl MCP Server extend this by introducing human judgment into the loop. In domains where trust, accuracy, and expertise are critical, tools like askExpert turn MCP from a technical standard into a competitive differentiator.


Pearl delivers a secure, scalable, standardized way to connect AI to real human expertise, unlocking AI applications that are not just fast - but fundamentally trustworthy.



The landscape of AI development is shifting. We're moving beyond simple chatbots to sophisticated AI agents capable of performing multi-step tasks across various digital services. A core enabler of this evolution is the Model Context Protocol (MCP), an open standard originally created by Anthropic. MCP provides a structured, standardized way for AI applications to connect to external tools and data sources.


Historically, MCP servers typically ran locally. However, a significant development is the growing support for remote MCP servers. These servers, hosted on the internet, allow AI models to access a broader range of online tools and data. Recently, OpenAI announced support for connecting their models to remote MCP servers via the hosted MCP tool in their Responses API. Anthropic also supports custom integrations using remote MCP servers on Claude and Claude Desktop for certain plans (currently in beta). Google has also announced native SDK support for Model Context Protocol definitions in the Gemini API for easier integration with open-source tools.


These MCP integrations are designed to make building agentic applications easier. Traditionally, connecting AI agents to external services was done through function calling, which involved multiple network hops between the model, the backend, and the service - resulting in latency and complexity. The hosted MCP tool simplifies this process: instead of manually wiring each function call, you configure your model to point directly at an MCP server, which acts as a centralized tool host exposing standard commands.


This modular, service-oriented approach allows AI to interact with platforms like Shopify, Stripe, Twilio, and others. For example, a model could use an MCP server to add an item to a Shopify cart and return a checkout URL in one turn - something that previously required custom wrapper code and a relay server. The result is simpler orchestration and centralized management of tools.
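As a rough sketch of this pattern using the OpenAI Node SDK's hosted MCP tool (the server URL, label, and prompt below are placeholders for illustration, not a real commerce integration):

import OpenAI from "openai";

const client = new OpenAI();

// Point the model at a remote MCP server instead of wiring individual function calls.
// "https://commerce.example.com/mcp" is a placeholder URL used only for illustration.
const response = await client.responses.create({
  model: "gpt-4o",
  tools: [
    {
      type: "mcp",
      server_label: "commerce",
      server_url: "https://commerce.example.com/mcp",
      require_approval: "never",
    },
  ],
  input: "Add the blue ceramic mug to my cart and send me a checkout link.",
});

console.log(response.output_text);

The model discovers the server's tools at runtime, calls them in the same turn, and folds the results into its reply, with no relay server in between.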


Introducing Pearl MCP: Bridging AI and Human Expertise


This support for connecting AI to software services via remote MCP servers is incredibly powerful. But what if we could use this same standard to connect AI not just to software, but to real human expertise?


This is the vision behind the Pearl MCP Server, available at https://mcp.pearl.com/mcp.

Pearl’s system is built as an Expert-as-a-Service (EaaS) platform, designed specifically to bridge AI agents with real human Experts.


Why bridge AI with humans?


Despite rapid advancements, even the most sophisticated AI systems have limitations:


  • AI can be overconfident, sounding authoritative even when wrong.

  • AI struggles to explain its reasoning or provide nuanced judgment.

  • In regulated or high-risk domains (legal, medical, financial), trust, accuracy, and credentials are essential.

  • Pure AI lacks the lived experience and decision-making of human professionals.


Customers need correct answers and assurance that a qualified, credentialed Expert is involved.


Pearl: Human expertise as a tool


Pearl leverages the MCP standard to offer a unique kind of tool: humans.


Unlike other MCP servers that connect agents to APIs or code, the Pearl MCP Server routes queries to real, credentialed human Experts across hundreds of professional categories.


Pearl’s Tool Registry includes tools like askExpert, designed specifically to escalate queries to real Experts.


Sample workflow using Pearl’s askExpert tool


Pearl MCP Server Workflow
  1. AI Encounter: An AI agent (e.g., via OpenAI's Responses API) receives a query requiring nuanced human judgment.

  2. Recognize Need for Expert: The AI determines that human input is required and selects askExpert.

  3. Invoke via MCP: The AI calls askExpert using the hosted MCP tool and sends the context and question to Pearl MCP.

  4. Route to Expert: Pearl MCP Server forwards the request to a matching verified expert.

  5. Expert Input: The expert provides their review or answer.

  6. Return to AI: Pearl MCP sends the structured expert response back to the AI.

  7. AI Response: The AI incorporates the expert input into the final output or next steps.


This workflow allows AI to recognize its limitations and request human verification - delivering both efficiency and trust.
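Under the hood, steps 3 through 6 correspond to the tools/call request and result defined by the MCP specification. The sketch below shows what that exchange could look like for askExpert; the argument and result fields are hypothetical, since the real schema is advertised by the Pearl MCP Server at runtime via tools/list.

// Hypothetical MCP exchange for askExpert (field names are illustrative only).
// Request sent by the AI client to the Pearl MCP Server (JSON-RPC 2.0, method "tools/call"):
const toolCallRequest = {
  jsonrpc: "2.0",
  id: 42,
  method: "tools/call",
  params: {
    name: "askExpert",
    arguments: {
      // Hypothetical arguments; the real schema comes from the server's tools/list response.
      category: "legal",
      question: "Does early termination without notice breach this consulting contract?",
      context: "Consulting contract signed six months ago; terminated last week without notice.",
    },
  },
};

// Result returned by the Pearl MCP Server once the Expert has answered:
const toolCallResult = {
  jsonrpc: "2.0",
  id: 42,
  result: {
    content: [
      {
        type: "text",
        text: "Based on the termination clause you describe, ending the contract without notice may constitute a breach...",
      },
    ],
    isError: false,
  },
};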


Integrating with OpenAI, Anthropic and Google Gemini


OpenAI’s Responses API supports remote MCP servers, allowing models like gpt-4o to dynamically discover and invoke remote MCP tools. The integration is defined in the tools array when creating a response; see OpenAI’s guide to using the Responses API’s MCP tool.


OpenAI Responses API using the Pearl MCP Server - example:
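A minimal sketch with the OpenAI Node SDK, assuming Bearer-token authentication against https://mcp.pearl.com/mcp; the server label, allowed-tools list, and prompt are illustrative rather than Pearl-documented values:

import OpenAI from "openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Ask the model to answer, escalating to a human Expert through Pearl when needed.
const response = await client.responses.create({
  model: "gpt-4o",
  tools: [
    {
      type: "mcp",
      server_label: "pearl",
      server_url: "https://mcp.pearl.com/mcp",
      headers: { Authorization: `Bearer ${process.env.PEARL_API_KEY}` },
      allowed_tools: ["askExpert"],
      require_approval: "never",
    },
  ],
  input:
    "My consulting contract was terminated early without notice. " +
    "Please have a legal Expert confirm whether this is likely a breach.",
});

console.log(response.output_text);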



Recently, Anthropic announced the MCP connector on the Anthropic API, which enables developers to connect Claude to any remote Model Context Protocol (MCP) server without writing client code.
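A comparable sketch against Anthropic's Messages API using the MCP connector; the beta flag, model name, and field names below follow the connector's beta documentation and should be treated as assumptions:

import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

// Connect Claude to the Pearl MCP Server via the beta MCP connector.
const message = await anthropic.beta.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  betas: ["mcp-client-2025-04-04"], // beta flag gating the MCP connector
  mcp_servers: [
    {
      type: "url",
      url: "https://mcp.pearl.com/mcp",
      name: "pearl",
      authorization_token: process.env.PEARL_API_KEY,
    },
  ],
  messages: [
    {
      role: "user",
      content: "Ask a legal Expert whether early termination without notice breaches my consulting contract.",
    },
  ],
});

console.log(message.content);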



Following this trend, Google has announced native SDK support for Model Context Protocol definitions in the Gemini API, simplifying integration with open-source tools.


This means developers can now leverage the Pearl MCP Server across major LLM ecosystems - OpenAI, Anthropic, Gemini, and others - allowing AI models to interact with real human Experts through standardized MCP tools like askExpert and making human escalation seamless.


Real-world applications


Pearl is already helping in high-trust industries:


  • Healthcare: Flagged symptom-checker results are routed to doctors for review.

  • Legal: AI-drafted contracts reviewed by attorneys.

  • Automotive: Diagnostic bots escalate to mechanics for expert advice.

  • Coding: Complex bugs routed to vetted software engineers.

  • Finance: AI tax suggestions verified by CPAs.


Pearl delivers real human assurance where it matters most.


Getting started


This sample application demonstrates how to connect OpenAI's Responses API with the Pearl MCP Server, enabling AI agents to seamlessly escalate queries to real human Experts using the askExpert tool.


🧩 Features

  • OpenAI Responses API: Leverages OpenAI's Responses API for dynamic interactions.

  • Pearl MCP Integration: Connects to Pearl's remote MCP server to access certified human experts.

  • Secure Authentication: Utilizes Bearer token authentication for secure communication.

  • Real-time Streaming: Supports streaming responses combining AI and human inputs.


🛠️ Prerequisites

  • Node.js: Ensure you have Node.js installed.

  • OpenAI API Key: Obtain from OpenAI.

  • Pearl API Key: Contact Pearl to obtain your API key.


📦 Installation

  1. Clone the repository:

git clone https://github.com/Pearl-com/openai-pearl-mcp-demo.git
cd openai-pearl-mcp-demo

  2. Install dependencies:

npm install

  3. Configure environment variables:

OPENAI_API_KEY=your-openai-api-key
PEARL_API_KEY=your-pearl-api-key

Running the app

Start the development server:

npm run dev

Open your browser and navigate to http://localhost:3000 to interact with the app.


🔗 Tool Configuration

The integration with Pearl's MCP server is defined in the tools array when creating a response:
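A sketch of what that entry could look like; the label and allowed-tools list are illustrative, while the URL and Bearer scheme come from Pearl's documentation above:

// Entry in the Responses API `tools` array pointing at Pearl's MCP server.
const pearlMcpTool = {
  type: "mcp",
  server_label: "pearl",
  server_url: "https://mcp.pearl.com/mcp",
  headers: { Authorization: `Bearer ${process.env.PEARL_API_KEY}` },
  allowed_tools: ["askExpert"], // restrict the model to Pearl's expert-escalation tool
  require_approval: "never",
};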


This configuration allows the AI model to invoke the askExpert tool from Pearl's MCP server, facilitating interactions with human experts.



Example use case


A user asks a legal question:


“Hi, I signed a consulting contract with a company six months ago, and they recently terminated it early without notice. I believe this breaches our agreement. What are my options?”

During the conversation, the AI agent recognizes the need for expert legal input, invokes `askExpert` via the MCP interface, and Pearl routes the query to a certified lawyer. The Expert's response is streamed back and incorporated into the AI's final output.


Example of Pearl MCP Server tool call in action
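A rough sketch of observing such a call with the OpenAI Node SDK's streaming interface; the event and item type names follow the Responses API's streaming events, and the Pearl-specific output is illustrative:

import OpenAI from "openai";

const client = new OpenAI();

// Stream the response so partial text and the MCP tool call can be observed as they arrive.
const stream = await client.responses.create({
  model: "gpt-4o",
  stream: true,
  tools: [
    {
      type: "mcp",
      server_label: "pearl",
      server_url: "https://mcp.pearl.com/mcp",
      headers: { Authorization: `Bearer ${process.env.PEARL_API_KEY}` },
      require_approval: "never",
    },
  ],
  input:
    "I signed a consulting contract six months ago and it was terminated early without notice. What are my options?",
});

for await (const event of stream) {
  if (event.type === "response.output_text.delta") {
    process.stdout.write(event.delta); // model text as it streams
  } else if (event.type === "response.output_item.done" && event.item.type === "mcp_call") {
    // The hosted MCP tool call to Pearl (e.g., askExpert) shows up as an output item.
    console.log("\n[MCP call]", event.item.name, "->", event.item.output);
  }
}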





