Stop Copy-Pasting Into ChatGPT: How MCP Servers Actually Solve the Context Problem

TLDR: MCP servers let your AI agents directly access your company's data and tools. No more copy-paste, no more context switching. Just ask questions, get answers that actually know your business. Here's how to start using them.

If you have not read our first article in this series, check out: https://blog.elva-group.com/moving-past-the-ai-hype-introducing-mcp

The Copy-Paste Marathon We All Know

Picture this: You need to know who in your 500-person company has experience with Kubernetes and AWS.

Old way: Open HR system, export to Excel, search LinkedIn, check Slack, compile a list, hope it's current. Time: 45 minutes.

New way: "Who in our company knows Kubernetes and has AWS experience?" Time: 5 seconds.

The difference? MCP servers - and here's exactly how to use them.

Context Is Everything: Why Your AI Needs Direct Access

Here's the fundamental problem with AI today: it only knows what you tell it. Every conversation starts from zero. You're the human middleware, copying and pasting context that should already be obvious.

Everything about working with LLMs is moving toward what's becoming known as context engineering: preparing the right context for the question being asked, whether that's answering a question about personnel, writing code, or drafting an article about your favorite animal.

MCP (Model Context Protocol) servers solve this by creating secure bridges between AI agents and your actual tools. Your AI can directly:

  • Query your employee database

  • Check project documentation

  • Pull analytics from your dashboards

  • Search through your knowledge base

  • Access your CRM data

  • Read from your project management tools

Technically, MCP servers are JSON-RPC servers that expose your internal tools as functions AI agents can call. They provide a standardized way for AI to interact with any data source or API you want to expose - with built-in discovery, typing, and documentation. Once you connect a tool via MCP, any AI agent that supports the protocol can use it.
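
To make that concrete, here is a minimal sketch of what such a server can look like, written against the official TypeScript SDK (@modelcontextprotocol/sdk). The server name, the tool, its parameters, and the findEmployees helper are illustrative placeholders for whatever internal system you would wrap, not a real integration:

    // employee-skills-server.ts - minimal local MCP server (sketch)
    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import { z } from "zod";

    // Stand-in for a real HR/database lookup.
    async function findEmployees(skills: string[]): Promise<string[]> {
      return [`(example) employees matching: ${skills.join(", ")}`];
    }

    const server = new McpServer({ name: "employee-skills", version: "0.1.0" });

    // Expose one internal lookup as a typed, documented tool that agents can discover and call.
    server.tool(
      "find_employees_by_skills",
      "Find employees who have all of the given skills",
      { skills: z.array(z.string()).describe("Skill names, e.g. ['Kubernetes', 'AWS']") },
      async ({ skills }) => {
        const matches = await findEmployees(skills);
        return { content: [{ type: "text", text: JSON.stringify(matches) }] };
      }
    );

    // Local servers communicate with the client over stdio.
    await server.connect(new StdioServerTransport());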

When AI already knows your employee skills, project history, and team structures, you can skip the copy-paste marathon and just ask: "Who should I talk to about Python performance issues in our data platform?" - turning hours of context-gathering into seconds of conversation.

Local vs Remote MCP Servers: Choose Your Approach

Before diving into setup, it's important to understand the two types of MCP servers (a short configuration sketch follows the lists below):

Local MCP Servers:

  • Execute directly on your machine as a process

  • Use stdio (standard input/output) for communication

  • Direct access to your local files and databases

  • Data never leaves your machine

  • Example: Node.js script accessing your local SQLite database

Remote MCP Servers:

  • Run on external infrastructure (cloud servers, managed services)

  • Communicate over HTTP/HTTPS protocols

  • No local execution needed - just network access

  • Maintained and scaled by the provider

  • Example: AWS Knowledge Server running on AWS infrastructure
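
The practical difference shows up in how a client connects. As a rough sketch (the server name and path here are placeholders), a local stdio server is registered in Claude Desktop's claude_desktop_config.json by telling the client which command to launch, while a remote server like the AWS Knowledge Server is added by URL instead, for example via the Connectors UI shown in the setup steps below:

    {
      "mcpServers": {
        "employee-skills": {
          "command": "node",
          "args": ["/absolute/path/to/employee-skills-server.js"]
        }
      }
    }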

We're seeing a clear shift toward remote MCP servers, and for good reason. While today most people start with local servers running on their machines, the future is organizations hosting their own internal libraries of MCP servers - accessible to all employees but secured within company boundaries.

Imagine your company's private MCP ecosystem: HR data servers, documentation servers, analytics servers - all running in your cloud, accessible to your AI agents. Employees connect their AI tools to these internal servers, just as they do with the AWS Knowledge Server today.

So, How Do I Start Using This?

Let's get practical with a real example. AWS Labs provides a remote MCP server that gives AI agents direct access to AWS documentation, API references, and best practices. Perfect for our AWS-focused teams. For these examples, we'll be using the desktop client from Anthropic, Claude Desktop.

Step 1: Install Claude Desktop

Download Claude Desktop - it's Anthropic's native app with built-in MCP support. Other clients like Cursor work too, but we'll use Claude Desktop for this example.

Step 2: Configure the AWS Knowledge MCP Server

To add a remote server in Claude Desktop, you use what it calls Connectors. Go to Settings > Connectors and choose Add custom connector.

Enter the URL of the MCP server, as documented at https://awslabs.github.io/mcp/servers/aws-knowledge-mcp-server: https://knowledge-mcp.global.api.aws

Step 3: Try it out

Once the connector is added (restart Claude Desktop if it doesn't show up right away), try asking AWS-specific questions:

  • "What's the best practice for setting up multi-account AWS organizations?"

  • "Show me how to configure S3 bucket policies for cross-account access."

Now the magic happens. When the AI recognizes that a question would benefit from one of the connected MCP servers, it calls that server with a query based on what you asked and works the results into its answer.

This approach gives you an enhanced answer that includes up-to-date and question-specific data.
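
Under the hood, this is just MCP's tool-calling flow over JSON-RPC. Purely as an illustration (the AWS Knowledge Server's actual tool names and schemas may differ), the client's request looks roughly like:

    {"jsonrpc": "2.0", "id": 1, "method": "tools/call",
     "params": {"name": "search_documentation",
                "arguments": {"query": "S3 bucket policy cross-account access"}}}

and the server responds with content the model can read and work into its answer:

    {"jsonrpc": "2.0", "id": 1,
     "result": {"content": [{"type": "text",
                "text": "...relevant documentation excerpts..."}]}}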

If you're a developer, like many of us are, creating your own MCP server is simple. We've published a sample repo on our GitHub with a local test MCP server you can use as a starting point for your own. Check out the repo at: https://github.com/elva-labs/mcp-blog-demo
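
If you want to poke at a local server before wiring it into a client, the MCP Inspector (the project's browser-based test UI) can launch it for you. Assuming your built entry point is dist/server.js (a placeholder path), it looks something like:

    npx @modelcontextprotocol/inspector node dist/server.js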

Ready to Enrich Your AI Workflows With Your Own Data?

Whether you're looking to connect your internal databases, documentation systems, or analytics platforms, the path forward is clear: give your AI the context it needs to actually help you.

Need help implementing MCP servers in your organization or want to discuss your use case? Don't hesitate to reach out - we're helping teams navigate this transition every day and would love to hear about your challenges.


If you enjoyed this post, want to know more about me, working at Elva, or just want to reach out, you can find me on LinkedIn.


Elva is a serverless-first consulting company that can help you begin or transform your AWS journey for the future.