Get more context in your AI with MCP, pronto ⇾

Matt Webb kicks off a powerful discussion on the Model Context Protocol (MCP) and why it matters—it’s the search box, silly:

See, “search” has never just been “search.” Search has always been an epistemic journey. You’re building knowledge, you’re not in and out with one query. You google first with a vague intention, you learn some of the vocabulary. You google more, back and forth, and you pick up the terminology and trade-offs – whether you want a hotel or Airbnb, that kind of thing, or which dentist is on your bus route, or that your initial question wasn’t what you meant. Finally you know how to frame your real query, with well-known terms, you perform your actual transaction (book your hotel, write your report, after fighting past the pop-ups and the slop results), and you’re done.

100% agreed. We don’t use the search box just to type some words; we use it to ground ourselves in greater knowledge (which explains the multitude of tabs attached to my web browser at any given time). In fact, the old paradigm of SEO is starting to break as AI vacuums it all up:

[O]rganisations will be in a situation where not only are they not present and showing up in AI chat (so people never visit their parks or buy their bikes) but, for team and infrastructure reasons, they can’t adapt.

For many this will be an existential problem.

I can already see this showing up with my own queries sometimes starting in Claude’s chat box because I can bypass the fluff. And when I need more grounding, I’ll tap into Perplexity.

So why MCP for us developers and designers?

Recap: AI chat is already popular and, to get more popular, it needs plug-ins. Which means developers need to collaborate with AI chat app creators, which means everyone needs to agree on a standard. That’s where Model Context Protocol comes in. Proposed by Anthropic in December 2024:

“MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.”
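Under the hood, the “USB-C port” is JSON-RPC 2.0: an MCP client asks a server what tools it offers (`tools/list`) and then invokes one (`tools/call`). Here’s a rough sketch of the wire format in Python — the tool name `search_episodes` and its arguments are made up for illustration, not part of any real server:

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request, the wire format MCP messages use."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Ask the server which tools it exposes.
list_tools = make_request(1, "tools/list")

# Invoke a tool by name with arguments (tool name is hypothetical).
call_tool = make_request(2, "tools/call", {
    "name": "search_episodes",
    "arguments": {"query": "the Battle of Trafalgar"},
})

print(json.dumps(call_tool, indent=2))
```

The standardization is the whole point: any chat app that speaks this framing can talk to any MCP server, the same way any USB-C device fits any USB-C port.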

I’ve already started playing with MCP tools on Claude, and they have proved immensely useful for a simple use case on my end: querying Airtable databases. It’s rather remarkable: I talk to my database and ask it specific questions to summarize, clarify, and run queries against. It’s like using natural language as an SQL query in a visual, easy-to-understand way. All my CRUD operations become as simple as, “Can you do x in y and tell me more about z? Oh, and make a pretty React page visualizing it, please?”
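To make that concrete, here’s a toy sketch of what a tool handler behind that conversation might look like — the records, field names, and `query_records` function are all invented for illustration, not the actual Airtable MCP server:

```python
# Stand-in for rows fetched from the Airtable API.
RECORDS = [
    {"Name": "Acme Corp", "Status": "Active", "Deals": 3},
    {"Name": "Globex", "Status": "Churned", "Deals": 1},
]

def query_records(status=None):
    """Hypothetical tool handler: filter records by Status,
    like a WHERE clause in SQL."""
    rows = RECORDS
    if status is not None:
        rows = [r for r in rows if r["Status"] == status]
    return rows

# The model translates "show me my active accounts" into a tool call:
print(query_records(status="Active"))
```

The model does the natural-language-to-parameters translation; the tool just runs a plain, deterministic query.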

Matt proceeds to build his own MCP server to query his project Braggoscope, his bespoke archive of the BBC’s In Our Time, using Cloudflare’s workers-mcp to spin it up, then queries it from a local client.

What’s interesting is that Claude treats my Braggoscope MCP server a bit like side-loading knowledge, its own Neo in the Matrix I know kung fu moment. There was nothing in the data source that said “this is excellent” – the top episode was just a good match for the search term, and it confabulated from there.

And yes, you can use Claude to code your MCP server as well (next on my list of fun experiments to try). 

It’s just been awesome to watch my LLM create directories on my computer (don’t worry, you can restrict its access), drop content into markdown files, and then plug into the Unsplash API to grab associated images and build out draft content for me to build on. I mean, sheesh, a year ago I was going crazy over GPT-4, and now this—all of this—is moving at lightning speed.

The entire essay is worth reading. Ping me at hello@ashrafali.net if you find some useful use cases.