Building Your First No-Code MCP Server: The Fabric Integration Story
How I created a Model Context Protocol server for Fabric without writing a single line of code
The Challenge: When Great Tools Don’t Talk to Each Other
Picture this: You’re deep in a coding session using Cursor, and you need that perfect AI prompt pattern to solve a complex problem. You know it exists in Fabric — Daniel Miessler’s incredible collection of AI interaction patterns — but accessing it means switching contexts, opening a browser, navigating to the pattern, copying it, and pasting it back into your workflow.
Meanwhile, your MCP (Model Context Protocol) clients like Cursor and GitHub Copilot are sitting there, capable of so much more if they could just tap into Fabric’s treasure trove of more than 200 curated AI patterns.
The friction was real, and it was slowing me down.
That’s when I decided to build a bridge between these two powerful ecosystems. The twist? I accomplished this without writing a single line of custom code.
What Exactly is MCP and Why Should You Care?
Before diving deeper, let’s clarify what we’re working with. The Model Context Protocol (MCP) is an emerging standard that allows AI applications to securely access external data sources and tools. Think of it as a universal translator that lets your AI assistant talk to databases, APIs, file systems, and other services in a standardized way.
Why does this matter? Instead of AI tools being isolated islands, MCP enables them to become connected ecosystems. Your code editor can access your project documentation, your chat assistant can pull real-time data, and your development tools can leverage specialized AI patterns — all seamlessly.
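To make that "standardized way" concrete, every MCP interaction is a JSON-RPC 2.0 message. The Python sketch below shows the general shape of a tool call; the tool name get_patterns_names is one of the Fabric operations this article exposes later and is used here purely for illustration.

```python
# Minimal sketch of what an MCP tool call looks like on the wire.
# MCP is built on JSON-RPC 2.0; the tool name below is illustrative.
import json

tool_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_patterns_names",  # which tool the client wants to run
        "arguments": {},               # tool-specific input (none needed here)
    },
}

print(json.dumps(tool_call, indent=2))
```

Any MCP client and any MCP server that agree on this envelope can talk to each other, regardless of what sits behind the server.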
Fabric, on the other hand, is a tool for managing and utilizing AI patterns — reusable prompts and workflows that solve specific problems. It’s like having a master chef’s recipe collection, but for AI interactions.
The No-Code Breakthrough: Standing on Giants’ Shoulders
The solution came when I discovered MCP Access Point by sxhxliang. This project is pure genius — a protocol conversion gateway that transforms any HTTP service into MCP-compatible operations.
Here’s the beautiful part: Instead of building a custom MCP server from scratch (which would require deep protocol knowledge, extensive testing, and ongoing maintenance), I could leverage existing infrastructure to create exactly what I needed.
The MCP Access Point works by:
- Reading OpenAPI specifications to understand your HTTP service
- Automatically generating MCP tool definitions
- Converting MCP requests into HTTP calls
- Translating responses back to MCP format
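To make those four steps concrete, here is a minimal Python sketch of the idea. It is not MCP Access Point's actual code: it reads operations from an OpenAPI document, turns each one into an MCP-style tool definition, and shows how a tool call could be dispatched as an HTTP request. The helper names and the simplified input schema are illustrative only.

```python
# Conceptual sketch of OpenAPI -> MCP conversion (NOT MCP Access Point's code).
# Assumes the OpenAPI spec has been saved locally as openapi.json.
import json
import urllib.request

HTTP_METHODS = {"get", "post", "put", "delete", "patch"}

def build_tool(path: str, method: str, operation: dict) -> dict:
    """Turn one OpenAPI operation into an MCP-style tool definition."""
    default_name = f"{method}_{path.strip('/').replace('/', '_')}"
    return {
        "name": operation.get("operationId", default_name),
        "description": operation.get("summary", ""),
        "inputSchema": {"type": "object", "properties": {}},  # simplified
    }

def call_tool(base_url: str, path: str, method: str) -> str:
    """Translate a tool invocation back into the underlying HTTP request."""
    req = urllib.request.Request(base_url + path, method=method.upper())
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

if __name__ == "__main__":
    with open("openapi.json") as f:
        spec = json.load(f)
    # Expose every operation in the spec as an MCP-style tool definition.
    tools = [
        build_tool(path, method, op)
        for path, methods in spec.get("paths", {}).items()
        for method, op in methods.items()
        if method in HTTP_METHODS
    ]
    print(json.dumps(tools, indent=2))
```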
This project literally wouldn’t exist without MCP Access Point. It’s a perfect example of how one person’s infrastructure work can unlock dozens of downstream innovations. When I say I built this without code, I really mean I orchestrated existing solutions — which is often the smartest approach.
The OpenAPI Puzzle: When AI Becomes Your Research Assistant
Here’s where things got interesting. The MCP Access Point needed an OpenAPI specification to understand Fabric’s REST API, but Fabric hadn’t published one publicly. I had two options:
- Manually reverse-engineer the API by reading source code and testing endpoints (tedious and error-prone)
- Leverage AI to analyze the codebase and generate the specification automatically
I chose option 2 and imported the entire Fabric codebase into Gemini 2.5 Pro. The conversation went something like this:
“Hey Gemini, can you analyze this Go codebase and generate a comprehensive OpenAPI 3.0 specification for all the REST endpoints?”
What happened next was remarkable. Gemini methodically:
- Analyzed the routing structure
- Identified all HTTP handlers
- Extracted request/response schemas
- Documented query parameters and path variables
- Generated proper OpenAPI documentation with descriptions
The result? A comprehensive openapi.json file that perfectly captured Fabric's API surface.
Full credit to Gemini 2.5 Pro for this crucial work. This isn’t just about saving time — it’s about accuracy. Gemini caught endpoint details and parameter nuances that I might have missed during manual documentation.
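Before wiring an AI-generated spec into the gateway, it is worth sanity-checking what the model produced. A small script like the sketch below (assuming the spec sits next to it as openapi.json) lists every path, method, and operationId and flags operations that are missing a description.

```python
# Quick sanity check of an AI-generated OpenAPI file before using it.
# Assumes the spec was saved as openapi.json in the current directory.
import json

HTTP_METHODS = {"get", "post", "put", "delete", "patch"}

with open("openapi.json") as f:
    spec = json.load(f)

print(f"OpenAPI version: {spec.get('openapi')}")
for path, methods in spec.get("paths", {}).items():
    for method, operation in methods.items():
        if method not in HTTP_METHODS:
            continue  # skip path-level keys like "parameters"
        op_id = operation.get("operationId", "<missing operationId>")
        described = operation.get("description") or operation.get("summary")
        flag = "" if described else "  <- no description"
        print(f"{method.upper():6} {path:40} {op_id}{flag}")
```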
A note to the amazing Fabric team: Publishing an official OpenAPI specification would be incredibly valuable for the ecosystem. It would enable integrations like this one and many others that we probably haven’t even imagined yet.
The Architecture: Simple Yet Powerful
The resulting solution follows a clean, containerized architecture:
MCP Clients (Cursor, Copilot) → MCP Access Point → Fabric REST API
Using Docker Compose, the system orchestrates two key services:
- Fabric Service (Port 9991): The core AI pattern management system
- MCP Access Point (Port 9999): The protocol conversion gateway
The beauty of this approach is its simplicity. The MCP Access Point reads the OpenAPI specification and automatically converts REST endpoints into MCP operations that clients can discover and use.
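As a quick way to confirm both containers are actually listening after you bring the stack up, a check like the sketch below works. It only assumes the two ports above and the gateway verification endpoint shown later in the quick start; the Fabric base URL path is an assumption, and any HTTP response (even a 401 or 404) is treated as proof that the service is up.

```python
# Minimal reachability check for the two services in docker-compose.yml.
# Any HTTP response (even 401/404) proves the container is listening.
import urllib.error
import urllib.request

CHECKS = {
    "Fabric REST API": "http://localhost:9991/",           # base URL assumed
    "MCP Access Point": "http://localhost:9999/api/fabric/mcp",
}

for name, url in CHECKS.items():
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(f"{name}: up (HTTP {resp.status})")
    except urllib.error.HTTPError as err:
        print(f"{name}: up (HTTP {err.code})")   # responded, just not 200
    except (urllib.error.URLError, OSError) as err:
        print(f"{name}: unreachable ({err})")
```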
Real-World Impact: From Theory to Daily Workflow
I connected the server to GitHub Copilot, and the integration exceeded my expectations. Here’s what the actual experience looks like:
Before: Need a specific Fabric pattern? Open browser → Navigate to Fabric → Find pattern → Copy → Switch back to editor → Paste → Format
After: In Copilot’s Agent mode, I type a message like “Suggest me a pattern for: I need to analyze code for security issues” and Copilot responds with:
For analyzing code for security issues, I recommend these patterns from your Fabric setup:
- coding_master: Provides expert-level code review, including security analysis and best practices.
- write_semgrep_rule: Helps you create custom Semgrep rules for static code analysis to detect vulnerabilities.
- write_nuclei_template_rule: Useful for web application security, enabling you to create Nuclei templates for vulnerability scanning.
If these do not fully meet your needs, you can use create_pattern to design a custom security analysis pattern tailored to your workflow.
Let me know if you want instructions or examples for any of these patterns!
Then, I use the coding_master pattern by sending the following message to the Copilot Agent: “Use the content of the coding_master pattern as your system prompt in the next interaction, and then reset to the default system prompt.”
If you are not sure which pattern to use for a given task, you can also call suggest_pattern to have Fabric recommend an appropriate one.
I successfully tested these key operations:
- get_patterns_names - Instantly browse all available patterns
- get_patterns_exist_by_name - Verify pattern availability before using it
- get_patterns_by_name - Fetch complete pattern content with formatting
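If you want to exercise these operations outside an editor, a short script against the gateway's SSE endpoint works too. The sketch below assumes the official MCP Python SDK (pip install mcp) and its sse_client/ClientSession interface; treat it as a starting point rather than a recipe, and adapt it to whatever client library you prefer.

```python
# Minimal MCP client sketch using the official Python SDK (pip install mcp).
# Assumes the gateway's SSE endpoint from the quick start section.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    async with sse_client("http://localhost:9999/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover every tool the gateway generated from the OpenAPI spec.
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # Call one of the operations tested above.
            result = await session.call_tool("get_patterns_names", arguments={})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```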
But the real magic happened during actual development work. The patterns weren’t just available — they were contextually suggested. Working on documentation? The create_design_document pattern appeared as a suggestion. Debugging complex code? coding_master was right there.
This is what seamless integration should feel like.
Transparency: Current Limitations and Growth Areas
No project is perfect, and I believe in honest communication about what works and what needs improvement.
Current limitations I’ve identified:
- One minor compatibility issue, documented as Issue #10 in the MCP Access Point repository (thanks to sxhxliang, who fixed it in no time!)
- Requires Docker knowledge for deployment
What’s working exceptionally well:
- Rock-solid pattern retrieval and browsing
- Excellent performance with multiple concurrent clients
- Seamless integration with GitHub Copilot
This transparency isn’t just about managing expectations — it’s about building trust and encouraging community contributions. When we openly discuss limitations, we create opportunities for collaborative problem-solving.
The Open Source Advantage
This project exemplifies the power of open-source collaboration:
- Fabric provides the core pattern management capabilities
- MCP Access Point offers the protocol conversion gateway
- Gemini 2.5 Pro assisted with API specification generation
- Docker enables easy deployment and containerization
Each component, built by different teams and individuals, comes together to create something greater than the sum of its parts.
Quick Start: Get Running in Under 5 Minutes
Ready to try this yourself? I’ve made the setup as straightforward as possible:
# 1. Clone and enter the project
git clone https://github.com/limcheekin/mcp-proxy-to-api-nocode.git
cd mcp-proxy-to-api-nocode
# 2. Set your Fabric API key (get one from Fabric's setup process)
export FABRIC_API_KEY=your_actual_api_key_here
# 3. Launch everything
docker-compose up -d
# 4. Verify it's working
curl http://localhost:9999/api/fabric/mcp
Connection endpoints for your MCP clients:
- SSE Protocol: http://localhost:9999/sse (recommended for Cursor)
- Streamable HTTP: http://localhost:9999/mcp/ (works with most clients)
Pro tip: Use the MCP Inspector (npx @modelcontextprotocol/inspector) to test and debug your connection before integrating with your main tools.
The entire setup should be running smoothly within 5 minutes. If you encounter issues, the troubleshooting section in the README covers the most common scenarios.
Join the Movement: How to Contribute
This project thrives on community involvement. Here’s how you can be part of the story:
🚀 Try it yourself: Deploy the server and test it with your preferred MCP client. Even 10 minutes of experimentation provides valuable feedback.
📚 Documentation and examples:
- Share integration guides for different MCP clients
- Create video tutorials or blog posts
- Document advanced configuration scenarios
🐛 Quality assurance:
- Report bugs with detailed reproduction steps
- Test edge cases and unusual configurations
- Validate performance under different conditions
💡 Strategic thinking:
- Suggest new integration targets (what other REST APIs would benefit from MCP conversion?)
- Identify workflow improvements
- Propose architectural enhancements
The most valuable contribution might be the simplest: just use it and tell us what works (or doesn’t work) for your specific needs.
Every contribution, no matter how small, pushes the entire ecosystem forward.
Your Stories Matter: Real-World Usage Patterns
I’m genuinely curious about how you’ll use this integration, because user stories often reveal use cases that creators never imagined.
Some scenarios I’m excited to hear about:
For Developers:
- How do you integrate Fabric patterns into your daily coding workflow?
- Which patterns prove most valuable in your specific domain?
- Are you using this with team workflows or just individual development?
For Content Creators:
- How do extraction and analysis patterns change your research process?
- Are you building custom workflows that combine multiple patterns?
For Educators and Trainers:
- How might this help teach AI interaction best practices?
- Could this make AI pattern libraries more accessible to students?
Please share your experiences — positive or negative. Every data point helps improve not just this project, but the broader ecosystem of AI tool integration.
Drop a comment below or reach out on GitHub. I’m particularly interested in patterns that work exceptionally well (or surprisingly poorly) in your specific use cases.
Conclusion: When Infrastructure Enables Innovation
Building this MCP server taught me something profound about modern software development: the most impactful projects often come from connecting existing tools in novel ways, rather than building everything from scratch.
This integration exists because:
- Daniel Miessler built an incredible AI pattern framework
- sxhxliang created a brilliant protocol conversion gateway
- Google’s Gemini team built AI capable of understanding and documenting complex codebases
- The Docker team made complex orchestration accessible
- The MCP specification authors created a standard that enables this kind of interoperability
My contribution was recognizing how these pieces could fit together and doing the orchestration work to make it happen.
This is the future of development: standing on the shoulders of giants, leveraging AI as a collaborative partner, and focusing human creativity on the connections and combinations that create new value.
The friction between great tools is disappearing. The question isn’t whether you can connect your favorite services — it’s what amazing workflows you’ll create when everything talks to everything else.
What will you connect next?
Ready to build your own no-code MCP integrations? Check out the project repository and join the conversation. The future of seamless AI tool integration is being built one connection at a time.