Vibe deploy your next app: Introducing Opalstack’s Model‑Context‑Powered Future.
From XML‑RPC to Vibe Deploy
Opalstack’s core platform is built from the ground up, not an off‑the‑shelf cPanel clone. We’re wholly owned and operated by experienced software developers, many of whom have worked together for almost two decades. We built the platform to solve our own problems: rapid deployment, easy management, and rock‑solid support.
From day one the goal was a modern interface, so we shipped a fully REST‑compliant JSON API. Authentication uses a bearer token in the Authorization header, and every endpoint comes with copy‑paste curl examples for GET and POST. Power users can streamline workflows with embedded objects—e.g. /osuser/list/?embed=server returns both the user and its server in a single call—keeping automation lean and self‑contained.
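For illustration, here is a minimal sketch of scripting that same call, assuming the API root lives at /api/v1/ (the schema path referenced later in this post) and using a placeholder token; it is not official client code, and the response fields shown are assumptions.

import requests

# Placeholder bearer token issued from the Opalstack dashboard.
headers = {"Authorization": "Bearer ABC123"}

# List shell users with the related server embedded in the same response.
# The /api/v1/ base path is an assumption based on the schema URL at /api/v1/doc/.
resp = requests.get(
    "https://my.opalstack.com/api/v1/osuser/list/",
    params={"embed": "server"},
    headers=headers,
)
resp.raise_for_status()

# Assumes the endpoint returns a JSON list; field names here are illustrative.
for osuser in resp.json():
    print(osuser["name"], "->", osuser["server"])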
Our API now exposes an /mcp endpoint. At first glance the path looks unremarkable – it accepts GET, POST, and DELETE – but it marks the beginning of something extraordinary.
Why the Model Context Protocol (MCP) matters
The Model Context Protocol (MCP) is an emerging open standard that lets AI models call external tools in a safe, structured way. According to the official specification, MCP consists of several layers: a base protocol built on JSON‑RPC message types, lifecycle management for connection setup, an authorization framework for HTTP transports, and optional server and client features. Servers expose resources, prompts, and tools, while clients can provide sampling and filesystem roots. All messages follow the JSON‑RPC 2.0 specification, ensuring that requests, responses, and notifications are predictable. The protocol’s modular design allows implementations to support exactly the features they need.
This makes MCP a perfect match for a hosting platform. It provides a secure, stateless channel where an AI agent can ask a server to perform an action (like creating a database or deploying an app) and receive structured results. Because MCP is transport‑agnostic, it can work over HTTP, WebSockets, or even standard I/O. And because it is built on JSON‑RPC, it integrates easily with our existing API.
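To make those message shapes concrete, here is a rough sketch of a JSON‑RPC 2.0 request and response for MCP’s tools/call method, written as Python dictionaries; the tool name and arguments are hypothetical placeholders, not Opalstack’s actual manifest.

# Sketch of the JSON-RPC 2.0 envelopes exchanged over MCP (illustrative only).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",            # MCP method for invoking a server-side tool
    "params": {
        "name": "create_database",     # hypothetical tool name
        "arguments": {"engine": "postgresql", "name": "blog_db"},
    },
}

response = {
    "jsonrpc": "2.0",
    "id": 1,                           # matches the request id
    "result": {
        "content": [{"type": "text", "text": "Database blog_db created."}],
        "isError": False,
    },
}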
Opalstack × MCP: natural‑language deployment (i.e. vibe deploy your app)
We have always loved new tech, and we’ve been following large‑language‑model research in our spare time. When the Model Context Protocol was released earlier this year, we immediately recognized its potential. Within weeks we exposed an /mcp endpoint on the Opalstack API. This endpoint serves as a wrapper around our existing API operations, but it also publishes a catalogue of tools, defined by manifests, to any MCP‑compatible AI agent. You can now vibe deploy your next app using natural language.
Here’s why this matters for you:
- Natural‑language deployment: Because MCP uses JSON‑RPC and our API uses JSON, a conversational agent can now ask the Opalstack MCP endpoint to “create a new WordPress site named blog on my domain” or “add a Django app and set up a PostgreSQL database” and receive the structured calls needed to make it happen. No cURL commands, no copy‑pasting tokens. You simply describe your intent.
- Seamless IDE integration: The endpoint is designed to work with agentic workflows in editors like VS Code. As you code you can chat with the agent, ask it to deploy, tweak configurations, or provision resources. The manifest system ensures that the agent calls each tool with the right parameters and returns human‑friendly explanations; a sketch of a manifest entry follows this list.
- Managed security: MCP’s authorization layer defines how the agent authenticates and what permissions it has. Because our API already requires token‑based auth, we can map those tokens directly, ensuring that the agent can’t exceed your account’s privileges.
- Future‑proof architecture: MCP’s modular design allows us to add new capabilities over time. As the protocol evolves, we can publish additional tools without breaking existing integrations. Our shift to AlmaLinux 9 in 2025 is another example of how we embrace change; we evaluated multiple distributions and chose AlmaLinux because of its widespread adoption and ease of migration.
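To give a rough idea of what a published manifest entry looks like, here is a hedged sketch of a tool definition in the general shape MCP’s tools/list returns (a name, a description, and a JSON Schema for inputs); the description and parameters shown for mcp_opalstack_mariadb are illustrative assumptions, not the real manifest.

# Illustrative tool definition, in the general shape returned by MCP tools/list.
# The description and input schema below are assumptions, not Opalstack's real manifest.
tool_manifest = {
    "name": "mcp_opalstack_mariadb",
    "description": "Create, list, or delete MariaDB databases on your account.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "action": {"type": "string", "enum": ["create", "list", "delete"]},
            "name": {"type": "string", "description": "Database name"},
            "server": {"type": "string", "description": "Target server UUID"},
        },
        "required": ["action"],
    },
}

Because every tool ships a schema like this, the agent knows which parameters exist and which are required before it ever makes a call.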
Vibe‑code, vibe deploy without containers
Developers come to Opalstack because they want to ship quickly. We provide one‑click installers for WordPress, Django, Ruby on Rails, and more, but we also let you build and deploy custom apps without hiding the underlying system. Our philosophy is that you shouldn’t need Docker or Kubernetes to run a small project. You get full SSH and SFTP access in a managed OS environment. There’s no root access required, so you can focus on your code while we handle the hosting.
That philosophy extends to our support: our staff are developers who debug server configuration and application code because they enjoy it. We’re a close‑knit team bound by a shared passion for open‑source software. When you’re experimenting with a new framework or pushing an LLM‑powered agent to production, we’re right there with you.
Built for the long haul
We’ve been on this journey since the early days of Python web frameworks. Along the way we’ve seen hosts come and go. Opalstack isn’t a venture‑funded experiment; it’s a company owned and operated by developers. We keep our team small and cross‑trained so that we can respond quickly and stay accountable. We have no hidden fees – email, SSL and DNS are included in every plan. When customers asked for dark mode or for the ability to route traffic to a single domain, we added those features. When the hosting world began experimenting with MCP we were ready.
Our commitment to innovation doesn’t mean we throw caution to the wind. We roll out new OS builds slowly and deliberately because real‑world applications are more complex than any lab test. We listen to feedback and prioritise stability over hype. As we integrate MCP into more of our tooling we’ll continue to refine the manifests and prompts so that your LLM agents behave predictably and securely.
Join the next chapter
We invite you to explore the Opalstack MCP endpoint and start building your own agent‑driven workflows. As always, our support team is eager to help. Whether you’re spinning up a WordPress blog, orchestrating a Django microservice, or experimenting with AI agents, Opalstack is here to make your vision sparkle.
Many MCP clients are now available; the one we have done the most testing with is VS Code with GitHub Copilot. Below is the configuration needed to connect to the MCP server, where ABC123 is the bearer token issued within your dashboard.

Install the GitHub Copilot extension via the Extensions tab.
Click this link to automatically configure the MCP server: configure MCP for me. Or you can add it manually by following the next steps. Either way, make sure you use the bearer token that you have issued from the dashboard. If you have configuration left over from earlier versions of VS Code, use only the format below, and only on the most recent version of VS Code.
Type >MCP: Open User Configuration in the VS Code search to open the config file, then add the following:

{
    "servers": {
        "opalstack": {
            "url": "https://my.opalstack.com/mcp",
            "type": "http",
            "headers": {
                "Authorization": "Bearer ABC123"
            }
        }
    }
}

You can issue and manage your bearer tokens at https://my.opalstack.com/tokens/.

Make sure you change the chat mode from ‘Ask’ to ‘Agent’. This is known as ‘agentic AI’: the ability to perform long chains of tasks. Once this is done you can run the /list command and it will return the mcp_opalstack_* toolkit as well as the other tools your IDE has available.
Meet the toolbox (aka “the buttons your AI can push”)
Behind the scenes, every chat command you fire at Opalstack is translated into one of 21 purpose‑built MCP tools—each a wrapper around our JSON REST API (full schema lives at /api/v1/doc/). Think of them as Lego bricks your agent stacks together to get real work done (a sketch of a direct call follows the list):
mcp_opalstack_account # create/read your account profile
mcp_opalstack_address # forward‑only or full mail addresses
mcp_opalstack_application # deploy Django, Laravel, Node, static, you name it
mcp_opalstack_cert # issue/renew Let’s Encrypt certs
mcp_opalstack_dns # manage records without leaving your editor
mcp_opalstack_domain # add or park domains in one shot
mcp_opalstack_installer_urls # fetch our 1‑click installer library
mcp_opalstack_ip # list dedicated or shared IPs
mcp_opalstack_mailuser # mailbox CRUD (quota, passwords, etc)
mcp_opalstack_mariadb # spin up MariaDB databases
mcp_opalstack_mariauser # grant DB creds
mcp_opalstack_notice # surface panel notifications to your bot
mcp_opalstack_osuser # sandboxed Linux users for each app
mcp_opalstack_psqldb # PostgreSQL 17 in two keystrokes
mcp_opalstack_psqluser # role‑based PSQL access
mcp_opalstack_server # get server health & resource data
mcp_opalstack_site # map domains → apps → SSL in one call
mcp_opalstack_token # issue or revoke API tokens
mcp_opalstack_tokenacl # fine‑grain ACLs for shared accounts
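If you are the “show me the wires” type, here is a hedged sketch of poking the /mcp endpoint directly over HTTP with the same dashboard‑issued bearer token as the config above; real MCP clients perform an initialization handshake and manage the session for you, so treat this as an illustration of the transport and message shape rather than a supported recipe.

import requests

# Hedged sketch: ask the /mcp endpoint for its published tool catalogue.
resp = requests.post(
    "https://my.opalstack.com/mcp",
    headers={
        "Authorization": "Bearer ABC123",                      # dashboard-issued token
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",       # typical for MCP over HTTP
    },
    json={"jsonrpc": "2.0", "id": 1, "method": "tools/list"},  # enumerate available tools
)
print(resp.status_code, resp.text)  # expect the mcp_opalstack_* catalogue in the result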
Why it matters:
- No hidden magic—each tool is a first‑class, versioned endpoint bound to real JSON you can curl if you’re the “show me the wires” type.
- Agents chain them together transparently, so “launch a staging Django with Postgres + Redis” is a single English sentence, not a 12‑step shell script.
- You still get raw API access when you need to color outside the lines—MCP just saves you from boilerplate 95% of the time.
Bottom line: Opalstack MCP turns our rock‑solid API into an instant‑action command palette for both humans and AI. Less YAML, more “go live” button‑mashing. Go vibe‑code and vibe deploy something wild and let us know what you ship!