Building a Multi-Account Google MCP Server for Claude Code
Giving AI assistants native access to Gmail, Drive, and Calendar across multiple Google accounts
AI coding assistants like Claude Code are remarkably capable when it comes to writing code, managing git repositories, and navigating file systems. But the moment you need to check an email, find a document on Google Drive, or schedule a meeting, you hit a wall. The assistant can't reach outside its sandbox. The Model Context Protocol (MCP) changes this by letting AI assistants call external tools through a standardised interface. When I looked for an MCP server that could handle Gmail, Google Drive, and Google Calendar across multiple accounts simultaneously, nothing existed. So I built mcp-google-multi, an open-source MCP server that exposes 47 tools across all three Google services, with full multi-account support.
Why Multi-Account Support Matters
Most developers and founders don't live in a single Google account. I juggle three: a company account, a personal account, and a client-facing account. Each has its own inbox, Drive, and calendar. Existing Google MCP implementations I found were single-account. You'd have to run separate server instances, manage separate configs, and manually tell the assistant which one to use. That defeats the purpose of having an AI assistant that can operate fluidly across your actual workflow.
With mcp-google-multi, you define all your accounts in a single .env file:
```
GOOGLE_ACCOUNTS=work:[email protected],personal:[email protected]
```
Every tool accepts an account parameter, so the assistant can seamlessly switch between accounts mid-conversation. "Search my work inbox for invoices from last week, then check my personal Drive for the contract template" becomes a single, uninterrupted interaction.
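On the wire, that looks like an ordinary MCP `tools/call` request with the account alias as just another argument. The exact parameter names below are illustrative, but the shape follows the MCP JSON-RPC convention:

```json
{
  "method": "tools/call",
  "params": {
    "name": "gmail_search",
    "arguments": {
      "account": "work",
      "query": "from:accountant has:attachment newer_than:7d"
    }
  }
}
```

The next call in the same conversation can simply swap `"account": "work"` for `"account": "personal"`; nothing else about the invocation changes.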
The 47-Tool Surface
I deliberately chose breadth over depth. Rather than building a few generic tools and hoping the LLM could compose them, I built specific tools that map directly to common operations. The reasoning is simple: an LLM makes better decisions when it can pick from gmail_search, gmail_send, gmail_modify_labels, and gmail_create_draft than when it has a single gmail_do_thing tool with a complex parameter space.
Gmail: 21 Tools
The Gmail tools cover the full lifecycle of email management. Search with Gmail query syntax, read individual messages or entire threads, send emails, create and manage drafts, and handle labels including batch operations across up to 1,000 messages. There's also support for downloading attachments, reading mailbox history, and managing vacation responder settings. The gmail_batch_modify and gmail_batch_delete tools were important to include because one of the most common AI-assisted workflows is bulk email triage: "archive everything from this sender older than 30 days."
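Gmail's batchModify endpoint caps each request at 1,000 message IDs, so a bulk-triage tool has to split larger result sets into multiple calls. A minimal sketch of that chunking step (the `chunk` helper is illustrative, not part of the server's public API):

```typescript
// Split a list of message IDs into API-sized batches.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// e.g. 2,500 matching messages become three batchModify calls
const ids = Array.from({ length: 2500 }, (_, i) => `msg-${i}`);
const batches = chunk(ids, 1000); // lengths: 1000, 1000, 500
```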
Google Drive: 15 Tools
Drive tools handle file operations that the assistant can't do through the local filesystem. Search files with Drive query syntax, read content (with automatic export of Google Docs/Sheets/Slides to text), upload local files, download binaries, and export Workspace documents to formats like PDF, DOCX, or Markdown. The sharing tools (drive_share, drive_list_permissions, drive_remove_permission) are particularly useful because managing access is something people frequently ask an assistant to help with but rarely want to do manually through the Drive UI.
Google Calendar: 11 Tools
Calendar tools cover event CRUD, recurring event instance listing, free/busy queries, and calendar management. The calendar_quick_add tool is worth highlighting: it passes a natural language string directly to Google's parser, so "Lunch with Farouk Thursday 1pm at Le Boulanger" creates a fully structured event without the assistant needing to parse the date, time, and location itself.
Design Decisions
Config-Driven Accounts
The account system is entirely config-driven. Accounts are defined as alias:email pairs in a single environment variable. Adding or removing an account means editing one line in .env, rebuilding, and running the auth flow for the new account. No code changes required.
At the code level, the account aliases are parsed at startup into a tuple that's fed directly to Zod's z.enum(). This means the MCP schema advertises the exact set of valid account names to the client, and the LLM sees them as an enumerated type rather than a freeform string. This eliminates a whole class of errors where the assistant might guess an account name.
```typescript
const accountEnum = z.enum(ACCOUNTS); // ACCOUNTS is a runtime tuple like ["work", "personal"]
```
Stdio Transport
The server uses stdio transport rather than HTTP. This means it runs as a local subprocess spawned by Claude Code (or any MCP client), with communication happening over stdin/stdout. No ports to manage, no network exposure, no CORS. The client starts the process and talks to it directly. This is the simplest and most secure deployment model for a local tool server.
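Registering a stdio server with Claude Code is a matter of telling the client what subprocess to spawn. A project-scoped `.mcp.json` might look like the sketch below; the server name, build path, and env values are assumptions for illustration:

```json
{
  "mcpServers": {
    "google": {
      "command": "node",
      "args": ["build/index.js"],
      "env": {
        "GOOGLE_ACCOUNTS": "work:[email protected],personal:[email protected]"
      }
    }
  }
}
```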
Single OAuth Client, Multiple Tokens
Rather than requiring a separate OAuth client per account, the server uses a single Google Cloud OAuth client and stores separate refresh tokens per account in tokens/<alias>/token.json. The auth flow opens a browser, the user logs in with the appropriate Google account, and the token is saved. Tokens auto-refresh transparently, so you only authenticate once per account.
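A sketch of how the per-account token layout can be resolved; `tokenPath` is a hypothetical helper mirroring the `tokens/<alias>/token.json` convention, with a guard so a malformed alias can't escape the tokens directory:

```typescript
// Map an account alias to its refresh-token file.
function tokenPath(alias: string, baseDir = 'tokens'): string {
  // Aliases come from config, but validate anyway: no path separators or dots.
  if (!/^[a-z0-9_-]+$/i.test(alias)) {
    throw new Error(`Invalid account alias: ${alias}`);
  }
  return `${baseDir}/${alias}/token.json`;
}
```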
Error Handling Pattern
Every tool follows the same error handling structure. Authentication errors (401) return a clear message telling you to re-run the auth flow. Rate limit errors (429) surface the Retry-After header value so the caller knows how long to wait. Everything else returns the error message and code. This consistency matters because the LLM needs to understand what went wrong and decide whether to retry, switch accounts, or ask the user for help.
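The pattern can be sketched as a single mapping function, assuming errors carry a numeric `code` the way the googleapis client's errors do. The names and exact messages here are illustrative:

```typescript
// Minimal shape of a Google API error for this sketch.
interface ApiErrorLike {
  code?: number;
  message: string;
  retryAfter?: string; // value of the Retry-After header, if present
}

// Convert an API error into a message the LLM can act on.
function toToolError(err: ApiErrorLike, alias: string): string {
  if (err.code === 401) {
    return `Authentication failed for account "${alias}". Re-run the auth flow to refresh its token.`;
  }
  if (err.code === 429) {
    const wait = err.retryAfter ? ` Retry after ${err.retryAfter} seconds.` : '';
    return `Rate limited by the Google API.${wait}`;
  }
  return `Google API error ${err.code ?? 'unknown'}: ${err.message}`;
}
```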
Implementation Highlights
The dotenv Loading Order Gotcha
One subtle issue I hit during development was module initialization order. The account parser runs at module load time because it needs to provide the ACCOUNTS tuple for Zod schemas that are also defined at module load time. If dotenv.config() runs in index.ts but accounts.ts is imported before that call, process.env.GOOGLE_ACCOUNTS is undefined when the parser runs.
The fix was to move dotenv loading into accounts.ts itself and make it the first import in the entry point:
```typescript
// index.ts
import './accounts.js'; // loads dotenv, parses accounts, must be first
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
```
This is a pattern worth remembering for any project where module-level code depends on environment variables: the module that reads the env should also be the one that loads it.
Dynamic Zod Schemas
Zod's z.enum() requires a tuple with at least one element ([string, ...string[]]). Since the accounts are defined at runtime, the parser returns this exact type, validated at startup. If no accounts are configured, the server fails fast with a clear error message rather than starting up with a broken schema.
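A sketch of such a parser, returning the non-empty tuple type `z.enum()` expects and failing fast when the variable is missing (the function name and error text are illustrative, not the server's actual code):

```typescript
// z.enum() requires at least one element at the type level.
type NonEmpty = [string, ...string[]];

// Parse "work:[email protected],personal:[email protected]" into ["work", "personal"].
function parseAccountAliases(raw: string | undefined): NonEmpty {
  const aliases = (raw ?? '')
    .split(',')
    .map((pair) => pair.split(':')[0].trim())
    .filter((alias) => alias.length > 0);
  if (aliases.length === 0) {
    // Fail fast: an empty enum would produce a broken schema at startup.
    throw new Error('GOOGLE_ACCOUNTS is empty or unset; define at least one alias:email pair.');
  }
  return aliases as NonEmpty;
}
```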
Workspace Document Export
One of the trickier aspects of the Drive integration is handling Google Workspace documents. A Google Doc isn't a file you can download directly; it needs to be exported to a concrete format. The drive_read tool detects Workspace MIME types and automatically exports them as plain text, while drive_export lets you choose the format (PDF, DOCX, XLSX, Markdown, etc.). This distinction between "read content" and "export to format" maps cleanly to how people actually use Drive through an assistant.
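The detection step hinges on the fact that Workspace files use `application/vnd.google-apps.*` MIME types and must go through `files.export` rather than a direct download. A sketch of that branch, with a plausible subset of default text-export targets (the mapping is illustrative, not the server's full table):

```typescript
// Workspace documents have vendor MIME types and cannot be downloaded directly.
const WORKSPACE_PREFIX = 'application/vnd.google-apps.';

function isWorkspaceDoc(mimeType: string): boolean {
  return mimeType.startsWith(WORKSPACE_PREFIX);
}

// Default plain-text export targets a drive_read-style tool might use.
const TEXT_EXPORTS: Record<string, string> = {
  'application/vnd.google-apps.document': 'text/plain',
  'application/vnd.google-apps.spreadsheet': 'text/csv',
  'application/vnd.google-apps.presentation': 'text/plain',
};

function exportMimeFor(mimeType: string): string | undefined {
  return TEXT_EXPORTS[mimeType];
}
```

A binary file like a PDF fails the `isWorkspaceDoc` check and takes the direct-download path instead.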
What It Looks Like in Practice
Here's a real workflow that would have required switching between three browser tabs and manual copy-paste:
"Check my work inbox for any emails from the accountant this week. Download any invoice attachments to my local machine. Then look at my personal calendar and find a free slot tomorrow afternoon to review them."
The assistant searches Gmail on the work account, downloads attachments using gmail_download_attachment, queries the personal calendar with calendar_get_freebusy, and suggests a time slot. One conversation, two accounts, three Google services.
This kind of cross-account, cross-service orchestration is where the multi-account design pays off. The assistant doesn't need to be told which account to use for what. It sees the account list, understands the context, and picks the right one.
Open Source and What's Next
I open-sourced mcp-google-multi because the MCP ecosystem is still young, and the best way to push it forward is to build tools that people can use, fork, and extend. The server is MIT-licensed and designed to be easy to set up: clone, configure, build, authenticate, and register with Claude Code.
There are clear areas for expansion. Google Contacts, Google Tasks, and the Admin SDK are natural additions. Support for service accounts (rather than just OAuth) would make the server useful in automated pipelines. And as the MCP specification evolves, there may be opportunities to use resources and prompts alongside tools.
If you're building with MCP or working with Google APIs in an AI-assisted workflow, I'd welcome contributions, issues, or feedback on GitHub.