SMB SaaS platforms that expose AI agent capabilities (MCP tools) to multiple tenants risk data leakage. Without a central auth gateway, a rogue agent can access another tenant's resources, turning an AI feature into a security liability.
You’ll build an authentication gateway that sits between AI agents and their MCP tools, enforcing per-tenant authorization on every request. By the end, you’ll have a working Fastify server that validates Azure AD bearer tokens, looks up per-tenant tool-access policies from a SQLite store, and forwards authorized calls through an A2A-MCP bridge — all in about 400 lines of TypeScript. Every request that doesn’t carry a valid, policy-matched token is blocked before it reaches agent logic.
Prerequisites
Node.js >= 22 (the package.json engines field enforces this)
pnpm 10.x (exactly pnpm@10.9.0 as declared in the packageManager field)
An Azure AD tenant with an app registration (client ID + secret). You’ll need the tenant ID, client ID, and client secret.
Familiarity with TypeScript and Fastify route registration. No Azure AD expertise required — the recipe wraps the MSAL libraries so you don’t need to know the OAuth2 flow in detail.
Step 1: Scaffold the project
Create a fresh directory and wire up the project metadata in package.json. This file tells Node.js the module system, the required runtime version, and the package manager.
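A minimal package.json along these lines satisfies the prerequisites (the name is a placeholder; the engines and packageManager values come from the prerequisites section):

```json
{
  "name": "agent-auth-gateway",
  "private": true,
  "type": "module",
  "engines": {
    "node": ">=22"
  },
  "packageManager": "pnpm@10.9.0"
}
```

The "type": "module" entry makes Node.js treat .js output as ES modules, which matches the .js extensions in the import paths used later.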
Step 2: Install dependencies
Run pnpm install in your project root. This pulls everything from the registry — Fastify, the Azure identity libraries, the REAA agent auth packages, better-sqlite3 for the policy store, and the full testing toolchain.
terminal
pnpm install
Expected output: pnpm creates node_modules/ and prints a summary of installed packages. You should see no errors; if better-sqlite3 fails to compile its native addon, install the system build tools (build-essential on Debian/Ubuntu, or the Xcode command-line tools on macOS) and retry.
Step 3: Configure environment variables
The server reads its configuration from process.env via dotenv/config. Create .env from the example below, substituting your Azure AD values.
AZURE_AD_TENANT_ID — your Azure AD directory (tenant) ID. Find it in the Azure Portal under Entra ID > Overview.
AZURE_AD_CLIENT_ID and AZURE_AD_CLIENT_SECRET — from your app registration’s “Certificates & secrets” pane.
AZURE_AD_AUDIENCE — the audience claim your tokens carry; typically api://<client-id>.
DATABASE_PATH — where the SQLite policy database lives; ./policies.db is fine for development.
A2A_AGENT_URL — the base URL of the A2A agent that the bridge wraps. You can leave the default for now.
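Putting the variables together, a development .env might look like this (every value below is a placeholder; substitute your own Azure AD identifiers, and point A2A_AGENT_URL at whatever address your agent listens on):

```
AZURE_AD_TENANT_ID=00000000-0000-0000-0000-000000000000
AZURE_AD_CLIENT_ID=00000000-0000-0000-0000-000000000000
AZURE_AD_CLIENT_SECRET=your-client-secret
AZURE_AD_AUDIENCE=api://00000000-0000-0000-0000-000000000000
DATABASE_PATH=./policies.db
A2A_AGENT_URL=http://localhost:4000
```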
Step 4: Create the env helper and TypeScript declarations
Two small foundation files before the real logic. The env helper fails fast if a required variable is missing; the type declaration lets Fastify know about the accessContext property the middleware will attach.
Create src/env.ts:
ts
export function getRequiredEnv(name: string): string {
  const value = process.env[name];
  if (!value || value.length === 0) {
    console.error(`Missing required environment variable: ${name}`);
    process.exit(1);
  }
  return value;
}
Create src/types/fastify.d.ts:
ts
// Type augmentation for FastifyRequest to add accessContext
// This must be a module (have at least one import/export) for augmentation to work
import "fastify";

declare module "fastify" {
  interface FastifyRequest {
    accessContext?: {
      tenantId: string;
      userId: string;
      scopes: string[];
      roles: string[];
    };
  }
}
Step 5: Build the Azure AD token validator
This is the core authentication layer. It wraps @azure/msal-node to validate incoming tokens (decoding their JWT claims with jose) and exchange user tokens for downstream credentials. Token validation results are cached for 5 minutes to avoid redundant work.
Create src/auth/azure.ts:
ts
import { ConfidentialClientApplication, Configuration } from "@azure/msal-node";
import { AuthError } from "@reaatech/agent-auth-proxy-core";
import { decodeJwt } from "jose";
import NodeCache from "node-cache";
import { createHash } from "node:crypto";

export interface AzureTokenValidatorConfig {
  authority: string;
  clientId: string;
  clientSecret: string;
}

export interface DecodedToken {
  oid: string;
  tid: string;
  scp?: string;
  roles?: string[];
}

// The AzureTokenValidator class (the NodeCache-backed result cache,
// validateToken, and the downstream credential exchange) follows.
The validator extracts the oid (object ID, the user) and tid (tenant ID) claims from every token. It also pulls scopes and roles when present — those feed into the authorization check in the middleware.
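As a minimal sketch of that claim extraction (using Node's built-in Buffer instead of jose, and assuming the standard Azure AD claim names tid, oid, scp, and roles), the payload decoding looks like this:

```typescript
interface AgentClaims {
  tenantId: string;
  userId: string;
  scopes: string[];
  roles: string[];
}

// Decode the JWT payload (the middle dot-separated segment) without
// verifying the signature; jose's decodeJwt does the same split-and-parse.
function extractClaims(token: string): AgentClaims {
  const payloadB64 = token.split(".")[1];
  if (!payloadB64) throw new Error("Malformed JWT");
  const payload = JSON.parse(Buffer.from(payloadB64, "base64url").toString("utf8"));
  return {
    tenantId: payload.tid,
    userId: payload.oid,
    // scp is a space-separated string; roles is already an array when present
    scopes: typeof payload.scp === "string" ? payload.scp.split(" ") : [],
    roles: Array.isArray(payload.roles) ? payload.roles : [],
  };
}
```

Signature verification still happens in the validator; this sketch only shows where the tenant and user identity come from.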
Step 6: Build the policy store with CRUD operations
Policies declare which tenant may access which tool. They live in a SQLite database managed by better-sqlite3. The store class handles table creation, the four CRUD operations, and a lookup by tenant + tool — the method the middleware calls to authorize each request.
This file also exports Fastify route handlers for the admin API: GET, POST, PUT, and DELETE on /admin/api/policies. Each route requires a bearer token whose JWT payload contains "admin" in the roles array.
Create src/admin/api/policies.ts:
ts
import type { FastifyInstance, FastifyReply, FastifyRequest } from "fastify";
import { AuthError } from "@reaatech/agent-auth-proxy-core";
import { randomUUID } from "node:crypto";
import { z } from "zod";

export const PolicySchema = z.object({
  id: z.string().uuid(),
  tenantId: z.string().uuid(),
  toolId: z.string().min(1),
  allowedRoles: z.array(z.string()),
  scopes: z.array(z.string()),
});
The file is long but straightforward. The PolicyStore class manages a policies table in SQLite. The registerPolicyRoutes function mounts four protected routes under /admin/api/policies — each one verifies the caller holds the "admin" role before touching the store.
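A hedged sketch of the lookup the middleware depends on, written against the minimal prepare/get/all/run surface mentioned in Next steps (class and column names assumed from the PolicySchema above; the real store carries the full row plus the CRUD methods):

```typescript
// Minimal statement and database shapes, modeled on the better-sqlite3 API surface.
interface MinimalStatement {
  get(...params: unknown[]): unknown;
  all(...params: unknown[]): unknown[];
  run(...params: unknown[]): unknown;
}
interface MinimalDatabase {
  prepare(sql: string): MinimalStatement;
}

interface PolicyRow {
  id: string;
  tenantId: string;
  toolId: string;
}

class PolicyStoreSketch {
  constructor(private readonly db: MinimalDatabase) {}

  // The authorization lookup: one row per (tenant, tool) pair, or undefined.
  findByTenantAndTool(tenantId: string, toolId: string): PolicyRow | undefined {
    return this.db
      .prepare("SELECT * FROM policies WHERE tenantId = ? AND toolId = ?")
      .get(tenantId, toolId) as PolicyRow | undefined;
  }
}
```

Keeping the query behind this one method is what makes the later PostgreSQL or Cosmos DB swap localized.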
Step 7: Build the auth middleware
Every request to an agent tool (paths like /agent/tools/:toolName) must pass through this middleware. It extracts the bearer token, validates it with AzureTokenValidator, parses the tool name from the URL, looks up a policy matching that tenant + tool combination, and — if found — stores the access context for downstream handlers. If any step fails, it throws typed errors that the error handler (built in the next step) turns into proper HTTP responses.
Create src/proxy/middleware.ts:
ts
import type { FastifyReply, FastifyRequest } from "fastify";
import { AuthError, ScopeError } from "@reaatech/agent-auth-proxy-core";
import type { AzureTokenValidator } from "../auth/azure.js";
import type { PolicyStore } from "../admin/api/policies.js";

export interface AccessContext {
  tenantId: string;
  userId: string;
  scopes: string[];
  roles: string[];
}

export type FastifyPreHandler = (
  request: FastifyRequest,
  reply: FastifyReply,
) => Promise<void>;

// Use a WeakMap to store access context per-request without type conflicts
const accessContextMap = new WeakMap<FastifyRequest, AccessContext>();

export function setAccessContext(req: FastifyRequest, ctx: AccessContext): void {
  accessContextMap.set(req, ctx);
}

export function getAccessContext(req: FastifyRequest): AccessContext | undefined {
  return accessContextMap.get(req);
}

export function createAuthMiddleware(
  authValidator: AzureTokenValidator,
  policyStore: PolicyStore,
): FastifyPreHandler {
  return async (request: FastifyRequest, reply: FastifyReply): Promise<void> => {
    // reply is available for custom error responses
    void reply;

    const authHeader = request.headers.authorization;
    if (!authHeader || !authHeader.startsWith("Bearer ")) {
      throw new AuthError("UNAUTHORIZED", "Missing or invalid Authorization header");
    }
    const token = authHeader.slice("Bearer ".length);
    if (token.length === 0) {
      throw new AuthError("UNAUTHORIZED", "Missing or invalid Authorization header");
    }

    let decodedToken;
    try {
      decodedToken = await authValidator.validateToken(token);
    } catch (error) {
      if (error instanceof AuthError) {
        throw error;
      }
      throw new AuthError(
        "UNAUTHORIZED",
        `Token validation failed: ${error instanceof Error ? error.message : String(error)}`,
      );
    }

    // Extract tool name from URL path: /agent/tools/:toolName
    const url = request.url;
    let toolName: string | undefined;
    try {
      const urlObj = new URL(url, "http://localhost");
      const pathParts = urlObj.pathname.split("/").filter(Boolean);
      const agentIdx = pathParts.indexOf("agent");
      if (
        agentIdx !== -1 &&
        pathParts.length > agentIdx + 2 &&
        pathParts[agentIdx + 1] === "tools"
      ) {
        toolName = pathParts[agentIdx + 2];
      }
    } catch {
      throw new AuthError("BAD_REQUEST", "Malformed request URL");
    }
    if (!toolName) {
      throw new AuthError("BAD_REQUEST", "Could not extract tool name from URL");
    }

    const tenantId = decodedToken.tid;
    const matchingPolicy = policyStore.findByTenantAndTool(tenantId, toolName);
    if (!matchingPolicy) {
      throw new ScopeError(
        "FORBIDDEN",
        `Tenant ${tenantId} is not authorized to access tool: ${toolName}`,
      );
    }

    setAccessContext(request, {
      tenantId: decodedToken.tid,
      userId: decodedToken.oid,
      scopes: decodedToken.scp ? decodedToken.scp.split(" ") : [],
      roles: decodedToken.roles ?? [],
    });
  };
}
Step 8: Build the agent auth proxy plugin
The Fastify plugin ties everything together. It creates an A2aAsMcpServer instance that wraps the upstream A2A agent as an MCP endpoint, registers an error handler that maps REAA error classes to HTTP status codes, and mounts a scoped route group under /agent/tools/:toolName with the auth middleware from Step 7 as a preHandler hook.
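The error-to-status translation can be sketched as one small function. The status codes below are inferred from the error codes the middleware throws (UNAUTHORIZED and BAD_REQUEST from AuthError, FORBIDDEN from ScopeError); the real plugin inspects the REAA error classes, whose exact shape isn't shown here:

```typescript
// Map the gateway's error codes onto HTTP status codes.
function statusForErrorCode(code: string): number {
  switch (code) {
    case "UNAUTHORIZED":
      return 401; // missing, malformed, or invalid bearer token
    case "FORBIDDEN":
      return 403; // valid token, but no policy for this tenant + tool
    case "BAD_REQUEST":
      return 400; // could not parse a tool name out of the URL
    default:
      return 500; // anything unrecognized is an internal error
  }
}
```

The plugin's setErrorHandler would call something like this and send the code plus message as the JSON body.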
Step 9: Assemble the server entry point
The main file assembles everything: it loads dotenv, creates a Fastify instance, registers CORS, instantiates the Azure token validator and the policy store (backed by better-sqlite3), mounts the agent proxy plugin and the admin routes, and adds a /health endpoint. It also hooks SIGTERM and SIGINT for graceful shutdown.
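The signal wiring can be sketched like this (the helper name and the narrow parameter types are illustrative; they exist mainly so the logic is easy to exercise without a real server):

```typescript
interface Closeable {
  close(): Promise<void>;
}
interface SignalTarget {
  once(signal: string, handler: () => void): unknown;
  exit(code: number): void;
}

// Close the server once on SIGTERM or SIGINT, then exit cleanly.
function registerGracefulShutdown(server: Closeable, proc: SignalTarget): void {
  const shutdown = async (signal: string): Promise<void> => {
    console.log(`Received ${signal}, shutting down`);
    await server.close(); // stop accepting connections, drain in-flight requests
    proc.exit(0);
  };
  for (const signal of ["SIGTERM", "SIGINT"]) {
    proc.once(signal, () => void shutdown(signal));
  }
}
```

In the entry point you would call registerGracefulShutdown(app, process) after app.listen succeeds.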
Step 10: Add test config, setup, and run the tests
The project uses Vitest with V8 coverage and a strict 90% threshold on lines, branches, functions, and statements. First, add the Vitest configuration and the test setup file that mocks the external packages.
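A vitest.config.ts along these lines would enforce the thresholds described above (the setup file path and reporter choices are assumptions; adjust them to your layout):

```ts
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    setupFiles: ["./tests/setup.ts"], // mocks for the external packages
    coverage: {
      provider: "v8",
      reporter: ["text", "json-summary"],
      reportsDirectory: "./coverage",
      thresholds: { lines: 90, branches: 90, functions: 90, statements: 90 },
    },
  },
});
```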
Expected output: Vitest runs all test files — 72 tests across 14 suites — and they all pass. The coverage report prints to the terminal and writes JSON summaries to coverage/.
The coverage thresholds are all 90%, so if you see coverage below that, Vitest will fail. Check the coverage/ directory for the detailed JSON report.
Next steps
Deploy the gateway behind an Azure API Management instance to add rate limiting and request logging for each tenant.
Extend the policy model to include time-based access windows — add validFrom and validUntil columns to the policies table and check them in findByTenantAndTool.
Swap the SQLite store for PostgreSQL or Cosmos DB when you move to production; the MinimalDatabase interface makes any driver that exposes prepare / get / all / run a drop-in replacement.
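As a hypothetical sketch of the time-window extension suggested above (with validFrom and validUntil stored as ISO-8601 strings, and an absent bound meaning unbounded):

```typescript
interface TimeWindowedPolicy {
  validFrom?: string;  // ISO-8601; absent means no lower bound
  validUntil?: string; // ISO-8601; absent means no upper bound
}

// Returns true when the policy is active at the given instant.
// findByTenantAndTool would apply this check before returning a row.
function policyActiveAt(policy: TimeWindowedPolicy, now: Date): boolean {
  const t = now.getTime();
  if (policy.validFrom && t < Date.parse(policy.validFrom)) return false;
  if (policy.validUntil && t > Date.parse(policy.validUntil)) return false;
  return true;
}
```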