
This skill helps you set up and deploy Firebase apps with Vertex AI integration, ensuring secure calls to Gemini and robust deployment workflows.

npx playbooks add skill jeremylongshore/claude-code-plugins-plus-skills --skill firebase-vertex-ai

Review the files below or copy the command above to add this skill to your agents.

Files (3): SKILL.md (3.1 KB)
---
name: firebase-vertex-ai
description: |
  Firebase platform expert with Vertex AI Gemini integration, covering Authentication, Firestore, Storage, Functions, Hosting, and AI-powered features. Use when asked to "setup firebase", "deploy to firebase", or "integrate vertex ai with firebase".
allowed-tools: Read, Write, Edit, Grep, Glob, Bash(cmd:*)
version: 1.0.0
author: Jeremy Longshore <[email protected]>
license: MIT
---
# Firebase Vertex AI

Operate Firebase projects end-to-end (Auth, Firestore, Functions, Hosting) and integrate Gemini/Vertex AI safely for AI-powered features.

## Overview

Use this skill to design, implement, and deploy Firebase applications that call Vertex AI/Gemini from Cloud Functions (or other GCP services) with secure secrets handling, least-privilege IAM, and production-ready observability.

## Prerequisites

- Node.js runtime and Firebase CLI access for the target project
- A Firebase project (billing enabled for Functions/Vertex AI as needed)
- Vertex AI API enabled and permissions to call Gemini/Vertex AI from your backend
- Secrets managed via env vars or Secret Manager (never in client code)
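
For a fresh project, the prerequisites above can be satisfied with commands like the following; the project ID and secret name are placeholders, not values from this skill:

```shell
# Enable the Vertex AI API for the target project (project ID is hypothetical)
gcloud services enable aiplatform.googleapis.com --project=my-firebase-project

# Store the model credential in Secret Manager for Cloud Functions,
# rather than embedding it in client code (secret name is hypothetical)
firebase functions:secrets:set GEMINI_API_KEY
```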

## Instructions

1. Initialize Firebase (or validate an existing repo), enabling Hosting, Functions, and Firestore as required.
2. Implement backend integration:
   - add a Cloud Function/HTTP endpoint that calls Gemini/Vertex AI
   - validate inputs and return structured responses
3. Configure data and security:
   - Firestore rules + indexes
   - Storage rules (if applicable)
   - Auth providers and authorization checks
4. Deploy and verify:
   - deploy Functions/Hosting
   - run smoke tests against deployed endpoints
5. Add ops guardrails:
   - logging/metrics
   - alerting for error spikes
   - basic cost controls (budgets/quotas) where appropriate
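
Step 2's "validate inputs and return structured responses" can be sketched as plain helpers. `validateChatRequest` and `buildResponse` are hypothetical names; in a real deployment they would run inside an HTTPS Function (for example one created with `onRequest` from the `firebase-functions` SDK) before the Gemini call is made:

```javascript
// Validate the request body before it ever reaches the model.
// The 4000-character cap is an assumed limit, not a Vertex AI constant.
function validateChatRequest(body) {
  if (!body || typeof body.message !== "string") {
    return { ok: false, error: "message must be a string" };
  }
  const message = body.message.trim();
  if (message.length === 0 || message.length > 4000) {
    return { ok: false, error: "message must be 1-4000 characters" };
  }
  return { ok: true, message };
}

// Wrap model output in a predictable envelope so callers never have
// to parse free-form text to distinguish success from failure.
function buildResponse(modelText) {
  return { status: "ok", reply: modelText, timestamp: Date.now() };
}
```

Returning a fixed envelope shape makes client code and smoke tests simpler, since every response can be checked against the same schema.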

## Output

- A deployable Firebase project structure (configs + Functions/Hosting as needed)
- Secure backend code that calls Gemini/Vertex AI (with secrets handled correctly)
- Firestore/Storage rules and index guidance
- A verification checklist (local + deployed) and CI-ready commands
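
As a sketch of the rules guidance above, a minimal Firestore ruleset restricting each user to their own documents might look like this (the `chats` collection layout is hypothetical):

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Only the authenticated owner may read or write their chat history
    match /chats/{userId}/{document=**} {
      allow read, write: if request.auth != null && request.auth.uid == userId;
    }
  }
}
```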

## Error Handling

- Auth failures: identify the principal and missing permission/role; fix with least privilege.
- Billing/API issues: detect which API or quota is blocking and provide remediation steps.
- Firestore rule/index problems: provide minimal repro queries and rule fixes.
- Vertex AI call failures: surface model/region mismatches and add retries/backoff for transient errors.
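
The retry/backoff advice for transient failures can be sketched as a small wrapper; `callModel` stands in for any async Gemini request, and the transient-error check and delay values are assumptions rather than Vertex AI specifics:

```javascript
// Retry an async model call on transient errors with exponential
// backoff plus jitter. Non-transient errors are rethrown immediately.
async function retryWithBackoff(callModel, { retries = 3, baseMs = 200 } = {}) {
  let lastErr;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await callModel();
    } catch (err) {
      lastErr = err;
      // Assumed transient codes: 429 (rate limit) and 503 (unavailable)
      const transient = err.code === 429 || err.code === 503;
      if (!transient || attempt === retries) throw err;
      // Exponential backoff with jitter: baseMs, 2*baseMs, 4*baseMs, ...
      const delay = baseMs * 2 ** attempt + Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastErr;
}
```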

## Examples

**Example: Gemini-backed chat API on Firebase**
- Request: “Deploy Hosting + a Function that powers a Gemini chat endpoint.”
- Result: `/api/chat` function, Secret Manager wiring, and smoke tests.

**Example: Firestore-powered RAG**
- Request: “Build a RAG flow that embeds docs and answers with citations.”
- Result: ingestion plan, embedding + index strategy, and evaluation prompts.
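
The vector-search step of a RAG flow like the one above can be sketched with cosine similarity over stored embeddings; in practice the vectors would come from a Vertex AI embedding model and live in Firestore, while the two-element vectors here are placeholders:

```javascript
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored documents by similarity to a query embedding and keep
// the top k, which become the cited context for the model's answer.
function topK(queryVec, docs, k = 3) {
  return docs
    .map((d) => ({ id: d.id, score: cosineSimilarity(queryVec, d.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```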

## Resources

- Full detailed guide (kept for reference): `{baseDir}/references/SKILL.full.md`
- Firebase docs: https://firebase.google.com/docs
- Cloud Functions for Firebase: https://firebase.google.com/docs/functions
- Vertex AI docs: https://cloud.google.com/vertex-ai/docs

Overview

This skill enables end-to-end Firebase project design, implementation, and deployment with Vertex AI (Gemini) integrations. It focuses on secure backend calls from Cloud Functions, least-privilege IAM, and production-ready observability. Use it to add AI-powered endpoints, RAG flows, and secure server-side model access to your Firebase apps.

How this skill works

I scaffold or validate a Firebase project (Hosting, Functions, Firestore, Storage) and add Cloud Functions that call Vertex AI/Gemini using secrets stored in environment variables or Secret Manager. I apply Firestore and Storage rules, implement input validation and structured responses, and add logging/metrics, retry/backoff, and CI-ready deploy steps. I also provide a verification checklist and operational guardrails like budgets, alerts, and minimal IAM role recommendations.

When to use it

  • When you need a secure backend endpoint that calls Vertex AI/Gemini from Firebase Cloud Functions.
  • When deploying Hosting + Functions for an AI chat or assistant powered by Gemini.
  • When building a Retrieval-Augmented Generation (RAG) pipeline with Firestore for embeddings and search.
  • When you need to enforce Firestore/Storage rules, indexes, and least-privilege IAM for AI features.
  • When preparing CI/CD deployment commands and smoke tests for Firebase + Vertex AI integrations.

Best practices

  • Keep secrets out of client code; use Secret Manager or env vars for Functions and never expose API keys to the browser.
  • Grant least-privilege IAM roles to functions calling Vertex AI and separate service accounts for different duties.
  • Validate and sanitize all inputs to AI endpoints and return structured, predictable responses for callers.
  • Add observability: structured logs, metrics for latency/error rate, and alerts for error spikes and cost anomalies.
  • Implement retries with exponential backoff for transient Vertex AI errors and detect model/region mismatches early.

Example use cases

  • Deploy a /api/chat Cloud Function that proxies requests to Gemini with per-user rate limits and secret-managed credentials.
  • Build a Firestore-powered RAG flow: ingest docs, compute embeddings, store vectors, and serve cited answers via Functions.
  • Create Hosting + Functions site with server-side generation that calls Vertex AI for content suggestions.
  • Migrate an existing Firebase project to production: add indexes, tighten rules, configure billing/quotas, and add smoke tests.
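
The per-user rate limit in the first use case can be sketched as a fixed-window counter. Note that this in-memory version only protects a single Function instance; a production limiter would persist counts in Firestore or Redis, and the limit and window values are assumptions:

```javascript
// Fixed-window rate limiter keyed by user ID. Each window admits at
// most `limit` calls; a new window starts once `windowMs` has elapsed.
function createRateLimiter({ limit = 10, windowMs = 60000 } = {}) {
  const windows = new Map(); // uid -> { start, count }
  return function allow(uid, now = Date.now()) {
    const w = windows.get(uid);
    if (!w || now - w.start >= windowMs) {
      windows.set(uid, { start: now, count: 1 });
      return true;
    }
    w.count++;
    return w.count <= limit;
  };
}
```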

FAQ

How are secrets handled?

Secrets should be stored in Secret Manager or deployed as env vars to Cloud Functions; never commit secrets to client code or source control.

What if Vertex AI calls fail due to permissions?

Identify the failing principal and the missing IAM role, then attach the smallest required role to the service account and retry the call with backoff.
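
As a hedged example of attaching the smallest required role, granting only `roles/aiplatform.user` to a Function's service account might look like this (project ID and account name are placeholders):

```shell
# Grant the Function's service account permission to call Vertex AI,
# and nothing more (names are hypothetical)
gcloud projects add-iam-policy-binding my-firebase-project \
  --member="serviceAccount:chat-fn@my-firebase-project.iam.gserviceaccount.com" \
  --role="roles/aiplatform.user"
```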