
This skill implements Groq rate limiting with exponential backoff and idempotent retries to improve reliability and throughput.

npx playbooks add skill jeremylongshore/claude-code-plugins-plus-skills --skill groq-rate-limits

---
name: groq-rate-limits
description: |
  Implement Groq rate limiting, backoff, and idempotency patterns.
  Use when handling rate limit errors, implementing retry logic,
  or optimizing API request throughput for Groq.
  Trigger with phrases like "groq rate limit", "groq throttling",
  "groq 429", "groq retry", "groq backoff".
allowed-tools: Read, Write, Edit
version: 1.0.0
license: MIT
author: Jeremy Longshore <[email protected]>
---

# Groq Rate Limits

## Overview
Handle Groq rate limits gracefully with exponential backoff and idempotency.

## Prerequisites
- Groq SDK installed
- Understanding of async/await patterns
- Access to rate limit headers

## Instructions

### Step 1: Understand Rate Limit Tiers

| Tier | Requests/min | Requests/day | Burst |
|------|-------------|--------------|-------|
| Free | 60 | 1,000 | 10 |
| Pro | 300 | 10,000 | 50 |
| Enterprise | 1,000 | 100,000 | 200 |
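These quotas translate directly into client-side pacing. As a rough sketch (tier numbers taken from the table above), the minimum spacing between evenly paced requests is:

```typescript
// Derive the minimum delay between requests from a tier's
// requests-per-minute quota (numbers from the table above).
function minDelayMs(requestsPerMinute: number): number {
  return Math.ceil(60_000 / requestsPerMinute);
}

// Free tier: 60 req/min → 1000 ms between requests
// Pro tier: 300 req/min → 200 ms between requests
```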

### Step 2: Implement Exponential Backoff with Jitter

```typescript
async function withExponentialBackoff<T>(
  operation: () => Promise<T>,
  config = { maxRetries: 5, baseDelayMs: 1000, maxDelayMs: 32000, jitterMs: 500 }
): Promise<T> {
  for (let attempt = 0; attempt <= config.maxRetries; attempt++) {
    try {
      return await operation();
    } catch (error: any) {
      if (attempt === config.maxRetries) throw error;
      const status = error.status ?? error.response?.status;
      // Retry on 429, 5xx, or errors with no HTTP status (network failures);
      // rethrow all other client errors immediately.
      if (status !== undefined && status !== 429 && (status < 500 || status >= 600)) throw error;

      // Exponential delay with jitter to prevent thundering herd
      const exponentialDelay = config.baseDelayMs * Math.pow(2, attempt);
      const jitter = Math.random() * config.jitterMs;
      const delay = Math.min(exponentialDelay + jitter, config.maxDelayMs);

      console.log(`Rate limited. Retrying in ${delay.toFixed(0)}ms...`);
      await new Promise(r => setTimeout(r, delay));
    }
  }
  throw new Error('Unreachable');
}
```

### Step 3: Add Idempotency Keys

```typescript
import crypto from 'crypto';

// Generate deterministic key from operation params (for safe retries)
function generateIdempotencyKey(operation: string, params: Record<string, any>): string {
  const data = JSON.stringify({ operation, params });
  return crypto.createHash('sha256').update(data).digest('hex');
}

// `GroqClient` stands in for whatever client wrapper exposes a raw
// `request` method; adapt this to your SDK's actual surface.
async function idempotentRequest<T>(
  client: GroqClient,
  params: Record<string, any>,
  idempotencyKey?: string  // Pass an existing key when retrying
): Promise<T> {
  // Use provided key (for retries) or generate deterministic key from params
  const key = idempotencyKey || generateIdempotencyKey(params.method || 'POST', params);
  return client.request({
    ...params,
    headers: { 'Idempotency-Key': key, ...params.headers },
  });
}
```
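What makes this safe is that the key is deterministic: identical operation and params always hash to the same key, so the server can deduplicate a retry. A quick self-contained check of that property, mirroring the hashing scheme in `generateIdempotencyKey` above:

```typescript
import crypto from 'crypto';

// Same hashing scheme as generateIdempotencyKey above.
function keyFor(operation: string, params: Record<string, unknown>): string {
  return crypto
    .createHash('sha256')
    .update(JSON.stringify({ operation, params }))
    .digest('hex');
}

const a = keyFor('POST', { model: 'mixtral', prompt: 'hello' });
const b = keyFor('POST', { model: 'mixtral', prompt: 'hello' });
const c = keyFor('POST', { model: 'mixtral', prompt: 'different' });
// a === b: a retry with identical params yields the same key
// a !== c: distinct operations get distinct keys
```

Note that `JSON.stringify` is sensitive to property order, so construct params objects consistently (or sort keys before serializing) to keep the key stable across call sites.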

## Output
- Reliable API calls with automatic retry
- Idempotent requests preventing duplicates
- Rate limit headers properly handled

## Error Handling
| Header | Description | Action |
|--------|-------------|--------|
| X-RateLimit-Limit | Max requests | Monitor usage |
| X-RateLimit-Remaining | Remaining requests | Throttle if low |
| X-RateLimit-Reset | Reset timestamp | Wait until reset |
| Retry-After | Seconds to wait | Honor this value |
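The table above can be collapsed into a small helper that picks a wait time from whichever header is present, preferring `Retry-After`. This is a sketch that assumes the response headers have been collected into a lowercase-keyed `Map`:

```typescript
// Derive a wait time (ms) from rate-limit headers, preferring
// Retry-After (seconds to wait) over X-RateLimit-Reset (unix seconds).
function waitMsFromHeaders(headers: Map<string, string>, nowMs: number): number {
  const retryAfter = headers.get('retry-after');
  if (retryAfter) return parseInt(retryAfter, 10) * 1000;

  const reset = headers.get('x-ratelimit-reset');
  if (reset) return Math.max(0, parseInt(reset, 10) * 1000 - nowMs);

  return 0; // no guidance from the server; fall back to exponential backoff
}
```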

## Examples

### Queue-Based Rate Limiting
```typescript
import PQueue from 'p-queue';

const queue = new PQueue({
  concurrency: 5,
  interval: 1000,
  intervalCap: 10,
});

async function queuedRequest<T>(operation: () => Promise<T>): Promise<T> {
  // p-queue v7+ types `add` as resolving to `T | void`, so assert the result
  return queue.add(operation) as Promise<T>;
}
```
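If pulling in `p-queue` is undesirable, the same interval-cap idea can be sketched dependency-free with a sliding window of timestamps (a simplified illustration, not a drop-in replacement):

```typescript
// Allow at most `capacity` calls per `intervalMs`, tracked with a
// sliding window of recent call timestamps.
class IntervalLimiter {
  private stamps: number[] = [];

  constructor(private capacity: number, private intervalMs: number) {}

  tryAcquire(now: number = Date.now()): boolean {
    // Drop timestamps that have aged out of the window
    this.stamps = this.stamps.filter(t => now - t < this.intervalMs);
    if (this.stamps.length >= this.capacity) return false;
    this.stamps.push(now);
    return true;
  }
}
```

A caller that gets `false` back can sleep briefly and retry, which pairs naturally with the backoff helper from Step 2.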

### Monitor Rate Limit Usage
```typescript
class RateLimitMonitor {
  private remaining: number = 60;
  private resetAt: Date = new Date();

  updateFromHeaders(headers: Headers) {
    this.remaining = parseInt(headers.get('X-RateLimit-Remaining') ?? '60', 10);
    const resetTimestamp = headers.get('X-RateLimit-Reset');
    if (resetTimestamp) {
      // Header carries a unix timestamp in seconds
      this.resetAt = new Date(parseInt(resetTimestamp, 10) * 1000);
    }
  }

  shouldThrottle(): boolean {
    // Only throttle if low remaining AND reset hasn't happened yet
    return this.remaining < 5 && new Date() < this.resetAt;
  }

  getWaitTime(): number {
    return Math.max(0, this.resetAt.getTime() - Date.now());
  }
}
```
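The throttle decision can be sanity-checked with a simplified stand-in that takes already-parsed values instead of a `Headers` object:

```typescript
// Simplified stand-in for RateLimitMonitor above, so the throttle rule
// can be exercised without constructing a Headers object.
class MiniMonitor {
  private remaining = 60;
  private resetAt = new Date(0);

  update(remaining: number, resetUnixSeconds: number) {
    this.remaining = remaining;
    this.resetAt = new Date(resetUnixSeconds * 1000);
  }

  shouldThrottle(): boolean {
    return this.remaining < 5 && new Date() < this.resetAt;
  }
}

const m = new MiniMonitor();
m.update(3, Math.floor(Date.now() / 1000) + 60); // nearly exhausted, resets in 60 s
// m.shouldThrottle() → true
m.update(50, Math.floor(Date.now() / 1000) + 60); // plenty of quota left
// m.shouldThrottle() → false
```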

## Resources
- [Groq Rate Limits](https://docs.groq.com/rate-limits)
- [p-queue Documentation](https://github.com/sindresorhus/p-queue)

## Next Steps
For security configuration, see `groq-security-basics`.

## Overview

This skill implements Groq rate limiting strategies including exponential backoff, jitter, and idempotency to make API calls reliable under throttling. It provides patterns for honoring rate limit headers, retrying safely, and preventing duplicate operations. Use it to harden clients that interact with Groq endpoints and maximize throughput without violating limits.

## How this skill works

The skill inspects HTTP status codes and rate-limit headers (`X-RateLimit-*`, `Retry-After`) and performs exponential backoff with jitter for transient 429/5xx errors. It supports deterministic idempotency keys so retries do not create duplicate side effects. Optional queueing and monitoring components let you cap concurrent requests and trigger throttling when remaining quota is low.

## When to use it

- When requests to Groq sometimes return 429 or temporary 5xx errors
- When you need safe automatic retries without creating duplicate submissions
- When you must respect per-minute or daily quotas and avoid bursts
- When building high-throughput integrations that must adapt to variable limits
- When implementing client-side observability of rate limit headers

## Best practices

- Honor `Retry-After` and `X-RateLimit-Reset` headers before retrying
- Use exponential backoff with randomized jitter to avoid thundering-herd issues
- Generate deterministic idempotency keys from operation name and params
- Monitor `X-RateLimit-Remaining` and proactively throttle when low
- Combine queueing (concurrency and interval caps) with backoff for steady throughput

## Example use cases

- A batch job that needs to retry failed Groq calls without creating duplicate entries
- An interactive app that throttles UI-driven requests to stay under per-minute limits
- A worker pool that uses p-queue-style interval caps plus backoff on 429s
- A monitoring service that alerts when remaining quota drops below a threshold
- An SDK wrapper that injects `Idempotency-Key` headers and centralized retry logic

## FAQ

### What errors should trigger retries?

Retry on 429 and transient 5xx errors. Do not retry on client errors (4xx other than 429) unless your logic specifically handles them.
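That rule fits in a one-line predicate:

```typescript
// Retry only on 429 (rate limited) and 5xx (transient server errors).
function isRetryable(status: number): boolean {
  return status === 429 || (status >= 500 && status < 600);
}
```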

### How do idempotency keys prevent duplicates?

Generate a deterministic key from the operation and parameters; send it with the request so the server can deduplicate retries and avoid repeated side effects.