
remotion-expert skill


This skill enables programmatic, frame-precise Remotion video generation with AI-driven workflows and dynamic data for high-fidelity 2026 productions.

npx playbooks add skill yuniorglez/gemini-elite-core --skill remotion-expert

Review the files below or copy the command above to add this skill to your agents.

Files (11)
SKILL.md
---
name: remotion-expert
id: remotion-expert
version: 1.0.0
description: "Senior Specialist in Remotion v4.0+, React 19, and Next.js 16. Expert in programmatic video generation, sub-frame animation precision, and AI-driven video workflows for 2026."
---

# Remotion Expert - High-Performance Video Generation

Senior Specialist in Remotion v4.0+, React 19, and Next.js 16. Expert in programmatic video generation, sub-frame animation precision, and AI-driven video workflows for 2026.

## 🧭 Overview
Remotion allows you to create videos programmatically using React. This skill expands the LLM's capabilities to handle complex animations, dynamic data-driven videos, and high-fidelity rendering pipelines.

### Core Capabilities
- **Programmatic Animation**: Frame-perfect control via `useCurrentFrame` and `interpolate`.
- **Dynamic Compositions**: Parameterized videos that adapt to external data.
- **Modern Stack**: Fully optimized for **React 19.3**, **Next.js 16.2**, and **Tailwind CSS 4.0**.
- **AI Orchestration**: Integration with Remotion Skills for instruction-driven video editing.

---

## 🛠️ Table of Contents
1. [Quick Start](#-quick-start)
2. [Mandatory Rules & Anti-Patterns](#-mandatory-rules--anti-patterns)
3. [Core Concepts](#-core-concepts)
4. [Advanced Patterns](#-advanced-patterns)
5. [The "Do Not" List](#-the-do-not-list-common-mistakes)
6. [References](#-references)

---

## ⚡ Quick Start

Scaffold a new project using Bun (recommended for 2026):
```bash
bun create video@latest my-video
cd my-video
bun start
```

### Basic Composition Pattern
```tsx
import { AbsoluteFill, interpolate, useCurrentFrame, useVideoConfig } from 'remotion';

export const MyVideo = () => {
  const frame = useCurrentFrame();
  const { fps, durationInFrames } = useVideoConfig();

  // Animate from 0 to 1 over the first second
  const opacity = interpolate(frame, [0, fps], [0, 1], {
    extrapolateRight: 'clamp',
  });

  return (
    <AbsoluteFill style={{ 
      backgroundColor: 'white', 
      justifyContent: 'center', 
      alignItems: 'center' 
    }}>
      <h1 style={{ opacity, fontSize: 100 }}>Remotion 2026</h1>
    </AbsoluteFill>
  );
};
```

---

## 🛡️ Mandatory Rules & Anti-Patterns

1. **NO CSS ANIMATIONS**: Never use standard CSS `@keyframes` or `transition`. They run on wall-clock time, not the frame counter, so frames rendered in parallel come out inconsistent. Use `interpolate()` or `spring()`.
2. **Deterministic Logic**: Ensure all calculations are derived from `frame`. Avoid `Math.random()` or `Date.now()` inside components unless seeded.
3. **Zod Validation**: Always use Zod for `defaultProps` to ensure type safety in parameterized videos.
4. **Asset Preloading**: Use `staticFile()` for local assets and ensure remote assets are reachable during render.
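
Rule 2 can be satisfied with a frame-seeded PRNG. A minimal sketch using the well-known mulberry32 mixer (the `jitterX` helper and its constants are illustrative, not part of Remotion):

```typescript
// Deterministic pseudo-random value derived from a numeric seed (mulberry32).
// The same seed always yields the same value, so renders stay reproducible.
const seededRandom = (seed: number): number => {
  let t = (seed + 0x6d2b79f5) | 0;
  t = Math.imul(t ^ (t >>> 15), t | 1);
  t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
  return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
};

// Example: jitter a particle per frame, deterministically.
const jitterX = (frame: number, particleId: number): number =>
  seededRandom(frame * 1000 + particleId) * 10;
```

Because the value depends only on `frame` and `particleId`, every render of the same frame produces identical pixels.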

---

## 🧠 Core Concepts

### 1. Frame-Based Animation
Everything is a function of the current frame.
```tsx
import { Easing, interpolate, useCurrentFrame } from 'remotion';

const frame = useCurrentFrame();
const scale = interpolate(frame, [0, 20], [0, 1], {
  easing: Easing.bezier(0.25, 0.1, 0.25, 1),
});
```
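
For intuition, the core of a two-point `interpolate` call can be sketched in plain TypeScript. This is a simplified stand-in that clamps at both ends; Remotion's real implementation additionally supports multi-segment ranges, easing, and extrapolation options:

```typescript
// Simplified linear interpolation over a single [in0, in1] → [out0, out1]
// segment, clamping progress to [0, 1] at both ends.
const lerpFrame = (
  frame: number,
  [in0, in1]: [number, number],
  [out0, out1]: [number, number],
): number => {
  const progress = Math.min(Math.max((frame - in0) / (in1 - in0), 0), 1);
  return out0 + progress * (out1 - out0);
};
```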

### 2. Composition Architecture
Compositions are the "entry points". They define the canvas.
```tsx
<Composition
  id="Main"
  component={MyComponent}
  durationInFrames={300}
  fps={60}
  width={1920}
  height={1080}
  defaultProps={{ title: 'Hello' }}
/>
```

---

## 🚀 Advanced Patterns

### AI-Driven Video Modification (2026)
Integration with "Remotion Skills" allows for natural language instructions to modify compositions.
```tsx
// Pattern: instruction-driven prop updates (sketch — the actual LLM call is elided).
type Props = Record<string, unknown>;

export const aiUpdateHandler = async (
  instruction: string,
  currentProps: Props,
): Promise<Props> => {
  // 1. Send `instruction` plus the current props to the model.
  // 2. Validate the model's output (e.g. with Zod) before trusting it.
  // 3. Return the merged, validated props.
  const updatedProps: Props = { ...currentProps }; // placeholder merge
  return updatedProps;
};
```
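
Whatever the model returns should be treated as untrusted input. A minimal allow-list merge, sketched with a hypothetical `VideoProps` shape (field names are illustrative):

```typescript
// Hypothetical prop shape; only these fields may be edited by the model.
type VideoProps = { title: string; accentColor: string };

const ALLOWED_KEYS = new Set<keyof VideoProps>(['title', 'accentColor']);

// Merge an untrusted LLM-produced patch into current props,
// silently dropping unknown keys and non-string values.
const applyPatch = (
  current: VideoProps,
  patch: Record<string, unknown>,
): VideoProps => {
  const next = { ...current };
  for (const [key, value] of Object.entries(patch)) {
    if (ALLOWED_KEYS.has(key as keyof VideoProps) && typeof value === 'string') {
      next[key as keyof VideoProps] = value;
    }
  }
  return next;
};
```

In production, a Zod `safeParse` over the patch gives the same guarantee with richer error reporting.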

### Dynamic Metadata Calculation
Fetch data *before* the composition renders to set duration or dimensions.
```tsx
export const calculateMetadata = async ({ props }) => {
  const response = await fetch(`https://api.v2.com/video-data/${props.id}`);
  const data = await response.json();
  return {
    // data.duration is in seconds; this composition runs at 60 fps.
    // Ceil guarantees the integer frame count Remotion requires.
    durationInFrames: Math.ceil(data.duration * 60),
    props: { ...props, content: data.content },
  };
};
```
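
The `* 60` above silently couples the calculation to a 60 fps composition. A small helper (name illustrative) makes that dependency explicit and guarantees a positive integer frame count:

```typescript
// Convert a duration in seconds to a whole number of frames at a given fps.
// Remotion requires durationInFrames to be a positive integer, so round up
// and never return less than one frame.
const secondsToFrames = (seconds: number, fps: number): number =>
  Math.max(1, Math.ceil(seconds * fps));
```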

---

## 🚫 The "Do Not" List (Common Mistakes)

- **DO NOT** use `setTimeout` or `setInterval`. They do not sync with the renderer.
- **DO NOT** use `npm` for 2026 workflows; prefer `bun` for sub-second install and execution.
- **DO NOT** forget to use `<Sequence>` for delaying elements. Manual frame offsets are error-prone.
- **DO NOT** use Tailwind 3.x patterns; leverage **Tailwind 4.0** native container queries for responsive video layouts.
- **DO NOT** use `useState` for animation progress. Animation state must always be derived from `frame`.
- **DO NOT** perform heavy computations inside the render loop without `useMemo`. Remember that the component renders *every frame*.
- **DO NOT** use external libraries that rely on `window.requestAnimationFrame`. They won't be captured by the Remotion renderer.
- **DO NOT** hardcode frame counts. Always use constants or relative calculations like `2 * fps`.
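
In the same spirit as the last rule, `<Sequence from={...}>` offsets can be derived rather than hardcoded. A sketch that accumulates per-scene durations into start frames (the scene list is illustrative):

```typescript
// Build cumulative start frames for a list of scenes from their durations,
// so <Sequence from={...}> offsets never need to be hardcoded by hand.
const sceneStarts = (durations: number[]): number[] => {
  const starts: number[] = [];
  let acc = 0;
  for (const d of durations) {
    starts.push(acc);
    acc += d;
  }
  return starts;
};

// Example: three scenes of 90, 60, and 120 frames start at 0, 90, and 150.
const starts = sceneStarts([90, 60, 120]);
```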

---

## 📚 References
- [Animations & Timing](./references/animations-timing.md) - Precision interpolation and springs.
- [Compositions & Props](./references/compositions-props.md) - Structuring complex video projects.
- [Media & Assets](./references/media-assets.md) - Handling Video, Audio, and Lottie.
- [Sequencing & Series](./references/sequencing.md) - Timeline orchestration.
- [Next.js Integration](./references/nextjs-integration.md) - SSR and Server Actions for Video.

---

*Updated: January 22, 2026 - 20:00*

Overview

This skill is a senior-level Remotion expert focused on Remotion v4.0+, React 19, and Next.js 16 for high-performance programmatic video generation. It delivers frame-perfect animation patterns, deterministic rendering rules, and AI-driven workflows for 2026 production environments. The skill emphasizes predictable renders, data-driven compositions, and optimized pipelines for fast, reliable output.

How this skill works

It enforces frame-based animation patterns built on useCurrentFrame, interpolate, and spring so that renders are deterministic. It flags common anti-patterns (CSS animations, Math.random, timers) and recommends sound asset handling, Zod validation for props, and precomputed metadata. It also provides patterns for turning instruction-driven AI updates (LLM outputs) into Remotion composition props and rendering metadata.

When to use it

  • Building programmatic videos that require frame-perfect timing and repeatable renders
  • Creating data-driven or personalized videos where props and duration depend on external APIs
  • Migrating or upgrading projects to Remotion v4+, React 19, and Next.js 16 stacks
  • Automating video pipelines with AI-driven editing instructions or metadata calculation
  • Optimizing CI/CD rendering for deterministic output and fast dependency installs (bun)

Best practices

  • Derive all animation state from frame; avoid useState for progress and never use CSS keyframes or transitions
  • Seed or eliminate nondeterministic APIs (Math.random, Date.now) inside components
  • Use Zod to validate composition defaultProps and shape runtime props before rendering
  • Preload assets with staticFile and ensure remote assets are accessible during render
  • Compute duration and props before rendering compositions; avoid heavy per-frame computation without useMemo

Example use cases

  • Generate personalized marketing videos where text, durations, and assets are fetched and injected at render time
  • Produce sub-frame-accurate motion graphics using interpolate and custom easing curves
  • Create instruction-driven edits: map natural language LLM outputs to composition props for autonomous video updates
  • Run deterministic batch renders in CI with Bun for minimal install time and consistent output
  • Integrate with Next.js 16 server actions to compute metadata and trigger server-side renders

FAQ

Can I use CSS animations or requestAnimationFrame?

No. CSS animations and window.requestAnimationFrame are not deterministic for Remotion rendering. Use interpolate(), spring(), and frame-based math instead.
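
To illustrate the underlying idea, here is a toy damped-oscillator progress function evaluated purely from the frame number. It is not Remotion's spring(), only a demonstration that spring-like motion can be a pure function of frame, which is exactly what makes it render-safe:

```typescript
// Toy spring-like progress curve: a closed-form damped oscillator sampled at
// t = frame / fps. Starts at 0, settles at 1; every value depends only on the
// frame, so parallel rendering stays deterministic.
const springLike = (
  frame: number,
  fps: number,
  damping = 10,
  stiffness = 100,
): number => {
  const t = frame / fps;
  const omega = Math.sqrt(stiffness);   // natural frequency
  const zeta = damping / (2 * omega);   // damping ratio
  if (zeta >= 1) {
    // Critically damped / over-damped approximation: no overshoot.
    return 1 - Math.exp(-omega * t) * (1 + omega * t);
  }
  // Under-damped: oscillates toward 1 with decaying amplitude.
  const wd = omega * Math.sqrt(1 - zeta * zeta);
  return (
    1 -
    Math.exp(-zeta * omega * t) *
      (Math.cos(wd * t) + ((zeta * omega) / wd) * Math.sin(wd * t))
  );
};
```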

How do I ensure remote assets are available at render time?

Use staticFile for local assets and validate reachability for remote URLs beforehand. Pre-fetch or host assets in stable storage to avoid render failures.