
frappe-data-migration-generator skill


This skill generates robust Frappe data migration scripts with validation, error handling, and progress tracking for CSV imports and data transformations.

npx playbooks add skill venkateshvenki404224/frappe-apps-manager --skill frappe-data-migration-generator

Review the files below or copy the command above to add this skill to your agents.

SKILL.md
---
name: frappe-data-migration-generator
description: Generate data migration scripts for Frappe. Use when migrating data from legacy systems, transforming data structures, or importing large datasets.
---

# Frappe Data Migration Generator

Generate robust data migration scripts with validation, error handling, and progress tracking for importing data into Frappe.

## When to Use This Skill

Claude should invoke this skill when:
- User wants to migrate data from legacy systems
- User needs to import large CSV/Excel files
- User mentions data migration, ETL, or data import
- User wants to transform data structures
- User needs bulk data operations

## Capabilities

### 1. CSV Import Script

**Production-Ready CSV Importer:**
```python
import csv
import frappe
from frappe.utils import flt

def import_customers_from_csv(file_path):
    """Import customers with validation and error handling"""
    success = []
    errors = []

    with open(file_path, 'r', encoding='utf-8-sig') as f:
        reader = csv.DictReader(f)

        for idx, row in enumerate(reader, start=2):  # start=2: row 1 is the header
            try:
                # Validate required fields
                if not row.get('Customer Name'):
                    raise ValueError('Customer name required')

                # Transform data
                customer = {
                    'doctype': 'Customer',
                    'customer_name': row['Customer Name'].strip(),
                    'customer_group': row.get('Customer Group', 'Commercial'),
                    'territory': row.get('Territory', 'All Territories'),
                    'email_id': row.get('Email', '').strip(),
                    'mobile_no': row.get('Phone', '').strip(),
                    'credit_limit': flt(row.get('Credit Limit', 0))
                }

                # Check duplicate
                exists = frappe.db.exists('Customer',
                    {'customer_name': customer['customer_name']})

                if exists:
                    # Update
                    doc = frappe.get_doc('Customer', exists)
                    doc.update(customer)
                    doc.save()
                else:
                    # Insert
                    doc = frappe.get_doc(customer)
                    doc.insert()

                success.append(row['Customer Name'])

                # Commit every 100
                if len(success) % 100 == 0:
                    frappe.db.commit()
                    print(f"Processed {len(success)} records")

            except Exception as e:
                errors.append({'row': idx, 'data': row, 'error': str(e)})

    frappe.db.commit()
    return {'success': success, 'errors': errors}
```
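
A reasonable way to exercise the generated script is from a `bench console` session on a staging site first; the module path below is hypothetical and should match wherever the function lives in your app:

```python
# Run inside `bench console` on a staging site.
# `myapp.migrations.customers` is a hypothetical module path.
from myapp.migrations.customers import import_customers_from_csv

result = import_customers_from_csv('/path/to/customers.csv')
print(f"{len(result['success'])} imported/updated, {len(result['errors'])} failed")
for err in result['errors'][:10]:  # spot-check the first few failures
    print(err['row'], err['error'])
```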

## References

**Frappe Data Import:**
- Data Import: https://github.com/frappe/frappe/blob/develop/frappe/core/doctype/data_import/data_import.py
- CSV Utils: https://github.com/frappe/frappe/blob/develop/frappe/utils/csvutils.py

Overview

This skill generates production-ready data migration scripts for Frappe, covering safe imports, transformations, validation, and progress tracking. It turns CSV/Excel files and legacy-system extracts into repeatable migration code with error handling and a commit strategy built for reliable bulk operations inside Frappe apps.

How this skill works

The skill inspects the source data schema and generates script templates that validate required fields, transform values, and create or update documents of the target DocTypes. The templates include error capture, per-row reporting, periodic commits to avoid long transactions, and simple progress logging. Generated code follows standard Frappe patterns: frappe.get_doc, frappe.db.exists, doc.insert/save, and frappe.db.commit.
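
For per-row reporting, one convenient pattern (a sketch, not a fixed part of the generated templates) is to write the collected errors back out as a CSV so failed rows can be corrected and re-imported:

```python
import csv

def write_error_report(errors, out_path):
    """Dump row-level failures (as returned by the importer above) to a CSV.

    Each entry looks like {'row': line_no, 'data': original_row, 'error': message}.
    """
    if not errors:
        return
    # Original column names plus two bookkeeping columns.
    fieldnames = ['row', 'error'] + list(errors[0]['data'].keys())
    with open(out_path, 'w', newline='', encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for err in errors:
            writer.writerow({'row': err['row'], 'error': err['error'], **err['data']})
```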

When to use it

  • Migrating customer, supplier, item, or transaction data from legacy systems into Frappe
  • Importing large CSV or Excel exports that require cleansing and mapping before insertion (an Excel-reading sketch follows this list)
  • Transforming data structure or field names to match current Frappe DocTypes
  • Building repeatable ETL scripts for scheduled bulk operations or one-time migrations
  • When you need built-in validation, duplicate checking, and incremental commits during import
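
For Excel sources, a minimal reading sketch, assuming openpyxl is importable in your bench (Frappe uses it for XLSX handling, but verify on your setup):

```python
from openpyxl import load_workbook

def iter_excel_rows(file_path, sheet_name=None):
    """Yield each data row as a dict keyed by the header row."""
    wb = load_workbook(file_path, read_only=True, data_only=True)
    ws = wb[sheet_name] if sheet_name else wb.active
    rows = ws.iter_rows(values_only=True)
    headers = [str(h).strip() if h is not None else '' for h in next(rows)]
    for values in rows:
        yield dict(zip(headers, values))
    wb.close()
```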

Best practices

  • Validate required fields early and collect row-level errors rather than stopping the whole run
  • Use batched commits (for example every 100 records) to limit transaction size and reduce lock time; see the savepoint sketch after this list
  • Check for existing records and update instead of inserting to preserve links and avoid duplicates
  • Log progress and errors with row numbers and a sample of the original data for easier troubleshooting
  • Run migrations first in a staging environment and keep backups before running on production
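
A sketch of that commit strategy, assuming a Frappe version that exposes frappe.db.savepoint and frappe.db.rollback(save_point=...) (check yours); the make_doc callback is hypothetical and stands in for your row-to-document mapping:

```python
import frappe

def import_rows(rows, make_doc, batch_size=100):
    """Insert documents with a per-row savepoint and batched commits."""
    errors = []
    for idx, row in enumerate(rows, start=1):
        frappe.db.savepoint('row_import')
        try:
            make_doc(row).insert()
        except Exception as e:
            # Undo only this row; earlier uncommitted rows are kept.
            frappe.db.rollback(save_point='row_import')
            errors.append({'row': idx, 'error': str(e)})
            continue
        if idx % batch_size == 0:
            frappe.db.commit()  # keep each transaction short
    frappe.db.commit()
    return errors
```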

Example use cases

  • Generate a CSV importer for Customer master with validation of names, email formatting, and credit limits
  • Create a script to transform legacy order exports into Sales Invoice and Sales Order DocTypes with mapping rules (a field-mapping sketch follows this list)
  • Produce bulk item importers that set default item group, validate units of measure, and skip invalid rows
  • Build repeatable ETL code to sync data from an external CRM dump into Frappe with deduplication
  • Generate scripts that print progress and commit after each batch for long-running imports
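
Mapping rules in these scripts usually reduce to a column-to-fieldname table plus per-field converters; a minimal sketch (the column and field names below are illustrative, not taken from any particular export):

```python
from frappe.utils import flt, getdate

# Legacy column -> (Frappe fieldname, converter). Names are illustrative.
FIELD_MAP = {
    'Order No':   ('po_no', str.strip),
    'Order Date': ('transaction_date', getdate),
    'Customer':   ('customer', str.strip),
    'Discount %': ('additional_discount_percentage', flt),
}

def map_legacy_row(row):
    """Turn one legacy export row into Sales Order field values."""
    doc = {'doctype': 'Sales Order'}
    for col, (fieldname, convert) in FIELD_MAP.items():
        if row.get(col) not in (None, ''):
            doc[fieldname] = convert(row[col])
    return doc
```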

FAQ

Will generated scripts handle duplicates safely?

Yes. Scripts include existence checks and either update existing documents or insert new ones to avoid duplicate masters.

How does the skill manage transaction size and performance?

It recommends and includes periodic commits (for example every 100 records) to keep transactions small and reduce locking, improving reliability for large imports.