
Target Migration Script

The target migration script merges data from the legacy mx-dod-form Convex deployment into the main xentral deployment. It handles 43 target tables with ID preservation, conflict detection, batch processing, and automatic verification.

Overview

┌─────────────────────┐         ┌─────────────────────┐
│   mx-dod-form       │         │      xentral        │
│   (old deployment)  │ ──────► │  (new deployment)   │
│                     │ migrate │                     │
│   43 tables         │         │   43 tables         │
└─────────────────────┘         └─────────────────────┘

Quick Start

bash
# Preview what will happen
npx tsx scripts/target-migration/import-and-merge.ts --dry-run

# Execute full migration (import + merge)
npx tsx scripts/target-migration/import-and-merge.ts --execute

This script:

  1. Runs convex import --replace-all -y to import all data with preserved IDs
  2. Runs the merge script to apply any remaining updates

Features

Feature               Description
--------------------  -----------------------------------------------
ID Preservation       Original _id values preserved via convex import
Conflict Detection    "Newer wins" timestamp comparison
Dry-Run Mode          Preview all changes before applying
Live Verification     Verify against live database, not stale exports
Batch Processing      Configurable batch size (default: 200)
Large Table Handling  auditTrail skipped, sessions paginated
Markdown Reports      Audit trail for all operations
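
Batch processing amounts to slicing each table's records into fixed-size chunks (default 200) before writing, so no single mutation carries a whole table. A minimal sketch; the helper name is an assumption, not the script's actual API:

```typescript
// Illustrative batching helper (not the script's actual code):
// split a table's records into chunks of `size` so each write
// call receives at most one batch.
function chunk<T>(items: T[], size = 200): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}
```

A 450-record table would yield batches of 200, 200, and 50.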

Setup

1. Environment Configuration

Create .env file in scripts/target-migration/:

bash
cd scripts/target-migration
cp .env.example .env

Required environment variables:

bash
# Old deployment (source)
OLD_CONVEX_URL=https://your-old-deployment.convex.cloud
OLD_CONVEX_ADMIN_KEY=prod:your-admin-key

# New deployment (target)
NEW_CONVEX_URL=https://your-new-deployment.convex.cloud
NEW_CONVEX_ADMIN_KEY=prod:your-admin-key

# Optional: path to exported data
OLD_EXPORT_PATH=./scripts/target-migration/data/old-export
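
A quick way to fail fast on misconfiguration is to check the required variables before creating any clients. This helper is a sketch of that idea, not the actual clients.ts code:

```typescript
// Hypothetical helper: return the names of required environment
// variables that are missing or empty.
function missingEnvVars(
  env: Record<string, string | undefined>,
  required: string[]
): string[] {
  return required.filter((key) => !env[key] || env[key]!.trim() === "");
}

const REQUIRED = [
  "OLD_CONVEX_URL",
  "OLD_CONVEX_ADMIN_KEY",
  "NEW_CONVEX_URL",
  "NEW_CONVEX_ADMIN_KEY",
];
```

Calling `missingEnvVars(process.env, REQUIRED)` at startup turns a cryptic connection failure into a clear list of what is missing.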

2. Export Deployment Data

Export data from both deployments for comparison:

bash
# Export old deployment (download from Convex dashboard or run from old project)
# Place in: ./scripts/target-migration/data/old-export/

# Export new/local deployment
npx tsx scripts/target-migration/export-data.ts --new

# Check both exports exist
npx tsx scripts/target-migration/export-data.ts --all

Scripts

import-and-merge.ts

Full migration with ID preservation:

bash
# Preview
npx tsx scripts/target-migration/import-and-merge.ts --dry-run

# Execute
npx tsx scripts/target-migration/import-and-merge.ts --execute

# Options
--skip-import    # Skip convex import (only run merge)
--skip-merge     # Skip merge (only run import)

run.ts

Compare and merge data between deployments:

bash
# Preview changes
npx tsx scripts/target-migration/run.ts --dry-run

# Apply changes + auto-verify
npx tsx scripts/target-migration/run.ts --execute

# Apply only UPDATEs (skip INSERTs)
npx tsx scripts/target-migration/run.ts --execute --skip-inserts

# Verify only (uses live database)
npx tsx scripts/target-migration/run.ts --verify --live

# Process specific tables
npx tsx scripts/target-migration/run.ts --dry-run --tables=dods,issues,solutions

export-data.ts

Export deployment data for comparison:

bash
# Export new (local) deployment
npx tsx scripts/target-migration/export-data.ts --new

# Check old export exists
npx tsx scripts/target-migration/export-data.ts --old

# Export new + check old
npx tsx scripts/target-migration/export-data.ts --all

Migration Strategies

Strategy 1: Full Import + Merge

Best when you want to fully sync from old to new:

bash
npx tsx scripts/target-migration/import-and-merge.ts --execute

Pros                                 Cons
-----------------------------------  ------------------------------
✅ Preserves original IDs            ❌ Replaces all data in target
✅ All records imported
✅ Catches any updates since export

Strategy 2: Updates Only

Best when new deployment has data you want to keep:

bash
npx tsx scripts/target-migration/run.ts --execute --skip-inserts

Pros                                          Cons
--------------------------------------------  ------------------------------------
✅ Preserves new deployment's unique records  ❌ Missing records won't be imported
✅ Only updates existing records

Strategy 3: Selective Tables

Migrate specific tables:

bash
npx tsx scripts/target-migration/run.ts --execute --tables=dods,issues

Pros                            Cons
------------------------------  ------------------------
✅ Fine-grained control         ❌ May miss related data
✅ Faster for targeted updates

Conflict Resolution

When the same record exists in both deployments with different values:

Condition           Action  Reason
------------------  ------  ---------------------------
Record only in old  INSERT  Missing in new deployment
Records identical   SKIP    No changes needed
Old record newer    UPDATE  Old has more recent changes
New record newer    SKIP    Keep newer version

Timestamp comparison uses updatedAt field, falling back to _creationTime.
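
The rules above can be expressed as a small decision function. The record shape and names below are illustrative, not the script's actual implementation in compare.ts:

```typescript
type MigrationRecord = {
  _id: string;
  _creationTime: number;
  updatedAt?: number;
  [field: string]: unknown;
};

type Action = "INSERT" | "UPDATE" | "SKIP";

// Effective timestamp: updatedAt when present, else _creationTime.
function timestampOf(rec: MigrationRecord): number {
  return rec.updatedAt ?? rec._creationTime;
}

// "Newer wins": decide what to do with an old-deployment record given
// its counterpart (if any) in the new deployment.
function resolve(oldRec: MigrationRecord, newRec?: MigrationRecord): Action {
  if (!newRec) return "INSERT"; // missing in new deployment
  if (JSON.stringify(oldRec) === JSON.stringify(newRec)) return "SKIP"; // identical
  return timestampOf(oldRec) > timestampOf(newRec) ? "UPDATE" : "SKIP";
}
```

Note that JSON.stringify equality is key-order sensitive; the real comparison logic may do a field-by-field diff instead.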

Tables Migrated

The script processes 43 target-related tables:

Core Tables

  • assignees, coordinators, departments, sessions
  • designElements, milestones
  • issues, solutions, dods
  • dodNotes, dodNoteHistory, dodStatus

Planning Tables

  • importanceOptions, tags, targetTypes
  • targetDefinitions, targetDescriptionItems
  • approvalSteps

Resource Management

  • assigneeReservations, assigneeLeaveAndOT
  • assigneeAllocationOverride, assigneeMilestoneAggregates
  • assigneeReservationChangeRequests

Snapshots & History

  • commitmentPlanSnapshots, milestoneSnapshots
  • auditTrail (skipped - too large), migrationStatus

Notifications

  • dodNotifications, notificationHistory

Settings & Admin

  • globalSettings, adminUsers

Scheduling

  • milestoneDate, sprintDate, holidayDate
  • deadlineSchedulers, sprintUpdateRowColors

Custom Fields

  • customFieldDefinitions, customFieldValues, customFieldLinks

XILO Integration

  • issueXiloCell, issueAnnotationLinks, issueXiloCellHistory

Other

  • overallFeedback, xyncAlerts, syncQueue

Verification

Live Verification

After migration, verify against live database:

bash
npx tsx scripts/target-migration/run.ts --verify --live

Verification performs:

  1. Record Count Check - Old count ≤ New count for all tables
  2. Missing Records Check - All old records exist in new
  3. Spot Check - Random sample of 10 records per table verified field-by-field
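
The record-count check in step 1 reduces to a per-table comparison. A sketch under assumed names, not the actual verify.ts code:

```typescript
// Hypothetical count check: flag any table where the old deployment
// has more records than the new one (old <= new must hold).
function countCheckFailures(
  oldCounts: Record<string, number>,
  newCounts: Record<string, number>
): string[] {
  const failures: string[] = [];
  for (const [table, oldCount] of Object.entries(oldCounts)) {
    const newCount = newCounts[table] ?? 0;
    if (oldCount > newCount) {
      failures.push(`${table}: old=${oldCount} > new=${newCount}`);
    }
  }
  return failures;
}
```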

Auto-Verification

The --execute flag automatically runs verification after migration completes.

Reports

Reports are saved to scripts/target-migration/reports/:

File                              Description
--------------------------------  --------------------------------
migration-dry-run-{timestamp}.md  Preview of all changes
migration-execute-{timestamp}.md  Execution results + verification

Performance

Method             Speed         Notes
-----------------  ------------  ------------------------
Export comparison  ~2 seconds    Fastest - compares files
Live API queries   ~30+ seconds  For large tables

The script automatically uses exports if available.

File Structure

scripts/target-migration/
├── .env.example          # Environment template
├── .env                  # Local credentials (gitignored)
├── README.md             # Quick reference
├── import-and-merge.ts   # Full migration with ID preservation
├── run.ts                # Compare and merge entry point
├── export-data.ts        # Export deployment data
├── config.ts             # Configuration & table list
├── types.ts              # TypeScript interfaces
├── clients.ts            # Convex client setup
├── loader.ts             # Data loading (export or API)
├── compare.ts            # Record comparison logic
├── merge.ts              # INSERT/UPDATE execution
├── verify.ts             # Post-migration verification
├── report.ts             # Markdown report generation
├── data/                 # Exported data (gitignored)
│   ├── old-export/       # Old deployment export
│   └── new-export/       # New deployment export
└── reports/              # Generated reports

packages/convex/
├── migrationHelper.ts    # Convex queries for migration
└── migrationMutations.ts # Convex mutations for migration

Troubleshooting

"Table X not found in export"

Ensure exports are extracted correctly:

bash
ls -la ./data/old-export/
ls -la ./data/new-export/

Each table should have a {table}/documents.jsonl file.
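
Each documents.jsonl file is JSON Lines: one JSON document per line. A minimal reader (the helper name is an assumption) looks like:

```typescript
// Parse a JSON Lines export: one document per non-empty line.
function parseJsonl(text: string): Record<string, unknown>[] {
  return text
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as Record<string, unknown>);
}
```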

Verification Failed

Verification compares against export files by default. For live verification:

bash
npx tsx scripts/target-migration/run.ts --verify --live

Or re-export after migration:

bash
npx tsx scripts/target-migration/export-data.ts --new
npx tsx scripts/target-migration/run.ts --verify

INSERT fails with "Cannot specify _id"

Convex doesn't allow inserting with specific IDs through mutations. Use the import-and-merge script:

bash
npx tsx scripts/target-migration/import-and-merge.ts --execute

This uses convex import which preserves IDs.

Timeout errors

The script includes retry logic for timeout errors. For persistent issues:

  • Use exported data instead of live API
  • Process tables in smaller batches with --tables
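
The retry behavior can be sketched as a wrapper with linear backoff; the attempt count and delay values here are assumptions, not the script's actual settings:

```typescript
// Hypothetical retry wrapper: re-run an async operation up to
// `attempts` times, waiting a little longer after each failure.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  delayMs = 1000
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Linear backoff: wait longer before each subsequent retry.
      await new Promise((resolve) => setTimeout(resolve, delayMs * (i + 1)));
    }
  }
  throw lastError;
}
```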

Large tables (auditTrail, sessions)

  • auditTrail is skipped automatically (too large)
  • sessions is processed with pagination
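
Paginating a large table like sessions can be sketched as a cursor loop; fetchPage below stands in for the real paginated query and is an assumption, not the script's API:

```typescript
// Illustrative cursor pagination: keep requesting pages until the
// backend returns a null cursor, accumulating all records.
async function loadAllPages<T>(
  fetchPage: (cursor: string | null) => Promise<{ page: T[]; cursor: string | null }>
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | null = null;
  do {
    const result = await fetchPage(cursor);
    all.push(...result.page);
    cursor = result.cursor;
  } while (cursor !== null);
  return all;
}
```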
