Created Sep 25, 2025 · Updated Sep 27, 2025

@jrmc/adonis-etl


AdonisJS ETL scaffold commands package - Generate ETL (Extract, Transform, Load) components for your AdonisJS applications.

Features

  • 🚀 Interactive CLI - Guided command to create ETL components
  • 📦 Component Generation - Generate Source, Transform, and Destination classes
  • 🎯 Smart Naming - Automatic class naming based on your ETL process
  • 🔧 TypeScript Ready - Full TypeScript support with proper interfaces
  • 📁 Organized Structure - Files are created in proper directories (app/etl/)

Installation

node ace add @jrmc/adonis-etl

Custom Directory Configuration

You can customize the directory where ETL files are generated by adding a directories configuration to your adonisrc.ts:

// adonisrc.ts
export default defineConfig({
  directories: {
    etl: 'etl', // Custom path for ETL files
  },
  // ... other configurations
})

If not specified, files will be generated in the default app/etl/ directory.

Usage

Generate ETL Components

Run the interactive command to create your ETL components:

node ace make:etl my-process

The command will guide you through:

  1. Selecting components - Choose which ETL components to create:
     • Source (data extraction)
     • Transform (data transformation)
     • Destination (data loading)
  2. Defining source type - Specify your data source (e.g., database, api, file)
  3. Defining destination type - Specify your data destination (e.g., database, api, file)

Example

$ node ace make:etl import-product

? Which ETL components do you want to create? ›
❯◉ Source
 ◉ Transform
 ◉ Destination

? What is the source type? (e.g., database, api, file) › csv
? What is the destination type? (e.g., database, api, file) › db

✅ ETL files created successfully for: import-product

This will create (with default configuration):

  • app/etl/sources/import_product_csv_source.ts
  • app/etl/transforms/import_product_csv_to_db_transform.ts
  • app/etl/destinations/import_product_db_destination.ts

Or with custom directory configuration (directories.etl: 'etl'):

  • etl/sources/import_product_csv_source.ts
  • etl/transforms/import_product_csv_to_db_transform.ts
  • etl/destinations/import_product_db_destination.ts

Generated Files

Source Component

import { Source } from '@jrmc/adonis-etl'

export default class ImportProductCsvSource implements Source {
  async *each() {
    // Implement your data extraction logic here
  }
}
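As an illustration, here is what a filled-in source might look like. This is a sketch, not the package's code: the local `Source` interface below is a stand-in for the one exported by `@jrmc/adonis-etl` (assumed to require only the async generator `each()`), and the CSV parsing is deliberately naive (no quoting or escaping).

```typescript
// Stand-in for the Source interface from '@jrmc/adonis-etl' (assumed shape)
interface Source {
  each(): AsyncGenerator<unknown>
}

const csv = 'sku,name\nA1,Widget\nB2,Gadget'

// Naive in-memory CSV source: yields one object per data line
class InMemoryCsvSource implements Source {
  async *each() {
    const [header, ...lines] = csv.split('\n')
    const keys = header.split(',')
    for (const line of lines) {
      const values = line.split(',')
      // Pair each header key with the value in the same column
      yield Object.fromEntries(keys.map((key, i) => [key, values[i]]))
    }
  }
}
```

A real source would stream the file instead of holding it in memory; the generator shape stays the same.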

Transform Component

import { Transform } from '@jrmc/adonis-etl'

export default class ImportProductCsvToDbTransform implements Transform {
  async process(row: unknown) {
    // Implement your data transformation logic here
    return row
  }
}
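A filled-in transform might look like the sketch below, which coerces a raw CSV `price` string into a number. The local `Transform` interface is an assumed stand-in for the package's, and the field name is illustrative.

```typescript
// Stand-in for the Transform interface from '@jrmc/adonis-etl' (assumed shape)
interface Transform {
  process(row: unknown): Promise<unknown>
}

// Example transform: coerce the raw CSV `price` string into a number
class PriceParseTransform implements Transform {
  async process(row: unknown) {
    const record = row as Record<string, string>
    return { ...record, price: Number.parseFloat(record.price) }
  }
}
```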

Destination Component

import { Destination } from '@jrmc/adonis-etl'

export default class ImportProductDbDestination implements Destination {
  async write(row: unknown) {
    // Implement your data loading logic here
  }
}
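For completeness, a minimal filled-in destination, again with a local stand-in for the package interface: this one just buffers rows in memory (handy in tests), where a real destination would insert into a database.

```typescript
// Stand-in for the Destination interface from '@jrmc/adonis-etl' (assumed shape)
interface Destination {
  write(row: unknown): Promise<void>
}

// Example destination: buffers rows in memory (useful in tests);
// a real destination would persist each row instead
class InMemoryDestination implements Destination {
  rows: unknown[] = []

  async write(row: unknown) {
    this.rows.push(row)
  }
}
```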

Usage Examples

The sample/ folder contains two complete ETL implementation examples:

1. Books Import (Source → Destination)

This example shows a simple ETL process without transformation:

Command: node ace import:books

Components:

  • Source: book_csv_source.ts - Reads a CSV file of books (5M records) with batch processing (500 items)
  • Destination: book_db_destination.ts - Inserts data into the database via db.table().multiInsert()

Features:

  • Batch processing for performance optimization
  • CSV error handling (empty lines and errors ignored)
  • Optimized buffer (128KB) for large files
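The batching idea used by the books example can be sketched as a small generator helper (names here are illustrative, not the sample's actual code): rows are grouped into fixed-size batches so the destination can bulk-insert instead of writing row by row.

```typescript
// Group an iterable into fixed-size batches so the destination can
// bulk-insert (e.g. 500 rows per multiInsert) instead of row by row
function* inBatches<T>(items: Iterable<T>, size: number): Generator<T[]> {
  let batch: T[] = []
  for (const item of items) {
    batch.push(item)
    if (batch.length === size) {
      yield batch
      batch = []
    }
  }
  if (batch.length > 0) yield batch // flush the final partial batch
}
```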

2. Products Import (Source → Transform → Destination)

This example shows a complete ETL process with data transformation:

Command: node ace import:products

Components:

  • Source: product_csv_source.ts - Reads a CSV file of products (500K records)
  • Transform: product_csv_to_db_transform.ts - Transforms CSV data (French column names → English)
  • Destination: product_db_destination.ts - Saves via Lucid model Product.create()

Features:

  • Column name transformation (e.g., Nom → name, Prix → price)
  • AdonisJS model usage for persistence
  • Data processing logging
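The column-name transformation described above boils down to a static rename map; a sketch (the sample's actual map may contain more entries):

```typescript
// Map French CSV headers to English model fields; unknown keys pass through
const columnMap: Record<string, string> = {
  Nom: 'name',
  Prix: 'price',
}

function renameColumns(row: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(row).map(([key, value]) => [columnMap[key] ?? key, value])
  )
}
```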

Example Files Structure

sample/
├── commands/
│   ├── import_books.ts      # Books import command
│   └── import_products.ts   # Products import command
├── etl/
│   ├── sources/
│   │   ├── book_csv_source.ts
│   │   └── product_csv_source.ts
│   ├── transforms/
│   │   └── product_csv_to_db_transform.ts
│   ├── destinations/
│   │   ├── book_db_destination.ts
│   │   └── product_db_destination.ts
│   └── resources/
│       ├── books.csv        # Sample data
│       └── products.csv     # Sample data
└── app/models/
    ├── book.ts              # Book model
    └── product.ts           # Product model

These examples demonstrate different possible approaches:

  • Batch processing vs line-by-line processing
  • Direct database insertion vs Lucid model usage
  • With or without data transformation

Performance Optimization

For large-scale ETL operations, consider integrating a job queue (such as BullMQ or the AdonisJS Queue package) to run ETL processes asynchronously, distribute the workload across multiple workers, and improve reliability with automatic retries.
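Short of a full queue, simply bounding concurrency already helps. The helper below is a minimal in-process sketch of that idea (it is not the AdonisJS Queue or BullMQ API): it runs an async worker over items with at most `limit` tasks in flight at once.

```typescript
// Run an async worker over items with at most `limit` in flight at once
async function runWithConcurrency<T>(
  items: T[],
  limit: number,
  worker: (item: T) => Promise<void>
): Promise<void> {
  const inFlight = new Set<Promise<void>>()
  for (const item of items) {
    // Remove each task from the set as soon as it settles
    const task = worker(item).finally(() => inFlight.delete(task))
    inFlight.add(task)
    // At capacity: wait for any task to finish before starting another
    if (inFlight.size >= limit) await Promise.race(inFlight)
  }
  await Promise.all(inFlight)
}
```

A real queue adds what this sketch cannot: persistence across restarts, multiple processes, and retry policies.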

Dependencies

This package requires:

  • @jrmc/etl - The core ETL library
  • AdonisJS 6.x
  • Node.js 22.17.0+

File Structure

Generated files are organized in the following structure:

Default structure:

app/
└── etl/
    ├── sources/
    │   └── your_source_files.ts
    ├── transforms/
    │   └── your_transform_files.ts
    └── destinations/
        └── your_destination_files.ts

Custom structure (with directories.etl: 'src/module/etl'):

src/
└── module/
    └── etl/
        ├── sources/
        │   └── your_source_files.ts
        ├── transforms/
        │   └── your_transform_files.ts
        └── destinations/
            └── your_destination_files.ts

Naming Convention

The generated class names follow this pattern:

  • Source: {process_name}_{source_type}_source
  • Transform: {process_name}_{source_type}_to_{destination_type}_transform
  • Destination: {process_name}_{destination_type}_destination

All names are automatically converted to snake_case for file names and PascalCase for class names.

Example: For process import-product with source csv and destination db:

  • File: import_product_csv_source.ts → Class: ImportProductCsvSource
  • File: import_product_csv_to_db_transform.ts → Class: ImportProductCsvToDbTransform
  • File: import_product_db_destination.ts → Class: ImportProductDbDestination
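The two conversions can be sketched with a pair of small helpers. These mimic the convention described above; they are not the package's internal implementation.

```typescript
// snake_case for file names: lowercase, spaces and dashes become underscores
function toSnakeCase(name: string): string {
  return name.trim().toLowerCase().replace(/[\s-]+/g, '_')
}

// PascalCase for class names: capitalize each snake_case part and join
function toPascalCase(name: string): string {
  return toSnakeCase(name)
    .split('_')
    .map((part) => part.charAt(0).toUpperCase() + part.slice(1))
    .join('')
}

const fileBase = `${toSnakeCase('import-product')}_csv_source` // import_product_csv_source
const className = toPascalCase(fileBase) // ImportProductCsvSource
```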

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Author

Jeremy Chaufourier

  • Email: jeremy@chaufourier.fr
  • GitHub: @batosai

Changelog

1.0.0

  • Initial release
  • Interactive ETL component generation
  • Support for Source, Transform, and Destination components
  • TypeScript support
  • Custom directory configuration support via directories.etl in adonisrc.ts