Dodo Payments provides a built-in database sync feature that automatically synchronizes your payment data with your own database. You can sync payments, customers, subscriptions, and licenses to maintain a local copy of your data for analytics, reporting, or integration with other systems.
Implementation: Available via npm package | Source Code: GitHub

What Can You Sync?

Our database sync feature supports synchronizing the following Dodo Payments entities to your database:

Payments

Sync all payment transactions, including one-time payments, refunds, and payment status updates.

Customers

Keep your customer data in sync, including customer profiles, contact information, and metadata.

Subscriptions

Synchronize subscription data, including active subscriptions, billing cycles, and subscription status changes.

Licenses

Sync license information, including license keys, activations, and license status updates.
You can sync any combination of these entities by specifying them in the scopes parameter. All sync operations are incremental and only transfer new or updated records for optimal performance.

Database Support

We currently support MongoDB. We are actively working on expanding support for:
  • Databases: PostgreSQL, ClickHouse, Snowflake, and others.
  • Pipelines: ETL pipelines and real-time sync.
We’re continuously expanding database support. If you’d like to contribute a new database integration, please submit a Pull Request to our GitHub repository.

Getting Started

You can use our database sync feature via the CLI or programmatically in your code. Both methods provide the same functionality—choose the one that best fits your workflow.

Using the CLI

The CLI tool provides a quick way to set up and run database synchronization. Install it globally to use it from anywhere in your terminal:
npm install -g dodo-sync

Running the CLI

The CLI supports two modes: Interactive Mode for guided setup, and Manual Mode for direct configuration.
Interactive Mode: Run the command without arguments to start the interactive setup wizard.
dodo-sync
Manual Mode: Pass arguments directly to skip the wizard.
dodo-sync -i [interval] -d [database] -u [database_uri] --scopes [scopes] --api-key [api_key] --env [environment]
Example:
dodo-sync -i 600 -d mongodb -u mongodb://mymongodb.url --scopes "licences,payments,customers,subscriptions" --api-key YOUR_API_KEY --env test_mode

CLI Arguments

--interval
number
Sync interval in seconds. Determines how frequently the sync operation runs. If not provided, the sync will run once and exit (see the one-off example after this list).
--database
string
required
Database type to use. Currently only "mongodb" is supported.
--database-uri
string
required
Connection URI for your database. For MongoDB, this should be a valid MongoDB connection string (e.g., mongodb://localhost:27017 or mongodb+srv://user:pass@cluster.mongodb.net/).
--scopes
string
required
Comma-separated list of data entities to sync. Available scopes: licences, payments, customers, subscriptions. Example: "payments,customers".
--api-key
string
required
Your Dodo Payments API key. Should start with dp_live_ for live mode or dp_test_ for test mode.
--env
string
required
Environment target. Must be either "live_mode" or "test_mode". This determines which Dodo Payments environment to sync from.
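Because --interval is optional, you can run a one-off sync and let the process exit, which fits a scheduler such as cron. The connection URI, scopes, and API key below are placeholders:
dodo-sync -d mongodb -u mongodb://localhost:27017 --scopes "payments,customers" --api-key YOUR_API_KEY --env test_mode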

Using in Your Code

For programmatic control, integrate the sync feature directly into your application. Install it as a dependency in your project:
npm install dodo-sync

Automatic Sync (Interval-based)

Use automatic sync when you want the sync to run continuously at regular intervals:
import { DodoSync } from 'dodo-sync';

const syncDodoPayments = new DodoSync({
  interval: 60, // Sync every 60 seconds
  database: 'mongodb',
  databaseURI: process.env.MONGODB_URI, // e.g., 'mongodb://localhost:27017'
  scopes: ['licences', 'payments', 'customers', 'subscriptions'],
  dodoPaymentsOptions: {
    bearerToken: process.env.DODO_PAYMENTS_API_KEY,
    environment: 'test_mode' // or 'live_mode'
  }
});

// Initialize connection
await syncDodoPayments.init();

// Start the sync loop
syncDodoPayments.start();
The interval option is required when using .start() for automatic syncing. The sync will run continuously at the specified interval until the process is stopped.

Manual Sync

Use manual sync when you want to trigger sync operations on-demand (e.g., from a cron job or API endpoint):
import { DodoSync } from 'dodo-sync';

const syncDodoPayments = new DodoSync({
  database: 'mongodb',
  databaseURI: process.env.MONGODB_URI,
  scopes: ['licences', 'payments', 'customers', 'subscriptions'],
  dodoPaymentsOptions: {
    bearerToken: process.env.DODO_PAYMENTS_API_KEY,
    environment: 'test_mode'
  }
});

// Initialize connection
await syncDodoPayments.init();

// Trigger a single sync operation
await syncDodoPayments.run();
When using manual sync, the interval option is not required. You can call .run() whenever you need to perform a sync operation.
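As a sketch, a single sync can also be triggered over HTTP, for example from a scheduler or an admin dashboard. The endpoint path, port, and lack of authentication below are illustrative choices, not part of dodo-sync:
import { createServer } from 'node:http';
import { DodoSync } from 'dodo-sync';

const syncDodoPayments = new DodoSync({
  database: 'mongodb',
  databaseURI: process.env.MONGODB_URI,
  scopes: ['licences', 'payments', 'customers', 'subscriptions'],
  dodoPaymentsOptions: {
    bearerToken: process.env.DODO_PAYMENTS_API_KEY,
    environment: 'test_mode'
  }
});

// Initialize the database connection once at startup
await syncDodoPayments.init();

// POST /sync triggers one sync pass; everything else returns 404.
createServer(async (req, res) => {
  if (req.method === 'POST' && req.url === '/sync') {
    await syncDodoPayments.run();
    res.writeHead(200).end('sync complete');
  } else {
    res.writeHead(404).end();
  }
}).listen(3000);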

Constructor Options

database
string
required
Name of the database to use. Currently only "mongodb" is supported.
databaseURI
string
required
Connection string for your database. For MongoDB, provide a valid MongoDB connection URI.
scopes
string[]
required
Array of entities to sync. Available options: "licences", "payments", "customers", "subscriptions". You can include any combination of these.
dodoPaymentsOptions
object
required
Dodo Payments API configuration for authentication and environment selection. See the TypeScript SDK types for complete options. Required properties:
  • bearerToken: Your Dodo Payments API key
  • environment: Either "test_mode" or "live_mode"
interval
number
Time in seconds between automatic syncs. Required when using .start() for automatic syncing. Optional when using .run() for manual syncing.

Important Information

A database named dodopayments_sync will be automatically created on your database server. All sync data will be stored there. This database name is currently fixed and cannot be changed.
The sync engine tracks changes and only syncs new or updated records, making subsequent syncs efficient even with large datasets.
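To confirm that data has landed, you can inspect the dodopayments_sync database with the standard MongoDB Node.js driver. The collection names depend on the scopes you enabled, so this sketch simply lists whatever the sync created (it assumes the mongodb package is installed in your project):
import { MongoClient } from 'mongodb';

// Use the same connection URI you passed to dodo-sync
const client = new MongoClient(process.env.MONGODB_URI);
await client.connect();

// List the collections created in the fixed dodopayments_sync database
const collections = await client.db('dodopayments_sync').listCollections().toArray();
console.log(collections.map((c) => c.name));

await client.close();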

Additional Resources