Destination Categories
Databases
Traditional relational databases including PostgreSQL, MySQL, MSSQL, and Oracle
Data Warehouses
Cloud data warehouses like BigQuery, Snowflake, and Redshift
Cloud Storage
Object storage services including S3, GCS, and Azure Blob Storage
Streaming Platforms
Real-time data streaming with Kafka and other messaging systems
All Supported Destinations
Mage provides native integrations for the following destinations:

Databases
- PostgreSQL - Open-source relational database
- MySQL - Popular open-source database
- Microsoft SQL Server (MSSQL) - Enterprise database system
- Oracle Database - Enterprise-grade relational database
- MongoDB - NoSQL document database
- ClickHouse - Columnar database for analytics
- Teradata - Enterprise data warehouse
Data Warehouses
- Google BigQuery - Serverless data warehouse with ML capabilities
- Snowflake - Cloud data warehouse with elastic scaling
- Amazon Redshift - AWS data warehouse service
- Doris - Real-time analytical database
Cloud Storage
- Amazon S3 - AWS object storage
- Google Cloud Storage (GCS) - Google Cloud object storage
- Delta Lake (S3) - Open table format on S3
- Delta Lake (Azure) - Open table format on Azure Blob Storage
Search and Analytics
- Elasticsearch - Search and analytics engine
- OpenSearch - Open-source search and analytics
Streaming and Messaging
- Apache Kafka - Distributed event streaming platform
Other Platforms
- Trino - Distributed SQL query engine (Iceberg, Delta Lake connectors)
- Salesforce - CRM platform
- Airtable - Collaborative database platform
Common Configuration
All destinations share common configuration patterns.

Connection Settings
Most destinations require authentication credentials and connection details.

Table Configuration
Specify the target schema and table.

Unique Constraints
Handle duplicate records with unique constraints.

Internal Columns
Mage automatically adds tracking columns to all exported records:
- _mage_created_at - Timestamp when the record was first created
- _mage_updated_at - Timestamp when the record was last updated
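Taken together, these patterns typically appear in a destination configuration like the following sketch (keys and values here are illustrative and vary by destination):

```yaml
# Illustrative PostgreSQL-style destination config; exact keys vary by destination
database: analytics
host: db.example.com
port: 5432
username: mage_user
password: my_password
schema: public
table: orders
# Upsert on these columns instead of inserting duplicate rows
unique_constraints:
  - order_id
unique_conflict_method: UPDATE
```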
Configuration Methods
Via UI
- Navigate to Pipelines → Edit Pipeline
- Select your data exporter block
- Click Data exporter in the block configuration
- Choose your destination from the dropdown
- Fill in the required configuration fields
Via YAML
Create a configuration file in your pipeline.

Environment Variables
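Credentials can be referenced from the environment with Mage's env_var template function instead of literal values (a sketch; the variable names are illustrative):

```yaml
# Reference environment variables instead of hard-coding secrets
username: "{{ env_var('DB_USERNAME') }}"
password: "{{ env_var('DB_PASSWORD') }}"
```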
Store sensitive credentials as environment variables rather than hard-coding them in configuration files.

Batch vs Stream Processing
Batch Processing
Most destinations support batch processing, where data is collected and written in batches:
- Better performance - Reduced network overhead
- Lower cost - Fewer API calls and transactions
- Configurable batch size - Control memory usage
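Batch behavior is usually tunable per destination; a minimal sketch, assuming a batch-size setting is exposed (the key name here is illustrative and varies by destination):

```yaml
# Illustrative batch tuning; the exact key depends on the destination
batch_size: 10000  # records written per batch
```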
Stream Processing
Some destinations support real-time streaming:
- Low latency - Near real-time data delivery
- Event-driven - Process data as it arrives
- Continuous updates - Keep downstream systems in sync
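For example, a Kafka destination is typically configured with a broker address and target topic (a sketch; keys are illustrative):

```yaml
# Illustrative Kafka destination config
bootstrap_server: kafka.example.com:9092
topic: exported_events
```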
Performance Optimization
Batch Load Methods
Several data warehouse destinations support optimized batch loading.

Partitioning
Partition large tables for better query performance.

Column Type Handling
Mage automatically maps Python data types to destination-specific column types:

| Python Type | PostgreSQL | BigQuery | Snowflake | Redshift |
|---|---|---|---|---|
| str | TEXT | STRING | VARCHAR | VARCHAR |
| int | BIGINT | INT64 | NUMBER | BIGINT |
| float | DOUBLE PRECISION | FLOAT64 | FLOAT | DOUBLE PRECISION |
| bool | BOOLEAN | BOOL | BOOLEAN | BOOLEAN |
| datetime | TIMESTAMP | DATETIME | TIMESTAMP | TIMESTAMP |
| dict | JSONB | JSON | VARIANT | VARCHAR |
| list | ARRAY | ARRAY | ARRAY | VARCHAR |
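The PostgreSQL column of the table above can be expressed as a small type lookup; this is an illustrative sketch, not Mage's actual implementation:

```python
# Sketch of mapping a sample Python value to a PostgreSQL column type,
# mirroring the table above (illustrative only, not Mage's internals).
from datetime import datetime

PG_TYPE_MAP = {
    str: "TEXT",
    int: "BIGINT",
    float: "DOUBLE PRECISION",
    bool: "BOOLEAN",
    datetime: "TIMESTAMP",
    dict: "JSONB",
    list: "ARRAY",
}

def pg_column_type(value):
    """Return the PostgreSQL column type for a sample Python value."""
    # type() (not isinstance) keeps bool distinct from int,
    # since type(True) is bool, not int.
    return PG_TYPE_MAP.get(type(value), "TEXT")

# pg_column_type(3.14) -> "DOUBLE PRECISION"
# pg_column_type({"a": 1}) -> "JSONB"
```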
Testing Connections
All destinations support connection testing.

Error Handling
Mage provides detailed logging and error handling for all destinations:
- Connection errors - Authentication and network issues
- Schema mismatches - Data type incompatibilities
- Constraint violations - Unique constraint and foreign key errors
- Permission errors - Insufficient database privileges
Next Steps
Database Destinations
Configure PostgreSQL, MySQL, MSSQL, and other databases
Data Warehouses
Set up BigQuery, Snowflake, and Redshift
Cloud Storage
Export to S3, GCS, and Delta Lake
Streaming
Stream data to Kafka and other platforms