Overview
In this tutorial, you’ll build your first end-to-end data pipeline in Mage. You’ll learn how to:
- Load data from an API
- Transform the data
- Export the results to a database
- Execute your pipeline
Prerequisites
Before you begin, make sure you have:
- Mage installed and running (visit http://localhost:6789)
- Basic knowledge of Python
- Pandas library (included with Mage)
What You’ll Build
You’ll create a simple ETL pipeline that:
- Fetches data from a public API
- Cleans and transforms the data
- Exports it to a file or database
Create a New Pipeline
Navigate to the Mage UI and create a new pipeline:
- Click the Pipelines icon in the left sidebar
- Click + New pipeline
- Select Standard (batch) as the pipeline type
- Name your pipeline my_first_pipeline

Mage supports multiple pipeline types: Standard (batch), Streaming, and Data integration. We’ll use Standard for this tutorial.
Add a Data Loader Block
Data loader blocks are responsible for fetching data from various sources.
- Click + Data loader in the pipeline editor
- Select Python > API
- Name it load_api_data
- Click Execute block to test your data loader
- Verify the output shows user data in the preview panel
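A data loader block might look roughly like the sketch below. The endpoint and column names are placeholders, and a pass-through stand-in replaces the decorator that Mage’s generated template imports from mage_ai.data_preparation.decorators, so the sketch runs outside Mage; inline sample data stands in for a live API response.

```python
import io

import pandas as pd


# Stand-in for Mage's decorator (in a real block the template imports it
# from mage_ai.data_preparation.decorators).
def data_loader(func):
    return func


@data_loader
def load_api_data(*args, **kwargs) -> pd.DataFrame:
    # In a real block you would fetch JSON from your API, e.g.:
    #   response = requests.get('https://example.com/api/users')  # placeholder URL
    #   return pd.DataFrame(response.json())
    # Inline CSV stands in for the API response here.
    sample = io.StringIO(
        'id,name,email\n'
        '1, alice ,alice@example.com\n'
        '2,Bob,\n'
    )
    return pd.read_csv(sample)
```

Whatever this function returns becomes the block’s output, which Mage previews and passes to downstream blocks.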
Add a Transformer Block
Transformer blocks process and clean your data.
- Click + Transformer below your data loader
- Select Python > Generic (no template)
- Name it transform_data
Transformer blocks automatically receive data from their upstream parent blocks as the first parameter.
- Execute the block to see the transformed data
- Inspect the output to verify the transformations
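A minimal transformer sketch is below. The column names are assumptions carried over from a loader that returns id/name/email records, and a pass-through stand-in again replaces Mage’s decorator so the sketch runs anywhere:

```python
import pandas as pd


# Stand-in for Mage's @transformer decorator.
def transformer(func):
    return func


@transformer
def transform_data(df: pd.DataFrame, *args, **kwargs) -> pd.DataFrame:
    # Drop rows with no email, then tidy up the name column.
    cleaned = df.dropna(subset=['email']).copy()
    cleaned['name'] = cleaned['name'].str.strip().str.title()
    return cleaned
```

The first parameter is the upstream block’s output; the transformed DataFrame you return flows to the next block.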
Add a Data Exporter Block
Data exporter blocks write your processed data to destinations.
- Click + Data exporter below your transformer
- Select Python > Local file
- Name it export_to_file
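A sketch of the exporter block follows. The output file name is a placeholder, and the stand-in decorator replaces the one Mage’s template imports from mage_ai.data_preparation.decorators:

```python
import pandas as pd


# Stand-in for Mage's @data_exporter decorator.
def data_exporter(func):
    return func


@data_exporter
def export_to_file(df: pd.DataFrame, **kwargs) -> None:
    # Placeholder path; a keyword argument lets callers override it.
    path = kwargs.get('path', 'users_clean.csv')
    df.to_csv(path, index=False)
```

Exporters typically return nothing; their job is the side effect of writing the upstream data to a destination.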
Execute the Complete Pipeline
Now that all blocks are connected, let’s run the entire pipeline:
- Click on the last block (data exporter)
- Click Execute with all upstream blocks
- Watch as Mage executes each block in dependency order

You’ll see:
- Green checkmarks on successfully executed blocks
- Execution time for each block
- Output data previews
- Any print statements or logs
Mage automatically determines the execution order based on block dependencies. Blocks with no dependencies run first, followed by their downstream blocks.
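Conceptually, this is a topological sort of the block graph. The sketch below mirrors it with Python’s stdlib graphlib (the block names are this tutorial’s; mapping each block to its upstream parents is an illustration, not Mage’s internal API):

```python
from graphlib import TopologicalSorter

# Each block mapped to the set of blocks it depends on (its upstream parents).
graph = {
    'load_api_data': set(),
    'transform_data': {'load_api_data'},
    'export_to_file': {'transform_data'},
}

execution_order = list(TopologicalSorter(graph).static_order())
print(execution_order)  # → ['load_api_data', 'transform_data', 'export_to_file']
```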
Understanding Block Types
Mage provides several block types for different purposes:

Data Loader
Load data from APIs, databases, files, or cloud storage
Transformer
Clean, process, and transform your data
Data Exporter
Write data to databases, files, or cloud destinations
Sensor
Wait for conditions or external events
DBT
Run dbt models as part of your pipeline
Custom
Write custom Python code for any purpose
Block Decorators
Mage uses Python decorators to identify block types: @data_loader, @transformer, @data_exporter, @sensor, and @custom.

Viewing Pipeline Execution Results
After executing your pipeline, you can:
- View block outputs: Click any block to see its output data
- Check logs: View print statements and execution logs
- Inspect variables: See all variables created during execution
- Review execution time: Optimize slow blocks
Scheduling Your Pipeline
To run your pipeline automatically:
- Click the Triggers icon in the left sidebar
- Click Create new trigger
- Select Schedule as the trigger type
- Configure:
  - Name: daily_user_sync
  - Frequency: Daily at 9 AM
  - Status: Active
Mage uses cron syntax for scheduling. The format is:
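For example, a few common schedules in this five-field syntax (the 9 AM entry matches the trigger configured above; the others are illustrative):

```
0 9 * * *       # every day at 9:00 AM
*/15 * * * *    # every 15 minutes
0 0 * * 1       # every Monday at midnight
```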
minute hour day month day_of_week

Next Steps
Congratulations! You’ve built your first Mage pipeline. Here’s what to explore next:

ETL Workflow Tutorial
Build a complete ETL workflow with multiple data sources
Streaming Pipeline
Process real-time data with streaming pipelines
DBT Integration
Integrate dbt models into your pipelines
ML Pipeline
Build machine learning pipelines with training and inference
Troubleshooting
Block won't execute
- Check for syntax errors in your code
- Ensure upstream blocks have executed successfully
- Verify all required imports are present
- Check the logs for error messages
Import errors
- Mage includes pandas, requests, and other common libraries
- Install additional packages via pip install in a notebook
- Add dependencies to requirements.txt for production
Data not passing between blocks
- Ensure blocks are properly connected (parent-child relationship)
- Verify the upstream block returns data
- Check that the transformer receives data as its first parameter
Learn More
Blocks Documentation
Learn about all block types and capabilities
Configuration
Configure Mage for production deployments