DBT training: Basic course

Our expert-led DBT training program is designed to equip you with cutting-edge skills in modern data transformation and analytics engineering.

At a glance

2.5 days

Individually schedulable

Completely remote

Theory & Practice

English

Learn to build efficient data pipelines with DBT through a hands-on, theory-meets-practice approach. Flexible scheduling lets you learn at your own pace and apply skills to real-world projects.

Our trusted partners

Agenda (example)

We are happy to create a customized agenda with you so that DBT will be a breeze in the future.

Day 1

Foundations of DBT & Workflow Mastery

  1. What is DBT? Why does it matter?
  2. Understanding DBT’s role in the modern Data Architecture
  3. DBT Architecture & Components: How it fits into the data stack
  4. Installing DBT & configuring your environment
  5. Understanding project structure & workflow best practices
  6. DBT CLI Basics: Commands & Execution
  7. Advanced Selectors & Configuration Options
  8. Introduction to Jinja templating for DBT
  9. Writing reusable, dynamic SQL with Jinja (see the sketch after this list)
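
To give a feel for the Jinja topics on Day 1, here is a minimal sketch of a Jinja-templated DBT model. The file name, the source('raw', 'payments') reference, and the column names are illustrative assumptions, not part of the official course material.

    -- models/staging/stg_payments.sql (illustrative file name)
    -- The Jinja set/for blocks are rendered into plain SQL before DBT runs the query.
    {% set payment_methods = ['bank_transfer', 'credit_card', 'gift_card'] %}

    select
        order_id,
        {% for method in payment_methods %}
        sum(case when payment_method = '{{ method }}' then amount else 0 end) as {{ method }}_amount{% if not loop.last %},{% endif %}
        {% endfor %}
    from {{ source('raw', 'payments') }}
    group by order_id

Adding a new payment method then only means extending the Jinja list instead of copying another case expression by hand.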

Day 2

Advanced DBT Techniques & Performance Tuning

  1. Core modeling concepts & transformation layers
  2. Best practices for naming conventions & maintainability
  3. Incremental Materialization strategies for efficient data updates (see the sketch after this list)
  4. Performance Optimizations to scale transformations
  5. Using DBT Seeds to load static datasets
  6. DBT Snapshots: Handling slowly changing dimensions (SCD)
  7. Managing dependencies with DBT Packages
  8. Data Tests in DBT: Validating transformations
  9. Implementing unit tests to catch errors early
  10. Generating DBT documentation for transparency
  11. Implementing Hooks to customize and automate tasks
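
As a taste of the incremental materialization topic above, here is a minimal sketch of an incremental DBT model. The file name, the ref('stg_events') dependency, and the column names are illustrative assumptions.

    -- models/marts/fct_events.sql (illustrative file name)
    -- Incremental materialization: after the first full build, only rows newer than
    -- the latest timestamp already in the target table are processed.
    {{ config(
        materialized='incremental',
        unique_key='event_id'
    ) }}

    select
        event_id,
        user_id,
        event_type,
        created_at
    from {{ ref('stg_events') }}

    {% if is_incremental() %}
      -- this filter is only applied when the target table already exists
      where created_at > (select max(created_at) from {{ this }})
    {% endif %}

The is_incremental() guard is what keeps subsequent runs cheap: it narrows the scan to new rows instead of rebuilding the whole table.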

Day 3 (Half Day)

Real-World Applications & Open Discussion

  1. Proactive monitoring with Data Observability
  2. Implementing DataOps best practices
  3. Understanding Model Contracts for Governance
  4. Introduction to Apache Airflow
  5. Deploying DBT with Airflow for production-grade orchestration
  6. Addressing real-world challenges
  7. Best practices & next steps in your DBT journey

Learn how to use DBT (Data Build Tool) for data transformations with this DBT training

In this DBT Course you will experience a balanced mix of theory, live demonstrations and practical exercises.

Learn the fundamentals and architecture of DBT, the role of an Analytics Engineer, and workflow management and setup.

Get to know how to integrate DBT with other tools like Airflow and Dagster, as well as refactor and optimize DBT projects.

Learn how to use Jinja templating in DBT, create and manage models, perform tests to ensure data quality, and handle documentation and metadata management.

In this DBT Training, you will learn …

… how to use DBT (Data Build Tool) for data transformations. This includes the fundamentals and architecture of DBT, the role of an Analytics Engineer, workflow management and setup, Jinja templating in DBT, creating and managing models, performing tests to ensure data quality, documentation and metadata management, using hooks and data observability, integrating DBT with other tools like Airflow and Dagster, as well as refactoring and optimizing DBT projects.
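
Since hooks come up repeatedly in the training, here is a minimal sketch of a post-hook attached to a model. The file name, the ref('stg_customers') dependency, and the "reporting" role are illustrative assumptions; the exact grant syntax depends on your data warehouse.

    -- models/marts/dim_customers.sql (illustrative file name)
    -- A post-hook runs additional SQL right after the model has been built,
    -- e.g. granting read access to a reporting role.
    {{ config(
        materialized='table',
        post_hook="grant select on {{ this }} to reporting"
    ) }}

    select
        customer_id,
        first_name,
        last_name
    from {{ ref('stg_customers') }}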

Practical Applications That We Will Cover in the DBT Training:

  1. Setting up and configuring DBT projects.
  2. Using Jinja for dynamic SQL generation.
  3. Implementing and testing data models.
  4. Creating documentation and managing metadata.
  5. Performing data quality tests (see the test sketch after this list).
  6. Using hooks for additional SQL commands.
  7. Integrating DBT with Airflow for workflow management.
  8. Refactoring DBT projects to optimize data pipelines.
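
As an example of the data quality tests mentioned above, here is a minimal sketch of a singular DBT test: a SQL file in the tests/ directory that selects the rows violating an expectation, so any returned row marks the test as failed. The file name, the ref('fct_orders') model, and the amount column are illustrative assumptions.

    -- tests/assert_order_amounts_not_negative.sql (illustrative file name)
    -- DBT runs this query during `dbt test`; any returned rows count as failures.
    select
        order_id,
        amount
    from {{ ref('fct_orders') }}
    where amount < 0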

After the Data Build Tool Training, You Will Be Able To:

  1. Set up and configure DBT projects.
  2. Create and transform data models.
  3. Use SQL and Jinja for advanced data transformations (see the macro sketch after this list).
  4. Ensure data quality through testing and validation.
  5. Create and maintain documentation for DBT projects.
  6. Optimize and automate workflows.
  7. Integrate DBT into existing ELT, MLOps, or analytics pipelines.
  8. Apply practical applications and best practices for DBT.
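
One common way to reuse SQL and Jinja across models is a macro. The sketch below is illustrative only; the macro name and the cents-to-dollars conversion are assumptions, not course material.

    -- macros/cents_to_dollars.sql (illustrative file name)
    -- A reusable macro that wraps a column expression.
    {% macro cents_to_dollars(column_name, precision=2) %}
        round({{ column_name }} / 100.0, {{ precision }})
    {% endmacro %}

    -- Usage inside any model:
    -- select {{ cents_to_dollars('amount_cents') }} as amount_dollars from {{ ref('stg_payments') }}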

This DBT course is perfect for you if…

  • You are a data analyst, data engineer, or BI developer.
  • You have SQL knowledge and want to perform data transformations.
  • You want to optimize and automate existing data pipelines.
  • You are interested in data integration and improving data quality.
  • You want to learn about modern data warehousing and ETL concepts.

Hear from our satisfied training attendees

A1 Telekom Austria AG

“UTA coached my team along the development process of the migration plan of our on-premises data lake to the public cloud.

The outstanding level of expertise, both on a technical and organizational level, ensured a well-structured and realistic migration plan including timeline, milestones, and efforts.

The enablement of my team was at the center of a very smooth collaboration. Through UTA, we achieved our goal faster and reduced risks of the migration project significantly.

I highly recommend UTA’s services!”

Reinhard Burgmann
Head of Data Ecosystem

Vattenfall

“I recently attended Vattenfall IT’s online Kafka training day hosted by Ultra Tendency, and it was an enriching experience.

The trainer, Ahmed, did a fantastic job explaining the theory behind Kafka, and the emphasis on practical application was great. The hands-on programming exercises were particularly helpful, and I’ve never experienced training with so many interactive examples!

Overall, I highly recommend this training to anyone who wants to improve their Kafka knowledge interactively and gain valuable skills.”

Bernard Benning
BA Heat

VP Bank

“The MLOps training exceeded our expectations!

It offered a perfect blend of an overview, hands-on coding examples, and real-world use cases. The trainer answered all questions competently and adapted the content to fit our company’s infrastructure.

This training not only provided us with knowledge but also practical skills that we can apply immediately.”

Eisele Peer
Lead Architect & Head of IT Integration & Development

Get to know your DBT Training professionals

Marvin Taschenberger

Professional Software Architect, Ultra Tendency

Hudhaifa Ahmed

Senior Lead Big Data Developer & Berlin Territory Manager, Ultra Tendency

Matthias Baumann

Chief Technology Officer & Principal Big Data Solutions Architect Lead, Ultra Tendency

Required Hardware & Infrastructure for Your DBT Training

  • You will need a PC or Mac with a web browser and MS Teams.
  • During the training, we will provide you with a virtual machine with the required local dependencies, services and root access.
  • This VM has a running Kubernetes cluster on which you can test and execute the training instructions.
  • You can access the machine via a browser, or via SSH if you prefer and your network restrictions allow it.