Discover Data Lakehouses, the next generation of efficient, scalable, and flexible data management architecture.
Our expert-led training program is designed to equip you with cutting-edge skills in modern data storage and management.
Benefit from the wealth of experience we have gained in countless customer projects:
a balanced mix of theory, live demonstrations, and practical exercises.
Learn to distinguish between Data Lakes, Data Warehouses, and Data Lakehouses to leverage the benefits of each.
Gain hands-on skills in Delta Lake and Apache Iceberg to implement key features like ACID transactions and time travel.
Master strategies for optimizing query performance and data storage management in Data Lakehouse environments.
This training is designed for data engineers, data architects, and data scientists who want to elevate their data management skills with modern, scalable architectures. Ideal for professionals handling large-scale data storage and processing, this course covers practical approaches to creating flexible, efficient storage solutions that bridge traditional and modern data architectures. If you are experienced with data lakes or warehouses and ready to advance to data lakehouses, this course will guide you every step of the way.
Practical Applications That We Will Cover in the Training:
1. Setting up Delta Lake and Apache Iceberg environments for data management.
2. Implementing and testing ACID transactions in Delta Lake (a short Delta Lake sketch follows this list).
3. Using time travel features to query historical data.
4. Managing schema evolution and hidden partitioning in Apache Iceberg.
5. Optimizing query performance and managing data storage.
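To give you a feel for the hands-on part, here is a minimal sketch of the kind of exercise covered: writing a Delta table in two transactions and querying an earlier version with time travel. It is an illustration under assumptions, not course material; it presumes a local PySpark setup with the delta-spark package installed, and the table path /tmp/demo_delta_table is a placeholder.

```python
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip

# Assumption: pyspark and delta-spark are installed locally (pip install pyspark delta-spark).
builder = (
    SparkSession.builder.appName("lakehouse-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# First commit: create the table with rows 0-4 (an atomic, ACID-compliant write).
spark.range(0, 5).write.format("delta").mode("overwrite").save("/tmp/demo_delta_table")

# Second commit: append rows 5-9 in a separate transaction.
spark.range(5, 10).write.format("delta").mode("append").save("/tmp/demo_delta_table")

# Time travel: read the table as it looked at version 0, before the append.
v0 = (
    spark.read.format("delta")
    .option("versionAsOf", 0)
    .load("/tmp/demo_delta_table")
)
v0.show()  # shows only the first five rows
```

Every write lands as an atomic commit in the Delta transaction log, which is exactly what makes the version-based time travel query at the end possible.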
After The Course, You Will Be Able To:
1. Define and differentiate between Data Lakes, Data Warehouses, and Data Lakehouses.
2. Explain why Data Lakehouses are useful and how they address the limitations of traditional data storage architectures.
3. Set up and work with Delta Lake and Apache Iceberg environments.
4. Implement key features like ACID transactions, time travel, and schema evolution in Delta Lake and Apache Iceberg (see the Iceberg sketch after this list).
5. Optimize performance and manage large datasets efficiently in a Data Lakehouse setup.
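As a taste of the Apache Iceberg side, the sketch below shows hidden partitioning and schema evolution. Again, this is an illustrative sketch under assumptions rather than course material: the catalog name local, the warehouse path, and the iceberg-spark-runtime package version are placeholders to adapt to your own Spark environment.

```python
from pyspark.sql import SparkSession

# Assumption: the Iceberg Spark runtime artifact below matches your Spark/Scala version.
spark = (
    SparkSession.builder.appName("iceberg-demo")
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.0")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.local.type", "hadoop")
    .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg_warehouse")
    .getOrCreate()
)

# Hidden partitioning: partition by day(ts) without exposing an extra partition column.
spark.sql("""
    CREATE TABLE IF NOT EXISTS local.db.events (
        id BIGINT,
        ts TIMESTAMP,
        payload STRING
    )
    USING iceberg
    PARTITIONED BY (days(ts))
""")

# Schema evolution: add a column as a metadata-only change; no data files are rewritten.
spark.sql("ALTER TABLE local.db.events ADD COLUMNS (category STRING)")

spark.sql("DESCRIBE TABLE local.db.events").show(truncate=False)
```

Because Iceberg keeps partition transforms and schema history in table metadata, queries never need to know about the days(ts) partitioning, and the new column appears without touching existing data files.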
The Data Lakehouse training is not suitable for you if…
Hear from our satisfied training attendees
A1 Telekom Austria AG
Reinhard Burgmann
Head of Data Ecosystem
Vattenfall
Bernard Benning
BA Heat
“I recently attended Vattenfall IT’s online Kafka training day hosted by Ultra Tendency, and it was an enriching experience.
The trainer, Ahmed, did a fantastic job explaining the theory behind Kafka, and the emphasis on practical application was great. The hands-on programming exercises were particularly helpful, and I’ve never experienced training with so many interactive examples!
Overall, I highly recommend this training to anyone who wants to improve their Kafka knowledge interactively and gain valuable skills.”
VP Bank
Eisele Peer
Lead Architect & Head of IT Integration & Development
“The MLOps training exceeded our expectations!
It offered a perfect blend of an overview, hands-on coding examples, and real-world use cases. The trainer answered all questions competently and adapted the content to fit our company’s infrastructure.
This training not only provided us with knowledge but also practical skills that we can apply immediately.”
Your investment
- Begin with an overview of Data Lakehouses, exploring the evolution from traditional data storage to flexible, high-performance architectures.
- Dive into Delta Lake and Apache Iceberg, mastering their architectures and unique features for effective data storage and retrieval.
- Implement ACID transactions, time travel, and schema evolution to manage and query historical data accurately.
- Apply best practices for optimizing query performance and managing large-scale data storage efficiently, tailored to the demands of modern data processing (see the maintenance sketch after this list).
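For the optimization topics, routine maintenance on a Delta table looks roughly like the following sketch (again an assumption-based illustration, reusing the hypothetical /tmp/demo_delta_table path from the earlier example): compacting small files, clustering by a frequently filtered column, and vacuuming files the table no longer references.

```python
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip
from delta.tables import DeltaTable

# Same assumed local setup as the earlier Delta sketch.
builder = (
    SparkSession.builder.appName("lakehouse-maintenance")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

table = DeltaTable.forPath(spark, "/tmp/demo_delta_table")  # hypothetical path

# Compact many small files into fewer, larger ones (bin-packing).
table.optimize().executeCompaction()

# Cluster data files by a column that queries frequently filter on.
table.optimize().executeZOrderBy("id")

# Remove data files no longer referenced by the table, keeping 168 hours (7 days) of history.
table.vacuum(168)
```

Compaction and Z-ordering cut down the number of files and the amount of data a query has to scan, while vacuuming keeps storage in check without breaking time travel inside the retention window.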
Get to know your trainers
Marvin Taschenberger
Hudhaifa Ahmed
Senior Lead Big Data Developer Berlin Territory Manager, Ultra Tendency
Matthias Baumann
Required hardware & infrastructure for your Data Lakehouse Training
- You will need a PC or Mac with a web browser and MS Teams.
- During the training, we will provide you with a virtual machine with the required local dependencies, services and root access.
- This VM has a running Kubernetes cluster on which you can test and execute the training instructions.
- You can access the machine via a browser, or via SSH if you prefer and your network restrictions allow it.