Databricks Fundamentals

Training code: DBX-FE / ENG DL 1d

The Databricks Fundamentals training is the first step in the structured training path Fundamental → Explorer → Lakehouse → Transformation. Participants will learn about the Databricks environment, how to navigate the platform, use basic tools, and perform simple data operations. The training also covers basic cost management concepts, helping participants understand how to work efficiently in the cloud from the start.

For more information, please contact the sales department.
2 500,00 PLN net / 3 075,00 PLN gross

The training is intended for individuals starting their journey with Databricks: data analysts, data engineers, BI specialists, and technical staff who want to learn the core functions of the environment.

After completing the training, participants:

– understand the basic components of the Databricks environment

– gain knowledge of Unity Catalog and Delta Lake as the foundations of Databricks

– can load data and perform simple transformations and exploration

– can use Workflows for basic automation

– gain awareness of fundamental cost management practices in the cloud

– are prepared for the next stage of the training path – Databricks Explorer

1. Getting to know the Databricks platform

  • Workspace, notebooks, clusters, DBFS

  • Basic concepts and platform architecture

  • Unity Catalog – catalogs, schemas, basic permissions

  • Delta Lake as the default format

2. Cost management

  • Cluster types: Jobs vs All-purpose

  • Autoscaling and its impact on costs

  • Basic principles of efficient resource usage
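The cost difference between cluster types above can be illustrated with simple arithmetic: Databricks bills compute in DBUs, and the per-DBU rate for a Jobs cluster is lower than for an All-purpose cluster. A minimal sketch, using hypothetical rates (actual rates vary by cloud provider, region, and pricing tier):

```python
# Hypothetical per-DBU rates -- real rates depend on cloud, region, and tier.
JOBS_RATE = 0.15         # USD per DBU on a Jobs cluster (assumed)
ALL_PURPOSE_RATE = 0.40  # USD per DBU on an All-purpose cluster (assumed)

def run_cost(dbu_per_hour: float, hours: float, rate: float) -> float:
    """Cost of one run: DBU consumption multiplied by the per-DBU rate."""
    return dbu_per_hour * hours * rate

# The same 2-hour workload on a cluster consuming 4 DBU/hour:
jobs_cost = run_cost(4, 2, JOBS_RATE)                # 4 * 2 * 0.15 = 1.20
all_purpose_cost = run_cost(4, 2, ALL_PURPOSE_RATE)  # 4 * 2 * 0.40 = 3.20
print(f"Jobs: ${jobs_cost:.2f}, All-purpose: ${all_purpose_cost:.2f}")
```

The same reasoning explains why autoscaling matters: idle or oversized clusters keep consuming DBUs, so scheduled Jobs clusters that terminate after the run are usually the cheaper choice for automation.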

3. GUI and tools

  • Databricks interface – working with notebooks, markdown, mixing SQL and Python

  • Creating simple visualizations in the GUI

  • AI Assistant in notebooks – how it supports the user

4. Basic data operations

  • Loading data from CSV and JSON files

  • Data exploration: display(), show(), describe(), summary()

  • Basic transformations in PySpark (withColumn, when, cast)

5. Workflows and automation

  • Creating simple tasks in the Workflows GUI

  • Parameterization, scheduling, recurring jobs
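Behind the Workflows GUI, a parameterized, scheduled job corresponds to a JSON definition. A sketch in the style of the Databricks Jobs API (the job name, notebook path, and parameter values below are placeholders, not part of the course material):

```json
{
  "name": "daily-ingest",
  "tasks": [
    {
      "task_key": "load_and_transform",
      "notebook_task": {
        "notebook_path": "/Workspace/Users/you@example.com/ingest",
        "base_parameters": {"input_path": "/Volumes/main/raw/events"}
      }
    }
  ],
  "schedule": {
    "quartz_cron_expression": "0 0 6 * * ?",
    "timezone_id": "Europe/Warsaw"
  }
}
```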

6. Final project

  • Loading data into Delta Lake, performing basic transformations, and running them in a scheduled workflow

Prerequisites:

– Basic knowledge of SQL

– General understanding of databases and data analysis

– Basic knowledge of cloud concepts (optional)

  • Access to the Altkom Akademia student portal

Training method:

The training is conducted in the Databricks cloud environment. Each participant receives their own workspace with access to Unity Catalog, SQL Editor, Notebooks, and a catalog with test data.

  • Training language: English

  • Materials: English