Data Engineer

Cargoo

Tallinn

Summary

Data Engineer – shaping the data engineering strategy, designing and implementing scalable solutions, building and maintaining the data platform, creating ETL/ELT pipelines, and collaborating with AI, BI, and business stakeholders. Required: Python, SQL, Azure, dbt, Airflow/Dagster, Kafka, Docker, Kubernetes, DataOps. We offer: a modern data stack, health insurance, snacks, and free parking.

Keywords

Data Engineer, Python, SQL, Azure, dbt, Airflow, Kafka, Docker, Kubernetes, DataOps

Benefits

  • Stebby perks or health insurance
  • snacks
  • free parking

Job description

What you will do

Role Overview:

As a Data Engineer, you will help shape Cargoo’s data engineering strategy and guide the design, architecture, and implementation of scalable data solutions. You will support a team of data engineers, collaborate closely with AI engineers, BI analysts, and business stakeholders, and ensure our data infrastructure supports advanced analytics, machine learning, and real-time decision-making.

You will work with teams across the company to understand their needs, identify data opportunities that drive impact, and help them get the most out of our Data Platform.

Responsibilities:

  • Shape and contribute to Cargoo’s data engineering strategy, including the design, architecture, and implementation of scalable data solutions.
  • Build, maintain, and evolve the Data Platform to support advanced analytics, machine learning, and real-time decision-making.
  • Collaborate closely with AI engineers, BI analysts, and business stakeholders to translate data needs into robust technical solutions.
  • Identify data opportunities across the organization and enable teams to extract maximum value from data assets.
  • Design and implement ETL/ELT pipelines, including batch and streaming data workflows (a minimal orchestration sketch follows this list).
  • Develop and maintain data models using dimensional and advanced modeling techniques.
  • Support deployment, monitoring, and lifecycle management of data applications.
  • Apply DataOps best practices to ensure reliability, scalability, and quality of data pipelines.
  • Participate in code reviews and promote high engineering standards.
  • Take ownership of projects end-to-end, ensuring timely delivery and measurable impact.
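
For illustration only, here is a minimal sketch of the kind of batch ELT orchestration this role involves, using Airflow's TaskFlow API. The DAG name, table names, and task logic are assumptions made for this example, not details from the posting.

  # A minimal sketch of a daily batch ELT pipeline using Airflow's TaskFlow API.
  # The DAG name, table names, and task logic are illustrative assumptions.
  from datetime import datetime

  from airflow.decorators import dag, task


  @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
  def shipments_elt():
      @task
      def extract() -> list[dict]:
          # Pull raw shipment events from a hypothetical upstream source.
          return [{"shipment_id": 1, "status": "loaded"}]

      @task
      def load(rows: list[dict]) -> None:
          # Land raw rows in a staging table (e.g. Azure SQL) so downstream
          # dbt models can transform them.
          print(f"loading {len(rows)} rows into staging.shipment_events")

      load(extract())


  shipments_elt()

In practice the load step would write to the warehouse and dbt models would handle the transform layer; the sketch only shows the orchestration shape.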

What we offer

Why you’ll love it here:

  • Work with a cutting-edge data stack to power real-time, reliable, and beautifully orchestrated data workflows.
  • Turn insights into impact - help shape smarter global logistics solutions.
  • Collaborate with curious, data-driven people who value ideas.
  • Learn, experiment and grow in a culture that supports your ambition.
  • Enjoy Stebby perks or health insurance, snacks and free parking.
  • Be part of a team that celebrates wins together.
  • Grow, build and belong with us.

Requirements

  • Solid proficiency in Python (runtime environment, package management) and SQL (DML, DDL).
  • Hands-on experience with SQL Server / Azure SQL Server.
  • Experience working with cloud platforms, preferably Microsoft Azure.
  • Familiarity with the modern data stack, including tools such as dbt and orchestration frameworks (Airflow, Dagster, or similar).
  • Strong understanding of ETL/ELT concepts, data architecture, and data modeling.
  • Experience with streaming technologies (Kafka or equivalent); a minimal consumer sketch follows this list.
  • Experience with Docker and container orchestration technologies.
  • Experience deploying and monitoring applications on Kubernetes (K8s).
  • Knowledge of application lifecycle management.
  • Understanding and application of DataOps practices.
  • Strong project management, execution skills, and a clear sense of ownership and accountability.
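
As an illustration of the streaming experience mentioned above, a minimal Kafka consumer sketch using the kafka-python client. The topic name, broker address, and event handling are assumptions made for this example.

  # A minimal sketch of consuming a Kafka topic with the kafka-python client.
  # Topic name, broker address, and handling logic are illustrative assumptions.
  import json

  from kafka import KafkaConsumer

  consumer = KafkaConsumer(
      "shipment-events",                    # hypothetical topic
      bootstrap_servers="localhost:9092",   # hypothetical broker
      group_id="data-platform",
      auto_offset_reset="earliest",
      value_deserializer=lambda v: json.loads(v.decode("utf-8")),
  )

  for message in consumer:
      event = message.value
      # In a real pipeline these events would be landed in a staging layer
      # for downstream transformation rather than printed.
      print(event.get("shipment_id"), event.get("status"))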
