Data Engineer AWS

apreel Sp. z o.o.

25,200–28,560 PLN
B2B

Must have

  • AWS

  • Data pipelines

  • Databricks

  • Python

  • dbt

  • ETL

  • Data models

  • Data warehouses

  • Continuous integration

  • Security

  • AWS S3

  • Glue

  • Redshift

  • AWS Lambda

  • Data engineering

  • Spark

  • Git

  • GitHub

  • Polish (Fluent)

  • English (B2)

Nice to have

  • Jenkins

  • Airflow

  • Infrastructure as Code

  • Terraform

  • CloudFormation

Requirements description

Required Skills & Experience:

  • Solid experience as a Data Engineer (2+ years for mid, 4+ for senior level)
  • Strong hands-on experience with AWS data services (e.g., S3, Glue, Redshift, Lambda)
  • Proficiency in Python for data engineering tasks
  • Practical knowledge of Databricks and the Spark ecosystem
  • Experience with dbt (Data Build Tool) for data transformation and modeling
  • Familiarity with modern version control and CI/CD workflows (e.g., Git, GitHub Actions, Jenkins)

Nice to Have:

  • Knowledge of data orchestration tools (e.g., Airflow, Prefect)
  • Experience with data quality tools and observability frameworks
  • Exposure to infrastructure as code (e.g., Terraform, CloudFormation)
  • Experience working in Agile/Scrum environments

Offer description

We are seeking a skilled and proactive Data Engineer (Mid or Senior) to join our growing data team. In this role, you will design, build, and maintain modern data pipelines and architecture in a cloud-native environment using AWS, Databricks, Python, and dbt. You’ll work closely with analysts, data scientists, and business stakeholders to ensure clean, scalable, and reliable data delivery across the organization.

Offer:

  • Location: Poland / REMOTE
  • Employment: B2B contract with apreel
  • Rate: up to 170 zł/h

Your responsibilities

  1. Design, develop, and optimize ETL/ELT pipelines using Databricks and dbt
  2. Build and manage data models, data lakes, and data warehouses on AWS
  3. Write efficient and scalable code in Python to process large datasets
  4. Collaborate with cross-functional teams to understand data needs and deliver solutions
  5. Ensure data quality, observability, and performance across the entire pipeline
  6. Support continuous integration and deployment of data workflows
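The responsibilities above center on ETL/ELT work in Python. As a rough illustration only (plain Python over hypothetical in-memory data; in this role the equivalent logic would typically run on Databricks/Spark against S3 and be modeled in dbt), a minimal extract–transform–load step might look like:

```python
# Minimal ETL sketch with hypothetical order data. Illustrative only:
# production pipelines in this stack would use Spark DataFrames and dbt
# models rather than Python lists.
from datetime import date

def extract(raw_rows):
    """Extract: parse raw CSV-like records into dicts."""
    return [dict(zip(("order_id", "amount", "day"), r.split(","))) for r in raw_rows]

def transform(rows):
    """Transform: cast types and drop malformed rows (basic data quality)."""
    clean = []
    for row in rows:
        try:
            clean.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
                "day": date.fromisoformat(row["day"]),
            })
        except (ValueError, KeyError):
            continue  # skip records that fail validation
    return clean

def load(rows, warehouse):
    """Load: append validated rows to an in-memory 'warehouse' table."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
raw = ["1,19.99,2024-05-01", "2,not-a-number,2024-05-02", "3,5.50,2024-05-03"]
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 — the malformed middle record was dropped in transform()
```

The same extract/validate/load separation maps onto the pipeline stages named in the responsibilities, with data-quality checks living in the transform step.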

Views: 1
Published: about 2 months ago
Expires: in 24 days
Contract type: B2B
