Data Engineer Hadoop

Antal Sp. z o.o.

Kraków, Kobierzyńska (+1 more location)
180 - 220 PLN
Hybrid
B2B
Hadoop
Scala
Hive
Apache Spark
HDFS
BigQuery
Dataflow
Dataproc
SQL
Full-time
Jenkins
Kafka
Big Data
Ansible
CI/CD
Linux

Job description

Hadoop Data Engineer (GCP, Spark, Scala) – Kraków / Hybrid

We are looking for an experienced Hadoop Data Engineer to join a global data platform project built in the Google Cloud Platform (GCP) environment. This is a great opportunity to work with distributed systems, cloud-native data solutions, and a modern tech stack. The position is based in Kraków (hybrid model – 2 days per week in the office).

Your responsibilities:

  • Design and build large-scale, distributed data processing pipelines using Hadoop, Spark, and GCP (see the sketch after this list)
  • Develop and maintain ETL/ELT workflows using Apache Hive, Apache Airflow (Cloud Composer), Dataflow, and Dataproc
  • Work with structured and semi-structured data using BigQuery, PostgreSQL, Cloud Storage
  • Manage and optimize HDFS-based environments and integrate with GCP components
  • Participate in cloud data migrations and real-time data processing projects
  • Automate deployment, testing, and monitoring pipelines (CI/CD using Jenkins, GitHub, Ansible)
  • Collaborate with architects, analysts, and product teams in an Agile/Scrum setup
  • Troubleshoot and debug complex data logic at the code and architecture level
  • Contribute to cloud architecture patterns and data modeling decisions
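
To give a flavor of the pipeline work described above, here is a minimal Spark/Scala sketch of a batch job that reads one partition of a Hive table on a Dataproc cluster and writes Parquet output to Cloud Storage. All table, column, and bucket names are hypothetical illustrations, not taken from the actual project.

    import org.apache.spark.sql.{SparkSession, functions => F}

    object DailyEventsJob {
      def main(args: Array[String]): Unit = {
        val runDate = args(0) // e.g. "2024-01-15"

        // Hive-enabled session; on Dataproc the Hive metastore and the
        // Cloud Storage connector are available out of the box.
        val spark = SparkSession.builder()
          .appName("daily-events-job")
          .enableHiveSupport()
          .getOrCreate()

        // Read one date partition of a (hypothetical) raw events table.
        val events = spark.table("raw_db.events")
          .filter(F.col("event_date") === runDate)

        // Simple per-user aggregation; column names are illustrative.
        val dailyCounts = events
          .groupBy("user_id")
          .agg(F.count(F.lit(1)).as("event_count"))

        // Write Parquet to Cloud Storage, ready for a downstream
        // BigQuery load job (bucket name is hypothetical).
        dailyCounts.write
          .mode("overwrite")
          .parquet(s"gs://example-bucket/curated/daily_events/dt=$runDate")

        spark.stop()
      }
    }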

Must-have qualifications:

  • Minimum 5 years of experience as a Data Engineer / Big Data Engineer
  • Hands-on expertise in Hadoop, Hive, HDFS, Apache Spark, Scala, SQL
  • Solid experience with GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Composer (Airflow)
  • Experience with CI/CD processes and DevOps tools: Jenkins, GitHub, Ansible
  • Strong data architecture and data engineering skills in large-scale environments
  • Experience working in enterprise environments and with external stakeholders
  • Familiarity with Agile methodologies such as Scrum or Kanban
  • Ability to debug and analyze application-level logic and performance

Nice to have:

  • Google Cloud certification (e.g., Professional Data Engineer)
  • Experience with Tableau or Cloud Dataprep
  • Knowledge of cloud design patterns and modern data architectures

Work model:

  • Hybrid – 2 days per week from the Kraków office, the rest remote
  • Opportunity to join an international team and contribute to global-scale projects

To learn more about Antal, please visit www.antal.pl

Published 5 days ago
Expires in 25 days
Contract type: B2B
Work mode: Hybrid