
Data Engineer (Hadoop) (Remote)

emagine Polska

Warszawa | Remote | B2B | Full-time

Name: Data Engineer (Hadoop)
Company: emagine Polska
Location: Warszawa +4 locations
Employment type: Full-time
Contract: B2B
Seniority: Senior
Work mode: Remote

Tech stack

Polish: C1
English: C1
Hadoop: master
Hive: master
HDFS: master
Apache Spark: master
Scala: regular
Google Cloud Platform: regular
SQL: regular
CI/CD: regular
BigQuery: nice to have
Cloud Dataflow: nice to have

Job description

Industry: banking

Location: fully remote (candidates must be based in Poland)

Languages: fluent Polish and English

Contract: B2B

The Hadoop Data Engineer plays a critical role in strengthening the organization's data processing capabilities, leveraging cloud technologies for efficient data handling and migration. The primary objective is to build and maintain robust data processing architectures that deliver information and insights at scale.

Main Responsibilities:

  • Develop and maintain data processing systems using Hadoop, Apache Spark, and Scala (an illustrative sketch follows this list).
  • Design and implement data migration processes on Google Cloud Platform.
  • Create solutions for data handling and transformation utilizing SQL and other relevant tools.
  • Collaborate with stakeholders to ensure data architecture aligns with business needs.
  • Engage in automated testing and integration to ensure smooth deployment processes.
  • Debug code issues and communicate findings to the development team.
  • Apply big data modeling techniques for effective data representation.
  • Adapt to dynamic environments and embrace a proactive learning attitude.
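
For illustration only, the sketch below shows the kind of Spark/Scala batch processing the first responsibility describes: a minimal job that reads a Hive-managed table, aggregates it, and writes partitioned output back to HDFS. The Spark version, table name, and output path are assumptions made for this sketch and do not come from the posting.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions._

  object DailyAggregationJob {
    def main(args: Array[String]): Unit = {
      // Hypothetical Spark 3.x batch job; enableHiveSupport() lets Spark read
      // tables registered in the Hive metastore and stored on HDFS.
      val spark = SparkSession.builder()
        .appName("daily-aggregation")
        .enableHiveSupport()
        .getOrCreate()

      // Aggregate a hypothetical "transactions" table per customer and day.
      val daily = spark.table("transactions")
        .groupBy(col("customer_id"), to_date(col("event_ts")).as("event_date"))
        .agg(sum("amount").as("total_amount"), count("*").as("tx_count"))

      // Write the result back to HDFS as Parquet, partitioned by day.
      daily.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("hdfs:///warehouse/daily_customer_totals")

      spark.stop()
    }
  }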

Key Requirements:

  • 5+ years of experience in Hadoop, Hive, HDFS, and Apache Spark.
  • Proficiency in Scala programming.
  • Hands-on experience with Google Cloud Platform, especially BigQuery and Cloud Dataflow.
  • Strong understanding of SQL and relational database technologies.
  • Experience with version control tools (Git, GitHub) and CI/CD processes.
  • Ability to design large-scale distributed data processing systems.
  • Strong interpersonal skills and teamwork abilities.
  • Experience in Enterprise Data Warehouse technologies.
  • Exposure to Agile project methodologies (Scrum, Kanban).
  • Google Cloud Certification - nice to have.
  • Experience with customer-facing roles in enterprise settings.
  • Exposure to Cloud design patterns.

Published: 15.08.2025
