Data Engineer

ITDS

Kraków, Kapelanka 42A +1 more
22,000 - 26,000 PLN
hybrid
b2b
SQL
Google Cloud Platform
ETL
BigQuery
dbt
Apache Airflow
Cloud Composer
Kafka
Looker Studio
CI/CD
Full-time
Python
Elasticsearch
FastAPI
GenAI
RAG
Ansible
Jenkins
GitLab
Apache Spark
Java
Groovy
GCP

Job description

Join us, and create cutting-edge pipelines for seamless data transformation!

Kraków-based opportunity with a hybrid work model (2 days/week in the office).

As a Data Engineer, you will be working for our client, a global financial institution that is driving DevOps transformation through data analytics and engineering. You will be part of a team that provides key metrics and analytical products to enhance software engineering practices across the organization. Your role will focus on developing data transformation pipelines, ensuring data quality, and supporting a cloud data platform to improve the overall DevOps experience. You will collaborate with diverse global teams to deliver enriched datasets, dashboards, and insights that enable strategic decision-making.

Your main responsibilities:

  • Designing, developing, testing, and deploying data ingest, quality, refinement, and presentation pipelines
  • Operating and iterating on a cloud data platform to support internal goals
  • Building and maintaining ETL processes and data transformation pipelines
  • Ensuring data quality and implementing automated data validation solutions
  • Developing data marts and optimizing schema designs for performance and usability
  • Collaborating with business stakeholders to understand data needs and deliver actionable insights
  • Working with cloud-based big data technologies, particularly Google Cloud Platform (GCP) and BigQuery
  • Utilizing orchestration and scheduling tools such as Airflow and Cloud Composer
  • Supporting continuous integration and continuous delivery (CI/CD) processes
  • Following Agile methodologies and working within a product-oriented culture
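
The ingest, quality, and refinement work in the bullets above can be sketched in plain Python. This is only an illustrative sketch, not the client's actual stack: in practice these steps would run as Airflow/Cloud Composer tasks against BigQuery, and every function and field name below is hypothetical.

```python
# Minimal sketch of an ingest -> quality-check -> refine pipeline.
# All record and field names are hypothetical illustrations.

def extract(rows):
    """Ingest: pull raw records (an in-memory stand-in for a real source)."""
    return list(rows)

def validate(rows):
    """Quality gate: split records into valid and rejected sets."""
    valid, rejected = [], []
    for row in rows:
        if row.get("id") is not None and row.get("amount", 0) >= 0:
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

def refine(rows):
    """Refinement: derive a presentation-ready field for downstream marts."""
    return [{**r, "amount_pln": round(r["amount"], 2)} for r in rows]

if __name__ == "__main__":
    raw = [
        {"id": 1, "amount": 22000.0},
        {"id": None, "amount": 100.0},   # fails the quality gate
        {"id": 2, "amount": 26000.50},
    ]
    valid, rejected = validate(extract(raw))
    marts = refine(valid)
    print(len(marts), len(rejected))  # 2 valid rows, 1 rejected
```

In an orchestrated setup, each function would map to one task in a DAG, with the rejected set routed to a quarantine table for the automated data-validation reporting mentioned above.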

You're ideal for this role if you have:

  • At least 7 years of professional experience in SQL development
  • Strong experience in data engineering and ETL processes
  • Expertise in GCP, BigQuery, and dbt (data build tool)
  • Hands-on experience with Apache Airflow and Cloud Composer
  • Proficiency in data modeling and designing optimized data schemas
  • Experience with data streaming technologies such as Kafka
  • Familiarity with BI tools, especially Looker Studio
  • Understanding of DevOps principles and working in a DevOps environment
  • Experience with Continuous Integration and Continuous Delivery (CI/CD) practices
  • Strong communication skills and ability to work with global teams

It is a strong plus if you have:

  • Experience in building and operating a cloud data platform
  • Knowledge of data architecture and data marts
  • Proficiency in Git, Shell scripting, and Python
  • Ability to quickly learn and adapt to new technologies
  • Experience collaborating with technical staff and project managers for efficient delivery
  • Proactive approach to identifying improvement opportunities and solving issues
  • Comfort in working in fast-paced, changing, and ambiguous environments
Published 13 days ago
Expires in 17 days
Type of contract: B2B
Work mode: hybrid