Devapo
We are looking for an experienced Data Engineer responsible for planning, developing, and maintaining cloud environments for our clients.

About Devapo
At Devapo, we focus on continuous self-development and acquiring new knowledge. If you are a fast learner, want to participate in international projects, are a team player, and can work independently — join us! We provide our clients with more than just code — we want to equip them with tools that allow their businesses to flourish. Our clients’ success is our success, which is why we ensure that everyone who creates Devapo has a long-term goal in mind.

Key Responsibilities:
- Design, implement, and maintain scalable and efficient data pipelines in one of the cloud environments (Azure, AWS, GCP) using tools such as Databricks, Glue, Dataflow, or Azure Data Factory
- Develop and optimize ETL/ELT processes using cloud-native services (e.g., Azure Data Factory, AWS Glue, GCP Dataflow) and Apache Spark/Databricks
- Build Big Data solutions aligned with business and analytical requirements across cloud platforms
- Collaborate with Data Science, BI, and development teams to deliver high-quality, well-structured, and performant data
- Monitor and improve the performance, reliability, and scalability of data processing systems
- Implement robust data governance, security standards, and best practices across cloud environments
- Research and evaluate new tools and technologies within the cloud and data engineering ecosystem

Requirements:
- Minimum 3 years of experience as a Data Engineer or in a similar role
- Hands-on experience with one or more major cloud platforms (Azure, AWS, GCP) and deep knowledge of cloud data services such as:
  - Azure: Azure Data Factory, Azure Data Lake, Synapse Analytics
  - AWS: AWS Glue, S3, Redshift, Athena
  - GCP: GCP Dataflow, BigQuery, Cloud Storage
- Extensive experience with Databricks and Apache Spark
- Proficiency in SQL and experience with relational and columnar databases
- Strong programming skills in Python and PySpark
- Experience designing and optimizing data pipelines in distributed, cloud-based architectures
- Familiarity with Delta Lake or other modern data lake architectures
- Solid understanding of data modeling and schema design

What We Offer:
- Salary: 17 800 - 21 500 PLN (B2B contract)
- Co-financing for training and certifications, as well as guaranteed time for learning during working hours
- Private medical care and a Multisport card
- Language classes (English)
- Flexible working hours and the possibility of hybrid work (Warsaw)
- Team integration meetings and company events
- Employee referral program with a bonus
- An individually tailored career development path
| Published | 2 days ago |
| Expires | in 28 days |
| Contract type | B2B |
| Work mode | Remote |
| Source |