Pragmile Sp. z o.o.
We’re a software house passionate about building solutions that change how our clients use data and artificial intelligence. For over 9 years, we’ve been developing innovative AI-driven products such as InfraSenses and SolarSpy, not just using ready-made tools but also building machine learning models from scratch.

We are looking for an experienced Data Engineer / Data Architect to lead the design, development, and optimization of enterprise-scale data solutions on Microsoft Azure, with additional expertise in Google Cloud Platform (GCP). This is a hands-on individual contributor role focused on building scalable and efficient data platforms, developing complex ETL/ELT pipelines, and enabling seamless multi-cloud integration. The ideal candidate combines strong technical depth with practical experience in data architecture, governance, and automation (CI/CD) within large-scale enterprise environments.

Key Responsibilities:

Azure-Centric Data Engineering
Design and deploy Azure data platforms:
- Data Lakes: Azure Data Lake Storage (ADLS Gen2) with Delta Lake optimizations.
- ETL/ELT: Azure Data Factory, Synapse Analytics, and Databricks workflows.
- Data Warehousing: Synapse Dedicated Pools, Azure SQL DB.
Optimize Spark workloads (e.g., on Azure Databricks) through performance tuning such as partitioning and caching.
Implement real-time pipelines: Azure Event Hubs, Stream Analytics, and IoT Hub integrations.
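As a rough illustration of the partitioning and caching work mentioned above, here is a minimal PySpark sketch of the kind of tuning this role involves on Azure Databricks; the table name, columns, and storage path are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

# On Azure Databricks a SparkSession is provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Hypothetical Delta table registered in the metastore.
events = spark.read.table("analytics.events")

# Repartition on a frequently joined/filtered key to balance shuffle partitions.
events = events.repartition("customer_id")

# Cache the DataFrame because several downstream aggregations reuse it.
events.cache()

daily_counts = (
    events.groupBy("event_date")
          .agg(F.count("*").alias("event_count"))
)

# Persist results partitioned by date so downstream readers can prune partitions.
# The "delta" format assumes Databricks or the Delta Lake package is available.
(daily_counts.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("abfss://curated@examplestorage.dfs.core.windows.net/daily_counts"))
```

In practice the partitioning key and caching decisions depend on the workload's join and access patterns; the sketch only shows the mechanics.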
Ensure data governance:
- Enforce data quality with Azure Purview (lineage tracking, sensitivity labeling).
- Implement RBAC and Azure Active Directory (AAD) integration for secure access.
Build real-time analytics pipelines: Process streaming data via Azure Event Hubs/Stream Analytics with windowing and watermarking strategies.
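For the windowing and watermarking point, a minimal sketch in PySpark Structured Streaming (the same pattern can be expressed in Stream Analytics SQL); it uses Spark's built-in rate source as a stand-in for an Event Hubs stream, and the window and watermark durations are arbitrary.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Stand-in streaming source; a real job would read from Event Hubs instead,
# e.g. through its Kafka-compatible endpoint with readStream.format("kafka").
stream = (spark.readStream
          .format("rate")                 # emits (timestamp, value) rows
          .option("rowsPerSecond", 10)
          .load())

# 5-minute tumbling windows with a 10-minute watermark: events arriving more
# than 10 minutes late are dropped, which bounds the state kept per window.
windowed = (stream
            .withWatermark("timestamp", "10 minutes")
            .groupBy(F.window("timestamp", "5 minutes"))
            .agg(F.count("*").alias("events_per_window")))

# Console sink for demonstration; a production pipeline would write to Delta,
# a serving store, or another event sink instead.
query = (windowed.writeStream
         .outputMode("update")
         .format("console")
         .start())

query.awaitTermination()
```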
Operational Excellence
Automate CI/CD pipelines.
Lead disaster recovery (DR).
Manage data lifecycle.
Technical Requirements:

Core Azure Expertise
Must-Have:
- Advanced SQL, PySpark, and Python.
- Infrastructure-as-Code (Terraform, ARM/Bicep).
- Performance tuning (partitioning, indexing, query optimization).
Azure Services:
- Data Factory (ADF), Synapse, Databricks, Cosmos DB.
- DevOps (Azure Pipelines, Repos), Monitor, and Security Center.
GCP Proficiency (nice to have):
- BigQuery (partitioned tables, materialized views).
- Cloud Storage, Dataflow, Pub/Sub.
- IAM and VPC networking.
- Looker, Dataproc, Composer (Airflow).

Preferred Qualifications:

Certifications:
- Azure: DP-203 (Data Engineer), AZ-400 (DevOps).
- GCP: Professional Data Engineer (preferred).

Education: Bachelor's/Master's in CS/IT or equivalent.

What We Offer:
- Opportunity to lead the end-to-end design of advanced, enterprise-scale data platforms in Azure and multi-cloud environments.
- Work with modern technologies (Azure, Databricks, Terraform, GCP) in a highly technical, hands-on role.
- A supportive, growth-oriented environment with opportunities for certification and continuous learning.
- A training budget to support your technical and professional development.
- Private medical care and co-financing of a sports card.
- Flexible working hours and the option to work fully remotely.
| Published | a day ago |
| Expires | in 29 days |
| Contract type | B2B |
| Work mode | Remote |
| Source |