Data Engineer

Adaptiq

+1 more
23629 - 25447 PLN
B2B
🐍 Python
💼 B2B

Must have

  • Python

  • Kafka

  • Airflow

  • ETL

  • SQL

  • NoSQL

  • English (B2)

Nice to have

  • Spark

  • RabbitMQ

  • Docker

  • ML

  • MLOps

Requirements description

  • 4+ years of experience building and maintaining production-grade data pipelines
  • Hands-on experience with modern data tools such as Kafka, Airflow, Spark, or Flink
  • Deep understanding of SQL and NoSQL ecosystems, such as PostgreSQL, Redis, Elasticsearch, or Delta Lake
  • Solid backend development experience with strong understanding of OOP/OOD principles and design patterns (Python)
  • Demonstrated experience designing and implementing new data architectures, especially in fast-paced or transitioning environments
  • Strong understanding of ETL/ELT processes and data flow logic

Nice to have:

  • Exposure to MLOps and integrating ML models into production
  • Experience in DevOps and asynchronous systems
  • Familiarity with RabbitMQ, Docker, WebSockets, and Linux environments
  • Familiarity with routing or navigation algorithms

Offer description

Who we are:

Adaptiq is a technology hub specializing in building, scaling, and supporting R&D teams for high-end, fast-growing product companies in a wide range of industries.

About the Product:

Bringg is a fast-moving logistics and delivery management platform built to simplify complex last-mile operations. It brings together real-time tracking, route optimization, fleet coordination, and integrations with third-party logistics providers — all under one system. With complete visibility into delivery workflows, businesses using Bringg reduce operational friction, improve speed, and stay in full control of their fulfillment processes.

About the Role:

This is a hands-on role for a Data Engineer who wants to work on meaningful, high-scale systems. Bringg is undergoing a major shift in how data powers our logistics platform, and you’ll be a key part of that transformation. As we completely redesign our data pipeline architecture, you’ll play a central role in shaping and building the next generation of our data platform — one that supports real-time insights, AI/ML-driven features, and key product capabilities. You’ll take part in designing and building streaming and batch data systems that support real-time operations and drive core product functionality — powering logistics for some of the world’s largest enterprises.

Our data team collaborates closely with backend engineers, data scientists, analysts, and product stakeholders to deliver scalable, reliable data solutions that are integrated directly into the product and used daily by global customers. If you’re looking to work on complex data challenges with direct product impact, alongside a collaborative and technically strong team — this role is for you.
Your responsibilities

  1. Drive the design and architecture of scalable, efficient, and resilient batch and streaming data pipelines.
  2. Shape the implementation of modern, distributed systems to support high-throughput data processing and real-time analytics.
  3. Collaborate cross-functionally with data scientists, engineers, and product stakeholders to deliver end-to-end data-driven capabilities.
  4. Optimize legacy systems during the migration phase, ensuring a seamless transition with minimal disruption.
  5. Contribute to DevOps and MLOps processes and enhance the reliability, monitoring, and automation of data infrastructure.
  6. Support the integration and deployment of AI/ML models within the evolving data platform.
Published 24 days ago
Expires in 24 days
Contract type: B2B
