Big Data Engineer

Big Data Engineer (Remote work)

Allegro

Warsaw
PLN 14 200 - 20 200 / month
PERMANENT, B2B

Summary

Position: Big Data Engineer. Responsibilities: data processing and managing the platform in GCP. Requirements: knowledge of programming languages and distributed systems. Benefits: flexible hours, an annual bonus, and training.

Keywords

Big Data, Scala, Java, Python, GCP, Spark, Kafka, Apache Beam, Unix/Linux, CI/CD

Benefits

  • Flexible working hours
  • Annual bonus
  • Cafeteria plan with a variety of benefits
  • Internal training
  • Remote work option

Job description

Important things for you

Flexible working hours in an office-first model (4/1) that depend on you and your team. Starting later or finishing earlier? No problem! Work hours keep pace with our lifestyles and can start between 7 a.m. and 10 a.m.

The salary range for this position, depending on the skill set, is as follows (contract of employment, tax-deductible costs):

  • Data Engineer: PLN 14 200 - 20 200
  • Senior Data Engineer: PLN 18 400 - 25 450

Annual bonus (depending on your annual assessment and the company's results). Our team is based in Warsaw.

About the team

As part of the Data & AI area, we implement projects based on practical data science and artificial intelligence applications on an unprecedented scale in Poland. Data & AI is a group of over 150 experienced engineers organized into over a dozen teams with various specializations. Some of them build dedicated tools for creating and launching Big Data processes or implementing ML models for the entire organization. Others work closer to the customer and are responsible for the implementation of the search engine, creating recommendations, building a buyer profile, or developing an experimentation platform. There are also research teams in the area whose aim is to find solutions to non-trivial problems requiring the use of machine learning.

We are looking for Big Data engineers who want to build highly scalable and fault-tolerant data ingestion for millions of Allegro customers. The platform collects 5 billion clickstream events every day (up to 150k/sec) from all Allegro sites and Allegro mobile applications. It is a hybrid solution using a mix of on-premise and Google Cloud Platform (GCP) services such as Spark, Kafka, Beam, BigQuery, Pub/Sub, and Dataflow (a minimal illustrative pipeline sketch appears at the end of this description).

We are looking for people who

  • Program in languages such as Scala, Java, or Python
  • Have a strong understanding of distributed systems, data storage, and processing frameworks such as dbt, Spark, or Apache Beam
  • Have knowledge of GCP (especially Dataflow and Composer) or other public cloud environments such as Azure or AWS
  • Use good practices (clean code, code review, TDD, CI/CD)
  • Navigate efficiently within Unix/Linux systems
  • Have a positive attitude and team-working skills
  • Are eager for personal development and keep their knowledge up to date
  • Know English at B2 level

What we offer

  • The possibility to learn and work with backend (Spring, Kotlin) and AI technologies within the team
  • Well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
  • A wide selection of varied benefits in a cafeteria plan: you choose what you like (e.g. medical, sports, or lunch packages, insurance, purchase vouchers)
  • English classes that we pay for, related to the specific nature of your job
  • A MacBook Pro / Air (depending on the role) or a Dell with Windows (if you don't like Macs) and other gadgets that you may need
  • Working in a team you can always count on; we have top-class specialists and experts in their fields on board
  • A high degree of autonomy in organizing your team's work; we encourage you to develop continuously and try out new things
  • Hackathons, team tourism, a training budget, and an internal educational platform (including training courses on work organization, means of communication, motivation to work, and various technologies and subject-matter issues)

If you want to learn more, check it out.

Why is it worth working with us

  • At Allegro, you will be responsible for processing petabytes of data and billions of events daily
  • You will become a participant in one of the largest projects of building a data platform in GCP
  • Your development will align with the latest technological trends based on open-source principles (data mesh, data streaming)
  • You will have a real impact on the direction of product development and technology choices; we use the latest and best available technologies, selected according to our own needs
  • You will have the opportunity to work within a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech
  • Once a year, or more often if there is an internal business need, you can take advantage of the opportunity to work in a different team (known as team tourism)

Send in your CV and see why it is #dobrzetubyć (#goodtobehere).
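
As referenced above, here is a minimal, illustrative sketch of what a streaming clickstream ingestion pipeline of this kind could look like with the Apache Beam Python SDK: it reads JSON events from Pub/Sub and appends them to a BigQuery table. This is not Allegro's actual code; the project, topic, table, and schema names below are hypothetical.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical resource names, for illustration only.
    TOPIC = "projects/example-project/topics/clickstream-events"
    TABLE = "example-project:analytics.click_events"
    # Assumed event schema; a real pipeline would match the production model.
    SCHEMA = "user_id:STRING,event_type:STRING,url:STRING,ts:TIMESTAMP"


    def parse_event(message: bytes) -> dict:
        # Each Pub/Sub message is assumed to carry one JSON-encoded click event.
        return json.loads(message.decode("utf-8"))


    def run() -> None:
        # streaming=True keeps the pipeline running on the unbounded Pub/Sub
        # source; on GCP this would typically execute on the DataflowRunner.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
                | "ParseJson" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    TABLE,
                    schema=SCHEMA,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )


    if __name__ == "__main__":
        run()

Submitted to Dataflow, a pipeline of this shape would rely on the runner for autoscaling and fault tolerance, which is how ingestion at rates like the 150k events/sec mentioned above is typically sustained.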

Published: 2 days ago
Expires: in about 2 months
Contract type: PERMANENT, B2B