Data Engineer (AWS/GCP/Azure) - Remote

Link Group

Remote
18000 - 23000 PLN
B2B

Must have

  • Data engineering

  • Cloud platform

  • Python

  • SQL

  • Java

  • Scala

  • Big data

  • IaC

  • Data warehouses

  • Snowflake

  • BigQuery

  • Docker

  • Kubernetes

  • English

Nice to have

  • Kafka Streams

  • Kinesis

  • Redshift

  • Machine Learning

  • GDPR

  • CCPA

  • Degree

Requirements description

Must-Have Qualifications

  • 3+ years of experience in data engineering.
  • Strong expertise in one or more cloud platforms: AWS, GCP, or Azure.
  • Proficiency in programming languages like Python, SQL, or Java/Scala.
  • Hands-on experience with big data tools such as Hadoop, Spark, or Kafka.
  • Experience with data warehouses like Snowflake, BigQuery, or Redshift.
  • Familiarity with Infrastructure as Code tools like Terraform or CloudFormation.
  • Knowledge of CI/CD pipelines and containerization tools like Docker and Kubernetes.
  • Strong problem-solving and debugging skills.
  • Good command of English.

Nice to Have

  • Experience with real-time streaming data solutions (e.g., Kafka Streams, Kinesis).
  • Knowledge of machine learning pipelines and data science workflows.
  • Familiarity with data governance and compliance frameworks (e.g., GDPR, CCPA).
  • Certification in AWS, GCP, or Azure.
  • Academic background in Computer Science, Data Engineering, or a related field.

Offer description

At Link Group, we specialize in building tech teams for Fortune 500 companies and some of the world's most exciting startups. Our mission is to connect talented professionals with opportunities that align with their skills, interests, and career aspirations.

We are currently looking for a Data Engineer with expertise in cloud platforms such as AWS, GCP, or Azure to join our team and work on innovative data solutions for global clients.

About the Project

The project involves designing and implementing robust data pipelines and infrastructure on the cloud to enable real-time analytics and business insights for the finance/stock exchange industry. You will play a key role in developing scalable, secure, and high-performance data solutions.

Tech Stack

  • Cloud Platforms: AWS, GCP, Azure
  • Big Data Tools: Hadoop, Spark, Kafka
  • Data Warehouses: Snowflake, BigQuery, Redshift
  • Programming Languages: Python, SQL, Java/Scala
  • Infrastructure as Code: Terraform, CloudFormation
  • CI/CD: Jenkins, Git, Docker, Kubernetes

What We Offer

  • Tailored opportunities to match your professional interests and goals
  • A dynamic and collaborative work environment
  • Access to diverse and innovative projects for global clients
  • Competitive compensation aligned with your expertise
  • Continuous learning and career growth opportunities

If you're passionate about building scalable, cloud-based data solutions, we'd love to hear from you!

Apply today and join us at Link Group!

Your responsibilities

  1. Design, build, and maintain scalable data pipelines and cloud-based infrastructure.
  2. Collaborate with data scientists, analysts, and stakeholders to understand data requirements.
  3. Optimize data workflows for performance, scalability, and cost-efficiency.
  4. Implement ETL/ELT processes to clean, transform, and store data.
  5. Ensure data security, compliance, and best practices across cloud environments.
  6. Monitor and troubleshoot data pipelines and resolve issues efficiently.
Views: 31
Published: 26 days ago
Expires: in 28 days
Contract type: B2B
