A skilled professional with hands-on experience developing and optimizing ETL solutions on Google Cloud Platform (GCP)
Proficient with GCP services such as BigQuery, Cloud Run Functions, Cloud Run, and Dataform
Experienced with SQL and Python for data processing and transformation
Knowledgeable in RDBMS, particularly MS SQL Server
Familiar with BI and reporting systems and data modeling concepts, including tools such as Power BI and Looker
Experienced in working with Data Quality metrics, checks, and reporting to ensure accuracy, reliability, and governance of data solutions
Skilled in migrating legacy SQL stored procedures to modern, cloud-native data processing solutions on Google Cloud Platform
Adaptable and effective in fast-paced, changing environments
A collaborative team member with excellent consulting and interpersonal skills
Detail-oriented with strong analytical skills and sound judgment in technical decision-making
Familiarity with Dataplex, LookML, Looker Studio, and Azure Data Factory is a plus
Offer description
SoftServe is a global digital solutions company headquartered in Austin, Texas, founded in 1993. Our associates work on 2,000+ projects with clients across North America, EMEA, APAC, and LATAM. We are about people who create bold things, make a difference, have fun, and love their work.
Big Data & Analytics is the data consulting and data engineering branch of our Center of Excellence. Hundreds of data engineers and architects build end-to-end data & analytics solutions, from strategy through technical design and proofs of concept to full-scale implementation. We have customers in the healthcare, finance, manufacturing, retail, and energy domains.
We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners, including AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others.
Your responsibilities
Be part of a data-focused engineering team contributing to modern data transformation and analytics initiatives, including migrating large-scale systems from Azure to GCP
Collaborate closely with data engineers, BI developers, and business stakeholders to design and implement robust, high-quality data pipelines and models that drive strategic decision-making
Participate in the entire project lifecycle: from discovery and PoCs to MVPs and full production rollout
Engage with customers ranging from global enterprises to innovative startups
Continuously learn, share knowledge, and explore new cloud services
Contribute to building a data platform that integrates batch, streaming, and real-time components