Senior Data Engineer

Posted 98 days ago

Job Description

We are seeking a Senior Data Engineer to design GCP-native data solutions for machine learning and analytics initiatives. You will build reusable data products and collaborate with cross-functional teams in a remote setup.

Responsibilities:

  • Design and implement a scalable, GCP-native data strategy aligned with machine learning and analytics initiatives.
  • Build, operate, and evolve reusable data products that deliver compounding business value.
  • Architect and govern squad-owned data storage strategies using BigQuery, AlloyDB, ODS, and transactional systems.
  • Develop high-performance data transformations and analytical workflows using Python and SQL.
  • Lead ingestion and streaming strategies using Pub/Sub, Datastream (CDC), and Cloud Dataflow (Apache Beam).
  • Orchestrate data workflows using Cloud Composer (Airflow) and manage transformations with Dataform.
  • Modernize legacy data assets and decouple procedural logic from operational databases into analytical platforms.
  • Apply Dataplex capabilities to enforce data governance, quality, lineage, and discoverability.
  • Collaborate closely with engineering, product, and data science teams in an iterative, squad-based environment.
  • Drive technical decision-making, resolve ambiguity, and influence data architecture direction.
  • Ensure data solutions are secure, scalable, observable, and aligned with best practices.
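To give a flavor of the ingestion work above: when landing Datastream (CDC) change events before merging them into BigQuery, a common pattern is latest-record-wins deduplication per primary key. The sketch below is illustrative only; the field names (`id`, `updated_at`, `op`) are hypothetical and real CDC payloads differ.

```python
from datetime import datetime

def dedupe_cdc_events(events):
    """Keep only the latest change event per primary key.

    `events` is a list of dicts with hypothetical fields:
      - "id":         primary key of the source row
      - "updated_at": ISO-8601 change timestamp emitted by CDC
      - "op":         "UPSERT" or "DELETE"
    Returns the surviving rows, dropping keys whose latest op is DELETE.
    """
    latest = {}
    for event in events:
        ts = datetime.fromisoformat(event["updated_at"])
        key = event["id"]
        # Later (or equal) timestamps win, mimicking MERGE semantics.
        if key not in latest or ts >= latest[key][0]:
            latest[key] = (ts, event)
    return [e for _, e in latest.values() if e["op"] != "DELETE"]

events = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00", "op": "UPSERT", "v": "a"},
    {"id": 1, "updated_at": "2024-01-02T00:00:00", "op": "UPSERT", "v": "b"},
    {"id": 2, "updated_at": "2024-01-01T00:00:00", "op": "DELETE"},
]
rows = dedupe_cdc_events(events)  # only the latest upsert for id=1 survives
```

In production this logic would typically live in a Dataflow (Apache Beam) pipeline or a Dataform/BigQuery `MERGE` step rather than in-memory Python, but the ordering rule is the same.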

Requirements:

  • 8+ years of professional experience in data engineering or a related discipline.
  • Expert-level proficiency in Python and SQL for scalable data transformation and analysis.
  • Deep expertise with Google Cloud Platform data services, especially BigQuery.
  • Hands-on experience with AlloyDB (PostgreSQL) and Cloud SQL (PostgreSQL).
  • Strong understanding of domain-driven data design and data product thinking.
  • Proven experience architecting ingestion pipelines using Pub/Sub and Datastream (CDC).
  • Expertise with Dataform, Cloud Composer (Airflow), and Cloud Dataflow (Apache Beam).
  • Experience modernizing legacy data systems and optimizing complex SQL/procedural logic.
  • Ability to work independently and lead initiatives with minimal guidance.
  • Strong critical thinking, problem-solving, and decision-making skills.
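As a rough illustration of the "data product thinking" called for above, a squad-owned data product is usually published behind an explicit contract. The sketch below is a minimal, hypothetical example (field names and scope are assumptions); real contracts would also carry SLAs, freshness, and lineage metadata.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataProductContract:
    """Minimal contract for a squad-owned data product (illustrative only)."""
    name: str
    owner: str
    schema: dict  # column name -> expected Python type

    def validate(self, row: dict) -> list:
        """Return a list of violations for one row against the schema."""
        errors = []
        for column, expected in self.schema.items():
            if column not in row:
                errors.append(f"missing column: {column}")
            elif not isinstance(row[column], expected):
                errors.append(f"{column}: expected {expected.__name__}")
        return errors

contract = DataProductContract(
    name="orders",
    owner="checkout-squad",
    schema={"order_id": int, "total": float},
)
ok = contract.validate({"order_id": 1, "total": 9.99})      # no violations
bad = contract.validate({"order_id": "x"})                  # type + missing
```

Publishing contracts like this (or enforcing them via Dataplex quality rules) is what makes a data product reusable across squads rather than a one-off table.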

Benefits:

  • Flexible work arrangements
  • Professional development