Senior Data Engineer – Data Platform

Posted 14 days ago

Job Description

The Senior Data Engineer is responsible for building and scaling data infrastructure across Salla's e-commerce ecosystem, owning data pipelines from ingestion through transformation using modern data tooling.

Responsibilities:

  • Pipeline Engineering: Design, build, and maintain scalable ETL/ELT pipelines from diverse sources including Data Lakes, Production ClickHouse instances, flat files, and various APIs.
  • Infrastructure & Orchestration: Configure and optimize our Data Warehouse infrastructure (ClickHouse) and orchestration layers (Mage.ai).
  • Engineering Excellence: Implement and manage "engineering-grade" CI/CD workflows, conduct rigorous PR reviews, and ensure robust dependency management across the stack.
  • Data Modeling & Architecture: Implement Medallion architecture (Bronze/Silver/Gold) and maintain high-performance data models using dbt.
  • Quality & Observability: Build automated data quality monitoring and alerting; proactively escalate upstream data issues to engineering teams and keep stakeholders informed of pipeline health.
  • Advanced Data Flows: Develop reverse ETL (rETL) pipelines and expose secure data APIs to enable seamless data consumption across the organization.
  • Strategic Integration: Manage event streaming and real-time data ingestion (Kafka, CDC) to support high-volume product analytics and tracking.
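To give candidates a concrete picture of the pipeline work described above, here is a minimal, hypothetical sketch of a Bronze-to-Silver promotion step with a simple automated quality gate. The function, table, and column names are illustrative only and are not part of Salla's actual stack.

```python
# Hypothetical Bronze -> Silver promotion with a basic data quality gate.
# All names (clean_orders, order_id, amount) are illustrative, not Salla's schema.

def clean_orders(bronze_rows):
    """Promote raw (Bronze) order records to a validated Silver layer."""
    silver = []
    for row in bronze_rows:
        # Enforce a minimal data contract: reject rows with a missing key
        # or a negative amount rather than letting them flow downstream.
        if row.get("order_id") is None or row.get("amount", 0) < 0:
            continue
        silver.append({
            "order_id": str(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row.get("currency", "SAR"),
        })
    return silver

def quality_report(bronze_rows, silver_rows):
    """Alert-worthy observability metric: rows rejected during promotion."""
    rejected = len(bronze_rows) - len(silver_rows)
    return {"total": len(bronze_rows), "rejected": rejected}

raw = [
    {"order_id": 1, "amount": 99.5},
    {"order_id": None, "amount": 10.0},   # fails contract: missing id
    {"order_id": 3, "amount": -5.0},      # fails contract: negative amount
]
clean = clean_orders(raw)
print(quality_report(raw, clean))  # {'total': 3, 'rejected': 2}
```

In a production setting this logic would typically live in a dbt model or an orchestrated Mage.ai block, with the rejection rate feeding the monitoring and alerting described above.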

Requirements:

  • 4–7 years of experience in Data Engineering, preferably within the e-commerce or high-growth tech industry.
  • Expert-level Python and SQL (able to write highly optimized code for large-scale datasets).
  • Deep experience with dbt for transformation and modeling.
  • Strong experience with ClickHouse (preferred) or similar modern warehouses (Snowflake, BigQuery).
  • Strong experience with orchestration tools: Mage.ai (preferred) or similar (Airflow).
  • Proven experience implementing and managing Reverse ETL workflows to sync data back into operational tools.
  • Proven track record building and deploying production-grade CI/CD pipelines and automation scripts.
  • Solid understanding of Data Contracts, Medallion architecture, and Data Quality frameworks.
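As one illustration of the reverse ETL workflows mentioned in the requirements, the following hypothetical sketch shapes warehouse-derived rows into payloads and syncs them to an operational tool; the transport is a stand-in stub, and all names are invented for illustration.

```python
# Hypothetical reverse ETL sketch: push warehouse-derived customer segments
# back to an operational tool. `send` is a stub here; in practice it would
# wrap a CRM or marketing-platform API call with retries and auth.

def build_payloads(segment_rows):
    """Shape warehouse rows into the destination's expected payload."""
    return [
        {"email": r["email"], "properties": {"segment": r["segment"]}}
        for r in segment_rows
        if r.get("email")  # skip rows the destination cannot key on
    ]

def sync(segment_rows, send):
    """Send each payload; return counts for pipeline observability."""
    payloads = build_payloads(segment_rows)
    sent = sum(1 for p in payloads if send(p))
    return {"attempted": len(payloads), "sent": sent}

rows = [
    {"email": "a@example.com", "segment": "vip"},
    {"email": None, "segment": "churn-risk"},  # dropped: no join key
]
print(sync(rows, send=lambda payload: True))  # {'attempted': 1, 'sent': 1}
```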