Senior Data Engineer

Posted 11 hours ago

Job Description

The Senior Data Engineer develops and maintains data pipelines for LuxuryIQ, with a focus on data quality and pipeline architecture for luxury-brand analytics.

Responsibilities:

  • Set the technical direction for pipeline architecture and data quality standards
  • Take end-to-end ownership of mission-critical pipelines — from raw ingestion through to clean, documented delivery in BigQuery
  • Be the go-to technical authority on data behaviour across the platform
  • Design and develop data processing, cleansing, transformation, and QA scripts using Python and SQL
  • Own the operation, maintenance, and enhancement of existing mission-critical data pipelines and ETLs
  • Monitor pipeline reliability and data quality standards; identify and close coverage gaps systematically
  • Optimise ETL performance and migrate legacy pipelines into a more scalable architecture
  • Define and enforce data quality standards across the full pipeline
  • Contribute to architectural decisions on ingestion, transformation, and delivery layer design
  • Own the full data lifecycle documentation: every transformation rule, edge case, and business-logic decision must be written up as a spec
  • Fetch datasets from external data sources: REST APIs, JSON, CSV
  • Perform data migrations and batch data updates
  • Conduct exploratory analysis to support new business requirements and data source onboarding
  • Develop proof-of-concept pipelines for new data initiatives

Requirements:

  • Degree in Computer Science, Information Systems, Statistics, Mathematics, or a related technical field
  • 7+ years of relevant experience in data engineering, database development, or ETL development
  • Excellent Python skills (pandas, numpy); able to write production-grade pipeline code
  • Excellent SQL skills and strong database design and development experience
  • Proven ETL development and long-term maintenance experience
  • Apache Airflow experience
  • Experience with entity resolution and deduplication across heterogeneous, multi-source datasets
  • Web scraping, crawler development, and API integration experience
  • Working knowledge of AI/LLM tooling (Claude Code, Copilot, or equivalent)

Benefits:

  • Monthly travel to Geneva