Senior Data Engineer
Posted 53 days ago
Job Description
We are looking for a Senior Data Engineer to help build a cutting-edge data mesh platform at Safeguard Global, collaborating across teams to enhance data quality and drive business insights through innovative solutions.
Responsibilities:
- Imagine shaping the future of how data drives business decisions and AI innovation.
- Help build a cutting-edge data mesh platform that empowers teams with scalable, reliable, and high-quality data.
- Go deep into architecture and tooling, understanding how data transforms into real business value.
- Influence how product, analytics, AI, and business teams use and trust their data every day.
- Build, maintain, and optimize end-to-end data pipelines using dbt, Python, PySpark, Polars, and SQL.
- Operate and enhance AWS data infrastructure, including Glue, Athena, S3, and supporting tools.
- Design and evolve Apache Iceberg tables, managing schema evolution, partitioning, compaction, and efficient storage.
- Drive data quality, observability, lineage, and monitoring across the entire data platform.
- Translate business and analytical needs into scalable, maintainable, and high-performing data solutions.
- Collaborate with data scientists, analysts, engineers, and product teams to deliver clean, trusted datasets and APIs.
- Contribute to the ongoing evolution of our data mesh, developing new patterns, enhancements, and architecture improvements.
- Build AI/ML-ready data environments and datasets that enable intelligent products and predictive insights.
- Document systems, design decisions, and processes to ensure clarity, consistency, and smooth onboarding.
Requirements:
- Hands-on experience in data engineering, data infrastructure, or platform roles.
- Deep expertise with dbt, Python, and SQL, building reliable and scalable pipelines.
- Extensive experience with AWS services, including Glue, Athena, S3, IAM, and related tools.
- Proven production experience with Apache Iceberg or similar table formats/lakehouse technologies.
- Skilled in PySpark and/or Polars for processing large-scale datasets efficiently.
- Track record of designing scalable, modular data architectures that evolve with business needs.
- Exceptional communicator, able to translate complex technical concepts for diverse stakeholders.
- Ability to thrive in fast-paced, dynamic environments, adapting to shifting priorities with ease.
- Confidence in mentoring or guiding junior engineers, fostering team growth and knowledge sharing.
- Experience building data systems for AI/ML applications, bringing models and analytics to life.
- Familiarity with pipeline orchestration patterns and best practices, streamlining workflows.
- Hands-on exposure to infrastructure-as-code tools like Terraform or CDK, helping automate and scale infrastructure.
- Experience with data cataloging, lineage, or governance tools, improving trust and visibility across datasets.
- Experience with real-time or streaming data pipelines, adding speed and responsiveness to data flows.
Benefits:
- Autonomy and flexibility (work your way): Remote-first, with the flexibility to fit life's needs, like school runs and gym breaks, into your schedule, all while maintaining a high standard of work.
- Generous leave: Enjoy a competitive leave package, including paid bonding leave for new additions to your family.
- Make a difference: Get 2 paid charitable days off to contribute to causes you believe in.
- Corporate bonus/SIP: All Guardians are eligible for our annual bonus scheme or sales incentive plan.
- International environment: Grow your network internationally and collaborate across the world. Interact, discover cultures, and tap into local expertise.
- Human-centered culture: We emphasize the people factor in everything we do. Our nurturing culture ensures your ideas reach our leaders and your contributions get the recognition they deserve.
- Learning: We support your continuous growth by providing access to 2 learning platforms, where you can learn at your own pace.