Data Engineer – Contractor

Posted 3 days ago



Job Description

As a Data Engineer at myPOS, you will design and build reliable data pipelines for fintech, create scalable data architecture, and ensure data quality for business insights.

Responsibilities:

  • Design, build, and operate scalable, reliable data pipelines and data infrastructure
  • Ensure high-quality data is accessible, trusted, and ready for analytics and data science
  • Build and maintain data pipelines for ingestion, transformation, and export across multiple sources and destinations
  • Develop and evolve scalable data architecture to meet business and performance requirements
  • Partner with analysts and data scientists to deliver curated, analysis-ready datasets and enable self-service analytics
  • Implement best practices for data quality, testing, monitoring, lineage, and reliability
  • Optimize workflows for performance, cost, and scalability (e.g., tuning Spark jobs, query optimization, partitioning strategies)
  • Ensure secure, compliant handling of data and models, including access controls, auditability, governance practices, and adherence to relevant data protection standards and internal policies
  • Contribute to documentation, standards, and continuous improvement of the data platform and engineering processes
  • Build and maintain MLOps automation: CI/CD for ML, environment management, artifact handling, versioning of data/models/code

Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field (or equivalent practical experience)
  • 3+ years of experience as a Data Engineer, building and maintaining production-grade pipelines and datasets
  • Proficiency in Python and SQL, with a solid understanding of data structures, performance, and optimization strategies for ETL/ELT processes
  • Hands-on experience with orchestration tools (e.g., Airflow, Dagster, Databricks Workflows) and distributed processing in a cloud environment
  • Familiarity with at least one major cloud provider (GCP, AWS, Azure) and deploying data solutions in the cloud
  • Strong troubleshooting mindset: ability to debug issues across data, infra, pipelines, and deployments
  • Collaborative mindset and clear communication across engineering, analytics, and business stakeholders

Benefits:

  • Excellent compensation package
  • myPOS Academy for upskilling and training
  • Unlimited access to courses on LinkedIn Learning
  • Refer-a-friend bonus, because working with friends is fun
  • Team-building, social activities, and networking at a multi-national level