Data Operations Engineer

Posted 89 days ago



Job Description

The Data Operations Engineer is responsible for onboarding and transforming customer data in our SaaS platform, working with SQL and Python to ensure data quality and reliability.

Responsibilities:

  • Lead customer data onboarding, including mapping, cleansing, transforming, and importing data from competitor platforms, spreadsheets, and ad-hoc sources
  • Build and maintain repeatable ingestion processes and scripts using Python, SQLAlchemy, and Postgres
  • Partner with Customer Success Managers to define data requirements and onboarding timelines
  • Translate messy, inconsistent customer data into clean internal schemas with accuracy and consistency
  • Maintain a library of reusable migration utilities, validation scripts, and automation tools
  • Own internal and external reporting requests requiring SQL or data extraction
  • Perform one-time data cleanups, corrections, and backfills directly in the SaaS database
  • Investigate data anomalies and support engineering with root-cause analysis
  • Improve and maintain ETL pipelines to reduce manual engineering work
  • Build lightweight automations to streamline recurring operational workflows
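The ingestion work described above can be sketched in miniature. This is a hypothetical example, not this company's actual pipeline: it maps a competitor export's column names to an assumed internal schema, normalizes values, skips rows that fail a basic check, and loads the rest. Stdlib sqlite3 stands in for the SQLAlchemy/Postgres stack named in the posting; the column names and table are invented for illustration.

```python
import csv
import io
import sqlite3

# Hypothetical mapping from a competitor export's headers to internal columns.
COLUMN_MAP = {"Full Name": "name", "E-mail": "email", "Signup Date": "signed_up_on"}


def clean_row(raw):
    """Rename mapped columns, trim whitespace, and lowercase emails."""
    row = {COLUMN_MAP[k]: (v or "").strip() for k, v in raw.items() if k in COLUMN_MAP}
    row["email"] = row["email"].lower()
    return row


def ingest(conn, csv_text):
    """Load a customer CSV export into the contacts table.

    Rows failing a minimal validation (non-empty email containing '@')
    are skipped rather than loaded. Returns (loaded, skipped) counts.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS contacts (name TEXT, email TEXT, signed_up_on TEXT)"
    )
    loaded, skipped = 0, 0
    for raw in csv.DictReader(io.StringIO(csv_text)):
        row = clean_row(raw)
        if "@" not in row["email"]:
            skipped += 1
            continue
        conn.execute(
            "INSERT INTO contacts (name, email, signed_up_on) VALUES (?, ?, ?)",
            (row["name"], row["email"], row["signed_up_on"]),
        )
        loaded += 1
    conn.commit()
    return loaded, skipped


conn = sqlite3.connect(":memory:")
export = (
    "Full Name,E-mail,Signup Date\n"
    "Ada Lovelace, ADA@Example.com ,2024-01-05\n"
    "No Email,,2024-02-01\n"
)
print(ingest(conn, export))  # one row loaded, one skipped: (1, 1)
```

In a real onboarding, the column map and validation rules would live in per-customer config so the same script is reusable across migrations.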

Requirements:

  • Strong SQL skills (Postgres preferred)
  • Comfort working with large, messy Excel, Google Sheets, and CSV datasets
  • Python proficiency (SQLAlchemy strongly preferred)
  • Experience designing data transformations, mappings, and validations
  • Solid understanding of ETL principles, automation, and scripting
  • Ability to interpret data models and navigate relational schemas
  • High attention to detail and a strong data quality mindset
  • Clear communicator with both technical and non-technical partners
  • Experience with Python-based migration or ETL frameworks
  • Familiarity with SaaS data structures, multi-tenant databases, or platforms such as CRM, ATS, or LMS systems
  • Experience building reusable internal tools for data operations
  • Exposure to Git and basic DevOps workflows
  • Comfort troubleshooting and working in production-like environments
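The "transformations, mappings, and validations" skills listed above often reduce to a small rule engine run over messy spreadsheet rows before import. Below is a minimal sketch under assumed rules (email format, ISO dates on an invented `signed_up_on` field); it is illustrative, not a description of this employer's tooling.

```python
import datetime
import re


def _is_iso_date(value):
    """True when the value parses as an ISO-8601 date (YYYY-MM-DD)."""
    try:
        datetime.date.fromisoformat(value)
        return True
    except ValueError:
        return False


# Hypothetical per-field validation rules for an imported contacts sheet.
RULES = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "signed_up_on": _is_iso_date,
}


def validate(rows):
    """Return (row_index, field) pairs for every failed check."""
    errors = []
    for i, row in enumerate(rows):
        for field, check in RULES.items():
            if not check(row.get(field, "")):
                errors.append((i, field))
    return errors


sample = [
    {"email": "ada@example.com", "signed_up_on": "2024-01-05"},
    {"email": "not-an-email", "signed_up_on": "05/01/2024"},
]
print(validate(sample))  # [(1, 'email'), (1, 'signed_up_on')]
```

Reporting failures as structured (row, field) pairs, rather than raising on the first bad cell, lets a customer fix a whole spreadsheet in one pass.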