Data Engineer

Posted 98 days ago


Job Description

As a Data Engineer at InterWorks, you will create robust data systems and pipelines that power client insights, collaborating closely with analysts to unify data sources and support analytics initiatives.

Responsibilities:

  • Build modern, scalable data pipelines that keep the data flowing—and keep our clients happy
  • Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning
  • Unify and wrangle data from all kinds of sources: SQL, APIs, spreadsheets, cloud storage, and more
  • Develop ETL/ELT frameworks that improve code quality and make things easier for your teammates
  • Apply strong data modeling principles to support everything from dashboards to data science
  • Collaborate closely with other InterWorkers and client teams to understand what they really need
  • Write clear documentation, contribute to design decisions, and share what you learn
  • Bring a thoughtful, problem-solving mindset to every project

Requirements:

  • Solid SQL skills (and the curiosity to keep leveling up)
  • Strong experience with ETL/ELT workflows (GUI tools or code-based—either works!)
  • A clear understanding of data modeling best practices
  • Deep understanding of data quality, governance, and observability principles and practices
  • Working knowledge of DevOps concepts
  • Experience with CI/CD pipelines
  • Excellent communication skills—you can explain tech to humans
  • A passion for delivering smart, thoughtful, client-centered solutions
  • A love for learning new tools, frameworks, and approaches
  • 5+ years of professional experience in a data engineering or technical consulting role
  • Flexibility and comfort in fast-changing environments
  • Experience with cloud platforms (AWS, Azure, GCP)
  • Familiarity with tools like Matillion, Fivetran, dbt, or Snowflake
  • Exposure to modern data warehouses: BigQuery, Redshift, Databricks, etc.
  • Knowledge of Python or scripting for automation
  • An interest in AI and how it’s shaping the future of data engineering
  • A background in software engineering or integration work