Senior Data Engineer

Posted 3 days ago



Job Description

As a Senior Data Engineer at Peek, you will transform data sources into actionable insights for analytics, collaborating across teams to support business intelligence and enhance our data infrastructure.

Responsibilities:

  • Leverage the full capabilities of GCP, DBT, Airflow, and other data pipeline products to transform varied, disparate data sources into performant, resilient, accurate, and understandable datasets for Business Intelligence and downstream data products.
  • Design, develop, and own data pipelines and models that power internal analytics for product and business teams.
  • Drive the collection of new data and the refinement of existing data sources, and develop relationships with production engineering teams to manage our data structures as products evolve.
  • Improve our data visualization tooling and platform to help the team create dynamic tools and reporting.
  • Contribute to the long-term design of Peek's AI data serving layer — including feature engineering patterns, online/offline feature consistency, and foundational feature store architecture.
  • Bring strong interpersonal communication skills and a collaborative approach to problem-solving. We value team success over individual achievement, and we love working with people who ask hard questions, offer unusual solutions, and are willing to compromise when solving challenging problems on a deadline.
  • Stay well organized and self-motivated. Adapt to a fast-paced environment where teams are growing and priorities shift quarterly as we move into new markets.
  • Advocate for self-care while managing a high workload. We want you to show up as your best self, and to speak up for what you need to stay healthy and engaged.

Requirements:

  • 4+ years of experience in a data engineering role, with a focus on building data pipelines and modeling.
  • 2+ years of direct experience with DBT and/or BigQuery.
  • Proficiency with relational databases (Postgres) and SQL.
  • Proficiency with data warehouses (BigQuery).
  • Experience with Python.
  • Experience with ETL tools and techniques (Airbyte, DBT, Airflow, Pulumi, etc.).
  • Familiarity with modern AI data infrastructure concepts — including embedding pipelines, vector search, and retrieval-augmented generation (RAG) patterns.
  • Experience with reporting tools such as Looker is a nice-to-have.
  • English is our business language, and fluency is required.
  • Comfortable working remotely. We are 100% remote, and rely on Slack, email, Signal, and Zoom to stay connected.

Benefits:

  • Offers Equity