Full Stack Engineer – Data Engineering Focus

Posted 6 days ago

Job Description

GNO Partners is hiring a Full Stack Engineer with a data engineering focus to build scalable data pipelines and contribute to the AI insights layer that serves Amazon sellers.

Responsibilities:

  • Design and build the data pipelines that pull from Amazon SP-API and Ads API into our Postgres warehouse — handling rate limits, retries, schema evolution, and the messy realities of vendor APIs.
  • Architect ingestion, transformation, and aggregation layers that power our reporting tools.
  • Build new reporting tools end-to-end alongside the rest of the engineering team — you won't work only on pipelines.
  • Help us scale our data layer as we move from per-client manual uploads to automated, multi-tenant data flows.
  • Contribute to our AI insights layer — feeding clean, structured data into LLM-powered analysis.

Requirements:

  • 3–5 years of full-stack experience, with a demonstrable track record of building data pipelines in production.
  • Strong with TypeScript, Node.js, NestJS, and React.
  • Deep comfort with Postgres / Supabase — partitioning, indexing, query optimization, handling large datasets.
  • Hands-on experience with AWS: S3, Lambda, SNS, SQS, EC2.
  • Bonus: experience with orchestration tools (Step Functions, EventBridge, Airflow, Temporal, etc.).
  • Experience integrating with third-party APIs at scale — pagination, rate limiting, backfills, incremental sync.
  • Basic familiarity with AI agentic systems — you've worked with or explored LLMs, tool-use, or agent frameworks.
  • Pragmatic and product-aware — you understand pipelines exist to serve user-facing features, not for their own sake.

Nice to Have:

  • Direct experience with Amazon SP-API and/or Ads API.
  • Background with ETL frameworks, streaming systems (Kafka, Kinesis), or workflow engines (Temporal, Airflow).
  • Experience with data quality tooling, observability, or pipeline monitoring.

Benefits:

  • You'll own the data backbone of a platform that's actively scaling.
  • High-leverage work — every pipeline you build directly enables new product surface area.
  • Clear path to work across pipelines, product, and AI as the platform evolves.