Software Engineer – Data Platform
Posted 90 days ago
Job Description
We are seeking a Software Engineer to build backend APIs and scalable data pipelines at an AI infrastructure company. You will design and scale the systems that power our data workflows and analytics.
Responsibilities:
- Build backend APIs and scalable data pipelines (Python, PySpark).
- Work with modern data lakehouse/warehouse tech (Iceberg, Delta Lake, Snowflake, Databricks).
- Orchestrate workflows (Airflow) and optimize big data frameworks.
- Manage infrastructure as code (Terraform) and ensure reliability with monitoring and logging.
- Collaborate across teams and directly with customers to solve complex data challenges and design integration solutions.
- Drive best practices in scalability, reliability, and cost efficiency.
Requirements:
- 5+ years in software/data engineering or infrastructure roles
- Strong Python skills (backend APIs a plus)
- Proven ability to build scalable data pipelines from scratch
- Hands-on with Apache Iceberg/Delta Lake + Snowflake/Databricks
- Workflow orchestration expertise (Airflow, Luigi, etc.)
- Big data frameworks experience (Spark, Hadoop)
- Familiar with monitoring/analytics tools (Prometheus, Grafana, ELK, Datadog)
- Skilled in designing scalable, reliable, cost-efficient systems
- Experience with large-scale distributed data architectures
- Thrives in fast-paced startup environments
- Excellent problem-solving, communication, and customer-facing skills
Benefits:
- Competitive salary, meaningful equity, and substantial bonus for top performers
- Flexible time off plus comprehensive health coverage for you and your family
- Support for research, publication, and deep technical exploration