Data Architect, Data Platform – Azure

Posted 13 days ago


Job Description

CargoSprint is seeking a Data Architect to own the data warehouse and reporting layer on Azure, collaborating across teams to ensure data quality and enable analytics.

Responsibilities:

  • Own the design and delivery of our data warehouse and data marts: models, schemas, semantic consistency, and documentation.
  • Build and evolve reporting foundations: curated datasets, metric definitions, and repeatable reporting patterns.
  • Partner with stakeholders to define source-of-truth metrics and ensure consistent definitions across teams.
  • Establish standards for data quality, testing, reconciliation, lineage, access control, and lifecycle management.
  • Drive modernization: reduce legacy data debt, simplify flows, and improve reliability and performance across the platform.
  • Define data contracts with application teams (schemas, events/CDC patterns) so downstream reporting is stable.
  • Enable self-serve analytics by making data discoverable, documented, and safe to use.

Requirements:

  • 8+ years in data engineering, analytics engineering, or platform engineering with architecture ownership
  • Proven experience building data warehouses and marts that support business reporting at scale
  • Strong SQL and data modeling skills (dimensional modeling, star schemas, and/or semantic models)
  • Experience with Azure data platforms (Synapse/Fabric, ADLS, Databricks on Azure, ADF) or comparable equivalents (Snowflake, BigQuery, Redshift)
  • Experience with orchestration and transformation tooling (ADF, Airflow, dbt, Dagster, or equivalent)
  • Track record of modernization with incremental migration and clear deprecation plans
  • Ability to align engineering and business stakeholders around shared definitions and priorities
Nice to have:

  • Experience with BI layers and semantic modeling (Power BI preferred; Tableau/Looker also fine)
  • Streaming/event-driven data patterns (Kafka/Kinesis/Pub/Sub) or CDC experience
  • Payments, billing, invoicing, or other high-volume transactional domains
  • Logistics, cargo, or supply chain experience
  • Spanish language proficiency

Benefits:

  • Medical, dental, and vision plans for you and your family
  • 401(k) with company match
  • Generous flexible PTO program and paid holidays
  • Professional development opportunities

CargoSprint

Empowering the people that make global commerce happen.

Transport · Commerce · Fintech
