Senior Data Engineer
Posted 29 days ago
Job Description
The Senior Data Engineer designs and maintains reliable data pipelines and scalable infrastructure for a personal injury law firm, contributing to analytics, reporting, and decision-making across the organization.
Why Join:
- Join a rapidly growing national firm at a formative stage
- Make a visible, measurable impact on the business
- Grow your skills, responsibility, and influence as the firm scales
- Work alongside high-caliber, mission-driven teammates who care deeply about doing great work
Responsibilities:
- Design, build, and maintain reliable data ingestion pipelines from internal systems and third-party data sources
- Implement scalable ELT workflows that process and deliver data across the organization
- Maintain transformation pipelines and ensure reliable delivery of analytics-ready datasets
- Manage and optimize the performance, reliability, and scalability of the company’s cloud data warehouse environment
- Maintain orchestration frameworks and scheduling systems that support data workflows
- Optimize data pipeline performance, compute utilization, and system efficiency
- Implement monitoring, alerting, and observability across data pipelines and platform components
- Ensure data freshness and system uptime meet defined service expectations
- Diagnose and resolve production issues including pipeline failures, data quality issues, and performance bottlenecks
- Maintain version-controlled data infrastructure and CI/CD workflows for data pipelines
- Implement testing and validation practices to ensure data quality and reliability
- Partner with the Director of Data to implement data architecture and platform improvements
- Support analytics and BI teams by ensuring reliable and well-modeled datasets are available for reporting and analysis
- Contribute engineering input to platform improvements and technical roadmap initiatives
Requirements:
- 5+ years of experience building and maintaining production data pipelines
- Strong SQL skills and experience working with large datasets
- Experience with modern cloud data warehouses such as Snowflake or BigQuery
- Experience building transformation workflows using dbt or similar tools
- Experience working with orchestration tools such as Airflow
- Strong understanding of data pipeline reliability, performance optimization, and scalability
- Experience using version control systems such as Git in collaborative development environments
Nice to Have:
- Experience supporting data infrastructure used for machine learning workflows
- Experience building feature pipelines or supporting predictive modeling workloads
- Experience using Python for data processing and pipeline development
- Experience implementing monitoring and observability for data systems
- Experience working in high-growth or rapidly scaling environments
- Strong written and verbal communication skills
- Ability to think critically, prioritize effectively, and execute with speed
Benefits:
- Health insurance
- Flexible working hours