Forward Deployed Engineering Lead – Data Science, Integration

Job Description

The Forward Deployed Engineering Lead manages Data Science & Integration for public sector missions, delivering innovative solutions at customer sites for defense and intelligence agencies.

Responsibilities:

  • Deploy Cutting-Edge Data Science & Integration Innovation, from API-led integration ecosystems and orchestrated operational workflows to advanced ML pipelines, directly at customer sites in defense, intelligence, and aerospace environments.
  • Become a Trusted Partner Embedded with Mission Owners, working in classified and unclassified environments to deeply understand operational challenges and translate them into end-to-end data science and integration solutions using best-fit tooling.
  • Drive Innovation at the Front Lines by serving as the connective bridge between customer missions and internal engineering teams, gathering frontline feedback to help co-design integration patterns, workflow templates, and data science capabilities that strengthen national security outcomes.
  • Improve Efficiency, Mission-Readiness, and Trust by helping the U.S. and allied governments deploy integrated data technology stacks that deliver faster response times, better personnel utilization, and more effective logistics and asset management.
  • Drive Tangible Outcomes through Hands-On Implementation, taking responsibility for end-to-end technical delivery of complex data pipelines, integration flows, and predictive models: personally writing critical code, configuring systems, and troubleshooting issues in the field while guiding junior team members.
  • Build Transformative Data Science Solutions by taking primary ownership of designing, developing, and implementing high-quality, scalable production ML and analytics systems, including predictive models, intelligent automation pipelines, and LLM-augmented workflows.
  • Engineer Bespoke Integration Architectures, designing and building custom API-led integrations using enterprise integration platforms to ensure seamless connectivity across enterprise data sources, operational systems, and cloud environments in highly regulated settings.
  • Orchestrate Complex Workflows, leveraging modern workflow automation and process orchestration platforms to model, automate, and optimize multi-stakeholder operational processes in manufacturing, logistics, supply chain, and mission operations contexts.
  • Own the Entire Data Lifecycle on-site with customers, from strategic data modeling to hands-on ETL/ELT pipeline construction, feature engineering, and integration strategy, ensuring data is always ready, optimized, and secure for advanced analytics and AI applications.
  • Lead Rapid Prototyping and Iteration, developing proofs-of-concept and MVPs side-by-side with customer teams to demonstrate data science and integration capabilities and set the pace for rapid, mission-aligned delivery.

Requirements:

  • 8+ years of experience in a hands-on, end-to-end delivery role for high-quality, scalable production solutions, ideally in customer-facing or field-deployed settings.
  • Strong background in Computer Science, Data Science, or a related engineering discipline.
  • Expert-level proficiency in one or more programming languages (e.g., Python, Java, SQL).
  • Extensive hands-on experience building and deploying AI/ML solutions, including model development, MLOps pipelines, LLM integration, prompt engineering, and agentic frameworks (e.g., LangChain, LlamaIndex).
  • Demonstrated expertise in enterprise integration platforms and API-led connectivity design, with experience in tools such as MuleSoft Anypoint Platform, Dell Boomi, Apigee, Azure API Management, or equivalent; direct MuleSoft experience is a plus.
  • Demonstrated expertise in workflow orchestration and process automation platforms, with experience in tools such as Regrello, Temporal, Prefect, Apache Airflow, or equivalent; direct Regrello experience is a plus.
  • Deep expertise in data modeling, processing, and analytics with demonstrable proficiency in enterprise data platforms (e.g., Snowflake, Databricks, BigQuery, Redshift).
  • Ability to travel 50-75% of the time to customer sites, including classified facilities.
  • Must be a U.S. citizen (born or naturalized) residing in the United States, and must not hold dual citizenship.
  • Eligible for and willing to obtain a U.S. Security Clearance.

Benefits:

  • Time off programs
  • Medical, dental, and vision insurance
  • Mental health support
  • Paid parental leave
  • Life and disability insurance
  • 401(k)
  • Employee stock purchase program