Lead Forward Deployed Engineer, Databricks

Job Description

The Lead Forward Deployed Engineer designs and deploys AI-native solutions on Databricks to drive business transformation, working closely with clients to build production-grade data systems.

Responsibilities:

  • Work directly with business and technical stakeholders to identify high-value data and AI use cases that can be delivered on Databricks
  • Design, build, and deploy production-grade data and AI solutions using Databricks capabilities across the Lakehouse, Mosaic AI, Unity Catalog, Databricks SQL, Workflows, Delta Lake, Databricks Apps, Genie, Agents, and Lakebase
  • Lead client discovery sessions to understand business workflows, data availability, platform maturity, integration needs, and measurable success criteria
  • Architect AI-native data platforms that support agentic workflows, semantic analytics, model deployment, retrieval systems, optimization models, and operational applications
  • Build Genie rooms, semantic layers using Metric Views, decision-support applications, data products, AI applications, and agent memory architectures that help clients operationalize insight and action
  • Partner with data engineering, AI engineering, analytics, business, security, and governance stakeholders to design secure, scalable, production-ready solutions
  • Create prototypes, demos, technical reference architectures, and reusable accelerators that showcase the value of Databricks for enterprise AI and analytics workloads
  • Help clients modernize data pipelines, improve platform architecture, implement governance patterns, and deploy AI systems into operational workflows
  • Work with Aimpoint Digital’s alliance, sales, and delivery teams to shape Databricks-led opportunities and translate client needs into winning solution approaches
  • Develop thought leadership, solution accelerators, demos, and internal enablement materials that strengthen Aimpoint Digital’s Databricks practice

Requirements:

  • Strong experience in data engineering, AI engineering, platform engineering, solution architecture, or enterprise software development
  • Hands-on experience with Databricks, Spark, Delta Lake, Lakehouse architecture, data pipelines, model deployment, or modern data platform patterns
  • Strong Python and SQL skills. Experience with PySpark, MLflow, Databricks Workflows, Unity Catalog, Databricks SQL, or similar tooling is strongly preferred
  • Familiarity with enterprise AI patterns such as RAG, agents, model serving, vector search, semantic layers, data applications, evaluation frameworks, and governance
  • Ability to work directly with clients, understand ambiguous business needs, and translate them into technical architecture and implementation plans
  • Strong communication skills with the ability to engage executives, business leaders, architects, data engineers, ML engineers, and analytics teams
  • Comfort moving from strategy to architecture to hands-on development
  • A practical understanding of what it takes to move from demo to production in complex enterprise environments
  • Databricks certification or deep hands-on delivery experience in the Databricks ecosystem (preferred)

Benefits:

  • Health insurance
  • Professional development opportunities