Senior Consultant – AI and Data, AWS Databricks
Job Description
As an AWS Data Engineer in EY's Data & Analytics (D&A) team, you will develop scalable ETL pipelines for complex data solutions, collaborating across a range of industries to modernize data ecosystems and analytics platforms.
Responsibilities:
- Develop, optimize, and deploy scalable ETL/ELT pipelines using PySpark, SQL, and AWS services.
- Demonstrate a good understanding of medallion architecture using the Databricks lakehouse.
- Apply strong knowledge of end-to-end Databricks platform management (Unity Catalog, users/groups, security, etc.), Databricks integrations (AWS/Azure/GCP), and DevOps CI/CD pipelines (for notebooks, MLflow, etc.).
- Build and manage data lakehouse solutions leveraging AWS S3, Glue, Iceberg, and other AWS-native components.
- Migrate on-premises ETL workloads to modern AWS-based architectures with a focus on performance, reliability, and cost efficiency.
- Implement metadata-driven ingestion frameworks and medallion/layered architecture (Bronze/Silver/Gold).
- Work on orchestration frameworks such as Astronomer (Airflow), AWS Step Functions, or AWS-managed workflows.
- Follow data warehouse (DW) concepts and best practices for modeling and data integration.
- Collaborate with data analysts, data scientists, and BI engineers for seamless data delivery.
- Perform code reviews, troubleshoot issues, and ensure end-to-end data quality.
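The metadata-driven ingestion and medallion (Bronze/Silver/Gold) patterns referenced above can be sketched roughly as follows. This is a minimal illustration in plain Python rather than PySpark, so it runs without a Spark cluster; all table names, column names, and the `INGESTION_METADATA` structure are hypothetical, not part of any specific EY or Databricks framework.

```python
# Hypothetical metadata: for each source, which key to deduplicate on
# and which columns survive into the Silver layer.
INGESTION_METADATA = {
    "orders": {"key": "order_id", "silver_columns": ["order_id", "amount"]},
}

def to_bronze(source: str, raw_records: list[dict]) -> list[dict]:
    """Bronze: land raw records as-is, tagged with their source system."""
    return [{**r, "_source": source} for r in raw_records]

def to_silver(source: str, bronze: list[dict]) -> list[dict]:
    """Silver: deduplicate on the declared key, keep only declared columns."""
    meta = INGESTION_METADATA[source]
    seen, silver = set(), []
    for r in bronze:
        if r[meta["key"]] not in seen:
            seen.add(r[meta["key"]])
            silver.append({c: r[c] for c in meta["silver_columns"]})
    return silver

def to_gold(silver: list[dict]) -> dict:
    """Gold: a simple business-level aggregate over the cleaned data."""
    return {"total_amount": sum(r["amount"] for r in silver)}

raw = [{"order_id": 1, "amount": 10.0, "junk": "x"},
       {"order_id": 1, "amount": 10.0, "junk": "y"},
       {"order_id": 2, "amount": 5.0, "junk": "z"}]
bronze = to_bronze("orders", raw)
silver = to_silver("orders", bronze)   # duplicate order_id 1 dropped
gold = to_gold(silver)
print(gold)  # {'total_amount': 15.0}
```

In a Databricks setting, the same layering would typically be expressed as PySpark transformations writing Delta or Iceberg tables per layer, with the metadata table driving which sources are ingested and how.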
Requirements:
- 8+ years of total IT experience, including at least 2 years in AWS-based data engineering.
- Strong hands-on experience in Databricks Lakehouse Architecture.
- ETL pipeline development with PySpark, Scala, SQL, and Python.
- AWS services: S3, Glue, Lambda, Step Functions, CloudWatch.
- Unix/Linux environments and Iceberg table format.
- Astronomer (Airflow) or other orchestration tools.
- Experience with structured and semi-structured data formats such as Parquet, JSON, CSV, XML.
- Understanding of data warehousing concepts, star/snowflake schemas, and dimensional modeling.
- Good knowledge of CI/CD, version control (GitHub, Azure DevOps, Jenkins).
- Strong analytical, troubleshooting, and problem-solving skills.
- Ability to work independently and collaborate with stakeholders to deliver high-quality solutions.
Benefits:
- Opportunities to work on diverse, meaningful, and industry-leading projects.
- Coaching, learning programs, and a personalized growth and development plan.
- Exposure to a collaborative, interdisciplinary work culture.
- Flexibility to manage your work in the way that suits you best.
- Supportive colleagues and a global environment for continuous knowledge exchange.