Data Engineer
Posted 32 days ago
Job Description
The Data Engineer is responsible for developing large-scale data systems and pipelines, collaborating with teams at Stefanini to deliver real-time data solutions.
Responsibilities:
- Design, develop, implement, and tune large-scale distributed systems and pipelines that process high volumes of data, with a focus on scalability, low latency, and fault tolerance in every system built.
Requirements:
- 5–6 years of Big Data development experience.
- Demonstrates up-to-date expertise in data engineering and complex data pipeline development.
- Experience working in Agile methodologies.
- Proficient in Java and Python for building data pipelines and data processing layers.
- Experience with Airflow and GitHub.
- Conversational English.
- Experience writing MapReduce jobs.
- Demonstrates expertise in writing complex, highly optimized queries across large datasets.
- Proven working experience with Big Data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase.
- Highly proficient in SQL.
- Experience with cloud platforms (GCP, Azure).
- Experience with relational models and in-memory data stores is desirable (Oracle, Cassandra, Druid).
- Experience implementing, operating, and supporting data pipelines and analytical solutions.
- Performance tuning experience for systems working with large datasets.
- Experience consuming data from REST API services.
- Retail experience is a strong plus.
- Previous experience at Walmart is required.