Data Engineer
Posted 4 days ago
Job Description
We are looking for a Data Engineer to build and maintain scalable data pipelines for an international company in Hamburg. The role requires experience with Snowflake, ETL processes and Azure Data Factory.
Responsibilities:
- Build and maintain efficient, scalable data pipelines to ingest, process and store data from various sources.
- Ensure data quality, integrity and availability.
- Design and implement database schemas and structures that enable optimal storage, querying and processing of data.
- Develop, manage and optimize ETL processes to extract, transform and load data into target systems.
- Ensure timely and accurate delivery of data.
- Use Snowflake and Azure Data Factory to manage cloud-based data warehouse and data integration processes.
- Ensure efficient data storage and accessibility.
- Integrate IoT data using Azure IoT Hub to enable real-time data processing and analytics.
- Implement and manage data workflows with Apache Airflow to automate and orchestrate ETL processes and other data-related tasks.
Requirements:
- Bachelor's degree in Computer Science, Information Technology, Engineering or a related field.
- At least 5 years of experience as a Data Engineer or in a comparable role.
- Solid experience with data pipeline tools and techniques.
- Excellent knowledge of Snowflake and Azure Data Factory.
- Experience integrating IoT data via Azure IoT Hub.
- Hands-on experience with Apache Airflow for workflow automation.
- Expert knowledge of ETL processes and related tools.
- Strong SQL skills and experience with relational databases.
- Familiar with data modeling principles and best practices.
- Experience with data visualization tools, particularly Microsoft Power BI.
- Knowledge of cloud computing concepts, especially Microsoft Azure.
- Experience with other cloud platforms such as AWS or Google Cloud.
- Familiarity with big data technologies like Hadoop, Spark or Kafka.
- Experience with Python or other scripting languages.
- Knowledge of data governance and data quality frameworks.
- Excellent German and English language skills required.
Benefits:
- Permanent full-time position (37.5 hours/week)
- Flexible working hours
- Hybrid remote-work arrangement (2–4 days working from home per week possible)
- 30+ days of vacation
- Internal and external training and development opportunities
- Holiday pay, bonus payments, company pension scheme (BAV) & capital-forming benefits (VWL)
- On-site gym, subsidized canteen, commuting allowance, company bike (JobRad), employee & team events