Mid Data Engineer

Posted 109 days ago


Job Description

We are looking for a Data Engineer to help build a state-of-the-art data platform using open-source technologies, collaborating on ETL processes and scalable transaction handling for a leading financial institution.

Responsibilities:

  • Contribute expertise to the overall data ecosystem’s engineering best practices, standards, and architectural approaches.
  • Collaborate in authoring, reviewing, and approving data requirements and designs, including ETL, data movement, pipelines, business intelligence, and analytics.
  • Work alongside a team of engineers to build and maintain a custom data platform and Master Data platform.
  • Participate in creating architecture and solution blueprints to meet client requirements.
  • Collaborate with the project management team to develop the overall implementation solution plan and actively contribute to project life cycle phases within the solution area.
  • Engage with data integration technologies to design and implement new solutions and processes to support evolving data management strategies.
  • Work closely with business and technology client representatives to gather functional and technical requirements.
  • Stay informed about the latest technology trends, particularly in the areas of data platforms.

Requirements:

  • 3+ years of experience in data engineering.
  • Proficiency in a programming language such as Python or R.
  • Knowledge of SQL for working with databases.
  • Knowledge of data modeling concepts and the ability to design and implement efficient data storage structures.
  • Experience with ETL processes and tools, in particular Apache Airflow.
  • Familiarity with version control systems like Git for code collaboration and tracking changes.
  • 3+ years of experience in working with Kubernetes and Docker.
  • Understanding of data warehouse concepts and experience working with dbt.
  • Understanding of cloud-based data storage and processing services.
  • Knowledge of data quality assurance and governance best practices.
  • Understanding of data privacy and security considerations.
  • Effective communication skills to collaborate with cross-functional teams and stakeholders.
  • Strong problem-solving skills and the ability to troubleshoot issues in data pipelines.
  • Familiarity with Agile methodologies and ability to work in a collaborative team environment.
  • Proficiency in English.

Benefits:

  • Fully remote work with flexible work hours and vacation policy.
  • A diverse international work environment that promotes inclusivity and collaboration.
  • Internal mentorship program.
  • Additional healthcare insurance and loyalty days.
  • Compensated professional development opportunities, including learning materials and certification exam fees.
  • Internal events like Reiz Tech Talks and Learning Sessions.
  • Team-building compensation every quarter.
  • Participation in local events in Lithuania such as wakeboarding, excursions, and Meet and Eat gatherings.
  • Ability to change the project you’re working on.
  • Referral bonuses.