Senior Data Engineer – Databricks
Posted 98 days ago
Job Description
Remote Senior Data Engineer (Databricks) responsible for designing, automating, and maintaining data pipelines and data structures, working with Databricks, Azure Data Factory, Python/PySpark, and modern data lake technologies.
Responsibilities:
- Work on automating and optimizing data flows and processes;
- Develop technical solutions to ingest, process, and store data from multiple sources and in various formats (structured and unstructured data; files in formats such as XML, JSON, and Parquet; APIs);
- Perform activities related to the analysis, development, and maintenance of data processes and structures;
- Contribute to data architecture, mapping, and modeling;
- Maintain an enterprise perspective aligned with industry best practices, solution blueprints, and solution designs;
- Experience with Artificial Intelligence solutions and GCP is a plus.
Requirements:
- Experience in data projects;
- Experience with the Databricks platform;
- Experience with Azure Data Factory;
- Programming experience in Python, SQL, and PySpark;
- Experience automating information processes (ETL/ELT);
- Experience with version control using Git;
- Experience with relational databases and NoSQL;
- Experience with Azure Cloud, Data Factory, Synapse, ADLS Gen2, Delta Lake;
- Experience in data engineering and integration (e.g., ETL, APIs, microservices);
- Experience with Linux environments, basic commands, and shell scripting;
- Knowledge of streaming processes with Event Hub;
- Knowledge of data serialization formats such as JSON, XML, and YAML.
Benefits:
- Meal/food voucher of R$38/day (flex card)
- Flex allowance of R$210/month (flex card)
- TotalPass
- Health and dental insurance
- Life insurance
- Profit-sharing (PLR)
- Birthday day off
- Training and certifications paid for by Aggrandize
- Referral bonus program
- Career development track