
Data Engineer

On-site, Hybrid
  • Bratislava, Bratislavský kraj, Slovakia
€2,500 per month · Technology

Job description

We are seeking a talented and experienced Data Engineer to join our dynamic team in the financial services industry. Your work will empower data-driven decision-making, enhance business intelligence, and drive innovation in the financial domain. You’ll collaborate with cross-functional teams to create scalable solutions that leverage cutting-edge technologies.


Responsibilities:

  • Architecting Data Solutions: Designing end-to-end data solutions that leverage Azure services such as Azure Data Lake Storage, Azure Databricks, and Azure Integration Services. This involves understanding business requirements and selecting appropriate Azure components to meet those needs.
  • Building and Maintaining Data Pipelines and Automation: Developing ETL (Extract, Transform, Load) pipelines to ingest, clean, transform, and load data from various sources into Azure data stores and Databricks clusters. This may involve writing code in languages like Python or SQL, as well as utilizing tools like Apache Spark for distributed processing.
  • Data Modeling and Optimization: Designing and implementing data models optimized for query performance and scalability.
  • Performance Optimization: Optimizing data pipelines for performance, reliability, and cost-effectiveness. Monitoring data processing efficiency and proposing enhancements.
  • Data Quality and Governance: Investigating and addressing data quality issues. Implementing data governance practices to ensure compliance and security.

Job requirements

  • Azure Services: Strong understanding of Azure services relevant to data engineering, such as Azure Storage, Azure SQL Database, Azure Integration Services, etc. Knowledge of how these services integrate with each other to form an end-to-end data processing pipeline.
  • Data Modeling: Experience with Databricks and the Delta Lake concept. Strong understanding of data modeling principles and techniques. Ability to design ETL processes using Data Factory and Databricks. Proficiency in building Power BI reports.
  • Programming Languages: Proficiency in programming languages commonly used in data engineering, such as Python and SQL.
  • Minimum of 2 years of professional working experience.
  • Good English: written, reading, and verbal communication skills are a must.

Advantageous Skills (considered a plus):

  • Experience with MLflow for managing machine learning workflows.
  • Familiarity with Azure DevOps for CI/CD processes.


The minimum offer for this role is €2,500 gross per month for full-time employment. We aim to pay our employees fairly; your salary may therefore be higher, reflecting your experience and skill set.
