
Data Engineer

On-site, Hybrid
  • Bratislava, Bratislavský kraj, Slovakia
€2,700 per month · Technology

Job description

We are seeking a talented and experienced Data Engineer to join our dynamic team working in the finance services industry. Your work will empower data-driven decision-making, enhance business intelligence, and drive innovation in the financial domain. You’ll collaborate with cross-functional teams to create scalable solutions that leverage cutting-edge technologies.

Responsibilities:

  • Building and Maintaining Data Pipelines and Automation: Developing ETL (Extract, Transform, Load) pipelines to ingest, clean, transform, and load data from various sources into Azure data stores and Databricks clusters. This may involve writing code in languages like Python or SQL, as well as utilizing tools like Azure Integration Services for distributed processing.
  • Data Modeling and Optimization: Designing and implementing data models that support business reporting and are optimized for Power BI applications.
  • Architecting Data Solutions: Actively taking part in designing end-to-end data solutions using Azure Cloud services. This involves understanding business requirements and selecting the appropriate Azure components to meet those needs.
  • Performance Optimization: Optimize data pipelines for performance, reliability, and cost-effectiveness. Monitor data processing efficiency and propose enhancements.
  • Data Quality and Governance: Investigate and address data quality issues. Implement data governance practices to ensure compliance and security.
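To illustrate the kind of pipeline work involved, the extract-clean-transform-load pattern described above can be sketched in plain Python. This is a minimal, self-contained example using only the standard library; the table name, column names, and sample data are illustrative assumptions, not details from this role (in practice this logic would run against Azure data stores or a Databricks cluster).

```python
import csv
import io
import sqlite3

# Illustrative raw input: whitespace noise, a missing amount, and
# inconsistent currency casing, as often seen in source extracts.
RAW = """account_id,amount,currency
A-1, 100.50 ,EUR
A-2,,EUR
A-3, 75.00 ,eur
"""

def extract(text):
    """Read raw CSV text into a list of dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop records with missing amounts and normalize field values."""
    clean = []
    for r in rows:
        if not r["amount"].strip():  # data-quality rule: amount is required
            continue
        clean.append({
            "account_id": r["account_id"].strip(),
            "amount": float(r["amount"]),
            "currency": r["currency"].strip().upper(),
        })
    return clean

def load(rows, conn):
    """Load cleaned records into a target table (SQLite stands in here)."""
    conn.execute(
        "CREATE TABLE transactions (account_id TEXT, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO transactions VALUES (:account_id, :amount, :currency)", rows
    )

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM transactions").fetchone()[0]
```

The same three-stage shape carries over to PySpark or Azure Data Factory pipelines; only the extract sources and load targets change.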

Job requirements

  • Azure Services: Strong understanding of Azure services relevant to data engineering, such as Azure Storage, Azure SQL Database, Azure Integration Services, etc. Knowledge of how these services integrate with each other to form an end-to-end data processing pipeline.
  • Data Modelling: Experience with the Databricks data lake concept. Strong understanding of data modelling principles and techniques. Ability to design ETL processes in a cloud environment. Proficiency in building Power BI reports.
  • Programming Languages: Proficiency in programming languages commonly used in data engineering, such as Python and SQL. Knowledge of PySpark is a plus.
  • Minimum of 1 year of professional working experience.
  • Good English: writing, reading and verbal communication skills are a must.

Advantageous skills (considered a plus):

  • Logic Apps
  • Proficiency in creating reports using Power BI.
  • Familiarity with APIs.
  • Experience with machine learning (scikit-learn or TensorFlow).

The minimum offer for this role is €2,700 gross/month/full-time. We aim to pay our employees fairly; therefore, your salary may be higher, reflecting your experience and skill set.
