Data Engineer

ROLE SUMMARY

We are looking for a Data Engineer focused on the hands-on execution and technical delivery of data solutions. Reporting to the Solution Architect, the Data Engineer will build, optimize, and maintain data pipelines and Lakehouse structures within the Microsoft Fabric ecosystem, converting architectural designs into high-performance, scalable, and efficient data delivery.


DESCRIPTION OF TASKS

  • Delta Lake implementation: Build and optimize physical storage layers (Bronze/Silver/Gold) using Delta Lake formats, ensuring proper partitioning and Z-Ordering.

  • Pipeline execution: Construct robust ETL/ELT workflows using Fabric Data Factory, Spark, and custom code to move data from diverse sources into OneLake.

  • Custom application development: Write and maintain high-performance data processing modules or custom connectors using Java and/or Python.

  • Cross-platform integration: Implement data bridges between Microsoft Fabric and external environments (e.g., AWS, GCP, or on-premises systems) using Shortcuts and APIs.

  • Code-based transformations: Develop sophisticated logic within Spark Notebooks to handle complex data cleansing and business logic application.

  • System testing: Perform rigorous unit testing, data profiling, and performance benchmarking to ensure pipeline stability and data integrity.
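The medallion flow referenced in the tasks above (Bronze/Silver/Gold) can be illustrated with a minimal, framework-free sketch; in a Fabric Lakehouse these stages would typically be Spark jobs writing Delta tables, but the layering idea is the same. All function and field names here (`bronze`, `silver`, `gold`, `order_id`, etc.) are illustrative assumptions, not part of the posting.

```python
import json
from collections import defaultdict

def bronze(raw_lines):
    """Bronze layer: land raw records as-is, parsed but unvalidated."""
    return [json.loads(line) for line in raw_lines]

def silver(bronze_rows):
    """Silver layer: cleanse and conform (drop malformed rows, normalise types)."""
    out = []
    for r in bronze_rows:
        if "order_id" in r and r.get("amount") is not None:
            out.append({
                "order_id": str(r["order_id"]),
                "country": str(r.get("country", "unknown")).upper(),
                "amount": float(r["amount"]),
            })
    return out

def gold(silver_rows):
    """Gold layer: business-level aggregate (revenue per country)."""
    totals = defaultdict(float)
    for r in silver_rows:
        totals[r["country"]] += r["amount"]
    return dict(totals)
```

Each stage only reads the output of the previous one, which keeps the layers independently testable and replayable.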




KNOWLEDGE AND SKILLS

  • Core Engineering Languages: Expert proficiency in SQL and Python (PySpark); strong working knowledge of Java for backend data processing or SDK integrations.

  • Generic Data Engineering: Solid understanding of Software Engineering principles applied to data (DRY code, modularity, error handling, and logging).

  • Data Structures: Proficiency in working with structured, semi-structured (JSON/XML), and unstructured data formats.

  • DevOps & CI/CD: Practical experience with Git, branching strategies, and automating deployments through Azure DevOps or GitHub Actions.

  • Distributed Computing: Strong grasp of Apache Spark internals, memory management, and execution plans.

  • Languages: Good knowledge of written and spoken English (the working language). Knowledge of French is an asset.


SPECIFIC EXPERTISE

  • Fabric Delta Lake Management: Expertise in optimizing and managing transaction logs within the Fabric Lakehouse.

  • Multi-stack Integration: Proven ability to connect Fabric with non-Microsoft tools, utilizing generic JDBC/ODBC drivers or REST APIs.

  • Performance Tuning: Deep experience in identifying bottlenecks in data flows and optimizing code for low-latency processing.

  • Data Design Patterns: Practical experience implementing standard patterns like SCD (Slowly Changing Dimensions), Change Data Capture ingestion and idempotency in a Delta Lake environment.
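The Slowly Changing Dimension (Type 2) and idempotency patterns named above can be sketched without any Delta Lake dependency; in production this logic would normally be a Delta `MERGE` in the Fabric Lakehouse, but the row-versioning idea is the same. The function and field names (`apply_scd2`, `valid_from`, etc.) are illustrative assumptions only.

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass(frozen=True)
class DimRow:
    key: str                  # business key
    attrs: tuple              # tracked attributes
    valid_from: str           # ISO load date
    valid_to: Optional[str]   # None marks the current version
    is_current: bool

def apply_scd2(existing: list, incoming: dict, load_date: str) -> list:
    """SCD Type 2: expire the current row when attributes change and
    insert a new current version; unchanged rows pass through untouched,
    so re-running the same load is a no-op (idempotent)."""
    out, seen = [], set()
    for row in existing:
        if row.is_current and row.key in incoming:
            seen.add(row.key)
            new_attrs = incoming[row.key]
            if new_attrs != row.attrs:
                # close the old version and append the new current one
                out.append(replace(row, valid_to=load_date, is_current=False))
                out.append(DimRow(row.key, new_attrs, load_date, None, True))
            else:
                out.append(row)
        else:
            out.append(row)
    # brand-new business keys become first-version current rows
    for key, attrs in incoming.items():
        if key not in seen and all(r.key != key for r in existing):
            out.append(DimRow(key, attrs, load_date, None, True))
    return out
```

Because unchanged input produces identical output, the same batch can safely be replayed after a pipeline failure, which is the idempotency property the bullet refers to.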


CERTIFICATIONS

  • Microsoft Focus: DP-700 (Microsoft Certified: Fabric Data Engineer Associate) – required.

  • Generic/Cloud Focus: DP-203 (Azure Data Engineer) or equivalent certifications from Databricks (Data Engineer Associate) or AWS – a plus.


Information

Data Engineer

23/04/2026

Brussels

Follow eis