
DHL

Data Engineer (IT)

2-6 Years
  • Posted 16 hours ago

Job Description

Welcome to DHL eCommerce!
We are excited to announce an exceptional opportunity within our organization for individuals seeking to join the dynamic world of DHL eCommerce. As part of our continuous growth and commitment to excellence, we are looking for talented professionals to contribute to our expanding global network.

At DHL eCommerce we provide a variety of international and domestic standard parcel delivery services in more than 35 countries around the globe. In selected markets we also offer fulfillment services. In line with the Group strategy, our aim is to be a leading provider of e-commerce-related logistics, designing solutions across the entire DHL Group service portfolio for selected customers. As an integral part of the DHL Group, we leverage our extensive global network and cutting-edge technology to offer end-to-end e-commerce logistics solutions to our customers.

Joining our team means becoming part of a diverse and inclusive workplace culture that fosters collaboration, creativity, and personal growth. We value the unique perspectives and skills that each individual brings, and we believe that together we can achieve great things. As a global leader in logistics, DHL eCommerce offers unparalleled opportunities for career development and advancement, allowing you to unleash your full potential.

Key Responsibilities

Build the Azure Data Platform from Scratch
  • Design and build our enterprise Azure data architecture, including Data Lake, Fabric Workspaces, Lakehouse, and Warehouse components.
  • Develop end-to-end data ingestion and transformation pipelines using Azure Data Factory and Fabric Data Factory.
  • Establish foundational components such as:
    • Data Lake zones (raw, curated, analytics)
    • Data governance and catalog (Purview / Fabric)
    • Security model and access controls
    • Naming conventions and workspace organization
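As a rough illustration of the foundational conventions listed above, zone names and lake paths can be codified in a small helper so every pipeline builds paths the same way. This is a minimal sketch; the zone names follow the raw/curated/analytics split mentioned here, but the path layout, source, and dataset names are hypothetical, not an actual DHL standard.

```python
from dataclasses import dataclass

# Zones from the posting's raw/curated/analytics split.
ZONES = ("raw", "curated", "analytics")

@dataclass(frozen=True)
class LakePath:
    zone: str
    source: str   # hypothetical: source system name
    dataset: str  # hypothetical: dataset name

    def __post_init__(self):
        if self.zone not in ZONES:
            raise ValueError(f"unknown zone: {self.zone}")

    def path(self) -> str:
        # Relative path within the lake; in ADLS Gen2 this would sit
        # under an abfss://<container>@<account>.dfs.core.windows.net/ root.
        return f"{self.zone}/{self.source}/{self.dataset}/"

print(LakePath("raw", "parcel_events", "scans").path())  # raw/parcel_events/scans/
```

Centralizing conventions like this makes naming-standard violations a runtime error rather than a code-review argument.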
Data Engineering & Pipeline Development
  • Build reliable and reusable ETL/ELT pipelines to support operational, analytical, and reporting workloads.
  • Implement Lakehouse patterns in Microsoft Fabric, enabling high-performance Direct Lake datasets for Power BI.
  • Create and maintain scalable batch and streaming data ingestion using services such as Event Hub, Synapse, and Functions.
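The ETL/ELT work described above boils down to steps like the one sketched below: take raw event lines, drop malformed or incomplete records, and emit a curated shape. This is a deliberately minimal, dependency-free sketch; the field names (`parcel_id`, `status`, `scanned_at`) are hypothetical, and in Data Factory or Fabric this logic would live in a pipeline activity or notebook rather than a standalone function.

```python
import json
from datetime import datetime, timezone

def transform(raw_lines):
    """Parse raw JSON event lines into curated records, skipping bad rows."""
    curated = []
    for line in raw_lines:
        try:
            event = json.loads(line)
        except json.JSONDecodeError:
            continue  # in practice, route to a quarantine/dead-letter zone
        if not event.get("parcel_id"):
            continue  # required key missing: drop the row
        curated.append({
            "parcel_id": event["parcel_id"],
            "status": event.get("status", "UNKNOWN").upper(),
            "scanned_at": event.get("scanned_at"),
            "ingested_at": datetime.now(timezone.utc).isoformat(),
        })
    return curated

rows = transform(['{"parcel_id": "P1", "status": "delivered"}', 'not json', '{}'])
```

The same filter-and-reshape pattern scales from this toy batch to a Spark or Dataflows Gen2 job.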
Data Modeling & Analytics Enablement
  • Develop high-quality dimensional and semantic models used as the single source of truth across the organization.
  • Enable BI and product teams by delivering clean, high-performance datasets for dashboards, operational reports, and machine learning initiatives.
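The dimensional modeling mentioned above can be sketched in a few lines: split repeated attributes into a dimension table with surrogate keys, and have the fact table reference those keys. Column and table names here are illustrative only; in practice this would be a star schema materialized in a Fabric Warehouse or Lakehouse.

```python
def build_star(events):
    """Derive a status dimension (with surrogate keys) and a fact table."""
    dim_status, fact_scans = {}, []
    for e in events:
        # Assign a surrogate key the first time a status value is seen.
        key = dim_status.setdefault(e["status"], len(dim_status) + 1)
        fact_scans.append({
            "parcel_id": e["parcel_id"],
            "status_key": key,          # foreign key into dim_status
            "scanned_at": e["scanned_at"],
        })
    dim_rows = [{"status_key": k, "status": s} for s, k in dim_status.items()]
    return dim_rows, fact_scans
```

Because the fact table stores only integer keys, the dimension can be conformed and reused across models, which is what makes it a workable single source of truth.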
Architecture & Standards
  • Define and document data engineering best practices, coding standards, and architecture guidelines.
  • Lead performance tuning and cost optimization across Azure workloads.
  • Support implementation of CI/CD pipelines using Azure DevOps or GitHub Actions.
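For the CI/CD bullet above, a GitHub Actions setup for a data-pipeline repository might start from something like the fragment below. This is an illustrative sketch only; job names, Python version, and steps are assumptions, and deployment stages (e.g. publishing Fabric or Data Factory artifacts) would follow the test job.

```yaml
# Illustrative CI workflow: run the test suite on every push and pull request.
name: ci
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest
```

An equivalent Azure DevOps pipeline would express the same steps in an azure-pipelines.yml file.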
Collaboration & Stakeholder Partnership
  • Work closely with the Operations, Product, and Finance teams to understand data needs and design scalable solutions.
  • Support ML/AI experimentation with structured, high-quality datasets.
  • Partner with system owners and other enterprise platforms to integrate and unify data.

Technical Skills & Qualifications
  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or related field.
  • 2-6+ years of data engineering experience.
  • Azure Data Engineer or Fabric Analytics Engineer certifications are preferred.
  • Proven experience building or significantly contributing to Azure data architecture (Data Factory, ADLS, Synapse, Databricks/Spark, Azure SQL).
  • Hands-on experience with Microsoft Fabric, including:
    • Lakehouse & Warehouse
    • Direct Lake for Power BI
    • Dataflows Gen2
    • Fabric Data Factory
    • Workspace governance and security
  • Strong SQL expertise and proficiency in Python for data transformations.
  • Understanding of data modeling (star schema, dimensional modeling) and ELT/ETL patterns.
  • Experience with Git-based development and CI/CD.



About Company

DHL

DHL is a German logistics company providing courier, package delivery, and express mail services; it is a division of Deutsche Post. The group delivers over 1.6 billion parcels per year.

Job ID: 143420005
