Senior Data Engineer – Power BI & MS Fabric

Location: Gurugram, IN

Job ID: 26388

G+D makes the lives of billions of people around the world more secure. We create trust in the digital age with integrated security technologies in three business areas: Digital Security, Financial Platforms, and Currency Technology. With our innovative SecurityTech solutions, we have been a reliable partner for our customers for over 170 years. We are an international technology group and traditional family business with over 14,000 employees in 40 countries. Creating Confidence is our path to success. Trust is the basis of our cooperation within G+D.

The whole world trusts us when it comes to digital, physical or electronic payments. We increase the security and efficiency of the cash cycle in collaboration with central banks and the entire currency industry.  As the market leader in advanced currency management, would you like to join us in shaping the future of payments?

Data Engineer – Microsoft Fabric & SAP Integration

 

As a Data Engineer in our Microsoft Fabric team, you will play a key role in designing, building, and maintaining our modern data platform. You will be responsible for integrating SAP data sources, developing high-quality data models, managing data transformations, and ensuring reliable data pipelines across our Fabric environment.
This role is ideal for someone with strong experience in data engineering, advanced SQL and Python skills, and hands-on knowledge of Microsoft Fabric, Synapse, and SAP data extraction technologies.

 

Key Responsibilities

 

Data Engineering & Architecture

  • Design, build, and optimize data pipelines using Microsoft Fabric components such as Data Factory, Lakehouse, and Warehouse.
  • Implement data ingestion processes for SAP systems using appropriate connectors and integration tools.
  • Develop scalable ETL/ELT processes with strong focus on data quality, performance, and maintainability.
  • Structure and curate data in Bronze / Silver / Gold layers following medallion architecture best practices.
  • Build and maintain semantic models within Fabric for reporting and analytics use cases.

Data Modeling & Transformation

  • Create efficient, reusable data models tailored to reporting, analytics, and ML workloads.
  • Implement business transformation logic using SQL, PySpark, Python, and Fabric notebooks.
  • Ensure proper metadata mapping, schema evolution handling, and lineage documentation.

Security, Governance & Quality

  • Apply and manage Row-Level Security (RLS) and Object-Level Security (OLS) across datasets and semantic models.
  • Contribute to data governance frameworks aligned with Fabric’s workspace, domain, and permission model.
  • Collaborate with Data Stewards, Architects, and Domain Owners to align on data standards and quality metrics.

Automation & CI/CD

  • Automate pipeline deployments and model updates using DevOps CI/CD integrations with Fabric.
  • Contribute to maintaining a managed and scalable development workflow based on Git integration.

Collaboration & Stakeholder Support

  • Work closely with business analysts, report creators, and data consumers to understand requirements and translate them into scalable data solutions.
  • Provide expertise on Fabric capabilities, data integration techniques, and data platform best practices.
  • Support troubleshooting, performance tuning, and continuous improvement of data services.
 

Required Skills & Experience

  • Proven experience as a Data Engineer, ideally in cloud-based analytics environments such as Microsoft Fabric, Azure Synapse, or Databricks.
  • Hands-on experience with SAP Data Integration (e.g., SAP BDC, SAP BW, SAP ECC, SLT, BTP, or 3rd-party extraction tools).
  • Strong proficiency in SQL, Python, and data transformation frameworks.
  • Experience building pipelines using Data Factory, Notebooks, Pipelines, and Lakehouse in Microsoft Fabric.
  • Solid understanding of medallion architecture, warehouse/lakehouse modeling, and semantic layer design.
  • Practical knowledge of RLS, OLS, workspace security, and Fabric governance.
  • Experience with Git, version control, and CI/CD pipelines (Azure DevOps preferred).
  • Strong analytical, problem-solving, and communication skills.

 

Preferred Qualifications

  • Experience with core data engineering tooling such as Data Factory (ETL), Lakehouses, and connectors.
  • Knowledge of Power BI, semantic model optimization, and DAX fundamentals.
  • Exposure to data governance frameworks, data cataloging, data products and data lifecycle management.
  • Knowledge of enterprise integration technologies involving SAP and Microsoft ecosystems.
  • Familiarity with enterprise data architecture patterns such as Data Mesh.

We are an equal opportunity employer! We promote diversity in all its forms and create an inclusive work environment, free from prejudice, discrimination and harassment, in which all employees feel a sense of belonging. We warmly welcome all applications regardless of gender, age, race or ethnic origin, social and cultural background, religion, disability and sexual orientation.

Contact: Arvina Mehta (arvina.mehta@gi-de.com)
Apply: https://career5.successfactors.eu/career?company=gieseckede&career_job_req_id=26388&career_ns=job_application

We are looking forward to receiving your application!

Giesecke & Devrient India Private Limited
Plot No. 02, EHTP, Sector - 34, Gurugram – 122001
www.gi-de.com/careers