Microsoft Fabric Core: Deep proficiency in Fabric architecture, specifically Data Factory pipelines, Dataflows Gen2, and Lakehouse/OneLake storage strategies.
Migration Expertise: Proven experience migrating legacy ETL processes (e.g., from Azure Data Factory, SSIS, or on-prem SQL) into the Microsoft Fabric ecosystem.
Scripting & Transformation: Expert-level coding in PySpark and Python for Fabric Notebooks to handle complex data transformations and enrichment.
Power BI Backend: Ability to build robust Semantic Models and Power BI Datasets directly on top of OneLake (using Direct Lake mode where applicable) for high-performance reporting.
SQL Proficiency: Advanced SQL skills for data modeling, star schema design, and querying within the SQL Endpoint of the Lakehouse.
Azure Ecosystem: Strong grasp of Azure Data Lake Storage (ADLS Gen2) and security governance (Entra ID/ACLs).
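As a hypothetical illustration of the star-schema modeling and SQL querying the role calls for (shown here with Python's built-in SQLite rather than the Lakehouse SQL Endpoint; all table and column names are invented):

```python
import sqlite3

# Build a tiny star schema in memory: one fact table plus one dimension.
# Table and column names are illustrative only, not from a real Lakehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales  VALUES (1, 1, 3, 30.0), (2, 2, 1, 25.0), (3, 1, 2, 20.0);
""")

# A typical reporting query: aggregate the fact table, sliced by a dimension
# attribute -- the shape of query a Power BI semantic model generates.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount) AS total_amount
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
""").fetchall()
print(rows)  # [('Hardware', 75.0)]
```

The same fact/dimension join pattern is what keeps Power BI queries fast over OneLake: narrow dimension tables, one wide fact table, joins only on surrogate keys.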
Key Responsibilities (The "Ask")
Architect & Migrate: Lead the backend migration of data from diverse sources into a unified Fabric Lakehouse architecture.
Pipeline Optimization: Re-engineer and optimize data pipelines to ensure reliable, highly available data ingestion and transformation.
Model for Reporting: Design purpose-driven data views and efficient Star Schemas specifically tailored to support rapid Power BI report rendering.
Cross-Functional Support: Bridge the gap between backend data engineering and frontend reporting by ensuring data quality and consistency for the BI team.
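One minimal sketch of the watermark-based incremental-load pattern that "pipeline optimization" usually implies (pure Python over in-memory data; in Fabric this logic would live in a Data Factory pipeline or PySpark notebook, and every name below is invented):

```python
from datetime import datetime

# Source rows with a last-modified timestamp. In a real pipeline these would
# come from a source system, not an in-memory list; all names are illustrative.
source_rows = [
    {"id": 1, "value": "a", "modified": datetime(2024, 1, 1)},
    {"id": 2, "value": "b", "modified": datetime(2024, 1, 5)},
    {"id": 3, "value": "c", "modified": datetime(2024, 1, 9)},
]

def incremental_load(rows, watermark):
    """Return only rows changed since the last successful load, plus the
    new watermark to persist for the next run (makes re-runs idempotent)."""
    fresh = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_watermark

# First run picks up everything after the stored watermark...
batch, wm = incremental_load(source_rows, datetime(2024, 1, 3))
print([r["id"] for r in batch])  # [2, 3]

# ...and an immediate re-run with the new watermark loads nothing twice.
rerun, _ = incremental_load(source_rows, wm)
print(rerun)  # []
```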
Salary: Rs. 10,00,000 - Rs. 25,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
ServiceNow configuration and scripting; set up Discovery, scheduled jobs, and batch programs to retrieve data from the cloud
ServiceNow development and configuration
Integration and scripting
Should be able to lead a team
• 5+ years of development, configuration, and integration experience building solutions on the ServiceNow platform.
• Good to have: knowledge of and working experience with other ServiceNow ITOM modules such as CMDB, Discovery, Service Mapping, and Event Management.
• Good knowledge of the ServiceNow CMDB and the CSDM data model.
• Understands Discovery scheduling and potential network impacts.
• Experience with net-new implementations of Discovery, Event Management, and Service Mapping; can relay best practices to clients.
• Understands CMDB relationships and hierarchy; experienced in CMDB Health dashboard configuration and in CMDB remediation, including duplicate and stale CIs.
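The "batch programs to retrieve data" item above would typically go through the ServiceNow REST Table API. A minimal request-building sketch with the Python standard library (the instance name, table, and query below are placeholders, not a real environment; verify the `sysparm_*` options against your instance):

```python
from urllib.parse import urlencode

def table_api_url(instance, table, query, fields, limit=100):
    """Build a ServiceNow Table API GET URL.

    sysparm_query/sysparm_fields/sysparm_limit are standard Table API
    query options; the instance, table, and query arguments here are
    illustrative placeholders only.
    """
    params = urlencode({
        "sysparm_query": query,
        "sysparm_fields": ",".join(fields),
        "sysparm_limit": limit,
    })
    return f"https://{instance}.service-now.com/api/now/table/{table}?{params}"

# Example: fetch operational servers from the CMDB (values are made up).
url = table_api_url("example", "cmdb_ci_server",
                    "operational_status=1", ["name", "sys_id"])
print(url)
```

A real batch job would issue the GET with authentication and page through results via `sysparm_offset`; inside the platform itself, the equivalent logic would live in a scheduled script job using server-side Glide scripting.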
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance