Req ID: 10669815
Location: PUNE, MH / HYDERABAD, TS / BANGALORE, KA
Role Description: Senior Data Engineer. Strong hands-on experience with Azure Data Factory, Databricks, and Azure Cloud services. Good knowledge of Blob Storage, ADLS, Azure Logic Apps, and Key Vault. Ability to design Azure Data Factory (ADF) pipelines that automate end-to-end ETL workflows and enable high-performance data movement through seamless integration with Azure Blob Storage. Expertise in Databricks with PySpark, Spark SQL, and Delta Lake. Deep understanding of data modeling, ETL/ELT concepts, and data warehousing. Experience with Azure Data Lake Storage (ADLS). Strong knowledge of SQL and performance tuning. Experience working in Agile/Scrum environments. Ability to design solutions independently and work with minimal supervision.
Essential Skills: Azure Data Factory, Databricks, Azure Cloud services, Blob Storage, ADLS, Azure Logic Apps, Key Vault, PySpark, Spark SQL, Delta Lake, ETL/ELT, data warehousing, SQL performance tuning, Agile/Scrum.
Skills: Digital - Databricks, Azure Data Factory
Experience Required: 6-8 years
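The ADF responsibilities above center on automating end-to-end ETL workflows with incremental, high-performance data movement. As a rough sketch of the high-water-mark (watermark) pattern such pipelines typically implement — plain Python with hypothetical record fields, not the ADF or Databricks APIs:

```python
from datetime import datetime, timezone

def incremental_extract(records, watermark):
    """Return records modified after the watermark, plus the new watermark.

    Mirrors the delta-load pattern an ADF copy pipeline automates:
    only rows changed since the last successful run are moved.
    """
    fresh = [r for r in records if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Illustrative source rows (in ADF these would come from Blob Storage / ADLS).
rows = [
    {"id": 1, "modified": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "modified": datetime(2024, 1, 3, tzinfo=timezone.utc)},
    {"id": 3, "modified": datetime(2024, 1, 5, tzinfo=timezone.utc)},
]
wm = datetime(2024, 1, 2, tzinfo=timezone.utc)
fresh, wm = incremental_extract(rows, wm)  # rows 2 and 3 are newer than wm
```

In a real pipeline the watermark would be persisted (e.g. in a control table) and the filtered rows landed in ADLS or a Delta table; the example only shows the selection logic.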
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Overview:
The SAP PaPM Consultant will be responsible for designing, implementing, and optimizing SAP Profitability and Performance Management solutions to meet business objectives. This role requires a strong understanding of financial processes and the ability to translate complex business requirements into effective technical solutions.
Key Responsibilities:
• Requirements Gathering & Design: Collaborate with client counterparts to understand performance management, cost allocation, and profitability analysis requirements.
• Solution Implementation: Design, implement, and configure end-to-end SAP PaPM solutions, including the use of Modeler, Viewer, allocation logic, rules, and data flows.
• Data Integration: Work on data integration from various sources (SAP S/4HANA, SAP BW, SAP HANA views, flat files) to ensure efficient data flow and performance optimization.
• Modeling & Analysis: Develop and maintain profitability models, allocation cycles, driver-based costing, and calculation processes.
• Testing & Support: Conduct testing, validation (UAT, unit testing), and deployment of SAP PaPM solutions; provide end-user training, documentation, and post-implementation support.
• Troubleshooting: Troubleshoot issues related to data modeling, calculations, and system performance, including performance tuning for high-volume calculations.
Required Skills & Qualifications:
• Experience: Typically 5+ years of overall SAP experience, with significant hands-on experience in SAP PaPM implementations.
• Domain Expertise: Strong understanding of financial modeling, cost and revenue allocations, profitability analysis, and performance management concepts.
• Technical Proficiency:
o Proficiency in SAP HANA modeling, SQL, and data modeling concepts.
o Hands-on experience with SAP PaPM functions, processes, and modeling logic.
o Experience integrating PaPM with SAP S/4HANA and SAP BW/4HANA.
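The allocation and driver-based costing work described above reduces to one recurring step: distributing a cost pool to receivers in proportion to a driver. A toy sketch in plain Python — illustrative names and figures, not the PaPM Modeler itself:

```python
def allocate(pool_cost, drivers):
    """Distribute a cost pool to receivers proportionally to driver values,
    the basic step behind driver-based allocation cycles."""
    total = sum(drivers.values())
    if total == 0:
        raise ValueError("driver total must be non-zero")
    return {receiver: pool_cost * d / total for receiver, d in drivers.items()}

# Hypothetical example: allocate 90,000 of shared IT cost by headcount.
result = allocate(90_000, {"Sales": 30, "Ops": 50, "Finance": 10})
# Sales 30,000 / Ops 50,000 / Finance 10,000 — amounts sum back to the pool.
```

In PaPM this logic lives in an allocation function over modeled data, often cascading across multiple cycles; the sketch only shows the proportional split that each cycle applies.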
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Kindly share profiles for the below requirement on priority.
Exp: 10 years and above
Location: PAN India
Notice: Immediate
CPI resource with 10+ years of experience in SAP Cloud Platform Integration (CPI)
JD:
• Serve as the lead CPI consultant performing end-to-end CPI integration builds with SuccessFactors modules.
• Fix incidents and issues with existing CPI integrations.
• Assist with SuccessFactors OData API field mapping for integrations.
• Handle certificate updates and key rotation in SuccessFactors and CPI.
• Build and unit-test new iFlows.
• Provide roster-based support for CPI integration runs.
• Design new interfaces based on business requirements.
Skill:
• Must have experience in at least one end-to-end SuccessFactors implementation project.
• Must have prior experience integrating SuccessFactors with ECC systems, EC Payroll, ServiceNow, and third-party systems, using the Compound Employee API and OData.
• Experienced in working with standard integration flows as well as creating custom ones.
• Experienced in creating both synchronous and asynchronous interfaces.
• Experienced in working with various adapters such as SuccessFactors, SOAP, IDoc, HTTP, SFTP, RFC, JDBC, JMS, OData, and Mail.
• Good knowledge of Groovy scripts, message mappings, node functions, UDFs, and various other palette functions.
• Good knowledge of support activities such as monitoring and incident resolution.
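The message-mapping and OData field-mapping skills above boil down to reshaping one payload into another. A toy sketch in Python — in CPI this would be a graphical mapping or a Groovy script, and the field names here are hypothetical, not the actual SuccessFactors OData schema:

```python
def map_employee(sf_record):
    """Map a (hypothetical) SuccessFactors OData employee record to a flat
    target payload, the kind of transform a CPI message mapping performs."""
    return {
        "employeeId": sf_record["userId"],
        "fullName": f'{sf_record["firstName"]} {sf_record["lastName"]}',
        # Normalize the address; fall back to None when the source omits it.
        "email": sf_record.get("email", "").lower() or None,
    }

src = {"userId": "E123", "firstName": "Asha", "lastName": "Rao",
       "email": "Asha.Rao@example.com"}
target = map_employee(src)
```

A real iFlow would apply this per record to an OData query result (or a Compound Employee response) and hand the mapped payload to the receiver adapter; the sketch isolates only the field-mapping step.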
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance