Skill: Data Governance
Exp: 3+ yrs
Location: NOIDA
Timings: 2:00 PM to 10:30 PM
Cab Facility provided: Yes
Administering and maintaining Informatica Data Governance platforms, both the on-premises stack (EDC, Axon, IDQ) and the cloud platform (CDGC service)
Integrating the Informatica Data Governance platform with enterprise systems such as Snowflake, AWS Athena, AWS S3, IBM Db2, Power BI, MS SQL Server and Oracle
Managing metadata and implementing business lineage
Implementing business glossary associations in Axon
Developing and maintaining data quality assets such as rules, profiles, mappings, workflows, applications and scorecards (a minimal illustration follows this list)
Collaborating with data stewards, data owners and business users to define and enforce data governance requirements
Maintaining documentation and ensuring compliance with internal data governance standards
Creating and managing Power BI reports and semantic models
Coordinating with support groups to resolve issues with a quick turnaround
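For illustration only: a minimal Python sketch of the kind of completeness rule a data quality scorecard reports, run here against an in-memory SQLite table. The table and column names (customers, email) are hypothetical; in practice such rules are built and scored inside IDQ/CDGC rather than hand-coded like this.

```python
import sqlite3

# Hypothetical staging data; in a real setup the profile runs inside IDQ/CDGC.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@example.com"), (2, None), (3, "c@example.com"), (4, None)],
)

# Completeness rule: percentage of non-null emails, the metric a scorecard tracks.
total, non_null = conn.execute(
    "SELECT COUNT(*), COUNT(email) FROM customers"
).fetchone()
score = 100.0 * non_null / total if total else 0.0
print(f"email completeness: {score:.1f}%")  # -> email completeness: 50.0%
```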
Mandatory
Bachelor’s degree in computer science or a similar field, or equivalent work experience
3+ years of experience working on a data governance platform
Understanding of Power BI reports and semantic models
Expertise in on-premises Informatica Data Governance tools: Informatica Enterprise Data Catalog (EDC), Informatica Axon and Informatica Data Quality (IDQ)
Experience with IDMC data governance modules: Data Governance and Catalog, Data Profiling, Data Quality and Metadata Command Center
Insight into data platforms such as Snowflake, AWS and Azure
Experience in writing SQL queries and Python scripts (a sketch follows the Preferable items below)
Strong learning attitude
Good written and verbal communication skills
Experience working in a team spread across multiple locations
Preferable
Knowledge of AWS services
Knowledge of Snowflake
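As referenced in the Mandatory list, a minimal sketch of the SQL-plus-Python work this role involves, using the Snowflake Python connector (snowflake.connector is the real package; the account, credentials and table name are placeholders):

```python
import snowflake.connector

# Placeholder credentials; supply real values from a secrets manager in practice.
conn = snowflake.connector.connect(
    account="my_account",    # hypothetical
    user="my_user",          # hypothetical
    password="my_password",  # hypothetical
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Simple profiling query: row count and distinct-key count for a table.
    cur.execute("SELECT COUNT(*), COUNT(DISTINCT customer_id) FROM customers")
    rows, distinct_keys = cur.fetchone()
    print(f"rows={rows}, distinct customer_id={distinct_keys}")
finally:
    conn.close()
```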
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Skill: Microsoft Fabric
Location: NOIDA
Timings: 2:00 PM to 10:30 PM
Cab Facility provided: Yes
· Responsible for understanding requirements and performing data analysis.
· Responsible for the setup of Microsoft Fabric and its components.
· Building secure, scalable solutions across the Microsoft Fabric platform.
· Creating and managing Lakehouses.
· Implementing Data Factory processes for data ingestion, scalable ETL and data integration.
· Designing, implementing and managing comprehensive warehousing solutions for analytics using Fabric.
· Creating and scheduling data pipelines using Azure Data Factory.
· Building robust data solutions using Microsoft data engineering tools such as Notebooks, Lakehouses and Spark applications (see the sketch after this list).
· Building and automating deployment pipelines using CI/CD tools to release Fabric content from lower to higher environments.
· Setting up and using Git as a repository for versioning Fabric components.
· Creating and managing Power BI reports and semantic models.
· Writing and optimizing complex SQL queries to extract and analyze data, ensuring reliable data processing and accurate reporting.
· Working closely with customers, business analysts, and technology and project teams to understand business requirements and to drive the analysis and design of quality technical solutions that align with business and technology strategies and comply with the organization's architectural standards.
· Understanding and following change management procedures to implement project deliverables.
· Coordinating with support groups to resolve issues with a quick turnaround.
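A minimal sketch of the Fabric notebook work described above: PySpark reading raw files, applying a light transform and writing a Delta table to the Lakehouse. The file path, column names and table name are hypothetical, and in a Fabric notebook the SparkSession is pre-created; the builder line is included only so the snippet is self-contained.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook `spark` already exists; built here for self-containment.
spark = SparkSession.builder.appName("lakehouse-ingest").getOrCreate()

# Hypothetical raw landing zone in the Lakehouse Files area.
raw = spark.read.option("header", True).csv("Files/landing/orders/*.csv")

# Light cleanup: typed columns and de-duplication on the business key.
cleaned = (
    raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Write as a managed Delta table in the Lakehouse (hypothetical name).
cleaned.write.format("delta").mode("overwrite").saveAsTable("orders_clean")
```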
· Mandatory
· Bachelor’s degree in computer science or a similar field, or equivalent work experience.
· 3+ years of experience working in Microsoft Fabric.
· Expertise in working with OneLake, Lakehouse, Warehouse and Notebooks.
· Strong understanding of Power BI reports and semantic models in Fabric.
· Proven record of building ETL and data solutions using Azure Data Factory.
· Strong understanding of data warehousing concepts and ETL processes.
· Hands-on experience building data warehouses in Fabric.
· Strong skills in Python and PySpark.
· Practical experience implementing Spark in Fabric, scheduling Spark jobs and writing Spark SQL queries (see the sketch after this list).
· Experience using Data Activator for effective data asset management and analytics.
· Ability to flex and adapt to different tools and technologies.
· Strong learning attitude.
· Good written and verbal communication skills.
· Demonstrated experience working in a team spread across multiple locations.
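For illustration: a short Spark SQL sketch of the query work named above, continuing from the earlier snippet (the `spark` session and the hypothetical orders_clean table are assumed from there):

```python
# Aggregate with Spark SQL over the hypothetical orders_clean Delta table.
monthly = spark.sql("""
    SELECT date_trunc('month', order_date) AS month,
           COUNT(*)                        AS orders,
           SUM(amount)                     AS revenue
    FROM   orders_clean
    GROUP  BY date_trunc('month', order_date)
    ORDER  BY month
""")
monthly.show()
```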
· Preferable
· Knowledge of AWS services
· Knowledge of Snowflake
· Knowledge of real-time analytics in Fabric
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Experience in SAP Solution Manager ChaRM configuration at an advanced level (with CSOL, Retrofit and multi-landscape scenarios) is required.
Should have hands-on experience on implementation and support projects.
Hands-on experience on S/4HANA and Fiori.
Proficient with SAP Java systems; has worked on Java add-ons such as MII and PO.
Hands-on experience on ABAP systems such as TM, GTS and EWM.
Hands-on experience on system integration and configuration.
Hands-on experience on HANA DB.
Has worked on Linux OS, preferably on Azure.
Exposure to BODS and BOIS, and replication with SLT.
Good to have exposure to SolMan functionalities such as Managed System Configuration, SolDoc and ChaRM.
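For illustration only: a minimal Python sketch of querying SAP HANA DB with SAP's hdbcli driver (a real package; host, port, credentials and the queried view are the only calls used, and the connection details are placeholders). Actual ChaRM and SolMan work happens inside Solution Manager, not through ad-hoc scripts like this.

```python
from hdbcli import dbapi  # SAP's official HANA Python driver

# Placeholder connection details; use real host/credentials in practice.
conn = dbapi.connect(
    address="hana-host.example.com",  # hypothetical
    port=30015,                       # hypothetical SQL port
    user="MONITOR_USER",              # hypothetical
    password="secret",                # hypothetical
)
try:
    cursor = conn.cursor()
    # Example health check: read the HANA version from a system view.
    cursor.execute("SELECT VERSION FROM SYS.M_DATABASE")
    print("HANA version:", cursor.fetchone()[0])
finally:
    conn.close()
```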
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Seeking a Boomi Integration Consultant to design, build and support integrations with a strong focus on EDI/X12 (including trading-partner onboarding, mapping, acknowledgements, troubleshooting and production support). The role requires SAP integration knowledge, hands-on experience with IDoc, XML and flat-file formats, and solid SQL Server skills for data queries, reconciliation and issue resolution (a brief sketch follows). The candidate should be comfortable delivering end-to-end integrations (design through deployment), implementing monitoring and error handling, and producing clear technical documentation while collaborating with business and technical stakeholders.
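A minimal sketch, under assumed schema, of the SQL Server reconciliation the role calls for: matching outbound X12 850 purchase orders against received 997 acknowledgements via pyodbc (a real driver package; the server, database and both table names are invented for illustration):

```python
import pyodbc

# Placeholder DSN-less connection; real credentials come from configuration.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-host.example.com;DATABASE=EDI;"  # hypothetical
    "UID=edi_user;PWD=secret"                    # hypothetical
)
cursor = conn.cursor()

# Hypothetical tables: outbound_850 (sent POs) and inbound_997 (acks).
# Documents sent over 4 hours ago with no 997 are the exceptions to chase.
cursor.execute("""
    SELECT o.isa_control_number, o.sent_at
    FROM   outbound_850 o
    LEFT JOIN inbound_997 a
           ON a.isa_control_number = o.isa_control_number
    WHERE  a.isa_control_number IS NULL
      AND  o.sent_at < DATEADD(hour, -4, SYSUTCDATETIME())
""")
for row in cursor.fetchall():
    print("missing 997 for ISA", row.isa_control_number, "sent", row.sent_at)
conn.close()
```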
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Looking for an experienced L2/L3 Support Engineer to manage operational support activities, handle incidents and service requests, respond to user queries, and perform environment cloning and patching. The role also requires hands-on knowledge of IT Service Management (ITSM) processes and the ability to support and implement ITSM enhancements that improve service delivery and operational efficiency.
Key Responsibilities:
Perform L2/L3 operational support activities for applications/systems.
Handle and resolve incidents, service requests, and user queries within defined SLAs.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance