1) Looking for a practitioner with 8+ years of experience in SAP QM.
2) The practitioner should have deep expertise in QM and its end-to-end integration with other modules.
3) The practitioner is expected to drive end-to-end design workshops, prepare workshop decks and demos, and be hands-on in performing configuration and execution in the system.
4) The practitioner should have experience writing functional specifications, performing functional unit testing and integration testing, and supporting user acceptance testing.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Support Jira and Confluence through our internal Jira Service Desk
• Close issues within agreed SLAs
• Implement new customisations through customer service requests:
o Jira project configurations (workflows, fields, screens, permissions, etc.)
o Confluence content management (space creation, data cleaning, user macros, etc.)
o Review and pilot new apps from the Atlassian Marketplace
• Perform general maintenance of the application and ensure best practices
• Assist in implementing application upgrades
• Provide training to users for all our tools where required
• Assist with maintenance of the AWS infrastructure for the tools
• Develop custom Groovy scripts to fulfil requirements
The ideal candidate will have:
• 4+ years’ experience administering/supporting Jira and Confluence
• Worked within a customer-facing support/engineering function
• Atlassian Certifications
• Scripting knowledge (Groovy/Python)
• Basic Understanding of relational databases (PostgreSQL/Oracle)
• Linux Knowledge (RedHat)
• Understanding of Cloud Platforms (AWS)
• Understanding of AI systems such as ChatGPT
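Closing issues within SLAs typically means tracking elapsed time against per-priority targets. The sketch below is a minimal, hedged illustration of that check; the SLA_HOURS targets, issue keys, and timestamps are hypothetical placeholders, not real Jira Service Desk configuration (in practice these would come from Jira's SLA settings and the REST API).

```python
from datetime import datetime, timedelta

# Hypothetical SLA targets per priority, in hours; real values would come
# from the Jira Service Desk SLA configuration.
SLA_HOURS = {"Highest": 4, "High": 8, "Medium": 24, "Low": 72}

def is_breached(created: datetime, resolved: datetime, priority: str) -> bool:
    """Return True if the issue took longer than its SLA target to resolve."""
    target = timedelta(hours=SLA_HOURS[priority])
    return (resolved - created) > target

# Illustrative issues only; a real check would pull these from the Jira API.
issues = [
    {"key": "SUP-101", "priority": "High",
     "created": datetime(2024, 5, 1, 9, 0), "resolved": datetime(2024, 5, 1, 15, 0)},
    {"key": "SUP-102", "priority": "Medium",
     "created": datetime(2024, 5, 1, 9, 0), "resolved": datetime(2024, 5, 3, 9, 0)},
]

breached = [i["key"] for i in issues
            if is_breached(i["created"], i["resolved"], i["priority"])]
print(breached)  # only SUP-102 exceeded its 24-hour target
```

A production version would page through issue search results from the Jira REST API rather than a hard-coded list.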
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• 6+ years of hands-on experience with UiPath RPA development and support.
• Strong understanding of UiPath Orchestrator, Studio, and Robot components.
• Experience with incident management and root cause analysis in production environments.
• UiPath Advanced Developer Certification.
• Experience with UiPath Document Understanding and UiPath Apps.
• Familiarity with ITSM tools such as ServiceNow or JIRA.
• Excellent communication and stakeholder management skills.
Preferred Qualifications
• Exposure to Blue Prism RPA platform.
• Experience with automation performance monitoring and reporting.
• Knowledge of Agile methodologies and DevOps practices.
This role will primarily support the UiPath RPA platform, with additional exposure to Blue Prism as a secondary benefit.
Primary Responsibilities
• Analyze business challenges and recommend automation solutions using advanced technologies, primarily UiPath.
• Conduct root cause analysis, provide recommendations for process improvements, and monitor results using defined metrics.
• Provide Level 2 support for UiPath RPA bots running in production environments.
• Monitor and analyze logs from UiPath Orchestrator, robots, and queues to identify and resolve incidents.
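The monitoring responsibility above boils down to scanning robot/job logs for faulted states. This is a minimal sketch under stated assumptions: the log line format here is invented for illustration; real Orchestrator logs are structured differently (and usually consumed via the Orchestrator API or NLog targets).

```python
import re
from collections import Counter

# Illustrative log lines; the "key=value" layout is an assumption, not
# UiPath Orchestrator's actual log format.
LOG_LINES = [
    "2024-06-01 02:00:01 INFO  robot=Finance-Bot job=InvoiceLoad state=Running",
    "2024-06-01 02:14:33 ERROR robot=Finance-Bot job=InvoiceLoad state=Faulted msg=SelectorNotFound",
    "2024-06-01 03:00:02 INFO  robot=HR-Bot job=PayrollSync state=Successful",
]

PATTERN = re.compile(r"robot=(?P<robot>\S+) job=(?P<job>\S+) state=(?P<state>\S+)")

def faulted_jobs(lines):
    """Return a Counter of (robot, job) pairs that reached a Faulted state."""
    faults = Counter()
    for line in lines:
        m = PATTERN.search(line)
        if m and m.group("state") == "Faulted":
            faults[(m.group("robot"), m.group("job"))] += 1
    return faults

print(faulted_jobs(LOG_LINES))
```

Counting faults per (robot, job) pair is a simple way to spot recurring failures worth a root-cause analysis rather than a one-off retry.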
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description: GCP (E2), BigQuery, Cloud Composer, GKE
Role Overview:
We are looking for a Data Engineer with hands-on experience in Google Cloud Platform (GCP) services to work on implementing and managing data ingestion using our in-house Origin Data Product (ODP) framework. The role primarily involves configuring pipelines, loading data, debugging issues, and ensuring smooth operation of the data ingestion process. The candidate should be capable of handling parameter changes, data issues, and basic fixes within the scope of ingestion jobs. Engineers who understand the framework well are encouraged to suggest improvements and may contribute to enhancements in collaboration with the core development team.
Key Responsibilities:
• Configure and execute data ingestion pipelines using our reusable GCP-based framework.
• Work with services such as GCS, BigQuery, Composer, Data Fusion, DataProc for ETL/ELT operations.
• Manage parameters, job configurations, and metadata for ingestion.
• Debug and resolve issues related to data, parameters, and job execution.
• Escalate framework-related bugs to the core development team when required.
• Monitor daily job runs, troubleshoot failures, and ensure SLAs are met.
• Collaborate with cross-functional teams for smooth delivery.
• Follow version control best practices using Git.
• Maintain deployment scripts and infrastructure configurations via Terraform.
• (Nice-to-have) Work with TWSd job scheduling/monitoring tools.
• Suggest improvements to the framework or processes based on usage experience.
• Contribute to small enhancements in the framework, where applicable.
Required Skills & Experience:
• 5+ years of experience as a Data Engineer or in a similar role.
• Strong working knowledge of Google Cloud Storage (GCS), BigQuery, Composer, Data Fusion, Dataproc.
• Proficiency in Python for scripting, debugging, and automation tasks.
• Experience with Terraform for infrastructure-as-code.
• Knowledge of Git for version control.
• Understanding of data ingestion concepts, file formats, and ETL workflows.
• Ability to debug and resolve runtime data issues independently.
• Strong problem-solving and analytical skills.
Good to Have:
• Exposure to TWSd or other enterprise job schedulers.
• Basic understanding of SQL optimization in BigQuery.
Soft Skills:
• Attention to detail and ownership mindset.
• Good communication and collaboration skills.
• Ability to work in a fast-paced environment and meet deadlines.
Proactive approach towards process improvement and innovation.
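Much of the day-to-day work described above is managing job parameters and metadata before triggering ingestion runs. Since the ODP framework is in-house, the field names below are hypothetical; the sketch only illustrates the kind of pre-flight parameter validation an ingestion engineer would perform.

```python
# Hypothetical config keys; a real ODP job config would be defined by the
# in-house framework, not by this sketch.
REQUIRED_KEYS = {"source_bucket", "target_dataset", "target_table", "file_format"}
SUPPORTED_FORMATS = {"CSV", "JSON", "AVRO", "PARQUET"}

def validate_ingestion_config(config: dict) -> list[str]:
    """Return a list of human-readable problems; an empty list means the config is usable."""
    problems = []
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        problems.append(f"missing keys: {sorted(missing)}")
    fmt = str(config.get("file_format", "")).upper()
    if fmt and fmt not in SUPPORTED_FORMATS:
        problems.append(f"unsupported file_format: {fmt}")
    if not str(config.get("source_bucket", "")).startswith("gs://"):
        problems.append("source_bucket must be a gs:// URI")
    return problems

config = {
    "source_bucket": "gs://raw-landing-zone",
    "target_dataset": "sales_raw",
    "target_table": "orders",
    "file_format": "csv",
}
print(validate_ingestion_config(config))  # an empty list: config passes all checks
```

Validating configs up front, before Composer schedules the job, turns many runtime failures into fast, actionable error messages.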
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Data Engineer – GCP Framework Implementation
Skills: Data Warehousing Concepts
We are looking for a Senior / Lead Snowflake Developer with 8+ years of experience in data engineering and cloud data platforms.
The ideal candidate will be SnowPro Core Certified, bring deep expertise in Snowflake, and play a key role in designing, leading, and delivering scalable data solutions.
Hands-on experience with DBT is a strong advantage.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Experience: 5+ years
• Expertise in writing, debugging, and analyzing Linux Shell scripts
• Good hands-on experience in SDLC, CI/CD, and Git
• Knowledge of any job scheduler (Autosys, crontab, etc.)
• Excellent communication skills are a must.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance