We found 943 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

As a Custom Software Engineer, a typical day involves designing, building, and configuring applications tailored to fulfill specific business process and application requirements. This role requires a thoughtful approach to understanding organizational needs and translating them into effective software solutions. The workday often includes collaborating with various stakeholders to ensure that the applications developed align with operational goals and support seamless business workflows. Continuous refinement and adaptation of software components to meet evolving demands are integral parts of the day.

  • Salary : Rs. 32,00,000 - Rs. 35,00,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP Vendor Invoice Management

Job Description

Skills: Digital: Salesforce Service Cloud, Salesforce Commerce Cloud, PL/SQL. Experience Required: 8-10 years. Request Id: 71286-1.
Role Description / Key Responsibilities:
Solution Design & Delivery
- Lead design and development for SFCC (B2C Commerce) and Service Cloud features, integrations, and customizations.
- Build performant LWC and Aura components; develop reusable Apex classes, triggers, and batch/queueable jobs.
- Translate business requirements into scalable technical solutions following Salesforce best practices.
Platform Engineering & Integrations
- Implement and optimize REST/SOAP integrations, middleware patterns (e.g. MuleSoft, Azure/AWS services), and data sync between Commerce Cloud and core Salesforce.
- Define data models, sharing rules, and security controls; ensure platform governance and compliance.
Release & DevOps
- Own the branching strategy, metadata package promotion, and automated deployments with Copado, Git, and CI/CD pipelines.
- Establish test automation strategies (unit tests, static code analysis, code coverage of 85%).
Quality, Performance & Support
- Conduct code reviews, enforce coding standards, and remediate technical debt.
- Optimize performance (SOQL, caching, limits, page load) and troubleshoot production issues with root-cause analysis.
Stakeholder Leadership
- Partner with product owners, architects, and cross-functional teams; provide effort estimates, risks, and mitigation plans.
- Mentor junior developers and ensure documentation (design specs, deployment runbooks, support guides).
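The REST/SOAP integration work in this role typically leans on a retry-with-exponential-backoff pattern for transient failures. Below is a minimal, self-contained sketch of that pattern in Python (the role itself targets Apex/MuleSoft, so the language choice and the function names `backoff_delays`/`call_with_retry` are illustrative assumptions; `fetch` and `sleep` are injectable so the sketch needs no real network):

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def backoff_delays(base: float = 0.5, factor: float = 2.0, retries: int = 4) -> list:
    """Exponential backoff schedule: base, base*factor, base*factor^2, ..."""
    return [base * factor ** i for i in range(retries)]

def call_with_retry(fetch: Callable[[], T], retries: int = 4, base: float = 0.5,
                    sleep: Callable[[float], None] = time.sleep) -> T:
    """Call `fetch` (e.g. one REST request), retrying on failure with
    exponential backoff. `sleep` is injectable for testing."""
    delays = backoff_delays(base=base, retries=retries)
    last_exc = None
    for attempt, delay in enumerate(delays):
        try:
            return fetch()
        except Exception as exc:  # in practice, catch only transient errors
            last_exc = exc
            if attempt < retries - 1:
                sleep(delay)
    raise last_exc
```

In a real Commerce Cloud / Core Salesforce sync, `fetch` would wrap the HTTP call and only transient status codes (e.g. 429, 503) would trigger a retry.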

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Digital : Salesforce Service Cloud~ Salesforce Commerce Cloud~PL/SQL

Job Description

Java Spring Boot Developer

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Java Springboot Developer

Job Description

Digital: Python. Role Description (looking for strong candidates in Python). Experience with Python, SQL, Postgres, Grafana, ELK, Docker, Jenkins.
- Experience and exposure to good programming practices, including coding and testing standards.
- Passion and experience in proactively investigating, evaluating, and implementing new technical solutions, with continuous improvement.
- Good development culture and familiarity with industry-wide best practices.
- Production mindset with a keen focus on reliability and quality.
- Passionate about being part of a distributed, self-sufficient feature team with regular deliverables.
- Proactive learner who owns skills in Scrum, data, and automation.
- Strong technical ability to monitor, investigate, analyze, and fix production issues.
- Ability to ideate and collaborate through inner and open sourcing.
- Ability to interact with client managers, developers, testers, and cross-functional teams such as architects.
- Experience working in an Agile team and exposure to SAFe development methodologies.
- Good experience in design and development, including object-oriented programming in Python, cloud-native application development, APIs, and microservices.
- Good experience with relational databases like PostgreSQL and the ability to build robust SQL queries.
- Knowledge of Grafana for data visualization and the ability to build dashboards from various data sources.
- Experience with technologies like Elasticsearch and Fluentd.
- Experience hosting applications using containerization (Docker, Kubernetes).
- Good understanding of CI/CD and DevOps; proficient with tools like Git, Jenkins, and Sonar.
- Good system skills with Linux and bash scripting.
- Understanding of the cloud and cloud services.
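The "robust SQL queries" requirement usually comes down to parameterized queries rather than string formatting. A minimal sketch follows, using the standard library's sqlite3 as a self-contained stand-in for PostgreSQL (psycopg2 uses `%s` placeholders instead of `?`, but the pattern is the same); the `logs` table and its data are invented for illustration:

```python
import sqlite3

def top_error_hosts(conn: sqlite3.Connection, level: str, limit: int = 3):
    """Aggregate log rows per host for a given level, using a
    parameterized query (never string interpolation) for robustness."""
    cur = conn.execute(
        """
        SELECT host, COUNT(*) AS n
        FROM logs
        WHERE level = ?
        GROUP BY host
        ORDER BY n DESC, host
        LIMIT ?
        """,
        (level, limit),
    )
    return cur.fetchall()

# Invented sample data so the sketch is runnable end to end.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (host TEXT, level TEXT)")
conn.executemany("INSERT INTO logs VALUES (?, ?)", [
    ("web1", "ERROR"), ("web1", "ERROR"), ("web2", "ERROR"),
    ("web1", "INFO"), ("web2", "INFO"),
])
```

This kind of per-host aggregate is also exactly the shape of query a Grafana panel would run against a Postgres data source.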

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Digital : Python

Job Description

Power Automate Desktop + UiPath

  • Salary : Rs. 0 - Rs. 10,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : INFYSYJP00005182/Power Automate Desktop + UiPath

Job Description

Job Summary: We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. You will play a crucial role in our data ecosystem by working with cloud technologies to enable data accessibility, quality, and insights across the organization. This role requires expertise in Azure Databricks, Azure Data Factory, Snowflake, PySpark, SQL, any cloud (preferably Azure), and data modelling.
Requirements (Experience Level: 3 to 5 years):
- Bachelor's in Computer Science, Data Engineering, or a related field.
- Proficiency in Azure Databricks for data processing and pipeline orchestration.
- Strong SQL skills and understanding of data modeling principles.
- Ability to troubleshoot and optimize data workflows.
Key Responsibilities:
- Data Pipeline Development: Design, build, and optimize data pipelines to ingest, transform, and load data from multiple sources using Azure Databricks (good to have: Snowflake and DBT).
- Data Architecture: Develop and manage data models within Snowflake, ensuring efficient data organization and accessibility.
- Data Transformation: Implement transformations in DBT, standardizing data for analysis and reporting.
- Performance Optimization: Monitor and optimize pipeline performance, troubleshooting and resolving issues as needed.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven projects and provide access to reliable, well-structured data.
Qualifications:
- Relevant experience in MS Azure, Snowflake, DBT, and Big Data Hadoop ecosystem components.
- Understanding of Hadoop architecture and the underlying framework, including storage management.
- Strong understanding and implementation experience in Hadoop, Spark, and Hive/Databricks.
- Expertise in implementing data lake solutions using Scala as well as Python.
- Expertise with orchestration tools like Azure Data Factory.
- Strong SQL and programming skills.
- Experience with Databricks is desirable.
- Understanding / implementation experience with CI/CD tools such as Jenkins, Azure DevOps, and GitHub is desirable.
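The ingest-transform-load flow this role describes can be sketched in plain Python. In practice the work would run on Azure Databricks with PySpark; this stdlib-only stand-in, with an invented `transform` step and sample CSV, just shows the shape of one pipeline stage:

```python
import csv
import io

def transform(row: dict) -> dict:
    """One transform step: standardize types and casing for a raw row.
    The field names here are invented sample data, not a real schema."""
    return {
        "customer_id": int(row["customer_id"]),
        "country": row["country"].strip().upper(),
        "amount": round(float(row["amount"]), 2),
    }

def run_pipeline(raw_csv: str) -> list:
    """Ingest CSV text, transform each row, return load-ready records.
    In PySpark the same shape becomes a read, then per-column casts."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [transform(r) for r in reader]

raw = "customer_id,country,amount\n1, in ,10.50\n2,US,3.2\n"
records = run_pipeline(raw)
```

The load step (writing `records` to Snowflake or a data lake) is omitted; the point is the explicit, testable transform between ingest and load.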

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Data Engineer - Abhinav Jain

Job Description

1. BU: SAP
2. Skill Name: SAP CO – Product Costing
3. RR ID: 66657931
4. Exp: 5+
5. Level: SA/M
6. Location Preference: Pan India
7. Notice Period: Immediate Joiner
8. Budget:
9. No of Demands: 1

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP Product Costing

Job Description

Roles and Responsibilities: Operating as part of a small Agile/Scrum feature team of about 5-7 developers, testers, and other specialists. The Automation Tester will have a solid reporting line to Design Engineering Testing plus the Test Lead/Test Manager, a dotted reporting line to the Product/Project Manager, and will work closely with other developers in the team working on other parts of the system.
- Plan and undertake end-to-end test activities such as test script development, test execution, and defect management.
- Work collaboratively with the Test Manager, Product Managers, Business Analysts, and Developers to assess test scenarios and provide test requirements based on a risk-based testing methodology.
- Design and develop the test strategy, test plan, test reporting, test conditions, test scripts, test data, and expected results for test streams, including Product Test, Integration Test, Technical Test, and Deployment Test.
- Design and execute detailed functional test cases for complex requirements.
- Identify and implement process improvements that raise the level of delivery while finding optimization opportunities.
- Communicate effectively with management, peers, and subordinates, both verbally and in writing.
- Proactive and organized; capable of working in an ambiguous/uncertain environment, understanding risks and flagging them on time.
- Ability to mitigate risks and ensure quality deliverables adhering to timelines.
- Test delivery assurance complying with Shell standard tools and framework.
Mandatory skills: Python. 7+ years in QA, 4 in a Selenium BDD framework. QA automation of web applications, web services, and REST APIs. SDLC and manual testing. Design and execution of test cases for manual and automated testing. Automation and quality engineering tools/methods: Selenium and Python. BDD with SpecFlow / Python Behave.
Optional skills: Source control with Git/TFS. SQL. Salesforce.
Degree in computer science or equivalent.
Portfolio: Shell IDT / Downstream & Renewables IDT / Trading & Supply / TS - Shell Energy / TS - SE-Am. Project: STSE_NA_SD1900069_Digital_Enhancements_PDT.
Description: Implement digital Artificial Intelligence technology to identify deal-entry anomalies that indicate potential data errors, which can cause extensive downstream work and potential reputational damage.
Q1 2020 Deliverables:
- Nucleus: production environment in place; the 5 use cases from the POC deployed into production.
- Endur POC: 5 use cases identified and delivered in the Dev environment.
- Resources in position to support the AI solution and able to deliver additional use cases for t
Business case background: The topic of Deal Entry and Data Discrepancies (Defects) has been gaining momentum over the past year. The impact of these defects can be seen in various aspects of the business, including cost and cash flows. Specifically:
- They hinder the business's ability to collect cash from invoices due.
- They create a business cost to correct and resolve these data discrepancies.
- They create potentially unwanted deal exposure under certain conditions.
According to the "Global Ov
Business Application: SEAm Hedge Simulator, SEAm COE Dashboard, SEAm Collateral Management Analytics, SEAm Symphony, Teles. Project Type: IT Enhancement. Project Class:
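The BDD testing this role asks for (Behave / SpecFlow style) follows a Given/When/Then flow. Behave's step decorators and Selenium's WebDriver need external packages, so the self-contained sketch below mimics that flow against a fake page object instead of a real browser; every name here (`FakeLoginPage`, the credentials, the step functions) is invented for illustration:

```python
class FakeLoginPage:
    """Stands in for a Selenium page object in this sketch."""
    VALID = {"alice": "s3cret"}

    def __init__(self):
        self.message = ""

    def login(self, user: str, password: str) -> None:
        ok = self.VALID.get(user) == password
        self.message = "Welcome" if ok else "Invalid credentials"

# Step functions in the Given/When/Then shape Behave decorators produce;
# `ctx` plays the role of Behave's shared context object.
def given_a_login_page() -> dict:
    return {"page": FakeLoginPage()}

def when_user_logs_in(ctx: dict, user: str, password: str) -> None:
    ctx["page"].login(user, password)

def then_message_is(ctx: dict, expected: str) -> None:
    assert ctx["page"].message == expected, ctx["page"].message

# Scenario: valid credentials show a welcome message
ctx = given_a_login_page()
when_user_logs_in(ctx, "alice", "s3cret")
then_message_is(ctx, "Welcome")
```

With real Behave, the same three steps would carry `@given`/`@when`/`@then` decorators and the page object would drive a WebDriver instance.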

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Software Configuration Management

Job Description

INFYSYJP00005215 564362_SAP Upgrade - ABAP Lead_India

  • Salary : Rs. 10,00,000 - Rs. 25,00,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : INFYSYJP00005215 564362_SAP Upgrade - ABAP Lead_India