We found 876 jobs matching your search

Job Description

• 10+ years of overall experience in SAP integration and middleware technologies.
• 5+ years of hands-on experience with SAP CPI (Cloud Platform Integration), including iFlows, adapters, mappings, and integration patterns.
• Strong knowledge of SAP BTP, API Management, and Cloud Connector.
• Experience working on SAP S/4HANA conversion or greenfield/brownfield implementation projects.
• Expertise in integration technologies such as IDocs, BAPIs, RFCs, SOAP/REST APIs, and OData services.
• Hands-on experience with Groovy scripting and message mappings in CPI.
• Strong understanding of security concepts (OAuth, certificates, encryption, etc.).
• Excellent analytical, problem-solving, and communication skills.
• Ability to lead teams and coordinate with global stakeholders.

Preferred Qualifications
• SAP certification in CPI or Integration Suite.
• Experience with migration tools, AIF, Integration Advisor, or Event Mesh.
• Experience in Agile/DevOps-driven project delivery.

Responsibilities

  • Salary: Rs. 0.0 - Rs. 1,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004500/ECMS 563211 | SAP CPI

Job Description

Job Summary: We are seeking a skilled Data Engineer with strong experience in SQL, Python, Tableau, and ETL tools to design, build, and maintain reliable data pipelines and analytics solutions. This role focuses on ensuring data quality, enabling scalable data workflows, and supporting business intelligence and reporting needs.

Roles and Responsibilities
• Collect, clean, and validate data from multiple sources to ensure accuracy and reliability.
• Develop ETL pipelines to process data from multiple sources such as CSV files, flat files, and live databases.
• Build, maintain, and optimize SQL queries, stored procedures, and data pipelines.
• Use Python for data manipulation, automation, statistical analysis, and exploratory data analysis.
• Collaborate with cross-functional teams (data analysts, product teams, business stakeholders) to understand data requirements.
• Perform trend analysis, forecasting, and KPI reporting.
• Support data governance, documentation, and metadata management.
• Troubleshoot data issues and identify opportunities for process improvement.
• Work with cross-functional teams such as IT, engineering, operations, and finance.
• Stay up to date with emerging technologies and trends in the data analytics space, recommending innovative solutions to improve data efficiency and quality.
• Manage changes and refresh the data for all dashboards.

Responsibilities

Required Skills — What you need to bring:
• Bachelor's/Master's in Engineering, Computer Science, or equivalent experience.
• 3 to 5 years of experience in the IT industry; experience in the data space is a must.

Technical Skills (experience level: 3 to 5 years)
• SQL & Database Management: Expertise in querying, transforming, and optimizing data, with solid experience in relational databases (PostgreSQL, MySQL, Oracle) and NoSQL databases (MongoDB, Cassandra); solid experience in SQL.
• Programming: Strong Python programming skills for automation and pipeline development; Scala is good to have.
• ETL/ELT Frameworks: Building pipelines to extract, transform, and load data using leading industry ETL tools such as DataIQ, Informatica, DataStage, or Alteryx.
• Data Processing Frameworks: Solid experience processing large-scale data using Apache Spark or another data processing framework; should have worked on at least one project involving a large distributed system.
• Data Modeling: Designing efficient schemas and understanding normalization/denormalization to ensure fast data retrieval; good experience creating logical and physical data models.
• Version Control: Proficiency in Git is mandatory for managing code and collaborating on pipelines.
• Cloud Platforms: Expertise in at least one cloud platform (GCP, AWS, or Azure) and its data services; GCP experience is good to have.
• Orchestration: Automating and scheduling complex workflows using tools like Apache Airflow, Prefect, or Dagster.
• Data Warehousing: Knowledge of modern cloud-native warehouses such as Snowflake, Google BigQuery, Teradata, or Amazon Redshift.
• Real-time Processing: Knowledge of handling data streams as they arrive using Kafka, Flink, or Spark Streaming.
• Data Governance & Security: Implementing encryption and access controls, and ensuring compliance with regulations such as GDPR or HIPAA.
• AI/ML Integration: Building infrastructure and "feature pipelines" to support machine learning models.

Good-to-Have Technical Skills
• Knowledge of or experience using Agile methodology for software development.
• Knowledge of ITIL industry best practices.
• Knowledge of Google Looker Studio is a plus.
• Knowledge of any BI tool is a plus, preferably Tableau.
• Knowledge of Pulse and Tableau Prep is an added advantage.
• Knowledge of UI/UX (Figma) tools is an added advantage.
• Manufacturing domain experience is a great value add, but not mandatory.

Soft Skills
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration skills.
• Ability to work independently and as part of a global team.
• Self-motivated and able to work in a fast-paced environment.
• Detail-oriented and committed to delivering high-quality work.
• Display one-team behavior while thinking about end-to-end solutioning.
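The Data Engineer posting above asks for ETL pipelines that pull data from CSV and flat files into SQL databases, with validation along the way. A minimal sketch of that extract → transform → load shape, using only the Python standard library; the file contents, column names, and validation rule here are hypothetical illustrations, not taken from the posting:

```python
# Minimal ETL sketch: CSV in, validated rows out, loaded into SQLite.
# Column names and the type-coercion rule are illustrative assumptions.
import csv
import io
import sqlite3

RAW_CSV = """order_id,amount
1,100.50
2,not_a_number
3,75.25
"""

def extract(text: str) -> list[dict]:
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: coerce types and drop rows that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["order_id"]), float(row["amount"])))
        except ValueError:
            continue  # in a real pipeline, log or quarantine rejected rows
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write validated rows into a relational table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
    # → (2, 175.75): row 2 is rejected by validation
```

Production tools named in the posting (Airflow, Spark, Informatica) orchestrate and scale this same three-stage shape; the separation into `extract`/`transform`/`load` functions is what makes each stage independently testable.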
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Business Associate

Job Description

Associate Category Manager

Responsibilities

Well-versed in Excel analytics.
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Associate Category Manager

Job Description

• Designs, implements, and maintains front-end applications using TypeScript, Vue 3 (Composition API), Pinia, Vue Router, Vite, SCSS, Tailwind, PrimeVue, Pino logging, Axios, and GraphQL clients (Apollo/URQL/Relay), integrating Auth0 for authentication and authorization.
• Demonstrates strong engineering fundamentals across Linux, Node.js (npm/pnpm), and Docker, applying OWASP practices and SOLID/DRY/KISS/YAGNI principles with sound data structures and algorithms.
• Exhibits deep Vue expertise: reactivity system, directives, component design, props/emits, slots, and lifecycle hooks; organizes code via composition functions and type-safe patterns in TypeScript.
• Consumes GraphQL SDL and REST OpenAPI specifications, employing client generation where available; connects components to APIs with Axios/Fetch and GraphQL clients, handling auth flows, pagination, caching, and error surfaces.
• Translates Figma designs into accessible, responsive HTML5/SCSS using BEM methodology, Tailwind utility patterns, and PrimeVue or equivalent component libraries; documents components in Storybook.
• Implements secure front-end architecture with Auth0 (OAuth/OIDC), token handling, secure storage, CSP, XSS/CSRF mitigation, input validation/encoding, and safe error handling.
• Optimizes web performance via code splitting, lazy loading, tree shaking, asset and image optimization, caching strategies, and efficient rendering; monitors and improves Core Web Vitals using browser performance tooling.
• Writes maintainable, reusable, component-driven code that is secure, fast, idempotent, reliable, and resilient, with clear separation of concerns and consistent logging via Pino (browser).
• Tests thoroughly with Jest or Vitest and Vue Test Utils for unit and integration coverage; performs end-to-end testing; uses msw (mswjs) to mock backend APIs; validates APIs with Bruno or Insomnia and browser DevTools (console, network, performance).
• Troubleshoots effectively by tracing logs, inspecting errors, and isolating root causes across UI, API, and network layers; produces actionable defect reports with evidence.
• Operates locally in Docker and collaborates on CI/CD workflows; familiar with AWS deployment patterns and front-end observability (logging, metrics, tracing) for production support.
• Maintains API and component documentation, aligns with versioned contracts (GraphQL/OpenAPI), and ensures seamless integration between front-end experiences and backend data/services.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Front End Developer

Job Description

Senior Spring Boot Developer

Responsibilities

  • Salary: Rs. 0.0 - Rs. 1,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004640/563890 - Senior Spring Boot Developer - Blr, Pune - EAIS

Job Description

Experience and Skills
• 7+ years of relevant experience (mandatory).
• 3+ years of cloud experience; Azure mandatory.
• Strong understanding of RESTful API design, microservices, and containerized applications.
• Architecting and programming skills in Java/J2EE, with a good understanding of OOP design principles and Java design patterns; excellent understanding of Core Java.
• Rich experience developing cloud-based solutions on Azure, with a deep understanding of design for scalability and performance.
• Designing IoT systems, developing scalable messaging and streaming environments, and extending the solution to create real-time, IoT-data-analytics-driven applications.
• Experience developing microservices (preferably Spring Boot) with good exposure to web application frameworks.
• Familiarity with CI/CD pipelines, Git, and DevOps practices.
• Experience with DevOps concepts, tools, and the technology landscape.
• Experience with various tools of the trade, including build tools (Maven, Gradle), version control (Subversion, Git), and automation servers (Jenkins, VSTS, Bamboo).
• Unit testing with JUnit.
• Scripting languages like Python and JavaScript would be beneficial.

Education & Training
• Bachelor's degree in Computer Science/Electronics & Communication or a relevant stream.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java Application Development - Lida Lasrado

Job Description

INFYSYJP00003141 539424_DNAFS_AMEX_Technology Lead

Responsibilities

  • Salary: Rs. 10,00,000.0 - Rs. 25,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00003141 539424_DNAFS_AMEX_Technology Lead

Job Description

Location: HYDERABAD, TS; NOIDA, UP; or BANGALORE, KA
Role Description: Back-end Engineer - Java 1117 | Spring Boot | .Net | Apollo GraphQL | Express | database
Essential Skills: Back-end Engineer - Java 1117 | Spring Boot | .Net | Apollo GraphQL | Express | database
Skills: Advanced Java Concepts ~ Digital: Microservices ~ Digital: Spring Boot ~ Digital: Vue.js
Experience Required: 6-8

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Skills: Advanced Java Concepts ~ Digital: Microservices ~ Digital: Spring Boot ~ Digital: Vue.js

Job Description

INFYSYJP00003695 559917 - Senior Appian Developer - Hyd - EAIS

Responsibilities

  • Salary: Rs. 10,00,000.0 - Rs. 25,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00003695 559917 - Senior Appian Developer - Hyd - EAIS