We found 870 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Specialist Software Engineer - BI Reporting Analyst Developer: SQL, PL/SQL, Tableau, Power BI - (260002XI)

Missions
  • Minimum 2 years of experience analyzing business requirements and clearly understanding the requested data and users' reporting needs, preferably in the banking domain
  • Minimum 2 years of experience analyzing, understanding, and creating data models and functional specifications: data granularity, mapping rules, KPIs/attributes, classification/flag calculation algorithms
  • Experience developing reports/dashboards in Tableau and the ability to identify performant solutions for Tableau reports over very high data volumes
  • Minimum 2 years of practical experience with Oracle Database SQL and PL/SQL: triggers, procedures, packages, functions, SQL queries, and tuning (needed for identifying data in the DWH and for creating and testing views and data sources)

Profile
  • Experience with big data technologies (Hadoop, Iceberg, NoSQL databases) and MS Power BI is considered a plus
  • Ability to write functional specifications for implemented reports
  • Specific DWH technical knowledge (table types, technical key types, types of history)
  • Ability to analyze and understand large volumes of data (millions/tens of millions of rows) and the correlations within that data
  • Experience working with DWH relational/dimensional data models
  • Minimum 2 years of experience working with large data volumes (millions/tens of millions of rows)
  • Ability to develop and test production-ready code without prior supervision
  • Proactive attitude, good communication skills, and easy interaction with business users
  • Ability to support users of the reports developed in Tableau
  • Ability to assimilate information very quickly
  • Analytical skills and attention to detail
  • Proficiency in English
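The DWH view work described here typically amounts to defining tested, aggregated views over large fact tables. A minimal illustrative sketch, using Python's built-in sqlite3 in place of Oracle; the table, column, and KPI names are hypothetical:

```python
import sqlite3

# Hypothetical mini fact table standing in for a banking DWH transactions table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE fact_txn (account_id TEXT, txn_date TEXT, amount REAL);
    INSERT INTO fact_txn VALUES
        ('A1', '2024-01-10', 100.0),
        ('A1', '2024-01-20', -40.0),
        ('A2', '2024-01-15', 250.0);
    -- A view exposing simple KPIs (net amount and transaction count per
    -- account), analogous to the views a reporting developer would create
    -- and test as Tableau data sources.
    CREATE VIEW v_account_kpi AS
        SELECT account_id,
               SUM(amount) AS net_amount,
               COUNT(*)    AS txn_count
        FROM fact_txn
        GROUP BY account_id;
""")

rows = conn.execute(
    "SELECT account_id, net_amount, txn_count FROM v_account_kpi ORDER BY account_id"
).fetchall()
print(rows)  # [('A1', 60.0, 2), ('A2', 250.0, 1)]
```

At real DWH volumes (tens of millions of rows), the tuning work mentioned above would revolve around indexing and execution plans rather than the view definition itself.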

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :Specialist Software Engineer - BI Reporting Analyst

Job Description

  • Proficiency in React and the component lifecycle
  • Proven mobile cross-platform development experience (iOS/Android) with React Native and the Expo framework
  • Knowledge of React hooks (useState, useEffect, useMemo, useCallback, etc.)
  • Strong command of JavaScript (ES6+) and knowledge of TypeScript
  • Mobile navigation management with Expo Router
  • Integration and consumption of RESTful APIs (Zod/Axios)
  • Experience with native development tools (Xcode, Android Studio) and bridging if necessary
  • Application state management
  • Performance optimization and memory management in mobile apps
  • Testing and code quality (unit tests, Jest, Testing Library)
  • Debugging and problem-solving (profiling, logging, crash reporting)
  • Awareness and application of mobile security and data-handling best practices

  • Salary : Rs. 0.0 - Rs. 1,00,000.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :INFYSYJP00003985/562065 -React Native Developer-Pune- DX

Job Description

About the Role

As a Data Scientist with 5+ years of experience, you will design, develop and deploy end-to-end AI/ML solutions, from classical ML models to LLM-powered agentic systems. You will work across the full ML lifecycle: data exploration, feature engineering, model training, evaluation, fine-tuning and production deployment. You will collaborate closely with engineering, product and domain teams to translate business problems into scalable, reliable AI solutions.

Key Responsibilities
  • Design and develop machine learning models (classification, regression, clustering, recommendation) using Python and industry-standard ML libraries (scikit-learn, XGBoost, LightGBM, TensorFlow/PyTorch).
  • Build and optimize Retrieval-Augmented Generation (RAG) and agentic RAG pipelines for knowledge-intensive applications using vector databases (FAISS, Azure AI Search, ChromaDB) and embedding models.
  • Architect and implement multi-agent orchestration systems using frameworks such as LangChain, LangGraph, Semantic Kernel, AutoGen or CrewAI.
  • Design and integrate Model Context Protocol (MCP) based tool-use patterns to enable LLM agents to interact with external APIs, databases and enterprise systems.
  • Fine-tune foundation models (LLMs, SLMs) using techniques such as LoRA, QLoRA, PEFT and RLHF for domain-specific tasks.
  • Perform prompt engineering, chain-of-thought reasoning and evaluation of LLM outputs for accuracy, safety and reliability.
  • Conduct exploratory data analysis (EDA), feature engineering and data pipeline development to support model training and inference.
  • Deploy and serve models using Azure ML, MLflow, FastAPI or similar frameworks in containerized (Docker/Kubernetes) production environments.
  • Monitor model performance in production, implement drift detection and establish retraining strategies.
  • Collaborate with cross-functional teams to define problem statements and success metrics and deliver AI solutions aligned with business objectives.

Required Skills
  • 5+ years of hands-on experience in data science, machine learning or applied AI roles.
  • Strong proficiency in Python and ML/DL libraries (scikit-learn, pandas, NumPy, TensorFlow, PyTorch, Hugging Face Transformers).
  • Solid understanding of classical ML algorithms (linear/logistic regression, decision trees, ensemble methods, SVMs, clustering, dimensionality reduction).
  • Proven experience building RAG pipelines and working with vector stores, embeddings and retrieval strategies.
  • Hands-on experience with agentic AI patterns, multi-agent orchestration and tool-use frameworks (LangChain, LangGraph, Semantic Kernel, AutoGen, CrewAI).
  • Familiarity with Model Context Protocol (MCP) and its application in connecting LLM agents to external tools and data sources.
  • Experience fine-tuning large language models using LoRA, QLoRA, PEFT or similar parameter-efficient methods.
  • Working knowledge of Azure AI/ML services (Azure OpenAI, Azure ML, Cognitive Services, Azure AI Search).
  • Experience with experiment tracking and model registry tools (MLflow, Weights & Biases).
  • Strong analytical and problem-solving skills with the ability to communicate technical findings to non-technical stakeholders.

Nice to Have
  • Experience with MLOps/AIOps practices: CI/CD for ML, automated training pipelines, model versioning, A/B testing, model monitoring and AIOps-driven incident detection.
  • Hands-on experience deploying models at scale using Docker, Kubernetes (AKS) and serverless inference endpoints.
  • Exposure to graph neural networks, time-series forecasting or computer vision.
  • Experience with Spark/PySpark or Databricks for large-scale data processing.
  • Domain experience in automotive, manufacturing or engineering systems.
  • Familiarity with responsible AI practices, bias detection and model explainability (SHAP, LIME).

Tooling & Engineering Expectations
  • Use Git for version control with standard branching and pull-request workflows.
  • Maintain reproducible experiments using MLflow, DVC or equivalent experiment tracking tools.
  • Participate in code reviews and follow agreed coding standards and documentation practices.
  • Work within existing CI/CD pipelines (e.g., Azure DevOps/GitHub Actions) for model training, testing and deployment automation.
  • Document model architectures, training procedures, evaluation metrics and deployment configurations.
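The retrieval step of the RAG pipelines described above boils down to embedding a query and ranking stored chunks by vector similarity. A minimal sketch with toy hand-made vectors standing in for a real embedding model and vector database; the chunk texts and vectors are illustrative:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "vector store": document chunks with pre-computed embeddings.
# In a real pipeline these would come from an embedding model and live
# in FAISS, Azure AI Search, or ChromaDB.
store = [
    ("Invoices are matched 2-way or 3-way.", [0.9, 0.1, 0.0]),
    ("Tableau dashboards read from the DWH.", [0.1, 0.8, 0.2]),
    ("Cluster policies limit node types.",    [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=1):
    # Rank chunks by similarity to the query vector; return the top-k texts,
    # which would then be passed to the LLM as context.
    ranked = sorted(store, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve([0.85, 0.15, 0.05]))  # best match is the invoice chunk
```

An agentic RAG system wraps this same lookup as a tool the agent can decide to call, rather than retrieving unconditionally on every turn.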

  • Salary : Rs. 14,00,000.0 - Rs. 15,00,000.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :Data Scientist ( Mercedes )

Job Description

  • Hands-on experience with SAP FICO
  • Good understanding of finance-related business processes and period-end closing activities
  • Experience with sub-modules such as AP, AR, GL, Asset Accounting, Tax and Bank Accounting
  • Technical aspects: IDocs, user exits, BAdIs, workflow, integration with SD and MM
  • Basic knowledge of Controlling: COPA, PCA, CCA
  • Good communication and interpersonal skills
  • Analyze, design, configure, test and implement SAP solutions to meet business requirements

  • Salary : Rs. 14,00,000.0 - Rs. 16,00,000.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :SAP FICO

Job Description

  • Demonstrated technical expertise in end-to-end configuration of sales and supply chain processes in SAP
  • SAP configuration of the SD module: sales order processes, bill of lading, shipping, distributed requirements planning, variant configuration
  • SAP pricing: release procedures, condition records, condition types
  • Design, customization, configuration and testing of SD
  • Intercompany billing, intercompany STO, third-party sales, output procedures
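The condition records and condition types mentioned here are part of SAP's pricing condition technique, which resolves a price by searching condition records from most specific to most general (an access sequence). A hypothetical Python sketch of that lookup order; the keys and prices are made up, not real SAP data:

```python
# Toy condition records, keyed from most specific to most general,
# mimicking an SAP access sequence (customer + material, then material only).
condition_records = {
    ("C100", "M-01"): 95.0,   # customer-specific price
    (None,   "M-01"): 120.0,  # general material price
}

def find_price(customer, material):
    # Try the most specific key first, then fall back to the general one,
    # as an access sequence does when determining a condition type's value.
    for key in [(customer, material), (None, material)]:
        if key in condition_records:
            return condition_records[key]
    return None  # no condition record found

print(find_price("C100", "M-01"))  # 95.0  (customer-specific record wins)
print(find_price("C999", "M-01"))  # 120.0 (falls back to the general record)
```

In SAP itself this search is configured declaratively (condition tables, access sequences, pricing procedures) rather than coded, but the resolution order is the same idea.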

  • Salary : Rs. 14,00,000.0 - Rs. 18,00,000.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :SAP SD

Job Description

  • Maintain SOPs, process documentation, and user guides for PTP processes.
  • Ensure adherence to internal controls, audit requirements, and standard procedures.
  • Identify opportunities to improve PTP efficiency and process compliance.

Required Skills & Experience

Core Functional Skills
  • 3–4 years of hands-on experience in Oracle EBS PTP processes.
  • Strong working knowledge of purchase orders, Accounts Payable (AP) processing, and invoice matching (2-way/3-way).
  • Experience supporting PTP operations in ERP environments.

ERP Knowledge
  • Oracle E-Business Suite (EBS) functional expertise in the Purchasing and AP modules.
  • Understanding of the end-to-end Procure-to-Pay lifecycle.

Soft Skills
  • Strong analytical and problem-solving skills.
  • Good communication skills to interact with business and finance teams.
  • Ability to manage multiple issues in a support environment.

Good to Have
  • Experience in ERP AMS or shared-services models.
  • Exposure to vendor reconciliations and audit support.
  • Familiarity with ITSM/ticketing tools.
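The 2-way/3-way matching referenced above compares the invoice against the purchase order (2-way) and additionally against the goods receipt (3-way) before payment is approved. A simplified illustrative sketch; the function and parameter names are hypothetical, not Oracle EBS API names:

```python
def three_way_match(po_qty, received_qty, invoiced_qty,
                    po_price, invoice_price, price_tol=0.02):
    """Simplified 3-way match: the invoiced quantity must not exceed what
    was ordered or received, and the unit price must be within tolerance
    of the PO price. Returns a list of hold reasons; empty means the
    invoice passed matching."""
    holds = []
    if invoiced_qty > po_qty:
        holds.append("qty exceeds PO")
    if invoiced_qty > received_qty:
        holds.append("qty exceeds receipt")
    if abs(invoice_price - po_price) > po_price * price_tol:
        holds.append("price outside tolerance")
    return holds

print(three_way_match(10, 10, 10, 5.00, 5.00))  # [] -> ok to pay
print(three_way_match(10, 8, 10, 5.00, 5.60))   # quantity and price holds
```

A 2-way match is the same check with the goods-receipt comparison dropped; real AP systems also add tax, freight, and partial-receipt handling on top of this core logic.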

  • Salary : Rs. 9,00,000.0 - Rs. 11,00,000.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :Oracle EBS Financials

Job Description

  • 6+ years of experience with SAP BW and BW on HANA (mandatory)
  • Experience with BW ABAP (mandatory)
  • Experience and good knowledge in all BW key areas: architecture, modelling, extraction, ETL and reporting (mandatory)
  • Experience and good knowledge of the LSA++ architecture
  • Experience and good knowledge of BW on HANA concepts, including hybrid modelling with native HANA models (calculation views/CDS views) (mandatory)
  • Hands-on ABAP skills from a BW perspective (routines, classes, AMDP, procedures, ABAP CDS, etc.)
  • End-to-end experience in requirements gathering, functional analysis, HLD, LLD, build, testing and production deployment
  • Extensive experience with analysis, design, development, customization and BW analytics
  • Proficiency in analyzing and translating business requirements into technical requirements and architecture
  • Extensive experience with complex SAP BW environments and architectures
  • Experience delivering complex projects using the Agile Scrum methodology
  • Preferred functional knowledge: Production Planning, Finance, and month-end closing (MEC) tasks; should be flexible to support the team during the MEC period
  • Preferred knowledge of DevOps and test automation
  • Good problem-solving skills

  • Salary : Rs. 14,00,000.0 - Rs. 18,00,000.0
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :SAP BW on Hana

Job Description

SAP BW and Datasphere

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :SAP BW and Datasphere

Job Description

Databricks / Database Administration (DBMS)

Databricks Administration
  • Workspace setup, configuration, and governance
  • Unity Catalog configuration and management
  • Cluster policies, job clusters, access controls
  • Secret scopes and credential management
  • External locations and storage credentials setup
  • Monitoring cluster performance and cost optimization
  • Managing notebooks, repos, workflows, and permissions
  • Managing Delta Lake storage and table properties

Azure Cloud Administration
  • ADLS Gen2 storage setup and management
  • Azure networking (VNet, private endpoints, firewall rules)
  • Azure Key Vault integration
  • Managed identities and service principal configuration
  • Azure RBAC and IAM governance
  • Storage access via SAS, OAuth, and service principals
  • Azure monitoring and logging setup

Platform Operations
  • Environment setup (Dev/QA/Prod)
  • CI/CD integration (Azure DevOps/GitHub)
  • Deployment automation
  • Incident troubleshooting
  • Capacity planning and scaling
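The cluster policies mentioned in this description are JSON documents that constrain what users may configure when creating clusters. A hedged sketch of a policy definition built as a Python dict; the attribute paths follow the Databricks cluster-policy format as I understand it, and the specific values are illustrative, not a recommendation for any workspace:

```python
import json

# Illustrative cluster policy: pins the runtime, restricts node types, and
# caps auto-termination to control cost.
policy = {
    "spark_version": {"type": "fixed", "value": "14.3.x-scala2.12"},
    "node_type_id": {
        "type": "allowlist",
        "values": ["Standard_DS3_v2", "Standard_DS4_v2"],
    },
    "autotermination_minutes": {
        "type": "range",
        "minValue": 10,
        "maxValue": 60,
        "defaultValue": 30,
    },
}

policy_json = json.dumps(policy, indent=2)
print(policy_json)
```

In practice the JSON would be uploaded via the workspace UI or the cluster-policies API, and users assigned the policy can only create clusters whose settings satisfy these constraints.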

  • Salary : As per industry standard.
  • Industry :IT-Software / Software Services
  • Functional Area : IT Software - Application Programming , Maintenance
  • Role Category :Programming & Design
  • Role :Databricks~Database Administration (DBMS)