We found 41 jobs matching your search

Job Description

Oracle Financial Services Analytical Applications (OFSAA)

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Oracle Financial Services Analytical Applications (OFSAA)

Job Description

Oracle EBS Supply Chain Management - Distribution

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Oracle EBS Supply Chain Management - Distribution

Job Description

Agile Way of Working~Progress - OpenEdge

Job Description: Must-Have Technical Skills
  • Experience in Progress 4GL/OpenEdge CHUI / GUI / REST Services / Developer Studio
  • Knowledge and experience as an architect on Progress applications
  • Knowledge of databases, DB schemas, and key DBA activities
  • Exposure to OpenEdge object-oriented programming
  • Experience in support, development, and digital migration projects
  • Worked on OpenEdge 10.2B and above
  • Developed applications using the Progress 4GL/OpenEdge development platform
  • Functional testing to ensure deliverables meet customer requirements
  • Analyzing requirements and performing gap analysis
  • Responsible for improving the performance and reliability of software applications and IT systems
  • Exploring better ways to develop the product and fixing latency and other issues in existing applications
  • Knowledge of SDLC processes, Agile, and project implementation life cycles
  • Experience in Agile development methodology is a must
  • Working experience in a DevOps environment is a must
Support Skills
  • Analyzing tickets and providing resolutions to users

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Agile Way of Working~Progress - OpenEdge

Job Description

Windows Powershell~ServiceNow

Job Description:
  • Collaborate with business and IT stakeholders to gather and analyze automation requirements.
  • Design and develop ServiceNow workflows, flows (Flow Designer), and Orchestration/Integration Hub solutions.
  • Implement end-to-end automation for ITSM, ITOM, HRSD, or other ServiceNow modules.
  • Create custom applications and extend ServiceNow platform capabilities as needed.
  • Integrate ServiceNow with third-party systems (e.g., via REST/SOAP APIs, Integration Hub).
  • Maintain technical documentation, including process flows, solution designs, and test cases.
  • Perform testing, validation, and deployment of automation solutions.
  • Monitor the performance and efficiency of automated workflows; identify and implement improvements.
  • Ensure compliance with change management and governance processes.
  • Provide training and support to end users and internal teams on ServiceNow automation features.
Requirements:
  • Strong understanding of ServiceNow modules such as ITSM, ITOM, HRSD, or CSM.
  • Experience with ServiceNow Flow Designer, Workflow Editor, and Integration Hub.
  • Knowledge of scripting languages like JavaScript and familiarity with the Glide API.
  • Experience with web technologies (REST/SOAP, XML, JSON).
  • Strong problem-solving, analytical, and communication skills.
  • Ability to work independently and in a cross-functional team environment.
  • ServiceNow Certified System Administrator (CSA) is required.
  • Additional certifications (e.g., Certified Application Developer, ITSM, ITOM) are a plus.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Windows Powershell~ServiceNow
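
The integration responsibilities above all route through the ServiceNow REST (Table) API. As a minimal sketch of the query side, shown in Python for brevity (the instance URL and credentials are placeholders, and a standard out-of-the-box incident table is assumed):

    import requests

    INSTANCE = "https://dev00000.service-now.com"   # placeholder instance
    AUTH = ("integration.user", "password")         # placeholder basic-auth credentials

    def fetch_open_incidents(limit=10):
        """List open incidents via the ServiceNow Table API."""
        resp = requests.get(
            f"{INSTANCE}/api/now/table/incident",
            params={
                "sysparm_query": "active=true^state=1",   # encoded query: active, state = New
                "sysparm_limit": limit,
                "sysparm_fields": "number,short_description,priority",
            },
            auth=AUTH,
            headers={"Accept": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["result"]

    for inc in fetch_open_incidents():
        print(inc["number"], "-", inc["short_description"])

The same pattern extends to the SOAP and Integration Hub paths mentioned above; only the transport changes.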

Job Description

Job Title: Developer
Work Location: Hyderabad, TG / Kolkata, WB
Skill Required: Digital : PySpark~Azure Data Factory
Range: 6 to 8 Yrs

Role Summary
We are looking for a Data Engineer with 2-4 years (more experience is welcome) of hands-on experience in PySpark, Azure Data Factory (ADF), and Azure-based data pipelines. The ideal candidate should have strong skills in building ETL workflows, working with big-data technologies, and supporting production data processes.

Key Responsibilities
1. PySpark Development
  • Develop and optimize ETL pipelines using PySpark and Spark SQL.
  • Work with DataFrames, transformations, and partitioning.
  • Handle data ingestion from various formats (Parquet, CSV, JSON, Delta).
2. Azure Data Factory (ADF)
  • Build and maintain ADF pipelines, datasets, linked services, and triggers.
  • Integrate ADF pipelines with Azure Databricks, ADLS, Azure SQL, APIs, etc.
  • Monitor pipeline runs and troubleshoot failures.
3. Azure Cloud / Databricks
  • Work with ADLS Gen2 for data storage and management.
  • Run, schedule, and debug Databricks notebooks.
  • Use Delta Lake for data processing and incremental loads.
4. ETL / Data Management
  • Implement data cleansing, transformations, and validation checks.
  • Follow standard data engineering best practices.
  • Support production jobs and ensure data quality.
5. DevOps Collaboration
  • Use Git or Azure DevOps for code versioning.
  • Participate in code reviews and documentation.
  • Collaborate with analysts and data architects on requirements.

Required Skills
  • 2-4 years of hands-on experience with PySpark and Spark SQL.
  • Experience building data pipelines in Azure Data Factory.
  • Working knowledge of Azure Databricks and ADLS Gen2.
  • Good SQL knowledge.
  • Understanding of ETL concepts and data pipelines.

Good to Have
  • Experience with Delta Lake (MERGE, schema evolution).
  • Familiarity with CI/CD (Azure DevOps/GitHub).
  • Exposure to Snowflake is a plus.

Soft Skills
  • Strong analytical and problem-solving abilities.
  • Good communication and teamwork skills.
  • Ability to learn quickly and adapt to new tools.

Responsibilities

  • Salary : Rs. 90,000 - Rs. 1,65,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer
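
The PySpark work described above (ingest Parquet/CSV/JSON, transform DataFrames, write partitioned Delta output) follows a common extract-transform-load shape. A minimal sketch, assuming a Delta-enabled Spark environment such as a Databricks cluster; the storage paths and column names are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # 1. Ingest: read raw CSV with an explicit header row (hypothetical ADLS Gen2 path).
    raw = spark.read.option("header", "true").csv(
        "abfss://raw@account.dfs.core.windows.net/orders/"
    )

    # 2. Transform: deduplicate, validate, and derive a partition column.
    clean = (
        raw.dropDuplicates(["order_id"])
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())          # basic validation check
           .withColumn("order_date", F.to_date("order_ts"))
    )

    # 3. Load: write partitioned Delta for downstream incremental reads.
    (clean.write.format("delta")
          .mode("append")
          .partitionBy("order_date")
          .save("abfss://curated@account.dfs.core.windows.net/orders/"))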

Job Description

Databricks Developer

Role: Databricks Developer. Must-have skills: Databricks admin or developer (5+ years), Azure (4 years).

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Databricks Developer
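
A Databricks developer at this level is usually working against Delta Lake tables; the incremental-load (MERGE) pattern listed as good-to-have in the previous posting might look like the following sketch, with hypothetical paths and a hypothetical order_id key, assuming the delta-spark package is available on the cluster:

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("orders-upsert").getOrCreate()

    target = DeltaTable.forPath(spark, "/mnt/curated/orders")   # existing Delta table
    updates = spark.read.format("delta").load("/mnt/staging/orders_batch")

    # Upsert the staging batch into the curated table keyed on order_id:
    # matched rows are updated in place, unmatched rows are inserted.
    (target.alias("t")
           .merge(updates.alias("s"), "t.order_id = s.order_id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())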

Job Description

  • Must hold a current Certified System Administrator certification.
  • Basic knowledge of ServiceNow Discovery and troubleshooting.
  • Experience with ITSM (request management, incident management), ITOM (discovery), ITAM (hardware/software asset management), and ITBM applications (demand/PPM).
  • Proficiency in service catalog administration, workflow/flow development, and service portal/employee center development.
  • Strong JavaScript experience.
  • Experience building integrations using REST APIs.
  • Excellent communication skills and the ability to work effectively with stakeholders.
  • Strong problem-solving skills and attention to detail.
  • Certified ServiceNow Application Developer certification.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : ServiceNow
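
The REST integration skill above is the write-side counterpart of the earlier query sketch; creating an ITSM incident through the Table API might look like this (again with a placeholder instance and credentials):

    import requests

    INSTANCE = "https://dev00000.service-now.com"   # placeholder instance
    AUTH = ("integration.user", "password")         # placeholder credentials

    def create_incident(short_description, urgency="3"):
        """Open a new incident record via the ServiceNow Table API."""
        resp = requests.post(
            f"{INSTANCE}/api/now/table/incident",
            json={"short_description": short_description, "urgency": urgency},
            auth=AUTH,
            headers={"Accept": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["result"]["number"]

    print(create_incident("Disk space low on app server"))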

Job Description

Full Stack: .NET, Angular, React, MS SQL, DevOps

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : DevOps

Job Description

Digital : Digital Security~Workday Cloud Connect for third party payroll

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Digital : Digital Security~Workday Cloud Connect for third party payroll