We found 335 jobs matching your search

Job Description

Job Description:
  • 4+ years of strong experience in Salesforce integration
  • Good experience in Salesforce Community Cloud
  • Good experience in REST API and SOAP API integration
  • Strong analytical and communication skills
  • Good experience in Salesforce Lightning is an added advantage
  • Good team skills
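The REST API integration experience this role asks for can be illustrated with a minimal sketch. Salesforce exposes SOQL queries over REST at `/services/data/vXX.X/query`; the instance URL, API version, and access token below are placeholders, not values from this posting.

```python
# Minimal sketch: building a Salesforce REST API SOQL query request.
# The instance URL and bearer token are placeholders for illustration.
from urllib.parse import urlencode

def build_soql_request(instance_url: str, api_version: str, soql: str) -> tuple[str, dict]:
    """Return the URL and headers for a Salesforce REST query call."""
    url = f"{instance_url}/services/data/v{api_version}/query?{urlencode({'q': soql})}"
    headers = {
        "Authorization": "Bearer <access_token>",  # obtained via OAuth in practice
        "Accept": "application/json",
    }
    return url, headers

url, headers = build_soql_request(
    "https://example.my.salesforce.com", "58.0",
    "SELECT Id, Name FROM Account LIMIT 5",
)
```

In a real integration, the request would be sent with an HTTP client and the JSON response paged through via the `nextRecordsUrl` field.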

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Salesforce Community Cloud and Integration

Job Description

Job Description: Oracle E-Business Suite (EBS) Developer - Technical
Essential Skills: Oracle E-Business Suite (EBS) Developer - Technical

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Oracle E-Business Suite (EBS) Developer - Technical

Job Description

Experience: 4-6 Years
Work Location: Chennai, TN / Bangalore, KA / Hyderabad, TS
Skill Required: Digital : Bigdata and Hadoop Ecosystems; Digital : PySpark

Job Description:
  • Work as a developer in Bigdata, Hadoop, or Data Warehousing tools and cloud computing
  • Work on Hadoop, Hive SQL, Spark, and Bigdata ecosystem tools
  • Experience working with teams in a complex organization involving multiple reporting lines
  • Strong functional and technical knowledge to deliver what is required, and well acquainted with banking terminology
  • Strong DevOps and Agile development framework knowledge
  • Create Scala/Spark jobs for data transformation and aggregation
  • Experience with stream-processing systems such as Storm, Spark Streaming, and Flink

Essential Skills:
  • Working experience of Hadoop, Hive SQL, Spark, and Bigdata ecosystem tools
  • Able to tune queries and work on performance enhancement
  • Responsible for delivering code, setting up the environment and connectivity, and deploying the code to production after testing
  • Strong functional and technical knowledge to deliver what is required, and well acquainted with banking terminology; occasionally responsible as a primary contact and/or driver for small to medium size projects
  • Strong DevOps and Agile development framework knowledge
  • Good technical knowledge of cloud computing (AWS or Azure cloud services) is preferable
  • Strong conceptual and creative problem-solving skills; able to work with considerable ambiguity and to learn new and complex concepts quickly
  • Experience working with teams in a complex organization involving multiple reporting lines
  • Solid understanding of object-oriented programming and HDFS concepts
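The "data transformation and aggregation" jobs this posting describes follow a group-and-aggregate shape. The sketch below shows that shape with stdlib Python so the logic is explicit; in a PySpark job the same step would typically be a `groupBy().agg()` over a DataFrame. The field names and sample data are illustrative, not from the posting.

```python
# Illustrative sketch of the grouping-and-aggregation step of a typical
# Spark-style transformation job, using plain Python for clarity.
from collections import defaultdict

def aggregate_by_key(rows, key_field, value_field):
    """Sum value_field per distinct key_field, like groupBy + sum."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key_field]] += row[value_field]
    return dict(totals)

transactions = [
    {"branch": "Chennai", "amount": 120.0},
    {"branch": "Chennai", "amount": 80.0},
    {"branch": "Hyderabad", "amount": 50.0},
]
print(aggregate_by_key(transactions, "branch", "amount"))
# → {'Chennai': 200.0, 'Hyderabad': 50.0}
```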

Responsibilities

  • Salary : Rs. 70,000 - Rs. 1,30,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer

Job Description

Job Title: Reltio Integration Hub Developer

Job Description: Design, develop, and maintain integration solutions between the Reltio MDM platform and other systems using Reltio Integration Hub and APIs to enable seamless, high-quality data flow and support business operations.

Key Responsibilities:
  • Develop, implement, and support integration solutions using Reltio Integration Hub
  • Collaborate with stakeholders to gather requirements and translate them into technical specifications
  • Build, configure, and maintain batch/real-time data pipelines, connectors, mappings, and workflows
  • Monitor, troubleshoot, and optimize integrations for performance, data accuracy, and reliability
  • Document integration processes and technical configurations
  • Support testing (integration, UAT) and provide ongoing enhancements
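The reliability responsibilities above (monitoring, troubleshooting, optimizing integrations) usually involve making REST calls resilient to transient failures. The sketch below is generic, not a Reltio-specific API; the function names and retry parameters are illustrative assumptions.

```python
# Generic sketch: a retry-with-exponential-backoff wrapper of the kind an
# integration pipeline might use around REST calls. Not a Reltio API.
import time

def call_with_retry(fn, attempts=3, base_delay=0.01):
    """Call fn(); on exception, retry with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "synced"

print(call_with_retry(flaky))  # → synced
```

In production the wrapper would typically also log each failure and cap the total wait, so stuck integrations surface in monitoring rather than retrying silently.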

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Reltio MDM Hub Developer

Job Description

SAP ABAP + OData + Adobe

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP ABAP

Job Description

Candidates with 8-10 years of experience in performance tuning and performance management, with knowledge of performance tools and how they are used during an implementation phase. Candidates should be able to coordinate with vendors or SAP on issues, handle performance testing, and work with LoadRunner. Experience in performance testing/tuning and making performance recommendations is required.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP Basis

Job Description

Cloud Storage, BigQuery, and Data Fusion; DevOps

Responsibilities

  • Salary : Rs. 10,00,000 - Rs. 25,00,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Cloud and DevOps

Job Description

SAP Treasury, FICO

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP Treasury, FICO

Job Description

Job Title: Developer
Work Location: Hyderabad, TG / Bangalore, KA
Skill Required: Digital : Amazon Web Service (AWS) Cloud Computing
Experience Range: 4-10 Years

Job Description: AWS Datalake Administrator
Skills Required: S3, AWS Lake Formation, SQL, and AWS data services such as Glue, Step Functions, Redshift, etc.
  • Administer and optimize AWS Data Lake infrastructure using services like S3, Lake Formation, Glue, Athena, and Redshift, ensuring secure, scalable, and efficient operations.
  • Design and manage ETL workflows with AWS Glue and Step Functions, enabling seamless data ingestion, transformation, and cataloging.
  • Implement access controls and data governance using Lake Formation, IAM policies, and resource tagging to ensure compliance and data security.
  • Use SQL and automation scripts for data validation, performance tuning, and supporting analytics teams with curated, query-ready datasets.
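The "automation scripts for data validation" duty above can be sketched as a pre-publication quality gate. The check below runs on in-memory rows for illustration; in practice the rows would come from an Athena or Redshift query, and the column names and thresholds here are assumptions.

```python
# Hedged sketch: an automated data-validation check a data-lake administrator
# might run before publishing a curated dataset (illustrative columns/thresholds).

def validate_dataset(rows, required_columns, max_null_rate=0.1):
    """Return (ok, issues): check the dataset is non-empty and that each
    required column's null rate stays under max_null_rate."""
    issues = []
    if not rows:
        return False, ["dataset is empty"]
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            issues.append(f"{col}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return not issues, issues

rows = [{"id": 1, "region": "south"}, {"id": 2, "region": None}]
ok, issues = validate_dataset(rows, ["id", "region"], max_null_rate=0.25)
print(ok, issues)  # → False ['region: null rate 50% exceeds 25%']
```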

Responsibilities

  • Salary : Rs. 70,000 - Rs. 1,30,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer