We found 301 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Mainframe DB2 - Application Development

Job Description:
Must-Have: COBOL, JCL, DB2, CICS. Experience with VSAM (Virtual Storage Access Method). Understanding of mainframe job scheduling tools.
Good-to-Have: Familiarity with mainframe debugging tools (e.g., IBM Debug Tool, Xpediter).
1. Design, develop, test, and implement mainframe-based applications using COBOL, JCL, DB2, and related technologies.
2. Analyze business and technical requirements to develop solutions that meet business needs.
3. Perform code reviews, unit testing, and debugging to ensure high-quality deliverables.
4. Maintain and enhance existing mainframe applications to ensure optimal performance and user satisfaction.
5. Collaborate with business analysts, QA teams, and other developers to deliver solutions on time and within scope.
6. Monitor batch jobs and resolve issues as they arise in production environments.
7. Write and maintain technical documentation and support the transition to operations teams.
Essential Skills: COBOL, JCL, DB2, CICS; VSAM; mainframe job scheduling tools.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Mainframe DB2 - Application Development

Job Description

IIS/WebLogic Admin

• IIS 7+ administration and configuration.
• Strong knowledge of PowerShell / PowerCLI scripting and automation.
• Strong hands-on experience with IIS servers (HTML, JavaScript, XML, AJAX, Service-Oriented Architecture).
• Responsible for the operation and administration of IIS web servers across Staging, Demo, and Production environments.
• Perform troubleshooting and respond to all IIS-related application issues.
• Domain operations: responsible for the operation and administration of network and server monitoring with regard to web hosting.
• Manage web services, third-party SSL certificates, and general server configurations, including patch installation, web server security, administration scripts, and authentication modes.
• Work experience in WebLogic administration, including installation, configuration, tuning, and deployment of applications.
• Good knowledge of creating and configuring domains in WebLogic.
• Good knowledge of configuring and tuning Apache web servers in Linux environments.
• Experience with JDBC, JMS, web services, security standards, LDAP, and SSL, preferably with OpenAM experience.
• Good knowledge of identifying issues and providing quick resolutions.
• Knowledge of capturing and analyzing thread dumps and heap dumps.
• Experience analyzing and fine-tuning server parameters (JVM size, thread pool size, garbage collection process, etc.) for better performance.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: IIS/WebLogic Admin

Job Description

Splunk Admin

Must-Have:
• Splunk Admin certification and Splunk Enterprise Architect certification.
• Hands-on architecture, design, and development of systems; developed Splunk infrastructure and related solutions.
• Standardize and implement Splunk Universal Forwarder deployment, configuration, and maintenance on Linux and Windows platforms.
• Maintain, manage, and monitor Splunk infrastructure (identify bad searches, dashboards, and the health of Splunk).
• Used User Behavior Analytics to parse data into Splunk and detect anomalies in true-positive events.
• Used SNMP (Simple Network Management Protocol) to monitor the application on the server.
• Use Splunk Enterprise to perform data mining and analysis, utilizing various queries and reporting methods.
• Analyze and monitor security-related technologies, including host-based firewalls, host-based IDS, LDP server configuration controls, logging, SIEM, monitoring tools, and antivirus systems.
• Actively hunt for and dissect previously unidentified threats, and differentiate between potential intrusion attempts and false alarms.
• Monitor and detect security use cases in Splunk, e.g., SQL injection, sqlmap, Burp Suite Intruder.
• Use the Splunk Phantom Security Orchestration, Automation, and Response (SOAR) system to evaluate notable events for correlation alerts.
• Develop alerts and timed reports; develop and manage Splunk applications.
• Worked with Splunk knowledge objects, e.g., configuration, uploading data, field extraction, validation of onboarded data, regex search, event parsing, and data transformation.
• Use Splunk GUI development to create Splunk apps, searches, data models, dashboards, and reports using the Splunk query language.
• Perform index administration, maintenance, and optimization, and set up data retention.
• Create Splunk applications, dashboards, and visualizations.
• Manage and troubleshoot Splunk accounts (create, delete, modify, etc.).
• Transfer Splunk log files in JSON format to Elasticsearch.
• Support Splunk on UNIX, Linux, and Windows-based platforms; assist with automation of processes and procedures.
• Provided different methods to install search heads, forwarders, and deployment servers, and troubleshoot on the back end.
• Implement and maintain Splunk infrastructure and configurations in single and clustered environments.

Good-to-Have: Sound knowledge of the Senior Architect role and Agile methodologies.
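Much of the day-to-day Splunk administration above is driven through Splunk's REST API rather than the GUI. The sketch below submits an SPL search job to the documented `/services/search/jobs` management endpoint; the host name and token are placeholders, and only the payload-building helper runs without a live Splunk server.

```python
# Minimal sketch of submitting a Splunk search via the REST API.
# Host and token below are hypothetical; /services/search/jobs is
# Splunk's documented search-job endpoint on the management port.
import json
import urllib.parse
import urllib.request


def build_search_payload(query: str) -> dict:
    """Splunk's REST API expects SPL to begin with a command;
    plain searches must be prefixed with 'search'."""
    q = query.strip()
    if not q.startswith("search") and not q.startswith("|"):
        q = "search " + q
    return {"search": q, "output_mode": "json"}


def submit_search(base_url: str, token: str, query: str) -> dict:
    payload = urllib.parse.urlencode(build_search_payload(query)).encode()
    req = urllib.request.Request(
        f"{base_url}/services/search/jobs",
        data=payload,
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # network call
        return json.load(resp)


if __name__ == "__main__":
    # Placeholder host and token -- replace with real values.
    submit_search("https://splunk.example.com:8089", "TOKEN",
                  "index=main error | head 10")
```

The same endpoint family is what scripted health checks (e.g., "identify bad searches") typically poll.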

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Splunk Admin

Job Description

Network Operations

• Understanding of and extensive experience with network protocols: BGP, EIGRP, OSPF, IPsec, NAT, PAT, MPLS, VSS, vPC, etc.
• Working experience with Cisco Catalyst and Nexus switches, ISR/ASR routers, and Arista network devices, and an understanding of switching technologies: VLAN, VTP, DTP, STP, MSTP, VRRP, HSRP, etc.
• Working experience with Aruba wireless devices.
• Excellent communication skills.
• Hands-on experience with firewalls (Cisco ASA, Palo Alto).
• Working experience with wireless technology (Aruba/Cisco).
• Cisco certification or any other networking certification.
• Knowledge of Azure/SD-WAN technologies is an added advantage.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Network Operations

Job Description

1. Exposure to the basic FICO modules – GL, AP, AR, CCA, IO. Asset Accounting experience is also required. 2. Exposure to testing various finance-related business processes. 3. Good understanding of the integration with the MM, SD, PS, and IS-U modules. 4. Experience range: 4 to 9 years.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP FICO

Job Description

PL/SQL~Functional Testing

Roles & Responsibilities
A Selenium automation tester is responsible for designing and implementing automated tests for web applications. Key responsibilities include:
• Developing and executing automated tests using the Selenium framework to ensure software quality.
• Collaborating with development teams to identify and resolve issues, and maintaining a database of software defects.
• Analyzing test results and tracking metrics to improve testing processes.
• Staying updated with industry trends and technologies, and providing suggestions for process improvements.
• Possessing skills in programming languages like Java, Python, or C#, and familiarity with test automation tools.
• Understanding business requirements and creating test cases.
• Analyzing automation results, reporting defects, and working with developers on resolution.
• Ensuring traceability between requirements and test cases.
• Validating UI/UX and workflows, and maintaining troubleshooting guides.
• Designing and developing automation scripts using tools like Selenium.
• Maintaining and updating automation frameworks.

Personal and Organizational Skills
• Proactive and initiative-driven: self-motivated with a go-getter attitude, capable of solving complex problems.
• Collaborative: able to work effectively with QA, product managers, and cross-functional teams; should be a team player.
• Fast learner: eager to adapt to new technologies and project requirements.
• Good communicator: excellent verbal and written communication skills.
• Independent: capable of managing tasks with minimal supervision.
• Documentation: adept at creating architectural diagrams and maintaining detailed documentation in Confluence.
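Since the listing asks for both Selenium scripting and requirement-to-test-case traceability, here is a minimal Python sketch of each. The page URL and element IDs are hypothetical, and the traceability helper is pure Python so it runs without a browser.

```python
# Sketch: requirement/test-case traceability plus a Selenium smoke test.
# All requirement IDs, URLs, and locators below are illustrative.

def trace_matrix(requirements, test_cases):
    """Map each requirement ID to the test-case IDs covering it.
    An empty list flags a requirement with no test coverage."""
    return {
        req: [tc for tc, covered in test_cases.items() if req in covered]
        for req in requirements
    }


def login_smoke_test(base_url: str, user: str, password: str) -> str:
    """Drive a hypothetical login page and return the resulting page title."""
    # Imported lazily so trace_matrix stays usable without Selenium installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get(base_url + "/login")                      # hypothetical route
        driver.find_element(By.ID, "username").send_keys(user)       # assumed IDs
        driver.find_element(By.ID, "password").send_keys(password)
        driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()
        return driver.title
    finally:
        driver.quit()
```

A failing (empty) entry in the trace matrix is exactly the gap the "ensure traceability" bullet asks testers to catch before a release.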

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: PL/SQL~Functional Testing

Job Description

• Hands-on experience in healthcare interoperability and integration development.
• Strong understanding of FHIR API development and RESTful services.
• Knowledge of C-CDA document types and HL7 FHIR standards.
• Proficiency in scripting languages: Python.
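As a concrete illustration of the FHIR RESTful style this role calls for, the sketch below searches for Patient resources on a public FHIR R4 test server using only the Python standard library. The server URL and search parameter are illustrative; `application/fhir+json` is the standard FHIR media type.

```python
# Sketch: a FHIR R4 Patient search over plain HTTP.
# hapi.fhir.org is a well-known public FHIR test server; treat the
# base URL and the 'family' search parameter as example inputs.
import json
import urllib.parse
import urllib.request

FHIR_JSON = "application/fhir+json"  # standard FHIR REST media type


def patient_search_url(base: str, family: str) -> str:
    """Build a FHIR search URL of the form <base>/Patient?family=<name>."""
    return f"{base.rstrip('/')}/Patient?" + urllib.parse.urlencode(
        {"family": family}
    )


def fetch_bundle(url: str) -> dict:
    """GET a FHIR search result; searches return a Bundle resource."""
    req = urllib.request.Request(url, headers={"Accept": FHIR_JSON})
    with urllib.request.urlopen(req) as resp:  # network call
        return json.load(resp)


if __name__ == "__main__":
    bundle = fetch_bundle(
        patient_search_url("https://hapi.fhir.org/baseR4", "Smith")
    )
    print(bundle.get("resourceType"))
```

The same GET/POST-on-resource-type pattern extends to the other FHIR interactions (read, create, update) an integration developer would script.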

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: HL7 FHIR

Job Description

Job Title: Developer
Work Location: Hyderabad, TG / Kolkata, WB
Skill Required: Digital : PySpark~Azure Data Factory
Range: 6 to 8 Yrs

Role Summary
We are looking for a Data Engineer with 2-4 years (more experience is welcome) of hands-on experience in PySpark, Azure Data Factory (ADF), and Azure-based data pipelines. The ideal candidate should have strong skills in building ETL workflows, working with big-data technologies, and supporting production data processes.

Key Responsibilities
1. PySpark Development
• Develop and optimize ETL pipelines using PySpark and Spark SQL.
• Work with DataFrames, transformations, and partitioning.
• Handle data ingestion from various formats (Parquet, CSV, JSON, Delta).
2. Azure Data Factory (ADF)
• Build and maintain ADF pipelines, datasets, linked services, and triggers.
• Integrate ADF pipelines with Azure Databricks, ADLS, Azure SQL, APIs, etc.
• Monitor pipeline runs and troubleshoot failures.
3. Azure Cloud / Databricks
• Work with ADLS Gen2 for data storage and management.
• Run, schedule, and debug Databricks notebooks.
• Use Delta Lake for data processing and incremental loads.
4. ETL / Data Management
• Implement data cleansing, transformations, and validation checks.
• Follow standard data engineering best practices.
• Support production jobs and ensure data quality.
5. DevOps / Collaboration
• Use Git or Azure DevOps for code versioning.
• Participate in code reviews and documentation.
• Collaborate with analysts and data architects on requirements.

Required Skills
• 2-4 years of hands-on experience with PySpark and Spark SQL.
• Experience building data pipelines in Azure Data Factory.
• Working knowledge of Azure Databricks and ADLS Gen2.
• Good SQL knowledge.
• Understanding of ETL concepts and data pipelines.

Good to Have
• Experience with Delta Lake (MERGE, schema evolution).
• Familiarity with CI/CD (Azure DevOps/GitHub).
• Exposure to Snowflake is a plus.

Soft Skills
• Strong analytical and problem-solving abilities.
• Good communication and teamwork skills.
• Ability to learn quickly and adapt to new tools.
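The PySpark responsibilities above (ingestion, cleansing, validation checks, Delta incremental loads) can be sketched roughly as follows. Column names and storage paths are hypothetical, and the schema-validation helper is pure Python so it can run without a Spark cluster.

```python
# Sketch of a small PySpark ETL job: read Parquet, validate schema,
# cleanse, and append to a Delta table. Paths/columns are illustrative.

REQUIRED_COLUMNS = ["order_id", "order_ts", "amount"]  # hypothetical schema


def missing_columns(actual_cols, required=REQUIRED_COLUMNS):
    """Validation check: which required columns are absent from the input."""
    return [c for c in required if c not in actual_cols]


def run_pipeline(input_path: str, output_path: str) -> None:
    # PySpark imported lazily so the validation helper stays dependency-free.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()
    df = spark.read.parquet(input_path)              # ingestion (Parquet)

    gaps = missing_columns(df.columns)               # validation check
    if gaps:
        raise ValueError(f"input missing columns: {gaps}")

    cleaned = (
        df.dropDuplicates(["order_id"])              # data cleansing
          .filter(F.col("amount") > 0)               # reject bad rows
          .withColumn("order_date", F.to_date("order_ts"))
    )

    (cleaned.write.format("delta")                   # Delta incremental load
            .mode("append")
            .partitionBy("order_date")
            .save(output_path))
```

In an ADF-orchestrated setup, a pipeline activity would typically trigger this logic as a Databricks notebook or job against ADLS Gen2 paths.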

Responsibilities

  • Salary: Rs. 90,000.0 - Rs. 1,65,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Developer

Job Description

Vulnerability management, Linux administration with shell scripting, and deployment experience.
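A minimal sketch of the kind of check a vulnerability-management script for this role might automate: comparing installed package versions against advisory-patched versions. The package names, versions, and the simplistic numeric comparison are all illustrative; real tooling should use the distribution's own version-comparison rules.

```python
# Sketch: flag installed packages older than the advisory's patched version.
# Advisory data and the dotted-numeric version scheme are illustrative.

def parse_version(v: str):
    """Turn '1.1.1' into (1, 1, 1) for tuple comparison (numeric parts only)."""
    return tuple(int(p) for p in v.split(".") if p.isdigit())


def vulnerable_packages(installed: dict, advisories: dict):
    """Return the names of packages installed below the patched version."""
    return sorted(
        name
        for name, patched in advisories.items()
        if name in installed
        and parse_version(installed[name]) < parse_version(patched)
    )


if __name__ == "__main__":
    # In practice the installed map would come from e.g. `rpm -qa` or `dpkg -l`.
    installed = {"openssl": "1.1.1", "bash": "5.2"}
    advisories = {"openssl": "3.0.0", "bash": "5.1"}
    print(vulnerable_packages(installed, advisories))
```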

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: 546901_CIS_ Infrastructure Security