Role Descriptions: Key Responsibilities for Ascender Payroll System QA
1) System Validation Testing: Execute end-to-end testing of Ascender payroll cycles, including SIT (System Integration Testing), UAT (User Acceptance Testing), and regression testing.
2) Data Validation: Validate data flow between time and attendance systems, talent acquisition, and the core Ascender payroll engine to ensure accuracy.
3) Defect Management: Identify, document, and track system defects with precision, collaborating with functional consultants and technical teams to resolve them.
4) Parallel Runs: Lead or assist in critical payroll parallel runs to verify that system changes, migrations, or updates do not disrupt employee pay.
5) Compliance Checking: Ensure all payroll processes comply with complex Australian retail enterprise award requirements and legal regulations.
Essential Skills: Required Experience and Skills for Ascender Payroll System QA
1) Ascender Experience: Proven experience working with Ascender HCM (formerly CHRIS21) or similar enterprise payroll systems.
2) Payroll Domain Knowledge: Deep understanding of payroll processes, including taxes, superannuation, leave management, and award interpretation.
3) Testing Methodologies: Strong knowledge of manual testing processes, with proficiency in System Integration Testing (SIT) and User Acceptance Testing (UAT).
4) Technical Skills: Proficient in querying databases (SQL) for data verification and skilled in MS Excel for analyzing data variances.
5) System Implementation: Experience in brownfield implementations, system upgrades, or migration projects is highly desirable.
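The SQL-based data verification described above typically means reconciling per-employee figures between a source extract and the payroll engine's output. A minimal sketch using Python's built-in sqlite3 module (the table names, column names, and tolerance are illustrative assumptions, not Ascender's actual schema):

```python
import sqlite3

def payroll_variances(conn, tolerance=0.01):
    """Return employees whose net pay differs between the source extract
    and the payroll output by more than the tolerance.
    Table and column names here are illustrative assumptions."""
    query = """
        SELECT s.employee_id, s.net_pay AS source_pay, t.net_pay AS target_pay
        FROM source_extract s
        JOIN payroll_output t ON t.employee_id = s.employee_id
        WHERE ABS(s.net_pay - t.net_pay) > ?
    """
    return conn.execute(query, (tolerance,)).fetchall()

# Demo with an in-memory database
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_extract (employee_id TEXT, net_pay REAL);
    CREATE TABLE payroll_output (employee_id TEXT, net_pay REAL);
    INSERT INTO source_extract VALUES ('E001', 1500.00), ('E002', 2100.00);
    INSERT INTO payroll_output VALUES ('E001', 1500.00), ('E002', 2099.50);
""")
print(payroll_variances(conn))  # only E002 differs by more than 1 cent
```

In practice the variance rows would be exported for analysis in Excel, as the skills list suggests.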
Desirable Skills:
Keyword:
Skills: Workday Payroll
Experience Required: 4-6
Salary: Rs. 70,000 - Rs. 1,10,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role: UiPath Developer
Location: Chennai / Bangalore
Exp: 7+ (relevant 6+ yrs)
RPA and Workflow Engineer:
Covers:
This individual will be responsible for ensuring stable day‑to‑day operations of RPA bots, managing platform updates, troubleshooting failures, and contributing to ongoing automation improvements. The ideal candidate has hands‑on UiPath experience, strong troubleshooting skills, and the ability to manage both recurring operational tasks and complex platform‑level initiatives. Nintex experience is a strong plus.
Key responsibilities:
RPA
Support large-scale system changes impacting automations, such as browser or application upgrades.
Lead or support migration initiatives, including:
Migration of bots from desktop to web‑based application versions
Updates related to ongoing security features and protocols
Update and maintain automation frameworks, including:
Credential retrieval processes and password vault integrations
Validate and regression‑test automations after platform, application, or environment changes.
Perform frequent UI selector updates in response to front‑end changes.
Monitor automation health and complete manual re‑runs for failed processes.
Manage authentication and/or API connectivity issues.
Update UiPath activity packages to ensure processes run efficiently.
Support password rotation cycles across applications (every 60–90 days).
Support long-term automation initiatives to help business units stay compliant with records disposition regulations.
Update, validate, and test SQL and data queries.
As time permits after maintenance work, assist the Automation Solutions team with new RPA builds for business stakeholders.
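The 60-90 day password rotation cycle mentioned above is the kind of recurring task that benefits from a small tracking script. A hedged sketch (the credential store shape and 90-day window are assumptions; real rotations would go through the password vault named earlier):

```python
from datetime import date, timedelta

def rotations_due(credentials, today, max_age_days=90):
    """Return names of credentials whose last rotation is at least
    max_age_days old. `credentials` maps name -> last rotation date;
    this data shape is an assumption for illustration."""
    cutoff = today - timedelta(days=max_age_days)
    return [name for name, last_rotated in credentials.items()
            if last_rotated <= cutoff]

creds = {
    "app_a_service_acct": date(2024, 1, 5),   # rotated ~3 months ago
    "app_b_service_acct": date(2024, 3, 20),  # rotated recently
}
print(rotations_due(creds, today=date(2024, 4, 10)))  # ['app_a_service_acct']
```

A real deployment would pull rotation dates from the vault's API rather than a hard-coded dictionary.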
Nintex
Investigate platform outages, performance degradation, SQL connectivity issues, and service‑level failures.
Partner with Nintex support, IT, and business teams to resolve tenant‑level delays, service restarts, or workflow failures/delays, and email/task delivery issues.
Review and apply workflow adjustments following root‑cause analyses, engineering hotfixes, or Microsoft service issues.
Manage SharePoint access related to workflow documents, fulfiller assignments, and permission updates.
Support continuous improvement by identifying workflow optimizations and stability enhancements.
Required Skills:
2+ years of hands‑on UiPath development and operations support experience.
Strong understanding of UiPath Orchestrator (Cloud), Robot management, selectors, error handling, queues, and activity packages.
Experience supporting production automations and troubleshooting failures (including analyzing logs, performing root-cause investigations, delivering permanent fixes, and managing incident response).
Knowledge of SQL.
Familiarity with API‑based authentication.
Experience navigating change‑heavy environments (application upgrades, infrastructure migrations, browser updates).
Strong communication skills to support cross‑functional teams and issue escalation.
Preferred / Nice-to-Have Skills:
Experience with Nintex Workflow Cloud administration.
Understanding of workflow audit requirements (e.g., SOX controls).
Experience with SharePoint permissions management.
Background supporting system migrations such as AVS, cloud platform moves, or server decommissioning.
Familiarity with Information Security hardening practices and their impact on automation.
Salary: Rs. 70,000 - Rs. 1,10,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Descriptions:
Desired Competencies (Technical/Behavioral Competency)
• 4+ years of relevant experience in PySpark and Azure Databricks.
• Proficiency in integrating, transforming, and consolidating data from various structured and unstructured data sources.
• Good experience in SQL or native SQL query languages.
• Strong experience in implementing Databricks notebooks using Python.
• Good experience in Azure Data Factory, ADLS, Storage Services, Synapse, serverless architecture, and Azure Functions.
• Strong knowledge of data management principles.
• Experience in Azure DevOps and CI/CD deployments.
• Strong problem-solving and analytical skills.
• Experience with Agile development methodologies.
Microsoft Azure certifications (such as DP-203) are preferred.
Expectations from the Role:
This role will provide an opportunity for the candidate to work with a team of developers and designers responsible for developing applications on the Azure Data tech stack.
Ideal candidates are creative problem solvers, self-motivated and tenacious when tackling tough issues, and will contribute to process improvements by bringing new ideas and techniques to the design and development process.
Collaborate with the project data team members on projects and tasks, and pursue self-driven learning of new technologies.
Essential Skills: Data Engineers. Must-have skills: Python, PySpark, MongoDB. Good-to-have skills: Azure Data Factory.
Location: Chennai, Hyderabad, Bangalore
Skills: Digital: Python; Digital: MongoDB
Experience Required: 6-8
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Location: Chennai
Skills: SQL, Python (basics), commercial pharma analytics (IQVIA datasets, claims datasets, Rx-related data knowledge), Power BI (basics), effective communication
Experience Required: 3-7 years
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Descriptions: Big Data Lead (overall 10+ years & relevant 8+ years mandatory)
JD: Strong candidates needed in Big Data & PySpark.
Key Responsibilities
1. Big Data Engineering & Development
• Design and develop distributed data processing pipelines using Spark, Hadoop, Kafka, and related technologies.
• Build real-time and batch data processing systems with high reliability and performance.
• Develop efficient ETL/ELT workflows for diverse structured and unstructured data sources.
• Optimize data pipelines for scalability, performance, and cost efficiency.
2. Architecture & System Design
• Contribute to the architecture of data lakes, data warehouses, and lakehouse solutions.
• Implement data models, schema design, and best practices for large-scale data management.
• Ensure system reliability using fault-tolerant and distributed computing principles.
3. Cloud & Platform Engineering
• Build and manage solutions on AWS / Azure / GCP (e.g., EMR, Databricks, HDInsight, BigQuery, Snowflake).
• Use orchestration tools such as Airflow, Azure Data Factory, AWS Glue, etc.
• Implement CI/CD workflows for automated deployment of data jobs.
4. Data Quality, Security & Governance
• Integrate data validation, monitoring, and alerting into pipelines.
• Ensure adherence to data governance, privacy, and security frameworks.
• Collaborate with data stewards, architects, and business stakeholders.
5. Collaboration & Leadership
• Mentor junior developers and guide best practices.
• Work closely with data scientists, analysts, and product teams to enable advanced analytics and ML workloads.
• Participate in code reviews, technical discussions, and roadmap planning.
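The ETL/ELT responsibilities above center on parse-filter-aggregate stages. A minimal sketch of one such batch stage in plain Python for self-containment (in the actual role this logic would run as Spark transformations; the record fields "region" and "revenue" are illustrative assumptions):

```python
import json
from collections import defaultdict

def etl_batch(lines):
    """One batch ETL stage: parse JSON-lines records, drop malformed rows,
    and aggregate revenue per region. The field names are assumptions;
    a Spark job would express the same steps as map/filter/groupBy."""
    totals = defaultdict(float)
    for line in lines:
        try:
            rec = json.loads(line)
            totals[rec["region"]] += float(rec["revenue"])
        except (json.JSONDecodeError, KeyError, ValueError):
            continue  # in practice, route bad records to a quarantine path
    return dict(totals)

raw = [
    '{"region": "APAC", "revenue": 120.5}',
    '{"region": "EMEA", "revenue": 80.0}',
    'not-json',                               # malformed row, skipped
    '{"region": "APAC", "revenue": 30.0}',
]
print(etl_batch(raw))  # {'APAC': 150.5, 'EMEA': 80.0}
```

Quarantining rather than silently dropping bad records is what connects this stage to the data quality and monitoring responsibilities listed in section 4.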
Location: Bangalore, Chennai
Skills: Digital: Big Data and Hadoop Ecosystems; Digital: PySpark
Experience Required: 10+ years (relevant 8+ years mandatory)
Salary: Rs. 90,000 - Rs. 1,65,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Analyst
Work Location: Pune, Gurgaon, Chennai, Bangalore, Hyderabad
Duration: 6 months
Skills Required: Business Analysis; Digital: Python; Data Concepts & Data Modelling; Data Analytics & Insights: ignio AIOps; MySQL
Relevant Experience Range in Required Skills: 8 to 10 Years
Job Description:
1. An understanding of data analysis tools and methodologies with the ability to develop business and technical solutions.
2. Good knowledge of the Data Modelling, Data Controls Assurance.
3. Strong influencing skills, with the ability to present complex data analysis in a clear and concise manner to gain buy-in from stakeholders.
4. An awareness of Financial Crime regulatory frameworks.
5. Excellent communication and collaboration skills with internal and external stakeholders.
6. Experience in managing action plans to address operational risk.
7. Good knowledge of common banking products and how they are used.
8. Ability to interpret business and data requirements that drive the business solution.
9. Experienced in SQL with ability to interpret data models.
10. Proficient in creating supporting information e.g., presentations for screen sharing during complex discussions with parties across multiple locations.
11. Good at problem solving and proactive in engaging with others to ascertain the full impact of defects and changes, identifying interdependencies.
12. Experience in documenting requirements, user stories, solution options, functional design, and data mapping.
Salary: Rs. 90,000 - Rs. 1,65,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Desired Competencies (Technical/Behavioral Competency)
• 4+ years of relevant experience in PySpark and Azure Databricks.
• Proficiency in integrating, transforming, and consolidating data from various structured and unstructured data sources.
• Good experience in SQL or native SQL query languages.
• Strong experience in implementing Databricks notebooks using Python.
• Good experience in Azure Data Factory, ADLS, Storage Services, Synapse, serverless architecture, and Azure Functions.
• Strong knowledge of data management principles.
• Experience in Azure DevOps and CI/CD deployments.
• Strong problem-solving and analytical skills.
• Experience with Agile development methodologies.
Microsoft Azure certifications (such as DP-203) are preferred.
Expectations from the Role:
This role will provide an opportunity for the candidate to work with a team of developers and designers responsible for developing applications on the Azure Data tech stack.
Ideal candidates are creative problem solvers, self-motivated and tenacious when tackling tough issues, and will contribute to process improvements by bringing new ideas and techniques to the design and development process.
Collaborate with the project data team members on projects and tasks, and pursue self-driven learning of new technologies.
Location: Mumbai, Kolkata, New Delhi, Chennai, Pune, Hyderabad, Bangalore
Skills: Digital: Databricks
Experience Required: 4-6
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Requirement ID: 10528689
Job Title: Developer
Work Location: Pune, Hyderabad , Bangalore, Chennai, Kochi, Bhubaneshwar
Duration: 6 months
Skills Required: Digital : Python
Experience Range in Required Skills: 8 to 10 Years
Job Description: Python Developers
Responsibility of / Expectations from the Role
1 5+ years' experience as a Python Developer / Designer.
2 Researching, designing, implementing, and managing software programs.
3 Used Python for data engineering (Processing data).
4 Experience in processing structured/unstructured/semi-structured data using Python (e.g., CSV, XML, HTML, SQL, and JSON).
5 Testing and evaluating new programs.
6 Knowledge or Working with Pandas and Boto3 in Python is a Plus.
7 Identifying areas for modification in existing programs and subsequently developing these modifications.
8 Write effective, scalable code
9 Develop back-end components to improve responsiveness and overall performance
10 Implement security and data protection solutions
Note: Full-stack developers, or candidates who use Python primarily for web/API development (e.g., with Django), are not a fit for this role.
Good-to-Have: Pandas and Boto3.
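The structured-data processing in item 4 can be sketched with the standard library alone (Pandas and Boto3, the good-to-haves, would replace this in an AWS data-engineering setting; the sample fields are assumptions):

```python
import csv
import io
import json

def csv_to_records(csv_text):
    """Parse CSV text into a list of dictionaries - the kind of
    structured-to-semi-structured conversion the role describes.
    csv.DictReader uses the header row as the dictionary keys."""
    return list(csv.DictReader(io.StringIO(csv_text)))

sample = "id,name,amount\n1,widget,9.99\n2,gadget,4.50\n"
records = csv_to_records(sample)
print(json.dumps(records, indent=2))  # serialize the records as JSON
```

With Pandas the same conversion would be `pd.read_csv(...).to_dict("records")`; Boto3 would come in when the CSV lives in S3 rather than memory.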
Salary: Rs. 90,000 - Rs. 1,65,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
RR- 00065203671
Years of exp: 6 to 14 years
Notice: Immediate joiners
Location: Chennai, Hyderabad, Coimbatore
Job Title: Mainframe Developer (Mid-Level)
Experience: 7 – 12 Years
Work Model: 5 Days Return to Office (RTO)
Job Summary
We are seeking a highly skilled Mainframe Developer with 7 to 12 years of experience to support, enhance, and maintain enterprise-scale mainframe applications. The ideal candidate will have strong technical expertise in mainframe technologies and the ability to work effectively in a collaborative, onsite environment as part of a critical business delivery team.
Key Responsibilities
• Design, develop, test, and maintain mainframe applications using COBOL, JCL, and related technologies.
• Perform analysis, coding, debugging, and unit testing for mainframe-based systems.
• Work on batch and online applications, ensuring performance, stability, and scalability.
• Participate in application maintenance, enhancements, and production support.
• Analyze technical requirements and translate them into high-quality, efficient solutions.
• Collaborate with cross-functional teams including business analysts, QA, and infrastructure teams.
• Ensure adherence to coding standards, documentation, and compliance guidelines.
• Support incident management, problem resolution, and root cause analysis.
• Participate in code reviews and provide technical guidance to junior team members when required.
Required Skills & Qualifications
• 7–12 years of hands-on experience in Mainframe application development.
• Strong proficiency in:
o COBOL (Batch & Online)
o JCL
o CICS
o DB2
o VSAM
• Experience with end-to-end SDLC and production support activities.
• Solid understanding of mainframe utilities and performance tuning.
• Experience working in Agile or Waterfall development methodologies.
• Strong problem-solving and analytical skills.
• Excellent communication and collaboration abilities.
Preferred Skills
• Exposure to mainframe modernization initiatives.
• Experience with version control tools.
• Knowledge of scheduling tools such as CA7 / Control-M is an added advantage.
• Banking, insurance, or financial domain experience preferred.
Work Environment
• 5 Days Work From Office (RTO) is mandatory.
• Fast-paced enterprise delivery environment with a strong focus on quality and reliability.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance