Job Description:
Must-Have: Informatica PowerCenter 9.x/10.x, Unix
Good-to-Have: RDBMS (Netezza, SQL Server, Oracle); AutoSys or any other job scheduler
Responsibilities / Expectations from the Role
1. Requirement gathering; preparing technical mapping documents and ETL code.
2. Design, develop, and document ETL in Informatica.
3. Conduct code reviews and deployments into various environments: DIT, SIT, UAT, and PROD.
4. Support testing cycles in SIT and UAT.
5. Post-production support and transitions.
6. Expertise in designing and developing Java/J2EE-based applications.
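The "technical mapping document" in item 1 is the contract that drives the ETL: each source column maps to a target column plus a conversion rule. Informatica expresses this graphically, but the idea can be sketched in Python as an illustrative stand-in (all column names and rules below are hypothetical examples, not from any real mapping):

```python
# Sketch of a mapping-document-driven transform: each source column is
# renamed and converted per a hypothetical technical mapping document.

MAPPING = {
    # source_col: (target_col, conversion)
    "cust_id":   ("CUSTOMER_ID", int),
    "cust_name": ("CUSTOMER_NAME", str.strip),
    "bal_amt":   ("BALANCE", float),
}

def apply_mapping(row: dict) -> dict:
    """Apply the mapping document to one source row."""
    return {target: fn(row[src]) for src, (target, fn) in MAPPING.items()}

source_row = {"cust_id": "42", "cust_name": "  Acme Corp ", "bal_amt": "99.50"}
print(apply_mapping(source_row))
# {'CUSTOMER_ID': 42, 'CUSTOMER_NAME': 'Acme Corp', 'BALANCE': 99.5}
```

Keeping the mapping as data rather than code is what makes the document reviewable in step 3: the code review can check the mapping against the spec line by line.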
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
You will enable full-stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve performing continuous testing for security, API, and regression suites, creating automation strategies, and supporting data and environment configurations. You will also participate in code reviews, monitor and report defects, and engage in continuous-improvement activities for the end-to-end testing process, ensuring that the highest quality standards are met throughout the project lifecycle.
Key skills: Data Analysis (SQL and visualization; visualization skills are a low priority, but you must be good with SQL); knowledge of or exposure to the Azure ecosystem, especially ADF, Azure Databricks, and Python; understanding of building data products (data mapping, transformation both business and technical, and validation of data products for data quality and adherence to specs).
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in and contribute to team discussions.
- Contribute solutions to work-related problems.
- Develop and implement automated testing scripts to enhance testing efficiency.
- Collaborate with cross-functional teams to ensure seamless integration of testing processes.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Analytics.
- Strong analytical skills to interpret complex data sets.
- Experience with data visualization tools to present findings effectively.
- Familiarity with automation testing frameworks and tools.
- Ability to troubleshoot and resolve issues in testing environments.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Analytics.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
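The "validation of data products" skill above is largely SQL: checks for row counts, nulls, and spec adherence. A minimal sketch, using Python's built-in sqlite3 as a stand-in for the actual warehouse (table and column names are hypothetical):

```python
# Sketch of SQL-based data-quality validation: row counts and null checks
# against a hypothetical orders table, using in-memory sqlite3.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 10.0, "EU"), (2, None, "US"), (3, 5.5, None)])

# Spec-adherence checks a data-product owner might run:
row_count = con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
null_amounts = con.execute(
    "SELECT COUNT(*) FROM orders WHERE amount IS NULL").fetchone()[0]
missing_region = con.execute(
    "SELECT COUNT(*) FROM orders WHERE region IS NULL").fetchone()[0]

print(row_count, null_amounts, missing_region)  # 3 1 1
```

The same queries run unchanged on most RDBMS backends; only the connection setup differs.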
Salary: Rs. 0.0 - Rs. 1,46,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
The Senior .NET Core API Development Specialist will play a pivotal role in designing, developing, and implementing robust solutions using .NET Core technologies. This role requires extensive experience implementing API security (OAuth 2.0, PKCE, and mTLS authentication and authorization), a strong understanding of API development, and proven expertise in developing APIs end to end. The ideal candidate will lead technical discussions, mentor junior team members, and ensure the delivery of high-quality, scalable, and secure .NET Core APIs.
Responsibilities:
• Build and maintain the API metadata knowledge base.
• Build and implement API authentication and authorization using .NET Core.
• Build and maintain OpenAPI Specification documents.
• Write NUnit test cases.
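The PKCE piece of the OAuth 2.0 requirement is small and precisely specified (RFC 7636): the client sends a hashed code_challenge, then proves possession of the original code_verifier at token exchange. A sketch of the pair generation, shown in Python for brevity since the server-side check is the same math regardless of the .NET implementation:

```python
# Sketch of the PKCE (RFC 7636) code_verifier / code_challenge pair that
# an OAuth 2.0 authorization-code flow with PKCE would validate.
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    # code_verifier: high-entropy random string of URL-safe characters
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # code_challenge (S256 method) = BASE64URL(SHA-256(verifier)), no padding
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
print(len(verifier), len(challenge))  # 43 43
```

At token exchange, the authorization server recomputes the challenge from the presented verifier and compares; a mismatch rejects the request.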
Salary: Rs. 0.0 - Rs. 1,00,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: INFYSYJP00002780/558676 - Senior .NET Core API Development Specialist - Pune, Mysore - EAIS
As a Custom Software Engineer, you will develop custom software solutions to design, code, and enhance components across systems or applications. Your typical day will involve collaborating with cross-functional teams to understand business requirements, and utilizing modern frameworks and agile practices to create scalable, high-performing solutions that meet specific business needs. You will engage in problem-solving discussions, contribute innovative ideas, and ensure the delivery of quality software that aligns with organizational goals.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in and contribute to team discussions.
- Contribute solutions to work-related problems.
- Assist in the documentation of software specifications and design.
- Engage in code reviews to ensure adherence to best practices and standards.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Spring MVC.
- Strong understanding of RESTful web services and API development.
- Experience with front-end technologies such as HTML, CSS, and JavaScript.
- Familiarity with database management systems and SQL.
- Knowledge of version control systems, particularly Git.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Spring MVC.
- This position is based at our Bengaluru office.
- A BE is required.
Salary: Rs. 0.0 - Rs. 1,45,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in and contribute to team discussions.
- Contribute solutions to work-related problems.
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
- Monitor and optimize data pipelines for performance and reliability.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in PySpark.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and processes.
- Familiarity with cloud platforms such as AWS or Azure.
- Knowledge of data warehousing concepts and technologies.
Additional Information:
- The candidate should have a minimum of 2 years of experience in PySpark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
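A core pattern behind "ETL processes to migrate data across systems" is the incremental load: only rows newer than the last recorded high-watermark are moved on each run. A minimal sketch in plain Python (in PySpark the same filter would be a DataFrame `.filter()`; the table contents and watermark value here are hypothetical):

```python
# Sketch of an incremental (watermark-based) load step: select only rows
# updated since the last run, then advance the watermark.

last_watermark = "2024-01-15"  # persisted from the previous pipeline run

source_rows = [
    {"id": 1, "updated_at": "2024-01-10"},
    {"id": 2, "updated_at": "2024-01-16"},
    {"id": 3, "updated_at": "2024-01-20"},
]

# ISO-8601 date strings compare correctly as plain strings.
new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
new_watermark = max(r["updated_at"] for r in new_rows)

print([r["id"] for r in new_rows], new_watermark)  # [2, 3] 2024-01-20
```

Persisting the watermark atomically with the load is what makes the pipeline safe to re-run, which is most of what "monitor and optimize for reliability" amounts to in practice.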
Salary: Rs. 0.0 - Rs. 1,10,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems, contributing to the overall efficiency and effectiveness of data management within the organization.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in and contribute to team discussions.
- Contribute solutions to work-related problems.
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions.
- Monitor and optimize data pipelines for performance and reliability.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in PySpark.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and processes.
- Familiarity with cloud platforms such as AWS or Azure.
- Knowledge of data warehousing concepts and technologies.
Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Salary: Rs. 0.0 - Rs. 1,300.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Key Responsibilities:
Provide 24x7 production support for Mainframe databases (such as DB2 for z/OS, IMS, VSAM, or IDMS) by participating in rotational shifts and on-call duties.
Administer, monitor, and maintain Mainframe databases in production and non-production environments.
Implement and manage backup, recovery, tuning, high-availability, and disaster recovery (DR) strategies.
Perform routine health checks, proactively identify performance bottlenecks, and optimize queries and database configurations.
Automate routine operations and reporting using Mainframe utilities, scripting languages, or job scheduling tools (JCL, REXX, Python, Shell, etc.).
Manage database security, user permissions, and compliance with organizational policies.
Coordinate and execute database patching, upgrades, and migrations with minimal downtime.
Collaborate with application and infrastructure teams to support deployment activities and resolve database-related issues.
Monitor and resolve incidents, including root cause analysis and documentation of resolutions.
Automate routine operational tasks using scripts or management tools.
Maintain accurate documentation of database procedures, configurations, and support processes for internal knowledge sharing.
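The "automate routine operations and reporting" item typically starts with log scanning: pulling abend codes out of job output and turning them into an incident summary. A sketch in Python, one of the scripting languages the role lists (the log lines and job names below are hypothetical):

```python
# Sketch of routine-operations automation: scan a hypothetical job log
# for abended jobs and summarize them for incident reporting.
import re

LOG = """\
JOB DBBKUP01 ENDED - RC=0000
JOB DBRORG02 ABENDED - S0C7
JOB DBCOPY03 ENDED - RC=0004
JOB DBLOAD04 ABENDED - SB37
"""

ABEND = re.compile(r"JOB (\S+) ABENDED - (\S+)")
incidents = ABEND.findall(LOG)

for job, code in incidents:
    print(f"incident: job={job} abend={code}")
print(f"{len(incidents)} abend(s) found")
```

The same scan feeds the root-cause-analysis step: each extracted (job, abend code) pair becomes the starting record for an incident writeup.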
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance