JD: Java – Angular based web application development. Responsibilities include designing and developing high-volume, low-latency APIs for mission-critical systems and delivering high-availability, high-performance web applications.
• Collaborate with architects, product managers, and other stakeholders to understand requirements and translate them into technical designs.
• Implement scalable and maintainable code that adheres to best practices and coding standards.
• Ensure the security, scalability, and availability of microservices by implementing appropriate measures.
• Troubleshoot and debug issues across the full stack, from front-end to back-end components.
• Perform code reviews and provide constructive feedback to peers to maintain code quality.
• Mentor junior developers and provide technical guidance and support as needed.
• Contribute to all phases of the development lifecycle.
• Support continuous improvement by investigating alternative approaches and technologies.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description:
• Maintain and migrate DTS packages to SSIS with enhanced logging and error handling.
• Design and develop SSIS packages and optimize existing ones.
• Perform SQL Server version upgrades (e.g., 2008 to 2016/2019), focusing on:
   • Application-level changes (T-SQL rewrites, deprecated feature handling).
   • Refactoring stored procedures, triggers, views, and functions for compatibility.
   • Updating connection strings, drivers, and linked server configurations.
• Conduct impact analysis for version differences (syntax changes, behavior changes); a scan sketch follows this list.
• Execute regression testing and application queries post-migration.
• Collaborate with QA teams for data validation and performance benchmarking.
• Document migration steps, compatibility issues, and resolution strategies.
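To make the impact-analysis bullet concrete, below is a minimal sketch (Python with pyodbc) that scans stored procedures, views, triggers, and functions for a few T-SQL constructs deprecated or removed between SQL Server 2008 and 2016/2019. The connection string, database name, and pattern list are illustrative assumptions, not part of the role's mandated tooling.

```python
# Sketch only: scan programmable objects for deprecated T-SQL patterns ahead of an upgrade.
# The connection string values and the pattern list are illustrative assumptions.
import re
import pyodbc

DEPRECATED_PATTERNS = {
    r"\*=|=\*": "Old-style outer join syntax (*= / =*), unsupported above the 2008 compatibility levels",
    r"\btext\b|\bntext\b|\bimage\b": "TEXT/NTEXT/IMAGE types (deprecated; prefer VARCHAR(MAX)/NVARCHAR(MAX)/VARBINARY(MAX))",
    r"\bRAISERROR\s+\d": "Legacy RAISERROR integer syntax (use RAISERROR(msg, severity, state) or THROW)",
    r"\bsp_dboption\b": "sp_dboption (removed; use ALTER DATABASE)",
}

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=legacy-sql-2008;DATABASE=AppDb;Trusted_Connection=yes;"  # illustrative values
)

# sys.sql_modules holds the definition text of procedures, views, triggers, and functions.
rows = conn.cursor().execute(
    "SELECT OBJECT_SCHEMA_NAME(object_id), OBJECT_NAME(object_id), definition FROM sys.sql_modules"
).fetchall()

for schema, name, definition in rows:
    for pattern, description in DEPRECATED_PATTERNS.items():
        if re.search(pattern, definition or "", re.IGNORECASE):
            print(f"{schema}.{name}: {description}")
```

The output of such a scan feeds directly into the T-SQL rewrite and regression-testing steps listed above.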
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Design and build robust, scalable ETL/ELT pipelines using PySpark to ingest data from diverse sources (databases, logs, APIs, files).
• Transform and curate raw transactional and log data into analysis-ready datasets in the Data Hub and analytical data marts.
• Develop reusable and parameterized Spark jobs for batch and micro-batch processing (see the sketch after this list).
• Optimize performance and scalability of PySpark jobs across large data volumes.
• Ensure data quality, consistency, lineage, and proper documentation across ingestion flows.
• Collaborate with Data Architects, Modelers, and Data Scientists to implement ingestion logic aligned with business needs.
• Work with cloud-based data platforms (e.g., AWS S3, Glue, EMR, Redshift) for data movement and storage.
• Support version control, CI/CD, and infrastructure-as-code where applicable.
• Participate in Agile ceremonies and contribute to sprint planning, story grooming, and demos.
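A minimal sketch of the kind of reusable, parameterized PySpark batch job referenced above is shown below; the source/target paths, column names, and partitioning scheme are illustrative assumptions rather than a prescribed design.

```python
# Minimal sketch of a parameterized PySpark batch ingestion job.
# Paths, column names, and the partitioning scheme are illustrative assumptions.
import argparse

from pyspark.sql import SparkSession, functions as F


def run(source_path: str, target_path: str, run_date: str) -> None:
    spark = SparkSession.builder.appName("curated-ingest").getOrCreate()

    # Read raw transactional/log data for a single run date (micro-batch friendly).
    raw = spark.read.parquet(source_path).where(F.col("event_date") == run_date)

    # Light curation: deduplicate, normalize timestamps, drop obviously bad rows.
    curated = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .filter(F.col("event_id").isNotNull())
    )

    # Write an analysis-ready, partitioned dataset into the Data Hub / mart zone.
    # Note: plain "overwrite" with partitionBy replaces the whole target path.
    curated.write.mode("overwrite").partitionBy("event_date").parquet(target_path)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--source-path", required=True)   # e.g. s3://raw-zone/transactions/
    parser.add_argument("--target-path", required=True)   # e.g. s3://data-hub/curated/transactions/
    parser.add_argument("--run-date", required=True)      # e.g. 2024-01-31
    args = parser.parse_args()
    run(args.source_path, args.target_path, args.run_date)
```

The same job can serve daily batch or micro-batch runs simply by narrowing the --run-date window passed by the scheduler.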
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Your professional experience
Degree in IT / computer science or information security
Experience in an IT Security Operations role or similar
Minimum 2 years of experience supporting privileged access management solutions (CyberArk/ Thycotic)
Basic industry certifications desirable, e.g., CompTIA Security+, etc.
Good to have at least one PAM tool-specific certification (e.g., CyberArk, Thycotic, BeyondTrust, etc.)
Develop automation and/or scripting to augment PAM solutions as required (see the sketch after this list)
Highly self-directed, with keen attention to detail
Strong communication, project and time management skills
Ability to effectively prioritise tasks in a high-pressure environment
Flexible and adaptable in regard to learning and understanding new technologies
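As an illustration of the automation/scripting expectation above, the sketch below retrieves a privileged credential from a PAM REST endpoint at runtime instead of hard-coding it. The endpoint path, query parameters, and response fields are hypothetical placeholders; a real CyberArk, Thycotic, or BeyondTrust integration would follow that vendor's documented API.

```python
# Hypothetical sketch: fetch a privileged credential from a PAM REST endpoint at runtime
# instead of hard-coding it. The URL, endpoint path, and response field are placeholders;
# a real CyberArk/Thycotic/BeyondTrust integration would use that vendor's documented API.
import os

import requests


def get_privileged_password(account_name: str) -> str:
    base_url = os.environ["PAM_API_URL"]      # e.g. https://pam.example.com/api (placeholder)
    api_token = os.environ["PAM_API_TOKEN"]   # token issued by the PAM platform

    response = requests.get(
        f"{base_url}/accounts/{account_name}/password",   # placeholder endpoint
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=10,
        verify=True,  # keep TLS verification on for a credential API
    )
    response.raise_for_status()
    return response.json()["password"]        # placeholder response field


if __name__ == "__main__":
    # Example: retrieve a service-account password just-in-time for a maintenance script.
    secret = get_privileged_password("svc-backup")
    print("Retrieved credential of length", len(secret))
```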
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Key Skills Required (Must Have)
a. TypeScript/Python coding for backend APIs, optionally Node.js
b. AWS services – serverless with Lambda, API Gateway, Step Functions, SQS, S3, and databases (PostgreSQL); a minimal handler sketch follows this section
c. React.js – for the front end
d. CI/CD – GitHub Actions, AWS CDK/CloudFormation
e. Understanding of various AWS services
Additional skills:
• Solid understanding of AWS cloud architecture
• Database design expertise and hands-on experience working with various databases
• Knowledge of Agentic AI
• Familiarity with unit testing frameworks and best practices
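To illustrate skill (b), here is a minimal Python sketch of a Lambda handler behind an API Gateway proxy integration that stores a JSON payload in S3; the bucket name, environment variable, and key layout are illustrative assumptions (the role equally allows a TypeScript/Node.js implementation).

```python
# Minimal sketch: API Gateway (proxy integration) -> Lambda -> S3.
# Bucket name, environment variable, and key layout are illustrative assumptions.
import json
import os
import uuid

import boto3

s3 = boto3.client("s3")
BUCKET = os.environ.get("ORDERS_BUCKET", "example-orders-bucket")  # hypothetical bucket


def handler(event, context):
    """Accept a JSON body from API Gateway and store it as an object in S3."""
    try:
        payload = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    key = f"orders/{uuid.uuid4()}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload).encode("utf-8"))

    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"stored_as": key}),
    }
```

The same handler would typically be deployed via AWS CDK or CloudFormation as part of the CI/CD pipeline noted in skill (d).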
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Experience: 7+ years in data engineering or architecture, with at least [2-3] years in a leadership role (e.g., Lead Engineer, Principal Engineer).
• Cloud Expertise: Deep experience with cloud data platforms, Azure preferred, and a strong understanding of relevant services (e.g., AWS EMR, Azure Databricks, GCP BigQuery).
• Strong programming skills in Python, Scala, or Java, and expertise in SQL.
• Extensive experience with big data technologies such as Spark, Kafka, and Hadoop.
• Demonstrated expertise in data modeling (relational and dimensional) and data warehousing.
• Proven experience designing and implementing end-to-end, large-scale data platforms and data product frameworks.
• Excellent communication and interpersonal skills, with the ability to lead and mentor teams effectively.
• Strong strategic thinking and the ability to define a clear technical vision.
• Experience driving technical decisions and promoting best practices.
• Experience with Infrastructure as Code (IaC) tools like Terraform. (Preferred)
• Experience with orchestration tools like Apache Airflow or Google Cloud Composer (Preferred); a minimal DAG sketch follows this list.
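For the orchestration item above, the following minimal Apache Airflow (2.4+) sketch shows a daily ingest → transform → publish DAG; the DAG id, schedule, and task bodies are illustrative placeholders rather than a prescribed pipeline.

```python
# Minimal sketch of an Airflow DAG orchestrating a daily ingest -> transform -> publish flow.
# DAG id, schedule, and the task bodies are illustrative placeholders (Airflow 2.4+ assumed).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest(**context):
    print("pull raw data for", context["ds"])       # ds = logical run date


def transform(**context):
    print("build curated tables for", context["ds"])


def publish(**context):
    print("refresh data marts for", context["ds"])


with DAG(
    dag_id="daily_data_platform_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_publish = PythonOperator(task_id="publish", python_callable=publish)

    t_ingest >> t_transform >> t_publish
```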
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Oracle Financials Cloud technical consultant
Infosys is seeking an Oracle Financials Cloud technical consultant with Implementation and Application Support experience. The position’s primary responsibility will be to develop OTBI reports, FBDI templates, error reporting, and reconciliation around the GL, AR, AP, FA, and CM modules.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Oracle Financials Cloud technical consultant