As a Custom Software Engineer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in problem-solving activities, providing insights and recommendations to enhance application performance and user experience, while maintaining a focus on quality and efficiency in your work.
Roles & Responsibilities:
- Expected to be a subject matter expert (SME).
- Collaborate with and manage the team to perform effectively.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in SailPoint IdentityIQ.
- Experience with identity governance and administration.
- Strong understanding of application integration techniques.
- Familiarity with security protocols and compliance standards.
- Ability to troubleshoot and resolve application issues efficiently.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SailPoint IdentityIQ.
- This position is based at our Chennai office.
- A minimum of 15 years of full-time education is required.
Salary : Rs. 0 - Rs. 1,80,000
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance
QA/Test
Primary Skill - Automated Testing
Secondary Skills - SQL, test frameworks
JD for Automation Testing:
We are seeking a Senior Automation Test Engineer with solid experience in Java-Selenium automation. The candidate should be well-versed in Selenium and Cucumber for BDD and experienced in automating backend validations involving databases.
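The "backend validations involving databases" part of this role typically means a BDD step that checks the database after a UI action. The stack named above is Java-Selenium with Cucumber; the sketch below shows only the validation pattern in Python, with `sqlite3` standing in for the application database and all table and step names hypothetical.

```python
import sqlite3

# Hypothetical stand-in for the application database; in the real stack this
# would be the backend DB reached from a Cucumber step definition via JDBC.
def setup_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
    return conn

def place_order(conn):
    # In a real test this step would drive the UI with Selenium; here we
    # insert directly to keep the sketch self-contained.
    cur = conn.execute("INSERT INTO orders (status) VALUES ('SUBMITTED')")
    conn.commit()
    return cur.lastrowid

def assert_order_status(conn, order_id, expected):
    # Backend validation step: verify the UI action actually landed in the DB.
    row = conn.execute(
        "SELECT status FROM orders WHERE id = ?", (order_id,)
    ).fetchone()
    assert row is not None, f"order {order_id} not found"
    assert row[0] == expected, f"expected {expected}, got {row[0]}"

conn = setup_db()
oid = place_order(conn)
assert_order_status(conn, oid, "SUBMITTED")
print("backend validation passed")
```

The same shape carries over to Java: the `Then` step opens a JDBC connection, runs a parameterized query, and asserts on the result set.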
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance
Generative AI / Ab Initio
Role Description:
Responsibilities:
- Create robust data pipelines (graphs, plans) using Ab Initio components for data extraction, transformation, and loading.
- Tune Ab Initio processes and leverage MFS (Multi-File System) for efficient processing and reduced run times.
- Implement data profiling, cleansing, validation, and error handling to ensure data integrity.
- Work with data architects, analysts, and business stakeholders to translate requirements into technical solutions.
- Debug production issues and resolve complex data-related problems in existing pipelines.
- Create and maintain technical designs and documentation, and participate in code reviews.
Skills:
- Expertise in core Ab Initio skills such as the Graphical Development Environment (GDE), Co>Operating System, and EME (Enterprise Meta Environment).
- Expertise in SQL and UNIX/Linux shell scripting.
- Expertise in ETL (Extract, Transform, Load), data warehousing, and data modeling.
- Knowledge of relational databases (Oracle, SQL Server).
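The data profiling, cleansing, validation, and error-handling responsibility above follows a common ETL pattern: route each record either downstream or to a reject port with a reason attached. Ab Initio graphs are not expressible in Python, so this is only a minimal sketch of that pattern with hypothetical field names and rules.

```python
# Validate-and-reject sketch, mirroring the reject-port pattern an Ab Initio
# component implements. Field names and rules are hypothetical.
def validate(record):
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("invalid amount")
    return errors

def run_etl(records):
    valid, rejects = [], []
    for rec in records:
        errs = validate(rec)
        if errs:
            # Reject port: keep both the record and the reasons, so failures
            # can be audited and reprocessed rather than silently dropped.
            rejects.append({"record": rec, "errors": errs})
        else:
            valid.append(rec)
    return valid, rejects

valid, rejects = run_etl([
    {"customer_id": "C1", "amount": 120.0},
    {"customer_id": "", "amount": 50.0},
    {"customer_id": "C3", "amount": -10},
])
print(len(valid), len(rejects))  # prints: 1 2
```

Keeping rejects as data rather than exceptions is what makes the pipeline restartable and auditable, which is the point of the "error handling to ensure data integrity" line above.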
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance
Python, Spark, SQL, and cloud data platforms, ML
Must-Have: Python, Spark, SQL, and cloud data platforms; Azure, Palantir Foundry; knowledge of data APIs and AI/ML pipelines.
Good-to-Have: Data Science.
Responsibilities / Expectations of the Role:
1. Build and maintain end-to-end ML pipelines (training, validation, model registry, deployment) using tools like Azure, Palantir Foundry or similar platforms.
2. Design and build scalable data pipelines for feature engineering, model training, and real-time/batch inference using Python, Spark, SQL, and cloud data platforms.
3. Operationalize AI/ML workflows by integrating models with data APIs, AI/ML pipelines, and production-grade ML engineering practices.
4. Optimize model performance and data quality through experimentation, validation, monitoring, and automated testing.
5. Collaborate with data scientists and MLOps engineers to translate prototypes into reliable, production-ready AI/ML systems.
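The end-to-end loop described in points 1-5 (train, validate, register, deploy for batch inference) can be sketched with the standard library alone. This is not how Azure ML, Palantir Foundry, or Spark would express it; the threshold "model", the registry dict, and the model name are deliberately trivial stand-ins that show only the stage boundaries.

```python
import statistics

# Stdlib-only sketch of a train -> validate -> register -> batch-infer loop.
# The threshold classifier and in-memory "registry" are stand-ins for a real
# model and model registry.
def train(features, labels):
    pos = [x for x, y in zip(features, labels) if y == 1]
    neg = [x for x, y in zip(features, labels) if y == 0]
    threshold = (statistics.mean(pos) + statistics.mean(neg)) / 2
    return {"threshold": threshold}

def predict(model, xs):
    return [1 if x >= model["threshold"] else 0 for x in xs]

def validate(model, features, labels):
    preds = predict(model, features)
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

registry = {}  # stand-in for a model registry

X, y = [1.0, 2.0, 8.0, 9.0], [0, 0, 1, 1]
model = train(X, y)
accuracy = validate(model, X, y)
if accuracy >= 0.9:  # promotion gate: only registered models get deployed
    registry["fraud-detector@v1"] = model

# Batch inference against the registered model.
batch_scores = predict(registry["fraud-detector@v1"], [0.5, 7.5])
print(accuracy, batch_scores)  # prints: 1.0 [0, 1]
```

The promotion gate between validation and registration is the structural point: deployment reads only from the registry, never from a freshly trained artifact.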
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance
Role Category : Programming & Design
Role : Python, Spark, SQL, cloud data platforms, ML
Denodo Developer
Data Virtualization / Denodo Developer
1. Design, develop, and implement data virtualization solutions using the Denodo Platform.
2. Create and manage data models, including views, joins, and transformations, to meet business requirements.
3. Develop and optimize complex VQL queries for optimal performance and data access.
4. Implement data security and access control policies within the Denodo environment.
5. Configure and manage Denodo components, such as:
   - Virtual DataPort Server: responsible for data access, query optimization, and caching.
   - Solution Manager: used for administration, monitoring, and metadata management.
   - Scheduler: automates data integration tasks and workflows.
6. Integrate Denodo with various data sources, including relational databases, NoSQL databases, web services, and flat files.
7. Collaborate with business analysts, data architects, and other stakeholders to understand data requirements and translate them into technical solutions.
8. Perform performance tuning and optimization of data virtualization solutions.
9. Troubleshoot and resolve any issues related to the Denodo Platform.
10. Stay abreast of the latest advancements in data virtualization technology.
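From the client side, Denodo exposes virtual views over ODBC/JDBC, so consuming a virtualization solution looks like ordinary parameterized SQL against a view. In the sketch below, `sqlite3` stands in for a Denodo connection (a real client would use an ODBC/JDBC driver), and the virtual view name `bv_customer_orders` is hypothetical.

```python
import sqlite3

# sqlite3 stands in for a Denodo ODBC/JDBC connection; bv_customer_orders
# is a hypothetical virtual view. The client-side pattern is the same:
# parameterized SQL against a view, with federation handled server-side.
def fetch_top_customers(conn, limit=5):
    sql = """
        SELECT customer_id, SUM(amount) AS total
        FROM bv_customer_orders
        GROUP BY customer_id
        ORDER BY total DESC
        LIMIT ?
    """
    return conn.execute(sql, (limit,)).fetchall()

# Stand-in data; in Denodo the view would federate the underlying sources.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bv_customer_orders (customer_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO bv_customer_orders VALUES (?, ?)",
    [("C1", 100.0), ("C2", 250.0), ("C1", 75.0)],
)
print(fetch_top_customers(conn, limit=2))  # [('C2', 250.0), ('C1', 175.0)]
```

The value of the virtualization layer is that this query is unchanged whether `bv_customer_orders` is backed by one database or a federation of several; the optimization and caching responsibilities listed above live in the Virtual DataPort Server, not in client code.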
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance