Mandatory: Consumer Acquisition (Card and/or Bank) product experience
Designs, develops, and manages activities for a product or group of products, from product definition and planning through production, release, and end of life. Partners with stakeholder teams to enhance and refine product strategies and roadmaps. Serves as the central resource for design, process, development, test, quality, and marketing throughout all stages of a product's lifecycle. Manages the feature backlog by identifying and prioritizing features and capabilities through direct input from customers, analysts, developers, and architects, combined with knowledge of the domain. Facilitates resolution of risks, issues, and changes related to the product development lifecycle. Involvement includes modifications, upgrades, and maintenance of the product or product line.
Senior ServiceNow Developer/Team Lead with 7+ years of hands-on experience in ServiceNow delivery, including ITSM, Service Catalog, and Integrations; experience with custom applications and other modules is an added advantage.
• Proven expertise in ServiceNow platform architecture, with knowledge of Flow Designer, Process Automation Designer, and Performance Analytics for large-scale implementations.
• Strong technical background in ServiceNow development: Business Rules, Client Scripts, UI Policies, and REST/SOAP APIs, with a focus on automation and workflow optimization.
• Proven project leadership experience managing cross-functional teams, stakeholder relationships, and complex implementations while ensuring on-time delivery, budget compliance, and quality standards across multiple concurrent projects.
• Strong problem-solving abilities with a track record of troubleshooting complex issues, performance optimization, system integration challenges, and user experience improvements.
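As a small illustration of the REST API integration work listed above, here is a minimal sketch of building a ServiceNow Table API query in Python. The instance name and query values are hypothetical placeholders; the `/api/now/table` path and `sysparm_*` parameters are standard Table API conventions.

```python
from urllib.parse import urlencode

def build_incident_query_url(instance: str, priority: int, limit: int = 10) -> str:
    """Build a ServiceNow Table API URL for querying active incidents.

    `instance` is a hypothetical instance name (e.g. "dev12345"); the
    /api/now/table/incident path is the standard Table API endpoint.
    """
    params = urlencode({
        # Encoded query: incidents at the given priority that are still active.
        "sysparm_query": f"priority={priority}^active=true",
        "sysparm_limit": limit,
        "sysparm_fields": "number,short_description,priority",
    })
    return f"https://{instance}.service-now.com/api/now/table/incident?{params}"

url = build_incident_query_url("dev12345", priority=1)
```

The returned URL would then be fetched with an HTTP client and basic or OAuth2 credentials, which are omitted here.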
Salary: Rs. 0 - Rs. 1,90,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve technical problems, ensuring that systems operate smoothly and efficiently. You will engage in proactive monitoring and maintenance of applications, contributing to the overall reliability and performance of business operations. Your role will also include documenting solutions and sharing knowledge with team members to enhance collective expertise and improve service delivery.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute solutions to work-related problems.
- Assist in the development and implementation of application support processes to enhance efficiency.
- Engage in continuous learning to stay updated with the latest technologies and best practices in application support.
Professional & Technical Skills:
- Must-have skills: proficiency in ServiceNow IT Service Management.
- Strong analytical and problem-solving skills to diagnose and resolve issues effectively.
- Experience with incident management and service request fulfillment processes.
- Familiarity with the ITIL framework and best practices in service management.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
Additional Information:
- The candidate should have a minimum of 3 years of experience in ServiceNow IT Service Management.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Salary: Rs. 0 - Rs. 1,32,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
We are looking for a Senior Data Science Engineer for our Advanced Analytics and Big Data team. The scope is to generate insights for data-driven decision making in the customer service/aftermarket domain.
Job Summary:
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. You will play a crucial role in our data ecosystem by working with cloud technologies to enable data accessibility, quality, and insights across the organization. This role requires expertise in Azure Databricks, Snowflake, and DBT.
Requirements:
• Bachelor’s in Computer Science, Data Engineering, or related field.
• Proficiency in Azure Databricks for data processing and pipeline orchestration.
• Experience with Snowflake as a data warehouse platform and DBT for transformations.
• Strong SQL skills and understanding of data modeling principles.
• Ability to troubleshoot and optimize data workflows.
Key Responsibilities:
• Data Pipeline Development: Design, build, and optimize data pipelines to ingest, transform, and load data from multiple sources, using Azure Databricks, Snowflake, and DBT.
• Data Architecture: Develop and manage data models within Snowflake, ensuring efficient data organization and accessibility.
• Data Transformation: Implement transformations in DBT, standardizing data for analysis and reporting.
• Performance Optimization: Monitor and optimize pipeline performance, troubleshooting and resolving issues as needed.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven projects and provide access to reliable, well-structured data.
Qualifications:
• Relevant experience in MS Azure, Snowflake, DBT, and Big Data/Hadoop ecosystem components.
• Understanding of Hadoop architecture and its underlying framework, including storage management.
• Strong understanding of, and implementation experience in, Hadoop, Spark, and Hive/Databricks.
• Expertise in implementing data lake solutions using both Scala and Python.
• Expertise with orchestration tools such as Azure Data Factory.
• Strong SQL and programming skills.
• Experience with Databricks is desirable.
• Understanding of or implementation experience with CI/CD tools such as Jenkins, Azure DevOps, and GitHub is desirable.
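To make the ingest-transform-load pattern above concrete, here is a minimal local sketch using SQLite in place of Snowflake: raw rows land in a staging table, then a SQL transformation materializes a cleaned model, the same shape a DBT model expresses as a SELECT. Table and column names are illustrative only.

```python
import sqlite3

# In-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [(1, 120.0, "complete"), (2, 80.0, "cancelled"), (3, 200.0, "complete")],
)

# Transformation step: keep only completed orders, standardized for reporting.
conn.execute("""
    CREATE TABLE fct_orders AS
    SELECT order_id, amount
    FROM stg_orders
    WHERE status = 'complete'
""")
rows = conn.execute("SELECT COUNT(*), SUM(amount) FROM fct_orders").fetchone()
# rows -> (2, 320.0): two completed orders totaling 320.0
```

In a real pipeline the staging load would come from Azure Databricks or an ingestion tool, and the transformation would live in a versioned, tested DBT project.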
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Summary
We are seeking an experienced Business Intelligence (BI) professional with 7–8 years of hands-on experience in designing, developing, and maintaining BI solutions. The ideal candidate should have strong working knowledge of Power BI and other BI tools such as IBM Cognos, with a solid foundation in data modeling, reporting, and analytics to support business decision-making.
Key Responsibilities
Design, develop, and deploy interactive dashboards and reports using Power BI.
Develop and maintain enterprise BI solutions using IBM Cognos (Report Studio, Framework Manager).
Collaborate with business stakeholders to understand reporting requirements and translate them into effective BI solutions.
Perform data modeling, transformation, and optimization using Power BI (Power Query, DAX).
Build and manage semantic layers, datasets, and data sources to ensure consistency and accuracy.
Ensure data quality, performance tuning, and optimization of BI reports and dashboards.
Integrate data from multiple sources including databases, data warehouses, APIs, and flat files.
Support UAT, production deployments, and ongoing enhancements of BI solutions.
Provide documentation, training, and knowledge transfer to end users and support teams.
Follow best practices for BI governance, security (row-level security), and version control.
Required Skills & Qualifications
7–8 years of experience in Business Intelligence and Analytics.
Strong expertise in Power BI (Desktop, Service, DAX, Power Query).
Hands-on experience with IBM Cognos (Report Studio, Framework Manager).
Strong understanding of data warehousing concepts, star/snowflake schemas.
Proficient in SQL for data extraction, transformation, and analysis.
Experience working with large and complex datasets.
Ability to analyze business requirements and provide data-driven insights.
Strong communication skills with the ability to work with cross-functional teams.
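The star-schema and SQL requirements above can be sketched with a toy example: one fact table keyed to a product dimension, queried the way a BI report would. SQLite stands in for the warehouse, and all table and column names are illustrative, not from any real system.

```python
import sqlite3

# Toy star schema: a sales fact table joined to a product dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INT PRIMARY KEY, category TEXT);
    CREATE TABLE fct_sales (sale_id INT, product_id INT, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'Hardware'), (2, 'Software');
    INSERT INTO fct_sales VALUES (10, 1, 500.0), (11, 2, 300.0), (12, 1, 200.0);
""")

# A typical BI query: join the fact to the dimension and aggregate revenue
# per category -- the same rollup a Power BI or Cognos report would show.
result = conn.execute("""
    SELECT p.category, SUM(s.revenue)
    FROM fct_sales s JOIN dim_product p ON s.product_id = p.product_id
    GROUP BY p.category ORDER BY p.category
""").fetchall()
# result -> [('Hardware', 700.0), ('Software', 300.0)]
```

A snowflake schema differs only in that dimensions are further normalized into sub-dimension tables.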
Preferred Skills
Experience with cloud platforms (Azure, AWS, or GCP) and cloud data sources.
Exposure to other BI tools such as Tableau, Qlik, or similar.
Knowledge of ETL tools and data pipelines.
Experience in Agile/Scrum delivery environments.
Manufacturing, automotive, or enterprise reporting experience is a plus.
Educational Qualification
Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Analytics, or a related field.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Description
We are looking for a Senior Data Engineer for our Advanced Analytics and Big Data team. The scope is to generate insights for data-driven decision making in the customer service/aftermarket domain.
Job Summary:
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. You will play a crucial role in our data ecosystem by working with cloud technologies to enable data accessibility, quality, and insights across the organization. This role requires expertise in Azure Databricks, Snowflake, and DBT.
Requirements:
• Bachelor's in Computer Science, Data Engineering, or a related field.
• Proficiency in Azure Databricks for data processing and pipeline orchestration.
• Experience with Snowflake as a data warehouse platform and DBT for transformations.
• Strong SQL skills and understanding of data modeling principles.
• Ability to troubleshoot and optimize data workflows.
Key Responsibilities:
• Data Pipeline Development: Design, build, and optimize data pipelines to ingest, transform, and load data from multiple sources using Azure Databricks, Snowflake, and DBT.
• Data Architecture: Develop and manage data models within Snowflake, ensuring efficient data organization and accessibility.
• Data Transformation: Implement transformations in DBT, standardizing data for analysis and reporting.
• Performance Optimization: Monitor and optimize pipeline performance, troubleshooting and resolving issues as needed.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven projects and provide access to reliable, well-structured data.
Qualifications:
• Relevant experience in MS Azure, Snowflake, DBT, and Big Data/Hadoop ecosystem components.
• Understanding of Hadoop architecture and its underlying framework, including storage management.
• Strong understanding of, and implementation experience in, Hadoop, Spark, and Hive/Databricks.
• Expertise in implementing data lake solutions using both Scala and Python.
• Expertise with orchestration tools such as Azure Data Factory.
• Strong SQL and programming skills.
• Experience with Databricks is desirable.
• Understanding of or implementation experience with CI/CD tools such as Jenkins, Azure DevOps, and GitHub is desirable.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Experience and Skills:
• Experience in data engineering, specifically with Snowflake and DBT.
• Key skills include strong hands-on experience with DBT: expertise in building and managing data models using DBT commands, Jinja macros, and configurations, and proficiency in developing and managing DBT projects, testing, and documentation.
• Strong SQL proficiency, including advanced concepts, is essential.
• Experience with Snowflake's architecture and optimizing SQL queries for the platform is also necessary.
• A solid understanding of data warehousing architectures and ETL/ELT processes, data transformation, and quality is expected.
• Proficiency with cloud platforms like AWS, Azure, or GCP and experience with version control systems like Git are often required.
• Familiarity with CI/CD pipelines and workflow management tools is also beneficial.
• Excellent problem-solving, communication, and collaboration skills are vital, as is the ability to work in an agile environment.
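As a much-simplified sketch of the DBT Jinja-macro skills listed above, the snippet below mimics how DBT's `ref()` macro resolves a model name into a qualified relation during compilation. The schema names and the single-macro renderer are illustrative stand-ins, not DBT's actual implementation.

```python
import re

# Hypothetical model registry: model name -> qualified schema.
MODELS = {"stg_orders": "analytics.staging", "fct_orders": "analytics.marts"}

def ref(model_name: str) -> str:
    """Resolve a model name to a qualified relation, as {{ ref('...') }} does."""
    return f"{MODELS[model_name]}.{model_name}"

def compile_sql(template: str) -> str:
    # Stand-in for Jinja rendering: substitute {{ ref('name') }} calls only.
    return re.sub(r"\{\{\s*ref\('(\w+)'\)\s*\}\}",
                  lambda m: ref(m.group(1)), template)

compiled = compile_sql("SELECT * FROM {{ ref('stg_orders') }}")
# compiled -> "SELECT * FROM analytics.staging.stg_orders"
```

Real DBT performs this resolution (plus dependency-graph construction) with full Jinja rendering, which is why model SQL never hard-codes schema names.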
--------------------------------------
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Design, Develop, Document, Test and Implement
• Build scalable cloud-native Spring Boot microservices and RESTful APIs.
• Ensure code quality with unit/integration tests (JUnit) and static analysis (Sonar).
• Good understanding of non-functional requirements.
• Design data models and access layers for SQL databases (e.g., SQL Server/PostgreSQL) with performance and reliability in mind.
• Containerize services with Docker and deploy to Kubernetes (AKS).
• Implement logging, metrics, and distributed tracing (Datadog/Prometheus).
• Apply security best practices (OAuth2/OIDC, secrets management, TLS, dependency scanning).
• Build pipelines in GitLab/Jenkins for build, test, security scans, containerization, and progressive delivery (blue/green or canary).
• Good knowledge of automated rollbacks, versioning, and release governance.
• Participate in backlog grooming, estimation, and agile ceremonies.
• Produce high-quality technical documentation (API specs, architecture diagrams).
• Troubleshoot production issues and contribute to SRE practices (SLIs/SLOs/error budgets) in partnership with platform teams.
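The SLI/SLO/error-budget item above can be illustrated with a small calculation. For a request-based availability SLO, the error budget is the fraction of requests allowed to fail; this sketch (with made-up numbers) computes how much of that budget remains.

```python
def error_budget_remaining(slo: float, total_requests: int,
                           failed_requests: int) -> float:
    """Fraction of the error budget still unspent for a request-based SLO.

    With a 99.9% availability SLO, the budget is 0.1% of all requests.
    Returns 1.0 when no budget is spent and <= 0.0 when it is exhausted.
    """
    allowed_failures = (1.0 - slo) * total_requests
    if allowed_failures == 0:
        return 0.0  # a 100% SLO leaves no budget to spend
    return 1.0 - (failed_requests / allowed_failures)

# Illustrative numbers: 99.9% SLO, 1M requests, 250 failures.
# Budget is 1,000 allowed failures, so 75% of the budget remains.
remaining = error_budget_remaining(slo=0.999, total_requests=1_000_000,
                                   failed_requests=250)
```

Teams typically gate risky releases on the remaining budget: when it approaches zero, feature rollouts pause in favor of reliability work.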
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance