• Design, architect, develop, and implement highly available, scalable, multi-region solutions within AWS Cloud.
• Work closely with application, engineering, security, and operations teams to engineer and build cloud-native services on AWS, including PaaS and IaaS solutions, within an agile, modern, enterprise-grade operating model.
• Maintain and develop the Infrastructure as Code repository using Terraform to deliver fully automated cloud infrastructure.
• Implement and maintain monitoring and alerting systems to detect issues proactively.
• Perform deployment, configuration, monitoring and maintenance of high availability enterprise solutions.
• Perform proactive system administration, monitoring, performance tuning, uptime management, alert notification, and automation tasks.
• Manage and administer the AWS cloud environment, including provisioning, configuration, performance monitoring, policy governance and security
• Design, develop, and implement highly available, multi-region solutions within AWS
• Analyze existing operational standards, processes, and/or governance to identify and implement improvements
• Migrate existing infrastructure services to AWS cloud-based solutions
• Manage security and access controls of AWS cloud-based solutions
• Develop infrastructure as code (IaC) leveraging cloud native tooling to ensure automated and consistent platform deployments
• Develop and implement policy driven data protection best practices to ensure cloud solutions are protected from data loss
• Support cloud adoption of applications as they are being transformed and/or modernized
• Ensure all infrastructure components meet proper performance and capacity standards
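The monitoring and alerting responsibility above typically translates into CloudWatch alarm definitions. As a minimal sketch (the alarm name and thresholds are illustrative assumptions; the field names follow the keyword arguments of boto3's `put_metric_alarm`):

```python
def cpu_alarm_spec(instance_id, threshold=80.0):
    """Build a CloudWatch alarm definition for high CPU on one EC2 instance.

    Illustrative sketch: field names match boto3's put_metric_alarm kwargs;
    the naming scheme and thresholds are assumptions, not a fixed standard.
    """
    return {
        "AlarmName": f"high-cpu-{instance_id}",   # hypothetical naming convention
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,                # evaluate over 5-minute windows
        "EvaluationPeriods": 3,       # alarm only after 15 minutes breached
        "Threshold": threshold,
        "ComparisonOperator": "GreaterThanThreshold",
    }

spec = cpu_alarm_spec("i-0123456789abcdef0")
print(spec["AlarmName"])
```

A deployment script would then pass this as `boto3.client("cloudwatch").put_metric_alarm(**spec)`; keeping the spec as plain data makes it easy to review and unit-test before anything touches AWS.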
Mandatory Technical Skills:
• Around 6 years of experience with AWS Cloud (IaaS, PaaS, Database) and Azure DevOps
• Cloud Architecture: Strong understanding of cloud architecture principles and best practices.
• 6+ years of experience with Infrastructure As Code using Terraform
• Advanced skills in Linux, networking, security, and Docker-based environments
• Knowledge of security best practices and compliance frameworks
• Scripting or programming languages are a plus (for example: PowerShell, shell, Python)
• Experience with AWS networking services (VPC, Direct Connect, Route 53, CloudFront)
• Network security implementation (Security Groups, NACLs, WAF)
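Security Group work like that described above usually means expressing least-privilege ingress rules as structured data. A small sketch of building one rule in the `IpPermissions` shape that boto3's `authorize_security_group_ingress` accepts (the CIDR ranges below are reserved documentation addresses, used here purely as placeholders):

```python
def ingress_rule(port, cidrs, description=""):
    """Build one TCP ingress permission in the IpPermissions shape
    used by boto3's authorize_security_group_ingress.

    Illustrative sketch: restricts a single port to an explicit
    allow-list of CIDR ranges (least privilege), rather than 0.0.0.0/0.
    """
    return {
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "IpRanges": [{"CidrIp": c, "Description": description} for c in cidrs],
    }

# HTTPS from two office ranges only (placeholder documentation CIDRs).
rule = ingress_rule(443, ["203.0.113.0/24", "198.51.100.0/24"], "office HTTPS")
print(len(rule["IpRanges"]))
```

The same structure reviews well in pull requests and can be asserted on in tests, which is why teams often generate these permission dicts from code or Terraform rather than editing them in the console.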
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Senior AWS Cloud Architect - Sudhakar Nagireddy
· 5–7 years of overall experience in Automation Testing using Tricentis Tosca
· Minimum 3–5 years of hands-on experience with the Tricentis Tosca automation tool
· Strong working knowledge of Tosca TestCase Design (TCD) and Tosca scripting concepts
· Experience with Test Data Service (TDS) and Test Data Management (TDM) in Tosca
· Ability to work with and manipulate external test data sources (e.g., Excel) for test case parameterization
· Experience in executing test cases using Distributed Execution (DEX)
· Hands-on experience in designing reusable, scalable, and maintainable automation test cases
· Good understanding of test case parameterization using external data sources
· Experience integrating Tosca with CI/CD pipelines (e.g., Jenkins, Azure DevOps)
· Ability to analyze requirements, design test scenarios, and automate them effectively
· Strong debugging and problem-solving skills in automation scripts
· Understand HRIT domains, business functionalities and technical landscapes of the applications in scope.
· Ability to understand the complex integration between various applications and perform end-to-end testing in a complex application landscape.
· Develop, run and maintain automated regression suite.
· Participate in various agile ceremonies such as sprint planning, backlog refinement, daily stand-up calls, sprint retrospective etc.
· Communicate regression/automation development status to program team.
· Proactive, strong-minded, and contribution-motivated, with adaptability to change and strong verbal and written communication skills.
Salary: Rs. 20,00,000 - Rs. 23,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Data Scientist - (260004IC)
Missions
We are seeking a seasoned Data Scientist with 6-7 years of professional experience. This role offers the opportunity to leverage expertise in statistical analysis and AI/ML to develop impactful solutions that align with our enterprise strategy. The Data Scientist will be deeply involved in the entire project lifecycle, from data preparation and exploratory analysis to model deployment, while collaborating with multidisciplinary teams to deliver scalable, measurable results.
Key Responsibilities:
- Develop and implement high-impact AI/ML use cases that support our organizational objectives.
- Communicate findings, insights, and methodologies clearly to non-technical stakeholders.
- Design, build, and optimize predictive models, classifiers, and regression algorithms using classical AI/ML techniques such as SVMs, Decision Trees, Random Forests, k-NN, Naive Bayes, and ensemble methods.
- Validate models with appropriate statistical and machine learning evaluation techniques.
- Apply strong statistical foundations, including distributions, hypothesis testing, regression analysis, and probability theory.
- Conduct thorough exploratory data analysis to uncover key trends, patterns, and anomalies.
- Ensure data quality and reliability through rigorous analytical practices.
- Support the ML lifecycle, including model design, infrastructure, production setup, monitoring, and release management, with basic familiarity in MLOps.
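One of the classical techniques named above, k-NN, is compact enough to sketch with the standard library alone. This is a toy illustration of the majority-vote idea, not a production implementation (real work would use scikit-learn with proper scaling and validation; the training points below are made up):

```python
from collections import Counter
from math import dist

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (features, label) pairs, features being tuples of
    floats. Distance is plain Euclidean via math.dist.
    """
    neighbors = sorted(train, key=lambda p: dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy 2-D data: two clusters, labels "a" and "b".
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, (0.05, 0.1)))  # → a
```

Even at this scale the key trade-off is visible: prediction cost grows with the training set (a full sort per query), which is why production k-NN relies on spatial indexes rather than brute force.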
Requirements:
- 6-7 years of hands-on experience in Data Science.
- Proven proficiency in statistical data analysis, machine learning, and natural language processing, with a strong understanding of practical constraints. (Must Have)
- Advanced skills in Python programming and SQL, utilizing relevant libraries for effective data analysis. (Must Have)
- Demonstrated experience in AI/ML solution development, including supervised and unsupervised learning algorithms, model evaluation, and feature engineering. (Must Have)
- Basic familiarity with MLOps and feature engineering methods for model workflows. (Basic Knowledge)
- Competency in software development methodologies and versioning tools. (Must Have)
- Experience with front-end visualization tools such as Streamlit or lightweight UI layers (Good to Have).
- Exposure to GenAI, including LLM integration, prompt engineering, model packaging, and lifecycle management (Preferred).
- Familiarity with agentic AI frameworks such as LangChain and LangGraph, and agent-based patterns. (Good to Have)
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project requirements are met, facilitating discussions to address challenges, and guiding the development process to deliver high-quality software solutions. You will engage with stakeholders to gather feedback and make necessary adjustments, ensuring that the applications align with user needs and business objectives. Your role will also include mentoring team members and fostering a collaborative environment to enhance productivity and innovation.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training sessions to enhance team skills and knowledge.
- Monitor project progress and ensure timely delivery of milestones.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Pega Platform.
- Strong understanding of application design principles and best practices.
- Experience with integration of Pega applications with other systems.
- Ability to troubleshoot and resolve technical issues effectively.
- Familiarity with Agile methodologies and project management tools.
- Should have knowledge in Application design and development from scratch.
- Strong knowledge in Case Management life cycle design.
- Must have knowledge of enterprise class structure, microservices, and component design and development.
- Good knowledge of security concepts such as authentication, authorization, AROs, ABAC, RBAC, and privileges.
- Strong knowledge of integrations, mainly SOAP, REST, File, and Kafka.
- Must have knowledge of Kafka, Queue Processors, Job Schedulers, Data Flows, Data Sets, and batch and real-time processing concepts.
- Good to have knowledge of CI/CD tools such as Jenkins and PDM; must have knowledge of pipeline setup for automating the deployment process.
- Good to have knowledge of PDC for monitoring alerts and Splunk for analyzing logs.
- Good to have DXI, DXC, and Constellation experience.
Salary: Rs. 0 - Rs. 3,80,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
ServiceNow QA professional with 4+ years of experience in testing and validating applications within ITSM and other ServiceNow-specific modules. Expertise should include manual, regression, and integration testing and UAT, alongside proficiency in the ServiceNow Automated Test Framework (ATF) for automation. Should be skilled in analyzing business requirements, designing test cases, and executing comprehensive end-to-end testing for various workflows and integrations. Proven ability to collaborate in Agile/Scrum environments to deliver high-quality software releases.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews to ensure adherence to best practices and quality standards.
- Experience in Service Catalog items, Record Producers, Variable Sets, and Order Guides with workflows and Flow Designer.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in ServiceNow.
- Experience with modern software development frameworks.
- Strong understanding of agile methodologies and practices.
- Ability to troubleshoot and resolve software issues effectively.
- Familiarity with integration techniques and APIs.
Additional Information:
- The candidate should have a minimum of 3 years of experience in ServiceNow.
- This position is based at our Bengaluru office.
- A 15-year full-time education is required.
Salary: Rs. 0 - Rs. 1,10,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Description: Big Data Lead (overall 10+ years; relevant 8+ years mandatory)
JD: Strong candidates needed in Big Data and PySpark.
Key Responsibilities
1. Big Data Engineering & Development
• Design and develop distributed data processing pipelines using Spark, Hadoop, Kafka, and related technologies.
• Build real-time and batch data processing systems with high reliability and performance.
• Develop efficient ETL/ELT workflows for diverse structured and unstructured data sources.
• Optimize data pipelines for scalability, performance, and cost efficiency.
2. Architecture & System Design
• Contribute to the architecture of data lakes, data warehouses, and lakehouse solutions.
• Implement data models, schema design, and best practices for large-scale data management.
• Ensure system reliability using fault-tolerant and distributed computing principles.
3. Cloud & Platform Engineering
• Build and manage solutions on AWS / Azure / GCP (e.g., EMR, Databricks, HDInsight, BigQuery, Snowflake).
• Use orchestration tools such as Airflow, Azure Data Factory, AWS Glue, etc.
• Implement CI/CD workflows for automated deployment of data jobs.
4. Data Quality, Security & Governance
• Integrate data validation, monitoring, and alerting into pipelines.
• Ensure adherence to data governance, privacy, and security frameworks.
• Collaborate with data stewards, architects, and business stakeholders.
5. Collaboration & Leadership
• Mentor junior developers and guide best practices.
• Work closely with data scientists, analysts, and product teams to enable advanced analytics and ML workloads.
• Participate in code reviews, technical discussions, and roadmap planning.
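The batch-processing responsibilities above follow the classic Spark transformation chain. Since a real cluster job cannot be reproduced here, this is a plain-Python sketch of the canonical word-count pipeline, with comments naming the Spark operation each step mirrors (the input lines are made up):

```python
from collections import Counter
from itertools import chain

def word_count(lines):
    """Batch word count mirroring the classic Spark pipeline:
    flatMap(split) -> map(lower) -> reduceByKey(add).
    """
    tokens = chain.from_iterable(line.split() for line in lines)  # flatMap
    normalized = (t.lower() for t in tokens)                      # map
    return Counter(normalized)                                    # reduceByKey

counts = word_count(["Spark and Hadoop", "spark streaming"])
print(counts["spark"])  # → 2
```

In PySpark the same chain would run distributed over partitions; the point of the sketch is that each stage is a pure transformation, which is what makes Spark pipelines easy to parallelize, retry, and unit-test.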
Location: Bangalore, Chennai
Skills: Big Data and Hadoop Ecosystems; PySpark
Experience Required: 10+ years (relevant 8+ years mandatory)
Salary: Rs. 90,000 - Rs. 1,65,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Analyst
Work Location: Pune, Gurgaon, Chennai, Bangalore, Hyderabad
Duration: 6 months
Skills Required: Business Analysis; Python; Data Concepts & Data Modelling; Data Analytics & Insights (ignio AIOps); MySQL
Relevant Experience Range in Required Skills: 8 to 10 Years
Job Description:
1. An understanding of data analysis tools and methodologies with the ability to develop business and technical solutions.
2. Good knowledge of the Data Modelling, Data Controls Assurance.
3. Strong influencing skills, with the ability to present complex data analysis in a clear and concise manner to gain buy-in from stakeholders.
4. An awareness of Financial Crime regulatory frameworks.
5. Excellent communication and collaboration skills with internal and external stakeholders.
6. Experience in managing action plans to address operational risk.
7. Good knowledge of common banking products and how they are used.
8. Ability to interpret business and data requirements that drive the business solution.
9. Experienced in SQL with ability to interpret data models.
10. Proficient in creating supporting information e.g., presentations for screen sharing during complex discussions with parties across multiple locations.
11. Good at problem solving and proactive in engaging with others to ascertain the full impact of defects and changes, identifying interdependencies.
12. Experience in documenting requirements, user stories, solution options, functional design, and data mapping.
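The SQL-and-data-model requirement above can be illustrated with Python's built-in sqlite3 module. A minimal two-table model, joined and aggregated (table and column names are hypothetical, chosen to echo the banking-products context):

```python
import sqlite3

# Hypothetical two-table model: one customer holds many accounts.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE accounts  (id INTEGER PRIMARY KEY,
                            customer_id INTEGER REFERENCES customers(id),
                            balance REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ben');
    INSERT INTO accounts  VALUES (10, 1, 500.0), (11, 1, 250.0), (12, 2, 900.0);
""")

# Join across the 1-to-many relationship and aggregate per customer.
rows = con.execute("""
    SELECT c.name, SUM(a.balance)
    FROM customers c JOIN accounts a ON a.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # → [('Asha', 750.0), ('Ben', 900.0)]
```

Interpreting the data model here means recognizing the foreign key (`accounts.customer_id`) and choosing the join and grouping accordingly, which is exactly the skill the JD item asks for.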
Salary: Rs. 90,000 - Rs. 1,65,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance