We found 991 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Description: We are looking for a Senior Data Engineer for our Advanced Analytics and Big Data team. The scope is to generate insights for data-driven decision making in the customer service/aftermarket domain.

Job Summary: We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. You will play a crucial role in our data ecosystem by working with cloud technologies to enable data accessibility, quality, and insights across the organization. This role requires expertise in Azure Databricks, Snowflake, and DBT.

Requirements:
  • Bachelor's in Computer Science, Data Engineering, or a related field.
  • Proficiency in Azure Databricks for data processing and pipeline orchestration.
  • Experience with Snowflake as a data warehouse platform and DBT for transformations.
  • Strong SQL skills and understanding of data modeling principles.
  • Ability to troubleshoot and optimize data workflows.

Key Responsibilities:
  • Data Pipeline Development: Design, build, and optimize data pipelines to ingest, transform, and load data from multiple sources using Azure Databricks, Snowflake, and DBT.
  • Data Architecture: Develop and manage data models within Snowflake, ensuring efficient data organization and accessibility.
  • Data Transformation: Implement transformations in DBT, standardizing data for analysis and reporting.
  • Performance Optimization: Monitor and optimize pipeline performance, troubleshooting and resolving issues as needed.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven projects and provide access to reliable, well-structured data.

Qualifications:
  • Relevant experience in MS Azure, Snowflake, DBT, and Big Data Hadoop ecosystem components.
  • Understanding of Hadoop architecture and the underlying framework, including storage management.
  • Strong understanding and implementation experience in Hadoop, Spark, and Hive/Databricks.
  • Expertise in implementing data lake solutions using Scala as well as Python.
  • Expertise with an orchestration tool such as Azure Data Factory.
  • Strong SQL and programming skills.
  • Experience with Databricks is desirable.
  • Understanding/implementation experience with CI/CD tools such as Jenkins, Azure DevOps, and GitHub is desirable.
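The pipeline duties above center on ingest, transform, and load steps. As a rough, platform-agnostic sketch of the "transform" stage, the snippet below mirrors in plain Python the kind of cleaning a DBT model or Databricks notebook would express declaratively in SQL; all field and record names are hypothetical, not taken from the posting.

```python
def transform_orders(raw_rows):
    """Standardize raw records for analysis -- the kind of cleaning a DBT
    model or a Databricks notebook would express declaratively in SQL.
    All field names here are hypothetical."""
    cleaned = []
    for row in raw_rows:
        # Basic quality gate: drop records that lack a primary key.
        if row.get("order_id") is None:
            continue
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount_usd": round(float(row.get("amount", 0)), 2),
            "region": (row.get("region") or "UNKNOWN").upper(),
        })
    return cleaned

raw = [
    {"order_id": "1", "amount": "20.5", "region": "emea"},
    {"order_id": None, "amount": "5"},    # rejected: no key
    {"order_id": "2", "region": "apac"},  # missing amount defaults to 0
]
print(transform_orders(raw))
```

In a real pipeline this logic would live in a versioned DBT model or a scheduled Databricks job, with the quality gate expressed as a test rather than an inline check.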

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Senior Data Engineer

Job Description

Pega Developer

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Pega Developer

Job Description

Organization: Swiss Re Global Business Solutions India Private Limited
Function: Finance Reinsurance
Team: Regional A&R
Role: Data Analyst Contractor

About the Role: This is an exciting opportunity to work with a global team and act as the backbone of all reporting and analysis for the Regional Reporting and Analysis teams.

Key Roles and Responsibilities:
  • Timely retrieval of data across finance and non-finance systems for different divisions.
  • Building new reporting tools on the Palantir platform using PySpark (the Python API for Apache Spark).
  • Upgrading and maintaining existing reporting solutions on the Power BI platform.
  • Visual presentation and analysis of data to ensure the team continues to push for improved ways to disseminate reporting and analysis to all interested stakeholders across the globe.

About You: We are looking for a strong fresher or a candidate with 6 months to 1 year of experience, passionate about transforming large volumes of data into meaningful information to facilitate decision making by various stakeholders. Candidates interested in this role should have deep knowledge of SQL, Python, advanced Excel, Power BI, and Power Query, including building DAX logic to transform data. A professional degree in a data management field or prior experience working with Finance team data will be an added advantage.

Expectations from the Role:
  • Take personal responsibility for building new platforms and maintaining existing Power BI modules for reporting purposes.
  • Interact with various stakeholders to understand their needs and come up with data solutions to match them.
  • Develop a deep understanding of the SR data platform to retrieve required data from various sources (using SQL or other data transformations) to provide insights to management.

  • Salary: Rs. 0 - Rs. 30,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Analyst

Job Description

INFYSYJP00004273 561359-.AIML and Python- BLR- DX

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004273 561359-.AIML and Python- BLR- DX

Job Description

Organization: Swiss Re Global Business Solutions India Private Limited
Function: Finance Reinsurance
Team: Regional A&R
Role: Data Analyst Contractor

About the Role: This is an exciting opportunity to work with a global team and act as the backbone of all reporting and analysis for the Regional Reporting and Analysis teams.

Key Roles and Responsibilities:
  • Timely retrieval of data across finance and non-finance systems for different divisions.
  • Building new reporting tools on the Palantir platform using PySpark (the Python API for Apache Spark).
  • Upgrading and maintaining existing reporting solutions on the Power BI platform.
  • Visual presentation and analysis of data to ensure the team continues to push for improved ways to disseminate reporting and analysis to all interested stakeholders across the globe.

About You:
  • We are looking for a strong fresher or a candidate with 6 months to 1 year of experience, passionate about transforming large volumes of data into meaningful information to facilitate decision making by various stakeholders.
  • Candidates interested in this role should have deep knowledge of SQL, Python, advanced Excel, Power BI, and Power Query, including building DAX logic to transform data.
  • A professional degree in a data management field or prior experience working with Finance team data will be an added advantage.

Expectations from the Role:
  • Take personal responsibility for building new platforms and maintaining existing Power BI modules for reporting purposes.
  • Interact with various stakeholders to understand their needs and come up with data solutions to match them.
  • Develop a deep understanding of the SR data platform to retrieve required data from various sources (using SQL or other data transformations) to provide insights to management.
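The reporting work described above is largely retrieval plus aggregation. As a miniature illustration, the plain-Python rollup below has the same shape as a Power BI DAX measure or a PySpark `groupBy().sum()`; the record fields and division names are hypothetical, not from the posting.

```python
from collections import defaultdict

def summarize_by_division(records):
    """Roll up amounts per division -- the same shape of aggregation a
    Power BI DAX measure or a PySpark groupBy().sum() would express.
    Record fields here are hypothetical."""
    totals = defaultdict(float)
    for r in records:
        totals[r["division"]] += r["amount"]
    return dict(totals)

rows = [
    {"division": "EMEA", "amount": 100.0},
    {"division": "APAC", "amount": 40.0},
    {"division": "EMEA", "amount": 60.0},
]
print(summarize_by_division(rows))
```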

Responsibilities

  • Contract length: 6-9 months (extension possible)
  • Pay rate: 30,000 INR in hand (pay rate has been specified)
  • Hiring Manager: Prashanth B N
  • Start date: May 4th or 11th
  • Work type: Hybrid
  • Experience: 1 year
  • Openings: 1
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Analyst

Job Description

As a Quality Engineer, you will enable full-stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve performing continuous testing for security, API, and regression suites, creating automation strategies, and supporting data and environment configurations. You will also participate in code reviews, monitor and report defects, and engage in continuous-improvement activities for the end-to-end testing process, ensuring that quality is maintained throughout the development lifecycle.

Roles & Responsibilities:
  • Expected to perform independently and become an SME.
  • Active participation and contribution in team discussions is required.
  • Contribute to providing solutions to work-related problems.
  • Collaborate with cross-functional teams to ensure seamless integration of testing processes.
  • Develop and maintain automated test scripts to enhance testing efficiency.

Professional & Technical Skills:
  • Must-have skills: Proficiency in Worksoft Certify.
  • Strong understanding of test automation frameworks and methodologies.
  • Experience with continuous integration and continuous deployment practices.
  • Familiarity with API testing tools and techniques.
  • Ability to analyze and interpret complex data sets to identify trends and issues.
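The continuous-testing duties above boil down to automated regression suites run on every change. The sketch below shows the pattern in miniature with Python's `unittest`; Worksoft Certify plays the same role for SAP UI flows, but the structure (function under test, nominal case, boundary case, invalid-input case) is the same. The pricing function and its rules are hypothetical examples, not part of the posting.

```python
import unittest

def apply_discount(price, pct):
    """Function under test: a hypothetical pricing rule."""
    if not 0 <= pct <= 100:
        raise ValueError("pct out of range")
    return round(price * (1 - pct / 100), 2)

class RegressionSuite(unittest.TestCase):
    """Automated regression checks of the kind a CI pipeline runs on
    every commit: nominal path, boundary, and invalid input."""
    def test_nominal(self):
        self.assertEqual(apply_discount(100.0, 15), 85.0)
    def test_boundary(self):
        self.assertEqual(apply_discount(100.0, 0), 100.0)
    def test_invalid_input_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()
```

In a CI/CD setup the same suite would run automatically on each push, with failures blocking the merge.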

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Bangalore

Job Description

INFYSYJP00004262/Endur Support Analyst

  • Salary: Rs. 0 - Rs. 1,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004262/Endur Support Analyst

Job Description

INFYSYJP00004158/562364-Worksoft SAP Test Automation

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004158/562364-Worksoft SAP Test Automation

Job Description

INFYSYJP00003246/559782_Vmware + OCPV + Vcloud

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00003246/559782_Vmware + OCPV + Vcloud