We found 1672 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Associate - Cybersecurity Engineering

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Cybersecurity Engineering

Job Description

JD for Talend

Position Summary
We are seeking a highly skilled Talend Big Data ETL Developer with 5–6 years of IT experience, including a minimum of 4 years working specifically on the Talend Big Data platform. The ideal candidate will have strong expertise in designing, developing, optimizing, and deploying complex ETL pipelines using Talend and SQL, with excellent analytical and communication skills.

Key Responsibilities
  • Design, develop, and maintain complex ETL pipelines using Talend Big Data tools on the Spark framework.
  • Build and manage ETL infrastructure to extract, transform, and load data from a wide range of structured and unstructured data sources.
  • Work extensively with Talend ETL components, including data processing, orchestration, parallelization, and transformation modules.
  • Develop Talend Jobs, Joblets, and custom Java-based components as required.
  • Perform installation, configuration, and maintenance of Talend Job Server, TAC Server, and other Talend components.
  • Optimize Talend jobs for high performance, scalability, and parallel execution across multiple job servers.
  • Deploy Talend jobs across environments and automate deployment pipelines.
  • Create and maintain Context Groups, parameterization frameworks, and custom routines.
  • Implement error handling, monitoring, alerting, and reporting mechanisms for ETL processes.
  • Write and execute unit test cases and support integration testing for all ETL components.
  • Design and support data ingestion pipelines across development, testing, and production environments.
  • Participate in performance tuning, troubleshooting, and best-practice recommendations.
  • Utilize TAC for deployments, job scheduling, and administrative activities (added advantage).

Must-Have Skills
  • Minimum 4 years of hands-on experience with the Talend Big Data ETL platform.
  • Strong SQL programming experience (preferably SQL Server).
  • Excellent analytical and problem-solving skills.
  • Strong communication skills, with the ability to work with cross-functional teams.
  • Java debugging fundamentals (on TAC).
  • Expertise in:
    • ETL fundamentals and the end-to-end ETL lifecycle.
    • Talend core components, transformations, orchestration, and job optimization.
    • Joblets, custom components, and Java-based custom logic.
    • Error handling, performance tuning, and monitoring frameworks.

Nice-to-Have Skills
  • Experience with TAC for job administration, deployment, and scheduling.
  • Familiarity with DevOps pipelines for Talend job deployment.
  • Working knowledge of data quality, data validation, and metadata management.
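Talend jobs themselves are built in the Studio rather than hand-coded, but the Context Group and error-handling patterns this role describes can be sketched in plain Python. Everything below (context names, connection strings, the `load_batch` helper) is illustrative, not a Talend API:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

# Illustrative stand-in for a Talend Context Group: one set of
# parameters per environment, resolved at run time.
CONTEXTS = {
    "dev":  {"db_url": "jdbc:sqlserver://dev-host;databaseName=stage", "batch_size": 500},
    "prod": {"db_url": "jdbc:sqlserver://prod-host;databaseName=dw",   "batch_size": 5000},
}

def run_with_retries(step, retries=3, delay=0.1):
    """Run one ETL step, retrying on failure and logging each attempt."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
            if attempt == retries:
                raise
            time.sleep(delay)

def load_batch(context):
    # Placeholder for the real extract/transform/load work.
    return f"loaded {context['batch_size']} rows into {context['db_url']}"

result = run_with_retries(lambda: load_batch(CONTEXTS["dev"]))
print(result)
```

The same separation applies in Talend: context variables keep environment-specific values out of the job, and a retry-and-log wrapper around each step is the core of the monitoring/alerting mechanisms listed above.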

Responsibilities

SG is looking for at least six selections for this role. This demand is currently offline; the corresponding Taleo requisition will be released shortly.
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Talend Big Data ETL Developer

Job Description

Skill: Data Governance
Experience: 3+ yrs
Location: NOIDA
Timings: 2:00 PM to 10:30 PM
Cab facility provided: Yes

  • Administering and maintaining Informatica Data Governance platforms, both the on-premises versions (EDC, Axon, IDQ) and the cloud platform (CDGC service).
  • Integrating the Informatica Data Governance platform with enterprise systems such as Snowflake, AWS Athena, AWS S3, IBM DB2, Power BI, MS SQL, Oracle, etc.
  • Managing metadata and implementing business lineage.
  • Implementing Business Glossary association in Axon.
  • Developing and maintaining data quality assets such as rules, profiles, mappings, workflows, applications, and scorecards.
  • Collaborating with data stewards, data owners, and business users to define and enforce data governance requirements.
  • Maintaining documentation and ensuring compliance with internal data governance standards.
  • Creating and managing Power BI reports and semantic models.
  • Coordinating with support groups to get issues resolved with a quick turnaround.

Mandatory
  • Bachelor's degree in computer science or a similar field, or equivalent work experience.
  • 3+ years of experience working on a data governance platform.
  • Understanding of Power BI reports and semantic models.
  • Expertise in on-premises Informatica Data Governance tools: Informatica Enterprise Data Catalog (EDC), Informatica Axon, and Informatica Data Quality (IDQ).
  • Experience with IDMC data governance modules: Data Governance and Catalog, Data Profiling, Data Quality, and Metadata Command Center.
  • Insight into data platforms such as Snowflake, AWS, and Azure.
  • Experience writing SQL queries and Python scripts.
  • Strong learning attitude.
  • Good written and verbal communication skills.
  • Experience working in a team spread across multiple locations.

Preferable
  • Knowledge of AWS services.
  • Knowledge of Snowflake.
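As a rough illustration of the kind of data quality asset this role maintains, a completeness/uniqueness profile feeding a scorecard can be sketched in a few lines of Python. The records, column names, and metric names below are invented for the example and do not come from IDQ:

```python
# A small batch of records, as a data-quality profile might see them.
records = [
    {"customer_id": "C001", "email": "a@example.com"},
    {"customer_id": "C002", "email": None},
    {"customer_id": "C002", "email": "b@example.com"},
]

def profile(rows, key, column):
    """Compute completeness of one column and duplicate count on a key."""
    total = len(rows)
    nulls = sum(1 for r in rows if r[column] is None)
    dupes = total - len({r[key] for r in rows})
    return {
        "completeness_pct": round(100 * (total - nulls) / total, 1),
        "duplicate_keys": dupes,
    }

scorecard = profile(records, key="customer_id", column="email")
print(scorecard)  # {'completeness_pct': 66.7, 'duplicate_keys': 1}
```

In IDQ terms, the function plays the role of a rule applied during profiling, and the returned dictionary is one row of a scorecard; the tooling adds scheduling, thresholds, and lineage on top.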

  • Salary: Rs. 70,000 - Rs. 1,30,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Governance

Job Description

Skill: Microsoft Fabric
Location: NOIDA
Timings: 2:00 PM to 10:30 PM
Cab facility provided: Yes

  • Responsible for understanding requirements and performing data analysis.
  • Responsible for setup of Microsoft Fabric and its components.
  • Building secure, scalable solutions across the Microsoft Fabric platform.
  • Creating and managing Lakehouses.
  • Implementing Data Factory processes for data ingestion, scalable ETL, and data integration.
  • Designing, implementing, and managing comprehensive warehousing solutions for analytics using Fabric.
  • Creating and scheduling data pipelines using Azure Data Factory.
  • Building robust data solutions using Microsoft data engineering tools such as Notebooks, Lakehouses, and Spark applications.
  • Building and automating deployment pipelines using CI/CD tools for the release of Fabric content from lower to higher environments.
  • Setting up and using Git as a repository for versioning of Fabric components.
  • Creating and managing Power BI reports and semantic models.
  • Writing and optimizing complex SQL queries to extract and analyze data, ensuring correct data processing and accurate reporting.
  • Working closely with customers, business analysts, and the technology and project teams to understand business requirements and drive the analysis and design of quality technical solutions that are aligned with business and technology strategies and comply with the organization's architectural standards.
  • Understanding and following change management procedures to implement project deliverables.
  • Coordinating with support groups to get issues resolved with a quick turnaround.

Mandatory
  • Bachelor's degree in computer science or a similar field, or equivalent work experience.
  • 3+ years of experience working in Microsoft Fabric.
  • Expertise in working with OneLake, Lakehouse, Warehouse, and Notebooks.
  • Strong understanding of Power BI reports and semantic models using Fabric.
  • Proven record of building ETL and data solutions using Azure Data Factory.
  • Strong understanding of data warehousing concepts and ETL processes.
  • Hands-on experience building data warehouses in Fabric.
  • Strong skills in Python and PySpark.
  • Practical experience implementing Spark in Fabric, scheduling Spark jobs, and writing Spark SQL queries.
  • Experience utilizing Data Activator for effective data asset management and analytics.
  • Ability to flex and adapt to different tools and technologies.
  • Strong learning attitude.
  • Good written and verbal communication skills.
  • Demonstrated experience working in a team spread across multiple locations.

Preferable
  • Knowledge of AWS services.
  • Knowledge of Snowflake.
  • Knowledge of real-time analytics in Fabric.
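The incremental-ingestion pattern behind the Data Factory pipelines this role builds can be sketched in plain Python, assuming the source exposes a modified-timestamp column to use as a watermark. This is a stdlib illustration of the pattern, not Fabric or PySpark API code:

```python
from datetime import datetime

# Source rows with a modified timestamp, as an ingestion pipeline might see them.
rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
    {"id": 3, "modified": datetime(2024, 1, 9)},
]

def incremental_load(source, watermark):
    """Return only rows changed since the watermark, plus the new watermark."""
    changed = [r for r in source if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in changed), default=watermark)
    return changed, new_watermark

changed, wm = incremental_load(rows, datetime(2024, 1, 2))
print(len(changed), wm)  # two rows changed; the watermark advances to Jan 9
```

In a real pipeline the watermark is persisted between runs (e.g. in a control table) and the filter is pushed down to the source query, but the contract is the same: load only what changed, then advance the marker.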

  • Salary: Rs. 70,000 - Rs. 1,30,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Microsoft Fabric

Job Description

Experience in SAP Solution Manager ChaRM configuration at an advanced level (with CSOL, Retrofit, and multi-landscape scenarios) is required. Should have hands-on experience on implementation and support projects. Hands-on experience with S/4HANA and Fiori. Proficient with SAP Java systems; has worked on Java add-ons such as MII and PO. Hands-on experience with ABAP systems such as TM, GTS, and EWM. Hands-on experience with system integration and configuration. Hands-on experience with HANA DB. Has worked on Linux OS, preferably on Azure. Exposure to BODS and BOIS, and replication with SLT. Good to have exposure to SolMan functionalities such as Managed System Configuration, SolDoc, and ChaRM.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP Solution Manager ChaRM configuration

Job Description

Govind Req - Playwright + TypeScript

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Govind Req - Playwright + TypeScript

Job Description

Seeking a Boomi Integration Consultant to design, build, and support integrations with a strong focus on EDI/X12 (including trading partner onboarding, mapping, acknowledgements, troubleshooting, and production support). The role requires SAP integration knowledge, hands-on experience with IDOC, XML, and flat file formats, and solid SQL Server skills for data queries, reconciliation, and issue resolution. Candidate should be comfortable delivering end-to-end integrations (design through deployment), implementing monitoring/error handling, and producing clear technical documentation while collaborating with business and technical stakeholders.
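For context on the EDI/X12 side of the role: an X12 interchange is a flat stream of segments separated by a terminator character, with elements separated by another delimiter (both declared in the ISA envelope). The mechanics can be sketched in Python; the sample interchange below is abbreviated for illustration and is not a valid ISA envelope:

```python
# Abbreviated sample of a 997 functional acknowledgement; "~" terminates
# segments and "*" separates elements, as declared by the real ISA segment.
sample = "ISA*00*~GS*FA*SENDER*RECEIVER~ST*997*0001~AK1*PO*1234~SE*4*0001~GE*1*1~IEA*1*000000001~"

def parse_x12(raw, segment_term="~", element_sep="*"):
    """Split raw X12 text into a list of (segment_id, elements) pairs."""
    segments = [s for s in raw.split(segment_term) if s]
    return [(s.split(element_sep)[0], s.split(element_sep)[1:]) for s in segments]

parsed = parse_x12(sample)
ack_type = next(els[0] for seg, els in parsed if seg == "ST")
print(ack_type)  # 997, i.e. a functional acknowledgement
```

Boomi's EDI profiles do this parsing (plus validation and acknowledgement generation) declaratively per trading partner; the sketch only shows why delimiter configuration per partner matters during onboarding and troubleshooting.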

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Boomi Integration Consultant

Job Description

INFYSYJP00004342 563083-Test Data Management & Delphix Lead

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004342 563083-Test Data Management & Delphix Lead

Job Description

INFYSYJP00004333 192214Y26_Windows support admin

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004333 192214Y26_Windows support admin