We found 335 jobs matching your search

Job Description

Digital : Snowflake~PL/SQL~Unix Shell Scripting and text processing tools
  • Administer and manage Snowflake environments, including users, roles, warehouses, databases, and schemas
  • Monitor system performance and optimize warehouse usage, queries, and costs
  • Implement and manage security controls, data masking, and auditing
  • Perform capacity planning, resource monitoring, and usage analysis
  • Support data loads and integrations, and troubleshoot Snowflake-related issues
  • Work closely with development, data engineering, and business teams for operational support
  • Automate routine administrative tasks using SQL, Python, or shell scripting
  • Ensure high availability, backup, recovery, and disaster-recovery readiness
  • Maintain documentation, operational procedures, and best practices
  • Demonstrate strong problem-solving skills and proactive incident prevention
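The "automate routine administrative tasks" duty above is typically scripting of account-object DDL. A minimal sketch of generating role and warehouse provisioning SQL; the object names (`ANALYST_ROLE`, `REPORTING_WH`, `JDOE`) are invented for illustration, and in practice the statements would be executed through the Snowflake connector rather than printed:

```python
# Sketch: build Snowflake RBAC provisioning statements as strings.
# Object names are illustrative, not taken from the posting.

def provision_statements(role: str, warehouse: str, user: str) -> list[str]:
    """Build DDL/GRANT statements for a basic role + warehouse setup."""
    return [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"CREATE WAREHOUSE IF NOT EXISTS {warehouse} "
        f"WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT ROLE {role} TO USER {user};",
    ]

stmts = provision_statements("ANALYST_ROLE", "REPORTING_WH", "JDOE")
for s in stmts:
    print(s)  # in a real job these would be run via snowflake-connector-python
```

Generating statements as data (rather than executing ad hoc) also makes the provisioning auditable, which fits the auditing and documentation bullets above.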

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Digital : Snowflake~PL/SQL~Unix Shell Scripting and text processing tools

Job Description

We are looking for a skilled Azure Data Engineer with hands-on experience in building scalable data solutions using Azure Data Factory and Azure Databricks. The candidate will be responsible for developing robust data pipelines, optimizing data workflows, and supporting analytics solutions in the Azure cloud environment.

Roles & Responsibilities:
  • Design and develop end-to-end data pipelines using Azure Data Factory.
  • Build scalable data processing solutions using Azure Databricks (PySpark / Spark SQL).
  • Develop and optimize ETL/ELT workflows for structured and unstructured data.
  • Integrate data from multiple sources such as APIs, databases, and cloud platforms.
  • Implement and manage Azure Data Lake Storage (ADLS Gen2) and data warehouse solutions.
  • Monitor, troubleshoot, and optimize pipeline performance and processing efficiency.
  • Implement data quality checks, validation rules, and governance standards.
  • Collaborate with data analysts, business stakeholders, and DevOps teams for delivery.
  • Manage CI/CD pipelines and deployments using Azure DevOps.
  • Ensure data security, compliance, and cost/performance optimization in Azure.

Professional & Technical Skills:
  • 5–6 years of experience in data engineering.
  • Strong hands-on expertise in Azure Data Factory (pipelines, triggers, linked services, Integration Runtime) and Azure Databricks (PySpark, Spark SQL, Delta Lake).
  • Experience with Azure Data Lake Storage (ADLS Gen2).
  • Strong knowledge of SQL and relational databases.
  • Proficiency in Python for data processing and automation.
  • Solid understanding of data warehousing and dimensional modeling.
  • Experience in large-scale distributed data processing.
  • Working knowledge of CI/CD, Git, and Azure DevOps.
  • Strong analytical and problem-solving skills.

Additional Information:
  • Hands-on experience in Delta Lake performance tuning.
  • Knowledge of streaming frameworks (Kafka / Azure Event Hubs).
  • Experience with Power BI integration.
  • Microsoft Certified: Azure Data Engineer Associate (or equivalent).
  • Experience working in Agile / Scrum environments.
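The "data quality checks, validation rules" duty above usually amounts to rule-driven row filtering before load. A minimal, source-agnostic sketch; the column names and rules are made up for illustration and stand in for checks that would run inside a Databricks notebook or ADF data flow:

```python
# Sketch: simple rule-based data-quality validation before loading.
# Columns and rules are illustrative only.

def validate(rows, required, positive):
    """Split rows into passing and rejected lists per the given rules."""
    good, bad = [], []
    for row in rows:
        missing = [c for c in required if row.get(c) in (None, "")]
        nonpos = [c for c in positive
                  if not isinstance(row.get(c), (int, float)) or row[c] <= 0]
        (bad if missing or nonpos else good).append(row)
    return good, bad

rows = [
    {"id": 1, "amount": 250.0},
    {"id": None, "amount": 99.0},   # rejected: missing required id
    {"id": 3, "amount": -5.0},      # rejected: non-positive amount
]
good, bad = validate(rows, required=["id"], positive=["amount"])
```

Keeping rejected rows (rather than silently dropping them) supports the governance bullet: quarantined records can be reported and reprocessed.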

  • Salary: Rs. 0.0 - Rs. 1,80,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Custom Software Engineer

Job Description

9 HDC2 Summary: We are looking for a skilled Azure Data Engineer with hands-on experience in building scalable data solutions using Azure Data Factory and Azure Databricks. The candidate will be responsible for developing robust data pipelines, optimizing data workflows, and supporting analytics solutions in the Azure cloud environment.

Roles & Responsibilities:
  • Design and develop end-to-end data pipelines using Azure Data Factory.
  • Build scalable data processing solutions using Azure Databricks (PySpark / Spark SQL).
  • Develop and optimize ETL/ELT workflows for structured and unstructured data.
  • Integrate data from multiple sources such as APIs, databases, and cloud platforms.
  • Implement and manage Azure Data Lake Storage (ADLS Gen2) and data warehouse solutions.
  • Monitor, troubleshoot, and optimize pipeline performance and processing efficiency.
  • Implement data quality checks, validation rules, and governance standards.
  • Collaborate with data analysts, business stakeholders, and DevOps teams for delivery.
  • Manage CI/CD pipelines and deployments using Azure DevOps.
  • Ensure data security, compliance, and cost/performance optimization in Azure.

Professional & Technical Skills:
  • 5–6 years of experience in data engineering.
  • Strong hands-on expertise in Azure Data Factory (pipelines, triggers, linked services, Integration Runtime) and Azure Databricks (PySpark, Spark SQL, Delta Lake).
  • Experience with Azure Data Lake Storage (ADLS Gen2).
  • Strong knowledge of SQL and relational databases.
  • Proficiency in Python for data processing and automation.
  • Solid understanding of data warehousing and dimensional modeling.
  • Experience in large-scale distributed data processing.
  • Working knowledge of CI/CD, Git, and Azure DevOps.
  • Strong analytical and problem-solving skills.

Additional Information:
  • Hands-on experience in Delta Lake performance tuning.
  • Knowledge of streaming frameworks (Kafka / Azure Event Hubs).
  • Experience with Power BI integration.
  • Microsoft Certified: Azure Data Engineer Associate (or equivalent).
  • Experience working in Agile / Scrum environments.
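The "pipelines, triggers" expertise above commonly centers on incremental (delta) loads driven by a stored watermark. A minimal sketch of that pattern with in-memory stand-ins for the real source and watermark store; the table shape and column names are invented for illustration:

```python
# Sketch: watermark-based incremental extraction, the pattern an ADF
# copy activity typically implements. In-memory data replaces the source.
from datetime import datetime

source = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
    {"id": 3, "modified": datetime(2024, 1, 9)},
]

def incremental_extract(rows, watermark):
    """Return rows changed after the stored watermark, plus the new watermark."""
    fresh = [r for r in rows if r["modified"] > watermark]
    new_wm = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_wm

# Only rows modified after the last successful run are copied.
fresh, wm = incremental_extract(source, datetime(2024, 1, 3))
```

Persisting `wm` back to a control table after each successful run is what makes the pipeline restartable without reloading history.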

  • Salary: Rs. 0.0 - Rs. 2,16,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Custom Software Engineer

Job Description

Job Posting Title: USI-SAP Syniti with 4+ years of experience. Description: Should be hands-on with SAP Syniti. Experience in SAP retail projects is preferred.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP Syniti

Job Description

As a Custom Software Engineer, you will engage in the development of custom software solutions designed to meet specific business needs. Your typical day will involve coding, enhancing components across various systems or applications, and collaborating with team members to ensure the delivery of scalable, high-performing solutions using modern frameworks and agile practices. You will also participate in discussions to address challenges and contribute to the overall success of the projects you are involved in.

Roles & Responsibilities:
  • Expected to perform independently and become an SME.
  • Active participation and contribution in team discussions is required.
  • Contribute to providing solutions to work-related problems.
  • Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
  • Conduct code reviews and provide constructive feedback to peers to ensure code quality and best practices.

Professional & Technical Skills:
  • Must-have skills: proficiency in Microsoft Azure Analytics Services.
  • Experience with cloud computing platforms and services.
  • Strong understanding of software development methodologies, particularly agile.
  • Familiarity with programming languages such as C#, Java, or Python.
  • Knowledge of database management and data integration techniques.

Additional Information:
  • The candidate should have a minimum of 3 years of experience in Microsoft Azure Analytics Services.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Custom Software Engineer

Job Description

We are looking for a skilled Azure Data Engineer with hands-on experience in building scalable data solutions using Azure Data Factory and Azure Databricks. The candidate will be responsible for developing robust data pipelines, optimizing data workflows, and supporting analytics solutions in the Azure cloud environment.

Roles & Responsibilities:
  • Design and develop end-to-end data pipelines using Azure Data Factory.
  • Build scalable data processing solutions using Azure Databricks (PySpark / Spark SQL).
  • Develop and optimize ETL/ELT workflows for structured and unstructured data.
  • Integrate data from multiple sources such as APIs, databases, and cloud platforms.
  • Implement and manage Azure Data Lake Storage (ADLS Gen2) and data warehouse solutions.
  • Monitor, troubleshoot, and optimize pipeline performance and processing efficiency.
  • Implement data quality checks, validation rules, and governance standards.
  • Collaborate with data analysts, business stakeholders, and DevOps teams for delivery.
  • Manage CI/CD pipelines and deployments using Azure DevOps.
  • Ensure data security, compliance, and cost/performance optimization in Azure.

Professional & Technical Skills:
  • 5–6 years of experience in data engineering.
  • Strong hands-on expertise in Azure Data Factory (pipelines, triggers, linked services, Integration Runtime) and Azure Databricks (PySpark, Spark SQL, Delta Lake).
  • Experience with Azure Data Lake Storage (ADLS Gen2).
  • Strong knowledge of SQL and relational databases.
  • Proficiency in Python for data processing and automation.
  • Solid understanding of data warehousing and dimensional modeling.
  • Experience in large-scale distributed data processing.
  • Working knowledge of CI/CD, Git, and Azure DevOps.
  • Strong analytical and problem-solving skills.

Additional Information:
  • Hands-on experience in Delta Lake performance tuning.
  • Knowledge of streaming frameworks (Kafka / Azure Event Hubs).
  • Experience with Power BI integration.
  • Microsoft Certified: Azure Data Engineer Associate (or equivalent).
  • Experience working in Agile / Scrum environments.

  • Salary: Rs. 12,00,000.0 - Rs. 15,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Microsoft Azure Analytics Services

Job Description

Microsoft Power Platform

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Microsoft Power Platform

Job Description

Informatica CDI (Cloud Data Integration). Relevant experience range for required skills: 6 to 8 years. Job Description: Informatica CDI - Production Support.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Informatica CDI (Cloud Data Integration)

Job Description

Please find the details below.
  • Primary mandatory skill required: Saviynt Architect
  • Secondary mandatory skills required: Java and Python
  • Flexible to hire in any location (if not, please mention job location): Yes, but preference will be for Hyderabad

Detailed Job Description:
  • The candidate should have 15 to 20 years of experience in the Identity and Access Management domain.
  • Lead the design and implementation of secure application architectures to protect sensitive data and systems.
  • Oversee the integration of security protocols within Java and Python applications to enhance the overall security posture.
  • Provide expert guidance on Saviynt IDaaS solutions to optimize identity and access management processes.
  • Collaborate with development teams to ensure security best practices are embedded throughout the software development lifecycle.
  • Conduct regular security assessments and audits to identify vulnerabilities and recommend corrective actions.
  • Develop and maintain security policies and procedures to ensure compliance with industry standards and regulations.
  • Monitor emerging security threats and technologies to proactively address potential risks.
  • Coordinate with IT and business units to align security strategies with organizational goals.
  • Facilitate security training and awareness programs to educate employees on security protocols.
  • Implement advanced security measures to safeguard applications against cyber threats.
  • Analyze security incidents and provide detailed reports to management for informed decision-making.
  • Drive continuous improvement initiatives to enhance the effectiveness of security operations.
  • Support the development of security architecture frameworks that align with business objectives.
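The core IAM idea behind the duties above (access decisions driven by role assignments, as platforms like Saviynt automate at scale) can be sketched in a few lines. The role and entitlement names here are invented purely for illustration:

```python
# Sketch: minimal role-based access check, illustrating the RBAC concept
# that identity-governance platforms manage. Names are made up.

ROLE_ENTITLEMENTS = {
    "hr_analyst": {"payroll:read"},
    "hr_admin": {"payroll:read", "payroll:write"},
}

def is_authorized(user_roles, entitlement):
    """Grant access if any of the user's roles carries the entitlement."""
    return any(entitlement in ROLE_ENTITLEMENTS.get(r, set())
               for r in user_roles)

ok = is_authorized(["hr_analyst"], "payroll:read")      # granted
denied = is_authorized(["hr_analyst"], "payroll:write")  # denied
```

Centralizing the role-to-entitlement mapping is what makes access reviews and audits (two of the responsibilities listed above) tractable: revoking a role revokes everything it carries.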

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Saviynt Architect