We found 106 jobs matching your search


Job Description

Virtualization Administration

Primary Skills
  • VMware vSphere administration
  • Cloud technology (AWS and Azure)
  • Version control systems: Git
  • Infrastructure as Code (IaC): Terraform, CloudFormation, Crossplane
  • Windows and Linux operating system administration
  • vRealize Automation, vRealize Orchestrator
  • Scripting languages: Ansible, PowerShell
  • Azure and AWS API integration

Additional Skills
  • ITIL Framework change management
  • Containerization and orchestration: Docker, Kubernetes
  • CI/CD pipelines: Jenkins, GitLab CI
  • Identity and Access Management (IAM): AWS IAM, Azure AD
  • Disaster recovery and backup solutions: Commvault
  • Networking fundamentals: DNS, DHCP, firewalls, load balancers
  • Automation and configuration management: Ansible
  • Cloud security best practices: data encryption, security groups, key management
  • Performance tuning and capacity planning

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: VMWare~Windows Powershell~VMware Cloud On AWS
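The listing above centers on Infrastructure as Code and cloud API scripting. As a minimal illustration only (pure Python, with made-up resource names, not tied to Terraform or any real cloud API), the core IaC idea of diffing a desired state against the actual state can be sketched as:

```python
# Sketch of the declarative reconcile step behind IaC tools: compare
# desired state to actual state and emit a plan of create/update/delete
# actions. Resource IDs and specs here are hypothetical.

def plan(desired: dict, actual: dict) -> list:
    """Return sorted (action, resource_id) pairs that make actual match desired."""
    actions = []
    for rid, spec in desired.items():
        if rid not in actual:
            actions.append(("create", rid))       # missing: must be created
        elif actual[rid] != spec:
            actions.append(("update", rid))       # drifted: must be updated
    for rid in actual:
        if rid not in desired:
            actions.append(("delete", rid))       # unmanaged: must be removed
    return sorted(actions)

desired = {"vm-web": {"size": "m"}, "vm-db": {"size": "l"}}
actual = {"vm-web": {"size": "s"}, "vm-old": {"size": "s"}}
print(plan(desired, actual))
# → [('create', 'vm-db'), ('delete', 'vm-old'), ('update', 'vm-web')]
```

Real tools add dependency ordering and provider API calls on top of this diff, but the plan/apply split shown here is the shape interviewers usually probe for.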

Job Description

Job Description: SAP Variant Configuration (VC)
Essential Skills: SAP Variant Configuration (VC)

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP Variant Configuration (VC)

Job Description

Work Location: KOLKATA, WB / CHENNAI, TN
Job Title: Developer
Experience: 6-8 Years
Skill Required: Azure Data Factory
Job Description: Managing Azure resources. Azure Administrators are responsible for deploying, monitoring, and managing Azure resources such as virtual machines, storage accounts, and networks. They ensure that these resources are configured correctly and operate efficiently to meet organizational needs.
Essential Skills: Azure Cloud Administrator

Responsibilities

  • Salary: Rs. 70,000.0 - Rs. 1,30,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Developer

Job Description

  • Design, build, and deploy data extraction, transformation, and loading processes and pipelines from various sources including databases, APIs, and data files.
  • Develop and support data pipelines within a cloud data platform, such as Databricks.
  • Build data models that reflect domain expertise, meet current business needs, and will remain flexible as strategy evolves.
  • Monitor and optimize Databricks cluster performance, ensuring cost-effective scaling and resource utilization.
  • Demonstrates the ability to communicate technical concepts to non-technical audiences, both in written and verbal form.
  • Demonstrates strong understanding of coding and programming concepts used to build data pipelines (e.g. data transformation, data quality, data integration).
  • Demonstrates strong understanding of database storage concepts (data lake, relational databases, NoSQL, graph, data warehousing).
  • Implement and maintain Delta Lake for optimized data storage, ensuring data reliability, performance, and versioning.
  • Automate CI/CD pipelines for data workflows using Azure DevOps.
  • Collaborate with cross-functional teams to support data governance using Databricks Unity Catalog.

Responsibilities

  • Salary: Rs. 0.0 - Rs. 12.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Databricks

Job Description

Java, REST APIs, and React; 5 to 7 years of experience.

Responsibilities

  • Salary: Rs. 0.0 - Rs. 7.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java Developer

Job Description

Sound knowledge of Workday HCM Technical and Workday HCM Core Functional.
  • Bachelor’s degree in Information Systems, Data Analytics, Finance, or a related field.
  • 2–5 years of experience working with Workday Finance or other modules (reporting, analytics, or data management).
  • Strong knowledge of Workday Reporting (Advanced, Matrix, Composite, Prism Analytics).
  • Hands-on experience with EIBs, calculated fields, and business process configuration.
  • Proficiency in Excel, SQL, or other data analysis tools is a plus.
  • Strong analytical skills with the ability to interpret data and provide actionable insights.
  • Excellent communication and stakeholder management skills, working with both business and technical stakeholders.
  • Ability to work independently in a fast-paced, dynamic environment.
  • Excellent problem-solving, analytical, and troubleshooting skills.
  • Workday certification preferred.

Responsibilities

  • Salary: Rs. 10.0 - Rs. 12.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Workday HCM Core Functional

Job Description

  • 4 years of hands-on experience in PySpark development: ingestion, database, schema, workflow.
  • Hands-on experience with scripting tools like the Python shell.
  • Strong SQL knowledge, SCD1/SCD2 ETL logic, and preparing validation SQL scripts.
  • Knowledge of Spark: DataFrames, PySQL, Spark libraries.
  • Knowledge of AWS S3, AWS EMR, AWS Redshift.
  • Strong knowledge of data management principles.
  • Experience in building ETL data warehouse transformation processes.
  • Strong communication skills.

Responsibilities

  • Salary: Rs. 10.0 - Rs. 12.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Snowflake developer
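The SCD2 logic this listing asks for is the history-preserving variant of a slowly changing dimension: when a tracked attribute changes, the current row is closed out and a new current row is inserted. A minimal pure-Python sketch of that merge (in practice this would be PySpark DataFrame or SQL MERGE operations; the column names and dates are illustrative):

```python
# SCD Type 2 merge sketch: expire the current dimension row on change
# (set end_date, is_current=False) and append a new current row, so the
# full history is preserved. Columns and dates are illustrative.

def scd2_merge(dim: list, incoming: dict, today: str) -> list:
    out = list(dim)
    for row in out:
        if row["key"] == incoming["key"] and row["is_current"]:
            if row["city"] == incoming["city"]:
                return out                      # attribute unchanged: no-op
            row["end_date"] = today             # SCD2: expire the current row
            row["is_current"] = False
            break
    out.append({"key": incoming["key"], "city": incoming["city"],
                "start_date": today, "end_date": None, "is_current": True})
    return out

dim = [{"key": 1, "city": "Pune", "start_date": "2023-01-01",
        "end_date": None, "is_current": True}]
dim = scd2_merge(dim, {"key": 1, "city": "Kolkata"}, "2024-06-01")
print(len(dim))  # → 2: the expired Pune row plus the current Kolkata row
```

SCD1, by contrast, would simply overwrite `city` in place and keep a single row per key, discarding history; interviewers for roles like this commonly ask candidates to state that trade-off.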