We found 906 jobs matching your search

Job Description

Databricks ~ Database Administration (DBMS)

Databricks Administration
  • Workspace setup, configuration, and governance
  • Unity Catalog configuration and management
  • Cluster policies, job clusters, access controls
  • Secret scopes and credential management
  • External locations and storage credentials setup
  • Monitoring cluster performance and cost optimization
  • Managing notebooks, repos, workflows, and permissions
  • Managing Delta Lake storage and table properties

Azure Cloud Administration
  • ADLS Gen2 storage setup and management
  • Azure networking (VNet, private endpoints, firewall rules)
  • Azure Key Vault integration
  • Managed identities and service principals configuration
  • Azure RBAC and IAM governance
  • Storage access via SAS, OAuth, and service principal
  • Azure monitoring and logging setup

Platform Operations
  • Environment setup (Dev, QA, Prod)
  • CI/CD integration (Azure DevOps, GitHub)
  • Deployment automation
  • Incident troubleshooting
  • Capacity planning and scaling
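The cluster-policy duties above boil down to constraining what users may configure on a cluster. A minimal sketch of that idea, assuming a hypothetical policy with illustrative values (the key names follow the Databricks cluster-policy JSON convention of `fixed`, `range`, and `allowlist` rule types, but this validator is plain Python, not the Databricks service itself):

```python
# Hypothetical cluster policy (illustrative values only).
cost_control_policy = {
    "spark_version": {"type": "fixed", "value": "13.3.x-scala2.12"},
    "autotermination_minutes": {"type": "range", "minValue": 10, "maxValue": 60},
    "node_type_id": {"type": "allowlist", "values": ["Standard_DS3_v2", "Standard_DS4_v2"]},
}

def check_cluster(policy: dict, cluster_spec: dict) -> list:
    """Return a list of policy violations for a proposed cluster spec."""
    violations = []
    for attr, rule in policy.items():
        value = cluster_spec.get(attr)
        if value is None:
            continue  # omitted attributes fall back to policy defaults
        if rule["type"] == "fixed" and value != rule["value"]:
            violations.append(f"{attr} must be {rule['value']}")
        elif rule["type"] == "range":
            if "maxValue" in rule and value > rule["maxValue"]:
                violations.append(f"{attr} exceeds max {rule['maxValue']}")
            if "minValue" in rule and value < rule["minValue"]:
                violations.append(f"{attr} below min {rule['minValue']}")
        elif rule["type"] == "allowlist" and value not in rule["values"]:
            violations.append(f"{attr} not in allowed list")
    return violations

spec = {"spark_version": "13.3.x-scala2.12", "autotermination_minutes": 120}
print(check_cluster(cost_control_policy, spec))  # flags the 120-minute auto-termination
```

In a real workspace the policy JSON is attached to a policy ID and enforced by Databricks at cluster-creation time; the sketch only shows the shape of the rules.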

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Databricks~Database Administration (DBMS)

Job Description

Data scientist: Advanced ML, Agentic AI
  • Applied Math: Applied Statistics, Design of Experiments, Regression, Decision Trees, Forecasting, Optimization algorithms, Clustering, NLP
  • Tech: SQL, Hadoop, Spark, Python, Tableau, MS Excel, MS PowerPoint, GitHub

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Data scientist

Job Description

Databricks Role Description:

Platform & Infrastructure
  • Oversee Databricks platform configuration, resource management, workspace structuring, and cluster optimization.
  • Monitor and troubleshoot performance issues across clusters, jobs, notebooks, and pipelines.
  • Implement governance, security, compliance, and data access control using Role-Based Access Control (RBAC) and Unity Catalog.

Pipeline Development & Architecture
  • Design and implement end-to-end data pipelines using PySpark, SQL, and Delta Lake within a medallion architecture, using Data Factory and Databricks.
  • Build real-time and batch DLT pipelines using Databricks Delta Live Tables, with a focus on reliability and scalability.
  • Optimize Lakehouse architecture for performance, cost-efficiency, and data integrity.
  • Automate data ingestion, transformation, and validation, including support for streaming (Auto Loader) and scheduled workflows.
  • Perform data transformations, cleansing, and validations using data quality rules for consistent and accurate data sets.
  • Manage and monitor job orchestration, ensuring efficient pipeline runs and reliability.

CI/CD & DevOps
  • Design and maintain CI/CD pipelines for Databricks artifacts (notebooks, jobs, libraries) using tools such as Azure DevOps, GitHub Actions, Terraform, or Jenkins.
  • Support trunk-based development, deployment workflows, and infrastructure-as-code practices.
  • Manage version control and automated testing using Git and related DevOps practices.

Collaboration & Delivery
  • Collaborate with product owners, business stakeholders, and data teams to gather requirements and translate them into technical solutions.
  • Drive the adoption of best practices in coding, versioning, testing, deployment, monitoring, and security.
  • Provide thought leadership on best practices in Data Engineering, Architecture, and Cloud Computing.

Performance Optimization
  • Deliver optimized Spark jobs and SQL queries for large-scale data processing.
  • Implement partitioning, caching, and indexing strategies to improve the performance and scalability of big data workloads.
  • Conduct POCs for capacity planning and recommend appropriate infrastructure optimizations for cost effectiveness.

Documentation & Knowledge Sharing
  • Create and review detailed documentation for data workflows, SOPs, architectural reviews, etc.
  • Mentor junior team members and promote a culture of learning and innovation.
  • Promote a culture of optimization and cost saving, and enable research-driven development.

Required Qualifications: Technical Expertise
  • 5 years in data engineering, with a strong focus on Databricks and Azure ecosystems.
  • Deep hands-on experience with Data Factory, Databricks Lakehouse Architecture, Delta Lake, PySpark, and Spark job optimization.
  • Proficiency in Python, SQL, and optionally Scala for building scalable ETL/ELT pipelines.
  • Strong SQL skills are essential, with hands-on experience in SQL Server or other RDBMS platforms.
  • Strong experience in designing and optimizing DLT pipelines, managing asset
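The "data quality rules" step above can be sketched without Spark. The rule names and columns below are hypothetical; in an actual Databricks pipeline the same pattern would typically be expressed as Delta Live Tables expectations or a PySpark filter, while plain Python keeps this example self-contained:

```python
# Minimal sketch of rule-based data validation (hypothetical rules and columns).
RULES = {
    "order_id_present": lambda r: r.get("order_id") is not None,
    "amount_non_negative": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0,
    "country_code_valid": lambda r: r.get("country") in {"IN", "US", "DE"},
}

def validate(rows):
    """Split rows into (clean, rejected); rejected rows carry the failed rule names."""
    clean, rejected = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        if failed:
            rejected.append({**row, "_failed_rules": failed})
        else:
            clean.append(row)
    return clean, rejected

rows = [
    {"order_id": 1, "amount": 99.5, "country": "IN"},
    {"order_id": None, "amount": -5, "country": "XX"},
]
clean, rejected = validate(rows)
print(len(clean), len(rejected))  # 1 1
```

Tagging rejected rows with the failing rules, rather than silently dropping them, is what makes the quarantined data auditable downstream.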

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Databricks

Job Description

IND Tax Accountant

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : IND Tax Accountant

Job Description

IND Tax Technology Analyst

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : IND Tax Technology Analyst

Job Description

Kindly share profiles for the below requirement on priority.

Exp: 10 years and above
Location: PAN India
Notice: Immediate

CPI resources - 10+ years of experience in CPI

JD:
  • Resource will be the lead CPI consultant, performing end-to-end CPI integration builds with SuccessFactors modules.
  • Resource will fix any incidents or issues with existing CPI integrations.
  • Resource will assist with SuccessFactors OData API field mapping for integration.
  • Resource will be responsible for certificate updates and key rotation in SuccessFactors and CPI.
  • Resource will be responsible for the build and unit testing of new iFlows.
  • Resource will be on roster support for CPI integration runs.
  • Resource will design new interfaces based on business requirements.

Skill:
  • Must have experience in at least 1 end-to-end SuccessFactors implementation project.
  • Must have prior experience with SuccessFactors integration, ECC systems, EC Payroll integrations, ServiceNow, and third-party systems, using Compound Employee and OData.
  • Experienced in working with standard integration flows as well as creating custom ones.
  • Experienced in creating both synchronous and asynchronous interfaces.
  • Experienced in working with various adapters such as SuccessFactors, SOAP, IDoc, HTTP, SFTP, RFC, JDBC, JMS, OData, Mail, etc.
  • Good knowledge of Groovy scripts, message mappings, node functions, UDFs, and various other palette functions.
  • Good knowledge of support activities such as monitoring and incident resolution.
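The OData field-mapping task above amounts to translating SuccessFactors entity fields into a target schema. A minimal sketch with hypothetical field names (in a real CPI iFlow this translation would normally live in a message mapping or a Groovy script, not Python):

```python
# Hypothetical SuccessFactors -> target field mapping (illustrative names only).
FIELD_MAP = {
    "userId": "employee_id",
    "firstName": "given_name",
    "lastName": "family_name",
    "department": "dept_code",
}

def map_record(sf_record: dict) -> dict:
    """Rename mapped fields; anything not in the mapping is dropped."""
    return {target: sf_record[source]
            for source, target in FIELD_MAP.items()
            if source in sf_record}

sf = {"userId": "1001", "firstName": "Asha", "lastName": "Rao", "division": "Sales"}
print(map_record(sf))
# {'employee_id': '1001', 'given_name': 'Asha', 'family_name': 'Rao'}
```

Keeping the mapping as a declarative table rather than inline renames is the point: it is the part a functional consultant reviews, while the transform code stays untouched.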

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP CPI