Databricks
Role Descriptions:

Platform Infrastructure
- Oversee Databricks platform configuration, resource management, workspace structuring, and cluster optimization.
- Monitor and troubleshoot performance issues across clusters, jobs, notebooks, and pipelines.
- Implement governance, security, compliance, and data access control using Role-Based Access Control (RBAC) and Unity Catalog.

Pipeline Development & Architecture
- Design and implement end-to-end data pipelines using PySpark, SQL, and Delta Lake within a medallion architecture using Data Factory and Databricks.
- Build real-time and batch DLT pipelines using Databricks Delta Live Tables with a focus on reliability and scalability.
- Optimize Lakehouse architecture for performance, cost-efficiency, and data integrity.
- Automate data ingestion, transformation, and validation, including support for streaming (Auto Loader) and scheduled workflows.
- Perform data transformations, cleansing, and validations using data quality rules for consistent and accurate data sets.
- Manage and monitor job orchestration, ensuring efficient and reliable pipeline runs.

CI/CD & DevOps
- Design and maintain CI/CD pipelines for Databricks artifacts (notebooks, jobs, libraries) using tools such as Azure DevOps, GitHub Actions, Terraform, or Jenkins.
- Support trunk-based development, deployment workflows, and infrastructure-as-code practices.
- Manage version control and automated testing using Git and related DevOps practices.

Collaboration & Delivery
- Collaborate with product owners, business stakeholders, and data teams to gather requirements and translate them into technical solutions.
- Drive the adoption of best practices in coding, versioning, testing, deployment, monitoring, and security.
- Provide thought leadership on best practices in data engineering, architecture, and cloud computing.

Performance Optimization
- Deliver optimized Spark jobs and SQL queries for large-scale data processing.
- Implement partitioning, caching, and indexing strategies to improve the performance and scalability of big data workloads.
- Conduct POCs for capacity planning and recommend appropriate infrastructure optimizations for cost effectiveness.

Documentation & Knowledge Sharing
- Create detailed documentation for data workflows, SOPs, architectural reviews, etc., and review it regularly.
- Mentor junior team members and promote a culture of learning and innovation.
- Promote a culture of optimization and cost saving, and enable research-driven development.

Required Qualifications

Technical Expertise
- 5 years in data engineering, with a strong focus on Databricks and Azure ecosystems.
- Deep hands-on experience with Data Factory, Databricks Lakehouse architecture, Delta Lake, PySpark, and Spark job optimization.
- Proficiency in Python, SQL, and optionally Scala for building scalable ETL/ELT pipelines.
- Strong SQL skills are essential, with hands-on experience in SQL Server or other RDBMS platforms.
- Strong experience in designing and optimizing DLT pipelines, managing asset
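The "data quality rules" responsibility above can be sketched outside Spark as a minimal, illustrative rule set in plain Python; in a Databricks pipeline the same idea would typically be expressed as DLT expectations or PySpark filters. The rule names, field names, and threshold below are hypothetical, not taken from the posting.

```python
# Illustrative data-quality rules: each rule is a predicate over one row (a dict).
# Rule names, fields, and the amount threshold are hypothetical examples.

def not_null(field):
    return lambda row: row.get(field) is not None

def in_range(field, lo, hi):
    return lambda row: row.get(field) is not None and lo <= row[field] <= hi

RULES = {
    "customer_id_present": not_null("customer_id"),
    "amount_in_range": in_range("amount", 0, 1_000_000),
}

def validate(rows, rules):
    """Split rows into (valid, rejected); rejected rows record which rules failed."""
    valid, rejected = [], []
    for row in rows:
        failed = [name for name, check in rules.items() if not check(row)]
        if failed:
            rejected.append({**row, "_failed_rules": failed})
        else:
            valid.append(row)
    return valid, rejected
```

Keeping rejected rows (with the names of the failed rules) rather than dropping them mirrors the common quarantine-table pattern, so bad records remain auditable downstream.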
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
9 BDC7C

Summary: As a Technology Platform Engineer, you will be responsible for creating both production and non-production cloud environments utilizing appropriate software tools tailored for specific projects or products. Your typical day will involve deploying automation pipelines and automating the processes of environment creation and configuration, ensuring that all systems are optimized for performance and reliability. You will collaborate with various teams to ensure seamless integration and functionality across platforms, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor and evaluate the performance of cloud environments to ensure optimal operation.

Professional & Technical Skills:
- Must-have skills: proficiency in TIBCO BusinessWorks.
- Strong understanding of cloud infrastructure and services.
- Experience with automation tools and scripting languages.
- Familiarity with CI/CD practices and tools.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in TIBCO BusinessWorks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Salary: Rs. 0 - Rs. 2,16,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Req ID: 10669815
Location: PUNE, MH / HYDERABAD, TS / BANGALORE, KA
Role Descriptions: Senior Data Engineer
- Strong hands-on experience with Azure Data Factory, Databricks, and Azure Cloud services.
- Good knowledge of Blob Storage, ADLS, Azure Logic Apps, and Key Vault.
- Ability to design Azure Data Factory (ADF) pipelines that automate end-to-end ETL workflows and enable high-performance data movement through seamless integration with Azure Blob Storage.
- Expertise in Databricks with PySpark, Spark SQL, and Delta Lake.
- Deep understanding of data modeling, ETL/ELT concepts, and data warehousing.
- Experience with Azure Data Lake Storage (ADLS).
- Strong knowledge of SQL and performance tuning.
- Experience working in Agile/Scrum environments.
- Ability to design solutions independently and work with minimal supervision.
Skills: Digital: Databricks, Azure Data Factory
Experience Required: 6-8 years
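The Delta Lake and ETL/ELT skills this role calls for center on incremental loads. A minimal sketch of the upsert pattern (what Delta Lake's MERGE statement does natively) is shown below with plain Python dicts keyed by a business key; the table contents and the "id" key are hypothetical, and a real pipeline would use Delta Lake's MERGE rather than this in-memory version.

```python
# Sketch of an incremental upsert (MERGE): update rows whose key already exists
# in the target, insert rows whose key is new. Data here is hypothetical.

def merge_upsert(target, updates, key):
    """Apply an incremental batch to a target table (list of dict rows)."""
    merged = {row[key]: row for row in target}
    for row in updates:
        # Existing rows keep fields the update batch does not mention.
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())

current = [{"id": 1, "name": "Ada", "city": "Pune"}]
batch = [{"id": 1, "city": "Hyderabad"}, {"id": 2, "name": "Lin"}]
result = merge_upsert(current, batch, "id")
# id 1 keeps "name" and gets the updated "city"; id 2 is inserted
```

The design choice worth noting is that an update batch only needs to carry changed columns; the merge preserves untouched columns of matched rows, which mirrors the `whenMatchedUpdate` / `whenNotMatchedInsert` split in Delta Lake's MERGE.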
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance