Job Title: Engineer
Work Location: Hyderabad TS and Kolkata WB
Skill Required: Digital: Python; Digital: Amazon Web Services (AWS) Cloud Computing
Experience Range in Required Skills: 4-6 years
Job Description: AWS/Python Data Engineers
Essential Skills: AWS/Python Data Engineers
Responsibilities
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
- Platform Administration: Implement and maintain the Databricks platform, including workspace setup, user and group management, access control, and security configurations.
- Operational Support: Provide technical support to data engineering, data science, and application teams. Perform restores/recoveries, troubleshoot service issues, and resolve platform-related problems.
- Cluster Management: Install, configure, and maintain Databricks clusters and workspaces. Monitor and manage cluster performance, resource utilization, and platform costs.
- Security and Compliance: Implement and manage access controls and security policies to protect sensitive data. Ensure compliance with relevant data governance and regulatory requirements.
- Integration: Manage and maintain connections between Databricks and other data sources such as Snowflake. Optimize data pipelines and workflows.
- Automation: Develop and maintain automation scripts and tools for platform provisioning and routine tasks using Terraform.
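As a sense of the cluster-monitoring work above, a script in this role might parse the Databricks REST API `/api/2.0/clusters/list` response to flag running clusters and their worker counts. This is only a sketch: the field names (`clusters`, `state`, `cluster_name`, `num_workers`) follow the public Clusters API, and the sample payload is illustrative, not from any real workspace.

```python
# Sketch: summarize running clusters from a Databricks Clusters API
# `/api/2.0/clusters/list`-style response (sample payload is illustrative).

def summarize_running_clusters(clusters_response: dict) -> list[dict]:
    """Return name and worker count for every cluster in RUNNING state."""
    running = []
    for cluster in clusters_response.get("clusters", []):
        if cluster.get("state") == "RUNNING":
            running.append({
                "name": cluster.get("cluster_name"),
                "workers": cluster.get("num_workers", 0),
            })
    return running

sample = {
    "clusters": [
        {"cluster_name": "etl-prod", "state": "RUNNING", "num_workers": 8},
        {"cluster_name": "adhoc-dev", "state": "TERMINATED", "num_workers": 2},
    ]
}
print(summarize_running_clusters(sample))
# [{'name': 'etl-prod', 'workers': 8}]
```

In practice the response would come from an authenticated GET against the workspace URL; the parsing logic is the same.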
Essential Skill:
- Terraform, AWS cloud services, Databricks administration
- AWS IAM, VPC, private endpoints, firewalls, S3, and knowledge of how Databricks integrates with them
- MLflow, Workflows, Databricks Asset Bundles
- Databricks REST API: for automation, scripting, and integration
- dbx CLI: for CI/CD integration
- Delta Live Tables (DLT): for managing streaming and batch pipelines
- SQL and PySpark skills for debugging and support
Desirable Skill:
Databricks certification
Skill Required: Digital: Python; Digital: Amazon Web Services (AWS) Cloud Computing
Experience Range in Required Skills: 4-6 years
Job Description: AWS/Python Data Engineers
Essential Skills: AWS/Python Data Engineers
Responsibilities
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
- Collaborate with stakeholders to gather requirements and design solutions using the Databricks platform
- Migrate Python, PySpark, and MWAA DAGs to Databricks Notebooks and Workflows
- Work with BJs computer operations team to repoint Talend/MWAA/Autosys jobs to Databricks Workflows
- Implement the medallion architecture in the Databricks Enterprise Data Platform following industry best practices, and recommend the best framework for moving Redshift tables to Delta tables
- Provide technical expertise and guidance to team members and stakeholders
- Contribute to the development of documentation, training materials, and user guides
- Participate in testing and quality assurance activities to ensure reliability and performance
- Optimize long-running Databricks Workflows
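To give a flavor of the MWAA-to-Databricks migration above, one approach is to translate each Airflow task into a Databricks Jobs API 2.1 job spec with a notebook task. The spec fields (`name`, `tasks`, `task_key`, `existing_cluster_id`, `notebook_task.notebook_path`) follow the Jobs API; the flat input parameters are a hypothetical simplification of a real DAG task.

```python
# Sketch: build a one-task Databricks Jobs API 2.1 job spec for a
# migrated Airflow (MWAA) DAG task. Input shape is a simplification.

def airflow_task_to_job_spec(dag_id: str, task_id: str,
                             notebook_path: str, cluster_id: str) -> dict:
    """Return a Jobs API 2.1 spec running one notebook on an existing cluster."""
    return {
        "name": f"{dag_id}__{task_id}",
        "tasks": [{
            "task_key": task_id,
            "existing_cluster_id": cluster_id,
            "notebook_task": {"notebook_path": notebook_path},
        }],
    }

spec = airflow_task_to_job_spec("daily_sales", "load_orders",
                                "/Workspace/migrated/load_orders", "0101-abc123")
print(spec["name"])
# daily_sales__load_orders
```

The resulting dict would be POSTed to the workspace's `jobs/create` endpoint; task dependencies from the DAG would map onto `depends_on` entries in a multi-task spec.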
Essential Skill:
- Data Engineer with 3+ years of Databricks and AWS development experience
- Development experience in Python and PySpark
- Hands-on experience writing complex SQL and optimizing the performance of very complex SQL queries
- Databricks Certified Data Engineer Associate certification
- Ability to automate unit testing for migration projects, with proven experience in large-scale data/code migration projects
- Excellent analytical and problem-solving skills
- Ability to work independently and as part of a team in a fast-paced environment
- Willingness to work flexible shifts and weekends as per project needs
- Good learning attitude
- Ability to help and guide junior team members
Desirable Skill:
- Experience with the AWS platform (S3, EC2, EMR, MWAA, and Redshift)
- Talend hands-on experience
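The automated migration testing called for in the essential skills could start with a simple source-vs-target reconciliation. A minimal sketch, assuming rows are fetched as lists of dicts (in practice they would come from Redshift and Delta table queries; the table shapes here are illustrative):

```python
# Sketch: validate a table migration by comparing row counts and
# checking for keys missing from the migrated copy.

def validate_migration(source_rows: list[dict], target_rows: list[dict],
                       key: str) -> dict:
    """Report whether counts match and which key values are missing in target."""
    source_keys = {row[key] for row in source_rows}
    target_keys = {row[key] for row in target_rows}
    return {
        "counts_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(source_keys - target_keys),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 1, "amt": 10}]
print(validate_migration(source, target, "id"))
# {'counts_match': False, 'missing_in_target': [2]}
```

Checks like this are easy to wrap in a unit-test suite and run per table after each migration batch; column-level checksums could be added the same way.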