Job Description: PL/SQL Developer with Python and AWS
The ideal candidate will be proficient in PL/SQL, Python, and AWS.
Key Responsibilities:
• Strong hands-on experience with PL/SQL and DBMS concepts.
• 5-8 years of experience with Oracle and Python.
• Good knowledge of database technologies.
• Solid experience with AWS.
• Experience with Control-M and Postman is an added advantage.
Preferred Skills:
• Experience with data pipelines and DBMS concepts.
• A strong desire to learn new technologies and stay updated with industry trends.
• Dedication to high-quality work, with attention to detail and commitment to deadlines.
Requirements:
• Strong experience in PL/SQL and Python development (see the sketch after this list).
• Proficiency in SQL.
• Strong problem-solving skills, adaptability, and eagerness to learn.
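Purely for illustration, here is a minimal sketch of the kind of combined PL/SQL, Python, and AWS work this role involves: calling an Oracle stored procedure from Python and uploading the result to S3. Every name here (the procedure, table, bucket, and connection details) is a hypothetical placeholder, not something specified by this posting.

```python
import csv
import io

import boto3      # AWS SDK for Python
import oracledb   # python-oracledb driver for Oracle

# Hypothetical connection details -- replace with real credentials/DSN.
conn = oracledb.connect(user="app_user", password="secret",
                        dsn="dbhost:1521/ORCLPDB1")

with conn.cursor() as cur:
    # Call a (hypothetical) PL/SQL procedure that refreshes a report table.
    cur.callproc("reporting_pkg.refresh_daily_summary")

    # Pull the refreshed rows with plain SQL.
    cur.execute("SELECT region, total_sales FROM daily_summary")
    rows = cur.fetchall()

# Serialize to CSV in memory and upload to a (hypothetical) S3 bucket.
buf = io.StringIO()
csv.writer(buf).writerows(rows)
boto3.client("s3").put_object(Bucket="example-reports",
                              Key="daily_summary.csv",
                              Body=buf.getvalue())
```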
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
The AWS Databricks Platform Administrator is a technical role responsible for designing, implementing, and maintaining the processes used to manage the organization's Databricks platform. The administrator facilitates the work of data analysts, data engineers, data scientists, and data governance stewards while maintaining best practices for security and compliance. Unity Catalog is used to implement access and identity policies, and infrastructure is maintained with Terraform. The administrator assists with compute issues, monitoring, and alerting, and ensures the platform runs smoothly. This role requires strong organizational and communication skills.
Job responsibilities:
• Manage and maintain role-based access to data and features in the Databricks platform, using Unity Catalog where applicable.
• Implement external access controls for outside teams using service principals, SQL warehouses, and Delta Sharing.
• Work with platform users to resolve problems with the Databricks platform, job deployments, and job failures; improve the processes and systems used to manage infrastructure, users, and external access.
• Create AWS and Databricks infrastructure using Terraform: S3, IAM roles, instance profiles, KMS keys, secrets, VPCs, Athena, CloudTrail, and other required resources.
• Understand and implement best practices for security and compliance; create documentation for users and admins.
• Use the Databricks APIs to automate administrative tasks (see the sketch after this list).
• Create queries and dashboards to monitor critical systems and processes, using Databricks system tables to track resource usage.
• Manage tags and enforce naming conventions for all resources in the environment, as applicable.
• Manage an open-source Spark cluster (an open-source counterpart to Cloudera).
• Manage integrations with other tools in the data lake, such as Denodo, Collibra, Redshift, Kafka, and Protegrity.
• Meet all agreed SLA requirements for service requests, incidents, and changes.
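As a rough sketch of the API-driven automation and tag enforcement described above (an illustrative assumption, not a prescribed implementation): the Databricks REST API can be driven from any HTTP client. The workspace URL is a placeholder, and the cost_center tag is a hypothetical naming convention.

```python
import os

import requests  # generic HTTP client; Postman exercises the same endpoints

# Placeholder workspace URL and a token read from the environment.
HOST = "https://example-workspace.cloud.databricks.com"
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# List all clusters in the workspace (Clusters API 2.0).
clusters = requests.get(f"{HOST}/api/2.0/clusters/list",
                        headers=HEADERS).json().get("clusters", [])

# Example admin sweep: report clusters missing a mandatory cost-center tag.
for c in clusters:
    tags = c.get("custom_tags", {})
    if "cost_center" not in tags:
        print(f"untagged cluster: {c['cluster_name']} ({c['cluster_id']})")
```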
Required qualifications:
• 8 or more years of experience administering the Databricks platform (on AWS preferred).
• Strong understanding of data engineering needs and a willingness to take up new integrations, requirements and challenges.
• Experience designing and creating AWS infrastructure with Terraform, via GitHub Actions or similar DevOps tools.
• Strong understanding of the AWS infrastructure required by Databricks.
• Experience with compliance audits and audit documentation.
• Experience using the Databricks APIs with tools like Postman.
• Understanding of the Databricks workspace and development environment.
• Strong SQL skills for creating admin-related queries and dashboards (see the sketch after this list).
• Strong debugging skills and the ability to perform risk analysis for third-party integrations with AWS resources.
• Knowledge of Python is a plus.
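A minimal sketch of an admin-style monitoring query of the kind mentioned above, run through the databricks-sql-connector package against the Unity Catalog billing system table (system.billing.usage); the hostname and warehouse HTTP path are hypothetical placeholders.

```python
import os

from databricks import sql  # databricks-sql-connector package

# Placeholder connection details for a SQL warehouse.
conn = sql.connect(
    server_hostname="example-workspace.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/abc123",  # hypothetical warehouse path
    access_token=os.environ["DATABRICKS_TOKEN"],
)

# Daily DBU consumption by SKU over the last week -- the kind of query
# that backs a usage-monitoring dashboard.
QUERY = """
    SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= current_date() - INTERVAL 7 DAYS
    GROUP BY usage_date, sku_name
    ORDER BY usage_date, dbus DESC
"""

with conn.cursor() as cur:
    cur.execute(QUERY)
    for usage_date, sku_name, dbus in cur.fetchall():
        print(usage_date, sku_name, dbus)
```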
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Mandatory Skills:
AWS databases, data services, Power BI stack, big data
Skills to Evaluate:
AWS databases, data services, Power BI stack, data modeling, ETL systems, warehouse stacks
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
- Independently develop code using the required technical skills.
- Troubleshoot and resolve issues in production and development environments (see the sketch after this list).
- Hands-on experience in the cloud (e.g., Azure, AWS).
- Collaborate with cross-functional teams to gather requirements and implement solutions that meet business needs.
- Participate in all phases of the software development lifecycle, including planning, development, testing, and deployment.
- Configure CI/CD pipelines using GitHub Actions.
- Adapt quickly to new technologies and methodologies, demonstrating a continuous-learning mindset.
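Purely as an illustration of production troubleshooting in the cloud (an assumed workflow, not a requirement stated in this posting): scanning an AWS CloudWatch log group for recent errors with boto3. The log group name is a hypothetical placeholder.

```python
import time

import boto3  # AWS SDK for Python

logs = boto3.client("logs")

# Scan the last hour of a (hypothetical) application log group for errors.
resp = logs.filter_log_events(
    logGroupName="/app/example-service",
    startTime=int((time.time() - 3600) * 1000),  # epoch milliseconds
    filterPattern="ERROR",
)

for event in resp["events"]:
    print(event["timestamp"], event["message"].strip())
```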
Salary: Rs. 0.0 - Rs. 16.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance