ORACLE SQL~Digital : Databricks~Digital : PySpark
Role Descriptions: Job Description for Data Engineer
- Design and develop ETL processes based on functional and non-functional requirements in Databricks (Python/PySpark) within the Azure platform.
- Understand the full end-to-end development activities, from design to go-live, for ETL development on the Azure platform.
- Recommend and execute improvements.
- Execute and provide support during testing cycles and post-production deployment; engage in peer code reviews.
- Apply automation and innovation to new and ongoing data platforms aligned to business strategies.
- Document component design for developers and for broader communication.
- Understand and adopt an Agile (SCRUM-like) software development mindset.
- Follow established processes/standards and business technology architecture for development, release management and the deployment process.
- Design, develop and implement reporting platforms and complex ETL frameworks that meet business requirements.
- Provide data analysis and requirements within the enterprise platform.
- Develop and maintain knowledge of data available from upstream sources and of data within the various platforms.
Essential Skills: Must-have skills
- Azure Databricks, Azure Data Factory
- Python, Spark, PySpark, shell scripting, Synapse
- 5 years of experience in the ETL/Data area, with a minimum of 3 years of development experience in the above technologies (PySpark, ADB, ADF), is a must
- SQL; working experience with data modeling, relational modeling and dimensional modeling, Data Warehousing, ETL
- General knowledge of file formats (e.g. XML, CSV, JSON), databases (e.g. MS SQL, Oracle) and different types of connectivity is also very useful
- Data file movement via mailbox
- Source-code versioning/promotion tools, e.g. Git, Jenkins
- Good communication skills; team player
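The skills above center on ETL development in PySpark on Azure Databricks. As a rough illustration of the extract/transform/load shape such work takes, here is a minimal stdlib-only Python sketch; all field names, values, paths, and the commented PySpark calls are illustrative assumptions, not details from the posting:

```python
# Illustrative ETL step in plain Python (stdlib only). In Databricks the same
# extract/transform/load shape would be expressed with PySpark DataFrames.
import csv
import io
import json
from datetime import datetime

# Extract: JSON-lines records standing in for a raw source file (hypothetical data).
raw = "\n".join([
    json.dumps({"order_id": "A-1", "order_date": "2024-01-05", "amount": 120.0}),
    json.dumps({"order_id": "A-2", "order_date": "2024-01-06", "amount": 80.0}),
])

def transform(records):
    """Type the date field and derive a monthly partition key."""
    out = []
    for rec in records:
        d = datetime.strptime(rec["order_date"], "%Y-%m-%d").date()
        out.append({**rec, "order_month": d.strftime("%Y-%m")})
    return out

curated = transform(json.loads(line) for line in raw.splitlines())

# Load: write the curated rows as CSV (in practice, a target path or table).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["order_id", "order_date", "amount", "order_month"])
writer.writeheader()
writer.writerows(curated)

# Rough PySpark equivalent of the same step (sketch, paths hypothetical):
#   df = spark.read.json(input_path)
#   df = df.withColumn("order_month",
#                      F.date_format(F.to_date("order_date"), "yyyy-MM"))
#   df.write.mode("overwrite").partitionBy("order_month").parquet(output_path)
```

The derived `order_month` column is the kind of partition key commonly used when landing curated data as Parquet/Delta.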
Desirable Skills:
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: ORACLE SQL~Digital : Databricks~Digital : PySpark
Microservices~Java API Management & Microservices~Digital : Spring Boot~Core Java
Role Descriptions: Java Developer with the Spring Boot framework
- 2-5 years of software development experience
- Over 2 years dedicated to web API programming in Java
- Hands-on experience consuming APIs from various REST and SOAP services
- Technical expertise working with Docker containers and Kubernetes
- Working knowledge of cloud platforms, including Google Cloud Platform and AWS
- Familiarity with PKI and X.509 certificates
Essential Skills: Core Java, Digital : Spring Boot, Digital : Microservices, and Java API Management & Microservices
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Microservices~Java API Management & Microservices~Digital : Spring Boot~Core Java