We found 22 jobs matching your search.


Job Description

Python Developer

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Python Developer

Job Description

AQA – Java + Selenium + Python + Robot Framework

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Sr. Developer
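
As a taste of the automation stack this listing names (Selenium driven from Python, alongside Robot Framework), here is a minimal, illustrative sketch. It assumes Selenium 4+, where Selenium Manager resolves the browser driver automatically; the target URL and expected text are placeholders.

    # Minimal Selenium-with-Python sketch (assumes Selenium 4+;
    # the URL and expected text are hypothetical placeholders).
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def test_example_page_loads():
        driver = webdriver.Chrome()  # Selenium Manager resolves the driver binary
        try:
            driver.get("https://example.com")        # hypothetical target page
            assert "Example Domain" in driver.title  # sanity-check the page title
            heading = driver.find_element(By.TAG_NAME, "h1")
            assert heading.text == "Example Domain"
        finally:
            driver.quit()  # always release the browser session

    if __name__ == "__main__":
        test_example_page_loads()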

Job Description

Minimum 4-7 years of development experience in Azure Data Services.

Job description:
  • 6-7 years of total experience
  • 4 to 7 years of development experience in Azure Data Services
  • Experience integrating on-prem data sources with Azure data storage
  • Linux skills for upgrading a Vertica database
  • Programming experience in Python and R (TypeScript is a plus)
  • Experience developing data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
  • Experience with Azure DevOps (building a deployment pipeline from Stage to Prod)
  • Experience using Visual Studio or a similar tool to build and debug code
  • Experience working with Git
  • Experience with Azure admin tasks (permissions, network, security)
  • Good to have: knowledge of data analysis using Synapse Analytics and Azure Databricks
  • Good to have: knowledge of Azure OpenAI and ML services
  • Microsoft Certified Azure Data Engineer or similar is preferred

Other expectations:
  • Good communication and strong collaboration skills
  • Open-minded
  • High interest in new technologies
  • Experienced in working with distributed teams across different cultures
  • Analytical thinking, a high level of comprehension, and an independent working style
  • Will work with colleagues based out of the US, up to 2 pm EST

A minimal Databricks pipeline sketch follows the listing details below.

  • Salary: Rs. 24,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Engineer
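
As a rough illustration of the Azure Databricks pipeline work described above, here is a PySpark sketch of a bronze-to-silver step that reads raw CSV from ADLS Gen2 and writes a Delta table. The storage account, container paths, and column names are hypothetical, and the spark session is assumed to be supplied by the Databricks runtime.

    # PySpark sketch of a Databricks-style bronze-to-silver step.
    # Storage account, paths, and column names are hypothetical;
    # `spark` is assumed to be provided by the Databricks runtime.
    from pyspark.sql import functions as F

    raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"       # hypothetical ADLS path
    silver_path = "abfss://silver@examplestorage.dfs.core.windows.net/sales/" # hypothetical output

    df = (spark.read
          .option("header", "true")
          .csv(raw_path))

    cleaned = (df
               .withColumn("amount", F.col("amount").cast("double"))  # normalize types
               .withColumn("ingested_at", F.current_timestamp())      # add load metadata
               .dropDuplicates(["order_id"]))                         # hypothetical key column

    (cleaned.write
            .format("delta")
            .mode("overwrite")
            .save(silver_path))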

Job Description

Skills and Qualifications:
  • 4+ years of experience in Python development, with a focus on data integration and processing
  • Previous experience in financial services or on customer data platforms is highly desirable
  • Strong proficiency in Python, with experience developing data-driven applications
  • Experience with ETL processes and integrating data from multiple sources
  • Familiarity with data processing libraries such as Pandas, NumPy, and PySpark
  • Experience with API integration and web scraping techniques
  • Understanding of relational databases and SQL
  • Experience with version control systems, particularly Git
  • Strong problem-solving skills and the ability to work collaboratively in a team environment
  • Experience with cloud platforms (e.g., AWS, Azure)

Key Responsibilities:
  • Develop and maintain scalable Python scripts and applications for extract, transform, and load (ETL) processes
  • Integrate data from sources such as APIs, databases, and flat files into the customer data platform
  • Optimize and refactor existing Python code to improve performance and reliability
  • Collaborate with data engineers to design and implement data pipelines that meet business requirements
  • Implement and manage data validation processes to ensure data accuracy and quality
  • Troubleshoot and resolve issues related to data processing and integration
  • Work with stakeholders to understand data requirements and provide technical solutions

A minimal ETL sketch follows the listing details below.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Python Developer
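
To make the ETL pattern in this listing concrete (extract from an API and a flat file, transform with Pandas, load into a relational table), here is a minimal sketch. The endpoint, file name, table, key columns, and connection string are all hypothetical placeholders.

    # Minimal Pandas ETL sketch: extract from an API and a CSV, transform,
    # load into a relational table. Endpoint, file, table, and connection
    # string are hypothetical placeholders.
    import pandas as pd
    import requests
    from sqlalchemy import create_engine

    def extract() -> pd.DataFrame:
        rows = requests.get("https://api.example.com/customers", timeout=30).json()  # hypothetical endpoint
        api_df = pd.DataFrame(rows)
        file_df = pd.read_csv("customers_export.csv")  # hypothetical flat file
        return pd.concat([api_df, file_df], ignore_index=True)

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        df = df.drop_duplicates(subset="customer_id")      # hypothetical key column
        df["email"] = df["email"].str.strip().str.lower()  # basic normalization
        return df.dropna(subset=["customer_id", "email"])  # simple validation step

    def load(df: pd.DataFrame) -> None:
        engine = create_engine("postgresql+psycopg2://user:pass@localhost/cdp")  # hypothetical DSN
        df.to_sql("customers", engine, if_exists="append", index=False)

    if __name__ == "__main__":
        load(transform(extract()))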

Job Description

JD: Data Engineer

  • Work with the data architect and data engineering teams to design and build BI and analytics solutions.
  • Implement batch and real-time data movement design patterns and define best practices in data engineering.
  • Design and develop optimal cloud data solutions (lakes, warehouses, marts, analytics) by collaborating with diverse IT teams including business analysts, project managers, architects, developers, and vendors.
  • Work closely with data engineers and data analysts to procure, blend, and analyze data for quality and distribution, ensuring key elements are harmonized and modeled for effective analytics while operating in a fluid, rapidly changing data environment.
  • Build data pipelines from a wide variety of sources.
  • Demonstrate strong conceptual, analytical, and problem-solving skills and the ability to articulate ideas and technical solutions effectively to external IT partners as well as internal data team members.
  • Work with cross-functional, on-shore/off-shore development, QA, and vendor teams in a matrixed environment for data delivery.
  • Backtrack, troubleshoot failures, and provide fixes as needed.
  • Update and maintain key cloud data solution deliverables and diagrams.
  • Ensure conformance and compliance with Georgia-Pacific data architecture guidelines and the enterprise data strategic vision.
  • Provide consulting support in analysis, solution design, and service delivery.
  • Around 6 years of overall IT experience.
  • At least 4+ years of hands-on experience designing, implementing, and managing large-scale data and ETL solutions with AWS IaaS and PaaS compute, storage, and database services (S3, RDS, Lambda, IAM, Redshift, Glue, EMR, Kinesis, Athena).
  • Extensive experience with Spark.
  • Extensive expertise in Python, preferably.
  • Hands-on experience with the cloud monitoring stack: CloudTrail, CloudWatch, and AWS EventBridge trigger services.
  • Design, develop, test, and deploy pipelines with batch, real-time, and near-real-time capabilities across our segments.
  • 2+ years of experience leading and delivering a data analytics project.
  • Strong knowledge of data engineering, data warehousing, and database concepts.
  • Ability to analyze large, complex data sets to resolve data quality issues.
  • Experience working in an Agile environment.
  • Excellent verbal and written communication.

What Will Put You Ahead (experience and education preferred):
  • AWS certifications such as Solutions Architect (SAA/SAP) or Data Analytics Specialty (DAS)
  • Hands-on experience with AWS data technologies and at least one full-lifecycle project building a data solution in AWS
  • Understanding of common DevSecOps/DataOps and CI/CD processes, methodologies, and technologies such as GitLab and Terraform
  • Strong knowledge of PySpark, SQL, Redshift stored procedures, Kinesis, and AWS Glue
  • IAM policies and best practices, with an emphasis on custom security
  • Independent problem solver

Must-have skills and minimum work experience:
  • AWS S3: 1+ year
  • AWS Lambda: 2+ years
  • PySpark: 1+ years
  • Redshift: 1+ years
  • CI/CD: 2+ years
  • Terraform: 1+ years
  • Node.js / Python / Java: 1+ years
  • MSBI stack (SQL, SSIS): 2+ years

A minimal S3-to-Glue trigger sketch follows the listing details below.

  • Salary: Rs. 30,00,000 - Rs. 40,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Engineer
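
As one hedged sketch of the event-driven S3/Lambda/Glue pattern this listing implies, here is a Lambda handler that starts a Glue job when a file lands in S3. The Glue job name is hypothetical, and the function assumes an S3 put-event trigger and an execution role permitted to call glue:StartJobRun.

    # Lambda handler sketch: kick off a Glue job when a file lands in S3.
    # The Glue job name is hypothetical; assumes an S3 put-event trigger
    # and an execution role allowed to call glue:StartJobRun.
    import boto3

    glue = boto3.client("glue")

    def lambda_handler(event, context):
        record = event["Records"][0]  # S3 put events carry a Records list
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        response = glue.start_job_run(
            JobName="sales-etl-job",  # hypothetical Glue job
            Arguments={               # passed to the job as --prefixed arguments
                "--source_path": f"s3://{bucket}/{key}",
            },
        )
        return {"glue_job_run_id": response["JobRunId"]}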

Job Description

Primary Skill: Data Engineer
Experience: 6 to 7 years

Senior Developer (Job Description): Minimum 4-7 years of development experience in Azure Data Services.
  • 6-7 years of total experience
  • 4 to 7 years of development experience in Azure Data Services
  • Experience integrating on-prem data sources with Azure data storage
  • Linux skills for upgrading a Vertica database
  • Programming experience in Python and R (TypeScript is a plus)
  • Experience developing data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services
  • Experience with Azure DevOps (building a deployment pipeline from Stage to Prod)
  • Experience using Visual Studio or a similar tool to build and debug code
  • Experience working with Git
  • Experience with Azure admin tasks (permissions, network, security)
  • Good to have: knowledge of data analysis using Synapse Analytics and Azure Databricks
  • Good to have: knowledge of Azure OpenAI and ML services
  • Microsoft Certified Azure Data Engineer or similar is preferred

Other expectations:
  • Good communication and strong collaboration skills
  • Open-minded
  • High interest in new technologies
  • Experienced in working with distributed teams across different cultures
  • Analytical thinking, a high level of comprehension, and an independent working style
  • Will work with colleagues based out of the US, up to 2 pm EST

A minimal sketch of triggering an Azure Data Factory pipeline from Python follows the listing details below.

  • Salary: Rs. 25,00,000 - Rs. 30,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Azure Data Services
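
To illustrate the pipeline-automation side of this listing, here is a hedged sketch that triggers and polls an Azure Data Factory pipeline run from Python using the azure-mgmt-datafactory SDK. The subscription, resource group, factory, and pipeline names are hypothetical placeholders.

    # Sketch: trigger and poll an Azure Data Factory pipeline run.
    # Subscription, resource group, factory, and pipeline names are hypothetical.
    import time
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical
    RESOURCE_GROUP = "rg-data-platform"                       # hypothetical
    FACTORY = "adf-example"                                   # hypothetical
    PIPELINE = "copy_onprem_to_adls"                          # hypothetical

    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY, PIPELINE,
                                      parameters={"load_date": "2024-01-01"})

    # Poll until the run reaches a terminal state.
    while True:
        status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY, run.run_id).status
        if status not in ("Queued", "InProgress"):
            break
        time.sleep(30)

    print(f"Pipeline finished with status: {status}")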

Job Description

  • Cybersecurity documentation
  • Cybersecurity vulnerability assessment
  • Threat modeling
  • Cybersecurity analysis
  • Design and solutioning of secure products
  • Secure code review
  • Knowledge of security assessment tools such as Burp Suite and Qualys
  • Preferably from a software development background
  • Knowledge of one or more programming languages: C/C++, Java, Python, or scripting
  • Good to have: medical domain knowledge

A short secure-code-review example follows the listing details below.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Security Analyst
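
To make "secure code review" concrete, here is a small before-and-after in Python: a SQL query built by string interpolation (injectable) next to the parameterized fix. The table and column names are hypothetical.

    # Secure-code-review illustration: SQL injection via string formatting,
    # and the parameterized fix. Table and column names are hypothetical.
    import sqlite3

    def find_user_unsafe(conn: sqlite3.Connection, username: str):
        # FLAG IN REVIEW: attacker-controlled input is spliced into the SQL,
        # so username = "x' OR '1'='1" would return every row.
        return conn.execute(
            f"SELECT id, role FROM users WHERE name = '{username}'"
        ).fetchall()

    def find_user_safe(conn: sqlite3.Connection, username: str):
        # FIX: a bound parameter keeps the input as data, not SQL.
        return conn.execute(
            "SELECT id, role FROM users WHERE name = ?", (username,)
        ).fetchall()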

Job Description

Java, Python scripting, Linux

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SW Developer-1

Job Description

Python, Selenium, JBehave, Behave, API testing, containerisation, virtualisation, Linux

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SDET Dev
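
Since this listing names Behave, here is a minimal BDD sketch: a hypothetical feature scenario (shown as comments) and the Python step definitions that would implement it.

    # features/login.feature (hypothetical scenario):
    #
    #   Feature: Login
    #     Scenario: Valid user signs in
    #       Given a registered user "alice"
    #       When the user signs in with a valid password
    #       Then the session is active

    # features/steps/login_steps.py: step definitions matching the feature above
    from behave import given, when, then

    @given('a registered user "{name}"')
    def step_registered_user(context, name):
        context.user = {"name": name, "password": "secret"}  # hypothetical fixture

    @when("the user signs in with a valid password")
    def step_sign_in(context):
        # Stand-in for a real call into the system under test
        context.session_active = context.user["password"] == "secret"

    @then("the session is active")
    def step_session_active(context):
        assert context.session_active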