We found 623 jobs matching your search

Job Description

Python Developer

Responsibilities

Python Developer
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Python Developer

Job Description

Bulk_Opening

Responsibilities

Bulk_Opening
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Bulk_Opening

Job Description

Skill Required - Digital: Amazon Web Services (AWS), Cloud Computing
Experience Range: 6 to 8 years

Job Description – Key Accountabilities:
• Experience building data pipelines for various heterogeneous data sources.
• Identify, design, and implement scalable data delivery pipelines and automate manual processes.
• Build the infrastructure required for optimal extraction, transformation, and loading of data using cloud technologies such as AWS and Azure.
• Develop end-to-end processes at the enterprise level for use by clinical data configuration specialists to prepare extraction and transformation of raw data quickly and efficiently from various sources at the study level.
• Coordinate with downstream users such as statistical programmers, SDTM programming, analytics, and clinical data programmers to ensure that outputs meet end-user requirements.
• Experience creating ELT and ETL to ingest data into data warehouses and data lakes.
• Experience creating reusable data pipelines for heterogeneous data ingestion.
• Manage and maintain pipelines and troubleshoot data in the data lake or warehouse.
• Provide visualization and analysis of data stored in the data lake.
• Define and track KPIs and drive continuous improvement.
• Develop and maintain tools, libraries, and reusable data pipeline templates and standards for study-level consumption by data configuration specialists.
• Collaborate with vendors and cross-functional teams to build and align on data transfer specifications and ensure a streamlined data integration process.
• Provide ad-hoc analysis and visualization as needed.
• Ensure accurate delivery of data format and data frequency with quality deliverables per specification.
• Participate in the development, maintenance, and training provided by standards and other functions on transfer specs and best practices used by the business.
• Collaborate with the system architecture team in designing and developing data pipelines per business needs.
• Network with key business stakeholders on refining and enhancing the integration of structured and unstructured data.
• Provide expertise for structured and unstructured data ingestion.
• Develop organizational knowledge of key data sources and systems, and be a valuable resource to people in the company on how best to integrate data to pursue company objectives.
• Provide technical leadership on various aspects of clinical data flow, including assisting with the definition, build, and validation of application programming interfaces (APIs), data streams, and data staging to various systems for data extraction and integration.
• Experience creating data integrity and data quality checks for data ingestion.
• Coordinate with database builders, clinical data configuration specialists, and data management (DM) programmers to ensure accuracy of data integration per SOPs.
• Provide technical support/consultancy and end-user support; work with Information Technology (IT) in troubleshooting, reporting, and resolving system issues.
• Develop and deliver training programs to internal and external teams; ensure timely communication of new and/or revised data transfer specs.
• Continuous improvement/continuous development.
• Efficiently prepare and process large datasets for downstream consumption by various end users.
• Understand end-to-end stakeholder requirements and contribute to processes and conventions for clinical data ingestion and data transfer agreements.
• Adhere to SOPs for computer system validation and all GCP (Good Clinical Practice) regulations; ensure compliance with own Learning Curricula and corporate and/or GxP requirements.
• Assist with quality review of the above activities when performed by a vendor, as needed.
• Assess and enable clinical data visualization software in the data flows.
• Perform other duties as assigned within timelines.
• Perform clinical data engineering tasks according to applicable SOPs (standard operating procedures) and processes.

Educational Qualification:
• Bachelor's degree in computer science, statistics, biostatistics, mathematics, biology, or another health-related field, or equivalent experience that provides the skills and knowledge necessary to perform the job.

Experience:
• BS with ~8+ years of experience; minimum of 5 years' experience in data engineering, building data pipelines to manage heterogeneous data ingestion, or similar data integration across multiple sources, including collected data.
• Experience with Python/R, SQL, and NoSQL.
• Cloud experience (e.g., AWS, Azure, or GCP).
• Experience with GitLab and GitHub.
• Experience with Jenkins and GitLab.
• Experience deploying data pipelines in the cloud.
• Experience with Apache Spark (Databricks).
• Experience setting up and working with data warehouses and data lakes (e.g., Snowflake, Amazon Redshift).
• Experience setting up ELT and ETL.
• Experience with unstructured data processing and transformation.
• Experience developing and maintaining data pipelines for large amounts of data efficiently.
• Must understand database concepts; knowledge of XML, JSON, and APIs.
• Demonstrated ability to lead projects and work groups; strong project management skills; proven ability to resolve problems independently and collaboratively.
• Must be able to work in a fast-paced environment with demonstrated ability to juggle and prioritize multiple competing tasks and demands.
• Ability to work independently, take initiative, and complete tasks to deadlines.

Special Skills/Abilities:
• Strong attention to detail and organizational skills.
• Strong project management skills.
• Strong understanding of end-to-end processes for data collection, extraction, and the analysis needs of end users.
• Strong ability to communicate with cross-functional stakeholders.
• Strong ability to develop technical specifications based on communication from stakeholders.
• Quick learner, comfortable asking questions and learning new technologies and systems.
• Good knowledge of office software (Microsoft Office).
• Experience creating custom functions in Python/R.
• Cloud computing (AWS, Snowflake, Databricks).
• Ability to visualize large datasets.
• R Shiny/Python app experience is a plus.

Preferable:
• Experience developing R Shiny and Python apps.
• Experience with Hadoop.
• Experience with Agile development methods.
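
As an illustration of the reusable ingestion-with-quality-checks work this posting describes, below is a minimal Python sketch. The source file, key column, and local "lake" directory are hypothetical placeholders; a production pipeline would instead target cloud storage and a warehouse such as Snowflake or Amazon Redshift, typically via Spark or a similar engine.

"""
Minimal sketch of a reusable ingestion step with basic data-quality checks.
All file names, column names, and paths are hypothetical placeholders.
Requires pandas plus pyarrow (or fastparquet) for Parquet output.
"""
import pandas as pd
from pathlib import Path


def quality_check(df: pd.DataFrame, key_column: str) -> None:
    """Fail fast on empty extracts or missing/duplicate keys before loading."""
    if df.empty:
        raise ValueError("extract returned no rows")
    if df[key_column].isna().any():
        raise ValueError(f"null values found in key column '{key_column}'")
    if df[key_column].duplicated().any():
        raise ValueError(f"duplicate keys found in column '{key_column}'")


def ingest(source_path: str, key_column: str, lake_dir: str, dataset: str) -> Path:
    """Extract a CSV or JSON source, validate it, and land it as Parquet."""
    if source_path.endswith(".json"):
        df = pd.read_json(source_path)   # heterogeneous source: JSON
    else:
        df = pd.read_csv(source_path)    # heterogeneous source: CSV
    quality_check(df, key_column)
    out = Path(lake_dir) / dataset
    out.mkdir(parents=True, exist_ok=True)
    target = out / "part-000.parquet"
    df.to_parquet(target, index=False)   # local stand-in for an S3/warehouse load
    return target


if __name__ == "__main__":
    # Hypothetical study-level extract; replace with a real source and key.
    path = ingest("labs_extract.csv", key_column="subject_id",
                  lake_dir="./data_lake", dataset="labs")
    print(f"landed {path}")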

Responsibilities

  • Salary: Rs. 90,000 - Rs. 1,65,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Engineer

Job Description

PySpark, EMR, AWS

Responsibilities

PySpark, EMR, AWS
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: PySpark, EMR, AWS

Job Description

.Net Full stack developer

Responsibilities

.Net Full stack developer
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: .Net Full stack developer

Job Description

AWS Cloud Platform, CI/CD

Responsibilities

AWS Cloud Platform, CI/CD
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: AWS Cloud Platform, CI/CD

Job Description

Key Responsibilities

Support & Operations
• Provide L1/L2 support and take ownership of L3 issues during scheduled terms.
• Manage environment-related incidents and service requests currently tracked via GitHub, with a transition planned to Unity ITSM.
• Build and maintain a Knowledge Base (KB) to streamline future support operations.
• Monitor and report daily production status using GitHub Actions dashboards.
• Collaborate in code reviews and small improvements within the MCO (Maintenance and Continuous Operations) scope.

Development & Automation
• Contribute to continuous improvements in environment setup and deployment processes.
• Handle new package installations or upgrades using Bash scripting.
• Manage and automate new updater workflows using Python scripting.
• Participate in code reviews (individually or in a pair-programming setup) to ensure quality and maintainability.

Documentation & Knowledge Sharing
• Create and maintain detailed technical documentation for setup, support, and troubleshooting procedures.
• Collaborate with other teams to ensure smooth handover and transition of environment management processes.

Profile

Required Skills
• Strong hands-on experience with Bash and Python scripting.
• Familiarity with GitHub and GitHub Actions for CI/CD pipelines and production checks.
• Experience with incident and request management tools (Unity ITSM or equivalent).
• Understanding of environment setup, package management, and deployment in production ecosystems.
• Ability to troubleshoot technical issues efficiently and propose automation or process improvements.
• Strong documentation and communication skills.

Preferred Skills
• Experience working in MCO or DevOps environments.
• Prior exposure to cloud-based deployment or CI/CD ecosystems.
• Familiarity with code review and collaborative development practices (e.g., pair programming).
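
As an illustration of the daily production-status reporting described above, below is a minimal Python sketch that summarises recent GitHub Actions workflow runs via the GitHub REST API. The repository name and token environment variable are hypothetical placeholders; the actual dashboards and repositories for this role are not specified in the posting.

"""
Minimal sketch: summarise recent GitHub Actions runs for a daily status report.
Repository and token environment variable are hypothetical placeholders.
"""
import os
from collections import Counter

import requests

GITHUB_API = "https://api.github.com"
REPO = os.environ.get("STATUS_REPO", "example-org/example-repo")  # placeholder
TOKEN = os.environ["GITHUB_TOKEN"]  # assumed to be set in the environment


def recent_runs(repo: str, per_page: int = 50) -> list[dict]:
    """Fetch the most recent workflow runs for a repository."""
    resp = requests.get(
        f"{GITHUB_API}/repos/{repo}/actions/runs",
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Accept": "application/vnd.github+json"},
        params={"per_page": per_page},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["workflow_runs"]


def daily_summary(runs: list[dict]) -> str:
    """Count run outcomes (success, failure, cancelled, in_progress, ...)."""
    counts = Counter(run.get("conclusion") or run["status"] for run in runs)
    return "\n".join(f"{outcome}: {count}" for outcome, count in counts.most_common())


if __name__ == "__main__":
    print(f"Workflow run summary for {REPO}:")
    print(daily_summary(recent_runs(REPO)))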

Responsibilities

  • Salary: Rs. 0 - Rs. 10,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Software Engineer - Python Developer

Job Description

Developer

Looking for a Golang developer.
Skills Required: Golang, AWS
Good to have: TIBCO knowledge
Relevant experience: 5 to 8 years

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Developer

Job Description

Job Title: Microservice Developer
Work Location: Bangalore, KA / Hyderabad, TG
Skill Required: Digital: Microservices
Experience Range: 8+ years

Role Description:
• Mandatory skillset: Java, Microservices, Spring Boot, MySQL, AWS services
• Description:
  o Hands-on experience with core Java, Spring, and REST services.
  o Able to create REST APIs and integrate with external systems and services.
  o Strong Java and microservices development experience.
  o Able to understand requirements, create stories, and estimate effort for them.
  o Experience with tools: Bitbucket, Jira, Confluence.
  o Writes clean, efficient, and well-documented code using Java and related technologies.
  o Knowledge of databases such as SQL and MySQL.
  o Basic knowledge of AWS services such as SQS, Lambda, and CloudWatch, and AWS databases such as DynamoDB.
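
The role itself is Java/Spring Boot based; purely as an illustration of the AWS SQS and DynamoDB integration it mentions, below is a minimal Python/boto3 sketch. The queue URL, table name, and payload fields are hypothetical placeholders, not details from the posting.

"""
Minimal sketch of publishing an event to SQS and persisting it in DynamoDB.
Queue URL, table name, and item fields are hypothetical placeholders; AWS
credentials and region are assumed to be configured in the environment.
"""
import json
import uuid

import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/orders-queue"  # placeholder
TABLE_NAME = "orders"  # placeholder

sqs = boto3.client("sqs")
dynamodb = boto3.resource("dynamodb")


def publish_event(payload: dict) -> str:
    """Send a JSON event to the SQS queue and return the message id."""
    response = sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(payload))
    return response["MessageId"]


def save_record(payload: dict) -> None:
    """Persist the same payload in a DynamoDB table keyed by a generated id."""
    table = dynamodb.Table(TABLE_NAME)
    table.put_item(Item={"id": str(uuid.uuid4()), **payload})


if __name__ == "__main__":
    event = {"orderId": "A-1001", "status": "CREATED"}  # hypothetical payload
    save_record(event)
    print("queued message:", publish_event(event))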

Responsibilities

  • Salary: Rs. 90,000 - Rs. 1,65,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Microservice Developer