We found 19 jobs matching your search


Job Description

JD : Looking for strong DB Reliability Engineering candidates (4-10 years of experience).
  • Must have strong skills in MySQL DBA, Linux OS, and automation tools (Chef/Ansible/shell scripting)
  • Hands-on experience with HA tools such as Tungsten/Keepalived
  • Knowledge of and experience with AWS SRE and configuration management are an added advantage
  • Knowledge of a NoSQL DB (Mongo) is an added advantage
  • Good communication skills
  • Ability to work independently without handholding
  • Open to learning new skills, with a good attitude
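The HA side of this role usually comes down to deciding, in a script, whether a replica is safe to promote or serve reads. As a hedged illustration only (the function name, field names, and thresholds below are assumptions modeled on the output of MySQL's `SHOW REPLICA STATUS`, not any specific tool's code):

```python
# Hypothetical sketch: classify MySQL replica health from the fields a DBA
# reads off `SHOW REPLICA STATUS`. Thresholds and the helper name are
# illustrative assumptions.

def replica_health(status: dict, max_lag_seconds: int = 30) -> str:
    """Return 'healthy', 'lagging', or 'broken' for a replica."""
    io_ok = status.get("Replica_IO_Running") == "Yes"
    sql_ok = status.get("Replica_SQL_Running") == "Yes"
    if not (io_ok and sql_ok):
        return "broken"      # replication threads stopped: exclude from HA pool
    lag = status.get("Seconds_Behind_Source")
    if lag is None or lag > max_lag_seconds:
        return "lagging"     # a Keepalived/Tungsten check would deprioritize this node
    return "healthy"

example = {
    "Replica_IO_Running": "Yes",
    "Replica_SQL_Running": "Yes",
    "Seconds_Behind_Source": 4,
}
```

A health check like this is what an Ansible playbook or Keepalived `vrrp_script` would invoke periodically.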

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : DB Reliability Engineer

Job Description

This role requires the individual to perform the following activities:
  1. Develop monitors that track the performance of the various Mural services deployed on-prem or in the cloud.
  2. Upgrade customer environments in the MS Azure and AWS clouds.
  3. Perform SOC and NOC upgrades.
  4. Write Terraform scripts for deployment.
Project outcome: the expected detailed outcomes of this project are:
  • Upgrades of existing customer environments.
  • Patching of customer environments.
  • Debugging and supporting customer issues in the cloud.
  • Automation of deployment and environment monitoring.
SW DevOps Engineering: this position involves Service Reliability Engineering and the deployment and support of the tools the company uses to deploy software and applications. The individual will work with senior members of the team to support live customer environments, perform upgrades in the cloud, and onboard new customer environments.
Tools: MS Azure, AWS, Python, Kubernetes, Terraform, Azure Pipelines, and a monitoring tool like Datadog/New Relic.
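The monitor-development duty described above typically reduces to evaluating a window of metric samples against thresholds and emitting a status that a tool like Datadog or New Relic can alert on. A minimal sketch, assuming hypothetical thresholds and a made-up helper name (this is not Mural's actual monitoring code):

```python
# Illustrative monitor logic: classify a service from recent latency samples.
# The warn/critical thresholds are assumptions for the example.
from statistics import mean

def evaluate_latency(samples_ms: list[float],
                     warn_ms: float = 250.0,
                     crit_ms: float = 500.0) -> str:
    """Return 'ok', 'warn', or 'critical' from a window of latency samples."""
    if not samples_ms:
        return "critical"        # no data usually means the service is down
    avg = mean(samples_ms)
    if avg >= crit_ms:
        return "critical"
    if avg >= warn_ms:
        return "warn"
    return "ok"
```

In practice the resulting status would be shipped to the monitoring backend as a custom metric or service check.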

Responsibilities

  • Salary : Rs. 90,00,000.0 - Rs. 1,90,00,000.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SRE SW DevOps

Job Description

Lead Software Engineer
Job Location: Bangalore
Responsibilities
  • Understand the product and be responsible for implementing it for new clients
  • Customize the product as required for any client
  • Work with cross-functional teams to ensure implementations are delivered on time
  • Identify technical risks and propose solutions for mitigating them
  • Mentor junior developers and help them progress and learn
  • Be involved with product planning and development to provide insights to product management and engineering teams, and with release sign-offs
  • Monitor new implementations and drive metrics up through constant improvement
Requirements
  • B.Tech / B.E. in CSE / IT Engineering, or equivalent, preferred
  • Experience in HL7/Mirth preferred
  • 5-7 years of experience in application development with Java and Spring or a similar framework
  • Basic understanding of ITIL (Service Management) concepts
  • Good understanding of support systems like JIRA/ServiceNow and workload allocation management
  • Expertise in SQL programming
  • Experience in AWS, Jenkins, and code repositories is a plus
  • Experience in Python is a plus
  • Exposure to API development is a plus
  • Willingness to work with US-based clients and teams across multiple time zones
  • Team player with the ability to mentor junior developers
  • Good communication skills

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Lead Software Engineer

Job Description

Responsibilities:
  • Design, develop, and deploy highly scalable and efficient data pipelines using containerization technologies (Docker/Kubernetes)
  • Leverage AWS services such as MSK (Managed Streaming for Kafka) and S3 (Simple Storage Service) to build real-time data pipelines
  • Implement and manage access controls using AWS IAM (Identity and Access Management) roles and policies
  • Work with Apache Kafka for high-throughput messaging and streaming data
  • Use Debezium for Change Data Capture (CDC) from relational databases
  • Automate infrastructure provisioning and management using Terraform (Infrastructure as Code)
  • Configure and maintain Jenkins for continuous integration and continuous delivery (CI/CD) of data pipelines
  • Collaborate with data scientists, analysts, and software engineers to ensure data pipelines meet business needs
  • Monitor and troubleshoot data pipelines to ensure data quality and availability
Qualifications:
  • 3-6 years of experience as a Data Engineer or in a related role
  • Strong understanding of data warehousing, data pipelines, and data modeling principles
  • Proficiency in scripting languages like Python, Bash, or similar
  • Experience with containerization technologies (Docker, Kubernetes) is a plus
  • Experience with AWS cloud services (MSK, S3, IAM) is a plus
  • Experience with Apache Kafka and Debezium is a plus
  • Experience with Terraform and Infrastructure as Code (IaC) principles is a plus
  • Experience with Jenkins and CI/CD practices is a plus
  • Excellent communication and collaboration skills
  • Ability to work independently and as part of a team
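A core step in the Debezium-based pipeline described above is unwrapping the CDC change event before writing it to a sink such as S3. A minimal sketch, assuming Debezium's standard envelope (`op`, `before`, `after`, `source`); the helper name and sample event are illustrative:

```python
# Flatten a Debezium change event into {op, table, row} for a downstream sink.
# op codes per Debezium: c=create, u=update, d=delete, r=snapshot read.
import json

def flatten_cdc_event(raw: str) -> dict:
    """Turn a Debezium envelope (as a JSON string) into a sink-ready record."""
    event = json.loads(raw)
    payload = event.get("payload", event)   # some connector configs unwrap the envelope
    op = payload["op"]
    row = payload["before"] if op == "d" else payload["after"]
    table = payload.get("source", {}).get("table", "unknown")
    return {"op": op, "table": table, "row": row}

# Hypothetical update event for an `orders` table.
sample = json.dumps({
    "payload": {
        "op": "u",
        "before": {"id": 7, "status": "new"},
        "after": {"id": 7, "status": "shipped"},
        "source": {"table": "orders"},
    }
})
```

In a real pipeline this logic would run in a Kafka consumer (or a Kafka Connect SMT) between MSK and the S3 sink.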

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : AWS Data Engineer

Job Description

Responsibilities:
  • Mandatory:
    ◦ Design, develop, and deploy highly scalable, secure, and reliable serverless applications using AWS services, specifically:
      ▪ API Gateway: create and manage APIs for various use cases
      ▪ AWS Lambda: develop and deploy serverless functions using Python or Go
      ▪ AWS Step Functions: design and implement workflows for orchestrating multiple Lambda functions and other AWS services
    ◦ Collaborate with cross-functional teams (product, design, DevOps) to understand requirements, translate them into technical solutions, and deliver features on deadline
    ◦ Write clean, maintainable, and well-documented code adhering to best practices
    ◦ Participate in code reviews and provide constructive feedback to improve code quality
    ◦ Continuously learn and stay up to date with the latest advancements in AWS services and technologies
    ◦ Use Git for version control and maintain a clean and efficient codebase
    ◦ Configure and manage CI/CD pipelines using Git Runner or other tools
    ◦ Monitor and troubleshoot application deployments and performance
    ◦ Identify and implement opportunities for optimization and cost reduction
    ◦ Have experience with Pulumi or Terraform for Infrastructure as Code (IaC)
Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience)
  • Minimum 6+ years of experience as a Software Engineer or in a similar role
  • Strong understanding of cloud computing concepts and principles
  • Proficiency in designing and developing serverless applications using AWS services, especially:
    ◦ API Gateway
    ◦ Python or Go for AWS Lambda functions
    ◦ AWS Step Functions
  • Experience with Git and CI/CD pipelines (e.g., Git Runner) is a plus
  • Excellent problem-solving, analytical, and communication skills
  • Ability to work independently and as part of a team
  • Strong commitment to quality and a passion for learning new technologies
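The API Gateway plus Lambda combination named above boils down, in Python, to a handler that accepts the proxy-integration event and returns the expected response shape. A minimal sketch with a hypothetical route and payload (the function body is illustrative; wiring it up would be done in Pulumi or Terraform as the posting notes):

```python
# Minimal AWS Lambda handler for an API Gateway proxy integration.
# The response must carry statusCode, headers, and a string body.
import json

def lambda_handler(event, context):
    """Greet the caller named in the ?name= query string parameter."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

In a Step Functions workflow, a function like this would appear as one Task state, with the state machine passing its output to the next Lambda.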

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : AWS Lambda using Python

Job Description

Profile Summary: The Sr. Data Engineer works on multiple data engineering projects to support the use cases and identify potential process or data quality issues, taking the data requirements and developing, deploying, and testing the logic to create the necessary datamarts. The team is also responsible for maintenance of and enhancements to the datamarts.
Essential Job Functions
  1. Develops and unit-tests Informatica jobs; expert in SQL and scripts
  2. Source schema changes, break-fixes, view changes
  3. Conversions, regression testing
  4. Provides L2 DevOps support
  5. Runs historical data loads
  6. Conducts peer reviews to ensure accuracy and standardization
  7. Conducts impact analysis for changes made to the datamarts
  8. Creates test cases
  9. Conducts analysis and recommends performance improvement measures to enhance the datamarts
  10. Coordinates and works across functions in the enterprise for project intake and initial solutioning before handing off to the assigned resources
Must Have
  • Strong SQL skills
  • Understanding of dimensional datamarts and data warehousing
  • Proficiency in Linux/Unix (RHEL)
  • Informatica PowerCenter
Nice to Have
  • Job scheduling skillset
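The "dimensional datamart" requirement above refers to the classic star-schema pattern: fact tables of measures joined to descriptive dimension tables. A hedged conceptual sketch in SQLite (the tables and data are made up for illustration; the real work here targets Informatica PowerCenter mappings, not Python):

```python
# Tiny star schema: one dimension, one fact table, and the typical
# datamart query that aggregates facts by a dimension attribute.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER, qty INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 3, 30.0), (1, 2, 20.0), (2, 1, 15.0);
""")

rows = conn.execute("""
    SELECT d.name, SUM(f.qty) AS units, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
```

The same fact-to-dimension join shape underlies impact analysis (item 7 above): a schema change to `dim_product` ripples into every query that joins through `product_key`.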

Responsibilities

  • Salary : Rs. 2,50,000.0 - Rs. 3,50,000.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Sr. Data Engineer

Job Description

The Data Engineer 1 works on different data engineering projects to support the use cases and identify potential process or data quality issues, taking the data requirements and developing, deploying, and testing the logic to create the necessary datamarts. The team is also responsible for maintenance of and enhancements to the datamarts.
Essential Job Functions
  1. Develops and unit-tests Informatica jobs; expert in SQL and scripts
  2. Source schema changes, break-fixes, view changes
  3. Conversions, regression testing
  4. Provides L2 DevOps support
  5. Runs historical data loads
Must Have
  • Strong SQL skills
  • Proficiency in Linux/Unix (RHEL)
  • Informatica PowerCenter
Nice to Have
  • Job scheduling skillset
  • Understanding of dimensional datamarts and data warehousing

Responsibilities

  • Salary : Rs. 2,50,000.0 - Rs. 3,50,000.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Jr. Data Engineer

Job Description

Requisition Title (Position Name/Designation): Senior DevOps Engineer
Mandatory Skills:
  • 8+ years' experience in designing and building automation pipelines for continuous integration (CI) and continuous delivery (CD) using GitHub Workflows/Actions
  • Experience in migration and management of code repositories in GitHub
  • Experience using Docker with Kubernetes to build and orchestrate applications
  • Experience in multiple scripting languages, e.g., Bash and PowerShell
  • Experience in log management and monitoring tools, e.g., Datadog, CloudWatch, CloudTrail, and Azure Log Analytics workspaces
  • Experience in creating scripts using ARM Templates, Terraform, or CloudFormation for IaC
  • Experience in Azure/AWS design and administration, e.g., Kubernetes (AKS), Application Gateway, Key Vault, IAM, S3, KMS, CloudFront, ACM PCA, Lambda, SNS
  • Experience in GitOps tools, e.g., Helm and Flux
  • Experience with Agile software development processes and practices
  • Experience working in a Linux environment
  • Quick learner, result-oriented, with strong analytical/problem-solving skills
  • Excellent organizational, written, and verbal communication skills
Roles and Responsibilities:
  • Manage the release code in GitHub by creating tags, and perform deployments using CD workflows
  • Review the requirements received from the client and team members' tasks
  • Monitor the application logs and Kubernetes logs in Datadog and support the relevant development/test teams in their further analysis
  • Understand the existing role design in Azure/AWS and propose best-practice guidelines
  • Perform security guideline checks on the Azure/AWS infrastructure configuration
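The release-tagging duty listed above is often scripted: compute the next tag from the latest existing one, then let the CD workflow deploy it. An illustrative helper, assuming a `vMAJOR.MINOR.PATCH` tag scheme (the function name and scheme are assumptions; a real pipeline would run this inside a GitHub Actions workflow before `git tag`):

```python
# Bump a semver-style release tag like 'v1.4.2'. The `part` argument selects
# which component to increment; lower components reset to zero.

def next_tag(current: str, part: str = "patch") -> str:
    """Return the next tag after `current` for the given bump type."""
    major, minor, patch = (int(x) for x in current.lstrip("v").split("."))
    if part == "major":
        major, minor, patch = major + 1, 0, 0
    elif part == "minor":
        minor, patch = minor + 1, 0
    else:
        patch += 1
    return f"v{major}.{minor}.{patch}"
```

The workflow would then push the computed tag, which in turn triggers the CD deployment job.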

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Senior DevOps Engineer

Job Description

Amazon Web Services (AWS) (14212045)

Responsibilities

  • Salary : Rs. 0.0 - Rs. 11,00,000.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Operations Engineer