We found 125 jobs matching your search

Job Description

Tech Consultant

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Tech Consultant

Job Description

Azure - AKS

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Azure - AKS

Job Description

SAP BODS Developer
  • Essential skills: Data Concepts, Data Modelling
  • Experience: strictly no less than 4+ years
  • Notice period: immediate joiners are the first priority (candidates on bench or serving a notice of up to 15 days can be considered)
  • Rate flexibility: since the role is not in the niche or ultra-niche category, the rate can go at most 10k-15k higher than the Beeline rate

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Concepts & Data Modelling

Job Description

Sr. Data Engineer. We are seeking a highly skilled and experienced Cloud Data Engineer to join the Optum Advisory Team. The ideal candidate will bring deep expertise in cloud-based data engineering, with a strong background in Azure, Databricks, Azure Data Factory (ADF), and Snowflake. This role requires a self-starter who can work independently with minimal guidance and collaborate effectively with cross-functional teams to deliver high-impact data solutions in the healthcare domain.
  • 5+ years of experience in the IT industry on the Azure tech stack
  • 5 years of development experience with Snowflake, Databricks, Python, SQL, and PySpark
  • Proficient in writing SQL queries, including window functions
  • Good communication skills and analytical problem-solving abilities
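The window-function proficiency this posting asks for can be illustrated with a small, hypothetical example. The table, columns, and data below are invented for illustration (the posting names no schema), and SQLite stands in for Snowflake, which supports the same `OVER (PARTITION BY ... ORDER BY ...)` syntax:

```python
import sqlite3

# In-memory database with a made-up claims table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (member_id TEXT, claim_date TEXT, amount REAL);
INSERT INTO claims VALUES
  ('M1', '2024-01-05', 120.0),
  ('M1', '2024-02-10', 300.0),
  ('M2', '2024-01-20', 80.0);
""")

# A window function: running total of claim amounts per member, ordered
# by date. PARTITION BY restarts the running sum for each member.
rows = conn.execute("""
SELECT member_id,
       claim_date,
       SUM(amount) OVER (PARTITION BY member_id
                         ORDER BY claim_date) AS running_total
FROM claims
ORDER BY member_id, claim_date
""").fetchall()

for row in rows:
    print(row)
```

Unlike a `GROUP BY`, the window version keeps one output row per input row, which is why `M1` appears twice with a growing running total.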

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Snowflake, Databricks, Python, SQL, PySpark

Job Description

Locations and number of positions:
  • Mumbai: 10-15
  • Bangalore: 10-15
  • Pune: 5-10
  • Kolkata: 5-10
  • Gurgaon: 10-15
  • Chennai: 5-10

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Internal Audit

Job Description

  • Job Title: Developer
  • Work Location: Mumbai, MH / Pune, MH / Hyderabad, TG / Chennai, TN / Bangalore, KA
  • Duration: 6 months (extendable)
  • Skills Required: Data Warehouse, PL/SQL
  • Experience Range in Required Skills: 4-6 years
  • Job Description: ETL testing, SQL, data warehousing
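The ETL-testing skill named above can be sketched as a minimal source-to-target reconciliation check, a standard post-load test in data warehousing. Everything here is hypothetical (table names and data are invented), and SQLite stands in for the warehouse:

```python
import sqlite3

# Toy source and target tables; in practice these would live in
# different systems and the target would be populated by the ETL job.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL);
CREATE TABLE tgt_orders (id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 40.0);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 40.0);
""")

def row_count(table):
    # Row-count reconciliation: the most basic post-load ETL test.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def amount_total(table):
    # A sum check guards against rows loaded with corrupted measures.
    return conn.execute(f"SELECT SUM(amount) FROM {table}").fetchone()[0]

src_count, tgt_count = row_count("src_orders"), row_count("tgt_orders")
src_sum, tgt_sum = amount_total("src_orders"), amount_total("tgt_orders")

assert src_count == tgt_count, "row counts diverge between source and target"
assert abs(src_sum - tgt_sum) < 1e-9, "amount totals diverge"
print("reconciliation passed:", src_count, "rows")
```

Real ETL test suites extend the same idea with column-level checksums, null-rate checks, and sampled row-by-row comparisons.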

Responsibilities

  • Salary: Rs. 70,000 - Rs. 1,10,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Developer

Job Description

Experience Range in Required Skills: 8-10 years. Job Description / Essential Skills: Frontend - React, Next.js, TypeScript, JavaScript, code testing, DevOps, HTML semantics, styling, SEO, accessibility, performance optimisation, micro-frontends.

Responsibilities

  • Salary: Rs. 0.0 - Rs. 18.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: React / Next.js

Job Description

  • Experience in designing and implementing data solutions on the Databricks platform
  • Proficiency in programming languages such as Python, Scala, or SQL
  • Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark
  • Experience with cloud platforms such as AWS, Azure, or GCP, and their associated data services
  • Proven track record of delivering scalable and reliable data solutions in a fast-paced environment
  • Excellent problem-solving skills and attention to detail
  • Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams
  • Good to have: experience with containerization technologies such as Docker and Kubernetes
  • Knowledge of DevOps practices for automated deployment and monitoring of data pipelines
  • Design and develop data processing pipelines and analytics solutions using Databricks
  • Architect scalable and efficient data models and storage solutions on the Databricks platform
  • Collaborate with architects and other teams to migrate the current solution to Databricks
  • Optimize performance and reliability of Databricks clusters and jobs to meet SLAs and business requirements
  • Apply best practices for data governance, security, and compliance on the Databricks platform
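The core of the Spark-style pipeline work this posting describes is a partition-and-aggregate step. A minimal sketch in plain Python follows; the device/reading records are invented for illustration, and in PySpark the same step would be roughly `df.groupBy("device").avg("reading")` on a Spark DataFrame:

```python
from collections import defaultdict

# Hypothetical event records; in a real Databricks pipeline these would
# be rows of a distributed Spark DataFrame.
events = [
    {"device": "a", "reading": 2.0},
    {"device": "a", "reading": 4.0},
    {"device": "b", "reading": 10.0},
]

# Partition by key, then aggregate each partition: running sums and
# counts per device, combined into an average at the end.
totals = defaultdict(float)
counts = defaultdict(int)
for event in events:
    totals[event["device"]] += event["reading"]
    counts[event["device"]] += 1

averages = {device: totals[device] / counts[device] for device in totals}
print(averages)  # {'a': 3.0, 'b': 10.0}
```

Spark distributes exactly this shape of computation: partial sums and counts are computed per partition and merged across the cluster, which is why averages are expressed as (sum, count) pairs rather than aggregated row by row on one machine.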

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Flexera | Databricks, PySpark, SQL, Spark, AWS

Job Description

This is a Majesco product used by the Aetna Voluntary business. As part of this project, we have different modules such as PAS, GB, and DM. We have a requirement for resources with a skill set of Java, SQL, and Majesco internal tools. Majesco PRASE (Product Rules and Scripting Engine) is a proprietary language and engine used for configuring and customizing Majesco's insurance software, particularly its L&A (Life & Annuity) and P&C (Property & Casualty) core products. It provides insurers with the flexibility to define and manage complex business rules.

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Majesco PRASE