We found 207 jobs matching your search

Job Description

As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and resolves issues within various components of critical business systems. Your typical day will involve collaborating with team members to troubleshoot software problems, analyzing system performance, and ensuring the smooth operation of applications that are vital to business functions. You will engage with stakeholders to understand their needs and provide timely solutions, contributing to the overall efficiency and effectiveness of the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the development and implementation of best practices for application support.
- Document and maintain records of issues and resolutions to enhance knowledge sharing within the team.

Professional & Technical Skills:
- Must-have skills: proficiency in Node.js.
- Strong understanding of application support processes and methodologies.
- Experience with debugging and troubleshooting software applications.
- Familiarity with database management and query optimization.
- Knowledge of version control systems and deployment processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Node.js.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

  • Salary: Rs. 0.0 - Rs. 1,45,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Application Support Engineer

Job Description

Job Title: Engineer
Work Location: Chennai, TN / Kochi, KL / Bangalore, KA / Gurgaon, HA / Noida, UP / Bhubaneswar, OD / Kolkata, WB / Hyderabad, TG / Pune, MH
Skills Required: Digital: Microsoft Azure, Digital: Databricks, Digital: PySpark
Experience Range: 6-8 years

Job Description:
- Lead large-scale, complex, cross-functional projects to build the technical roadmap for the WFM Data Services platform.
- Lead and review design artifacts.
- Build and own the automation and monitoring frameworks that present reliable, accurate, easy-to-understand metrics and operational KPIs to stakeholders for data pipeline quality.
- Execute proofs of concept on new technologies and tools to pick the best tools and solutions.
- Support business objectives by collaborating with business partners to identify opportunities and drive resolution.
- Communicate status and issues to senior Starbucks leadership and stakeholders.
- Direct the project team and cross-functional teams on all technical aspects of the projects.
- Lead the engineering team to build and support real-time, highly available data, data pipeline, and technology capabilities.
- Translate strategic requirements into business requirements to ensure solutions meet business needs.
- Define and implement data retention policies and procedures.
- Define and implement data governance policies and procedures.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Enable the team to pursue insights and applied breakthroughs, while also driving solutions to Starbucks scale.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of structured and unstructured data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Perform root cause analysis to identify permanent resolutions to software or business process issues.
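For context on the kind of pipeline work described above, here is a minimal PySpark sketch of a batch extract-transform-load step with a simple quality metric of the sort a monitoring framework might emit. The storage paths, column names, and app name are illustrative assumptions, not details from the posting.

```python
# Minimal, illustrative PySpark ETL step: read raw events, apply basic
# quality checks, and write a curated Delta table. All names and paths
# are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("wfm-events-etl").getOrCreate()

# Extract: raw, semi-structured source data (hypothetical ADLS path).
raw = spark.read.json("abfss://raw@example.dfs.core.windows.net/wfm/events/")

# Transform: keep well-formed rows and derive a partition column.
curated = (
    raw
    .filter(F.col("event_id").isNotNull() & F.col("event_ts").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .dropDuplicates(["event_id"])
)

# Simple pipeline-quality metric of the kind a monitoring framework could report.
total, kept = raw.count(), curated.count()
print(f"rows_in={total} rows_out={kept} drop_ratio={(total - kept) / max(total, 1):.4f}")

# Load: write partitioned output in Delta format (available on Databricks).
(
    curated.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("abfss://curated@example.dfs.core.windows.net/wfm/events/")
)
```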

  • Salary: Rs. 55,000.0 - Rs. 95,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Engineer

Job Description

Skills Required: Digital: Microsoft Azure, Digital: Databricks, Digital: PySpark
Experience Range: 6-8 years

Job Description:
- Lead large-scale, complex, cross-functional projects to build the technical roadmap for the WFM Data Services platform.
- Lead and review design artifacts.
- Build and own the automation and monitoring frameworks that present reliable, accurate, easy-to-understand metrics and operational KPIs to stakeholders for data pipeline quality.
- Execute proofs of concept on new technologies and tools to pick the best tools and solutions.
- Support business objectives by collaborating with business partners to identify opportunities and drive resolution.
- Communicate status and issues to senior Starbucks leadership and stakeholders.
- Direct the project team and cross-functional teams on all technical aspects of the projects.
- Lead the engineering team to build and support real-time, highly available data, data pipeline, and technology capabilities.
- Translate strategic requirements into business requirements to ensure solutions meet business needs.
- Define and implement data retention policies and procedures.
- Define and implement data governance policies and procedures.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Enable the team to pursue insights and applied breakthroughs, while also driving solutions to Starbucks scale.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of structured and unstructured data sources using big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Digital: Microsoft Azure / Databricks / PySpark (Experience Range: 6-8 years)

Job Description

Job Title: Engineer
Work Location: Chennai, TN
Skills Required: Data Warehouse BI Testing, PL/SQL
Experience Range Required: 4 to 6 years

Job Description:
• Strong experience in ETL testing and data warehouse concepts (4-6 years).
• Hands-on expertise in Databricks (including notebooks, clusters, and jobs).
• Proficiency in SQL for data validation and analysis.
• Experience with big data technologies (Spark, PySpark, Delta Lake).
• Familiarity with cloud platforms (Azure, AWS, or GCP) and their data services.
• Knowledge of data modeling, data quality frameworks, and BI tools.
• Strong analytical and problem-solving skills.
• Excellent communication and documentation abilities.
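As an illustration of the ETL-testing and data-validation skills listed above, the following is a minimal PySpark sketch of a source-versus-target reconciliation check of the kind run against a warehouse load. The table names, key column, and tolerance are hypothetical assumptions, not details from the posting.

```python
# Illustrative ETL test: reconcile a staging table against the curated target.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dw-reconciliation-check").getOrCreate()

source = spark.table("staging.orders")   # hypothetical source table
target = spark.table("curated.orders")   # hypothetical target table

# 1. Row-count comparison.
src_count, tgt_count = source.count(), target.count()
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# 2. Key completeness: every source key must appear in the target.
missing = source.select("order_id").subtract(target.select("order_id"))
missing_count = missing.count()
assert missing_count == 0, f"{missing_count} source keys missing from target"

# 3. Simple measure check: totals should agree within rounding tolerance.
src_total = source.agg({"amount": "sum"}).collect()[0][0] or 0.0
tgt_total = target.agg({"amount": "sum"}).collect()[0][0] or 0.0
assert abs(src_total - tgt_total) < 0.01, "Aggregate amount mismatch"

print("Reconciliation checks passed")
```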

  • Salary: Rs. 70,000.0 - Rs. 1,10,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Engineer

Job Description

Data warehouse testing

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: 544779 - IQE - Data warehouse testing - Mysore/Bangalore/Hyderabad

Job Description

TDM (Test Data Management), SQL

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: 551826 - IQE - TDM, SQL

Job Description

UFT automation testing

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: 551779 - IQE - UFT automation testing

Job Description

As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications are aligned with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: proficiency in Google Cloud DevOps Services.
- Strong understanding of cloud infrastructure and services.
- Experience with continuous integration and continuous deployment practices.
- Familiarity with containerization technologies such as Docker and Kubernetes.
- Knowledge of scripting languages for automation tasks.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud DevOps Services.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

  • Salary: Rs. 0.0 - Rs. 1,50,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Application Developer

Job Description

Req ID: 10298405
Job Title: Developer
Work Location: Chennai, TN
Skill Required: Digital: Amazon Web Services (AWS) Cloud Computing
Experience Range: 4-6 years

Job Description:
1. Should have good work experience with Confluent Kafka stream processing.
2. Must have used ksqlDB and Flink in Confluent Kafka for working solutions.
3. Must have knowledge and experience of cloud-native Confluent Kafka deployment.
4. Must have knowledge of handling saturation issues, error messages, and dead letter queues.
5. Good to have Python programming skills.
6. Good to have knowledge of Git.
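To illustrate the dead-letter-queue handling mentioned in point 4, here is a minimal sketch using the confluent-kafka Python client. The broker address, topic names, and validation rule are hypothetical assumptions, not details from the posting.

```python
# Minimal sketch: consume from a topic and route messages that fail
# processing to a dead letter queue topic. Broker address, topic names,
# and the validation logic are hypothetical placeholders.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "orders-app",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["orders"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        try:
            # Hypothetical processing step: messages must be JSON with an "id".
            payload = json.loads(msg.value())
            if "id" not in payload:
                raise ValueError("missing id field")
            # ... normal processing would happen here ...
        except Exception as exc:
            # Forward the bad message to the DLQ topic, recording the error
            # as a message header for later triage.
            producer.produce("orders.dlq", value=msg.value(),
                             headers=[("error", str(exc).encode())])
            producer.flush()
finally:
    consumer.close()
```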

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Amazon Web Services, Cloud Computing