We found 913 jobs matching your search

Job Description

QA Engineer – Manual & Automation

Position Summary
We are seeking a Test Automation Engineer with strong experience in Robot Framework (Python), strong exposure to Playwright, and a solid grounding in manual testing, including test case design, execution, and defect management. The candidate should be able to drive test strategy for automation and UAT testing, be comfortable working in an Agile environment, and possess good analytical and problem-solving skills. Experience with GitHub and basic SQL is also preferred.

Key Responsibilities

Test Automation
- Create, implement, and manage automation scripts with Python and the Robot Framework.
- Build reusable libraries, maintain automation frameworks, and optimize test suites.
- Integrate automation into CI/CD pipelines.
- Debug test failures and identify root causes.
- Work with Playwright for browser automation.
- Drive test strategies and roadmap.
- Coordinate with different stakeholders.
- Mentor the team on test guidelines and technologies.

Manual Testing
- Convert business requirements into detailed test scenarios and test cases.
- Perform UAT, regression, and integration testing.
- Execute tests and log defects in tools like Jira.
- Manage the full defect lifecycle and perform impact/regression analysis.

Agile Collaboration
- Participate in daily stand-ups, sprint planning, refinement, and retrospectives.
- Work closely with developers, product owners, and business analysts.
- Provide test estimates and contribute to process improvements.

Required Skills & Experience
- 4+ years of experience in QA with strong automation exposure.
- Expertise with Robot Framework.
- Strong scripting skills in Python.
- Knowledge of Playwright.
- Good understanding of GitHub.
- SQL knowledge.
- Experience with UI and API automation.

Good to Have
- Experience with CI/CD tools.
- Exposure to API testing tools.
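The listing asks for Robot Framework experience backed by Python scripting. Robot Framework keyword libraries are ordinary Python classes whose methods become keywords; the sketch below is a minimal illustration under that model (the class, keyword names, and login logic are invented for the example, not taken from the posting).

```python
# Minimal sketch of a Robot Framework keyword library in Python.
# Class name, keywords, and the login logic are illustrative only.

class LoginKeywords:
    """Each public method becomes a Robot Framework keyword."""

    ROBOT_LIBRARY_SCOPE = "SUITE"  # one instance shared across the suite

    def __init__(self):
        self._logged_in_user = None

    def log_in_as(self, username, password):
        """Usable as `Log In As    <user>    <password>` in a .robot file."""
        if not username or not password:
            raise AssertionError("username and password are required")
        self._logged_in_user = username

    def user_should_be_logged_in(self, expected):
        """Usable as `User Should Be Logged In    <user>`; fails the test
        by raising AssertionError, Robot Framework's failure convention."""
        if self._logged_in_user != expected:
            raise AssertionError(
                f"expected {expected!r}, got {self._logged_in_user!r}"
            )
```

In a suite file this would be loaded with `Library    LoginKeywords.py`, after which the two keywords are available to test cases; a real library for this role would wrap Playwright calls rather than in-memory state.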

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : QA Engineer – Manual & Automation

Job Description

Results‑driven Data Scientist with 5 years of experience building predictive models, performing end‑to‑end data analysis, and deploying data‑driven solutions that improve business outcomes. Skilled in machine learning, statistical modelling, data visualization, and cloud‑based analytics. Adept at translating complex data insights into actionable business strategies and collaborating with cross‑functional teams.

Core Responsibilities
1. Data Analysis & Problem Solving
  • Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies.
  • Use statistical methods to validate hypotheses and support business decision-making.
2. Machine Learning & Modeling
  • Build, train, and optimize supervised/unsupervised ML models (regression, classification, clustering, NLP).
  • Implement feature engineering, model evaluation, and hyperparameter tuning.
  • Deploy models into production using MLOps tools.
3. Data Engineering Collaboration
  • Work with data engineers to design and maintain scalable data pipelines.
  • Handle large datasets from multiple sources (SQL, APIs, cloud storage).
4. Visualization & Business Communication
  • Create dashboards and reports using tools like Power BI, Tableau, or Python (Matplotlib/Seaborn).
  • Present insights to stakeholders and leadership for decision-making.
5. Cloud & Big Data
  • Utilize cloud platforms like Azure, AWS, or GCP for model training and data storage.
  • Work with tools like Spark, Databricks, or Hadoop for large-scale data processing.

Technical Skills
  • Programming: Python (NumPy, Pandas, scikit-learn, statsmodels, TensorFlow/PyTorch), SQL (T-SQL, MySQL, PostgreSQL), R (optional)
  • Machine Learning: predictive modeling, deep learning (optional), Natural Language Processing (NLP), time-series forecasting
  • Visualization: Power BI, Tableau, Matplotlib, Seaborn, Plotly
  • Cloud & MLOps: Azure Machine Learning / AWS SageMaker, Git, Docker, CI/CD pipelines
  • Databases & Big Data: Azure Data Lake / AWS S3, Spark / Databricks
  • Soft Skills: analytical thinking, problem-solving, stakeholder communication, business understanding, team collaboration
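The EDA duty above mentions identifying anomalies. One conventional first pass is a z-score screen; the sketch below uses only the standard library, and the 3-sigma default threshold is a common convention rather than anything this listing specifies.

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Return points lying more than `threshold` sample standard
    deviations from the mean. A simple EDA screen; the threshold
    is a conventional default, not taken from the listing."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []  # constant series: nothing stands out
    return [x for x in values if abs(x - mean) / stdev > threshold]
```

Note the screen's limitation: a single large outlier inflates the standard deviation, so with small samples a looser threshold (or a robust statistic such as the median absolute deviation) is often needed to flag it.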

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Data Scientist - Anand Gopal

Job Description

Sourcing

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Sourcing

Job Description

Job Summary / Roles and Responsibilities
  • Collect, clean, and validate data from multiple sources to ensure accuracy and reliability
  • Develop ETL pipelines to process data from multiple sources such as CSV, flat files, and live databases
  • Build, maintain, and optimize SQL queries, stored procedures, and data pipelines
  • Develop interactive dashboards and visualizations in Tableau to support business operations and leadership reporting
  • Use Python for data manipulation, automation, statistical analysis, and exploratory data analysis
  • Collaborate with business stakeholders to understand requirements and translate them into analytical solutions
  • Perform trend analysis, forecasting, and KPI reporting
  • Support data governance, documentation, and metadata management
  • Troubleshoot data issues and identify opportunities for process improvement
  • Work with cross-functional teams such as IT, engineering, operations, and finance
  • Stay up to date with emerging technologies and trends in the data analytics space and recommend innovative solutions to improve data efficiency and quality
  • Manage changes and refresh the data for all dashboards
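The ETL duty described above (CSV and flat files into a live database) can be sketched end to end with the standard library alone. The table, columns, and sample rows below are invented for illustration; a real pipeline would read from files and a production database rather than in-memory stand-ins.

```python
import csv
import io
import sqlite3

# Extract: parse a CSV source (an in-memory stand-in for a flat file).
raw_csv = io.StringIO(
    "region,sales\n"
    "north, 100\n"
    "south,\n"      # missing value: dropped during validation below
    "north,250\n"
)
rows = list(csv.DictReader(raw_csv))

# Transform: strip whitespace, coerce types, drop invalid rows.
clean = [
    (r["region"].strip(), int(r["sales"]))
    for r in rows
    if r["sales"].strip()
]

# Load into a live database (in-memory SQLite here for the sketch).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)", clean)

# A simple aggregate of the kind a downstream dashboard would consume.
totals = dict(
    con.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)
```

The validation step is where the "clean and validate" responsibility lives: the row with a missing `sales` value never reaches the database, so downstream KPI queries stay reliable.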

Responsibilities

Required Skills
  • Strong proficiency in Tableau (dashboard creation, calculations, data blending)
  • Hands-on experience with Tableau Prep and advanced Tableau development skills required
  • Proficiency in Python for data analysis (pandas, NumPy, Matplotlib, etc.)
  • Proficiency in one of the databases like Oracle, MySQL, MS SQL Server, Teradata, etc.
  • Hands-on experience with SQL (joins, window functions, performance optimization)
  • Ability to query and display large data sets while maximizing the performance of the workbook
  • A solid understanding of SQL, relational database management systems, data modeling, and normalization
  • High-level competency in Excel (macros, pivot tables, etc.)
  • Experience using Agile methodology to perform software development
  • Knowledge of Google Cloud or any other cloud experience is a plus
  • Knowledge of Google Looker Studio is a plus
  • Knowledge of DataIQ or any other ETL tool is a plus
  • Knowledge of UI/UX – Figma tools is an added advantage
  • Knowledge of any batch scheduler like Airflow / Control-M / UC4 / Approx./ Autosys is a great advantage
  • Manufacturing domain experience is a great value add, but not mandatory

Soft Skills
  • Excellent problem-solving and analytical skills
  • Strong communication and collaboration skills
  • Ability to work independently and as part of a global team
  • Self-motivated and able to work in a fast-paced environment
  • Detail-oriented and committed to delivering high-quality work
  • Display one-team behavior while thinking about end-to-end solutioning
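The SQL requirements above call out window functions specifically. A minimal example, runnable against SQLite (3.25+, which supports window functions): rank each customer's orders by amount within a per-customer partition. The schema and data are invented for the illustration.

```python
import sqlite3

# Illustrative schema and rows; any of the databases named in the
# listing (Oracle, MySQL 8+, MS SQL Server, Teradata) accept the
# same RANK() OVER (PARTITION BY ...) syntax.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount INTEGER)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("a", 50), ("a", 70), ("b", 30), ("b", 90), ("b", 20)],
)

# Rank each customer's orders by amount, largest first.
ranked = con.execute(
    """
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
    """
).fetchall()
```

Unlike a `GROUP BY` aggregate, the window function keeps every row while attaching the per-partition rank, which is exactly what per-customer "top N" dashboard views need.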
  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Business Associate

Job Description

INFYSYJP00005009 563084- Adobe AEM- India- DX

  • Salary : Rs. 10,00,000 - Rs. 25,00,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : INFYSYJP00005009 563084- Adobe AEM- India- DX

Job Description

Job Summary:
We are seeking a skilled Data Engineer with strong experience in SQL, Python, Tableau, and ETL tools to design, build, and maintain reliable data pipelines and analytics solutions. This role focuses on ensuring data quality, enabling scalable data workflows, and supporting business intelligence and reporting needs.

Roles and Responsibilities
  • Collect, clean, and validate data from multiple sources to ensure accuracy and reliability
  • Develop ETL pipelines to process data from multiple sources such as CSV, flat files, and live databases
  • Build, maintain, and optimize SQL queries, stored procedures, and data pipelines
  • Use Python for data manipulation, automation, statistical analysis, and exploratory data analysis
  • Collaborate with cross-functional teams (data analysts, product teams, business stakeholders) to understand data requirements
  • Perform trend analysis, forecasting, and KPI reporting
  • Support data governance, documentation, and metadata management
  • Troubleshoot data issues and identify opportunities for process improvement
  • Work with cross-functional teams such as IT, engineering, operations, and finance
  • Stay up to date with emerging technologies and trends in the data analytics space and recommend innovative solutions to improve data efficiency and quality
  • Manage changes and refresh the data for all dashboards

Responsibilities

Required Skills
What you need to bring:
  • Bachelor's/Master's in Engineering, Computer Science, or equivalent experience
  • 3 to 5 years of experience in the IT industry; experience in the data space is a must

Technical Skills (experience level: 3 to 5 years)
  • SQL & Database Management: Expertise in querying, transforming, and optimizing data; solid experience with relational databases (PostgreSQL, MySQL, Oracle) and NoSQL databases (MongoDB, Cassandra); solid experience in SQL.
  • Programming: Strong Python programming for automation and pipeline development; Scala is good to have.
  • ETL/ELT Frameworks: Building pipelines to extract, transform, and load data using leading industry ETL tools such as DataIQ, Informatica, DataStage, or Alteryx.
  • Data Processing Frameworks: Solid experience processing large-scale data with Apache Spark or another data processing framework; should have worked on at least one large distributed-system project.
  • Data Modeling: Designing efficient schemas and understanding normalization/denormalization to ensure fast data retrieval; good experience creating logical and physical data models.
  • Version Control: Proficiency in Git is mandatory for managing code and collaborating on pipelines.
  • Cloud Platforms: Expertise in at least one cloud platform (GCP, AWS, or Azure) and its data services; GCP experience is good to have.
  • Orchestration: Automating and scheduling complex workflows using tools like Apache Airflow, Prefect, or Dagster.
  • Data Warehousing: Knowledge of modern cloud-native warehouses like Snowflake, Google BigQuery, Teradata, or Amazon Redshift.
  • Real-time Processing: Knowledge of handling data streams as they arrive using Kafka, Flink, or Spark Streaming.
  • Data Governance & Security: Implementing encryption and access controls, and ensuring compliance with regulations like GDPR or HIPAA.
  • AI/ML Integration: Building infrastructure and "feature pipelines" to support machine learning models.

Good to Have Technical Skills
  • Knowledge of or experience using Agile methodology to perform software development
  • Knowledge of ITIL industry best practices
  • Knowledge of Google Looker Studio is a plus
  • Knowledge of any BI tool is a plus, preferably Tableau
  • Knowledge of Pulse and Tableau Prep is an added advantage
  • Knowledge of UI/UX – Figma tools is an added advantage
  • Manufacturing domain experience is a great value add, but not mandatory

Soft Skills
  • Excellent problem-solving and analytical skills
  • Strong communication and collaboration skills
  • Ability to work independently and as part of a global team
  • Self-motivated and able to work in a fast-paced environment
  • Detail-oriented and committed to delivering high-quality work
  • Display one-team behavior while thinking about end-to-end solutioning
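The orchestration requirement above (Airflow, Prefect, Dagster) boils down to running tasks in dependency order. The core idea can be sketched with the standard library's topological sorter; the task names and the `run` stub are invented for the illustration, and real orchestrators add scheduling, retries, and parallelism on top of this ordering.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each key depends on the tasks in its value set, mirroring how an
# Airflow-style DAG declares upstream dependencies.
dag = {
    "extract": set(),
    "transform": {"extract"},   # transform runs after extract
    "load": {"transform"},
    "report": {"load"},
}

def run(task, log):
    log.append(task)            # stand-in for real task execution

log = []
# static_order() yields every task only after all its dependencies.
for task in TopologicalSorter(dag).static_order():
    run(task, log)
```

Because the sorter guarantees dependencies come first, `extract` always runs before `transform`, and so on down the chain; cycles in the declared dependencies raise an error instead of deadlocking silently.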
  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Business Associate

Job Description

Product Owner

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Product Owner

Job Description

QA automation

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : QA automation

Job Description

Job Summary:
  • Experience in the design and development of embedded systems
  • Expert knowledge in developing modular application software using C for an Embedded Linux system
  • Experience in writing multithreaded application software and libraries for Embedded Linux systems
  • Good knowledge of hardware/software interfaces and reading schematics
  • Experience with communication interfaces like CAN, RS232, I2C, SPI, Bluetooth/BLE, and GSM AT commands
  • Experience with standard automotive communication protocols like J1939
  • Experience using static code analysis, run-time debugging tools like GDB, and memory debugging tools like Valgrind is an added advantage
  • Good working knowledge of Agile-based product development methodology
  • Effective communication, interpersonal, analytical, and problem-solving skills
  • Experience in the telematics domain is an added advantage
  • A degree in engineering with 5+ years of experience in embedded software development

Responsibilities

  • Roles and Responsibilities : Software development and testing
  • Required Skills : C, Embedded Linux application programming, multi-process and multi-threaded environments, BLE, CAN
  • Soft Skills : Good verbal and written communication
  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Software Engineer