We found 1,225 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Qualifications
  • Qualified CA or MBA in Finance with 6–9 years of relevant international tax experience at a Big 4 firm and/or multinational corporations within the IT industry
  • Hands-on experience with monthly, quarterly, and annual foreign tax compliance, and a working knowledge of international tax matters
  • Experience in transfer pricing for US multinational technology companies
  • Advanced Excel required; experience with NetSuite and Alteryx a plus

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Senior Tax Analyst – International Tax

Job Description

Requirements
  • 5+ years of relevant experience in Data Analytics, BI Analytics, or BI Engineering, preferably with a globally recognized organization
  • Expert-level skills in writing complex SQL queries to create views in warehouses such as Snowflake, Redshift, SQL Server, Oracle, or BigQuery
  • Advanced skills in designing and creating data models and dashboards in BI tools such as Tableau, Domo, or Looker
  • Intermediate-level skills in analytical tools such as Excel, Google Sheets, or Power BI (complex formulas, lookups, pivots, etc.)
  • Bachelor's/advanced degree in Data Analytics, Data Science, Information Systems, Computer Science, Applied Math, Statistics, or a similar field of study
  • Willingness to work with internal team members and stakeholders in other time zones
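The core technical skill this role names is writing SQL that creates reporting views for BI tools to query. A minimal sketch of that pattern, using an in-memory SQLite database as a stand-in for a warehouse like Snowflake or Redshift (the table, column, and view names are illustrative, not from a real schema):

```python
import sqlite3

# In-memory SQLite stands in for a warehouse such as Snowflake or Redshift;
# the table and column names here are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'APAC', 120.0), (2, 'APAC', 80.0), (3, 'EMEA', 200.0);

    -- A reporting view that aggregates raw orders per region: the kind of
    -- object a BI tool (Tableau, Looker, Domo) would be pointed at.
    CREATE VIEW revenue_by_region AS
    SELECT region, COUNT(*) AS order_count, SUM(amount) AS total_revenue
    FROM orders
    GROUP BY region;
""")

rows = conn.execute("SELECT * FROM revenue_by_region ORDER BY region").fetchall()
print(rows)  # [('APAC', 2, 200.0), ('EMEA', 1, 200.0)]
```

The view keeps the aggregation logic in the warehouse, so every dashboard reads the same numbers instead of re-implementing the SUM/GROUP BY in each tool.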

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Data Analyst

Job Description

  • Minimum 4 years of experience in IT, including 2 years in Looker development, required
  • Knowledge of operating on Google Cloud infrastructure
  • Knowledge of JIRA management and agile methodology
  • Ability to work in a fast-paced environment; multitasking and attention to detail are critical
  • Strong communication skills
  • Strong organizational skills and the ability to work independently

Responsibilities

  • Salary : Rs. 10,00,000.0 - Rs. 18,00,000.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Looker Developer

Job Description

  • Maintaining inventory of lab supplies (Android TVs, systems, TV accessories, mobiles, etc.)
  • Ensuring lab hosts are running and up to date
  • Ensuring lab devices (Android TVs, mobiles, systems, TV accessories) are in good state to run tests
  • Good knowledge of networking and basic knowledge of Python
  • Assisting stakeholders with test enablement and bring-up on different devices (TVs and mobile devices)
  • Ensuring lab networks are working and able to support all necessary tests
  • Creating and/or updating test beds for automation test runs with the requested equipment
  • Managing lab scheduling, space allocation, and workflow efficiency
  • Reclaiming unused test beds, devices, and test results
  • Ensuring compliance with safety regulations
  • Maintaining a clean and organized lab space
  • Obtaining the necessary approvals for device allocations
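The "networking plus basic Python" expectation above typically translates into small scripts that verify lab hosts and devices are reachable. A hypothetical sketch of such a check (the host list and ports are made up for illustration):

```python
import socket

def check_host(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical lab inventory: hostnames and the ports their services listen on
# (e.g. 22 for SSH on a lab host, 5555 for adb-over-TCP on an Android TV).
lab_hosts = [("localhost", 22), ("localhost", 5555)]

for host, port in lab_hosts:
    status = "up" if check_host(host, port) else "down"
    print(f"{host}:{port} is {status}")
```

Running a loop like this on a schedule gives an early warning when a test bed drops off the network, before an automation run fails against it.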

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Lab Admin

Job Description

The Senior Support Engineer, SAP CPI provides exceptional remote-based support for mission-critical SAP applications as part of our global customer support team. The position is responsible for researching, troubleshooting, and supporting these SAP technologies. The role requires the expertise and skills to diagnose serious issues, then develop, test, package, and deliver fixes for those issues in complex, integrated, and highly configured environments.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP CPI

Job Description

  • 3 to 5 years of hands-on experience working with Java and Spring Framework components
  • At least 2 years of hands-on experience using Java Spark on HDInsight or SoK8s
  • At least 2 years of hands-on experience with container and orchestration tools such as Docker and Kubernetes
  • Experience working on projects using agile methodologies and CI/CD pipelines
  • Experience with at least one RDBMS such as Oracle, PostgreSQL, or SQL Server
  • Nice to have: exposure to Linux platforms such as RHEL and cloud platforms such as Azure Data Lake
  • Nice to have: exposure to the investment banking domain

Responsibilities

  • Salary : Rs. 0.0 - Rs. 12.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Specialist Software Engineer - Java + BigData (250009V9)

Job Description

Java Full Stack Development

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Java Full Stack Development

Job Description

Job Summary: We are seeking a skilled Data Engineer with 2–4 years of hands-on experience to join our team. The ideal candidate will design, build, and maintain data pipelines and data warehouse solutions, working with ETL processes on AWS. Proficiency in SQL and Python is essential to transform raw data into valuable business insights.

Key Responsibilities:
  • Develop, maintain, and optimize ETL pipelines to ingest, transform, and load data from various sources
  • Design and implement data warehouse solutions to support reporting and analytics
  • Work with AWS data services (such as S3, Redshift, Glue, EMR, etc.) for data storage and processing
  • Write efficient, scalable SQL queries for data extraction, aggregation, and reporting
  • Develop Python scripts for data transformation, automation, and integration tasks
  • Monitor data pipelines, troubleshoot data issues, and ensure data quality and reliability
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements
  • Document data flows, processes, and systems for transparency and maintainability

Requirements:
  • 2–4 years of professional experience as a Data Engineer or in a similar data-focused engineering role
  • Strong experience with ETL processes and tools
  • Hands-on experience with data warehousing concepts and implementation
  • Good knowledge of AWS cloud services (S3, Redshift, Glue, Lambda, etc.)
  • Proficiency in SQL (complex queries, optimization, data modeling)
  • Strong programming skills in Python (data manipulation, scripting)
  • Familiarity with version control systems like Git
  • Ability to work independently and collaboratively in a fast-paced environment
  • Strong analytical and problem-solving skills
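The ingest-transform-load loop this role describes can be sketched in miniature. In this hedged example, an inline CSV stands in for a raw file landed in S3, and in-memory SQLite stands in for a warehouse such as Redshift; all table names, column names, and data are illustrative:

```python
import csv
import io
import sqlite3

# Raw source: stand-in for a CSV file landed in S3. One row has a bad value,
# to show the transform step enforcing data quality.
RAW_CSV = """event_date,user_id,amount
2024-01-01,u1,10.50
2024-01-01,u2,bad_value
2024-01-02,u1,7.25
"""

def extract(raw: str):
    """Parse CSV rows out of the raw source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Keep only rows with a valid numeric amount, converting types."""
    clean = []
    for row in rows:
        try:
            clean.append((row["event_date"], row["user_id"], float(row["amount"])))
        except ValueError:
            continue  # a real pipeline would route bad rows to a quarantine table
    return clean

def load(rows, conn):
    """Write cleaned rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS events (event_date TEXT, user_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
print(total)  # 17.75
```

In a production AWS setup the same three stages would map onto S3 reads, a Glue or Lambda transform, and a Redshift COPY, but the shape of the pipeline is the same.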

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Data Engineer

Job Description

Job Title: DevSecOps Engineer with 4+ years' experience

Job Summary: We're looking for a dynamic DevSecOps Engineer to lead the charge in embedding security into our DevOps lifecycle. This role focuses on implementing secure, scalable, and observable cloud-native systems, leveraging Azure, Kubernetes, GitHub Actions, and security tools like Black Duck, SonarQube, and Snyk.

Key Responsibilities:
  • Architect, deploy, and manage secure Azure infrastructure using Terraform and Infrastructure as Code (IaC) principles
  • Build and maintain CI/CD pipelines in GitHub Actions, integrating tools such as Black Duck, SonarQube, and Snyk
  • Operate and optimize Azure Kubernetes Service (AKS) for containerized applications
  • Configure robust monitoring and observability stacks using Prometheus, Grafana, and Loki
  • Implement incident response automation with PagerDuty
  • Manage and support MS SQL databases and perform basic operations on Cosmos DB
  • Collaborate with development teams to promote security best practices across the SDLC
  • Identify vulnerabilities early and respond proactively to emerging security threats

Required Skills:
  • Deep knowledge of Azure services, AKS, and Terraform
  • Strong proficiency with Git, GitHub Actions, and CI/CD workflow design
  • Hands-on experience integrating and managing Black Duck, SonarQube, and Snyk
  • Proficiency in setting up monitoring stacks: Prometheus, Grafana, and Loki
  • Familiarity with PagerDuty for on-call and incident response workflows
  • Experience managing MSSQL and an understanding of Cosmos DB basics
  • Strong scripting ability (Python, Bash, or PowerShell)
  • Understanding of DevSecOps principles and secure coding practices
  • Familiarity with Helm, Bicep, container scanning, and runtime security solutions
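Embedding scanners like Black Duck or Snyk into a pipeline usually means a gate step that fails the build on serious findings. A hedged sketch of that gate in Python; the JSON report format here is invented for illustration (real Black Duck / Snyk / SonarQube output differs and would need mapping first):

```python
import json
import sys

# Hypothetical scanner report -- not any real tool's schema.
REPORT = json.loads("""
{
  "findings": [
    {"id": "CVE-2023-0001", "severity": "high"},
    {"id": "CVE-2023-0002", "severity": "low"}
  ]
}
""")

def gate(report: dict, blocked: frozenset = frozenset({"high", "critical"})) -> int:
    """Return a nonzero exit code if any finding has a blocked severity."""
    bad = [f["id"] for f in report["findings"] if f["severity"] in blocked]
    for finding_id in bad:
        print(f"BLOCKED: {finding_id}", file=sys.stderr)
    return 1 if bad else 0

exit_code = gate(REPORT)
print(exit_code)  # 1
```

A GitHub Actions step would run a script like this after the scan and exit with the returned code, so a high-severity finding stops the workflow before deployment.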

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : DevSecOps Engineer