We found 37 jobs matching your search

Job Description

Informatica CDI (Cloud Data Integration)

Must-Have Technical/Functional Skills:
  • 3–6 years of experience in ETL development and data integration.
  • Hands-on experience with Matillion (preferred) or strong expertise in Ab Initio, Informatica, or DataStage.
  • Strong SQL skills and experience with relational databases (Oracle, SQL Server, etc.).
  • Knowledge of data warehousing concepts and dimensional modeling.
  • Experience with scheduling tools and job monitoring.
  • Familiarity with cloud platforms (AWS, Azure, or GCP) is a plus.
  • Strong problem-solving and analytical skills.
  • Experience with scripting languages (Python, Shell) and knowledge of big data technologies.
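The dimensional-modeling and SQL requirements above can be illustrated with a minimal star-schema query. This is a sketch only, using SQLite and hypothetical table/column names (`fact_sales`, `dim_date`), not any specific warehouse.

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to a date dimension.
# All table and column names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, date_key INTEGER, amount REAL);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_sales VALUES (1, 20240101, 100.0), (2, 20240101, 50.0), (3, 20240201, 75.0);
""")

# Typical warehouse query: aggregate facts by a dimension attribute.
cur.execute("""
SELECT d.month, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d ON f.date_key = d.date_key
GROUP BY d.month
ORDER BY d.month
""")
monthly_totals = cur.fetchall()
print(monthly_totals)  # [(1, 150.0), (2, 75.0)]
conn.close()
```

An ETL tool such as Matillion or Informatica would populate tables like these; the query pattern (fact joined to dimensions, grouped by dimension attributes) is the same regardless of platform.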

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Informatica CDI (Cloud Data Integration)

Job Description

Network Routing (WAN Technology)

Role Overview: The NOC Network L1 Engineer will be responsible for monitoring network infrastructure, handling incidents and service requests, and ensuring smooth coordination between internal teams, service providers, and clients. This role requires adherence to Standard Operating Procedures (SOPs) for troubleshooting and proactive communication to maintain high service availability.

Key Responsibilities:
  • Network Monitoring: Continuously monitor network devices, links, and services using NOC tools and dashboards; identify and escalate alerts or anomalies promptly.
  • Incident & Ticket Management: Handle tickets as per SOP-based troubleshooting guidelines; ensure timely resolution or escalation to L2/L3 teams when required.
  • Coordination & Communication: Collaborate with OSS teams for operational support and issue resolution; coordinate with telco vendors for link-related issues and outages; work closely with the Service Desk and other cross-functional teams to ensure seamless operations; communicate effectively with clients regarding incident status and updates.
  • Documentation & Reporting: Maintain accurate records of incidents, actions taken, and resolutions; prepare daily/weekly reports on network health and ticket status.

Required Skills & Qualifications:
  • Basic understanding of networking concepts (TCP/IP, DNS, DHCP, routing, switching).
  • Familiarity with NOC monitoring tools and ticketing systems.
  • Strong troubleshooting skills following SOPs.
  • Excellent communication and coordination abilities.
  • Ability to work in a 24x7 shift environment.

Preferred Qualifications:
  • Certification in networking (CCNA, CompTIA Network+, or equivalent).
  • Experience in telecom or ISP environments.
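The monitoring duties described above boil down to repeatedly checking whether devices and links respond. A minimal sketch of one such check, a TCP reachability probe, is below; it uses only the standard library and a throwaway local listener, since real NOC tooling would poll actual device addresses.

```python
import socket

def port_is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo against a throwaway listener on localhost, standing in for a monitored device.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # port 0: the OS picks a free port
listener.listen(1)
port = listener.getsockname()[1]

reachable_before = port_is_reachable("127.0.0.1", port)
listener.close()
reachable_after = port_is_reachable("127.0.0.1", port)
print(reachable_before, reachable_after)  # True, then False once the listener is gone
```

In practice a NOC platform wraps checks like this in scheduled polls, thresholds, and alert routing; the probe itself is this simple.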

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Network Routing (WAN Technology)

Job Description

Digital : Microsoft Power BI

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Digital : Microsoft Power BI

Job Description

Traditional AI/ML Developer

We are seeking an experienced Senior AI/ML Developer to lead the development and deployment of robust AI-driven solutions. The ideal candidate will leverage their extensive background in artificial intelligence, machine learning, and data science to address complex business needs and guide the team towards innovation.

Must-Have Skills:
  • Develop a deep understanding of different types of data, metrics, and KPIs to identify and formulate data science problems that drive impactful solutions.
  • Proficiency in Python/PySpark for AI development, data preprocessing, and scripting.
  • Strong understanding of core machine learning and statistical concepts, including supervised, unsupervised, and reinforcement learning, feature engineering, and model optimization.
  • Hands-on experience in building, training, and deploying ML models for use cases such as prediction, recommendation, NLP, and computer vision.
  • Proficiency in Python and machine learning libraries such as scikit-learn, TensorFlow, PyTorch, XGBoost, and Hugging Face Transformers.
  • Familiarity with deep learning architectures (CNNs, RNNs, Transformers) and use cases in NLP, CV, and recommendation systems.
  • Experience with data preprocessing, feature selection, model evaluation, and hyperparameter tuning techniques.
  • Ability to design and deploy scalable ML pipelines leveraging cloud platforms, preferably AWS; Azure experience is also acceptable.
  • Familiarity with API development, microservices, and integrating ML models into enterprise applications.
  • Understanding of data security, governance, and cost optimization in cloud-based AI environments.
  • Experience with AWS AI and ML services, including SageMaker, Lambda, Step Functions, API Gateway, and CloudWatch, for model deployment and orchestration.
  • Ability to work with containerization tools (Docker, Kubernetes) for deploying ML workloads.

Nice-to-Have Skills:
  • Strong knowledge of data storage and processing frameworks, including Amazon S3, Redshift, Glue, Athena, or Azure Data Lake and Synapse.
  • Understanding of MLOps frameworks (MLflow, Kubeflow, or Vertex AI Pipelines).
  • Knowledge of AI governance, responsible AI, and ethical AI frameworks.
  • Hands-on experience with Git-based CI/CD, Agile delivery, and DevOps/MLOps automation.
  • Hands-on experience with MLOps tools and practices for automating model training, deployment, and monitoring.
  • Proven experience in designing and implementing Extract, Transform, Load (ETL) processes.
  • Knowledge of data visualization tools like Tableau or Power BI.
  • Strong problem-solving and analytical skills.
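The supervised-learning and model-training skills listed above can be sketched at their smallest scale: logistic regression trained with batch gradient descent on synthetic 1-D data. This is a stdlib-only illustration of the concepts (sigmoid prediction, gradient of the log-loss, accuracy evaluation); real work would use scikit-learn, XGBoost, or PyTorch as the listing suggests.

```python
import math
import random

# Synthetic supervised-learning problem: label is 1 when the feature exceeds 0.5.
random.seed(0)
X = [random.random() for _ in range(200)]
y = [1 if x > 0.5 else 0 for x in X]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):                              # training epochs
    gw = gb = 0.0
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-(w * xi + b)))  # sigmoid prediction
        gw += (p - yi) * xi                        # gradient of log-loss w.r.t. w
        gb += (p - yi)                             # gradient of log-loss w.r.t. b
    w -= lr * gw / len(X)                          # batch gradient-descent step
    b -= lr * gb / len(X)

preds = [1 if 1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5 else 0 for x in X]
accuracy = sum(int(p == t) for p, t in zip(preds, y)) / len(y)
print(f"training accuracy: {accuracy:.2f}")
```

Hyperparameter tuning, feature engineering, and train/test splitting layer on top of exactly this loop; libraries mainly add vectorization and battle-tested optimizers.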

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Traditional AI/ML Developer

Job Description

AWS Data Engineer

Education: Bachelor's/Master's in Computer Science or equivalent engineering.
Previous Work Experience: 3–7 years (Bachelor's degree holders).

  • In-depth experience with various AWS data services, including Amazon S3, Amazon Redshift, AWS Glue, AWS Lambda, Amazon EMR, SQS, SNS, Step Functions, and EventBridge.
  • Ability to design, implement, and maintain scalable data pipelines using AWS services.
  • Strong proficiency in big data technologies such as Apache Spark and Apache Hadoop for processing and analyzing large datasets.
  • Hands-on experience with database management systems such as Amazon RDS, DynamoDB, and others.
  • Good knowledge of AWS OpenSearch.
  • Experience with data ingestion services such as AWS AppFlow, DMS, and AWS Glue.
  • Hands-on experience developing REST APIs using AWS API Gateway.
  • Experience in real-time and batch data processing in AWS environments utilizing services like Kinesis Data Firehose, AWS Glue, and AWS Lambda.
  • Proficiency in programming languages such as Python and PySpark for building data applications and ETL processes.
  • Strong scripting skills for automation and orchestration of data workflows.
  • Solid understanding of data warehousing concepts and best practices.
  • Experience in designing and managing data warehouses on AWS Redshift or similar platforms.
  • Proven experience in designing and implementing Extract, Transform, Load (ETL) processes.
  • Knowledge of AWS security best practices and the ability to implement secure data solutions.
  • Knowledge of monitoring logs and creating alerts and dashboards in AWS CloudWatch.
  • Understanding of version control systems such as Git.
  • AWS associate- or professional-level certification.

Nice-to-Have Skills:
  • Experience with Agile and DevOps concepts.
  • Understanding of networking principles, including VPC design, subnets, and security groups.
  • Experience with containerization tools such as Docker and orchestration tools like Kubernetes.
  • Ability to deploy and manage data applications using containerized solutions.
  • Familiarity with integrating machine learning models into data pipelines.
  • Knowledge of AWS SageMaker or other machine learning platforms.
  • Experience with AWS Bedrock for generative AI integration.
  • Knowledge of monitoring tools for tracking the performance and health of data systems.
  • Ability to optimize and fine-tune data pipelines for efficiency.
  • Experience in AWS s
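The serverless ingestion pattern named above (SQS or Kinesis events triggering Lambda) can be sketched without an AWS account. The handler below follows the Lambda handler shape, but the event here is a simplified stand-in for the real SQS trigger payload, and the field names in the message body are hypothetical.

```python
import json

def handler(event, context=None):
    """Lambda-style handler sketch: parse an SQS-like batch of JSON records,
    skip malformed ones, and return a summary of what was processed."""
    processed, skipped = [], 0
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])
            processed.append({"id": payload["id"], "amount": float(payload["amount"])})
        except (KeyError, ValueError):
            skipped += 1  # in a real pipeline, bad records would go to a DLQ
    return {"processed": len(processed), "skipped": skipped,
            "total_amount": sum(r["amount"] for r in processed)}

# Simulated invocation with one good and one malformed record.
event = {"Records": [
    {"body": json.dumps({"id": "a1", "amount": "19.99"})},
    {"body": "not json"},
]}
result = handler(event)
print(result)  # {'processed': 1, 'skipped': 1, 'total_amount': 19.99}
```

In a deployed pipeline the same function would be wired to an SQS queue or Kinesis stream as the trigger, with CloudWatch capturing its logs and metrics.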

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : AWS Data Engineer

Job Description

Manual test

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Manual test

Job Description

Senior QA Automation Engineer – Tricentis Tosca

Location of Requirement: Chennai/Bangalore/Hyderabad/Bhubaneshwar/Kolkata

Desired Competencies (Technical/Behavioral Competency)

Must-Have (ideally no more than 3–5):
  • 4–8 years in QA; 3+ years hands-on in Tricentis Tosca.
  • Strong command of TBox, XScan, API Scan, Test Case Design, and DEX.
  • Proven experience in UI automation (web, desktop) and API automation (REST/SOAP).
  • Good SQL skills for DB validation, data setup, and data-driven testing.
  • Experience with Git, Jenkins/Azure DevOps, and artifact management.
  • Solid understanding of SDLC/STLC, Agile/Scrum, story-based testing, and the defect lifecycle.
  • Ability to optimize test suites and reduce execution time via modular design.

Good-to-Have:
  • Exposure to Tricentis qTest, Tosca BI/Data Integrity, and Tosca Commander administration.
  • Scripting/programming basics (Python/Java) for utilities and CI steps.
  • Performance testing awareness (e.g., JMeter) and service virtualization.
  • Domain knowledge (Telecom/BSS/OSS, Consumer, DWP) and data pipeline/ETL validation.

Key Responsibilities:
  • Design and develop Tosca automation frameworks (Module, TestCase, TestSheet, Execution List) aligned to the project test strategy.
  • Automate UI, API, and DB validations using Tosca (TBox, API Scan, XScan, TC-Shell utilities).
  • Implement Model-Based Test Automation (MBTA) and Test Case Design (TCD) for scalable, data-driven coverage.
  • Build and maintain Tosca repositories, naming conventions, and versioning standards.
  • Integrate automation with CI/CD pipelines (Jenkins/Azure DevOps/GitLab) and enable continuous testing.
  • Configure and manage Tosca Distributed Execution (DEX) agents for parallel runs.
  • Collaborate with BA/Dev/QA to define test objectives, risk-based testing, and traceability to requirements/user stories.
  • Create and maintain test data strategies; use Tosca Data Integrity or DB checks for end-to-end validation.
  • Drive defect triage, root cause analysis, and quality gates for releases.
  • Produce clear documentation: automation strategy, execution reports, coverage metrics, and maintenance plans.
  • Mentor junior testers; contribute to best practices, accelerators, and reusable assets.
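The data-driven DB validation described above, in which a sheet of cases drives one reusable check, can be sketched in plain Python with SQLite. This is a conceptual stand-in for how a Tosca TestSheet feeds test cases from data; the schema and checks are hypothetical.

```python
import sqlite3

# Data-driven DB-validation sketch: a table of (name, SQL, expected) rows
# drives one generic check, analogous to a TestSheet driving test cases.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, total REAL);
INSERT INTO orders VALUES (1, 'OPEN', 10.0), (2, 'CLOSED', 20.0), (3, 'OPEN', 5.0);
""")

checks = [
    ("row count",    "SELECT COUNT(*) FROM orders",                     3),
    ("open orders",  "SELECT COUNT(*) FROM orders WHERE status='OPEN'", 2),
    ("no negatives", "SELECT COUNT(*) FROM orders WHERE total < 0",     0),
]

results = {}
for name, sql, expected in checks:
    actual = conn.execute(sql).fetchone()[0]
    results[name] = (actual == expected)
    print(f"{name}: {'PASS' if results[name] else 'FAIL'}")

conn.close()
```

Adding a validation then means adding a data row, not writing new test logic, which is the core economy of data-driven testing.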

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Senior QA Automation Engineer – Tricentis Tosca

Job Description

Azure Databricks

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Azure Databricks

Job Description

Power BI

  • Proficient in Power BI dashboard development, Power Query, and DAX.
  • Proficient in SQL query and stored-procedure development.
  • Knowledge of DW concepts (facts, dimensions, SCDs, etc.) is mandatory.
  • Development of Power BI dashboards: the sub-contractor resource will create Power BI dashboards for the use cases identified and assigned to him/her, delivering to PwC quality standards and agreed-upon timelines.
  • Proficient in preparing design documents and user manuals.
  • Excellent communication and stakeholder management skills.
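The SCD (slowly changing dimension) concept named above is worth one concrete example. A Type 2 SCD keeps history by closing the current row and inserting a new versioned row instead of overwriting. The sketch below uses SQLite and hypothetical table/column names; in a Power BI context the same versioned table would sit in the warehouse feeding the model.

```python
import sqlite3

# SCD Type 2 sketch: versioned dimension rows with validity dates and a
# current-row flag. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    customer_id TEXT, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER
);
INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current)
VALUES ('C1', 'Chennai', '2023-01-01', '9999-12-31', 1);
""")

def apply_scd2(conn, customer_id, new_city, change_date):
    """If the tracked attribute changed, close the current row and open a new one."""
    row = conn.execute(
        "SELECT sk, city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if row and row[1] != new_city:
        conn.execute("UPDATE dim_customer SET valid_to=?, is_current=0 WHERE sk=?",
                     (change_date, row[0]))
        conn.execute(
            "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
            "VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, change_date))

apply_scd2(conn, 'C1', 'Bangalore', '2024-06-01')
history = conn.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id='C1' ORDER BY sk"
).fetchall()
print(history)  # [('Chennai', 0), ('Bangalore', 1)]
conn.close()
```

Facts join to the dimension on the surrogate key, so reports over old facts still show the city that was valid at the time.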

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Power BI