Job Description - Senior Software Engineer - Test Automation (25000N8F)
Missions
MAJOR ACCOUNTABILITIES/PRINCIPAL RESPONSIBILITIES:
· Understand test requirements.
· Design, implement, and maintain automated testing frameworks for web applications.
· Integrate automated tests into CI/CD pipelines and DevOps tools (Azure DevOps).
· Execute and report on automated tests.
· Analyze the results of automated test runs and troubleshoot failing tests.
· Categorize and prioritize test cases.
· Participate in QA chapter meetings.
· Collect test metrics.
KEY SKILL AREAS & KNOWLEDGE REQUIRED:
· Test automation tools: Playwright
· Scripting languages: JavaScript
· Hands-on experience with CI/CD pipelines and DevOps tools
Profile
COMPETENCIES/SKILLS:
· Problem-solving, analytical and creative
· Proactive
· Team player
WORK EXPERIENCE REQUIREMENTS:
· 5+ years of QA experience.
· 5+ years of test automation experience.
· Experience working with web applications.
· Analytical and reporting experience.
· A track record of delivering high-quality code.
· Understanding of the SDLC process and its application with continuous integration tools.
· Spanish: High level desired (medium level acceptable).
· English: High level
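The test-reporting and metrics duties above can be sketched in a few lines. The result format and status names below are illustrative assumptions, not tied to Playwright or any particular framework:

```python
from collections import Counter

def summarize_run(results):
    """Aggregate a list of test results into simple run metrics.

    Each result is assumed to be a (test_name, status) pair, where
    status is one of "passed", "failed", or "skipped".
    """
    counts = Counter(status for _, status in results)
    executed = counts["passed"] + counts["failed"]
    pass_rate = counts["passed"] / executed if executed else 0.0
    return {
        "total": len(results),
        "passed": counts["passed"],
        "failed": counts["failed"],
        "skipped": counts["skipped"],
        "pass_rate": round(pass_rate, 3),
    }

# Example run: one pass, one failure, one skip.
metrics = summarize_run([
    ("login", "passed"),
    ("checkout", "failed"),
    ("search", "skipped"),
])
```

In a CI/CD pipeline, a summary like this would typically be emitted as a build artifact or dashboard metric after each automated run.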
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Senior Software Engineer - Test Automation (25000N8F)
Hi Anuradha,
As discussed over the call, please find below the skills and job description for the Cloud Infrastructure Engineer. We are looking for an India-based resource, preferably from Aurangabad (MH), and are open to screening candidates from the rest of India as well.
Senior Cloud Infrastructure Engineer (8+ Years)
Technical Skills:
• Google Cloud Platform (GCP): Expert hands-on experience with GCE, VPC, IAM, and Cloud Monitoring.
• Puppet: Expert level. Must be able to manage, troubleshoot, and write manifests for a large-scale internal VM fleet.
• Terraform: Deep experience with Infrastructure as Code (IaC) for provisioning and managing GCP resources.
• Ansible: Experience in automating application deployments and orchestration.
• Linux Systems: High-level Linux engineering/troubleshooting and shell scripting (Python/Bash).
• Networking & Security: Deep understanding of IP Whitelisting, Firewalls, and Load Balancing.
• Secure Connectivity: Hands-on experience with SSL/TLS certificate management and troubleshooting encrypted connections.
• Database Connectivity: Troubleshooting DB connection strings and networking between app tiers and Oracle backends.
• System Services: Experience in configuring and debugging Cloud and VM level core enterprise services.
• Good understanding of overall system architecture.
Added Plus:
• OpenShift: Experience moving legacy, monolithic apps from VMs into containers (GKE/OpenShift).
Qualifications and Expectations:
• BE/BTech/MCA/MSc in Computer Science with a minimum of 8 years of experience in Cloud/SRE operations
• Lead Technical Execution: Act as the senior point of contact for all infrastructure-related issues.
• Mentorship: Provide daily technical guidance and code reviews for the junior cloud engineering team in India.
• Stability & Automation: Identify manual work and replace it with automated solutions.
• Proactively identify issues arising from the complex nature of the infrastructure, along with possible resolutions.
• Documentation: Capture and formalize current "tribal knowledge" into technical playbooks.
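As one concrete illustration of the SSL/TLS certificate-management skill listed above, the sketch below computes the days remaining before an OpenSSL-style notAfter timestamp. In practice the timestamp would come from something like ssl.getpeercert(); the sketch only shows the date arithmetic a renewal check would do:

```python
from datetime import datetime, timezone

# OpenSSL-style notAfter timestamps look like "Dec 31 23:59:59 2026 GMT".
NOT_AFTER_FORMAT = "%b %d %H:%M:%S %Y %Z"

def days_until_expiry(not_after, now=None):
    """Days remaining before a certificate's notAfter timestamp.

    Negative values mean the certificate has already expired.
    """
    expires = datetime.strptime(not_after, NOT_AFTER_FORMAT).replace(
        tzinfo=timezone.utc
    )
    now = now or datetime.now(timezone.utc)
    return (expires - now).days
```

A monitoring job would alert when the returned value drops below a rotation threshold (say, 30 days).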
Responsibilities
Please see if we can find candidates in Aurangabad.
Alternatively, see if we can find candidates in Pune or Bangalore.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Custom Software Engineer, a typical day involves designing and developing tailored software solutions that integrate seamlessly with existing systems. The role requires continuous collaboration with various teams to enhance application components, ensuring they meet evolving business requirements. The work environment embraces agile methodologies, promoting iterative development and frequent feedback to deliver efficient and scalable software products. This position demands adaptability and creativity to address complex challenges and optimize system performance in a dynamic setting.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the implementation of best practices to improve software development processes and team productivity.
- Mentor junior team members to foster skill development and knowledge sharing.
- Coordinate cross-functional efforts to align software solutions with organizational goals.
Professional & Technical Skills:
- Graph Technology: High proficiency with LPG-based databases, e.g. Neo4j (Cypher), or RDF stores (SPARQL).
- Graph Engineering: Strong Python skills for processing unstructured and structured data from ERPs and data products and mapping it to ontology entities and relationships.
- ETL/ELT Development: Build robust pipelines to extract structured data and ingest it into the graph database.
- Named Entity Resolution & NLP Frameworks: Hands-on experience with spaCy (essential) or similar libraries for processing unstructured information (business process documentation, PDF invoices, contracts, emails) to extract named entities.
- Other:
  - Business context: Finance & Procurement processes
  - Experience with CI/CD and cloud services/infrastructure
  - Know-how of enterprise applications / ERP / CRM / digital cores
- As a plus:
  - Vector Search: Understanding of vector databases or Graph RAG (Retrieval-Augmented Generation) techniques is a major plus.
  - Named Entity Resolution: Experience training custom NER models for specific domains.
  - Graph DB Optimization: Manage graph database performance (e.g. Neo4j, Amazon Neptune, Stardog) and optimize traversal queries for speed.
Additional Information:
- The candidate should have a minimum of 5 years of experience with Neo4j.
- This position is based at our Bengaluru office.
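The graph-engineering duty above (mapping ERP data onto ontology entities and relationships) can be sketched as a tiny row-to-Cypher mapping. The Supplier/Invoice labels, key fields, and ISSUED relationship are hypothetical, not a fixed ontology:

```python
def row_to_cypher(row):
    """Map one ERP-style invoice row onto graph entities and a relationship.

    The Supplier/Invoice labels and the ISSUED relationship are
    illustrative assumptions, not part of any real schema.
    """
    return (
        f"MERGE (s:Supplier {{id: '{row['supplier_id']}'}}) "
        f"MERGE (i:Invoice {{number: '{row['invoice_no']}'}}) "
        f"SET i.amount = {row['amount']} "
        f"MERGE (s)-[:ISSUED]->(i)"
    )

query = row_to_cypher(
    {"supplier_id": "S-42", "invoice_no": "INV-001", "amount": 125.5}
)
```

A real pipeline would pass values as driver parameters (e.g. session.run with a parameter map) rather than interpolating strings, to avoid injection and quoting issues; the interpolation here just keeps the sketch self-contained.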
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role: Python Developer (FastAPI & Kubernetes)
Exp: 3-6 Years (Z2)
Client: Deloitte - Mercedes Benz
About the Role
We are looking for a skilled Python Full Stack Developer with strong experience in building scalable backend services using FastAPI and deploying applications in containerized environments using Kubernetes. You will work across the stack, contributing to backend APIs, frontend interfaces, and cloud-native deployments.
Key Responsibilities
Design, develop, and maintain high-performance RESTful APIs using FastAPI
Build scalable and secure backend systems using Python
Develop responsive frontend applications using frameworks like React / Angular / Vue.js
Containerize applications using Docker and manage deployments with Kubernetes
Collaborate with DevOps to build CI/CD pipelines
Optimize applications for performance, scalability, and reliability
Write clean, maintainable, and well-documented code
Participate in code reviews and architectural discussions
Integrate third-party services and APIs
Required Skills
Strong proficiency in Python
Hands-on experience with FastAPI or similar frameworks (Flask/Django)
Experience with frontend technologies:
JavaScript / TypeScript
React / Angular / Vue.js
Good understanding of REST API design principles
Experience with Kubernetes
Familiarity with cloud platforms (AWS / GCP / Azure)
Knowledge of relational and NoSQL databases (PostgreSQL, MongoDB)
Understanding of version control tools like Git
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Design, architect, develop, and implement highly available, scalable, multi-region solutions within AWS Cloud.
• Work closely with application, engineering, security and operations teams to engineer and build Cloud native Services on AWS, PaaS and IaaS solutions within an agile and modern enterprise grade operating model.
• Maintain and develop the Infrastructure as Code repository using Terraform and deliver a fully automated cloud infrastructure.
• Implement and maintain monitoring and alerting systems to detect issues proactively.
• Perform deployment, configuration, monitoring and maintenance of high availability enterprise solutions.
• Perform proactive system administration, monitoring, technical efficiency tuning, up-time, alert notifications, and automation tasks.
• Manage and administer the AWS cloud environment, including provisioning, configuration, performance monitoring, policy governance and security
• Design, develop, and implement highly available, multi-region solutions within AWS
• Analyze existing operational standards, processes, and/or governance to identify and implement improvements
• Migrate existing infrastructure services to AWS cloud-based solutions
• Manage security and access controls of AWS cloud-based solutions
• Develop infrastructure as code (IaC) leveraging cloud native tooling to ensure automated and consistent platform deployments
• Develop and implement policy driven data protection best practices to ensure cloud solutions are protected from data loss
• Support cloud adoption of applications as they are being transformed and/or modernized
• Ensure all infrastructure components meet proper performance and capacity standards
Mandatory Technical Skills
• Around 9 years of experience with AWS Cloud (IaaS, PaaS, Database) and Azure DevOps
• Cloud Architecture: Strong understanding of cloud architecture principles and best practices.
• 7+ years of experience with Infrastructure As Code using c
• Advanced skills in Linux, networking, security, and Docker-based environments
• Security best practices and compliance frameworks
• Programming languages are a plus (for example: PowerShell, Shell, Python).
• Experience with AWS networking services (VPC, Direct Connect, Route 53, CloudFront)
• Network security implementation (Security Groups, NACLs, WAF)
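The VPC and network-security bullets above lend themselves to a small illustration: a subnet-plan check using Python's standard ipaddress module. The CIDR ranges in the usage example are invented:

```python
import ipaddress

def validate_subnets(vpc_cidr, subnet_cidrs):
    """Check that proposed subnets fit inside the VPC CIDR and do not overlap.

    Returns a list of problem descriptions; an empty list means the plan
    is consistent.
    """
    vpc = ipaddress.ip_network(vpc_cidr)
    subnets = [ipaddress.ip_network(c) for c in subnet_cidrs]
    problems = []
    for s in subnets:
        if not s.subnet_of(vpc):
            problems.append(f"{s} is outside {vpc}")
    for i, a in enumerate(subnets):
        for b in subnets[i + 1:]:
            if a.overlaps(b):
                problems.append(f"{a} overlaps {b}")
    return problems
```

A check like this is the kind of guard an IaC pipeline might run before Terraform applies a VPC layout.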
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
SAP Datasphere:
We are seeking an experienced SAP Datasphere Developer with strong expertise in building scalable data models and hands‑on proficiency in SQL. The ideal candidate will design, develop, and optimize data models within SAP Datasphere (DSP), ensuring high data quality, performance, and usability for analytics and business applications.
Key Responsibilities
Data Modeling & Development
Design, build, and manage data models (graphical, SQL, entity relationship models) in SAP Datasphere.
Develop local tables, views, analytical datasets, and semantic models to support reporting and analytics use cases.
Implement data transformations, calculation logic, and business rules using DSP capabilities.
Write high‑quality, efficient SQL scripts for data extraction, transformation, and loading (ETL/ELT).
Optimize SQL queries for performance, scalability, and data accuracy.
Troubleshoot SQL and modeling‑related issues across datasets and data flows.
· Familiarity with the SAP ecosystem (e.g., ABAP, BW on HANA, BW/4HANA, SAP Analytics Cloud)
Work with Data Integration Monitor, Replication Flows, and Connections within SAP Datasphere.
Collaborate with data architects, analysts, business SMEs, and engineering teams to understand requirements and translate them into DSP models.
Soft Skills
Strong analytical thinking and problem‑solving skills.
Good communication and stakeholder‑management abilities.
Adaptability and willingness to learn emerging SAP data technologies.
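Datasphere SQL views are written in a HANA-flavored SQL dialect; the sketch below uses Python's built-in sqlite3 purely to illustrate the kind of aggregating view the modeling bullets above describe. The sales table and its columns are hypothetical:

```python
import sqlite3

# Hypothetical source table plus an aggregating view of the kind a
# Datasphere graphical/SQL model would expose for reporting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('EMEA', 'A', 100.0),
        ('EMEA', 'B', 50.0),
        ('APAC', 'A', 75.0);
    CREATE VIEW sales_by_region AS
        SELECT region, SUM(amount) AS total_amount, COUNT(*) AS line_items
        FROM sales
        GROUP BY region;
""")
rows = conn.execute(
    "SELECT region, total_amount, line_items FROM sales_by_region ORDER BY region"
).fetchall()
```

The same pattern (a persisted view layering aggregation and business logic over a replicated table) is what a graphical or SQL model in Datasphere would deliver to analytics consumers.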
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance