We found 121 jobs matching your search

Job Description

• Independently designs, implements, and maintains GraphQL backends using TypeScript, Apollo Server, Axios/Fetch (REST SDKs), MariaDB (SQL) with Objection.js + Knex.js, Redis, AWS, and modern authorization frameworks, including AuthZ libraries or GraphQL Envelop.
• Applies strong engineering fundamentals, including Linux, Docker, OWASP, SOLID/DRY/KISS/YAGNI, and sound data structures and algorithms, for correctness, safety, and efficiency.
• Authors strongly typed schemas using GraphQL SDL or code-first frameworks (e.g., GiraphQL/Pothos, TypeGraphQL) and uses GraphQL Code Generator to produce type-safe definitions for schemas, resolvers, queries, mutations, and subscriptions.
• Configures Apollo Server with TypeScript; develops type-safe resolvers; implements REST and database data sources; manages context initialization (auth, tenancy, request scoping); and enforces query depth/complexity limits, rate limiting, and persisted queries.
• Demonstrates expert SQL capabilities in MariaDB, including schema design, indexing, migrations, query optimization, and resilient data access through Objection.js and Knex.js, ensuring idempotent operations and safe transactions.
• Optimizes performance by eliminating N+1 queries with DataLoader, implementing caching (in-memory and Redis), optimizing pagination and batching, and profiling GraphQL resolver and SQL hot paths.
• Implements secure-by-design GraphQL services, including OAuth/OIDC, encryption in transit and at rest, secret management, input validation, output encoding, least-privilege access, and resolver-level authorization to mitigate CSRF/CORS and other abuse vectors.
• Defines and executes high-quality GraphQL and REST API tests across all API testing types (contract, functional, integration, negative, security, performance) using browser DevTools, Bruno, and Insomnia, and writes comprehensive unit, integration, and end-to-end tests.
• Produces maintainable, reusable, type-safe code that is fast, idempotent, resilient, observable, and fault-tolerant, incorporating retries, exponential backoff, graceful degradation, circuit breakers, and robust structured error surfaces.
• Diagnoses failures across the stack using logs, metrics, traces, database analysis, and network inspection; performs root-cause analysis and provides actionable remediations with evidence-based findings.
• Operates services locally via Docker and deploys to AWS using appropriate configuration, secrets management, monitoring, and observability tooling; tunes caching, database performance, and GraphQL query execution in production environments.
• Designs stable, versioned, backward-compatible GraphQL contracts; maintains API documentation and operational runbooks; and ensures seamless integration between backend logic, database layers, and frontend clients.
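The N+1 elimination called out above can be sketched in plain TypeScript. This is an illustrative, self-contained re-implementation of the batching pattern that the `dataloader` package provides, not this employer's actual code; `TinyLoader` and the fake user fetch are hypothetical names.

```typescript
// Minimal sketch of the DataLoader batching pattern: load() calls made in the
// same tick are collected and resolved with ONE batch call, avoiding the N+1
// pattern where each resolver issues its own database query.
type BatchFn<K, V> = (keys: K[]) => Promise<V[]>;

class TinyLoader<K, V> {
  private queue: { key: K; resolve: (v: V) => void }[] = [];
  private scheduled = false;

  constructor(private batchFn: BatchFn<K, V>) {}

  load(key: K): Promise<V> {
    return new Promise((resolve) => {
      this.queue.push({ key, resolve });
      if (!this.scheduled) {
        this.scheduled = true;
        // Flush once the current tick has finished enqueueing keys.
        queueMicrotask(() => this.flush());
      }
    });
  }

  private async flush(): Promise<void> {
    const batch = this.queue;
    this.queue = [];
    this.scheduled = false;
    // One batch call replaces N individual lookups; the batch function must
    // return values in the same order as the keys it received.
    const values = await this.batchFn(batch.map((item) => item.key));
    batch.forEach((item, i) => item.resolve(values[i]));
  }
}

// Hypothetical usage: three resolver-style lookups become a single batch call,
// standing in for one SQL `WHERE id IN (...)` round trip.
const userLoader = new TinyLoader<number, string>(async (ids) =>
  ids.map((id) => `user-${id}`),
);

Promise.all([userLoader.load(1), userLoader.load(2), userLoader.load(3)]).then(
  (users) => console.log(users),
);
```

The real `dataloader` package adds per-request caching and error handling on top of this idea; the sketch only shows the batching core.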

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: GraphQL Developer

Job Description

Requirement ID: 10596845
Job Title: Developer
Work Location: Chennai, TN / Mumbai, MH / Pune, MH
Skill Required: BPS: Data Modeler
Experience Range in Required Skills: 7+ years
Duration: 6 months (extendable)

Key responsibilities — you will be responsible for:
· Defining the data architecture framework, covering data modelling, security, virtualization, governance, reference and master data, and data visualization.
· Defining reference architectures and patterns that others can follow to create and improve data systems in line with industry standards.
· Designing data models using ER, Dimensional, and Data Vault modelling techniques.
· Designing data security and controls that address customers' data-privacy needs in line with current regulations such as GDPR and CCPA.
· Designing technical solutions using data virtualization techniques and tools such as Denodo.
· Collaborating and coordinating with multiple departments, senior stakeholders (including C-level), partners, and external vendors.
· Designing architecture solutions that are in line with business objectives.
· Providing technical leadership, oversight, and direction to the project/execution team.
· Building effective relationships with customers, the CoE, partners, and vendors.
· Verifying requirements, assuring data solution architecture and design, developing a delivery plan, and providing thought leadership for all data solutions, including design and development that meet and exceed customer expectations.
· Analyzing and translating business needs into long-term solution data models.
· Developing best practices for data modeling to ensure consistency within the data landscape.
· Partnering closely with data engineers, developers, DevOps engineers, data scientists, and technical leads to build data pipelines and develop feedback loops that improve data quality, data ingestion, data streaming, workflows, and automation of data transformation jobs.
· Designing, building, testing, and deploying data pipelines and blueprints at scale.
· Designing data-driven product and solution frameworks in collaboration with enterprise data architects and cloud solution architects.
· Applying strong experience in code review, code quality, and data and application security.
· Driving productivity and stability while accelerating time-to-market through automation of the software lifecycle.
· Driving continuous quality, stability, and compliance to reduce risk.
· Automating and orchestrating releases at scale.
· Measuring, tracking, and improving software delivery; increasing efficiency and reducing risks associated with software delivery.

To be successful in this role, you will be experienced with cloud-based data solution architectures, the software development life cycle (both Agile and Waterfall), data engineering and ETL tools/platforms, and data modeling practices.

Key skills/knowledge:
· Bachelor's/Master's degree in a relevant field (e.g., Computer Science, Software Engineering, Data Science, AI/ML).
· 7+ years of experience developing, deploying, and monitoring end-to-end data and analytics solutions, with extensive knowledge of evaluation metrics and best practices.
· 7+ years of data warehouse/data lake architecture and development.
· 5+ years of data modeling and architecture, including experience implementing at least one market-leading banking and financial services data model (e.g., from Oracle, Teradata, or IBM).
· Experience with data modelling tools such as Erwin, ER/Studio, or SQL Modeler is preferred.
· Experience in ETL/ELT, data pipelines, data quality, and blueprint development.
· Non-relational database experience (document DB, graph DB, etc.).
· Understanding of data structures, data modeling, and data architecture.
· Strong communication skills and the ability to work with ambiguity.
· Able to provide direction and support to data modelling team members.
· Strong architectural knowledge of data analytics system patterns and their pros and cons.
· Strong knowledge of data quality, metadata management, and security frameworks/tools implemented for data on cloud architectures.
· Able to understand customer needs and provide target solutions that are scalable, reliable, and highly available.

  • Salary: Rs. 90,000 - Rs. 1,65,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Modeler

Job Description

Requirement ID: 10596315
Job Title: Bigdata Developer
Work Location: Chennai, TN / Pune, MH / Mumbai, MH
Skill Required: Digital: Bigdata and Hadoop Ecosystem - Python, Spark, Hive
Experience Range in Required Skills: 8-10 years; desired experience: 6+ years relevant
Duration: 6 months (extendable)

Must-have:
• Proficient in Scala, preferably with certification from an accredited institution.
• Experience building enterprise software solutions.
• Knowledge of OOP concepts and patterns.
• Basic knowledge of HDFS, Hive, and Spark.
• Able to build required skills through quick research, including Google search.
• Computer science background preferred.
• Ability to present information in a concise and clear manner.

Good-to-have:
• Basic knowledge of HDFS, Hive, and Spark.
• Basic knowledge of an OOP language such as Java.
• Scripting languages: Unix shell, Python.

  • Salary: Rs. 90,000 - Rs. 1,65,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Bigdata Developer

Job Description

Please work on the demand below.
• Primary mandate skill required: Tanium
• Secondary mandate skill required: Intune
• Can we consider Contractor (CWR) profiles: Yes
• Flexible to hire in any location; if not, please mention job location: Open

Tanium Admin:
• Deploying, configuring, and maintaining the Tanium platform and its modules.
• Performing system health checks and ensuring robust endpoint management.
• Identifying, analyzing, and remediating security vulnerabilities and compliance issues.
• Creating and managing Tanium sensors and questions to gather data.
• Troubleshooting endpoint issues across operating systems such as Windows, Linux, and macOS.
• Assisting customers with support cases and answering questions.
• Developing and testing new product features.
• Creating and managing deployment strategies for software and patches.
• Working with automation and low-code tools like Tanium Automate.

Intune Admin:
• Design, implement, and manage Intune policies for Windows, macOS, iOS, and Android platforms.
• Oversee application deployment strategies using Microsoft Endpoint Manager (MEM).
• Configure and maintain app protection and configuration policies.
• Provide L3/L4 support for escalated issues related to Intune, device compliance, and application deployment.
• Analyze logs and telemetry data to resolve complex technical issues.
• Collaborate with Microsoft support and internal teams for issue resolution.
• Implement and manage compliance policies, conditional access, and endpoint security baselines.
• Integrate Intune with Microsoft Defender for Endpoint and other security tools.
• Ensure endpoint configurations align with organizational security standards and regulatory requirements.
• Develop and maintain PowerShell scripts to automate Intune tasks and generate reports.
• Use the Microsoft Graph API for advanced automation and integration.
• Design and implement integrations with Azure AD, Autopilot, SCCM (co-management), and third-party tools.
• Participate in architectural planning and contribute to the endpoint management roadmap.
• Create and maintain dashboards and reports for device compliance, deployment status, and user activity.
• Monitor system health and performance of Intune services.
• Maintain comprehensive documentation of configurations, procedures, and troubleshooting steps.
• Provide training and mentorship to L1/L2 support teams.

Key skills and qualifications:
• Tanium and Intune expertise: deep knowledge of both platforms and their modules.
• Operating systems: strong knowledge of Windows, Linux, and macOS environments.
• Scripting and automation: experience automating tasks and creating sensors and questions.
• Security and compliance: understanding of vulnerability management, threat hunting, and compliance reporting.
• Troubleshooting: ability to identify and solve issues on endpoints.
• Deployment experience: familiarity with Tanium- and Intune-based deployments and other tools like SCCM.
• Customer support: skills in triaging and solving support cases.

Responsibilities

Please work on below demand. • Primary mandate skill required - Tanium • Secondary mandate skill required – Intune • Can we consider Contractor(CWR) profiles-Yes • Flexible to hire in any location – If not, please mention job location -Open • Detailed Job Description – Detailed JD : Tanium Admin •  Deploying, configuring, and maintaining the Tanium platform and its modules. •  Performing system health checks and ensuring robust endpoint management. •  Identifying, analyzing, and remediating security vulnerabilities and compliance issues. •  Creating and managing Tanium sensors and questions to gather data. •  Troubleshooting endpoint issues across various operating systems like Windows, Linux, and macOS. •  Assisting customers with support cases and answering questions. •  Developing and testing new product features, as seen in roles like Software Engineer. •  Creating and managing deployment strategies for software and patches. •  Working with automation and low-code tools like Tanium Automate. Intune Admin : •  Design, implement, and manage Intune policies for Windows, macOS, iOS, and Android platforms. •  Oversee application deployment strategies using Microsoft Endpoint Manager (MEM). •  Configure and maintain app protection and configuration policies. •  Provide L3/L4 support for escalated issues related to Intune, device compliance, and application deployment. •  Analyze logs and telemetry data to resolve complex technical issues. •  Collaborate with Microsoft support and internal teams for issue resolution. •  Implement and manage compliance policies, conditional access, and endpoint security baselines. •  Integrate Intune with Microsoft Defender for Endpoint and other security tools. •  Ensure endpoint configurations align with organizational security standards and regulatory requirements. •  Develop and maintain PowerShell scripts to automate Intune tasks and generate reports. •  Utilize Microsoft Graph API for advanced automation and integration. 
•  Design and implement integrations with Azure AD, Autopilot, SCCM (co-management), and third-party tools. •  Participate in architectural planning and contribute to the endpoint management roadmap. •  Create and maintain dashboards and reports for device compliance, deployment status, and user activity. •  Monitor system health and performance of Intune services. •  Maintain comprehensive documentation of configurations, procedures, and troubleshooting steps. •  Provide training and mentorship to L1/L2 support teams. Key skills and qualifications: •  Tanium & Intune expertise: Deep knowledge of the platform and its various modules. •  Operating systems: Strong knowledge of Windows, Linux, and macOS environments. •  Scripting and automation: Experience in automating tasks and creating sensors and questions. •  Security and compliance: Understanding of vulnerability management, threat hunting, and compliance reporting. •  Troubleshooting: Ability to identify and solve issues on endpoints. •  Deployment experience: Familiarity with Tanium & Intune-based deployments and other tools like SCCM. •  Customer support: Skills in triaging and solving support cases.
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Tanium Admin

Job Description

10 M2B

Summary: As a Custom Software Engineer, you will engage in the design, construction, and configuration of applications tailored to specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing innovative solutions, and ensuring that applications align with organizational goals. You will also participate in testing and debugging to enhance application performance and user experience, while continuously seeking opportunities to improve and optimize application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application development.
- Participate in the testing and debugging of applications to ensure quality and performance.

Professional & Technical Skills:
- Must-have skills: proficiency in Oracle EBS Financials.
- Strong understanding of application design principles and methodologies.
- Experience with the software development life cycle and agile methodologies.
- Ability to troubleshoot and resolve application issues effectively.
- Familiarity with database management and data integration techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle EBS Financials.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

  • Salary: Rs. 0 - Rs. 1,44,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Anupama / Oracle EBS Financials

Job Description

• Designs, implements, and maintains front-end applications using TypeScript, Vue 3 (Composition API), Pinia, Vue Router, Vite, SCSS, Tailwind, PrimeVue, Pino logging, Axios, and GraphQL clients (Apollo/URQL/Relay), integrating Auth0 for authentication and authorization.
• Demonstrates strong engineering fundamentals across Linux, Node.js (npm/pnpm), and Docker, applying OWASP practices and SOLID/DRY/KISS/YAGNI principles with sound data structures and algorithms.
• Exhibits deep Vue expertise: the reactivity system, directives, component design, props/emits, slots, and lifecycle hooks; organizes code via composition functions and type-safe patterns in TypeScript.
• Consumes GraphQL SDL and REST OpenAPI specifications, employing client generation where available; connects components to APIs with Axios/Fetch and GraphQL clients, handling auth flows, pagination, caching, and error surfaces.
• Translates Figma designs into accessible, responsive HTML5/SCSS using the BEM methodology, Tailwind utility patterns, and PrimeVue or equivalent component libraries; documents components in Storybook.
• Implements secure front-end architecture with Auth0 (OAuth/OIDC), token handling, secure storage, CSP, XSS/CSRF mitigation, input validation/encoding, and safe error handling.
• Optimizes web performance via code splitting, lazy loading, tree shaking, asset and image optimization, caching strategies, and efficient rendering; monitors and improves Core Web Vitals using browser performance tooling.
• Writes maintainable, reusable, component-driven code that is secure, fast, idempotent, reliable, and resilient, with clear separation of concerns and consistent logging via Pino (browser).
• Tests thoroughly with Jest or Vitest and Vue Test Utils for unit and integration coverage; performs end-to-end testing; uses msw (mswjs) to mock backend APIs; validates APIs with Bruno or Insomnia and browser DevTools (console, network, performance).
• Troubleshoots effectively by tracing logs, inspecting errors, and isolating root causes across UI, API, and network layers; produces actionable defect reports with evidence.
• Operates locally in Docker and collaborates on CI/CD workflows; familiar with AWS deployment patterns and front-end observability (logging, metrics, tracing) for production support.
• Maintains API and component documentation, aligns with versioned contracts (GraphQL/OpenAPI), and ensures seamless integration between front-end experiences and backend data/services.
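The client-side caching this role calls for can be illustrated with a minimal TTL cache in TypeScript. This is a hedged sketch, not project code: `TtlCache`, the key names, and the injected clock are illustrative, and a real Vue app would typically lean on the GraphQL client's normalized cache instead.

```typescript
// Minimal sketch of a TTL cache for API responses: reads within the TTL hit
// the cache; stale entries are lazily evicted on read.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  // An injectable clock keeps expiry deterministic in tests.
  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (this.now() >= entry.expiresAt) {
      this.store.delete(key); // stale: evict and report a miss
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }

  // Read-through helper: a hit returns immediately, a miss calls the loader
  // (e.g. an Axios or GraphQL request) and caches the result.
  async getOrLoad(key: string, loader: () => Promise<V>): Promise<V> {
    const hit = this.get(key);
    if (hit !== undefined) return hit;
    const value = await loader();
    this.set(key, value);
    return value;
  }
}

// Illustrative usage with a fake clock so expiry is deterministic.
let fakeNow = 0;
const cache = new TtlCache<string>(1000, () => fakeNow);
cache.set("user:1", "Ada");
console.log(cache.get("user:1")); // within TTL → "Ada"
fakeNow = 1500;
console.log(cache.get("user:1")); // expired → undefined
```

The sketch treats `undefined` as "miss", so it cannot cache `undefined` values; production caches distinguish the two.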

Responsibilities

• Designs, implements, and maintains front end applications using TypeScript, Vue 3 (Composition API), Pinia, Vue Router, Vite, SCSS, Tailwind, PrimeVue, Pino logging, Axios, and GraphQL clients (Apollo/URQL/Relay), integrating Auth0 for authentication and authorization. • Demonstrates strong engineering fundamentals across Linux, Node.js (npm/pnpm), and Docker, applying OWASP practices and SOLID/DRY/KISS/YAGNI principles with sound data structures & algorithms. • Exhibits deep Vue expertise: reactivity system, directives, component design, props/emits, slots, and lifecycle hooks; organizes code via Composition Functions and type safe patterns in TypeScript. • Consumes GraphQL SDL and REST OpenAPI specifications, employing client generation where available; connects components to APIs with Axios/Fetch and GraphQL clients, handling auth flows, pagination, caching, and error surfaces. • Translates Figma designs into accessible, responsive HTML5/SCSS using BEM methodology, Tailwind utility patterns, and PrimeVue or equivalent component libraries; documents components in Storybook. • Implements secure front end architecture with Auth0 (OAuth/OIDC), token handling, secure storage, CSP, XSS/CSRF mitigation, input validation/encoding, and safe error handling. • Optimizes web performance via code splitting, lazy loading, tree shaking, asset and image optimization, caching strategies, and efficient rendering; monitors and improves Core Web Vitals using browser performance tooling. • Writes maintainable, reusable, component driven code that is secure, fast, idempotent, reliable, and resilient, with clear separation of concerns and consistent logging via Pino (browser). • Tests thoroughly with Jest or Vitest and Vue Test Utils for unit and integration coverage; performs end to end testing; uses msw (mswjs) to mock backend APIs; validates APIs with Bruno or Insomnia and Browser DevTools (console, network, performance). 
• Troubleshoots effectively by tracing logs, inspecting errors, and isolating root causes across UI, API, and network layers; produces actionable defect reports with evidence. • Operates locally in Docker and collaborates on CI/CD workflows; familiar with AWS deployment patterns and front end observability (logging, metrics, tracing) for production support. • Maintains API and component documentation, aligns with versioned contracts (GraphQL/OpenAPI), and ensures seamless integration between front end experiences and backend data/services. .
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Front End Developer

Job Description

Role: PL/SQL Developer
Must have: PL/SQL (6 years), Oracle (2 years)
Experience: 6 to 8 years
Location: Chennai, Mumbai, Bangalore, Gandhinagar

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: PL/SQL Developer

Job Description

Must have: Java (6 years), Angular (3 years), Microservices (2 years), Spring Boot (2 years)

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java

Job Description

Job Description:
1. Employee Data Management (EDM): updating employee records in the SAP platform.
2. Key pointers: minimum experience of 1 year in EDM.
3. Shift timings: 05:30 to 3:00 AM and 06:30 to 4:00 AM.
4. WFH/WFO: WFO, hybrid (4 days of WFO and 1 day of WFH).

  • Salary: Rs. 2,54,400 - Rs. 3,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Human Resources Practitioner