We found 188 jobs matching your search

Job Description

Role: Business Analysis / MySQL

  • Collaborate with business stakeholders to gather IAM-related requirements and translate them into functional specifications
  • Analyze business processes related to identity management, access provisioning, and compliance needs
  • Evaluate existing IAM systems, tools, and processes to identify gaps and areas for improvement
  • Assess IAM technologies and solutions in the market, providing recommendations aligned with business goals
  • Identify opportunities for streamlining IAM-related processes to enhance efficiency and security
  • Develop and document business process workflows and use cases related to IAM

Essential Skills: Communication, Interpersonal Skills, Decision-Making, Collaboration, Teamwork, Data Analysis & Visualisation, Business Intelligence (BI) Tools, Process Modeling & Analysis

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Business Analysis / MySQL

Job Description

Java Spring Boot Microservices

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Java Spring Boot Microservices

Job Description

Role: Drools Rule Management System

Role Description:

Backend Development (Java)
  • Design and develop scalable, high-performance backend services using Java and J2EE technologies
  • Build and maintain RESTful APIs and microservices using Spring Boot and related frameworks

Frontend Development
  • Develop responsive, user-friendly web interfaces using Angular frameworks
  • Collaborate with UX/UI designers to translate wireframes into high-quality frontend implementations
  • Ensure cross-browser compatibility, performance optimization, and accessibility standards
  • Integrate frontend components seamlessly with backend APIs

Database & Persistence
  • Strong practical experience with relational databases such as MySQL and PostgreSQL, including SQL development and schema design
  • Exposure to NoSQL databases like MongoDB and Cassandra

DevOps, CI/CD & Cloud
  • Deploy and manage applications in cloud environments (Azure)
  • Monitor application health and support production releases and troubleshooting
  • Strong experience with version control using Git/GitHub and deployment automation
  • Solid understanding of Agile methodologies and project management tools like Jira

Quality, Security & Governance
  • Perform peer code reviews to maintain high coding standards and consistency
  • Ensure adherence to security best practices: authentication, authorization, and data protection
  • Support Agile ceremonies: sprint planning, estimations, and retrospectives

AI Technologies & Skills
  • Working knowledge and practical exposure to Claude AI, leveraging AI capabilities to accelerate development, debugging, code reviews, and technical documentation

Essential Skills: Looking for a Senior Java Full Stack Developer with Drools and over 10 years of strong hands-on experience in designing, developing, and delivering enterprise-grade web applications. The resource should have deep expertise across Java backend technologies and modern frontend frameworks, combined with strong exposure to microservices, cloud-native development, CI/CD, and Agile delivery.

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Drools Rule Management System

Job Description

Please find the updated .NET JD. Notice period: immediate to 15 days. Location: Chennai only.

.NET JD stack: .NET / modern JS / MS-SQL / Postgres / Fastly / Apache / GraphQL / NodeJS

.NET Developer skillset:
  • .NET developers proficient in API development; full-stack developers
  • Microservices / micro-frontend architecture
  • gRPC microservices
  • Modern JS for micro frontends (the lead needs this skill; others can learn on the job)

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : .NET Developer

Job Description

Please help us with one M-level IAM profile for the below JD. Below is the SO for the M level.
  • Minimum of 8 to 12 years of IAM product suite implementation experience
  • Certified Okta professional (preferred); should have 4 years of hands-on Okta experience
  • Okta Auth0 experience (preferred)
  • Ansible and Terraform automation (preferred)
  • Experience with Okta Access Gateway and Okta IWA design, installation, configuration, and operation
  • Experience with on-premise application lifecycle management and provisioning
  • Experience with cloud lifecycle management, including SCIM and API integration
  • Strong skills in designing and configuring Service Provider interfaces: SAML v2, MFA, and OAuth2 implementations
  • Expert product and service support by highly technical resources, providing problem resolution, platform support, and feature creation and implementation, including updates to design, code, and specification
  • Multi-factor authentication (MFA) password resets, AD synchronization health, and onboarding of new applications and services
  • IBM Security Access Manager experience (preferred)

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : IAM Developer

Job Description

• Independently designs, implements, and maintains GraphQL backends using TypeScript, Apollo Server, Axios/Fetch (REST SDKs), MariaDB (SQL) with ObjectionJS + KnexJS, Redis, AWS, and modern authorization frameworks, including AuthZ libraries or GraphQL Envelop.
• Applies strong engineering fundamentals, including Linux, Docker, OWASP, SOLID/DRY/KISS/YAGNI, and sound data structures and algorithms for correctness, safety, and efficiency.
• Authors strongly typed schemas using GraphQL SDL or code-first frameworks (e.g., GiraphQL/Pothos, TypeGraphQL) and uses GraphQL Code Generator to produce type-safe definitions for schemas, resolvers, queries, mutations, and subscriptions.
• Configures Apollo Server with TypeScript; develops type-safe resolvers; implements REST and database data sources; manages context initialization (auth, tenancy, request scoping); and enforces query depth/complexity limits, rate limiting, and persisted queries.
• Demonstrates expert SQL capabilities in MariaDB, including schema design, indexing, migrations, query optimization, and resilient data access through ObjectionJS and KnexJS, ensuring idempotent operations and safe transactions.
• Optimizes performance by eliminating N+1 queries through DataLoader, implementing caching (in-memory and Redis), optimizing pagination and batching, and profiling GraphQL resolver and SQL hot paths.
• Implements secure-by-design GraphQL services, including OAuth/OIDC, encryption in transit and at rest, secret management, input validation, output encoding, least-privilege access, and resolver-level authorization to mitigate CSRF/CORS and other abuse vectors.
• Defines and executes high-quality GraphQL and REST API tests across all API testing types (contract, functional, integration, negative, security, performance) using browser DevTools, Bruno, and Insomnia, and writes comprehensive unit, integration, and end-to-end tests.
• Produces maintainable, reusable, type-safe code that is fast, idempotent, resilient, observable, and fault-tolerant, incorporating retries, exponential backoff, graceful degradation, circuit breakers, and robust structured error surfaces.
• Diagnoses failures across the stack using logs, metrics, traces, database analysis, and network inspection; performs root-cause analysis and provides actionable remediations with evidence-based findings.
• Operates services locally via Docker and deploys to AWS using appropriate configuration, secrets management, monitoring, and observability tooling; tunes caching, database performance, and GraphQL query execution in production environments.
• Designs stable, versioned, backward-compatible GraphQL contracts; maintains API documentation and operational runbooks; and ensures seamless integration between backend logic, database layers, and frontend clients.
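The listing above asks for eliminating N+1 queries through DataLoader. A minimal hand-rolled sketch of the batching idea follows (production code would use the `dataloader` npm package; the `TinyLoader` class, the author data, and the `demo` scenario are illustrative assumptions, not part of the listing):

```typescript
// Batch function: receives all keys collected in one tick, returns values
// in the same order.
type BatchFn<K, V> = (keys: K[]) => Promise<V[]>;

// Minimal sketch of the DataLoader batching pattern: loads requested in the
// same tick are coalesced into a single batch call.
class TinyLoader<K, V> {
  private queue: { key: K; resolve: (v: V) => void }[] = [];
  private scheduled = false;

  constructor(private batchFn: BatchFn<K, V>) {}

  load(key: K): Promise<V> {
    return new Promise((resolve) => {
      this.queue.push({ key, resolve });
      if (!this.scheduled) {
        this.scheduled = true;
        // Defer dispatch so every load() in the current tick joins the batch.
        queueMicrotask(() => this.dispatch());
      }
    });
  }

  private async dispatch(): Promise<void> {
    const batch = this.queue;
    this.queue = [];
    this.scheduled = false;
    const values = await this.batchFn(batch.map((item) => item.key));
    batch.forEach((item, i) => item.resolve(values[i]));
  }
}

// Hypothetical resolver scenario: three posts each resolve their author.
// Without batching this would be three queries (the N+1 problem).
let batchCalls = 0;
const authorLoader = new TinyLoader<number, string>(async (ids) => {
  batchCalls++; // one backend query instead of one per post
  return ids.map((id) => `author-${id}`);
});

async function demo() {
  const authors = await Promise.all([1, 2, 1].map((id) => authorLoader.load(id)));
  return { authors, batchCalls };
}
```

The real `dataloader` package additionally deduplicates and caches keys per request; this sketch shows only the batching half of the pattern.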
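The same listing calls for resilient code incorporating retries and exponential backoff. A small sketch of that pattern (the `withRetry` helper, its defaults, and the `flaky` example are assumptions for illustration):

```typescript
// Illustrative retry helper with exponential backoff; the attempt count and
// base delay are arbitrary defaults, not values from the listing.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts - 1) {
        // Backoff doubles each attempt: base, 2*base, 4*base, ...
        const delayMs = baseDelayMs * 2 ** attempt;
        await new Promise((resolve) => setTimeout(resolve, delayMs));
      }
    }
  }
  throw lastError;
}

// Usage: a flaky call that succeeds on its third attempt.
let calls = 0;
const flaky = async (): Promise<string> => {
  calls++;
  if (calls < 3) throw new Error("transient failure");
  return "ok";
};
```

Production variants usually add jitter to the delay and give up on non-transient errors; both are omitted here to keep the sketch short.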

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : GraphQL Developer

Job Description

RGS: 10463273
Location: Coimbatore / Chennai
Role Description: Support analyst, telecom service management, with 4 to 6 years of experience. Telecom domain is preferable.
Essential Skills: Support analyst, telecom service management, with 4 to 6 years of experience. Telecom domain is preferable.
Skills: Telecom - Service Management - BPS / Service Management
Experience Required: 4-6 years

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Telecom - Service Management - BPS / Service Management

Job Description

Requirement ID: 10596845
Job Title: Developer
Work Location: Chennai, TN / Mumbai, MH / Pune, MH
Skill Required: BPS: Data Modeler
Experience Range in Required Skills: 7+ Years
Duration: 6 months (extendable)

Job Description:

Key responsibilities. You will be responsible for:
  • Defining the data architecture framework covering data modelling, security, virtualization, governance, reference and master data, and data visualization
  • Defining reference architectures and patterns that others can follow to create and improve data systems in line with industry standards
  • Designing data models leveraging ER, Dimensional, and Data Vault modelling techniques
  • Designing data security and controls to address the customer's data privacy needs in line with current regulations such as GDPR and CCPA
  • Designing technical solutions leveraging data virtualization techniques and tools such as Denodo
  • Collaborating and coordinating with multiple departments and senior stakeholders, including C-level, partners, and external vendors
  • Designing architecture solutions that are in line with business objectives
  • Providing technical leadership, oversight, and direction to the project/execution team
  • Building effective relationships with customers, CoE, partners, and vendors
  • Verifying requirements, assuring data solution architecture and design, developing a delivery plan, and providing thought leadership for all data solutions, including design and development that meet and exceed customer expectations
  • Analyzing and translating business needs into long-term solution data models
  • Developing best practices for data modeling to ensure consistency within the data landscape

To be successful in this role, you will be experienced with cloud-based data solution architectures, the software development life cycle (both Agile and Waterfall), data engineering and ETL tools/platforms, and data modeling practices.

  • Be a close partner and collaborate with data engineers, developers, DevOps engineers, data scientists, and technical leads to build data pipelines; develop feedback loops to improve data quality, data ingestion, data streaming, workflows, and data transformation job automation
  • Design, build, test, and deploy data pipelines and blueprints at scale
  • Design data-driven product and solution frameworks in collaboration with enterprise data architects and cloud solution architects
  • Excellent code review, code quality, and data and application security experience
  • Drive productivity and stability while accelerating time-to-market through automation of the software lifecycle
  • Drive continuous quality, stability, and compliance to reduce risk
  • Automate and orchestrate releases at scale
  • Measure, track, and improve software delivery; increase efficiency and reduce risks associated with software delivery

Key Skills/Knowledge:
  • Bachelor's/Master's degree in a relevant field (e.g. Computer Science, Software Engineering, Data Science, AI/ML)
  • 7+ years of experience developing, deploying, and monitoring end-to-end data and analytics solutions, with extensive knowledge of evaluation metrics and best practices
  • 7+ years of data warehouse/data lake architecture and development
  • 5+ years of data modeling and architecture, with experience implementing at least one market-leading banking and financial services data model (e.g. from Oracle, Teradata, or IBM)
  • Experience with data modelling tools such as Erwin, ER/Studio, or SQL Modeler is preferred
  • Experience in ETL/ELT, data pipelines, data quality, and blueprint development
  • Non-relational database experience (document DB, graph DB, etc.)
  • Understanding of data structures, data modeling, and data architecture
  • Strong communication skills and ability to work with ambiguity
  • Able to provide direction/support to data modelling team members
  • Strong architectural knowledge of data analytics system patterns and their pros and cons
  • Strong knowledge of data quality, metadata management, and security frameworks/tools implemented for data on cloud architectures
  • Skillful resource to understand customer needs and provide target solutions that are scalable, reliable, and highly available

  • Salary : Rs. 90,000 - Rs. 1,65,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Data Modeler

Job Description

Requirement ID: 10596315
Job Title: Big Data Developer
Work Location: Chennai, TN / Pune, MH / Mumbai, MH
Skill Required: Digital: Big Data and Hadoop Ecosystem - Python, Spark, Hive
Experience Range in Required Skills: 8-10 Years (Desired Experience Range: 6+ years relevant)
Duration: 6 months (extendable)

Job Description:

Must-Have
  • Proficient in Scala, preferably with certification from an accredited institution
  • Experience building enterprise software solutions
  • Knowledge of OOP concepts and patterns
  • Basic knowledge of HDFS, Hive, and Spark
  • Able to build required skills through quick research, including Google search
  • Computer science background preferred
  • Ability to present information in a concise and clear manner

Good-to-Have
  • Basic knowledge of HDFS, Hive, and Spark
  • Basic knowledge of an OOP language such as Java
  • Scripting languages: Unix shell, Python

  • Salary : Rs. 90,000 - Rs. 1,65,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Big Data Developer