We found 1,750 jobs matching your search


Job Description

• Independently designs, implements, and maintains GraphQL backends using TypeScript, Apollo Server, Axios/Fetch (REST SDKs), MariaDB (SQL) with ObjectionJS + KnexJS, Redis, AWS, and modern authorization frameworks, including AuthZ libraries or GraphQL Envelop.
• Applies strong engineering fundamentals, including Linux, Docker, OWASP, SOLID/DRY/KISS/YAGNI, and sound data structures & algorithms for correctness, safety, and efficiency.
• Authors strongly typed schemas using GraphQL SDL or code-first frameworks (e.g., GiraphQL/Pothos, TypeGraphQL) and uses GraphQL Code Generator to produce type-safe definitions for schemas, resolvers, queries, mutations, and subscriptions.
• Configures Apollo Server with TypeScript; develops type-safe resolvers; implements REST and database data sources; manages context initialization (auth, tenancy, request scoping); and enforces query depth/complexity limits, rate limiting, and persisted queries.
• Demonstrates expert SQL capabilities in MariaDB, including schema design, indexing, migrations, query optimization, and resilient data access through ObjectionJS and KnexJS, ensuring idempotent operations and safe transactions.
• Optimizes performance by eliminating N+1 queries through DataLoader, implementing caching (in-memory and Redis), optimizing pagination and batching, and profiling GraphQL resolver and SQL hot paths.
• Implements secure-by-design GraphQL services, including OAuth/OIDC, encryption in transit and at rest, secret management, input validation, output encoding, least-privilege access, and resolver-level authorization to mitigate CSRF/CORS issues and other abuse vectors.
• Defines and executes high-quality GraphQL and REST API tests across all API testing types (contract, functional, integration, negative, security, performance) using browser DevTools, Bruno, and Insomnia, and writes comprehensive unit, integration, and end-to-end tests.
• Produces maintainable, reusable, type-safe code that is fast, idempotent, resilient, observable, and fault-tolerant, incorporating retries, exponential backoff, graceful degradation, circuit breakers, and robust structured error surfaces.
• Diagnoses failures across the stack using logs, metrics, traces, database analysis, and network inspection; performs root-cause analysis and provides actionable remediations with evidence-based findings.
• Operates services locally via Docker and deploys to AWS using appropriate configuration, secrets management, monitoring, and observability tooling; tunes caching, database performance, and GraphQL query execution in production environments.
• Designs stable, versioned, backward-compatible GraphQL contracts; maintains API documentation and operational runbooks; and ensures seamless integration between backend logic, database layers, and frontend clients.
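The DataLoader batching pattern named above (collapsing N per-row lookups into one batched query) can be sketched as follows. This is an illustrative, self-contained sketch, not the dataloader npm package: the `fetchUsersByIds` data source, `loadMany` helper, and `queryCount` counter are hypothetical names invented for the example.

```typescript
type User = { id: number; name: string };

let queryCount = 0; // counts database round-trips; batching should keep this at 1 per batch

// Hypothetical stand-in for a single `SELECT ... WHERE id IN (...)` round-trip.
function fetchUsersByIds(ids: number[]): Map<number, User> {
  queryCount += 1;
  return new Map(ids.map((id): [number, User] => [id, { id, name: `user-${id}` }]));
}

// DataLoader-style helper: collect the keys a request needs, dedupe them,
// fetch once, then fan the results back out in the caller's original order.
function loadMany(ids: number[]): User[] {
  const unique = [...new Set(ids)];
  const byId = fetchUsersByIds(unique);
  return ids.map((id) => byId.get(id)!);
}
```

The real library additionally coalesces loads across resolvers within one event-loop tick and caches per request, but the dedupe-batch-fan-out shape above is the core of how N+1 is eliminated.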

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: GraphQL Developer

Job Description

Requirement ID: 10667981
Job Title: Developer (Data Scientist (TRACE))
Work Location: TCS PAN India locations
Duration: 6 months
Skills Required: We are looking for a Data Scientist for the TRACE project with a strong understanding of processing and analyzing data using LLMs, along with solid prompt-engineering skills. Experience with agentic AI frameworks or LLM-based data processing approaches is highly beneficial.
Relevant Experience Range in Required Skills: 10+ years

Desired Competencies (Technical/Behavioral)

Must-Have
· Strong hands-on experience with Python for data processing and AI development.
· Experience working with LLM APIs (such as Azure OpenAI or Llama).
· Practical experience with prompt engineering, prompt tuning, and evaluation of LLM outputs.
· Experience with RAG (Retrieval-Augmented Generation) architectures and vector databases (e.g., FAISS, Pinecone, Azure AI Search).
· Knowledge of agentic AI frameworks such as Semantic Kernel or similar orchestration frameworks.
· Experience in data preprocessing, NLP techniques, and handling large datasets.
· Familiarity with REST APIs, data pipelines, and integrating AI models with enterprise applications.
· Understanding of model evaluation, monitoring, and optimization techniques.

Responsibilities / Expectations from the Role
1. Develop and implement solutions that leverage LLMs to process, analyze, and extract insights from structured and unstructured data or documents.
2. Develop and optimize prompt-engineering strategies to improve the accuracy, consistency, and reliability of LLM outputs.
3. Build AI-driven workflows using agentic AI frameworks to automate data processing and reasoning tasks.
4. Evaluate and experiment with different LLM models, embeddings, and retrieval techniques to determine the best approach for TRACE use cases.
5. Ensure data quality, scalability, and performance of AI-based processing pipelines.
6. Ensure full data lineage, enrichment traceability, and audit-ready data structures.
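The retrieval step of the RAG architectures mentioned above can be illustrated with a toy example. Vector databases such as FAISS or Pinecone do this at scale with approximate nearest-neighbor indexes, but the core ranking operation is cosine similarity between a query embedding and document embeddings. The vectors and the `topK` helper below are hypothetical, purely for illustration.

```typescript
// Dot product of two equal-length vectors.
function dot(a: number[], b: number[]): number {
  return a.reduce((sum, x, i) => sum + x * b[i], 0);
}

// Cosine similarity: dot product normalized by both vector magnitudes.
function cosineSim(a: number[], b: number[]): number {
  return dot(a, b) / (Math.sqrt(dot(a, a)) * Math.sqrt(dot(b, b)));
}

// Toy retrieval: score every document embedding against the query embedding
// and return the k highest-scoring document ids (what a vector DB does at scale).
function topK(query: number[], docs: { id: string; vec: number[] }[], k: number) {
  return docs
    .map((d) => ({ id: d.id, score: cosineSim(query, d.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

The retrieved top-k passages are then injected into the LLM prompt as grounding context, which is the "augmented generation" half of RAG.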

Responsibilities

Requirement ID: 10667981 Job Title: Developer (Data Scientist (TRACE)) Work Location: TCS PAN India locations Duration: 6 months Skills Required: We are looking for a Data Scientist for the TRACE project who has a strong understanding of processing and analyzing data using LLMs, along with solid prompt engineering skills. Experience with Agentic AI frameworks or LLM-based data processing approaches would also be highly beneficial. Relevant Experience Range in Required Skills: 10+ Years Job Description: Desired Competencies (Technical/Behavioral Competency) Must-Have · Strong hands-on experience with Python for data processing and AI development. · Experience working with LLM APIs (such as Azure OpenAI, or Lamma). · Practical experience with prompt engineering, prompt tuning, and evaluation of LLM outputs. · Experience with RAG (Retrieval-Augmented Generation) architectures and vector databases (e.g., FAISS, Pinecone, Azure AI Search). · Knowledge of Agentic AI frameworks such as Semantic Kernel, or similar orchestration frameworks. · Experience in data preprocessing, NLP techniques, and handling large datasets. · Familiarity with REST APIs, data pipelines, and integrating AI models with enterprise applications. · Understanding of model evaluation, monitoring, and optimization techniques. Responsibility of / Expectations from the Role 1 Developing and Implement solutions that leverage LLMs to process, analyze, and extract insights from structured and unstructured data or documents. 2 Develop and optimize prompt engineering strategies to improve the accuracy, consistency, and reliability of LLM outputs. 3 Build AI-driven workflows using Agentic AI frameworks to automate data processing and reasoning tasks. 4 Evaluate and experiment with different LLM models, embeddings, and retrieval techniques to determine the best approach for TRACE use cases. 5 Ensure data quality, scalability, and performance of AI-based processing pipelines. 
6 Developing and Implement solutions that leverage LLMs to process, analyze, and extract insights from structured and unstructured data or documents. 7 Ensure full data lineage, enrichment traceability, and audit-ready data structures
  • Salary: Rs. 90,000 - Rs. 1,65,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Scientist (TRACE)

Job Description

RGS: 10463273
Location: Coimbatore / Chennai
Role Description: Support analyst, telecom service management, with 4 to 6 years of experience. Telecom domain experience is preferable.
Essential Skills: Support analyst, telecom service management, with 4 to 6 years of experience. Telecom domain experience is preferable.
Skills: Telecom - Service Management - BPS / Service Management
Experience Required: 4-6 years

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Telecom - Service Management - BPS / Service Management

Job Description

Requirement ID: 10596845
Job Title: Developer
Work Location: Chennai, TN / Mumbai, MH / Pune, MH
Skill Required: BPS: Data Modeler
Experience Range in Required Skills: 7+ years
Duration: 6 months (extendable)

Key Responsibilities
You will be responsible for:
· Defining the data architecture framework covering data modelling, security, virtualization, governance, reference and master data, and data visualization.
· Defining reference architecture and patterns that others can follow to create and improve data systems in line with industry standards.
· Designing data models leveraging ER, Dimensional, and Data Vault modelling techniques.
· Designing data security and controls to address customers' data privacy needs in line with current regulations such as GDPR, CCPA, etc.
· Designing technical solutions leveraging data virtualization techniques and tools such as Denodo.
· Collaborating and coordinating with multiple departments, senior stakeholders (including C-level), partners, and external vendors.
· Designing architecture solutions that are in line with business objectives.
· Providing technical leadership, oversight, and direction to the project/execution team.
· Building effective relationships with customers, CoE, partners, and vendors.
· Verifying requirements, assuring data solution architecture and design, developing a delivery plan, and providing thought leadership for all data solutions, including designs and development that meet and exceed customer expectations.
· Analyzing and translating business needs into long-term solution data models.
· Developing best practices for data modeling to ensure consistency within the data landscape.
· To be successful in this role, you will be experienced with cloud-based data solution architectures, the software development life cycle (both Agile and Waterfall), data engineering and ETL tools/platforms, and data modeling practices.
· Being a close partner and collaborating with data engineers, developers, DevOps engineers, data scientists, and technical leads to build data pipelines and develop feedback loops to improve data quality, data ingestion, data streaming, workflows, and data transformation job automation.
· Designing, building, testing, and deploying data pipelines and blueprints at scale.
· Designing data-driven product and solution frameworks in collaboration with enterprise data architects and cloud solution architects.
· Excellent code review, code quality, and data and application security experience.
· Driving productivity and stability while accelerating time-to-market through automation of the software lifecycle.
· Driving continuous quality, stability, and compliance to reduce risk.
· Automating and orchestrating releases at scale.
· Measuring, tracking, and improving software delivery; increasing efficiency and reducing risks associated with software delivery.

Key Skills/Knowledge
· Bachelor's/Master's degree in a relevant field (e.g., Computer Science, Software Engineering, Data Science, AI/ML).
· 7+ years of experience developing, deploying, and monitoring end-to-end data and analytics solutions, with extensive knowledge of evaluation metrics and best practices.
· 7+ years of data warehouse/data lake architecture and development.
· 5+ years of data modeling and architecture, with experience implementing at least one market-leading banking and financial services data model (e.g., from Oracle, Teradata, or IBM).
· Experience with data modelling tools such as Erwin, ER/Studio, or SQL Modeler is preferred.
· Experience in ETL/ELT, data pipelines, data quality, and blueprint development.
· Non-relational database experience (document DB, graph DB, etc.).
· Understanding of data structures, data modeling, and data architecture.
· Strong communication skills and ability to work with ambiguity.
· Able to provide direction/support to data modelling team members.
· Strong architectural knowledge of data analytics system patterns and their pros and cons.
· Strong knowledge of data quality, metadata management, and security frameworks/tools implemented for data on cloud architecture.
· Ability to understand customer needs and provide target solutions that are scalable, reliable, and highly available.

  • Salary: Rs. 90,000 - Rs. 1,65,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Modeler

Job Description

Requirement ID: 10596315
Job Title: Bigdata Developer
Work Location: Chennai, TN / Pune, MH / Mumbai, MH
Skill Required: Digital: Bigdata and Hadoop Ecosystem - Python, Spark, Hive
Experience Range in Required Skills: 8-10 years (desired: 6+ years relevant)
Duration: 6 months (extendable)

Must-Have
• Proficient in Scala, preferably with certification from an accredited institution.
• Experience building enterprise software solutions.
• Knowledge of OOP concepts and patterns.
• Basic knowledge of HDFS, Hive, and Spark.
• Ability to build required skills through quick research, including Google search.
• Computer science background preferred.
• Ability to present information in a concise and clear manner.

Good-to-Have
• Basic knowledge of an OOP language such as Java.
• Scripting languages: Unix/shell, Python.

  • Salary: Rs. 90,000 - Rs. 1,65,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Bigdata Developer

Job Description

INFYSYJP00004646/563621-Azure Cloud

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004646/563621-Azure Cloud

Job Description

Role Summary: Senior individual contributor across 4–5 concurrent projects. You engage Business and Transformation Leaders to assess feasibility, deliver POCs in 1–4 weeks, define solution architecture, and build the complex pieces yourself. The quality of your upfront design determines how fast the team builds and how clean the testing is.

Key Responsibilities:
• Business Engagement & Feasibility
  o Meet Business and Transformation Leaders to understand pain points and assess AI solution feasibility.
  o Recommend Pro Code (LangGraph/AgentCore) or Low Code (Copilot Studio/Power Automate) based on the use case; document and communicate the rationale.
  o Deliver working POCs within 1–4 weeks. Evaluate Forward Engineer POCs and decide whether to scale or rebuild based on quality.
  o Present feasibility and POC outcomes to business stakeholders with clear scope, effort, and value framing.
• Architecture & Design
  o Define solution architecture on AWS AgentCore and LangGraph, the primary stack for all Pro Code solutions.
  o Invest heavily upfront in design robustness: strong architecture enables smooth builds; weak architecture amplifies every downstream problem.
  o Design systems integration: API architecture, MCP connections, database and data platform access patterns, SAP, Salesforce, and internal systems.
  o Define agent state management, tool orchestration, human-in-the-loop escalation, and data flow.
  o Ensure all solutions comply with TR's established security, governance, and compliance standards.
  o Continuously evaluate emerging agentic AI frameworks, platform updates, and industry patterns; provide evidence-based recommendations on adoption timing and fit for TR's stack.
• Hands-On Build & Team Leadership
  o Build complex and architecturally critical solution components directly; this is a coding role.
  o Guide the Solutions Lead, Developer, and Associate through architecture, implementation patterns, and production readiness.
  o Enable the Lead to own day-to-day decisions during the build by ensuring the architecture is unambiguous before stepping back.
  o Use AI coding tools (Claude Code, GitHub Copilot, Cursor, Cline) to accelerate POC and development. Own all generated code fully.
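The agent state management, conditional routing, and human-in-the-loop escalation this role calls for can be sketched as a minimal state machine. This is an illustrative sketch only, not the LangGraph or AgentCore API; the node names, `State` shape, and `run` loop are all hypothetical.

```typescript
// Shared agent state, threaded through every node in the graph.
type State = { input: string; answer?: string; needsHuman: boolean };

// A node mutates the state and returns the name of the next node to run.
type NodeFn = (s: State) => string;

const nodes: Record<string, NodeFn> = {
  // Conditional routing: send refund requests to a human, everything else to auto-answer.
  classify: (s) => (s.input.includes("refund") ? "human" : "answer"),
  // Automated path: produce an answer and end the run.
  answer: (s) => { s.answer = "auto-reply"; return "done"; },
  // Human-in-the-loop path: flag the state for escalation and end the run.
  human: (s) => { s.needsHuman = true; return "done"; },
};

// Execute nodes until one routes to the terminal "done" state.
function run(state: State): State {
  let current = "classify";
  while (current !== "done") current = nodes[current](state);
  return state;
}
```

Frameworks like LangGraph add checkpointing (so a paused run can resume after the human responds), typed state reducers, and tool-call nodes on top of this same routed-graph shape.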

Responsibilities

Required Skills & Experience:

Must Have:
o AWS AgentCore: Runtime, Memory, Tools Gateway; production hands-on required.
o LangGraph: multi-agent state machines, conditional routing, checkpointing, HITL; primary framework.
o LangChain: advanced chains, memory, custom tool integration.
o AWS Bedrock: multi-model deployment, knowledge bases, guardrails.
o Database & AI Data Access: SQL proficiency, NL-to-SQL, LLM-powered query and insight layers. Snowflake a plus.
o Systems Integration: API design (REST), MCP server/client, A2A patterns, SAP/Salesforce/internal system connectors.
o RAG Architecture: hybrid search, re-ranking, agentic RAG, graph RAG; select and justify per use case.
o Multi-Model Strategy: OpenAI, Claude, Gemini, Llama; provider trade-offs and cost governance.
o Pro Code vs Low Code: evaluate each use case and recommend; Copilot Studio and Power Automate for the right automation scenarios.
o AI Development Tools: Claude Code, GitHub Copilot, Cursor, or Cline to accelerate delivery; own and fix all generated code in production.
o Python: expert-level production code; you write, review, and fix code.
o Production Deployment: Docker, CI/CD, post-deployment monitoring, cost optimisation.
o Business Communication: present feasibility and POC outcomes to business leaders clearly.
o Cloud Adaptability: Google Agentspace and Azure AI Foundry exposure welcome; AWS is the primary stack.
o Experience: 10+ years total; 3–5 years solution architecture with direct delivery accountability; production agentic AI systems deployed.

Good to Have:
o MCP / A2A: production server/client implementations.
o Document Intelligence: Azure Document Intelligence, Textract, layout-aware chunking.
o Fine-Tuning: LoRA / QLoRA for domain adaptation.
o Graph Databases: Neo4j for knowledge-graph RAG.
o Domain Experience: legal, financial, or regulatory AI applications.
o Certifications: AWS Solutions Architect Professional, Google Professional Cloud Architect, Azure Solutions Architect Expert.

What We Expect From You
• Customer Obsession: proactively understand customer goals and deliver measurable value.
• Competitive Drive: set high standards, demonstrate tenacity, and ensure our solutions lead in quality.
• Challenging Mindset: foster fact-based dialogue, challenge assumptions, and encourage disruptive thinking.
• Action and Learning Velocity: build fast, fail fast, learn fast; iterate rapidly and make data-driven decisions.
• Collaboration and Accountability: collaborate across a global team with humility, ownership, and mutual accountability.
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Automation and AI Solutions Architect

Job Description

As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with various stakeholders to gather requirements, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking opportunities for improvement and efficiency in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business objectives.
- Expertise and in-depth knowledge of supply chain planning in the areas of supply and capacity planning, production planning, inventory optimization, material requirements planning, rough-cut capacity planning, and capacity constraints.
- Be part of an Agile project team utilizing Agile/Scrum methodologies, working in a fast-paced, iterative environment and closely partnering with business stakeholders.

Professional & Technical Skills:
- Must-Have Skills: proficiency in Kinaxis.
- Strong understanding of application development methodologies.
- Experience with integration of applications with existing systems.
- Ability to troubleshoot and resolve application issues effectively.
- Familiarity with user interface design principles.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Kinaxis.

  • Salary: Rs. 0 - Rs. 3,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Kinaxis

Job Description

Application Programming Interface (API) / PL/SQL / Core .NET Technologies

Job Description:
• .NET MVC, SQL, Web API.
• Extensive experience with C#, .NET Framework, ASP.NET MVC, and Web API; .NET Core is good to have.
• Frontend technologies such as HTML5, CSS3, JavaScript, and jQuery are a must; Angular or React JS is good to have.
• Database: expertise in SQL Server, Entity Framework, ADO.NET, and writing stored procedures and complex queries.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Application Programming Interface (API) / PL/SQL / Core .NET Technologies