Role Summary:
You build and own assigned solution streams across 1–2 concurrent projects. You implement agentic AI solutions on AWS AgentCore and LangGraph, contribute to POC delivery, and take full ownership of your components in production — including Pro Code builds and Low Code automations where the use case fits.
Key Responsibilities:
• Agentic AI Development
o Implement LangGraph agentic workflows on AWS AgentCore — state graphs, tool orchestration, conditional routing, error recovery.
o Build custom RAG pipelines end-to-end: chunking strategy, embeddings, vector retrieval, accuracy evaluation.
o Contribute to rapid POC delivery alongside the Architect and Lead.
o Write production-grade Python with FastAPI, Pydantic, testing, and proper error handling.
o Use AI coding tools (Claude Code, GitHub Copilot, Cursor, Cline) to accelerate builds. Own and fix all generated code in production.
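The workflow concepts above (state graphs, conditional routing, error recovery) can be sketched in a few lines of plain Python. This is a hypothetical illustration of the pattern only, not the LangGraph API; the node names and routing logic are invented for the example:

```python
from typing import Callable

# State flows through named nodes; a router inspects it to pick the next node.
State = dict

def retrieve(state: State) -> State:
    # Simulate a retrieval step; a real node would query a vector store.
    state["docs"] = ["doc-1", "doc-2"]
    return state

def generate(state: State) -> State:
    state["answer"] = f"answer based on {len(state['docs'])} docs"
    return state

def fallback(state: State) -> State:
    # Error-recovery branch taken when retrieval comes back empty.
    state["answer"] = "no documents found"
    return state

def route_after_retrieve(state: State) -> str:
    # Conditional routing: branch on whether retrieval succeeded.
    return "generate" if state.get("docs") else "fallback"

NODES: dict[str, Callable[[State], State]] = {
    "retrieve": retrieve, "generate": generate, "fallback": fallback,
}

def run(state: State) -> State:
    state = NODES["retrieve"](state)
    return NODES[route_after_retrieve(state)](state)

result = run({})
```

In LangGraph proper, the same shape is expressed with `StateGraph`, `add_node`, and `add_conditional_edges`, with checkpointing handling state persistence between steps.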
• Low Code & Automation
o Build Low Code automations using Microsoft Copilot Studio and Power Automate when use case and bandwidth allow.
o Understand Pro Code vs Low Code trade-offs — flag where Low Code is the better fit.
o Stay current on Low Code platform capabilities as they expand rapidly.
• Production Ownership & Collaboration
o Own assigned components in production — monitor, debug, optimise, iterate.
o Debug agent failures independently: context issues, tool errors, hallucination, retrieval degradation.
o Collaborate with the Lead on architecture translation and implementation planning.
o Mentor Associate Developers through code reviews and pair programming.
Required Skills & Experience:
Must Have:
o LangGraph — State graphs, conditional routing, checkpointing — primary framework, non-negotiable.
o LangChain — Chains, memory, custom tools — solid foundational knowledge.
o AWS AgentCore — Hands-on preferred; strong LangGraph depth with ramp path acceptable.
o AWS Bedrock — Model invocation, knowledge bases, guardrails.
o Database & AI Data Access — SQL proficiency, NL-to-SQL, LLM-powered query and insight patterns. Snowflake a plus.
o RAG Pipelines — End-to-end: chunking, embeddings, retrieval, evaluation.
o Vector Databases — Pinecone, ChromaDB, OpenSearch, or FAISS — hands-on with at least one.
o Python — Production-quality — type hints, async/await, FastAPI, Pydantic.
o Multi-Model — OpenAI, Claude, Gemini, Llama — understand trade-offs across providers.
o Pro Code vs Low Code — Copilot Studio and Power Automate — build when use case fits.
o AI Development Tools — Claude Code, GitHub Copilot, Cursor, or Cline — accelerate delivery; own and fix all generated code.
o Experience — 3–5 years total; 1–2 years GenAI/LLM; 6–12 months LangGraph; 2–3 projects with working code.
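As a rough illustration of the RAG pipeline stages named above (chunking, embeddings, retrieval), here is a minimal pure-Python sketch. The bag-of-words counter stands in for a real embedding model, and fixed-size chunking stands in for a real chunking strategy; both are assumptions made for brevity:

```python
import math
from collections import Counter

def chunk(text: str, size: int = 5) -> list[str]:
    # Fixed-size word chunking; production pipelines use overlap or semantic splits.
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text: str) -> Counter:
    # Toy bag-of-words vector standing in for a real embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    # Rank chunks by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

chunks = chunk("the invoice total is due in thirty days and payment goes to accounts payable")
top = retrieve("when is the invoice due", chunks)
```

Evaluation (the fourth stage) would then score retrievals like `top` against a labelled set, e.g. with RAGAS from the Good to Have list.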
Good to Have:
o AWS AgentCore Deep — Runtime, Memory, Tools Gateway hands-on.
o MCP / A2A — Server/client implementation or protocol awareness.
o Enterprise Integration — SAP HANA, Salesforce connectors.
o Document Intelligence — OCR, layout-aware chunking.
o Evaluation Frameworks — RAGAS, Promptfoo, DeepEval.
o Certifications — AWS, GCP, or Azure AI/ML.
What We Expect From You
• Customer Obsession
o Proactively understand customer goals and deliver measurable value.
• Competitive Drive
o Set high standards, demonstrate tenacity, and ensure our solutions lead in quality.
• Challenging Mindset
o Foster fact-based dialogue, challenge assumptions, and encourage disruptive thinking.
• Action and Learning Velocity
o Build fast, fail fast, learn fast. Iterate rapidly and make data-driven decisions.
• Collaboration and Accountability
o Collaborate across a global team with humility, ownership, and mutual accountability.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Designs, implements, and maintains front end applications using TypeScript, Vue 3 (Composition API), Pinia, Vue Router, Vite, SCSS, Tailwind, PrimeVue, Pino logging, Axios, and GraphQL clients (Apollo/URQL/Relay), integrating Auth0 for authentication and authorization.
• Demonstrates strong engineering fundamentals across Linux, Node.js (npm/pnpm), and Docker, applying OWASP practices and SOLID/DRY/KISS/YAGNI principles with sound data structures & algorithms.
• Exhibits deep Vue expertise: reactivity system, directives, component design, props/emits, slots, and lifecycle hooks; organizes code via Composition Functions and type safe patterns in TypeScript.
• Consumes GraphQL SDL and REST OpenAPI specifications, employing client generation where available; connects components to APIs with Axios/Fetch and GraphQL clients, handling auth flows, pagination, caching, and error surfaces.
• Translates Figma designs into accessible, responsive HTML5/SCSS using BEM methodology, Tailwind utility patterns, and PrimeVue or equivalent component libraries; documents components in Storybook.
• Implements secure front end architecture with Auth0 (OAuth/OIDC), token handling, secure storage, CSP, XSS/CSRF mitigation, input validation/encoding, and safe error handling.
• Optimizes web performance via code splitting, lazy loading, tree shaking, asset and image optimization, caching strategies, and efficient rendering; monitors and improves Core Web Vitals using browser performance tooling.
• Writes maintainable, reusable, component driven code that is secure, fast, idempotent, reliable, and resilient, with clear separation of concerns and consistent logging via Pino (browser).
• Tests thoroughly with Jest or Vitest and Vue Test Utils for unit and integration coverage; performs end to end testing; uses msw (mswjs) to mock backend APIs; validates APIs with Bruno or Insomnia and Browser DevTools (console, network, performance).
• Troubleshoots effectively by tracing logs, inspecting errors, and isolating root causes across UI, API, and network layers; produces actionable defect reports with evidence.
• Operates locally in Docker and collaborates on CI/CD workflows; familiar with AWS deployment patterns and front end observability (logging, metrics, tracing) for production support.
• Maintains API and component documentation, aligns with versioned contracts (GraphQL/OpenAPI), and ensures seamless integration between front end experiences and backend data/services.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Training & Culture Building
• Conduct workshops, labs, and AI coaching sessions for engineers and managers.
• Lead internal communities of practice for AI and GenAI.
• Promote innovation through showcases, hackathons, and associated central initiatives.
Metrics & Continuous Improvement
• Establish and report KPIs: productivity gains, adoption rates, automation impact (aligned with THRIVE).
Required Skills & Experience
Technical Skills
• Strong understanding of GenAI, LLMs, vector databases, and ML workflows (continuous learning expected).
• Experience integrating AI into development workflows (copilots, test automation, documentation).
• Proficiency in Python/Java and cloud platforms (Azure/AWS/Google).
• Good grasp of enterprise SDLC, DevOps, APIs, microservices, security, and compliance.
Influence & Leadership
• Proven ability to influence teams without formal authority.
• Excellent stakeholder management across verticals and global counterparts.
• Translate complex AI topics into simple, actionable guidance.
• Align with central initiatives to drive AI adoption in their respective departments.
Mindset & Traits
• Evangelist mindset, proactive learner, strong communicator.
• Comfortable with ambiguity and fast experimentation.
• Collaborative, customer-centric, and outcome-driven.
Preferred Qualifications
• 9+ years in software engineering, architecture, delivery, or enterprise architecture.
• Experience in transformation programs; a self-driven owner of initiatives.
• Exposure to enterprise-scale systems.
• Certifications in AI/ML, cloud, or agile practices.
Salary: Rs. 15,00,000 - Rs. 20,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
9 HDC4 Summary:
As a Custom Software Engineer, a typical day involves creating tailored software solutions by designing, coding, and improving various components within systems or applications. The role requires working within dynamic environments that emphasize modern development frameworks and agile methodologies. Throughout the day, collaboration with different teams and adapting to evolving business requirements are essential to deliver scalable and efficient software that meets specific organizational goals.
Roles & Responsibilities:
• Expected to be an SME; collaborate with and manage the team to perform.
• Responsible for team decisions.
• Engage with multiple teams and contribute to key decisions.
• Provide solutions to problems for the immediate team and across multiple teams.
• Lead efforts to identify and implement process improvements that enhance team productivity and software quality.
• Mentor junior team members to support their professional growth and ensure alignment with project objectives.
Professional & Technical Skills:
• Must-Have Skills: Proficiency in Automation Anywhere.
• Experience developing and deploying automation workflows on the Automation Anywhere platform.
• Strong understanding of robotic process automation concepts and best practices.
• Ability to troubleshoot and optimize automation scripts for performance and reliability.
• Familiarity with integrating Automation Anywhere bots with various enterprise applications and systems.
Additional Information:
• The candidate should have a minimum of 5 years of experience in Automation Anywhere.
• This position is based at our Hyderabad office.
• 15 years of full-time education is required.
Salary: Rs. 0 - Rs. 1,60,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Summary:
We are looking for a skilled SAP BODS + FICO Consultant with strong expertise in data integration and financial processes. The candidate will be responsible for designing, developing, and supporting data migration, ETL processes using SAP BODS, and working closely with SAP FICO modules for financial data management and reporting.
Key Responsibilities:
SAP BODS (BusinessObjects Data Services):
Design, develop, and maintain ETL jobs using SAP BODS
Perform data extraction, transformation, and loading from multiple sources
Handle data migration and data quality processes
Optimize data workflows and ensure performance tuning
Troubleshoot and resolve ETL-related issues
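The extract-transform-load cycle the BODS responsibilities above describe can be sketched in plain Python with an in-memory database. This is an analogy for the pattern only; real BODS jobs are built graphically in the Designer, and the table names here are invented for the example:

```python
import sqlite3

def extract(conn: sqlite3.Connection) -> list[tuple]:
    # Pull raw rows from the source system.
    return conn.execute("SELECT id, amount, currency FROM src_invoices").fetchall()

def transform(rows: list[tuple]) -> list[tuple]:
    # Data-quality step: drop rows with missing amounts, normalise currency codes.
    return [(i, round(a, 2), c.upper()) for i, a, c in rows if a is not None]

def load(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    conn.executemany("INSERT INTO tgt_invoices VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_invoices (id INTEGER, amount REAL, currency TEXT)")
conn.execute("CREATE TABLE tgt_invoices (id INTEGER, amount REAL, currency TEXT)")
conn.executemany("INSERT INTO src_invoices VALUES (?, ?, ?)",
                 [(1, 100.456, "usd"), (2, None, "eur"), (3, 99.9, "inr")])
load(conn, transform(extract(conn)))
loaded = conn.execute("SELECT * FROM tgt_invoices ORDER BY id").fetchall()
```

The rejected row (missing amount) is where a real job would route records to an error table rather than silently dropping them.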
SAP FICO (Financial Accounting & Controlling):
Work on core FICO modules: GL, AP, AR, Asset Accounting, Cost Center Accounting
Support financial data integration between SAP and non-SAP systems
Assist in financial reporting and reconciliation processes
Collaborate with business users to gather requirements and provide solutions
Support month-end and year-end closing activities
Required Skills:
Strong hands-on experience in SAP BODS (ETL development)
Good functional knowledge of SAP FICO
Experience in data migration projects (LSMW / BODS / S/4HANA migrations preferred)
Knowledge of SQL, data warehousing concepts
Understanding of financial processes and reporting
Strong problem-solving and analytical skills
Preferred Skills:
Experience with SAP S/4HANA
Exposure to SAP BW / HANA
Knowledge of data quality and data governance tools
Experience in integration with third-party systems
Qualifications:
Bachelor’s degree in Finance, Accounting, IT, or related field
SAP Certification in FICO or BODS is a plus
Soft Skills:
Good communication and stakeholder management
Ability to work independently and in team environments
Strong documentation skills
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Senior Data Scientist. Experience: 9 to 13 years. Location: Hyderabad. A 15-to-30-day notice period is mandatory.
Data Engineering & Architecture
• Lead the design and development of scalable ETL/ELT pipelines for batch and real-time data ingestion from manufacturing, energy, and operations systems.
• Architect data platforms supporting plant operations, energy consumption, emissions tracking, quality, and supply chain data.
• Work with big data technologies such as Hadoop, Spark, Kafka, and Snowflake for large-scale data processing.
• Establish and enforce data governance, quality, security, and compliance standards across enterprise data assets.
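A batch-ingestion pipeline with a quality gate, as described above, can be sketched with plain Python generators. The field names and batch sink are invented stand-ins for a real Kafka or Spark pipeline:

```python
from typing import Iterable, Iterator

def validate(records: Iterable[dict]) -> Iterator[dict]:
    # Governance/quality gate: reject readings without a plant id.
    return (r for r in records if r.get("plant_id"))

def batch(records: Iterable[dict], size: int) -> Iterator[list[dict]]:
    # Group the validated stream into fixed-size batches for the sink.
    buf: list[dict] = []
    for r in records:
        buf.append(r)
        if len(buf) == size:
            yield buf
            buf = []
    if buf:  # flush the final partial batch
        yield buf

readings = [
    {"plant_id": "P1", "kwh": 10},
    {"kwh": 3},  # fails validation: no plant id
    {"plant_id": "P2", "kwh": 7},
    {"plant_id": "P1", "kwh": 5},
]
batches = list(batch(validate(readings), size=2))
```

Because both stages are lazy iterators, the same shape scales from a list in memory to a streaming source without buffering the whole feed.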
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance