Please work on the demand below.
• Primary mandate skill required – Tanium
• Secondary mandate skill required – Intune
• Can we consider Contractor (CWR) profiles? – Yes
• Flexible to hire in any location (if not, please mention job location) – Open
• Detailed Job Description:
Tanium Admin
• Deploying, configuring, and maintaining the Tanium platform and its modules.
• Performing system health checks and ensuring robust endpoint management.
• Identifying, analyzing, and remediating security vulnerabilities and compliance issues.
• Creating and managing Tanium sensors and questions to gather data.
• Troubleshooting endpoint issues across various operating systems like Windows, Linux, and macOS.
• Assisting customers with support cases and answering questions.
• Developing and testing new product features on the platform.
• Creating and managing deployment strategies for software and patches.
• Working with automation and low-code tools like Tanium Automate.
Intune Admin:
• Design, implement, and manage Intune policies for Windows, macOS, iOS, and Android platforms.
• Oversee application deployment strategies using Microsoft Endpoint Manager (MEM).
• Configure and maintain app protection and configuration policies.
• Provide L3/L4 support for escalated issues related to Intune, device compliance, and application deployment.
• Analyze logs and telemetry data to resolve complex technical issues.
• Collaborate with Microsoft support and internal teams for issue resolution.
• Implement and manage compliance policies, conditional access, and endpoint security baselines.
• Integrate Intune with Microsoft Defender for Endpoint and other security tools.
• Ensure endpoint configurations align with organizational security standards and regulatory requirements.
• Develop and maintain PowerShell scripts to automate Intune tasks and generate reports.
• Utilize Microsoft Graph API for advanced automation and integration.
• Design and implement integrations with Azure AD, Autopilot, SCCM (co-management), and third-party tools.
• Participate in architectural planning and contribute to the endpoint management roadmap.
• Create and maintain dashboards and reports for device compliance, deployment status, and user activity.
• Monitor system health and performance of Intune services.
• Maintain comprehensive documentation of configurations, procedures, and troubleshooting steps.
• Provide training and mentorship to L1/L2 support teams.
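The reporting duties above (PowerShell automation, Microsoft Graph API) often come down to pulling managed-device records and bucketing them by compliance state. A minimal Python sketch follows; the Graph endpoint named in the comment is the real `managedDevices` resource, but the sample records and the aggregation helper are illustrative only, not a production script.

```python
# Sketch: summarise Intune device compliance from Microsoft Graph records.
# In production the device list would come from
#   GET https://graph.microsoft.com/v1.0/deviceManagement/managedDevices
# (authenticated, e.g. via an MSAL client-credentials flow); here we work
# with already-fetched records so the aggregation logic stands alone.
from collections import Counter

def summarize_compliance(devices):
    """Count devices per complianceState and flag non-compliant ones."""
    counts = Counter(d.get("complianceState", "unknown") for d in devices)
    noncompliant = [d["deviceName"] for d in devices
                    if d.get("complianceState") == "noncompliant"]
    return dict(counts), noncompliant

# Illustrative records mirroring a subset of the managedDevices schema.
sample = [
    {"deviceName": "WIN-001", "complianceState": "compliant"},
    {"deviceName": "MAC-002", "complianceState": "noncompliant"},
    {"deviceName": "IOS-003", "complianceState": "compliant"},
]
counts, flagged = summarize_compliance(sample)
print(counts)   # {'compliant': 2, 'noncompliant': 1}
print(flagged)  # ['MAC-002']
```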
Key skills and qualifications:
• Tanium & Intune expertise: Deep knowledge of both platforms and their modules.
• Operating systems: Strong knowledge of Windows, Linux, and macOS environments.
• Scripting and automation: Experience in automating tasks and creating sensors and questions.
• Security and compliance: Understanding of vulnerability management, threat hunting, and compliance reporting.
• Troubleshooting: Ability to identify and solve issues on endpoints.
• Deployment experience: Familiarity with Tanium & Intune-based deployments and other tools like SCCM.
• Customer support: Skills in triaging and solving support cases.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Key Responsibilities:
• Stakeholder Engagement & Discovery
o Engage internal SMEs to understand current processes and as-is workflows — translate into buildable AI requirements.
o Demo working solutions to internal stakeholders up to business leadership level.
o Support the Architect during feasibility and POC phases — contribute to rapid builds and validate fit.
o Maintain stakeholder communication throughout: progress updates, issue escalation, solution walkthroughs.
• Solution Design & Architecture
o Co-invest with the Architect in upfront solution design — design robustness directly determines build speed and test quality.
o Translate architecture into implementation plans — break down into assignable streams for Developer and Associate.
o Own day-to-day technical decisions during build when the Architect is focused on other projects.
o Identify architectural risks early and escalate before they become build or production problems.
• Build, Scale & Production
o Build complex solution components end-to-end on AWS AgentCore and LangGraph — hands-on coding role.
o Scale solutions from POC to production: performance, reliability, monitoring, and cost management.
o Debug production failures: hallucination patterns, retrieval degradation, agent loops, latency regressions.
o Use AI coding tools (Claude Code, GitHub Copilot, Cursor, Cline) to accelerate delivery. Own all generated code.
o Build Low Code automations (Copilot Studio, Power Automate) when use case and bandwidth allow.
• Team Leadership
o Guide Developer and Associate through implementation, code reviews, and debugging.
o Ensure the team uses AI tools effectively — accelerating delivery without sacrificing code understanding.
Required Skills & Experience:
Must Have:
o LangGraph — Multi-agent state machines, parallel execution, checkpointing, HITL — non-negotiable.
o LangChain — Advanced chains, memory, custom tool integration.
o AWS AgentCore — Production deployment preferred; strong LangGraph depth with ramp path acceptable.
o AWS Bedrock — Model deployment, knowledge bases, guardrails.
o Database & AI Data Access — SQL proficiency, NL-to-SQL, LLM-powered query layers. Snowflake a plus.
o Systems Integration — REST API development, MCP awareness, enterprise system connectors.
o Advanced RAG — Hybrid search, re-ranking, query reformulation, retrieval evaluation.
o Evaluation Frameworks — RAGAS / TruLens, golden test sets, automated regression pipelines.
o Observability — LangSmith or LangFuse — agent tracing, production debugging.
o Python — Expert-level — design patterns, async, testing, CI/CD.
o Multi-Model — OpenAI, Claude, Gemini, Llama — trade-offs at production scale.
o Pro Code vs Low Code — Working knowledge of Copilot Studio and Power Automate; build when use case fits.
o AI Development Tools — Claude Code, GitHub Copilot, Cursor, or Cline — accelerate delivery; own and fix all generated code.
o Stakeholder Communication — Demo AI solutions to business stakeholders; translate SME knowledge into requirements.
o Experience — 5–8 years total; 2–4 years production GenAI/LLM; 12–24 months multi-agent LangGraph; 3+ systems owned post-deployment.
Good to Have:
o AWS AgentCore Deep — All services hands-on.
o MCP / A2A — Server/client production implementation.
o Google Agentspace / Azure AI Foundry — Production experience.
o Document Intelligence — OCR, layout-aware chunking.
o Fine-Tuning — LoRA / QLoRA for domain adaptation.
o Certifications — AWS ML Specialty, Azure AI Engineer, GCP ML Engineer.
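The evaluation-framework requirement above (RAGAS / TruLens, golden test sets, automated regression pipelines) reduces to a scoring loop over a golden set with a pass/fail gate. The sketch below is illustrative only: toy data, a stubbed model, and an arbitrary threshold, not the RAGAS or TruLens API.

```python
# Sketch of golden-set regression testing (the pattern RAGAS / TruLens
# pipelines automate): score each model answer against a golden reference
# and fail the run if accuracy drops below a threshold. All data is toy.
GOLDEN = [
    {"question": "capital of France?", "expected": "paris"},
    {"question": "2 + 2?", "expected": "4"},
]

def fake_model(question):
    # Stand-in for a real LLM call; returns canned answers.
    return {"capital of France?": "Paris.", "2 + 2?": "4"}[question]

def evaluate(golden, model, threshold=0.9):
    """Substring match against the golden answer; return (accuracy, pass)."""
    hits = sum(1 for case in golden
               if case["expected"] in model(case["question"]).lower())
    accuracy = hits / len(golden)
    return accuracy, accuracy >= threshold

accuracy, passed = evaluate(GOLDEN, fake_model)
print(accuracy, passed)  # 1.0 True
```

In a CI pipeline, the `passed` flag would gate deployment, which is what "automated regression pipelines" means in practice.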
What We Expect From You
• Customer Obsession
o Proactively understand customer goals and deliver measurable value.
• Competitive Drive
o Set high standards, demonstrate tenacity, and ensure our solutions lead in quality.
• Challenging Mindset
o Foster fact-based dialogue, challenge assumptions, and encourage disruptive thinking.
• Action and Learning Velocity
o Build fast, fail fast, learn fast. Iterate rapidly and make data-driven decisions.
• Collaboration and Accountability
o Collaborate across a global team with humility, ownership, and mutual accountability.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Summary:
You build and own assigned solution streams across 1–2 concurrent projects. You implement agentic AI solutions on AWS AgentCore and LangGraph, contribute to POC delivery, and take full ownership of your components in production — including Pro Code builds and Low Code automations where the use case fits.
Key Responsibilities:
• Agentic AI Development
o Implement LangGraph agentic workflows on AWS AgentCore — state graphs, tool orchestration, conditional routing, error recovery.
o Build custom RAG pipelines end-to-end: chunking strategy, embeddings, vector retrieval, accuracy evaluation.
o Contribute to rapid POC delivery alongside the Architect and Lead.
o Write production-grade Python with FastAPI, Pydantic, testing, and proper error handling.
o Use AI coding tools (Claude Code, GitHub Copilot, Cursor, Cline) to accelerate builds. Own and fix all generated code in production.
• Low Code & Automation
o Build Low Code automations using Microsoft Copilot Studio and Power Automate when use case and bandwidth allow.
o Understand Pro Code vs Low Code trade-offs — flag where Low Code is the better fit.
o Stay current on Low Code platform capabilities as they expand rapidly.
• Production Ownership & Collaboration
o Own assigned components in production — monitor, debug, optimise, iterate.
o Debug agent failures independently: context issues, tool errors, hallucination, retrieval degradation.
o Collaborate with the Lead on architecture translation and implementation planning.
o Mentor Associate Developers through code reviews and pair programming.
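The agentic-workflow items above (state graphs, conditional routing, error recovery) can be sketched in plain Python without the langgraph dependency. The node names, retry policy, and router below are illustrative stand-ins for LangGraph's `StateGraph` and conditional edges, showing only the control-flow shape.

```python
# Minimal state-graph sketch: nodes mutate a shared state dict and a
# router picks the next node, mirroring LangGraph's conditional edges.
def plan(state):
    state["plan"] = f"answer: {state['question']}"
    return state

def act(state):
    # Simulate a tool call that fails once before succeeding.
    state["attempts"] = state.get("attempts", 0) + 1
    state["ok"] = state["attempts"] >= 2
    return state

def respond(state):
    state["answer"] = state["plan"].upper()
    return state

NODES = {"plan": plan, "act": act, "respond": respond}

def route(node, state):
    """Conditional edge: retry 'act' until it succeeds (error recovery)."""
    if node == "plan":
        return "act"
    if node == "act":
        return "act" if not state["ok"] else "respond"
    return None  # terminal

def run(state, entry="plan"):
    node = entry
    while node is not None:
        state = NODES[node](state)
        node = route(node, state)
    return state

result = run({"question": "status?"})
print(result["answer"], result["attempts"])  # ANSWER: STATUS? 2
```

LangGraph adds checkpointing, persistence, and HITL interrupts on top of exactly this loop.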
Required Skills & Experience:
Must Have:
o LangGraph — State graphs, conditional routing, checkpointing — primary framework, non-negotiable.
o LangChain — Chains, memory, custom tools — solid foundational knowledge.
o AWS AgentCore — Hands-on preferred; strong LangGraph depth with ramp path acceptable.
o AWS Bedrock — Model invocation, knowledge bases, guardrails.
o Database & AI Data Access — SQL proficiency, NL-to-SQL, LLM-powered query and insight patterns. Snowflake a plus.
o RAG Pipelines — End-to-end: chunking, embeddings, retrieval, evaluation.
o Vector Databases — Pinecone, ChromaDB, OpenSearch, or FAISS — hands-on with at least one.
o Python — Production-quality — type hints, async/await, FastAPI, Pydantic.
o Multi-Model — OpenAI, Claude, Gemini, Llama — understand trade-offs across providers.
o Pro Code vs Low Code — Copilot Studio and Power Automate — build when use case fits.
o AI Development Tools — Claude Code, GitHub Copilot, Cursor, or Cline — accelerate delivery; own and fix all generated code.
o Experience — 3–5 years total; 1–2 years GenAI/LLM; 6–12 months LangGraph; 2–3 projects with working code.
Good to Have:
o AWS AgentCore Deep — Runtime, Memory, Tools Gateway hands-on.
o MCP / A2A — Server/client implementation or protocol awareness.
o Enterprise Integration — SAP HANA, Salesforce connectors.
o Document Intelligence — OCR, layout-aware chunking.
o Evaluation Frameworks — RAGAS, Promptfoo, DeepEval.
o Certifications — AWS, GCP, or Azure AI/ML.
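The end-to-end RAG skills listed above (chunking, embeddings, retrieval, evaluation) can be illustrated without any vector-database dependency by letting bag-of-words vectors stand in for learned embeddings. Everything below is a toy sketch; in production the embedding step would call a real model (e.g. via Bedrock) and vectors would live in a store such as OpenSearch or Pinecone.

```python
# Toy RAG retrieval: fixed-size chunking + bag-of-words "embeddings"
# + cosine-similarity retrieval over the chunks.
import math
from collections import Counter

def chunk(text, size=8):
    """Split text into fixed-size word windows (a naive chunking strategy)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text):
    # Stand-in for a learned embedding: term-frequency vector.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=1):
    """Return the top-k chunks by cosine similarity to the query."""
    q = embed(query)
    scored = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return scored[:k]

doc = ("Intune manages mobile devices and applications. "
       "Tanium provides real time endpoint visibility and control. "
       "LangGraph builds stateful multi agent workflows.")
chunks = chunk(doc)
print(retrieve("endpoint visibility tooling", chunks))
```

Hybrid search, re-ranking, and retrieval evaluation (from the Lead role's requirements) all layer on top of this basic chunk-embed-retrieve loop.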
What We Expect From You
• Customer Obsession
o Proactively understand customer goals and deliver measurable value.
• Competitive Drive
o Set high standards, demonstrate tenacity, and ensure our solutions lead in quality.
• Challenging Mindset
o Foster fact-based dialogue, challenge assumptions, and encourage disruptive thinking.
• Action and Learning Velocity
o Build fast, fail fast, learn fast. Iterate rapidly and make data-driven decisions.
• Collaboration and Accountability
o Collaborate across a global team with humility, ownership, and mutual accountability.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• 10+ years of overall experience in SAP integration and middleware technologies.
• 5+ years of hands-on experience with SAP CPI (Cloud Platform Integration), including iFlows, adapters, mappings, and integration patterns.
• Strong knowledge of SAP BTP, API Management, and Cloud Connector.
• Experience working on SAP S/4HANA conversion or greenfield/brownfield implementation projects.
• Expertise in integration technologies such as IDocs, BAPIs, RFCs, SOAP/REST APIs, OData services.
• Hands-on experience with Groovy scripting and message mappings in CPI.
• Strong understanding of security concepts (OAuth, certificates, encryption, etc.).
• Excellent analytical, problem solving, and communication skills.
• Ability to lead teams and coordinate with global stakeholders.
Preferred Qualifications
• SAP Certification in CPI or Integration Suite.
• Experience with migration tools, AIF, Integration Advisor, or Event Mesh.
• Experience in Agile/DevOps driven project delivery.
Salary: Rs. 0.0 - Rs. 1,00,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Job Summary:
We are seeking a skilled Data Engineer with strong experience in SQL, Python, Tableau, and ETL tools to design, build, and maintain reliable data pipelines and analytics solutions. This role focuses on ensuring data quality, enabling scalable data workflows, and supporting business intelligence and reporting needs.
• Roles and Responsibilities
• Collect, clean, and validate data from multiple sources to ensure accuracy and reliability
• Develop ETL pipelines to process data from multiple sources such as CSV, flat files, and live databases
• Build, maintain, and optimize SQL queries, stored procedures, and data pipelines
• Use Python for data manipulation, automation, statistical analysis, and exploratory data analysis
• Collaborate with cross-functional teams (data analysts, product teams, business stakeholders) to understand data requirements
• Perform trend analysis, forecasting, and KPI reporting
• Support data governance, documentation, and metadata management
• Troubleshoot data issues and identify opportunities for process improvement
• Work with cross-functional teams such as IT, engineering, operations, and finance
• Stay up to date with emerging technologies and trends in the data analytics space and recommend innovative solutions to improve data efficiency and quality
• Manage changes and refresh the data for all dashboards
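The pipeline duties above (collect, clean, and validate data from CSV and flat files into a database) reduce to a small extract-transform-load loop. The sqlite sketch below is illustrative: the column names and sample data are made up for the example.

```python
# Minimal ETL sketch: extract rows from CSV text, validate/clean them,
# and load into SQLite. Column names ("region", "amount") are illustrative.
import csv, io, sqlite3

RAW = """region,amount
north,100
south,not_a_number
east,250
"""

def extract(text):
    """Parse CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Type-validate rows, dropping any that fail (data cleaning step)."""
    clean = []
    for r in rows:
        try:
            clean.append((r["region"], float(r["amount"])))
        except ValueError:
            continue  # e.g. 'not_a_number' is rejected here
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 350.0
```

Real pipelines swap the in-memory pieces for source connectors and a warehouse target, but keep this same three-stage shape.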
• Required Skills
What you need to bring: Bachelor's/Master's in Engineering, Computer Science, or equivalent experience; 3 to 5 years of experience in the IT industry, with experience in the data space a must.
Technical Skills (experience level: 3 to 5 years):
• SQL & Database Management: Expertise in querying, transforming, and optimizing data, with solid experience in relational databases (PostgreSQL, MySQL, Oracle) and NoSQL databases (MongoDB, Cassandra).
• Programming: Strong Python for automation and pipeline development; Scala is good to have.
• ETL/ELT Frameworks: Building pipelines to extract, transform, and load data using leading industry ETL tools such as DataIQ, Informatica, DataStage, or Alteryx.
• Data Processing Frameworks: Solid experience processing large-scale data using Apache Spark or another data processing framework; should have worked on at least one project on a large distributed system.
• Data Modeling: Designing efficient schemas and applying normalization/denormalization to ensure fast data retrieval; good experience creating logical and physical data models.
• Version Control: Proficiency in Git is mandatory for managing code and collaborating on pipelines.
• Cloud Platforms: Expertise in at least one cloud platform (GCP, AWS, or Azure) and its data services; GCP experience is good to have.
• Orchestration: Automating and scheduling complex workflows using tools like Apache Airflow, Prefect, or Dagster.
• Data Warehousing: Knowledge of modern cloud-native warehouses like Snowflake, Google BigQuery, Teradata, or Amazon Redshift.
• Real-time Processing: Knowledge of handling data streams as they arrive using Kafka, Flink, or Spark Streaming.
• Data Governance & Security: Implementing encryption and access controls, and ensuring compliance with regulations like GDPR or HIPAA.
• AI/ML Integration: Building infrastructure and feature pipelines to support machine learning models.
Good-to-have technical skills:
• Knowledge of or experience using Agile methodology for software development.
• Knowledge of ITIL industry best practices.
• Knowledge of Google Looker Studio is a plus.
• Knowledge of any BI tool is a plus, preferably Tableau.
• Knowledge of Pulse and Tableau Prep is an added advantage.
• Knowledge of UI/UX (Figma) tools is an added advantage.
• Manufacturing domain experience is of great value but not mandatory.
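The orchestration tools named in the skills above (Apache Airflow, Prefect, Dagster) fundamentally do one thing: run tasks in dependency order. The toy DAG below (illustrative task names, standard library only) shows that core idea without any orchestrator installed.

```python
# Sketch of what workflow orchestrators (Airflow, Prefect, Dagster)
# formalise: executing tasks in dependency order. Task names are made up.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
DAG = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "report": {"load"},
}

# static_order() yields a valid execution order respecting all edges.
order = list(TopologicalSorter(DAG).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load', 'report']
```

Orchestrators add scheduling, retries, backfills, and monitoring on top of this ordering, which is why they are listed separately from the ETL tooling itself.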
• Soft Skills :
• Excellent problem-solving and analytical skills
• Strong communication and collaboration skills
• Ability to work independently and as part of a global team
• Self-motivated and able to work in a fast-paced environment
• Detail-oriented and committed to delivering high-quality work
• Display one-team behavior while thinking about end-to-end solutioning
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Description: Java Backend Developer.
Essential Skills: Java backend development.
Desirable Skills:
Keyword:
Skills: Core Java
Experience Required: 4–6 years
Chennai-Magnum
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
10 M2B
Summary: As a Custom Software Engineer, you will engage in the design, construction, and configuration of applications tailored to meet specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing innovative solutions, and ensuring that applications are aligned with organizational goals. You will also participate in testing and debugging processes to enhance application performance and user experience, while continuously seeking opportunities for improvement and optimization in application functionality.
Roles & Responsibilities:
• Expected to perform independently and become an SME.
• Active participation and contribution in team discussions is required.
• Contribute to providing solutions to work-related problems.
• Collaborate with stakeholders to gather and analyze requirements for application development.
• Participate in the testing and debugging of applications to ensure quality and performance.
Professional & Technical Skills:
• Must-have skills: Proficiency in Oracle EBS Financials.
• Strong understanding of application design principles and methodologies.
• Experience with the software development life cycle and agile methodologies.
• Ability to troubleshoot and resolve application issues effectively.
• Familiarity with database management and data integration techniques.
Additional Information:
• The candidate should have a minimum of 3 years of experience in Oracle EBS Financials.
• This position is based at our Mumbai office.
• 15 years of full-time education is required.
Salary: Rs. 0.0 - Rs. 1,44,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance