Job Description: KT Support
Ready to travel
Shifts: resource should be ready to work any shift (7 AM / 2 PM / 8 PM)
Work from office
Responsibilities
Role Summary
1. Staffing & Onboarding
• Coordinate timely staffing and seamless onboarding of team members
2. Production Readiness & Operations Reporting
• Timely & successful coordination to ensure Production Readiness & Operations Reporting
3. Technology & Work Environment Readiness
• Timely & successful coordination to ensure TWE Readiness
4. Client Visits
• Support the team in successfully showcasing Accenture capabilities to clients
• RFI / RFP responses
5. Assist the Knowledge Transfer Lead in the KT process
6. Assist in the Capacity Planning Process
7. Assist in Operational Excellence process documentation
Salary: Rs. 50,000 - Rs. 56,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Service Delivery - Mobilization Representative
Job Title: Senior Azure Data Engineer
Experience: 6–12 Years
Location: Bangalore (Hybrid/Onsite)
Role Overview
We are seeking a highly experienced Senior Azure Data Engineer to design, build, and optimize scalable data platforms on Microsoft Azure. The ideal candidate will have strong expertise in modern data engineering practices, including lakehouse architecture, distributed data processing, and enterprise-grade data pipelines.
This role requires hands-on technical leadership, the ability to work with large-scale data systems, and collaboration with cross-functional teams to deliver high-quality, data-driven solutions.
Key Responsibilities
Design and implement scalable, high-performance data pipelines using the Azure ecosystem
Architect and develop modern data platforms (Lakehouse / Medallion Architecture)
Build and optimize ETL/ELT workflows using Azure Data Factory and Databricks
Develop robust data processing solutions using PySpark and Spark SQL
Work with Azure Data Lake Storage (ADLS Gen2) for structured and unstructured data
Design and implement data models (Star/Snowflake schemas) for analytics
Ensure data quality, governance, and security across data platforms
Optimize performance of large-scale data workloads (partitioning, caching, indexing)
Implement CI/CD pipelines and DevOps best practices using Azure DevOps/Git
Collaborate with architects, analysts, and business stakeholders for requirement gathering
Mentor junior engineers and contribute to technical decision-making
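The medallion (bronze/silver/gold) refinement pattern named in the responsibilities above can be sketched in miniature; in a real platform each layer would be a PySpark DataFrame backed by a Delta table, so the plain-Python structures and field names below (`order_id`, `event_ts`, `customer`, `amount`) are illustrative assumptions, not a prescribed schema.

```python
# Medallion-style refinement sketch: bronze (raw) -> silver (cleaned,
# deduplicated) -> gold (aggregated). Field names are illustrative.

def to_silver(bronze_rows):
    """Clean and deduplicate raw rows, keeping the latest by event_ts."""
    latest = {}
    for row in bronze_rows:
        if row.get("order_id") is None:          # drop malformed records
            continue
        key = row["order_id"]
        if key not in latest or row["event_ts"] > latest[key]["event_ts"]:
            latest[key] = row
    return list(latest.values())

def to_gold(silver_rows):
    """Aggregate cleaned rows into per-customer revenue totals."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "customer": "acme", "amount": 100, "event_ts": 1},
    {"order_id": 1, "customer": "acme", "amount": 120, "event_ts": 2},  # late update
    {"order_id": None, "customer": "bad", "amount": 0, "event_ts": 3},  # malformed
    {"order_id": 2, "customer": "zen", "amount": 50, "event_ts": 1},
]
silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'acme': 120, 'zen': 50}
```

In PySpark the same shape appears as a `dropDuplicates`/window-ranked silver step and a `groupBy().agg()` gold step, typically written as Delta tables per layer.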
Mandatory Skills
Strong hands-on experience in:
Azure Data Factory (ADF)
Azure Databricks (PySpark)
Azure Data Lake Storage (ADLS Gen2)
Azure Synapse Analytics
Expertise in Python and PySpark for large-scale data processing
Deep understanding of Delta Lake and Lakehouse Architecture
Strong experience in ETL/ELT pipeline design and orchestration
Proficiency in SQL (T-SQL / Spark SQL)
Experience with data modeling (Dimensional Modeling, SCDs)
Good to Have
Experience with real-time/streaming pipelines (Event Hub, Kafka, Structured Streaming)
Knowledge of data governance tools (Unity Catalog, Purview)
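The SCD (slowly changing dimension) experience asked for above can be illustrated without Spark. The sketch below applies Type 2 logic (expire the current dimension row, append a new current version) over plain Python dicts; the column names and the `end_date=None` convention for "current" are assumptions for the example, and in Databricks this would usually be a Delta `MERGE` statement.

```python
# SCD Type 2 upsert sketch: on an attribute change, close the current
# dimension row and append a new current version. Column names assumed.

def scd2_upsert(dim_rows, incoming, load_date):
    """Apply Type 2 logic for one incoming record keyed by natural_key."""
    for row in dim_rows:
        if row["natural_key"] == incoming["natural_key"] and row["end_date"] is None:
            if row["city"] == incoming["city"]:
                return dim_rows                  # no attribute change: nothing to do
            row["end_date"] = load_date          # expire the current version
            break
    dim_rows.append({
        "natural_key": incoming["natural_key"],
        "city": incoming["city"],
        "start_date": load_date,
        "end_date": None,                        # None marks the current row
    })
    return dim_rows

dim = [{"natural_key": "C1", "city": "Pune", "start_date": "2024-01-01", "end_date": None}]
dim = scd2_upsert(dim, {"natural_key": "C1", "city": "Bangalore"}, "2024-06-01")
print(len(dim))  # 2: one expired row, one current row
```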
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Summary
The EBS RTR Analyst (EMEA) supports Record-to-Report (RTR) processes within Oracle E-Business Suite, with a focus on General Ledger (GL), Fixed Assets (FA), and financial close activities. The role ensures accurate accounting, timely period close, and effective resolution of finance-related issues for EMEA business users.
Key Responsibilities
General Ledger (GL) Support
• Support GL journal entries, posting, and validation in Oracle EBS.
• Assist with GL account analysis and resolution of posting issues.
• Support month-end and period-end GL close activities.
Fixed Assets (FA)
• Support Fixed Assets accounting, including asset additions, retirements, transfers, and depreciation runs.
• Assist in resolving FA-related accounting and reporting issues.
• Ensure accurate asset balances and compliance with accounting policies.
Period Close & Reconciliation
• Support month-end, quarter-end, and year-end close activities.
• Assist with GL and sub-ledger reconciliations related to RTR processes.
• Ensure timely resolution of open finance issues prior to close.
Incident & Issue Management
• Provide L2 functional support for RTR-related incidents during EMEA business hours.
• Perform root-cause analysis and coordinate with technical teams for resolution.
• Ensure SLA adherence and clear communication with finance stakeholders.
Documentation & Compliance
• Maintain SOPs, process documentation, and close checklists.
• Support audit and compliance activities related to GL and FA.
• Ensure adherence to internal controls and standard accounting procedures.
Required Skills & Experience
Core Functional Skills
• 3–4 years of hands-on experience in Oracle EBS RTR processes.
• Strong working knowledge of General Ledger (GL), Fixed Assets (FA), and financial close support.
• Experience supporting RTR operations in ERP environments.
ERP Knowledge
• Oracle E-Business Suite (EBS) functional expertise in RTR-related modules.
• Understanding of the end-to-end RTR lifecycle, from journal posting to period close.
Soft Skills
• Strong analytical and problem-solving skills.
• Good communication skills to work with EMEA finance users.
• Ability to manage multiple activities during close periods.
Good to Have
• Experience in ERP AMS or shared services models.
• Exposure to audit support and SOX controls.
• Familiarity with ITSM / ticketing tools.
Additional Information
• The candidate should have a minimum of 2 years of experience in Oracle EBS Financials.
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.
Salary: Rs. 0 - Rs. 1,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
We are looking for a Senior Windchill Consultant who can lead solution design and delivery for enterprise PLM on PTC Windchill, with strong hands-on capability across configuration + customization, module expertise (PDMLink plus adjacent modules), and integrations (notably SAP S/4). This role partners with business and engineering stakeholders to deliver scalable PLM processes, robust integrations/APIs, and production-grade deployments.
Key Responsibilities
Lead end-to-end Windchill implementations/enhancements: discovery, design, build, test, cutover, and hypercare.
Own solution design for Windchill PDMLink, with working knowledge of adjacent modules (e.g., ProjectLink, PartsLink, SuMA), including data model and lifecycle/process design.
Perform hands-on customization (server-side extensions, UI/config where applicable), troubleshoot complex production issues, and drive root-cause fixes.
Design and deliver integrations between Windchill and SAP S/4 (materials/BOM, change, document sync patterns) and other enterprise systems.
Build and consume REST APIs for PLM interoperability; define integration contracts, error handling, monitoring, and performance approach.
Support CAD/ECAD integrations as needed (e.g., SolidWorks WGM, Cadence XPLM), including metadata mapping and release processes.
Enable quality and compliance processes within PLM (e.g., QMS alignment, traceability, audit readiness as applicable).
Implement/operate ThingWorx components where in scope (ThingWorx Navigate / Foundation) for PLM visualization and user experience.
Establish DevOps practices: environments, packaging, release management, and automated build/deploy pipelines.
Collaborate effectively with PTC Technical Support (TS) and vendors; manage escalations and patch/upgrade recommendations.
Produce high-quality documentation: solution designs, config specs, test evidence, runbooks, and knowledge transfer.
Required Skills & Experience (Must-Have)
5–8+ years Windchill experience with strong PDMLink functional + technical depth.
Demonstrated Windchill customization experience (not configuration-only), including troubleshooting/debugging production issues.
Experience with at least some of: ProjectLink, PartsLink, SuMA (implemented or supported in real projects).
Proven integration experience, including Windchill ↔ SAP S/4 and/or enterprise middleware patterns.
Strong working knowledge of REST APIs, integration design (auth, retries, idempotency, logging), and data synchronization patterns.
Experience supporting PLM processes: Part/BOM, Document, Change (ECR/ECO), lifecycle/workflows, access control.
Strong communication skills: can lead technical discussions, translate requirements, and drive decisions with stakeholders.
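The integration-design requirements above (retries, idempotency, logging) can be sketched generically. In the snippet below the HTTP transport is injected so the example stays self-contained; the `Idempotency-Key` header, the endpoint URL, and the payload shape are assumptions for illustration, not Windchill's actual REST contract, and real code would add exponential backoff between attempts.

```python
import logging
import uuid

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("plm-sync")

def post_with_retry(transport, url, payload, max_attempts=3):
    """POST via an injected transport, retrying transient (5xx) failures.

    One idempotency key per logical call lets the receiver deduplicate
    retries; the header name here is an assumption, and backoff/sleep
    between attempts is omitted for brevity.
    """
    headers = {"Idempotency-Key": str(uuid.uuid4())}
    last_status = None
    for attempt in range(1, max_attempts + 1):
        status = transport(url, payload, headers)
        last_status = status
        if status < 500:                  # success or non-retryable client error
            return status
        log.warning("attempt %d got %d, retrying", attempt, status)
    return last_status

# Fake transport failing twice, then succeeding -- simulates a flaky endpoint.
calls = []
def flaky(url, payload, headers):
    calls.append(headers["Idempotency-Key"])
    return 503 if len(calls) < 3 else 200

status = post_with_retry(flaky, "https://plm.example.com/parts", {"number": "P-100"})
print(status)           # 200 after two retried 503s
print(len(set(calls)))  # 1: the same idempotency key is reused across retries
```

Reusing the key across retries (rather than generating a new one per attempt) is what makes the retry safe against duplicate part creation on the receiving side.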
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Spring Batch is a must-have skill. We need a senior Java engineer who can lead delivery within a SAFe Agile Scrum team. Must have Java 17, Spring Boot 3.x, Spring Batch 5.x, Maven, solid software design and documentation skills, and strong testing with JUnit 5 and Mockito. Quality and security are non-negotiable: SonarQube quality gates and Checkmarx scans with no High/Critical findings before release. Hands-on AWS experience required: EC2, AWS Batch, S3, EFS, and ECR, plus Docker for containerization. Strong communicator with experience coordinating offshore teammates and applying OWASP secure coding practices.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: 562668 - INS ADM - Java Spring Boot + Batch Developer