We are looking for an Engineer with 5-6 years of experience in the Telecom domain. Experience working in an operations environment is an added benefit, as is experience in drive test, survey, or deployment project co-ordination.
Roles and Responsibilities:
• Prepare work orders as per the test plan and update them in the work order tool. Ensure work orders are updated in the tool 7 days before the actual test date.
• Adjust work orders and schedules as and when required. Report all deviations to the regional lead and obtain the necessary approvals from internal and external project teams.
• Ensure the quality of all schedules and reports before submission to internal and external teams.
• Validate car and equipment photos against the defined standards. Notify internal leads of any deviations identified. Identify repeat defaulters and advise them to improve their data collection so that deliverable quality targets are met.
• Track equipment and car details daily and provide a daily report to internal and external leads.
• Prepare daily, weekly, monthly, quarterly, and ad-hoc reports as per lead and external stakeholder requirements.
Professional and Technical Skills:
• Verify data collection within the grid, ensure completeness of data, identify any deviations, and flag them to the field team within 24 hours.
• Co-ordinate with onshore teams on a daily basis to ensure project progress.
• Validate equipment status at regular intervals through photos.
• Support field troubleshooting for any major issues. Prepare RCAs and submit them to internal and external stakeholders.
• Identify failure causes and recurring patterns, report them to internal and external stakeholders, and provide recommendations for improvement.
Additional Information:
• Proficient in written and verbal English communication.
• Experience working with Microsoft Office.
• 15 years of full-time education
• BE / BTech
Salary: Rs. 0.0 - Rs. 1,50,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Title External: Support and Operations Responsible – Consumer Analytics
BSH internal title: Support and Operations Responsible (B2C)
You will be working in an international environment and contributing personally to BSH’s future success, with challenging tasks and projects ahead:
• Be the subject matter expert for our Consumer Listening and Analytics solutions (Analytics tool portfolio, Revuze, Focus Feedback, AWS DB, etc.)
• Collaborate with business stakeholders to ensure the solutions fit to the needs of our growing user base
• Work in an agile, cross-functional team, with the option to contribute to Agile ceremonies and support smooth collaboration within the team
• Enhance our technology through integration of cutting-edge AI technology
• Coordinate/support BSH/Bosch cloud onboarding process, gathering the necessary technical information and documentation
• Initial establishment of solution in BSH
o Create initial product documentation incl. Iteraplan maintenance etc.
o Create user/role concept together with stakeholders incl. access request process
o Set up initial support processes (incident management via SMT etc.)
o Create basic training material and user documentation
o Estimate sizing of solutions as preparations for license negotiations (e.g. estimate growth of user base etc.)
• Manage RUN of solution - Global Coordination Role
o Build support and operation work-stream
o Ticket handling (incidents, user requests, consulting & support)
o Process user/access requests
Your Profile
• University degree in digital, business, information technology or similar field
• 1-3 years’ experience as a product/project manager for a B2C solution
• Customer success management (agency) or project management experience in the marketing area
• Team player with strong communication skills
• Basic Agile background
• Fluent in English
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Instructional Designer
Content based QA, UI/UX, Video Editing, Instructional designer
· Content-based QA like content editing, content mapping, CMOS, grammar, punctuation, alignment, Bloom’s taxonomy reviews.
· Work closely with the project team to identify potential quality issues early and address them promptly to minimize delays.
· Enhance the QA process by conducting regular quality checks and audits.
· Aim for a 95% defect-free rate and track the number of audits conducted and issues resolved.
· Ensure all content is accurate, error-free, and consistent in style and format.
· This includes checking for grammatical mistakes, punctuation, formatting and alignment, factual inaccuracies, and adherence to style guides.
· Conduct thorough device and browser testing. Knowledge of accessibility and screen-reading tools such as JAWS, NVDA, and axe DevTools will be an added preference.
Salary : As per industry standard.
Industry :IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Job Title: Engineer
Work Location: BANGALORE, KA
Skills Required: Data Concepts & Data Modelling~Digital : Databricks
Experience Range in Required Skills: 8 to 10
Job Description: Technology expert who constantly pursues knowledge enhancement and has an inherent curiosity to understand work from multiple dimensions. Deep data focus with expertise in the technology domain. A skilled communicator capable of speaking to both technical developers and business managers; respected and trusted by leaders and staff. Actively delivers the roll-out and embedding of Data Foundation initiatives in support of the key business programmes. Coordinates the change management, incident management, and problem management processes. Presents reports and findings to key stakeholders and acts as the subject matter expert on data analysis and design. Drives implementation efficiency and effectiveness across the pilots and future projects to minimize cost, increase speed of implementation, and maximize value delivery. Contributes to community-building initiatives like CoE and CoP.
Essential Skills:
• Databricks Certification (AWS, Azure)
• SAP - Master
• ELT - Master
• Data Modeling - Master
• Data Integration / Ingestion - Skill
• Data Manipulation and Processing - Skill
• GitHub, GitHub Actions, Azure DevOps - Skill
• Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, SonarQube, PyTest - Skill
• Databricks Certification for the Databricks profile
Salary: Rs. 90,000.0 - Rs. 1,65,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Accounting/Finance background
Ledger Reconciliation
Cash Collection, Cash changing
Good communication; confident in handling calls with clients across APAC
Salary: Rs. 0.0 - Rs. 45,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and user experience. You will also participate in testing and debugging to deliver high-quality applications that meet stakeholder expectations.
Roles & Responsibilities:
• Expected to perform independently and become an SME.
• Active participation and contribution in team discussions.
• Contribute to solutions for work-related problems.
• Assist in the documentation of application processes and workflows.
• Collaborate with cross-functional teams to gather requirements and provide feedback on application functionality.
Professional & Technical Skills:
• Must-Have Skills: Proficiency in React.js.
• Good-to-Have Skills: Experience with Redux and RESTful APIs.
• Strong understanding of front-end development principles and best practices.
• Familiarity with version control systems such as Git.
• Experience in responsive design and mobile-first development.
Additional Information:
• The candidate should have a minimum of 3 years of experience in React.js.
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.
Salary: Rs. 0.0 - Rs. 1,45,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Design and develop scalable ETL pipelines using Azure Databricks, PySpark, and Azure Data Factory to ingest, transform, and integrate data from multiple internal and external sources.
Ensure data quality, consistency, and reliability across complex data workflows and large-scale datasets.
Collaborate with data architects, analysts, and business stakeholders to understand data requirements and deliver efficient data solutions. Optimize data processing performance and implement best practices.
Work in Agile development environments, participating in sprint planning and reviews.
Mentor the team you work with.
Collaborate with cross-functional teams to integrate PySpark solutions with the existing landscape.
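The data-quality responsibilities above (completeness, consistency, rejecting bad rows) can be sketched in plain Python. In the actual role this logic would run inside an Azure Databricks / PySpark pipeline; the field names and rules below are hypothetical, purely to illustrate the kind of check meant.

```python
# Minimal data-quality sketch: split a batch into clean rows and rejected
# rows with reasons. Field names ("id", "amount", "source") are made up.

def check_quality(records, required_fields=("id", "amount", "source")):
    """Return (clean, rejected) where rejected pairs each bad row with a reason."""
    clean, rejected = [], []
    seen_ids = set()
    for row in records:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append((row, f"missing fields: {missing}"))
        elif row["id"] in seen_ids:
            # Consistency check: the same business key must not appear twice.
            rejected.append((row, "duplicate id"))
        else:
            seen_ids.add(row["id"])
            clean.append(row)
    return clean, rejected

batch = [
    {"id": 1, "amount": 10.0, "source": "crm"},
    {"id": 1, "amount": 12.5, "source": "erp"},   # duplicate id
    {"id": 2, "amount": None, "source": "crm"},   # missing amount
]
clean, rejected = check_quality(batch)
```

In a real pipeline the same predicates would typically be expressed as PySpark column expressions so they run distributed rather than row by row.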
Salary: Rs. 10,00,000.0 - Rs. 20,00,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Mulesoft
• 10+ years in software engineering, integration, or enterprise systems, with 5–7 years in an architect role.
• Strong expertise in Mule 4, Anypoint Platform, CloudHub, API Manager, Runtime Manager, Exchange, RAML, DataWeave, and CI/CD pipelines.
• Proven experience as an API Architect (3+ years) designing and governing large-scale enterprise integrations.
• Familiarity with Azure DevOps and Agile/Scrum methodologies.
• Hands-on experience with API gateways, policies, subscriptions, developer portal, monitoring, and security.
• Deep expertise in API governance, security, and performance optimization.
• Strong knowledge of OAuth 2.0, JWT, OpenID Connect, SAML, and related API security practices.
• Experience with CI/CD pipelines (Azure DevOps, GitHub Actions).
• Excellent leadership, stakeholder management, and communication skills.
• Ability to operate in multi-vendor, hybrid cloud environments.
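As one concrete illustration of the API-security knowledge listed above (OAuth 2.0, JWT), the sketch below decodes a JWT's payload to inspect its claims. It deliberately skips signature verification, which production code must perform against the issuer's keys (e.g. with a library such as PyJWT); the token and claim names here are invented for the example.

```python
import base64
import json

def b64url_decode(segment: str) -> bytes:
    # Restore the base64url padding that JWTs strip before decoding.
    padding = "=" * (-len(segment) % 4)
    return base64.urlsafe_b64decode(segment + padding)

def peek_jwt_claims(token: str) -> dict:
    """Decode a JWT's header and payload WITHOUT verifying its signature.

    Useful only for inspection/debugging; never trust unverified claims.
    """
    header_b64, payload_b64, _signature = token.split(".")
    return {
        "header": json.loads(b64url_decode(header_b64)),
        "claims": json.loads(b64url_decode(payload_b64)),
    }

def b64url_encode(data: dict) -> str:
    # Helper to build a toy token: compact JSON, base64url, no padding.
    raw = json.dumps(data, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")

# Hypothetical unsigned demo token: header.payload.(empty signature)
demo_token = ".".join([
    b64url_encode({"alg": "none", "typ": "JWT"}),
    b64url_encode({"sub": "user-123", "scope": "orders:read"}),
    "",
])
```

The three dot-separated segments (header, payload, signature) are the structure an API gateway policy validates before admitting a request.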
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance