Job Posting Title: Informatica PIM 360 (3 to 8 Years)
Description: Looking for a candidate with 4-8 years of hands-on experience with Informatica PIM for product data management, including Product 360 Desktop/Web and supplier catalog integrations. Skilled in Java, REST/SOAP APIs, and customizing import/export processes. Proficient in data quality management using Informatica IDQ, data modeling, and integration with MDM, Azure, PLM, or ERP systems. Strong analytical skills, Agile experience, and effective collaboration within global teams are expected.
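The role centers on customizing import/export flows against Product 360 through REST/SOAP APIs. As a rough, non-authoritative illustration of that kind of integration work, the sketch below upserts a supplier catalog item against a PIM-style REST endpoint; the base URL, credential, and payload fields are placeholders and do not reflect the actual Informatica Product 360 API.

    # Hypothetical sketch: upsert a catalog item against a PIM-style REST endpoint.
    # The endpoint path, credential, and payload schema are illustrative placeholders,
    # not the real Informatica Product 360 REST API.
    import requests

    PIM_BASE_URL = "https://pim.example.com/api/v1"  # placeholder host
    API_TOKEN = "changeme"                           # placeholder credential

    def upsert_item(item: dict) -> dict:
        """Create or update a single product item and return the server's response."""
        response = requests.post(
            f"{PIM_BASE_URL}/items",
            json=item,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        sample = {
            "itemNumber": "SKU-12345",  # hypothetical field names
            "description": "Stainless steel hex bolt, M8 x 40 mm",
            "supplierCatalog": "ACME-2024",
        }
        print(upsert_item(sample))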
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Performance Test Lead, you will engage with clients to gather requirements, create performance test plans, negotiate priorities, analyze results, provide feedback to development teams, and mentor the team in their day-to-day work. Your typical day will involve collaborating with various stakeholders to develop and implement effective performance test strategies, ensuring that client needs are met with precision and care.
Roles & Responsibilities:
- Expected to be a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills and knowledge.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache JMeter (a short automation sketch follows this list).
- Strong analytical skills to assess asset performance and recommend improvements.
- Experience in developing and implementing financial strategies that align with business objectives.
- Ability to communicate complex financial concepts clearly to diverse audiences.
- Proficiency in using financial modeling tools and software to analyze data.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Performance Testing Strategy and Apache JMeter.
- This position is based in Hyderabad.
- 15 years of full-time education is required.
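As a minimal sketch of the JMeter-driven workflow referenced above, the Python script below runs a test plan in non-GUI mode and summarizes the results. It assumes the jmeter launcher is on the PATH, that plan.jmx is a placeholder test plan, and that the JTL results file is CSV with the default label, elapsed (ms), and success columns; treat it as an illustration rather than a prescribed setup.

    # Minimal sketch: run a JMeter test plan in non-GUI mode and summarize the results.
    # Assumes the 'jmeter' launcher is on the PATH and that the JTL results file is
    # CSV with the default 'elapsed' (ms) and 'success' columns.
    import csv
    import statistics
    import subprocess

    TEST_PLAN = "plan.jmx"       # placeholder test plan file
    RESULTS_FILE = "results.jtl"

    def run_test_plan() -> None:
        # -n: non-GUI mode, -t: test plan, -l: results (JTL) file
        subprocess.run(["jmeter", "-n", "-t", TEST_PLAN, "-l", RESULTS_FILE], check=True)

    def summarize(results_path: str) -> None:
        elapsed, failures = [], 0
        with open(results_path, newline="") as fh:
            for row in csv.DictReader(fh):
                elapsed.append(int(row["elapsed"]))
                if row["success"].lower() != "true":
                    failures += 1
        if not elapsed:
            print("no samples recorded")
            return
        elapsed.sort()
        p90 = elapsed[int(0.9 * (len(elapsed) - 1))]
        print(f"samples={len(elapsed)} errors={failures} "
              f"avg={statistics.mean(elapsed):.0f}ms p90={p90}ms max={max(elapsed)}ms")

    if __name__ == "__main__":
        run_test_plan()
        summarize(RESULTS_FILE)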
Salary: Rs. 0 - Rs. 1,80,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Developer
Work Location: Hyderabad / Bangalore
Skills Required: Digital: Python, Digital: PySpark
Experience Range in Required Skills: 6 to 8 years
Job Description:
Role: Data Engineer (Python / PySpark)
Core Technical Skills:
- Python
  - Data processing and transformation using Pandas, NumPy
  - Writing modular, reusable code for ETL workflows
  - Automation and scripting for data operations
- PySpark
  - Building distributed data pipelines
  - Spark SQL, DataFrame APIs, and RDDs
  - Performance tuning (partitioning, caching, shuffle optimization)
- SQL
  - Complex queries, joins, aggregations, and window functions
  - Query optimization for large datasets
- Data Modeling & ETL
  - Designing schemas for analytics and operational systems
  - Implementing ETL/ELT pipelines and orchestration tools (Airflow, Databricks Jobs)
- Big Data & Cloud Platforms
  - Experience with AWS, Azure, or GCP
  - Familiarity with data lakes and Delta Lake patterns
- File Formats & Storage
  - Parquet, ORC, Avro for efficient storage
  - Understanding of partitioning strategies
- Testing & CI/CD
  - Unit and integration testing for data pipelines
  - Git-based workflows and automated deployments
A short PySpark sketch illustrating several of these points follows this list.
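The sketch below is a minimal batch pipeline combining a few of the listed skills: a window function for deduplication, an aggregation, and a partitioned Parquet write. The input/output paths and column names (order_id, updated_at, status, amount) are illustrative assumptions, not part of the posting.

    # Minimal PySpark batch pipeline sketch (paths and column names are placeholders).
    from pyspark.sql import SparkSession, Window, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Read raw order events from a data lake location (placeholder path).
    orders = spark.read.parquet("s3://example-bucket/raw/orders/")

    # Deduplicate: keep only the most recent record per order_id using a window function.
    w = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
    latest = (
        orders
        .withColumn("rn", F.row_number().over(w))
        .filter(F.col("rn") == 1)
        .drop("rn")
    )

    # Aggregate to a daily summary by status.
    daily = (
        latest
        .withColumn("order_date", F.to_date("updated_at"))
        .groupBy("order_date", "status")
        .agg(F.count("*").alias("order_count"), F.sum("amount").alias("total_amount"))
    )

    # Write as Parquet, partitioned by date for efficient downstream reads.
    (daily
        .repartition("order_date")
        .write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/curated/daily_order_summary/"))

    spark.stop()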
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance