Project Details: The project focuses on RLT accounts cleanup: identifying duplicate RLT accounts in Salesforce, distinguishing between Golden and duplicate accounts, and aligning all relevant data to the Golden record according to business rules.
Project Scope: Until the end of 2026
Number of resources required: 4-5
Start date: ASAP
Skill set requirement:
Data Profiling & Data Quality Analysis
• Identifying duplicates, incomplete records, invalid values, and inconsistencies across facility and contact records.
• Understanding “Golden Record” vs. duplicate record logic; de-duping based on business rules
• Good understanding of HCO (Healthcare Organization) data
• Checking completeness, accuracy, consistency, and reliability before analysis
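The profiling checks above can be sketched in plain Python. This is an illustrative sketch only: the field names (`name`, `npi`, `state`) are hypothetical stand-ins, not the actual Salesforce schema.

```python
# Hypothetical facility records; field names are illustrative assumptions.
records = [
    {"name": "Mercy Hospital", "npi": "1234567890", "state": "OH"},
    {"name": "Mercy Hospital", "npi": "1234567890", "state": "OH"},
    {"name": "", "npi": None, "state": "ZZ"},
]

VALID_STATES = {"OH", "NY", "CA"}  # truncated example list, not exhaustive

def profile(rows):
    """Count missing values, invalid state codes, and exact duplicates."""
    missing = sum(1 for r in rows if not r["name"] or not r["npi"])
    invalid = sum(1 for r in rows if r["state"] not in VALID_STATES)
    seen, dupes = set(), 0
    for r in rows:
        key = (r["name"], r["npi"])
        if key in seen:
            dupes += 1
        seen.add(key)
    return {"missing": missing, "invalid_state": invalid, "duplicates": dupes}
```

A profile like this is typically run before any merge work, so the scale of the cleanup is known up front.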
Match & Merge Concepts
• Record matching (exact & fuzzy) to create a 360° view of the account
• Survivorship rules (what data stays on Golden Record)
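The two concepts above can be sketched with the standard library: exact matching on an identifier with a fuzzy fallback on name, and a simple survivorship rule that keeps Golden values and fills gaps from the duplicate. The matching rule and the 0.85 threshold are hypothetical examples, not the project's actual business rules.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Normalized fuzzy similarity between two account names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(rec_a, rec_b, threshold=0.85):
    """Exact match on NPI first, else fuzzy match on name (illustrative rule)."""
    if rec_a["npi"] and rec_a["npi"] == rec_b["npi"]:
        return True
    return similarity(rec_a["name"], rec_b["name"]) >= threshold

def survive(golden_rec, dupe_rec):
    """Survivorship sketch: Golden values win; duplicate fills missing fields."""
    merged = dict(golden_rec)
    for k, v in dupe_rec.items():
        if not merged.get(k):
            merged[k] = v
    return merged
```

In practice the survivorship rules are field-by-field business decisions; this sketch only shows the "Golden wins, duplicate fills gaps" pattern.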
Data cleaning
• Handle missing values
• Remove duplicates
• Correct inconsistencies and align/move data from the duplicate to the Golden Record
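The three cleaning steps above (missing values, duplicates, inconsistencies) can be illustrated in one small pass. The normalization choices here (title-casing names, defaulting state to `UNKNOWN`) are assumptions for the example, not stated project rules.

```python
def clean(rows):
    """Normalize names, fill missing state, and drop exact duplicates."""
    out, seen = [], set()
    for r in rows:
        rec = {
            "name": r.get("name", "").strip().title(),   # correct inconsistent casing
            "state": (r.get("state") or "UNKNOWN").upper(),  # handle missing values
        }
        key = (rec["name"], rec["state"])
        if rec["name"] and key not in seen:  # remove duplicates, skip empty names
            seen.add(key)
            out.append(rec)
    return out
```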
SQL skills
• Strong proficiency in SQL, including joins, GROUP BY, and subqueries
• Filter, sort, aggregate, and join datasets within Salesforce
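The classic duplicate-finding query behind this skill set is a GROUP BY with a HAVING filter. Sketched here against an in-memory SQLite table (the `account` table and its columns are hypothetical; the real work would run against Salesforce data via SOQL or an export):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER, name TEXT, npi TEXT)")
conn.executemany(
    "INSERT INTO account VALUES (?, ?, ?)",
    [(1, "Mercy Hospital", "111"),
     (2, "Mercy Hospital", "111"),
     (3, "City Clinic", "222")],
)

# Group accounts sharing the same name and NPI; HAVING keeps only
# clusters with more than one member, i.e. duplicate candidates.
dupes = conn.execute(
    """SELECT name, npi, COUNT(*) AS n
       FROM account
       GROUP BY name, npi
       HAVING COUNT(*) > 1"""
).fetchall()
```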
Programming for analysis
• Python or R
• Exposure to cloud platforms like AWS, Azure or GCP
Business and domain understanding
• Pharma business understanding, especially of HCO data
• Recommend actions based on findings
Must-have:
• Data quality concepts and the ability to identify duplicate accounts for US healthcare organizations.
• Salesforce object understanding and basic knowledge of SOQL to join different objects
• Basic understanding of de-duping concepts for healthcare accounts.
• Good communication skills and the ability to work with stakeholders.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description: Business Analyst – Life Sciences (Data, Compliance & Analytics)
Overview
We are seeking a highly skilled Life Sciences Business Analyst with strong experience in data analytics, data quality, and regulatory/compliance frameworks. The ideal candidate will have hands-on expertise in SQL querying, data profiling, and data interpretation, along with a solid understanding of Life Sciences compliance, GxP standards, Salesforce ethics, and SDLC governance.
Key Responsibilities
• Collaborate with cross-functional teams (IT, Data Engineering, Compliance, Medical, Quality) to gather and translate business requirements into clear functional specifications and user stories.
• Perform advanced SQL querying, data profiling, data mapping, and data validation across multiple Life Sciences datasets (CRM, MDM, clinical, quality, operational, etc.).
• Conduct data quality assessments, identify anomalies, and work with technical teams to drive remediation and improve data integrity.
• Analyze datasets to support reporting, KPI development, operational insights, and data-driven decision making.
• Ensure all processes and solutions comply with Life Sciences regulatory standards, including GxP, data privacy, audit requirements, and Salesforce ethics guidelines.
• Follow SDLC compliance standards, documentation practices, and validation processes throughout the project lifecycle.
• Support change management activities including impact assessments, communication, training, and user adoption.
• Participate in solution design, UAT planning, test case creation, execution, and deployment readiness.
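The KPI-development responsibility above amounts to aggregating operational records into metrics. A minimal sketch, with an entirely hypothetical case dataset and a made-up "closure rate by site" KPI:

```python
from collections import defaultdict

# Hypothetical operational records; fields are illustrative assumptions.
cases = [
    {"site": "US-01", "status": "closed", "days_open": 3},
    {"site": "US-01", "status": "open",   "days_open": 10},
    {"site": "EU-02", "status": "closed", "days_open": 5},
]

def closure_rate_by_site(rows):
    """KPI sketch: fraction of closed cases per site."""
    totals, closed = defaultdict(int), defaultdict(int)
    for r in rows:
        totals[r["site"]] += 1
        if r["status"] == "closed":
            closed[r["site"]] += 1
    return {s: closed[s] / totals[s] for s in totals}
```

In the role itself this logic would more likely live in SQL or a BI tool (Power BI/Tableau); the sketch only shows the aggregation step.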
Required Skills
• 10+ years of Business Analyst experience in the pharma/life sciences domain.
• Strong hands-on experience with SQL, data profiling, data analysis, and working with large, complex datasets.
• Solid understanding of Life Sciences compliance, GxP, audit readiness, and Salesforce ethics.
• Experience with CRM platforms (Veeva/Salesforce), MDM, data lakes, and BI tools (Power BI/Tableau).
• Familiarity with SDLC processes, validation standards, and documentation best practices.
• Excellent communication, analytical thinking, and stakeholder management skills.
Preferred
• Experience with data governance, metadata management, or master data processes.
• Exposure to clinical, safety, regulatory, or quality systems.
• Experience supporting organizational change management.
• Degree in Life Sciences, Data Analytics, Business, or related field.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
JD for Talend
Position Summary
We are seeking a highly skilled Talend Big Data ETL Developer with 5–6 years of IT experience, including a minimum of 4 years working specifically on the Talend Big Data platform. The ideal candidate will have strong expertise in designing, developing, optimizing, and deploying complex ETL pipelines using Talend and SQL, with excellent analytical and communication skills.
________________________________________
Key Responsibilities
• Design, develop, and maintain complex ETL pipelines using Talend Big Data tools on the Spark framework.
• Build and manage ETL infrastructure to extract, transform, and load data from a wide range of structured and unstructured data sources.
• Work extensively with Talend ETL components, including data processing, orchestration, parallelization, and transformation modules.
• Develop Talend Jobs, Joblets, and custom Java-based components as required.
• Perform installation, configuration, and maintenance of Talend Job Server, TAC Server, and other Talend components.
• Optimize Talend jobs for high performance, scalability, and parallel execution across multiple job servers.
• Deploy Talend jobs across environments and automate deployment pipelines.
• Create and maintain Context Groups, parameterization frameworks, and custom routines.
• Implement error handling, monitoring, alerting, and reporting mechanisms for ETL processes.
• Write and execute unit test cases and support integration testing for all ETL components.
• Design and support data ingestion pipelines across development, testing, and production environments.
• Participate in performance tuning, troubleshooting, and best practice recommendations.
• Utilize TAC for deployments, job scheduling, and administrative activities (added advantage).
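Talend jobs are designed graphically and compile to Java, so they can't be reproduced directly here; but the control flow of the error-handling responsibility above (a main flow plus a reject flow, as in a typical tMap design) can be sketched in Python. Field names and the rejection rule are hypothetical.

```python
def transform(row):
    """Uppercase the code field; raise on missing values so the row is rejected."""
    if row.get("code") is None:
        raise ValueError("missing code")
    return {**row, "code": row["code"].upper()}

def run_job(rows):
    """Main flow loads transformed rows; failures go to a reject flow
    with the error captured, mirroring a Talend reject link."""
    loaded, rejects = [], []
    for row in rows:
        try:
            loaded.append(transform(row))
        except ValueError as exc:
            rejects.append({"row": row, "error": str(exc)})
    return loaded, rejects
```

In Talend itself, the reject flow would typically feed a logging or alerting component rather than an in-memory list.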
________________________________________
Must-Have Skills
• Minimum 4 years of hands-on experience with Talend Big Data ETL platform.
• Strong SQL programming experience (preferably SQL Server).
• Excellent analytical and problem-solving skills.
• Strong communication skills and the ability to work with cross-functional teams.
• Java debugging fundamentals (on TAC)
• Expertise in:
o ETL fundamentals and end-to-end ETL lifecycle.
o Talend core components, transformations, orchestrations, and job optimization.
o Joblets, custom components, and Java-based custom logic.
o Error handling, performance tuning, and monitoring frameworks.
________________________________________
Nice-to-Have Skills
• Experience with TAC for job administration, deployment, and scheduling.
• Familiarity with DevOps pipelines for Talend job deployment.
• Working knowledge of data quality, data validation, and metadata management.
Responsibilities
SG is looking at at least 6 selections for this role. This demand is currently offline; the Taleo requisition is being created and will be released shortly.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Strong experience building web applications with REST API integration of microservices in .NET Core technologies.
• Strong full-stack developer with a proven track record of getting things done.
• Expert in two or more Integrated Development Environments and languages, including C#, .NET Core, JavaScript, Objective-C, Visual Studio, Eclipse, etc.
• Experience developing against common API technologies, including REST and SOAP
• Experience with React.js framework development
• Experience with ASP.NET Web API and .NET Core
• Experience with CI/CD and AWS deployments
• Experience with AWS technologies: API Gateway, S3, CloudFront, ECS, AWS Fargate, Lambda authorizers, EventBridge, etc.
• Strong knowledge of the REST API lifecycle
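The REST API lifecycle mentioned above (create, read, replace, delete, with status codes) is stack-independent; a minimal in-memory sketch follows. This is a language-agnostic illustration in Python, not .NET code, and the resource shape is hypothetical.

```python
class ResourceStore:
    """Minimal in-memory store illustrating the REST resource lifecycle."""

    def __init__(self):
        self._items, self._next_id = {}, 1

    def post(self, body):          # POST /items -> 201 Created
        item_id = self._next_id
        self._next_id += 1
        self._items[item_id] = body
        return 201, {"id": item_id, **body}

    def get(self, item_id):        # GET /items/{id} -> 200 OK or 404 Not Found
        if item_id not in self._items:
            return 404, None
        return 200, {"id": item_id, **self._items[item_id]}

    def put(self, item_id, body):  # PUT /items/{id} -> full replacement
        if item_id not in self._items:
            return 404, None
        self._items[item_id] = body
        return 200, {"id": item_id, **body}

    def delete(self, item_id):     # DELETE /items/{id} -> 204 No Content
        if self._items.pop(item_id, None) is None:
            return 404, None
        return 204, None
```

In an ASP.NET Web API service the same lifecycle maps onto controller actions with `[HttpPost]`, `[HttpGet]`, `[HttpPut]`, and `[HttpDelete]` attributes.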
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
9 PDC3B We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines on the Enterprise Data Platform (EDL) running on Cloudera (CDP) on AWS. The role involves working with the Hadoop ecosystem, building PySpark-based data processing pipelines, orchestratin
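The core PySpark pattern behind such pipelines, filter then aggregate, can be illustrated in plain Python (no Spark dependency here; in the actual pipeline these would be DataFrame operations on the cluster). The event schema is a made-up example.

```python
from collections import defaultdict

# Hypothetical ingestion events; fields are illustrative assumptions.
events = [
    {"source": "crm", "bytes": 120},
    {"source": "crm", "bytes": 80},
    {"source": "erp", "bytes": 0},
]

def bytes_per_source(rows):
    """Filter out empty payloads, then total bytes per source, mirroring
    df.filter(col("bytes") > 0).groupBy("source").sum("bytes") in PySpark."""
    totals = defaultdict(int)
    for r in rows:
        if r["bytes"] > 0:
            totals[r["source"]] += r["bytes"]
    return dict(totals)
```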
Salary: Rs. 0.0 - Rs. 2,16,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Scrum master
- Agile E2 certified
- SAFe Agile knowledge
- Excellent working knowledge of JIRA/Confluence
- Excellent knowledge of quality and service metrics
- Automotive business domain knowledge will be an added advantage
- Excellent communication and presentation skills
- Knowledge of Salesforce, Java, and React technologies will be an added advantage
- Product backlog prioritization in alignment with stakeholders (Business/IT)
- Stakeholder management, liaising with third parties/business users on prolonged issues
- Negotiating with third-party vendors to define the scope of feature deployments
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance