We found 218 jobs matching your search

Job Description

1. Project Review and Analysis:
· Follow and comply with current Agency and Investor project guidelines to ensure standards are met for mortgages sold and delivered that are secured by units within Condominium Projects for various investors (Fannie Mae, Freddie Mac, FHA, VA, and various Jumbo investors), as well as Co-op Projects for Fannie Mae and Freddie Mac.
· Perform detailed reviews of condominium questionnaires, financials, insurance, and recorded legal documents, as well as litigation, ground leases, affordable housing, resale deed restrictions, and private transfer fees.
· Analyze and approve condominium projects per loan program parameters, such as Limited Review, Full Review of Established projects, and New Construction, for conventional, FHA, and VA reviews.
· Communicate with clients regarding ineligible or declined projects, providing clear reasoning and guidance.

2. Compliance and Guidelines:
· Stay current with relevant guidelines, updates, and industry best practices.
· Ensure every project review meets required standards and risk management protocols.

3. Training and Support:
· Mentor and train junior team members on compliance requirements, document review, and risk analysis.
· Provide subject matter expertise to internal teams and support client-facing communications as needed.

  • Salary: Rs. 30,000 - Rs. 50,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Project Review Associate

Job Description

Work Location: PAN India (locations with TCS offices); preferred: Hyderabad, TG / Thane, MH
Skill Required: Digital: Snowflake
Experience Range: 6-8 Years
Role Description: DBT, Snowflake, PySpark

We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering Team. The ideal candidate should be proficient in building scalable, high-performance data transformation pipelines using Snowflake and DBT. In this role you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision making across the client's organization.
• 5-8+ years of experience as a data engineer, with extensive development using Snowflake or a similar data warehouse technology.
• Strong technical expertise with DBT, Snowflake, PySpark, Apache Airflow, and AWS.
• Strong hands-on experience designing and building robust ELT pipelines using DBT on Snowflake, including ingestion from relational databases, cloud storage, flat files, and APIs (illustrated in the sketch after this list).
• Experience enhancing dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.
• Hands-on experience with SQL and Snowflake database design.
• Hands-on experience with AWS, Airflow, and Git.
• Strong analytical and problem-solving skills.
• Degree in Computer Science, IT, or a similar field; a Master's is a plus.
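For illustration, a minimal sketch of how the ELT orchestration described above might look: an Apache Airflow DAG that runs dbt transformations against Snowflake via the dbt CLI. The project path, profiles directory, and schedule are hypothetical placeholders, not details from this posting.

```python
# Hypothetical sketch (Airflow 2.x): run dbt models against Snowflake via the dbt CLI.
# The project directory /opt/dbt/analytics and profiles directory /opt/dbt are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # run the transformation pipeline once a day
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test  # tests run only after the models build successfully
```

Running dbt tests after the build is one simple way to keep bad data from propagating to downstream models.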

  • Salary: Rs. 70,000 - Rs. 1,30,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Consultant

Job Description

SAP BRIM/SOM requirement for candidates with 3-8 years of experience. Two resources needed: 1 - BRIM SOM; 2 - BRIM SOM + CI.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: OpenText ECM

Job Description

RAR Profiles

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: RAR Profiles

Job Description

Work Location: Hyderabad, TS
Skill Required: Digital: Adobe Experience Platform
Experience Range in Required Skills: 6-8 Yrs

Summary: We are seeking a skilled data engineer with deep expertise in Adobe Experience Platform (AEP) and Real-Time CDP (RTCDP) to drive data modeling, ingestion, transformation, and audience segmentation. The ideal candidate will have strong programming skills, experience with identity resolution, and a passion for building scalable customer data solutions.

Key Responsibilities:
- Design and implement data models and pipelines within AEP.
- Manage data ingestion from various sources, including streaming and batch (see the sketch after this list).
- Configure and optimize identity resolution rules in RTCDP.
- Build and manage audience segments for personalized experiences.
- Integrate external systems using APIs, SDKs, and custom connectors.
- Understanding of Data Distiller.
- Collaborate with cross-functional teams including marketing, analytics, and engineering.

Required Skills:
- Strong knowledge of AEP data modeling, ingestion, and transformation.
- Experience with RTCDP identity resolution and audience segmentation.
- Proficiency in Python, JavaScript, or SQL.
- Familiarity with APIs (Postman), SDKs, and streaming ingestion tools.

Preferred Skills:
- Experience with other CDPs and data platforms (Snowflake, Segment).
- Knowledge of Adobe Experience Cloud tools (Analytics, Journey Optimizer).
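As a rough illustration of the streaming-ingestion pattern listed above, the sketch below posts an XDM-shaped event to a streaming inlet over HTTPS using plain requests. The inlet URL, IMS org ID, schema reference, and field names are placeholders that would come from the actual AEP configuration; this is not presented as Adobe's official SDK usage.

```python
# Hypothetical sketch: stream a single profile event into a CDP via an HTTPS inlet.
# The inlet URL, IMS org ID, and schema reference below are placeholders, not real values.
import requests

INLET_URL = "https://dcs.adobedc.net/collection/<your-streaming-connection-id>"  # placeholder

event = {
    "header": {
        "schemaRef": {
            "id": "https://ns.adobe.com/<tenant>/schemas/<schema-id>",  # placeholder schema
            "contentType": "application/vnd.adobe.xed-full+json;version=1",
        },
        "imsOrgId": "<IMS_ORG_ID>",  # placeholder
    },
    "body": {
        "xdmMeta": {"schemaRef": {"id": "https://ns.adobe.com/<tenant>/schemas/<schema-id>"}},
        "xdmEntity": {
            "personID": "customer-12345",
            "personalEmail": {"address": "jane.doe@example.com"},
        },
    },
}

# Post the event; a 2xx response means the inlet accepted it for downstream processing.
resp = requests.post(INLET_URL, json=event, timeout=10)
resp.raise_for_status()
print("Ingestion accepted:", resp.status_code)
```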

  • Salary: Rs. 70,000 - Rs. 1,30,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Engineer

Job Description

Administration of overall Azure AD resources. Experience in Azure AD Connect, Azure proxy, Entra, and Role-Based Access Control (RBAC).
- Register and manage enterprise applications
- Configure single sign-on (SSO), proxy settings, and policies
- Manage security-related features: MFA and conditional access (see the sketch after this list)
- Manage privileged roles in Azure AD
- View usage reports, cost analysis, Azure subscriptions, and billing
- Manage compliance-related configurations and audit logs
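For illustration, a minimal sketch of the kind of programmatic administration described above: reading conditional access policies through Microsoft Graph with app-only (client credentials) authentication via the msal library. The tenant ID, client ID, and secret are placeholders, and the app registration is assumed to hold the Policy.Read.All application permission.

```python
# Hypothetical sketch: list conditional access policies via Microsoft Graph.
# TENANT_ID, CLIENT_ID, and CLIENT_SECRET are placeholders for a real app registration
# that has been granted the Policy.Read.All application permission.
import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# App-only token for Microsoft Graph (client credentials flow).
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
assert "access_token" in token, token.get("error_description")

resp = requests.get(
    "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=10,
)
resp.raise_for_status()

# Print each policy's display name and whether it is enabled, disabled, or report-only.
for policy in resp.json().get("value", []):
    print(policy["displayName"], "-", policy["state"])
```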

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Azure AD Engineer

Job Description

- Minimum 7 years of development experience, with the last 2+ years in Gen AI-based application development.
- Proficiency in Python or JavaScript for AI application development.
- Working experience with Gen AI platforms: OpenAI, Gemini, Perplexity, and Grok (a minimal example follows below).
- Experience in MLOps, with deployment experience in cloud services (AWS, GCP, Azure) for AI model deployment, and DB technologies (NoSQL, PostgreSQL, MongoDB).
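As a small illustration of the Gen AI application work described above, a minimal sketch using the OpenAI Python SDK. The model name and prompt are arbitrary examples, and the API key is assumed to be provided via the OPENAI_API_KEY environment variable.

```python
# Hypothetical sketch: a minimal Gen AI call using the OpenAI Python SDK (v1.x).
# Assumes the OPENAI_API_KEY environment variable is set; the model name is an example.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a concise technical assistant."},
        {"role": "user", "content": "Summarize what an MLOps pipeline does in two sentences."},
    ],
)

print(response.choices[0].message.content)
```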

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Delivery Manager / GenAI Technical Architect

Job Description

A. Data & Analytics Expertise
- KPI Development & Monitoring – Ability to formulate, evolve, and track service performance metrics (CES, CSAT, FCR, etc.); a small worked example follows after this list.
- Data Interpretation & Insight Generation – Strong skills in analyzing complex datasets to derive actionable insights.
- Dashboarding & Visualization – Advanced proficiency in Power BI or similar tools to create intuitive dashboards.
- Excel Mastery – Expertise in data manipulation, aggregation, and analysis using Microsoft Excel.
- Salesforce Cloud Analytics – Experience is a plus for integrating consumer data insights.
- Benchmarking & Industry Analysis – Capability to perform competitive benchmarking and trend analysis.
- Ad-hoc Analytical Support – Flexibility to support initiatives with tailored data research.

B. Technical Proficiency
- Data Platforms Familiarity – Understanding of platforms like Microsoft Azure.
- Programming Knowledge – Basic grasp of SQL or Python (not mandatory but beneficial).
- Data Accuracy & Quality Management – Ability to monitor and refine metrics for relevance and precision.
- Data Privacy Awareness – Knowledge of data protection regulations and ethical data handling.
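As a small worked example of the KPI work described above, the sketch below computes CSAT and FCR from a toy support-ticket dataset with pandas. The column names, thresholds, and data are invented purely for illustration.

```python
# Hypothetical sketch: compute CSAT and FCR from a toy support-ticket dataset.
# Column names and data are invented purely for illustration.
import pandas as pd

tickets = pd.DataFrame({
    "ticket_id":           [1, 2, 3, 4, 5],
    "satisfaction_1_5":    [5, 4, 2, 5, 3],   # post-contact survey score, 1-5
    "contacts_to_resolve": [1, 1, 3, 1, 2],   # how many contacts it took to resolve
})

# CSAT: share of surveyed tickets scoring 4 or 5.
csat = (tickets["satisfaction_1_5"] >= 4).mean() * 100

# FCR: share of tickets resolved on the first contact.
fcr = (tickets["contacts_to_resolve"] == 1).mean() * 100

print(f"CSAT: {csat:.1f}%  FCR: {fcr:.1f}%")
```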

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data & Analytics Expertise

Job Description

Key Responsibilities:

Architecture & Solution Design
- Architect and implement MDM solutions in Semarchy MDM for the Business Partner and Product domains and associated reference data.
- Define data models, data quality rules, matching rules, hierarchies, and stewardship workflows.
- Ensure scalability, security, and compliance with data privacy regulations.

Implementation & Delivery
- Lead the configuration and deployment of Semarchy MDM, including data ingestion, cleansing, enrichment, and publishing.
- Collaborate with business analysts and operations teams to gather requirements and translate them into technical solutions.
- Oversee data migration from legacy systems and ensure high data quality.

Integration & Governance
- Integrate MDM with other systems including CRM, ERP, and external portals.
- Establish data governance frameworks, stewardship roles, and audit trails.
- Enable API-based data access for real-time partner validation and onboarding (see the sketch after this list).
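To illustrate the API-based access pattern mentioned above, a rough sketch of a real-time partner lookup over REST. The endpoint path, authentication scheme, and response fields are hypothetical placeholders and are not Semarchy's actual API.

```python
# Hypothetical sketch: validate a business partner against an MDM REST endpoint
# during onboarding. The URL, auth, and field names are placeholders, not a real API.
import requests

MDM_BASE_URL = "https://mdm.example.com/api"   # placeholder host
API_TOKEN = "<api-token>"                      # placeholder credential

def lookup_partner(partner_id: str) -> dict | None:
    """Return the golden record for a partner, or None if no match exists."""
    resp = requests.get(
        f"{MDM_BASE_URL}/business-partners/{partner_id}",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=5,
    )
    if resp.status_code == 404:
        return None
    resp.raise_for_status()
    return resp.json()

record = lookup_partner("BP-000123")
if record is None:
    print("Partner not found; route to stewardship for manual review.")
else:
    print("Validated partner:", record.get("legalName"), record.get("status"))
```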

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Uniper | Semarchy