Key Responsibilities:
Operational:
• Understand the recruiting process and its specific processes and policies
• Seek reports from BGC vendors
• Screen and QC reports and documents
• Ensure all paperwork is received and completed according to policy; responsible for post-offer management
• Resolve document/BGC-related questions and concerns
• Utilize the recruiting database and tools
• Review recruit files/documents for completeness and accuracy
• Conduct and distribute reports on a regular basis and special reports on request; identify issues and recommend actions
• Provide external marketplace information to Recruiting Leadership
Cooperation and Communication:
• Establish and maintain relationships with stakeholder groups
Qualifications: Education: Graduate preferred. Work Experience: 1-2 years of relevant work experience.
Knowledge or Skills Requirements:
• Good understanding of the company's business objectives and organizational structure
• Good understanding of recruiting processes and procedures
• Basic understanding of recruiting or sourcing processes, organization, and tools
• Good understanding of recruiting documents
• Good understanding of the external marketplace and trends
• Data analysis experience and analytical skills
• Good team player
• Good communication and interpersonal skills
Responsibilities
• Must Have: 1-2 years of total experience
• Must Have: Good communication skills
• Must Have: Stakeholder management experience
• Must Have: Volume handling experience
• Good to Have: Strong team player
• Good to Have: MBA
Salary: Rs. 20,000 - Rs. 33,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
SQL Server 2012 Integration Services (SSIS)
• Maintain and migrate DTS packages to SSIS with enhanced logging and error handling.
• Design and develop SSIS packages and optimize existing ones.
• Perform SQL Server version upgrades (e.g., 2008 to 2016/2019), focusing on:
• Application-level changes (T-SQL rewrites, deprecated feature handling).
• Refactoring stored procedures, triggers, views, and functions for compatibility.
• Updating connection strings, drivers, and linked server configurations.
• Conduct impact analysis for version differences (syntax changes, behavior changes).
• Execute regression testing and application queries post-migration.
• Collaborate with QA teams for data validation and performance benchmarking.
• Document migration steps, compatibility issues, and resolution strategies.
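The impact-analysis step above (flagging deprecated T-SQL before an upgrade) can be sketched as a simple source scan. A minimal sketch, assuming a small illustrative list of deprecated constructs; a real upgrade assessment would use Microsoft's full deprecated-features list and tooling:

```python
import re

# Illustrative (not exhaustive) deprecated T-SQL constructs to flag before
# a 2008 -> 2016/2019 upgrade; the pattern list is an assumption for this sketch.
DEPRECATED_PATTERNS = {
    "old-style outer join (*= / =*)": re.compile(r"\*=|=\*"),
    "TEXT/NTEXT/IMAGE column type": re.compile(r"\b(NTEXT|TEXT|IMAGE)\b", re.IGNORECASE),
    "RAISERROR with legacy syntax": re.compile(r"RAISERROR\s+\d", re.IGNORECASE),
    "SET ROWCOUNT used for DML": re.compile(r"SET\s+ROWCOUNT\s+\d", re.IGNORECASE),
}

def scan_tsql(script: str) -> list[str]:
    """Return the names of deprecated constructs found in a T-SQL script."""
    return [name for name, pat in DEPRECATED_PATTERNS.items() if pat.search(script)]
```

Run over every stored procedure, trigger, view, and function exported from the source instance, this produces the raw material for the compatibility and refactoring items listed above.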
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
IT Support Analyst
Your role
Are you an expert when it comes to customer service? Are you passionate about being part of a global support team and helping the business meet or exceed its targets? Do you have a proven ability to solve complex technical issues?
We’re looking for an IT support analyst to:
• Act as a single point of contact for phone calls and emails from staff regarding IT issues and queries
• Receive, log and manage calls from internal staff via telephone and web portals
• Provide first-line support: troubleshooting of IT-related issues for Windows 10/11 and MS Outlook; knowledge of various mobile devices (Apple, Android, Windows)
• Escalate unresolved tickets to the infrastructure support team
• Log all calls in SNOW (ServiceNow)
• Ability to multi-task and prioritize workload
• Ability to adapt to continuously changing procedures and environment
• Take ownership of user problems and follow up the status of problems on behalf of the user and communicate progress in a timely manner
• Maintain a high degree of customer service for all support queries and adhere to all service management principles
• Provide stats for the weekly Service Desk report on call trends
• Publish support documentation to assist staff with requests for information & provide staff training if required
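The weekly call-trend statistic mentioned above can be produced from an exported ticket list with a few lines of stdlib Python. A minimal sketch; the field names (`opened_at`, `category`) are assumptions about the export, not ServiceNow's actual schema:

```python
from collections import Counter
from datetime import date

def weekly_call_trends(tickets: list[dict]) -> dict[str, Counter]:
    """Group tickets by ISO week and count categories within each week.

    Each ticket is assumed to carry an 'opened_at' date and a 'category'
    string (hypothetical export fields, not ServiceNow's real schema).
    """
    trends: dict[str, Counter] = {}
    for t in tickets:
        iso = t["opened_at"].isocalendar()
        week = f"{iso[0]}-W{iso[1]:02d}"  # e.g. "2024-W02"
        trends.setdefault(week, Counter())[t["category"]] += 1
    return trends

tickets = [
    {"opened_at": date(2024, 1, 8), "category": "Outlook"},
    {"opened_at": date(2024, 1, 9), "category": "Windows 11"},
    {"opened_at": date(2024, 1, 8), "category": "Outlook"},
]
```

The per-week Counters feed directly into the Service Desk report's trend table.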
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Delivery Lead, you will manage the delivery of large and complex projects, ensuring that appropriate frameworks are utilized while collaborating with sponsors to effectively manage scope and risk. Your typical day will involve driving profitability and success by overseeing service quality and cost, while also measuring and communicating progress to leadership within established time frames. You will proactively support sales initiatives through innovative solutions and a commitment to delivery excellence, fostering a collaborative environment that encourages team engagement and performance.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate regular team meetings to ensure alignment and address any challenges.
- Mentor junior team members to enhance their skills and professional growth.
Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP Sales and Distribution (SD).
- Strong understanding of project management methodologies and frameworks.
- Experience in managing cross-functional teams and delivering complex projects.
- Ability to analyze project performance metrics and implement improvements.
- Excellent communication and interpersonal skills to engage with stakeholders.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Sales and Distribution (SD).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Salary: Rs. 0 - Rs. 1,80,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
JD 1: Data Engineer
Number of Positions: multiple, across two experience bands (3-6 years and 6-9 years)
Details:
As a Data Engineer with expertise in PySpark, Databricks, and Microsoft Azure, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines and processing systems. You will work closely with data scientists, analysts, and other stakeholders to ensure our data solutions are efficient, reliable, and scalable.
Responsibilities:
• Design, develop, and optimize ETL pipelines using PySpark and Databricks to process large-scale data on the Azure cloud platform.
• Implement data ingestion processes from various data sources into Azure Data Lake and Azure SQL Data Warehouse.
• Develop and maintain data models, data schemas, and data transformation logic tailored for Azure.
• Collaborate with data scientists and analysts to understand data requirements and deliver high-quality datasets.
• Ensure data quality and integrity through robust testing, validation, and monitoring procedures.
• Optimize and tune PySpark jobs for performance and scalability within the Azure and Databricks environments.
• Implement data governance and security best practices in Azure.
• Monitor and troubleshoot data pipelines to ensure timely and reliable data delivery.
• Document data engineering processes, workflows, and best practices specific to Azure and Databricks.
Requirements:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Proven experience as a Data Engineer with a strong focus on PySpark and Databricks.
• Proficiency in Python and PySpark for data processing and analysis.
• Strong experience with Azure data services, including Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, and Azure Databricks.
• Strong SQL skills and experience with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
• Experience with big data technologies such as Hadoop, Spark, Hive, and Kafka.
• Strong understanding of data architecture, data modeling, and data integration techniques.
• Familiarity with Azure DevOps, version control systems (e.g., Git), and CI/CD pipelines.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
Preferred Qualifications:
• Experience with Delta Lake on Azure Databricks.
• Knowledge of data visualization tools (e.g., Power BI, Tableau).
• Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
• Understanding of machine learning concepts and experience working with data scientists.
Skills
-------------
• Azure Data Factory: Experience in creating and orchestrating
data pipelines, understanding of triggers and data flows.
• Databricks: Knowledge of Apache Spark, programming in Python,
Scala or R, experience optimizing data processing and
transformation jobs. Experience querying databases and tables
in SQL.
• Azure Data Lake Storage: Experience working with ADLS Gen 1
and Gen 2, knowledge of hierarchy, file systems and security
aspects.
• Azure DevOps: Experience working with repositories, pipelines,
builds and releases, understanding CI/CD processes.
• Data integration: Knowledge of various data sources and data
formats such as JSON, CSV, XML, Parquet and Delta. Also
knowledge of databases such as Azure SQL, MySQL or
PostgreSQL.
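The data-format knowledge listed above can be illustrated with a small normalization helper. A stdlib-only sketch covering JSON and CSV; Parquet and Delta would need external libraries (e.g. pyarrow), which this deliberately avoids:

```python
import csv
import io
import json

def load_records(payload: str, fmt: str) -> list[dict]:
    """Normalize JSON or CSV text into a list of dicts (one per record).

    Only stdlib formats are shown; Parquet/Delta require external
    libraries and are out of scope for this sketch.
    """
    if fmt == "json":
        data = json.loads(payload)
        # A single JSON object becomes a one-record list.
        return data if isinstance(data, list) else [data]
    if fmt == "csv":
        # DictReader uses the header row as field names; values stay strings.
        return list(csv.DictReader(io.StringIO(payload)))
    raise ValueError(f"unsupported format: {fmt}")
```

Downstream steps (typing, validation, loading) then operate on one record shape regardless of source format.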
Tasks
----------
• Data extraction: Identifying and extracting data from various
sources such as databases, APIs, file systems and external
services.
• Data transformation: Data cleaning, enrichment and normalization
according to project requirements.
• Data loading: Loading the transformed data into target databases,
data warehouses or data lakes.
• Data pipeline development: Implementing and automating ETL or
ELT processes using Azure Data Factory and Databricks.
• Monitoring and Troubleshooting: Monitoring data pipelines,
identifying issues and implementing fixes.
• Data integration: Developing interfaces and integration solutions
for various data sources and platforms.
• Performance optimization: Analyzing and improving the
performance of data pipelines and processing jobs.
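The transformation task above (cleaning, enrichment, normalization) can be sketched in plain Python; PySpark itself needs a cluster or local Spark session, so this shows the same per-row logic on ordinary dicts, with illustrative field names rather than a real project schema:

```python
def transform(rows: list[dict]) -> list[dict]:
    """Clean, enrich, and normalize raw rows: drop unusable records,
    canonicalize strings, apply defaults, and coerce numeric types.
    Field names ('email', 'country', 'amount') are illustrative."""
    out = []
    for row in rows:
        email = (row.get("email") or "").strip().lower()
        if not email:  # cleaning: drop rows without a usable key
            continue
        out.append({
            "email": email,                                         # normalization
            "country": (row.get("country") or "unknown").upper(),   # enrichment default
            "amount": round(float(row.get("amount", 0) or 0), 2),   # type coercion
        })
    return out
```

In PySpark the same logic would typically be expressed as DataFrame column operations rather than a Python loop.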
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Custom Software Engineer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are tailored to enhance operational efficiency. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior professionals to foster their growth and development.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Azure Databricks.
- Good to Have Skills: Experience with Microsoft SQL Server Integration Services (SSIS).
- Strong understanding of cloud computing concepts and architecture.
- Experience in application development using various programming languages.
- Familiarity with agile methodologies and project management tools.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Salary: Rs. 0 - Rs. 1,80,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Custom Software Engineer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather requirements, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and optimization.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation in and contribution to team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.
Professional & Technical Skills:
- Must Have Skills: Proficiency in ServiceNow IT Service Management.
- Strong understanding of IT service management processes and frameworks.
- Experience with application development and configuration in ServiceNow.
- Familiarity with scripting languages relevant to ServiceNow.
- Ability to troubleshoot and resolve application issues effectively.
Additional Information:
- The candidate should have a minimum of 3 years of experience in ServiceNow IT Service Management.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Salary: Rs. 0 - Rs. 1,65,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Seeking a Senior Apigee Developer who can play a key role in ensuring the reliability, performance, and security of our APIs, and act as a subject matter expert (SME) for the Apigee Edge/X/Hybrid stack, with experience across CI/CD pipelines, ELK, microservices, Azure Functions, BigQuery, Power BI, and APIs.
Key Responsibilities:
API Development & Management:
• Design, develop, and deploy API proxies, target endpoints, and shared flows on both Apigee Edge and Apigee X platforms, adhering to architectural standards and best practices.
• Implement various Apigee policies (e.g., security, traffic management, mediation, error handling) to enforce API governance and optimize performance.
• Manage and maintain API configurations, environments, and virtual hosts within Apigee.
CI/CD Pipeline Contribution:
• Collaborate with DevOps and Framework teams to evolve and maintain CI/CD pipelines for API deployment, testing, and promotion across different environments.
• Conduct rigorous testing of CI/CD pipeline changes to validate that API deployments function flawlessly and that all integrated components work as expected post-promotion.
Comprehensive API Testing & Quality Assurance:
• Perform end-to-end integration testing, including scenarios where external systems (e.g., ServiceNow forms) consume Apigee APIs. This involves verifying UI functionality in the consuming application and meticulously validating API requests, responses, and data integrity at the Apigee layer.
• Ensure APIs consistently return accurate data and appropriate response codes (e.g., 2xx, 4xx, 5xx).
Apigee Migration & Defect Resolution:
• Actively participate in the migration from Apigee Edge to Apigee X, identifying and resolving compatibility issues and technical challenges.
• Analyze, troubleshoot, and fix defects related to Apigee proxy functionality, policies, shared flows, and overall API performance during and after migration.
Automation & Scripting:
• Develop and maintain robust PowerShell scripts for critical operational tasks.
Documentation & Collaboration:
• Create and maintain comprehensive documentation for API designs, configurations, shared flows, and operational procedures.
• Collaborate closely with product owners, architects, other developers, and QA teams to deliver high-quality, scalable API solutions.
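The response-code expectation above (2xx, 4xx, 5xx) is the kind of check a post-promotion regression test can automate. A stdlib-only sketch of the status-class check itself, not Apigee's own test tooling:

```python
def status_class(code: int) -> str:
    """Map an HTTP status code to its class string (e.g. 201 -> '2xx')."""
    if not 100 <= code <= 599:
        raise ValueError(f"not an HTTP status code: {code}")
    return f"{code // 100}xx"

def check_response(code: int, expected_class: str) -> bool:
    """True when the observed status falls in the expected class."""
    return status_class(code) == expected_class
```

In practice these asserts would wrap calls made through the Apigee proxy after each pipeline promotion, failing the build when a flow returns the wrong class.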
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
JD: Should have 10-12 years of SAP FICO consulting, implementation, or production support experience, with knowledge of FICO, SAP RAR, Treasury, and FSCM (Dispute, Collection & Credit modules).
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance