We found 84 jobs matching your search

Job Description

Create responsive UI and components from prototypes. Work on SDK integration and native modules when required by the project. Create interactive mobile applications. Strong knowledge of React Native, Figma, JavaScript, Git, ES6, and API integration.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: UI/UX Designer

Job Description

Minimum 2+ years of Dell Boomi experience with UKG Pro, covering the following interfaces: Employee Data import, Business Structure import, Approved Time Off Request import, Payroll export, Accrual Balance export, Worked Hours export, UKG Pro Outbound, Person Export Outbound, Labor Category interface, and Vacation Payout interface. Design, build, test, and deploy integrations between UKG Pro and external systems using Dell Boomi. Identify and use the UKG Pro APIs. Ensure seamless data exchange and synchronization between systems. Configure and manage Boomi integrations, maps, and workflows. Develop and maintain Boomi APIs and connectors. Optimize Boomi performance and troubleshoot issues. Understand UKG Pro business processes and data structures. Collaborate with UKG Pro subject matter experts to identify integration requirements. Ensure integrations align with UKG Pro configuration and security standards. Additional requirements: data mapping and transformation, integration protocols (e.g., SOAP, REST, HTTP), and Dell Boomi certification.
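
As a loose illustration of the REST side of this integration work, the sketch below pulls records from a placeholder HTTP endpoint with basic authentication. It is not the actual UKG Pro or Boomi API; the base URL, path, query parameters, and credentials are hypothetical stand-ins for whatever the vendor documentation defines.

```python
import requests

# Hypothetical endpoint and credentials -- the real UKG Pro API paths,
# authentication scheme, and payload fields are vendor-defined and not shown here.
BASE_URL = "https://example-tenant.invalid/api"


def fetch_employees(session: requests.Session) -> list:
    """Fetch one page of employee records from the placeholder endpoint."""
    response = session.get(f"{BASE_URL}/employees", params={"per_page": 100}, timeout=30)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    with requests.Session() as session:
        session.auth = ("api_user", "api_password")  # placeholder credentials
        employees = fetch_employees(session)
        print(f"Fetched {len(employees)} employee records")
```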

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Dell Boomi

Job Description

- Data Pipeline Design & Development:
  - Build, optimize, and manage scalable data pipelines using Apache Spark with PySpark and Scala.
  - Collaborate with cross-functional teams to translate business requirements into effective data models and ETL solutions.
  - Develop and maintain data processing jobs using Spark in a cloud environment with Snowflake.
- Data Warehousing:
  - Manage structured and unstructured data at scale, ensuring accuracy, consistency, and reliability.
  - Perform data integration from various sources, including batch and real-time data streams.
- Performance Optimization:
  - Tune Spark jobs for performance, reliability, and scalability.
  - Identify bottlenecks and devise innovative solutions to enhance processing efficiency.
- Collaboration & Communication:
  - Work closely with data scientists, analysts, and product teams to understand their data needs and provide timely solutions.
  - Contribute to improving data engineering processes and best practices across the organization.
- Monitoring & Troubleshooting:
  - Implement monitoring solutions to ensure data integrity and quality.
  - Troubleshoot and resolve any issues related to data processing, pipeline failures, or performance concerns.

Skills - Technical Expertise:
- Proven experience in PySpark and Scala development, with hands-on experience designing and implementing data pipelines.
- Strong understanding of Apache Spark architecture and its distributed processing capabilities.
- Proficiency in working with Hadoop ecosystems (HDFS, Hive, etc.) and data lakes.
- Experience with cloud data platforms (AWS Glue, Snowflake, etc.).
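
As a rough illustration of the pipeline work described above, the sketch below builds a minimal PySpark batch job: read raw events, apply a simple cleanup and aggregation, and write the curated output. The input/output paths and column names are made up for the example; a job targeting Snowflake would additionally configure the Spark-Snowflake connector, which is omitted here.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal PySpark job sketch. Paths and column names are placeholders.
spark = SparkSession.builder.appName("daily_orders_pipeline").getOrCreate()

# Read raw batch data (hypothetical location and schema).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Basic cleanup and aggregation: one row per customer per day.
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("order_count"))
)

# Write the curated output; a Snowflake-bound job would instead use the
# Spark-Snowflake connector's write options at this point.
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")

spark.stop()
```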

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Python PySpark ETL

Job Description

8+ years of experience in the IT industry in Data Engineering and Data Analyst roles, with Python skills. Around 8 years of development experience with Snowflake and SQL Server. Proficient in writing SQL queries, including window functions and stored procedures. Good communication skills and strong analytical and problem-solving abilities.
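
Since the listing calls out window functions specifically, here is a small, assumed example using the snowflake-connector-python package to run a ROW_NUMBER() query. The account, credentials, and the orders table with its columns are placeholders, not details taken from the job posting.

```python
import snowflake.connector

# Placeholder connection details -- substitute a real account, user, and warehouse.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="EXAMPLE_WH",
    database="EXAMPLE_DB",
    schema="PUBLIC",
)

# ROW_NUMBER() ranks each customer's orders by recency; table and columns are hypothetical.
QUERY = """
SELECT
    customer_id,
    order_id,
    order_date,
    ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS order_rank
FROM orders
ORDER BY customer_id, order_rank
"""

try:
    cursor = conn.cursor()
    for customer_id, order_id, order_date, order_rank in cursor.execute(QUERY):
        print(customer_id, order_id, order_date, order_rank)
finally:
    conn.close()
```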

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Snowflake, SQL Server, Python

Job Description

- Campaign Development and Execution:
  - Design and develop personalized, multi-channel marketing campaigns using Adobe Campaign Classic.
  - Build and optimize delivery workflows, audience segmentation, and targeting strategies.
  - Configure recurring and trigger-based campaigns.
- Template and Workflow Development:
  - Develop dynamic and responsive email templates using HTML, CSS, and scripting languages like JavaScript and Velocity.
  - Create and optimize campaign workflows, data workflows, and complex processes in Adobe Campaign.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Adobe Campaign Classic Developer

Job Description

Analyze and interpret large datasets, identifying patterns, trends, and anomalies. Develop and maintain SQL queries to extract relevant data from various sources. Create data visualizations and dashboards to communicate findings effectively. Collaborate with cross-functional teams to understand business requirements and translate them into actionable insights. Develop data pipelines and automation workflows to streamline data analysis processes. Stay updated on the latest advancements in GPT, SQL, and data analysis techniques. Able to coordinate and pass information between stakeholders and the development/architecture team. Capable of conducting POCs and feasibility studies for any given requirement.
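
As a small, assumed illustration of the pattern-and-anomaly analysis mentioned above, the sketch below loads a CSV into pandas, flags outliers with a simple z-score rule, and charts the daily trend. The file name, column names, and threshold are placeholders rather than anything specified in the listing.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical input: a CSV of daily metric values with 'date' and 'value' columns.
df = pd.read_csv("daily_metrics.csv", parse_dates=["date"])

# Flag anomalies with a simple z-score rule (a threshold of 3 is a common default).
mean, std = df["value"].mean(), df["value"].std()
df["z_score"] = (df["value"] - mean) / std
anomalies = df[df["z_score"].abs() > 3]
print(f"Flagged {len(anomalies)} anomalous days out of {len(df)}")

# Visualize the trend and highlight the flagged points.
fig, ax = plt.subplots(figsize=(10, 4))
ax.plot(df["date"], df["value"], label="daily value")
ax.scatter(anomalies["date"], anomalies["value"], color="red", label="anomaly")
ax.set_xlabel("date")
ax.set_ylabel("value")
ax.legend()
fig.savefig("daily_metrics_anomalies.png")
```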

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Analyst

Job Description

Provide guidance and consultation to application technical contacts on completing annual self-attestations for assigned applications. Audit and review supporting artifacts provided by Application SMEs to ensure compliance with IAM controls. Act as the IAM point of contact for a set of assigned applications and manage the enterprise end-to-end IAM lifecycle for these applications. Complete access request processing per pre-defined procedures and within agreed Service Level Agreements (SLAs) and Operational Level Agreements (OLAs), resolve problem tickets related to corporate and commercial applications, and assist other security analysts as needed. Perform role mining and role engineering analysis for role-based access during enrollment of enterprise-wide applications, including acquired entities' applications. Identify process automation opportunities in existing access management practices.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: IAM Consultant

Job Description

Bachelor’s degree in computer science or a related field. 5+ years of experience with SAP UI5 & Fiori Elements, RAP, and SAP BTP UX services. Expertise in SAPUI5 and Fiori end-to-end implementation with backend (ABAP/OData/CDS/RAP). Expertise in extending SAP standard Fiori applications. Minimum of 3 years of SAP ABAP experience with CDS annotations and RAP. Extensive use of OOP concepts and awareness of the latest SAP coding standards. Understanding of ABAP authorizations and Fiori Catalog/Group authorization concepts. Ability to interact and coordinate with teams across locations. Strong knowledge of responsive design principles. Excellent communication skills.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP Fiori

Job Description

SAP Basis and DBA Administration:
1. Perform installation, configuration, and maintenance of SAP S/4HANA systems.
2. Execute SAP system upgrades, support package installations, kernel updates, and add-on installations.
3. Manage Transport Management System (TMS) and client administration.
4. Handle user management, roles, and authorizations in SAP systems.
5. Manage and monitor SAP HANA databases, including backups, recoveries, and performance optimization.
6. Execute database upgrades, revisions, and patches.
7. Hands-on experience with NetWeaver to S/4HANA migration projects.
8. Knowledge of operating systems (Linux, Windows) and virtualization technologies.
9. Familiarity with SAP tools like SAP Solution Manager, SWPM, and HDBLCM.
10. Understanding of high availability (HA) and disaster recovery (DR) configurations.
11. Experience in cloud environments (e.g., AWS, Azure, GCP) for SAP workloads.
12. Strong skills in SQL and database scripting.

Technical Support and Documentation:
1. Provide technical support for SAP landscapes, ensuring compliance with SLAs.
2. Document system configurations, operational procedures, and troubleshooting guides.
3. Conduct root cause analysis and implement solutions for recurring issues.

Soft Skills:
1. Excellent problem-solving and troubleshooting abilities.
2. Strong communication and teamwork skills.
3. Ability to prioritize tasks and work under pressure.

Required Skills and Qualifications:
1. Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
2. Experience: Minimum of 5 years in SAP Basis and Database Administration, with hands-on experience in SAP S/4HANA.
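
As a rough sketch of the SQL and scripting side of HANA administration, the snippet below uses SAP's hdbcli Python client to run a simple monitoring query. The host, port, and credentials are placeholders, and M_DATABASE is cited as a typical HANA monitoring view; adapt both to the actual landscape and the views relevant to the task.

```python
from hdbcli import dbapi

# Placeholder connection details for a HANA system (host, SQL port, credentials).
conn = dbapi.connect(
    address="hana-host.example.invalid",
    port=30015,
    user="MONITOR_USER",
    password="monitor_password",
)

try:
    cursor = conn.cursor()
    # M_DATABASE is a standard HANA monitoring view with basic instance information.
    cursor.execute("SELECT DATABASE_NAME, VERSION, USAGE FROM M_DATABASE")
    for database_name, version, usage in cursor.fetchall():
        print(database_name, version, usage)
finally:
    conn.close()
```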

  • Salary: Rs. 0.0 - Rs. 22,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP BASIS and HANA DB Administrator