We found 188 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Job Description: A Big Data Engineer is an IT professional specializing in the design, development, and maintenance of systems that handle and process large, complex datasets (big data). They are responsible for building the infrastructure and pipelines that allow organizations to collect, store, process, and analyze vast amounts of data, often in the terabyte and petabyte range.

Key Responsibilities of a Big Data Engineer:

  • Designing and Building Data Infrastructure: Creating the architecture and systems for storing and processing massive amounts of data.
  • Data Collection and Processing: Gathering data from various sources, cleaning and transforming it, and preparing it for analysis.
  • Developing Data Pipelines: Building pipelines to move data from source systems to data storage and processing systems.
  • Ensuring Data Quality and Integrity: Implementing processes to ensure the accuracy and reliability of the data.
  • Optimizing Performance: Improving the efficiency and scalability of data storage and processing systems.
  • Collaboration with Other Teams: Working with data scientists, analysts, and other teams to support data-driven decision-making.
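The collect-clean-load duties described above can be sketched as a minimal extract-transform-load flow. This is an illustrative sketch only: the field names (`user_id`, `amount`) and the in-memory "warehouse" are hypothetical stand-ins for real source systems and storage.

```python
# Minimal ETL pipeline sketch: collect -> clean/transform -> load.
# Field names and the in-memory "warehouse" are hypothetical stand-ins
# for real data sources and a real storage layer.

def extract(source):
    """Collect raw records from a source system."""
    return list(source)

def transform(records):
    """Drop incomplete rows and normalize the amount field."""
    cleaned = []
    for rec in records:
        if rec.get("user_id") is None or rec.get("amount") is None:
            continue  # data-quality gate: reject incomplete records
        cleaned.append({"user_id": rec["user_id"],
                        "amount": round(float(rec["amount"]), 2)})
    return cleaned

def load(records, warehouse):
    """Append processed records to the storage layer."""
    warehouse.extend(records)
    return len(records)

raw = [{"user_id": 1, "amount": "10.5"},
       {"user_id": None, "amount": "3"},
       {"user_id": 2, "amount": "7.25"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 records survive the quality gate
```

At production scale the same three stages would run on distributed tooling (Spark, Kafka, cloud storage) rather than Python lists, but the shape of the pipeline is the same.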

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Cyber Security - GRC - Data Security

Job Description

Job Description: Security Administrator (Offshore)

  • Security Configuration and Maintenance: Configure and maintain security groups, domain, and business process security policies to ensure robust access control.
  • Account Lifecycle Management: Oversee the creation, updating, and deactivation of user accounts, ensuring accurate and secure account management.
  • Incident Management: Provide Level 2 support by managing and resolving security-related incidents efficiently.
  • Emergency Access Management: Administer and monitor emergency access (Break Glass) procedures to ensure secure and controlled access during critical situations.
  • Compliance and Training Support: Assist in compliance reviews and facilitate security training to promote adherence to security policies and standards.
  • Access Management for Lower Environments: Manage and oversee access to lower environments, such as sandbox and implementation tenants, ensuring appropriate access controls are in place.
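The account-lifecycle and security-group duties above can be sketched roughly as follows. The group names and in-memory account store are hypothetical; a real tenant would use the platform's own security-administration tooling rather than code like this.

```python
# Sketch of account lifecycle management with group-based access control.
# Group names and the in-memory store are hypothetical illustrations.

accounts = {}

def create_account(user, groups):
    """Provision a new active account with its security groups."""
    accounts[user] = {"active": True, "groups": set(groups)}

def deactivate_account(user):
    """Disable the account and revoke all group memberships."""
    accounts[user]["active"] = False
    accounts[user]["groups"].clear()  # access is removed on deactivation

def has_access(user, required_group):
    """Access requires an existing, active account in the right group."""
    acct = accounts.get(user)
    return bool(acct and acct["active"] and required_group in acct["groups"])

create_account("jdoe", ["hr_readonly"])
print(has_access("jdoe", "hr_readonly"))  # True
deactivate_account("jdoe")
print(has_access("jdoe", "hr_readonly"))  # False
```

The key design point mirrored from the posting is that deactivation revokes access immediately, which is what an account-lifecycle process is meant to guarantee.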

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Security Administrator

Job Description

COBOL, JCL, VSAM and DB2

Responsibilities

COBOL, JCL, VSAM and DB2
  • Salary: Rs. 10,00,000.0 - Rs. 20,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: COBOL, JCL, VSAM and DB2

Job Description

Job Responsibilities

1. Deeply understand corporate business processes, especially D365 application scenarios, with a focus on post-sales service business and processes. Through interviews and business observations, comprehensively collect requirements from business departments, accurately identify business pain points, clarify business objectives, and create detailed, professional D365 business requirement documents as a solid foundation for system configuration, customization, and development. Leverage post-sales service knowledge to engage with customers during the post-sales phase, gather feedback on system usage, and ensure continuous improvement of the D365 solution based on real-world post-sales experience.

2. Conduct in-depth analysis of business data generated by the enterprise in the D365 system. Use data analysis tools to mine data value, combine industry data with the enterprise's actual situation, identify potential problems and optimization directions in business operations, and provide data-driven support for business decisions and D365 system function optimization. Apply understanding of post-sales service processes to analyze data on customer support requests, system issues, and user satisfaction, identify areas for enhancement, and inform post-implementation improvements from a post-sales service perspective.

3. Participate in the review and optimization of enterprise business processes. Relying on the D365 system and expertise in post-sales service business and processes, analyze how well existing processes fit system functions and uncover process bottlenecks and redundant links. Propose process improvement plans that integrate D365 system features, industry best practices, and post-sales service requirements, enhancing business operation efficiency, system utilization, and post-sales service quality. Collaborate with the post-sales team to understand customer pain points during system operation and incorporate these insights into process optimization efforts.

4. Collaborate closely with the D365 development team. Translate business requirements accurately into functional requirements the system can implement, and help technical personnel understand the business logic. Act as a bridge between business and technology during system development, configuration, and customization; promptly resolve disagreements between the two sides so that projects progress smoothly and meet business expectations. During post-sales, work with developers to troubleshoot system-related problems reported by customers, translating customer concerns into actionable technical tasks aligned with post-sales service needs.

5. Regularly review and evaluate the application effect of the D365 system and the results of business analysis. Collect feedback from business departments, continuously optimize business analysis methods and processes based on the D365 system, promote a deep fit between system functions and business requirements, and raise the enterprise's level of digital operations. In particular, focus on post-sales feedback to drive iterative improvements, ensuring long-term success and satisfaction.
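The data-analysis duty of mining support data for pain points can be illustrated with a tiny aggregation. The ticket categories and counts below are made-up sample data, not anything from a real D365 tenant.

```python
# Illustrative sketch: aggregate post-sales support tickets by category
# to surface the biggest pain points. All data here is invented.
from collections import Counter

tickets = [
    {"id": 1, "category": "login"},
    {"id": 2, "category": "reporting"},
    {"id": 3, "category": "login"},
    {"id": 4, "category": "performance"},
    {"id": 5, "category": "login"},
]

# most_common() sorts categories by ticket volume, highest first
pain_points = Counter(t["category"] for t in tickets).most_common()
print(pain_points)  # [('login', 3), ('reporting', 1), ('performance', 1)]
```

In practice the same ranking would come from BI tooling over real ticket data, but the idea is identical: count issues per category and prioritize the largest buckets.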

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Microsoft Dynamics 365 CRM Technical

Job Description

Azure Data Factory, SQL (Advanced), SSIS, Databricks, SDFS

  • Azure Data Factory (ADF) pipelines and PolyBase
  • Good knowledge of Azure Storage, including Blob Storage, Data Lake, Azure SQL, Data Factory V2, and Databricks
  • Can work on streaming analytics and various data services in Azure, such as Data Flow
  • Ability to develop extensible, testable, and maintainable code
  • Good understanding of the challenges of enterprise software development
  • Track record of delivering high-volume, low-latency distributed software solutions
  • Experience of working in Agile teams
  • Experience of the full software development lifecycle, including analysis, design, implementation, testing, and support
  • Experience of mentoring junior developers and directing/organizing the work of a team
  • Client-facing, technical role; assertive; strong team-member skills

Good-to-Have:

  • Experience of data warehouse applications
  • Experience in TTH Domain Projects
  • Knowledge of Azure DevOps is desirable
  • Knowledge of CI/CD and DevOps practices is an advantage
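An ADF pipeline of the kind this role builds is authored as JSON. The sketch below shows the rough shape of a pipeline with a single Copy activity; the pipeline and dataset names are hypothetical and the schema is heavily abbreviated, so consult the official Azure Data Factory documentation for the full format.

```python
# Abbreviated sketch of an Azure Data Factory pipeline definition with
# one Copy activity. Names are hypothetical; the real ADF schema has
# many more required properties (typeProperties, linked services, etc.).
import json

pipeline = {
    "name": "CopyBlobToSqlPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToAzureSql",
                "type": "Copy",
                "inputs": [{"referenceName": "BlobInputDataset",
                            "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SqlOutputDataset",
                             "type": "DatasetReference"}],
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

In ADF Studio this JSON is usually generated through the visual designer; editing it directly matters mostly for source control and CI/CD deployment of pipelines.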

Responsibilities

  • Salary: Rs. 0.0 - Rs. 15.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Azure Data Factory

Job Description

Data Concepts & Data Modelling~Data Migration

Responsibilities

Data Concepts & Data Modelling~Data Migration
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Concepts & Data Modelling~Data Migration

Job Description

Mainframe, COBOL, JCL

Responsibilities

Mainframe, COBOL, JCL
  • Salary: Rs. 10,00,000.0 - Rs. 20,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Senior Mainframe Developer

Job Description

Back-end developer (Event Streaming: Kafka/Spark). Primary skills: J2EE, Spring Boot, distributed databases, BigQuery, real-time streaming (Spark). Good to have: some knowledge of .NET (C#).
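The real-time streaming idea behind the Kafka + Spark stack can be reduced to a keyed running aggregation over a stream of events. The sketch below is pure Python with an invented event shape; a real deployment would use Kafka consumers and Spark Structured Streaming rather than this loop.

```python
# Pure-Python sketch of keyed stream aggregation, the core pattern
# behind Kafka + Spark real-time pipelines. Event shape is hypothetical.
from collections import defaultdict

def process_stream(events):
    """Maintain a running count per event key, as a stateful
    streaming aggregation would."""
    counts = defaultdict(int)
    for event in events:           # stand-in for polling a Kafka topic
        counts[event["key"]] += 1  # stand-in for updating keyed state
    return dict(counts)

stream = [{"key": "click"}, {"key": "view"}, {"key": "click"}]
print(process_stream(stream))  # {'click': 2, 'view': 1}
```

The distributed versions add partitioning, checkpointing, and windowing on top, but the per-key stateful update is the same operation.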

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Digital: Kafka~Advanced Java Concepts~Digital: Spring Boot

Job Description

Job Description:

General requirements: The candidate should have sharp analytical skills and experience of working in an enterprise, and must be able to work both in a team and independently. Preferably from an Identity Access Management functional background; security knowledge preferred. Excellent communication and interpersonal skills.

Technical requirements:

  • Overall 7 years of hands-on development experience at enterprise level
  • Programming languages: Java, Python, JavaScript, with Java expertise (Jetty, JBoss, Tomcat, IDE tools, Spring Boot framework, MVC framework)
  • 3-5 years of working experience with REST APIs and JSON
  • 2 years of development experience with LDAP and Active Directory integration
  • 2 years of experience using SQL, NoSQL, and other databases
  • 2 years of working experience with OpenShift, Docker, and Jenkins
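The REST + JSON experience this role asks for boils down to validating request bodies and shaping responses. A minimal language-neutral sketch follows; the endpoint, field names, and status codes are hypothetical, and a real service would sit behind Spring Boot as the posting describes.

```python
# Minimal sketch of REST request handling: parse a JSON body, validate
# it, and return an HTTP-style (status, payload) pair. The "create user"
# endpoint and its fields are hypothetical illustrations.
import json

def handle_create_user(raw_body):
    try:
        body = json.loads(raw_body)
    except json.JSONDecodeError:
        return 400, {"error": "invalid JSON"}  # malformed request body
    if "username" not in body:
        return 400, {"error": "username is required"}  # validation failure
    return 201, {"username": body["username"], "status": "created"}

status, resp = handle_create_user('{"username": "jdoe"}')
print(status, resp)  # 201 {'username': 'jdoe', 'status': 'created'}
```

In the Spring Boot world the same three concerns (deserialization, validation, response status) map to `@RequestBody` binding, bean validation, and `ResponseEntity`, respectively.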

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java REST Web Services; Foundation: JavaScript