We found 3 jobs matching your search

Job Description

  • 2-4 years of work experience in a Big Data or Hadoop platform environment; work experience in ETL/data warehousing technologies is a plus
  • Good knowledge of ETL architecture and Hadoop platform architecture
  • Development experience with PySpark and SparkSQL, with strong analytical and debugging skills
  • Hands-on experience in Python scripting; experienced in SQL
  • Working knowledge of big data infrastructure such as the Hadoop ecosystem (HDFS, Hive, Spark, etc.)
  • Able to modify existing code for new requirements; unit testing and debugging
  • Perform root cause analysis (RCA) for any failed processes
  • Good communication skills (verbal and written), with the ability to communicate across internal and external teams at all levels
  • Develop code for intermediate modules, following documentation and development standards
  • Conduct basic module and integration testing according to process standards; track and resolve moderate defects
  • Highly motivated, with flexibility to adapt to changing priorities and assignments, the ability to work well under pressure, and the ability to work independently
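The ETL work this role centers on can be sketched minimally in plain Python. The snippet below uses the standard-library sqlite3 as a stand-in for the warehouse (in practice this would be PySpark with Hive/HDFS), and the table, column names, and sample rows are hypothetical:

```python
import sqlite3

# Extract: raw source rows (in practice read from HDFS/Hive via PySpark).
raw_orders = [
    ("1001", "2024-01-05", " 250.00"),
    ("1002", "2024-01-06", "125.50 "),
    ("1003", "2024-01-06", None),  # bad record: missing amount
]

# Transform: clean types and drop records that fail validation.
def transform(rows):
    for order_id, order_date, amount in rows:
        if amount is None:
            continue  # a real pipeline would route this to an error table for RCA
        yield (int(order_id), order_date, float(amount.strip()))

# Load: write the cleaned rows into the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_orders (order_id INTEGER, order_date TEXT, amount REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", transform(raw_orders))

total = conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone()
print(total)  # → (2, 375.5)
```

The same extract/transform/load split carries over to PySpark directly: the generator becomes a DataFrame transformation and the load step a write to a Hive table.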

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Big Data & Hadoop Developer

Job Description

Desired Skills:
  • BS in Computer Science or a related technical degree
  • 5+ years of experience in data analytics and modeling
  • 3+ years of hands-on experience in Big Data, cloud computing, and NoSQL technologies such as Hive, Sqoop, Elastic, Scala, Pig, Flume, HBase, Mahout, Spark, etc.
  • Excellent communication skills: written, verbal, presentation, influencing, coordination/mediation; able to convey the key points of complex topics concisely

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Scientist

Job Description

  • Hands-on experience building and optimizing conceptual/logical database models and flow charts
  • Experience designing, prototyping, and delivering applications and solutions on emerging technologies such as Natural Language Processing (NLP), Machine Learning (ML), and Artificial Intelligence (AI), to meet business needs for cost-efficiency, improved quality, and agility
  • Review and approve data model designs to ensure the integrity, accuracy, and quality of data within enterprise applications (data warehouse, big data platforms, etc.)
  • Maintain and manage the enterprise data architecture blueprint
  • Establish standards and best practices for data life cycle management
  • Lead the identification of technologies and tools for data management, data masking, and data quality
  • Create a process framework that engages all stakeholders in implementing data quality rules and security access across databases
  • Coach and mentor the business and application teams on data management
  • Migrate data from legacy systems to new solutions
  • Help review application system performance and suggest improvements
  • Hands-on experience optimizing new and current database systems
  • Good exposure to defining security and backup procedures
  • Coordinate with the business and other stakeholders to identify future needs and requirements
  • Experience defining corporate-wide data strategy and data governance, and implementing data governance initiatives
  • Experience in data warehousing, relational and dimensional data modelling, and performance tuning of large-scale data warehouses
  • Experience designing and developing data models, integrating data from multiple sources, and designing data flows
  • Knowledge of data mining and segmentation techniques
  • Experience in the Banking, Financial Services, and Insurance domain is an additional advantage

Basic Qualifications:
  • Bachelor's degree in IT, Computer Science, Software Engineering, or equivalent
  • Experience gathering and analysing system requirements
  • Strong organizational and project management skills; able to handle multiple projects, deadlines, and people, with frequent priority changes
  • Expertise in SQL and Oracle, with knowledge of other RDBMS and NoSQL databases
  • Knowledge of Excel, PowerPoint, and Word, with adaptability to other software products
  • Solid understanding of, and experience implementing, master data management and metadata management
  • Proven analytical and problem-solving skills
  • Excellent communication skills, both written and verbal
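The dimensional data modelling this role calls for can be illustrated with a minimal star schema: one fact table keyed to two dimensions. The sketch below uses the standard-library sqlite3 rather than a real warehouse engine, and all table, column, and sample values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A minimal star schema: a sales fact table keyed to customer and date dimensions.
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, segment TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT, quarter TEXT);
CREATE TABLE fact_sales   (customer_key INTEGER REFERENCES dim_customer,
                           date_key INTEGER REFERENCES dim_date,
                           amount REAL);
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme Ltd", "Enterprise"), (2, "Bee Corp", "SMB")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240105, "2024-01-05", "Q1"), (20240410, "2024-04-10", "Q2")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240105, 500.0), (2, 20240105, 150.0), (1, 20240410, 300.0)])

# A typical dimensional query: revenue by customer segment and quarter.
rows = cur.execute("""
    SELECT c.segment, d.quarter, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c USING (customer_key)
    JOIN dim_date d USING (date_key)
    GROUP BY c.segment, d.quarter
    ORDER BY c.segment, d.quarter
""").fetchall()
print(rows)  # → [('Enterprise', 'Q1', 500.0), ('Enterprise', 'Q2', 300.0), ('SMB', 'Q1', 150.0)]
```

The design choice the schema embodies is the usual warehouse trade-off: facts stay narrow and numeric, while descriptive attributes live in the dimensions so slicing by segment or quarter never touches the raw transaction detail.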

Responsibilities

Business Solutions Architect (Data)
  • Salary: Rs. 140.0 - Rs. 145.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Business Solutions Architect (Data)