We found 4 jobs matching your search


Job Description

• Java/Scala, Kafka, Spark and Solr
• App-Cloud Services-L3 (Mandatory)
• Core Java-L3
• Spring-L3

(An illustrative code sketch follows this listing.)

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Bigdata
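
A hedged, minimal sketch of the Kafka-plus-Spark stack this listing centers on. The broker address, topic name, and console sink are hypothetical placeholders, and the role itself calls for Java/Scala; Python is used here only for brevity:

    from pyspark.sql import SparkSession

    # Hypothetical app name; requires the spark-sql-kafka package on the classpath.
    spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

    # Subscribe to a Kafka topic as a streaming DataFrame.
    # "broker:9092" and "events" are placeholder values.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "events")
        .load()
        .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    )

    # Echo records to the console; a real pipeline would sink to Solr or HDFS.
    query = events.writeStream.format("console").option("truncate", "false").start()
    query.awaitTermination()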

Job Description

Experience desired is as below (an illustrative sketch follows this listing):
• Creating CI pipelines using TeamCity / Jenkins
• Build automation using Maven, Gradle, etc.
• Platform monitoring with Cloudera
• Release versioning using Git
• Deployment tools like Nolio, SPUDS, Transporter
• Docker implementation with Big Data and Java skills such as Spark, Impala, Kafka, and Java 8 and above

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: DevOps with Bigdata
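
A minimal sketch of the build-automation glue this listing describes, using only Python's standard library; the Maven invocation, image tag, and working directory are hypothetical:

    import subprocess

    def run(cmd):
        """Run one build step and fail fast on a non-zero exit code."""
        print("running:", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # Hypothetical CI steps: a batch-mode Maven build, then a Docker image build.
    run(["mvn", "-B", "clean", "package"])
    run(["docker", "build", "-t", "example/app:latest", "."])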

Job Description

Must Have Skills (Top 3 technical skills only)*:
1. Bigdata
2. SQL
3. Unix

Nice to have skills (Top 2 only):
1.
2.

Detailed Job Description: The candidate should have strong experience in Hadoop, Hive, SQL, UNIX, and the basics of Java, along with strong knowledge of creating test artifacts such as test plans, test cases, and test finding reports. (An illustrative test sketch follows this listing.)

Minimum years of experience required (mention the bare minimum acceptable for this position)*: 5+

Certifications Needed (if any):

Top 3 responsibilities you would expect the Subcon to shoulder and execute*: creation of test designs, test processes, test cases, and test data; client facing; good communication skills.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Bigdata Testing-251633
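
A hedged sketch of the kind of Hive test case this listing implies, assuming the pyhive client library; the host, port, and table names are hypothetical placeholders:

    from pyhive import hive  # assumed client library

    def row_count(cursor, table):
        """Return the row count of a Hive table (table name is a trusted placeholder)."""
        cursor.execute(f"SELECT COUNT(*) FROM {table}")
        return cursor.fetchone()[0]

    conn = hive.connect(host="hive-server", port=10000)
    cur = conn.cursor()

    # Hypothetical reconciliation check: target should match source after the load.
    source, target = row_count(cur, "staging.orders"), row_count(cur, "dw.orders")
    assert source == target, f"count mismatch: {source} != {target}"
    print("row-count check passed:", source)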

Job Description

• Technical knowledge of the Big Data analytics platform – Hadoop & Informatica
• Hands-on skill in the Linux environment – file system commands (local & HDFS), log analysis
• Hands-on skill with SQL queries on Apache Hive hosted on Hadoop Hortonworks/Cloudera
• Hands-on skill with Informatica BDM mappings running internally on Sqoop & Spark
• Hands-on skill with Informatica Workflow & Monitoring in test case execution & review
• Hands-on scripting knowledge (Shell or Python), to be applied to testing automation (an illustrative sketch follows this listing)
• Technical knowledge of performance testing in the Big Data environment using tools available in the WMIS DataLake Platform, and reporting the outcome

Responsibilities

• Defining the testing strategy, plan, and test cases in the context of DataLake requirements
• Testing scope will cover technical, functional & performance testing
• Conduct reviews of the test artifacts with the DataLake team – PM, BA & Tech Lead
• Understand the know-how of the DataLake platform – Hadoop Hortonworks/Cloudera
• Understand the know-how of the DataLake platform – Informatica (BDM, BDQ, EDC, EDL)
• Understand the functional scope covering sourcing, transformation & consumption
• Understand the WMIS release cycle stages & guidelines
• Communicate testing execution progress & end deliverables on a regular basis
  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: 9514 DataLake (Big Data Analytics) Tester
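
A minimal sketch of the Shell/Python test automation against HDFS named in this listing, assuming the standard hdfs CLI is on the PATH; the landing directory and feed file name are hypothetical:

    import subprocess

    # List a (hypothetical) HDFS landing directory via the standard `hdfs dfs -ls` call.
    result = subprocess.run(
        ["hdfs", "dfs", "-ls", "/data/landing"],
        capture_output=True, text=True, check=True,
    )

    # Simple automation check: fail if the expected feed file did not arrive.
    lines = result.stdout.splitlines()
    assert any("orders.csv" in line for line in lines), "expected feed file not found"
    print("landing check passed;", len(lines), "entries listed")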