Job Description for Senior Data Engineer
Skills & Abilities (must have):
Experience in Hadoop or any cloud big data components (specific to the Data Engineering role)
Expertise in Java/Scala/Python, SQL, scripting, Teradata, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark Streaming, MLlib), Kafka, or equivalent cloud big data components (specific to the Data Engineering role)
Administers critical, complex big data/Hadoop infrastructure to enable next-generation analytics and data science capabilities
Establishes big data environments based on Hadoop
Drives innovation of our big data capabilities through research and hands-on practice
Operates a multi-tenant service, encompassing cluster management, security, resource and quota management, partitioning; monitors chargeback, data governance, quality and lineage
Conducts performance monitoring and benchmarking on various workloads; may participate in after-hours maintenance windows to update, change, and install various systems; participates in the 24x7 on-call rotation
Supports Hadoop cluster setup, performance fine-tuning, monitoring, and administration
Manages big data applications on-premises and in the public cloud
Builds solid, high-performing data pipelines
Experience with cloud-managed MLOps services such as Vertex AI
Well versed in feature engineering tools and techniques such as PCA, Featuretools, and Pyfeat
Well versed in model development techniques such as collaborative filtering and lookalike models
Strong communication and data presentation skills; ability to communicate with data-driven stories
Ability to quickly adapt to new technologies, tools and techniques
Flexible and responsive; able to perform in a fast-paced, dynamic work environment and meet aggressive deadlines
Ability to work with technical and non-technical team members
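The MapReduce model named above boils down to three phases: map each record to key/value pairs, shuffle (group) by key, then reduce each group. A dependency-free toy sketch of a word count, the canonical MapReduce example; a real Hadoop or Spark job distributes the same phases across a cluster:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit a (word, 1) pair for every word in every record."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    """Shuffle: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

records = ["Hive and Spark", "Spark Streaming and Kafka"]
counts = reduce_phase(shuffle_phase(map_phase(records)))
print(counts["spark"])  # -> 2
```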
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Profile Required
Essential Qualifications and Experience:
3-6 years of professional experience in a software development environment, primarily in an IT data analyst role, with the ability to create and modify complex extraction queries on big data
Exposure to HR data and the HR domain preferred
Bachelor's degree in Technology, Computer Science, or Information Systems, or equivalent non-technical qualifications with business analyst exposure and certification in an IT environment
Ideal: intermediate-to-fluent French (especially oral/conversational)
Knowledge of Agile methodology
Key Technical Skills:
Advanced knowledge of Hive Query Language (HQL) (must)
Good knowledge of process workflows, data modelling, and ETL concepts, with hands-on technical experience (must)
Good knowledge of the big data stack is an advantage
Good knowledge of software engineering
Good analytical, logical, and troubleshooting skills
TOOLS: Hue, Hive, PostgreSQL (RDBMS)
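The extraction queries the role describes are typically windowed aggregations. A sketch using standard SQL via the sqlite3 module as a stand-in for Hive; the table and column names are invented for illustration, and the same OVER (PARTITION BY ...) syntax is also valid in HQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hr_events (emp_id INTEGER, dept TEXT, salary INTEGER);
INSERT INTO hr_events VALUES (1, 'ENG', 90), (2, 'ENG', 70), (3, 'HR', 60);
""")

# Rank employees by salary within each department.
rows = conn.execute("""
SELECT emp_id, dept,
       RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS dept_rank
FROM hr_events
ORDER BY dept, dept_rank
""").fetchall()
print(rows)  # -> [(1, 'ENG', 1), (2, 'ENG', 2), (3, 'HR', 1)]
```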
BEHAVIORAL COMPETENCIES:
Accountability
Conflict Management
Interpersonal effectiveness
Negotiation
Learning Agility
Problem solving
Proactiveness
Job Summary:
As an AI/ML Engineer, you will be part of the core development and implementation of cutting-edge machine learning solutions, leveraging your expertise in fine-tuning large language models (LLMs) and generative AI. This role demands a deep understanding of a broad set of machine learning and deep learning techniques, as well as hands-on experience with a variety of tools and platforms.
Key Responsibilities:
Design and implement advanced machine learning algorithms, particularly in the areas of NLP and deep learning.
Fine-tune LLMs using parameter-efficient methods such as LoRA and QLoRA, as well as all-layer fine-tuning.
Efficiently wrap ML applications into API-based solutions using frameworks like Flask and FastAPI, with deployment knowledge on cloud platforms such as AWS.
Continuously explore and evaluate new technologies and tools in the AI/ML domain.
Core Technical Skills:
Proficiency in Python and experience with ML libraries like TensorFlow, PyTorch, scikit-learn, Pandas, and NumPy.
Strong background in NLP, computer vision, ensemble learning techniques, and deep learning architectures (ANN, RNN, CNN, LSTM, encoder-decoder, autoencoder).
Experience with object detection techniques and tools like OpenCV, YOLO, and transfer learning (e.g., ResNet-50). (Good to have)
Strong knowledge of or familiarity with generative AI, Hugging Face, OpenAI, LangChain, and transformers.
Knowledge of MLOps, Azure ML, and data visualization tools like Tableau and Plotly.
Experience with databases like MySQL, MongoDB, and Cassandra.
Familiarity with software development tools like PyCharm, VS Code, Jupyter Notebook, Google Colab, Anaconda, and Spyder.
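The ensemble techniques listed above share one idea: combine several predictors and aggregate their outputs. A minimal hard-voting sketch with toy rule-based "models" standing in for trained classifiers:

```python
from collections import Counter

def majority_vote(models, x):
    """Hard-voting ensemble: each model predicts, the modal label wins."""
    votes = [model(x) for model in models]
    return Counter(votes).most_common(1)[0][0]

# Three toy "models" labelling a number as 'pos' or 'neg' with
# slightly different decision thresholds.
models = [
    lambda x: "pos" if x > 0 else "neg",
    lambda x: "pos" if x > -1 else "neg",
    lambda x: "pos" if x > 1 else "neg",
]
print(majority_vote(models, 0.5))  # -> 'pos' (2 of 3 models vote pos)
```

Bagging and boosting refine this by training the members on resampled or reweighted data, but the aggregation step is the same.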
Qualifications:
Bachelor's or master's degree in Computer Science, Data Science, or a related field.
Required:
• Bachelor’s degree in Computer Science, Software Engineering, or a related field.
• Minimum of 3 to 5 years of experience in API and UI test automation.
• Experience using Azure DevOps or similar process management tools.
• Strong understanding of the Python programming language and testing frameworks.
• Experience with API testing methodologies using Postman.
• Demonstrated proficiency in applying BDD Cucumber frameworks for test automation.
• Familiarity with CI/CD pipelines and automation concepts.
• Solid understanding of Agile/Scrum methodologies.
• Excellent communication and collaboration skills.
• Ability to work independently and as part of a team.
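BDD frameworks like Cucumber express each test as Given/When/Then steps. The same structure can be sketched with stdlib unittest; the discount function under test is a made-up example, and a real setup would bind Gherkin feature files to step definitions via a tool such as behave or pytest-bdd:

```python
import unittest

def apply_discount(price, percent):
    """Function under test -- a hypothetical pricing rule for illustration."""
    if not 0 <= percent <= 100:
        raise ValueError("percent out of range")
    return round(price * (1 - percent / 100), 2)

class DiscountFeature(unittest.TestCase):
    """Scenario structure borrowed from Gherkin's Given/When/Then."""

    def test_ten_percent_discount(self):
        # Given a cart item priced at 50.00
        price = 50.00
        # When a 10% discount is applied
        total = apply_discount(price, 10)
        # Then the customer pays 45.00
        self.assertEqual(total, 45.00)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountFeature)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # -> True
```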