We found 422 jobs matching your search


Job Description

Experience: 4+ years

Key Responsibilities:
• Develop, deploy, and optimize machine learning models to solve business problems.
• Analyze and preprocess datasets using Python and SQL.
• Build data pipelines and workflows for efficient model training and deployment.
• Utilize Dataiku for model development and deployment.
• Build customized models using Python.

Technical Stack (Must-Have Skills):
• Expertise in building, training, and evaluating ML models.
• Strong expertise in SQL for data querying and processing.
• Proficiency in Python, with expertise in key libraries and modules such as pandas, NumPy, scikit-learn, TensorFlow, PyTorch, Matplotlib, Seaborn, and statsmodels.
• Practical experience in Dataiku for custom model development and management.

Nice-to-Have Skills:
• Experience with Snowpark for developing and deploying machine learning models directly in Snowflake.
• Familiarity with LLMs and fine-tuning neural networks.
• Experience with PySpark/R for distributed data processing.
• Knowledge of Snowflake.
• Exposure to AWS technologies.
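The build-train-evaluate loop this posting describes can be sketched minimally. The example below is a NumPy-only stand-in (the real stack would involve pandas, scikit-learn, and Dataiku): it fits ordinary least squares on synthetic data and scores the model on a held-out split.

```python
import numpy as np

# Synthetic regression data: y = 3*x0 - 2*x1 + noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0]) + rng.normal(scale=0.1, size=200)

# Hold out the last 50 rows for evaluation
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# "Training": ordinary least squares via lstsq
weights, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# "Evaluation": mean squared error on the held-out split
mse = float(np.mean((X_test @ weights - y_test) ** 2))
```

The same shape (fit on one split, score on another) carries over whatever library does the fitting.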

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: AI/ML Consultant

Job Description

1. Red Hat IDM administration: install, configure, manage, and maintain IDM systems, including servers, replicas, and clients.
2. LDAP management: design, implement, and maintain LDAP directory services to support enterprise applications and services.
3. Security management: implement best practices for secure access control, MFA, and certificate management within IDM and LDAP environments.
4. Strong expertise in administering Red Hat Identity Management and LDAP-based directory services.
5. In-depth understanding of authentication protocols, including Kerberos, SAML, and OAuth.
6. Diagnose and resolve identity-related issues, including authentication, replication, and configuration challenges.
7. Proficiency in Linux/Unix system administration.
8. Experience with certificate management and public key infrastructure (PKI).
9. Work closely with security, infrastructure, and application teams to ensure alignment with organizational goals and compliance standards.
10. Point of contact for all escalated issues.
11. Responsible for handling complex and escalated incidents and requests.
12. Responsible for change and problem management.
13. Change management (continuation): realize all deliveries prepared and announced during the CAB (Change Advisory Board), with a delivery schedule and communication plan.
14. Provide required inputs to stakeholders in case of critical incidents such as outages.
15. Ensure KPI compliance for all incidents and service calls.
16. Adhere to the documented notification and escalation process.
17. Participate in regular reviews with the team.
18. Be proactive and bring an Agile mindset to manage and execute Agile Scrum activities.
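For illustration, a user entry of the kind such a directory service manages might look like the LDIF below, using the `cn=users,cn=accounts` subtree layout that Red Hat IDM (FreeIPA) uses; the `dc=example,dc=com` suffix and all attribute values are placeholders.

```ldif
dn: uid=jdoe,cn=users,cn=accounts,dc=example,dc=com
objectClass: inetOrgPerson
uid: jdoe
cn: Jane Doe
sn: Doe
mail: jdoe@example.com
```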

  • Salary: Rs. 0.0 - Rs. 20,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Cyber Security Consultant

Job Description

Experience: 4 to 6 Years

Skills:
• Java back-end skills: Java, Spring Boot
• Good exposure to SonarQube and Black Duck scanning
• Knowledge of CI/CD: Jenkins, Azure Pipelines
• Knowledge of build/SCM tools: Maven, Git, and Jenkins
• Knowledge of DevOps/platforms: Docker, Pivotal, and AWS/Azure
• Good to have: API testing

Responsibilities:
• Back-end developer, capable of designing solutions, writing code, testing code, and automating tests and deployment
• Developing and designing RESTful services and APIs
• Ability to write quality unit tests and integration tests
• Overall delivery of software components, working in collaboration with product and design teams
• Collaborating with other technology teams to ensure integrated end-to-end design and integration
• Enforcing existing process guidelines; driving new processes, guidelines, team rules, and best practices
• Mentoring teammates and raising their game in all areas
• Full participation in the Agile Scrum process
• Ready, willing, and able to pick up new technologies (design, code, test, CI/CD, deploy, etc.)
• Able to research and learn new methodologies and technologies and bring that knowledge to the team
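The API-testing skill this posting asks for can be illustrated language-agnostically. The sketch below (in Python against a throwaway local server, not the Spring Boot stack itself) shows the shape of an integration test for a JSON endpoint: start the service, call it, assert on status and payload.

```python
import http.server
import json
import threading
import urllib.request

class HealthHandler(http.server.BaseHTTPRequestHandler):
    """Throwaway JSON endpoint standing in for a real service."""
    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, format, *args):
        pass  # keep test output quiet

# Start the server on an ephemeral port in a background thread
server = http.server.HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "test": call the endpoint, record status and payload
url = f"http://127.0.0.1:{server.server_address[1]}/health"
with urllib.request.urlopen(url) as resp:
    status = resp.status
    payload = json.load(resp)
server.shutdown()
```

In a Spring Boot codebase the same pattern would typically live in a JUnit test using MockMvc or a test REST client.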

  • Salary: Rs. 0.0 - Rs. 15,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java Developer - Nilesh

Job Description

At Daimler Truck, we change today's transportation and create real impact together. We take responsibility around the globe and work together on making our vision become reality: Leading Sustainable Transportation. As one global team, we drive our progress and success together; everyone at Daimler Truck makes the difference. Together, we want to achieve sustainable transportation, reduce our carbon footprint, increase safety on and off the track, and develop smarter technology and attractive financial solutions. All of this is essential to fulfilling our purpose, for all who keep the world moving. Become part of our global team: you make the difference (YOUMAKEUS).

We are looking for a Senior Data Science Engineer for our Advanced Analytics and Big Data team. The scope is to generate insights for data-driven decision making in the After Sales/Aftermarket domain.

Job Summary: We are looking for a Data Scientist to lead data-driven solutions across our business, from exploratory analysis and incremental hypothesis validation to model development, deployment, and monitoring. This role involves transforming complex business questions into actionable, value-generating insights and predictive models.

Key Responsibilities:
• Problem definition: Partner with stakeholders to translate business goals into clear, data-focused questions, and define project scope and success metrics.
• Data collection and preparation: Gather, clean, and preprocess data from diverse sources, ensuring quality and consistency, and engineer features to enhance model performance.
• Exploratory data analysis (EDA): Use statistical methods and visualization to uncover trends and validate assumptions, summarizing key insights for business alignment.
• Model building and evaluation: Select, train, and refine models suited to the business problem, evaluating performance with relevant metrics and documenting model assumptions.
• Deployment and monitoring: Work with engineering teams to deploy models, establish performance monitoring, retrain as necessary, and incorporate feedback to improve accuracy.

Requirements:
• Bachelor's/Master's in Data Science, Statistics, Computer Science, or a related field.
• Experience in end-to-end model development, deployment, and performance monitoring.
• Strong communication skills to convey insights to technical and non-technical audiences.

Skills Needed:
• Strong knowledge of applied AI/ML and deep learning data science techniques, with hands-on depth in ANN, deep learning, machine learning, and NLP.
• Deep knowledge of machine learning algorithms such as tree-based methods, clustering, regression and classification, dimensionality reduction, linear regression, logistic regression, k-means, time series forecasting, hypothesis testing (ANOVA, t-test, etc.), random forests, SVMs, Naive Bayes, gradient boosting, kNN, deep learning methods (CNN, ANN), reinforcement learning, and anomaly detection.
• In-depth understanding of statistical concepts, e.g. probability distributions, statistical tests, correlation analysis, descriptive statistics, kernels, ROC, F1-score.
• Advanced coding experience in at least one programming language (Python, PySpark) and strong grounding in object-oriented concepts.
• Good to have: advanced experience in one or more of Spark, Databricks, and the Azure technical stack.
• Good to have: experience in model deployment to cloud or on-premises environments.
• Good communication and presentation skills.
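As a small example of the evaluation metrics named above, F1-score is the harmonic mean of precision and recall and can be computed directly from the confusion-matrix counts; a pure-Python sketch on toy labels:

```python
def f1_score(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for one class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy labels: 2 true positives, 1 false positive, 1 false negative
score = f1_score([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

Here precision and recall are both 2/3, so F1 is also 2/3.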

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Senior Data Scientist

Job Description

Skill to Evaluate: Fiori, SAPUI5, OData, CDS Views, AMDP
Experience: 3 to 8 Years
Location: Bengaluru

Job Description:
• Strong technical resource with good experience in S/4HANA ABAP.
• Should possess HANA Database, HANA Studio, and HANA modelling skills, with a strong understanding of MDG, VIM, IDoc, OData, and AO.
• Strong programming background, specifically in the ABAP Workbench environment.
• Good business knowledge of SAP S/4HANA Central Finance modules.
• Flexible to work on weekends, in shifts, and on on-call support as per project needs.
• Good communication skills and the ability to follow up on key project items.
• Looking for an associate who can handle issues independently without the supervision of seniors.
• Good work experience with S/4HANA ABAP, VIM, MDG, Fiori, AO, IDoc, OData, Adobe Forms, BEx, SLT, CFIN, CDS, AMDP, SAPUI5, and Fiori apps debugging.

Education Qualification: Graduation/B.E./B.Tech/MCA

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP FIORI Technical Consultant

Job Description

Skill to Evaluate: AWS, Data Pipelines, Cloud Infrastructure, Glue, Python, PySpark, Athena, ETL, DevOps, CodePipeline
Experience: 4 to 6 Years
Location: Bengaluru

Job Description: We are seeking an experienced Data Pipeline & Cloud Infrastructure Engineer to join our team. The ideal candidate will be responsible for building and maintaining robust data pipelines, managing cloud infrastructure (primarily EC2 and S3), supporting machine learning models, and ensuring smooth operations for web applications and analytics systems. You will work closely with data scientists and various teams to resolve issues, handle deployments, and maintain a secure environment.

Education Qualification: Bachelor of Engineering
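An ETL pipeline stage of the kind described often reduces to parse, validate, and cast. A minimal stand-in, pure Python with no Glue/PySpark dependencies (the column names are illustrative only):

```python
import csv
import io

def transform(raw_csv: str) -> list:
    """One ETL stage: parse CSV, drop incomplete rows, cast types."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    clean = []
    for row in reader:
        if not row["amount"]:  # validation: skip incomplete records
            continue
        clean.append({"id": row["id"], "amount": float(row["amount"])})
    return clean

raw = "id,amount\n1,10.5\n2,\n3,7.25\n"
records = transform(raw)
```

In an AWS Glue job, the same logic would run inside a PySpark transform over data staged in S3 rather than over an in-memory string.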

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: AWS Consultant

Job Description

• Identify and develop automations in the CloudOps area around ERP and non-ERP applications hosted in AWS.
• Share the voice of the customer to influence the roadmap of new features and services for the AWS platform.
• Proactively work within the organization to influence the evolution of the platform.
• Analyse and explain AI and machine learning (ML) solutions while setting and maintaining high ethical standards.
• The AWS AI Architect will be a core member of the project's technical team, responsible for operating, designing, building, and supporting high-end automations on the cloud infrastructure platform for Enterprise Cloud Services.
• Serve as a key technical member of the Solutions Architecture team, influencing decision makers across multiple domains to ensure customer success in building applications and services on the AWS platform that align with long-term business goals.
• Drive technical solution discussions with customers, diving deep into the details to solve complex technical problems and crafting scalable, flexible, and resilient cloud architectures.
• Work with the team to conduct assessments of the AI and automation market and competitor landscape.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Sr_Automation_Engineer

Job Description

A Day In The Life Could Include: (job responsibilities)
What You Will Need To Bring With You: (experience & education required)
• Providing administration support for SAP and integrated applications for business users on access-related requests.
• Experience with security development and administration in SAP BW, Fiori, APO, BPC, Ariba, etc.; handling change management.
• SAP security development background in core SAP modules (FI, SD, MM, PP, CO, HR) in an ERP 6.0 environment.
• Hands-on support for access control in GRC 10.1/12.
• Designing business roles required for new SAP projects.
• Ability to grasp business problems, goals, and objectives, and provide alternative solutions.
• Good understanding of requirement specifications, design, development, and testing.
• Should be able to develop solutions based on a technical design.
• Leading security solutions that align with internal financial controls and segregation of duties.
• Supporting system audits and periodic access reviews.
• Good interpersonal skills to work effectively with different teams within the organization to achieve common goals and respond positively to situations.
• Experience using ChaRM for role transports.
• Bachelor's degree in Computer Science/Information Technology or a related field.
• 3+ years in SAP Security and GRC administration and security development.

What Will Put You Ahead: (experience & education preferred)
• Knowledge/experience with SAP GRC 10.1 or 12 Access Control.
• Knowledge/experience with SailPoint user access management.
• Experience with or understanding of SAP authorizations, S/4HANA, and SAP Cloud Security would be an added advantage.

Other Considerations: (physical demands/unusual working conditions)
• Shift hours: 12:00 PM to 9:00 PM IST

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP Security Analyst

Job Description

• Technical expertise and experience with system, network, and application infrastructure components (servers, hypervisors, databases, network) and the related hardening standards (CIS).
• Ability to access tool interfaces (WAF, web proxy, VPN gateway, firewalls) and to review and challenge configuration settings against internal standards and good practices.
• Extensive experience in infrastructure security, compliance, and hardening.
• Strong understanding of security frameworks and standards such as NIST, CIS, and ISO 27001.
• Proficiency with security tools and technologies for infrastructure hardening.
• In-depth knowledge of regulatory requirements and compliance standards.
• Excellent leadership, analytical, and problem-solving skills.
• Strong communication and teamwork abilities.
• Relevant certifications such as CISSP, CISM, or CISA are highly desirable.
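Reviewing configuration settings against a benchmark like CIS often boils down to mechanical checks. As a hypothetical example, one CIS recommendation for Linux is that sshd must disable root login; a sketch of such a check (mirroring sshd's first-keyword-wins parsing, comments stripped):

```python
def permit_root_login_disabled(sshd_config: str) -> bool:
    """CIS-style check: effective PermitRootLogin value must be 'no'."""
    for line in sshd_config.splitlines():
        stripped = line.split("#", 1)[0].strip()  # drop comments
        tokens = stripped.split()
        if tokens and tokens[0].lower() == "permitrootlogin":
            return tokens[-1].lower() == "no"  # first occurrence wins
    return False  # directive unset: fail closed

hardened = "Port 22\n#PermitRootLogin yes\nPermitRootLogin no\n"
unhardened = "PermitRootLogin yes\n"
results = (permit_root_login_disabled(hardened),
           permit_root_login_disabled(unhardened))
```

Real benchmark tooling (e.g. CIS-CAT or OpenSCAP) automates hundreds of such checks; this shows only the shape of one.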

  • Salary: Rs. 0.0 - Rs. 12,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Analyst