Candidates should have:
1. A minimum of 6 years' experience in software development and AWS cloud engineering.
2. A good understanding of PowerShell scripting.
3. Experience provisioning cost-effective and secure AWS infrastructure using CloudFormation or Terraform (see the sketch below).
4. A development background, with the ability to write complex PowerShell scripts in an object-oriented way and to fix issues in existing scripts.
5. Good oral and written communication skills.
6. AWS Architect certification is an added advantage.
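As one illustration of the infrastructure-as-code requirement above, here is a minimal Python sketch that creates a CloudFormation stack via boto3. The tooling choice (boto3 rather than a PowerShell or Terraform workflow), the stack name, and the template body are assumptions for illustration, not details from the posting.

```python
# A minimal sketch, assuming boto3 is installed and default AWS credentials
# are configured; the stack name and template are placeholders.
import boto3

TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  LogsBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: AES256   # encryption at rest by default
"""

cfn = boto3.client("cloudformation")
cfn.create_stack(StackName="demo-logs-bucket", TemplateBody=TEMPLATE)
# Block until the stack reaches CREATE_COMPLETE (raises on failure).
cfn.get_waiter("stack_create_complete").wait(StackName="demo-logs-bucket")
```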
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
At Daimler Truck, we change today’s transportation and create real impact together. We take responsibility around the globe and work together on making our vision become reality: Leading Sustainable Transportation. As one global team, we drive our progress and success together – everyone at Daimler Truck makes the difference. Together, we want to achieve sustainable transportation, reduce our carbon footprint, increase safety on and off the track, and develop smarter technology and attractive financial solutions. All essential to fulfilling our purpose: for all who keep the world moving. Become part of our global team: You make the difference – YOUMAKEUS
We are looking for a Senior Data Science Engineer for our Advanced Analytics and Big Data team. The scope is to generate insights for data-driven decision-making in the customer service/aftermarket domain.
Description for Internal Candidates
Job Summary:
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. You will play a crucial role in our data ecosystem by working with cloud technologies to enable data accessibility, quality, and insights across the organization. This role requires expertise in Azure Databricks, Snowflake, and DBT.
Requirements:
• Bachelor’s in Computer Science, Data Engineering, or related field.
• Proficiency in Azure Databricks for data processing and pipeline orchestration.
• Experience with Snowflake as a data warehouse platform and DBT for transformations.
• Strong SQL skills and understanding of data modeling principles.
• Ability to troubleshoot and optimize data workflows.
Responsibilities for Internal Candidates
Key Responsibilities:
• Data Pipeline Development: Design, build, and optimize data pipelines to ingest, transform, and load data from multiple sources, using Azure Databricks, Snowflake, and DBT (see the sketch after this list).
• Data Architecture: Develop and manage data models within Snowflake, ensuring efficient data organization and accessibility.
• Data Transformation: Implement transformations in DBT, standardizing data for analysis and reporting.
• Performance Optimization: Monitor and optimize pipeline performance, troubleshooting and resolving issues as needed.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven projects and provide access to reliable, well-structured data.
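To make the pipeline responsibility concrete, here is a minimal PySpark sketch of an ingest-transform-load step as it might run in an Azure Databricks notebook. The source path, column names, and target table are assumptions for illustration; the actual pipelines, sources, and models are defined by the team.

```python
# A minimal sketch, assuming a Databricks environment where Delta tables are
# available; the paths, columns, and table names below are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Ingest: read raw source data from the lake (placeholder path).
orders = spark.read.format("parquet").load("/mnt/raw/orders")

# Transform: derive a date column and aggregate revenue per day.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: persist a curated Delta table that downstream DBT models can build on.
daily_revenue.write.format("delta").mode("overwrite").saveAsTable("curated.daily_revenue")
```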
Qualifications:
• Relevant experience with MS Azure, Snowflake, DBT, and Big Data Hadoop ecosystem components.
• Understanding of Hadoop architecture and its underlying framework, including storage management.
• Strong understanding of, and implementation experience with, Hadoop, Spark, and Hive/Databricks.
• Expertise in implementing data lake solutions using both Scala and Python.
• Expertise with orchestration tools such as Azure Data Factory.
• Strong SQL and programming skills.
• Experience with Databricks is desirable.
• Understanding of, or implementation experience with, CI/CD tools such as Jenkins, Azure DevOps, and GitHub is desirable.
Salary: Rs. 12,00,000 - Rs. 21,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
At Daimler Truck, we change today’s transportation and create real impact together. We take responsibility around the globe and work together on making our vision become reality: Leading Sustainable Transportation. As one global team, we drive our progress and success together – everyone at Daimler Truck makes the difference. Together, we want to achieve sustainable transportation, reduce our carbon footprint, increase safety on and off the track, and develop smarter technology and attractive financial solutions. All essential to fulfilling our purpose: for all who keep the world moving. Become part of our global team: You make the difference – YOUMAKEUS
We are looking for a Senior Data Science Engineer for our Advanced Analytics and Big Data team. The scope is to generate insights for data-driven decision-making in the customer service/aftermarket domain.
Description for Internal Candidates
Job Summary:
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. You will play a crucial role in our data ecosystem by working with cloud technologies to enable data accessibility, quality, and insights across the organization. This role requires expertise in Azure Databricks, Snowflake, and DBT.
Requirements:
• Bachelor’s in Computer Science, Data Engineering, or related field.
• Proficiency in Azure Databricks for data processing and pipeline orchestration.
• Experience with Snowflake as a data warehouse platform and DBT for transformations.
• Strong SQL skills and understanding of data modeling principles.
• Ability to troubleshoot and optimize data workflows.
Responsibilities for Internal Candidates
Key Responsibilities:
• Data Pipeline Development: Design, build, and optimize data pipelines to ingest, transform, and load data from multiple sources, using Azure Databricks, Snowflake, and DBT.
• Data Architecture: Develop and manage data models within Snowflake, ensuring efficient data organization and accessibility (see the sketch after this list).
• Data Transformation: Implement transformations in DBT, standardizing data for analysis and reporting.
• Performance Optimization: Monitor and optimize pipeline performance, troubleshooting and resolving issues as needed.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven projects and provide access to reliable, well-structured data.
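As a concrete illustration of working against Snowflake from Python, here is a minimal sketch using the snowflake-connector-python package. The account, credentials, warehouse, and table names are placeholders, not details from the posting.

```python
# A minimal sketch, assuming the snowflake-connector-python package is
# installed; every connection parameter below is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="CURATED",
)
cur = conn.cursor()
try:
    # Query a curated table that upstream pipelines maintain.
    cur.execute(
        "SELECT order_date, revenue FROM daily_revenue "
        "ORDER BY order_date DESC LIMIT 10"
    )
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```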
Qualifications:
• Relevant experience with MS Azure, Snowflake, DBT, and Big Data Hadoop ecosystem components.
• Understanding of Hadoop architecture and its underlying framework, including storage management.
• Strong understanding of, and implementation experience with, Hadoop, Spark, and Hive/Databricks.
• Expertise in implementing data lake solutions using both Scala and Python.
• Expertise with orchestration tools such as Azure Data Factory.
• Strong SQL and programming skills.
• Experience with Databricks is desirable.
• Understanding of, or implementation experience with, CI/CD tools such as Jenkins, Azure DevOps, and GitHub is desirable.
Salary: Rs. 6,00,000 - Rs. 14,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
In This LSD (Lead System Designer) Role, You Will
• Work with product managers to understand product requirements and vision, and define software requirements by understanding the user and system requirements that define the product.
• Own product definition for the software subsystem; provide design leadership for new product development programs as a key member of a cross-functional team.
• Conduct thorough design reviews to identify potential issues and ensure the software meets all quality and safety standards.
• Oversee the integration of software components with the system, and lead the development and execution of comprehensive testing plans.
• Lead the engineering team in the implementation of the design process. This includes requirements management, risk management (including FMEA), DFX, verification, compliance with internal QMS processes and industry standards, and design transfer to production.
• Lead technical designs and present technical ideas through whiteboarding; drive design reviews, define interfaces between code modules, and apply existing technology to designs.
• Seek and provide feedback on design and development.
• Demonstrate the ability to make informed technology choices after due diligence and impact assessment.
• Understand the whole product, its modules, and the interrelationships between them while being an expert in the assigned component or module.
• Help design interfaces and information exchange between modules.
• Articulate the need for scalability and understand the importance of improving quality through testing.
• Be an expert in assessing application performance and optimizing/improving it through design and best coding practices.
• Partner closely with the quality organization to implement efficient and effective design processes.
• Ensure quality targets are satisfied and retire technical risks as they arise on the program or released product.
• Lead the design team in the development of verification planning and execution for NPIs and released products.
• Support manufacturing and the installed base through corrective and preventive actions to ensure customer satisfaction.
• Drive the architecture, test strategies, branching strategy, and design requirements for the product, always balancing implementation complexity, risks, manufacturability, serviceability, and quality.
• Stay updated with relevant medical device regulations.
Education Qualification
• Bachelor’s degree in Computer Science or another STEM major (Science, Technology, Engineering, and Math), with a minimum of 10 years of experience.
Desired Characteristics/Technical Expertise
• A minimum of 8 years of medical device experience, preferably in the imaging domain.
• 3+ years of lead system designer experience.
• Experienced in risk management, including FMEA.
• Strong understanding of engineering product development processes.
• Strong knowledge of the DICOM (Digital Imaging and Communications in Medicine) standard (see the sketch after this list).
• Able to influence peers and cross-functional partners.
• Experience with medical device standards.
• Experience with DFX (manufacturability, serviceability, reliability, test, etc.).
• Self-starter, energizing, and results-oriented.
• Hands-on experience with scripting languages such as shell, PowerShell, or Python.
• Understanding of object-oriented programming paradigms and their application in implementing reusable and maintainable software components.
• Expertise in core data structures and algorithms, with the ability to implement them in the language of choice.
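To ground the DICOM bullet above, here is a minimal Python sketch using pydicom, a common open-source library for reading DICOM objects. The library choice and the file name are assumptions for illustration; the posting names the standard, not a toolkit.

```python
# A minimal sketch, assuming pydicom (and numpy for pixel data) is installed;
# "sample.dcm" is a placeholder file name.
import pydicom

ds = pydicom.dcmread("sample.dcm")   # parse the DICOM data set

# Standard DICOM attributes, addressable by keyword.
print(ds.PatientName, ds.Modality, ds.SOPClassUID)

# Decoded image data as a numpy array (for image-bearing objects).
pixels = ds.pixel_array
print(pixels.shape, pixels.dtype)
```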
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance