We found 941 jobs matching your search

Job Description

As an Engineering Services Practitioner, you will provide end-to-end engineering services, developing technical solutions that solve problems and achieve business objectives. Your typical day will involve authoring the Aircraft Maintenance Manual (AMM) for the A350 XWB and legacy programmes (A320, A330, A380).

Roles and Responsibilities:
- At least 2 years' overall experience authoring AMM content for the A350 XWB and legacy programmes (A320, A330, A380)
- Experience in aircraft maintenance is an advantage
- Experienced in the creation and revision of AMM content
- Knowledge of the aerospace technical data process
- Tools knowledge: DACAS/AIRINA/PSE/ICC-CADB/3D-XML/PASS SI/APS/ESDCR
- Knowledge of ATA Chapter 25 is an added advantage
- Able to understand and apply ASD-STE (Simplified Technical English)
- Excellent understanding of the ATA iSpec 2200 and S1000D standards
- Able to analyze and interpret engineering drawings, 3D drawings, and reports
- Aircraft and systems knowledge
- Excellent written and verbal communication skills
- Strong team player, able to prepare customer reports

Professional and Technical Skills:
- Must-have: in-depth knowledge of aircraft maintenance
- Good-to-have: technical writing experience
- Broad scientific and technical knowledge
- Experience collaborating with cross-functional teams
- Solid grasp of project management principles and practices

  • Salary: Rs. 0 – Rs. 1,38,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Aircraft Maintenance

Job Description

Hyderabad: 9+ years of experience (high priority, set 1). Notice period: immediate to 30 days (position open for 90 days). Do not screen out good profiles on salary alone.
Bangalore and Pune: 10+ years of experience (high priority, set 2). Notice period: immediate to 30 days (position open for 60 days). Do not screen out good profiles on salary alone.

Candidate submission details required: Date, Vendor Name, Sub Vendor Name, Vendor Type, CID, Candidate Name, Mobile Number, Email ID, Gender, Current Employer, Current Location, Preferred Location, Primary Skill, Total Experience, Relevant Experience, Notice Period, Current CTC, Associate Take Home (A), Other Benefits (B), Vendor Administrative Cost (C), Pass-Through Cost, Total Billing to CTS (A+B+C), Level (A/SA/M), Rehire Status (Yes/No).

JD:
• Strong proficiency in Embedded C programming, plus automotive experience
• Strong low-level driver experience
• Work experience with tools such as CANoe, the IAR and Code Composer compilers, and debuggers
• Skilled in embedded driver development, including ADC, UART, SPI, I2C, CAN, and LIN
• Experience in UAL and EA design
• Proficient in version control systems such as Git and SVN
• Experience with RTOS

  • Salary: As per industry standard
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Embedded Lead

Job Description

Senior Network Engineer

  • Salary: Rs. 0 – Rs. 1,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00005117/STRST06_Senior Network Engineer

Job Description

Role Summary:
Manage end-to-end delivery of ServiceNow implementations and enhancements, ensuring projects are delivered on time, within scope and budget, while meeting quality and customer expectations.

Key Responsibilities:
• Lead ServiceNow implementations (CMDB/ITOM/ITSM/HRSD/CSM preferred)
• Plan, execute, and track project scope; develop and maintain project plans, schedules, and budgets
• Track milestones, dependencies, risks, and issues
• Manage stakeholders and cross-functional delivery teams
• Ensure adherence to Agile/Waterfall, SDLC, and ServiceNow best practices
• Provide regular status updates and project documentation

Required Skills:
• Strong ServiceNow project management experience
• Good understanding of the ServiceNow platform and ITIL processes
• Experience with Agile/Scrum and Waterfall methodologies
• Excellent communication and stakeholder management skills

Soft Skills:
• Excellent communication and presentation skills
• Strong leadership and decision-making abilities
• Customer-focused mindset
• Ability to manage multiple projects simultaneously

  • Salary: As per industry standard
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: ServiceNow Project Manager

Job Description

We require an experienced ETL / Data Migration Tester to support a data migration project for a banking client. The role involves validating complex data migrations from multiple source systems into Temenos Analytics, along with downstream validations, ensuring data quality, accuracy, reconciliation, and adherence to global banking and regulatory standards.

Key Responsibilities:
• Understand Temenos Analytics data architecture, subject areas, and reporting layers.
• Validate ETL and data migration processes from multiple source systems (Core Banking, Channels, GL, Loans, Deposits, Treasury, Payments, etc.) into Temenos Analytics.
• Design and execute ETL test cases, including:
  o Source-to-target data validation
  o Data completeness, accuracy, and consistency checks
  o Transformation and business rule validation
  o Aggregation and reconciliation testing
• Perform data reconciliation across source and target layers (see the sketch below).
• Validate historical data loads, delta loads, and incremental refresh cycles.
• Execute SQL queries to validate large datasets.
• Identify, log, and track data defects, working closely with ETL developers and data architects on resolution.
• Support User Acceptance Testing (UAT) and validation of MIS / management reports.
• Prepare and maintain test documentation, execution reports, and data quality reports.
• Ensure adherence to global banking data standards, audit requirements, and regulatory compliance.

Required Skills & Experience:
• 4–8 years of experience in ETL / data migration testing.
• Strong hands-on experience in data warehouse / analytics testing.
• Exposure to Temenos Analytics or similar banking analytics platforms (preferred).
• Excellent SQL proficiency (joins, aggregations, data comparisons, reconciliations).
• Strong understanding of banking and financial data, including:
  o Retail & corporate banking
  o Loans, deposits, GL, payments
  o Financial & management reporting
• Experience testing large data volumes across multiple environments.
• Familiarity with ETL tools (Informatica, DataStage, Talend, etc.) from a testing perspective.
• Knowledge of the STLC, the defect lifecycle, and test management practices.
• High attention to detail and data accuracy.
• Clear communication with cross-functional and global teams.
• Ability to work independently in complex data programs.

Good to Have:
• Exposure to Temenos T24 data structures (not mandatory).
• Experience in regulatory reporting (Basel, IFRS, Liquidity, Risk, ALM).
• Automation or scripting experience for data validation.
• Understanding of data lineage, data quality frameworks, and governance models.
• Experience supporting multi-country banking implementations.

Tools & Technologies:
• Databases: Oracle / PostgreSQL / MS SQL
• ETL platforms: Informatica / DataStage / Talend (testing exposure)
• Test & defect management: JIRA, ALM, Azure DevOps, or similar
• Unix/Linux: basic knowledge preferred
• BI/reporting tools: exposure is an added advantage
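As a concrete illustration of the source-to-target reconciliation this role describes, here is a minimal sketch in Python using pandas. It compares row counts and per-account values between a source extract and the migrated target; the table and column names (account_id, balance) are hypothetical, and a real migration test would run equivalent SQL against the source system and Temenos Analytics rather than in-memory frames.

```python
import pandas as pd

# Hypothetical extracts; in practice these would come from SQL queries
# against the source core-banking system and the Temenos Analytics target.
source = pd.DataFrame({"account_id": [1, 2, 3, 4],
                       "balance":    [100.00, 250.50, 80.00, 999.99]})
target = pd.DataFrame({"account_id": [1, 2, 3],
                       "balance":    [100.00, 250.50, 80.01]})

# Completeness check: row counts must reconcile.
if len(source) != len(target):
    print(f"Row-count mismatch: source={len(source)}, target={len(target)}")

# Accuracy check: a full outer join flags missing rows and value drift.
merged = source.merge(target, on="account_id", how="outer",
                      suffixes=("_src", "_tgt"), indicator=True)
mismatches = merged[(merged["_merge"] != "both") |
                    (merged["balance_src"] != merged["balance_tgt"])]
print(mismatches)  # account 3 (value drift) and account 4 (missing in target)
```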

  • Salary: As per industry standard
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: ETL Tester

Job Description

Job Summary:
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. You will play a crucial role in our data ecosystem, working with cloud technologies to enable data accessibility, quality, and insights across the organization. This role requires expertise in Azure Databricks, Azure Data Factory, PySpark, SQL, any cloud platform (preferably Azure), and data modelling.

Requirements (Experience Level: 3 to 6 Years):
• Bachelor's degree in Computer Science, Data Engineering, or a related field.
• Proficiency in Azure Databricks for data processing and pipeline orchestration.
• Strong SQL skills and understanding of data modeling principles.
• Ability to troubleshoot and optimize data workflows.

Key Responsibilities:
• Data Pipeline Development: Design, build, and optimize data pipelines to ingest, transform, and load data from multiple sources using Azure Databricks (see the sketch below).
• Data Architecture: Develop and manage data models within Snowflake, ensuring efficient data organization and accessibility.
• Performance Optimization: Monitor and optimize pipeline performance, troubleshooting and resolving issues as needed.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven projects and provide access to reliable, well-structured data.

Qualifications:
• Relevant experience in Azure Databricks with PySpark and Big Data / Hadoop ecosystem components.
• Understanding of Hadoop architecture and its underlying framework, including storage management.
• Strong understanding of, and implementation experience in, Spark, Databricks, and ADF.
• Expertise in implementing data lake solutions using Scala as well as Python.
• Expertise with orchestration tools such as Azure Data Factory.
• Strong SQL and programming skills.
• Experience with Databricks is desirable.
• Understanding of / implementation experience with CI/CD tools such as Jenkins, Azure DevOps, and GitHub is desirable.
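To make the pipeline-development responsibility concrete, below is a minimal PySpark sketch of the ingest-transform-load pattern the role describes. It runs locally for illustration; on Azure Databricks the Spark session is pre-created and the output would typically be a Delta table on ADLS. All column names and paths are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Local session for illustration; on Databricks, `spark` already exists.
spark = (SparkSession.builder
         .master("local[*]")
         .appName("ingest-transform-load-sketch")
         .getOrCreate())

# Hypothetical raw input; a real pipeline would read from ADLS, e.g.
# spark.read.json("abfss://raw@<account>.dfs.core.windows.net/orders/").
raw = spark.createDataFrame(
    [("o1", "2024-01-05", "129.90"),
     ("o2", "2024-01-06", None),
     ("o1", "2024-01-05", "129.90")],   # duplicate row, to be deduplicated
    ["order_id", "order_date", "amount"])

# Transform: deduplicate, cast types, drop rows failing a basic quality rule.
cleaned = (raw.dropDuplicates(["order_id"])
              .withColumn("order_date", F.to_date("order_date"))
              .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
              .filter(F.col("amount").isNotNull()))

# Load: Parquet locally; on Databricks this would usually be Delta, e.g.
# cleaned.write.format("delta").mode("append").saveAsTable("curated.orders").
cleaned.write.mode("overwrite").parquet("/tmp/orders_clean")
```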

  • Salary: As per industry standard
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Engineer

Job Description

Key Responsibilities:
· Scheduling & Logistics: Coordinating interviews (phone, virtual, in-person) across time zones and managing calendars for candidates and hiring managers.
· Candidate Experience: Acting as the primary point of contact, answering questions, and ensuring a smooth, professional hiring experience.
· ATS & Data Management: Posting job openings, updating candidate records, maintaining compliance, and managing applicant flow in the ATS.
· Administrative Support: Drafting offer letters, initiating background checks, and assisting with onboarding preparation.

  • Salary: As per industry standard
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Sourcing

Job Description

Key Responsibilities:
1. Design and Implementation: Lead the design, configuration, and implementation of SAP MDG solutions, ensuring seamless integration with business processes.
2. Custom Development: Develop and enhance custom objects, including UI components, BRF+ rules, workflows, and data models.
3. Data Governance: Design and implement data models, replication frameworks, and integration scenarios with SAP and non-SAP systems, ensuring data quality and integrity.
4. Collaboration: Work with business users, functional consultants, and technical teams to gather requirements and provide solutions.
5. Testing and Documentation: Oversee unit testing and integration testing, support user acceptance testing (UAT), and prepare detailed technical and functional documentation.
6. Training and Support: Conduct training sessions for end users and support teams, ensuring smooth system adoption and operation.

Technical Skills:
1. SAP MDG: Strong knowledge of SAP Master Data Governance, including data modeling, data quality management, and data replication.
2. ABAP: Proficiency in ABAP programming, including Web Dynpro ABAP and FPM.
3. BRF+: Expertise in Business Rule Framework plus (BRF+) for business logic and decision-making.
4. SAP S/4HANA: Understanding of S/4HANA architecture and its integration with SAP MDG.
5. Data Integration: Experience with SAP PI/PO, IDocs, BAPIs, and ALE for data integration.

Functional Skills:
1. Master Data Management: Strong understanding of master data processes in areas such as procurement, sales, and supply chain.
2. Data Governance: Knowledge of data governance principles and best practices in SAP environments.
3. Analytical and Problem-Solving: Excellent analytical and problem-solving skills, with the ability to translate business requirements into technical specifications.

Qualifications:
1. Education: Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
2. Experience: Minimum 6 years of experience in SAP MDG, with at least 3 years in a techno-functional role.
3. Certifications: SAP Certified Application Associate – SAP Master Data Governance (MDG) is preferred.

  • Salary: As per industry standard
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP MDG

Job Description

Backend Developer

Key Responsibilities:
• Design and implement scalable data pipelines using Python, PySpark, and boto3 (see the sketch below)
• Integrate with Amazon Redshift for data warehousing and perform data ingestion and transformation tasks
• Work with Amazon S3 to architect and manage data lakes
• Build and manage batch processing jobs using AWS Glue, EMR, and Lambda
• Support streaming data workflows using Amazon Kinesis
• Leverage AWS Secrets Manager and KMS for secure data handling and encryption
• Write efficient SQL for data querying, transformations, and performance tuning
• Collaborate with data engineering, product, and DevOps teams to deliver end-to-end solutions
• Participate in code reviews, debugging, and testing to ensure high-quality software delivery
• Use Git for version control and follow CI/CD best practices
• Continuously explore and recommend improvements to system performance, scalability, and maintainability

Required Skills:
• Advanced proficiency in Python, including libraries such as Pandas, PySpark, and boto3
• Experience designing workflows in Apache Airflow (MWAA)
• Hands-on experience with Amazon Redshift for analytics and data warehousing
• Solid SQL skills and experience with data modeling and transformation
• Comfortable with Amazon S3, including setting up data lake architecture
• Familiar with AWS core services: EC2, S3, IAM, Lambda
• Experience using AWS Secrets Manager and AWS KMS for secure data access
• Strong understanding of batch processing (AWS Glue, EMR) and streaming (Kinesis)
• Version control using Git
• Excellent problem-solving and communication skills
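As an illustration of the S3-to-Redshift batch pattern in this listing, here is a minimal sketch using boto3 and the Redshift Data API. The bucket, cluster, schema, table, and IAM role names are all hypothetical, and the snippet assumes AWS credentials are already configured in the environment.

```python
import boto3

S3_BUCKET = "example-data-lake"        # hypothetical
REDSHIFT_CLUSTER = "example-cluster"   # hypothetical
COPY_ROLE_ARN = "arn:aws:iam::123456789012:role/example-redshift-copy"  # hypothetical

s3 = boto3.client("s3")
rsd = boto3.client("redshift-data")

def load_partition(prefix: str) -> str:
    """Verify objects exist under the prefix, then COPY them into Redshift."""
    listing = s3.list_objects_v2(Bucket=S3_BUCKET, Prefix=prefix)
    if listing.get("KeyCount", 0) == 0:
        raise ValueError(f"No objects under s3://{S3_BUCKET}/{prefix}")

    # Redshift pulls the files itself; access comes from the attached IAM role.
    copy_sql = (
        f"COPY analytics.orders FROM 's3://{S3_BUCKET}/{prefix}' "
        f"IAM_ROLE '{COPY_ROLE_ARN}' FORMAT AS PARQUET"
    )
    resp = rsd.execute_statement(
        ClusterIdentifier=REDSHIFT_CLUSTER,
        Database="analytics",
        DbUser="etl_user",
        Sql=copy_sql,
    )
    return resp["Id"]  # poll rsd.describe_statement(Id=...) for completion

if __name__ == "__main__":
    print(load_partition("orders/dt=2024-01-05/"))
```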

  • Salary: As per industry standard
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Backend Developer