BDC6F Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve software-related challenges, ensuring the smooth operation of essential applications. You will engage in problem-solving activities, analyze system performance, and contribute to the continuous improvement of processes and systems, all while maintaining a focus on delivering high-quality support to users and stakeholders.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required to participate in and contribute actively to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of processes and solutions to enhance team knowledge.
- Engage in proactive monitoring of systems to identify potential issues before they impact users.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with troubleshooting and resolving application issues.
- Familiarity with database management and SQL.
- Ability to work collaboratively in a team environment.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Salary: Rs. 0.0 - Rs. 1,56,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Summary:
As an Engineering Services Practitioner, you will be responsible for providing end-to-end engineering services to develop technical engineering solutions to solve problems and achieve business objectives. Your typical day will involve working on Authoring of AMM for XWB A350 and Legacy program (A320, A330, A380).
Roles and Responsibilities:
- At least 2 years of experience in AMM authoring for the XWB A350 and legacy programs (A320, A330, A380).
- Experience in the field of aircraft maintenance will be an advantage.
- Experienced in the creation and revision of AMMs.
- Should have knowledge of the aerospace technical data process.
- Tools knowledge: DACAS/AIRINA/PSE/ICC-CADB/3D-XML/PASS SI/APS/ESDCR.
- Knowledge of ATA25 would be an added advantage.
- Should be able to understand and use ASD-STE (Simplified Technical English).
- Excellent understanding of ATA iSpec 2200 and S1000D standards.
- Analysis and interpretation of engineering drawings/3D drawings and reports.
- Should have aircraft and systems knowledge.
- Excellent written and verbal communication skills.
- Should be a strong team player.
- Should be able to prepare customer reports.
Professional and Technical Skills:
- Must-Have Skills: In-depth knowledge of aircraft maintenance.
- Good-to-Have Skills: Technical writing experience.
- Broad scientific, socio-economic, and technical knowledge.
- Experience in collaborating with cross-functional teams.
- Solid grasp of project management principles and practices.
Additional Information:
- The candidate should have a minimum of 2 years of experience with aircraft maintenance manual authoring guidelines.
- The ideal candidate will possess a strong educational background in engineering or a related field, along with a proven track record of delivering impactful engineering solutions.
- This position is based at our Bengaluru office.
Salary: Rs. 0.0 - Rs. 140.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Results-driven Data Scientist with 5 years of experience building predictive models, performing end-to-end data analysis, and deploying data-driven solutions that improve business outcomes. Skilled in machine learning, statistical modeling, data visualization, and cloud-based analytics. Adept at translating complex data insights into actionable business strategies and collaborating with cross-functional teams.
Core Responsibilities
1. Data Analysis & Problem Solving
• Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies.
• Use statistical methods to validate hypotheses and support business decision-making.
2. Machine Learning & Modeling
• Build, train, and optimize supervised/unsupervised ML models (regression, classification, clustering, NLP).
• Implement feature engineering, model evaluation, and hyperparameter tuning.
• Deploy models into production using MLOps tools.
3. Data Engineering Collaboration
• Work with data engineers to design and maintain scalable data pipelines.
• Handle large datasets from multiple sources (SQL, APIs, cloud storage).
4. Visualization & Business Communication
• Create dashboards and reports using tools like Power BI, Tableau, or Python (Matplotlib/Seaborn).
• Present insights to stakeholders and leadership for decision-making.
5. Cloud & Big Data
• Utilize cloud platforms like Azure, AWS, or GCP for model training and data storage.
• Work with tools like Spark, Databricks, or Hadoop for large-scale data processing.
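As a rough sketch of the modeling workflow described in items 1–2 above (feature preparation, cross-validated evaluation, hyperparameter tuning), a minimal scikit-learn example on synthetic data; the dataset, parameter grid, and model choice are illustrative assumptions, not part of the role description:

```python
# A minimal modeling-workflow sketch: scale features, tune a classifier,
# evaluate on held-out data. Dataset and parameter grid are synthetic/illustrative.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Scaling + model in one pipeline so cross-validation cannot leak test statistics
pipe = Pipeline([("scale", StandardScaler()),
                 ("clf", LogisticRegression(max_iter=1000))])

# Hyperparameter tuning via cross-validated grid search
grid = GridSearchCV(pipe, {"clf__C": [0.01, 0.1, 1.0, 10.0]}, cv=5)
grid.fit(X_train, y_train)

# Evaluate the tuned model on data it never saw during tuning
test_acc = accuracy_score(y_test, grid.predict(X_test))
print(f"best C={grid.best_params_['clf__C']}, test accuracy={test_acc:.2f}")
```

Wrapping the scaler and model in one pipeline is what keeps the grid search honest: each fold is scaled using only its training portion.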
Technical Skills
Programming
• Python (NumPy, Pandas, scikit-learn, statsmodels, TensorFlow/PyTorch)
• SQL (T-SQL, MySQL, PostgreSQL)
• R (optional)
Machine Learning
• Predictive modeling
• Deep learning (optional)
• Natural Language Processing (NLP)
• Time-series forecasting
Visualization
• Power BI, Tableau
• Matplotlib, Seaborn, Plotly
Cloud & MLOps
• Azure Machine Learning / AWS SageMaker
• Git, Docker
• CI/CD pipelines
Databases & Big Data
• Azure Data Lake / AWS S3
• Spark / Databricks
Soft Skills
• Analytical thinking
• Problem‑solving
• Stakeholder communication
• Business understanding
• Team collaboration
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description: 10 HDC3B Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve application-related challenges, ensuring the smooth operation of essential services. You will engage in problem-solving activities, analyze system performance, and contribute to the continuous improvement of application support processes, all while maintaining a focus on delivering high-quality service to stakeholders.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required to participate in and contribute actively to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the development and implementation of best practices for application support.
- Provide training and guidance to junior team members on troubleshooting techniques and tools.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apigee.
- Good-to-Have Skills: Experience with IBM API Connect, IBM WebSphere DataPower.
- Strong understanding of API management and integration techniques.
- Experience with monitoring and performance tuning of applications.
- Familiarity with incident management and ticketing systems.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Apigee.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
- Candidates need to support 24x7 shifts.
Comments for Suppliers: Apigee
Salary: Rs. 0.0 - Rs. 1,40,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Job Summary:
We are seeking a Data Analyst skilled in SQL, Python, and Tableau to collect, clean, and validate data, build ETL pipelines and interactive dashboards, and deliver reporting that supports business operations and leadership decision-making.
• Roles and Responsibilities
• Collect, clean, and validate data from multiple sources to ensure accuracy and reliability
• Develop ETL pipelines to process data from multiple sources such as CSV files, flat files, and live databases
• Build, maintain, and optimize SQL queries, stored procedures, and data pipelines
• Develop interactive dashboards and visualizations in Tableau to support business operations and leadership reporting
• Use Python for data manipulation, automation, statistical analysis, and exploratory data analysis
• Collaborate with business stakeholders to understand requirements and translate them into analytical solutions
• Perform trend analysis, forecasting, and KPI reporting
• Support data governance, documentation, and metadata management
• Troubleshoot data issues and identify opportunities for process improvement
• Work with cross-functional teams such as IT, engineering, operations, and finance
• Stay up to date with emerging technologies and trends in the data analytics space and recommend innovative solutions to improve data efficiency and quality
• Manage changes and refresh the data for all dashboards
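The collect, clean/validate, and load steps above can be sketched compactly with pandas and SQLite; the file contents, table, and column names below are illustrative assumptions, not a prescribed implementation:

```python
import sqlite3
from io import StringIO

import pandas as pd

# Stand-in for a CSV/flat-file source (contents are invented for demonstration)
raw = StringIO("order_id,amount\n1,100\n2,\n3,250\n3,250\n")

# Extract
df = pd.read_csv(raw)

# Clean and validate: drop exact duplicates, reject rows missing required fields
df = df.drop_duplicates().dropna(subset=["amount"])
assert df["order_id"].is_unique, "validation failed: duplicate order_id"

# Load into a database that downstream dashboards can query
con = sqlite3.connect(":memory:")
df.to_sql("orders", con, index=False)
print(con.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # → (2, 350.0)
```

The validation assertion is the point: a pipeline that loads silently is harder to trust than one that fails loudly on bad input.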
• Required Skills
• Strong proficiency in Tableau (dashboard creation, calculations, data blending)
• Hands-on experience with Tableau Prep and advanced Tableau development skills required
• Proficiency in Python for data analysis (pandas, NumPy, Matplotlib, etc.)
• Proficiency in at least one database such as Oracle, MySQL, MS SQL Server, or Teradata
• Hands-on experience with SQL (joins, window functions, performance optimization)
• Ability to query and display large datasets while maximizing workbook performance
• A solid understanding of SQL, relational database management systems, data modeling, and normalization
• High-level competency in Excel (macros, pivot tables, etc.)
• Experience using Agile methodology for software development
• Knowledge of Google Cloud or any other cloud platform is a plus
• Knowledge of Google Looker Studio is a plus
• Knowledge of DataIQ or any other ETL tool is a plus
• Knowledge of UI/UX tools such as Figma is an added advantage
• Knowledge of a batch scheduler such as Airflow, Control-M, UC4, Appworx, or Autosys is a great advantage
• Manufacturing domain experience is a great value add, but not mandatory
• Soft Skills :
• Excellent problem-solving and analytical skills
• Strong communication and collaboration skills
• Ability to work independently and as part of a global team
• Self-motivated and able to work in a fast-paced environment
• Detail-oriented and committed to delivering high-quality work
• Displays one-team behavior while designing end-to-end solutions
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Job Summary:
We are seeking a skilled Data Engineer with strong experience in SQL, Python, Tableau, and ETL tools to design, build, and maintain reliable data pipelines and analytics solutions. This role focuses on ensuring data quality, enabling scalable data workflows, and supporting business intelligence and reporting needs.
• Roles and Responsibilities
• Collect, clean, and validate data from multiple sources to ensure accuracy and reliability
• Develop ETL pipelines to process data from multiple sources such as CSV files, flat files, and live databases
• Build, maintain, and optimize SQL queries, stored procedures, and data pipelines
• Use Python for data manipulation, automation, statistical analysis, and exploratory data analysis
• Collaborate with cross-functional teams (data analysts, product teams, business stakeholders) to understand data requirements
• Perform trend analysis, forecasting, and KPI reporting
• Support data governance, documentation, and metadata management
• Troubleshoot data issues and identify opportunities for process improvement
• Work with cross-functional teams such as IT, engineering, operations, and finance
• Stay up to date with emerging technologies and trends in the data analytics space and recommend innovative solutions to improve data efficiency and quality
• Manage changes and refresh the data for all dashboards
Responsibilities
• Required Skills
What you need to bring: Bachelor's/Master's in Engineering, Computer Science, or equivalent experience; 3 to 5 years of experience in the IT industry, with experience in the data space a must.
Technical Skills (experience level: 3 to 5 years):
• SQL & Database Management: Expertise in querying, transforming, and optimizing data, with solid experience in relational databases (PostgreSQL, MySQL, Oracle) and NoSQL databases (MongoDB, Cassandra).
• Programming: Strong Python programming for automation and pipeline development; Scala is good to have.
• ETL/ELT Frameworks: Building pipelines to extract, transform, and load data using leading industry ETL tools such as DataIQ, Informatica, DataStage, or Alteryx.
• Data Processing Frameworks: Solid experience processing large-scale data using Apache Spark or another data processing framework; should have worked on at least one project involving a large distributed system.
• Data Modeling: Designing efficient schemas and understanding normalization/denormalization to ensure fast data retrieval; good experience creating logical and physical data models.
• Version Control: Proficiency in Git is mandatory for managing code and collaborating on pipelines.
• Cloud Platforms: Expertise in at least one cloud platform (GCP, AWS, or Azure) and its data services; GCP experience is good to have.
• Orchestration: Automating and scheduling complex workflows using tools like Apache Airflow, Prefect, or Dagster.
• Data Warehousing: Knowledge of modern cloud-native warehouses like Snowflake, Google BigQuery, Teradata, or Amazon Redshift.
• Real-Time Processing: Knowledge of handling data streams as they arrive using Kafka, Flink, or Spark Streaming.
• Data Governance & Security: Implementing encryption and access controls, and ensuring compliance with regulations like GDPR or HIPAA.
• AI/ML Integration: Building infrastructure and feature pipelines to support machine learning models.
Good-to-Have Technical Skills:
• Knowledge of or experience using Agile methodology for software development
• Knowledge of ITIL industry best practices
• Knowledge of Google Looker Studio is a plus
• Knowledge of any BI tool is a plus, preferably Tableau
• Knowledge of Pulse and Tableau Prep is an added advantage
• Knowledge of UI/UX tools such as Figma is an added advantage
• Manufacturing domain experience is a great value add, but not mandatory
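The data modeling requirement above (normalized schemas, fast retrieval via joins) can be sketched minimally with Python's built-in sqlite3; the dimension/fact tables and figures below are illustrative assumptions:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Dimension table: each customer stored exactly once (normalized)
CREATE TABLE dim_customer(
  customer_id INTEGER PRIMARY KEY,
  name        TEXT NOT NULL
);
-- Fact table references the dimension by key instead of repeating names,
-- which avoids update anomalies and keeps the fact rows narrow
CREATE TABLE fact_order(
  order_id    INTEGER PRIMARY KEY,
  customer_id INTEGER NOT NULL REFERENCES dim_customer(customer_id),
  amount      REAL NOT NULL
);
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO fact_order VALUES (10, 1, 99.0), (11, 1, 45.0), (12, 2, 80.0);
""")

# A join reassembles the denormalized view that reports and dashboards consume
rows = con.execute("""
  SELECT c.name, SUM(o.amount)
  FROM fact_order o JOIN dim_customer c USING (customer_id)
  GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # → [('Acme', 144.0), ('Globex', 80.0)]
```

The same trade-off scales up to warehouse star schemas: normalize for write integrity, join (or pre-aggregate) for read speed.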
• Soft Skills :
• Excellent problem-solving and analytical skills
• Strong communication and collaboration skills
• Ability to work independently and as part of a global team
• Self-motivated and able to work in a fast-paced environment
• Detail-oriented and committed to delivering high-quality work
• Displays one-team behavior while designing end-to-end solutions
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance