Design, development and testing of components / modules in TOP (Trade Open Platform) involving Spark, Java, Hive and related big-data technologies in a data lake architecture (see the Spark sketch after this list)
Contribute to the design, development and deployment of new features & new components in Azure public cloud
Contribute to the evolution of REST APIs in TOP – enhancement, development and testing of new APIs
Ensure that processes in TOP deliver optimal performance, and assist in performance tuning and optimization
Release & Deployment – deploy to the various environments (development, UAT and production) using CI/CD practices and tools, and follow production processes. Ensure Craftsmanship practices are followed
Follow the Agile@Scale process: participate in PI planning and follow-up, sprint planning, and backlog maintenance in Jira
Organize training sessions on the core platform and related technologies for the Tribe / Business line so that relevant stakeholders stay continuously updated on the platform's evolution
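For illustration only, a minimal PySpark sketch of the kind of component this role involves: reading a Hive table from the data lake, aggregating it, and writing the result back. The table and column names (trades_raw, counterparty, notional, trade_ts) are hypothetical, not taken from TOP.

```python
# Minimal sketch, assuming a Spark session with Hive support and an
# illustrative table schema -- names are hypothetical, not from TOP.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("top-trade-aggregation")  # hypothetical module name
    .enableHiveSupport()
    .getOrCreate()
)

# Read a raw trades table from the data lake's Hive metastore.
trades = spark.table("trades_raw")

# Example transformation: daily notional per counterparty.
daily_notional = (
    trades
    .groupBy("counterparty", F.to_date("trade_ts").alias("trade_date"))
    .agg(F.sum("notional").alias("total_notional"))
)

# Persist the result back to the lake as a partitioned Hive table.
(daily_notional.write
    .mode("overwrite")
    .partitionBy("trade_date")
    .saveAsTable("trades_daily_notional"))
```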
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
4+ years of hands-on experience in Python, the Flask framework and Core Java, with a good conceptual understanding of OOP and data engineering
At least 2 years of hands-on experience with PostgreSQL or other SQL databases
Hands-on experience with Power BI dashboard setup
Hands-on experience on the Azure cloud platform
At least 2 years of hands-on experience in web GUI development using ReactJS/AngularJS
Hands-on experience in API development (a minimal Flask sketch follows this list)
Prior experience working with CI/CD tools (Maven, Git, Jenkins)
Professional attitude: self-motivated, fast learner, team player, independent, able to handle multiple tasks and functional topics simultaneously
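For illustration only, a minimal sketch of the kind of Flask REST API this role involves, backed by PostgreSQL via psycopg2. The orders table, its columns, and the connection details are hypothetical assumptions.

```python
# Minimal Flask + PostgreSQL sketch; table/column names and connection
# string are illustrative assumptions, not from the posting.
from flask import Flask, jsonify
import psycopg2

app = Flask(__name__)

def get_connection():
    # Connection details would normally come from configuration/secrets.
    return psycopg2.connect("dbname=appdb user=app password=secret host=localhost")

@app.route("/api/orders/<int:order_id>", methods=["GET"])
def get_order(order_id):
    # Fetch a single row; the connection context manager handles the transaction.
    with get_connection() as conn, conn.cursor() as cur:
        cur.execute("SELECT id, status FROM orders WHERE id = %s", (order_id,))
        row = cur.fetchone()
    if row is None:
        return jsonify({"error": "not found"}), 404
    return jsonify({"id": row[0], "status": row[1]})

if __name__ == "__main__":
    app.run(debug=True)
```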
Salary: Rs. 0.0 - Rs. 12.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description:
1. Design, write, test, and maintain COBOL programs to meet the organization's business needs.
2. Ensure that legacy systems operate efficiently, and participate in system modernization efforts, upgrading or integrating older COBOL-based systems with newer technologies.
3. Write well-structured and efficient COBOL code based on business and system requirements.
4. Perform unit, system, and integration testing to ensure that COBOL applications function properly.
5. Create and maintain detailed documentation, including program specifications, flowcharts, and operational procedures.
6. Ensure that COBOL programs adhere to industry standards, security protocols, and best practices.
7. Support system migration and modernization projects involving COBOL and other mainframe applications.
Essential Skills:
1. Proficiency in COBOL programming (COBOL-85, COBOL II, or later versions).
2. Experience working with mainframe environments (IBM z/OS, OS/390, or similar).
3. Strong knowledge of JCL (Job Control Language) and other mainframe utilities.
4. Familiarity with databases commonly used with COBOL systems, such as DB2, IMS, or VSAM.
5. Ability to write, modify, and maintain batch and online COBOL programs.
6. Strong analytical and problem-solving skills, with the ability to understand complex business logic.
7. Ability to debug and resolve issues in COBOL programs using mainframe debugging tools (e.g., Xpediter, Abend-AID).
8. Experience with software version control and deployment processes in a mainframe environment.
Desirable Skills:
Knowledge of other mainframe high-level languages, such as Assembler or PL/I, is a plus.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Mainframe DB2 - Application Development, System Z - TPF
Job Description:
1. Design and deploy predictive models (e.g., forecasting, churn analysis, fraud detection) using Python/SQL, Spark MLlib, and Databricks ML.
2. Build end-to-end ML pipelines (data ingestion, feature engineering, model training, deployment) on Databricks Lakehouse.
3. Optimize model performance via hyperparameter tuning, AutoML, and MLflow tracking (see the sketch after this list).
4. Collaborate with engineering teams to operationalize models (batch/real-time) using Databricks Jobs or REST APIs.
5. Implement Delta Lake for scalable, ACID-compliant data workflows.
6. Enable CI/CD for ML pipelines using Databricks Repos and GitHub Actions.
7. Troubleshoot issues in Spark jobs and the Databricks environment.
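For illustration only, a minimal sketch of MLflow experiment tracking around a scikit-learn classifier, as in item 3 above. The run name, parameters, and churn-prediction framing are assumptions; on Databricks the same pattern runs with the tracking URI preconfigured.

```python
# Minimal MLflow tracking sketch; the churn framing, run name, and
# hyperparameters are illustrative assumptions, not from the posting.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real feature table.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="churn-rf-baseline"):
    params = {"n_estimators": 200, "max_depth": 8}
    mlflow.log_params(params)

    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    # Track the evaluation metric alongside the parameters.
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_metric("test_auc", auc)

    # Log the fitted model so it can be registered and served later.
    mlflow.sklearn.log_model(model, "model")
```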
Requirements:
1. 5+ years in predictive analytics, with expertise in regression, classification, and time-series modeling.
2. Hands-on experience with Databricks Runtime for ML, Spark SQL, and PySpark.
3. Familiarity with MLflow, Feature Store, and Unity Catalog for governance.
4. Industry experience in Life Insurance or P&C.
5. Skills: Python, PySpark, MLflow, Databricks AutoML.
6. Predictive modelling (classification, clustering, regression, time series, and NLP).
7. Cloud platform (Azure/AWS), Delta Lake, Unity Catalog.
8. Certifications: Databricks Certified ML Practitioner (optional)
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance