We found 1751 jobs matching your search


Job Description

SSE Data Engineer - (2600008N)

Missions Overview

We are seeking a skilled Data Engineer with strong expertise in Oracle APEX and SQL to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining data solutions that support business intelligence, reporting, and analytics initiatives. This role requires a deep understanding of database development, data integration, and application development using Oracle technologies.

Key Responsibilities
  • Design, develop, and maintain scalable data pipelines and ETL processes using SQL and Oracle APEX.
  • Build and optimize database schemas, queries, and stored procedures to ensure high performance and data integrity.
  • Develop and maintain Oracle APEX applications to facilitate data entry, reporting, and dashboarding for business users.
  • Collaborate with data analysts, business stakeholders, and IT teams to gather requirements and translate them into technical solutions.
  • Monitor and troubleshoot data workflows, identifying and resolving data quality issues promptly.
  • Implement data security and governance best practices within Oracle environments.
  • Document technical specifications, data models, and workflows for knowledge sharing and compliance.
  • Stay updated with the latest Oracle APEX features and SQL advancements to continuously improve data solutions.

Profile: Required Skills and Qualifications
  • Proven experience as a Data Engineer or in a similar role with a strong focus on Oracle APEX and SQL development.
  • Proficiency in writing complex SQL queries, PL/SQL programming, and performance tuning.
  • Hands-on experience in developing Oracle APEX applications, including interactive reports, forms, and dashboards.
  • Strong understanding of relational database concepts, data modeling, and ETL processes.
  • Familiarity with data warehousing concepts and business intelligence tools is a plus.
  • Ability to work collaboratively in an Agile development environment.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication skills to interact effectively with both technical and non-technical stakeholders.
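The staging-to-reporting ETL work described above can be illustrated with a minimal, self-contained sketch. This is not the employer's actual pipeline: sqlite3 stands in for an Oracle connection, and the table and column names are invented for the example.

```python
import sqlite3

# sqlite3 stands in for an Oracle connection; the staging and summary
# tables below are hypothetical, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_sales (region TEXT, amount REAL)")
conn.execute("CREATE TABLE sales_summary (region TEXT PRIMARY KEY, total REAL)")
conn.executemany(
    "INSERT INTO staging_sales VALUES (?, ?)",
    [("NORTH", 120.0), ("NORTH", 80.0), ("SOUTH", 50.0)],
)

# A typical load step: aggregate staged rows into a reporting table.
conn.execute(
    """
    INSERT INTO sales_summary (region, total)
    SELECT region, SUM(amount) FROM staging_sales GROUP BY region
    """
)

totals = dict(conn.execute("SELECT region, total FROM sales_summary"))
print(totals)  # {'NORTH': 200.0, 'SOUTH': 50.0}
```

In an Oracle environment the same pattern would typically live in a PL/SQL procedure, with the aggregation step exposed to an APEX interactive report.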

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SSE Data Engineer

Job Description

Senior Pega Business Architect

Responsibilities

  • Salary: Rs. 0 - Rs. 1,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004606/551604 - Senior Pega Business Architect - Blr, Pune, Che, Hyd - EAIS

Job Description

Description:

1. Core AWS Data Services Experience
  • Design, build, and maintain data pipelines using AWS Glue and Step Functions
  • Build and manage data lakes on Amazon S3 with proper partitioning, lifecycle policies, and open table formats (Apache Iceberg, Delta Lake, Hudi)
  • Implement data warehousing solutions using Amazon Redshift (including Redshift Spectrum for lakehouse queries)
  • Use Amazon Athena for serverless ad-hoc querying over S3 data
  • Configure data cataloging and metadata management with AWS Glue Data Catalog and AWS Lake Formation

2. Data Pipeline & ETL Engineering
  • Design and implement ETL/ELT pipelines for batch and real-time data processing
  • Orchestrate complex data workflows with scheduling, monitoring, error handling, and retry logic
  • Build incremental/CDC (Change Data Capture) ingestion patterns
  • Optimize data pipelines for cost and performance (partitioning, compression, file format selection: Parquet, ORC, Avro)
  • Implement data quality checks and validation within pipelines

3. Programming & Technical Skills
  • Strong proficiency in Python and/or Scala for data processing
  • Advanced SQL skills (Redshift SQL, Spark SQL, Athena)
  • Experience with Apache Spark (via EMR or Glue)
  • Infrastructure as Code using CDK, CloudFormation, or Terraform
  • Version control with Git and CI/CD pipeline integration

4. Data Modeling & Architecture
  • Design dimensional models (star/snowflake schemas) for analytical workloads
  • Implement lakehouse architecture patterns (combining data lake + data warehouse)
  • Design data mesh or hub-and-spoke architectures for decentralized data ownership
  • Define data partitioning strategies, indexing, and compression for query optimization
  • Schema evolution and schema registry management

5. Identity-Based Access Control & Lake Formation
  • Implement centralized data governance using AWS Lake Formation with fine-grained permissions at the table, column, row, and cell level
  • Design and implement Lake Formation Tag-Based Access Control (LF-TBAC) to dynamically assign permissions based on metadata tags
  • Integrate Lake Formation with IAM roles, IAM Identity Center (SSO), and SAML-based federation
  • Enable secure cross-account data sharing using Lake Formation + AWS RAM for data mesh patterns
  • Ensure Lake Formation permissions are enforced consistently across Athena, Redshift Spectrum, Glue ETL, and EMR

6. Data Governance, Security & Compliance
  • Apply data masking, anonymization, and PII identification for compliance (e.g., GDPR)
  • Manage encryption at rest (KMS) and in transit (TLS)
  • Implement data lineage tracking and audit logging (CloudTrail, CloudWatch)
  • Implement data location permissions to control catalog resource creation for specific S3 paths

7. Monitoring, Observability & Operations
  • Set up monitoring and alerting using Amazon CloudWatch
  • Troubleshoot pipeline failures and performance bottlenecks
  • Manage data lifecycle policies (tiering, archival, expiration)
  • Cost optimization of data workloads (S3 storage classes, Redshift reserved nodes, Glue DPU tuning)

8. AI/ML Data Readiness
  • Prepare and curate datasets for machine learning and generative AI workloads
  • Integration with Amazon SageMaker and/or Amazon Bedrock
  • Build feature stores and vector data pipelines
  • Support RAG (Retrieval-Augmented Generation) patterns with governed data assets

Recommended Certifications:
  • AWS Certified Data Engineer – Associate (DEA-C01): most directly relevant
  • AWS Certified Solutions Architect – Associate/Professional
  • AWS Certified Machine Learning – Specialty (for ML-heavy platforms)
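The partitioning strategy mentioned above (Hive-style year/month/day prefixes on S3, which Glue, Athena, and Spark can all prune on) can be sketched as a small helper. The bucket name and layout here are hypothetical; a real pipeline would derive them from job configuration.

```python
from datetime import date

def partition_prefix(event_date: date,
                     base: str = "s3://analytics-lake/events") -> str:
    """Build a Hive-style year=/month=/day= partition prefix.

    The bucket and dataset path are invented for this sketch; the
    zero-padded month/day keep prefixes lexicographically sortable.
    """
    return (
        f"{base}/year={event_date.year}"
        f"/month={event_date.month:02d}"
        f"/day={event_date.day:02d}/"
    )

print(partition_prefix(date(2024, 3, 7)))
# s3://analytics-lake/events/year=2024/month=03/day=07/
```

Writing Parquet files under prefixes like this lets Athena and Redshift Spectrum skip whole partitions when a query filters on date, which is the main cost/performance lever the description alludes to.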

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: AWS Data Service

Job Description

Audit

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Internal Audit

Job Description

As an AI / ML Engineer, a typical day involves designing and building advanced applications and systems that leverage artificial intelligence technologies and cloud-based AI services. The role requires integrating these solutions into robust production pipelines, whether deployed on cloud platforms or on-premises environments. The engineer actively works on implementing generative AI models and may engage in diverse areas such as deep learning, neural networks, conversational agents, and image analysis. This position demands continuous innovation and adaptation to evolving AI methodologies to deliver high-quality, scalable solutions that meet organizational needs.

Roles & Responsibilities:
  • Expected to be an SME; collaborate with and manage the team to perform.
  • Responsible for team decisions.
  • Engage with multiple teams and contribute to key decisions.
  • Provide solutions to problems for their immediate team and across multiple teams.
  • Lead the design and implementation of AI-driven solutions, ensuring alignment with project goals.
  • Mentor junior team members to foster skill development and knowledge sharing.
  • Coordinate cross-functional efforts to optimize AI model deployment and maintenance.
  • Continuously evaluate emerging AI technologies to enhance existing systems and processes.

Professional & Technical Skills:
  • Must-have skills: proficiency in Python (programming language).
  • Strong experience in developing and deploying AI and machine learning models using Python.
  • Familiarity with cloud AI services and integrating AI solutions within cloud or on-premises pipelines.
  • Knowledge of generative AI models and their practical applications in real-world scenarios.
  • Experience with deep learning frameworks and neural network architectures.
  • Ability to work with image processing techniques and chatbot development.

Additional Information:
  • The candidate should have a minimum of 5 years of experience in Python.
  • This position is based at our Pune office.
  • 15 years of full-time education is required.
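As a toy stand-in for the Python model-building work described above, here is a one-dimensional logistic regression trained with plain gradient descent. The dataset and hyperparameters are invented for illustration; real work in this role would use a deep learning framework rather than hand-rolled training loops.

```python
import math

# Invented, linearly separable toy data: label flips between x=2 and x=3.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]

# Logistic regression: p = sigmoid(w*x + b), fit by batch gradient descent.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    grad_w = grad_b = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        grad_w += (p - y) * x  # derivative of log-loss w.r.t. w
        grad_b += (p - y)      # derivative of log-loss w.r.t. b
    w -= lr * grad_w / len(xs)
    b -= lr * grad_b / len(xs)

def predict(x: float) -> bool:
    """Classify x using the fitted decision boundary."""
    return 1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5

preds = [predict(x) for x in xs]
print(preds)
```

The same gradient structure (prediction error times input) underlies backpropagation in the neural-network frameworks the posting mentions.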

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: AI / ML Engineer

Job Description

Lead Software Engineer - (25000P90)

Missions

An expert developer with 8+ years of experience.

Key skills
  • Level 1 (Expertise, Mandatory): Git; CI/CD experience (Jenkins, GitHub Actions, or TFS); .NET Core (versions 6 and 8); .NET Framework (version 4.8); API culture (REST, Swagger, documentation, performance, API contracts); Graph API; Azure services (Azure Functions, App Service); PostgreSQL; SQL Server
  • Level 2 (Expected): Blazor, React (version 18), Kubernetes, Windows Server, Microsoft Exchange and O365 knowledge, Active Directory
  • Level 3 (Nice to have): Agile, Copilot

Soft skills
  • Team spirit for strong cohesion and synergy in the service of Société Générale
  • Commitment and responsibility to ensure the quality and safety of our deliveries in line with GTE objectives
  • Compliance with internal and external regulations, rules, and procedures applicable within the scope of the position, and completion of mandatory training
  • …

Expected tasks and activities

The Developer will perform the following tasks:
  • Participate in the design, development, integration, and maintenance of the ecosystem related to the application scope.
  • Participate in maintaining operational conditions and ensuring compliance with the quality of service of the application scope.
  • Ensure that the ecosystem related to the application scope complies with the architectural principles and security policy defined within the Group.
  • Be a proactive contributor to business partners.
  • Contribute to maintaining a high level of skills within the team.
  • Contribute to the stability of the applications in production and to compliance with DWS Control Tower processes.

Profile

An expert developer with 8+ years of experience, with the key skills listed above.
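The "API culture (REST, Swagger, API contracts)" requirement in the key skills boils down to checking that payloads conform to an agreed schema. A minimal, language-agnostic sketch of that idea follows; the contract fields and payloads are hypothetical, and a .NET codebase would express the same check with typed DTOs or a schema validator.

```python
# A hypothetical field -> expected-type contract for an API response.
CONTRACT = {"id": int, "email": str, "active": bool}

def validate(payload: dict, contract: dict = CONTRACT) -> list:
    """Return a list of contract violations (empty means conformant)."""
    errors = []
    for field, expected in contract.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

print(validate({"id": 7, "email": "a@b.com", "active": True}))  # []
print(validate({"id": "7", "email": "a@b.com"}))
# ['id: expected int', 'missing field: active']
```

In practice the contract would come from the Swagger/OpenAPI definition rather than a hand-written dict, so producer and consumer validate against the same source of truth.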

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Lead Software Engineer - (25000P90)

Job Description

Must have: Java (6 years), Angular (3 years), Microservices (2 years), Spring Boot (2 years)

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java

Job Description

Job Description:
  1. Employee Data Management (EDM): updating employee records in the SAP platform.
  2. Key pointers: minimum experience of 1 year in EDM.
  3. Shift timings: 05:30 to 3:00 AM and 06:30 to 4:00 AM.
  4. WFH / WFO: Hybrid; 4 days of WFO and 1 day of WFH.

Responsibilities

  • Salary: Rs. 2,54,400 - Rs. 3,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Human Resources Practitioner

Job Description

OpenShift DevOps Engineer

Responsibilities

  • Salary: Rs. 0 - Rs. 1,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: INFYSYJP00004594/563507_Openshift DevOps Engineer