Job Title: Developer
Work Location: Mumbai, MH / New Delhi / Bangalore, KA / Chennai, TN / Bhubaneswar, OD / Hyderabad, TG
Skill Required: Digital: Oracle SOA
Experience Range: 4-6 Years
Job Description:
Role: OSB & SOA Developer / Designer / Specialist
Required Technical Skill Set: OSB, SOA design
Must-Have:
1. XML, XSD, and WSDL concepts
2. OSB design and development
3. SOA design and development
Good-to-Have:
1. Knowledge of WebLogic Server
2. Core Java
Skills Required:
- Design of OSB flows and development of OSB services
- Design of SCA flows and development of SOA composites
- Development of Java components that can be reused in OSB or SOA architectures
- Ability to debug and resolve defects during testing
- Ability to handle production incidents
- Guiding team members
- Preparing technical designs
Essential Skills: Oracle SOA
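The must-have XML/XSD/WSDL concepts above come down to working with namespaced payloads. A minimal sketch in Python (the stack itself is Java/WebLogic-based, but the parsing idea is the same); the service namespace and element names here are hypothetical, for illustration only:

```python
# Parse a namespaced SOAP-style payload and extract a field -- the kind of
# message handling an OSB/SOA flow performs. Names are hypothetical.
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/orders"  # hypothetical service namespace

payload = f"""
<soapenv:Envelope xmlns:soapenv="{SOAP_ENV}" xmlns:ord="{SVC_NS}">
  <soapenv:Body>
    <ord:GetOrderRequest>
      <ord:OrderId>ORD-1001</ord:OrderId>
    </ord:GetOrderRequest>
  </soapenv:Body>
</soapenv:Envelope>
"""

def extract_order_id(xml_text: str) -> str:
    """Parse the envelope and pull out the OrderId element text."""
    root = ET.fromstring(xml_text)
    ns = {"soapenv": SOAP_ENV, "ord": SVC_NS}
    node = root.find(".//ord:OrderId", ns)
    if node is None:
        raise ValueError("OrderId element missing")
    return node.text

print(extract_order_id(payload))  # ORD-1001
```

In a real OSB pipeline the equivalent extraction would be an XPath expression against the message body, validated against the WSDL's XSD types.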
Salary: Rs. 70,000 - Rs. 1,40,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Partner with business stakeholders to understand hiring needs, role requirements, and recruitment timelines.
Develop job descriptions and specifications as per role requirements.
Source candidates using multiple channels: job portals, LinkedIn/media, networking, referrals, campus drives (if relevant), and talent pool.
Screen resumes/applications, conduct phone/virtual/face‑to‑face interviews, assess candidate fit (skills, culture).
Salary: Rs. 15,000 - Rs. 18,700
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description:
Primary Skills: Java Spring Boot microservices with GCP, Kafka
- Design, develop, and maintain scalable full-stack applications using Java (Spring Boot) and React.
- Build and integrate event-driven systems using Apache Kafka within a GCP environment.
- Develop RESTful APIs and work with microservices architecture.
- Collaborate with cross-functional teams (DevOps, Product, QA) to deliver high-quality solutions.
- Ensure system responsiveness, performance, and scalability.
- Participate in code reviews, testing, and debugging.
- Leverage GCP services (e.g., Pub/Sub, Cloud Functions, BigQuery) to optimize application performance.
- Write clean, maintainable, and testable code.
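The event-driven pattern described above can be sketched with an in-process queue standing in for a Kafka topic, so no broker is needed. Topic and event names are hypothetical; in the actual stack a Spring Boot `@KafkaListener` (or a Python Kafka consumer) would play the consumer role:

```python
# Producer/consumer decoupling via a topic: the core of the event-driven
# Kafka pattern. An in-memory queue stands in for the (hypothetical)
# "order-events" topic for illustration.
import json
import queue

topic = queue.Queue()  # stand-in for a Kafka topic

def publish(event_type: str, data: dict) -> None:
    """Producer side: serialize the event and put it on the topic."""
    topic.put(json.dumps({"type": event_type, "data": data}))

def consume_all(handlers: dict) -> list:
    """Consumer side: drain the topic and dispatch each event by type."""
    results = []
    while not topic.empty():
        event = json.loads(topic.get())
        handler = handlers.get(event["type"])
        if handler:
            results.append(handler(event["data"]))
    return results

publish("order.created", {"id": 1, "amount": 250})
publish("order.created", {"id": 2, "amount": 100})
results = consume_all({"order.created": lambda d: d["amount"]})
print(sum(results))  # 350
```

The key design point the role calls for is the same as here: producers and consumers share only the topic and the event schema, never direct references to each other.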
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description:
1. Good understanding of data warehousing concepts, with hands-on experience across multiple databases, including Oracle, PostgreSQL, and MySQL.
2. Extensive experience in Databricks, with a focus on building scalable data solutions.
3. Proven ability to design, develop, and maintain robust ETL/ELT pipelines using Databricks to extract, transform, and load data from diverse sources into target systems.
4. Strong understanding of data integration from both structured and unstructured sources such as relational databases, flat files, APIs, and cloud storage.
5. Skilled in implementing data validation, cleansing, and reconciliation processes to ensure high data quality and integrity.
6. Hands-on experience with the AWS cloud platform, leveraging its services for data processing and storage.
7. Familiar with Agile processes and DevOps practices, including Jira, Confluence, GitHub, and CI/CD pipelines.
8. Excellent communication skills and a strong team player with a collaborative mindset.
Tools & Technologies: Databricks, AWS Glue, Redshift, Oracle DB, Python, Jira, Confluence
Salary: Rs. 55,000 - Rs. 95,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Engineer
Work Location: Chennai / Hyderabad / Bangalore / Kolkata (PAN India)
Skill Required: Unix shell scripting and text-processing tools, application server deployment & administration, statistics & analytics
Actual Experience Required: 6-8 years (overall 5+ years considered)
SAS Admin experience: minimum 2 years
Shift: 2 PM - 11 PM
Flexibility of WFH: 3-4 hours
Mandatory: SAS Admin certifications
Job Description:
Responsibilities:
- Manage and support multiple SAS environments, ensuring high availability and performance.
- Conduct regular system maintenance and updates, including patches and upgrades.
- Monitor system performance and troubleshoot issues as they arise.
- Implement security measures to protect data and ensure compliance with organizational policies.
- Collaborate with analytics and IT teams to optimize SAS application performance.
- Develop and maintain system documentation and standard operating procedures related to SAS administration.
- Provide technical support for end users, including issue resolution and training.
- Ensure data integrity and confidentiality within the SAS environment.
Essential Skills: SAS administration, SAS platform architecture, performance tuning, system optimization, Unix/Linux scripting, Windows Server management, data security, troubleshooting
Comments for Suppliers: SAS Admin certifications are mandatory.
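The "monitor system performance and troubleshoot issues" duty often reduces to scanning server logs with Unix text-processing tools; here is the same idea as a small Python sketch (a grep/awk equivalent). The log format is hypothetical:

```python
# Tally log lines by severity level -- a typical first step when
# troubleshooting a SAS/application-server environment. Log format is
# hypothetical: "date time LEVEL message".
import re

log = """\
2024-05-01 10:00:01 INFO  server started
2024-05-01 10:05:12 ERROR connection pool exhausted
2024-05-01 10:06:03 WARN  high memory usage: 91%
2024-05-01 10:07:44 ERROR connection pool exhausted
"""

def count_by_level(text: str) -> dict:
    """Count lines per severity (third whitespace-separated field)."""
    counts = {}
    for line in text.splitlines():
        m = re.match(r"\S+ \S+ (\w+)", line)
        if m:
            counts[m.group(1)] = counts.get(m.group(1), 0) + 1
    return counts

print(count_by_level(log))  # {'INFO': 1, 'ERROR': 2, 'WARN': 1}
```

The shell one-liner equivalent would be something like `awk '{print $3}' server.log | sort | uniq -c`.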
Salary: Rs. 70,000 - Rs. 1,10,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Developer
Work Location: Mumbai, MH / Hyderabad, TG / Chennai, TN / Bhubaneswar, OR / Bangalore, KA
Duration: 6 months (Extendable)
Skill Required: Digital: Microsoft Azure, Digital: Python for Data Science, Digital: Databricks, Digital: PySpark, Azure Data Factory
Experience Range in Required Skills: 6-8 Years
Job Description:
1. Designing and implementing data ingestion pipelines from multiple sources using Azure Databricks.
2. Developing scalable and re-usable frameworks for ingesting data sets.
3. Integrating the end-to-end data pipeline - to take data from source systems to target data repositories ensuring the quality and consistency of data is maintained at all times.
4. Working with event-based streaming technologies to ingest and process data.
5. Working with other members of the project team to support delivery of additional project components (API interfaces, Search).
6. Evaluating the performance and applicability of multiple tools against customer requirements
Key Responsibilities:
1. Develop and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Azure Databricks.
2. Implement data ingestion flows from diverse sources including Azure Blob Storage, Azure Data Lake, On-Prem SQL, and SFTP.
3. Design and optimize data models and transformations using Oracle, Spark SQL, PySpark, SQL Server, Progress DB SQL.
4. Build orchestration workflows in ADF using activities like Lookup, ForEach, Execute Pipeline, and Set Variable.
5. Perform root cause analysis and resolve production issues in pipelines and notebooks. Collaborate on CI/CD pipeline creation using Azure DevOps and Jenkins.
6. Apply performance tuning techniques to Azure Synapse Analytics and SQL DW.
7. Maintain documentation including runbooks, technical design specs, and QA test cases.
8. Data Pipeline Engineering: design and implement scalable, fault-tolerant data pipelines using Azure Synapse and Databricks.
9. Ingest data from diverse sources including flat files, DB2, NoSQL, and cloud-native formats (CSV, JSON).
Technical Skills Required:
- Cloud Platforms: Azure (ADF, ADLS, ADB, Azure SQL, Synapse, Cosmos DB)
- ETL Tools: Azure Data Factory, Azure Databricks
- Programming: SQL, PySpark, Spark SQL
- DevOps/Automation: Azure DevOps, Git, CI/CD, Jenkins
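The ADF orchestration pattern named in the responsibilities (Lookup feeding a ForEach that calls Execute Pipeline) can be sketched as plain Python control flow. Table and pipeline names are hypothetical; in ADF each of these is a pipeline activity configured in the designer, not a function call:

```python
# Lookup -> ForEach -> Execute Pipeline, the standard ADF metadata-driven
# ingestion pattern, sketched in Python. All names are hypothetical stubs.
def lookup_tables():
    """Lookup activity: fetch the list of tables to ingest (stubbed)."""
    return [{"name": "customers"}, {"name": "orders"}]

def execute_copy_pipeline(table_name: str) -> str:
    """Execute Pipeline activity: run the per-table copy pipeline (stubbed)."""
    return f"copied {table_name}"

def run_orchestration() -> list:
    """ForEach activity: iterate over the Lookup output and fan out."""
    results = []
    for item in lookup_tables():
        results.append(execute_copy_pipeline(item["name"]))
    return results

print(run_orchestration())  # ['copied customers', 'copied orders']
```

The design point is that the table list lives in metadata (the Lookup source), so adding a new ingestion target means adding a row, not a pipeline.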
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance