We found 316 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Responsible for understanding the software requirements and developing them into working source code. Opportunity to work with some of the best minds in a team and learn a great deal on the technical front. Get mentored and groomed on the best practices followed in the software industry. Internal interfaces: team members, technical leader, business analyst.

Responsibilities

  • Salary: Rs. 15,00,000.0 - Rs. 18,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Lead Software Engineer

Job Description

Unix Production Support

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Unix Production Support

Job Description

Total years of experience: 3-5 yrs. Relevant experience: 3 yrs. Mandatory skills: PeopleSoft application development. Good to have (not mandatory): functional knowledge of PeopleSoft modules (HCM), functionality, or enhancements, or support of modules or functionality already in production. Detailed job description: overall development experience; strong knowledge of appropriate tools and languages, including PeopleTools, Application Engine, SQL, PSQuery, File Layout, PeopleCode, CI, XML Publisher, Integration Broker, retrofitting, migration, etc.; analytical and design skills; good communication skills to communicate with all stakeholders (client and team).

Responsibilities

  • Salary: Rs. 5,00,000.0 - Rs. 20,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Application Developer - PeopleSoft

Job Description

SharePoint Developer

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Developer

Job Description

Description: Primary skill: SAP ABAP
  • Minimum 3 to 5 years of work experience in SAP ABAP across technical design, development, testing, and documentation.
  • At least one implementation experience required; should be comfortable providing production support as per SLA.
  • Experience in ABAP procedural and object-oriented logic, BAPIs, BAdIs, IDocs, customer modifications, Enhancement Framework, SAPscript, and SmartForms.
  • Ability to prepare technical specifications.
  • Exposure to ABAP Objects, SAP queries, eCATT, and LDB; experience in the FI, SD, or MM modules is an added advantage.
  • Good verbal and written communication skills.
  • Willing to work in shifts (11 AM to 8:30 PM).
This is a purely offshore-based position; the work location is Hyderabad.

Responsibilities

  • Salary: Rs. 0.0 - Rs. 16.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP ABAP Consultant - HPJP00109809

Job Description

Executive Assistant

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Executive Assistant

Job Description

Position Summary: Experienced ETL developers and data engineers to ingest and analyze data from multiple enterprise sources into Adobe Experience Platform.
What you'll do:
  • Analyze and understand the customer's use case and data sources; extract, transform, and load data from a multitude of the customer's enterprise sources and ingest it into Adobe Experience Platform.
  • Ensure ingestion is designed and implemented in a performant manner to support the throughput and latency needed.
  • Develop and test complex SQL to extract, analyze, and report on the data ingested into the Adobe Experience Platform.
  • Ensure the SQL is implemented in compliance with best practices so it is performant.
  • Migrate platform configurations, including the data ingestion pipelines and SQL, across various sandboxes.
  • Debug and resolve any issues reported on data ingestion, SQL, or any other functionality of the platform.
  • Support data architects in implementing the data model in the platform.
  • Contribute to the innovation charter and develop intellectual property for the organization.
  • Present on advanced features and complex use-case implementations at multiple forums.
  • Attend regular scrum events (or equivalent) and provide updates on the deliverables.
  • Work independently across multiple engagements with no or minimal supervision.
Requirements: About 6 years of professional technology experience, mostly focused on the following:
  • 4-6 years of developing and supporting ETL pipelines using enterprise-grade ETL tools like Pentaho, Informatica, or Talend.
  • 4-6 years of experience and the ability to write and analyze complex, performant SQL.
  • 2+ years of experience with multiple data-engineering services on AWS, Azure, or GCP, e.g. EMR, Glue, Athena, DynamoDB, Kinesis, Kafka, Redshift, etc.
  • Experience developing applications that consume services exposed as REST APIs.
  • Good knowledge of data modelling (design patterns and best practices).
Special consideration given for:
  • 1-2 years (not more) of experience with the open-source big data tech stack: PySpark (batch and streaming), Hive, Hadoop, etc.
  • Experience with reporting technologies (e.g. Tableau, Power BI).
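The extract-transform-load-then-query workflow this posting describes can be sketched in a few lines of plain Python. This is a toy illustration only: the records, table, and column names are hypothetical, and a real pipeline would pull from enterprise connectors and load into Adobe Experience Platform rather than an in-memory SQLite database.

```python
import sqlite3

# Hypothetical sample records standing in for data pulled from an
# enterprise source (in practice: a JDBC connector, REST API, file drop, ...).
raw_events = [
    {"user": "u1", "channel": "email", "spend": "12.50"},
    {"user": "u2", "channel": "web", "spend": "7.25"},
    {"user": "u1", "channel": "web", "spend": "3.00"},
]

def transform(record):
    # Normalize types before loading: cast spend to float, upper-case channel.
    return (record["user"], record["channel"].upper(), float(record["spend"]))

# Load into a queryable store (in-memory SQLite as a stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, channel TEXT, spend REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [transform(r) for r in raw_events])

# Analysis query of the kind the posting describes: aggregate and report.
rows = conn.execute(
    "SELECT user, SUM(spend) AS total FROM events GROUP BY user ORDER BY user"
).fetchall()
print(rows)  # [('u1', 15.5), ('u2', 7.25)]
```

The same extract/transform/load/query shape carries over whether the target is SQLite, Redshift, or a platform data lake; only the connectors and the SQL dialect change.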

Responsibilities

  • Salary: Rs. 15,00,000.0 - Rs. 25,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Engineer

Job Description

Experience in Hadoop, Elasticsearch, and a machine-learning technology stack (Python, R, Spark, etc.).
  1. Minimum 3 years of experience in big data, with prior experience in Java and ETL.
  2. Provide hands-on leadership for the design and development of ETL data flows using Hadoop and Spark ecosystem components, leading the development of large-scale, high-speed, low-latency data solutions: large-scale data manipulation, long-term data storage, data warehousing, low-latency retrieval systems, real-time reporting and analytics, and data applications (visualization, BI, dashboards, ad-hoc analytics).
  3. Must have hands-on experience with Spark, Kafka, Hive/Pig, API development, and an ETL tool, preferably Talend (any other tool is fine as long as the candidate has a strong hold on the ETL process).
  4. Must have core Java knowledge; Spring and Hibernate are good to have.
  5. Strong hold on SQL / PL/SQL.
  6. Must have hands-on experience with Unix scripting.
  7. Translate complex functional and technical requirements into detailed designs.
  8. Perform analysis of data sets and uncover insights.
  9. Maintain security and data privacy.
  10. Propose best practices/standards.
  11. Excellent communication skills.
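The batch-processing pattern behind the Hadoop/Spark tooling this posting asks for is map/reduce. A minimal sketch in plain Python, with entirely hypothetical log data (a real job would use the PySpark API and distribute both steps across executors):

```python
from collections import defaultdict

# Hypothetical log lines standing in for a large input file; a real job
# would read these from HDFS or object storage.
lines = [
    "ERROR disk full",
    "INFO job started",
    "ERROR network timeout",
]

# Map step: emit (key, 1) pairs keyed by log level.
# Spark equivalent: rdd.map(lambda line: (line.split()[0], 1))
pairs = [(line.split()[0], 1) for line in lines]

# Reduce step: sum the counts per key.
# Spark equivalent: pairs.reduceByKey(operator.add)
counts = defaultdict(int)
for level, n in pairs:
    counts[level] += n

print(dict(counts))  # {'ERROR': 2, 'INFO': 1}
```

Frameworks like Spark add partitioning, shuffling, and fault tolerance around this same two-step shape, which is why the map and reduce functions themselves stay small.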

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: App Dev - Senior Big Data / Hadoop

Job Description

VB6 Developer

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: VB6 Developer