Responsibilities
As a Data Engineer, you will engage in the design, development, and maintenance of data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices within the organization.
Roles & Responsibilities:
Expected to perform independently and become an SME.
Active participation/contribution in team discussions is required.
Contribute to providing solutions to work-related problems.
Assist in the optimization of data processes to improve performance and reliability.
Collaborate with cross-functional teams to gather requirements and deliver data solutions.
Professional & Technical Skills:
Must-Have Skills: Proficiency in Data Warehouse ETL Testing, Ab Initio, and Unix Shell Scripting.
Must be able to write advanced SQL (Teradata, SQL Server, or Oracle); see the sketch below.
Strong understanding of data integration techniques and methodologies.
Experience with data quality assessment and validation processes.
Familiarity with database management systems and data modeling concepts.
Additional Information:
The candidate should have a minimum of 5 years of experience in Data Warehouse ETL Testing.
This position is based at our Pune office.
15 years of full-time education is required.
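For illustration only, the Python sketch below shows the kind of ETL reconciliation check this role involves: comparing row counts and a simple aggregate between a source and a target table. Table and column names are hypothetical, and SQLite is used purely so the snippet runs anywhere; in practice the same queries would target Teradata, SQL Server, or Oracle through the appropriate driver.

```python
# Minimal ETL reconciliation sketch (illustrative only). Table and column
# names are hypothetical; real checks would run against the warehouse engine
# named in the requirements via its own driver.
import sqlite3

CHECKS = {
    "row_count": "SELECT COUNT(*) FROM {t}",
    "amount_sum": "SELECT ROUND(SUM({a}), 2) FROM {t}",
}

def reconcile(conn, source_table, target_table, amount_col):
    """Compare row counts and a simple aggregate between source and target."""
    cur = conn.cursor()
    mismatches = []
    for name, sql in CHECKS.items():
        src = cur.execute(sql.format(t=source_table, a=amount_col)).fetchone()[0]
        tgt = cur.execute(sql.format(t=target_table, a=amount_col)).fetchone()[0]
        if src != tgt:
            mismatches.append((name, src, tgt))
    return mismatches

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # stand-in for the real source/target systems
    conn.executescript("""
        CREATE TABLE stg_orders (id INTEGER, amount REAL);
        CREATE TABLE dw_orders  (id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10.5), (2, 20.0);
        INSERT INTO dw_orders  VALUES (1, 10.5), (2, 20.0);
    """)
    print(reconcile(conn, "stg_orders", "dw_orders", "amount") or "All checks passed")
```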
Salary : Rs. 0.0 - Rs. 1,50,000.0
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Responsibilities
What will you do?
Work with the runtime Cybersecurity Advisor/Coach to ensure each release of the runtime SDK is developed according to the Secure Development Lifecycle (SDL) and meets internal and external cybersecurity standards, regulatory compliance requirements, and the needs of customers.
Provide cybersecurity expertise through guidance on architecture, design, and threat modeling, mentoring team members during the development cycle. Perform cybersecurity code reviews for pull requests as part of the SDL process.
Evaluate, track, and resolve product and runtime cybersecurity issues and related technical debt in 3rd-party packages, reported both internally and from external sources:
Cybersecurity vulnerabilities (CVEs)
OS/package patches: Debian GNU/Linux, VxWorks
Commercial/FOSS packages: Mongoose, UA-HPSDK, OpenSSL, mbedTLS, fmt, libyuarel, Frozen, optionparser, zlib, among others.
Management and use of tools for static and dynamic code analysis (Coverity, SQuORE, Helgrind, Valgrind, CppCheck) and Software Composition Analysis (Black Duck Binary Analysis, Black Duck Hub) on 3rd-party packages and the current code base.
Address false-positive findings, evaluate and triage bugs, resolving or assigning them to an SME as appropriate.
Evaluate BDBA/BDH findings and work with the runtime teams to resolve them.
Ensure qualimetry data for all significant branches (master branch, release branches, component branches) is current and accessible for use by management
Set up support for new releases as needed.
Run regular/scheduled and on-demand scans to detect abnormalities in a timely manner (see the sketch below).
Monitor changes and send notifications if findings are trending upward.
Create and update formal reports on branches.
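As a hedged example of the scheduled scanning and trend monitoring described above, the Python sketch below runs CppCheck over a source tree, counts findings, and warns when the count rises against the previous run. The source directory, history file, and alerting approach are assumptions, not the team's actual Coverity/BDBA pipeline.

```python
# Sketch of a scheduled static-analysis scan with simple trend alerting.
# Paths and the notification mechanism are illustrative assumptions.
import json
import subprocess
from pathlib import Path

HISTORY = Path("scan_history.json")  # hypothetical local history of finding counts

def run_cppcheck(src_dir: str) -> int:
    """Run cppcheck and return the number of reported findings."""
    result = subprocess.run(
        ["cppcheck", "--enable=all", "--template={file}:{line}:{id}", src_dir],
        capture_output=True, text=True,
    )
    # cppcheck emits findings on stderr, one per line with this template.
    return sum(1 for line in result.stderr.splitlines() if line.strip())

def check_trend(current: int) -> None:
    """Append the current count and warn if it exceeds the previous scan."""
    previous = json.loads(HISTORY.read_text()) if HISTORY.exists() else []
    if previous and current > previous[-1]:
        print(f"WARNING: findings trending upward ({previous[-1]} -> {current})")
    HISTORY.write_text(json.dumps(previous + [current]))

if __name__ == "__main__":
    check_trend(run_cppcheck("src/"))
```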
Salary : Rs. 10,00,000.0 - Rs. 25,00,000.0
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Responsibilities
Mandatory skills : Snowflake architecture knowledge, Azure, SQL, user access management (see the sketch below), performance tuning and query optimization, cloud integration.
Good-to-have skills : Leadership, collaboration, teamwork.
Soft skills : Very good communication skills (verbal and written).
Any specific domain expertise : Azure.
Certifications : Any administration- or Snowflake-related certification.
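The sketch below illustrates the user access management piece using standard Snowflake RBAC GRANT statements executed through the Python connector. The account, warehouse, database, role, and user names are hypothetical.

```python
# Illustrative Snowflake RBAC sketch; all object names are assumptions.
import snowflake.connector  # pip install snowflake-connector-python

GRANTS = [
    "CREATE ROLE IF NOT EXISTS REPORTING_READER",
    "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE REPORTING_READER",
    "GRANT USAGE ON DATABASE SALES_DB TO ROLE REPORTING_READER",
    "GRANT SELECT ON ALL TABLES IN SCHEMA SALES_DB.PUBLIC TO ROLE REPORTING_READER",
    "GRANT ROLE REPORTING_READER TO USER JDOE",
]

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***", role="SECURITYADMIN"
)
try:
    cur = conn.cursor()
    for stmt in GRANTS:
        cur.execute(stmt)  # apply each grant in order
finally:
    conn.close()
```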
Salary : Rs. 10,00,000.0 - Rs. 25,00,000.0
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Role Category : Programming & Design
Role : A&M | Snowflake Administration, Snowflake, SQL
Responsibilities
Overall administration of Azure AD resources
Experience with Azure AD Connect, Azure proxy, Entra, and Role-Based Access Control (RBAC).
Register and manage enterprise applications (see the sketch below)
Configure single sign-on (SSO), proxy settings, and policies
Manage security-related features such as MFA and conditional access
Manage privileged roles in Azure AD
View usage reports, cost analysis, Azure subscriptions and billing
Manage compliance-related configurations and audit logs
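As a hedged illustration of enterprise application registration, the Python sketch below drives the Azure CLI to register an app and create its service principal. The application name is hypothetical, and real environments would layer SSO configuration and Conditional Access policies on top of this step.

```python
# Sketch: register an enterprise application via the Azure CLI (az must be
# installed and logged in). The display name is a hypothetical example.
import json
import subprocess

def az(*args: str) -> dict:
    """Run an az CLI command and return its JSON output."""
    out = subprocess.run(["az", *args, "--output", "json"],
                         capture_output=True, text=True, check=True)
    return json.loads(out.stdout)

# Register the application and create its service principal.
app = az("ad", "app", "create", "--display-name", "hr-portal-sso")
sp = az("ad", "sp", "create", "--id", app["appId"])
print("Registered:", app["appId"], sp["id"])
```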
Salary : Rs. 10,00,000.0 - Rs. 25,00,000.0
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Responsibilities
Roles & Responsibilities:
Expected to perform independently and become an SME.
Active participation/contribution in team discussions is required.
Contribute to providing solutions to work-related problems.
Assist in the documentation of application specifications and user guides.
Engage in continuous learning to stay updated with the latest technologies and best practices.
Professional & Technical Skills:
Must-Have Skills: Proficiency in SAP PP Production Planning & Control for Discrete Industries.
Strong understanding of application development methodologies.
Experience with integration of SAP PP with other modules.
Familiarity with data management and reporting tools.
Ability to troubleshoot and resolve application issues efficiently.
Additional Information:
The candidate should have a minimum of 3 years of experience in SAP PP Production Planning & Control for Discrete Industries.
This position is based at our Bengaluru office.
15 years of full-time education is required.
Salary : Rs. 0.0 - Rs. 1,50,000.0
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Responsibilities
Minimum of 7 years of development experience, with the last 2+ years in Gen AI-based application development.
Proficiency in Python or JavaScript for AI application development.
Working experience with Gen AI platforms such as OpenAI, Gemini, Perplexity, and Grok (see the sketch below).
Experience with MLOps and cloud services (AWS, GCP, Azure) for AI model deployment, as well as database technologies (NoSQL, PostgreSQL, MongoDB).
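A minimal sketch of a Gen AI application call using the OpenAI Python SDK is shown below; the model name and prompt are illustrative assumptions, and Gemini, Perplexity, and Grok expose comparable chat-completion style APIs.

```python
# Minimal Gen AI call sketch; requires the OPENAI_API_KEY environment variable.
from openai import OpenAI  # pip install openai

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize what MLOps means in one sentence."},
    ],
)
print(response.choices[0].message.content)
```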
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Responsibilities
A. Data & Analytics Expertise
KPI Development & Monitoring – Ability to formulate, evolve, and track service performance metrics (CES, CSAT, FCR, etc.); see the sketch after this list.
Data Interpretation & Insight Generation – Strong skills in analyzing complex datasets to derive actionable insights.
Dashboarding & Visualization – Advanced proficiency in Power BI or similar tools to create intuitive dashboards.
Excel Mastery – Expertise in data manipulation, aggregation, and analysis using Microsoft Excel.
Salesforce Cloud Analytics – Experience integrating consumer data insights is a plus.
Benchmarking & Industry Analysis – Capability to perform competitive benchmarking and trend analysis.
Ad-hoc Analytical Support – Flexibility to support initiatives with tailored data research.
B. Technical Proficiency
Data Platforms Familiarity – Understanding of platforms like Microsoft Azure.
Programming Knowledge – Basic grasp of SQL or Python (not mandatory but beneficial).
Data Accuracy & Quality Management – Ability to monitor and refine metrics for relevance and precision.
Data Privacy Awareness – Knowledge of data protection regulations and ethical data handling.
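As a small illustration of the KPI work described above, the pandas sketch below derives CSAT and first-contact-resolution (FCR) rates per agent from a hypothetical contact-centre dataset; the column names and scoring scheme are assumptions.

```python
# KPI computation sketch; the dataset and columns are hypothetical examples.
import pandas as pd

tickets = pd.DataFrame({
    "agent": ["A", "A", "B", "B", "B"],
    "csat_score": [5, 4, 3, 5, 2],                      # 1-5 survey score
    "resolved_first_contact": [True, True, False, True, False],
})

kpis = tickets.groupby("agent").agg(
    csat=("csat_score", "mean"),
    fcr_rate=("resolved_first_contact", "mean"),
    tickets=("csat_score", "size"),
)
print(kpis)
```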
Salary : Rs. 10,00,000.0 - Rs. 25,00,000.0
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance
Responsibilities
Job Description
Deliver high-quality and maintainable code using test-driven methodologies
Monitor and maintain critical applications, services, and products.
Build functional features to enhance the transportation tech ecosystem at Coupang
Support the team's OKRs on system upgrades, deprecations, and migrations.
Efficiently collaborate with cross-functional teams
Requirements
BS/MS degree, preferably in a Computer Science or related field.
5+ years of overall experience in designing and developing large-scale distributed systems and applications.
Deep understanding and hands-on experience in back-end development using Java and open-source technologies (Java, Spring Boot, application servers, servlet containers, JMS, JPA, Spring MVC, Hibernate)
Hands-on experience with REST APIs, messaging technologies (e.g., Kafka), caching systems (e.g., Redis), etc.
Familiarity with SQL and knowledge of other NoSQL and modern database technologies.
Proficiency in software engineering tools (e.g., Java build tools, CI/CD) and adherence to best practices such as unit testing, test automation, continuous integration, etc.
Hands-on experience with cloud technologies and the AWS ecosystem (EC2, S3, RDS, DynamoDB, EMR, etc.).
Must be a self-starter who can work well with minimal guidance and in a fluid environment.
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming , Maintenance