We found 311 jobs matching your search

Job Description

Python AWS Developer

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : 551533_Senior Python AWS Developer

Job Description

Job Title: Engineer
Work Location: Chennai, TN / Kochi, KL / Bangalore, KA / Gurgaon, HA / Noida, UP / Bhubaneswar, OD / Kolkata, WB / Hyderabad, TG / Pune, MH
Skill Required: Digital : Microsoft Azure~Digital : Databricks~Digital : PySpark
Experience Range: 6-8 years

  • Leads large-scale, complex, cross-functional projects to build the technical roadmap for the WFM Data Services platform.
  • Leads and reviews design artifacts.
  • Builds and owns the automation and monitoring frameworks that give stakeholders reliable, accurate, easy-to-understand metrics and operational KPIs for data pipeline quality.
  • Executes proofs of concept on new technologies and tools to select the best tools and solutions.
  • Supports business objectives by collaborating with business partners to identify opportunities and drive resolution.
  • Communicates status and issues to senior Starbucks leadership and stakeholders.
  • Directs the project team and cross-functional teams on all technical aspects of the projects.
  • Leads the engineering team to build and support real-time, highly available data, data pipelines, and technology capabilities.
  • Translates strategic requirements into business requirements to ensure solutions meet business needs.
  • Defines and implements data retention policies and procedures.
  • Defines and implements data governance policies and procedures.
  • Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
  • Enables the team to pursue insights and applied breakthroughs, while also driving the solutions to Starbucks scale.
  • Builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of structured and unstructured data sources using big data technologies.
  • Builds analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Works with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
  • Performs root cause analysis to identify permanent resolutions to software or business process issues.
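
For illustration only: the kind of extract-transform-load work described above could be sketched in PySpark roughly as follows. The storage paths, dataset, and column names are invented placeholders, not details from the posting.

    # Minimal PySpark ETL sketch (hypothetical paths and column names).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("wfm-data-services-etl").getOrCreate()

    # Extract: read raw, semi-structured events from cloud storage.
    raw = spark.read.json("abfss://raw@example.dfs.core.windows.net/wfm_events/")

    # Transform: de-duplicate, drop incomplete records, derive a partition column.
    cleaned = (
        raw.dropDuplicates(["event_id"])
           .filter(F.col("event_ts").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Load: write a partitioned, query-friendly dataset (Delta is typical on
    # Databricks; Parquet shown here for portability).
    cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
        "abfss://curated@example.dfs.core.windows.net/wfm_events/"
    )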

Responsibilities

  • Salary : Rs. 55,000 - Rs. 95,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Engineer

Job Description

Skill Required: Digital : Microsoft Azure~Digital : Databricks~Digital : PySpark
Experience Range: 6-8 years

  • Leads large-scale, complex, cross-functional projects to build the technical roadmap for the WFM Data Services platform.
  • Leads and reviews design artifacts.
  • Builds and owns the automation and monitoring frameworks that give stakeholders reliable, accurate, easy-to-understand metrics and operational KPIs for data pipeline quality.
  • Executes proofs of concept on new technologies and tools to select the best tools and solutions.
  • Supports business objectives by collaborating with business partners to identify opportunities and drive resolution.
  • Communicates status and issues to senior Starbucks leadership and stakeholders.
  • Directs the project team and cross-functional teams on all technical aspects of the projects.
  • Leads the engineering team to build and support real-time, highly available data, data pipelines, and technology capabilities.
  • Translates strategic requirements into business requirements to ensure solutions meet business needs.
  • Defines and implements data retention policies and procedures.
  • Defines and implements data governance policies and procedures.
  • Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
  • Enables the team to pursue insights and applied breakthroughs, while also driving the solutions to Starbucks scale.
  • Builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of structured and unstructured data sources using big data technologies.
  • Builds analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Digital : Microsoft Azure~Digital : Databricks~Digital : PySpark Experience Range: 6-8 years

Job Description

1. Manage & administer IBM Storage platforms.
2. Perform storage provisioning, zoning, replication, firmware upgrades, and performance tuning.
3. Administer the IBM Spectrum Protect (TSM) backup environment, including schedules, policies, restores, failover, failback, and disk storage operations.
4. Manage SAN fabrics (Brocade/Cisco), zoning, switch upgrades, and fabric health.
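
As a loose illustration of routine Spectrum Protect administration, a daily health-check script might wrap the dsmadmc admin CLI as sketched below; the credentials and the choice of queries are assumptions, not details from the posting.

    # Hypothetical health-check wrapper around the Spectrum Protect admin CLI.
    import subprocess

    def run_tsm_query(command: str) -> str:
        """Run an administrative query via dsmadmc and return its raw output."""
        result = subprocess.run(
            ["dsmadmc", "-id=admin", "-password=changeme", "-dataonly=yes", command],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    # Typical daily checks: storage pool utilization and active sessions.
    print(run_tsm_query("query stgpool"))
    print(run_tsm_query("query session"))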

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : 642793Y25_CIS_ IBM Storage

Job Description

Responsibilities:
  • Optimize the application for maximum speed and compatibility.
  • Integrate third-party dependencies and debug dependency conflicts.
  • Multitask seamlessly, always maintaining a positive attitude and a client-first mentality.
  • Incorporate engineering best practices, methodologies, and standards in all deliverables.
  • Pay close attention to detail and be able to follow specifications and mockups.

Requirements:
  • 5+ years of experience in Liferay application development.
  • Experience building complex features and interfaces for Liferay applications.
  • Familiarity with migration to Liferay 7.3 and 7.4.
  • Solid knowledge of web app development practices.
  • Excellent spoken and written English.

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : 551956 - Liferay - India

Job Description

  • At least 6 years of relevant, proven experience in an IBM z/OS Mainframe environment
  • Proficient knowledge of the following technologies: PL/I, Cool:Gen (Broadcom), IMS/DC & IMS/DB, MFS, DB2, JCL, and MQ-Series
  • Proven use of debugging, testing, and release management tools on IBM z/OS Mainframe
  • Use of ServiceNow as the incident and problem management tool

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Mainframe

Job Description

SAP GTS

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : SAP - GTS Consultant

Job Description

Seeking a skilled professional with 3+ years of experience in Azure implementation and data quality initiatives, with hands-on expertise in the Monte Carlo DQ tool. The ideal candidate is proficient in ETL processes and in building and testing data quality pipelines (focusing on uniqueness, completeness, and root cause analysis), and is comfortable working with Snowflake and Databricks and collaborating with cross-functional teams. Strong emphasis on resolving data quality issues and ensuring robust pipeline operations in complex cloud environments.
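
Purely as an illustration of the uniqueness and completeness checks mentioned above, the sketch below computes both for a hypothetical Databricks table in PySpark; the table name, column names, and thresholds are invented placeholders, and production monitors would normally be configured in the Monte Carlo platform itself rather than in ad-hoc code.

    # Minimal data-quality sketch (hypothetical table and column names).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()
    df = spark.table("analytics.customer_orders")  # assumed Databricks table

    total = df.count()

    # Uniqueness: the key column should contain no duplicate values.
    distinct_keys = df.select("order_id").distinct().count()
    uniqueness = distinct_keys / total if total else 1.0

    # Completeness: share of non-null values in a critical column.
    non_null = df.filter(F.col("customer_id").isNotNull()).count()
    completeness = non_null / total if total else 1.0

    print(f"uniqueness={uniqueness:.4f} completeness={completeness:.4f}")

    # A breached threshold would trigger root cause analysis of the upstream pipeline.
    assert uniqueness >= 0.999 and completeness >= 0.99, "data quality check failed"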

Responsibilities

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : MonteCarlo - Data Quality

Job Description

Skills: PL/SQL~Functional Testing
Experience: 6-8 years

Role Description:
A Selenium Automation Tester is responsible for designing and implementing automated tests for web applications. Key responsibilities include:
  • Developing and executing automated tests using the Selenium framework to ensure software quality.
  • Collaborating with development teams to identify and resolve issues, and maintaining a database of software defects.
  • Analyzing test results and tracking metrics to improve testing processes.
  • Staying updated with industry trends and technologies, and providing suggestions for process improvements.
  • Possessing skills in programming languages such as Java, Python, or C#, and familiarity with test automation tools.
  • Understanding business requirements and creating test cases.
  • Analyzing automation results, reporting defects, and working with developers on resolution.
  • Ensuring traceability between requirements and test cases.
  • Validating UI/UX, workflows, and troubleshooting guides.
  • Designing and developing automation scripts using tools like Selenium.
  • Maintaining and updating automation frameworks.

Personal and Organizational Skills:
  • Proactive and initiative-driven: self-motivated with a go-getter attitude, capable of solving complex problems.
  • Collaborative: works effectively with QA, product managers, and cross-functional teams; should be a team player.
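
Purely to illustrate the kind of Selenium automation described above, here is a minimal automated check in Python; the URL, element locators, and expected text are hypothetical placeholders.

    # Minimal Selenium check (hypothetical URL, locators, and expected text).
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()  # assumes a local Chrome/ChromeDriver setup
    try:
        driver.get("https://example.com/login")

        # Exercise the login workflow under test.
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("secret")
        driver.find_element(By.ID, "submit").click()

        # Assert on an expected post-login element; a failure is reported as a defect.
        heading = driver.find_element(By.CSS_SELECTOR, "h1.dashboard-title")
        assert heading.text == "Dashboard", f"unexpected heading: {heading.text}"
    finally:
        driver.quit()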

Responsibilities

  • Salary : Rs. 55,000 - Rs. 95,000
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Developer