We found 31 jobs matching your search

Job Description

Job Title: Developer
Work Location: Chennai / Bangalore / Hyderabad / Kochi / Kolkata / Pune / Delhi / Noida
Skill Required: Digital : DevOps
Experience Range: 8-10 years
Primary Skills: GCP, GKE/Kubernetes, Composer, BigQuery, Networking, Identity and Access Management, Terraform; experience with Terraform Cloud and Policy as Code

Job Description:
  • Design and manage GCP infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.
  • Develop and maintain CI/CD pipelines using tools such as Jenkins, GitLab CI, or Cloud Build.
  • Automate deployment and configuration tasks to improve operational efficiency.
  • Monitor system performance and ensure high availability and scalability.
  • Manage containerized applications using Kubernetes (GKE) and Docker.
  • Implement and enforce cloud security best practices.
  • Collaborate with cross-functional teams to streamline application delivery.
  • Participate in incident response and root cause analysis.
  • Document infrastructure, processes, and configurations.
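The Terraform and Policy-as-Code requirement in this listing can be illustrated with a minimal, hedged sketch: a Python script that inspects the JSON produced by `terraform show -json plan.out` and flags GKE clusters that would be created without private nodes. The field names (`resource_changes`, `change.actions`, `change.after`, `private_cluster_config`) follow Terraform's plan JSON schema and the `google_container_cluster` resource as commonly documented, so treat them as assumptions; a real engagement would more likely use a policy engine such as Sentinel or OPA.

```python
import json
import sys

# Minimal Policy-as-Code sketch. Assumption: plan.json was produced with
#   terraform plan -out=plan.out && terraform show -json plan.out > plan.json
POLICY = "GKE clusters must enable private nodes"

def violations(plan: dict) -> list[str]:
    """Return addresses of google_container_cluster resources being created
    without private_cluster_config.enable_private_nodes = true."""
    bad = []
    for rc in plan.get("resource_changes", []):
        if rc.get("type") != "google_container_cluster":
            continue
        change = rc.get("change", {})
        if "create" not in change.get("actions", []):
            continue
        after = change.get("after") or {}
        # private_cluster_config appears as a list block in plan JSON (assumption).
        cfg = (after.get("private_cluster_config") or [{}])[0]
        if not cfg.get("enable_private_nodes"):
            bad.append(rc.get("address", "<unknown>"))
    return bad

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        plan = json.load(f)
    failed = violations(plan)
    if failed:
        print(f"POLICY FAILED: {POLICY}")
        for address in failed:
            print(f"  - {address}")
        sys.exit(1)
    print(f"Policy passed: {POLICY}")
```

Run as `python check_plan.py plan.json`; a non-zero exit code lets the check gate a CI/CD pipeline stage.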

  • Salary: Rs. 90,000 - Rs. 1,65,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Developer

Job Description

Experience Range in Required Skills: 6-8 years (overall 5+ years considered)
Shift: 2 pm - 11 pm, with 4 hours of work-from-home flexibility
Location: PAN India
Note: The role requires Data Science, not Data Engineering. Primary skill: strong AI/ML (3+ years). Secondary skill: PySpark (2+ years). Interviews for this position are ongoing.

Interview Feedback / Questions / Pointers:
  • Hands-on experience with real-time data; should also be able to discuss academic projects in detail.
  • Should be able to explain Data Science / ML work clearly.
  • Programming languages: Python and PySpark.
  • Working knowledge of predictive / ML-based models.
  • Working experience with cloud platforms.
Connect with the recruiter for a sample resume.

Job Description (must have):
  • Experience with Linux, Git, CI/CD, release management, production deployment and support.
  • Strong knowledge of Apache Spark (must).
  • Strong knowledge of PySpark (must).
  • Strong knowledge of SQL (must).
  • Good knowledge of Data Science workloads.
  • Good knowledge of Kubernetes/Docker.
  • Good knowledge of Python (must).
  • Good knowledge of Java.
  • Good to have: conceptual knowledge of Apache Airflow, Apache Atlas, Apache Ranger, Postgres, Trino, Superset.

Essential Skills (additional, for reference): Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum, Linux, OpenShift, Kubernetes, Superset.
  • 5+ years of experience in data engineering, ELT development, and data modeling.
  • Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management.
  • Experience implementing workflow orchestration using tools like Apache Airflow, SSIS, or similar platforms.
  • Demonstrated experience in developing custom connectors for data ingestion from various sources.
  • Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
  • Experience implementing DataOps principles and practices, including data CI/CD pipelines.
  • Excellent problem-solving and troubleshooting skills, with strong attention to detail.
  • Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams.
  • Familiarity with data visualization tools (Apache Superset) and dashboard development.
  • Understanding of distributed systems and working with large-scale datasets.
  • Familiarity with data governance frameworks and practices.
  • Knowledge of data streaming and real-time data processing technologies (e.g., Apache Kafka).
  • Strong understanding of software development principles and practices, including version control (e.g., Git) and code review processes.
  • Experience with Agile development methodologies and working in cross-functional Agile teams.
  • Ability to adapt quickly to changing priorities and work effectively in a fast-paced environment.
  • Excellent analytical and problem-solving skills, with keen attention to detail.
  • Strong written and verbal communication skills, with the ability to communicate complex technical concepts to both technical and non-technical stakeholders.
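Since this listing stresses hands-on AI/ML with PySpark, here is a minimal, illustrative sketch of the kind of predictive-model workflow it describes: a simple pyspark.ml pipeline on toy in-memory data. The column names and values are invented for illustration; a real project would read from a governed source (Hive, Snowflake, Kafka, etc.).

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("churn-demo").getOrCreate()

# Toy data standing in for a real feature table (illustrative only).
train = spark.createDataFrame(
    [(35.0, 2.0, 0), (12.0, 8.0, 1), (48.0, 1.0, 0), (7.0, 11.0, 1)],
    ["monthly_spend", "support_tickets", "churned"],
)

# Assemble raw columns into the single feature vector pyspark.ml expects,
# then fit a logistic regression as a simple predictive model.
pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["monthly_spend", "support_tickets"],
                    outputCol="features"),
    LogisticRegression(featuresCol="features", labelCol="churned"),
])
model = pipeline.fit(train)

# Score records; in production this would run against streaming or scheduled
# batch data rather than a hand-built DataFrame.
scored = model.transform(train).select("monthly_spend", "churned", "prediction")
scored.show()
spark.stop()
```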

  • Salary: Rs. 70,000 - Rs. 1,30,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Engineer

Job Description

Job Title: Engineer
Work Location: Chennai / Hyderabad / Bangalore / Kolkata (PAN India)
Skill Required: Digital : Apache Spark, Digital : Python for Data Science, Digital : PySpark, MySQL (role requires a Data Engineer with Data Science)
Experience Range in Required Skills: 6-8 years (overall 5+ years considered)
Shift: 2 pm - 11 pm, with 4 hours of work-from-home flexibility
Location: PAN India
Note: The role requires Data Science, not Data Engineering. Primary skill: strong AI/ML (3+ years). Secondary skill: PySpark (2+ years). Interviews for this position are ongoing.

Interview Feedback / Questions / Pointers:
  • Hands-on experience with real-time data; should also be able to discuss academic projects in detail.
  • Should be able to explain Data Science / ML work clearly.
  • Programming languages: Python and PySpark.
  • Working knowledge of predictive / ML-based models.
  • Working experience with cloud platforms.
Connect with the recruiter for a sample resume.

Job Description (must have):
  • Experience with Linux, Git, CI/CD, release management, production deployment and support.
  • Strong knowledge of Apache Spark (must).
  • Strong knowledge of PySpark (must).
  • Strong knowledge of SQL (must).
  • Good knowledge of Data Science workloads.
  • Good knowledge of Kubernetes/Docker.
  • Good knowledge of Python (must).
  • Good knowledge of Java.
  • Good to have: conceptual knowledge of Apache Airflow, Apache Atlas, Apache Ranger, Postgres, Trino, Superset.

Essential Skills: Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum, Linux, OpenShift, Kubernetes, Superset.
  • 5+ years of experience in data engineering, ELT development, and data modeling.
  • Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management.
  • Experience implementing workflow orchestration using tools like Apache Airflow, SSIS, or similar platforms.
  • Demonstrated experience in developing custom connectors for data ingestion from various sources.
  • Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
  • Experience implementing DataOps principles and practices, including data CI/CD pipelines.
  • Excellent problem-solving and troubleshooting skills, with strong attention to detail.
  • Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams.
  • Familiarity with data visualization tools (Apache Superset) and dashboard development.
  • Understanding of distributed systems and working with large-scale datasets.
  • Familiarity with data governance frameworks and practices.
  • Knowledge of data streaming and real-time data processing technologies (e.g., Apache Kafka).
  • Strong understanding of software development principles and practices, including version control (e.g., Git) and code review processes.
  • Experience with Agile development methodologies and working in cross-functional Agile teams.
  • Ability to adapt quickly to changing priorities and work effectively in a fast-paced environment.
  • Excellent analytical and problem-solving skills, with keen attention to detail.
  • Strong written and verbal communication skills, with the ability to communicate complex technical concepts to both technical and non-technical stakeholders.
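The data-engineering side of this listing (Airflow plus Spark for ELT) can be sketched as a minimal Airflow 2.x DAG. The DAG id, table paths, and transformation logic below are invented for illustration; a production pipeline would typically submit the Spark job to a cluster (for example via the Spark provider's SparkSubmitOperator) rather than run it inside a PythonOperator.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def transform_orders():
    """Illustrative ELT step: aggregate a raw table with Spark SQL."""
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("daily_orders_elt").getOrCreate()
    # Paths and table names are placeholders for whatever the source exposes.
    orders = spark.read.parquet("/data/raw/orders")
    orders.createOrReplaceTempView("orders")
    daily = spark.sql("""
        SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
        FROM orders
        GROUP BY order_date
    """)
    daily.write.mode("overwrite").parquet("/data/curated/daily_orders")
    spark.stop()

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_orders", python_callable=transform_orders)
```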

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Engineer

Job Description

As a Technology Support Engineer, you will resolve incidents and problems across various business system components, ensuring operational stability throughout the day. Your responsibilities will include creating and implementing Requests for Change, updating knowledge base articles, and collaborating with vendors to assist service management teams in issue analysis and resolution. Each day will present new challenges that require a proactive approach to problem-solving and effective communication with team members and stakeholders.

Roles & Responsibilities:
  • Expected to build knowledge and support the team.
  • Participate in problem-solving discussions.
  • Assist in the development and maintenance of documentation related to system configurations and procedures.
  • Provide timely updates to stakeholders regarding the status of incidents and requests.

Professional & Technical Skills:
  • Must-have skills: proficiency in Microsoft Windows Desktop Management.
  • Good-to-have skills: experience with remote desktop support tools.
  • Familiarity with troubleshooting hardware and software issues.
  • Understanding of network configurations and connectivity issues.
  • Ability to work collaboratively in a team environment.

Additional Information:
  • The candidate should have a minimum of 0-2 years of experience in Microsoft Windows Desktop Management.
  • This position is based at our Kolkata office.
  • 15 years of full-time education is required.
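For the network-connectivity troubleshooting mentioned above, here is a small standard-library Python sketch of the kind of first-pass check a support engineer might script. The host names and ports are placeholders, not values from the listing.

```python
import socket
from datetime import datetime, timezone

# Placeholder endpoints a desktop-support script might probe first
# (domain controller, proxy, intranet portal); substitute real hosts.
CHECKS = [
    ("dc01.example.local", 389),     # LDAP / domain controller
    ("proxy.example.local", 8080),   # web proxy
    ("intranet.example.local", 443), # internal portal
]

def check(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    print(f"Connectivity check at {stamp}")
    for host, port in CHECKS:
        status = "OK" if check(host, port) else "FAILED"
        print(f"  {host}:{port} -> {status}")
```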

  • Salary: Rs. 0 - Rs. 30,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Technology Support Engineer