• 5+ years of IT experience, with at least 4 years in Camunda BPM architecture/solutioning.
• Strong expertise in:
o BPMN 2.0
o DMN
o Java, Spring Boot
o Microservices architecture
o Event-driven architecture (Kafka preferred)
• Experience with Camunda 7 and/or Camunda 8 (Zeebe).
• Hands-on with REST APIs, JSON, and integration patterns.
• Knowledge of Kubernetes, Docker, cloud platforms (AWS/Azure/GCP).
• Familiarity with monitoring tools (Grafana, Prometheus, Elastic, Operate/Zeebe tools).
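As a minimal sketch of the Camunda 7 REST integration skills listed above, the snippet below builds the JSON body for starting a process instance via the engine's `POST /engine-rest/process-definition/key/{key}/start` endpoint. The process variables and business key are hypothetical examples; only the typed-variable envelope follows Camunda 7 conventions.

```python
import json

# Map plain Python types to Camunda 7 variable value types.
CAMUNDA_TYPES = {bool: "Boolean", int: "Integer", float: "Double", str: "String"}

def start_process_payload(business_key, **variables):
    """Wrap plain Python values in Camunda 7's typed-variable envelope."""
    typed = {
        name: {"value": value, "type": CAMUNDA_TYPES[type(value)]}
        for name, value in variables.items()
    }
    return {"businessKey": business_key, "variables": typed}

# Hypothetical invoice-approval process with two variables.
payload = start_process_payload("INV-1001", amount=1250, approved=False)
print(json.dumps(payload, indent=2))
```

The resulting dictionary would be POSTed as JSON to the process definition's `start` endpoint; a Zeebe (Camunda 8) client would instead use gRPC or the official client libraries.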
Soft Skills
• Strong problem-solving and analytical thinking.
• Excellent communication and documentation skills.
• Ability to drive architecture decisions and influence stakeholders.
• Ability to lead cross-functional teams and deliver large enterprise solutions.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Informatica CDI (Cloud Data Integration)
Must Have Technical/Functional Skills
· 3–6 years of experience in ETL development and data integration.
· Hands-on experience with Matillion (preferred) or strong expertise in Ab Initio, Informatica, or DataStage.
· Strong SQL skills and experience with relational databases (Oracle, SQL Server, etc.).
· Knowledge of data warehousing concepts and dimensional modeling.
· Experience with scheduling tools and job monitoring.
· Familiarity with cloud platforms (AWS, Azure, or GCP) is a plus.
· Strong problem-solving and analytical skills.
· Experience with scripting languages (Python, Shell) and knowledge of big data technologies.
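As an illustrative sketch of the SQL and dimensional-modeling skills above, the snippet below performs a Type 1 (overwrite) load into a customer dimension using an in-memory SQLite database. The table and column names are hypothetical; a real job would target Oracle or SQL Server through a tool such as Matillion or Informatica, but the upsert pattern is the same.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,  -- natural key from the source system
        name        TEXT,
        city        TEXT
    )
""")

def upsert_customers(rows):
    # INSERT ... ON CONFLICT performs the SCD Type 1 overwrite in one statement.
    conn.executemany(
        """
        INSERT INTO dim_customer (customer_id, name, city)
        VALUES (?, ?, ?)
        ON CONFLICT(customer_id) DO UPDATE SET
            name = excluded.name,
            city = excluded.city
        """,
        rows,
    )
    conn.commit()

upsert_customers([(1, "Asha", "Pune"), (2, "Ravi", "Delhi")])
upsert_customers([(2, "Ravi", "Mumbai")])  # source change overwrites the city
print(conn.execute("SELECT city FROM dim_customer WHERE customer_id = 2").fetchone())
```

A Type 2 dimension would instead close the old row and insert a new versioned row, which is the usual choice when history must be preserved.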
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Databricks, AWS, Python, API Integration, ETL/ELT
• Design, build, and maintain scalable data pipelines using Python and Apache Spark (via Databricks).
• Develop ETL/ELT workflows to ingest, transform, and load structured and unstructured data from various sources.
• Optimize data workflows for performance, reliability, and scalability in cloud environments.
• Work with Databricks to develop notebooks, jobs, and workflows for data processing and analytics.
• Collaborate with data scientists, analysts, and other engineering teams to deliver high-quality data solutions.
• Implement data quality checks, monitoring, and alerting mechanisms.
• Integrate data solutions with AWS services such as S3, Glue, Lambda, DynamoDB, and API Gateway.
• Ensure compliance with data governance and security standards, including HIPAA and other healthcare regulations.
• Participate in code reviews, architectural discussions, and contribute to technical strategy.
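The data-quality bullet above can be sketched as a lightweight batch validator: null checks on required columns and range checks on numeric ones, returning violations that a monitoring or alerting hook could act on. The column names and thresholds are hypothetical; on Databricks, equivalent rules would typically run as Spark DataFrame expectations.

```python
def check_batch(rows, required=("patient_id",), ranges=(("age", 0, 120),)):
    """Return (row_index, column, problem) tuples for every failed check."""
    violations = []
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                violations.append((i, col, "missing required value"))
        for col, lo, hi in ranges:
            value = row.get(col)
            if value is not None and not lo <= value <= hi:
                violations.append((i, col, f"out of range [{lo}, {hi}]"))
    return violations

batch = [
    {"patient_id": "P1", "age": 34},
    {"patient_id": None, "age": 34},   # fails the null check
    {"patient_id": "P3", "age": 150},  # fails the range check
]
print(check_batch(batch))
```

In practice the violation list would feed a metrics sink or alerting channel rather than stdout.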
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Databricks, AWS, Python, API Integration, ETL/ELT
Job Description: Data Engineer
• Design, build, and maintain scalable data pipelines using Python and Apache Spark (via Databricks).
• Develop ETL/ELT workflows to ingest, transform, and load structured and unstructured data from various sources.
• Optimize data workflows for performance, reliability, and scalability in cloud environments.
• Work with Databricks to develop notebooks, jobs, and workflows for data processing and analytics.
• Collaborate with data scientists, analysts, and other engineering teams to deliver high-quality data solutions.
• Implement data quality checks, monitoring, and alerting mechanisms.
• Integrate data solutions with AWS services such as S3, Glue, Lambda, DynamoDB, and API Gateway.
• Ensure compliance with data governance and security standards, including HIPAA and other healthcare regulations.
• Participate in code reviews, architectural discussions, and contribute to technical strategy.
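The ingest-and-transform bullets above can be illustrated with a small ELT-style step: flattening nested JSON, as might land in an S3 bucket, into tabular rows before loading. The event shape is hypothetical, and in the actual stack this step would run as a Databricks/PySpark job rather than plain Python.

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten nested dicts into a single-level row."""
    row = {}
    for key, value in record.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            row.update(flatten(value, full_key, sep))
        else:
            row[full_key] = value
    return row

raw = json.loads('{"id": 7, "visit": {"dept": "cardio", "vitals": {"hr": 72}}}')
print(flatten(raw))
# -> {'id': 7, 'visit_dept': 'cardio', 'visit_vitals_hr': 72}
```

The flattened keys map directly onto warehouse column names, which keeps the downstream load step a plain tabular insert.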
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Web Developer, you will engage in the design, construction, and testing of web-based applications tailored for various site components. Your typical day will involve editing site content, documenting technical designs and specifications, and researching updated content for websites, ensuring they remain current and user-friendly. You will collaborate with team members to enhance the functionality and aesthetics of web applications, contributing to a seamless user experience.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions.
- Contribute solutions to work-related problems.
- Assist in the development of user-friendly web interfaces that meet client specifications.
- Collaborate with cross-functional teams to ensure cohesive project execution.
Professional & Technical Skills:
- Must-have: proficiency in Microsoft ASP.NET.
- Experience with front-end technologies such as HTML, CSS, and JavaScript.
- Familiarity with database management systems and SQL.
- Understanding of web application security best practices.
- Ability to troubleshoot and debug web applications effectively.
Additional Information:
- The candidate should have a minimum of 2 years of experience in Microsoft ASP.NET.
- This position is based at our Hyderabad office.
- A 15-year full-time education is required.
Salary: Rs. 0.0 - Rs. 1,00,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Design, build, and manage ML and LLM pipelines using Azure Machine Learning Studio (AML) and Azure AI Foundry, covering models such as GPT, Phi, LLaMA-based, and Hugging Face models via Azure AI services.
• Implement and manage Azure infrastructure necessary for ML workloads (e.g., Azure Machine Learning Workspaces, Compute, Data Stores) using Terraform.
• Implement CI/CD for ML/LLM workflows with Azure DevOps and GitHub Actions.
• Manage real-time and batch deployments on Azure Container Instances and manage model versions, registrations, and lifecycle within Azure Machine Learning. This includes working with Azure AI Foundry for generative AI applications.
• Automate infrastructure with Terraform, Bicep, or ARM templates.
• Set up model registry, experiment tracking, versioning, and reproducibility.
• Integrate Azure Data Factory, Synapse, or Databricks for data ingestion and preprocessing.
• Establish robust monitoring, logging, and alerting systems for deployed ML models and LLMs to track performance, detect data drift, concept drift, and operational issues, ensuring continuous model health.
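The drift-detection responsibility above can be sketched with a simple, framework-free signal: the Population Stability Index (PSI) between a training (reference) feature distribution and recent production traffic. The bin proportions and the 0.2 alert threshold are conventional but hypothetical choices; a production setup would wire this metric into Azure Monitor alerts rather than printing it.

```python
import math

def psi(reference, current):
    """PSI over two pre-bucketed probability distributions (same bins)."""
    total = 0.0
    for p, q in zip(reference, current):
        p = max(p, 1e-6)  # guard against log(0) for empty bins
        q = max(q, 1e-6)
        total += (q - p) * math.log(q / p)
    return total

# Hypothetical feature bucketed into four bins: uniform at training time,
# skewed toward the upper bins in recent production traffic.
train_dist = [0.25, 0.25, 0.25, 0.25]
prod_dist = [0.10, 0.20, 0.30, 0.40]
score = psi(train_dist, prod_dist)
print(f"PSI = {score:.3f}, drift alert: {score > 0.2}")
```

A common rule of thumb treats PSI below 0.1 as stable, 0.1 to 0.2 as worth watching, and above 0.2 as significant drift warranting an alert or retraining review.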
Salary: Rs. 1,00,000.0 - Rs. 3,00,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance