Job Description
As a Tech Analyst, you are an expert contributor across the phases of the consulting lifecycle. You will be intensely involved in defining the problem and in proposing and refining the solution. You will also play an important role in shaping the overall solution and will guide developers on functional/technical design and deliverables. In addition, you will contribute to proposal development, client training and internal capability-building, and help detail the project scope. You will have the opportunity to shape value-adding consulting solutions that enable our clients to meet the changing needs of the global landscape.
Responsibilities:
• Understand the business vision, strategy and project roadmaps.
• Ensure that there is a complete set of relevant and clear business requirements.
• Translate the functional and non-functional requirements into an effective, reliable and future-proof solution architecture, in line with the architectural blueprint of the organisation and domain.
• Work with solution architects and squad analysts to create an end-to-end solution across applications.
• Provide high-level estimates to develop and implement the solution.
• Refine large requirements into well-defined time-boxed features.
• Act as the single point of contact for the business requestors.
• Provide clear and structured communication and documentation on the solution, to users, squad analysts, operations and management.
• Ensure the solution is implemented in line with the architecture during project execution.
• Clearly understand and communicate dependencies between various initiatives.
• Participate in RFP processes to provide input on digital scope.
Skills:
• Total experience 10+ years
• 5+ years of experience in backend development: Java microservices and Java-based REST/GraphQL APIs in a microservice-based architecture. [Must Have]
• 5+ years of experience in web development using Angular. [Must Have]
• 5+ years of experience with database systems, with knowledge of SQL and NoSQL databases. [Must Have]
• Ability to write effective documentation
• Deep hands-on AI experience (e.g. Agentic AI). [Must Have]
• Basic understanding of web frontend technologies (HTML/CSS/JavaScript/TypeScript)
• Knowledge of CI/CD deployments
• Functional knowledge of e-commerce and domain knowledge of the telecom industry
• CMS knowledge is a plus
• Experience backed by rich domain knowledge in software analysis, design, development, implementation and testing of web-based and enterprise applications using Java/J2EE technologies.
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance
Role Category : Programming & Design
Role : 558552 Angular + Java Microservices - India - DX
Digital : Snowflake~PL/SQL~Unix Shell Scripting and text processing tools
• Administer and manage Snowflake environments including users, roles, warehouses, databases, and schemas
• Monitor system performance and optimize warehouse usage, queries, and costs
• Implement and manage security controls, data masking, and auditing
• Perform capacity planning, resource monitoring, and usage analysis
• Support data loads, integrations, and troubleshoot Snowflake-related issues
• Work closely with development, data engineering, and business teams for operational support
• Automate routine administrative tasks using SQL, Python, or shell scripting
• Ensure high availability, backup, recovery, and disaster recovery readiness
• Maintain documentation, operational procedures, and best practices
• Demonstrate strong problem-solving skills and proactive incident prevention
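As an illustration of the scripting called for above, here is a minimal Python sketch that generates Snowflake DDL and GRANT statements for provisioning a role and warehouse. The object names and sizing are hypothetical; the statement syntax follows standard Snowflake SQL, and execution (via the Snowflake connector or SnowSQL) is left out.

```python
def provision_statements(role: str, warehouse: str, size: str = "XSMALL") -> list[str]:
    """Build Snowflake DDL/GRANT statements for a new role and warehouse.

    Names and sizing here are illustrative; the statements use standard
    Snowflake SQL syntax. AUTO_SUSPEND keeps idle-warehouse costs down.
    """
    return [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"CREATE WAREHOUSE IF NOT EXISTS {warehouse} "
        f"WAREHOUSE_SIZE = '{size}' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
    ]

# Example: statements for a hypothetical reporting team
for stmt in provision_statements("REPORTING_RO", "REPORTING_WH"):
    print(stmt)
```

Generating the statements as strings, rather than issuing them ad hoc, keeps provisioning reviewable and repeatable.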
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance
Role Category : Programming & Design
Role : Digital : Snowflake~PL/SQL~Unix Shell Scripting and text processing tools
Digital : Microsoft Azure~Digital : Databricks
Role Description: Azure Databricks Engineer
Key Responsibilities
Platform Infrastructure
• Oversee Databricks platform configuration, resource management, workspace structuring, and cluster optimization.
• Monitor and troubleshoot performance issues across clusters, jobs, notebooks, and pipelines.
• Implement governance, security, compliance and data access control using Role-Based Access Control (RBAC) and Unity Catalog.
Pipeline Development & Architecture
• Design and implement end-to-end data pipelines using PySpark, SQL, and Delta Lake within a medallion architecture using Data Factory and Databricks.
• Build real-time and batch DLT pipelines using Databricks Delta Live Tables with a focus on reliability and scalability.
• Optimize Lakehouse architecture for performance, cost-efficiency, and data integrity.
• Automate data ingestion, transformation, and validation, including support for streaming (Auto Loader) and scheduled workflows.
• Perform data transformations, cleansing and validation using data quality rules for consistent and accurate data sets.
• Manage and monitor job orchestration, ensuring efficient pipeline runs and reliability.
CI/CD & DevOps
• Design and maintain CI/CD pipelines for Databricks artifacts (notebooks, jobs, libraries) using tools such as Azure DevOps, GitHub Actions, Terraform or Jenkins.
• Support trunk-based development, deployment workflows, and infrastructure-as-code practices.
• Manage version control and automated testing using Git and related DevOps practices.
Collaboration & Delivery
• Collaborate with product owners, business stakeholders, and data teams to gather requirements and translate them into technical solutions.
• Drive the adoption of best practices in coding, versioning, testing, deployment, monitoring, and security.
• Provide thought leadership on best practices in data engineering, architecture and cloud computing.
Performance Optimization
• Deliver optimized Spark jobs and SQL queries for large-scale data processing.
• Implement partitioning, caching and indexing strategies to improve performance and scalability of big data workloads.
• Conduct POCs for capacity planning and recommend appropriate infrastructure optimizations for cost effectiveness.
Documentation & Knowledge Sharing
• Create detailed documentation, and review it, for data workflows, SOPs, architectural reviews etc.
• Mentor junior team members and promote a culture of learning and innovation.
• Promote a culture of optimization and cost saving and enable research-driven development.
Required Qualifications
Technical Expertise
• 5 years in data engineering, with a strong focus on Databricks and Azure ecosystems.
• Deep hands-on experience with Data Factory, Databricks Lakehouse Architecture, Delta Lake, PySpark, and Spark job optimization.
• Proficiency in Python, SQL, and optionally Scala for building scalable ETL/ELT pipelines.
• Strong SQL skills are essential, with hands-on experience in SQL Server or other RDBMS platforms.
• Strong experience in designing and optimizing DLT pipelines.
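The rule-based data-quality validation described above can be sketched in plain Python. The field names, rules and layer semantics are hypothetical; in a real Databricks pipeline the same checks would typically run as PySpark filters or Delta Live Tables expectations rather than pure Python.

```python
# Minimal sketch of rule-based data-quality validation, of the kind applied
# when promoting records from a bronze to a silver layer. Rules and field
# names are illustrative.
from typing import Callable

Rule = Callable[[dict], bool]

RULES: dict[str, Rule] = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

def split_valid(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Partition records into (valid, rejected) by applying every rule."""
    valid, rejected = [], []
    for rec in records:
        ok = all(rule(rec) for rule in RULES.values())
        (valid if ok else rejected).append(rec)
    return valid, rejected

good, bad = split_valid([
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # fails id_present
    {"id": 2, "amount": -3.0},     # fails amount_non_negative
])
```

Keeping the rules in a named mapping makes it easy to report which check a rejected record failed, which is the usual requirement for quarantine tables.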
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance
Role Category : Programming & Design
Role : Digital : Microsoft Azure~Digital : Databricks
Digital: Cloud DevOps
Description:
- Knowledge of cloud platforms: AWS, Azure or GCP
- Hands-on working experience with Linux
- Hands-on experience with Docker containers and building Infrastructure as Code for Kubernetes (e.g. Terraform)
- Application monitoring and logging tools like Prometheus, Grafana, Zabbix etc.
- Familiar with one or more DevOps/IT operation tools: code management and build tools (e.g. GitLab, Maven), continuous integration tools (e.g. Jenkins, Nexus, JFrog)
- Configuration/deployment tools (e.g. Ansible, Chef); hands-on experience with Ansible playbooks
- Exposure to setup and configuration of Elasticsearch, Kafka, Memcached, databases etc.
- Experience with web servers like Nginx, Tomcat and JBoss
- Experience in Shell and Python scripting; Linux knowledge
- Experience with container orchestration tools like Kubernetes, OpenShift etc.
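To give the monitoring point above some substance, here is a small Python sketch that parses the Prometheus text exposition format, the output a scraper reads from a service's /metrics endpoint. It is deliberately simplified: it handles only the `name{labels} value` form and skips comments, ignoring timestamps and HELP/TYPE metadata; the sample metrics are illustrative.

```python
# Simplified parser for the Prometheus text exposition format.
# Handles "name value" and 'name{labels} value' lines; skips "#" comments
# and blank lines. Timestamps and HELP/TYPE metadata are not handled.
def parse_metrics(text: str) -> dict[str, float]:
    metrics: dict[str, float] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # The value is the last space-separated token; everything
        # before it (including any {labels}) is the metric key.
        name, _, value = line.rpartition(" ")
        metrics[name] = float(value)
    return metrics

sample = """\
# HELP process_cpu_seconds_total Total CPU time.
process_cpu_seconds_total 12.5
http_requests_total{code="200"} 1027
"""
parsed = parse_metrics(sample)
```

A parser like this is the first step in ad hoc health checks or cost dashboards when a full Prometheus/Grafana stack is not available.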
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance
• Minimum of a bachelor's degree in engineering from a reputed institution
• Minimum 3-5 years of Java/J2EE development experience, with minimum 1-2 years of Finacle DEH/FEBA experience
• Good understanding of Java concepts and technologies: Spring Boot, web services (REST), security etc.
• Good understanding of any one RDBMS (Oracle, SQL Server)
• Awareness of design principles, design patterns, performance tuning, profiling
• Strong debugging and troubleshooting skills
• Demonstrates good judgement in selecting methods and techniques for obtaining solutions
• Exposure to one or more tools: Git, Jira, Jenkins, SonarQube, Maven, Gradle
• Preferably exposure to web containers (Tomcat/JBoss), Docker, Kubernetes and cloud deployments
Salary : As per industry standard.
Industry : IT-Software / Software Services
Functional Area : IT Software - Application Programming, Maintenance