Job Title: Developer
Work Location: Pune
Skill Required: Digital: SAP BusinessObjects Data Services / SAP Business Objects (BO)
Experience Range in Required Skills: 6-8 years
Job Description:
• Design, develop, and maintain WebI reports (SAP BO reports) per business requirements.
• Perform data validation and ensure report accuracy (a validation sketch follows this list).
• Participate in end-to-end (E2E) SDLC activities, including development, testing, and documentation.
• Analyze, troubleshoot, and resolve reporting issues.
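For illustration only, a minimal validation sketch in Python with pandas, assuming the WebI report output and the corresponding source data can both be exported to CSV; the file names and the "revenue" column are hypothetical placeholders:

```python
# Hypothetical validation sketch: compare a WebI report extract against a
# source-system extract. File and column names are placeholders.
import pandas as pd

report = pd.read_csv("webi_report_extract.csv")    # exported from the WebI report
source = pd.read_csv("source_system_extract.csv")  # pulled from the source database

# Row-count check: the report should cover every in-scope source row.
assert len(report) == len(source), (
    f"Row count mismatch: report={len(report)}, source={len(source)}"
)

# Measure check: key aggregates should match within a small tolerance.
report_total = report["revenue"].sum()
source_total = source["revenue"].sum()
if abs(report_total - source_total) > 0.01:
    print(f"Revenue mismatch: report={report_total}, source={source_total}")
else:
    print("Validation passed: row counts and revenue totals agree.")
```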
Salary: Rs. 70,000 - Rs. 1,40,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Network Security / MSS - Vulnerability Management
Profile Required
• 3 to 6 years of experience in IT security, with strong expertise and knowledge in information security.
• Able to understand architecture issues in order to develop security policies or relevant recommendations for applications and projects; able to discuss on an equal footing with the community of architects and project managers of the applications concerned.
• Good knowledge of the SDLC and hands-on experience in security environments and activities.
• Knowledge of application security, vulnerability management, cloud security, and threat modeling.
• Awareness of digital technologies and understanding of functional domains and business processes.
• Good at identifying and proposing process improvements, and at preparing and consolidating documentation on process and technical subjects related to security.
• Knowledge of cloud and network security, IAM, data encryption, and SIEM.
• Understanding of regulatory programs such as GDPR, NYDFS, Schrems, DORA, etc.
• Knowledge of cloud, infrastructure, firewalls, routers, and Wi-Fi.
• Exposure to Java, APIs, ASP.NET, Spark, Python, and React.js.
• Exposure to security tools such as Qualys, Nessus, Nmap, Burp Suite, SonarQube, Netsparker, OWASP, and open-source tools for security testing (a scan sketch follows below).
• Strong work ethic, adaptability, interpersonal skills, and problem-solving capabilities.
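By way of illustration, a minimal Python sketch of an Nmap service scan with an allow-list check, assuming the nmap CLI is installed and the target is authorized for scanning; the target host and expected-port set are placeholders:

```python
# Hypothetical sketch: run an Nmap service/version scan against a host you are
# authorized to test, and flag anything listening on unexpected ports.
import subprocess

TARGET = "scanme.nmap.org"            # replace with an authorized target
EXPECTED_PORTS = {"22/tcp", "80/tcp"}  # placeholder allow-list

# -sV probes open ports for service/version info; --open shows only open ports.
result = subprocess.run(
    ["nmap", "-sV", "--open", TARGET],
    capture_output=True, text=True, check=True,
)

for line in result.stdout.splitlines():
    # Nmap port lines look like: "80/tcp  open  http  Apache httpd 2.4.x"
    if "/tcp" in line and "open" in line:
        port = line.split()[0]
        marker = "" if port in EXPECTED_PORTS else "  <-- unexpected"
        print(line + marker)
```

In practice this kind of check would feed a vulnerability-management workflow (ticketing, rescan scheduling) rather than print to stdout.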
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Network Security / MSS - Vulnerability Management
Microsoft Fabric Core: Deep proficiency in Fabric architecture, specifically Data Factory pipelines, Dataflows Gen2, and Lakehouse/OneLake storage strategies.
Migration Expertise: Proven experience migrating legacy ETL processes (e.g., from Azure Data Factory, SSIS, or on-prem SQL) into the Microsoft Fabric ecosystem.
Scripting & Transformation: Expert-level coding in PySpark and Python for Fabric Notebooks to handle complex data transformations and enrichment (see the PySpark sketch after this list).
Power BI Backend: Ability to build robust Semantic Models and Power BI Datasets directly on top of OneLake (using Direct Lake mode where applicable) for high-performance reporting.
SQL Proficiency: Advanced SQL skills for data modeling, star schema design, and querying within the SQL Endpoint of the Lakehouse.
Azure Ecosystem: Strong grasp of Azure Data Lake Storage (ADLS Gen2) and security governance (Entra ID/ACLs).
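As referenced above, a minimal sketch of a Fabric Notebook transformation in PySpark, assuming a Lakehouse is attached to the notebook; the table and column names are hypothetical:

```python
# Minimal Fabric Notebook transformation sketch. Table/column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created in Fabric notebooks

# Read a raw ingested table from the attached Lakehouse.
orders = spark.read.table("raw_orders")

# Typical enrichment: derive columns, standardize types, drop bad rows.
clean = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("net_amount", F.col("amount") - F.col("discount"))
    .filter(F.col("amount") > 0)
)

# Write back as a Delta table in OneLake; a Direct Lake semantic model can
# then read it without an import refresh.
clean.write.mode("overwrite").format("delta").saveAsTable("fact_orders")
```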
Key Responsibilities (The "Ask")
Architect & Migrate: Lead the backend migration of data from diverse sources into a unified Fabric Lakehouse architecture.
Pipeline Optimization: Re-engineer and optimize data pipelines to ensure seamless data ingestion and transformation for high availability.
Model for Reporting: Design purpose-driven data views and efficient Star Schemas specifically tailored to support rapid Power BI report rendering.
Cross-Functional Support: Bridge the gap between backend data engineering and frontend reporting by ensuring data quality and consistency for the BI team.
Salary: Rs. 10,00,000 - Rs. 25,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Azure Cloud Engineer
Experience: 6-9 years
Salary: Rs. 21 lakhs
Description
Responsibilities
• Work closely with application, engineering, security and operations teams to engineer and build Azure native Services, PaaS and IaaS solutions within an agile and modern enterprise grade operating model.
• Maintain and develop the Infrastructure as Code repository using Terraform, delivering a fully automated cloud infrastructure.
• Utilize containerization technologies like Docker/Kubernetes for application deployment.
• Develop Helm Charts for Application deployment on Kubernetes Clusters.
• Implement and maintain monitoring and alerting systems to detect issues proactively.
• Perform deployment, configuration, monitoring and maintenance of high availability enterprise solutions.
• Perform proactive system administration, monitoring, technical efficiency tuning, up-time, alert notifications, and automation tasks.
• Manage and administer the Microsoft Azure cloud environment, including provisioning, configuration, performance monitoring, policy governance, and security (see the provisioning sketch after this list)
• Design, develop, and implement highly available, multi-region solutions within Microsoft Azure
• Analyze existing operational standards, processes, and/or governance to identify and implement improvements
• Migrate existing infrastructure services to cloud-based solutions
• Manage security and access controls of cloud-based solutions
• Develop infrastructure as code (IaC) leveraging cloud native tooling to ensure automated and consistent platform deployments
• Develop and implement policy driven data protection best practices to ensure cloud solutions are protected from data loss
• Support cloud adoption of applications as they are being transformed and/or modernized
• Ensure all infrastructure components meet proper performance and capacity standards
• Troubleshoot infrastructure and application solutions using Azure Web Services, Azure Monitor, and Application Insights.
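As referenced in the list above, a minimal provisioning sketch using the Azure SDK for Python (azure-identity, azure-mgmt-resource); the subscription ID, resource names, and tags are placeholders, and in this role the same resources would normally be codified in Terraform for repeatability:

```python
# Hypothetical provisioning sketch with the Azure SDK for Python.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()  # env vars, managed identity, or CLI login
client = ResourceManagementClient(credential, "<subscription-id>")

# Create (or update) a resource group with governance tags applied at creation.
rg = client.resource_groups.create_or_update(
    "rg-demo-platform",
    {
        "location": "westeurope",
        "tags": {"env": "dev", "owner": "platform-team", "costCenter": "1234"},
    },
)
print(f"Provisioned {rg.name} in {rg.location}")
```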
Mandatory Technical Skills
• 8+ years of experience with Microsoft Azure Cloud Services (IaaS, PaaS, Database) and Azure DevOps
• Infrastructure As Code using Terraform
• 8+ years of experience with AKS (Azure Kubernetes Service)
• Advanced skills in Linux, networking, security, and Docker-based environments
• Basic knowledge of programming languages is a plus (For example: PowerShell, Kusto, Python, .NET, C#).
• Experience with Helm Charts for Kubernetes Clusters
• Able to create and manage Azure cloud databases such as SQL Server, Cosmos DB, or MongoDB (see the Cosmos DB sketch below)
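A minimal sketch of creating and seeding a Cosmos DB container with the azure-cosmos SDK; the account endpoint, key, database/container names, and partition key are placeholders:

```python
# Hypothetical Cosmos DB sketch. Endpoint, key, and names are placeholders.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")

# Idempotent helpers: safe to run on every deployment.
db = client.create_database_if_not_exists(id="appdb")
container = db.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),  # partition choice drives scale-out
)

container.upsert_item({"id": "1", "customerId": "c-42", "total": 99.5})
print("Cosmos DB database and container ready.")
```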
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Data Engineer
Description
Primary Skill: Azure Data Services. Experience: 9 to 13 years
Salary: Rs. 21 lakhs
Particulars
Senior Developer (Job Description):
• 12+ years of total experience
• 7-10 years of development experience in Azure Data Services
• Experience in integrating on-prem data sources with Azure data storage
• Linux skills, e.g., to upgrade the Vertica database
• Experience in programming in Python and R (TypeScript is a plus)
• Experience in developing data processing pipelines using Azure Data Factory, Azure Databricks, and other Azure services (see the pipeline-trigger sketch after this list)
• Experience in Azure DevOps (building a deployment pipeline from stage to production)
• Experience in using Visual Studio or a similar tool to build and debug code
• Experience in working with Git
• Experience in Azure admin tasks (permissions, network, security)
• Good to have: knowledge of data analysis using Synapse Analytics and Azure Databricks
• Good to have: knowledge of Azure OpenAI and ML services
• Microsoft-certified Azure Data Engineer or similar is preferred
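As referenced above, a minimal sketch that triggers an Azure Data Factory pipeline run and polls its status with azure-mgmt-datafactory; the subscription, resource group, factory, pipeline name, and parameters are all placeholders:

```python
# Hypothetical sketch: trigger an ADF pipeline run and poll until it finishes.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Kick off the pipeline with runtime parameters.
run = client.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-demo",
    pipeline_name="pl_ingest_onprem",
    parameters={"loadDate": "2024-01-01"},
)

# Poll the run until it leaves the queued/in-progress states.
while True:
    status = client.pipeline_runs.get("rg-data", "adf-demo", run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
print(f"Pipeline finished with status: {status}")
```

The same trigger-and-poll pattern is what an Azure DevOps release stage would run when promoting a pipeline from stage to production.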
Other expectations:
• Good communication & strong collaboration skills
• Open-minded
• High interest in new technologies
• Experienced in working with distributed teams across different cultures
• Analytical thinking, a high level of comprehension, and an independent working style
• Will work with US-based colleagues (overlap until 2 PM EST)
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance