Work Location: Kolkata, WB / Chennai, TN
Job Title: Developer
Experience: 6-8 Years
Skill Required: Azure Data Factory
Job Description: Managing Azure resources. Azure administrators are responsible for deploying, monitoring, and managing Azure resources such as virtual machines, storage accounts, and networks, ensuring these resources are configured correctly and operate efficiently to meet organizational needs.
Essential Skills: Azure Cloud Administrator.
Responsibilities
• Deploy, monitor, and manage Azure resources such as virtual machines, storage accounts, and networks (see the sketch below).
• Ensure resources are configured correctly and operate efficiently to meet organizational needs.
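As a flavor of the resource-management work above, here is a minimal sketch that inventories virtual machines with the Azure SDK for Python (azure-identity, azure-mgmt-compute). The posting does not prescribe tooling, so the packages, the environment variable, and the output format are assumptions.

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# Picks up environment variables, Azure CLI login, or a managed identity.
credential = DefaultAzureCredential()
subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]  # placeholder

compute = ComputeManagementClient(credential, subscription_id)

# List every VM in the subscription with its location and size.
for vm in compute.virtual_machines.list_all():
    size = vm.hardware_profile.vm_size if vm.hardware_profile else "unknown"
    print(f"{vm.name}\t{vm.location}\t{size}")
```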
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Responsibilities
• Design, build, and deploy data extraction, transformation, and loading (ETL) processes and pipelines from various sources, including databases, APIs, and data files.
• Develop and support data pipelines within a cloud data platform such as Databricks.
• Build data models that reflect domain expertise, meet current business needs, and remain flexible as strategy evolves.
• Monitor and optimize Databricks cluster performance, ensuring cost-effective scaling and resource utilization.
• Communicate technical concepts to non-technical audiences, both in written and verbal form.
• Apply a strong understanding of coding and programming concepts to build data pipelines (e.g., data transformation, data quality, data integration).
• Apply a strong understanding of database storage concepts (data lakes, relational databases, NoSQL, graph databases, data warehousing).
• Implement and maintain Delta Lake for optimized data storage, ensuring data reliability, performance, and versioning (see the sketch after this list).
• Automate CI/CD pipelines for data workflows using Azure DevOps.
• Collaborate with cross-functional teams to support data governance using Databricks Unity Catalog.
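A minimal sketch of the Delta Lake versioning mentioned above. It assumes a Databricks environment (or a local Spark session with the delta-spark package configured); the table path and sample data are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` already exists

path = "/tmp/demo/events_delta"  # hypothetical storage location

# Two writes: Delta records each one as a new table version.
spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"]) \
    .write.format("delta").mode("overwrite").save(path)
spark.createDataFrame([(1, "a2"), (3, "c"), (4, "d")], ["id", "val"]) \
    .write.format("delta").mode("overwrite").save(path)

# Read the latest version, then "time travel" back to version 0.
latest = spark.read.format("delta").load(path)
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)
print(latest.count(), v0.count())  # 3 vs. 2
```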
Salary: Rs. 0.0 - Rs. 12.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Sound knowledge of Workday HCM Technical and Workday HCM Core Functional.
• Bachelor’s degree in Information Systems, Data Analytics, Finance, or a related field.
• 2–5 years of experience working with Workday Finance or other modules (reporting, analytics, or data management).
• Strong knowledge of Workday Reporting (Advanced, Matrix, Composite, Prism Analytics); see the sketch after this list.
• Hands-on experience with EIBs, calculated fields, and business process configuration.
• Proficiency in Excel, SQL, or other data analysis tools is a plus.
• Strong analytical, problem-solving, and troubleshooting skills, with the ability to interpret data and provide actionable insights.
• Excellent communication and stakeholder management skills across business and technical audiences.
• Ability to work independently in a fast-paced, dynamic environment.
• Workday certification preferred.
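One concrete way the reporting skills above are exercised is pulling a Workday custom report through Reports-as-a-Service (RaaS) for downstream analysis. The sketch below is illustrative only: the tenant, report owner, report name, and credentials are hypothetical placeholders, and real tenants typically use OAuth rather than basic auth.

```python
import io

import pandas as pd
import requests

# Hypothetical RaaS endpoint: host/tenant/owner/report are placeholders.
RAAS_URL = (
    "https://wd2-impl-services1.workday.com/ccx/service/customreport2/"
    "acme_tenant/integration_user/Sample_Finance_Report"
)

resp = requests.get(
    RAAS_URL,
    params={"format": "csv"},               # RaaS can emit CSV, JSON, or XML
    auth=("integration_user@acme", "***"),  # placeholder credentials
    timeout=60,
)
resp.raise_for_status()

# Load the report into pandas for ad-hoc analysis.
df = pd.read_csv(io.StringIO(resp.text))
print(df.head())
```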
Salary: Rs. 10.0 - Rs. 12.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• 4 years of hands-on experience in PySpark development: ingestion, databases, schemas, and workflows.
• Hands-on experience with scripting tools such as Python and shell.
• Strong SQL knowledge, including SCD1/SCD2 ETL logic and preparing validation SQL scripts (see the sketch after this list).
• Knowledge of Spark: DataFrames, Spark SQL, and Spark libraries.
• Knowledge of AWS S3, AWS EMR, and AWS Redshift.
• Strong knowledge of data management principles.
• Experience building ETL data warehouse transformation processes.
• Strong communication skills.
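To make the SCD2 requirement concrete, here is a hedged PySpark sketch of type-2 dimension maintenance plus a validation query. The table and column names (dim_customer, stg_customer_updates, customer_id, address) are illustrative assumptions, not from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Current dimension rows and the incoming batch (both names hypothetical).
dim_cur = spark.table("dim_customer").where("is_current").alias("d")
stg = spark.table("stg_customer_updates").alias("s")
key = F.col("d.customer_id") == F.col("s.customer_id")

# 1. Keys whose tracked attribute changed: expire the current version.
changed = dim_cur.join(stg, key).where(F.col("d.address") != F.col("s.address"))
expired = (changed.select("d.*")
           .withColumn("end_date", F.current_date())
           .withColumn("is_current", F.lit(False)))

# 2. Changed keys and brand-new keys both get a fresh current-version row.
arrivals = changed.select("s.*").unionByName(stg.join(dim_cur, key, "left_anti"))
fresh = (arrivals
         .withColumn("effective_date", F.current_date())
         .withColumn("end_date", F.lit(None).cast("date"))
         .withColumn("is_current", F.lit(True)))

# 3. Validation query in the spirit of the "validation SQL scripts" above:
# each key in the incoming change set must yield exactly one current row.
fresh.createOrReplaceTempView("scd2_incoming")
spark.sql("""
    SELECT customer_id, COUNT(*) AS n
    FROM scd2_incoming
    WHERE is_current
    GROUP BY customer_id
    HAVING COUNT(*) <> 1
""").show()

# (Writing `expired` and `fresh` back to the dimension table is omitted here.)
```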
Salary: Rs. 10.0 - Rs. 12.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance