Agile Way of Working - Core Java
• Job Description for ServiceNow.
What Does a ServiceNow Architect Do?
A ServiceNow Solution Architect or Technical Architect understands the end-to-end functionality and technical architecture of ServiceNow ITSM, ITOM, and Now Assist. The candidate should be able to customize ServiceNow through code/scripting and is responsible for configuring, managing, and maintaining an organization’s ServiceNow platform. Their goal is to ensure smooth operation, security, and optimization of the platform for IT service management and other business processes.
Key Responsibilities
System Configuration & Customization
• Configure ServiceNow modules (Incident, Problem, Change Management).
• Create custom forms, workflows, and scripts; make code changes for customization.
User & Role Management
• Manage user accounts, roles, and permissions.
• Implement access controls and monitor user activity.
Platform Maintenance
• Perform upgrades, patches, and backups.
• Monitor performance and troubleshoot issues.
Incident, Problem, and Change Resolution
• Address incidents related to ServiceNow functionality.
• Investigate root causes and implement fixes.
Migration, Integration & API Management
• Configure integrations with third-party systems.
• Manage REST/SOAP API connections.
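As a rough illustration of the REST API connections mentioned above, the sketch below builds a GET request against ServiceNow's Table API (the standard REST endpoint for reading records). The instance name, user, and password are placeholders, not a real system; in practice the returned URL and headers would be passed to an HTTP client.

```python
import base64
import urllib.parse

def table_api_request(instance: str, table: str, query: str, user: str, password: str):
    """Return (url, headers) for a GET against the ServiceNow Table API."""
    base = f"https://{instance}.service-now.com/api/now/table/{table}"
    params = urllib.parse.urlencode({
        "sysparm_query": query,           # encoded query, e.g. active=true^priority=1
        "sysparm_limit": "10",            # cap the result set
        "sysparm_display_value": "true",  # return display values instead of sys_ids
    })
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    headers = {"Accept": "application/json", "Authorization": f"Basic {token}"}
    return f"{base}?{params}", headers

# Hypothetical instance and credentials, for illustration only.
url, headers = table_api_request("dev12345", "incident", "active=true^priority=1",
                                 "api.user", "secret")
```

The `sysparm_*` parameters shown are standard Table API query options; a production integration would use OAuth or a vaulted credential rather than inline basic auth.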
Reporting & Analytics
• Create dashboards, reports, and scheduled notifications.
• The following are a few of the screening questions for candidates during the virtual drive:
Any ServiceNow implementation experience?
Are you proficient in customizing solutions in ServiceNow through code or scripting?
In which ServiceNow track have you worked? ITSM or ITOM is preferred.
Have you done any ServiceNow migrations?
• Feedback updated for the following candidates in Beeline.
Ganesh Kothule, Arshad Rashid, Arpit Singh, Bej Sitakanta, Pallabi Bhowmick, Priya Karmarkar.
Responsibilities
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Microsoft Power BI & Azure Data Factory
BI: Power BI, DAX, Power Query (M), Deployment Pipelines.
Data Engineering: Azure Data Factory
Databases/Warehouses: Azure SQL, Synapse SQL
Languages: SQL, PL-SQL
Required Qualifications
4–8 years of experience in BI/Analytics and Data Engineering (adjust based on seniority).
Strong in Power BI: DAX, Power Query (M), model design, performance tuning, RLS/OLS.
Hands-on with SQL (advanced joins, window functions, performance tuning, indexing).
Experience with the Azure stack (Azure Data Factory).
Solid grasp of data modeling: star schema, snowflake, slowly changing dimensions (SCD), surrogate keys, incremental strategies.
Familiarity with version control (Git) and CI/CD for data/BI assets.
Strong analytical, communication, and stakeholder management skills.
2) Data Engineering (Pipelines & Modeling)
Design and implement data ingestion and transformation pipelines (batch/streaming) using tools like Azure Data Factory, Databricks, Synapse / Fabric, or equivalent.
Build ELT/ETL jobs with robust error handling, logging, monitoring, and cost optimization.
Implement dimensional modeling (facts/dimensions), Data Vault (where relevant), and data quality checks (profiling, validation rules).
Orchestrate CDC, incremental loads, schema evolution, and data partitioning strategies.
Collaborate on data warehousing (Azure SQL/Synapse/Snowflake/BigQuery/Redshift) and lakehouse architectures.
Set up CI/CD for data pipelines and Power BI (source control, YAML pipelines, deployment pipelines, parameterization).
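The incremental-load responsibility above can be sketched as a watermark pattern, the approach ADF's incremental copy follows: persist the last loaded change timestamp, copy only rows newer than it, then advance the watermark. This is a hedged sketch with plain Python lists and dicts standing in for source and target tables; all names are illustrative.

```python
from datetime import datetime

# Illustrative source rows; "updated" plays the role of a change-tracking column.
source = [
    {"id": 1, "updated": datetime(2024, 1, 1), "value": "a"},
    {"id": 2, "updated": datetime(2024, 1, 5), "value": "b"},
    {"id": 3, "updated": datetime(2024, 1, 9), "value": "c"},
]

def incremental_load(source_rows, target, watermark):
    """Copy only rows changed after the stored watermark, then advance it."""
    delta = [r for r in source_rows if r["updated"] > watermark]
    for row in delta:
        target[row["id"]] = row  # upsert keyed on the business key
    # Advance the watermark to the newest change seen (unchanged if no delta).
    return max((r["updated"] for r in delta), default=watermark)

target = {}
wm = incremental_load(source, target, datetime(2024, 1, 2))
# Only ids 2 and 3 are newer than the watermark; it advances to 2024-01-09.
```

In a real pipeline the watermark lives in a control table and the upsert is a warehouse MERGE, but the control flow is the same.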