Within GTS, the Volume Governance role is held by a Data Analyst responsible for maintaining and enhancing the reporting solutions that enable GTE to industrialize its client relationship and support it factually with auditable data.
Key responsibilities include:
Maintain existing reports, ensuring data quality and the reliability of source repositories (a minimal illustration follows this list).
Manage data integration and processing from reference sources.
Produce and enhance reports on:
Infrastructure volumes consumed by GTE clients.
Optimization of infrastructure consumption.
Client projects (CTB).
Service quality.
Infrastructure transformation.
Identify and implement simplification and convergence opportunities with other data processing domains (billing, production governance, etc.).
Support client relationship teams in understanding and properly using the reports.
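To make the data-quality responsibility concrete, here is a minimal sketch in Python/pandas. The file name, key columns, and specific checks are illustrative assumptions, not details taken from this posting:

import pandas as pd

# Hypothetical extract from a source repository feeding a volume report.
df = pd.read_csv("volumes_extract.csv")

# Basic reliability checks before the data is allowed into a report.
checks = {
    "no_duplicate_keys": not df.duplicated(subset=["client_id", "period"]).any(),
    "no_missing_volumes": df["consumed_volume"].notna().all(),
    "non_negative_volumes": bool((df["consumed_volume"] >= 0).all()),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data-quality checks failed: {failed}")

In practice such checks would run on every refresh of the source repositories, so that report consumers can treat the published figures as auditable.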
Profile:
Experience in data analysis and reporting; experience in cost controlling is a plus.
Ability to work in a complex environment with large and heterogeneous datasets.
Critical thinking and rigor in assessing data quality.
Strong interpersonal skills and ability to collaborate with multiple stakeholders.
Salary: Rs. 0 - Rs. 10,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Skill: Non Voice
Shifts: Rotational UK shifts
Work from Office
Responsibilities
Role Requirement: We are looking for a highly skilled, detail-oriented professional with advanced proficiency in Microsoft PowerPoint and the full Microsoft Office suite, including Excel, Word, Outlook, and Power BI. The ideal candidate excels at designing and delivering visually compelling, data-driven, business-relevant presentations, dashboards, reports, and documents. The role requires translating complex information into clear, impactful visual narratives that support executive communication, business proposals, strategic reviews, and client-facing engagements.
Expertise in developing interactive dashboards and performance reports through Power BI will be a strong advantage.
Salary: Rs. 21,000 - Rs. 25,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Customer Contact Communications Representative
Job Description:
Lead the end-to-end migration of data pipelines and workloads from Teradata to Databricks.
Analyze existing Teradata workloads to identify optimization opportunities and migration strategies.
Design and implement scalable, efficient, and cost-effective data pipelines using PySpark/Scala on Databricks (a minimal sketch follows below).
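As a rough illustration of the migration pattern described above (not the team's actual implementation), here is a minimal PySpark sketch that reads a Teradata table over JDBC and lands it as a Delta table on Databricks. The host, credentials, and table names are placeholders, and the Teradata JDBC driver is assumed to be installed on the cluster:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

# Read a source table from Teradata over JDBC. Connection details
# below are placeholders, not details from this posting.
src = (
    spark.read.format("jdbc")
    .option("url", "jdbc:teradata://td-host/DATABASE=sales")
    .option("driver", "com.teradata.jdbc.TeraDriver")
    .option("dbtable", "sales.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Land the data as a Delta table, the usual target format on Databricks.
src.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")

A real migration would wrap this in incremental loads and validation against the Teradata source, but the read-then-write-to-Delta shape stays the same.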
Skills Required:
Strong experience with Databricks, Apache Spark, and Delta Lake.
Proficiency in PySpark, SQL, and optionally Scala.
Strong understanding of Spark and Databricks performance tuning techniques (see the tuning sketch after this list).
Understanding of Teradata utilities, BTEQ scripts, and stored procedures.
Experience with cloud platforms (Azure, AWS, or GCP), preferably Azure Databricks.
Familiarity with data modeling, data warehousing concepts, and big data best practices.
Strong problem-solving and communication skills.
Experience with version control and CI/CD tools.
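One common example of the tuning techniques mentioned above, sketched in PySpark under the assumption of a large fact table joined to a small dimension table (table names, column names, and the partition count are hypothetical):

from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.getOrCreate()

# Tune shuffle parallelism to the cluster size (value is illustrative).
spark.conf.set("spark.sql.shuffle.partitions", "200")

facts = spark.table("bronze.orders")   # large fact table (hypothetical)
dims = spark.table("bronze.clients")   # small dimension table (hypothetical)

# Broadcasting the small side avoids shuffling the large table.
joined = facts.join(broadcast(dims), "client_id")

# A sensible partition column keeps downstream scans cheap.
(joined.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("silver.orders_enriched"))

Broadcasting the small side replaces a shuffle join with a map-side join, which is often one of the biggest wins when migrating join-heavy Teradata workloads.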
Salary: Rs. 10,00,000 - Rs. 25,00,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance