Job Description:
• Bachelor’s degree in Information Systems, Data Analytics, Finance, or a related field.
• 2–5 years of experience working with Workday Finance or other modules (reporting, analytics, or data management).
• Strong knowledge of Workday Reporting (Advanced, Matrix, Composite, Prism Analytics); a retrieval sketch follows this list.
• Hands-on experience with EIBs, calculated fields, and business process configuration.
• Proficiency in Excel, SQL, or other data analysis tools is a plus.
• Strong analytical skills with the ability to interpret data and provide actionable insights.
• Excellent communication and stakeholder management skills.
• Ability to work independently in a fast-paced, dynamic environment.
• Workday certification preferred.
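Reporting analysts in this stack often pull Workday report output into Python for further analysis. Below is a minimal, hypothetical sketch assuming the custom report has been exposed as a Workday RaaS (Report-as-a-Service) endpoint; the host, tenant, user, and report names are placeholders, not a documented contract:

```python
import requests  # pip install requests

# Hypothetical RaaS endpoint: tenant, owner, and report name are placeholders.
RAAS_URL = (
    "https://HOST.workday.com/ccx/service/customreport2/"
    "example_tenant/integration_user/Finance_Actuals?format=json"
)

def fetch_report(user: str, password: str) -> list:
    """Pull a custom report as JSON and return its row entries."""
    resp = requests.get(RAAS_URL, auth=(user, password), timeout=60)
    resp.raise_for_status()
    # RaaS JSON output typically wraps rows in a "Report_Entry" list.
    return resp.json().get("Report_Entry", [])

if __name__ == "__main__":
    rows = fetch_report("integration_user", "********")
    print(f"Fetched {len(rows)} report rows")
```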
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Workday Integration Specialist
Job Description:
• Bachelor’s degree in Computer Science, Information Technology, Finance, or a related field (Workday certification preferred).
• 3–6+ years of experience in Workday integration development.
• Strong expertise in Workday Studio, EIB, Core Connectors, Workday Web Services (SOAP/REST).
• Hands-on experience with Workday Financials.
• Strong knowledge of FSCM business processes such as Procure-to-Pay (P2P), Order-to-Cash (O2C), Expenses, and Supplier Management.
• Proficiency in XML, XSLT, XPath, Java, JSON, and SQL for integration development.
• Experience with integration middleware and third-party systems (e.g., banks, suppliers, procurement systems, payment gateways).
• Excellent problem-solving, analytical, and troubleshooting skills.
• Strong communication skills to work with business and technical stakeholders.
• Functional knowledge across core Workday Financials areas:
• General Ledger (GL)
• Accounts Payable (AP) / Accounts Receivable (AR)
• Asset Management
• Expenses
• Procurement / Supplier Accounts
• Banking & Settlement
• Financial Reporting & Analytics
• Hands-on configuration experience with Business Processes, Security, and Workday Setup
• Strong understanding of Workday Reporting (Advanced Reports, Composite, Matrix, Workday Prism).
• Solid grounding in accounting principles (GAAP, IFRS).
• Understanding of multi-entity, multi-currency consolidations (a worked sketch follows below).
• Exposure to financial planning, budgeting, and forecasting.
• Experience with audit, compliance, and controls (SOX, internal controls, statutory reporting).
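To make the consolidation requirement concrete, here is a small, illustrative Python sketch that translates subsidiary balances into a reporting currency and sums the group total. The entities, balances, and FX rates are invented sample data, and real consolidations also involve eliminations and currency-translation adjustments that this omits:

```python
# Invented sample data: one balance per entity, in its local currency.
FX_TO_USD = {"USD": 1.0, "EUR": 1.08, "INR": 0.012}

entity_balances = [
    {"entity": "US_Corp", "currency": "USD", "cash": 250_000.0},
    {"entity": "EU_GmbH", "currency": "EUR", "cash": 180_000.0},
    {"entity": "IN_Pvt",  "currency": "INR", "cash": 9_500_000.0},
]

def consolidate(balances, rates):
    """Translate each entity's balance to USD and return the group total."""
    return sum(b["cash"] * rates[b["currency"]] for b in balances)

print(f"Group cash (USD): {consolidate(entity_balances, FX_TO_USD):,.2f}")
```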
• Experience with Business Process Framework (approvals, routing, conditions).
• Strong knowledge of Workday security model (roles, domains, segregation of duties).
• Proven experience configuring and implementing Workday Procurement modules.
• Experience with Workday Supplier Portal and onboarding flows.
• Ability to configure catalogs, requisitions, and supplier invoices.
• Exposure to Workday Prism Analytics for procurement data insights.
• Strong knowledge of Procure-to-Pay (P2P) lifecycle.
• Experience with supplier management (onboarding, vetting, compliance).
• Familiarity with contract lifecycle management in Workday.
• Understanding of expense management and policies.
• Knowledge of the 3-way match process (PO, receipt, invoice); see the sketch after this list.
• Expertise in tax handling, payment terms, and settlement processes.
• Familiarity with multi-entity and multi-currency procurement.
• Understanding of finance and procurement integration points (GL, AP).
• Ability to perform fit-gap analysis between business needs and Workday functionality.
• Support audit, compliance, and procurement policy enforcement.
• Workday certifications in Financials or Procurement modules (preferred).
• Strong communication and stakeholder management skills.
• Self-starter with a high degree of ownership and accountability.
• Ability to manage multiple priorities in a fast-paced environment.
• Detail-oriented with a strong focus on process improvement.
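As a worked example of the 3-way match mentioned above, the sketch below compares a purchase order, a goods receipt, and a supplier invoice before approving payment. This is generic matching logic for illustration, not Workday's implementation, and the 2% price tolerance is an assumption:

```python
from dataclasses import dataclass

@dataclass
class Doc:
    """One document in the match: a PO, goods receipt, or invoice."""
    number: str
    quantity: int
    unit_price: float

def three_way_match(po: Doc, receipt: Doc, invoice: Doc,
                    price_tolerance: float = 0.02) -> bool:
    """Approve only if quantities agree everywhere and the invoiced
    price stays within the tolerance of the PO price."""
    qty_ok = po.quantity == receipt.quantity == invoice.quantity
    price_ok = abs(invoice.unit_price - po.unit_price) <= po.unit_price * price_tolerance
    return qty_ok and price_ok

po = Doc("PO-1001", 10, 45.00)
receipt = Doc("GR-2001", 10, 45.00)
invoice = Doc("INV-3001", 10, 45.50)
print("Pay invoice" if three_way_match(po, receipt, invoice) else "Hold for review")
```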
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
They will collaborate with cross-functional teams to design, implement, and manage scalable, reliable cloud solutions, and will drive innovation by staying current with the latest Python and GCP technologies to deliver industry-leading solutions.
• Collaborate with clients to understand their business requirements and design architectures that meet their needs.
• Develop and implement cloud strategies, best practices, and standards to ensure efficient and effective cloud utilization.
• Provide technical guidance and mentorship to develop the team's skills and expertise in Python and GCP.
• Stay up to date with the latest Python and GCP technologies, trends, and best practices, and assess their applicability to client solutions.
• Drive innovation and continuous improvement in Python/GCP offerings and services.
• Collaborate with sales and business development teams to identify and pursue new business opportunities related to Python on GCP.
• Ensure that Python/GCP solutions meet security, compliance, and governance requirements.
• Develop and maintain strong relationships with clients, vendors, and internal stakeholders to promote the a
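As a small, concrete example of Python-on-GCP work of this kind, the sketch below uploads a local file to Cloud Storage with the official google-cloud-storage client. The bucket and object names are placeholders, and credentials are assumed to come from the environment (e.g. GOOGLE_APPLICATION_CREDENTIALS):

```python
from google.cloud import storage  # pip install google-cloud-storage

def upload_report(bucket_name: str, local_path: str, dest_blob: str) -> str:
    """Upload a local file to a GCS bucket and return its gs:// URI."""
    client = storage.Client()  # picks up ambient GCP credentials
    blob = client.bucket(bucket_name).blob(dest_blob)
    blob.upload_from_filename(local_path)
    return f"gs://{bucket_name}/{dest_blob}"

# Placeholder names for illustration.
print(upload_report("example-analytics-bucket", "report.csv", "daily/report.csv"))
```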
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Essential Skills:
· Responsible for data analysis, developing reporting specifications, building reports and dashboards, and writing complex SQL queries.
· Work closely with business users to create reports and dashboards using Tableau Desktop.
· Good understanding of Tableau architecture, design, development, and the end-user experience.
· Good understanding of visualization concepts and best practices.
· Extensive experience with Tableau Desktop, Tableau Server, and Tableau Reader across various versions of Tableau.
· Experience publishing reports to Tableau Server and creating extracts; see the sketch after this list.
· Create and modify interactive dashboards, including guided navigation links within them.
· Effective use of data blending, filters, actions, hierarchies, LOD expressions, and the dual-axis feature in Tableau.
· Extensive knowledge of reporting objects such as facts, attributes, hierarchies, transformations, filters, and prompts.
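For the publishing point above, here is a minimal, hypothetical sketch that publishes a workbook with the tableauserverclient library; the server URL, token, project name, and workbook file are placeholders:

```python
import tableauserverclient as TSC  # pip install tableauserverclient

# Placeholder credentials and names for illustration.
auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Find the target project, then publish, overwriting any prior version.
    projects, _ = server.projects.get()
    project = next(p for p in projects if p.name == "Finance")
    item = TSC.WorkbookItem(project_id=project.id)
    published = server.workbooks.publish(
        item, "sales_dashboard.twbx", mode=TSC.Server.PublishMode.Overwrite
    )
    print(f"Published workbook id: {published.id}")
```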
Desirable Skills:
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Key responsibilities
More than 6 years of experience developing Python applications, with working experience in database connectivity, multi-threading and multi-processing, file handling (XML, JSON, etc.), and exception handling; a short sketch follows this list.
Testing background is preferred.
Experience with pytest and with Python libraries for processing large files.
Good understanding of object-oriented programming concepts.
Knowledge of code optimization for performance, scheduling, and pandas is good to have.
Good understanding of database concepts, advanced SQL queries, indexing and partitioning, and performance tuning.
Troubleshooting experience in application development, following Agile or another SDLC methodology.
Ability to adapt to new processes and tools.
Self-driven and proactive; takes accountability for delivering to agreed timelines.
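To illustrate the file-handling, multi-processing, and exception-processing items above, here is a minimal, self-contained sketch that parses a batch of JSON files across worker processes, handling bad files without crashing the pool; the data directory is a placeholder:

```python
import json
from multiprocessing import Pool
from pathlib import Path

def parse_file(path):
    """Return (path, record_count); failures are reported, not raised."""
    try:
        with open(path, encoding="utf-8") as f:
            records = json.load(f)  # each file is assumed to hold a JSON array
        return path, len(records)
    except (OSError, json.JSONDecodeError) as exc:
        print(f"Skipping {path}: {exc}")
        return path, 0

if __name__ == "__main__":
    files = [str(p) for p in Path("data").glob("*.json")]  # placeholder dir
    with Pool(processes=4) as pool:
        for path, count in pool.map(parse_file, files):
            print(f"{path}: {count} records")
```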
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Work Location: Gurgaon (HR), Chennai (TN), Noida (UP)
Job Title: Developer
Skill Required: Digital: Spring Boot, Core Java
Experience Range: 6-8
Job Description:
Job Title: Offshore Backend Developer (Java Spring Boot)
Technical Skills / Required Qualifications:
6+ years of experience developing applications, web services, and cloud-native apps using Java, Spring Boot, REST, and reactive programming
Experience integrating microservices, databases, and APIs
Experience working with and analyzing output from observability tools
2+ years of experience with NoSQL databases
Experience with cloud platforms, preferably GCP
Responsibilities
Be part of a team of engineers developing elegant, high-performance code
Ensure quality practices: unit testing, code reviews, and leading test efforts
Optimize applications for non-functional requirements
Build and deploy components as part of the CI/CD process
Responsible for end-to-end application delivery, including coordination as required.
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Experience Range in Required Skills: 6-8 years
Note: candidates with 5+ years of overall experience will be considered.
Shift: 2–11 pm, with WFH flexibility of 4 hours.
Location: PAN India
Note: this role requires a Data Scientist, not a Data Engineer.
Primary skill: strong AI/ML experience (3+ years).
Secondary skill: PySpark (2+ years).
Note: interviews for this role are being scheduled continuously.
Interview Feedback / Questions / Pointers:
> Should have hands-on experience with real-time data, along with solid academic project work.
> Should be able to explain Data Science / ML concepts well.
> Programming languages: Python and PySpark.
> Working knowledge of predictive / ML-based models.
> Working experience with cloud platforms.
---- Connect with me for a sample resume.
Job Description - Must have: the candidate must have expertise/experience with the tasks below.
- Experience with Linux, Git, CI/CD, release management, production deployment, and support.
- Strong knowledge of Apache Spark is a MUST.
- Strong knowledge of PySpark is a MUST.
- Strong knowledge of SQL is a MUST.
- Good knowledge of data science workloads.
- Good knowledge of Kubernetes/Docker.
- Good knowledge of Python is a MUST.
- Good knowledge of the Java language.
- Good to have: conceptual knowledge of Apache Airflow, Apache Atlas, Apache Ranger, Postgres, Trino, and Superset.
Essential Skills (additional, for reference): Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum, Linux, OpenShift, Kubernetes, Superset.
- 5+ years of experience in data engineering, ELT development, and data modeling.
- Proficiency with Apache Airflow and Spark for data transformation, integration, and management.
- Experience implementing workflow orchestration with tools such as Apache Airflow, SSIS, or similar platforms.
- Demonstrated experience developing custom connectors for data ingestion from various sources.
- Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
- Experience implementing DataOps principles and practices, including data CI/CD pipelines.
- Excellent problem-solving and troubleshooting skills, with strong attention to detail.
- Effective communication and collaboration abilities, with a proven track record in cross-functional teams.
- Familiarity with data visualization tools (Apache Superset) and dashboard development.
- Understanding of distributed systems and working with large-scale datasets.
- Familiarity with data governance frameworks and practices.
- Knowledge of data streaming and real-time processing technologies (e.g., Apache Kafka).
- Strong understanding of software development principles and practices, including version control (e.g., Git) and code reviews.
- Experience with Agile development methodologies and cross-functional Agile teams.
- Ability to adapt quickly to changing priorities in a fast-paced environment.
- Excellent analytical and problem-solving skills, with keen attention to detail.
- Strong written and verbal communication skills, with the ability to communicate complex technical concepts to both technical and non-technical stakeholders.
A brief PySpark sketch follows this list.
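To ground the Spark/PySpark must-haves, here is a minimal, illustrative PySpark sketch that loads a CSV, assembles features, and fits a logistic-regression model; the file path and column names are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("churn-demo").getOrCreate()

# Placeholder path and schema: numeric features plus a 0/1 "label" column.
df = spark.read.csv("data/customers.csv", header=True, inferSchema=True)

assembler = VectorAssembler(
    inputCols=["tenure_months", "monthly_spend"], outputCol="features"
)
train = assembler.transform(df).select("features", "label")

model = LogisticRegression(labelCol="label").fit(train)
print(f"Training AUC: {model.summary.areaUnderROC:.3f}")

spark.stop()
```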
Comments for Suppliers:
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Engineer
Work Location: Chennai, Hyderabad, Bangalore, Kolkata (PAN India)
Skill Required: Digital: Apache Spark, Python for Data Science, PySpark, MySQL
Role required: Data Engineer with Data Science.
Experience Range in Required Skills: 6-8 years
Note: candidates with 5+ years of overall experience will be considered.
Shift: 2–11 pm, with WFH flexibility of 4 hours.
Location: PAN India
Note: this role requires a Data Scientist, not a Data Engineer.
Primary skill: strong AI/ML experience (3+ years).
Secondary skill: PySpark (2+ years).
Note: interviews for this role are being scheduled continuously.
Interview Feedback / Questions / Pointers:
> Should have hands-on experience with real-time data, along with solid academic project work.
> Should be able to explain Data Science / ML concepts well.
> Programming languages: Python and PySpark.
> Working knowledge of predictive / ML-based models.
> Working experience with cloud platforms.
---- Connect with me for a sample resume.
Job Description - Must have: the candidate must have expertise/experience with the tasks below.
- Experience with Linux, Git, CI/CD, release management, production deployment, and support.
- Strong knowledge of Apache Spark is a MUST.
- Strong knowledge of PySpark is a MUST.
- Strong knowledge of SQL is a MUST.
- Good knowledge of data science workloads.
- Good knowledge of Kubernetes/Docker.
- Good knowledge of Python is a MUST.
- Good knowledge of the Java language.
- Good to have: conceptual knowledge of Apache Airflow, Apache Atlas, Apache Ranger, Postgres, Trino, and Superset.
Essential Skills: Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum, Linux, OpenShift, Kubernetes, Superset.
- 5+ years of experience in data engineering, ELT development, and data modeling.
- Proficiency with Apache Airflow and Spark for data transformation, integration, and management.
- Experience implementing workflow orchestration with tools such as Apache Airflow, SSIS, or similar platforms.
- Demonstrated experience developing custom connectors for data ingestion from various sources.
- Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
- Experience implementing DataOps principles and practices, including data CI/CD pipelines.
- Excellent problem-solving and troubleshooting skills, with strong attention to detail.
- Effective communication and collaboration abilities, with a proven track record in cross-functional teams.
- Familiarity with data visualization tools (Apache Superset) and dashboard development.
- Understanding of distributed systems and working with large-scale datasets.
- Familiarity with data governance frameworks and practices.
- Knowledge of data streaming and real-time processing technologies (e.g., Apache Kafka).
- Strong understanding of software development principles and practices, including version control (e.g., Git) and code reviews.
- Experience with Agile development methodologies and cross-functional Agile teams.
- Ability to adapt quickly to changing priorities in a fast-paced environment.
- Excellent analytical and problem-solving skills, with keen attention to detail.
- Strong written and verbal communication skills, with the ability to communicate complex technical concepts to both technical and non-technical stakeholders.
A brief Airflow sketch follows this list.
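To ground the workflow-orchestration requirement, here is a minimal, illustrative Airflow DAG with two dependent Python tasks; the DAG id, schedule, and task bodies are placeholders (syntax per Airflow 2.x):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extracting source data...")  # placeholder extract step

def load():
    print("loading into the warehouse...")  # placeholder load step

# Placeholder DAG: runs daily, with no backfill of past intervals.
with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # load runs only after extract succeeds
```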
Comments for Suppliers:
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance