Role Descriptions: Responsibilities
• Design intuitive and inclusive user experiences for a large-scale, non-gaming community platform.
• Create user flows, wireframes, prototypes, and high-fidelity UI designs using Figma.
• Conduct user research and usability testing, with a focus on Indian users and mobile-first behavior.
• Collaborate with regional teams to ensure cultural relevance, language localization, and UX alignment with Indian market expectations.
• Maintain and evolve a scalable design system for consistent UI across web and mobile.
• Apply accessibility (WCAG 2.1) and UX best practices to all designs.
• Use AI-assisted tools for rapid prototyping, iteration, and user behavior analysis.
• Continuously improve features based on user feedback, analytics, and market trends.
Qualifications
• 10 years of experience in UI/UX design, ideally for community platforms, social networks, or content-driven services.
• A strong portfolio showcasing clean visual design, thoughtful UX, and an end-to-end design process.
• Proficiency in Figma and familiarity with prototyping and design collaboration tools.
• Experience conducting user research and usability testing, especially in emerging markets.
• Understanding of design systems, responsive design, and accessibility standards.
• Ability to simplify complex social interactions into intuitive, engaging interfaces.
• Strong communication and collaboration skills across global and cross-functional teams.
Essential Skills: KRAFTON is expanding beyond games, and we're building a next-generation community platform designed to connect and empower users across diverse digital experiences. As a UX Designer on our Platform team, you'll help shape a product that fosters meaningful interaction, content discovery, and user engagement, with a strong focus on the Indian digital ecosystem. This is a unique opportunity to design for a mobile-first, community-driven platform that blends social features, creator tools, and localized content, all while collaborating with a global, cross-functional team. The responsibilities and qualifications for the role are listed under Role Descriptions.
Desirable Skills:
Keyword:
Skills: Digital : User Experience (UX)
Experience Required: 4-6
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Results-driven Data Scientist with 5 years of experience building predictive models, performing end-to-end data analysis, and deploying data-driven solutions that improve business outcomes. Skilled in machine learning, statistical modelling, data visualization, and cloud-based analytics. Adept at translating complex data insights into actionable business strategies and collaborating with cross-functional teams.
Core Responsibilities
1. Data Analysis & Problem Solving
• Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies.
• Use statistical methods to validate hypotheses and support business decision-making.
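The EDA and hypothesis-validation duties above can be sketched with nothing beyond the standard library; the 2-standard-deviation threshold and the sample data are illustrative assumptions, not part of the role description:

```python
import statistics

def summarize(values):
    """Basic EDA: centre, spread, and simple z-score anomaly flagging."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    # flag points more than 2 sample standard deviations from the mean
    outliers = [v for v in values if abs(v - mean) / stdev > 2]
    return {"mean": mean, "stdev": stdev, "outliers": outliers}

data = [10, 12, 11, 13, 12, 11, 40]  # 40 is the planted anomaly
report = summarize(data)
print(report["outliers"])  # → [40]
```

In real work the same pattern would run over pandas DataFrames, but the logic (summary statistics, then a rule for what counts as anomalous) is the same.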
2. Machine Learning & Modeling
• Build, train, and optimize supervised/unsupervised ML models (regression, classification, clustering, NLP).
• Implement feature engineering, model evaluation, and hyperparameter tuning.
• Deploy models into production using MLOps tools.
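Hyperparameter tuning as listed above is usually delegated to a library (e.g. scikit-learn's grid search), but the tune-on-a-holdout loop itself can be sketched in plain Python; the 1-D ridge closed form, the toy data, and the λ grid are all invented for illustration:

```python
def fit_ridge_1d(xs, ys, lam):
    # closed-form ridge solution for a no-intercept model y ≈ w·x
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

def mse(w, xs, ys):
    # mean squared error of the fitted slope on a dataset
    return sum((y - w * x) ** 2 for x, y in zip(xs, ys)) / len(xs)

# toy split: tune the penalty on held-out validation data
train_x, train_y = [1, 2, 3, 4], [2.1, 3.9, 6.2, 7.9]
val_x, val_y = [5, 6], [10.1, 11.8]

grid = [0.0, 0.1, 1.0, 10.0]
best_lam = min(grid, key=lambda lam: mse(fit_ridge_1d(train_x, train_y, lam), val_x, val_y))
print(best_lam)  # → 0.1
```

The essential point is that the hyperparameter is chosen by validation error, never by training error.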
3. Data Engineering Collaboration
• Work with data engineers to design and maintain scalable data pipelines.
• Handle large datasets from multiple sources (SQL, APIs, cloud storage).
4. Visualization & Business Communication
• Create dashboards and reports using tools like Power BI, Tableau, or Python (Matplotlib/Seaborn).
• Present insights to stakeholders and leadership for decision-making.
5. Cloud & Big Data
• Utilize cloud platforms like Azure, AWS, or GCP for model training and data storage.
• Work with tools like Spark, Databricks, or Hadoop for large-scale data processing.
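Spark's value is that a per-partition aggregate-then-combine pattern runs in parallel across a cluster; run serially, the same pattern looks like this (the word-count task and the two "partitions" are invented for illustration):

```python
from collections import Counter
from functools import reduce

# data as it might be split across worker partitions
partitions = [["a b a", "c"], ["b b", "a c"]]

def count_partition(lines):
    # local, per-partition aggregation (analogous to Spark's mapPartitions + combiner)
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

# the cluster-wide merge step, done serially here
totals = reduce(lambda a, b: a + b, (count_partition(p) for p in partitions))
print(totals["a"])  # → 3
```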
Technical Skills
Programming
• Python (NumPy, Pandas, Scikit-learn, statsmodels, TensorFlow/PyTorch)
• SQL (T-SQL, MySQL, PostgreSQL)
• R (optional)
Machine Learning
• Predictive modeling
• Deep learning (optional)
• Natural Language Processing (NLP)
• Time-series forecasting
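As a minimal sketch of the time-series forecasting skill, simple exponential smoothing produces a one-step-ahead forecast in a few lines; the series and α below are made up:

```python
def ses_forecast(series, alpha=0.5):
    """Simple exponential smoothing; returns the one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        # new level = weighted blend of the latest observation and the old level
        level = alpha * x + (1 - alpha) * level
    return level

print(ses_forecast([10, 12, 14, 16], alpha=0.5))  # → 14.25
```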
Visualization
• Power BI, Tableau
• Matplotlib, Seaborn, Plotly
Cloud & MLOps
• Azure Machine Learning / AWS SageMaker
• Git, Docker
• CI/CD pipelines
Databases & Big Data
• Azure Data Lake / AWS S3
• Spark / Databricks
Soft Skills
• Analytical thinking
• Problem solving
• Stakeholder communication
• Business understanding
• Team collaboration
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Z2 category. Experience: 5 years.
• Good communication and presentation skills.
• Strong knowledge of Azure components: Azure Data Lake, Azure Data Factory, Azure SQL, Azure Databricks.
• Strong knowledge of and hands-on experience with Apache Spark and Python programming, including working with Delta Tables; Databricks experience is a must.
• Strong SQL skills: data modelling, developing SQL stored procedures, functions, dynamic SQL queries, and joins.
• Must be familiar with developing components that fetch data from APIs.
• Strong knowledge of data warehousing concepts.
• Hands-on experience in ingesting data from a variety of data sources, data types, and file types.
• Good knowledge of the development lifecycle, best practices, and coding standards; should be able to help team members and review their code.
• Good knowledge of Azure DevOps and an understanding of build and release pipelines.
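The dynamic-SQL bullet above is worth illustrating, since it is where SQL injection risk lives: identifiers must come from a whitelist, never from raw input, while values use bound parameters. A minimal sqlite3 sketch (the orders/customers schema is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers(id INTEGER, name TEXT);
    CREATE TABLE orders(id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

ALLOWED_GROUP_COLUMNS = {"name", "customer_id"}

def totals_by(column):
    # dynamic SQL done safely: the column name is checked against a whitelist
    # before being interpolated; literal values would use bound parameters.
    if column not in ALLOWED_GROUP_COLUMNS:
        raise ValueError(f"unexpected column: {column!r}")
    sql = (f"SELECT c.{column}, SUM(o.amount) "
           f"FROM orders o JOIN customers c ON o.customer_id = c.id "
           f"GROUP BY c.{column} ORDER BY 2 DESC")
    return conn.execute(sql).fetchall()

print(totals_by("name"))  # → [('Asha', 350.0), ('Ravi', 75.0)]
```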
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Summary:
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. You will play a crucial role in our data ecosystem by working with cloud technologies to enable data accessibility, quality, and insights across the organization. This role requires expertise in Azure Databricks, Azure Data Factory, Snowflake, PySpark, SQL, any cloud (preferably Azure), and data modelling.
Requirements:
Experience Level: 3 to 5 Years
• Bachelor’s in Computer Science, Data Engineering, or related field.
• Proficiency in Azure Databricks for data processing and pipeline orchestration.
• Strong SQL skills and understanding of data modeling principles.
• Ability to troubleshoot and optimize data workflows.
Key Responsibilities:
• Data Pipeline Development: Design, build, and optimize data pipelines to ingest, transform, and load data from multiple sources using Azure Databricks; Snowflake and DBT are good to have.
• Data Architecture: Develop and manage data models within Snowflake, ensuring efficient data organization and accessibility.
• Data Transformation: Implement transformations in DBT, standardizing data for analysis and reporting.
• Performance Optimization: Monitor and optimize pipeline performance, troubleshooting and resolving issues as needed.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven projects and provide access to reliable, well-structured data.
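The pipeline responsibilities above share one shape: extract, transform (with failed records quarantined rather than silently dropped), then load. A plain-Python sketch of that shape; the source data, field names, and in-memory "warehouse" are stand-ins for real Databricks/Snowflake sources and targets:

```python
def extract():
    # stand-in for reading from an API, file, or Databricks source
    return [{"id": 1, "amt": "250.5"}, {"id": 2, "amt": "bad"}, {"id": 3, "amt": "75"}]

def transform(rows):
    # standardize types; quarantine records that fail validation
    clean, rejects = [], []
    for row in rows:
        try:
            clean.append({"id": row["id"], "amt": float(row["amt"])})
        except ValueError:
            rejects.append(row)
    return clean, rejects

def load(rows, target):
    # stand-in for writing to Snowflake or a Delta table
    target.extend(rows)

warehouse = []
good, bad = transform(extract())
load(good, warehouse)
print(len(warehouse), len(bad))  # → 2 1
```

Keeping the reject list visible is what makes pipeline issues debuggable, which is the "troubleshooting and resolving issues" half of the responsibility.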
Qualifications:
• Relevant experience in MS Azure, Snowflake, DBT, and Big Data Hadoop ecosystem components.
• Understanding of Hadoop architecture and the underlying framework, including storage management.
• Strong understanding of, and implementation experience with, Hadoop, Spark, and Hive/Databricks.
• Expertise in implementing data lake solutions using Scala as well as Python.
• Expertise with orchestration tools such as Azure Data Factory.
• Strong SQL and programming skills.
• Experience with Databricks is desirable.
• Understanding of, or implementation experience with, CI/CD tools such as Jenkins, Azure DevOps, and GitHub is desirable.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Custom Software Engineering Lead, a typical day involves overseeing the technical direction and architectural design of bespoke software solutions. This role requires guiding teams through the entire development lifecycle, from initial design concepts to final delivery. The position demands a focus on maintaining high standards for code quality, ensuring that applications are scalable and perform efficiently while aligning with the broader business goals. Collaboration and leadership are central, as the role involves coordinating efforts across multiple teams to achieve cohesive and effective software outcomes.
Roles & Responsibilities:
• Expected to be an SME; collaborate with and manage the team to perform.
• Responsible for team decisions.
• Engage with multiple teams and contribute to key decisions.
• Provide solutions to problems for their immediate team and across multiple teams.
• Lead the establishment and enforcement of development standards to ensure consistency and quality across projects.
• Mentor and support team members to foster professional growth and enhance technical capabilities.
• Drive continuous improvement initiatives to optimize development processes and delivery timelines.
Professional & Technical Skills:
• Must-have skills: proficiency in Spring Boot.
• Strong knowledge of microservices architecture and RESTful API design.
• Experience with cloud platforms and containerization technologies.
• Familiarity with database design and optimization techniques.
• Ability to implement scalable and maintainable software solutions.
• Competence in debugging, performance tuning, and code review practices.
Additional Information:
• The candidate should have a minimum of 5 years of experience in Spring Boot.
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.
Salary: Rs. 0.0 - Rs. 2,17,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Gen AI Job Description:
Experience Level: 3 to 5 Years
• Design, implement, and manage workflows for integrating and deploying GenAI applications from Azure, Amazon, or Snowflake. Analyse systems and applications, provide recommendations for design, enhancement, and development, and play an active part in their execution.
• Platform engineering: Collaborate with other teams to integrate AI solutions into existing workflows and systems to get the platform running and available. Configure and manage the underlying infrastructure that supports the platform, ensuring scalability, reliability, and high availability.
• Develop and implement best practices for managing the lifecycle of large language AI models, including version control, testing, and validation.
• Troubleshoot and resolve issues related to the performance and deployment of large language AI models.
• Stay up to date with the latest advancements in large language AI models and operations technologies to continuously improve our AI infrastructure.
• Develop and suggest best practices for designing infrastructure that supports fine-tuning of models to improve performance and efficiency, and troubleshoot any issues that arise during development or deployment.
• Creating and maintaining documentation: Ensure clear and comprehensive documentation of AI/ML/LLM systems.
• Security integration: GenAI platform engineers weave security best practices throughout the development lifecycle to safeguard the platform from vulnerabilities and data breaches.
• Monitoring and logging: Implement robust monitoring and logging systems and LLMOps best practices that allow for proactive identification and resolution of potential issues.
• Responsible AI guardrails: GenAI platform engineers are responsible for ensuring all Responsible AI metrics are governed through proper system infrastructure and monitoring.
• Data privacy and governance: Ensuring user data privacy and adhering to data governance regulations are paramount considerations for GenAI platform engineers.
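An input guardrail of the kind described above is, at its simplest, a filter that runs before a prompt ever reaches the model: block disallowed topics, redact sensitive values. The blocklist term and the email-redaction rule below are invented policy examples, not a production rule set:

```python
import re

BLOCKED_TERMS = {"credit card"}  # hypothetical policy blocklist
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def input_guardrail(prompt):
    """Return (allowed, sanitized prompt): block policy terms, redact emails."""
    if any(term in prompt.lower() for term in BLOCKED_TERMS):
        return False, ""
    return True, EMAIL.sub("[REDACTED_EMAIL]", prompt)

ok, safe = input_guardrail("Summarise feedback from ravi@example.com")
print(ok, safe)  # → True Summarise feedback from [REDACTED_EMAIL]
```

The same check-and-transform shape applies on the output side (e.g. scanning model responses before they reach the user), which is where Responsible AI metrics are typically enforced.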
Requirements:
• Bachelor’s or master’s degree in statistics, economics, operations research, data science, computer science, or a related field.
• 2 years of relevant experience in managing Gen AI applications, model monitoring, model validation, and implementing I/O guardrails and FinOps monitoring.
• Strong cross-cultural communication and negotiation skills, including the demonstrated ability to solicit opinions and accept feedback and the ability to effectively manage collaboration across time zones.
• Understanding of OpenAI, Llama, Claude, Arctic, and Mistral large language models, how to deploy them on cloud or on-premises, and how to use their APIs to build industry solutions.
• Experience with AI/ML frameworks and tools (e.g., LangChain, Semantic Kernel, TensorFlow, PyTorch).
• Experience in using LLM models on cloud, e.g., OpenAI on Azure, Amazon Bedrock, Snowflake Cortex AI.
• Familiarity with cloud platforms (e.g., AWS, Azure, Snowflake) and containerization technologies (e.g., Docker, Kubernetes).
• Advanced and secure coding experience in at least one language (Python, PySpark, TypeScript).
• Exposure to vector/graph/SQL databases, non-deterministic automated testing, and workflow platforms.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills and experience in operating effectively as part of cross-functional teams.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
We are seeking an experienced SAP Plant Maintenance (PM) / Enterprise Asset Management (EAM) Consultant to implement, configure, and optimize SAP PM/EAM solutions for asset management, equipment maintenance, and facility operations. The ideal candidate will have expertise in preventive and corrective maintenance, asset lifecycle management, work order processing, and integration with other SAP modules.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance