Experience Range in Required Skills: 6-8 years
// Overall experience of 5+ years will be considered
Shift: 2-11 PM, with flexibility to work from home for 4 hours
Location: PAN India
// Data Science profile required, not Data Engineer
Primary skill: strong AI/ML - 3+ years
Secondary skill: PySpark - 2+ years
// Interviews for this role are being conducted on a rolling basis
Interview Feedback / Questions / Pointers
> Should have hands-on experience with real-time data and be able to discuss academic projects in depth.
> Data Science / ML: should be able to explain concepts well.
> Programming languages: Python and PySpark
> Working knowledge of predictive / ML-based models
> Working experience with cloud platforms
---- Connect with me for a sample resume
Job Description: Must have: the candidate must have expertise/experience in the tasks below
- Experience with Linux, Git, CI/CD, release management, production deployment, and support is a must.
- Strong knowledge of Apache Spark is a must.
- Strong knowledge of PySpark is a must.
- Strong knowledge of SQL is a must.
- Good knowledge of Data Science workloads.
- Good knowledge of Kubernetes/Docker.
- Good knowledge of Python is a must.
- Good knowledge of the Java language.
- Good to have: conceptual knowledge of Apache Airflow, Apache Atlas, Apache Ranger, Postgres, Trino, Superset
Essential Skills (additional, for reference): Python, PySpark, SQL, Airflow, Trino, Hive, Snowflake, Agile Scrum, Linux, OpenShift, Kubernetes, Superset.
- 5+ years of experience in data engineering, ELT development, and data modeling.
- Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management.
- Experience implementing workflow orchestration using tools like Apache Airflow, SSIS, or similar platforms.
- Demonstrated experience in developing custom connectors for data ingestion from various sources.
- Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
- Experience implementing DataOps principles and practices, including data CI/CD pipelines.
- Excellent problem-solving and troubleshooting skills, with a strong attention to detail.
- Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams.
- Familiarity with data visualization tools (Apache Superset) and dashboard development.
- Understanding of distributed systems and working with large-scale datasets.
- Familiarity with data governance frameworks and practices.
- Knowledge of data streaming and real-time data processing technologies (e.g., Apache Kafka).
- Strong understanding of software development principles and practices, including version control (e.g., Git) and code review processes.
- Experience with Agile development methodologies and working in cross-functional Agile teams.
- Ability to adapt quickly to changing priorities and work effectively in a fast-paced environment.
- Excellent analytical and problem-solving skills, with a keen attention to detail.
- Strong written and verbal communication skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
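The SQL expectation above ("write efficient queries and optimize performance") can be probed with something as small as an aggregate plus an index. A self-contained sketch using Python's built-in sqlite3 module (table and data are hypothetical):

```python
import sqlite3

# In-memory database with a hypothetical orders table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)],
)

# Aggregate spend per customer, largest first
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total FROM orders "
    "GROUP BY customer ORDER BY total DESC"
).fetchall()

# An index on the grouping column can let SQLite avoid a separate sort
# pass for the GROUP BY (inspect with EXPLAIN QUERY PLAN)
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
```

Being able to explain why the index helps the GROUP BY, not just that it does, is the kind of answer these screens look for.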
Comments for Suppliers:
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Engineer
Work Location: Chennai / Hyderabad / Bangalore / Kolkata // PAN India
Skill Required: Digital: Apache Spark, Python for Data Science, PySpark, MySQL
Role is for a Data Engineer with Data Science experience.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As a Technology Support Engineer, you will resolve incidents and problems across various business system components, ensuring operational stability throughout the day. Your responsibilities will include creating and implementing Requests for Change, updating knowledge base articles, and collaborating with vendors to assist service management teams in issue analysis and resolution. Each day will present new challenges that require a proactive approach to problem-solving and effective communication with team members and stakeholders.
Roles & Responsibilities:
- Expected to build knowledge and support the team.
- Participate in problem-solving discussions.
- Assist in the development and maintenance of documentation related to system configurations and procedures.
- Provide timely updates to stakeholders regarding the status of incidents and requests.
Professional & Technical Skills:
- Must-have skills: proficiency in Microsoft Windows Desktop Management.
- Good-to-have skills: experience with remote desktop support tools.
- Familiarity with troubleshooting hardware and software issues.
- Understanding of network configurations and connectivity issues.
- Ability to work collaboratively in a team environment.
Additional Information:
- The candidate should have a minimum of 0-2 years of experience in Microsoft Windows Desktop Management.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.
Salary: Rs. 0 - Rs. 30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Requisition ID: 10249494
Role: Microsoft Dynamics Solution Architect/Support Lead
Work Location: Kolkata (WB)
Skill Required: Microsoft Dynamics 365, Power Platform, C#, JavaScript, SQL
Experience Range in Required Skills: 10+ Years
Note: If you are not able to find a candidate in the preferred location, kindly consider PAN-India candidates.
Job Description:
• Expertise in Microsoft Dynamics 365 suite.
• Hands-on experience with Power Platform, Dataverse, C#, JavaScript, SQL.
• Design and develop solutions in Dynamics 365 and other Microsoft Dynamics 365 products.
• Provide technical expertise and guidance to the team, troubleshoot complex issues, and stay updated on the latest Dynamics 365 features and updates.
• Lead development efforts for custom functionalities within Dynamics 365 using plugins, custom workflows, and other development tools where necessary.
• Engage with business stakeholders and manage technical delivery teams.
• Proficient in D365 SDK, Plugins, Custom Workflow Activities, and integrations with other Microsoft services or 3rd party applications & APIs.
• Strong knowledge of Power Platform (Power Apps, Power Automate)
• Discuss any incidents or issues with business users
• Provide resolutions or workarounds for incidents
• Participate in business requirements discussions as part of enhancement scopes
• Fix defects, test, and promote code across environments
• Coordinate with other support groups on any integration-related issues/incidents
Salary: Rs. 90,000 - Rs. 1,40,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Microsoft Dynamics Solution Architect/Support Lead
Job Title: Developer
Location: Bangalore / Kochi / Bhubaneswar / Chennai / Kolkata / Gurgaon / Hyderabad / Pune / Noida
Skill Required: Digital: Databricks, Databricks Admin
Experience Range: 6-8 years
Job Description:
Workspace & Infrastructure Management
- Workspace and Cluster Configuration: Create and manage Databricks workspaces, setting up and optimizing clusters for different workloads and user needs.
- Platform Health & Monitoring: Continuously monitor platform health, resource utilization, and performance to identify and resolve issues.
- Maintenance & Upgrades: Perform routine maintenance, software upgrades, and patch management to ensure the platform is current and secure.
Security & Governance
- Identity & Access Management: Manage user access, roles, and permissions to control who can access what within the Databricks environment.
- Security Controls: Implement and maintain security measures, data encryption, and access policies to protect data and the platform.
- Data Governance: Manage tools like Unity Catalog to enforce policies on data access, classification, and retention.
Performance & Optimization
- Performance Tuning: Analyze and troubleshoot performance bottlenecks in workloads and optimize configurations for better efficiency.
- Cost Optimization: Monitor resource utilization and implement strategies to manage and control costs within the platform.
Collaboration & Support
- Technical Support: Provide technical support and collaborate with data analysts, scientists, and engineers to meet their requirements and support their workflows.
- Cross-functional Collaboration: Work with infrastructure, engineering, and DevOps teams to integrate Databricks into the broader IT ecosystem and CI/CD pipelines.
Automation & Integration
- Automation: Develop and implement scripts and automation tools to streamline deployment, configuration, and monitoring processes.
- Cloud Integration: Integrate the Databricks platform with other cloud services and identity providers for enhanced functionality and security.
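As a sketch of the cost-optimization duty above: flag clusters whose average utilization falls below a threshold. The cluster names and metric values here are hypothetical; in practice they would come from Databricks system tables or cloud-provider monitoring:

```python
from statistics import mean

# Hypothetical per-cluster CPU-utilization samples (percent)
samples = {
    "etl-prod": [78.0, 82.0, 91.0, 85.0],
    "adhoc-dev": [5.0, 9.0, 3.0, 7.0],
}

UNDERUSED_THRESHOLD = 20.0  # percent; tune per workload

def underused_clusters(metrics: dict[str, list[float]]) -> list[str]:
    """Names of clusters whose mean utilization is below the threshold."""
    return [name for name, vals in metrics.items()
            if mean(vals) < UNDERUSED_THRESHOLD]

flagged = underused_clusters(samples)  # candidates for downsizing or auto-termination
```

The same loop structure generalizes to memory, DBU spend, or idle-time metrics; the admin-specific work is mostly in sourcing the metrics reliably.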
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Developer
Work Location: Chennai/Bengaluru/Hyderabad/Pune/Kolkata/Delhi
Project Duration: 6 months, extendable
Skill Required: Data Architecture and Modeling
Experience Range in Required Skills: 6-8 years // strictly no less than 6 years
Job Description: Role: Big Data Architect
Required Skills: Data Architecture, Hadoop, Spark
Must-Have:
> Experience with Hadoop Big Data ecosystem tools & technologies (Hive, Spark/PySpark)
> Experience with various data architecture patterns and techniques
> Experience with E2E architecture layers: data ingestion, data storage, data visualization, governance, analytics
> Experience with multiple Data & Analytics technologies
Good-to-Have:
> Experience with cloud data platforms like Azure, AWS
> Good DWH / Data Lake knowledge preferred
> BFSI domain knowledge
Responsibilities / Expectations from the Role:
> Perform data inventory and mapping
> Define and develop the data architecture strategy
> Design the data architecture blueprint
> Define the data governance framework
> Perform data gap analysis
> Identify reusable assets
Essential Skills: Data Architect
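The data gap analysis expected of this role reduces, at its core, to set arithmetic between the fields the target architecture requires and the fields source systems provide. A hypothetical sketch (field and system names are illustrative):

```python
# Fields the target architecture requires vs. what source systems provide
required = {"customer_id", "txn_date", "amount", "channel", "risk_score"}

source_systems = {
    "core_banking": {"customer_id", "txn_date", "amount"},
    "crm": {"customer_id", "channel"},
}

available = set().union(*source_systems.values())

# Gap: required fields no source system supplies
gaps = sorted(required - available)

# Coverage map: which systems supply each available required field
coverage = {
    field: [name for name, cols in source_systems.items() if field in cols]
    for field in sorted(required & available)
}
```

The coverage map doubles as the start of a data inventory and mapping deliverable: fields served by multiple systems flag reconciliation decisions, and the gap list drives new ingestion work.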
Salary: Rs. 90,000 - Rs. 1,65,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
• Good knowledge of Excel, VBA, macros, PowerPoint, and Learning Management Systems.
• Ability to take ownership and handle varied, confidential HR/L&D data.
• Ability to prioritize work and deliver high quality on time.
• Flexible and willing to go the extra mile to meet the requirements of the work.
• Should have an analytical mindset.
• Should be able to suggest opportunities for automation and scheduled reporting rather than manual work.
• Clarity of thought.
• Good communication and interpersonal skills.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Developer
Work Location: Kolkata / New Delhi / Hyderabad / Mumbai / Chennai / Bangalore
Skill Required: Multifactor Authentication (MFA)
Experience Range: 6-8
Job Description:
- Set up, configure, and maintain Mosaic components (from Transmit Security) for authentication, identity verification, and storing user credentials.
- Integrate the Mosaic identity platform with our web and mobile applications.
- Manage and support identity workflows such as passwordless login, multi-factor authentication (MFA), and single sign-on (SSO).
- Manage and support role-based access to our applications using the Mosaic platform.
- Work with development teams to ensure security is built into applications from the start.
- Troubleshoot and resolve authentication issues in production.
- Monitor system performance and security logs for suspicious activity.
- Stay updated on new security features and best practices in identity and access management.

- Hands-on experience with Transmit Security and/or the Mosaic platform.
- Strong understanding of authentication methods (MFA, passwordless, SSO, OAuth, SAML, OpenID Connect).
- Experience with HTTPS integrations between custom applications and the Mosaic platform.
- Familiarity with integrating identity and access management services into web/mobile applications.
- Knowledge of cybersecurity fundamentals.
- Strong troubleshooting and problem-solving skills.
- Good communication and documentation abilities.
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance