We found 55 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Request ID: ADBLVLSISTABSKR072024251745
Role/Job Title: STA Engineer
Duration of the project: Mid-Aug-2024 – Mid-Feb-2025 (6 months to begin with)
Desired start date: 19th August 2024
Work set-up & location / expectation of working mode: Onsite (all 5 days a week) at ADI Bengaluru (ADBL), AvedaMeta office
Experience: 5+ years
No. of positions: 1
JD/Expectations — the requirements can be summarised as follows:
  • Hands-on experience with static timing analysis (STA) and timing-constraint generation per designer input.
  • Experience with the Tempus tool is preferred.
  • Working knowledge of TCL/Python is a plus.
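
Since the role pairs constraint generation with TCL/Python scripting, here is a hedged sketch of the kind of helper such work involves: a small Python function that emits SDC `create_clock` lines from designer input. The clock names, periods, and port names are illustrative, not from the posting.

```python
# Hypothetical helper: generate SDC clock constraints from designer input.
# Clock names/periods/ports below are made-up examples.
def make_clock_constraints(clocks):
    """Emit SDC create_clock lines from a {name: (period_ns, port)} mapping."""
    lines = []
    for name, (period, port) in sorted(clocks.items()):
        lines.append(
            f"create_clock -name {name} -period {period:.3f} [get_ports {port}]"
        )
    return "\n".join(lines)

sdc = make_clock_constraints({"clk_core": (1.25, "clk_in"), "clk_io": (4.0, "io_clk")})
print(sdc)
# create_clock -name clk_core -period 1.250 [get_ports clk_in]
# create_clock -name clk_io -period 4.000 [get_ports io_clk]
```

The generated SDC text would then be sourced by the STA tool alongside the design netlist.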

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : STA Engineer

Job Description

Must have:
1. Python
2. Django or FastAPI (REST web framework)
3. Ability to write unit/integration tests
4. Database programming experience (Postgres, SQL, ability to write complex queries)
5. API design & creation (e.g. OpenAPI)
Nice to have:
1. SQLAlchemy (ORM / database toolkit)
2. MyPy (typing)
3. Pytest (testing framework)
4. Poetry (package manager)
5. Alembic (database migrations)
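
The "complex queries" requirement can be sketched with the stdlib `sqlite3` module (the posting names Postgres, but the SQL below is portable). Table and column names are illustrative only.

```python
# Minimal sketch of join + aggregate SQL from Python. Schema is made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
    INSERT INTO users VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 40.0);
""")

# Per-user order counts and totals, highest spender first.
rows = conn.execute("""
    SELECT u.name, COUNT(o.id) AS n_orders, SUM(o.total) AS spend
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.id ORDER BY spend DESC
""").fetchall()
print(rows)  # [('Asha', 2, 200.0), ('Ravi', 1, 40.0)]
```

In a real Postgres setup the same query would run through a driver such as psycopg, with the connection string swapped in.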

  • Salary : Rs. 6,00,000.0 - Rs. 14,00,000.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Python Developer

Job Description

Role 2:
1. 5+ years of hands-on development in Python programming.
2. Strong hands-on experience using FastAPI, uvicorn, pytest/PyUnit, asyncio, RESTful APIs, NumPy, pandas.
3. 4+ years of experience with SQL and any of these databases – Snowflake, PostgreSQL, or DynamoDB.
4. 1+ years in AWS; strong hands-on experience with AWS services – Lambda, EKS, SQS, SNS, S3, Batch, Docker, ElastiCache.
5. Experience or exposure to XML will be a plus.
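
The asyncio skill listed above can be sketched with a stdlib-only example: launch several coroutines concurrently and gather their results. The `fetch_one` coroutine is illustrative; a real service would await an HTTP client call where the sleep is.

```python
# Sketch of concurrent I/O with asyncio; the "fetch" is simulated.
import asyncio

async def fetch_one(name: str, delay: float) -> str:
    await asyncio.sleep(delay)           # stand-in for network I/O
    return f"{name}:ok"

async def fetch_all(names):
    tasks = [fetch_one(n, 0.01) for n in names]
    return await asyncio.gather(*tasks)  # run all tasks concurrently

results = asyncio.run(fetch_all(["users", "orders", "items"]))
print(results)  # ['users:ok', 'orders:ok', 'items:ok']
```

Because the tasks overlap, total wall time stays near the slowest single delay rather than the sum.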

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Python Developer

Job Description

Experience: 3-6 years
Mandatory:
  • Deep hands-on expertise in Databricks, Python, and SQL queries.
  • Experienced in the design and implementation of Big Data technologies (Apache Spark, the Hadoop ecosystem, Apache Kafka, NoSQL databases), working as a Big Data Engineer: query tuning, performance tuning, and data architecture patterns (data lakehouse, Delta Lake, streaming, Lambda/Kappa architecture).
  • A full range of data-engineering approaches, covering theoretical best practices and the technical application of these methods.
  • Building and deploying a range of data-engineering pipelines into production, including automation best practices for CI/CD.
  • Troubleshooting and debugging Spark and other big data solutions.
  • Writing SQL queries; databases and analytics technologies (data warehousing/ETL/ELT, relational databases, or MPP).
  • Familiarity with the Azure cloud and creating Azure ADF pipelines; data analytics and complex event processing.
  • Good communication skills (written and verbal).
Preferred: Certification as a Databricks Associate Developer (Python).

  • Salary : Rs. 25,00,000.0 - Rs. 30,00,000.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Big Data Engineer

Job Description

Python application developers with an understanding of microservice architecture. The role is to build a library of reusable, application-centric functions (e.g. logging, secrets, UI components) for the business to use. Knowledge and experience of building business applications is required. Ideal candidates will understand Azure cloud, the required services, and layered/microservice application architecture. The developers will build reusable application components and libraries: logging, front-end UI, APIs and API management, endpoint management, and secrets management with identity access.
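
One reusable "app library" piece this posting describes can be sketched with the stdlib `logging` module: a shared helper that every service calls to get a consistently configured logger. The format string and logger name are illustrative choices, not from the posting.

```python
# Hedged sketch of a shared logging helper for an app-component library.
import logging

def get_app_logger(name: str, level: int = logging.INFO) -> logging.Logger:
    """Return a configured logger; safe to call repeatedly from any module."""
    logger = logging.getLogger(name)
    if not logger.handlers:                # avoid attaching duplicate handlers
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter(
            "%(asctime)s %(name)s %(levelname)s %(message)s"))
        logger.addHandler(handler)
    logger.setLevel(level)
    return logger

log = get_app_logger("orders-service")    # illustrative service name
log.info("service started")
```

Centralizing configuration like this is what makes the library "business-ready": callers get uniform log format and levels without repeating setup code.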

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Python Developer (MRSI)

Job Description

  • Apply NLP techniques using related libraries such as scikit-learn, NLTK, and spaCy; use concurrent-processing libraries in Python and write performant code.
  • Utilize advanced web-scraping techniques and Python libraries (e.g. BeautifulSoup, Scrapy, Selenium, pandas, NumPy) to extract structured and unstructured data from websites, APIs, and other online sources.
  • Perform data profiling, cleansing, and transformation using Python to prepare data for analysis and reporting.
  • Design and implement data-quality frameworks and processes using Python to ensure the accuracy, consistency, and reliability of the data.
  • Optimize data storage and retrieval processes using Python to improve performance and scalability.
  • Develop and maintain documentation, including data dictionaries, process flows, and data lineage, to facilitate knowledge sharing and maintain data lineage.
  • Stay up to date with emerging trends, technologies, and best practices in data engineering, Python programming, and web scraping.
Requirements:
  • Strong programming skills in Python, along with proficiency in SQL.
  • Extensive experience with big data processing frameworks (e.g. Databricks) using Python.
  • Proven expertise in web-scraping frameworks and Python libraries (e.g. Beautiful Soup, Scrapy, Selenium, pandas) and experience with API integration using Python.
  • Extensive expertise in data parsing using regular expressions, CSS selectors, and XPath selectors.
  • Proficiency in data manipulation and analysis using Python libraries such as pandas, NumPy, and SQL.
  • Solid understanding of data storage formats (Markdown, CSV, JSON, Parquet, Feather) using Python.
  • Strong analytical and problem-solving skills, with the ability to handle complex data-engineering and scraping challenges using Python.
  • Excellent communication skills and reliable task-execution discipline.
Nice to have:
  • Experience in NLP feature engineering and using Hugging Face libraries for applied NLP is a huge plus.
  • Experience with multi-node CPU parallel architecture is a huge plus.
  • Experience in P&C insurance is a plus.
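
The data-parsing requirement (regular expressions plus markup parsing) can be sketched with the stdlib alone: extract item names and prices from an HTML fragment using `html.parser` and a regex. The markup is made up for illustration; real scraping would use BeautifulSoup or Scrapy as the posting notes, and must respect each site's terms of use.

```python
# Stdlib-only sketch: parse <li class="item"> entries and extract
# (name, price) pairs with a regular expression. HTML is illustrative.
import re
from html.parser import HTMLParser

HTML = '<ul><li class="item">Widget $9.99</li><li class="item">Gadget $14.50</li></ul>'
PRICE = re.compile(r"(.+?)\s*\$(\d+\.\d{2})")

class ItemParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_item = False
        self.items = []

    def handle_starttag(self, tag, attrs):
        if tag == "li" and ("class", "item") in attrs:
            self.in_item = True

    def handle_data(self, data):
        if self.in_item:
            m = PRICE.match(data.strip())
            if m:
                self.items.append((m.group(1), float(m.group(2))))
            self.in_item = False

parser = ItemParser()
parser.feed(HTML)
print(parser.items)  # [('Widget', 9.99), ('Gadget', 14.5)]
```

The same (name, price) records could then feed the profiling and cleansing steps described above.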

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Python Developer with NLP

Job Description

Here’s the JD for your reference:
  • Python, GCP Vertex AI (Training, Pipelines, Endpoints, Experiments, Feature Store), Google BigQuery, Google Kubernetes Engine, open-source ML frameworks
  • Skilled in MLOps practices (CI/CD, model monitoring/versioning), containerization (Docker, Kubernetes), cloud-native ML services
  • Optimization with solvers
  • Experience in scaling and performance tuning for production readiness
Location: Any USI Location
Rate: 1800 Max

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : ML Engineer

Job Description

JD: Need a proficient Snowflake and Python developer who can build data pipelines by writing Python code. Must be proficient in Python (NumPy, pandas, DataFrames, etc.) and also proficient in SQL.
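
The pandas/DataFrame skill asked for here can be sketched as a toy pipeline step that cleans and aggregates rows. Column names are illustrative, and this assumes pandas is installed; the Snowflake side would normally load and write data via its Python connector, which is omitted.

```python
# Illustrative pipeline step: fill missing values, then aggregate by key.
import pandas as pd

raw = pd.DataFrame({
    "region": ["east", "east", "west", "west"],
    "amount": [10.0, None, 5.0, 15.0],
})
clean = raw.fillna({"amount": 0.0})                        # simple cleansing rule
summary = clean.groupby("region", as_index=False)["amount"].sum()
# east -> 10.0, west -> 20.0
print(summary)
```

In a full pipeline this transform would sit between a Snowflake read (e.g. via the connector's fetch into a DataFrame) and a write back to a target table.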

  • Salary : As per industry standard.
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Snowflake and Python Developer

Job Description

Software development and continuous delivery are not only buzzwords for you?! Your specialties are Python, SQL, Postgres, Grafana, ELK, Docker, and Jenkins. Come share your skills and be creative within our feature team, in charge of the inventories of the Cloud service on our Hybrid Cloud Platform, as a Build and Run Developer / DevOps engineer.
Responsibilities
  • Help design, develop, and maintain robust applications in an as-a-service model on the Cloud platform
  • Evaluate, implement, and standardize new tools/solutions to continuously improve the Cloud Platform
  • Leverage expertise to drive the organization's and department's technical vision in development teams
  • Liaise with global and local stakeholders and influence technical roadmaps
  • Passionately contribute towards hosting a thriving developer community
  • Encourage contribution towards inner and open sourcing, leading by example
Profile
  - Experience and exposure to good programming practices, including coding and testing standards
  - Passion for and experience in proactively investigating, evaluating, and implementing new technical solutions with continuous improvement
  - Good development culture and familiarity with industry-wide best practices
  - Production mindset with a keen focus on reliability and quality
  - Passionate about being part of a distributed, self-sufficient feature team with regular deliverables
  - Proactive learner with skills in Scrum, data, and automation
  - Strong technical ability to monitor, investigate, analyze, and fix production issues
  - Ability to ideate and collaborate through inner and open sourcing
  - Ability to interact with client managers, developers, testers, and cross-functional teams such as architects
  - Experience working in an Agile team and exposure to Agile/SAFe development methodologies
  - Minimum 5+ years of experience in software development and architecture
  - Good experience in design and development, including object-oriented programming in Python, cloud-native application development, APIs, and microservices
  - Good experience with relational databases like PostgreSQL and the ability to build robust SQL queries
  - Knowledge of Grafana for data visualization and the ability to build dashboards from various data sources
  - Experience with big data technologies like Elasticsearch and Fluentd
  - Experience hosting applications using containerization (Docker, Kubernetes)
  - Good understanding of CI/CD and DevOps; proficient with tools like Git, Jenkins, and Sonar
  - Good system skills with Linux OS and bash scripting
  - Understanding of the Cloud and cloud services

  • Salary : Rs. 12,00,000.0 - Rs. 14,00,000.0
  • Industry : IT-Software / Software Services
  • Functional Area : IT Software - Application Programming, Maintenance
  • Role Category : Programming & Design
  • Role : Specialist Software Engineer - Python + Cloud (Insights & Observability)