Company hidden
5 days ago

Python Data Engineer

Work format
hybrid
Employment type
full-time
Level
senior
English
B2
Country
India
Vacancy from Hirify.Global, a list of international tech companies

Job description


TL;DR

Python Data Engineer: design, develop, and maintain scalable data solutions for a growing data product company, with an emphasis on building ETL/ELT pipelines and managing data streaming solutions. Focus on optimizing performance, ensuring high data quality, and contributing to cloud-based infrastructure.

Location: Hybrid in Noida, Uttar Pradesh, India

Company

hirify.global is a growing data product company founded in early 2020, primarily working with Fortune 500 companies to deliver digital solutions and create value through innovation.

What you will do

  • Develop and maintain data pipelines and ETL/ELT processes using Python.
  • Design and implement scalable, high-performance applications.
  • Collaborate with cross-functional teams to define requirements and deliver solutions.
  • Develop and manage near real-time data streaming solutions using Pub/Sub or Beam.
  • Contribute to code reviews, architecture discussions, and continuous improvement initiatives.
  • Monitor and troubleshoot production systems to ensure reliability and performance.
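The pipeline work above centers on ETL/ELT in Python with pandas. A minimal, illustrative sketch of the extract-transform-load pattern (all field names and data here are hypothetical stand-ins, not taken from the vacancy):

```python
# Illustrative ETL sketch: extract rows, transform with pandas, "load" to an
# in-memory target. Real pipelines would read from and write to external
# systems (databases, Pub/Sub, BigQuery, etc.).
import pandas as pd

def extract() -> pd.DataFrame:
    # Stand-in for reading from a source system (e.g. an API or a database).
    return pd.DataFrame({
        "user_id": [1, 2, 2, 3],
        "amount": [10.0, 5.5, None, 7.25],
    })

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Enforce data quality (drop incomplete rows), then aggregate per user.
    clean = df.dropna(subset=["amount"])
    return clean.groupby("user_id", as_index=False)["amount"].sum()

def load(df: pd.DataFrame) -> list[dict]:
    # Stand-in for writing to a warehouse table.
    return df.to_dict("records")

records = load(transform(extract()))
```

Keeping extract, transform, and load as separate pure functions also makes each stage straightforward to unit-test, which matches the testing expectations listed below.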

Requirements

  • 5+ years of professional software development experience with Python and SQL.
  • Strong understanding of software engineering best practices, including testing, version control, and CI/CD.
  • Experience building and optimizing ETL/ELT processes and data pipelines.
  • Proficiency with SQL and database concepts.
  • Experience with data processing frameworks like Pandas.
  • Understanding of software design patterns and architectural principles.
  • Experience with unit testing and test automation.
  • Experience working with any cloud provider (GCP is preferred).
  • Experience with CI/CD pipelines and infrastructure as code, including Docker or Kubernetes.

Nice to have

  • Experience with GCP services, particularly Cloud Run and Dataflow.
  • Experience with stream processing technologies such as Pub/Sub.
  • Familiarity with workflow orchestration technologies like Airflow.
  • Experience with data visualization tools and libraries.
  • Knowledge of CI/CD pipelines with Gitlab and infrastructure as code with Terraform.
  • Familiarity with platforms like Snowflake, BigQuery or Databricks.

Culture & Benefits

  • Competitive salary and a strong insurance package.
  • Commitment to employee growth, offering extensive learning and development resources.

Be careful: if an employer asks you to sign in to their system via iCloud/Google, to send a code/password, or to run code/software, do not do it - these are scammers. Be sure to click "Report" or contact support.

The vacancy text is reproduced without changes.
