This vacancy has been archived

Company hidden
updated 1 month ago

Data Engineer (GCP, Python)

Work format
Hybrid
Employment type
Full-time
Seniority
Senior
English
B2
Country
India

Job description


TL;DR

Data Engineer (GCP, Python): Develop and maintain data pipelines and ETL/ELT processes for a data product company, with an emphasis on designing scalable, high-performance applications and real-time data streaming solutions. The role focuses on contributing to architecture discussions, implementing best practices for data processing, and ensuring system reliability and performance in production.

Location: Hybrid role based in Noida, Uttar Pradesh.

Company

hirify.global is a growing data product company focused on delivering digital solutions for Fortune 500 companies.

What you will do

  • Develop and maintain data pipelines and ETL/ELT processes using Python.
  • Design and implement scalable, high-performance applications.
  • Collaborate with cross-functional teams to define requirements and deliver solutions.
  • Develop and manage near real-time data streaming solutions (Pub/Sub, Beam).
  • Contribute to code reviews, architecture discussions, and continuous improvement.
  • Monitor and troubleshoot production systems for reliability and performance.
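The ETL/ELT responsibilities above follow a common extract-transform-load pattern. As a rough illustration only (a toy sketch in plain Python with SQLite standing in for the actual GCP stack, and hypothetical field names), one such step might look like:

```python
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, transform them, and load into SQLite."""
    # Extract: parse the raw CSV input.
    rows = csv.DictReader(io.StringIO(csv_text))

    # Transform: normalize names and cast amounts, dropping malformed records.
    cleaned = []
    for row in rows:
        try:
            cleaned.append((row["name"].strip().title(), float(row["amount"])))
        except (KeyError, ValueError):
            continue  # skip rows that fail validation

    # Load: write the cleaned records into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
loaded = run_etl("name,amount\nalice smith,10.5\nbob,not-a-number\n", conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
```

In production such a step would typically run inside an orchestrated pipeline (e.g. Airflow or Dataflow, both mentioned below) rather than as a standalone script.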

Requirements

  • 5+ years of professional software development experience with Python.
  • Strong understanding of software engineering best practices (testing, version control, CI/CD).
  • Experience building and optimizing ETL/ELT processes and data pipelines.
  • Proficiency with SQL and database concepts.
  • Experience with cloud providers (GCP preferred) and containerization (Docker, Kubernetes).
  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience).
  • Strong communication and problem-solving skills.

Nice to have

  • Experience with GCP services (Cloud Run, Dataflow) and stream processing (Pub/Sub).
  • Familiarity with big data technologies (Airflow) and data visualization tools.
  • Knowledge of CI/CD with GitLab and Infrastructure as Code with Terraform.
  • Familiarity with platforms like Snowflake, BigQuery, or Databricks.
  • GCP Data Engineer certification.

Culture & Benefits

  • Competitive salary and strong insurance package.
  • Focus on employee growth with extensive learning and development resources.