Company hidden
1 day ago

DataOps Engineer

Work format
remote (Global)
Employment type
full-time
Grade
senior
English
B2
Country
Israel
Vacancy from Hirify.Global, a list of international tech companies

Job description


TL;DR

DataOps Engineer: designing, deploying, and operating Kubernetes-based infrastructure for a large-scale data processing platform, with an emphasis on reliability, performance, and cost-efficiency. Focus on tuning Spark-on-K8s, managing Airflow infrastructure, and building robust CI/CD and observability tooling.

Location: Global remote workforce. Company based in Ramat Gan, Israel.

Company

hirify.global is a location analytics platform providing unprecedented visibility into locations, markets, and consumer behavior. The company has reached unicorn status and continues to grow rapidly.

What you will do

  • Design, deploy, and operate Kubernetes-based infrastructure for Apache Spark and large-scale data processing workloads.
  • Ensure reliability, performance, and cost-efficiency of the data platform, including SLAs, autoscaling, and resource quotas.
  • Manage Spark-on-K8s configurations, Airflow infrastructure, and Databricks integration, tuning for throughput and latency (a configuration sketch follows this list).
  • Build and maintain CI/CD pipelines and infrastructure-as-code for data platform components.
  • Develop observability tooling, including metrics, logging, and alerting, for proactive issue surfacing.
  • Collaborate with Data Engineers to translate workload patterns into infrastructure decisions.
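
As an illustration of the Spark-on-K8s tuning mentioned above, here is a minimal sketch of a PySpark session configured for a Kubernetes cluster. The master URL, container image, namespace, and resource values are hypothetical placeholders, not details taken from the posting:

```python
from pyspark.sql import SparkSession

# Minimal sketch: running Spark against a Kubernetes cluster.
# All names (API server URL, image, namespace) are illustrative examples.
spark = (
    SparkSession.builder
    .master("k8s://https://kubernetes.default.svc:443")  # placeholder API server URL
    .appName("example-etl")
    # Executor pods: image and namespace are assumed values.
    .config("spark.kubernetes.container.image", "registry.example.com/spark:3.5.0")
    .config("spark.kubernetes.namespace", "data-platform")
    # Cost-efficiency knobs: dynamic allocation with shuffle tracking,
    # since Kubernetes has no external shuffle service by default.
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
    .config("spark.dynamicAllocation.maxExecutors", "20")
    # Per-executor resource requests (tuned per workload in practice).
    .config("spark.executor.memory", "4g")
    .config("spark.kubernetes.executor.request.cores", "2")
    .getOrCreate()
)
```

In practice such settings are often managed declaratively, for example via the Spark Operator's SparkApplication manifests, rather than hard-coded in the job itself.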

Requirements

  • 5+ years of experience in a production infrastructure, SRE, or DevOps role.
  • 2+ years of hands-on experience running Apache Spark, Flink, or similar data processing workloads in production.
  • Strong Kubernetes experience, including Spark-on-K8s, autoscaling, and resource management.
  • 2+ years with infrastructure-as-code tools (Terraform, Pulumi, or similar).
  • Proficiency in at least one general-purpose language, preferably Python or Go.
  • Experience with workflow orchestration tools, particularly Apache Airflow.
  • Solid understanding of cloud infrastructure, preferably GCP (GCS, GKE, IAM).
  • Strong observability skills: metrics pipelines, structured logging, and alerting frameworks (a minimal metrics example follows this list).
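
To make the observability requirement concrete, here is a small sketch of exposing pipeline metrics with the Prometheus Python client. The metric names, labels, and port are assumptions for the example, not anything specified by the posting:

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metric names; a real deployment would follow
# the team's own naming and labeling conventions.
ROWS_PROCESSED = Counter(
    "pipeline_rows_processed_total", "Rows processed by the pipeline", ["job"]
)
BATCH_SECONDS = Histogram(
    "pipeline_batch_duration_seconds", "Batch processing time", ["job"]
)

def process_batch(job: str) -> None:
    # Time each batch and count its rows so alerts can be built
    # on throughput drops or latency spikes.
    with BATCH_SECONDS.labels(job=job).time():
        time.sleep(random.uniform(0.1, 0.5))  # stand-in for real work
        ROWS_PROCESSED.labels(job=job).inc(1000)

if __name__ == "__main__":
    start_http_server(8000)  # scrape endpoint at :8000/metrics (port is an example)
    while True:
        process_batch("example-etl")
```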

Nice to have

  • Familiarity with Delta Lake, Parquet, and columnar storage formats.
  • Experience with data quality frameworks and pipeline lineage tooling.
  • Knowledge of query optimization, partition strategies, and Spark performance tuning (see the partitioning sketch after this list).
  • Experience managing queues and databases (Kafka, PostgreSQL, Redis, or similar).
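
As a brief illustration of the partition-strategy point above, a PySpark sketch of writing date-partitioned Parquet. The column names, paths, and partition count are assumptions for the example:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("partitioning-example").getOrCreate()

df = spark.read.parquet("gs://example-bucket/raw/events")  # hypothetical path

# Shuffle partition count (Spark defaults to 200) usually needs
# tuning per workload to avoid small-file overhead.
spark.conf.set("spark.sql.shuffle.partitions", "128")

(
    df.repartition("event_date")        # cluster rows by the partition column
      .write.mode("overwrite")
      .partitionBy("event_date")        # one directory per date enables partition pruning
      .parquet("gs://example-bucket/curated/events")
)
```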

Culture & Benefits

  • Join a rapidly growing company pioneering a new market.
  • Take a central and critical role at hirify.global.
  • Work with and learn from top-notch talent.
  • Competitive salary and excellent benefits.
  • Committed to maintaining a drug-free workplace and promoting a safe, healthy environment.
  • Equal opportunity employer with a global remote workforce.
