Company hidden
Posted 2 days ago

Systems Engineer (Data)

Work format
hybrid
Employment type
full-time
Level
middle
English
B2
Country
US
Vacancy from Hirify Global, a list of international tech companies

Job description

TL;DR

Systems Engineer (Data): building and optimizing centralized data platforms and infrastructure for secure, democratized access to data across the company, with an emphasis on technical architecture, data-pipeline development, and tooling automation. The focus is on scaling data systems, managing privacy and security, and enabling data-driven solutions with technologies such as Kubernetes, Trino, and Python/Go.

Location: This is a hybrid role, requiring regular presence at the Austin, US office.

Company

hirify.global runs one of the world’s largest networks, powering millions of websites and Internet properties for a diverse range of customers.

What you will do

  • Contribute to the design and execution of technical architecture for highly visible data infrastructure.
  • Design and develop tools and infrastructure to improve and scale hirify.global's data systems.
  • Build and maintain data pipelines and data products, including tools for automated delivery.
  • Gain deep knowledge of data platforms to guide and enable stakeholders with their data needs.
  • Work across a tech stack including Kubernetes, Trino, Iceberg, ClickHouse, and PostgreSQL, with software built using Go, JavaScript/TypeScript, and Python.
  • Collaborate with peers to reinforce a culture of exceptional delivery and accountability.
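As a flavor of the pipeline and data-product work listed above, here is a minimal, hypothetical sketch in Python (one of the stack's languages) of a validate-transform-aggregate step. All names (`Event`, `bytes_served`, the row schema) are illustrative assumptions, not the company's actual tooling.

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Event:
    user_id: str
    bytes_served: int

def transform(raw_rows: Iterable[dict]) -> list[Event]:
    """Validate and normalize raw rows, dropping malformed records."""
    events = []
    for row in raw_rows:
        try:
            events.append(Event(user_id=str(row["user_id"]),
                                bytes_served=int(row["bytes"])))
        except (KeyError, ValueError):
            continue  # skip malformed input rather than failing the whole batch
    return events

def aggregate(events: list[Event]) -> dict[str, int]:
    """Roll up bytes served per user: a typical data-product shape."""
    totals: dict[str, int] = {}
    for e in events:
        totals[e.user_id] = totals.get(e.user_id, 0) + e.bytes_served
    return totals

raw = [{"user_id": "a", "bytes": "100"},
       {"user_id": "a", "bytes": "50"},
       {"bad": "row"}]
print(aggregate(transform(raw)))  # → {'a': 150}
```

In a real deployment the input would come from Kafka or object storage and the output would land in a warehouse table; the skip-on-malformed choice trades completeness for batch resilience.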

Requirements

  • English: B2 required.
  • 3-5+ years of experience as a software engineer with a focus on building and maintaining data infrastructure.
  • Experience participating in technical initiatives in a cross-functional context.
  • Practical experience with data infrastructure components such as Trino, Spark, Iceberg/Delta Lake, Kafka, Clickhouse, or PostgreSQL.
  • Hands-on experience building and debugging data pipelines.
  • Proficiency in backend languages such as Go, Python, or TypeScript, along with strong SQL skills.
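To illustrate the "backend language plus strong SQL" requirement, a self-contained sketch using Python's built-in sqlite3 as a stand-in for PostgreSQL or ClickHouse (the schema and table name are invented for the example):

```python
import sqlite3

# In-memory database standing in for a warehouse table; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE requests (host TEXT, status INTEGER)")
conn.executemany("INSERT INTO requests VALUES (?, ?)",
                 [("a.example", 200), ("a.example", 500), ("b.example", 200)])

# Error rate per host: the kind of aggregate a data product might expose.
rows = conn.execute("""
    SELECT host,
           AVG(CASE WHEN status >= 500 THEN 1.0 ELSE 0.0 END) AS error_rate
    FROM requests
    GROUP BY host
    ORDER BY host
""").fetchall()
print(rows)  # → [('a.example', 0.5), ('b.example', 0.0)]
```

The `CASE`-inside-`AVG` pattern for conditional rates carries over directly to PostgreSQL, Trino, and ClickHouse SQL dialects.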

Nice to have

  • Experience with data orchestration and infrastructure platforms such as Airflow and dbt.
  • Experience deploying and managing services in Kubernetes.
  • Familiarity with data governance processes, privacy requirements, or auditability.
  • Interest in or knowledge of machine learning models and MLOps.

Culture & Benefits

  • Mission to help build a better, free, and open Internet.
  • Commitment to diversity and inclusiveness.
  • Participation in initiatives like Project Galileo, Athenian Project, and 1.1.1.1.
  • Provides reasonable accommodations to qualified individuals with disabilities.
