Company hidden
Updated 7 hours ago

Senior Data Architect (Databricks/Snowflake)

Work format
Remote (Europe/LatAm/CIS only)
Employment type
Full-time
Seniority
Senior
English
B2
Countries
Serbia, Argentina, Ukraine, Poland, Armenia, Romania, Cyprus, Latvia, Kazakhstan, Mexico, Georgia, Bulgaria, Colombia, Brazil, Uruguay
This vacancy is from Hirify RU Global, a list of companies with Eastern European roots.

Job description


TL;DR

Senior Data Architect (Databricks / Snowflake): Guiding and supporting the transition of a large-scale data platform from a Parquet-based data lake to an open table format using Databricks for scalable data management and secure data sharing. Focus on reviewing and challenging existing architecture, designing data sharing approaches, and mentoring engineers in best practices for high-volume data ingestion and transformation.

Location: Remote within Argentina, Brazil, Bulgaria, Colombia, Georgia, Kazakhstan, Poland, Serbia, Romania, Ukraine, Cyprus, Mexico, Uruguay, Latvia, Armenia, or from offices in Almaty, Astana, Belgrade, Cluj-Napoca, Dnipro, Kharkiv, Krakow, Kyiv, Larnaca, Lodz, Lublin, Lviv, Monterrey, Montevideo, Odesa, Riga, Rosario, Sofia, Tbilisi, Varna, Warsaw, Wroclaw, Yerevan. Business trips to London are required occasionally.

Company

hirify.global is a technology company specializing in large-scale historical market data analytics for financial market participants worldwide.

What you will do

  • Review and assess current data architecture, pipelines, and storage strategies.
  • Guide the migration from a Parquet-based data lake to an open table format (e.g., Delta Lake, with data shared via Delta Sharing through Databricks).
  • Design and support scalable, secure, and performant data sharing approaches using Databricks.
  • Contribute hands-on to data engineering and architectural work as needed.
  • Support the design and optimization of large-scale data ingestion and transformation processes at terabyte-to-petabyte scale.
  • Collaborate with and mentor the internal engineering team.

Requirements

  • Extensive hands-on experience with Databricks, including Delta Lake in production.
  • Strong expertise in Snowflake, covering data modeling, performance optimization, and integration patterns.
  • Proven experience designing and operating large-scale data platforms.
  • Solid understanding of open table formats (Delta Lake, Apache Iceberg, Apache Hudi).
  • Experience with data lake and lakehouse architectures and modern data sharing concepts.
  • Ability to work effectively at both architectural and implementation levels.
  • Strong communication skills and collaborative experience with engineering teams.

Nice to have

  • Experience working with financial market data.
  • Prior involvement in data sharing or data product platforms.
  • Cloud platform experience (AWS, Azure, or GCP).

Culture & Benefits

  • Vacation as per local laws.
  • Assistance with health insurance for you and your loved ones.
  • 10 days of paid sick leave without a doctor's note.
  • Time off for state holidays.
  • Support ("comfort") service for technical and everyday work issues.
  • Opportunities for professional growth through IT certifications and access to top-tier courses.
