Company hidden
Posted 1 day ago

GCP Data Architect (Google Cloud Platform)

Work format
remote (Europe/LATAM only)
Employment type
full-time
Grade
senior
English
B2
Countries
Serbia, Argentina, Poland, Romania, Cyprus, Latvia, Mexico, Bulgaria, Colombia, Brazil, Uruguay

Job description


TL;DR

GCP Data Architect (Google Cloud Platform): Designing, building, and optimizing cloud-native data platforms and advanced analytics solutions for clients, with an emphasis on architectural decisions, data modeling, and scalability. The role focuses on assessing client infrastructures, planning and delivering GCP migrations and modernizations, and ensuring reliability across the full data lifecycle.

Location: Remote options are available from Argentina, Brazil, Bulgaria, Colombia, and Poland. Onsite work is available in Belgrade, Cluj-Napoca, Krakow, Larnaca, Lodz, Lublin, Monterrey, Montevideo, Riga, Rosario, Sofia, Varna, Warsaw, Wroclaw.

Company

hirify.global is an experienced software engineering company specializing in designing, building, and optimizing cloud-native data platforms and advanced analytics solutions for various clients.

What you will do

  • Design end-to-end data architecture solutions on Google Cloud Platform.
  • Develop scalable data pipelines and data processing frameworks.
  • Architect data models for analytics, machine learning, and operational reporting.
  • Define data governance, data quality, metadata management, and security standards.
  • Lead migration efforts from on-prem or other cloud platforms to GCP.
  • Optimize data storage and processing for performance, reliability, and cost efficiency.

Requirements

  • 5+ years of experience in Data Architecture, Data Engineering, or similar roles.
  • Strong hands-on expertise with Google Cloud Platform (BigQuery, Dataflow, Airflow, Pub/Sub, Cloud Storage).
  • Strong SQL and data modeling skills.
  • Experience with large-scale distributed data processing systems and streaming architectures.
  • Familiarity with CI/CD, IaC (Terraform), and programming languages like Python or Java/Scala.
  • Understanding of data governance, lineage, cataloging, and security (IAM).

Nice to have

  • Deep knowledge of other modern data ecosystems such as AWS, Azure, Databricks, or Snowflake.
  • Experience with ML pipelines or MLOps.

Culture & Benefits

  • Vacation and sick pay as per local laws.
  • Assistance with health insurance for you and your loved ones.
  • Time off for state holidays.
  • Enjoy corporate parties and team get-togethers.
  • A support service for technical and everyday work issues.
  • Opportunities for career growth, IT certifications, and access to learning platforms.

The vacancy text is reproduced unchanged.
