3 days ago

Senior Software Engineer (Data Platform)

$190,000 – $220,000
Work format
remote (United States/Canada only)
Employment type
full-time
Grade
senior
English
B2
Country
US/Canada

Job description

Senior Software Engineer, Data Platform

Company

TRM Labs

Conditions

1 day ago · Senior · Salary: $190K – $220K · North America · Remote · Full Time · Engineering Jobs by TRM Labs

Skills

SparkSQL, Flink, Trino, StarRocks, Iceberg, Citus, BigQuery, Airflow, dbt, Distributed Systems, Terraform, Monitoring, Data Pipeline, Datadog, Spark, SQL, Python, Docker, Kubernetes, Blockchain, ETL, Kafka, Data Modeling

About the Role

You will build and operate highly reliable data services that integrate with multiple blockchains, develop complex ETL pipelines to process petabyte-scale structured and unstructured data in real time, and design data models optimized for sub-second query latency. You will deploy, monitor, and tune large database clusters for performance and high availability, automate scaling and operational tasks, and create observability and monitoring solutions. You will collaborate with data scientists, backend engineers, and product managers to implement data models and pipelines that power product features and analytics.

Requirements

  • Bachelor's degree or equivalent in Computer Science or related field
  • 5+ years of hands-on experience architecting distributed systems and shipping production services
  • Strong programming skills in Python
  • Proficiency with SQL or SparkSQL
  • Experience with data stores such as Iceberg, Trino, BigQuery, StarRocks, and Citus
  • Familiarity with orchestration tools like Airflow and dbt
  • Experience with data processing and streaming technologies such as Spark, Kafka, and Flink
  • Experience deploying and monitoring infrastructure on public cloud using Docker, Terraform, Kubernetes, and Datadog
  • Proven ability to load, query, and transform very large datasets

Responsibilities

  • Build reliable data services to integrate with multiple blockchains
  • Develop complex ETL pipelines that process petabyte-scale structured and unstructured data in real time
  • Design and architect data models for optimal storage and sub-second retrieval
  • Deploy, monitor, and tune large database clusters for performance and high availability
  • Automate operational tasks and scaling workflows
  • Collaborate with data scientists, backend engineers, and product managers to implement data models and pipelines
  • Create observability dashboards and monitoring for data systems
  • Optimize pipelines and systems for faster iteration and reduced operational dependency

Benefits

  • Eligibility to participate in TRM’s equity plan

Be careful: if an employer asks you to sign in to their system via iCloud/Google, to send a code or password, or to run code/software, do not do it — these are scammers. Be sure to click "Report" or contact support. More details in the guide →

The vacancy text is reproduced without changes

Source -