
Data Engineer (ClickHouse/Analytics/BI)

Work format
remote (Global)
Employment type
fulltime
Grade
middle
English
B2
Country
Cyprus


Job description

#vacancy #cyprus #limassol #IT #dataengineer

⚡ Tria Group Company (part of Wise Wolves Corporation) is looking for a Data Engineer (Mid) — ClickHouse / Analytics / BI

📌 Location: On-site / Hybrid / Remote
Team: work closely with Backend, Security, QA, Product
Partners: Cryptography/Wallet team, API/Backend teams, Platform/SRE

⚡ About the role:

We’re hiring our first Data Engineer to build and own the analytics/data foundation. You’ll work across product, engineering, and business stakeholders to turn raw operational data into trusted datasets, dashboards, and metrics. This is a hands-on role with real ownership: you’ll design pipelines, shape our data model, and help establish the practices that will scale with the company.

⚡ What you’ll do
● Build and maintain data pipelines from product and operational systems into our analytics stack.
● Design and operate a ClickHouse-based analytics warehouse (schemas, partitioning, performance tuning, retention).
● Define and maintain core business metrics (single source of truth), dimensional models, and curated datasets.
● Deliver BI reporting: dashboards, self-serve datasets, and recurring management reports.
● Improve data quality: validation checks, anomaly detection, SLAs, lineage documentation.
● Partner with engineers to instrument products/events and ensure data is captured correctly (events, logs, audits).
● Support ad-hoc analysis requests and help stakeholders interpret data correctly.
● Establish “first hire” foundations: conventions, documentation, runbooks, and lightweight governance.

⚡ Requirements (must-have)
● 3+ years (or equivalent) in Data Engineering / Analytics Engineering.
● Strong SQL, experience modeling analytics datasets (star schema / dimensional modeling).
● Hands-on experience with ClickHouse (or another columnar OLAP DB) and performance optimization.
● Practical experience building ETL/ELT pipelines (batch and/or near-real-time).
● Familiarity with a BI tool (e.g., Metabase, Superset, Power BI, Looker, Tableau) and working with stakeholders.
● Comfortable operating production data workflows: monitoring, retries, backfills, incident fixes.

⚡ Nice to have
● Experience with tools like Airflow / Dagster / Prefect, dbt, Kafka, Spark/Flink (any subset).
● Experience with event analytics (product events), log analytics, or audit trails.
● Python (or another scripting language) for pipeline logic and automation.
● Understanding of security/compliance needs (access control, retention, auditability), especially in regulated environments.

⚡ What we offer
● First data hire: real ownership, influence on architecture and standards.
● Close collaboration with engineering/product leadership; your work will directly impact decisions.
● Competitive compensation and a flexible work setup (details based on location).


❣️Send CV in DM
