This vacancy is archived

Company hidden
Updated 2 months ago

Lead Data Engineer (Snowflake/dbt)

Work format
Remote
Employment type
Full-time
Grade
Lead
English
B2
Country
India/Germany

Job description

TL;DR

Lead Data Engineer (Snowflake/dbt): responsible for building and running the data pipelines and services that support business functions, reports, and dashboards, with an emphasis on end-to-end ETL/ELT pipeline development and Data Mesh architecture implementation. The focus is on optimizing data feeds, implementing data quality tests, and ensuring efficient data delivery in a modern data stack environment.

Location: remote, from home or anywhere in your assigned Indian state; you may also work from a different country or Indian state for 90 days of the year.

Company

hirify.global is building a business management platform designed to save small businesses time and money, providing business accounts and connected administrative solutions.

What you will do

  • Develop end-to-end ETL/ELT pipelines in collaboration with Data Analysts.
  • Design and implement scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture.
  • Mentor junior engineers and serve as a data technologies expert.
  • Troubleshoot and resolve technical issues and improve data pipeline delivery.
  • Translate business requirements into technical specifications, including data models, dbt models, timings, and tests.
  • Perform exploratory data analysis to identify and resolve data quality issues.
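
The data-quality work listed above is the kind of check dbt formalizes as schema tests (`not_null`, `unique`). A minimal pure-Python sketch of the same idea; the `orders` table and its columns are hypothetical, not taken from the posting:

```python
# Hypothetical data-quality checks mirroring dbt's built-in schema tests.

def check_not_null(rows, column):
    """Return rows where the column is missing (dbt's not_null test)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values that appear more than once (dbt's unique test)."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Example feed with two deliberate quality issues.
orders = [
    {"order_id": 1, "customer_id": "a"},
    {"order_id": 2, "customer_id": None},  # fails not_null
    {"order_id": 2, "customer_id": "b"},   # fails unique on order_id
]

null_failures = check_not_null(orders, "customer_id")
dupe_failures = check_unique(orders, "order_id")
```

In dbt itself these rules would live declaratively in a model's YAML, and `dbt test` would run them as SQL against the warehouse.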

Requirements

  • 7+ years of development experience with Snowflake or a similar data warehouse technology.
  • Experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, git, and Looker.
  • Experience in agile processes, such as SCRUM.
  • Extensive experience in writing advanced SQL statements and performance tuning.
  • Experience with data ingestion techniques using custom or SaaS tools such as Fivetran.
  • Experience in data modeling and optimizing existing/new data models.
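
One common instance of the "advanced SQL" the requirements mention is deduplicating a raw feed with a window function so only the latest load of each record survives. A runnable sketch using SQLite (3.25+) as a stand-in warehouse; the `raw_orders` table and its rows are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse like Snowflake.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE raw_orders (order_id INT, loaded_at TEXT, status TEXT);
    INSERT INTO raw_orders VALUES
        (1, '2024-01-01', 'open'),
        (1, '2024-01-02', 'shipped'),  -- later load of order 1 should win
        (2, '2024-01-01', 'open');
""")

# ROW_NUMBER() OVER (PARTITION BY ...) ranks loads per order, newest first;
# keeping rn = 1 yields one current row per order_id.
latest = con.execute("""
    SELECT order_id, status FROM (
        SELECT order_id, status,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id ORDER BY loaded_at DESC
               ) AS rn
        FROM raw_orders
    ) WHERE rn = 1
    ORDER BY order_id
""").fetchall()
# latest -> [(1, 'shipped'), (2, 'open')]
```

The same pattern (often written with `QUALIFY` on Snowflake) is a staple of staging models in dbt.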

Nice to have

  • Experience architecting analytical databases in a Data Mesh architecture.
  • Experience working in agile cross-functional delivery teams.
  • Basic understanding of various systems across the AWS platform.
  • Experience with Python, governance tools (e.g., Atlan, Alation, Collibra), or data quality tools (e.g., Great Expectations, Monte Carlo, Soda).

Culture & Benefits

  • Flexible workplace model supporting both in-person and remote work.
  • Colleagues can work remotely from home or anywhere in their assigned Indian state.
  • Can work from a different country or Indian state for 90 days of the year.
  • Competitive salary, self & family health insurance, and term & life insurance.
  • Learning & Development Budget and WFH Setup allowance.