TL;DR
Senior Analytics Engineer (AI): Building and maintaining hirify.global’s enterprise semantic layer, transforming raw data into trusted models and analytics-ready datasets, with an emphasis on semantic modeling, data governance, and business intelligence tooling. Focus on architecting reusable data models, optimizing complex analytical queries, and empowering data-driven decision-making across the company.
Location: Hybrid in Livingston, NJ; New York, NY; Sunnyvale, CA; or Bellevue, WA. Remote work may be considered for candidates located more than 30 miles from an office, but requires quarterly team gatherings at an office hub. Work authorization for the US is required.
Salary: $143,000–$210,000 (USD) base salary, plus discretionary bonus and equity awards.
Company
hirify.global is a publicly traded product company providing The Essential Cloud for AI™, enabling innovators to build and scale AI with superior infrastructure performance and technical expertise.
What you will do
- Build and evolve the enterprise semantic layer, transforming raw data into trusted models and analytics-ready datasets.
- Deliver executive dashboards and AI analyst agents using tools like dbt, Airflow, Spark, StarRocks, Tableau, and Omni.
- Partner with data engineering, finance, product, and operations to define shared business logic.
- Own the last mile of analytics, shaping how the company understands its data and makes decisions.
Requirements
- 7+ years of experience in Analytics Engineering, Data Engineering, or Business Intelligence, with ownership of production analytics systems.
- 3+ years of hands-on experience modeling analytics-ready data using dbt with SQL and/or Python.
- Expert-level SQL, including writing, optimizing, and debugging complex analytical queries.
- Deep experience querying MPP analytical databases (e.g., StarRocks, Snowflake, BigQuery, Redshift).
- Strong experience with modern BI tools (e.g., Omni, Tableau, Looker, Power BI) for building semantic models and executive dashboards.
- Hands-on experience orchestrating data pipelines with Airflow, Dagster, or equivalent.
- Working proficiency in at least one scripting language (e.g., Python, Bash, R, Julia).
- Proven ability to translate complex data into trusted models, metrics, and visualizations.
- US person status (citizen, lawful permanent resident, refugee, or asylee) or eligibility to obtain required export authorization is mandatory due to export control compliance.
Nice to have
- Experience building and operating production data pipelines with strong software engineering practices.
- Deep hands-on experience with stream processing and transport systems (e.g., Flink, Kafka).
- Familiarity with open table formats such as Iceberg, Hudi, Delta, or Paimon.
Culture & Benefits
- Fast-paced, hyper-growth environment with an entrepreneurial outlook and independent thinking.
- Comprehensive benefits including 100% company-paid medical, dental, and vision insurance.
- 401(k) with a generous employer match and Employee Stock Purchase Program (ESPP).
- Flexible PTO and mental wellness benefits.
- Catered lunch in office and data center locations, with a casual work environment.
- Culture focused on curiosity, ownership, employee empowerment, and collaboration.
Hiring process
- New hires will attend onboarding at one of the hubs within their first month.