TL;DR
Analytics Engineer (Databricks): Designing, building, and maintaining scalable data models in the Databricks silver and gold layers, with an emphasis on dimensional modeling, data contracts, and ETL/ELT development. Focus on optimizing transformation pipelines, ensuring data quality, and enabling business insights through reusable datasets.
Location: Manila, Philippines (office), with remote and hybrid options available. Relocation support to the Philippines for eligible candidates.
Company
hirify.global is a licensed neobank delivering modern banking services to millions of customers in the fast-growing Southeast Asia region, leveraging AI and aiming to become a fintech unicorn.
What you will do
- Design, build, and maintain scalable data models in Databricks silver and gold layers.
- Partner with data scientists, platform engineers, and business analysts to ensure gold datasets meet business needs.
- Develop and optimize transformation pipelines using PySpark/SQL/Delta Live Tables/Databricks Workflows.
- Establish and maintain data quality metrics and governance standards for silver and gold tables.
- Build reusable, business-friendly datasets that power dashboards, self-service BI tools, and advanced analytics.
- Optimize Databricks SQL queries and Delta Lake performance.
Requirements
- Strong SQL expertise, including writing complex, performant queries and optimizing them on large datasets.
- Hands-on experience with dbt for building and maintaining models, writing reusable macros, and implementing tests and documentation.
- Data Modeling expertise with a strong understanding of dimensional modeling (facts, dimensions, star schemas) and ability to translate business requirements into scalable data models.
- Analytics Engineering mindset with a strong focus on data quality, reliability, consistency, and working closely with stakeholders.
- Experience with production-ready analytics, including data testing, monitoring, debugging, and familiarity with ELT pipelines and Git-based workflows.
Nice to have
- Python experience for data transformations or utilities (e.g., pandas).
- Experience working in Databricks notebooks or workflows, and an understanding of the Databricks architecture.
- Knowledge of Apache Spark (Spark SQL or PySpark) for large-scale data processing.
- Understanding of Delta Lake tables, ACID transactions, and time travel.
- Experience with modern cloud data environments (Databricks, AWS, GCP, Azure) and lakehouse architectures.
Culture & Benefits
- Passionate international team spanning the globe.
- Rapid professional growth based purely on merit.
- Rewards tied to performance and long-term success of hirify.global.
- Fast-track opportunities for international growth.
- Medical insurance, health and wellness benefits.
- Program of events and activities both online and in person.