TL;DR
Analytics Engineer (Databricks): Design and build scalable data models and ETL/ELT pipelines for advanced analytics in Databricks, ensuring data quality and governance. The role focuses on dimensional modeling, optimizing data transformations (PySpark/SQL), and enabling business intelligence for a fintech neobank.
Location: Manila, Philippines (office). Remote and hybrid options available. Relocation support to the Philippines is available for eligible candidates.
Company
hirify.global is a licensed neobank providing modern banking services to millions of Filipinos, with the ambition of becoming a unicorn and a leading AI-driven fintech in Southeast Asia.
What you will do
- Design, build, and maintain scalable data models in Databricks silver and gold layers.
- Define data contracts and apply dimensional modeling best practices to support analytics and reporting.
- Develop and optimize data transformation pipelines using PySpark, SQL, Delta Live Tables, and Databricks Workflows.
- Implement incremental data processing strategies and data quality checks (validations, anomaly detection).
- Establish and maintain data quality metrics and apply data governance standards across datasets.
- Collaborate with analysts and business stakeholders to translate requirements into reusable, business-friendly gold-layer datasets.
Requirements
- Strong SQL expertise, including the ability to write complex, performant queries and optimize them on large datasets.
- Hands-on experience with dbt for building and maintaining models, writing reusable macros, and implementing tests and documentation.
- Data modeling expertise, including a strong understanding of dimensional modeling (facts, dimensions, star schemas) and designing semantic layers.
- Analytics Engineering mindset with a strong focus on data quality, reliability, and consistency.
- Experience with production-ready analytics, including data testing, monitoring, debugging, and Git-based workflows.
Nice to have
- Python experience for writing data transformations, utilities, or automation (e.g., with pandas).
- Familiarity with Databricks architecture, notebooks, workflows, and job orchestration.
- Experience with Apache Spark or PySpark for large-scale data processing.
- Knowledge of Delta Lake tables, ACID transactions, and time travel.
- Experience with modern cloud data environments (Databricks, AWS / GCP / Azure), data lakes, and lakehouse architectures.
- Experience supporting BI tools (Looker, Tableau, Power BI).
Culture & Benefits
- Work within a passionate international team with opportunities for rapid professional growth.
- Be rewarded for your performance and for the long-term success of hirify.global.
- Work from a new office in Manila, Philippines.
- Receive medical insurance and other health and wellness benefits.
- Participate in a program of events and activities both online and in person.