TL;DR
Data Engineering Intern: Building and maintaining scalable data models and pipelines using Spark, Apache Airflow, and SQL, with a focus on data lakehouse technologies like HUDI and Iceberg. The role centers on designing performant data storage, improving data quality, and collaborating in a hybrid engineering team.
Location: On-site in Ho Chi Minh City, Vietnam
Company
hirify.global is a leading eCommerce cashback platform serving over 50 million users across 13 markets, focused on building rewarding shopping experiences.
What you will do
- Build and maintain data models and pipelines using Spark and Apache Airflow
- Design and optimize HUDI and Iceberg tables for performance and reliability
- Write and validate SQL transformations for analytics platforms like Trino and Metabase
- Collaborate with senior engineers to improve data quality and observability
- Use AI tools to assist coding and documentation, verifying their output
- Document learnings and share improvements within the team
Requirements
- Location: Must be able to work on-site in Ho Chi Minh City, Vietnam
- Strong interest in data engineering or data systems
- Familiarity with Python and SQL (projects or self-taught acceptable)
- Curiosity about data pipelines, storage formats, and data quality
- Comfort experimenting responsibly with AI coding assistants
- Clear communication and proactive progress sharing
Nice to have
- Exposure to AWS, dbt, Spark, or Trino
Culture & Benefits
- Career progression paths and opportunities for growth
- Competitive compensation based on performance
- Candid, open, and collaborative culture valuing feedback
- Part of a winning team on a journey to global scale