TL;DR
Senior Data Engineer (Fintech): Designing, building, and maintaining efficient data pipelines using Python and Apache Airflow, with a focus on cloud-native data services, containerized workflows, and small-volume ETL processes. The role centres on optimizing data ingestion and transformation and ensuring pipeline efficiency and reliability for financial institutions.
Location: Client location in Sydney. Flexible working options are available.
Company
hirify.global is a global digital transformation consulting firm with 14,500+ employees across 58 offices in 21 countries, specializing in AI, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering for financial services.
What you will do
- Develop, deploy, and maintain scalable data pipelines using Python and Apache Airflow for orchestration.
- Build and manage containerized data workflows to ensure portability and consistency.
- Collaborate with data scientists and analysts to understand data requirements and optimize data ingestion and transformation.
- Design and implement ETL processes for small-volume data integration and processing.
- Work with cloud-native data services and maintain database solutions such as Aurora PostgreSQL.
- Monitor pipeline performance, troubleshoot issues, and optimize workflows for efficiency.
- Adhere to best practices in data security, governance, and documentation.
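As a rough illustration of the small-volume ETL work described above, a minimal Python/Pandas sketch might look like the following. The record schema, column names, and validation rules here are hypothetical, not taken from the role; in practice, functions like these would typically be wired into Apache Airflow tasks rather than run standalone.

```python
# Minimal small-volume ETL sketch (hypothetical schema and rules).
import pandas as pd

def extract(raw: list[dict]) -> pd.DataFrame:
    """Load raw transaction records into a DataFrame (stand-in for a real source)."""
    return pd.DataFrame(raw)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Normalise currency codes and drop records with non-positive amounts."""
    out = df.copy()
    out["currency"] = out["currency"].str.upper()
    out = out[out["amount"] > 0].reset_index(drop=True)
    return out

def load(df: pd.DataFrame) -> int:
    """Stand-in load step: a real pipeline might write to Aurora PostgreSQL here."""
    # Returning the row count lets callers verify the pipeline ran.
    return len(df)

if __name__ == "__main__":
    raw = [
        {"id": 1, "amount": 120.0, "currency": "aud"},
        {"id": 2, "amount": -5.0, "currency": "AUD"},
        {"id": 3, "amount": 42.5, "currency": "usd"},
    ]
    clean = transform(extract(raw))
    print(load(clean))  # number of rows surviving validation
```

Splitting the pipeline into extract/transform/load functions keeps each step independently testable, which is what makes it straightforward to schedule as separate orchestrated tasks.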
Requirements
- Proficiency in Python, especially Pandas and related data manipulation libraries.
- Hands-on experience with Apache Airflow for building and scheduling workflows.
- Experience developing and managing containerized data pipelines using Docker or similar technologies.
- Familiarity with cloud-native data services (AWS, GCP, Azure) and small-volume ETL processes.
- Knowledge of Aurora PostgreSQL or comparable managed/serverless database solutions.
- Understanding of data governance, security, and best practices in data engineering.
- Excellent problem-solving skills and ability to work independently or within a team.
- Strong communication skills to collaborate effectively across technical teams.
Nice to have
- Experience working with cloud platforms like AWS, GCP, or Azure.
- Knowledge of other data orchestration tools.
- Familiarity with data modeling and database optimization.
- Experience working in financial services or regulated industries.
Culture & Benefits
- Engage in innovative data projects with leading global clients.
- Opportunity to work with cutting-edge cloud-native tools and technologies.
- Competitive salary and comprehensive benefits package.
- Focus on professional growth and continuous learning.
- Dynamic, collaborative work environment with flexible working options.
- Commitment to diversity, equity, inclusion, and sustainability initiatives.