Data Engineer
Job description
TL;DR
Data Engineer (BigQuery, Airflow): Designing, building, and maintaining scalable data pipelines that support authentication, authorization, fraud detection, compliance, and user identity analytics, with an emphasis on secure data solutions and global ecosystem impact. Focus on optimizing ELT/ETL workflows, data modeling in Google BigQuery, and ensuring data quality, observability, and security.
Location: Hybrid, 2 days a week at the Berlin campus, Germany. Relocation to Berlin, Germany is supported.
Company
is the world’s pioneering local delivery platform, operating in over 70 countries and headquartered in Berlin, Germany.
What you will do
- Design, build, and maintain scalable and reliable data pipelines for identity and access management use cases.
- Develop and optimize ELT/ETL workflows using Apache Airflow.
- Model, transform, and optimize large-scale datasets in Google BigQuery for analytics and operational reporting.
- Ensure data quality, observability, and reliability through monitoring, alerting, and automated testing.
- Collaborate with Security, IAM, Product, and Analytics teams to deliver end-to-end data solutions.
- Implement privacy-by-design and security best practices in all data workflows.
Requirements
- 4+ years of experience as a Data Engineer, Analytics Engineer, or in a similar role in a cloud environment.
- Deep experience with GCP data services, especially BigQuery and Cloud Storage, including performance optimization and data modeling.
- Solid experience building and orchestrating pipelines with Apache Airflow and complex data modeling in a Data Mesh architecture.
- Strong Python skills for production-grade data engineering, including experience with Pandas, PySpark, and pytest.
- Knowledge of data governance, privacy, and security principles (PII, access control, audits).
- Must be able to work hybrid from Berlin, Germany.
Nice to have
- Experience working with identity, authentication, security, or compliance data.
- Experience with GCP streaming and data processing services (Pub/Sub, Dataflow) or AWS services (S3, Lambda, IAM).
- Experience with CI/CD for data or Infrastructure-as-Code tools like Terraform.
Culture & Benefits
- Hybrid working model with 2 days a week in the Berlin office for face-to-face connection and collaboration.
- 27 days of holiday, with an extra day after your 2nd and 3rd years of service.
- €1,000 Educational Budget, Language Courses, Parental Support, and access to Udemy Business.
- Health Checkups, Meditation, Gym & Bicycle Subsidy.
- Employee Share Purchase Plan, Sabbatical Bank, Public Transportation Ticket Discount, Life & Accident Insurance, Corporate Pension Plan.
- Digital Meal Vouchers and Food Vouchers.
- Relocation support to Berlin, Germany.