Company hidden
3 days ago

Advanced Data Engineer (AI)

Work format
hybrid
Employment type
full-time
Grade
senior
English
B2
Country
Switzerland
Vacancy from Hirify RU Global, a list of companies with Eastern European roots.


Job description


TL;DR

Advanced Data Engineer (AI): design, implement, and maintain scalable data pipelines and data products for complex client environments, with an emphasis on distributed data processing and cloud-native architectures. The focus is on building robust, maintainable data solutions while collaborating with cross-functional teams of architects, software engineers, and data scientists.

Location: Schlieren, Switzerland. The role is hybrid, requiring presence in the office with flexibility for remote work.

Company

hirify.global is a global transformation partner and innovation specialist, founded in Switzerland in 1968, dedicated to helping clients build future-ready businesses through advanced engineering and digital solutions.

What you will do

  • Act as a trusted advisor to clients, guiding them toward technical solutions for complex data challenges.
  • Communicate technical solution architectures to both technical and non-technical stakeholders.
  • Develop, test, and monitor distributed data processing pipelines for scalability and performance.
  • Collaborate with architects and data scientists to ensure high-quality, reproducible datasets.
  • Deliver projects using Agile methodologies to produce iterative business value from data.

Requirements

  • Fluency in both German and English is required.
  • University degree in computer science, software engineering, or data science.
  • Minimum 3 years of experience in data or software engineering roles.
  • Strong practical proficiency in Python and SQL.
  • Experience with Cloud Data Platforms such as Databricks, Snowflake, or Microsoft Fabric.
  • Experience with data architectures like Data Lakehouse, streaming, and batch processing.

Nice to have

  • Hands-on experience with Apache Spark, Delta Lake, Kafka, and Airflow.
  • Familiarity with containerization (Kubernetes) and IaC (Terraform).
  • Knowledge of diverse data types including graph, time-series, and geospatial data.
  • Experience with DevOps methodologies.

Culture & Benefits

  • Flexible working hours with options for remote work.
  • Participation in a profit share scheme based on company performance.
  • Strong commitment to continuous professional development and training.
  • Global community with annual team camps and inclusive company culture.
  • Opportunity to work on complex, socially impactful projects across diverse industries.
