Advanced Data Engineer (AI & Data)
Job description
TL;DR
Advanced Data Engineer (AI): Design, implement, and maintain data pipelines across various industries, leveraging data platforms and data-driven solutions with an emphasis on technical solution advisory, distributed data processing, and building scalable datasets. The focus is on guiding clients through their data challenges, collaborating with cross-functional teams, and delivering value through agile development and advanced data architectures.
Location: Schlieren, Switzerland. Flexible working hours with the possibility to work from home. Fluency in both German and English is required.
The employer is a global transformation partner founded in Switzerland in 1968, specializing in tech strategy, business innovation, digital solutions, and device/systems engineering across Europe and Asia.
What you will do
- Advise clients on data challenges and guide them towards successful technical solutions.
- Design, implement, test, and monitor distributed data processing pipelines.
- Work with versatile data sources to produce high-quality, reproducible, and scalable datasets.
- Collaborate with Architects, Software Engineers, and Data Scientists.
- Ensure data products meet the needs of various producers and consumers.
- Deliver projects using Agile methodologies to produce value early and frequently.
Requirements
- University degree in computer science, software engineering, data science, or similar.
- At least 3 years of experience in data or software engineering.
- Experience designing, building, and maintaining data products.
- Understanding of data analysis, visualization, and optionally data science approaches.
- Experience with data architectures (e.g., Data Lake, Data Lakehouse, Medallion, streaming, batch processing).
- Experience with cloud data platforms (Databricks, Snowflake, Microsoft Fabric, Amazon SageMaker).
- Practical data programming skills in Python and SQL.
- Fluency in both German and English.
Nice to have
- Hands-on skills or interest in Apache Spark, Delta Lake, Airflow, Kafka, Kubernetes, Terraform, FastAPI, Scala, R, Java, or TypeScript/.NET.
- Familiarity with big data infrastructures and concepts.
- Practical knowledge of handling varied types of structured and unstructured data.
- Experience with agile development and DevOps methodologies.
Culture & Benefits
- Flexible working hours and possibility to work from home.
- Profit share scheme based on company success.
- Global and diverse community across 16 offices.
- Commitment to continuous professional and personal development.
- Focus on delivering real impact and fostering trust and independence.
- Belief in technology for positive societal difference.