AWS Data Engineer (Cloud)
Job Description
TL;DR
AWS Data Engineer (Cloud): Design and implement scalable, cloud-native data architectures and pipelines on AWS for processing large-scale and unstructured datasets, with an emphasis on data mesh, data lake, and streaming architectures. The role focuses on optimizing data processing efficiency, supporting cloud migrations, and ensuring data quality and performance across AWS services.
Location: Hybrid in Poland (Gdansk, Krakow, Wroclaw, Poznan, Warsaw)
Company
The company is a global leader in technology consulting and digital transformation, with over 360,000 employees worldwide, partnering with top enterprises, including Fortune 500 companies.
What you will do
- Design and implement solutions for processing large-scale and unstructured datasets using Data Mesh, Data Lake, or Streaming Architectures.
- Develop, optimize, and test modern DWH/Big Data solutions on AWS within CI/CD environments.
- Improve data processing efficiency and support migrations from on-premises to public cloud platforms.
- Collaborate with data architects and analysts to deliver high-quality cloud-based data solutions.
- Ensure data quality, consistency, and performance across AWS services and environments.
- Participate in code reviews and contribute to technical improvements.
Requirements
- Proven experience in Big Data or Cloud projects processing large and unstructured datasets across SDLC phases.
- Hands-on experience with AWS cloud services including Storage, Compute, Serverless, Networking, and DevOps.
- Solid understanding of AWS services, ideally with relevant certifications.
- Familiarity with Glue, Redshift, Lambda, Athena, S3, Snowflake, Docker, Terraform, CloudFormation, Kafka, Airflow, Spark.
- Basic proficiency in Python, Scala, Java, or Bash.
- A very good command of English (German is an advantage).
Nice to have
- Experience with orchestration tools like Airflow or Prefect.
- Exposure to CI/CD pipelines and DevOps practices.
- Knowledge of streaming technologies such as Kafka and Spark Streaming.
- Experience with Snowflake or Databricks in production or development.
- Relevant AWS, data engineering, or big data certifications.
Culture & Benefits
- Yearly financial bonus and private medical care with additional packages.
- Access to over 70 training tracks with certification opportunities and free access to language and technical learning platforms.
- Work with cutting-edge technology and top global enterprises.
- Hybrid working model with ergonomic home office package after onboarding.