Company hidden
Posted 8 hours ago

Sr. Data Engineer (AI Platform)

Work format
remote (Poland only)/hybrid
Employment type
full-time
Seniority
senior
English
B2
Country
Poland
Vacancy from Hirify Global, a list of international tech companies.

Job description

TL;DR

Sr. Data Engineer (AI Platform): integrate large language models into production AI agent workflows, build structured-output pipelines with grounding and evidence traceability, and design APIs serving risk intelligence data. Focus on data ingestion pipelines, async task reliability, shared libraries, and full service lifecycle management.

Location: Poland (Remote or hybrid with coworking space in Warsaw)

Company

Leader in supplier risk intelligence with a proprietary AI-powered data platform serving Fortune 500 companies, governments, and global platforms; post-Series B, headquartered in San Francisco with hubs in Seattle and Warsaw.

What you will do

  • Integrate LLMs into production AI workflows, building pipelines with structured outputs, grounding, reference resolution, and traceability.
  • Design and evolve APIs serving risk intelligence data to customers and integrations.
  • Build and maintain data ingestion pipelines, owning extraction, loading, and error handling.
  • Ensure reliability of async task processing systems with monitoring, autoscaling, alerting, and incident response.
  • Develop shared libraries and internal tooling to accelerate engineering teams.
  • Manage the full service lifecycle, from launching new services to decommissioning legacy ones.
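For a sense of the structured-output work described above, here is a minimal sketch of validating an LLM response against a schema with Pydantic (which appears in the team's stack below). The model name, fields, and sample payload are purely illustrative; the posting does not specify the actual schemas.

```python
from pydantic import BaseModel, Field, ValidationError

# Hypothetical schema for a structured LLM answer with evidence traceability.
class RiskFinding(BaseModel):
    supplier: str
    risk_level: str = Field(pattern="^(low|medium|high)$")
    evidence_refs: list[str]  # source document IDs grounding the claim

raw = '{"supplier": "Acme Ltd", "risk_level": "high", "evidence_refs": ["doc-17"]}'

try:
    finding = RiskFinding.model_validate_json(raw)
    print(finding.supplier, finding.risk_level)
except ValidationError as exc:
    # Malformed or ungrounded model output is rejected before it reaches customers.
    print("LLM output failed schema validation:", exc)
```

Rejecting any response that fails validation is one common way to keep LLM output traceable back to evidence rather than free-form text.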

Requirements

  • 4+ years in Data Engineering or Backend Engineering with strong data focus
  • 4+ years production Python; clean, testable code
  • Experience designing/building APIs (GraphQL with Apollo Federation preferred)
  • Production data services: reliability, scalability, fault tolerance, observability
  • Data engineering fundamentals: ETL/ELT, batch/streaming, DWH, Data Lakes, distributed processing
  • Async Python (asyncio, Celery, task queues); SDLC and software practices
  • Infrastructure-as-code; curiosity, problem-solving, communication, self-starter
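As a small illustration of the async-reliability requirement above, a sketch of retry-with-exponential-backoff around an asyncio task. The function and failure scenario are hypothetical, not from the posting; real systems would pair this with the monitoring and alerting the role describes (or use Celery's built-in retries).

```python
import asyncio

# Illustrative retry wrapper: re-run an async task on transient failure,
# doubling the delay between attempts, and re-raise after the last attempt.
async def run_with_retries(task, attempts=3, base_delay=0.1):
    for attempt in range(1, attempts + 1):
        try:
            return await task()
        except Exception:
            if attempt == attempts:
                raise
            await asyncio.sleep(base_delay * 2 ** (attempt - 1))

# Simulated flaky ingestion step that succeeds on the third call.
calls = {"n": 0}

async def flaky_ingest():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "loaded"

print(asyncio.run(run_with_retries(flaky_ingest)))  # prints "loaded"
```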

Nice to have

  • Familiarity with stack: Python, Pandas, Polars, Celery, SQL (PostgreSQL), SQLAlchemy, Airflow, Docker, Kafka, OpenAI/Anthropic/Vertex AI, Pydantic, GraphQL (Strawberry), AWS services, Terraform, Datadog
  • Experience or interest in AI-native workflows (e.g., Claude Code)

Culture & Benefits

  • Option for B2B contractor or full-time employee
  • Competitive salary at fast-growing startup
  • 100% remote or hybrid (Warsaw coworking); PTO: 28 days + holidays (full-time), 15 days + holidays (B2B)
  • High responsibility and authority matched with growth investment; diverse, inclusive environment
