Company hidden
4 days ago

Product Owner (Data Acquisition)

Work format
remote (Global)
Employment type
full-time
Grade
middle/senior
English
B2
Country
China

Job description

TL;DR

Product Owner (Data Acquisition): Owns the data acquisition engine, identifies new public data sources, shapes the data schema, and works closely with engineers to build reliable pipelines. Focus on translating raw data into client-ready use cases, balancing technical feasibility with commercial impact, and ensuring extraction reliability.

Location: hirify.global Flex - a mix of working from home and the office + 20 days of working remotely (anywhere in the world!)

Company

hirify.global is a start-up focusing on the next generation of data products powered by AI technology, with offices in Amsterdam (HQ) and Shanghai.

What you will do

  • Own the roadmap for data acquisition & sourcing (new sources, fields, quality).
  • Translate business needs into clear technical requirements for engineers.
  • Work with Python- and SQL-heavy pipelines; review feasibility, spot risks, guide implementation.
  • Identify and evaluate new public data sources; assess coverage, effort, and value.
  • Improve and expand our data models with new fields or enrichments.
  • Ensure extraction reliability and address anti-scraping/anti-bot challenges together with engineering.

Requirements

  • Technical background: comfortable discussing architecture with engineers.
  • Proven experience monetizing data products (required).
  • Experience working with Point-of-Interest (POI) data and location intelligence (mandatory).
  • Experience working with data pipelines, APIs, or scraping-based acquisition.
  • Ability to understand raw data and convert it into valuable client deliverables.
  • Strong prioritization and communication skills; able to align technical and business stakeholders.

Nice to have

  • Experience with scraping frameworks, anti-scraping solutions, or proxy management.
  • Experience with Apify, Firecrawl, or other tools/platforms in the web crawling/scraping/data extraction ecosystem.
  • Familiarity with large data volumes, ETL/ELT workflows, or cloud data infrastructure.
  • Work experience with Python and SQL is preferred.

Culture & Benefits

  • A competitive annual leave entitlement
  • Company Laptop
  • Referral Bonus
  • Annual Learning Budget
  • Annual Health Check
  • Working within an international team that truly values your contribution

Be careful: if an employer asks you to sign in to their system via iCloud/Google, to send a code or password, or to run code/software, do not do it; these are scammers. Be sure to click "Report" or contact support. More details in the guide →

The vacancy text is reproduced without changes
