TL;DR
Senior Principal Data Platform Software Engineer: designing, developing, and operating high-quality big data and analytical platform solutions with an emphasis on optimal cost, minimal latency, and maximum reliability. The focus is on improving large-scale distributed data systems, driving the evolution of high-performance analytical databases, and leading cross-organizational initiatives.
Location: Remote (Global). The team is distributed across Australia, the US, and Bangalore. Compensation is structured around US geographic pay zones.
Salary: Zone A: $239,400 - $312,550; Zone B: $216,000 - $282,000; Zone C: $198,900 - $259,675.
Company
hirify.global's mission is to unleash the potential of every team through products like Jira, Confluence, Loom, and Trello.
What you will do
- Design, develop, and own delivery of high-quality big data and analytical platform solutions.
- Improve and operate large-scale distributed data systems in the cloud (AWS, GCP, Kubernetes).
- Drive the evolution of high-performance analytical databases and their integrations.
- Define and uplift engineering and operational standards for petabyte-scale data platforms.
- Partner across product and platform teams to deliver company-wide initiatives.
- Act as a technical mentor and multiplier, raising the bar on design and code quality.
- Own the long-term architecture and technical direction of the product data platform.
- Partner with executives and influence senior leaders to align engineering efforts with business objectives.
Requirements
- 15+ years in Data Engineering, Software Engineering, or related roles with big data ecosystems.
- Demonstrated experience building and operating data platforms or large-scale data services in production.
- Proven track record of building services from the ground up (requirements to ownership).
- Hands-on experience with AWS, GCP (compute, storage, data, and streaming services) and cloud-native architectures.
- Practical experience with big data technologies like Databricks, Apache Spark, AWS EMR, Apache Flink, or StarRocks.
- Strong programming skills in Kotlin, Scala, Java, or Python.
- Experience leading cross-team technical initiatives and influencing senior stakeholders.
- Experience mentoring Staff/Principal engineers.
Nice to have
- Experience with StarRocks or similar MPP/OLAP analytical databases in production.
- Experience with streaming and batch architectures.
- Familiarity with data lakes and table formats (e.g., Iceberg, Delta Lake) and columnar formats (e.g., Parquet).
- Exposure to data governance, security, privacy, and compliance.
- Experience optimizing cost, performance, and reliability for petabyte-scale data systems.
- Familiarity with observability tooling for data platforms.
- Experience contributing to technical strategy and multi-year roadmaps for data platforms.
Culture & Benefits
- Opportunity to shape the next generation of hirify.global’s product data platform.
- Work on challenging, large-scale data and distributed systems problems in a modern cloud environment.
- Join a high-caliber, diverse engineering team with mentorship and growth opportunities.
- Enjoy autonomy, scope, and influence to define major technical initiatives.
- Compensation programs are designed to be equitable, explainable, and competitive.
- Offerings include health and wellbeing resources and paid volunteer days.