
Senior Data Engineer

Oolio
Full-time
On-site
Melbourne, Australia

Important

  • LOCATION: West Melbourne
  • HYBRID: 4 days per week in the office
  • PR/VISA STATUS: Applicants must be an Australian citizen or hold a valid PR. Please do not apply if you require sponsorship.

> Who We Are

Oolio is an Australian-based start-up on a mission to empower the hospitality industry. Our platform is built with the sole purpose of helping venues do what they do best – provide an exceptional experience to their patrons during the moments that matter. While the platform is new, we are part of an established international group of successful companies, giving you the freedom of a start-up with the support of an experienced group of leaders, backed by one of Australia's leading investment funds.

> About the role:

Our Data team manages and maintains all data streams coming into our warehouse through well-orchestrated pipelines (ETL/ELT) and ensures consumers have access to the data they are interested in.

💡 The work starts with data extraction from different sources, continues with loading and transforming that data, and extends all the way to creating reports and other data products.

> What you will do:

  • Design, code, test, and deploy new data management tools and systems
  • Create data pipelines from various source systems, such as HubSpot, NetSuite, and POS, to the lake/warehouse
  • Data wrangling - getting our data into the right shape for answering questions and building models takes skill, and may require the use of big-data technologies
  • Write queries and develop pipelines to surface data analysis and visualisations for internal and external users

> About you:

  1. Degree in Mathematics/Statistics, Computer Science, or IT
  2. 5+ years' experience in data engineering

> Tech Experience:

  1. Proficient **Python** skills that can be used both for ad hoc analysis and for building maintainable, efficient data pipelines
  2. Advanced SQL skills
  3. Experience in R/Spark is a plus
  4. Experience with BI tools such as Power BI and Tableau
  5. Exposure to Snowflake and Airflow
  6. Exposure to ML models
  7. Experience with DevOps practices and techniques, such as Docker and CI/CD tools.
  8. Exposure to data engineering concepts and associated technologies such as Airflow/Fivetran, BigQuery, Kafka, batch and real-time data pipelines, ELT, and SQL
  9. Exposure to both structured and unstructured data