Data Engineer @ Ofload

Technology start-up with funding

Work with exciting Mobile and Web tech

Join an early-stage venture in Sydney that is disrupting the most important industry in logistics: trucking

At Ofload we are a disruptor that strives to eliminate waste in the trucking industry. We do this by enabling trucking operators to drive full more often, eliminating time spent waiting around for loads, and empowering the small trucking companies of the world. The organisation has a continuously improving and supportive culture that focuses on bringing value to our customers.

 

At the core of this complex logistics business is data. As a Data Engineer, you will work across our tech and business teams to deliver the next generation of data analytics solutions, with responsibility for developing, deploying, and supporting your data assets.

 

You’ll bring your in-depth knowledge of big data technologies and best practices into an agile, customer-focused, continuous-improvement environment, and be part of the initial team that defines how we view, collect, handle, and use data.

 

If you are passionate about your craft, push to continuously improve prediction accuracy and data quality, and want to work with systems that drive our economy, Ofload will be the right place for you!

 

This is a full-time, permanent, remote position where you can work from your desired location.

 

The Role

  • Designing reporting interfaces, tools, and solutions
  • Understanding prediction models and realising them in software
  • Maintaining high standards of data quality within the team by establishing good practices and habits
  • Operationalising data pipelines, systems, and other analytical assets into core execution systems and platforms
  • Developing and implementing data integration pipelines from business applications to the Data Lake/Data Warehouse
  • Ensuring innovation, continuous improvement, maintainability, supportability, and performance of the shared data platform
  • Fulfilling reporting and ad hoc data requests from Portfolio Managers and stakeholders, and presenting results clearly

Job requirements

  • Degree in Computer Science, Software Engineering, Data Science, Statistics/Mathematics, or equivalent
  • 3+ years of experience as a Data Engineer, BI Developer, or in a similar position
  • Understanding of data warehousing/ETL concepts, or experience working on similar projects
  • Strong experience and knowledge of cloud automation, particularly on the AWS stack, with significant expertise in SQL, Python and/or R, and familiarity with contemporary data science libraries
  • Previous experience defining and implementing solution designs, development standards, data quality measurements, end-to-end development processes, and similar practices for data platforms and data integration platforms
  • Demonstrated experience leading and troubleshooting incidents and problems relating to data platforms and data integration platforms
  • Experience working with data lakes, data marts, data warehouses, and other data platforms
  • Good written and verbal communication skills
  • Self-driven and capable of handling complex logic and multiple projects simultaneously
  • Proven experience working autonomously with disciplined time management
  • A drive to learn and master new technologies and techniques

 

If this opportunity is of interest, please contact us.
