Data Engineer at OnBuy

Who are OnBuy?

OnBuy are an online marketplace on a mission to be the best choice for every customer, everywhere. We have recently been named one of the UK's fastest-growing tech companies in the Sunday Times 100 Tech list. These are achievements we are very proud of, but we don't let them go to our heads. We are all laser-focused on our mission and understand the huge joint effort ahead of us that is needed to succeed.

Working at OnBuy:

We are a team of driven and motivated people who thrive when working at pace. To succeed at OnBuy you need to take charge and fully own your responsibilities, rolling your sleeves up when needed to 'get it done'. Working at OnBuy you are surrounded by opportunity, but you must be able to stay focused and prioritise ruthlessly. Most importantly, you will thrive in an ever-changing environment, as we are constantly evolving.

At OnBuy, you're not just a number or another cog in a machine. We are creating something really special, and you have the opportunity to effect meaningful change and have your voice heard. We are a close team, with the opportunity to learn and grow as OnBuy evolves.

About the Role

As our Data Engineer, your key responsibilities will be:

- Developing and scaling analytics infrastructure using modern cloud-based data platforms and tooling (e.g., BigQuery, Snowflake, Databricks).
- Designing, building, and maintaining robust data pipelines to ingest, transform, and deliver high-quality datasets for analytics and reporting.
- Owning and evolving the semantic data layer, ensuring clean, well-modelled datasets that enable self-serve analytics and data-driven decision making.
- Collaborating with the analytics team, business stakeholders and the tech function to understand requirements and deliver scalable solutions that meet business needs.
- Driving innovation through the development of data products, such as feature stores, automated insights, or ML-ready datasets.

What we are looking for:

- Hands-on experience developing and managing cloud-based data warehousing environments (BigQuery, Snowflake, Redshift).
- Practical experience across GCP services including IAM, Cloud Run, Artifact Registry, GKE, BigQuery, GCS, and Datastream.
- An understanding of data orchestration (Apache Airflow or other DAG-focused solutions preferable).
- Knowledge of ETL/ELT tools and software such as Airbyte, Fivetran or Stitch.
- Experience with containerisation and orchestration (Docker, Kubernetes, Helm).
- An understanding of CI/CD workflows (GitLab CI/CD, GitHub Actions preferred).
- The ability to create and manage multiple data pipelines through development environments into production.
- A basic understanding of MySQL architecture for application data replication purposes.
- Experience of extracting data from REST APIs and ingesting the results into warehousing environments (a minimal illustrative sketch follows this list).
- Basic GCP administration experience (working knowledge of Terraform would be a nice-to-have).
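As a loose illustration of the kind of work described in the list above, the sketch below pages through a hypothetical REST API and loads the rows into a BigQuery table. The endpoint, table name and pagination scheme are invented for the example and are not OnBuy systems; it assumes default Google Cloud credentials are available and that the target table already exists.

```python
"""Illustrative sketch only: extract rows from a hypothetical REST API
and ingest them into a BigQuery table."""
import requests
from google.cloud import bigquery

API_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint
TABLE_ID = "my-project.analytics.orders_raw"    # hypothetical table

def fetch_orders(page_size: int = 500) -> list[dict]:
    """Page through the API until an empty page is returned."""
    rows, page = [], 1
    while True:
        resp = requests.get(
            API_URL,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            return rows
        rows.extend(batch)
        page += 1

def load_to_bigquery(rows: list[dict]) -> None:
    """Stream the rows into BigQuery; the table must already exist."""
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert errors: {errors}")

if __name__ == "__main__":
    load_to_bigquery(fetch_orders())
```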
Coding Skills:

- SQL: the ability to write complex SQL queries for building normalised data models.
- Python: working experience, including the ability to write DAGs that extract data from third-party APIs (see the DAG skeleton at the end of this posting).
- Experience with version control using Git.
- An understanding of data security, cloud permission management and data storage (cross-country/continent).

Company Location: United Kingdom.
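On the Python point above, a minimal Airflow 2.x DAG skeleton of the kind implied (a daily extract from a third-party API followed by a load step) might look like the following. The DAG id, endpoint and task bodies are placeholders for illustration, not an actual OnBuy pipeline.

```python
"""Illustrative Airflow DAG skeleton only: a daily extract-and-load job
against a hypothetical third-party API. Assumes Airflow 2.4+."""
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    """Pull records from the (hypothetical) API; the return value goes to XCom."""
    resp = requests.get("https://api.example.com/v1/orders", timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])

def load(**context):
    """Read the extracted rows from XCom; a real task would write them to the warehouse."""
    rows = context["ti"].xcom_pull(task_ids="extract") or []
    print(f"Would load {len(rows)} rows into the warehouse")

with DAG(
    dag_id="example_third_party_ingest",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                    # use schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```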