Senior Data Platform Engineer (Remote - Namer) at Jobgether

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Senior Data Platform Engineer in North America.

We are seeking a highly skilled Senior Data Platform Engineer to design, build, and maintain scalable data infrastructure that supports complex, high-volume financial transactions and analytics. This role involves managing and evolving a modern data lakehouse architecture, ensuring efficient data ingestion, transformation, and delivery for internal teams and external partners. You will collaborate with cross-functional stakeholders across product, operations, and engineering, driving data initiatives that impact decision-making and product performance. The ideal candidate thrives in a fast-paced, distributed environment, brings deep technical expertise in data engineering, and is passionate about building robust, scalable systems. This is an opportunity to influence the platform’s data strategy and make a measurable impact on the organization’s growth.

Accountabilities:
- Design and implement forward and reverse ETL processes to deliver data to relevant stakeholders.
- Develop scalable transformation patterns to ensure consistent integrations with BI tools across multiple business areas.
- Maintain and enhance the data lakehouse architecture to support growing volumes of transactional, operational, and third-party data.
- Collaborate with product, operations, sales, and marketing teams to address data flow and reporting requirements.
- Monitor, operate, and troubleshoot production systems, ensuring high availability and performance.
- Contribute to data experimentation, cataloging, and monitoring practices to maintain data quality and observability.

Requirements:
- 7+ years of experience in data engineering, including 2+ years building scalable, low-latency data platforms handling over 100M events/day.
- Strong programming skills, with proficiency in Python and SQL.
- Experience with cloud-native technologies such as Docker, Kubernetes, and Helm.
- Hands-on experience with relational databases and building transformation layers (e.g., dbt).
- Familiarity with ETL tools such as Airflow and Airbyte, and streaming systems like Kafka.
- Knowledge of distributed systems, storage, transactions, and query processing.
- Exposure to infrastructure, DevOps, and Infrastructure as Code (IaC) practices.
- Ability to work independently in a fast-paced, remote, and distributed environment.
- Strong problem-solving skills, with the ability to adapt solutions to evolving business requirements.

Company Location: Canada.