Data Engineer (Lambda) (Remote - Anywhere) at Jobgether

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer (Lambda), open to candidates worldwide.

As a Data Engineer, you will be at the forefront of building and maintaining the infrastructure that powers large-scale, multi-chain crypto data systems. You will design and optimize ETL pipelines, develop high-performance data models, and ensure data reliability for real-time and historical analytics. This role offers the opportunity to work on complex blockchain datasets, integrating multiple sources and building automation tools that improve operational efficiency. You will collaborate closely with a globally distributed team of engineers and analysts, applying best practices in data engineering, automation, and scalability. The position provides autonomy, exposure to cutting-edge technologies, and the chance to influence critical data workflows that drive decision-making for investors and institutions. Your work will directly impact the quality and accessibility of crypto financial data.

Accountabilities:
- Design, maintain, and scale streaming ETL pipelines for blockchain and multi-chain data.
- Build and optimize ClickHouse data models, materialized views, and transformations for high-performance analytics.
- Develop and maintain data exporters and orchestration workflows, ensuring accuracy and timeliness.
- Implement testing, monitoring, automation, and migration processes to enhance pipeline reliability.
- Aggregate multiple data sources, including third-party indexers and Kafka topics, into structured tables for API consumption.
- Collaborate with analysts and engineering teams to deliver reliable and scalable data services.

Requirements:
- 4+ years of experience in Data Engineering, with a focus on ETL/ELT, streaming systems, and data pipelines.
- Strong SQL skills with columnar databases such as ClickHouse, Druid, or BigQuery.
- Hands-on experience with streaming frameworks such as Flink or Kafka.
- Proficiency in Python for data engineering, automation, and backend services.
- Experience delivering production-grade pipelines and data features on schedule.
- Strong focus on automation, reliability, maintainability, and documentation.
- Startup mindset with the ability to balance speed, quality, and innovation.
- Familiarity with multi-chain crypto ecosystems and blockchain data structures is a plus.
- Knowledge of CI/CD for data pipelines, testing frameworks, and ClickHouse performance tuning is desirable.

Company Location: Germany.
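For illustration only: the duty of aggregating multiple data sources into structured tables might, in its simplest form, look like the minimal Python sketch below. It merges rows from a hypothetical third-party indexer feed and a Kafka-derived feed (both simulated here as plain lists of dicts); every field and function name is an assumption for this sketch, not part of the posting.

```python
from collections import defaultdict

def aggregate_sources(indexer_rows, kafka_rows):
    """Merge rows from two hypothetical sources into structured
    records keyed by (chain, block) -- an illustrative stand-in
    for joining a third-party indexer with a Kafka topic.

    Assumed schemas:
      indexer_rows: [{"chain": str, "block": int, "tx_count": int}]
      kafka_rows:   [{"chain": str, "block": int, "volume": float}]
    """
    merged = defaultdict(dict)
    for row in indexer_rows:
        key = (row["chain"], row["block"])
        merged[key]["tx_count"] = row["tx_count"]
    for row in kafka_rows:
        key = (row["chain"], row["block"])
        # Sum volume in case the topic delivers partial updates
        merged[key]["volume"] = merged[key].get("volume", 0.0) + row["volume"]
    # Emit one structured row per (chain, block), sorted for determinism
    return [
        {"chain": chain, "block": block, **fields}
        for (chain, block), fields in sorted(merged.items())
    ]
```

In a production pipeline this merge logic would typically live in ClickHouse (e.g. materialized views over Kafka-engine tables) rather than in application code; the sketch only shows the shape of the aggregation step.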