Backend Engineer, Data Platform (Remote - US) at Jobgether


This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Backend Engineer, Data Platform in the United States.

As a Backend Engineer on the Data Platform team, you will be instrumental in designing, building, and maintaining the large-scale data pipelines and analytics infrastructure that power critical internal and external products. You will work with batch and stream processing systems that handle trillions of events per day, ensuring data reliability, scalability, and performance. This role offers a unique opportunity to collaborate with cross-functional teams, implement innovative solutions, and optimize complex data workflows. You will also actively contribute to monitoring, troubleshooting, and improving production systems while shaping the standards and best practices for backend data engineering. The position emphasizes hands-on engineering, creative problem-solving, and ownership of critical data products in a dynamic environment.

Accountabilities:
- Build and expand data pipelines and products supporting experimentation, release observability, metrics, product analytics, and internal business intelligence.
- Collaborate with frontend engineers, product managers, and UX designers to deliver user-facing data features.
- Monitor, optimize, and maintain database and pipeline performance.
- Write unit, integration, and load tests to ensure high-quality data delivery.
- Participate in code reviews and provide feedback on technical proposals.
- Contribute to improving engineering standards, tooling, and development processes.
- Support production systems, including on-call rotations, to ensure reliability and performance.

Requirements:
- 5+ years of backend software engineering experience.
- At least 1 year of experience building data pipelines or data warehouse solutions.
- Demonstrated expertise with pipeline technologies such as Kinesis, Airflow, Spark, Lambda, Flink, and Athena.
- Experience working with event or analytical data in databases such as ClickHouse, Postgres, Elasticsearch, Timestream, or Snowflake.
- Strong foundation in computer science fundamentals, including data structures, distributed systems, concurrency, and threading.
- Familiarity with Infrastructure-as-Code tools (e.g., Terraform) and observability tools (e.g., Datadog).
- Commitment to writing maintainable, high-quality code and following engineering best practices.
- Excellent communication skills and ability to collaborate in a team-oriented environment.

Company Location: United States.