
Data Engineer (Remote - US) at Jobgether

This position is posted by Jobgether on behalf of Monte Carlo. We are currently looking for a Data Engineer in the United States.

As a Data Engineer, you will play a key role in building and scaling the data pipelines that power critical product features and analytics. You will work closely with cross-functional teams to ensure high-quality, reliable, and actionable data across the organization. The role involves designing robust ETL processes, optimizing data workflows, and supporting model productionization to drive business insights. You will leverage modern cloud technologies to deliver scalable solutions while continuously improving reliability and performance. With opportunities to influence internal practices and provide feedback for product development, your work will directly impact both internal operations and customer-facing analytics. The ideal candidate thrives in a collaborative, fast-paced environment where innovation and data-driven decision-making are core.

Accountabilities

· Design, build, and maintain scalable, high-performance data pipelines and orchestration workflows.
· Ingest, transform, and process raw data into clean, usable datasets for analytics and model training.
· Preprocess metrics and metadata to support anomaly detection, lineage, and other data models.
· Collaborate with Data Science and Analytics teams to productionize models and dashboards.
· Continuously improve pipeline reliability, leveraging internal usage of the platform to identify enhancements.
· Manage and create data platform resources, including database objects and cloud infrastructure.
· Participate in code reviews, documentation, and data engineering best practices.

Requirements

· 5+ years of experience building production-grade data pipelines and backend services.
· Strong expertise in Python and SQL; familiarity with PySpark is a plus.
· Experience with cloud platforms, particularly AWS, and with distributed architectures.
· Hands-on experience with modern data warehouses such as Snowflake, BigQuery, or Redshift.
· Knowledge of pipeline orchestration and processing tools (Airflow, Spark) and IaC solutions (Terraform, CloudFormation) preferred.
· Strong ownership mindset, sense of urgency, and a customer-focused approach to problem-solving.
· Ability to collaborate effectively across teams and communicate technical concepts clearly.

Company Location: United States.