Snowflake Data Architect (Remote - US) at Jobgether


Jobgether is a talent matching platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching. One of our partner companies is currently looking for a Snowflake Data Architect in the United States.

We're seeking a highly experienced freelance Snowflake Data Architect to lead the optimization and re-architecture of an advanced reporting layer within a Snowflake data warehouse. This is a fully remote contract role (40 hours/week) lasting at least 6 months. You'll play a critical part in enhancing compute efficiency, reducing costs, and supporting analytics engineering initiatives that drive business intelligence and reporting. You'll collaborate with engineering, analytics, and business teams to implement performance improvements without compromising data integrity.

Accountabilities:

- Optimize Snowflake data warehouse reporting layers to improve cost and compute efficiency
- Enhance query performance and reduce data processing time through strategic configuration
- Design and implement efficient data models and schemas to support analytics and reporting
- Collaborate with data engineers on efficient ETL/data transformation processes
- Implement Snowflake best practices for data storage, loading, retrieval, and caching
- Develop materialized views and caching strategies to support high-performance reporting
- Support compute tuning through execution plan analysis and configuration management
- Partner with business stakeholders to ensure optimized data access aligns with reporting needs

Requirements:

- 4–5+ years of experience in data architecture or analytics engineering, particularly with Snowflake
- Advanced proficiency in SQL and Snowflake-specific optimization techniques
- Strong understanding of Snowflake architecture, including virtual warehouses, storage, and clustering
- Experience optimizing query execution, compute utilization, and data processing efficiency
- Familiarity with data modeling and performance tuning in large-scale analytics environments
- Proven track record of working with ETL pipelines and transforming data into accessible formats
- Strong communication skills for cross-functional collaboration and problem-solving
- Must be based in the United States and available for 40 hours/week of contract work

Company Location: United States