Senior DWH Engineer at Gypsy Collective

🧭 What makes you a great match

- Proficient in SQL: complex queries, CTEs, window functions, analytics
- Deep understanding of DWH concepts: ETL, ELT, Data Vault, Kimball, star/snowflake schemas
- Experience with Airflow, dbt, or other pipeline orchestrators
- Proficiency in one or more DWH platforms: BigQuery, Snowflake, Redshift, ClickHouse, Vertica, etc.
- Proficient with Git; experience helping colleagues with Git flow (merge conflicts, rebases, pull requests)
- Knowledge of Python or another scripting language for transformations
- Understanding of server infrastructure: basic skills in configuring, maintaining, and monitoring resources and controlling load

1. DWH architecture and design

- Designing Data Warehouse architecture for current and future business needs
- Developing schemas with attention to performance, scalability, and data historicity (SCD, snapshots)
- Defining standards and best practices for data storage, transformation, and access
- Participating in planning cloud migrations

2. Organization of ETL/ELT processes

- Building, maintaining, and optimizing ETL/ELT pipelines (Airflow, dbt, custom solutions)
- Implementing incremental updates, CDC, backfills, and reprocessing
- Controlling and automating data lineage, logging, and alerting

3. Performance optimization

- Deep optimization of queries, tables, and DAGs
- Implementing batching, indexes, materialized views, and clustering
- Managing server resources; monitoring and balancing load
- Building ETL performance metrics and auditing them regularly

4. Data quality control and reliability

- Implementing data validation, anomaly detection, and reconciliation
- Setting up automated tests for ETL processes
- Managing backup & recovery policies
- Identifying and eliminating duplicates, null values, and data drift

5. Integration of new data sources
- Evaluating and connecting external APIs, raw sources, and third-party databases
- Harmonizing formats, update frequency, and transformation logic
- Adapting database schemas to new sources without disrupting current processes

6. DevOps and automation

- Automating deployments, testing, and CI/CD for data
- Working with Docker, Kubernetes, and cloud infrastructure (GCP, AWS, Azure)
- Working with Terraform
- Using Git and code review processes to manage pipelines

7. Mentoring and coordination

- Code review, support, and training of junior/middle engineers
- Maintaining documentation, templates, and onboarding instructions
- Collaborating with analysts, developers, and the BI team
- Working with business stakeholders to understand their needs and translating them into technical requirements

Company Location: Ukraine.
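To illustrate the SQL proficiency the posting asks for (CTEs plus window functions), here is a minimal sketch, run against SQLite for portability. The table, column names, and data are hypothetical, not from the posting; the same query shape applies on BigQuery, Snowflake, or Redshift.

```python
# Hypothetical example: a CTE aggregates revenue per customer/month,
# then ROW_NUMBER() ranks each customer's months by revenue.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, '2024-01-05', 100.0),
  (1, '2024-02-10', 150.0),
  (2, '2024-01-20',  80.0);
""")

rows = conn.execute("""
WITH monthly AS (
    SELECT customer_id,
           substr(order_date, 1, 7) AS month,
           SUM(amount)              AS revenue
    FROM orders
    GROUP BY customer_id, month
)
SELECT customer_id, month, revenue,
       ROW_NUMBER() OVER (
           PARTITION BY customer_id ORDER BY revenue DESC
       ) AS revenue_rank
FROM monthly
ORDER BY customer_id, revenue_rank
""").fetchall()

for row in rows:
    print(row)
```

The CTE keeps the aggregation readable and the window function ranks within each partition without collapsing rows, which is exactly where GROUP BY alone falls short.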
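Section 2 mentions incremental updates; a common minimal approach is a watermark-based extract that pulls only rows newer than the last processed timestamp. This sketch uses hypothetical field names, and a real pipeline would persist the watermark in pipeline state rather than pass it in memory.

```python
# Hypothetical watermark-based incremental extract.
def incremental_extract(source_rows, last_watermark):
    """Return rows strictly newer than the watermark, plus the new watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-03-01"},
]
rows, wm = incremental_extract(source, "2024-02-01")
print(len(rows), wm)
```

CDC tools replace the timestamp comparison with log-based change capture, but the watermark bookkeeping (track what was loaded, advance only on success) stays conceptually the same.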
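Section 4's duplicate and null detection can be sketched in a few lines of plain Python. Function and field names here are illustrative only; in practice such checks usually live in dbt tests or an Airflow validation task.

```python
# Hypothetical data-quality checks: duplicate keys and missing required fields.
from collections import Counter

def find_duplicates(rows, key):
    """Return key values that appear more than once."""
    counts = Counter(r[key] for r in rows)
    return sorted(k for k, n in counts.items() if n > 1)

def find_null_fields(rows, required):
    """Return (row_index, field) pairs where a required field is None."""
    return [(i, f) for i, r in enumerate(rows) for f in required if r.get(f) is None]

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 1, "email": "c@example.com"},  # duplicate id
]

print(find_duplicates(records, "id"))
print(find_null_fields(records, ["email"]))
```

At warehouse scale the same checks are expressed as SQL (e.g. GROUP BY ... HAVING COUNT(*) > 1) so they run where the data lives, with the orchestrator alerting on any non-empty result.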