GCP Data Engineer (Snowflake, Airflow, Agent Development) - Remote at Mindex


Founded in 1994 and celebrating 30 years in business, Mindex is a software development company with a rich history of demonstrated software and product development success. We specialize in agile software development, cloud professional services, and creating our own innovative products. We are proud to be recognized as the #1 Software Developer in the 2023 RBJ Book of Lists and ranked 27th in Rochester Chamber's Top 100 Companies. Additionally, we have maintained our certification as a Great Place to Work for consecutive years. Our lists of satisfied clients and #ROCstar employees are both rapidly growing. Are you next to join our team?

Mindex's Software Development division is the go-to software developer for enterprise organizations looking to engage teams of skilled technical resources to help them plan, navigate, and execute through the full software development lifecycle.

We seek a skilled Google Cloud Platform Data Engineer to join our team.

Essential Functions

The GCP Data Engineer will develop data integration processes supporting client data initiatives. This includes ensuring the accuracy and consistency of business data across systems, with extensive use of Python, SQL, and modern data engineering frameworks. The role also requires providing technical support during business hours.

Key Responsibilities:

- Develop an understanding of the data environment through profiling and analysis to enhance data quality.
- Build Python-based solutions for data extraction, cleansing, transformation, and validation to support data migration.
- Document data integration processes, ensure traceability, and develop data monitoring solutions.
- Collaborate with architects on solution integration, ensuring alignment with company standards.
- Manage data tools and platforms to keep software and infrastructure properly updated.
- Work with the Data Management Organization to align with data quality improvement objectives.
- Ensure solutions meet Service Level Agreements, with capacity and performance considerations.

Qualifications:

- Bachelor's Degree in Computer Science or equivalent experience preferred.
- 4 years of experience in software engineering with a strong focus on Python and data engineering.
- Advanced proficiency in Python and agent development, with strong problem-solving skills, required.
- 3 years of development experience and proficiency with relational databases, NoSQL, and/or data lakehouses (Snowflake, Microsoft Fabric, Databricks).
- Intermediate proficiency in cloud technologies, including Google Cloud Platform, required.
- Working knowledge of REST standards preferred.
- Experience with data quality, data integration, and data processing.
- Proficiency with data movement solutions such as Informatica IICS, Fivetran, and Airbyte.
- Experience with data transformation solutions, such as dbt.
- Experience with workflow orchestration tools such as Airflow (or Astronomer), Dagster, and Prefect.
- Familiarity with CI/CD and container technology preferred.
- Understanding of AI technologies for development and proficiency in prompting.

Physical Conditions/Requirements

- Prolonged periods sitting at a desk and working on a computer.
- No heavy lifting is expected; exertion of up to 10 lbs.

Company Location: United States.