Data Engineer – GCP at Expleo


Location: Remote (Rochester, MN, United States)
Employment Type: Full-Time Consultant

Overview

Are you passionate about building scalable data pipelines and enabling machine learning solutions? Trissential is hiring a Data Engineer to join our client's high-impact team. This is your chance to work on cutting-edge GCP technologies, drive innovation, and make a real difference in how data powers decisions.

What's in It for You?

- Innovative Projects – Work on advanced analytics and machine learning initiatives that drive real-world impact.
- Modern Tech Stack – Leverage the latest in GCP, Terraform, Python, and more.
- Collaborative Culture – Join a team that values knowledge sharing, mentorship, and continuous learning.
- Career Growth – Opportunities to expand your skills and grow into leadership roles.

Your Role & Responsibilities

- Design, build, and deploy scalable data pipelines and integrations to support analytics and ML applications.
- Implement ETL/ELT best practices, data governance, and data quality standards.
- Collaborate with product owners and analytics teams to identify data needs and deliver actionable insights.
- Perform data modeling and transformation on large datasets for optimal performance.
- Automate manual processes and drive continuous improvement using standard frameworks.
- Document functional and technical specifications for data solutions.
- Mentor junior engineers and contribute to team knowledge sharing.
- Stay current with emerging technologies and recommend enhancements.

Skills & Experience You Should Possess

- 5+ years of hands-on experience in data engineering or related roles.
- 3+ years building and maintaining automated data pipelines (batch and/or streaming).
- Strong programming skills in SQL and Python.
- Advanced experience with GCP tools such as BigQuery, Dataflow, Dataproc, Data Fusion, change data streams, Cloud Functions, Cloud Composer, and cloud events.
- Terraform for infrastructure as code.
- Experience with data quality, governance, and performance optimization.

Bonus Points If You Have

- Experience working in Agile product teams.
- Familiarity with machine learning workflows and model deployment.
- Prior consulting or cross-functional stakeholder engagement experience.

Education & Certifications You Need

- Bachelor's degree in Computer Science, Information Technology, or a related field, OR equivalent experience (5+ years in data engineering).
- GCP certification (required).

What We Offer

At Trissential, we believe in rewarding talent and supporting your success:

- Competitive Salary – You choose the compensation model that works best for you, both with company-sponsored benefits: up to $122,000 annually or $69 per hour, based on skills and experience.
- Comprehensive Benefits for you and your dependents – Medical, dental, vision, free tele-health, HSA with company contribution, life and disability insurance, and 401(k) with matching.
- Paid Time Off – Both compensation models offer paid time away from work.
- Remote Work – 100% remote with equipment provided.
- Career Development – Access to training, certifications, and mentorship.

This role is only open to candidates authorized to work in the U.S. at this time.

Ready to Build the Future of Data?

If you're excited about working with cutting-edge cloud technologies and making a real impact, we want to hear from you. Apply now and join Trissential in transforming data into action!