
GCP Data Engineer (Remote - India) at Jobgether

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a GCP Data Engineer in India.

This role offers the opportunity to design, develop, and maintain a robust data ecosystem on Google Cloud Platform (GCP), supporting large-scale analytics and operational needs. You will be responsible for end-to-end data pipeline creation, data modeling, and database optimization, working closely with clients and cross-functional teams. The position requires strong technical expertise in Python, SQL, and cloud-based systems, along with experience in data quality frameworks and performance tuning. This is an excellent opportunity for a hands-on engineer who enjoys solving complex data problems, mentoring junior team members, and contributing to high-impact projects in a flexible, remote-first environment. You will directly influence the quality, efficiency, and scalability of the company's cloud data infrastructure.

Accountabilities

- Design, build, and deploy scalable data pipelines on GCP to ingest and process data from multiple sources such as databases, APIs, and streaming platforms.
- Model, document, and optimize data storage architectures, including data lakes, warehouses, and distributed file systems.
- Implement performance optimization techniques such as partitioning, indexing, compression, and caching to improve data retrieval and processing efficiency.
- Conduct data quality checks using frameworks such as CloudDQ or PyDeequ, ensuring high standards across the data ecosystem.
- Troubleshoot and resolve complex data processing and infrastructure issues, supporting seamless operations and new feature delivery.
- Mentor and guide junior data engineers, sharing expertise in cloud data engineering best practices and tools.

Requirements

- 8+ years of experience as a Data Engineer, with hands-on expertise in Python, SQL, and object-oriented programming.
- Strong experience with Google Cloud Platform services, including BigQuery, Cloud Storage, and Cloud Functions.
- Experience with workflow orchestration tools such as Apache Airflow or Cloud Composer; knowledge of Dataplex is a plus.
- Proficiency in data modeling, database optimization, query tuning, and performance monitoring.
- Experience integrating data from multiple sources using ETL/ELT techniques.
- Familiarity with data quality frameworks, best practices, and dimensional modeling.
- Strong analytical skills, problem-solving abilities, and the ability to work directly with clients to gather and analyze requirements.

Company Location: India.