
Data Engineer (5–7 yrs) (AWS, Databricks, SQL) - Remote India at Jobgether. This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer (5–7 yrs) (AWS, Databricks, SQL) in India.

We are seeking an experienced Data Engineer to design, build, and optimize modern data pipelines and architectures that power analytics, AI, and business intelligence initiatives. In this role, you will work closely with product, software, and analytics teams to deliver high-quality, reliable data solutions. You will be responsible for managing data workflows in Databricks and AWS, ensuring strong governance, performance, and scalability. This position offers the opportunity to tackle complex data challenges in a fully remote, collaborative environment while contributing to impactful projects across e-commerce and other domains. The ideal candidate thrives in a fast-paced, innovative culture and enjoys solving technical challenges with creativity and precision.

Accountabilities
- Design, implement, and maintain data pipelines using Databricks and AWS services (S3, Glue, Lambda, Redshift).
- Architect and manage a Medallion architecture (Bronze, Silver, and Gold layers) within Databricks.
- Implement and maintain Unity Catalog and Delta Tables, and enforce robust data governance and lineage.
- Develop and optimize SQL queries for large-scale datasets, ensuring performance and efficiency.
- Design and maintain data models to support analytical and reporting requirements.
- Implement Slowly Changing Dimensions (SCD) and apply normalization/denormalization techniques for optimal data storage and retrieval.
- Collaborate with data scientists, analysts, and business stakeholders to deliver actionable data solutions.
- Identify and implement optimization techniques for query performance and resource usage.

Requirements
- 5–7 years of hands-on experience in Data Engineering with AWS and Databricks.
- Strong proficiency in SQL and data modeling best practices.
- Expertise in Python or PySpark for data transformation and ETL processes.
- Experience with Medallion architecture, Unity Catalog, Delta Lake, and Delta Tables.
- Knowledge of SCDs, data normalization/denormalization, and query optimization.
- Familiarity with BI tools (Power BI, Tableau) is a plus.
- Experience with CI/CD pipelines, Terraform, or DevOps workflows for data engineering is desirable.
- Strong problem-solving and analytical skills, and the ability to work in cross-functional teams.
- Excellent communication skills in English.

Benefits
Company Location: India.
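For candidates unfamiliar with the Slowly Changing Dimension work mentioned above: in a Databricks pipeline this would typically be done with a Delta Lake MERGE, but the underlying SCD Type 2 logic can be sketched in plain Python. Everything below (the `apply_scd2` function and the `valid_from`/`valid_to`/`is_current` columns) is an illustrative assumption, not part of this posting:

```python
from datetime import date

def apply_scd2(dim_rows, updates, key, tracked, as_of):
    """Minimal SCD Type 2 upsert over a dimension held as a list of dicts.

    dim_rows: existing dimension rows carrying 'valid_from', 'valid_to',
              and 'is_current' history columns.
    updates:  incoming source rows, one per business key.
    tracked:  attribute columns whose changes trigger a new version.
    as_of:    load date used to close and open versions.
    """
    result, current_by_key = [], {}
    for row in dim_rows:
        if row["is_current"]:
            current_by_key[row[key]] = row
        else:
            result.append(row)          # closed history rows pass through

    seen = set()
    for upd in updates:
        k = upd[key]
        seen.add(k)
        cur = current_by_key.get(k)
        if cur is None:
            # brand-new key: open a fresh current row
            result.append({key: k, **{c: upd[c] for c in tracked},
                           "valid_from": as_of, "valid_to": None,
                           "is_current": True})
        elif any(cur[c] != upd[c] for c in tracked):
            # tracked attribute changed: close the old row, open a new one
            result.append({**cur, "valid_to": as_of, "is_current": False})
            result.append({key: k, **{c: upd[c] for c in tracked},
                           "valid_from": as_of, "valid_to": None,
                           "is_current": True})
        else:
            result.append(cur)          # unchanged: keep the current row
    # keys with no incoming update keep their current row untouched
    for k, cur in current_by_key.items():
        if k not in seen:
            result.append(cur)
    return result
```

In Databricks the same pattern is usually expressed as a `MERGE INTO` against a Delta table, with the close-and-insert step handled by matched/not-matched clauses.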