
Data Engineer (Databricks) (Remote - LATAM) at Jobgether

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer (Databricks) in Latin America.

This role offers a unique opportunity to design and implement high-performance data solutions on the Databricks platform, supporting business-critical analytics and cloud-based data ecosystems. You will manage complex datasets, build scalable ETL pipelines, and collaborate with cross-functional teams to translate raw data into actionable insights. The position requires both hands-on technical expertise and strategic thinking, allowing you to shape the data architecture, optimize processes, and drive innovative solutions. You will play a key role in enabling data-driven decisions across diverse teams while staying at the forefront of cloud and big data technologies. This is a fully remote role in a dynamic, fast-paced environment with ample opportunities for professional growth and impact.

Accountabilities

- Design, implement, and maintain ETL pipelines in the Databricks environment for both batch and real-time data workflows.
- Develop Databricks solutions that integrate seamlessly with the existing data architecture while ensuring scalability and performance.
- Optimize Databricks databases, including data structures, indexes, and query performance.
- Monitor and report on system performance, ensuring data quality, integrity, and adherence to QA processes.
- Analyze large datasets and communicate insights effectively to business stakeholders through clear storytelling.
- Collaborate with data engineers, analysts, and other teams to align on data strategy and best practices.
- Provide guidance on Databricks features, optimization techniques, and cost-efficiency opportunities.

Requirements

- 4+ years of experience as a Data Engineer, preferably with hands-on experience in Databricks.
- 4+ years of expertise in SQL, data modeling, and ETL processes.
- 2+ years of experience with Python for data processing and automation.
- Proficiency with Databricks features: Delta Lake, PySpark, MLflow, Databricks SQL, Unity Catalog, and Job Scheduling & Workflows.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud and related data integration tools.
- Strong problem-solving skills, attention to detail, and the ability to work independently or collaboratively in fast-paced environments.
- Excellent communication skills, with the ability to convey complex data concepts to non-technical stakeholders.
- Experience in agency settings or working with external clients is a plus.
- English proficiency at C1 level.

Preferred Qualifications

- Knowledge of data visualization tools such as Tableau or Power BI.
- Understanding of data governance and security best practices.

Company Location: Colombia.