Senior Data Engineer (English Required) at DaCodes


Work at DaCodes!

We are a team of experts in software development and high-impact digital transformation. For over 10 years, we've created technology and innovation-driven solutions thanks to our team of 220+ talented #DaCoders, including developers, architects, UX/UI designers, PMs, QA testers, and more. Our team integrates into projects with clients across LATAM and the United States, delivering outstanding results.

At DaCodes, you'll accelerate your professional growth by collaborating on diverse projects across various industries and sectors. Working with us will make you versatile and agile, giving you the opportunity to work with cutting-edge technologies and collaborate with top-level professionals.

Our DaCoders play a crucial role in the success of our business and that of our clients. You'll become the expert contributing to our projects while gaining access to disruptive startups and global brands. Does this sound interesting to you? We're looking for talent to join our team. Let's work together!

The ideal candidate brings a unique mix of technical experience, curiosity, a logical and analytical mindset, proactivity, ownership, and a passion for teamwork.

We are looking for a Senior Data Engineer to join our team and help design, build, and optimize data pipelines for large-scale applications. The ideal candidate has strong experience in data architecture, ETL/ELT processes, cloud platforms, and distributed systems.

This role requires expertise in handling big data, real-time processing, and data lakes while ensuring scalability, performance, and security. The candidate should be comfortable working in a fast-paced, agile environment and collaborating with data scientists, analysts, and software engineers to deliver high-quality data solutions.

Required Qualifications

🔹 5+ years of experience in data engineering, data architecture, or backend development.
🔹 Strong expertise in SQL and NoSQL databases (PostgreSQL, MySQL, MongoDB, DynamoDB, etc.).
🔹 Cloud expertise with AWS (preferred), GCP, or Azure.
🔹 Proficiency in Python, Java, or Scala for data processing and pipeline development.
🔹 Experience with big data frameworks like Apache Spark, Hadoop, or Flink.
🔹 Hands-on experience with ETL/ELT processes and data pipeline orchestration tools (Apache Airflow, dbt, Luigi, or Prefect).
🔹 Experience with message queues and streaming technologies (Kafka, Kinesis, Pub/Sub, or RabbitMQ).
🔹 Knowledge of containerization and orchestration tools (Docker, Kubernetes).
🔹 Strong problem-solving skills and the ability to optimize performance and scalability.
🔹 English proficiency (B2 or higher) to collaborate with international teams.

Nice-to-Have Skills (Preferred)

✅ Experience with data lakehouse architectures (Delta Lake, Iceberg, Hudi).
✅ Familiarity with Machine Learning (ML) and AI-related data workflows.
✅ Experience with Infrastructure as Code (Terraform, CloudFormation) for managing data environments.
✅ Knowledge of data security and compliance regulations (GDPR, CCPA, HIPAA).

Key Responsibilities

✅ Design, develop, and maintain scalable and efficient data pipelines for batch and real-time processing.
✅ Build and optimize data lakes, warehouses, and analytics solutions on cloud platforms (AWS, GCP, or Azure).
✅ Implement ETL/ELT workflows using tools such as Apache Airflow, dbt, or Prefect.
✅ Ensure data integrity, consistency, and governance through proper architecture and best practices.
✅ Integrate data from various sources (structured and unstructured), including APIs, streaming services, and databases.
✅ Work with data scientists and analysts to ensure high availability and accessibility of data for analytics and machine learning models.
✅ Monitor, troubleshoot, and improve the performance of data pipelines.
✅ Implement security best practices for data access, encryption, and compliance.
✅ Collaborate with software engineers to integrate data pipelines into applications and services.
✅ Stay up to date with the latest trends in big data, cloud technologies, and data engineering best practices.

Company Location: Mexico.