Data Engineer (ETL & Analytics Infrastructure) at CodeRoad


Location: Latin America

The Team

At CodeRoad, we're more than just a software development company; we're your gateway to the global tech world. Whether you're looking to skill up or level up your career, we offer the challenges you've been searching for.

We provide end-to-end software development services and give you the opportunity to work on exciting, real-world projects in a supportive environment. Whether it's staff augmentation, dedicated IT teams, or general software engineering, we have opportunities for everyone to challenge themselves and take their career to the next level!

About the Role

We are seeking a highly skilled Data Engineer to design and implement robust data flows from various internal and external systems and APIs into our centralized data warehouse. This role is critical in supporting our client's analytics, KPI reporting, and executive dashboards. The ideal candidate will have extensive experience building scalable ETL/ELT pipelines within the AWS ecosystem.

Key Responsibilities:

- Data Integration: Develop and maintain custom connectors using Airbyte to extract data from various systems and third-party APIs.
- Pipeline Development: Build and optimize data transformation pipelines using AWS Glue (Spark/PySpark).
- Storage Architecture: Design and manage data storage solutions using AWS S3 Tables and Apache Iceberg to ensure high-performance data retrieval.
- Query Optimization: Structure data to enable efficient AWS Athena queries for downstream analytics and reporting.
- Collaboration: Work closely with the internal analytics team to ensure data quality and alignment with KPI reporting requirements.

Technical Requirements:

- 3-5 years of experience.
- Airbyte Expertise: Proven experience setting up connections to external systems and developing custom connectors.
- AWS Data Suite: Strong proficiency in AWS Glue, S3 Tables, and Athena.
- Programming: Advanced Python skills, specifically for data transformation via PySpark.
- Modern Table Formats: Hands-on experience with Apache Iceberg for managing large datasets in S3.
- API Management: Deep understanding of REST/SOAP APIs for data extraction.

Preferred Qualifications:

- Experience building data warehouses from the ground up for a mid-sized organization.
- Strong understanding of data modeling for Business Intelligence (BI) tools.
- Self-starter mindset with the ability to work independently in a contract capacity.

What you'll love:

- 100% Remote
- Holidays off
- Paid time off
- Health insurance assistance program
- Competitive pay (USD)
- Excellent teamwork and work environment
- Training