Sr. Data Engineer at Oscilar

Location: Remote - United States

Shape the future of trust in the age of AI. At Oscilar, we're building the most advanced AI Risk Decisioning™ Platform. Banks, fintechs, and digitally native organizations rely on us to manage their fraud, credit, and compliance risk with the power of AI. If you're passionate about solving complex problems and making the internet safer for everyone, this is your place.

Why join us:

- Mission-driven teams: Work alongside industry veterans from Meta, Uber, Citi, and Confluent, all united by a shared goal to make the digital world safer.
- Ownership and impact: We believe in extreme ownership. You'll be empowered to take responsibility, move fast, and make decisions that drive our mission forward.
- Innovate at the cutting edge: Your work will shape how modern finance detects fraud and manages risk.

Job Description

As a Senior Data Engineer at Oscilar, you will be responsible for designing, building, and maintaining the data infrastructure that powers our AI-driven decisioning and risk management platform. You will collaborate closely with cross-functional teams to deliver highly reliable, low-latency, and scalable data pipelines and storage solutions that support real-time analytics and mission-critical ML/AI models.

Responsibilities

- Architect and implement scalable ETL and data pipelines spanning ClickHouse, Postgres, Athena, and diverse cloud-native sources to support real-time risk management and advanced analytics for AI-driven decisioning.
- Design, develop, and optimize distributed data storage solutions that deliver both high performance (low latency, high throughput) and reliability at scale, serving mission-critical models for fraud detection and compliance.
- Drive schema evolution, data modeling, and advanced optimizations for analytical and operational databases, including sharding, partitioning, and pipeline orchestration (batch, streaming, and CDC frameworks).
- Own the end-to-end data flow: integrate multiple internal and external data sources, enforce data validation and lineage, and automate and monitor workflow reliability (CI/CD for data, anomaly detection, etc.).
- Collaborate cross-functionally with engineers, product managers, and data scientists to deliver secure, scalable solutions that enable fast experimentation and robust operationalization of new ML/AI models.
- Champion radical ownership: identify opportunities, propose improvements, and implement innovative technical and process solutions within a fast-moving, remote-first culture.
- Mentor and upskill team members, cultivate a learning environment, and contribute to a collaborative, mission-oriented culture.

Qualifications

- 5+ years in data engineering (or equivalent), including architecting and operating production ETL/ELT pipelines for real-time, high-volume analytics platforms.
- Deep proficiency with ClickHouse, Postgres, Athena, and distributed data systems (Kafka, cloud-native stores); proven experience with both batch and streaming pipeline design.
- Advanced programming in Python and SQL, with bonus points for Java; expertise in workflow orchestration (Airflow, Step Functions), CI/CD, and automated testing for data (see the orchestration sketch after this list).
- Experience in high-scale, low-latency environments; understanding of security, privacy, and compliance requirements for financial-grade platforms.
- Strong communication, business alignment, and documentation abilities; capable of translating complex technology into actionable value for customers and stakeholders.
- Alignment with Oscilar’s values: customer obsession, radical ownership, bold vision, efficient growth, and unified teamwork with a culture of trust and excellence.
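To give a concrete flavor of the orchestration work named above, here is a minimal sketch of a batch pipeline, assuming Airflow 2.4+; the DAG name, task names, and extract/load helpers are hypothetical illustrations, not part of this posting:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events():
    """Hypothetical helper: pull the latest batch of risk events from an upstream source."""


def load_to_clickhouse():
    """Hypothetical helper: bulk-insert the extracted batch into ClickHouse."""


with DAG(
    dag_id="daily_risk_events",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_events)
    load = PythonOperator(task_id="load", python_callable=load_to_clickhouse)
    extract >> load  # load runs only after extraction succeeds
```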
Nice-to-have

- Experience integrating Kafka with analytics solutions like ClickHouse (a minimal sketch follows this list).
- Knowledge of event-driven architecture and streaming patterns such as CQRS and event sourcing.
- Hands-on experience with monitoring tools (e.g., Prometheus, Grafana, Kafka Manager).
- Experience automating infrastructure with tools like Terraform or CloudFormation.
- Proficiency with Postgres, Redis, ClickHouse, and DynamoDB.
- Experience with data modeling, query optimization, and high-transaction databases.
- Familiarity with encryption, role-based access control, and secure API development.
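As an illustration of the first nice-to-have, here is a minimal sketch of streaming events from Kafka into ClickHouse, assuming the kafka-python and clickhouse-driver packages; the topic, consumer group, table, and column names are all hypothetical:

```python
import json

from kafka import KafkaConsumer
from clickhouse_driver import Client

consumer = KafkaConsumer(
    "risk-events",                    # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="clickhouse-loader",     # hypothetical consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    enable_auto_commit=False,         # commit offsets only after a successful insert
)
ch = Client(host="localhost")

batch = []
for message in consumer:
    event = message.value
    batch.append((event["event_id"], event["user_id"], event["amount"]))
    if len(batch) >= 1000:  # micro-batch: ClickHouse favors fewer, larger inserts
        ch.execute(
            "INSERT INTO risk_events (event_id, user_id, amount) VALUES",
            batch,
        )
        consumer.commit()   # offsets advance only after the insert succeeds
        batch.clear()
```

Committing offsets only after the insert yields at-least-once delivery; exact-once semantics would require deduplication downstream, for example a ReplacingMergeTree table keyed on the event ID.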