SAP Legacy Integration Developer at Enterprise Horizon Consulting Group


Company Overview

Enterprise Horizon Consulting Group (EHCG) is a Woman-Owned Small Business specializing in IT consulting that has successfully delivered key capabilities to the Navy, Army, and NASA over the past 20+ years. EHCG provides best-in-class services to its customers in the following areas: Business Systems Services; Business Intelligence; Data Analytics and Dashboarding; Enterprise Resource Planning (SAP) Implementation; Legacy System Optimization; Digital Transformation; Cloud Migration; Integration and Modernization; and Risk Management Framework (RMF) Processes.

Job Description

Enterprise Horizon Consulting Group is seeking a highly skilled SAP/Legacy Integration Developer to join our team in support of our DoD customer. This role is responsible for designing, building, and deploying highly reliable MuleSoft APIs that connect mission-critical systems to both traditional applications and advanced Agentic AI services. The developer will ensure that supply chain data is securely unlocked, transformed, and delivered to power modern AI-driven decision support systems.

Key Responsibilities

1. Core Logistics Data Integration

● SAP Connectivity: Design and implement robust integration flows to extract and ingest data using the MuleSoft SAP Connector (RFC, BAPI, IDoc) for both synchronous and asynchronous transactions.
● Legacy Data Transformation: Develop complex DataWeave 2.0 scripts to translate, cleanse, and canonicalize data between DoD logistics standards and internal JSON/XML formats.
● Reliability & Resilience: Implement advanced error handling, queue-based patterns (e.g., Anypoint MQ store-and-forward), and guaranteed-delivery mechanisms to ensure no mission-critical requisition or inventory update is ever lost.
● Testing & Quality: Write comprehensive MUnit test suites to achieve mandated code coverage (90%+) and validate complex business logic against DoD process rules.

2. Agentic AI Data Serving & Protocol

● Model Context Protocol (MCP) Server Development: Design and build specialized MuleSoft APIs (acting as "MCP servers") optimized to serve contextual data directly to Agentic AI systems and RAG pipelines.
● Vector & RAG Database Access: Implement connectivity patterns to secure vector databases and other specialized data stores that house indexed, vectorized logistics information.
● Efficient Data Retrieval: Tune APIs for the ultra-low-latency data retrieval that AI models require, focusing on minimizing payload size and optimizing database queries (SQL, NoSQL, or graph) originating from the Mule flow.
● Security for AI: Enforce strict access control, tokenization, and data-masking policies within the API layer.
● Metadata Management: Ensure that all served data includes rich, accurate metadata (timestamps, source system, confidence score) to improve the grounding and auditability of AI-generated responses.

Qualifications

● Active Secret clearance.
● Bachelor's degree in Computer Science, Engineering, or a related field is preferred.
● IAT Level II baseline certification required (Security+ or an allowed substitution).
● MuleSoft Certified Developer – Level 2 (MCD-Level 2) is highly desired.
● Minimum 5 years of professional integration development experience, with at least 2 years focused on SAP or equivalent ERP integration.
● Expert proficiency in Anypoint Studio, DataWeave 2.0, API Manager, and Anypoint Exchange.
● Proven ability to integrate with SAP using different communication protocols (IDoc, BAPI/RFC, OData).
● Advanced SQL skills, experience with connectors for traditional RDBMS (Oracle, SQL Server), and exposure to NoSQL or vector databases.
● Experience with Git and Maven, and familiarity with CI/CD pipelines (GitLab/Jenkins) for automated deployment to Runtime Fabric.
● A demonstrated ability to quickly understand complex, decades-old logistics data models and translate them into modern API specifications.

Company Location: United States.
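To illustrate the Metadata Management responsibility above, here is a minimal sketch of wrapping a served logistics record with the metadata the posting lists (timestamp, source system, confidence score). Field names are hypothetical, and in practice this transformation would be written in DataWeave 2.0 inside a Mule flow rather than Python:

```python
from datetime import datetime, timezone

def enrich_with_metadata(payload: dict, source_system: str, confidence: float) -> dict:
    """Wrap a record in an envelope carrying the metadata fields named in the
    posting: retrieval timestamp, source system, and confidence score.
    All field names here are illustrative, not a DoD or MuleSoft standard."""
    return {
        "data": payload,
        "metadata": {
            # UTC timestamp so downstream AI consumers can reason about freshness
            "retrievedAt": datetime.now(timezone.utc).isoformat(),
            # Which backend (e.g., an SAP instance) the record came from
            "sourceSystem": source_system,
            # Rounded confidence score to support grounding/auditability
            "confidenceScore": round(confidence, 2),
        },
    }

# Example: a requisition record sourced from a hypothetical SAP system
record = enrich_with_metadata({"niin": "012345678", "qty": 40}, "SAP-ECC", 0.973)
```

An envelope like this lets an Agentic AI consumer cite the source system and retrieval time of every fact it uses, which is the auditability goal the responsibility describes.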