NOTICE — Inference-X
════════════════════════════════════════════════════════════════

Inference-X — Universal Inference Protocol
Copyright (C) 2024-2026 Salka Elmadani. All rights reserved.

Licensed under the Business Source License 1.1 (BSL-1.1).
See the LICENSE file for complete terms.

────────────────────────────────────────────────────────────────
AUTHOR
────────────────────────────────────────────────────────────────
Author:      Salka Elmadani
Location:    Morocco
Contact:     Elmadani.SALKA@proton.me
Website:     https://inference-x.com
Repository:  https://git.inference-x.com/salka/inference-x
Origin:      Morocco 🇲🇦

────────────────────────────────────────────────────────────────
INTELLECTUAL PROPERTY
────────────────────────────────────────────────────────────────
INPI eSoleau:  7phf-Ueye-2nWr-Vsgu (February 16, 2026)
License:       BSL-1.1 → Apache 2.0 (February 12, 2030)
Protection:    Berne Convention, TRIPS, CPI, DMCA

────────────────────────────────────────────────────────────────
DESCRIPTION
────────────────────────────────────────────────────────────────
Universal inference protocol. Routes any AI model to any silicon.
305 KB binary, zero dependencies, C++17.
19 hardware backends, 23 quantization formats.
6 model architectures, OpenAI-compatible API, cross-platform.
Built in Morocco for every device on the planet.
────────────────────────────────────────────────────────────────