NOTICE — Inference-X
════════════════════════════════════════════════════════════════
Inference-X — Universal Inference Protocol
Copyright (C) 2024-2026 Salka Elmadani. All rights reserved.
Licensed under the Business Source License 1.1 (BSL-1.1).
See the LICENSE file for complete terms.
────────────────────────────────────────────────────────────────
AUTHOR
────────────────────────────────────────────────────────────────
Author: Salka Elmadani
Location: Morocco
Contact: Elmadani.SALKA@proton.me
Website: https://inference-x.com
Repository: https://github.com/ElmadaniS/inference-x
────────────────────────────────────────────────────────────────
INTELLECTUAL PROPERTY
────────────────────────────────────────────────────────────────
INPI eSoleau: 7phf-Ueye-2nWr-Vsgu (February 16, 2026)
License: BSL-1.1, converting to Apache 2.0 on the Change Date (February 12, 2030)
Protection: Berne Convention, TRIPS, CPI (French Intellectual Property Code), DMCA
────────────────────────────────────────────────────────────────
DESCRIPTION
────────────────────────────────────────────────────────────────
Universal inference protocol. Routes supported model architectures to any supported silicon.
305 KB binary, zero dependencies, C++17.
19 hardware backends, 23 quantization formats.
6 model architectures, OpenAI-compatible API, cross-platform.
Built in Morocco for every device on the planet.
────────────────────────────────────────────────────────────────
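The "OpenAI-compatible API" above implies the server accepts standard chat-completion requests. A minimal sketch of building such a request body follows; the endpoint URL, port, and model name are assumptions for illustration only and are not documented in this NOTICE.

```python
import json

# Hypothetical local endpoint -- host, port, and path are assumptions,
# chosen only because OpenAI-compatible servers conventionally expose
# /v1/chat/completions.
ENDPOINT = "http://localhost:8080/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "default") -> str:
    """Build an OpenAI-compatible chat-completion request body as JSON."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return json.dumps(payload)

# Example: the serialized body a client would POST to ENDPOINT.
body = build_chat_request("Hello")
print(body)
```

Any existing OpenAI client library pointed at such an endpoint would emit an equivalent payload.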