# Compare commits

1 commit: `65171cff24`
```diff
@@ -2,7 +2,7 @@
 ## Creator & Lead Developer
 
 - **Salka Elmadani** — Architecture, implementation, and all original code
-- Git: [@elmadani](https://git.inference-x.com/elmadani)
+- GitHub: [@ElmadaniS](https://git.inference-x.com/salka)
 - Email: Elmadani.SALKA@proton.me
 
 ## Infrastructure Partners
```
```diff
@@ -8,7 +8,7 @@
 Inference-X is a tiny file (305 KB) that lets any computer run AI models locally. It works on old laptops, phones, Raspberry Pi, and datacenters — same file, no setup. Your questions stay on your machine. Nobody sees them.
 
-**[Website](https://inference-x.com)** · **[How it works](TECHNOLOGY.md)** · **[Benchmarks](BENCHMARKS.md)** · **[Vision](VISION.md)** · **[Sponsor](SPONSOR.md)**
+**[Website](https://inference-x.com)** · **[How it works](TECHNOLOGY.md)** · **[Benchmarks](BENCHMARKS.md)** · **[Vision](VISION.md)** · **[Sponsor](https://git.inference-x.com/salka)**
 
 ---
```
```diff
@@ -193,4 +193,4 @@ Built in Morocco for the world by [Salka Elmadani](https://x.com/ElmadaniSa13111
 > *The shortest path between model weights and output produces the cleanest signal. Every buffer removed, every conversion eliminated, every unnecessary step subtracted — each one brings the output closer to what the model actually learned. The path itself is the filter.*
 
-**[Website](https://inference-x.com)** · **[Sponsor](SPONSOR.md)** · **[Contact](mailto:Elmadani.SALKA@proton.me)**
+**[Website](https://inference-x.com)** · **[Sponsor](https://git.inference-x.com/salka)** · **[Contact](mailto:Elmadani.SALKA@proton.me)**
```
**SPONSOR.md** (123 lines, deleted — `@@ -1,123 +0,0 @@`):
# Salka Elmadani — Building Inference-X

> *The best engine is the one you don't notice.*
> *You should hear the model, not the framework.*

---

I build AI infrastructure. Not products, not demos, not wrappers around someone else's API. Infrastructure — the kind that runs without permission, works without cloud, and belongs to anyone who needs it.

**Inference-X** is a 305 KB binary that runs any AI model on any hardware. No framework. No internet. No account. Download a model, run it, talk to it. That's it.

I built it alone. I'm still building it alone. This page is why.

---
## What I'm building

The problem isn't the models. The models are extraordinary. The problem is the layer between the weights and the human — the inference stack. It's bloated, cloud-dependent, and controlled by a handful of companies.

I'm replacing that layer with something minimal, open, and community-owned.

```
Standard engine path:
weights → framework → dequant buffer → matmul → buffer → output
~100 MB binary. 5 steps. Rounding errors at each boundary.

Inference-X:
weights → fused dequant+dot → output
305 KB binary. 2 steps. Zero buffer. Zero noise.
```

Same model. Cleaner signal. Every unnecessary step removed.
---

## The ecosystem

| Project | What it does | Status |
|---------|-------------|--------|
| **[inference-x](https://git.inference-x.com/elmadani/inference-x)** | Core engine — 305 KB, 19 hardware backends, 23 quant formats, fused kernels, adaptive precision | ✅ Live |
| **forge** | Model construction pipeline — compile, quantize, sign, distribute. Build your own model variant from certified organs. | 🔨 Building |
| **[echo-ix](https://git.inference-x.com/elmadani/echo-ix)** | Distributed relay — intelligent routing across local inference nodes | ✅ Live |
| **store** | Anyone deploys a node. Anyone earns from their compute. The cooperative layer. 11 geological cratons. One network. | 📐 Designed |

The store is the endgame: a peer-to-peer inference network where anyone with a laptop can become infrastructure. No data center required.
---

The intelligence already exists in the model weights. What I'm building is the canal — the shortest, cleanest path from those weights to the human who needs them.

---
## Who this is free for

**Everyone who isn't extracting commercial value from it:**

- Individuals and researchers — forever free
- Students — forever free
- Open-source projects — forever free
- Organizations under $1M revenue — forever free

**Commercial users above $1M revenue** pay a license. 20% of that flows back to the community that built the infrastructure.

In 2030, it all becomes Apache 2.0. Everything open. The canal belongs to everyone.

This isn't charity. It's a sustainable model — those who profit from it fund it. Those who don't, use it freely.
---

## Why I need support

Servers cost money. The current infrastructure — [inference-x.com](https://inference-x.com), [build.inference-x.com](https://build.inference-x.com), [git.inference-x.com](https://git.inference-x.com) — runs on €53/month.

More importantly: time. The engine, the organ pipeline, the forge tools, the store architecture — this is one engineer, building in the margins of everything else.

There is no team. No VC. No roadmap driven by investor pressure.

There is one person who decided this infrastructure should exist.

---
## How to help

### Build with me

The most valuable contribution is code. The project is open, the roadmap is public, and good engineers are always welcome.

**→ Pick a task**: [git.inference-x.com/elmadani/inference-x](https://git.inference-x.com/elmadani/inference-x)
**→ Administer a craton**: Each of the 11 community regions needs a technical lead. Write to [Elmadani.SALKA@proton.me](mailto:Elmadani.SALKA@proton.me) — subject: `Craton — [your region]`

### Sustain the infrastructure

**PayPal** → [paypal.me/elmadanisalka](https://paypal.me/elmadanisalka)

€5 = one day of server time. €53 = one month of everything running.

### Amplify

Every post that reaches a developer who cares about AI sovereignty is one more person who might build the next piece.

**→ [Follow on X: @ElmadaniSa13111](https://x.com/ElmadaniSa13111)**

---
## Contact

I respond to everyone who writes with something real to say.

| | |
|--|--|
| **X** | [@ElmadaniSa13111](https://x.com/ElmadaniSa13111) — fastest response |
| **Email** | [Elmadani.SALKA@proton.me](mailto:Elmadani.SALKA@proton.me) — for technical discussions, partnerships, craton applications |
| **Code** | [@elmadani on Gitea](https://git.inference-x.com/elmadani) |
| **Web** | [inference-x.com](https://inference-x.com) |

---

*Morocco → the world.*
*Salka Elmadani, 2024–2026*
````diff
@@ -181,7 +181,7 @@ Kimi K2.5 on Inference-X:
 ## Try it
 
 ```bash
-git clone https://git.inference-x.com/elmadani/inference-x
+git clone https://git.inference-x.com/salka/inference-x
 cd inference-x
 make
 ./inference-x model.gguf -p "Hello"
````
```diff
@@ -41,10 +41,10 @@ namespace ix {
 // WATERMARK — SALKA ELMADANI SIGNATURE (Do not modify)
 // ═══════════════════════════════════════════════════════════════════════════════
 namespace signature {
-static constexpr double S0 = 5.999160064733103e+18; // Integrity coefficient α
-static constexpr double S1 = 5.566805661683622e+18; // Integrity coefficient β
-static constexpr double S2 = 5.426309097159753e+18; // Integrity coefficient γ
-static constexpr double S3 = 4.991471925827590e+18; // Integrity coefficient δ
+static constexpr double S0 = 5.999160064733103e+18; // "SALKA EL"
+static constexpr double S1 = 5.566805661683622e+18; // "MADANI E"
+static constexpr double S2 = 5.426309097159753e+18; // "LMADANI"
+static constexpr double S3 = 4.991471925827590e+18; // "CREATOR"
 
 inline bool verify() {
 volatile double sum = S0 + S1 + S2 + S3;
```diff
@@ -224,7 +224,7 @@ struct block_q8_1 {
 };
 
-// STATIC ASSERT: Block sizes must match GGUF binary format exactly
+// Z-VERIFY: Block sizes must match GGUF binary format exactly
 static_assert(sizeof(block_q4_K) == 144, "block_q4_K size mismatch!");
 static_assert(sizeof(block_q8_0) == 34, "block_q8_0 size mismatch!");
 static_assert(sizeof(block_q6_K) == 210, "block_q6_K size mismatch!");
```
```diff
@@ -1,5 +1,4 @@
 // ═══════════════════════════════════════════════════════════════════════════════
 // INFERENCEX — Expert Profiler
 // Copyright (C) 2025-2026 Salka Elmadani. All rights reserved.
 // Licensed under the Business Source License 1.1 (BSL-1.1)
 // See LICENSE file for full terms. Morocco.
```
@ -81,7 +80,6 @@ public:
|
||||
FILE* f = fopen(path, "w");
|
||||
if (!f) return;
|
||||
|
||||
fprintf(f, "# IX Expert Profile | %lu tokens\n\n",
|
||||
(unsigned long)total_tokens_);
|
||||
|
||||
for (int l = 0; l < n_layers_; ++l) {
|
||||
|
||||
@ -33,7 +33,6 @@ namespace ix {
|
||||
namespace identity {
|
||||
|
||||
// Author identity — cryptographic anchor
|
||||
// Author identity — compile-time cryptographic anchor
|
||||
// Split into 4x64-bit for integration into dispatch math
|
||||
static constexpr uint64_t ANCHOR_A = 0x9F3A7B2E1D4C6F08ULL;
|
||||
static constexpr uint64_t ANCHOR_B = 0x5E8D2A9C4B7F1036ULL;
|
||||
|
||||
@ -668,7 +668,6 @@ public:
|
||||
}
|
||||
}
|
||||
|
||||
// EXPERT PROFILING
|
||||
void dump_csv(const char* path) const {
|
||||
FILE* fp = fopen(path, "w");
|
||||
if (!fp) return;
|
||||
|
||||