Compare commits


8 Commits

5 changed files with 132 additions and 6 deletions


@@ -27,8 +27,8 @@ ix-tools/
 ├── scripts/              # Deployment & operations
 │   ├── install.sh        # Universal installer
-│   ├── deploy-oasis.sh   # Deploy to OASIS VPS
-│   ├── deploy-arche.sh   # Deploy to ARCHE VPS
+│   ├── deploy-node2.sh   # Deploy to production VPS
+│   ├── deploy-node1.sh   # Deploy to build VPS
 │   └── monitor.sh        # Health monitoring
 └── docs/                 # Documentation
@@ -76,7 +76,8 @@ Cross-platform build scripts for the IX engine:
 ```bash
 # Universal installer
-curl -fsSL https://inference-x.com/install.sh | bash
+# Download the binary from https://inference-x.com
+# Or build from source: git clone https://git.inference-x.com/elmadani/inference-x
 # Manual
 git clone https://git.inference-x.com/elmadani/ix-tools
@@ -103,3 +104,5 @@ MIT — Free for all use, commercial and personal.
 The IX engine itself uses the [SALKA-IX License](https://git.inference-x.com/elmadani/inference-x).
 ---
+
+*Built by the community — continuing the work of open infrastructure builders.*

SPONSOR.md (new file, 123 lines)

@ -0,0 +1,123 @@
# Salka Elmadani — Building Inference-X
> *The best engine is the one you don't notice.*
> *You should hear the model, not the framework.*
---
I build AI infrastructure. Not products, not demos, not wrappers around someone else's API. Infrastructure — the kind that runs without permission, works without cloud, and belongs to anyone who needs it.
**Inference-X** is a 305 KB binary that runs any AI model on any hardware. No framework. No internet. No account. Download a model, run it, talk to it. That's it.
I built it alone. I'm still building it alone. This page is why.
---
## What I'm building
The problem isn't the models. The models are extraordinary. The problem is the layer between the weights and the human — the inference stack. It's bloated, cloud-dependent, and controlled by a handful of companies.
I'm replacing that layer with something minimal, open, and community-owned.
```
Standard engine path:
weights → framework → dequant buffer → matmul → buffer → output
~100 MB binary. 5 steps. Rounding errors at each boundary.
Inference-X:
weights → fused dequant+dot → output
305 KB binary. 2 steps. Zero buffer. Zero noise.
```
Same model. Cleaner signal. Every unnecessary step removed.
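The difference between the two paths can be sketched in plain C. This is an illustration only, assuming simple int8 quantization with one float scale per 32-weight block; the function names and layout are hypothetical, not the actual Inference-X kernels:

```c
#include <stdint.h>

#define BLOCK 32  /* weights per quantization block (illustrative) */

/* Standard path: dequantize into a temporary buffer, then dot. */
float dot_two_step(const int8_t *qw, const float *scales,
                   const float *x, int n) {
    float buf[1024];                      /* intermediate dequant buffer */
    for (int i = 0; i < n; i++)
        buf[i] = scales[i / BLOCK] * qw[i];
    float acc = 0.0f;
    for (int i = 0; i < n; i++)
        acc += buf[i] * x[i];
    return acc;
}

/* Fused path: dequantize and accumulate in one pass, no buffer. */
float dot_fused(const int8_t *qw, const float *scales,
                const float *x, int n) {
    float acc = 0.0f;
    for (int b = 0; b < n / BLOCK; b++) {
        float block_acc = 0.0f;
        for (int i = 0; i < BLOCK; i++)
            block_acc += qw[b * BLOCK + i] * x[b * BLOCK + i];
        acc += scales[b] * block_acc;     /* one scale multiply per block */
    }
    return acc;
}
```

Both functions compute the same dot product; the fused version simply never materializes the dequantized weights, which is where the buffer and the extra rounding boundary disappear.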
---
## The ecosystem
| Project | What it does | Status |
|---------|-------------|--------|
| **[inference-x](https://git.inference-x.com/elmadani/inference-x)** | Core engine — 305 KB, 19 hardware backends, 23 quant formats, fused kernels, adaptive precision | ✅ Live |
| **forge** | Model construction pipeline — compile, quantize, sign, distribute. Build your own model variant from certified organs. | 🔨 Building |
| **[echo-ix](https://git.inference-x.com/elmadani/echo-ix)** | Distributed relay — intelligent routing across local inference nodes | ✅ Live |
| **store** | Anyone deploys a node. Anyone earns from their compute. The cooperative layer. 11 geological cratons. One network. | 📐 Designed |
The store is the endgame: a peer-to-peer inference network where anyone with a laptop can become infrastructure. No data center required.
---
The intelligence already exists in the model weights. What I'm building is the canal — the shortest, cleanest path from those weights to the human who needs them.
---
## Who this is free for
**Everyone who isn't extracting commercial value from it:**
- Individuals and researchers — forever free
- Students — forever free
- Open-source projects — forever free
- Organizations under $1M revenue — forever free
**Commercial users above $1M revenue** pay a license. 20% of that flows back to the community that built the infrastructure.
In 2030, it all becomes Apache 2.0. Everything open. The canal belongs to everyone.
This isn't charity. It's a sustainable model — those who profit from it fund it. Those who don't, use it freely.
---
## Why I need support
Servers cost money. The current infrastructure — [inference-x.com](https://inference-x.com), [build.inference-x.com](https://build.inference-x.com), [git.inference-x.com](https://git.inference-x.com) — runs on €53/month.
More importantly: time. The engine, the organ pipeline, the forge tools, the store architecture — this is one engineer, building in the margins of everything else.
There is no team. No VC. No roadmap driven by investor pressure.
There is one person who decided this infrastructure should exist.
---
## How to help
### Build with me
The most valuable contribution is code. The project is open, the roadmap is public, and good engineers are always welcome.
**→ Pick a task**: [git.inference-x.com/elmadani/inference-x](https://git.inference-x.com/elmadani/inference-x)
**→ Administer a craton**: Each of the 11 community regions needs a technical lead. Write to [Elmadani.SALKA@proton.me](mailto:Elmadani.SALKA@proton.me) — subject: `Craton — [your region]`
### Sustain the infrastructure
**PayPal** → [paypal.me/elmadanisalka](https://paypal.me/elmadanisalka)
€2 ≈ one day of server time. €53 = one month of everything running.
### Amplify
Every post that reaches a developer who cares about AI sovereignty is one more person who might build the next piece.
**→ [Follow on X: @ElmadaniSa13111](https://x.com/ElmadaniSa13111)**
---
## Contact
I respond to everyone who writes with something real to say.
| | |
|--|--|
| **X** | [@ElmadaniSa13111](https://x.com/ElmadaniSa13111) — fastest response |
| **Email** | [Elmadani.SALKA@proton.me](mailto:Elmadani.SALKA@proton.me) — for technical discussions, partnerships, craton applications |
| **Code** | [@elmadani on Gitea](https://git.inference-x.com/elmadani) |
| **Web** | [inference-x.com](https://inference-x.com) |
---
*Morocco → the world.*
*Salka Elmadani, 2024–2026*


@@ -3,7 +3,7 @@
 ## Infrastructure
 ```
-inference-x.com (ARCHE · OVH)        build.inference-x.com (OASIS · Hetzner)
+inference-x.com (NODE-1 · OVH)       build.inference-x.com (NODE-2 · Hetzner)
 ├── nginx reverse proxy              ├── ix-saas (Node.js, PM2, port 4080)
 ├── Gitea (port 3000)                ├── echo brain (port 8089)
 └── Site vitrine (HTML)              ├── invoke gateway (port 3001)


@@ -602,7 +602,7 @@ footer{border-top:1px solid var(--border);padding:2.5rem 1.5rem;max-width:1200px
 </div>
 <span class="repo-badge badge-apache">Public</span>
 </div>
-<div class="repo-desc">The founder's public work — mathematical frameworks, philosophical essays, project architecture documents. Understand the vision behind Inference-X: why it was built, where it's going, and the H5→H6 consciousness framework.</div>
+<div class="repo-desc">The founder's public work — mathematical frameworks, philosophical essays, project architecture documents. Understand the vision behind Inference-X: why it was built, where it's going, and the research, architecture documents, and mathematical work that shaped Inference-X. Open for community review.</div>
 <div class="repo-stats">
 <div class="repo-stat"><span id="repoStarsElm"></span></div>
 <div class="repo-stat">🍴 <span id="repoForksElm"></span></div>


@@ -615,7 +615,7 @@ footer{background:var(--bg2);border-top:1px solid var(--border);padding:3rem 1.5
 <div class="tool-card">
 <span class="tool-badge badge-coming">COMING</span>
 <div class="tool-icon">🎙</div>
-<div class="tool-name">GhostVoice</div>
+<div class="tool-name">EchoNet</div>
 <div class="tool-desc">Neural voice synthesis. Clone, create, share voice models. Same philosophy: local, private, yours.</div>
 </div>
 <div class="tool-card">