docs: add SPONSOR.md, fix broken links, harmonize licenses

Author: Elmadani
Date: 2026-02-24 22:23:41 +00:00
Parent: 8d1c85bd27
Commit: 7ecd533f6a
2 changed files with 105 additions and 1 deletion


````diff
@@ -76,7 +76,8 @@ Cross-platform build scripts for the IX engine:
 ```bash
 # Universal installer
-curl -fsSL https://inference-x.com/install.sh | bash
+# Download the binary from https://inference-x.com
+# Or build from source: git clone https://git.inference-x.com/elmadani/inference-x
 # Manual
 git clone https://git.inference-x.com/elmadani/ix-tools
````

SPONSOR.md (new file, 103 lines)
# Support Inference-X
> *"AI should flow like water — to anyone who needs it, without gatekeepers, without permission."*
---
## What you're supporting
**Inference-X** is a 305 KB inference engine built from scratch in Morocco.
No framework. No cloud. No dependencies. One binary that runs any AI model — on your laptop, your phone, your Raspberry Pi, your datacenter — with the same result.
This is not a product. It is infrastructure. The kind that takes years to build alone.
---
## The project ecosystem
| Tool | What it does |
|------|-------------|
| **[inference-x](https://git.inference-x.com/elmadani/inference-x)** | Core engine — 305 KB, 19 hardware backends, 23 quant formats. Fused dequant+dot kernels. Zero buffer. |
| **[organ-architecture](https://git.inference-x.com/elmadani/organ-architecture)** | Neural surgery — extract, measure, and graft components between models. Build composite intelligence. |
| **[forge](https://git.inference-x.com/elmadani/ix-tools)** | Model construction pipeline — compile, sign, distribute custom model variants. |
| **[echo-ix](https://git.inference-x.com/elmadani/echo-ix)** | Relay layer — intelligent request routing across distributed inference nodes. |
| **[EchoNet](https://inference-x.com#echonet)** | Federated inference network — share idle compute, earn credits. The distributed fabric. |
All of this is **one person building one vision**: AI inference that belongs to everyone.
---
## The vision in one paragraph
The world's best AI models exist. The weights are trained. The intelligence is there.
What stands between those weights and a student in Casablanca, a doctor in Dakar, a developer in Jakarta — is the **infrastructure layer**. The engine that actually runs the model. Inference-X removes every unnecessary step in that path. Fused computation. Adaptive precision. Zero telemetry. One file.
The store will let anyone deploy a node, contribute compute, and earn from it. Like a cooperative. 11 cratons. One network.
---
## How your support is used
Every euro funds **two things only**:
- **Server infrastructure** — €53/month keeps inference-x.com, build.inference-x.com, and git.inference-x.com running. These are the arteries.
- **Development time** — The engine, the organ pipeline, the forge tools, the store architecture. All of this is built alone, in the margins of everything else.
There is no team. There is no funding round. There is no VC timeline.
There is one engineer who decided that this infrastructure should exist, and is building it.
---
## Ways to help
### Direct support
**PayPal** → [paypal.me/elmadanisalka](https://paypal.me/elmadanisalka)
Choose your amount. Even €5 covers a day of server time.
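For scale, the €53/month figure above works out to under €2 of server time per day. A minimal sketch of that arithmetic (the 30-day month is an approximation):

```python
# Rough cost-per-day arithmetic for the €53/month server figure above.
MONTHLY_COST_EUR = 53.0  # inference-x.com, build.inference-x.com, git.inference-x.com
DAYS_PER_MONTH = 30      # approximation

daily_cost = MONTHLY_COST_EUR / DAYS_PER_MONTH
print(f"Daily server cost: ~€{daily_cost:.2f}")            # ~€1.77
print(f"A €5 donation covers ~{5 / daily_cost:.1f} days")  # ~2.8 days
```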
### Technical contribution
The cratons need administrators. Each of the 11 geological regions that organize the Inference-X community needs someone who understands their region, speaks the local languages, and can review contributions.
**→ Apply for your craton**: write to [Elmadani.SALKA@proton.me](mailto:Elmadani.SALKA@proton.me) with subject `Craton Application — [Region Name]`
Or open a pull request. The roadmap is public. Pick a task.
### Signal amplification
If you believe AI should be sovereign infrastructure — follow, share, discuss:
**→ [@ElmadaniSa13111 on X](https://x.com/ElmadaniSa13111)**
One person building this in public. Every share reaches someone who might help build it.
---
## The license model
Inference-X uses **BSL-1.1** (Business Source License).
- **Free forever** for individuals, researchers, students, open-source projects, and organizations under $1M annual revenue.
- **Commercial license** required above $1M. 20% of that revenue returns to the community.
- **Apache 2.0** from February 12, 2030 — fully open, no restrictions.
This is not charity infrastructure. It is a sustainable model: those who profit from it contribute to it. Those who don't, use it freely.
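The tiers above can be sketched as a small decision helper. This is an illustrative sketch of the rules as stated in this document, not the project's actual licensing tooling; the function name and the strict `< $1M` boundary are assumptions:

```python
from datetime import date

APACHE_DATE = date(2030, 2, 12)    # BSL change date stated above
REVENUE_THRESHOLD_USD = 1_000_000  # free-use revenue ceiling stated above

def required_license(annual_revenue_usd: float, today: date) -> str:
    """Illustrative sketch of the BSL-1.1 tiers described in this document."""
    if today >= APACHE_DATE:
        # After the change date, the code relicenses to Apache 2.0 for everyone.
        return "Apache-2.0"
    if annual_revenue_usd < REVENUE_THRESHOLD_USD:
        # Individuals, researchers, students, OSS, and sub-$1M organizations.
        return "BSL-1.1 free tier"
    return "Commercial license"

print(required_license(50_000, date(2026, 2, 24)))     # BSL-1.1 free tier
print(required_license(5_000_000, date(2026, 2, 24)))  # Commercial license
print(required_license(5_000_000, date(2031, 1, 1)))   # Apache-2.0
```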
---
## Contact
| Channel | Address |
|---------|---------|
| X / Twitter | [@ElmadaniSa13111](https://x.com/ElmadaniSa13111) |
| Email | [Elmadani.SALKA@proton.me](mailto:Elmadani.SALKA@proton.me) |
| Gitea | [@elmadani](https://git.inference-x.com/elmadani) |
| Website | [inference-x.com](https://inference-x.com) |
---
*Built in Morocco. For everyone.*
*Salka Elmadani — 2024–2026*