Welcome to Inference-X Community
What This Is
A 305KB binary that runs any AI model on any hardware. Built in Morocco. Maintained by admins across 11 continental regions (cratons). Open forever.
Quick Start
# Download at https://inference-x.com — or build from source:
# git clone https://git.inference-x.com/elmadani/inference-x
ix download llama3.2-1b
ix --model ~/.inference-x/models/llama3.2-1b.gguf
Or try the free cloud demo: https://build.inference-x.com (no install)
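The commands above assume the `ix` binary is on your PATH and the model landed in the default `~/.inference-x/models/` directory. A minimal pre-flight check, sketched under those assumptions (the paths come from the Quick Start commands, not from any project requirement):

```shell
# Hypothetical pre-flight check: confirm ix is installed and the
# model file from `ix download llama3.2-1b` exists before running.
check_ready() {
  MODEL="$HOME/.inference-x/models/llama3.2-1b.gguf"
  if command -v ix >/dev/null 2>&1 && [ -f "$MODEL" ]; then
    echo "ready"
  else
    echo "not ready: install ix and download the model first"
  fi
}
check_ready
```

If the check prints "ready", `ix --model "$MODEL"` should start the engine with the downloaded model.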
How to Contribute
- Fork any repo at git.inference-x.com
- Build your change on a feature branch
- Open a PR with a clear description
- Your regional craton admin reviews it
- The founder approves core changes to protected branches
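The branch-and-PR steps above can be sketched with standard git commands. This is a runnable local demonstration of the feature-branch flow; the branch name and commit message are illustrative, not project conventions (for a real contribution you would clone your fork from git.inference-x.com instead of creating a throwaway repo):

```shell
# Demonstrate the feature-branch flow on a throwaway local repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "you@example.com"   # placeholder identity
git config user.name "You"
git commit -q --allow-empty -m "initial commit"

# Build your change on a feature branch (name is illustrative).
git checkout -q -b feature/my-change
echo "the fix" > change.txt
git add change.txt
git commit -q -m "Clear description of the change"

# Push the branch to your fork, then open the PR in the web UI.
git log --oneline -1
```

After pushing the branch to your fork, open the PR with the same clear description so your craton admin can review it.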
Repository Map
| Repo | Purpose |
|---|---|
| inference-x | The engine (C++, 305KB) |
| ix-tools | CLI, forge, installer, organ packer |
| .community | This hub: onboarding, discussions |
| governance | Rules, roles, voting protocols |
| ix-roadmap | What comes next |
Questions
Open an issue. Your craton admin will route it correctly.
Built by many. For all.