# Welcome to Inference-X Community

## What This Is

A 305KB binary that runs any AI model on any hardware.

Built in Morocco. Maintained by 11 continental regions (cratons). Open forever.
## Quick Start

```bash
curl -fsSL https://inference-x.com/install.sh | bash
ix download llama3.2-1b
ix --model ~/.inference-x/models/llama3.2-1b.gguf
```

Or try the free cloud demo at https://build.inference-x.com (no install needed).
## How to Contribute

1. Fork any repo at git.inference-x.com
2. Build your change on a feature branch
3. Open a PR with a clear description
4. Your regional craton admin reviews it
5. The founder approves core changes to protected branches
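Steps 2 and 3 above can be sketched with plain git. The repository, branch name, and file below are illustrative stand-ins: in practice you would clone your fork from git.inference-x.com instead of creating a throwaway local repo.

```shell
# Sketch of the feature-branch flow (steps 2-3), simulated locally.
set -e
demo=$(mktemp -d)
cd "$demo"
git init -q
# Stand-in for the upstream history you would get by cloning your fork.
git -c user.name=You -c user.email=you@example.com \
    commit -q --allow-empty -m "initial commit"
# Step 2: build your change on a feature branch (name is illustrative).
git checkout -q -b feature/my-change
echo "describe your change" > CHANGE.md
git add CHANGE.md
git -c user.name=You -c user.email=you@example.com \
    commit -qm "Add CHANGE.md with a clear description"
git branch --show-current   # prints "feature/my-change"
```

In a real contribution you would then push the branch to your fork (`git push origin feature/my-change`) and open the PR from there (step 3).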
## Repository Map

| Repo | Purpose |
|------|---------|
| inference-x | The engine (C++, 305KB) |
| ix-tools | CLI, forge, installer, organ packer |
| .community | This hub: onboarding, discussions |
| governance | Rules, roles, voting protocols |
| ix-roadmap | What comes next |
## Questions

Open an issue. Your craton admin will route it correctly.

---

Built by many. For all.