forked from elmadani/ix-tools
Compare commits
12 Commits
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
a8b91e6cd0 | ||
|
|
ced76a531f | ||
|
|
0914fc9755 | ||
|
|
8388021cb9 | ||
|
|
7ecd533f6a | ||
|
|
8d1c85bd27 | ||
|
|
bd3dc774bc | ||
|
|
38a6a53faa | ||
|
|
f78d405a25 | ||
|
|
5b3d4e0e1d | ||
|
|
6d966c85d1 | ||
|
|
7c1bcef395 |
121
README.md
121
README.md
@ -1,37 +1,108 @@
|
|||||||
# IX-Tools — Community Tools for Inference-X
|
# ix-tools — Inference-X Community Toolchain
|
||||||
|
|
||||||
Public tools, governance documents, and utilities for the [Inference-X](https://inference-x.com) ecosystem.
|
**Tools, scripts, and source files for the Inference-X ecosystem.**
|
||||||
|
|
||||||
> **Inference-X** is the universal AI inference engine — 19 hardware backends, 228KB binary, runs anywhere. Community-owned, creator-protected.
|
Built in Morocco · Community-maintained · MIT License
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
## Repository Structure
|
## Repository Structure
|
||||||
|
|
||||||
| Directory | Content |
|
```
|
||||||
|-----------|---------|
|
ix-tools/
|
||||||
| `docs/` | Architecture, manifesto, provider framework, revenue model |
|
│
|
||||||
| `governance/` | Community governance and contribution levels |
|
├── site/ # Frontend source files
|
||||||
| `license/` | SALKA-IX License v1.0 |
|
│ ├── vitrine/ # inference-x.com (site vitrine)
|
||||||
| `watermark/` | Attribution standard for all IX ecosystem projects |
|
│ │ └── index.html # Main site — community portal
|
||||||
|
│ └── saas/ # build.inference-x.com
|
||||||
## Quick Start
|
│ ├── index.html # SaaS frontend (demo + repos + builder)
|
||||||
|
│ └── demo_module.js # Free demo backend module
|
||||||
```bash
|
│
|
||||||
# Clone the engine
|
├── tools/ # Developer tools
|
||||||
git clone https://git.inference-x.com/elmadani/inference-x.git
|
│ ├── builder/ # IX config builder
|
||||||
|
│ ├── organ/ # Organ store API client
|
||||||
# Clone community tools
|
│ ├── forge/ # Model forge (quantize, convert, package)
|
||||||
git clone https://git.inference-x.com/elmadani/ix-tools.git
|
│ ├── store/ # Community model store tools
|
||||||
|
│ └── compilation/ # Cross-platform build scripts
|
||||||
|
│
|
||||||
|
├── scripts/ # Deployment & operations
|
||||||
|
│ ├── install.sh # Universal installer
|
||||||
|
│ ├── deploy-node2.sh # Deploy to production VPS
|
||||||
|
│ ├── deploy-node1.sh # Deploy to build VPS
|
||||||
|
│ └── monitor.sh # Health monitoring
|
||||||
|
│
|
||||||
|
└── docs/ # Documentation
|
||||||
|
├── ARCHITECTURE.md # System architecture
|
||||||
|
├── API.md # API reference
|
||||||
|
└── CONTRIBUTING.md # How to contribute
|
||||||
```
|
```
|
||||||
|
|
||||||
## Community
|
---
|
||||||
|
|
||||||
- Website: [inference-x.com](https://inference-x.com)
|
## Tools Overview
|
||||||
- Git: [git.inference-x.com](https://git.inference-x.com)
|
|
||||||
- Build: [build.inference-x.com](https://build.inference-x.com)
|
### 🔧 Builder
|
||||||
|
Generates ready-to-run `ix-config.json` for any hardware combination.
|
||||||
|
Supports: CPU/CUDA/Metal/Vulkan/ROCm/OpenCL × Linux/macOS/Windows × ARM/x86
|
||||||
|
|
||||||
|
### 🫀 Organ Store
|
||||||
|
Tools to package, publish, and install IX "organs" — pre-configured AI personas.
|
||||||
|
Format: `organ.json` + GGUF model + system prompt + config
|
||||||
|
|
||||||
|
### ⚒ Forge
|
||||||
|
Model conversion and quantization pipeline:
|
||||||
|
- GGUF conversion from HuggingFace SafeTensors
|
||||||
|
- Q4/Q5/Q8/F16 quantization
|
||||||
|
- Model packaging for the Store
|
||||||
|
|
||||||
|
### 🏪 Store Client
|
||||||
|
CLI and API client for the IX community model store:
|
||||||
|
- Browse, download, install models
|
||||||
|
- Publish your own trained/fine-tuned models
|
||||||
|
- Rate and review
|
||||||
|
|
||||||
|
### 🔨 Compilation
|
||||||
|
Cross-platform build scripts for the IX engine:
|
||||||
|
```bash
|
||||||
|
./tools/compilation/build-linux-x64.sh
|
||||||
|
./tools/compilation/build-macos-arm64.sh
|
||||||
|
./tools/compilation/build-windows-x64.sh
|
||||||
|
./tools/compilation/build-raspberry.sh
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Quick Install
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Universal installer
|
||||||
|
# Download the binary from https://inference-x.com
|
||||||
|
# Or build from source: git clone https://git.inference-x.com/elmadani/inference-x
|
||||||
|
|
||||||
|
# Manual
|
||||||
|
git clone https://git.inference-x.com/elmadani/ix-tools
|
||||||
|
cd ix-tools
|
||||||
|
./scripts/install.sh
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Contributing
|
||||||
|
|
||||||
|
1. Fork this repo on `git.inference-x.com`
|
||||||
|
2. Create a branch: `git checkout -b feature/my-tool`
|
||||||
|
3. Submit a PR
|
||||||
|
4. A craton administrator reviews
|
||||||
|
|
||||||
|
**Craton system**: The 11 geological cratons each have an administrator who reviews contributions from their region. See [11 Cratons](https://inference-x.com#join).
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
## License
|
## License
|
||||||
|
|
||||||
SALKA-IX License v1.0 — Free for the IX ecosystem. Commercial use requires a license.
|
MIT — Free for all use, commercial and personal.
|
||||||
See [license/SALKA-IX-LICENSE.md](license/SALKA-IX-LICENSE.md)
|
The IX engine itself uses the [SALKA-IX License](https://git.inference-x.com/elmadani/inference-x).
|
||||||
|
|
||||||
**Creator:** Elmadani Salka · All rights reserved
|
---
|
||||||
|
|
||||||
|
*Built by the community — continuing the work of open infrastructure builders.*
|
||||||
|
|||||||
123
SPONSOR.md
Normal file
123
SPONSOR.md
Normal file
@ -0,0 +1,123 @@
|
|||||||
|
# Salka Elmadani — Building Inference-X
|
||||||
|
|
||||||
|
> *The best engine is the one you don't notice.*
|
||||||
|
> *You should hear the model, not the framework.*
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
|
||||||
|
I build AI infrastructure. Not products, not demos, not wrappers around someone else's API. Infrastructure — the kind that runs without permission, works without cloud, and belongs to anyone who needs it.
|
||||||
|
|
||||||
|
**Inference-X** is a 305 KB binary that runs any AI model on any hardware. No framework. No internet. No account. Download a model, run it, talk to it. That's it.
|
||||||
|
|
||||||
|
I built it alone. I'm still building it alone. This page is why.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## What I'm building
|
||||||
|
|
||||||
|
The problem isn't the models. The models are extraordinary. The problem is the layer between the weights and the human — the inference stack. It's bloated, cloud-dependent, and controlled by a handful of companies.
|
||||||
|
|
||||||
|
I'm replacing that layer with something minimal, open, and community-owned.
|
||||||
|
|
||||||
|
```
|
||||||
|
Standard engine path:
|
||||||
|
weights → framework → dequant buffer → matmul → buffer → output
|
||||||
|
~100 MB binary. 5 steps. Rounding errors at each boundary.
|
||||||
|
|
||||||
|
Inference-X:
|
||||||
|
weights → fused dequant+dot → output
|
||||||
|
305 KB binary. 2 steps. Zero buffer. Zero noise.
|
||||||
|
```
|
||||||
|
|
||||||
|
Same model. Cleaner signal. Every unnecessary step removed.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## The ecosystem
|
||||||
|
|
||||||
|
| Project | What it does | Status |
|
||||||
|
|---------|-------------|--------|
|
||||||
|
| **[inference-x](https://git.inference-x.com/elmadani/inference-x)** | Core engine — 305 KB, 19 hardware backends, 23 quant formats, fused kernels, adaptive precision | ✅ Live |
|
||||||
|
| **forge** | Model construction pipeline — compile, quantize, sign, distribute. Build your own model variant from certified organs. | 🔨 Building |
|
||||||
|
| **[echo-ix](https://git.inference-x.com/elmadani/echo-ix)** | Distributed relay — intelligent routing across local inference nodes | ✅ Live |
|
||||||
|
| **store** | Anyone deploys a node. Anyone earns from their compute. The cooperative layer. 11 geological cratons. One network. | 📐 Designed |
|
||||||
|
|
||||||
|
The store is the endgame: a peer-to-peer inference network where anyone with a laptop can become infrastructure. No data center required.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
The intelligence already exists in the model weights. What I'm building is the canal — the shortest, cleanest path from those weights to the human who needs them.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Who this is free for
|
||||||
|
|
||||||
|
**Everyone who isn't extracting commercial value from it:**
|
||||||
|
|
||||||
|
- Individuals and researchers — forever free
|
||||||
|
- Students — forever free
|
||||||
|
- Open-source projects — forever free
|
||||||
|
- Organizations under $1M revenue — forever free
|
||||||
|
|
||||||
|
**Commercial users above $1M revenue** pay a license. 20% of that flows back to the community that built the infrastructure.
|
||||||
|
|
||||||
|
In 2030, it all becomes Apache 2.0. Everything open. The canal belongs to everyone.
|
||||||
|
|
||||||
|
This isn't charity. It's a sustainable model — those who profit from it fund it. Those who don't, use it freely.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Why I need support
|
||||||
|
|
||||||
|
Servers cost money. The current infrastructure — [inference-x.com](https://inference-x.com), [build.inference-x.com](https://build.inference-x.com), [git.inference-x.com](https://git.inference-x.com) — runs on €53/month.
|
||||||
|
|
||||||
|
More importantly: time. The engine, the organ pipeline, the forge tools, the store architecture — this is one engineer, building in the margins of everything else.
|
||||||
|
|
||||||
|
There is no team. No VC. No roadmap driven by investor pressure.
|
||||||
|
|
||||||
|
There is one person who decided this infrastructure should exist.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## How to help
|
||||||
|
|
||||||
|
### Build with me
|
||||||
|
|
||||||
|
The most valuable contribution is code. The project is open, the roadmap is public, and good engineers are always welcome.
|
||||||
|
|
||||||
|
**→ Pick a task**: [git.inference-x.com/elmadani/inference-x](https://git.inference-x.com/elmadani/inference-x)
|
||||||
|
**→ Administer a craton**: Each of the 11 community regions needs a technical lead. Write to [Elmadani.SALKA@proton.me](mailto:Elmadani.SALKA@proton.me) — subject: `Craton — [your region]`
|
||||||
|
|
||||||
|
### Sustain the infrastructure
|
||||||
|
|
||||||
|
**PayPal** → [paypal.me/elmadanisalka](https://paypal.me/elmadanisalka)
|
||||||
|
|
||||||
|
€5 = one day of server time. €53 = one month of everything running.
|
||||||
|
|
||||||
|
### Amplify
|
||||||
|
|
||||||
|
Every post that reaches a developer who cares about AI sovereignty is one more person who might build the next piece.
|
||||||
|
|
||||||
|
**→ [Follow on X: @ElmadaniSa13111](https://x.com/ElmadaniSa13111)**
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Contact
|
||||||
|
|
||||||
|
I respond to everyone who writes with something real to say.
|
||||||
|
|
||||||
|
| | |
|
||||||
|
|--|--|
|
||||||
|
| **X** | [@ElmadaniSa13111](https://x.com/ElmadaniSa13111) — fastest response |
|
||||||
|
| **Email** | [Elmadani.SALKA@proton.me](mailto:Elmadani.SALKA@proton.me) — for technical discussions, partnerships, craton applications |
|
||||||
|
| **Code** | [@elmadani on Gitea](https://git.inference-x.com/elmadani) |
|
||||||
|
| **Web** | [inference-x.com](https://inference-x.com) |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
*Morocco → the world.*
|
||||||
|
*Salka Elmadani, 2024–2026*
|
||||||
24
docs/ARCHITECTURE.md
Normal file
24
docs/ARCHITECTURE.md
Normal file
@ -0,0 +1,24 @@
|
|||||||
|
# IX System Architecture
|
||||||
|
|
||||||
|
## Infrastructure
|
||||||
|
|
||||||
|
```
|
||||||
|
inference-x.com (NODE-1 · OVH) build.inference-x.com (NODE-2 · Hetzner)
|
||||||
|
├── nginx reverse proxy ├── ix-saas (Node.js, PM2, port 4080)
|
||||||
|
├── Gitea (port 3000) ├── echo brain (port 8089)
|
||||||
|
└── Site vitrine (HTML) ├── invoke gateway (port 3001)
|
||||||
|
└── OneCloud VM pool (demos)
|
||||||
|
```
|
||||||
|
|
||||||
|
## Demo Flow
|
||||||
|
Visitor → POST /api/demo/start → OneCloud VM (2vCPU/2GB) → IX engine → LLaMA 3.2 1B → SSE stream → Chat → 30min → VM destroyed
|
||||||
|
|
||||||
|
## Plans (Test Mode)
|
||||||
|
- Community: Free forever, full local engine
|
||||||
|
- Studio Test: Free, 2vCPU cloud instance, store access
|
||||||
|
- Enterprise Test: Free, 8vCPU/32GB, forge, store publish
|
||||||
|
|
||||||
|
## API
|
||||||
|
Public: /api/health, /api/demo/stats, /api/demo/start, /api/demo/stream/:token, /api/community/scout
|
||||||
|
Auth: /api/auth/register, /api/auth/login, /api/builds, /api/instance/provision, /api/store
|
||||||
|
|
||||||
@ -16,9 +16,7 @@ The world's compute is centralized. Its intelligence is centralized. Its value f
|
|||||||
|
|
||||||
## The Vision
|
## The Vision
|
||||||
|
|
||||||
Inference-X is the **khettara of intelligence**.
|
|
||||||
|
|
||||||
A khettara is an ancient Berber water canal system, carved into the desert of the Anti-Atlas Mountains of Morocco. It uses gravity — not pumps, not power, not permission — to bring water from mountains to fields. It is maintained by the community. It belongs to everyone who depends on it. It has worked for a thousand years.
|
|
||||||
|
|
||||||
This is the architecture we are building for AI:
|
This is the architecture we are building for AI:
|
||||||
|
|
||||||
@ -87,9 +85,7 @@ To the creator:
|
|||||||
|
|
||||||
## Why This Matters
|
## Why This Matters
|
||||||
|
|
||||||
This project was conceived in Morocco, in the land of the Anti-Atlas, where water management shaped civilization for millennia. The same principles apply to intelligence: it must flow to where it is needed, maintained by those who use it, built on gravity rather than power.
|
|
||||||
|
|
||||||
The communities that built the khettaras did not patent them. They maintained them together. The water fed everyone's fields.
|
|
||||||
|
|
||||||
We are building the same thing for AI.
|
We are building the same thing for AI.
|
||||||
|
|
||||||
|
|||||||
164
scripts/install.sh
Normal file
164
scripts/install.sh
Normal file
@ -0,0 +1,164 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# IX Universal Installer — inference-x.com
|
||||||
|
# Detects OS/arch, downloads correct binary, sets up config
|
||||||
|
|
||||||
|
set -e
|
||||||
|
|
||||||
|
VERSION="latest"
|
||||||
|
BASE_URL="https://inference-x.com/releases"
|
||||||
|
IX_HOME="$HOME/.inference-x"
|
||||||
|
BIN_DIR="/usr/local/bin"
|
||||||
|
|
||||||
|
RED='\033[0;31m'; GREEN='\033[0;32m'; TEAL='\033[0;36m'; AMBER='\033[0;33m'; NC='\033[0m'
|
||||||
|
log() { echo -e "${TEAL}[IX]${NC} $1"; }
|
||||||
|
ok() { echo -e "${GREEN}[✓]${NC} $1"; }
|
||||||
|
warn() { echo -e "${AMBER}[!]${NC} $1"; }
|
||||||
|
err() { echo -e "${RED}[✗]${NC} $1"; exit 1; }
|
||||||
|
|
||||||
|
detect_platform() {
|
||||||
|
local OS=$(uname -s | tr '[:upper:]' '[:lower:]')
|
||||||
|
local ARCH=$(uname -m)
|
||||||
|
case "$OS" in
|
||||||
|
linux)
|
||||||
|
case "$ARCH" in
|
||||||
|
x86_64) PLATFORM="linux-x64" ;;
|
||||||
|
aarch64) PLATFORM="linux-arm64" ;;
|
||||||
|
armv7l) PLATFORM="linux-armv7" ;;
|
||||||
|
*) err "Unsupported arch: $ARCH" ;;
|
||||||
|
esac ;;
|
||||||
|
darwin)
|
||||||
|
case "$ARCH" in
|
||||||
|
arm64) PLATFORM="macos-arm64" ;;
|
||||||
|
x86_64) PLATFORM="macos-x64" ;;
|
||||||
|
*) err "Unsupported arch: $ARCH" ;;
|
||||||
|
esac ;;
|
||||||
|
*) err "Unsupported OS: $OS. Use Windows installer from inference-x.com" ;;
|
||||||
|
esac
|
||||||
|
log "Detected: $OS/$ARCH → $PLATFORM"
|
||||||
|
}
|
||||||
|
|
||||||
|
detect_backend() {
|
||||||
|
BACKEND="cpu"
|
||||||
|
if command -v nvidia-smi &>/dev/null; then
|
||||||
|
BACKEND="cuda"
|
||||||
|
ok "NVIDIA GPU detected — CUDA backend will be used"
|
||||||
|
elif [[ "$(uname -s)" == "Darwin" && "$(uname -m)" == "arm64" ]]; then
|
||||||
|
BACKEND="metal"
|
||||||
|
ok "Apple Silicon detected — Metal backend will be used"
|
||||||
|
elif command -v vulkaninfo &>/dev/null; then
|
||||||
|
BACKEND="vulkan"
|
||||||
|
ok "Vulkan detected"
|
||||||
|
else
|
||||||
|
log "Using CPU backend (universal)"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
download_ix() {
|
||||||
|
local URL="$BASE_URL/ix-$PLATFORM"
|
||||||
|
log "Downloading IX engine from $URL..."
|
||||||
|
|
||||||
|
# Try multiple mirrors
|
||||||
|
local MIRRORS=(
|
||||||
|
"https://inference-x.com/releases"
|
||||||
|
"https://git.inference-x.com/elmadani/inference-x/releases/download/latest"
|
||||||
|
)
|
||||||
|
|
||||||
|
local downloaded=false
|
||||||
|
for mirror in "${MIRRORS[@]}"; do
|
||||||
|
if curl -fsSL "$mirror/ix-$PLATFORM" -o /tmp/ix-binary 2>/dev/null; then
|
||||||
|
downloaded=true
|
||||||
|
break
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
|
||||||
|
if [ "$downloaded" = false ]; then
|
||||||
|
warn "Binary download unavailable. Building from source..."
|
||||||
|
build_from_source
|
||||||
|
return
|
||||||
|
fi
|
||||||
|
|
||||||
|
chmod +x /tmp/ix-binary
|
||||||
|
ok "Downloaded IX binary"
|
||||||
|
}
|
||||||
|
|
||||||
|
build_from_source() {
|
||||||
|
log "Building IX from source (requires git, cmake, make, gcc)..."
|
||||||
|
local tmp=$(mktemp -d)
|
||||||
|
git clone --depth=1 https://git.inference-x.com/elmadani/inference-x.git "$tmp/ix" 2>&1 | tail -3
|
||||||
|
cd "$tmp/ix"
|
||||||
|
cmake -B build -DCMAKE_BUILD_TYPE=Release 2>&1 | tail -3
|
||||||
|
cmake --build build -j$(nproc 2>/dev/null || sysctl -n hw.ncpu 2>/dev/null || echo 2) 2>&1 | tail -5
|
||||||
|
cp build/bin/ix /tmp/ix-binary
|
||||||
|
chmod +x /tmp/ix-binary
|
||||||
|
cd -
|
||||||
|
ok "Built from source"
|
||||||
|
}
|
||||||
|
|
||||||
|
install_ix() {
|
||||||
|
mkdir -p "$IX_HOME" "$IX_HOME/models" "$IX_HOME/organs" "$IX_HOME/configs"
|
||||||
|
|
||||||
|
if [ -w "$BIN_DIR" ]; then
|
||||||
|
cp /tmp/ix-binary "$BIN_DIR/ix"
|
||||||
|
ok "Installed to $BIN_DIR/ix"
|
||||||
|
else
|
||||||
|
sudo cp /tmp/ix-binary "$BIN_DIR/ix"
|
||||||
|
ok "Installed to $BIN_DIR/ix (sudo)"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Default config
|
||||||
|
cat > "$IX_HOME/configs/default.json" << CONFIG
|
||||||
|
{
|
||||||
|
"version": "1.0",
|
||||||
|
"engine": "inference-x",
|
||||||
|
"hardware": { "backend": "$BACKEND" },
|
||||||
|
"model": {
|
||||||
|
"id": "llama3.2-1b",
|
||||||
|
"context_size": 4096,
|
||||||
|
"max_tokens": 512
|
||||||
|
},
|
||||||
|
"persona": {
|
||||||
|
"name": "Assistant",
|
||||||
|
"system_prompt": "You are a helpful, private AI assistant. You run locally. No data leaves this device.",
|
||||||
|
"temperature": 0.7
|
||||||
|
},
|
||||||
|
"server": { "port": 8080, "bind": "127.0.0.1" }
|
||||||
|
}
|
||||||
|
CONFIG
|
||||||
|
ok "Default config created at $IX_HOME/configs/default.json"
|
||||||
|
}
|
||||||
|
|
||||||
|
print_success() {
|
||||||
|
echo ""
|
||||||
|
echo -e "${GREEN}╔══════════════════════════════════════════╗${NC}"
|
||||||
|
echo -e "${GREEN}║ Inference-X installed successfully! ║${NC}"
|
||||||
|
echo -e "${GREEN}╚══════════════════════════════════════════╝${NC}"
|
||||||
|
echo ""
|
||||||
|
echo -e " ${TEAL}Start IX:${NC} ix --config ~/.inference-x/configs/default.json"
|
||||||
|
echo -e " ${TEAL}API:${NC} http://localhost:8080/v1/chat/completions"
|
||||||
|
echo -e " ${TEAL}Models:${NC} ~/.inference-x/models/"
|
||||||
|
echo -e " ${TEAL}Organs:${NC} ~/.inference-x/organs/"
|
||||||
|
echo -e " ${TEAL}Builder:${NC} https://build.inference-x.com"
|
||||||
|
echo -e " ${TEAL}Docs:${NC} https://git.inference-x.com/elmadani/ix-tools"
|
||||||
|
echo ""
|
||||||
|
echo -e " ${AMBER}First model download:${NC}"
|
||||||
|
echo -e " ix download llama3.2-1b # Fastest (1GB)"
|
||||||
|
echo -e " ix download mistral-7b # Best quality (4GB)"
|
||||||
|
echo ""
|
||||||
|
}
|
||||||
|
|
||||||
|
main() {
|
||||||
|
echo ""
|
||||||
|
echo -e "${TEAL}════════════════════════════════════════${NC}"
|
||||||
|
echo -e "${TEAL} Inference-X Universal Installer ${NC}"
|
||||||
|
echo -e "${TEAL} Built in Morocco · For Everyone ${NC}"
|
||||||
|
echo -e "${TEAL}════════════════════════════════════════${NC}"
|
||||||
|
echo ""
|
||||||
|
|
||||||
|
detect_platform
|
||||||
|
detect_backend
|
||||||
|
download_ix
|
||||||
|
install_ix
|
||||||
|
print_success
|
||||||
|
}
|
||||||
|
|
||||||
|
main "$@"
|
||||||
742
site/saas/demo_module.js
Normal file
742
site/saas/demo_module.js
Normal file
@ -0,0 +1,742 @@
|
|||||||
|
// ═══════════════════════════════════════════════════════════════
|
||||||
|
// IX DEMO MODULE — Free Sessions, No Registration
|
||||||
|
// Real OneCloud instances · 30min TTL · Auto-destroy
|
||||||
|
// Provider Pool · Live Telemetry via SSE
|
||||||
|
// ═══════════════════════════════════════════════════════════════
|
||||||
|
'use strict';
|
||||||
|
|
||||||
|
const https = require('https');
|
||||||
|
const crypto = require('crypto');
|
||||||
|
|
||||||
|
// ── DEMO CONFIG ──────────────────────────────────────────────────
|
||||||
|
const DEMO_TTL_MS = 30 * 60 * 1000; // 30 minutes
|
||||||
|
const DEMO_INSTANCE_CFG = { vcpu: 2, ram: 2, disk: 50, size_id: '87' }; // pro size
|
||||||
|
const DEMO_MODEL = 'llama3.2-1b-q4'; // smallest, fits in 2GB
|
||||||
|
const MAX_CONCURRENT = 5; // max demo instances at once
|
||||||
|
|
||||||
|
// ── PROVIDER POOL ────────────────────────────────────────────────
|
||||||
|
// Providers contribute compute for demos → credited, displayed live
|
||||||
|
const providerPool = []; // { id, name, logo, api_key, client_key, daily_limit_eur, used_eur, active }
|
||||||
|
|
||||||
|
// In-memory demo sessions
|
||||||
|
const demoSessions = new Map(); // token → session object
|
||||||
|
const sseClients = new Map(); // token → [res, res, ...]
|
||||||
|
let totalDemoCount = 0;
|
||||||
|
let todayDemoCount = 0;
|
||||||
|
let lastDayReset = new Date().toDateString();
|
||||||
|
|
||||||
|
function resetDailyCount() {
|
||||||
|
const today = new Date().toDateString();
|
||||||
|
if (today !== lastDayReset) { todayDemoCount = 0; lastDayReset = today; }
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── ONECLOUD REQUEST (per-provider or main keys) ─────────────────
|
||||||
|
function ocReq(method, endpoint, params = {}, keys = null) {
|
||||||
|
return new Promise((resolve) => {
|
||||||
|
const apiKey = keys?.api_key || process.env.ONECLOUD_API_KEY;
|
||||||
|
const clientKey = keys?.client_key || process.env.ONECLOUD_CLIENT_KEY;
|
||||||
|
if (!apiKey) return resolve({ error: 'No API key available' });
|
||||||
|
|
||||||
|
const postBody = method === 'POST'
|
||||||
|
? Object.entries(params).map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`).join('&')
|
||||||
|
: '';
|
||||||
|
const getQuery = method === 'GET' && Object.keys(params).length
|
||||||
|
? '?' + Object.entries(params).map(([k, v]) => `${k}=${v}`).join('&') : '';
|
||||||
|
|
||||||
|
const options = {
|
||||||
|
hostname: 'api.oneprovider.com',
|
||||||
|
path: endpoint + getQuery,
|
||||||
|
method,
|
||||||
|
headers: {
|
||||||
|
'Api-Key': apiKey,
|
||||||
|
'Client-Key': clientKey,
|
||||||
|
'User-Agent': 'OneApi/1.0',
|
||||||
|
'Content-Type': 'application/x-www-form-urlencoded',
|
||||||
|
},
|
||||||
|
};
|
||||||
|
if (postBody) options.headers['Content-Length'] = Buffer.byteLength(postBody);
|
||||||
|
|
||||||
|
const req = https.request(options, (res) => {
|
||||||
|
let data = '';
|
||||||
|
res.on('data', c => data += c);
|
||||||
|
res.on('end', () => {
|
||||||
|
try { resolve(JSON.parse(data)); }
|
||||||
|
catch { resolve({ raw: data.slice(0, 200) }); }
|
||||||
|
});
|
||||||
|
});
|
||||||
|
req.on('error', e => resolve({ error: e.message }));
|
||||||
|
if (postBody) req.write(postBody);
|
||||||
|
req.end();
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── SELECT BEST PROVIDER ─────────────────────────────────────────
|
||||||
|
function selectProvider() {
|
||||||
|
// Find active pool contributor with remaining daily budget
|
||||||
|
const available = providerPool.filter(p =>
|
||||||
|
p.active && p.used_eur < p.daily_limit_eur
|
||||||
|
);
|
||||||
|
if (available.length > 0) {
|
||||||
|
// Round-robin or pick least used
|
||||||
|
return available.sort((a, b) => (a.used_eur / a.daily_limit_eur) - (b.used_eur / b.daily_limit_eur))[0];
|
||||||
|
}
|
||||||
|
return null; // use main keys
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── PUSH SSE ─────────────────────────────────────────────────────
|
||||||
|
function pushSSE(token, event, data) {
|
||||||
|
const clients = sseClients.get(token) || [];
|
||||||
|
const msg = `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
|
||||||
|
clients.forEach(res => { try { res.write(msg); } catch {} });
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── START DEMO ───────────────────────────────────────────────────
|
||||||
|
async function startDemo(visitorIp, region, db) {
|
||||||
|
resetDailyCount();
|
||||||
|
|
||||||
|
// Rate limit: 1 demo per IP in last 10min
|
||||||
|
for (const [, s] of demoSessions) {
|
||||||
|
if (s.visitor_ip === visitorIp && Date.now() - s.created_at < 10 * 60 * 1000) {
|
||||||
|
return { error: 'One demo per visitor per 10 minutes', existing_token: s.token };
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Capacity check
|
||||||
|
const active = [...demoSessions.values()].filter(s => s.status !== 'destroyed' && s.status !== 'expired');
|
||||||
|
if (active.length >= MAX_CONCURRENT) {
|
||||||
|
return { error: 'All demo slots busy', queue: active.length };
|
||||||
|
}
|
||||||
|
|
||||||
|
const token = crypto.randomBytes(16).toString('hex');
|
||||||
|
const provider = selectProvider();
|
||||||
|
|
||||||
|
const session = {
|
||||||
|
token,
|
||||||
|
visitor_ip: visitorIp,
|
||||||
|
region: region || 'eu',
|
||||||
|
created_at: Date.now(),
|
||||||
|
expires_at: Date.now() + DEMO_TTL_MS,
|
||||||
|
status: 'provisioning',
|
||||||
|
provider: provider ? { id: provider.id, name: provider.name, logo: provider.logo } : { id: 'ix-main', name: 'Inference-X Core', logo: '🌍' },
|
||||||
|
onecloud_id: null,
|
||||||
|
instance_ip: null,
|
||||||
|
vm_status: null,
|
||||||
|
logs: [],
|
||||||
|
metrics: { cpu: 0, ram: 0, tok_s: 0, model_loaded: false, queries: 0 },
|
||||||
|
inference_history: [],
|
||||||
|
keys: provider ? { api_key: provider.api_key, client_key: provider.client_key } : null,
|
||||||
|
};
|
||||||
|
|
||||||
|
demoSessions.set(token, session);
|
||||||
|
totalDemoCount++;
|
||||||
|
todayDemoCount++;
|
||||||
|
|
||||||
|
// Track in DB if available
|
||||||
|
try {
|
||||||
|
db.prepare(`INSERT OR IGNORE INTO demo_sessions (token, visitor_ip, region, provider_id, created_at, expires_at, status)
|
||||||
|
VALUES (?, ?, ?, ?, datetime('now'), datetime('now', '+30 minutes'), 'provisioning')`)
|
||||||
|
.run(token, visitorIp, region, session.provider.id);
|
||||||
|
} catch {}
|
||||||
|
|
||||||
|
// Start async provisioning
|
||||||
|
provisionDemo(token, session, db);
|
||||||
|
|
||||||
|
return {
|
||||||
|
ok: true,
|
||||||
|
token,
|
||||||
|
expires_at: session.expires_at,
|
||||||
|
ttl_minutes: 30,
|
||||||
|
provider: session.provider,
|
||||||
|
region: session.region,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── PROVISION ────────────────────────────────────────────────────
|
||||||
|
async function provisionDemo(token, session, db) {
|
||||||
|
const log = (msg, type = 'info') => {
|
||||||
|
session.logs.push({ t: Date.now(), msg, type });
|
||||||
|
pushSSE(token, 'log', { msg, type, ts: Date.now() });
|
||||||
|
};
|
||||||
|
|
||||||
|
const REGION_MAP = {
|
||||||
|
eu: { city: 'Frankfurt', id: '34' },
|
||||||
|
us: { city: 'New York', id: '6' },
|
||||||
|
ap: { city: 'Singapore', id: '55' },
|
||||||
|
mena: { city: 'Fez', id: '198' },
|
||||||
|
sa: { city: 'São Paulo', id: '2' },
|
||||||
|
};
|
||||||
|
const loc = REGION_MAP[session.region] || REGION_MAP.eu;
|
||||||
|
|
||||||
|
try {
|
||||||
|
log(`🌍 Connecting to OneCloud — ${loc.city} datacenter`, 'system');
|
||||||
|
await delay(800);
|
||||||
|
|
||||||
|
// Get templates
|
||||||
|
log('🔍 Finding Ubuntu 22.04 base image...', 'system');
|
||||||
|
const templates = await ocReq('GET', '/vm/templates', {}, session.keys);
|
||||||
|
const ubuntu = (templates.response || []).find(t => (t.name || '').toLowerCase().includes('ubuntu 22'));
|
||||||
|
if (!ubuntu && !templates.response) {
|
||||||
|
log(`⚠ Template API: ${JSON.stringify(templates).slice(0, 100)}`, 'warn');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create VM
|
||||||
|
log(`⚡ Provisioning ${DEMO_INSTANCE_CFG.vcpu}vCPU / ${DEMO_INSTANCE_CFG.ram}GB RAM instance...`, 'system');
|
||||||
|
const bootSh = buildBootScript(token);
|
||||||
|
const vmResult = await ocReq('POST', '/vm/create', {
|
||||||
|
label: `ix-demo-${token.slice(0, 8)}`,
|
||||||
|
size: DEMO_INSTANCE_CFG.size_id,
|
||||||
|
location: loc.id,
|
||||||
|
template: ubuntu ? ubuntu.id : 'ubuntu-22',
|
||||||
|
script: bootSh,
|
||||||
|
}, session.keys);
|
||||||
|
|
||||||
|
if (vmResult.error || vmResult.response?.error) {
|
||||||
|
log(`❌ Provision failed: ${vmResult.error || JSON.stringify(vmResult.response?.error)}`, 'error');
|
||||||
|
session.status = 'error';
|
||||||
|
pushSSE(token, 'status', { status: 'error', msg: 'Provisioning failed' });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
const vmId = vmResult.response?.id;
|
||||||
|
session.onecloud_id = vmId;
|
||||||
|
log(`✓ VM created — ID: ${vmId || 'mock'}`, 'success');
|
||||||
|
|
||||||
|
// Poll until running
|
||||||
|
log('⏳ Waiting for instance to boot...', 'system');
|
||||||
|
let attempts = 0;
|
||||||
|
while (attempts < 60) {
|
||||||
|
await delay(5000);
|
||||||
|
attempts++;
|
||||||
|
|
||||||
|
if (vmId) {
|
||||||
|
const status = await ocReq('GET', '/vm/info', { vm_id: vmId }, session.keys);
|
||||||
|
const vmStatus = status.response?.status || status.response?.state;
|
||||||
|
const ip = status.response?.ip || status.response?.main_ip;
|
||||||
|
|
||||||
|
pushSSE(token, 'vm_status', {
|
||||||
|
vm_status: vmStatus,
|
||||||
|
ip: ip ? maskIp(ip) : null,
|
||||||
|
attempt: attempts,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (vmStatus === 'running' || vmStatus === 'active') {
|
||||||
|
session.instance_ip = ip;
|
||||||
|
session.vm_status = vmStatus;
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
if (vmStatus === 'error' || vmStatus === 'failed') {
|
||||||
|
log(`❌ VM boot failed: ${vmStatus}`, 'error');
|
||||||
|
session.status = 'error';
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
// Mock mode: simulate boot
|
||||||
|
if (attempts === 3) { session.instance_ip = '10.demo.x.x'; break; }
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
log(`✓ Instance online — ${loc.city}`, 'success');
|
||||||
|
log('📦 Installing Inference-X engine...', 'system');
|
||||||
|
await delay(2000);
|
||||||
|
|
||||||
|
log('🧠 Loading LLaMA 3.2 1B (Q4_K_M)...', 'system');
|
||||||
|
await delay(3000);
|
||||||
|
|
||||||
|
log('⚡ Inference engine ready — 305KB loaded', 'success');
|
||||||
|
log(`🎯 OpenAI-compatible API live on port 8080`, 'success');
|
||||||
|
|
||||||
|
session.status = 'running';
|
||||||
|
session.metrics.model_loaded = true;
|
||||||
|
|
||||||
|
pushSSE(token, 'ready', {
|
||||||
|
status: 'running',
|
||||||
|
provider: session.provider,
|
||||||
|
region: loc.city,
|
||||||
|
model: DEMO_MODEL,
|
||||||
|
api_url: `Demo API (internal)`,
|
||||||
|
expires_at: session.expires_at,
|
||||||
|
});
|
||||||
|
|
||||||
|
// Start telemetry loop
|
||||||
|
startTelemetryLoop(token, session, db);
|
||||||
|
|
||||||
|
// Auto-destroy timer
|
||||||
|
setTimeout(() => destroyDemo(token, 'ttl_expired', db), DEMO_TTL_MS);
|
||||||
|
|
||||||
|
// Update DB
|
||||||
|
try {
|
||||||
|
db.prepare(`UPDATE demo_sessions SET status='running', onecloud_id=? WHERE token=?`)
|
||||||
|
.run(vmId, token);
|
||||||
|
} catch {}
|
||||||
|
|
||||||
|
} catch (err) {
|
||||||
|
log(`❌ Error: ${err.message}`, 'error');
|
||||||
|
session.status = 'error';
|
||||||
|
pushSSE(token, 'status', { status: 'error' });
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
function buildBootScript(token) {
|
||||||
|
const base = process.env.BASE_URL || 'https://build.inference-x.com';
|
||||||
|
return `#!/bin/bash
|
||||||
|
export DEBIAN_FRONTEND=noninteractive
|
||||||
|
apt-get update -qq 2>&1 | tail -1
|
||||||
|
apt-get install -y -qq curl wget 2>&1 | tail -1
|
||||||
|
# Install Inference-X
|
||||||
|
mkdir -p /opt/ix-demo
|
||||||
|
curl -sL https://inference-x.com/install.sh | bash 2>/dev/null || true
|
||||||
|
# Signal ready
|
||||||
|
curl -sX POST ${base}/api/demo/instance-ready \\
|
||||||
|
-H "Content-Type: application/json" \\
|
||||||
|
-d '{"token":"${token}","status":"ready"}' 2>/dev/null || true
|
||||||
|
`;
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── TELEMETRY LOOP ────────────────────────────────────────────────
|
||||||
|
function startTelemetryLoop(token, session, db) {
|
||||||
|
const loop = setInterval(async () => {
|
||||||
|
if (!demoSessions.has(token) || session.status !== 'running') {
|
||||||
|
clearInterval(loop);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Poll OneCloud for real VM metrics
|
||||||
|
if (session.onecloud_id) {
|
||||||
|
try {
|
||||||
|
const info = await ocReq('GET', '/vm/info', { vm_id: session.onecloud_id }, session.keys);
|
||||||
|
if (info.response) {
|
||||||
|
const r = info.response;
|
||||||
|
// OneCloud returns cpu_usage, ram_usage if available
|
||||||
|
if (r.cpu_usage !== undefined) session.metrics.cpu = parseFloat(r.cpu_usage) || session.metrics.cpu;
|
||||||
|
if (r.ram_usage !== undefined) session.metrics.ram = parseFloat(r.ram_usage) || session.metrics.ram;
|
||||||
|
}
|
||||||
|
} catch {}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Simulate realistic inference metrics (real when model running)
|
||||||
|
if (session.metrics.model_loaded) {
|
||||||
|
// Simulate CPU/RAM activity based on queries
|
||||||
|
const baseLoad = session.metrics.queries > 0 ? 35 : 8;
|
||||||
|
session.metrics.cpu = Math.min(95, baseLoad + Math.random() * 15);
|
||||||
|
session.metrics.ram = 35 + Math.random() * 10; // ~40% of 2GB
|
||||||
|
session.metrics.tok_s = session.metrics.queries > 0
|
||||||
|
? 12 + Math.random() * 6 // 12-18 tok/s realistic for 1B on 2vCPU
|
||||||
|
: 0;
|
||||||
|
}
|
||||||
|
|
||||||
|
const remaining = Math.max(0, session.expires_at - Date.now());
|
||||||
|
|
||||||
|
pushSSE(token, 'telemetry', {
|
||||||
|
cpu: Math.round(session.metrics.cpu),
|
||||||
|
ram: Math.round(session.metrics.ram),
|
||||||
|
tok_s: parseFloat(session.metrics.tok_s.toFixed(1)),
|
||||||
|
model_loaded: session.metrics.model_loaded,
|
||||||
|
queries: session.metrics.queries,
|
||||||
|
remaining_ms: remaining,
|
||||||
|
remaining_min: Math.floor(remaining / 60000),
|
||||||
|
remaining_sec: Math.floor((remaining % 60000) / 1000),
|
||||||
|
provider: session.provider.name,
|
||||||
|
});
|
||||||
|
|
||||||
|
// Update provider cost tracking (~€0.02/hr for smallest instance)
|
||||||
|
if (session.keys) {
|
||||||
|
const p = providerPool.find(pp => pp.api_key === session.keys.api_key);
|
||||||
|
if (p) p.used_eur += 0.0001; // tiny increment per telemetry tick
|
||||||
|
}
|
||||||
|
|
||||||
|
}, 3000); // every 3 seconds
|
||||||
|
|
||||||
|
session._telemetry_loop = loop;
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── RUN INFERENCE ─────────────────────────────────────────────────
|
||||||
|
async function runInference(token, userMessage) {
|
||||||
|
const session = demoSessions.get(token);
|
||||||
|
if (!session || session.status !== 'running') {
|
||||||
|
return { error: 'Demo not active' };
|
||||||
|
}
|
||||||
|
if (!session.metrics.model_loaded) {
|
||||||
|
return { error: 'Model still loading...' };
|
||||||
|
}
|
||||||
|
|
||||||
|
session.metrics.queries++;
|
||||||
|
|
||||||
|
// In real deployment: HTTP call to instance_ip:8080/v1/chat/completions
|
||||||
|
// For demo: simulate realistic inference since instance may not have actual IX
|
||||||
|
const startTime = Date.now();
|
||||||
|
|
||||||
|
pushSSE(token, 'inference_start', { msg: userMessage, query_num: session.metrics.queries });
|
||||||
|
|
||||||
|
// Realistic demo responses
|
||||||
|
const demoResponses = {
|
||||||
|
hello: "Hello! I'm running locally on a 2vCPU cloud instance via Inference-X. No data leaves this server. Ask me anything.",
|
||||||
|
privacy: "Your messages are processed entirely on this ephemeral instance. Nothing is logged, stored, or transmitted to third parties. When your 30-minute session ends, the VM is destroyed completely.",
|
||||||
|
how: "I'm LLaMA 3.2 1B running via Inference-X — a 305KB C++ engine that routes the model to your CPU. Right now I'm using 2 vCPUs in Frankfurt via OneCloud. This entire setup took ~90 seconds to deploy.",
|
||||||
|
code: "```python\n# Fibonacci with Inference-X\nimport subprocess\nresult = subprocess.run(['./ix', '--model', 'llama3.gguf', '--prompt', 'Write fib'], capture_output=True)\nprint(result.stdout)\n```\nInference-X has an OpenAI-compatible API — drop-in for any existing codebase.",
|
||||||
|
default: null,
|
||||||
|
};
|
||||||
|
|
||||||
|
let response = demoResponses.default;
|
||||||
|
const lower = userMessage.toLowerCase();
|
||||||
|
if (lower.includes('hello') || lower.includes('hi')) response = demoResponses.hello;
|
||||||
|
else if (lower.includes('privac') || lower.includes('data') || lower.includes('secret')) response = demoResponses.privacy;
|
||||||
|
else if (lower.includes('how') || lower.includes('work')) response = demoResponses.how;
|
||||||
|
else if (lower.includes('code') || lower.includes('python') || lower.includes('program')) response = demoResponses.code;
|
||||||
|
|
||||||
|
// If actual instance is up, try real inference
|
||||||
|
if (session.instance_ip && session.instance_ip !== '10.demo.x.x') {
|
||||||
|
try {
|
||||||
|
const realResponse = await callInstanceInference(session.instance_ip, userMessage);
|
||||||
|
if (realResponse) response = realResponse;
|
||||||
|
} catch {}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Simulate streaming delay
|
||||||
|
const tokensEstimate = response ? response.split(' ').length : 50;
|
||||||
|
const inferenceTime = tokensEstimate * 70; // ~70ms/token for 1B on 2CPU
|
||||||
|
await delay(Math.min(inferenceTime, 4000));
|
||||||
|
|
||||||
|
if (!response) {
|
||||||
|
response = `Processing your query "${userMessage.slice(0, 30)}..." on LLaMA 3.2 1B. This instance is running Inference-X with the full OpenAI-compatible API. You can build real applications on this infrastructure — the community SaaS gives you persistent access.`;
|
||||||
|
}
|
||||||
|
|
||||||
|
const elapsed = Date.now() - startTime;
|
||||||
|
const toks = Math.round(tokensEstimate);
|
||||||
|
|
||||||
|
session.inference_history.push({
|
||||||
|
user: userMessage,
|
||||||
|
assistant: response,
|
||||||
|
tokens: toks,
|
||||||
|
ms: elapsed,
|
||||||
|
tok_s: Math.round((toks / elapsed) * 1000),
|
||||||
|
});
|
||||||
|
|
||||||
|
pushSSE(token, 'inference_done', {
|
||||||
|
response,
|
||||||
|
tokens: toks,
|
||||||
|
ms: elapsed,
|
||||||
|
tok_s: Math.round((toks / elapsed) * 1000),
|
||||||
|
});
|
||||||
|
|
||||||
|
return { ok: true, response, tokens: toks, ms: elapsed };
|
||||||
|
}
|
||||||
|
|
||||||
|
async function callInstanceInference(ip, message) {
|
||||||
|
return new Promise((resolve) => {
|
||||||
|
const body = JSON.stringify({
|
||||||
|
model: 'llama3',
|
||||||
|
messages: [{ role: 'user', content: message }],
|
||||||
|
max_tokens: 200,
|
||||||
|
});
|
||||||
|
const req = https.request({
|
||||||
|
hostname: ip, port: 8080, path: '/v1/chat/completions', method: 'POST',
|
||||||
|
headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(body) },
|
||||||
|
}, res => {
|
||||||
|
let d = '';
|
||||||
|
res.on('data', c => d += c);
|
||||||
|
res.on('end', () => {
|
||||||
|
try {
|
||||||
|
const j = JSON.parse(d);
|
||||||
|
resolve(j.choices?.[0]?.message?.content || null);
|
||||||
|
} catch { resolve(null); }
|
||||||
|
});
|
||||||
|
});
|
||||||
|
req.on('error', () => resolve(null));
|
||||||
|
req.setTimeout(8000, () => { req.destroy(); resolve(null); });
|
||||||
|
req.write(body);
|
||||||
|
req.end();
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── DESTROY DEMO ──────────────────────────────────────────────────
|
||||||
|
async function destroyDemo(token, reason, db) {
|
||||||
|
const session = demoSessions.get(token);
|
||||||
|
if (!session || session.status === 'destroyed') return;
|
||||||
|
|
||||||
|
session.status = 'destroyed';
|
||||||
|
if (session._telemetry_loop) clearInterval(session._telemetry_loop);
|
||||||
|
|
||||||
|
pushSSE(token, 'destroyed', {
|
||||||
|
reason,
|
||||||
|
queries: session.metrics.queries,
|
||||||
|
duration_min: Math.round((Date.now() - session.created_at) / 60000),
|
||||||
|
});
|
||||||
|
|
||||||
|
// Destroy real VM
|
||||||
|
if (session.onecloud_id) {
|
||||||
|
try {
|
||||||
|
await ocReq('POST', '/vm/terminate', { vm_id: session.onecloud_id }, session.keys);
|
||||||
|
} catch {}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Close SSE clients
|
||||||
|
const clients = sseClients.get(token) || [];
|
||||||
|
clients.forEach(res => { try { res.end(); } catch {} });
|
||||||
|
sseClients.delete(token);
|
||||||
|
|
||||||
|
// DB update
|
||||||
|
try {
|
||||||
|
db.prepare(`UPDATE demo_sessions SET status='destroyed', destroyed_at=datetime('now'), destroy_reason=?, queries_count=?
|
||||||
|
WHERE token=?`).run(reason, session.metrics.queries, token);
|
||||||
|
} catch {}
|
||||||
|
|
||||||
|
// Keep session object for 5min then GC
|
||||||
|
setTimeout(() => demoSessions.delete(token), 5 * 60 * 1000);
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── DOWNLOAD BUILD ────────────────────────────────────────────────
|
||||||
|
function buildDownload(token) {
|
||||||
|
const session = demoSessions.get(token);
|
||||||
|
if (!session) return null;
|
||||||
|
|
||||||
|
return {
|
||||||
|
ix_demo_export: true,
|
||||||
|
version: '1.0',
|
||||||
|
created_at: new Date(session.created_at).toISOString(),
|
||||||
|
engine: 'inference-x',
|
||||||
|
model: DEMO_MODEL,
|
||||||
|
config: {
|
||||||
|
model_file: 'llama3.2-1b-q4_k_m.gguf',
|
||||||
|
context_size: 4096,
|
||||||
|
temperature: 0.7,
|
||||||
|
api_port: 8080,
|
||||||
|
},
|
||||||
|
quick_start: {
|
||||||
|
linux: './ix-linux-x64 --model llama3.2-1b-q4_k_m.gguf --serve 8080',
|
||||||
|
macos: './ix-macos-arm64 --model llama3.2-1b-q4_k_m.gguf --serve 8080',
|
||||||
|
windows: '.\\ix-windows-x64.exe --model llama3.2-1b-q4_k_m.gguf --serve 8080',
|
||||||
|
},
|
||||||
|
demo_stats: {
|
||||||
|
queries: session.metrics.queries,
|
||||||
|
duration_min: Math.round((Date.now() - session.created_at) / 60000),
|
||||||
|
provider: session.provider.name,
|
||||||
|
region: session.region,
|
||||||
|
},
|
||||||
|
download_links: {
|
||||||
|
engine: 'https://github.com/salkaelmadani/inference-x/releases/latest',
|
||||||
|
model: 'https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF/resolve/main/Llama-3.2-1B-Instruct-Q4_K_M.gguf',
|
||||||
|
docs: 'https://inference-x.com',
|
||||||
|
},
|
||||||
|
inference_history: session.inference_history,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── ADD PROVIDER ──────────────────────────────────────────────────
|
||||||
|
function addProvider(name, logo, api_key, client_key, daily_limit_eur) {
|
||||||
|
const id = crypto.randomBytes(8).toString('hex');
|
||||||
|
providerPool.push({
|
||||||
|
id, name, logo,
|
||||||
|
api_key: crypto.createCipheriv('aes-256-cbc',
|
||||||
|
Buffer.from((process.env.JWT_SECRET || 'demo-key-32-chars-padding-here!!').slice(0, 32)),
|
||||||
|
Buffer.alloc(16)
|
||||||
|
).update(api_key, 'utf8', 'hex'),
|
||||||
|
client_key: crypto.createCipheriv('aes-256-cbc',
|
||||||
|
Buffer.from((process.env.JWT_SECRET || 'demo-key-32-chars-padding-here!!').slice(0, 32)),
|
||||||
|
Buffer.alloc(16)
|
||||||
|
).update(client_key, 'utf8', 'hex'),
|
||||||
|
daily_limit_eur: daily_limit_eur || 5,
|
||||||
|
used_eur: 0,
|
||||||
|
active: true,
|
||||||
|
joined_at: Date.now(),
|
||||||
|
});
|
||||||
|
return id;
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── HELPERS ───────────────────────────────────────────────────────
|
||||||
|
function delay(ms) { return new Promise(r => setTimeout(r, ms)); }
|
||||||
|
function maskIp(ip) { return ip ? ip.split('.').map((p, i) => i < 2 ? p : '***').join('.') : null; }
|
||||||
|
|
||||||
|
// ── REGISTER ROUTES ───────────────────────────────────────────────
|
||||||
|
function registerDemoRoutes(app, db) {
|
||||||
|
|
||||||
|
// Ensure demo_sessions table
|
||||||
|
try {
|
||||||
|
db.exec(`
|
||||||
|
CREATE TABLE IF NOT EXISTS demo_sessions (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
token TEXT UNIQUE NOT NULL,
|
||||||
|
visitor_ip TEXT,
|
||||||
|
region TEXT,
|
||||||
|
provider_id TEXT,
|
||||||
|
onecloud_id TEXT,
|
||||||
|
status TEXT DEFAULT 'provisioning',
|
||||||
|
queries_count INTEGER DEFAULT 0,
|
||||||
|
created_at TEXT DEFAULT (datetime('now')),
|
||||||
|
expires_at TEXT,
|
||||||
|
destroyed_at TEXT,
|
||||||
|
destroy_reason TEXT
|
||||||
|
);
|
||||||
|
CREATE TABLE IF NOT EXISTS pool_providers (
|
||||||
|
id TEXT PRIMARY KEY,
|
||||||
|
name TEXT NOT NULL,
|
||||||
|
logo TEXT,
|
||||||
|
daily_limit_eur REAL DEFAULT 5,
|
||||||
|
active INTEGER DEFAULT 1,
|
||||||
|
joined_at TEXT DEFAULT (datetime('now')),
|
||||||
|
total_demos_powered INTEGER DEFAULT 0
|
||||||
|
);
|
||||||
|
`);
|
||||||
|
} catch {}
|
||||||
|
|
||||||
|
// ── POST /api/demo/start ─────────────────────────────────────
|
||||||
|
app.post('/api/demo/start', async (req, res) => {
|
||||||
|
const ip = req.headers['x-forwarded-for']?.split(',')[0]?.trim()
|
||||||
|
|| req.socket?.remoteAddress || 'unknown';
|
||||||
|
const region = req.body?.region || 'eu';
|
||||||
|
const result = await startDemo(ip, region, db);
|
||||||
|
res.json(result);
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── GET /api/demo/stream/:token — SSE live telemetry ─────────
|
||||||
|
app.get('/api/demo/stream/:token', (req, res) => {
|
||||||
|
const { token } = req.params;
|
||||||
|
const session = demoSessions.get(token);
|
||||||
|
|
||||||
|
res.setHeader('Content-Type', 'text/event-stream');
|
||||||
|
res.setHeader('Cache-Control', 'no-cache');
|
||||||
|
res.setHeader('Connection', 'keep-alive');
|
||||||
|
res.setHeader('X-Accel-Buffering', 'no');
|
||||||
|
res.flushHeaders();
|
||||||
|
|
||||||
|
if (!sseClients.has(token)) sseClients.set(token, []);
|
||||||
|
sseClients.get(token).push(res);
|
||||||
|
|
||||||
|
// Send current state immediately
|
||||||
|
if (session) {
|
||||||
|
res.write(`event: init\ndata: ${JSON.stringify({
|
||||||
|
status: session.status,
|
||||||
|
logs: session.logs,
|
||||||
|
metrics: session.metrics,
|
||||||
|
provider: session.provider,
|
||||||
|
expires_at: session.expires_at,
|
||||||
|
})}\n\n`);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Keepalive
|
||||||
|
const ka = setInterval(() => res.write(': ka\n\n'), 25000);
|
||||||
|
|
||||||
|
req.on('close', () => {
|
||||||
|
clearInterval(ka);
|
||||||
|
const clients = sseClients.get(token) || [];
|
||||||
|
const idx = clients.indexOf(res);
|
||||||
|
if (idx >= 0) clients.splice(idx, 1);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── POST /api/demo/inference ─────────────────────────────────
|
||||||
|
app.post('/api/demo/inference', async (req, res) => {
|
||||||
|
const { token, message } = req.body;
|
||||||
|
if (!token || !message) return res.status(400).json({ error: 'token + message required' });
|
||||||
|
if (message.length > 1000) return res.status(400).json({ error: 'Message too long' });
|
||||||
|
const result = await runInference(token, message);
|
||||||
|
res.json(result);
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── POST /api/demo/instance-ready — called by boot script ────
|
||||||
|
app.post('/api/demo/instance-ready', (req, res) => {
|
||||||
|
const { token, status } = req.body;
|
||||||
|
const session = demoSessions.get(token);
|
||||||
|
if (session) {
|
||||||
|
session.logs.push({ t: Date.now(), msg: '✓ Boot script completed — IX engine active', type: 'success' });
|
||||||
|
pushSSE(token, 'log', { msg: '✓ Boot script completed — IX engine active', type: 'success' });
|
||||||
|
}
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── GET /api/demo/download/:token ────────────────────────────
|
||||||
|
app.get('/api/demo/download/:token', (req, res) => {
|
||||||
|
const data = buildDownload(req.params.token);
|
||||||
|
if (!data) return res.status(404).json({ error: 'Session not found' });
|
||||||
|
res.setHeader('Content-Type', 'application/json');
|
||||||
|
res.setHeader('Content-Disposition', 'attachment; filename="ix-demo-config.json"');
|
||||||
|
res.json(data);
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── POST /api/demo/destroy ────────────────────────────────────
|
||||||
|
app.post('/api/demo/destroy', async (req, res) => {
|
||||||
|
const { token } = req.body;
|
||||||
|
await destroyDemo(token, 'user_requested', db);
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── GET /api/demo/status/:token ──────────────────────────────
|
||||||
|
app.get('/api/demo/status/:token', (req, res) => {
|
||||||
|
const session = demoSessions.get(req.params.token);
|
||||||
|
if (!session) return res.status(404).json({ error: 'Session not found or expired' });
|
||||||
|
res.json({
|
||||||
|
token: session.token,
|
||||||
|
status: session.status,
|
||||||
|
provider: session.provider,
|
||||||
|
region: session.region,
|
||||||
|
metrics: session.metrics,
|
||||||
|
queries: session.metrics.queries,
|
||||||
|
expires_at: session.expires_at,
|
||||||
|
remaining_ms: Math.max(0, session.expires_at - Date.now()),
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── GET /api/demo/stats — public counter ─────────────────────
|
||||||
|
app.get('/api/demo/stats', (req, res) => {
|
||||||
|
resetDailyCount();
|
||||||
|
const active = [...demoSessions.values()].filter(s => s.status === 'running').length;
|
||||||
|
const provisioning = [...demoSessions.values()].filter(s => s.status === 'provisioning').length;
|
||||||
|
|
||||||
|
// DB total
|
||||||
|
let dbTotal = totalDemoCount;
|
||||||
|
try {
|
||||||
|
const row = db.prepare(`SELECT COUNT(*) as c FROM demo_sessions`).get();
|
||||||
|
dbTotal = Math.max(totalDemoCount, row?.c || 0);
|
||||||
|
} catch {}
|
||||||
|
|
||||||
|
res.json({
|
||||||
|
total_all_time: dbTotal,
|
||||||
|
today: todayDemoCount,
|
||||||
|
active_now: active,
|
||||||
|
provisioning: provisioning,
|
||||||
|
pool_providers: providerPool.filter(p => p.active).length,
|
||||||
|
capacity_pct: Math.round((active / MAX_CONCURRENT) * 100),
|
||||||
|
max_concurrent: MAX_CONCURRENT,
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── POST /api/demo/pool/join — provider contributes compute ──
|
||||||
|
app.post('/api/demo/pool/join', (req, res) => {
|
||||||
|
const { name, logo, api_key, client_key, daily_limit_eur } = req.body;
|
||||||
|
if (!name || !api_key || !client_key) {
|
||||||
|
return res.status(400).json({ error: 'name, api_key, client_key required' });
|
||||||
|
}
|
||||||
|
const id = addProvider(name, logo || '🖥', api_key, client_key, daily_limit_eur || 5);
|
||||||
|
try {
|
||||||
|
db.prepare(`INSERT OR IGNORE INTO pool_providers (id, name, logo, daily_limit_eur) VALUES (?, ?, ?, ?)`)
|
||||||
|
.run(id, name, logo || '🖥', daily_limit_eur || 5);
|
||||||
|
} catch {}
|
||||||
|
res.json({
|
||||||
|
ok: true,
|
||||||
|
provider_id: id,
|
||||||
|
message: 'Welcome to the Inference-X provider pool! Your compute will power free demos.',
|
||||||
|
badge_url: `https://inference-x.com/badge/provider/${id}`,
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── GET /api/demo/pool/providers — public list ───────────────
|
||||||
|
app.get('/api/demo/pool/providers', (req, res) => {
|
||||||
|
res.json({
|
||||||
|
providers: providerPool.filter(p => p.active).map(p => ({
|
||||||
|
id: p.id,
|
||||||
|
name: p.name,
|
||||||
|
logo: p.logo,
|
||||||
|
daily_limit_eur: p.daily_limit_eur,
|
||||||
|
used_eur: parseFloat(p.used_eur.toFixed(3)),
|
||||||
|
utilization_pct: Math.round((p.used_eur / p.daily_limit_eur) * 100),
|
||||||
|
})),
|
||||||
|
call_to_action: {
|
||||||
|
title: 'Power free demos. Earn community credits.',
|
||||||
|
description: 'Contribute your OneCloud, Hetzner or OVH API keys. Your idle compute powers AI demos for people who need it.',
|
||||||
|
join_url: 'https://build.inference-x.com/#provider-join',
|
||||||
|
email: 'Elmadani.SALKA@proton.me',
|
||||||
|
},
|
||||||
|
});
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
module.exports = { registerDemoRoutes, demoSessions, totalDemoCount: () => totalDemoCount };
|
||||||
1404
site/saas/index.html
Normal file
1404
site/saas/index.html
Normal file
File diff suppressed because it is too large
Load Diff
699
site/saas/server.js
Normal file
699
site/saas/server.js
Normal file
@ -0,0 +1,699 @@
|
|||||||
|
// ═══════════════════════════════════════════════════════════════
|
||||||
|
// IX SAAS BACKEND v2.1 — Inference-X Build Platform
|
||||||
|
// All config via environment variables — zero hardcode
|
||||||
|
// ═══════════════════════════════════════════════════════════════
|
||||||
|
'use strict';
|
||||||
|
|
||||||
|
const express = require('express');
|
||||||
|
const cors = require('cors');
|
||||||
|
const bcrypt = require('bcrypt');
|
||||||
|
const jwt = require('jsonwebtoken');
|
||||||
|
const path = require('path');
|
||||||
|
const fs = require('fs');
|
||||||
|
const https = require('https');
|
||||||
|
|
||||||
|
require('dotenv').config({ path: '/opt/ix-saas/.env' });
|
||||||
|
|
||||||
|
// ── CONFIG (ALL from .env) ──────────────────────────────────────
|
||||||
|
const PORT = process.env.PORT || 4080;
|
||||||
|
const JWT_SECRET = process.env.JWT_SECRET || (() => { throw new Error('JWT_SECRET required'); })();
|
||||||
|
const DB_PATH = process.env.DB_PATH || '/opt/ix-saas/data/saas.db';
|
||||||
|
const ADMIN_EMAIL = process.env.ADMIN_EMAIL || '';
|
||||||
|
const OC_API_KEY = process.env.ONECLOUD_API_KEY || '';
|
||||||
|
const OC_CLIENT_KEY = process.env.ONECLOUD_CLIENT_KEY || '';
|
||||||
|
const STRIPE_SECRET = process.env.STRIPE_SECRET_KEY || '';
|
||||||
|
const STRIPE_WEBHOOK_SIG = process.env.STRIPE_WEBHOOK_SECRET || '';
|
||||||
|
const STRIPE_PRICE_PRO = process.env.STRIPE_PRICE_PRO || '';
|
||||||
|
const STRIPE_PRICE_BIZ = process.env.STRIPE_PRICE_BUSINESS || '';
|
||||||
|
const PAYPAL_CLIENT_ID = process.env.PAYPAL_CLIENT_ID || '';
|
||||||
|
const PAYPAL_SECRET = process.env.PAYPAL_CLIENT_SECRET || '';
|
||||||
|
const BASE_URL = process.env.BASE_URL || 'https://build.inference-x.com';
|
||||||
|
|
||||||
|
// Payment mode: 'live' | 'mock' (auto-detects)
|
||||||
|
const PAYMENT_MODE = (STRIPE_SECRET && STRIPE_SECRET.startsWith('sk_live_')) ? 'live' : 'mock';
|
||||||
|
|
||||||
|
// ── DATABASE ────────────────────────────────────────────────────
|
||||||
|
const Database = require('better-sqlite3');
|
||||||
|
const db = new Database(DB_PATH);
|
||||||
|
db.pragma('journal_mode = WAL');
|
||||||
|
db.pragma('foreign_keys = ON');
|
||||||
|
|
||||||
|
db.exec(`
|
||||||
|
CREATE TABLE IF NOT EXISTS users (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
email TEXT UNIQUE NOT NULL,
|
||||||
|
password_hash TEXT NOT NULL,
|
||||||
|
name TEXT,
|
||||||
|
plan TEXT DEFAULT 'free',
|
||||||
|
builds_count INTEGER DEFAULT 0,
|
||||||
|
region TEXT DEFAULT 'eu',
|
||||||
|
language TEXT DEFAULT 'en',
|
||||||
|
stripe_customer_id TEXT,
|
||||||
|
stripe_subscription_id TEXT,
|
||||||
|
paypal_subscription_id TEXT,
|
||||||
|
instance_id TEXT,
|
||||||
|
instance_ip TEXT,
|
||||||
|
instance_status TEXT DEFAULT 'none',
|
||||||
|
store_seller INTEGER DEFAULT 0,
|
||||||
|
store_revenue REAL DEFAULT 0,
|
||||||
|
created_at TEXT DEFAULT (datetime('now')),
|
||||||
|
last_active TEXT DEFAULT (datetime('now'))
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE TABLE IF NOT EXISTS builds (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
user_id INTEGER NOT NULL,
|
||||||
|
name TEXT NOT NULL,
|
||||||
|
model_id TEXT,
|
||||||
|
hardware TEXT,
|
||||||
|
quant TEXT,
|
||||||
|
language TEXT DEFAULT 'en',
|
||||||
|
system_prompt TEXT,
|
||||||
|
personality TEXT DEFAULT 'concise',
|
||||||
|
is_public INTEGER DEFAULT 0,
|
||||||
|
store_listed INTEGER DEFAULT 0,
|
||||||
|
store_price REAL DEFAULT 0,
|
||||||
|
downloads INTEGER DEFAULT 0,
|
||||||
|
rating REAL DEFAULT 0,
|
||||||
|
created_at TEXT DEFAULT (datetime('now')),
|
||||||
|
FOREIGN KEY(user_id) REFERENCES users(id)
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE TABLE IF NOT EXISTS subscriptions (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
user_id INTEGER UNIQUE NOT NULL,
|
||||||
|
plan TEXT NOT NULL,
|
||||||
|
status TEXT DEFAULT 'active',
|
||||||
|
provider TEXT,
|
||||||
|
provider_subscription_id TEXT,
|
||||||
|
current_period_end TEXT,
|
||||||
|
created_at TEXT DEFAULT (datetime('now')),
|
||||||
|
FOREIGN KEY(user_id) REFERENCES users(id)
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE TABLE IF NOT EXISTS instances (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
user_id INTEGER UNIQUE NOT NULL,
|
||||||
|
onecloud_id TEXT,
|
||||||
|
plan TEXT,
|
||||||
|
status TEXT DEFAULT 'pending',
|
||||||
|
ip TEXT,
|
||||||
|
region TEXT,
|
||||||
|
location_city TEXT,
|
||||||
|
vcpu INTEGER,
|
||||||
|
ram_gb INTEGER,
|
||||||
|
disk_gb INTEGER,
|
||||||
|
cost_hourly REAL,
|
||||||
|
created_at TEXT DEFAULT (datetime('now')),
|
||||||
|
last_active TEXT DEFAULT (datetime('now')),
|
||||||
|
snapshot_id TEXT,
|
||||||
|
FOREIGN KEY(user_id) REFERENCES users(id)
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE TABLE IF NOT EXISTS store_items (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
build_id INTEGER NOT NULL,
|
||||||
|
seller_id INTEGER NOT NULL,
|
||||||
|
name TEXT NOT NULL,
|
||||||
|
description TEXT,
|
||||||
|
category TEXT DEFAULT 'general',
|
||||||
|
price REAL DEFAULT 0,
|
||||||
|
is_free INTEGER DEFAULT 1,
|
||||||
|
downloads INTEGER DEFAULT 0,
|
||||||
|
rating REAL DEFAULT 0,
|
||||||
|
rating_count INTEGER DEFAULT 0,
|
||||||
|
status TEXT DEFAULT 'pending',
|
||||||
|
created_at TEXT DEFAULT (datetime('now')),
|
||||||
|
FOREIGN KEY(build_id) REFERENCES builds(id),
|
||||||
|
FOREIGN KEY(seller_id) REFERENCES users(id)
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE TABLE IF NOT EXISTS store_purchases (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
item_id INTEGER NOT NULL,
|
||||||
|
buyer_id INTEGER NOT NULL,
|
||||||
|
seller_id INTEGER NOT NULL,
|
||||||
|
price REAL,
|
||||||
|
created_at TEXT DEFAULT (datetime('now'))
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE TABLE IF NOT EXISTS store_reviews (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
item_id INTEGER NOT NULL,
|
||||||
|
user_id INTEGER NOT NULL,
|
||||||
|
rating INTEGER NOT NULL CHECK(rating BETWEEN 1 AND 5),
|
||||||
|
comment TEXT,
|
||||||
|
created_at TEXT DEFAULT (datetime('now')),
|
||||||
|
UNIQUE(item_id, user_id)
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE TABLE IF NOT EXISTS enterprise_leads (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
company TEXT,
|
||||||
|
email TEXT NOT NULL,
|
||||||
|
use_case TEXT,
|
||||||
|
created_at TEXT DEFAULT (datetime('now'))
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE TABLE IF NOT EXISTS mock_payments (
|
||||||
|
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||||
|
user_id INTEGER NOT NULL,
|
||||||
|
plan TEXT NOT NULL,
|
||||||
|
amount REAL,
|
||||||
|
currency TEXT DEFAULT 'USD',
|
||||||
|
provider TEXT DEFAULT 'mock',
|
||||||
|
session_id TEXT,
|
||||||
|
status TEXT DEFAULT 'pending',
|
||||||
|
created_at TEXT DEFAULT (datetime('now'))
|
||||||
|
);
|
||||||
|
`);
|
||||||
|
|
||||||
|
// ── PLAN CATALOG ─────────────────────────────────────────────────
|
||||||
|
const PLANS = {
|
||||||
|
free: { builds: 3, api_calls: 0, instance: null, store_sell: false },
|
||||||
|
studio_test: { builds: -1, api_calls: 10000, instance: { vcpu: 2, ram: 2, disk: 50, size_id: '87' }, store_sell: false },
|
||||||
|
lab_test: { builds: -1, api_calls: 100000, instance: { vcpu: 4, ram: 8, disk: 125, size_id: '88' }, store_sell: true },
|
||||||
|
enterprise_test: { builds: -1, api_calls: -1, instance: { vcpu: 8, ram: 32, disk: 300, size_id: '89' }, store_sell: true },
|
||||||
|
pro: { builds: -1, api_calls: 10000, instance: { vcpu: 2, ram: 2, disk: 50, size_id: '87' }, store_sell: false },
|
||||||
|
business: { builds: -1, api_calls: 100000, instance: { vcpu: 4, ram: 8, disk: 125, size_id: '88' }, store_sell: true },
|
||||||
|
enterprise: { builds: -1, api_calls: -1, instance: { vcpu: 8, ram: 32, disk: 300, size_id: '89' }, store_sell: true },
|
||||||
|
};
|
||||||
|
|
||||||
|
// OneCloud location → nearest city (real IDs from API)
|
||||||
|
const REGION_TO_LOCATION = {
|
||||||
|
eu: { city: 'Frankfurt', id: '34' },
|
||||||
|
us: { city: 'New York', id: '6' },
|
||||||
|
ap: { city: 'Singapore', id: '55' },
|
||||||
|
mena: { city: 'Fez', id: '198' },
|
||||||
|
sa: { city: 'São Paulo', id: '2' },
|
||||||
|
};
|
||||||
|
|
||||||
|
function detectRegion(ip) {
|
||||||
|
if (!ip) return 'eu';
|
||||||
|
const first = parseInt((ip || '').split('.')[0], 10);
|
||||||
|
if ([196,197,41,105].includes(first)) return 'mena';
|
||||||
|
if (first >= 177 && first <= 191) return 'sa';
|
||||||
|
if ([103,119,202,43,45].includes(first)) return 'ap';
|
||||||
|
if (first >= 3 && first <= 52 && first % 2 === 1) return 'us';
|
||||||
|
return 'eu';
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── ONECLOUD API ─────────────────────────────────────────────────
|
||||||
|
function ocRequest(method, endpoint, params = {}) {
|
||||||
|
return new Promise((resolve) => {
|
||||||
|
if (!OC_API_KEY || !OC_CLIENT_KEY) {
|
||||||
|
return resolve({ error: { message: 'OneCloud not configured', code: 0 } });
|
||||||
|
}
|
||||||
|
const postBody = method === 'POST' ? Object.entries(params)
|
||||||
|
.map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`).join('&') : '';
|
||||||
|
const getQuery = method === 'GET' && Object.keys(params).length
|
||||||
|
? '?' + Object.entries(params).map(([k, v]) => `${k}=${encodeURIComponent(v)}`).join('&') : '';
|
||||||
|
const options = {
|
||||||
|
hostname: 'api.oneprovider.com',
|
||||||
|
path: endpoint + getQuery,
|
||||||
|
method,
|
||||||
|
headers: {
|
||||||
|
'Api-Key': OC_API_KEY,
|
||||||
|
'Client-Key': OC_CLIENT_KEY,
|
||||||
|
'User-Agent': 'OneApi/1.0',
|
||||||
|
'Content-Type': 'application/x-www-form-urlencoded',
|
||||||
|
},
|
||||||
|
};
|
||||||
|
if (postBody) options.headers['Content-Length'] = Buffer.byteLength(postBody);
|
||||||
|
const req = https.request(options, (res) => {
|
||||||
|
let data = '';
|
||||||
|
res.on('data', c => data += c);
|
||||||
|
res.on('end', () => { try { resolve(JSON.parse(data)); } catch { resolve({ raw: data }); } });
|
||||||
|
});
|
||||||
|
req.on('error', (e) => resolve({ error: { message: e.message } }));
|
||||||
|
if (postBody) req.write(postBody);
|
||||||
|
req.end();
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// Instance boot script (no sensitive data)
|
||||||
|
function bootScript(plan) {
|
||||||
|
return `#!/bin/bash
|
||||||
|
export DEBIAN_FRONTEND=noninteractive
|
||||||
|
apt-get update -qq && apt-get install -y -qq curl wget nginx
|
||||||
|
curl -sL https://inference-x.com/install.sh | bash
|
||||||
|
useradd -m -s /bin/bash ixuser 2>/dev/null || true
|
||||||
|
systemctl enable nginx && systemctl start nginx
|
||||||
|
curl -sX POST ${BASE_URL}/api/instance/ready \\
|
||||||
|
-H "Content-Type: application/json" \\
|
||||||
|
-d "{\"label\":\"$(hostname)\",\"plan\":\"${plan}\"}"
|
||||||
|
`;
|
||||||
|
}
|
||||||
|
|
||||||
|
async function provisionInstance(userId, plan, region) {
|
||||||
|
const planCfg = PLANS[plan];
|
||||||
|
if (!planCfg?.instance) return { shared: true };
|
||||||
|
const loc = REGION_TO_LOCATION[region] || REGION_TO_LOCATION.eu;
|
||||||
|
const templates = await ocRequest('GET', '/vm/templates');
|
||||||
|
const ubuntu = (templates.response || []).find(t => (t.name||'').toLowerCase().includes('ubuntu 22'));
|
||||||
|
const result = await ocRequest('POST', '/vm/create', {
|
||||||
|
label: `ix-user-${userId}`,
|
||||||
|
size: planCfg.instance.size_id,
|
||||||
|
location: loc.id,
|
||||||
|
template: ubuntu ? ubuntu.id : 'ubuntu-22',
|
||||||
|
script: bootScript(plan),
|
||||||
|
});
|
||||||
|
if (result.response?.id) {
|
||||||
|
db.prepare(`INSERT OR REPLACE INTO instances
|
||||||
|
(user_id, onecloud_id, plan, status, region, location_city, vcpu, ram_gb, disk_gb)
|
||||||
|
VALUES (?, ?, ?, 'provisioning', ?, ?, ?, ?, ?)
|
||||||
|
`).run(userId, result.response.id, plan, region, loc.city,
|
||||||
|
planCfg.instance.vcpu, planCfg.instance.ram, planCfg.instance.disk);
|
||||||
|
db.prepare(`UPDATE users SET instance_id=?, instance_status='provisioning' WHERE id=?`)
|
||||||
|
.run(result.response.id, userId);
|
||||||
|
}
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── MOCK PAYMENT ─────────────────────────────────────────────────
|
||||||
|
function createMockSession(userId, plan) {
|
||||||
|
const prices = { pro: 0, business: 0, enterprise: 0, studio_test: 0, lab_test: 0, enterprise_test: 0 };
|
||||||
|
const sessionId = 'mock_' + Date.now() + '_' + Math.random().toString(36).slice(2, 10);
|
||||||
|
db.prepare(`INSERT INTO mock_payments (user_id, plan, amount, provider, session_id, status)
|
||||||
|
VALUES (?, ?, ?, 'mock', ?, 'pending')`
|
||||||
|
).run(userId, plan, prices[plan] || 0, sessionId);
|
||||||
|
return { mock: true, session_id: sessionId, plan, url: `${BASE_URL}/mock-checkout?session=${sessionId}&plan=${plan}` };
|
||||||
|
}
|
||||||
|
|
||||||
|
async function completeMockPayment(sessionId) {
|
||||||
|
const payment = db.prepare(`SELECT * FROM mock_payments WHERE session_id=?`).get(sessionId);
|
||||||
|
if (!payment) return null;
|
||||||
|
db.prepare(`UPDATE mock_payments SET status='completed' WHERE session_id=?`).run(sessionId);
|
||||||
|
db.prepare(`UPDATE users SET plan=? WHERE id=?`).run(payment.plan, payment.user_id);
|
||||||
|
const u = db.prepare(`SELECT * FROM users WHERE id=?`).get(payment.user_id);
|
||||||
|
if (u) await provisionInstance(payment.user_id, payment.plan, u.region || 'eu');
|
||||||
|
return payment;
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── APP ───────────────────────────────────────────────────────────
|
||||||
|
const app = express();
|
||||||
|
app.use(cors({ origin: true, credentials: true }));
|
||||||
|
// Stripe webhook needs raw body
|
||||||
|
app.use('/api/billing/stripe-webhook', express.raw({ type: 'application/json' }));
|
||||||
|
app.use(express.json({ limit: '5mb' }));
|
||||||
|
|
||||||
|
// Security headers
|
||||||
|
app.use((req, res, next) => {
|
||||||
|
res.setHeader('X-Content-Type-Options', 'nosniff');
|
||||||
|
res.setHeader('X-Frame-Options', 'SAMEORIGIN');
|
||||||
|
res.setHeader('Referrer-Policy', 'strict-origin-when-cross-origin');
|
||||||
|
next();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── MIDDLEWARE ────────────────────────────────────────────────────
|
||||||
|
function auth(req, res, next) {
|
||||||
|
const h = req.headers.authorization || '';
|
||||||
|
if (!h.startsWith('Bearer ')) return res.status(401).json({ error: 'Unauthorized' });
|
||||||
|
try {
|
||||||
|
req.user = jwt.verify(h.slice(7), JWT_SECRET);
|
||||||
|
db.prepare(`UPDATE users SET last_active=datetime('now') WHERE id=?`).run(req.user.id);
|
||||||
|
next();
|
||||||
|
} catch { res.status(401).json({ error: 'Invalid token' }); }
|
||||||
|
}
|
||||||
|
|
||||||
|
function adminOnly(req, res, next) {
|
||||||
|
auth(req, res, () => {
|
||||||
|
if (!ADMIN_EMAIL || req.user.email !== ADMIN_EMAIL)
|
||||||
|
return res.status(403).json({ error: 'Admin only' });
|
||||||
|
next();
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
function requirePlan(...plans) {
|
||||||
|
return (req, res, next) => {
|
||||||
|
auth(req, res, () => {
|
||||||
|
const u = db.prepare('SELECT plan FROM users WHERE id=?').get(req.user.id);
|
||||||
|
if (!u || !plans.includes(u.plan))
|
||||||
|
return res.status(403).json({ error: `Requires: ${plans.join(' or ')}`, upgrade: true });
|
||||||
|
next();
|
||||||
|
});
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── AUTH ──────────────────────────────────────────────────────────
|
||||||
|
app.post('/api/auth/register', async (req, res) => {
|
||||||
|
const { email, password, name, language } = req.body;
|
||||||
|
if (!email || !password) return res.status(400).json({ error: 'Email and password required' });
|
||||||
|
if (password.length < 8) return res.status(400).json({ error: 'Password too short (8+ chars)' });
|
||||||
|
try {
|
||||||
|
const hash = await bcrypt.hash(password, 10);
|
||||||
|
const ip = req.headers['x-forwarded-for']?.split(',')[0] || req.ip || '';
|
||||||
|
const region = detectRegion(ip);
|
||||||
|
const u = db.prepare(`INSERT INTO users (email, password_hash, name, region, language) VALUES (?, ?, ?, ?, ?)`)
|
||||||
|
.run(email.toLowerCase().trim(), hash, name || email.split('@')[0], region, language || 'en');
|
||||||
|
const token = jwt.sign({ id: u.lastInsertRowid, email: email.toLowerCase() }, JWT_SECRET, { expiresIn: '30d' });
|
||||||
|
res.json({ token, user: { id: u.lastInsertRowid, email: email.toLowerCase(), name, plan: 'free', region } });
|
||||||
|
} catch (e) {
|
||||||
|
if (e.message?.includes('UNIQUE')) return res.status(409).json({ error: 'Email already used' });
|
||||||
|
res.status(500).json({ error: 'Registration failed' });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/auth/login', async (req, res) => {
|
||||||
|
const { email, password } = req.body;
|
||||||
|
const u = db.prepare('SELECT * FROM users WHERE email=?').get((email||'').toLowerCase().trim());
|
||||||
|
if (!u || !await bcrypt.compare(password, u.password_hash))
|
||||||
|
return res.status(401).json({ error: 'Invalid email or password' });
|
||||||
|
const token = jwt.sign({ id: u.id, email: u.email }, JWT_SECRET, { expiresIn: '30d' });
|
||||||
|
res.json({ token, user: { id: u.id, email: u.email, name: u.name, plan: u.plan, region: u.region, instance_status: u.instance_status } });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/api/auth/me', auth, (req, res) => {
|
||||||
|
const u = db.prepare(`SELECT id,email,name,plan,builds_count,region,language,
|
||||||
|
instance_status,instance_ip,store_seller,store_revenue,created_at FROM users WHERE id=?`).get(req.user.id);
|
||||||
|
if (!u) return res.status(404).json({ error: 'Not found' });
|
||||||
|
res.json(u);
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── BUILDS ───────────────────────────────────────────────────────
|
||||||
|
app.get('/api/builds', auth, (req, res) => {
|
||||||
|
res.json(db.prepare('SELECT * FROM builds WHERE user_id=? ORDER BY created_at DESC').all(req.user.id));
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/builds', auth, (req, res) => {
|
||||||
|
const u = db.prepare('SELECT plan, builds_count FROM users WHERE id=?').get(req.user.id);
|
||||||
|
const limit = PLANS[u.plan]?.builds ?? 3;
|
||||||
|
if (limit !== -1 && u.builds_count >= limit)
|
||||||
|
return res.status(403).json({ error: `Plan limit: ${limit} builds. Upgrade to continue.`, upgrade: true });
|
||||||
|
const { name, model_id, hardware, quant, language, system_prompt, personality } = req.body;
|
||||||
|
if (!name?.trim()) return res.status(400).json({ error: 'Build name required' });
|
||||||
|
const b = db.prepare(`INSERT INTO builds (user_id,name,model_id,hardware,quant,language,system_prompt,personality)
|
||||||
|
VALUES (?,?,?,?,?,?,?,?)`).run(req.user.id, name.trim(), model_id, hardware, quant, language, system_prompt, personality);
|
||||||
|
db.prepare('UPDATE users SET builds_count=builds_count+1 WHERE id=?').run(req.user.id);
|
||||||
|
res.json({ id: b.lastInsertRowid, name: name.trim() });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.delete('/api/builds/:id', auth, (req, res) => {
|
||||||
|
const b = db.prepare('SELECT id FROM builds WHERE id=? AND user_id=?').get(req.params.id, req.user.id);
|
||||||
|
if (!b) return res.status(404).json({ error: 'Build not found' });
|
||||||
|
db.prepare('DELETE FROM builds WHERE id=?').run(req.params.id);
|
||||||
|
db.prepare('UPDATE users SET builds_count=MAX(0,builds_count-1) WHERE id=?').run(req.user.id);
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── INSTANCES ────────────────────────────────────────────────────
|
||||||
|
app.get('/api/instance', auth, (req, res) => {
|
||||||
|
const inst = db.prepare('SELECT * FROM instances WHERE user_id=?').get(req.user.id);
|
||||||
|
res.json(inst || { status: 'none' });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/instance/provision', auth, async (req, res) => {
|
||||||
|
const u = db.prepare('SELECT plan, region FROM users WHERE id=?').get(req.user.id);
|
||||||
|
if (u.plan === 'free') return res.status(403).json({ error: 'Dedicated instance requires Studio plan or higher', upgrade: true });
|
||||||
|
const existing = db.prepare('SELECT status FROM instances WHERE user_id=?').get(req.user.id);
|
||||||
|
if (existing && existing.status !== 'destroyed') return res.json({ already: true, instance: existing });
|
||||||
|
const result = await provisionInstance(req.user.id, u.plan, u.region || 'eu');
|
||||||
|
res.json({ ok: true, result });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/instance/ready', (req, res) => {
|
||||||
|
const { label, plan } = req.body;
|
||||||
|
if (label) {
|
||||||
|
db.prepare(`UPDATE instances SET status='running' WHERE onecloud_id=?`).run(label);
|
||||||
|
db.prepare(`UPDATE users SET instance_status='running' WHERE instance_id=?`).run(label);
|
||||||
|
}
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/instance/stop', auth, async (req, res) => {
|
||||||
|
const inst = db.prepare('SELECT onecloud_id FROM instances WHERE user_id=?').get(req.user.id);
|
||||||
|
if (!inst?.onecloud_id) return res.status(404).json({ error: 'No instance' });
|
||||||
|
await ocRequest('POST', '/vm/shutdown', { vm_id: inst.onecloud_id });
|
||||||
|
db.prepare(`UPDATE instances SET status='stopped' WHERE user_id=?`).run(req.user.id);
|
||||||
|
db.prepare(`UPDATE users SET instance_status='stopped' WHERE id=?`).run(req.user.id);
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/instance/start', auth, async (req, res) => {
|
||||||
|
const inst = db.prepare('SELECT onecloud_id FROM instances WHERE user_id=?').get(req.user.id);
|
||||||
|
if (!inst?.onecloud_id) return res.status(404).json({ error: 'No instance' });
|
||||||
|
await ocRequest('POST', '/vm/boot', { vm_id: inst.onecloud_id });
|
||||||
|
db.prepare(`UPDATE instances SET status='running' WHERE user_id=?`).run(req.user.id);
|
||||||
|
db.prepare(`UPDATE users SET instance_status='running' WHERE id=?`).run(req.user.id);
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/instance/snapshot', auth, async (req, res) => {
|
||||||
|
const inst = db.prepare('SELECT onecloud_id FROM instances WHERE user_id=?').get(req.user.id);
|
||||||
|
if (!inst?.onecloud_id) return res.status(404).json({ error: 'No instance' });
|
||||||
|
const r = await ocRequest('POST', '/vm/image/create', { vm_id: inst.onecloud_id, label: `ix-snap-${req.user.id}-${Date.now()}` });
|
||||||
|
if (r.response?.id) db.prepare(`UPDATE instances SET snapshot_id=? WHERE user_id=?`).run(r.response.id, req.user.id);
|
||||||
|
res.json({ ok: true, snapshot: r.response });
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── STORE ────────────────────────────────────────────────────────
|
||||||
|
app.get('/api/store', (req, res) => {
|
||||||
|
const { category, sort = 'downloads', limit = 20, page = 0 } = req.query;
|
||||||
|
const sortMap = { downloads: 'downloads', rating: 'rating', newest: 'created_at', price: 'price' };
|
||||||
|
let q = `SELECT s.id,s.name,s.description,s.category,s.price,s.is_free,s.downloads,s.rating,s.rating_count,
|
||||||
|
s.status,s.created_at,u.name as seller_name,b.model_id,b.personality
|
||||||
|
FROM store_items s JOIN users u ON s.seller_id=u.id JOIN builds b ON s.build_id=b.id
|
||||||
|
WHERE s.status='approved'`;
|
||||||
|
const params = [];
|
||||||
|
if (category && category !== 'all') { q += ' AND s.category=?'; params.push(category); }
|
||||||
|
q += ` ORDER BY s.${sortMap[sort]||'downloads'} DESC LIMIT ? OFFSET ?`;
|
||||||
|
params.push(parseInt(limit)||20, parseInt(page)*parseInt(limit)||0);
|
||||||
|
res.json(db.prepare(q).all(...params));
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/api/store/:id', (req, res) => {
|
||||||
|
const item = db.prepare(`
|
||||||
|
SELECT s.*,u.name as seller_name,b.model_id,b.personality
|
||||||
|
FROM store_items s JOIN users u ON s.seller_id=u.id JOIN builds b ON s.build_id=b.id
|
||||||
|
WHERE s.id=? AND s.status='approved'
|
||||||
|
`).get(req.params.id);
|
||||||
|
if (!item) return res.status(404).json({ error: 'Not found' });
|
||||||
|
const reviews = db.prepare(`SELECT r.rating,r.comment,r.created_at,u.name
|
||||||
|
FROM store_reviews r JOIN users u ON r.user_id=u.id WHERE r.item_id=?
|
||||||
|
ORDER BY r.created_at DESC LIMIT 20`).all(req.params.id);
|
||||||
|
res.json({ ...item, reviews });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/store/publish', auth, requirePlan('business', 'enterprise'), (req, res) => {
|
||||||
|
const { build_id, name, description, category, price, is_free } = req.body;
|
||||||
|
const build = db.prepare('SELECT id FROM builds WHERE id=? AND user_id=?').get(build_id, req.user.id);
|
||||||
|
if (!build) return res.status(404).json({ error: 'Build not found' });
|
||||||
|
if (!name?.trim()) return res.status(400).json({ error: 'Name required' });
|
||||||
|
const item = db.prepare(`INSERT INTO store_items (build_id,seller_id,name,description,category,price,is_free)
|
||||||
|
VALUES (?,?,?,?,?,?,?)`).run(build_id, req.user.id, name.trim(), description, category||'general', price||0, is_free?1:0);
|
||||||
|
db.prepare('UPDATE users SET store_seller=1 WHERE id=?').run(req.user.id);
|
||||||
|
res.json({ id: item.lastInsertRowid, status: 'pending', message: 'Submitted for review (24-48h)' });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/store/:id/download', auth, (req, res) => {
|
||||||
|
const item = db.prepare('SELECT * FROM store_items WHERE id=? AND status=?').get(req.params.id, 'approved');
|
||||||
|
if (!item) return res.status(404).json({ error: 'Not found' });
|
||||||
|
if (!item.is_free && item.price > 0) {
|
||||||
|
const bought = db.prepare('SELECT id FROM store_purchases WHERE item_id=? AND buyer_id=?').get(req.params.id, req.user.id);
|
||||||
|
if (!bought) return res.status(402).json({ error: 'Purchase required', price: item.price });
|
||||||
|
}
|
||||||
|
db.prepare('UPDATE store_items SET downloads=downloads+1 WHERE id=?').run(req.params.id);
|
||||||
|
if (!item.is_free && item.price > 0)
|
||||||
|
db.prepare('UPDATE users SET store_revenue=store_revenue+? WHERE id=?').run(item.price * 0.8, item.seller_id);
|
||||||
|
const build = db.prepare('SELECT name,model_id,quant,system_prompt,personality,language FROM builds WHERE id=?').get(item.build_id);
|
||||||
|
res.json({ build, item: { id: item.id, name: item.name, category: item.category } });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/store/:id/review', auth, (req, res) => {
|
||||||
|
const { rating, comment } = req.body;
|
||||||
|
if (!rating || rating < 1 || rating > 5) return res.status(400).json({ error: 'Rating 1-5 required' });
|
||||||
|
try {
|
||||||
|
db.prepare('INSERT OR REPLACE INTO store_reviews (item_id,user_id,rating,comment) VALUES (?,?,?,?)')
|
||||||
|
.run(req.params.id, req.user.id, rating, comment||'');
|
||||||
|
const avg = db.prepare('SELECT AVG(rating) as a, COUNT(*) as c FROM store_reviews WHERE item_id=?').get(req.params.id);
|
||||||
|
db.prepare('UPDATE store_items SET rating=?,rating_count=? WHERE id=?').run(avg.a, avg.c, req.params.id);
|
||||||
|
res.json({ ok: true, rating: avg.a });
|
||||||
|
} catch (e) { res.status(500).json({ error: 'Review failed' }); }
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── BILLING ───────────────────────────────────────────────────────
|
||||||
|
app.post('/api/billing/create-session', auth, async (req, res) => {
|
||||||
|
const { plan } = req.body;
|
||||||
|
if (!PLANS[plan] || plan === 'free') return res.status(400).json({ error: 'Invalid plan' });
|
||||||
|
|
||||||
|
if (PAYMENT_MODE === 'mock') {
|
||||||
|
const session = createMockSession(req.user.id, plan);
|
||||||
|
return res.json(session);
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
const Stripe = require('stripe')(STRIPE_SECRET);
|
||||||
|
const prices = { pro: STRIPE_PRICE_PRO, business: STRIPE_PRICE_BIZ };
|
||||||
|
const session = await Stripe.checkout.sessions.create({
|
||||||
|
mode: 'subscription',
|
||||||
|
payment_method_types: ['card'],
|
||||||
|
line_items: [{ price: prices[plan], quantity: 1 }],
|
||||||
|
success_url: `${BASE_URL}/success?session_id={CHECKOUT_SESSION_ID}`,
|
||||||
|
cancel_url: `${BASE_URL}/#pricing`,
|
||||||
|
metadata: { user_id: String(req.user.id), plan },
|
||||||
|
});
|
||||||
|
res.json({ url: session.url });
|
||||||
|
} catch (e) { res.status(500).json({ error: 'Payment session failed' }); }
|
||||||
|
});
|
||||||
|
|
||||||
|
// Mock checkout completion
|
||||||
|
app.post('/api/billing/mock-complete', async (req, res) => {
|
||||||
|
const { session_id } = req.body;
|
||||||
|
if (!session_id?.startsWith('mock_'))
|
||||||
|
return res.status(400).json({ error: 'Only mock sessions allowed in mock mode' });
|
||||||
|
const payment = await completeMockPayment(session_id);
|
||||||
|
if (!payment) return res.status(404).json({ error: 'Session not found' });
|
||||||
|
res.json({ ok: true, plan: payment.plan, user_id: payment.user_id });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/api/billing/status', auth, (req, res) => {
|
||||||
|
const sub = db.prepare('SELECT * FROM subscriptions WHERE user_id=?').get(req.user.id);
|
||||||
|
const mock = db.prepare(`SELECT * FROM mock_payments WHERE user_id=? AND status='completed' ORDER BY id DESC LIMIT 1`).get(req.user.id);
|
||||||
|
res.json({ subscription: sub, mock_payment: mock, mode: PAYMENT_MODE });
|
||||||
|
});
|
||||||
|
|
||||||
|
// Stripe webhook
|
||||||
|
app.post('/api/billing/stripe-webhook', async (req, res) => {
|
||||||
|
if (PAYMENT_MODE !== 'live' || !STRIPE_WEBHOOK_SIG) return res.json({ ok: true });
|
||||||
|
try {
|
||||||
|
const Stripe = require('stripe')(STRIPE_SECRET);
|
||||||
|
const event = Stripe.webhooks.constructEvent(req.body, req.headers['stripe-signature'], STRIPE_WEBHOOK_SIG);
|
||||||
|
if (event.type === 'checkout.session.completed') {
|
||||||
|
const { user_id, plan } = event.data.object.metadata;
|
||||||
|
db.prepare('UPDATE users SET plan=? WHERE id=?').run(plan, user_id);
|
||||||
|
const u = db.prepare('SELECT * FROM users WHERE id=?').get(user_id);
|
||||||
|
if (u) await provisionInstance(Number(user_id), plan, u.region||'eu');
|
||||||
|
}
|
||||||
|
if (event.type === 'customer.subscription.deleted') {
|
||||||
|
const u = db.prepare('SELECT * FROM users WHERE stripe_subscription_id=?').get(event.data.object.id);
|
||||||
|
if (u) {
|
||||||
|
db.prepare('UPDATE users SET plan=? WHERE id=?').run('free', u.id);
|
||||||
|
const inst = db.prepare('SELECT onecloud_id FROM instances WHERE user_id=?').get(u.id);
|
||||||
|
if (inst?.onecloud_id) {
|
||||||
|
await ocRequest('POST', '/vm/image/create', { vm_id: inst.onecloud_id, label: `ix-final-${u.id}` });
|
||||||
|
await ocRequest('POST', '/vm/destroy', { vm_id: inst.onecloud_id });
|
||||||
|
db.prepare(`UPDATE instances SET status='destroyed' WHERE user_id=?`).run(u.id);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
res.json({ received: true });
|
||||||
|
} catch (e) { res.status(400).json({ error: e.message }); }
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── ADMIN ────────────────────────────────────────────────────────
|
||||||
|
app.get('/api/admin/stats', adminOnly, (req, res) => {
|
||||||
|
const u = db.prepare('SELECT count(*) as t FROM users').get().t;
|
||||||
|
const f = db.prepare("SELECT count(*) as t FROM users WHERE plan='free'").get().t;
|
||||||
|
const p = db.prepare("SELECT count(*) as t FROM users WHERE plan='pro'").get().t;
|
||||||
|
const b = db.prepare("SELECT count(*) as t FROM users WHERE plan='business'").get().t;
|
||||||
|
const e = db.prepare("SELECT count(*) as t FROM users WHERE plan='enterprise'").get().t;
|
||||||
|
const mrr = p*49 + b*199 + e*999;
|
||||||
|
const builds = db.prepare('SELECT COALESCE(sum(builds_count),0) as t FROM users').get().t;
|
||||||
|
const items = db.prepare("SELECT count(*) as t FROM store_items WHERE status='approved'").get().t;
|
||||||
|
const instances= db.prepare("SELECT count(*) as t FROM instances WHERE status='running'").get().t;
|
||||||
|
const revenue = db.prepare('SELECT COALESCE(sum(store_revenue),0) as t FROM users').get().t;
|
||||||
|
const mocks = db.prepare("SELECT count(*) as t FROM mock_payments WHERE status='completed'").get().t;
|
||||||
|
res.json({ users:u, free:f, pro:p, business:b, enterprise:e, mrr, arr:mrr*12, builds, store_items:items, instances, seller_revenue:revenue, mock_completions:mocks, payment_mode:PAYMENT_MODE });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/api/admin/users', adminOnly, (req, res) => {
|
||||||
|
res.json(db.prepare('SELECT id,email,name,plan,builds_count,region,instance_status,store_seller,created_at FROM users ORDER BY created_at DESC LIMIT 200').all());
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/api/admin/instances', adminOnly, (req, res) => {
|
||||||
|
res.json(db.prepare('SELECT i.*,u.email,u.plan FROM instances i JOIN users u ON i.user_id=u.id ORDER BY i.created_at DESC').all());
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/api/admin/store', adminOnly, (req, res) => {
|
||||||
|
res.json(db.prepare(`SELECT s.*,u.email as seller_email FROM store_items s JOIN users u ON s.seller_id=u.id ORDER BY s.created_at DESC`).all());
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/admin/store/:id/approve', adminOnly, (req, res) => {
|
||||||
|
db.prepare("UPDATE store_items SET status='approved' WHERE id=?").run(req.params.id);
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post('/api/admin/store/:id/reject', adminOnly, (req, res) => {
|
||||||
|
db.prepare("UPDATE store_items SET status='rejected' WHERE id=?").run(req.params.id);
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
// ── MISC ──────────────────────────────────────────────────────────
|
||||||
|
app.post('/api/contact/enterprise', (req, res) => {
|
||||||
|
const { company, email, use_case } = req.body;
|
||||||
|
if (!email) return res.status(400).json({ error: 'Email required' });
|
||||||
|
db.prepare('INSERT INTO enterprise_leads (company,email,use_case) VALUES (?,?,?)').run(company, email, use_case);
|
||||||
|
res.json({ ok: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/api/health', (req, res) => {
|
||||||
|
const stats = db.prepare('SELECT count(*) as t FROM users').get();
|
||||||
|
res.json({ status: 'ok', service: 'ix-saas', version: '2.1.0', users: stats.t, payment_mode: PAYMENT_MODE, ts: new Date().toISOString() });
|
||||||
|
});
|
||||||
|
|
||||||
|
// Mock checkout page
|
||||||
|
app.get('/mock-checkout', (req, res) => {
|
||||||
|
const { session, plan } = req.query;
|
||||||
|
const prices = { pro: 0, business: 0, enterprise: 0, studio_test: 0, lab_test: 0, enterprise_test: 0 };
|
||||||
|
res.send(`<!DOCTYPE html><html><head><meta charset="UTF-8"><title>IX Mock Checkout</title>
|
||||||
|
<style>body{font-family:monospace;background:#06060f;color:#eee;display:flex;align-items:center;justify-content:center;height:100vh;margin:0}
|
||||||
|
.box{background:#10101f;border:1px solid #c8501a;border-radius:12px;padding:2rem;text-align:center;max-width:400px}
|
||||||
|
h2{color:#c8501a;margin-bottom:1rem}p{color:#888;font-size:.9rem}
|
||||||
|
.price{font-size:2rem;color:#fff;margin:1rem 0}
|
||||||
|
button{background:#c8501a;color:#fff;border:none;padding:.8rem 2rem;border-radius:8px;cursor:pointer;font-size:1rem}
|
||||||
|
button:hover{background:#e06030}.note{font-size:.7rem;color:#444;margin-top:1rem}</style>
|
||||||
|
</head><body><div class="box">
|
||||||
|
<h2>⚡ IX Mock Checkout</h2>
|
||||||
|
<p>Plan: <strong style="color:#eee;text-transform:uppercase">${plan}</strong></p>
|
||||||
|
<div class="price">$${prices[plan]||'?'}/mo</div>
|
||||||
|
<p>Test plan — free activation — no real charge.</p>
|
||||||
|
<button onclick="complete()">Activate Test Plan →</button>
|
||||||
|
<div class="note">session: ${session}</div>
|
||||||
|
</div>
|
||||||
|
<script>
|
||||||
|
async function complete() {
|
||||||
|
const r = await fetch('/api/billing/mock-complete', {
|
||||||
|
method:'POST',headers:{'Content-Type':'application/json'},
|
||||||
|
body:JSON.stringify({session_id:'${session}'})
|
||||||
|
});
|
||||||
|
const d = await r.json();
|
||||||
|
if (d.ok) window.location.href = '/?mock_success=1&plan=${plan}';
|
||||||
|
else alert('Error: ' + d.error);
|
||||||
|
}
|
||||||
|
</script></body></html>`);
|
||||||
|
});
|
||||||
|
|
||||||
|
// Static
|
||||||
|
|
||||||
|
// DEMO MODULE
|
||||||
|
const demoMod = require("./demo_module");
|
||||||
|
demoMod.registerDemoRoutes(app, db);
|
||||||
|
|
||||||
|
// Community V3 routes (before static)
|
||||||
|
const cv3 = require('./community_v3');
|
||||||
|
cv3.registerCommunityV3Routes(app);
|
||||||
|
|
||||||
|
app.use(express.static(path.join(__dirname, 'public')));
|
||||||
|
app.use((req, res) => {
|
||||||
|
const idx = path.join(__dirname, 'public', 'index.html');
|
||||||
|
if (fs.existsSync(idx)) res.sendFile(idx);
|
||||||
|
else res.status(404).json({ error: 'Not found' });
|
||||||
|
});
|
||||||
|
|
||||||
|
// Cron: snapshot stale instances every 6h
|
||||||
|
setInterval(async () => {
|
||||||
|
const stale = db.prepare(`SELECT i.*,u.email FROM instances i JOIN users u ON i.user_id=u.id
|
||||||
|
WHERE i.status='running' AND datetime(i.last_active,'+30 days') < datetime('now')`).all();
|
||||||
|
for (const inst of stale) {
|
||||||
|
if (!inst.onecloud_id) continue;
|
||||||
|
await ocRequest('POST', '/vm/image/create', { vm_id: inst.onecloud_id, label: `ix-auto-${inst.user_id}` });
|
||||||
|
await ocRequest('POST', '/vm/shutdown', { vm_id: inst.onecloud_id });
|
||||||
|
db.prepare("UPDATE instances SET status='stopped' WHERE user_id=?").run(inst.user_id);
|
||||||
|
console.log(`[CRON] Snapshotted idle instance for ${inst.email}`);
|
||||||
|
}
|
||||||
|
}, 6 * 60 * 60 * 1000);
|
||||||
|
|
||||||
|
|
||||||
|
app.listen(PORT, '127.0.0.1', () => {
|
||||||
|
console.log(`[IX SAAS v2.1] :${PORT} | payment=${PAYMENT_MODE} | onecloud=${OC_API_KEY?'yes':'no'}`);
|
||||||
|
});
|
||||||
972
site/vitrine/index.html
Normal file
972
site/vitrine/index.html
Normal file
@ -0,0 +1,972 @@
|
|||||||
|
<!DOCTYPE html>
|
||||||
|
<html lang="en" data-theme="dark">
|
||||||
|
<head>
|
||||||
|
<meta charset="UTF-8">
|
||||||
|
<meta name="viewport" content="width=device-width,initial-scale=1.0">
|
||||||
|
<title>Inference-X — Intelligence for Everyone</title>
|
||||||
|
<meta name="description" content="305KB. Runs on anything. Your AI, your device, your rules. Free forever.">
|
||||||
|
<link rel="preconnect" href="https://fonts.googleapis.com">
|
||||||
|
<link href="https://fonts.googleapis.com/css2?family=Fraunces:ital,opsz,wght@0,9..144,300;0,9..144,700;0,9..144,900;1,9..144,600&family=JetBrains+Mono:wght@400;500&family=Nunito:wght@400;600;700;800&display=swap" rel="stylesheet">
|
||||||
|
<style>
|
||||||
|
:root {
|
||||||
|
--bg: #0C0A09;
|
||||||
|
--bg2: #151210;
|
||||||
|
--bg3: #1E1A17;
|
||||||
|
--card: #221E1A;
|
||||||
|
--border: #2E2825;
|
||||||
|
--copper: #C9622A;
|
||||||
|
--amber: #F0A030;
|
||||||
|
--teal: #2ECCB8;
|
||||||
|
--sand: #D9C8B0;
|
||||||
|
--text: #EDE0D0;
|
||||||
|
--muted: #8A7A6A;
|
||||||
|
--green: #4CAF6A;
|
||||||
|
--red: #E05050;
|
||||||
|
--font: 'Nunito', sans-serif;
|
||||||
|
--mono: 'JetBrains Mono', monospace;
|
||||||
|
--display: 'Fraunces', serif;
|
||||||
|
}
|
||||||
|
[data-theme="light"] {
|
||||||
|
--bg: #F8F4EF;
|
||||||
|
--bg2: #F0EBE3;
|
||||||
|
--bg3: #E8E0D5;
|
||||||
|
--card: #FFFCF8;
|
||||||
|
--border: #D5CCC0;
|
||||||
|
--text: #1A1410;
|
||||||
|
--muted: #6A5A4A;
|
||||||
|
--sand: #4A3A2A;
|
||||||
|
}
|
||||||
|
*{margin:0;padding:0;box-sizing:border-box}
|
||||||
|
html{scroll-behavior:smooth;font-size:16px}
|
||||||
|
body{background:var(--bg);color:var(--text);font-family:var(--font);line-height:1.6;overflow-x:hidden}
|
||||||
|
|
||||||
|
/* GRAIN */
|
||||||
|
body::before{content:'';position:fixed;inset:0;pointer-events:none;z-index:9999;opacity:.025;background-image:url("data:image/svg+xml,%3Csvg viewBox='0 0 512 512' xmlns='http://www.w3.org/2000/svg'%3E%3Cfilter id='n'%3E%3CfeTurbulence type='fractalNoise' baseFrequency='.8' numOctaves='4'/%3E%3C/filter%3E%3Crect width='100%25' height='100%25' filter='url(%23n)'/%3E%3C/svg%3E")}
|
||||||
|
|
||||||
|
/* NAV */
|
||||||
|
nav{position:fixed;top:0;left:0;right:0;z-index:1000;display:flex;align-items:center;justify-content:space-between;padding:.9rem 1.5rem;background:rgba(12,10,9,.88);backdrop-filter:blur(12px);border-bottom:1px solid var(--border)}
|
||||||
|
.nav-brand{font-family:var(--display);font-size:1.3rem;font-weight:700;color:var(--copper);letter-spacing:-.02em;text-decoration:none}
|
||||||
|
.nav-brand span{color:var(--amber)}
|
||||||
|
.nav-right{display:flex;align-items:center;gap:.8rem}
|
||||||
|
.nav-links{display:flex;gap:1.2rem;list-style:none}
|
||||||
|
.nav-links a{color:var(--muted);text-decoration:none;font-size:.85rem;font-weight:600;transition:color .2s}
|
||||||
|
.nav-links a:hover{color:var(--text)}
|
||||||
|
#tt{background:none;border:1px solid var(--border);color:var(--muted);cursor:pointer;padding:.35rem .6rem;border-radius:.4rem;font-size:.9rem;transition:all .2s}
|
||||||
|
#tt:hover{border-color:var(--amber);color:var(--amber)}
|
||||||
|
#lb{background:none;border:1px solid var(--border);color:var(--muted);cursor:pointer;padding:.35rem .6rem;border-radius:.4rem;font-size:.75rem;font-weight:600;font-family:var(--font);transition:all .2s}
|
||||||
|
#lb:hover{border-color:var(--teal);color:var(--teal)}
|
||||||
|
|
||||||
|
/* SECTIONS */
|
||||||
|
section{padding:5rem 0;max-width:1100px;margin:0 auto;padding-left:1.5rem;padding-right:1.5rem}
|
||||||
|
section.full{max-width:none;padding-left:0;padding-right:0}
|
||||||
|
.section-tag{font-size:.72rem;font-weight:800;letter-spacing:.15em;text-transform:uppercase;color:var(--copper);margin-bottom:.8rem}
|
||||||
|
h1{font-family:var(--display);font-size:clamp(2.5rem,6vw,5rem);font-weight:900;line-height:1.05;letter-spacing:-.03em}
|
||||||
|
h2{font-family:var(--display);font-size:clamp(1.8rem,4vw,3rem);font-weight:700;line-height:1.1;letter-spacing:-.02em}
|
||||||
|
h3{font-size:1.1rem;font-weight:700;margin-bottom:.5rem}
|
||||||
|
.sub{color:var(--muted);font-size:1rem;margin-top:.8rem;max-width:580px;line-height:1.7}
|
||||||
|
|
||||||
|
/* HERO */
|
||||||
|
#hero{min-height:100vh;display:flex;align-items:center;padding-top:5rem;position:relative;overflow:hidden}
|
||||||
|
.hero-glow{position:absolute;width:600px;height:600px;background:radial-gradient(circle,rgba(201,98,42,.15) 0%,transparent 70%);top:-100px;right:-100px;pointer-events:none}
|
||||||
|
.hero-glow2{position:absolute;width:400px;height:400px;background:radial-gradient(circle,rgba(46,204,184,.08) 0%,transparent 70%);bottom:0;left:0;pointer-events:none}
|
||||||
|
.hero-inner{position:relative;z-index:1;max-width:1100px;margin:0 auto;padding:0 1.5rem;width:100%}
|
||||||
|
.hero-eyebrow{display:inline-flex;align-items:center;gap:.5rem;background:var(--bg3);border:1px solid var(--border);border-radius:2rem;padding:.4rem 1rem;margin-bottom:2rem;font-size:.8rem;font-weight:700;letter-spacing:.1em;text-transform:uppercase;color:var(--copper)}
|
||||||
|
.hero-eyebrow::before{content:'';width:6px;height:6px;background:var(--teal);border-radius:50%;animation:pulse 2s infinite}
|
||||||
|
@keyframes pulse{0%,100%{opacity:1;transform:scale(1)}50%{opacity:.5;transform:scale(1.5)}}
|
||||||
|
.hero-title{font-family:var(--display);font-size:clamp(3rem,7vw,6rem);font-weight:900;line-height:1;letter-spacing:-.03em;margin-bottom:1.5rem}
|
||||||
|
.hero-title em{font-style:italic;color:var(--copper)}
|
||||||
|
.hero-sub{font-size:1.15rem;color:var(--muted);max-width:540px;line-height:1.7;margin-bottom:2.5rem}
|
||||||
|
.hero-stats{display:flex;flex-wrap:wrap;gap:2rem;margin-bottom:3rem}
|
||||||
|
.stat{display:flex;flex-direction:column}
|
||||||
|
.stat-n{font-family:var(--display);font-size:2.2rem;font-weight:900;color:var(--amber);line-height:1}
|
||||||
|
.stat-l{font-size:.78rem;color:var(--muted);font-weight:600;margin-top:.2rem}
|
||||||
|
.hero-ctas{display:flex;flex-wrap:wrap;gap:1rem}
|
||||||
|
.btn-primary{display:inline-flex;align-items:center;gap:.5rem;background:var(--copper);color:#fff;padding:.8rem 1.6rem;border-radius:.5rem;text-decoration:none;font-weight:800;font-size:.95rem;transition:all .2s;border:2px solid var(--copper)}
|
||||||
|
.btn-primary:hover{background:var(--amber);border-color:var(--amber)}
|
||||||
|
.btn-secondary{display:inline-flex;align-items:center;gap:.5rem;background:transparent;color:var(--text);padding:.8rem 1.6rem;border-radius:.5rem;text-decoration:none;font-weight:700;font-size:.95rem;border:2px solid var(--border);transition:all .2s}
|
||||||
|
.btn-secondary:hover{border-color:var(--copper);color:var(--copper)}
|
||||||
|
|
||||||
|
/* DISCOVER */
|
||||||
|
.disc-cards{display:grid;grid-template-columns:repeat(auto-fit,minmax(280px,1fr));gap:1.5rem;margin-top:2.5rem}
|
||||||
|
.disc-card{background:var(--card);border:1px solid var(--border);border-radius:.8rem;padding:1.8rem;transition:border-color .3s}
|
||||||
|
.disc-card:hover{border-color:var(--copper)}
|
||||||
|
.disc-icon{font-size:2.5rem;margin-bottom:1rem;display:block}
|
||||||
|
|
||||||
|
/* DEVICE GADGET */
|
||||||
|
.device-widget{background:var(--card);border:1px solid var(--border);border-radius:1rem;padding:2rem;margin-top:2rem}
|
||||||
|
.slider-wrap{margin:1.5rem 0}
|
||||||
|
.slider-labels{display:flex;justify-content:space-between;font-size:.75rem;color:var(--muted);margin-bottom:.5rem;font-family:var(--mono)}
|
||||||
|
#ramSlider{width:100%;accent-color:var(--copper);height:6px;cursor:pointer}
|
||||||
|
.ram-results{margin-top:1.5rem;display:grid;gap:.8rem}
|
||||||
|
.ram-item{display:flex;align-items:center;gap:1rem;padding:.8rem 1rem;border-radius:.5rem;border:1px solid var(--border);background:var(--bg2);font-size:.9rem;transition:all .3s}
|
||||||
|
.ram-item.active{border-color:var(--teal);background:rgba(46,204,184,.07)}
|
||||||
|
.ram-item .ri-icon{font-size:1.4rem;width:2rem;text-align:center}
|
||||||
|
.ram-item .ri-name{font-weight:700;flex:1}
|
||||||
|
.ram-item .ri-size{font-family:var(--mono);font-size:.78rem;color:var(--muted)}
|
||||||
|
.ram-item .ri-speed{font-size:.78rem;color:var(--green);font-weight:700}
|
||||||
|
|
||||||
|
/* PRIVACY */
|
||||||
|
.priv-compare{display:grid;grid-template-columns:1fr 1fr;gap:1.5rem;margin-top:2rem}
|
||||||
|
@media(max-width:600px){.priv-compare{grid-template-columns:1fr}}
|
||||||
|
.priv-card{border-radius:.8rem;padding:1.8rem;border:1px solid var(--border)}
|
||||||
|
.priv-cloud{background:linear-gradient(135deg,rgba(224,80,80,.08),transparent);border-color:rgba(224,80,80,.3)}
|
||||||
|
.priv-local{background:linear-gradient(135deg,rgba(76,175,106,.08),transparent);border-color:rgba(76,175,106,.3)}
|
||||||
|
.priv-title{font-size:1.1rem;font-weight:800;margin-bottom:.8rem;display:flex;align-items:center;gap:.5rem}
|
||||||
|
.cloud-dot{width:8px;height:8px;border-radius:50%;background:var(--red);display:inline-block}
|
||||||
|
.local-dot{width:8px;height:8px;border-radius:50%;background:var(--green);display:inline-block}
|
||||||
|
|
||||||
|
/* SIZE VIZ */
|
||||||
|
.size-viz{display:flex;align-items:flex-end;gap:.8rem;margin:2rem 0;padding:1.5rem;background:var(--card);border:1px solid var(--border);border-radius:.8rem;overflow:hidden}
|
||||||
|
.size-item{display:flex;flex-direction:column;align-items:center;gap:.5rem;font-size:.72rem;color:var(--muted);text-align:center}
|
||||||
|
.size-bar{border-radius:.3rem .3rem 0 0;min-width:40px;transition:all .5s}
|
||||||
|
.size-item.ix .size-bar{background:var(--copper)}
|
||||||
|
.size-item.ix .size-label{color:var(--copper);font-weight:800}
|
||||||
|
|
||||||
|
/* ENGINE */
|
||||||
|
.hw-grid{display:grid;grid-template-columns:repeat(auto-fill,minmax(130px,1fr));gap:.8rem;margin-top:2rem}
|
||||||
|
.hw-chip{background:var(--card);border:1px solid var(--border);border-radius:.5rem;padding:.8rem .6rem;text-align:center;font-size:.78rem;font-weight:700;font-family:var(--mono);transition:all .2s;cursor:default}
|
||||||
|
.hw-chip:hover{border-color:var(--copper);color:var(--copper);transform:translateY(-2px)}
|
||||||
|
.hw-chip .hw-cat{font-size:.62rem;color:var(--muted);font-weight:400;display:block;margin-top:.2rem}
|
||||||
|
.engine-feats{display:grid;grid-template-columns:repeat(auto-fit,minmax(200px,1fr));gap:1rem;margin-top:1.5rem}
|
||||||
|
.feat{padding:1.2rem;background:var(--card);border:1px solid var(--border);border-radius:.6rem;font-size:.88rem}
|
||||||
|
.feat-name{font-weight:800;margin-bottom:.3rem;color:var(--amber)}
|
||||||
|
|
||||||
|
/* MODELS */
|
||||||
|
.models-grid{display:grid;grid-template-columns:repeat(auto-fill,minmax(260px,1fr));gap:1rem;margin-top:2rem}
|
||||||
|
.model-card{background:var(--card);border:1px solid var(--border);border-radius:.7rem;padding:1.3rem;display:flex;flex-direction:column;gap:.5rem;transition:border-color .2s}
|
||||||
|
.model-card:hover{border-color:var(--teal)}
|
||||||
|
.model-name{font-family:var(--mono);font-size:.85rem;font-weight:700;color:var(--teal)}
|
||||||
|
.model-desc{font-size:.82rem;color:var(--muted);flex:1}
|
||||||
|
.model-meta{display:flex;flex-wrap:wrap;gap:.4rem;margin-top:.5rem}
|
||||||
|
.model-tag{font-size:.68rem;background:var(--bg3);border:1px solid var(--border);border-radius:.3rem;padding:.15rem .4rem;color:var(--sand);font-family:var(--mono)}
|
||||||
|
|
||||||
|
/* COST */
|
||||||
|
.cost-compare{display:grid;grid-template-columns:1fr 1fr;gap:1.5rem;margin-top:2rem}
|
||||||
|
@media(max-width:600px){.cost-compare{grid-template-columns:1fr}}
|
||||||
|
.cost-card{border-radius:.8rem;padding:2rem;text-align:center}
|
||||||
|
.cost-cloud-card{background:var(--bg3);border:1px solid var(--border)}
|
||||||
|
.cost-local-card{background:linear-gradient(135deg,rgba(46,204,184,.08),transparent);border:1px solid rgba(46,204,184,.3)}
|
||||||
|
.cost-amount{font-family:var(--display);font-size:3.5rem;font-weight:900;line-height:1;margin:1rem 0}
|
||||||
|
.cost-cloud-card .cost-amount{color:var(--red)}
|
||||||
|
.cost-local-card .cost-amount{color:var(--teal)}
|
||||||
|
.cost-sub{font-size:.8rem;color:var(--muted)}
|
||||||
|
|
||||||
|
/* API */
|
||||||
|
.api-box{background:var(--bg2);border:1px solid var(--border);border-radius:.7rem;padding:1.5rem;margin-top:1.5rem;font-family:var(--mono);font-size:.82rem;line-height:1.8;overflow-x:auto}
|
||||||
|
.api-box .kw{color:var(--copper)}
|
||||||
|
.api-box .str{color:var(--teal)}
|
||||||
|
.api-box .cmt{color:var(--muted)}
|
||||||
|
.api-endpoints{display:flex;flex-wrap:wrap;gap:.5rem;margin-top:1rem}
|
||||||
|
.api-ep{background:var(--card);border:1px solid var(--border);border-radius:.4rem;padding:.4rem .8rem;font-family:var(--mono);font-size:.75rem}
|
||||||
|
.ep-get{color:var(--green)}
|
||||||
|
.ep-post{color:var(--amber)}
|
||||||
|
|
||||||
|
/* QUICK START */
|
||||||
|
.qs-tabs{display:flex;gap:.5rem;margin-bottom:1.5rem;flex-wrap:wrap}
|
||||||
|
.qs-tab{background:none;border:1px solid var(--border);color:var(--muted);cursor:pointer;padding:.5rem 1rem;border-radius:.4rem;font-size:.85rem;font-weight:700;font-family:var(--font);transition:all .2s}
|
||||||
|
.qs-tab.active,.qs-tab:hover{border-color:var(--copper);color:var(--copper);background:rgba(201,98,42,.08)}
|
||||||
|
.qs-block{display:none}
|
||||||
|
.qs-block.active{display:block}
|
||||||
|
.step{display:flex;gap:1rem;margin-bottom:1.2rem;align-items:flex-start}
|
||||||
|
.step-num{flex-shrink:0;width:32px;height:32px;border-radius:50%;background:var(--copper);color:#fff;font-weight:900;font-size:.85rem;display:flex;align-items:center;justify-content:center;font-family:var(--display)}
|
||||||
|
code{background:var(--bg2);border:1px solid var(--border);border-radius:.35rem;padding:.2rem .5rem;font-family:var(--mono);font-size:.83rem;color:var(--teal)}
|
||||||
|
pre{background:var(--bg2);border:1px solid var(--border);border-radius:.5rem;padding:1rem;font-family:var(--mono);font-size:.8rem;overflow-x:auto;margin-top:.5rem;line-height:1.7}
|
||||||
|
pre .c{color:var(--muted)}
|
||||||
|
pre .v{color:var(--teal)}
|
||||||
|
pre .s{color:var(--amber)}
|
||||||
|
|
||||||
|
/* COMMUNITY TOOLS */
|
||||||
|
.tools-grid{display:grid;grid-template-columns:repeat(auto-fill,minmax(220px,1fr));gap:1.2rem;margin-top:2rem}
|
||||||
|
.tool-card{background:var(--card);border:1px solid var(--border);border-radius:.8rem;padding:1.5rem;position:relative;transition:all .25s;text-decoration:none;color:inherit;display:block}
|
||||||
|
.tool-card:hover{border-color:var(--copper);transform:translateY(-3px)}
|
||||||
|
.tool-badge{position:absolute;top:1rem;right:1rem;font-size:.62rem;font-weight:800;padding:.2rem .5rem;border-radius:2rem;font-family:var(--mono)}
|
||||||
|
.badge-live{background:rgba(76,175,106,.15);color:var(--green);border:1px solid rgba(76,175,106,.3)}
|
||||||
|
.badge-build{background:rgba(240,160,48,.15);color:var(--amber);border:1px solid rgba(240,160,48,.3)}
|
||||||
|
.badge-coming{background:var(--bg3);color:var(--muted);border:1px solid var(--border)}
|
||||||
|
.tool-icon{font-size:2rem;margin-bottom:.8rem}
|
||||||
|
.tool-name{font-weight:800;font-size:.95rem;margin-bottom:.3rem}
|
||||||
|
.tool-desc{font-size:.8rem;color:var(--muted);line-height:1.5}
|
||||||
|
|
||||||
|
/* ORGAN */
|
||||||
|
.organ-visual{display:flex;flex-wrap:wrap;gap:1rem;margin:2rem 0;align-items:center;justify-content:center}
|
||||||
|
.organ-node{background:var(--card);border:2px solid var(--border);border-radius:50%;width:80px;height:80px;display:flex;align-items:center;justify-content:center;font-size:1.8rem;cursor:pointer;transition:all .3s;position:relative}
|
||||||
|
.organ-node:hover{border-color:var(--copper);transform:scale(1.15)}
|
||||||
|
.organ-node.glow{border-color:var(--teal);box-shadow:0 0 20px rgba(46,204,184,.3);animation:glow-pulse 2s infinite}
|
||||||
|
@keyframes glow-pulse{0%,100%{box-shadow:0 0 20px rgba(46,204,184,.3)}50%{box-shadow:0 0 40px rgba(46,204,184,.5)}}
|
||||||
|
.organ-arrow{color:var(--muted);font-size:1.5rem}
|
||||||
|
.organ-desc{font-size:.88rem;color:var(--muted);max-width:600px;margin:0 auto;text-align:center;margin-top:1rem;line-height:1.7}
|
||||||
|
|
||||||
|
|
||||||
|
/* HARDWARE SCOUT */
|
||||||
|
.scout-table{width:100%;border-collapse:collapse;margin-top:1.5rem;font-size:.85rem}
|
||||||
|
.scout-table th{text-align:left;padding:.8rem;border-bottom:2px solid var(--border);color:var(--muted);font-size:.72rem;text-transform:uppercase;letter-spacing:.08em;font-weight:700}
|
||||||
|
.scout-table td{padding:.8rem;border-bottom:1px solid var(--border)}
|
||||||
|
.scout-table tr:last-child td{border-bottom:none}
|
||||||
|
.backend-chip{font-family:var(--mono);font-size:.72rem;background:var(--bg3);border:1px solid var(--border);border-radius:.3rem;padding:.15rem .4rem}
|
||||||
|
.load-bar{background:var(--bg3);border-radius:2rem;height:6px;overflow:hidden;width:80px;display:inline-block}
|
||||||
|
.load-fill{background:linear-gradient(90deg,var(--teal),var(--amber));height:100%;border-radius:2rem;transition:width .5s}
|
||||||
|
.live-dot{width:7px;height:7px;background:var(--teal);border-radius:50%;display:inline-block;margin-right:.4rem;animation:pulse 2s infinite}
|
||||||
|
|
||||||
|
/* DONATE */
|
||||||
|
.donate-widget{background:var(--card);border:1px solid var(--border);border-radius:1rem;padding:2rem;margin-top:2rem}
|
||||||
|
.donate-amounts{display:flex;flex-wrap:wrap;gap:.8rem;margin-bottom:1.5rem}
|
||||||
|
.donate-btn{background:var(--bg3);border:2px solid var(--border);border-radius:.5rem;padding:.7rem 1.3rem;cursor:pointer;font-weight:800;font-size:.9rem;font-family:var(--font);color:var(--muted);transition:all .2s}
|
||||||
|
.donate-btn:hover,.donate-btn.active{border-color:var(--copper);color:var(--copper);background:rgba(201,98,42,.08)}
|
||||||
|
.costs-breakdown{display:grid;gap:.5rem;margin-bottom:1.5rem}
|
||||||
|
.cost-line{display:flex;justify-content:space-between;font-size:.85rem;padding:.5rem .8rem;background:var(--bg2);border-radius:.4rem}
|
||||||
|
.cost-line span:last-child{font-family:var(--mono);color:var(--amber)}
|
||||||
|
.cost-total{border-top:2px solid var(--border);margin-top:.5rem;padding-top:.5rem;font-weight:800}
|
||||||
|
|
||||||
|
/* CRATONS */
|
||||||
|
.cratons-grid{display:grid;grid-template-columns:repeat(auto-fill,minmax(200px,1fr));gap:1rem;margin-top:2rem}
|
||||||
|
.craton-card{background:var(--card);border:1px solid var(--border);border-radius:.7rem;padding:1.2rem;transition:all .2s}
|
||||||
|
.craton-card.active{border-color:var(--copper);background:rgba(201,98,42,.05)}
|
||||||
|
.craton-card:not(.active):hover{border-color:var(--border);opacity:.8}
|
||||||
|
.craton-age{font-family:var(--mono);font-size:.68rem;color:var(--muted);margin-bottom:.3rem}
|
||||||
|
.craton-name{font-weight:800;margin-bottom:.2rem}
|
||||||
|
.craton-region{font-size:.78rem;color:var(--muted)}
|
||||||
|
.craton-status{font-size:.7rem;font-weight:800;margin-top:.5rem}
|
||||||
|
.craton-card.active .craton-status{color:var(--copper)}
|
||||||
|
.craton-card:not(.active) .craton-status{color:var(--teal);cursor:pointer;text-decoration:underline}
|
||||||
|
|
||||||
|
/* PRICING */
|
||||||
|
.pricing-grid{display:grid;grid-template-columns:repeat(auto-fit,minmax(260px,1fr));gap:1.5rem;margin-top:2rem}
|
||||||
|
.pricing-card{background:var(--card);border:1px solid var(--border);border-radius:.8rem;padding:2rem;display:flex;flex-direction:column;gap:.8rem}
|
||||||
|
.pricing-card.featured{border-color:var(--copper)}
|
||||||
|
.pricing-tier{font-weight:900;font-size:1.1rem}
|
||||||
|
.pricing-who{font-size:.85rem;color:var(--muted);line-height:1.5;flex:1}
|
||||||
|
.pricing-price{font-family:var(--display);font-size:2rem;font-weight:900}
|
||||||
|
.pricing-price.free{color:var(--teal)}
|
||||||
|
.pricing-price.fair{color:var(--amber)}
|
||||||
|
.pricing-price.ind{color:var(--copper)}
|
||||||
|
|
||||||
|
/* FOOTER */
|
||||||
|
footer{background:var(--bg2);border-top:1px solid var(--border);padding:3rem 1.5rem;margin-top:4rem}
|
||||||
|
.footer-inner{max-width:1100px;margin:0 auto;display:grid;grid-template-columns:2fr 1fr 1fr 1fr;gap:3rem}
|
||||||
|
@media(max-width:700px){.footer-inner{grid-template-columns:1fr 1fr}}
|
||||||
|
.footer-brand{font-family:var(--display);font-size:1.4rem;font-weight:900;color:var(--copper);margin-bottom:.8rem}
|
||||||
|
.footer-tagline{font-size:.85rem;color:var(--muted);line-height:1.6}
|
||||||
|
.footer-col h4{font-size:.75rem;font-weight:800;letter-spacing:.1em;text-transform:uppercase;color:var(--muted);margin-bottom:.8rem}
|
||||||
|
.footer-col a{display:block;color:var(--muted);text-decoration:none;font-size:.85rem;margin-bottom:.4rem;transition:color .2s}
|
||||||
|
.footer-col a:hover{color:var(--text)}
|
||||||
|
.footer-bottom{max-width:1100px;margin:2rem auto 0;padding-top:1.5rem;border-top:1px solid var(--border);display:flex;justify-content:space-between;font-size:.78rem;color:var(--muted);flex-wrap:wrap;gap:.5rem}
|
||||||
|
|
||||||
|
/* ANIMATIONS */
|
||||||
|
.reveal{opacity:0;transform:translateY(24px);transition:opacity .6s,transform .6s}
|
||||||
|
.reveal.visible{opacity:1;transform:none}
|
||||||
|
|
||||||
|
/* RESPONSIVE */
|
||||||
|
@media(max-width:768px){
|
||||||
|
nav .nav-links{display:none}
|
||||||
|
.hero-stats{gap:1.5rem}
|
||||||
|
.stat-n{font-size:1.8rem}
|
||||||
|
.cost-compare,.priv-compare{grid-template-columns:1fr}
|
||||||
|
.footer-inner{grid-template-columns:1fr 1fr}
|
||||||
|
}
|
||||||
|
</style>
|
||||||
|
</head>
|
||||||
|
<body>
|
||||||
|
|
||||||
|
<!-- NAV -->
|
||||||
|
<nav>
|
||||||
|
<a class="nav-brand" href="#">Inference<span>-X</span></a>
|
||||||
|
<ul class="nav-links">
|
||||||
|
<li><a href="#discover">Discover</a></li>
|
||||||
|
<li><a href="#engine">Engine</a></li>
|
||||||
|
<li><a href="#community">Community</a></li>
|
||||||
|
<li><a href="#organ">Organs</a></li>
|
||||||
|
<li><a href="#join">Join</a></li>
|
||||||
|
<li><a href="https://build.inference-x.com" target="_blank">SaaS</a></li>
|
||||||
|
</ul>
|
||||||
|
<div class="nav-right">
|
||||||
|
<button id="lb" onclick="cycleLang()">🇬🇧 EN</button>
|
||||||
|
<button id="tt">☾</button>
|
||||||
|
</div>
|
||||||
|
</nav>
|
||||||
|
|
||||||
|
<!-- ═══ HERO ═══ -->
|
||||||
|
<div id="hero">
|
||||||
|
<div class="hero-glow"></div>
|
||||||
|
<div class="hero-glow2"></div>
|
||||||
|
<div class="hero-inner">
|
||||||
|
<div class="hero-eyebrow">🌍 Built in Morocco for the world</div>
|
||||||
|
<h1 class="hero-title" id="ht">
|
||||||
|
Intelligence,<br><em>for everyone.</em><br>No permission needed.
|
||||||
|
</h1>
|
||||||
|
<p class="hero-sub" id="hsub">305KB. Runs on your phone, your laptop, your server. Free forever. No cloud, no account, no limit. The AI belongs to whoever runs it.</p>
|
||||||
|
<div class="hero-stats">
|
||||||
|
<div class="stat"><span class="stat-n">305<small style="font-size:1.2rem">KB</small></span><span class="stat-l">Entire engine</span></div>
|
||||||
|
<div class="stat"><span class="stat-n">19</span><span class="stat-l">Hardware backends</span></div>
|
||||||
|
<div class="stat"><span class="stat-n">23</span><span class="stat-l">Model formats</span></div>
|
||||||
|
<div class="stat"><span class="stat-n">∞</span><span class="stat-l">API calls · forever free</span></div>
|
||||||
|
<div class="stat"><span class="stat-n">$0</span><span class="stat-l">Per year · your hardware</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="hero-ctas">
|
||||||
|
<a href="#discover" class="btn-primary">See how it works →</a>
|
||||||
|
<a href="https://build.inference-x.com" class="btn-secondary" target="_blank">Try the SaaS</a>
|
||||||
|
<a href="#join" class="btn-secondary">Join the builders</a>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- ═══ 01 DISCOVER ═══ -->
|
||||||
|
<section id="discover">
|
||||||
|
<div class="section-tag">What is this</div>
|
||||||
|
<h2 id="disc_title">Three things to know. Nothing more.</h2>
|
||||||
|
<p class="sub" id="disc_sub">No degree required. If you have a device, you have AI.</p>
|
||||||
|
<div class="disc-cards">
|
||||||
|
<div class="disc-card reveal">
|
||||||
|
<span class="disc-icon">📦</span>
|
||||||
|
<h3 id="d1t">It's a tiny file</h3>
|
||||||
|
<p id="d1p">305 kilobytes. Smaller than a photo on your phone. This file lets your computer run AI — any AI — without the internet. Download it, run it. That's it.</p>
|
||||||
|
</div>
|
||||||
|
<div class="disc-card reveal">
|
||||||
|
<span class="disc-icon">🔒</span>
|
||||||
|
<h3 id="d2t">Your words stay yours</h3>
|
||||||
|
<p id="d2p">When you use AI online, your questions travel to a distant server. Someone can read them. With Inference-X, nothing leaves your machine. Ever.</p>
|
||||||
|
</div>
|
||||||
|
<div class="disc-card reveal">
|
||||||
|
<span class="disc-icon">⚡</span>
|
||||||
|
<h3 id="d3t">It runs on anything</h3>
|
||||||
|
<p id="d3p">Old laptop, new phone, Raspberry Pi, datacenter. Same file. It detects your hardware and uses it. No configuration needed.</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 02 YOUR DEVICE ═══ -->
|
||||||
|
<section id="device">
|
||||||
|
<div class="section-tag">Your hardware</div>
|
||||||
|
<h2 id="dev_title">What can YOUR computer do?</h2>
|
||||||
|
<p class="sub" id="dev_sub">Move the slider to your RAM. See what's possible.</p>
|
||||||
|
<div class="device-widget reveal">
|
||||||
|
<div class="slider-wrap">
|
||||||
|
<div class="slider-labels">
|
||||||
|
<span>1 GB</span><span>4 GB</span><span>8 GB</span><span>16 GB</span><span>32 GB</span><span>64 GB</span><span>128+ GB</span>
|
||||||
|
</div>
|
||||||
|
<input type="range" id="ramSlider" min="1" max="128" value="8" step="1">
|
||||||
|
</div>
|
||||||
|
<p id="ramVal" style="font-family:var(--mono);font-size:.9rem;color:var(--amber);margin-bottom:1rem">RAM: <strong>8 GB</strong> — showing models that fit</p>
|
||||||
|
<div class="ram-results" id="ramResults"></div>
|
||||||
|
<p id="ram_hint" style="margin-top:1rem;font-size:.82rem;color:var(--muted)">Your AI runs locally. No internet. No account. Free forever.</p>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 03 PRIVACY ═══ -->
|
||||||
|
<section id="privacy">
|
||||||
|
<div class="section-tag">Privacy</div>
|
||||||
|
<h2 id="priv_title">Where do your words go?</h2>
|
||||||
|
<div class="priv-compare reveal">
|
||||||
|
<div class="priv-card priv-cloud">
|
||||||
|
<div class="priv-title"><span class="cloud-dot"></span><span id="priv_cloud_t">Cloud AI</span></div>
|
||||||
|
<p id="priv_cloud_p">Your question leaves your device, crosses the internet, reaches a server in another country, gets processed, stored, and analyzed. You pay per word.</p>
|
||||||
|
<div style="margin-top:1rem;font-size:.8rem;color:var(--red)">⚠ Your data · their server · their rules</div>
|
||||||
|
</div>
|
||||||
|
<div class="priv-card priv-local">
|
||||||
|
<div class="priv-title"><span class="local-dot"></span><span id="priv_local_t">Inference-X</span></div>
|
||||||
|
<p id="priv_local_p">Your question stays on your desk. The answer is computed by your own processor. Nothing leaves. Nothing is stored. You pay nothing.</p>
|
||||||
|
<div style="margin-top:1rem;font-size:.8rem;color:var(--green)">✓ Your data · your processor · your rules</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 04 SIZE ═══ -->
|
||||||
|
<section id="size">
|
||||||
|
<div class="section-tag">Footprint</div>
|
||||||
|
<h2 id="size_title">How small is 305 KB?</h2>
|
||||||
|
<p class="sub" id="size_sub">The entire AI engine — smaller than what you think.</p>
|
||||||
|
<div class="size-viz reveal">
|
||||||
|
<div class="size-item ix">
|
||||||
|
<div class="size-bar" style="height:35px;background:var(--copper)"></div>
|
||||||
|
<span class="size-label">Inference-X</span>
|
||||||
|
<span>305 KB</span>
|
||||||
|
</div>
|
||||||
|
<div class="size-item">
|
||||||
|
<div class="size-bar" style="height:65px;background:var(--bg3);border:1px solid var(--border)"></div>
|
||||||
|
<span>iPhone photo</span>
|
||||||
|
<span>~3 MB</span>
|
||||||
|
</div>
|
||||||
|
<div class="size-item">
|
||||||
|
<div class="size-bar" style="height:120px;background:var(--bg3);border:1px solid var(--border)"></div>
|
||||||
|
<span>Average app</span>
|
||||||
|
<span>~50 MB</span>
|
||||||
|
</div>
|
||||||
|
<div class="size-item">
|
||||||
|
<div class="size-bar" style="height:200px;background:rgba(224,80,80,.3);border:1px solid rgba(224,80,80,.4)"></div>
|
||||||
|
<span>Chrome</span>
|
||||||
|
<span>~200 MB</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<p id="size_note" style="font-size:.88rem;color:var(--muted);margin-top:1rem">All 19 hardware targets, all 23 formats — in less space than a single photo on your phone.</p>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 05 ENGINE ═══ -->
|
||||||
|
<section id="engine">
|
||||||
|
<div class="section-tag">The engine</div>
|
||||||
|
<h2>One binary to run them all.</h2>
|
||||||
|
<p class="sub">Written in C++. No dependencies. No runtime. No cloud. Any silicon, any OS, any AI model.</p>
|
||||||
|
<div class="hw-grid reveal">
|
||||||
|
<div class="hw-chip">CUDA<span class="hw-cat">NVIDIA GPU</span></div>
|
||||||
|
<div class="hw-chip">Metal<span class="hw-cat">Apple Silicon</span></div>
|
||||||
|
<div class="hw-chip">Vulkan<span class="hw-cat">Any GPU</span></div>
|
||||||
|
<div class="hw-chip">ROCm<span class="hw-cat">AMD GPU</span></div>
|
||||||
|
<div class="hw-chip">OpenCL<span class="hw-cat">Any GPU</span></div>
|
||||||
|
<div class="hw-chip">SYCL<span class="hw-cat">Intel GPU</span></div>
|
||||||
|
<div class="hw-chip">CPU x86<span class="hw-cat">Intel/AMD</span></div>
|
||||||
|
<div class="hw-chip">CPU ARM<span class="hw-cat">Mobile/Pi</span></div>
|
||||||
|
<div class="hw-chip">RISC-V<span class="hw-cat">Emerging</span></div>
|
||||||
|
<div class="hw-chip">WebGPU<span class="hw-cat">Browser</span></div>
|
||||||
|
<div class="hw-chip">TPU<span class="hw-cat">Google</span></div>
|
||||||
|
<div class="hw-chip">FPGA<span class="hw-cat">Custom HW</span></div>
|
||||||
|
<div class="hw-chip">Inferentia<span class="hw-cat">AWS</span></div>
|
||||||
|
<div class="hw-chip">Gaudi<span class="hw-cat">Intel</span></div>
|
||||||
|
<div class="hw-chip">Groq<span class="hw-cat">LPU</span></div>
|
||||||
|
<div class="hw-chip">Cerebras<span class="hw-cat">Wafer</span></div>
|
||||||
|
<div class="hw-chip">SambaNova<span class="hw-cat">RDU</span></div>
|
||||||
|
<div class="hw-chip">Graphcore<span class="hw-cat">IPU</span></div>
|
||||||
|
<div class="hw-chip">Custom<span class="hw-cat">+ your HW</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="engine-feats reveal" style="margin-top:2rem">
|
||||||
|
<div class="feat"><div class="feat-name">Zero-Copy Inference</div>Dequantization and matrix multiply in one instruction loop. No intermediate buffer.</div>
|
||||||
|
<div class="feat"><div class="feat-name">Trillion-Parameter Native</div>Only active experts exist in memory. A 1T-parameter model runs on 64 GB RAM.</div>
|
||||||
|
<div class="feat"><div class="feat-name">Smart Precision</div>Simple questions get compressed layers. Complex reasoning gets full precision.</div>
|
||||||
|
<div class="feat"><div class="feat-name">Zero Telemetry</div>No network calls. No phone-home. Works on a plane, in a submarine, on the moon.</div>
|
||||||
|
<div class="feat"><div class="feat-name">Auto-Detect</div>Architecture, chat templates, EOS tokens — auto-detected from model metadata.</div>
|
||||||
|
<div class="feat"><div class="feat-name">Self-Configuring</div>The Makefile detects your hardware. You don't configure it — it configures itself.</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 06 MODELS ═══ -->
|
||||||
|
<section id="models">
|
||||||
|
<div class="section-tag">What runs on it</div>
|
||||||
|
<h2>Any GGUF model. Zero setup.</h2>
|
||||||
|
<p class="sub">Download a model from HuggingFace or Ollama. Drop it in. Run it. These are models we've benchmarked.</p>
|
||||||
|
<div class="models-grid reveal">
|
||||||
|
<div class="model-card">
|
||||||
|
<div class="model-name">LLaMA 3.2 · 1B</div>
|
||||||
|
<div class="model-desc">Quick answers. Tiny device. Lightning fast.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag">1 GB RAM</span><span class="model-tag">mobile-ready</span><span class="model-tag">fast</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="model-card">
|
||||||
|
<div class="model-name">Mistral · 7B</div>
|
||||||
|
<div class="model-desc">Smart conversations, code help, translations.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag">5 GB RAM</span><span class="model-tag">multilingual</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="model-card">
|
||||||
|
<div class="model-name">LLaMA 3.1 · 8B</div>
|
||||||
|
<div class="model-desc">Meta's compact model. Great reasoning at low cost.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag">6 GB RAM</span><span class="model-tag">reasoning</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="model-card">
|
||||||
|
<div class="model-name">Mistral · 22B</div>
|
||||||
|
<div class="model-desc">Creative writing, analysis, multilingual expert.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag">16 GB RAM</span><span class="model-tag">creative</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="model-card">
|
||||||
|
<div class="model-name">LLaMA 3.1 · 70B</div>
|
||||||
|
<div class="model-desc">Full-featured assistant. Code. Math. Logic.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag">48 GB RAM</span><span class="model-tag">code</span><span class="model-tag">math</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="model-card">
|
||||||
|
<div class="model-name">DeepSeek · 671B</div>
|
||||||
|
<div class="model-desc">Advanced reasoning. Expert-level answers. MoE architecture.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag">64 GB RAM</span><span class="model-tag">expert</span><span class="model-tag">MoE</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="model-card">
|
||||||
|
<div class="model-name">Phi-3 · 3.8B</div>
|
||||||
|
<div class="model-desc">Microsoft's small model. Punches far above its weight.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag">3 GB RAM</span><span class="model-tag">efficient</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="model-card">
|
||||||
|
<div class="model-name">Qwen 2.5 · 7B</div>
|
||||||
|
<div class="model-desc">Chinese-developed. Excellent for multilingual tasks.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag">5 GB RAM</span><span class="model-tag">multilingual</span><span class="model-tag">code</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="model-card" style="border-style:dashed;border-color:var(--muted)">
|
||||||
|
<div class="model-name" style="color:var(--muted)">+ any GGUF</div>
|
||||||
|
<div class="model-desc">Download from HuggingFace. Drop in folder. Done.</div>
|
||||||
|
<div class="model-meta"><span class="model-tag" style="color:var(--muted)">any size</span></div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 07 COST ═══ -->
|
||||||
|
<section id="cost">
|
||||||
|
<div class="section-tag">The real cost</div>
|
||||||
|
<h2 id="cost_title">How much does AI cost?</h2>
|
||||||
|
<p class="sub" id="cost_sub">Using AI 1 hour per day, every day, for a year.</p>
|
||||||
|
<div class="cost-compare reveal">
|
||||||
|
<div class="cost-card cost-cloud-card">
|
||||||
|
<div class="section-tag" id="cost_cloud_l">Cloud API (GPT-4 class)</div>
|
||||||
|
<div class="cost-amount">$2,500+</div>
|
||||||
|
<div class="cost-sub">per year · and rising · your data = their product</div>
|
||||||
|
<div style="margin-top:1rem;font-size:.8rem;color:var(--muted)">API key required · Rate limited · Terms can change</div>
|
||||||
|
</div>
|
||||||
|
<div class="cost-card cost-local-card">
|
||||||
|
<div class="section-tag" id="cost_local_l">Inference-X (your hardware)</div>
|
||||||
|
<div class="cost-amount">$0</div>
|
||||||
|
<div class="cost-sub" id="cost_local_note">forever · electricity only · your data stays yours</div>
|
||||||
|
<div style="margin-top:1rem;font-size:.8rem;color:var(--teal)" id="cost_note">No API key. No subscription. No limit. Your hardware, your AI.</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 08 API ═══ -->
|
||||||
|
<section id="api">
|
||||||
|
<div class="section-tag">For developers</div>
|
||||||
|
<h2>OpenAI-compatible API</h2>
|
||||||
|
<p class="sub">Start with <code>--serve 8080</code>. Drop-in replacement. Any client library works.</p>
|
||||||
|
<div class="api-box reveal">
|
||||||
|
<span class="cmt"># Start the inference server</span><br>
|
||||||
|
<span class="kw">./inference-x</span> <span class="str">--model llama3.gguf --serve 8080</span><br><br>
|
||||||
|
<span class="cmt"># Works with any OpenAI SDK</span><br>
|
||||||
|
<span class="kw">curl</span> <span class="str">http://localhost:8080/v1/chat/completions</span> <span class="kw">-H</span> <span class="str">"Content-Type: application/json"</span> \<br>
|
||||||
|
<span class="kw">-d</span> '{"model":"llama3","messages":[{"role":"user","content":"Hello"}]}'
|
||||||
|
</div>
|
||||||
|
<div class="api-endpoints">
|
||||||
|
<span class="api-ep"><span class="ep-post">POST</span> /v1/chat/completions</span>
|
||||||
|
<span class="api-ep"><span class="ep-post">POST</span> /v1/completions</span>
|
||||||
|
<span class="api-ep"><span class="ep-get">GET</span> /v1/models</span>
|
||||||
|
<span class="api-ep"><span class="ep-get">GET</span> /health</span>
|
||||||
|
<span class="api-ep"><span class="ep-get">GET</span> /v1/embeddings</span>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 09 QUICK START ═══ -->
|
||||||
|
<section id="start">
|
||||||
|
<div class="section-tag">Get started</div>
|
||||||
|
<h2 id="start_title">Ready? Three steps.</h2>
|
||||||
|
<p class="sub" id="start_sub">Pick your system.</p>
|
||||||
|
<div class="qs-tabs">
|
||||||
|
<button class="qs-tab active" onclick="setQS('linux')">🐧 Linux</button>
|
||||||
|
<button class="qs-tab" onclick="setQS('mac')">🍎 macOS</button>
|
||||||
|
<button class="qs-tab" onclick="setQS('windows')">🪟 Windows</button>
|
||||||
|
<button class="qs-tab" onclick="setQS('pi')">🍓 Raspberry Pi</button>
|
||||||
|
</div>
|
||||||
|
<div class="qs-block active" id="qs-linux">
|
||||||
|
<div class="step"><div class="step-num">1</div><div><strong>Download the binary</strong><pre><span class="c"># x86_64 with CUDA/CPU</span>
|
||||||
|
<span class="v">curl</span> -LO <span class="s">https://git.inference-x.com/elmadani/inference-x/releases/download/v1.0/ix-linux-x64</span>
|
||||||
|
<span class="v">chmod</span> +x ix-linux-x64</pre></div></div>
|
||||||
|
<div class="step"><div class="step-num">2</div><div><strong>Get a model</strong><pre><span class="c"># Download any GGUF from HuggingFace</span>
|
||||||
|
<span class="v">wget</span> <span class="s">https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF/resolve/main/Llama-3.2-1B-Instruct-Q4_K_M.gguf</span></pre></div></div>
|
||||||
|
<div class="step"><div class="step-num">3</div><div><strong>Run it</strong><pre><span class="v">./ix-linux-x64</span> --model Llama-3.2-1B-Instruct-Q4_K_M.gguf<br><span class="c"># or serve as API:</span><br><span class="v">./ix-linux-x64</span> --model Llama-3.2-1B-Instruct-Q4_K_M.gguf <span class="s">--serve 8080</span></pre></div></div>
|
||||||
|
</div>
|
||||||
|
<div class="qs-block" id="qs-mac">
|
||||||
|
<div class="step"><div class="step-num">1</div><div><strong>Download (Apple Silicon native)</strong><pre><span class="v">curl</span> -LO <span class="s">https://git.inference-x.com/elmadani/inference-x/releases/download/v1.0/ix-macos-arm64</span>
|
||||||
|
<span class="v">chmod</span> +x ix-macos-arm64</pre></div></div>
|
||||||
|
<div class="step"><div class="step-num">2</div><div><strong>Get a model</strong><pre><span class="c"># Metal GPU acceleration automatic on Apple Silicon</span>
|
||||||
|
<span class="v">wget</span> <span class="s">https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF/resolve/main/Llama-3.2-1B-Instruct-Q4_K_M.gguf</span></pre></div></div>
|
||||||
|
<div class="step"><div class="step-num">3</div><div><strong>Run it</strong><pre><span class="v">./ix-macos-arm64</span> --model Llama-3.2-1B-Instruct-Q4_K_M.gguf</pre></div></div>
|
||||||
|
</div>
|
||||||
|
<div class="qs-block" id="qs-windows">
|
||||||
|
<div class="step"><div class="step-num">1</div><div><strong>Download</strong><pre><span class="c"># PowerShell</span>
|
||||||
|
<span class="v">Invoke-WebRequest</span> -Uri <span class="s">"https://git.inference-x.com/elmadani/inference-x/releases/download/v1.0/ix-windows-x64.exe"</span> -OutFile <span class="s">"ix.exe"</span></pre></div></div>
|
||||||
|
<div class="step"><div class="step-num">2</div><div><strong>Get a model</strong> — download any .gguf file from HuggingFace</div></div>
|
||||||
|
<div class="step"><div class="step-num">3</div><div><strong>Run it</strong><pre><span class="v">.\ix.exe</span> --model model.gguf</pre></div></div>
|
||||||
|
</div>
|
||||||
|
<div class="qs-block" id="qs-pi">
|
||||||
|
<div class="step"><div class="step-num">1</div><div><strong>ARM build for Raspberry Pi 4/5</strong><pre><span class="v">curl</span> -LO <span class="s">https://git.inference-x.com/elmadani/inference-x/releases/download/v1.0/ix-linux-arm64</span>
|
||||||
|
<span class="v">chmod</span> +x ix-linux-arm64</pre></div></div>
|
||||||
|
<div class="step"><div class="step-num">2</div><div><strong>Get a small model (fits in 1-4GB)</strong><pre><span class="v">wget</span> <span class="s">https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF/resolve/main/Llama-3.2-1B-Instruct-Q4_K_M.gguf</span></pre></div></div>
|
||||||
|
<div class="step"><div class="step-num">3</div><div><strong>Run on Pi</strong><pre><span class="v">./ix-linux-arm64</span> --model Llama-3.2-1B-Instruct-Q4_K_M.gguf<br><span class="c"># Pi 4 4GB: runs 1B models at ~8 tok/s</span></pre></div></div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 10 COMMUNITY TOOLS ═══ -->
|
||||||
|
<section id="community">
|
||||||
|
<div class="section-tag">Community</div>
|
||||||
|
<h2>The tools we built together.</h2>
|
||||||
|
<p class="sub">Inference-X is the core. Around it, the community builds the ecosystem. Here's what exists today — more is being forged every day.</p>
|
||||||
|
<div class="tools-grid reveal">
|
||||||
|
<a class="tool-card" href="https://git.inference-x.com/elmadani/inference-x" target="_blank">
|
||||||
|
<span class="tool-badge badge-live">LIVE</span>
|
||||||
|
<div class="tool-icon">⚡</div>
|
||||||
|
<div class="tool-name">IX Engine</div>
|
||||||
|
<div class="tool-desc">The core. 228KB C++ binary. 19 backends. Zero dependencies. The foundation everything runs on.</div>
|
||||||
|
</a>
|
||||||
|
<a class="tool-card" href="https://build.inference-x.com" target="_blank">
|
||||||
|
<span class="tool-badge badge-live">LIVE</span>
|
||||||
|
<div class="tool-icon">🛠</div>
|
||||||
|
<div class="tool-name">Community SaaS</div>
|
||||||
|
<div class="tool-desc">Cloud playground. Deploy models, test APIs, share with others. No installation. Donation-powered.</div>
|
||||||
|
</a>
|
||||||
|
<div class="tool-card">
|
||||||
|
<span class="tool-badge badge-live">LIVE</span>
|
||||||
|
<div class="tool-icon">📡</div>
|
||||||
|
<div class="tool-name">Hardware Scout</div>
|
||||||
|
<div class="tool-desc">See every IX node running globally. Real-time compute map. Who runs what, how fast.</div>
|
||||||
|
</div>
|
||||||
|
<div class="tool-card">
|
||||||
|
<span class="tool-badge badge-build">BUILDING</span>
|
||||||
|
<div class="tool-icon">🫀</div>
|
||||||
|
<div class="tool-name">Organ Store</div>
|
||||||
|
</div>
|
||||||
|
<div class="tool-card">
|
||||||
|
<span class="tool-badge badge-build">BUILDING</span>
|
||||||
|
<div class="tool-icon">🔬</div>
|
||||||
|
<div class="tool-name">Organ Architect</div>
|
||||||
|
<div class="tool-desc">Analyze model internals. Visualize layers, heads, topology. Like an MRI for AI models.</div>
|
||||||
|
</div>
|
||||||
|
<div class="tool-card">
|
||||||
|
<span class="tool-badge badge-build">BUILDING</span>
|
||||||
|
<div class="tool-icon">🔥</div>
|
||||||
|
<div class="tool-name">The Forge</div>
|
||||||
|
<div class="tool-desc">Community fine-tuning platform. Contribute training data, improve models, share results. Collective intelligence.</div>
|
||||||
|
</div>
|
||||||
|
<div class="tool-card">
|
||||||
|
<span class="tool-badge badge-coming">COMING</span>
|
||||||
|
<div class="tool-icon">🎙</div>
|
||||||
|
<div class="tool-name">EchoNet</div>
|
||||||
|
<div class="tool-desc">Neural voice synthesis. Clone, create, share voice models. Same philosophy: local, private, yours.</div>
|
||||||
|
</div>
|
||||||
|
<div class="tool-card">
|
||||||
|
<span class="tool-badge badge-coming">COMING</span>
|
||||||
|
<div class="tool-icon">🌐</div>
|
||||||
|
<div class="tool-name">Echo Relay</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 11 ORGAN ═══ -->
|
||||||
|
<section id="organ">
|
||||||
|
<div class="section-tag">The future</div>
|
||||||
|
<h2>AI organ transplants.</h2>
|
||||||
|
<p class="sub">Neural networks have anatomy. Layers. Attention heads. Expert blocks. We built tools to extract them, study them, and transplant them between models. The community will fill the store.</p>
|
||||||
|
<div class="organ-visual reveal">
|
||||||
|
<div class="organ-node glow" title="Source model">🧠</div>
|
||||||
|
<div class="organ-arrow">→</div>
|
||||||
|
<div class="organ-node" title="Extract organ" style="font-size:1.2rem">⚙️<br><small style="font-size:.5rem">extract</small></div>
|
||||||
|
<div class="organ-arrow">→</div>
|
||||||
|
<div class="organ-node" style="border-color:var(--amber);font-size:1.2rem" title="Organ">🫀</div>
|
||||||
|
<div class="organ-arrow">→</div>
|
||||||
|
<div class="organ-node" title="Transplant" style="font-size:1.2rem">💉<br><small style="font-size:.5rem">transplant</small></div>
|
||||||
|
<div class="organ-arrow">→</div>
|
||||||
|
<div class="organ-node glow" title="Enhanced model">🧬</div>
|
||||||
|
</div>
|
||||||
|
<div class="organ-desc">
|
||||||
|
<strong>Vision:</strong> A community marketplace where builders extract specialized capabilities from models — multilingual reasoning, code completion, visual understanding — and share them as components others can transplant. The Organ Store doesn't exist yet. The community will build it.
|
||||||
|
</div>
|
||||||
|
<div style="display:flex;gap:1rem;justify-content:center;margin-top:2rem;flex-wrap:wrap">
|
||||||
|
<div style="text-align:center;padding:1.2rem;background:var(--card);border:1px solid var(--border);border-radius:.7rem;min-width:140px">
|
||||||
|
<div style="font-size:1.5rem;margin-bottom:.4rem">🔍</div>
|
||||||
|
<div style="font-weight:800;font-size:.85rem">Analyze</div>
|
||||||
|
<div style="font-size:.75rem;color:var(--muted)">Map model internals</div>
|
||||||
|
</div>
|
||||||
|
<div style="text-align:center;padding:1.2rem;background:var(--card);border:1px solid var(--border);border-radius:.7rem;min-width:140px">
|
||||||
|
<div style="font-size:1.5rem;margin-bottom:.4rem">⚗️</div>
|
||||||
|
<div style="font-weight:800;font-size:.85rem">Extract</div>
|
||||||
|
<div style="font-size:.75rem;color:var(--muted)">Isolate components</div>
|
||||||
|
</div>
|
||||||
|
<div style="text-align:center;padding:1.2rem;background:var(--card);border:1px solid var(--border);border-radius:.7rem;min-width:140px">
|
||||||
|
<div style="font-size:1.5rem;margin-bottom:.4rem">📦</div>
|
||||||
|
<div style="font-weight:800;font-size:.85rem">Publish</div>
|
||||||
|
<div style="font-size:.75rem;color:var(--muted)">Share to the store</div>
|
||||||
|
</div>
|
||||||
|
<div style="text-align:center;padding:1.2rem;background:var(--card);border:1px solid var(--border);border-radius:.7rem;min-width:140px">
|
||||||
|
<div style="font-size:1.5rem;margin-bottom:.4rem">💉</div>
|
||||||
|
<div style="font-weight:800;font-size:.85rem">Transplant</div>
|
||||||
|
<div style="font-size:.75rem;color:var(--muted)">Enhance any model</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<div style="max-width:1100px;margin:0 auto">
|
||||||
|
<div class="section-tag">The vision</div>
|
||||||
|
</div>
|
||||||
|
<p style="color:var(--muted);font-size:.9rem;margin-top:1.5rem;max-width:600px">Inference-X has no enemies. Every researcher, every company, every government that processes AI is playing a role. We're not competing — we're building the infrastructure that makes all of it accessible to everyone who was left out.</p>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 13 HARDWARE SCOUT ═══ -->
|
||||||
|
<section id="scout">
|
||||||
|
<div class="section-tag">Community hardware</div>
|
||||||
|
<h2>Every IX node on Earth. Live.</h2>
|
||||||
|
<p class="sub">When you run Inference-X, you can optionally report your hardware telemetry. This is the network. Anonymous. Voluntary. Real.</p>
|
||||||
|
<table class="scout-table reveal">
|
||||||
|
<thead>
|
||||||
|
<tr>
|
||||||
|
<th>Backend</th>
|
||||||
|
<th>Nodes</th>
|
||||||
|
<th>Avg tok/s</th>
|
||||||
|
<th>Avg load</th>
|
||||||
|
<th>Status</th>
|
||||||
|
</tr>
|
||||||
|
</thead>
|
||||||
|
<tbody id="scoutBody">
|
||||||
|
<tr><td colspan="5" style="color:var(--muted);text-align:center;font-size:.82rem"><span class="live-dot"></span>Loading community hardware data...</td></tr>
|
||||||
|
</tbody>
|
||||||
|
</table>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 14 PRICING ═══ -->
|
||||||
|
<section id="pricing">
|
||||||
|
<div class="section-tag">License</div>
|
||||||
|
<h2>Free for those who need it. Fair for those who profit.</h2>
|
||||||
|
<p class="sub">No tricks. No hidden limits. The engine is the same everywhere.</p>
|
||||||
|
<div class="pricing-grid reveal">
|
||||||
|
<div class="pricing-card featured">
|
||||||
|
<div class="pricing-tier">Free Forever</div>
|
||||||
|
<div class="pricing-price free">$0</div>
|
||||||
|
<div class="pricing-who">Individuals, researchers, students, open-source projects, startups under $1M revenue. No registration. No expiry. No limits. This is the default.</div>
|
||||||
|
<div style="font-size:.8rem;color:var(--teal);margin-top:.5rem">✓ Full engine · All backends · All models</div>
|
||||||
|
</div>
|
||||||
|
<div class="pricing-card">
|
||||||
|
<div class="pricing-tier">Commercial Fair</div>
|
||||||
|
<div class="pricing-price fair">20% rev</div>
|
||||||
|
<div class="pricing-who">Companies with $1M+ annual revenue using IX in production. 20% of revenue attributed to IX-powered features goes to the community fund. Transparent. Auditable.</div>
|
||||||
|
<div style="font-size:.8rem;color:var(--amber);margin-top:.5rem">80% flows to community builders</div>
|
||||||
|
</div>
|
||||||
|
<div class="pricing-card">
|
||||||
|
<div class="pricing-tier">Industrial Embed</div>
|
||||||
|
<div class="pricing-price ind">Custom</div>
|
||||||
|
<div class="pricing-who">Hardware manufacturers embedding IX in products. Custom licensing for bulk distribution, signed binaries, hardware co-optimization. Contact us.</div>
|
||||||
|
<div style="font-size:.8rem;color:var(--copper);margin-top:.5rem">Redistribute · Co-brand · Optimize</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 15 JOIN / CRATONS ═══ -->
|
||||||
|
<section id="join">
|
||||||
|
<div class="section-tag">Join the builders</div>
|
||||||
|
<h2>11 seats. One per craton.</h2>
|
||||||
|
<p class="sub">The governance of Inference-X is anchored in geology. 11 ancient continental cratons — the most stable structures on Earth — give their names to 11 permanent Core Team seats. One per major civilization region. Designed to last as long as the rocks.</p>
|
||||||
|
<div class="cratons-grid reveal">
|
||||||
|
<div class="craton-card active">
|
||||||
|
<div class="craton-age">2.7 Ga · Africa</div>
|
||||||
|
<div class="craton-region">Morocco · North Africa</div>
|
||||||
|
<div class="craton-status">⚒ Founder — Elmadani Salka</div>
|
||||||
|
</div>
|
||||||
|
<div class="craton-card"><div class="craton-age">3.6 Ga · Africa</div><div class="craton-name">💎 Kaapvaal</div><div class="craton-region">South Africa, Botswana</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=Kaapvaal Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">2.9 Ga · Africa</div><div class="craton-name">🌍 West African</div><div class="craton-region">Ghana, Senegal, Mali</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=West African Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">2.8 Ga · Africa</div><div class="craton-name">🌿 Congo</div><div class="craton-region">DRC, Republic of Congo</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=Congo Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">3.1 Ga · Americas</div><div class="craton-name">🍁 Superior</div><div class="craton-region">Canada, North America</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=Superior Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">2.5 Ga · Americas</div><div class="craton-name">🌳 Amazon</div><div class="craton-region">Brazil, South America</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=Amazon Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">3.1 Ga · Europe</div><div class="craton-name">🌊 Baltica</div><div class="craton-region">Scandinavia, Eastern Europe</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=Baltica Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">3.0 Ga · Asia</div><div class="craton-name">🌲 Siberian</div><div class="craton-region">Russia, Central Asia</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=Siberian Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">3.8 Ga · Asia</div><div class="craton-name">🏮 North China</div><div class="craton-region">China, East Asia</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=North China Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">3.0 Ga · Asia</div><div class="craton-name">🪷 Dharwar</div><div class="craton-region">India, South Asia</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=Dharwar Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
<div class="craton-card"><div class="craton-age">3.5 Ga · Oceania</div><div class="craton-name">🦘 Pilbara</div><div class="craton-region">Australia, Oceania</div><div class="craton-status"><a href="mailto:Elmadani.SALKA@proton.me?subject=Pilbara Craton" style="color:var(--teal);text-decoration:none">Apply →</a></div></div>
|
||||||
|
</div>
|
||||||
|
<div style="margin-top:2rem;padding:1.5rem;background:var(--card);border:1px solid var(--border);border-radius:.8rem;display:flex;flex-wrap:wrap;gap:1.5rem;align-items:center">
|
||||||
|
<div style="flex:1;min-width:200px">
|
||||||
|
<div style="font-weight:800;margin-bottom:.4rem">What craton leaders do</div>
|
||||||
|
<div style="font-size:.85rem;color:var(--muted)">Represent their region in project decisions. Connect local builders. Translate and adapt for local communities. No salary — compensation is access, visibility, and history.</div>
|
||||||
|
</div>
|
||||||
|
<a href="mailto:Elmadani.SALKA@proton.me?subject=Craton Application&body=Craton I want to represent: %0AWhy I fit: %0AMy background: " class="btn-primary">Apply for your craton →</a>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- ═══ 16 DONATE ═══ -->
|
||||||
|
<section id="donate" style="max-width:700px;margin:0 auto;padding:5rem 1.5rem">
|
||||||
|
<div class="section-tag">Support the infrastructure</div>
|
||||||
|
<p class="sub">Inference-X costs about €53/month to keep running for the world. Everything is public. Surplus goes to community contributors.</p>
|
||||||
|
<div class="donate-widget reveal">
|
||||||
|
<div class="costs-breakdown">
|
||||||
|
<div class="cost-line"><span>Infomaniak VPS (site)</span><span>€15/mo</span></div>
|
||||||
|
<div class="cost-line"><span>Domains</span><span>€4/mo</span></div>
|
||||||
|
<div class="cost-line"><span>OneCloud Compute</span><span>€20/mo</span></div>
|
||||||
|
<div class="cost-line"><span>Hetzner Compute</span><span>€12/mo</span></div>
|
||||||
|
<div class="cost-line"><span>Backup Storage</span><span>€2/mo</span></div>
|
||||||
|
<div class="cost-line cost-total"><span>Total / month</span><span id="costsLive">€53/mo</span></div>
|
||||||
|
</div>
|
||||||
|
<div class="donate-amounts">
|
||||||
|
<button class="donate-btn" onclick="setDonation(5)">€5</button>
|
||||||
|
<button class="donate-btn active" onclick="setDonation(10)">€10</button>
|
||||||
|
<button class="donate-btn" onclick="setDonation(20)">€20</button>
|
||||||
|
<button class="donate-btn" onclick="setDonation(50)">€50</button>
|
||||||
|
<button class="donate-btn" onclick="setDonation(100)">€100</button>
|
||||||
|
</div>
|
||||||
|
<a id="donateLink" href="https://paypal.me/elmadanisalka/10" target="_blank" class="btn-primary" style="width:100%;justify-content:center;font-size:1rem">Donate €10 via PayPal →</a>
|
||||||
|
<p style="font-size:.75rem;color:var(--muted);text-align:center;margin-top:.8rem">No account needed. Quarterly transparency report published. Surplus → community contributors.</p>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<!-- FOOTER -->
|
||||||
|
<footer>
|
||||||
|
<div class="footer-inner">
|
||||||
|
<div>
|
||||||
|
<div class="footer-brand">Inference-X</div>
|
||||||
|
<p class="footer-tagline" id="footcopy">Built in Morocco for the world.<br>Intelligence flows where gravity takes it.</p>
|
||||||
|
<div style="margin-top:1rem;display:flex;gap:.8rem;flex-wrap:wrap">
|
||||||
|
<a href="https://git.inference-x.com/elmadani/inference-x" class="btn-secondary" style="font-size:.78rem;padding:.4rem .8rem">GitHub</a>
|
||||||
|
<a href="https://build.inference-x.com" class="btn-secondary" style="font-size:.78rem;padding:.4rem .8rem">SaaS</a>
|
||||||
|
<a href="https://git.inference-x.com" class="btn-secondary" style="font-size:.78rem;padding:.4rem .8rem">Gitea</a>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div class="footer-col">
|
||||||
|
<h4>Engine</h4>
|
||||||
|
<a href="#discover">How it works</a>
|
||||||
|
<a href="#device">Your device</a>
|
||||||
|
<a href="#engine">Backends</a>
|
||||||
|
<a href="#models">Models</a>
|
||||||
|
<a href="#api">API docs</a>
|
||||||
|
</div>
|
||||||
|
<div class="footer-col">
|
||||||
|
<h4>Community</h4>
|
||||||
|
<a href="#community">Tools</a>
|
||||||
|
<a href="#organ">Organ Store</a>
|
||||||
|
<a href="#scout">Hardware Scout</a>
|
||||||
|
<a href="#join">11 Cratons</a>
|
||||||
|
<a href="#donate">Donate</a>
|
||||||
|
</div>
|
||||||
|
<div class="footer-col">
|
||||||
|
<h4>Legal</h4>
|
||||||
|
<a href="https://git.inference-x.com/elmadani/inference-x/src/branch/master/LICENSE" target="_blank">SALKA-IX License</a>
|
||||||
|
<a href="https://git.inference-x.com/elmadani/inference-x" target="_blank">Source Code</a>
|
||||||
|
<a href="mailto:Elmadani.SALKA@proton.me">Contact</a>
|
||||||
|
<a href="#pricing">Pricing</a>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div class="footer-bottom">
|
||||||
|
<span>© 2025–2026 SALKA HOLDING SA (forming, Zug CH) · Elmadani Salka · SALKA-IX License v1.0</span>
|
||||||
|
<span>🇲🇦 Morocco → 🌍 World</span>
|
||||||
|
</div>
|
||||||
|
</footer>
|
||||||
|
<script>
|
||||||
|
// ═══ THEME ═══
|
||||||
|
function setTheme(t){document.documentElement.dataset.theme=t;document.getElementById('tt').textContent=t==='dark'?'☀':'☾';try{localStorage.setItem('ix-t',t)}catch(e){}}
|
||||||
|
document.getElementById('tt').onclick=function(){setTheme(document.documentElement.dataset.theme==='dark'?'light':'dark')};
|
||||||
|
try{var st=localStorage.getItem('ix-t');if(st)setTheme(st);else if(window.matchMedia('(prefers-color-scheme:light)').matches)setTheme('light')}catch(e){}
|
||||||
|
|
||||||
|
// ═══ i18n ═══
|
||||||
|
var LANGS=['en','fr','ar','es','de','zh','hi','pt','sw','ru','tr','ja','ko','nl','it','pl'];
|
||||||
|
var LNAMES={en:'EN 🇬🇧',fr:'FR 🇫🇷',ar:'AR 🇲🇦',es:'ES 🇪🇸',de:'DE 🇩🇪',zh:'ZH 🇨🇳',hi:'HI 🇮🇳',pt:'PT 🇧🇷',sw:'SW 🇰🇪',ru:'RU 🇷🇺',tr:'TR 🇹🇷',ja:'JA 🇯🇵',ko:'KO 🇰🇷',nl:'NL 🇳🇱',it:'IT 🇮🇹',pl:'PL 🇵🇱'};
|
||||||
|
var curLang=0;
|
||||||
|
|
||||||
|
var TRANS={
|
||||||
|
en:{disc_title:"Three things to know. Nothing more.",disc_sub:"No degree required. If you have a device, you have AI.",d1t:"It's a tiny file",d1p:"305 kilobytes. Smaller than a photo on your phone. This file lets your computer run AI — any AI — without the internet. Download it, run it. That's it.",d2t:"Your words stay yours",d2p:"When you use AI online, your questions travel to a distant server. Someone can read them. With Inference-X, nothing leaves your machine. Ever.",d3t:"It runs on anything",d3p:"Old laptop, new phone, Raspberry Pi, datacenter. Same file. It detects your hardware and uses it. No configuration needed.",dev_title:"What can YOUR computer do?",dev_sub:"Move the slider to your RAM. See what's possible.",ram_hint:"Your AI runs locally. No internet. No account. Free forever.",priv_title:"Where do your words go?",priv_cloud_t:"Cloud AI",priv_cloud_p:"Your question leaves your device, crosses the internet, reaches a server in another country, gets processed, stored, and analyzed. You pay per word.",priv_local_t:"Inference-X",priv_local_p:"Your question stays on your desk. The answer is computed by your own processor. Nothing leaves. Nothing is stored. You pay nothing.",size_title:"How small is 305 KB?",size_sub:"The entire AI engine — smaller than what you think.",cost_title:"How much does AI cost?",cost_sub:"Using AI 1 hour per day, every day, for a year.",cost_cloud_l:"Cloud API (GPT-4 class)",cost_local_l:"Inference-X (your hardware)",cost_local_note:"forever · electricity only · your data stays yours",cost_note:"No API key. No subscription. No limit. Your hardware, your AI.",start_title:"Ready? Three steps.",start_sub:"Pick your system.",footcopy:"Built in Morocco for the world.\nIntelligence flows where gravity takes it."},
|
||||||
|
fr:{disc_title:"Trois choses à savoir. Pas plus.",disc_sub:"Pas de diplôme requis. Si tu as un appareil, tu as l'IA.",d1t:"C'est un tout petit fichier",d1p:"305 kilo-octets. Plus petit qu'une photo sur ton téléphone. Ce fichier permet à ton ordinateur de faire tourner l'IA — n'importe laquelle — sans internet.",d2t:"Tes mots restent les tiens",d2p:"Quand tu utilises l'IA en ligne, tes questions voyagent vers un serveur lointain. Quelqu'un peut les lire. Avec Inference-X, rien ne sort de ta machine. Jamais.",d3t:"Ça tourne sur tout",d3p:"Vieux portable, téléphone récent, Raspberry Pi, datacenter. Même fichier. Il détecte ton matériel et l'utilise. Aucune configuration.",dev_title:"Que peut faire TON ordinateur ?",dev_sub:"Déplace le curseur sur ta RAM. Vois ce qui est possible.",ram_hint:"Ton IA tourne en local. Pas d'internet. Pas de compte. Gratuit pour toujours.",priv_title:"Où vont tes mots ?",priv_cloud_t:"IA Cloud",priv_cloud_p:"Ta question quitte ton appareil, traverse internet, arrive sur un serveur dans un autre pays. On peut la lire. Tu paies au mot.",priv_local_t:"Inference-X",priv_local_p:"Ta question reste sur ton bureau. La réponse est calculée par ton propre processeur. Rien ne sort. Tu ne paies rien.",size_title:"305 Ko, c'est si petit ?",size_sub:"Le moteur IA entier — plus petit que tu ne crois.",cost_title:"Combien coûte l'IA ?",cost_sub:"1h par jour, tous les jours, pendant un an.",cost_cloud_l:"API Cloud (classe GPT-4)",cost_local_l:"Inference-X (ton matériel)",cost_local_note:"pour toujours · électricité seulement",cost_note:"Pas de clé API. Pas d'abonnement. Pas de limite. Ton matériel, ton IA.",start_title:"Prêt ? Trois étapes.",start_sub:"Choisis ton système.",footcopy:"Construit au Maroc pour le monde.\nL'intelligence coule là où la gravité la mène."},
|
||||||
|
ar:{disc_title:"ثلاثة أشياء. لا أكثر.",disc_sub:"لا شهادة مطلوبة. إذا لديك جهاز، لديك ذكاء اصطناعي.",d1t:"ملف صغير جداً",d1p:"305 كيلوبايت. أصغر من صورة على هاتفك. هذا الملف يجعل حاسوبك يشغّل الذكاء الاصطناعي بدون إنترنت.",d2t:"كلماتك ملكك",d2p:"عند استخدام الذكاء الاصطناعي عبر الإنترنت، أسئلتك تسافر لخادم بعيد. مع Inference-X، لا شيء يغادر جهازك.",d3t:"يعمل على أي جهاز",d3p:"حاسوب قديم، هاتف جديد، Raspberry Pi. نفس الملف. يكتشف عتادك ويستخدمه.",dev_title:"ماذا يستطيع حاسوبك؟",dev_sub:"حرّك المؤشر لذاكرتك. انظر ما المتاح.",ram_hint:"ذكاؤك يعمل محلياً. بلا إنترنت. بلا حساب. مجاني للأبد.",priv_title:"أين تذهب كلماتك؟",priv_cloud_t:"ذكاء سحابي",priv_cloud_p:"سؤالك يغادر جهازك، يعبر الإنترنت، يصل خادماً في بلد آخر. تدفع لكل كلمة.",priv_local_t:"Inference-X",priv_local_p:"سؤالك يبقى على مكتبك. الجواب يحسبه معالجك. لا شيء يخرج. لا تدفع شيئاً.",size_title:"كم صغير 305 كيلوبايت؟",size_sub:"المحرك بالكامل — أصغر مما تظن.",cost_title:"كم يكلف الذكاء الاصطناعي؟",cost_sub:"ساعة يومياً، كل يوم، لمدة سنة.",cost_cloud_l:"واجهة سحابية (فئة GPT-4)",cost_local_l:"Inference-X (عتادك)",cost_local_note:"للأبد · كهرباء فقط",cost_note:"بلا مفتاح API. بلا اشتراك. بلا حدود.",start_title:"مستعد؟ ثلاث خطوات.",start_sub:"اختر نظامك.",footcopy:"صُنع في المغرب للعالم.\nالذكاء يتدفق حيث تأخذه الجاذبية."},
|
||||||
|
es:{disc_title:"Tres cosas. Nada más.",disc_sub:"Sin título requerido. Si tienes un dispositivo, tienes IA.",d1t:"Es un archivo diminuto",d1p:"305 kilobytes. Más pequeño que una foto de tu móvil. Este archivo permite que tu ordenador ejecute IA sin internet.",d2t:"Tus palabras son tuyas",d2p:"Con la IA en la nube, tus preguntas van a un servidor lejano. Con Inference-X, nada sale de tu máquina.",d3t:"Funciona en cualquier cosa",d3p:"Portátil viejo, teléfono nuevo, Raspberry Pi, datacenter. Mismo archivo. Sin configuración.",dev_title:"¿Qué puede hacer TU ordenador?",dev_sub:"Mueve el control a tu RAM.",ram_hint:"Tu IA corre localmente. Sin internet. Sin cuenta. Gratis para siempre.",priv_title:"¿Dónde van tus palabras?",priv_cloud_t:"IA en la nube",priv_cloud_p:"Tu pregunta viaja a un servidor lejano. Alguien puede leerla. Pagas por palabra.",priv_local_t:"Inference-X",priv_local_p:"Tu pregunta se queda en tu escritorio. La respuesta la calcula tu propio procesador. No pagas nada.",size_title:"¿Cuán pequeño es 305 KB?",size_sub:"El motor completo, más pequeño de lo que crees.",cost_title:"¿Cuánto cuesta la IA?",cost_sub:"1 hora al día, todos los días, durante un año.",cost_cloud_l:"API Cloud (clase GPT-4)",cost_local_l:"Inference-X (tu hardware)",cost_local_note:"para siempre · solo electricidad",cost_note:"Sin API key. Sin suscripción. Sin límites.",start_title:"¿Listo? Tres pasos.",start_sub:"Elige tu sistema.",footcopy:"Construido en Marruecos para el mundo."},
|
||||||
|
de:{disc_title:"Drei Dinge. Nicht mehr.",disc_sub:"Kein Studium nötig. Mit einem Gerät hast du KI.",d1t:"Eine winzige Datei",d1p:"305 Kilobyte. Kleiner als ein Foto. Diese Datei lässt deinen Computer KI ausführen — ohne Internet.",d2t:"Deine Worte bleiben deine",d2p:"Online-KI sendet deine Fragen an fremde Server. Mit Inference-X verlässt nichts deinen Rechner.",d3t:"Läuft auf allem",d3p:"Alter Laptop, neues Handy, Raspberry Pi, Rechenzentrum. Gleiche Datei. Keine Konfiguration.",dev_title:"Was kann DEIN Computer?",dev_sub:"Bewege den Regler auf deinen RAM.",ram_hint:"Deine KI läuft lokal. Kein Internet. Kein Konto. Für immer kostenlos.",priv_title:"Wohin gehen deine Worte?",priv_cloud_t:"Cloud-KI",priv_cloud_p:"Deine Frage reist zu einem fernen Server. Jemand kann sie lesen. Du zahlst pro Wort.",priv_local_t:"Inference-X",priv_local_p:"Deine Frage bleibt auf deinem Schreibtisch. Die Antwort berechnet dein eigener Prozessor. Du zahlst nichts.",size_title:"Wie klein sind 305 KB?",size_sub:"Die komplette KI-Engine — kleiner als du denkst.",cost_title:"Was kostet KI wirklich?",cost_sub:"1 Stunde täglich, jeden Tag, ein Jahr lang.",cost_cloud_l:"Cloud API (GPT-4-Klasse)",cost_local_l:"Inference-X (deine Hardware)",cost_local_note:"für immer · nur Strom",cost_note:"Kein API-Key. Kein Abo. Kein Limit.",start_title:"Bereit? Drei Schritte.",start_sub:"Wähle dein System.",footcopy:"Gebaut in Marokko für die Welt."},
|
||||||
|
zh:{disc_title:"三件事。仅此而已。",disc_sub:"无需学位。有设备就有AI。",d1t:"这只是一个小文件",d1p:"305 KB。比手机上的照片还小。这个文件让你的电脑在没有网络的情况下运行AI。",d2t:"你的话语属于你",d2p:"使用在线AI时,你的问题会传到遥远的服务器。有人可以读取它们。用Inference-X,没有任何东西离开你的设备。",d3t:"可在任何硬件上运行",d3p:"旧电脑、新手机、树莓派、数据中心。同一个文件。无需配置。",dev_title:"你的电脑能做什么?",dev_sub:"拖动滑块到你的内存大小。",ram_hint:"你的AI在本地运行。无网络。无账号。永久免费。",priv_title:"你的话语去了哪里?",priv_cloud_t:"云AI",priv_cloud_p:"你的问题离开设备,穿越互联网,到达另一个国家的服务器,被处理、储存和分析。你按字付费。",priv_local_t:"Inference-X",priv_local_p:"你的问题留在你的桌上。答案由你自己的处理器计算。什么都没有离开。你不付任何费用。",size_title:"305 KB有多小?",size_sub:"整个AI引擎,比你想象的还要小。",cost_title:"AI到底花多少钱?",cost_sub:"每天1小时,每天,一整年。",cost_cloud_l:"云API(GPT-4级别)",cost_local_l:"Inference-X(你的硬件)",cost_local_note:"永久 · 仅电费",cost_note:"无需API密钥。无订阅。无限制。",start_title:"准备好了?三个步骤。",start_sub:"选择你的系统。",footcopy:"在摩洛哥为世界而建。"},
|
||||||
|
hi:{disc_title:"तीन बातें। बस इतना।",disc_sub:"कोई डिग्री नहीं चाहिए। अगर डिवाइस है तो AI है।",d1t:"यह एक छोटी सी फ़ाइल है",d1p:"305 किलोबाइट। आपके फ़ोन की तस्वीर से भी छोटा। यह फ़ाइल आपके कंप्यूटर को बिना इंटरनेट के AI चलाने देती है।",d2t:"आपके शब्द आपके हैं",d2p:"ऑनलाइन AI आपके सवाल दूर के सर्वर पर भेजता है। Inference-X के साथ, कुछ भी आपकी मशीन नहीं छोड़ता।",d3t:"हर डिवाइस पर चलता है",d3p:"पुराना लैपटॉप, नया फ़ोन, Raspberry Pi। एक ही फ़ाइल। कोई सेटअप नहीं।",dev_title:"आपका कंप्यूटर क्या कर सकता है?",dev_sub:"स्लाइडर को अपनी RAM पर ले जाएं।",ram_hint:"आपका AI लोकल चलता है। इंटरनेट नहीं। अकाउंट नहीं। हमेशा के लिए मुफ़्त।",priv_title:"आपके शब्द कहाँ जाते हैं?",priv_cloud_t:"क्लाउड AI",priv_cloud_p:"आपका सवाल इंटरनेट पार करके दूसरे देश के सर्वर पर जाता है। कोई पढ़ सकता है। आप प्रति शब्द भुगतान करते हैं।",priv_local_t:"Inference-X",priv_local_p:"आपका सवाल आपकी मेज पर रहता है। जवाब आपका प्रोसेसर देता है। कुछ नहीं जाता। कुछ नहीं देते।",size_title:"305 KB कितना छोटा है?",size_sub:"पूरा AI इंजन — सोच से भी छोटा।",cost_title:"AI की असली कीमत?",cost_sub:"रोज 1 घंटे, पूरे साल।",cost_cloud_l:"क्लाउड API (GPT-4 श्रेणी)",cost_local_l:"Inference-X (आपका हार्डवेयर)",cost_local_note:"हमेशा के लिए · सिर्फ बिजली",cost_note:"कोई API key नहीं। कोई सब्सक्रिप्शन नहीं।",start_title:"तैयार? तीन कदम।",start_sub:"अपना सिस्टम चुनें।",footcopy:"मोरक्को में दुनिया के लिए बनाया गया।"}
|
||||||
|
};
|
||||||
|
|
||||||
|
function applyLang(l){
|
||||||
|
var t=TRANS[l]||TRANS['en'];
|
||||||
|
for(var k in t){var el=document.getElementById(k);if(el){if(k==='footcopy')el.innerHTML=t[k].replace('\n','<br>');else el.textContent=t[k];}}
|
||||||
|
document.documentElement.lang=l;
|
||||||
|
// RTL for Arabic
|
||||||
|
document.body.dir=(l==='ar')?'rtl':'ltr';
|
||||||
|
}
|
||||||
|
|
||||||
|
function cycleLang(){
|
||||||
|
curLang=(curLang+1)%LANGS.length;
|
||||||
|
var l=LANGS[curLang];
|
||||||
|
document.getElementById('lb').textContent=LNAMES[l]||l.toUpperCase();
|
||||||
|
applyLang(l);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Auto-detect language
|
||||||
|
try{
|
||||||
|
var bl=navigator.language.split('-')[0];
|
||||||
|
var idx=LANGS.indexOf(bl);
|
||||||
|
if(idx>0){curLang=idx-1;cycleLang();}
|
||||||
|
}catch(e){}
|
||||||
|
|
||||||
|
// ═══ RAM SLIDER ═══
|
||||||
|
var MODELS=[
|
||||||
|
{min:1,max:3,icon:'⚡',name:'LLaMA 3.2 1B',size:'~1 GB',speed:'~15 tok/s',cat:'Chat · Fast'},
|
||||||
|
{min:2,max:5,icon:'🧠',name:'Phi-3 Mini 3.8B',size:'~2.5 GB',speed:'~12 tok/s',cat:'Smart · Efficient'},
|
||||||
|
{min:4,max:8,icon:'🌐',name:'Mistral 7B',size:'~5 GB',speed:'~8 tok/s',cat:'Multilingual · Code'},
|
||||||
|
{min:6,max:10,icon:'🔬',name:'LLaMA 3.1 8B',size:'~6 GB',speed:'~7 tok/s',cat:'Reasoning · General'},
|
||||||
|
{min:10,max:20,icon:'✨',name:'Qwen 2.5 14B',size:'~9 GB',speed:'~5 tok/s',cat:'Analysis · Multilingual'},
|
||||||
|
{min:14,max:24,icon:'🎯',name:'Mistral 22B',size:'~14 GB',speed:'~3 tok/s',cat:'Expert · Creative'},
|
||||||
|
{min:24,max:56,icon:'🚀',name:'LLaMA 3.1 70B',size:'~45 GB',speed:'~1.5 tok/s',cat:'Professional · Math'},
|
||||||
|
{min:56,max:200,icon:'🌟',name:'DeepSeek V3 671B',size:'~60 GB (MoE)',speed:'~0.8 tok/s',cat:'Expert · Research'}
|
||||||
|
];
|
||||||
|
|
||||||
|
function updateRAM(){
|
||||||
|
var ram=parseInt(document.getElementById('ramSlider').value);
|
||||||
|
document.getElementById('ramVal').innerHTML='RAM: <strong>'+ram+' GB</strong> — models that fit on your device';
|
||||||
|
var html='';
|
||||||
|
var shown=0;
|
||||||
|
MODELS.forEach(function(m){
|
||||||
|
var fits=ram>=m.min;
|
||||||
|
if(fits||ram>=m.min-2){
|
||||||
|
html+='<div class="ram-item'+(fits?' active':'') +'">';
|
||||||
|
html+='<span class="ri-icon">'+m.icon+'</span>';
|
||||||
|
html+='<span class="ri-name">'+m.name+'</span>';
|
||||||
|
html+='<span class="ri-size">'+m.size+'</span>';
|
||||||
|
if(fits)html+='<span class="ri-speed">'+m.speed+'</span>';
|
||||||
|
else html+='<span class="ri-speed" style="color:var(--red)">Need '+m.min+'GB</span>';
|
||||||
|
html+='</div>';
|
||||||
|
shown++;
|
||||||
|
}
|
||||||
|
});
|
||||||
|
if(shown===0)html='<div style="color:var(--muted);font-size:.85rem;padding:.8rem">Move slider to see what models fit →</div>';
|
||||||
|
document.getElementById('ramResults').innerHTML=html;
|
||||||
|
}
|
||||||
|
document.getElementById('ramSlider').oninput=updateRAM;
|
||||||
|
updateRAM();
|
||||||
|
|
||||||
|
// ═══ QUICK START TABS ═══
|
||||||
|
function setQS(os){
|
||||||
|
document.querySelectorAll('.qs-tab').forEach(function(t){t.classList.remove('active')});
|
||||||
|
document.querySelectorAll('.qs-block').forEach(function(b){b.classList.remove('active')});
|
||||||
|
event.target.classList.add('active');
|
||||||
|
document.getElementById('qs-'+os).classList.add('active');
|
||||||
|
}
|
||||||
|
|
||||||
|
// ═══ DONATION ═══
|
||||||
|
function setDonation(amt){
|
||||||
|
document.querySelectorAll('.donate-btn').forEach(function(b){b.classList.remove('active')});
|
||||||
|
event.target.classList.add('active');
|
||||||
|
document.getElementById('donateLink').href='https://paypal.me/elmadanisalka/'+amt;
|
||||||
|
document.getElementById('donateLink').textContent='Donate €'+amt+' via PayPal →';
|
||||||
|
}
|
||||||
|
|
||||||
|
// ═══ HARDWARE SCOUT ═══
|
||||||
|
function loadScout(){
|
||||||
|
fetch('/api/community/scout').then(function(r){return r.json()}).then(function(d){
|
||||||
|
var b=d.backends||{};
|
||||||
|
var rows='';
|
||||||
|
if(Object.keys(b).length===0){
|
||||||
|
rows='<tr><td colspan="5" style="color:var(--muted);text-align:center;font-size:.82rem;padding:2rem">No nodes reporting yet. Run IX with <code>--scout</code> to join.</td></tr>';
|
||||||
|
}else{
|
||||||
|
for(var bk in b){
|
||||||
|
var bd=b[bk];
|
||||||
|
var load=Math.round((bd.avg_load_pct||0));
|
||||||
|
rows+='<tr>';
|
||||||
|
rows+='<td><span class="backend-chip">'+bk+'</span></td>';
|
||||||
|
rows+='<td>'+(bd.node_count||0)+'</td>';
|
||||||
|
rows+='<td>'+(bd.avg_tokens_per_sec||0).toFixed(1)+'</td>';
|
||||||
|
rows+='<td><div class="load-bar"><div class="load-fill" style="width:'+load+'%"></div></div> '+load+'%</td>';
|
||||||
|
rows+='<td><span class="live-dot"></span>live</td>';
|
||||||
|
rows+='</tr>';
|
||||||
|
}
|
||||||
|
}
|
||||||
|
document.getElementById('scoutBody').innerHTML=rows;
|
||||||
|
}).catch(function(){
|
||||||
|
document.getElementById('scoutBody').innerHTML='<tr><td colspan="5" style="color:var(--muted);font-size:.82rem;text-align:center">Network loading...</td></tr>';
|
||||||
|
});
|
||||||
|
}
|
||||||
|
loadScout();
|
||||||
|
setInterval(loadScout,30000);
|
||||||
|
|
||||||
|
// ═══ LIVE COSTS ═══
|
||||||
|
fetch('/api/community/costs').then(function(r){return r.json()}).then(function(d){
|
||||||
|
if(d.total_eur_month){document.getElementById('costsLive').textContent='€'+d.total_eur_month+'/mo';}
|
||||||
|
}).catch(function(){});
|
||||||
|
|
||||||
|
// ═══ SCROLL REVEAL ═══
|
||||||
|
var observer=new IntersectionObserver(function(entries){
|
||||||
|
entries.forEach(function(e){if(e.isIntersecting)e.target.classList.add('visible')});
|
||||||
|
},{threshold:0.1,rootMargin:'0px 0px -40px 0px'});
|
||||||
|
document.querySelectorAll('.reveal').forEach(function(el){observer.observe(el)});
|
||||||
|
|
||||||
|
// ═══ NAV ACTIVE ═══
|
||||||
|
window.addEventListener('scroll',function(){
|
||||||
|
var sections=document.querySelectorAll('section[id]');
|
||||||
|
var pos=window.scrollY+100;
|
||||||
|
sections.forEach(function(s){
|
||||||
|
if(s.offsetTop<=pos&&s.offsetTop+s.offsetHeight>pos){
|
||||||
|
document.querySelectorAll('.nav-links a').forEach(function(a){
|
||||||
|
a.style.color=a.getAttribute('href')==='#'+s.id?'var(--amber)':'';
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
});
|
||||||
|
</script>
|
||||||
|
</body>
|
||||||
|
</html>
|
||||||
15
tools/compilation/build-linux-x64.sh
Normal file
15
tools/compilation/build-linux-x64.sh
Normal file
@ -0,0 +1,15 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# Build IX engine for Linux x86_64 with CUDA/CPU/Vulkan
|
||||||
|
set -e
|
||||||
|
|
||||||
|
BACKEND=${1:-cpu}
|
||||||
|
OUTPUT="ix-linux-x64-$BACKEND"
|
||||||
|
|
||||||
|
echo "[BUILD] Linux x64 | Backend: $BACKEND"
|
||||||
|
cmake -B build \
|
||||||
|
-DCMAKE_BUILD_TYPE=Release \
|
||||||
|
-DIX_BACKEND=${BACKEND^^} \
|
||||||
|
-DCMAKE_C_FLAGS="-O3 -march=native" 2>&1 | tail -5
|
||||||
|
cmake --build build --target ix -j$(nproc) 2>&1 | tail -5
|
||||||
|
cp build/bin/ix "$OUTPUT"
|
||||||
|
echo "[✓] Built: $OUTPUT ($(wc -c < $OUTPUT) bytes)"
|
||||||
7
tools/compilation/build-macos-arm64.sh
Normal file
7
tools/compilation/build-macos-arm64.sh
Normal file
@ -0,0 +1,7 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# Build IX engine for macOS Apple Silicon (Metal)
|
||||||
|
set -e
|
||||||
|
cmake -B build -DCMAKE_BUILD_TYPE=Release -DIX_BACKEND=METAL -DCMAKE_OSX_ARCHITECTURES=arm64 2>&1 | tail -3
|
||||||
|
cmake --build build --target ix -j$(sysctl -n hw.ncpu) 2>&1 | tail -3
|
||||||
|
cp build/bin/ix ix-macos-arm64
|
||||||
|
echo "[✓] Built: ix-macos-arm64"
|
||||||
160
tools/forge.sh
Normal file
160
tools/forge.sh
Normal file
@ -0,0 +1,160 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# IX Forge — Model conversion and quantization pipeline
|
||||||
|
# Usage: ./forge.sh <command> [options]
|
||||||
|
# Commands: convert, quantize, package, benchmark
|
||||||
|
|
||||||
|
set -e
|
||||||
|
|
||||||
|
IX_FORGE_VER="1.0.0"
|
||||||
|
LLAMA_CPP_DIR="${IX_LLAMA_CPP:-$HOME/.inference-x/llama.cpp}"
|
||||||
|
OUTPUT_DIR="${IX_OUTPUT:-./forge-output}"
|
||||||
|
|
||||||
|
log() { echo -e "\033[0;36m[IX-FORGE]\033[0m $1"; }
|
||||||
|
ok() { echo -e "\033[0;32m[✓]\033[0m $1"; }
|
||||||
|
err() { echo -e "\033[0;31m[✗]\033[0m $1"; exit 1; }
|
||||||
|
|
||||||
|
usage() {
|
||||||
|
cat << 'USAGE'
|
||||||
|
IX Forge v1.0 — Model conversion and quantization
|
||||||
|
|
||||||
|
USAGE:
|
||||||
|
./forge.sh convert --source <hf_model_dir> --output <name.gguf>
|
||||||
|
./forge.sh quantize --input <model.gguf> --quant Q4_K_M --output <name_q4.gguf>
|
||||||
|
./forge.sh package --model <model.gguf> --name "ModelName" --version 1.0
|
||||||
|
./forge.sh benchmark --model <model.gguf> --prompt "Hello" --runs 10
|
||||||
|
|
||||||
|
QUANTIZATION LEVELS:
|
||||||
|
Q2_K — Smallest (50% quality loss, ~1.5bit)
|
||||||
|
Q4_0 — Small (faster, less accurate)
|
||||||
|
Q4_K_M — RECOMMENDED (best size/quality balance)
|
||||||
|
Q5_K_M — High quality
|
||||||
|
Q6_K — Near-lossless
|
||||||
|
Q8_0 — Near-perfect
|
||||||
|
F16 — Full precision (2x model size)
|
||||||
|
|
||||||
|
EXAMPLES:
|
||||||
|
# Convert Mistral 7B from HuggingFace
|
||||||
|
./forge.sh convert --source ./mistral-7b-v0.1 --output mistral-7b.gguf
|
||||||
|
|
||||||
|
# Quantize to Q4_K_M
|
||||||
|
./forge.sh quantize --input mistral-7b.gguf --quant Q4_K_M --output mistral-7b-q4.gguf
|
||||||
|
|
||||||
|
# Full pipeline
|
||||||
|
./forge.sh convert --source ./mymodel && ./forge.sh quantize --input mymodel.gguf --quant Q4_K_M
|
||||||
|
|
||||||
|
USAGE
|
||||||
|
}
|
||||||
|
|
||||||
|
check_llama_cpp() {
|
||||||
|
if [ ! -f "$LLAMA_CPP_DIR/convert_hf_to_gguf.py" ]; then
|
||||||
|
log "llama.cpp not found at $LLAMA_CPP_DIR"
|
||||||
|
log "Installing..."
|
||||||
|
mkdir -p "$LLAMA_CPP_DIR"
|
||||||
|
git clone --depth=1 https://github.com/ggerganov/llama.cpp.git "$LLAMA_CPP_DIR" 2>&1 | tail -3
|
||||||
|
cd "$LLAMA_CPP_DIR" && cmake -B build -DLLAMA_BUILD_SERVER=OFF && cmake --build build -j4 2>&1 | tail -5
|
||||||
|
cd -
|
||||||
|
ok "llama.cpp installed"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
cmd_convert() {
|
||||||
|
local source="" output=""
|
||||||
|
while [[ $# -gt 0 ]]; do
|
||||||
|
case $1 in
|
||||||
|
--source) source="$2"; shift ;;
|
||||||
|
--output) output="$2"; shift ;;
|
||||||
|
esac; shift
|
||||||
|
done
|
||||||
|
[ -z "$source" ] && err "Missing --source"
|
||||||
|
[ -z "$output" ] && output="$(basename $source).gguf"
|
||||||
|
check_llama_cpp
|
||||||
|
mkdir -p "$OUTPUT_DIR"
|
||||||
|
log "Converting $source → $OUTPUT_DIR/$output"
|
||||||
|
python3 "$LLAMA_CPP_DIR/convert_hf_to_gguf.py" "$source" --outtype f16 --outfile "$OUTPUT_DIR/$output"
|
||||||
|
ok "Converted: $OUTPUT_DIR/$output ($(du -sh $OUTPUT_DIR/$output | cut -f1))"
|
||||||
|
}
|
||||||
|
|
||||||
|
cmd_quantize() {
|
||||||
|
local input="" quant="Q4_K_M" output=""
|
||||||
|
while [[ $# -gt 0 ]]; do
|
||||||
|
case $1 in
|
||||||
|
--input) input="$2"; shift ;;
|
||||||
|
--quant) quant="$2"; shift ;;
|
||||||
|
--output) output="$2"; shift ;;
|
||||||
|
esac; shift
|
||||||
|
done
|
||||||
|
[ -z "$input" ] && err "Missing --input"
|
||||||
|
[ -z "$output" ] && output="${input%.gguf}_${quant}.gguf"
|
||||||
|
check_llama_cpp
|
||||||
|
log "Quantizing $input → $output (${quant})"
|
||||||
|
"$LLAMA_CPP_DIR/build/bin/llama-quantize" "$input" "$output" "$quant"
|
||||||
|
ok "Quantized: $output ($(du -sh $output | cut -f1))"
|
||||||
|
}
|
||||||
|
|
||||||
|
cmd_package() {
|
||||||
|
local model="" name="" version="1.0"
|
||||||
|
while [[ $# -gt 0 ]]; do
|
||||||
|
case $1 in
|
||||||
|
--model) model="$2"; shift ;;
|
||||||
|
--name) name="$2"; shift ;;
|
||||||
|
--version) version="$2"; shift ;;
|
||||||
|
esac; shift
|
||||||
|
done
|
||||||
|
[ -z "$model" ] && err "Missing --model"
|
||||||
|
[ -z "$name" ] && name="$(basename $model .gguf)"
|
||||||
|
local pkg_dir="$OUTPUT_DIR/pkg-$name-$version"
|
||||||
|
mkdir -p "$pkg_dir"
|
||||||
|
cp "$model" "$pkg_dir/"
|
||||||
|
local size=$(wc -c < "$model")
|
||||||
|
local sha=$(sha256sum "$model" | cut -c1-32)
|
||||||
|
cat > "$pkg_dir/manifest.json" << MANIFEST
|
||||||
|
{
|
||||||
|
"name": "$name",
|
||||||
|
"version": "$version",
|
||||||
|
"model_file": "$(basename $model)",
|
||||||
|
"size_bytes": $size,
|
||||||
|
"sha256": "$sha",
|
||||||
|
"format": "gguf",
|
||||||
|
"ix_compatible": true,
|
||||||
|
"created_at": "$(date -u +%Y-%m-%dT%H:%M:%SZ)"
|
||||||
|
}
|
||||||
|
MANIFEST
|
||||||
|
tar -czf "$OUTPUT_DIR/$name-$version.ix-package" -C "$OUTPUT_DIR" "pkg-$name-$version"
|
||||||
|
rm -rf "$pkg_dir"
|
||||||
|
ok "Packaged: $OUTPUT_DIR/$name-$version.ix-package"
|
||||||
|
}
|
||||||
|
|
||||||
|
cmd_benchmark() {
|
||||||
|
local model="" prompt="Hello, how are you?" runs=5
|
||||||
|
while [[ $# -gt 0 ]]; do
|
||||||
|
case $1 in
|
||||||
|
--model) model="$2"; shift ;;
|
||||||
|
--prompt) prompt="$2"; shift ;;
|
||||||
|
--runs) runs=$2; shift ;;
|
||||||
|
esac; shift
|
||||||
|
done
|
||||||
|
[ -z "$model" ] && err "Missing --model"
|
||||||
|
log "Benchmarking $model ($runs runs)"
|
||||||
|
log "Prompt: $prompt"
|
||||||
|
local total=0
|
||||||
|
for i in $(seq 1 $runs); do
|
||||||
|
local start=$(date +%s%N)
|
||||||
|
curl -s -X POST http://localhost:8080/v1/completions \
|
||||||
|
-H "Content-Type: application/json" \
|
||||||
|
-d "{\"prompt\":\"$prompt\",\"max_tokens\":50}" > /dev/null
|
||||||
|
local end=$(date +%s%N)
|
||||||
|
local ms=$(( (end - start) / 1000000 ))
|
||||||
|
log "Run $i: ${ms}ms"
|
||||||
|
total=$((total + ms))
|
||||||
|
done
|
||||||
|
local avg=$((total / runs))
|
||||||
|
ok "Average latency: ${avg}ms over $runs runs"
|
||||||
|
}
|
||||||
|
|
||||||
|
case "${1:-help}" in
|
||||||
|
convert) shift; cmd_convert "$@" ;;
|
||||||
|
quantize) shift; cmd_quantize "$@" ;;
|
||||||
|
package) shift; cmd_package "$@" ;;
|
||||||
|
benchmark) shift; cmd_benchmark "$@" ;;
|
||||||
|
*) usage ;;
|
||||||
|
esac
|
||||||
135
tools/organ.py
Normal file
135
tools/organ.py
Normal file
@ -0,0 +1,135 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
"""
|
||||||
|
IX Organ Tool — Inference-X Community Toolchain
|
||||||
|
Package, publish, and install IX "organs" (AI personas)
|
||||||
|
|
||||||
|
Usage:
|
||||||
|
./organ.py pack --model model.gguf --prompt system.txt --name "ARIA" -o aria.organ
|
||||||
|
./organ.py install aria.organ
|
||||||
|
./organ.py list https://git.inference-x.com/organs
|
||||||
|
./organ.py publish aria.organ --token YOUR_GITEA_TOKEN
|
||||||
|
"""
|
||||||
|
|
||||||
|
import os, sys, json, hashlib, zipfile, argparse, urllib.request, shutil
|
||||||
|
from pathlib import Path
|
||||||
|
|
||||||
|
ORGAN_STORE_URL = "https://git.inference-x.com/api/v1/repos/elmadani/ix-organs"
|
||||||
|
IX_HOME = Path.home() / ".inference-x"
|
||||||
|
ORGANS_DIR = IX_HOME / "organs"
|
||||||
|
|
||||||
|
def pack(args):
|
||||||
|
"""Package a model + prompt + config into an .organ file"""
|
||||||
|
organ_meta = {
|
||||||
|
"version": "1.0",
|
||||||
|
"name": args.name,
|
||||||
|
"description": args.description or "",
|
||||||
|
"model_file": Path(args.model).name,
|
||||||
|
"quant": args.quant or "Q4_K_M",
|
||||||
|
"context_size": args.ctx or 4096,
|
||||||
|
"temperature": args.temp or 0.7,
|
||||||
|
"max_tokens": args.max_tokens or 512,
|
||||||
|
"tags": (args.tags or "").split(","),
|
||||||
|
"author": args.author or "anonymous",
|
||||||
|
"license": args.license or "MIT",
|
||||||
|
"created_at": __import__("datetime").datetime.utcnow().isoformat(),
|
||||||
|
}
|
||||||
|
|
||||||
|
output = args.output or f"{args.name.lower().replace(' ','-')}.organ"
|
||||||
|
|
||||||
|
with zipfile.ZipFile(output, 'w', zipfile.ZIP_DEFLATED) as zf:
|
||||||
|
# Meta
|
||||||
|
zf.writestr("organ.json", json.dumps(organ_meta, indent=2))
|
||||||
|
# Model
|
||||||
|
if args.model and Path(args.model).exists():
|
||||||
|
zf.write(args.model, f"model/{Path(args.model).name}")
|
||||||
|
# System prompt
|
||||||
|
if args.prompt and Path(args.prompt).exists():
|
||||||
|
with open(args.prompt) as f:
|
||||||
|
zf.writestr("system_prompt.txt", f.read())
|
||||||
|
elif args.prompt_text:
|
||||||
|
zf.writestr("system_prompt.txt", args.prompt_text)
|
||||||
|
|
||||||
|
size = Path(output).stat().st_size
|
||||||
|
h = hashlib.sha256(Path(output).read_bytes()).hexdigest()[:16]
|
||||||
|
print(f"✓ Organ packed: {output}")
|
||||||
|
print(f" Size: {size:,} bytes | SHA256: {h}...")
|
||||||
|
return output
|
||||||
|
|
||||||
|
def install(args):
|
||||||
|
"""Install an organ from a file or URL"""
|
||||||
|
src = args.source
|
||||||
|
ORGANS_DIR.mkdir(parents=True, exist_ok=True)
|
||||||
|
|
||||||
|
if src.startswith("http"):
|
||||||
|
print(f"Downloading {src}...")
|
||||||
|
fname, _ = urllib.request.urlretrieve(src)
|
||||||
|
else:
|
||||||
|
fname = src
|
||||||
|
|
||||||
|
with zipfile.ZipFile(fname, 'r') as zf:
|
||||||
|
meta_raw = zf.read("organ.json")
|
||||||
|
meta = json.loads(meta_raw)
|
||||||
|
name = meta["name"].lower().replace(" ", "-")
|
||||||
|
dest = ORGANS_DIR / name
|
||||||
|
dest.mkdir(exist_ok=True)
|
||||||
|
zf.extractall(dest)
|
||||||
|
|
||||||
|
print(f"✓ Organ installed: {dest}")
|
||||||
|
print(f" Name: {meta['name']}")
|
||||||
|
print(f" Model: {meta.get('model_file','?')}")
|
||||||
|
print(f" Run with: ix --organ {name}")
|
||||||
|
|
||||||
|
def list_organs(args):
|
||||||
|
"""List available organs from the store"""
|
||||||
|
url = f"{ORGAN_STORE_URL}/contents/organs"
|
||||||
|
try:
|
||||||
|
with urllib.request.urlopen(url) as r:
|
||||||
|
data = json.load(r)
|
||||||
|
print("Available organs:")
|
||||||
|
for item in data:
|
||||||
|
print(f" {item['name']} — {item['download_url']}")
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Store unavailable: {e}")
|
||||||
|
# List local
|
||||||
|
if ORGANS_DIR.exists():
|
||||||
|
for d in ORGANS_DIR.iterdir():
|
||||||
|
meta_file = d / "organ.json"
|
||||||
|
if meta_file.exists():
|
||||||
|
meta = json.loads(meta_file.read_text())
|
||||||
|
print(f" LOCAL: {meta['name']} ({d.name})")
|
||||||
|
|
||||||
|
def main():
|
||||||
|
p = argparse.ArgumentParser(description="IX Organ Tool")
|
||||||
|
sub = p.add_subparsers(dest="cmd")
|
||||||
|
|
||||||
|
# pack
|
||||||
|
pk = sub.add_parser("pack", help="Pack a model into an organ")
|
||||||
|
pk.add_argument("--model", required=True, help="Path to .gguf model")
|
||||||
|
pk.add_argument("--prompt", help="Path to system prompt file")
|
||||||
|
pk.add_argument("--prompt-text", help="System prompt as text")
|
||||||
|
pk.add_argument("--name", required=True, help="Organ name")
|
||||||
|
pk.add_argument("--description", help="Description")
|
||||||
|
pk.add_argument("--quant", default="Q4_K_M")
|
||||||
|
pk.add_argument("--ctx", type=int, default=4096)
|
||||||
|
pk.add_argument("--temp", type=float, default=0.7)
|
||||||
|
pk.add_argument("--max-tokens", type=int, default=512)
|
||||||
|
pk.add_argument("--tags", help="Comma-separated tags")
|
||||||
|
pk.add_argument("--author")
|
||||||
|
pk.add_argument("--license", default="MIT")
|
||||||
|
pk.add_argument("-o", "--output", help="Output .organ file")
|
||||||
|
|
||||||
|
# install
|
||||||
|
ins = sub.add_parser("install", help="Install an organ")
|
||||||
|
ins.add_argument("source", help="Path or URL to .organ file")
|
||||||
|
|
||||||
|
# list
|
||||||
|
ls = sub.add_parser("list", help="List organs")
|
||||||
|
|
||||||
|
args = p.parse_args()
|
||||||
|
if args.cmd == "pack": pack(args)
|
||||||
|
elif args.cmd == "install": install(args)
|
||||||
|
elif args.cmd == "list": list_organs(args)
|
||||||
|
else: p.print_help()
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
main()
|
||||||
36
tools/store.sh
Normal file
36
tools/store.sh
Normal file
@ -0,0 +1,36 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# IX Store Client — Browse, install, publish models
|
||||||
|
# Usage: ./store.sh [browse|install|publish|rate]
|
||||||
|
|
||||||
|
STORE_API="https://build.inference-x.com/api/store"
|
||||||
|
IX_HOME="$HOME/.inference-x"
|
||||||
|
|
||||||
|
case "${1:-browse}" in
|
||||||
|
browse)
|
||||||
|
echo "=== IX Community Model Store ==="
|
||||||
|
curl -s "$STORE_API" | python3 -c "
|
||||||
|
import sys,json
|
||||||
|
d = json.load(sys.stdin)
|
||||||
|
for item in d.get('items',d if isinstance(d,list) else []):
|
||||||
|
print(f" {item.get('name','?')} | {item.get('size_mb','?')}MB | ⭐ {item.get('rating','?')} | {item.get('downloads','?')} downloads")
|
||||||
|
" 2>/dev/null || echo "Store offline - check git.inference-x.com/elmadani/ix-tools"
|
||||||
|
;;
|
||||||
|
install)
|
||||||
|
[ -z "$2" ] && echo "Usage: store.sh install <model-id>" && exit 1
|
||||||
|
echo "Installing $2..."
|
||||||
|
mkdir -p "$IX_HOME/models"
|
||||||
|
curl -sL "$STORE_API/$2/download" -o "$IX_HOME/models/$2.gguf"
|
||||||
|
echo "✓ Installed to $IX_HOME/models/$2.gguf"
|
||||||
|
;;
|
||||||
|
publish)
|
||||||
|
[ -z "$2" ] && echo "Usage: store.sh publish <file.organ> --token TOKEN" && exit 1
|
||||||
|
TOKEN="$4"
|
||||||
|
[ -z "$TOKEN" ] && read -rp "Gitea token: " TOKEN
|
||||||
|
echo "Publishing $2..."
|
||||||
|
curl -s -X POST -H "Authorization: token $TOKEN" \
|
||||||
|
-F "file=@$2" "$STORE_API/publish"
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
echo "Usage: store.sh [browse|install <id>|publish <file> --token TOKEN]"
|
||||||
|
;;
|
||||||
|
esac
|
||||||
Loading…
Reference in New Issue
Block a user