# IX Tools

**Utilities, scripts and integrations for the Inference-X ecosystem**

[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](LICENSE)

[**inference-x.com**](https://inference-x.com) · [**Community**](https://git.inference-x.com/inference-x-community)
---

## Contents

### Model Management

- `model-fetch` — Download GGUF models from HuggingFace with integrity verification
- `model-convert` — Convert safetensors/PyTorch → GGUF (wraps the llama.cpp converter)
- `model-bench` — Benchmark a model: tokens/s, TTFT, memory usage

### Server Utilities

- `ix-proxy` — Nginx config generator for multi-model inference-x deployments
- `ix-systemd` — systemd unit file generator for background inference services
- `ix-docker` — Minimal Dockerfile (~5 MB) for containerized inference

### Integration Scripts

- `openai-proxy` — HTTP adapter: OpenAI SDK → Inference-X API
- `langchain-adapter` — LangChain provider for Inference-X
- `ollama-bridge` — Drop-in Ollama API compatibility layer

### Monitoring

- `ix-metrics` — Prometheus metrics exporter (requests/s, latency, GPU util)
- `ix-dashboard` — Simple HTML dashboard for monitoring IX instances

## Usage

```bash
git clone https://git.inference-x.com/inference-x/ix-tools
cd ix-tools

# Benchmark a model
./model-bench mistral-7b-v0.1.Q4_K_M.gguf --tokens 100

# Set up as a system service
sudo ./ix-systemd install mistral-7b-v0.1.Q4_K_M.gguf --port 8080
```

---

*Part of the [Inference-X ecosystem](https://inference-x.com)*
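To give a sense of what `ix-systemd` produces, a generated unit file might look roughly like the sketch below. The service name, the `ix-server` binary, and all paths are illustrative assumptions, not the tool's guaranteed output:

```ini
# Hypothetical unit as ix-systemd might generate it -- names and paths are illustrative.
[Unit]
Description=Inference-X server for mistral-7b-v0.1.Q4_K_M.gguf
After=network.target

[Service]
# `ix-server` and the model directory are assumed locations, not confirmed by the tool.
ExecStart=/usr/local/bin/ix-server --model /var/lib/ix/models/mistral-7b-v0.1.Q4_K_M.gguf --port 8080
Restart=on-failure
User=ix

[Install]
WantedBy=multi-user.target
```

A unit like this would then be started with `systemctl enable --now <service-name>`.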
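Since `ix-metrics` is a Prometheus exporter, wiring it into an existing Prometheus server takes one scrape job. The listen port and metrics path below are assumptions for illustration; check the exporter's own defaults:

```yaml
# Hypothetical Prometheus scrape job for ix-metrics; port and path are assumptions.
scrape_configs:
  - job_name: "ix-metrics"
    metrics_path: /metrics
    static_configs:
      - targets: ["localhost:9105"]  # port chosen for illustration only
```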
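`ix-proxy` generates Nginx configuration for fronting several model servers at once. A sketch of the general shape such a config could take, with upstream names, ports, and the path-based routing scheme all assumed rather than taken from the tool:

```nginx
# Illustrative only -- not ix-proxy's actual output; ports and routes are assumptions.
upstream mistral_7b {
    server 127.0.0.1:8080;
}

upstream llama_13b {
    server 127.0.0.1:8081;
}

server {
    listen 80;

    # Route by path prefix to the matching model backend.
    location /mistral-7b/ {
        proxy_pass http://mistral_7b/;
    }

    location /llama-13b/ {
        proxy_pass http://llama_13b/;
    }
}
```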