# Common Workflows

## Chat via UI

- Start services: `./launch-core-x.sh`
- Open http://localhost:5200
- Select “Chat” from the launcher app grid (or nav rail)
- Messages route through gateway (:8090) → mlx-llm (:8091) via OpenResponses
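Before opening the UI, it can help to confirm that the three ports in the routing note above are actually listening. This is a sketch only, assuming a bash-like shell; the `check_port` helper is ours, not part of `launch-core-x.sh`:

```bash
# Hypothetical preflight helper: succeeds if something is listening
# on 127.0.0.1:$1 (uses bash's built-in /dev/tcp redirection).
check_port() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

# Probe the UI (:5200), gateway (:8090), and mlx-llm (:8091) ports.
for port in 5200 8090 8091; do
  if check_port "$port"; then
    echo "port $port: listening"
  else
    echo "port $port: not reachable"
  fi
done
```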
## RAG Search via CLI

```bash
corex rag search "your query here"
# Requires: gateway (:8090) + mlx-rag (:8092)
```
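For more than one query, a thin wrapper around the same CLI call keeps the output readable. The `rag_batch` function below is a hypothetical helper of ours, not part of `corex`; it simply shells out once per line of a query file:

```bash
# Hypothetical wrapper: run `corex rag search` once per line of a query file.
rag_batch() {
  local file=$1 q
  while IFS= read -r q; do
    printf '== %s ==\n' "$q"
    corex rag search "$q"
  done < "$file"
}
```

Usage: `rag_batch queries.txt` with one query per line. The same services (gateway + mlx-rag) must be online.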
## Video Analysis via corex-cli-v2

```bash
cd ~/core-x-kbllr_0/corex-cli-v2
source .venv/bin/activate
corex video analyze -i ~/path/to/video.mp4 -o ./output
# Requires: gateway (:8090), mlx-audio (:8093), mlx-vision (:8094)
```
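To process a whole folder of recordings, the single-file invocation above can be looped. This is a sketch under our own naming; `video_batch` is a hypothetical helper, and it assumes `corex video analyze` accepts `-i`/`-o` exactly as shown above:

```bash
# Hypothetical batch wrapper: analyze every .mp4 in a directory,
# writing each result into its own subfolder of $2.
video_batch() {
  local indir=$1 outdir=$2 f name
  for f in "$indir"/*.mp4; do
    [ -e "$f" ] || continue          # skip when the glob matches nothing
    name=$(basename "$f" .mp4)
    corex video analyze -i "$f" -o "$outdir/$name"
  done
}
```

Usage: `video_batch ~/recordings ./output` (the same three services must be online).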
## Run Data Pipelines

```bash
# All pipelines (requires gateway + RAG online)
python core-x/pipelines/run_all.py

# Local only (no RAG dependency)
python core-x/pipelines/run_all.py --local
```
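Since the only difference between the two modes is the `--local` flag, the choice can be automated by probing the gateway port first. A minimal sketch, assuming a bash-like shell; the probe logic is ours, not part of `run_all.py`:

```bash
# Hypothetical wrapper: run all pipelines, falling back to --local
# when nothing is listening on the gateway port (:8090).
run_pipelines() {
  if (exec 3<>"/dev/tcp/127.0.0.1/8090") 2>/dev/null; then
    python core-x/pipelines/run_all.py
  else
    echo "gateway :8090 not reachable, running local-only" >&2
    python core-x/pipelines/run_all.py --local
  fi
}
```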
## Validate Ecosystem Integrity

```bash
python core-x/scripts/validate-ecosystem.py   # schemas + config
python model-zoo/scripts/validate_registry.py # model zoo
python scripts/generate-registries.py         # rebuild registries
```
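For pre-commit or CI use, the three checks can be chained so a failure stops the run immediately. A sketch; the `validate_all` name is ours, not the repo's:

```bash
# Sketch: run the validators in order, short-circuiting on the first failure.
validate_all() {
  python core-x/scripts/validate-ecosystem.py &&
  python model-zoo/scripts/validate_registry.py &&
  python scripts/generate-registries.py
}
```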
## Isolated UI Development

```bash
cd ~/core-x-kbllr_0/core-x/ui
npm run dev  # starts on :5200, hot-reloads
# Gateway doesn't need to be running — UI handles offline services gracefully
```