
Common Workflows

Chat via UI

  1. Start services: `./launch-core-x.sh`
  2. Open http://localhost:5200
  3. Select “Chat” from the launcher app grid (or nav rail)
  4. Messages route through the gateway (:8090) → mlx-llm (:8091) via OpenResponses
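The routing in step 4 can be sketched from the client side. This is a hypothetical example: the endpoint path and request schema are assumptions based on the gateway being OpenResponses/OpenAI-compatible, not confirmed from the code.

```python
import json
import urllib.request

GATEWAY = "http://localhost:8090"  # gateway port from step 4

def build_chat_request(prompt: str, model: str = "mlx-llm") -> urllib.request.Request:
    """Build an OpenAI-style chat request (assumed schema; verify against the gateway)."""
    body = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{GATEWAY}/v1/chat/completions",  # hypothetical OpenAI-compatible path
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("Hello from the docs")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` only succeeds once the gateway and mlx-llm are up.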

RAG Search via CLI

```sh
corex rag search "your query here"
# Requires: gateway (:8090) + mlx-rag (:8092)
```
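To call the same search from a script, one option is a thin wrapper around the documented CLI command. The command line matches the docs above; the PATH guard and error message are illustrative additions.

```python
import shutil
import subprocess

def rag_search(query: str) -> str:
    """Run the documented `corex rag search` command and return its stdout.

    Requires gateway (:8090) and mlx-rag (:8092) to be online, per the docs.
    """
    if shutil.which("corex") is None:
        raise RuntimeError("corex CLI not found on PATH; activate its virtualenv first")
    result = subprocess.run(
        ["corex", "rag", "search", query],
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

`check=True` raises `subprocess.CalledProcessError` if the CLI exits nonzero, so failures surface instead of returning empty output silently.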

Video Analysis via corex-cli-v2

```sh
cd ~/core-x-kbllr_0/corex-cli-v2
source .venv/bin/activate
corex video analyze -i ~/path/to/video.mp4 -o ./output
# Requires: gateway (:8090), mlx-audio (:8093), mlx-vision (:8094)
```
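The `-i`/`-o` flags above follow a standard argparse pattern. This sketch mirrors them purely for illustration; the real corex-cli-v2 parser may define them differently.

```python
import argparse

# Illustrative parser mirroring the flags used above (not corex-cli-v2's actual parser).
parser = argparse.ArgumentParser(prog="corex video analyze")
parser.add_argument("-i", "--input", required=True, help="path to the source video")
parser.add_argument("-o", "--output", default="./output", help="directory for results")

args = parser.parse_args(["-i", "clip.mp4", "-o", "./output"])
print(args.input, args.output)  # → clip.mp4 ./output
```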

Run Data Pipelines

```sh
# All pipelines (requires gateway + RAG online)
python core-x/pipelines/run_all.py

# Local only (no RAG dependency)
python core-x/pipelines/run_all.py --local
```
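The `--local` switch suggests a simple dispatch: run every pipeline stage, but skip anything that needs the RAG service when offline. A minimal sketch of that pattern, with made-up stage names (the actual structure of run_all.py is not shown here):

```python
import argparse

# Hypothetical stage table: name -> whether the stage needs the RAG service.
STAGES = {"ingest": False, "transform": False, "index": True, "embed": True}

def select_stages(local: bool) -> list[str]:
    """Return stage names to run; --local drops anything needing RAG."""
    return [name for name, needs_rag in STAGES.items() if not (local and needs_rag)]

parser = argparse.ArgumentParser()
parser.add_argument("--local", action="store_true", help="skip RAG-dependent stages")
args = parser.parse_args(["--local"])
print(select_stages(args.local))  # → ['ingest', 'transform']
```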

Validate Ecosystem Integrity

```sh
python core-x/scripts/validate-ecosystem.py   # schemas + config
python model-zoo/scripts/validate_registry.py # model zoo
python scripts/generate-registries.py         # rebuild registries
```
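The specific checks those scripts perform aren't documented here; as a generic sketch, schema/config validation of this kind usually reduces to comparing a config against a set of required keys and type constraints. The required keys below are hypothetical.

```python
REQUIRED_KEYS = {"name", "version", "services"}  # hypothetical schema

def validate_config(config: dict) -> list[str]:
    """Return a list of problems; an empty list means the config passes."""
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - config.keys())]
    if "services" in config and not isinstance(config["services"], list):
        problems.append("services must be a list")
    return problems

print(validate_config({"name": "core-x", "version": "1.0", "services": []}))  # → []
```

Returning a list of problems rather than raising on the first one lets a validator report every issue in a single run.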

Isolated UI Development

```sh
cd ~/core-x-kbllr_0/core-x/ui
npm run dev  # starts on :5200, hot-reloads
# Gateway doesn't need to be running; the UI handles offline services gracefully
```
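Graceful handling of offline services typically comes down to probing with a short timeout and degrading instead of raising. A language-agnostic sketch of that probe in Python (the `/health` path is an assumption):

```python
import urllib.request
import urllib.error

def service_status(url: str, timeout: float = 0.5) -> str:
    """Probe a service URL; report 'offline' instead of raising on failure."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return "online"
    except (urllib.error.URLError, OSError):
        return "offline"

# Assumed health endpoint on the gateway; safe to call even when it's down.
print(service_status("http://localhost:8090/health"))
```

The short timeout keeps the probe cheap, so the UI can poll it without blocking rendering while the gateway is down.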