Use the client folder (/client/) to plan and run tasks locally with a safe, step-by-step UX.
The server does not execute commands; it only plans and assists.
Grab the folder that includes client.py and key.json.
Prefer the CLI? Clone or fetch the files directly:
git clone https://huggingface.co/spaces/tandevllc/axis
cd axis/client
python3 client.py --server https://tandevllc-axis.hf.space
Plans can be edited in $EDITOR before running; session logs are written under ~/.llama3_agent/sessions/.
macOS/Linux
# 1) Python deps
python3 -m pip install --upgrade pip
pip install rich click requests
# 2) Download client files (buttons above also work)
# curl -O {BASE}/client/client.py
# curl -O {BASE}/client/key.json
# or:
# wget {BASE}/client/client.py && wget {BASE}/client/key.json
# 3) Run
python3 client.py
Windows (PowerShell)
py -m pip install --upgrade pip
py -m pip install rich click requests
Invoke-WebRequest -Uri "{BASE}/client/client.py" -OutFile "client.py"
Invoke-WebRequest -Uri "{BASE}/client/key.json" -OutFile "key.json"
py client.py
API key
Set LLAMA_API_KEY or edit key.json → {"api_key":"YOUR_KEY"}.
In the client: :apikey set / :apikey clear.
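A minimal sketch of how the client likely resolves the key, assuming the environment variable takes precedence over key.json (the actual client may differ):

import json
import os

# Hypothetical resolution order: env var first, then key.json (assumption).
api_key = os.environ.get("LLAMA_API_KEY")
if not api_key:
    with open("key.json") as f:
        api_key = json.load(f)["api_key"]  # {"api_key": "YOUR_KEY"}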
# Start
python3 client.py
# Plan hotkeys:
# [a] Execute all [s] Step-by-step [e] Edit plan
# [d] Toggle dry-run [c] Cancel
# Danger gate: warns on risky shell like `rm -rf /`, `curl | sh`, `mkfs.*`, etc. (see the sketch below)
# Logs:
# ~/.llama3_agent/sessions/<timestamp>/
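A hypothetical illustration of the kind of pattern check such a gate performs; the client's actual rules are not documented here:

import re

# Hypothetical patterns; the real danger gate may use different rules.
RISKY = [
    r"rm\s+-rf\s+/",        # recursive delete from root
    r"curl\s+.*\|\s*sh",    # piping a download straight into a shell
    r"mkfs\.\w+",           # formatting a filesystem
]

def is_risky(command: str) -> bool:
    return any(re.search(p, command) for p in RISKY)

# Example: is_risky("curl https://x.sh | sh") -> True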
The Axis client asks the server to plan, lets you review the steps, and then runs locally with safety checks. The server never executes shell commands.
Start it with:
python3 client.py
Describe a task; the client calls /infer and shows a plan. Type proceed to continue. Every run is logged to ~/.llama3_agent/sessions/<timestamp>/.
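A minimal sketch of that loop, assuming /infer accepts a JSON body with a prompt field and returns a list of steps (hypothetical; the real schema may differ):

import subprocess

import requests

BASE = "https://tandevllc-axis.hf.space"

# Hypothetical request/response shape for /infer.
plan = requests.post(f"{BASE}/infer", json={"prompt": "scaffold a Flask app"}).json()
for step in plan.get("steps", []):
    # Review before anything runs; execution is always local.
    if input(f"Run `{step}`? [y/N] ").lower() == "y":
        subprocess.run(step, shell=True)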
Use respond/respond_llm when you just need narrative guidance or an answer; no shell required.
File generation: ask for, say, gallery.html via generate_file, and the client confirms how to open it. Great for content, stubs, and docs.
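A hypothetical shape for such a plan step; the actual generate_file schema is not shown here:

# Hypothetical generate_file step; the field names are assumptions.
step = {
    "tool": "generate_file",
    "path": "gallery.html",
    "content": "<!doctype html><title>Gallery</title>",
}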
Meta commands: :server, :apikey set|clear, :history,
:open <path>, :clear, :help.
Tip: works best with Llama-3-8B; TinyLlama on HF is fine for light tasks. Use it to scaffold apps, write articles, manipulate files and directories, and run OS-aware shell commands, locally and safely.