Axis · Agent (Client-First)

TanDev Axis

Use the client folder ( /client/ ) to plan and run tasks locally with a safe, step-by-step UX. The server never executes commands — it only plans and assists.


Download the client folder

Grab the folder that includes client.py and key.json.

Folder: /client/
Script: client.py
Key file: key.json

Prefer CLI? Clone or fetch files directly:

git clone https://huggingface.co/spaces/tandevllc/axis
cd axis/client
python3 client.py --server https://tandevllc-axis.hf.space
      

Why this client?

Built by TanDev · Axis.

Install & Run

macOS/Linux


# 1) Python deps
python3 -m pip install --upgrade pip
pip install rich click requests

# 2) Download client files (buttons above also work)
# curl -O {BASE}/client/client.py
# curl -O {BASE}/client/key.json
# or:
# wget {BASE}/client/client.py && wget {BASE}/client/key.json

# 3) Run 
python3 client.py

Windows (PowerShell)


py -m pip install --upgrade pip
py -m pip install rich click requests
Invoke-WebRequest -Uri "{BASE}/client/client.py" -OutFile "client.py"
Invoke-WebRequest -Uri "{BASE}/client/key.json"  -OutFile "key.json"
py client.py

API key: set LLAMA_API_KEY, or edit key.json ({"api_key":"YOUR_KEY"}). In the client: :apikey set / :apikey clear.
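The env-var-first, key.json-fallback lookup described above might look like this minimal sketch; the function name `resolve_api_key` is illustrative, not the client's actual code:

```python
import json
import os
from pathlib import Path

def resolve_api_key(key_file="key.json"):
    """Return the API key from LLAMA_API_KEY, falling back to key.json."""
    env_key = os.environ.get("LLAMA_API_KEY")
    if env_key:
        return env_key
    path = Path(key_file)
    if path.exists():
        # key.json holds {"api_key": "YOUR_KEY"}
        return json.loads(path.read_text()).get("api_key")
    return None
```

Checking the environment first lets `:apikey set`/`:apikey clear` style overrides win without touching the file on disk.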

Usage cheatsheet


# Start
python3 client.py

# Plan hotkeys:
# [a] Execute all   [s] Step-by-step   [e] Edit plan
# [d] Toggle dry-run   [c] Cancel

# Danger gate: warns on risky shell like `rm -rf /`, `curl | sh`, `mkfs.*`, etc.

# Logs:
# ~/.llama3_agent/sessions/<timestamp>/
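A danger gate like the one above can be sketched as a small pattern check; these regexes echo the examples in the cheatsheet and are illustrative — the real client.py may use different rules:

```python
import re

# Hypothetical patterns for the danger gate; extend as needed.
DANGEROUS_PATTERNS = [
    r"\brm\s+-rf\s+/",           # recursive delete from the root
    r"curl[^|]*\|\s*(sh|bash)",  # piping a download straight into a shell
    r"\bmkfs\.\w+",              # formatting a filesystem
]

def is_dangerous(cmd):
    """Return True if the shell command matches a known risky pattern."""
    return any(re.search(p, cmd) for p in DANGEROUS_PATTERNS)
```

A match would pause execution and require an explicit confirmation (e.g. typing `proceed`) before the step runs.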

Tutorial: Plan → Review → Execute (client-side)

The Axis client asks the server to plan, lets you review the steps, and then runs locally with safety checks. The server never executes shell commands.

  1. Launch the client:
    
    python3 client.py
              
  2. Describe the task (e.g. “scaffold a Flask app and run it”). The client calls /infer and shows a plan.
  3. Review and choose: [a] all • [s] step-by-step • [e] edit • [d] dry-run • [c] cancel.
  4. Danger gate prompts for risky shell; type proceed to continue.
  5. Inspect results (stdout/stderr, bytes written, previews). Saved under:
    ~/.llama3_agent/sessions/<timestamp>/
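The local execution side of steps 3–5 might look like this minimal sketch; the plan here is just a list of shell strings, which is an assumption — the client's actual step schema may differ:

```python
import subprocess

def run_plan(steps, dry_run=True):
    """Run reviewed plan steps locally; dry-run previews without executing.

    Returns a list of (step, output) pairs, mirroring the client's
    stdout/stderr result panels.
    """
    results = []
    for step in steps:
        if dry_run:
            results.append((step, "(dry-run, not executed)"))
            continue
        proc = subprocess.run(step, shell=True, capture_output=True, text=True)
        results.append((step, proc.stdout + proc.stderr))
    return results
```

Defaulting to dry-run keeps the safe path the easy path: nothing touches the machine until you explicitly opt in.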
Screenshot A — Plan & Execute: planned a Flask scaffold, created files, installed deps in a venv, and ran commands locally with rich stdout/stderr panels.
Screenshot B — LLM Respond: use respond/respond_llm when you just need narrative guidance or an answer — no shell required.
Screenshot C — Generate File: wrote gallery.html with generate_file and confirmed how to open it. Great for content, stubs, and docs.
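A generate_file-style step boils down to writing content to disk and reporting bytes written; this is a hedged sketch, and the real tool's signature is an assumption:

```python
from pathlib import Path

def generate_file(path, content):
    """Write content to path, creating parent dirs, and return bytes written."""
    p = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(content)
    return p.stat().st_size  # "bytes written", as shown in the result panels
```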

Meta commands: :server, :apikey set|clear, :history, :open <path>, :clear, :help.

Tip: Best with Llama-3-8B; TinyLlama on HF is fine for light tasks. Scaffold apps, write articles, manipulate files/dirs, and run OS-aware shell—locally and safely.