AI-assisted structural engineering workspace for AEC workflows.
StructureClaw-demo.mp4
- Conversational engineering workflow from natural language to analysis artifacts
- Unified orchestration loop: draft -> validate -> analyze -> code-check -> report
- Web UI, API backend, and backend-hosted Python analysis runtime in one monorepo
- Regression and contract scripts for repeatable engineering validation
```
frontend (Next.js)
  -> backend (Fastify + Prisma + agent orchestration + analysis runtime host)
    -> backend/src/agent-skills/analysis-execution/python
      -> reports / metrics / artifacts
```
Main directories:
- `frontend/`: Next.js 14 application
- `backend/`: Fastify API, agent/chat flows, Prisma integration, and analysis execution host
- `scripts/`: startup helpers and the `sclaw`/`sclaw_cn` CLI implementation
- `tests/`: regression runner (`node tests/runner.mjs ...`), install smoke, and CI-covered frontend checks (type-check, Vitest, lint) after native smoke
- `docs/`: user handbook and protocol references
If Node.js is not installed yet, use the helper installer script first.

Linux:

```
bash ./scripts/install-node-linux.sh
```

Windows PowerShell (run as Administrator for first-time package install):

```
powershell -ExecutionPolicy Bypass -File ./scripts/install-node-windows.ps1
```

Recommended local flow:

```
./sclaw doctor
./sclaw start
./sclaw status
```

China mirror flow (same subcommands, mirror defaults enabled):

```
./sclaw_cn doctor
./sclaw_cn start
./sclaw_cn status
```

Notes:
- SQLite is now the default local database. `./sclaw start` uses `.runtime/data/structureclaw.start.db`, and `./sclaw doctor` uses `.runtime/data/structureclaw.doctor.db`, so preflight checks do not touch the active local runtime database.
- `./sclaw doctor` no longer requires a preinstalled system Python 3.12. It ensures `uv` and prepares `backend/.venv` with Python 3.12 automatically when needed. On Windows, this automatic setup currently requires `winget`; if `winget` is unavailable, install `uv` manually before running `./sclaw doctor`.
- If your old local `.env` still points `DATABASE_URL` at a local PostgreSQL instance, `./sclaw doctor` and `./sclaw start` will auto-migrate that data into SQLite, rewrite `.env` to the SQLite default, and keep the original PostgreSQL URL in `POSTGRES_SOURCE_DATABASE_URL`.
- That first auto-migration also creates a local backup file like `.env.pre-sqlite-migration.<timestamp>.bak`.
- `sclaw_cn` defaults to China mirror settings when unset: `PIP_INDEX_URL=https://pypi.tuna.tsinghua.edu.cn/simple`, `NPM_CONFIG_REGISTRY=https://registry.npmmirror.com`, and a Docker mirror prefix via `DOCKER_REGISTRY_MIRROR`.
- You can override mirror values in `.env` or the shell environment (`PIP_INDEX_URL`, `NPM_CONFIG_REGISTRY`, `DOCKER_REGISTRY_MIRROR`, `APT_MIRROR`).
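The override behavior can be illustrated with a default-if-unset shell expansion. This is only a sketch of the precedence rule ("an exported value wins, otherwise the mirror default applies"); it is not the actual `sclaw_cn` implementation:

```shell
# Illustrative only: "environment value wins, mirror default otherwise".
# Variable names match the README; the resolution logic is a sketch.
unset PIP_INDEX_URL NPM_CONFIG_REGISTRY   # clean slate for the demo

pip_index="${PIP_INDEX_URL:-https://pypi.tuna.tsinghua.edu.cn/simple}"
npm_registry="${NPM_CONFIG_REGISTRY:-https://registry.npmmirror.com}"
echo "pip index:    $pip_index"
echo "npm registry: $npm_registry"

# An exported override takes precedence over the built-in default:
export PIP_INDEX_URL="https://pypi.org/simple"
pip_index="${PIP_INDEX_URL:-https://pypi.tuna.tsinghua.edu.cn/simple}"
echo "pip override: $pip_index"
```

The same `${VAR:-default}` pattern applies to `DOCKER_REGISTRY_MIRROR` and `APT_MIRROR`.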
Useful follow-up commands:

```
./sclaw logs
./sclaw stop
node tests/runner.mjs backend-regression
node tests/runner.mjs analysis-regression
```

Use the built-in CLI batch convert command to transform structure model JSON files and write a summary report:

```
./sclaw convert-batch --input-dir tmp/input --output-dir tmp/output --report tmp/report.json --target-format compact-1
```

Windows PowerShell:
```
node .\sclaw doctor
node .\sclaw start
node .\sclaw status
node .\sclaw logs all --follow
node .\sclaw stop
```

Windows users can also start the full stack directly with Docker, which is the easiest path for beginners who do not want to install local Node.js and Python first.
Recommended steps:
- Install and start Docker Desktop.
- If Docker Desktop asks you to enable WSL 2 or required container features on first launch, follow the setup wizard and restart Docker Desktop.
- Run the interactive Docker bootstrap command from the project root:
```
node .\sclaw docker-install
```

For CI or scripted setup, use the non-interactive variant:

```
node .\sclaw docker-install --non-interactive --llm-base-url https://api.openai.com/v1 --llm-api-key <your-key> --llm-model gpt-4.1
```

Once the stack is ready, the main entrypoints are:
- Frontend: http://localhost:30000
- Backend health check: http://localhost:30010/health
- Analysis routes: http://localhost:30010/analyze
- Database status page: http://localhost:30000/console/database
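After `docker-install`, the backend can take a moment to come up, so a small retry loop against the health endpoint avoids opening the UI too early. The probe below is stubbed so the sketch runs without a live stack; swap in the real `curl` line (shown in the comment) when using it:

```shell
# Wait-for-health sketch. The health URL comes from this README; the probe
# function is a stand-in so the snippet runs without a live backend.
probe() {
  # Real version: curl -fsS http://localhost:30010/health >/dev/null
  return 0
}

status="down"
for attempt in 1 2 3 4 5; do
  if probe; then
    status="up"
    break
  fi
  sleep 2
done
echo "backend is $status"
```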
To stop the containers:

```
node .\sclaw docker-stop
```

Or:

```
docker compose down
```

Copy and adjust environment variables from .env.example.
Key variables include:

- `PORT`, `FRONTEND_PORT`
- `DATABASE_URL`, `POSTGRES_SOURCE_DATABASE_URL`
- `LLM_API_KEY`, `LLM_MODEL`, `LLM_BASE_URL` (OpenAI-compatible)
- `ANALYSIS_PYTHON_BIN`, `ANALYSIS_PYTHON_TIMEOUT_MS`, `ANALYSIS_ENGINE_MANIFEST_PATH`
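As a quick preflight, you can check that a local `.env` defines the keys above before starting the stack. The key names come from this README; the sample values are placeholders (the `file:` SQLite URL and ports are assumptions for illustration), and the check itself is a sketch, not part of sclaw:

```shell
# Sketch: verify a .env file defines the required keys. Sample values are
# placeholders; only the key names are taken from the README.
env_file="$(mktemp)"
cat > "$env_file" <<'EOF'
PORT=30010
FRONTEND_PORT=30000
DATABASE_URL=file:.runtime/data/structureclaw.start.db
LLM_API_KEY=replace-me
LLM_MODEL=gpt-4.1
LLM_BASE_URL=https://api.openai.com/v1
EOF

missing=0
for key in PORT FRONTEND_PORT DATABASE_URL LLM_API_KEY LLM_MODEL LLM_BASE_URL; do
  grep -q "^${key}=" "$env_file" || { echo "missing: $key"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "all required keys present"
rm -f "$env_file"
```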
Backend:

- `POST /api/v1/agent/run`
- `POST /api/v1/chat/message`
- `POST /api/v1/chat/stream`
- `POST /api/v1/chat/execute`

Backend-hosted analysis:

- `POST /validate`
- `POST /convert`
- `POST /analyze`
- `POST /code-check`
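The analysis routes take JSON POST bodies. The snippet below only builds and prints an example request rather than sending it, so it runs without a live backend; the payload fields are placeholders (the real schema is documented in docs/reference.md), while the port and path come from the entrypoints listed above:

```shell
# Build (but do not send) an example /analyze request.
# The JSON body is a placeholder, not the real model schema.
payload='{"model":{"nodes":[],"elements":[]}}'
url="http://localhost:30010/analyze"

echo "Example request:"
echo "curl -sS -X POST $url -H 'Content-Type: application/json' -d '$payload'"
```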
- Skills are enhancement layers, not the only execution path.
- Unmatched selected skills fall back to generic no-skill modeling.
- User-visible content must support both English and Chinese.
- Keep module boundaries explicit across frontend/backend/analysis skills.
- English handbook: docs/handbook.md
- Chinese handbook: docs/handbook_CN.md
- English reference: docs/reference.md
- Chinese reference: docs/reference_CN.md
- Chinese overview: README_CN.md
- Contribution guide: CONTRIBUTING.md
Please read CONTRIBUTING.md before opening a PR.
MIT. See LICENSE.