Merge origin/main into fix-discord-accountId
1
.github/workflows/install-smoke.yml
vendored
@@ -29,5 +29,6 @@ jobs:
        CLAWDBOT_INSTALL_CLI_URL: https://clawd.bot/install-cli.sh
        CLAWDBOT_NO_ONBOARD: "1"
        CLAWDBOT_INSTALL_SMOKE_SKIP_CLI: "1"
        CLAWDBOT_INSTALL_SMOKE_SKIP_NONROOT: ${{ github.event_name == 'pull_request' && '1' || '0' }}
        CLAWDBOT_INSTALL_SMOKE_PREVIOUS: "2026.1.11-4"
      run: pnpm test:install:smoke
10
AGENTS.md
@@ -22,6 +22,15 @@
- README (GitHub): keep absolute docs URLs (`https://docs.clawd.bot/...`) so links work on GitHub.
- Docs content must be generic: no personal device names/hostnames/paths; use placeholders like `user@gateway-host` and “gateway host”.

## exe.dev VM ops (general)
- Access: SSH to the VM directly: `ssh vm-name.exe.xyz` (or use the exe.dev web terminal).
- Updates: `sudo npm i -g clawdbot@latest` (a global install needs root on `/usr/lib/node_modules`).
- Config: use `clawdbot config set ...`; set `gateway.mode=local` if unset.
- Restart: exe.dev often lacks a systemd user bus; stop the old gateway and run:
  `pkill -9 -f clawdbot-gateway || true; nohup clawdbot gateway run --bind loopback --port 18789 --force > /tmp/clawdbot-gateway.log 2>&1 &`
- Verify: `clawdbot --version`, `clawdbot health`, `ss -ltnp | rg 18789`.
- Flaky SSH: use the exe.dev web terminal or Shelley (web agent) instead of CLI SSH.

## Build, Test, and Development Commands
- Runtime baseline: Node **22+** (keep both the Node and Bun paths working).
- Install deps: `pnpm install`

@@ -51,6 +60,7 @@
- Framework: Vitest with V8 coverage thresholds (70% lines/branches/functions/statements).
- Naming: match source names with `*.test.ts`; e2e in `*.e2e.test.ts`.
- Run `pnpm test` (or `pnpm test:coverage`) before pushing when you touch logic.
- Do not set test workers above 16; higher values have already been tried.
- Live tests (real keys): `CLAWDBOT_LIVE_TEST=1 pnpm test:live` (Clawdbot-only) or `LIVE=1 pnpm test:live` (includes provider live tests). Docker: `pnpm test:docker:live-models`, `pnpm test:docker:live-gateway`. Onboarding Docker E2E: `pnpm test:docker:onboard`.
- Full kit + what’s covered: `docs/testing.md`.
- Pure test additions/fixes generally do **not** need a changelog entry unless they alter user-facing behavior or the user asks for one.
38
CHANGELOG.md
@@ -2,28 +2,20 @@

Docs: https://docs.clawd.bot

## 2026.1.23

### Fixes
- Media: preserve PNG alpha when possible; fall back to JPEG when still over size cap. (#1491) Thanks @robbyczgw-cla.
- Agents: treat plugin-only tool allowlists as opt-ins; keep core tools enabled. (#1467)

## 2026.1.22

### Changes
- Highlight: Lobster optional plugin tool for typed workflows + approval gates. https://docs.clawd.bot/tools/lobster
- Highlight: Compaction safeguard now uses adaptive chunking, progressive fallback, and UI status + retries. (#1466) Thanks @dlauer.
- Lobster: allow workflow file args via `argsJson` in the plugin tool. https://docs.clawd.bot/tools/lobster
- Agents: add identity avatar config support and Control UI avatar rendering. (#1329, #1424) Thanks @dlauer.
- Providers: add Antigravity usage tracking to status output. (#1490) Thanks @patelhiren.
- Slack: add chat-type reply threading overrides via `replyToModeByChatType`. (#1442) Thanks @stefangalescu.
- Memory: prevent CLI hangs by deferring vector probes, adding sqlite-vec/embedding timeouts, and showing sync progress early.
- BlueBubbles: add `asVoice` support for MP3/CAF voice memos in sendAttachment. (#1477, #1482) Thanks @Nicell.
- Docs: add troubleshooting entry for gateway.mode blocking gateway start. https://docs.clawd.bot/gateway/troubleshooting
- Docs: add /model allowlist troubleshooting note. (#1405)
- Docs: add per-message Gmail search example for gog. (#1220) Thanks @mbelinky.
- UI: show per-session assistant identity in the Control UI. (#1420) Thanks @robbyczgw-cla.
- Onboarding: add hatch choice (TUI/Web/Later), token explainer, background dashboard seed on macOS, and showcase link.
- Onboarding: remove the run setup-token auth option (paste setup-token or reuse CLI creds instead).
- Signal: add typing indicators and DM read receipts via signal-cli.
- MSTeams: add file uploads, adaptive cards, and attachment handling improvements. (#1410) Thanks @Evizero.
- CLI: add `clawdbot update wizard` for interactive channel selection and restart prompts. https://docs.clawd.bot/cli/update

### Breaking
- **BREAKING:** Envelope and system event timestamps now default to host-local time (was UTC) so agents don’t have to constantly convert.

### Fixes
- BlueBubbles: stop typing indicator on idle/no-reply. (#1439) Thanks @Nicell.

@@ -32,6 +24,7 @@ Docs: https://docs.clawd.bot
- Control UI: resolve local avatar URLs with basePath across injection + identity RPC. (#1457) Thanks @dlauer.
- Agents: sanitize assistant history text to strip tool-call markers. (#1456) Thanks @zerone0x.
- Discord: clarify Message Content Intent onboarding hint. (#1487) Thanks @kyleok.
- Gateway: stop the service before uninstalling and fail if it remains loaded.
- Agents: surface concrete API error details instead of generic AI service errors.
- Exec: fall back to non-PTY when PTY spawn fails (EBADF). (#1484)
- Exec approvals: allow per-segment allowlists for chained shell commands on gateway + node hosts. (#1458) Thanks @czekaj.

@@ -48,9 +41,9 @@ Docs: https://docs.clawd.bot
- Docs: fix gog auth services example to include docs scope. (#1454) Thanks @zerone0x.
- Slack: reduce WebClient retries to avoid duplicate sends. (#1481)
- Slack: read thread replies for message reads when threadId is provided (replies-only). (#1450) Thanks @rodrigouroz.
- Discord: honor accountId across message actions and cron deliveries. (#1492) Thanks @svkozak.
- macOS: prefer linked channels in gateway summary to avoid false “not linked” status.
- macOS/tests: fix gateway summary lookup after guard unwrap; prevent browser opens during tests. (ECID-1483)
- Providers: improve GitHub Copilot integration (enterprise support, base URL, and auth flow alignment).

## 2026.1.21-2

@@ -61,6 +54,8 @@ Docs: https://docs.clawd.bot
## 2026.1.21

### Changes
- Highlight: Lobster optional plugin tool for typed workflows + approval gates. https://docs.clawd.bot/tools/lobster
- Lobster: allow workflow file args via `argsJson` in the plugin tool. https://docs.clawd.bot/tools/lobster
- Heartbeat: allow running heartbeats in an explicit session key. (#1256) Thanks @zknicker.
- CLI: default exec approvals to the local host, add gateway/node targeting flags, and show target details in allowlist output.
- CLI: exec approvals mutations render tables instead of raw JSON.

@@ -70,13 +65,24 @@ Docs: https://docs.clawd.bot
- CLI: flatten node service commands under `clawdbot node` and remove `service node` docs.
- CLI: move gateway service commands under `clawdbot gateway` and add `gateway probe` for reachability.
- Sessions: add per-channel reset overrides via `session.resetByChannel`. (#1353) Thanks @cash-echo-bot.
- Agents: add identity avatar config support and Control UI avatar rendering. (#1329, #1424) Thanks @dlauer.
- UI: show per-session assistant identity in the Control UI. (#1420) Thanks @robbyczgw-cla.
- CLI: add `clawdbot update wizard` for interactive channel selection and restart prompts. https://docs.clawd.bot/cli/update
- Signal: add typing indicators and DM read receipts via signal-cli.
- MSTeams: add file uploads, adaptive cards, and attachment handling improvements. (#1410) Thanks @Evizero.
- Onboarding: remove the run setup-token auth option (paste setup-token or reuse CLI creds instead).
- Docs: add troubleshooting entry for gateway.mode blocking gateway start. https://docs.clawd.bot/gateway/troubleshooting
- Docs: add /model allowlist troubleshooting note. (#1405)
- Docs: add per-message Gmail search example for gog. (#1220) Thanks @mbelinky.

### Breaking
- **BREAKING:** Control UI now rejects insecure HTTP without device identity by default. Use HTTPS (Tailscale Serve) or set `gateway.controlUi.allowInsecureAuth: true` to allow token-only auth. https://docs.clawd.bot/web/control-ui#insecure-http
- **BREAKING:** Envelope and system event timestamps now default to host-local time (was UTC) so agents don’t have to constantly convert.

### Fixes
- Nodes/macOS: prompt on allowlist miss for node exec approvals, persist allowlist decisions, and flatten node invoke errors. (#1394) Thanks @ngutman.
- Gateway: keep auto bind loopback-first and add explicit tailnet binding to avoid Tailscale taking over local UI. (#1380)
- Memory: prevent CLI hangs by deferring vector probes, adding sqlite-vec/embedding timeouts, and showing sync progress early.
- Agents: enforce 9-char alphanumeric tool call ids for Mistral providers. (#1372) Thanks @zerone0x.
- Embedded runner: persist injected history images so attachments aren’t reloaded each turn. (#1374) Thanks @Nicell.
- Nodes tool: include agent/node/gateway context in tool failure logs to speed approval debugging.
64
appcast.xml
@@ -2,6 +2,54 @@
<rss xmlns:sparkle="http://www.andymatuschak.org/xml-namespaces/sparkle" version="2.0">
  <channel>
    <title>Clawdbot</title>
    <item>
      <title>2026.1.22</title>
      <pubDate>Fri, 23 Jan 2026 08:58:14 +0000</pubDate>
      <link>https://raw.githubusercontent.com/clawdbot/clawdbot/main/appcast.xml</link>
      <sparkle:version>7530</sparkle:version>
      <sparkle:shortVersionString>2026.1.22</sparkle:shortVersionString>
      <sparkle:minimumSystemVersion>15.0</sparkle:minimumSystemVersion>
      <description><![CDATA[<h2>Clawdbot 2026.1.22</h2>
<h3>Changes</h3>
<ul>
<li>Highlight: Compaction safeguard now uses adaptive chunking, progressive fallback, and UI status + retries. (#1466) Thanks @dlauer.</li>
<li>Providers: add Antigravity usage tracking to status output. (#1490) Thanks @patelhiren.</li>
<li>Slack: add chat-type reply threading overrides via <code>replyToModeByChatType</code>. (#1442) Thanks @stefangalescu.</li>
<li>BlueBubbles: add <code>asVoice</code> support for MP3/CAF voice memos in sendAttachment. (#1477, #1482) Thanks @Nicell.</li>
<li>Onboarding: add hatch choice (TUI/Web/Later), token explainer, background dashboard seed on macOS, and showcase link.</li>
</ul>
<h3>Fixes</h3>
<ul>
<li>BlueBubbles: stop typing indicator on idle/no-reply. (#1439) Thanks @Nicell.</li>
<li>Message tool: keep path/filePath as-is for send; hydrate buffers only for sendAttachment. (#1444) Thanks @hopyky.</li>
<li>Auto-reply: only report a model switch when session state is available. (#1465) Thanks @robbyczgw-cla.</li>
<li>Control UI: resolve local avatar URLs with basePath across injection + identity RPC. (#1457) Thanks @dlauer.</li>
<li>Agents: sanitize assistant history text to strip tool-call markers. (#1456) Thanks @zerone0x.</li>
<li>Discord: clarify Message Content Intent onboarding hint. (#1487) Thanks @kyleok.</li>
<li>Gateway: stop the service before uninstalling and fail if it remains loaded.</li>
<li>Agents: surface concrete API error details instead of generic AI service errors.</li>
<li>Exec: fall back to non-PTY when PTY spawn fails (EBADF). (#1484)</li>
<li>Exec approvals: allow per-segment allowlists for chained shell commands on gateway + node hosts. (#1458) Thanks @czekaj.</li>
<li>Agents: make OpenAI sessions image-sanitize-only; gate tool-id/repair sanitization by provider.</li>
<li>Doctor: honor CLAWDBOT_GATEWAY_TOKEN for auth checks and security audit token reuse. (#1448) Thanks @azade-c.</li>
<li>Agents: make tool summaries more readable and only show optional params when set.</li>
<li>Agents: honor SOUL.md guidance even when the file is nested or path-qualified. (#1434) Thanks @neooriginal.</li>
<li>Matrix (plugin): persist m.direct for resolved DMs and harden room fallback. (#1436, #1486) Thanks @sibbl.</li>
<li>CLI: prefer <code>~</code> for home paths in output.</li>
<li>Mattermost (plugin): enforce pairing/allowlist gating, keep @username targets, and clarify plugin-only docs. (#1428) Thanks @damoahdominic.</li>
<li>Agents: centralize transcript sanitization in the runner; keep <final> tags and error turns intact.</li>
<li>Auth: skip auth profiles in cooldown during initial selection and rotation. (#1316) Thanks @odrobnik.</li>
<li>Agents/TUI: honor user-pinned auth profiles during cooldown and preserve search picker ranking. (#1432) Thanks @tobiasbischoff.</li>
<li>Docs: fix gog auth services example to include docs scope. (#1454) Thanks @zerone0x.</li>
<li>Slack: reduce WebClient retries to avoid duplicate sends. (#1481)</li>
<li>Slack: read thread replies for message reads when threadId is provided (replies-only). (#1450) Thanks @rodrigouroz.</li>
<li>macOS: prefer linked channels in gateway summary to avoid false “not linked” status.</li>
<li>macOS/tests: fix gateway summary lookup after guard unwrap; prevent browser opens during tests. (ECID-1483)</li>
</ul>
<p><a href="https://github.com/clawdbot/clawdbot/blob/main/CHANGELOG.md">View full changelog</a></p>
]]></description>
      <enclosure url="https://github.com/clawdbot/clawdbot/releases/download/v2026.1.22/Clawdbot-2026.1.22.zip" length="22302446" type="application/octet-stream" sparkle:edSignature="w/EzfwGBCRRuCg5vz8enIfYujxOZJWRw9PaunQ7gIafKwnBJSTtxcnkvMVwQsnBwB6VN5Tu2MPij7PjDFFX+CA=="/>
    </item>
    <item>
      <title>2026.1.21</title>
      <pubDate>Thu, 22 Jan 2026 12:22:35 +0000</pubDate>
@@ -266,21 +314,5 @@ Thanks @AlexMikhalev, @CoreyH, @John-Rood, @KrauseFx, @MaudeBot, @Nachx639, @Nic
]]></description>
      <enclosure url="https://github.com/clawdbot/clawdbot/releases/download/v2026.1.21/Clawdbot-2026.1.21.zip" length="12208102" type="application/octet-stream" sparkle:edSignature="hU495Eii8O3qmmUnxYFhXyEGv+qan6KL+GpeuBhPIXf+7B5F/gBh5Oz9cHaqaAPoZ4/3Bo6xgvic0HTkbz6gDw=="/>
    </item>
    <item>
      <title>2026.1.16-2</title>
      <pubDate>Sat, 17 Jan 2026 12:46:22 +0000</pubDate>
      <link>https://raw.githubusercontent.com/clawdbot/clawdbot/main/appcast.xml</link>
      <sparkle:version>6273</sparkle:version>
      <sparkle:shortVersionString>2026.1.16-2</sparkle:shortVersionString>
      <sparkle:minimumSystemVersion>15.0</sparkle:minimumSystemVersion>
      <description><![CDATA[<h2>Clawdbot 2026.1.16-2</h2>
<h3>Changes</h3>
<ul>
<li>CLI: stamp build commit into dist metadata so banners show the commit in npm installs.</li>
</ul>
<p><a href="https://github.com/clawdbot/clawdbot/blob/main/CHANGELOG.md">View full changelog</a></p>
]]></description>
      <enclosure url="https://github.com/clawdbot/clawdbot/releases/download/v2026.1.16-2/Clawdbot-2026.1.16-2.zip" length="21399591" type="application/octet-stream" sparkle:edSignature="zelT+KzN32cXsihbFniPF5Heq0hkwFfL3Agrh/AaoKUkr7kJAFarkGSOZRTWZ9y+DvOluzn2wHHjVigRjMzrBA=="/>
    </item>
  </channel>
</rss>
@@ -11,7 +11,7 @@ This app now ships Sparkle auto-updates. Release builds must be Developer ID–s

## Prereqs
- Developer ID Application cert installed (example: `Developer ID Application: <Developer Name> (<TEAMID>)`).
- Sparkle private key path set in the environment as `SPARKLE_PRIVATE_KEY_FILE` (path to your Sparkle ed25519 private key; public key baked into Info.plist).
- Sparkle private key path set in the environment as `SPARKLE_PRIVATE_KEY_FILE` (path to your Sparkle ed25519 private key; public key baked into Info.plist). If it is missing, check `~/.profile`.
- Notary credentials (keychain profile or API key) for `xcrun notarytool` if you want Gatekeeper-safe DMG/zip distribution.
- We use a Keychain profile named `clawdbot-notary`, created from App Store Connect API key env vars in your shell profile:
  - `APP_STORE_CONNECT_API_KEY_P8`, `APP_STORE_CONNECT_KEY_ID`, `APP_STORE_CONNECT_ISSUER_ID`
@@ -82,6 +82,8 @@ Enable optional tools in `agents.list[].tools.allow` (or global `tools.allow`):
```

Other config knobs that affect tool availability:
- Allowlists that only name plugin tools are treated as plugin opt-ins; core tools remain enabled unless you also include core tools or groups in the allowlist.
- `tools.profile` / `agents.list[].tools.profile` (base allowlist)
- `tools.byProvider` / `agents.list[].tools.byProvider` (provider-specific allow/deny)
- `tools.sandbox.tools.*` (sandbox tool policy when sandboxed)
@@ -16,9 +16,9 @@ provider in two different ways.

### 1) Built-in GitHub Copilot provider (`github-copilot`)

Use the native device-login flow to obtain a GitHub token and use it directly
against the Copilot API. This is the **default** and simplest path because it
does not require VS Code. Enterprise domains are supported.
Use the native device-login flow to obtain a GitHub token, then exchange it for
Copilot API tokens when Clawdbot runs. This is the **default** and simplest path
because it does not require VS Code.

### 2) Copilot Proxy plugin (`copilot-proxy`)

@@ -39,8 +39,6 @@ clawdbot models auth login-github-copilot

You'll be prompted to visit a URL and enter a one-time code. Keep the terminal
open until it completes.
If you're on GitHub Enterprise, the login will ask for your enterprise URL or
domain (for example `company.ghe.com`).

### Optional flags

@@ -68,7 +66,5 @@ clawdbot models set github-copilot/gpt-4o
- Requires an interactive TTY; run it directly in a terminal.
- Copilot model availability depends on your plan; if a model is rejected, try
  another ID (for example `github-copilot/gpt-4.1`).
- The login stores a GitHub token in the auth profile store and uses it directly
  for Copilot API calls.
- Base URL: `https://api.githubcopilot.com` (public) or `https://copilot-api.<domain>`
  for GitHub Enterprise.
- The login stores a GitHub token in the auth profile store and exchanges it for a
  Copilot API token when Clawdbot runs.
@@ -13,7 +13,7 @@ Use `pnpm` (Node 22+) from the repo root. Keep the working tree clean before tag
## Operator trigger
When the operator says “release”, immediately do this preflight (no extra questions unless blocked):
- Read this doc and `docs/platforms/mac/release.md`.
- Load env from `~/.profile` and confirm `SPARKLE_PRIVATE_KEY_FILE` + App Store Connect vars are set.
- Load env from `~/.profile` and confirm `SPARKLE_PRIVATE_KEY_FILE` + App Store Connect vars are set (SPARKLE_PRIVATE_KEY_FILE should live in `~/.profile`).
- Use Sparkle keys from `~/Library/CloudStorage/Dropbox/Backup/Sparkle` if needed.

1) **Version & metadata**
@@ -121,6 +121,10 @@ Lobster is an **optional** plugin tool (not enabled by default). Allow it per ag

You can also allow it globally with `tools.allow` if every agent should see it.

Note: allowlists are opt-in for optional plugins. If your allowlist only names
plugin tools (like `lobster`), Clawdbot keeps core tools enabled. To restrict core
tools, include the core tools or groups you want in the allowlist too.

## Example: Email triage

Without Lobster:
@@ -1,6 +1,6 @@
{
  "name": "@clawdbot/mattermost",
  "version": "2026.1.20-2",
  "version": "2026.1.22",
  "type": "module",
  "description": "Clawdbot Mattermost channel plugin",
  "clawdbot": {
@@ -1,6 +1,6 @@
{
  "name": "@clawdbot/open-prose",
  "version": "2026.1.23",
  "version": "2026.1.22",
  "type": "module",
  "description": "OpenProse VM skill pack plugin (slash command + telemetry).",
  "clawdbot": {
@@ -111,7 +111,7 @@
    "format:swift": "swiftformat --lint --config .swiftformat apps/macos/Sources apps/ios/Sources apps/shared/ClawdbotKit/Sources",
    "format:all": "pnpm format && pnpm format:swift",
    "format:fix": "oxfmt --write src test",
    "test": "vitest run",
    "test": "node scripts/test-parallel.mjs",
    "test:watch": "vitest",
    "test:ui": "pnpm --dir ui test",
    "test:force": "node --import tsx scripts/test-force.ts",
@@ -19,7 +19,12 @@ echo "==> Verify git installed"
command -v git >/dev/null

echo "==> Verify clawdbot installed"
LATEST_VERSION="$(npm view clawdbot version)"
EXPECTED_VERSION="${CLAWDBOT_INSTALL_EXPECT_VERSION:-}"
if [[ -n "$EXPECTED_VERSION" ]]; then
  LATEST_VERSION="$EXPECTED_VERSION"
else
  LATEST_VERSION="$(npm view clawdbot version)"
fi
CMD_PATH="$(command -v clawdbot || true)"
if [[ -z "$CMD_PATH" && -x "$HOME/.npm-global/bin/clawdbot" ]]; then
  CMD_PATH="$HOME/.npm-global/bin/clawdbot"
@@ -6,23 +6,36 @@ SMOKE_PREVIOUS_VERSION="${CLAWDBOT_INSTALL_SMOKE_PREVIOUS:-}"
SKIP_PREVIOUS="${CLAWDBOT_INSTALL_SMOKE_SKIP_PREVIOUS:-0}"

echo "==> Resolve npm versions"
LATEST_VERSION="$(npm view clawdbot version)"
if [[ -n "$SMOKE_PREVIOUS_VERSION" ]]; then
  LATEST_VERSION="$(npm view clawdbot version)"
  PREVIOUS_VERSION="$SMOKE_PREVIOUS_VERSION"
else
  PREVIOUS_VERSION="$(node - <<'NODE'
const { execSync } = require("node:child_process");

const versions = JSON.parse(execSync("npm view clawdbot versions --json", { encoding: "utf8" }));
if (!Array.isArray(versions) || versions.length === 0) {
  VERSIONS_JSON="$(npm view clawdbot versions --json)"
  read -r LATEST_VERSION PREVIOUS_VERSION < <(node - <<'NODE'
const raw = process.env.VERSIONS_JSON || "[]";
let versions;
try {
  versions = JSON.parse(raw);
} catch {
  versions = raw ? [raw] : [];
}
if (!Array.isArray(versions)) {
  versions = [versions];
}
if (versions.length === 0) {
  process.exit(1);
}
const previous = versions.length >= 2 ? versions[versions.length - 2] : versions[0];
process.stdout.write(previous);
const latest = versions[versions.length - 1];
const previous = versions.length >= 2 ? versions[versions.length - 2] : latest;
process.stdout.write(`${latest} ${previous}`);
NODE
)"
fi

if [[ -n "${CLAWDBOT_INSTALL_LATEST_OUT:-}" ]]; then
  printf "%s" "$LATEST_VERSION" > "$CLAWDBOT_INSTALL_LATEST_OUT"
fi

echo "latest=$LATEST_VERSION previous=$PREVIOUS_VERSION"

if [[ "$SKIP_PREVIOUS" == "1" ]]; then
@@ -6,6 +6,9 @@ SMOKE_IMAGE="${CLAWDBOT_INSTALL_SMOKE_IMAGE:-clawdbot-install-smoke:local}"
NONROOT_IMAGE="${CLAWDBOT_INSTALL_NONROOT_IMAGE:-clawdbot-install-nonroot:local}"
INSTALL_URL="${CLAWDBOT_INSTALL_URL:-https://clawd.bot/install.sh}"
CLI_INSTALL_URL="${CLAWDBOT_INSTALL_CLI_URL:-https://clawd.bot/install-cli.sh}"
SKIP_NONROOT="${CLAWDBOT_INSTALL_SMOKE_SKIP_NONROOT:-0}"
LATEST_DIR="$(mktemp -d)"
LATEST_FILE="${LATEST_DIR}/latest"

echo "==> Build smoke image (upgrade, root): $SMOKE_IMAGE"
docker build \
@@ -15,31 +18,48 @@ docker build \

echo "==> Run installer smoke test (root): $INSTALL_URL"
docker run --rm -t \
  -v "${LATEST_DIR}:/out" \
  -e CLAWDBOT_INSTALL_URL="$INSTALL_URL" \
  -e CLAWDBOT_INSTALL_LATEST_OUT="/out/latest" \
  -e CLAWDBOT_INSTALL_SMOKE_PREVIOUS="${CLAWDBOT_INSTALL_SMOKE_PREVIOUS:-}" \
  -e CLAWDBOT_INSTALL_SMOKE_SKIP_PREVIOUS="${CLAWDBOT_INSTALL_SMOKE_SKIP_PREVIOUS:-0}" \
  -e CLAWDBOT_NO_ONBOARD=1 \
  -e DEBIAN_FRONTEND=noninteractive \
  "$SMOKE_IMAGE"

echo "==> Build non-root image: $NONROOT_IMAGE"
docker build \
  -t "$NONROOT_IMAGE" \
  -f "$ROOT_DIR/scripts/docker/install-sh-nonroot/Dockerfile" \
  "$ROOT_DIR/scripts/docker/install-sh-nonroot"
LATEST_VERSION=""
if [[ -f "$LATEST_FILE" ]]; then
  LATEST_VERSION="$(cat "$LATEST_FILE")"
fi

echo "==> Run installer non-root test: $INSTALL_URL"
docker run --rm -t \
  -e CLAWDBOT_INSTALL_URL="$INSTALL_URL" \
  -e CLAWDBOT_NO_ONBOARD=1 \
  -e DEBIAN_FRONTEND=noninteractive \
  "$NONROOT_IMAGE"
if [[ "$SKIP_NONROOT" == "1" ]]; then
  echo "==> Skip non-root installer smoke (CLAWDBOT_INSTALL_SMOKE_SKIP_NONROOT=1)"
else
  echo "==> Build non-root image: $NONROOT_IMAGE"
  docker build \
    -t "$NONROOT_IMAGE" \
    -f "$ROOT_DIR/scripts/docker/install-sh-nonroot/Dockerfile" \
    "$ROOT_DIR/scripts/docker/install-sh-nonroot"

  echo "==> Run installer non-root test: $INSTALL_URL"
  docker run --rm -t \
    -e CLAWDBOT_INSTALL_URL="$INSTALL_URL" \
    -e CLAWDBOT_INSTALL_EXPECT_VERSION="$LATEST_VERSION" \
    -e CLAWDBOT_NO_ONBOARD=1 \
    -e DEBIAN_FRONTEND=noninteractive \
    "$NONROOT_IMAGE"
fi

if [[ "${CLAWDBOT_INSTALL_SMOKE_SKIP_CLI:-0}" == "1" ]]; then
  echo "==> Skip CLI installer smoke (CLAWDBOT_INSTALL_SMOKE_SKIP_CLI=1)"
  exit 0
fi

if [[ "$SKIP_NONROOT" == "1" ]]; then
  echo "==> Skip CLI installer smoke (non-root image skipped)"
  exit 0
fi

echo "==> Run CLI installer non-root test (same image)"
docker run --rm -t \
  --entrypoint /bin/bash \
43
scripts/test-parallel.mjs
Normal file
@@ -0,0 +1,43 @@
import { spawn } from "node:child_process";

const pnpm = process.platform === "win32" ? "pnpm.cmd" : "pnpm";

const runs = [
  {
    name: "unit",
    args: ["vitest", "run", "--config", "vitest.unit.config.ts"],
  },
  {
    name: "gateway",
    args: ["vitest", "run", "--config", "vitest.gateway.config.ts"],
  },
];

const children = new Set();

const run = (entry) =>
  new Promise((resolve) => {
    const child = spawn(pnpm, entry.args, {
      stdio: "inherit",
      env: { ...process.env, VITEST_GROUP: entry.name },
      shell: process.platform === "win32",
    });
    children.add(child);
    child.on("exit", (code, signal) => {
      children.delete(child);
      resolve(code ?? (signal ? 1 : 0));
    });
  });

const shutdown = (signal) => {
  for (const child of children) {
    child.kill(signal);
  }
};

process.on("SIGINT", () => shutdown("SIGINT"));
process.on("SIGTERM", () => shutdown("SIGTERM"));

const codes = await Promise.all(runs.map(run));
const failed = codes.find((code) => code !== 0);
process.exit(failed ?? 0);
@@ -1,70 +0,0 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";

import { afterEach, describe, expect, it, vi } from "vitest";
import {
  type AuthProfileStore,
  ensureAuthProfileStore,
  resolveApiKeyForProfile,
} from "./auth-profiles.js";

vi.mock("@mariozechner/pi-ai", () => ({
  getOAuthApiKey: vi.fn(() => {
    throw new Error("refresh should not be called");
  }),
}));

describe("auth-profiles (github-copilot)", () => {
  const previousStateDir = process.env.CLAWDBOT_STATE_DIR;
  const previousAgentDir = process.env.CLAWDBOT_AGENT_DIR;
  const previousPiAgentDir = process.env.PI_CODING_AGENT_DIR;
  let tempDir: string | null = null;

  afterEach(async () => {
    vi.unstubAllGlobals();
    if (tempDir) {
      await fs.rm(tempDir, { recursive: true, force: true });
      tempDir = null;
    }
    if (previousStateDir === undefined) delete process.env.CLAWDBOT_STATE_DIR;
    else process.env.CLAWDBOT_STATE_DIR = previousStateDir;
    if (previousAgentDir === undefined) delete process.env.CLAWDBOT_AGENT_DIR;
    else process.env.CLAWDBOT_AGENT_DIR = previousAgentDir;
    if (previousPiAgentDir === undefined) delete process.env.PI_CODING_AGENT_DIR;
    else process.env.PI_CODING_AGENT_DIR = previousPiAgentDir;
  });

  it("treats copilot oauth tokens with expires=0 as non-expiring", async () => {
    tempDir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-copilot-"));
    process.env.CLAWDBOT_STATE_DIR = tempDir;
    process.env.CLAWDBOT_AGENT_DIR = path.join(tempDir, "agents", "main", "agent");
    process.env.PI_CODING_AGENT_DIR = process.env.CLAWDBOT_AGENT_DIR;

    const authProfilePath = path.join(tempDir, "agents", "main", "agent", "auth-profiles.json");
    await fs.mkdir(path.dirname(authProfilePath), { recursive: true });

    const store: AuthProfileStore = {
      version: 1,
      profiles: {
        "github-copilot:github": {
          type: "oauth",
          provider: "github-copilot",
          refresh: "gh-token",
          access: "gh-token",
          expires: 0,
          enterpriseUrl: "company.ghe.com",
        },
      },
    };
    await fs.writeFile(authProfilePath, `${JSON.stringify(store)}\n`);

    const loaded = ensureAuthProfileStore();
    const resolved = await resolveApiKeyForProfile({
      store: loaded,
      profileId: "github-copilot:github",
    });

    expect(resolved?.apiKey).toBe("gh-token");
  });
});
@@ -39,15 +39,6 @@ async function refreshOAuthTokenWithLock(params: {
const cred = store.profiles[params.profileId];
if (!cred || cred.type !== "oauth") return null;

if (
  cred.provider === "github-copilot" &&
  (!Number.isFinite(cred.expires) || cred.expires <= 0)
) {
  return {
    apiKey: buildOAuthApiKey(cred.provider, cred),
    newCredentials: cred,
  };
}
if (Date.now() < cred.expires) {
  return {
    apiKey: buildOAuthApiKey(cred.provider, cred),

@@ -112,20 +103,6 @@ async function tryResolveOAuthProfile(params: {
if (profileConfig && profileConfig.provider !== cred.provider) return null;
if (profileConfig && profileConfig.mode !== cred.type) return null;

if (cred.provider === "github-copilot" && (!Number.isFinite(cred.expires) || cred.expires <= 0)) {
  return {
    apiKey: buildOAuthApiKey(cred.provider, cred),
    provider: cred.provider,
    email: cred.email,
  };
}
if (cred.provider === "github-copilot" && (!Number.isFinite(cred.expires) || cred.expires <= 0)) {
  return {
    apiKey: buildOAuthApiKey(cred.provider, cred),
    provider: cred.provider,
    email: cred.email,
  };
}
if (Date.now() < cred.expires) {
  return {
    apiKey: buildOAuthApiKey(cred.provider, cred),
@@ -19,7 +19,6 @@ export type TokenCredential = {
  token: string;
  /** Optional expiry timestamp (ms since epoch). */
  expires?: number;
  enterpriseUrl?: string;
  email?: string;
};


@@ -3,7 +3,6 @@ import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { withTempHome as withTempHomeBase } from "../../test/helpers/temp-home.js";
import type { ClawdbotConfig } from "../config/config.js";
import { DEFAULT_GITHUB_COPILOT_BASE_URL } from "../providers/github-copilot-utils.js";

async function withTempHome<T>(fn: (home: string) => Promise<T>): Promise<T> {
  return withTempHomeBase(fn, { prefix: "clawdbot-models-" });
@@ -52,6 +51,16 @@ describe("models-config", () => {
      try {
        vi.resetModules();

        vi.doMock("../providers/github-copilot-token.js", () => ({
          DEFAULT_COPILOT_API_BASE_URL: "https://api.individual.githubcopilot.com",
          resolveCopilotApiToken: vi.fn().mockResolvedValue({
            token: "copilot",
            expiresAt: Date.now() + 60 * 60 * 1000,
            source: "mock",
            baseUrl: "https://api.copilot.example",
          }),
        }));

        const { ensureClawdbotModelsJson } = await import("./models-config.js");

        const agentDir = path.join(home, "agent-default-base-url");
@@ -62,55 +71,48 @@ describe("models-config", () => {
          providers: Record<string, { baseUrl?: string; models?: unknown[] }>;
        };

        expect(parsed.providers["github-copilot"]?.baseUrl).toBe(DEFAULT_GITHUB_COPILOT_BASE_URL);
        expect(parsed.providers["github-copilot"]?.baseUrl).toBe("https://api.copilot.example");
        expect(parsed.providers["github-copilot"]?.models?.length ?? 0).toBe(0);
      } finally {
        process.env.COPILOT_GITHUB_TOKEN = previous;
      }
    });
  });
  it("uses enterprise URL from auth profiles to derive base URL", async () => {
  it("prefers COPILOT_GITHUB_TOKEN over GH_TOKEN and GITHUB_TOKEN", async () => {
    await withTempHome(async () => {
      const previous = process.env.COPILOT_GITHUB_TOKEN;
      const previousGh = process.env.GH_TOKEN;
      const previousGithub = process.env.GITHUB_TOKEN;
      process.env.COPILOT_GITHUB_TOKEN = "copilot-token";
      process.env.GH_TOKEN = "gh-token";
      process.env.GITHUB_TOKEN = "github-token";

      try {
        vi.resetModules();

        const agentDir = path.join(process.env.HOME ?? home, "agent-enterprise");
        await fs.mkdir(agentDir, { recursive: true });
        await fs.writeFile(
          path.join(agentDir, "auth-profiles.json"),
          JSON.stringify(
            {
              version: 1,
              profiles: {
                "github-copilot:github": {
                  type: "oauth",
                  provider: "github-copilot",
                  refresh: "gh-token",
                  access: "gh-token",
                  expires: 0,
                  enterpriseUrl: "company.ghe.com",
                },
              },
            },
            null,
            2,
          ),
        );
        const resolveCopilotApiToken = vi.fn().mockResolvedValue({
          token: "copilot",
          expiresAt: Date.now() + 60 * 60 * 1000,
          source: "mock",
          baseUrl: "https://api.copilot.example",
        });

        vi.doMock("../providers/github-copilot-token.js", () => ({
          DEFAULT_COPILOT_API_BASE_URL: "https://api.individual.githubcopilot.com",
          resolveCopilotApiToken,
        }));

        const { ensureClawdbotModelsJson } = await import("./models-config.js");

        await ensureClawdbotModelsJson({ models: { providers: {} } }, agentDir);
        await ensureClawdbotModelsJson({ models: { providers: {} } });

        const raw = await fs.readFile(path.join(agentDir, "models.json"), "utf8");
        const parsed = JSON.parse(raw) as {
          providers: Record<string, { baseUrl?: string }>;
        };

        expect(parsed.providers["github-copilot"]?.baseUrl).toBe(
          "https://copilot-api.company.ghe.com",
        expect(resolveCopilotApiToken).toHaveBeenCalledWith(
          expect.objectContaining({ githubToken: "copilot-token" }),
        );
      } finally {
        // no-op
        process.env.COPILOT_GITHUB_TOKEN = previous;
        process.env.GH_TOKEN = previousGh;
        process.env.GITHUB_TOKEN = previousGithub;
      }
    });
  });

@@ -3,7 +3,6 @@ import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { withTempHome as withTempHomeBase } from "../../test/helpers/temp-home.js";
import type { ClawdbotConfig } from "../config/config.js";
import { DEFAULT_GITHUB_COPILOT_BASE_URL } from "../providers/github-copilot-utils.js";

async function withTempHome<T>(fn: (home: string) => Promise<T>): Promise<T> {
  return withTempHomeBase(fn, { prefix: "clawdbot-models-" });
@@ -44,7 +43,7 @@ describe("models-config", () => {
    process.env.HOME = previousHome;
  });

  it("uses default baseUrl when env token is present", async () => {
  it("falls back to default baseUrl when token exchange fails", async () => {
    await withTempHome(async () => {
      const previous = process.env.COPILOT_GITHUB_TOKEN;
      process.env.COPILOT_GITHUB_TOKEN = "gh-token";
@@ -52,6 +51,11 @@ describe("models-config", () => {
      try {
        vi.resetModules();

        vi.doMock("../providers/github-copilot-token.js", () => ({
          DEFAULT_COPILOT_API_BASE_URL: "https://api.default.test",
          resolveCopilotApiToken: vi.fn().mockRejectedValue(new Error("boom")),
        }));

        const { ensureClawdbotModelsJson } = await import("./models-config.js");
        const { resolveClawdbotAgentDir } = await import("./agent-paths.js");

@@ -63,13 +67,13 @@ describe("models-config", () => {
          providers: Record<string, { baseUrl?: string }>;
        };

        expect(parsed.providers["github-copilot"]?.baseUrl).toBe(DEFAULT_GITHUB_COPILOT_BASE_URL);
        expect(parsed.providers["github-copilot"]?.baseUrl).toBe("https://api.default.test");
      } finally {
        process.env.COPILOT_GITHUB_TOKEN = previous;
      }
    });
  });
  it("normalizes enterprise URL when deriving base URL", async () => {
  it("uses agentDir override auth profiles for copilot injection", async () => {
    await withTempHome(async (home) => {
      const previous = process.env.COPILOT_GITHUB_TOKEN;
      const previousGh = process.env.GH_TOKEN;
@@ -90,12 +94,9 @@ describe("models-config", () => {
              version: 1,
              profiles: {
                "github-copilot:github": {
                  type: "oauth",
                  type: "token",
                  provider: "github-copilot",
                  refresh: "gh-profile-token",
                  access: "gh-profile-token",
                  expires: 0,
                  enterpriseUrl: "https://company.ghe.com/",
                  token: "gh-profile-token",
                },
              },
            },
@@ -104,6 +105,16 @@ describe("models-config", () => {
          ),
        );

        vi.doMock("../providers/github-copilot-token.js", () => ({
          DEFAULT_COPILOT_API_BASE_URL: "https://api.individual.githubcopilot.com",
          resolveCopilotApiToken: vi.fn().mockResolvedValue({
            token: "copilot",
            expiresAt: Date.now() + 60 * 60 * 1000,
            source: "mock",
            baseUrl: "https://api.copilot.example",
          }),
        }));

        const { ensureClawdbotModelsJson } = await import("./models-config.js");

        await ensureClawdbotModelsJson({ models: { providers: {} } }, agentDir);
@@ -113,9 +124,7 @@ describe("models-config", () => {
          providers: Record<string, { baseUrl?: string }>;
        };

        expect(parsed.providers["github-copilot"]?.baseUrl).toBe(
          "https://copilot-api.company.ghe.com",
        );
        expect(parsed.providers["github-copilot"]?.baseUrl).toBe("https://api.copilot.example");
      } finally {
        if (previous === undefined) delete process.env.COPILOT_GITHUB_TOKEN;
        else process.env.COPILOT_GITHUB_TOKEN = previous;

@@ -1,8 +1,8 @@
import type { ClawdbotConfig } from "../config/config.js";
import {
  normalizeGithubCopilotDomain,
  resolveGithubCopilotBaseUrl,
} from "../providers/github-copilot-utils.js";
  DEFAULT_COPILOT_API_BASE_URL,
  resolveCopilotApiToken,
} from "../providers/github-copilot-token.js";
import { ensureAuthProfileStore, listProfilesForProvider } from "./auth-profiles.js";
import { resolveAwsSdkEnvVarName, resolveEnvApiKey } from "./model-auth.js";
import {
@@ -331,18 +331,29 @@ export async function resolveImplicitCopilotProvider(params: {

  if (!hasProfile && !githubToken) return null;

  let enterpriseDomain: string | null = null;
  if (hasProfile) {
  let selectedGithubToken = githubToken;
  if (!selectedGithubToken && hasProfile) {
    // Use the first available profile as a default for discovery (it will be
    // re-resolved per-run by the embedded runner).
    const profileId = listProfilesForProvider(authStore, "github-copilot")[0];
    const profile = profileId ? authStore.profiles[profileId] : undefined;
    if (profile && "enterpriseUrl" in profile && typeof profile.enterpriseUrl === "string") {
      enterpriseDomain = normalizeGithubCopilotDomain(profile.enterpriseUrl);
    if (profile && profile.type === "token") {
      selectedGithubToken = profile.token;
    }
  }

  const baseUrl = resolveGithubCopilotBaseUrl(enterpriseDomain);
  let baseUrl = DEFAULT_COPILOT_API_BASE_URL;
  if (selectedGithubToken) {
    try {
      const token = await resolveCopilotApiToken({
        githubToken: selectedGithubToken,
        env,
      });
      baseUrl = token.baseUrl;
    } catch {
      baseUrl = DEFAULT_COPILOT_API_BASE_URL;
    }
  }

  // pi-coding-agent's ModelRegistry marks a model "available" only if its
  // `AuthStorage` has auth configured for that provider (via auth.json/env/etc).
@@ -353,7 +364,7 @@ export async function resolveImplicitCopilotProvider(params: {
  // GitHub token (not the exchanged Copilot token), and (3) matches existing
  // patterns for OAuth-like providers in pi-coding-agent.
  // Note: we deliberately do not write pi-coding-agent's `auth.json` here.
  // Clawdbot uses its own auth store and passes the GitHub token at runtime.
  // Clawdbot uses its own auth store and exchanges tokens at runtime.
  // `models list` uses Clawdbot's auth heuristics for availability.

  // We intentionally do NOT define custom models for Copilot in models.json.

@@ -3,7 +3,6 @@ import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { withTempHome as withTempHomeBase } from "../../test/helpers/temp-home.js";
import type { ClawdbotConfig } from "../config/config.js";
import { DEFAULT_GITHUB_COPILOT_BASE_URL } from "../providers/github-copilot-utils.js";

async function withTempHome<T>(fn: (home: string) => Promise<T>): Promise<T> {
  return withTempHomeBase(fn, { prefix: "clawdbot-models-" });
@@ -81,16 +80,25 @@ describe("models-config", () => {
          ),
        );

        const resolveCopilotApiToken = vi.fn().mockResolvedValue({
          token: "copilot",
          expiresAt: Date.now() + 60 * 60 * 1000,
          source: "mock",
          baseUrl: "https://api.copilot.example",
        });

        vi.doMock("../providers/github-copilot-token.js", () => ({
          DEFAULT_COPILOT_API_BASE_URL: "https://api.individual.githubcopilot.com",
          resolveCopilotApiToken,
        }));

        const { ensureClawdbotModelsJson } = await import("./models-config.js");

        await ensureClawdbotModelsJson({ models: { providers: {} } }, agentDir);

        const raw = await fs.readFile(path.join(agentDir, "models.json"), "utf8");
        const parsed = JSON.parse(raw) as {
          providers: Record<string, { baseUrl?: string }>;
        };

        expect(parsed.providers["github-copilot"]?.baseUrl).toBe(DEFAULT_GITHUB_COPILOT_BASE_URL);
        expect(resolveCopilotApiToken).toHaveBeenCalledWith(
          expect.objectContaining({ githubToken: "alpha-token" }),
        );
      } finally {
        if (previous === undefined) delete process.env.COPILOT_GITHUB_TOKEN;
        else process.env.COPILOT_GITHUB_TOKEN = previous;
@@ -109,6 +117,16 @@ describe("models-config", () => {
      try {
        vi.resetModules();

        vi.doMock("../providers/github-copilot-token.js", () => ({
          DEFAULT_COPILOT_API_BASE_URL: "https://api.individual.githubcopilot.com",
          resolveCopilotApiToken: vi.fn().mockResolvedValue({
            token: "copilot",
            expiresAt: Date.now() + 60 * 60 * 1000,
            source: "mock",
            baseUrl: "https://api.copilot.example",
          }),
        }));

        const { ensureClawdbotModelsJson } = await import("./models-config.js");
        const { resolveClawdbotAgentDir } = await import("./agent-paths.js");


@@ -3,7 +3,7 @@ import os from "node:os";
import path from "node:path";

import type { AssistantMessage } from "@mariozechner/pi-ai";
import { beforeEach, describe, expect, it, vi } from "vitest";
import { beforeAll, beforeEach, describe, expect, it, vi } from "vitest";

import type { ClawdbotConfig } from "../config/config.js";
import type { EmbeddedRunAttemptResult } from "./pi-embedded-runner/run/types.js";
@@ -16,13 +16,15 @@ vi.mock("./pi-embedded-runner/run/attempt.js", () => ({

let runEmbeddedPiAgent: typeof import("./pi-embedded-runner.js").runEmbeddedPiAgent;

beforeEach(async () => {
  vi.useRealTimers();
  vi.resetModules();
  runEmbeddedAttemptMock.mockReset();
beforeAll(async () => {
  ({ runEmbeddedPiAgent } = await import("./pi-embedded-runner.js"));
});

beforeEach(() => {
  vi.useRealTimers();
  runEmbeddedAttemptMock.mockReset();
});

const baseUsage = {
  input: 0,
  output: 0,

@@ -128,6 +128,13 @@ export async function compactEmbeddedPiSession(params: {
        `No API key resolved for provider "${model.provider}" (auth mode: ${apiKeyInfo.mode}).`,
      );
    }
  } else if (model.provider === "github-copilot") {
    const { resolveCopilotApiToken } =
      await import("../../providers/github-copilot-token.js");
    const copilotToken = await resolveCopilotApiToken({
      githubToken: apiKeyInfo.apiKey,
    });
    authStorage.setRuntimeApiKey(model.provider, copilotToken.token);
  } else {
    authStorage.setRuntimeApiKey(model.provider, apiKeyInfo.apiKey);
  }

@@ -7,20 +7,9 @@ import { resolveClawdbotAgentDir } from "../agent-paths.js";
import { DEFAULT_CONTEXT_TOKENS } from "../defaults.js";
import { normalizeModelCompat } from "../model-compat.js";
import { normalizeProviderId } from "../model-selection.js";
import { resolveGithubCopilotUserAgent } from "../../providers/github-copilot-utils.js";

type InlineModelEntry = ModelDefinitionConfig & { provider: string };

function applyProviderModelOverrides(model: Model<Api>): Model<Api> {
  if (model.provider === "github-copilot") {
    const headers = model.headers
      ? { ...model.headers, "User-Agent": resolveGithubCopilotUserAgent() }
      : { "User-Agent": resolveGithubCopilotUserAgent() };
    return { ...model, headers };
  }
  return model;
}

export function buildInlineProviderModels(
  providers: Record<string, { models?: ModelDefinitionConfig[] }>,
): InlineModelEntry[] {
@@ -71,7 +60,7 @@ export function resolveModel(
  if (inlineMatch) {
    const normalized = normalizeModelCompat(inlineMatch as Model<Api>);
    return {
      model: applyProviderModelOverrides(normalized),
      model: normalized,
      authStorage,
      modelRegistry,
    };
@@ -89,7 +78,7 @@ export function resolveModel(
      contextWindow: providerCfg?.models?.[0]?.contextWindow ?? DEFAULT_CONTEXT_TOKENS,
      maxTokens: providerCfg?.models?.[0]?.maxTokens ?? DEFAULT_CONTEXT_TOKENS,
    } as Model<Api>);
    return { model: applyProviderModelOverrides(fallbackModel), authStorage, modelRegistry };
    return { model: fallbackModel, authStorage, modelRegistry };
  }
  return {
    error: `Unknown model: ${provider}/${modelId}`,
@@ -97,9 +86,5 @@ export function resolveModel(
    modelRegistry,
  };
  }
  return {
    model: applyProviderModelOverrides(normalizeModelCompat(model)),
    authStorage,
    modelRegistry,
  };
  return { model: normalizeModelCompat(model), authStorage, modelRegistry };
}

@@ -184,8 +184,17 @@ export async function runEmbeddedPiAgent(
      lastProfileId = resolvedProfileId;
      return;
    }
    authStorage.setRuntimeApiKey(model.provider, apiKeyInfo.apiKey);
    lastProfileId = resolvedProfileId;
    if (model.provider === "github-copilot") {
      const { resolveCopilotApiToken } =
        await import("../../providers/github-copilot-token.js");
      const copilotToken = await resolveCopilotApiToken({
        githubToken: apiKeyInfo.apiKey,
      });
      authStorage.setRuntimeApiKey(model.provider, copilotToken.token);
    } else {
      authStorage.setRuntimeApiKey(model.provider, apiKeyInfo.apiKey);
    }
    lastProfileId = apiKeyInfo.profileId;
  };

  const advanceAuthProfile = async (): Promise<boolean> => {

@@ -1,212 +0,0 @@
import type { AgentTool } from "@mariozechner/pi-agent-core";
import { describe, expect, it, vi } from "vitest";
import { __testing, createClawdbotCodingTools } from "./pi-tools.js";
import { createBrowserTool } from "./tools/browser-tool.js";

describe("createClawdbotCodingTools", () => {
  describe("Claude/Gemini alias support", () => {
    it("adds Claude-style aliases to schemas without dropping metadata", () => {
      const base: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string", description: "Path" },
            content: { type: "string", description: "Body" },
          },
        },
        execute: vi.fn(),
      };

      const patched = __testing.patchToolSchemaForClaudeCompatibility(base);
      const params = patched.parameters as {
        properties?: Record<string, unknown>;
        required?: string[];
      };
      const props = params.properties ?? {};

      expect(props.file_path).toEqual(props.path);
      expect(params.required ?? []).not.toContain("path");
      expect(params.required ?? []).not.toContain("file_path");
    });

    it("normalizes file_path to path and enforces required groups at runtime", async () => {
      const execute = vi.fn(async (_id, args) => args);
      const tool: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string" },
            content: { type: "string" },
          },
        },
        execute,
      };

      const wrapped = __testing.wrapToolParamNormalization(tool, [{ keys: ["path", "file_path"] }]);

      await wrapped.execute("tool-1", { file_path: "foo.txt", content: "x" });
      expect(execute).toHaveBeenCalledWith(
        "tool-1",
        { path: "foo.txt", content: "x" },
        undefined,
        undefined,
      );

      await expect(wrapped.execute("tool-2", { content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
      await expect(wrapped.execute("tool-3", { file_path: " ", content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
    });
  });

  it("keeps browser tool schema OpenAI-compatible without normalization", () => {
    const browser = createBrowserTool();
    const schema = browser.parameters as { type?: unknown; anyOf?: unknown };
    expect(schema.type).toBe("object");
    expect(schema.anyOf).toBeUndefined();
  });
  it("mentions Chrome extension relay in browser tool description", () => {
    const browser = createBrowserTool();
    expect(browser.description).toMatch(/Chrome extension/i);
    expect(browser.description).toMatch(/profile="chrome"/i);
  });
  it("keeps browser tool schema properties after normalization", () => {
    const tools = createClawdbotCodingTools();
    const browser = tools.find((tool) => tool.name === "browser");
    expect(browser).toBeDefined();
    const parameters = browser?.parameters as {
      anyOf?: unknown[];
      properties?: Record<string, unknown>;
      required?: string[];
    };
    expect(parameters.properties?.action).toBeDefined();
    expect(parameters.properties?.target).toBeDefined();
    expect(parameters.properties?.controlUrl).toBeDefined();
    expect(parameters.properties?.targetUrl).toBeDefined();
    expect(parameters.properties?.request).toBeDefined();
    expect(parameters.required ?? []).toContain("action");
  });
  it("exposes raw for gateway config.apply tool calls", () => {
    const tools = createClawdbotCodingTools();
    const gateway = tools.find((tool) => tool.name === "gateway");
    expect(gateway).toBeDefined();

    const parameters = gateway?.parameters as {
      type?: unknown;
      required?: string[];
      properties?: Record<string, unknown>;
    };
    expect(parameters.type).toBe("object");
    expect(parameters.properties?.raw).toBeDefined();
    expect(parameters.required ?? []).not.toContain("raw");
  });
  it("flattens anyOf-of-literals to enum for provider compatibility", () => {
    const tools = createClawdbotCodingTools();
    const browser = tools.find((tool) => tool.name === "browser");
    expect(browser).toBeDefined();

    const parameters = browser?.parameters as {
      properties?: Record<string, unknown>;
    };
    const action = parameters.properties?.action as
      | {
          type?: unknown;
          enum?: unknown[];
          anyOf?: unknown[];
        }
      | undefined;

    expect(action?.type).toBe("string");
    expect(action?.anyOf).toBeUndefined();
    expect(Array.isArray(action?.enum)).toBe(true);
    expect(action?.enum).toContain("act");

    const snapshotFormat = parameters.properties?.snapshotFormat as
      | {
          type?: unknown;
          enum?: unknown[];
          anyOf?: unknown[];
        }
      | undefined;
    expect(snapshotFormat?.type).toBe("string");
    expect(snapshotFormat?.anyOf).toBeUndefined();
    expect(snapshotFormat?.enum).toEqual(["aria", "ai"]);
  });
  it("inlines local $ref before removing unsupported keywords", () => {
    const cleaned = __testing.cleanToolSchemaForGemini({
      type: "object",
      properties: {
        foo: { $ref: "#/$defs/Foo" },
      },
      $defs: {
        Foo: { type: "string", enum: ["a", "b"] },
      },
    }) as {
      $defs?: unknown;
      properties?: Record<string, unknown>;
    };

    expect(cleaned.$defs).toBeUndefined();
    expect(cleaned.properties).toBeDefined();
    expect(cleaned.properties?.foo).toMatchObject({
      type: "string",
      enum: ["a", "b"],
    });
  });
  it("cleans tuple items schemas", () => {
    const cleaned = __testing.cleanToolSchemaForGemini({
      type: "object",
      properties: {
        tuples: {
          type: "array",
          items: [
            { type: "string", format: "uuid" },
            { type: "number", minimum: 1 },
          ],
        },
      },
    }) as {
      properties?: Record<string, unknown>;
    };

    const tuples = cleaned.properties?.tuples as { items?: unknown } | undefined;
    const items = Array.isArray(tuples?.items) ? tuples?.items : [];
    const first = items[0] as { format?: unknown } | undefined;
    const second = items[1] as { minimum?: unknown } | undefined;

    expect(first?.format).toBeUndefined();
    expect(second?.minimum).toBeUndefined();
  });
  it("drops null-only union variants without flattening other unions", () => {
    const cleaned = __testing.cleanToolSchemaForGemini({
      type: "object",
      properties: {
        parentId: { anyOf: [{ type: "string" }, { type: "null" }] },
        count: { oneOf: [{ type: "string" }, { type: "number" }] },
      },
    }) as {
      properties?: Record<string, unknown>;
    };

    const parentId = cleaned.properties?.parentId as
      | { type?: unknown; anyOf?: unknown; oneOf?: unknown }
      | undefined;
    expect(parentId?.anyOf).toBeUndefined();
    expect(parentId?.oneOf).toBeUndefined();
    expect(parentId?.type).toBe("string");

    const count = cleaned.properties?.count as
      | { type?: unknown; anyOf?: unknown; oneOf?: unknown }
      | undefined;
    expect(count?.anyOf).toBeUndefined();
    expect(Array.isArray(count?.oneOf)).toBe(true);
  });
});
@@ -1,74 +1,11 @@
import type { AgentTool } from "@mariozechner/pi-agent-core";
import { describe, expect, it, vi } from "vitest";
import { describe, expect, it } from "vitest";
import type { ClawdbotConfig } from "../config/config.js";
import { __testing, createClawdbotCodingTools } from "./pi-tools.js";
import { createClawdbotCodingTools } from "./pi-tools.js";

const defaultTools = createClawdbotCodingTools();

describe("createClawdbotCodingTools", () => {
  describe("Claude/Gemini alias support", () => {
    it("adds Claude-style aliases to schemas without dropping metadata", () => {
      const base: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string", description: "Path" },
            content: { type: "string", description: "Body" },
          },
        },
        execute: vi.fn(),
      };

      const patched = __testing.patchToolSchemaForClaudeCompatibility(base);
      const params = patched.parameters as {
        properties?: Record<string, unknown>;
        required?: string[];
      };
      const props = params.properties ?? {};

      expect(props.file_path).toEqual(props.path);
      expect(params.required ?? []).not.toContain("path");
      expect(params.required ?? []).not.toContain("file_path");
    });

    it("normalizes file_path to path and enforces required groups at runtime", async () => {
      const execute = vi.fn(async (_id, args) => args);
      const tool: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string" },
            content: { type: "string" },
          },
        },
        execute,
      };

      const wrapped = __testing.wrapToolParamNormalization(tool, [{ keys: ["path", "file_path"] }]);

      await wrapped.execute("tool-1", { file_path: "foo.txt", content: "x" });
      expect(execute).toHaveBeenCalledWith(
        "tool-1",
        { path: "foo.txt", content: "x" },
        undefined,
        undefined,
      );

      await expect(wrapped.execute("tool-2", { content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
      await expect(wrapped.execute("tool-3", { file_path: " ", content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
    });
  });

  it("preserves action enums in normalized schemas", () => {
    const tools = createClawdbotCodingTools();
    const toolNames = ["browser", "canvas", "nodes", "cron", "gateway", "message"];

    const collectActionValues = (schema: unknown, values: Set<string>): void => {
@@ -88,7 +25,7 @@ describe("createClawdbotCodingTools", () => {
    };

    for (const name of toolNames) {
      const tool = tools.find((candidate) => candidate.name === name);
      const tool = defaultTools.find((candidate) => candidate.name === name);
      expect(tool).toBeDefined();
      const parameters = tool?.parameters as {
        properties?: Record<string, unknown>;
@@ -108,10 +45,9 @@ describe("createClawdbotCodingTools", () => {
    }
  });
  it("includes exec and process tools by default", () => {
    const tools = createClawdbotCodingTools();
    expect(tools.some((tool) => tool.name === "exec")).toBe(true);
    expect(tools.some((tool) => tool.name === "process")).toBe(true);
    expect(tools.some((tool) => tool.name === "apply_patch")).toBe(false);
    expect(defaultTools.some((tool) => tool.name === "exec")).toBe(true);
    expect(defaultTools.some((tool) => tool.name === "process")).toBe(true);
    expect(defaultTools.some((tool) => tool.name === "apply_patch")).toBe(false);
  });
  it("gates apply_patch behind tools.exec.applyPatch for OpenAI models", () => {
    const config: ClawdbotConfig = {

@@ -1,78 +1,15 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import type { AgentTool } from "@mariozechner/pi-agent-core";
import sharp from "sharp";
import { describe, expect, it, vi } from "vitest";
import { __testing, createClawdbotCodingTools } from "./pi-tools.js";
import { describe, expect, it } from "vitest";
import { createClawdbotCodingTools } from "./pi-tools.js";

const defaultTools = createClawdbotCodingTools();

describe("createClawdbotCodingTools", () => {
  describe("Claude/Gemini alias support", () => {
    it("adds Claude-style aliases to schemas without dropping metadata", () => {
      const base: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string", description: "Path" },
            content: { type: "string", description: "Body" },
          },
        },
        execute: vi.fn(),
      };

      const patched = __testing.patchToolSchemaForClaudeCompatibility(base);
      const params = patched.parameters as {
        properties?: Record<string, unknown>;
        required?: string[];
      };
      const props = params.properties ?? {};

      expect(props.file_path).toEqual(props.path);
      expect(params.required ?? []).not.toContain("path");
      expect(params.required ?? []).not.toContain("file_path");
    });

    it("normalizes file_path to path and enforces required groups at runtime", async () => {
      const execute = vi.fn(async (_id, args) => args);
      const tool: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string" },
            content: { type: "string" },
          },
        },
        execute,
      };

      const wrapped = __testing.wrapToolParamNormalization(tool, [{ keys: ["path", "file_path"] }]);

      await wrapped.execute("tool-1", { file_path: "foo.txt", content: "x" });
      expect(execute).toHaveBeenCalledWith(
        "tool-1",
        { path: "foo.txt", content: "x" },
        undefined,
        undefined,
      );

      await expect(wrapped.execute("tool-2", { content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
      await expect(wrapped.execute("tool-3", { file_path: " ", content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
    });
  });

  it("keeps read tool image metadata intact", async () => {
    const tools = createClawdbotCodingTools();
    const readTool = tools.find((tool) => tool.name === "read");
    const readTool = defaultTools.find((tool) => tool.name === "read");
    expect(readTool).toBeDefined();

    const tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-read-"));

@@ -1,183 +0,0 @@
import type { AgentTool } from "@mariozechner/pi-agent-core";
import { describe, expect, it, vi } from "vitest";
import { __testing, createClawdbotCodingTools } from "./pi-tools.js";

describe("createClawdbotCodingTools", () => {
  describe("Claude/Gemini alias support", () => {
    it("adds Claude-style aliases to schemas without dropping metadata", () => {
      const base: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string", description: "Path" },
            content: { type: "string", description: "Body" },
          },
        },
        execute: vi.fn(),
      };

      const patched = __testing.patchToolSchemaForClaudeCompatibility(base);
      const params = patched.parameters as {
        properties?: Record<string, unknown>;
        required?: string[];
      };
      const props = params.properties ?? {};

      expect(props.file_path).toEqual(props.path);
      expect(params.required ?? []).not.toContain("path");
      expect(params.required ?? []).not.toContain("file_path");
    });

    it("normalizes file_path to path and enforces required groups at runtime", async () => {
      const execute = vi.fn(async (_id, args) => args);
      const tool: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string" },
            content: { type: "string" },
          },
        },
        execute,
      };

      const wrapped = __testing.wrapToolParamNormalization(tool, [{ keys: ["path", "file_path"] }]);

      await wrapped.execute("tool-1", { file_path: "foo.txt", content: "x" });
      expect(execute).toHaveBeenCalledWith(
        "tool-1",
        { path: "foo.txt", content: "x" },
        undefined,
        undefined,
      );

      await expect(wrapped.execute("tool-2", { content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
      await expect(wrapped.execute("tool-3", { file_path: " ", content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
    });
  });

  it("applies tool profiles before allow/deny policies", () => {
    const tools = createClawdbotCodingTools({
      config: { tools: { profile: "messaging" } },
    });
    const names = new Set(tools.map((tool) => tool.name));
    expect(names.has("message")).toBe(true);
    expect(names.has("sessions_send")).toBe(true);
    expect(names.has("sessions_spawn")).toBe(false);
    expect(names.has("exec")).toBe(false);
    expect(names.has("browser")).toBe(false);
  });
  it("expands group shorthands in global tool policy", () => {
    const tools = createClawdbotCodingTools({
      config: { tools: { allow: ["group:fs"] } },
    });
    const names = new Set(tools.map((tool) => tool.name));
    expect(names.has("read")).toBe(true);
    expect(names.has("write")).toBe(true);
    expect(names.has("edit")).toBe(true);
    expect(names.has("exec")).toBe(false);
    expect(names.has("browser")).toBe(false);
  });
  it("expands group shorthands in global tool deny policy", () => {
    const tools = createClawdbotCodingTools({
      config: { tools: { deny: ["group:fs"] } },
    });
    const names = new Set(tools.map((tool) => tool.name));
    expect(names.has("read")).toBe(false);
    expect(names.has("write")).toBe(false);
    expect(names.has("edit")).toBe(false);
    expect(names.has("exec")).toBe(true);
  });
  it("lets agent profiles override global profiles", () => {
    const tools = createClawdbotCodingTools({
      sessionKey: "agent:work:main",
      config: {
        tools: { profile: "coding" },
        agents: {
          list: [{ id: "work", tools: { profile: "messaging" } }],
        },
      },
    });
    const names = new Set(tools.map((tool) => tool.name));
    expect(names.has("message")).toBe(true);
    expect(names.has("exec")).toBe(false);
    expect(names.has("read")).toBe(false);
  });
  it("removes unsupported JSON Schema keywords for Cloud Code Assist API compatibility", () => {
    const tools = createClawdbotCodingTools();

    // Helper to recursively check schema for unsupported keywords
    const unsupportedKeywords = new Set([
      "patternProperties",
      "additionalProperties",
      "$schema",
      "$id",
      "$ref",
      "$defs",
      "definitions",
      "examples",
      "minLength",
      "maxLength",
      "minimum",
      "maximum",
      "multipleOf",
      "pattern",
      "format",
      "minItems",
      "maxItems",
      "uniqueItems",
      "minProperties",
      "maxProperties",
    ]);

    const findUnsupportedKeywords = (schema: unknown, path: string): string[] => {
      const found: string[] = [];
      if (!schema || typeof schema !== "object") return found;
      if (Array.isArray(schema)) {
        schema.forEach((item, i) => {
          found.push(...findUnsupportedKeywords(item, `${path}[${i}]`));
        });
        return found;
      }

      const record = schema as Record<string, unknown>;
      const properties =
        record.properties &&
        typeof record.properties === "object" &&
        !Array.isArray(record.properties)
          ? (record.properties as Record<string, unknown>)
          : undefined;
      if (properties) {
        for (const [key, value] of Object.entries(properties)) {
          found.push(...findUnsupportedKeywords(value, `${path}.properties.${key}`));
        }
      }

      for (const [key, value] of Object.entries(record)) {
        if (key === "properties") continue;
        if (unsupportedKeywords.has(key)) {
          found.push(`${path}.${key}`);
        }
        if (value && typeof value === "object") {
          found.push(...findUnsupportedKeywords(value, `${path}.${key}`));
        }
      }
      return found;
    };

    for (const tool of tools) {
      const violations = findUnsupportedKeywords(tool.parameters, `${tool.name}.parameters`);
      expect(violations).toEqual([]);
    }
  });
});
@@ -1,74 +1,10 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import type { AgentTool } from "@mariozechner/pi-agent-core";
import { describe, expect, it, vi } from "vitest";
import { __testing, createClawdbotCodingTools } from "./pi-tools.js";
import { describe, expect, it } from "vitest";
import { createClawdbotCodingTools } from "./pi-tools.js";

describe("createClawdbotCodingTools", () => {
  describe("Claude/Gemini alias support", () => {
    it("adds Claude-style aliases to schemas without dropping metadata", () => {
      const base: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string", description: "Path" },
            content: { type: "string", description: "Body" },
          },
        },
        execute: vi.fn(),
      };

      const patched = __testing.patchToolSchemaForClaudeCompatibility(base);
      const params = patched.parameters as {
        properties?: Record<string, unknown>;
        required?: string[];
      };
      const props = params.properties ?? {};

      expect(props.file_path).toEqual(props.path);
      expect(params.required ?? []).not.toContain("path");
      expect(params.required ?? []).not.toContain("file_path");
    });

    it("normalizes file_path to path and enforces required groups at runtime", async () => {
      const execute = vi.fn(async (_id, args) => args);
      const tool: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string" },
            content: { type: "string" },
          },
        },
        execute,
      };

      const wrapped = __testing.wrapToolParamNormalization(tool, [{ keys: ["path", "file_path"] }]);

      await wrapped.execute("tool-1", { file_path: "foo.txt", content: "x" });
      expect(execute).toHaveBeenCalledWith(
        "tool-1",
        { path: "foo.txt", content: "x" },
        undefined,
        undefined,
      );

      await expect(wrapped.execute("tool-2", { content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
      await expect(wrapped.execute("tool-3", { file_path: " ", content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
    });
  });

  it("uses workspaceDir for Read tool path resolution", async () => {
    const tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-ws-"));
    try {

@@ -1,95 +0,0 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import type { AgentTool } from "@mariozechner/pi-agent-core";
import { describe, expect, it, vi } from "vitest";
import { __testing, createClawdbotCodingTools } from "./pi-tools.js";
import { createSandboxedReadTool } from "./pi-tools.read.js";

describe("createClawdbotCodingTools", () => {
  describe("Claude/Gemini alias support", () => {
    it("adds Claude-style aliases to schemas without dropping metadata", () => {
      const base: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string", description: "Path" },
            content: { type: "string", description: "Body" },
          },
        },
        execute: vi.fn(),
      };

      const patched = __testing.patchToolSchemaForClaudeCompatibility(base);
      const params = patched.parameters as {
        properties?: Record<string, unknown>;
        required?: string[];
      };
      const props = params.properties ?? {};

      expect(props.file_path).toEqual(props.path);
      expect(params.required ?? []).not.toContain("path");
      expect(params.required ?? []).not.toContain("file_path");
    });

    it("normalizes file_path to path and enforces required groups at runtime", async () => {
      const execute = vi.fn(async (_id, args) => args);
      const tool: AgentTool = {
        name: "write",
        description: "test",
        parameters: {
          type: "object",
          required: ["path", "content"],
          properties: {
            path: { type: "string" },
            content: { type: "string" },
          },
        },
        execute,
      };

      const wrapped = __testing.wrapToolParamNormalization(tool, [{ keys: ["path", "file_path"] }]);

      await wrapped.execute("tool-1", { file_path: "foo.txt", content: "x" });
      expect(execute).toHaveBeenCalledWith(
        "tool-1",
        { path: "foo.txt", content: "x" },
        undefined,
        undefined,
      );

      await expect(wrapped.execute("tool-2", { content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
      await expect(wrapped.execute("tool-3", { file_path: " ", content: "x" })).rejects.toThrow(
        /Missing required parameter/,
      );
    });
  });

  it("applies sandbox path guards to file_path alias", async () => {
    const tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-sbx-"));
    const outsidePath = path.join(os.tmpdir(), "clawdbot-outside.txt");
    await fs.writeFile(outsidePath, "outside", "utf8");
    try {
      const readTool = createSandboxedReadTool(tmpDir);
      await expect(readTool.execute("tool-sbx-1", { file_path: outsidePath })).rejects.toThrow();
    } finally {
      await fs.rm(tmpDir, { recursive: true, force: true });
      await fs.rm(outsidePath, { force: true });
    }
  });
  it("falls back to process.cwd() when workspaceDir not provided", () => {
    const prevCwd = process.cwd();
    const tools = createClawdbotCodingTools();
    // Tools should be created without error
    expect(tools.some((tool) => tool.name === "read")).toBe(true);
    expect(tools.some((tool) => tool.name === "write")).toBe(true);
    expect(tools.some((tool) => tool.name === "edit")).toBe(true);
    // cwd should be unchanged
    expect(process.cwd()).toBe(prevCwd);
  });
});
@@ -1,7 +1,14 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import type { AgentTool } from "@mariozechner/pi-agent-core";
import { describe, expect, it, vi } from "vitest";
import { createClawdbotTools } from "./clawdbot-tools.js";
import { __testing, createClawdbotCodingTools } from "./pi-tools.js";
import { createSandboxedReadTool } from "./pi-tools.read.js";
import { createBrowserTool } from "./tools/browser-tool.js";

const defaultTools = createClawdbotCodingTools();

describe("createClawdbotCodingTools", () => {
  describe("Claude/Gemini alias support", () => {
@@ -67,8 +74,144 @@ describe("createClawdbotCodingTools", () => {
    });
  });

  it("keeps browser tool schema OpenAI-compatible without normalization", () => {
    const browser = createBrowserTool();
    const schema = browser.parameters as { type?: unknown; anyOf?: unknown };
    expect(schema.type).toBe("object");
    expect(schema.anyOf).toBeUndefined();
  });
  it("mentions Chrome extension relay in browser tool description", () => {
    const browser = createBrowserTool();
    expect(browser.description).toMatch(/Chrome extension/i);
    expect(browser.description).toMatch(/profile="chrome"/i);
  });
  it("keeps browser tool schema properties after normalization", () => {
    const browser = defaultTools.find((tool) => tool.name === "browser");
    expect(browser).toBeDefined();
    const parameters = browser?.parameters as {
      anyOf?: unknown[];
      properties?: Record<string, unknown>;
      required?: string[];
    };
    expect(parameters.properties?.action).toBeDefined();
    expect(parameters.properties?.target).toBeDefined();
    expect(parameters.properties?.controlUrl).toBeDefined();
    expect(parameters.properties?.targetUrl).toBeDefined();
    expect(parameters.properties?.request).toBeDefined();
    expect(parameters.required ?? []).toContain("action");
  });
  it("exposes raw for gateway config.apply tool calls", () => {
    const gateway = defaultTools.find((tool) => tool.name === "gateway");
    expect(gateway).toBeDefined();

    const parameters = gateway?.parameters as {
      type?: unknown;
      required?: string[];
      properties?: Record<string, unknown>;
    };
    expect(parameters.type).toBe("object");
    expect(parameters.properties?.raw).toBeDefined();
    expect(parameters.required ?? []).not.toContain("raw");
  });
  it("flattens anyOf-of-literals to enum for provider compatibility", () => {
    const browser = defaultTools.find((tool) => tool.name === "browser");
    expect(browser).toBeDefined();

    const parameters = browser?.parameters as {
      properties?: Record<string, unknown>;
    };
    const action = parameters.properties?.action as
      | {
          type?: unknown;
          enum?: unknown[];
          anyOf?: unknown[];
        }
      | undefined;

    expect(action?.type).toBe("string");
    expect(action?.anyOf).toBeUndefined();
    expect(Array.isArray(action?.enum)).toBe(true);
    expect(action?.enum).toContain("act");

    const snapshotFormat = parameters.properties?.snapshotFormat as
      | {
          type?: unknown;
          enum?: unknown[];
          anyOf?: unknown[];
        }
      | undefined;
    expect(snapshotFormat?.type).toBe("string");
    expect(snapshotFormat?.anyOf).toBeUndefined();
    expect(snapshotFormat?.enum).toEqual(["aria", "ai"]);
  });
  it("inlines local $ref before removing unsupported keywords", () => {
    const cleaned = __testing.cleanToolSchemaForGemini({
      type: "object",
      properties: {
        foo: { $ref: "#/$defs/Foo" },
      },
      $defs: {
        Foo: { type: "string", enum: ["a", "b"] },
      },
    }) as {
      $defs?: unknown;
      properties?: Record<string, unknown>;
    };

    expect(cleaned.$defs).toBeUndefined();
    expect(cleaned.properties).toBeDefined();
    expect(cleaned.properties?.foo).toMatchObject({
      type: "string",
      enum: ["a", "b"],
    });
  });
  it("cleans tuple items schemas", () => {
    const cleaned = __testing.cleanToolSchemaForGemini({
      type: "object",
      properties: {
        tuples: {
          type: "array",
          items: [
            { type: "string", format: "uuid" },
            { type: "number", minimum: 1 },
          ],
        },
      },
    }) as {
      properties?: Record<string, unknown>;
    };

    const tuples = cleaned.properties?.tuples as { items?: unknown } | undefined;
    const items = Array.isArray(tuples?.items) ? tuples?.items : [];
    const first = items[0] as { format?: unknown } | undefined;
    const second = items[1] as { minimum?: unknown } | undefined;

    expect(first?.format).toBeUndefined();
    expect(second?.minimum).toBeUndefined();
  });
  it("drops null-only union variants without flattening other unions", () => {
    const cleaned = __testing.cleanToolSchemaForGemini({
      type: "object",
      properties: {
        parentId: { anyOf: [{ type: "string" }, { type: "null" }] },
        count: { oneOf: [{ type: "string" }, { type: "number" }] },
      },
    }) as {
      properties?: Record<string, unknown>;
    };

    const parentId = cleaned.properties?.parentId as
      | { type?: unknown; anyOf?: unknown; oneOf?: unknown }
      | undefined;
    const count = cleaned.properties?.count as
      | { type?: unknown; anyOf?: unknown; oneOf?: unknown }
      | undefined;

    expect(parentId?.type).toBe("string");
    expect(parentId?.anyOf).toBeUndefined();
    expect(count?.oneOf).toBeDefined();
  });
  it("avoids anyOf/oneOf/allOf in tool schemas", () => {
    const tools = createClawdbotCodingTools();
    const offenders: Array<{
      name: string;
      keyword: string;
@@ -96,7 +239,7 @@ describe("createClawdbotCodingTools", () => {
      }
    };

    for (const tool of tools) {
    for (const tool of defaultTools) {
      walk(tool.parameters, "", tool.name);
    }

@@ -192,4 +335,131 @@ describe("createClawdbotCodingTools", () => {
    });
    expect(tools.map((tool) => tool.name)).toEqual(["read"]);
  });

  it("applies tool profiles before allow/deny policies", () => {
    const tools = createClawdbotCodingTools({
      config: { tools: { profile: "messaging" } },
    });
    const names = new Set(tools.map((tool) => tool.name));
    expect(names.has("message")).toBe(true);
    expect(names.has("sessions_send")).toBe(true);
    expect(names.has("sessions_spawn")).toBe(false);
    expect(names.has("exec")).toBe(false);
    expect(names.has("browser")).toBe(false);
  });
  it("expands group shorthands in global tool policy", () => {
    const tools = createClawdbotCodingTools({
      config: { tools: { allow: ["group:fs"] } },
    });
    const names = new Set(tools.map((tool) => tool.name));
    expect(names.has("read")).toBe(true);
    expect(names.has("write")).toBe(true);
    expect(names.has("edit")).toBe(true);
    expect(names.has("exec")).toBe(false);
    expect(names.has("browser")).toBe(false);
  });
  it("expands group shorthands in global tool deny policy", () => {
    const tools = createClawdbotCodingTools({
      config: { tools: { deny: ["group:fs"] } },
    });
    const names = new Set(tools.map((tool) => tool.name));
    expect(names.has("read")).toBe(false);
    expect(names.has("write")).toBe(false);
    expect(names.has("edit")).toBe(false);
    expect(names.has("exec")).toBe(true);
  });
  it("lets agent profiles override global profiles", () => {
    const tools = createClawdbotCodingTools({
      sessionKey: "agent:work:main",
      config: {
        tools: { profile: "coding" },
        agents: {
          list: [{ id: "work", tools: { profile: "messaging" } }],
        },
      },
    });
    const names = new Set(tools.map((tool) => tool.name));
    expect(names.has("message")).toBe(true);
    expect(names.has("exec")).toBe(false);
    expect(names.has("read")).toBe(false);
  });
  it("removes unsupported JSON Schema keywords for Cloud Code Assist API compatibility", () => {
    // Helper to recursively check schema for unsupported keywords
    const unsupportedKeywords = new Set([
      "patternProperties",
      "additionalProperties",
      "$schema",
      "$id",
      "$ref",
      "$defs",
      "definitions",
      "examples",
      "minLength",
      "maxLength",
      "minimum",
      "maximum",
      "multipleOf",
      "pattern",
      "format",
      "minItems",
      "maxItems",
      "uniqueItems",
      "minProperties",
      "maxProperties",
    ]);

    const findUnsupportedKeywords = (schema: unknown, path: string): string[] => {
      const found: string[] = [];
      if (!schema || typeof schema !== "object") return found;
      if (Array.isArray(schema)) {
        schema.forEach((item, i) => {
          found.push(...findUnsupportedKeywords(item, `${path}[${i}]`));
        });
        return found;
      }

      const record = schema as Record<string, unknown>;
      const properties =
        record.properties &&
        typeof record.properties === "object" &&
        !Array.isArray(record.properties)
          ? (record.properties as Record<string, unknown>)
          : undefined;
      if (properties) {
        for (const [key, value] of Object.entries(properties)) {
          found.push(...findUnsupportedKeywords(value, `${path}.properties.${key}`));
        }
      }

      for (const [key, value] of Object.entries(record)) {
        if (key === "properties") continue;
        if (unsupportedKeywords.has(key)) {
          found.push(`${path}.${key}`);
        }
        if (value && typeof value === "object") {
          found.push(...findUnsupportedKeywords(value, `${path}.${key}`));
        }
      }
      return found;
    };

    for (const tool of defaultTools) {
      const violations = findUnsupportedKeywords(tool.parameters, `${tool.name}.parameters`);
      expect(violations).toEqual([]);
    }
  });
  it("applies sandbox path guards to file_path alias", async () => {
    const tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-sbx-"));
    const outsidePath = path.join(os.tmpdir(), "clawdbot-outside.txt");
    await fs.writeFile(outsidePath, "outside", "utf8");
    try {
      const readTool = createSandboxedReadTool(tmpDir);
      await expect(readTool.execute("sandbox-1", { file_path: outsidePath })).rejects.toThrow(
        /sandbox root/i,
      );
    } finally {
      await fs.rm(outsidePath, { force: true });
      await fs.rm(tmpDir, { recursive: true, force: true });
    }
  });
});

@@ -44,6 +44,7 @@ import {
  collectExplicitAllowlist,
  expandPolicyWithPluginGroups,
  resolveToolProfilePolicy,
  stripPluginOnlyAllowlist,
} from "./tool-policy.js";
import { getPluginToolMeta } from "../plugins/tools.js";

@@ -298,12 +299,30 @@ export function createClawdbotCodingTools(options?: {
    tools,
    toolMeta: (tool) => getPluginToolMeta(tool as AnyAgentTool),
  });
  const profilePolicyExpanded = expandPolicyWithPluginGroups(profilePolicy, pluginGroups);
  const providerProfileExpanded = expandPolicyWithPluginGroups(providerProfilePolicy, pluginGroups);
  const globalPolicyExpanded = expandPolicyWithPluginGroups(globalPolicy, pluginGroups);
  const globalProviderExpanded = expandPolicyWithPluginGroups(globalProviderPolicy, pluginGroups);
  const agentPolicyExpanded = expandPolicyWithPluginGroups(agentPolicy, pluginGroups);
  const agentProviderExpanded = expandPolicyWithPluginGroups(agentProviderPolicy, pluginGroups);
  const profilePolicyExpanded = expandPolicyWithPluginGroups(
    stripPluginOnlyAllowlist(profilePolicy, pluginGroups),
    pluginGroups,
  );
  const providerProfileExpanded = expandPolicyWithPluginGroups(
    stripPluginOnlyAllowlist(providerProfilePolicy, pluginGroups),
    pluginGroups,
  );
  const globalPolicyExpanded = expandPolicyWithPluginGroups(
    stripPluginOnlyAllowlist(globalPolicy, pluginGroups),
    pluginGroups,
  );
  const globalProviderExpanded = expandPolicyWithPluginGroups(
    stripPluginOnlyAllowlist(globalProviderPolicy, pluginGroups),
    pluginGroups,
  );
  const agentPolicyExpanded = expandPolicyWithPluginGroups(
    stripPluginOnlyAllowlist(agentPolicy, pluginGroups),
    pluginGroups,
  );
  const agentProviderExpanded = expandPolicyWithPluginGroups(
    stripPluginOnlyAllowlist(agentProviderPolicy, pluginGroups),
    pluginGroups,
  );
  const sandboxPolicyExpanded = expandPolicyWithPluginGroups(sandbox?.tools, pluginGroups);
  const subagentPolicyExpanded = expandPolicyWithPluginGroups(subagentPolicy, pluginGroups);

25
src/agents/tool-policy.plugin-only-allowlist.test.ts
Normal file
@@ -0,0 +1,25 @@
import { describe, expect, it } from "vitest";

import { stripPluginOnlyAllowlist, type PluginToolGroups } from "./tool-policy.js";

const pluginGroups: PluginToolGroups = {
  all: ["lobster", "workflow_tool"],
  byPlugin: new Map([["lobster", ["lobster", "workflow_tool"]]]),
};

describe("stripPluginOnlyAllowlist", () => {
  it("strips allowlist when it only targets plugin tools", () => {
    const policy = stripPluginOnlyAllowlist({ allow: ["lobster"] }, pluginGroups);
    expect(policy?.allow).toBeUndefined();
  });

  it("strips allowlist when it only targets plugin groups", () => {
    const policy = stripPluginOnlyAllowlist({ allow: ["group:plugins"] }, pluginGroups);
    expect(policy?.allow).toBeUndefined();
  });

  it("keeps allowlist when it mixes plugin and core entries", () => {
    const policy = stripPluginOnlyAllowlist({ allow: ["lobster", "read"] }, pluginGroups);
    expect(policy?.allow).toEqual(["lobster", "read"]);
  });
});
@@ -178,6 +178,22 @@ export function expandPolicyWithPluginGroups(
  };
}

export function stripPluginOnlyAllowlist(
  policy: ToolPolicyLike | undefined,
  groups: PluginToolGroups,
): ToolPolicyLike | undefined {
  if (!policy?.allow || policy.allow.length === 0) return policy;
  const normalized = normalizeToolList(policy.allow);
  if (normalized.length === 0) return policy;
  const pluginIds = new Set(groups.byPlugin.keys());
  const pluginTools = new Set(groups.all);
  const isPluginEntry = (entry: string) =>
    entry === "group:plugins" || pluginIds.has(entry) || pluginTools.has(entry);
  const isPluginOnly = normalized.every((entry) => isPluginEntry(entry));
  if (!isPluginOnly) return policy;
  return { ...policy, allow: undefined };
}
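The behavior of `stripPluginOnlyAllowlist` above can be sketched in isolation as follows. This is a minimal self-contained version, assuming entries are already normalized strings (the real function also runs `normalizeToolList` first):

```typescript
type ToolPolicyLike = { allow?: string[]; deny?: string[] };
type PluginToolGroups = { all: string[]; byPlugin: Map<string, string[]> };

function stripPluginOnlyAllowlist(
  policy: ToolPolicyLike | undefined,
  groups: PluginToolGroups,
): ToolPolicyLike | undefined {
  if (!policy?.allow || policy.allow.length === 0) return policy;
  const pluginIds = new Set(groups.byPlugin.keys());
  const pluginTools = new Set(groups.all);
  const isPluginEntry = (entry: string) =>
    entry === "group:plugins" || pluginIds.has(entry) || pluginTools.has(entry);
  // Drop the allowlist only when EVERY entry targets plugin tools;
  // a mixed plugin/core allowlist is left untouched.
  if (!policy.allow.every(isPluginEntry)) return policy;
  return { ...policy, allow: undefined };
}

const groups: PluginToolGroups = {
  all: ["lobster", "workflow_tool"],
  byPlugin: new Map([["lobster", ["lobster", "workflow_tool"]]]),
};

// Plugin-only allowlist is stripped, so core tools remain available:
console.log(stripPluginOnlyAllowlist({ allow: ["lobster"] }, groups)?.allow); // undefined
// Mixed allowlist is preserved verbatim:
console.log(stripPluginOnlyAllowlist({ allow: ["lobster", "read"] }, groups)?.allow);
```

The point of the change in `pi-tools.ts` is that an allowlist which only names plugin tools no longer implicitly denies every core tool.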

export function resolveToolProfilePolicy(profile?: string): ToolProfilePolicy | undefined {
  if (!profile) return undefined;
  const resolved = TOOL_PROFILES[profile as ToolProfileId];

@@ -209,4 +209,26 @@ describe("cron tool", () => {
    const text = cronCall.params?.payload?.text ?? "";
    expect(text).not.toContain("Recent context:");
  });

  it("preserves explicit agentId null on add", async () => {
    callGatewayMock.mockResolvedValueOnce({ ok: true });

    const tool = createCronTool({ agentSessionKey: "main" });
    await tool.execute("call6", {
      action: "add",
      job: {
        name: "reminder",
        schedule: { atMs: 123 },
        agentId: null,
        payload: { kind: "systemEvent", text: "Reminder: the thing." },
      },
    });

    const call = callGatewayMock.mock.calls[0]?.[0] as {
      method?: string;
      params?: { agentId?: string | null };
    };
    expect(call.method).toBe("cron.add");
    expect(call.params?.agentId).toBeNull();
  });
});

@@ -3,9 +3,9 @@ import { normalizeCronJobCreate, normalizeCronJobPatch } from "../../cron/normal
import { loadConfig } from "../../config/config.js";
import { truncateUtf16Safe } from "../../utils.js";
import { optionalStringEnum, stringEnum } from "../schema/typebox.js";
import { resolveSessionAgentId } from "../agent-scope.js";
import { type AnyAgentTool, jsonResult, readStringParam } from "./common.js";
import { callGatewayTool, type GatewayCallOptions } from "./gateway.js";
import { resolveSessionAgentId } from "../agent-scope.js";
import { resolveInternalSessionKey, resolveMainSessionAlias } from "./sessions-helpers.js";

// NOTE: We use Type.Object({}, { additionalProperties: true }) for job/patch
@@ -159,12 +159,12 @@ export function createCronTool(opts?: CronToolOptions): AnyAgentTool {
        throw new Error("job required");
      }
      const job = normalizeCronJobCreate(params.job) ?? params.job;
      if (job && typeof job === "object") {
      if (job && typeof job === "object" && !("agentId" in job)) {
        const cfg = loadConfig();
        const agentId = opts?.agentSessionKey
          ? resolveSessionAgentId({ sessionKey: opts.agentSessionKey, config: cfg })
          : undefined;
        if (agentId && !("agentId" in (job as { agentId?: unknown }))) {
        if (agentId) {
          (job as { agentId?: string }).agentId = agentId;
        }
      }
|
||||
|
||||
@@ -40,8 +40,6 @@ export async function handleDiscordGuildAction(
  isActionEnabled: ActionGate<DiscordActionConfig>,
): Promise<AgentToolResult<unknown>> {
  const accountId = readStringParam(params, "accountId");
  const accountOpts = accountId ? { accountId } : {};

  switch (action) {
    case "memberInfo": {
      if (!isActionEnabled("memberInfo")) {
@@ -53,7 +51,9 @@ export async function handleDiscordGuildAction(
      const userId = readStringParam(params, "userId", {
        required: true,
      });
      const member = await fetchMemberInfoDiscord(guildId, userId, accountOpts);
      const member = accountId
        ? await fetchMemberInfoDiscord(guildId, userId, { accountId })
        : await fetchMemberInfoDiscord(guildId, userId);
      return jsonResult({ ok: true, member });
    }
    case "roleInfo": {
@@ -63,7 +63,9 @@ export async function handleDiscordGuildAction(
      const guildId = readStringParam(params, "guildId", {
        required: true,
      });
      const roles = await fetchRoleInfoDiscord(guildId, accountOpts);
      const roles = accountId
        ? await fetchRoleInfoDiscord(guildId, { accountId })
        : await fetchRoleInfoDiscord(guildId);
      return jsonResult({ ok: true, roles });
    }
    case "emojiList": {
@@ -73,7 +75,9 @@ export async function handleDiscordGuildAction(
      const guildId = readStringParam(params, "guildId", {
        required: true,
      });
      const emojis = await listGuildEmojisDiscord(guildId, accountOpts);
      const emojis = accountId
        ? await listGuildEmojisDiscord(guildId, { accountId })
        : await listGuildEmojisDiscord(guildId);
      return jsonResult({ ok: true, emojis });
    }
    case "emojiUpload": {
@@ -88,15 +92,22 @@ export async function handleDiscordGuildAction(
        required: true,
      });
      const roleIds = readStringArrayParam(params, "roleIds");
      const emoji = await uploadEmojiDiscord(
        {
          guildId,
          name,
          mediaUrl,
          roleIds: roleIds?.length ? roleIds : undefined,
        },
        accountOpts,
      );
      const emoji = accountId
        ? await uploadEmojiDiscord(
            {
              guildId,
              name,
              mediaUrl,
              roleIds: roleIds?.length ? roleIds : undefined,
            },
            { accountId },
          )
        : await uploadEmojiDiscord({
            guildId,
            name,
            mediaUrl,
            roleIds: roleIds?.length ? roleIds : undefined,
          });
      return jsonResult({ ok: true, emoji });
    }
    case "stickerUpload": {
@@ -114,16 +125,24 @@ export async function handleDiscordGuildAction(
      const mediaUrl = readStringParam(params, "mediaUrl", {
        required: true,
      });
      const sticker = await uploadStickerDiscord(
        {
          guildId,
          name,
          description,
          tags,
          mediaUrl,
        },
        accountOpts,
      );
      const sticker = accountId
        ? await uploadStickerDiscord(
            {
              guildId,
              name,
              description,
              tags,
              mediaUrl,
            },
            { accountId },
          )
        : await uploadStickerDiscord({
            guildId,
            name,
            description,
            tags,
            mediaUrl,
          });
      return jsonResult({ ok: true, sticker });
    }
    case "roleAdd": {
@@ -137,7 +156,11 @@ export async function handleDiscordGuildAction(
        required: true,
      });
      const roleId = readStringParam(params, "roleId", { required: true });
      await addRoleDiscord({ guildId, userId, roleId }, accountOpts);
      if (accountId) {
        await addRoleDiscord({ guildId, userId, roleId }, { accountId });
      } else {
        await addRoleDiscord({ guildId, userId, roleId });
      }
      return jsonResult({ ok: true });
    }
    case "roleRemove": {
@@ -151,7 +174,11 @@ export async function handleDiscordGuildAction(
        required: true,
      });
      const roleId = readStringParam(params, "roleId", { required: true });
      await removeRoleDiscord({ guildId, userId, roleId }, accountOpts);
      if (accountId) {
        await removeRoleDiscord({ guildId, userId, roleId }, { accountId });
      } else {
        await removeRoleDiscord({ guildId, userId, roleId });
      }
      return jsonResult({ ok: true });
    }
    case "channelInfo": {
@@ -161,7 +188,9 @@ export async function handleDiscordGuildAction(
      const channelId = readStringParam(params, "channelId", {
        required: true,
      });
      const channel = await fetchChannelInfoDiscord(channelId, accountOpts);
      const channel = accountId
        ? await fetchChannelInfoDiscord(channelId, { accountId })
        : await fetchChannelInfoDiscord(channelId);
      return jsonResult({ ok: true, channel });
    }
    case "channelList": {
@@ -171,7 +200,9 @@ export async function handleDiscordGuildAction(
      const guildId = readStringParam(params, "guildId", {
        required: true,
      });
      const channels = await listGuildChannelsDiscord(guildId, accountOpts);
      const channels = accountId
        ? await listGuildChannelsDiscord(guildId, { accountId })
        : await listGuildChannelsDiscord(guildId);
      return jsonResult({ ok: true, channels });
    }
    case "voiceStatus": {
@@ -184,7 +215,9 @@ export async function handleDiscordGuildAction(
      const userId = readStringParam(params, "userId", {
        required: true,
      });
      const voice = await fetchVoiceStatusDiscord(guildId, userId, accountOpts);
      const voice = accountId
        ? await fetchVoiceStatusDiscord(guildId, userId, { accountId })
        : await fetchVoiceStatusDiscord(guildId, userId);
      return jsonResult({ ok: true, voice });
    }
    case "eventList": {
@@ -194,7 +227,9 @@ export async function handleDiscordGuildAction(
      const guildId = readStringParam(params, "guildId", {
        required: true,
      });
      const events = await listScheduledEventsDiscord(guildId, accountOpts);
      const events = accountId
        ? await listScheduledEventsDiscord(guildId, { accountId })
        : await listScheduledEventsDiscord(guildId);
      return jsonResult({ ok: true, events });
    }
    case "eventCreate": {
@@ -224,7 +259,9 @@ export async function handleDiscordGuildAction(
        entity_metadata: entityType === 3 && location ? { location } : undefined,
        privacy_level: 2,
      };
      const event = await createScheduledEventDiscord(guildId, payload, accountOpts);
      const event = accountId
        ? await createScheduledEventDiscord(guildId, payload, { accountId })
        : await createScheduledEventDiscord(guildId, payload);
      return jsonResult({ ok: true, event });
    }
    case "channelCreate": {
@@ -238,18 +275,28 @@ export async function handleDiscordGuildAction(
      const topic = readStringParam(params, "topic");
      const position = readNumberParam(params, "position", { integer: true });
      const nsfw = params.nsfw as boolean | undefined;
      const channel = await createChannelDiscord(
        {
          guildId,
          name,
          type: type ?? undefined,
          parentId: parentId ?? undefined,
          topic: topic ?? undefined,
          position: position ?? undefined,
          nsfw,
        },
        accountOpts,
      );
      const channel = accountId
        ? await createChannelDiscord(
            {
              guildId,
              name,
              type: type ?? undefined,
              parentId: parentId ?? undefined,
              topic: topic ?? undefined,
              position: position ?? undefined,
              nsfw,
            },
            { accountId },
          )
        : await createChannelDiscord({
            guildId,
            name,
            type: type ?? undefined,
            parentId: parentId ?? undefined,
            topic: topic ?? undefined,
            position: position ?? undefined,
            nsfw,
          });
      return jsonResult({ ok: true, channel });
    }
    case "channelEdit": {
@@ -267,18 +314,28 @@ export async function handleDiscordGuildAction(
      const rateLimitPerUser = readNumberParam(params, "rateLimitPerUser", {
        integer: true,
      });
      const channel = await editChannelDiscord(
        {
          channelId,
          name: name ?? undefined,
          topic: topic ?? undefined,
          position: position ?? undefined,
          parentId,
          nsfw,
          rateLimitPerUser: rateLimitPerUser ?? undefined,
        },
        accountOpts,
      );
      const channel = accountId
        ? await editChannelDiscord(
            {
              channelId,
              name: name ?? undefined,
              topic: topic ?? undefined,
              position: position ?? undefined,
              parentId,
              nsfw,
              rateLimitPerUser: rateLimitPerUser ?? undefined,
            },
            { accountId },
          )
        : await editChannelDiscord({
            channelId,
            name: name ?? undefined,
            topic: topic ?? undefined,
            position: position ?? undefined,
            parentId,
            nsfw,
            rateLimitPerUser: rateLimitPerUser ?? undefined,
          });
      return jsonResult({ ok: true, channel });
    }
    case "channelDelete": {
@@ -288,7 +345,9 @@ export async function handleDiscordGuildAction(
      const channelId = readStringParam(params, "channelId", {
        required: true,
      });
      const result = await deleteChannelDiscord(channelId, accountOpts);
      const result = accountId
        ? await deleteChannelDiscord(channelId, { accountId })
        : await deleteChannelDiscord(channelId);
      return jsonResult(result);
    }
    case "channelMove": {
@@ -301,15 +360,24 @@ export async function handleDiscordGuildAction(
      });
      const parentId = readParentIdParam(params);
      const position = readNumberParam(params, "position", { integer: true });
      await moveChannelDiscord(
        {
      if (accountId) {
        await moveChannelDiscord(
          {
            guildId,
            channelId,
            parentId,
            position: position ?? undefined,
          },
          { accountId },
        );
      } else {
        await moveChannelDiscord({
          guildId,
          channelId,
          parentId,
          position: position ?? undefined,
        },
        accountOpts,
      );
        });
      }
      return jsonResult({ ok: true });
    }
    case "categoryCreate": {
@@ -319,15 +387,22 @@ export async function handleDiscordGuildAction(
      const guildId = readStringParam(params, "guildId", { required: true });
      const name = readStringParam(params, "name", { required: true });
      const position = readNumberParam(params, "position", { integer: true });
      const channel = await createChannelDiscord(
        {
          guildId,
          name,
          type: 4,
          position: position ?? undefined,
        },
        accountOpts,
      );
      const channel = accountId
        ? await createChannelDiscord(
            {
              guildId,
              name,
              type: 4,
              position: position ?? undefined,
            },
            { accountId },
          )
        : await createChannelDiscord({
            guildId,
            name,
            type: 4,
            position: position ?? undefined,
          });
      return jsonResult({ ok: true, category: channel });
    }
    case "categoryEdit": {
@@ -339,14 +414,20 @@ export async function handleDiscordGuildAction(
      });
      const name = readStringParam(params, "name");
      const position = readNumberParam(params, "position", { integer: true });
      const channel = await editChannelDiscord(
        {
          channelId: categoryId,
          name: name ?? undefined,
          position: position ?? undefined,
        },
        accountOpts,
      );
      const channel = accountId
        ? await editChannelDiscord(
            {
              channelId: categoryId,
              name: name ?? undefined,
              position: position ?? undefined,
            },
            { accountId },
          )
        : await editChannelDiscord({
            channelId: categoryId,
            name: name ?? undefined,
            position: position ?? undefined,
          });
      return jsonResult({ ok: true, category: channel });
    }
    case "categoryDelete": {
@@ -356,7 +437,9 @@ export async function handleDiscordGuildAction(
      const categoryId = readStringParam(params, "categoryId", {
        required: true,
      });
      const result = await deleteChannelDiscord(categoryId, accountOpts);
      const result = accountId
        ? await deleteChannelDiscord(categoryId, { accountId })
        : await deleteChannelDiscord(categoryId);
      return jsonResult(result);
    }
    case "channelPermissionSet": {
@@ -373,16 +456,26 @@ export async function handleDiscordGuildAction(
      const targetType = targetTypeRaw === "member" ? 1 : 0;
      const allow = readStringParam(params, "allow");
      const deny = readStringParam(params, "deny");
      await setChannelPermissionDiscord(
        {
      if (accountId) {
        await setChannelPermissionDiscord(
          {
            channelId,
            targetId,
            targetType,
            allow: allow ?? undefined,
            deny: deny ?? undefined,
          },
          { accountId },
        );
      } else {
        await setChannelPermissionDiscord({
          channelId,
          targetId,
          targetType,
          allow: allow ?? undefined,
          deny: deny ?? undefined,
        },
        accountOpts,
      );
        });
      }
      return jsonResult({ ok: true });
    }
    case "channelPermissionRemove": {
@@ -393,7 +486,11 @@ export async function handleDiscordGuildAction(
        required: true,
      });
      const targetId = readStringParam(params, "targetId", { required: true });
      await removeChannelPermissionDiscord(channelId, targetId, accountOpts);
      if (accountId) {
        await removeChannelPermissionDiscord(channelId, targetId, { accountId });
      } else {
        await removeChannelPermissionDiscord(channelId, targetId);
      }
      return jsonResult({ ok: true });
    }
    default:

@@ -58,6 +58,7 @@ export async function handleDiscordMessagingAction(
      required: true,
    }),
  );
  const accountId = readStringParam(params, "accountId");
  const normalizeMessage = (message: unknown) => {
    if (!message || typeof message !== "object") return message;
    return withNormalizedTimestamp(
@@ -65,8 +66,6 @@ export async function handleDiscordMessagingAction(
      (message as { timestamp?: unknown }).timestamp,
    );
  };
  const accountId = readStringParam(params, "accountId");
  const accountOpts = accountId ? { accountId } : {};
  switch (action) {
    case "react": {
      if (!isActionEnabled("reactions")) {
@@ -80,14 +79,24 @@ export async function handleDiscordMessagingAction(
        removeErrorMessage: "Emoji is required to remove a Discord reaction.",
      });
      if (remove) {
        await removeReactionDiscord(channelId, messageId, emoji, accountOpts);
        if (accountId) {
          await removeReactionDiscord(channelId, messageId, emoji, { accountId });
        } else {
          await removeReactionDiscord(channelId, messageId, emoji);
        }
        return jsonResult({ ok: true, removed: emoji });
      }
      if (isEmpty) {
        const removed = await removeOwnReactionsDiscord(channelId, messageId, accountOpts);
        const removed = accountId
          ? await removeOwnReactionsDiscord(channelId, messageId, { accountId })
          : await removeOwnReactionsDiscord(channelId, messageId);
        return jsonResult({ ok: true, removed: removed.removed });
      }
      await reactMessageDiscord(channelId, messageId, emoji, accountOpts);
      if (accountId) {
        await reactMessageDiscord(channelId, messageId, emoji, { accountId });
      } else {
        await reactMessageDiscord(channelId, messageId, emoji);
      }
      return jsonResult({ ok: true, added: emoji });
    }
    case "reactions": {
@@ -102,7 +111,7 @@ export async function handleDiscordMessagingAction(
      const limit =
        typeof limitRaw === "number" && Number.isFinite(limitRaw) ? limitRaw : undefined;
      const reactions = await fetchReactionsDiscord(channelId, messageId, {
        ...accountOpts,
        ...(accountId ? { accountId } : {}),
        limit,
      });
      return jsonResult({ ok: true, reactions });
@@ -118,8 +127,8 @@ export async function handleDiscordMessagingAction(
        label: "stickerIds",
      });
      await sendStickerDiscord(to, stickerIds, {
        ...(accountId ? { accountId } : {}),
        content,
        accountId: accountId ?? undefined,
      });
      return jsonResult({ ok: true });
    }
@@ -146,7 +155,7 @@ export async function handleDiscordMessagingAction(
      await sendPollDiscord(
        to,
        { question, options: answers, maxSelections, durationHours },
        { content, accountId: accountId ?? undefined },
        { ...(accountId ? { accountId } : {}), content },
      );
      return jsonResult({ ok: true });
    }
@@ -155,7 +164,9 @@ export async function handleDiscordMessagingAction(
        throw new Error("Discord permissions are disabled.");
      }
      const channelId = resolveChannelId();
      const permissions = await fetchChannelPermissionsDiscord(channelId, accountOpts);
      const permissions = accountId
        ? await fetchChannelPermissionsDiscord(channelId, { accountId })
        : await fetchChannelPermissionsDiscord(channelId);
      return jsonResult({ ok: true, permissions });
    }
    case "fetchMessage": {
@@ -177,7 +188,9 @@ export async function handleDiscordMessagingAction(
          "Discord message fetch requires guildId, channelId, and messageId (or a valid messageLink).",
        );
      }
      const message = await fetchMessageDiscord(channelId, messageId, accountOpts);
      const message = accountId
        ? await fetchMessageDiscord(channelId, messageId, { accountId })
        : await fetchMessageDiscord(channelId, messageId);
      return jsonResult({
        ok: true,
        message: normalizeMessage(message),
@@ -191,19 +204,18 @@ export async function handleDiscordMessagingAction(
        throw new Error("Discord message reads are disabled.");
      }
      const channelId = resolveChannelId();
      const messages = await readMessagesDiscord(
        channelId,
        {
          limit:
            typeof params.limit === "number" && Number.isFinite(params.limit)
              ? params.limit
              : undefined,
          before: readStringParam(params, "before"),
          after: readStringParam(params, "after"),
          around: readStringParam(params, "around"),
        },
        accountOpts,
      );
      const query = {
        limit:
          typeof params.limit === "number" && Number.isFinite(params.limit)
            ? params.limit
            : undefined,
        before: readStringParam(params, "before"),
        after: readStringParam(params, "after"),
        around: readStringParam(params, "around"),
      };
      const messages = accountId
        ? await readMessagesDiscord(channelId, query, { accountId })
        : await readMessagesDiscord(channelId, query);
      return jsonResult({
        ok: true,
        messages: messages.map((message) => normalizeMessage(message)),
@@ -222,7 +234,7 @@ export async function handleDiscordMessagingAction(
      const embeds =
        Array.isArray(params.embeds) && params.embeds.length > 0 ? params.embeds : undefined;
      const result = await sendMessageDiscord(to, content, {
        accountId: accountId ?? undefined,
        ...(accountId ? { accountId } : {}),
        mediaUrl,
        replyTo,
        embeds,
@@ -240,14 +252,9 @@ export async function handleDiscordMessagingAction(
      const content = readStringParam(params, "content", {
        required: true,
      });
      const message = await editMessageDiscord(
        channelId,
        messageId,
        {
          content,
        },
        accountOpts,
      );
      const message = accountId
        ? await editMessageDiscord(channelId, messageId, { content }, { accountId })
        : await editMessageDiscord(channelId, messageId, { content });
      return jsonResult({ ok: true, message });
    }
    case "deleteMessage": {
@@ -258,7 +265,11 @@ export async function handleDiscordMessagingAction(
      const messageId = readStringParam(params, "messageId", {
        required: true,
      });
      await deleteMessageDiscord(channelId, messageId, accountOpts);
      if (accountId) {
        await deleteMessageDiscord(channelId, messageId, { accountId });
      } else {
        await deleteMessageDiscord(channelId, messageId);
      }
      return jsonResult({ ok: true });
    }
    case "threadCreate": {
@@ -273,15 +284,13 @@ export async function handleDiscordMessagingAction(
        typeof autoArchiveMinutesRaw === "number" && Number.isFinite(autoArchiveMinutesRaw)
          ? autoArchiveMinutesRaw
          : undefined;
      const thread = await createThreadDiscord(
        channelId,
        {
          name,
          messageId,
          autoArchiveMinutes,
        },
        accountOpts,
      );
      const thread = accountId
        ? await createThreadDiscord(
            channelId,
            { name, messageId, autoArchiveMinutes },
            { accountId },
          )
        : await createThreadDiscord(channelId, { name, messageId, autoArchiveMinutes });
      return jsonResult({ ok: true, thread });
    }
    case "threadList": {
@@ -299,16 +308,24 @@ export async function handleDiscordMessagingAction(
        typeof params.limit === "number" && Number.isFinite(params.limit)
          ? params.limit
          : undefined;
      const threads = await listThreadsDiscord(
        {
          guildId,
          channelId,
          includeArchived,
          before,
          limit,
        },
        accountOpts,
      );
      const threads = accountId
        ? await listThreadsDiscord(
            {
              guildId,
              channelId,
              includeArchived,
              before,
              limit,
            },
            { accountId },
          )
        : await listThreadsDiscord({
            guildId,
            channelId,
            includeArchived,
            before,
            limit,
          });
      return jsonResult({ ok: true, threads });
    }
    case "threadReply": {
@@ -322,7 +339,7 @@ export async function handleDiscordMessagingAction(
      const mediaUrl = readStringParam(params, "mediaUrl");
      const replyTo = readStringParam(params, "replyTo");
      const result = await sendMessageDiscord(`channel:${channelId}`, content, {
        accountId: accountId ?? undefined,
        ...(accountId ? { accountId } : {}),
        mediaUrl,
        replyTo,
      });
@@ -336,7 +353,11 @@ export async function handleDiscordMessagingAction(
      const messageId = readStringParam(params, "messageId", {
        required: true,
      });
      await pinMessageDiscord(channelId, messageId, accountOpts);
      if (accountId) {
        await pinMessageDiscord(channelId, messageId, { accountId });
      } else {
        await pinMessageDiscord(channelId, messageId);
      }
      return jsonResult({ ok: true });
    }
    case "unpinMessage": {
@@ -347,7 +368,11 @@ export async function handleDiscordMessagingAction(
      const messageId = readStringParam(params, "messageId", {
        required: true,
      });
      await unpinMessageDiscord(channelId, messageId, accountOpts);
      if (accountId) {
        await unpinMessageDiscord(channelId, messageId, { accountId });
      } else {
        await unpinMessageDiscord(channelId, messageId);
      }
      return jsonResult({ ok: true });
    }
    case "listPins": {
@@ -355,7 +380,9 @@ export async function handleDiscordMessagingAction(
        throw new Error("Discord pins are disabled.");
      }
      const channelId = resolveChannelId();
      const pins = await listPinsDiscord(channelId, accountOpts);
      const pins = accountId
        ? await listPinsDiscord(channelId, { accountId })
        : await listPinsDiscord(channelId);
      return jsonResult({ ok: true, pins: pins.map((pin) => normalizeMessage(pin)) });
    }
    case "searchMessages": {
@@ -378,16 +405,24 @@ export async function handleDiscordMessagingAction(
          : undefined;
      const channelIdList = [...(channelIds ?? []), ...(channelId ? [channelId] : [])];
      const authorIdList = [...(authorIds ?? []), ...(authorId ? [authorId] : [])];
      const results = await searchMessagesDiscord(
        {
          guildId,
          content,
          channelIds: channelIdList.length ? channelIdList : undefined,
          authorIds: authorIdList.length ? authorIdList : undefined,
          limit,
        },
        accountOpts,
      );
      const results = accountId
        ? await searchMessagesDiscord(
            {
              guildId,
              content,
              channelIds: channelIdList.length ? channelIdList : undefined,
              authorIds: authorIdList.length ? authorIdList : undefined,
              limit,
            },
            { accountId },
          )
        : await searchMessagesDiscord({
            guildId,
            content,
            channelIds: channelIdList.length ? channelIdList : undefined,
            authorIds: authorIdList.length ? authorIdList : undefined,
            limit,
          });
      if (!results || typeof results !== "object") {
        return jsonResult({ ok: true, results });
      }
@@ -9,8 +9,6 @@ export async function handleDiscordModerationAction(
  isActionEnabled: ActionGate<DiscordActionConfig>,
): Promise<AgentToolResult<unknown>> {
  const accountId = readStringParam(params, "accountId");
  const accountOpts = accountId ? { accountId } : {};

  switch (action) {
    case "timeout": {
      if (!isActionEnabled("moderation", false)) {
@@ -28,16 +26,24 @@ export async function handleDiscordModerationAction(
          : undefined;
      const until = readStringParam(params, "until");
      const reason = readStringParam(params, "reason");
      const member = await timeoutMemberDiscord(
        {
          guildId,
          userId,
          durationMinutes,
          until,
          reason,
        },
        accountOpts,
      );
      const member = accountId
        ? await timeoutMemberDiscord(
            {
              guildId,
              userId,
              durationMinutes,
              until,
              reason,
            },
            { accountId },
          )
        : await timeoutMemberDiscord({
            guildId,
            userId,
            durationMinutes,
            until,
            reason,
          });
      return jsonResult({ ok: true, member });
    }
    case "kick": {
@@ -51,7 +57,11 @@ export async function handleDiscordModerationAction(
        required: true,
      });
      const reason = readStringParam(params, "reason");
      await kickMemberDiscord({ guildId, userId, reason }, accountOpts);
      if (accountId) {
        await kickMemberDiscord({ guildId, userId, reason }, { accountId });
      } else {
        await kickMemberDiscord({ guildId, userId, reason });
      }
      return jsonResult({ ok: true });
    }
    case "ban": {
@@ -69,15 +79,24 @@ export async function handleDiscordModerationAction(
        typeof params.deleteMessageDays === "number" && Number.isFinite(params.deleteMessageDays)
          ? params.deleteMessageDays
          : undefined;
      await banMemberDiscord(
        {
      if (accountId) {
        await banMemberDiscord(
          {
            guildId,
            userId,
            reason,
            deleteMessageDays,
          },
          { accountId },
        );
      } else {
        await banMemberDiscord({
          guildId,
          userId,
          reason,
          deleteMessageDays,
        },
        accountOpts,
      );
        });
      }
      return jsonResult({ ok: true });
    }
    default:

@@ -3,6 +3,7 @@ import { describe, expect, it, vi } from "vitest";
import type { DiscordActionConfig } from "../../config/config.js";
import { handleDiscordGuildAction } from "./discord-actions-guild.js";
import { handleDiscordMessagingAction } from "./discord-actions-messaging.js";
import { handleDiscordModerationAction } from "./discord-actions-moderation.js";

const createChannelDiscord = vi.fn(async () => ({
  id: "new-channel",
@@ -20,6 +21,7 @@ const editMessageDiscord = vi.fn(async () => ({}));
const fetchMessageDiscord = vi.fn(async () => ({}));
const fetchChannelPermissionsDiscord = vi.fn(async () => ({}));
const fetchReactionsDiscord = vi.fn(async () => ({}));
const listGuildChannelsDiscord = vi.fn(async () => []);
const listPinsDiscord = vi.fn(async () => ({}));
const listThreadsDiscord = vi.fn(async () => ({}));
const moveChannelDiscord = vi.fn(async () => ({ ok: true }));
@@ -35,8 +37,12 @@ const sendPollDiscord = vi.fn(async () => ({}));
const sendStickerDiscord = vi.fn(async () => ({}));
const setChannelPermissionDiscord = vi.fn(async () => ({ ok: true }));
const unpinMessageDiscord = vi.fn(async () => ({}));
const timeoutMemberDiscord = vi.fn(async () => ({}));
const kickMemberDiscord = vi.fn(async () => ({}));
const banMemberDiscord = vi.fn(async () => ({}));

vi.mock("../../discord/send.js", () => ({
  banMemberDiscord: (...args: unknown[]) => banMemberDiscord(...args),
  createChannelDiscord: (...args: unknown[]) => createChannelDiscord(...args),
  createThreadDiscord: (...args: unknown[]) => createThreadDiscord(...args),
  deleteChannelDiscord: (...args: unknown[]) => deleteChannelDiscord(...args),
@@ -46,6 +52,8 @@ vi.mock("../../discord/send.js", () => ({
  fetchMessageDiscord: (...args: unknown[]) => fetchMessageDiscord(...args),
  fetchChannelPermissionsDiscord: (...args: unknown[]) => fetchChannelPermissionsDiscord(...args),
  fetchReactionsDiscord: (...args: unknown[]) => fetchReactionsDiscord(...args),
  kickMemberDiscord: (...args: unknown[]) => kickMemberDiscord(...args),
  listGuildChannelsDiscord: (...args: unknown[]) => listGuildChannelsDiscord(...args),
  listPinsDiscord: (...args: unknown[]) => listPinsDiscord(...args),
  listThreadsDiscord: (...args: unknown[]) => listThreadsDiscord(...args),
  moveChannelDiscord: (...args: unknown[]) => moveChannelDiscord(...args),
@@ -60,12 +68,15 @@ vi.mock("../../discord/send.js", () => ({
  sendPollDiscord: (...args: unknown[]) => sendPollDiscord(...args),
  sendStickerDiscord: (...args: unknown[]) => sendStickerDiscord(...args),
  setChannelPermissionDiscord: (...args: unknown[]) => setChannelPermissionDiscord(...args),
  timeoutMemberDiscord: (...args: unknown[]) => timeoutMemberDiscord(...args),
  unpinMessageDiscord: (...args: unknown[]) => unpinMessageDiscord(...args),
}));

const enableAllActions = () => true;

const disabledActions = (key: keyof DiscordActionConfig) => key !== "reactions";
const channelInfoEnabled = (key: keyof DiscordActionConfig) => key === "channelInfo";
const moderationEnabled = (key: keyof DiscordActionConfig) => key === "moderation";

describe("handleDiscordMessagingAction", () => {
  it("adds reactions", async () => {
@@ -81,6 +92,20 @@ describe("handleDiscordMessagingAction", () => {
    expect(reactMessageDiscord).toHaveBeenCalledWith("C1", "M1", "✅", {});
  });

  it("forwards accountId for reactions", async () => {
    await handleDiscordMessagingAction(
      "react",
      {
        channelId: "C1",
        messageId: "M1",
        emoji: "✅",
        accountId: "ops",
      },
      enableAllActions,
    );
    expect(reactMessageDiscord).toHaveBeenCalledWith("C1", "M1", "✅", { accountId: "ops" });
  });

  it("removes reactions on empty emoji", async () => {
    await handleDiscordMessagingAction(
      "react",
@@ -248,6 +273,15 @@ describe("handleDiscordGuildAction - channel management", () => {
    ).rejects.toThrow(/Discord channel management is disabled/);
  });

  it("forwards accountId for channelList", async () => {
    await handleDiscordGuildAction(
      "channelList",
      { guildId: "G1", accountId: "ops" },
      channelInfoEnabled,
    );
    expect(listGuildChannelsDiscord).toHaveBeenCalledWith("G1", { accountId: "ops" });
  });

  it("edits a channel", async () => {
    await handleDiscordGuildAction(
      "channelEdit",
@@ -481,3 +515,26 @@ describe("handleDiscordGuildAction - channel management", () => {
    expect(removeChannelPermissionDiscord).toHaveBeenCalledWith("C1", "R1", {});
  });
});

describe("handleDiscordModerationAction", () => {
  it("forwards accountId for timeout", async () => {
    await handleDiscordModerationAction(
      "timeout",
      {
        guildId: "G1",
        userId: "U1",
        durationMinutes: 5,
        accountId: "ops",
      },
      moderationEnabled,
    );
    expect(timeoutMemberDiscord).toHaveBeenCalledWith(
      expect.objectContaining({
        guildId: "G1",
        userId: "U1",
        durationMinutes: 5,
      }),
      { accountId: "ops" },
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
@@ -3,7 +3,6 @@ import { describe, expect, it, vi } from "vitest";
import type { ClawdbotConfig } from "../../../config/config.js";
type SendMessageDiscord = typeof import("../../../discord/send.js").sendMessageDiscord;
type SendPollDiscord = typeof import("../../../discord/send.js").sendPollDiscord;
type ReactMessageDiscord = typeof import("../../../discord/send.js").reactMessageDiscord;

const sendMessageDiscord = vi.fn<Parameters<SendMessageDiscord>, ReturnType<SendMessageDiscord>>(
async () => ({ ok: true }) as Awaited<ReturnType<SendMessageDiscord>>,
@@ -11,9 +10,6 @@ const sendMessageDiscord = vi.fn<Parameters<SendMessageDiscord>, ReturnType<Send
const sendPollDiscord = vi.fn<Parameters<SendPollDiscord>, ReturnType<SendPollDiscord>>(
async () => ({ ok: true }) as Awaited<ReturnType<SendPollDiscord>>,
);
const reactMessageDiscord = vi.fn<Parameters<ReactMessageDiscord>, ReturnType<ReactMessageDiscord>>(
async () => ({ ok: true }) as Awaited<ReturnType<ReactMessageDiscord>>,
);

vi.mock("../../../discord/send.js", async () => {
const actual = await vi.importActual<typeof import("../../../discord/send.js")>(
@@ -23,7 +19,6 @@ vi.mock("../../../discord/send.js", async () => {
...actual,
sendMessageDiscord: (...args: Parameters<SendMessageDiscord>) => sendMessageDiscord(...args),
sendPollDiscord: (...args: Parameters<SendPollDiscord>) => sendPollDiscord(...args),
reactMessageDiscord: (...args: Parameters<ReactMessageDiscord>) => reactMessageDiscord(...args),
};
});

@@ -110,25 +105,23 @@ describe("handleDiscordMessageAction", () => {
);
});

it("forwards accountId for reaction actions", async () => {
reactMessageDiscord.mockClear();
it("forwards accountId for thread replies", async () => {
sendMessageDiscord.mockClear();
const handleDiscordMessageAction = await loadHandleDiscordMessageAction();

await handleDiscordMessageAction({
action: "react",
action: "thread-reply",
params: {
channelId: "123",
messageId: "m1",
emoji: "👍",
message: "hi",
},
cfg: {} as ClawdbotConfig,
accountId: "ops",
});

expect(reactMessageDiscord).toHaveBeenCalledWith(
"123",
"m1",
"👍",
expect(sendMessageDiscord).toHaveBeenCalledWith(
"channel:123",
"hi",
expect.objectContaining({
accountId: "ops",
}),

@@ -17,7 +17,6 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
const { ctx, resolveChannelId, readParentIdParam } = params;
const { action, params: actionParams, cfg } = ctx;
const accountId = ctx.accountId ?? readStringParam(actionParams, "accountId");
const accountIdParam = accountId ?? undefined;

if (action === "member-info") {
const userId = readStringParam(actionParams, "userId", { required: true });
@@ -25,7 +24,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
required: true,
});
return await handleDiscordAction(
{ action: "memberInfo", accountId: accountIdParam, guildId, userId },
{ action: "memberInfo", accountId: accountId ?? undefined, guildId, userId },
cfg,
);
}
@@ -35,7 +34,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
required: true,
});
return await handleDiscordAction(
{ action: "roleInfo", accountId: accountIdParam, guildId },
{ action: "roleInfo", accountId: accountId ?? undefined, guildId },
cfg,
);
}
@@ -45,7 +44,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
required: true,
});
return await handleDiscordAction(
{ action: "emojiList", accountId: accountIdParam, guildId },
{ action: "emojiList", accountId: accountId ?? undefined, guildId },
cfg,
);
}
@@ -61,7 +60,14 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
});
const roleIds = readStringArrayParam(actionParams, "roleIds");
return await handleDiscordAction(
{ action: "emojiUpload", accountId: accountIdParam, guildId, name, mediaUrl, roleIds },
{
action: "emojiUpload",
accountId: accountId ?? undefined,
guildId,
name,
mediaUrl,
roleIds,
},
cfg,
);
}
@@ -86,7 +92,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "stickerUpload",
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
name,
description,
@@ -106,7 +112,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: action === "role-add" ? "roleAdd" : "roleRemove",
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
userId,
roleId,
@@ -120,7 +126,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
required: true,
});
return await handleDiscordAction(
{ action: "channelInfo", accountId: accountIdParam, channelId },
{ action: "channelInfo", accountId: accountId ?? undefined, channelId },
cfg,
);
}
@@ -130,7 +136,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
required: true,
});
return await handleDiscordAction(
{ action: "channelList", accountId: accountIdParam, guildId },
{ action: "channelList", accountId: accountId ?? undefined, guildId },
cfg,
);
}
@@ -150,7 +156,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "channelCreate",
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
name,
type: type ?? undefined,
@@ -180,7 +186,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "channelEdit",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId,
name: name ?? undefined,
topic: topic ?? undefined,
@@ -198,7 +204,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
required: true,
});
return await handleDiscordAction(
{ action: "channelDelete", accountId: accountIdParam, channelId },
{ action: "channelDelete", accountId: accountId ?? undefined, channelId },
cfg,
);
}
@@ -217,7 +223,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "channelMove",
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
channelId,
parentId: parentId === undefined ? undefined : parentId,
@@ -238,7 +244,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "categoryCreate",
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
name,
position: position ?? undefined,
@@ -258,7 +264,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "categoryEdit",
accountId: accountIdParam,
accountId: accountId ?? undefined,
categoryId,
name: name ?? undefined,
position: position ?? undefined,
@@ -272,7 +278,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
required: true,
});
return await handleDiscordAction(
{ action: "categoryDelete", accountId: accountIdParam, categoryId },
{ action: "categoryDelete", accountId: accountId ?? undefined, categoryId },
cfg,
);
}
@@ -283,7 +289,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
});
const userId = readStringParam(actionParams, "userId", { required: true });
return await handleDiscordAction(
{ action: "voiceStatus", accountId: accountIdParam, guildId, userId },
{ action: "voiceStatus", accountId: accountId ?? undefined, guildId, userId },
cfg,
);
}
@@ -293,7 +299,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
required: true,
});
return await handleDiscordAction(
{ action: "eventList", accountId: accountIdParam, guildId },
{ action: "eventList", accountId: accountId ?? undefined, guildId },
cfg,
);
}
@@ -314,7 +320,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "eventCreate",
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
name,
startTime,
@@ -345,7 +351,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: discordAction,
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
userId,
durationMinutes,
@@ -370,7 +376,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "threadList",
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
channelId,
includeArchived,
@@ -390,7 +396,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "threadReply",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
content,
mediaUrl: mediaUrl ?? undefined,
@@ -408,7 +414,7 @@ export async function tryHandleDiscordMessageActionGuildAdmin(params: {
return await handleDiscordAction(
{
action: "searchMessages",
accountId: accountIdParam,
accountId: accountId ?? undefined,
guildId,
content: query,
channelId: readStringParam(actionParams, "channelId"),

@@ -22,7 +22,6 @@ export async function handleDiscordMessageAction(
): Promise<AgentToolResult<unknown>> {
const { action, params, cfg } = ctx;
const accountId = ctx.accountId ?? readStringParam(params, "accountId");
const accountIdParam = accountId ?? undefined;

const resolveChannelId = () =>
resolveDiscordChannelId(
@@ -41,7 +40,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "sendMessage",
accountId: accountIdParam,
accountId: accountId ?? undefined,
to,
content,
mediaUrl: mediaUrl ?? undefined,
@@ -65,7 +64,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "poll",
accountId: accountIdParam,
accountId: accountId ?? undefined,
to,
question,
answers,
@@ -84,7 +83,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "react",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
messageId,
emoji,
@@ -100,7 +99,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "reactions",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
messageId,
limit,
@@ -114,7 +113,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "readMessages",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
limit,
before: readStringParam(params, "before"),
@@ -131,7 +130,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "editMessage",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
messageId,
content,
@@ -145,7 +144,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "deleteMessage",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
messageId,
},
@@ -159,7 +158,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: action === "pin" ? "pinMessage" : action === "unpin" ? "unpinMessage" : "listPins",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
messageId,
},
@@ -169,7 +168,11 @@ export async function handleDiscordMessageAction(

if (action === "permissions") {
return await handleDiscordAction(
{ action: "permissions", accountId: accountIdParam, channelId: resolveChannelId() },
{
action: "permissions",
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
},
cfg,
);
}
@@ -183,7 +186,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "threadCreate",
accountId: accountIdParam,
accountId: accountId ?? undefined,
channelId: resolveChannelId(),
name,
messageId,
@@ -202,7 +205,7 @@ export async function handleDiscordMessageAction(
return await handleDiscordAction(
{
action: "sticker",
accountId: accountIdParam,
accountId: accountId ?? undefined,
to: readStringParam(params, "to", { required: true }),
stickerIds,
content: readStringParam(params, "message"),

@@ -38,6 +38,19 @@ export async function runDaemonUninstall(opts: DaemonLifecycleOptions = {}) {
}

const service = resolveGatewayService();
let loaded = false;
try {
loaded = await service.isLoaded({ env: process.env });
} catch {
loaded = false;
}
if (loaded) {
try {
await service.stop({ env: process.env, stdout });
} catch {
// Best-effort stop; final loaded check gates success.
}
}
try {
await service.uninstall({ env: process.env, stdout });
} catch (err) {
@@ -45,12 +58,16 @@ export async function runDaemonUninstall(opts: DaemonLifecycleOptions = {}) {
return;
}

let loaded = false;
loaded = false;
try {
loaded = await service.isLoaded({ env: process.env });
} catch {
loaded = false;
}
if (loaded) {
fail("Gateway service still loaded after uninstall.");
return;
}
emit({
ok: true,
result: "uninstalled",

@@ -198,10 +198,20 @@ export async function applyAuthChoiceAnthropic(
}

if (params.authChoice === "apiKey") {
if (params.opts?.tokenProvider && params.opts.tokenProvider !== "anthropic") {
return null;
}

let nextConfig = params.config;
let hasCredential = false;
const envKey = process.env.ANTHROPIC_API_KEY?.trim();
if (envKey) {

if (params.opts?.token) {
await setAnthropicApiKey(normalizeApiKeyInput(params.opts.token), params.agentDir);
hasCredential = true;
}

if (!hasCredential && envKey) {
const useExisting = await params.prompter.confirm({
message: `Use existing ANTHROPIC_API_KEY (env, ${formatApiKeyPreview(envKey)})?`,
initialValue: true,

@@ -56,7 +56,33 @@ export async function applyAuthChoiceApiProviders(
);
};

if (params.authChoice === "openrouter-api-key") {
let authChoice = params.authChoice;
if (
authChoice === "apiKey" &&
params.opts?.tokenProvider &&
params.opts.tokenProvider !== "anthropic" &&
params.opts.tokenProvider !== "openai"
) {
if (params.opts.tokenProvider === "openrouter") {
authChoice = "openrouter-api-key";
} else if (params.opts.tokenProvider === "vercel-ai-gateway") {
authChoice = "ai-gateway-api-key";
} else if (params.opts.tokenProvider === "moonshot") {
authChoice = "moonshot-api-key";
} else if (params.opts.tokenProvider === "kimi-code") {
authChoice = "kimi-code-api-key";
} else if (params.opts.tokenProvider === "google") {
authChoice = "gemini-api-key";
} else if (params.opts.tokenProvider === "zai") {
authChoice = "zai-api-key";
} else if (params.opts.tokenProvider === "synthetic") {
authChoice = "synthetic-api-key";
} else if (params.opts.tokenProvider === "opencode") {
authChoice = "opencode-zen";
}
}

if (authChoice === "openrouter-api-key") {
const store = ensureAuthProfileStore(params.agentDir, {
allowKeychainPrompt: false,
});
@@ -82,6 +108,11 @@ export async function applyAuthChoiceApiProviders(
hasCredential = true;
}

if (!hasCredential && params.opts?.token && params.opts?.tokenProvider === "openrouter") {
await setOpenrouterApiKey(normalizeApiKeyInput(params.opts.token), params.agentDir);
hasCredential = true;
}

if (!hasCredential) {
const envKey = resolveEnvApiKey("openrouter");
if (envKey) {
@@ -129,8 +160,18 @@ export async function applyAuthChoiceApiProviders(
return { config: nextConfig, agentModelOverride };
}

if (params.authChoice === "ai-gateway-api-key") {
if (authChoice === "ai-gateway-api-key") {
let hasCredential = false;

if (
!hasCredential &&
params.opts?.token &&
params.opts?.tokenProvider === "vercel-ai-gateway"
) {
await setVercelAiGatewayApiKey(normalizeApiKeyInput(params.opts.token), params.agentDir);
hasCredential = true;
}

const envKey = resolveEnvApiKey("vercel-ai-gateway");
if (envKey) {
const useExisting = await params.prompter.confirm({
@@ -171,8 +212,14 @@ export async function applyAuthChoiceApiProviders(
return { config: nextConfig, agentModelOverride };
}

if (params.authChoice === "moonshot-api-key") {
if (authChoice === "moonshot-api-key") {
let hasCredential = false;

if (!hasCredential && params.opts?.token && params.opts?.tokenProvider === "moonshot") {
await setMoonshotApiKey(normalizeApiKeyInput(params.opts.token), params.agentDir);
hasCredential = true;
}

const envKey = resolveEnvApiKey("moonshot");
if (envKey) {
const useExisting = await params.prompter.confirm({
@@ -212,15 +259,22 @@ export async function applyAuthChoiceApiProviders(
return { config: nextConfig, agentModelOverride };
}

if (params.authChoice === "kimi-code-api-key") {
await params.prompter.note(
[
"Kimi Code uses a dedicated endpoint and API key.",
"Get your API key at: https://www.kimi.com/code/en",
].join("\n"),
"Kimi Code",
);
if (authChoice === "kimi-code-api-key") {
let hasCredential = false;
if (!hasCredential && params.opts?.token && params.opts?.tokenProvider === "kimi-code") {
await setKimiCodeApiKey(normalizeApiKeyInput(params.opts.token), params.agentDir);
hasCredential = true;
}

if (!hasCredential) {
await params.prompter.note(
[
"Kimi Code uses a dedicated endpoint and API key.",
"Get your API key at: https://www.kimi.com/code/en",
].join("\n"),
"Kimi Code",
);
}
const envKey = resolveEnvApiKey("kimi-code");
if (envKey) {
const useExisting = await params.prompter.confirm({
@@ -261,8 +315,14 @@ export async function applyAuthChoiceApiProviders(
return { config: nextConfig, agentModelOverride };
}

if (params.authChoice === "gemini-api-key") {
if (authChoice === "gemini-api-key") {
let hasCredential = false;

if (!hasCredential && params.opts?.token && params.opts?.tokenProvider === "google") {
await setGeminiApiKey(normalizeApiKeyInput(params.opts.token), params.agentDir);
hasCredential = true;
}

const envKey = resolveEnvApiKey("google");
if (envKey) {
const useExisting = await params.prompter.confirm({
@@ -302,8 +362,14 @@ export async function applyAuthChoiceApiProviders(
return { config: nextConfig, agentModelOverride };
}

if (params.authChoice === "zai-api-key") {
if (authChoice === "zai-api-key") {
let hasCredential = false;

if (!hasCredential && params.opts?.token && params.opts?.tokenProvider === "zai") {
await setZaiApiKey(normalizeApiKeyInput(params.opts.token), params.agentDir);
hasCredential = true;
}

const envKey = resolveEnvApiKey("zai");
if (envKey) {
const useExisting = await params.prompter.confirm({
@@ -359,12 +425,16 @@ export async function applyAuthChoiceApiProviders(
return { config: nextConfig, agentModelOverride };
}

if (params.authChoice === "synthetic-api-key") {
const key = await params.prompter.text({
message: "Enter Synthetic API key",
validate: (value) => (value?.trim() ? undefined : "Required"),
});
await setSyntheticApiKey(String(key).trim(), params.agentDir);
if (authChoice === "synthetic-api-key") {
if (params.opts?.token && params.opts?.tokenProvider === "synthetic") {
await setSyntheticApiKey(String(params.opts.token).trim(), params.agentDir);
} else {
const key = await params.prompter.text({
message: "Enter Synthetic API key",
validate: (value) => (value?.trim() ? undefined : "Required"),
});
await setSyntheticApiKey(String(key).trim(), params.agentDir);
}
nextConfig = applyAuthProfileConfig(nextConfig, {
profileId: "synthetic:default",
provider: "synthetic",
@@ -387,16 +457,23 @@ export async function applyAuthChoiceApiProviders(
return { config: nextConfig, agentModelOverride };
}

if (params.authChoice === "opencode-zen") {
await params.prompter.note(
[
"OpenCode Zen provides access to Claude, GPT, Gemini, and more models.",
"Get your API key at: https://opencode.ai/auth",
"Requires an active OpenCode Zen subscription.",
].join("\n"),
"OpenCode Zen",
);
if (authChoice === "opencode-zen") {
let hasCredential = false;
if (!hasCredential && params.opts?.token && params.opts?.tokenProvider === "opencode") {
await setOpencodeZenApiKey(normalizeApiKeyInput(params.opts.token), params.agentDir);
hasCredential = true;
}

if (!hasCredential) {
await params.prompter.note(
[
"OpenCode Zen provides access to Claude, GPT, Gemini, and more models.",
"Get your API key at: https://opencode.ai/auth",
"Requires an active OpenCode Zen subscription.",
].join("\n"),
"OpenCode Zen",
);
}
const envKey = resolveEnvApiKey("opencode");
if (envKey) {
const useExisting = await params.prompter.confirm({

@@ -35,7 +35,7 @@ export async function applyAuthChoiceGitHubCopilot(
nextConfig = applyAuthProfileConfig(nextConfig, {
profileId: "github-copilot:github",
provider: "github-copilot",
mode: "oauth",
mode: "token",
});

if (params.setDefaultModel) {

@@ -20,7 +20,12 @@ import {
export async function applyAuthChoiceOpenAI(
params: ApplyAuthChoiceParams,
): Promise<ApplyAuthChoiceResult | null> {
if (params.authChoice === "openai-api-key") {
let authChoice = params.authChoice;
if (authChoice === "apiKey" && params.opts?.tokenProvider === "openai") {
authChoice = "openai-api-key";
}

if (authChoice === "openai-api-key") {
const envKey = resolveEnvApiKey("openai");
if (envKey) {
const useExisting = await params.prompter.confirm({
@@ -43,10 +48,16 @@ export async function applyAuthChoiceOpenAI(
}
}

const key = await params.prompter.text({
message: "Enter OpenAI API key",
validate: validateApiKeyInput,
});
let key: string | undefined;
if (params.opts?.token && params.opts?.tokenProvider === "openai") {
key = params.opts.token;
} else {
key = await params.prompter.text({
message: "Enter OpenAI API key",
validate: validateApiKeyInput,
});
}

const trimmed = normalizeApiKeyInput(String(key));
const result = upsertSharedEnvVar({
key: "OPENAI_API_KEY",

@@ -21,6 +21,10 @@ export type ApplyAuthChoiceParams = {
agentDir?: string;
setDefaultModel: boolean;
agentId?: string;
opts?: {
tokenProvider?: string;
token?: string;
};
};

export type ApplyAuthChoiceResult = {

@@ -1,6 +1,9 @@
import type { Client } from "@buape/carbon";
import { ChannelType, MessageType } from "@buape/carbon";
import { beforeEach, describe, expect, it, vi } from "vitest";
import { createDiscordMessageHandler } from "./monitor.js";
import { __resetDiscordChannelInfoCacheForTest } from "./monitor/message-utils.js";
import { __resetDiscordThreadStarterCacheForTest } from "./monitor/threading.js";

const sendMock = vi.fn();
const reactMock = vi.fn();
@@ -41,12 +44,12 @@ beforeEach(() => {
});
readAllowFromStoreMock.mockReset().mockResolvedValue([]);
upsertPairingRequestMock.mockReset().mockResolvedValue({ code: "PAIRCODE", created: true });
vi.resetModules();
__resetDiscordChannelInfoCacheForTest();
__resetDiscordThreadStarterCacheForTest();
});

describe("discord tool result dispatch", () => {
it("sends status replies with responsePrefix", async () => {
const { createDiscordMessageHandler } = await import("./monitor.js");
const cfg = {
agents: {
defaults: {
@@ -116,7 +119,6 @@ describe("discord tool result dispatch", () => {
}, 30_000);

it("caches channel info lookups between messages", async () => {
const { createDiscordMessageHandler } = await import("./monitor.js");
const cfg = {
agents: {
defaults: {
@@ -189,7 +191,6 @@ describe("discord tool result dispatch", () => {
});

it("includes forwarded message snapshots in body", async () => {
const { createDiscordMessageHandler } = await import("./monitor.js");
let capturedBody = "";
dispatchMock.mockImplementationOnce(async ({ ctx, dispatcher }) => {
capturedBody = ctx.Body ?? "";

@@ -30,6 +30,10 @@ type DiscordThreadParentInfo = {

const DISCORD_THREAD_STARTER_CACHE = new Map<string, DiscordThreadStarter>();

export function __resetDiscordThreadStarterCacheForTest() {
DISCORD_THREAD_STARTER_CACHE.clear();
}

function isDiscordThreadType(type: ChannelType | undefined): boolean {
return (
type === ChannelType.PublicThread ||

@@ -1,7 +1,21 @@
export const MAX_PAYLOAD_BYTES = 512 * 1024; // cap incoming frame size
export const MAX_BUFFERED_BYTES = 1.5 * 1024 * 1024; // per-connection send buffer limit

export const MAX_CHAT_HISTORY_MESSAGES_BYTES = 6 * 1024 * 1024; // keep history responses comfortably under client WS limits
const DEFAULT_MAX_CHAT_HISTORY_MESSAGES_BYTES = 6 * 1024 * 1024; // keep history responses comfortably under client WS limits
let maxChatHistoryMessagesBytes = DEFAULT_MAX_CHAT_HISTORY_MESSAGES_BYTES;

export const getMaxChatHistoryMessagesBytes = () => maxChatHistoryMessagesBytes;

export const __setMaxChatHistoryMessagesBytesForTest = (value?: number) => {
if (!process.env.VITEST && process.env.NODE_ENV !== "test") return;
if (value === undefined) {
maxChatHistoryMessagesBytes = DEFAULT_MAX_CHAT_HISTORY_MESSAGES_BYTES;
return;
}
if (Number.isFinite(value) && value > 0) {
maxChatHistoryMessagesBytes = value;
}
};
export const DEFAULT_HANDSHAKE_TIMEOUT_MS = 10_000;
export const getHandshakeTimeoutMs = () => {
if (process.env.VITEST && process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS) {

@@ -28,7 +28,7 @@ import {
validateChatInjectParams,
validateChatSendParams,
} from "../protocol/index.js";
import { MAX_CHAT_HISTORY_MESSAGES_BYTES } from "../server-constants.js";
import { getMaxChatHistoryMessagesBytes } from "../server-constants.js";
import {
capArrayByJsonBytes,
loadSessionEntry,
@@ -66,7 +66,7 @@ export const chatHandlers: GatewayRequestHandlers = {
const max = Math.min(hardMax, requested);
const sliced = rawMessages.length > max ? rawMessages.slice(-max) : rawMessages;
const sanitized = stripEnvelopeFromMessages(sliced);
const capped = capArrayByJsonBytes(sanitized, MAX_CHAT_HISTORY_MESSAGES_BYTES).items;
const capped = capArrayByJsonBytes(sanitized, getMaxChatHistoryMessagesBytes()).items;
let thinkingLevel = entry?.thinkingLevel;
if (!thinkingLevel) {
const configured = cfg.agents?.defaults?.thinkingDefault;

@@ -1,4 +1,4 @@
import { describe, expect, test, vi } from "vitest";
import { afterAll, beforeAll, describe, expect, test, vi } from "vitest";
import { WebSocket } from "ws";
import { PROTOCOL_VERSION } from "./protocol/index.js";
import { getHandshakeTimeoutMs } from "./server-constants.js";
@@ -26,129 +26,226 @@ async function waitForWsClose(ws: WebSocket, timeoutMs: number): Promise<boolean
});
}

const openWs = async (port: number) => {
const ws = new WebSocket(`ws://127.0.0.1:${port}`);
await new Promise<void>((resolve) => ws.once("open", resolve));
return ws;
};

describe("gateway server auth/connect", () => {
test("closes silent handshakes after timeout", { timeout: 60_000 }, async () => {
vi.useRealTimers();
const prevHandshakeTimeout = process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS;
process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS = "50";
try {
const { server, ws } = await startServerWithClient();
const handshakeTimeoutMs = getHandshakeTimeoutMs();
const closed = await waitForWsClose(ws, handshakeTimeoutMs + 250);
expect(closed).toBe(true);
describe("default auth", () => {
let server: Awaited<ReturnType<typeof startGatewayServer>>;
let port: number;

beforeAll(async () => {
port = await getFreePort();
server = await startGatewayServer(port);
});

afterAll(async () => {
await server.close();
} finally {
if (prevHandshakeTimeout === undefined) {
delete process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS;
} else {
process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS = prevHandshakeTimeout;
}
}
});
});

test("connect (req) handshake returns hello-ok payload", async () => {
const { CONFIG_PATH_CLAWDBOT, STATE_DIR_CLAWDBOT } = await import("../config/config.js");
const port = await getFreePort();
const server = await startGatewayServer(port);
const ws = new WebSocket(`ws://127.0.0.1:${port}`);
await new Promise<void>((resolve) => ws.once("open", resolve));

const res = await connectReq(ws);
expect(res.ok).toBe(true);
const payload = res.payload as
| {
type?: unknown;
snapshot?: { configPath?: string; stateDir?: string };
test("closes silent handshakes after timeout", { timeout: 60_000 }, async () => {
vi.useRealTimers();
const prevHandshakeTimeout = process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS;
process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS = "50";
try {
const ws = await openWs(port);
const handshakeTimeoutMs = getHandshakeTimeoutMs();
const closed = await waitForWsClose(ws, handshakeTimeoutMs + 250);
expect(closed).toBe(true);
} finally {
if (prevHandshakeTimeout === undefined) {
delete process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS;
} else {
process.env.CLAWDBOT_TEST_HANDSHAKE_TIMEOUT_MS = prevHandshakeTimeout;
}
| undefined;
expect(payload?.type).toBe("hello-ok");
expect(payload?.snapshot?.configPath).toBe(CONFIG_PATH_CLAWDBOT);
expect(payload?.snapshot?.stateDir).toBe(STATE_DIR_CLAWDBOT);
}
});

ws.close();
await server.close();
});
test("connect (req) handshake returns hello-ok payload", async () => {
const { CONFIG_PATH_CLAWDBOT, STATE_DIR_CLAWDBOT } = await import("../config/config.js");
const ws = await openWs(port);

test("sends connect challenge on open", async () => {
const port = await getFreePort();
const server = await startGatewayServer(port);
const ws = new WebSocket(`ws://127.0.0.1:${port}`);
const evtPromise = onceMessage<{ payload?: unknown }>(
ws,
(o) => o.type === "event" && o.event === "connect.challenge",
const res = await connectReq(ws);
expect(res.ok).toBe(true);
const payload = res.payload as
| {
type?: unknown;
snapshot?: { configPath?: string; stateDir?: string };
}
| undefined;
expect(payload?.type).toBe("hello-ok");
expect(payload?.snapshot?.configPath).toBe(CONFIG_PATH_CLAWDBOT);
expect(payload?.snapshot?.stateDir).toBe(STATE_DIR_CLAWDBOT);

ws.close();
});

test("sends connect challenge on open", async () => {
const ws = new WebSocket(`ws://127.0.0.1:${port}`);
const evtPromise = onceMessage<{ payload?: unknown }>(
ws,
(o) => o.type === "event" && o.event === "connect.challenge",
);
await new Promise<void>((resolve) => ws.once("open", resolve));
const evt = await evtPromise;
const nonce = (evt.payload as { nonce?: unknown } | undefined)?.nonce;
expect(typeof nonce).toBe("string");
ws.close();
});

test("rejects protocol mismatch", async () => {
const ws = await openWs(port);
try {
const res = await connectReq(ws, {
minProtocol: PROTOCOL_VERSION + 1,
maxProtocol: PROTOCOL_VERSION + 2,
});
expect(res.ok).toBe(false);
} catch {
// If the server closed before we saw the frame, that's acceptable.
}
ws.close();
});

test("rejects non-connect first request", async () => {
const ws = await openWs(port);
ws.send(JSON.stringify({ type: "req", id: "h1", method: "health" }));
const res = await onceMessage<{ ok: boolean; error?: unknown }>(
ws,
(o) => o.type === "res" && o.id === "h1",
);
expect(res.ok).toBe(false);
await new Promise<void>((resolve) => ws.once("close", () => resolve()));
});

test(
"invalid connect params surface in response and close reason",
{ timeout: 60_000 },
async () => {
const ws = await openWs(port);
const closeInfoPromise = new Promise<{ code: number; reason: string }>((resolve) => {
ws.once("close", (code, reason) => resolve({ code, reason: reason.toString() }));
});

ws.send(
JSON.stringify({
type: "req",
id: "h-bad",
method: "connect",
params: {
minProtocol: PROTOCOL_VERSION,
maxProtocol: PROTOCOL_VERSION,
client: {
id: "bad-client",
version: "dev",
platform: "web",
mode: "webchat",
},
device: {
id: 123,
publicKey: "bad",
signature: "bad",
signedAt: "bad",
},
},
}),
);

const res = await onceMessage<{
ok: boolean;
error?: { message?: string };
}>(
ws,
(o) => (o as { type?: string }).type === "res" && (o as { id?: string }).id === "h-bad",
);
expect(res.ok).toBe(false);
expect(String(res.error?.message ?? "")).toContain("invalid connect params");

const closeInfo = await closeInfoPromise;
expect(closeInfo.code).toBe(1008);
expect(closeInfo.reason).toContain("invalid connect params");
},
);
await new Promise<void>((resolve) => ws.once("open", resolve));
const evt = await evtPromise;
const nonce = (evt.payload as { nonce?: unknown } | undefined)?.nonce;
expect(typeof nonce).toBe("string");
ws.close();
await server.close();
});

test("rejects protocol mismatch", async () => {
const { server, ws } = await startServerWithClient();
try {
describe("password auth", () => {
let server: Awaited<ReturnType<typeof startGatewayServer>>;
let port: number;

beforeAll(async () => {
testState.gatewayAuth = { mode: "password", password: "secret" };
port = await getFreePort();
server = await startGatewayServer(port);
});

afterAll(async () => {
await server.close();
});

test("accepts password auth when configured", async () => {
const ws = await openWs(port);
const res = await connectReq(ws, { password: "secret" });
expect(res.ok).toBe(true);
ws.close();
});

test("rejects invalid password", async () => {
const ws = await openWs(port);
const res = await connectReq(ws, { password: "wrong" });
expect(res.ok).toBe(false);
expect(res.error?.message ?? "").toContain("unauthorized");
ws.close();
});
});

describe("token auth", () => {
let server: Awaited<ReturnType<typeof startGatewayServer>>;
let port: number;
let prevToken: string | undefined;

beforeAll(async () => {
prevToken = process.env.CLAWDBOT_GATEWAY_TOKEN;
process.env.CLAWDBOT_GATEWAY_TOKEN = "secret";
port = await getFreePort();
server = await startGatewayServer(port);
});

afterAll(async () => {
await server.close();
if (prevToken === undefined) {
delete process.env.CLAWDBOT_GATEWAY_TOKEN;
} else {
process.env.CLAWDBOT_GATEWAY_TOKEN = prevToken;
}
});

test("rejects invalid token", async () => {
const ws = await openWs(port);
const res = await connectReq(ws, { token: "wrong" });
expect(res.ok).toBe(false);
expect(res.error?.message ?? "").toContain("unauthorized");
ws.close();
});

test("rejects control ui without device identity by default", async () => {
const ws = await openWs(port);
const res = await connectReq(ws, {
minProtocol: PROTOCOL_VERSION + 1,
maxProtocol: PROTOCOL_VERSION + 2,
token: "secret",
device: null,
client: {
id: GATEWAY_CLIENT_NAMES.CONTROL_UI,
version: "1.0.0",
platform: "web",
mode: GATEWAY_CLIENT_MODES.WEBCHAT,
},
});
expect(res.ok).toBe(false);
} catch {
// If the server closed before we saw the frame, that's acceptable.
}
ws.close();
await server.close();
});

test("rejects invalid token", async () => {
const { server, ws, prevToken } = await startServerWithClient("secret");
const res = await connectReq(ws, { token: "wrong" });
expect(res.ok).toBe(false);
expect(res.error?.message ?? "").toContain("unauthorized");
ws.close();
await server.close();
if (prevToken === undefined) {
delete process.env.CLAWDBOT_GATEWAY_TOKEN;
} else {
process.env.CLAWDBOT_GATEWAY_TOKEN = prevToken;
}
});

test("accepts password auth when configured", async () => {
testState.gatewayAuth = { mode: "password", password: "secret" };
const port = await getFreePort();
const server = await startGatewayServer(port);
const ws = new WebSocket(`ws://127.0.0.1:${port}`);
await new Promise<void>((resolve) => ws.once("open", resolve));

const res = await connectReq(ws, { password: "secret" });
expect(res.ok).toBe(true);

ws.close();
await server.close();
});

test("rejects control ui without device identity by default", async () => {
const { server, ws, prevToken } = await startServerWithClient("secret");
const res = await connectReq(ws, {
token: "secret",
device: null,
client: {
id: GATEWAY_CLIENT_NAMES.CONTROL_UI,
version: "1.0.0",
platform: "web",
mode: GATEWAY_CLIENT_MODES.WEBCHAT,
},
expect(res.error?.message ?? "").toContain("secure context");
ws.close();
});
expect(res.ok).toBe(false);
expect(res.error?.message ?? "").toContain("secure context");
ws.close();
await server.close();
if (prevToken === undefined) {
delete process.env.CLAWDBOT_GATEWAY_TOKEN;
} else {
process.env.CLAWDBOT_GATEWAY_TOKEN = prevToken;
}
});

test("allows control ui without device identity when insecure auth is enabled", async () => {
@@ -327,81 +424,5 @@ describe("gateway server auth/connect", () => {
}
});

test("rejects invalid password", async () => {
testState.gatewayAuth = { mode: "password", password: "secret" };
const port = await getFreePort();
const server = await startGatewayServer(port);
const ws = new WebSocket(`ws://127.0.0.1:${port}`);
await new Promise<void>((resolve) => ws.once("open", resolve));

const res = await connectReq(ws, { password: "wrong" });
expect(res.ok).toBe(false);
expect(res.error?.message ?? "").toContain("unauthorized");

ws.close();
await server.close();
});

test("rejects non-connect first request", async () => {
const { server, ws } = await startServerWithClient();
ws.send(JSON.stringify({ type: "req", id: "h1", method: "health" }));
const res = await onceMessage<{ ok: boolean; error?: unknown }>(
ws,
(o) => o.type === "res" && o.id === "h1",
);
expect(res.ok).toBe(false);
await new Promise<void>((resolve) => ws.once("close", () => resolve()));
await server.close();
});

test(
"invalid connect params surface in response and close reason",
{ timeout: 60_000 },
async () => {
const { server, ws } = await startServerWithClient();
const closeInfoPromise = new Promise<{ code: number; reason: string }>((resolve) => {
ws.once("close", (code, reason) => resolve({ code, reason: reason.toString() }));
});

ws.send(
JSON.stringify({
type: "req",
id: "h-bad",
method: "connect",
params: {
minProtocol: PROTOCOL_VERSION,
maxProtocol: PROTOCOL_VERSION,
client: {
id: "bad-client",
version: "dev",
platform: "web",
mode: "webchat",
},
device: {
id: 123,
publicKey: "bad",
signature: "bad",
signedAt: "bad",
},
},
}),
);

const res = await onceMessage<{
ok: boolean;
error?: { message?: string };
}>(
ws,
(o) => (o as { type?: string }).type === "res" && (o as { id?: string }).id === "h-bad",
);
expect(res.ok).toBe(false);
expect(String(res.error?.message ?? "")).toContain("invalid connect params");

const closeInfo = await closeInfoPromise;
expect(closeInfo.code).toBe(1008);
expect(closeInfo.reason).toContain("invalid connect params");

await server.close();
},
);
// Remaining tests require isolated gateway state.
});

@@ -14,6 +14,7 @@ import {
testState,
writeSessionStore,
} from "./test-helpers.js";
import { __setMaxChatHistoryMessagesBytesForTest } from "./server-constants.js";
installGatewayTestHooks({ scope: "suite" });
async function waitFor(condition: () => boolean, timeoutMs = 1500) {
const deadline = Date.now() + timeoutMs;
@@ -52,6 +53,8 @@ describe("gateway server chat", () => {
spy.mockResolvedValue(undefined);
};
try {
const historyMaxBytes = 192 * 1024;
__setMaxChatHistoryMessagesBytesForTest(historyMaxBytes);
await connectOk(ws);
const sessionDir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-gw-"));
tempDirs.push(sessionDir);
@@ -66,9 +69,9 @@ describe("gateway server chat", () => {
};

await writeStore({ main: { sessionId: "sess-main", updatedAt: Date.now() } });
const bigText = "x".repeat(155_000);
const bigText = "x".repeat(4_000);
const largeLines: string[] = [];
for (let i = 0; i < 40; i += 1) {
for (let i = 0; i < 60; i += 1) {
largeLines.push(
JSON.stringify({
message: {
@@ -91,7 +94,7 @@ describe("gateway server chat", () => {
expect(cappedRes.ok).toBe(true);
const cappedMsgs = cappedRes.payload?.messages ?? [];
const bytes = Buffer.byteLength(JSON.stringify(cappedMsgs), "utf8");
expect(bytes).toBeLessThanOrEqual(6 * 1024 * 1024);
expect(bytes).toBeLessThanOrEqual(historyMaxBytes);
expect(cappedMsgs.length).toBeLessThan(60);

await writeStore({
@@ -473,6 +476,7 @@ describe("gateway server chat", () => {
: undefined;
expect(run2).toBe("idem-2");
} finally {
__setMaxChatHistoryMessagesBytesForTest();
testState.sessionStorePath = undefined;
sessionStoreSaveDelayMs.value = 0;
ws.close();

@@ -5,7 +5,6 @@ import { describe, expect, test } from "vitest";
import {
connectOk,
installGatewayTestHooks,
onceMessage,
rpcReq,
startServerWithClient,
testState,
@@ -36,36 +35,23 @@ async function rmTempDir(dir: string) {
await fs.rm(dir, { recursive: true, force: true });
}

async function waitForCronFinished(
ws: { send: (data: string) => void },
jobId: string,
timeoutMs = 20_000,
) {
await onceMessage(
ws as never,
(o) =>
o.type === "event" &&
o.event === "cron" &&
o.payload?.action === "finished" &&
o.payload?.jobId === jobId,
timeoutMs,
);
}

async function waitForNonEmptyFile(pathname: string, timeoutMs = 2000) {
const deadline = Date.now() + timeoutMs;
const startedAt = process.hrtime.bigint();
for (;;) {
const raw = await fs.readFile(pathname, "utf-8").catch(() => "");
if (raw.trim().length > 0) return raw;
if (Date.now() >= deadline) {
const elapsedMs = Number(process.hrtime.bigint() - startedAt) / 1e6;
if (elapsedMs >= timeoutMs) {
throw new Error(`timeout waiting for file ${pathname}`);
}
await new Promise((resolve) => setTimeout(resolve, 10));
await yieldToEventLoop();
}
}

describe("gateway server cron", () => {
test("handles cron CRUD, normalization, and patch semantics", { timeout: 120_000 }, async () => {
const prevSkipCron = process.env.CLAWDBOT_SKIP_CRON;
process.env.CLAWDBOT_SKIP_CRON = "0";
const dir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-gw-cron-"));
testState.cronStorePath = path.join(dir, "cron", "jobs.json");
testState.sessionConfig = { mainKey: "primary" };
@@ -269,10 +255,17 @@ describe("gateway server cron", () => {
testState.cronStorePath = undefined;
testState.sessionConfig = undefined;
testState.cronEnabled = undefined;
if (prevSkipCron === undefined) {
delete process.env.CLAWDBOT_SKIP_CRON;
} else {
process.env.CLAWDBOT_SKIP_CRON = prevSkipCron;
}
}
});

test("writes cron run history and auto-runs due jobs", async () => {
const prevSkipCron = process.env.CLAWDBOT_SKIP_CRON;
process.env.CLAWDBOT_SKIP_CRON = "0";
const dir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-gw-cron-log-"));
testState.cronStorePath = path.join(dir, "cron", "jobs.json");
testState.cronEnabled = undefined;
@@ -297,13 +290,10 @@ describe("gateway server cron", () => {
const jobId = typeof jobIdValue === "string" ? jobIdValue : "";
expect(jobId.length > 0).toBe(true);

const finishedP = waitForCronFinished(ws, jobId);
const runRes = await rpcReq(ws, "cron.run", { id: jobId, mode: "force" }, 20_000);
expect(runRes.ok).toBe(true);
await finishedP;

const logPath = path.join(dir, "cron", "runs", `${jobId}.jsonl`);
const raw = await waitForNonEmptyFile(logPath);
const raw = await waitForNonEmptyFile(logPath, 5000);
const line = raw
.split("\n")
.map((l) => l.trim())
@@ -349,9 +339,7 @@ describe("gateway server cron", () => {
const autoJobId = typeof autoJobIdValue === "string" ? autoJobIdValue : "";
expect(autoJobId.length > 0).toBe(true);

await waitForCronFinished(ws, autoJobId);

await waitForNonEmptyFile(path.join(dir, "cron", "runs", `${autoJobId}.jsonl`));
await waitForNonEmptyFile(path.join(dir, "cron", "runs", `${autoJobId}.jsonl`), 5000);
const autoEntries = (await rpcReq(ws, "cron.runs", { id: autoJobId, limit: 10 })).payload as
| { entries?: Array<{ jobId?: unknown }> }
| undefined;
@@ -364,6 +352,11 @@ describe("gateway server cron", () => {
await rmTempDir(dir);
testState.cronStorePath = undefined;
testState.cronEnabled = undefined;
if (prevSkipCron === undefined) {
delete process.env.CLAWDBOT_SKIP_CRON;
} else {
process.env.CLAWDBOT_SKIP_CRON = prevSkipCron;
}
}
}, 45_000);
});

@@ -191,7 +191,7 @@ describe("gateway hot reload", () => {
}
});

it("applies hot reload actions for providers + services", async () => {
it("applies hot reload actions and emits restart signal", async () => {
const port = await getFreePort();
const server = await startGatewayServer(port);

@@ -270,13 +270,6 @@ describe("gateway hot reload", () => {
expect(hoisted.providerManager.stopChannel).toHaveBeenCalledWith("imessage");
expect(hoisted.providerManager.startChannel).toHaveBeenCalledWith("imessage");

await server.close();
});

it("emits SIGUSR1 on restart plan when listener exists", async () => {
const port = await getFreePort();
const server = await startGatewayServer(port);

const onRestart = hoisted.getOnRestart();
expect(onRestart).toBeTypeOf("function");

@@ -554,3 +554,4 @@ vi.mock("../cli/deps.js", async () => {
});

process.env.CLAWDBOT_SKIP_CHANNELS = "1";
process.env.CLAWDBOT_SKIP_CRON = "1";

578
src/infra/provider-usage.fetch.antigravity.test.ts
Normal file
@@ -0,0 +1,578 @@
|
||||
import { describe, expect, it, vi } from "vitest";
|
||||
import { fetchAntigravityUsage } from "./provider-usage.fetch.antigravity.js";
|
||||
|
||||
const makeResponse = (status: number, body: unknown): Response => {
|
||||
const payload = typeof body === "string" ? body : JSON.stringify(body);
|
||||
const headers = typeof body === "string" ? undefined : { "Content-Type": "application/json" };
|
||||
return new Response(payload, { status, headers });
|
||||
};
|
||||
|
||||
describe("fetchAntigravityUsage", () => {
|
||||
it("returns 3 windows when both endpoints succeed", async () => {
|
||||
const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
|
||||
const url =
|
||||
typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
|
||||
|
||||
if (url.includes("loadCodeAssist")) {
|
||||
return makeResponse(200, {
|
||||
availablePromptCredits: 750,
|
||||
planInfo: { monthlyPromptCredits: 1000 },
|
||||
planType: "Standard",
|
||||
currentTier: { id: "tier1", name: "Standard Tier" },
|
||||
});
|
||||
}
|
||||
|
||||
if (url.includes("fetchAvailableModels")) {
|
||||
return makeResponse(200, {
|
||||
models: {
|
||||
"gemini-pro-1.5": {
|
||||
quotaInfo: {
|
||||
remainingFraction: 0.6,
|
||||
resetTime: "2026-01-08T00:00:00Z",
|
||||
isExhausted: false,
|
||||
},
|
||||
},
|
||||
"gemini-flash-2.0": {
|
||||
quotaInfo: {
|
||||
remainingFraction: 0.8,
|
||||
resetTime: "2026-01-08T00:00:00Z",
|
||||
isExhausted: false,
|
||||
},
|
||||
},
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
return makeResponse(404, "not found");
|
||||
});
|
||||
|
||||
const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);
|
||||
|
||||
expect(snapshot.provider).toBe("google-antigravity");
|
||||
expect(snapshot.displayName).toBe("Antigravity");
|
||||
expect(snapshot.windows).toHaveLength(3);
|
||||
expect(snapshot.plan).toBe("Standard Tier");
|
||||
expect(snapshot.error).toBeUndefined();
|
||||
|
||||
const creditsWindow = snapshot.windows.find((w) => w.label === "Credits");
|
||||
expect(creditsWindow?.usedPercent).toBe(25); // (1000 - 750) / 1000 * 100
|
||||
|
||||
const proWindow = snapshot.windows.find((w) => w.label === "gemini-pro-1.5");
|
||||
expect(proWindow?.usedPercent).toBe(40); // (1 - 0.6) * 100
|
||||
expect(proWindow?.resetAt).toBe(new Date("2026-01-08T00:00:00Z").getTime());
|
||||
|
||||
const flashWindow = snapshot.windows.find((w) => w.label === "gemini-flash-2.0");
|
||||
expect(flashWindow?.usedPercent).toBeCloseTo(20, 1); // (1 - 0.8) * 100
|
||||
expect(flashWindow?.resetAt).toBe(new Date("2026-01-08T00:00:00Z").getTime());
|
||||
|
||||
expect(mockFetch).toHaveBeenCalledTimes(2);
|
||||
});
|
||||
|
||||
it("returns Credits only when loadCodeAssist succeeds but fetchAvailableModels fails", async () => {
|
||||
const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
|
||||
const url =
|
||||
typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
|
||||
|
||||
if (url.includes("loadCodeAssist")) {
|
||||
return makeResponse(200, {
|
||||
availablePromptCredits: 250,
|
||||
planInfo: { monthlyPromptCredits: 1000 },
|
||||
currentTier: { name: "Free" },
|
||||
});
|
||||
}
|
||||
|
||||
if (url.includes("fetchAvailableModels")) {
|
||||
return makeResponse(403, { error: { message: "Permission denied" } });
|
||||
}
|
||||
|
||||
return makeResponse(404, "not found");
|
||||
});
|
||||
|
||||
const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);
|
||||
|
||||
expect(snapshot.provider).toBe("google-antigravity");
|
||||
expect(snapshot.windows).toHaveLength(1);
|
||||
expect(snapshot.plan).toBe("Free");
|
||||
expect(snapshot.error).toBeUndefined();
|
||||
|
||||
const creditsWindow = snapshot.windows[0];
|
||||
expect(creditsWindow?.label).toBe("Credits");
|
||||
expect(creditsWindow?.usedPercent).toBe(75); // (1000 - 250) / 1000 * 100
|
||||
|
||||
expect(mockFetch).toHaveBeenCalledTimes(2);
|
||||
});
|
||||
|
||||
it("returns model IDs when fetchAvailableModels succeeds but loadCodeAssist fails", async () => {
|
||||
const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
|
||||
const url =
|
||||
typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
|
||||
|
||||
if (url.includes("loadCodeAssist")) {
|
||||
return makeResponse(500, "Internal server error");
|
||||
}
|
||||
|
||||
if (url.includes("fetchAvailableModels")) {
|
||||
return makeResponse(200, {
|
||||
models: {
|
||||
"gemini-pro-1.5": {
|
||||
quotaInfo: { remainingFraction: 0.5, resetTime: "2026-01-08T00:00:00Z" },
|
||||
},
|
||||
"gemini-flash-2.0": {
|
||||
quotaInfo: { remainingFraction: 0.7, resetTime: "2026-01-08T00:00:00Z" },
|
||||
},
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
return makeResponse(404, "not found");
|
||||
});
|
||||
|
||||
const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);
|
||||
|
||||
expect(snapshot.provider).toBe("google-antigravity");
|
||||
expect(snapshot.windows).toHaveLength(2);
|
||||
expect(snapshot.error).toBeUndefined();
|
||||
|
||||
const proWindow = snapshot.windows.find((w) => w.label === "gemini-pro-1.5");
|
||||
expect(proWindow?.usedPercent).toBe(50); // (1 - 0.5) * 100
|
||||
|
||||
const flashWindow = snapshot.windows.find((w) => w.label === "gemini-flash-2.0");
|
||||
expect(flashWindow?.usedPercent).toBeCloseTo(30, 1); // (1 - 0.7) * 100
|
||||
|
||||
expect(mockFetch).toHaveBeenCalledTimes(2);
|
||||
});
|
||||
|
||||
it("uses cloudaicompanionProject string as project id", async () => {
|
||||
let capturedBody: string | undefined;
|
||||
const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(
|
||||
async (input, init) => {
|
||||
const url =
|
||||
typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
|
||||
|
||||
if (url.includes("loadCodeAssist")) {
|
||||
return makeResponse(200, {
|
||||
availablePromptCredits: 900,
|
||||
planInfo: { monthlyPromptCredits: 1000 },
|
||||
cloudaicompanionProject: "projects/alpha",
|
||||
});
|
||||
}
|
||||
|
||||
if (url.includes("fetchAvailableModels")) {
|
||||
capturedBody = init?.body?.toString();
|
||||
return makeResponse(200, { models: {} });
|
||||
}
|
||||
|
||||
return makeResponse(404, "not found");
|
||||
},
|
||||
);
|
||||
|
||||
await fetchAntigravityUsage("token-123", 5000, mockFetch);
|
||||
|
||||
expect(capturedBody).toBe(JSON.stringify({ project: "projects/alpha" }));
|
||||
});
|
||||
|
||||
it("uses cloudaicompanionProject object id when present", async () => {
|
||||
let capturedBody: string | undefined;
|
||||
const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(
|
||||
async (input, init) => {
|
||||
const url =
|
||||
typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;
|
||||
|
||||
if (url.includes("loadCodeAssist")) {
|
||||
return makeResponse(200, {
|
||||
availablePromptCredits: 900,
|
||||
planInfo: { monthlyPromptCredits: 1000 },
|
||||
cloudaicompanionProject: { id: "projects/beta" },
|
||||
});
|
||||
}
|
||||
|
||||
if (url.includes("fetchAvailableModels")) {
|
||||
capturedBody = init?.body?.toString();
|
||||
return makeResponse(200, { models: {} });
|
||||
}
|
||||
|
||||
return makeResponse(404, "not found");
|
||||
},
|
||||
);
|
||||
|
||||
await fetchAntigravityUsage("token-123", 5000, mockFetch);
|
||||
|
||||
expect(capturedBody).toBe(JSON.stringify({ project: "projects/beta" }));
|
||||
});
|
||||
|
||||
  it("returns error snapshot when both endpoints fail", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(403, { error: { message: "Access denied" } });
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(403, "Forbidden");
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.provider).toBe("google-antigravity");
    expect(snapshot.windows).toHaveLength(0);
    expect(snapshot.error).toBe("Access denied");

    expect(mockFetch).toHaveBeenCalledTimes(2);
  });

  it("returns Token expired when fetchAvailableModels returns 401 and no windows", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(500, "Boom");
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(401, { error: { message: "Unauthorized" } });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.error).toBe("Token expired");
    expect(snapshot.windows).toHaveLength(0);
  });

  it("extracts plan info from currentTier.name", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(200, {
          availablePromptCredits: 500,
          planInfo: { monthlyPromptCredits: 1000 },
          planType: "Basic",
          currentTier: { id: "tier2", name: "Premium Tier" },
        });
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(500, "Error");
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.plan).toBe("Premium Tier");
  });

  it("falls back to planType when currentTier.name is missing", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(200, {
          availablePromptCredits: 500,
          planInfo: { monthlyPromptCredits: 1000 },
          planType: "Basic Plan",
        });
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(500, "Error");
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.plan).toBe("Basic Plan");
  });

  it("includes reset times in model windows", async () => {
    const resetTime = "2026-01-10T12:00:00Z";
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(500, "Error");
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(200, {
          models: {
            "gemini-pro-experimental": {
              quotaInfo: { remainingFraction: 0.3, resetTime },
            },
          },
        });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    const proWindow = snapshot.windows.find((w) => w.label === "gemini-pro-experimental");
    expect(proWindow?.resetAt).toBe(new Date(resetTime).getTime());
  });

  it("parses string numbers correctly", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(200, {
          availablePromptCredits: "600",
          planInfo: { monthlyPromptCredits: "1000" },
        });
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(200, {
          models: {
            "gemini-flash-lite": {
              quotaInfo: { remainingFraction: "0.9" },
            },
          },
        });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.windows).toHaveLength(2);

    const creditsWindow = snapshot.windows.find((w) => w.label === "Credits");
    expect(creditsWindow?.usedPercent).toBe(40); // (1000 - 600) / 1000 * 100

    const flashWindow = snapshot.windows.find((w) => w.label === "gemini-flash-lite");
    expect(flashWindow?.usedPercent).toBeCloseTo(10, 1); // (1 - 0.9) * 100
  });

  it("skips internal models", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(200, {
          availablePromptCredits: 500,
          planInfo: { monthlyPromptCredits: 1000 },
          cloudaicompanionProject: "projects/internal",
        });
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(200, {
          models: {
            chat_hidden: { quotaInfo: { remainingFraction: 0.1 } },
            tab_hidden: { quotaInfo: { remainingFraction: 0.2 } },
            "gemini-pro-1.5": { quotaInfo: { remainingFraction: 0.7 } },
          },
        });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.windows.map((w) => w.label)).toEqual(["Credits", "gemini-pro-1.5"]);
  });

  it("sorts models by usage and shows individual model IDs", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(500, "Error");
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(200, {
          models: {
            "gemini-pro-1.0": {
              quotaInfo: { remainingFraction: 0.8 },
            },
            "gemini-pro-1.5": {
              quotaInfo: { remainingFraction: 0.3 },
            },
            "gemini-flash-1.5": {
              quotaInfo: { remainingFraction: 0.6 },
            },
            "gemini-flash-2.0": {
              quotaInfo: { remainingFraction: 0.9 },
            },
          },
        });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.windows).toHaveLength(4);
    // Should be sorted by usage (highest first)
    expect(snapshot.windows[0]?.label).toBe("gemini-pro-1.5");
    expect(snapshot.windows[0]?.usedPercent).toBe(70); // (1 - 0.3) * 100
    expect(snapshot.windows[1]?.label).toBe("gemini-flash-1.5");
    expect(snapshot.windows[1]?.usedPercent).toBe(40); // (1 - 0.6) * 100
    expect(snapshot.windows[2]?.label).toBe("gemini-pro-1.0");
    expect(snapshot.windows[2]?.usedPercent).toBeCloseTo(20, 1); // (1 - 0.8) * 100
    expect(snapshot.windows[3]?.label).toBe("gemini-flash-2.0");
    expect(snapshot.windows[3]?.usedPercent).toBeCloseTo(10, 1); // (1 - 0.9) * 100
  });

  it("returns Token expired error on 401 from loadCodeAssist", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(401, { error: { message: "Unauthorized" } });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.error).toBe("Token expired");
    expect(snapshot.windows).toHaveLength(0);
    expect(mockFetch).toHaveBeenCalledTimes(1); // Should stop early on 401
  });

  it("handles empty models array gracefully", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(200, {
          availablePromptCredits: 800,
          planInfo: { monthlyPromptCredits: 1000 },
        });
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(200, { models: {} });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.windows).toHaveLength(1);
    const creditsWindow = snapshot.windows[0];
    expect(creditsWindow?.label).toBe("Credits");
    expect(creditsWindow?.usedPercent).toBe(20);
  });

  it("handles missing credits fields gracefully", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(200, { planType: "Free" });
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(200, {
          models: {
            "gemini-flash-experimental": {
              quotaInfo: { remainingFraction: 0.5 },
            },
          },
        });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.windows).toHaveLength(1);
    const flashWindow = snapshot.windows[0];
    expect(flashWindow?.label).toBe("gemini-flash-experimental");
    expect(flashWindow?.usedPercent).toBe(50);
    expect(snapshot.plan).toBe("Free");
  });

  it("handles invalid reset time gracefully", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        return makeResponse(500, "Error");
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(200, {
          models: {
            "gemini-pro-test": {
              quotaInfo: { remainingFraction: 0.4, resetTime: "invalid-date" },
            },
          },
        });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    const proWindow = snapshot.windows.find((w) => w.label === "gemini-pro-test");
    expect(proWindow?.usedPercent).toBe(60);
    expect(proWindow?.resetAt).toBeUndefined();
  });

  it("handles network errors with graceful degradation", async () => {
    const mockFetch = vi.fn<Parameters<typeof fetch>, ReturnType<typeof fetch>>(async (input) => {
      const url =
        typeof input === "string" ? input : input instanceof URL ? input.toString() : input.url;

      if (url.includes("loadCodeAssist")) {
        throw new Error("Network failure");
      }

      if (url.includes("fetchAvailableModels")) {
        return makeResponse(200, {
          models: {
            "gemini-flash-stable": {
              quotaInfo: { remainingFraction: 0.85 },
            },
          },
        });
      }

      return makeResponse(404, "not found");
    });

    const snapshot = await fetchAntigravityUsage("token-123", 5000, mockFetch);

    expect(snapshot.windows).toHaveLength(1);
    const flashWindow = snapshot.windows[0];
    expect(flashWindow?.label).toBe("gemini-flash-stable");
    expect(flashWindow?.usedPercent).toBeCloseTo(15, 1);
    expect(snapshot.error).toBeUndefined();
  });
});
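The tests above all route responses through a `makeResponse` helper that is defined earlier in the test file, outside this excerpt. As a reading aid only, here is a hypothetical sketch of what such a helper presumably looks like, assuming it wraps the standard `Response` constructor; the real definition may differ:

```typescript
// Hypothetical sketch of the makeResponse helper the tests above call.
// Assumption: a string body is sent as-is, anything else is JSON-encoded.
function makeResponse(status: number, body: unknown): Response {
  const text = typeof body === "string" ? body : JSON.stringify(body);
  return new Response(text, {
    status,
    headers: { "Content-Type": "application/json" },
  });
}

const ok = makeResponse(200, { models: {} });
console.log(ok.status); // 200
```

This shape explains why tests can pass either plain strings ("Forbidden") or objects ({ error: { message: "..." } }) and why the code under test must tolerate non-JSON error bodies.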
284
src/infra/provider-usage.fetch.antigravity.ts
Normal file
@@ -0,0 +1,284 @@
import { logDebug } from "../logger.js";
import { fetchJson } from "./provider-usage.fetch.shared.js";
import { clampPercent, PROVIDER_LABELS } from "./provider-usage.shared.js";
import type { ProviderUsageSnapshot, UsageWindow } from "./provider-usage.types.js";

type LoadCodeAssistResponse = {
  availablePromptCredits?: number | string;
  planInfo?: { monthlyPromptCredits?: number | string };
  planType?: string;
  currentTier?: { id?: string; name?: string };
  cloudaicompanionProject?: string | { id?: string };
};

type FetchAvailableModelsResponse = {
  models?: Record<
    string,
    {
      displayName?: string;
      quotaInfo?: {
        remainingFraction?: number | string;
        resetTime?: string;
        isExhausted?: boolean;
      };
    }
  >;
};

type ModelQuota = {
  remainingFraction: number;
  resetTime?: number;
};

type CreditsInfo = {
  available: number;
  monthly: number;
};

const BASE_URL = "https://cloudcode-pa.googleapis.com";
const LOAD_CODE_ASSIST_PATH = "/v1internal:loadCodeAssist";
const FETCH_AVAILABLE_MODELS_PATH = "/v1internal:fetchAvailableModels";

const METADATA = {
  ideType: "ANTIGRAVITY",
  platform: "PLATFORM_UNSPECIFIED",
  pluginType: "GEMINI",
};

function parseNumber(value: number | string | undefined): number | undefined {
  if (typeof value === "number" && Number.isFinite(value)) return value;
  if (typeof value === "string") {
    const parsed = Number.parseFloat(value);
    if (Number.isFinite(parsed)) return parsed;
  }
  return undefined;
}

function parseEpochMs(isoString: string | undefined): number | undefined {
  if (!isoString?.trim()) return undefined;
  try {
    const ms = Date.parse(isoString);
    if (Number.isFinite(ms)) return ms;
  } catch {
    // ignore parse errors
  }
  return undefined;
}

async function parseErrorMessage(res: Response): Promise<string> {
  try {
    const data = (await res.json()) as { error?: { message?: string } };
    const message = data?.error?.message?.trim();
    if (message) return message;
  } catch {
    // ignore parse errors
  }
  return `HTTP ${res.status}`;
}

function extractCredits(data: LoadCodeAssistResponse): CreditsInfo | undefined {
  const available = parseNumber(data.availablePromptCredits);
  const monthly = parseNumber(data.planInfo?.monthlyPromptCredits);
  if (available === undefined || monthly === undefined || monthly <= 0) return undefined;
  return { available, monthly };
}

function extractPlanInfo(data: LoadCodeAssistResponse): string | undefined {
  const tierName = data.currentTier?.name?.trim();
  if (tierName) return tierName;
  const planType = data.planType?.trim();
  if (planType) return planType;
  return undefined;
}

function extractProjectId(data: LoadCodeAssistResponse): string | undefined {
  const project = data.cloudaicompanionProject;
  if (!project) return undefined;
  if (typeof project === "string") return project.trim() ? project : undefined;
  const projectId = typeof project.id === "string" ? project.id.trim() : undefined;
  return projectId || undefined;
}

function extractModelQuotas(data: FetchAvailableModelsResponse): Map<string, ModelQuota> {
  const result = new Map<string, ModelQuota>();
  if (!data.models || typeof data.models !== "object") return result;

  for (const [modelId, modelInfo] of Object.entries(data.models)) {
    const quotaInfo = modelInfo.quotaInfo;
    if (!quotaInfo) continue;

    const remainingFraction = parseNumber(quotaInfo.remainingFraction);
    if (remainingFraction === undefined) continue;

    const resetTime = parseEpochMs(quotaInfo.resetTime);
    result.set(modelId, { remainingFraction, resetTime });
  }

  return result;
}

function buildUsageWindows(opts: {
  credits?: CreditsInfo;
  modelQuotas?: Map<string, ModelQuota>;
}): UsageWindow[] {
  const windows: UsageWindow[] = [];

  // Credits window (overall)
  if (opts.credits) {
    const { available, monthly } = opts.credits;
    const used = monthly - available;
    const usedPercent = clampPercent((used / monthly) * 100);
    windows.push({ label: "Credits", usedPercent });
  }

  // Individual model windows
  if (opts.modelQuotas && opts.modelQuotas.size > 0) {
    const modelWindows: UsageWindow[] = [];

    for (const [modelId, quota] of opts.modelQuotas) {
      const lowerModelId = modelId.toLowerCase();

      // Skip internal models
      if (lowerModelId.includes("chat_") || lowerModelId.includes("tab_")) {
        continue;
      }

      const usedPercent = clampPercent((1 - quota.remainingFraction) * 100);
      const window: UsageWindow = { label: modelId, usedPercent };
      if (quota.resetTime) window.resetAt = quota.resetTime;
      modelWindows.push(window);
    }

    // Sort by usage (highest first) and take top 10
    modelWindows.sort((a, b) => b.usedPercent - a.usedPercent);
    const topModels = modelWindows.slice(0, 10);
    logDebug(
      `[antigravity] Built ${topModels.length} model windows from ${opts.modelQuotas.size} total models`,
    );
    for (const w of topModels) {
      logDebug(
        `[antigravity] ${w.label}: ${w.usedPercent.toFixed(1)}% used${w.resetAt ? ` (resets at ${new Date(w.resetAt).toISOString()})` : ""}`,
      );
    }
    windows.push(...topModels);
  }

  return windows;
}

export async function fetchAntigravityUsage(
  token: string,
  timeoutMs: number,
  fetchFn: typeof fetch,
): Promise<ProviderUsageSnapshot> {
  const headers: Record<string, string> = {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
    "User-Agent": "antigravity",
    "X-Goog-Api-Client": "google-cloud-sdk vscode_cloudshelleditor/0.1",
  };

  let credits: CreditsInfo | undefined;
  let modelQuotas: Map<string, ModelQuota> | undefined;
  let planInfo: string | undefined;
  let lastError: string | undefined;
  let projectId: string | undefined;

  // Fetch loadCodeAssist (credits + plan info)
  try {
    const res = await fetchJson(
      `${BASE_URL}${LOAD_CODE_ASSIST_PATH}`,
      { method: "POST", headers, body: JSON.stringify({ metadata: METADATA }) },
      timeoutMs,
      fetchFn,
    );

    if (res.ok) {
      const data = (await res.json()) as LoadCodeAssistResponse;

      // Extract project ID for subsequent calls
      projectId = extractProjectId(data);

      credits = extractCredits(data);
      planInfo = extractPlanInfo(data);
      logDebug(
        `[antigravity] Credits: ${credits ? `${credits.available}/${credits.monthly}` : "none"}${planInfo ? ` (plan: ${planInfo})` : ""}`,
      );
    } else {
      lastError = await parseErrorMessage(res);
      // Fatal auth errors - stop early
      if (res.status === 401) {
        return {
          provider: "google-antigravity",
          displayName: PROVIDER_LABELS["google-antigravity"],
          windows: [],
          error: "Token expired",
        };
      }
    }
  } catch {
    lastError = "Network error";
  }

  // Fetch fetchAvailableModels (model quotas)
  if (!projectId) {
    logDebug("[antigravity] Missing project id; requesting available models without project");
  }
  try {
    const body = JSON.stringify(projectId ? { project: projectId } : {});
    const res = await fetchJson(
      `${BASE_URL}${FETCH_AVAILABLE_MODELS_PATH}`,
      { method: "POST", headers, body },
      timeoutMs,
      fetchFn,
    );

    if (res.ok) {
      const data = (await res.json()) as FetchAvailableModelsResponse;
      modelQuotas = extractModelQuotas(data);
      logDebug(`[antigravity] Extracted ${modelQuotas.size} model quotas from API`);
      for (const [modelId, quota] of modelQuotas) {
        logDebug(
          `[antigravity] ${modelId}: ${(quota.remainingFraction * 100).toFixed(1)}% remaining${quota.resetTime ? ` (resets ${new Date(quota.resetTime).toISOString()})` : ""}`,
        );
      }
    } else {
      const err = await parseErrorMessage(res);
      if (res.status === 401) {
        lastError = "Token expired";
      } else if (!lastError) {
        lastError = err;
      }
    }
  } catch {
    if (!lastError) lastError = "Network error";
  }

  // Build windows from available data
  const windows = buildUsageWindows({ credits, modelQuotas });

  // Return error only if we got nothing
  if (windows.length === 0 && lastError) {
    logDebug(`[antigravity] Returning error snapshot: ${lastError}`);
    return {
      provider: "google-antigravity",
      displayName: PROVIDER_LABELS["google-antigravity"],
      windows: [],
      error: lastError,
    };
  }

  const snapshot: ProviderUsageSnapshot = {
    provider: "google-antigravity",
    displayName: PROVIDER_LABELS["google-antigravity"],
    windows,
    plan: planInfo,
  };

  logDebug(
    `[antigravity] Returning snapshot with ${windows.length} windows${planInfo ? ` (plan: ${planInfo})` : ""}`,
  );
  logDebug(`[antigravity] Snapshot: ${JSON.stringify(snapshot, null, 2)}`);

  return snapshot;
}
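As a side note, the percent math that `buildUsageWindows` applies above can be sketched in isolation. This is an illustrative standalone version, not part of the diff; the real `clampPercent` is imported from `provider-usage.shared.js` and is assumed here to clamp into [0, 100]:

```typescript
// Standalone sketch of the usage-percent math used by buildUsageWindows.
// Assumption: clampPercent limits values to the [0, 100] range.
function clampPercent(value: number): number {
  return Math.min(100, Math.max(0, value));
}

// Credits window: percent of the monthly allowance already consumed.
function creditsUsedPercent(available: number, monthly: number): number {
  return clampPercent(((monthly - available) / monthly) * 100);
}

// Model windows: the quota API reports the *remaining* fraction, so invert it.
function modelUsedPercent(remainingFraction: number): number {
  return clampPercent((1 - remainingFraction) * 100);
}

console.log(creditsUsedPercent(600, 1000)); // 40
console.log(modelUsedPercent(0.5)); // 50
```

This also explains why some tests use `toBeCloseTo` rather than `toBe`: `(1 - fraction) * 100` is subject to floating-point rounding for fractions like 0.9.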
@@ -1,3 +1,4 @@
export { fetchAntigravityUsage } from "./provider-usage.fetch.antigravity.js";
export { fetchClaudeUsage } from "./provider-usage.fetch.claude.js";
export { fetchCodexUsage } from "./provider-usage.fetch.codex.js";
export { fetchCopilotUsage } from "./provider-usage.fetch.copilot.js";

@@ -1,5 +1,6 @@
import { type ProviderAuth, resolveProviderAuths } from "./provider-usage.auth.js";
import {
  fetchAntigravityUsage,
  fetchClaudeUsage,
  fetchCodexUsage,
  fetchCopilotUsage,
@@ -57,8 +58,9 @@ export async function loadProviderUsageSummary(
      return await fetchClaudeUsage(auth.token, timeoutMs, fetchFn);
    case "github-copilot":
      return await fetchCopilotUsage(auth.token, timeoutMs, fetchFn);
    case "google-gemini-cli":
    case "google-antigravity":
      return await fetchAntigravityUsage(auth.token, timeoutMs, fetchFn);
    case "google-gemini-cli":
      return await fetchGeminiUsage(auth.token, timeoutMs, fetchFn, auth.provider);
    case "openai-codex":
      return await fetchCodexUsage(auth.token, auth.accountId, timeoutMs, fetchFn);
@@ -339,6 +339,49 @@ export async function convertHeicToJpeg(buffer: Buffer): Promise<Buffer> {
  return await sharp(buffer).jpeg({ quality: 90, mozjpeg: true }).toBuffer();
}

/**
 * Checks if an image has an alpha channel (transparency).
 * Returns true if the image has alpha, false otherwise.
 */
export async function hasAlphaChannel(buffer: Buffer): Promise<boolean> {
  try {
    const sharp = await loadSharp();
    const meta = await sharp(buffer).metadata();
    // Check if the image has an alpha channel.
    // PNG color types with alpha: 4 (grayscale+alpha), 6 (RGBA).
    // Sharp reports this via 'channels' (4 = RGBA) or 'hasAlpha'.
    return meta.hasAlpha === true || meta.channels === 4;
  } catch {
    return false;
  }
}

/**
 * Resizes an image to PNG format, preserving alpha channel (transparency).
 * Uses sharp only (no sips fallback for PNG with alpha).
 */
export async function resizeToPng(params: {
  buffer: Buffer;
  maxSide: number;
  compressionLevel?: number;
  withoutEnlargement?: boolean;
}): Promise<Buffer> {
  const sharp = await loadSharp();
  // Compression level 6 is a good balance (0 = fastest, 9 = smallest)
  const compressionLevel = params.compressionLevel ?? 6;

  return await sharp(params.buffer)
    .rotate() // Auto-rotate based on EXIF if present
    .resize({
      width: params.maxSide,
      height: params.maxSide,
      fit: "inside",
      withoutEnlargement: params.withoutEnlargement !== false,
    })
    .png({ compressionLevel })
    .toBuffer();
}

/**
 * Internal sips-only EXIF normalization (no sharp fallback).
 * Used by resizeToJpeg to normalize before sips resize.

@@ -1,4 +1,4 @@
import { intro, note, outro, select, spinner, text, isCancel } from "@clack/prompts";
import { intro, note, outro, spinner } from "@clack/prompts";

import { ensureAuthProfileStore, upsertAuthProfile } from "../agents/auth-profiles.js";
import { updateConfig } from "../commands/models/shared.js";
@@ -6,22 +6,10 @@ import { applyAuthProfileConfig } from "../commands/onboard-auth.js";
import { logConfigUpdated } from "../config/logging.js";
import type { RuntimeEnv } from "../runtime.js";
import { stylePromptTitle } from "../terminal/prompt-style.js";
import {
  normalizeGithubCopilotDomain,
  resolveGithubCopilotBaseUrl,
  resolveGithubCopilotUserAgent,
} from "./github-copilot-utils.js";

const CLIENT_ID = "Ov23li8tweQw6odWQebz";
const DEFAULT_DOMAIN = "github.com";
const OAUTH_POLLING_SAFETY_MARGIN_MS = 3000;

function getUrls(domain: string) {
  return {
    deviceCodeUrl: `https://${domain}/login/device/code`,
    accessTokenUrl: `https://${domain}/login/oauth/access_token`,
  };
}
const CLIENT_ID = "Iv1.b507a08c87ecfe98";
const DEVICE_CODE_URL = "https://github.com/login/device/code";
const ACCESS_TOKEN_URL = "https://github.com/login/oauth/access_token";

type DeviceCodeResponse = {
  device_code: string;
@@ -50,21 +38,17 @@ function parseJsonResponse<T>(value: unknown): T {
  return value as T;
}

async function requestDeviceCode(params: {
  scope: string;
  domain: string;
}): Promise<DeviceCodeResponse> {
  const body = JSON.stringify({
async function requestDeviceCode(params: { scope: string }): Promise<DeviceCodeResponse> {
  const body = new URLSearchParams({
    client_id: CLIENT_ID,
    scope: params.scope,
  });

  const res = await fetch(getUrls(params.domain).deviceCodeUrl, {
  const res = await fetch(DEVICE_CODE_URL, {
    method: "POST",
    headers: {
      Accept: "application/json",
      "Content-Type": "application/json",
      "User-Agent": resolveGithubCopilotUserAgent(),
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body,
  });
@@ -81,27 +65,24 @@ async function requestDeviceCode(params: {
}

async function pollForAccessToken(params: {
  domain: string;
  deviceCode: string;
  intervalMs: number;
  expiresAt: number;
}): Promise<string> {
  const bodyBase = {
  const bodyBase = new URLSearchParams({
    client_id: CLIENT_ID,
    device_code: params.deviceCode,
    grant_type: "urn:ietf:params:oauth:grant-type:device_code",
  };
  const urls = getUrls(params.domain);
  });

  while (Date.now() < params.expiresAt) {
    const res = await fetch(urls.accessTokenUrl, {
    const res = await fetch(ACCESS_TOKEN_URL, {
      method: "POST",
      headers: {
        Accept: "application/json",
        "Content-Type": "application/json",
        "User-Agent": resolveGithubCopilotUserAgent(),
        "Content-Type": "application/x-www-form-urlencoded",
      },
      body: JSON.stringify(bodyBase),
      body: bodyBase,
    });

    if (!res.ok) {
@@ -115,14 +96,11 @@ async function pollForAccessToken(params: {

    const err = "error" in json ? json.error : "unknown";
    if (err === "authorization_pending") {
      await new Promise((r) => setTimeout(r, params.intervalMs + OAUTH_POLLING_SAFETY_MARGIN_MS));
      await new Promise((r) => setTimeout(r, params.intervalMs));
      continue;
    }
    if (err === "slow_down") {
      const serverInterval =
        "interval" in json && typeof json.interval === "number" ? json.interval : undefined;
      const nextInterval = serverInterval ? serverInterval * 1000 : params.intervalMs + 5000;
      await new Promise((r) => setTimeout(r, nextInterval + OAUTH_POLLING_SAFETY_MARGIN_MS));
      await new Promise((r) => setTimeout(r, params.intervalMs + 2000));
      continue;
    }
    if (err === "expired_token") {
@@ -159,42 +137,9 @@ export async function githubCopilotLoginCommand(
    );
  }

  const deployment = await select({
    message: "Select GitHub deployment type",
    options: [
      { label: "GitHub.com", value: DEFAULT_DOMAIN, hint: "Public" },
      { label: "GitHub Enterprise", value: "enterprise", hint: "Data residency or self-hosted" },
    ],
  });
  if (isCancel(deployment)) {
    throw new Error("GitHub login cancelled");
  }

  let domain = DEFAULT_DOMAIN;
  let enterpriseDomain: string | null = null;
  if (deployment === "enterprise") {
    const enterpriseInput = await text({
      message: "Enter your GitHub Enterprise URL or domain",
      placeholder: "company.ghe.com or https://company.ghe.com",
      validate: (value) => {
        if (!value) return "URL or domain is required";
        return normalizeGithubCopilotDomain(value) ? undefined : "Enter a valid URL or domain";
      },
    });
    if (isCancel(enterpriseInput)) {
      throw new Error("GitHub login cancelled");
    }
    const normalized = normalizeGithubCopilotDomain(enterpriseInput);
    if (!normalized) {
      throw new Error("Invalid GitHub Enterprise URL/domain");
    }
    enterpriseDomain = normalized;
    domain = normalized;
  }

  const spin = spinner();
  spin.start("Requesting device code from GitHub...");
  const device = await requestDeviceCode({ scope: "read:user", domain });
  const device = await requestDeviceCode({ scope: "read:user" });
  spin.stop("Device code ready");

  note(
@@ -208,7 +153,6 @@ export async function githubCopilotLoginCommand(
|
||||
const polling = spinner();
|
||||
polling.start("Waiting for GitHub authorization...");
|
||||
const accessToken = await pollForAccessToken({
|
||||
domain,
|
||||
deviceCode: device.device_code,
|
||||
intervalMs,
|
||||
expiresAt,
|
||||
@@ -218,13 +162,11 @@ export async function githubCopilotLoginCommand(
|
||||
upsertAuthProfile({
|
||||
profileId,
|
||||
credential: {
|
||||
type: "oauth",
|
||||
type: "token",
|
||||
provider: "github-copilot",
|
||||
refresh: accessToken,
|
||||
access: accessToken,
|
||||
// Copilot access tokens are treated as non-expiring (see resolveApiKeyForProfile).
|
||||
expires: 0,
|
||||
enterpriseUrl: enterpriseDomain ?? undefined,
|
||||
token: accessToken,
|
||||
// GitHub device flow token doesn't reliably include expiry here.
|
||||
// Leave expires unset; we'll exchange into Copilot token plus expiry later.
|
||||
},
|
||||
});
|
||||
|
||||
@@ -232,13 +174,12 @@ export async function githubCopilotLoginCommand(
|
||||
applyAuthProfileConfig(cfg, {
|
||||
provider: "github-copilot",
|
||||
profileId,
|
||||
mode: "oauth",
|
||||
mode: "token",
|
||||
}),
|
||||
);
|
||||
|
||||
logConfigUpdated(runtime);
|
||||
runtime.log(`Auth profile: ${profileId} (github-copilot/oauth)`);
|
||||
runtime.log(`Base URL: ${resolveGithubCopilotBaseUrl(enterpriseDomain ?? undefined)}`);
|
||||
runtime.log(`Auth profile: ${profileId} (github-copilot/token)`);
|
||||
|
||||
outro("Done");
|
||||
}
|
||||
|
||||
@@ -2,7 +2,6 @@ import path from "node:path";

import { resolveStateDir } from "../config/paths.js";
import { loadJsonFile, saveJsonFile } from "../infra/json-file.js";
import { DEFAULT_GITHUB_COPILOT_BASE_URL } from "./github-copilot-utils.js";

const COPILOT_TOKEN_URL = "https://api.github.com/copilot_internal/v2/token";

@@ -54,7 +53,7 @@ function parseCopilotTokenResponse(value: unknown): {
  return { token, expiresAt: expiresAtMs };
}

export const DEFAULT_COPILOT_API_BASE_URL = DEFAULT_GITHUB_COPILOT_BASE_URL;
export const DEFAULT_COPILOT_API_BASE_URL = "https://api.individual.githubcopilot.com";

export function deriveCopilotApiBaseUrlFromToken(token: string): string | null {
  const trimmed = token.trim();

@@ -1,24 +0,0 @@
export const DEFAULT_GITHUB_COPILOT_BASE_URL = "https://api.githubcopilot.com";

export function resolveGithubCopilotUserAgent(): string {
  const version = process.env.CLAWDBOT_VERSION ?? process.env.npm_package_version ?? "unknown";
  return `clawdbot/${version}`;
}

export function normalizeGithubCopilotDomain(input: string | null | undefined): string | null {
  const trimmed = (input ?? "").trim();
  if (!trimmed) return null;
  try {
    const url = trimmed.includes("://") ? new URL(trimmed) : new URL(`https://${trimmed}`);
    return url.hostname;
  } catch {
    return null;
  }
}

export function resolveGithubCopilotBaseUrl(enterpriseDomain?: string | null): string {
  if (enterpriseDomain && enterpriseDomain.trim()) {
    return `https://copilot-api.${enterpriseDomain.trim()}`;
  }
  return DEFAULT_GITHUB_COPILOT_BASE_URL;
}
@@ -5,10 +5,20 @@ import path from "node:path";
import sharp from "sharp";
import { afterEach, describe, expect, it, vi } from "vitest";

import { loadWebMedia } from "./media.js";
import { loadWebMedia, optimizeImageToJpeg, optimizeImageToPng } from "./media.js";

const tmpFiles: string[] = [];

function buildDeterministicBytes(length: number): Buffer {
  const buffer = Buffer.allocUnsafe(length);
  let seed = 0x12345678;
  for (let i = 0; i < length; i++) {
    seed = (1103515245 * seed + 12345) & 0x7fffffff;
    buffer[i] = seed & 0xff;
  }
  return buffer;
}

afterEach(async () => {
  await Promise.all(tmpFiles.map((file) => fs.rm(file, { force: true })));
  tmpFiles.length = 0;
@@ -185,4 +195,69 @@ describe("web media loading", () => {

    fetchMock.mockRestore();
  });

it("preserves PNG alpha when under the cap", async () => {
|
||||
const buffer = await sharp({
|
||||
create: {
|
||||
width: 64,
|
||||
height: 64,
|
||||
channels: 4,
|
||||
background: { r: 255, g: 0, b: 0, alpha: 0.5 },
|
||||
},
|
||||
})
|
||||
.png()
|
||||
.toBuffer();
|
||||
|
||||
const file = path.join(os.tmpdir(), `clawdbot-media-${Date.now()}.png`);
|
||||
tmpFiles.push(file);
|
||||
await fs.writeFile(file, buffer);
|
||||
|
||||
const result = await loadWebMedia(file, 1024 * 1024);
|
||||
|
||||
expect(result.kind).toBe("image");
|
||||
expect(result.contentType).toBe("image/png");
|
||||
const meta = await sharp(result.buffer).metadata();
|
||||
expect(meta.hasAlpha).toBe(true);
|
||||
});
|
||||
|
||||
it("falls back to JPEG when PNG alpha cannot fit under cap", async () => {
|
||||
const sizes = [512, 768, 1024];
|
||||
let pngBuffer: Buffer | null = null;
|
||||
let smallestPng: Awaited<ReturnType<typeof optimizeImageToPng>> | null = null;
|
||||
let jpegOptimized: Awaited<ReturnType<typeof optimizeImageToJpeg>> | null = null;
|
||||
let cap = 0;
|
||||
|
||||
for (const size of sizes) {
|
||||
const raw = buildDeterministicBytes(size * size * 4);
|
||||
pngBuffer = await sharp(raw, { raw: { width: size, height: size, channels: 4 } })
|
||||
.png()
|
||||
.toBuffer();
|
||||
smallestPng = await optimizeImageToPng(pngBuffer, 1);
|
||||
cap = Math.max(1, smallestPng.optimizedSize - 1);
|
||||
jpegOptimized = await optimizeImageToJpeg(pngBuffer, cap);
|
||||
if (jpegOptimized.buffer.length < smallestPng.optimizedSize) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (!pngBuffer || !smallestPng || !jpegOptimized) {
|
||||
throw new Error("PNG fallback setup failed");
|
||||
}
|
||||
|
||||
if (jpegOptimized.buffer.length >= smallestPng.optimizedSize) {
|
||||
throw new Error(
|
||||
`JPEG fallback did not shrink below PNG (jpeg=${jpegOptimized.buffer.length}, png=${smallestPng.optimizedSize})`,
|
||||
);
|
||||
}
|
||||
|
||||
const file = path.join(os.tmpdir(), `clawdbot-media-${Date.now()}-alpha.png`);
|
||||
tmpFiles.push(file);
|
||||
await fs.writeFile(file, pngBuffer);
|
||||
|
||||
const result = await loadWebMedia(file, cap);
|
||||
|
||||
expect(result.kind).toBe("image");
|
||||
expect(result.contentType).toBe("image/jpeg");
|
||||
expect(result.buffer.length).toBeLessThanOrEqual(cap);
|
||||
});
|
||||
});
|
||||
|
||||
138 src/web/media.ts
@@ -6,7 +6,12 @@ import { logVerbose, shouldLogVerbose } from "../globals.js";
import { type MediaKind, maxBytesForKind, mediaKindFromMime } from "../media/constants.js";
import { resolveUserPath } from "../utils.js";
import { fetchRemoteMedia } from "../media/fetch.js";
import { convertHeicToJpeg, resizeToJpeg } from "../media/image-ops.js";
import {
  convertHeicToJpeg,
  hasAlphaChannel,
  resizeToJpeg,
  resizeToPng,
} from "../media/image-ops.js";
import { detectMime, extensionForMime } from "../media/mime.js";

export type WebMediaResult = {

@@ -61,27 +66,59 @@ async function loadWebMediaInternal(
  meta?: { contentType?: string; fileName?: string },
) => {
  const originalSize = buffer.length;
  const optimized = await optimizeImageToJpeg(buffer, cap, meta);
  const fileName = meta && isHeicSource(meta) ? toJpegFileName(meta.fileName) : meta?.fileName;
  if (optimized.optimizedSize < originalSize && shouldLogVerbose()) {
    logVerbose(
      `Optimized media from ${(originalSize / (1024 * 1024)).toFixed(2)}MB to ${(optimized.optimizedSize / (1024 * 1024)).toFixed(2)}MB (side≤${optimized.resizeSide}px, q=${optimized.quality})`,
    );
  }
  if (optimized.buffer.length > cap) {
    throw new Error(
      `Media could not be reduced below ${(cap / (1024 * 1024)).toFixed(0)}MB (got ${(
        optimized.buffer.length /
        (1024 * 1024)
      ).toFixed(2)}MB)`,
    );
  }
  return {
    buffer: optimized.buffer,
    contentType: "image/jpeg",
    kind: "image" as const,
    fileName,

  const optimizeToJpeg = async () => {
    const optimized = await optimizeImageToJpeg(buffer, cap, meta);
    const fileName = meta && isHeicSource(meta) ? toJpegFileName(meta.fileName) : meta?.fileName;
    if (optimized.optimizedSize < originalSize && shouldLogVerbose()) {
      logVerbose(
        `Optimized media from ${(originalSize / (1024 * 1024)).toFixed(2)}MB to ${(optimized.optimizedSize / (1024 * 1024)).toFixed(2)}MB (side≤${optimized.resizeSide}px, q=${optimized.quality})`,
      );
    }
    if (optimized.buffer.length > cap) {
      throw new Error(
        `Media could not be reduced below ${(cap / (1024 * 1024)).toFixed(0)}MB (got ${(
          optimized.buffer.length /
          (1024 * 1024)
        ).toFixed(2)}MB)`,
      );
    }
    return {
      buffer: optimized.buffer,
      contentType: "image/jpeg",
      kind: "image" as const,
      fileName,
    };
  };

  // Check if this is a PNG with alpha channel - preserve transparency when possible
  const isPng =
    meta?.contentType === "image/png" || meta?.fileName?.toLowerCase().endsWith(".png");
  const hasAlpha = isPng && (await hasAlphaChannel(buffer));

  if (hasAlpha) {
    const optimized = await optimizeImageToPng(buffer, cap);
    if (optimized.buffer.length <= cap) {
      if (optimized.optimizedSize < originalSize && shouldLogVerbose()) {
        logVerbose(
          `Optimized PNG (preserving alpha) from ${(originalSize / (1024 * 1024)).toFixed(2)}MB to ${(optimized.optimizedSize / (1024 * 1024)).toFixed(2)}MB (side≤${optimized.resizeSide}px)`,
        );
      }
      return {
        buffer: optimized.buffer,
        contentType: "image/png",
        kind: "image" as const,
        fileName: meta?.fileName,
      };
    }
    if (shouldLogVerbose()) {
      logVerbose(
        `PNG with alpha still exceeds ${(cap / (1024 * 1024)).toFixed(0)}MB after optimization; falling back to JPEG`,
      );
    }
  }

  return await optimizeToJpeg();
};

const clampAndFinalize = async (params: {
@@ -246,3 +283,62 @@ export async function optimizeImageToJpeg(

  throw new Error("Failed to optimize image");
}

export async function optimizeImageToPng(
  buffer: Buffer,
  maxBytes: number,
): Promise<{
  buffer: Buffer;
  optimizedSize: number;
  resizeSide: number;
  compressionLevel: number;
}> {
  // Try a grid of sizes/compression levels until under the limit.
  // PNG uses compression levels 0-9 (higher = smaller but slower)
  const sides = [2048, 1536, 1280, 1024, 800];
  const compressionLevels = [6, 7, 8, 9];
  let smallest: {
    buffer: Buffer;
    size: number;
    resizeSide: number;
    compressionLevel: number;
  } | null = null;

  for (const side of sides) {
    for (const compressionLevel of compressionLevels) {
      try {
        const out = await resizeToPng({
          buffer,
          maxSide: side,
          compressionLevel,
          withoutEnlargement: true,
        });
        const size = out.length;
        if (!smallest || size < smallest.size) {
          smallest = { buffer: out, size, resizeSide: side, compressionLevel };
        }
        if (size <= maxBytes) {
          return {
            buffer: out,
            optimizedSize: size,
            resizeSide: side,
            compressionLevel,
          };
        }
      } catch {
        // Continue trying other size/compression combinations
      }
    }
  }

  if (smallest) {
    return {
      buffer: smallest.buffer,
      optimizedSize: smallest.size,
      resizeSide: smallest.resizeSide,
      compressionLevel: smallest.compressionLevel,
    };
  }

  throw new Error("Failed to optimize PNG image");
}

@@ -299,99 +299,83 @@ export async function finalizeOnboardingWizard(options: FinalizeOnboardingOptions)
      ].join("\n"),
      "Start TUI (best option!)",
    );
    await prompter.note(
      [
        "Gateway token: shared auth for the Gateway + Control UI.",
        "Stored in: ~/.clawdbot/clawdbot.json (gateway.auth.token) or CLAWDBOT_GATEWAY_TOKEN.",
        "Web UI stores a copy in this browser's localStorage (clawdbot.control.settings.v1).",
        `Get the tokenized link anytime: ${formatCliCommand("clawdbot dashboard --no-open")}`,
      ].join("\n"),
      "Token",
    );
  }

  hatchChoice = (await prompter.select({
    message: "How do you want to hatch your bot?",
    options: [
      { value: "tui", label: "Hatch in TUI (recommended)" },
      { value: "web", label: "Open the Web UI" },
      { value: "later", label: "Do this later" },
    ],
    initialValue: "tui",
  })) as "tui" | "web" | "later";
  await prompter.note(
    [
      "Gateway token: shared auth for the Gateway + Control UI.",
      "Stored in: ~/.clawdbot/clawdbot.json (gateway.auth.token) or CLAWDBOT_GATEWAY_TOKEN.",
      "Web UI stores a copy in this browser's localStorage (clawdbot.control.settings.v1).",
      `Get the tokenized link anytime: ${formatCliCommand("clawdbot dashboard --no-open")}`,
    ].join("\n"),
    "Token",
  );

  if (hatchChoice === "tui") {
    await runTui({
      url: links.wsUrl,
      token: settings.authMode === "token" ? settings.gatewayToken : undefined,
      password: settings.authMode === "password" ? nextConfig.gateway?.auth?.password : "",
      // Safety: onboarding TUI should not auto-deliver to lastProvider/lastTo.
      deliver: false,
      message: "Wake up, my friend!",
    });
    if (settings.authMode === "token" && settings.gatewayToken) {
      seededInBackground = await openUrlInBackground(authedUrl);
    }
    if (seededInBackground) {
      await prompter.note(
        `Web UI seeded in the background. Open later with: ${formatCliCommand(
          "clawdbot dashboard --no-open",
        )}`,
        "Web UI",
      );
    }
  } else if (hatchChoice === "web") {
    const browserSupport = await detectBrowserOpenSupport();
    if (browserSupport.ok) {
      controlUiOpened = await openUrl(authedUrl);
      if (!controlUiOpened) {
        controlUiOpenHint = formatControlUiSshHint({
          port: settings.port,
          basePath: controlUiBasePath,
          token: settings.gatewayToken,
        });
      }
    } else {
  hatchChoice = (await prompter.select({
    message: "How do you want to hatch your bot?",
    options: [
      { value: "tui", label: "Hatch in TUI (recommended)" },
      { value: "web", label: "Open the Web UI" },
      { value: "later", label: "Do this later" },
    ],
    initialValue: "tui",
  })) as "tui" | "web" | "later";

  if (hatchChoice === "tui") {
    await runTui({
      url: links.wsUrl,
      token: settings.authMode === "token" ? settings.gatewayToken : undefined,
      password: settings.authMode === "password" ? nextConfig.gateway?.auth?.password : "",
      // Safety: onboarding TUI should not auto-deliver to lastProvider/lastTo.
      deliver: false,
      message: hasBootstrap ? "Wake up, my friend!" : undefined,
    });
    if (settings.authMode === "token" && settings.gatewayToken) {
      seededInBackground = await openUrlInBackground(authedUrl);
    }
    if (seededInBackground) {
      await prompter.note(
        `Web UI seeded in the background. Open later with: ${formatCliCommand(
          "clawdbot dashboard --no-open",
        )}`,
        "Web UI",
      );
    }
  } else if (hatchChoice === "web") {
    const browserSupport = await detectBrowserOpenSupport();
    if (browserSupport.ok) {
      controlUiOpened = await openUrl(authedUrl);
      if (!controlUiOpened) {
        controlUiOpenHint = formatControlUiSshHint({
          port: settings.port,
          basePath: controlUiBasePath,
          token: settings.gatewayToken,
        });
      }
      await prompter.note(
        [
          `Dashboard link (with token): ${authedUrl}`,
          controlUiOpened
            ? "Opened in your browser. Keep that tab to control Clawdbot."
            : "Copy/paste this URL in a browser on this machine to control Clawdbot.",
          controlUiOpenHint,
        ]
          .filter(Boolean)
          .join("\n"),
        "Dashboard ready",
      );
    } else {
      await prompter.note(
        `When you're ready: ${formatCliCommand("clawdbot dashboard --no-open")}`,
        "Later",
      );
      controlUiOpenHint = formatControlUiSshHint({
        port: settings.port,
        basePath: controlUiBasePath,
        token: settings.gatewayToken,
      });
    }
    await prompter.note(
      [
        `Dashboard link (with token): ${authedUrl}`,
        controlUiOpened
          ? "Opened in your browser. Keep that tab to control Clawdbot."
          : "Copy/paste this URL in a browser on this machine to control Clawdbot.",
        controlUiOpenHint,
      ]
        .filter(Boolean)
        .join("\n"),
      "Dashboard ready",
    );
  } else {
    const browserSupport = await detectBrowserOpenSupport();
    if (!browserSupport.ok) {
      await prompter.note(
        formatControlUiSshHint({
          port: settings.port,
          basePath: controlUiBasePath,
          token: settings.authMode === "token" ? settings.gatewayToken : undefined,
        }),
        "Open Control UI",
      );
    } else {
      await prompter.note(
        "Opening Control UI automatically after onboarding (no extra prompts).",
        "Open Control UI",
      );
    }
    await prompter.note(
      `When you're ready: ${formatCliCommand("clawdbot dashboard --no-open")}`,
      "Later",
    );
  }
} else if (opts.skipUi) {
  await prompter.note("Skipping Control UI/TUI prompts.", "Control UI");

@@ -227,6 +227,61 @@ describe("runOnboardingWizard", () => {
    await fs.rm(workspaceDir, { recursive: true, force: true });
  });

  it("offers TUI hatch even without BOOTSTRAP.md", async () => {
    runTui.mockClear();

    const workspaceDir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-onboard-"));

    const select: WizardPrompter["select"] = vi.fn(async (opts) => {
      if (opts.message === "How do you want to hatch your bot?") return "tui";
      return "quickstart";
    });

    const prompter: WizardPrompter = {
      intro: vi.fn(async () => {}),
      outro: vi.fn(async () => {}),
      note: vi.fn(async () => {}),
      select,
      multiselect: vi.fn(async () => []),
      text: vi.fn(async () => ""),
      confirm: vi.fn(async () => false),
      progress: vi.fn(() => ({ update: vi.fn(), stop: vi.fn() })),
    };

    const runtime: RuntimeEnv = {
      log: vi.fn(),
      error: vi.fn(),
      exit: vi.fn((code: number) => {
        throw new Error(`exit:${code}`);
      }),
    };

    await runOnboardingWizard(
      {
        acceptRisk: true,
        flow: "quickstart",
        mode: "local",
        workspace: workspaceDir,
        authChoice: "skip",
        skipProviders: true,
        skipSkills: true,
        skipHealth: true,
        installDaemon: false,
      },
      runtime,
      prompter,
    );

    expect(runTui).toHaveBeenCalledWith(
      expect.objectContaining({
        deliver: false,
        message: undefined,
      }),
    );

    await fs.rm(workspaceDir, { recursive: true, force: true });
  });

  it("shows the web search hint at the end of onboarding", async () => {
    const prevBraveKey = process.env.BRAVE_API_KEY;
    delete process.env.BRAVE_API_KEY;

@@ -356,6 +356,10 @@ export async function runOnboardingWizard(
      prompter,
      runtime,
      setDefaultModel: true,
      opts: {
        tokenProvider: opts.tokenProvider,
        token: opts.authChoice === "apiKey" && opts.token ? opts.token : undefined,
      },
    });
    nextConfig = authResult.config;

15 vitest.gateway.config.ts (new file)
@@ -0,0 +1,15 @@
import { defineConfig, mergeConfig } from "vitest/config";
import baseConfig from "./vitest.config.ts";

const baseTest = (baseConfig as { test?: { exclude?: string[] } }).test ?? {};
const exclude = baseTest.exclude ?? [];

export default mergeConfig(
  baseConfig,
  defineConfig({
    test: {
      include: ["src/gateway/**/*.test.ts", "extensions/**/*.test.ts"],
      exclude,
    },
  }),
);
20 vitest.unit.config.ts (new file)
@@ -0,0 +1,20 @@
import { defineConfig, mergeConfig } from "vitest/config";
import baseConfig from "./vitest.config.ts";

const baseTest = (baseConfig as { test?: { include?: string[]; exclude?: string[] } }).test ?? {};
const include = baseTest.include ?? [
  "src/**/*.test.ts",
  "extensions/**/*.test.ts",
  "test/format-error.test.ts",
];
const exclude = baseTest.exclude ?? [];

export default mergeConfig(
  baseConfig,
  defineConfig({
    test: {
      include,
      exclude: [...exclude, "src/gateway/**", "extensions/**"],
    },
  }),
);