Configuration

This guide covers how to configure NebulaFlow for your environment, workspace, and individual workflows. Configuration includes environment variables, VS Code settings, workspace-specific settings, and node-level settings.

Environment Variables

Environment variables are used for API keys and debugging flags. They can be set in your shell, in a .env file in your workspace root, or in your system environment.

Required for LLM Nodes

AMP_API_KEY must be set for LLM nodes to run. OPENROUTER_API_KEY is optional and is only needed if you use OpenRouter models.

Optional Debugging & Behavior Flags

Setting Environment Variables

Linux/macOS (temporary):

export AMP_API_KEY="your_amp_api_key_here"
export OPENROUTER_API_KEY="your_openrouter_api_key_here"  # optional

Linux/macOS (permanent): Add the lines to your shell profile file (e.g., ~/.bashrc, ~/.zshrc, ~/.profile):

export AMP_API_KEY="your_amp_api_key_here"
export OPENROUTER_API_KEY="your_openrouter_api_key_here"

Windows (PowerShell):

$env:AMP_API_KEY="your_amp_api_key_here"
$env:OPENROUTER_API_KEY="your_openrouter_api_key_here"

Windows (Command Prompt):

set AMP_API_KEY=your_amp_api_key_here
set OPENROUTER_API_KEY=your_openrouter_api_key_here

Note: Environment variables set in a terminal session only apply to that session. For permanent changes, set them system-wide or use a .env file in your workspace (see below).

Using a .env File

You can also place a .env file in your workspace root (or the first workspace folder). NebulaFlow will read environment variables from this file when the extension activates.

Create a file named .env in your workspace:

AMP_API_KEY=your_amp_api_key_here
OPENROUTER_API_KEY=your_openrouter_api_key_here
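
The exact parsing and precedence rules are internal to the extension; the sketch below only illustrates the general idea (the loadDotEnv function and its skip-if-already-set behavior are assumptions, not NebulaFlow code):

import * as fs from "fs";
import * as path from "path";

// Illustrative sketch only: read KEY=value pairs from a workspace .env file
// into process.env without overriding variables already set in the shell.
function loadDotEnv(workspaceRoot: string): void {
  const envPath = path.join(workspaceRoot, ".env");
  if (!fs.existsSync(envPath)) {
    return;
  }
  for (const line of fs.readFileSync(envPath, "utf8").split(/\r?\n/)) {
    const match = line.match(/^\s*([A-Za-z_][A-Za-z0-9_]*)\s*=\s*(.*)$/);
    if (!match) {
      continue; // skip blank lines and comments
    }
    const [, key, value] = match;
    if (process.env[key] === undefined) {
      process.env[key] = value.trim().replace(/^["']|["']$/g, "");
    }
  }
}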

VS Code Settings

NebulaFlow provides VS Code configuration options that can be set in the Settings UI (Ctrl+, or Cmd+,) or in .vscode/settings.json.

nebulaFlow.storageScope

Controls whether NebulaFlow stores workflows per workspace or in a global location.

nebulaFlow.globalStoragePath

Overrides the default global storage location with a custom path.

Example .vscode/settings.json

{
  "nebulaFlow.storageScope": "workspace",
  "nebulaFlow.globalStoragePath": "/path/to/custom/storage"
}
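
If you want to check how these values resolve programmatically (for example from another extension or a debugging script), they can be read with the standard VS Code configuration API. The snippet below is a plain usage sketch, not part of NebulaFlow:

import * as vscode from "vscode";

// Read the NebulaFlow settings documented above with the standard VS Code API.
const config = vscode.workspace.getConfiguration("nebulaFlow");
const storageScope = config.get<string>("storageScope");
const globalStoragePath = config.get<string>("globalStoragePath");
console.log(`storageScope=${storageScope}, globalStoragePath=${globalStoragePath}`);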

Workspace Configuration

You can configure LLM settings per workspace by creating a .nebulaflow/settings.json file in your workspace root. This file can contain Amp SDK settings, model configurations, and more.

File Location

The file must be placed in the first workspace folder at .nebulaflow/settings.json.

Structure

The file must contain a nebulaflow.settings object that maps directly to Amp SDK settings keys.

{
  "nebulaflow": {
    "settings": {
      "openrouter.key": "sk-or-...",
      "internal.primaryModel": "openrouter/xiaomi/mimo-v2-flash:free"
    }
  }
}
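
Resolving the settings amounts to reading that file from the first workspace folder and taking the nested nebulaflow.settings object, roughly as in this sketch (readWorkspaceSettings is an illustrative helper, not a NebulaFlow API):

import * as fs from "fs";
import * as path from "path";

// Illustrative sketch: load the nebulaflow.settings object from
// .nebulaflow/settings.json in the first workspace folder.
function readWorkspaceSettings(firstWorkspaceFolder: string): Record<string, unknown> {
  const settingsPath = path.join(firstWorkspaceFolder, ".nebulaflow", "settings.json");
  if (!fs.existsSync(settingsPath)) {
    return {};
  }
  const parsed = JSON.parse(fs.readFileSync(settingsPath, "utf8"));
  return parsed?.nebulaflow?.settings ?? {};
}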

Common Settings Keys

Keys used in the examples in this guide include openrouter.key (your OpenRouter API key), internal.primaryModel (the workspace default model for LLM nodes), and openrouter.models (custom OpenRouter model definitions).

Example: OpenRouter Models Configuration

{
  "nebulaflow": {
    "settings": {
      "openrouter.models": [
        {
          "model": "openrouter/z-ai/glm-4.7-flash",
          "provider": "z-ai",
          "maxOutputTokens": 131000,
          "contextWindow": 200000,
          "temperature": 0.5
        },
        {
          "model": "openrouter/openai/gpt-5.2-codex",
          "provider": "openai",
          "isReasoning": true,
          "reasoning_effort": "medium",
          "maxOutputTokens": 128000,
          "contextWindow": 400000
        }
      ]
    }
  }
}
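
Each entry in openrouter.models uses the fields shown above. The interface below is a descriptive sketch inferred from the example, not an official Amp SDK type:

// Descriptive sketch of an openrouter.models entry, inferred from the example above.
interface OpenRouterModelEntry {
  model: string;              // full model identifier, e.g. "openrouter/z-ai/glm-4.7-flash"
  provider: string;           // upstream provider name, e.g. "z-ai" or "openai"
  maxOutputTokens: number;    // maximum tokens the model may generate
  contextWindow: number;      // total context size in tokens
  temperature?: number;       // optional sampling temperature
  isReasoning?: boolean;      // set for reasoning-capable models
  reasoning_effort?: string;  // e.g. "medium"; used together with isReasoning
}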

Model Selection Priority

  1. Node-level model: Selected in the Property Editor (Model combobox).
  2. Workspace default: internal.primaryModel in .nebulaflow/settings.json.
  3. Built-in default: openai/gpt-5.1 (if neither of the above is set).

The Model combobox is populated from the Amp SDK listModels() API and also includes the configured workspace default and any OpenRouter models defined in openrouter.models.
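
In code terms, the order above is a simple fallback chain. The resolveModel function and its parameters below are illustrative names, not NebulaFlow APIs:

// Illustrative sketch of the model resolution order described above.
function resolveModel(
  nodeModel: string | undefined,               // the node's Model combobox value, if any
  workspaceSettings: Record<string, unknown>   // contents of nebulaflow.settings
): string {
  if (nodeModel) {
    return nodeModel;                                      // 1. node-level model
  }
  const workspaceDefault = workspaceSettings["internal.primaryModel"];
  if (typeof workspaceDefault === "string" && workspaceDefault) {
    return workspaceDefault;                               // 2. workspace default
  }
  return "openai/gpt-5.1";                                 // 3. built-in default
}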

Node Configuration

Each node type has specific configuration options. These are set in the Property Editor when a node is selected.

LLM Node

CLI Node

Text Node

Variable Node

Accumulator Node

If/Else Node

Loop Start Node

Loop End Node

Preview Node

Subflow Node

Approval System

CLI nodes (and LLM tool calls) can require approval before execution. This is controlled by the Safety and Approval settings.

When approval is required, the Right Sidebar shows a preview of the command/script and a structured summary (Mode, Shell, Safety, Stdin, Flags). You can approve or reject.
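
The structured summary corresponds to a small record of those fields. The type below is purely illustrative, including the assumed field shapes:

// Illustrative sketch of the summary fields shown in the Right Sidebar.
interface ApprovalSummary {
  mode: string;     // "Mode" entry
  shell: string;    // "Shell" entry
  safety: string;   // "Safety" entry
  stdin: string;    // "Stdin" entry
  flags: string[];  // "Flags" entry (shape assumed)
}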

Storage and Persistence

The storage location is determined by nebulaFlow.storageScope and nebulaFlow.globalStoragePath.
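
Conceptually the resolution looks like the sketch below; the exact directory layout (including the .nebulaflow folder used here) is an assumption, not NebulaFlow's actual implementation:

import * as path from "path";
import * as vscode from "vscode";

// Illustrative sketch of how the storage directory could be derived from the
// two settings documented above.
function resolveStorageDir(context: vscode.ExtensionContext): string {
  const config = vscode.workspace.getConfiguration("nebulaFlow");
  const scope = config.get<string>("storageScope");
  const customGlobalPath = config.get<string>("globalStoragePath");

  const folders = vscode.workspace.workspaceFolders;
  if (scope === "workspace" && folders && folders.length > 0) {
    // Assumed: workspace-scoped storage lives inside the first workspace folder.
    return path.join(folders[0].uri.fsPath, ".nebulaflow");
  }
  // Otherwise use the custom global path if set, or VS Code's global storage.
  return customGlobalPath || context.globalStorageUri.fsPath;
}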

Troubleshooting Configuration

“AMP_API_KEY is not set” error

Make sure AMP_API_KEY is set in your shell environment or in a .env file in your workspace root (see Environment Variables above).

LLM node fails with model errors

Check that the selected model is available in the Model combobox and that any OpenRouter models you rely on are defined under openrouter.models (see Workspace Configuration above).

CLI node fails to execute commands

Workflow not saving/loading

Check nebulaFlow.storageScope and nebulaFlow.globalStoragePath (see Storage and Persistence above).

Next Steps