Using BB in Local-Only Mode

BB offers a Local-Only mode that gives you greater privacy and control over your data. This guide explains what Local-Only mode is, its benefits, and how to configure it properly.

What is Local-Only Mode?

Local-Only mode lets you use BB without relying on Beyond Better's cloud infrastructure. It has two independent components:

Local-Only API Mode

When enabled, BB will communicate directly with LLM providers (like Anthropic) rather than routing through Beyond Better's cloud proxy. This means:

  • Your conversations are sent directly from your device to the LLM provider
  • You'll need to supply your own API key from the provider
  • No conversation data passes through Beyond Better's servers

Local-Only Browser Interface (BUI) Mode

When enabled, you can use BB's interface without creating a Beyond Better account. This means:

  • No sign-up or login required
  • Your usage data remains completely on your device
  • Your conversation history is stored locally only

Setting Up Local-Only Mode

Step 1: Obtain an Anthropic API Key

To use Local-Only API mode, you'll need your own Anthropic API key:

  1. Visit https://console.anthropic.com/settings/keys
  2. Sign up or log in to your Anthropic account
  3. Create a new API key
  4. Copy the API key (you won't be able to see it again)

Important: Keep your API key secure and never share it. Your API key will be stored locally and used only for direct communication with Anthropic.
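
If you'd like to confirm the key works before entering it in BB, you can send a minimal request to Anthropic's Messages API from a terminal. This check is optional and happens entirely outside BB; the model name below is only an example and may change as Anthropic releases new models:

# Export your key first, or paste it in place of $ANTHROPIC_API_KEY
curl https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 32,
    "messages": [{"role": "user", "content": "Say hello"}]
  }'

A normal JSON response confirms the key is valid; an authentication error means it was copied incorrectly or has been revoked.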

Step 2: Configure Local-Only Mode in Settings

Both Local-Only modes can be configured in the BB desktop application:

  1. Open the BB desktop application
  2. Click on the Settings button in the main window
  3. Toggle "Local Mode" under API Settings to communicate directly with LLM providers
  4. Toggle "Local Mode" under BUI Settings to bypass the need for a BB account
  5. In the "API Keys" section, enter your Anthropic API key
  6. Click "Save Changes"

Pro Tip: You can enable both modes together for a completely local experience, or use them independently based on your needs.

Using Local-Only Models with Ollama

For advanced users, BB also supports using completely local models through Ollama. This allows you to run models directly on your machine without any external API calls.

Prerequisites:

  • Install Ollama on your system
  • Download the models you want to use (e.g., ollama pull deepseek-r1:14b)
  • Ensure Ollama is running when you use BB (a quick way to check is shown below)
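
To confirm these prerequisites are met, you can run a couple of quick checks from a terminal (Ollama's server listens on port 11434 by default):

# List the models you have already pulled
ollama list

# Confirm the Ollama server is answering on its default port
curl http://localhost:11434/api/tags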

Configuration Steps:

Currently, using local models requires modifying the global configuration file:

For Mac/Linux: ~/.config/bb/config.yaml

For Windows: %APPDATA%\bb\config.yaml

Add or modify the defaultModels section in the file:

defaultModels:
  orchestrator: deepseek-r1:14b
  agent: command-r-plus
  chat: mistral-nemo
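
Each model named under defaultModels should correspond to a model you have already pulled in Ollama. For the example configuration above, that means:

ollama pull deepseek-r1:14b
ollama pull command-r-plus
ollama pull mistral-nemo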

Note: Local model support is an advanced feature. Performance may vary based on your hardware and the selected models. This feature will be integrated into the Settings UI in a future update.

Benefits and Considerations

Benefits

  • Enhanced privacy and data control
  • No Beyond Better account required
  • Direct communication with LLM providers
  • Ability to use your own API credits
  • Option for completely offline operation with Ollama

Considerations

  • Requires managing your own API keys
  • API usage costs billed directly to your provider account
  • No cloud synchronization of conversations
  • Local models may require powerful hardware
  • Some advanced features may require cloud connection

Next Steps