Quick Start

Get your first prompt running in Scope in under 5 minutes. This guide walks you through signing in, configuring a provider, creating a prompt, testing it, promoting it to production, and fetching it from your application.

Prerequisites

  • A base14 account with access to Scope
  • An API key from at least one LLM provider (e.g., OpenAI, Anthropic)

Step 1: Sign in to Scope

Navigate to your base14 dashboard and open Scope. Scope is available to all base14 accounts — no additional setup is required.

Step 2: Configure a Provider

Before you can test prompts, Scope needs credentials for at least one LLM provider.

  1. Go to Settings > Providers
  2. Click Add Provider
  3. Select your provider (e.g., OpenAI)
  4. Enter your API key
  5. Click Test Connection to verify the key is valid
  6. Save the provider configuration

[Screenshot: Scope provider configuration page showing connected LLM providers such as OpenAI and Anthropic]

Once connected, enable the models you want to use (e.g., gpt-4o, claude-3-opus).

Tip: You can configure multiple providers and switch between them when testing prompts. See Configure Providers for details.

Step 3: Create a Prompt

  1. Click New Prompt from the prompt list

  2. Enter a name (e.g., greeting)

  3. Add a description (e.g., "Greet the user and suggest an activity")

  4. Add your prompt messages:

    SYSTEM message:

    You are a friendly assistant. Your goal is to greet the user by name
    and suggest a fun activity based on their preference.

    HUMAN message:

    My name is {{name}} and I enjoy {{preference}} activities.
    What should I try today?

Scope automatically detects {{name}} and {{preference}} as variables.
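The double-brace syntax is plain template substitution. As an illustration only (not Scope's actual implementation), variable detection and rendering can be sketched with a regular expression:

```python
import re

# Matches {{ variable }} placeholders, tolerating inner whitespace
VARIABLE_PATTERN = re.compile(r"\{\{\s*(\w+)\s*\}\}")

def detect_variables(template: str) -> list[str]:
    """Return the variable names found in {{...}} placeholders."""
    return VARIABLE_PATTERN.findall(template)

def render(template: str, **values: str) -> str:
    """Substitute each {{variable}} with its provided value."""
    return VARIABLE_PATTERN.sub(lambda m: values[m.group(1)], template)

template = "My name is {{name}} and I enjoy {{preference}} activities."
print(detect_variables(template))  # ['name', 'preference']
print(render(template, name="Alice", preference="outdoor"))
```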

[Screenshot: Scope prompt editor with variable syntax highlighting and metadata fields]

  5. Click Create — this creates version v1 in draft status

Step 4: Test the Prompt

  1. Open the prompt you just created
  2. Click Test to open the test panel
  3. Select a provider and model (e.g., OpenAI / gpt-4o)
  4. Fill in the variable values:
    • name: Alice
    • preference: outdoor
  5. Click Run

[Screenshot: Scope test panel showing the LLM response with token usage, latency, and cost metrics]

The test panel shows the LLM response along with token usage, latency, and estimated cost. Adjust your prompt and re-run until you're satisfied.

Step 5: Promote to Production

When the prompt is ready:

  1. Click Promote on the version you want to deploy
  2. Add optional promotion notes (e.g., "Initial release")
  3. Confirm the promotion

[Screenshot: Scope promote-to-production dialog with promotion notes for the version being deployed]

The version status changes from draft to published. Any application using the SDK or API will now receive this version when requesting the greeting prompt.

Step 6: Fetch from Your Application

Install the Scope SDK and fetch the prompt at runtime:

from scope_client import ScopeClient, ApiKeyCredentials

# Read the Scope API key from the environment
credentials = ApiKeyCredentials.from_env()
client = ScopeClient(credentials=credentials)

# Fetch the currently published version of the "greeting" prompt
version = client.get_prompt_version("greeting")

# Substitute the prompt's variables and print the rendered text
rendered = version.render(name="Alice", preference="outdoor")
print(rendered)

Pass the rendered string to your LLM provider of choice. See the SDK Quickstart for a full end-to-end example.
