Python Prompt SDK Quick Start

Install

poetry add autoblocksai

Create a prompt

Go to the prompts page and click Create Prompt to create your first prompt. The prompt must contain one or more templates and can optionally contain LLM model parameters.

The code samples below use this example prompt:
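(The prompt itself is defined in the Autoblocks UI; the sketch below reconstructs its shape from the code samples in this guide. Template bodies are elided, and the IDs of the two helper templates are inferred from the generated render method names.)

id: text-summarization

templates:
  system         # placeholders: {{ language_requirement }}, {{ tone_requirement }}
  user           # placeholders: {{ document }}
  util/language  # placeholders: {{ language }}
  util/tone      # placeholders: {{ tone }}

params:
  model: ...
  temperature: ...
  max_tokens: ...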

Autogenerate prompt classes

The prompt SDK ships with a CLI that generates Python classes with methods and arguments that mirror the structure of your prompt's templates and parameters. This gives you type safety and autocomplete when working with Autoblocks prompts in your codebase.
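For example, a template named user containing a {{ document }} placeholder generates a render method with a matching document keyword argument. A minimal sketch, using the text-summarization prompt and the manager initialization covered later in this guide:

from my_project.autoblocks_prompts import TextSummarizationPromptManager
from my_project.autoblocks_prompts import TextSummarizationMinorVersion

mgr = TextSummarizationPromptManager(
    minor_version=TextSummarizationMinorVersion.v0,
)

with mgr.exec() as prompt:
    # The `user` template's {{ document }} placeholder becomes a
    # typed `document` keyword argument on the render method.
    content = prompt.render.user(document="mock document")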

Create .autoblocks.yml

The .autoblocks.yml file allows you to configure:

  • The location of the file that will contain the generated classes
  • A list of prompts to generate classes for

This file should be at the root of your project.

autogenerate:
  prompts:
    # The location of the file that will contain the generated classes
    outfile: my_project/autoblocks_prompts.py

    # A list of prompts to generate classes for
    prompts:
        # The ID of the prompt (what you chose when creating the prompt in the UI)
      - id: text-summarization
        # The major versions of the `text-summarization` prompt to generate classes for
        major_versions:
          - 2

      - id: flashcard-generator
        major_versions:
          - 1

Set your API key as an environment variable

Get your Autoblocks API key from the settings page and set it as an environment variable:

export AUTOBLOCKS_API_KEY=...

Run the CLI

Installing the autoblocksai package adds the prompts generate CLI to your path:

poetry run prompts generate

Running the CLI creates a file at the outfile location you configured. Re-run prompts generate any time you update .autoblocks.yml or want to pick up a new minor version of a prompt listed there.

Import and use a prompt manager

From the autogenerated code, the classes you will import and interact with are the prompt managers and the minor version enums.

For each prompt and major version specified in your .autoblocks.yml file, there will be a prompt manager class and a minor version enum named after the prompt's ID. For example, if a prompt's ID is "text-summarization", then the autogenerated file will have:

  • TextSummarizationPromptManager
  • TextSummarizationMinorVersion

Initialize the prompt manager

Create a single instance of the prompt manager for the lifetime of your application. The only required argument when initializing a prompt manager is the minor version. To specify the minor version, use the enum that was autogenerated by the CLI:

from my_project.autoblocks_prompts import TextSummarizationPromptManager
from my_project.autoblocks_prompts import TextSummarizationMinorVersion

mgr = TextSummarizationPromptManager(
    minor_version=TextSummarizationMinorVersion.v0,
)

Execute a prompt

The exec method on the prompt manager starts a new prompt execution context. It is a context manager that yields a PromptExecutionContext instance, giving you fully-typed access to the prompt's templates and parameters:

import uuid

import openai
from autoblocks.tracer import AutoblocksTracer

# `mgr` is the prompt manager initialized in the previous section.
with mgr.exec() as prompt:
    tracer = AutoblocksTracer(trace_id=str(uuid.uuid4()))

    params = dict(
        model=prompt.params.model,
        temperature=prompt.params.temperature,
        max_tokens=prompt.params.max_tokens,
        messages=[
            dict(
                role="system",
                content=prompt.render.system(
                    language_requirement=prompt.render.util_language(
                        language="Spanish",
                    ),
                    tone_requirement=prompt.render.util_tone(
                        tone="formal",
                    ),
                ),
            ),
            dict(
                role="user",
                content=prompt.render.user(
                  document="mock document",
                ),
            ),
        ],
    )

    tracer.send_event(
        "ai.request",
        properties=params,
    )

    response = openai.chat.completions.create(**params)

    tracer.send_event(
        "ai.response",
        prompt_tracking=prompt.track(),
        properties=dict(
            response=response.model_dump(),
        ),
    )

Include prompt information in the LLM response event

Notice that we include prompt tracking information on the LLM response event:

tracer.send_event(
    "ai.response",
    prompt_tracking=prompt.track(),
    properties=dict(
        response=response.model_dump(),
    ),
)

This correlates LLM response events with the prompt that was used to generate them. The prompt ID and version are sent as properties on the event, allowing you to track the prompt's performance on the explore page.
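The exact property names attached via prompt_tracking are determined by the SDK; as a purely hypothetical illustration of the idea, the response event ends up annotated with something like:

# Hypothetical illustration only: the SDK controls the actual property
# names and structure sent with the event.
prompt_tracking_data = {
    "id": "text-summarization",  # the prompt ID chosen in the UI
    "version": "2.0",            # the major.minor version that was executed
}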

Develop locally against a prompt that hasn't been deployed

You will often want to develop locally against a prompt that you've modified in the UI but haven't deployed yet. These undeployed changes can be pulled down at any time by adding the major version dangerously-use-undeployed to .autoblocks.yml and re-running prompts generate:

prompts:
  - id: text-summarization
    major_versions:
      - 2
      - dangerously-use-undeployed

This will generate a new prompt manager class with the undeployed changes.
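The class and enum names generated for the undeployed version are determined by the CLI, so check your generated outfile for the actual names. As a hypothetical sketch, usage mirrors a deployed major version:

# Hypothetical names: inspect the generated outfile for the real ones.
from my_project.autoblocks_prompts import TextSummarizationUndeployedPromptManager
from my_project.autoblocks_prompts import TextSummarizationUndeployedMinorVersion

mgr = TextSummarizationUndeployedPromptManager(
    minor_version=TextSummarizationUndeployedMinorVersion.LATEST,
)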

Organizing multiple prompt managers

If you are using many prompt managers, we recommend initializing them in a single file and importing them as a module:

prompt_managers.py:

from my_project.autoblocks_prompts import TextSummarizationPromptManager
from my_project.autoblocks_prompts import TextSummarizationMinorVersion
from my_project.autoblocks_prompts import FlashcardGeneratorPromptManager
from my_project.autoblocks_prompts import FlashcardGeneratorMinorVersion
from my_project.autoblocks_prompts import StudyGuideOutlinePromptManager
from my_project.autoblocks_prompts import StudyGuideOutlineMinorVersion

text_summarization = TextSummarizationPromptManager(
    minor_version=TextSummarizationMinorVersion.v0,
)

flashcard_generator = FlashcardGeneratorPromptManager(
    minor_version=FlashcardGeneratorMinorVersion.v0,
)

study_guide_outline = StudyGuideOutlinePromptManager(
    minor_version=StudyGuideOutlineMinorVersion.v0,
)

Then, throughout your application, import the entire prompt_managers module and use the prompt managers as needed:

from my_project import prompt_managers

with prompt_managers.text_summarization.exec() as prompt:
    ...

with prompt_managers.flashcard_generator.exec() as prompt:
    ...

with prompt_managers.study_guide_outline.exec() as prompt:
    ...

This is preferable to importing each prompt manager individually, because the module prefix keeps the fact that you are dealing with a prompt manager visible at the call site. With individual imports, that context is lost:

from my_project.prompt_managers import text_summarization

# Somewhere deep in a file, it is not clear
# what the `text_summarization` variable is
with text_summarization.exec() as prompt:
    ...