A complete guide to the Python SDK, covering all available classes, methods, usage patterns, and configuration options.
The Project class is the main entry point for interacting with a Prompt Lockbox project via the Python SDK. It represents your entire prompt library and provides methods for finding, creating, and managing all prompts within it.
Initializes the Project, finding its root directory and loading its configuration.
Parameters
An optional path to a directory within the project. If None, it searches upwards from the current working directory to find the project root.
Raises
If no plb.toml file is found, indicating it’s not a valid project.
Example
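The SDK's internals are not shown here, but the upward search described above can be sketched with only the standard library. The function name find_project_root is hypothetical; the real Project class performs this search during initialization.

```python
from pathlib import Path

def find_project_root(start: Path) -> Path:
    """Search upwards from `start` for a directory containing plb.toml.

    Hypothetical sketch of the behavior described above, not the SDK's code.
    """
    current = start.resolve()
    for candidate in [current, *current.parents]:
        if (candidate / "plb.toml").is_file():
            return candidate
    # Mirrors the documented failure mode: no plb.toml means not a valid project.
    raise FileNotFoundError("No plb.toml found: not a Prompt Lockbox project.")
```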
The root pathlib.Path object of the Prompt Lockbox project.
Returns
The absolute path to the project’s root directory.
Finds a single prompt by its name, ID, or file path. If a name is provided, it returns the Prompt object for the latest version.
Parameters
The name, ID, or file path string of the prompt to find.
Returns
A Prompt object if found, otherwise None.
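When a name matches several versioned files, "latest" means the highest semantic version, not the lexicographically largest string. A minimal illustration (latest_version is a hypothetical helper, not part of the SDK):

```python
def latest_version(versions: list[str]) -> str:
    """Pick the highest semantic version.

    Comparing tuples of ints gets '1.10.0' > '1.2.0' right, where a plain
    string comparison would not.
    """
    return max(versions, key=lambda v: tuple(int(part) for part in v.split(".")))
```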
Returns a list of all prompts found in the project.
Returns
A list of Prompt objects for every valid prompt file found.
Creates a new prompt file on disk from the given metadata.
Parameters
The name of the new prompt.
The starting semantic version for the prompt.
The author of the prompt. If not provided, it attempts to use the current Git user.
A short, human-readable summary of the prompt’s purpose.
A list of strings to organize the prompt in a hierarchy (e.g., ['billing', 'invoices']).
A list of lowercase keywords for search and discovery.
The specific LLM this prompt is designed for (e.g., openai/gpt-4o-mini).
Any extra comments, warnings, or usage instructions.
A dictionary of model parameters to be stored.
A list of IDs of other prompts that are related to this one.
Returns
A Prompt object representing the newly created file.
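The exact on-disk layout is controlled by the SDK, but the shape of what gets written can be sketched with the standard library. Everything here is illustrative: the filename pattern and field order are assumptions, not the SDK's documented format.

```python
import uuid
from datetime import datetime, timezone
from pathlib import Path

def write_prompt_file(directory: Path, name: str, version: str = "1.0.0",
                      description: str = "", author: str = "") -> Path:
    """Write a minimal prompt file with flat YAML-style metadata.

    Hypothetical sketch; the real create_prompt method owns the actual schema.
    """
    path = directory / f"{name}.v{version}.yml"
    lines = [
        f"id: {uuid.uuid4()}",              # a fresh unique ID per prompt
        f"name: {name}",
        f"version: {version}",
        f"description: {description}",
        f"author: {author}",
        f"last_update: {datetime.now(timezone.utc).isoformat()}",
        "template: |",
        "  Your prompt text here.",
    ]
    path.write_text("\n".join(lines) + "\n")
    return path
```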
Searches for prompts using a specified method.
Parameters
The search query string.
The search method to use. Choices: fuzzy, hybrid, splade.
The maximum number of results to return.
(Hybrid search only) A float between 0.0 (keyword) and 1.0 (semantic) to balance the search.
Returns
A list of result dictionaries, sorted by relevance.
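The fuzzy method can be approximated with the standard library's difflib; this is a rough stand-in for the idea of relevance-sorted result dictionaries, not the SDK's actual scoring.

```python
from difflib import SequenceMatcher

def fuzzy_search(query: str, names: list[str], limit: int = 5) -> list[dict]:
    """Rank prompt names by fuzzy similarity to the query, best first."""
    scored = [
        {"name": name,
         "score": SequenceMatcher(None, query.lower(), name.lower()).ratio()}
        for name in names
    ]
    return sorted(scored, key=lambda r: r["score"], reverse=True)[:limit]
```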
Builds a search index for all prompts to enable advanced search.
The indexing method to use. Choices: hybrid or splade.
Validates all prompt files in the project for correctness and consistency.
Returns
A dictionary of results categorized by check type.
Generates a report of the lock status of all prompts.
Returns
A dictionary categorizing prompts into locked, unlocked, tampered, and missing.
Uses an AI to automatically document a list of prompts, or all prompts if none are provided.
Parameters
A specific list of Prompt objects to document. If None, all prompts in the project will be processed.
Retrieves the AI configuration from the project’s plb.toml file.
Returns
A dictionary with provider and model keys.
The Prompt class represents a single, versioned prompt file. It is the primary object you’ll work with to render, validate, and manage an individual prompt.
These are read-only attributes that provide quick access to the prompt’s data.
The absolute pathlib.Path to the prompt’s .yml file.
A dict containing all the parsed data from the YAML file.
The name of the prompt as a string.
The version of the prompt as a string (e.g., "1.0.0").
The description of the prompt as a string.
A set of all undeclared template variables found in the template string.
Parameters
If True, the method will raise an UndefinedError for any missing variables. If False, it will render missing variables as <<variable_name>> in the output string.
The key-value pairs to inject into the template. These will override any default_inputs specified in the prompt file.
Returns
The final, rendered prompt text as a string.
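The strict/lenient behavior described above can be sketched with a simple regex substitution. The real SDK renders Jinja2-style templates and raises UndefinedError; this stand-in uses KeyError and handles only plain {{ name }} placeholders.

```python
import re

def render(template: str, strict: bool = False, **variables) -> str:
    """Substitute {{ var }} placeholders.

    Unresolved variables raise when strict, otherwise render as <<var>>,
    mirroring the documented behavior. Illustrative only.
    """
    def replace(match: re.Match) -> str:
        name = match.group(1)
        if name in variables:
            return str(variables[name])
        if strict:
            raise KeyError(f"Undefined template variable: {name}")
        return f"<<{name}>>"

    return re.sub(r"\{\{\s*(\w+)\s*\}\}", replace, template)
```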
Renders the prompt, calls the configured LLM, and returns a structured result.
Parameters
The key-value pairs to inject into the template before sending the prompt to the LLM.
Returns
A dictionary containing the rendered prompt, LLM output, model details, and usage statistics.
Renders the prompt with given variables and executes it against the configured AI provider, returning the live response. This is the primary method for getting an AI response from a prompt.
Parameters
The variables to inject into the template before execution, provided as keyword arguments (e.g., customer_name="Jane Doe").
Returns
The return type depends on whether an output_schema is defined in your prompt’s .yml file.
Creates a lock entry for this prompt in the project’s lockfile. This records the file’s current SHA256 hash and a timestamp, marking it as secure.
Removes the lock entry for this prompt from the project’s lockfile, allowing it to be edited.
Verifies the integrity of this prompt against the lockfile.
Returns
A tuple containing a boolean (True if secure) and a status string (‘OK’, ‘UNLOCKED’, ‘TAMPERED’).
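The three statuses follow directly from comparing the file's current SHA256 hash to the lockfile entry. A minimal stdlib sketch of that check (the lockfile layout here is an assumption):

```python
import hashlib
from pathlib import Path

def verify(path: Path, lockfile: dict) -> tuple[bool, str]:
    """Compare a file's SHA256 against its recorded lock entry.

    Hypothetical lockfile shape: {filename: {"hash": "<sha256 hex>"}}.
    """
    entry = lockfile.get(path.name)
    if entry is None:
        return False, "UNLOCKED"       # no lock entry recorded
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    if digest != entry["hash"]:
        return False, "TAMPERED"       # contents changed since locking
    return True, "OK"
```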
Creates a new, version-bumped copy of this prompt file and returns a Prompt object for the new file.
Parameters
The type of version bump to perform. Choices: major, minor, patch.
The author for the new version. If None, it defaults to the author of the source prompt or the current Git user.
Returns
A new Prompt object representing the newly created file.
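The major/minor/patch choices follow standard semantic-versioning rules, which can be stated in a few lines (bump_version is a hypothetical helper, not the SDK's API):

```python
def bump_version(version: str, bump_type: str = "patch") -> str:
    """Return the next semantic version for a major/minor/patch bump."""
    major, minor, patch = (int(p) for p in version.split("."))
    if bump_type == "major":
        return f"{major + 1}.0.0"      # reset minor and patch
    if bump_type == "minor":
        return f"{major}.{minor + 1}.0"  # reset patch
    if bump_type == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"Unknown bump type: {bump_type}")
```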
Uses an AI to analyze the prompt’s template and automatically generate and save a description and tags for it. The original file’s comments and layout are preserved.
Gets an AI-powered critique and suggestions for improving the prompt. This method does not modify the file.
Parameters
A specific instruction for the AI on how to improve the prompt (e.g., “Make it more robust”).
Returns
A dictionary containing the ‘critique’, ‘suggestions’, and ‘improved_template’.
Overwrites the prompt’s template block with a new version and updates its last_update timestamp. The original file’s comments and layout are preserved.
Parameters
The new template string to write to the file. This is typically sourced from the result of .get_critique().
Congrats! Explore more.