Prompt Lockbox is easy to install. It's available on the PyPI index, so all you need is the Python package installer, pip!

Supported Python version: > 3.10

How to install?

You can install the toolkit directly with pip, or from our GitHub repo.

It is advised to create a new virtual environment to avoid package conflicts.
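For example, on a Unix-like system (assuming `python3` is on your PATH):

```shell
# Create an isolated environment for the project
python3 -m venv .venv

# Activate it (on Windows, run .venv\Scripts\activate instead)
source .venv/bin/activate
```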

Install it using the command -

pip install prompt-lockbox

This will set up Prompt Lockbox and all of its dependencies.

Alternatively, you can install the latest development version directly from the GitHub repository.
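pip can install straight from a Git repository. The URL below is a placeholder, not the real repository path; substitute the actual one:

```shell
# Placeholder URL: replace <org> with the actual GitHub organization
pip install "git+https://github.com/<org>/prompt-lockbox.git"
```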

Setup is done. We can now create a new Prompt Lockbox project!

Create a new project

The first step is to create a project, which simply sets up the necessary directories and configuration files for you.

This guide uses the CLI, but you can also use the SDK for these tasks; checkout SDK. In your project directory, run -

plb init . 

This command will initialize a Prompt Lockbox project and create these files -

  • prompts/: The directory where all your versioned .yml prompt files will live.
  • .plb/: A hidden directory for the toolkit's internal data, such as search indexes and logs.
  • plb.toml: The main configuration file for your project. You can use it to set project-wide settings, like the AI model for automated tasks.
  • .plb.lock: The lockfile that stores the secure hashes of your production-ready prompts, protecting them from unauthorized changes.
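You can sanity-check the result by listing the project root (the expected entries are taken from the description above):

```shell
# List everything, including hidden entries, in the project root
ls -a
# Expected entries: prompts/  .plb/  plb.toml  .plb.lock
```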

We have created a project. Now we can start building prompts!

Create a Prompt

Now that we have our project, let’s create our first prompt. We’ll use the interactive create command, which guides you through setting up all the required metadata.

In your terminal, run -

plb create 

Fill in each field as prompted. For most, you can just press Enter to accept the default value.

Let’s say the prompt name is ‘Customer-support-agent’. The create command then creates a new file named prompts/Customer-support-agent.v1.0.0.yml.

Hooray! You have created your first prompt. Note that it’s just a template; you still need to fill in the actual content and the remaining metadata.

Template

Here are two templates for your reference: one is blank, and the other is a completed customer support agent prompt with its content and metadata filled in -

The create command builds an empty template. Open the prompts/Customer-support-agent.v1.0.0.yml file in your code editor, find the template: key, and replace the placeholder content with the filled example above, or write your own!
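As a concrete sketch, you could fill the file from the shell with a heredoc. Only the template: key is documented above; the other field names, and the {{ variable }} placeholder syntax, are illustrative assumptions:

```shell
# Overwrite the generated file with a minimal filled-in prompt.
# Field names besides `template:` and the {{ }} placeholder syntax
# are assumptions for illustration only.
cat > prompts/Customer-support-agent.v1.0.0.yml <<'EOF'
name: Customer-support-agent
version: 1.0.0
description: Answers customer questions politely and concisely.
template: |
  You are a friendly customer support agent for {{ company }}.
  Answer the customer's question clearly and concisely.

  Question: {{ question }}
EOF
```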

Run the prompt

LLM configuration is required for this step. While the toolkit itself does not depend on an LLM, some features do. Please configure an LLM before proceeding.

Let’s execute our prompt using the run command. The command detects the required template variables and asks you for their values.

It will run the prompt against an LLM of your choice (the LLM needs to be set in your configuration). Checkout LLMs to find a list of supported models.

In your terminal, run -

plb run Customer-support-agent --execute

It will render the filled prompt and print the output generated by the specified LLM.

Next Steps

Discover AI-Powered Features Available After LLM Setup.