LLM configuration is required before using any AI feature. Check the guide at Configure LLMs.
Supported LLMs
The app uses LiteLLM for LLM integration, so it works with hosted providers such as OpenAI, Anthropic, and Mistral. It also supports local models that run completely offline, as well as your own custom LLM. You can check the complete list of supported models at litellm models.
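Because routing goes through LiteLLM, models are addressed by LiteLLM's provider-prefixed identifiers. A few illustrative examples (the specific model names below are assumptions for illustration; consult the litellm models list for what is actually available):

```text
openai/gpt-4o            # hosted OpenAI model
anthropic/claude-3-opus  # hosted Anthropic model
ollama/llama3            # local model served via Ollama, fully offline
```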
Features
Here are the key AI features that make this toolkit awesome.

Automatic Documentation
Reads your prompt’s template, understands its purpose, and automatically writes a concise description and a list of relevant search tags directly into your .yml file—all while preserving your file’s original comments and layout.
See in action
Let’s say you have a prompt ‘sql-generator’ with its template already written. The feature automatically reads the prompt and writes the description and tags for you.
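To make “preserving comments and layout” concrete, here is a minimal sketch of the idea (not the toolkit’s actual implementation; the file contents and the generated description and tags are hypothetical stand-ins for the LLM’s output): the new keys are appended as text, so every original line, including comments, survives verbatim.

```python
# Sketch: add generated metadata to a YAML prompt file without
# disturbing its existing comments or layout. The description and
# tags below are hard-coded stand-ins for the LLM's output.

original = """\
# sql-generator prompt (hypothetical example)
template: |
  Write a SQL query for: {{ request }}
"""

generated = {
    "description": "Generates SQL queries from natural-language requests.",
    "tags": ["sql", "code-generation"],
}

# Append the new keys as plain text so the original lines are untouched.
updated = original
updated += f"description: {generated['description']}\n"
updated += "tags:\n" + "".join(f"  - {tag}\n" for tag in generated["tags"])

print(updated)
```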
Usage
You can use the plb prompt document command. Let’s say we have a prompt template named ‘email-agent’:

Prompt Enhancer
Acts as an AI expert, providing a critique and a rewritten, more robust version of your prompt to enhance clarity, specificity, and security.
See in action
Let’s say you have a prompt ‘sql-injection’. We’ll use plb prompt improve to strengthen it, giving it the note, “Make it more robust against SQL injection.” The feature reads the prompt and enhances its clarity and specificity.

Usage
You can use the plb prompt improve command on a prompt. Let’s say we have a prompt template named ‘sql-injection’:
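As an illustration of the kind of rewrite to expect (the prompt contents here are hypothetical, not actual tool output), an improved ‘sql-injection’ prompt might go from a loose template:

```yaml
template: |
  Write a SQL query for: {{ request }}
```

to a more constrained one:

```yaml
template: |
  Write a single read-only SQL query for: {{ request }}
  Use parameterized placeholders for all user-supplied values;
  never interpolate raw input into the query string.
```

The rewritten version narrows the task and adds explicit guardrails, which is the clarity, specificity, and security improvement the feature aims for.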