PromptHub Prompt Engineering Method Templates
Templates for the most popular and effective prompt engineering methods
Multi-persona collaboration
A prompting method that instructs the LLM to create multiple personas that collaborate to complete the task.
PromptHub
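Not from the template itself, but a minimal Python sketch of the pattern: assemble a single prompt that asks the model to spawn several expert personas and have them collaborate. The task text and persona count are illustrative placeholders.

```python
def build_multi_persona_prompt(task: str, num_personas: int = 3) -> str:
    """Assemble a multi-persona collaboration prompt (illustrative sketch)."""
    return (
        f"Identify {num_personas} distinct expert personas best suited to the task below. "
        "Have the personas discuss the task in rounds, challenging and building on each "
        "other's contributions, then agree on a single final answer.\n\n"
        f"Task: {task}\n\n"
        "Format: list the personas first, then the discussion, then 'Final answer:'."
    )

print(build_multi_persona_prompt("Plan a migration from REST to gRPC"))
```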
Tree of Thoughts
A prompting method that instructs the LLM to traverse many different paths when completing a task. A movie recommender is used as the example task. Update the variables and steps for your use case.
PromptHub
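As a rough illustration of the traverse-many-paths idea, here is a small beam-search sketch in Python. `llm` is a placeholder for any prompt-in, text-out function; the branching, scoring, and depth values, and all prompt wording, are assumptions rather than the template's exact text.

```python
from typing import Callable, List

def tree_of_thoughts(
    llm: Callable[[str], str],  # placeholder: any prompt -> completion function
    task: str,
    breadth: int = 3,           # candidate thoughts generated per kept state
    beam: int = 2,              # states kept after each scoring round
    depth: int = 3,             # number of expansion rounds
) -> str:
    """Beam-search sketch of the Tree of Thoughts idea (illustrative only)."""
    states: List[str] = [""]    # each state is a partial chain of thoughts
    for _ in range(depth):
        candidates = []
        for state in states:
            for _ in range(breadth):
                thought = llm(
                    f"Task: {task}\nThoughts so far:\n{state}\nPropose the next thought:"
                )
                candidates.append(state + "\n" + thought)
        # Score every candidate path and keep the best `beam` of them.
        scored = []
        for cand in candidates:
            reply = llm(
                f"Task: {task}\nPath:\n{cand}\nRate this path from 1 to 10. Reply with a number only:"
            )
            try:
                score = float(reply.strip().split()[0])
            except (ValueError, IndexError):
                score = 0.0
            scored.append((score, cand))
        states = [c for _, c in sorted(scored, key=lambda s: s[0], reverse=True)[:beam]]
    # Produce a final answer from the best surviving path.
    return llm(f"Task: {task}\nBest reasoning path:\n{states[0]}\nGive the final answer:")
```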
"According to..." prompting
A prompting method that reduces hallucinations by grounding responses in pre-training data. Update the source and question variables to run this prompt.
PromptHub
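A minimal sketch of the grounding idea, assuming `source` and `question` variables like those in the template; the exact wording of the PromptHub prompt may differ.

```python
def according_to_prompt(question: str, source: str = "Wikipedia") -> str:
    """Ground the answer in a named source to discourage hallucination (sketch)."""
    return (
        f"{question}\n"
        f"Respond using only information that can be attributed to {source}, "
        f"and say so explicitly if {source} does not cover it."
    )

print(according_to_prompt("When was the Hubble Space Telescope launched?"))
```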
Skeleton of Thought
Skeleton of Thought (SoT) typically uses 2 parallel prompts. This one-shot prompt merges them: first forming a task skeleton, then filling it in. Just update the question and run the prompt!
PromptHub
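A rough sketch of the merged, one-shot variant described above: a single prompt that asks for a skeleton first and then the expansion. The wording and point limit are illustrative, not the template's exact text.

```python
def skeleton_of_thought_prompt(question: str, max_points: int = 5) -> str:
    """One-shot SoT-style prompt: outline first, then expand each point (sketch)."""
    return (
        f"Question: {question}\n\n"
        f"Step 1 - Skeleton: list at most {max_points} short bullet points "
        "(3-6 words each) outlining the answer.\n"
        "Step 2 - Expansion: expand every bullet point into 1-2 complete sentences, "
        "keeping the numbering from the skeleton."
    )

print(skeleton_of_thought_prompt("How does garbage collection work in Java?"))
```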
AutoHint
This template corresponds to Step 3 in the AutoHint framework. It's designed to generate a broad hint based on incorrect input/output pairs, which can then be added to the original prompt to increase accuracy.
PromptHub
Algorithm of Thoughts
Input your prompt in the variable and it will be converted into a new prompt following the Algorithm of Thoughts framework. A final, cohesive prompt will appear below the AoT framework output.
PromptHub
EmotionPrompt
A prompting method that uses emotional statements to yield better results. Just add the emotional statement at the end of your prompt. Read more about it on our blog.
PromptHub
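A minimal sketch of the append-a-stimulus step; the statement below is one of the stimuli studied in the EmotionPrompt work, but the template's own wording may differ.

```python
# One emotional stimulus of the kind studied in the EmotionPrompt work;
# swap in whichever statement the template (or your own testing) favors.
EMOTIONAL_STATEMENT = "This is very important to my career."

def emotion_prompt(base_prompt: str, stimulus: str = EMOTIONAL_STATEMENT) -> str:
    """Append an emotional statement to the end of an existing prompt (sketch)."""
    return f"{base_prompt.rstrip()} {stimulus}"

print(emotion_prompt("Summarize the attached quarterly report in five bullet points."))
```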
Step-Back Prompting
A prompting method that encourages the model to take a step back before diving into a task or question.
PromptHub
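A two-call Python sketch of the step-back flow; `llm` is a placeholder for any prompt-to-completion function, and the prompt wording is illustrative.

```python
from typing import Callable

def step_back(llm: Callable[[str], str], question: str) -> str:
    """Step-back sketch: abstract to a broader principle, then answer with it."""
    # 1. Ask the more general, "step-back" version of the question.
    abstraction = llm(
        f"Here is a question: {question}\n"
        "Before answering, state the broader principle or higher-level question "
        "it depends on, and briefly explain that principle."
    )
    # 2. Answer the original question grounded in the abstraction.
    return llm(
        f"Background principle:\n{abstraction}\n\n"
        f"Using that principle, answer the original question: {question}"
    )
```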
Chain of Density
Chain of Density prompting generates 5 increasingly detailed summaries. Research suggests the third summary most closely resembles human-written summaries in terms of information density.
PromptHub
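A compact sketch of a Chain of Density style prompt; the number of rounds and the wording are illustrative stand-ins for the template's text.

```python
def chain_of_density_prompt(article: str, rounds: int = 5) -> str:
    """Chain of Density sketch: request increasingly dense rewrites of one summary."""
    return (
        f"Article:\n{article}\n\n"
        f"Write {rounds} summaries of the article, each roughly the same length.\n"
        "Summary 1 should be sparse. For every following summary, identify 1-3 "
        "informative entities missing from the previous summary and rewrite it to "
        "include them without making it longer. "
        f"Label them Summary 1 through Summary {rounds}."
    )
```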
Chain of Verification
CoVe, typically a two-prompt method, can also function effectively with just one prompt, still helping to reduce hallucinations.
PromptHub
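A single-prompt sketch of the CoVe flow (draft, verify, revise); the wording is illustrative rather than the template's exact text.

```python
def chain_of_verification_prompt(question: str) -> str:
    """Single-prompt CoVe sketch: draft, verify, then revise in one pass."""
    return (
        f"Question: {question}\n\n"
        "1. Draft an initial answer.\n"
        "2. List 3-5 verification questions that would expose errors in the draft.\n"
        "3. Answer each verification question independently of the draft.\n"
        "4. Write a final, corrected answer consistent with the verification answers."
    )
```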
Semantic Alternative Enhancer
Optimize longer prompts using methods backed by research
PromptHub
RecPrompt
The base prompt used in the RecPrompt framework. A great starting point if you are building any sort of recommendation system on top of an LLM. We've added some structural enhancements to better distinguish different parts of the prompt.
PromptHub
Analogical Prompting
Auto-generate CoT examples
PromptHub
Universal Self-Consistency
Applicable to a wide range of tasks (including free-form answers), USC is typically run multiple times to generate several outputs, from which the most consistent is selected. This template gives a starting point for understanding the method.
PromptHub
Self-Consistency
While the prompt is typically run multiple times to generate separate answers, this template gives a starting point for understanding the method.
PromptHub
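Because the method hinges on running the prompt several times, here is a minimal Python sketch of that outer loop: sample multiple reasoning paths and majority-vote the final answers. `llm` is a placeholder for a sampling (nonzero-temperature) completion function, and the answer-extraction convention is an assumption.

```python
from collections import Counter
from typing import Callable, List

def self_consistency(llm: Callable[[str], str], prompt: str, samples: int = 5) -> str:
    """Self-consistency sketch: sample several reasoning paths, majority-vote the answers."""
    answers: List[str] = []
    for _ in range(samples):
        # `llm` is assumed to sample with nonzero temperature so runs differ.
        output = llm(f"{prompt}\nThink step by step, then end with 'Answer: <answer>'.")
        # Keep only the text after the final 'Answer:' marker.
        answers.append(output.rsplit("Answer:", 1)[-1].strip())
    # The most frequent answer across samples is returned as the final one.
    return Counter(answers).most_common(1)[0][0]
```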
Program of Thoughts
Template to generate the code portion of the Program of Thoughts (PoT) prompting method
PromptHub
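To show where that generated code fits, here is a hedged sketch of the surrounding flow: ask the model for Python that computes the answer, then execute it. `llm` is a placeholder completion function, and running model-generated code should only ever happen in a sandbox.

```python
from typing import Callable

def program_of_thoughts(llm: Callable[[str], str], question: str) -> str:
    """PoT sketch: have the model write Python that computes the answer, then run it."""
    code = llm(
        f"Question: {question}\n"
        "Write only Python code (no prose) that computes the result and stores it "
        "in a variable named `answer`."
    )
    namespace: dict = {}
    # NOTE: exec-ing model output is unsafe outside a sandbox; this is only a sketch.
    exec(code, namespace)
    return str(namespace.get("answer"))
```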
Least-to-most step 2
A generalizable prompt for Stage 1 of least-to-most prompting, where the problem is broken down into subproblems.
PromptHub
Least-to-most step 3
A generalizable prompt for Stage 2 of least-to-most prompting, where the subproblems are solved.
PromptHub
Least-to-most step 1
Generate few-shot examples, for any task, that show the model how to decompose problems (a combined sketch of the full least-to-most flow follows below).
PromptHub
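A combined sketch of the three least-to-most pieces above: decompose the problem into subproblems, then solve them in order, feeding earlier answers forward. `llm` is a placeholder completion function and all prompt wording is illustrative.

```python
from typing import Callable, List

def least_to_most(llm: Callable[[str], str], problem: str) -> str:
    """Least-to-most sketch: decompose (Stage 1), then solve subproblems in order (Stage 2)."""
    # Stage 1: decomposition into simpler subproblems.
    decomposition = llm(
        f"Problem: {problem}\n"
        "List the simpler subproblems that must be solved first, one per line, easiest first."
    )
    subproblems: List[str] = [line.strip() for line in decomposition.splitlines() if line.strip()]

    # Stage 2: solve each subproblem, carrying earlier answers forward.
    solved_so_far = ""
    for sub in subproblems:
        answer = llm(
            f"Problem: {problem}\n"
            f"Already solved:\n{solved_so_far or '(nothing yet)'}\n"
            f"Now solve: {sub}"
        )
        solved_so_far += f"{sub} -> {answer}\n"

    # Final answer built from all intermediate results.
    return llm(
        f"Problem: {problem}\nSubproblem solutions:\n{solved_so_far}\nGive the final answer."
    )
```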
Contrastive Chain-of-Thought
Contrastive CoT prompting involves adding both correct and incorrect examples to a Chain-of-Thought prompt to help the model learn by contrasting faulty reasoning with correct logic.
PromptHub
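A minimal prompt-builder sketch of the contrastive setup; the exemplars are supplied by you, and the wording is illustrative.

```python
def contrastive_cot_prompt(question: str, correct_example: str, incorrect_example: str) -> str:
    """Contrastive CoT sketch: show sound and faulty reasoning before the real question."""
    return (
        "Example with correct reasoning:\n"
        f"{correct_example}\n\n"
        "Example of the same kind of problem with flawed reasoning (do NOT reason like this):\n"
        f"{incorrect_example}\n\n"
        "Now answer the following, reasoning step by step like the correct example:\n"
        f"{question}"
    )
```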
System 2 Attention
System 2 Attention prompting guides the model to remove unnecessary or irrelevant information from the input before processing, ensuring a focus on the most relevant details.
PromptHub
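A two-pass sketch of the System 2 Attention idea: first rewrite the input keeping only what matters, then answer from the cleaned version. `llm` is a placeholder completion function.

```python
from typing import Callable

def system2_attention(llm: Callable[[str], str], context: str, question: str) -> str:
    """S2A sketch: strip irrelevant or biased content from the input, then answer."""
    # Pass 1: rewrite the context, keeping only what is relevant to the question.
    cleaned = llm(
        f"Context:\n{context}\n\n"
        f"Rewrite the context keeping only information relevant to answering: {question}\n"
        "Remove opinions, flattery, and unrelated details."
    )
    # Pass 2: answer using the cleaned context only.
    return llm(f"Context:\n{cleaned}\n\nQuestion: {question}\nAnswer:")
```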
Thread-of-Thought
Ideal for situations with large context, Thread of Thought helps the model maintain a coherent line of thought across many messages.
PromptHub
Zero-Shot CoT
The simplest way to implement Chain-of-Thought reasoning. Just add language that prompts the model to demonstrate reasoning.
PromptHub
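The whole technique fits in one helper; the trigger phrase below is the commonly used one, though any reasoning cue works.

```python
def zero_shot_cot(prompt: str) -> str:
    """Zero-shot CoT sketch: append a reasoning trigger to any prompt."""
    return f"{prompt.rstrip()}\n\nLet's think step by step."

print(zero_shot_cot(
    "A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. "
    "How much does the ball cost?"
))
```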
Few-Shot CoT
Provide the model with a few examples that demonstrate ideal reasoning chains.
PromptHub
Faithful CoT
Faithful Chain-of-Thought ensures reasoning chains accurately reflect the model's thought process by converting natural language queries into symbolic reasoning chains in Python, then using a deterministic solver to find the final answer.
PromptHub
Tabular CoT
Tabular Chain-of-Thought directs the model to present its reasoning in a structured format, such as markdown tables.
PromptHub
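A small sketch of the table-format instruction; the column names are illustrative, not the template's exact layout.

```python
def tabular_cot_prompt(question: str) -> str:
    """Tabular CoT sketch: ask for reasoning laid out as a markdown table."""
    return (
        f"{question}\n\n"
        "Work through the problem in a markdown table with the columns "
        "|step|subquestion|reasoning|result|, then state the final answer below the table."
    )
```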
LLMCompare
LLMCompare evaluator, specifically for summary evaluation
PromptHub
o1-preview reasoning
Approximate o1-preview-style reasoning with GPT-4 through a structured, step-by-step reasoning process.
PromptHub
Claude 3 Haiku SP
The system prompt used to power Claude 3 Haiku in the Claude.ai interface
PromptHub
Claude 3 Opus SP
The system prompt used to power Claude 3 Opus in the Claude.ai interface
PromptHub
Claude 3.5 Sonnet SP
The system prompt used to power Claude 3.5 Sonnet in the Claude.ai interface
PromptHub
ChatGPT-4o SP
The system prompt used to power ChatGPT when using 4o
PromptHub
Meta Prompt Conductor
Note that this is the prompt for the meta model that acts as the conductor; it does not optimize prompts itself.
PromptHub
OpenAI SI Generator
Based on a bit of prompt injection, we believe this is the prompt behind the new OpenAI System Instructions generator.
PromptHub
Auto ICL Step 1
This prompt guides the model to generate diverse, well-structured input-output demonstrations to be used in a second prompt as few-shot examples
PromptHub
Auto ICL Step 2
This prompt is designed to use demonstrations generated in a previous prompt to guide the model's reasoning when solving a new question/task
PromptHub
ExpertPrompt
From the ExpertPrompt framework, this template will generate expert agent personas for any given task, leveraging In-Context Learning (ICL)
PromptHub
Persona Generator
Persona Generator from the Jekyll & Hyde framework. Generate a persona for any given task. Output is in JSON.
PromptHub
AutoReason
Generate Chain-of-Thought reasoning traces for any task, then combine the reasoning steps with the original query to generate better outputs.
PromptHub
Chain of Thought Critic
Critics generate Chain of Thought steps, serving as an additional layer of reasoning to enhance LLM outputs before final processing
PromptHub