BatchPrompt
BatchPrompt is the class for batch prompts. Batch prompting is a simple alternative prompting approach that enables the LLM to run inference on a batch of samples at once, instead of one sample at a time. It can reduce both token and time costs while retaining downstream performance.
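Conceptually, a batch prompt concatenates several prompts into one request and later splits the single completion back into per-prompt answers. A minimal sketch of the idea (the exact batch format used by BatchPrompt is not assumed here):

# Illustration only; EasyInstruct's internal batch format may differ.
prompts = ["Translate 'cat' into French.", "What is 17 * 3?"]
batched = "\n".join(f"Q[{i}]: {p}" for i, p in enumerate(prompts))
# The model is asked to answer in kind ("A[0]: ...", "A[1]: ..."), so one
# completion can be split back into a response for each original prompt.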
build_prompt(self, prompt_list: list)
Description
Build a single batch prompt from a given list of prompts, which may be of different types.
Parameters
prompt_list (list): The list of prompt objects to combine into the batch prompt.
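A minimal usage sketch, assuming two BasePrompt instances (any mix of the supported prompt types would work the same way):

cat_prompt = BasePrompt()
cat_prompt.build_prompt("Give me three names of cats.")
math_prompt = BasePrompt()
math_prompt.build_prompt("What is 17 * 3?")
batch_prompt = BatchPrompt()
batch_prompt.build_prompt([cat_prompt, math_prompt])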
parse_response(self)
Description
Split the overall response to the batch prompt into individual responses, one for each prompt in prompt_list, and write each back as the response of the corresponding prompt object.
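A sketch of the round trip, assuming a batch_prompt built as above and the get_openai_result call shown in the example below (the engine name is illustrative):

batch_prompt.get_openai_result(engine="gpt-3.5-turbo")
batch_prompt.parse_response()
# Each prompt in prompt_list now carries its own share of the batched completion.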
Example
from easyinstruct import BasePrompt, IEPrompt, ZeroshotCoTPrompt, FewshotCoTPrompt, BatchPrompt
from easyinstruct.utils.api import set_openai_key, set_anthropic_key, set_proxy
set_openai_key("")
set_anthropic_key("")
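The snippet above is cut off; a sketch of how it might continue, following the batch-prompting flow described above (the sample questions and engine name are illustrative):

# Build individual prompts of different types.
base_prompt = BasePrompt()
base_prompt.build_prompt("Give me three names of cats.")

question = "Natalia sold clips to 48 of her friends in April, and then she sold half as many clips in May. How many clips did Natalia sell altogether in April and May?"
zeroshot_prompt = ZeroshotCoTPrompt()
zeroshot_prompt.build_prompt(question)

# Combine the prompts into one batch prompt, run a single inference call,
# and split the result back onto the individual prompts.
batch_prompt = BatchPrompt()
batch_prompt.build_prompt([base_prompt, zeroshot_prompt])
batch_prompt.get_openai_result(engine="gpt-3.5-turbo")
batch_prompt.parse_response()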