Version: 25.6

snorkelai.sdk.client.fm_suite.prompt_fm

snorkelai.sdk.client.fm_suite.prompt_fm(prompt, model_name, model_type=None, question=None, runs_per_prompt=1, sync=True, cache_name='default', **fm_hyperparameters)

Send one or more prompts to a foundation model.

Parameters

| Name | Type | Default | Info |
|------|------|---------|------|
| prompt | Union[str, List[str]] |  | The prompt(s) to send to the foundation model. |
| model_name | str |  | The name of the foundation model to use. |
| model_type | Optional[LLMType] | None | How the foundation model should be used; must be one of the LLMType values. |
| question | Optional[str] | None | When provided, this is passed to the model alongside each prompt, which is useful for information-retrieval tasks. The prompt argument then effectively becomes the context(s) containing the answer to the question. |
| runs_per_prompt | int | 1 | The number of times to run each prompt; each response can differ. All responses are cached. |
| sync | bool | True | Whether to wait for the job to complete before returning the result. |
| cache_name | str | 'default' | The cache name is used in the hash construction. To rerun a prompt and get a different result, change the cache name to one that has not been used before. For example, calling sf.prompt_fm("What is the meaning of life?", "openai/gpt-4o") a second time hits the cache and returns the same response, while passing cache_name="run_2" hits a different part of the cache and can return a new response. |
| **fm_hyperparameters | Any |  | Additional keyword arguments to pass to the foundation model, such as temperature and max_tokens. |
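To make the cache_name behavior concrete: repeated identical calls hit the same cache entry, and changing cache_name forces a fresh model call. The SDK's actual hash construction is internal; the helper below is purely illustrative of the idea.

```python
import hashlib
import json

def make_cache_key(prompt: str, model_name: str, cache_name: str = "default",
                   **fm_hyperparameters) -> str:
    """Illustrative only: derive a deterministic key from the call arguments.

    The real SDK hash construction is internal; this sketch just shows why
    identical calls hit the cache while a new cache_name misses it.
    """
    payload = json.dumps(
        {"prompt": prompt, "model": model_name, "cache": cache_name,
         "params": fm_hyperparameters},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

k1 = make_cache_key("What is the meaning of life?", "openai/gpt-4o")
k2 = make_cache_key("What is the meaning of life?", "openai/gpt-4o")
k3 = make_cache_key("What is the meaning of life?", "openai/gpt-4o",
                    cache_name="run_2")
assert k1 == k2  # identical calls map to the same cache entry
assert k1 != k3  # a new cache_name forces a fresh model call
```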

Return type

Union[DataFrame, str]

Returns

  • df – DataFrame containing the predictions for the data points, returned when sync=True. It contains the input prompt and the output of the foundation model, plus any model-specific columns such as perplexity (see the examples below).

  • job_id – The job ID of the prompt inference job, returned when sync=False, which can be used to monitor progress with sf.poll_job_status(job_id).

Examples

>>> sf.prompt_fm(prompt="What is the meaning of life?", model_name="openai/gpt-4")
  | text                         | generated_text                | perplexity
  ---------------------------------------------------------------------------
0 | What is the meaning of life? | Life is all about having fun! | 0.789
>>> sf.prompt_fm(prompt=["What is the meaning of life?", "What is the meaning of death?"], model_name="openai/gpt-4")
  | text                          | generated_text                 | perplexity
  -----------------------------------------------------------------------------
0 | What is the meaning of life?  | Life is all about having fun!  | 0.789
1 | What is the meaning of death? | Death is about not having fun! | 0.981
>>> sf.prompt_fm(question="What is surname", prompt="Joe Bloggs is a person", model_name="deepset/roberta-base-squad2")
  | text                   | answer | start | end | score
  -------------------------------------------------------
0 | Joe Bloggs is a person | Bloggs | 4     | 11  | 0.985