LLM model details.
Optional prompt — static prompts. Can be one of:

- template: an array of templated chat messages (with {{?placeholders}}).
- id, or scenario, name, and version: a reference to a remote prompt template.

This is meant for static instructions included with every call.
For per-request templating, use messages in .chatCompletion() instead.
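As an illustrative sketch, the shape described above could be modeled as follows. The interface and field names here mirror the description rather than any published SDK type, so treat them as assumptions:

```typescript
// Hypothetical types mirroring the prompt description above (illustrative only).
// {{?placeholders}} in `content` are resolved by the templating module.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface StaticPrompt {
  template?: ChatMessage[]; // inline templated messages...
  id?: string;              // ...or a remote template referenced by id...
  scenario?: string;        // ...or by scenario, name, and version.
  name?: string;
  version?: string;
}

// Static instructions included with every call:
const prompt: StaticPrompt = {
  template: [
    { role: 'system', content: 'You are a concise assistant.' },
    { role: 'user', content: 'Summarize the following text: {{?input}}' },
  ],
};
```

Values that change per request would not go into this static template; they would be supplied as messages to the chat-completion call instead.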
Prompt templating module configuration.