liblloyal 1.0.0
Composable primitives for llama.cpp inference
Classes

| struct | FormatResult |
| | Result from chat template formatting. NOTE: Named FormatResult, NOT ChatTemplateResult. |

Functions

| FormatResult | format(const llama_model *model, const std::string &messages_json, const std::string &template_override="") |
| | Format chat messages using the model's chat template, with fallback. |
| bool | validate(const std::string &template_str) |
| | Validate chat template syntax. |
format()

FormatResult format(const llama_model *model, const std::string &messages_json, const std::string &template_override = "")   [inline]
Format chat messages using the model's chat template, with fallback.
Orchestration logic: parse messages_json into a messages array, select a template via the fallback hierarchy below, and render the messages into a prompt returned in a FormatResult.
Fallback hierarchy:
1. template_override, when non-empty
2. the chat template embedded in the model
3. a default fallback template when the model provides none
Parameters

| model | Llama model (for template and vocab) |
| messages_json | JSON string with a messages array |
| template_override | Optional custom template |
Definition at line 57 of file chat_template.hpp.
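
A minimal usage sketch. It assumes the functions are reachable unqualified after including chat_template.hpp (this page does not show liblloyal's namespace) and that FormatResult exposes the rendered prompt as a `prompt` string member, which is an assumption; the llama.cpp calls follow the current upstream C API.

```cpp
// Sketch of calling format(); see the assumptions noted above.
#include <cstdio>
#include <string>

#include "llama.h"
#include "chat_template.hpp"

int main() {
    llama_backend_init();

    // Load a model; the template embedded in its GGUF metadata is used
    // when no template_override is supplied.
    llama_model_params mparams = llama_model_default_params();
    llama_model *model = llama_model_load_from_file("model.gguf", mparams);
    if (!model) {
        std::fprintf(stderr, "failed to load model\n");
        return 1;
    }

    // Messages array as a JSON string, per the messages_json parameter.
    const std::string messages_json = R"([
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user",   "content": "Hello!"}
    ])";

    // template_override defaults to "", so the model's template applies.
    FormatResult result = format(model, messages_json);

    // ASSUMPTION: FormatResult has a `prompt` member holding the rendered
    // prompt; this page does not document the struct's fields.
    std::printf("%s\n", result.prompt.c_str());

    llama_model_free(model);
    llama_backend_free();
    return 0;
}
```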
validate()

bool validate(const std::string &template_str)   [inline]
Validate chat template syntax.
Calls validate_chat_template_helper() from helpers.hpp. Does NOT require a model (syntax-only validation).
Parameters

| template_str | Template string to validate |
Definition at line 130 of file chat_template.hpp.
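
A syntax-check sketch under the same namespace assumption as above; the Jinja-style template shown is illustrative, not taken from liblloyal.

```cpp
// Sketch of calling validate(); no model is required.
#include <cstdio>
#include <string>

#include "chat_template.hpp"

int main() {
    // Illustrative Jinja-style chat template.
    const std::string tmpl =
        "{% for message in messages %}"
        "<|{{ message.role }}|>\n{{ message.content }}\n"
        "{% endfor %}";

    // ASSUMPTION: true means the template parsed cleanly; the return
    // value's meaning is inferred from the bool signature.
    if (validate(tmpl)) {
        std::printf("template is syntactically valid\n");
    } else {
        std::printf("template failed syntax validation\n");
    }
    return 0;
}
```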