liblloyal 1.0.0
Composable primitives for llama.cpp inference
Chat Template Formatting.
#include "common.hpp"
#include "helpers.hpp"
#include <llama/llama.h>
#include "nlohmann/json.hpp"
#include <string>
#include <vector>
Classes

struct lloyal::chat_template::FormatResult
    Result from chat template formatting. NOTE: Named FormatResult, NOT ChatTemplateResult.
Namespaces

namespace lloyal
    JSON Schema to Grammar Converter (Header-Only)

namespace lloyal::chat_template
Functions

FormatResult lloyal::chat_template::format(const llama_model *model, const std::string &messages_json, const std::string &template_override = "")
    Format chat messages using the model's chat template, with fallback.

bool lloyal::chat_template::validate(const std::string &template_str)
    Validate chat template syntax.
Chat Template Formatting.
Orchestrates chat template processing with fallback error handling. Wraps helpers.hpp functions and adds graceful degradation when template processing fails.
Architecture:
Definition in file chat_template.hpp.