liblloyal 1.0.0
Composable primitives for llama.cpp inference
chat_template.hpp File Reference

Chat Template Formatting.

#include "common.hpp"
#include "helpers.hpp"
#include <llama/llama.h>
#include "nlohmann/json.hpp"
#include <string>
#include <vector>


Classes

struct  lloyal::chat_template::FormatResult
 Result from chat template formatting. Note: the struct is named FormatResult, not ChatTemplateResult.
 

Namespaces

namespace  lloyal
 JSON Schema to Grammar Converter (Header-Only)
 
namespace  lloyal::chat_template
 

Functions

FormatResult lloyal::chat_template::format (const llama_model *model, const std::string &messages_json, const std::string &template_override="")
 Format chat messages using model's chat template with fallback.
 
bool lloyal::chat_template::validate (const std::string &template_str)
 Validate chat template syntax.
 

Detailed Description

Chat Template Formatting.

Orchestrates chat template processing with fallback error handling. Wraps helpers.hpp functions and adds graceful degradation when template processing fails.

Architecture:

  • Uses format_chat_template_complete() and validate_chat_template_helper() from helpers.hpp
  • Adds fallback to simple "role: content" format on errors
  • Provides clean FormatResult API for template formatting + stop token extraction
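
The fallback path above can be sketched as follows. This is an illustrative, self-contained approximation of the "role: content" degradation, not the library's actual implementation: the real lloyal::chat_template::format() first parses messages_json and applies the model's chat template via helpers.hpp, falling back only when template processing fails. The function name fallback_format and the message representation here are assumptions for illustration.

```cpp
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch of the fallback formatter: render each message
// as a plain "role: content" line when template processing fails.
std::string fallback_format(
    const std::vector<std::pair<std::string, std::string>>& messages) {
  std::string out;
  for (const auto& [role, content] : messages) {
    out += role + ": " + content + "\n";
  }
  return out;
}
```

A caller that received an error from template processing would emit, e.g., `system: You are helpful.\nuser: Hi\n` instead of the template-formatted prompt, so inference can proceed with degraded but valid input.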

Definition in file chat_template.hpp.