liblloyal 1.0.0
Composable primitives for llama.cpp inference
lloyal::ChatTemplateResult Struct Reference

Result from complete chat template processing. More...

#include <helpers.hpp>

Public Attributes

std::string prompt
 Formatted chat prompt ready for tokenization.
 
std::vector< std::string > additional_stops
 Template-specific stop tokens (e.g., "<|im_end|>", "<|eot_id|>").
 

Detailed Description

Result from complete chat template processing.

Contains formatted prompt and dynamically detected stop tokens specific to the model's chat template (ChatML, Llama-3, etc.).

Definition at line 113 of file helpers.hpp.

Member Data Documentation

◆ additional_stops

std::vector<std::string> lloyal::ChatTemplateResult::additional_stops

Template-specific stop tokens (e.g., "<|im_end|>", "<|eot_id|>").

Definition at line 115 of file helpers.hpp.

◆ prompt

std::string lloyal::ChatTemplateResult::prompt

Formatted chat prompt ready for tokenization.

Definition at line 114 of file helpers.hpp.


The documentation for this struct was generated from the following file:

helpers.hpp