liblloyal 1.0.0
Composable primitives for llama.cpp inference
#include <chat-template.hpp>
minja::chat_template_inputs Struct Reference

Public Attributes

| Type                                   | Member                                         |
|----------------------------------------|------------------------------------------------|
| nlohmann::ordered_json                 | messages                                       |
| nlohmann::ordered_json                 | tools                                          |
| bool                                   | add_generation_prompt = true                   |
| nlohmann::ordered_json                 | extra_context                                  |
| std::chrono::system_clock::time_point  | now = std::chrono::system_clock::now()         |

Definition at line 48 of file chat-template.hpp.
Member Data Documentation

bool minja::chat_template_inputs::add_generation_prompt = true
    Definition at line 51 of file chat-template.hpp.

nlohmann::ordered_json minja::chat_template_inputs::extra_context
    Definition at line 52 of file chat-template.hpp.

nlohmann::ordered_json minja::chat_template_inputs::messages
    Definition at line 49 of file chat-template.hpp.

std::chrono::system_clock::time_point minja::chat_template_inputs::now = std::chrono::system_clock::now()
    Definition at line 53 of file chat-template.hpp.

nlohmann::ordered_json minja::chat_template_inputs::tools
    Definition at line 50 of file chat-template.hpp.