liblloyal 1.0.0
Branched Inference for llama.cpp
lloyal::chat_in::FormatInputs Struct Reference

Input parameters for chat formatting. More...

#include <lloyal/chat_in.hpp>

Public Attributes

std::string messages_json
 JSON array of OpenAI-format messages (required).
 
std::string template_override = ""
 Optional Jinja2 template override.
 
bool add_generation_prompt = true
 Append the assistant prompt prefix (set to false for partial formatting).
 
std::string tools_json = ""
 JSON array of OpenAI-format tool definitions.
 
std::string tool_choice = "auto"
 Tool selection mode: "auto", "required", or "none".
 
bool parallel_tool_calls = false
 Allow parallel tool calls.
 
std::string reasoning_format = "none"
 Reasoning extraction format: "none", "auto", "deepseek", or "deepseek_legacy".
 
bool enable_thinking = true
 Enable <think> blocks (pairs with reasoning_format).
 
std::string json_schema = ""
 JSON schema for structured output.
 
std::string grammar = ""
 Explicit GBNF grammar string.
 

Detailed Description

Input parameters for chat formatting.

Controls all format-awareness fields passed to common_chat_templates_apply(). All fields have sensible defaults so callers can provide only what they need.

Definition at line 64 of file chat_in.hpp.

Member Data Documentation

◆ add_generation_prompt

bool lloyal::chat_in::FormatInputs::add_generation_prompt = true

Append the assistant prompt prefix (set to false for partial formatting).

Definition at line 67 of file chat_in.hpp.

◆ enable_thinking

bool lloyal::chat_in::FormatInputs::enable_thinking = true

Enable <think> blocks (pairs with reasoning_format).

Definition at line 72 of file chat_in.hpp.

◆ grammar

std::string lloyal::chat_in::FormatInputs::grammar = ""

Explicit GBNF grammar string.

Definition at line 74 of file chat_in.hpp.

◆ json_schema

std::string lloyal::chat_in::FormatInputs::json_schema = ""

JSON schema for structured output.

Definition at line 73 of file chat_in.hpp.

◆ messages_json

std::string lloyal::chat_in::FormatInputs::messages_json

JSON array of OpenAI-format messages (required).

Definition at line 65 of file chat_in.hpp.

◆ parallel_tool_calls

bool lloyal::chat_in::FormatInputs::parallel_tool_calls = false

Allow parallel tool calls.

Definition at line 70 of file chat_in.hpp.

◆ reasoning_format

std::string lloyal::chat_in::FormatInputs::reasoning_format = "none"

Reasoning extraction format: "none", "auto", "deepseek", or "deepseek_legacy".

Definition at line 71 of file chat_in.hpp.

◆ template_override

std::string lloyal::chat_in::FormatInputs::template_override = ""

Optional Jinja2 template override.

Definition at line 66 of file chat_in.hpp.

◆ tool_choice

std::string lloyal::chat_in::FormatInputs::tool_choice = "auto"

Tool selection mode: "auto", "required", or "none".

Definition at line 69 of file chat_in.hpp.

◆ tools_json

std::string lloyal::chat_in::FormatInputs::tools_json = ""

JSON array of OpenAI-format tool definitions.

Definition at line 68 of file chat_in.hpp.


The documentation for this struct was generated from the following file:

chat_in.hpp