ant_ai.core.response
ChatLLMResponse
pydantic-model
Bases: BaseModel
Response from a chat-based LLM. It follows an OpenAI-like structure.
Show JSON schema:
{
"$defs": {
"Message": {
"description": "Generic message used in a conversation",
"properties": {
"kind": {
"const": "message",
"default": "message",
"title": "Kind",
"type": "string"
},
"role": {
"title": "Role",
"type": "string"
},
"content": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Content"
},
"metadata": {
"additionalProperties": true,
"title": "Metadata",
"type": "object"
}
},
"required": [
"role"
],
"title": "Message",
"type": "object"
},
"ToolCall": {
"description": "Single tool call object inside assistant.tool_calls (OpenAI schema).",
"properties": {
"id": {
"title": "Id",
"type": "string"
},
"type": {
"default": "function",
"title": "Type",
"type": "string"
},
"function": {
"$ref": "#/$defs/ToolFunction"
}
},
"required": [
"id",
"function"
],
"title": "ToolCall",
"type": "object"
},
"ToolFunction": {
"description": "Inner function payload for a tool call (OpenAI schema).",
"properties": {
"name": {
"title": "Name",
"type": "string"
},
"arguments": {
"title": "Arguments",
"type": "string"
}
},
"required": [
"name",
"arguments"
],
"title": "ToolFunction",
"type": "object"
}
},
"description": "Response from a chat-based LLM. It follows a OpenAI-like structure.",
"properties": {
"message": {
"$ref": "#/$defs/Message",
"description": "The assistant message returned by the model."
},
"tool_calls": {
"description": "Tool calls requested by the model. Empty when the model produced a text answer.",
"items": {
"$ref": "#/$defs/ToolCall"
},
"title": "Tool Calls",
"type": "array"
},
"finish_reason": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"description": "The reason the model stopped generating (e.g. 'stop', 'tool_calls').",
"title": "Finish Reason"
},
"usage": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"description": "Token usage statistics reported by the model.",
"title": "Usage"
},
"reasoning": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"description": "Reasoning/thinking content produced by the model, if supported.",
"title": "Reasoning"
}
},
"required": [
"message"
],
"title": "ChatLLMResponse",
"type": "object"
}
Fields:
- message (Message)
- tool_calls (list[ToolCall])
- finish_reason (str | None)
- usage (dict | None)
- reasoning (str | None)
Source code in src/ant_ai/core/response.py
message
pydantic-field
message: Message
The assistant message returned by the model.
tool_calls
pydantic-field
tool_calls: list[ToolCall]
Tool calls requested by the model. Empty when the model produced a text answer.
finish_reason
pydantic-field
finish_reason: str | None = None
The reason the model stopped generating (e.g. 'stop', 'tool_calls').
usage
pydantic-field
usage: dict | None = None
Token usage statistics reported by the model.
reasoning
pydantic-field
reasoning: str | None = None
Reasoning/thinking content produced by the model, if supported.
ChatLLMStreamChunk
pydantic-model
Bases: BaseModel
A chunk of a streaming response from a chat-based LLM.
Show JSON schema:
{
"$defs": {
"MessageChunk": {
"description": "Represents *partial output* from an LLM during streaming.\n`delta` is the content's only the newly streamed text fragment,\nnot the whole message so far.",
"properties": {
"role": {
"title": "Role",
"type": "string"
},
"delta": {
"title": "Delta",
"type": "string"
},
"metadata": {
"additionalProperties": true,
"title": "Metadata",
"type": "object"
}
},
"required": [
"role",
"delta"
],
"title": "MessageChunk",
"type": "object"
}
},
"description": "A chunk of a streaming response from a chat-based LLM.",
"properties": {
"delta": {
"$ref": "#/$defs/MessageChunk",
"description": "Newly streamed text fragment, not the accumulated response so far."
},
"tool_calls": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"description": "Partial tool call data streamed incrementally alongside the delta.",
"title": "Tool Calls"
}
},
"required": [
"delta"
],
"title": "ChatLLMStreamChunk",
"type": "object"
}
Fields:
- delta (MessageChunk)
- tool_calls (dict | None)
Source code in src/ant_ai/core/response.py
delta
pydantic-field
delta: MessageChunk
Newly streamed text fragment, not the accumulated response so far.
tool_calls
pydantic-field
tool_calls: dict | None = None
Partial tool call data streamed incrementally alongside the delta.