# LLM

Base LLM.

## BaseLLM

Bases: `ABC`

Base LLM Class.

Source code in `src/llm_agents_from_scratch/base/llm.py`
### complete `abstractmethod` `async`

Text completion.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The prompt the LLM should use as input. | *required* |
| `**kwargs` | `Any` | Additional keyword arguments. | `{}` |
Returns:

| Name | Type | Description |
|---|---|---|
| `str` | `CompleteResult` | The completion of the prompt. |
Source code in src/llm_agents_from_scratch/base/llm.py
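A minimal sketch of how a concrete subclass might satisfy this interface. `EchoLLM` is hypothetical and returns `str` in place of the library's `CompleteResult` for brevity; the real class in `src/llm_agents_from_scratch/base/llm.py` may differ in detail.

```python
import asyncio
from abc import ABC, abstractmethod
from typing import Any

# Minimal stand-in for the BaseLLM interface described above (assumption:
# the real base class has additional members).
class BaseLLM(ABC):
    @abstractmethod
    async def complete(self, prompt: str, **kwargs: Any) -> str:
        """Text completion."""

# Toy subclass for illustration only: it just echoes the prompt back.
class EchoLLM(BaseLLM):
    async def complete(self, prompt: str, **kwargs: Any) -> str:
        return f"Echo: {prompt}"

result = asyncio.run(EchoLLM().complete("Hello"))
print(result)  # Echo: Hello
```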
### structured_output `abstractmethod` `async`

Structured output interface for returning `~pydantic.BaseModel`s.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt` | `str` | The prompt to elicit the structured output response. | *required* |
| `mdl` | `type[StructuredOutputType]` | The `~pydantic.BaseModel` to output. | *required* |
| `**kwargs` | `Any` | Additional keyword arguments. | `{}` |
Returns:

| Name | Type | Description |
|---|---|---|
| `StructuredOutputType` | `StructuredOutputType` | The structured output as an instance of the specified model. |
Source code in src/llm_agents_from_scratch/base/llm.py
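A sketch of how a subclass might implement this method. The real interface expects `~pydantic.BaseModel` subclasses; a plain dataclass stands in here to keep the example dependency-free, and `FakeLLM`, `Person`, and the canned return value are all hypothetical.

```python
import asyncio
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Any, TypeVar

# Stand-in for the library's StructuredOutputType type variable.
StructuredOutputType = TypeVar("StructuredOutputType")

class BaseLLM(ABC):
    @abstractmethod
    async def structured_output(
        self,
        prompt: str,
        mdl: type[StructuredOutputType],
        **kwargs: Any,
    ) -> StructuredOutputType:
        """Structured output interface."""

# Hypothetical output model (a dataclass instead of pydantic for brevity).
@dataclass
class Person:
    name: str
    age: int

class FakeLLM(BaseLLM):
    async def structured_output(self, prompt, mdl, **kwargs):
        # A real implementation would parse the model's response into mdl;
        # here we return a canned instance for illustration.
        return mdl(name="Ada", age=36)

person = asyncio.run(
    FakeLLM().structured_output("Who wrote the first program?", Person)
)
print(person.name)  # Ada
```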
### chat `abstractmethod` `async`

Chat interface.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `input` | `str` | The user's current input. | *required* |
| `chat_history` | `Sequence[ChatMessage] \| None` | The chat history. | `None` |
| `tools` | `Sequence[BaseTool] \| None` | Tools that the LLM can call. | `None` |
| `**kwargs` | `Any` | Additional keyword arguments. | `{}` |
Returns:

| Type | Description |
|---|---|
| `tuple[ChatMessage, ChatMessage]` | A tuple of two ChatMessage objects: the first is the ChatMessage created from the supplied input string, and the second is the LLM's response. |
Source code in src/llm_agents_from_scratch/base/llm.py
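The return contract above can be sketched as follows. Both the simplified `ChatMessage` (assumed to carry at least a role and content) and the toy `chat` implementation are illustrative stand-ins; a real subclass would call an actual model and honor `chat_history` and `tools`.

```python
import asyncio
from dataclasses import dataclass

# Simplified stand-in for the library's ChatMessage type.
@dataclass
class ChatMessage:
    role: str
    content: str

# Toy chat implementation: wraps the input as a user message and
# returns it alongside a canned assistant response.
async def chat(input: str, chat_history=None, tools=None):
    user_msg = ChatMessage(role="user", content=input)
    response = ChatMessage(role="assistant", content=f"You said: {input}")
    return user_msg, response

user_msg, response = asyncio.run(chat("Hi there"))
print(user_msg.role, "->", response.content)  # user -> You said: Hi there
```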
### continue_chat_with_tool_results `abstractmethod` `async`

Continue a chat by submitting tool call results.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `tool_call_results` | `Sequence[ToolCallResult]` | Tool call results. | *required* |
| `chat_history` | `Sequence[ChatMessage]` | The chat history. | *required* |
| `tools` | `Sequence[BaseTool] \| None` | Tools that the LLM can call. | `None` |
| `**kwargs` | `Any` | Additional keyword arguments. | `{}` |
Returns:

| Type | Description |
|---|---|
| `tuple[list[ChatMessage], ChatMessage]` | A tuple whose first element is a list of ChatMessage objects converted from the supplied ToolCallResult objects, and whose second element is the LLM's response ChatMessage. |
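A sketch of the conversion described above, under the assumption that each ToolCallResult becomes one tool-role message before the LLM produces its final response. The simplified `ChatMessage` and `ToolCallResult` dataclasses and the canned response are illustrative only.

```python
import asyncio
from dataclasses import dataclass

# Simplified stand-ins for the library's types.
@dataclass
class ChatMessage:
    role: str
    content: str

@dataclass
class ToolCallResult:
    tool_name: str
    content: str

# Sketch: convert each ToolCallResult into a tool-role ChatMessage,
# then return those messages together with a final assistant response.
async def continue_chat_with_tool_results(tool_call_results, chat_history):
    tool_messages = [
        ChatMessage(role="tool", content=f"{r.tool_name}: {r.content}")
        for r in tool_call_results
    ]
    response = ChatMessage(
        role="assistant",
        content=f"Incorporated {len(tool_messages)} tool result(s).",
    )
    return tool_messages, response

results = [ToolCallResult(tool_name="search", content="42")]
tool_msgs, response = asyncio.run(
    continue_chat_with_tool_results(results, chat_history=[])
)
print(tool_msgs[0].content)  # search: 42
```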