API Reference#

Here is the API reference for gptme.

gptme package#

Some of the main classes in gptme.

gptme.message#

class gptme.message.Message(role: Literal['system', 'user', 'assistant'], content: str, pinned: bool = False, hide: bool = False, quiet: bool = False, timestamp: datetime | str | None = None)#

A message in the assistant conversation.

classmethod from_toml(toml: str) Self#

Converts a TOML string to a message.

The string can be a single [[message]].

get_codeblocks(content=False) list[str]#

Get all code blocks. If content is set, return the content of the code block; otherwise return the whole message.

to_dict(keys=None)#

Return a dict representation of the message, serializable to JSON.

to_toml() str#

Converts a message to a TOML string, for easy hand-editing in an editor before being parsed back.
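
Example (a minimal sketch based only on the signatures above, round-tripping a message through TOML):

```python
from gptme.message import Message

# Construct a user message (role is one of "system", "user", "assistant").
msg = Message(role="user", content="What is 2 + 2?")

# Serialize to TOML for editing by hand, then parse it back.
toml_str = msg.to_toml()
restored = Message.from_toml(toml_str)

# Dict representation, serializable to JSON.
print(msg.to_dict())
```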

gptme.message.format_msgs(msgs: list[gptme.message.Message], oneline: bool = False, highlight: bool = False, indent: int = 0) list[str]#

Formats messages for printing to the console.

gptme.message.msgs_to_toml(msgs: list[gptme.message.Message]) str#

Converts a list of messages to a TOML string, for easy hand-editing in an editor before being parsed back.

gptme.message.print_msg(msg: Message | list[gptme.message.Message], oneline: bool = False, highlight: bool = True, show_hidden: bool = False) None#

Prints one or more messages to the console.

gptme.message.toml_to_msgs(toml: str) list[gptme.message.Message]#

Converts a TOML string to a list of messages.

The string can be a whole file with multiple [[messages]].
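
Example (a sketch of converting a conversation to TOML and back, and formatting it for the console; it assumes only the signatures above):

```python
from gptme.message import Message, format_msgs, msgs_to_toml, toml_to_msgs

msgs = [
    Message(role="user", content="List the files in the current directory."),
    Message(role="assistant", content="To list the files, run the ls command."),
]

# Round-trip the whole conversation through TOML.
toml_str = msgs_to_toml(msgs)
restored = toml_to_msgs(toml_str)

# Format for console output, one line per message.
for line in format_msgs(restored, oneline=True):
    print(line)
```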

gptme.logmanager#

class gptme.logmanager.LogManager(log: list[gptme.message.Message] | None = None, logdir: str | Path | None = None, branch: str | None = None, show_hidden=False)#

Manages a conversation log.

append(msg: Message) None#

Appends a message to the log, writes the log, and prints the message.

branch(name: str) None#

Switches to a branch.

diff(branch: str) str | None#

Prints the diff between the current branch and another branch.

edit(new_log: list[gptme.message.Message]) None#

Edits the log.

fork(name: str) None#

Copy the conversation folder to a new name.

get_last_code_block(role: Literal['user', 'assistant', 'system'] | None = None, history: int | None = None, content=False) str | None#

Returns the last code block in the log, if any.

If role is set, only check messages from that role. If history is set, only check the last n messages. If content is set, return the content of the code block; otherwise return the whole message.

classmethod load(logfile: str | ~pathlib.Path, initial_msgs: list[gptme.message.Message] = [<Message role=system content=You are gptme, an...>], branch: str = 'main', **kwargs) LogManager#

Loads a conversation log.

prepare_messages() list[gptme.message.Message]#

Prepares the log into messages before sending it to the LLM.

rename(name: str, keep_date=False) None#

Rename the conversation. Renames the folder containing the conversation and its branches.

If keep_date is True, the date part of the conversation folder name (“2021-08-01-some-name”) is kept. If you want to keep the old log, use fork().

to_dict(branches=False) dict#

Returns a dict representation of the log.

undo(n: int = 1, quiet=False) None#

Removes the last n messages from the log.

write(branches=True) None#

Writes to the conversation log.
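
Example (a sketch of managing a conversation log; the log path is hypothetical and the on-disk layout is not assumed):

```python
from gptme.logmanager import LogManager
from gptme.message import Message

# Load an existing conversation (hypothetical path).
log = LogManager.load("logs/2021-08-01-some-name", branch="main")

# Append a message; this also writes the log and prints the message.
log.append(Message(role="user", content="Show me the last code block."))

# Last code block from the assistant, if any (content only).
block = log.get_last_code_block(role="assistant", content=True)

# Remove the last message and persist the log.
log.undo(1)
log.write()
```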

gptme.server#

Endpoint functions for the server.

gptme.tools#

Tools available to gptme.

gptme.tools.execute_codeblock(codeblock: str, ask: bool) Generator[Message, None, None]#

Executes a codeblock and returns the output.

gptme.tools.execute_python(code: str, ask: bool) Generator[Message, None, None]#

Executes a python codeblock and returns the output.

gptme.tools.execute_save(fn: str, code: str, ask: bool, append: bool = False) Generator[Message, None, None]#

Save the code to a file.

gptme.tools.execute_shell(cmd: str, ask=True) Generator[Message, None, None]#

Executes a shell command and returns the output.

gptme.tools.summarize(msg: Message | list[gptme.message.Message]) Message#

Uses a cheap LLM to summarize long outputs.
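
Example (a sketch of driving the tool executors directly; it assumes ask=False skips the interactive confirmation prompt, and summarize() requires a configured LLM provider):

```python
from gptme.tools import execute_shell, summarize

# Tool executors are generators that yield system messages with the output.
output_msgs = list(execute_shell("ls", ask=False))
for msg in output_msgs:
    print(msg.content)

# Condense long outputs with a cheap LLM.
summary = summarize(output_msgs)
print(summary.content)
```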

gptme.tools.shell#

The assistant can execute shell commands by outputting code blocks with bash or sh as the language.

Example:

User
How can I list the files in the current directory?
Assistant
To list the files in the current directory, use the `ls` command:
```bash
ls
```
System
Ran command: `ls`
stdout:
```
file1.txt
file2.txt
```

The user can also run shell code with the /shell command:

User
/shell ls
System
Ran command: `ls`
stdout:
```
file1.txt
file2.txt
```

gptme.tools.shell.execute_shell(cmd: str, ask=True) Generator[Message, None, None]#

Executes a shell command and returns the output.

gptme.tools.python#

The assistant can execute Python code blocks.

It uses IPython to do so, and persists the IPython instance between calls to give a REPL-like experience.

User
What is 2 + 2?
Assistant
```python
2 + 2
```
System
Executed code block.
stdout:
```
4
```

The user can also run Python code with the /python command:

User
/python 2 + 2
System
Executed code block.
stdout:
```
4
```

gptme.tools.python.check_available_packages()#

Checks that essential packages like numpy, pandas, and matplotlib are available.

gptme.tools.python.execute_python(code: str, ask: bool) Generator[Message, None, None]#

Executes a python codeblock and returns the output.

gptme.tools.python.register_function(func: T) T#

Decorator to register a function to be available in the IPython instance.
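
Example (a sketch of registering a helper in the persistent IPython instance and executing a code block; greet is a hypothetical function, and ask=False is assumed to skip confirmation):

```python
from gptme.tools.python import execute_python, register_function

# Hypothetical helper, made callable from executed code blocks.
@register_function
def greet(name: str) -> str:
    return f"hello {name}"

# The IPython instance persists between calls, so greet stays available.
for msg in execute_python('greet("world")', ask=False):
    print(msg.content)
```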

gptme.tools.context#

Generate context information for a conversation.

Can include the current working directory, git status, and ctags output.

gptme.tools.context.ctags() str#

Generate ctags output for the project in the working directory.

gptme.tools.context.gen_context_msg() Message#

Generate a message with context information from the output of pwd and git status.
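
Example (a sketch of generating context for a conversation; ctags() assumes the ctags binary is installed and gen_context_msg() assumes the working directory is a git repository):

```python
from gptme.tools.context import ctags, gen_context_msg

# Message with the current working directory and git status output.
ctx = gen_context_msg()
print(ctx.content)

# ctags output for the project in the working directory.
print(ctags())
```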

gptme.tools.save#

Gives the assistant the ability to save code to a file.

Example:

User
write hello world to hello.py
Assistant
```hello.py
print("hello world")
```
System
Saved to hello.py

gptme.tools.save.execute_save(fn: str, code: str, ask: bool, append: bool = False) Generator[Message, None, None]#

Save the code to a file.
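
Example (a sketch of saving content programmatically; the filename is hypothetical and ask=False is assumed to skip confirmation):

```python
from gptme.tools.save import execute_save

# Append a line to a (hypothetical) file without prompting.
for msg in execute_save("notes.txt", "hello world\n", ask=False, append=True):
    print(msg.content)
```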

gptme.tools.patch#

Gives the LLM agent the ability to patch files, using an adapted version of git conflict markers.

Example:

User
patch the file `hello.py` to ask for the name of the user
Assistant
```patch hello.py
<<<<<<< ORIGINAL
print("Hello world")
=======
name = input("What is your name? ")
print(f"hello {name}")
>>>>>>> UPDATED
```
System
Patch applied

Inspired by aider.

gptme.tools.patch.apply(codeblock: str, content: str) str#

Applies the patch in codeblock to content.

gptme.tools.patch.execute_patch(codeblock: str, fn: str, ask: bool) Generator[Message, None, None]#

Applies the patch.
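
Example (a sketch of applying a conflict-marker patch to a string; it assumes apply() takes the bare marker block, without the surrounding code fence):

```python
from gptme.tools.patch import apply

original = 'print("Hello world")\n'

codeblock = """<<<<<<< ORIGINAL
print("Hello world")
=======
name = input("What is your name? ")
print(f"hello {name}")
>>>>>>> UPDATED"""

# Replace the ORIGINAL span in the content with the UPDATED span.
patched = apply(codeblock, original)
print(patched)
```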