LLM

Using the \llm Command (AI-assisted SQL)

The \llm special command lets you ask natural-language questions and get SQL proposed for you. It uses the open‑source llm CLI under the hood and enriches your prompt with database context (schema and one sample row per table) so answers can include runnable SQL.

Alias: \ai works the same as \llm.


Quick Start

1) Make sure mycli is installed with the [llm] extra:

pip install 'mycli[llm]'

or that the llm dependency is installed separately:

pip install llm

2) From the mycli prompt, configure your API key (only needed for remote providers like OpenAI):

\llm keys set openai

3) Ask a question. The response’s SQL (inside a ```sql fenced block) is extracted and pre-filled at the prompt:

World> \llm "Capital of India?"
-- Answer text from the model...
-- ```sql
-- SELECT ...;
-- ```
-- Your prompt is prefilled with the SQL above.

You can now hit Enter to run, or edit the query first.
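
Optionally, set the default model llm should use. For example (gpt-4o-mini is only an illustration; any model ID listed by \llm models works, as noted below):

World> \llm models default gpt-4o-mini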


What Context Is Sent

When you ask a plain question via \llm "...", mycli:

- Sends your question.
- Adds your current database schema: table names with column types.
- Adds one sample row (if available) from each table.

This helps the model propose SQL that fits your schema. Follow‑ups using -c continue the same conversation and do not re-send the DB context (see “Continue Conversation (-c)”).

Note: Context is gathered from the current connection. If you are not connected, using contextual mode will fail — connect first.
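
As a rough illustration only (the exact wording is defined by mycli's prompt template, so treat this as a sketch), the context sent for the World sample database might look something like:

Table: city
Columns: ID int, Name char(35), CountryCode char(3), District char(20), Population int
Sample row: 1, Kabul, AFG, Kabol, 1780000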


Using llm Subcommands from mycli

You can run any llm CLI subcommand by prefixing it with \llm inside mycli. Examples:
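
A few illustrative subcommands (all plain llm CLI commands; what they print depends on the models and plugins you have installed, and llm-ollama is a third-party plugin):

World> \llm models
World> \llm keys list
World> \llm templates list
World> \llm install llm-ollama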

Tab completion works for \llm subcommands, and even for model IDs under models default.

Aside: see https://ollama.com/ if you want to run local models.


Ask Questions With DB Context (default)

Ask your question in quotes. mycli sends database context and extracts a SQL block if present.

World> \llm "Most visited urls?"

Behavior:

- The response is printed in the output pane.
- If the response contains a ```sql fenced block, mycli extracts the SQL and pre-fills it at your prompt.
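
For instance, a hypothetical exchange (the page_views table and the model's reply are invented for illustration; real output will vary):

World> \llm "Most visited urls?"
-- Model commentary...
-- ```sql
-- SELECT url, COUNT(*) AS visits
-- FROM page_views           -- hypothetical table
-- GROUP BY url
-- ORDER BY visits DESC
-- LIMIT 10;
-- ```
-- The SELECT above is now prefilled at your prompt.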


Continue Conversation (-c)

Use -c to ask a follow‑up that continues the previous conversation with the model. This does not re-send the DB context; it relies on the ongoing thread.

World> \llm "Top 10 customers by spend"
-- model returns analysis and a ```sql block; SQL is prefilled
World> \llm -c "Now include each customer's email and order count"

Behavior:

- Continues the last conversation in the llm history.
- Database context is not re-sent on follow-ups.
- If the response includes a ```sql block, the SQL is pre-filled at your prompt.


Examples

For running models locally, see https://ollama.com/.
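
For example, to use a local model served by Ollama (a sketch: it assumes Ollama is running, that you have pulled a model such as llama3.2, and it relies on the third-party llm-ollama plugin):

World> \llm install llm-ollama
World> \llm models default llama3.2
World> \llm "Which country has the most cities in this database?"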


Customize the Prompt Template

mycli uses a saved llm template named mycli-llm-template for contextual questions. You can view or edit it:

World> \llm templates edit mycli-llm-template

Tip: After first use, mycli ensures this template exists. To just view it without editing, use:

World> \llm templates show mycli-llm-template
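
You can also confirm the template exists by listing all saved templates:

World> \llm templates list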

Troubleshooting


Notes and Safety

Turning Off LLM Support

To turn off LLM support even when the llm dependency is installed, set the MYCLI_LLM_OFF environment variable:

export MYCLI_LLM_OFF=1

This may be desirable for faster startup times.
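
To turn LLM support back on, unset the variable (or remove it from your shell profile):

unset MYCLI_LLM_OFF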


Learn More

The llm CLI itself is documented at https://llm.datasette.io/, which covers models, plugins, keys, and templates in depth.
