author    Martin Bielik <mx.bielik@gmail.com>    2024-12-17 23:24:22 +0100
committer GitHub <noreply@github.com>            2024-12-17 23:24:22 +0100
commit    0f448c88e3eec091cedde2de535f69294cec5b63 (patch)
tree      56a24d4a055d58c91c8eae5ff88fbe3412554da5 /README.md
parent    dcd2393fb3f3cfe9256fecaa7459719b8ef26aa2 (diff)
download  vim-ai-0f448c88e3eec091cedde2de535f69294cec5b63.tar.gz
added table of contents
Diffstat (limited to 'README.md')
-rw-r--r--  README.md  10
1 files changed, 10 insertions, 0 deletions
diff --git a/README.md b/README.md
index f3950cb..c3df247 100644
--- a/README.md
+++ b/README.md
@@ -27,6 +27,16 @@ In case you would like to experiment with Gemini, Claude or other models running
A simple way is to use [OpenRouter](https://openrouter.ai) which has a fair pricing (and currently offers many models for [free](https://openrouter.ai/models?max_price=0)), or setup a proxy like [LiteLLM](https://github.com/BerriAI/litellm) locally.
See this simple [guide](#example-create-custom-roles-to-interact-with-openrouter-models) on configuring custom OpenRouter roles.
+## Table of Contents
+
+- [Installation](#installation)
+- [Usage](#usage)
+- [Roles](#roles)
+- [Reference](#reference)
+- [Configuration](#configuration)
+- [Key bindings](#key-bindings)
+- [Custom commands](#custom-commands)
+
## Installation
### Prerequisites
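
Note: the context lines above mention configuring custom OpenRouter roles via a proxy-style endpoint. As a rough orientation only, a role of that kind might look like the minimal sketch below; the role name, model id, and token file path are placeholder assumptions, not values taken from this commit, and the option names should be checked against the vim-ai configuration reference.

```ini
; hypothetical role entry in the file pointed to by g:vim_ai_roles_config_file
; role name, model id and token path are placeholders -- adjust to your setup
[openrouter_example]
[openrouter_example.options]
; OpenRouter exposes an OpenAI-compatible chat completions endpoint
endpoint_url = https://openrouter.ai/api/v1/chat/completions
; read the OpenRouter API key from a separate file
token_file_path = ~/.config/vim-ai-openrouter.token
; any model id listed on https://openrouter.ai/models
model = deepseek/deepseek-chat
```

With such a role in place, it would be selected per command, e.g. `:AIChat /openrouter_example explain this function`.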