| Commit message | Author | Age | Files |
First search for AI chat windows within the same tab, then within other tabs if none are found in the current tab.
The plugin now prioritizes reusing an existing chat window that matches the `aichat` filetype before considering opening a new one.
If there are no existing AI chat windows, it opens a new chat window as a last resort.
```diff
diff --git a/autoload/vim_ai.vim b/autoload/vim_ai.vim
- " reuse chat in active window or tab
+ " TODO: look for first active chat buffer. If .aichat file is used,
+ " then reuse chat in active window
- " allow .aichat files windows to be switched to, preferably on same tab
- let buffer_list_tab = tabpagebuflist(tabpagenr())
- let buffer_list_tab = filter(buffer_list_tab, 'getbufvar(v:val, "&filetype") ==# "aichat"')
-
- let buffer_list = []
- for i in range(tabpagenr('$'))
-   call extend(buffer_list, tabpagebuflist(i + 1))
- endfor
- let buffer_list = filter(buffer_list, 'getbufvar(v:val, "&filetype") ==# "aichat"')
-
- if len(buffer_list_tab) > 0
-   call win_gotoid(win_findbuf(buffer_list_tab[0])[0])
- elseif len(buffer_list) > 0
-   call win_gotoid(win_findbuf(buffer_list[0])[0])
- else
-   " open new chat window
-   let l:open_conf = l:config['ui']['open_chat_command']
-   call s:OpenChatWindow(l:open_conf)
- endif
+ " open new chat window
+ let l:open_conf = l:config['ui']['open_chat_command']
+ call s:OpenChatWindow(l:open_conf)
```
```diff
diff --git a/autoload/vim_ai.vim b/autoload/vim_ai.vim
-
- let l:chat_win_ids = win_findbuf(bufnr(s:scratch_buffer_name))
- if !empty(l:chat_win_ids)
-   " TODO: look for first active chat buffer. If .aichat file is used,
-   " then reuse chat in active window
-   call win_gotoid(l:chat_win_ids[0])
+ let l:chat_win_id = bufwinid(s:scratch_buffer_name)
+ if l:chat_win_id != -1
+   " TODO: look for first active chat buffer, in case .aichat file is used
+   " reuse chat in active window
+   call win_gotoid(l:chat_win_id)
```
allow overriding range if called on visual selection
Check whether the start and end lines of the range equal those of the
visual selection.
If so, use the visual selection;
otherwise, use the supplied range.
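The comparison described above can be sketched in Python (a hypothetical helper for illustration; the plugin itself implements this in Vim script):

```python
def resolve_range(range_start, range_end, visual_start, visual_end):
    """Pick the effective range: if the supplied range matches the visual
    selection exactly, prefer the visual selection; otherwise keep the
    supplied range. Hypothetical helper mirroring the logic above."""
    if (range_start, range_end) == (visual_start, visual_end):
        return ("visual", visual_start, visual_end)
    return ("range", range_start, range_end)
```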
feat: add an option to customize api key file location
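A vimrc fragment showing how such an option would be used (option name per the vim-ai README, `g:vim_ai_token_file_path`; adjust if your version differs):

```vim
" read the OpenAI API key from a custom location
let g:vim_ai_token_file_path = "~/.config/openai.token"
```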
fix selection handling in vim_ai functions, fixes #81
The vim-ai plugin had an issue where the AINewChat command used
<q-args> to always pass a quoted (possibly empty!) argument to the
vim_ai#AINewChatRun function, where an optional <f-args> argument was
expected.
Addresses https://github.com/madox2/vim-ai/issues/81
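The distinction can be illustrated with minimal command definitions (a sketch with hypothetical command names, not the plugin's exact code):

```vim
" <q-args> always passes one quoted string, even when no argument is given:
"   :AINewChatQ  ->  vim_ai#AINewChatRun('')
command! -nargs=? AINewChatQ call vim_ai#AINewChatRun(<q-args>)

" <f-args> passes nothing when no argument is given:
"   :AINewChatF  ->  vim_ai#AINewChatRun()
command! -nargs=? AINewChatF call vim_ai#AINewChatRun(<f-args>)
```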
fix selection handling in vim_ai functions
These changes were necessary to fix a bug where commands were not
being executed correctly in non-visual modes, and to make the code
cleaner and more efficient. By explicitly handling the visual
selection state, it ensures that the plugin functions correctly
regardless of how the user invokes the AI features.
addresses https://github.com/madox2/vim-ai/issues/76
Fix instructions which lead to `Undefined variable: g:vim_ai_chat` if config options are not initially set
feat(chat): add `include` role to include files
Files may be included in the chat via a special `include` role. Each
file's contents are added to an additional `user` role message, with
the files separated by `==> {path} <==` headers, where `{path}` is the
path to the file. Glob patterns are expanded via `glob.glob`, and paths
relative to the current working directory (as determined by `getcwd()`)
are resolved to absolute paths.
Example:
```
>>> user
Generate documentation for the following files
>>> include
/home/user/myproject/src/../requirements.txt
/home/user/myproject/**/*.py
```
Fixes: #69
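The expansion described above can be sketched in Python (hypothetical helper names; the plugin's actual implementation lives in its Python support code):

```python
import glob
import os

def expand_include_paths(lines):
    """Expand glob patterns and resolve paths relative to getcwd()
    to absolute paths (hypothetical helper)."""
    paths = []
    for line in lines:
        pattern = line if os.path.isabs(line) else os.path.join(os.getcwd(), line)
        paths.extend(glob.glob(pattern, recursive=True))
    return paths

def build_include_message(paths):
    """Join file contents into a single user-role message, each file
    preceded by a '==> {path} <==' header."""
    chunks = []
    for path in paths:
        with open(path) as f:
            chunks.append(f"==> {path} <==\n{f.read()}")
    return "\n".join(chunks)
```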
fix selection including extra content when the user is in visual mode
Custom APIs, closes #55, closes #51
Support including OpenAI Org ID in the request to OpenAI API endpoints
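The header involved can be sketched as follows (a minimal hypothetical helper; `OpenAI-Organization` is the header OpenAI documents for selecting an organization):

```python
def openai_headers(api_key, org_id=None):
    """Build request headers; include the organization header only
    when an org ID is configured."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    if org_id:
        headers["OpenAI-Organization"] = org_id
    return headers
```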
For example, you can start llama-cpp-python like this (it emulates
the OpenAI API):
```sh
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install 'llama-cpp-python[server]'
wget https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-GGUF/resolve/main/codellama-13b-instruct.Q5_K_M.gguf
python3 -m llama_cpp.server --n_gpu_layers 100 --model codellama-13b-instruct.Q5_K_M.gguf
```
Then set the API url in your `.vimrc`:
```vim
let g:vim_ai_chat = {
\ "engine": "chat",
\ "options": {
\ "base_url": "http://127.0.0.1:8000",
\ },
\ }
```
And chat with the locally hosted AI using `:AIChat`.
The change in utils.py was needed because llama-cpp-python appends a
carriage return to the final response line: `[DONE]^M`.
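A tolerant check for the stream terminator might look like this (a sketch, assuming the usual SSE `data:` framing):

```python
def is_stream_done(raw_line: bytes) -> bool:
    """Return True for the SSE terminator line. Some servers, such as
    llama-cpp-python, append a carriage return: b'data: [DONE]\\r\\n'.
    strip() handles both the newline and the stray \\r."""
    return raw_line.decode("utf-8").strip() == "data: [DONE]"
```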