path: root/py/utils.py
Commit message | Author | Date | Files
* improved error handling, fixes #126 | Martin Bielik | 2024-11-08 | 1
|
* fixes #110, python compatibility issue with escape sequence | Martin Bielik | 2024-06-11 | 1
|
* Fix print_info_message <Esc> issue | Michael Buckley | 2024-06-04 | 1
|
I ran into an issue when first using this plugin where the print_info_message function wasn't working correctly due to vim misinterpreting the <Esc> sequence in `vim.command("normal \\<Esc>")` as a series of individual characters rather than a single literal Escape character. This resulted in the characters 'c>' being inserted into the active buffer at the cursor location, because the 's' in '<Esc>' was being interpreted as a normal mode 's', causing it to enter insert mode, and none of the info messages were being echoed properly. This was frustrating as it was not easy to figure out why my commands weren't working initially (turns out I hadn't configured my billing plan correctly, d'oh).

Fix this by using a more robust way of sending the <Esc> character to vim via `vim.command('call feedkeys("\<Esc>")')`. The usage of double quotes inside the feedkeys() call is important because it causes vim to treat the sequence as a proper escape sequence rather than a series of individual characters (see :h feedkeys).
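A minimal sketch of the fix described above, at the plugin's Python layer. The helper name `print_info_message` comes from the commit, but the surrounding redraw/echo logic is an illustrative assumption, not the plugin's exact code:

```python
import vim  # module exposed by Vim's embedded Python interpreter

def print_info_message(message):
    # Leave any pending normal-mode input first. Passing "\<Esc>" to feedkeys()
    # inside double quotes makes Vim interpret it as a single Escape keypress,
    # whereas `:normal \<Esc>` sent through vim.command() fed the literal
    # characters '<', 'E', 's', 'c', '>'.
    vim.command('call feedkeys("\\<Esc>")')
    vim.command("redraw")
    # Escape single quotes for Vim's single-quoted string syntax.
    vim.command("echo '" + message.replace("'", "''") + "'")
```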
* reusing parsing code | Martin Bielik | 2024-03-24 | 1
|
* optionally supplement roles dict by vim function source | Konfekt | 2024-03-11 | 1
|
The application was restricted to loading role configurations only from a predefined config file, which limited extensibility. Enable dynamic role configuration by invoking a custom Vim function if it is defined. This allows users to extend the role configurations beyond the static file.

diff --git a/doc/vim-ai.txt b/doc/vim-ai.txt:
-The roles in g:vim_ai_roles_config_file are converted to a Vim dictionary.
-Optionally, additional roles can be added by defining a function VimAIRoleParser()
-whose output is a dictionary of the same format as g:vim_ai_roles_config_file.
-
diff --git a/py/roles.py b/py/roles.py:
-if vim.eval('exists("*VimAIRoleParser")'):
-    roles.update(vim.eval('VimAIRoleParser()'))
-
diff --git a/py/utils.py b/py/utils.py:
-    if vim.eval('exists("*VimAIRoleParser")'):
-        roles.update(vim.eval('VimAIRoleParser()'))
-
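A rough sketch of the role-loading flow this commit describes. Only the `exists("*VimAIRoleParser")` check and the `roles.update(...)` merge come from the message above; the INI-style base config and the `load_roles` function name are assumptions for illustration:

```python
import configparser
import vim

def load_roles(roles_config_path):
    # Base roles come from the static file pointed to by g:vim_ai_roles_config_file
    # (assumed here to be INI-style and readable with configparser).
    config = configparser.ConfigParser()
    config.read(roles_config_path)
    roles = {section: dict(config[section]) for section in config.sections()}

    # If the user defined a VimAIRoleParser() Vim function, merge its result in;
    # it is expected to return a dictionary in the same format as the config file.
    # Note: vim.eval() returns the string '1' or '0' for exists().
    if vim.eval('exists("*VimAIRoleParser")') == '1':
        roles.update(vim.eval('VimAIRoleParser()'))

    return roles
```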
* support config-only roles | Martin Bielik | 2024-03-09 | 1
|
* simple error handling | Martin Bielik | 2024-03-09 | 1
|
* roles example file | Martin Bielik | 2024-03-09 | 1
|
* parse role options | Martin Bielik | 2024-03-09 | 1
|
* read role prompt from config | Martin Bielik | 2024-03-09 | 1
|
* removed config path log | Martin Bielik | 2024-03-09 | 1
|
* feat: add an option to customize api key file location | jiangyinzuo | 2024-03-08 | 1
|
* feat(chat): add `include` role to include files | Jason Kölker | 2024-01-24 | 1
|
Files may be included in the chat by a special `include` role. Each file's contents will be added to an additional `user` role message, with the files separated by `==> {path} <==`, where `{path}` is the path to the file. Globbing is expanded out via `glob.glob`, and paths relative to the current working directory (as determined by `getcwd()`) will be resolved to absolute paths.

Example:

```
>>> user
Generate documentation for the following files
>>> include
/home/user/myproject/src/../requirements.txt
/home/user/myproject/**/*.py
```

Fixes: #69
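A simplified sketch of the include expansion described above. The glob expansion, the cwd-relative resolution, and the `==> {path} <==` separator follow the commit message; the function names and the way the final message dict is assembled are illustrative assumptions:

```python
import glob
import os

def expand_include_paths(patterns):
    # Resolve each pattern relative to the current working directory and
    # expand globs (recursive globbing handles patterns such as **/*.py).
    paths = []
    for pattern in patterns:
        if not os.path.isabs(pattern):
            pattern = os.path.join(os.getcwd(), pattern)
        paths.extend(glob.glob(pattern, recursive=True))
    return [os.path.abspath(p) for p in paths]

def build_include_message(patterns):
    # Each included file becomes a chunk headed by the `==> {path} <==` marker;
    # all chunks are joined into one additional `user` role message.
    chunks = []
    for path in expand_include_paths(patterns):
        with open(path, encoding="utf-8") as f:
            chunks.append(f"==> {path} <==\n{f.read()}")
    return {"role": "user", "content": "\n".join(chunks)}
```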
* added explaining comment | Martin Bielik | 2023-12-02 | 1
|
* fix selection including extra content when the user is in visual mode | cposture | 2023-12-02 | 1
|
* fixed python3.12 slash escaping, fixes #61 | Martin Bielik | 2023-11-01 | 1
|
* Merge remote-tracking branch 'origin/main' into base-url-config | Martin Bielik | 2023-10-21 | 1
|\
| * Include OpenAI Org ID from the token config | Duy Lam | 2023-09-09 | 1
| |
* | option to disable authorization | Martin Bielik | 2023-10-21 | 1
| |
* | Add support for base_url option to use local models | juodumas | 2023-09-18 | 1
|/
For example, you can start llama-cpp-python like this (it emulates the openai api):

```sh
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install 'llama-cpp-python[server]'
wget https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-GGUF/resolve/main/codellama-13b-instruct.Q5_K_M.gguf
python3 -m llama_cpp.server --n_gpu_layers 100 --model codellama-13b-instruct.Q5_K_M.gguf
```

Then set the API url in your `.vimrc`:

```vim
let g:vim_ai_chat = {
\  "engine": "chat",
\  "options": {
\    "base_url": "http://127.0.0.1:8000",
\  },
\}
```

And chat with the locally hosted AI using `:AIChat`. The change in utils.py was needed because llama-cpp-python adds a new line to the final response: `[DONE]^M`.
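A minimal sketch of the stream handling the last paragraph refers to, assuming an OpenAI-style server-sent-events response; the function name is illustrative and this is not the plugin's actual utils.py code. Stripping each line before comparing against `[DONE]` is what tolerates the trailing `^M` (carriage return) that llama-cpp-python appends:

```python
import json

def read_chat_stream(response_lines):
    # `response_lines` is an iterable of raw byte lines from the HTTP response.
    for raw_line in response_lines:
        line = raw_line.decode("utf-8").strip()  # strip() also drops a trailing "\r"
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # llama-cpp-python sends "[DONE]\r", stripped above
            break
        yield json.loads(payload)
```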
* allow string in initial_prompt, closes #35 | Martin Bielik | 2023-06-25 | 1
|
* optional max_tokens, fixes #42 | Martin Bielik | 2023-06-11 | 1
|
* importing vim module, fixes #43 | Martin Bielik | 2023-05-23 | 1
|
* print error in debug | Martin Bielik | 2023-05-19 | 1
|
* clear echo message after completion | Martin Bielik | 2023-05-14 | 1
|
* Allow single undo | BonaBeavis | 2023-05-04 | 1
|
Fixes https://github.com/madox2/vim-ai/issues/14
* http error handling | Martin Bielik | 2023-04-26 | 1
|
* recover for unfinished chat | Martin Bielik | 2023-04-22 | 1
|
* empty message warning, reference #20 | Martin Bielik | 2023-04-18 | 1
|
* nvim keyboard interrupt handling | Martin Bielik | 2023-04-16 | 1
|
* fixed error handling | Martin Bielik | 2023-04-15 | 1
|
* using messages to show error/warning | Martin Bielik | 2023-04-15 | 1
|
* reusing error handler | Martin Bielik | 2023-04-15 | 1
|
* reorganized request options | Martin Bielik | 2023-04-15 | 1
|
* removing openai-python from docu | Martin Bielik | 2023-04-13 | 1
|
* implemented request_timeout | Martin Bielik | 2023-04-13 | 1
|
* poc: removing openai dependency | Martin Bielik | 2023-04-13 | 1
|
* moving import openai check to python scripts | Martin Bielik | 2023-04-12 | 1
|
* fixed debug variable type | Martin Bielik | 2023-04-11 | 1
|
* fixed legacy method | Martin Bielik | 2023-04-11 | 1
|
* added debug logging | Martin Bielik | 2023-04-11 | 1
|
* improved error handling | Martin Bielik | 2023-04-10 | 1
|
* populate options in chat | Martin Bielik | 2023-04-10 | 1
|
* parse chat header options | Martin Bielik | 2023-04-09 | 1
|
* chat engine | Martin Bielik | 2023-04-04 | 1
|
* Merge branch 'main' into next | Martin Bielik | 2023-04-04 | 1
|\
| * extending config programmatically | Martin Bielik | 2023-04-02 | 1
| |
* | chat initial prompt poc | Martin Bielik | 2023-03-27 | 1
|/
* completion configuration | Martin Bielik | 2023-03-22 | 1
|
* openai configuration | Martin Bielik | 2023-03-21 | 1
|