Commit message   Author   Age   Files
* align AIRedo with AI/AIEdit/Chat   Konfekt   2024-02-26   1
|
|     These changes were necessary to fix a bug where commands were not being
|     executed correctly in non-visual modes, and to make the code cleaner and
|     more efficient. Explicitly handling the visual selection state ensures
|     that the plugin functions correctly regardless of how the user invokes
|     the AI features.
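A minimal sketch of the idea described above, using hypothetical names rather than vim-ai's actual code: the command reports explicitly whether it was invoked on a visual selection, instead of the function inferring it from the `'<` and `'>` marks.

```vim
" Hypothetical sketch, not vim-ai's implementation: the visual-selection
" state is passed explicitly on every invocation.
function! s:RunAI(is_selection, prompt) abort
  " use the last visual selection only when one was actually given
  let l:lines = a:is_selection ? getline("'<", "'>") : [getline('.')]
  echo printf('%d line(s) sent with prompt: %s', len(l:lines), a:prompt)
endfunction

" <range> expands to the number of range items supplied (0 in normal mode,
" 2 when the command is invoked on a visual selection)
command! -range -nargs=1 AIExample call s:RunAI(<range> == 2, <q-args>)
```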
* fix selection handling in vim_ai functions   Konfekt   2024-02-26   2
|
|     addresses https://github.com/madox2/vim-ai/issues/76
* Update README.md   Chris Stryczynski   2024-02-26   1
|
|     Fix instructions which lead to `Undefined variable: g:vim_ai_chat` if
|     config options are not initially set
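A hedged illustration of the pitfall the corrected instructions avoid (the option key is only an example, not a quote from the README): overriding a single option works only once the dictionary exists.

```vim
" Without the two get() lines, the last assignment fails with
" E121: Undefined variable: g:vim_ai_chat when no config was set earlier.
" 'temperature' is used here purely as an example option key.
let g:vim_ai_chat = get(g:, 'vim_ai_chat', {})
let g:vim_ai_chat['options'] = get(g:vim_ai_chat, 'options', {})
let g:vim_ai_chat['options']['temperature'] = 0.2
```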
* feat(chat): add `include` role to include files   Jason Kölker   2024-01-24   5
|
|     Files may be included in the chat by a special `include` role. Each
|     file's contents will be added to an additional `user` role message,
|     with the files separated by `==> {path} <==`, where `{path}` is the
|     path to the file. Globbing is expanded via `glob.glob`, and paths
|     relative to the current working directory (as determined by `getcwd()`)
|     will be resolved to absolute paths.
|
|     Example:
|
|     ```
|     >>> user
|     Generate documentation for the following files
|     >>> include
|     /home/user/myproject/src/../requirements.txt
|     /home/user/myproject/**/*.py
|     ```
|
|     Fixes: #69
* import vim before utils, fixes #43   Martin Bielik   2023-12-23   2
|
* added explaining comment   Martin Bielik   2023-12-02   2
|
* Merge pull request #64 from cposture/fix-visual-selection   Martin Bielik   2023-12-02   4
|\
| |     fix selection include extra content when the user is in visual mode
| * fix selection include extra content when the user is in visual mode   cposture   2023-12-02   4
|/
* fixed python3.12 slash escaping, fixes #61   Martin Bielik   2023-11-01   1
|
* Merge pull request #59 from madox2/base-url-config   Martin Bielik   2023-10-21   6
|\
| |     Custom APIs, closes #55, closes #51
| * removed unused import   Martin Bielik   2023-10-21   2
| |
| * Merge remote-tracking branch 'origin/main' into base-url-config   Martin Bielik   2023-10-21   4
| |\
| |/
|/|
* | use gpt-3.5-turbo-instruct by default, closes #48   Martin Bielik   2023-09-26   3
| |
* | compact org id docu   Martin Bielik   2023-09-18   1
| |
* | Merge pull request #54 from duylam/openai-org-support   Martin Bielik   2023-09-18   2
|\ \
| | |     Support including OpenAI Org ID in the request to OpenAI API endpoints
| * | Include OpenAI Org ID from the token config   Duy Lam   2023-09-09   2
|/ /
| * docu on custom apis   Martin Bielik   2023-10-21   1
| |
| * endpoint_url config   Martin Bielik   2023-10-21   5
| |
| * option to disable authorization   Martin Bielik   2023-10-21   4
| |
| * base_url extracted to config, docu   Martin Bielik   2023-10-21   5
| |
| * Add support for base_url option to use local models   juodumas   2023-09-18   3
|/
|     For example, you can start llama-cpp-python like this (it emulates the
|     openai api):
|
|     ```sh
|     CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install 'llama-cpp-python[server]'
|     wget https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-GGUF/resolve/main/codellama-13b-instruct.Q5_K_M.gguf
|     python3 -m llama_cpp.server --n_gpu_layers 100 --model codellama-13b-instruct.Q5_K_M.gguf
|     ```
|
|     Then set the API url in your `.vimrc`:
|
|     ```vim
|     let g:vim_ai_chat = {
|     \  "engine": "chat",
|     \  "options": {
|     \    "base_url": "http://127.0.0.1:8000",
|     \  },
|     \}
|     ```
|
|     And chat with the locally hosted AI using `:AIChat`.
|
|     The change in utils.py was needed because llama-cpp-python adds a new
|     line to the final response: `[DONE]^M`.
* allow string in initial_prompt, closes #35   Martin Bielik   2023-06-25   4
|
* optional max_tokens, fixes #42   Martin Bielik   2023-06-11   2
|
* selection boundary issue reference, fixes #45   Martin Bielik   2023-05-30   1
|
* importing vim module, fixes #43   Martin Bielik   2023-05-23   1
|
* tags update   Martin Bielik   2023-05-22   1
|
* print error in debug   Martin Bielik   2023-05-19   3
|
* Merge pull request #38 from madox2/next   Martin Bielik   2023-05-14   3
|\
| |     clear echo message after completion, fixes #16
| * clear echo message after completion   Martin Bielik   2023-05-14   3
|/
* Added command example, fixes #37   Martin Bielik   2023-05-11   1
|
* Merge pull request #34 from KupferDigital/main   Martin Bielik   2023-05-08   1
|\
| |     Allow modification of vim_ai_open_chat_presets
| * Allow modification of vim_ai_open_chat_presets   BonaBeavis   2023-05-07   1
|/
* Merge pull request #30 from BonaBeavis/patch-1   Martin Bielik   2023-05-05   1
|\
| |     Allow single undo
| * Allow single undo   BonaBeavis   2023-05-04   1
|/
|     Fixes https://github.com/madox2/vim-ai/issues/14
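For background on the single-undo behaviour, a sketch of the usual Vim technique with a hypothetical helper, not the actual patch: every edit after the first is joined to the previous undo block, so a single `u` reverts the whole completion.

```vim
" Hypothetical helper illustrating undojoin, not vim-ai's code.
let s:first_chunk = 1

function! s:AppendChunk(text) abort
  if !s:first_chunk
    " merge this change with the previous one in the undo history
    undojoin
  endif
  call append(line('.'), a:text)
  let s:first_chunk = 0
endfunction
```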
* configurable paste mode   Martin Bielik   2023-05-02   4
|
* using value from extended config   Martin Bielik   2023-05-02   1
|
* http error handling   Martin Bielik   2023-04-26   1
|
* wiki references   Martin Bielik   2023-04-24   1
|
* split to AI commands and utilities   Martin Bielik   2023-04-23   1
|
* renamed custom commands examples   Martin Bielik   2023-04-23   1
|
* added AINewChat command   Martin Bielik   2023-04-22   4
|
* open_chat_command presets   Martin Bielik   2023-04-22   4
|
* custom commands documentation   Martin Bielik   2023-04-22   3
|
* pass config as a parameter   Martin Bielik   2023-04-22   5
|
* fixed redo in chat   Martin Bielik   2023-04-22   1
|
* fixed redo in chat   Martin Bielik   2023-04-22   1
|
* recover for unfinished chat   Martin Bielik   2023-04-22   2
|
* Merge branch 'main' into next   Martin Bielik   2023-04-21   1
|\
| * docu typo fix   Martin Bielik   2023-04-21   1
| |
* | move prompt to python   Martin Bielik   2023-04-21   2
| |