| Date | Commit message | Author | Files |
|---|---|---|---|
| 2025-01-31 | chore: rebase and fix up conflicts | Max Resnick | 1 | |
| 2024-12-21 | image to text support, closes #134 | Martin Bielik | 1 | |
| 2024-12-16 | refactoring: import python when needed, run as functions | Martin Bielik | 1 | |
| 2024-12-15 | refactoring: make prompt in python | Martin Bielik | 1 | |
| 2024-12-15 | unified config parsing + tests | Martin Bielik | 1 | |
| 2024-12-12 | fixed complete command roles after refactoring | Martin Bielik | 1 | |
| 2024-12-08 | print prompt in debug mode | Martin Bielik | 1 | |
| 2024-12-07 | fixed options normalization | Martin Bielik | 1 | |
| 2024-12-05 | fixed stream=0 in chat engine | Martin Bielik | 1 | |
| 2024-10-08 | support non streaming api | Martin Bielik | 1 | |
| 2024-03-09 | fix using role in existing chat | Martin Bielik | 1 | |
| 2024-03-09 | parse role options | Martin Bielik | 1 | |
| 2024-03-09 | read role prompt from config | Martin Bielik | 1 | |
| 2023-12-23 | import vim before utils, fixes #43 | Martin Bielik | 1 | |
| 2023-12-02 | fix selection include extra content when the user is in visual mode | cposture | 1 | |
| 2023-10-21 | removed unused import | Martin Bielik | 1 | |
| 2023-10-21 | endpoint_url config | Martin Bielik | 1 | |
| 2023-10-21 | base_url extracted to config, docu | Martin Bielik | 1 | |
| 2023-09-18 | Add support for base_url option to use local models | juodumas | 1 | |
Extended message of the commit above:

For example, you can start llama-cpp-python like this (it emulates the OpenAI API):

```sh
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install 'llama-cpp-python[server]'
wget https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-GGUF/resolve/main/codellama-13b-instruct.Q5_K_M.gguf
python3 -m llama_cpp.server --n_gpu_layers 100 --model codellama-13b-instruct.Q5_K_M.gguf
```

Then set the API URL in your `.vimrc`:

```vim
let g:vim_ai_chat = {
\  "engine": "chat",
\  "options": {
\    "base_url": "http://127.0.0.1:8000",
\  },
\}
```

Then chat with the locally hosted model using `:AIChat`. The change in utils.py was needed because llama-cpp-python adds an extra line to the final response: `[DONE]^M`.

| Date | Commit message | Author | Files |
|---|---|---|---|
| 2023-06-25 | allow string in initial_prompt, closes #35 | Martin Bielik | 1 | |
| 2023-05-19 | print error in debug | Martin Bielik | 1 | |
| 2023-05-14 | clear echo message after completion | Martin Bielik | 1 | |
| 2023-04-22 | pass config as a parameter | Martin Bielik | 1 | |
| 2023-04-22 | recover for unfinished chat | Martin Bielik | 1 | |
| 2023-04-21 | move prompt to python | Martin Bielik | 1 | |
| 2023-04-15 | improved undo sequence break | Martin Bielik | 1 | |
| 2023-04-15 | reusing error handler | Martin Bielik | 1 | |
| 2023-04-15 | reorganized request options | Martin Bielik | 1 | |
| 2023-04-15 | using scoped variables | Martin Bielik | 1 | |
| 2023-04-13 | implemented request_timeout | Martin Bielik | 1 | |
| 2023-04-13 | poc: removing openai dependency | Martin Bielik | 1 | |
| 2023-04-12 | moving import openai check to python scripts | Martin Bielik | 1 | |
| 2023-04-11 | added debug logging | Martin Bielik | 1 | |
| 2023-04-10 | improved error handling | Martin Bielik | 1 | |
| 2023-04-10 | populate options in chat | Martin Bielik | 1 | |
| 2023-04-09 | parse chat header options | Martin Bielik | 1 | |
| 2023-04-04 | combine initial prompt with empty chat prompt | Martin Bielik | 1 | |
| 2023-04-04 | chat engine | Martin Bielik | 1 | |
| 2023-04-03 | break undo sequence after initial prompt | Martin Bielik | 1 | |
| 2023-04-03 | handle roles in python | Martin Bielik | 1 | |
| 2023-04-03 | trim newlines from the prompt, fixes #5 | Martin Bielik | 1 | |
| 2023-03-27 | chat initial prompt poc | Martin Bielik | 1 | |
| 2023-03-26 | improved request timeout message | Martin Bielik | 1 | |
| 2023-03-25 | handle connection timeout errors | Martin Bielik | 1 | |
| 2023-03-22 | completion configuration | Martin Bielik | 1 | |
| 2023-03-21 | openai configuration | Martin Bielik | 1 | |
| 2023-03-20 | fixed missing whitespace in chat | Martin Bielik | 1 | |
| 2023-03-20 | request timeout | Martin Bielik | 1 | |
| 2023-03-14 | ctrl c to cancel completion | Martin Bielik | 1 | |
| 2023-03-13 | stream complete/edit commands | Martin Bielik | 1 | |
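The `[DONE]^M` quirk mentioned in the base_url commit message above (llama-cpp-python terminating its stream with a trailing carriage return) can be illustrated with a minimal sketch. This is a hypothetical standalone parser, not the plugin's actual utils.py code: the function name `parse_sse_lines` and the sample stream are invented for illustration. The key point is that the `[DONE]` sentinel check must tolerate a trailing `\r`.

```python
def parse_sse_lines(lines):
    """Yield JSON payload strings from raw SSE lines, stopping at [DONE].

    Stripping each line first means "data: [DONE]\r" (the "[DONE]^M"
    variant sent by llama-cpp-python) matches the sentinel the same way
    a plain "data: [DONE]" would.
    """
    for raw in lines:
        line = raw.strip()  # removes the trailing "\r" ("^M") as well
        if not line.startswith("data:"):
            continue  # skip blank lines and other SSE fields
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel, with or without "\r"
        yield payload

# Hypothetical stream: one content chunk, then the CR-terminated sentinel.
stream = ['data: {"text": "hello"}', "data: [DONE]\r"]
print(list(parse_sse_lines(stream)))  # -> ['{"text": "hello"}']
```

Without the `strip()`, a naive `payload == "[DONE]"` comparison would fail on `"[DONE]\r"` and the parser would try to decode the sentinel as JSON.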