path: root/py/chat.py
Commit message  Author  Age  Files
* Add support for base_url option to use local models  juodumas  2023-09-18  1
|
|   For example, you can start llama-cpp-python like this (it emulates the openai api):
|
|   ```sh
|   CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install 'llama-cpp-python[server]'
|   wget https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-GGUF/resolve/main/codellama-13b-instruct.Q5_K_M.gguf
|   python3 -m llama_cpp.server --n_gpu_layers 100 --model codellama-13b-instruct.Q5_K_M.gguf
|   ```
|
|   Then set the API url in your `.vimrc`:
|
|   ```vim
|   let g:vim_ai_chat = {
|   \  "engine": "chat",
|   \  "options": {
|   \    "base_url": "http://127.0.0.1:8000",
|   \  },
|   \}
|   ```
|
|   And chat with the locally hosted AI using `:AIChat`. The change in utils.py was needed because llama-cpp-python adds a new line to the final response: `[DONE]^M`.
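The `[DONE]^M` detail from this commit can be sketched in Python (a minimal illustration only, not the plugin's actual utils.py code; the function name is hypothetical): stripping trailing whitespace from each streamed line makes the llama-cpp-python terminator, which ends in a carriage return, compare equal to the plain `[DONE]` sentinel sent by the OpenAI API.

```python
def is_sse_done(raw_line: bytes) -> bool:
    """Return True when a streamed SSE line is the end-of-stream marker.

    llama-cpp-python terminates the stream with "data: [DONE]" followed by
    CRLF, while the OpenAI API ends it with a plain newline; stripping
    trailing whitespace handles both variants uniformly.
    """
    line = raw_line.decode("utf-8").strip()
    return line == "data: [DONE]"

# Both terminator variants are recognized; regular data chunks are not:
print(is_sse_done(b"data: [DONE]\r\n"))     # llama-cpp-python style
print(is_sse_done(b"data: [DONE]\n"))       # OpenAI style
print(is_sse_done(b'data: {"id": "x"}\n'))  # regular chunk
```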
* allow string in initial_prompt, closes #35  Martin Bielik  2023-06-25  1
|
* print error in debug  Martin Bielik  2023-05-19  1
|
* clear echo message after completion  Martin Bielik  2023-05-14  1
|
* pass config as a parameter  Martin Bielik  2023-04-22  1
|
* recover for unfinished chat  Martin Bielik  2023-04-22  1
|
* move prompt to python  Martin Bielik  2023-04-21  1
|
* improved undo sequence break  Martin Bielik  2023-04-15  1
|
* reusing error handler  Martin Bielik  2023-04-15  1
|
* reorganized request options  Martin Bielik  2023-04-15  1
|
* using scoped variables  Martin Bielik  2023-04-15  1
|
* implemented request_timeout  Martin Bielik  2023-04-13  1
|
* poc: removing openai dependency  Martin Bielik  2023-04-13  1
|
* moving import openai check to python scripts  Martin Bielik  2023-04-12  1
|
* added debug logging  Martin Bielik  2023-04-11  1
|
* improved error handling  Martin Bielik  2023-04-10  1
|
* populate options in chat  Martin Bielik  2023-04-10  1
|
* parse chat header options  Martin Bielik  2023-04-09  1
|
* combine initial prompt with empty chat prompt  Martin Bielik  2023-04-04  1
|
* chat engine  Martin Bielik  2023-04-04  1
|
* Merge branch 'main' into next  Martin Bielik  2023-04-04  1
|\
| * break undo sequence after initial prompt  Martin Bielik  2023-04-03  1
| |
| * handle roles in python  Martin Bielik  2023-04-03  1
| |
| * trim newlines from the prompt, fixes #5  Martin Bielik  2023-04-03  1
| |
* | chat initial prompt poc  Martin Bielik  2023-03-27  1
|/
* improved request timeout message  Martin Bielik  2023-03-26  1
|
* handle connection timeout errors  Martin Bielik  2023-03-25  1
|
* completion configuration  Martin Bielik  2023-03-22  1
|
* openai configuration  Martin Bielik  2023-03-21  1
|
* fixed missing whitespace in chat  Martin Bielik  2023-03-20  1
|
* request timeout  Martin Bielik  2023-03-20  1
|
* ctrl c to cancel completion  Martin Bielik  2023-03-14  1
|
* stream complete/edit commands  Martin Bielik  2023-03-13  1
|
* chat streaming, more py3 integration  Martin Bielik  2023-03-13  1
|
* getting rid of global dependencies  Martin Bielik  2023-03-12  1
|
* adding edit and chat commands  Martin Bielik  2023-03-04  1