Commit graph

4350 commits

Author  SHA1  Message  Date
oobabooga  1dd4aedbe1  Fix the streaming_llm UI checkbox not being interactive  2025-04-29 05:28:46 -07:00
oobabooga  c5fb51e5d1  Update README  2025-04-28 22:40:26 -07:00
oobabooga  d10bded7f8  UI: Add an enable_thinking option to enable/disable Qwen3 thinking  2025-04-28 22:37:01 -07:00
oobabooga  1ee0acc852  llama.cpp: Make --verbose print the llama-server command  2025-04-28 15:56:25 -07:00
oobabooga  15a29e99f8  Lint  2025-04-27 21:41:34 -07:00
oobabooga  be13f5199b  UI: Add an info message about how to use Speculative Decoding  2025-04-27 21:40:38 -07:00
oobabooga  c6c2855c80  llama.cpp: Remove the timeout while loading models (closes #6907)  2025-04-27 21:22:21 -07:00
oobabooga  bbcaec75b4  API: Find a new port if the default one is taken (closes #6918)  2025-04-27 21:13:16 -07:00
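The port-fallback behavior described in commit bbcaec75b4 can be sketched as follows. This is a hypothetical helper for illustration, not the project's actual code; the starting port and search range are assumptions.

```python
import socket

def find_available_port(preferred: int, max_tries: int = 100) -> int:
    """Return `preferred` if it is free, otherwise the next free port."""
    for candidate in range(preferred, preferred + max_tries):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            try:
                s.bind(("127.0.0.1", candidate))
            except OSError:
                continue  # port is taken; try the next one
            return candidate
    raise RuntimeError("no free port found in range")
```

A server using this pattern binds to the first port the probe reports as free rather than failing outright when the default is occupied.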
oobabooga  ee0592473c  Fix ExLlamaV3_HF leaking memory (attempt)  2025-04-27 21:04:02 -07:00
oobabooga  965ca7948f  Update README  2025-04-27 07:33:08 -07:00
oobabooga  f5b59d2b0b  Fix the vulkan workflow  2025-04-26 20:11:24 -07:00
oobabooga  765fea5e36  UI: minor style change  2025-04-26 19:33:46 -07:00
oobabooga  70952553c7  Lint  2025-04-26 19:29:08 -07:00
oobabooga  363b632a0d  Lint  2025-04-26 19:22:36 -07:00
oobabooga  fa861de05b  Fix portable builds with Python 3.12  2025-04-26 18:52:44 -07:00
oobabooga  7b80acd524  Fix parsing --extra-flags  2025-04-26 18:40:03 -07:00
oobabooga  943451284f  Fix the Notebook tab not loading its default prompt  2025-04-26 18:25:06 -07:00
oobabooga  511eb6aa94  Fix saving settings to settings.yaml  2025-04-26 18:20:00 -07:00
oobabooga  8b83e6f843  Prevent Gradio from saying 'Thank you for being a Gradio user!'  2025-04-26 18:14:57 -07:00
oobabooga  4a32e1f80c  UI: show draft_max for ExLlamaV2  2025-04-26 18:01:44 -07:00
oobabooga  0fe3b033d0  Fix parsing of --n_ctx and --max_seq_len (2nd attempt)  2025-04-26 17:52:21 -07:00
oobabooga  c4afc0421d  Fix parsing of --n_ctx and --max_seq_len  2025-04-26 17:43:53 -07:00
oobabooga  234aba1c50  llama.cpp: Simplify the prompt processing progress indicator (the progress bar was unreliable)  2025-04-26 17:33:47 -07:00
oobabooga  4ff91b6588  Better default settings for Speculative Decoding  2025-04-26 17:24:40 -07:00
oobabooga  bf2aa19b21  Bump llama.cpp  2025-04-26 16:39:22 -07:00
oobabooga  029aab6404  Revert "Add -noavx2 portable builds" (reverts commit 0dd71e78c9)  2025-04-26 16:38:13 -07:00
oobabooga  35717a088c  API: Add an /v1/internal/health endpoint  2025-04-26 15:42:27 -07:00
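A client can poll the /v1/internal/health endpoint added in commit 35717a088c to wait for the server to come up. A minimal sketch, assuming the API's default address is http://127.0.0.1:5000 and that the endpoint answers HTTP 200 when healthy (both assumptions, not confirmed by the log):

```python
from urllib.request import urlopen
from urllib.error import URLError

def is_server_healthy(base_url: str = "http://127.0.0.1:5000") -> bool:
    """Probe the /v1/internal/health endpoint; True only on HTTP 200."""
    try:
        with urlopen(f"{base_url}/v1/internal/health", timeout=5) as resp:
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused, timeout, or DNS failure: not healthy yet.
        return False
```

Such a probe is handy in launch scripts that must not send completion requests before the model server is ready.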
oobabooga  bc55feaf3e  Improve host header validation in local mode  2025-04-26 15:42:17 -07:00
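Host-header validation of the kind commit bc55feaf3e refers to typically rejects requests whose Host header is not a loopback name, which blocks DNS-rebinding attacks against locally bound APIs. The allow-list and parsing below are an illustrative guess, not the project's actual logic:

```python
# Hypothetical host-header check for a server bound to localhost.
ALLOWED_HOSTS = {"localhost", "127.0.0.1", "[::1]"}

def host_is_allowed(host_header: str) -> bool:
    """Accept only loopback hostnames, ignoring an optional port suffix."""
    host = host_header.lower()
    if host.startswith("["):            # bracketed IPv6, e.g. [::1]:5000
        host = host[: host.find("]") + 1]
    else:
        host = host.split(":", 1)[0]    # strip an optional :port
    return host in ALLOWED_HOSTS
```

A request carrying `Host: evil.example.com` is refused even though it reached the loopback socket, because the browser-controlled header does not match the allow-list.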
oobabooga  a317450dfa  Update README  2025-04-26 14:59:29 -07:00
oobabooga  d1e7d9c5d5  Update CMD_FLAGS.txt  2025-04-26 09:00:56 -07:00
oobabooga  3a207e7a57  Improve the --help formatting a bit  2025-04-26 07:31:04 -07:00
oobabooga  6acb0e1bee  Change a UI description  2025-04-26 05:13:08 -07:00
oobabooga  cbd4d967cc  Update a --help message  2025-04-26 05:09:52 -07:00
oobabooga  19c8dced67  Move settings-template.yaml into user_data  2025-04-26 05:03:23 -07:00
oobabooga  b976112539  Remove the WSL installation scripts (they were useful in 2023, but now everything runs natively on Windows)  2025-04-26 05:02:17 -07:00
oobabooga  763a7011c0  Remove an ancient/obsolete migration check  2025-04-26 04:59:05 -07:00
oobabooga  d9de14d1f7  Restructure the repository (#6904)  2025-04-26 08:56:54 -03:00
oobabooga  d4017fbb6d  ExLlamaV3: Add kv cache quantization (#6903)  2025-04-25 21:32:00 -03:00
oobabooga  d4b1e31c49  Use --ctx-size to specify the context size for all loaders (old flags are still recognized as alternatives)  2025-04-25 16:59:03 -07:00
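Accepting one canonical flag while keeping legacy spellings alive, as commit d4b1e31c49 describes for --ctx-size, can be done in Python's argparse by listing the old names as aliases of the same destination. The wiring below is a sketch of that pattern, not the project's actual parser:

```python
import argparse

parser = argparse.ArgumentParser()
# --ctx-size is the canonical flag; the older loader-specific names
# remain recognized and fill the same destination.
parser.add_argument("--ctx-size", "--n_ctx", "--max_seq_len",
                    dest="ctx_size", type=int, default=None)

args = parser.parse_args(["--n_ctx", "4096"])
print(args.ctx_size)  # -> 4096: the legacy spelling still works
```

Because all three option strings share one `dest`, downstream code reads `args.ctx_size` and never needs to know which spelling the user typed.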
oobabooga  faababc4ea  llama.cpp: Add a prompt processing progress bar  2025-04-25 16:42:30 -07:00
oobabooga  877cf44c08  llama.cpp: Add StreamingLLM (--streaming-llm)  2025-04-25 16:21:41 -07:00
oobabooga  d35818f4e1  UI: Add a collapsible thinking block to messages with <think> steps (#6902)  2025-04-25 18:02:02 -03:00
oobabooga  0dd71e78c9  Add -noavx2 portable builds  2025-04-25 09:07:14 -07:00
oobabooga  98f4c694b9  llama.cpp: Add --extra-flags parameter for passing additional flags to llama-server  2025-04-25 07:32:51 -07:00
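A pass-through option like --extra-flags (commit 98f4c694b9) has to turn one user-supplied string into extra argv entries for the llama-server subprocess. The comma-separated `key=value` format below is an assumed illustration, not necessarily the project's exact syntax:

```python
def parse_extra_flags(extra_flags: str) -> list[str]:
    """Turn 'flash-attn,threads=8' into ['--flash-attn', '--threads', '8'].

    The comma-separated key=value format is an assumption made for this
    sketch; consult the actual --extra-flags documentation for the
    supported syntax.
    """
    argv = []
    for item in filter(None, (p.strip() for p in extra_flags.split(","))):
        if "=" in item:
            key, value = item.split("=", 1)
            argv += [f"--{key}", value]   # valued flag
        else:
            argv.append(f"--{item}")      # boolean flag
    return argv
```

The resulting list would be appended to the llama-server command line, letting users reach server options the UI does not expose directly.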
oobabooga  b6fffbd216  UI: minor style change  2025-04-25 05:37:44 -07:00
oobabooga  2c7ff86015  Bump exllamav3 to de83084184  2025-04-25 05:28:22 -07:00
oobabooga  5993ebeb1b  Bump exllamav2 to 0.2.9  2025-04-25 05:27:59 -07:00
oobabooga  23399aff3c  UI: minor style change  2025-04-24 20:39:00 -07:00
oobabooga  5861013e68  Merge remote-tracking branch 'refs/remotes/origin/dev' into dev  2025-04-24 20:36:20 -07:00
oobabooga  a90df27ff5  UI: Add a greeting when the chat history is empty  2025-04-24 20:33:40 -07:00