Commit graph

4473 commits

Author SHA1 Message Date
oobabooga
bc55feaf3e Improve host header validation in local mode 2025-04-26 15:42:17 -07:00
oobabooga
a317450dfa Update README 2025-04-26 14:59:29 -07:00
oobabooga
d1e7d9c5d5 Update CMD_FLAGS.txt 2025-04-26 09:00:56 -07:00
oobabooga
3a207e7a57 Improve the --help formatting a bit 2025-04-26 07:31:04 -07:00
oobabooga
6acb0e1bee Change a UI description 2025-04-26 05:13:08 -07:00
oobabooga
cbd4d967cc Update a --help message 2025-04-26 05:09:52 -07:00
oobabooga
19c8dced67 Move settings-template.yaml into user_data 2025-04-26 05:03:23 -07:00
oobabooga
b976112539 Remove the WSL installation scripts 2025-04-26 05:02:17 -07:00
    They were useful in 2023 but now everything runs natively on Windows.
oobabooga
763a7011c0 Remove an ancient/obsolete migration check 2025-04-26 04:59:05 -07:00
oobabooga
d9de14d1f7 Restructure the repository (#6904) 2025-04-26 08:56:54 -03:00
oobabooga
d4017fbb6d ExLlamaV3: Add kv cache quantization (#6903) 2025-04-25 21:32:00 -03:00
oobabooga
d4b1e31c49 Use --ctx-size to specify the context size for all loaders 2025-04-25 16:59:03 -07:00
    Old flags are still recognized as alternatives.
oobabooga
faababc4ea llama.cpp: Add a prompt processing progress bar 2025-04-25 16:42:30 -07:00
oobabooga
877cf44c08 llama.cpp: Add StreamingLLM (--streaming-llm) 2025-04-25 16:21:41 -07:00
oobabooga
d35818f4e1 UI: Add a collapsible thinking block to messages with <think> steps (#6902) 2025-04-25 18:02:02 -03:00
oobabooga
0dd71e78c9 Add -noavx2 portable builds 2025-04-25 09:07:14 -07:00
oobabooga
98f4c694b9 llama.cpp: Add --extra-flags parameter for passing additional flags to llama-server 2025-04-25 07:32:51 -07:00
oobabooga
b6fffbd216 UI: minor style change 2025-04-25 05:37:44 -07:00
oobabooga
2c7ff86015 Bump exllamav3 to de83084184 2025-04-25 05:28:22 -07:00
oobabooga
5993ebeb1b Bump exllamav2 to 0.2.9 2025-04-25 05:27:59 -07:00
oobabooga
23399aff3c UI: minor style change 2025-04-24 20:39:00 -07:00
oobabooga
5861013e68 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2025-04-24 20:36:20 -07:00
oobabooga
a90df27ff5 UI: Add a greeting when the chat history is empty 2025-04-24 20:33:40 -07:00
oobabooga
ae1fe87365 ExLlamaV2: Add speculative decoding (#6899) 2025-04-25 00:11:04 -03:00
Matthew Jenkins
8f2493cc60 Prevent llamacpp defaults from locking up consumer hardware (#6870) 2025-04-24 23:38:57 -03:00
oobabooga
370fe7b7cf Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2025-04-24 09:33:17 -07:00
oobabooga
8ebe868916 Fix typos in b313adf653 2025-04-24 09:32:17 -07:00
oobabooga
93fd4ad25d llama.cpp: Document the --device-draft syntax 2025-04-24 09:20:11 -07:00
oobabooga
f1b64df8dd EXL2: add another torch.cuda.synchronize() call to prevent errors 2025-04-24 09:03:49 -07:00
Ziya
60ac495d59 extensions/superboogav2: existing embedding check bug fix (#6898) 2025-04-24 12:42:05 -03:00
oobabooga
b313adf653 Bump llama.cpp, make the wheels work with any Python >= 3.7 2025-04-24 08:26:12 -07:00
oobabooga
c71a2af5ab Handle CMD_FLAGS.txt in the main code (closes #6896) 2025-04-24 08:21:06 -07:00
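
Several commits above concern command-line flags (--ctx-size for all loaders, --streaming-llm, and CMD_FLAGS.txt now being handled in the main code). As an illustration only, a CMD_FLAGS.txt might carry such flags exactly as they would be typed on the command line; the 8192 value, the flag combination, and the assumption that #-prefixed lines are treated as comments are all hypothetical here, not taken from the commits:

```text
# CMD_FLAGS.txt (illustrative sketch)
--ctx-size 8192 --streaming-llm
```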
oobabooga
bfbde73409 Make 'instruct' the default chat mode 2025-04-24 07:08:49 -07:00
oobabooga
e99c20bcb0 llama.cpp: Add speculative decoding (#6891) 2025-04-23 20:10:16 -03:00
oobabooga
9424ba17c8 UI: show only part 00001 of multipart GGUF models in the model menu 2025-04-22 19:56:42 -07:00
oobabooga
bce1b68ca9 Minor fix after previous commit 2025-04-22 18:37:36 -07:00
oobabooga
812d878812 Make the dependabot less spammy 2025-04-22 18:35:22 -07:00
oobabooga
8228822a6c Revert "Temporary change" 2025-04-22 18:01:47 -07:00
    This reverts commit 765de6f678.
oobabooga
765de6f678 Temporary change 2025-04-22 17:53:56 -07:00
oobabooga
89ec4c9ba6 Add vulkan workflow 2025-04-22 17:51:08 -07:00
oobabooga
06619e5f03 Add vulkan requirements.txt files 2025-04-22 17:46:54 -07:00
oobabooga
4335a24ff8 Fix the workflow 2025-04-22 08:14:13 -07:00
oobabooga
25cf3600aa Lint 2025-04-22 08:04:02 -07:00
oobabooga
39cbb5fee0 Lint 2025-04-22 08:03:25 -07:00
oobabooga
da1919baae Update the README 2025-04-22 08:03:22 -07:00
oobabooga
a3031795a3 Update the zip filename 2025-04-22 08:03:16 -07:00
oobabooga
008c6dd682 Lint 2025-04-22 08:02:37 -07:00
oobabooga
ee09e44c85 Portable version (#6868) 2025-04-22 09:25:57 -03:00
oobabooga
78aeabca89 Fix the transformers loader 2025-04-21 18:33:14 -07:00
oobabooga
8320190184 Fix the exllamav2_HF and exllamav3_HF loaders 2025-04-21 18:32:23 -07:00