Commit graph

4244 commits

Author SHA1 Message Date
oobabooga
5ad080ff25 Attempt at making the llama-server streaming more efficient. 2025-04-18 18:04:49 -07:00
oobabooga
4fabd729c9 Fix the API without streaming or without 'sampler_priority' (closes #6851) 2025-04-18 17:25:22 -07:00
oobabooga
5135523429 Fix the new llama.cpp loader failing to unload models 2025-04-18 17:10:26 -07:00
oobabooga
8d481ef9d5 Update README 2025-04-18 11:31:22 -07:00
oobabooga
caa6afc88b Only show 'GENERATE_PARAMS=...' in the logits endpoint if use_logits is True 2025-04-18 09:57:57 -07:00
oobabooga
e52f62d3ff Update README 2025-04-18 09:29:57 -07:00
oobabooga
85c4486d4a Update the colab notebook 2025-04-18 08:53:44 -07:00
oobabooga
d00d713ace Rename get_max_context_length to get_vocabulary_size in the new llama.cpp loader 2025-04-18 08:14:15 -07:00
oobabooga
c1cc65e82e Lint 2025-04-18 08:06:51 -07:00
oobabooga
d68f0fbdf7 Remove obsolete references to llamacpp_HF 2025-04-18 07:46:04 -07:00
oobabooga
a0abf93425 Connect --rope-freq-base to the new llama.cpp loader 2025-04-18 06:53:51 -07:00
oobabooga
ef9910c767 Fix a bug after c6901aba9f 2025-04-18 06:51:28 -07:00
oobabooga
1c4a2c9a71 Make exllamav3 safer as well 2025-04-18 06:17:58 -07:00
oobabooga
03544d4fb6 Bump llama.cpp and exllamav3 to the latest commits 2025-04-18 06:14:13 -07:00
oobabooga
c6901aba9f Remove deprecation warning code 2025-04-18 06:05:47 -07:00
oobabooga
170ad3d3ec Update the README 2025-04-18 06:03:35 -07:00
oobabooga
8144e1031e Remove deprecated command-line flags 2025-04-18 06:02:28 -07:00
oobabooga
ae54d8faaa New llama.cpp loader (#6846) 2025-04-18 09:59:37 -03:00
oobabooga
5c2f8d828e Fix exllamav2 generating eos randomly after previous fix 2025-04-18 05:42:38 -07:00
oobabooga
2fc58ad935 Consider files with .pt extension in the new model menu function 2025-04-17 23:10:43 -07:00
Googolplexed
d78abe480b Allow for model subfolder organization for GGUF files (#6686)
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2025-04-18 02:53:59 -03:00
oobabooga
ce9e2d94b1 Revert "Attempt at solving the ExLlamaV2 issue"
This reverts commit c9b3c9dfbf.
2025-04-17 22:03:21 -07:00
oobabooga
5dfab7d363 New attempt at solving the exl2 issue 2025-04-17 22:03:11 -07:00
oobabooga
c9b3c9dfbf Attempt at solving the ExLlamaV2 issue 2025-04-17 21:45:15 -07:00
oobabooga
2c2d453c8c Revert "Use ExLlamaV2 (instead of the HF one) for EXL2 models for now"
This reverts commit 0ef1b8f8b4.
2025-04-17 21:31:32 -07:00
oobabooga
0ef1b8f8b4 Use ExLlamaV2 (instead of the HF one) for EXL2 models for now
It doesn't seem to have the "OverflowError" bug
2025-04-17 05:47:40 -07:00
oobabooga
38dc09dca5 Bump exllamav3 to the latest commit 2025-04-15 09:50:36 -07:00
oobabooga
038a012581 Installer: Remove .installer_state.json on reinstalling 2025-04-11 21:12:32 -07:00
oobabooga
682c78ea42 Add back detection of GPTQ models (closes #6841) 2025-04-11 21:00:42 -07:00
oobabooga
454366f93e Change the ExLlamaV3 wheel version to 0.0.1a1 2025-04-10 18:33:29 -07:00
oobabooga
d7b336d37e Update the README 2025-04-09 20:12:14 -07:00
oobabooga
4ed0da74a8 Remove the obsolete 'multimodal' extension 2025-04-09 20:09:48 -07:00
oobabooga
598568b1ed Revert "UI: remove the streaming cursor"
This reverts commit 6ea0206207.
2025-04-09 16:03:14 -07:00
oobabooga
297a406e05 UI: smoother chat streaming
This removes the throttling associated with gr.Textbox that made words appear in chunks rather than one at a time
2025-04-09 16:02:37 -07:00
oobabooga
6ea0206207 UI: remove the streaming cursor 2025-04-09 14:59:34 -07:00
oobabooga
9025848df5 Small change to installer 2025-04-09 10:25:47 -07:00
oobabooga
d337ea31fa Revert "Reapply "Update transformers requirement from ==4.50.* to ==4.51.* (#6834)""
This reverts commit 8229736ec4.
2025-04-09 10:16:47 -07:00
oobabooga
8229736ec4 Reapply "Update transformers requirement from ==4.50.* to ==4.51.* (#6834)"
This reverts commit 0b3503c91f.
2025-04-09 08:38:06 -07:00
oobabooga
89f40cdcf7 Update libstdcxx-ng for GLIBCXX_3.4.30 support on Linux 2025-04-09 08:28:44 -07:00
oobabooga
ad1ada6574 Change one message in the installer 2025-04-09 05:17:10 -07:00
oobabooga
d8aad6da94 Fix an update bug 2025-04-08 20:20:24 -07:00
oobabooga
8b8d39ec4e Add ExLlamaV3 support (#6832) 2025-04-09 00:07:08 -03:00
oobabooga
0b3503c91f Revert "Update transformers requirement from ==4.50.* to ==4.51.* (#6834)"
This reverts commit f1f32386b4.
2025-04-08 12:26:03 -07:00
oobabooga
649ee729c1 Remove Python 3.10 support 2025-04-08 09:22:06 -07:00
oobabooga
bf48ec8c44 Remove an unnecessary UI message 2025-04-07 17:43:41 -07:00
oobabooga
a5855c345c Set context lengths to at most 8192 by default (to prevent out of memory errors) (#6835) 2025-04-07 21:42:33 -03:00
dependabot[bot]
f1f32386b4 Update transformers requirement from ==4.50.* to ==4.51.* (#6834) 2025-04-07 19:29:39 -03:00
oobabooga
204db28362 Update the dockerfiles 2025-04-06 18:48:31 -07:00
oobabooga
eef90a4964 Update some intel arc installation commands 2025-04-06 17:44:07 -07:00
oobabooga
a8a64b6c1c Update the README 2025-04-06 17:40:18 -07:00