Author | Commit | Message | Date
oobabooga | 3bc2ec2b11 | Fix #6965 | 2025-05-08 10:34:09 -07:00
oobabooga | 1c7209a725 | Save the chat history periodically during streaming | 2025-05-08 09:46:43 -07:00
oobabooga | a1b3307b66 | Bump llama.cpp | 2025-05-08 08:58:43 -07:00
Jonas | fa960496d5 | Tools support for OpenAI compatible API (#6827) | 2025-05-08 12:30:27 -03:00
Scott Z | ed6e16191d | Docker fix for NVIDIA (#6964) | 2025-05-08 12:21:52 -03:00
oobabooga | 13a434f351 | Bump exllamav3 | 2025-05-08 08:06:07 -07:00
oobabooga | a2ab42d390 | UI: Remove the exllamav2 info message | 2025-05-08 08:00:38 -07:00
oobabooga | 348d4860c2 | UI: Create a "Main options" section in the Model tab | 2025-05-08 07:58:59 -07:00
oobabooga | d2bae7694c | UI: Change the ctx-size description | 2025-05-08 07:26:23 -07:00
oobabooga | b28fa86db6 | Default --gpu-layers to 256 | 2025-05-06 17:51:55 -07:00
oobabooga | 760b4dd115 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2025-05-06 14:02:57 -07:00
oobabooga | e4fb2475d2 | UI: Multiple small style improvements (light/dark themes) | 2025-05-06 14:02:15 -07:00
Downtown-Case | 5ef564a22e | Fix model config loading in shared.py for Python 3.13 (#6961) | 2025-05-06 17:03:33 -03:00
oobabooga | c4f36db0d8 | llama.cpp: remove tfs (it doesn't get used) | 2025-05-06 08:41:13 -07:00
oobabooga | 05115e42ee | Set top_n_sigma before temperature by default | 2025-05-06 08:27:21 -07:00
oobabooga | 1927afe894 | Fix top_n_sigma not showing for llama.cpp | 2025-05-06 08:18:49 -07:00
oobabooga | 605cc9ab14 | Update exllamav3 | 2025-05-06 06:43:35 -07:00
oobabooga | 89590adc14 | Update llama.cpp | 2025-05-06 06:41:17 -07:00
oobabooga | d1c0154d66 | llama.cpp: Add top_n_sigma, fix typical_p in sampler priority | 2025-05-06 06:38:39 -07:00
oobabooga | cbef35054c | UI: CSS fix | 2025-05-05 17:46:09 -07:00
Evgenii Novikov | 4e8f628d3c | docker: App uid typo in other docker composes (#6958) | 2025-05-05 20:05:15 -03:00
oobabooga | 530223bf0b | UI: Fix the hover menu colors | 2025-05-05 16:03:43 -07:00
oobabooga | 76f947e3cf | UI: Minor style change | 2025-05-05 15:58:29 -07:00
Alireza Ghasemi | 99bd66445f | SuperboogaV2: minor update to avoid json serialization errors #6945 | 2025-05-05 19:04:06 -03:00
Evgenii Novikov | 987505ead3 | docker: Fix app uid typo in cpu docker compose (#6957) | 2025-05-05 19:03:33 -03:00
oobabooga | 941e0663da | Update README | 2025-05-05 14:18:16 -07:00
oobabooga | f82667f0b4 | Remove more multimodal extension references | 2025-05-05 14:17:00 -07:00
oobabooga | 85bf2e15b9 | API: Remove obsolete multimodal extension handling (multimodal support will be added back once it's implemented in llama-server) | 2025-05-05 14:14:48 -07:00
mamei16 | 8137eb8ef4 | Dynamic Chat Message UI Update Speed (#6952) | 2025-05-05 18:05:23 -03:00
oobabooga | 53d8e46502 | Ensure environment isolation in portable installs | 2025-05-05 12:28:17 -07:00
oobabooga | bf5290bc0f | Fix the hover menu in light theme | 2025-05-05 08:04:12 -07:00
oobabooga | 967b70327e | Light theme improvement | 2025-05-05 07:59:02 -07:00
oobabooga | 6001d279c6 | Light theme improvement | 2025-05-05 07:42:13 -07:00
oobabooga | 475e012ee8 | UI: Improve the light theme colors | 2025-05-05 06:16:29 -07:00
oobabooga | b817bb33fd | Minor fix after df7bb0db1f | 2025-05-05 05:00:20 -07:00
oobabooga | f3da45f65d | ExLlamaV3_HF: Change max_chunk_size to 256 | 2025-05-04 20:37:15 -07:00
oobabooga | df7bb0db1f | Rename --n-gpu-layers to --gpu-layers | 2025-05-04 20:03:55 -07:00
oobabooga | d0211afb3c | Save the chat history right after sending a message | 2025-05-04 18:52:01 -07:00
oobabooga | 2da197bba4 | Refinement after previous commit | 2025-05-04 18:29:05 -07:00
oobabooga | 690d693913 | UI: Add padding to only show the last message/reply after sending a message (to avoid scrolling) | 2025-05-04 18:13:29 -07:00
oobabooga | d9da16edba | UI: Remove the chat input textarea border | 2025-05-04 16:53:52 -07:00
oobabooga | 84ab1f95be | UI: Increase the chat area a bit | 2025-05-04 15:21:52 -07:00
oobabooga | d186621926 | UI: Fixes after previous commit | 2025-05-04 15:19:46 -07:00
oobabooga | 7853fb1c8d | Optimize the Chat tab (#6948) | 2025-05-04 18:58:37 -03:00
oobabooga | b7a5c7db8d | llama.cpp: Handle short arguments in --extra-flags | 2025-05-04 07:14:42 -07:00
oobabooga | 5f5569e9ac | Update README | 2025-05-04 06:20:36 -07:00
oobabooga | 4c2e3b168b | llama.cpp: Add a retry mechanism when getting the logits (sometimes it fails) | 2025-05-03 06:51:20 -07:00
oobabooga | ea60f14674 | UI: Show the list of files if the user tries to download a GGUF repository | 2025-05-03 06:06:50 -07:00
oobabooga | b71ef50e9d | UI: Add a min-height to prevent constant scrolling during chat streaming | 2025-05-02 23:45:58 -07:00
oobabooga | b21bd8bb1e | UI: Invert user/assistant message colors in instruct mode (the goal is to make assistant messages more readable) | 2025-05-02 22:43:33 -07:00