| Name | Last commit message | Last commit date |
| --- | --- | --- |
| grammar | Let grammar escape backslashes (#5865) | 2024-05-19 20:26:09 -03:00 |
| block_requests.py | Fix the Google Colab notebook | 2025-01-16 05:21:18 -08:00 |
| callbacks.py | Lint | 2025-01-09 13:18:23 -08:00 |
| chat.py | Revert "UI: remove the streaming cursor" | 2025-04-09 16:03:14 -07:00 |
| deepspeed_parameters.py | Fix typo in deepspeed_parameters.py (#3222) | 2023-07-24 11:17:28 -03:00 |
| evaluate.py | New llama.cpp loader (#6846) | 2025-04-18 09:59:37 -03:00 |
| exllamav2.py | Connect XTC, DRY, smoothing_factor, and dynatemp to ExLlamaV2 loader (non-HF) | 2025-01-04 16:25:06 -08:00 |
| exllamav2_hf.py | Fix exllamav2 generating eos randomly after previous fix | 2025-04-18 05:42:38 -07:00 |
| exllamav3_hf.py | Make exllamav3 safer as well | 2025-04-18 06:17:58 -07:00 |
| extensions.py | Move update_wizard_windows.sh to update_wizard_windows.bat (oops) | 2024-03-04 19:26:24 -08:00 |
| github.py | Fix several typos in the codebase (#6151) | 2024-06-22 21:40:25 -03:00 |
| gradio_hijack.py | Bump gradio to 4.23 (#5758) | 2024-03-26 16:32:20 -03:00 |
| html_generator.py | UI: smoother chat streaming | 2025-04-09 16:02:37 -07:00 |
| llama_cpp_server.py | Lint | 2025-04-18 08:06:51 -07:00 |
| loaders.py | Remove deprecated command-line flags | 2025-04-18 06:02:28 -07:00 |
| logging_colors.py | Lint | 2023-12-19 21:36:57 -08:00 |
| logits.py | Remove obsolete references to llamacpp_HF | 2025-04-18 07:46:04 -07:00 |
| LoRA.py | Remove the AutoGPTQ loader (#6641) | 2025-01-08 19:28:56 -03:00 |
| metadata_gguf.py | llama.cpp: read instruction template from GGUF metadata (#4975) | 2023-12-18 01:51:58 -03:00 |
| models.py | New llama.cpp loader (#6846) | 2025-04-18 09:59:37 -03:00 |
| models_settings.py | New llama.cpp loader (#6846) | 2025-04-18 09:59:37 -03:00 |
| one_click_installer_check.py | Lint | 2023-11-16 18:03:06 -08:00 |
| presets.py | Add the top N-sigma sampler (#6796) | 2025-03-14 16:45:11 -03:00 |
| prompts.py | Fix "send instruction template to..." buttons (closes #4625) | 2023-11-16 18:16:42 -08:00 |
| relative_imports.py | Add ExLlama+LoRA support (#2756) | 2023-06-19 12:31:24 -03:00 |
| sampler_hijack.py | Add the top N-sigma sampler (#6796) | 2025-03-14 16:45:11 -03:00 |
| sane_markdown_lists.py | Sane handling of markdown lists (#6626) | 2025-01-04 15:41:31 -03:00 |
| shared.py | Remove obsolete references to llamacpp_HF | 2025-04-18 07:46:04 -07:00 |
| tensorrt_llm.py | Add TensorRT-LLM support (#5715) | 2024-06-24 02:30:03 -03:00 |
| text_generation.py | New llama.cpp loader (#6846) | 2025-04-18 09:59:37 -03:00 |
| training.py | Don't import PEFT unless necessary | 2024-09-03 19:40:53 -07:00 |
| ui.py | Remove deprecated command-line flags | 2025-04-18 06:02:28 -07:00 |
| ui_chat.py | UI: smoother chat streaming | 2025-04-09 16:02:37 -07:00 |
| ui_default.py | Lint | 2024-12-17 20:13:32 -08:00 |
| ui_file_saving.py | Fix the "save preset" event | 2024-10-01 11:20:48 -07:00 |
| ui_model_menu.py | Remove obsolete references to llamacpp_HF | 2025-04-18 07:46:04 -07:00 |
| ui_notebook.py | Lint | 2024-12-17 20:13:32 -08:00 |
| ui_parameters.py | Set context lengths to at most 8192 by default (to prevent out of memory errors) (#6835) | 2025-04-07 21:42:33 -03:00 |
| ui_session.py | Fix a bug after c6901aba9f | 2025-04-18 06:51:28 -07:00 |
| utils.py | Consider files with .pt extension in the new model menu function | 2025-04-17 23:10:43 -07:00 |