Compare commits

...

29 commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Tomas M. | 235302a807 | Merge fe7e1a2565 into d47c8eb956 | 2025-06-05 12:00:36 -03:00 |
| oobabooga | d47c8eb956 | Remove quotes from LLM-generated websearch query (closes #7045). Fix by @Quiet-Joker | 2025-06-05 06:57:59 -07:00 |
| oobabooga | 977ec801b7 | Improve table colors in instruct mode | 2025-06-05 06:33:45 -07:00 |
| Tomas M. | fe7e1a2565 | Update README.md (I placed the "Pointing to an existing AI model library" section first, as I believe, this is more relevant to majority of users.) | 2025-05-19 22:25:59 +00:00 |
| oobabooga | e8595730b4 | Merge pull request #6992 from oobabooga/dev (Merge dev branch) | 2025-05-17 11:58:46 -03:00 |
| oobabooga | 17c29fa0a2 | Merge pull request #6987 from oobabooga/dev (Merge dev branch) | 2025-05-16 22:23:59 -03:00 |
| oobabooga | dc3094549e | Merge pull request #6984 from oobabooga/dev (Merge dev branch) | 2025-05-16 17:13:26 -03:00 |
| oobabooga | ace8afb825 | Merge dev branch | 2025-05-01 12:25:04 -03:00 |
| oobabooga | a41da1ec95 | Merge pull request #6939 from oobabooga/dev (Merge dev branch) | 2025-05-01 00:15:11 -03:00 |
| oobabooga | 6e6f9971a2 | Merge pull request #6919 from oobabooga/dev (Merge dev branch) | 2025-04-27 11:35:19 -03:00 |
| oobabooga | 1180bb0d80 | Merge pull request #6913 from oobabooga/dev (Merge dev branch) | 2025-04-27 00:12:16 -03:00 |
| oobabooga | 9bb9ce079e | Merge pull request #6912 from oobabooga/dev (Merge dev branch) | 2025-04-27 00:03:16 -03:00 |
| oobabooga | 1aa76b3beb | Merge pull request #6885 from oobabooga/dev (Merge dev branch) | 2025-04-22 22:38:24 -03:00 |
| oobabooga | 1df2b0d3ae | Merge pull request #6884 from oobabooga/dev (Merge dev branch) | 2025-04-22 22:02:30 -03:00 |
| oobabooga | 62455b415c | Merge pull request #6883 from oobabooga/dev (Merge dev branch) | 2025-04-22 21:54:34 -03:00 |
| oobabooga | 022664f2bd | Merge pull request #6881 from oobabooga/dev (Merge dev branch) | 2025-04-22 12:15:34 -03:00 |
| oobabooga | a778270536 | Merge pull request #6869 from oobabooga/dev (Merge dev branch) | 2025-04-22 12:09:20 -03:00 |
| oobabooga | c19b995b8e | Merge pull request #6857 from oobabooga/dev (Merge dev branch) | 2025-04-19 21:45:55 -03:00 |
| oobabooga | b1495d52e5 | Merge pull request #6855 from oobabooga/dev (Merge dev branch) | 2025-04-19 01:53:11 -03:00 |
| oobabooga | 44a6d8a761 | Merge pull request #6854 from oobabooga/dev (Merge dev branch) | 2025-04-18 23:41:56 -03:00 |
| oobabooga | 4fa52a1302 | Merge pull request #6852 from oobabooga/dev (Merge dev branch) | 2025-04-18 22:15:40 -03:00 |
| oobabooga | 4eecb6611f | Merge pull request #6850 from oobabooga/dev (Merge dev branch) | 2025-04-18 15:33:32 -03:00 |
| oobabooga | c5e54c0b37 | Merge pull request #6848 from oobabooga/dev (Merge dev branch) | 2025-04-18 13:36:06 -03:00 |
| oobabooga | 14e6baeb48 | Merge pull request #6838 from oobabooga/dev (Merge dev branch) | 2025-04-09 14:48:37 -03:00 |
| oobabooga | bb1905ebc5 | Fix the colab notebook | 2025-03-29 19:17:36 -07:00 |
| oobabooga | 9b80d1d6c2 | Remove the stalebot | 2025-03-29 13:44:37 -07:00 |
| oobabooga | 80cdbe4e09 | Merge pull request #6797 from oobabooga/dev (Merge dev branch) | 2025-03-15 00:11:25 -03:00 |
| Kelvie Wong | 769eee1ff3 | Fix OpenAI API with new param (show_after), closes #6747 (#6749). Co-authored-by: oobabooga <oobabooga4@gmail.com> | 2025-02-18 07:02:19 -08:00 |
| oobabooga | 7c883ef2f0 | Merge pull request #6746 from oobabooga/dev (Merge dev branch) | 2025-02-14 23:25:31 -03:00 |
3 changed files with 26 additions and 1 deletion


@@ -325,6 +325,18 @@ https://github.com/oobabooga/text-generation-webui/wiki
 ## Downloading models
+### Pointing to an existing AI model library
+Edit the file `text-generation-webui\user_data\CMD_FLAGS.txt` to include this line:
+```
+--model-dir 'D:\MyAIModels\'
+```
+Replace `D:\MyAIModels\` with the path to your model library folder. Sub-folders will be automatically parsed to enumerate all existing models.
 ### Manual model download
 Models should be placed in the folder `text-generation-webui/user_data/models`. They are usually downloaded from [Hugging Face](https://huggingface.co/models?pipeline_tag=text-generation&sort=downloads).
 * GGUF models are a single file and should be placed directly into `user_data/models`. Example:
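
The README addition above is a configuration how-to: a single `--model-dir` flag in `CMD_FLAGS.txt`, with sub-folders parsed automatically. As a rough illustration of that enumeration idea, here is a hypothetical Python sketch (not the project's actual loader; the `list_models` helper and the directory layout are assumptions) that finds single-file GGUF models and Transformers-style model folders under such a path:

```python
# Hypothetical sketch only; not code from this repository.
from pathlib import Path

def list_models(model_dir: str) -> list[str]:
    """Enumerate models under model_dir, including sub-folders."""
    root = Path(model_dir)
    found = set()
    # Single-file GGUF models can sit anywhere in the tree.
    for gguf in root.rglob("*.gguf"):
        found.add(str(gguf.relative_to(root)))
    # Transformers-style models are folders containing a config.json.
    for cfg in root.rglob("config.json"):
        found.add(str(cfg.parent.relative_to(root)))
    return sorted(found)

if __name__ == "__main__":
    # Same path as in the README example above.
    for name in list_models(r"D:\MyAIModels"):
        print(name)
```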


@@ -17,6 +17,14 @@
   color: #d1d5db !important;
 }
+.chat .message-body :is(th, td) {
+  border-color: #40404096 !important;
+}
+.dark .chat .message-body :is(th, td) {
+  border-color: #ffffff75 !important;
+}
 .chat .message-body :is(p, ul, ol) {
   margin: 1.25em 0 !important;
 }


@@ -604,7 +604,12 @@ def generate_search_query(user_message, state):
     query = ""
     for reply in generate_reply(formatted_prompt, search_state, stopping_strings=[], is_chat=True):
-        query = reply.strip()
+        query = reply
+
+    # Strip and remove surrounding quotes if present
+    query = query.strip()
+    if len(query) >= 2 and query.startswith('"') and query.endswith('"'):
+        query = query[1:-1]
     return query
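
For reference, the quote-stripping logic added above can be exercised on its own. A minimal sketch, with the helper name `strip_outer_quotes` chosen here for illustration rather than taken from the codebase:

```python
def strip_outer_quotes(query: str) -> str:
    """Trim whitespace, then drop one pair of surrounding double quotes if present."""
    query = query.strip()
    if len(query) >= 2 and query.startswith('"') and query.endswith('"'):
        query = query[1:-1]
    return query

# The search query is now passed on unquoted, as reported in #7045.
assert strip_outer_quotes('  "latest llama.cpp release notes"  ') == "latest llama.cpp release notes"
assert strip_outer_quotes("already unquoted") == "already unquoted"
assert strip_outer_quotes('"') == '"'  # a lone quote is left untouched
```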