Commit graph

308 commits

Author SHA1 Message Date
qwopqwop200
c359f672a8
support faster and model load strict 2023-05-04 09:04:07 +09:00
qwopqwop200
afe1323b3f
support faster and model load strict 2023-05-04 09:03:36 +09:00
qwopqwop200
a88cd16d65
fix bug 2023-05-03 22:36:14 +09:00
qwopqwop200
24251d1397
check kwargs 2023-05-02 22:32:54 +09:00
qwopqwop200
26581b6946
remove LlamaGPTQForCausalLM 2023-05-02 22:18:17 +09:00
qwopqwop200
694f2954a3
add auto model parameter 2023-05-02 22:16:23 +09:00
qwopqwop200
ccd87e5800
add Auto model parameter 2023-05-02 22:15:56 +09:00
qwopqwop200
d8707f92a9
support fused_attn 2023-05-02 21:54:15 +09:00
qwopqwop200
61c6f6a5d2
typo fix 2023-05-02 21:53:39 +09:00
qwopqwop200
a11d59f6c4
support fused_attn 2023-05-02 21:53:13 +09:00
qwopqwop200
f47322f073
fix bug 2023-05-02 21:14:27 +09:00
qwopqwop200
41f2379850
bug fix 2023-05-02 20:38:17 +09:00
qwopqwop200
d2f48e5311
bug fix 2023-05-02 20:36:53 +09:00
qwopqwop200
709bd7594f
Merge pull request #44 from PanQiWei/fix-bug-cuda
Fix bug cuda
2023-05-02 19:50:59 +09:00
qwopqwop200
9490a98444
add LlamaGPTQForCausalLM 2023-05-02 19:32:18 +09:00
qwopqwop200
a6d4f5c091
fix bug 2023-05-02 19:19:04 +09:00
qwopqwop200
2ba84fbb48
fix bug 2023-05-02 19:13:40 +09:00
qwopqwop200
1388acac94
fix bug 2023-05-02 19:13:13 +09:00
qwopqwop200
6c23e5b3a5
add fused mlp, fused attn 2023-05-02 18:55:44 +09:00
qwopqwop200
f51f763fde
fused attn, fused mlp apply 2023-05-02 18:51:04 +09:00
qwopqwop200
50c0fd13c5
Multi-GPU, allocate output tensor 2023-05-02 17:51:41 +09:00
潘其威(William)
144bd80436
Merge pull request #39 from TheBloke/TheBloke_check_model_exists
Check that model_save_name exists before trying to load it, to avoid confusing checkpoint error
2023-05-01 19:55:24 +08:00
TheBloke
593a0b28bb
Fix typo: 'hole' -> 'whole' 2023-05-01 10:25:18 +01:00
TheBloke
60195ca5f2
Check that model_save_name exists before trying inference, to avoid confusing checkpoint error 2023-05-01 10:15:13 +01:00
qwopqwop200
f0f37c1fe7
fix bug 2023-05-01 18:09:39 +09:00
qwopqwop200
95e633a597
add old cuda 2023-05-01 13:05:14 +09:00
qwopqwop200
5a69e22a93
add qlinear_old 2023-05-01 13:04:47 +09:00
qwopqwop200
9dfcac8e26
add qlinear_old 2023-05-01 13:03:57 +09:00
潘其威(William)
5fa803334d
Merge branch 'main' into change-save-name 2023-04-29 20:36:45 +08:00
qwopqwop200
787909084f
fix bug 2023-04-29 19:08:34 +09:00
qwopqwop200
a2ef4b98db
change save the name 2023-04-29 18:20:46 +09:00
qwopqwop200
1792cd1111
change save the name 2023-04-29 18:16:48 +09:00
ZXED
24a371d14a
use the same Optional style as in other params 2023-04-29 09:52:11 +03:00
ZXED
c22770188d
allow user to set trust_remote_code flag manually 2023-04-29 09:52:11 +03:00
ZXED
b3f19a7ba7
support custom model name when loading the model 2023-04-29 09:52:11 +03:00
ZXED
ea8ab73343
support custom quantize_config when loading the model 2023-04-29 09:51:50 +03:00
PanQiWei
16d8dd200f
remove non-parameters module from MOSSGPTQForCausalLM.outside_layer_modules 2023-04-29 10:58:29 +08:00
PanQiWei
b490ab004e
remove override of _resize_attention_mask for llama and opt 2023-04-28 23:08:42 +08:00
qwopqwop200
ae8b1a22a3
change global to local 2023-04-28 23:18:39 +09:00
qwopqwop200
e914b9b1bd
update support 256 not div 2023-04-28 22:48:23 +09:00
qwopqwop200
c9215a1b5b
change div num 2023-04-28 22:42:29 +09:00
qwopqwop200
19f167e58b
add raise-exception 2023-04-28 22:24:44 +09:00
潘其威(William)
1e353a8dc5
Merge pull request #24 from PanQiWei/speedup_quantization
Offloading and Multiple devices quantization/inference
2023-04-28 18:50:12 +08:00
PanQiWei
bdb713b5a3
add batch_size to model.quant() api 2023-04-28 18:26:07 +08:00
PanQiWei
41564a48db
make data_utils.py as global utils 2023-04-28 18:08:58 +08:00
PanQiWei
3dfc87bec3
return module in .to function 2023-04-28 17:20:46 +08:00
PanQiWei
a69a73a22c
fix device mismatch when directly using model to inference after quantization 2023-04-28 16:41:46 +08:00
qwopqwop200
329a64ed40
support conv1d, conv2d 2023-04-28 09:15:42 +09:00
qwopqwop200
bb9afe8b61
support conv1d, conv2d 2023-04-28 09:15:13 +09:00
qwopqwop200
c1b7c7647d
support conv1d 2023-04-28 09:14:44 +09:00