zR | 804733811f | Update readme and requirements for new vllm version | 2024-11-18 10:10:43 +08:00
sixgod | 9b39ba6d1b | transformers4.46 and vllm0.6.3 | 2024-11-01 09:06:04 +00:00
sixgod | e3e6de52c4 | transformers4.46 and vllm0.6.3 | 2024-11-01 09:00:39 +00:00
zR | c2c28bc45c | transformers>=4.46 support | 2024-10-29 00:13:41 +08:00
zR | 3e7735d4f7 | update the req and chatglm_tokenizer.py | 2024-10-06 14:09:05 +08:00
zR | 5aca9bcb95 | update readme | 2024-09-06 15:14:25 +08:00
zR | 9a27e77bba | requirement update | 2024-08-11 17:59:00 +08:00
zR | 913cb6dc06 | Partial dependency update | 2024-07-24 10:17:07 +08:00
zR | 4ab7a1efd1 | Dependency update | 2024-07-16 17:08:50 +08:00
zR | ef015f88f9 | Follow-up explanation of the OpenAI server code, with liuzhenghua | 2024-06-18 18:13:12 +08:00
zR | 5c4bf6201c | fix openai stream function | 2024-06-15 20:59:23 +08:00
zR | ce2667cf5d | fix issue #74 | 2024-06-06 16:18:14 +08:00
zR | 8102212b9f | fix with vllm requirements.txt | 2024-06-06 13:57:22 +08:00
duzx16 | c0a6d1e0fa | init commit | 2024-06-05 10:22:16 +08:00