Update Report
Commit: e1bc2691d4 (parent: 46ecf7cca7)
Read this in [English](README_en.md)
## Project Updates
- 🔥🔥 **News**: `2024/07/24`: We released the latest technical interpretation related to long texts. Check out [here](https://medium.com/@ChatGLM/glm-long-scaling-pre-trained-model-contexts-to-millions-caa3c48dea85) to view our technical report on the long-context techniques used in training the open-source GLM-4-9B model.
- 🔥 **News**: `2024/7/16`: The `transformers` version that the GLM-4-9B-Chat model depends on has been upgraded to `4.42.4`. Please update the model configuration file and refer to `basic_demo/requirements.txt` to update the dependencies.
- 🔥 **News**: `2024/7/9`: The GLM-4-9B-Chat model has been adapted to [Ollama](https://github.com/ollama/ollama) and [Llama.cpp](https://github.com/ggerganov/llama.cpp); you can check the specific details in this [PR](https://github.com/ggerganov/llama.cpp/pull/8031).
- 🔥 **News**: `2024/7/1`: We updated the fine-tuning of GLM-4V-9B; you need to update the run files and configuration files in our model repository,
</p>
## Update
- 🔥🔥 **News**: `2024/07/24`: We released the latest technical interpretation related to long texts. Check out [here](https://medium.com/@ChatGLM/glm-long-scaling-pre-trained-model-contexts-to-millions-caa3c48dea85) to view our technical report on long context technology in the training of the open-source GLM-4-9B model.
- 🔥 **News**: `2024/7/16`: The `transformers` version that the GLM-4-9B-Chat model depends on has been upgraded to `4.42.4`. Please update the model configuration file and refer to `basic_demo/requirements.txt` to update the dependencies.
- 🔥 **News**: `2024/7/9`: The GLM-4-9B-Chat model has been adapted to [Ollama](https://github.com/ollama/ollama) and [Llama.cpp](https://github.com/ggerganov/llama.cpp); you can check the specific details in this [PR](https://github.com/ggerganov/llama.cpp/pull/8031).
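The `transformers` version bump above can be verified programmatically before running the demos. The sketch below is an illustration, not part of the repository: the helper names (`parse_version`, `transformers_is_recent_enough`) are made up here, and the only fact taken from the notes above is the required version `4.42.4`.

```python
# Illustrative sketch (not from the repo): check that the installed
# `transformers` package meets the 4.42.4 requirement noted above.
from importlib.metadata import PackageNotFoundError, version

REQUIRED = (4, 42, 4)  # version named in the update note


def parse_version(text: str) -> tuple:
    """Turn a dotted version string like '4.42.4' into an int tuple.

    Non-numeric pieces (e.g. 'dev0') are reduced to their digits so
    pre-release strings still compare sensibly.
    """
    parts = []
    for piece in text.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)


def transformers_is_recent_enough(required: tuple = REQUIRED) -> bool:
    """Return True if `transformers` is installed at `required` or newer."""
    try:
        installed = parse_version(version("transformers"))
    except PackageNotFoundError:
        return False  # not installed at all
    return installed >= required
```

If the check fails, reinstalling from `basic_demo/requirements.txt` (the file named in the update note) brings the environment up to date.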