first commit

xxl 2025-02-28 10:18:51 +08:00
parent f1c323c923
commit 548d7d1779
14 changed files with 152795 additions and 2 deletions

README.md

@@ -1,3 +1,212 @@
---
library_name: transformers
tags:
- agent
- action
- vlm
base_model: Qwen/Qwen2.5-VL-3B-Instruct
license: cc-by-nc-4.0
---
# Model Card for Proxy Lite
<!-- Provide a quick summary of what the model is/does. -->
<div align="center">
<img src="https://raw.githubusercontent.com/convergence-ai/proxy-lite/refs/heads/main/assets/proxy-lite.png" alt="Proxy Lite logo" width="600" height="auto" style="margin-bottom: 20px;" />
<h2>
A mini, open-weights version of <a href="https://proxy.convergence.ai">Proxy</a>.
</h2>
</div>
## Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** Convergence AI
- **Model type:** 3B Vision-Language Model
- **Agent type:** Web-browsing Agent
- **License:** CC-BY-NC-4.0
- **Finetuned from model:** Qwen/Qwen2.5-VL-3B-Instruct
- [Running the agent](https://github.com/convergence-ai/proxy-lite)
## Running Proxy on the web
<!-- Provide the basic links for the model. -->
See https://github.com/convergence-ai/proxy-lite to run Proxy Lite in a browser:
```bash
git clone https://github.com/convergence-ai/proxy-lite.git
make proxy
proxy "Find some markets near Kings Cross and tell me their ratings."
```
<div align="center">
<img src="https://raw.githubusercontent.com/convergence-ai/proxy-lite/refs/heads/main/assets/demo.gif" alt="Proxy Lite Demo" />
</div>
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
Proxy Lite is designed and trained to complete automated tasks in a web browser.
Full code for running the model is available in the [github repository](https://github.com/convergence-ai/proxy-lite).
This includes a CLI tool for running the model, as well as a Streamlit app.
You can use this [endpoint](https://huggingface.co/spaces/convergence-ai/demo-api) for small-scale testing.
---
#### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
We recommend hosting your own endpoint with vLLM; you can use the following command:
```bash
vllm serve convergence-ai/proxy-lite-3b \
    --trust-remote-code \
    --enable-auto-tool-choice \
    --tool-call-parser hermes \
    --port 8008
```
The tool arguments are **very important** for parsing the tool calls from the model appropriately.
> **Important:** Qwen-2.5-VL support in `transformers` is not yet available in the latest release, so be sure to install from source.
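Installing from source is a standard pip invocation:
```bash
pip install git+https://github.com/huggingface/transformers
```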
#### Message History
For full details on using and prompting Proxy Lite, refer to the [repository](https://github.com/convergence-ai/proxy-lite); in short, the model expects a message history of the form:
```python
message_history = [
    {
        "role": "system",
        "content": "You are Proxy Lite...",  # Full system prompt in src/proxy_lite/agents/proxy_lite_agent.py
    },  # System prompt
    {
        "role": "user",
        "content": "Find some markets near Kings Cross and tell me their ratings.",
    },  # Set the task
    {
        "role": "user",
        "content": [
            {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{base64_encoded_screenshot}"}},
            {"type": "text", "text": "URL: https://www.google.com/ \n- [0] <a>About</a> \n- [1] <a>Store</a>...."},
        ],  # This is the observation from the environment
    },
]
```
This would then build up the message history, alternating between the assistant (who takes the *action*) and the user (who provides the *observation*).
> **Context-Window Management:** When making calls to the model, all the observations other than the current one are discarded in order to reduce the large number of image tokens required. Since the model responses include reflection on the observations and are all included in the message history, the model is still aware of the entire history when planning new actions.
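The exact context-management logic lives in the repository; the sketch below only illustrates the idea (`prune_observations` is a hypothetical helper, not part of the `proxy_lite` API):

```python
def prune_observations(message_history: list[dict]) -> list[dict]:
    """Discard every observation except the most recent one."""

    def is_observation(message: dict) -> bool:
        # Observations are user messages whose content is a list of
        # image/text parts, as in the example above.
        return message["role"] == "user" and isinstance(message.get("content"), list)

    observation_idxs = [i for i, m in enumerate(message_history) if is_observation(m)]
    keep = set(observation_idxs[-1:])  # only the latest observation survives
    return [
        m for i, m in enumerate(message_history)
        if not is_observation(m) or i in keep
    ]
```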
#### Tools
You should also pass the `Tools` the model has access to; these define the action space available to the model. You can do this with `transformers`:
```python
from qwen_vl_utils import process_vision_info
from transformers import AutoProcessor

from proxy_lite.tools import ReturnValueTool, BrowserTool
from proxy_lite.serializer import OpenAICompatableSerializer

processor = AutoProcessor.from_pretrained("convergence-ai/proxy-lite-3b")
tools = OpenAICompatableSerializer().serialize_tools([ReturnValueTool(), BrowserTool(session=None)])

templated_messages = processor.apply_chat_template(
    message_history, tokenize=False, add_generation_prompt=True, tools=tools
)

image_inputs, video_inputs = process_vision_info(message_history)

batch = processor(
    text=[templated_messages],
    images=image_inputs,
    videos=video_inputs,
    padding=True,
    return_tensors="pt",
)
```
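From here, generation follows the standard `transformers` flow. A minimal sketch, assuming a CUDA-capable machine (the dtype and decoding settings are choices, not requirements):

```python
import torch
from transformers import Qwen2_5_VLForConditionalGeneration

model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "convergence-ai/proxy-lite-3b",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

batch = batch.to(model.device)
with torch.no_grad():
    output_ids = model.generate(**batch, max_new_tokens=512)

# Decode only the newly generated tokens; keep special tokens so the
# <tool_call> tags survive for downstream parsing.
new_tokens = output_ids[:, batch["input_ids"].shape[1]:]
print(processor.batch_decode(new_tokens, skip_special_tokens=False)[0])
```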
Alternatively, you can send requests directly to the endpoint, which will handle the formatting:
```python
from openai import OpenAI

client = OpenAI(base_url="http://convergence-ai-demo-api.hf.space/v1")

response = client.chat.completions.create(
    model="convergence-ai/proxy-lite-3b",
    messages=message_history,
    tools=tools,
    tool_choice="auto",
)
```
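With the vLLM flags above, the chosen action should come back as a structured tool call on the response, which you can read via the standard OpenAI client fields:

```python
import json

message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0].function
    action = {"name": call.name, "arguments": json.loads(call.arguments)}
    print(action)  # e.g. which browser action to take, with its arguments

# Append the assistant turn to the history, add the next observation,
# and repeat until the model returns a final value.
message_history.append(message.model_dump(exclude_none=True))
```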
---
## Evaluation
Proxy Lite scored 72.4% on the [WebVoyager](https://huggingface.co/datasets/convergence-ai/WebVoyager2025Valid) benchmark, placing it first among all available open-weights models.
A breakdown of the results by website is shown below:
| Website             | Success Rate (%) | Finish Rate (%) | Avg. Steps |
|---------------------|------------------|-----------------|------------|
| Allrecipes | 87.8 | 95.1 | 10.3 |
| Amazon | 70.0 | 90.0 | 7.1 |
| Apple | 82.1 | 89.7 | 10.7 |
| ArXiv | 60.5 | 79.1 | 16.0 |
| BBC News | 69.4 | 77.8 | 15.9 |
| Booking | 70.0 | 85.0 | 24.8 |
| Cambridge Dict. | 86.0 | 97.7 | 5.7 |
| Coursera | 82.5 | 97.5 | 4.7 |
| ESPN | 53.8 | 87.2 | 14.9 |
| GitHub | 85.0 | 92.5 | 10.0 |
| Google Flights | 38.5 | 51.3 | 34.8 |
| Google Map | 78.9 | 94.7 | 9.6 |
| Google Search | 71.4 | 92.9 | 6.0 |
| Huggingface | 68.6 | 74.3 | 18.4 |
| Wolfram Alpha | 78.3 | 93.5 | 6.1 |
---
## Out-of-Scope Use
Proxy Lite is specifically designed to automate routine tasks within a web browser environment. However, it should **not be used** for:
- **High-Stakes or Safety-Critical Applications:**
_Avoid using Proxy Lite for tasks such as financial transactions, healthcare operations, legal decision-making, or emergency responses, where any error could lead to serious harm or significant financial loss._
- **Unauthorized or Invasive Data Extraction:**
_Automated scraping or extraction of data from websites should only be performed with explicit permission. Proxy Lite should not be used to bypass websites' terms of service, copyright restrictions, or privacy policies._
- **Interactions with Malicious or Unverified Websites:**
_Using the model to navigate or interact with suspicious or untrusted websites may expose the system to security threats such as malware, phishing attacks, or other forms of cyber exploitation._
- **Compliance-Regulated or Legally Sensitive Actions:**
_Tasks that require adherence to strict legal or regulatory standards (e.g., processing personal data or sensitive information) should employ additional safeguards beyond what the model provides._
---
## Citation
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```bibtex
@article{proxy-lite,
title={Proxy Lite - A Mini, Open-weights, Autonomous Assistant},
author={Convergence AI},
year={2025}
}
```

added_tokens.json

@@ -0,0 +1,24 @@
{
"</tool_call>": 151658,
"<tool_call>": 151657,
"<|box_end|>": 151649,
"<|box_start|>": 151648,
"<|endoftext|>": 151643,
"<|file_sep|>": 151664,
"<|fim_middle|>": 151660,
"<|fim_pad|>": 151662,
"<|fim_prefix|>": 151659,
"<|fim_suffix|>": 151661,
"<|im_end|>": 151645,
"<|im_start|>": 151644,
"<|image_pad|>": 151655,
"<|object_ref_end|>": 151647,
"<|object_ref_start|>": 151646,
"<|quad_end|>": 151651,
"<|quad_start|>": 151650,
"<|repo_name|>": 151663,
"<|video_pad|>": 151656,
"<|vision_end|>": 151653,
"<|vision_pad|>": 151654,
"<|vision_start|>": 151652
}

chat_template.json

@@ -0,0 +1,3 @@
{
"chat_template": "{% set image_count = namespace(value=0) %}{% set video_count = namespace(value=0) %}{{- '<|im_start|>' + messages[0].role + '\n' }}{%- if messages[0]['content'] is string %}{{ messages[0]['content'] }}{%- else %}{%- for content in messages[0]['content'] %}{%- if content.type == 'image' or ('image' in content) or ('image_url' in content) %}{% set image_count.value = image_count.value + 1 %}{%- if add_vision_id %}Picture {{ image_count.value }}:{%- endif %}<|vision_start|><|image_pad|><|vision_end|>{%- elif content.type == 'video' or ('video' in content) %}{% set video_count.value = video_count.value + 1 %}{%- if add_vision_id %}Video {{ video_count.value }}:{%- endif %}<|vision_start|><|video_pad|><|vision_end|>{%- elif 'text' in content %}{{ content.text }}{%- endif %}{%- endfor %}{%- endif %}{%- if messages[0].tool_calls %}{%- for tool_call in messages[0].tool_calls %}{%- if tool_call.function is defined %}{%- set tool_call = tool_call.function %}{%- endif %}<tool_call>{\"name\": \"{{ tool_call.name }}\", \"arguments\": {{ tool_call.arguments | tojson }}} </tool_call>{%- endfor %}{%- endif %}{{- '<|im_end|>\n' }}{%- if tools %}{{- '<|im_start|>system\n' }}{{- \"\n\n# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>\" }}{%- for tool in tools %}{{- \"\n\" }}{{- tool | tojson }}{%- endfor %}{{- \"\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n\" }}<tool_call>{\"name\": <function-name>, \"arguments\": <args-json-object>}</tool_call>{{- '<|im_end|>\n' }}{%- endif %}{%- for message in messages[1:] %}{%- if (message.role == \"user\") or (message.role == \"system\") or (message.role == \"assistant\" and not message.tool_calls) %}{{- '<|im_start|>' + message.role + '\n' }}{%- if message.content is string %}{{ message.content }}{%- else %}{%- for content in message.content %}{%- if content.type == 'image' or ('image' in content) or ('image_url' in content) %}{% set image_count.value = image_count.value + 1 %}{%- if add_vision_id %}Picture {{ image_count.value }}:{%- endif %}<|vision_start|><|image_pad|><|vision_end|>{%- elif content.type == 'video' or ('video' in content) %}{% set video_count.value = video_count.value + 1 %}{%- if add_vision_id %}Video {{ video_count.value }}:{%- endif %}<|vision_start|><|video_pad|><|vision_end|>{%- elif 'text' in content %}{{ content.text }}{%- endif %}{%- endfor %}{%- endif %}{{- '<|im_end|>\n' }}{%- elif message.role == \"assistant\" %}{{- '<|im_start|>' + message.role }}{%- if message.content %}{% if message.content is string %}{{ '\n' + message.content }}{% else %}{%- for content in message.content %}{%- if content.type == 'image' or ('image' in content) or ('image_url' in content) %}{% set image_count.value = image_count.value + 1 %}{%- if add_vision_id %}Picture {{ image_count.value }}:{%- endif %}<|vision_start|><|image_pad|><|vision_end|>{%- elif content.type == 'video' or ('video' in content) %}{% set video_count.value = video_count.value + 1 %}{%- if add_vision_id %}Video {{ video_count.value }}:{%- endif %}<|vision_start|><|video_pad|><|vision_end|>{%- elif 'text' in content %}{{ content.text }}{%- endif %}{%- endfor %}{% endif %}{%- endif %}{%- for tool_call in message.tool_calls %}{%- if tool_call.function is defined %}{%- set tool_call = tool_call.function %}{%- endif %}<tool_call>{\"name\": \"{{ tool_call.name }}\", \"arguments\": {{ 
tool_call.arguments | tojson }}} </tool_call>{%- endfor %}{{- '<|im_end|>\n' }}{%- elif message.role == \"tool\" %}{%- if (loop.index0 == 0) or (messages[loop.index0].role != \"tool\") %}{{- '<|im_start|>user' }}{%- endif %}{{- '\n<tool_response>\n' }}{%- if message.content is string %}{{ message.content }}{%- else %}{%- for content in message.content %}{%- if content.type == 'image' or ('image' in content) or ('image_url' in content) %}{% set image_count.value = image_count.value + 1 %}{%- if add_vision_id %}Picture {{ image_count.value }}:{%- endif %}<|vision_start|><|image_pad|><|vision_end|>{%- elif content.type == 'video' or ('video' in content) %}{% set video_count.value = video_count.value + 1 %}{%- if add_vision_id %}Video {{ video_count.value }}:{%- endif %}<|vision_start|><|video_pad|><|vision_end|>{%- elif 'text' in content %}{{ content.text }}{%- endif %}{%- endfor %}{%- endif %}{{- '\n</tool_response>' }}{%- if loop.last or (messages[loop.index0 + 1].role != \"tool\") %}{{- '<|im_end|>\n' }}{%- endif %}{%- endif %}{%- endfor %}{%- if add_generation_prompt %}{{- '<|im_start|>assistant\n' }}{%- endif %}"
}

config.json

@@ -0,0 +1,51 @@
{
"_name_or_path": "convergence-ai/subset-distill-tools-3b-17-02-2025",
"architectures": [
"Qwen2_5_VLForConditionalGeneration"
],
"attention_dropout": 0.0,
"bos_token_id": 151643,
"eos_token_id": 151645,
"hidden_act": "silu",
"hidden_size": 2048,
"image_token_id": 151655,
"initializer_range": 0.02,
"intermediate_size": 11008,
"max_position_embeddings": 128000,
"max_window_layers": 70,
"model_type": "qwen2_5_vl",
"num_attention_heads": 16,
"num_hidden_layers": 36,
"num_key_value_heads": 2,
"rms_norm_eps": 1e-06,
"rope_scaling": {
"mrope_section": [
16,
24,
24
],
"rope_type": "default",
"type": "default"
},
"rope_theta": 1000000.0,
"sliding_window": 32768,
"tie_word_embeddings": true,
"torch_dtype": "bfloat16",
"transformers_version": "4.49.0.dev0",
"use_cache": true,
"use_sliding_window": false,
"video_token_id": 151656,
"vision_config": {
"hidden_size": 1280,
"in_chans": 3,
"model_type": "qwen2_5_vl",
"out_hidden_size": 2048,
"spatial_patch_size": 14,
"tokens_per_second": 2,
"torch_dtype": "bfloat16"
},
"vision_end_token_id": 151653,
"vision_start_token_id": 151652,
"vision_token_id": 151654,
"vocab_size": 151936
}

generation_config.json

@@ -0,0 +1,6 @@
{
"_from_model_config": true,
"bos_token_id": 151643,
"eos_token_id": 151645,
"transformers_version": "4.49.0.dev0"
}

merges.txt

File diff suppressed because it is too large.

model-00001-of-00002.safetensors (Stored with Git LFS)

Binary file not shown.

model-00002-of-00002.safetensors (Stored with Git LFS)

Binary file not shown.

model.safetensors.index.json

@@ -0,0 +1,831 @@
{
"metadata": {
"total_size": 7509245952
},
"weight_map": {
"model.embed_tokens.weight": "model-00001-of-00002.safetensors",
"model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.0.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.0.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.0.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.0.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.0.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.0.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.0.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.0.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.0.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.0.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.1.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.1.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.1.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.1.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.1.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.1.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.1.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.10.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.10.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.10.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.10.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.10.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.10.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.10.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.10.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.11.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.11.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.11.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.11.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.11.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.11.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.11.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.11.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.11.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.12.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.12.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.12.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.12.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.12.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.12.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.12.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.12.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.12.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.12.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.13.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.13.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.13.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.13.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.13.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.13.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.13.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.13.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.14.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.14.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.14.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.14.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.14.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.14.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.14.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.14.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.14.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.15.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.15.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.15.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.15.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.15.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.15.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.15.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.15.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.16.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.16.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.16.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.16.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.16.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.16.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.16.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.16.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.16.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.17.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.17.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.17.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.17.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.17.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.17.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.17.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.17.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.17.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.17.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.17.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.18.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.18.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.18.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.18.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.18.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.18.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.18.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.18.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.18.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.18.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.18.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.18.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.19.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.19.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.19.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.19.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.19.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.19.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.19.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.19.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.19.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.19.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.19.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.2.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.2.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.2.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.2.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.2.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.2.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.2.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.2.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.2.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.20.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.20.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.20.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.20.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.20.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.20.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.20.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.20.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.20.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.20.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.20.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.20.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.21.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.21.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.21.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.21.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.21.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.21.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.21.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.21.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.21.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.21.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.21.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.21.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.22.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.22.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.22.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.22.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.22.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.22.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.22.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.22.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.22.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.22.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.22.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.22.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.23.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.23.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.23.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.23.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.23.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.23.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.23.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.23.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.23.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.23.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.23.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.23.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.24.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.24.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.24.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.24.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.24.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.24.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.24.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.24.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.24.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.24.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.24.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.24.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.25.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.25.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.25.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.25.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.25.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.25.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.25.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.25.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.25.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.25.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.25.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.25.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.26.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.26.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.26.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.26.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.26.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.26.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.26.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.26.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.26.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.26.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.26.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.26.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.27.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.27.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.27.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.27.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.27.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.27.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.27.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.27.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.27.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.27.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.27.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.27.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.28.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.28.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.28.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.28.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.28.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.28.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.28.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.28.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.28.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.28.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.28.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.28.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.29.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.29.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.29.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.29.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.29.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.29.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.29.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.29.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.29.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.29.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.29.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.29.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.3.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.3.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.3.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.3.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.3.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.3.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.3.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.3.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.3.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.30.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.30.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.30.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.30.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.30.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.30.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.30.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.30.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.30.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.30.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.30.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.30.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.31.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.31.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.31.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.31.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.31.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.31.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.31.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.31.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.31.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.31.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.31.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.31.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.32.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.32.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.32.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.32.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.32.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.32.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.32.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.32.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.32.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.32.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.32.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.32.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.33.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.33.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.33.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.33.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.33.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.33.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.33.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.33.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.33.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.33.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.33.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.33.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.34.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.34.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.34.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.34.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.34.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.34.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.34.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.34.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.34.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.34.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.34.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.34.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.35.input_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.35.mlp.down_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.35.mlp.gate_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.35.mlp.up_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.35.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
"model.layers.35.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.35.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.35.self_attn.o_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.35.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.35.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.35.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
"model.layers.35.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
"model.layers.4.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.4.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.4.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.4.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.4.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.4.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.4.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.4.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.4.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.5.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.5.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.5.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.5.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.5.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.5.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.5.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.5.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.5.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.5.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.6.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.6.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.6.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.6.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.6.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.6.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.6.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.6.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.6.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.6.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.6.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.7.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.7.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.7.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.7.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.7.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.7.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.7.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.7.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.7.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.7.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.8.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.8.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.8.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.8.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.8.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.8.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.8.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.8.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.8.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.8.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.9.input_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.9.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.9.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.9.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.9.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
"model.layers.9.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.9.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.9.self_attn.o_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.9.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.9.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
"model.layers.9.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
"model.layers.9.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
"model.norm.weight": "model-00002-of-00002.safetensors",
"visual.blocks.0.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.0.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.0.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.0.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.0.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.0.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.0.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.0.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.0.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.0.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.0.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.0.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.1.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.1.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.1.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.1.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.1.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.1.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.1.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.1.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.1.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.1.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.1.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.1.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.10.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.10.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.10.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.10.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.10.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.10.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.10.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.10.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.10.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.10.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.10.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.10.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.11.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.11.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.11.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.11.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.11.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.11.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.11.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.11.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.11.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.11.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.11.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.11.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.12.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.12.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.12.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.12.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.12.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.12.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.12.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.12.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.12.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.12.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.12.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.12.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.13.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.13.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.13.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.13.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.13.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.13.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.13.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.13.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.13.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.13.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.13.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.13.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.14.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.14.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.14.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.14.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.14.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.14.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.14.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.14.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.14.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.14.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.14.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.14.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.15.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.15.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.15.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.15.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.15.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.15.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.15.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.15.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.15.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.15.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.15.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.15.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.16.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.16.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.16.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.16.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.16.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.16.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.16.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.16.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.16.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.16.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.16.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.16.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.17.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.17.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.17.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.17.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.17.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.17.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.17.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.17.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.17.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.17.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.17.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.17.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.18.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.18.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.18.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.18.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.18.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.18.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.18.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.18.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.18.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.18.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.18.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.18.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.19.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.19.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.19.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.19.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.19.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.19.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.19.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.19.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.19.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.19.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.19.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.19.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.2.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.2.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.2.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.2.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.2.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.2.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.2.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.2.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.2.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.2.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.2.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.2.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.20.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.20.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.20.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.20.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.20.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.20.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.20.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.20.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.20.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.20.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.20.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.20.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.21.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.21.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.21.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.21.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.21.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.21.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.21.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.21.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.21.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.21.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.21.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.21.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.22.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.22.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.22.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.22.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.22.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.22.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.22.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.22.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.22.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.22.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.22.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.22.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.23.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.23.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.23.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.23.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.23.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.23.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.23.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.23.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.23.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.23.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.23.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.23.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.24.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.24.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.24.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.24.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.24.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.24.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.24.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.24.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.24.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.24.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.24.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.24.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.25.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.25.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.25.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.25.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.25.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.25.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.25.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.25.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.25.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.25.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.25.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.25.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.26.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.26.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.26.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.26.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.26.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.26.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.26.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.26.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.26.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.26.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.26.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.26.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.27.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.27.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.27.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.27.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.27.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.27.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.27.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.27.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.27.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.27.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.27.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.27.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.28.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.28.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.28.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.28.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.28.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.28.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.28.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.28.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.28.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.28.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.28.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.28.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.29.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.29.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.29.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.29.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.29.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.29.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.29.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.29.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.29.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.29.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.29.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.29.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.3.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.3.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.3.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.3.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.3.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.3.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.3.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.3.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.3.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.3.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.3.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.3.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.30.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.30.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.30.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.30.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.30.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.30.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.30.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.30.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.30.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.30.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.30.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.30.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.31.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.31.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.31.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.31.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.31.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.31.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.31.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.31.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.31.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.31.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.31.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.31.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.4.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.4.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.4.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.4.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.4.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.4.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.4.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.4.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.4.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.4.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.4.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.4.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.5.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.5.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.5.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.5.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.5.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.5.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.5.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.5.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.5.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.5.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.5.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.5.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.6.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.6.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.6.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.6.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.6.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.6.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.6.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.6.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.6.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.6.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.6.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.6.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.7.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.7.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.7.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.7.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.7.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.7.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.7.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.7.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.7.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.7.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.7.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.7.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.8.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.8.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.8.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.8.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.8.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.8.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.8.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.8.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.8.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.8.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.8.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.8.norm2.weight": "model-00001-of-00002.safetensors",
"visual.blocks.9.attn.proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.9.attn.proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.9.attn.qkv.bias": "model-00001-of-00002.safetensors",
"visual.blocks.9.attn.qkv.weight": "model-00001-of-00002.safetensors",
"visual.blocks.9.mlp.down_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.9.mlp.down_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.9.mlp.gate_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.9.mlp.gate_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.9.mlp.up_proj.bias": "model-00001-of-00002.safetensors",
"visual.blocks.9.mlp.up_proj.weight": "model-00001-of-00002.safetensors",
"visual.blocks.9.norm1.weight": "model-00001-of-00002.safetensors",
"visual.blocks.9.norm2.weight": "model-00001-of-00002.safetensors",
"visual.merger.ln_q.weight": "model-00001-of-00002.safetensors",
"visual.merger.mlp.0.bias": "model-00001-of-00002.safetensors",
"visual.merger.mlp.0.weight": "model-00001-of-00002.safetensors",
"visual.merger.mlp.2.bias": "model-00001-of-00002.safetensors",
"visual.merger.mlp.2.weight": "model-00001-of-00002.safetensors",
"visual.patch_embed.proj.weight": "model-00001-of-00002.safetensors"
}
}
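
The `weight_map` above is the lookup table a loader uses to resolve a sharded checkpoint: every tensor name points at the shard file that stores it (note that all of the `visual.*` tensors land in the first shard). `transformers` consumes this index automatically inside `from_pretrained`; the following is only a minimal sketch of the mechanism, assuming the two shard files sit next to the index in the working directory:

```python
import json
from safetensors import safe_open

# The index maps every tensor name to the shard file that stores it.
with open("model.safetensors.index.json") as f:
    weight_map = json.load(f)["weight_map"]

# Group tensor names by shard so each file is opened only once.
shards: dict[str, list[str]] = {}
for name, shard in weight_map.items():
    shards.setdefault(shard, []).append(name)

state_dict = {}
for shard, names in shards.items():
    with safe_open(shard, framework="pt") as f:
        for name in names:
            state_dict[name] = f.get_tensor(name)
```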

29
preprocessor_config.json Normal file

@@ -0,0 +1,29 @@
{
"do_convert_rgb": true,
"do_normalize": true,
"do_rescale": true,
"do_resize": true,
"image_mean": [
0.48145466,
0.4578275,
0.40821073
],
"image_processor_type": "Qwen2_5_VLImageProcessor",
"image_std": [
0.26862954,
0.26130258,
0.27577711
],
"max_pixels": 12845056,
"merge_size": 2,
"min_pixels": 3136,
"patch_size": 14,
"processor_class": "Qwen2_5_VLProcessor",
"resample": 3,
"rescale_factor": 0.00392156862745098,
"size": {
"longest_edge": 12845056,
"shortest_edge": 3136
},
"temporal_patch_size": 2
}
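
`min_pixels` and `max_pixels` set the visual-token budget: each image is resized so its area stays within [3136, 12845056] pixels, with both sides kept at multiples of `patch_size × merge_size = 28`, i.e. roughly 4 to 16,384 image tokens at one token per 28×28 region. A minimal sketch of driving the processor, assuming a local `screenshot.png` (any RGB image works):

```python
from PIL import Image
from transformers import AutoProcessor

processor = AutoProcessor.from_pretrained("convergence-ai/proxy-lite-3b")

# Build a one-image user turn; the chat template inserts the vision tokens.
messages = [{
    "role": "user",
    "content": [
        {"type": "image"},
        {"type": "text", "text": "What does this page show?"},
    ],
}]
text = processor.apply_chat_template(messages, add_generation_prompt=True)

image = Image.open("screenshot.png").convert("RGB")  # hypothetical input file
inputs = processor(text=[text], images=[image], return_tensors="pt")

# pixel_values holds the flattened 14x14 patches; image_grid_thw records the
# (temporal, height, width) patch grid produced by the resize.
print(inputs["pixel_values"].shape, inputs["image_grid_thw"])
```

The Qwen processors also accept `min_pixels` / `max_pixels` overrides in `from_pretrained`, which can be useful to cap memory when feeding large screenshots.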

31
special_tokens_map.json Normal file

@@ -0,0 +1,31 @@
{
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"eos_token": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}
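
Note that the end-of-sequence and padding tokens differ: generation stops at the ChatML turn delimiter `<|im_end|>`, while `<|endoftext|>` is repurposed for padding. A quick sanity check, as a sketch:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("convergence-ai/proxy-lite-3b")
print(tok.eos_token, tok.pad_token)  # <|im_end|> <|endoftext|>

# Left padding (configured in tokenizer_config.json below) keeps each row's
# generated tokens contiguous at the right edge when batching prompts.
batch = tok(["short", "a somewhat longer prompt"], padding=True, return_tensors="pt")
print(batch["input_ids"].shape)
```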

BIN
tokenizer.json (Stored with Git LFS) Normal file

Binary file not shown.

211
tokenizer_config.json Normal file

@@ -0,0 +1,211 @@
{
"add_bos_token": false,
"add_prefix_space": false,
"added_tokens_decoder": {
"151643": {
"content": "<|endoftext|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151644": {
"content": "<|im_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151645": {
"content": "<|im_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151646": {
"content": "<|object_ref_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151647": {
"content": "<|object_ref_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151648": {
"content": "<|box_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151649": {
"content": "<|box_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151650": {
"content": "<|quad_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151651": {
"content": "<|quad_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151652": {
"content": "<|vision_start|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151653": {
"content": "<|vision_end|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151654": {
"content": "<|vision_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151655": {
"content": "<|image_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151656": {
"content": "<|video_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": true
},
"151657": {
"content": "<tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151658": {
"content": "</tool_call>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151659": {
"content": "<|fim_prefix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151660": {
"content": "<|fim_middle|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151661": {
"content": "<|fim_suffix|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151662": {
"content": "<|fim_pad|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151663": {
"content": "<|repo_name|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
},
"151664": {
"content": "<|file_sep|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false,
"special": false
}
},
"additional_special_tokens": [
"<|im_start|>",
"<|im_end|>",
"<|object_ref_start|>",
"<|object_ref_end|>",
"<|box_start|>",
"<|box_end|>",
"<|quad_start|>",
"<|quad_end|>",
"<|vision_start|>",
"<|vision_end|>",
"<|vision_pad|>",
"<|image_pad|>",
"<|video_pad|>"
],
"bos_token": null,
"chat_template": "{% set image_count = namespace(value=0) %}{% set video_count = namespace(value=0) %}{{- '<|im_start|>' + messages[0].role + '\n' }}{%- if messages[0]['content'] is string %}{{ messages[0]['content'] }}{%- else %}{%- for content in messages[0]['content'] %}{%- if content.type == 'image' or ('image' in content) or ('image_url' in content) %}{% set image_count.value = image_count.value + 1 %}{%- if add_vision_id %}Picture {{ image_count.value }}:{%- endif %}<|vision_start|><|image_pad|><|vision_end|>{%- elif content.type == 'video' or ('video' in content) %}{% set video_count.value = video_count.value + 1 %}{%- if add_vision_id %}Video {{ video_count.value }}:{%- endif %}<|vision_start|><|video_pad|><|vision_end|>{%- elif 'text' in content %}{{ content.text }}{%- endif %}{%- endfor %}{%- endif %}{%- if messages[0].tool_calls %}{%- for tool_call in messages[0].tool_calls %}{%- if tool_call.function is defined %}{%- set tool_call = tool_call.function %}{%- endif %}<tool_call>{\"name\": \"{{ tool_call.name }}\", \"arguments\": {{ tool_call.arguments | tojson }}} </tool_call>{%- endfor %}{%- endif %}{{- '<|im_end|>\n' }}{%- if tools %}{{- '<|im_start|>system\n' }}{{- \"\n\n# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>\" }}{%- for tool in tools %}{{- \"\n\" }}{{- tool | tojson }}{%- endfor %}{{- \"\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n\" }}<tool_call>{\"name\": <function-name>, \"arguments\": <args-json-object>}</tool_call>{{- '<|im_end|>\n' }}{%- endif %}{%- for message in messages[1:] %}{%- if (message.role == \"user\") or (message.role == \"system\") or (message.role == \"assistant\" and not message.tool_calls) %}{{- '<|im_start|>' + message.role + '\n' }}{%- if message.content is string %}{{ message.content }}{%- else %}{%- for content in message.content %}{%- if content.type == 'image' or ('image' in content) or ('image_url' in content) %}{% set image_count.value = image_count.value + 1 %}{%- if add_vision_id %}Picture {{ image_count.value }}:{%- endif %}<|vision_start|><|image_pad|><|vision_end|>{%- elif content.type == 'video' or ('video' in content) %}{% set video_count.value = video_count.value + 1 %}{%- if add_vision_id %}Video {{ video_count.value }}:{%- endif %}<|vision_start|><|video_pad|><|vision_end|>{%- elif 'text' in content %}{{ content.text }}{%- endif %}{%- endfor %}{%- endif %}{{- '<|im_end|>\n' }}{%- elif message.role == \"assistant\" %}{{- '<|im_start|>' + message.role }}{%- if message.content %}{% if message.content is string %}{{ '\n' + message.content }}{% else %}{%- for content in message.content %}{%- if content.type == 'image' or ('image' in content) or ('image_url' in content) %}{% set image_count.value = image_count.value + 1 %}{%- if add_vision_id %}Picture {{ image_count.value }}:{%- endif %}<|vision_start|><|image_pad|><|vision_end|>{%- elif content.type == 'video' or ('video' in content) %}{% set video_count.value = video_count.value + 1 %}{%- if add_vision_id %}Video {{ video_count.value }}:{%- endif %}<|vision_start|><|video_pad|><|vision_end|>{%- elif 'text' in content %}{{ content.text }}{%- endif %}{%- endfor %}{% endif %}{%- endif %}{%- for tool_call in message.tool_calls %}{%- if tool_call.function is defined %}{%- set tool_call = tool_call.function %}{%- endif %}<tool_call>{\"name\": \"{{ tool_call.name }}\", \"arguments\": {{ 
tool_call.arguments | tojson }}} </tool_call>{%- endfor %}{{- '<|im_end|>\n' }}{%- elif message.role == \"tool\" %}{%- if (loop.index0 == 0) or (messages[loop.index0].role != \"tool\") %}{{- '<|im_start|>user' }}{%- endif %}{{- '\n<tool_response>\n' }}{%- if message.content is string %}{{ message.content }}{%- else %}{%- for content in message.content %}{%- if content.type == 'image' or ('image' in content) or ('image_url' in content) %}{% set image_count.value = image_count.value + 1 %}{%- if add_vision_id %}Picture {{ image_count.value }}:{%- endif %}<|vision_start|><|image_pad|><|vision_end|>{%- elif content.type == 'video' or ('video' in content) %}{% set video_count.value = video_count.value + 1 %}{%- if add_vision_id %}Video {{ video_count.value }}:{%- endif %}<|vision_start|><|video_pad|><|vision_end|>{%- elif 'text' in content %}{{ content.text }}{%- endif %}{%- endfor %}{%- endif %}{{- '\n</tool_response>' }}{%- if loop.last or (messages[loop.index0 + 1].role != \"tool\") %}{{- '<|im_end|>\n' }}{%- endif %}{%- endif %}{%- endfor %}{%- if add_generation_prompt %}{{- '<|im_start|>assistant\n' }}{%- endif %}",
"clean_up_tokenization_spaces": false,
"do_image_splitting": false,
"eos_token": "<|im_end|>",
"errors": "replace",
"extra_special_tokens": {},
"model_max_length": 131072,
"pad_token": "<|endoftext|>",
"padding_side": "left",
"processor_class": "Qwen2_5_VLProcessor",
"split_special_tokens": false,
"tokenizer_class": "Qwen2Tokenizer",
"unk_token": null
}
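
The `chat_template` above is what turns a message list plus tool schemas into the final prompt: tool signatures are injected into a system turn inside `<tools>` tags, and tool calls are serialized as `<tool_call>{"name": ..., "arguments": ...}</tool_call>`, which is the format a downstream parser must recover. A sketch of rendering it with a made-up `click` action for illustration (the real action space ships with the proxy-lite repository); recent `transformers` releases accept the `tools=` argument:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("convergence-ai/proxy-lite-3b")

# Hypothetical browser action, just to exercise the template's <tools> path.
tools = [{
    "type": "function",
    "function": {
        "name": "click",
        "description": "Click the element with the given mark id.",
        "parameters": {
            "type": "object",
            "properties": {"mark_id": {"type": "integer"}},
            "required": ["mark_id"],
        },
    },
}]

messages = [
    {"role": "user", "content": "Find some markets near Kings Cross."},
]

prompt = tok.apply_chat_template(
    messages, tools=tools, add_generation_prompt=True, tokenize=False
)
print(prompt)  # a system turn now carries the JSON schemas inside <tools>...</tools>
```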

1
vocab.json Normal file

File diff suppressed because one or more lines are too long